Usage¶
As a Python Module¶
To use datacube-ows in a project:
import datacube_ows
To use the stand-alone styling API:
from datacube_ows.styles.api import *
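For example, a minimal standalone-styling sketch. This assumes the standalone API's apply_ows_style_cfg helper; the band names, style configuration and scale range below are illustrative, not part of any shipped config:
import numpy as np
import xarray as xr
from datacube_ows.styles.api import apply_ows_style_cfg

# An illustrative RGB "components" style configuration.
cfg = {
    "components": {
        "red": {"red": 1.0},
        "green": {"green": 1.0},
        "blue": {"blue": 1.0},
    },
    "scale_range": (0, 3000),
}

# A small dummy dataset with the bands the style expects.
data = xr.Dataset(
    {band: (("y", "x"), np.random.randint(0, 3000, (4, 4)))
     for band in ("red", "green", "blue")},
    coords={"y": np.arange(4), "x": np.arange(4)},
)

rgba = apply_ows_style_cfg(cfg, data)  # an RGBA xarray Dataset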
OWS Command Line Tools¶
Datacube-OWS provides two command line tools:
datacube-ows-update
which is used for creating and maintaining OWS’s database tables and views.
datacube-ows-cfg
which is used for managing OWS configuration files.
datacube-ows-update¶
Manage datacube-ows range tables. Exposed on setup as datacube-ows-update
Valid invocations:
Schema/permissions/migration management
- datacube-ows-update --schema
Create (re-create) the OWS schema (including materialised views)
- datacube-ows-update --read-role role1 --read-role role2 --write-role role3
Grants read or read/write permissions to the OWS tables and views to the indicated role(s).
The --read-role and --write-role options can also be passed in combination with the --schema option described above.
Read permissions are required for the database role that the datacube-ows service uses.
Write permissions are required for the database role used to run the Data Management actions below.
(These schema management actions require higher level permissions.)
- datacube-ows-update --cleanup
Clean up (drop) any datacube-ows 1.8.x database entities.
The --cleanup option can also be passed in combination with the --schema option described above.
All of the above schema management actions can also be used with the --env or -E option:
- datacube-ows-update --cleanup --env dev
Use the “dev” environment from the ODC configuration for connecting to the database. (Defaults to env defined in OWS global config, or “default”)
Schema management functions attempt to create or modify database objects and assign permissions over those objects. They typically need to run with a very high level of database permissions - e.g. depending on the requested action and the current state of the database schema, they may need to be able to create schemas, roles and/or extensions.
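For example, a one-shot schema setup that also grants permissions (the role names and environment are placeholders for your deployment's database roles):
$ datacube-ows-update --schema --read-role ows_web --write-role ows_admin --env dev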
Data management (updating OWS indexes)
- datacube-ows-update –views
Refresh the materialised views
- datacube-ows-update layer1 layer2 …
Update ranges for the specified LAYERS (Note that ODC product names are no longer supported)
- datacube-ows-update
Update ranges for all configured OWS layers.
Uses the DATACUBE_OWS_CFG environment variable to find the OWS config file.
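Putting these together, a typical refresh after new datasets are indexed (the layer names are illustrative) looks like:
$ datacube-ows-update --views
$ datacube-ows-update s2_l2a alos_palsar_mosaic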
datacube-ows-update [OPTIONS] [LAYERS]...
Options
- --views¶
Refresh the ODC spatio-temporal materialised views.
- --schema¶
Create or update the OWS database schema, including the spatio-temporal materialised views.
- --read-role <read_role>¶
(Only valid with --schema) Role(s) to grant read-only database permissions to
- --write-role <write_role>¶
(Only valid with --schema) Role(s) to grant both read and write/update database permissions to
- --cleanup¶
Clean up any datacube-ows 1.8.x tables/views
- -E, --env <env>¶
(Only valid with --schema, --read-role, --write-role or --cleanup) Environment to write to.
- --version¶
Print version string and exit
Arguments
- LAYERS¶
Optional argument(s)
datacube-ows-cfg¶
datacube-ows-cfg [OPTIONS] COMMAND [ARGS]...
Options
- --version¶
Show OWS version number and exit
check¶
Check configuration files
Takes a list of configuration specifications which are each loaded and validated in turn, with each specification being interpreted as per the $DATACUBE_OWS_CFG environment variable.
If no specification is provided, the $DATACUBE_OWS_CFG environment variable is used.
datacube-ows-cfg check [OPTIONS] [PATHS]...
Options
- -p, --parse-only¶
Only parse the syntax of the config file - do not validate against database
- -f, --folders¶
Print the folder/layer hierarchy(ies) to stdout.
- -s, --styles¶
Print the styles for each layer to stdout (format depends on --folders flag).
- -i, --input-file <input_file>¶
Provide a file path for an input inventory JSON file to be compared with the config file
- -o, --output-file <output_file>¶
Provide an output inventory file name with extension .json
Arguments
- PATHS¶
Optional argument(s)
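For example, to parse-check a configuration and print its folder/layer hierarchy without touching the database (the configuration path is illustrative):
$ datacube-ows-cfg check -p -f /path/to/ows_cfg.py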
compile¶
Compile completed translation files.
Takes a list of languages to generate catalogs for. “all” can be included as a shorthand for all languages listed as supported in the configuration.
datacube-ows-cfg compile [OPTIONS] [LANGUAGES]...
Options
- -d, --translations-dir <translations_dir>¶
Path to the output translations directory. Defaults to value from configuration
- -D, --domain <domain>¶
The domain of the translation files. Defaults to value from configuration
- -c, --cfg <cfg>¶
Configuration specification to use to determine translations directory and domain (defaults to environment $DATACUBE_OWS_CFG)
Arguments
- LANGUAGES¶
Optional argument(s)
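For example, to compile catalogs for every language listed as supported in the configuration:
$ datacube-ows-cfg compile all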
extract¶
Extract metadata from existing configuration into a message file template.
Takes a configuration specification which is loaded as per the $DATACUBE_OWS_CFG environment variable.
If no specification is provided, the $DATACUBE_OWS_CFG environment variable is used.
datacube-ows-cfg extract [OPTIONS] [PATH]
Options
- -c, --cfg-only¶
Read metadata from config only - ignore configured metadata message file.
- -m, --msg-file <msg_file>¶
Write to a message file with the translatable metadata from the configuration. (Defaults to ‘messages.po’)
Arguments
- PATH¶
Optional argument
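For example, to extract the translatable metadata from the configuration in $DATACUBE_OWS_CFG into the default message file:
$ datacube-ows-cfg extract -m messages.po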
translation¶
Generate a new translations catalog based on the specified message file.
Takes a list of languages to generate catalogs for. “all” can be included as a shorthand for all languages listed as supported in the configuration.
datacube-ows-cfg translation [OPTIONS] [LANGUAGES]...
Options
- -n, --new¶
Create a new translation template. (Default is to update an existing one.)
- -m, --msg-file <msg_file>¶
Use this message file as the template for translation files. (defaults to message filename from configuration)
- -d, --translations-dir <translations_dir>¶
Path to the output translations directory. Defaults to value from configuration
- -D, --domain <domain>¶
The domain of the translation files. Defaults to value from configuration
- -c, --cfg <cfg>¶
Configuration specification to use to determine translations directory and domain (defaults to environment $DATACUBE_OWS_CFG)
Arguments
- LANGUAGES¶
Optional argument(s)
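For example, to create new translation catalogs for all supported languages from the default message file:
$ datacube-ows-cfg translation -n all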
As a Web-Service in Docker with Layers deployed¶
Access a sample product definition. This playbook uses ALOS-PALSAR product definitions in the Digital Earth Africa deployment.
$ wget https://raw.githubusercontent.com/digitalearthafrica/config/master/products/alos_palsar_mosaic.yaml
Inject the sample product into datacube using datacube commands. These should be available in the OWS docker image.
$ datacube product add https://raw.githubusercontent.com/digitalearthafrica/config/master/products/alos_palsar_mosaic.yaml
Index all the YAML files for a particular year of ALOS-PALSAR using a classic Unix toolchain, with the AWS CLI grabbing them from S3.
$ aws s3 ls s3://deafrica-data/jaxa/alos_palsar_mosaic/2017/ --recursive \
| grep yaml | awk '{print $4}' \
| xargs -n1 -I {} datacube dataset add s3://deafrica-data/{}
Index a dataset when no YAML file is available and only a .json file is available.
# How to index Sentinel-2 COGs
# Install the indexing tools
pip install --upgrade --extra-index-url="https://packages.dea.ga.gov.au" odc-apps-dc-tools odc-index datacube
# Find the files
s3-find s3://sentinel-cogs/sentinel-s2-l2a-cogs/2019/**/*.json > sentinel-cogs-2020.txt
# Tar them up
s3-to-tar sentinel-cogs-2020.txt sentinel-cogs-2020.tar
# And index
dc-index-from-tar --stac --product=s2_l2a < sentinel-cogs-2020.tar
Note
The next step will be superseded soon by an OWS sub-command.
Update the extents of a new or updated product in the Datacube so that OWS can create GetCapabilities documents. This assumes the ows_cfg.py file is within the code directory.
$ datacube-ows-update --views
$ datacube-ows-update alos_palsar_mosaic
Deploy the Digital Earth Africa OWS config by copying it to ows_cfg.py. Ideally, load the config outside a Docker container to iterate faster.
Run GetCapabilities via curl to ensure data is present. Perform GetMap via QGIS to ensure data is visible.
$ curl "localhost:8000/?service=wms&request=getcapabilities"