Command-line interface#
CLI reference documentation.
hpcflow#
hpcflow [OPTIONS] COMMAND [ARGS]...
Options
- --version#
Show the version of hpcFlow and exit.
- --hpcflow-version#
Show the version of hpcflow and exit.
- --help#
Show this message and exit.
- --run-time-info#
Print run-time information.
- --config-dir <config_dir>#
Set the configuration directory.
- --config-key <config_key>#
Set the configuration invocation key.
- --with-config <with_config>#
Override a config item in the config file.
cancel#
Stop all running jobscripts of the specified workflow.
WORKFLOW_REF is the local ID (as provided by the show command) or the workflow path.
hpcflow cancel [OPTIONS] WORKFLOW_REF
Options
- -r, --ref-type <ref_type>#
- Options:
assume-id | id | path
Arguments
- WORKFLOW_REF#
Required argument
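For example, to cancel a workflow by the local ID reported by hpcflow show, or by its path (both the ID and the path below are purely illustrative):
hpcflow cancel 12
hpcflow cancel --ref-type path /path/to/my-workflow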
config#
Configuration sub-command for getting and setting data in the configuration file(s).
hpcflow config [OPTIONS] COMMAND [ARGS]...
Options
- --no-callback <no_callback>#
Exclude a named get/set callback function during execution of the command.
add-scheduler#
hpcflow config add-scheduler [OPTIONS] NAME
Options
- --defaults <defaults>#
Arguments
- NAME#
Required argument
append#
Append a new value to the specified configuration item.
NAME is the dot-delimited path to the list to be appended to.
hpcflow config append [OPTIONS] NAME VALUE
Options
- --json#
Interpret VALUE as a JSON string.
Arguments
- NAME#
Required argument
- VALUE#
Required argument
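As an illustrative sketch, assuming a list-like configuration item named environment_sources exists in your config:
hpcflow config append environment_sources /path/to/envs.yaml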
get#
Show the value of the specified configuration item.
hpcflow config get [OPTIONS] NAME
Options
- --all#
Show all configuration items.
- --metadata#
Show all metadata items.
- --file#
Show the contents of the configuration file.
Arguments
- NAME#
Required argument
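For example, to print all configuration items, or a single item (the key machine is assumed to exist in your configuration):
hpcflow config get --all
hpcflow config get machine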
import#
Update the config file with keys from a YAML file.
hpcflow config import [OPTIONS] FILE_PATH
Options
- --rename, --no-rename#
Rename the currently loaded config file according to the name of the file that is being imported (default is to rename). Ignored if --new is specified.
- --new#
If True, generate a new default config, and import the file into this config. If False, modify the currently loaded config.
Arguments
- FILE_PATH#
Required argument
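For example, to import keys from a local YAML file into a freshly generated default config (the file name is illustrative):
hpcflow config import my_settings.yaml --new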
init#
hpcflow config init [OPTIONS] KNOWN_NAME
Options
- --path <path>#
An fsspec-compatible path in which to look for configuration-import files.
Arguments
- KNOWN_NAME#
Required argument
list#
Show a list of all configurable keys.
hpcflow config list [OPTIONS]
load-data-files#
Check we can load the data files (e.g. task schema files) as specified in the configuration.
hpcflow config load-data-files [OPTIONS]
open#
Alias for hpcflow open config: open the configuration file, or retrieve its path.
hpcflow config open [OPTIONS]
Options
- --path#
pop#
Remove a value from a list-like configuration item.
NAME is the dot-delimited path to the list to be modified.
hpcflow config pop [OPTIONS] NAME INDEX
Arguments
- NAME#
Required argument
- INDEX#
Required argument
prepend#
Prepend a new value to the specified configuration item.
NAME is the dot-delimited path to the list to be prepended to.
hpcflow config prepend [OPTIONS] NAME VALUE
Options
- --json#
Interpret VALUE as a JSON string.
Arguments
- NAME#
Required argument
- VALUE#
Required argument
set#
Set and save the value of the specified configuration item.
hpcflow config set [OPTIONS] NAME VALUE
Options
- --json#
Interpret VALUE as a JSON string.
Arguments
- NAME#
Required argument
- VALUE#
Required argument
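For example, setting a simple string value, or a structured value via --json (the key names and values below are illustrative and may not exist in your configuration):
hpcflow config set machine my-cluster
hpcflow config set some_mapping '{"a": 1}' --json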
unset#
Unset and save the value of the specified configuration item.
hpcflow config unset [OPTIONS] NAME
Arguments
- NAME#
Required argument
update#
Update a map-like value in the configuration.
NAME is the dot-delimited path to the map to be updated.
hpcflow config update [OPTIONS] NAME VALUE
Options
- --json#
Interpret VALUE as a JSON string.
Arguments
- NAME#
Required argument
- VALUE#
Required argument
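As a sketch, assuming a map-like item exists at the dot-delimited path schedulers.direct:
hpcflow config update schedulers.direct '{"defaults": {}}' --json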
demo-software#
hpcflow demo-software [OPTIONS] COMMAND [ARGS]...
doSomething#
hpcflow demo-software doSomething [OPTIONS]
Options
- -i1, --infile1 <infile1>#
Required
- -i2, --infile2 <infile2>#
Required
- -v, --value <value>#
- -o, --out <out>#
demo-workflow#
Interact with builtin demo workflows.
hpcflow demo-workflow [OPTIONS] COMMAND [ARGS]...
Options
- -l, --list#
Print available builtin demo workflows.
copy#
hpcflow demo-workflow copy [OPTIONS] WORKFLOW_NAME DESTINATION
Options
- --doc, --no-doc#
Arguments
- WORKFLOW_NAME#
Required argument
- DESTINATION#
Required argument
go#
hpcflow demo-workflow go [OPTIONS] WORKFLOW_NAME
Options
- --format <format>#
If specified, one of “json” or “yaml”. This forces parsing from a particular format.
- Options:
yaml | json
- --path <path>#
The directory path into which the new workflow will be generated.
- --name <name>#
The name of the workflow. If specified, the workflow directory will be path joined with name. If not specified the workflow template name will be used, in combination with a date-timestamp.
- --overwrite#
If True and the workflow directory (path + name) already exists, the existing directory will be overwritten.
- --store <store>#
The persistent store type to use.
- Options:
zarr | zip | json
- --ts-fmt <ts_fmt>#
The datetime format to use for storing datetimes. Datetimes are always stored in UTC (because Numpy does not store time zone info), so this should not include a time zone name.
- --ts-name-fmt <ts_name_fmt>#
The datetime format to use when generating the workflow name, where it includes a timestamp.
- --js-parallelism <js_parallelism>#
If True, allow multiple jobscripts to execute simultaneously. Raises if set to True but the store type does not support the jobscript_parallelism feature. If not set, jobscript parallelism will be used if the store type supports it.
- --wait#
If True, this command will block until the workflow execution is complete.
- --add-to-known, --no-add-to-known#
If True, add this submission to the known-submissions file.
Arguments
- WORKFLOW_NAME#
Required argument
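For example, to generate and submit a builtin demo workflow into a chosen directory and block until it completes (the demo name workflow_1 is illustrative; use the parent command's --list option to see what is available):
hpcflow demo-workflow go workflow_1 --path ./demo-runs --wait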
make#
hpcflow demo-workflow make [OPTIONS] WORKFLOW_NAME
Options
- --format <format>#
If specified, one of “json” or “yaml”. This forces parsing from a particular format.
- Options:
yaml | json
- --path <path>#
The directory path into which the new workflow will be generated.
- --name <name>#
The name of the workflow. If specified, the workflow directory will be path joined with name. If not specified the workflow template name will be used, in combination with a date-timestamp.
- --overwrite#
If True and the workflow directory (path + name) already exists, the existing directory will be overwritten.
- --store <store>#
The persistent store type to use.
- Options:
zarr | zip | json
- --ts-fmt <ts_fmt>#
The datetime format to use for storing datetimes. Datetimes are always stored in UTC (because Numpy does not store time zone info), so this should not include a time zone name.
- --ts-name-fmt <ts_name_fmt>#
The datetime format to use when generating the workflow name, where it includes a timestamp.
Arguments
- WORKFLOW_NAME#
Required argument
show#
hpcflow demo-workflow show [OPTIONS] WORKFLOW_NAME
Options
- --syntax, --no-syntax#
- --doc, --no-doc#
Arguments
- WORKFLOW_NAME#
Required argument
go#
Generate and submit a new hpcFlow workflow.
TEMPLATE_FILE_OR_STR is either a path to a template file in YAML or JSON format, or a YAML/JSON string.
hpcflow go [OPTIONS] TEMPLATE_FILE_OR_STR
Options
- --string#
If set, interpret TEMPLATE_FILE_OR_STR as a YAML/JSON string rather than a file path.
- --format <format>#
If specified, one of “json” or “yaml”. This forces parsing from a particular format.
- Options:
yaml | json
- --path <path>#
The directory path into which the new workflow will be generated.
- --name <name>#
The name of the workflow. If specified, the workflow directory will be path joined with name. If not specified the workflow template name will be used, in combination with a date-timestamp.
- --overwrite#
If True and the workflow directory (path + name) already exists, the existing directory will be overwritten.
- --store <store>#
The persistent store type to use.
- Options:
zarr | zip | json
- --ts-fmt <ts_fmt>#
The datetime format to use for storing datetimes. Datetimes are always stored in UTC (because Numpy does not store time zone info), so this should not include a time zone name.
- --ts-name-fmt <ts_name_fmt>#
The datetime format to use when generating the workflow name, where it includes a timestamp.
- --js-parallelism <js_parallelism>#
If True, allow multiple jobscripts to execute simultaneously. Raises if set to True but the store type does not support the jobscript_parallelism feature. If not set, jobscript parallelism will be used if the store type supports it.
- --wait#
If True, this command will block until the workflow execution is complete.
- --add-to-known, --no-add-to-known#
If True, add this submission to the known-submissions file.
Arguments
- TEMPLATE_FILE_OR_STR#
Required argument
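For example, to generate and submit a workflow from a YAML template file, overwriting any existing workflow directory of the same name and recording the submission in the known-submissions file (file and directory names are illustrative):
hpcflow go my_workflow.yaml --path ./runs --overwrite --add-to-known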
helper#
hpcflow helper [OPTIONS] COMMAND [ARGS]...
clear#
Remove the PID file (and kill the helper process if it exists). This should not normally be needed.
hpcflow helper clear [OPTIONS]
log-path#
Get the path to the helper log file (may not exist).
hpcflow helper log-path [OPTIONS]
pid#
Get the process ID of the running helper, if running.
hpcflow helper pid [OPTIONS]
Options
- -f, --file#
restart#
Restart (or start) the helper process.
hpcflow helper restart [OPTIONS]
Options
- --timeout <timeout>#
Helper timeout in seconds.
- Default:
3600
- --timeout-check-interval <timeout_check_interval>#
Interval between testing if the timeout has been exceeded in seconds.
- Default:
60
- --watch-interval <watch_interval>#
Polling interval for watching workflows (and the workflow watch list) in seconds.
- Default:
10
run#
Run the helper functionality.
hpcflow helper run [OPTIONS]
Options
- --timeout <timeout>#
Helper timeout in seconds.
- Default:
3600
- --timeout-check-interval <timeout_check_interval>#
Interval between testing if the timeout has been exceeded in seconds.
- Default:
60
- --watch-interval <watch_interval>#
Polling interval for watching workflows (and the workflow watch list) in seconds.
- Default:
10
start#
Start the helper process.
hpcflow helper start [OPTIONS]
Options
- --timeout <timeout>#
Helper timeout in seconds.
- Default:
3600
- --timeout-check-interval <timeout_check_interval>#
Interval between testing if the timeout has been exceeded in seconds.
- Default:
60
- --watch-interval <watch_interval>#
Polling interval for watching workflows (and the workflow watch list) in seconds.
- Default:
10
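For example, to start the helper with a two-hour timeout and a 30-second workflow polling interval:
hpcflow helper start --timeout 7200 --watch-interval 30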
stop#
Stop the helper process, if it is running.
hpcflow helper stop [OPTIONS]
uptime#
Get the uptime of the helper process, if it is running.
hpcflow helper uptime [OPTIONS]
watch-list#
Get the list of workflows currently being watched.
hpcflow helper watch-list [OPTIONS]
watch-list-path#
Get the path to the workflow watch list file (may not exist).
hpcflow helper watch-list-path [OPTIONS]
internal#
Internal CLI to be invoked by scripts generated by the app.
hpcflow internal [OPTIONS] COMMAND [ARGS]...
get-invoc-cmd#
Get the invocation command for this app instance.
hpcflow internal get-invoc-cmd [OPTIONS]
workflow#
hpcflow internal workflow [OPTIONS] PATH COMMAND [ARGS]...
Arguments
- PATH#
Required argument
get-ear-skipped#
Return 1 if the given EAR is to be skipped, else return 0.
hpcflow internal workflow PATH get-ear-skipped [OPTIONS] EAR_ID
Arguments
- EAR_ID#
Required argument
save-parameter#
hpcflow internal workflow PATH save-parameter [OPTIONS] NAME VALUE EAR_ID CMD_IDX
Options
- --stderr#
Arguments
- NAME#
Required argument
- VALUE#
Required argument
- EAR_ID#
Required argument
- CMD_IDX#
Required argument
set-ear-end#
hpcflow internal workflow PATH set-ear-end [OPTIONS] JS_IDX JS_ACT_IDX EAR_ID EXIT_CODE
Arguments
- JS_IDX#
Required argument
- JS_ACT_IDX#
Required argument
- EAR_ID#
Required argument
- EXIT_CODE#
Required argument
set-ear-skip#
hpcflow internal workflow PATH set-ear-skip [OPTIONS] EAR_ID
Arguments
- EAR_ID#
Required argument
set-ear-start#
hpcflow internal workflow PATH set-ear-start [OPTIONS] EAR_ID
Arguments
- EAR_ID#
Required argument
write-commands#
hpcflow internal workflow PATH write-commands [OPTIONS] SUBMISSION_IDX JOBSCRIPT_IDX JS_ACTION_IDX EAR_ID
Arguments
- SUBMISSION_IDX#
Required argument
- JOBSCRIPT_IDX#
Required argument
- JS_ACTION_IDX#
Required argument
- EAR_ID#
Required argument
make#
Generate a new hpcFlow workflow.
TEMPLATE_FILE_OR_STR is either a path to a template file in YAML or JSON format, or a YAML/JSON string.
hpcflow make [OPTIONS] TEMPLATE_FILE_OR_STR
Options
- --string#
If set, interpret TEMPLATE_FILE_OR_STR as a YAML/JSON string rather than a file path.
- --format <format>#
If specified, one of “json” or “yaml”. This forces parsing from a particular format.
- Options:
yaml | json
- --path <path>#
The directory path into which the new workflow will be generated.
- --name <name>#
The name of the workflow. If specified, the workflow directory will be path joined with name. If not specified the workflow template name will be used, in combination with a date-timestamp.
- --overwrite#
If True and the workflow directory (path + name) already exists, the existing directory will be overwritten.
- --store <store>#
The persistent store type to use.
- Options:
zarr | zip | json
- --ts-fmt <ts_fmt>#
The datetime format to use for storing datetimes. Datetimes are always stored in UTC (because Numpy does not store time zone info), so this should not include a time zone name.
- --ts-name-fmt <ts_name_fmt>#
The datetime format to use when generating the workflow name, where it includes a timestamp.
Arguments
- TEMPLATE_FILE_OR_STR#
Required argument
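For example, to generate (but not submit) a workflow from a YAML template file, using the zip store format and a custom name (file and workflow names are illustrative):
hpcflow make my_workflow.yaml --store zip --name test_run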
manage#
Infrequent app management tasks.
App config is not loaded.
hpcflow manage [OPTIONS] COMMAND [ARGS]...
clear-known-subs#
Delete the contents of the known-submissions file.
hpcflow manage clear-known-subs [OPTIONS]
clear-temp-dir#
Delete all files in the user runtime directory.
hpcflow manage clear-temp-dir [OPTIONS]
get-config-path#
Print the config file path without loading the config.
This can be used instead of hpcflow open config --path if the config file is invalid, because this command does not load the config.
hpcflow manage get-config-path [OPTIONS]
Options
- --config-dir <config_dir>#
The directory containing the config file whose path is to be returned.
reset-config#
Reset the configuration file to defaults.
This can be used if the current configuration file is invalid.
hpcflow manage reset-config [OPTIONS]
Options
- --config-dir <config_dir>#
The directory containing the config file to be reset.
open#
Open a file (for example hpcFlow’s log file) using the default application.
hpcflow open [OPTIONS] COMMAND [ARGS]...
config#
Open the hpcFlow config file, or retrieve its path.
hpcflow open config [OPTIONS]
Options
- --path#
env-source#
Open a named environment sources file, or the first one.
hpcflow open env-source [OPTIONS]
Options
- --name <name>#
- --path#
known-subs#
Open the known-submissions text file.
hpcflow open known-subs [OPTIONS]
Options
- --path#
log#
Open the hpcFlow log file.
hpcflow open log [OPTIONS]
Options
- --path#
user-data-dir#
hpcflow open user-data-dir [OPTIONS]
Options
- --path#
workflow#
Open a workflow directory using, for example, File Explorer on Windows.
hpcflow open workflow [OPTIONS] WORKFLOW_REF
Options
- --path#
- -r, --ref-type <ref_type>#
- Options:
assume-id | id | path
Arguments
- WORKFLOW_REF#
Required argument
show#
Show information about running and recently active workflows.
hpcflow show [OPTIONS]
Options
- -r, --max-recent <max_recent>#
The maximum number of inactive submissions to show.
- --no-update#
If True, do not update the known-submissions file to remove workflows that are no longer running.
- -f, --full#
Allow multiple lines per workflow submission.
- --legend#
Display the legend for the show command output.
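For example, to show up to ten inactive submissions, allowing multiple lines per workflow submission:
hpcflow show --max-recent 10 --full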
submission#
Submission-related queries.
hpcflow submission [OPTIONS] COMMAND [ARGS]...
Options
- --os-info#
Print information about the operating system.
get-known#
Print known-submissions information as a formatted Python object.
hpcflow submission get-known [OPTIONS]
Options
- --json#
Do not format and only show JSON-compatible information.
scheduler#
hpcflow submission scheduler [OPTIONS] SCHEDULER_NAME COMMAND [ARGS]...
Arguments
- SCHEDULER_NAME#
Required argument
get-login-nodes#
hpcflow submission scheduler SCHEDULER_NAME get-login-nodes [OPTIONS]
shell-info#
Show information about the specified shell, such as the version.
hpcflow submission shell-info [OPTIONS] {bash|powershell|wsl+bash|wsl}
Options
- --exclude-os#
Arguments
- SHELL_NAME#
Required argument
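For example, to show version information for the bash shell, omitting operating system details:
hpcflow submission shell-info bash --exclude-os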
tc#
For showing template component data.
hpcflow tc [OPTIONS]
test#
Run hpcFlow test suite.
PY_TEST_ARGS are arguments passed on to Pytest.
hpcflow test [OPTIONS] [PY_TEST_ARGS]...
Arguments
- PY_TEST_ARGS#
Optional argument(s)
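For example, to run the full test suite, or (assuming pytest-style positional arguments are forwarded as-is) a single test module whose path is illustrative:
hpcflow test
hpcflow test tests/unit/test_config.py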
workflow#
Interact with existing hpcFlow workflows.
WORKFLOW_REF is the path to, or local ID of, an existing workflow.
hpcflow workflow [OPTIONS] WORKFLOW_REF COMMAND [ARGS]...
Options
- -r, --ref-type <ref_type>#
- Options:
assume-id | id | path
Arguments
- WORKFLOW_REF#
Required argument
abort-run#
Abort the specified run.
hpcflow workflow WORKFLOW_REF abort-run [OPTIONS]
Options
- --submission <submission>#
- --task <task>#
- --element <element>#
get-all-params#
Get all parameter values.
hpcflow workflow WORKFLOW_REF get-all-params [OPTIONS]
get-param#
Get a parameter value by data index.
hpcflow workflow WORKFLOW_REF get-param [OPTIONS] INDEX
Arguments
- INDEX#
Required argument
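For example, to print the value of the parameter at data index 12 of the workflow with local ID 3 (both indices are illustrative):
hpcflow workflow 3 get-param 12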
get-param-source#
Get a parameter source by data index.
hpcflow workflow WORKFLOW_REF get-param-source [OPTIONS] INDEX
Arguments
- INDEX#
Required argument
is-param-set#
Check if a parameter specified by data index is set.
hpcflow workflow WORKFLOW_REF is-param-set [OPTIONS] INDEX
Arguments
- INDEX#
Required argument
show-all-status#
Show the submission status of all workflow EARs.
hpcflow workflow WORKFLOW_REF show-all-status [OPTIONS]
sub#
Interact with existing hpcFlow workflow submissions.
SUB_IDX is the submission index.
hpcflow workflow WORKFLOW_REF sub [OPTIONS] SUB_IDX COMMAND [ARGS]...
Arguments
- SUB_IDX#
Required argument
js#
Interact with existing hpcFlow workflow submission jobscripts.
JS_IDX is the jobscript index within the submission object.
hpcflow workflow WORKFLOW_REF sub SUB_IDX js [OPTIONS] JS_IDX COMMAND [ARGS]...
Arguments
- JS_IDX#
Required argument
deps#
Get jobscript dependencies.
hpcflow workflow WORKFLOW_REF sub SUB_IDX js JS_IDX deps [OPTIONS]
path#
Get the file path to the jobscript.
hpcflow workflow WORKFLOW_REF sub SUB_IDX js JS_IDX path [OPTIONS]
res#
Get resources associated with this jobscript.
hpcflow workflow WORKFLOW_REF sub SUB_IDX js JS_IDX res [OPTIONS]
show#
Show the jobscript file.
hpcflow workflow WORKFLOW_REF sub SUB_IDX js JS_IDX show [OPTIONS]
needs-submit#
Check if this submission needs submitting.
hpcflow workflow WORKFLOW_REF sub SUB_IDX needs-submit [OPTIONS]
outstanding-js#
Get a list of jobscript indices that have not yet been submitted.
hpcflow workflow WORKFLOW_REF sub SUB_IDX outstanding-js [OPTIONS]
status#
Get the submission status.
hpcflow workflow WORKFLOW_REF sub SUB_IDX status [OPTIONS]
submitted-js#
Get a list of jobscript indices that have been submitted.
hpcflow workflow WORKFLOW_REF sub SUB_IDX submitted-js [OPTIONS]
submit#
Submit the workflow.
hpcflow workflow WORKFLOW_REF submit [OPTIONS]
Options
- --js-parallelism <js_parallelism>#
If True, allow multiple jobscripts to execute simultaneously. Raises if set to True but the store type does not support the jobscript_parallelism feature. If not set, jobscript parallelism will be used if the store type supports it.
- --wait#
If True, this command will block until the workflow execution is complete.
- --add-to-known, --no-add-to-known#
If True, add this submission to the known-submissions file.
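For example, to submit the workflow with local ID 5 and block until execution completes (the ID is illustrative):
hpcflow workflow 5 submit --wait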
wait#
hpcflow workflow WORKFLOW_REF wait [OPTIONS]
Options
- -j, --jobscripts <jobscripts>#
Wait for only these jobscripts to finish. Jobscripts should be specified by their submission index, followed by a colon, followed by a comma-separated list of jobscript indices within that submission (no spaces are allowed). To specify jobscripts across multiple submissions, use a semicolon to separate patterns like these.
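For example, to wait only for jobscripts 1 and 2 of submission 0, and jobscript 0 of submission 1 (all indices illustrative; the pattern is quoted because most shells treat an unquoted semicolon as a command separator):
hpcflow workflow 8 wait --jobscripts '0:1,2;1:0'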
zip#
hpcflow workflow WORKFLOW_REF zip [OPTIONS]
Options
- --log <log>#
zip#
Generate a copy of the specified workflow in the zip file format.
WORKFLOW_REF is the local ID (as provided by the show command) or the workflow path.
hpcflow zip [OPTIONS] WORKFLOW_REF
Options
- --log <log>#
- -r, --ref-type <ref_type>#
- Options:
assume-id | id | path
Arguments
- WORKFLOW_REF#
Required argument
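For example, to produce a zip copy of a workflow referenced by its local ID, or by its path (both illustrative):
hpcflow zip 12
hpcflow zip --ref-type path /path/to/my-workflow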