Command-line interface#

CLI reference documentation.

hpcflow#

hpcflow [OPTIONS] COMMAND [ARGS]...

Options

--version#

Show the version of hpcFlow and exit.

--hpcflow-version#

Show the version of hpcflow and exit.

--help#

Show this message and exit.

--run-time-info#

Print run-time information.

--clear-known-subs#

Delete the contents of the known-submissions file.

--config-dir <config_dir>#

Set the configuration directory.

--config-invocation-key <config_invocation_key>#

Set the configuration invocation key.

--with-config <with_config>#

Override a config item in the config file.
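
For illustration, these global options can be combined with any sub-command; the configuration directory below is a placeholder:

hpcflow --version
hpcflow --run-time-info
hpcflow --config-dir /path/to/config-dir show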

cancel#

hpcflow cancel [OPTIONS] WORKFLOW_REF

Options

-r, --ref-type <ref_type>#
Options:

assume-id | id | path

Arguments

WORKFLOW_REF#

Required argument
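
For example, a workflow can be cancelled by path or by ID (the path and the ID 12 below are placeholders):

hpcflow cancel --ref-type path /path/to/my-workflow
hpcflow cancel --ref-type id 12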

config#

Configuration sub-command for getting and setting data in the configuration file(s).

hpcflow config [OPTIONS] COMMAND [ARGS]...

Options

--invocation <invocation>#

append#

Append a new value to the specified configuration item.

hpcflow config append [OPTIONS] NAME VALUE

Arguments

NAME#

Required argument

VALUE#

Required argument

get#

Show the value of the specified configuration item.

hpcflow config get [OPTIONS] NAME

Options

--all#

Show all configuration items.

--file#

Show the contents of the configuration file.

Arguments

NAME#

Required argument

list#

Show a list of all configurable keys.

hpcflow config list [OPTIONS]

load-data-files#

Check that the data files (e.g. task schema files) specified in the configuration can be loaded.

hpcflow config load-data-files [OPTIONS]

pop#

Remove a value from a list-like configuration item.

hpcflow config pop [OPTIONS] NAME INDEX

Arguments

NAME#

Required argument

INDEX#

Required argument

prepend#

Prepend a new value to the specified configuration item.

hpcflow config prepend [OPTIONS] NAME VALUE

Arguments

NAME#

Required argument

VALUE#

Required argument

set#

Set and save the value of the specified configuration item.

hpcflow config set [OPTIONS] NAME VALUE

Options

--json#

Interpret VALUE as a JSON string.

Arguments

NAME#

Required argument

VALUE#

Required argument

unset#

Unset and save the value of the specified configuration item.

hpcflow config unset [OPTIONS] NAME

Arguments

NAME#

Required argument
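
As a sketch, the following commands list the configurable keys and then read, modify, and reset a configuration item; the item names and values used here are placeholders only (use hpcflow config list to see the real keys):

hpcflow config list
hpcflow config get machine
hpcflow config set machine my-cluster
hpcflow config set --json my_options '{"a": 1}'
hpcflow config append my_list_item extra-value
hpcflow config unset machine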

demo-software#

hpcflow demo-software [OPTIONS] COMMAND [ARGS]...

doSomething#

hpcflow demo-software doSomething [OPTIONS]

Options

-i1, --infile1 <infile1>#

Required

-i2, --infile2 <infile2>#

Required

-v, --value <value>#
-o, --out <out>#

go#

Generate and submit a new hpcFlow workflow.

TEMPLATE_FILE_OR_STR is either a path to a template file in YAML or JSON format, or a YAML/JSON string.

hpcflow go [OPTIONS] TEMPLATE_FILE_OR_STR

Options

--string#

Determines whether TEMPLATE_FILE_OR_STR is interpreted as a file path or a string.

--format <format>#

If specified, one of “json” or “yaml”. This forces parsing from a particular format.

Options:

yaml | json

--path <path>#

The directory path into which the new workflow will be generated.

--name <name>#

The name of the workflow. If specified, the workflow directory will be the specified path joined with this name. If not specified, the workflow template name will be used, in combination with a date-timestamp.

--overwrite#

If True and the workflow directory (path + name) already exists, the existing directory will be overwritten.

--store <store>#

The persistent store type to use.

Options:

zarr | zip | json

--ts-fmt <ts_fmt>#

The datetime format to use for storing datetimes. Datetimes are always stored in UTC (because Numpy does not store time zone info), so this should not include a time zone name.

--ts-name-fmt <ts_name_fmt>#

The datetime format to use when generating the workflow name, where it includes a timestamp.

--js-parallelism <js_parallelism>#

If True, allow multiple jobscripts to execute simultaneously. An error is raised if this is set to True but the store type does not support the jobscript_parallelism feature. If not set, jobscript parallelism will be used if the store type supports it.

--wait#

If True, this command will block until the workflow execution is complete.

Arguments

TEMPLATE_FILE_OR_STR#

Required argument
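
As a sketch, a workflow can be generated and submitted from a hypothetical template file named workflow.yaml, either by path or as a literal string (the paths and names below are placeholders):

hpcflow go workflow.yaml
hpcflow go --path /scratch/runs --name my-run --overwrite workflow.yaml
hpcflow go --string --format yaml "$(cat workflow.yaml)"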

helper#

hpcflow helper [OPTIONS] COMMAND [ARGS]...

clear#

Remove the PID file (and kill the helper process if it exists). This should not normally be needed.

hpcflow helper clear [OPTIONS]

log-path#

Get the path to the helper log file (may not exist).

hpcflow helper log-path [OPTIONS]

pid#

Get the process ID of the running helper, if running.

hpcflow helper pid [OPTIONS]

Options

-f, --file#

restart#

Restart (or start) the helper process.

hpcflow helper restart [OPTIONS]

Options

--timeout <timeout>#

Helper timeout in seconds.

Default:

3600

--timeout-check-interval <timeout_check_interval>#

Interval between testing if the timeout has been exceeded in seconds.

Default:

60

--watch-interval <watch_interval>#

Polling interval for watching workflows (and the workflow watch list) in seconds.

Default:

10

run#

Run the helper functionality.

hpcflow helper run [OPTIONS]

Options

--timeout <timeout>#

Helper timeout in seconds.

Default:

3600

--timeout-check-interval <timeout_check_interval>#

Interval between testing if the timeout has been exceeded in seconds.

Default:

60

--watch-interval <watch_interval>#

Polling interval for watching workflows (and the workflow watch list) in seconds.

Default:

10

start#

Start the helper process.

hpcflow helper start [OPTIONS]

Options

--timeout <timeout>#

Helper timeout in seconds.

Default:

3600

--timeout-check-interval <timeout_check_interval>#

Interval between testing if the timeout has been exceeded in seconds.

Default:

60

--watch-interval <watch_interval>#

Polling interval for watching workflows (and the workflow watch list) in seconds.

Default:

10

stop#

Stop the helper process, if it is running.

hpcflow helper stop [OPTIONS]

uptime#

Get the uptime of the helper process, if it is running.

hpcflow helper uptime [OPTIONS]

watch-list#

Get the list of workflows currently being watched.

hpcflow helper watch-list [OPTIONS]

watch-list-path#

Get the path to the workflow watch list file (may not exist).

hpcflow helper watch-list-path [OPTIONS]
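
As an illustration, a typical helper session might start the process with a custom timeout, query it, and then stop it (the timeout value 7200 is arbitrary):

hpcflow helper start --timeout 7200
hpcflow helper pid
hpcflow helper watch-list
hpcflow helper stop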

internal#

Internal CLI to be invoked by scripts generated by the app.

hpcflow internal [OPTIONS] COMMAND [ARGS]...

workflow#

hpcflow internal workflow [OPTIONS] PATH COMMAND [ARGS]...

Arguments

PATH#

Required argument

get-ear-skipped#

Return 1 if the given EAR is to be skipped, else return 0.

hpcflow internal workflow PATH get-ear-skipped [OPTIONS] EAR_ID

Arguments

EAR_ID#

Required argument

launch-direct-win-js#

Launch a jobscript directly (without a scheduler) on Windows.

hpcflow internal workflow PATH launch-direct-win-js [OPTIONS] SUBMISSION_IDX
                                                    JOBSCRIPT_IDX

Arguments

SUBMISSION_IDX#

Required argument

JOBSCRIPT_IDX#

Required argument

save-parameter#

hpcflow internal workflow PATH save-parameter [OPTIONS] NAME VALUE EAR_ID

Arguments

NAME#

Required argument

VALUE#

Required argument

EAR_ID#

Required argument

set-ear-end#

hpcflow internal workflow PATH set-ear-end [OPTIONS] EAR_ID EXIT_CODE

Arguments

EAR_ID#

Required argument

EXIT_CODE#

Required argument

set-ear-skip#

hpcflow internal workflow PATH set-ear-skip [OPTIONS] EAR_ID

Arguments

EAR_ID#

Required argument

set-ear-start#

hpcflow internal workflow PATH set-ear-start [OPTIONS] EAR_ID

Arguments

EAR_ID#

Required argument

write-commands#

hpcflow internal workflow PATH write-commands [OPTIONS] SUBMISSION_IDX
                                              JOBSCRIPT_IDX JS_ACTION_IDX
                                              EAR_ID

Arguments

SUBMISSION_IDX#

Required argument

JOBSCRIPT_IDX#

Required argument

JS_ACTION_IDX#

Required argument

EAR_ID#

Required argument

make#

Generate a new hpcFlow workflow.

TEMPLATE_FILE_OR_STR is either a path to a template file in YAML or JSON format, or a YAML/JSON string.

hpcflow make [OPTIONS] TEMPLATE_FILE_OR_STR

Options

--string#

Determines whether TEMPLATE_FILE_OR_STR is interpreted as a file path or a string.

--format <format>#

If specified, one of “json” or “yaml”. This forces parsing from a particular format.

Options:

yaml | json

--path <path>#

The directory path into which the new workflow will be generated.

--name <name>#

The name of the workflow. If specified, the workflow directory will be the specified path joined with this name. If not specified, the workflow template name will be used, in combination with a date-timestamp.

--overwrite#

If True and the workflow directory (path + name) already exists, the existing directory will be overwritten.

--store <store>#

The persistent store type to use.

Options:

zarr | zip | json

--ts-fmt <ts_fmt>#

The datetime format to use for storing datetimes. Datetimes are always stored in UTC (because Numpy does not store time zone info), so this should not include a time zone name.

--ts-name-fmt <ts_name_fmt>#

The datetime format to use when generating the workflow name, where it includes a timestamp.

Arguments

TEMPLATE_FILE_OR_STR#

Required argument
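
For example, assuming a hypothetical template file workflow.yaml, a workflow can be generated (without submitting it) into a chosen directory and store format:

hpcflow make workflow.yaml
hpcflow make --path /scratch/runs --store json workflow.yaml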

open#

Open a file (e.g. hpcFlow’s log file) using the default application.

hpcflow open [OPTIONS] COMMAND [ARGS]...

config#

Open the hpcFlow config file.

hpcflow open config [OPTIONS]

Options

--path#

env-source#

Open a named environment sources file, or the first one.

hpcflow open env-source [OPTIONS]

Options

--name <name>#
--path#

known-subs#

Open the known-submissions text file.

hpcflow open known-subs [OPTIONS]

Options

--path#

log#

Open the hpcFlow log file.

hpcflow open log [OPTIONS]

Options

--path#

user-data-dir#

Open the hpcFlow user data directory.

hpcflow open user-data-dir [OPTIONS]

Options

--path#

workflow#

Open a workflow directory in the system file browser (e.g. File Explorer on Windows).

hpcflow open workflow [OPTIONS] WORKFLOW_REF

Options

--path#
-r, --ref-type <ref_type>#
Options:

assume-id | id | path

Arguments

WORKFLOW_REF#

Required argument
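
For example (the workflow path below is a placeholder), the config and log files can be opened or their paths printed, and a workflow directory can be opened by path:

hpcflow open config
hpcflow open log --path
hpcflow open workflow --ref-type path /path/to/my-workflow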

show#

Show information about recent workflows.

hpcflow show [OPTIONS]

Options

--max-recent <max_recent>#

The maximum number of recently finished workflows to show.

--no-update#

If True, do not update the known-submissions file to remove workflows that are no longer running.

-f, --full#

Allow multiple lines per workflow submission.

--legend#

Display the legend for the show command output.
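
For example (the value 5 is arbitrary):

hpcflow show
hpcflow show --max-recent 5 --full
hpcflow show --legend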

submission#

Submission-related queries.

hpcflow submission [OPTIONS] COMMAND [ARGS]...

Options

--os-info#

Print information about the operating system.

shell-info#

hpcflow submission shell-info [OPTIONS] {bash|powershell|wsl+bash|wsl}

Options

--exclude-os#

Arguments

SHELL_NAME#

Required argument
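
For example, to query the Bash or WSL shells:

hpcflow submission shell-info bash
hpcflow submission shell-info --exclude-os wsl+bash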

tc#

Show template component data.

hpcflow tc [OPTIONS]

test#

Run the hpcFlow test suite.

PY_TEST_ARGS are arguments passed on to Pytest.

hpcflow test [OPTIONS] [PY_TEST_ARGS]...

Arguments

PY_TEST_ARGS#

Optional argument(s)
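
For example, extra tokens are passed through to Pytest; in the second invocation the “--” separator stops hpcflow’s own option parsing, and the -k expression is a placeholder test name:

hpcflow test
hpcflow test -- -k "my_test_name"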

test-hpcflow#

Run the hpcflow test suite.

PY_TEST_ARGS are arguments passed on to Pytest.

hpcflow test-hpcflow [OPTIONS] [PY_TEST_ARGS]...

Arguments

PY_TEST_ARGS#

Optional argument(s)

workflow#

Interact with existing hpcFlow workflows.

WORKFLOW_PATH is the path to an existing workflow.

hpcflow workflow [OPTIONS] WORKFLOW_PATH COMMAND [ARGS]...

Arguments

WORKFLOW_PATH#

Required argument

get-all-params#

Get all parameter values.

hpcflow workflow WORKFLOW_PATH get-all-params [OPTIONS]

get-param#

Get a parameter value by data index.

hpcflow workflow WORKFLOW_PATH get-param [OPTIONS] INDEX

Arguments

INDEX#

Required argument

get-param-source#

Get a parameter source by data index.

hpcflow workflow WORKFLOW_PATH get-param-source [OPTIONS] INDEX

Arguments

INDEX#

Required argument

is-param-set#

Check if a parameter specified by data index is set.

hpcflow workflow WORKFLOW_PATH is-param-set [OPTIONS] INDEX

Arguments

INDEX#

Required argument

show-all-status#

Show the submission status of all workflow EARs.

hpcflow workflow WORKFLOW_PATH show-all-status [OPTIONS]
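
For example, given a hypothetical workflow directory, parameters can be inspected by their data index (the index 0 is a placeholder):

hpcflow workflow /path/to/my-workflow get-all-params
hpcflow workflow /path/to/my-workflow get-param 0
hpcflow workflow /path/to/my-workflow get-param-source 0
hpcflow workflow /path/to/my-workflow is-param-set 0
hpcflow workflow /path/to/my-workflow show-all-status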

sub#

Interact with existing hpcFlow workflow submissions.

SUB_IDX is the submission index.

hpcflow workflow WORKFLOW_PATH sub [OPTIONS] SUB_IDX COMMAND [ARGS]...

Arguments

SUB_IDX#

Required argument

js#

Interact with existing hpcFlow workflow submission jobscripts.

JS_IDX is the jobscript index within the submission object.

hpcflow workflow WORKFLOW_PATH sub SUB_IDX js [OPTIONS] JS_IDX COMMAND
                                              [ARGS]...

Arguments

JS_IDX#

Required argument

deps#

Get jobscript dependencies.

hpcflow workflow WORKFLOW_PATH sub SUB_IDX js JS_IDX deps [OPTIONS]

path#

Get the file path to the jobscript.

hpcflow workflow WORKFLOW_PATH sub SUB_IDX js JS_IDX path [OPTIONS]

res#

Get resources associated with this jobscript.

hpcflow workflow WORKFLOW_PATH sub SUB_IDX js JS_IDX res [OPTIONS]

show#

Show the jobscript file.

hpcflow workflow WORKFLOW_PATH sub SUB_IDX js JS_IDX show [OPTIONS]

needs-submit#

Check if this submission needs submitting.

hpcflow workflow WORKFLOW_PATH sub SUB_IDX needs-submit [OPTIONS]

outstanding-js#

Get a list of jobscript indices that have not yet been submitted.

hpcflow workflow WORKFLOW_PATH sub SUB_IDX outstanding-js [OPTIONS]

status#

Get the submission status.

hpcflow workflow WORKFLOW_PATH sub SUB_IDX status [OPTIONS]

submitted-js#

Get a list of jobscript indices that have been submitted.

hpcflow workflow WORKFLOW_PATH sub SUB_IDX submitted-js [OPTIONS]
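
As an illustration (the path and indices below are placeholders), submission and jobscript details can be queried like so:

hpcflow workflow /path/to/my-workflow sub 0 status
hpcflow workflow /path/to/my-workflow sub 0 outstanding-js
hpcflow workflow /path/to/my-workflow sub 0 js 0 path
hpcflow workflow /path/to/my-workflow sub 0 js 0 deps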

submit#

Submit the workflow.

hpcflow workflow WORKFLOW_PATH submit [OPTIONS]

Options

--js-parallelism <js_parallelism>#

If True, allow multiple jobscripts to execute simultaneously. An error is raised if this is set to True but the store type does not support the jobscript_parallelism feature. If not set, jobscript parallelism will be used if the store type supports it.

--wait#

If True, this command will block until the workflow execution is complete.
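
For example (the workflow path is a placeholder), a workflow can be submitted and optionally waited upon:

hpcflow workflow /path/to/my-workflow submit
hpcflow workflow /path/to/my-workflow submit --wait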

wait#

hpcflow workflow WORKFLOW_PATH wait [OPTIONS] SUB_JS_IDX

Arguments

SUB_JS_IDX#

Required argument

zip#

Generate a zipped copy of the specified workflow.

hpcflow zip [OPTIONS] WORKFLOW_REF

Options

--log <log>#
-r, --ref-type <ref_type>#
Options:

assume-id | id | path

Arguments

WORKFLOW_REF#

Required argument
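
For example (the ID 12 and the path below are placeholders):

hpcflow zip 12
hpcflow zip --ref-type path /path/to/my-workflow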