hpcflow SDK#

Subpackages#

Submodules#

hpcflow.sdk.api module#

API functions, which are dynamically added to the BaseApp class on __init__.

hpcflow.sdk.api.get_OS_info(app)#

Get information about the operating system.
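The exact return structure is app-defined; as a rough illustration, comparable information can be gathered with the standard-library platform module (the dictionary keys below are assumptions for illustration, not the real return format):

```python
import platform

def get_os_info():
    """Collect basic operating-system information (illustrative sketch)."""
    return {
        "OS_name": platform.system(),      # e.g. "Linux", "Windows", "Darwin"
        "OS_version": platform.version(),
        "OS_release": platform.release(),
        "machine": platform.machine(),     # e.g. "x86_64"
    }

info = get_os_info()
```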

hpcflow.sdk.api.get_shell_info(app, shell_name, exclude_os=False)#

Get information about a given shell and the operating system.

Parameters:
  • shell_name (str) – One of the supported shell names.

  • exclude_os (bool | None) – If True, exclude operating system information.
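The effect of exclude_os can be pictured with a small stdlib-only sketch (the returned keys and the use of the SHELL environment variable are illustrative assumptions, not the real behaviour):

```python
import os
import platform

def get_shell_info(shell_name, exclude_os=False):
    """Illustrative sketch: report a shell's executable, optionally with OS info."""
    info = {
        "shell_name": shell_name,
        "executable": os.environ.get("SHELL", shell_name),
    }
    if not exclude_os:
        # OS information is included unless explicitly excluded:
        info["OS_name"] = platform.system()
    return info

shell = get_shell_info("bash", exclude_os=True)
```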

hpcflow.sdk.api.make_and_submit_workflow(app, template_file_or_str, is_string=False, template_format='yaml', path=None, name=None, overwrite=False, store='zarr', ts_fmt=None, ts_name_fmt=None, JS_parallelism=None)#

Generate and submit a new {app_name} workflow from a file or string containing a workflow template parametrisation.

Parameters:
  • app (App) – The application instance.

  • template_file_or_str (Union[PathLike, str]) – Either a path to a template file in YAML or JSON format, or a YAML/JSON string.

  • is_string (Optional[bool]) – Determines whether template_file_or_str is a file path or a string.

  • template_format (Optional[str]) – If specified, one of “json” or “yaml”. This forces parsing from a particular format.

  • path (Optional[PathLike]) – The directory in which the workflow will be generated. If not specified, the current directory is used.

  • name (Optional[str]) – The name of the workflow. If specified, the workflow directory will be path joined with name. If not specified, the WorkflowTemplate name will be used, in combination with a date-timestamp.

  • overwrite (Optional[bool]) – If True and the workflow directory (path + name) already exists, the existing directory will be overwritten.

  • store (Optional[str]) – The persistent store type to use for this workflow.

  • ts_fmt (Optional[str]) – The datetime format to use for storing datetimes. Datetimes are always stored in UTC (because NumPy does not store time-zone information), so this should not include a time-zone name.

  • ts_name_fmt (Optional[str]) – The datetime format to use when generating the workflow name, where it includes a timestamp.

  • JS_parallelism (Optional[bool]) – If True, allow multiple jobscripts to execute simultaneously. An error is raised if set to True but the store type does not support the jobscript_parallelism feature. If not set, jobscript parallelism will be used if the store type supports it.
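The interaction between is_string and template_format can be pictured as a small dispatch. This is a sketch of the selection logic only, not the real implementation (it handles JSON only, since PyYAML is not in the standard library, and the helper name is hypothetical):

```python
import json
from pathlib import Path

def load_template(template_file_or_str, is_string=False, template_format=None):
    """Resolve the template source to raw text, then parse it (sketch)."""
    if is_string:
        text = template_file_or_str
        fmt = template_format
    else:
        path = Path(template_file_or_str)
        text = path.read_text()
        # If no format is forced, fall back to the file extension:
        fmt = template_format or path.suffix.lstrip(".")
    if fmt == "json":
        return json.loads(text)
    raise NotImplementedError(f"no parser for format {fmt!r} in this sketch")

data = load_template('{"name": "my_workflow"}', is_string=True,
                     template_format="json")
```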

hpcflow.sdk.api.make_workflow(app, template_file_or_str, is_string=False, template_format='yaml', path=None, name=None, overwrite=False, store='zarr', ts_fmt=None, ts_name_fmt=None)#

Generate a new {app_name} workflow from a file or string containing a workflow template parametrisation.

Parameters:
  • app (App) – The application instance.

  • template_file_or_str (Union[PathLike, str]) – Either a path to a template file in YAML or JSON format, or a YAML/JSON string.

  • is_string (Optional[bool]) – Determines whether template_file_or_str is a file path or a string.

  • template_format (Optional[str]) – If specified, one of “json” or “yaml”. This forces parsing from a particular format.

  • path (Optional[PathLike]) – The directory in which the workflow will be generated. If not specified, the current directory is used.

  • name (Optional[str]) – The name of the workflow. If specified, the workflow directory will be path joined with name. If not specified, the workflow template name will be used, in combination with a date-timestamp.

  • overwrite (Optional[bool]) – If True and the workflow directory (path + name) already exists, the existing directory will be overwritten.

  • store (Optional[str]) – The persistent store type to use.

  • ts_fmt (Optional[str]) – The datetime format to use for storing datetimes. Datetimes are always stored in UTC (because NumPy does not store time-zone information), so this should not include a time-zone name.

  • ts_name_fmt (Optional[str]) – The datetime format to use when generating the workflow name, where it includes a timestamp.

Return type:

Workflow
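The path/name/ts_name_fmt interaction described above amounts to joining a base directory with a (possibly timestamped) name. A sketch under the assumption that the timestamp is appended to the template name with an underscore (the separator and default format string are guesses):

```python
from datetime import datetime
from pathlib import Path

def workflow_dir(path=None, name=None, template_name="wk",
                 ts_name_fmt="%Y-%m-%d_%H%M%S"):
    """Build the workflow directory path (illustrative sketch)."""
    base = Path(path) if path is not None else Path.cwd()
    if name is None:
        # Fall back to the template name combined with a date-timestamp:
        name = f"{template_name}_{datetime.now().strftime(ts_name_fmt)}"
    return base / name

explicit = workflow_dir(path="/tmp/runs", name="my_wk")
timestamped = workflow_dir(path="/tmp/runs", template_name="sim")
```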

hpcflow.sdk.api.run_hpcflow_tests(app, *args)#

Run the hpcflow test suite. This function is only available from derived apps.

Notes

It may not be possible to run the hpcflow tests before or after running the tests of a derived app within the same process, due to caching.

hpcflow.sdk.api.run_tests(app, *args)#

Run the {app_name} test suite.

hpcflow.sdk.api.submit_workflow(app, workflow_path, JS_parallelism=None)#

Submit an existing {app_name} workflow.

Parameters:
  • app (App) – The application instance.

  • workflow_path (PathLike) – Path to an existing workflow.

  • JS_parallelism (Optional[bool]) – If True, allow multiple jobscripts to execute simultaneously. An error is raised if set to True but the store type does not support the jobscript_parallelism feature. If not set, jobscript parallelism will be used if the store type supports it.
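The JS_parallelism behaviour described for both submission functions reduces to a three-way check; a sketch of that logic (the supports_js_parallelism flag is a stand-in for the real store-feature query, and the exception type is a guess):

```python
def resolve_js_parallelism(JS_parallelism, supports_js_parallelism):
    """Decide whether jobscript parallelism will be used (sketch)."""
    if JS_parallelism is None:
        # Not set: use parallelism only if the store supports it.
        return supports_js_parallelism
    if JS_parallelism and not supports_js_parallelism:
        raise ValueError("store does not support jobscript parallelism")
    return JS_parallelism

choice = resolve_js_parallelism(None, supports_js_parallelism=True)
```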

hpcflow.sdk.app module#

An hpcflow application.

class hpcflow.sdk.app.App(name, version, description, config_options, scripts_dir, template_components=None, pytest_args=None, package_name=None)#

Bases: BaseApp

Class to generate an hpcflow application (e.g. MatFlow).

Parameters:

template_components (Dict) –

class hpcflow.sdk.app.BaseApp(name, version, description, config_options, scripts_dir, template_components=None, pytest_args=None, package_name=None)#

Bases: object

Class to generate the base hpcflow application.

Parameters:

template_components (Dict) –

property API_logger#
property CLI_logger#
property command_files#
property config#
property config_logger#
property envs#
get_info()#
Return type:

Dict

inject_into(cls)#
property is_config_loaded#
property is_template_components_loaded#
classmethod load_builtin_template_component_data(package)#
load_config(config_dir=None, config_invocation_key=None, **overrides)#
load_template_components(warn=True)#
property logger#
property parameters#
reload_config(config_dir=None, config_invocation_key=None, **overrides)#
reload_template_components(warn=True)#
property runtime_info_logger#
property scripts#
property task_schemas#
property template_components#
template_components_from_json_like(json_like)#
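As noted for hpcflow.sdk.api, the API functions each take the app as their first argument and are dynamically added to the app class on __init__. One common way to implement such attachment is to bind the instance with functools.partial; the sketch below (DemoApp and its attribute names are hypothetical, not the real BaseApp internals) illustrates the pattern:

```python
import functools

def get_OS_info(app):
    """An API function taking the app as its first argument (illustrative)."""
    return f"OS info for {app.name}"

class DemoApp:
    """Stand-in for an app class that attaches API functions on __init__."""
    API_FUNCS = (get_OS_info,)

    def __init__(self, name):
        self.name = name
        for func in self.API_FUNCS:
            # Bind the app instance as the first argument, so users can call
            # app.get_OS_info() without passing `app` themselves:
            setattr(self, func.__name__, functools.partial(func, self))

app = DemoApp("hpcflow")
result = app.get_OS_info()
```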

hpcflow.sdk.log module#

class hpcflow.sdk.log.AppLog(app, log_console_level=None)#

Bases: object

DEFAULT_LOG_CONSOLE_LEVEL = 'WARNING'#
DEFAULT_LOG_FILE_LEVEL = 'INFO'#
add_file_logger(path, level=None, fmt=None, max_bytes=None)#
update_console_level(new_level)#
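Given the default levels above, the class presumably maintains a console handler at WARNING alongside a size-limited file handler at INFO (added via add_file_logger). A standard-library sketch of that arrangement; the exact handler wiring is an assumption:

```python
import logging
import logging.handlers
import os
import tempfile

logger = logging.getLogger("applog_demo")
logger.setLevel(logging.DEBUG)

# Console handler at the default console level:
console = logging.StreamHandler()
console.setLevel(logging.WARNING)
logger.addHandler(console)

# Rotating file handler at the default file level, with a size cap
# corresponding to the max_bytes argument:
log_path = os.path.join(tempfile.mkdtemp(), "app.log")
file_handler = logging.handlers.RotatingFileHandler(log_path, maxBytes=1_000_000)
file_handler.setLevel(logging.INFO)
logger.addHandler(file_handler)

logger.info("recorded in the file, hidden from the console")
file_handler.flush()
```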

hpcflow.sdk.runtime module#

class hpcflow.sdk.runtime.RunTimeInfo(name, package_name, version, logger)#

Bases: PrettyPrinter

Get useful run-time information, including the executable name used to invoke the CLI, in the case that a PyInstaller-built executable was used.

sys_prefix#

From sys.prefix. If running in a virtual environment, this will point to the environment directory. If not running in a virtual environment, this will point to the Python installation root.

Type:

str

sys_base_prefix#

From sys.base_prefix. This will be equal to sys_prefix (sys.prefix) if not running within a virtual environment. However, if running within a virtual environment, this will be the Python installation directory, and sys_prefix will be equal to the virtual environment directory.

Type:

str
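The relationship between the two attributes can be checked directly: sys.prefix differs from sys.base_prefix exactly when running inside a virtual environment.

```python
import sys

# Inside a virtual environment, sys.prefix is the environment directory and
# sys.base_prefix is the underlying Python installation; outside a virtual
# environment the two are equal.
in_venv = sys.prefix != sys.base_prefix
```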

get_activate_env_command()#
get_deactivate_env_command()#
get_invocation_command()#

Get the command that was used to invoke this instance of the app.

hpcflow.sdk.typing module#

Module contents#

Module to define an extensible hpcFlow application class.

hpcflow.sdk.get_SDK_logger(name=None)#

Get a logger with a prefix of “hpcflow_sdk” instead of “hpcflow.sdk”, to ensure that the handlers of the SDK logger and the app logger are distinct.
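The underscore matters because the logging module treats dots as parent/child separators: a logger named "hpcflow.sdk" propagates records up to any handlers on a "hpcflow" logger, whereas "hpcflow_sdk" has no such parent and keeps its handlers separate:

```python
import logging

parent = logging.getLogger("hpcflow")          # the app's logger
app_logger = logging.getLogger("hpcflow.sdk")  # child: propagates to "hpcflow"
sdk_logger = logging.getLogger("hpcflow_sdk")  # independent: parent is root
```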