hpcflow.sdk.persistence.json.JSONPersistentStore#
- class hpcflow.sdk.persistence.json.JSONPersistentStore(app, workflow, path, fs)#
Bases:
PersistentStore
Methods
Add a new EAR to an element iteration.
Add a new element to a task.
Add a new iteration to an element.
Add a new loop to the workflow.
Add a new submission.
Add a new task to the workflow.
Context manager for using the persistent element/iteration/run cache.
Context manager to cache the metadata.
For each parameter ID, return True if it exists, else False.
Copy the workflow store.
Delete the persistent workflow.
Permanently delete the workflow data with no confirmation.
Retrieve all loops, including pending.
Retrieve loops by index (ID), including pending.
Retrieve all submissions, including pending.
Get element data by indices within a given task.
Retrieve all tasks, including pending.
Get all template components, including pending.
Generate a store for testing purposes.
Generate a valid store from a specification in terms of nested elements/iterations/EARs.
Try very hard to delete a directory or file.
Revert the replaced workflow path to its original name.
Commit pending changes to disk, if not in batch-update mode.
Context manager for managing StoreResource objects associated with the store.
Attributes
Cache for persistent EARs.
Cache for persistent elements.
Cache for persistent element iterations.
Does this store support workflow submission?
Cache for number of persistent tasks.
Cache for persistent parameter sources.
Cache for persistent parameters.
Cache for persistent tasks.
- property EAR_cache#
Cache for persistent EARs.
- add_EAR(elem_iter_ID, action_idx, commands_idx, data_idx, metadata, save=True)#
Add a new EAR to an element iteration.
- add_element(task_ID, es_idx, seq_idx, src_idx, save=True)#
Add a new element to a task.
- add_element_iteration(element_ID, data_idx, schema_parameters, loop_idx=None, save=True)#
Add a new iteration to an element.
- add_element_set(task_id, es_js, save=True)#
- add_file(store_contents, is_input, source, path=None, contents=None, filename=None, save=True)#
- add_loop(loop_template, iterable_parameters, iter_IDs, save=True)#
Add a new loop to the workflow.
- add_set_parameter(data, source, save=True)#
- add_submission(sub_idx, sub_js, save=True)#
Add a new submission.
- add_submission_part(sub_idx, dt_str, submitted_js_idx, save=True)#
- add_task(idx, task_template, save=True)#
Add a new task to the workflow.
- add_template_components(temp_comps, save=True)#
- cache_ctx()#
Context manager for using the persistent element/iteration/run cache.
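The cache context managers above follow a common pattern: enable an in-memory cache on entry, then disable and clear it on exit. A minimal illustrative sketch of that pattern, assuming a toy store class (`_SketchStore` and its attributes are hypothetical, not hpcflow's actual implementation):

```python
from contextlib import contextmanager


class _SketchStore:
    """Toy stand-in for a persistent store (hypothetical; illustration only)."""

    def __init__(self):
        self.use_cache = False
        self.element_cache = {}

    @contextmanager
    def cache_ctx(self):
        """Enable the in-memory cache for the duration of the block."""
        self.use_cache = True
        try:
            yield
        finally:
            # Drop cached objects and fall back to reading from disk.
            self.use_cache = False
            self.element_cache.clear()


store = _SketchStore()
with store.cache_ctx():
    store.element_cache[0] = "element-0"  # reads would populate this
    assert store.use_cache
# Outside the block, the cache is disabled and empty again.
```

The same enter/exit discipline applies to the metadata cache: callers never manage cache lifetime manually, so the cache cannot leak stale objects past the block.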
- check_parameters_exist(id_lst)#
For each parameter ID, return True if it exists, else False.
- copy(path=None)#
Copy the workflow store.
This does not work on remote filesystems.
- delete()#
Delete the persistent workflow.
- Return type:
None
- delete_no_confirm()#
Permanently delete the workflow data with no confirmation.
- Return type:
None
- property element_cache#
Cache for persistent elements.
- property element_iter_cache#
Cache for persistent element iterations.
- get_creation_info()#
- get_element_iterations(id_lst)#
- get_loops_by_IDs(id_lst)#
Retrieve loops by index (ID), including pending.
- get_name()#
- get_parameters(id_lst, **kwargs)#
- get_task_elements(task_id, idx_lst=None)#
Get element data by indices within a given task.
Element iterations and EARs belonging to the elements are included.
- get_ts_fmt()#
- get_ts_name_fmt()#
- property has_pending#
- property is_submittable#
Does this store support workflow submission?
- property logger#
- classmethod make_test_store_from_spec(app, spec, dir=None, path='test_store.json', overwrite=False)#
Generate a store for testing purposes.
- property num_tasks_cache#
Cache for number of persistent tasks.
- property param_sources_cache#
Cache for persistent parameter sources.
- property parameter_cache#
Cache for persistent parameters.
- static prepare_test_store_from_spec(task_spec)#
Generate a valid store from a specification in terms of nested elements/iterations/EARs.
- reinstate_replaced_dir()#
- Return type:
None
- remove_path(path, fs)#
Try very hard to delete a directory or file.
Dropbox (on Windows, at least) seems to try to re-sync files if the parent directory is deleted soon after creation, which happens when workflow creation fails (e.g. due to missing inputs). So, in addition to catching the PermissionError raised when Dropbox holds a lock on a file, we repeatedly retry deleting the directory tree.
- Parameters:
path (str) –
- Return type:
None
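The retry behaviour described above can be sketched with the standard library alone. This is an illustrative pattern, not hpcflow's actual code; the function name, retry count, and delay are assumptions:

```python
import os
import shutil
import tempfile
import time


def remove_path_with_retries(path, max_retries=5, delay=0.2):
    """Repeatedly try to delete a directory tree or file, tolerating
    transient PermissionErrors (e.g. from a sync client holding a lock)."""
    for attempt in range(max_retries):
        try:
            if os.path.isdir(path):
                shutil.rmtree(path)
            elif os.path.exists(path):
                os.remove(path)
            return
        except PermissionError:
            time.sleep(delay)  # back off, then try again
    raise OSError(f"could not delete {path!r} after {max_retries} attempts")


# Usage: delete a freshly created directory tree.
tmp = tempfile.mkdtemp()
remove_path_with_retries(tmp)
```

Note that the real method also accepts an `fs` argument, so it can operate on non-local filesystems; the sketch above covers only the local case.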
- remove_replaced_dir()#
- Return type:
None
- rename_path(replaced, original, fs)#
Revert the replaced workflow path to its original name.
This happens when new workflow creation fails and an existing workflow with the same name was renamed to make way for it; that original workflow must be reverted to its previous name.
- save()#
Commit pending changes to disk, if not in batch-update mode.
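The `add_*` and `set_*` methods above all take `save=True`, which points at a common pattern: mutations accumulate in a pending buffer, and `save()` commits them unless a batch update is in progress, in which case a single commit happens at the end of the batch. A minimal sketch of that pattern (the `_PendingSketch` class is hypothetical, not hpcflow's implementation):

```python
class _PendingSketch:
    """Toy pending-changes buffer (illustrative only)."""

    def __init__(self):
        self.on_disk = []      # stands in for the persistent JSON contents
        self.pending = []      # changes not yet committed
        self.in_batch = False  # True inside a batch-update block

    def add_task(self, task, save=True):
        self.pending.append(task)
        if save:
            self.save()

    def save(self):
        """Commit pending changes to disk, if not in batch-update mode."""
        if self.in_batch:
            return  # deferred: one commit happens when the batch ends
        self.on_disk.extend(self.pending)
        self.pending.clear()


store = _PendingSketch()
store.add_task("t0")    # committed immediately
store.in_batch = True
store.add_task("t1")    # buffered: save() is a no-op inside a batch
store.in_batch = False
store.save()            # batch over: flush everything
```

Passing `save=False` to a mutation method would likewise defer the commit, letting a caller group several changes into one write.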
- set_EAR_end(EAR_ID, exit_code, success, save=True)#
- set_EAR_submission_index(EAR_ID, sub_idx, save=True)#
- set_EARs_initialised(iter_ID, save=True)#
- set_file(store_contents, is_input, param_id=None, path=None, contents=None, filename=None, clean_up=False, save=True)#
- set_jobscript_metadata(sub_idx, js_idx, version_info=None, submit_time=None, submit_hostname=None, submit_machine=None, submit_cmdline=None, os_name=None, shell_name=None, scheduler_name=None, scheduler_job_ID=None, process_ID=None, save=True)#
- set_parameter_value(param_id, value, is_file=False, save=True)#
- property task_cache#
Cache for persistent tasks.
- update_loop_num_iters(index, num_iters, save=True)#
- update_param_source(param_id, source, save=True)#
- property use_cache#
- using_resource(res_label, action)#
Context manager for managing StoreResource objects associated with the store.
- classmethod write_empty_workflow(app, template_js, template_components_js, wk_path, fs, name, replaced_wk, creation_info, ts_fmt, ts_name_fmt)#