hpcflow.app.Submission#
- class hpcflow.app.Submission(index, jobscripts, workflow=None, submission_parts=None, JS_parallelism=None, environments=None)#
Bases: Submission
A collection of jobscripts to be submitted to a scheduler.
- Parameters:
index (int) – The index of this submission.
jobscripts (list[Jobscript]) – The jobscripts in the submission.
workflow (Workflow) – The workflow this is part of.
submission_parts (dict) – Description of submission parts.
JS_parallelism (bool) – Whether to exploit jobscript parallelism.
environments (EnvironmentsList) – The execution environments to use.
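For orientation, here is a minimal sketch of inspecting a submission's basic attributes. It assumes submissions are obtained from an existing workflow via a `submissions` list and that `hf.Workflow` can load a workflow from a path; both are assumptions about typical hpcFlow usage rather than part of this reference, and the path is a placeholder.

```python
import hpcflow.app as hf

# Assumed usage: load an existing workflow and take its first submission.
wf = hf.Workflow("/path/to/workflow")   # placeholder path
sub = wf.submissions[0]                 # assumed access pattern

print(sub.index)             # index of this submission
print(len(sub.jobscripts))   # number of jobscripts it contains
print(sub.JS_parallelism)    # whether jobscript parallelism is exploited
print(sub.environments)      # execution environments in use
```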
Methods
cancel – Cancel the active jobs for this submission's jobscripts.
from_json_like – Make an instance of this class from JSON (or YAML) data.
get_active_jobscripts – Get jobscripts that are active on this machine, and their active states.
get_end_time – Get the end time of a given submission part.
get_start_time – Get the start time of a given submission part.
get_unique_schedulers – Get unique schedulers and which of this submission's jobscripts they correspond to.
get_unique_schedulers_of_jobscripts – Get unique schedulers and which of the passed jobscripts they correspond to.
get_unique_shells – Get unique shells and which jobscripts they correspond to.
submit – Generate and submit the jobscripts of this submission.
to_dict – Serialize this object as a dictionary.
to_json_like – Serialize this object as an object structure that can be trivially converted to JSON.
Attributes
EARs_by_elements – All EARs in this submission, grouped by element.
JS_parallelism – Whether to exploit jobscript parallelism.
abort_EARs_file_name – The name of a file describing what EARs have aborted.
abort_EARs_file_path – The path to the file describing what EARs have aborted in this submission.
all_EAR_IDs – The IDs of all EARs in this submission.
all_EARs – All EARs in this submission.
end_time – Get the final non-None end time over all submission parts.
environments – The execution environments to use.
index – The index of this submission.
jobscript_indices – All associated jobscript indices.
jobscripts – The jobscripts in this submission.
needs_submit – Whether this submission needs a submit to be done.
outstanding_jobscripts – Jobscript indices that have not yet been successfully submitted.
path – The path to files associated with this submission.
start_time – Get the first non-None start time over all submission parts.
status – The status of this submission.
submission_parts – Description of the parts of this submission.
submitted_jobscripts – Jobscript indices that have been successfully submitted.
workflow – The workflow this is part of.
- property EARs_by_elements#
All EARs in this submission, grouped by element.
- property JS_parallelism#
Whether to exploit jobscript parallelism.
- property abort_EARs_file_name#
The name of a file describing what EARs have aborted.
- property abort_EARs_file_path#
The path to the file describing what EARs have aborted in this submission.
- property all_EAR_IDs#
The IDs of all EARs in this submission.
- property all_EARs#
All EARs in this submission.
- app = BaseApp(name='hpcFlow', version='0.2.0a181')#
- cancel()#
Cancel the active jobs for this submission’s jobscripts.
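A short usage sketch, assuming `sub` is a Submission obtained as in the earlier example:

```python
# Cancel any scheduler jobs that are still active for this submission's jobscripts.
sub.cancel()
```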
- property end_time#
Get the final non-None end time over all submission parts.
- property environments: EnvironmentsList#
The execution environments to use.
- classmethod from_json_like(json_like, shared_data=None)#
Make an instance of this class from JSON (or YAML) data.
- Parameters:
json_like (Union[Dict, List]) – The data to deserialise.
shared_data (Optional[Dict[str, ObjectList]]) – Shared context data.
- Returns:
The deserialised object.
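A hedged sketch of deserialisation. The exact shape of the JSON-like data, and whether shared context data is required for a Submission, is not specified here; `json_like_data` and `shared` are placeholders, not documented values.

```python
# `json_like_data` is a dict/list previously produced by to_json_like();
# `shared` is optional shared context data (placeholder; may not be needed).
sub2 = hf.Submission.from_json_like(json_like_data, shared_data=shared)
```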
- get_active_jobscripts(as_json=False)#
Get jobscripts that are active on this machine, and their active states.
- Parameters:
as_json (bool) –
- Return type:
List[Tuple[int, Dict[int, JobscriptElementState]]]
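A sketch following the documented return type (a list of (jobscript index, element-state mapping) tuples); `sub` is carried over from the earlier sketch.

```python
# Report which of this submission's jobscripts are active on this machine,
# and the per-element states within each active jobscript.
for js_idx, element_states in sub.get_active_jobscripts():
    print(f"jobscript {js_idx}:")
    for elem_idx, state in element_states.items():
        print(f"  element {elem_idx}: {state}")
```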
- get_end_time(submit_time)#
Get the end time of a given submission part.
- Parameters:
submit_time (str) –
- Return type:
Union[datetime, None]
- get_start_time(submit_time)#
Get the start time of a given submission part.
- Parameters:
submit_time (str) –
- Return type:
Union[datetime, None]
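A hedged sketch computing the wall-clock duration of each submission part. It assumes the keys of `submission_parts` are the submit-time strings that `get_start_time()` and `get_end_time()` expect.

```python
# For each submission part, report the elapsed time where both timestamps exist.
for submit_time in sub.submission_parts:
    start = sub.get_start_time(submit_time)
    end = sub.get_end_time(submit_time)
    if start is not None and end is not None:
        print(submit_time, end - start)
```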
- get_unique_schedulers()#
Get unique schedulers and which of this submission’s jobscripts they correspond to.
- static get_unique_schedulers_of_jobscripts(jobscripts)#
Get unique schedulers and which of the passed jobscripts they correspond to.
Uniqueness is determined only by the Scheduler.unique_properties tuple.
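A hedged sketch of both scheduler-grouping methods. The precise structure of their return values (assumed here to be a mapping from jobscript references to scheduler objects) is an assumption; only the general pattern is shown.

```python
# Group this submission's jobscripts by their (unique) scheduler.
for js_refs, scheduler in sub.get_unique_schedulers().items():
    print(scheduler, "->", js_refs)

# The static variant works on an arbitrary list of jobscripts.
unique = hf.Submission.get_unique_schedulers_of_jobscripts(sub.jobscripts)
```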
- get_unique_shells()#
Get unique shells and which jobscripts they correspond to.
- property needs_submit#
Whether this submission needs a submit to be done.
- property outstanding_jobscripts: Tuple[int]#
Jobscript indices that have not yet been successfully submitted.
- property path#
The path to files associated with this submission.
- property start_time#
Get the first non-None start time over all submission parts.
- property status#
The status of this submission.
- submit(status, ignore_errors=False, print_stdout=False, add_to_known=True)#
Generate and submit the jobscripts of this submission.
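A hedged sketch of submitting outstanding jobscripts directly. The `status` argument is assumed here to be a console status object (e.g. from rich); in normal use, submission is typically driven via the parent workflow rather than by calling this method on a Submission.

```python
import rich.console

console = rich.console.Console()
with console.status("Submitting jobscripts...") as status:  # assumed status object
    if sub.needs_submit:
        # keyword arguments shown with their documented defaults
        sub.submit(status, ignore_errors=False, print_stdout=False, add_to_known=True)
```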
- to_dict()#
Serialize this object as a dictionary.
- to_json_like(dct=None, shared_data=None, exclude=None, path=None)#
Serialize this object as an object structure that can be trivially converted to JSON. Note that YAML can also be produced from the result of this method; it just requires a different final serialization step.
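A short sketch of serialisation using the two methods above. Converting the result to a JSON string with the standard library is shown only as an illustration; `default=str` is a precaution in case non-JSON-native values (e.g. datetimes) appear in the dictionary.

```python
import json

as_dict = sub.to_dict()          # plain dictionary form
json_like = sub.to_json_like()   # JSON-compatible object structure

print(json.dumps(as_dict, default=str, indent=2))
```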