hpcflow.sdk.core.parameters.SchemaInput#
- class hpcflow.sdk.core.parameters.SchemaInput(parameter, multiple=False, labels=None, default_value=NullDefault.NULL, propagation_mode=ParameterPropagationMode.IMPLICIT, group=None)#
Bases: SchemaParameter
A Parameter as used within a particular schema, to which a default value may be applied.
- Parameters:
parameter (Parameter | str) – The parameter (i.e. type) of this schema input.
multiple (bool) – If True, expect one or more of these parameters to be defined in the workflow, distinguished by a string label in square brackets; for example, p1[0] for a parameter p1.
labels (dict[str, LabelInfo] | None) – Dict whose keys represent the string labels that distinguish multiple parameters if multiple is True. Use the key “*” to mean all labels not matching other label keys. If multiple is False, this will default to a single-item dict with an empty string key: {“”: {}}. If multiple is True, this will default to a single-item dict with the catch-all key: {“*”: {}}. On initialisation, remaining keyword arguments are treated as default values for the dict values of labels.
default_value (InputValue | Any | NullDefault) – The default value for this input parameter. This is itself a default that will be applied to all labels values that do not define their own “default_value” key.
propagation_mode (ParameterPropagationMode) – Determines how this input should propagate through the workflow. This is a default that will be applied to all labels values that do not define their own “propagation_mode” key. By default, the input may be used in downstream tasks simply because it has a compatible type (the “implicit” propagation mode). The other options are “explicit”, meaning that the parameter must be explicitly specified in the downstream task input_sources for it to be used, and “never”, meaning that the parameter must not be used in downstream tasks and will be inaccessible to them.
group (str | None) – The name of the element group from which this input should be sourced. This is a default that will be applied to all labels values that do not define their own “group” key.
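For illustration, a minimal construction sketch (assuming the hpcflow app object is importable as below; the exact import alias may differ in your setup):

```python
from hpcflow.app import app as hf  # assumed import path for the app object

# Single (unlabelled) input; `parameter` may be a Parameter object or a type string:
inp = hf.SchemaInput(parameter="p1", default_value=101)

# Multiple labelled inputs: the workflow may then define `p2[one]` and `p2[two]`,
# each label carrying its own default value:
inp_multi = hf.SchemaInput(
    parameter="p2",
    multiple=True,
    labels={
        "one": {"default_value": 1},
        "two": {"default_value": 2},
    },
)
```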
Methods
from_json_like – Make an instance of this class from JSON (or YAML) data.
labelled_info – Get descriptors for all the labels associated with this input.
to_dict – Serialize this object as a dictionary.
to_json_like – Serialize this object as an object structure that can be trivially converted to JSON.
Attributes
all_labelled_types – The types of the input labels.
default_value – The default value of the input.
input_or_output – Whether this is an input or output.
single_label – The label of this input, assuming it is not multiple.
single_labelled_data – The value of this input, assuming it is not multiple.
single_labelled_type – The type code of this input, assuming it is not multiple.
task_schema – The schema containing this input.
typ – The type code of the parameter.
parameter – The parameter (i.e. type) of this schema input.
multiple – Whether to expect more than one of these parameters to be defined in the workflow.
labels – Dict whose keys represent the string labels that distinguish multiple parameters if multiple is True.
- property default_value: InputValue | Literal[NullDefault.NULL] | None#
The default value of the input.
- classmethod from_json_like(json_like, shared_data=None)#
Make an instance of this class from JSON (or YAML) data.
- Parameters:
json_like – The data to deserialise.
shared_data – Shared context data.
- Returns:
The deserialised object.
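A hedged sketch of a serialisation round trip, continuing from the construction example above (the exact shared_data requirements depend on how referenced Parameter objects are registered in your app):

```python
# to_json_like is assumed here to return the JSON-compatible structure
# together with any shared context data needed for deserialisation.
js, shared = inp.to_json_like()
inp_again = hf.SchemaInput.from_json_like(js, shared_data=shared)
```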
- labelled_info()#
Get descriptors for all the labels associated with this input.
- Return type:
Iterator[LabellingDescriptor]
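For example, a short sketch using the labelled input constructed above:

```python
# Each yielded descriptor summarises one label of the input
# (here, the labels "one" and "two"):
for descriptor in inp_multi.labelled_info():
    print(descriptor)
```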
- labels: dict[str, LabelInfo]#
Dict whose keys represent the string labels that distinguish multiple parameters if multiple is True.
- multiple#
Whether to expect more than one of these parameters to be defined in the workflow.
- parameter#
The parameter (i.e. type) of this schema input.
- property single_labelled_data: LabelInfo | None#
The value of this input, assuming it is not multiple.
- property single_labelled_type: str | None#
The type code of this input, assuming it is not multiple.
- property task_schema: TaskSchema#
The schema containing this input.
- to_json_like(dct=None, shared_data=None, exclude=(), path=None)#
Serialize this object as an object structure that can be trivially converted to JSON. Note that YAML can also be produced from the result of this method; it just requires a different final serialization step.
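A short sketch of serialisation (assuming, as above, that the method returns the JSON-compatible structure alongside shared data):

```python
import json

# Per the docstring, the returned structure is trivially convertible to JSON:
js, _shared = inp.to_json_like()
print(json.dumps(js, indent=2))
```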