API Reference
koheesio.BaseModel #
Base model for all models.
Extends pydantic BaseModel with some additional configuration. To be used as a base class for all models in Koheesio instead of pydantic.BaseModel.
Additional methods and properties:
Fields#
Every Koheesio BaseModel has two predefined fields: `name` and `description`. These fields are used to provide a name and a description to the model.

- `name`: The name of the Model. If not provided, it defaults to the class name.
- `description`: The description of the Model. It has several default behaviors:
    - If not provided, it defaults to the docstring of the class.
    - If the docstring is not provided, it defaults to the name of the class.
    - For multi-line descriptions, it has the following behaviors:
        - Only the first non-empty line is used.
        - Empty lines are removed.
        - Only the first 3 lines are considered.
        - Only the first 120 characters are considered.
Validators#
- `_set_name_and_description`: Sets the name and description of the Model as per the rules mentioned above.
Properties#
- `log`: Returns a logger with the name of the class.
Class Methods#
- `from_basemodel`: Returns a new BaseModel instance based on the data of another BaseModel.
- `from_context`: Creates a BaseModel instance from a given Context.
- `from_dict`: Creates a BaseModel instance from a given dictionary.
- `from_json`: Creates a BaseModel instance from a given JSON string.
- `from_toml`: Creates a BaseModel object from a given toml file.
- `from_yaml`: Creates a BaseModel object from a given yaml file.
- `lazy`: Constructs the model without doing validation.
Dunder Methods#
- `__add__`: Allows adding two BaseModel instances together.
- `__enter__`: Allows for using the model in a with-statement.
- `__exit__`: Allows for using the model in a with-statement.
- `__setitem__`: Set Item dunder method for BaseModel.
- `__getitem__`: Get Item dunder method for BaseModel.
Instance Methods#
- `hasattr`: Check if a given key is present in the model.
- `get`: Get an attribute of the model, but don't fail if it is not present.
- `merge`: Merge a key/value map with self.
- `set`: Allows for subscribing / assigning to `class[key]`.
- `to_context`: Converts the BaseModel instance to a Context object.
- `to_dict`: Converts the BaseModel instance to a dictionary.
- `to_json`: Converts the BaseModel instance to a JSON string.
- `to_yaml`: Converts the BaseModel instance to a YAML string.
Different Modes
This BaseModel class supports lazy mode, which means that validation of the items stored in the class can be run at will instead of being forced to run upfront.

- Normal mode: you need to know the values ahead of time.
- Lazy mode: validation can be deferred until later.

The prime advantage of using lazy mode is that you don't have to know all your outputs up front and can add them as they become available, while still being able to validate that you have collected all your output at the end.

- With statements: with-statements are also allowed. Note that a lazy-mode BaseModel object is required to work with a with-statement; the `validate_output` method will then run upon exit of the with-statement.
Examples:

```python
from koheesio.models import BaseModel

class Person(BaseModel):
    name: str
    age: int

# Using the lazy method to create an instance without immediate validation
person = Person.lazy()

# Setting attributes
person.name = "John Doe"
person.age = 30

# Now we validate the instance
person.validate_output()

print(person)
```
In this example, the Person instance is created without immediate validation. The attributes name and age are set
afterward. The validate_output
method is then called to validate the instance.
Koheesio specific configuration:
Koheesio models are configured differently from Pydantic defaults. The configuration looks like this:

- `extra="allow"`
  This setting allows for extra fields that are not specified in the model definition. If a field is present in the data but not in the model, it will not raise an error. The Pydantic default is "ignore", which means that extra attributes are ignored.
- `arbitrary_types_allowed=True`
  This setting allows for fields in the model to be of any type. This is useful when you want to include fields in your model that are not standard Python types. The Pydantic default is False, which means that fields must be of a standard Python type.
- `populate_by_name=True`
  This setting allows an aliased field to be populated by its name as given by the model attribute, as well as the alias. This was known as allow_population_by_field_name in Pydantic v1. The Pydantic default is False, which means that fields can only be populated by their alias.
- `validate_assignment=False`
  This setting determines whether the model should be revalidated when the data is changed. If set to True, every time a field is assigned a new value, validation runs again on that assignment. The Pydantic default is (also) False, which means that the model is not revalidated when the data is changed. By default, Pydantic validates the data when creating the model; if the user changes the data after creating the model, it does not revalidate the model.
- `revalidate_instances="subclass-instances"`
  This setting determines whether to revalidate models during validation if the instance is a subclass of the model. This is important as inheritance is used a lot in Koheesio. The Pydantic default is "never", which means that model and dataclass instances are not revalidated during validation.
- `validate_default=True`
  This setting determines whether to validate default values during validation. When set to True, default values are checked during the validation process. We opt to set this to True, as we are attempting to make sure that the data is valid prior to running / executing any Step. The Pydantic default is False, which means that default values are not validated during validation.
- `frozen=False`
  This setting determines whether the model is immutable. If set to True, once a model is created, its fields cannot be changed. The Pydantic default is also False, which means that the model is mutable.
- `coerce_numbers_to_str=True`
  This setting determines whether to convert number fields to strings. When set to True, it enables automatic coercion of any `Number` type to `str`. By default, Pydantic doesn't allow number types (`int`, `float`, `Decimal`) to be coerced to type `str`.
- `use_enum_values=True`
  This setting determines whether to use the values of Enum fields. If set to True, the actual value of the Enum is used instead of the reference. The Pydantic default is False, which means that the reference to the Enum is used.
description class-attribute instance-attribute #
model_config class-attribute instance-attribute #
```python
model_config = ConfigDict(
    extra="allow",
    arbitrary_types_allowed=True,
    populate_by_name=True,
    validate_assignment=False,
    revalidate_instances="subclass-instances",
    validate_default=True,
    frozen=False,
    coerce_numbers_to_str=True,
    use_enum_values=True,
)
```
name class-attribute instance-attribute #
from_basemodel classmethod #
Returns a new BaseModel instance based on the data of another BaseModel
Source code in src/koheesio/models/__init__.py
from_context classmethod #
Creates BaseModel instance from a given Context
You have to make sure that the Context object has the necessary attributes to create the model.
Examples:

```python
class SomeStep(BaseModel):
    foo: str

context = Context(foo="bar")
some_step = SomeStep.from_context(context)
print(some_step.foo)  # prints 'bar'
```
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| context | Context | | required |

Returns:

| Type | Description |
| --- | --- |
| BaseModel | |
Source code in src/koheesio/models/__init__.py
from_dict classmethod #
Creates BaseModel instance from a given dictionary
from_json classmethod #
Creates BaseModel instance from a given JSON string
BaseModel offloads the serialization and deserialization of the JSON string to Context class. Context uses jsonpickle library to serialize and deserialize the JSON string. This is done to allow for objects to be stored in the BaseModel object, which is not possible with the standard json library.
See Also
Context.from_json : Deserializes a JSON string to a Context object
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| json_file_or_str | Union[str, Path] | Pathlike string or Path that points to the json file or string containing json | required |

Returns:

| Type | Description |
| --- | --- |
| BaseModel | |
Source code in src/koheesio/models/__init__.py
from_toml classmethod #
Creates BaseModel object from a given toml file
Note: BaseModel offloads the serialization and deserialization of the TOML string to Context class.
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| toml_file_or_str | Union[str, Path] | Pathlike string or Path that points to the toml file, or string containing toml | required |

Returns:

| Type | Description |
| --- | --- |
| BaseModel | |
Source code in src/koheesio/models/__init__.py
from_yaml classmethod #
Creates BaseModel object from a given yaml file
Note: BaseModel offloads the serialization and deserialization of the YAML string to Context class.
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| yaml_file_or_str | str | Pathlike string or Path that points to the yaml file, or string containing yaml | required |

Returns:

| Type | Description |
| --- | --- |
| BaseModel | |
Source code in src/koheesio/models/__init__.py
get #
Get an attribute of the model, but don't fail if not present
Similar to dict.get()
Examples:

```python
step_output = StepOutput(foo="bar")
step_output.get("foo")  # returns 'bar'
step_output.get("non_existent_key", "oops")  # returns 'oops'
```
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| key | str | name of the key to get | required |
| default | Optional[Any] | Default value in case the attribute does not exist | None |

Returns:

| Type | Description |
| --- | --- |
| Any | The value of the attribute |
Source code in src/koheesio/models/__init__.py
hasattr #
Check if a given key is present in the model.
lazy classmethod #
Constructs the model without doing validation
Essentially an alias to BaseModel.construct()
merge #
Merge key,value map with self
Functionally similar to adding two dicts together; like running {**dict_a, **dict_b}
.
Examples:

```python
step_output = StepOutput(foo="bar")
step_output.merge({"lorem": "ipsum"})  # step_output will now contain {'foo': 'bar', 'lorem': 'ipsum'}
```
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| other | Union[Dict, BaseModel] | Dict or another instance of a BaseModel class that will be added to self | required |
Source code in src/koheesio/models/__init__.py
set #
Allows for subscribing / assigning to class[key]
.
Examples:
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| key | str | The key of the attribute to assign to | required |
| value | Any | Value that should be assigned to the given key | required |
Source code in src/koheesio/models/__init__.py
to_context #
to_context() -> Context
Converts the BaseModel instance to a Context object.
to_json #
Converts the BaseModel instance to a JSON string
BaseModel offloads the serialization and deserialization of the JSON string to Context class. Context uses jsonpickle library to serialize and deserialize the JSON string. This is done to allow for objects to be stored in the BaseModel object, which is not possible with the standard json library.
See Also
Context.to_json : Serializes a Context object to a JSON string
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| pretty | bool | Toggles whether to return a pretty json string or not | False |

Returns:

| Type | Description |
| --- | --- |
| str | containing all parameters of the BaseModel instance |
Source code in src/koheesio/models/__init__.py
to_yaml #
Converts the BaseModel instance to a YAML string
BaseModel offloads the serialization and deserialization of the YAML string to Context class.
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| clean | bool | Toggles whether to remove | False |

Returns:

| Type | Description |
| --- | --- |
| str | containing all parameters of the BaseModel instance |
Source code in src/koheesio/models/__init__.py
validate #
validate() -> BaseModel
Validate the BaseModel instance
This method is intended to be used with the lazy method: lazy creates an instance of the BaseModel without immediate validation, and validate is then used to validate the instance after all the attributes have been set.

Note: in the Pydantic BaseModel, the validate method throws a deprecation warning. This is because Pydantic recommends using the validate_model method instead. However, we are using the validate method here in a different context and in a slightly different way.
Examples:

```python
class FooModel(BaseModel):
    foo: str
    lorem: str

foo_model = FooModel.lazy()
foo_model.foo = "bar"
foo_model.lorem = "ipsum"
foo_model.validate()
```

In this example, the foo_model instance is created without immediate validation. The attributes foo and lorem are set afterward. The validate method is then called to validate the instance.
Returns:

| Type | Description |
| --- | --- |
| BaseModel | The BaseModel instance |
Source code in src/koheesio/models/__init__.py
koheesio.Context #
The Context class is a key component of the Koheesio framework, designed to manage configuration data and shared variables across tasks and steps in your application. It behaves much like a dictionary, but with added functionalities.
Key Features
- Nested keys: Supports accessing and adding nested keys similar to dictionary keys.
- Recursive merging: Merges two Contexts together, with the incoming Context having priority.
- Serialization/Deserialization: Easily created from a yaml, toml, or json file, or a dictionary, and can be converted back to a dictionary.
- Handling complex Python objects: Uses jsonpickle for serialization and deserialization of complex Python objects to and from JSON.
For a comprehensive guide on the usage, examples, and additional features of the Context class, please refer to the reference/concepts/context section of the Koheesio documentation.
Methods:

| Name | Description |
| --- | --- |
| add | Add a key/value pair to the context. |
| get | Get value of a given key. |
| get_item | Acts just like .get, except that it returns the key also. |
| contains | Check if the context contains a given key. |
| merge | Merge this context with the context of another, where the incoming context has priority. |
| to_dict | Returns all parameters of the context as a dict. |
| from_dict | Creates Context object from the given dict. |
| from_yaml | Creates Context object from a given yaml file. |
| from_json | Creates Context object from a given json file. |
Dunder methods
- `__iter__()`: Allows for iteration across a Context.
- `__len__()`: Returns the length of the Context.
- `__getitem__(item)`: Makes class subscriptable.

Inherited from Mapping
- `items()`: Returns all items of the Context.
- `keys()`: Returns all keys of the Context.
- `values()`: Returns all values of the Context.
Source code in src/koheesio/context.py
add #
Add a key/value pair to the context.
contains #
Check if the context contains a given key.
from_dict classmethod #
Creates Context object from the given dict.
from_json classmethod #
Creates Context object from a given json file
Note: jsonpickle is used to serialize/deserialize the Context object. This is done to allow for objects to be stored in the Context object, which is not possible with the standard json library.
Why jsonpickle?
(from https://jsonpickle.github.io/)
Data serialized with python’s pickle (or cPickle or dill) is not easily readable outside of python. Using the json format, jsonpickle allows simple data types to be stored in a human-readable format, and more complex data types such as numpy arrays and pandas dataframes, to be machine-readable on any platform that supports json.
Security
(from https://jsonpickle.github.io/)
jsonpickle should be treated the same as the Python stdlib pickle module from a security perspective.
! Warning !#
The jsonpickle module is not secure. Only unpickle data you trust. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling. Never unpickle data that could have come from an untrusted source, or that could have been tampered with. Consider signing data with an HMAC if you need to ensure that it has not been tampered with. Safer deserialization approaches, such as reading JSON directly, may be more appropriate if you are processing untrusted data.
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| json_file_or_str | Union[str, Path] | Pathlike string or Path that points to the json file or string containing json | required |

Returns:

| Type | Description |
| --- | --- |
| Context | |
Source code in src/koheesio/context.py
from_toml classmethod #
Creates Context object from a given toml file
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| toml_file_or_str | Union[str, Path] | Pathlike string or Path that points to the toml file or string containing toml | required |

Returns:

| Type | Description |
| --- | --- |
| Context | |
Source code in src/koheesio/context.py
from_yaml classmethod #
Creates Context object from a given yaml file
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| yaml_file_or_str | str | Pathlike string or Path that points to the yaml file, or string containing yaml | required |

Returns:

| Type | Description |
| --- | --- |
| Context | |
Source code in src/koheesio/context.py
get #
Get value of a given key
The key can either be an actual key (top level) or the key of a nested value.
Behaves a lot like a dict's .get()
method otherwise.
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| key | str | Can be a real key, or can be a dotted notation of a nested key | required |
| default | Any | Default value to return | None |
| safe | bool | Toggles whether to fail or not when item cannot be found | True |

Returns:

| Type | Description |
| --- | --- |
| Any | Value of the requested item |
Example
A nested call such as `context.get("a.b")` returns `c`.
Source code in src/koheesio/context.py
get_item #
Acts just like .get
, except that it returns the key also
Returns:

| Type | Description |
| --- | --- |
| Dict[str, Any] | key/value-pair of the requested item |
Example
A nested call such as `context.get_item("a.b")` returns `{'a.b': 'c'}`.
Source code in src/koheesio/context.py
merge #
Merge this context with the context of another, where the incoming context has priority.
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| context | Context | Another Context class | required |
| recursive | bool | Recursively merge two dictionaries to an arbitrary depth | False |

Returns:

| Type | Description |
| --- | --- |
| Context | updated context |
Source code in src/koheesio/context.py
process_value #
Processes the given value, converting dictionaries to Context objects as needed.
Source code in src/koheesio/context.py
to_dict #
Returns all parameters of the context as a dict
Returns:

| Type | Description |
| --- | --- |
| dict | containing all parameters of the context |
Source code in src/koheesio/context.py
to_json #
Returns all parameters of the context as a json string
Note: jsonpickle is used to serialize/deserialize the Context object. This is done to allow for objects to be stored in the Context object, which is not possible with the standard json library.
Why jsonpickle?
(from https://jsonpickle.github.io/)
Data serialized with python's pickle (or cPickle or dill) is not easily readable outside of python. Using the json format, jsonpickle allows simple data types to be stored in a human-readable format, and more complex data types such as numpy arrays and pandas dataframes, to be machine-readable on any platform that supports json.
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| pretty | bool | Toggles whether to return a pretty json string or not | False |

Returns:

| Type | Description |
| --- | --- |
| str | containing all parameters of the context |
Source code in src/koheesio/context.py
to_yaml #
Returns all parameters of the context as a yaml string
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| clean | bool | Toggles whether to remove | False |

Returns:

| Type | Description |
| --- | --- |
| str | containing all parameters of the context |
Source code in src/koheesio/context.py
koheesio.ExtraParamsMixin #
Mixin class that adds support for arbitrary keyword arguments to Pydantic models.
The keyword arguments are extracted from the model's values
and moved to a params
dictionary.
koheesio.LoggingFactory #
```python
LoggingFactory(
    name: Optional[str] = None,
    env: Optional[str] = None,
    level: Optional[str] = None,
    logger_id: Optional[str] = None,
)
```
Logging factory to be used to generate logger instances.
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| name | Optional[str] | | None |
| env | Optional[str] | | None |
| level | Optional[str] | | None |
| logger_id | Optional[str] | | None |
Source code in src/koheesio/logger.py
LOGGER_FORMAT class-attribute instance-attribute #
```python
LOGGER_FORMAT: str = (
    "[%(logger_id)s] [%(asctime)s] [%(levelname)s] [%(name)s] {%(module)s.py:%(funcName)s:%(lineno)d} - %(message)s"
)
```
LOGGER_FORMATTER class-attribute instance-attribute #
LOGGER_FORMATTER: Formatter = Formatter(LOGGER_FORMAT)
LOGGER_LEVEL class-attribute instance-attribute #
LOGGER_LEVEL: str = get("KOHEESIO_LOGGING_LEVEL", "WARNING")
add_handlers staticmethod #
Add handlers to existing root logger.
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| handler_class | | | required |
| handlers_config | | | required |
Source code in src/koheesio/logger.py
get_logger staticmethod #
Provide logger. If inherit_from_koheesio then inherit from LoggingFactory.PIPELINE_LOGGER_NAME.
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| name | str | | required |
| inherit_from_koheesio | bool | | False |

Returns:

| Name | Type | Description |
| --- | --- | --- |
| logger | Logger | |
Source code in src/koheesio/logger.py
koheesio.Step #
Base class for a step
A custom unit of logic that can be executed.
The Step class is designed to be subclassed. To create a new step, one would subclass Step and implement the
def execute(self)
method, specifying the expected inputs and outputs.
Note: since the Step class is metaclassed, the execute method is wrapped with the do_execute function, making it always return the Step's output. Hence, an explicit return is not needed when implementing execute.
Methods and Attributes
The Step class has several attributes and methods.
INPUT#
The following fields are available by default on the Step class:

- `name`: Name of the Step. If not set, the name of the class will be used.
- `description`: Description of the Step. If not set, the docstring of the class will be used. If the docstring contains multiple lines, only the first line will be used.
When subclassing a Step, any additional pydantic field will be treated as input
to the Step. See also the
explanation on the .execute()
method below.
OUTPUT#
Every Step has an `Output` class, which is a subclass of `StepOutput`. This class is used to validate the output of the Step. The `Output` class is defined as an inner class of the Step class and can be accessed through the `Step.Output` attribute. The `Output` class can be extended to add additional fields to the output of the Step. See also the explanation on `.execute()`.

- `Output`: A nested class representing the output of the Step, used to validate the output of the Step and based on the StepOutput class.
- `output`: Allows you to interact with the Output of the Step lazily (see above and StepOutput).

When subclassing a Step, any additional pydantic field added to the nested `Output` class will be treated as output of the Step. See also the description of `StepOutput` for more information.
Methods:#
- `execute`: Abstract method to implement for new steps.
    - The inputs of the step can be accessed using `self.input_name`.
    - The output of the step can be accessed using `self.output.output_name`.
- `run`: Alias to the .execute() method. You can use this to run the step, but execute is preferred.
- `to_yaml`: YAML dump the step.
- `get_description`: Get the description of the Step.
When subclassing a Step, execute
is the only method that needs to be implemented. Any additional method added
to the class will be treated as a method of the Step.
Note: since the Step class is metaclassed, the execute method is automatically wrapped with the do_execute function, making it always return a StepOutput. See also the explanation on the do_execute function.
class methods:#
- `from_step`: Returns a new Step instance based on the data of another Step instance, for example: `MyStep.from_step(other_step, a="foo")`
- `get_description`: Get the description of the Step.

dunder methods:#
- `__getattr__`: Allows input to be accessed through `self.input_name`.
- `__repr__` and `__str__`: String representation of a step.
Background
A Step is an atomic operation and serves as the building block of data pipelines built with the framework. Tasks typically consist of a series of Steps.
A step can be seen as an operation on a set of inputs, that returns a set of outputs. This however does not imply that steps are stateless (e.g. data writes)!
The diagram serves to illustrate the concept of a Step:
```
┌─────────┐        ┌──────────────────┐        ┌─────────┐
│ Input 1 │───────▶│                  ├───────▶│Output 1 │
└─────────┘        │                  │        └─────────┘
                   │                  │
┌─────────┐        │                  │        ┌─────────┐
│ Input 2 │───────▶│       Step       │───────▶│Output 2 │
└─────────┘        │                  │        └─────────┘
                   │                  │
┌─────────┐        │                  │        ┌─────────┐
│ Input 3 │───────▶│                  ├───────▶│Output 3 │
└─────────┘        └──────────────────┘        └─────────┘
```
Steps are built on top of Pydantic, a data validation and settings management library that uses Python type annotations. This allows for the automatic validation of the inputs and outputs of a Step.

- Step inherits from BaseModel, which is a Pydantic class used to define data models. This allows Step to automatically validate data against the defined fields and their types.
- Step is metaclassed by StepMetaClass, a custom metaclass that wraps the `execute` method of the Step class with the `_execute_wrapper` function. This ensures that the `execute` method always returns the output of the Step along with providing logging and validation of the output.
- Step has an `Output` class, which is a subclass of `StepOutput`. This class is used to validate the output of the Step. The `Output` class is defined as an inner class of the Step class and can be accessed through the `Step.Output` attribute.
- The `Output` class can be extended to add additional fields to the output of the Step.
Examples:

```python
class MyStep(Step):
    a: str  # input

    class Output(StepOutput):  # output
        b: str

    def execute(self) -> "MyStep.Output":
        self.output.b = f"{self.a}-some-suffix"
```
Output #
Output class for Step
execute abstractmethod #
execute() -> InstanceOf[StepOutput]
Abstract method to implement for new steps.
The Inputs of the step can be accessed, using self.input_name
Note: since the Step class is metaclassed, the execute method is wrapped with the do_execute function, making it always return the Step's output.
Source code in src/koheesio/steps/__init__.py
from_step classmethod #
from_step(step: Step, **kwargs) -> InstanceOf[BaseModel]
Returns a new Step instance based on the data of another Step or BaseModel instance
Source code in src/koheesio/steps/__init__.py
repr_json #
Dump the step to json; meant for representation.
Note: use to_json if you want to dump the step to json for serialization. This method is meant for representation purposes only!
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| simple | bool | When toggled to True, a briefer output will be produced. This is friendlier for logging purposes | False |

Returns:

| Type | Description |
| --- | --- |
| str | A string, which is valid json |
Source code in src/koheesio/steps/__init__.py
repr_yaml #
Dump the step to yaml; meant for representation.
Note: use to_yaml if you want to dump the step to yaml for serialization. This method is meant for representation purposes only!
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| simple | bool | When toggled to True, a briefer output will be produced. This is friendlier for logging purposes | False |

Returns:

| Type | Description |
| --- | --- |
| str | A string, which is valid yaml |
Source code in src/koheesio/steps/__init__.py
run #
run() -> InstanceOf[StepOutput]
Alias to the .execute() method.
koheesio.StepOutput #
Class for the StepOutput model
Usage
Setting up a StepOutput subclass is done like this:
model_config class-attribute instance-attribute #
validate_output #
validate_output() -> StepOutput
Validate the output of the Step
Essentially, this method is a wrapper around the validate method of the BaseModel class