Classes

spark_expectations.core.context.SparkExpectationsContext(product_id: str, spark: SparkSession) dataclass

This class provides the context for SparkExpectations

Attributes

get_agg_dq_detailed_stats_status: bool property

This function returns whether detailed results for agg dq and query dq are enabled Returns: Returns _enable_agg_dq_detailed_result(bool)

get_agg_dq_rule_type_name: str property

This function is used to get aggregation data quality rule type name

Returns:

Name Type Description
str str

Returns _agg_dq_rule_type_name

get_basic_default_template: str property

This function is used to get the basic_default_template

Returns:

Name Type Description
str str

Returns the default_template

get_cerberus_cred_path: str property

This function returns the cerberus credentials path Returns: str: the cerberus credentials path

get_cerberus_token: str property

This function returns the cerberus token Returns: str: the cerberus token

get_cerberus_url: str property

This function returns the cerberus url Returns: str: the cerberus url

get_client_id: Optional[str] property

This function returns the key / path for the client id Returns: client id key / path in Optional[str]

get_config_file_path: str property

This function returns the absolute path of the config file Returns: str: Returns _config_file_path(str)

get_custom_default_template: str property

This function is used to get the custom_default_template

Returns:

Name Type Description
str str

Returns the custom_template

get_custom_detailed_dataframe: DataFrame property

Returns the DataFrame containing custom detailed statistics.

Returns:

Name Type Description
DataFrame DataFrame

DataFrame with custom detailed DQ statistics

get_dbr_job_id: str property

This function returns the Databricks job ID if running in a job.

Returns:

Name Type Description
str str

Returns the job ID if available, "local" otherwise

get_dbr_version: Optional[str] property

Returns the raw DATABRICKS_RUNTIME_VERSION environment variable value, or None when not running on Databricks.

On standard (non-serverless) compute this is a numeric version string such as '13.3' or '14.2'. On serverless compute it is a string value such as 'client.1.13' or 'client.4.9'.
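A small sketch of how a caller might classify that raw value; the helper name and return labels are illustrative assumptions, not part of the library:

```python
import os

def classify_dbr_runtime(raw):
    # Hypothetical helper: interpret DATABRICKS_RUNTIME_VERSION as described above.
    if raw is None:
        return "not-databricks"      # variable unset outside Databricks
    if raw.startswith("client."):
        return "serverless"          # e.g. "client.1.13", "client.4.9"
    return "standard"                # e.g. "13.3", "14.2"

print(classify_dbr_runtime(os.environ.get("DATABRICKS_RUNTIME_VERSION")))
```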

get_dbr_workspace_id: str property

This function returns the Databricks workspace ID.

Returns:

Name Type Description
str str

Returns the workspace ID if running in Databricks, "local" otherwise

get_dbr_workspace_url: str property

This function returns Databricks workspace hostname.

Returns:

Name Type Description
str str

Returns the workspace hostname if running in Databricks, "local" otherwise

get_debugger_mode: bool property

This function returns the debugger mode Returns: bool: debugger mode

get_detailed_default_template: str property

This function is used to get the detailed_default_template

Returns:

Name Type Description
str str

Returns the default_template

get_detailed_stats_table_writer_config: dict property

This function returns the detailed stats table writer config Returns: dict: Returns detailed_stats_table_writer_config, which is a dict

get_df_dq_obs_report_dataframe: DataFrame property

Returns the DataFrame for DQ observability report.

Returns:

Name Type Description
DataFrame DataFrame

DataFrame containing DQ observability report data

get_dq_detailed_stats_table_name: str property

Get dq_detailed_stats_table_name, the table to which the detailed stats of the dq job will be written

Returns:

Name Type Description
str str

returns the dq_detailed_stats_table_name

get_dq_expectations: dict property

Get dq_expectations, which holds the rule information

Returns:

Name Type Description
dict dict

returns the rules_df

get_dq_obs_rpt_gen_status_flag: bool property

This function is used to get the dq_obs_rpt_gen_status_flag

Returns:

Name Type Description
bool bool

Returns the dq_obs_rpt_gen_status_flag

get_dq_rules_params: dict property

This function returns the params that are mapped in the dq rules Returns: _dq_rules_params(dict)

get_dq_run_status: str property

This function is used to get data quality pipeline status

Returns:

Name Type Description
str str

Returns _dq_status

get_dq_run_time: float property

This function returns the elapsed time for the dq run Returns: float: time in seconds

get_dq_stats_table_name: str property

Get dq_stats_table_name to which the final stats of the dq job will be written into

Returns:

Name Type Description
str str

returns the dq_stats_table_name

get_email_custom_body: str property

This function returns email custom body Returns: str: Returns _email_custom_body(str)

get_enable_custom_email_body: bool property

This function returns whether the custom email body is enabled Returns: bool: Returns _enable_custom_email_body(bool)

get_enable_mail: bool property

This function returns whether mail notification is enabled Returns: bool: Returns _enable_mail(bool)

get_enable_obs_dq_report_result: bool property

This function is used to get the enable_obs_dq_report_result

Returns:

Name Type Description
bool bool

Returns the enable_obs_dq_report_result

get_enable_pagerduty: bool property

This function returns if pagerduty notifications are enabled or not.

Returns:

Name Type Description
bool bool

Whether to enable pagerduty incidents or not

get_enable_slack: bool property

This function returns whether slack notification is enabled Returns: Returns _enable_slack(bool)

get_enable_smtp_server_auth: bool property

This function returns whether the smtp server requires authentication Returns: bool: Returns _enable_smtp_server_auth(bool)

get_enable_teams: bool property

This function returns whether teams notification is enabled Returns: Returns _enable_teams(bool)

get_enable_templated_basic_email_body: bool property

This function returns whether html templating is enabled for basic emails Returns: bool: Returns _enable_templated_basic_email_body(bool)

get_enable_templated_custom_email: bool property

This function returns whether html templating is enabled for custom emails Returns: bool: Returns _enable_templated_custom_email(bool)

get_enable_zoom: bool property

Get whether Zoom notification is enabled.

Returns:

Name Type Description
bool bool

Whether Zoom notification is enabled or not.

get_env: Optional[str] property

This function returns the running environment type Returns: Optional[str]: Returns _env

get_error_count: int property

This function returns the error count Returns: int: Returns _error_count(int)

get_error_drop_percentage: float property

This function returns the error drop percentage Returns: float: error drop percentage

get_error_drop_threshold: int property

This function returns the error drop threshold Returns: int: error drop threshold

get_error_percentage: float property

This function returns the error percentage Returns: float: error percentage
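As a rough illustration of the percentage properties above (the exact formula lives in the library, so treat this as an assumption), an error percentage of this kind is typically the error count taken as a share of the input count:

```python
def error_percentage(error_count, input_count):
    # Assumed formula, for illustration only: errors as a share of input rows,
    # rounded to one decimal; guard against division by zero.
    if input_count == 0:
        return 0.0
    return round(error_count / input_count * 100, 1)

print(error_percentage(25, 1000))  # 2.5
```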

get_error_table_name: str property

Get error_table_name, the table to which the error records of the dq job will be written

Returns:

Name Type Description
str str

returns error_table_name attribute value

get_error_table_name_user_specified: bool property

Returns True if the user overrides the default error table name, otherwise False

get_final_agg_dq_result: Optional[List[Dict[str, str]]] property

This function returns the status of the final_agg_dq_result Returns: Returns final_agg_dq_result, a list of dicts with str keys and str values

get_final_agg_dq_run_time: float property

This function returns the elapsed time for the final agg dq run Returns: float: time in seconds

get_final_agg_dq_status: str property

This function is used to get final aggregation data quality status

Returns:

Name Type Description
str str

Returns _final_agg_dq_status

get_final_query_dq_result: Optional[List[Dict[str, str]]] property

This function returns the status of the final_query_dq_result Returns: Returns final_query_dq_result, a list of dicts with str keys and str values

get_final_query_dq_run_time: float property

This function returns the elapsed time for the final query dq run Returns: float: time in seconds

get_final_query_dq_status: str property

This function is used to get final query dq data quality status

Returns:

Name Type Description
str str

Returns _final_query_dq_status

get_final_table_name: str property

Get final_table_name, the table to which the final output of the dq job will be written

Returns:

Name Type Description
str str

returns the final_table_name

get_input_count: int property

This function returns the input count Returns: int: Returns _input_count(int)

get_job_metadata: Optional[str] property

This function is used to get the job metadata

Returns:

Name Type Description
str Optional[str]

Returns _job_metadata

get_kafka_write_error_message: str property

This function returns the Kafka write error message Returns: str: Returns _kafka_write_error_message

get_kafka_write_status: str property

This function returns the Kafka write status Returns: str: _kafka_write_status ("Success", "Failed", "Disabled")

get_mail_from: str property

This function returns the mail id to send email from Returns: str: the sender mail id

get_mail_smtp_password: Optional[str] property

This function returns the smtp password Returns: str: returns _mail_smtp_server password or None if the smtp password is not set

get_mail_smtp_port: int property

This function returns the smtp port Returns: int: returns _mail_smtp_server port

get_mail_smtp_server: str property

This function returns the smtp server host Returns: str: returns _mail_smtp_server

get_mail_smtp_user_name: Optional[str] property

Returns the SMTP username for authentication, if configured. Returns: str: _mail_smtp_user_name or None if not set

get_mail_subject: str property

This function returns mail subject Returns: str: Returns _mail_subject(str)

get_min_priority_email: str property

Returns the min priority for email notifications Returns: str: The minimum priority for email notifications

get_min_priority_pagerduty: str property

Returns the min priority for pagerduty notifications Returns: str: The minimum priority for pagerduty notifications

get_min_priority_slack: str property

Returns the min priority for slack notifications Returns: str: The minimum priority for slack notifications

get_min_priority_teams: str property

Returns the min priority for teams notifications Returns: str: The minimum priority for teams notifications

get_min_priority_zoom: str property

Returns the min priority for zoom notifications Returns: str: The minimum priority for zoom notifications
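The five min-priority properties above gate notifications per channel. A minimal sketch of how such a gate might be applied; the ordering and the helper are assumptions for illustration, not the library's API:

```python
# Assumed priority ordering, lowest to highest; not taken from the library.
PRIORITY_ORDER = {"low": 0, "medium": 1, "high": 2}

def should_notify(alert_priority: str, min_priority: str) -> bool:
    # Send only when the alert is at or above the channel's minimum priority.
    return PRIORITY_ORDER[alert_priority] >= PRIORITY_ORDER[min_priority]

print(should_notify("high", "medium"))  # True
print(should_notify("low", "medium"))   # False
```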

get_notification_on_completion: bool property

This function returns whether notification on completion is enabled Returns: bool: Returns _notification_on_completion

get_notification_on_error_drop_exceeds_threshold_breach: bool property

Returns whether notification is enabled when error drop exceeds threshold.

Returns:

Name Type Description
bool bool

True if notification on threshold breach is enabled, False otherwise

get_notification_on_fail: bool property

This function returns whether notification on fail is enabled Returns: bool: Returns _notification_on_fail

get_notification_on_start: bool property

This function returns whether notification on start is enabled Returns: bool: Returns _notification_on_start

get_notifications_on_rules_action_if_failed_set_ignore: bool property

Returns whether notifications are enabled for rules with action_if_failed set to 'ignore'.

Returns:

Name Type Description
bool bool

True if notifications for ignored failures are enabled, False otherwise

get_num_agg_dq_rules: dict property

This function returns the number of agg dq rules applied for the batch run Returns: dict: rule counts in a dict

get_num_dq_rules: int property

This function returns the total number of dq rules applied for the batch run Returns: int: number of rules in int

get_num_query_dq_rules: dict property

This function returns the number of query dq rules applied for the batch run Returns: dict: rule counts in a dict

get_num_row_dq_rules: int property

This function returns the number of row dq rules applied for the batch run Returns: int: number of rules in int

get_output_count: int property

This function returns output count Returns: int: Returns _output(int)

get_output_percentage: float property

This function returns the output percentage Returns: float: output percentage

get_pagerduty_creds_dict: Dict[str, str] property

This function returns secret keys dict for pagerduty authentication

get_pagerduty_integration_key: Optional[str] property

This function returns pagerduty integration key Returns: str: Returns _pagerduty_integration_key(str)

get_pagerduty_webhook_url: str property

This function returns pagerduty webhook url Returns: str: Returns _pagerduty_webhook_url(str)

get_query_dq_detailed_stats_status: bool property

This function returns whether detailed results for query dq are enabled Returns: Returns _enable_query_dq_detailed_result(bool)

get_query_dq_output_custom_table_name: str property

Get query_dq_output_custom_table_name, the table to which the final output of the query dq queries will be written

Returns:

Name Type Description
str str

returns the query_dq_output_custom_table_name

get_query_dq_rule_type_name: str property

This function is used to get query data quality rule type name

Returns:

Name Type Description
str str

Returns _query_dq_rule_type_name

get_querydq_secondary_queries: dict property

This function returns the query dq secondary queries Returns: dict: Returns querydq_secondary_queries

get_report_table_name: str property

Returns the report table name for DQ observability reports.

Returns:

Name Type Description
str str

Fully qualified table name for DQ reports

get_row_dq_end_time: datetime property

This function returns the end time of the row dq computation Returns: datetime: row dq end time

get_row_dq_rule_type_name: str property

This function is used to get row data quality rule type name

Returns:

Name Type Description
str str

Returns _row_dq_rule_type_name

get_row_dq_run_time: float property

This function returns the elapsed time for the row dq run Returns: float: time in seconds

get_row_dq_start_time: datetime property

This function returns the start time of the row dq computation Returns: datetime: row dq start time

get_row_dq_status: str property

This function is used to get row data quality status

Returns:

Name Type Description
str str

Returns _row_dq_status

get_rules_exceeds_threshold: Optional[List[dict]] property

This function returns the error percentage for each rule

get_rules_execution_settings_config: dict property

This function returns the rules execution settings config Returns: dict: Returns rules_execution_settings_config, which is a dict

get_run_date: str property

Get run_date for the instance of the spark-expectations class

Returns:

Name Type Description
str str

returns the run_date

get_run_date_name: str property

This function returns name for the run_date column Returns: str: name of run_date in str

get_run_date_time_name: str property

This function returns name for the run_date_time column Returns: str: name of run_date_time in str

get_run_id: str property

Get run_id for the instance of spark-expectations class

Returns:

Name Type Description
str str

returns the run_id

get_run_id_name: str property

This function returns name for the run_id column Returns: str: name of run_id in str

get_se_dq_obs_alert_flag: bool property

This function is used to get the se_dq_obs_alert_flag

Returns:

Name Type Description
bool bool

Returns the se_dq_obs_alert_flag

get_se_enable_error_table: bool property

This function returns whether the error table is enabled Returns: Returns _se_enable_error_table(bool)

get_se_job_metadata: dict property

This function returns Spark Expectations job metadata for local/Databricks runtimes.

Returns:

Name Type Description
dict dict

Job metadata including versions and runtime environment details.

get_se_streaming_stats_dict: Dict[str, str] property

This function returns secret keys dict

get_se_streaming_stats_kafka_bootstrap_server: str property

This function returns the kafka bootstrap server specified in the custom config Returns: str: Returns _se_streaming_stats_kafka_bootstrap_server

get_se_streaming_stats_kafka_custom_config_enable: bool property

This function returns whether the custom kafka config is enabled Returns: bool: Returns _se_streaming_stats_kafka_custom_config_enable

get_secret_type: Optional[str] property

This function returns the secret type Returns: secret type in Optional[str]

get_server_url_key: Optional[str] property

This function returns the key / path for the kafka server url Returns: kafka server url key / path in Optional[str]

get_slack_webhook_url: str property

This function returns the slack webhook url Returns: str: Returns _webhook_url(str)

get_smtp_creds_dict: Dict[str, str] property

This function returns secret keys dict for smtp server authentication

get_source_agg_dq_detailed_stats: Optional[List[Tuple]] property

This function returns the detailed results for source agg dq Returns: Returns _source_agg_dq_detailed_stats

get_source_agg_dq_result: Optional[List[Dict[str, str]]] property

This function returns the status of the source_agg_dq_result Returns: Returns source_agg_dq_result, a list of dicts with str keys and str values

get_source_agg_dq_run_time: float property

This function returns the elapsed time for the source agg dq run Returns: float: time in seconds

get_source_agg_dq_status: str property

This function is used to get source aggregation data quality status

Returns:

Name Type Description
str str

Returns _source_agg_dq_status

get_source_query_dq_detailed_stats: Optional[List[Tuple]] property

This function returns the detailed results for source query dq Returns: Returns _source_query_dq_detailed_stats

get_source_query_dq_output: Optional[List[dict]] property

This function returns the source query dq output Returns: Returns source_query_dq_output

get_source_query_dq_result: Optional[List[Dict[str, str]]] property

This function returns the status of the source_query_dq_result Returns: Returns source_query_dq_result, a list of dicts with str keys and str values

get_source_query_dq_run_time: float property

This function returns the elapsed time for the source query dq run Returns: float: time in seconds

get_source_query_dq_status: str property

This function is used to get source query data quality status

Returns:

Name Type Description
str str

Returns _source_query_dq_status

get_spark_expectations_version: str property

This function returns the Spark Expectations package version.

Returns:

Name Type Description
str str

Returns the package version if available, "unknown" otherwise

get_spark_version: str property

This function returns the Spark version.

Returns:

Name Type Description
str str

Returns the Spark version if available, "unknown" otherwise

get_stats_detailed_dataframe: DataFrame property

Returns the DataFrame containing detailed statistics.

Returns:

Name Type Description
DataFrame DataFrame

DataFrame with detailed DQ statistics

get_stats_dict: Optional[List[Dict[str, Any]]] property

This function is used to get the stats_dict

Returns:

Type Description
Optional[List[Dict[str, Any]]]

Optional[List[Dict[str, Any]]]: Returns the stats_dict if it exists, otherwise None

get_stats_table_writer_config: dict property

This function returns the stats table writer config Returns: dict: Returns stats_table_writer_config, which is a dict

get_stats_table_writer_type: str property

This function returns the stats table writer type Returns: str: Returns stats_table_writer_type, which is a str

get_success_percentage: float property

This function returns the success percentage Returns: float: success percentage

get_summarized_row_dq_res: Optional[List[Dict[str, str]]] property

This function returns the summarized row dq results Returns: list(dict): Returns summarized_row_dq_res, a list of dicts with str keys and str values of rule metadata

get_supported_df_query_dq: DataFrame property

This function returns the placeholder dataframe for query checks Returns: DataFrame: returns dataframe for query dq

get_table_name: str property

This function returns table name Returns: str: Returns _table_name(str)

get_target_agg_dq_detailed_stats: Optional[List[Tuple]] property

This function returns the detailed results for target agg dq Returns: Returns _target_agg_dq_detailed_stats

get_target_and_error_table_writer_config: dict property

This function returns the target and error table writer config Returns: dict: Returns target_and_error_table_writer_config, which is a dict

get_target_and_error_table_writer_type: str property

This function returns the target and error table writer type Returns: str: Returns target_and_error_table_writer_type, which is a str

get_target_query_dq_detailed_stats: Optional[List[Tuple]] property

This function returns the detailed results for target query dq Returns: Returns _target_query_dq_detailed_stats

get_target_query_dq_output: Optional[List[dict]] property

This function returns the target query dq output Returns: Returns target_query_dq_output

get_teams_webhook_url: str property

This function returns the teams webhook url Returns: str: Returns _webhook_url(str)

get_to_mail: str property

This function returns the list of mail ids Returns: str: Returns _mail_id(str)

get_token: Optional[str] property

This function returns the key / path for the token Returns: token key / path in Optional[str]

get_token_endpoint_url: Optional[str] property

This function returns the key / path for the token endpoint url Returns: endpoint url key / path in Optional[str]

get_topic_name: Optional[str] property

This function returns the key / path for the topic name Returns: topic name key / path in Optional[str]

get_zoom_token: str property

Get the Zoom token.

Returns:

Name Type Description
str str

The Zoom token.

get_zoom_webhook_url: str property

Get the Zoom webhook URL.

Returns:

Name Type Description
str str

The Zoom webhook URL.

product_id: str instance-attribute

spark: SparkSession instance-attribute

Functions

get_time_diff(start_time: Optional[datetime], end_time: Optional[datetime]) -> float

This function implements time diff Args: start_time: end_time:

Returns:

Source code in spark_expectations/core/context.py
def get_time_diff(self, start_time: Optional[datetime], end_time: Optional[datetime]) -> float:
    """
    This function implements time diff
    Args:
        start_time:
        end_time:

    Returns:

    """
    if start_time and end_time:
        time_diff = end_time - start_time

        return round(float(time_diff.total_seconds()), 1)
    else:
        return 0.0
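The rounding behavior above can be reproduced outside Spark Expectations; this standalone sketch mirrors the method's logic (one-decimal rounding, 0.0 when either bound is missing):

```python
from datetime import datetime, timedelta

def time_diff_seconds(start, end):
    # Mirror of get_time_diff: elapsed seconds rounded to one decimal,
    # falling back to 0.0 when either timestamp is missing.
    if start and end:
        return round(float((end - start).total_seconds()), 1)
    return 0.0

start = datetime(2024, 1, 1, 12, 0, 0)
end = start + timedelta(seconds=93, milliseconds=300)
print(time_diff_seconds(start, end))  # 93.3
print(time_diff_seconds(None, end))   # 0.0
```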

print_dataframe_with_debugger(df: DataFrame) -> None

This function prints the DataFrame when debugger mode is enabled Returns:

Source code in spark_expectations/core/context.py
def print_dataframe_with_debugger(self, df: DataFrame) -> None:
    """
    This function prints the DataFrame when debugger mode is enabled
    Returns:

    """
    if self.get_debugger_mode:
        df.show(truncate=False)

reset_num_agg_dq_rules() -> None

This function is used to reset the _num_agg_dq_rules Returns: None

Source code in spark_expectations/core/context.py
def reset_num_agg_dq_rules(self) -> None:
    """
    This function is used to reset the _num_agg_dq_rules
    Returns:
        None

    """
    self._num_agg_dq_rules = {
        "num_agg_dq_rules": 0,
        "num_source_agg_dq_rules": 0,
        "num_final_agg_dq_rules": 0,
    }

reset_num_dq_rules() -> None

This function is used to reset the _num_dq_rules Returns: None

Source code in spark_expectations/core/context.py
def reset_num_dq_rules(self) -> None:
    """
    This function is used to reset the _num_dq_rules
    Returns:
        None

    """
    self._num_dq_rules = 0

reset_num_query_dq_rules() -> None

This function is used to reset the _num_query_dq_rules Returns: None

Source code in spark_expectations/core/context.py
def reset_num_query_dq_rules(self) -> None:
    """
    This function is used to reset the _num_query_dq_rules
    Returns:
        None

    """
    self._num_query_dq_rules = {
        "num_query_dq_rules": 0,
        "num_source_query_dq_rules": 0,
        "num_final_query_dq_rules": 0,
    }

reset_num_row_dq_rules() -> None

This function is used to reset the _num_row_dq_rules Returns: None

Source code in spark_expectations/core/context.py
def reset_num_row_dq_rules(self) -> None:
    """
    This function is used to reset the _num_row_dq_rules
    Returns:
        None

    """

    self._num_row_dq_rules = 0  # pragma: no cover

set_agg_dq_detailed_stats_status(agg_dq_detailed_result_status: bool) -> None

Parameters:

Name Type Description Default
agg_dq_detailed_result_status bool
required

Returns:

Source code in spark_expectations/core/context.py
def set_agg_dq_detailed_stats_status(self, agg_dq_detailed_result_status: bool) -> None:
    """
    Args:
        agg_dq_detailed_result_status:
    Returns:
    """
    self._enable_agg_dq_detailed_result = bool(agg_dq_detailed_result_status)

set_basic_default_template(basic_default_template: str) -> None

This function is used to set the basic_default_template

Returns:

Type Description
None

None

Source code in spark_expectations/core/context.py
def set_basic_default_template(self, basic_default_template: str) -> None:
    """
    This function is used to set the basic_default_template

    Returns:
        None

    """
    self._basic_default_template = basic_default_template

set_custom_default_template(custom_default_template: str) -> None

This function is used to set the custom_default_template

Returns:

Type Description
None

None

Source code in spark_expectations/core/context.py
def set_custom_default_template(self, custom_default_template: str) -> None:
    """
    This function is used to set the custom_default_template

    Returns:
        None

    """
    self._custom_default_template = custom_default_template

set_custom_detailed_dataframe(dataframe: DataFrame) -> None

Sets the DataFrame containing custom detailed statistics.

Parameters:

Name Type Description Default
dataframe DataFrame

DataFrame with custom detailed DQ statistics

required
Source code in spark_expectations/core/context.py
def set_custom_detailed_dataframe(self, dataframe: DataFrame) -> None:
    """
    Sets the DataFrame containing custom detailed statistics.

    Args:
        dataframe: DataFrame with custom detailed DQ statistics
    """
    self._custom_dataframe = dataframe

set_debugger_mode(debugger_mode: bool) -> None

This function sets debugger mode Returns:

Source code in spark_expectations/core/context.py
def set_debugger_mode(self, debugger_mode: bool) -> None:
    """
    This function sets debugger mode
    Returns:

    """
    self._debugger_mode = debugger_mode

set_detailed_default_template(detailed_default_template: str) -> None

This function is used to set the detailed_default_template

Returns:

Type Description
None

None

Source code in spark_expectations/core/context.py
def set_detailed_default_template(self, detailed_default_template: str) -> None:
    """
    This function is used to set the detailed_default_template

    Returns:
        None

    """
    self._detailed_default_template = detailed_default_template

set_detailed_stats_table_writer_config(config: dict) -> None

This function sets the detailed stats table writer config Args: config: dict Returns: None

Source code in spark_expectations/core/context.py
def set_detailed_stats_table_writer_config(self, config: dict) -> None:
    """
    This function sets stats table writer config
    Args:
        config: dict
    Returns: None
    """
    self._stats_table_writer_config = config

set_df_dq_obs_report_dataframe(dataframe: DataFrame) -> None

Sets the DataFrame for DQ observability report.

Parameters:

Name Type Description Default
dataframe DataFrame

DataFrame containing DQ observability report data

required
Source code in spark_expectations/core/context.py
def set_df_dq_obs_report_dataframe(self, dataframe: DataFrame) -> None:
    """
    Sets the DataFrame for DQ observability report.

    Args:
        dataframe: DataFrame containing DQ observability report data
    """
    self._df_dq_obs_report_dataframe = dataframe

set_dq_detailed_stats_table_name(dq_detailed_stats_table_name: str) -> None

Sets the table name for detailed DQ statistics.

Parameters:

Name Type Description Default
dq_detailed_stats_table_name str

Fully qualified table name for detailed stats

required
Source code in spark_expectations/core/context.py
def set_dq_detailed_stats_table_name(self, dq_detailed_stats_table_name: str) -> None:
    """
    Sets the table name for detailed DQ statistics.

    Args:
        dq_detailed_stats_table_name: Fully qualified table name for detailed stats
    """
    self._dq_detailed_stats_table_name = dq_detailed_stats_table_name

set_dq_end_time() -> None

This function sets the end time of the dq computation Returns: None

Source code in spark_expectations/core/context.py
def set_dq_end_time(self) -> None:
    """
    This function sets the end time of the dq computation
    Returns:
        None
    """
    self._dq_end_time = datetime.now()

set_dq_expectations(dq_expectations: dict) -> None

Sets the data quality expectations/rules dictionary.

Parameters:

Name Type Description Default
dq_expectations dict

Dictionary containing DQ rule definitions

required
Source code in spark_expectations/core/context.py
def set_dq_expectations(self, dq_expectations: dict) -> None:
    """
    Sets the data quality expectations/rules dictionary.

    Args:
        dq_expectations: Dictionary containing DQ rule definitions
    """
    self._dq_expectations = dq_expectations

set_dq_obs_rpt_gen_status_flag(dq_obs_rpt_gen_status_flag: bool) -> None

This function is used to set the dq_obs_rpt_gen_status_flag

Returns:

Type Description
None

None

Source code in spark_expectations/core/context.py
def set_dq_obs_rpt_gen_status_flag(self, dq_obs_rpt_gen_status_flag: bool) -> None:
    """
    This function is used to set the dq_obs_rpt_gen_status_flag

    Returns:
        None

    """
    self._dq_obs_rpt_gen_status_flag = dq_obs_rpt_gen_status_flag

set_dq_rules_params(_dq_rules_params: dict) -> None

This function sets the params for dq rules Args: _dq_rules_params:

Returns:

Source code in spark_expectations/core/context.py
def set_dq_rules_params(self, _dq_rules_params: dict) -> None:
    """
    This function set params for dq rules
    Args:
        _dq_rules_params:

    Returns:

    """
    self._dq_rules_params = _dq_rules_params

set_dq_run_status(dq_run_status: str = 'Failed') -> None

Sets the overall data quality pipeline run status.

Parameters:

Name Type Description Default
dq_run_status str

Overall DQ run status (e.g., "Failed", "Passed")

'Failed'
Source code in spark_expectations/core/context.py
def set_dq_run_status(self, dq_run_status: str = "Failed") -> None:
    """
    Sets the overall data quality pipeline run status.

    Args:
        dq_run_status: Overall DQ run status (e.g., "Failed", "Passed")
    """
    self._dq_run_status = dq_run_status

set_dq_start_time() -> None

This function sets the start time of the dq computation Returns: None

Source code in spark_expectations/core/context.py
def set_dq_start_time(self) -> None:
    """
    This function sets the start time of the dq computation
    Returns:
        None
    """
    self._dq_start_time = datetime.now()

set_dq_stats_table_name(dq_stats_table_name: str) -> None

Sets the DQ stats table name where final statistics will be written.

Parameters:

Name Type Description Default
dq_stats_table_name str

The fully qualified table name for DQ statistics

required
Source code in spark_expectations/core/context.py
def set_dq_stats_table_name(self, dq_stats_table_name: str) -> None:
    """
    Sets the DQ stats table name where final statistics will be written.

    Args:
        dq_stats_table_name: The fully qualified table name for DQ statistics
    """
    self._dq_stats_table_name = dq_stats_table_name

set_email_custom_body(email_custom_body: str) -> None

Sets the custom email body content.

Parameters:

Name Type Description Default
email_custom_body str

Custom HTML or text content for email body

required
Source code in spark_expectations/core/context.py
def set_email_custom_body(self, email_custom_body: str) -> None:
    """
    Sets the custom email body content.

    Args:
        email_custom_body: Custom HTML or text content for email body
    """
    self._email_custom_body = email_custom_body

set_enable_custom_email_body(enable_custom_email_body: bool) -> None

Enables or disables custom email body content.

Parameters:

Name Type Description Default
enable_custom_email_body bool

True to enable custom email body, False to use default

required
Source code in spark_expectations/core/context.py
def set_enable_custom_email_body(self, enable_custom_email_body: bool) -> None:
    """
    Enables or disables custom email body content.

    Args:
        enable_custom_email_body: True to enable custom email body, False to use default
    """
    self._enable_custom_email_body = bool(enable_custom_email_body)

set_enable_mail(enable_mail: bool) -> None

Enables or disables email notifications.

Parameters:

Name Type Description Default
enable_mail bool

True to enable email notifications, False to disable

required
Source code in spark_expectations/core/context.py
def set_enable_mail(self, enable_mail: bool) -> None:
    """
    Enables or disables email notifications.

    Args:
        enable_mail: True to enable email notifications, False to disable
    """
    self._enable_mail = bool(enable_mail)
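
Note that `set_enable_mail` (like several other enable flags here) coerces its argument with `bool()`, so any truthy value, including a non-empty string such as `"False"`, enables the feature. A quick plain-Python illustration of that coercion (a stand-in, not the real setter):

```python
# Mirrors the bool() coercion used by setters such as set_enable_mail:
# any truthy input enables the feature.
def coerce_enable_flag(value) -> bool:
    return bool(value)

print(coerce_enable_flag(True))     # True
print(coerce_enable_flag("False"))  # True: non-empty strings are truthy
print(coerce_enable_flag(0))        # False
```

Pass real booleans (or values you have explicitly converted) to avoid surprises with string inputs.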

set_enable_obs_dq_report_result(enable_obs_dq_report_result: bool) -> None

Enables or disables the observability DQ report result.

Returns:

Type Description
None

None

Source code in spark_expectations/core/context.py
def set_enable_obs_dq_report_result(self, enable_obs_dq_report_result: bool) -> None:
    """
    Enables or disables the observability DQ report result.

    Returns:
        None

    """
    self._enable_obs_dq_report_result = enable_obs_dq_report_result

set_enable_pagerduty(enable_pagerduty: bool) -> None

Enables or disables PagerDuty notifications.

Parameters:

Name Type Description Default
enable_pagerduty bool

Whether to enable PagerDuty incidents or not

required
Source code in spark_expectations/core/context.py
def set_enable_pagerduty(self, enable_pagerduty: bool) -> None:
    """
    Enables or disables PagerDuty notifications.

    Args:
        enable_pagerduty (bool): Whether to enable PagerDuty incidents or not
    """
    self._enable_pagerduty = enable_pagerduty

set_enable_slack(enable_slack: bool) -> None

Enables or disables Slack notifications.

Parameters:

Name Type Description Default
enable_slack bool

True to enable Slack notifications, False to disable

required

Source code in spark_expectations/core/context.py
def set_enable_slack(self, enable_slack: bool) -> None:
    """

    Args:
        enable_slack:

    Returns:

    """
    self._enable_slack = enable_slack

set_enable_smtp_server_auth(enable_smtp_server_auth: bool) -> None

Enables or disables SMTP server authentication.

Parameters:

Name Type Description Default
enable_smtp_server_auth bool

True to enable SMTP authentication, False to disable

required
Source code in spark_expectations/core/context.py
def set_enable_smtp_server_auth(self, enable_smtp_server_auth: bool) -> None:
    """
    Enables or disables SMTP server authentication.

    Args:
        enable_smtp_server_auth: True to enable SMTP authentication, False to disable
    """
    self._enable_smtp_server_auth = bool(enable_smtp_server_auth)

set_enable_teams(enable_teams: bool) -> None

Enables or disables Microsoft Teams notifications.

Parameters:

Name Type Description Default
enable_teams bool

True to enable Teams notifications, False to disable

required

Source code in spark_expectations/core/context.py
def set_enable_teams(self, enable_teams: bool) -> None:
    """

    Args:
        enable_teams:

    Returns:

    """
    self._enable_teams = enable_teams

set_enable_templated_basic_email_body(enable_templated_basic_email_body: bool) -> None

Enables or disables HTML templating for basic email notifications.

Parameters:

Name Type Description Default
enable_templated_basic_email_body bool

True to enable HTML templating, False to disable

required
Source code in spark_expectations/core/context.py
def set_enable_templated_basic_email_body(self, enable_templated_basic_email_body: bool) -> None:
    """
    Enables or disables HTML templating for basic email notifications.

    Args:
        enable_templated_basic_email_body: True to enable HTML templating, False to disable
    """
    self._enable_templated_basic_email_body = bool(enable_templated_basic_email_body)

set_enable_templated_custom_email(enable_templated_custom_email: bool) -> None

Enables or disables HTML templating for custom email notifications.

Parameters:

Name Type Description Default
enable_templated_custom_email bool

True to enable HTML templating for custom emails, False to disable

required
Source code in spark_expectations/core/context.py
def set_enable_templated_custom_email(self, enable_templated_custom_email: bool) -> None:
    """
    Enables or disables HTML templating for custom email notifications.

    Args:
        enable_templated_custom_email: True to enable HTML templating for custom emails, False to disable
    """
    self._enable_templated_custom_email = bool(enable_templated_custom_email)

set_enable_zoom(enable_zoom: bool) -> None

Enables or disables Zoom notifications.

Parameters:

Name Type Description Default
enable_zoom bool

Whether to enable Zoom notification or not.

required
Source code in spark_expectations/core/context.py
def set_enable_zoom(self, enable_zoom: bool) -> None:
    """
    Enables or disables Zoom notifications.

    Args:
        enable_zoom (bool): Whether to enable Zoom notifications or not.
    """
    self._enable_zoom = enable_zoom

set_end_time_when_dq_job_fails() -> None

Sets the end time for the stage that was in progress when the DQ job failed, inferred from which stage has a start time but no end time.

Source code in spark_expectations/core/context.py
def set_end_time_when_dq_job_fails(self) -> None:
    """
    Sets the end time for the stage that was in progress when the DQ job failed,
    inferred from which stage has a start time but no end time.
    Returns:
    """
    if self._source_agg_dq_start_time and self._source_agg_dq_end_time is None:
        self.set_source_agg_dq_end_time()
    elif self._source_query_dq_start_time and self._source_query_dq_end_time is None:
        self.set_source_query_dq_end_time()
    elif self._row_dq_start_time and self._row_dq_end_time is None:
        self.set_row_dq_end_time()
    elif self._final_agg_dq_start_time and self._final_agg_dq_end_time is None:
        self.set_final_agg_dq_end_time()
    elif self._final_query_dq_start_time and self._final_query_dq_end_time is None:
        self.set_final_query_dq_end_time()
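
The fallback chain above assigns an end time only to the first stage that has started but not yet finished; later stages are left untouched. A minimal plain-Python stand-in (not the real context class, and with only two of the five stages) illustrates the behavior:

```python
from datetime import datetime
from typing import Optional

class StageTimes:
    """Minimal stand-in mirroring the end-time fallback chain above."""

    def __init__(self) -> None:
        self.source_agg_start: Optional[datetime] = None
        self.source_agg_end: Optional[datetime] = None
        self.row_dq_start: Optional[datetime] = None
        self.row_dq_end: Optional[datetime] = None

    def set_end_time_when_job_fails(self) -> None:
        # Only the first started-but-unfinished stage receives an end time.
        if self.source_agg_start and self.source_agg_end is None:
            self.source_agg_end = datetime.now()
        elif self.row_dq_start and self.row_dq_end is None:
            self.row_dq_end = datetime.now()

t = StageTimes()
t.source_agg_start = datetime.now()
t.row_dq_start = datetime.now()
t.set_end_time_when_job_fails()
print(t.source_agg_end is not None)  # True: earliest unfinished stage was closed
print(t.row_dq_end is None)          # True: later stage left untouched
```

Calling the method again would then close the next unfinished stage in order.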

set_env(env: Optional[str]) -> None

Parameters:

Name Type Description Default
env Optional[str]

The environment identifier for the run

required

Returns:

Type Description
None

None

Source code in spark_expectations/core/context.py
def set_env(self, env: Optional[str]) -> None:
    """
    Args:
        env: The environment identifier for the run

    Returns:
        None

    """
    self._env = env

set_error_count(error_count: int = 0) -> None

Sets the error record count after DQ processing.

Parameters:

Name Type Description Default
error_count int

Number of records that failed DQ rules

0
Source code in spark_expectations/core/context.py
def set_error_count(self, error_count: int = 0) -> None:
    """
    Sets the error record count after DQ processing.

    Args:
        error_count: Number of records that failed DQ rules
    """
    self._error_count = error_count

set_error_drop_threshold(error_drop_threshold: int) -> None

Sets the error drop threshold percentage.

Parameters:

Name Type Description Default
error_drop_threshold int

Maximum allowed error percentage before alerting

required
Source code in spark_expectations/core/context.py
def set_error_drop_threshold(self, error_drop_threshold: int) -> None:
    """
    Sets the error drop threshold percentage.

    Args:
        error_drop_threshold: Maximum allowed error percentage before alerting
    """
    self._error_drop_threshold = error_drop_threshold
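
The threshold is expressed as a percentage of input records. A sketch of how such a check might be evaluated (illustrative only; the helper name is hypothetical, not part of the library API):

```python
def error_drop_exceeds_threshold(error_count: int, input_count: int,
                                 threshold_pct: int) -> bool:
    """Hypothetical helper: True when the error percentage exceeds the threshold."""
    if input_count == 0:
        return False  # no input rows, nothing to compare against
    error_pct = (error_count / input_count) * 100
    return error_pct > threshold_pct

print(error_drop_exceeds_threshold(5, 100, 10))   # False: 5% <= 10%
print(error_drop_exceeds_threshold(25, 100, 10))  # True: 25% > 10%
```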

set_error_table_name(error_table_name: str, user_specified: bool = True) -> None

Sets the error table name where failed records will be written.

Parameters:

Name Type Description Default
error_table_name str

The fully qualified error table name

required
user_specified bool

Flag indicating if user explicitly provided this name

True
Source code in spark_expectations/core/context.py
def set_error_table_name(self, error_table_name: str, user_specified: bool = True) -> None:
    """
    Sets the error table name where failed records will be written.

    Args:
        error_table_name: The fully qualified error table name
        user_specified: Flag indicating if user explicitly provided this name
    """
    self._error_table_name = error_table_name
    self._error_table_name_user_specified = user_specified

set_final_agg_dq_end_time() -> None

Sets the end time for final aggregation DQ computation.

Source code in spark_expectations/core/context.py
def set_final_agg_dq_end_time(self) -> None:
    """
    Sets the end time for final aggregation DQ computation.
    Returns:
        None
    """
    self._final_agg_dq_end_time = datetime.now()

set_final_agg_dq_result(final_agg_dq_result: Optional[List[Dict[str, str]]] = None) -> None

Sets the final/target aggregate data quality check results.

Parameters:

Name Type Description Default
final_agg_dq_result Optional[List[Dict[str, str]]]

List of dictionaries containing final agg DQ results

None
Source code in spark_expectations/core/context.py
def set_final_agg_dq_result(self, final_agg_dq_result: Optional[List[Dict[str, str]]] = None) -> None:
    """
    Sets the final/target aggregate data quality check results.

    Args:
        final_agg_dq_result: List of dictionaries containing final agg DQ results
    """
    self._final_agg_dq_result = final_agg_dq_result

set_final_agg_dq_start_time() -> None

Sets the start time for final aggregation DQ computation.

Source code in spark_expectations/core/context.py
def set_final_agg_dq_start_time(self) -> None:
    """
    Sets the start time for final aggregation DQ computation.
    Returns:
        None

    """
    self._final_agg_dq_start_time = datetime.now()

set_final_agg_dq_status(final_agg_dq_status: str = 'Skipped') -> None

Sets the final/target aggregate data quality check status.

Parameters:

Name Type Description Default
final_agg_dq_status str

Status of final agg DQ execution (e.g., "Skipped", "Passed", "Failed")

'Skipped'
Source code in spark_expectations/core/context.py
def set_final_agg_dq_status(self, final_agg_dq_status: str = "Skipped") -> None:
    """
    Sets the final/target aggregate data quality check status.

    Args:
        final_agg_dq_status: Status of final agg DQ execution (e.g., "Skipped", "Passed", "Failed")
    """
    self._final_agg_dq_status = final_agg_dq_status

set_final_query_dq_end_time() -> None

Sets the end time for final query DQ computation.

Source code in spark_expectations/core/context.py
def set_final_query_dq_end_time(self) -> None:
    """
    Sets the end time for final query DQ computation.
    Returns:
        None
    """
    self._final_query_dq_end_time = datetime.now()

set_final_query_dq_result(final_query_dq_result: Optional[List[Dict[str, str]]] = None) -> None

Sets the final/target query data quality check results.

Parameters:

Name Type Description Default
final_query_dq_result Optional[List[Dict[str, str]]]

List of dictionaries containing final query DQ results

None
Source code in spark_expectations/core/context.py
def set_final_query_dq_result(self, final_query_dq_result: Optional[List[Dict[str, str]]] = None) -> None:
    """
    Sets the final/target query data quality check results.

    Args:
        final_query_dq_result: List of dictionaries containing final query DQ results
    """
    self._final_query_dq_result = final_query_dq_result

set_final_query_dq_start_time() -> None

Sets the start time for final query DQ computation.

Source code in spark_expectations/core/context.py
def set_final_query_dq_start_time(self) -> None:
    """
    Sets the start time for final query DQ computation.
    Returns:
        None
    """
    self._final_query_dq_start_time = datetime.now()

set_final_query_dq_status(final_query_dq_status: str = 'Skipped') -> None

Sets the final/target query data quality check status.

Parameters:

Name Type Description Default
final_query_dq_status str

Status of final query DQ execution (e.g., "Skipped", "Passed", "Failed")

'Skipped'
Source code in spark_expectations/core/context.py
def set_final_query_dq_status(self, final_query_dq_status: str = "Skipped") -> None:
    """
    Sets the final/target query data quality check status.

    Args:
        final_query_dq_status: Status of final query DQ execution (e.g., "Skipped", "Passed", "Failed")
    """
    self._final_query_dq_status = final_query_dq_status

set_final_table_name(final_table_name: str) -> None

Sets the final/target table name where validated data will be written.

Parameters:

Name Type Description Default
final_table_name str

The fully qualified target table name

required
Source code in spark_expectations/core/context.py
def set_final_table_name(self, final_table_name: str) -> None:
    """
    Sets the final/target table name where validated data will be written.

    Args:
        final_table_name: The fully qualified target table name
    """
    self._final_table_name = final_table_name

set_input_count(input_count: int = 0) -> None

Sets the input record count before DQ processing.

Parameters:

Name Type Description Default
input_count int

Number of input records

0
Source code in spark_expectations/core/context.py
def set_input_count(self, input_count: int = 0) -> None:
    """
    Sets the input record count before DQ processing.

    Args:
        input_count: Number of input records
    """
    self._input_count = input_count

set_job_metadata(job_metadata: Optional[str] = None) -> None

Sets the job metadata.

Returns:

Type Description
None

None

Source code in spark_expectations/core/context.py
def set_job_metadata(self, job_metadata: Optional[str] = None) -> None:
    """
    Sets the job metadata.

    Returns:
        None

    """
    self._job_metadata = job_metadata

set_kafka_write_error_message(error_message: str = '') -> None

Sets the Kafka write error message.

Parameters:

Name Type Description Default
error_message str

Error message from a Kafka write failure

''

Source code in spark_expectations/core/context.py
def set_kafka_write_error_message(self, error_message: str = "") -> None:
    """
    This function sets the Kafka write error message
    Args:
        error_message: Error message from Kafka write failure
    Returns: None
    """
    self._kafka_write_error_message = error_message

set_kafka_write_status(kafka_write_status: str = 'Disabled') -> None

Sets the Kafka write status.

Parameters:

Name Type Description Default
kafka_write_status str

Status of the Kafka write operation ("Success", "Failed", "Disabled")

'Disabled'

Source code in spark_expectations/core/context.py
def set_kafka_write_status(self, kafka_write_status: str = "Disabled") -> None:
    """
    This function sets the Kafka write status
    Args:
        kafka_write_status: Status of Kafka write operation ("Success", "Failed", "Disabled")
    Returns: None
    """
    self._kafka_write_status = kafka_write_status

set_mail_from(mail_from: str) -> None

Sets the sender email address for notifications.

Parameters:

Name Type Description Default
mail_from str

Email address of the sender

required
Source code in spark_expectations/core/context.py
def set_mail_from(self, mail_from: str) -> None:
    """
    Sets the sender email address for notifications.

    Args:
        mail_from: Email address of the sender
    """
    self._mail_from = mail_from

set_mail_smtp_password(mail_smtp_password: str) -> None

Sets the SMTP password for authentication.

Parameters:

Name Type Description Default
mail_smtp_password str

Password for SMTP authentication

required
Source code in spark_expectations/core/context.py
def set_mail_smtp_password(self, mail_smtp_password: str) -> None:
    """
    Sets the SMTP password for authentication.

    Args:
        mail_smtp_password: Password for SMTP authentication
    """
    self._mail_smtp_password = mail_smtp_password

set_mail_smtp_port(mail_smtp_port: int) -> None

Sets the SMTP server port for email notifications.

Parameters:

Name Type Description Default
mail_smtp_port int

SMTP server port number (e.g., 25, 465, 587)

required
Source code in spark_expectations/core/context.py
def set_mail_smtp_port(self, mail_smtp_port: int) -> None:
    """
    Sets the SMTP server port for email notifications.

    Args:
        mail_smtp_port: SMTP server port number (e.g., 25, 465, 587)
    """
    self._mail_smtp_port = mail_smtp_port

set_mail_smtp_server(mail_smtp_server: str) -> None

Sets the SMTP server hostname for email notifications.

Parameters:

Name Type Description Default
mail_smtp_server str

SMTP server hostname or IP address

required
Source code in spark_expectations/core/context.py
def set_mail_smtp_server(self, mail_smtp_server: str) -> None:
    """
    Sets the SMTP server hostname for email notifications.

    Args:
        mail_smtp_server: SMTP server hostname or IP address
    """
    self._mail_smtp_server = mail_smtp_server

set_mail_smtp_user_name(mail_smtp_user_name: str) -> None

Sets the SMTP username for authentication.

Parameters:

Name Type Description Default
mail_smtp_user_name str

The SMTP username to use for authentication

required

Source code in spark_expectations/core/context.py
def set_mail_smtp_user_name(self, mail_smtp_user_name: str) -> None:
    """
    Sets the SMTP username for authentication.
    Args:
        mail_smtp_user_name: The SMTP username to use for authentication
    """
    self._mail_smtp_user_name = mail_smtp_user_name

set_mail_subject(mail_subject: str) -> None

Sets the email subject line for notifications.

Parameters:

Name Type Description Default
mail_subject str

Subject line for email notifications

required
Source code in spark_expectations/core/context.py
def set_mail_subject(self, mail_subject: str) -> None:
    """
    Sets the email subject line for notifications.

    Args:
        mail_subject: Subject line for email notifications
    """
    self._mail_subject = mail_subject

set_min_priority_email(min_priority_email: str) -> None

Sets the minimum priority level for email notifications.

Parameters:

Name Type Description Default
min_priority_email str

Minimum priority threshold for triggering email alerts

required
Source code in spark_expectations/core/context.py
def set_min_priority_email(self, min_priority_email: str) -> None:
    """
    Sets the minimum priority level for email notifications.

    Args:
        min_priority_email: Minimum priority threshold for triggering email alerts
    """
    self._min_priority_email = min_priority_email

set_min_priority_pagerduty(min_priority_pagerduty: str) -> None

Sets the minimum priority level for PagerDuty notifications.

Parameters:

Name Type Description Default
min_priority_pagerduty str

Minimum priority threshold for triggering PagerDuty alerts

required
Source code in spark_expectations/core/context.py
def set_min_priority_pagerduty(self, min_priority_pagerduty: str) -> None:
    """
    Sets the minimum priority level for PagerDuty notifications.

    Args:
        min_priority_pagerduty: Minimum priority threshold for triggering PagerDuty alerts
    """
    self._min_priority_pagerduty = min_priority_pagerduty

set_min_priority_slack(min_priority_slack: str) -> None

Sets the minimum priority level for Slack notifications.

Parameters:

Name Type Description Default
min_priority_slack str

Minimum priority threshold for triggering Slack alerts

required
Source code in spark_expectations/core/context.py
def set_min_priority_slack(self, min_priority_slack: str) -> None:
    """
    Sets the minimum priority level for Slack notifications.

    Args:
        min_priority_slack: Minimum priority threshold for triggering Slack alerts
    """
    self._min_priority_slack = min_priority_slack

set_min_priority_teams(min_priority_teams: str) -> None

Sets the minimum priority level for Microsoft Teams notifications.

Parameters:

Name Type Description Default
min_priority_teams str

Minimum priority threshold for triggering Teams alerts

required
Source code in spark_expectations/core/context.py
def set_min_priority_teams(self, min_priority_teams: str) -> None:
    """
    Sets the minimum priority level for Microsoft Teams notifications.

    Args:
        min_priority_teams: Minimum priority threshold for triggering Teams alerts
    """
    self._min_priority_teams = min_priority_teams

set_min_priority_zoom(min_priority_zoom: str) -> None

Sets the minimum priority level for Zoom notifications.

Parameters:

Name Type Description Default
min_priority_zoom str

Minimum priority threshold for triggering Zoom alerts

required
Source code in spark_expectations/core/context.py
def set_min_priority_zoom(self, min_priority_zoom: str) -> None:
    """
    Sets the minimum priority level for Zoom notifications.

    Args:
        min_priority_zoom: Minimum priority threshold for triggering Zoom alerts
    """
    self._min_priority_zoom = min_priority_zoom

set_notification_on_completion(notification_on_completion: bool) -> None

Enables or disables notifications when DQ processing completes.

Parameters:

Name Type Description Default
notification_on_completion bool

True to send notification on completion, False to disable

required
Source code in spark_expectations/core/context.py
def set_notification_on_completion(self, notification_on_completion: bool) -> None:
    """
    Enables or disables notifications when DQ processing completes.

    Args:
        notification_on_completion: True to send notification on completion, False to disable
    """
    self._notification_on_completion = notification_on_completion

set_notification_on_error_drop_exceeds_threshold_breach(notification_on_error_drop_exceeds_threshold_breach: bool) -> None

Enables or disables notifications when error drop exceeds threshold.

Parameters:

Name Type Description Default
notification_on_error_drop_exceeds_threshold_breach bool

True to send notification on threshold breach, False to disable

required
Source code in spark_expectations/core/context.py
def set_notification_on_error_drop_exceeds_threshold_breach(self, notification_on_error_drop_exceeds_threshold_breach: bool) -> None:
    """
    Enables or disables notifications when error drop exceeds threshold.

    Args:
        notification_on_error_drop_exceeds_threshold_breach: True to send notification on threshold breach, False to disable
    """
    self._notification_on_error_drop_exceeds_threshold_breach = notification_on_error_drop_exceeds_threshold_breach

set_notification_on_fail(notification_on_fail: bool) -> None

Enables or disables notifications when DQ processing fails.

Parameters:

Name Type Description Default
notification_on_fail bool

True to send notification on failure, False to disable

required
Source code in spark_expectations/core/context.py
def set_notification_on_fail(self, notification_on_fail: bool) -> None:
    """
    Enables or disables notifications when DQ processing fails.

    Args:
        notification_on_fail: True to send notification on failure, False to disable
    """
    self._notification_on_fail = notification_on_fail

set_notification_on_start(notification_on_start: bool) -> None

Enables or disables notifications when DQ processing starts.

Parameters:

Name Type Description Default
notification_on_start bool

True to send notification on start, False to disable

required
Source code in spark_expectations/core/context.py
def set_notification_on_start(self, notification_on_start: bool) -> None:
    """
    Enables or disables notifications when DQ processing starts.

    Args:
        notification_on_start: True to send notification on start, False to disable
    """
    self._notification_on_start = notification_on_start

set_notifications_on_rules_action_if_failed_set_ignore(notifications_on_rules_action_if_failed_set_ignore: bool) -> None

Enables or disables notifications for rules with action_if_failed set to 'ignore'.

Parameters:

Name Type Description Default
notifications_on_rules_action_if_failed_set_ignore bool

True to send notifications for ignored failures, False to disable

required
Source code in spark_expectations/core/context.py
def set_notifications_on_rules_action_if_failed_set_ignore(self, notifications_on_rules_action_if_failed_set_ignore: bool) -> None:
    """
    Enables or disables notifications for rules with action_if_failed set to 'ignore'.

    Args:
        notifications_on_rules_action_if_failed_set_ignore: True to send notifications for ignored failures, False to disable
    """
    self._notifications_on_rules_action_if_failed_set_ignore = notifications_on_rules_action_if_failed_set_ignore

set_num_agg_dq_rules(source_agg_enabled: bool = False, final_agg_enabled: bool = False) -> None

Increments the counters for applied aggregation DQ rules in the batch run.

Parameters:

Name Type Description Default
source_agg_enabled bool

True when aggregation rules are set for the source, by default False

False
final_agg_enabled bool

True when aggregation rules are set for the final table, by default False

False

Source code in spark_expectations/core/context.py
def set_num_agg_dq_rules(self, source_agg_enabled: bool = False, final_agg_enabled: bool = False) -> None:
    """
    Increments the counters for applied aggregation DQ rules in the batch run.
    Args:
        source_agg_enabled: True when aggregation rules are set for the source, by default False
        final_agg_enabled: True when aggregation rules are set for the final table, by default False
    Returns:
        None
    """

    self._num_agg_dq_rules["num_agg_dq_rules"] += 1
    self._num_dq_rules += 1

    if source_agg_enabled:
        self._num_agg_dq_rules["num_source_agg_dq_rules"] += 1
    if final_agg_enabled:
        self._num_agg_dq_rules["num_final_agg_dq_rules"] += 1

set_num_query_dq_rules(source_query_enabled: bool = False, final_query_enabled: bool = False) -> None

Increments the counters for applied query DQ rules in the batch run.

Parameters:

Name Type Description Default
source_query_enabled bool

True when query rules are set for the source, by default False

False
final_query_enabled bool

True when query rules are set for the final table, by default False

False

Source code in spark_expectations/core/context.py
def set_num_query_dq_rules(self, source_query_enabled: bool = False, final_query_enabled: bool = False) -> None:
    """
    Increments the counters for applied query DQ rules in the batch run.
    Args:
        source_query_enabled: True when query rules are set for the source, by default False
        final_query_enabled: True when query rules are set for the final table, by default False
    Returns:
        None
    """

    self._num_query_dq_rules["num_query_dq_rules"] += 1
    self._num_dq_rules += 1

    if source_query_enabled:
        self._num_query_dq_rules["num_source_query_dq_rules"] += 1
    if final_query_enabled:
        self._num_query_dq_rules["num_final_query_dq_rules"] += 1

set_num_row_dq_rules() -> None

Increments the counter for applied row DQ rules in the batch run.

Source code in spark_expectations/core/context.py
def set_num_row_dq_rules(self) -> None:
    """
    Increments the counter for applied row DQ rules in the batch run.
    Returns:
        None

    """
    self._num_row_dq_rules += 1
    self._num_dq_rules += 1
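
Despite the `set_` prefix, the three rule-count methods above increment counters rather than assign values, and each call also bumps the overall rule total. A plain-Python stand-in of the aggregation counter shows the effect of repeated calls:

```python
# Stand-in mirroring set_num_agg_dq_rules: counters are incremented per call,
# and every call also increments the overall DQ rule total.
class RuleCounters:
    def __init__(self) -> None:
        self.num_dq_rules = 0
        self.num_agg_dq_rules = {
            "num_agg_dq_rules": 0,
            "num_source_agg_dq_rules": 0,
            "num_final_agg_dq_rules": 0,
        }

    def set_num_agg_dq_rules(self, source_agg_enabled: bool = False,
                             final_agg_enabled: bool = False) -> None:
        self.num_agg_dq_rules["num_agg_dq_rules"] += 1
        self.num_dq_rules += 1
        if source_agg_enabled:
            self.num_agg_dq_rules["num_source_agg_dq_rules"] += 1
        if final_agg_enabled:
            self.num_agg_dq_rules["num_final_agg_dq_rules"] += 1

c = RuleCounters()
c.set_num_agg_dq_rules(source_agg_enabled=True)
c.set_num_agg_dq_rules(final_agg_enabled=True)
print(c.num_agg_dq_rules["num_agg_dq_rules"])  # 2
print(c.num_dq_rules)                          # 2
```

Calling these methods more than once per rule therefore inflates the reported counts.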

set_output_count(output_count: int = 0) -> None

Sets the output record count after DQ processing.

Parameters:

Name Type Description Default
output_count int

Number of records that passed DQ rules

0
Source code in spark_expectations/core/context.py
def set_output_count(self, output_count: int = 0) -> None:
    """
    Sets the output record count after DQ processing.

    Args:
        output_count: Number of records that passed DQ rules
    """
    self._output_count = output_count

set_pagerduty_creds_dict(pagerduty_creds_dict: Dict[str, str]) -> None

Sets the secrets dictionary used for PagerDuty authentication.

Parameters:

Name Type Description Default
pagerduty_creds_dict Dict[str, str]

Dictionary containing secrets for PagerDuty authentication

required

Source code in spark_expectations/core/context.py
def set_pagerduty_creds_dict(self, pagerduty_creds_dict: Dict[str, str]) -> None:
    """
    Sets the secrets dictionary used for PagerDuty authentication.
    Args:
        pagerduty_creds_dict (Dict[str, str]): Dictionary containing secrets for PagerDuty authentication
    """
    self._pagerduty_creds_dict = pagerduty_creds_dict

set_pagerduty_integration_key(pagerduty_integration_key: str) -> None

Sets the PagerDuty integration key manually.

Parameters:

Name Type Description Default
pagerduty_integration_key str

Integration key for PagerDuty when creating incidents.

required
Source code in spark_expectations/core/context.py
def set_pagerduty_integration_key(self, pagerduty_integration_key: str) -> None:
    """
    Sets the PagerDuty integration key manually.

    Args:
        pagerduty_integration_key (str): Integration key for PagerDuty when creating incidents.
    """
    self._pagerduty_integration_key = pagerduty_integration_key

set_pagerduty_webhook_url(pagerduty_webhook_url: str) -> None

Sets the PagerDuty webhook URL.

Parameters:

Name Type Description Default
pagerduty_webhook_url str

PagerDuty webhook URL used to create incidents

required

Source code in spark_expectations/core/context.py
def set_pagerduty_webhook_url(self, pagerduty_webhook_url: str) -> None:
    """
    Sets the PagerDuty webhook URL.
    Args:
        pagerduty_webhook_url (str): PagerDuty webhook URL used to create incidents
    """
    self._pagerduty_webhook_url = pagerduty_webhook_url

set_query_dq_detailed_stats_status(query_dq_detailed_result_status: bool) -> None

Enables or disables detailed query DQ results.

Parameters:

Name Type Description Default
query_dq_detailed_result_status bool

True to enable detailed query DQ results, False to disable

required

Source code in spark_expectations/core/context.py
def set_query_dq_detailed_stats_status(self, query_dq_detailed_result_status: bool) -> None:
    """
    Args:
        _enable_query_dq_detailed_result:
    Returns:
    """
    self._enable_query_dq_detailed_result = bool(query_dq_detailed_result_status)

set_query_dq_output_custom_table_name(query_dq_output_custom_table_name: str) -> None

Sets the custom table name for query DQ output results.

Parameters:

Name Type Description Default
query_dq_output_custom_table_name str

Fully qualified table name for query DQ output

required
Source code in spark_expectations/core/context.py
def set_query_dq_output_custom_table_name(self, query_dq_output_custom_table_name: str) -> None:
    """
    Sets the custom table name for query DQ output results.

    Args:
        query_dq_output_custom_table_name: Fully qualified table name for query DQ output
    """
    self._query_dq_output_custom_table_name = query_dq_output_custom_table_name

set_querydq_secondary_queries(querydq_secondary_queries: dict) -> None

Sets the secondary queries used by query DQ.

Parameters:

Name Type Description Default
querydq_secondary_queries dict

Dictionary of secondary queries for query DQ

required

Source code in spark_expectations/core/context.py
def set_querydq_secondary_queries(self, querydq_secondary_queries: dict) -> None:
    """
    Sets the secondary queries used by query DQ.
    Args:
        querydq_secondary_queries: Dictionary of secondary queries for query DQ
    Returns: None
    """
    self._querydq_secondary_queries = querydq_secondary_queries

set_report_table_name(report_table_name: str) -> None

Sets the report table name for DQ observability reports.

Parameters:

Name Type Description Default
report_table_name str

Fully qualified table name for DQ reports

required
Source code in spark_expectations/core/context.py
def set_report_table_name(self, report_table_name: str) -> None:
    """
    Sets the report table name for DQ observability reports.

    Args:
        report_table_name: Fully qualified table name for DQ reports
    """
    self._report_table_name = report_table_name

set_row_dq_end_time() -> None

This function sets the end time of row dq computation Returns: None

Source code in spark_expectations/core/context.py
def set_row_dq_end_time(self) -> None:
    """
    This function sets the end time of row dq computation
    Returns:
        None
    """
    self._row_dq_end_time = datetime.now()

set_row_dq_start_time() -> None

This function sets the start time of row dq computation Returns: None

Source code in spark_expectations/core/context.py
def set_row_dq_start_time(self) -> None:
    """
    This function sets the start time of row dq computation
    Returns:
        None
    """
    self._row_dq_start_time = datetime.now()

set_row_dq_status(row_dq_status: str = 'Skipped') -> None

Sets the row-level data quality check status.

Parameters:

Name Type Description Default
row_dq_status str

Status of row DQ execution (e.g., "Skipped", "Passed", "Failed")

'Skipped'
Source code in spark_expectations/core/context.py
def set_row_dq_status(self, row_dq_status: str = "Skipped") -> None:
    """
    Sets the row-level data quality check status.

    Args:
        row_dq_status: Status of row DQ execution (e.g., "Skipped", "Passed", "Failed")
    """
    self._row_dq_status = row_dq_status

set_rules_exceeds_threshold(rules: Optional[List[dict]] = None) -> None

This function sets the error percentage details for rules that exceed the threshold

Source code in spark_expectations/core/context.py
def set_rules_exceeds_threshold(self, rules: Optional[List[dict]] = None) -> None:
    """
    This function sets the error percentage details for rules that exceed the threshold
    """
    self._rules_error_per = rules

set_rules_execution_settings_config(config: dict) -> None

This function sets rules execution settings config Args: config: dict Returns: None

Source code in spark_expectations/core/context.py
def set_rules_execution_settings_config(self, config: dict) -> None:
    """
    This function sets rules execution settings config
    Args:
        config: dict
    Returns: None
    """
    self._rules_execution_settings_config = config

set_run_date() -> str staticmethod

This function is used to generate the current datetime in UTC

Returns:

Name Type Description
str str

Returns the current UTC datetime in the format - "%Y-%m-%d %H:%M:%S"

Source code in spark_expectations/core/context.py
@staticmethod
def set_run_date() -> str:
    """
    This function is used to generate the current datetime in UTC

    Returns:
        str: Returns the current UTC datetime in the format - "%Y-%m-%d %H:%M:%S"

    """
    current_datetime: datetime = datetime.now(timezone.utc)
    return current_datetime.replace(tzinfo=timezone.utc).strftime("%Y-%m-%d %H:%M:%S")
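
For illustration, the same run-date string can be produced standalone; the `run_date` helper below is a hypothetical sketch mirroring this logic, not part of the library:

```python
from datetime import datetime, timezone

def run_date() -> str:
    # Mirrors set_run_date: current UTC time, second precision,
    # rendered as "YYYY-MM-DD HH:MM:SS".
    return datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S")

stamp = run_date()
# The result always parses back with the same format string.
datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S")
```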

set_se_dq_obs_alert_flag(se_dq_obs_alert_flag: bool) -> None

This function is used to set the se_dq_obs_alert_flag

Returns:

Type Description
None

None

Source code in spark_expectations/core/context.py
def set_se_dq_obs_alert_flag(self, se_dq_obs_alert_flag: bool) -> None:
    """
    This function is used to set the se_dq_obs_alert_flag

    Returns:
        None

    """
    self._se_dq_obs_alert_flag = se_dq_obs_alert_flag

set_se_enable_error_table(_enable_error_table: bool) -> None

Enables or disables the error table for failed records.

Parameters:

Name Type Description Default
_enable_error_table bool

Whether to enable writing failed records to the error table

required
Source code in spark_expectations/core/context.py
def set_se_enable_error_table(self, _enable_error_table: bool) -> None:
    """
    Enables or disables the error table for failed records.

    Args:
        _enable_error_table: Whether to enable writing failed records to the error table
    """
    self._se_enable_error_table = _enable_error_table

set_se_job_metadata(se_job_metadata: Optional[Any] = None) -> None

This function sets Spark Expectations job metadata override.

Returns:

Type Description
None

None

Source code in spark_expectations/core/context.py
def set_se_job_metadata(self, se_job_metadata: Optional[Any] = None) -> None:
    """
    This function sets Spark Expectations job metadata override.

    Returns:
        None
    """
    if se_job_metadata is None:
        self._se_job_metadata = None
        return

    if isinstance(se_job_metadata, dict):
        self._se_job_metadata = se_job_metadata
        return

    if isinstance(se_job_metadata, str):
        try:
            parsed_value = ast.literal_eval(se_job_metadata)
            if isinstance(parsed_value, dict):
                self._se_job_metadata = parsed_value
                return
        except (ValueError, SyntaxError) as e:
            _log.warning(f"Failed to parse se_job_metadata string as dict: {e}")

        self._se_job_metadata = {"job_metadata": se_job_metadata}
        return

    self._se_job_metadata = {"job_metadata": str(se_job_metadata)}
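
The branching above can be sketched as a standalone helper (the name `normalize_job_metadata` is hypothetical; the branches mirror the source: `None` passes through, dicts are kept, dict-like strings are parsed, and anything else is wrapped under a `"job_metadata"` key):

```python
import ast
from typing import Any, Optional

def normalize_job_metadata(value: Optional[Any]) -> Optional[dict]:
    """Coerce job metadata into a dict, mirroring set_se_job_metadata."""
    if value is None:
        return None
    if isinstance(value, dict):
        return value
    if isinstance(value, str):
        try:
            parsed = ast.literal_eval(value)
            if isinstance(parsed, dict):
                return parsed
        except (ValueError, SyntaxError):
            pass  # not a literal dict string; fall through and wrap it
        return {"job_metadata": value}
    return {"job_metadata": str(value)}
```

So a string such as `"{'team': 'dq'}"` is parsed into a real dict, while a plain string like `"nightly run"` is preserved as `{"job_metadata": "nightly run"}` rather than dropped.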

set_se_streaming_stats_dict(se_streaming_stats_dict: Dict[str, str]) -> None

This function helps to set the secret keys dict for streaming stats

Source code in spark_expectations/core/context.py
def set_se_streaming_stats_dict(self, se_streaming_stats_dict: Dict[str, str]) -> None:
    """
    This function helps to set the secret keys dict for streaming stats"""
    self._se_streaming_stats_dict = se_streaming_stats_dict

set_se_streaming_stats_kafka_bootstrap_server(se_streaming_stats_kafka_server: str) -> None

This function sets the Kafka bootstrap server for streaming stats

Source code in spark_expectations/core/context.py
def set_se_streaming_stats_kafka_bootstrap_server(self, se_streaming_stats_kafka_server: str) -> None:
    """This function sets the Kafka bootstrap server for streaming stats"""
    self._se_streaming_stats_kafka_bootstrap_server = se_streaming_stats_kafka_server

set_se_streaming_stats_kafka_custom_config_enable(se_streaming_stats_kafka_config_enable: bool) -> None

This function enables or disables custom Kafka config for streaming stats

Source code in spark_expectations/core/context.py
def set_se_streaming_stats_kafka_custom_config_enable(self, se_streaming_stats_kafka_config_enable: bool) -> None:
    """This function enables or disables custom Kafka config for streaming stats"""
    self._se_streaming_stats_kafka_custom_config_enable = se_streaming_stats_kafka_config_enable

set_slack_webhook_url(slack_webhook_url: str) -> None

Sets the Slack webhook URL for notifications.

Parameters:

Name Type Description Default
slack_webhook_url str

Slack incoming webhook URL

required
Source code in spark_expectations/core/context.py
def set_slack_webhook_url(self, slack_webhook_url: str) -> None:
    """
    Sets the Slack webhook URL for notifications.

    Args:
        slack_webhook_url: Slack incoming webhook URL
    """
    self._slack_webhook_url = slack_webhook_url

set_smtp_creds_dict(smtp_creds_dict: Dict[str, str]) -> None

This function helps to set the secret keys dict for SMTP server authentication

Source code in spark_expectations/core/context.py
def set_smtp_creds_dict(self, smtp_creds_dict: Dict[str, str]) -> None:
    """
    This function helps to set the secret keys dict for SMTP server authentication"""
    self._smtp_creds_dict = smtp_creds_dict

set_source_agg_dq_detailed_stats(source_agg_dq_detailed_stats: Optional[List[Tuple]] = None) -> None

Sets the detailed statistics for source agg dq.

Parameters:

Name Type Description Default
source_agg_dq_detailed_stats Optional[List[Tuple]]

Detailed statistics tuples for source agg dq

None
Source code in spark_expectations/core/context.py
def set_source_agg_dq_detailed_stats(self, source_agg_dq_detailed_stats: Optional[List[Tuple]] = None) -> None:
    """
    Sets the detailed statistics for source agg dq.

    Args:
        source_agg_dq_detailed_stats: Detailed statistics tuples for source agg dq
    """
    self._source_agg_dq_detailed_stats = source_agg_dq_detailed_stats

set_source_agg_dq_end_time() -> None

This function sets the end time of source agg dq computation Returns: None

Source code in spark_expectations/core/context.py
def set_source_agg_dq_end_time(self) -> None:
    """
    This function sets the end time of source agg dq computation
    Returns:
        None
    """
    self._source_agg_dq_end_time = datetime.now()

set_source_agg_dq_result(source_agg_dq_result: Optional[List[Dict[str, str]]] = None) -> None

Sets the source aggregate data quality check results.

Parameters:

Name Type Description Default
source_agg_dq_result Optional[List[Dict[str, str]]]

List of dictionaries containing source agg DQ results

None
Source code in spark_expectations/core/context.py
def set_source_agg_dq_result(self, source_agg_dq_result: Optional[List[Dict[str, str]]] = None) -> None:
    """
    Sets the source aggregate data quality check results.

    Args:
        source_agg_dq_result: List of dictionaries containing source agg DQ results
    """
    self._source_agg_dq_result = source_agg_dq_result

set_source_agg_dq_start_time() -> None

This function sets the start time of source agg dq computation Returns: None

Source code in spark_expectations/core/context.py
def set_source_agg_dq_start_time(self) -> None:
    """
    This function sets the start time of source agg dq computation
    Returns:
        None
    """
    self._source_agg_dq_start_time = datetime.now()

set_source_agg_dq_status(source_agg_dq_status: str = 'Skipped') -> None

Sets the source aggregate data quality check status.

Parameters:

Name Type Description Default
source_agg_dq_status str

Status of source agg DQ execution (e.g., "Skipped", "Passed", "Failed")

'Skipped'
Source code in spark_expectations/core/context.py
def set_source_agg_dq_status(self, source_agg_dq_status: str = "Skipped") -> None:
    """
    Sets the source aggregate data quality check status.

    Args:
        source_agg_dq_status: Status of source agg DQ execution (e.g., "Skipped", "Passed", "Failed")
    """
    self._source_agg_dq_status = source_agg_dq_status

set_source_query_dq_detailed_stats(source_query_dq_detailed_stats: Optional[List[Tuple]] = None) -> None

Sets the detailed statistics for source query dq.

Parameters:

Name Type Description Default
source_query_dq_detailed_stats Optional[List[Tuple]]

Detailed statistics tuples for source query dq

None
Source code in spark_expectations/core/context.py
def set_source_query_dq_detailed_stats(self, source_query_dq_detailed_stats: Optional[List[Tuple]] = None) -> None:
    """
    Sets the detailed statistics for source query dq.

    Args:
        source_query_dq_detailed_stats: Detailed statistics tuples for source query dq
    """
    self._source_query_dq_detailed_stats = source_query_dq_detailed_stats

set_source_query_dq_end_time() -> None

This function sets the end time of source query dq computation Returns: None

Source code in spark_expectations/core/context.py
def set_source_query_dq_end_time(self) -> None:
    """
    This function sets the end time of source query dq computation
    Returns:
        None
    """
    self._source_query_dq_end_time = datetime.now()

set_source_query_dq_output(source_query_dq_output: Optional[List[dict]] = None) -> None

This function sets the source query dq output Args: source_query_dq_output: List[dict] Returns: None

Source code in spark_expectations/core/context.py
def set_source_query_dq_output(self, source_query_dq_output: Optional[List[dict]] = None) -> None:
    """
    This function sets the source query dq output
    Args:
        source_query_dq_output: List[dict]
    Returns: None
    """
    self._source_query_dq_output = source_query_dq_output

set_source_query_dq_result(source_query_dq_result: Optional[List[Dict[str, str]]] = None) -> None

Sets the source query data quality check results.

Parameters:

Name Type Description Default
source_query_dq_result Optional[List[Dict[str, str]]]

List of dictionaries containing source query DQ results

None
Source code in spark_expectations/core/context.py
def set_source_query_dq_result(self, source_query_dq_result: Optional[List[Dict[str, str]]] = None) -> None:
    """
    Sets the source query data quality check results.

    Args:
        source_query_dq_result: List of dictionaries containing source query DQ results
    """
    self._source_query_dq_result = source_query_dq_result

set_source_query_dq_start_time() -> None

This function sets the start time of source query dq computation Returns: None

Source code in spark_expectations/core/context.py
def set_source_query_dq_start_time(self) -> None:
    """
    This function sets the start time of source query dq computation
    Returns:
        None
    """
    self._source_query_dq_start_time = datetime.now()

set_source_query_dq_status(source_query_dq_status: str = 'Skipped') -> None

Sets the source query data quality check status.

Parameters:

Name Type Description Default
source_query_dq_status str

Status of source query DQ execution (e.g., "Skipped", "Passed", "Failed")

'Skipped'
Source code in spark_expectations/core/context.py
def set_source_query_dq_status(self, source_query_dq_status: str = "Skipped") -> None:
    """
    Sets the source query data quality check status.

    Args:
        source_query_dq_status: Status of source query DQ execution (e.g., "Skipped", "Passed", "Failed")
    """
    self._source_query_dq_status = source_query_dq_status

set_stats_detailed_dataframe(dataframe: DataFrame) -> None

Sets the DataFrame containing detailed statistics.

Parameters:

Name Type Description Default
dataframe DataFrame

DataFrame with detailed DQ statistics

required
Source code in spark_expectations/core/context.py
def set_stats_detailed_dataframe(self, dataframe: DataFrame) -> None:
    """
    Sets the DataFrame containing detailed statistics.

    Args:
        dataframe: DataFrame with detailed DQ statistics
    """
    self._dataframe = dataframe

set_stats_dict(df: DataFrame) -> None

This function is used to set the stats_dict from the given DataFrame

Returns:

Type Description
None

None

Source code in spark_expectations/core/context.py
def set_stats_dict(self, df: DataFrame) -> None:
    """
    This function collects the rows of the given DataFrame and stores them
    as a list of dictionaries in the stats_dict

    Returns:
        None
    """
    self._stats_dict = [row.asDict() for row in df.collect()]
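
The collect-and-convert step amounts to calling `asDict()` on each collected row. A minimal sketch, using a stand-in class instead of a real PySpark `Row` (which exposes the same `asDict()` method):

```python
class _Row:
    """Stand-in for pyspark.sql.Row, for illustration only."""

    def __init__(self, **fields):
        self._data = dict(fields)

    def asDict(self):
        return dict(self._data)

# Hypothetical collected output of a stats DataFrame.
collected = [
    _Row(rule="nulls_check", failed_count=3),
    _Row(rule="range_check", failed_count=0),
]
# Same expression as in set_stats_dict, applied to the collected rows.
stats_dict = [row.asDict() for row in collected]
```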

set_stats_table_writer_config(config: dict) -> None

This function sets stats table writer config Args: config: dict Returns: None

Source code in spark_expectations/core/context.py
def set_stats_table_writer_config(self, config: dict) -> None:
    """
    This function sets stats table writer config
    Args:
        config: dict
    Returns: None
    """
    self._stats_table_writer_config = config

set_stats_table_writer_type(writer_type: str) -> None

This function sets stats table writer type Args: writer_type: str Returns: None

Source code in spark_expectations/core/context.py
def set_stats_table_writer_type(self, writer_type: str) -> None:
    """
    This function sets stats table writer type
    Args:
        writer_type: str
    Returns:
        None
    """
    self._stats_table_writer_type = writer_type

set_summarized_row_dq_res(summarized_row_dq_res: Optional[List[Dict[str, str]]] = None) -> None

This function sets the summarized row dq results Args: summarized_row_dq_res: list(dict) Returns: None

Source code in spark_expectations/core/context.py
def set_summarized_row_dq_res(self, summarized_row_dq_res: Optional[List[Dict[str, str]]] = None) -> None:
    """
    This function sets the summarized row dq results
    Args:
        summarized_row_dq_res: list(dict)
    Returns: None

    """
    self._summarized_row_dq_res = summarized_row_dq_res

set_supported_df_query_dq() -> DataFrame

Creates a single-row placeholder DataFrame used to run query dq checks.

Source code in spark_expectations/core/context.py
def set_supported_df_query_dq(self) -> DataFrame:
    """Creates a single-row placeholder DataFrame used to run query dq checks."""
    return self.spark.createDataFrame(
        [{"spark_expectations_query_check": "supported_place_holder_dataset_to_run_query_check"}]
    )

set_table_name(table_name: str) -> None

Sets the table name being processed.

Parameters:

Name Type Description Default
table_name str

Name of the table being processed

required
Source code in spark_expectations/core/context.py
def set_table_name(self, table_name: str) -> None:
    """
    Sets the table name being processed.

    Args:
        table_name: Name of the table being processed
    """
    self._table_name = table_name

set_target_agg_dq_detailed_stats(target_agg_dq_detailed_stats: Optional[List[Tuple]] = None) -> None

Sets the detailed statistics for target agg dq.

Parameters:

Name Type Description Default
target_agg_dq_detailed_stats Optional[List[Tuple]]

Detailed statistics tuples for target agg dq

None
Source code in spark_expectations/core/context.py
def set_target_agg_dq_detailed_stats(self, target_agg_dq_detailed_stats: Optional[List[Tuple]] = None) -> None:
    """
    Sets the detailed statistics for target agg dq.

    Args:
        target_agg_dq_detailed_stats: Detailed statistics tuples for target agg dq
    """
    self._target_agg_dq_detailed_stats = target_agg_dq_detailed_stats

set_target_and_error_table_writer_config(config: dict) -> None

This function sets target and error table writer config Args: config: dict Returns: None

Source code in spark_expectations/core/context.py
def set_target_and_error_table_writer_config(self, config: dict) -> None:
    """
    This function sets target and error table writer config
    Args:
        config: dict
    Returns: None

    """
    self._target_and_error_table_writer_config = config

set_target_and_error_table_writer_type(writer_type: str) -> None

This function sets target and error table writer type Args: writer_type: str Returns: None

Source code in spark_expectations/core/context.py
def set_target_and_error_table_writer_type(self, writer_type: str) -> None:
    """
    This function sets target and error table writer type
    Args:
        writer_type: str
    Returns:
        None
    """
    self._target_and_error_table_writer_type = writer_type

set_target_query_dq_detailed_stats(target_query_dq_detailed_stats: Optional[List[Tuple]] = None) -> None

Sets the detailed statistics for target query dq.

Parameters:

Name Type Description Default
target_query_dq_detailed_stats Optional[List[Tuple]]

Detailed statistics tuples for target query dq

None
Source code in spark_expectations/core/context.py
def set_target_query_dq_detailed_stats(self, target_query_dq_detailed_stats: Optional[List[Tuple]] = None) -> None:
    """
    Sets the detailed statistics for target query dq.

    Args:
        target_query_dq_detailed_stats: Detailed statistics tuples for target query dq
    """
    self._target_query_dq_detailed_stats = target_query_dq_detailed_stats

set_target_query_dq_output(target_query_dq_output: Optional[List[dict]] = None) -> None

This function sets the target query dq output Args: target_query_dq_output: List[dict] Returns: None

Source code in spark_expectations/core/context.py
def set_target_query_dq_output(self, target_query_dq_output: Optional[List[dict]] = None) -> None:
    """
    This function sets the target query dq output
    Args:
        target_query_dq_output: List[dict]
    Returns: None
    """
    self._target_query_dq_output = target_query_dq_output

set_teams_webhook_url(teams_webhook_url: str) -> None

Sets the Microsoft Teams webhook URL for notifications.

Parameters:

Name Type Description Default
teams_webhook_url str

Teams incoming webhook URL

required
Source code in spark_expectations/core/context.py
def set_teams_webhook_url(self, teams_webhook_url: str) -> None:
    """
    Sets the Microsoft Teams webhook URL for notifications.

    Args:
        teams_webhook_url: Teams incoming webhook URL
    """
    self._teams_webhook_url = teams_webhook_url

set_to_mail(to_mail: str) -> None

Sets the recipient email addresses for notifications.

Parameters:

Name Type Description Default
to_mail str

Comma-separated email addresses of recipients

required
Source code in spark_expectations/core/context.py
def set_to_mail(self, to_mail: str) -> None:
    """
    Sets the recipient email addresses for notifications.

    Args:
        to_mail: Comma-separated email addresses of recipients
    """
    self._to_mail = to_mail

set_zoom_token(zoom_token: str) -> None

Set the Zoom webhook token.

Parameters:

Name Type Description Default
zoom_token str

The token for Zoom notification.

required
Source code in spark_expectations/core/context.py
def set_zoom_token(self, zoom_token: str) -> None:
    """
    Set the Zoom webhook token.

    Args:
        zoom_token (str): The token for Zoom notification.
    """
    self._zoom_token = zoom_token

set_zoom_webhook_url(zoom_webhook_url: str) -> None

Set the Zoom webhook URL.

Parameters:

Name Type Description Default
zoom_webhook_url str

The webhook URL for Zoom notification.

required
Source code in spark_expectations/core/context.py
def set_zoom_webhook_url(self, zoom_webhook_url: str) -> None:
    """
    Set the Zoom webhook URL.

    Args:
        zoom_webhook_url (str): The webhook URL for Zoom notification.
    """
    self._zoom_webhook_url = zoom_webhook_url