
Comparison

The differences between versions are documented below; the changes in the latest three releases are covered.

Modifications made in each version during implementation or integration:

Each stage below is compared across versions 0.6.0, 0.7.0 and 0.8.0.
Rules table schema changes

0.6.0: refer to the rules table creation documented here.

0.7.0: added three additional columns (documentation found here):
1. enable_for_source_dq_validation (boolean)
2. enable_for_target_dq_validation (boolean)
3. is_active (boolean)

0.8.0: added two additional columns (documentation found here):
1. enable_error_drop_alert (boolean)
2. error_drop_threshold (int)
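The new column names above come from the version notes; the table name dq_rules and the use of Spark SQL ALTER statements are assumptions for illustration. A minimal sketch of applying the 0.7.0 and 0.8.0 rules table schema changes when upgrading an existing table:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# "dq_rules" is a placeholder; substitute your actual rules table name.
# Columns introduced in 0.7.0
spark.sql("""
    ALTER TABLE dq_rules ADD COLUMNS (
        enable_for_source_dq_validation BOOLEAN,
        enable_for_target_dq_validation BOOLEAN,
        is_active BOOLEAN
    )
""")

# Columns introduced in 0.8.0
spark.sql("""
    ALTER TABLE dq_rules ADD COLUMNS (
        enable_error_drop_alert BOOLEAN,
        error_drop_threshold INT
    )
""")
```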
Rules table creation required

0.6.0: yes

0.7.0: yes - creation is not required if you're upgrading from an older version, but the schema changes are required

0.8.0: yes - creation is not required if you're upgrading from an older version, but the schema changes are required
Stats table schema changes

0.6.0: refer to the stats table creation documented here.

0.7.0: added additional columns (documentation found here):
1. source_query_dq_results
2. final_query_dq_results
3. row_dq_res_summary
4. dq_run_time
5. dq_rules

renamed columns:
1. runtime to meta_dq_run_time
2. run_date to meta_dq_run_date
3. run_id to meta_dq_run_id

0.8.0: remains the same
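The added and renamed column names come from the version notes; the table name dq_stats and the column types below are placeholders, and RENAME COLUMN assumes a table format that supports it (for example Delta with column mapping enabled). A minimal sketch of the 0.7.0 stats table schema changes:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# "dq_stats" and the STRING types are placeholders; check the stats table
# creation documentation for the exact table name and column types.
spark.sql("""
    ALTER TABLE dq_stats ADD COLUMNS (
        source_query_dq_results STRING,
        final_query_dq_results STRING,
        row_dq_res_summary STRING,
        dq_run_time STRING,
        dq_rules STRING
    )
""")

# Column renames introduced in 0.7.0
for old_name, new_name in [
    ("runtime", "meta_dq_run_time"),
    ("run_date", "meta_dq_run_date"),
    ("run_id", "meta_dq_run_id"),
]:
    spark.sql(f"ALTER TABLE dq_stats RENAME COLUMN {old_name} TO {new_name}")
```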
Stats table creation required

0.6.0: yes

0.7.0: yes - creation is not required if you're upgrading from an older version, but the schema changes are required

0.8.0: automated
Notification config setting

0.6.0: define a global notification parameter, register it as an environment variable, and place it in the __init__.py file for reuse (example)

0.7.0: define a global notification parameter in the __init__.py file so it can be reused wherever the spark_conf parameter needs to be passed to the with_expectations function (example)

0.8.0: remains the same
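A minimal sketch of such a shared notification configuration, assuming a package-level __init__.py; the dictionary name and the configuration keys are illustrative placeholders rather than the library's documented keys. The dictionary is then handed to the decorator through the spark_conf parameter (see the decorator sketch further below).

```python
# my_package/__init__.py
# Placeholder key names - replace them with the notification keys documented
# for your spark-expectations version (email/Slack settings, recipients, etc.).
notification_conf = {
    "spark.expectations.notifications.email.enabled": False,
    "spark.expectations.notifications.slack.enabled": False,
}
```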
Secret store and Kafka authentication details

0.6.0: not applicable

0.7.0: not applicable

0.8.0: create a dictionary that contains your secret configuration values and register it in __init__.py for reuse (example)
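A minimal sketch of the 0.8.0 secret configuration dictionary, again assuming a package-level __init__.py; every key name below is an illustrative placeholder, and in practice the values should come from your secret store or environment rather than being hard-coded.

```python
# my_package/__init__.py
import os

# Placeholder key names - replace them with the secret-store / Kafka
# authentication keys documented for your spark-expectations version.
stats_streaming_secrets = {
    "secret.store.type": os.environ.get("SECRET_STORE_TYPE", ""),
    "kafka.bootstrap.servers": os.environ.get("KAFKA_BOOTSTRAP_SERVERS", ""),
    "kafka.topic": os.environ.get("DQ_STATS_TOPIC", ""),
}
```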
Spark Expectations initialisation

0.6.0: create a SparkExpectations class object using the SparkExpectations library by passing the product_id

0.7.0: create a SparkExpectations class object using SparkExpectations by passing the product_id and the optional parameter debugger (example)

0.8.0: create a SparkExpectations class object using SparkExpectations by passing the product_id and the additional optional parameters debugger and stats_streaming_options (example)
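A minimal sketch of the initialisation across versions; the import path is an assumption (check your installed release), while product_id, debugger and stats_streaming_options are the parameters named above.

```python
from spark_expectations.core.expectations import SparkExpectations

# 0.6.0 / 0.7.0 style: product_id, with debugger optional from 0.7.0 onwards
se = SparkExpectations(product_id="your_product", debugger=False)

# 0.8.0 style: stats_streaming_options additionally accepted, e.g. the
# secret/Kafka dictionary registered in __init__.py (placeholder shown here)
se = SparkExpectations(
    product_id="your_product",
    debugger=False,
    stats_streaming_options={"kafka.bootstrap.servers": ""},  # placeholder
)
```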
Spark Expectations decorator

0.6.0: the decorator allows configuration by passing individual parameters to each decorator; however, registering a DataFrame view within a decorated function is not supported for query_dq implementations (example)

0.7.0: the decorator allows configurations to be logically grouped through a dictionary passed as a parameter to the decorator; additionally, registering a DataFrame view within a decorated function is supported for query_dq implementations (example)

0.8.0: remains the same
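A minimal sketch of the 0.7.0+ decorator usage; with_expectations and spark_conf are the names used above, while the import path, the rules argument, the table names and the view name are illustrative placeholders.

```python
from pyspark.sql import SparkSession, DataFrame
from spark_expectations.core.expectations import SparkExpectations

spark = SparkSession.builder.getOrCreate()
se = SparkExpectations(product_id="your_product")

@se.with_expectations(
    "dq_rules",  # placeholder for the rules/expectations argument
    spark_conf={"spark.expectations.notifications.email.enabled": False},
    # 0.7.0+: related settings grouped into one dictionary instead of
    # individual decorator parameters as in 0.6.0
)
def build_orders() -> DataFrame:
    df = spark.read.table("source.orders")
    # 0.7.0+: a view registered inside the decorated function can be
    # referenced by query_dq rules
    df.createOrReplaceTempView("orders_view")
    return df
```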