
# Comparison

This page documents the differences between versions; the changes in the latest three versions are covered.

The table below lists the modifications to be made during implementation or integration with each version.

| Stage | 0.8.0 | 1.0.0 | 1.2.0 |
|-------|-------|-------|-------|
| rules table schema changes | Added two additional columns: 1. `enable_error_drop_alert` (boolean), 2. `error_drop_threshold` (int) - documentation found here | Remains same | Remains same |
| rules table creation required | Yes - creation is not required if you are upgrading from an older version, but the schema changes are required (see the upgrade sketch after the table) | Remains same - creation is not required if you are upgrading from an older version, but the schema changes are required | Remains same. Additionally, DQ rules are updated dynamically based on parameters passed in externally |
| stats table schema changes | Remains same | Remains same | Remains same. Additionally, all row DQ rule stats are captured in the row DQ rules summary |
| stats table creation required | Automated | Remains same | Remains same |
| notification config setting | Remains same | Remains same | Remains same |
| secret store and Kafka authentication details | Create a dictionary that contains your secret configuration values and register it in `__init__.py` for reuse - example (see the configuration sketch after the table) | Remains same. You can disable streaming if needed in the `SparkExpectations` class | Remains same |
| spark expectations initialization | Create a `SparkExpectations` class object by passing `product_id` and the additional optional parameters `debugger` and `stats_streaming_options` - example | New arguments are added. Please follow this - example (see the initialization sketch after the table) | Remains same |
| `with_expectations` decorator | Remains same | New arguments are added. Please follow this - example (see the decorator sketch after the table) | Remains same |
| `WrappedDataFrameWriter` | Doesn't exist | This is new; users need to provide the writer object to record the Spark conf to be used while writing - example (see the initialization sketch after the table) | Remains same |
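
If you are upgrading an existing rules table to 0.8.0 rather than recreating it, the two new columns can be added in place. A minimal sketch, assuming a Delta-backed rules table named `dq_spark.dq_rules` (a placeholder); the exact `ALTER TABLE` syntax depends on your table format:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Add the two columns introduced in 0.8.0 to an existing rules table.
# "dq_spark.dq_rules" is a placeholder; use your own rules table name.
spark.sql(
    """
    ALTER TABLE dq_spark.dq_rules ADD COLUMNS (
        enable_error_drop_alert BOOLEAN,
        error_drop_threshold INT
    )
    """
)
```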
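
For the secret store and Kafka authentication details, the idea is to keep the configuration in one dictionary (for example in your package's `__init__.py`) and reuse it wherever Spark Expectations is initialized. A minimal sketch, assuming the `user_config` constants module described in the project docs; the secret-store and Kafka entries themselves depend on your environment and are deliberately left as a comment rather than invented here:

```python
# your_package/__init__.py
from spark_expectations.config.user_config import Constants as user_config

# Shared streaming/secret configuration, defined once and imported wherever
# SparkExpectations is created.
stats_streaming_config: dict = {
    # From 1.0.0 onwards, stats streaming can be disabled entirely by
    # setting this to False when constructing SparkExpectations.
    user_config.se_enable_streaming: True,
    # Add the secret-store and Kafka authentication entries required by your
    # environment here (scope, server URL, topic, token, ...); the exact keys
    # depend on your secret backend and installed version.
}
```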
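
From 1.0.0 onwards, `SparkExpectations` takes additional arguments, including a `WrappedDataFrameWriter` that records the Spark writer configuration (mode, format, options) to use when writing the target and error tables. A minimal initialization sketch following the 1.x-style API; the product id and table names are placeholders, and parameter names should be checked against your installed version:

```python
from pyspark.sql import SparkSession
from spark_expectations.core.expectations import SparkExpectations, WrappedDataFrameWriter

spark = SparkSession.builder.getOrCreate()

# Records the Spark conf to be used while writing the target and error tables.
writer = WrappedDataFrameWriter().mode("append").format("delta")

se = SparkExpectations(
    product_id="your_product",                   # placeholder product id
    rules_df=spark.table("dq_spark.dq_rules"),   # placeholder rules table
    stats_table="dq_spark.dq_stats",             # placeholder stats table
    stats_table_writer=writer,
    target_and_error_table_writer=writer,
    debugger=False,
)
```

In 0.8.0 the constructor took only `product_id` plus the optional `debugger` and `stats_streaming_options` parameters; the writer-related arguments are the additions called out in the table.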
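
The `with_expectations` decorator also gained new arguments in 1.0.0, most notably the target table to write to. A minimal, self-contained decorator sketch reusing the same placeholder names as the initialization sketch; argument names should be verified against your installed version:

```python
from pyspark.sql import DataFrame, SparkSession
from spark_expectations.core.expectations import SparkExpectations, WrappedDataFrameWriter

spark = SparkSession.builder.getOrCreate()
writer = WrappedDataFrameWriter().mode("append").format("delta")

se = SparkExpectations(
    product_id="your_product",
    rules_df=spark.table("dq_spark.dq_rules"),
    stats_table="dq_spark.dq_stats",
    stats_table_writer=writer,
    target_and_error_table_writer=writer,
)

@se.with_expectations(
    target_table="dq_spark.customer_order",  # placeholder target table
    write_to_table=True,
)
def build_customer_order() -> DataFrame:
    # Placeholder transformation; the rules registered for this target table
    # are applied to the DataFrame returned here.
    return spark.table("dq_spark.customer_order_source")
```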