Classes¶
brickflow_plugins.databricks.workflow_dependency_sensor.WorkflowDependencySensor(databricks_host: str, databricks_token: Union[str, SecretStr], delta: timedelta, timeout_seconds: int, dependency_job_id: int = None, dependency_job_name: str = None, poke_interval_seconds: int = 60)
¶
This sensor creates a dependency on another Databricks workflow (job): the task waits until a run of that job has completed successfully.

Example usage in your Brickflow task:

```python
service_principle_pat = ctx.dbutils.secrets.get("brickflow-demo-tobedeleted", "service_principle_id")

WorkflowDependencySensor(
    databricks_host="https://your_workspace_url.cloud.databricks.com",
    databricks_token=service_principle_pat,
    dependency_job_id=job_id,
    poke_interval_seconds=20,
    timeout_seconds=60,
    delta=timedelta(days=1),
)
```

In the above snippet, Databricks secrets are used as a secure service to store the Databricks token. If your token comes from another secret-management service, such as AWS Secrets Manager, GCP Secret Manager, or Azure Key Vault, simply pass it in via the `databricks_token` argument.
Source code in brickflow_plugins/databricks/workflow_dependency_sensor.py
Attributes¶
databricks_host = databricks_host
instance-attribute
¶
databricks_token = databricks_token if isinstance(databricks_token, SecretStr) else SecretStr(databricks_token)
instance-attribute
¶
delta = delta
instance-attribute
¶
dependency_job_id = dependency_job_id
instance-attribute
¶
dependency_job_name = dependency_job_name
instance-attribute
¶
log = logging
instance-attribute
¶
poke_interval = poke_interval_seconds
instance-attribute
¶
start_time = time.time()
instance-attribute
¶
timeout = timeout_seconds
instance-attribute
¶
Functions¶
execute()
¶
Source code in brickflow_plugins/databricks/workflow_dependency_sensor.py
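`execute()` is the sensor's entry point. Its behaviour can be sketched as a simple poll loop: query the dependency's run state every `poke_interval` seconds until it succeeds, or raise once `timeout` elapses. The snippet below is a minimal, stdlib-only illustration of that pattern, not the plugin's actual implementation; `get_state` is a hypothetical stand-in for the Databricks Jobs API call that fetches the dependency run's result state.

```python
import time


def wait_for_success(get_state, poke_interval_seconds: int, timeout_seconds: int) -> str:
    """Poll `get_state` until it returns 'SUCCESS' or the timeout elapses."""
    start = time.time()
    while True:
        state = get_state()
        if state == "SUCCESS":
            return state
        if time.time() - start > timeout_seconds:
            raise TimeoutError(f"Dependency not successful after {timeout_seconds}s")
        time.sleep(poke_interval_seconds)


# Usage: a stub state source that succeeds on the third poll.
states = iter(["RUNNING", "RUNNING", "SUCCESS"])
result = wait_for_success(lambda: next(states), poke_interval_seconds=0, timeout_seconds=5)
# result == "SUCCESS"
```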
get_execution_start_time_unix_milliseconds() -> int
¶
Source code in brickflow_plugins/databricks/workflow_dependency_sensor.py
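Judging by its name and return type, this helper expresses `now - delta` as unix epoch milliseconds, i.e. the earliest start time a dependency run may have to satisfy the sensor. A minimal sketch under that assumption (the function name and the fixed reference time below are illustrative, not the plugin's actual code):

```python
from datetime import datetime, timedelta, timezone
from typing import Optional


def execution_start_time_unix_ms(delta: timedelta, now: Optional[datetime] = None) -> int:
    # Earliest acceptable run start time, as unix epoch milliseconds.
    now = now or datetime.now(timezone.utc)
    return int((now - delta).timestamp() * 1000)


# With a fixed reference time, looking back one day:
ref = datetime(2024, 1, 2, tzinfo=timezone.utc)
start_ms = execution_start_time_unix_ms(timedelta(days=1), now=ref)
# start_ms == 1704067200000  (2024-01-01T00:00:00Z)
```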
get_http_session()
cached
¶
Source code in brickflow_plugins/databricks/workflow_dependency_sensor.py
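`get_http_session()` is marked `cached`, so repeated calls return the same session object instead of rebuilding connection pools on every poll. A plausible sketch of such a helper, assuming the common `requests` + `urllib3` retry pattern; the retry policy shown here is an assumption, not the plugin's actual configuration:

```python
import functools

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry


@functools.lru_cache(maxsize=None)
def get_http_session() -> requests.Session:
    # Assumed retry policy: back off on transient server/throttling errors.
    session = requests.Session()
    retry = Retry(total=5, backoff_factor=1, status_forcelist=[429, 500, 502, 503, 504])
    session.mount("https://", HTTPAdapter(max_retries=retry))
    return session
```

Because of `lru_cache`, `get_http_session() is get_http_session()` holds, which is what the `cached` marker in the signature above indicates.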
get_retry_class(max_retries)
¶
Source code in brickflow_plugins/databricks/workflow_dependency_sensor.py
brickflow_plugins.databricks.workflow_dependency_sensor.WorkflowTaskDependencySensor(dependency_job_name: str, dependency_task_name: str, delta: timedelta, timeout_seconds: int, databricks_host: str = None, databricks_token: Union[str, SecretStr] = None, poke_interval_seconds: int = 60)
¶
Bases: WorkflowDependencySensor
This sensor creates a dependency on a specific task within another Databricks workflow: the task waits until that upstream task has completed successfully.

Example usage in your Brickflow task:

```python
service_principle_pat = ctx.dbutils.secrets.get("scope", "service_principle_id")

WorkflowTaskDependencySensor(
    databricks_host="https://your_workspace_url.cloud.databricks.com",
    databricks_token=service_principle_pat,
    dependency_job_name=job_name,
    dependency_task_name="foo",
    poke_interval_seconds=20,
    timeout_seconds=60,
    delta=timedelta(days=1),
)
```

In the above snippet, Databricks secrets are used as a secure service to store the Databricks token. If your token comes from another secret-management service, such as AWS Secrets Manager, GCP Secret Manager, or Azure Key Vault, simply pass it in via the `databricks_token` argument.