Commands

This page provides documentation for our command line tools.

bf

CLI for managing Databricks Workflows

Usage:

bf [OPTIONS] COMMAND [ARGS]...

Options:

  --version  Show the version and exit.
  --help     Show this message and exit.
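
For example, to print the installed version of the CLI:

bf --version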

bundle

Proxies commands to the Databricks bundles CLI

Usage:

bf bundle [OPTIONS]

Options:

  --help  Show this message and exit.

docs

Opens the documentation in your browser

Usage:

bf docs [OPTIONS]

Options:

  --help  Show this message and exit.
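
For example, to open the documentation in your default browser:

bf docs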

projects

Manage one or more brickflow projects

Usage:

bf projects [OPTIONS] COMMAND [ARGS]...

Options:

  --help  Show this message and exit.

add

Adds a project to the brickflow-multi-project.yml file and creates an entrypoint.py file in the workflows directory

Usage:

bf projects add [OPTIONS]

Options:

  --name TEXT                     Name of the project
  --path-from-repo-root-to-project-root DIRECTORY
                                  Path from repo root to project root
  --path-project-root-to-workflows-dir TEXT
                                  Path from project root to workflows dir
  -g, --git-https-url TEXT        Provide the GitHub URL for your project,
                                  example: https://github.com/nike-eda-apla/brickflow
  -bfv, --brickflow-version TEXT
  -sev, --spark-expectations-version TEXT
  --skip-entrypoint               Skip creating entrypoint.py file
  --help                          Show this message and exit.
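
For example, to register a project (the project name, paths, and repository URL below are illustrative, not defaults):

bf projects add --name my_project \
    --path-from-repo-root-to-project-root projects/my_project \
    --path-project-root-to-workflows-dir workflows \
    -g https://github.com/my-org/my-repo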

deploy

Deploys the resources and workflows to Databricks for the project configured in the brickflow-project-root.yml file

Usage:

bf projects deploy [OPTIONS]

Options:

  -t, --tag TEXT         Provide the runtime key-value tags, each key-value
                         pair separated by a space. Example: bf projects deploy
                         -p DEFAULT -e local -t t_key1=value1 -t t_key2=value2
  -kv, --key-value TEXT  Provide the runtime key-value parameters, each key-
                         value pair separated by a space. Example: bf projects
                         deploy -p DEFAULT -e local -kv key1=value1 -kv
                         key2=value2
  -w, --workflow TEXT    Provide the workflow names (local mode only) to
                         deploy, each workflow separated by a space. Note:
                         provide the workflow names without the env prefix or
                         the file extension. Example: bf projects deploy -p
                         DEFAULT -e local -w wf1 -w wf2
  --force-acquire-lock   Force acquire lock for databricks bundles destroy.
  --skip-libraries       Skip automatically adding brickflow libraries.
  --auto-approve         Auto approve brickflow pipeline without being
                         prompted to approve.
  -p, --profile TEXT     The databricks profile to use for authenticating to
                         databricks during deployment.
  --project []           Select the project of workflows you would like to
                         deploy.
  -e, --env TEXT         Set the environment value, certain tags [TBD] get
                         added to the workflows based on this value.
  --help                 Show this message and exit.
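
For example, to deploy two workflows in local mode using the DEFAULT profile (the profile, environment, and workflow names are illustrative):

bf projects deploy -p DEFAULT -e local -w wf1 -w wf2 --auto-approve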

destroy

Destroys the deployed resources and workflows in Databricks for the project

Usage:

bf projects destroy [OPTIONS]

Options:

  -t, --tag TEXT         Provide the runtime key-value tags, each key-value
                         pair separated by a space. Example: bf projects
                         destroy -p DEFAULT -e local -t t_key1=value1 -t
                         t_key2=value2
  -kv, --key-value TEXT  Provide the runtime key-value parameters, each key-
                         value pair separated by a space. Example: bf projects
                         destroy -p DEFAULT -e local -kv key1=value1 -kv
                         key2=value2
  -w, --workflow TEXT    Provide the workflow names (local mode only) to
                         destroy, each workflow separated by a space. Note:
                         provide the workflow names without the env prefix or
                         the file extension. Example: bf projects destroy -p
                         DEFAULT -e local -w wf1 -w wf2
  --force-acquire-lock   Force acquire lock for databricks bundles destroy.
  --skip-libraries       Skip automatically adding brickflow libraries.
  --auto-approve         Auto approve brickflow pipeline without being
                         prompted to approve.
  -p, --profile TEXT     The databricks profile to use for authenticating to
                         databricks during deployment.
  --project []           Select the project of workflows you would like to
                         deploy.
  -e, --env TEXT         Set the environment value, certain tags [TBD] get
                         added to the workflows based on this value.
  --help                 Show this message and exit.
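
For example, to tear down the deployed resources for a given profile and environment (illustrative values):

bf projects destroy -p DEFAULT -e local --auto-approve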

list

Lists all projects in the brickflow-multi-project.yml file

Usage:

bf projects list [OPTIONS]

Options:

  --help  Show this message and exit.
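
For example, to list every project registered in the brickflow-multi-project.yml file:

bf projects list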

remove

Removes a project from the brickflow-multi-project.yml file

Usage:

bf projects remove [OPTIONS]

Options:

  --name []  Name of the project
  --help     Show this message and exit.
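
For example, to remove a project by name (the name is illustrative):

bf projects remove --name my_project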

sync

Syncs the project file tree from local into the Databricks workspace. The sync is one-way only, from local to the workspace.

Usage:

bf projects sync [OPTIONS]

Options:

  --watch                   Enable filewatcher to sync files over.
  --full                    Run a full sync.
  --interval-duration TEXT  File system polling interval (for --watch).
  --debug                   Enable debug logs
  -t, --tag TEXT            Provide the runtime key-value tags, each key-value
                            pair separated by a space. Example: bf projects
                            sync -p DEFAULT -e local -t t_key1=value1 -t
                            t_key2=value2
  -kv, --key-value TEXT     Provide the runtime key-value parameters, each
                            key-value pair separated by a space. Example: bf
                            projects sync -p DEFAULT -e local -kv key1=value1
                            -kv key2=value2
  -w, --workflow TEXT       Provide the workflow names (local mode only) to
                            sync, each workflow separated by a space. Note:
                            provide the workflow names without the env prefix
                            or the file extension. Example: bf projects sync
                            -p DEFAULT -e local -w wf1 -w wf2
  --force-acquire-lock      Force acquire lock for databricks bundles destroy.
  --skip-libraries          Skip automatically adding brickflow libraries.
  --auto-approve            Auto approve brickflow pipeline without being
                            prompted to approve.
  -p, --profile TEXT        The databricks profile to use for authenticating
                            to databricks during deployment.
  --project []              Select the project of workflows you would like to
                            deploy.
  -e, --env TEXT            Set the environment value, certain tags [TBD] get
                            added to the workflows based on this value.
  --help                    Show this message and exit.
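
For example, to keep pushing local file changes to the workspace as they happen (the profile and environment are illustrative):

bf projects sync -p DEFAULT -e local --watch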

synth

Synthesizes the bundle.yml file for the project

Usage:

bf projects synth [OPTIONS]

Options:

  -t, --tag TEXT         Provide the runtime key-value tags, each key-value
                         pair separated by a space. Example: bf projects synth
                         -p DEFAULT -e local -t t_key1=value1 -t t_key2=value2
  -kv, --key-value TEXT  Provide the runtime key-value parameters, each key-
                         value pair separated by a space. Example: bf projects
                         synth -p DEFAULT -e local -kv key1=value1 -kv
                         key2=value2
  -w, --workflow TEXT    Provide the workflow names (local mode only) to
                         synthesize, each workflow separated by a space. Note:
                         provide the workflow names without the env prefix or
                         the file extension. Example: bf projects synth -p
                         DEFAULT -e local -w wf1 -w wf2
  --skip-libraries       Skip automatically adding brickflow libraries.
  -p, --profile TEXT     The databricks profile to use for authenticating to
                         databricks during deployment.
  --project []           Select the project of workflows you would like to
                         deploy.
  -e, --env TEXT         Set the environment value, certain tags [TBD] get
                         added to the workflows based on this value.
  --help                 Show this message and exit.
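
For example, to synthesize the bundle.yml locally without deploying anything (illustrative profile and environment):

bf projects synth -p DEFAULT -e local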