Azure Databricks

    To set up the Databricks Spark job:

    1. In your warehouse, create a catalog namespace called sightmachine.
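
      If you prefer to script this step, the following is a minimal sketch using the unified Databricks CLI (v0.205 or later), assuming Unity Catalog is enabled, the CLI is already authenticated, and sightmachine is created as a catalog:

        databricks catalogs create sightmachine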

    2. Upload the sm_cdc_script.py file we provided into your workspace on Databricks. The script is configured to sync the CDC data from an Azure Cloud Storage container into the warehouse.
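
      If you are uploading from a terminal rather than the UI, here is a minimal sketch using the Databricks CLI; the destination path /Workspace/Shared/sm_cdc_script.py is an assumption, so adjust it to your workspace layout:

        databricks workspace import /Workspace/Shared/sm_cdc_script.py --file sm_cdc_script.py --language PYTHON --format SOURCE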

    3. The Service Principal credentials to access the cloud storage need to be added to the cluster configuration.

      1. https://learn.microsoft.com/en-us/azure/databricks/connect/storage/azure-storage#access-adls-gen2-directly

    4. Add secrets to Databricks.

      1. Add the Service Principal secret value to the Databricks secrets API. Use the Databricks CLI to do this.

        1. https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/tutorial

        2. CLI example (Service Principal):

             databricks secrets create-scope <secret-scope>
             databricks secrets put-secret <secret-scope> sp_client_secret --string-value <secret-value>

           For example:

             databricks secrets create-scope smreleasetesting
             databricks secrets put-secret smreleasetesting sp_client_secret --string-value 'REDACT'

    5. In the Databricks UI, go to Workflows and click “Create Job”. (For a scripted alternative to steps 5 and 6, see the sketch after this list.)

      1. Set “Task name” to “SightMachineETL”.

      2. Set “Type” to “Python script”.

      3. Set “Source” to “Workspace”.

      4. Set “Path” to the path of the sm_cdc_script.py file uploaded in step 2.

      5. Set “Parameters” to the location of the CDC data in cloud storage.

        1. ["--file-path=abfss://<storage account container>@<storage account>.dfs.core.windows.net/wal2json", "--context=<storage account>"]

        2. E.g.: ["--file-path=abfss://<container>@smreleasetesting.dfs.core.windows.net/wal2json", "--context=smreleasetesting"]

    6. In the “Cluster” row, click the “edit” button.

      1. For “Databricks runtime version”, use DBR 14.3 LTS or later.

      2. For “Driver Type”, select the instance size that works for your use case. We strongly recommend using a Delta Cache Accelerated (d_ads) instance.

      3. Click “Advanced options”.

      4. Configure the Spark cluster with the credentials and cloud storage location information, using Databricks secrets to protect the client secret. Spark config values to add:

        1. Service Principal:

             spark.hadoop.fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net OAuth
             spark.hadoop.fs.azure.account.oauth.provider.type.<storage-account>.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
             spark.hadoop.fs.azure.account.oauth2.client.id.<storage-account>.dfs.core.windows.net <application-id>
             spark.hadoop.fs.azure.account.oauth2.client.secret.<storage-account>.dfs.core.windows.net {{secrets/<secret-scope>/<service-credential-key>}}
             spark.hadoop.fs.azure.account.oauth2.client.endpoint.<storage-account>.dfs.core.windows.net https://login.microsoftonline.com/<directory-id>/oauth2/token

           E.g.:

             spark.hadoop.fs.azure.account.auth.type.smreleasetesting.dfs.core.windows.net OAuth
             spark.hadoop.fs.azure.account.oauth.provider.type.smreleasetesting.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
             spark.hadoop.fs.azure.account.oauth2.client.id.smreleasetesting.dfs.core.windows.net 4669e29c-ce6e-417f-a495-e331ae969ccb
             spark.hadoop.fs.azure.account.oauth2.client.secret.smreleasetesting.dfs.core.windows.net {{secrets/smreleasetesting/sp_client_secret}}
             spark.hadoop.fs.azure.account.oauth2.client.endpoint.smreleasetesting.dfs.core.windows.net https://login.microsoftonline.com/beb1d7f9-8e2e-4dc4-83be-190ebceb70ea/oauth2/token

      5. Click the “Confirm” button to save the cluster configuration.

    7. Click “Create task” to save the job configuration.

    8. Edit the job title to something more meaningful than “New Job DATE”.

    9. Click “Run now” to run the task. The task will run continuously to sync data between Sight Machine cloud storage and Databricks.
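
    For reference, steps 5 and 6 can also be performed without the UI. The following is a minimal sketch of an equivalent job definition submitted through the Databricks CLI; the job name, workspace path, node type, and worker count are assumptions, <container>, <application-id>, and <directory-id> are placeholders for your own values, and the task name, parameters, and Spark config mirror the values above:

      databricks jobs create --json '{
        "name": "SightMachine CDC Sync",
        "tasks": [{
          "task_key": "SightMachineETL",
          "spark_python_task": {
            "python_file": "/Workspace/Shared/sm_cdc_script.py",
            "parameters": [
              "--file-path=abfss://<container>@smreleasetesting.dfs.core.windows.net/wal2json",
              "--context=smreleasetesting"
            ]
          },
          "new_cluster": {
            "spark_version": "14.3.x-scala2.12",
            "node_type_id": "Standard_D4ads_v5",
            "num_workers": 1,
            "spark_conf": {
              "spark.hadoop.fs.azure.account.auth.type.smreleasetesting.dfs.core.windows.net": "OAuth",
              "spark.hadoop.fs.azure.account.oauth.provider.type.smreleasetesting.dfs.core.windows.net": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
              "spark.hadoop.fs.azure.account.oauth2.client.id.smreleasetesting.dfs.core.windows.net": "<application-id>",
              "spark.hadoop.fs.azure.account.oauth2.client.secret.smreleasetesting.dfs.core.windows.net": "{{secrets/smreleasetesting/sp_client_secret}}",
              "spark.hadoop.fs.azure.account.oauth2.client.endpoint.smreleasetesting.dfs.core.windows.net": "https://login.microsoftonline.com/<directory-id>/oauth2/token"
            }
          }
        }]
      }'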

    Troubleshooting notes:

    1. If the job fails with error:

    Server failed to authenticate the request

    This usually means the secrets were misconfigured, so the job was unable to read the files. Double-check that the secrets were set up correctly; a quick CLI check is sketched below.
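
    The following Databricks CLI commands confirm that the secret scope and key referenced in the Spark config actually exist (the scope name matches the earlier example):

      databricks secrets list-scopes
      databricks secrets list-secrets smreleasetesting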

    2. If the job fails with error:

    NOTE: When using the `ipython kernel` entry point, Ctrl-C will not work. To exit, you will have to explicitly quit this process, by either sending "quit" from a client, or using Ctrl-\ in UNIX-like environments. To read more about this, see https://github.com/ipython/ipython/issues/2049

    This means that the cluster does not have enough memory. Try changing the node type to one with more memory.


