Set up connection from Azure Data Factory to Databricks

Both Azure Data Factory and Azure Databricks offer transformations at scale when it comes to ELT processing. On top of that, ADF allows you to orchestrate the whole solution in an easy way. If you prefer to use Scala, Python or SQL code in your process, rather than Mapping Data Flow in ADF, you must link ADF to Databricks. I will show you how to do that in two ways.

In this short video, I will present the two methods and their differences, and, as always, the demo (sample linked service definitions for both methods follow right after this list):
DEMO 1: Connect via PAT (Personal Access Token)
DEMO 2: Connect via MSI (Managed Identity Authentication)
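To make both options concrete, here are minimal linked service definitions in the ADF JSON format. Treat them as sketches: the workspace URL, cluster ID, linked service names and the Key Vault secret name are placeholders, not values from the video.

DEMO 1 authenticates with a Personal Access Token; keeping the token in Azure Key Vault (rather than pasting it as plain text) is the usual practice:

```json
{
  "name": "AzureDatabricks_via_PAT",
  "properties": {
    "type": "AzureDatabricks",
    "typeProperties": {
      "domain": "https://adb-1234567890123456.7.azuredatabricks.net",
      "accessToken": {
        "type": "AzureKeyVaultSecret",
        "store": {
          "referenceName": "AzureKeyVault_LS",
          "type": "LinkedServiceReference"
        },
        "secretName": "databricks-pat"
      },
      "existingClusterId": "0123-456789-cluster1"
    }
  }
}
```

DEMO 2 uses the data factory's managed identity instead of a token: you point the linked service at the workspace resource ID and grant that managed identity the Contributor role on the Databricks workspace:

```json
{
  "name": "AzureDatabricks_via_MSI",
  "properties": {
    "type": "AzureDatabricks",
    "typeProperties": {
      "domain": "https://adb-1234567890123456.7.azuredatabricks.net",
      "authentication": "MSI",
      "workspaceResourceId": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Databricks/workspaces/<workspace-name>",
      "existingClusterId": "0123-456789-cluster1"
    }
  }
}
```

Notice that the second definition contains no secret at all, which is the main reason to prefer MSI whenever you can grant the role.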





1 Comment

  1. Simon
    May 12, 13:13

    Hi Kamil,
    thanks a lot for the great video. I know the video is quite old, but maybe you will still see my question:
    I am currently working on an architecture like the one you showed, but the Databricks workspace is deployed automatically via Terraform.
    In this configuration it is not feasible to assign Contributor rights for the managed identity on the Databricks resource itself, since the service principal of the Terraform agent does not have the rights to assign roles to users.
    My idea is to give the managed identity Contributor rights on the resource group into which the Databricks workspace will be deployed, but Contributor rights on the whole resource group is rather more access than this use case needs…
    Do you know if there is a role like “Databricks Contributor” in Azure that I could give to the managed identity instead?
    I searched for it, but I was not able to find a suitable role…

    Thanks a lot!
    Simon
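As far as I can tell, there is no built-in Azure role called “Databricks Contributor”. The narrowest common option is to scope the Contributor assignment to the workspace resource itself rather than the whole resource group. In a Terraform setup like the one described in the comment above, that could look roughly like the sketch below; the resource names are hypothetical, and creating the assignment still requires the deploying principal to hold role-assignment permissions (for example Owner or User Access Administrator) at that scope:

```hcl
# Sketch only, hypothetical resource names: grant the data factory's
# managed identity Contributor on the Databricks workspace resource,
# instead of on the whole resource group.
resource "azurerm_role_assignment" "adf_mi_on_databricks" {
  scope                = azurerm_databricks_workspace.this.id
  role_definition_name = "Contributor"
  principal_id         = azurerm_data_factory.this.identity[0].principal_id
}
```

Note that this only narrows the scope, not the kind of permission: it remains the full Contributor role, just limited to a single resource.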
