Set up connection from Azure Data Factory to Databricks

Both Azure Data Factory and Azure Databricks offer transformations at scale when it comes to ELT processing. On top of that, ADF lets you orchestrate the whole solution in an easy way. If you prefer to use Scala, Python, or SQL code in your process rather than Mapping Data Flows in ADF, you must link ADF to Databricks. I will show you how to do that in two ways.

In this short video, I present both methods and their differences, and, as always, a demo:
DEMO 1: Connect via PAT (Personal Access Token)
DEMO 2: Connect via MSI (Managed Identity Authentication)
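
For reference, here is a rough sketch of what those two linked services look like when created from code rather than the ADF UI, using the azure-mgmt-datafactory Python SDK. This is a minimal sketch, not the exact setup from the video: the subscription, resource group, factory and workspace names, the Databricks URL, and the cluster settings are all placeholder values you would replace with your own.

```python
# pip install azure-identity azure-mgmt-datafactory
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureDatabricksLinkedService,
    LinkedServiceResource,
    SecureString,
)

# Placeholder values -- replace with your own subscription, factory and workspace.
SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "rg-demo"
FACTORY_NAME = "adf-demo"
DATABRICKS_URL = "https://adb-1111111111111111.1.azuredatabricks.net"
WORKSPACE_RESOURCE_ID = (
    f"/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/{RESOURCE_GROUP}"
    "/providers/Microsoft.Databricks/workspaces/dbw-demo"
)

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# DEMO 1: authenticate with a Personal Access Token generated in the
# Databricks workspace. A new job cluster is spun up for each activity run.
pat_linked_service = AzureDatabricksLinkedService(
    domain=DATABRICKS_URL,
    access_token=SecureString(value="<your-databricks-PAT>"),
    new_cluster_version="13.3.x-scala2.12",
    new_cluster_node_type="Standard_DS3_v2",
    new_cluster_num_of_worker="1",
)
client.linked_services.create_or_update(
    RESOURCE_GROUP, FACTORY_NAME, "LS_Databricks_PAT",
    LinkedServiceResource(properties=pat_linked_service),
)

# DEMO 2: authenticate with the factory's managed identity; no secret is
# stored in ADF, but the identity needs access to the workspace resource.
msi_linked_service = AzureDatabricksLinkedService(
    domain=DATABRICKS_URL,
    authentication="MSI",
    workspace_resource_id=WORKSPACE_RESOURCE_ID,
    new_cluster_version="13.3.x-scala2.12",
    new_cluster_node_type="Standard_DS3_v2",
    new_cluster_num_of_worker="1",
)
client.linked_services.create_or_update(
    RESOURCE_GROUP, FACTORY_NAME, "LS_Databricks_MSI",
    LinkedServiceResource(properties=msi_linked_service),
)
```

The practical difference between the two: with a PAT you manage a secret (ideally kept in Azure Key Vault rather than inline, as sketched above), while with MSI there is no token to rotate, but the factory's managed identity must be granted the Contributor role on the Databricks workspace.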




1 Comment

  1. Simon
    May 12, 13:13

    Hi Kamil,
    thanks a lot for the great video. I know the video is quite old, but maybe you will still see my question:
    I’m currently working on an architecture like the one you showed, but the Databricks workspace will be deployed automatically via Terraform.
    In this configuration it is not feasible to assign Contributor rights for the managed identity on the Databricks resource itself, since the service principal of the Terraform agent does not have the rights to assign roles to users.
    My idea is to give the managed identity Contributor rights on the resource group into which the Databricks workspace will be deployed, but Contributor rights on the whole resource group is more than this use case needs…
    Do you know if there is a role like “Databricks Contributor” in Azure that I could give to the managed identity instead?
    I searched for it, but I was not able to find a suitable role…

    Thanks a lot!
    Simon
