Do we need both 'dfs' and 'blob' subresource private endpoints for ADLS connected to Azure Databricks (UC enabled)?

Akash S02
2025-10-14T12:54:39.48+00:00

Hi team,

Do we need both 'dfs' and 'blob' subresource private endpoints (two private endpoints, one for 'dfs' and another for 'blob') for ADLS connected to Azure Databricks (UC enabled)? I believe only 'dfs' is enough, but can you confirm?

If so, in which scenarios would the 'blob' subresource be required for an ADLS account connected to Azure Databricks (UC enabled)?


Answer accepted by question author
  PRADEEPCHEEKATLA (Moderator)
    2025-10-14T13:40:06.7833333+00:00

    Akash S02 - Thanks for the question and for using the MS Q&A Platform.

    You're absolutely right: for Azure Data Lake Storage Gen2 (ADLS Gen2) connected to Azure Databricks (Unity Catalog enabled), only the dfs subresource private endpoint is typically required.

    ✅ Why dfs is usually sufficient:

    • ADLS Gen2 uses the HDFS-compatible endpoint (dfs.core.windows.net) for hierarchical-namespace operations, and this is the endpoint Databricks primarily talks to, especially when using Unity Catalog or mounting storage.
    • The dfs endpoint supports file-system semantics, which are essential for operations like reading/writing files, listing directories, and managing metadata; see the Spark sketch after this list.
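
    As a quick illustration, here is a minimal sketch (in PySpark, runnable in a Databricks notebook where `spark` is predefined) of how Spark and Unity Catalog traffic flows through the dfs endpoint. The account name `mystorageacct`, container `bronze`, and path `events` are hypothetical placeholders:

    ```python
    # ABFS URI shape: abfss://<container>@<account>.dfs.core.windows.net/<path>
    # "mystorageacct", "bronze", and "events" are hypothetical placeholders.
    df = spark.read.format("delta").load(
        "abfss://bronze@mystorageacct.dfs.core.windows.net/events"
    )

    # Every ABFS driver operation (list, read, write, metadata) targets
    # mystorageacct.dfs.core.windows.net -- the name a 'dfs' private endpoint
    # resolves to a private IP inside the VNet.
    df.show(5)
    ```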

    ❓ When would you need the blob subresource private endpoint?

    You would need the blob endpoint (blob.core.windows.net) in scenarios where:

    • Applications or services access the storage using Blob REST APIs directly (e.g., AzCopy, Azure Functions, or other tools that use blob APIs); a sketch contrasting the two endpoints follows this list.
    • Model serving in Databricks: If you're using serverless compute (like SQL warehouses or model serving endpoints), Databricks may use the blob endpoint to download model artifacts. In such cases, a blob private endpoint is required for secure access from serverless workloads.
    • Non-Databricks services (e.g., Azure Synapse or Logic Apps) that interact with the storage account via blob APIs.
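
    To make the distinction concrete, here is a minimal sketch contrasting which endpoint each Azure SDK client talks to. The account name is a hypothetical placeholder, and using `DefaultAzureCredential` assumes `azure-identity`, `azure-storage-blob`, and `azure-storage-file-datalake` are installed with a usable identity:

    ```python
    from azure.identity import DefaultAzureCredential
    from azure.storage.blob import BlobServiceClient              # Blob REST API
    from azure.storage.filedatalake import DataLakeServiceClient  # ADLS Gen2 (DFS) API

    cred = DefaultAzureCredential()

    # Blob API traffic -- this is what a 'blob' private endpoint covers.
    blob_svc = BlobServiceClient(
        account_url="https://mystorageacct.blob.core.windows.net", credential=cred
    )
    print([c.name for c in blob_svc.list_containers()])

    # Hierarchical-namespace (DFS) traffic -- covered by the 'dfs' private endpoint.
    dfs_svc = DataLakeServiceClient(
        account_url="https://mystorageacct.dfs.core.windows.net", credential=cred
    )
    print([fs.name for fs in dfs_svc.list_file_systems()])
    ```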

    🔒 Best Practice Summary:

    • For Databricks with Unity Catalog, use a dfs private endpoint (you can verify private resolution with the DNS check sketch below).
    • Add a blob private endpoint only if:
    1. You have other services accessing the storage via blob APIs.
    2. You're using Databricks serverless features that require blob access.
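
    As the sanity check mentioned above, here is a small sketch you can run from a VNet-joined host (for example, a VNet-injected Databricks cluster) to confirm each endpoint resolves to a private IP. The account name is again a hypothetical placeholder:

    ```python
    import ipaddress
    import socket

    # Check whether each storage endpoint resolves to a private (VNet) IP,
    # which indicates the corresponding private endpoint and DNS are in place.
    for host in (
        "mystorageacct.dfs.core.windows.net",
        "mystorageacct.blob.core.windows.net",
    ):
        try:
            ip = socket.gethostbyname(host)
            kind = "private" if ipaddress.ip_address(ip).is_private else "public"
            print(f"{host} -> {ip} ({kind})")
        except socket.gaierror as err:
            print(f"{host} -> resolution failed: {err}")
    ```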

    Hope this helps. Let me know if you have any further questions or need additional assistance. Also, if this answers your query, please click "Upvote" and "Accept the answer" so that it can benefit other community members reading this thread.


    To stay informed about the latest updates and insights on Azure Databricks, data engineering, and Data & AI innovations, follow me on LinkedIn.

