Akash S02 - Thanks for the question and for using the MS Q&A platform.
You're absolutely right in your understanding: for Azure Data Lake Storage Gen2 (ADLS Gen2) connected to Azure Databricks (Unity Catalog enabled), only the dfs subresource private endpoint is typically required.
✅ Why dfs is usually sufficient:
- ADLS Gen2 uses the HDFS-compatible endpoint (dfs.core.windows.net) for hierarchical namespace operations, which is what Databricks primarily interacts with, especially when using Unity Catalog or mounting storage.
- The dfs endpoint supports file system semantics, which are essential for operations such as reading/writing files, listing directories, and managing metadata (see the sketch after this list).
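To illustrate, here is a minimal sketch of how typical Databricks reads reach the dfs endpoint. The storage account, container, and path names are placeholders, and it assumes a Databricks notebook (where `spark` is predefined) with access already granted through Unity Catalog, e.g., via an external location. The key point is that the abfss:// scheme always resolves to *.dfs.core.windows.net, so this traffic is fully covered by the dfs private endpoint:

```python
# Minimal sketch (Databricks notebook): reading from ADLS Gen2 over the dfs
# endpoint. "mystorageacct", "mycontainer", and the path are placeholders.
# The abfss:// scheme targets <account>.dfs.core.windows.net, so this call
# flows through the dfs private endpoint.
df = spark.read.format("parquet").load(
    "abfss://mycontainer@mystorageacct.dfs.core.windows.net/path/to/data"
)
df.show(5)
```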
❓ When would you need the blob subresource private endpoint?
You would need the blob endpoint (blob.core.windows.net) in scenarios where:
- Applications or services access the storage using Blob REST APIs directly (e.g., AzCopy, Azure Functions, or other tools that use blob APIs; see the sketch after this list).
- Model serving in Databricks: If you're using serverless compute (like SQL warehouses or model serving endpoints), Databricks may use the blob endpoint to download model artifacts. In such cases, a blob private endpoint is required for secure access from serverless workloads.
- Non-Databricks services (e.g., Azure Synapse, Logic Apps, etc.) that interact with the storage account via blob APIs.
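For example, a client built on the Azure Blob SDK talks to the blob endpoint rather than dfs. Here is a minimal sketch, assuming the azure-storage-blob and azure-identity packages and placeholder account/container names:

```python
# Minimal sketch: a blob-API client resolves <account>.blob.core.windows.net,
# so from a locked-down VNet it needs the blob private endpoint.
# "mystorageacct" and "mycontainer" are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

client = BlobServiceClient(
    account_url="https://mystorageacct.blob.core.windows.net",  # blob, not dfs
    credential=DefaultAzureCredential(),
)

# Listing blobs in a container; this request goes over the blob endpoint.
for blob in client.get_container_client("mycontainer").list_blobs():
    print(blob.name)
```

If any such workload runs against a storage account with public network access disabled, the blob private endpoint is what keeps it working.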
🔒 Best Practice Summary:
- For Databricks with Unity Catalog, use the dfs private endpoint.
- Add the blob private endpoint only if:
  - You have other services accessing the storage via blob APIs.
  - You're using Databricks serverless features that require blob access.
Either way, you can verify that each endpoint resolves privately with the sketch below.
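As a quick sanity check from a VM or cluster inside the VNet, you can confirm that the storage FQDNs resolve to private IPs through your private endpoints. A minimal sketch, with "mystorageacct" as a placeholder account name:

```python
# Minimal sketch: check whether the dfs and blob FQDNs resolve to private
# addresses (10.x, 172.16-31.x, 192.168.x). "mystorageacct" is a placeholder.
import socket

for subresource in ("dfs", "blob"):
    fqdn = f"mystorageacct.{subresource}.core.windows.net"
    try:
        ip = socket.gethostbyname(fqdn)
        print(f"{fqdn} -> {ip}")
    except socket.gaierror as exc:
        print(f"{fqdn} -> resolution failed ({exc})")
```

If a name resolves to a public IP, the corresponding private DNS zone (privatelink.dfs.core.windows.net or privatelink.blob.core.windows.net) is likely not linked to the VNet correctly.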
Hope this helps. Let me know if you have any further questions or need additional assistance. Also, if this answers your query, please click "Upvote" and "Accept the answer", as it may be beneficial to other community members reading this thread.
To stay informed about the latest updates and insights on Azure Databricks, data engineering, and Data & AI innovations, follow me on LinkedIn.