PostgreSQL V2 Connector Not Appearing as Sink in Azure Data Factory

Mohan, Maddala Dharmendra 0 Reputation points
2025-10-21T12:01:30.41+00:00

Hi Team,

We are working on a data integration pipeline using Azure Data Factory (ADF) to load data from Azure Data Lake Storage Gen2 into an on-premises PostgreSQL database using the PostgreSQL V2 connector.

Here’s our setup:

PostgreSQL version: 15+

ADF Connector: PostgreSQL V2

Integration Runtime: Self-hosted (version 5.54.9267.1)

Sink Dataset: Created using PostgreSqlV2Table type

Linked Service: Uses PostgreSQL V2 with SHIR

Source Dataset: DelimitedText from Azure Data Lake Storage Gen2

We are able to:

Preview data from the source

Validate the sink dataset independently

**Issue:** In the Copy Activity, the sink dataset (sink_ds_for_postgres) does not appear in the dropdown, and when attempting to add it manually via JSON, ADF Studio throws a "name is not allowed" error.

We’ve verified:

PostgreSQL version compatibility

SHIR connectivity and version

Dataset and linked service configurations

No regional limitations are documented

Could you please help us identify the root cause or confirm if this is a known issue?

Thanks,
Mohan


1 answer

  1. Manoj Kumar Boyini 330 Reputation points Microsoft External Staff Moderator
    2025-10-23T13:25:59.6633333+00:00

    Hi Mohan, Maddala Dharmendra,

    Thanks for sharing those details. You are absolutely right to notice that difference in the documentation. I checked this closely, and here's what's really going on.

    The PostgreSQL V2 connector currently supports reading data (source) only. It doesn't support writing data to PostgreSQL (sink) in the Copy Activity, which is why your PostgreSqlV2Table dataset doesn't appear in the sink dropdown, and why you get the "name is not allowed" error when you try to add it manually. This isn't a configuration issue on your side; it's a limitation of the current V2 connector.

    You have two possible ways to load data into PostgreSQL:

    If your PostgreSQL database is hosted on Azure: use the Azure Database for PostgreSQL connector instead of the V2 one. It fully supports copy operations into PostgreSQL as a sink. Just create an AzurePostgreSql linked service and dataset, and select that in your Copy activity sink (a rough sketch follows below).
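
    For illustration only (the batch size and the source store settings below are placeholder assumptions, not values from your pipeline), the Copy activity's typeProperties would pair your existing DelimitedText source with an AzurePostgreSqlSink, with the sink dataset created as type AzurePostgreSqlTable:

    ```json
    "typeProperties": {
        "source": {
            "type": "DelimitedTextSource",
            "storeSettings": { "type": "AzureBlobFSReadSettings", "recursive": true },
            "formatSettings": { "type": "DelimitedTextReadSettings" }
        },
        "sink": {
            "type": "AzurePostgreSqlSink",
            "writeBatchSize": 10000
        }
    }
    ```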

    If your PostgreSQL database is on-premises: you can connect through ODBC using your Self-Hosted Integration Runtime (SHIR). The steps are listed below, with a sketch after them.

    1. Install the PostgreSQL ODBC driver on your SHIR machine.

    2. Create an ODBC linked service and an OdbcTable dataset for your target table.

    3. Use OdbcSink in your Copy activity. This is the supported way to write data to on-premises PostgreSQL from ADF.
         
    
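    A minimal sketch of those pieces, with assumed placeholder names (ls_odbc_postgres, sink_ds_odbc_postgres, the host, and the table are hypothetical; use the driver name exactly as it appears after installing the PostgreSQL ODBC driver on the SHIR machine):

    ```json
    {
        "name": "ls_odbc_postgres",
        "properties": {
            "type": "Odbc",
            "connectVia": {
                "referenceName": "<your SHIR name>",
                "type": "IntegrationRuntimeReference"
            },
            "typeProperties": {
                "connectionString": "Driver={PostgreSQL Unicode(x64)};Server=<host>;Port=5432;Database=<database>;",
                "authenticationType": "Basic",
                "userName": "<user>",
                "password": { "type": "SecureString", "value": "<password>" }
            }
        }
    }
    ```

    The sink dataset then points at the target table through that linked service:

    ```json
    {
        "name": "sink_ds_odbc_postgres",
        "properties": {
            "type": "OdbcTable",
            "linkedServiceName": {
                "referenceName": "ls_odbc_postgres",
                "type": "LinkedServiceReference"
            },
            "typeProperties": { "tableName": "public.target_table" }
        }
    }
    ```

    And the Copy activity sink becomes:

    ```json
    "sink": {
        "type": "OdbcSink",
        "writeBatchSize": 10000
    }
    ```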

    The docs seem confusing because one page lists PostgreSQL in the general "supported data stores" table, while the detailed PostgreSQL V2 connector page mentions that it works as a source only, not as a sink.

    There's nothing wrong with your setup; the connector itself just doesn't support writes yet. Switching to the Azure Database for PostgreSQL connector (for Azure-hosted databases) or ODBC (for on-premises) will solve this.

    Do let us know if you have any further questions.

    Thanks,
    Manoj

    1 person found this answer helpful.
