Azure Data Factory Documentation

Azure Data Factory (ADF) is a cloud‑based data integration service that allows you to create data‑driven workflows for orchestrating and automating data movement and transformation at scale.

Key Concepts

  Pipeline: a logical grouping of activities that together perform a unit of work.
  Activity: a single processing step within a pipeline, such as a Copy activity.
  Dataset: a named view of the data that an activity consumes as input or produces as output.
  Linked service: a connection definition for an external data store or compute resource.
  Trigger: defines when a pipeline run is started, for example on a schedule or in response to an event.

Sample Pipeline JSON

{
    "name": "CopyPipeline",
    "properties": {
        "activities": [
            {
                "name": "CopyFromBlobToSQL",
                "type": "Copy",
                "inputs": [
                    { "referenceName": "BlobDataset", "type": "DatasetReference" }
                ],
                "outputs": [
                    { "referenceName": "SqlDataset", "type": "DatasetReference" }
                ],
                "typeProperties": {
                    "source": { "type": "BlobSource" },
                    "sink": { "type": "SqlSink" }
                }
            }
        ]
    }
}
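
The pipeline above refers to BlobDataset and SqlDataset by name; each must be defined separately as a dataset. A minimal definition for the blob-side dataset might look like the following sketch (the linked service name AzureBlobStorageLinkedService and the folder path are placeholders, not values from this document):

```json
{
    "name": "BlobDataset",
    "properties": {
        "type": "AzureBlob",
        "linkedServiceName": {
            "referenceName": "AzureBlobStorageLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "folderPath": "input",
            "format": { "type": "TextFormat" }
        }
    }
}
```

The linkedServiceName reference ties the dataset to a connection definition (storage account and credentials), which is why the pipeline JSON itself never contains connection strings.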

Quickstart: Create a Pipeline

  1. Open the Azure portal and navigate to your Data Factory instance.
  2. Select Launch studio (labeled Author & Monitor in older portal versions) to open the ADF authoring UI.
  3. Select + New pipeline and give the pipeline a name.
  4. Add a Copy data activity and configure its source and sink, including the linked services and datasets they reference.
  5. Validate the pipeline, then publish it to save your changes to the service.
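
Some of the checks that the Validate step performs can be approximated locally before you publish. The sketch below uses plain Python with no Azure SDK; the rules it enforces are illustrative and cover only a small subset of ADF's actual validation:

```python
import json

def validate_pipeline(definition: str) -> list[str]:
    """Return a list of problems found in a pipeline JSON definition.

    Illustrative checks only; not ADF's real validation logic.
    """
    problems = []
    pipeline = json.loads(definition)
    if "name" not in pipeline:
        problems.append("pipeline is missing a 'name'")
    activities = pipeline.get("properties", {}).get("activities", [])
    if not activities:
        problems.append("pipeline has no activities")
    for activity in activities:
        label = activity.get("name", "<unnamed>")
        if "type" not in activity:
            problems.append(f"activity {label} is missing a 'type'")
        # A Copy activity must declare both an input and an output dataset.
        if activity.get("type") == "Copy":
            if not activity.get("inputs"):
                problems.append(f"Copy activity {label} has no inputs")
            if not activity.get("outputs"):
                problems.append(f"Copy activity {label} has no outputs")
    return problems

# The sample pipeline from this document passes these checks.
sample = """
{
    "name": "CopyPipeline",
    "properties": {
        "activities": [
            {
                "name": "CopyFromBlobToSQL",
                "type": "Copy",
                "inputs": [
                    { "referenceName": "BlobDataset", "type": "DatasetReference" }
                ],
                "outputs": [
                    { "referenceName": "SqlDataset", "type": "DatasetReference" }
                ],
                "typeProperties": {
                    "source": { "type": "BlobSource" },
                    "sink": { "type": "SqlSink" }
                }
            }
        ]
    }
}
"""

print(validate_pipeline(sample))  # an empty list means no problems were found
```

Running a lightweight check like this in CI can catch malformed pipeline JSON before it ever reaches the factory, while the authoritative validation still happens in the ADF UI or service.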

Resources