Azure Data Factory (ADF) is a serverless data integration service that simplifies ETL (Extract, Transform, Load) and ELT workflows at cloud scale.
In ADF, you build **Pipelines** of activities. An activity can be a simple 'Copy' (e.g., from SQL to Blob Storage), a 'Mapping Data Flow' (transforming data visually), or a call to an external service, such as running an Azure Databricks notebook or executing custom .NET code via a Custom activity.
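To make this concrete, here is a minimal sketch of an ADF v2 pipeline definition containing a single Copy activity. The pipeline and dataset names (`SqlSalesDataset`, `BlobSalesDataset`) are hypothetical placeholders; the source and sink types shown assume an Azure SQL source and a delimited-text sink in Blob Storage.

```json
{
  "name": "CopySalesToBlob",
  "properties": {
    "activities": [
      {
        "name": "CopyFromSqlToBlob",
        "type": "Copy",
        "inputs": [ { "referenceName": "SqlSalesDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "BlobSalesDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "AzureSqlSource" },
          "sink": { "type": "DelimitedTextSink" }
        }
      }
    ]
  }
}
```

In practice you rarely hand-write this JSON; the ADF Studio UI generates it for you, but understanding the underlying structure helps when versioning pipelines in Git.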
ADF is also a crucial tool for 'Hybrid' cloud scenarios. If you have data in an on-premises SQL Server that isn't exposed to the internet, you install the **Self-Hosted Integration Runtime** on a server inside your network. The runtime makes only *outbound* connections from your network to Azure, allowing you to securely pull data into the cloud without opening any inbound firewall ports.
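The hybrid setup above is wired together through a linked service that routes its connection via the self-hosted runtime. Below is a hedged sketch of what such a linked service definition looks like; the server, database, and runtime names are illustrative assumptions:

```json
{
  "name": "OnPremSqlLinkedService",
  "properties": {
    "type": "SqlServer",
    "typeProperties": {
      "connectionString": "Server=myserver;Database=Sales;Integrated Security=True"
    },
    "connectVia": {
      "referenceName": "MySelfHostedIR",
      "type": "IntegrationRuntimeReference"
    }
  }
}
```

The `connectVia` block is the key piece: it tells ADF to execute the connection from the machine running the self-hosted runtime rather than from Azure's own compute.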
Q: "ADF vs Logic Apps: When to use which?"
Architect Answer: "Use **Logic Apps** for 'Small Data' events (e.g., 'A user registered, send an email'). Use **Data Factory** for 'Big Data' batch processing (e.g., 'At midnight, sync 10 million sales records from the retail system to the data warehouse'). ADF is built for throughput and reliability with massive data sets."
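The 'At midnight' batch scenario from the answer above maps directly to an ADF Schedule Trigger. Here is a rough sketch of one that fires a pipeline daily at 00:00 UTC; the trigger and pipeline names are hypothetical:

```json
{
  "name": "MidnightSyncTrigger",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "startTime": "2024-01-01T00:00:00Z",
        "timeZone": "UTC",
        "schedule": { "hours": [0], "minutes": [0] }
      }
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "SyncSalesToWarehouse",
          "type": "PipelineReference"
        }
      }
    ]
  }
}
```

This event-less, schedule-driven model is another contrast with Logic Apps, which is typically kicked off by a per-record event rather than a nightly batch window.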