Azure Data Factory is a cloud-based data integration service that automates the movement and transformation of data. Using the Data Factory service, we can create data integration solutions that read data from various data stores, transform or process that data, and publish the results back to data stores. Data Factory lets us create data pipelines that move and transform data and then run those pipelines on a specified schedule (hourly, daily, weekly, etc.). It also provides an interface to monitor our pipelines and their executions, and we can run a pipeline on an ad hoc basis as well. In the first exercise, we will learn to create an Azure Data Factory and linked services. In the second exercise, we will learn to create datasets in Azure Data Factory. In the third exercise, we will learn to create pipelines in Azure Data Factory and to execute a pipeline and see the results.
Exercise 1: Create an Azure Data Factory and linked services
Exercise 2: Create datasets
Exercise 3: Create a pipeline, execute it, and see the results
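Before starting the exercises, it may help to see how these building blocks relate: linked services point at data stores, datasets describe the data within them, and pipelines contain the activities that move and transform that data. The sketch below builds an illustrative pipeline definition in the JSON shape Data Factory uses, with a single Copy activity; the pipeline, activity, and dataset names here are hypothetical placeholders, not the names used in the exercises.

```python
import json

# Hedged sketch: an illustrative Data Factory pipeline definition with one
# Copy activity. "CopyBlobToSqlPipeline", "InputDataset", and "OutputDataset"
# are hypothetical placeholder names.
pipeline = {
    "name": "CopyBlobToSqlPipeline",
    "properties": {
        "activities": [
            {
                "name": "CopyBlobToSql",
                "type": "Copy",
                # Datasets reference the data to read and write; each dataset
                # in turn points at a data store through its linked service.
                "inputs": [
                    {"referenceName": "InputDataset", "type": "DatasetReference"}
                ],
                "outputs": [
                    {"referenceName": "OutputDataset", "type": "DatasetReference"}
                ],
                "typeProperties": {
                    "source": {"type": "BlobSource"},
                    "sink": {"type": "SqlSink"},
                },
            }
        ]
    },
}

print(json.dumps(pipeline, indent=2))
```

In the exercises we will create the equivalent objects through the Azure portal rather than by hand-writing JSON, but the portal produces definitions of this same shape, which is also what the monitoring views report on when a pipeline runs.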