Getting Started with Azure Data Factory

Target Audience:

Data integration (SSIS, DataStage, Informatica, etc.) developers, DBAs, cloud enthusiasts.


I begin by creating a Data Factory in Azure, exploring two options: using the interface provided at the Azure portal, and using Azure Resource Manager (ARM) templates generated from the Azure portal and deployed with PowerShell.
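As a rough sketch of the ARM-template option, a minimal template for a Data Factory might look like the following. The parameter names are illustrative; the resource type `Microsoft.DataFactory/factories` and API version are the standard ones for ADF v2.

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "factoryName": { "type": "string" },
    "location": { "type": "string", "defaultValue": "[resourceGroup().location]" }
  },
  "resources": [
    {
      "type": "Microsoft.DataFactory/factories",
      "apiVersion": "2018-06-01",
      "name": "[parameters('factoryName')]",
      "location": "[parameters('location')]",
      "identity": { "type": "SystemAssigned" }
    }
  ]
}
```

A template like this can be deployed from PowerShell with `New-AzResourceGroupDeployment`, passing the resource group name and the template file.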

I lead you on an expedition inside the ADF portal – navigating the Overview, Author, and Monitor pages – surfacing what's where, what it's for, and how to use it in real life. I survey ADF constituents: pipelines, datasets, connections, linked services, and triggers.
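To make one of those constituents concrete: a linked service is ADF's connection definition to an external store, expressed as JSON. A minimal sketch for Azure Blob Storage might look like this (the name is hypothetical, and the connection string is deliberately elided):

```json
{
  "name": "MyBlobLinkedService",
  "properties": {
    "type": "AzureBlobStorage",
    "typeProperties": {
      "connectionString": "<storage account connection string>"
    }
  }
}
```

Datasets then reference a linked service by name, and pipeline activities reference datasets – that chain is the backbone of every ADF solution.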

I demonstrate how to build a simple pipeline, popping through the list of available activities and showing as many as time allows.
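Under the hood, the pipeline the designer builds is itself JSON. A sketch of a simple pipeline with a single Copy activity might look like the following; `SourceDataset` and `SinkDataset` are hypothetical dataset names you would have defined beforehand:

```json
{
  "name": "CopyBlobPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopySourceToSink",
        "type": "Copy",
        "inputs": [ { "referenceName": "SourceDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "SinkDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "BlobSource" },
          "sink": { "type": "BlobSink" }
        }
      }
    ]
  }
}
```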

Why I Want to Present This Session:

I am an experienced data integration developer, but that's not why you should vote for (or attend) this presentation. You should vote for / attend this session because I *love* data integration. I've been doing this kind of work for decades – since before I even knew the proper term for it.

Plus, my goal is to leave it all on the field.

Additional Resources:

Andy Leonard is Chief Data Engineer at Enterprise Data & Analytics, creator of the DILM (Data Integration Lifecycle Management) Suite, author / co-author of twelve books, technical trainer, blogger, dad, grandfather, and recovering chicken farmer.


