The event trigger works with a storage account blob container. It fires whenever a blob file is created or deleted, so the trigger is scoped to those file events. It is supported only for Azure Data Lake Storage Gen2 and General-purpose version 2 storage accounts. Steps to Perform: You need to create the below services to... Continue Reading →
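For illustration, here is a minimal sketch of creating such a blob event trigger with the azure-mgmt-datafactory Python SDK; it is not the article's exact steps, and the resource group, factory, pipeline, and storage account names are placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobEventsTrigger,
    PipelineReference,
    TriggerPipelineReference,
    TriggerResource,
)

SUBSCRIPTION_ID = "<subscription-id>"
RG, FACTORY = "my-rg", "my-adf"  # placeholder names
STORAGE_ID = (  # must be an ADLS Gen2 or General-purpose v2 storage account
    f"/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/{RG}"
    "/providers/Microsoft.Storage/storageAccounts/mystorageacct"
)

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Fire the pipeline whenever a .csv blob is created or deleted under the
# 'input' container of the scoped storage account.
trigger = BlobEventsTrigger(
    scope=STORAGE_ID,
    events=["Microsoft.Storage.BlobCreated", "Microsoft.Storage.BlobDeleted"],
    blob_path_begins_with="/input/blobs/",
    blob_path_ends_with=".csv",
    pipelines=[
        TriggerPipelineReference(
            pipeline_reference=PipelineReference(
                type="PipelineReference", reference_name="CopyPipeline"
            )
        )
    ],
)

client.triggers.create_or_update(
    RG, FACTORY, "BlobEventTrigger", TriggerResource(properties=trigger)
)
# Triggers are created in 'Stopped' state; blob events flow only after start.
# Data Factory uses Azure Event Grid for these events, so the Event Grid
# resource provider must be registered in the subscription.
client.triggers.begin_start(RG, FACTORY, "BlobEventTrigger").result()
```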
Azure Data Factory – How to Create Schedule Trigger
The schedule trigger runs automatically and invokes pipelines in an ADF. We can create multiple triggers to run a pipeline, or a single trigger to run multiple pipelines. Let's suppose you have already implemented an ADF pipeline; if not, you can find an example here to create an ADF pipeline. You then need to create... Continue Reading →
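As a quick sketch (again with placeholder names such as my-rg, my-adf, and CopyPipeline), the same Python SDK can create and start a schedule trigger that invokes one pipeline every 15 minutes:

```python
from datetime import datetime, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineReference,
    ScheduleTrigger,
    ScheduleTriggerRecurrence,
    TriggerPipelineReference,
    TriggerResource,
)

SUBSCRIPTION_ID = "<subscription-id>"
RG, FACTORY = "my-rg", "my-adf"  # placeholder names

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Run the pipeline every 15 minutes, starting now. A single trigger can list
# several pipelines here, and several triggers can point at the same pipeline.
recurrence = ScheduleTriggerRecurrence(
    frequency="Minute",
    interval=15,
    start_time=datetime.now(timezone.utc),
    time_zone="UTC",
)
trigger = ScheduleTrigger(
    recurrence=recurrence,
    pipelines=[
        TriggerPipelineReference(
            pipeline_reference=PipelineReference(
                type="PipelineReference", reference_name="CopyPipeline"
            ),
            parameters={},  # pipeline parameter values, if the pipeline has any
        )
    ],
)

client.triggers.create_or_update(
    RG, FACTORY, "ScheduleTrigger15Min", TriggerResource(properties=trigger)
)
# Triggers are created in 'Stopped' state; start it to begin scheduled runs.
client.triggers.begin_start(RG, FACTORY, "ScheduleTrigger15Min").result()
```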
Azure Data Factory – How to Parameterize Linked Service
The ADF linked services are the connectors between source and sink data stores that are used to move data with pipeline activities. In real-world scenarios, we need to deal with different databases, blob storage containers, and Key Vault secrets across environments such as development, QA, and UAT. This article describes a general approach to overcome this... Continue Reading →
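One common way to do this (shown here as a hedged sketch, not necessarily the article's exact approach) is to declare parameters on the linked service and reference them with the @{linkedService().ParamName} expression, so a single definition can point at a different database per environment. The server, secret, and linked service names below are placeholders, and an existing Key Vault linked service (here called AzureKeyVaultLS) is assumed for the password.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureKeyVaultSecretReference,
    AzureSqlDatabaseLinkedService,
    LinkedServiceReference,
    LinkedServiceResource,
    ParameterSpecification,
)

SUBSCRIPTION_ID = "<subscription-id>"
RG, FACTORY = "my-rg", "my-adf"  # placeholder names

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Pull the SQL password from an existing Key Vault linked service, so no
# secret is hard-coded per environment.
password = AzureKeyVaultSecretReference(
    store=LinkedServiceReference(
        type="LinkedServiceReference", reference_name="AzureKeyVaultLS"
    ),
    secret_name="sql-admin-password",
)

# The database name is a linked-service parameter, resolved at runtime via
# the @{linkedService().DBName} expression in the connection string.
sql_ls = AzureSqlDatabaseLinkedService(
    connection_string=(
        "Server=tcp:myserver.database.windows.net,1433;"
        "Database=@{linkedService().DBName};User ID=sqladmin"
    ),
    password=password,
    parameters={"DBName": ParameterSpecification(type="String")},
)

client.linked_services.create_or_update(
    RG, FACTORY, "AzureSqlDbParameterizedLS", LinkedServiceResource(properties=sql_ls)
)
```

Any dataset or activity that uses this linked service then supplies the DBName value, so the same factory artifacts can be promoted from development to QA to UAT without editing connection strings.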