
Delete files from folder using delete activity

The Delete activity in Azure Data Factory allows you to delete files or folders from on-premises or cloud storage stores.

You can use this activity to clean up or archive files when they are no longer needed.

There are some points you should always keep in mind before using the Delete activity.

  • Make sure you are not deleting files that are being written at the same time.
  • Keep a backup of your files or folders before deleting them with the Delete activity, in case you need to restore them later; deleted files or folders cannot be restored (unless the storage account has soft delete enabled).
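The back-up-first precaution above can be sketched in a few lines. This is a local-filesystem simulation, not the Delete activity itself; the folder names and the `backup_then_delete` helper are hypothetical, chosen only to illustrate the copy-before-delete pattern:

```python
import shutil
from pathlib import Path

def backup_then_delete(folder: Path, backup_root: Path) -> list[str]:
    """Copy every file under `folder` into `backup_root`, then delete it.

    Local sketch of the precaution described above: back up first,
    because a plain delete cannot be undone.
    """
    deleted = []
    for f in sorted(folder.rglob("*")):
        if f.is_file():
            target = backup_root / f.relative_to(folder)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)   # back the file up first...
            f.unlink()                # ...then delete the original
            deleted.append(f.name)
    return deleted
```

With soft delete enabled on the storage account this extra copy may be unnecessary, but for plain storage it is the only way back.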

Delete files from blob storage

As you can see, the container has one folder named Input, and inside this folder there are two files: Customer.txt and Details.txt.

Now we will delete these files from the Input folder and save the logs of the deleted files into the Log folder.

Log in to the Azure Data Factory portal. Click the Author tab and create a new pipeline.

Provide a name and description for the pipeline. The description is optional.

Now search for Delete in the pipeline Activities pane, and drag and drop the Delete activity onto the pipeline canvas.

After that, click the Source tab to edit its details.

Now select an existing dataset, or create a new one specifying the files to be deleted.

Here we will create a new dataset: click the New button.

This opens the New dataset page. Select Azure Blob Storage, since the files we want to delete are stored in Azure Blob Storage.

Click the Continue button.

Once you click the Continue button, it takes you to the New linked service page. Here you need to define the linked service as shown below.

After that, click the Create button. Now you have to select the file path; in our case, the files are stored in the Input folder.

Since we want to delete all files in the Input folder, we select just the folder name and click the OK button.

Now you can see the dataset name, linked service name, and file path details.

Verify them; if all the details are correct, click the OK button.

Now the dataset is defined. Keep the Recursively option enabled.

The Recursively option controls whether the activity processes all files in the Input folder and its subfolders, or only the files directly in the selected folder. This setting is disabled when a single file is selected.

In our case, we want to delete all files from the Input folder, so we keep this option enabled.
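The effect of the Recursively toggle can be illustrated with a small local sketch. The `files_to_delete` helper and folder layout are hypothetical, used only to show which files each setting would pick up:

```python
from pathlib import Path

def files_to_delete(folder: Path, recursive: bool) -> list[str]:
    """Mimic the Recursively toggle: with it on, files in subfolders
    are included; with it off, only files directly in the folder."""
    paths = folder.rglob("*") if recursive else folder.glob("*")
    return sorted(p.name for p in paths if p.is_file())
```

With a layout like `Input/Customer.txt` and `Input/sub/Details.txt`, enabling recursion returns both files, while disabling it returns only `Customer.txt`.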

Now click the Logging settings tab.

Here we will define the logging settings to save the deleted-file logs into the Log folder.

Click the New button to create a new linked service, provide all the required details (linked service name, subscription, storage account), and click the OK button.

Next, it asks you to provide the folder where you want to keep the deleted-files log.

Now you can see the logging settings are defined. Verify them; if any detail is incorrect, you can edit it.
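Behind the designer UI, the published pipeline stores the Delete activity as JSON. A rough sketch of what the configuration above produces is shown below; the dataset and linked service names are placeholders for whatever you created in the earlier steps, and the exact property layout may differ slightly between service versions:

```json
{
  "name": "DeleteInputFiles",
  "type": "Delete",
  "typeProperties": {
    "dataset": {
      "referenceName": "InputFolderDataset",
      "type": "DatasetReference"
    },
    "storeSettings": {
      "type": "AzureBlobStorageReadSettings",
      "recursive": true
    },
    "enableLogging": true,
    "logStorageSettings": {
      "linkedServiceName": {
        "referenceName": "LogStorageLinkedService",
        "type": "LinkedServiceReference"
      },
      "path": "Log"
    }
  }
}
```

You can inspect the actual JSON for your pipeline via the code view ({} icon) in the designer.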

Now click the Publish button to publish all changes to the pipeline.

Let’s run the pipeline: click the Debug button.

You can see the pipeline status; it ran successfully.

Let’s validate it: check the Input folder and the Log folder.

The run should have deleted all files from the Input folder and saved the details of the deleted files in the Log folder.

You can see that all the files (Customer.txt and Details.txt) in the Input folder have been deleted.

Let’s check the Log folder. You can see that a subfolder has been created inside it.

Inside that subfolder there is one log file. To see the log details, select the file and click the Edit button.

You can see the logs of the deleted files. The log records each deleted file’s details: name, category, status, and error.
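If you want to process these logs downstream, they can be parsed as CSV. The snippet below is a sketch: the column names and `SAMPLE_LOG` contents are assumed from the fields listed above (name, category, status, error), not copied from an actual log file, so check them against your own output first:

```python
import csv
import io

# Hypothetical log contents, modeled on the fields described above.
SAMPLE_LOG = """Name,Category,Status,Error
Customer.txt,File,Deleted,
Details.txt,File,Deleted,
"""

def deleted_files(log_text: str) -> list[str]:
    """Return the names of files the Delete activity reported as deleted."""
    reader = csv.DictReader(io.StringIO(log_text))
    return [row["Name"] for row in reader if row["Status"] == "Deleted"]
```

Running `deleted_files(SAMPLE_LOG)` on the sample above yields the two files deleted in this walkthrough.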


Also Read:

Create Azure Data Factory using Azure Portal

Azure Data Factory ETL – Load Blob Storage to Azure SQL Table

Pivot Transformation Using Data Flow Activity

Wait Activity in Azure Data Factory

Copy multiple files from one folder to another using ForEach loop activity

Create a Schedule trigger

