Load Files from Azure Blob Storage into Azure SQL Database
You can load files stored in Azure Blob Storage into Azure SQL Database using the BULK INSERT statement.
To load a file into Azure SQL Database from Blob Storage, you must first have a file uploaded to your Azure storage container.
If you do not have any file in an Azure storage container, refer to the post on how to upload a file to an Azure storage container – Upload file in azure Storage container
Following are the steps to load a file into an Azure SQL database.
First, open your Azure storage account. Here we have an account named oneazurestorageaccount.
Now, click on the Containers service tab.
Here we have an Azure storage container named blobcontainer.
Click on the triple dots next to the container, as shown below.
We need the Azure storage container URL; it is used to establish an external data source that will point to the container in an Azure storage account.
Later, we will use this URL in the BULK INSERT command.
Remember: if you want to load files from several different containers, you do not need to include a specific container name in the URL – exclude the container name and copy the remaining URL.
If you need to load files from one specific container only, you can copy the whole URL including the container name.
Copy the URL for later use.
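The two URL forms affect where the container name goes in the later commands. Here is a minimal sketch of both variants, using the account and container names from this post; the data source names AccountLevelDs and ContainerLevelDs are illustrative, and the CREDENTIAL clause (shown later in this post) is omitted here for brevity:

```sql
-- Option 1: account-level URL. The container name is NOT part of LOCATION,
-- so it must appear in the BULK INSERT file path instead, e.g.
-- FROM 'blobcontainer/EmployeeData.csv'
CREATE EXTERNAL DATA SOURCE AccountLevelDs
WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://oneazurestorageaccount.blob.core.windows.net'
);

-- Option 2: container-level URL. The container name IS part of LOCATION,
-- so file paths are relative to that container, e.g.
-- FROM 'EmployeeData.csv'
CREATE EXTERNAL DATA SOURCE ContainerLevelDs
WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://oneazurestorageaccount.blob.core.windows.net/blobcontainer'
);
```

Option 1 is the form this post uses, so that a single data source can serve files from any container in the account.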
Next, go to the Azure storage account settings and click on Shared access signature.
Under Allowed resource types, tick Service, Container, and Object.
After that, click on the Generate SAS and connection string button.
Now scroll down. The SAS key value might begin with a '?' (question mark); when you use the SAS key, you must remove the leading '?'. For example, if the generated value begins with '?sv=2019-10-10&ss=bfqt', use it starting from 'sv=2019-10-10&ss=bfqt'.
Keep this value safe for later use.
Now it is time to open the Azure SQL database.
Click on the database that you want to use to load the file.
Now go to the Query editor (preview).
After that, log in to the SQL database.
Select the database, and create a table that will be used to hold the blob storage data.
Before moving further, let's take a look at the blob file that we want to load into the SQL database.
Here we have a file EmployeeData in blobcontainer.
The file has six columns: NationalIdNumber, JobTitle, BirthDate, MaritalStatus, Gender, and HireDate, as you can see in the preview.
Let's create a table that will store the file data.

CREATE TABLE Employee_BlobData (
    NationalIdNumber INT,
    JobTitle VARCHAR(50),
    BirthDate DATE,
    MaritalStatus VARCHAR(2),
    Gender VARCHAR(2),
    HireDate DATE
);
Create a database master key, if you do not have one.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'providepassword';
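If you are not sure whether the database already has a master key, you can check the catalog view first; '##MS_DatabaseMasterKey##' is the fixed name SQL Server uses for the database master key:

```sql
-- Returns one row if a database master key already exists.
SELECT name, create_date
FROM sys.symmetric_keys
WHERE name = '##MS_DatabaseMasterKey##';
```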
Next, create a database scoped credential that stores the SAS key (without the leading '?') as its secret.

CREATE DATABASE SCOPED CREDENTIAL AzureBolbStorageCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sv=2019-10-10&ss=bfqt&srt=sco&sp=rwdlacupx&se=2020-07-19T11:42:37Z&st=2020-07-19T03:42:37Z&spr=https&sig=irbCynmU4cbtDysP%2BICVotO3DqYKwcmba%2Ba0E5abpPk%3D';

Now we will create an external data source that uses the database scoped credential we created above.

CREATE EXTERNAL DATA SOURCE azureblobstorageDs
WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://oneazurestorageaccount.blob.core.windows.net',
    CREDENTIAL = AzureBolbStorageCredential
);
Now we will write a BULK INSERT query that loads the data into the Employee_BlobData table.

BULK INSERT Employee_BlobData
FROM 'blobcontainer/EmployeeData.csv'
WITH (
    DATA_SOURCE = 'azureblobstorageDs',
    FORMAT = 'CSV',
    FIRSTROW = 2
);

As you can see, when you run the query, it loads the file data into the table.
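If your file does not use the default comma delimiter or CRLF line endings, BULK INSERT accepts additional options. Here is a hedged variant of the command above; the semicolon delimiter and LF row terminator are illustrative assumptions, not properties of the EmployeeData file in this post:

```sql
BULK INSERT Employee_BlobData
FROM 'blobcontainer/EmployeeData.csv'
WITH (
    DATA_SOURCE = 'azureblobstorageDs',
    FORMAT = 'CSV',          -- parse quoted CSV fields
    FIRSTROW = 2,            -- skip the header row
    FIELDTERMINATOR = ';',   -- column delimiter (default is ',' with FORMAT = 'CSV')
    ROWTERMINATOR = '0x0a'   -- row delimiter for files with LF-only line endings
);
```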
Now you can check the table. As you can see, it returns the records.

SELECT * FROM Employee_BlobData;