Load Files from Azure Blob Storage to Azure SQL Database
You can load files stored in Azure Blob Storage into Azure SQL Database using the BULK INSERT command.
To load a file into Azure SQL Database from blob storage, you must first have the file uploaded to an Azure storage container.
If you do not have any file in an Azure storage container, you can refer to the post on how to upload a file to an Azure storage container – Upload file in Azure Storage container
Following are the steps to load a file into Azure SQL Database.
First, open your Azure storage account. Here we have an account named oneazurestorageaccount.
Now, click on the Containers service tab.
Here we have an Azure storage container named blobcontainer.
Right-click on the triple dots as shown below.
We need the Azure storage container URL, which is used to establish an external data source that points to a container in an Azure storage account.
Later, we will use this URL in the BULK INSERT command.
Note that if you want to load files from several different containers, you do not need to include a specific container name; just exclude the container name from the URL and copy the remaining URL.
If you need to load files from a specific container only, then you can copy the whole URL including the container name.
Copy the URL for later use.
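For reference, with this walkthrough's storage account (oneazurestorageaccount) and container (blobcontainer), the two forms of the URL look like this:

Account-level URL (can load from any container): https://oneazurestorageaccount.blob.core.windows.net
Container-level URL (loads from blobcontainer only): https://oneazurestorageaccount.blob.core.windows.net/blobcontainer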
Next, go to the Azure storage account settings and click on Shared Access Signature.
Under Allowed resource types, tick Service, Container, and Object.
After that, click on the Generate SAS and connection string button.
Now scroll down to the generated values. The SAS token value might begin with a '?' (question mark). When you use the SAS token, you must remove the leading '?'.
Keep this SAS token safe for later use.
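For example, the SAS token generated for this walkthrough looks like the following (it is the same value used as the credential secret later in this post); remember that if a leading '?' is present, it has to be removed before use:

?sv=2019-10-10&ss=bfqt&srt=sco&sp=rwdlacupx&se=2020-07-19T11:42:37Z&st=2020-07-19T03:42:37Z&spr=https&sig=irbCynmU4cbtDysP%2BICVotO3DqYKwcmba%2Ba0E5abpPk%3D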
Now it is time to open the Azure SQL Database.
Click on the database that you want to load the file into.
Now go to the Query editor (Preview).
After that, log in to the SQL database.
Select the database, and create a table that will be used to hold the blob storage data.
Before moving further, let's take a look at the blob storage file that we want to load into the SQL database.
Here we have a file, EmployeeData, in blobcontainer.
The file has six columns: NationalIdNumber, JobTitle, BirthDate, MaritalStatus, Gender, and HireDate, as you can see in the preview.
Let's create a table that will store the file data.
CREATE TABLE Employee_BlobData (
    NationalIdNumber INT,
    JobTitle VARCHAR(50),
    BirthDate DATE,
    MaritalStatus VARCHAR(2),
    Gender VARCHAR(2),
    HireDate DATE
);
Create a database master key, if you do not have one.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'providepassword';
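If you are not sure whether the database already has a master key, one quick way to check (a sketch using the sys.symmetric_keys catalog view) is:

SELECT name, create_date
FROM sys.symmetric_keys
WHERE name = '##MS_DatabaseMasterKey##';
-- A returned row means a database master key already exists.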
Next, create a database scoped credential that stores the SAS token (with the leading '?' removed) as its secret.

CREATE DATABASE SCOPED CREDENTIAL AzureBlobStorageCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sv=2019-10-10&ss=bfqt&srt=sco&sp=rwdlacupx&se=2020-07-19T11:42:37Z&st=2020-07-19T03:42:37Z&spr=https&sig=irbCynmU4cbtDysP%2BICVotO3DqYKwcmba%2Ba0E5abpPk%3D';

Now we will create an external data source that uses the database scoped credential we created above.

CREATE EXTERNAL DATA SOURCE azureblobstorageDs
WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://oneazurestorageaccount.blob.core.windows.net',
    CREDENTIAL = AzureBlobStorageCredential
);
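If you copied the container-level URL instead, the LOCATION would include the container name, and the file path in the BULK INSERT statement would then omit it. A sketch of that variant (the data source name azureblobstorageContainerDs is illustrative):

CREATE EXTERNAL DATA SOURCE azureblobstorageContainerDs
WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://oneazurestorageaccount.blob.core.windows.net/blobcontainer',
    CREDENTIAL = AzureBlobStorageCredential
);
-- With this data source, the FROM path would be just 'EmployeeData.csv'.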
Now we will write a BULK INSERT query that loads the data into the Employee_BlobData table.

BULK INSERT Employee_BlobData
FROM 'blobcontainer/EmployeeData.csv'
WITH (
    DATA_SOURCE = 'azureblobstorageDs',
    FORMAT = 'CSV',
    FIRSTROW = 2
);

As you can see, when you run the query it loads the file data into the table.
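BULK INSERT accepts further options that can help with real-world CSV files. Here is a sketch with a few of them; the terminator values are assumptions, so adjust them to match your file's actual delimiters and line endings:

BULK INSERT Employee_BlobData
FROM 'blobcontainer/EmployeeData.csv'
WITH (
    DATA_SOURCE = 'azureblobstorageDs',
    FORMAT = 'CSV',
    FIRSTROW = 2,           -- skip the header row
    FIELDTERMINATOR = ',',  -- assumed comma-delimited
    ROWTERMINATOR = '0x0a', -- assumed LF line endings
    TABLOCK                 -- table lock for faster bulk loading
);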
Now you can check the table. As you can see, it returns records.
SELECT * FROM Employee_BlobData
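As a quick sanity check, you can also compare the table's row count against the number of data rows in the file (total rows minus the header row):

SELECT COUNT(*) AS LoadedRows FROM Employee_BlobData;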
If you want to learn more about external data sources and database scoped credentials, you can refer to the Microsoft documentation – External Data Source and Database Scoped Credential
Also Read:
Create Azure Data Lake Storage Account
Upload Files to Azure Data Lake Storage
Create an Azure Data Lake Database
Create a Schema and Tables in Azure Data Lake Database
Load Data from Azure Data Lake Storage to Azure Data Lake Database