Recently I had the challenge of pushing data from an on-premises database (SQL Server 2012) to a container in my storage account. I didn't have access to the database via ADF and didn't want to use SSIS. Hello PowerShell, you made my day (and of course Google and Stack Overflow). Note: this is a temporary solution.
First I created a function that opens a SQL connection and executes a query. The output of that query is saved as a CSV file. Below is the part of that function that executes a basic SELECT query and uses the Export-Csv cmdlet to write the CSV file to a directory.
```powershell
$sqlConnection.Open()

# Fill the dataset
$sqlAdapter.Fill($dataSet)

# Write the data to a CSV file using the Export-Csv cmdlet
$dataSet.Tables[0] | Export-Csv -NoTypeInformation `
    -Path "c:\Temp\Data\$Table.csv" `
    -Encoding UTF8 `
    -Delimiter "|"

$sqlConnection.Close()
$sqlConnection.Dispose()
}
```
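For context, the part of the function that is not shown above creates the connection, command, and adapter objects. A minimal sketch of that setup, assuming a plain SELECT * query and integrated security (the server and database names are placeholders, not the ones I actually used):

```powershell
function StartSqlSelect {
    param([string]$Table)

    # Assumed connection string; replace server, database and authentication with your own
    $sqlConnection = New-Object System.Data.SqlClient.SqlConnection
    $sqlConnection.ConnectionString = "Server=MYSERVER;Database=SourceDb;Integrated Security=True"

    # Basic SELECT on the table name passed in as parameter
    $sqlCommand = New-Object System.Data.SqlClient.SqlCommand
    $sqlCommand.CommandText = "SELECT * FROM $Table"
    $sqlCommand.Connection = $sqlConnection

    # The adapter fills a DataSet with the query result
    $sqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
    $sqlAdapter.SelectCommand = $sqlCommand
    $dataSet = New-Object System.Data.DataSet

    # ... followed by the Open/Fill/Export-Csv block shown above
}
```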
I added an array that holds the tables I need, and a simple loop over that array that calls the StartSqlSelect function with the table name as its only parameter.
```powershell
$Tables = @("PRODUCT", "CUSTOMER", "INVOICE")
$StartDT = Get-Date

Write-Host "Start Export data from tables"
foreach ($Table in $Tables) {
    StartSqlSelect $Table
}
```
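The $StartDT timestamp captured above is not used in this snippet; a small addition (my sketch, not part of the original loop) is to report the elapsed time once all tables are exported:

```powershell
# Report how long the export took, based on the $StartDT captured before the loop
$Elapsed = New-TimeSpan -Start $StartDT -End (Get-Date)
Write-Host "Export finished in $([math]::Round($Elapsed.TotalSeconds, 1)) seconds"
```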
The files were now located in my C:\Temp\Data folder; next was the upload to the Azure Storage account. However, those files can be quite large, so we want to compress them first. I use 7-Zip in this case with medium compression (hence the -mx=5 switch).
```powershell
# Timestamp for the archive name (HH = 24-hour clock)
$DateStr = [System.DateTime]::Now.ToString("yyyyMMddHHmmss")
$src_folder = "C:\Temp\Data"
$destfile = "C:\Temp\sourcesystem_$DateStr.zip"

Set-Alias sz "c:\Temp\7z\x64\7za.exe"
sz a -mx=5 $destfile $src_folder
```
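Since the upload only makes sense when the archive was written correctly, a defensive sketch is to check the 7-Zip exit code right after the sz call (0 means success, 1 a warning, 2 a fatal error):

```powershell
# Stop before uploading a broken archive; 7-Zip signals failure through its exit code
if ($LASTEXITCODE -ne 0) {
    throw "7-Zip compression failed with exit code $LASTEXITCODE"
}
```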
Once the zip file was available, I ran AzCopy to upload it to my storage account.
```powershell
Set-Alias azc "c:\Temp\AzCopy\AzCopy.exe"
azc /Source:$destfile /Dest:https://accountname.blob.core.windows.net/ingest/source_$DateStr.zip /DestKey:{key}
```
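Because this runs as a recurring temporary job, you may also want to clean up the local files once the upload succeeds. A sketch, assuming AzCopy returns a zero exit code on success and using the same paths as above:

```powershell
# Remove the exported CSVs and the zip after a successful upload
if ($LASTEXITCODE -eq 0) {
    Remove-Item "C:\Temp\Data\*.csv"
    Remove-Item $destfile
}
```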
All the code can be found here.