I have a server running a SQL Server Express DB and an Azure blob container to which I upload the SQL Server backup each morning.
Now, I've been able to automate the backup via a mix of a SQL query and a batch file, and I have it scheduled in Task Scheduler to run each night at 9:00 PM, but I would also like to move a copy of the backup from the server to Azure Storage.
I've already tried a batch file in Task Scheduler:
echo off
copy "Z:\Backup\SQLBackup\" "C:\Program Files\Microsoft SQL Server\MSSQL14.SQLEXPRESS\MSSQL\Backup\DailyStep_bck.bak"
But it doesn't work on its own, only if I run it manually.
Each day the current copy should replace the older one; I don't need retention of old backups for now.
I've also tried robocopy and it doesn't work either... could someone tell me what I am missing?
The task is running as administrator with the "Run whether user is logged on or not" option.
Thanks for your help.
I'll summarize the solution below.
Windows Task Scheduler doesn't have permissions on the mounted Azure file share drive, so we need to use the path https://[Azure_account].file.core.windows.net/storage/Backup to access the Azure file share. Besides, if you want to implement auto-upload, you can use AzCopy.
For example:
Create a SAS token for the file share:
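One way to generate the token is with the Az.Storage PowerShell module (a minimal sketch; the account name, key and expiry below are placeholders):
$ctx = New-AzStorageContext -StorageAccountName '[Azure_account]' -StorageAccountKey '<storage account key>'
$sas = New-AzStorageShareSASToken -ShareName 'storage' -Permission 'rwl' -ExpiryTime (Get-Date).AddMonths(1) -Context $ctx
$sas   # paste this value after the '?' in the azcopy destination URL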
Script:
azcopy copy 'C:\Program Files\Microsoft SQL Server\MSSQL14.SQLEXPRESS\MSSQL\Backup\*.bak' 'https://[Azure_account].file.core.windows.net/storage/Backup/SQLBackup/[backup_file].bak?<sas token>'
Create a scheduled task:
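For example, a daily task that runs the upload script shortly after the backup finishes (a sketch; the script path, task name and time are placeholders):
schtasks /Create /TN "UploadSqlBackupToAzure" /TR "C:\Scripts\upload_backup.bat" /SC DAILY /ST 21:30 /RU SYSTEM /RL HIGHEST /F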
For more details, please refer to here and here
I've inherited an SSIS package which loads CSV files into a SQL database.
The first thing the package does is call a .BAT file which maps a network drive. The .BAT file contains the username and password used to map the drive in plain text, so it urgently needs to be replaced.
I've written a solution which uses New-PSDrive in a PowerShell script and creates a credentials XML file with the password encrypted. If I execute the .ps1 script directly, it works OK.
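Roughly, the script looks like this (a trimmed-down sketch; the paths, drive name and share are placeholders):
# One-time step, run interactively: store the credential with the password encrypted
Get-Credential | Export-Clixml -Path 'C:\Secure\sharecred.xml'
# In the script the package calls: re-load the credential and map the drive for this session
$cred = Import-Clixml -Path 'C:\Secure\sharecred.xml'
New-PSDrive -Name 'X' -PSProvider FileSystem -Root '\\MyServer\Files' -Credential $cred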
If I call the .ps1 script from within SSIS and run it through Visual Studio, it also works fine.
When I call the SSIS package that runs the script task through a SQL Agent job (executed as the SQL Server Agent user account), the drive mapping doesn't seem to work, and the package complains that it cannot access the file on the file share.
I presume this is not working because the SQL Server Agent user account can't run PowerShell scripts? The package only errors because it cannot access the share, not during the execution of the PowerShell script.
So, two questions:
Is there an obvious issue with the SQL Agent deployment idea?
or
Is there a different way of mapping a network drive from within SSIS without storing the password in plain text anywhere?
This is the pattern I suggest using for accessing files on a remote server:
Access the files via unc path: \\MyServer\Files
This means that the package needs to run under a context that has access to the path. There are a couple of options:
Run the package under a proxy, i.e. the account credentials that were stored in the .bat file. This means that you also have to grant the account access on the SQL instance (see the T-SQL sketch after this list).
Use a group managed service account (gMSA). In this case, the SQL Agent service account is replaced with the gMSA and the job runs under SQL Agent. The gMSA needs to be granted access to the remote share as well as the SQL instance. This is a much more secure way of addressing the whole process because there is no password to manage (AD takes care of that) and the account cannot be used to log in anywhere. But there is setup work to do to get it created, so it's not the fastest option.
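For the proxy option, the setup is roughly this (a minimal T-SQL sketch; the account name, password and proxy name are placeholders):
-- Store the Windows credentials that were previously in the .bat file
CREATE CREDENTIAL ShareCredential
    WITH IDENTITY = 'DOMAIN\ShareUser', SECRET = 'StrongPasswordHere';
-- Create a SQL Agent proxy that uses the credential
EXEC msdb.dbo.sp_add_proxy
    @proxy_name = 'SSIS_Share_Proxy',
    @credential_name = 'ShareCredential',
    @enabled = 1;
-- Allow the proxy to run SSIS job steps, then pick it in the job step's "Run as" dropdown
EXEC msdb.dbo.sp_grant_proxy_to_subsystem
    @proxy_name = 'SSIS_Share_Proxy',
    @subsystem_name = 'SSIS';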
I have an Azure VM server. On it, I have a job set up for automatic backup to Azure local storage. I need to store a copy of that backup on another server. How do I do that? Is there any way to do it automatically?
I am not sure if you can do it directly from one server to another server, but you can do it via blob storage. Use AzCopy (https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy) for uploading and downloading files from blobs.
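For example, with AzCopy v10 and a SAS token (a sketch; the account, container, paths and token are placeholders):
azcopy copy "D:\Backup\*.bak" "https://<account>.blob.core.windows.net/backups?<sas token>"
Then, on the second server, pull the backups back down:
azcopy copy "https://<account>.blob.core.windows.net/backups?<sas token>" "E:\RestoredBackups" --recursive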
You can also use Azure File Service to copy the backups for archival purposes. Use the following commands to mount a network drive to archive the backup:
Create a storage account in Windows Azure PowerShell
New-AzureStorageAccount -StorageAccountName "tpch1tbbackup" -Location "West US"
Create a storage context
$storageAccountName = "tpch1tbbackup"
$storageKey = Get-AzureStorageKey $storageAccountName | %{$_.Primary}
$context = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageKey
Create a share
New-AzureStorageShare -Name backup -Context $context
Attach the share to your Azure VM
net use x: \\tpch1tbbackup.file.core.windows.net\backup /u:$storageAccountName $storageKey
Xcopy backup to x: to offload the backup on the mounted drive
xcopy D:\backup\*.* X:\tpchbackup\*.*
According to your question, you can achieve this in many ways. As #Alberto Morillo and #Vivek said, you can use PowerShell and AzCopy to do that. But if you want to back up and copy automatically, you can use a runbook to achieve it.
Also, you can attach schedules to a runbook. With this, you can back up your resources automatically. A runbook can run PowerShell cmdlets and provides many features to automate your job.
See more details about runbooks in Azure Automation in this document.
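As a rough sketch, a PowerShell runbook could copy the backup blob between storage accounts with the Az.Storage cmdlets (the account names, keys, container names and blob name below are placeholders):
# Keys and contexts for the source and destination storage accounts
$srcKey = '<source account key>'
$dstKey = '<destination account key>'
$srcCtx = New-AzStorageContext -StorageAccountName 'sourceaccount' -StorageAccountKey $srcKey
$dstCtx = New-AzStorageContext -StorageAccountName 'destaccount' -StorageAccountKey $dstKey
# Start a server-side copy of the backup blob to the destination container
Start-AzStorageBlobCopy -SrcContainer 'backups' -SrcBlob 'DailyStep_bck.bak' -Context $srcCtx -DestContainer 'backups-archive' -DestBlob 'DailyStep_bck.bak' -DestContext $dstCtx
Attaching a schedule to the runbook in the Automation account then runs the copy after the nightly backup.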
To automate your backup process from one server to another server using the Azure storage service, you have to make three batch files.
The first batch file takes a backup of your DB and stores it locally. Here is the command to do that:
set FileName=DBName_%date:~-4,4%%date:~-10,2%%date:~-7,2%.bacpac
echo %FileName%
"C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin\sqlpackage.exe" /a:Export /ssn:your IP(00:00:00) /sdn:yourdatabaseName /tf:"D:\%FileName%" /su:username /sp:"password"
The second batch file pushes your locally saved backup file to Azure Storage:
"C:\AzCopy\AzCopy.exe" /Source:D: /Dest:https://youstoragedestination.link/blobname/ /DestKey:yourAzureStoragekey /Pattern:"*.bacpac" /Y
del "d:*.bacpac"
From the third batch file, call the above two batch files.
Example:
call "yourpath\backupFile.bat"
call "yourpath\backupFilepushing2azure.bat"
You can schedule the third batch file to automate the whole process.
Now you have pushed your backup file to Azure Storage, which I think is enough.
If you really want to save that backup file to another server, make another batch file that downloads your backup file from the blob to that server using AzCopy.
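For example, using the same AzCopy syntax as above (a sketch; the destination folder on the other server is a placeholder):
"C:\AzCopy\AzCopy.exe" /Source:https://youstoragedestination.link/blobname/ /Dest:D:\RestoredBackups /SourceKey:yourAzureStoragekey /Pattern:"*.bacpac" /Y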
I'm new to SSIS and I'm stuck with a problem; I hope some of you have already run into something like this.
Task:
Copying files from a remote server to a local machine folder using File System task and For each loop container.
Problem:
The job executes, i.e. files are copied successfully, when I run it from the SSIS designer, but when the project is deployed to the SQL Server instance it isn't copying any files; in fact, the target folder is totally empty.
I don't understand this strange behavior. Any inputs would be of great help!
Regards-
Santosh G.
The For each loop will not error out if it doesn't find any files.
The SQL Agent account may not have access to read the directory contents.
Check whether your path is a variable - is it being set by a config or a /SET statement?
Can you log the path before starting the loop?
Can you copy a dummy file and check whether SSIS can see it?
How are you running the job? cmd_exec() can give spurious results with file I/O tasks; try running the package from the command line with dtexec, as sketched below.
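For example, running the package from the command line with verbose reporting and an explicit /SET override can rule the variable out (a sketch; the package path, variable name and share are placeholders):
dtexec /F "C:\Packages\CopyFiles.dtsx" /SET \Package.Variables[User::SourcePath].Properties[Value];"\\MyServer\Files" /REPORTING V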
The issue was related to the user authorizations of the SQL Server Agent service.
When I execute the job from SQL Server it uses the Agent service, and that Agent service needs to run under a service account that has access rights to the desired file path.
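If you're not sure which account the Agent service runs under, a quick way to check is (a PowerShell sketch):
Get-CimInstance Win32_Service | Where-Object DisplayName -like 'SQL Server Agent*' | Select-Object DisplayName, StartName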
I designed an SSIS package that generates a .CSV file in a destination folder using a Script Task component. Everything is OK when I run it from the Visual Studio solution, but the problems start to appear right after deployment to SQL Server. The Script Task shows success, but no file is generated. Could someone please provide some help?
Thanks a lot in advance.
Are you running it through a SQL job? This might be because the SQL Agent account (or whatever account you're running it with) doesn't have Read/Write permissions on that particular directory.
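If that turns out to be the case, granting the account modify rights on the folder is usually enough (a sketch; the path and account are placeholders):
icacls "D:\SSIS\Output" /grant "DOMAIN\SqlAgentAccount:(OI)(CI)M"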
I have an SSIS package which creates a folder in a UNC share and then creates a file there (using a Script Task).
The domain account used by SSIS and the Agent has all the possible permissions on the DB computer and on the share computer.
It always fails there.
I created a test SQL Agent job which creates a backup of the database in the same location and it fails too (Operating system error 5 - access denied).
EDIT: The above test example may be irrelevant since the backup operation is executed by SQL Server Database Engine and not the SQL Agent itself (Agent still schedules the task).
I cannot debug the Script Task in SSIS, and therefore I'm not sure what the problem is.
I have managed to fix this problem. The first problem was lack of sufficient task activation/execution permissions in the DCOM config node in Component Services. The permissions had to be set for SQL Server Integration Services.
The second problem was the fact that the UNC path looked like this:
\\192.168.250.51\C$\Folder\
I needed to create another, visible share like this:
\\COMPUTER-NAME\Folder\
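For reference, a visible share like that can be created on the remote machine with something like this (a sketch; the folder, share name and account are placeholders):
net share Folder=C:\Folder /GRANT:"DOMAIN\SsisServiceAccount",FULL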
Also don't try to map any drives to the folders. It won't work.