I've inherited an SSIS package which loads CSV files into a SQL database.
The first thing the package does is call a .BAT file which maps a network drive. The .BAT file contains the username and password used to map the drive in plain text, so it urgently needs to be replaced.
I've written a solution which uses New-PSDrive in a PowerShell script and creates a credentials XML file with the password encrypted. If I execute the .ps1 script directly, it works OK.
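The script does something along these lines (a simplified sketch; the share path, drive letter and file locations are placeholders rather than the real values):

# One-off step: store the credential with the password encrypted
# (Export-Clixml uses DPAPI, so the file is tied to the user and machine that created it)
Get-Credential | Export-Clixml -Path 'D:\SSIS\sharecred.xml'

# In the script the package calls: re-hydrate the credential and map the drive
$cred = Import-Clixml -Path 'D:\SSIS\sharecred.xml'
New-PSDrive -Name 'Z' -PSProvider FileSystem -Root '\\MyServer\Files' -Credential $cred -Persist -Scope Global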
If I call the .ps1 script from within SSIS and run the package through Visual Studio, it also works fine.
When I call the SSIS package that contains the Script Task through a SQL Agent job (executed as the SQL Server Agent service account), the drive mapping doesn't seem to work, and the package complains it cannot access the file on the file share.
I presume this is not working because the SQL Server Agent service account can't run the PowerShell script properly? The package only errors because it cannot access the share, not during execution of the PowerShell script itself.
So, two questions:
Is there an obvious issue with the SQL Agent deployment idea?
or
Is there a different way of mapping a network drive from within SSIS without storing the password in plain text anywhere?
This is the pattern I suggest using for accessing files on a remote server:
Access the files via unc path: \\MyServer\Files
This means that the package needs to run under a context that has access to the path. There are a couple of options:
Run the package under a SQL Agent proxy, i.e. the account whose credentials were stored in the .bat file. This means that you also have to grant that account access on the SQL instance (see the sketch after this list).
Use a group managed service account (gMSA). In this case, the SQL Agent service account is replaced with the gMSA and the job runs under SQL Agent as before. The gMSA needs to be granted access to the remote share as well as to the SQL instance. This is a much more secure way of addressing the whole process, because there is no password to manage (AD takes care of that) and the account cannot be used to log in interactively. But there is setup work to do to get it created, so it's not the fastest option.
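For the proxy option, the setup is roughly this, shown here via Invoke-Sqlcmd from the SqlServer module (the credential name, account, password and instance name are placeholders):

# Credential holding the Windows account that has access to the file share,
# then an SSIS proxy mapped to that credential
$tsql = @"
CREATE CREDENTIAL FileShareCredential
    WITH IDENTITY = 'DOMAIN\FileShareUser', SECRET = '<password>';

EXEC msdb.dbo.sp_add_proxy
    @proxy_name = 'FileShareProxy',
    @credential_name = 'FileShareCredential';

EXEC msdb.dbo.sp_grant_proxy_to_subsystem
    @proxy_name = 'FileShareProxy',
    @subsystem_name = 'SSIS';
"@

Invoke-Sqlcmd -ServerInstance 'MyServer\MyInstance' -Query $tsql

The job step that runs the package is then set to run as that proxy rather than the SQL Server Agent service account.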
I have created an SSIS package which creates a new folder at the output location using a File System Task and then creates data files in that folder using a Data Flow Task.
I have created a network drive which is mapped to Azure Blob Storage. I pass the network drive path (e.g. Z:\) to the package, and the folder and files are created as expected, which is also reflected in the Azure blob.
Now, when I schedule this package through a SQL Agent job, I get an error from the File System Task saying it cannot find part of the path Z:\folderName. I thought this was because the SQL Server Agent service was not running under my user ID, so I started SQL Server Agent with my credentials, but it still gives me the same error.
Note: My ID doesn't have direct access to the Azure blob, and the network drive is only accessible by my ID.
We are currently using Azure Blob Storage for dev, but we may use a separate server to store files, which is why I cannot use the Flexible File Task available in the SSIS Azure Feature Pack.
We had reasons we could not use UNC paths (\\server\share\file.txt) and struggled with mapped drives as well. The approach we ended up using was to explicitly map and unmap drives as part of our ETL process.
Execute Process Task - Mount drive
We used a batch file to mount the drive, as it made things easier on us, but the command would take the form of something like:
net use Z: \\server\share mypassword /user:username
Execute process task - Unmount Drive
Again, execute process task
net use Z: /delete
We ended up with a precursor step (I don't remember exactly how we accomplished it) that tested whether the drive had been unmounted before we attempted to mount it. I think it was a File System Task to see if the drive existed and, if so, we had the Unmount task duplicated.
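In PowerShell terms (a sketch rather than the exact tasks we used; the share and credentials are placeholders), the check amounts to:

# If Z: is still mapped from a previous run, drop it before mounting again
if (Test-Path -Path 'Z:\') {
    net use Z: /delete /y | Out-Null
}

# Mount the share for this run
net use Z: \\server\share mypassword /user:username | Out-Null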
I was able to resolve this using Credential Manager. I logged in using a domain user and then added the Azure path (e.g. \\server), the user ID and the password to Windows Credentials. I then started the SQL Server Agent service using the same domain user ID, and set up the file path in the SSIS package configuration as \\server\share, matching what I had provided in Windows Credentials. After doing this I was able to successfully execute the package through the SQL Agent job.
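If you'd rather script the Credential Manager entry than add it through the UI, cmdkey can do it (the server, user ID and password below are placeholders); run it while logged on as the account that will run SQL Server Agent, since the entry is per-user:

cmdkey /add:server /user:userid /pass:password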
I was able to solve this issue using a Script Task (C# code) to move the data. I generated a project parameter for the target share folder and used it as a ReadOnlyVariable for the Script Task. The target share folder had already been mapped as a network drive on my local PC as well as on the application server.
I have a server on which I'm running a SQL Server Express DB, and an Azure blob to which I upload a backup of the SQL Server each morning.
Now, I've been able to automate the backup via a mix of a SQL query and a batch file, and I have it scheduled in Task Scheduler to run each night at 9:00 PM, but I would also like to move a copy of the backup from the server to Azure Storage.
I've already tried a batch file in Task Scheduler:
@echo off
copy "C:\Program Files\Microsoft SQL Server\MSSQL14.SQLEXPRESS\MSSQL\Backup\DailyStep_bck.bak" "Z:\Backup\SQLBackup\"
But it doesn't work by itself, only if I run it manually.
Each day the current copy should replace the older one; I don't need retention of old backups for now.
I've also tried robocopy and it doesn't work either... could someone tell me what I am missing?
The task runs as administrator with the "Run whether user is logged on or not" option.
Thanks for your help.
I'll summarize the solution below.
Windows Task Scheduler doesn't have permissions on the mounted Azure file share drive, so we need to use the full path https://[Azure_account].file.core.windows.net/storage/Backup to access the Azure file share instead. Besides, if you want to automate the upload, you can use azcopy.
For example:
Create a SAS token for the file share
Script
azcopy copy 'C:\Program Files\Microsoft SQL Server\MSSQL14.SQLEXPRESS\MSSQL\Backup\*.bak' 'https://[Azure_account].file.core.windows.net/storage/Backup/SQLBackup/[backup_file].bak?<sas token>'
Create a scheduled task
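For the scheduled task step, one way is to register it from PowerShell; the script path, task name and time below are assumptions for illustration, with the .bat file wrapping the azcopy command above:

$action  = New-ScheduledTaskAction -Execute 'C:\Scripts\upload-backup.bat'
$trigger = New-ScheduledTaskTrigger -Daily -At '02:00'
Register-ScheduledTask -TaskName 'UploadSqlBackupToAzure' -Action $action -Trigger $trigger -RunLevel Highest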
For more details, please refer to here and here
I'm using EC2 with a Windows Server 2012 R2 instance. I'm trying to upload a large backup of my database (a 15 GB .bak file) to Amazon S3.
I'm using a .bat script that uses dgsync.exe:
SET DGTOOLS_ACCESS_KEY=***
SET DGTOOLS_SECRET_KEY=****
SET DGTOOLS_ENCRYPTION_PASSWORD=***
SET DGTOOLS_DECRYPTION_PASSWORD_0=****
SET DGTOOLS_DECRYPTION_PASSWORD_1=****
dgsync.exe -z -l --dont-delete --rrs "E:/BAK_ASO" "s3://bucket-backup-s3/Backup ATLAS/BAK_ASO/"
It works for my other database (less than 5 GB) but not for the big one (15 GB).
Some people tell me to split my database, but I don't want to corrupt my file. What can I do?
I have the non-enterprise edition of SQL Server 2008. I do nightly backups, manually zip the files and then manually copy them to a remote server.
I need to automate this using batch files. Copying files from server to server is easy but how do I automate the zipping of the backup first?
The full process I need is:
Run the backup nightly
Zip the backup to reduce size (with a unique zip filename)
Move the zip file to a remote server which is set up as a network drive on the database server
I confess the compression part has thrown me off. Any advice would be very much welcomed.
Thanks in advance.
You can back up databases with the SQLBackupAndFTP software. It's a simple UI tool with the ability to execute and schedule backup jobs (full, differential and transaction log backups). It compresses backups with an embedded archiver or 7-Zip and sends them to a local folder, a NAS drive, FTP, or a cloud (Dropbox, Google Drive, Amazon S3). There is also a support forum.
You could (and should!) most definitely investigate the SQL Server maintenance plans.
These allow you to automate things like
checking for database consistency
rebuilding indexes as needed
doing database and log backups (definitely use the SQL Server 2008 backup compression!!)
I'm not sure if they have built-in support for zipping and copying to a remote server, but you could definitely automate the backup part with a maintenance plan, and the rest with a command file of some sort.
You do not specify the zip utility that you are using. There are many, but I tend to use WinZip, as that is the main zip tool used at work. WinZip has a command line interface (http://www.winzip.com/prodpagecl.htm), a free add-in to WinZip that can be called from a command line.
Another alternative would be to use Cygwin and tar.gz via the command line.
If you are just stuck on how to compress from a batch script:
Install 7-Zip
Run from the command line:
"C:\Program Files\7-Zip\7z.exe" a -t7z MyBackups.7z [Files To Zip]
To get a unique filename, I usually embed the date/time: yyyymmddhhMMss-backup.7z
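For example, a small PowerShell wrapper that builds the timestamped name and calls 7-Zip (the backup folder and file pattern are placeholders):

# Timestamped archive name, e.g. 20240131221500-backup.7z
$name = '{0:yyyyMMddHHmmss}-backup.7z' -f (Get-Date)

# Compress the backup files into the new archive
& 'C:\Program Files\7-Zip\7z.exe' a -t7z "D:\Backups\$name" 'D:\Backups\*.bak'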
You can ZIP stuff from the command line, for example with RAR. Just add the ZIP commands to wherever you do the copying. If that's in T-SQL, you can execute a ZIP command using xp_cmdshell.
For a luxury option, check out Red Gate SQL Backup; it makes this process fairly painless.
Since you've got 2008, you've got PowerShell installed. I would suggest looking at a PowerShell script, executed after a successful backup, to compress and copy the file over the wire. This would most likely be a second job step after your backup.
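A minimal sketch of that kind of step, using the newer Compress-Archive cmdlet for the zip (the paths and server name are assumptions, and on the PowerShell that shipped around 2008 you would shell out to a zip tool instead):

# Zip the latest backup with a unique, timestamped name
$stamp   = Get-Date -Format 'yyyyMMddHHmmss'
$archive = "D:\Backups\$stamp-backup.zip"
Compress-Archive -Path 'D:\Backups\MyDb.bak' -DestinationPath $archive

# Copy the zip to the remote server's share
Copy-Item -Path $archive -Destination '\\RemoteServer\SqlBackups\'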
You can also go old-school and write a batch file to do the compress and copy, then invoke that as a CmdExec job step, again after your backup job/step.
If you are a programmer, you can make an application that takes your DB backup via SMO and zips the file to a .gz file using available libraries.
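For instance, a rough PowerShell sketch of the SMO part (the instance, database and file names are made up, and the SMO assemblies are assumed to be available, e.g. via the SqlServer module):

Import-Module SqlServer   # provides the SMO assemblies

$server = New-Object Microsoft.SqlServer.Management.Smo.Server 'localhost\SQLEXPRESS'
$backup = New-Object Microsoft.SqlServer.Management.Smo.Backup
$backup.Action   = [Microsoft.SqlServer.Management.Smo.BackupActionType]::Database
$backup.Database = 'MyDb'
$backup.Devices.AddDevice('D:\Backups\MyDb.bak', [Microsoft.SqlServer.Management.Smo.DeviceType]::File)
$backup.SqlBackup($server)

# Compress the .bak afterwards, e.g. via System.IO.Compression.GZipStream to get a .gz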
Try this link:
http://www.sqlhub.com/2009/05/copy-files-with-sql-server-from-one.html
In short:
1 - You must enable "Ole Automation Procedures" (see the sp_configure sketch after these steps)
2 - Modify and run this script to test:
DECLARE @FsObjId INTEGER
DECLARE @Source VARCHAR(4096)
DECLARE @Destination VARCHAR(4096)
SET @Source = 'C:\ritesh'
SET @Destination = 'D:\ritesh'
--create OLE Automation instance
EXEC sp_OACreate 'Scripting.FileSystemObject', @FsObjId OUTPUT
--call method of OLE Automation
EXEC sp_OAMethod @FsObjId, 'CopyFolder', NULL, @Source, @Destination
--once you finish the copy, destroy the object
EXEC sp_OADestroy @FsObjId
3 - Create a maintenance plan:
3.1 - Add a "Back Up Database Task" and make sure to choose "Set backup compression" = Compress backup (this will create your backups in a compressed format).
3.2 - Add an "Execute T-SQL Statement Task" containing the above script, set to run after the 3.1 task, which will move your files :).
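For step 1, Ole Automation Procedures can be enabled with sp_configure; a sketch run via Invoke-Sqlcmd (the instance name is a placeholder):

$tsql = @"
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ole Automation Procedures', 1;
RECONFIGURE;
"@

Invoke-Sqlcmd -ServerInstance 'MyServer\MyInstance' -Query $tsql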
Try SQL Backup Master, which can zip backups and move them to a network (or local) folder. It can also move zipped backup files to FTP, Dropbox, Amazon S3, or Google Drive. The basic edition is free.
Zipping the file after the backup takes significant time. Backup programs which use the Virtual Device Interface (VDI) of SQL Server solve your task and decrease the overall process time. Try EMS SQL Backup, which also allows sending compressed backups to network locations, FTP, or clouds.
You can use Backup Assistant; it could be a solution for this.
You need to set a schedule for this program (for example, every hour).
Then you need to configure your backup file paths and your FTP credentials in appsettings.json:
{ "AppSettings": {
"BackupPaths": [
{
"LocalPath": "C:\\Users\\Sinan\\Desktop\\FtpBackup\\TestDb",
"RemotePath": "TestDb"
},
{
"LocalPath": "C:\\Users\\Sinan\\Desktop\\FtpBackup\\TestHangFireDb",
"RemotePath": "TestHangFireDb"
},
{
"LocalPath": "C:\\Users\\Sinan\\Desktop\\FtpBackup\\TestLogDb",
"RemotePath": "TestLogDb"
}
],
"BackupFileExtensions": [ ".bak" ],
"DeleteFilesAfterSend": true,
"ZipFilesBeforeSend": true,
"DeleteZipFilesAfterSend": false,
"WriteLog": true,
"Providers": {
"FtpServer": {
"Enabled": true,
"Host": "ftphost",
"Port": "21",
"Username": "ftpusername",
"Password": "ftppassword"
}
} } }