Upload large file (sqldatabase.bak 15gb) to Amazon S3 - sql-server

I'm using EC2 with a Windows Server 2012 R2 instance. I'm trying to upload a large backup of my database (15gb .bak file) to Amazon S3.
I'm using a .bat script that uses dgsync.exe:
SET DGTOOLS_ACCESS_KEY=***
SET DGTOOLS_SECRET_KEY=****
SET DGTOOLS_ENCRYPTION_PASSWORD=***
SET DGTOOLS_DECRYPTION_PASSWORD_0=****
SET DGTOOLS_DECRYPTION_PASSWORD_1=****
dgsync.exe -z -l --dont-delete --rrs "E:/BAK_ASO" "s3://bucket-backup-s3/Backup ATLAS/BAK_ASO/"
It's working for my other database (less than 5gb) but not for the big one (15gb).
Some people tell me to split my database, but I don't want to corrupt my file. What can I do?
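A single S3 PUT is limited to 5 GB, which is why the smaller backups go through and the 15 GB one fails; the file itself does not need to be split, only the upload does, and a multipart upload reassembles the parts into one object. A rough sketch using the AWS CLI instead of dgsync (assuming the CLI is installed and configured; the chunk size is only an example):
REM Sketch only: "aws s3 cp" switches to multipart upload automatically for large files.
aws configure set default.s3.multipart_chunksize 64MB
aws s3 cp "E:\BAK_ASO\sqldatabase.bak" "s3://bucket-backup-s3/Backup ATLAS/BAK_ASO/sqldatabase.bak" --storage-class REDUCED_REDUNDANCY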

Related

Automate Azure VM server SQL job backup copy to another server?

I have an Azure VM server. On it I have a job set up for automatic backup to Azure local storage. I need to store a copy of that backup on another server. How do I do that? Is there any way to do it automatically?
I am not sure if you can do it directly from one server to another, but you can do it via blob storage. Use AzCopy (https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy) for uploading and downloading files from blobs.
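For example, a single AzCopy command can push the local backup folder to a blob container; a rough sketch in the classic /Source:/Dest: syntax used further down this page (account, container and key are placeholders):
AzCopy /Source:D:\SQLBackups /Dest:https://yourstorageaccount.blob.core.windows.net/backups /DestKey:yourStorageAccountKey /Pattern:"*.bak" /Y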
You can also use Azure File Service to copy the backups for archival purposes. Use the following commands to mount a network drive to archive the backup:
Create a storage account in Windows Azure PowerShell
New-AzureStorageAccount -StorageAccountName "tpch1tbbackup" -Location "West US"
Create a storage context
$storageAccountName = "tpch1tbbackup"
$storageKey = Get-AzureStorageKey $storageAccountName | %{$_.Primary}
$context = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageKey
Create a share
New-AzureStorageShare -Name backup -Context $context
Attach the share to your Azure VM
net use x: \\tpch1tbbackup.file.core.windows.net\backup /u:$storageAccountName $storageKey
Xcopy backup to x: to offload the backup on the mounted drive
xcopy D:\backup\*.* X:\tpchbackup\*.*
Going by your question, you can achieve this in several ways. As @Alberto Morillo and @Vivek said, you can use PowerShell and AzCopy to do that. But if you want the backup and copy to run automatically, you can use a runbook to achieve that.
You can also attach schedules to a runbook; with this, you can back up your resources automatically. Runbooks can run PowerShell cmdlets and provide many features to automate your job.
See more details about runbooks in Azure Automation in this document.
To automate your backup process from one server to another using the Azure Storage service, you have to make three batch files.
The first one takes a backup of your database and stores it locally. Here is the command to do that:
set FileName=DBName_%date:~-4,4%%date:~-10,2%%date:~-7,2%.bacpac
echo %FileName%
"C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin\sqlpackage.exe" /a:Export /ssn:your IP(00:00:00) /sdn:yourdatabaseName /tf:"D:\%FileName%" /su:username /sp:"password"
The second one pushes your locally saved backup file to Azure Storage:
"C:\AzCopy\AzCopy.exe" /Source:D: /Dest:https://youstoragedestination.link/blobname/ /DestKey:yourAzureStoragekey /Pattern:"*.bacpac" /Y
del "d:*.bacpac"
The third batch file calls the two batch files above. For example:
call "yourpath\backupFile.bat"
call "youpath\backupFilepushing2azure.bat"
You can schedule the third batch file to automate the whole process.
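For example, the Windows Task Scheduler can run it nightly (task name, time and path are placeholders):
schtasks /Create /TN "NightlyDbBackupToAzure" /TR "yourpath\thirdBatchFile.bat" /SC DAILY /ST 02:00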
Now you have pushed your backup file to Azure Storage, which I think is enough.
If you really want to save that backup file on another server, then make another batch file that downloads the backup file from the blob to the server using AzCopy.
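The download direction is just the reverse of the upload command above; a rough sketch with the same placeholder names:
"C:\AzCopy\AzCopy.exe" /Source:https://youstoragedestination.link/blobname/ /Dest:D:\Restore /SourceKey:yourAzureStoragekey /Pattern:"*.bacpac" /Y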

How to push data from local SQL Server to Tableau Server on AWS

We are developing Tableau dashboards and deploying the workbooks on an EC2 Windows instance in AWS. One of the data sources is the company SQL Server inside the firewall. The server is managed by IT and we only have read permission to one of the databases. The current solution is to build the workbook locally in Tableau Desktop by connecting to the company SQL Server. Before the workbooks are published to Tableau Server, the data is extracted from the data sources, and the static extracts get uploaded with the workbooks when published.
Instead of linking to static extracted data on Tableau server, we would like to set up a database on AWS (e.g. Postgresql), probably on the same instance and push the data from company SQL server to AWS database.
There may be a way to push directly from SQL Server to Postgres on AWS. But since we don't have much control over the server, and the IT folks are probably not willing to push data externally, this is not an option. What I can think of is as follows:
Set up Postgres on AWS instance and create the tables with same schemas as the ones in SQL server.
Extract data from SQL server and save as CSV files. One table per file.
Enable file system sharing on AWS windows instance. So the instance can read files from local file system directly.
Load data from CSV to Postgres tables.
Set up the data connection on Tableau Server on AWS to read data from Postgres.
I don't know if others have come across a situation like this and what their solutions are, but I think this is not an uncommon scenario. One change would be to have both the local Tableau Desktop and the AWS Tableau Server connect to Postgres on AWS. Not sure if local Tableau could access Postgres on AWS though.
We also want to automate the whole process as much as possible. On the local server, I can probably run a Python script as a cron job to frequently export data from SQL Server and save it to CSVs. On the server side, something similar would run to load the data from CSV into Postgres. If the files are big, though, it may be pretty slow to import data from CSV into Postgres. But there seems to be no better way to transfer files from the local machine to the AWS EC2 instance programmatically, since it is a Windows instance.
I am open to any suggestions.
A. Platform choice
If you use a database other than SQL Server on AWS (say Postgres), you need to perform one (or maybe two) conversions:
In the integration from an on-prem SQL Server to the AWS database, you need to map SQL Server datatypes to Postgres datatypes
I don't know much about Tableau, but if it is currently pointing at SQL Server, you probably need some kind of conversion to point it at Postgres
These two steps alone might make it worth your while to investigate a SQL Express RDS. SQL Express has no licensing cost but obviously Windows does. You can also run SQL Express on Linux, which would have no licensing costs, but would require a lot of fiddling about to get running (i.e. I doubt there is a SQL Express Linux RDS available)
B. Integration Approach
Any process external to your network (i.e. in the cloud) that is pulling data from your network will need the firewall opened. Assuming this is not an option, that leaves only push-from-on-prem options.
Just as an aside on this point, Power BI achieves its desktop data integration by using a desktop 'gateway' that coordinates data transfer, meaning that cloud Power BI doesn't need to open a port to get what it needs; it uses the desktop gateway to push it out.
Given that we only have push options, we need something on-prem to push data out. Yes, this could be a cron job on Linux or a Windows scheduled task. Please note, this is where you start creating shadow IT.
To get data out of SQL Server to be pushed to the cloud, the easiest way is to use BCP.EXE to generate flat files. If these are going into a SQL Server, they should be native format (to save complexity). If they are going to Postgres, they should be tab delimited.
If these files are being uploaded to SQL Server, then it's just another BCP command to push the native files into tables in SQL Server (prior to this you need to run a SQLCMD.EXE command to truncate the target tables).
So for three tables, assuming you'd installed the free* SQL Server client tools, you'd have a batch file something like this:
REM STEP 1: Clear staging folder
DEL /Q C:\Staging\*.TXT
REM STEP 2: Generate the export files
BCP database.dbo.Table1 OUT C:\Staging\Table1.TXT -T -S LocalSQLServer -N
BCP database.dbo.Table2 OUT C:\Staging\Table2.TXT -T -S LocalSQLServer -N
BCP database.dbo.Table3 OUT C:\Staging\Table3.TXT -T -S LocalSQLServer -N
REM STEP 3: Clear target tables
REM Your SQL RDS is unlikely to support single sign on
REM so need to use user/pass here
SQLCMD -U username -P password -S RDSSQLServerName -d databasename -Q"TRUNCATE TABLE Table1; TRUNCATE TABLE Table2; TRUNCATE TABLE Table3;"
REM STEP 4: Push data in
BCP database.dbo.Table1 IN C:\Staging\Table1.TXT -U username -P password -S RDSSQLServerName -N
BCP database.dbo.Table2 IN C:\Staging\Table2.TXT -U username -P password -S RDSSQLServerName -N
BCP database.dbo.Table3 IN C:\Staging\Table3.TXT -U username -P password -S RDSSQLServerName -N
(I'm pretty sure that BCP and SQLCMD are free... not sure but you can certainly download the free SQL Server tools and see)
If you wanted to push to Postgres SQL instead,
in step 2, you'd need to replace the -N option with -c, which makes the files plain tab-delimited text, readable by anything
in step 3 and step 4 you'd need to use the associated Postgres command line tool, psql (see the sketch below), but you'd need to deal with data types etc. (which can be a pain - ambiguous date formats alone are always a huge problem)
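A rough sketch of what steps 3 and 4 might look like against Postgres with psql (host, user, database and table names are placeholders; BCP's -c output is tab delimited, which matches COPY's default text format):
REM Postgres variant of steps 3 and 4 (psql client installed; names are placeholders)
SET PGPASSWORD=password
psql -h RDSPostgresHost -U username -d databasename -c "TRUNCATE TABLE table1; TRUNCATE TABLE table2; TRUNCATE TABLE table3;"
psql -h RDSPostgresHost -U username -d databasename -c "\copy table1 FROM 'C:/Staging/Table1.TXT'"
psql -h RDSPostgresHost -U username -d databasename -c "\copy table2 FROM 'C:/Staging/Table2.TXT'"
psql -h RDSPostgresHost -U username -d databasename -c "\copy table3 FROM 'C:/Staging/Table3.TXT'"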
Also note here the AWS RDS instance is just another database with a hostname, login, password. The only thing you have to do is make sure the firewall is open on the AWS side to accept incoming connections from your IP Address
There are many more layers of sophistication you can build into your integration: differential replication, retries etc. but given the 'shadow IT status' this might not be worth it
Also be aware that I think AWS charges for data uploads, so if you are replicating a 1G database everyday, that's going to add up. (Azure doesn't charge for uploads but I'm sure you'll pay in some other way!)
For this type of problem I would strongly recommend use of SymmetricDS - https://www.symmetricds.org/
The main caveat is that the SQL Server would require the addition of some triggers to track changes but at that point SymmetricDS will handle the push of the data.
An alternative approach, similar to what you suggested, would be to have a script export the data into CSV files, upload them to S3, and then have a bucket event trigger on the S3 bucket that kicks off a Lambda to load the data when it arrives.

Get local copy of SQL Server hosted on Amazon RDS

I have a small (few hundred MB) SQL Server database running on RDS. I've spent several hours trying to get a copy of it onto my local SQL Server 2014 instance. All of the following fail. Any ideas what might work?
Task -> Backup fails because it doesn't give my admin account permission to backup to a local drive.
Copy Database fails during create package with the error "While trying to find a folder on SQL an OLE DB error was encountered with error code 0x80040E4D".
From SSMS, while connected to the RDS server, running BACKUP DATABASE. This fails with message BACKUP DATABASE permission denied in database 'MyDB'. Even after running EXEC sp_addrolemember 'db_backupoperator' for the connected user.
The Generate Scripts task generates a 700MB .sql file. Running that with sqlcmd -i fails at some point after producing plausible .mdf and .ldf files that can't be mounted on the local server (probably because the sqlcmd failed to complete and unlock them).
AWS has finally provided a reasonably easy means of doing this: It requires an S3 bucket.
After creating a bucket called rds-bak I ran the following stored procedure in the RDS instance:
exec msdb.dbo.rds_backup_database
@source_db_name='MyDatabase',
@s3_arn_to_backup_to='arn:aws:s3:::rds-bak/MyDatabase.bak',
@overwrite_S3_backup_file=1;
The following stored procedure returns the status of the backup request:
exec msdb.dbo.rds_task_status @db_name='MyDatabase'
Once it finished I downloaded the .bak file from S3 and imported it into a local SQL Server instance using the SSMS Restore Database... wizard!
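The last two steps can also be done from the command line instead of the wizard; a rough sketch (local paths and logical file names are placeholders; RESTORE FILELISTONLY shows the real logical names):
REM Sketch: download the backup with the AWS CLI and restore it locally
aws s3 cp s3://rds-bak/MyDatabase.bak C:\Temp\MyDatabase.bak
SQLCMD -S localhost -E -Q "RESTORE FILELISTONLY FROM DISK='C:\Temp\MyDatabase.bak'"
SQLCMD -S localhost -E -Q "RESTORE DATABASE MyDatabase FROM DISK='C:\Temp\MyDatabase.bak' WITH MOVE 'MyDatabase' TO 'C:\Data\MyDatabase.mdf', MOVE 'MyDatabase_log' TO 'C:\Data\MyDatabase_log.ldf'"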
The SSIS Import Export Wizard can generate a package to duplicate a whole set of tables. (It's not the sort of Copy Database function that relies on files - it makes a package with data flow components for each table.)
It's somewhat brittle but can be made to work :-)
The SSMS Generate Scripts feature can often fail with any large data set, as the script for all the data is just too large/verbose. This method never scripts out the data.
Check this out: https://github.com/andrebonna/RDSDump
It is a C#/.NET console application that searches for the latest origin database snapshot, restores it on a temporary RDS instance, generates a BACPAC file, uploads it to S3 and deletes the temporary RDS instance.
You can transform your RDS snapshot into a BACPAC file, which can be downloaded and imported onto your local SQL Server 2014 instance using the feature answered here (Azure SQL Database Bacpac Local Restore).
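Importing the downloaded BACPAC locally is a single sqlpackage.exe call; a rough sketch (the sqlpackage path is the same one used earlier on this page, and the file and server names are placeholders):
"C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin\sqlpackage.exe" /a:Import /sf:"C:\Temp\MyDatabase.bacpac" /tsn:localhost /tdn:MyDatabase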
Redgate's SQL Compare and SQL Data Compare are invaluable for these types of things. They are not cheap (but worth every penny imo). But if this is a one-time thing, you could use the 14 day trial and see how it behaves for you.
http://www.red-gate.com/products/

Upload Images to Azure Storage Using Portal (not programmatically)

I need a SQL Server database that stores images, and their name, category, etc., so the SQL table will have 5 or so columns. I'm using Azure as my SQL Server host. It appears I cannot insert image data into my VARBINARY(MAX) column from SQL Server Management Studio, which was my first plan, because I cannot give my user permission to use BULK LOAD; Azure SQL seems to make this impossible. I think I need to use Azure Storage and then, in the SQL Server database, just store a link to the image.
To be clear, I want the images in the database already, I do not want to add them from within the application I am developing. The application I'm developing will only download the images to the device, not upload them.
So How do I upload the images to Azure Storage using the portal, not using code?
So how do I upload the images to Azure Storage using the portal, not using code?
Short Answer
You cannot. Neither the old portal nor the new one has a way to upload an image to a storage container.
Alternative
Use the AzCopy Command-Line Utility by Microsoft. It allows you to do what you want with just two command lines. There is a terrific tutorial here.
First, download and install the utility. Second, open a command prompt and navigate to the AzCopy install directory. Third, upload a file to your storage account. Here are the second and third steps.
> cd C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy
> AzCopy /Source:folder /Dest:account /DestKey:key /Pattern:file
And here are what the parameters mean.
Source The folder on your computer that contains the images to upload.
Dest The address of the storage container at which to store the images.
DestKey The primary access key for your storage account.
Pattern The name of the file to upload (or a pattern).
Example
This uploads an image named my-cat.png from the C:\temp folder on my computer to a storage container called mvp1. If you wanted to upload all the PNG images in that folder, you could replace my-cat.png with *.png and it would upload them all.
AzCopy /Source:C:\temp /Dest:https://my.blob.core.windows.net/mvp1 /DestKey:tLlbC59ggDdJ+Dg== /Pattern:my-cat.png
You might also want to take a look at the answers to this question: How do I upload some file into Azure blob storage without writing my own program?

I have an 18MB MySQL table backup. How can I restore such a large SQL file?

I use a Wordpress plugin called 'Shopp'. It stores product images in the database rather than the filesystem as standard; I didn't think anything of this until now.
I have to move server, and so I made a backup, but restoring the backup is proving a horrible task. I need to restore one table called wp_shopp_assets which is 18MB.
Any advice is hugely appreciated.
Thanks,
Henry.
For large operations like this it is better to go to command line. phpMyAdmin gets tricky when lots of data is involved because there are all sorts of timeouts in PHP that can trip it up.
If you can SSH into both servers, then you can do a sequence like the following:
Log in to server1 (your current server) and dump the table to a file using "mysqldump" --- mysqldump --add-drop-table -uSQLUSER -pPASSWORD -hSQLSERVERDOMAIN DBNAME TABLENAME > BACKUPFILE
Do a secure copy of that file from server1 to server2 using "scp" ---
scp BACKUPFILE USER@SERVER2DOMAIN:FOLDERNAME
Log out of server 1
Log into server 2 (your new server) and import that file into the new DB using "mysql" --- mysql -uSQLUSER -pPASSWORD DBNAME < BACKUPFILE
You will need to replace the UPPERCASE text with your own info. Just ask in the comments if you don't know where to find any of these.
It is worthwhile getting to know some of these command line tricks if you will be doing this sort of admin from time to time.
try HeidiSQL http://www.heidisql.com/
connect to your server and choose the database
go to menu "import > Load sql file" or simply paste the sql file into the sql tab
execute sql (F9)
HeidiSQL is an easy-to-use interface and a "working-horse" for web-developers using the popular MySQL-Database. It allows you to manage and browse your databases and tables from an intuitive Windows® interface.
EDIT: Just to clarify. This is a desktop application, you will connect to your database server remotely. You won't be limited to php script max runtime, or upload size limit.
Use BigDump.
Create a folder on your server which is not easy to guess, like "BigDump_D09ssS" or whatever.
Download the http://www.ozerov.de/bigdump.php importer file and add it to that directory after reading the instructions and filling out your config information.
FTP the .sql file to that folder alongside the bigdump script, then go to your browser and navigate to that folder.
Selecting the file you uploaded will start importing the SQL in split chunks, which is a much faster method!
Or, if this is an issue, I recommend the other answer about SSH and the mysql -u -p -n -f method!
Even though this is an old post, I would like to add that it is recommended not to use database storage for images when you have more than about 10 product images.
Instead of exporting and importing such a huge file, it would be better to switch the Shopp installation to file storage for images before the transfer.
You can use this free plug-in to help you. Always backup your files and database before performing this action.
What I do is open the file in a code editor, then copy and paste it into a SQL window within phpMyAdmin. Sounds silly, but I swear by it for large files.
