I have a 200 GB table in an Azure SQL database. I want to back this table up to free up some space. I can't do a BCP out, since this is also the realtime database. I am approaching the 1 TB limit, so I need to do this soon.
How do I proceed? Note: this is SQL Azure, not SQL on-prem.
Ideally, I'd like to be able to pipe the data out from this table into some other form of cloud storage. I don't want to drop the table. I don't want to delete the data permanently.
I have a client for whom I built a portal that manages the documents used by their company. The documents are stored in SQL Server. This all works great.
Now, a few years later, there are 130,000 documents, most of which are no longer needed. The database is up to 200 GB, and Azure SQL Database gets expensive above 250 GB.
However, they don't want to just delete the old documents, as on occasion, they are needed. So what are my choices? They are creating about 50,000 documents per year.
Just let the database grow larger and pay the price?
Somehow save them to a disk somewhere? Seems like 130,000 documents in storage is going to be a task to manage in itself.
Save the current database somewhere offline? But accessing the documents off the database would be difficult.
Rewrite the app to NOT store the files in SQL Server, and instead save/retrieve from a storage location.
Any ideas welcome.
Export the database to Azure Blob Storage and move the backup to the archive tier, which costs less and is easy to import back into a database when required. Delete the records from the database afterwards.
Click the Export option for your SQL database on the Azure SQL server.
Select the storage account in Azure where you want to store the backup file.
Once the backup is available in the storage account, change the access tier to Archive by following this tutorial - Archive an existing blob.
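If you want to script that last step rather than click through the portal, here is a minimal sketch using the azure-storage-blob Python SDK; the connection string, container name and blob name are placeholders for whatever your export produced:

```python
# Minimal sketch: move an exported .bacpac to the Archive tier so it costs less.
# Connection string, container and blob names are placeholders.
from azure.storage.blob import BlobServiceClient, StandardBlobTier

conn_str = "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net"

service = BlobServiceClient.from_connection_string(conn_str)
blob = service.get_blob_client(container="sql-backups", blob="mydb-export.bacpac")

# Archive-tier blobs are cheap to keep, but must be rehydrated (set back to Hot
# or Cool) before they can be read again for an import.
blob.set_standard_blob_tier(StandardBlobTier.ARCHIVE)
```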
I am trying to download my SQL Server database (which is more than 40 GB) from the production server to my local machine. I only need the schema and some of the data, since downloading a 40 GB backup file and restoring it is a really tough task for me.
I have tried using Generate Scripts to obtain the schema, and that was successful. But I am not sure how to approach getting the data (say, approximately the first 500 rows) of all the tables.
Please let me know if there is any other way to achieve this.
I am using Microsoft's SQL Server Version 12.0.xxx.
Thanks
SQL Server Management Studio provides a wizard that enables you to generate scripts not only for the metadata (or schema) but also for the data within the database.
Please refer to Script Data in SQL Server.
But if your database is very big, the script file will be huge.
Unfortunately, this wizard does not provide a parameter to script only the first 500 rows of each table.
Besides, if you have foreign keys and constraints in your table definitions, the first 500 rows might not be enough. You need every referenced lookup row in your database in order to insert data into your transactional tables, and you need the parent rows for the child data.
This forces you to write a smarter script for data extraction.
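As a starting point for such a script, here is a minimal sketch with pyodbc that pulls just the first 500 rows of every user table into CSV files (the server, database and credentials are placeholders). It does not resolve the foreign-key problem mentioned above, so referenced lookup tables may still need to be exported in full:

```python
# Rough sketch: export the first 500 rows of every user table to CSV files.
# Server, database and credentials are placeholders.
import csv
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=prod-server;DATABASE=ProdDb;UID=readonly_user;PWD=<password>"
)
cur = conn.cursor()

# Enumerate user tables from the catalog views.
cur.execute("SELECT SCHEMA_NAME(schema_id) AS sch, name FROM sys.tables ORDER BY sch, name")
tables = cur.fetchall()

for sch, name in tables:
    rows = cur.execute(f"SELECT TOP (500) * FROM [{sch}].[{name}]").fetchall()
    cols = [d[0] for d in cur.description]
    with open(f"{sch}.{name}.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(cols)
        writer.writerows(rows)
```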
Architectural/perf question here.
I have an on-premises SQL Server database which has ~200 tables of ~10 TB in total.
I need to make this data available in Azure in Parquet format for Data Science analysis via HDInsight Spark.
What is the optimal way to copy/convert this data to Azure (Blob storage or Data Lake) in Parquet format?
Due to the manageability aspect of the task (since there are ~200 tables), my best shot was: extract the data locally to a file share via sqlcmd, compress it as csv.bz2, and use Data Factory to copy the file share (with 'PreserveHierarchy') to Azure. Finally, run pyspark to load the data and then save it as .parquet.
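That last pyspark step would look something like this rough sketch (the storage account, container names and paths are placeholders, not my real setup):

```python
# Rough sketch of the csv.bz2 -> Parquet conversion step; container/account names
# and paths are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("csv-to-parquet").getOrCreate()

src = "wasbs://staging@mystorageaccount.blob.core.windows.net/dbo.MyTable/*.csv.bz2"
dst = "wasbs://curated@mystorageaccount.blob.core.windows.net/parquet/dbo.MyTable"

# Spark reads bzip2-compressed CSV transparently. inferSchema is convenient, but
# for ~200 tables an explicit schema (generated from sys.columns) would be safer.
df = spark.read.csv(src, header=True, inferSchema=True)
df.write.mode("overwrite").parquet(dst)
```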
Given the table schema, I can auto-generate the SQL data-extract and Python scripts from the SQL database via T-SQL.
Are there faster and/or more manageable ways to accomplish this?
ADF matches your requirement perfectly, with both one-time and schedule-based data movement.
Try the Copy Wizard of ADF. With it, you can directly move on-prem SQL data to Blob/ADLS in Parquet format with just a couple of clicks.
Copy Activity Overview
Is there any way to transfer a large volume of data from Azure SQL to on-premises SQL Server 2016 Enterprise/Standard? The requirements are as follows:
Weekly full database transfer
Daily delta transfer before midnight
I read about SSIS for Azure Blob Storage but am not sure whether it is applicable to this context.
Updated: I found an article on Azure Data Sync; according to that article, it seems doable. Please share your experiences. That would be extremely helpful.
https://www.mssqltips.com/sqlservertip/3062/understanding-sql-data-sync-for-sql-server/
Weekly full database transfer
SSIS doesn't provide a way to do a full transfer of data (I mean a backup), unless you want to truncate and insert from the source.
For the weekly full database transfer, I would go with the SQL Azure Export/Import functionality.
Refer to the links below for more details:
1. https://github.com/richorama/SQLDatabaseBackup
2. I need to automate SQL Azure database backup in SQL Script files. How can I do so?
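If you prefer to script the weekly export yourself rather than use the tools above, one option (not mentioned in the question) is to call sqlpackage from a small script; the server name, credentials and paths below are placeholders, and the resulting .bacpac can later be imported into the on-premises SQL Server 2016 instance:

```python
# Sketch: export the Azure SQL database to a .bacpac with sqlpackage, which can
# later be imported on-premises (/Action:Import). All values are placeholders.
import subprocess

subprocess.run(
    [
        "sqlpackage",
        "/Action:Export",
        "/SourceServerName:myserver.database.windows.net",
        "/SourceDatabaseName:MyDb",
        "/SourceUser:export_user",
        "/SourcePassword:<password>",
        "/TargetFile:/backups/MyDb.bacpac",
    ],
    check=True,
)
```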
Daily delta transfer before midnight
You will need a way to identify the delta, so create a table with all the table names and the last run time.
Then create a console application that uses bulk insert functionality, takes the above table as its base, and inserts into the on-premises database.
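The console application above would typically use SqlBulkCopy; here is the same idea as a hedged Python sketch, assuming a dbo.SyncLog tracking table (TableName, LastRunTime) on the on-premises side and a ModifiedDate column on each source table (both names are assumptions, not part of the question):

```python
# Sketch of the daily delta copy. dbo.SyncLog (TableName, LastRunTime) and the
# ModifiedDate column are assumed; connection details are placeholders.
import pyodbc

azure = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=MyDb;UID=sync_user;PWD=<password>"
)
onprem = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=onprem-sql;DATABASE=MyDb;Trusted_Connection=yes"
)

src, dst = azure.cursor(), onprem.cursor()
dst.fast_executemany = True  # speeds up the bulk-style insert

for table, last_run in dst.execute("SELECT TableName, LastRunTime FROM dbo.SyncLog").fetchall():
    rows = src.execute(f"SELECT * FROM {table} WHERE ModifiedDate > ?", last_run).fetchall()
    if rows:
        marks = ", ".join("?" * len(rows[0]))
        # Note: tables with IDENTITY columns need SET IDENTITY_INSERT or an
        # explicit column list; omitted here to keep the sketch short.
        dst.executemany(f"INSERT INTO {table} VALUES ({marks})", [tuple(r) for r in rows])
    dst.execute("UPDATE dbo.SyncLog SET LastRunTime = SYSUTCDATETIME() WHERE TableName = ?", table)
    onprem.commit()
```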
I have two databases, one on a remote server and the other local (SQL Server 2008).
The database on my local server has the entire structure set up but no data. I would like to copy the data from the remote server to my server, and I am wondering about the best method to do this.
The main issue I am experiencing is that the user I have for the remote database has limited permissions. I cannot read the stored procedures or user-defined functions, so when I use the Import/Export wizard I do not get the schema, etc. So a regular dump/restore is not working for me, as it restores the tables without the primary keys/foreign keys and the stored procedures.
I'd like to do this,
INSERT INTO localtable SELECT * FROM remotedb.table
I was having issues because of the IDENTITY fields, and I had to explicitly name all of the columns. Also, I am not sure whether SQL Server Management Studio lets you work with two different databases, remote and local, at the same time, so I was looking for any advice.
I have also tried applications like SQL FTP and Backup and it fails because it runs out of memory (I have 16GB of memory on the machine and the DB is like 4GB). I also can use the SQL Server import/export wizard but then I don't get the schema information. I also tried SQL Compare from Red Gate and it runs into issues with the permissions. Unfortunately I do not have the time to request and gain access to a new user so I was hoping someone had a creative idea.
You can definitely use SQL Server backups for this. A backup will not run out of memory; if it appears to, please tell us the message (because you are likely misinterpreting it). This is the fastest and most complete solution.
You can tell the export wizard to also script the schema. It is hidden under "advanced" somewhere (terrible UI). But the script will be extremely big and I know of no way to execute it.
You can drop all schema objects except the PKs in the target database. Then you can use remote queries to copy all the data over. You will not get any problems with foreign keys and identity columns if you drop them beforehand. After you are done, you can recreate all those objects. It is probably best if you use a transaction for all of this, because that way you get consistent source data from a single point in time.
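A hedged sketch of that third option in Python with pyodbc, assuming the schema (minus the dropped objects) already exists locally; the table list and connection details are placeholders, and tables without an identity column should skip the IDENTITY_INSERT statements:

```python
# Sketch: copy data table-by-table from the remote database to the local one.
# Table list and connection details are placeholders; list parents before children.
import pyodbc

remote = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=remote-server;DATABASE=SourceDb;UID=limited_user;PWD=<password>"
)
local = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=TargetDb;Trusted_Connection=yes"
)

src, dst = remote.cursor(), local.cursor()
dst.fast_executemany = True

for table in ["dbo.Customers", "dbo.Orders"]:
    rows = src.execute(f"SELECT * FROM {table}").fetchall()
    if not rows:
        continue
    cols = ", ".join(f"[{d[0]}]" for d in src.description)
    marks = ", ".join("?" * len(src.description))
    # SET IDENTITY_INSERT only works on tables that actually have an identity column.
    dst.execute(f"SET IDENTITY_INSERT {table} ON")
    dst.executemany(f"INSERT INTO {table} ({cols}) VALUES ({marks})", [tuple(r) for r in rows])
    dst.execute(f"SET IDENTITY_INSERT {table} OFF")
    local.commit()
```

For a database of a few GB you would probably batch the reads with fetchmany() instead of pulling each table into memory at once, but the overall flow stays the same.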