Moving data between TSQL servers without backup or BCP - sql-server

I have a very large database with nearly 21 million rows spread across various tables. It took months to fill, and I've been told that I now have to move it.
On the server it is on right now, I have VPN access and can work with the HDD at will. It runs Microsoft SQL Server 2014.
The server I need to transfer it to is one of those SQL-only servers that do not allow VPN or HDD access of any kind. It does have an FTP site I can upload .bak files to, and a way through their interface to restore .bak files to a database; however, this will not work because that server runs Microsoft SQL Server 2012, and a backup taken on 2014 cannot be restored to an older version.
Since I do not have access to the HDD, I have also ruled out bulk file commands (BCP). I may be able to perform linked server operations if I open the right ports, but I am afraid the transfer would take a week that way.
Does anyone know of any other options I can try here?

Perhaps you can restore the DB locally in your environment as a 2014 database. Then you can ETL the data (SQL Import/Export utility, DDL/DML scripts) into a local 2012 instance, back that up natively (2012), upload the .bak to your provider, and restore it there. This assumes you're not using any 2014 functionality that is unavailable in 2012.
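A minimal sketch of the DDL/DML route, assuming you stand up a local 2014 instance and a local 2012 instance side by side; the instance, database, table, and column names below are hypothetical:

-- Run on the local 2012 instance; LOCALHOST\SQL2014, SourceDb, TargetDb and dbo.BigTable are placeholders.
EXEC sp_addlinkedserver
    @server = N'LOCALHOST\SQL2014',
    @srvproduct = N'SQL Server';

EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'LOCALHOST\SQL2014',
    @useself = N'True';   -- reuse the current Windows login

-- Copy one table at a time (repeat or generate per table);
-- use SET IDENTITY_INSERT ... ON first if the target column is an identity.
INSERT INTO TargetDb.dbo.BigTable WITH (TABLOCK) (Id, Payload)
SELECT Id, Payload
FROM [LOCALHOST\SQL2014].SourceDb.dbo.BigTable;

Once the 2012 copy is loaded, back it up with a plain BACKUP DATABASE and upload that .bak to the provider's FTP site.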

Related

How To Take SQL Server Backup From One Server To Another Server

Folks,
I have multiple SQL Servers, and I want to take backups from one server to another server. Actually, I want to store all the backup files on a single server, without moving or copying/pasting the backup files manually.
I also searched Google but didn't find an exact idea of how to do this, and I don't know whether any security issues are involved.
FYI, SQL Server Management Studio is installed on all the servers.
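One common approach is sketched below, assuming the SQL Server service account on each source server can write to a shared folder on the central server; the server, share, and database names are hypothetical:

-- Run (or schedule as a SQL Agent job) on each source server.
-- \\CentralServer\SqlBackups is a placeholder UNC share on the server that should hold all the backups.
BACKUP DATABASE SalesDb
TO DISK = '\\CentralServer\SqlBackups\SalesDb_Full.bak'
WITH INIT, STATS = 10;

Because each server writes its backup straight to the central share, nothing has to be moved or copied by hand afterwards; the main security point is granting the SQL Server service accounts write permission on that share.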

Is it possible to restore a SQL Server Standard 2014 backup to Azure SQL?

We currently have a SQL Server Standard 2014 database on one of our servers that is backed up daily to Azure Blob Storage. Those backups are working well and have restored beautifully to the original server in manual tests.
However, to ensure that our backups continue to be valid, I want to put in place some sort of automated restore testing. Due to performance/disk constraints, I'd rather not do this automated testing on our primary database server. But we can't spend the money to buy more SQL Server Standard licenses to set up another server. And we can't use SQL Server Express, because our database is too large (about 20 GB).
Given that our backups are stored in Azure, I thought the best way to test backup restoration would be to restore the backup directly into an Azure SQL database. I could do this roughly once per week, run some quick checks on the restored data, and then automatically delete the database, paying for less than an hour of service per week. This would result in minimal expense. However, I'm not sure it's possible. Google searches for instructions on how to restore a SQL Server backup directly to Azure SQL haven't turned up anything so far. Is it possible for me to restore my SQL Server backups directly to Azure Managed SQL like this?
If it isn't possible, my next thought is that I could just create an SQL Server VM in Azure and activate/deactivate it as needed for my automated restore tests. That'd be a bit more complicated though, so I'm saving that approach for plan B.
Not directly. You can only import a bacpac file directly into SQL Azure.
What you can do is use SQL Server on an Azure VM to test your backup files.
You should be able to write a script that automatically pulls down the latest .bak file and restores it to the SQL Server instance on the VM.
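A minimal sketch of such a script's T-SQL, assuming SQL Server 2014 on the VM and backups written to blob storage secured with a storage account key; the credential, storage account, container, file, path, and logical file names are hypothetical:

-- One-time setup: a credential holding the storage account name and access key.
CREATE CREDENTIAL AzureBackupCred
WITH IDENTITY = 'mystorageaccount',
     SECRET = '<storage account access key>';

-- Weekly test: restore straight from the blob, check it, then drop it.
RESTORE DATABASE RestoreTest
FROM URL = 'https://mystorageaccount.blob.core.windows.net/backups/ProdDb_latest.bak'
WITH CREDENTIAL = 'AzureBackupCred',
     MOVE 'ProdDb' TO 'F:\Data\RestoreTest.mdf',
     MOVE 'ProdDb_log' TO 'F:\Data\RestoreTest_log.ldf',
     STATS = 10;

DBCC CHECKDB (RestoreTest) WITH NO_INFOMSGS;

DROP DATABASE RestoreTest;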

Import data from SQL database on server to SQL database on localhost

We have a database with a very large amount of data in it (around 400k records per table). I'd like to be able to debug a stored procedure on this database; however, I do not have the permissions to do that on the server it's on.
As such, I need to create a replica of the database on localhost and debug from there.
However, due to the large size, the script that gets created is too large for SQL Server Management Studio to open.
Is there a way to directly import the data from one database to another if one is located on localhost and the other is not? The security issues shouldn't be a problem for importing/exporting data, I'm told.
Just create a binary backup of the source database (e.g. using SQL Server Management Studio, right-click on the database, then select Tasks -> Back Up), then restore that .bak file into your local installation - again using Management Studio, right-click on the "Databases" node and choose Restore Database.
The only thing that might prevent you from doing that is if you are running SQL Server Express locally and the total size of the source database exceeds the size limit of the Express edition.
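If the GUI is inconvenient, the same backup/restore can be scripted; a minimal T-SQL sketch, where all paths and the SourceDb name are hypothetical:

-- On the remote server: write a full backup somewhere you can copy the file from.
BACKUP DATABASE SourceDb
TO DISK = 'D:\Backups\SourceDb.bak'
WITH COPY_ONLY, STATS = 10;

-- On localhost, after copying SourceDb.bak down:
RESTORE FILELISTONLY FROM DISK = 'C:\Temp\SourceDb.bak';   -- lists the logical file names used below

RESTORE DATABASE SourceDb
FROM DISK = 'C:\Temp\SourceDb.bak'
WITH MOVE 'SourceDb' TO 'C:\SQLData\SourceDb.mdf',
     MOVE 'SourceDb_log' TO 'C:\SQLData\SourceDb_log.ldf',
     STATS = 10;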
If you can connect to the server database from your computer, you can use SQL Server Management Studio: right-click the database, choose Tasks and then Copy Database. You'll see a simple wizard asking for the source and destination databases.
You can also create a backup through SQL Server Management Studio and restore it on your computer.

How third-party SQL Server backup tools access databases' files for backup?

I'm reading about Red Gate SQL Backup, and I like the concept of creating a compressed database backup and writing the compressed backup directly to disk, without an intermediate SQL Server native backup.
I'm wondering how this type of software makes backups. Does it access the database files directly? Does it use some sort of SQL Server or Windows API? Windows Shadow Copy?
SQL Server has an API that lets backup providers plug elements into the backup pipeline. See INFORMATIONAL: SHEDDING LIGHT on VSS & VDI Backups in SQL Server, or have a look at the SQL Server Compressed Backup project on SourceForge.
More information at:
A Guide for SQL Server Backup Application Vendors
SQL Server 2005 Virtual Backup Device Interface (VDI) Specification
It uses the "SQL Server Virtual Device Interface (VDI)", as per the datasheet.
You can't shadow copy or use a Windows API to back up SQL Server files.
There is a CodeProject VDI wrapper if you want to write your own.

How do I backup the data in SQL Server 2008 on a third party host?

I have a SQL Server 2008 database that is hosted by a third party host (heart internet).
How would I go about backing this up?
I used SQL Server Management Studio Express 2008 to create the tables within the database, but the backup options within this app seem to be of use only if you have direct access to the server machine (which I don't).
It's also worth noting that I am using change tracking - I presume this data would be lost should any backup be restored?
Thanks In Advance!
(PS - SQL Server 2008 novice here!)
If they allow you to run backups to a particular folder that you have access to, you can just do it with the regular backup command:
BACKUP DATABASE dbname TO DISK = 'y:\users\YourHomePath\dbname.bak'
If they do not allow that, you might want to use the Database Publishing Wizard to script out your database (depending on the size, this might be very slow).
You can create a .DAT backup file on a shared disk and then download it through FTP.
