My scenario: my live database, vlproduction, is hosted on Azure, and I am using SQL Server 2014 on my local machine with a database named testvlproduction. For some reason, my testvlproduction database was deleted.
I want to regenerate testvlproduction so that it matches vlproduction, but I found there is no way to take a direct backup of the live database. I generated a script with data, but the script is too big (300 MB), and whenever I try to run it on my local machine it shows:
System.OutOfMemoryException
Please tell me what to do to fix this, or is there another way to generate a local database that matches the live one?
Is there any built-in functionality in SQL Server for this?
Maybe this question has been asked before, but I still have no solution for my issue.
Feel free to ask for any details.
Thanks
This is a limitation of SQL Server Management Studio, not the SQL Server service: it is Management Studio that is running out of memory.
This is likely caused by the size of the result set that you are returning to Management Studio.
See https://support.microsoft.com/en-in/kb/2874903 for more details.
For more, see sql-server-management-studio-cant-handle-large-files.
Since your SSMS runs out of memory, you should try a newer release. It should be compatible with SQL Server 2014.
If you have some idle time on your production database, take a backup and restore it locally, or use Export Data-tier Application and select the proper version where you want to restore. If this is not an option for you, I'd suggest taking #mohan111's suggestion of running your script in batches.
I believe you have two problems:
System.OutOfMemoryException - this usually happens while running big scripts in SSMS. One workaround is to run the script from the command prompt using the sqlcmd utility (see https://msdn.microsoft.com/en-us/library/ms180944.aspx and the sketch after this list).
Creating a test database with the same schema as production - you can make use of the SSDT tool in Visual Studio. It lets you create a database project that mimics the production database, and you can use its publish functionality to create a database with the same schema wherever required.
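For example, a minimal sqlcmd invocation (a sketch only: the instance name and file paths are placeholders, and it assumes Windows authentication):

    rem Run the large script without loading it into SSMS. Add -d <database>
    rem if your script does not create and USE the database itself.
    sqlcmd -S .\SQL2014 -E -i "C:\scripts\vlproduction_with_data.sql" -o "C:\scripts\run_log.txt"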
Related
I am trying to use Copy Database Wizard to copy from my live server (shared hosting) to my local machine. Both the live and local servers are SQL 2008 R2.
I have used CDW for several years with perfect success when copying from a live SQL 2000 server to my local 2008 R2. But now that I have migrated my live database to SQL 2008 R2 the CDW is giving me this error:
Could not read metadata, possibly due to insufficient access rights.
I've learned that this error can be predicted before you even complete the CDW setup: On the page where the CDW asks you for your desired destination database name, it is SUPPOSED to populate the .mdf and .ldf files with their name-to-be and size (e.g. MB, GB).
But in my case these file names and sizes are not being shown (area is simply blank in the wizard) and then of course when I attempt to execute the package it gives me the error.
After much research, I believe the reason for this error is the CDW requirement that "You must be a member of the sysadmin fixed server role on both the source and destination servers."
On my local server, my Windows Authentication login is listed as a Role Member for the sysadmin Server Role. However on my live server (keep in mind it is a shared SQL server with 250+ databases) the only Role Member listed is [sa].
Am I right in thinking that the only way to satisfy this requirement would be to add my specific SQL user to the live/source Server > Security > Server Roles > sysadmin role? I'm guessing that would never be done on a shared server right? Or is there some other way to make it work by messing with the specific database properties/users/roles?
I can't explain why CDW is working from the live SQL 2000 server and not the 2008 R2. I HOPE it is simply that something isn't set up right on the live database, but maybe it is due to changes that were made to SQL security over the years.
In case it matters, I must use the SMO method instead of detach/attach because it is a live database that I don't want to take down. Historically the CDW from SQL 2000 only takes 3 minutes with SMO method so speed isn't an issue anyway.
Here's my preference for a solution:
Find a way to get CDW to work, most likely by changing something on the live server. Is this possible? What would it be?
If that fails, then...
What about the idea of using CDW to create the package, but then going into BIDS and manipulating something in the package to circumvent the sysadmin role requirement? (Does it really need the metadata? I don't need anything besides the actual data tables.) Is this possible?
UPDATE 6/14/2016: Editing a CDW package in BIDS won't work, as it appears to simply use the .mdf and .ldf files, which of course I don't have access to on the shared server. I think an alternative is to use the Import/Export Wizard to create a package and then edit it in BIDS. The annoying part is that without access to the metadata, the Import/Export Wizard doesn't seem to be aware of foreign keys, and thus doesn't know what order to process the tables in.
If that fails, then...
Is there any other way to easily automate a daily copy from my live server to my local machine? The reason I like CDW is that it is super simple to use (when it works), it can be scheduled to run daily as a SQL Agent job, and it requires no manual work on my part. Is there a "next best thing" if CDW can't be made to work?
You'd think that a very common scenario for all websites out there would be "how do I get a copy of my live database onto my local SQL server, daily, automatically"? But maybe I'm the weird one!
Another simple solution would be the Import/Export Wizard.
In SSMS right-click the database you want transferred and select 'Tasks' and then 'Export Data...'. It will open a wizard that is very similar to that of CDW. The difference here is that I could not find a sysadmin requirement to use it.
At the end it will give the option to run immediately and/or save the SSIS package. If you save the SSIS package (I prefer to save it to disk) you can then create a schedule via a SQL Agent job.
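If you save the package to disk, the Agent job itself can also be scripted in T-SQL. A rough sketch, assuming the SSIS subsystem is available on your edition (the job name, package path, and schedule are placeholders):

    USE msdb;
    GO
    -- Create the job.
    EXEC dbo.sp_add_job @job_name = N'Nightly copy from live';

    -- One step that runs the saved SSIS package from disk.
    EXEC dbo.sp_add_jobstep
        @job_name  = N'Nightly copy from live',
        @step_name = N'Run export package',
        @subsystem = N'SSIS',
        @command   = N'/FILE "C:\Packages\LiveToLocal.dtsx"';

    -- Run it every day at 02:00.
    EXEC dbo.sp_add_jobschedule
        @job_name          = N'Nightly copy from live',
        @name              = N'Daily at 2am',
        @freq_type         = 4,        -- daily
        @freq_interval     = 1,        -- every day
        @active_start_time = 020000;   -- 02:00:00

    -- Register the job on the local server.
    EXEC dbo.sp_add_jobserver @job_name = N'Nightly copy from live';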
I'm using a C# application to back up and restore DBs on a remote server using microsoft.sqlserver.smo.dll.
Testing with my local machine, I can browse backup files to select the backup to use. Can this be done through code for the remote SQL Server, using the SQL credentials, similar to the way SSMS does it?
My backups are saved with a certain naming convention (ie. "Ebuy_full_2013_8_7_H13_M40.bak") and I would like to be able to show these in the application so a decision about which backup file to restore can be made.
Thanks,
Rick
SOLVED: Based on a comment on another question, I ran SQL Profiler to determine which calls SSMS was making, found it was using master.dbo.xp_dirtree, and was able to duplicate this in my app.
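For anyone finding this later, the call looks roughly like this (xp_dirtree is undocumented and unsupported, and the folder path is a placeholder):

    -- Arguments: starting path, recursion depth, and a flag (1) to include files.
    EXEC master.dbo.xp_dirtree N'D:\Backups', 1, 1;
    -- Result columns: subdirectory (the name), depth, file (1 = file, 0 = folder).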
I want to migrate a database completely or partially. Right now, I will do a complete database. Partial will be posted as a separate question.
Strategies I am considering:
copy database wizard
convert 2005 database to script. Run script on 2008
simple SQL query
My question - I want to use method 2. Is it even possible to do this? If yes, how do I do it? Any limitations/risks?
NOTE - The source server is a SQL Server 2005 database with one IP. The destination is a SQL Server 2008 instance with another IP.
I don't know if you need to be a sysadmin to do this. I am not even sure if I am a sysadmin. If it is required, how do I check whether I am one?
Just RESTORE it on the SQL2008 server and it will be automatically upgraded. And you can check if you're a sysadmin using IS_SRVROLEMEMBER.
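A minimal sketch of both steps (logical file names and paths are placeholders; check them with RESTORE FILELISTONLY first):

    -- Returns 1 if the current login is in the sysadmin fixed server role.
    SELECT IS_SRVROLEMEMBER('sysadmin');

    -- Restore the 2005 backup on the 2008 instance; the database is
    -- upgraded automatically during the restore.
    RESTORE DATABASE MyDb
    FROM DISK = N'C:\Backups\MyDb.bak'
    WITH MOVE N'MyDb'     TO N'C:\Data\MyDb.mdf',
         MOVE N'MyDb_log' TO N'C:\Data\MyDb_log.ldf';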
My question - I want to use method 2. Is it even possible to do this? (...) Any limitations/risks?
Option 2 could be a problem if the database is too large. It worked for me with databases up to 2 GB.
Any limitations/risks?
You may need to increase SQL's buffer and/or run the script through the command line, since a large script in SSMS eats up plenty of memory.
If yes, how do I do it?
To generate the script, simply right-click the database and choose Tasks > Generate Scripts. Select both schema and data for the whole database, and choose appropriate options for the rest.
As Pondlife said, just back up the database on SQL 2005 and restore it as a new database on SQL 2008. You can change the compatibility level to SQL 2005 (level 90), or leave it at 2008 (level 100). I think you want to keep the database as is, so you could set the compatibility level to 90 and be done.
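For example (the database name is a placeholder):

    ALTER DATABASE MyDb SET COMPATIBILITY_LEVEL = 90;   -- behave like SQL 2005
    -- or
    ALTER DATABASE MyDb SET COMPATIBILITY_LEVEL = 100;  -- native SQL 2008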
As usual, grant access to users; create them as new if you have to, or migrate them from SQL 2005.
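Restored databases often end up with orphaned users; a minimal sketch of the fix (login name, password, and database name are placeholders):

    -- Recreate the login on the 2008 server.
    CREATE LOGIN app_user WITH PASSWORD = N'ChangeMe123!';
    GO
    USE MyDb;
    GO
    -- Re-map the restored database user to the new login (fixes an orphaned user).
    ALTER USER app_user WITH LOGIN = app_user;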
Generating a script from 2005 and running it on 2008 is the long route. There may be some potential for errors and T-SQL compatibility issues.
Hope it helps!
The simplest way to do this is to restore a backup, or to copy the MDF and LDF files to the new server. If your servers are on the same network, you can do this by creating a shared folder on the second server and copying the files there.
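If you go the file-copy route, attaching the copied files looks roughly like this (paths are placeholders, and the source database must be detached or offline while the files are copied):

    CREATE DATABASE MyDb
    ON (FILENAME = N'C:\Data\MyDb.mdf'),
       (FILENAME = N'C:\Data\MyDb_log.ldf')
    FOR ATTACH;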
If that is not an option for any reason, you can zip the backup (make sure to add a strong password), upload it to some online storage, and then download it on the second server.
The final option is to use third-party comparison and synchronization tools from Red Gate or ApexSQL (there are a lot of these on the market, and they all have fully functional trials).
I have a typical dev scenario: I have a SQL 2008 database that I want to copy every so often to my local instance of 2008 Express so that I can do dev work, make changes, etc. on the local copy. I have some constraints though: the source db is part of a live e-commerce site in shared hosting, so I can't detach it, and the hosting service wants me to pay $5 for each ad hoc backup I invoke.
What I'd like is some tool that I can invoke ad hoc to take a snapshot (complete, not incremental) of the live db that I can then import into my local one. I've tried the SSMS 2008 Copy Database Wizard, but it gives me an error saying I can't do that with Express. I tried the Generate Scripts tool and thought that was going to make it - the export to my local disk worked, but when I went to import using SQLCMD (the script was 1 GB, so SSMS errored when I tried to open it there), it told me there was a syntax error a few thousand lines in.
Coming from the MySQL world, this process is trivial. All I want is an analog of mysqldump and then a command-line way to import that file into a db. Surely there's an easy way to do this in the SQL Server world? This seems like the most basic use-case for developers.
[ Yes, I've seen a few other questions here that seem similar but I didn't think they had the same constraints. ]
Best answer: full backup, restore, pay $5. Anything else seems to me like it'd waste a lot more than $5 worth of time.
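For what it's worth, if the host ever lets you run the backup yourself, it is a one-liner; COPY_ONLY keeps an ad hoc backup from disturbing their regular backup chain (the database name and path are placeholders):

    BACKUP DATABASE MyDb
    TO DISK = N'D:\Backups\MyDb_copy.bak'
    WITH COPY_ONLY, INIT;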
If they don't charge you to run queries against the database, these tools may help. Granted, they are not free, but they are handy on so many fronts that it would be worth buying one. They can diff your source and target databases (data and structure, or just one or the other) and optionally sync the target database to match the source.
http://www.innovartis.co.uk/
http://www.red-gate.com/products/sql%5Fdata%5Fcompare/index.htm
Try SQL Dumper.
SQL Server Dumper enables you to dump selected SQL Server database tables into SQL INSERT statements that are saved as local .sql files and contain all the data required to create a duplicate table, or to be used for backup purposes. You can choose to create an individual .sql file for each table, or combine all selected tables into a single file.
SQL Server Database Publishing Wizard and osql usually do the trick for me with large databases.
We have several SQL Server 2000 databases (I know, we need to upgrade) that have the same structure, and we have them set up to replicate to another server. The problem is that whenever I have to change the structure (which is usually pretty easy to do on all databases, especially with tools from Red Gate) I have to stop the replication, make the structure changes, and then set up replication again. The steps to set up replication only take a few minutes per database, but it's repetitive and drives me crazy. I have the IDE create a script of the replication procedure and then just replace the name of the prior database with the name of the next database and run the script. Still annoying, but faster than clicking through the IDE and forgetting an option.
I've tried things like using sp_MSforeachdb, but that didn't look very promising.
My guess is I should just use the T-SQL that gets generated by the IDE as a starting point, build a new T-SQL script from it, and have it take the name of the database as a parameter. Then, when something changes in the structure of the database, I need to address that in the T-SQL replication script and make the changes there. Is this an issue for anyone else? Does 2005 or 2008 have a better sp_MSforeachdb, so I wouldn't have to maintain some crazy script and could just have the IDE generate a script when there are changes, which I could then use on multiple databases easily?
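For reference, the general shape of an sp_MSforeachdb call (it is undocumented, and the name filter and the commented-out call below are hypothetical):

    -- sp_MSforeachdb replaces ? with each database name in turn.
    EXEC sp_MSforeachdb N'
    IF ''?'' LIKE ''Store%''
    BEGIN
        PRINT ''Setting up replication for ?'';
        -- EXEC [?].dbo.usp_SetupReplication;
    END';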
Are you using SQL Server replication? If so, why aren't you making the changes to the publishing database and letting it push out the schema changes to its subscribers? We do this occasionally on SQL 2005 and it works well for the most part; I don't have any experience with replication on 2000 servers.
If you can use SQL Server Management studio, then the SSMS tools pack has a widget to allow the same script to be run on different databases.
The SQLCMD tool can connect to SQL Server 2000 and enables interaction from the command line. Using parameterized scripts and a fixed set of .bat files (one for each server) can be a good alternative to what you do now.
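A sketch of that setup (the server name, database name, and paths are placeholders; inside the script you would reference the database as USE [$(DbName)];):

    rem One .bat file per server; the database name is passed in as a
    rem sqlcmd scripting variable.
    sqlcmd -S SERVER1 -E -v DbName="Store01" -i "C:\scripts\setup_replication.sql"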