I would like to copy one database (almost all tables) to another server. So far we have done this using the standard MSSQL wizard. We would like to include this work in our automated Ant build.
I have found one command line tool: http://dbcopytool.codeplex.com/
Any better ideas?
Though I've not done this with Ant, I have done all of this with NAnt (the tasks referenced below are Ant tasks).
I would use the Sql task to issue a backup command, then an Apply/ExecOn task to copy the backup file to the new server (or the Copy task), and then another Sql task to issue the restore on the new server.
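For example, a minimal Ant target along these lines (an untested sketch: the server names, paths, credentials, and a Microsoft JDBC driver jar on Ant's classpath are all assumptions):

<target name="copy-db">
  <!-- 1. Back up the source database (autocommit: BACKUP cannot run inside a transaction) -->
  <sql driver="com.microsoft.sqlserver.jdbc.SQLServerDriver"
       url="jdbc:sqlserver://sourceServer;databaseName=master"
       userid="build" password="secret" autocommit="true">
    BACKUP DATABASE MyDb TO DISK = 'C:\Backups\MyDb.bak' WITH INIT;
  </sql>
  <!-- 2. Copy the backup file to a share the target server can reach -->
  <copy file="\\sourceServer\Backups\MyDb.bak" todir="\\targetServer\Backups"/>
  <!-- 3. Restore it on the target server -->
  <sql driver="com.microsoft.sqlserver.jdbc.SQLServerDriver"
       url="jdbc:sqlserver://targetServer;databaseName=master"
       userid="build" password="secret" autocommit="true">
    RESTORE DATABASE MyDb FROM DISK = 'D:\Backups\MyDb.bak' WITH REPLACE;
  </sql>
</target>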
I have a Windows server with MS SQL Server running on it.
On the SQL Server developers have created stored procedures, views, tables, triggers.
On the Windows server developers created shell scripts.
I would like to start versioning the code described above in a BitBucket repository. I have a repository created in BitBucket.
1. How should the branches be organized in this repository? e.g. "SQL Server\Database\..." and "Windows Server\shell_script\..."
2. Can I connect BitBucket to SQL Server and Windows Server and specify which code needs to be versioned?
3. Are both options 1 and 2 above possible?
I just need to version control the changes to the code and have the ability to mark under which project the code change was made.
I am new to BitBucket. I am using the web front end of it. I do not know how to configure command line access, so please try not to reference Bitbucket commands. Sorry if I sound confusing.
Please help.
I know this is an old question but anyway, in principle I'd recommend:
Put all the server shell scripts into one place and make that a git repo linked to your bitbucket repo
Add a server shell script to export what you want version controlled from the SQL db (a sketch is below)
The export from the SQL db should be to text files so they are easily 'diffable'
You might as well export to a sub-directory within the shell-scripts repo so that everything is in one place and can't get out of sync
So you only have one branch, not a separate one for server shell scripts and db
Make sure people run the export script and then commit everything when they make a change
You ideally have a test server, which means you'd want a way to push changes from the repo into the SQL db. I presume you can do this with a script by deleting the database objects and re-creating them from the text files.
So basically, you can't connect an SQL db to bitbucket directly. You need scripts to read and write to the db from a repo.
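As a rough sketch of the export script (assuming a Windows box with PowerShell and the SQL Server SMO assemblies available; the server, database, and output paths are placeholders):

# Script out procedures and views as .sql text files inside the repo
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null
$server = New-Object Microsoft.SqlServer.Management.Smo.Server "localhost"
$db     = $server.Databases["MyDatabase"]
$outDir = "C:\repo\sql"

foreach ($proc in $db.StoredProcedures | Where-Object { -not $_.IsSystemObject }) {
    # Script() returns the CREATE statement(s) as a string collection
    $proc.Script() | Out-File (Join-Path $outDir "$($proc.Schema).$($proc.Name).sql")
}
foreach ($view in $db.Views | Where-Object { -not $_.IsSystemObject }) {
    $view.Script() | Out-File (Join-Path $outDir "$($view.Schema).$($view.Name).sql")
}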
I am working on a Windows Server 2012 64-bit. I want to be able to import data from a .dbf file into a SQL Server table. I used the import wizard and it worked correctly. However, I have SQL Server Express and can't schedule this insertion.
Is there another way to schedule the insertion of the .dbf data to the SQL Server tables, without the use of the SSIS package loader?
Update
I ended up using Python and writing a script to import from XML. However, I believe the answer by @Oleg was the most accurate, given the circumstances.
Thank you all!
You can also use DBF Commander Pro for this task:
Create the command line for your insertion: choose 'File -> Export to DBMS'. Specify the transfer options in the window that appears, then copy the command line from the bottom of the window:
Create a text .BAT file and insert the copied command line, e.g.:
"c:\Program Files\DBFCommander\DBFCommander.exe" -edb "D:\Data\customer.dbf" customer_table "Provider=SQLOLEDB.1;User ID=user1;Initial Catalog=test_db;Data Source=test_server"
Make a schedule using Windows Scheduler that will execute this .BAT file.
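The schedule can also be created from the command line, e.g. (the task name, time, and .BAT path are placeholders):

schtasks /Create /TN "DbfImport" /TR "C:\Scripts\dbf_import.bat" /SC DAILY /ST 02:00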
Additional info that may be useful for you:
Using DBF in batch mode
Export DBF file to SQL database
I suggest the following approach:
Create a C# script which uses the OleDbConnection (to fetch) and SqlConnection (to upload) objects to import data from the .DBF file into a SQL Server table (a sketch follows the links below).
Use LINQPad, the LINQPad command-line utility (lprun.exe), and the Windows Task Scheduler to automate the execution of the script.
Useful links:
How to get data from DBF file using C#
How to load data into database using C#
About LINQPad command-line utility
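A minimal sketch of such an import (assumptions: the Jet OLE DB provider is installed, which is 32-bit only, and all file, table, and connection names are placeholders; under LINQPad you would drop the class/Main scaffolding):

using System.Data;
using System.Data.OleDb;
using System.Data.SqlClient;

class DbfImport
{
    static void Main()
    {
        // For dBASE, the OLE DB "database" is the folder; the .dbf file is the table
        var table = new DataTable();
        using (var src = new OleDbConnection(
            @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=D:\Data;Extended Properties=dBASE IV;"))
        using (var da = new OleDbDataAdapter("SELECT * FROM customer", src))
        {
            da.Fill(table);
        }

        // Upload the rows; SqlBulkCopy rides on the SqlConnection
        using (var dest = new SqlConnection(
            "Data Source=test_server;Initial Catalog=test_db;User ID=user1;Password=...;"))
        {
            dest.Open();
            using (var bulk = new SqlBulkCopy(dest) { DestinationTableName = "dbo.customer_table" })
            {
                bulk.WriteToServer(table);
            }
        }
    }
}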
Another way is to create a SQL linked server over an ODBC DSN that points at the DBF. Use Windows Scheduler to call SQLCMD.EXE to run some SQL that copies the data in.
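Something along these lines (a sketch; the DSN and table names are placeholders):

-- One-time setup: linked server over an ODBC DSN that points at the DBF folder
EXEC sp_addlinkedserver @server = N'DBFSRC', @srvproduct = N'',
     @provider = N'MSDASQL', @datasrc = N'MyDbfDsn';

-- The statement SQLCMD.EXE runs on the schedule:
INSERT INTO dbo.customer_table
SELECT * FROM OPENQUERY(DBFSRC, 'SELECT * FROM customer');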
I am trying to copy a .bak file nightly from Server A to Server B.
Can I do that using a SQL Server Agent job that runs every night?
I am thinking of adding the copy command as a statement within a step of a job.
Something like: 'copy "G:\source\folder\" "\\target\folder\"'
inside the step and setting the type to Operating System(CmdExec).
Is there a way to do it?
Is this question about the command to copy the files?
If you want to copy an entire folder, use robocopy instead of copy.
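For example (paths are placeholders; /Z copies in restartable mode, /R and /W control retries):

robocopy "G:\source\folder" "\\serverB\target\folder" *.bak /Z /R:3 /W:30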
You can make an SSIS package to do that, and then run it from the SQL Agent.
However, don't use drive letters such as G: -- if the server doesn't have the same mapping, it won't work. Use the actual server names: \\serverA\source\folder to \\serverB\target\folder.
Short answer is yes. You can try an SSIS package as described here or here. Another option is to use the Windows Task Scheduler (instead of SQL Server Agent) and a simple bat script to do the same thing.
I have updated the package in BIDS 2005 (I changed the backup routine to save to a different drive) and now I'm trying to get it back onto the server (2005). I tried File > Save Copy As..., then ran the job that executes the package, and it's still saving to the old drive; thus, my package didn't get saved.
In my opinion, always create a deployment utility with your SSIS project. This is configured under the project properties. Once you have configured the project deployment utility, go to your project, find the "bin" folder, and double-click the deployment utility. It will walk you through getting your package(s) onto the server really easily.
Good Luck!
The quick and dirty answer is to use dtutil:
dtutil /file C:\Src\MyPackage.dtsx /destserver thatDatabase /COPY SQL;MyPackage
I too am a fan of the manifest files, but, while probably overkill for your problem, I prefer tools that allow for unattended use. I combine the ssisdeploymanifest with a PowerShell script to handle all of my SSIS deployments.
Powershell SSIS Deployment and maintenance
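As a rough illustration of the unattended idea (a sketch; the folder and server names are placeholders, and the dtutil switches mirror the command above):

# Push every package in a folder to the server's MSDB store via dtutil
Get-ChildItem "C:\Src" -Filter *.dtsx | ForEach-Object {
    & dtutil /FILE $_.FullName /DestServer "thatDatabase" /COPY "SQL;$($_.BaseName)" /Q
}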
I'm running the copy database wizard on a 2008 R2 instance of SQL Server.
The database I want to copy is a SQL 2000 database.
I'm copying that database to another SQL Server 2008 R2 instance.
The wizard uses SQL authentication for both servers, and both are sysadmins.
When I run it, I get the following error (FYI I have tried both copying the logins and leaving them out):
Event Name: OnError
Message: ERROR : errorCode=-1073548784 description=Executing the query "sys.sp_addrolemember @rolename = N'RandomRoleName..." failed with the following error: "The role 'RandomRoleName' does not exist in the current database.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
helpFile= helpContext=0 idofInterfaceWithError={C81DFC5A-3B22-4DA3-BD3B-10BF861A7F9C}
StackTrace: at Microsoft.SqlServer.Management.Dts.DtsTransferProvider.ExecuteTransfer()
at Microsoft.SqlServer.Management.Smo.Transfer.TransferData()
at Microsoft.SqlServer.Dts.Tasks.TransferObjectsTask.TransferObjectsTask.TransferDatabasesUsingSMOTransfer()
Any help would be appreciated!
Jim
My suggestion is: don't use the Copy Database Wizard. Create a full backup of the database on the 2000 server and then restore it on the 2008 server.
If you google "Microsoft.SqlServer.Management.Dts.DtsTransferProvider.ExecuteTransfer Copy Database Wizard" you will find that many, many people have gotten this same error or other nearly identical SMO errors; no one appears to have gotten past it.
That isn't to say it's impossible... it's just that restoring a backup is so much easier than the wizard, or than troubleshooting the wizard. Good luck.
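For reference, the backup/restore itself is only a couple of statements (a sketch; the names and paths are placeholders, and the logical file names for WITH MOVE can be checked with RESTORE FILELISTONLY):

-- On the 2000 server
BACKUP DATABASE MyDb TO DISK = 'C:\Backups\MyDb.bak' WITH INIT;

-- Copy the .bak across, then on the 2008 R2 server
RESTORE DATABASE MyDb FROM DISK = 'C:\Backups\MyDb.bak'
WITH MOVE 'MyDb'     TO 'D:\Data\MyDb.mdf',
     MOVE 'MyDb_log' TO 'D:\Data\MyDb_log.ldf';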
The copy wizard misses some security objects; IIRC this is caused by subtle differences in the security tables, principals, etc. between the two versions.
Frankly, the easiest way is to do one of these two:
backup/restore
detach, copy, attach
If you don't have access to the O/S and can't get it, another option is to create the missing role(s) in the background as the copy runs. You have to catch it between the creation of the files and when it tries to reference the roles, but there are a few seconds in which to create them if you keep clicking execute - I managed to create 9 roles.
Unfortunately, you'll end up with the roles in another database too (while yours cannot be used) so those need to be deleted.
Of course, this is only an option when you really can't use the other method.
Though the backup-technique answer solves the problem generally, after facing the same issue several times I was able to trace the root of the problem using the Windows Event Viewer: the Copy Database Wizard creates a job for the SQL Agent to run, and the Agent then runs it under its own credentials (i.e. the account you can look up in Windows Services; in my case, NT Service\SQLAgent$SQL2014).
All you need to do is go to the folder where SQL Server creates DB files (e.g. C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA by default for SQL Server 2014) and give the SQL Agent Windows account read/write access to that folder.
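From an elevated command prompt that grant looks like this (adjust the path and the service account to match your instance; (OI)(CI)M inherits Modify rights to files created in the folder):

icacls "C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA" /grant "NT Service\SQLAgent$SQL2014:(OI)(CI)M"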
The reason can be that a file with the new database name already exists on the file system. We encountered this when we renamed database X to X_Old and tried to copy database Y to X. This cannot be done, because database X_Old is still associated with the file name X.
Either delete the conflicting database, or rename the file on the file system.
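To see which physical files a database name is still tied to, a quick check (sketch):

SELECT name, physical_name
FROM sys.master_files
WHERE database_id = DB_ID(N'X_Old');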
See http://codecopy.wordpress.com/2012/01/03/error-while-copying-a-database/