I have SQL Server 2012 on my local machine and I am hosting the production version of my application/database with Arvixe.
To initially set up the database on the remote server with Arvixe, I just uploaded a .bak file of my local DB. Not a big deal since it was just getting things set up, but as you know, this also pushed all of my test data to the database on my production server.
My question is this: how should I go about pushing database changes (new tables, columns, keys, etc.) from my local development environment to the production environment on Arvixe's server? A simple backup won't work now - I can't overwrite my production data and replace it with dev data.
Is there a program that I can use for this? Is there something in SQL Server 2012 that I'm just missing? All I can find is the suggestion to upload a backup version of my local DB.
Thanks for your help.
The way to push database changes from Development to Production has little to nothing to do with where the Production instance is located.
All changes should be migrated to other environments via rollout scripts:
You should be creating scripts for all changes as you make those changes.
The scripts should be placed in folder for a particular release.
The scripts should be numbered to retain the same chronological order in which those changes happened (this should eliminate -- mostly -- any dependency issues).
The script numbering should be consistent with 2 or 3 digits for proper sorting (i.e. 01, 02, ... OR 001, 002, ...).
The scripts should all be re-runnable (i.e. they should first check to see if the intended change has already been applied and, if so, skip to the next script); see the example below these points.
When migrating to a higher environment, just run all of the scripts.
If any script fails, the entire process should terminate since all changes need to occur in the same order in which they happened in Development. If using SQLCMD.EXE to run your scripts, use the -b (i.e. "On error batch abort") command line switch as it will terminate the process upon any error.
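For example, a re-runnable script that adds a column might look like this (the database, table, and column names are made up for illustration):

USE [MyAppDB];  -- each script begins with a USE line, as assumed by the CMD runner further down
GO

IF NOT EXISTS (SELECT 1
               FROM   sys.columns
               WHERE  [object_id] = OBJECT_ID(N'dbo.Customer')
               AND    [name] = N'MiddleName')
BEGIN
    ALTER TABLE dbo.Customer ADD [MiddleName] NVARCHAR(50) NULL;
END;
GO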
Here is a simple CMD script (you could name it DeploySqlScripts.cmd) that handles one folder at a time, to a single Server/Instance, and assumes that you have a USE [DatabaseName]; line at the top of each script:
@ECHO OFF
SETLOCAL ENABLEDELAYEDEXPANSION

IF "%~1"=="" (
   ECHO Usage: DeploySqlScripts "C:\full\path\to\SQL\scripts" Server[\Instance]
   GOTO :EOF
)

REM Run the scripts in name order (hence the numeric prefixes).
FOR /F "delims=," %%B IN ('DIR /B /O:N /A-D "%~1\*.sql"') DO (
   SQLCMD -b -E -S "%~2" -i "%~1\%%B"

   REM -b makes SQLCMD return a non-zero exit code on error, so stop at the first failure.
   IF !ERRORLEVEL! NEQ 0 (
      ECHO.
      ECHO Error in release script %%B -- aborting.
      ECHO.
      EXIT /B !ERRORLEVEL!
   )

   ECHO.
)
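For example, assuming the release scripts for a given version live in a folder such as C:\Releases\1.2 and the production instance is ProdServer\SQL2012 (both names made up), you would run:

DeploySqlScripts.cmd "C:\Releases\1.2" ProdServer\SQL2012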
Also:
If you are migrating from Dev to Prod, then you are missing at least one environment, if not 2 or even 3. You should not push changes directly from Development to Production since you might be developing things that are not ready to release. All changes that you feel are ready for release should first go to a QA environment that provides a better testing ground since it doesn't have any other changes that might invalidate certain tests.
You really, really should have your database object CREATE scripts in some sort of source code control system (a.k.a. version control system). Subversion (SVN), Git, TFS, etc. There are several options, each with their pros and cons (as well as true believers and haters). So do some research on a few of them, pick one that suits your needs, and just use it. And yes, the release scripts should also be a part of that repository.
There is also a tool from Redgate, SQL Source Control, which is not free and which I have not used, but which aims to help in this area.
For simple / small projects (single database) Microsoft has a free tool that can script out differences between a source (i.e. Development) and target (i.e. QA or Production). It is called SqlPackage.exe and is part of SQL Server Data Tools (SSDT).
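As a rough sketch (server, database, and file paths here are placeholders, and exact options vary by SSDT version), the two-step flow looks something like this:

REM 1) Extract the Development schema into a .dacpac:
SqlPackage.exe /Action:Extract /SourceServerName:DevServer /SourceDatabaseName:MyAppDB /TargetFile:C:\Deploy\MyAppDB.dacpac

REM 2) Generate (but do not run) an incremental upgrade script for the target database:
SqlPackage.exe /Action:Script /SourceFile:C:\Deploy\MyAppDB.dacpac /TargetServerName:ProdServer /TargetDatabaseName:MyAppDB /OutputPath:C:\Deploy\Upgrade_MyAppDB.sql

Reviewing the generated script before running it keeps you from accidentally pushing changes you did not intend to release.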
You can set up triggers on the database objects you're interested in pushing and set up the production server as a linked server.
As an aside, pushing production data to the development environment is fine, but going the other way involves a lot of risk.
Related
Every time I run the Windows Backup and Restore app that has been carried over since Windows 7, I get an error that some files are missing and the backup process fails.
The "files" are actually folders. I have uninstalled some apps in the meantime, and now there is only one missing folder that the backup app cannot find.
I have tried running a batch file from the CMD.EXE command-line processor with administrator rights:
@ECHO OFF
SET DIR1="C:\Windows\System32\config\systemprofile\OneDrive\Pictures\Saved Pictures"
MKDIR %DIR1%
PAUSE
The folder does get created just fine, but the backup app is still failing.
Could it be a permissions deadlock?
I am creating the folder with administrator privileges because it is not possible otherwise.
I suspect that the backup app runs with normal rights; however, the user account I am using is also a member of the Administrators group.
Please advise.
I could not reproduce this issue.
My guess as to why it happens is the following:
The Windows Insider Program rewrites the whole C:\Windows folder on every update, so the missing folders have to be recreated over and over again.
Earlier, I might have started the Windows Backup and Restore application manually and forgotten to run the batch file first. The application might have already begun working through the files and folders to back up; then I ran the batch file, which created the folders correctly, but too late - that is, after the application had already marked them as missing. Hence the error.
I do not know for sure that this is the cause, since I have hit the error several times, not just once, and I doubt that I ran the batch file too late every single time.
Anyway, a possible workaround is the following:
Create a scheduled task that first runs the batch file and then launches the Windows Backup and Restore application. I do not yet know how to tell Task Scheduler to start the application itself, but I imagine it should not be hard to achieve.
Then, whenever a manual backup is needed, simply run the scheduled task. That way the issue should not reoccur, at least not through the behavior suspected above.
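A rough sketch of that workaround (the task name and the path to the batch file are made up): register the existing batch file as a task that runs elevated, then run it on demand right before starting Backup and Restore.

REM Register the folder-creating batch file as an elevated task (name and path are placeholders).
SCHTASKS /Create /TN "CreateMissingBackupFolders" /TR "C:\Scripts\CreateFolders.cmd" /SC ONCE /ST 00:00 /RL HIGHEST

REM Run it manually right before launching Backup and Restore:
SCHTASKS /Run /TN "CreateMissingBackupFolders"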
I need to perform the backup manually because I am using removable disks as a third backup solution. The first is the ASUS Web Storage cloud provider and its synchronizer application, and the second is File History running against an external hard disk drive.
If anybody has a better solution for this issue, please let me know.
I am running SQL Server 2012 (SP2) on Windows Server 2012 R2. I created a maintenance plan that backs up my databases and deletes report files and backup files that are older than 2 days.
My problem is that the maintenance plan doesn't delete files that are older than 2 days. These files are still not deleted after 5 or 6 days. The backups are verified and compressed.
I have rights on the directory (the agent is running under LocalSystem), and in the task properties under "file extension" I tried the name with and without a dot, e.g. ".bak", "bak" and "*.bak", but nothing works. I also changed the order of the tasks, and still nothing.
Yet I always get success messages, so what is wrong here?
Just to clarify: are you trying to run a script (i.e. a .bat) to handle some of this through SQLCMD, and calling that script through Task Scheduler? From your question, it sounds like you've set the Maintenance Plan up in SQL Server but are trying to set up a "Task" to call it (Task Scheduler) rather than a "Job" (SQL Agent) - my apologies if this is not the case.
It will be a lot tidier to do this through the SQL Agent (the SQL Server internal equivalent of Task Scheduler). Once you have a Maintenance Plan you can test through the Management Studio interface, it's a breeze to add the Agent job.
If you are running a Task Scheduler task to call a script, it is likely you're getting "success" because the task started your script, but the script isn't set up to handle the return codes from the SQLCMD call properly.
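If that is the case, a minimal sketch of a batch wrapper that actually surfaces SQLCMD failures to Task Scheduler (server, database, and script path are placeholders):

@ECHO OFF
REM -b makes SQLCMD exit with a non-zero code when a T-SQL batch fails.
SQLCMD -b -E -S "MyServer\MyInstance" -d MyDatabase -i "C:\Scripts\Cleanup.sql"
REM Pass SQLCMD's exit code back to Task Scheduler so a failure actually shows up as a failure.
EXIT /B %ERRORLEVEL%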
I tried to solve it in different ways without success; the quickest workaround I found is a .bat that deletes the files, scheduled as a Windows task that runs every night.
This .bat deletes everything older than 3 days:
FORFILES /P "C:\Directorio" /S /D -3 /c "CMD /c DEL /Q #PATH"
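To schedule it nightly, something along these lines works (task name, time, and the path to the .bat are placeholders):

SCHTASKS /Create /TN "PurgeOldBackups" /TR "C:\Scripts\PurgeOldBackups.bat" /SC DAILY /ST 02:00 /RU SYSTEM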
I want to use svnsync to create a backup copy of my svn repository.
The svnsync command replicates all versioned files, and their attendant svn properties. It does not copy hooks or conf directories from the source repository, so I will need to do those manually.
I wrote this batch script to create the backup repository:
setlocal enableextensions enabledelayedexpansion
:: create local backup project
svnadmin create C:\Repositories\Test
:: initialize local backup project with remote project
svnsync initialize https://local.com/svn/Test https://remote.com/svn/Test
:: get remote info project and store in info.txt
svn info https://remote.com/svn/Test>info.txt
:: get remote UUID project from info.txt
set UUID=
for /f "delims=" %%a in (info.txt) do (
set line=%%a
if "x!line:~0,17!"=="xRepository UUID: " (
set UUID=!line:~17!
)
)
:: set local UUID project with the remote UUID project
svnadmin setuuid C:\Repositories\Test %UUID%
:: sync local and remote project
svnsync sync https://local.com/svn/Test https://remote.com/svn/Test
endlocal
I wrote this batch script to synchronize the backup repository with the master repository (on an hourly schedule):
svnsync sync https://local.com/svn/Test https://remote.com/svn/Test
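To run the sync hourly, the script can be registered as a scheduled task, e.g. (task name and script path are placeholders):

SCHTASKS /Create /TN "SvnsyncMirror" /TR "C:\Scripts\svnsync-hourly.bat" /SC HOURLY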
Say I set up a mirror svn repo via svnsync, and my production svn server crashes - how can I restore the production svn repo from the mirror? Is it possible?
Can someone suggest the best practice of backing up the production server using svnsync?
There are several options that are better than using svnsync:
When you need a live backup or DR server for your Subversion repositories, consider deploying a replica (slave server) using the Multisite Repository Replication (VDFS) feature. For more detailed getting-started guidance, see the articles KB136: Getting started with Multisite Repository Replication and KB93: Performing disaster recovery for distributed VDFS repositories.
When you want to back up your Subversion repositories, use the built-in Backup and Restore feature. It helps you make daily backups of repositories of any size and has no impact on performance or user operations. What is more, the Backup and Restore feature in VisualSVN Server is very easy to set up and maintain. Please read the article KB106: Getting Started with Backup and Restore for setup instructions.
Don't forget to create background verification jobs, too. See the article KB115: Getting started with repository verification jobs.
I am using SourceTree with a Mercurial BitBucket repository. I would like for any SQL script files (*.sql) pulled from my remote BitBucket repo into my local one to simply be executed immediately after I update my working copy with the pulled files. The DB connection info would always be the same.
What would be the simplest way to accomplish this with either SourceTree itself or perhaps a mercurial hook? In the latter case, I believe something could be done with an update hook, but I'm not exactly sure how to set one up.
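For reference, the kind of setup I have in mind (the repository path, server, database, and helper script names are all made up) would be an update hook in the repository's .hg/hgrc:

[hooks]
# run after "hg update" refreshes the working copy (path is a placeholder)
update = C:\Scripts\run-sql-after-update.cmd

where run-sql-after-update.cmd does something like:

@ECHO OFF
REM Run every .sql file in the working copy against a fixed connection (all names are placeholders).
FOR /R "C:\Repos\MyProject" %%F IN (*.sql) DO SQLCMD -b -E -S "MyServer" -d MyDatabase -i "%%F"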
I am running SQL server 2012 on a Windows 7 machine.
I restored my development database from production, and the stored procedures I need in my development environment don't exist in my production database. Is there a command I can use to import the development stored procedures back into SQL Server? There are about 88 files, as each procedure is in a different text file.
TIA!
Chris
Oops, you generated the scripts the painful way. You should have created a single script for all procedures by right-clicking the database in SSMS and choosing Tasks -> Generate Scripts.
However, if you don't want to go through that process again, open up a cmd shell in the folder and remember those old batch file days:
for %f in (*.sql) do sqlcmd -i %f
This should do the trick!
You could add other parameters to sqlcmd if required (e.g. login, password, server name, ...). To see a list of switches, just run sqlcmd -?.
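For example (server, database, and credentials are placeholders):

for %f in (*.sql) do sqlcmd -S MyServer\MyInstance -d MyDatabase -U MyLogin -P MyPassword -i "%f"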
For SQL 2K & 2K5, you want this tool.
I asked a similar question awhile ago and got this advice from Mike L (give him votes here).
Right click on the database from where you want to transfer the data
Select Data Transfer
Select Tables or Stored Procedures (whatever you want to transfer)
Select the location where you want to transfer the data (either on server or localhost or any file)
Right-click on the development database, hit Generate SQL Scripts, and then select only stored procedures. If you need additional filtering, you can even pick which stored procedures to include or exclude.
Then just run that script on development.
I don't know if there's a command-line way to do it, but if you have them all in text files, it shouldn't be difficult at all to write a quick and dirty app that just loops through all the files and runs the CREATE statements on your server, using whatever language you choose.
If, like me, you have to deal with a bunch of .sql files in a hierarchy of folders, this one-liner will combine them into a single file called out.sql, which you can easily execute in SQL Server Management Studio. It will only include files whose names end in .sql, and will ignore files such as *.sqlsettings.
Run it from the root folder of the hierarchy of .sql files. Be sure you have nothing of value in out.sql, as it will be replaced.
del out.sql && for /f %f in ('dir /b /s ^| findstr /E \.sql') do type %f >> out.sql