Import Existing Stored Procedures In SQL Server

I restored my development database from production, and the stored procedures I need in my development environment don't exist in my production database. Is there a command I can use to import the development stored procedures back into SQL Server? There are about 88 files, as each procedure is in a separate text file.
TIA!
Chris

Oops, you did it the painful way of generating scripts. You should have created a single script for all procedures by right-clicking the database in SSMS and choosing Tasks -> Generate Scripts.
However, if you don't want to go through that process again, open up a cmd shell in the folder and remember those old batch file days:
for %f in (*.sql) do sqlcmd -i %f
This should do the trick!
You could add other parameters to sqlcmd if required (e.g. login, password, server name). To see the list of switches, just run sqlcmd -?.
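For example, to run every script against a specific server and database (the server, database, and login names here are placeholders, not values from the question):
for %f in (*.sql) do sqlcmd -S MYSERVER\MYINSTANCE -d MyDevDB -E -i %f
Replace -E (Windows authentication) with -U login -P password if you connect with SQL authentication.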

For SQL Server 2000 and 2005, you want this tool.
I asked a similar question a while ago and got this advice from Mike L (give him votes here).

Right-click on the database from which you want to transfer the data
Select Data Transfer
Select Tables or Stored Procedures (whatever you want to transfer)
Select the location where you want to transfer the data (either on a server, or localhost, or to a file)

Right-click on the development database, hit Generate SQL Scripts, and then select only stored procedures. If you need additional filtering, you can even deselect the stored procedures you don't want.
Then just run that script on the development database.
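For example, once you have the generated script, you could run it from the command line instead of SSMS (the server, database, and file names here are placeholders):
sqlcmd -S MYSERVER -d MyDevDB -E -i AllStoredProcs.sql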

I don't know if there's a command-line way to do it, but if you have them all in text files, it shouldn't be difficult at all to write a quick-and-dirty app that just loops through all the files and runs the CREATE statements on your server using whatever language you choose.
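If you stay in the batch world, a minimal sketch that stops at the first failing file (the server and database names are placeholders, not values from the question):
@ECHO OFF
REM Run every .sql file in the current folder; -b makes sqlcmd abort the batch on error
FOR %%f IN (*.sql) DO (
    SQLCMD -S MYSERVER -d MyDevDB -E -b -i "%%f"
    IF ERRORLEVEL 1 (
        ECHO Failed on %%f
        EXIT /B 1
    )
)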

If, like me, you have to deal with a bunch of SQL files in a hierarchy of folders, this one-liner will combine them into a single file called out.sql, which you can easily execute in SQL Server Management Studio. It will only include files that end in .sql, and ignore files such as *.sqlsettings.
Run it from the root folder of the hierarchy of .sql files. Be sure you have nothing of value in out.sql, as it will be replaced.
(del out.sql 2>nul) & for /f "delims=" %f in ('dir /b /s ^| findstr /E \.sql') do type "%f" >> out.sql
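You can then execute the combined file from the same prompt instead of opening Management Studio; a sketch with placeholder server and database names:
sqlcmd -S MYSERVER -d MyDB -E -i out.sql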

Related

Passing values from Batch to DB2

I am really new to Batch and DB2 and have had little time to explore them. I just want to know how a batch program can pass a value (or values) to a .db2 file so I can manipulate my database.
I found several suggestions but none of them worked. Here's my batch codes so far:
Rem This is db2execute.bat
@echo off
db2cmd -c -w -i db2 -tf INSERT.db2 id=1
PAUSE
My .db2 file on the other hand:
CONNECT TO SAMPLEDB;
INSERT INTO TB1 VALUES('$(ID)');
I would really appreciate some kind help. Thanks.
At the present time, in the shipping versions of Db2-LUW, the CLP (command line processor) does not directly support parameters in script files in the style that your question suggests.
If your product type and version offers the clpplus command, then you can instead try using the Oracle sqlplus style for passing parameters on the command line and referencing those parameters in your script. See the Db2 and Oracle documentation for details.
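A common workaround, sketched here under the assumption that simple text substitution is all you need, is to have the batch file write the .db2 script with the value already filled in before invoking the CLP:
Rem db2execute.bat - generates the script with the value substituted (a sketch, not tested against your setup)
@echo off
set ID=1
echo CONNECT TO SAMPLEDB; > generated.db2
echo INSERT INTO TB1 VALUES('%ID%'); >> generated.db2
db2cmd -c -w -i db2 -tf generated.db2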

Publishing Local DB Changes to Remote DB SQL Server 2012

I have SQL Server 2012 on my local machine and I am hosting the production version of my application/database with Arvixe.
To initially set up the database on the remote server with Arvixe, I just uploaded a .bak file of my local DB. Not a big deal since it was just getting things set up, but as you know, this also pushed all of my test data to the database on my production server.
My question is this: how should I go about pushing database changes (new tables, columns, keys, etc.) from my local development environment to the production environment on Arvixe's server? A simple backup won't work now; I can't overwrite my production data and replace it with dev data.
Is there a program that I can use for this? Is there something in SQL Server 2012 that I'm just missing? All I can find is the suggestion to upload a backup of my local DB.
Thanks for your help.
The way to push database changes from Development to Production has little to nothing to do with where the Production instance is located.
All changes should be migrated to other environments via rollout scripts:
You should be creating scripts for all changes as you make those changes.
The scripts should be placed in folder for a particular release.
The scripts should be numbered to retain the same chronological order in which those changes happened (this should eliminate -- mostly -- any dependency issues).
The script numbering should be consistent with 2 or 3 digits for proper sorting (e.g. 01, 02, ... or 001, 002, ...).
The scripts should all be re-runnable (i.e. they should first check to see if the intended change has already happened and, if so, skip to the next script; see the sketch below).
When migrating to a higher environment, just run all of the scripts.
If any script fails, the entire process should terminate, since all changes need to occur in the same order in which they happened in Development. If using SQLCMD.EXE to run your scripts, use the -b (i.e. "On error batch abort") command-line switch, as it will terminate the process upon any error.
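For instance, a minimal re-runnable T-SQL sketch (the table and column names are invented for illustration):
IF NOT EXISTS (SELECT * FROM sys.columns
               WHERE object_id = OBJECT_ID(N'dbo.Customer')
                 AND name = N'LoyaltyCode')
BEGIN
    ALTER TABLE dbo.Customer ADD LoyaltyCode VARCHAR(20) NULL;
END;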
Here is a simple CMD script (you could name it DeploySqlScripts.cmd) that handles one folder at a time, targets a single Server/Instance, and assumes that you have a USE [DatabaseName]; line at the top of each script:
@ECHO OFF
SETLOCAL ENABLEDELAYEDEXPANSION
IF "%~1"=="" (
   ECHO Usage: DeployScripts "full:\path\to\SQL\scripts" Server[\Instance]
   GOTO :EOF
)
REM Process the scripts in name order; "delims=" keeps file names with spaces intact
FOR /F "delims=" %%B IN ('DIR /B /O:N /A-D "%~1\*.sql"') DO (
   REM -b makes SQLCMD return a non-zero exit code when a script fails
   SQLCMD -b -E -S "%2" -i "%~1\%%B"
   IF !ERRORLEVEL! NEQ 0 (
      ECHO.
      ECHO Error in release script...
      ECHO.
      EXIT /B !ERRORLEVEL!
   )
   ECHO.
)
Also:
If you are migrating from Dev to Prod, then you are missing at least one environment, if not 2 or even 3. You should not push changes directly from Development to Production since you might be developing things that are not ready to release. All changes that you feel are ready for release should first go to a QA environment that provides a better testing ground since it doesn't have any other changes that might invalidate certain tests.
You really, really should have your database object CREATE scripts in some sort of source code control system (a.k.a. version control system): Subversion (SVN), Git, TFS, etc. There are several options, each with their pros and cons (as well as true believers and haters). So do some research on a few of them, pick one that suits your needs, and just use it. And yes, the release scripts should also be a part of that repository.
There is also a tool from Redgate, SQL Source Control, which is not free and which I have not used, but it aims to help in this area.
For simple / small projects (single database) Microsoft has a free tool that can script out differences between a source (i.e. Development) and target (i.e. QA or Production). It is called SqlPackage.exe and is part of SQL Server Data Tools (SSDT).
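For example, to generate a difference script from a dacpac built in Development against the Production database (all names are placeholders, and switches may vary slightly by SqlPackage version):
SqlPackage.exe /Action:Script /SourceFile:MyDb.dacpac /TargetServerName:PRODSERVER /TargetDatabaseName:MyDb /OutputPath:Upgrade.sql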
You can set up triggers on the database objects you're interested in pushing and set up the production server as a linked server.
As an aside, pushing production data to the development environment is fine, but going the other way involves a lot of risk.

How to include an sql file within another sql file in MS SQL?

We have several SQL scripts which are generated from an Entity Model. They need to be run in a specific order. Additionally, there are several fill scripts which insert test data into the database.
Currently I need to open each script in Visual Studio and execute them in the correct order by clicking Execute (Ctrl+Shift+E).
Is there a way to create a script like Master.sql that contains includes like this:
BEGIN TRANSACTION
ImportOrInclude myDB1.sql
ImportOrInclude myDB2.sql
...
COMMIT
This is only required to run from Visual Studio and will not be part of the application itself.
How can this be done?
EDIT 1
I've found that there is something called SQLCMD Scripts which can import SQL files:
http://msdn.microsoft.com/en-us/library/aa833281%28v=vs.80%29.aspx
But my question is then how to get the directory path of the current solution into the :r command.
EDIT 2
So I figured out how to do it; not perfectly, but it works. The downside is that $(SolutionDir) is not loaded from the Visual Studio variables, so you need to set it up manually. This code is meant to be run in Visual Studio:
-- turn on in menu: Data -> Transact SQL editor -> SQL CMD mode
-- set this to path where the .sln file is.
:setvar SolutionDir C:\_work\projectname\
:!! echo $(SolutionDir)Maa.EntityModel.All\DbWEntityModel.edmx.sql
:r $(SolutionDir)Maa.EntityModel.All\DbWEntityModel.edmx.sql
go
:!! echo $(SolutionDir)Maa.EntityModel.All\DbQEntityModel.edmx.sql
:r $(SolutionDir)Maa.EntityModel.All\DbQEntityModel.edmx.sql
go
Use the sqlcmd utility.
Here is an extract you might find interesting:
-i input_file[,input_file2...]
Identifies the file that contains a batch of SQL statements or stored procedures. Multiple files may be specified that will be read and processed in order. Do not use any spaces between file names. sqlcmd will first check to see whether all the specified files exist. If one or more files do not exist, sqlcmd will exit.
Example:
sqlcmd -dDataBaseName -E -Stcp:ServerName\instancename -imaster.sql
Master.Sql:
:r file1.sql
:r file2.sql
Use the Business Intelligence Development Studio that comes with SQL Server to do this. Create a new SSIS package. Inside this package, create an Execute SQL Task in the Control Flow for each file you have, set SQLSourceType to FileConnection, and choose your file.

I have a 18MB MySQL table backup. How can I restore such a large SQL file?

I use a WordPress plugin called 'Shopp'. It stores product images in the database rather than on the filesystem as standard; I didn't think anything of this until now.
I have to move servers, so I made a backup, but restoring the backup is proving a horrible task. I need to restore one table called wp_shopp_assets, which is 18MB.
Any advice is hugely appreciated.
Thanks,
Henry.
For large operations like this, it is better to go to the command line. phpMyAdmin gets tricky when lots of data is involved because there are all sorts of timeouts in PHP that can trip it up.
If you can SSH into both servers, then you can do a sequence like the following:
Log in to server1 (your current server) and dump the table to a file using mysqldump:
mysqldump --add-drop-table -uSQLUSER -pPASSWORD -hSQLSERVERDOMAIN DBNAME TABLENAME > BACKUPFILE
Do a secure copy of that file from server1 to server2 using scp:
scp BACKUPFILE USER@SERVER2DOMAIN:FOLDERNAME
Log out of server 1
Log into server 2 (your new server) and import that file into the new DB using mysql:
mysql -uSQLUSER -pPASSWORD DBNAME < BACKUPFILE
You will need to replace the UPPERCASE text with your own info. Just ask in the comments if you don't know where to find any of these.
It is worthwhile getting to know some of these command line tricks if you will be doing this sort of admin from time to time.
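If both servers can reach each other over SSH, you can also skip the intermediate file and pipe the dump straight into the new database; a one-liner sketch using the same placeholder names:
mysqldump --add-drop-table -uSQLUSER -pPASSWORD -hSQLSERVERDOMAIN DBNAME TABLENAME | ssh USER@SERVER2DOMAIN "mysql -uSQLUSER -pPASSWORD DBNAME"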
Try HeidiSQL: http://www.heidisql.com/
Connect to your server and choose the database.
Go to the menu "Import > Load SQL file", or simply paste the SQL into the SQL tab.
Execute the SQL (F9).
HeidiSQL is an easy-to-use interface and a "working-horse" for web developers using the popular MySQL database. It allows you to manage and browse your databases and tables from an intuitive Windows® interface.
EDIT: Just to clarify, this is a desktop application; you will connect to your database server remotely. You won't be limited by PHP's script max runtime or upload size limit.
Use BigDump.
Create a folder on your server which is not easy to guess, like "BigDump_D09ssS" or whatever.
Download the importer file from http://www.ozerov.de/bigdump.php and add it to that directory after reading the instructions and filling out your config information.
FTP the .sql file to that folder alongside the bigdump script, then go to your browser and navigate to that folder.
Selecting the file you uploaded will start importing the SQL in split chunks, which is a much faster method!
Or, if that is an issue, I recommend the other answer about SSH and the mysql -u -p method!
Even though this is an old post, I would like to add that it is recommended not to use database storage for images when you have more than about 10 product images.
Instead of exporting and importing such a huge file, it would be better to switch the Shopp installation to file storage for images before transferring.
You can use this free plug-in to help you. Always back up your files and database before performing this action.
What I do is open the file in a code editor, then copy and paste it into a SQL window within phpMyAdmin. Sounds silly, but I swear by it for large files.

Import SQL Server database from large script

A large SQL script generated by the SQL Server publishing wizard can't be opened in Management Studio; it returns an error about not enough available storage to open it.
Is there some other way to import the DB from a large script? (Command line, maybe?)
Is this something you have to edit? If so, you may want to open it in Notepad++, TextPad, or EditPlus.
Here are some options I can think of:
Use the batch separator GO between sets of commands. The reason for this is that without GO, SSMS tries to execute the entire script as a single batch, which puts a heavier load on memory than multiple batches would.
To run the script, you can use SQLCMD from the command line.
Also, for large scripts that load data, you may want to ensure that you have COMMIT commands in the script (where appropriate).
Consider splitting your script into multiple scripts.
If you split into multiple files and build the SQLCMD command line syntax, you can run all scripts from a single batch file fairly quickly.
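A minimal sketch of such a batch file (server, database, and file name pattern are placeholders):
@ECHO OFF
REM Run the split scripts in order; -b plus the ERRORLEVEL check stops the run on the first error
FOR %%f IN (Script_Part*.sql) DO (
    SQLCMD -S MYSERVER -d MyDB -E -b -i "%%f"
    IF ERRORLEVEL 1 EXIT /B 1
)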
Have you tried using the OSql tool?
