So I have a lot of table create scripts that I need to run on a fresh database. They are all in one folder, and there are over 250 of them.
Is there a good way to go about doing this using SSMS?
Option 1: USE BATCH FILE
Use a batch file to run all the scripts in one click; under the hood, the batch file will implicitly use sqlcmd.
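A minimal sketch of such a batch file (YourServer and YourDatabase are placeholders for your own names, and -E assumes Windows authentication):

@echo off
REM Run every .sql file in this folder against the target database.
for %%f in (*.sql) do (
    sqlcmd -S YourServer -d YourDatabase -E -i "%%f"
    if errorlevel 1 echo Failed: %%f
)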
Option 2: MERGE MULTIPLE SQL FILES INTO A SINGLE SQL FILE AND THEN EXECUTE IN SSMS IN ONE SHOT
Create a simple batch file in the folder where your 250 table-creation scripts reside, as follows:
type *.sql > allinone.sql
Note: open a text document, paste in the line above, and save it with a .bat extension (e.g. merge.bat).
Just double-clicking that batch file will give you a single SQL file, which you can then run in SSMS in one go.
Related
Every week I have to upload text files from a server location into Microsoft SQL Server (via Management Studio). I wish to automate the task so that the files are uploaded automatically. Can somebody suggest a way?
Methods I know of:
Via SQL:
Use OPENROWSET to open the file and obtain the records to write into a table.
Use BULK INSERT to open the file and insert directly into a table (you may need to pair it with xp_cmdshell to get a directory listing to loop through). See the sketches just below.
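Hedged sketches of both approaches; the table name, file paths, and format file are assumptions to adapt to your data:

-- BULK INSERT: load a delimited text file straight into a staging table.
BULK INSERT dbo.WeeklyStaging
FROM '\\fileserver\uploads\weekly.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- OPENROWSET(BULK ...): read the file as a rowset you can SELECT from;
-- this form needs a format file describing the columns.
INSERT INTO dbo.WeeklyStaging
SELECT f.*
FROM OPENROWSET(BULK '\\fileserver\uploads\weekly.txt',
                FORMATFILE = '\\fileserver\uploads\weekly.fmt') AS f;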
Via SSMS:
Create a DataFlow to import from file
SSMS makes it easier to do clever things with the import process, but it can be very finicky.
With both of those you can set up an Agent job to run the script / package automatically.
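For what it's worth, here is a minimal sketch of creating such a job in T-SQL; the job name, schedule, and the ImportWeeklyFiles procedure are all illustrative:

USE msdb;
EXEC dbo.sp_add_job @job_name = N'WeeklyFileImport';
EXEC dbo.sp_add_jobstep @job_name = N'WeeklyFileImport',
     @step_name = N'Import files',
     @subsystem = N'TSQL',
     @database_name = N'YourDatabase',
     @command = N'EXEC dbo.ImportWeeklyFiles;';  -- hypothetical proc wrapping your BULK INSERT
EXEC dbo.sp_add_schedule @schedule_name = N'EveryMonday6am',
     @freq_type = 8,               -- weekly
     @freq_interval = 2,           -- on Mondays
     @freq_recurrence_factor = 1,
     @active_start_time = 060000;  -- 06:00:00
EXEC dbo.sp_attach_schedule @job_name = N'WeeklyFileImport', @schedule_name = N'EveryMonday6am';
EXEC dbo.sp_add_jobserver @job_name = N'WeeklyFileImport';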
I have an ogr2ogr batch file that reprojects SQL data into a new SQL Server table.
It works fine when I run the bat file manually, but it fails if I run it via a SQL Server stored procedure. I have given the SQL Server service account permissions on the GDAL folders, and xp_cmdshell is also enabled. I'm using
EXECUTE xp_CMDShell 'blah'
in the T-SQL script.
For some reason the ogr_MSSQLSpatial.dll causes it to fail.
ERROR 1: Can't load requested DLL: Z:\BroadSpectrumSQLTreeExtract\ogr2ogr\gdalplugins\ogr_MSSQLSpatial.dll
If I remove this DLL the script runs via SQL, but it means I need to add extra commands to do what the DLL would have taken care of, such as setting the source coordinate system. I haven't managed to get it working 100%. The furthest I got was producing the reprojected table, but the geometry field is empty.
The DLL does issue SQL commands against the system tables. Could this be a SQL Server security issue stopping it from working?
I hit this problem again with another ogr2ogr bat executed via SQL. If I put the bat in the same folder as the DLLs, it works fine.
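Based on that, one workaround sketch (the paths are taken from the error message above and may differ on your machine) is to point GDAL at its plugin folder, or change into it, at the top of the batch file:

@echo off
REM GDAL_DRIVER_PATH tells GDAL where to find plugin DLLs such as
REM ogr_MSSQLSpatial.dll when the bat runs under xp_cmdshell.
set GDAL_DRIVER_PATH=Z:\BroadSpectrumSQLTreeExtract\ogr2ogr\gdalplugins
cd /d Z:\BroadSpectrumSQLTreeExtract\ogr2ogr
REM ...then call ogr2ogr with the same arguments as before.

Note also that a mapped drive letter like Z: may not exist in the SQL Server service account's session; a UNC path can be safer.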
I have a folder of .sql files, one file per query. I want to execute all of the queries and save each result set as a CSV.
Is there a way to do that automatically without using the Windows command line (it is disabled in my environment)? I do have SQL Server Management Studio.
I would approach this task using SSIS, providing you have Business Intelligence Development Studio (BIDS) installed.
First, create a 'Foreach Loop Container' pointed at the folder with the SQL files, then use a variable to retrieve each file name.
Next, create a flat file connection and set the 'Connection String' property to the variable that contains the file location.
Next, using the 'Execute SQL Task' component set the 'SQLSourceType' to 'File Connection' and the 'FileConnection' to the one created in the previous step.
Finally, depending on how the data is returned, you have a couple of options. If the result set is small (only a row or a single column), you can save the results to a variable and, in a 'Dataflow' task, use a 'Derived Column' component to export the contents of that variable to a CSV file. If the dataset is larger, you could dump the results to a temp table and then use an 'OLE DB Source' and 'OLE DB Destination' to push the full result set straight into a CSV.
Hopefully this isn't too convoluted a solution. This approach has the advantage of being able to be run from either a remote machine or the server itself, plus you can automate its execution with a SQL Agent job.
Create a VB.NET console application.
Generate a list of files that end in .SQL from the folder in question.
Load the contents of each file into individual SQL Commands
Execute the SQL Command for each, storing the results in DataSets.
For each table in each dataset, create a new .csv file
When writing each .csv file, you will need to iterate over every cell in the DataTable and apply proper CSV escaping (quote fields, double any embedded quotes). A sketch of all the steps follows.
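A minimal sketch of those steps (the folder path, connection string, and CSV layout are assumptions):

' VB.NET console app; uses Windows authentication against a placeholder database.
Imports System.IO
Imports System.Data
Imports System.Data.SqlClient
Imports System.Linq

Module RunQueriesToCsv
    Sub Main()
        Dim folder = "C:\Queries"    ' hypothetical script folder
        Dim connStr = "Server=.;Database=YourDb;Integrated Security=true"
        For Each sqlFile In Directory.GetFiles(folder, "*.sql")
            Dim ds As New DataSet()
            Using conn As New SqlConnection(connStr)
                Using adapter As New SqlDataAdapter(File.ReadAllText(sqlFile), conn)
                    adapter.Fill(ds)    ' one DataTable per result set
                End Using
            End Using
            For i = 0 To ds.Tables.Count - 1
                Dim csvPath = Path.ChangeExtension(sqlFile, "." & i.ToString() & ".csv")
                Using writer As New StreamWriter(csvPath)
                    For Each row As DataRow In ds.Tables(i).Rows
                        ' CSV escaping: quote every cell, double embedded quotes.
                        writer.WriteLine(String.Join(",", row.ItemArray.Select(
                            Function(c) """" & c.ToString().Replace("""", """""") & """")))
                    Next
                End Using
            Next
        Next
    End Sub
End Module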
Use 'for' in combination with either sqlcmd or bcp command for each file in the script folder.
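For example, a sketch with sqlcmd, writing each file's output to a matching .csv (server and database names are placeholders; double the % signs inside a batch file, and note that the -s "," separator does not quote values that themselves contain commas):

for %f in (*.sql) do sqlcmd -S YourServer -d YourDatabase -E -i "%f" -s "," -W -o "%~nf.csv"

bcp's queryout mode is the alternative when you have the query text inline rather than in a file.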
A large SQL script generated by SQL Server publisher can't be opened in Management Studio; it returns an error about not enough available storage to open it.
Is there some other way to import the database from a large script? (Command line, maybe?)
Is this something you have to edit? If so, you may want to open it in Notepad++, TextPad, or EditPlus.
Here are some options I can think of:
Use the batch separator GO between sets of commands. The reason for this is that without GO, SSMS tries to execute the entire script as a single batch, which puts a heavier load on memory than multiple smaller batches would.
To run the script, you can use SQLCMD from the command line (see the example after this list).
Also, for large scripts that load data, you may want to ensure that you have COMMIT commands in the script (where appropriate).
Consider splitting your script into multiple scripts.
If you split into multiple files and build the SQLCMD command line syntax, you can run all scripts from a single batch file fairly quickly.
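For example, a single large script can be run without ever opening it in SSMS (placeholder names):

sqlcmd -S YourServer -d YourDatabase -E -i bigscript.sql -o results.log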
Have you tried using the OSql tool?
If I have scripts in multiple files, and I would like to execute each one in a known sequence, can I simply import them into a T-SQL script and execute them, or must I run sqlcmd or similar against each file? I'm sure Oracle has a feature to import/include script content from another file (maybe with ## ?).
I want all of the scripts to run automatically. I.e. I don't want to manually load and run each file of SQL script.
If you want to run the scripts automatically, I'd make a batch file that calls osql for each file.
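A sketch of such a batch file (placeholder server and database names; on newer SQL Server versions, sqlcmd is the drop-in replacement for the deprecated osql):

@echo off
REM On NTFS, *.sql typically enumerates alphabetically; name the files
REM 001_xxx.sql, 002_yyy.sql, ... to control the execution sequence.
for %%f in (*.sql) do osql -S YourServer -d YourDatabase -E -i "%%f" -n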
-Edoode
You can concatenate the scripts into a single file. Place a GO statement at the end of each file (if it doesn't already have one) and it should be fine to run as one file instead of multiple.
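A sketch of that concatenation as a batch file, guaranteeing a GO between files (it writes the result to the parent folder so the output isn't picked up by the *.sql wildcard itself):

@echo off
(for %%f in (*.sql) do (
    type "%%f"
    echo.
    echo GO
)) > ..\allinone.sql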
I think you can import them into a T-SQL script and execute them. Better yet, though, use SSMS and a maintenance plan. That way you can run the individual SQL scripts and have some checks on success before executing the next statements.
We have a free simple tool you can use as well - it is called Script Executor - the community edition (http://www.xsqlsoftware.com/Product/Sql_Server_Script_Executor.aspx) allows you to add individual scripts and folders containing scripts, order them and execute. There is a command line option as well in case you need to schedule the execution to run automatically.