I am looking for a command-line program that extracts a script containing the commands to create all database objects (tables, views, procedures, generators, types). It should work with MySQL, Firebird, Oracle, and MS SQL Server.
As part of our automated build process, we would like to put a database-creation script into our SVN repository automatically. Firebird comes with a tool that does what we want, but we would prefer a generic one so we can include it as an ANT task.
For SQL Server you can use the Database Publishing Wizard
I can't help you with the other databases, however. Good luck.
To Script the Schema and Data
sqlpubwiz script -d YourDB -S YourServer -U YourUsername -P pwdhere C:\DestFile.sql
and to just script the Schema
sqlpubwiz script -d YourDB -S YourServer -U YourUsername -P pwdhere C:\DestFile.sql -schema
I'm trying to script the database objects and data of my database so I can later move it to a server where I don't have backup/restore rights. Instead, I'm using the Generate Scripts method, with mssql-scripter to generate the scripts.
I have a .bat file with the following script code to generate my SQL script file.
set timevar=%date:~4,2%%date:~7,2%%date:~10,4%-%time:~0,2%%time:~3,2%%time:~6,2%
mssql-scripter --server 10.100.8.8 -d Dev_db -f .\%timevar%.sql --schema-and-data --script-drop-create --target-server-version 2016 --target-server-edition Standard --check-for-existence --include-dependencies --constraint-names --collation -U ScriptingUser -P 1234 --exclude-use-database
The problem is that it's also scripting DROP DATABASE and CREATE DATABASE, which I don't want. I would only like to DROP and CREATE database objects and later populate tables with the scripted data.
Has anyone faced this problem and have you found a solution?
After fiddling around with the options a while longer, I managed to find the right parameter and workaround to solve my problem.
The exact code that I ran is:
set timevar=%date:~4,2%%date:~7,2%%date:~10,4%-%time:~0,2%%time:~3,2%%time:~6,2%
mssql-scripter --server 10.100.8.8 -d Dev_db -f .\%timevar%.sql --schema-and-data --script-drop-create --target-server-version 2016 --target-server-edition Standard --check-for-existence --constraint-names --collation -U ScriptingUser -P 1234 --exclude-use-database --include-objects "dbo." --display-progress
The key change was adding the --include-objects parameter, with a twist. I changed my script by adding this snippet:
--include-objects "dbo."
This tells mssql-scripter to script out only objects whose fully qualified name contains the "dbo." keyword (substring).
I also removed this parameter from my initial command:
--include-dependencies
since I script out everything in my database under the dbo schema.
This scripts out the following (a rough sketch of the resulting script is shown after the list):
all of the objects in my database
it includes an IF EXISTS check
it issues a DROP statement to drop the existing object
it issues a CREATE statement to create the new one
it issues multiple INSERT statements to also populate the database with data
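For illustration, the generated script for a single table follows roughly this pattern (the table and column names here are made up, and the exact T-SQL that mssql-scripter emits may differ):
IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[Customer]') AND type = N'U')
DROP TABLE [dbo].[Customer]
GO
CREATE TABLE [dbo].[Customer] (
    [Id] INT NOT NULL PRIMARY KEY,
    [Name] NVARCHAR(100) NOT NULL
)
GO
INSERT INTO [dbo].[Customer] ([Id], [Name]) VALUES (1, N'First customer')
INSERT INTO [dbo].[Customer] ([Id], [Name]) VALUES (2, N'Second customer')
GO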
I am working on a PostgreSQL database, and we have a test server which needs to have the same data set as the production one. For this, I plan to set up a daily CRON job on Linux that copies the production database along with its contents: tables, rows, columns, sequences.
I looked into how to copy a database from one server to another and used the pg_dump command shown below, but it only copied the database tables and sequences, not the contents.
What should I do to copy the contents?
pg_dump -C databaseName | ssh -C username@removeHost.com "psql databaseName"
Edit
So, what I did was delete the database on the test server, create a new empty database, and then use the command above, and it worked. So I guess I need to delete the database first; only then will it be overwritten.
What should I do to circumvent this behaviour and force an update of the database, or delete the test-server database even if it is in use and create a new empty one?
Have you tried using pg_restore instead of psql? pg_restore has special arguments for your case: -c and -C.
Details here: http://www.postgresql.org/docs/current/static/app-pgrestore.html
An example of a command to dump/transfer/restore a db:
pg_dump -F c databaseName | ssh -C username@removeHost.com 'pg_restore --clean --create -d postgres'
For this command you need an empty db on the target instance to connect to (postgres in the example).
The database named with -d is used only to issue the initial DROP DATABASE and CREATE DATABASE commands. All data is restored into the database name that appears in the archive.
If you already have a db on the target instance:
pg_dump -F c databaseName | ssh -C username@removeHost.com 'pg_restore --clean -d databaseName'
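Since the original goal was a daily sync, the same pipeline can be scheduled from cron on the production server. A minimal sketch (it assumes passwordless SSH to the test server and a .pgpass entry or trust authentication, so no password prompt is needed) that runs every night at 02:00:
0 2 * * * pg_dump -F c databaseName | ssh -C username@removeHost.com 'pg_restore --clean -d databaseName'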
Similar question: Use pg_dump result as input for pg_restore
When you use the "generate database from model" feature, how do you put some data into a table?
For example, I want a table with all countries after the database is created.
Is there a script / shortcut to do this?
You have the following options:
write a program to create the db/table and insert data
create a SQL script to create the db/table and insert data
use a database management tool (e.g. Management Studio for SQL Server) to create the db/table and insert data
Are you looking for the Seed() method?
I accepted Ajay_Whiz's answer since it helped me the most. I created a bat file that just executes the contex.sql and then some other *.sql files to populate the database.
Disclaimer: this is for a development database. Do not ever use this on your production database, unless you know what you are doing.
The contex.edmx.sql generated by Entity Framework drops all your data!
:: Automated database creator
@echo off
echo creating database
SQLCMD -S . -E -i Context.edmx.sql -e -b
echo adding municipalities
SQLCMD -S . -E -i ContainsDataOfTableX.sql -b
pause
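For the original question about pre-loading a countries table, the data script run by the second SQLCMD call could look something like this (the table and column names are hypothetical; adjust them to match your model):
INSERT INTO [dbo].[Country] ([Code], [Name]) VALUES (N'BE', N'Belgium');
INSERT INTO [dbo].[Country] ([Code], [Name]) VALUES (N'NL', N'Netherlands');
INSERT INTO [dbo].[Country] ([Code], [Name]) VALUES (N'DE', N'Germany');
GO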
I want to automate (ideally from the command prompt in a batch file) the generation of the schema of my SQL Server 2008 R2 database.
In SSMS, I can right-click the DB, choose "Tasks", "Generate Scripts", and then follow the wizard to generate a schema script. Is there a command-line version of this process that I can use?
Microsoft released a new tool a few weeks ago called mssql-scripter that's the command line version of the "Generate Scripts" wizard in SSMS. It's a Python-based, open source command line tool and you can find the official announcement here. Essentially, the scripter allows you to generate a T-SQL script for your database/database object as a .sql file. You can generate the file and then execute it. This might be a nice solution for you to generate the schema of your db (schema is the default option). Here's a quick usage example to get you started:
$ pip install mssql-scripter
# script the database schema (the default) and pipe it to a file.
$ mssql-scripter -S localhost -d AdventureWorks -U sa > ./adventureworks.sql
More usage examples are on our GitHub page here: https://github.com/Microsoft/sql-xplat-cli/blob/dev/doc/usage_guide.md
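For the batch-file scenario in the question, a minimal sketch along the same lines (the server name, database name, and credentials are placeholders) that writes a date-stamped, schema-only script would be:
set timevar=%date:~4,2%%date:~7,2%%date:~10,4%
mssql-scripter -S myServer -d MyDatabase -U myUser -P myPassword -f .\schema-%timevar%.sql
Since schema-only is the default, no extra flag is needed; add --schema-and-data if you also want the rows.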
From this answer, there appear to be tools called SMOScript and ScriptDB that can do that.
If you find a way without third party tools please share :)
I have a database table on a development server that is now fully populated after I left an import routine running over a CSV file containing 1.4 million rows.
I ran the Database Publishing Wizard on the table, and now I have a 286MB SQL script on my local machine. The problem is, I can't figure out how to run it. If I load it into SQL Server Management Studio Express I get an alert window that says "The operation could not be completed".
Any ideas on how I can get this SQL script to run?
Use the sqlcmd tool to execute the file:
sqlcmd -S myServer\instanceName -i C:\myScript.sql
In case you have an unexplained "script error" for large SQL files (> 100 MB) that contain many INSERTs, just insert a "GO" line between the INSERT statements (i.e. replace "INSERT INTO" with "GO", a line break, and "INSERT INTO"), which reduces the size of each batch and transaction.
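For example (an illustrative fragment with a made-up table and values), the rewritten file would look like:
INSERT INTO dbo.MyTable (Id, Name) VALUES (1, N'first row')
GO
INSERT INTO dbo.MyTable (Id, Name) VALUES (2, N'second row')
GO
Each GO ends a batch, so the file is no longer parsed and executed as one enormous unit.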
This tool (Big SQL Script File Runner) on CodePlex will run a script file of any size, with logging and a GUI.
Adding to Gulzar Nazim's answer:
If you still get a failure, try specifying the codepage of your SQL file using option -f:
sqlcmd -S myServer\instanceName -d databaseName -i C:\myScript.sql -f 65001
I was trying to import a .dump file from SQLite (UTF-8 by default), and sqlcmd kept throwing an error after encountering the first special character. -f 65001 fixed it for me.
Why not just use DTS to import the CSV file directly?
Yes, we could do that. I tried the BCP (Bulk Copy Program) approach in order to avoid an OutOfMemory issue.
Note: tried on SQL Server 2014.
With BCP, we first export the source database data to a bcp file (in a local folder) and then import that bcp file into the destination database.
Below are the cakewalk steps:
Note:
a) Make sure an empty table is present in the destination database
b) Make sure a Temp folder is present on the C: drive
1) Create a bat file named Export_Data.bat with the command below:
bcp.exe [Source_DataBase_Name].[dbo].[TableName] OUT "C:\Temp\TableName.bcp" -S "Computer Name" -U "SQL Server UserName" -P "SQL Server Password" -n -q
pause
2) Run that bat file; as a result, a bcp file will be generated in the Temp folder
3) Then create another bat file named Import_Data.bat with the command below:
bcp.exe [Destination_DataBase_Name].[dbo].[TableName] IN "C:\Temp\TableName.bcp" -S "Computer Name" -U "SQL Server UserName" -P "SQL Server Password" -n -q
Pause
And here we go!!
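For note a) above, one way to make sure the destination table exists but is empty before the import is a quick sqlcmd call (just a sketch; the placeholders mirror the ones used in the bat files):
sqlcmd -S "Computer Name" -U "SQL Server UserName" -P "SQL Server Password" -Q "TRUNCATE TABLE [Destination_DataBase_Name].[dbo].[TableName]"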
Running something that large inside a single transaction is not a good idea. Therefore, I'd recommend breaking up the file into smaller, more manageable chunks.
Another option is to look at some of the other ways to import CSV data directly.
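For instance, if the CSV file is accessible from the server, T-SQL's BULK INSERT avoids the huge script entirely (a minimal sketch; the table name, file path, and CSV layout are assumptions):
BULK INSERT dbo.MyTable
FROM 'C:\data\rows.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2, BATCHSIZE = 10000);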