I have a table that is truncated and loaded with data every day. The problem is that truncating the table takes a while, and users are noticing. What I am wondering is: is there a way to have two copies of the same table, truncate one, load the new data into it, and then point users at the freshly loaded table, switching between the two tables each day?
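One common pattern for this is to have users query a synonym rather than the table itself, load the offline copy, and then repoint the synonym, which is a near-instant metadata change. A sketch, where the synonym name `dbo.MyTableCurrent`, the staging table, and the columns are all hypothetical:

```sql
-- Assumes dbo.MyTable_A is live (behind the synonym) and dbo.MyTable_B is offline.
-- 1. Refresh the offline copy.
TRUNCATE TABLE dbo.MyTable_B;
INSERT INTO dbo.MyTable_B (col1, col2, col3)
SELECT col1, col2, col3
FROM SourceDatabase.dbo.SourceTable;

-- 2. Repoint the synonym; users querying dbo.MyTableCurrent see the new data.
BEGIN TRANSACTION;
DROP SYNONYM dbo.MyTableCurrent;
CREATE SYNONYM dbo.MyTableCurrent FOR dbo.MyTable_B;
COMMIT;
```

On the next refresh you would load `dbo.MyTable_A` and point the synonym back. `sp_rename` or `ALTER TABLE ... SWITCH` are alternatives with similar trade-offs.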
If you're clearing out the old table as well as populating the new one, you could use the OUTPUT clause. Be mindful of the potential for log growth; consider a loop/batch approach if this may be a problem.
DELETE OldDatabase.dbo.MyTable
OUTPUT
    DELETED.col1
  , DELETED.col2
  , DELETED.col3
INTO NewDatabase.dbo.MyTable;
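The loop/batch approach mentioned above could look like the following sketch; the batch size of 5000 is an arbitrary assumption to tune against your log capacity:

```sql
-- Delete and copy in batches so each transaction stays small.
WHILE 1 = 1
BEGIN
    DELETE TOP (5000)
    FROM OldDatabase.dbo.MyTable
    OUTPUT DELETED.col1, DELETED.col2, DELETED.col3
    INTO NewDatabase.dbo.MyTable;

    -- Stop once a batch deletes nothing.
    IF @@ROWCOUNT = 0 BREAK;
END
```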
Or you can use bcp, which is a handy alternative to be aware of. Note this uses SQLCMD-mode syntax.
:setvar SourceServer OldServer
:setvar SourceDatabase OldDatabase
:setvar DestinationServer NewServer
:setvar DestinationDatabase NewDatabase
:setvar BCPFilePath "C:\"
!!bcp "$(SourceDatabase).dbo.MyTable" format nul -S "$(SourceServer)" -T -n -q -f "$(BCPFilePath)MyTable.fmt"
!!bcp "SELECT * FROM $(SourceDatabase).dbo.MyTable WHERE col1=x AND col2=y" queryout "$(BCPFilePath)MyTable.dat" -S "$(SourceServer)" -T -q -f "$(BCPFilePath)MyTable.fmt" > "$(BCPFilePath)MyTable.txt"
!!bcp "$(DestinationDatabase).dbo.MyTable" in $(BCPFilePath)MyTable.dat -S $(DestinationServer) -T -E -q -b 2500 -h "TABLOCK" -f $(BCPFilePath)MyTable.fmt
I can use the following command to do so, as long as I create the table and the appropriate columns first. I would like the command to create the table for me based on the results of my query.
psql -h remote.host -U myuser -p 5432 -d remotedb -c "copy (SELECT view.column FROM schema.view LIMIT 10) to stdout" | psql -h localhost -U localuser -d localdb -c "copy localtable from stdin"
Again, it will populate the data properly if I create the table and columns ahead of time, but it would be much easier if I could automate that with a command that creates the table according to the results of my query.
I'm using SQLCMD in PDW for extracting data into a flat file. The command line syntax is given below:
sqlcmd -S "10.20.30.40,19001" -d MyPDW_DB -U PDW_User -P Password1 -Q "SET QUOTED_IDENTIFIER ON; SELECT * FROM MyPDW_DB.dbo.SampleFact" -o "FactOut.txt" -s"|"
When I try to execute the batch file, I get the following error:
Msg 104409, Level 16, State 1, Server PdwTdsServer, Line 1
Setting QuotedIdentifier to 'OFF' is not supported.
I am assuming this is due to the fact that there is a comma in the server name (IP address, port number). I can use this command for extracting data from SQL tables. Any idea how I can make this work for PDW?
Thanks in advance
I got this working partially.
sqlcmd -S "10.20.30.40,19001" -d MyPDW_DB -U PDW_User -P Password1 -I -Q "SELECT * FROM MyPDW_DB.dbo.SampleFact" -o "FactOut.txt" -s"|"
The option to use is -I, which sets QUOTED_IDENTIFIER ON (sqlcmd defaults to OFF, which PDW does not support). However, I'm still trying to find an alternative to SET NOCOUNT ON, which is not supported in PDW. If someone can help me with that, I'd greatly appreciate it.
I have all the scripts to do:
Set up a database.
Create schema/s.
Create tables.
Create stored procedures.
I would like to write a batch file that will have SQL Server run those scripts, so my database can be created more easily and quickly. For the sake of this example, let's assume I have a folder C:\folder containing the files SetDatabase.sql, SetSchema.sql, SetTable.sql, and SetSP.sql. How would I set all that up on localhost\TSQL2012?
You can do this in PowerShell using sqlcmd:
sqlcmd -S serverName\instanceName -i scripts.sql
The statement above executes a single script. You can use the :r command in another file (scripts.sql) to gather all your scripts:
:r C:\..\script1.sql
:r C:\..\script2.sql
....
set _connectionCredentialsMaster=-S MyServer\MyInstance -d Master -U sa -P mypassword
set _connectionCredentialsMyDatabase=-S MyServer\MyInstance -d MyDatabase -U sa -P mypassword
set _sqlcmd="%ProgramFiles%\Microsoft SQL Server\110\Tools\Binn\SQLCMD.EXE"
%_sqlcmd% -i MyFileCreateDatabase001.sql -b -o MyFileCreateDatabase001.Sql.log %_connectionCredentialsMaster%
%_sqlcmd% -i MyFile001.sql -b -o MyFile001.Sql.log %_connectionCredentialsMyDatabase%
%_sqlcmd% -i MyFile002.sql -b -o MyFile002.Sql.log %_connectionCredentialsMyDatabase%
set _connectionCredentialsMaster=
set _connectionCredentialsMyDatabase=
set _sqlcmd=
Just remember, when you run the CREATE DATABASE statement you are actually using the master database. Then, after MyDatabase is created, you can use it. That is why the first line in the example above connects to master.
This approach lets you set the credentials once, at the top, and keep one line in the batch file per script file.
Use SQL Server Data Tools to implement this. It's worth reading up on before you start:
http://msdn.microsoft.com/en-in/data/tools.aspx
Every week, I have to run a script that truncates a bunch of tables. Then I use the export data task to move the data to another server (same database name).
The servers aren't linked, I can't save the export job, and my permissions/settings are limited by the DBA (I am an admin on the databases). I have Windows authentication only on both servers, and the servers are different versions (2005/2008).
My question is: is there a way to automate this with my limited ability to modify the servers? Perhaps using PowerShell?
Selecting all these tables and settings in the export wizard week after week is a pain.
If you have access to the SQL Server client tools (bcp and sqlcmd), try something like this from a different system.
C:\> bcp prod.dbo.[Table] out ExportImportFile.inp -b 10000 -S %SQLSERVER% -T -c > C:\Temp\ExportImport.log
C:\> sqlcmd -S %SQLSERVER% -E -Q "USE Prod; TRUNCATE TABLE [Table];" >> C:\Temp\ExportImport.log
C:\> bcp prod.dbo.[Table] in ExportImportFile.inp -b 10000 -S %SQLSERVER% -T -c >> C:\Temp\ExportImport.log
You can use dbatools:
$splat = @{
SqlInstance = '{source instance}'
Database = 'tempdb'
Destination = '{dest instance}'
DestinationDatabase = 'tempdb'
Table = 'table1' # you can provide a list of tables
AutoCreateTable = $true
Truncate = $true
}
Copy-DbaDbTableData @splat
If you don't have dbatools: https://dbatools.io/getting-started/
Is there a way, using SSMS or another tool, to output about 600 tables from a SQL Server database? The catch is they need to have column headers.
Basically I need to dump 600+ tables with a bar '|' delimiter, and they need to all have column names in the first row.
If I remember right, you should be able to use the command-line sqlcmd tool to export data together with headers. Something like this:
sqlcmd -S localhost -d YourDatabase -E -Q "SELECT * FROM YourTable" -o "CSVData.csv" -W -w 1024 -s"|"
You'll have to look into the options to get them right.
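To cover 600+ tables without typing each invocation by hand, one option is to generate the sqlcmd command lines from the catalog views and paste the result into a batch file. A sketch, where the output file naming is just an assumption:

```sql
-- Emit one sqlcmd call per user table, exporting pipe-delimited data with headers.
SELECT 'sqlcmd -S localhost -d YourDatabase -E -W -w 1024 -s"|" '
     + '-Q "SET NOCOUNT ON; SELECT * FROM ' + QUOTENAME(s.name) + '.' + QUOTENAME(t.name) + '" '
     + '-o "' + s.name + '.' + t.name + '.csv"'
FROM sys.tables AS t
JOIN sys.schemas AS s
  ON s.schema_id = t.schema_id
ORDER BY s.name, t.name;
```

SET NOCOUNT ON keeps the "(N rows affected)" trailer out of the files; -W trims trailing padding from each column.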