I am trying to back up a SQL Server database via Perl DBI. Calling "backup database" via do() runs but usually does not produce a backup. Calling do() does create a backup when ODBC tracing is enabled. Calling prepare() and execute() fails.
I am using ActiveState Perl on Windows 7 Professional and SQL Server 2008 R2.
Here is a link to download the source code and various logs:
http://www.fileswap.com/dl/4VnYbCdk6R/ToZip.zip.html
(Click on slow download)
Here is the summary of logs
BothTraces made 3 backups but program aborted
-rwx------+ 1 SYSTEM SYSTEM 160256 Jan 16 09:39 perlEasy.bak
-rwx------+ 1 SYSTEM SYSTEM 160256 Jan 16 09:39 perlHard.bak
-rwx------+ 1 SYSTEM SYSTEM 160256 Jan 16 09:38 queryOS.bak
NoTracing made 1 backup, program aborted
-rwx------+ 1 SYSTEM SYSTEM 160256 Jan 16 10:15 queryOS.bak
DbiTrace made 1 backup, program aborted
-rwx------+ 1 SYSTEM SYSTEM 160256 Jan 16 10:19 queryOS.bak
OdbcTrace made 3 backups but program aborted
-rwx------+ 1 SYSTEM SYSTEM 159744 Jan 16 10:21 perlEasy.bak
-rwx------+ 1 SYSTEM SYSTEM 160256 Jan 16 10:21 perlHard.bak
-rwx------+ 1 SYSTEM SYSTEM 160256 Jan 16 10:21 queryOS.bak
Here's my program:
#!perl -w
#try to use DBI for SQL Server backup
#connect to database server
use v5.14; #enable modern Perl
use DBI; #database interface
my $dbHandle = DBI->connect("dbi:ODBC:Driver={SQL Server};Server=DavidZ") or die; #dbi prints a detailed error message
$dbHandle->{RaiseError} = 1; #enable failure on DBI problems; obviates the need for "or die" with every DBI call
$dbHandle->{PrintError} = 0; #don't duplicate error messages
#enable debugging
$dbHandle->trace(1);
$dbHandle->{odbc_trace} = 1; #not helpful
$dbHandle->{odbc_trace_file} = 'C:\David\dump\tracer.file'; #not helpful
#run a SQL command to verify connection, write a note to ERRORLOG
$dbHandle->do ('use master');
$dbHandle->do ("raiserror ('New run of backup.pl', 0, 0) with log");
say 'Verified database connection';
#backup commands
my $perlEasy = "backup database dz to disk='C:\\David\\dump\\perlEasy.bak'";
my $perlHard = "backup database dz to disk='C:\\David\\dump\\perlHard.bak'";
my $queryOS = "backup database dz to disk='C:\\David\\dump\\queryOS.bak'";
#make a backup via sqlcmd. this works
my $sysCmd = "sqlcmd -Q \"$queryOS\" ";
system ($sysCmd) == 0
or die "The following system command failed: $sysCmd \n";
say 'Created backup via sqlcmd';
#try to make a backup via DBI
$dbHandle->do ($perlEasy); #runs silently but does not produce a backup file
say 'Created backup the easy way';
#more complicated DBI method
my $stHandle = $dbHandle->prepare($perlHard);
$stHandle->execute(); #statement starts a backup then fails, no further code is executed
do
{
    #print dbi results
    say "DBI reports $DBI::errstr";
    while (my @row = $stHandle->fetchrow_array()) #recommended by someone, but makes no sense for a backup
    { say "Returned values: @row" } #recommended by someone, but makes no sense for a backup
} while ($stHandle->{odbc_more_results});
say 'Created backup the hard way';
#program completion
say 'Program completed successfully';
exit 0;
There is nothing wrong with the Perl code you show. However, the ODBC trace file shows that DBD::ODBC made these calls just before the error:
SQLPrepare backup database dz to disk='C:\David\dump\perlHard.bak'
SQLExecute returns SQL_SUCCESS_WITH_INFO and
Processed 208 pages for database 'dz', file 'dz_test' on file 1. (4035)
then a few SQLErrorW calls on various handles
SQLRowCount returns ok and -1 for the row count
SQLNumResultCols returns SQL_ERROR and "Invalid cursor state"
I cannot for the life of me see how this is an invalid cursor state (look up the valid ODBC state transitions yourself if you like), so I'd have to say this looks like a bug in the SQL Server ODBC driver you are using. You could try getting a newer one or use the SQL Server Native Client driver instead (you've probably got both already).
You can ignore the errors in your SQL Server log as they are correct: error 1235 is ERROR_REQUEST_ABORTED, which it was.
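If you stay on that driver in the meantime, one workaround worth trying is to drain every result the BACKUP statement returns before the statement handle is freed, since BACKUP emits several informational messages as extra results. A minimal sketch, assuming the Native Client driver is installed under the name "SQL Server Native Client 10.0" (check your ODBC Data Source Administrator for the exact name; the backup file name is illustrative):

#!perl -w
use v5.14;
use DBI;

#assumed driver name and trusted connection; adjust to your setup
my $dbh = DBI->connect(
    'dbi:ODBC:Driver={SQL Server Native Client 10.0};Server=DavidZ;Trusted_Connection=yes',
    undef, undef, { RaiseError => 1, PrintError => 0 });

my $sth = $dbh->prepare(q{backup database dz to disk='C:\David\dump\perlTest.bak'});
$sth->execute();

#loop until the driver has processed every result the backup produced
do {
    if ($sth->{NUM_OF_FIELDS}) { #only fetch when a result set actually has columns
        while (my @row = $sth->fetchrow_array()) { say "@row" }
    }
} while ($sth->{odbc_more_results});
say 'Backup statement completed';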
In SQL Server 2016, I am executing a SQL script through SQLCMD like this:
SQLCMD -H XXXXXX,1433 -U username -P password -d mydatabase
-v varMDF="testing" -i "Script.sql" -o "DATA.txt"
and in Script.sql, I want to echo some text to the console, just to see the progress. I have a while loop in the script and am executing the command
echo I am in sql script
as shown here:
OPEN tab_cursor
FETCH NEXT FROM tab_cursor INTO @tablename
WHILE @@FETCH_STATUS = 0
BEGIN
    !!echo i am in sql script
    PRINT @tablename
    FETCH NEXT FROM tab_cursor INTO @tablename
END
CLOSE tab_cursor
DEALLOCATE tab_cursor
The problem is, it displays the line "i am in sql script" only once in the console, but I can see many entries for tablename in my output file. Please help solve this issue or suggest another way to do this.
Thanks
I would try the following solutions in order:
1) Look into BCP; it might allow you to see what you are doing much more effectively, and depending on the size of your output file it may be significantly faster. (1b: look into SSIS, even though it's a huge pain)
2) Put a SQLCMD execution inside Script.sql that does the data push to the file, and let the PRINT statements work as normal without -o. (NOTE: if this is a Complicated Stored Procedure, why aren't you writing a Complicated Stored Procedure?)
3) Monkey with server monitoring and Profiler. This would be for debugging purposes only, if that's why you need the output.
Generally, it sounds to me like the source of your problem is that you're using the wrong tool for the job. If you want lots of output from SQLCMD on process status, you're probably using it where you should be using BCP, which is designed for doing exports programmatically. SQLCMD isn't all that great an interface for running complicated scripts, in my experience; it's best for fire-and-forget.
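If BCP fits your case, here is a minimal sketch of the kind of export suggestion 1 describes, reusing the server and credentials from the question (the query itself is a placeholder):

bcp "SELECT name FROM mydatabase.sys.tables" queryout "DATA.txt" -S XXXXXX,1433 -U username -P password -c

The -c switch writes plain character data, and bcp streams the result set straight to the file, so there is no -o redirection competing with console progress messages.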
I'm trying to back up my production database to my local dev machine with the following constraints:
It will be a regular backup, so using the UI should (ideally?) be avoided.
Some tables are marked to be deleted, so these should not be included in the backup.
I would like to be able to pass the solution (file/package/etc) to other members of the team and they should only have to change a couple of variables in one file and then they can execute and get their own backup.
The DB is over 100GB and contains data that I won't need. I have identified the top largest tables and would only like to take say 5k rows from each - this should provide me with enough data for my purposes and limit space used on my local drives.
I have tried beginning with backing up the schema only using the following methods:
Using the UI to backup Schema only (Tasks -> Generate Scripts)
Get the following error:
Microsoft.SqlServer.Management.Smo.FailedOperationException: Discover dependencies failed. ---> System.ArgumentException: Item has already been added. Key in dictionary: 'Server[@Name='PRODSQL']/Database[@Name='Database1']/UnresolvedEntity[@Name='SomeObjectName' and @Schema='Some.Schema.SomeObjectName']' Key being added: 'Server[@Name='PRODSQL']/Database[@Name='Database1']/UnresolvedEntity[@Name='SomeObjectName' and @Schema='Some.Schema.SomeObjectName']'
   at System.Collections.SortedList.Add(Object key, Object value)
   at Microsoft.SqlServer.Management.Smo.DependencyTree..ctor(Urn[] urns, DependencyChainCollection dependencies, Boolean fParents, Server server)
   at Microsoft.SqlServer.Management.Smo.DependencyWalker.DiscoverDependencies(Urn[] urns, Boolean parents)
   --- End of inner exception stack trace ---
   at Microsoft.SqlServer.Management.SqlScriptPublish.GeneratePublishPage.worker_DoWork(Object sender, DoWorkEventArgs e)
   at System.ComponentModel.BackgroundWorker.OnDoWork(DoWorkEventArgs e)
   at System.ComponentModel.BackgroundWorker.WorkerThreadStart(Object argument)
So I moved on.
Tasks -> Copy Database
I get a message saying there is not enough space on disk
Extract Data-tier Application
Get same error as in 1. above.
Powershell script, and a batch file calling sqlcmd on the generated .sql files after the PS script was run.
I was sure this method would work, and it took me 2 days to get this far, but I am still working through multiple errors.
Basically I am doing the following:
Create db objects from the source DB (Schemas, SPs, Tables, Views, UDFs, Triggers, Indexes) and output them to .sql files - Roughly followed http://cfmumbojumbo.com/index.cfm/coding/using-powershell-to-backup-your-stored-procedures-and-triggers/ with some more work added.
If the database already exists on my server, kill its connections, drop it, then recreate it (DropCreate.sql):
IF(db_id(@DatabaseName) IS NOT NULL)
BEGIN
    DECLARE @SQL VARCHAR(max)
    SELECT @SQL = COALESCE(@SQL,'') + 'Kill ' + Convert(VARCHAR, SPId) + ';'
    FROM MASTER..SysProcesses
    WHERE DBId = DB_ID(@DatabaseName) AND SPId <> @@SPId
    EXEC(@SQL);
END
DROP DATABASE MYDATABASE
CREATE DATABASE MYDATABASE ON PRIMARY (...)
The .bat file is essentially doing this
sqlcmd -S %Server% -U %UserName% -P %Password% -i C:\Database\DatabaseScripts\TestDatabase\DropCreateDB.sql
#loop through and execute multiple .sql files in the directory
for /f %f in (`dir /b C:\Database\DatabaseScripts\TestDatabase\2018-04-18-15-13-13\StoredProcedures\`) do sqlcmd -S %Server% -d %Database% -U %UserName% -P %Password% -i %f
#Just one sql file in this directory, execute it
sqlcmd -S %Server% -d %Database% -U %UserName% -P %Password% -i C:\Database\DatabaseScripts\TestDatabase\2018-04-18-15-13-13\Schemas\AllSchemas.sql
sqlcmd -S %Server% -d %Database% -U %UserName% -P %Password% -i C:\Database\DatabaseScripts\TestDatabase\2018-04-18-15-13-13\Tables\AllTables.sql
.............
The latest error I'm experiencing is:
Changed database context to 'master'.
Msg 6107, Level 14, State 1, Server MYSERVER, Line 1
Only user processes can be killed.
do was unexpected at this time.
Everywhere I turn I am experiencing new errors. I have spent over 2 days on this, and I haven't even gotten to extracting the data yet.
TLDR: Is there any easier way to back up an MSSQL database schema and the top n rows of data from certain tables?
I am trying to export data using a JDBC connection (I am using a DB2 database), but it is failing with the following error:
Caused by: com.ibm.db2.jcc.c.SqlException: DB2 SQL error: SQLCODE: -3001, SQLSTATE: , SQLERRMC: sqlofopn -2029060079
Query I used:
call admin_cmd('EXPORT TO /home/user/test_1/db_extract.csv OF DEL
MODIFIED BY NOCHARDEL SELECT * from mytable fetch first 5 rows only');
I gave 755 access to the test_1 folder as well.
I tried removing the admin_cmd as well, but got a BEGIN OF STATEMENT error.
I also tried the same query using PuTTY, but no luck; I get this error:
SQL3001C An I/O error (reason = "sqlofopn -2029060079") occurred
while opening the output file.
You need to grant permission to the db2 fenced user and/or the group that the fenced user is a member of. For example:
-- file /tmp/stack.sql
connect to pocdb user proksch using in4mix;
call admin_cmd('export to /stack/my.unl of del select * from proksch.foo');
connect reset;
terminate;
The following stack.sh script was run as root (or another user that can set acls dynamically):
#!/bin/bash
# run as root to set acls
DB2=/home/db2inst1/sqllib/bin/db2

function rmperms {
    rm -f /stack/my.unl > /dev/null 2> /dev/null
    setfacl -x user:db2inst1 /stack
    setfacl -x user:db2fence /stack
    setfacl -x group:db2 /stack
}

function setperms {
    setfacl -m $1 /stack
}

function getperms {
    echo " "
    echo "Perms on /stack"
    ls -l / | grep stack
    getfacl /stack --tabular --absolute-names --recursive
    echo " "
}
rmperms
getperms
su --command="${DB2} -tvf /tmp/stack.sql" db2inst1
rmperms
setperms user:db2inst1:rwx
getperms
su --command="${DB2} -tvf /tmp/stack.sql" db2inst1
rmperms
setperms user:db2fence:rwx
getperms
su --command="${DB2} -tvf /tmp/stack.sql" db2inst1
rmperms
setperms group:db2:rwx
getperms
su --command="${DB2} -tvf /tmp/stack.sql" db2inst1
Yields the following results:
Perms on /stack
drwxr-xr-x+ 2 unload ops 4096 Mar 22 16:17 stack
# file: /stack
USER unload rwx
GROUP ops r-x
mask r-x
other r-x
connect to pocdb user proksch using
Database Connection Information
Database server = DB2/LINUXX8664 10.5.3
SQL authorization ID = PROKSCH
Local database alias = POCDB
call admin_cmd('export to /stack/my.unl of del select * from proksch.foo')
SQL3001C An I/O error (reason = "sqlofopn -2079391743") occurred while
opening the output file.
connect reset
DB20000I The SQL command completed successfully.
terminate
DB20000I The TERMINATE command completed successfully.
Perms on /stack
drwxrwxr-x+ 2 unload ops 4096 Mar 22 16:17 stack
# file: /stack
USER unload rwx
user db2inst1 rwx
GROUP ops r-x
mask rwx
other r-x
connect to pocdb user proksch using
Database Connection Information
Database server = DB2/LINUXX8664 10.5.3
SQL authorization ID = PROKSCH
Local database alias = POCDB
call admin_cmd('export to /stack/my.unl of del select * from proksch.foo')
SQL3001C An I/O error (reason = "sqlofopn -2079391743") occurred while
opening the output file.
connect reset
DB20000I The SQL command completed successfully.
terminate
DB20000I The TERMINATE command completed successfully.
Perms on /stack
drwxrwxr-x+ 2 unload ops 4096 Mar 22 16:17 stack
# file: /stack
USER unload rwx
user db2fence rwx
GROUP ops r-x
mask rwx
other r-x
connect to pocdb user proksch using
Database Connection Information
Database server = DB2/LINUXX8664 10.5.3
SQL authorization ID = PROKSCH
Local database alias = POCDB
call admin_cmd('export to /stack/my.unl of del select * from proksch.foo')
Result set 1
--------------
ROWS_EXPORTED        MSG_RETRIEVAL MSG_REMOVAL
-------------------- ------------- -----------
                   5 -             -
1 record(s) selected.
Return Status = 0
connect reset
DB20000I The SQL command completed successfully.
terminate
DB20000I The TERMINATE command completed successfully.
Perms on /stack
drwxrwxr-x+ 2 unload ops 4096 Mar 22 16:17 stack
# file: /stack
USER unload rwx
GROUP ops r-x
group db2 rwx
mask rwx
other r-x
connect to pocdb user proksch using
Database Connection Information
Database server = DB2/LINUXX8664 10.5.3
SQL authorization ID = PROKSCH
Local database alias = POCDB
call admin_cmd('export to /stack/my.unl of del select * from proksch.foo')
Result set 1
--------------
ROWS_EXPORTED        MSG_RETRIEVAL MSG_REMOVAL
-------------------- ------------- -----------
                   5 -             -
1 record(s) selected.
Return Status = 0
connect reset
DB20000I The SQL command completed successfully.
terminate
DB20000I The TERMINATE command completed successfully.
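In short, the export succeeded only once the fenced user (or a group it belongs to) could write to the target directory; either of these grants, taken from the runs above, is sufficient:

setfacl -m user:db2fence:rwx /stack
# or, by group:
setfacl -m group:db2:rwx /stack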
On Windows, the problem was path access.
Running db2 with administrator privileges solved the problem (it can also be solved by changing the output paths to locations the instance can write to).
I used Red Gate SQL Data Compare to generate a .sql file, so I could run it on my local machine. But the problem is that the file is over 300MB, which means I can't copy and paste it because the clipboard can't handle it, and when I try to open the file in SQL Server Management Studio I get an error about the file being too large.
Is there a way to run a large .sql file? The file basically contains data for two new tables.
From the command prompt, start up sqlcmd:
sqlcmd -S <server> -i C:\<your file here>.sql
Just replace <server> with the location of your SQL box and <your file here> with the name of your script. Don't forget, if you're using a SQL instance the syntax is:
sqlcmd -S <server>\instance.
Here is the list of all arguments you can pass sqlcmd:
Sqlcmd [-U login id] [-P password]
[-S server] [-H hostname] [-E trusted connection]
[-d use database name] [-l login timeout] [-t query timeout]
[-h headers] [-s colseparator] [-w screen width]
[-a packetsize] [-e echo input] [-I Enable Quoted Identifiers]
[-c cmdend] [-L[c] list servers[clean output]]
[-q "cmdline query"] [-Q "cmdline query" and exit]
[-m errorlevel] [-V severitylevel] [-W remove trailing spaces]
[-u unicode output] [-r[0|1] msgs to stderr]
[-i inputfile] [-o outputfile] [-z new password]
[-f | i:[,o:]] [-Z new password and exit]
[-k[1|2] remove[replace] control characters]
[-y variable length type display width]
[-Y fixed length type display width]
[-p[1] print statistics[colon format]]
[-R use client regional setting]
[-b On error batch abort]
[-v var = "value"...] [-A dedicated admin connection]
[-X[1] disable commands, startup script, environment variables [and exit]]
[-x disable variable substitution]
[-? show syntax summary]
I had exactly the same issue and had been struggling for a while, then finally found the solution: set the -a parameter of sqlcmd to change its default packet size:
sqlcmd -S [servername] -d [databasename] -i [scriptfilename] -a 32767
You can use this tool as well. It is really useful.
BigSqlRunner
Open a command prompt with administrator privileges.
Change directory to where the .sql file is stored.
Execute the following command:
sqlcmd -S 'your server name' -U 'user name of server' -P 'password of server' -d 'db name' -i script.sql
I am using MSSQL Express 2014 and none of the solutions worked for me. They all just crashed SQL. As I only needed to run a one-off script with many simple insert statements, I got around it by writing a little console app as a very last resort:
class Program
{
    static void Main(string[] args)
    {
        RunScript();
    }

    private static void RunScript()
    {
        // Entity Framework context for the target database
        My_DataEntities db = new My_DataEntities();

        // Execute the script one line at a time; this works here because
        // the file contains only simple single-line INSERT statements
        using (var file = new System.IO.StreamReader("c:\\ukpostcodesmssql.sql"))
        {
            string line;
            while ((line = file.ReadLine()) != null)
            {
                db.Database.ExecuteSqlCommand(line);
            }
        }
    }
}
Run it at the command line with osql, see here:
http://metrix.fcny.org/wiki/display/dev/How+to+execute+a+.SQL+script+using+OSQL
Hope this helps!
sqlcmd -U UserName -S <ServerName\InstanceName> -i U:\<Path>\script.sql
I had a similar problem. My sql script file was over 150MB (with almost 900k very simple INSERTs). I used the solution advised by Takuro (as the answer in this question) but I still got an error with a message saying that there was not enough memory ("There is insufficient system memory in resource pool 'internal' to run this query").
What helped me was putting a GO command after every 50k INSERTs.
(It's not directly addressing the question (file size), but I believe it resolves a problem that is indirectly connected with the large size of the sql script itself: in my case, many insert commands.)
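Each GO ends a batch, so the server parses and frees the statements chunk by chunk instead of holding the whole file as one batch. A minimal sketch of the layout (table and values are placeholders):

INSERT INTO dbo.MyTable (Col1) VALUES (1);
INSERT INTO dbo.MyTable (Col1) VALUES (2);
-- ... up to roughly 50k INSERTs ...
GO
INSERT INTO dbo.MyTable (Col1) VALUES (50001);
-- ... next batch ...
GO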
sqlcmd -S [servername] -d [databasename] -i [scriptfilename] -a 32767
I successfully ran this command on a 365MB sql file; it completed in about 15 minutes and solved a problem that had taken me a long time to figure out.
Run the script file
Open a command prompt window.
In the Command Prompt window, type: sqlcmd -S <ServerName\InstanceName> -i C:\yourScript.sql
Press ENTER.
Your question is quite similar to this one
You can save your file/script as .txt or .sql and run it from SQL Server Management Studio (I think the menu is Open/Query, then just run the query in the SSMS interface). You might have to update the first line, indicating the database to be created or selected on your local machine.
If you have to do this data transfer very often, you could then go for replication. Depending on your needs, snapshot replication could be ok. If you have to synch the data between your two servers, you could go for a more complex model such as merge replication.
EDIT: I didn't notice that your problems with SSMS were linked to file size. Then you can go with the command line, as proposed by others, snapshot replication (publish on your main server, subscribe on your local one, replicate, then unsubscribe), or even backup/restore.
The file basically contains data for two new tables.
Then you may find it simpler to just DTS (or SSIS, if this is SQL Server 2005+) the data over, if the two servers are on the same network.
If the two servers are not on the same network, you can backup the source database and restore it to a new database on the destination server. Then you can use DTS/SSIS, or even a simple INSERT INTO SELECT, to transfer the two tables to the destination database.
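For the two-table case mentioned in the question, the final transfer could be as simple as this sketch (database, table, and column names are placeholders):

INSERT INTO DestinationDb.dbo.NewTable1 (Col1, Col2)
SELECT Col1, Col2
FROM RestoredSourceDb.dbo.NewTable1;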
There is another option for the fellows still encountering problems importing really large SQL dumps.
What can also be considered, when possible: if you have access to the server, you could export the database in multiple parts, like first the structure, then per table (or related objects) an export of the data in smaller pieces, instead of one big file.
When you don't have access to the server and/or are required to use the existing big file, you could try to split it into parts with SQLDumpSplitter: https://philiplb.de/sqldumpsplitter3/.
Then import the pieces to get a full copy of the database.
Good luck, guys.
I already have an existing database I am trying to keep up to date on a daily basis. I get a daily dump of sql files. The batch script below created and populated the database the first time I ran it, but it doesn't work when I try to update the database with it.
@echo off
ECHO %USERNAME% started the batch process at %TIME% >output.txt
for %%f in (*.sql) do (
    sqlcmd.exe -S servername -E -d DatabaseName -i %%f >>output.txt
)
pause
Is there a different command for updating a database with sql files?
This is the output I get:
HUTRC started the batch process at 9:55:12.25
Changed database context to 'master'.
Msg 15416, Level 16, State 1, Server HUTRC1-HP, Procedure sp_dbcmptlevel, Line 67
Usage: sp_dbcmptlevel [dbname [, compatibilitylevel]]
Valid values of the database compatibility level are 100, 110, or 120.
Here is what the sql file looks like. It is very long; I have included just the first few lines.
USE [master]
GO
IF NOT EXISTS (SELECT [name] FROM sys.databases WHERE name = N'Migration')
BEGIN
CREATE DATABASE [Migration] COLLATE SQL_Latin1_General_CP1_CI_AS
END
GO
EXEC dbo.sp_dbcmptlevel @dbname=N'Migration', @new_cmptlevel=90
The error message tells you precisely where to look for the problem and what it is (emphasis added):
Procedure sp_dbcmptlevel, Line 67
Usage: sp_dbcmptlevel [dbname [, compatibilitylevel]]
Valid values of the database compatibility level are 100, 110, or 120.
Examining your SQL script for sp_dbcmptlevel shows that it uses a different value:
EXEC dbo.sp_dbcmptlevel @dbname=N'Migration', @new_cmptlevel=90
                                                              ^^
You'll need to either edit the SQL script to use a valid compatibility level, or downgrade your server to the same version as the source server.
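For example, the offending line could be edited to a level the target server accepts (100 shown here; pick the value that matches your version), or replaced with the non-deprecated equivalent. A sketch, not the asker's confirmed fix:

EXEC dbo.sp_dbcmptlevel @dbname=N'Migration', @new_cmptlevel=100
-- or, on SQL Server 2008 and later:
ALTER DATABASE [Migration] SET COMPATIBILITY_LEVEL = 100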