Database script not executing - sql-server

In my SQL Server 2008 R2 instance,
I have a database script that generates successfully, but when I try to execute that script, it only shows the "Executing query..." message and nothing happens.
I waited at least 10 minutes for a result, but in the end I had to stop the query forcefully.
Note: All other queries work normally; only generated database scripts fail to execute, as explained above.
I don't know what's going on...
More details: This is not happening on one particular database; it is a problem for every database on my SQL Server instance.
Not a single database's script works; each one just keeps showing "Executing query..."

Related

SQL Server Management Studio commands latency

I noticed, while running an UPDATE script against a remote DB server located in another country, that the command completes successfully in about 1 hour if I run it from Management Studio on my local machine. If, instead, I run it from Management Studio installed on the server itself, it takes about 10 minutes to execute.
So, my question is: does latency matter even when there is no "data traffic" generated by my command? I would expect latency to matter for an INSERT or SELECT, because data actually has to travel from one machine to the other. That's not my case: the UPDATE commands I'm executing produce no output and carry no input data.

SQL Server Stored Procedure Select Into statement not completing

I'm having an issue with a SQL Server stored procedure being executed as a scheduled task through Task Scheduler. I have a batch file containing the EXECUTE statement, which is called by Task Scheduler. The platform is SQL Server 2008 R2 on Windows Server 2008 R2.
The batch file code:
@echo off
SQLCmd -S lccc-cpc-sql -E -d NTSR -Q "Execute update_vw_NTSR_Base_AllRecords_Labels_new_proc"
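(As an aside, sqlcmd's -b switch makes it exit with a non-zero ERRORLEVEL when a SQL error occurs, so a failure like the one described below becomes visible to Task Scheduler; the same call with that switch added:
SQLCmd -b -S lccc-cpc-sql -E -d NTSR -Q "Execute update_vw_NTSR_Base_AllRecords_Labels_new_proc"
)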
This SP does the following:
Drops a table
Recreates it with updated data using a SELECT INTO statement (see the sketch below)
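For context, a minimal sketch of that pattern; the table, columns, and lookup joins here are hypothetical stand-ins for the real, much larger statement:
IF OBJECT_ID('dbo.NTSR_Labeled', 'U') IS NOT NULL
    DROP TABLE dbo.NTSR_Labeled
SELECT b.RecordID,
       d.Label AS DiagnosisLabel,    -- one of ~100 lookup joins in the real proc
       s.Label AS SiteLabel
INTO   dbo.NTSR_Labeled              -- SELECT INTO recreates the table
FROM   dbo.BaseRecords AS b
LEFT JOIN dbo.LookupDiagnosis AS d ON d.Code = b.DiagnosisCode
LEFT JOIN dbo.LookupSite      AS s ON s.Code = b.SiteCode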
Problem: It's running the DROP statement, but failing on the SELECT INTO. Here's what's weird though:
If I execute the sp through SSMS (right-click the sp, choose Execute) or, via a query editor, run the code to drop the table followed by the SELECT INTO statement, it finishes correctly. It's a very large SELECT INTO statement - hundreds of columns and about 100 joins. The purpose is to join a lot of lookup tables to values so I have one place for my users to go for labeled data, plus some variables computed for user-friendliness. It's messy, but it's what I have to work with.
Query timeout is set to 0 (no limit). This only started recently, as I added more columns and variables, but if that were the cause it seems it would fail when called through any method, not just through the batch file. Any thoughts on how to make this work as-is (i.e. without breaking it up into multiple SELECT INTO statements)?
Thanks.

Command(s) completed successfully but... the tables are not created

I backed up my database tables and entire schema into a .sql script using Visual Studio's Database Publishing Wizard.
I then tried to re-create those tables on another PC, but not before re-creating the database itself, with the exact same name and everything (using a script I created via SSMS's Script Database as).
I then opened the tables' .sql file in SSMS and executed it.
SSMS reports:
Command(s) completed successfully
But examining Object Explorer reveals that no tables were created.
Why is this happening?
What have I missed?
I've just had the exact same symptoms, also using Visual Studio's Database Publishing Wizard, but with a slightly different cause/fix.
Even though SQL Server Management Studio said it was connected to the correct database (in the drop-down in the toolbar, and in the status bar of the window), it wasn't actually connected to anything.
To identify and fix either case, run:
SELECT DB_NAME() AS DataBaseName
If you get
master
(or any other unexpected database name) as the result, then you are connected to the wrong database and should select the correct DB from the dropdown. This was the cause of your problem.
If you get
Command(s) completed successfully
then somehow you aren't connected at all - this happened to me.
To fix, click the "change connection" button to disconnect and reconnect.
Check whether you have selected the correct database. Most of the time we execute the query in the master db by mistake.
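A simple guard is to start the script with an explicit USE (MyDatabase below stands in for your actual database name):
USE MyDatabase
GO
-- everything below now runs in the intended database, not master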
Check whether you are actually running "Execute" or just parsing the code. It was a late night, I was tired, and I kept running a query to create a table, "successfully", but no new table appeared. The next day, with a clear mind, I noticed that I was not actually running the query, I was parsing it.

DTS - Manual step execution gives different results than just hitting "Run"

I have a DTS (not SSIS) package that hasn't been touched in years, in which I had to update a query. When I run the package by manually executing each step in the editor, everything works fine and it generates a file of a couple thousand records, as expected. When I hit the "Execute" button at the top of the editor to run the whole package, it doesn't error, but the file is generated with only 1 record.
All tasks inside the package are either transformation steps or SQL tasks. There aren't any ActiveX script tasks. When I watch the package run the steps by itself, the execution follows the mapping correctly.
I'm at a loss on this one. Has anyone seen this issue before or have any idea where to start?
I just ran into a similar issue recently. While working with the senior DBA, we found that the server where the package ran did not have the right permissions to a directory on the network. The package ran fine on my box, but died on the production server. We needed to give the sqlservice account on the production box permission to write to that network directory.
You might also want to check any ActiveX Script step that changes the connection string or destination of Data Pump steps. I've had cases where these were different on the destination server where the DTS packages run.
After going through all of the lines of all of the stored procedures and plain SQL tasks used in the package, I located a SET ROWCOUNT 1 that was never reset. While I was manually executing each step separately, the ROWCOUNT was reset automatically (each manual run gets its own session); however, when the package ran as a whole, the ROWCOUNT was never reset, so later statements were capped at one row. Adding SET ROWCOUNT 0 at the end of the particular script resolved the issue.
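The effect is easy to reproduce in a single session; a minimal sketch (the temp table names are arbitrary):
SET ROWCOUNT 1                              -- limits every following statement to 1 row
SELECT name INTO #demo1 FROM sys.objects    -- #demo1 ends up with exactly one row
SET ROWCOUNT 0                              -- removes the limit for the rest of the session
SELECT name INTO #demo2 FROM sys.objects    -- #demo2 gets all qualifying rows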

How can I have Sql Server 2005 asynchronously call a DOS batch file from a DDL trigger?

I created a batch file to run SqlMetal and generate Linq2Sql data classes, check them into source control (triggering a build), etc. I'd like to have this script run any time there is a DDL change in SQL Server 2005.
Running the batch file via xp_cmdshell works fine outside of a trigger, like this:
exec master..xp_cmdshell 'd:\dev\db_triggers\generatedataclasses.bat', no_output
But when it runs as a trigger, it always times out connecting to the database, causing all DDL to fail. Here's my trigger:
CREATE TRIGGER [Trig_SqlMetal]
ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
exec master..xp_cmdshell 'd:\dev\db_triggers\generatedataclasses.bat', no_output
I'm looking for advice on two points:
Make this work. For some reason it always fails when run in the trigger, and doesn't fail when run outside a trigger. It doesn't appear to be security related, since it runs as LocalSystem in both cases.
Make this happen asynchronously, so that failures and timeouts in SqlMetal don't cause the DDL statement to fail. I've tried wrapping the batch file with another one containing "start cmd.exe /c otherbatch.bat", but when run through SQL Server it seems to ignore the start (it works fine from DOS). I could certainly write a polling process that looks at some table and picks up events, but I'd prefer this to be trigger-based to keep it less complex (or am I doing the opposite :) ).
Your batch is probably being blocked because it tries to query data about the tables being created, but those tables are still locked inside a transaction (the trigger is part of the implicit transaction SQL Server starts for any DDL/DML statement), which will complete only after the trigger finishes.
The only "almost practical" way of asynchronous execution in SQL Server 2005 or higher that I know of is Service Broker. Look for "Service Broker Internal Activation".
In practice it is a bit complex to set up properly, so you might well choose to go with the polling option instead.
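For reference, a heavily condensed sketch of internal activation. All object names here are hypothetical, the database needs Service Broker enabled, and xp_cmdshell must be allowed for the activated procedure's security context:
-- Queue and service that the DDL trigger will send messages to
CREATE QUEUE dbo.SqlMetalQueue
CREATE SERVICE SqlMetalService ON QUEUE dbo.SqlMetalQueue ([DEFAULT])
GO
-- Activated procedure: runs on its own session, outside the DDL transaction,
-- so calling the batch file here no longer blocks the DDL statement
CREATE PROCEDURE dbo.SqlMetalActivation
AS
BEGIN
    DECLARE @h UNIQUEIDENTIFIER
    RECEIVE TOP (1) @h = conversation_handle FROM dbo.SqlMetalQueue
    IF @h IS NOT NULL
    BEGIN
        EXEC master..xp_cmdshell 'd:\dev\db_triggers\generatedataclasses.bat', no_output
        END CONVERSATION @h
    END
END
GO
ALTER QUEUE dbo.SqlMetalQueue WITH ACTIVATION (
    STATUS = ON,
    PROCEDURE_NAME = dbo.SqlMetalActivation,
    MAX_QUEUE_READERS = 1,
    EXECUTE AS SELF )
GO
-- The DDL trigger then only queues a message instead of calling xp_cmdshell:
-- DECLARE @h UNIQUEIDENTIFIER
-- BEGIN DIALOG CONVERSATION @h FROM SERVICE SqlMetalService
--     TO SERVICE 'SqlMetalService' WITH ENCRYPTION = OFF
-- SEND ON CONVERSATION @h ('regenerate')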
