Copy entire SQL table to another and truncate original table - sql-server

I am writing a stored procedure that will copy the entire contents of a table called "CS_Consolidation" into a backup table called "CS_ConsolidationBackup2016". All fields are exactly the same, and each day's new data must simply be appended, after which the original table must be truncated.
I am, however, having a problem with my procedure and how it is written. If anyone can help:
CREATE PROCEDURE BackUpData2
AS
BEGIN
SET NOCOUNT ON;
SELECT *
INTO [dbo].[CS_ConsolidationBackUp]
FROM [dbo].[CS_Consolidation]
TRUNCATE TABLE [dbo].[CS_Consolidation]
GO

Why do you want to copy the data and then delete the original? This is entirely more complicated and stressful to the system than you need. There is no need to create a second copy of the data just so you can turn around and drop the first copy.
A much easier path would be to rename the current table and then create your new primary table.
EXEC sp_rename 'CS_Consolidation', 'CS_ConsolidationBackUp';
GO
select *
into CS_Consolidation
from CS_ConsolidationBackUp
where 1 = 0; --this ensures no rows but the entire structure is copied.
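One wrinkle: with a fixed backup name the rename fails on the second night because CS_ConsolidationBackUp already exists. A minimal sketch of a date-suffixed variant (the suffix format and variable name are my own assumptions, not from the answer above):
DECLARE @BackupName sysname = 'CS_ConsolidationBackUp_' + CONVERT(char(8), GETDATE(), 112);  -- e.g. ..._20160512
EXEC sp_rename 'CS_Consolidation', @BackupName;
-- recreate the empty primary table; note SELECT INTO copies columns only, not indexes or constraints
EXEC('SELECT * INTO CS_Consolidation FROM [' + @BackupName + '] WHERE 1 = 0;');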

If you are looking to create one backup table daily, would something like this work?
DECLARE @BackupTableName nvarchar(250)
SELECT @BackupTableName = 'CS_ConsolidationBackUp' + CAST(CONVERT(date, getdate()) as varchar(250))
IF EXISTS(SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = @BackupTableName)
BEGIN
EXEC('DROP TABLE [' + @BackupTableName + ']')
END
EXEC('SELECT * INTO [dbo].[' + @BackupTableName + '] FROM [dbo].[CS_Consolidation]')
TRUNCATE TABLE [dbo].[CS_Consolidation]

You are missing an "end" statement before "go". This is the correct code:
CREATE PROCEDURE BackUpData2
AS
BEGIN
SET NOCOUNT ON;
SELECT *
INTO [dbo].[CS_ConsolidationBackUp]
FROM [dbo].[CS_Consolidation]
TRUNCATE TABLE [dbo].[CS_Consolidation]
end
GO
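Note that SELECT ... INTO fails on the second run because the backup table then already exists, while the question asks for data to be appended every day. A minimal sketch of an append-and-truncate variant (BackUpData3 is a hypothetical name; it assumes [dbo].[CS_ConsolidationBackUp] already exists with an identical column list and no IDENTITY column):
CREATE PROCEDURE BackUpData3
AS
BEGIN
SET NOCOUNT ON;
BEGIN TRANSACTION;
-- append today's rows to the existing backup table
INSERT INTO [dbo].[CS_ConsolidationBackUp]
SELECT * FROM [dbo].[CS_Consolidation];
-- then clear the source; TRUNCATE is transactional and rolls back on error
TRUNCATE TABLE [dbo].[CS_Consolidation];
COMMIT TRANSACTION;
END
GO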

Related

Why doesn't this alter after insert statement work?

I have a stored procedure with dynamic SQL that I have embedded as below:
delete from #temp_table
begin tran
set @sql = 'select * into #temp_table from sometable'
exec (@sql)
commit tran
begin
set @sql = 'alter table #temp_table add column1 float'
exec(@sql)
end
update #temp_table
set column1 = column1*100
select *
into Primary_Table
from #temp_table
However, I noticed that all the statements work but the alter does not. When I run the procedure, I get an error message: "Invalid column name 'column1'".
What am I doing wrong here?
EDIT: Realized I didn't mention that the first insert is a dynamic sql as well. Updated it.
Alternate approach tried but throws same error:
delete from #temp_table
begin tran
set @sql = 'select * into #temp_table from sometable'
exec (@sql)
commit tran
alter table #temp_table add column1 float
update #temp_table set column1 = column1*100
Local temporary tables exhibit something like dynamic scope. When you create a local temporary table inside a call to exec, it goes out of scope (and out of existence) on the return from exec.
EXEC (N'create table #x (c int)')
GO
SELECT * FROM #x
Msg 208, Level 16, State 0, Line 4
Invalid object name '#x'.
The select is parsed after the dynamic SQL that creates #x is run. But #x is not there, because it was dropped on exit from exec.
Update
Depending on the situation there are different ways to work around the issue.
Put everything into the same string:
DECLARE @Sql NVARCHAR(MAX) = N'SELECT 1 AS source INTO #table_name;
ALTER TABLE #table_name ADD TARGET float;
UPDATE #table_name SET Target = 100 * source;';
EXEC (@Sql);
Create the table ahead of the dynamic sql that populates it.
CREATE TABLE #table_name (source INT);
EXEC (N'insert into #table_name (source) select 1;');
ALTER TABLE #table_name ADD target FLOAT;
UPDATE #table_name SET target = 100 * source;
In this option, the alter table statement can be removed by adding the additional column to the create table statement. Note also that the alter table and update statements could be in separate invocations of dynamic SQL, if that was beneficial to your context, for example:
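A minimal sketch of that split, reusing the illustrative names from the second workaround above:
CREATE TABLE #table_name (source INT);
EXEC (N'insert into #table_name (source) select 1;');
-- alter and update in separate dynamic SQL batches; the outer-batch temp table stays visible to both
EXEC (N'alter table #table_name add target float;');
EXEC (N'update #table_name set target = 100 * source;');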
1) It should be ALTER TABLE #temp... Not ALTER #temp.
2) Even if #1 weren't an issue, you're adding column1 as a NULLable column with no default value and, in the next statement, setting its value to itself * 100...
NULL * 100 = NULL
3) Why are you using dynamic sql to alter the #temp table? It can just as easily be done with a regular ALTER TABLE script... or, better yet, can be included in the original table definition.
This is because the #temp_table reference in the outer batch is a different temp table than the one created in dynamic SQL. Consider:
use tempdb
drop table if exists sometable
drop table if exists #temp_table
go
create table sometable(id int, a int)
create table #temp_table(id int, b int)
exec( 'select * into #temp_table from sometable; select * from #temp_table;' )
select * from #temp_table
Outputs
id a
----------- -----------
(0 rows affected)
id b
----------- -----------
(0 rows affected)
A temp table created in a nested batch is scoped to the nested batch and automatically dropped afterwards. A "nested batch" is either a dynamic SQL query or a stored procedure. This behavior is explained in the CREATE TABLE documentation, but it only mentions stored procedures. Dynamic SQL behaves the same way.
If you create the temp table in a top level batch, you can access it in dynamic SQL, you just can't create a new temp table in dynamic SQL and see it in the outer batch or in subsequent same-level dynamic SQL. So try to use INSERT INTO instead of SELECT INTO.
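Continuing the demo above, a minimal sketch of that pattern (#results is a hypothetical name; column names match the illustrative sometable):
-- create the temp table in the outer batch so it outlives the dynamic SQL
CREATE TABLE #results (id int, a int);
EXEC (N'insert into #results (id, a) select id, a from sometable;');
SELECT * FROM #results;   -- rows inserted by the nested batch are visible here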

Use trigger to modify column data type length

My program will create a temp table which is dropped after the program has executed. The data type length is 8, but I want to change the length to 15 when I run the program, using the trigger function in SQL Server. I have a few tables that need the length changed. Is there any way to change the length without stating the table name in the trigger function?
Clarification:
I have 100 programs which will create temporary tables with different names. Each temp table will have user_id varchar(8). I want to change the length to 15, but I don't want to open each program's source code to change it. Is there a better way you can suggest?
What you want is essentially possible to achieve using DDL triggers.
CREATE TRIGGER [TRG_TABLES]
ON DATABASE
AFTER
CREATE_TABLE
AS
BEGIN
SET NOCOUNT ON
DECLARE @TABLE_NAME SYSNAME
SELECT
@TABLE_NAME = EVENTDATA().value('(/EVENT_INSTANCE/ObjectName)[1]','SYSNAME')
IF EXISTS(SELECT * FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = @TABLE_NAME
AND COLUMN_NAME = 'TEST')
BEGIN
DECLARE @SQL as NVARCHAR(MAX) ='ALTER TABLE ' + @TABLE_NAME + ' ALTER COLUMN TEST NVARCHAR(200) '
Exec sp_ExecuteSql @SQL
END
END
GO
ENABLE TRIGGER [TRG_TABLES] ON DATABASE
You should be EXTRA careful about SQL injection if you use this approach.
EDIT: This is just a general idea; you should probably figure out under which conditions you should alter the column - for example, if there is a predictable pattern to your table names.
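On the injection caution above: the dynamic ALTER can at least quote its identifiers. A sketch of the fragment that could replace the @SQL lines inside the trigger body (the SchemaName lookup is my own addition, not part of the original answer):
DECLARE @SCHEMA_NAME SYSNAME
SELECT @SCHEMA_NAME = EVENTDATA().value('(/EVENT_INSTANCE/SchemaName)[1]','SYSNAME')
-- QUOTENAME brackets the identifiers so a hostile table name cannot break out of the statement
DECLARE @SQL NVARCHAR(MAX) = 'ALTER TABLE ' + QUOTENAME(@SCHEMA_NAME) + '.' + QUOTENAME(@TABLE_NAME) + ' ALTER COLUMN TEST NVARCHAR(200)'
EXEC sp_executesql @SQL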

How can I create an UPDATE statement from a large XML data type?

I am working with two databases that are not accessible at the same time. One of the standard methods of dealing with this I've seen on here is to create dynamic SQL for loading one from the other.
I created a stored procedure that would generate update statements from an existing database. My issue is what happens when the XML is too large to be held in a VARCHAR(max).
Here is a relevant snippet from my attempt where field2 is actually of an XML data type:
DECLARE @field1Col VARCHAR(50)
DECLARE @field2Col VARCHAR(max)
DECLARE @vsSQL VARCHAR(max)
DECLARE curUpdates CURSOR FOR
-- field 1 is varchar(50), not null
-- field 2 is XML(.), null
SELECT
t.field1
,REPLACE(CAST(t.[field2] AS VARCHAR(max)), '''', '''''')
FROM
myTable t
WHERE
t.criteria = 0
OPEN curUpdates
FETCH NEXT FROM curUpdates INTO @field1Col, @field2Col
WHILE @@FETCH_STATUS = 0
BEGIN
SET @vsSQL = 'UPDATE dbo.myTable SET [field1] = ''' + @field1Col + ''' WHERE [field2] = ''' + @field2Col + ''''
INSERT INTO #tmp ( SQLText ) VALUES ( @vsSQL )
FETCH NEXT FROM curUpdates INTO @field1Col, @field2Col
END
CLOSE curUpdates
DEALLOCATE curUpdates
SET NOCOUNT OFF;
SELECT * FROM #tmp
The issue I have is that even using VARCHAR(max), the XML will sometimes overrun the size. The end product just stops when it reaches a certain number of characters (the max size of a VARCHAR?).
Is there another approach for working with large XML (splitting into chunks, avoid casting, etc.) where I can build a string of update statements from it?
I do not have access to database B. I'd like to (one time run) update
a few tables in database B
The one time run could point to something like this:
CREATE DATABASE MyOneTimeRun;
GO
USE MyOneTimeRun;
GO
SELECT * INTO MyCopy FROM YourDatabase.dbo.YourTable;
GO
BACKUP DATABASE [MyOneTimeRun] TO DISK = N'C:\Path\MyOneTimeRun.bak' WITH NOFORMAT, NOINIT
,NAME = N'MyOneTimeRun-Copy of MyTable'
,SKIP, NOREWIND, NOUNLOAD, STATS = 10
GO
USE master;
GO
EXEC msdb.dbo.sp_delete_database_backuphistory @database_name = N'MyOneTimeRun'
GO
USE [master]
GO
ALTER DATABASE [MyOneTimeRun] SET SINGLE_USER WITH ROLLBACK IMMEDIATE
GO
USE [master]
GO
DROP DATABASE [MyOneTimeRun]
GO
Now you have a BAK-file with the content you need which you can restore on your other server.
There you can use the appropriate scripts to move your data, type-safe and clean, from the copy into your target tables.
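On the other server, the restore could look something like this (a sketch; the paths are placeholders and the logical names shown assume the defaults produced by CREATE DATABASE MyOneTimeRun, so check them with FILELISTONLY first):
RESTORE FILELISTONLY FROM DISK = N'C:\Path\MyOneTimeRun.bak'
GO
RESTORE DATABASE [MyOneTimeRun]
FROM DISK = N'C:\Path\MyOneTimeRun.bak'
WITH MOVE N'MyOneTimeRun' TO N'D:\Data\MyOneTimeRun.mdf',
MOVE N'MyOneTimeRun_log' TO N'D:\Data\MyOneTimeRun_log.ldf',
STATS = 10
GO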

SQL Server : update records in dynamically generated tables using parameters in stored procedure

I have to create a stored procedure where I will pass tableName, columnName, and id as parameters. The task is to select records from the passed table where the passed column has the passed id. If a record is found, update it with some fixed data. Also implement a transaction so that we can roll back in case of any error.
There are hundreds of table in database and each table has different schema that is why I have to pass columnName.
I don't know what the best approach for this is. I am trying to select records into a temp table so that I can manipulate them as per the requirement, but it's not working.
I am using this code:
ALTER PROCEDURE [dbo].[GetRecordsFromTable]
@tblName nvarchar(128),
@keyCol varchar(100),
@key int = 0
AS
BEGIN
SET NOCOUNT ON;
BEGIN TRY
--DROP TABLE #TempTable;
DECLARE @sqlQuery nvarchar(4000);
SET @sqlQuery = 'SELECT * FROM ' + @tblName + ' WHERE ' + @keyCol + ' = 2';
PRINT @sqlQuery;
INSERT INTO #TempTable
EXEC sp_executesql @sqlQuery,
N'@keyCol varchar(100), @key int', @keyCol, @key;
SELECT * FROM #TempTable;
END TRY
BEGIN CATCH
EXECUTE [dbo].[uspPrintError];
END CATCH;
END
I get an error
Invalid object name '#TempTable'
Also not sure if this is the best approach to get data and then update it.
If you absolutely must make that work then I think you'll have to use a global temp table. You'll need to see if it exists before running your dynamic SQL, and clean it up afterwards. With a fixed table name you'll run into problems with other connections. Inside the dynamic SQL you'd add select * into ##temptable from .... Actually, I'm not even sure why you want the temp table in the first place. Can't the dynamic SQL just return the results?
On the surface it seems like a solid idea to have one generic procedure for returning data with a couple of parameters to drive it but, without a lot of explanation, it's just not the way databases are designed to work.
You should create the temp table.
IF OBJECT_ID('tempdb..##TempTable') IS NOT NULL
DROP TABLE ##TempTable
CREATE TABLE ##TempTable (/* column definitions matching the source table */)
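Putting the two suggestions together, a rough sketch of the fragment that could replace the SET/INSERT/EXEC/SELECT lines inside the procedure (it assumes @tblName is a bare, trusted table name, and keeps in mind the caveat above about concurrent connections sharing the fixed global name):
IF OBJECT_ID('tempdb..##TempTable') IS NOT NULL
DROP TABLE ##TempTable;
SET @sqlQuery = N'SELECT * INTO ##TempTable FROM ' + QUOTENAME(@tblName) + N' WHERE ' + QUOTENAME(@keyCol) + N' = @key';
EXEC sp_executesql @sqlQuery, N'@key int', @key;
SELECT * FROM ##TempTable;   -- the global temp table survives the dynamic SQL in the same session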

Generate script for triggers only using script wizard

I have SQL Server 2008 R2. I have around 150 tables in a database and for each table I have recently created triggers. It is working fine in my local environment.
Now I want to deploy them on my live environment. The question is I want to deploy only the triggers.
I tried the Generate Script wizard but it is creating script with table schema along with triggers, NOT triggers only.
Is there any way to generate a drop-and-create script for just the triggers?
Forget the wizard. I think you have to get your hands dirty with code. The script below prints all trigger code and stores it in a table. Just copy the script's print output, or query #triggerFullText before the final DROP statements if you want the table output.
USE YourDatabaseName
GO
SET NOCOUNT ON;
CREATE TABLE #triggerFullText ([TriggerName] VARCHAR(500), [Text] VARCHAR(MAX))
CREATE TABLE #triggerLines ([Text] VARCHAR(MAX))
DECLARE @triggerName VARCHAR(500)
DECLARE @fullText VARCHAR(MAX)
SELECT @triggerName = MIN(name)
FROM sys.triggers
WHILE @triggerName IS NOT NULL
BEGIN
INSERT INTO #triggerLines
EXEC sp_helptext @triggerName
--sp_helptext gives us one row per trigger line
--here we join lines into one variable
SELECT @fullText = ISNULL(@fullText, '') + CHAR(10) + [TEXT]
FROM #triggerLines
--adding "GO" for ease of copy paste execution
SET @fullText = @fullText + CHAR(10) + 'GO' + CHAR(10)
PRINT @fullText
--accumulating result for future manipulations
INSERT INTO #triggerFullText([TriggerName], [Text])
VALUES(@triggerName, @fullText)
--iterating over next trigger
SELECT @triggerName = MIN(name)
FROM sys.triggers
WHERE name > @triggerName
SET @fullText = NULL
TRUNCATE TABLE #triggerLines
END
DROP TABLE #triggerFullText
DROP TABLE #triggerLines
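As an aside (not part of the original answer), the same text can usually be pulled set-based from the catalog views, without the loop or sp_helptext; encrypted triggers return NULL here:
SELECT name AS TriggerName,
OBJECT_DEFINITION(object_id) + CHAR(10) + 'GO' AS [Text]
FROM sys.triggers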
In the Generate Scripts wizard, in the second step ("Set Scripting Options"), press the Advanced button => Table/View Options => set Script Triggers to True.
Check also this link or this one. If you want only triggers, just select one table to proceed to the next step.
