Choosing when to insert data into Identity column - sql-server

I am trying to do a bulk insert from a SELECT statement. Sometimes I want to carry the identity value over from Table2, but when there is no value in Table2 I want the identity in the target table to be generated automatically. How would this be accomplished?
INSERT INTO Table (ID, Name)
SELECT
CASE WHEN Col1 IS NOT NULL THEN Col1 ELSE @@IDENTITY END ID,
Col2 Name
FROM Table2
Is this possible, or do I have to do two separate bulk insert processes?

Yes, you need two bulk inserts:
SET IDENTITY_INSERT Table ON
INSERT INTO Table (ID,Name)
SELECT
Col1,
Col2 Name
FROM Table2
WHERE Col1 IS NOT NULL
SET IDENTITY_INSERT Table OFF
INSERT INTO Table (Name)
SELECT
Col2 Name
FROM Table2
WHERE Col1 IS NULL

Related

Cannot insert into table with identity column from linked server

I have the following table:
use db_name;
create table dbo.tbl_name(
id_col int identity(1,1),
col1 varchar(20),
col2 varchar(50)
);
into which I can insert values without issue:
insert into dbo.tbl_name
select
col1,
col2
from dbo.other_tbl_name;
The issue arises when I attempt to insert into the table from a linked server:
insert into [server_name].db_name.dbo.tbl_name
select
col1,
col2
from dbo.other_tbl_name;
This gives the following error message:
Column name or number of supplied values does not match table
definition
If I try to set identity_insert on so I can include the identity column in the insert I get the error message:
Cannot find the object
"server_name.db_name.dbo.tbl_name" because
it does not exist or you do not have permissions.
which is somewhat misleading, given that I can select from the table and even issue update statements. I guess I don't have permission to set identity_insert on from the linked server, though I can do it on the server itself directly with the same credentials.
So how can I insert into this table from the linked server?
Explicitly define the columns in your INSERT statement. For a local insert, SQL Server leaves the identity column out of the implicit column list, but over a linked server the provider expects a value for every column, hence the mismatch:
insert into [server_name].db_name.dbo.tbl_name (col1,col2)
select
col1,
col2
from dbo.other_tbl_name;
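If you really do need to preserve the identity values across the linked server, one workaround (a sketch only, assuming the linked server has the "RPC Out" option enabled and reusing the names from the question) is to run the whole batch in the remote session with EXEC ... AT, so that SET IDENTITY_INSERT applies there:

```sql
-- The string runs on the remote server, so IDENTITY_INSERT is set
-- in that session. The ? markers are parameter placeholders.
EXEC ('
    SET IDENTITY_INSERT db_name.dbo.tbl_name ON;
    INSERT INTO db_name.dbo.tbl_name (id_col, col1, col2)
    VALUES (?, ?, ?);
    SET IDENTITY_INSERT db_name.dbo.tbl_name OFF;
', 42, 'a', 'b') AT [server_name];
```

The values 42, 'a', 'b' are placeholders; in practice you would loop or build the statement from your source rows.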

I need a function which will insert rows in multiple tables

postgresql
I am inserting a row into table1, and this row contains one field that is also a field in table2. I would like to create a function that inserts a row into table2 whenever I insert a row into table1.
So example:
I have two tables
table1
....
....
....
table2
....
....
....
I insert into table1:
INSERT INTO table1 VALUES ('Sam','USA');
and as a result I want to have:
table1
Sam Usa
...
...
...
table2
Usa ...
...
...
So what function and trigger should I write? Also, if there is already a row in table2 with the field USA, the function should not insert another row with USA.
Sorry if this explanation is too complicated.
Like @Jorge Campos mentioned in the comments, duplicating data is usually a bad idea.
But if you really have a scenario where you need to do this, create a trigger on the source table and insert into the destination table with an existence check. Here is an example in SQL Server:
CREATE TABLE Tbl1(
Id INT NOT NULL IDENTITY PRIMARY KEY,
Name NVARCHAR(100) NOT NULL
)
GO
CREATE TABLE Tbl2(
Id INT NOT NULL IDENTITY PRIMARY KEY,
Name NVARCHAR(100) NOT NULL
)
GO
CREATE TRIGGER Sync ON Tbl1 AFTER INSERT AS
INSERT INTO Tbl2 (Name)
SELECT src.Name FROM inserted src
LEFT JOIN Tbl2 dst ON src.Name = dst.Name
WHERE dst.Id IS NULL
GO
INSERT INTO Tbl1 (Name) VALUES ('STR')
SELECT * FROM Tbl2
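Since the question is tagged postgresql, here is a sketch of the same idea as a PL/pgSQL trigger function (the table and column names are assumed to mirror the SQL Server example above; ON CONFLICT requires PostgreSQL 9.5+, and EXECUTE FUNCTION requires version 11+, older versions use EXECUTE PROCEDURE):

```sql
CREATE TABLE tbl1 (id serial PRIMARY KEY, name text NOT NULL);
CREATE TABLE tbl2 (id serial PRIMARY KEY, name text NOT NULL UNIQUE);

CREATE FUNCTION sync_tbl2() RETURNS trigger AS $$
BEGIN
    -- Insert the new name into tbl2 unless it is already there;
    -- the UNIQUE constraint makes the duplicate check race-safe.
    INSERT INTO tbl2 (name)
    VALUES (NEW.name)
    ON CONFLICT (name) DO NOTHING;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER sync AFTER INSERT ON tbl1
FOR EACH ROW EXECUTE FUNCTION sync_tbl2();
```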

How to insert rows in another table based on insert in first table

If any insert happens in table A, I need to insert the last inserted row into table B.
How can I do it using @@ROWCOUNT?
I am trying the code below.
create table table1
(
id int identity(1,1),
column1 nvarchar
)
create table table2
(
id int identity(1,1),
column1 nvarchar
)
Create procedure insert1
@column1 nvarchar
AS
Declare @t int, @column2 nvarchar
insert into table1 values(@column1)
select * from table1
set @t = (Select @@IDENTITY from table1)
Insert into table2 values (@t)
Please let me know how I can do the same with a trigger.
You could write a trigger something like this:
CREATE TRIGGER trgTableAInsert
ON dbo.Table1
FOR INSERT
AS
INSERT INTO dbo.Table2(Column1)
SELECT Column1
FROM Inserted
Points to note:
a trigger is called once per statement, e.g. if your INSERT statement inserts 10 rows, the trigger is called once and Inserted contains those 10 newly inserted rows (do you want to insert all 10 of those into TableB?)
I would recommend always using the schema prefix on tables (the dbo. part)
I would recommend always explicitly specifying the list of columns, on both INSERT and SELECT statements - don't omit them! (or you might run into messy, hard-to-debug issues when one of the tables suddenly changes)
MERGE INTO Table1 AS t1
USING MyTable AS src ON 1 = 0 -- always generates "not matched by target"
WHEN NOT MATCHED BY TARGET THEN
-- INSERT into Table1:
INSERT (A, B, C) VALUES (src.A, src.B, src.C)
-- ... and INSERT into Table2 (OUTPUT in a MERGE may reference source columns):
OUTPUT inserted.ID, src.D, src.E, src.F
INTO Table2 (ID, D, E, F);

Maintaining Foreign Key references while migrating data using t-sql

I have 4 tables.
1) SourceTable - a source table with some data in it
2) DestinationTable - a destination table with the same schema as the source table; both tables contain similar kinds of data
3) FKSourceTable - a totally different table that has a FK reference to SourceTable
4) FKDestinationTable - a totally different table that has a FK reference to DestinationTable. Has the same schema as FKSourceTable
Now I'm given the task of migrating some of the data from SourceTable to DestinationTable and from FKSourceTable to FKDestinationTable.
However I cannot migrate Primary Keys as the DestinationTable may have its own records with the same PK and that might create a PK violation.
DestinationTable has an auto identity column for the PK, and when I do a bulk insert I don't specify the PK column, so the identity will do its job.
This means the new records in DestinationTable will have brand new IDs.
The problem I'm having is, how do I maintain the FK reference when migrating FKSourceTable to FKDestinationTable? When I do a bulk insert to DestinationTable as follows, I lose track of the Identities:
INSERT INTO DestinationTable
(Col1, Col2)
SELECT st.Col1, st.Col2
FROM SourceTable st
(DestinationTable has 3 columns: Id, Col1, Col2)
The challenge is that I cannot use SSIS or any other ETL solution. I need to be able to do this with a simple SQL Script.
Does anyone have any ideas to tackle this? I've tried using OUTPUT INTO etc. but I haven't figured out a way to keep a reference between the original Id and the new Id
Any help is greatly appreciated
Thank you
Nandun.
This is probably not the most optimal solution, but it should get the job done.
The idea is to enable identity insert and generate the IDs yourself based on what is already in the table.
It iterates through the source data and inserts it into the destination tables one row at a time.
Please review this code thoroughly before executing, because I didn't test it myself:
declare @col1 varchar(20)
declare @col2 varchar(20)
declare @col3 varchar(20)
declare @new_id int

set identity_insert DestinationTable on

declare source_data cursor for
select col1, col2, col3
from SourceTable

open source_data
fetch next from source_data
into @col1, @col2, @col3

while @@FETCH_STATUS = 0
begin
set @new_id = (select MAX(ID) + 1 from DestinationTable)
insert into DestinationTable (ID, col1, col2, col3) values (@new_id, @col1, @col2, @col3)
-- do something similar for FKDestinationTable (the column list here is illustrative)
insert into FKDestinationTable (ID, col1, col2, col3) values (@new_id, @col1, @col2, @col3)
fetch next from source_data
into @col1, @col2, @col3
end

close source_data
deallocate source_data

set identity_insert DestinationTable off
Insert the data into the destination table using IDENT_CURRENT of the destination table:
DECLARE @ID INT = IDENT_CURRENT('DestinationTable')
INSERT INTO DestinationTable
(ID, Col1, Col2)
SELECT @ID + ROW_NUMBER() OVER (ORDER BY st.ID), st.Col1, st.Col2
FROM SourceTable st
WHERE -----
Now you know how each ID in the source table maps to an ID in the destination table:
SELECT @ID + ROW_NUMBER() OVER (ORDER BY st.ID) [NEW_ID], st.ID [OLD_ID]
FROM SourceTable st
WHERE -----
Note: Make sure this is done in a transaction and the transaction type depends on the usage of these tables
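Since the question mentions trying OUTPUT INTO: a MERGE with an always-false match condition lets the OUTPUT clause reference both the source ID and the newly generated identity, which a plain INSERT ... OUTPUT cannot do. A sketch, reusing the table names from the question (the @IdMap variable and the FK column names RefId/SomeCol are made up for illustration):

```sql
DECLARE @IdMap TABLE (OldId INT, NewId INT);

MERGE INTO DestinationTable AS dst
USING SourceTable AS src ON 1 = 0   -- never matches, so every source row is inserted
WHEN NOT MATCHED BY TARGET THEN
    INSERT (Col1, Col2) VALUES (src.Col1, src.Col2)
OUTPUT src.Id, inserted.Id INTO @IdMap (OldId, NewId);

-- Re-point the FK rows at the new IDs via the mapping.
INSERT INTO FKDestinationTable (RefId, SomeCol)
SELECT m.NewId, fk.SomeCol
FROM FKSourceTable fk
JOIN @IdMap m ON m.OldId = fk.RefId;
```

This avoids IDENTITY_INSERT entirely: the destination keeps generating its own IDs, and the mapping table bridges old to new.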

Set A Field the same as ID (IDENTITY) in the insert

I have a Code (int) column in my table, and the ID is set to identity. How can I set a default value for Code so that it is filled with the same value as ID?
You could use an after insert trigger:
create table TestTable (id int identity, col1 int)
go
create trigger TestTrigger on TestTable after insert
as begin
update TestTable
set col1 = id
where col1 is null
and id in (select id from inserted)
end
go
Test code:
insert TestTable default values
insert TestTable (col1) values (666)
insert TestTable default values
select * from TestTable
In general, I try to steer clear of triggers. In the long run, using a stored procedure for inserts is much more maintainable:
create procedure dbo.InsertTestRow(
@col1 int)
as
insert TestTable (col1) values (@col1)
if @col1 is null
begin
update TestTable
set col1 = id
where id = SCOPE_IDENTITY()
end
If it always has the same value, why don't you just drop that field? Otherwise it can be maintained with triggers (an INSTEAD OF INSERT one, since SQL Server has no BEFORE triggers).
I'm looking for something in the default value! If it is null it should be filled with the same value as id, but if it is provided with some value, it should keep that value.
You could solve the issue by using coalesce in your queries instead.
create table T (ID int identity, ID2 int)
insert into T values (default)
insert into T values (null)
insert into T values (78)
select
ID,
coalesce(ID2, ID) as ID2
from T
Result
ID ID2
-- ---
1 1
2 2
3 78
Assuming your table's ID is an Identity column, you could consider using a constraint:
ALTER TABLE MyTable
ADD CONSTRAINT MyTableCodeDefault
DEFAULT IDENT_CURRENT('MyTable') FOR Code
This works for these use cases:
INSERT INTO MyTable DEFAULT VALUES
INSERT INTO MyTable ({columns NOT including 'Code'})
VALUES ({value list matching insert columns})
INSERT INTO MyTable (Code) VALUES (666)
INSERT INTO MyTable (Code) SELECT 8 UNION SELECT 13 UNION SELECT 21
But it does not work for bulk inserts:
INSERT INTO MyTable ({columns NOT including 'Code'})
SELECT {value list matching insert columns}
UNION
SELECT {value list matching insert columns}
UNION
SELECT {value list matching insert columns}
This restriction may seem onerous, but in my practical experience, it's rarely a problem. Most of the use cases I've encountered that need a default value involve user/UI 'convenience': don't force the user to pick a value if they don't want to.
OTOH, rarely do I encounter bulk insert situations where it's impractical to specify the value for the columns you're targeting.
You could use computed column, like this:
if object_id('TempTable') is not null drop table TempTable
create table TempTable (Id int identity(1,1), Code as Id)
insert into TempTable
default values
insert into TempTable
default values
insert into TempTable
default values
select * from TempTable
Of course, if you have other columns, then you don't need default values:
if object_id('TempTable') is not null drop table TempTable
create table TempTable (Id int identity(1,1), Code as Id, SomethingElse int)
insert into TempTable (SomethingElse)
select 10 union all
select 11 union all
select 12
select * from TempTable
But, like zerkms said - why do you need two columns that are same?
If the field is an Identity field in SQL Server, the database engine will take care of its value. What we normally do is to read the record back (after inserting) to get to the generated Id.
EDIT: It sounds like you are trying to "override" the identity? If so, before you insert, run:
SET IDENTITY_INSERT [tableName] ON
You'll have to be careful not to insert a value that already exists. This can get tricky, though. So maybe consider removing the identity property altogether, and managing the default values yourself?
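On SQL Server 2012 and later, one way to "manage the values yourself" while keeping auto-generated defaults is to replace the identity with a SEQUENCE used as the default for both columns; per the documentation, multiple NEXT VALUE FOR references to the same sequence within one row return the same value. A sketch (the object names here are made up):

```sql
CREATE SEQUENCE dbo.MySeq AS INT START WITH 1;

CREATE TABLE dbo.MyTable (
    ID   INT NOT NULL DEFAULT (NEXT VALUE FOR dbo.MySeq) PRIMARY KEY,
    Code INT NOT NULL DEFAULT (NEXT VALUE FOR dbo.MySeq)
);

-- When both defaults fire, ID and Code get the same sequence value;
-- an explicit Code overrides the default.
INSERT INTO dbo.MyTable DEFAULT VALUES;
INSERT INTO dbo.MyTable (Code) VALUES (666);
```

Unlike IDENTITY, a sequence also allows explicit inserts into ID without any SET IDENTITY_INSERT juggling.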
