I am looking for a replacement for the OUTPUT/RETURNING clause in Snowflake.
Test scenario:
I have a metadata table, in which I store all tables and columns that should be updated:

tblToUpdate | FieldToUpdate
----------- | -------------
tblA        | name
tblA        | lastname
tblB        | middlename
tblC        | address
tblC        | phone
tblC        | zipcode
I dynamically generate the update statements based on this table, and they look like:
update tblA set name = '#tst', lastname = '#tst';
update tblB set middlename = '#tst';
update tblC set address = '#tst', phone ='#tst', zipcode = '#tst';
Next step is to create a log table, to store the names of the updated tables + ids of updated rows.
How can I do this without creating a STREAM for each table to be updated? (The metadata table can contain from 1 to n tables, and its rows can change over time.) I need to find a way to create the log table so I can keep track of all changed tables and their rows.
Thanks!
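One pattern that avoids streams entirely is to have the same generator that builds each UPDATE also emit an INSERT into the log table immediately before it, inside one transaction. This is only a sketch: the log table name and columns are assumptions, and it assumes each target table exposes an id column.

```sql
-- Hypothetical log table (name and columns are assumptions):
CREATE TABLE IF NOT EXISTS upd_log (
    tbl_name   VARCHAR,
    row_id     NUMBER,
    updated_at TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
);

-- Generated per metadata entry, right before its UPDATE,
-- inside the same transaction:
BEGIN TRANSACTION;

INSERT INTO upd_log (tbl_name, row_id)
SELECT 'tblA', id FROM tblA;   -- reuse the UPDATE's WHERE clause here, if any

UPDATE tblA SET name = '#tst', lastname = '#tst';

COMMIT;
```

Because the INSERT and the UPDATE are generated from the same metadata row, adding or removing tables in the metadata table requires no extra objects per table.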
I'm trying to insert a new row into a table which is an exact copy of another row except for the identifier. Previously I didn't have this issue because there was an ID column which wasn't filled automatically. So I could just copy a row like this:
INSERT INTO table1 SELECT * FROM table1 WHERE Id = 5
And then manually change the ID like this
WITH tbl AS (SELECT ROW_NUMBER() OVER (PARTITION BY Id ORDER BY Id) AS RNr, Id
             FROM table1 WHERE Id = 5)
UPDATE tbl SET Id = (SELECT MAX(Id) FROM table1) + 1
WHERE RNr = 2
Recently the Id column was changed so that it is no longer filled manually (which I actually like better). But this leads to the error that I obviously can't fill that column while IDENTITY_INSERT is OFF. Unfortunately, I don't have the right to use SET IDENTITY_INSERT IdentityTable ON/OFF before and after the statement, which I would also like to avoid anyway.
I'm looking for a way to insert the whole row excluding the Id column without naming each other column in the INSERT and SELECT statement (which are quite a lot).
In the code below, the maximum value of the ID gets increased by one, so your integrity is not violated.
insert into table1
select a.Top_ID, Column2, Column3, ColumnX
from table1 t1
outer apply (select max(id_column) + 1 as Top_ID from table1) as a
where t1.Id = 1
Okay, I found a way by using a temporary table:
SELECT * INTO #TmpTbl
FROM table1 WHERE Id = 5;
ALTER TABLE #TmpTbl
DROP COLUMN Id;
INSERT INTO table1 SELECT * FROM #TmpTbl;
DROP TABLE #TmpTbl
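If temp-table rights are also restricted, another option is to build the column list from the catalog views and run the insert dynamically. This is a sketch, not tested against your schema: it assumes SQL Server 2017+ (for STRING_AGG) and that table1 has exactly one identity column.

```sql
DECLARE @cols nvarchar(max), @sql nvarchar(max);

-- Every column of table1 except the identity column, comma-separated.
SELECT @cols = STRING_AGG(QUOTENAME(name), ', ')
FROM sys.columns
WHERE object_id = OBJECT_ID('dbo.table1')
  AND is_identity = 0;

SET @sql = N'INSERT INTO dbo.table1 (' + @cols + N') '
         + N'SELECT ' + @cols + N' FROM dbo.table1 WHERE Id = 5;';

EXEC sp_executesql @sql;
```

The upside over the temp-table approach is that it keeps working when columns are added to table1; the downside is the usual care needed with dynamic SQL.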
For example, right now I have three queries: an insert and a select statement for one table:
INSERT INTO Table_A (col_1)
VALUES (val_1)
and
SELECT SUM(col_1)
FROM Table_A
WHERE Table_A_ID = id
After each insert I need to recalculate the sum and update the corresponding column of Table_B.
That sum is stored in a variable, 'sum' (I'm using Dapper), and then passed to another function, along with an 'id' variable, which updates a column of Table_B.
UPDATE Table_B
SET col_1 = @sum
WHERE Table_B_ID = @id
I'd like to combine the last two queries to effectively achieve something like:
UPDATE Table_B
SET col_1 = (SELECT Sum(col_1) FROM Table_A WHERE Table_A_ID = id)
WHERE Table_B_ID = id
Is this possible?
Is this what you are looking for?
UPDATE Table_B
SET col_1 = a.col_1
FROM (SELECT SUM(col_1) AS col_1 FROM Table_A WHERE Table_A_ID = @id) a
WHERE Table_B_ID = @id
Is it really necessary to store a calculated column?
If it is, you could consider creating a trigger on Table_A that is executed each time a record is inserted into it. The trigger could then update Table_B with the calculated sum.
(Keep in mind that you'll have to adjust the calculated sum in Table_B when a record in Table_A is deleted or updated as well. The trigger should thus be an AFTER INSERT, UPDATE, DELETE trigger.)
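A minimal sketch of such a trigger, assuming Table_A carries a Table_A_ID that matches Table_B's Table_B_ID (the trigger name and join columns are illustrative, not taken from the question):

```sql
CREATE TRIGGER trg_TableA_SyncSum
ON Table_A
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- Recalculate the sum for every id touched by this statement,
    -- whether the rows came from inserted or deleted.
    UPDATE b
    SET b.col_1 = ISNULL(s.total, 0)
    FROM Table_B b
    JOIN (SELECT Table_A_ID FROM inserted
          UNION
          SELECT Table_A_ID FROM deleted) ids
      ON ids.Table_A_ID = b.Table_B_ID
    OUTER APPLY (SELECT SUM(col_1) AS total
                 FROM Table_A a
                 WHERE a.Table_A_ID = b.Table_B_ID) s;
END;
```

Using UNION over both inserted and deleted covers all three statement types with one body, and the ISNULL handles the case where the last matching row was deleted.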
Here's what I am trying to do.
This is the select statement
select Id
from tblUsersPokemons
where UserId = 1 and PokemonPlace = 'bag'
Now I want to insert those returned Ids into another table like this:
foreach all returned Ids
insert into tblNpcBattleUsersPokemons values (Id, 1)
How can I do that?
Like this:
insert into tblNpcBattleUsersPokemons
select Id, 1 from tblUsersPokemons where UserId=1 and PokemonPlace='bag'
This can be done in a single SQL call:
insert into tblNpcBattleUsersPokemons (id, [whatever the name of the other column is])
select Id, 1 from tblUsersPokemons where UserId=1 and PokemonPlace='bag'
I've used the longhand here because it does not make any assumption about the ordering of columns in the destination table, which can change over time and invalidate your insert statement.
You can insert the set retrieved by a SELECT into another existing table using the INSERT...SELECT syntax.
For example:
INSERT INTO tblNpcBattleUsersPokemons (Id, Val) -- not sure what the second column name was
SELECT Id, 1 FROM tblUsersPokemons WHERE UserId = 1 AND PokemonPlace = 'bag';
ALTER TRIGGER t1
ON dbo.Customers
FOR INSERT
AS
BEGIN TRANSACTION
/* variables */
DECLARE
@customerid bigint,
@maxid bigint
SELECT @customerid = id FROM inserted
SET IDENTITY_INSERT dbo.new_table ON
SELECT @maxid = MAX(ID) FROM new_table
INSERT INTO new_table (ID, ParentID, Foo, Bar, Buzz)
SELECT ID+@maxid, ParentID+@maxid, Foo, Bar, Buzz FROM initial_table
SET IDENTITY_INSERT dbo.new_table OFF
/* execute */
COMMIT TRANSACTION
GO
fails with:
SQL Server Subquery returned more than 1 value. This is not permitted
when the subquery follows =, !=, <, <= , >, >= or when the subquery is
used as an expression
How to fix it?
What I am trying to do is
insert id and parentid, each increased by @maxid,
from initial_table
into new_table.
Thanks!
new_table
  id (bigint)
  parentid (bigint, linked to id)
  foo | bar | buzz (the others are nvarchar, not really important)
initial_table
  id (bigint)
  parentid (bigint, linked to id)
  foo | bar | buzz (the others are nvarchar, not really important)
You are battling against a few errors I suspect.
1.
You are inserting values that violate a unique constraint in new_table.
Avoid the existence error by joining against the table you are inserting into. Adjust the join condition to match your table's constraint:
insert into new_table (ID, ParentID, Foo, Bar, Buzz)
select i.ID+@maxid, i.ParentID+@maxid, i.Foo, i.Bar, i.Buzz
from initial_table i
left join new_table n on
     i.ID+@maxid = n.ID or
     i.ParentID+@maxid = n.ParentID
where n.ID is null -- make sure it's not already there
2.
Somewhere, a subquery has returned multiple rows where you expect one.
The subquery error is either in the code that inserts into dbo.Customer (triggering t1), or perhaps in a trigger defined on new_table. I do not see anything in the posted code that would throw the subquery exception.
Triggers (aka landmines) that insert into tables which themselves have triggers defined on them are a recipe for pain. If possible, try to refactor some of this logic out of triggers and into code you can follow logically.
First, you have to assume there will be more than one record in inserted or deleted. You should never assign a value from the inserted or deleted table to a scalar variable in a SQL Server trigger. It will cause a problem whenever the insert includes more than one record, and sooner or later it will.
Next, you should never consider setting IDENTITY_INSERT on in a trigger. What were you thinking? If you have an identity field, then use it; don't also try to manually create a value.
Next, the subquery issue is apparently associated with another trigger where you are also assuming only one record at a time will be processed. I suspect you will need to examine every trigger in your database and fix this basic problem.
Now when you run this part of the code:
INSERT INTO new_table (ID, ParentID, Foo, Bar, Buzz)
SELECT ID+@maxid, ParentID+@maxid, Foo, Bar, Buzz FROM initial_table
You are trying to insert all records in the table, not just the ones in inserted. Since your trigger on the other table is incorrectly written, you are hitting an error which is actually hiding the error you will get when you try to insert 2000 records with the same PK into the new table (or worse, if you don't have a PK, it will happily insert them all every time you insert one record).
You have a trigger containing the statement:
SELECT #customerid = id FROM inserted
The inserted table contains a row for each row that was inserted (or updated, for UPDATE triggers). A statement was executed that inserted more than one row, the trigger fired, and your single-row assumption was exposed.
Recode the trigger to operate on rowsets, not a single row.
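For instance, a set-based version of the trigger body might look something like this. It is only a sketch: it assumes dbo.Customers actually has the ID, ParentID, Foo, Bar, and Buzz columns referenced in the original code, and that new_table.ID is a plain bigint rather than an identity column, so no IDENTITY_INSERT is needed.

```sql
ALTER TRIGGER t1
ON dbo.Customers
FOR INSERT
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @maxid bigint;
    SELECT @maxid = ISNULL(MAX(ID), 0) FROM new_table;

    -- Operate on the whole inserted rowset: every row affected by
    -- the triggering statement is copied, however many there are.
    INSERT INTO new_table (ID, ParentID, Foo, Bar, Buzz)
    SELECT i.ID + @maxid, i.ParentID + @maxid, i.Foo, i.Bar, i.Buzz
    FROM inserted i;
END;
```

Note there is no scalar assignment from inserted anywhere, so the trigger behaves the same for a 1-row insert and a 2000-row insert.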
While using a subquery in any kind of SELECT, try to tune your query so that the subquery returns only one value, not multiple.
If multiple rows are needed, restructure the query in such a way that the table becomes part of the main query.
I am giving an example in SQL:
SELECT col1, (SELECT col2 FROM table2 WHERE table2.col3 = table1.col4) FROM table1;
If the subquery returns multiple rows, the query fails, so rewrite it as:
SELECT col1, col2 FROM table1, table2 WHERE table2.col3 = table1.col4;
I hope you get the point.
You shouldn't SELECT it, you should SET it:
SET @maxid = (SELECT MAX(ID) FROM another_table);