When I run multiple INSERT queries together against a SQL Server database from Mule and the second insert fails, the row is not inserted, but the failure never shows up in the flow or the logs.
We use variables to collect the SQL statements that insert into a header table and a detail table. Last week I noticed that in some cases the header record was there but the detail was missing, and there was nothing in the logs about it.
After some investigation, it appears that Mule takes the result of the first SQL insert as the return code, regardless of whether the subsequent inserts succeeded or not.
I've tried changing this to a BULK UPDATE, but I still get the same result.
Edit: code included for a sample insert. There are 4 insert statements; 3 will succeed, 1 will fail, but the batch is simply reported back as successful:
insert into highjump.t_import_order(status,idoc_number,datetime_created,datetime_processed,error_message,wh_id,order_number,order_type,order_subtype,is_vas,is_shrinkwrap,is_mhe_packhold,is_consolidation,is_nonmhe_packhold,is_full_case,ship_to_account,ship_to_name,ship_to_address1,ship_to_address2,ship_to_address3,ship_to_city,ship_to_state,ship_to_zip,ship_to_country,sold_to_account,sold_to_name,sold_to_address1,sold_to_address2,sold_to_address3,sold_to_city,sold_to_state,sold_to_zip,telephone_number,sold_to_country,stock_pool,discount,box_type,service_level,telephone_number_alt,dest_type,carrier_code,route_code,inv_cat,cust_order_date,expected_ship_date,expected_delivery_date,dsv_tracking_number,postage_cost,carton_contents_type,unit_total,total_before_discount,total_after_discount,carton_cubing_indicator,req_proof_of_delivery,payment_type,is_cms,carrier_override_type,sales_org,pack_note_preference,shipper_order_id,master_order_number,currency_code,store_code,order_method,dsv_reference,email_address,ship_complete_flag,replen_type,carton_content_flags,partner_profile) values
(N'Z',N'0000000629673252','2019-04-12 09:57:38','2019-04-12 09:57:38',null,N'WST',N'6412210697',N'MCR',N'STD EU',0,0,0,0,0,0,N'MCRSHPTODE',N'Dave Smith',
N'888415936',N'PACKSTATION 432',null,N'Koettgenstr. 8',null,N'13629',N'DE',N'MCRSLDTODE',N'MCR SOLD TO DE',N'High St.',null,null,N'Street',null,N'BA330YA',null,
N'GB',N'MC01',0,N'BAG',N'10',null,N'RE','',N'01',N'W','2019-03-29 11:38:13','2019-03-29 11:38:13','2019-03-29 11:38:13',null,0,N'001',2,null,null,'91',1,
N'MCR CON - UK Orders',1,'1',null,N'N',null,N'623611121','GBP',null,null,null,N'Smith@arcor.com',null,'R',N'F', N'WWMULESFTH');
insert into highjump.t_import_order_cms
(order_id,delivery_from_date,delivery_to_date,pin_number,cms_location,cms_delivery_endpoint,cms_comm_preference,cms_dont_despatch_before,cms_market,cms_brand,is_gift,gift_message,loyalty_number,cms_dest_type,cms_time_delivery,cms_day_delivery,cms_customer_type,carrier_service_name,special_instructions) values ((select top(1) order_id from highjump.t_import_order where order_number='6412210697'),'2019-04-03','2019-04-03',null,N'432',N'PACKSTATIONPACKSTATION',null,null,null,N'CLA',null,null,null,N'PUDO',null,null,null,null,null);
insert into highjump.t_import_order_detail
(order_id,line_number,item_number,order_quantity,customer_item_number,ratio_pack_group,is_ratio_pack,ratio_pack_qty,uom,retail_price,freight_class,sales_order_number,customer_order_number,dsv_price_discount,customer_item_colour,price_paid,currency_code,customer_item_size)
values ((select top(1) order_id from highjump.t_import_order where order_number='6412210697'),00010,'261392464080',1.000,null,null,null,null,'U','0.0',null,N'623611121000010',N'623611121',null,null,99.95,null,null);
insert into highjump.t_import_order_detail (order_id,line_number,item_number,order_quantity,customer_item_number,ratio_pack_group,is_ratio_pack,ratio_pack_qty,uom,retail_price,freight_class,sales_order_number,customer_order_number,dsv_price_discount,customer_item_colour,price_paid,currency_code,customer_item_size)
values ((select top(1) order_id from highjump.t_import_order where order_number='6412210697'),00020,'261394324080',1.000,null,null,null,null,'U','0.0',null,N'623611121000020',N'623611121',null,null,89.95,null,null);
Structurally, these SQL queries seem to be fine. It is unclear to me why and how any of these queries would fail (or would not insert any data). It should just work fine, as far as I can see.
When you execute these queries in SQL Server Management Studio, each of them should return 1:
select count(*) from highjump.t_import_order where order_number = '6412210697';
select count(*) from highjump.t_import_order_cms where order_id = (select top (1) order_id from highjump.t_import_order where order_number = '6412210697');
select count(*) from highjump.t_import_order_detail where line_number = 10 and order_id = (select top (1) order_id from highjump.t_import_order where order_number = '6412210697');
select count(*) from highjump.t_import_order_detail where line_number = 20 and order_id = (select top (1) order_id from highjump.t_import_order where order_number = '6412210697');
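That said, one way a statement in the middle of a batch can fail without the batch as a whole reporting failure is a constraint violation. A small, self-contained illustration (the temp table and CHECK constraint are made up, not the real schema):

CREATE TABLE #demo (id int, val int CHECK (val > 0));

INSERT INTO #demo VALUES (1, 10);    -- succeeds: "1 row affected"
INSERT INTO #demo VALUES (2, -5);    -- violates the CHECK constraint and fails...
INSERT INTO #demo VALUES (3, 20);    -- ...but the batch carries on and this succeeds

SELECT * FROM #demo;                 -- rows 1 and 3 are there, row 2 is missing
DROP TABLE #demo;

With the default settings (no transaction, XACT_ABORT OFF), the failing statement is rolled back on its own, the remaining statements still run, and the first statement still reports one row affected. SSMS shows the error message, but an application that only checks the first statement's result never sees it, which matches the behaviour described in the question.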
Use a transaction when executing multiple insert queries. If one of the queries fails, the whole batch can be rolled back.
BEGIN TRY
    BEGIN TRANSACTION
        -- your multiple insert/delete/update queries go here
    COMMIT
END TRY
BEGIN CATCH
    ROLLBACK
END CATCH
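Applied to the statements from the question, a minimal sketch could look like the following (the INSERT bodies are abbreviated here; SET XACT_ABORT ON makes SQL Server abort and roll back the whole transaction on most run-time errors, and on SQL Server 2012 or later THROW re-raises the error so the caller, e.g. the Mule flow, actually sees the failure):

SET XACT_ABORT ON;
BEGIN TRY
    BEGIN TRANSACTION;

    -- the four INSERT statements from the question go here, unchanged:
    -- INSERT INTO highjump.t_import_order (...) VALUES (...);
    -- INSERT INTO highjump.t_import_order_cms (...) VALUES (...);
    -- INSERT INTO highjump.t_import_order_detail (...) VALUES (...);
    -- INSERT INTO highjump.t_import_order_detail (...) VALUES (...);

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;
    THROW;   -- re-raise so the error is not silently swallowed
END CATCH;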
I created a trigger that, after an insert, must update the row that was just inserted:
alter trigger DispararInsertFactura1
on FacturaCabecera
after insert
as
BEGIN
    Declare @numfac int;

    select @numfac = NumFactura
    FROM FacturaCabecera
    WHERE id = (SELECT max(id) from FacturaCabecera);

    update FacturaCabecera
    set NumFactura = @numfac + 1
    where Id = (SELECT Id FROM INSERTED);
END
GO
But it doesn't work. Did I make a mistake?
It is generally not allowed for a trigger to start a separate access to the same table it is defined on. The table is right smack in the middle of being altered (otherwise referred to as mutating).
"After" triggers are good for audit-type actions. Inserting an entry into a different table to describe the action that just took place.
"Before" triggers are good for verifying and possible altering the data stream before it goes to the table. This is what you want to do.
Unfortunately, SQL Server does not have "Before" triggers. It does, however, allow "Instead Of" triggers on tables. These triggers are not executed as part of the DML operation but before it begins. As the name implies, the trigger is executed in lieu of the DML operation. The trigger itself must initiate the operation or nothing happens. Defining an "Instead Of" trigger that does nothing is a good way to render a table or view read-only.
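As an aside, a minimal sketch of such a do-nothing trigger might look like this (the trigger name is made up for illustration):

CREATE TRIGGER trg_FacturaCabecera_ReadOnly   -- hypothetical name
ON FacturaCabecera
INSTEAD OF INSERT
AS
BEGIN
    -- intentionally does nothing: the attempted INSERT is swallowed,
    -- so no row is ever written
    SET NOCOUNT ON;
END
GO

Back to your trigger, rewritten as an "Instead Of" trigger: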
alter trigger DispararInsertFactura1
on FacturaCabecera
instead of insert as
declare @numfac int;

select top 1 @numfac = NumFactura
from FacturaCabecera
order by id desc;

insert into FacturaCabecera( ..., NumFactura, ...)
select ..., IsNull( @numfac, 0 ) + 1, ...
from Inserted;
Notice that the trigger itself must execute the Insert statement, allowing it to alter what is being inserted.
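For reference, a self-contained version of that sketch, assuming a simplified FacturaCabecera with just an identity Id, a NumFactura column and one hypothetical data column (the real column lists are elided as "..." above):

CREATE TABLE FacturaCabecera (
    Id         int IDENTITY(1,1) PRIMARY KEY,
    Cliente    nvarchar(100),       -- hypothetical data column
    NumFactura int
);
GO

CREATE TRIGGER DispararInsertFactura1
ON FacturaCabecera
INSTEAD OF INSERT
AS
BEGIN
    DECLARE @numfac int;

    -- last number handed out so far (NULL if the table is empty)
    SELECT TOP (1) @numfac = NumFactura
    FROM FacturaCabecera
    ORDER BY Id DESC;

    -- the trigger performs the actual insert, supplying the next number
    INSERT INTO FacturaCabecera (Cliente, NumFactura)
    SELECT Cliente, ISNULL(@numfac, 0) + 1
    FROM Inserted;
END
GO

Note that, like the outline above, this computes a single @numfac per INSERT statement, so a multi-row insert would give every new row the same number.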
I was thinking about this and found the mistake. I didn't realize that the row with the MAX id would always have a NULL value, because the trigger runs after the row has already been inserted into the table (and the value is NULL when I insert it). So I changed this:
select @numfac = NumFactura
FROM FacturaCabecera
WHERE id = (SELECT max(id)
            from FacturaCabecera);
to this, which gives me the second-highest number:
SELECT @numfac = NumFactura
FROM FacturaCabecera
WHERE ID = (SELECT MIN(id)
            FROM (SELECT DISTINCT TOP (2) id
                  FROM FacturaCabecera
                  WHERE tipofactura = '1'
                  ORDER BY id DESC) T);
Thanks for your help.
ALTER TRIGGER DispararInsertFactura1
on FacturaCabecera
INSTEAD OF INSERT AS
DECLARE @numfac int;

SELECT TOP 1 @numfac = NumFactura
FROM FacturaCabecera
ORDER BY id DESC;

INSERT INTO FacturaCabecera( __ , NumFactura, __)
SELECT __ , IsNull( @numfac, 0 ) + 1, __
FROM Inserted;
Trying this
First off, I want to start by saying I am not an SQL programmer (I'm a C++/Delphi guy), so some of my questions might be really obvious. So pardon my ignorance :o)
I've been charged with writing a script that will update certain tables in a database based on the contents of a CSV file. It seems to be working, but I am worried about atomicity in one of the steps:
One of the tables contains only one field - an int which must be incremented each time, but from what I can see is not defined as an identity for some reason. I must create a new row in this table, and insert that row's value into another newly-created row in another table.
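For reference, a simplified sketch of how I understand the two tables to be shaped (everything except the id and userID columns is an assumption on my part):

-- assumed, simplified shape of the two tables involved
CREATE TABLE uniqueIDTable (
    id int NOT NULL            -- must be incremented manually; not an IDENTITY column
);

CREATE TABLE tempTable (
    userID int NULL            -- to be filled with the newly generated ids
    -- ... plus whatever columns come from the CSV
);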
This is how I did it (as part of a larger script):
DECLARE @uniqueID INT,
        @counter INT,
        @maxCount INT

SELECT @maxCount = COUNT(*) FROM tempTable
SET @counter = 1

WHILE (@counter <= @maxCount)
BEGIN
    SELECT @uniqueID = MAX(id) FROM uniqueIDTable      <----Line 1
    INSERT INTO uniqueIDTable VALUES (@uniqueID + 1)   <----Line 2
    SELECT @uniqueID = @uniqueID + 1
    UPDATE TOP(1) tempTable
    SET userID = @uniqueID
    WHERE userID IS NULL
    SET @counter = @counter + 1
END
GO
First of all, am I correct using a "WHILE" construct? I couldn't find a way to achieve this with a simple UPDATE statement.
Second of all, how can I be sure that no other operation will be carried out on the database between Lines 1 and 2 that would insert a value into the uniqueIDTable before I do? Is there a way to "synchronize" operations in SQL Server Express?
Also, keep in mind that I have no control over the database design.
Thanks a lot!
You can do the whole 9 yards in one single statement:
WITH cteUsers AS (
    SELECT t.*
         , ROW_NUMBER() OVER (ORDER BY userID) AS rn
         , COALESCE(m.id, 0) AS max_id
    FROM tempTable t WITH (UPDLOCK)
    JOIN (
        SELECT MAX(id) AS id
        FROM uniqueIDTable WITH (UPDLOCK)
    ) AS m ON 1 = 1
    WHERE userID IS NULL
)
UPDATE cteUsers
SET userID = rn + max_id
OUTPUT INSERTED.userID INTO uniqueIDTable (id);
You get the MAX(id), lock the uniqueIDTable, compute sequential userIDs for users with NULL userID by using ROW_NUMBER(), update the tempTable and insert the new ids into uniqueIDTable. All in one operation.
For performance you need an index on uniqueIDTable(id) and an index on tempTable(userID).
SQL is all about set-oriented operations; WHILE loops are a code smell in SQL.
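Spelled out, the suggested indexes would be something like this (index names are made up):

CREATE INDEX IX_uniqueIDTable_id ON uniqueIDTable (id);
CREATE INDEX IX_tempTable_userID ON tempTable (userID);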
You need a transaction to ensure atomicity, and you need to either move the select and insert into one statement or do the select with an UPDLOCK, to prevent two sessions from running the select at the same time, getting the same value, and then trying to insert the same value into the table.
Basically
DECLARE @MaxValTable TABLE (MaxID int)

BEGIN TRANSACTION
BEGIN TRY
    INSERT INTO uniqueIDTable (id)
    OUTPUT inserted.id INTO @MaxValTable (MaxID)
    SELECT MAX(id) + 1 FROM uniqueIDTable

    UPDATE TOP(1) tempTable
    SET userID = (SELECT MaxID FROM @MaxValTable)
    WHERE userID IS NULL

    COMMIT TRANSACTION
END TRY
BEGIN CATCH
    ROLLBACK TRANSACTION
    RAISERROR ('Error occurred updating tempTable', 16, 1) -- more detail here is good
END CATCH
That said, using an identity would make things far simpler. This is a potential concurrency problem. Is there any way you can change the column to be identity?
Edit: this ensures that only one connection at a time will be able to insert into the uniqueIDTable. It's not going to scale well, though.
Edit: a table variable is better than an exclusive table lock. If need be, this can be used when inserting users as well.
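If the design could ever be changed, a hypothetical sketch of the identity-based suggestion above (the _new suffix is just for illustration, since an existing column cannot simply be altered into an IDENTITY column):

-- hypothetical: only relevant if uniqueIDTable could be recreated with an IDENTITY column
CREATE TABLE uniqueIDTable_new (
    id int IDENTITY(1,1) NOT NULL PRIMARY KEY
);

-- generating a new id then becomes a single, concurrency-safe statement
INSERT INTO uniqueIDTable_new DEFAULT VALUES;
SELECT SCOPE_IDENTITY() AS newId;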