I am creating a stored procedure, which I intend to run via a job every 24 hours. I am able to run the procedure's query successfully, but for some reason the resulting values don't make sense. See below.
This is my table and what it looks like before the procedure runs, queried with the following statement:
SELECT HardwareAssetDailyAccumulatedDepreciationValue,
HardwareAssetAccumulatedDepreciationValue FROM HardwareAsset
I then run the following procedure (with the intention of basically copying the value in HardwareAssetDailyAccumulatedDepreciationValue into HardwareAssetAccumulatedDepreciationValue):
BEGIN
    SELECT HardwareAssetID,
           HardwareAssetDailyAccumulatedDepreciationValue,
           HardwareAssetAccumulatedDepreciationValue
    FROM HardwareAsset
    WHERE HardwareAssetDailyAccumulatedDepreciationValue IS NOT NULL;

    UPDATE HardwareAsset
    SET HardwareAssetAccumulatedDepreciationValue =
        CASE WHEN HardwareAssetAccumulatedDepreciationValue IS NULL
             THEN CONVERT(DECIMAL(7,2), HardwareAssetDailyAccumulatedDepreciationValue)
             ELSE CONVERT(DECIMAL(7,2), HardwareAssetAccumulatedDepreciationValue + HardwareAssetDailyAccumulatedDepreciationValue)
        END;
END
But when I re-run the SELECT statement, the results are as follows:
It really doesn't make any sense to me at all. Any ideas?
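One way to see exactly what each nightly run does, sketched here as a suggestion rather than anything taken from the post, is to add an OUTPUT clause to the same UPDATE so it returns the before and after values side by side:

    -- Debugging sketch: return before/after values for every row the UPDATE touches.
    UPDATE HardwareAsset
    SET HardwareAssetAccumulatedDepreciationValue =
        CASE WHEN HardwareAssetAccumulatedDepreciationValue IS NULL
             THEN CONVERT(DECIMAL(7,2), HardwareAssetDailyAccumulatedDepreciationValue)
             ELSE CONVERT(DECIMAL(7,2), HardwareAssetAccumulatedDepreciationValue + HardwareAssetDailyAccumulatedDepreciationValue)
        END
    OUTPUT inserted.HardwareAssetID,
           deleted.HardwareAssetAccumulatedDepreciationValue        AS AccumulatedBefore,
           inserted.HardwareAssetAccumulatedDepreciationValue       AS AccumulatedAfter,
           inserted.HardwareAssetDailyAccumulatedDepreciationValue  AS DailyValueAdded;

Comparing AccumulatedBefore and AccumulatedAfter row by row makes it much easier to tell which rows are being changed and by how much on each run.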
I am not able to replicate this. We need more detail on the table structure and data. This is what I used to attempt to replicate it; feel free to modify as needed:
create table #t (
AccD1 decimal(7,2)
, AccD2 decimal(7,2)
, AccDaily as AccD1 + AccD2
, AccTotal decimal(7,2)
)
insert #t values
(100, 7.87, null)
, (300, 36.99, null)
, (400, 49.32, null)
, (100, 50.00, 100)
select * from #t
update #t set
AccTotal = isnull(AccTotal, 0) + AccDaily
, AccD1 = 0
, AccD2 = 0
select * from #t
drop table #t
Environment: SQL Server 2019 (v15).
I have a large query that uses too much tempdb space when run as a single SELECT statement. When I try to run it, I get the following error:
Could not allocate a new page for database 'TEMPDB' because of insufficient disk space in filegroup 'DEFAULT'.
However, the problem breaks down naturally into a dozen or so pieces, so I wrote a WHILE loop to iterate through each piece and insert into a results table. Unfortunately, the first iteration of the WHILE loop returns the same error. All the WHILE loop does is change a few values in the WHERE clause.
The key thing confusing me here is that when I manually run one iteration of the INSERT statement, absent all looping logic, it works perfectly.
Manually coding the first iteration to use the first institution_name just works, so I don't think the joins are going wrong and causing the error.
WITH my_cte AS
(
SELECT [columns]
FROM mytable a
INNER JOIN bigtable b ON a.institution_name = b.institution_name
AND a.personID = b.personID
WHERE a.institution_name = 'ABC'
AND b.institution_name = 'ABC'
)
INSERT INTO results (personID, institution_name, ...)
SELECT personID, institution_name, [some aggregations]
FROM my_cte
GROUP BY personID, institution_name;
The version with the WHILE loop fails. I need to run the query with different values for institution_name.
Here I show three different values, but even just the first iteration fails.
DECLARE @INSTITUTION varchar(10)
DECLARE @COUNTER int
SET @COUNTER = 0
DECLARE @LOOKUP TABLE (temp_val varchar(10), temp_id int)
INSERT INTO @LOOKUP (temp_val, temp_id)
VALUES ('ABC', 1), ('DEF', 2), ('GHI', 3)
WHILE @COUNTER < 3
BEGIN
    SET @COUNTER = @COUNTER + 1
    SELECT @INSTITUTION = temp_val
    FROM @LOOKUP
    WHERE temp_id = @COUNTER;
    WITH my_cte AS
    (
        SELECT [columns]
        FROM mytable a
        INNER JOIN bigtable b ON a.institution_name = b.institution_name
            AND a.personID = b.personID
        WHERE a.institution_name = @INSTITUTION
            AND b.institution_name = @INSTITUTION
    )
    INSERT INTO results (personID, institution_name, ...)
    SELECT personID, institution_name, [some aggregations]
    FROM my_cte
    GROUP BY personID, institution_name
END
As I write this question, I have quite literally just copy-pasted the INSERT statement a dozen times, changed the relevant WHERE clause each time, and run it without errors. Could it be some kind of datatype issue, where the query can properly subset when a string literal is put in the WHERE clause, but the lookup against my table variable fails because of the datatype? I notice that mytable.institution_name is varchar(10) while bigtable.institution_name is nvarchar(10). Setting the lookup table to use nvarchar(10) didn't fix it either.
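One thing that might be worth ruling out, although it is only a guess and not something the post establishes: with a literal such as 'ABC' the optimizer can estimate the row count for that specific institution, whereas with a variable it falls back to a generic estimate, and the resulting plan can spill far more into tempdb. A minimal sketch of that test is the loop body above with a per-statement recompile, so the optimizer sees the current value of @INSTITUTION:

    -- Same loop body as above; only the final query hint is new.
    WITH my_cte AS
    (
        SELECT [columns]
        FROM mytable a
        INNER JOIN bigtable b ON a.institution_name = b.institution_name
            AND a.personID = b.personID
        WHERE a.institution_name = @INSTITUTION
            AND b.institution_name = @INSTITUTION
    )
    INSERT INTO results (personID, institution_name, ...)
    SELECT personID, institution_name, [some aggregations]
    FROM my_cte
    GROUP BY personID, institution_name
    OPTION (RECOMPILE)   -- compile this statement for the current value of @INSTITUTION

If the plans still differ between the literal and variable versions, comparing the actual execution plans of the two would be the next step.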
I have an INSERT query inside a stored procedure that creates a monthly set of parcels named "MONTHLY_SET".
Sometimes the set gets too big. I need to be able to keep running the same INSERT query, but rather than inserting, for example, 5000 records into a single set named "MONTHLY_SET", I need to end up with 5 sets of 1000 records each, named "MONTHLY_SET1", "MONTHLY_SET2", "MONTHLY_SET3", "MONTHLY_SET4", "MONTHLY_SET5".
I do not know how this can be achieved. I am not familiar with the use of cursors and loops in T-SQL, or whether those are the only available options.
Would it be possible to ask for some help in understanding how I can split a single set into smaller sets?
The INSERT query that needs to go inside a loop currently looks like this:
DECLARE @SETSEQ AS INT;
SET @SETSEQ = (SELECT MAX(SET_SEQ_NBR) FROM SETDETAILS);

INSERT INTO SETDETAILS
( SERV_PROV_CODE
, SET_SEQ_NBR
, SET_ID
, REC_DATE
, REC_FUL_NAM
, REC_STATUS
, SOURCE_SEQ_NBR
, L1_PARCEL_NBR
)
SELECT TOP 10
  'STRING'
, ROW_NUMBER() OVER (ORDER BY ParcelNumber ASC) + @SETSEQ
, 'MONTHLY_SET'
, GETDATE()
, 'USR'
, 'A'
, '155'
, ParcelNumber
FROM
  dbo.Parcels
WHERE
  [Create] = 1;
Thank you for your help.
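One set-based way to get the numbered set names without a cursor or WHILE loop, sketched under the assumption that a chunk size of 1000 is wanted and that dbo.Parcels and SETDETAILS look the way the query above implies, is to derive the suffix from ROW_NUMBER():

    DECLARE @SETSEQ AS INT;
    DECLARE @CHUNK  AS INT = 1000;   -- assumed chunk size; adjust as needed
    SET @SETSEQ = (SELECT MAX(SET_SEQ_NBR) FROM SETDETAILS);

    INSERT INTO SETDETAILS
    ( SERV_PROV_CODE, SET_SEQ_NBR, SET_ID, REC_DATE
    , REC_FUL_NAM, REC_STATUS, SOURCE_SEQ_NBR, L1_PARCEL_NBR )
    SELECT
      'STRING'
    , p.rn + @SETSEQ
    , 'MONTHLY_SET' + CAST(((p.rn - 1) / @CHUNK) + 1 AS varchar(10))   -- MONTHLY_SET1, MONTHLY_SET2, ...
    , GETDATE()
    , 'USR'
    , 'A'
    , '155'
    , p.ParcelNumber
    FROM
    (
        SELECT ParcelNumber,
               ROW_NUMBER() OVER (ORDER BY ParcelNumber ASC) AS rn
        FROM dbo.Parcels
        WHERE [Create] = 1
    ) AS p;

Integer division does the grouping: rows 1-1000 get suffix 1, rows 1001-2000 get suffix 2, and so on, so no explicit loop is needed.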
Running on SQL Server 2016.
I have a routine that updates information across servers. I want to hold a list of any changes that I have been required to make. I am trying to output the changed columns as XML for basic storage, and would like to do this directly from the OUTPUT generated by the insert/update/delete if possible.
As an example:
DROP TABLE IF EXISTS Test
CREATE TABLE Test (myKey INT, myValue INT)
INSERT INTO dbo.Test (myKey, myValue)
VALUES (1, 1), (2, 2), (3, 3)
UPDATE dbo.Test
SET myValue = myValue + 10
OUTPUT Deleted.*
, Inserted.*
WHERE myKey < 3
SELECT *
FROM dbo.Test
FOR XML AUTO
DROP TABLE dbo.Test
I know I can set up a table variable to receive the output and then convert to XML from there, but it seems like I'm taking extra steps to do something that should be quite straightforward.
DROP TABLE IF EXISTS Test
CREATE TABLE Test (myKey INT, myValue INT)
INSERT INTO dbo.Test (myKey, myValue)
VALUES (1, 1), (2, 2), (3, 3)
DECLARE @OutputValues AS TABLE(dMyKey INT, dMyValue INT, iMyKey INT, iMyValue INT)
UPDATE dbo.Test
SET myValue = myValue + 10
OUTPUT Deleted.myKey
, Deleted.myValue
, Inserted.myKey
, Inserted.myValue
INTO @OutputValues
WHERE myKey < 3
SELECT *
FROM @OutputValues
FOR XML AUTO
DROP TABLE dbo.Test
While this second piece of code does achieve the sort of output I am looking for, going via a table variable seems a bit wasteful.
If I can format the output from the original code directly as XML I feel that would be a better solution. However, I can't see any way of doing so.
Many Thanks.
What are you trying to achieve? Such monitoring / auditing tasks are most likely better solved within a trigger...
The OUTPUT clause does not allow sub-selects. The only way that came to mind (not generic, and rather ugly) is this; the trick is the implicit cast from nvarchar to xml:
CREATE TABLE Test (myKey INT, myValue INT)
INSERT INTO dbo.Test (myKey, myValue)
VALUES (1, 1), (2, 2), (3, 3)
DECLARE @OutputValues AS TABLE(dMyKey INT, dMyValue INT, iMyKey INT, iMyValue INT, Changed XML)
UPDATE dbo.Test
SET myValue = myValue + 10
OUTPUT Deleted.myKey
, Deleted.myValue
, Inserted.myKey
, Inserted.myValue
, N'<root><deletedKey>' + CAST(deleted.myKey AS NVARCHAR(MAX)) + N'</deletedKey>' +
N'<deletedValue>' + CAST(deleted.myValue AS NVARCHAR(MAX)) + N'</deletedValue>' +
N'<insertedKey>' + CAST(inserted.myKey AS NVARCHAR(MAX)) + N'</insertedKey>' +
N'<insertedValue>' + CAST(inserted.myValue AS NVARCHAR(MAX)) + N'</insertedValue>' +
N'</root>'
INTO @OutputValues
WHERE myKey < 3
SELECT Changed
FROM @OutputValues
DROP TABLE dbo.Test
Attention: if your values include forbidden characters (such as <, > or &), this will fail!
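With the INT columns in this example that warning is moot, but for string columns the three characters would need to be escaped before the concatenated string is cast to XML. A self-contained sketch of one way to do that (the @raw value is just a stand-in):

    -- Escape &, < and > before building the XML string.
    -- & must be replaced first so the other entities are not double-escaped.
    DECLARE @raw NVARCHAR(MAX) = N'a < b & b > c';

    SELECT CAST(
             N'<root><comment>'
             + REPLACE(REPLACE(REPLACE(@raw, N'&', N'&amp;'), N'<', N'&lt;'), N'>', N'&gt;')
             + N'</comment></root>'
           AS XML) AS Changed;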
I have to write a SQL Server statement that has to return an empty row when a value is null, and data otherwise.
I am trying to do a SELECT from an IF EXISTS block, but I get an error on the parent table.
I have simplified it here, but the idea is to retrieve a couple of fields when the condition is null and other fields when it is not null.
It works fine when I do not enclose it in another SELECT. I need to retrieve it as a table so I can do an INNER JOIN with another clause.
How can I resolve it?
Here is my code:
select * from
(
if exists(select isnull(SECTOR_ID_DESTINO_BAD,-1)
from workflow_Compras_detalle w
where w.id=2)
begin
select null as Sector,null as sector_id_origen
end
else
begin
select top 1 isnull(ws.sector,'') sector, wd.sector_id_origen
from workflow_Compras_detalle wd
where orden < 10
end
)Table
You should try to insert the data into a temporary table or table variable, and then get the data from that table. Here is an example with a table variable; if you need something more persistent you can use a #Temp table. I recommend you take a look at this: difference between var table and #Temp Table
DECLARE @VAR_TABLE AS TABLE(
    Sector varchar(25),
    sector_id_origen int
)

IF EXISTS(SELECT isnull(SECTOR_ID_DESTINO_BAD, -1)
          FROM workflow_Compras_detalle w
          WHERE w.id = 2)
BEGIN
    INSERT INTO @VAR_TABLE
    SELECT null AS Sector, null AS sector_id_origen
END
ELSE
BEGIN
    INSERT INTO @VAR_TABLE
    SELECT TOP 1 isnull(wd.sector, '') sector, wd.sector_id_origen
    FROM workflow_Compras_detalle wd
    WHERE orden < 10
END

SELECT * FROM @VAR_TABLE
I currently have a stored procedure in MSSQL where I execute a SELECT-statement multiple times based on the variables I give the stored procedure. The stored procedure counts how many results are going to be returned for every filter a user can enable.
The stored procedure isn't the issue; I transformed the SELECT statement from the stored procedure into a regular SELECT statement, which looks like this:
DECLARE @contentRootId int = 900589
DECLARE @RealtorIdList varchar(2000) = ';880;884;1000;881;885;'
DECLARE @publishSoldOrRentedSinceDate int = 8
DECLARE @isForSale BIT = 1
DECLARE @isForRent BIT = 0
DECLARE @isResidential BIT = 1
--...(another 55 variables)...

-- Table to be returned
DECLARE @resultTable TABLE
(
    variableName varchar(100),
    [value] varchar(200)
)

-- Create a table based on the input variable. Example: turns ';18;118;' into a table containing two ints, 18 and 118
DECLARE @RealtorIdTable TABLE (RealtorId int)
INSERT INTO @RealtorIdTable SELECT * FROM dbo.Split(@RealtorIdList, ';') OPTION (MAXRECURSION 150)

INSERT INTO @resultTable ([value], variableName)
SELECT [Value], VariableName FROM (
    SELECT COUNT(*) AS TotalCount,
        ISNULL(SUM(CASE WHEN reps.ForRecreation = 1 THEN 1 ELSE 0 END), 0) AS ForRecreation,
        ISNULL(SUM(CASE WHEN reps.IsQualifiedForSeniors = 1 THEN 1 ELSE 0 END), 0) AS IsQualifiedForSeniors,
        --...(a whole bunch more SUM(CASE)...)
    FROM TABLE1 reps
    LEFT JOIN temp t ON
        t.ContentRootID = @contentRootId
        AND t.RealEstatePropertyID = reps.ID
    WHERE
        (EXISTS(SELECT 1 FROM @RealtorIdTable WHERE RealtorId = reps.RealtorID))
        AND (@SelectedGroupIds IS NULL OR EXISTS(SELECT 1 FROM @SelectedGroupIdtable WHERE GroupId = t.RealEstatePropertyGroupID))
        AND (ISNULL(reps.IsForSale, 0) = ISNULL(@isForSale, 0))
        AND (ISNULL(reps.IsForRent, 0) = ISNULL(@isForRent, 0))
        AND (ISNULL(reps.IsResidential, 0) = ISNULL(@isResidential, 0))
        AND (ISNULL(reps.IsCommercial, 0) = ISNULL(@isCommercial, 0))
        AND (ISNULL(reps.IsInvestment, 0) = ISNULL(@isInvestment, 0))
        AND (ISNULL(reps.IsAgricultural, 0) = ISNULL(@isAgricultural, 0))
        --...(around 50 more of these WHERE conditions)...
) AS tbl
UNPIVOT (
    [Value]
    FOR [VariableName] IN (
        [TotalCount],
        [ForRecreation],
        [IsQualifiedForSeniors],
        --...(all the other things I selected in the above query)...
    )
) AS d

SELECT * FROM @resultTable
The combination of a realtor ID and a content ID gives me a default set of X records. When I choose a combination that gives me ~4600 records, the execution time is around 250 ms. When I execute the statement with a combination that gives me ~600 records, the execution time is about 20 ms.
I would like to know why this is happening. I tried removing all the SUM(CASE ...) expressions from the SELECT, I tried removing almost everything from the WHERE clause, and I tried removing the JOIN, but I keep seeing the huge difference between the result set of 4600 records and the one of 600.
Table variables can perform worse when the number of records is large. Consider using a temporary table instead. See When should I use a table variable vs temporary table in sql server?
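A sketch of that suggestion applied to the query above, swapping only the realtor lookup table (the rest of the statement stays as it is):

    -- A temp table gets real statistics, unlike a table variable, so the optimizer
    -- can estimate the EXISTS(...) lookup much more accurately.
    -- Assumes dbo.Split returns distinct, non-null ids.
    CREATE TABLE #RealtorIdTable (RealtorId int PRIMARY KEY);

    INSERT INTO #RealtorIdTable (RealtorId)
    SELECT * FROM dbo.Split(@RealtorIdList, ';') OPTION (MAXRECURSION 150);

    -- ...run the big INSERT INTO @resultTable query unchanged, with
    -- "FROM @RealtorIdTable" replaced by "FROM #RealtorIdTable"...

    DROP TABLE #RealtorIdTable;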
Also, consider replacing the UNPIVOT with alternative SQL code. Writing your own T-SQL will give you more control and may even increase performance. See, for example, PIVOT, UNPIVOT and performance
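As an illustration of that alternative, here is a sketch that replaces the UNPIVOT with CROSS APPLY (VALUES ...), using only a few of the columns from the question; the varchar(200) casts match the [value] column of @resultTable:

    INSERT INTO @resultTable ([value], variableName)
    SELECT v.[Value], v.VariableName
    FROM
    (
        SELECT COUNT(*) AS TotalCount,
               ISNULL(SUM(CASE WHEN reps.ForRecreation = 1 THEN 1 ELSE 0 END), 0) AS ForRecreation,
               ISNULL(SUM(CASE WHEN reps.IsQualifiedForSeniors = 1 THEN 1 ELSE 0 END), 0) AS IsQualifiedForSeniors
        FROM TABLE1 reps
        -- ...same LEFT JOIN and WHERE clause as in the question...
    ) AS tbl
    CROSS APPLY (VALUES
        ('TotalCount',            CAST(tbl.TotalCount            AS varchar(200))),
        ('ForRecreation',         CAST(tbl.ForRecreation         AS varchar(200))),
        ('IsQualifiedForSeniors', CAST(tbl.IsQualifiedForSeniors AS varchar(200)))
    ) AS v (VariableName, [Value]);

Each (name, value) pair in the VALUES list becomes one output row, which is the same shape the UNPIVOT produced, but the optimizer treats it as a plain constant scan.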