I have a stored procedure to return a paginated set of results based on a bunch of optional search criteria. The procedure uses a temp table so that the total count and page count can be returned separately from the search results. Here's what it looks like:
CREATE PROCEDURE [dbo].[Location_SearchAll]
@LocationName varchar(100) = null,
@LocationId varchar(10) = null,
@Address varchar(40) = null,
@City varchar(35) = null,
@StateProvince varchar(2) = null,
@PostalCode varchar(9) = null,
@CountryCode varchar(2) = null,
@PhoneNumber varchar(20) = null,
@ServiceLevel varchar(35) = null,
@Page int = 1,
@ItemsPerPage int = 100
AS
select * into #temp from Locations l
where (l.LocationName like '%' + @LocationName + '%' or @LocationName is null)
and (l.LocationId like '%' + @LocationId + '%' or @LocationId is null)
and (l.AddressLine1 like '%' + @Address + '%' or @Address is null)
and (l.City like '%' + @City + '%' or @City is null)
and (l.StateProvince like '%' + @StateProvince + '%' or @StateProvince is null)
and (l.PostalCode like '%' + @PostalCode + '%' or @PostalCode is null)
and (l.CountryCode like '%' + @CountryCode + '%' or @CountryCode is null)
and (l.Phone like '%' + @PhoneNumber + '%' or @PhoneNumber is null)
and (l.ServiceLevel like '%' + @ServiceLevel + '%' or @ServiceLevel is null);
select count(*) as TotalCount, count(*) / @ItemsPerPage + 1 as TotalPages from #temp;
select * from #temp
order by Id
offset (@Page) * @ItemsPerPage - @ItemsPerPage rows
fetch next @ItemsPerPage rows only;
drop table #temp;
RETURN 0
GO
I'm noticing that, on occasion, this procedure will return a different number of results each time despite having the exact same search criteria defined. Specifically, the count of results will steadily increase with each subsequent execution until the count finally hits the actual number of records that should be returned based on the search criteria. It's as if the results are being returned by the select statements before the temp table is fully populated.
Some points of note:
This anomaly is reflected in both select statements - the total counts and the results returned.
I've attempted to replace the temp table with a table variable, but it has the same problem.
I'm guessing this is some dumb thing I'm doing with respect to my usage of temp tables, but I can't seem to find any documentation anywhere explaining what that might be. Any ideas? Many thanks in advance.
@Larnu was right. It was my own dumb, dumb fault. I didn't so much as consider the possibility that the underlying data was changing because this is a local DB and no one else is touching it, and then I remembered I had a recurring ETL scheduled that was refreshing the table every few minutes. That's why my result set was changing.
If I could delete this question and conceal my embarrassment, I would, but in lieu of that I'll just thank @Stu for a concise schooling on stored procedure improvements and hope that my derp moment prevents someone else from making the same mistake.
I just made a stored procedure for searching items. Before the procedure, I was doing this all via LINQ in C#, e.g.
//orders
var orderNumbersBuilder = new StringBuilder();
if (objParam.OrderNumber != null && objParam.OrderNumber.Count > 0)
{
    foreach (var orderNumber in objParam.OrderNumber)
    {
        orderNumbersBuilder.Append(',' + orderNumber.ToString());
    }
}
var orderNbrs = orderNumbersBuilder.ToString().Trim(',').Split(',');
//Searching
(objParam.OrderNumber.Count == 0 || orderNbrs.Any(a => i.tblOrderMaster.OrderNumber.Contains(a)))
Now I want to do it with a stored procedure. I'm getting the result with the IN operator, but I want to use the LIKE operator, e.g.
SELECT * FROM tblOrderMaster WHERE TrxNumber LIKE '%' + (SELECT * FROM STRING_SPLIT('1330,1329',',')) + '%'
I have multiple filters, so I don't want to use functions and subqueries, e.g.
--Params
@Account NVARCHAR(MAX) = NULL,
@OrderNumber NVARCHAR(MAX) = NULL,
@Carrier NVARCHAR(MAX) = NULL,
@ItemCode NVARCHAR(MAX) = NULL,
@OrderType NVARCHAR(MAX) = NULL,
@PONumber NVARCHAR(MAX) = NULL,
@SONumber NVARCHAR(MAX) = NULL
--columns start
--columns end
--Where conditions
(@Account IS NULL OR @Account = '' OR partners.PartnerCode IN (select * from string_split(@Account,','))) -- multi select filters started
AND
(@OrderNumber IS NULL OR @OrderNumber = '' OR orderMaster.OrderNumber IN (select * from string_split(@OrderNumber,',')))
AND
(@Carrier IS NULL OR @Carrier = '' OR carrier.Description IN (select * from string_split(@Carrier,',')))
AND
(@ItemCode IS NULL OR @ItemCode = '' OR itemMaster.ItemCode IN (select * from string_split(@ItemCode,',')))
AND
(@OrderType IS NULL OR @OrderType = '' OR orderMaster.OrderType IN (select * from string_split(@OrderType,',')))
AND
(@PONumber IS NULL OR @PONumber = '' OR orderMaster.PONumber IN (select * from string_split(@PONumber,',')))
AND
(@SONumber IS NULL OR @SONumber = '' OR orderMaster.SONumber IN (select * from string_split(@SONumber,',')))
You would need to use subqueries; the fact that you don't want to doesn't change that with your current design. Using the query with the literal values you have, it would look like this:
SELECT *
FROM dbo.tblOrderMaster OM
WHERE EXISTS (SELECT 1
FROM STRING_SPLIT('1330,1329', ',') SS
WHERE OM.TrxNumber LIKE '%' + SS.[Value] + '%')
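Applied to one of your optional multi-value filters, the same pattern might look like this (a sketch reusing the @OrderNumber parameter and the orderMaster alias from your question):
(@OrderNumber IS NULL OR @OrderNumber = ''
 OR EXISTS (SELECT 1
            FROM STRING_SPLIT(@OrderNumber, ',') SS
            WHERE orderMaster.OrderNumber LIKE '%' + SS.[value] + '%'))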
If you really don't want to use subqueries, then use table type parameters and then you can perform a JOIN:
SELECT OM.*
FROM dbo.tblOrderMaster OM
JOIN @YourTableVariable YTV ON OM.TrxNumber LIKE '%' + YTV.SearchValue + '%'
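For reference, a minimal sketch of how such a table type could be declared and used as a procedure parameter (the type, procedure, and column names here are assumptions, not taken from your schema):
CREATE TYPE dbo.SearchValueList AS TABLE (SearchValue nvarchar(100) NOT NULL);
GO
CREATE PROCEDURE dbo.Order_SearchByNumbers
    @OrderNumbers dbo.SearchValueList READONLY
AS
    SELECT OM.*
    FROM dbo.tblOrderMaster OM
    JOIN @OrderNumbers SV ON OM.TrxNumber LIKE '%' + SV.SearchValue + '%';
GO
From C#, a parameter of this type is passed as SqlDbType.Structured (e.g. a DataTable with a single SearchValue column).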
I have a problem. I think it's something simple, but I'm just getting started with this. I have a .txt file that contains
Kayle;Osvorn;35;4399900433
These would be my columns: First name;Last name;Age;Phone
I need to separate them with a Derived Column transformation in my ETL, but so far I have only been able to extract the first and last name, and I don't know how to continue with the rest.
I have this for the first two columns
Name = SUBSTRING(CustomerData,1,FINDSTRING(CustomerData,";",1) - 1)
Last Name = SUBSTRING(CustomerData,FINDSTRING(CustomerData,";",1) + 1,LEN(CustomerData))
Age = ?
Phone = ?
Does anyone have any idea how the expression would go?
There's no need to use a Derived Column transformation in an SSIS package. Instead, in your Flat File Connection Manager, define your field separator as the semicolon ';' instead of the default comma ','. Indicate that it should ... identify columns, and now your single column of CustomerData goes away and you have nice delimited columns.
If you have column headers, it should pull those out. Otherwise, you will need to specify no header and then go into the Advanced tab and give them friendly names.
You can use the logic below to achieve your requirement:
DECLARE @T VARCHAR(200) = 'Kayle;Osvorn;35;4399900433'
DECLARE @index_1 INT
DECLARE @index_2 INT
DECLARE @index_3 INT
DECLARE @name VARCHAR(100)
DECLARE @last_name VARCHAR(100)
DECLARE @age VARCHAR(100)
DECLARE @phone VARCHAR(100)
SELECT @index_1 = CHARINDEX(';',@T,0) + 1
SELECT @index_2 = CHARINDEX(';',@T,@index_1 + 1) + 1
SELECT @index_3 = CHARINDEX(';',@T,@index_2 + 1) + 1
SELECT
@name = SUBSTRING(@T,0,@index_1 - 1),
@last_name = SUBSTRING(@T, @index_1 ,@index_2 - @index_1 - 1),
@age = SUBSTRING(@T,@index_2, @index_3 - @index_2 - 1),
@phone = SUBSTRING(@T,@index_3,LEN(@T))
SELECT @name,@last_name, @age,@phone
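On SQL Server, PARSENAME can be another option when the string always has exactly four parts and the values contain no periods (just a sketch, not a general-purpose split):
DECLARE @T VARCHAR(200) = 'Kayle;Osvorn;35;4399900433'
SELECT
    PARSENAME(REPLACE(@T, ';', '.'), 4) AS [Name],
    PARSENAME(REPLACE(@T, ';', '.'), 3) AS [Last Name],
    PARSENAME(REPLACE(@T, ';', '.'), 2) AS [Age],
    PARSENAME(REPLACE(@T, ';', '.'), 1) AS [Phone]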
One simple way is to do the same operations on the REVERSEd string:
[Name] = SUBSTRING(@CustomerData,1,FINDSTRING(@CustomerData,";",1) - 1)
[Last Name] = SUBSTRING(@CustomerData, FINDSTRING(@CustomerData, ";",1) + 1,
FINDSTRING(SUBSTRING(@CustomerData, FINDSTRING(@CustomerData, ";",1)+1, LEN(@CustomerData)),";",1)-1)
Age = REVERSE(SUBSTRING(REVERSE(@CustomerData), FINDSTRING(REVERSE(@CustomerData),";",1)+1,
FINDSTRING(SUBSTRING(REVERSE(@CustomerData), FINDSTRING(REVERSE(@CustomerData),";",1) + 1, LEN(@CustomerData)),";",1)-1))
Phone = REVERSE(SUBSTRING(REVERSE(@CustomerData),1,FINDSTRING(REVERSE(@CustomerData),";",1) - 1))
If you need to do that using a transformation, why not use the TOKEN() function?
Name = TOKEN(CustomerData,";",1)
Last Name = TOKEN(CustomerData,";",2)
Age = TOKEN(CustomerData,";",3)
Phone = TOKEN(CustomerData,";",4)
Please look at the query below:
select name as [Employee Name] from table name.
I want to generate [Employee Name] dynamically based on another column's value.
Here is the sample table
s_dt dt01 dt02 dt03
2015-10-26
I want the dt01 column to be displayed with the column name 26, and the dt02 column name will be 26+1=27.
I'm not sure if I understood you correctly. If I'm heading in the wrong direction, please add comments to your question to make it more precise.
If you really want to create columns per sql you could try a variation of this script:
DECLARE @name NVARCHAR(MAX) = 'somename'
DECLARE @sql NVARCHAR(MAX) = 'ALTER TABLE aps.tbl_Fabrikkalender ADD '+@name+' nvarchar(10) NULL'
EXEC sys.sp_executesql @sql;
To retrieve the column name from another query, insert the following between the above declares and fill in the placeholders as needed:
SELECT @name = <some column> FROM <some table> WHERE <some condition>
You would need to dynamically build the SQL as a string then execute it. Something like this...
DECLARE @s_dt INT
DECLARE @query NVARCHAR(MAX)
SET @s_dt = (SELECT DATEPART(dd, s_dt) FROM TableName WHERE 1 = 1)
SET @query = 'SELECT s_dt'
+ ', NULL as dt' + RIGHT('0' + CAST(@s_dt as VARCHAR), 2)
+ ', NULL as dt' + RIGHT('0' + CAST((@s_dt + 1) as VARCHAR), 2)
+ ', NULL as dt' + RIGHT('0' + CAST((@s_dt + 2) as VARCHAR), 2)
+ ', NULL as dt' + RIGHT('0' + CAST((@s_dt + 3) as VARCHAR), 2)
+ ' FROM TableName WHERE 1 = 1'
EXECUTE(@query)
You will need to replace WHERE 1 = 1 in two places above to select your data, and change TableName to the name of your table. It currently puts NULL as the dynamic column data; you probably want something else there.
To explain what it is doing:
SET @s_dt is selecting the date value from your table and returning only the day part as an INT.
SET @query is dynamically building your SELECT statement based on the day part (@s_dt).
Each line is taking @s_dt, adding 0, 1, 2, 3 etc., casting it as VARCHAR, adding '0' to the left (so that it is at least 2 chars in length), then taking the right two chars (the '0' and RIGHT operation just ensure anything under 10 has a leading '0').
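For example, a quick sketch of that padding step on its own:
SELECT RIGHT('0' + CAST(26 AS VARCHAR), 2) AS Day26,  -- returns '26'
       RIGHT('0' + CAST(9 AS VARCHAR), 2) AS Day09    -- returns '09'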
It is possible to do this using dynamic SQL, however I would also consider looking at the pivot operators to see if they can achieve what you are after a lot more efficiently.
https://technet.microsoft.com/en-us/library/ms177410(v=sql.105).aspx
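For reference, a minimal static PIVOT sketch (TableName, SomeValue, and the day numbers in the IN list are placeholders; for truly dynamic day headers the IN list would still need to be built dynamically):
SELECT s_dt, [26], [27], [28]
FROM (SELECT s_dt, DATEPART(dd, s_dt) AS DayNo, SomeValue
      FROM TableName) AS src
PIVOT (MAX(SomeValue) FOR DayNo IN ([26], [27], [28])) AS p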
I am creating a computed column from several fields, some of which are potentially null.
The problem is that if any of those fields is null, the entire computed column will be null. I understand from the Microsoft documentation that this is expected and can be turned off via the setting SET CONCAT_NULL_YIELDS_NULL. However, I don't want to change this default behavior because I don't know its implications for other parts of SQL Server.
Is there a way for me to just check whether a column is null and only append its contents within the computed column formula if it's not null?
You can use ISNULL(....)
SET @Concatenated = ISNULL(@Column1, '') + ISNULL(@Column2, '')
If the value of the column/expression is indeed NULL, then the second value specified (here: empty string) will be used instead.
From SQL Server 2012 this is all much easier with the CONCAT function.
It treats NULL as an empty string:
DECLARE @Column1 VARCHAR(50) = 'Foo',
@Column2 VARCHAR(50) = NULL,
@Column3 VARCHAR(50) = 'Bar';
SELECT CONCAT(@Column1,@Column2,@Column3); /*Returns FooBar*/
Use COALESCE. Instead of your_column use COALESCE(your_column, ''). This will return the empty string instead of NULL.
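A minimal sketch of that in a concatenation (FirstName and LastName are placeholder column names):
SELECT COALESCE(FirstName, '') + ' ' + COALESCE(LastName, '') AS FullName
FROM dbo.YourTable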
You can also use CASE - my code below checks for both null values and empty strings, and adds a separator only if there is a value to follow:
SELECT OrganisationName,
'Address' =
CASE WHEN Addr1 IS NULL OR Addr1 = '' THEN '' ELSE Addr1 END +
CASE WHEN Addr2 IS NULL OR Addr2 = '' THEN '' ELSE ', ' + Addr2 END +
CASE WHEN Addr3 IS NULL OR Addr3 = '' THEN '' ELSE ', ' + Addr3 END +
CASE WHEN County IS NULL OR County = '' THEN '' ELSE ', ' + County END
FROM Organisations
Use
SET CONCAT_NULL_YIELDS_NULL OFF
and concatenation of null values to a string will not result in null.
Please note that this is a deprecated option; avoid using it.
See the documentation for more details.
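A quick demonstration of the setting's effect (it is session-scoped):
SET CONCAT_NULL_YIELDS_NULL OFF;
SELECT 'abc' + NULL;   -- returns 'abc'
SET CONCAT_NULL_YIELDS_NULL ON;
SELECT 'abc' + NULL;   -- returns NULL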
I just wanted to contribute this in case someone is looking for help with adding separators between the strings, depending on whether a field is NULL or not.
So, in the example of creating a one-line address from the separate fields
Address1, Address2, Address3, City, PostCode
in my case, I have the following computed column, which seems to work as I want:
case
when [Address1] IS NOT NULL
then ((( [Address1] +
isnull(', '+[Address2],'')) +
isnull(', '+[Address3],'')) +
isnull(', '+[City] ,'')) +
isnull(', '+[PostCode],'')
end
Hope that helps someone!
ISNULL(ColumnName, '')
I had a lot of trouble with this too. Couldn't get it working using the case examples above, but this does the job for me:
Replace(rtrim(ltrim(ISNULL(Flat_no, '') +
' ' + ISNULL(House_no, '') +
' ' + ISNULL(Street, '') +
' ' + ISNULL(Town, '') +
' ' + ISNULL(City, ''))), '  ', ' ')
Replace corrects the double spaces caused by concatenating single spaces with nothing between them. r/ltrim gets rid of any spaces at the ends.
In SQL Server:
insert into Table_Name(PersonName,PersonEmail) values(NULL,'xyz@xyz.com')
PersonName is varchar(50). NULL is not a string, because we are not passing it within single quotes, so it is treated as NULL.
Code Behind:
string name = (txtName.Text == "") ? "NULL" : "'" + txtName.Text + "'";
string email = txtEmail.Text;
string query = "insert into Table_Name(PersonName, PersonEmail) values(" + name + ", '" + email + "')";
This example will help you handle various types while creating insert statements:
select
'insert into doc(Id, CDate, Str, Code, Price, Tag )' +
'values(' +
'''' + convert(nvarchar(50), Id) + ''',' -- uniqueidentifier
+ '''' + LEFT(CONVERT(VARCHAR, CDate, 120), 10) + ''',' -- date
+ '''' + Str+ ''',' -- string
+ '''' + convert(nvarchar(50), Code) + ''',' -- int
+ convert(nvarchar(50), Price) + ',' -- decimal
+ '''' + ISNULL(Tag, '''''') + '''' + ')' -- nullable string
from doc
where CDate> '2019-01-01 00:00:00.000'