How to use an evaluated expression in SQL IN clause - sql-server

I have an application that takes a comma separated string for multiple IDs to be used in the 'IN' clause of a SQL query.
SELECT * FROM TABLE WHERE [TABLENAME].[COLUMNNAME]
IN
((SELECT '''' + REPLACE('PARAM(0, Enter ID/IDS. Separate multiple ids by
comma., String)', char(44), ''',''') + ''''))
I have tested that PARAM receives the entered string, e.g. 'ID1, ID2', but the SELECT/REPLACE does not execute as expected. The statement becomes:
SELECT * FROM TABLE WHERE [TABLENAME].[COLUMNNAME]
IN
((SELECT '''' + REPLACE('ID1,ID2', char(44), ''',''') + ''''))
What I am trying to achieve is:
SELECT * FROM TABLE WHERE [TABLENAME].[COLUMNNAME]
IN ('ID1', 'ID2')
The query does not return any results or errors. I am confident the corresponding records are in the database I am working with. I am not sure how to fix this.

You can't do it like this. The IN operator expects a list of values separated by commas, but you supply it with a single value that happens to contain a comma-delimited string.
If you are working on SQL Server 2016 or higher, you can use the built-in STRING_SPLIT() function to convert the delimited string into a table:
SELECT *
FROM TABLE
WHERE [TABLENAME].[COLUMNNAME]
IN (SELECT value FROM STRING_SPLIT(@CommaDelimitedString, ','))
For older versions, there are multiple user-defined functions you can choose from; my personal favorite is Jeff Moden's DelimitedSplit8K. For more options, read Aaron Bertrand's Split strings the right way – or the next best way.
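For illustration, here is a minimal sketch of the 2016+ route, assuming a hypothetical @IDs variable standing in for the report's PARAM value and placeholder table/column names; LTRIM/RTRIM handles input like 'ID1, ID2', where a space follows the comma:
-- Sketch only: @IDs, [TABLENAME] and [COLUMNNAME] are placeholders for the real parameter and schema.
DECLARE @IDs varchar(8000) = 'ID1, ID2';
SELECT t.*
FROM [TABLENAME] AS t
WHERE t.[COLUMNNAME] IN
(
    SELECT LTRIM(RTRIM(value))
    FROM STRING_SPLIT(@IDs, ',')
);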

Related

T-SQL: LIKE operator, compare string to one value

Is there a way to compare multiple values in one column to a single value in another column?
Example:
Column A contains: [a;b;c;d]
Column B contains: [a]
At the moment I'm using the LIKE operator to achieve this, but I get no result. I tried it with a wildcard % but I get no match because of the ;.
As Larnu suggested, the real fix here is to fix the design. You should go back to the owners and remind them that the database is for storing relational data; if you're jamming multiple "facts" into a single column, you may as well be using a flat file. The exception is if you are storing a comma-separated list for the application and only the application is responsible for assembling and exploding that set.
Anyway, given that you are probably stuck with this (and let's say ColumnA is limited to 128 characters):
CREATE TABLE dbo.BadDesign
(
ColumnA nvarchar(128),
ColumnB nvarchar(max)
);
INSERT dbo.BadDesign(ColumnA, ColumnB) VALUES
(N'[a]', N'[a;b;c;d]'), -- only match
(N'[p]', N'[q;r;s]'),
(N'[h]', N'[hi;j;k]');
You can see the following solutions demonstrated in turn in this db<>fiddle:
Nested Replace
In the old days (before SQL Server 2017), we would perform nested REPLACE() calls to get rid of the square brackets and put delimiters at each end of the string:
-- All versions
SELECT ColumnA, ColumnB
FROM dbo.BadDesign
WHERE REPLACE(REPLACE(ColumnB, N'[',N';'),N']', N';')
LIKE N'%' + REPLACE(REPLACE(ColumnA ,N'[',N';'),N']', N';') + N'%';
Gross, but results:
ColumnA    ColumnB
[a]        [a;b;c;d]
We can't use TRIM() on versions prior to SQL Server 2017, but I explain below why we don't want to use that function on modern versions anyway.
OpenJson
In 2016+ we can use OPENJSON after a little manipulation of the string. And here I use a PARSENAME() trick, which is only safe if ColumnA is <= 128 characters. I show other workarounds in this db<>fiddle:
SELECT b.ColumnA, b.ColumnB
FROM dbo.BadDesign AS b
CROSS APPLY OPENJSON(REPLACE(REPLACE(REPLACE(
    b.ColumnB, N'[', N'["'), N']', N'"]'), N';', N'","')) AS j
WHERE j.value = PARSENAME(b.ColumnA, 1);
Results:
ColumnA    ColumnB
[a]        [a;b;c;d]
Translate
In SQL Server 2017, it can be a little less gross with TRANSLATE():
-- SQL Server 2017+
SELECT ColumnA, ColumnB
FROM dbo.BadDesign
WHERE TRANSLATE(ColumnB, N'[]',N';;')
LIKE N'%' + TRANSLATE(ColumnA, N'[]',N';;') + N'%';
ColumnA    ColumnB
[a]        [a;b;c;d]
We don't want to use TRIM() here because we don't simply want to remove the enclosing square brackets; we want delimiters there so we can always compare A to B regardless of where B is in the string. Without surrounding delimiters replaced or translated, we could get inaccurate results if the match is at the beginning or end of the multi-value string.
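To see why, here is a quick sketch against the same sample table of what happens if we only strip the brackets and keep no delimiters: the [h] row incorrectly matches [hi;j;k], because 'hi' contains 'h'.
-- Brackets stripped, no delimiters kept: returns a false positive for the [h] / [hi;j;k] row.
SELECT ColumnA, ColumnB
FROM dbo.BadDesign
WHERE REPLACE(REPLACE(ColumnB, N'[', N''), N']', N'')
      LIKE N'%' + REPLACE(REPLACE(ColumnA, N'[', N''), N']', N'') + N'%';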
Split Function
Alternatively, you could create this function on SQL Server 2016+ (or a messier one that doesn't use STRING_SPLIT() in earlier versions - as Smor noted, a search will turn up hundreds of those):
CREATE FUNCTION dbo.SplitAndClean(@s nvarchar(max))
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN
(
  SELECT value
  FROM STRING_SPLIT
  (
    -- if 2016:
    REPLACE(REPLACE(@s, N'[', N';'), N']', N';'),
    -- if 2017+, TRANSLATE() is slightly cleaner:
    /* TRANSLATE(@s, N'[]', N';;'), */
    N';'
  )
  WHERE value > N''
);
Then you can say:
SELECT bd.ColumnA, bd.ColumnB
FROM dbo.BadDesign AS bd
CROSS APPLY dbo.SplitAndClean(bd.ColumnA) AS a
CROSS APPLY dbo.SplitAndClean(bd.ColumnB) AS b
WHERE a.value = b.value;
ColumnA    ColumnB
[a]        [a;b;c;d]
But in the end...
...these are all gross "solutions" masking bad design, and you should really have them reconsider how they're using the database.
I know that many shops can't just switch to passing sets between the app and the database using TVPs, because several client providers and ORMs haven't quite had more than a decade to catch that train. If you can't use TVPs or can't change the app, you should at least consider intercepting the comma-separated list passed by the app and breaking it apart using STRING_SPLIT() or the like. Then you can store the values relationally and let the database do what the database was designed to do, without being handcuffed by app limitations.
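For reference, a minimal TVP sketch under the assumption that the app can be changed to send a set; the type and procedure names here are hypothetical, and the procedure reuses the dbo.SplitAndClean function from above only because the stored data is still delimited:
-- Hypothetical names; sketch only.
CREATE TYPE dbo.StringList AS TABLE (value nvarchar(128) NOT NULL);
GO
CREATE PROCEDURE dbo.FindMatches
    @Values dbo.StringList READONLY
AS
BEGIN
    SET NOCOUNT ON;
    -- The app passes a set of values instead of a delimited string.
    SELECT bd.ColumnA, bd.ColumnB
    FROM dbo.BadDesign AS bd
    CROSS APPLY dbo.SplitAndClean(bd.ColumnB) AS b
    WHERE b.value IN (SELECT v.value FROM @Values AS v);
END
GO
-- Example call:
-- DECLARE @v dbo.StringList; INSERT @v (value) VALUES (N'a'); EXEC dbo.FindMatches @Values = @v;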
If there will always be only one value in col_b, as in your example, you can use nested REPLACE functions to remove the [ and ] and then use LIKE for the search:
select *
from test_data
where col_a like '%' + replace(replace(col_b, '[', ''), ']', '') + '%';
But if there could be more than one value in col_b, and the values could be in any order (e.g. "[a;c]" or "[d;a]"), you'll find an answer among the already answered questions, or you can look up the STRING_SPLIT() function on MSDN. The latter has a great examples section that will definitely help you out.
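As a rough sketch of that STRING_SPLIT() route (SQL Server 2016+), using the table and column names from the example above and stripping the brackets first:
-- Exact value-to-value comparison, so partial matches cannot slip through.
SELECT DISTINCT t.col_a, t.col_b
FROM test_data AS t
CROSS APPLY STRING_SPLIT(REPLACE(REPLACE(t.col_a, '[', ''), ']', ''), ';') AS a
CROSS APPLY STRING_SPLIT(REPLACE(REPLACE(t.col_b, '[', ''), ']', ''), ';') AS b
WHERE a.value = b.value;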

Converting Oracle statement to SQL Server format

Below is an Oracle script that I need to execute on an SQL Server.
SELECT
records.pr_id,
SUBSTR (REPLACE (REPLACE (XMLAGG (XMLELEMENT ("x", prad4.selection_value)
ORDER BY prad4.selection_value),'</x>'),'<x>',' ; '),4) as teva_role
FROM records
Thanks for the help,
Barry
I programmed in SQL for years in several environments, and it is about 75% the same across them. So the SQL statement should work largely as is; however, the functions (REPLACE, SUBSTR) are what you will need to research and change.
Also, you select columns from prad4 without including it in the FROM clause, which is a problem.
And, finally, your parentheses aren't balanced, which, I would think, would be a problem in Oracle as well.
This is basically concatenating a set of strings with a delimiter. The common way to do this is using FOR XML PATH(''), which is the SQL Server equivalent of the XMLAGG()/XMLELEMENT() combination in Oracle, but with a different syntax. The TYPE directive combined with .value() prevents characters that are not allowed in XML (such as &, < and >) from being entitized. The STUFF() takes care of the SUBSTR() part of your code. For a more detailed explanation, you can read this article on Creating a comma-separated list.
The code should look similar to this:
SELECT records.pr_id,
STUFF(( SELECT ' ; ' + prad4.selection_value
FROM prad4
WHERE prad4.pr_id = records.pr_id
ORDER BY prad4.selection_value
FOR XML PATH(''), TYPE).value('./text()[1]', 'varchar(max)'), 1, 3, '')
FROM records;
Of course, with the improvements of SQL Server 2017, the code can be simplified to something like this:
SELECT records.pr_id,
       STRING_AGG(prad4.selection_value, ' ; ') WITHIN GROUP (ORDER BY prad4.selection_value) AS teva_role
FROM records
INNER JOIN prad4 ON prad4.pr_id = records.pr_id
GROUP BY records.pr_id;

Is there a SQL Server collation option that will allow matching different apostrophes?

I'm currently using SQL Server 2016 with SQL_Latin1_General_CP1_CI_AI collation. As expected, queries with the letter e will match values with the letters e, è, é, ê, ë, etc because of the accent insensitive option of the collation. However, queries with a ' (U+0027) do not match values containing a ’ (U+2019). I would like to know if such a collation exists where this case would match, since it's easier to type ' than it is to know that ’ is keystroke Alt-0146.
I'm confident in saying no. The main thing here is that the two characters are different (although similar). With accents, e and ê are still both an e (just one has an accent). This enables you (for example) to do searches for things like SELECT * FROM Games WHERE [Name] LIKE 'Pokémon%'; and still have rows containing Pokemon return (because people haven't used the accent :P).
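A quick illustration of both behaviors under the question's collation; the accent-insensitive comparison returns 1 while the two apostrophes compare as different characters:
SELECT IIF(N'e' = N'é' COLLATE SQL_Latin1_General_CP1_CI_AI, 1, 0) AS accent_insensitive,  -- 1
       IIF(NCHAR(0x0027) = NCHAR(0x2019) COLLATE SQL_Latin1_General_CP1_CI_AI, 1, 0) AS apostrophe_match;  -- 0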
The best thing I could suggest would be to use REPLACE (at least in your WHERE clause) so that both rows are returned. That is, however, likely going to get expensive.
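For completeness, a minimal sketch of that direct REPLACE approach against the Sample table created below; it has to evaluate REPLACE for every row and cannot use an index on String, which is where the expense comes from:
SELECT String
FROM Sample
WHERE REPLACE(String, '’', '''') LIKE '%''%';  -- returns the same 3 rows, but scans the table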
If you know which columns are going to be a problem, you could therefore add a PERSISTED computed column to that table. Then you could use that column in your WHERE clause, but display the original one. Something like:
USE Sandbox;
--Create Sample table and data
CREATE TABLE Sample (String varchar(500));
INSERT INTO Sample
VALUES ('This is a string that does not contain either apostrophe'),
('Where as this string, isn''t without at least one'),
('’I have one of them as well’'),
('’Well, I''m going to use both’');
GO
--First attempt (without the column)
SELECT String
FROM Sample
WHERE String LIKE '%''%'; --Only returns 2 of the rows
GO
--Create a PERSISTED Column
ALTER TABLE Sample ADD StringRplc AS REPLACE(String,'’','''') PERSISTED;
GO
--Second attempt
SELECT String
FROM Sample
WHERE StringRplc LIKE '%''%'; --Returns 3 rows
GO
--Clean up
DROP TABLE Sample;
GO
The other answer is correct. There is no such collation. You can easily verify this with the below.
DECLARE @dynSql NVARCHAR(MAX) =
'SELECT * FROM (' +
(
SELECT SUBSTRING(
(
SELECT ' UNION ALL SELECT ''' + name + ''' AS name, IIF( NCHAR(0x0027) = NCHAR(0x2019) COLLATE ' + name + ', 1,0) AS Equal'
FROM sys.fn_helpcollations()
FOR XML PATH('')
), 12, 0 + 0x7fffffff)
)
+ ') t
ORDER BY Equal, name';
PRINT @dynSql;
EXEC (@dynSql);

Generate SQL insert statements of some records in a table

I am trying to auto-generate INSERT SQL statements from an existing table in SQL Server 2008, but I do not need all the records, only a small part of them, so I need to filter the generated inserts. Adding a WHERE clause when generating the INSERT statements would do the trick, but I do not know how to do it.
This article partly answers my question (the SSMS internal generator):
What is the best way to auto-generate INSERT statements for a SQL Server table?
But it exports all the data of a table. The generated insert scripts are not sorted, so I cannot easily filter out the rows I need (heavy manual work).
I also tried the stored procedure here (I also had to correct a part of the procedure to make it work with SQL Server 2008: replace char(255) with varchar, as explained here).
But it is still not working; I get the following error:
Msg 8169, Level 16, State 2, Line 6
Conversion failed when converting from a character string to uniqueidentifier.
Could you then give me the best way to auto-generate SQL INSERT statements in SQL Server 2008 from a portion of a table (thus not all the rows of the table)?
I found a way myself using Excel.
Make the needed query, including the WHERE clause, in SSMS
Select all the results
Copy with headers
Paste into the Excel file linked below, in the 4th row, 1st column
Change the output path in the macro
Change the table name in the cell
Launch the macro
--> take the generated file and you have a copy of your data ready to be inserted again
https://dl.dropboxusercontent.com/u/49776541/GenerateInsert.xlsm
You can use MERGE syntax to insert data into a table based on a specific condition. Using MERGE you can also delete and update data in the table, and you can do multiple operations in a single SQL statement.
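For illustration, a minimal MERGE sketch; the table and column names here are hypothetical, not taken from the question:
-- Insert rows that are missing in the target and update the ones that already exist.
MERGE dbo.TargetTable AS tgt
USING (SELECT colA, colB, colDate FROM dbo.SourceTable WHERE colDate = '2018-10-14') AS src
    ON tgt.colA = src.colA
WHEN MATCHED THEN
    UPDATE SET tgt.colB = src.colB, tgt.colDate = src.colDate
WHEN NOT MATCHED BY TARGET THEN
    INSERT (colA, colB, colDate) VALUES (src.colA, src.colB, src.colDate);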
There is an easier way to do this, other than going through all the fuss of an Excel sheet.
This will return all the data in a table, much like the GUI route where you right-click on the database, select “Tasks”, and then select “Generate scripts”.
However, unlike the GUI version or the “export to excel” version, with this line of code, you can specify a filter in a “WHERE” clause to return only items for a particular day, or range of days, or any other filter that would normally be used in a “WHERE” clause.
In the code below, I am using 2 simple tables. One is populated with data, the other is not. I want to transfer some or all of the data from table2 to table3. Again, I can filter by date or by parts of other columns (for example, WHERE colB LIKE 'ging%';).
This will generate a set of “INSERT” statements, pre-formed in a SQL query, ready to run.
Note: before running this, switch your output display in SQL Server from “Grid” to “Text”.
-- Build one INSERT statement per row; as in the original, the generated colDate is yesterday's date.
SELECT 'INSERT INTO TestTable3 (colA, colB, colDate) VALUES ('''
     + CAST(colA AS VARCHAR(10)) + ''', '''
     + CAST(colB AS VARCHAR(10)) + ''', '''
     + CONVERT(VARCHAR(10), CAST(DATEADD(DAY, -1, GETDATE()) AS DATE), 120) + ''');'
FROM TestTable2
WHERE colDate LIKE '2018-10-14';
GO
Here is a snippet of what this will return.
Simply copy/paste the results into a new query and run it.
Too easy.

Join query with user-provided data - T-SQL

I have a requirement where the user will provide many IDs (in the hundreds/thousands) in a text area in a VB.NET app, and I need to use these IDs in T-SQL (SQL Server) to get the data. I don't want to save these IDs in any database table. I just want to pass them using a parameter (of type varchar(max)) and use them in the procedure.
Actually, only read access is permitted for the VB application users. It is a SQL Server 2005 database. The ID field is at least 12 to 15 characters long. The user will copy/paste the data from another source, maybe a CSV or Excel file.
Any idea how I can achieve this?
Any help is appreciated.
Thanks
If you do not want to use Table Valued Parameters, as suggested elsewhere, and you don't want to store the IDs in a temporary table, you can do the following.
Assuming your IDs are integer values separated by commas in the parameter string, you can use the LIKE operator in your SQL statement's WHERE filter:
Say you have a parameter, @IDs, of type varchar(max). You want to get only those records from MyTable where the ID column contains a value that has been typed into the @IDs parameter. Then your query should look something like this:
SELECT * FROM MyTable
WHERE ',' + @IDs + ',' LIKE '%,' + CAST(ID AS varchar) + ',%'
Notice how I prepend and append an extra comma to the @IDs parameter. This is to ensure that the LIKE operator will still work as expected for the first and last ID in the string. Make sure to take the necessary precautions against SQL injection, for example by validating that users are only allowed to input integer digits and commas.
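As a quick illustration of why the wrapping commas matter (hypothetical values): without them, ID 1 would falsely match a list that only contains 11 and 12.
DECLARE @IDs varchar(max) = '11,12';
SELECT CASE WHEN @IDs LIKE '%' + CAST(1 AS varchar) + '%' THEN 'match' ELSE 'no match' END AS without_commas,          -- 'match' (wrong)
       CASE WHEN ',' + @IDs + ',' LIKE '%,' + CAST(1 AS varchar) + ',%' THEN 'match' ELSE 'no match' END AS with_commas; -- 'no match' (correct)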
Try using CHARINDEX:
DECLARE @selectedIDs varchar(max);
SET @selectedIDs = '1,2,3';
SET @selectedIDs = ',' + @selectedIDs + ','; -- the string should always start and end with ','
SELECT * FROM YourTable
WHERE CHARINDEX(',' + CAST(IDColumn AS varchar(15)) + ',', @selectedIDs) > 0;
