Convert a string containing single quotes to nvarchar - sql-server

I am working on an SQL query that uses multiple sub-queries with temp tables to prepare the data required by the final report query. An application that uses the DevExpress reporting engine generates a "where" clause automatically according to some pre-settings. The resulting filter looks something like this:
-- the part "where 1=1" is not generated; it is added here only for demonstration
where 1=1 AND (#inv IN(4)) AND (#date = convert(datetime, '2021-02-21 23:59:59:000', 121) )
I would like to convert the generated filter into an nvarchar for further string processing, but the single quotes in the convert function for the date always break any attempt to surround the statement with single quotes for escaping.
So my question is: given a statement like
AND (#para1 IN(4)) AND (#date = convert(datetime, '2021-02-21 23:59:59:000', 121) )
how could I treat it as a string of type nvarchar in SQL Server for further parsing,
knowing that I have no control over the code of the application that generates the filter string?
Edit 2:
Additionally, adding quotes is only allowed at the beginning and the end of the filter,
and the application that generates the filters is a compiled app. The app takes the original SQL query with a placeholder {0} and fills it with the generated filters. Unfortunately, the app does not allow a second placeholder {1}, so I am stuck passing the whole filter string through placeholder {0} only. So I am looking for something in SQL, or any idea, that could convert the generated filter string above into an nvarchar, so that I could write a function to extract the value of each filter and reuse it in my SQL query.
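Just to illustrate the escaping rule itself (not a way around the placeholder limitation): if the filter text could be edited, each embedded single quote would have to be doubled for the whole thing to live inside an N'...' literal. A minimal sketch:
-- embedded single quotes must be doubled to survive inside an N'...' literal
DECLARE @filter nvarchar(max) =
    N'AND (#para1 IN(4)) AND (#date = convert(datetime, ''2021-02-21 23:59:59:000'', 121) )';
-- once captured, the text can be parsed further, e.g. locating the date filter:
SELECT CHARINDEX('convert(datetime,', @filter) AS date_filter_position;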

Related

Snowflake - how to display column name using function?

I did not get much help from the Snowflake documentation about how I can set column names using Snowflake functions.
I have an automated report which does a calculation for the given dates. Here it is:
select
sum(case when logdate = to_date(dateadd('day', - 10, '2019-11-14')) then eng_fees + data_fees end) AS to_date(dateadd('day', - 10, '2019-11-14'))
from myTable
where logdate = '2019-11-04'
I am getting the following output for my column name:
to_date(dateadd('day', - 10, '2019-11-14'))
100
My expected output for my column name
2019-11-04
100
How can I print the expected date as the column name in Snowflake?
Your statement is using AS to name your column. Snowflake will treat that as a literal, not a calculation. In order to do what you're requesting inside a function, you'll need to use a JavaScript function, I think. This will allow you to dynamically build the SQL statement with your calculated column name predefined.
https://docs.snowflake.net/manuals/sql-reference/udf-js.html
You can't have dynamic column names without using external functionality (or TASK scheduling).
You can create a JavaScript Stored Procedure that generates a VIEW where the column names can be set by dynamic parameters / expressions.
The normal way of handling this is to use a reporting tool that can display your fixed column result set with dynamic headers or run dynamic SQL altogether.
I see two paths to getting close to what you want:
1) You can simply use UNION to display your headers as the first row and change the actual column aliases to 1...N, so that they indicate the column number. This is the fastest and cheapest option (see the sketch after this list).
2) You can use dynamic SQL to generate a query with your alias names filled in dynamically and then simply run it (you can use, for example, PIVOT to construct such a query).
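For option 1, a minimal sketch of the UNION approach, assuming the table and columns from the question (sort_key is a hypothetical helper column used only to keep the header row first):
SELECT 1 AS sort_key,
       TO_CHAR(DATEADD('day', -10, TO_DATE('2019-11-14'))) AS col1   -- header row: '2019-11-04'
UNION ALL
SELECT 2,
       TO_CHAR(SUM(CASE WHEN logdate = DATEADD('day', -10, TO_DATE('2019-11-14'))
                        THEN eng_fees + data_fees END))              -- data row as text
FROM myTable
WHERE logdate = '2019-11-04'
ORDER BY sort_key;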

Postgres: is there any row_to_json equivalent that returns values only?

In a project I'm working on, I need to stream potentially large data sets from a Postgres database to the client, for analytics purposes.
The application is built in Rails (irrelevant for this question) and after a bit of research I'm currently able to stream query results by using COPY in Postgres:
COPY (SELECT row_to_json(t) from (#{query}) t) TO STDOUT;
Sources (for those interested):
https://shift.infinite.red/fast-csv-report-generation-with-postgres-in-rails-d444d9b915ab
https://github.com/brianhempel/stream_json_demo
This works, but it yields every row as a key-value pair, e.g.:
["{\"id\":403457,\"email\":\"email403457#example.com\",\"first_name\":\"Firstname403457\",\"last_name\":\"Lastname403457\",\"source\":\"adwords\",\"created_at\":\"2015-08-05T22:43:07.295796\",\"updated_at\":\"2017-01-19T04:48:29.464051\"}"]
In the spirit of minimising the size (in bytes) of the response and especially since this is getting served through the web, I want to return just an array of values for every row, i.e.:
["[403457, \"email403457#example.com\", \"Firstname403457\", \"Lastname403457\", \"adwords\", \"2015-08-05T22:43:07.295796\", \"2017-01-19T04:48:29.464051\"]"]
Is there a way to achieve this within Postgres, even by nesting functions, starting from the query above?
You could create a simple SQL function that converts a row into the desired format:
CREATE FUNCTION row2json(anyelement) RETURNS json
LANGUAGE sql STABLE AS
'SELECT json_agg(z.value) FROM json_each(row_to_json($1)) z';
Then you use that to transform the output:
SELECT row2json(mytab) FROM mytab;
If performance is more important than JSON output, just cast the result to a string:
SELECT CAST(mytab AS text) FROM mytab;
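Tying this back to the streaming setup from the question, the function can be dropped straight into the COPY statement (a sketch, assuming the same #{query} interpolation as above):
COPY (SELECT row2json(t) FROM (#{query}) t) TO STDOUT;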

Use String parameter for RegEx in query

In my query (the database is SQL Server) I use a RegEx for a select command like this:
SELECT * FROM test WHERE id LIKE '1[2,3]'
(This query is tested and returns the data I want)
I want to use a parameter for this RegEx. For that I defined the parameter $P{id} in iReport as a string, and the value is "1[2,3]".
In my query I use now this parameter like this:
SELECT * FROM test WHERE id LIKE $P{id}
As a result I get a blank page. I think the problem is that the value of the parameter is defined with " ". But with ' ' I get a compiler error that the parameter isn't a string.
I hope someone can help me.
LIKE applies to text values, not to numeric values. Since id is numeric use something like this:
SELECT * FROM test WHERE id IN (12, 13)
with the parameter
SELECT * FROM test WHERE id IN ($P!{id_list})
and supply a comma separated list of ids for the parameter. The bang (!) makes sure that the parameter will be inserted as-is, without string delimiters.
Btw: LIKE (Transact-SQL) uses wildcards, not regex.
You can still use LIKE since there exists an implicit conversion from numeric types to text in T-SQL, but this will result in a (table or index) scan, whereas the IN clause can take advantage of indexes.
The accepted answer works, but it uses string replacement; read more about SQL injection to understand why this is not good practice.
The correct way to execute this IN query in jasper report (using prepared statement) is:
SELECT * FROM test WHERE $X{IN, id, id_list}
For more information on the use of NOT IN, BETWEEN, etc., see the JasperReports query sample reference.
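For context, a hedged sketch of what $X{IN} does (id_list is assumed to be a report parameter of a collection type, e.g. java.util.List, filled with the ids at report-fill time):
SELECT * FROM test WHERE $X{IN, id, id_list}
-- with id_list = [12, 13], JasperReports expands this to roughly:
--   SELECT * FROM test WHERE id IN (?, ?)
-- and binds the values as prepared-statement parameters;
-- if id_list is null or empty, the clause evaluates to an always-true condition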

Possible to replace digits in T-SQL

SQL Server 2008 (but have access to higher versions too)
I'm getting a string from another database on the same server. Using the code below I get some data and replace the content:
INSERT INTO [DestinationDatabase].[DBO].[Table](ID, XML)
SELECT ID, REPLACE(XML,'ReferenceID="1234"','PropertyID="2468"')
FROM [SourceDatabase].[DBO].[Customers]
This works as expected, but every record has a different ReferenceID. Is there a way to remove the current ReferenceID value, i.e. the 4 digits (there are around 1000 records with different values), and replace it with another 4-digit value?
I will get the replacement value from another procedure, but at this stage I need to know if it is possible to find and strip the 4 digits and replace them.
If you want to use the REPLACE function, you can do it like this:
REPLACE(XML,'ReferenceID="'+cast(table.field as nvarchar)+'"','ReferenceID="2468"')
REPLACE(XML,'ReferenceID="'+cast(table.field as nvarchar)+'"','ReferenceID="'+cast(table.another_field as nvarchar)+'"')
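If the current four digits are not known in advance, one possible approach is PATINDEX with STUFF (a sketch, assuming ReferenceID always holds exactly four digits and appears once per row; the literal '2468' stands in for the value supplied by the other procedure):
SELECT ID,
       STUFF(XML,
             PATINDEX('%ReferenceID="[0-9][0-9][0-9][0-9]"%', XML) + LEN('ReferenceID="'),
             4,
             '2468') AS NewXML
FROM [SourceDatabase].[DBO].[Customers]
-- assumes the pattern is always present; a PATINDEX result of 0 would need guarding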
You could use XML functions to do this, but it seems like your XML column is not of the xml data type. Is that correct?

SQL LIKE Operator doesn't work with Asian Languages (SQL Server 2008)

Dear Friends,
I've run into a problem I never thought I'd face. It seems simple, but I can't find a solution.
I have a SQL Server database column of type NVARCHAR that is filled with standard Persian characters. When I try to run a very simple query on it that uses the LIKE operator, the result set is empty, although I know the search term is present in the table. Here is the very simple example query which doesn't behave correctly:
SELECT * FROM T_Contacts WHERE C_ContactName LIKE '%ف%'
ف is a Persian character, and the ContactName column contains multiple entries containing that character.
Please tell me how I should rewrite the expression or what change I should apply. Note that my database's collation is SQL_Latin1_General_CP1_CI_AS.
Thank you very much
Also, if those values are stored as NVARCHAR (which I hope they are!!), you should always use the N'..' prefix for any string literals to make sure you don't get any unwanted conversions back to non-Unicode VARCHAR.
So you should be searching:
SELECT * FROM T_Contacts
WHERE C_ContactName COLLATE Persian_100_CI_AS LIKE N'%ف%'
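If adding COLLATE to every query becomes tedious, another option is to change the column's collation permanently. A hedged sketch (nvarchar(200) is a hypothetical placeholder; the real length and nullability must match the existing column definition, and no indexes or constraints may depend on the column):
ALTER TABLE T_Contacts
    ALTER COLUMN C_ContactName nvarchar(200) COLLATE Persian_100_CI_AS;  -- hypothetical length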
Shouldn't it be:
SELECT * FROM T_Contacts WHERE C_ContactName LIKE N'%ف%'
i.e., with the N in front of the comparison string, so it is treated as an nvarchar?
