DB2 insert failure when ' is present (even escaping doesn't work)

I am running an INSERT command on DB2 like the following:
insert into uinfo.transaction (TRANSACTION_ID, DATE,TIME,ID,USER,DESC) values
(14,20110311,36909,97,2497580,'Note:9045-02 2=34 ///' 2eq034d,xw d""::: 214l 23e;l2')
It gave an error; during SQL processing it returned:
SQL0103N The numeric literal "2034d" is not valid. SQLSTATE=42604
So, I tried escaping the ' as follows:
insert into uinfo.transaction (TRANSACTION_ID, DATE,TIME,ID,USER,DESC) values
(14,20110311,36909,97,2497580,'Note:9045-02 2=34 ///\' 2eq034d,xw d""::: 214l 23e;l2')
It still fails with the same error; during SQL processing it returned:
SQL0103N The numeric literal "2034d" is not valid. SQLSTATE=42604
Any idea what is wrong above, and how can I overcome it?

To include ' in a string literal you need to double it, e.g. 'ab''cd'.
For details, read the "Character string constants" section at http://publib.boulder.ibm.com/infocenter/db2luw/v9r8/index.jsp?topic=/com.ibm.db2.luw.sql.ref.doc/doc/r0000731.html.
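Applied to the statement from the question (table and column names taken as-is from the question), the insert becomes:
insert into uinfo.transaction (TRANSACTION_ID, DATE,TIME,ID,USER,DESC) values
(14,20110311,36909,97,2497580,'Note:9045-02 2=34 ///'' 2eq034d,xw d""::: 214l 23e;l2')
The backslash (\') is not an escape character in DB2 string constants, which is why the second attempt failed in exactly the same way; only the doubled quote ('') ends up as a literal ' in the stored value.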

Related

Snowflake JSON with foreign language to tabular format dynamically

I read through the Snowflake documentation and the web, and found only one solution to my problem, by https://stackoverflow.com/users/12756381/greg-pavlik, which can be found here: Snowflake JSON to tabular.
This doesn't work on data with Russian attribute names and attribute values. What modifications can be made for this to fit my case?
Here is an example:
create or replace table target_json_table(
v variant
);
INSERT INTO target_json_table SELECT parse_json('{
  "at": {
    "cf": "NV"
  },
  "pd": {
    "мо": "мо",
    "ä": "ä",
    "retailerName": "retailer",
    "productName": "product"
  }
}');
call create_view_over_json('target_json_table', 'V', 'MY_VIEW');
ERROR: Encountered an error while creating the view. SQL compilation error: syntax error line 7 at position 7 unexpected 'ä:'. syntax error line 8 at position 7 unexpected 'мо'.
There was a bug in the original SQL used as the basis for the stored procedure. I have corrected that; you can get the update on the GitHub page. The changed section is here:
sql =
`
SELECT DISTINCT '"' || array_to_string(split(f.path, '.'), '"."') || '"' AS path_name,  -- This generates paths with levels enclosed by double quotes (ex: "path"."to"."element"). It also strips any bracket-enclosed array element references (like "[0]")
       DECODE (substr(typeof(f.value),1,1),'A','ARRAY','B','BOOLEAN','I','FLOAT','D','FLOAT','STRING') AS attribute_type,  -- This generates column datatypes of ARRAY, BOOLEAN, FLOAT, and STRING only
       '"' || array_to_string(split(f.path, '.'), '.') || '"' AS alias_name  -- This generates column aliases based on the path
FROM
       #~TABLE_NAME~#,
       LATERAL FLATTEN(#~COL_NAME~#, RECURSIVE=>true) f
WHERE  TYPEOF(f.value) != 'OBJECT'
  AND  NOT contains(f.path, '[')  -- This prevents traversal down into arrays
LIMIT  ${ROW_SAMPLE_SIZE}
`;
Previously this SQL simply replaced non-ASCII characters with underscores. The updated SQL wraps key names in double quotes, so non-ASCII key names come through into the view intact.
Be sure that's what you want it to do. Also, the keys are nested. I decided that the best way to handle that is to create column names in the view with dot notation, for example one column name is pd.ä. That will require wrapping the column name with double quotes, such as:
select * from MY_VIEW where "pd.ä" = 'ä';
Final note: the name of your stored procedure is create_view_over_json; however, in the GitHub project the name is create_view_over_variant. When you update, be sure to call the right procedure.

How to escape a string with a single quote inside to do an insert?

I'm trying to insert N"Pepito Malasanya O'Dhogerty" from Python 3, but I cannot escape the single quote. The error from pyodbc is:
"Incorrect syntax".
I know that I have to double the quote (O''Dhogerty), but I tried everything I could find and couldn't fix it.
That's what I did so far:
string = string.replace("'", "\\'\\'")
UPDATED:
It didn't work because I was using 'N' in front of the string (because I have characters like "ª").
Now I have:
row[0] = row[0].replace("'","\'\'")
pyodbc.ProgrammingError: ('42S22', "[42S22] [Microsoft][ODBC SQL Server Driver][SQL Server]El nombre de columna 'Pepito Malasanya O''Dhogerty' no es válido. (207) (SQLExecDirectW)")
(The Spanish message says the column name 'Pepito Malasanya O''Dhogerty' is not valid.)
What am I doing wrong? I'm going to check parameterized query as you said.
SOLUTION:
I was calling the execute method with '{}'.format(...), but that doesn't work for inserts containing such special characters. I changed to cursor.execute("INSERT INTO table VALUES (?,?,?,?,?,?,?,?,?,?)", value1, value2, ...) and there was no problem at all.
Thanks anyway for the help.
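For completeness, if you ever do need to write the literal by hand instead of using parameters, the T-SQL form is a doubled quote inside an N'...' string (the table and column names below are made up for the example):
INSERT INTO dbo.People (FullName)  -- hypothetical table and column
VALUES (N'Pepito Malasanya O''Dhogerty');
The parameterized cursor.execute(...) call is still the better option, since the driver sends the value as a bound parameter and no manual quoting or N prefix is needed.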

SQL: manipulating strings

I'll try and make this clear...
Let's say I have a table with two columns: issue_number and issue_text. I need to grab two strings out of the issue_text column. The first string is something that can be hard-coded with CASE statements, since there are only so many types of issues that can be logged (note: I know this isn't the best way):
case
when issue_text like '%error%' then 'error'
else 'not found'
end as error_type
The issue_text is a string that will be formatted mostly the same way: it'll have an error, more info, then an incident number, and that is the end of the string.
e.g. "Can't add address. Ref Number: 9999999"
The problem I'm having is that the number will not always be the same number of characters away from the error message.
I was wondering if there is a way to access the substring that caused the match in the LIKE clause, something like another CASE statement using a regex (which I know isn't supported well in SQL):
case
when issue_text like '%[0-9 .]%' then (the substring match from like '%[0-9 .]%')
else 00000
end as issue_number
I am restricted to solving this issue and parsing these strings within SQL Server Management Studio; otherwise, yes, I'd use .NET or something else to do the heavy lifting.
Declare #YourTable table (ID int,issue_text varchar(150))
Insert Into #YourTable values
(1,'Can''t add address. Ref Number: 9999999'),
(2,'error')
Select ID
,Issue = Left(issue_text,PatIndex('%:%',issue_text+':')-1)
,IssueNo = substring(issue_text,PatIndex('%:%',issue_text+':')+2,25)
From #YourTable
Returns
ID  Issue                           IssueNo
1   Can't add address. Ref Number   9999999
2   error
If there's always a space just before the number, and the number is the last part of the string, you can do:
RIGHT(issue_text, CHARINDEX(' ', REVERSE(issue_text)) - 1)
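A quick sanity check against the @YourTable sample from the previous answer (the WHERE clause guards against rows with no space at all, where CHARINDEX returns 0 and the length argument would go negative):
Select ID
      ,IssueNo = RIGHT(issue_text, CHARINDEX(' ', REVERSE(issue_text)) - 1)
 From  @YourTable
 Where CHARINDEX(' ', issue_text) > 0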

Error in VB.NET: Conversion failed when converting from a character string to uniqueidentifier

I get a "Conversion failed when converting from a character string to uniqueidentifier" error.
I am using a String on the VB side and a GUID (uniqueidentifier) on the database side.
Is there an equivalent type that I can use on the VB side that works well with the "uniqueidentifier" data type in SQL Server?
If you already have your value as a string and since you are writing out your SQL by hand, you can CONVERT it like this:
strSql.Append("INSERT INTO tableName ")
strSql.Append("(GUID, ParentObsSetGUID, ChildObsSetGUID, ChildObsItemGUID) ")
strSql.Append(String.Format("VALUES (CONVERT(uniqueidentifier, '{0}'), CONVERT(uniqueidentifier, '{1}'), CONVERT(uniqueidentifier, '{2}'), CONVERT(uniqueidentifier, '{3}'))", parmList.ToArray))
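For reference, the conversion itself is easy to test in SSMS; the error from the question is what you get when the string is not a well-formed GUID (the GUID value below is just an example):
SELECT CONVERT(uniqueidentifier, '6F9619FF-8B86-D011-B42D-00C04FC964FF');  -- succeeds
SELECT CONVERT(uniqueidentifier, '');  -- fails: Conversion failed when converting from a character string to uniqueidentifier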
EDIT: If you have an empty string and you need a new Guid, then do this:
parmList.Add(Guid.NewGuid().ToString())
instead of
parmList.Add(String.Empty)
If you would rather insert a NULL into the GUID column, then you need to change the last bit of your code to be like this instead:
parmList.Add(dtNewGUID.Rows(0).Item(0).ToString)
parmList.Add(dtResultParentGUID.Rows(0).Item(0).ToString)
parmList.Add(dtResultChildGUID.Rows(0).Item(0).ToString)
' Remove the line with the empty-string parameter
strSql.Append("INSERT INTO tableName ")
strSql.Append("(GUID, ParentObsSetGUID, ChildObsSetGUID, ChildObsItemGUID) ")
strSql.Append(String.Format("VALUES (CONVERT(uniqueidentifier, '{0}'), CONVERT(uniqueidentifier, '{1}'), CONVERT(uniqueidentifier, '{2}'), NULL)", parmList.ToArray))
' Note the change at the end: '{3}' becomes NULL.
' Make sure you remove the single quotes around it.
NOTE: Your code as it stands (and this answer) is vulnerable to a SQL injection attack, but that's another matter. At least with this answer you know how to convert the string to a uniqueidentifier.

Building a dynamic query for SQL Server 2008 when the table name contains " ' "

I need to fetch a table's TOP_PK, IDENT_CURRENT, IDENT_INCR, and IDENT_SEED, for which I am building a dynamic query as below:
sGetSchemaCommand = String.Format("SELECT (SELECT TOP 1 [{0}] FROM [{1}]) AS TOP_PK, IDENT_CURRENT('[{1}]') AS CURRENT_IDENT, IDENT_INCR('[{1}]') AS IDENT_ICREMENT, IDENT_SEED('[{1}]') AS IDENT_SEED", pPrimaryKey, pTableName)
Here pPrimaryKey is the name of the table's primary key column and pTableName is the name of the table.
Now, I am facing a problem when the table name contains the " ' " character (for example, KIL'1).
When I use the above logic to build the query, it comes out as below:
SELECT (SELECT TOP 1 [ID] FROM [KIL'1]) AS TOP_PK, IDENT_CURRENT('[KIL'1]') AS CURRENT_IDENT, IDENT_INCR('[KIL'1]') AS IDENT_ICREMENT, IDENT_SEED('[KIL'1]') AS IDENT_SEED
When I execute the above query, I get the error below:
Incorrect syntax near '1'.
Unclosed quotation mark after the character string ') AS IDENT_SEED'.
So, can anyone please show me the best way to solve this problem?
Escape a single quote by doubling it: KIL'1 becomes KIL''1.
If a string already has adjacent single quotes, two becomes four, or four becomes eight... it can get a little hard to read, but it works :)
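A quick illustration of the nesting (the first statement uses the value directly, the second embeds the same value inside a dynamic-SQL string):
SELECT 'KIL''1';  -- returns: KIL'1
EXEC ('SELECT ''KIL''''1''');  -- the outer literal contains SELECT 'KIL''1', which also returns: KIL'1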
Using string methods from .NET, your statement could be:
sGetSchemaCommand = String.Format("SELECT (SELECT TOP 1 [{0}] FROM [{1}]) AS TOP_PK, IDENT_CURRENT('[{2}]') AS CURRENT_IDENT, IDENT_INCR('[{2}]') AS IDENT_ICREMENT, IDENT_SEED('[{2}]') AS IDENT_SEED", pPrimaryKey, pTableName, pTableName.Replace("'","''"))
EDIT:
Note that the string replace is now only on a new, third substitution string. (I've taken out the string replace for pPrimaryKey, and for the first occurrence of pTableName.) So now, single quotes are only doubled when they will be within other single quotes.
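With the table name KIL'1 from the question, the format string then produces a statement along these lines (the IDENT_ICREMENT alias is kept exactly as it appears in the question):
SELECT (SELECT TOP 1 [ID] FROM [KIL'1]) AS TOP_PK, IDENT_CURRENT('[KIL''1]') AS CURRENT_IDENT, IDENT_INCR('[KIL''1]') AS IDENT_ICREMENT, IDENT_SEED('[KIL''1]') AS IDENT_SEED
The bracket-delimited identifier [KIL'1] does not need its quote doubled; only the copies that sit inside the single-quoted arguments do, which is exactly what the third substitution handles.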
You need to replace every single quote with two single quotes: http://beyondrelational.com/modules/2/blogs/70/posts/10827/understanding-single-quotes.aspx
