varType usage in RODBC - sql-server

I am writing from R to a SQL Server table over ODBC via the RODBC package, specifically the function sqlSave. It seems that the default column type for this function is varchar(255). I tried to use the varTypes argument that is listed in the documentation, but it fails.
Here is the data frame called spikes20 with its class structure; this is what I am trying to save via sqlSave:
sapply(spikes20, class)
Date Day EWW PBR BAC CHTP FB SPY
"Date" "factor" "numeric" "numeric" "numeric" "numeric" "numeric" "numeric"
Here is the code which attempts to write to SQL Server:
require(RODBC)
varTypes = c(as.Date="Date")
channel <-odbcConnect("OptionsAnalytics", uid="me", pwd="you")
sqlSave (channel, spikes20, tablename = NULL, append=TRUE, rownames = FALSE, colnames = TRUE, safer = FALSE, addPK = FALSE, varTypes=varTypes )
The error message that I get says:
Warning messages:
In sqlSave(channel, spikes20, tablename = NULL, append = TRUE, rownames = FALSE, :
column(s) as.Date 'dat' are not in the names of 'varTypes'
I tried to change varTypes to:
varTypes=c(Date="Date")
then the error message becomes:
Error in sqlSave(channel, spikes20, tablename = NULL, append = TRUE, rownames = FALSE, :
[RODBC] Failed exec in Update
22007 241 [Microsoft][ODBC SQL Server Driver][SQL Server]Conversion failed when converting date and/or time from character string.
Any help will be appreciated. It seems I cannot figure out how to use varTypes correctly...

First, are you really trying to append to a table named NULL?
As far as issues with varTypes go, in my experience I have had to provide a mapping for all of the variables in the data frame, even though the documentation for the varTypes argument says:
"an optional named character vector giving the DBMSs datatypes to be used for
some (or all) of the columns if a table is to be created"
You need to make sure that the names of your varTypes vector are the column names and the values are the data types, as recommended here. So following their example you would have:
# look up the existing table's column names and their DBMS types
tmp <- sqlColumns(channel, correctTableName)
varTypes = as.character(tmp$TYPE_NAME)
names(varTypes) = as.character(tmp$COLUMN_NAME)

varTypes = c(somecolumn="datetime") works for me.
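Putting the two answers together for the original spikes20 frame, a minimal sketch (the target table name and the datetime mapping are assumptions, not taken from the question):

require(RODBC)
channel <- odbcConnect("OptionsAnalytics", uid="me", pwd="you")
# the names of varTypes must exactly match the data frame's column names
varTypes <- c(Date = "datetime")
# formatting the Date column as ISO yyyy-mm-dd text often avoids the
# "Conversion failed when converting date and/or time" error
spikes20$Date <- format(spikes20$Date, "%Y-%m-%d")
sqlSave(channel, spikes20, tablename = "spikes20", append = TRUE,
        rownames = FALSE, varTypes = varTypes)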

Related

Reporting WHICH field caused an error in the pyodbc call to SQL Server

I have a collection of python files, each of which uses a number of calls of the form:
sql_str = "exec SP_UPDATE;"
try:
cursor.execute(sql_str)
except pyodbc.OperationalError as error:
print("Connection issue. connection failure")
except pyodbc.ProgrammingError as error:
# Note this error is specific to a table and a row
log_error(cursor, __file__, '', 'SP_UPDATE', 0,
"Failure running update production table for SP_UPDATE"
"SP_UPDATE" +
"",
f'pyodbc error message: {error}')
The SPs all look similar, too:
CREATE PROCEDURE [].[SP_UPDATE]
    @id_something [nvarchar](500) NULL,
    @d_dt_start [date] NULL,
    @d_dt_end [date] NULL
    ...
AS
BEGIN
    something
END
Now the code works most of the time. But I'm concerned about the cases where it does NOT work, which are always due to bad data. This code fragment is called in a loop, so I can save the row number; that's helpful. But I would really like to know which column caused the problem and report that as well, because the real dataset has anywhere from 7 to over 200 columns. Is there a way to figure that out?
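SQL Server's conversion errors unfortunately don't name the offending column. One purely illustrative approach (the helper below and its arguments are hypothetical, not part of the original code) is to re-probe each value on its own after a failure, asking the server to convert it to the column's declared type:

import pyodbc

def find_bad_column(cursor, row, column_types):
    """row: dict of column name -> value;
    column_types: dict of column name -> SQL type, e.g. {'d_dt_start': 'date'}."""
    for col, value in row.items():
        try:
            # ask SQL Server to convert just this one value
            cursor.execute(f"SELECT CAST(? AS {column_types[col]})", value)
            cursor.fetchone()
        except pyodbc.Error:
            return col  # this column's value failed to convert
    return None  # no single-value conversion failure found

Calling find_bad_column from the ProgrammingError handler would let log_error report the column as well as the row.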

postgres insert string to numeric column - auto-typecast does not happen

There is a table test1 in a Postgres database whose first column, id, is of type integer.
We are using Spring Framework's JdbcTemplate to insert data as below:
Object[] params = {"978","tour"};
jdbcTemplate.update("insert into test1 values (?,?)", params);
But this gives the exception:
org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar [insert into test1 values (?,?)]; nested exception is org.postgresql.util.PSQLException: ERROR: column "id" is of type integer but expression is of type character varying
ERROR: column "id" is of type integer but expression is of type character varying
This works for an Oracle database through implicit type conversion, but Postgres does not seem to work that way.
Could this be an issue with postgres driver?
A workaround would be to cast explicitly:
insert into test1 values (?::numeric ,?)
But is there a better way to do the conversion? This does not seem like a good solution, since there are a lot of queries to be modified, and there can be other such casting issues too.
Is there some parameter that can be set at the DB level to perform an auto cast?
We found the answer here:
Storing json, jsonb, hstore, xml, enum, ipaddr, etc fails with "column "x" is of type json but expression is of type character varying"
A new connection property should be added:
String url = "jdbc:postgresql://localhost/test";
Properties props = new Properties();
props.setProperty("user","fred");
props.setProperty("password","secret");
props.setProperty("stringtype", "unspecified");
Connection conn = DriverManager.getConnection(url, props);
https://jdbc.postgresql.org/documentation/94/connect.html
"If stringtype is set to unspecified, parameters will be sent to the server as untyped values, and the server will attempt to infer an appropriate type. This is useful if you have an existing application that uses setString() to set parameters that are actually some other type, such as integers, and you are unable to change the application to use an appropriate method such as setInt()"
Yeah, drop the double quotes here:
Object[] params = {"978","tour"};
Becomes
Object[] params = {978,"tour"};
Alternatively do the casting as you mentioned.

JOOQ fails with PostgreSQL Custom Type as an Array: ERROR: malformed record literal

I have the following custom type on Postgres:
CREATE TYPE my_custom_type AS (
    field_a VARCHAR,
    field_b NUMERIC(10,3)
);
and the following table:
CREATE TABLE my_table
(
    COL1 VARCHAR(120) NOT NULL,
    CUSTOM_COLUMN my_custom_type,
    CUSTOM_COLUMN_ARRAY my_custom_type[]
);
Everything works fine when I use my custom type with JOOQ:
@Test
public void testWithoutArray() {
    MyTableRecord record = dsl.newRecord(MyTable.MY_TABLE);
    record.setCol1("My Col1");

    MyCustomType customType = new MyCustomType();
    customType.setFieldA("Field A Val");
    customType.setFieldB(BigDecimal.ONE);
    record.setCustomColumn(customType);
    record.store();
}
However, when I try to set some value in the field mapped to a custom type array, I have the following error:
@Test
public void testWithArray() {
    MyTableRecord record = dsl.newRecord(MyTable.MY_TABLE);
    record.setCol1("My Col1");

    MyCustomTypeRecord customType = new MyCustomTypeRecord();
    customType.setFieldA("Field A Val 1");
    customType.setFieldB(BigDecimal.ONE);

    MyCustomTypeRecord customType2 = new MyCustomTypeRecord();
    customType2.setFieldA("Field A Val 2");
    customType2.setFieldB(BigDecimal.TEN);

    record.setCustomColumnArray(new MyCustomTypeRecord[]{customType, customType2});
    record.store();
}
org.jooq.exception.DataAccessException: SQL [insert into "my_table" ("col1", "custom_column_array") values (?, ?::my_custom_type[]) returning "my_table"."col1"]; ERROR: malformed record literal: "my_custom_type"(Field A Val 1, 1)"
Detail: Missing left parenthesis.
at org.jooq.impl.Utils.translate(Utils.java:1553)
at org.jooq.impl.DefaultExecuteContext.sqlException(DefaultExecuteContext.java:571)
at org.jooq.impl.AbstractQuery.execute(AbstractQuery.java:347)
at org.jooq.impl.TableRecordImpl.storeInsert0(TableRecordImpl.java:176)
at org.jooq.impl.TableRecordImpl$1.operate(TableRecordImpl.java:142)
at org.jooq.impl.RecordDelegate.operate(RecordDelegate.java:123)
at org.jooq.impl.TableRecordImpl.storeInsert(TableRecordImpl.java:137)
at org.jooq.impl.UpdatableRecordImpl.store0(UpdatableRecordImpl.java:185)
at org.jooq.impl.UpdatableRecordImpl.access$000(UpdatableRecordImpl.java:85)
at org.jooq.impl.UpdatableRecordImpl$1.operate(UpdatableRecordImpl.java:135)
at org.jooq.impl.RecordDelegate.operate(RecordDelegate.java:123)
at org.jooq.impl.UpdatableRecordImpl.store(UpdatableRecordImpl.java:130)
at org.jooq.impl.UpdatableRecordImpl.store(UpdatableRecordImpl.java:123)
The query generated by jOOQ (from the debug log) is the following:
DEBUG [main] org.jooq.tools.LoggerListener#debug:255 - Executing query : insert into "my_table" ("col1", "custom_column_array") values (?, ?::my_custom_type[]) returning "my_table"."col1"
DEBUG [main] org.jooq.tools.LoggerListener#debug:255 - -> with bind values : insert into "my_table" ("col1", "custom_column_array") values ('My Col1', array[[UDT], [UDT]]) returning "my_table"."col1"
Am I missing some configuration or is it a bug?
Cheers
As stated in the relevant issue (https://github.com/jOOQ/jOOQ/issues/4162), this is a missing piece of support for this kind of PostgreSQL functionality. The answer given in the issue so far is:
Unfortunately, this is an area where we have to work around a couple of limitations of the PostgreSQL JDBC driver, which doesn't implement SQLData and other API (see also pgjdbc/pgjdbc#63).
Currently, jOOQ binds arrays and UDTs as strings. It seems that this particular combination is not yet supported. You will probably be able to work around this limitation by implementing your own custom data type Binding:
http://www.jooq.org/doc/latest/manual/code-generation/custom-data-type-bindings/
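As an illustration only (this workaround is not given in the issue thread), one interim option is to bypass the UDT binding entirely and hand PostgreSQL its composite-array literal as a plain string, reusing the ?::my_custom_type[] cast that jOOQ already generates:

// hand-built composite-array literal: {"(field_a,field_b)",...}
String literal = "{\"(Field A Val 1,1)\",\"(Field A Val 2,10)\"}";
dsl.execute(
    "insert into my_table (col1, custom_column_array) values (?, ?::my_custom_type[])",
    "My Col1",
    literal);

The quoting rules for composite literals are fiddly (values containing commas, parentheses, or quotes need escaping), so this only suits simple data.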

Sql Server: getting the names of the objects involved in errors [duplicate]

How do I correctly extract specific info from SQL Server error message number 547?
Info Required:
Table Name
Constraint Name
Column Name
Code:
Try
    ....
Catch ex As System.Data.SqlClient.SqlException
    If ex.Number = 547 Then
    End If
End Try
Sample message:
UPDATE statement conflicted with COLUMN CHECK constraint
'CK_Birthdate'. The conflict occurred in database 'Northwind', table
'Employees', column 'BirthDate'.
There is no straightforward way of getting these pieces of information separately; it all gets concatenated into the error message.
You can run select * from sys.messages where message_id = 547 to see the various language-specific formats of the message that you would need to deal with, then use regular expressions with capturing groups built around that information.
In addition to those queries, here's a PowerShell script which wraps the sys.messages queries:
http://blogs.msdn.com/b/buckwoody/archive/2009/04/30/and-the-winner-is-get-sql-server-error-messages-from-powershell.aspx
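For the English-language format of the sample message above, a minimal regex sketch (the group names, and the exception variable ex, are my own) could be:

using System.Text.RegularExpressions;

var m = Regex.Match(ex.Message,
    @"constraint '(?<constraint>[^']+)'\. The conflict occurred in database '(?<db>[^']+)', table '(?<table>[^']+)', column '(?<column>[^']+)'");
if (m.Success)
{
    var constraintName = m.Groups["constraint"].Value; // CK_Birthdate
    var tableName = m.Groups["table"].Value;           // Employees
    var columnName = m.Groups["column"].Value;         // BirthDate
}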
It's true there is no straightforward way to fix this, but I did this instead:
var str = sqlException.Message.ToString();
var strlist = str.Split(',', StringSplitOptions.RemoveEmptyEntries);
// the second comma-separated chunk contains: table "dbo.TableName"
var streplace = strlist[1];
streplace = streplace.Replace("table \"dbo.", "");
streplace = streplace.Replace("\"", ""); // this leaves just the table name
// split the PascalCase table name into separate words
streplace = string.Concat(streplace.Select(x => Char.IsUpper(x) ? " " + x : x.ToString())).TrimStart(' ');

Error with sqlSave

I'm fighting with sqlSave to add my matrix b, which looks like this:
Noinscr
88877799
45645687
23523521
45454545
to an SQL table.
so I run the following command:
sqlSave(channel, b, "[testsFelix].[dbo].[TREB]", append = TRUE,
rownames = FALSE, colnames = FALSE, safer = TRUE, fast = FALSE)
and I get the following error:
Error in sqlSave(channel, b, "[testsFelix].[dbo].[TREB]", append = TRUE, :
42S01 2714 [Microsoft][SQL Server Native Client 10.0][SQL Server]
There is already an object named 'TREB' in the database.
[RODBC] ERROR: Could not SQLExecDirect
'CREATE TABLE [testsFelix].[dbo].[TREB] ("Noinscr" int)'
Seeing that it would not append to the existing table, even though append = TRUE was set, I dropped my SQL table and ran the same code again.
I get the following error:
Error in sqlColumns(channel, tablename) :
‘[testsFelix].[dbo].[TREB]’: table not found on channel
So I'm confused: when I want to append, R says it can't because the table is already there, and when the table is not there, R says it can't insert because the table is not found. I went into SQL Server to check, and saw that R had created the table with the right column name (Noinscr), but the table is empty.
Please tell me what I am doing wrong.
Thank you
I had the same problem. What I realized is that by default sqlSave was creating the table in the default ('master') database. I launched the ODBC Data Source Administrator, changed the default database to the desired one, and it worked.
I found this post googling for a similar problem. The problem persisted after restarting R, as well as a system re-boot. I narrowed the problem down to the database, by opening a new connection to different database, and writing to that using sqlSave.
Weirdly, the problem with the original database was corrected by opening and closing it using R:
DBchannel <- odbcConnectAccess(access.file = "C:/myPath/Data.mdb")
odbcClose(DBchannel)
After doing this, the following test worked just fine:
require(RODBC)
dd <- data.frame('normal' = rnorm(100), 'uniform' = runif(100))
DBchannel <- odbcConnectAccess(access.file = "C:/myPath/Data.mdb")
sqlSave(DBchannel, dd, tablename='testtable')
odbcClose(DBchannel)
(which is nice, as my initial (non-)solution was to re-build the database)
I have struggled with the same issue as you. I can call odbcQuery to insert data line by line; however, my data.frame has tens of millions of lines, and inserting that way is too slow. If your data set is not large, you may try it.
The problem is that you wrote the tablename parameter as "[testsFelix].[dbo].[TREB]" when you have to write it as "[dbo].[TREB]", omitting the database.
You have to change the database of your ODBC channel to the one you are interested in, via the ODBC Data Source Administrator in Windows. Maybe the problem is that the default database is different from [testsFelix].
Therefore the solution I found to your problem was:
change the database of your channel to [testsFelix], through the ODBC Data Source Administrator
the tablename parameter in sqlSave does not expect the database name, therefore you have to write it with the [schema].[tablename] syntax
sqlSave(channel, b, "[dbo].[TREB]", append = TRUE,
rownames = FALSE, colnames = FALSE, safer = TRUE)
By the way, in my case it is faster to insert values in blocks of 1,000 observations. Try this trick:
vals = paste0("('", b$Field1 , "','",
b$Field2 , "','",
b$Field3 , "','",
b$lastField, "')", collapse = ",")
sqlQuery(channel,
query = paste0("INSERT INTO [testsFelix].[dbo].[TREB]
values", vals), as.is = TRUE)
Please try this
sqlSave(channel, b, "_b", append = TRUE,
rownames = FALSE, colnames = FALSE, safer = TRUE, fast = FALSE)
What I found is that Excel will add a "_" in front of the default table name; if you add this to the name, Excel will find the table.
You have to remove your brackets ([]), and then it should run fine.
