I'm fighting with sqlSave to add my matrix b, which looks like this:
Noinscr
88877799
45645687
23523521
45454545
to an SQL table.
So I run the following command:
sqlSave(channel, b, "[testsFelix].[dbo].[TREB]", append = TRUE,
rownames = FALSE, colnames = FALSE, safer = TRUE, fast = FALSE)
and I get the following error:
Error in sqlSave(channel, b, "[testsFelix].[dbo].[TREB]", append = TRUE, :
42S01 2714 [Microsoft][SQL Server Native Client 10.0][SQL Server]
There is already an object named 'TREB' in the database.
[RODBC] ERROR: Could not SQLExecDirect
'CREATE TABLE [testsFelix].[dbo].[TREB] ("Noinscr" int)'
Seeing that it didn't want to touch the existing table, even though append = TRUE was there, I tried dropping my SQL table myself and ran the same code again.
I get the following error:
Error in sqlColumns(channel, tablename) :
‘[testsFelix].[dbo].[TREB]’: table not found on channel
So I'm confused: when I want to append, R says it can't because the table is already there, and when the table is not there, R says it can't put info in it because the table is not there. I went into SQL Server to verify that nothing happened, but I saw that R had created the table with the right column name (Noinscr), although the table is empty.
Please tell me what I am doing wrong.
Thank you
I had the same problem. What I realized is that by default sqlSave was creating the table in the login's default database (master). I launched the ODBC Data Source Administrator, changed the default database to the one I wanted, and it worked.
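If changing the DSN is not an option, another way (just a hedged sketch; the driver name, server and authentication details below are placeholders) is to name the database directly in a DSN-less connection string with odbcDriverConnect:
library(RODBC)
# Hypothetical DSN-less connection that names the target database explicitly,
# so sqlSave works against testsFelix rather than the login's default database.
channel <- odbcDriverConnect(
  "driver={SQL Server};server=MYSERVER;database=testsFelix;trusted_connection=yes"
)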
I found this post googling for a similar problem. The problem persisted after restarting R, as well as a system re-boot. I narrowed the problem down to the database, by opening a new connection to different database, and writing to that using sqlSave.
Weirdly, the problem with the original database was corrected by opening and closing it using R:
DBchannel <- odbcConnectAccess(access.file = "C:/myPath/Data.mdb")
odbcClose(DBchannel)
After doing this, the following test worked just fine:
require(RODBC)
dd <- data.frame('normal' = rnorm(100), 'uniform' = runif(100))
DBchannel <- odbcConnectAccess(access.file = "C:/myPath/Data.mdb")
sqlSave(DBchannel, dd, tablename = 'testtable')
odbcClose(DBchannel)
(which is nice, as my initial (non-)solution was to re-build the database)
I have struggled with the same issue. I can call odbcQuery to insert the data line by line. However, my data.frame has tens of millions of rows, so inserting row by row is too slow. If your data set is not large, you may try it.
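For reference, a row-by-row version along those lines might look like the sketch below; the schema, table and column names are taken from the question and may need adjusting:
# Rough sketch: insert the single Noinscr column of b one row at a time.
# Fine for small data sets, far too slow for tens of millions of rows.
for (i in seq_len(nrow(b))) {
  odbcQuery(channel,
            paste0("INSERT INTO [dbo].[TREB] (Noinscr) VALUES (",
                   b[i, "Noinscr"], ")"))
}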
The problem is that you wrote the tablename parameter as "[testsFelix].[dbo].[TREB]" when you have to write it as "[dbo].[TREB]", omitting the database.
You also have to change the database of your ODBC channel to the one you are interested in, via the ODBC Data Source Administrator in Windows. Maybe the problem is that the default database is different from [testsFelix].
Therefore, the solution I had for your problem was:
change the database of your channel to [testsFelix], in Windows through the ODBC Data Source Administrator;
the tablename parameter in sqlSave does not expect the database, therefore you have to use the [schema].[tablename] syntax:
sqlSave(channel, b, "[dbo].[TREB]", append = TRUE,
rownames = FALSE, colnames = FALSE, safer = TRUE)
By the way, in my case it is faster to insert values in blocks of 1000 observations. Try this trick:
vals = paste0("('", b$Field1, "','",
              b$Field2, "','",
              b$Field3, "','",
              b$lastField, "')", collapse = ",")
sqlQuery(channel,
         query = paste0("INSERT INTO [testsFelix].[dbo].[TREB] VALUES ", vals),
         as.is = TRUE)
Please try this
sqlSave(channel, b, "_b", append = TRUE,
rownames = FALSE, colnames = FALSE, safer = TRUE, fast = FALSE)
What I found is that Excel will add a "_" in front of the default table name; if you add this prefix to the name yourself, Excel will find the table.
You have to remove your brackets ([]), and then it should run fine.
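For example, keeping the other arguments from the question (an untested sketch; whether the database prefix is needed at all depends on your channel's default database, as the other answers note):
# The same call as in the question, with the square brackets removed:
sqlSave(channel, b, tablename = "testsFelix.dbo.TREB", append = TRUE,
        rownames = FALSE, colnames = FALSE, safer = TRUE, fast = FALSE)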
I have a form that is linked via ODBC (MS SQL Server) and displays the result of a VIEW. When I try to delete a record, I'm invoking a function at the VBA level:
Private Sub Form_BeforeDelConfirm(Cancel As Integer, Response As Integer)
    If (IsNull(Text104.Value) = False) Then
        Dim deleteLabel As DAO.QueryDef
        Set deleteLabel = CurrentDb.CreateQueryDef("")
        deleteLabel.Connect = CurrentDb.TableDefs("KontrollWerte").Connect
        If (InStr(QuerySave, "KontrollWerteVIEW") <> 0) Then
            deleteLabel.SQL = "delete from KontrollWerte_Label where Kontrolle_Label_ID = " & Text104.Value
        End If
        Close
    End If
End Sub
Error shown:
[Microsoft][SQL Server Native Client 11.0][SQL Server] View or function 'dbo.KontrollwerteVIEW' is not updatable because the modification affects multiple base tables. (#4450)
This error is okay, as the view is a select with multiple tables.
It seems the function is not called at all and the "default" delete behavior of MS Access runs instead. Is there a way to tell MS Access not to do the default delete and instead execute my SQL statement inside the Form_BeforeDelConfirm function?
Thanks!
I tried to change when the function is called, but no luck.
You have to ensure that the view allows updates.
When you link a table, and it is a view?
During the creation of the table link you will see a prompt asking you to pick the unique record identifier for that "view/table".
First, we select the "view" when linking it, and note I also checked the Save password option.
Next comes the prompt to select the unique record identifier (the PK column).
DO NOT skip that prompt. If you don't select the PK column for that view, then it will be read-only.
FYI:
The prompt to save the password and the prompt to select the PK when linking to that view?
They DO NOT appear on re-link, ONLY when adding a new link!
So that's why I stated to delete the link to SQL Server and add it again (not just re-link and not just refresh the table link; adding it as if for the first time is the key instruction here).
So, yes, views are and can be updatable, but you MUST answer that prompt during the linking process in Access, or the result is a read-only table.
In SQL Server I have a function which generates a complex XML of all products, with several tables joined: location, suppliers, orders, etc.
No problem there; it runs in 68 seconds and produces around 450 MB.
It should only be called occasionally, during migration to another server, so it doesn't matter that it takes some time.
I want to make this available for download over webserver.
I've tried some variations of this in classic ASP:
Response.Buffer = false
set rs=conn.execute("select cast(dbo.exportXML() as varchar(max)) as res")
response.write rs("res")
But I just get a standard
An error occurred on the server when processing the URL. Please contact the system administrator.
If you are the system administrator please click here to find out more about this error.
Not my usual custom 500 error handler, so I'm not sure how to find the error.
The problem is in response.write rs("res"). If I just do
temp = rs("res")
the script runs, but displays nothing of course; if I then
response.write temp
I get the same failure.
So the problem is writing such a long string.
Can I save the file from T-SQL directly, and run the job periodically from SQL Agent?
I found that there seems to be a limit on how much data can be written at once using Response.Write. The workaround I used was to break the data into chunks like this:
Dim Data, Done
Done = False
Do While Not Done
    Data = RecordSet(0).GetChunk(8192)
    If Not Len(Data) = 0 Then
        Response.Write Data
    Else
        Done = True
    End If
Loop
Try this:
Response.ContentType = "text/xml"
rs.CursorLocation = 3
rs.Open "select cast(dbo.exportXML() as varchar(max)) as res",conn
'Persist the Recordset in XML format to the ASP Response object.
'The constant value for adPersistXML is 1.
rs.Save Response, 1
I am writing from R over ODBC to a SQL Server table via the RODBC package, specifically the function sqlSave. It seems that the default column type is varchar(255) for this function. I tried to use the varTypes argument that is listed in the documentation, but it fails.
Here is the data frame called spikes20 with its class structure; this is what I am trying to save via sqlSave:
sapply(spikes20, class)
Date Day EWW PBR BAC CHTP FB SPY
"Date" "factor" "numeric" "numeric" "numeric" "numeric" "numeric" "numeric"
Here is the code which attempts to write to the SQL Server
require(RODBC)
varTypes = c(as.Date="Date")
channel <-odbcConnect("OptionsAnalytics", uid="me", pwd="you")
sqlSave (channel, spikes20, tablename = NULL, append=TRUE, rownames = FALSE, colnames = TRUE, safer = FALSE, addPK = FALSE, varTypes=varTypes )
The error message that I get says:
Warning messages:
In sqlSave(channel, spikes20, tablename = NULL, append = TRUE, rownames = FALSE, :
column(s) as.Date 'dat' are not in the names of 'varTypes'
I tried to change the varType to:
varTypes=c(Date="Date")
then the error message becomes:
Error in sqlSave(channel, spikes20, tablename = NULL, append = TRUE, rownames = FALSE, :
[RODBC] Failed exec in Update
22007 241 [Microsoft][ODBC SQL Server Driver][SQL Server]Conversion failed when converting date and/or time from character string.
Any help will be appreciated. It seems I cannot figure out how to use varTypes correctly...
First, are you really trying to append to a table named NULL?
As far as issues with varTypes go, in my experience I have had to provide a mapping for all of the variables in the data frame, even though the documentation for the varTypes argument says:
"an optional named character vector giving the DBMSs datatypes to be used for
some (or all) of the columns if a table is to be created"
You need to make sure that the names of your varTypes vector are the column names and the values are the DBMS data types. So, following that approach, you would have:
tmp <- sqlColumns(channel, correctTableName)
varTypes = as.character(tmp$TYPE_NAME)
names(varTypes) = as.character(tmp$COLUMN_NAME)
varTypes = c(somecolumn="datetime") works for me.
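Applied to the spikes20 data frame from the question, that might look like the following sketch; the table name is illustrative, and "datetime" can be swapped for whatever SQL Server date type your driver supports:
# varTypes is only consulted when sqlSave has to create the table; the names must
# match the data frame's column names and the values must be DBMS types.
varTypes <- c(Date = "datetime")
sqlSave(channel, spikes20, tablename = "spikes20",
        rownames = FALSE, varTypes = varTypes)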
I'm trying to run the following UPDATE query from a python script (note I've removed the database info):
print 'Connecting to db for update query...'
db = pyodbc.connect('DRIVER={FreeTDS};SERVER=<removed>;DATABASE=<removed>;UID=<removed>;PWD=<removed>')
cursor = db.cursor()
print ' Executing SQL queries...'
for i in range(len(data)):
sql = '''
UPDATE product.sanction
SET action_summary = '{action_summary}'
WHERE sanction_id = {sanction_id};
'''.format(sanction_id=data[i][0], action_summary=data[i][1])
cursor.execute(sql)
cursor.close()
db.commit()
db.close()
However, it hangs indefinitely, no error.
I'm new to pyodbc, but it should be set up correctly considering I'm having no problems performing SELECT queries. I did have to call CAST for SELECT queries (I've cast sanction_id AS INT [an int identity on the database] and action_summary AS TEXT [nvarchar on the database]) to properly populate the data, so perhaps the problem lies somewhere there, but I don't know where to start debugging. Converting the text to NVARCHAR didn't do anything either.
Here's an example of one of the rows in data:
(2861357, 'Exclusion Program: NonProcurement; Excluding Agency: HHS; CT Code: Z; Exclusion Type: Prohibition/Restriction; SAM Number: S4MR3Q9FL;')
I was unable to find my issue, but I ended up using QuerySets rather than running an UPDATE query.
We are in the process of migrating our SQL 2000 box to SQL 2008, but we ran into an issue when a result set (with rows or not) is returned by a query that has a UNION. Later in the code we try to add a new row and assign its fields, but because a UNION was used, when we try to assign a value to a field we get a "Multiple-step operation generated errors. Check each status value." error. We tried the following code on Windows XP and Windows 7 and got the same result, but when we change our connection string to point back to our SQL 2000 box we don't get that error any more.
The following example show the problem we are having.
var c = new ADODB.Connection();
var cmd = new ADODB.Command();
var rs = new ADODB.Recordset();
object recordsAffected;
c.Open("Provider=SQLOLEDB;Server=*****;Database=*****;User Id=*****;Password=*****;");
cmd.ActiveConnection = c;
cmd.CommandType = ADODB.CommandTypeEnum.adCmdText;
cmd.CommandText = "create table testing2008 (id int)";
cmd.Execute(out recordsAffected);
try {
    cmd.CommandText = "select * from testing2008 union select * from testing2008";
    rs.CursorLocation = ADODB.CursorLocationEnum.adUseClient;
    rs.Open(cmd, Type.Missing, ADODB.CursorTypeEnum.adOpenDynamic, ADODB.LockTypeEnum.adLockBatchOptimistic, -1);
    rs.AddNew();
    rs.Fields["id"].Value = 0; //throws exception
    rs.Save();
}
catch (Exception ex) {
    MessageBox.Show(ex.ToString());
}
finally {
    cmd.CommandText = "drop table testing2008";
    cmd.Execute(out recordsAffected);
    c.Close();
}
The link below is an article that gives a great breakdown of the 6 scenarios in which this error message can occur:
Scenario 1 - Error occurs when trying to insert data into a database
Scenario 2 - Error occurs when trying to open an ADO connection
Scenario 3 - Error occurs inserting data into Access, where a fieldname has a space
Scenario 4 - Error occurs inserting data into Access, when using adLockBatchOptimistic
Scenario 5 - Error occurs inserting data into Access, when using Jet.OLEDB.3.51 or ODBC driver (not Jet.OLEDB.4.0)
Scenario 6 - Error occurs when using a Command object and Parameters
http://www.adopenstatic.com/faq/80040e21.asp
Hope it may help others that may be facing the same issue.
It is a type mismatch; try
rs.Fields["id"].Value = "0";
or make sure you assign a Variant to the value.
Since I posted this problem, we figured out that when you do a UNION the attributes on the fields are not bound (i.e. the attributes basecatalog, basetable & basecolumn are empty). To remedy our problem we had to force the values of those attributes by saving the recordset to XML (adPersistXML), changing the XML and reopening the recordset from the XML. This rebound the fields and we were able to continue. We know this may not be the most efficient solution, but it was for an older app and we didn't want to rewrite the SQL statements. It looks like the main error "Multiple-step operation generated errors. Check each status value." is raised when an error occurs while a value is being assigned to a field.
Two things I can think of... Make sure your "ID" column will accept a zero (0). Also - I've stopped this issue on one occasion by not using the adUseClient cursor (try server).
Many times this is a type mismatch, trying to stuff a NULL into a non-null column, or attempting to write more characters into a column than it's designed to take.
Hope this helps. - Freddo
The same issue occurred to me; the problem was that I violated an object property, in my case the size. The error came out as:
"IntegrationException: Problem (Multiple-step operation generated errors. Check each status value.)"
Imports ADODB
Dim _RecordSet As Recordset
_RecordSet.Fields.Append("Field_Name", DataTypeEnum.adVarChar, 50)
_RecordSet("Field_Name").Value = _RecordDetails.Field_NameValue
_RecordDetails.Field_NameValue was more than 50 characters long, so this property was violated, hence the error occurred.
I found another scenario:
It happened when I was trying to set the value of an adLongVarChar field in a new record for a memory-only ADODB.Recordset. In my case, the error was triggered because the string I was passing had a buried Unicode character.
I found this error when our legacy application was trying to parse a 1/1/0001 12AM date and time. It looks like VB6 recordsets don't like that value.
To get rid of the errors, I had to set all the offending dates to null.
I was getting this error when trying to insert/update a field with a value that did not match the field's type in the table.
For example, the field in the database table was
char(1)
however, I was trying to insert/update
"apple"
into the record.
Once I changed the input value to "a", it worked.