Update ADO Recordset for a Calculated Field - sql-server

I'm using the following sql for an ADO recordset against a SQLServer backend in VB6:
select c.name, taxid=
case when exists(select 1 from sometable where fld='abc')
then c.SSN else null end
When I try to update the taxid field in a row within the recordset locally, ADO complains with the error "Multiple-step operation generated errors. Check each status value." I assume it's bothered by the fact that taxid comes from a calculated expression rather than a raw table column. For my purposes I'm never going to persist these changes back to the database, so I'm looking for a way to tell ADO that I have no intent to persist changes, so that it will let me change the data locally.

I think that #HK1's suggestion is a good one, though I'm not sure what happens to your ability to alter values in the recordset, whether or not the column you're trying to update is computed. It's been a long time since I played with classic ADO, but if the recordset is disconnected it may become read-only at that point.
But if you have no interest in using the recordset to perform updates, and you need to alter values locally, perhaps you should consider storing the results in a local array first? That way you can minimize the locking and cursor options of the recordset, and immediately close the recordset to free up those resources.
rs.Open cmd, conn, adOpenForwardOnly, adLockReadOnly
Dim MyArray
MyArray = rs.GetRows()
rs.Close: set rs = nothing
Now you can manipulate MyArray however you want...
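For example - assuming, per the SELECT above, that column 0 is name and column 1 is taxid - GetRows hands back a zero-based 2-D Variant array indexed as (field, row):

```vba
Dim r As Long
' GetRows returns a Variant array dimensioned (field, row), both zero-based;
' UBound(MyArray, 2) is the index of the last row fetched.
For r = 0 To UBound(MyArray, 2)
    If IsNull(MyArray(1, r)) Then   ' column 1 = taxid in the SELECT list
        MyArray(1, r) = ""          ' purely local change; nothing is persisted
    End If
Next r
```

Note the (field, row) ordering - it trips people up, since it is the transpose of what you might expect.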

Related

Passing a table recordset to SQL Server

Good afternoon experts!
I have encountered a very different type of problem than usual. In the past I have passed a single line to the server via a pass-through query, and at times, when I need to pass more than a single record, I use a loop to send the data to the server multiple times. However, if I have over 40 lines of records, that loop takes a considerable amount of time to complete. I am just wondering if there is a way to send a table set to the server in one move, instead of X number of moves using a loop.
This is the code I am using on Access side attached to a button within the form (recordsource is a local access table):
Dim db As dao.Database
Dim rs As dao.Recordset
Dim qdf As dao.QueryDef
Dim strSQL As String
Set db = CurrentDb
Set rs = Me.Recordset
rs.MoveLast
rs.MoveFirst
Do While Not rs.EOF
Set qdf = db.QueryDefs("Qry_Send_ClientData") 'local pass through query
strSQL = "EXEC dbo.SP_Client_Referral @ClientID='" & Me.ClientID & "', @Note='" & Replace(Me.Txt_Note, "'", "") & "', @Value1='" & Txt_Value1 & "', @Value2='" & Txt_Value2 & "'"
qdf.SQL = strSQL
db.Execute "Qry_Send_ClientData"
rs.MoveNext
Loop
Msgbox "All Client Added!", , "Add client"
Now on the SQL Server side I have the following stored procedure (dbo.SP_Client_Referral) that receives the data from the pass-through query and inserts the row into a specific table:
CREATE PROCEDURE dbo.SP_Client_Referral
@ClientID AS NVARCHAR(15),
@Note As NVARCHAR(500),
@Value1 As NVARCHAR(50),
@Value2 As NVARCHAR(50)
AS
BEGIN
SET NOCOUNT ON;
BEGIN
INSERT INTO dbo.Client_Data(ClientID, Note, Value_1, Value_2)
SELECT @ClientID, @Note, @Value1, @Value2
END
END
For a single record, or even up to maybe 10 records, this method is relatively fast. However, as the number of records increases, the amount of time required can be quite long. If there is a way to pass a table (i.e. on the Access side using SELECT * FROM LocalTable) to SQL Server, as opposed to line by line, it would definitely save quite a lot of time. I'm just wondering if this method exists, and if so, how would I send a table, and what must I use on the SQL Server side in the SP to receive a table of records? Alternatively, I may have to continue using this single-line method and possibly make it more efficient so that it executes faster.
Many thanks in advance for your assistance!
Actually, the fastest approach?
Well, it is one that seems VERY counter-intuitive, and I could give a VERY long explanation as to why. However, try this; you'll find it runs 10 times or better than what you have now. In fact, it may well be closer to 100x what you have.
We shall assume that we have a standard linked table to dbo.Client_Data. Likely the link is named Client_Data, or even dbo_Client_Data.
So, use this:
Dim rs As DAO.Recordset
Dim rsIns As DAO.Recordset
If Me.Dirty = True Then Me.Dirty = False ' write any pending data
Set rsIns = CurrentDb.OpenRecordset("dbo_Client_Data", dbOpenDynaset, dbSeeChanges)
Set rs = Me.RecordsetClone
rs.MoveFirst
Do While Not rs.EOF
With rsIns
.AddNew
!ClientID = rs!ClientID
!Note = Me.Txt_Note
!Value_1 = Me.Txt_Value1
!Value_2 = Me.Txt_Value2
.Update
End With
rs.MoveNext
Loop
rsIns.Close
MsgBox "All Client Added!", , "Add client"
Note a number of bonuses in the above. Our code is clean - we do NOT have to worry about data types such as dates, or your messy quoting issue. If dates were involved, we could again just assign without having to worry about delimiters. We also get the bonus of injection protection to boot!
We also used Me.RecordsetClone. This is not a must-do. It will help performance, but MOST significant is that when you move the record pointer, the form's record position does not attempt to follow along. This will get rid of a lot of potential flicker. It can also eliminate HUGE issues if an on-current event exists on that form.
So, while a VERY good idea, RecordsetClone is not the main reason for the huge performance increase you will see here. RecordsetClone is the same as Me.Recordset, but you can "move" and traverse the recordset without the main form following.
So, really, the most "basic" code approach - one that would work, say, on Access without SQL Server - turns out to be the best approach. It is less code, less messy code, and it would have saved you all the trouble of setting up and building a SQL Server stored procedure. All of those concepts were not necessary, and worse, they cause a performance penalty. Try the above concept.
Access will bulk up and manage the multiple inserts as one go. The idea that SQL update/insert commands always beat recordsets is a REALLY HUGE urban myth that so many Access developers fall for. It is not true. What really matters is this: if you can replace a VBA loop of a huge number of separately executed updates with ONE single SQL update statement, then yes, you are miles ahead (use the one SQL update over a VBA loop).
However, if you have to do multiple operations and each operation is on a single row? Well, in place of "many separate" SQL updates - and this is the vast majority of cases - a recordset will run circles around a whole bunch of separate update/insert commands achieving the same goal. It's not even close; you get 10x, if not 100 times, better performance by using the above concepts.
You could try passing XML data to the stored procedure.
DECLARE @rowData XML
SELECT @rowData = '<data><record clientid="01" Notes="somenotes" Value1="val1" Value2="val2" /><record clientid="02" Notes="somenotes 2" Value1="val1-2" Value2="val2-2" /></data>'
SELECT X.custom.value('@clientid', 'varchar(max)'),
X.custom.value('@Notes', 'varchar(max)'),
X.custom.value('@Value1', 'varchar(max)'),
X.custom.value('@Value2', 'varchar(max)')
FROM @rowData.nodes('/data/record') X(custom)
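On the SQL Server side, the receiving stored procedure could then shred the XML and do the whole insert as one set-based statement. A sketch only, reusing the dbo.Client_Data table and column names from the question:

```sql
CREATE PROCEDURE dbo.SP_Client_Referral_Xml
    @rowData XML
AS
BEGIN
    SET NOCOUNT ON;
    -- One INSERT for the whole batch instead of one round trip per row
    INSERT INTO dbo.Client_Data (ClientID, Note, Value_1, Value_2)
    SELECT X.custom.value('@clientid', 'NVARCHAR(15)'),
           X.custom.value('@Notes',    'NVARCHAR(500)'),
           X.custom.value('@Value1',   'NVARCHAR(50)'),
           X.custom.value('@Value2',   'NVARCHAR(50)')
    FROM @rowData.nodes('/data/record') X(custom);
END
```

The Access side would then only need to build the XML string in its loop and call the procedure once.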

T-SQL - SELECT SCOPE_IDENTITY does not work in vba code

I am working on a project which consists of transferring a few thousand Excel rows to a SQL Server (its dialect is also called T-SQL, if I'm right?) database. I put together some logic in VBA to shape up the data.
Let me give you some context first. The data I'm about to transfer are invoice files. On each row there are codes of the stock items, prices, the invoice number, the invoice date, the name of the client, etc. These need to be transferred to the database of a proprietary ERP system.
There are two tables on the database which I'm interested in for now. First one holds the header data for the invoice (client data, date, invoice number, invoice total etc.). Second table holds the information on the stock items (what has been sold, how many and for how much money etc).
After each insert onto the first table, I have to get the inserted row's primary key, in order to insert rows to the second table, which requires the PK of the first table on each row.
Now, my approach was to use the SCOPE_IDENTITY() function of the T-SQL. When I try to do it on the database directly via SQL Server Management Studio, it works without a hitch.
But when I try to use it in the code, it returns an empty recordset.
Code I'm using is as follows:
Public Function Execute(query As String, Optional is_batch As Boolean = False) As ADODB.Recordset
If conn.State = 0 Then
OpenConnection
End If
Set rs = conn.Execute(query) 'this is the actual query to be executed
Dim identity As ADODB.Recordset 'this rs supposed to hold the PK of inserted row, but returns an empty recordset
Set identity = conn.Execute("SELECT SCOPE_IDENTITY();")
If TypeName(identity.Fields(0).Value) = "Null" Then
pInsertedId = -1
Else
pInsertedId = identity.Fields(0).Value 'I'm saving it in an object variable, to access it easily later on
End If
Set Execute = rs 'to be returned to the caller
'closing the connection is handled outside this procedure
End Function
When I run this in VBA, the second query, SELECT SCOPE_IDENTITY();, just returns an empty recordset. The same query works successfully when run on the db directly.
Actually I'm able to pull this off by other means. There is a UUID column which I'm supposed to insert to the row in the first table. I can just simply query the table with this UUID and get the PK, but I'm just curious why this won't work.
Any ideas?
Your code doesn't insert any data, so no identity values are generated in the current scope, as defined in the official documentation for SCOPE_IDENTITY():
Returns the last identity value inserted into an identity column in the same scope. A scope is a module: a stored procedure, trigger, function, or batch. Therefore, if two statements are in the same stored procedure, function, or batch, they are in the same scope.
Your code effectively is the same as inserting data in one query window in SSMS and querying SCOPE_IDENTITY() in another query window. Well, this isn't how it works. You must query it in the same scope, i.e. a stored procedure, trigger, function, or batch. Otherwise, use ID values generated by you and insert them with the data.
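A minimal sketch of the batch approach, adapted from the function in the question: run the INSERT and the SELECT SCOPE_IDENTITY() in one conn.Execute call, so both statements share a scope. This assumes query is a single INSERT statement; SET NOCOUNT ON suppresses the "rows affected" result so the SELECT comes back as the first recordset:

```vba
Dim identity As ADODB.Recordset
' INSERT and identity lookup in ONE batch = one scope
Set identity = conn.Execute( _
    "SET NOCOUNT ON; " & query & "; SELECT SCOPE_IDENTITY() AS NewId;")
If IsNull(identity.Fields(0).Value) Then
    pInsertedId = -1
Else
    pInsertedId = identity.Fields(0).Value
End If
```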

DAO to .mdb, ADO to .mdf comparison

This code editing a recordset based on joined tables works in DAO/.mdb database
RS.Edit
RS.fields("fieldA").value = 0 'in table A
RS.fields("fieldB").value = 0 ' in table B
RS.Update
The code was converted to ado on a sql server database and it failed with an error message:
Run-time error '-2147467259' (80004005)' :
Cannot insert or update columns from multiple tables.
However it appears to work if it is altered like so :
RS.fields("fieldA").value = 0 'in table A
RS.Update
RS.fields("fieldB").value = 0 ' in table B
RS.Update
Is this a normal way to do things with sql server or is there a gotcha to it.
I ask because, when trying to find a solution (before I put in the extra Update statement), I changed the recordset's lock type to batch optimistic, and I got no error message, but only one table's record was edited.
Apparently, the data source of your recordset is a SQL statement returning data from multiple tables. Yes, it's normal that you can only update one table at a time. If you want to update values from multiple tables in a single, atomic step (so that no other client can read the "intermediate" state, where one table is changed but the other is not), you need to use a transaction.
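With ADO that could look like the following sketch (conn is assumed to be the open Connection the recordset was created on):

```vba
On Error GoTo RollbackOut
conn.BeginTrans
RS.Fields("fieldA").Value = 0   ' in table A
RS.Update
RS.Fields("fieldB").Value = 0   ' in table B
RS.Update
conn.CommitTrans                ' both changes become visible together
Exit Sub
RollbackOut:
conn.RollbackTrans              ' neither change is kept on error
```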

How to get SQL Server CE TableAdapter to commit to database?

VS2008 project. Created local database (SSCE). Created dataset with tableadapter. Dragged dataset and tableadapter to the form so I can reference it in code.
New records successfully add to the dataset but will not commit back to the database. Gives no error or clue why it won't work.
The TableAdapter insert statement was created automatically and is parameterized (@p1, @p2, etc.), but I am trying to avoid those Parameter.Add statements, as I want to be able to use a field-by-field format without having to essentially repeat the schema of my database in code. Command objects and INSERT statements work fine, but then you always have to construct an INSERT statement - a pain if they're complicated. I want something as simple as working with an ADO recordset, but in .NET.
I know the last statement with the update is wrong without the parameters, but looking for an alternative method.
What can I do to accomplish the following without parameter.add statements?
DocsTableAdapter1.Fill(Documents1.Docs)
Debug.Print("Starting Row Count is: " & Documents1.Docs.Count.ToString)
Dim dr As DataRow = Documents1.Docs.NewRow
dr("Name") = "John Smith"
dr("Reference") = "My new reference code"
Documents1.Docs.Rows.Add(dr)
Debug.Print("New Row Count is: " & Documents1.Docs.Count.ToString)
DocsTableAdapter1.Update(Documents1.Docs)

Inserting NULL in an nvarchar fails in MSAccess

I'm experiencing something a bit strange.
I have a table on SQL Server 2008, say StockEvent that contains a Description field defined as nVarchar(MAX).
The field is set to be Nullable, has no default value and no index on it.
That table is linked into an Access 2007 application, but if I explicitly insert a NULL into the field, I'm systematically getting:
Run-time Error '3155' ODBC--insert on a linked table 'StockEvent' failed.
So the following bits of code in Access both reproduce the error:
Public Sub testinsertDAO()
Dim db As DAO.Database
Dim rs As DAO.Recordset
Set db = CurrentDb
Set rs = db.OpenRecordset("StockEvent", _
dbOpenDynaset, _
dbSeeChanges + dbFailOnError)
rs.AddNew
rs!Description = Null
rs.Update
rs.Close
Set rs = Nothing
Set db = Nothing
End Sub
Public Sub testinsertSQL()
Dim db As DAO.Database
Set db = CurrentDb
db.Execute "INSERT INTO StockEvent (Description) VALUES (NULL);", _
dbSeeChanges
Set db = Nothing
End Sub
However, if I do the same thing from the SQL Server Management Studio, I get no error and the record is correctly inserted:
INSERT INTO StockEvent (Description) VALUES (NULL);
It doesn't appear to be machine-specific: I tried on 3 different SQL Server installations and 2 different PCs and the results are consistent.
I initially though that the problem may be in my Access application somewhere, but I isolated the code above into its own Access database, with that unique table linked to it and the results are consistent.
So, is there some known issue with Access, or ODBC and inserting NULL values to nvarchar fields?
Update.
Thanks for the answers so far.
Still no luck understanding why though ;-(
I tried with an even smaller set of assumptions: I created a new database in SQL Server with a single table StockEvent defined as such:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[StockEvent](
[ID] [int] IDENTITY(1,1) NOT NULL,
[Description] [nvarchar](max) NULL
) ON [PRIMARY]
GO
Then linked that table though ODBC into the test Access 2007 application.
That application contains no forms, nothing except the exact 2 subroutines above.
If I click on the linked table, I can edit data and add new records in datasheet mode.
Works fine.
If I try any of the 2 subs to insert a record, they fail with the 3155 error message.
(The table is closed and not referenced anywhere else and the edit datasheet is closed.)
If I try the SQL insert query in SQL Server Management Studio, it works fine.
Now for the interesting bit:
It seems that anything as big as or bigger than nvarchar(256), including nvarchar(MAX), will fail.
Anything at or below nvarchar(255) works.
It's as if Access were considering nvarchar a simple string, and not a memo, if its size is larger than 255.
Even stranger is that varchar(MAX) (without the n) actually works!
What I find annoying is that Microsoft's own converter from Access to SQL Server 2008 converts Memo fields into nvarchar(MAX), so I would expect this to work.
The problem now is that I need nvarchar as I'm dealing with Unicode...
OK, I may have found a related answer: Ms Access linking table with nvarchar(max).
I tried using the standard SQL Server driver instead of the SQL Server Native Client driver and nvarchar(MAX) works as expected with that older driver.
It really annoys me that this seems to be a long-standing, unfixed, bug.
There is no valid reason why nvarchar should be erroneously interpreted as a string by one driver and as a memo when using another.
In both cases, they appear as memo when looking at the data type under the table design view in Access.
If someone has any more information, please leave it on this page. I'm sure others will be glad to find it.
That should be legal syntax. Is it possible that the field you are trying to give a null value is linked to other fields that don't allow null values?
Potential concurrency problem... Is the record open by another instance of Access on the same or a different machine, or does a form bound to the table have the record open in the same instance of Access on the same machine?
Renaud, try putting something in one of the other fields when you do the insert.
Also, try inserting an empty string ("") instead of a null.
Renaud,
Did you try running a SQL Profiler trace? If you look at the Errors and Warnings category it should kick out an error if your insert failed as a result of a SQL Server constraint.
If you don't see any errors, you can safely assume that the problem is in your application.
Also, are you sure you're actually connected to SQL Server? Is CurrentDB not the same variable you're using in your Access test loop?
I've got another issue (here is my post: link text).
In some very rare cases an error arises when saving a row with a changed memo field - the same construct explained in my former post, but driving SQL 2000 servers and the appropriate ODBC driver (SQL SERVER).
The only weird fix is to expand the table structure on the SQL Server side with a column of datatype [timestamp] and refresh the ODBC links. That works and releases the show-stopper in this column on this one row...
Maybe this info can help someone - for me it's history, as I'm moving on to ODBC with SQL 2008 and changing the datatypes from [text] to [varchar(max)].