Good afternoon experts!
I have encountered a rather different type of problem than usual. In the past I have passed a single record to the server via a pass-through query, and when I need to pass more than one record, I use a loop to send the data to the server multiple times. However, if I have over 40 records, that loop takes a considerable amount of time to complete. I am wondering if there is a way to send a whole table to the server in one move instead of X number of moves in a loop.
This is the code I am using on the Access side, attached to a button on the form (the record source is a local Access table):
Dim db As DAO.Database
Dim rs As DAO.Recordset
Dim qdf As DAO.QueryDef
Dim strSQL As String

Set db = CurrentDb
Set rs = Me.Recordset
rs.MoveLast
rs.MoveFirst
Do While Not rs.EOF
    Set qdf = db.QueryDefs("Qry_Send_ClientData") 'local pass-through query
    strSQL = "EXEC dbo.SP_Client_Referral @ClientID='" & Me.ClientID & _
             "', @Note='" & Replace(Me.Txt_Note, "'", "") & _
             "', @Value1='" & Txt_Value1 & "', @Value2='" & Txt_Value2 & "'"
    qdf.SQL = strSQL
    db.Execute "Qry_Send_ClientData"
    rs.MoveNext
Loop
MsgBox "All Clients Added!", , "Add client"
Now on the SQL Server side I have the following stored procedure (dbo.SP_Client_Referral) that receives the data from the pass-through query and inserts the record into a specific table:
CREATE PROCEDURE dbo.SP_Client_Referral
    @ClientID AS NVARCHAR(15),
    @Note     AS NVARCHAR(500),
    @Value1   AS NVARCHAR(50),
    @Value2   AS NVARCHAR(50)
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.Client_Data (ClientID, Note, Value_1, Value_2)
    SELECT @ClientID, @Note, @Value1, @Value2
END
For a single record, or even up to maybe 10 records, this method is relatively fast. However, as the number of records increases, the amount of time required can be quite long. If there is a way to pass a table (i.e. on the Access side using SELECT * FROM LocalTable) to SQL Server, as opposed to line by line, it would definitely save quite a lot of time. I am wondering if this method exists and, if so, how I would send a table and what I must use on the SQL Server side in the SP to receive a table of records. Alternatively, I may have to continue using this single-record method and make it more efficient so that it executes faster.
Many thanks in advance for your assistance!
Actually, the fastest approach?
Well, it is one that seems VERY counter-intuitive, and I could give a VERY long explanation as to why. However, try this; you will find it runs 10 times faster or better than what you have now. In fact, it may well be closer to 100x.
We shall assume that we have a standard linked table to dbo.Client_Data. The link is likely named Client_Data, or even dbo_Client_Data.
So, use this:
Dim rs As DAO.Recordset
Dim rsIns As DAO.Recordset

If Me.Dirty = True Then Me.Dirty = False ' write any pending data

Set rsIns = CurrentDb.OpenRecordset("dbo_Client_Data", dbOpenDynaset, dbSeeChanges)
Set rs = Me.RecordsetClone
rs.MoveFirst
Do While Not rs.EOF
    With rsIns
        .AddNew
        !ClientID = rs!ClientID
        !Note = Me.Txt_Note
        !Value_1 = Me.Txt_Value1
        !Value_2 = Me.Txt_Value2
        .Update
    End With
    rs.MoveNext
Loop
rsIns.Close
MsgBox "All Clients Added!", , "Add client"
Note a number of bonuses in the above. Our code is clean - we do NOT have to worry about data types such as dates, or your messy quotes issue. If dates were involved, we could again just assign without worrying about delimiters. We also get injection protection to boot!
We also used Me.RecordsetClone. This is not a must-do. It will help performance, but MOST significant is that when you move the record pointer, the form's record position does not attempt to follow along. This gets rid of a lot of potential flicker. It can also eliminate HUGE issues if an On Current event exists on that form.
So, while a VERY good idea, RecordsetClone is not the main reason for the huge performance increase you will see here. RecordsetClone is the same as Me.Recordset, but you can "move" and traverse the recordset without the main form following.
So, really, the most "basic" code approach - one that would work in plain Access without SQL Server - turns out to be the best approach. It is less code, less messy code, and it saves you all the trouble of setting up and building a SQL Server stored procedure. All your concepts were not necessary, and worse, they cause a performance penalty. Try the above concept.
Access will bulk up and manage the multiple inserts in one go. The idea that SQL update/insert commands always beat recordsets is a REALLY HUGE urban myth that many Access developers fall for. It is not true. The REAL win is when you can replace a VBA loop of many separately executed updates with ONE single SQL update statement - then yes, you are miles ahead (one SQL update over a VBA loop).
However, if you have to do multiple operations and each operation is on a single row? Then, in place of "many separate" SQL updates - and this is the vast majority of cases - a recordset will run circles around a whole bunch of separate update/insert commands that achieve the same goal. It's not even close: you get 10x if not 100x better performance by using the above concepts.
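As a side note: if the extra columns all come from the form (the same values for every row), you can even collapse the whole loop into one append query against the linked table, so Access sends the batch in a single statement. A sketch only - the local table name LocalTable is an assumption, and the doubled single quotes guard against apostrophes in the note:

```vba
' Sketch: one append query replaces the whole VBA loop.
' Assumes a linked table dbo_Client_Data and a local table
' LocalTable holding the ClientID rows to send.
Dim strSQL As String
strSQL = "INSERT INTO dbo_Client_Data (ClientID, Note, Value_1, Value_2) " & _
         "SELECT ClientID, '" & Replace(Me.Txt_Note, "'", "''") & "', '" & _
         Me.Txt_Value1 & "', '" & Me.Txt_Value2 & "' FROM LocalTable"
CurrentDb.Execute strSQL, dbFailOnError
```

This only works when the per-row values can be expressed in one SELECT; when each row needs its own logic, the recordset loop above remains the better fit.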
You could try passing XML data to the stored procedure.
DECLARE @rowData XML
SELECT @rowData = '<data><record clientid="01" Notes="somenotes" Value1="val1" Value2="val2" /><record clientid="02" Notes="somenotes 2" Value1="val1-2" Value2="val2-2" /></data>'

SELECT X.custom.value('@clientid', 'varchar(max)'),
       X.custom.value('@Notes', 'varchar(max)'),
       X.custom.value('@Value1', 'varchar(max)'),
       X.custom.value('@Value2', 'varchar(max)')
FROM @rowData.nodes('/data/record') X(custom)
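To make that a complete round trip, the stored procedure could accept the XML as a parameter and insert the shredded rows directly. A sketch only - the procedure name and column types are assumptions based on the question's table:

```sql
CREATE PROCEDURE dbo.SP_Client_Referral_Bulk
    @rowData XML
AS
BEGIN
    SET NOCOUNT ON;
    -- Shred every <record> attribute set into one INSERT
    INSERT INTO dbo.Client_Data (ClientID, Note, Value_1, Value_2)
    SELECT X.custom.value('@clientid', 'nvarchar(15)'),
           X.custom.value('@Notes',    'nvarchar(500)'),
           X.custom.value('@Value1',   'nvarchar(50)'),
           X.custom.value('@Value2',   'nvarchar(50)')
    FROM @rowData.nodes('/data/record') X(custom);
END
```

On the Access side you would build the XML string in one VBA loop over the local recordset and send a single `EXEC dbo.SP_Client_Referral_Bulk @rowData = '...'` through the pass-through query, so only one round trip hits the server.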
First question here, so hoping that someone can help!
I'm doing a lot of conversions of Access back ends onto SQL Server, keeping the front end in Access.
I have come across something that I need a little help with.
In Access, I have a query that is using a user-defined function in order to amalgamate some data from rows in a table into one variable. (By opening a recordset and enumerating through, adding to a variable each time.)
For example:
The query has a field that calls the function like this:
ProductNames: Product(ContractID)
The VBA function "Product()" searches a table based on the ContractID, cycles through each row it finds, and concatenates the results of one field into one variable, which is ultimately returned to the query.
Obviously, moving this query to SQL Server as a view means that the function will not be found, as it's in Access.
Can I use a function or stored procedure in order to do the same thing? (I have never used them before)
I must stress that I cannot create, alter or drop tables at run-time due to very strict production environment security.
If someone could give me an example I'd be really grateful.
So I need to be able to call it from the view as shown above.
Let's say the table I'm looking at for the data is called tbl_Products and it has 2 columns:
| ContractID | Product |
How would that be done?! Any help massively appreciated!
Andy
Yes, you can most certainly do the same thing and adopt the same approach in SQL Server as you did in the past with VBA + SQL.
The easy solution would be to link to the view and then build a local query that adds the additional column. However, often for reasons of performance, and simply for converting SQL from Access to T-SQL, I "duplicate" those VBA functions as T-SQL functions.
The beauty of this approach is that once you make such a function, it goes a "long" way towards easily converting some of your Access SQL to T-SQL and views.
I had a GST calculation function in VBA to which you would pass the amount and a date (because the GST rate changes at known dates, past or future). So I used this function all over the place in my Access SQL.
When I had to convert to SQL Server, I was able to use views and pass-through queries from Access with "very" similar SQL, and include that function in the SQL just like I did in Access.
You need to create what is called a SQL function. This kind of function is often called a scalar function, and it works just like a function in VBA. You can use it in a T-SQL stored procedure, or even as an expression in your SQL, just like in Access!
In your example, let's assume that you have some contract table and you want to grab the "status" column (we assume text), and there could be one, several, or no matching rows. So we will concatenate each child record's "status" code based on the contract ID.
You can fire up SSMS and expand your database in the tree view. Now expand "Programmability", then "Functions". You will see "Scalar-valued Functions". These functions are just like VBA functions: once created, you can use the function in T-SQL code, or in views, etc.
At this point, you can write T-SQL code in place of VBA code.
And really, you don't have to expand the tree above - but it will allow you to find, see, and change the functions you create. Once created, ANY SQL or code for that database can use the function as an expression, just like you did in Access.
This code should do the trick:
CREATE FUNCTION [dbo].[ContractStatus]
    (@ContractID int)
RETURNS varchar(255)
AS
BEGIN
    -- Declare a cursor (recordset)
    DECLARE @tmpStatus varchar(25)
    DECLARE @MyResult varchar(255)
    SET @MyResult = ''

    DECLARE rst CURSOR
    FOR SELECT Status FROM tblContracts WHERE ID = @ContractID

    OPEN rst
    FETCH NEXT FROM rst INTO @tmpStatus
    WHILE @@FETCH_STATUS = 0
    BEGIN
        IF @MyResult <> ''
            SET @MyResult = @MyResult + ','
        SET @MyResult = @MyResult + @tmpStatus
        FETCH NEXT FROM rst INTO @tmpStatus
    END
    CLOSE rst
    DEALLOCATE rst

    -- Return the result of the function
    RETURN @MyResult
END
Now, in SQL, you can go:
SELECT ContractName, ID, dbo.ContractStatus([ID]) AS MyStatus FROM tblContracts
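On SQL Server 2017 and later you can skip the cursor entirely: STRING_AGG does the same concatenation set-based and is usually much faster. A sketch against the same assumed table and columns:

```sql
-- Set-based version of the cursor function above (SQL Server 2017+).
-- On older versions the FOR XML PATH('') trick achieves the same result.
CREATE FUNCTION dbo.ContractStatusAgg (@ContractID int)
RETURNS varchar(255)
AS
BEGIN
    RETURN (SELECT STRING_AGG(Status, ',')
            FROM tblContracts
            WHERE ID = @ContractID);
END
```

It is called exactly the same way in a view or query: `dbo.ContractStatusAgg([ID])`.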
I am using SQL Server Express 2016 and Excel VBA to generate unique lot numbers for a bunch of different Excel documents. Currently, I create a new row, then a script in SQL Server increments the lot number, and then I run a SELECT statement to grab the field from the row that was just created. So far I have not had any issues; however, I am concerned that if the Excel file is used by different users at the same time, the SELECT query for one user may grab the row that another user created. Is this a concern? If so, how do I avoid it?
statement = "INSERT INTO LotInfo (RefNum, DateCreated, UserName) VALUES ('" & RefNum _
            & "','" & DateCreated & "','" & user & "')"
conn.Execute statement
Set lot = conn.Execute("SELECT TOP 1 Lot FROM LotInfo ORDER BY ID DESC;")
I don't believe that SCOPE_IDENTITY() will work.
The statements are entirely separate: first you do the INSERT, and that ends; then you send the SELECT. While I'm not 100% familiar with VBA, I'm not sure the second statement will know anything about the result of the first insert.
I would suggest you create a stored procedure which you can call from VBA.
The procedure would perform the insert and then return the lot number.
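A minimal sketch of such a procedure, assuming Lot is an IDENTITY (or otherwise server-generated) column on LotInfo and the column types shown are guesses. The OUTPUT clause returns only the value generated by this insert, so concurrent users cannot grab each other's rows:

```sql
CREATE PROCEDURE dbo.usp_CreateLot
    @RefNum      varchar(50),
    @DateCreated datetime,
    @UserName    varchar(50)
AS
BEGIN
    SET NOCOUNT ON;
    -- OUTPUT returns the lot number created by THIS statement only,
    -- regardless of what other sessions insert at the same time.
    INSERT INTO dbo.LotInfo (RefNum, DateCreated, UserName)
    OUTPUT inserted.Lot
    VALUES (@RefNum, @DateCreated, @UserName);
END
```

From VBA the call becomes a single round trip: `Set lot = conn.Execute("EXEC dbo.usp_CreateLot '" & RefNum & "','" & DateCreated & "','" & user & "'")`, and `lot.Fields(0)` holds the new number.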
I have a stored procedure that is called very often, as it is used to retrieve an account statement. The stored procedure takes around 10 ms in a query window in SSMS and generally works well, but SOMETIMES decides to time out (timeout set to 120 sec) in my VB6 application. The SP joins tables between 2 databases, one containing the current transactions (DB #1) and the other containing archived transactions (DB #2). Using sp_who2, no SPID seems to be hogging or blocking the system.
This is the SQL string I build:
strSQL = "DECLARE @rtnRecs int; " & _
         "EXEC spA_StatementData " & _
         "@sAccountNr = '123abc', " & _
         "@bIncludeHistory = 1, " & _
         "@bShowAllTransactions = 1, " & _
         "@iValidRecords = @rtnRecs OUTPUT"
The method I use in VB6 is:
rs.Open strSQL, con, adOpenStatic
where rs is the ADODB.Recordset and con is a connection to the database.
This code works well for a long while, say 2 months, and is used by several operators. It then suddenly, for no apparent reason, stops working - but still works fine in SSMS.
I am emphasizing VB6 as that's where the problem first appeared, but the same thing is happening in my VB.net code.
One thing of note is that the @bIncludeHistory parameter is the condition that enables the JOIN to the archive database (DB #2). When @bIncludeHistory is set to 0, no timeout occurs.
Resetting the service does the trick, but only as a last resort.
Is there anything else I can try?
Thanks
Beware of parameter sniffing in your stored proc. Try this:
CREATE PROC spA_StatementData (
    @sAccountNr VARCHAR(1000)
    , @bIncludeHistory BIT
    , ...
) AS
SET NOCOUNT ON

DECLARE @_sAccountNr VARCHAR(1000)
    , @_bIncludeHistory BIT
    , ...

--- prevent parameter sniffing
SELECT @_sAccountNr = @sAccountNr
    , @_bIncludeHistory = @bIncludeHistory
    , ...

--- use the local @_sAccountNr, @_bIncludeHistory, etc. instead of the parameter variables
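An alternative to the local-variable trick is to ask the optimizer for a fresh plan on every call, at the cost of one compile per execution. A sketch only - the statement shape and column names here are hypothetical stand-ins for whatever query inside spA_StatementData goes stale:

```sql
-- Hypothetical shape of the problem statement inside spA_StatementData.
-- OPTION (RECOMPILE) builds a plan for the actual parameter values on
-- each call, so a plan cached for an atypical account cannot linger.
SELECT t.TransDate, t.Amount
FROM dbo.CurrentTrans t
WHERE t.AccountNr = @sAccountNr
OPTION (RECOMPILE);
```

The local-variable approach avoids the per-call compile cost but settles for an "average" plan; RECOMPILE gives the best plan each time. Which trade-off wins depends on how often the proc is called.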
The same problem happened to me; I had missed the following line in the stored procedure:
SET NOCOUNT ON
Make sure your SP has this line. Hope this helps.
I'm using the following SQL for an ADO recordset against a SQL Server backend in VB6:
SELECT c.name, taxid =
    CASE WHEN EXISTS (SELECT 1 FROM sometable WHERE fld = 'abc')
         THEN c.SSN ELSE NULL END
When I try to update the taxid field in a row within the recordset locally ADO complains with the error "Multiple-step operation generated errors. Check each status value." I assume it's bothered by the fact that the taxid field is coming from a calculated field and not a raw table column. For my purposes I'm never going to be persisting these changes back to the database so I'm looking for a way to tell ADO that have no intent to persist changes so that it will allow me to change the data locally.
I think that @HK1's suggestion is a good one, though I'm not sure what happens to your ability to alter values in the recordset, whether the column you're trying to update is computed or not. It's been a long time since I played with classic ADO, but if the recordset is disconnected it may become read-only at that point.
But if you have no interest in using the recordset to perform updates, and you need to alter values locally, perhaps you should consider storing the results in a local array first? That way you can minimize the locking and cursor options of the recordset, for example, and immediately close the recordset and free up those resources.
rs.Open cmd, conn, adOpenForwardOnly, adLockReadOnly
Dim MyArray
MyArray = rs.GetRows()
rs.Close: Set rs = Nothing
Now you can manipulate MyArray however you want...
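For completeness, the disconnected-recordset route alluded to above looks roughly like this - an untested sketch, and whether ADO will then let you write to the computed taxid column is exactly the open question:

```vba
' Sketch: a disconnected, batch-optimistic client-side recordset.
' Edits after disconnecting stay local and are never sent to the server.
Dim rs As ADODB.Recordset
Set rs = New ADODB.Recordset
rs.CursorLocation = adUseClient              ' client-side cursor is required
rs.Open strSQL, conn, adOpenStatic, adLockBatchOptimistic
Set rs.ActiveConnection = Nothing            ' disconnect from the server
' Local edit attempt; simply never call rs.UpdateBatch afterwards.
rs!taxid = "000000000"
```

If the computed column still refuses local updates, the GetRows array shown above remains the safe fallback.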
I'm working on upsizing a suite of MS Access backend databases to SQL Server. I've scripted the SQL to create the table schemas in SQL Server. Now I am trying to populate the tables. Most of the tables have autonumber primary keys. Here's my general approach:
For Each TblName In LinkedTableNames
    'Create linked table "temp_From" that links to the existing mdb
    'Create linked table "temp_To" that links to the new SQL Server table
    ExecutePassThru "SET IDENTITY_INSERT " & TblName & " ON"
    db.Execute "INSERT INTO temp_To SELECT * FROM temp_From", dbFailOnError
    ExecutePassThru "SET IDENTITY_INSERT " & TblName & " OFF"
Next TblName
The first insert happens immediately. Subsequent insert attempts fail with the error: "Cannot insert explicit value for identity column in table 'TblName' when IDENTITY_INSERT is set to OFF."
I added a Resume statement for that specific error and also a timer. It turns out that the error continues for exactly 600 seconds (ten minutes) and then the insert proceeds successfully.
Does MS Access automatically refresh its ODBC sessions every 10 minutes? Is there a way to force that to happen faster? Am I missing something obvious?
Background info for those who will immediately want to say "Use the Upsizing Wizard":
I'm not using the built-in upsizing wizard because I need to be able to script the whole operation from start to finish. The goal is to get this running in a test environment before executing the switch at the client location.
I found an answer to my first question. The ten minutes is a setting buried in the registry under the Jet engine key:
Jet, WinXP / Win7 32-bit:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\ODBC\ConnectionTimeout
Jet, Win7 64-bit:
HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Jet\4.0\Engines\ODBC\ConnectionTimeout
ACE, WinXP / Win7 32-bit:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Access Connectivity Engine\Engines\ODBC\ConnectionTimeout
ACE, Win7 64-bit:
HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Access Connectivity Engine\Engines\ODBC\ConnectionTimeout
It is documented here for ACE:
ConnectionTimeout: The number of seconds a cached connection can remain idle before timing out. The default is 600 (values are of type REG_DWORD).
This key was set to the default of 600. That's 600 seconds or 10 minutes. I reduced that to ten seconds and the code sped up accordingly.
This is by no means the full solution, because setting the default that low is sure to cause issues elsewhere. In fact, Tony Toews once recommended that the default might better be increased when using DSN-less connections.
I'm still hoping to find an answer to the second part of my question, namely, is there a way to force the refresh to happen faster.
UPDATE: The reason this is even necessary is that the linked tables use a different session than ADO pass-through queries. I ran a test using SQL Profiler. Here are some brief results:
TextData SPID
-------------------------------------------
SET IDENTITY_INSERT dbo.TblName ON 50
SET IDENTITY_INSERT "dbo"."TblName" ON 49
exec sp_executesql N'INSERT INTO "d... 49
SET IDENTITY_INSERT dbo.TblName OFF 50
SET IDENTITY_INSERT dbo.NextTbl ON 50
SET IDENTITY_INSERT "dbo"."NextTbl" ON 49
exec sp_executesql N'INSERT INTO "d... 49
What's going on here is that my ADO commands are running in a different session (#49) than my linked tables (#50). Access sees that I'm setting the value for an identity column so it helpfully sets IDENTITY_INSERT ON for that table. However, it never sets IDENTITY_INSERT OFF. I turn it off manually, but that's happening in a different session.
This explains why setting the ODBC session timeout low works. It's just an ugly workaround for the fact that Access never turns off IDENTITY_INSERT on a table once it turns it on. Since IDENTITY_INSERT is session-specific, creating a new session is like hitting the reset button on IDENTITY_INSERT. Access can then turn it on for the next table, and the setting takes effect because it's a brand new session.
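Given that diagnosis, one way around the timeout entirely is to keep all three statements in the same session by sending them as a single pass-through batch. A sketch only - this assumes the rows have first been landed in a server-side staging table (with no identity column), since a server-side batch cannot read the linked Access table:

```vba
' Sketch: SET ON, INSERT, and SET OFF travel in one batch over one
' connection, so they share a single SPID and IDENTITY_INSERT is
' guaranteed to be ON when the INSERT runs and OFF afterwards.
' "staging_" table naming and the ExecutePassThru helper are assumptions.
Dim strBatch As String
strBatch = "SET IDENTITY_INSERT dbo." & TblName & " ON; " & _
           "INSERT INTO dbo." & TblName & " SELECT * FROM dbo.staging_" & TblName & "; " & _
           "SET IDENTITY_INSERT dbo." & TblName & " OFF;"
ExecutePassThru strBatch
```

This avoids relying on the registry timeout at all, at the cost of an extra copy of the data into staging tables.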
Two thoughts, though not sure either will be useful because this is unfamiliar territory for me.
"Does MS Access automatically refresh its ODBC sessions every 10 minutes? Is there a way to force that to happen faster? Am I missing something obvious?"
In the Access 2003 Options dialog, on the Advanced tab, there is a setting for "ODBC refresh interval" and also settings for retries. Does adjusting those help ... or have any effect at all?
I wonder if you could avoid this problem by creating the SQL Server columns as plain numbers rather than autonumber, INSERT your data, then ALTER TABLE ... ALTER COLUMN to change them after the data has been inserted.
Access won't let me convert a numeric column to an autonumber if the table contains data, but ISTR SQL Server is more flexible on that score.
I found a convenient, though not so beautiful, solution to export many Access tables to SQL Server and avoid the IDENTITY_INSERT problem:
I open a local table recordset which lists all tables to be exported and I loop through the records (one per table). In each loop I...
create an Access application object
use the TransferDatabase method on the application object
terminate/quit the application object and loop again
Here is the sample code:
Public Sub exporttables()
    Dim rst As Recordset
    Dim access_object

    'First create a local Access table which lists all tables to be exported
    Set rst = CurrentDb.OpenRecordset("SELECT txt_tbl FROM ####your_table_of_tables####")

    With rst
        While Not .EOF
            'generate a new object to avoid the identity insert problem
            Set access_object = CreateObject("Access.Application")

            'with the Access object, open the database which holds the tables to be exported
            access_object.OpenCurrentDatabase "####C:\yoursourceaccessdb####.accdb"
            access_object.DoCmd.TransferDatabase acExport, "ODBC Database", _
                "ODBC;DSN=####your connection string to target SQL DB####;", _
                acTable, .Fields("txt_tbl"), .Fields("txt_tbl"), False, False
            Debug.Print .Fields("txt_tbl") & " exported"

            access_object.CloseCurrentDatabase
            access_object.Application.Quit
            Set access_object = Nothing
            .MoveNext
        Wend
    End With
    Set rst = Nothing
End Sub