I have a Main table of 5000 rows, a Managers table of 51 rows and a Phase table of 16 rows. I created a query with a LEFT JOIN from the Main table to each of the other two:
SELECT [tblTrue-UpMain].Contract_Number, [tblTrue-UpMain].ProjectName, [tblTrue-UpMain].AFGroupName, [tblTrue-UpMain].AFAccountMgr, [tblTrue-UpMain].AFSubStationName, [tblTrue-UpMain].AFAssociatedContract, [tblTrue-UpMain].AFPreviousFile, [tblTrue-UpMain].AFFinancing, [tblTrue-UpMain].AFReplaceCoverage, [tblTrue-UpMain].AFBillingRate, [tblTrue-UpMain].AFProjectType, [tblTrue-UpMain].AFPreviousClients, [tblTrue-UpMain].ID, [tblTrue-UpMain].AF_VS_File_Number, [tblTrue-UpMain].Customer_Number, [tblTrue-UpMain].Contract_Execution_Date, [tblTrue-UpMain].Service_Account, [tblTrue-UpMain].Contract_Account, [tblTrue-UpMain].Contract_No_IF_Or_AF, [tblTrue-UpMain].CM, [tblTrue-UpMain].FinanceReplacementCPUC, [tblTrue-UpMain].Project_Type, [tblTrue-UpMain].Contract_Status, [tblTrue-UpMain].Contract_Phase, Local_ContractManagers.Contract_Manager, Local_Contract_Phase.PhaseName
FROM ([tblTrue-UpMain]
LEFT JOIN Local_ContractManagers ON [tblTrue-UpMain].CM = Local_ContractManagers.CMID)
LEFT JOIN Local_Contract_Phase ON [tblTrue-UpMain].Contract_Phase = Local_Contract_Phase.PhaseID
WHERE ((([tblTrue-UpMain].Contract_Number) Like "AF*"))
ORDER BY [tblTrue-UpMain].Contract_Number;
[tblTrue-UpMain].CM and [tblTrue-UpMain].Contract_Phase are both indexed. [Removing these indexes did not affect the time either way.]
Local_ContractManagers.CMID and Local_Contract_Phase.PhaseID are both primary keys. [INNER JOINs instead of LEFT JOINs ran in about 1 second in both cases.
When I run the query with the Managers and Phase tables as local Access tables, it takes 1 second or less. If I run it with the Managers and Phase tables on SQL Server, it takes over 40 seconds. The Main table is on SQL Server in both cases.]
Running the query on SSMS (with all tables on SQL Server) runs quickly too.
Any ideas why it might be running so long?
As long as the tables are all SQL Server based? Then the query should not take that long. However, if you mix a local table and a SQL Server table, then it will be slow as turtles.
However, one way to fix the issue?
Fire up SQL manager, and build the same query in SQL server.
In fact, choose create new view. A SQL "view" is really like an Access saved query, and you get to use the familiar query builder (it is similar to the Access one). You will note that sorting is in general not allowed in a view, so do that in the report, or even build a client-side query against that view.
Now, save this view, and then create a link to this view Access side.
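For example, the query from the question could be saved server side roughly like this (the view name is my own, and I trimmed the column list for readability; note that T-SQL uses % where Access uses the * wildcard):

```sql
-- hypothetical view name; add the remaining columns from the original SELECT list
CREATE VIEW dbo.vwTrueUpAF AS
SELECT m.Contract_Number, m.ProjectName, m.CM, m.Contract_Phase,
       cm.Contract_Manager, p.PhaseName
FROM dbo.[tblTrue-UpMain] m
LEFT JOIN dbo.Local_ContractManagers cm ON m.CM = cm.CMID
LEFT JOIN dbo.Local_Contract_Phase p ON m.Contract_Phase = p.PhaseID
WHERE m.Contract_Number LIKE 'AF%';
```

(Do the ORDER BY in the report or in a client-side query against the view, since a plain view does not guarantee row order.)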
I do this so often, I have a little helper VBA routine to add this one link, and that's a whole lot less work than using the ODBC manager from the ribbon.
Now, in place of the local query you have, try using the SQL "view" (it will appear as a linked table).
The result should be VERY high performance, as good as when you run the query in SSMS.
Views tend to be a far better choice than, say, using a stored procedure, since they are much easier to use in Access and even in VBA code.
In fact, I recommend giving the linked view the SAME name as what you had for the local SQL query. That way, existing forms, code or reports don't require any modifications, and you get fantastic performance.
This performance includes even opening a report with a "where" clause. So, you can still use client-side filters, or even ones created in VBA that launch the report with a "where" clause. Access will ONLY pull the records that meet the criteria (and thus this approach is network/bandwidth friendly).
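As a quick sketch of launching a report with such a "where" clause from VBA (the report and field names here are made up for illustration):

```vba
' Access only pulls the rows matching the where clause from the linked view
DoCmd.OpenReport "rptContracts", acViewPreview, , "Contract_Status = 'Active'"
```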
So, anytime you encounter slow, turtle-like performance? Try the view approach - it has fixed every performance issue I have seen.
For a simple query based on one table? Then a view doesn't help. But, once you start introducing joins and additional tables, then Access can't do a proper join without pulling a lot of data - so a view really helps.
[tblTrue-UpMain].CM and [tblTrue-UpMain].Contract_Phase are both indexes.
We assume you are talking about SQL Server side indexes. Do NOT attempt to create indexes client side - they don't do anything and can't be used.
I also STRONGLY suggest you do NOT attempt to create indexes server side against the view. ALWAYS create the indexes on the base SQL Server tables server side - nothing more, nothing less.
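So, in this case, the indexes would go on the base table server side, something like this (the index names are my own):

```sql
-- indexes on the join columns of the base table, not on the view
CREATE INDEX IX_TrueUpMain_CM    ON dbo.[tblTrue-UpMain] (CM);
CREATE INDEX IX_TrueUpMain_Phase ON dbo.[tblTrue-UpMain] (Contract_Phase);
```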
Edit: Adding a link with VBA in place of the ODBC manager.
So, as noted, since one often needs to add one new table, or one new view (as a link from SQL Server), I wrote a VBA routine to save time.
Nothing special about the routine. I have a boatload of such routines. I mean, once one adopts a split database, then for ease of deployment, we all over time "cobble" together some re-link code. This holds true for just an Access back end, or a SQL Server one.
So, my little VBA helper routine to add one link?
Say I just created a new view sql server side. say viewCustomers.
Now, I need a linked table (or view) in the client side. Too much time + pain to fire up the odbc manager and go through all those steps to link just one table.
So, I have this routine, and run it from the debug window.
So, I'll hit ctrl-g, and in the debug window type in this:
MyCreateOneLink "server table", "Local name for link"
So, the above is much quicker.
So, what does the above routine look like?
In my standard code module, I have this:
Sub MyCreateOneLink(strServerTable As String, strLocalTable As String)

    Dim tdfCurrent As DAO.TableDef

    Set tdfCurrent = CurrentDb.CreateTableDef(strLocalTable)
    tdfCurrent.Connect = dbCon("MYSERVER\SQLEXPRESS", "Customers", "MyUserName", "MyPassword", "MyApp", "MyWorkstation")
    tdfCurrent.SourceTableName = strServerTable
    CurrentDb.TableDefs.Append tdfCurrent

End Sub
And the above calls a routine "dbCon()". That routine creates a valid SQL Server connection string from the values passed:
Public Function dbCon(ServerName As String, _
                      DataBaseName As String, _
                      Optional UserID As String, _
                      Optional USERpw As String, _
                      Optional APP As String = "Office 2010", _
                      Optional WSID As String = "Axis") As String

    ' returns a SQL Server connection string
    dbCon = "ODBC;DRIVER=SQL Server;" & _
            "SERVER=" & ServerName & ";" & _
            "DATABASE=" & DataBaseName & ";"

    If UserID <> "" Then
        dbCon = dbCon & "UID=" & UserID & ";" & "PWD=" & USERpw & ";"
    End If

    dbCon = dbCon & _
            "APP=" & APP & ";" & _
            "WSID=" & WSID & ";" & _
            "Network=DBMSSOCN"

End Function
The above will have to be changed for the "native" driver, but the simple idea here is to have quick code, and thus from the Access command prompt (debug window) I can type in a quick, easy command to create a table link.
So, often during development one will use the Access immediate (debug) window, and we can thus call little VBA helper routines, such as one to create a table link for us.
Related
Two related questions. I hope that's OK. If not, please feel free to focus on the second question!
I have an Access 2019 database form that uses a macro for search functionality. The form recordsource is based on two tables using a left join. Any macro options that use ApplyFilter based on any fields in the joined tables operate correctly and quickly.
However, I need a search to use a subquery and for some reason the Macro Where Condition does not support a sub query (it shows a red exclamation point and gives an error when trying to save the macro "The 'ApplyFilter' macro action has an invalid value for the 'Where Condition' argument").
The Where Condition is:
JobPropertyID in (select PropertyID from Properties where PropertyAddress like '*' & [Address contains] & '*')
(I have tried various combinations of % and * wildcards, and quotes).
This used to work in earlier versions of Access (we upgraded from 2003 to 2019).
So, question 1 is - Is this a known limitation?
(I can work-around it by using RunCode to set the Filter and FilterOn in VBA code).
The second, and more important question relates to the performance when using a sub query. For example, this pseudo query to return jobs at matching property addresses:
select JobID, JobDescription, CompanyName from JobDetails Left Join Company on JobCompanyID = CompanyID where JobPropertyID in (select PropertyID from Properties where PropertyAddress like '*High Street*')
This does work but can take about a minute to run in Access. If I run the query in SQL Server Management Studio it shows the results in less than a second. I looked at the output in SQL Profiler and it appears that Access is requesting all rows from the joined tables and all rows from the Properties table (with no criteria being applied to either) and then, presumably, applying the filter and the sub query internally.
Is there a way to encourage Access to let SQL Server do the work?
I have tried using a pass through query, and this returns the correct results quickly, but is read only, so not suitable for a form that allows editing.
I suppose I could display the search results in a subform and apply a filter to the main form from the OnCurrent event of the subform. But this seems a rather clunky solution.
Any thoughts welcome.
Access tends to like Exists over In a lot when it comes to performance/translation, and that second query can be rewritten to an Exists:
select JobID, JobDescription, CompanyName from JobDetails
Left Join Company on JobCompanyID = CompanyID
where Exists (select 1 from Properties where PropertyAddress like '*High Street*' AND JobPropertyID = PropertyID)
As for that filter, you can pass a filter string to avoid the macro from doing weird stuff:
="JobPropertyID in (select PropertyID from Properties where PropertyAddress like '*' & [Address contains] & '*')"
But you should really consider using VBA for your own sanity (macros are intended for beginners, but quickly become harder than learning VBA even for trivial tasks), unless VBA is restricted in your company, in which case you will keep running into obstacles using Access.
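For what it's worth, the VBA version of that filter is only a few lines - a sketch, assuming [Address contains] is a text box on the form and the code runs in the form's module:

```vba
' apply the sub-query filter in code instead of via the macro
Private Sub cmdSearch_Click()
    Me.Filter = "JobPropertyID in (select PropertyID from Properties " & _
                "where PropertyAddress like '*" & Me![Address contains] & "*')"
    Me.FilterOn = True
End Sub
```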
Any client side filter from Access? One table - works great.
A left join - usually ok!
But, after that? The problem is that Access (and most ODBC consumers) views EACH table as a 100% separate link. While your linked tables point to SQL Server? They don't have to!
So, one table could point to say a local FoxPro/dbase table.
Another table could point to a linked Excel sheet.
And another table could point to sql server.
So, if you try and use any complex SQL - involving joins, or ESPECIALLY sub-queries - then Access really can't make the assumption that these additional tables are from the same server - they might be Excel sheets! So, those data sources are "often" seen and assumed by Access to be separate tables. (And while it is amazing that Access lets you run sub-queries against two linked Excel sheets, there is no server on the other end - the Access client side has to figure this all out to make it work.)
You have two great choices to fix this issue.
The best choice? Build that query server side (in SSMS). Save it as a view. Now the joins and all that jazz will occur 100% server side. You can STILL very effectively filter against that "table" in Access. You of course will now have a linked "view" in Access. But Access sees this as one table - filters work fantastically well in this case.
And in a lot of cases, that sub-query can be re-written as a left join (not always - but often so). So again, build that complex join server side, save it as a view, and then from Access link to that view.
You can filter against columns in both parent and child table - it again works well.
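As a sketch (the view name is my own), the sub-query from the question re-worked as a server side join that exposes the address column for client side filtering:

```sql
-- joins happen 100% server side; PropertyAddress is exposed for filtering
CREATE VIEW dbo.vwJobSearch AS
SELECT j.JobID, j.JobDescription, c.CompanyName, p.PropertyAddress
FROM JobDetails j
LEFT JOIN Company    c ON j.JobCompanyID  = c.CompanyID
LEFT JOIN Properties p ON j.JobPropertyID = p.PropertyID;
```

Link that view in Access, and a client-side filter such as PropertyAddress Like '*High Street*' can then be applied against the one linked "table".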
However, what if you really need that sub query, and worse, the sub query needs values/parameters from the child table?
Well, then build the query using what is called a Pass-Through (PT) query in Access. When you do this, the SQL is sent raw, un-touched, to SQL Server (no difference in speed, and what you put in that PT query will work 100% exactly the same as if you typed that SQL right into SSMS). So, for example, that query that ran so fast?
Take it 100% "as-is", un-changed. Build a PT query in Access, and then cut+paste in that working and speedy query you had. It will now run the same in Access. Just keep in mind that a PT query in Access is read only, but queries with sub-queries etc. often would be anyway.
The last one? Well, with a PT query in Access, you can use it to call + consume a stored procedure. And stored procedures are often used because they let you EASILY set parameters in, say, a sub query. (Not that you really need T-SQL code and some big procedure to run - but you do need parameters, and a SQL Server stored procedure gives you this ability.)
In VBA, to execute that stored procedure? Well, I create a working PT query (it could hold anything, or any T-SQL). I then use that one PT query over and over in code.
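If you don't yet have that working PT query, it can be created once in code - a sketch, where the query name, server, and database are examples of my own:

```vba
Sub MakePTQuery()
    ' create a saved pass-through query we can re-use over and over
    ' (assumes a query named MyPTQuery does not already exist)
    Dim qdf As DAO.QueryDef
    Set qdf = CurrentDb.CreateQueryDef("MyPTQuery")
    qdf.Connect = "ODBC;DRIVER=SQL Server;SERVER=MYSERVER;DATABASE=MyDB;Trusted_Connection=Yes"
    qdf.ReturnsRecords = True
    qdf.SQL = "SELECT 1 AS Placeholder"   ' the .SQL gets replaced at run time
End Sub
```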
eg like this:
    Dim rstData As DAO.Recordset

    With CurrentDb.QueryDefs("MyPTQuery")
        .SQL = "EXEC dbo.GetInvoices"
        Set rstData = .OpenRecordset()
    End With
Now above, the stored procedure (to get say invoices) did not have any parameters, but it can and could. So, say you have to pass the year, and month to get invoices.
So, your PT query could/would be used like this (from SSMS).
EXEC dbo.GetInvoices 2021, 3
So, you have to pass it the year + month. In VBA, then, we could go:
    Dim rstData As DAO.Recordset
    Dim MyMonth As Integer
    Dim MyYear As Integer

    MyMonth = Month(Date)   ' get current month based on today's date
    MyYear = Year(Date)     ' get current year based on today's date

    With CurrentDb.QueryDefs("MyPTQuery")
        .SQL = "EXEC dbo.GetInvoices " & MyYear & "," & MyMonth
        Set rstData = .OpenRecordset()
    End With
And you are not limited to shoving the results into a recordset.
Say we have a report - set the record source of the report to "MyPTQuery".
Then do this:
    With CurrentDb.QueryDefs("MyPTQuery")
        .SQL = "EXEC dbo.GetInvoices " & MyYear & "," & MyMonth
    End With

    ' now open the report
    DoCmd.OpenReport "rptInvoices", acViewPreview
So that query can be used for anything - reports, forms or a VBA recordset.
So I find that views are the easiest, perform REALLY well, and you can even have criteria against that view, and it works very well (so this is the least amount of work). So build that messy multi-table join server side, save it as a view, and then link from Access client side. This works VERY well, since a REALLY big dog-pile query in Access with multi-table joins is often not going to work well.
So, you take that messy query, move it to SSMS and get it working. Then save that working SQL as a view. Now re-name (or delete) the Access query. Say it was
QueryInvoices.
Well, now link to the view using the same name. Now the VBA code, the forms, and the reports that were ALL using that messy client side query don't have to be changed!! The server side view is simply linked under that old query name. Again this gives fantastic performance.
So, the above should give you some ideas. As noted, views are your friend here. But, as noted, the problem is that you can't pass parameters from Access into a sub query inside a view. So, you can build a stored procedure and use the above PT query idea.
And the other way? Well, you build the whole SQL string in VBA code and shove it into that PT query - and that also works (but in-line, messy, long SQL statements in VBA code can be quite a challenge). For parameters in a sub-query, the view trick can't be used, so you travel down the PT query road.
I'm running MS Access 2016 connecting to a linked table on SQL Server (2012, I think, if it matters), in order to allow a single user (me) to quickly manipulate data in that table. I have a fairly intensive pass-through query which creates a list of primary key values representing rows where there's something wrong with the data in the table (usually just nulls where there shouldn't be nulls, but also invalid combinations of values and invalid dates).
I would like to display only the rows that are identified by my pass-through query, so I can quickly input missing values and make other corrections. However, I am at a complete loss as to how to do that.
In an attempt to sort-of follow best practices, I tried to make the relevant rows display in a form where only specific fields would be editable. However, MS Access keeps throwing an error about how a relationship isn't defined between the queries and the table, even though I set up the relationships.
Failing that, I tried to make an editable query using the relevant queries and table, but none of the queries I've made have had editable recordsets, for reasons I don't really understand. As far as I can tell, the relationships between the queries and the table have been one-to-one, and the linked table is normally editable directly in MS Access. Even when I ditch the additional info query and just join the linked table to my error-finding pass-through query, I can't create an editable recordset.
Is there a good way to accomplish my goals? I'm starting to feel like my best option here is to create some sort of temporary table in Access to store the values I'm editing, and then merge them into the linked table, but that seems kind of clunky to me. Do I have any other options?
OK, so perhaps you have a stored procedure that finds those bad rows (or whatever the reason), and you have it return that list of keys.
The next question then becomes?
Well, if you are only ever returning say < 100 rows, then I think that just returning a string of PK values would work fine.
So, the stored procedure can either:
Return one string to access with pk values, say like
34,3443,3,55333
(In the final statement of your stored procedure, just do a select on the string, and Access will see the one row, one column result as a recordset.)
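So the tail end of the stored procedure would be something like this (the variable name is assumed):

```sql
-- @strKeys holds the comma separated PK list built earlier in the procedure
SELECT @strKeys AS BadKeys;
```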
So, your code would be something like:
Sub EditBad()

    Dim strKeys As String   ' list of keys from stored procedure
    Dim rstKeys As DAO.Recordset

    With CurrentDb.QueryDefs("MyStoredProc")
        .SQL = "exec sp_myproc " & "some possible parameters you send to stored procedure"
        Set rstKeys = .OpenRecordset
    End With

    strKeys = rstKeys(0)
    rstKeys.Close

    ' now launch our edit form based on the keys returned
    Dim strWhere As String
    strWhere = "ID IN (" & strKeys & ")"
    DoCmd.OpenForm "frmEditBadGuys", , , strWhere

End Sub
Now, what if the list of bad keys to be returned is going to be larger? Well, then from the stored procedure, simply return the values as a table. So, in place of a single select on the "string of keys" from the stored procedure, you return a select of the bad key values you want to check.
So, now we have:
    Dim strKeys As String
    Dim strWhere As String
    Dim rstKeys As DAO.Recordset

    With CurrentDb.QueryDefs("MyStoredProc")
        .SQL = "exec sp_myproc " & "some possible parameters you send to stored procedure"
        Set rstKeys = .OpenRecordset
    End With

    strKeys = ""
    Do While rstKeys.EOF = False
        If strKeys <> "" Then strKeys = strKeys & ","
        strKeys = strKeys & rstKeys(0)
        rstKeys.MoveNext
    Loop
    rstKeys.Close

    strWhere = "ID IN (" & strKeys & ")"
    DoCmd.OpenForm "frmEditBadGuys", , , strWhere
Again, I think this solution is good for say 100-200 rows.
If the results returned are say 1000-2000 records? Well, then that "in (value1, value2)" etc. string gets too long, runs too slow, and will blow up - you are limited to, I believe, about 4000 characters - but such a where clause in SQL is simply too long already, and it is going to run turtle slow.
So, in place of processing the returned recordset into a long string?
Send the results to a local temp table. You can then create an editable form, and join the local table to the linked table. In this case, make sure the base table is the linked table, and you may well have to do a left join to the local temp table (an inner join might work, but a left join should).
And if performance is really required? Well, send the data to a SQL Server side working table. (Now, by temp table, I don't mean an actual SQL Server temp table, since you can't use those client side in Access. What setup you use would depend on how multi-user this application has to be.)
So, you could make a SQL Server view (that does this join). Keep in mind that SQL Server views ARE editable from client side Access (unlike pass-through queries - those are read only for the most part).
So, you could also have a "user" column in that "sort of temp" table SQL side, and you not only add the PK values, but also the user name. And then in Access client side? You launch the "bad guys/bad data" edit form with a where clause based on user name (this would allow a multi-user setup). And this setup would by far perform the best.
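That "sort of temp" working table server side might look something like this (the table and column names here are my own):

```sql
-- working table of bad PK values, with a user column for multi-user use
CREATE TABLE dbo.tblBadKeys
(
    ID       INT IDENTITY PRIMARY KEY,
    BadPK    INT NOT NULL,          -- PK of the bad row in the base table
    UserName VARCHAR(50) NOT NULL   -- which user this batch of bad rows belongs to
);
```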
So, get the stored procedure to spit back either:
A one column select from the stored procedure that has the keys - try the "in clause" as a where clause against a client side form that you know works (can edit data). However, I seem to recall that the Access client does not do a great job with an "in clause" against SQL Server - you have to give this idea a try.
The local table of PK values would also work well - not much more code, and this would be based on a linked table join + a left join to the local table of PK values that you wrote out. If the linked table is not too large, then this can work (but again, performance issues may crop up).
So, ultimate performance would be a server side view joined to the working table of bad PK values you want to work on. And for multi-user, you then have to add a "user" column or some other means to allow more than one user. Keep in mind that a "where" clause from the Access client works VERY well against a view (but not those "in (1,2,3)" types of where clauses!).
So, 3 possible roads above to try.
I am using this code to attach a DSN-less table.
'//Name : AttachDSNLessTable
'//Purpose : Create a linked table to SQL Server without using a DSN
'//Parameters
'// stLocalTableName: Name of the table that you are creating in the current database
'// stRemoteTableName: Name of the table that you are linking to on the SQL Server database
'// stServer: Name of the SQL Server that you are linking to
'// stDatabase: Name of the SQL Server database that you are linking to
'// stUsername: Name of the SQL Server user who can connect to SQL Server, leave blank to use a Trusted Connection
'// stPassword: SQL Server user password
Function AttachDSNLessTable(stLocalTableName As String, stRemoteTableName As String, stServer As String, stDatabase As String, Optional stUsername As String, Optional stPassword As String)
    On Error GoTo AttachDSNLessTable_Err

    Dim td As TableDef
    Dim stConnect As String

    For Each td In CurrentDb.TableDefs
        If td.Name = stLocalTableName Then
            CurrentDb.TableDefs.Delete stLocalTableName
        End If
    Next

    If Len(stUsername) = 0 Then
        '//Use trusted authentication if stUsername is not supplied.
        stConnect = "ODBC;DRIVER=SQL Server;SERVER=" & stServer & ";DATABASE=" & stDatabase & ";Trusted_Connection=Yes"
    Else
        '//WARNING: This will save the username and the password with the linked table information.
        stConnect = "ODBC;DRIVER=SQL Server;SERVER=" & stServer & ";DATABASE=" & stDatabase & ";UID=" & stUsername & ";PWD=" & stPassword
    End If

    Set td = CurrentDb.CreateTableDef(stLocalTableName, dbAttachSavePWD, stRemoteTableName, stConnect)
    CurrentDb.TableDefs.Append td
    AttachDSNLessTable = True
    Exit Function

AttachDSNLessTable_Err:
    AttachDSNLessTable = False
    MsgBox "AttachDSNLessTable encountered an unexpected error: " & Err.Description
End Function
Suppose I changed the schema of a linked table using ADO:
cn.Execute "ALTER TABLE Table1 ADD [Name] VARCHAR(50) NOT NULL"
I know this will change the table in SQL Server.
I thought that the linked tables in MS Access always reflected any changes made to the tables in SQL Server. Why do we need to delete the local table and relink it to reflect the schema changes?
The simple answer is that schema changes are only gathered and saved at link time, for reasons of performance.
The linked table has no way really of knowing that the table structure has been changed until such time you tell Access to check (by simply refreshing the table).
I mean, when you are on a web page, they might add some new articles, or change the layout. The web server does not then call out to everyone on the internet and tell them that page has been updated. Seems like a silly approach.
The table structure is pulled at linking time. If it were re-checked every time you touched or used a table, performance would take a big hit, and lots of extra checking, chatter, and communication with the server would have to occur.
If the linked table did not save information about the schema locally, then you would not have this issue.
However, linking to a table not only saves connection information, but it also saves the information about the table structure.
Pulling data from a table is a massively different issue than asking SQL Server to send down the whole table structure. (And so what if it did - you would STILL then have to update the local information.)
I mean, my form might use 2 fields out of 150. Why would I want all that mass of information to come down the network pipe just to display two text boxes on a form? Why should 150 columns of information (name, length, data type, autonumber, indexing, etc.) come down each time?
Have you seen the property sheet for each column in SQL Server? It is quite a large bit of information. In fact, that information adds up to quite a bit of data when you have 100 or more columns. The information sent down the network pipe would be MORE than the data contained in the one small record you are editing!
I mean, you might link to a view on sql server that only includes a few columns. Each query and use of that view will thus only pull the columns defined at view time.
In fact, even other views on SQL Server can have a hard time picking up changes when you modify an underlying view or table. As a result, even queries on queries (views on views), EVEN when using pure SQL Server, often need to be "told" that the data structure and columns you originally started out with have changed.
So linked tables in Access are not much different than saved views on SQL Server. If you update the base tables, then you have to execute sp_refreshview on SQL Server for the view to know about these changes.
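For example, after altering a base table (the view name here is assumed):

```sql
-- re-read the base table metadata persisted in the view
EXEC sp_refreshview 'dbo.viewCustomers';
```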
I mean, if you grab a Word document from the server, and others start modifying the copy on the server, does each workstation know about the changes? Does the server reach out to them?
So the information about the schema is persisted at link time, and re-reading of the schema does not occur by just "using" the linked table. You have to tell Access to re-load the schema when changes are made.
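Telling Access to re-load the schema for one link is a one-liner with DAO's RefreshLink (the linked table name here is assumed):

```vba
' force Access to re-read the table structure from SQL Server for one link
CurrentDb.TableDefs("dbo_Customers").RefreshLink
```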
Having to check the schema and update the information in Access for each record, or each time you grab some data, would be a costly and time consuming process.
Since for 99.9% of your data pulling the schema is not changing, having all that overhead of pulling and testing the schema each time does not make much sense (it would be a poor idea and design).
As pointed out, even when not using Access, and using views on sql server that point to existing tables, the schema is persisted in that view until such time you refresh that view.
And this issue is certainly not limited to an Access front end with an Access back end, or with a SQL Server back end. Most database products simply don't, out of the blue, update things like views that persist the original schema, for reasons of performance.
I mean, say I am on an Access form, and I move to the next record. Are you telling me that Access should now start talking to the server and asking if the schema was changed? It makes no sense at all to force the client to gather all that information and re-save the schema each time.
I suppose Access could query for this information on each first table use for a given session, but really, what about all the users currently working? They would not see the schema change anyway.
Schema changes are also rather rare. If they are frequent, then something is wrong with the developers - not the database system.
I am in the process of converting an Access application to use a SQL Server backend while still using the Access front end forms. Sounds like fun I know.
This application needs data access to 2 SQL Server databases that are on the same server. There are numerous inline SQL query strings that attempt to query both databases at the same time on a single ADODB connection. This is failing: I am expecting records, but none are returned.
What is the best way to fix this? Is there any way to use these sql strings or must it all be converted to stored procedures? Thanks for any help.
Here is some code:
Dim conn As ADODB.Connection
Set conn = New ADODB.Connection
Dim rst As ADODB.Recordset
Set rst = New ADODB.Recordset

With conn
    .Provider = "sqlncli11"
    .ConnectionString = "Server=[MY_SERVER];Database=[MY_DATABASE];User Id=sa; Password=password;"
    .Open
End With

Dim str As String
str = "SELECT TABLE_DB1.Parent_Item_No FROM TABLE_DB1 INNER JOIN [DB2].[dbo].TABLE_DB2 ON (TABLE_DB1.Comp_Item_No = " & _
      "TABLE_DB2.item_no) AND (TABLE_DB1.Loc = TABLE_DB2.loc) " & _
      "GROUP BY TABLE_DB1.Parent_Item_No " & _
      "HAVING (((TABLE_DB1.Parent_Item_No)='" & str_Assembly & "'));"

With rst
    .Open str, conn, adOpenKeyset, adLockOptimistic ' this fails to return records
    If .RecordCount > 0 Then
        'Do Stuff
    Else
        'Do Other Stuff
    End If
End With
You're only checking RecordCount. Take a look at this: slxdeveloper.com/page.aspx?action=viewarticle&articleid=33 - some recordset types don't populate the RecordCount property (adOpenKeyset should, though). What happens if you test .EOF and .BOF instead? And what is the actual value of RecordCount in your code?
Would it at all be possible to run queries from SQL Server which save to an Access file? I have nothing but trouble pulling data directly into Access. I do have success when I set up an ODBC data source and go to Data -> Get External Data -> From Other Sources -> From Microsoft Query.
Another method I myself have been successful with is using the power query add-on from microsoft. https://www.microsoft.com/en-us/download/details.aspx?id=39379
Despite this, what I still mostly end up doing is using the SQL Server Import/Export tool. I don't have screenshots or specific instructions as I'm not at work right now, but it can write directly to text files, Access databases, everything. I love it too much. Getting the correct drivers was a hassle for sure. If you're on 64-bit and having issues, then this is the driver you need. http://www.microsoft.com/en-us/download/details.aspx?id=13255
What I do is:
Set up my source (SQL 11 native client), choose the database you're pulling from
Specify an outfile type and location (I think the file has to already exist)
When prompted to specify whether to pull data from tables and views or write a query, select write a query.
Go through the rest of the importer; you can edit the SQL statement later on when reviewing the conversion, and specify whether the transfer fails or ignores errors, etc.
I personally still use the Import/Export tool for transfers of all sizes because it's just so difficult to get all the correct drivers and get SQL Server to like what I want (and without admin rights, I get tired of asking my boss).
I hope one of those solutions can help you!
I've outlined a more proper fix and a quick fix.
The more proper fix is the Data Layer Pattern. There is a lot to this fix and it may require some application structural changes. This is discussed in depth in another question:
Data Access Layer design patterns
A very simple fix is to use Access linked tables. A linked table works like a normal Access table except the data is stored and updated on the SQL Server. It's basically a built-in data access layer to SQL Server. It's not an elegant solution, but it gets you up and running right away. More info can be found here:
https://support.office.com/en-us/article/Import-or-link-to-SQL-Server-data-a5a3b4eb-57b9-45a0-b732-77bc6089b84e#bm2
One thing to be aware of with linked tables is that some Access queries and forms retrieve all the records before filtering, and can lock the table, so you can end up with some performance headaches if you have lots of data and lots of users.
Consider using the SQL Server SYNONYM feature to add aliases in one database for the objects in the other. Then just update all your queries to use one database.
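As a sketch (the database, schema, and table names here are placeholders), creating the alias in the database your app connects to looks like:

```sql
-- Run in DB1: create a local alias for a table that actually lives in DB2
CREATE SYNONYM dbo.TABLE_DB2 FOR DB2.dbo.TABLE_DB2;

-- Queries in DB1 can now reference it without the three-part name
SELECT * FROM dbo.TABLE_DB2;
```

If the databases are ever split onto different servers, you only have to redefine the synonyms rather than edit every query.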
Alternatively, you could merge the two databases, moving one of them (or each of them) into its own schema to keep them separate. This could be tough if you have a lot of stored procedures, views, and functions in the database. This may be a terrible answer, but it could also be true that the two databases should never have been separate in the first place.
In the INNER JOIN, you prefixed the table name with DatabaseName.Schema.:
... FROM TABLE_DB1 INNER JOIN [DB2].[dbo].TABLE_DB2 ...
But you didn't do it in the other places where TABLE_DB2 occurs.
So you either need to change this:
ON (TABLE_DB1.Comp_Item_No = TABLE_DB2.item_no) AND (TABLE_DB1.Loc = TABLE_DB2.loc)
...to this:
ON (TABLE_DB1.Comp_Item_No = [DB2].[dbo].TABLE_DB2.item_no) AND (TABLE_DB1.Loc = [DB2].[dbo].TABLE_DB2.loc)
Or (which I prefer) you can use aliases for the table names in the FROM clause:
... FROM TABLE_DB1 t1 INNER JOIN [DB2].[dbo].TABLE_DB2 t2...
...then you use the aliases everywhere else:
str = "SELECT t1.Parent_Item_No FROM TABLE_DB1 t1 INNER JOIN [DB2].[dbo].TABLE_DB2 t2 ON (t1.Comp_Item_No = " & _
"t2.item_no) AND (t1.Loc = t2.loc) " & _
"GROUP BY t1.Parent_Item_No " & _
"HAVING (((t1.Parent_Item_No)='" & str_Assembly & "'));"
Additional background information:
If you connect to an SQL Server via ADO, you're directly connecting to exactly one database - the one in the connection string:
.ConnectionString = "Server=[MY_SERVER];Database=[MY_DATABASE];User Id=sa; Password=password;"
So in your case, the database you're connecting to is named MY_DATABASE. Any SQL you're executing via ADO goes to that database.
If you need to get data from other databases on the same server, you need to prefix the names with DatabaseName.Schema. in all places where you use them.
So let's assume we have:
a table MY_TABLE in MY_DATABASE
a table OTHER_TABLE in OTHER_DATABASE on the same server
both tables have the schema dbo (the default in SQL Server)
With the connection string from above (connecting to MY_DATABASE), you can join them as follows:
select *
from MY_TABLE
inner join OTHER_DATABASE.dbo.OTHER_TABLE
on MY_TABLE.SomeColumn = OTHER_DATABASE.dbo.OTHER_TABLE.OtherColumn
where OTHER_DATABASE.dbo.OTHER_TABLE.EvenAnotherColumn = 'foo'
See? Everywhere I used OTHER_TABLE, I prefixed it with OTHER_DATABASE.dbo..
PS:
It's bad practice to use the sa user to connect to a database with an application. The sa user has the highest permissions possible.
You should either use Windows authentication or create a dedicated SQL user for your app.
Consider storing your SQL in a pass-through query instead of in VBA code. You can apply your filter by taking a copy of the .SQL property of the pass-through query's QueryDef object and modifying it at runtime with the criteria entered in your form.
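A rough sketch of that approach in DAO (the query name qryPassThrough and the form control txtFilter are hypothetical; note that a pass-through query uses the T-SQL % wildcard, not Access's *):

```vba
Dim db As DAO.Database
Dim qdf As DAO.QueryDef
Dim strBaseSQL As String

Set db = CurrentDb
' qryPassThrough is a saved pass-through query holding the base SELECT
Set qdf = db.QueryDefs("qryPassThrough")
strBaseSQL = qdf.SQL   ' keep a copy of the original statement

' Append the criteria the user entered on the form, then run it
qdf.SQL = strBaseSQL & " WHERE Contract_Number LIKE '" & Me.txtFilter & "%'"
DoCmd.OpenQuery "qryPassThrough"

qdf.SQL = strBaseSQL   ' restore the original SQL afterwards
```

Because the whole statement is sent to the server as-is, the filtering happens on SQL Server instead of Access pulling rows across the wire.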
I have a single big table in an MS Access 2000 database, and I need a way to copy this table and then populate my two already-prepared tables on SQL Server.
I can use a migration tool, but is there any way to do it from MS Access, such as a form which executes a stored procedure or uses an ODBC connection?
What I am thinking is to create a form in MS Access with a browse button for selecting my source Access .mdb file, and then another button for processing (on clicking it, the table selected via the browse button should be imported into SQL Server).
Is this possible?
Please give me some details.
Thanks in advance
You could set up table links to the two SQL server tables from inside MS Access.
Then you could create a query that populates each table from the source table.
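For example, if the two linked tables show up in Access as dbo_TargetA and dbo_TargetB (hypothetical names, as are the columns), the append queries might look like:

```sql
-- Populate the first SQL Server table from the local source table
INSERT INTO dbo_TargetA (CustomerID, CustomerName)
SELECT CustomerID, CustomerName
FROM BigSourceTable;

-- Populate the second SQL Server table with its slice of the columns
INSERT INTO dbo_TargetB (CustomerID, OrderDate, Amount)
SELECT CustomerID, OrderDate, Amount
FROM BigSourceTable;
```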
I have a single big table in an MS Access 2000 database and I need a way to copy this table and then populate my two already-prepared tables on SQL Server.
How big is "big"? SQL Server should be able to handle any single table you could ever store in Access; even a terabyte of data would not be infeasible.
You could use Microsoft SSIS. Or just dump a .csv representation of the Access database and import it into SQL Server. Or upsize Access to SQL Server.
Your question is confusing to me. Are the two "already ready" tables in addition to this one big table, with other data to be stored in them? I certainly hope they don't have a layout identical to the "big" table in Access (violating first normal form), with the plan being to divide the data from Access between them on the theory that it's an improvement of some kind.
You could create an Access Project (.adp) connected to the SQL Server and then use the File > Import menu in MS Access (in the .adp) to import your table into the .adp, and thus into SQL Server.
Then just use a simple SELECT INTO statement (or stored procedure) to populate your tables in SQL Server from the imported table.
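Since the target tables already exist, the T-SQL would actually be INSERT ... SELECT rather than SELECT INTO (which creates a new table); all names below are placeholders:

```sql
-- Split the imported staging table across the two existing tables
INSERT INTO dbo.TargetA (CustomerID, CustomerName)
SELECT CustomerID, CustomerName
FROM dbo.ImportedBigTable;

INSERT INTO dbo.TargetB (CustomerID, OrderDate, Amount)
SELECT CustomerID, OrderDate, Amount
FROM dbo.ImportedBigTable;
```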
The following is some code that you can use to generate a DSNless link to a table in a SQL Server database from Access. It works, at least, in Access 2003. After you have run this code, the SQL tables can be manipulated directly just like any other Access table.
'Name      : AttachDSNLessTable
'Purpose   : Create a linked table to SQL Server without using a DSN
'Parameters
'   stLocalTableName: Name of the table that you are creating in the current database
'   stRemoteTableName: Name of the table that you are linking to on the SQL Server database
'   stServer: Name of the SQL Server that you are linking to
'   stDatabase: Name of the SQL Server database that you are linking to
'   stUsername: Name of the SQL Server user who can connect to SQL Server; leave blank to use a Trusted Connection
'   stPassword: SQL Server user password
Function AttachDSNLessTable(stLocalTableName As String, stRemoteTableName As String, _
                            stServer As String, stDatabase As String, _
                            Optional stUsername As String, Optional stPassword As String)
    On Error GoTo AttachDSNLessTable_Err

    Dim td As TableDef
    Dim stConnect As String

    'Delete any existing linked table with the same name
    For Each td In CurrentDb.TableDefs
        If td.Name = stLocalTableName Then
            CurrentDb.TableDefs.Delete stLocalTableName
            Exit For
        End If
    Next

    If Len(stUsername) = 0 Then
        'Use trusted authentication if stUsername is not supplied.
        stConnect = "ODBC;DRIVER=SQL Server;SERVER=" & stServer & ";DATABASE=" & stDatabase & ";Trusted_Connection=Yes"
    Else
        'WARNING: This will save the username and the password with the linked table information.
        stConnect = "ODBC;DRIVER=SQL Server;SERVER=" & stServer & ";DATABASE=" & stDatabase & ";UID=" & stUsername & ";PWD=" & stPassword
    End If

    Set td = CurrentDb.CreateTableDef(stLocalTableName, dbAttachSavePWD, stRemoteTableName, stConnect)
    CurrentDb.TableDefs.Append td
    AttachDSNLessTable = True

    Exit Function

AttachDSNLessTable_Err:
    AttachDSNLessTable = False
    MsgBox "Error while trying to link to SQL Server on " & stServer & ": " & Err.Description

End Function
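A sample call might look like this (server, database, and table names are placeholders; omitting the username uses Windows authentication):

```vba
' Create a local linked table named "dbo_MyTable" pointing at dbo.MyTable
If AttachDSNLessTable("dbo_MyTable", "dbo.MyTable", "MYSERVER", "MY_DATABASE") Then
    MsgBox "Table linked successfully."
End If
```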
I suspect that running an INSERT from Access with ODBC linked tables would be very slow, as Jet/ACE tends to break each row down into a separate INSERT statement. Jet/ACE is being a good citizen of the multi-user server database, since this allows the server to serialize operations generated by multiple users and interleave other users' updates with the massive update. But it's horridly slow for large datasets, and when you're doing what you're doing here, initializing an empty dataset, it's being too "civically responsible."
You might consider renaming the existing table, and instead exporting the Access table up to SQL Server via an ODBC DSN. If you have the DSN already defined, you just choose EXPORT from the FILE menu in Access, choose the DSN as a destination, and Jet/ACE takes care of the rest, creating the table on the server and uploading all the data. It's very efficient (i.e., won't do it one record at a time) since Jet/ACE knows it's populating a table that didn't previously exist.
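If you'd rather script it than use the menu, the same export can be driven from VBA with DoCmd.TransferDatabase (the DSN, database, and table names here are placeholders):

```vba
' Export the local table to SQL Server through an existing ODBC DSN;
' Jet/ACE creates the table on the server and bulk-loads the rows
DoCmd.TransferDatabase acExport, "ODBC Database", _
    "ODBC;DSN=MySqlServerDSN;DATABASE=MY_DATABASE;Trusted_Connection=Yes", _
    acTable, "BigSourceTable", "BigSourceTable_Staging"
```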
Now, the result might not be exactly what you'd like (Jet/ACE might not get all the data types right because of the translation aspects of ODBC drivers, but the types should at least be compatible, even if they're not the exact, strictest data types you'd have chosen), but you will then have your full dataset in your SQL Server database, and can then append from that SQL Server table to the correctly structured one.
You would want to check the data to make sure nothing has been lost or incorrectly transformed (an example would be a text ZIP code field getting converted to a number; that particular conversion is unlikely, but it's the kind of thing you'd want to check for). Still, for a 500MB dataset, when you don't have good upsizing tools (because the Access versions can't keep up with the recent SQL Server versions), this might get the job done more efficiently with less work.