I started exploring databases and PostgreSQL not long ago. My goal is to create a database for my company into which I can upload data, so that other people can access it through ODBC connections and explore it in Excel. I managed to create a test database on my PC and everything worked properly.
I then moved to the next step, which would be sharing this test database with my colleagues.
I assume the database can't stay on my own PC (as it does today), because people need to be able to access it even when my PC is turned off.
We have a network drive, where all our data is currently stored. Is it possible to host my database there? What else should I know for this project I'm conducting?
I'm quite new to the subject and I'm just trying to explore some possibilities.
(We all run Windows)
I have a series of Access 2002 "Front-end/Back-end" applications all related to each other. For example, application A has linked tables with application B and vice versa.
The applications are at a stage where a daily compact and repair is required due to the data volume and the high level of corruption. Moreover, to make the applications work properly, I must make any changes in a virtual environment with Access 2002. I also need to reinstall the 32-bit Access 2010 runtime and copy the Access files (.mde) to every workstation (Windows 10) every time I make a change to the applications.
@Gustav This is a temporary option (6 to 18 months) because the customer would like to move to a complete solution with a SQL database. The solution under consideration is configurable and already has a SQL database schema.
I have already done a test transfer of the forms, tables, queries and modules to Access 365, but I get errors in the VBA code. All business rules are coded in VBA. I also transferred the tables to SQL Server 2017, but I'm afraid I will have to change a lot of VBA code because of the move away from the DAO engine in the Access 365 front end.
In fact, to be clearer, I wonder whether I really need to move the front end to a newer version of Access at all, knowing that it is a temporary solution.
Maybe I should keep the software on life support by purging the data history from the large tables while the client makes their decision; find the "sweet spot" that would let me purge data and continue to maintain the applications without having to worry about corruption, because I have a hard time seeing a substantial gain in migrating to an Access 365 front end. What do you think?
I have already proposed migrating the applications and tables to Access 2019, and even moving to SQL Server. However, for now, I must keep the application on life support and continue the daily compact and repair until a decision is made.
I would like to know if there is any gain in migrating from Access 2002 to Access 2019, given Access's total 2 GB file-size limit. And what would be the major constraints in migrating to a SQL database, knowing that the application and its VBA code use DAO?
@Albert D. Kallal
I really like your answer. I should come to a decision soon.
However, I have two additional questions. Perhaps you could guide me on the subject.
There are two things that have recently come to disturb the tranquility that reigned over these applications.
1. For an unknown reason, one of the applications in this swarm of Access applications was blocked for some time with the error 'Run-time error 3027: Cannot update. Database or object is read-only.' The problem is that some users ignored this error and continued their tasks, which left the data out of sync.
I had to restore from a backup copy because some tables were not updated.
Looking more closely at the error in the VBA code, I noticed that it all came from the DAO Recordset.Edit method used on queries with multiple joins.
I managed to work around this problem by replacing the Edit method with DoCmd.RunSQL and changing the query from a SELECT to an UPDATE query.
However, the whole method worked perfectly well before.
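Roughly, the change was of this form (the table, query and field names below are placeholders, not the real objects):

' Before: editing through a DAO recordset opened on a query with multiple joins
Dim rst As DAO.Recordset
Set rst = CurrentDb.OpenRecordset("qryOrdersWithCustomers", dbOpenDynaset)
rst.Edit
rst!Status = "Processed"
rst.Update
rst.Close

' After: running an equivalent UPDATE query against the base table instead
DoCmd.SetWarnings False
DoCmd.RunSQL "UPDATE tblOrders SET Status = 'Processed' WHERE OrderID = 123"
DoCmd.SetWarnings True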
Can you explain to me the cause of this error?
2. The original developer did not necessarily use best practices in the design of the application (no autonumbers, some tables without a primary key, no foreign keys), so I'm afraid a new design will be needed if I migrate to a SQL Server database. Or maybe, to save time, since this solution is going to disappear in 18 months, I should simply replicate the bad practices in SQL Server and pray that it doesn't cause any more glitches. What would be your professional approach?
Thank you
There may be none. Open the database in Access 2019/365 and save it in the 2007 (accdb) format, and check it out.
As for the distribution, you can make this fully automatic using a script and a shortcut. It is explained in full in my article:
Deploy and update a Microsoft Access application with one click
If you don't have an account, browse to the link: Read the full article.
Push hard to get a confirmation on the move of all the shared tables to an SQL Server backend.
A few things:
To update your mde or accde front end? That is a simple copy to each workstation. You don't need to re-install the runtime each time. There is no "special" connection between the runtime and any one particular application (mde/accde) that you deploy to each workstation.
In other words:
If you're writing software in VB6, then you need to install the VB6 runtime (but only one time). After that you can simply copy and deploy your application to each workstation.
If you're writing software in, say, .NET, then again you have to ensure the correct .NET framework is installed on each computer. Once that's done, you can again update your software with a simple copy to each workstation.
And the same goes for using the Access runtime. Once it is installed, you can simply copy any mde/accde to that workstation, double-click on it and it will run. So the runtime is not tied to any particular database you copy to the workstation. Once you have the runtime installed, you can rather easily cook up some automatic update code for the front end that "checks" some version number and then copies down the new, updated front end. There are quite a few ways to do this – even a simple batch file can often suffice here.
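As a rough sketch of such update code (a minimal example only – the UNC path and file names are placeholders, and you would adapt the check to however you track versions):

' Minimal front-end update check (sketch - path and names are placeholders)
Public Sub CheckForNewFrontEnd()
    Const strServerCopy As String = "\\YourServer\Apps\MyApp.accde"  ' latest deployed front end
    Dim strLocalCopy As String
    strLocalCopy = CurrentProject.FullName                           ' the copy the user launched

    ' compare file dates - a version number stored in a back-end table works just as well
    If FileDateTime(strServerCopy) > FileDateTime(strLocalCopy) Then
        MsgBox "A new version is available and will be copied down when Access closes.", vbInformation
        ' you cannot overwrite the running file, so copy to a temporary name and
        ' let a small batch file (or script) do the swap and restart
        FileCopy strServerCopy, strLocalCopy & ".new"
        Application.Quit
    End If
End Sub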
So in nearly all cases these days, you will have to do a "one time" install of the required runtime and support libraries. This is the case for .NET, older VB6 programs, or Access.
As for migration of the Access table data to SQL Server?
You should be able to simply migrate the table data to SQL Server. Then re-link the application tables that pointed at the older Access back end so they point at SQL Server.
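For example, once the data is up on SQL Server, re-linking a table from VBA is a one-liner per table (the driver, server, database and table names below are only placeholders):

' Link one SQL Server table into the front end (sketch - connection details are placeholders)
DoCmd.TransferDatabase acLink, "ODBC Database", _
    "ODBC;DRIVER={SQL Server};SERVER=YourServer;DATABASE=YourDb;Trusted_Connection=Yes", _
    acTable, "dbo.tblCustomers", "tblCustomers"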
At this point, 99% of your VBA and even DAO recordset code should work just fine.
There is no need (nor even a good reason) to dump the DAO code you have – it should work as before with VERY few modifications.
About the only change is for code that does this:
Dim strSQL As String
Dim rst As DAO.Recordset
strSQL = "SELECT * FROM tblCustomers WHERE City = 'Edmonton'"
Set rst = CurrentDb.OpenRecordset(strSQL)

' the above, for SQL Server linked tables, becomes:
' (dbSeeChanges is required when the linked table has an IDENTITY column)
Set rst = CurrentDb.OpenRecordset(strSQL, dbOpenDynaset, dbSeeChanges)
And you can even migrate the tables with indexes and table relationships intact by using the SQL Server Migration Assistant for Access (SSMA). You can find this fantastic tool here:
https://www.microsoft.com/en-us/download/details.aspx?id=54255
So, about 99% of existing forms and VBA code will work as before after you migrate the data to SQL server.
I have started a new job and there is currently no database to store the data I need to perform some financial analysis (what a shame). I have to start from scratch and build a few tables which I will load every day. This is all new to me.
I will ask IT to install SQL Server Management Studio as well as a local server instance in which I will store my tables (similar to the SQL Server Express instance installed on my personal laptop).
I will then load the data every day. I am worried that other people on the network could play around with and amend the data.
Do I need to secure the database or not (since it's a local install)?
In case they install it on a common drive, how do I make sure that a password is needed to get in or that other people only have read access?
Thanks and Regards
Ok, let me explain the environment we are facing here:
We have an ASP.NET MVC 4 app that uses a SQL Server database.
This app isolates data into "projects", so when any user connects to it they can only work on the data of one of these projects.
Sometimes a group of users has to travel to remote regions for some days to retrieve data for a single project, and quite often they won't have an internet connection (even mobile or satellite solutions are often out of reach).
While the displaced team works on a project, people at the office can still work on the rest of the projects (but not on the one that is abroad).
So... we are pondering the possibility of using a laptop to act as a "mobile server", where users can download the data from a specific project before travelling. While abroad, they can work against this "mobile server", update any data on their project and, when they come back, they could upload their updated data to the main server.
Our idea is to create stored procedures on both servers (main and mobile) that execute queries to move a project's data between them, passing the project identifier as a parameter – probably using linked servers so the main and mobile servers can see each other during the update operations.
Our questions here are:
Is this a good approach?
Is there any other better approach that we're not seeing?
Are there any risks we should pay attention to in this or other approaches?
I've never used Bidirectional transaction replication so if that works for you, problem solved. I do have quite a bit of experience with data migration, including merging large data sets into software driven systems. And from that experience, replication has hurt us more than it has helped us (from a migration/merge view).
The biggest challenge, in my opinion, is going to be conflict resolution. I know you say that all of the data is in project-specific databases, but is there no shared data at all? What about multiple remote users updating the same data? In that case you're going to need a little more than just replication.
Instead of maintaining two databases at all times (one for mobile, one as the regular in-house DB), why not a system where a job is run on your main system indicating that a project needs to be prepared for "offline mode"? (The job could be stored procedures, SSIS packages or straight T-SQL.) Whatever the technology used, this job would copy all of the requested project data to a new database on the remote server/laptop and mark it somehow in the main database as read-only, to prevent users in the office from updating that data.
Once the data is in offline mode on the remote server, the users can update and use the data as much as they want from that remote server. Then, when the users get an internet connection or are back in the office, they can kick off another job that syncs the data back to the main server, removes offline mode, and deletes/archives the remote database. Almost like a temporary project database.
Seriously, it sounds like a fun project.
Technologies to look at:
SSIS (SQL Server Integration Services) - In my experience, this is extremely fast at moving data and gives you the ability to add logic to handle conflict resolution, error handling, etc. It's free (with certain SQL Server editions) and the community is huge, so supporting it should be easy. SSIS is not as dynamic as some of the specialized solutions out there.
A data migration suite like Pervasive's Data Integrator - I loved this, but it's expensive. You could write an entire solution in this product that handles processing your data bidirectionally, and like SSIS it allows for complex programming logic.
T-SQL - With a linked server you could just write straight queries (using stored procedures if you wanted). The problem here is security on the linked server. We don't use them because of this issue. Linked Servers: Good or Bad?
Start using some of Microsoft's built-in change detection technologies right off the bat; they're harder to implement once you're already using the system. Change Data Capture (CDC) will give you a full history of the records updated, while Change Tracking will give you a lightweight summary of your changes. Using either technology will make syncing the data many times easier.
Change Tracking: http://msdn.microsoft.com/en-us/library/bb933874.aspx
Change Data Capture: http://msdn.microsoft.com/en-us/library/cc645937.aspx
SSIS: http://msdn.microsoft.com/en-us/library/ms169917.aspx
SQL Server Agent Jobs: http://msdn.microsoft.com/en-us/library/ms189237.aspx
I've recently been hired on as an intern to take over a previous intern's Access 2003 Database. I have no prior experience in Access, and only a fundamental understanding of relational databases/SQL.
I'm looking to make the database faster, and more secure. Right now it's split on the network drive, with the backend database in a subfolder within the main project folder. It's being used by around 70 employees to take tests and store certifications. Several admins use it to create and print these tests.
It's extremely slow. The files are currently stored on a server several states away. If I transferred this database to Sharepoint, would it be faster and more secure? Is it worth the time and effort to do so?
The employees that use this database currently access it from a .exe on their desktop. Would SharePoint be more user-friendly for them?
Alternatively, would moving the .mdb files to a closer server solve the speed problem? I'm currently using Access 2010. The forms are painfully slow to use as of right now.
Thank you
Moving the files to a local server would alleviate a lot of the speed concerns. Moving the file to SharePoint wouldn't do much different in terms of performance. But I'm assuming the files aren't local already for an unstated reason? Ideally, the data should be moved to a SQL Server database if you want to move it, but that requires SQL Server knowledge.
Moving to SharePoint will only work if you up-size the data tables to SharePoint lists.
You cannot place the Access mdb/accdb file on SharePoint in some shared folder and have multiple users update it at the same time. The reason, of course, is that SharePoint files cannot accept "partial" writes. You have to pull the whole file down to the client, update it, and send the whole file back. So this is not a possible setup with Access.
In multi-user mode, Access requires that individual users be able to update ONLY bits and parts of the file at the SAME time. When you place a Word, Excel or, in this case, an Access file on SharePoint, the WHOLE FILE must be downloaded to the client. The user then edits and saves the whole file back up to SharePoint. So SharePoint is whole-document based, not file based like Windows is. There is no NTFS file system – only a web-based up/down file system (very much like FTP).
So SharePoint is a web-based interface, and Access requires the Windows networking system PLUS the ability to update bits and parts of the file (something SharePoint does not support, nor any web site for that matter).
However, if you move your back-end tables out of Access and up-size the data to SharePoint tables (lists), then the Access front-end clients can connect to and edit that data. This is not much different in concept from up-sizing the data tables to SQL Server.
So Access front ends can connect to an Access back end on a file server (your current setup), connect to SQL Server tables, or connect to SharePoint tables.
I explain how to up-size data tables to SharePoint in this video:
https://www.youtube.com/watch?v=3wdjYIby_b0
In some cases Access to SharePoint tables will run absolute circles around Access to SQL Server. However, in other cases such a setup will run SLOWER than SQL Server. Only an experienced Access developer, on a case-by-case basis, can determine whether SharePoint tables would be appropriate for your application. As the other poster points out, adopting SharePoint or SQL Server will require experience with those technologies, along with likely a few good years of Access experience. Remember, Access has a rather long learning curve – in most cases longer than, say, learning C++.
In your case, due to the Wide Area Network (WAN), I suggest terminal services is your best bet.
I explain in easy-to-grasp terms why your current setup is slow, and what solutions you can adopt, in this article:
http://www.kallal.ca//Wan/Wans.html
I've been tasked to take an Access 2007 application that relies on an ODBC connection and share it with other institutions with the same ODBC connection. Please forgive me if I don't communicate this very well. I'm not a developer, but have been tasked with this project since I've gotten it this far. I'm sure that's never happened before...
First I'll give a layout of our structure:
I work for a college that shares a database via ODBC with 31 other schools.
The system office that maintains the database for all campuses only allows us to access the read-only data through a VPN to a Common Access Point (CAP) server, which then connects via ODBC.
The CAP server (the only location that can link to the ODBC) has Microsoft Office and does not have internet access.
Each campus has a unique ODBC connection that requires relinking tables when the accdb is placed on their CAP server.
With each launch of Access, the user must also log in to the ODBC connection.
The CAP server can read and write to a network drive, but not vice versa.
We can safely assume that no other software can be installed on the CAP server, but files may be placed there (which is why we can distribute an accdb file).
The Access application pulls student course activity from the ODBC source and applies logic to determine if/when a student stops attending all courses. At this time, this logic is a series of queries tied to a macro. The database then generates a report (with more information from the ODBC source) of those students. An active tracking process is in place so that a record can be cleared from the report unless a change occurs, in which case the record reappears with the changes. This requires data to be stored locally as well, since the ODBC source is read-only. There are various forms and reports backed by VBA as well.
The goal is to package the software and distribute it for launch at all the other campuses. So far we've done a small distribution by simply sending them the accdb file, with a button that launches the Linked Table Manager. After the initial distribution, I will continue developing the software and distributing updates, having to preserve the data stored locally in the accdb.
The catch is that I only have experience with Access and enough knowledge of VBA to be able to google solutions individually as they come up.
My question could be simple or complex, I'm not sure. Basically, I'd like to know whether there is a more appropriate approach than what I've been doing: sending the accdb and having the user copy and paste the only table that needs to be carried over.
Clarification
Would it be practical to convert the accdb to an executable with each version that is distributed? Is this even possible when the ODBC requires reconnecting and the ODBC is unique between campuses?
Requiring the end user to copy and paste the table(s) that store local data from one accdb file to another for each upgrade will eventually lead to data loss - someone somewhere will forget this step during an upgrade.
A more reliable approach would be to create a second accdb database. Call it "YourAppName_data.accdb" or something to that effect and either place it in the same directory as the front-end client or in a subdirectory called "Data". Link the tables from the "data" accdb file to the front-end client.
You can add startup code to your front-end client that attempts to automatically relink these tables by looking for the data accdb file in the known location. If the program can't find it, you can then prompt the user to find it for you. Incidentally, you should be able to do something similar for the ODBC tables as well. You can use code similar to what ChrisPadgham wrote in his answer for this step.
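A rough sketch of that startup code, assuming the data file sits next to the front end and uses the name suggested above (adjust the path and file name to your setup):

' Re-link local tables to the data accdb found next to the front end (sketch)
Public Sub RelinkLocalTables()
    Dim strDataFile As String
    Dim tdf As DAO.TableDef

    strDataFile = CurrentProject.Path & "\YourAppName_data.accdb"   ' expected location
    If Len(Dir(strDataFile)) = 0 Then
        ' not found - prompt the user to locate the file (or raise an error)
        MsgBox "Data file not found. Please locate YourAppName_data.accdb.", vbExclamation
        Exit Sub
    End If

    For Each tdf In CurrentDb.TableDefs
        ' only touch linked Access tables (ODBC links start with "ODBC;")
        If Len(tdf.Connect) > 0 And Left$(tdf.Connect, 5) <> "ODBC;" Then
            tdf.Connect = ";DATABASE=" & strDataFile
            tdf.RefreshLink
        End If
    Next tdf
End Sub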
What you have done at this point is separate the data for the application (both the read-only data and the data each school needs to be able to maintain on their own) from the application front-end (the forms, queries, logic and reports).
This will make it easier to distribute updates to the front-end client. End users simply need to copy the front-end client to the correct directory, overwrite the existing file in the directory and run the program.
This will work, but it is still not as robust as it could be since anyone who has access to the CAP server could potentially delete the data files off of the server. (Hopefully each school takes regular backups of this machine to safeguard against data loss.)
As HansUp suggested, you might eventually be better served by moving the data stored in the Access accdb files to a SQL Server database at each location, which will offer a bit better control of who can access the information and would be a bit better at safeguarding the data since SQL Server database files are "locked" on the machine when the server is running. (This would prevent someone from accidentally deleting a file). The downside to SQL Server is that there is a learning curve and it would need to be installed at each school either on the CAP server or some other machine that the CAP server can access on the network. This might be something to work towards over time once you have better information to go off of.
You can add a relink button that loops through the tables in your database and reconnects them:
Dim tdf As DAO.TableDef
Dim db As DAO.Database

Set db = CurrentDb                      ' Set is required when assigning an object
db.TableDefs.Refresh
For Each tdf In db.TableDefs
    With tdf
        If Len(.Connect) > 0 Then                  ' this is a table that has a connect string
            If Left$(.Connect, 4) = "ODBC" Then    ' this is an ODBC-linked table
                .Connect = newConnectString        ' your new ODBC connect string
                .RefreshLink
            End If
        End If
    End With
Next tdf