Delivering updates to an Access database application

I am creating an application in Microsoft Access. This is for a small database that the customer will run on a desktop. No network of any kind will be involved. All the necessary files to use the database must be on a single desktop computer.
I want to deliver the app to my customer in stages. Most likely I will email the .accdb file to the customer. How do I deliver an update and maintain any data already entered by the customer? Updates may include changes to the table structure as well as to forms.
The answers given to my original question address the issue of changing forms and other UI elements. However, what if I want to add a table or add a column to an existing one? How do I seamlessly deliver such changes while preserving as much data as possible on the user's end?

Split the database and the interface into separate files. Google should have plenty of information as this is typical for MS Access apps.
Here are a few resources to get you started:
How to manually split an Access database in Microsoft Access
Splitting an Access Database, Step by Step

You absolutely MUST (!) split your database into two parts: a backend storing the tables ("the database") and a frontend containing the forms, reports, queries and application logic ("the application"). Link the tables from the backend into the frontend.
The frontend might also contain tables with control parameters, report dictionaries etc., but no data that your customer enters!
Newer versions of Access have a database splitting wizard.
You might need code that automatically links the backend to the frontend on the customer's site.
UPDATE
You have two possibilities to alter the schema of your database on the customer's PC.
1) Do the updates through the DAO (or ADOX) object models, e.g.
Dim db As DAO.Database
Dim tdf As DAO.TableDef
Set db = CurrentDb                     ' or OpenDatabase("...path to the backend...")
Set tdf = db.CreateTableDef("tblNew")
tdf.Fields.Append tdf.CreateField("fieldname", dbText, 50)
' ... append further fields as needed
db.TableDefs.Append tdf
2) Use DDL queries
CREATE TABLE MyNewTable (
ID AUTOINCREMENT,
Textfield TEXT(50),
LongField LONG,
...,
CONSTRAINT PK_MyNewTable PRIMARY KEY (ID)
)
Or
ALTER TABLE SomeExistingTable ADD COLUMN Newcolumn Text(50)

Related

ASP.Net Core Identity EF 5.0 restore only users to database

I am using Microsoft.AspNetCore.Identity.EntityFramework in my Blazor Server project and have customized it with no problem. I do enjoy the simplicity and robustness of Identity; however, since I am developing for deployment, I came across a small problem. In disaster recovery, a FULL restore from an MSSQL backup works without a hitch. That kind of backup is all you need as long as you have not changed anything in the ApplicationDbContext of the deployed Blazor Server project. So for disaster recovery to the same state and DbContext there is no issue, but should you have a small change in the DbContext from one install to another, or say you are upgrading the previous version of the Blazor Server project with changes in the context, you cannot recover the users, since you can only do a complete FULL restore with the previous DbContext as the DB schema and design.
How can I restore, or better yet, migrate the data from one instance of the DbContext to another? I tried doing a data restore from a backup device, but it does not let me do the restore because of the foreign keys between the user table and the user claims and user roles tables. The foreign key constraints do not permit me to restore the users with the previously generated GUIDs, which contain vital information linked to other data. Say, for example, one table tracks events that are created by the user; if I re-create the user, it has a different GUID than what is registered in the backup.
Any help will be appreciated.
If you query the AspNetUsers table you get the following:
Id UserName NormalizedUserName Email NormalizedEmail EmailConfirmed PasswordHash SecurityStamp ConcurrencyStamp PhoneNumber PhoneNumberConfirmed TwoFactorEnabled LockoutEnd LockoutEnabled AccessFailedCount
6cd26b55-b0c6-41ae-ac6c-ff9ae3242c3b userone#mymail.com USERONE#MYMAIL.COM userone#mymail.com USERONE#MYMAIL.COM 1 AQAAAAEAACcQAAAAECp/NfxxEWw8fbd0jIrXOGZ/v/ggPscxMIINueP2dUQAihgRwrE1a+t1os/7MwvPCg== OJP4HUHN6FNCXHXETVJTDQLC6RBUCLJS f69a0ff7-9298-4d1f-8faa-e8ab3db3ab70 NULL 0 0 NULL 1 0
9eda9ba1-dd47-4308-ba62-a19381a32d56 usertwo#mymail.com USERTWO#MYMAIL.COM usertwo#mymail.com USERTWO#MYMAIL.COM 1 AQAAAAEAACcQAAAAEFcXpQREt8sfhssYHlH/hRSlk3yX/bGMCYpXpTrid+YLNMFkDr5V45MnIo0JOPmWlw== YLZJSYILLPQZBFGCOJLXKFZB64YXCHIK c389b663-fcd1-4918-b9ed-54bc583666cc NULL 0 0 NULL 1 0
(2 rows affected)
Completion time: 2021-08-06T14:19:46.8143727-07:00
Since the Id is a GUID generated by the EF Core Identity tables, and is linked to the other ASP.NET Core Identity tables such as UserRoles, UserClaims, etc., how can I import user data with the old GUIDs that were previously generated?
Edit Aug 12 2021
I have found a temporary work-around for the problem. For the purpose of explanation I will use APP, OriginalDB, BackedUpDevice (the .bak file), RestoredDB and NewDB.
With the OriginalDB I am continuing to work on the model and the ASP.NET Core/Blazor application (APP), so on occasion the model changes. When the APP initializes for the first time, it uses the model to create the database on the server, should it not exist. I use migrations to create the new model to be put on the server at first run. ASP.NET Core Identity facilitates this.
So the original problem arises only when the model is modified in the APP and the APP needs to re-apply the migration. I don't use subsequent migrations since, once the OriginalDB is in use, there is data, and even if the model changes, the data will need to be copied. I have absolutely no problem with this. I can restore from the BackedUpDevice to another database as the RestoredDB. I then use the RestoredDB to insert the data back into the NewDB, which is a freshly created copy of the OriginalDB schema plus the model changes in the APP. This gives me my data, but I cannot use this method for the ASP.NET Core Identity tables since they use foreign keys.
The work-around I have found is that I can safely delete the ASP.NET Core Identity tables, namely AspNetRoles, AspNetUserClaims, AspNetUserLogins, AspNetUserRoles and AspNetUsers, and copy the schema and tables from the RestoredDB to the NewDB. While recreating the tables using this method works, the problem is that I have also modified these user tables, not only changing the table names but also adding custom fields as shown primarily in This Article; on future iterations of the APP there may be additional custom fields and/or changes that affect these tables. Hence I need to see how I can copy just the RestoredDB user table data, including the generated GUIDs for the users with foreign keys, into the NewDB user tables. Any suggestion will be greatly appreciated.
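For what it's worth, because the Identity Id columns are plain string GUIDs rather than IDENTITY columns, the copy step being described can usually be done with INSERT ... SELECT statements run in dependency order, carrying the old GUIDs across unchanged. A rough sketch, assuming the default Identity table and column names (the renamed tables and custom columns from the customization would have to be substituted):
-- Parents first, so the foreign keys on the child tables are satisfied.
-- Table and column names are the stock Identity ones; adjust to the customized schema.
INSERT INTO NewDB.dbo.AspNetUsers
    (Id, UserName, NormalizedUserName, Email, NormalizedEmail, EmailConfirmed,
     PasswordHash, SecurityStamp, ConcurrencyStamp, PhoneNumber, PhoneNumberConfirmed,
     TwoFactorEnabled, LockoutEnd, LockoutEnabled, AccessFailedCount)
SELECT Id, UserName, NormalizedUserName, Email, NormalizedEmail, EmailConfirmed,
       PasswordHash, SecurityStamp, ConcurrencyStamp, PhoneNumber, PhoneNumberConfirmed,
       TwoFactorEnabled, LockoutEnd, LockoutEnabled, AccessFailedCount
FROM RestoredDB.dbo.AspNetUsers;

INSERT INTO NewDB.dbo.AspNetRoles (Id, Name, NormalizedName, ConcurrencyStamp)
SELECT Id, Name, NormalizedName, ConcurrencyStamp
FROM RestoredDB.dbo.AspNetRoles;

-- Child tables afterwards; their UserId/RoleId values now resolve against the rows above.
INSERT INTO NewDB.dbo.AspNetUserRoles (UserId, RoleId)
SELECT UserId, RoleId FROM RestoredDB.dbo.AspNetUserRoles;

INSERT INTO NewDB.dbo.AspNetUserClaims (UserId, ClaimType, ClaimValue)
SELECT UserId, ClaimType, ClaimValue FROM RestoredDB.dbo.AspNetUserClaims;

-- AspNetUserLogins and AspNetUserTokens can be copied the same way if they are in use.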
I obviously don't know everything about your situation, and I don't understand some parts of your explanation at all, but the approach you've outlined is going very far down the wrong road.
EF Core 5.0 migrations allow a lot of flexibility in how they are created, maintained, and applied. You should leverage these features to accomplish what you want to do, rather than creating new databases and copying data into them.
Some examples that seem to be applicable to your use case:
From Using a Separate Migrations Project: you may want to consider having two different sets of migrations - one for local development / iteration, and one that just has migrations that take the released application from one version to the next.
From Customize migration code: solving your problems with the automated migrations may involve customizing the automatically generated migration code, or creating empty migrations and writing completely custom code to perform intermediate data migration steps (a rough SQL sketch of such a step follows this list).
From Excluding parts of your model: you may need to exclude certain objects in one DbContext from being included in the migrations for another DbContext (for instance, if your Identity tables are coming from a different DbContext than the rest of your app tables, but you reference the Identity entities in your main app's DbContext).
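As a rough illustration of the "completely custom code" idea: a hand-written migration can execute SQL (for example via migrationBuilder.Sql) that reshapes the schema while keeping the existing rows, instead of dropping and recreating the database. The statements below are only a hypothetical sketch; the DisplayName column and its backfill are invented for illustration:
-- Hypothetical data-preserving step that a custom migration might run:
-- add a new column, then backfill it from data that already exists.
-- (When run from a migration, each batch below would be a separate migrationBuilder.Sql call
--  rather than a GO separator.)
ALTER TABLE dbo.AspNetUsers ADD DisplayName nvarchar(100) NULL;
GO

UPDATE dbo.AspNetUsers
SET DisplayName = UserName
WHERE DisplayName IS NULL;
GO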

Is this an appropriate database design if I wanted to audit my table?

It's my first time creating an audit log for a PoS WPF application, and I was wondering how exactly to implement an auditing system, because each available option seems to have its ups and downs. So far, from reading numerous articles/threads, I've narrowed down a few common practices for audit logs:
1. Triggers - Unfortunately, due to the nature of my app, I can't make use of triggers, as a trigger has no way of knowing which user performed the action. So what I did instead was to create a stored procedure which handles the customer insert along with the customer log insert and its details (a sketch follows this list). The Customer_Id will be provided by the application when using the stored procedure.
2. Have an old and new value - My initial plan was to only include the latter, since I can derive the old value from the row before it, but storing both the old and new values seemed more sensible, complexity-wise.
3. Use a separate database for the log / 4. Foreign keys - This is probably my main concern: if I decide to use a separate database for the audit table, then I can't set up foreign keys for the customer and employee involved.
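For illustration, the stored procedure from point 1 might look roughly like the sketch below. The table, column and parameter names are placeholders rather than the real schema; the point is that the application supplies the ids a trigger could not know:
-- Sketch only: insert the customer and its audit row in one transaction.
CREATE PROCEDURE dbo.usp_InsertCustomerWithAudit
    @CustomerId int,            -- provided by the application, as described above
    @Name       nvarchar(100),
    @EmployeeId int             -- the acting user, which a trigger cannot determine
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRANSACTION;

    INSERT INTO dbo.Customer (CustomerId, Name)
    VALUES (@CustomerId, @Name);

    INSERT INTO dbo.CustomerLog (CustomerId, EmployeeId, Action, LoggedAt)
    VALUES (@CustomerId, @EmployeeId, 'INSERT', SYSDATETIME());

    COMMIT TRANSACTION;
END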
I created a mock-up ERD with a master-detail table result to be shown in the WPF app to display the log to an admin, and would really like your thoughts on possible problems that may arise (there's also an employee table, but I forgot to include it):
https://ibb.co/dheaNK
Here's a few info that might be of help:
The database will reside together with the WPF app on a single computer.
The amount of customers will be less than 1000.
The amount of regular employees will be 3.
The amount of admins will be 2.
You can enable CDC (Change Data Capture) on a SQL Server database for a specific table.
This will collect all data changes on that table and log them in special change tables.
You can also refer to the official documentation.
Here is a list of DML commands and how data changes are logged in the CDC table created for the source database table.
What is good about CDC is that it comes with SQL Server by default and you don't have to do anything for logging. The only requirement is that SQL Server Agent should be running so that the changes can be reflected in the log table.
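A minimal sketch of what enabling CDC for one table looks like (dbo.Customer is an assumed table name, and CDC also requires an edition that supports it):
-- Enable CDC for the database, then for the source table.
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'Customer',   -- assumed table name
    @role_name     = NULL;          -- no gating role

-- Changes land in a generated change table (cdc.dbo_Customer_CT) and can be read
-- through the generated table-valued function:
DECLARE @from_lsn binary(10) = sys.fn_cdc_get_min_lsn('dbo_Customer');
DECLARE @to_lsn   binary(10) = sys.fn_cdc_get_max_lsn();

SELECT *
FROM cdc.fn_cdc_get_all_changes_dbo_Customer(@from_lsn, @to_lsn, N'all');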

Perform InsertOnSubmit on a View

We have an ASP.NET MVC application with Linq2Sql and a SQL Server backend. The application is run on the main site of our customer, but every site has its own database in SQL Server. (Due to different reasons, they shouldn't share most of their information with each other.) However, some general information is shared; this is stored in a shared database, and every site-specific database has views which represent those tables in their respective databases.
For example, I have sites S1, S2, S3 with their databases D1, D2, D3 and a shared database DS with a shared table TS.
Now in the databases for S1-S3 I'll have a view whose underlying query is simply:
SELECT * FROM DS.TS;
Having it written like this, SQL Server somehow automagically propagates all inserts, updates and deletes to DS.TS without the need for explicit INSTEAD OF triggers. This makes our lives much easier, since we only need to handle one connection to one database and don't need to bother with two different databases.
Since we write our Delete and Update commands ourselves in the application and don't use Linq2Sql for them, they work fine. However, the insert command on the shared table uses InsertOnSubmit and fails with the following exception:
Can't perform Create, Update, or Delete operations on 'Table(TS)' because it has no primary key.
at System.Data.Linq.Table`1.InsertOnSubmit(TEntity entity)
Is there any way to make this work, or will I have to create the insert commands on those shared tables myself and execute them with DbCommand.ExecuteNonQuery()?
LINQ to SQL with views doesn't know which column contains the key. You may be able to tweak the generated model and only set the column(s) that should be primary keys as appropriate. Be aware, however, that you shouldn't do SELECT * in your view, as it may kill performance over time.
You can set the IsPrimaryKey property of the primary key column in your model to True.

Postgresql - one database for everyone, or one-database per customer

I'm working on a web-based business application where each customer will need to have their own data (think of the basecamphq.com type of model). For scalability and ease of upgrades, I'd prefer to have a single database where each customer gets a filtered version of the data. The problem is how to guarantee that they stay sandboxed to their own data. Trying to enforce it in code seems like a disaster waiting to happen. I know Oracle has a way to append a WHERE clause to every query based on a login id, but does PostgreSQL have anything similar?
If not, is there a different design pattern I could use (like creating a view of each table for each customer that filters)?
Worst case scenario, what is the performance/memory overhead of having 1,000 databases of 100 MB each vs. having a single 1 TB database? I will need to provide backup/restore functionality on a per-customer basis, which is dead simple with one database per customer but quite a bit trickier if customers share a database.
You might want to look into adding Veil to your PostgreSQL installation.
Schemas plus inherited tables might work for this: create your master table, then inherit tables into per-customer schemas which provide a company ID or name field default.
Set the permissions per schema for each customer and set the schema search path per user. Use the same table names in each schema so that the queries remain the same.
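A rough sketch of that setup in SQL, with an invented orders table and a single example tenant called acme (all names are placeholders):
-- Master table that holds the shared structure.
CREATE TABLE public.orders (
    id         serial PRIMARY KEY,
    company    text NOT NULL,
    placed_at  timestamptz NOT NULL DEFAULT now()
);

-- Per-customer schema with an inherited table that defaults the company column.
-- Note that indexes and PK/unique constraints are not inherited; add them per child if needed.
CREATE SCHEMA acme;
CREATE TABLE acme.orders (
    company text NOT NULL DEFAULT 'acme'
) INHERITS (public.orders);

-- Sandbox the customer's login to its own schema and make it the default search path,
-- so the application can keep issuing unqualified queries like SELECT * FROM orders.
CREATE ROLE acme_user LOGIN PASSWORD 'change-me';
GRANT USAGE ON SCHEMA acme TO acme_user;
GRANT SELECT, INSERT, UPDATE, DELETE ON ALL TABLES IN SCHEMA acme TO acme_user;
GRANT USAGE ON SEQUENCE public.orders_id_seq TO acme_user;  -- needed for the serial default
ALTER ROLE acme_user SET search_path = acme;
With this layout, a query against public.orders from a reporting or admin role sees every tenant's rows (inheritance pulls in the child tables), while acme_user only reaches its own schema.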

Copy Database Data from Many DBs to One. Data Replication (sort of)

This involves data replication, kind of:
We have many sites with SQL Express installed; there is an 'audit' database on each site that has one table in 1st normal form (to make life simple :)
Now I need to get this table from each site and copy the contents (say, with a DateTime value > 1/1/200 00:00, but this will change obviously) into a big 'super table' in SQL Server proper, whose primary key is the Site Name (that needs injecting in) plus the current primary key from the SQL Express table.
e.g. Many SQL Express DBs with the following table columns
ID, Definition Name, Definition Type, DateTime, Success, NvarChar1, NvarChar2 etc etc etc
And the big super table needs to have:
SiteName, ID, Definition Name, Definition Type, DateTime, Success, NvarChar1, NvarChar2 etc etc etc
Where the items in bold (SiteName and ID) are the primary key(s).
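For reference, the super table described above could be declared roughly like this (data types are guesses based on the column list; adjust to the real source columns):
-- Aggregated audit table; SiteName plus the site's original ID form the composite key.
CREATE TABLE dbo.SiteAudit (
    SiteName        nvarchar(100) NOT NULL,
    ID              int           NOT NULL,   -- the primary key value from the site's table
    DefinitionName  nvarchar(255) NULL,
    DefinitionType  nvarchar(50)  NULL,
    [DateTime]      datetime      NOT NULL,
    Success         bit           NULL,
    NvarChar1       nvarchar(255) NULL,
    NvarChar2       nvarchar(255) NULL,
    CONSTRAINT PK_SiteAudit PRIMARY KEY (SiteName, ID)
);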
Is there a Microsoft (or non-MS, I suppose) app/tool/thing to manage copying all this data across already, or do we need to write our own?
Many thanks.
You can use SSIS (which comes with SQL Server) to populate it; it can be set up with variables to change the connection string to the various databases. I have one that loops through a whole list and does the same process using three different files from three different vendors. You could do something similar to loop through the different site databases. Put the whole list of databases you want to copy the audit data from in a table and loop through it, changing the connection string each time.
However, why on earth would you want one mega audit table per site? If every table in the database populates the audit table as changes happen, then the audit table eventually becomes a huge problem for performance. Every insert, update and delete has to hit this table and then you are proposing to add an export on top of that. This seems to me to be a guaranteed structure for locking and deadlocks and all sorts of nastiness. Do yourself a favor and limit each audit table to the table it is auditing.
Things to consider:
Linked servers and sp_msforeachdb as part of a do-it-yourself solution (a rough sketch appears after this answer).
SQL Server Replication (by Microsoft) (which I believe can pull data from SQL Server Express)
SQL Server Integration Services which can pull data from SQL Server Express instances.
Personally, I would investigate Integration Services first.
Good luck.
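For the do-it-yourself route with linked servers mentioned in the list above, the per-site pull could be as simple as an INSERT ... SELECT like the sketch below (the linked server, database and column names are placeholders, and the cutoff value is just an example), feeding the dbo.SiteAudit table sketched earlier in the question:
-- Pull new audit rows from one site over a linked server into the super table.
DECLARE @cutoff datetime = '20090101';

INSERT INTO dbo.SiteAudit
    (SiteName, ID, DefinitionName, DefinitionType, [DateTime], Success, NvarChar1, NvarChar2)
SELECT 'Site1', a.ID, a.DefinitionName, a.DefinitionType, a.[DateTime], a.Success, a.NvarChar1, a.NvarChar2
FROM [SITE1].[AuditDb].[dbo].[Audit] AS a
WHERE a.[DateTime] > @cutoff
  AND NOT EXISTS (SELECT 1
                  FROM dbo.SiteAudit s
                  WHERE s.SiteName = 'Site1'
                    AND s.ID = a.ID);   -- don't re-copy rows that are already there

-- Repeat per site, or drive the site list from a table and build the statement dynamically.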
You could do this with SymmetricDS. SymmetricDS is open source, web-enabled, database independent, data synchronization/replication software. It uses web and database technologies to replicate tables between relational databases in near real time. The software was designed to scale for a large number of databases, work across low-bandwidth connections, and withstand periods of network outage.
As of right now, however, you would need to implement a custom IDataLoaderFilter extension point (in Java) to add the extra column. The metadata would be available though because your SiteName would be the external_id.
