After the first synchronization with filters, data created before the first synchronization is not downloaded when it matches a filter through a new association.
I have one SQL Server and several SQL Server CE clients. I create scopes <MAC>#Setup in which filtered data is sent to the clients in a Download DirectionOrder. The first synchronization works fine (the schema is created and the data is downloaded), but in subsequent syncs, data created before the first synchronization is not downloaded when it is matched by the filter. Only new inserts and updates are picked up.
SyncFx tracks and applies changes per table. Assuming you have a linking/association table, when you change the association, it is the association table that is updated. SyncFx will not grab the associated rows from the other tables, because it only knows that the linking/association table was updated, not the related tables.
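A common workaround is to "touch" the related rows whenever the association changes, so that change tracking marks them as updated and the next sync downloads them to the newly matching client. A minimal sketch, assuming trigger-based change tracking and hypothetical table and column names (dbo.Orders, dbo.OrderAssociation, OrderId, OrderStatus, ClientId):

    -- Hypothetical schema: dbo.Orders holds the data rows, dbo.OrderAssociation
    -- links orders to clients. After re-assigning an association, run a no-op
    -- update on the related rows; the write still fires the tracking trigger,
    -- so SyncFx enumerates those rows on the next filtered sync.
    DECLARE @ClientId int = 42;  -- the client whose filter should now match

    UPDATE o
    SET    o.OrderStatus = o.OrderStatus  -- no-op assignment, still recorded as an update
    FROM   dbo.Orders AS o
    JOIN   dbo.OrderAssociation AS a
           ON a.OrderId = o.OrderId
    WHERE  a.ClientId = @ClientId;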
My application is a front-end MS Access application linked to a SQL Server database.
I have a form in MS Access for the Orders table and a subform for the OrdersLines table.
In the OrdersLines table there is a trigger which calculates the total (Quantity × unit price) and more.
The "funny" thing is that in MS Access, when I create a new order, I cannot modify the Orders table, because the database and Access no longer have the same data.
So when I run Me.Requery in MS Access after the new order is created, the requery sends me to a new record.
This does not happen when I modify an existing order.
I have tried many things, but I can't find a way to stay on the current record after creating a new order.
Any idea will be welcome
Nico
The easiest way to solve this problem is to add a single TimeStamp field to each of your SQL Server tables.
Microsoft Access can track changes to records in SQL Server via the TimeStamp field, and it will automatically requery the data from SQL Server and eliminate the "The Data has been modified by another User" message.
The field you add can have any name you wish (I use the name tsJET, as this field specifically helps the JET/ACE engine track record changes) and the type for the field is timestamp. You don't have to include this field in any queries or forms; it simply needs to exist in the table.
Be sure to refresh the table links after adding this field to your SQL Server tables so that Access can "see" the structural changes to the tables.
NOTE: You cannot modify the data in the TimeStamp field. SQL Server handles that automatically.
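For example, a minimal sketch for one table (dbo.OrdersLines is an assumed name; repeat for each linked table). Note that in current SQL Server versions the timestamp type is a deprecated synonym for rowversion:

    -- Add a row-versioning column so JET/ACE can detect record changes.
    -- dbo.OrdersLines is an assumed table name; add the column to every
    -- table linked from Access. SQL Server maintains the value itself.
    ALTER TABLE dbo.OrdersLines
        ADD tsJET rowversion;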
I want to create a daily process where I reload all rows from table A into table B. Over time, table A's rows will change due to changes in the source system and also because of aging/deletion of records in the origin table. Table A gets truncated/reloaded daily in step 1. Table B is the master table that just gets new/updated rows.
From a historical point of view, I want to keep track of ALL the rows in table B and be able to do a point-in-time comparison for analytics purposes.
So I need to do two things: daily, insert rows from table A into table B if they don't exist, and also create a new record in table B if the record already exists but ANY of the columns have changed. At one point I attempted to use temporal tables, but I had too many false positives on "real" changes; certain columns were throwing things off because a date/time column was updated (the only real change in the row).
I'm using an Azure SQL Managed Instance database (Microsoft SQL Azure (RTM) - 12.0.2000.8).
At my disposal I have SSMS, SQL Server, and also Azure Data Factory.
Any suggestions on the best way to do this or tools to help with this?
There are two approaches, either of which you can implement:
Temporal tables
Change Data Capture (CDC)
CDC is the more commonly used approach: you can create an Azure Data Factory pipeline that loads delta data, based on change data capture (CDC) information in the source Azure SQL Managed Instance database, into Azure Blob storage.
To implement CDC, you can follow this simple Microsoft tutorial: Incrementally load data from Azure SQL Managed Instance to Azure Storage using change data capture (CDC)
Note: you also need to create a storage account, which is required but not covered in the above tutorial.
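For reference, a minimal sketch of enabling CDC on the source database and one table; the schema and table name (dbo.TableA) are assumptions, and the tutorial covers the rest of the pipeline:

    -- Run in the source Azure SQL Managed Instance database.
    EXEC sys.sp_cdc_enable_db;

    -- Enable CDC for one source table; dbo.TableA is an assumed name.
    EXEC sys.sp_cdc_enable_table
        @source_schema = N'dbo',
        @source_name = N'TableA',
        @role_name = NULL,            -- no gating role; restrict access as needed
        @supports_net_changes = 1;    -- net-change queries; requires a primary key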
I have a specific requirement in transactional replication, but I am not sure whether it is achievable or not. Could you please help me out if there is any possible way to achieve it?
Requirement:
As per the requirement, there will be two databases. One is the publication database and the other is the subscription database.
I want to replicate some of the tables (articles) of the publication database to the subscription database. But I want to replicate the data only, because I want those tables (the replicated tables) to always be present in the subscription database. They may be empty initially, and when replication starts they get their data from the publication database.
But I don't want replication to create these tables for me in the subscription database; I want to use the tables that are already created. They will have the same schema as the publication database tables.
When you configure a publication, you can set the properties for articles. One of the article properties is called Action if name in use. You can set that to the option Keep existing object unchanged.
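In T-SQL this maps to the @pre_creation_cmd parameter of sp_addarticle; a minimal sketch, with the publication and object names as assumptions:

    -- 'none' = Keep existing object unchanged: replication will not drop or
    -- re-create the table at the subscriber, it only delivers the data.
    -- Publication and object names below are placeholders for your setup.
    EXEC sp_addarticle
        @publication = N'MyPublication',
        @article = N'MyTable',
        @source_owner = N'dbo',
        @source_object = N'MyTable',
        @pre_creation_cmd = N'none';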
I have a merge replication scenario with one publication and multiple subscribers.
I also have two types of tables. First, base tables that have the same data in all subscribers. Second, tables that have the same structure in all subscribers but different data (the data is replicated with a row filter condition).
I want to add a few new subscribers to this scenario, but when I add a new table to my articles, SQL Server needs to reinitialize all subscribers. I do not want to initialize the subscribers automatically, because of the large amount of data and the latency between links.
My question is: how can I add a new subscriber without reinitialization, or with manual initialization?
I have a database in which two tables have a 1:1 relationship using foreign keys. Table one is called Manifest and table two is called Inventory. When an inventory record is added by the application this database is built for, it uses a foreign key to reference the matching record in the Manifest table. In addition, this updates a column called Received (datatype: BIT) to 1 in the Manifest table for the matching record. This is used for reconciliation and reporting purposes.
Now here is where it gets tricky: this database is synchronized to a server database using Sync Framework in a client-server relationship. The Manifest table is synchronized in one direction from server to client, and the Inventory table is synchronized from client to server. Because of this, the Received column in the Manifest table is not always updated accurately on the server side after a sync.
I was thinking of creating a stored procedure to perform this update, but I'm a bit rusty on my SQL (and T-SQL). The SP I had in mind would use a CURSOR to locate any records in the Inventory table where the foreign key is NOT NULL (NULL is allowed due to exceptions where we receive something that was not in the manifest). The cursor would then let me iterate through all those records, locate the matching record in the Manifest table, and update the Received column. I know that this cannot be the best way to perform this update. Can anyone suggest another way of doing this that would be faster and use fewer resources? Examples would be appreciated =)
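A set-based UPDATE is the usual alternative to a cursor here; a minimal sketch, assuming the key columns are Manifest.Id and Inventory.ManifestId:

    -- Set-based alternative to the cursor: one UPDATE joins Inventory to
    -- Manifest and flags every matched manifest row as received.
    -- Column names (Manifest.Id, Inventory.ManifestId) are assumptions.
    UPDATE m
    SET    m.Received = 1
    FROM   dbo.Manifest AS m
    JOIN   dbo.Inventory AS i
           ON i.ManifestId = m.Id    -- the join itself excludes NULL foreign keys
    WHERE  m.Received = 0;           -- only touch rows that still need flagging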