I'm looking at the features of SymmetricDS (latest version symmetric-server-3.7.24), and on their forum I read that it is actually possible to sync from a view.
So I tried to sync from a view, but when I ran the program I got an error because SymmetricDS cannot create a trigger on the view.
I also read that if I use a materialized view instead, the trigger should be created.
The view is on SQL Server 2008. I dropped the view, created a new one with SCHEMABINDING, and added a clustered index on it. I also checked that all the options are set as required in the MSDN guide for creating indexed views.
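Roughly what I did, with the object and column names simplified to an example:

    -- Example only: the real view and table names are different.
    -- (The MSDN guide also requires various SET options such as
    -- QUOTED_IDENTIFIER and ANSI_NULLS to be ON when creating these.)
    CREATE VIEW dbo.MyView
    WITH SCHEMABINDING
    AS
    SELECT t.Id, t.Name, t.UpdatedAt
    FROM dbo.MyTable AS t;
    GO

    -- The unique clustered index is what turns it into an indexed
    -- ("materialized") view.
    CREATE UNIQUE CLUSTERED INDEX IX_MyView_Id ON dbo.MyView (Id);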
I ran SymmetricDS again, but it still fails to create the trigger on the view.
Can anyone help me?
If what I am asking is actually not possible, is it possible to create an extension that does not use triggers to synchronize the tables? I don't care whether the two databases are synced in real time; I can use a scheduled job, and that will be just fine.
Thank you for your help and suggestions.
BTW: I can also change tools if you know a better one :)
I don't think that's a supported use case. However, you can try setting the sync_on_insert/update/delete fields to 0 on the sym_trigger row. Then you would be able to sync the view with an initial load or by scheduling reloads (see the "symadmin reload-table" command).
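Untested sketch of what I mean; the trigger_id, table and channel names below are made up, and you would still need a matching sym_trigger_router row:

    -- Register the view in sym_trigger but turn off change capture so
    -- SymmetricDS does not try to create a database trigger on it.
    -- Data then moves only via an initial load or "symadmin reload-table".
    INSERT INTO sym_trigger
        (trigger_id, source_table_name, channel_id,
         sync_on_insert, sync_on_update, sync_on_delete,
         create_time, last_update_time)
    VALUES
        ('my_view', 'my_view', 'default',
         0, 0, 0,
         current_timestamp, current_timestamp);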
Right now I have a challenge that I'm not sure how to solve in the best manner. I searched the internet but did not find a suitable solution.
I want to copy data from a view on a linked server (read only; the view is built on several sub-views and tables) to a table located in my own database. The view contains live data, basically showing the last 100 occurring events. However, what I need is the whole history of the data shown by the view. As I only have read permission on that specific view, and the admin of the linked server is not able (or willing) to grant further rights or change the view, I am wondering what the best way is to copy the view's data and basically build up the whole history in my database.
I was thinking about a stored procedure run on a schedule, but as the last 100 events can change very quickly, this does not seem like an appropriate approach. Another option would be to build a trigger which takes new rows and copies them to my table. However, I'm not sure whether that is even possible.
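For the scheduled idea, I had something like this in mind (all server, table and column names below are placeholders):

    -- Append only the rows from the linked-server view that are not
    -- already in my local history table.
    INSERT INTO dbo.EventHistory (EventId, EventTime, EventData)
    SELECT src.EventId, src.EventTime, src.EventData
    FROM [LinkedServer].[RemoteDb].dbo.LastEventsView AS src
    WHERE NOT EXISTS (SELECT 1
                      FROM dbo.EventHistory AS h
                      WHERE h.EventId = src.EventId);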
I appreciate any hints, tips or ideas.
I have an issue with a model not updating to the latest version after re-querying the database (probably due to change tracking?).
I add the model to the DbSet and save it through the context. At this point a server-side trigger updates a field with a link to another table's PK column.
After adding the record I need to know the value in the updated column. I re-query the database, and a SQL Server trace shows that the new value is returned, but the model value is not updated.
I have tried detaching the entities, similar to this answer, but this does not work.
I have worked around it by creating a new scope from the service scope factory and creating a second instance of my context, but I would like to know why this happens and how to avoid it (I have no control over the database, so sadly I can't remove the trigger and do everything in code).
Thanks
Paul
Ivan's answer in the comments takes care of this: annotating the property in the model with [DatabaseGenerated(DatabaseGeneratedOption.Computed)] takes care of everything automatically.
Thanks for the help
Does anyone know how to create a view in the database with CakePHP 3?
Is there a way to do this with migrations?
For example, define a view so that when I run bin/cake migrations migrate it creates the view in the DB. Then I would be able to query that view whenever I want from models or controllers?
Thanks for your help in advance
As markstory said in these issues on GitHub, this functionality doesn't exist yet and may be released soon:
https://github.com/cakephp/cakephp/issues/11632
https://github.com/cakephp/migrations/issues/347
I am using SSIS packages to refresh the data daily. The package logic is as follows:
Delete all rows in the destination table.
Insert the full new data into the destination table.
I am trying to find a way to roll back the delete if my insert fails. I tried using an SSIS package transaction as below:
But now, after the Delete SQL task runs, my package gets stuck for a long time and does not respond.
What is the recommended way for doing this?
Any help is much appreciated.
There are quite a few techniques to consider here including some more complex ideas, but if we're looking at simpler ones, you could insert into a table with a different name but the same structure, and only if that works would you then swap it out somehow. One way of doing this is to use views for your access to tables, and then modify the view on success to use the table you've just inserted into.
It might not be the most elegant way, but it is one of the simpler ones to consider.
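A rough sketch of that idea, with made-up object names (run as separate batches, hence the GO):

    -- Load into a staging table while readers keep using the current
    -- table through the view.
    TRUNCATE TABLE dbo.MyData_New;

    INSERT INTO dbo.MyData_New (Col1, Col2)
    SELECT Col1, Col2
    FROM dbo.MySource;          -- whatever your source is
    GO

    -- Only if the load succeeded: repoint the view everyone queries.
    ALTER VIEW dbo.MyData
    AS
    SELECT Col1, Col2
    FROM dbo.MyData_New;
    GO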
Change the TransactionOption property of the package from its default to "Required", and make sure each object has that property set to "Supported", which is the default.
Additionally, you can minimize the scope of the transaction by doing the same thing with a sequence container around just your Execute SQL task and data flow.
FYI, I can't see pictures at work so I do not know what your package looks like.
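If the distributed (DTC) transaction keeps hanging, a different option altogether is to skip SSIS transactions and do both steps in a single Execute SQL Task with an explicit T-SQL transaction. That only works if the new data can be read from the destination server (same server or via a linked server); the table names below are placeholders:

    BEGIN TRY
        BEGIN TRANSACTION;

        DELETE FROM dbo.DestinationTable;

        INSERT INTO dbo.DestinationTable (Col1, Col2)
        SELECT Col1, Col2
        FROM dbo.SourceTable;        -- placeholder source

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;    -- undoes the delete if the insert fails
        THROW;                       -- SQL Server 2012+; use RAISERROR on older versions
    END CATCH;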
I am building an application in Qt/QML.
I have a table view of the database (PostgreSQL).
Is there a way to dynamically refresh my table view if there is any change in the database?
One not-so-efficient way to do it is to keep sending periodic SQL queries (polling).
Is there any automatic way to keep my view refreshed?
I am open to using any other database as well if required.
Qt seems to support the NOTIFY mechanism of PostgreSQL databases. Googling for it I found some bug reports, so I'm not sure how well implemented it is. Since I've never used it, I'll have to refer you to Google.
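On the PostgreSQL side that would look roughly like this (untested; the channel name 'table_changed' and the table name are made up). As far as I understand it, the Qt side would then call QSqlDriver::subscribeToNotification("table_changed") and react to the driver's notification() signal by re-running its query.

    -- Send a notification on channel 'table_changed' whenever the table
    -- is modified, so a listening client can refresh its view.
    CREATE OR REPLACE FUNCTION notify_table_changed() RETURNS trigger AS $$
    BEGIN
        PERFORM pg_notify('table_changed', TG_TABLE_NAME);
        RETURN NULL;   -- AFTER trigger: return value is ignored
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER my_table_changed
    AFTER INSERT OR UPDATE OR DELETE ON my_table
    FOR EACH STATEMENT
    EXECUTE PROCEDURE notify_table_changed();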
If you use QSqlTableModel (or an editable subclass of QSqlQueryModel) with QTableView, any edits made will immediately be visible.