Is it possible to know when and if the contents of certain tables in a database have changed?
I'm building a multiuser app, and I want to notify the user if somebody else has modified any relevant data. I'm using an Oracle 10g database and a .NET WinForms app.
Thanks!
One approach is to increment a counter associated with that user's data each time you make an update. Your application can then read that counter from time to time and, if it has increased, know that something has changed and it needs to refresh itself. This is easy to implement.
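A minimal sketch of that counter approach in Oracle, assuming a hypothetical CHANGE_COUNTERS table (all names here are illustrative, not part of any existing schema):

```sql
-- One row per logical data area the app cares about.
CREATE TABLE change_counters (
    area_name  VARCHAR2(30) PRIMARY KEY,
    version    NUMBER       DEFAULT 0 NOT NULL
);

-- Every write path bumps the counter in the same transaction as the update.
UPDATE change_counters
   SET version = version + 1
 WHERE area_name = 'ORDERS';

-- The client polls this and compares it with the last value it saw.
SELECT version
  FROM change_counters
 WHERE area_name = 'ORDERS';
```

Bumping the counter inside the same transaction as the data change means a client never sees the new counter value without the new data also being visible.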
I am sure Oracle has some callback mechanism that can tell you when an update has occurred. That would be more efficient, but I do not know enough about Oracle to say more about that approach.
Related
We are running a fairly uncommon ERP system from a small IT business, and it doesn't allow us to modify data in any extensive way. We thought about doing a data update by exporting the data we wanted to change directly from the DB and using Excel VBA to update a bunch of data across different tables. We now have the updated data in Excel, and it is supposed to be written back into the Oracle DB.
The IT business's support told us not to do so, because of all the triggers running in the background during a regular data update in their program. We are quite afraid of damaging the DB, so we are looking for the best way to do the data update without bypassing any triggers. To be more specific, there are some thousands of changes we've made across different columns and tables, merged together in one Excel file. We have to be sure that inserting the modified data into the DB fires all the triggers the ERP software fires during a normal data update.
Is there anyone who knows a good way to do so?
I don't know what ERP system you are using, but I can relate some experiences from Oracle's E-Business Suite.
Nowadays, Oracle's ERP includes a robust set of APIs that will allow your custom programs to safely maintain ERP data. For example, if you want to modify a sales order, you use Oracle's API for that purpose and it makes sure all the necessary, related validations and logic are applied.
So, step #1 -- find out if your ERP system offers any APIs to allow you to safely update your data.
Back in the early days of Oracle's ERP, there were not so many APIs. In those days, when we needed to update a lot of tables and had no API available, the next approach would be to use some sort of data loader tool. The most popular was, in fact, called "Data Loader". What this would do is read your data from an Excel spreadsheet and send it to the ERP's user interface -- exactly as though it were being typed in by a user. Since the data went through the ERP's UI, all the necessary validations and logic would automatically be applied.
In really extreme cases, when there was no API and DataLoader was, for whatever reason, not practical, it was still sometimes deemed necessary and worth the risk to attempt our own direct update of the ERP tables. This is, in general, risky and a bad practice, but sometimes we do what we must.
In these cases, we would start a database trace going on a user's session as they keyed in a few updates via the ERP's user interface. Then, we would use the trace to figure out what validations and related logic we needed to apply during our custom direct updates. We would also analyze the source code of the ERP system (since we had it available in the case of Oracle's ERP). Then, we would test it extensively. And, after all that, it was still risky and also prone to break after upgrades. But, in general, it worked as a last resort.
Now my problem is that I need to do the work fast by automating part of my process. The work is already done in Excel, that's true, but the modifications were needed anyway. The question is only whether I enter the changes manually, with copy & paste, through our ERP, or push them all at once through some other means.
But I guess Mathew is right. There are validation processes in the ERP, so we can't write directly into the DB.
Maybe you could contact me if you have an idea for bypassing the ERP in a non-risky manner.
I work as an SCM developer and am currently tasked with an activity that involves database versioning. Although I have done source code management, I am quite new to this, so I would like to hear different views and experiences on how to implement it.
What I mean by database (Oracle/Sybase) versioning is capturing the changes that happen to the database schema/triggers/etc. and storing them as revisions. Basically, in our company there are changes in the customer databases which we are not aware of, or at least we are not able to identify when and who made a particular change. We are just trying to create a record of the changes that happen in the DB.
Note: I am not a DB guy.
The usual practice is to make changes go through a build process. Basically, have a version control tool like CVS where users check in the changes that have to go to the QA and Prod environments.
So, let's say a couple of columns are added to a table: the developer would check in a .ddl script with the "ALTER TABLE ..." command, and that will be "applied" to the database the next time you do a build.
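Such a checked-in .ddl script might look like this (the file name, table, and columns are made up for illustration):

```sql
-- add_audit_columns.ddl -- checked into version control with the
-- bug number / feature request in the check-in comment.
ALTER TABLE customers ADD (
    created_by  VARCHAR2(30),
    created_on  DATE DEFAULT SYSDATE
);

COMMENT ON COLUMN customers.created_by IS 'User who created the row';
```

The script itself then carries the who/when/why in its version history, which is exactly the record the question is asking for.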
Unless you restrict users (in this case, developers) from directly making changes and instead use a standard build process, tracking changes to objects is almost impossible over time.
Consider the details you'd need later to understand why a change was made: the user who made it, the time of the change, and the reason (check-in comments, bug number, new feature request, etc.). All the changes are usually compiled using a standard user like "APPOWNER", and in the absence of a version control system you only have access to the time of the latest change (LAST_DDL_TIME).
If your concern is to track changes to data, you can use triggers, or use an application like Oracle GoldenGate that reads through the redo logs and gets you the change-capture records. From your question, though, it looks like you are looking for a way to track object changes.
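If the goal is only a record of who changed which object and when, an Oracle database-level DDL trigger is one common sketch; the audit table here is hypothetical:

```sql
CREATE TABLE ddl_audit (
    changed_on   DATE,
    db_user      VARCHAR2(30),
    ddl_event    VARCHAR2(30),
    object_owner VARCHAR2(30),
    object_name  VARCHAR2(128)
);

CREATE OR REPLACE TRIGGER trg_ddl_audit
AFTER DDL ON DATABASE
BEGIN
    -- The ora_* event attribute functions describe the DDL that just ran.
    INSERT INTO ddl_audit
    VALUES (SYSDATE,
            ora_login_user,
            ora_sysevent,          -- CREATE / ALTER / DROP ...
            ora_dict_obj_owner,
            ora_dict_obj_name);
END;
/
```

This records the database user and timestamp, but not the *reason* for a change -- that part still needs a check-in/build process as described above.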
The best way to do it is to have some kind of DB revision software which manages all changes and makes it easy to apply them to multiple databases (upgrade/downgrade).
It requires saving all changes through the revision software -- no direct DB changes.
Maybe similar tools for PostgreSQL will help:
depesz scripts http://www.depesz.com/index.php/projects/.
Python tool: https://code.google.com/p/sqlalchemy-migrate/
I posted a similar post before, but since then I have done some research and thought I'd put this one out again to see if anyone has any thoughts on my problem.
I am running a WPF application with C# as the code-behind. The database I am using is SQL Server 2005.
I am currently binding to the database using ADO.NET and retrieving the data from stored procs in the DB. This is stored in DataSets and, further down the line, bound to controls in my WPF application.
The data I am binding to in the DB is constantly changing, let's say every few seconds. What I want is a mechanism that automatically tells me in C# when the data I have bound to in the DB -- that is, the data returned from my stored procs -- has changed.
I looked on the web and found Notification Services and the SqlDependency class, but these are being deprecated. I also saw CLINQ, but that doesn't seem to handle the database notification side; it only works for collections in C#, from what I understand.
My plan B is to have a thread in my C# code poll the same stored proc every few seconds, based on a timestamp that is stored on every row in the returned DataSet. If the returned timestamp is greater than my current one, retrieve that data. This would run in a new thread, looping over and over. If any data came back from the connection in that thread, it would mean the data had changed, so I would store it in my collections in C# to be bound to my WPF app.
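The database side of that plan B can lean on SQL Server's TIMESTAMP (later called ROWVERSION) type, which the engine bumps automatically on every write. A sketch, with illustrative table and procedure names:

```sql
-- One automatically maintained version column per table (max one per table).
ALTER TABLE dbo.Orders ADD RowVer TIMESTAMP;
GO

-- The poller remembers the highest version it has seen and asks
-- only for rows changed since then.
CREATE PROCEDURE dbo.GetChangedOrders
    @LastVersion BINARY(8)
AS
    SELECT OrderId, Status, RowVer
    FROM dbo.Orders
    WHERE RowVer > @LastVersion;
GO
```

An empty result set means nothing changed; otherwise the client takes the maximum RowVer it received as its new @LastVersion.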
Obviously this is something I don't really want to do, as I thought there might be a smarter way, but I just can't seem to locate anything.
Does anyone have any ideas on this one?
Much appreciated
Iffy.
How about SQL CLR triggers? SQL triggers can run every time an update/insert/delete occurs on targeted tables in your database, and CLR SQL triggers extend this functionality to managed code:
http://msdn.microsoft.com/en-us/library/938d9dz2(VS.80).aspx
I'm not an expert on this, but hopefully it gives you something useful to look at.
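For reference, a plain T-SQL trigger that records every change to a table might look like this (the log table is made up); a CLR trigger would replace the body with a call into a registered .NET assembly:

```sql
CREATE TABLE dbo.ChangeLog (
    TableName  SYSNAME,
    ChangedAt  DATETIME DEFAULT GETDATE()
);
GO

CREATE TRIGGER trg_Orders_Changed
ON dbo.Orders
AFTER INSERT, UPDATE, DELETE
AS
    -- Fires once per statement that touches dbo.Orders.
    INSERT INTO dbo.ChangeLog (TableName) VALUES ('Orders');
GO
```

The client would still need to poll the log table (or the CLR trigger would need some way to push a message out), so this mostly moves the problem rather than eliminating the polling.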
Plan B sounds like a good choice to me - there are no notification methods built in to SQL Server that would let you do this, so your only options would be:
Triggers combined with CLR user-defined functions. The trouble is that, depending on how much work you do in your user-defined function, this could add quite a lot of overhead for anyone writing to the database - this is not the most efficient layer to implement a notification mechanism in.
Implement some sort of notification mechanism and make sure that everyone who writes to the database uses this notification mechanism to notify others that the data has changed.
Considering how much effort #2 is, I'd say that plan B is a pretty good choice.
Depending on how frequently you poll and how efficient your polling is, you will probably find that the end result for the user is just as good as option #2, and the overhead of the polling is probably less than you think. (If you have many users, techniques like caching or a common polling thread can help.)
I am not sure whether this has been asked before; I did a few searches but nothing appropriate showed up.
OK, now my problem:
I want to migrate an old application to a different programming language. The only requirement we have is to keep the database structure stable. So no changes in my database schema. For the rest of the application I am basically reimplementing everything from scratch without reusing old code.
My idea for verifying my new code was to let users do certain actions or workflows, capture the state of the database before and after, and then maybe create unit tests with the help of this data. Does anyone know an elegant way to keep track of these changes? Copying the database (>10 GB) is pretty expensive. I also can't modify the code of the old application the users will be performing these sample actions in; I have to do it at the database level.
My database is Oracle 10g.
You could capture the old application's behavior with a trace and then validate the changes against your new code. But, honestly, trying to write a new application by capturing the data modifications the old one makes and then imitating them will be a very difficult task, as the inputs and outputs of the original application are not guaranteed to be stateless (that is, the old application might do the same thing the first 1,000,000 times it is given a certain set of inputs and something completely different on the 1,000,001st run).
Your best bet is to start over with the business requirements and use the old application as a functional reference.
Take a look at Oracle Flashback Queries.
They enable you to execute queries that return past data. The timeframe is limited, but they can be very useful.
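For example, against a hypothetical ORDERS table, and assuming the changes are still within the undo retention window:

```sql
-- State of the table as it was 15 minutes ago.
SELECT *
FROM orders
AS OF TIMESTAMP (SYSTIMESTAMP - INTERVAL '15' MINUTE);

-- Row versions that existed during that window, with the
-- operation (I/U/D) that produced each one.
SELECT versions_operation, order_id, status
FROM orders
VERSIONS BETWEEN TIMESTAMP (SYSTIMESTAMP - INTERVAL '15' MINUTE)
             AND SYSTIMESTAMP;
```

Comparing an AS OF snapshot from before a user workflow with the current state would give you exactly the before/after pairs the question asks for, without copying the database.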
In 10g the only way is to do it with flashback queries. In 11g we can do this with RAT (Real Application Testing). RAT is quite useful for these scenarios and also for load and volume testing.
I have a really odd user requirement. I have tried to explain to them that there are much better ways of supporting their business process, and they don't want to hear it. I am tempted to walk away, but I first want to see if maybe there is another way.
Is there any way I can lock a whole database, as opposed to a row lock or table lock? I know I can put the database into single-user mode, but that means only one person can use it at a time. I would like many people to be able to read at a time, but only one person to be able to write to it at a time.
They are trying to do some really odd data migration.
What do you want to achieve?
Do you want to make the whole database read-only? You can definitely do that
Do you want to prevent any new clients from connecting to the database? You can definitely do that too
But there's really no concept of a "database lock" in terms of only ever allowing one person to use the database. At least not in SQL Server, not that I'm aware of. What good would that do you, anyway?
If you want to do data migration out of this database, then setting the database into read-only mode (or creating a snapshot copy of it) will probably be sufficient and the easiest way to go.
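Both options are short commands in SQL Server; the database name and snapshot file path here are illustrative, and the snapshot's logical file name must match the source database's:

```sql
-- Everyone can read, nobody can write.
ALTER DATABASE MigrationSource SET READ_ONLY WITH ROLLBACK IMMEDIATE;

-- Or take a point-in-time snapshot to migrate from,
-- while the source database stays fully writable.
CREATE DATABASE MigrationSource_Snap
ON (NAME = MigrationSource_Data,
    FILENAME = 'C:\Snapshots\MigrationSource.ss')
AS SNAPSHOT OF MigrationSource;
```

The read-only route blocks the writers you said you want to keep out; the snapshot route gives the migration a frozen view without blocking anyone.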
UPDATE: for the scenario you mention (grab the data for people with laptops, and then re-synchronize), you should definitely check out ADO.NET Sync Services - that's exactly what it's made for!
Even if you can't use ADO.NET Sync Services, you should still be able to selectively and intelligently update your central database with the changes from laptops without locking the entire database. SQL Server has several methods to update rows even while the database is in use - there's really no need to completely lock the whole database just to update a few rows!
For instance: you should have a TIMESTAMP (or ROWVERSION) column on each of your data tables, which would easily let you see whether any changes have occurred at all. If the TIMESTAMP field (which is really just a counter - it has nothing to do with date or time) has not changed, the row has not changed and thus doesn't need to be considered for an update.
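That same column also gives you optimistic concurrency for the re-sync step: keep the TIMESTAMP value you read during the initial sync, and make each update conditional on it being unchanged (table, column, and parameter names are illustrative):

```sql
UPDATE dbo.Customers
SET    Name = @NewName
WHERE  CustomerId = @CustomerId
  AND  RowVer = @RowVerWhenRead;   -- the value captured at sync time

-- If @@ROWCOUNT = 0, someone else changed the row in the meantime:
-- treat it as a conflict to resolve instead of silently overwriting.
```

This lets many laptops merge back concurrently, touching only the rows they actually changed, with no database-wide lock at all.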