InterSystems Caché DB insert/update/delete history - database

So I'm in the position of using a Caché database. Not my decision; I'm coming into the project with the view that it's a database, so all the naysayers please be respectful. There are over 24 million rows per year added to this database, so I'm looking for a way to track history on insert/update/delete. In SQL Server we would create a database model, then run a tool to generate history tables in another database, and triggers to insert/update/delete, e.g. [MyDatabase].[dbo].[Address], [MyDatabaseHistory].[dbo].[AddressHistory]... you get the idea... anyone out there with experience doing a similar thing with a Caché database?

In Caché you can also use triggers; please see the documentation.
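For illustration, a row-level trigger that copies the old values into a history table might look roughly like the sketch below. The table and column names are made up, the exact clause order and options of CREATE TRIGGER should be checked against the Caché SQL reference, and similar triggers would be needed for INSERT and DELETE.
CREATE TRIGGER Address_HistUpd
AFTER UPDATE ON Address
REFERENCING OLD ROW AS oldrow
FOR EACH ROW
LANGUAGE SQL
INSERT INTO AddressHistory (AddressID, Line1, City, ChangedAt, Operation)
VALUES (oldrow.AddressID, oldrow.Line1, oldrow.City, CURRENT_TIMESTAMP, 'UPDATE')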

Related

Import/export data only for TBs of data - SQL Server

I have a production database with 20 TB of data. We migrated our database from Oracle to SQL Server. Our old application was based on a COBOL-based platform. After migrating to SQL Server the indexes are giving good results.
I am creating a schema with a new set of indexes, without any data. Now I want to migrate only the data.
The Import/Export utility will take a long time and will also fill up the log files. Is there any other alternative?
My advice would be:
- Set the recovery model to simple.
- Remove the indexes.
- Batch insert the rows or use SELECT INTO (this minimizes logging); see the sketch below.
- Re-create the indexes.
I admit that I haven't had to do this sort of thing in SQL Server in a long time. There may be other methods that are faster, such as backing up a table space/partition and restoring it in another location.
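A rough sketch of that sequence, with made-up database, table, and index names:
-- assumes the target database NewDb and table dbo.BigTable already exist
ALTER DATABASE NewDb SET RECOVERY SIMPLE;
GO
-- drop (or disable) the nonclustered indexes on the target before loading
DROP INDEX IX_BigTable_Col1 ON dbo.BigTable;
GO
-- bulk load with a table lock so the insert can be minimally logged
INSERT INTO dbo.BigTable WITH (TABLOCK) (Col1, Col2)
SELECT Col1, Col2
FROM OldDb.dbo.BigTable;
GO
-- re-create the indexes afterwards
CREATE NONCLUSTERED INDEX IX_BigTable_Col1 ON dbo.BigTable (Col1);
GO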
You may also use the bcp utility to import/export data. For full details see https://learn.microsoft.com/en-us/sql/tools/bcp-utility?view=sql-server-2017

Automate SQL Server Profiler to record data and save it to a table continuously

Is there a way to automate SQL Server Profiler to record data and then save that data to a table continuously?
The reason: I am supporting a fragile SQL Server application and there is no auditing. I receive a lot of support calls regarding the deletion of records. I want a quick way to view who has changed what data.
You can configure Profiler to save the trace directly to a table, as described here: How To Save SQL Server Trace Data to a Table.
But it's not a good idea, for two reasons: first, Profiler itself will load up your server; second, writing to a table is the most costly option and you can even lose some events.
If you are on Enterprise edition, you could instead use SQL Server database audit, which is much more lightweight. A sketch of a database audit that captures DELETE events is shown below.
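A minimal sketch of such an audit; the file path, database, and table names are assumptions:
USE master;
GO
CREATE SERVER AUDIT Audit_Deletes
TO FILE (FILEPATH = 'D:\AuditLogs\');
GO
ALTER SERVER AUDIT Audit_Deletes WITH (STATE = ON);
GO
USE MyAppDb;
GO
-- capture DELETE statements against the table whose rows keep disappearing
CREATE DATABASE AUDIT SPECIFICATION Audit_Deletes_Spec
FOR SERVER AUDIT Audit_Deletes
ADD (DELETE ON OBJECT::dbo.Orders BY public)
WITH (STATE = ON);
GO
-- read the audit output, including who ran each statement
SELECT event_time, server_principal_name, statement
FROM sys.fn_get_audit_file('D:\AuditLogs\*.sqlaudit', DEFAULT, DEFAULT);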
Here are a few articles for your reference.
Save trace results to a database table
https://learn.microsoft.com/en-us/sql/tools/sql-server-profiler/save-trace-results-to-a-table-sql-server-profiler
Save Trace Results to a Table
https://technet.microsoft.com/en-us/library/ms191276(v=sql.110).aspx
9 Steps to an Automated Trace
http://sqlmag.com/t-sql/9-steps-automated-trace
Alternatively, you may try this automated solution (https://www.lepide.com/lepideauditor/sql-server-auditing.html) to accomplish this task.
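If you do end up with trace files, loading one into a table for ad-hoc querying is straightforward with sys.fn_trace_gettable; a small sketch with an assumed file path and table name:
-- import the trace file (and its rollover files) into a table
SELECT *
INTO dbo.MyTraceData
FROM sys.fn_trace_gettable('D:\Traces\MyTrace.trc', DEFAULT);
GO
-- then query it, e.g. for statements that deleted records
SELECT StartTime, LoginName, CAST(TextData AS NVARCHAR(MAX)) AS TextData
FROM dbo.MyTraceData
WHERE CAST(TextData AS NVARCHAR(MAX)) LIKE '%DELETE%';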

Azure SQL cross-database trigger

I have two databases on the same Azure SQL server. I want the two databases to interact with each other using a trigger, i.e. if a record is inserted into the Customer table of the first database, the trigger fires and the record is inserted into the other database.
We had / have the same problem with triggers that we use for insert/update/delete, where we write a record to Database-1, which has the primary table, but also update Database-2, where we hold "archive" versions of the tables.
The only solution we have identified and are testing is to bring all of the tables into a single database and separate them under different schemas within that one database; see the sketch below.
Our analysis of this approach so far looks promising.
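A rough sketch of that layout; the schema, table, and column names are assumptions:
CREATE SCHEMA app AUTHORIZATION dbo;
GO
CREATE SCHEMA archive AUTHORIZATION dbo;
GO
CREATE TABLE app.Customer (CustomerID INT PRIMARY KEY, Name NVARCHAR(100));
CREATE TABLE archive.Customer (CustomerID INT, Name NVARCHAR(100), CopiedAt DATETIME2 DEFAULT SYSUTCDATETIME());
GO
-- because both schemas live in the same database, a plain AFTER INSERT trigger works
CREATE TRIGGER app.trg_Customer_Insert ON app.Customer
AFTER INSERT
AS
BEGIN
    INSERT INTO archive.Customer (CustomerID, Name)
    SELECT CustomerID, Name FROM inserted;
END;
GO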
I think what you're trying to do is not allowed in SQL Azure. In my experience it is bad practice on-premises as well (think backup/restore and availability scenarios).
You should move the dependency into the application and have the application update both databases, as appropriate.
Anyway, if you want to continue with this approach, please take a look at the Elastic Query feature: https://learn.microsoft.com/en-in/azure/sql-database/sql-database-elastic-query-overview
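Note that Elastic Query external tables are read-only from the referencing database, so they cover cross-database reads rather than a trigger that writes. A minimal sketch, with the server, database, credential, and table names all assumed:
-- run in the database that needs to read from the other database
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';
CREATE DATABASE SCOPED CREDENTIAL RemoteCred WITH IDENTITY = 'remoteuser', SECRET = '<password>';
CREATE EXTERNAL DATA SOURCE RemoteDb WITH (
    TYPE = RDBMS,
    LOCATION = 'myserver.database.windows.net',
    DATABASE_NAME = 'SecondDb',
    CREDENTIAL = RemoteCred
);
-- mirrors the definition of dbo.Customer in SecondDb
CREATE EXTERNAL TABLE dbo.RemoteCustomer (CustomerID INT, Name NVARCHAR(100))
WITH (DATA_SOURCE = RemoteDb, SCHEMA_NAME = 'dbo', OBJECT_NAME = 'Customer');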
Please let me know if I can help with something

Linking tables between databases

I’m after a bit of advice on the best way to go about this in SQL Server 2008 R2 Express. I have a number of applications that are in separate databases on the same server. They are all “plugins” that use a central staff/structure list that will be in a separate database. The application is in the process of being migrated from JET.
What I’m looking for is the best way for all the “plugin” databases to be able to see the central database and use those tables in standard queries, views, etc.
As I’m using Express, that rules out any replication solution, and so far the only option I can think of is to use triggers or a stored procedure to “push” all the changes out to the plugins. The information needs to be populated on a near real-time basis; however, the number of changes will be very small, maybe up to 100 a day, and the biggest table only has about 1000 rows at the moment (the staff names table).
Hopefully that covers everything, but if anyone needs any more details then just ask.
Thanks
Apologies if I've misunderstood, but from your description it sounds like all these databases are hosted on the same instance of SQL Server - it's your mention of replication that makes me uncertain.
Assuming that's the case, you should be able to replace any copies of central-database tables held in the "plugin" databases with views or synonyms that reference the central tables directly, since SQL Server lets you reference objects in another database on the same server using three-part naming (database_name.schema_name.object_name).
For example, if each plugin db has a table StaffNames, you could replace this with a view by dropping the table, then creating a view:
drop table StaffNames
go
create view StaffNames
as
select * from <centraldbname>.<schema - probably dbo>.StaffNames
go
and your code should continue to work seamlessly, as long as permissions are set up.
Alternatively, you could replace all the references to the shared tables in the plugin databases with three-part-name references to the central database, but the view (or synonym) method requires less work.
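The synonym variant mentioned above looks much the same as the view; the central database name here is assumed:
drop table StaffNames
go
create synonym StaffNames for CentralStaffDb.dbo.StaffNames
go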

Backup data for reporting

What is the best method to transfer data from a sales table to a sales history table in SQL Server 2005? The sales history table will be used for reporting.
Take a look at SSAS. OLAP is built for reporting and is easy to query with tools like Excel pivot tables.
Bulk copy is fast and is minimally logged, so it won't flood the transaction log. One batch run at the end of the day should be enough.
Deleting the copied records from your production server is a separate task that needs to be planned as part of that server's maintenance. Your reporting-server solution should not interfere with or affect the production server.
Keep in mind that your reporting server is not meant to be a backup of the data, but rather a copy made exclusively for reporting purposes.
Also check that the reporting server's database is set to the simple recovery model.
Most solutions will require two steps:
- copy the records from source to target
- delete the records from source
It is essential that your source table have a primary key; see the sketch below.
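For example, keyed on an assumed primary key SalesID and an assumed cutoff of "everything before today", the two steps might look like this in SQL Server 2005:
DECLARE @Cutoff DATETIME;
SET @Cutoff = DATEADD(DAY, DATEDIFF(DAY, 0, GETDATE()), 0); -- midnight today
BEGIN TRANSACTION;
-- step 1: copy the closed-out rows into the history table
INSERT INTO dbo.SalesHistory (SalesID, CustomerID, Amount, SaleDate)
SELECT SalesID, CustomerID, Amount, SaleDate
FROM dbo.Sales
WHERE SaleDate < @Cutoff;
-- step 2: delete only the rows that were copied, matched on the primary key
DELETE s
FROM dbo.Sales AS s
WHERE s.SaleDate < @Cutoff
  AND EXISTS (SELECT 1 FROM dbo.SalesHistory AS h WHERE h.SalesID = s.SalesID);
COMMIT TRANSACTION;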
The "best" method depends on a lot of things.
How many records?
Is this a production environment?
What tools do you have?
Unless you are moving a large amount of data, a simple stored procedure should do the trick.
A SQL Server Agent job can manage the timing of when to call the proc.
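A sketch of such a proc, with assumed names, that a nightly Agent job step could call:
CREATE PROCEDURE dbo.usp_ArchiveSales
    @Cutoff DATETIME
AS
BEGIN
    SET NOCOUNT ON;
    -- copy rows older than @Cutoff into the history table,
    -- then delete them from the source, as in the two-step sketch above
    INSERT INTO dbo.SalesHistory (SalesID, CustomerID, Amount, SaleDate)
    SELECT SalesID, CustomerID, Amount, SaleDate
    FROM dbo.Sales
    WHERE SaleDate < @Cutoff;
    DELETE FROM dbo.Sales
    WHERE SaleDate < @Cutoff;
END;
GO
-- the Agent job step would then run something like:
-- EXEC dbo.usp_ArchiveSales @Cutoff = '20240101';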
If you just want to move the data to another table, use bulk copy/BULK INSERT. If you want to build reporting, I would suggest a BI solution such as MS Analysis Services (OLAP).
It is difficult, and in my opinion ugly, to maintain two or more history/archive tables in the same database. For a reporting solution you will be considering all the tables for that piece of information anyway. History/archive tables should only be used if you are going to put the data away and not touch it for a long period of time, i.e. archive it away outside the operational DB.
