Date and Time of Database refresh from PeopleSoft Production database

In a PeopleSoft application running on an Oracle database, how can I find out when a non-Production database got copied/refreshed from Production?
Is there a PeopleSoft/Oracle table (or tables) that have the date and time of the refresh?
I tried to glean this information by cross-referencing the PeopleSoft tables PSPRCSRQST and PSPRCSQUE on their process instance column (PRCSINSTANCE). That wasn't good enough, though, because it's unclear which process instance numbers exist in both Production and the clone, and which exist only in the clone.
Is there any other way to find the last refresh date of a non-Production database cloned from a Production database?

There are a few places you can check, but some of them require privileges a normal user may not have.
Here is an Oracle metadata view typically meant for use by the DBA:
SELECT name, created
FROM v$database
/
There are also a couple of PeopleSoft tables I am aware of, but in my (limited) experience, they are not always reliably populated.
SELECT lastrefreshdttm
FROM sysadm.psstatus
/
SELECT releasedttm, releaselabel
FROM sysadm.psrelease
ORDER BY 1 DESC
/
PSRELEASE is the table that shows up when you perform a PeopleTools Application Designer compare and, depending on your environment configuration, also shows up in the PIA when you press Ctrl+Shift+J.
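If you want to eyeball all the candidate timestamps side by side, here is a minimal sketch (assuming you have SELECT access on V$DATABASE and the SYSADM tables, and that the columns are populated as described above):
SELECT 'v$database.created' AS source, created AS candidate_dt
FROM v$database
UNION ALL
SELECT 'psstatus.lastrefreshdttm', CAST(lastrefreshdttm AS DATE)
FROM sysadm.psstatus
UNION ALL
SELECT 'psrelease (latest)', CAST(MAX(releasedttm) AS DATE)
FROM sysadm.psrelease
/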

Related

Copy records from a table on one SQL instance to an identical table on a different SQL instance

We had an intern who was given written instructions for deleting old data from a database based on dates (from within our ERP system). They were fascinated by the results and just kept deleting instead of stopping at the required date. There are now 4 years of missing records in the production database. I have these records in my development database, which is in a different instance on a different server. Is there a way to transfer just those 4 years' worth of data from my development database to my production database, checking, of course, to make sure there are no duplicates (unique index on transaction number)?
I haven't tried anything yet because I'm not sure where to start. I do have a test database on the same instance as the production database that I could use to test the transfer with.
There are several ways to do this. Assuming this is on a different machine, you will want to create a linked server on your dev machine pointing at the target server (or, technically, a link from the production server to your dev machine would work as well). Then perform an insert of the selected records from the source into the target.
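For example, creating the linked server might look like this (a sketch only; the server, instance, and linked-server names are placeholders, and you may also need sp_addlinkedsrvlogin to map credentials):
EXEC sp_addlinkedserver
    @server = N'DevServer',             -- name used in four-part queries
    @srvproduct = N'',
    @provider = N'SQLNCLI',
    @datasrc = N'DEVHOST\DEVINSTANCE';  -- the dev machine's host\instance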
More efficiently, you can use the Export Data functionality. Right-click on the database (not the server/instance, but the database) and select Tasks / Export Data from the popup menu. This will pop up the SQL Server Import and Export Wizard. Use your query to select the data for export.
If security considerations interfere with this, create a duplicate of the table(s) with alternate names (e.g. MyInvRecords) in a new database, and export the data into those tables. Back up that DB, transfer it to someplace accessible from the target server, restore that DB, then transfer the rows back into the original DB.
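A rough sketch of that backup/restore route (the paths and logical file names here are assumptions):
BACKUP DATABASE TransferDB TO DISK = N'C:\Temp\TransferDB.bak';
-- copy the .bak somewhere the target server can read, then on the target:
RESTORE DATABASE TransferDB
FROM DISK = N'D:\Incoming\TransferDB.bak'
WITH MOVE N'TransferDB' TO N'D:\Data\TransferDB.mdf',
     MOVE N'TransferDB_log' TO N'D:\Data\TransferDB_log.ldf';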
I haven't had to use anything but these methods before, so one of them should work for you.
A basic insert will work just fine:
INSERT INTO ProdDB.schema.YourTable ([Columns])
SELECT [Columns]
FROM TestDB.schema.YourTable
WHERE <your date-range predicate here>
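Since the question mentions a unique index on transaction number, here is a hedged variant that skips rows already present in production (TransactionNumber is an assumed column name):
INSERT INTO ProdDB.schema.YourTable ([Columns])
SELECT src.[Columns]
FROM TestDB.schema.YourTable AS src
WHERE <your date-range predicate here>
  AND NOT EXISTS (SELECT 1
                  FROM ProdDB.schema.YourTable AS tgt
                  WHERE tgt.TransactionNumber = src.TransactionNumber);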

Is this an appropriate database design if I wanted to audit my table?

It's my first time creating an audit log for a PoS WPF application, and I was wondering how exactly to implement an auditing system, because it seems like each available option has its ups and downs. So far, from reading numerous articles/threads, I've narrowed it down to a few common practices for audit logs:
1. Triggers - Unfortunately, due to the nature of my app, I can't make use of triggers, as they have no way of knowing which user performed the action. So what I did instead was create a Stored Procedure that handles the customer insert along with the customer log insert and its details. The Customer_Id will be provided by the application when using the Stored Procedure (see the sketch after this list).
2. Have an old and new value - My initial plan was to store only the new value, since an old value can be recovered from the new value in the row before it, but storing both the old and new values seemed more sensible, complexity-wise.
3. Use a separate database for the log / 4. Foreign keys - This is probably my main concern: if I decide to use a separate database for the audit table, then I can't set up foreign keys to the customer and employee involved.
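A minimal sketch of the stored-procedure approach from point 1, using illustrative table and column names (dbo.Customer, dbo.CustomerLog) rather than the actual schema; here the Customer_Id is generated via identity, but it could just as well be passed in as a parameter if the application supplies it:
CREATE PROCEDURE dbo.InsertCustomerWithLog
    @CustomerName NVARCHAR(100),
    @EmployeeId   INT   -- supplied by the application, since the database can't know the user
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRANSACTION;
        INSERT INTO dbo.Customer (Name) VALUES (@CustomerName);
        DECLARE @CustomerId INT = SCOPE_IDENTITY();
        INSERT INTO dbo.CustomerLog (Customer_Id, Employee_Id, Action, LoggedAt)
        VALUES (@CustomerId, @EmployeeId, N'INSERT', SYSUTCDATETIME());
    COMMIT;
END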
I created a mock-up ERD with a master-detail table result to be shown in the WPF app to display the log to an admin, and I would really like your thoughts on possible problems that may arise (there's also an employee table, but I forgot to include it):
https://ibb.co/dheaNK
Here's some info that might be of help:
The database will reside on the same single computer as the WPF app.
The number of customers will be less than 1000.
The number of regular employees will be 3.
The number of admins will be 2.
You can enable CDC (Change Data Capture) on a SQL Server database for a specific table.
This will let you collect all data changes on that table, logged in special tables.
You can also refer to the official documentation, which lists the DML commands and how each data change is logged in the CDC table created for the source table.
What is good about CDC is that it comes with SQL Server by default, and you don't have to do anything extra for the logging itself. The only requirement is that SQL Server Agent should be running so that changes get reflected in the log table.
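Enabling it is just two system procedures; a minimal sketch for the Customer table from the ERD (the database name is hypothetical, and the schema/table names are assumed):
USE PosDatabase;  -- hypothetical database name
EXEC sys.sp_cdc_enable_db;
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'Customer',
    @role_name     = NULL;  -- no gating role; changes land in cdc.dbo_Customer_CT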

Linking tables between databases

I'm after a bit of advice on the best way to go about this in SQL Server 2008 R2 Express. I have a number of applications that are in separate databases on the same server. They are all “plugins” that use a central staff/structure list that will be in a separate database. The application is in the process of being migrated from JET.
What I'm looking for is the best way for all the “plugin” databases to be able to see the central database and use its tables in standard queries, views, etc.
As I'm using Express, that rules out any replication solution, and so far the only option I can think of is to use triggers or a stored procedure to “push” out all the changes to the plugins. The information needs to be populated on a near-real-time basis; however, the number of changes will be very small, maybe up to 100 a day, and the biggest table only has about 1000 rows at the moment (the staff names table).
Hopefully that covers everything, but if anyone needs any more details then just ask.
Thanks
Apologies if I've misunderstood, but from your description it sounds like all these databases are hosted on the same instance of SQL Server - it's your mention of replication that makes me uncertain.
Assuming that's the case, you should be able to replace any copies of tables from the central database that are held in the "plugin" databases with views or synonyms that reference the central tables directly, since SQL Server allows references between databases on the same server using three-part naming (database_name.schema_name.object_name).
For example, if each plugin db has a table StaffNames, you could replace this with a view by dropping the table, then creating a view:
drop table StaffNames
go
create view StaffNames
as
select * from <centraldbname>.<schema - probably dbo>.StaffNames
go
and your code should continue to work seamlessly, as long as permissions are set up.
Alternatively, you could replace all the references to the shared tables in the plugin databases with three-part name references to the central database, but the view method requires less work.
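For completeness, the synonym version of the same idea, as a sketch (with the same placeholder for the central database name):
drop table StaffNames
go
create synonym StaffNames for <centraldbname>.dbo.StaffNames
go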

Postgresql - one database for everyone, or one-database per customer

I'm working on a web-based business application where each customer will need to have their own data (think basecamphq.com type model). For scalability and ease of upgrades, I'd prefer to have a single database where each customer gets a filtered version of the data. The problem is how to guarantee that they stay sandboxed to their own data. Trying to enforce it in code seems like a disaster waiting to happen. I know Oracle has a way to append a WHERE clause to every query based on a login ID, but does PostgreSQL have anything similar?
If not, is there a different design pattern I could use (like creating a view of each table for each customer that filters)?
Worst-case scenario, what is the performance/memory overhead of having 1000 databases of 100 MB each versus a single 1 TB database? I will need to provide backup/restore functionality on a per-customer basis, which is dead simple with one database per customer but quite a bit trickier if customers share a single database.
You might want to look into adding Veil to your PostgreSQL installation.
Schemas plus inherited tables might work for this: create your master table, then inherit a table into each per-customer schema that provides a default for the company ID or name field.
Set the permissions per schema for each customer, and set the schema search path per user. Use the same table names in each schema so that the queries remain the same.
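A minimal sketch of that pattern (all names are illustrative):
-- shared definition in its own schema
CREATE SCHEMA shared;
CREATE TABLE shared.orders (
    company_id integer NOT NULL,
    order_id   serial,
    total      numeric
);

-- per-customer schema; the inherited table pins the company ID via a default
CREATE SCHEMA customer_acme;
CREATE TABLE customer_acme.orders (
    company_id integer NOT NULL DEFAULT 42
) INHERITS (shared.orders);

-- sandbox the customer's role to its own schema
CREATE ROLE acme LOGIN;
GRANT USAGE ON SCHEMA customer_acme TO acme;
GRANT ALL ON customer_acme.orders TO acme;
ALTER ROLE acme SET search_path = customer_acme;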

Copy Database Data from Many DBs to One. Data Replication (sort of)

This involves data replication, kind of:
We have many sites with SQL Express installed, there is an 'audit' database on each site that has one table in 1st normal form (to make life simple :)
Now I need to get this table from each site and copy its contents (say, rows with a DateTime value > 1/1/2000 00:00, but this will change, obviously) into a big 'super table' on SQL Server proper, whose primary key is the site name (which needs injecting in) plus the current primary key from the SQL Express table.
e.g. Many SQL Express DBs with the following table columns
ID, Definition Name, Definition Type, DateTime, Success, NvarChar1, NvarChar2 etc etc etc
And the big super table needs to have:
SiteName, ID, Definition Name, Definition Type, DateTime, Success, NvarChar1, NvarChar2 etc etc etc
where SiteName and ID together form the primary key.
Is there a Microsoft (or non-MS, I suppose) app/tool/thing to manage copying all this data across already, or do we need to write our own?
Many thanks.
You can use SSIS (which comes with SQL Server) to populate it; the package can be set up with variables that change the connection string to point at each database in turn. I have one that loops through a whole list and runs the same process using three different files from three different vendors. You could do something similar to loop through the different site databases: put the whole list of databases you want to copy the audit data from in a table and loop through it, changing the connection string each time.
However, why on earth would you want one mega audit table per site? If every table in the database populates the audit table as changes happen, then the audit table eventually becomes a huge performance problem. Every insert, update and delete has to hit this table, and then you are proposing to add an export on top of that. This seems to me to be a guaranteed recipe for locking and deadlocks and all sorts of nastiness. Do yourself a favor and limit each audit table to the table it is auditing.
Things to consider:
Linked servers and sp_msforeachdb as part of a do-it-yourself solution (see the sketch below).
SQL Server Replication, which I believe can pull data from SQL Server Express.
SQL Server Integration Services, which can pull data from SQL Server Express instances.
Personally, I would investigate Integration Services first.
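For the do-it-yourself route, the per-site copy could look roughly like this (SiteA is an illustrative linked-server name and AuditDb an assumed database name, with the site name injected as a literal):
INSERT INTO dbo.SuperAudit
    (SiteName, ID, [Definition Name], [Definition Type], [DateTime], Success, NvarChar1, NvarChar2)
SELECT 'SiteA', ID, [Definition Name], [Definition Type], [DateTime], Success, NvarChar1, NvarChar2
FROM SiteA.AuditDb.dbo.Audit
WHERE [DateTime] > '2000-01-01';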
Good luck.
You could do this with SymmetricDS. SymmetricDS is open source, web-enabled, database independent, data synchronization/replication software. It uses web and database technologies to replicate tables between relational databases in near real time. The software was designed to scale for a large number of databases, work across low-bandwidth connections, and withstand periods of network outage.
As of right now, however, you would need to implement a custom IDataLoaderFilter extension point (in Java) to add the extra column. The metadata would be available, though, because your SiteName would be the external_id.
