Querying the Audit Log through Database [SQL Server]

I would like to take the Audit History provided by Enterprise Architect and build a SQL query, surfaced through a BI tool, that lets me and other users search the history of an object, but I am having a little trouble understanding the audit table, t_snapshot.
From what I can tell, t_snapshot has a Style column that contains "INSERT", "UPDATE", and "DELETE", which tells me what happened, and the Notes column tells me which object is referenced, but so far that only gives me a partial picture. What I have not been able to deduce is when an event occurred or which user made the change.
If anyone has encountered this problem in the past, your input would be appreciated.

Well, I don't know whether you really want to touch that.
There's a column called BinContent which contains what you are looking for. It looks like
<LogItem>
  <Row Number="0">
    <Column Name="object_id"><Old Value="1797"/><New Value="1797"/></Column>
    <Column Name="name"><Old Value="CB"/><New Value="CBc"/></Column>
    <Column Name="modifieddate"><Old Value="07.12.2018"/><New Value="11.12.2018"/></Column>
    <appliesTo><Element Type="Action"/></appliesTo>
  </Row>
  <Details User="Thomas" DateTime="2018-12-11 08:22:59"/>
</LogItem>
So basically it is XML describing the change, including the plain-text user name and a timestamp.
The BinContent column is actually a zip archive containing a single file, str.dat, which holds the above information.
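If you want to pull that out programmatically, here is a minimal sketch in Python; the DSN and the WHERE clause are assumptions for illustration, and only t_snapshot, BinContent, Style, and str.dat come from the description above:

import io
import xml.etree.ElementTree as ET
import zipfile
import pyodbc

conn = pyodbc.connect("DSN=EA_Repository")  # assumed DSN for the EA repository
row = conn.cursor().execute(
    "SELECT BinContent FROM t_snapshot WHERE Style = 'UPDATE'"
).fetchone()

# BinContent is a zip archive holding a single file, str.dat.
with zipfile.ZipFile(io.BytesIO(row.BinContent)) as zf:
    log_xml = zf.read("str.dat").decode("utf-8", errors="replace")

# The <Details> element carries the user name and timestamp.
details = ET.fromstring(log_xml).find(".//Details")
print(details.get("User"), details.get("DateTime"))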
Good luck.

Related

Salesforce Report - Field Not Populating Within Report

Hope you're well. I'm currently building out a report, but despite my best efforts so far, I can't get some information to populate within it. It does not appear that Salesforce is recognizing the field "Agent Incoming Connecting Time" within the object "AC_Agent_Performance". However, I can pull some other fields from the same object into the Agent Performance report, so I'm not clear on why this one field is not coming through. Here are some of the things that I've tried:
I have checked the access to the field. The first photo (Photo 1) shows an example of a working object, and the second one shows an example of one that does not work.
The API name seems to work, and is consistent with other fields within the object that work.
I have checked the page layout for the object (even though I don't think this is the issue), and I have mirrored other fields to the best of my knowledge that ARE populating within the report.
I reviewed the CTI flows to see if there was something missing in there on a lark, but there was nothing in there that would lead me to believe that this was the source of the problem.
I have tried setting up a new field in the object (a formula) that references the field I'm trying to pull in, but that just returns a result of 'zero' for all values.
One thing that I have done that appears to be working is setting up a joined report that uses both the "AC Agent Performance" object and the "AC Historical Queue Metrics" object. The result it returns appears to be accurate (please see Picture 3). However, I don't think this is the right way to go about it; I want to build the report on one object rather than two.
I know that permissions are the most likely issue, so I've taken a close look at these. Please let me know if there is something wrong with how I have the permissions configured. The first image depicts the 'Field Level Security'; the second image depicts the 'field accessibility'. They are both like this the whole way down:
Please note one other thing, which is that the last picture depicts a different field within the object displaying in the report.
Does anyone have any ideas on how I can proceed so the field "Agent Incoming Connecting Time" will display within the report?
Please also note that these are objects containing data populated from AWS's Amazon Connect.
This last photo shows that the object does not have any information in it within the report.
If the field isn't populated, there's not much you can do on the reporting side of things. You already tried a joined report. You should check why the integration doesn't populate it: read the integration documentation, contact the managed package's support...
The tables are connected with a lookup or master-detail relationship, right? In a pinch you could try making a formula field on "AC Agent Performance" that looks "up" and pulls the value from the related AC Historical Queue Metrics record. If the relationship goes the other way around (performance -> down to related list -> metrics), you could try to make do with a master-detail relationship and a roll-up summary field. I don't know this package, so I have no idea if you can pull it off when you don't have full control over the fields.
If you can't really use the relationships and absolutely need to report on a single table, you could capture intermediate results of the report to a helper table and then report on that; the feature is called "reporting snapshots". Or write some nightly (hourly?) batch that recalculates the values and writes a homemade "rollup" into these fields.
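As a quick sanity check outside of reporting, you could also query the object directly to see whether the field holds data at all. A minimal sketch using the simple_salesforce Python library; the credentials are placeholders, and the object and field API names are guesses based on the question (a managed package may add a namespace prefix), so adjust them to your org:

from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com",
                password="...",
                security_token="...")

# If this returns values, the data exists and the problem is on the
# report/field-level-security side; if it is always null, the integration
# never populated the field in the first place.
result = sf.query(
    "SELECT Id, Agent_Incoming_Connecting_Time__c "
    "FROM AC_Agent_Performance__c LIMIT 10"
)
for rec in result["records"]:
    print(rec["Id"], rec["Agent_Incoming_Connecting_Time__c"])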

SQL Server Mgmt Studio shows "invalid column name" when listing columns?

I'm used to scripting in Python or Matlab, and my first couple of hours with SQL have been infuriating. I would like to make a list of columns appear on the screen in any way, shape, or form, but when I use commands like
select *
from "2Second Log.dbo.TagTable.Columns"
I keep getting the error:
Invalid column name '[the first column in my table]'.
even though I never explicitly asked for [the first column in my table], it found it for me. How can you correctly identify the first column name, and then still claim it's invalid!? Babies will be strangled.
This db was generated by Allen Bradley's FactoryTalk software. What I would really like to do is produce an actual list of "TagName" strings...but I get the same error when I try that. If there were a way to actually double click the table and open it up and look at it (like in Matlab), that would be ideal.
Echoing juergen's suggestion in the comment above: it looks like you're running the query against the master database, not the 2Second Log database that actually has your table. (You can tell by looking at the database dropdown in the top left of your screenshot.) Two things you can do:
Change the dropdown in the top left to 2Second Log. This targets your query at that database instead.
Put your database name in brackets as suggested by juergen, i.e. select * from [2Second Log].dbo.TagTable
As an aside, if you're looking for a good SQL tutorial, I highly recommend the Mode SQL tutorial. It's a fantastic interactive platform to get your SQL feet wet.
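And since the original goal was to get a list of columns on the screen, here is a minimal sketch that does exactly that from Python via pyodbc; the connection details are assumptions, but INFORMATION_SCHEMA.COLUMNS is a standard SQL Server view:

import pyodbc

# Assumed connection details; note the database name needs no brackets here.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=2Second Log;Trusted_Connection=yes;"
)
# INFORMATION_SCHEMA.COLUMNS holds one row per column of every table.
rows = conn.cursor().execute(
    "SELECT COLUMN_NAME FROM INFORMATION_SCHEMA.COLUMNS "
    "WHERE TABLE_NAME = 'TagTable' ORDER BY ORDINAL_POSITION"
).fetchall()
for (name,) in rows:
    print(name)

Once that works, a plain SELECT TagName FROM dbo.TagTable should list the actual tag strings.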
Always use brackets when names/fields have spaces or dashes:
select * from [2Second Log].dbo.TagTable

Epicor asking for password after making a table change

Epicor - what a beastly creature!
Epicor is asking for a password after I made a table change. Any idea why?!
We removed the relationship from the Part table and set up a criteria instead. Now it is asking for a password, which should not be happening.
The login prompt appears when I try to run the report. I am trying to figure out what I did to aggravate Epicor. The table was already there. The only reason I removed the relationship and added a criteria, rather than adding the table to the report data definition as I originally wanted, is that the Part table could only be added once.
From your description, it sounds like the problem is related to the XML generated by Epicor for a non-BAQ based report data definition. Crystal and SSRS reports ask for login information when either more than one data source is referenced in the report or improper relationships are defined.
Note:
If you are not a report developer and you have modified this in an attempt to change the end data, I recommend you contact the report developer responsible for maintaining these before proceeding. Otherwise, read on.
Based on my experience, if you are confident in the new relationship structure you have in the report data definition, the solution to this problem is likely within the report itself. Generate an XML file by running a test report, then open the .rpt (or .rdl) associated with this report and set the datasource to the new XML file. This updates the XML schema used as the datasource. Even if none of the fields were changed in the data definition, the datasource schema definition stored in these files defines exactly the data formatting that the report expects to receive when it is opened by Epicor.
If that doesn't solve the problem and you are using Crystal, the XML relationships may be defined in a way that affects how the data is displayed, which can be adjusted via Database Expert > Links tab in Crystal. You should reconnect all of the links to match the report data definition within Epicor.
If none of that works, open up and view the XML file itself.
It is not unheard of for report data definitions in Epicor to break behind the scenes when altering relationships, and the XML file generated by the test report may not be a well-formed XML file. I have seen many XML files with unclosed elements and similar defects that cause various problems when attempting to run the report. In this case, my recommendation is to create a completely new report data definition (do not copy), and re-enter all of the parameters that existed in the former definition. Then repeat the refreshing of the report datasource as described above, and this problem should be fixed.
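If you want a quick, scriptable well-formedness check on the generated file before rebuilding anything, here is a minimal sketch in Python; the file path is an assumption:

import xml.etree.ElementTree as ET

# Assumed path to the XML produced by the test report.
path = r"C:\EpicorData\Reports\TestReport.xml"

try:
    ET.parse(path)
    print("XML is well-formed.")
except ET.ParseError as err:
    # Unclosed elements and similar defects are reported with line/column.
    print("Broken XML:", err)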

How to store data's change history?

I need to store data change histories in a database. For example, at some point a user modifies some property of some data. The expected result is that we can get the change history for a piece of data, like:
Tom changed title to 'Title one'
James changed name to 'New name'
Steve added new_tag 'tag23'
Based on these change histories we can reconstruct all versions of the data.
Any good ideas for achieving this? Solutions are not limited to a traditional relational database.
These are commonly called audit tables. How I generally manage this is with triggers on the database: for every insert/update on a source table, the trigger copies the data into another table with the same name plus an _AUDIT suffix (the naming convention does not matter; it's just what I use). A sketch of this trigger pattern is shown below. Oracle provides something similar called journal tables. Using Oracle Designer (or manually) you can achieve the same thing, and developers often put a _JN at the end of the journal/audit table name. It works the same way, with triggers on the source table copying data into the audit table.
EDIT:
I should also note that you can create a new separate schema to manage just your audit tables or you can keep them in your schema with the source tables. I do both, it just depends on the situation.
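Here is a minimal sketch of the trigger pattern described above, run through Python/pyodbc against SQL Server; the DSN, the Customer table, and its columns are invented for illustration:

import pyodbc

conn = pyodbc.connect("DSN=AppDb")  # assumed DSN

# Audit trigger: every insert/update on dbo.Customer is copied, with a
# timestamp, into dbo.Customer_AUDIT (same columns plus ChangedAt).
ddl = """
CREATE TRIGGER trg_Customer_Audit
ON dbo.Customer
AFTER INSERT, UPDATE
AS
BEGIN
    INSERT INTO dbo.Customer_AUDIT (CustomerId, Name, ChangedAt)
    SELECT CustomerId, Name, SYSUTCDATETIME()
    FROM inserted;   -- "inserted" holds the new row versions
END
"""
conn.cursor().execute(ddl)
conn.commit()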
I wrote an article about various options: http://blog.schauderhaft.de/2009/11/29/versioned-data/
If you are not tied to a relational database, there are things called 'append only' databases (I think), which never change data, but only append new versions. For your case this sounds kind of perfect. Unfortunately I don't know of any implementation.

Export tables from SQL Server to be imported to Oracle 10g

I'm trying to export some tables from SQL Server 2005 and then create those tables and populate them in Oracle.
I have about 10 tables, varying from 4 columns up to 25. I'm not using any constraints/keys, so this should be reasonably straightforward.
Firstly I generated scripts to get the table structure, then modified them to conform to Oracle syntax standards (i.e. changed nvarchar to varchar2).
Next I exported the data using SQL Server's export wizard, which created a CSV flat file. However, my main issue is that I can't find a way to force SQL Server to double-quote column names. One of my columns contains commas, so unless I can find a method for SQL Server to quote column names, I will have trouble when it comes to importing this.
Also, am I going the difficult route, or is there an easier way to do this?
Thanks
EDIT: By quoting I'm referring to quoting the column values in the CSV. For example I have a column which contains addresses like
101 High Street, Sometown, Some
county, PO5TC053
Without changing it to the following, it would cause issues when loading the CSV
"101 High Street, Sometown, Some
county, PO5TC053"
After looking at some options with SQL Developer, and at manually trying to export/import, I found a utility in SQL Server Management Studio that gets the desired results and is easy to use. Do the following:
Go to the source schema on SQL Server
Right click > Export data
Select source as current schema
Select destination as "Oracle OLE provider"
Select properties, then add the service name into the first box, then the username and password; be sure to click "remember password"
Enter query to get desired results to be migrated
Enter table name, then click the "Edit" button
Alter mappings, change nvarchars to varchar2, and INTEGER to NUMBER
Run
Repeat process for remaining tables, save as jobs if you need to do this again in the future
Use the SQLDeveloper migration tools
I think quoted column names in Oracle are something you should not use; they cause all sorts of problems.
As Robert has said, I'd strongly advise against quoting column names. The result is that you'd have to quote them not only when importing the data, but also whenever you want to reference that column in a SQL statement - and yes, that probably means in your program code as well. Building SQL statements becomes a total hassle!
From what you're writing, I'm not sure if you are referring to the column names or the data in these columns. (Can SQL Server really have a comma in a column name? I'd be really surprised if there was a good reason for that!) Quoting the column content should be done for any string-like column (although I've found that other delimiter characters often work better, since the need to "escape" quotes becomes another issue). If you're exporting to CSV, that should be an option... but then I'm not familiar with the export wizard.
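If the wizard won't quote the values, re-exporting with a small script is one way out. A minimal sketch in Python, where the DSN and table name are assumptions; csv.QUOTE_ALL wraps every field in double quotes so embedded commas and line breaks survive the round trip:

import csv
import pyodbc

conn = pyodbc.connect("DSN=SqlServer2005")  # assumed DSN
cursor = conn.cursor()
cursor.execute("SELECT * FROM dbo.Addresses")  # assumed table

with open("addresses.csv", "w", newline="", encoding="utf-8") as f:
    # QUOTE_ALL double-quotes every value; the csv module escapes any
    # embedded quotes by doubling them.
    writer = csv.writer(f, quoting=csv.QUOTE_ALL)
    writer.writerow([col[0] for col in cursor.description])
    writer.writerows(cursor.fetchall())

On the Oracle side, SQL*Loader can then be told the fields are enclosed by double quotes in its control file.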
Another idea for moving the data (depending on the scale of your project) would be to use an ETL/EAI tool. I've been playing around a bit with the Pentaho suite and its Kettle component. It offers a good range of options to move data from one place to another. It may be oversized for a simple transfer, but for a big migration with the corresponding volume, it may be a good option.
