I need to get, with T-SQL, all the objects that a view depends on. I have been trying some examples from this forum, modified by me, for example in Querying dependencies: Objects on which a view depends using sys.dm_sql_referenced_entities, but I did not get the results that can be seen in the attached image, which is obtained from the "view dependencies" option in SSMS, because functions are not shown; marked with a red square are the tables that do not need to be shown. Thanks a lot.
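For example, this is a minimal version of the kind of query I have been trying (dbo.MyView is a placeholder for the actual view name):

    SELECT referenced_schema_name,
           referenced_entity_name,
           referenced_class_desc   -- e.g. OBJECT_OR_COLUMN, TYPE
    FROM sys.dm_sql_referenced_entities(N'dbo.MyView', N'OBJECT');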
Hope you're well. I'm currently building out a report, but despite my best efforts so far, I can't get some information to populate within it. It does not appear that Salesforce is recognizing the field "Agent Incoming Connecting Time" within the object "AC_Agent_Performance". However, I can pull other fields from the same object into the Agent Performance report, so I'm not clear on what is going wrong with the field that I wish to see within the report. Here are some of the things that I've tried:
I have checked the access to the field. The first photo (Photo 1) shows an example of a working object; the second one shows an example of one that does not work.
The API name seems to work, and is consistent with other fields within the object that work.
I have checked the page layout for the object (even though I don't think this is the issue), and I have mirrored, to the best of my knowledge, other fields that ARE populating within the report.
On a lark, I reviewed the CTI flows to see if there was something missing in there, but there was nothing that would lead me to believe that this was the source of the problem.
I have tried setting up a new field in the object (a formula) that references the field that I'm trying to pull in, but that just returns a result of 'zero' for all values.
One thing that I have done that appears to be working is setting up a joined report, which uses both the "AC Agent Performance" object and the "AC Historical Queue Metrics" object. The result it returns appears to be accurate (please see picture number 3). However, I don't think that this is the right way to go about this, and I don't want to do it this way. I want to use the report with one object rather than with two.
I know that permissions are the most likely issue, so I've taken a close look at these. Please let me know if there is something wrong with how I have the permissions configured. The first image depicts the 'Field Level Security'; the second image depicts the 'field accessibility'. They are both like this, the whole way down:
Please note one other thing, which is that the last picture depicts a different field within the object displaying in the report.
Does anyone have any ideas on how I can proceed so the field "Agent Incoming Connecting Time" will display within the report?
Please also note that these are objects that contain data populated from AWS's Amazon Connect.
This last photo shows that the object does not have any information in it within the report.
If the field isn't populated, there's not much you can do on the reporting side of things. You already tried a joined report. You should check why the integration doesn't populate it; maybe read the integration documentation, contact the managed package's support...
The tables are connected with a lookup or master-detail relationship, right? In a pinch you could try making a formula field on "AC Agent Performance" that looks "up" and pulls the value from the related "AC Historical Queue Metrics" record. If the relationship is the other way around (performance -> down to a related list -> metrics), you could try to make do with a master-detail relationship and a roll-up summary field. I don't know this package, so I have no idea if you can pull it off when you don't have full control over the fields.
If you can't really use the relationships and absolutely need to report on a single table, you could capture the intermediate results of the report to a helper table and then report on that; this is called "reporting snapshots". Or write some nightly (hourly?) batch that recalculates the values and writes a homemade "rollup" to these fields.
I would like to take the Audit History provided by Enterprise Architect and create a SQL query, reported through a BI tool, that will allow me and other users to search the history of an object, but I am having a little trouble understanding the audit table, t_snapshot.
From what I can tell, t_snapshot has a Style column that contains "INSERT", "UPDATE", and "DELETE", which tells me what is happening, and the Notes column tells me what object it is referencing, but so far I've only been able to get a partial picture. What I have not been able to deduce is when an event occurred or which user made the change.
If anyone has encountered this problem in the past, your input would be appreciated.
Well, I don't know whether you really want to touch that.
There's a column called BinContent which contains what you are looking for. It looks like this:
    <LogItem>
      <Row Number="0">
        <Column Name="object_id"><Old Value="1797"/><New Value="1797"/></Column>
        <Column Name="name"><Old Value="CB"/><New Value="CBc"/></Column>
        <Column Name="modifieddate"><Old Value="07.12.2018"/><New Value="11.12.2018"/></Column>
        <appliesTo><Element Type="Action"/></appliesTo>
      </Row>
      <Details User="Thomas" DateTime="2018-12-11 08:22:59"/>
    </LogItem>
So basically, it is some XML describing the change, including the plain-text user name.
The BinContent column(s) are actually zips which contain a single file, str.dat, holding the above information.
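If you do want to query it, here is a minimal sketch of the SQL side, assuming you have direct access to the repository database; unzipping BinContent still has to happen in the BI tool or a small script:

    SELECT Style,        -- 'INSERT', 'UPDATE' or 'DELETE'
           Notes,        -- identifies the object the change refers to
           BinContent    -- zipped XML (the single file str.dat) with user and timestamp
    FROM t_snapshot;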
Good luck.
What I am currently looking to do is use our existing UI to select a number of columns from various tables (yes, multiple tables) and pass them into a BIRT report as parameters. From there, I am planning on building a query that will dynamically substitute the columns into the query and pull the results automatically. I'll have to hide columns with no value passed to them as well. I also expect I'll have to set up the query to be a little heavy-handed and already know all of the logical connections in the database (e.g. connecting the proper tables, etc.).
My question is: is this the best way to manage dynamic columns/tables in a dataset, or is there a better way? I've seen some online information about the "ad-hoc" BIRT report designer that lets non-programmers create reports, but I am not looking for other people to actually build the report, just to generate one using an existing template with interchangeable columns.
I think the easiest way is to first build a report with all the columns that you need.
Then apply some logic to the visibility of the columns. You can use the parameters there as well.
Select your table, select a column, open the properties window and take a look at the visibility tab there. Just add some logic that results in true or false.
If you are using a crosstab to display the information, you could use the filter options to include or exclude columns.
Yes, this will load data that is not used, but you would need to build really big reports before performance becomes a real issue.
If you try to add this logic in the actual dataset, you have to make the query and fetch script dynamic, and then you still have the problem of the visualisation of the columns. I think you'll end up using the visibility script anyway (to show/hide the columns on the report), so you might as well start from there and have a working report fast.
Epicor - what a beastly creature!
Epicor is asking for a password after making a table change - any idea why?!
We removed the relationship from the Part table and set up criteria instead. Now it is asking for a password, which should not be happening.
The login happens when I try to run the report. I am trying to figure out what I did to aggravate Epicor. The table was already there. I removed the relationship (on the Part table) and added criteria instead; otherwise, that is exactly what I would have done. The only reason I did not add the table to the report data definition, as I originally wanted to, is that the Part table could only be added once, which is why I removed the relationship and added criteria instead.
From your description, it sounds like the problem is related to the XML generated by Epicor for a non-BAQ-based report data definition. Crystal and SSRS reports ask for login information when either more than one datasource is referenced in the report or improper relationships are defined.
Note:
If you are not a report developer and you have modified this in an attempt to change the end data, I recommend you contact the report developer responsible for maintaining these before proceeding. Otherwise, read on.
Based on my experience, if you are confident in the new relationship structure you have in the report data definition, the solution to this problem is likely within the report itself. Generate an XML file by running a test report, then open the .rpt (or .rdl) associated with this report and set the datasource to the new XML file. This updates the XML schema used as the datasource. Even if none of the fields were changed in the data definition, the datasource schema definition stored in these files defines exactly the data formatting that the report expects to receive when it is opened by Epicor.
If that doesn't solve the problem and you are using Crystal, the XML relationships may be defined in a way that affects how the data is displayed, which can be adjusted using the Database Expert -> Links tab in Crystal. You should reconnect all of the links to match the report data definition within Epicor.
If none of that works, open up and view the xml file.
It is not unheard of for report data definitions in Epicor to break behind the scenes when altering relationships, and the XML file generated by the test report may not be well-formed. I have seen many XML files with unclosed elements and similar defects that cause various problems when attempting to run the report. In this case, my recommendation is to create a completely new report data definition (do not copy the old one) and re-enter all of the parameters that existed in the former definition. Then repeat the refreshing of the report datasource as described above, and the problem should be fixed.
We have a view (call it X) that is the base view called by two other views (call them Y and Z).
Today we made a change to view X; after that, views Y and Z started bringing back data that was incorrect. When we were in Management Studio and ran SELECT * FROM Y (which is exactly how the view is called in code), it returned data that was incorrect. However, when we ran the actual SQL that the view contained, it was fine. We tried a number of things until a colleague suggested adding a space to views X and Z and then running ALTER, which worked. Everything returned to normal and ran fine.
My question is: does MSSQL cache its views? And if so, how do you force it not to, or force the views to recompile?
Also, any additional reading about this would be helpful.
See the sp_refreshview command.
Updates the metadata for the specified non-schema-bound view.
Persistent metadata for a view can become outdated because of changes
to the underlying objects upon which the view depends.
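For example, assuming your views live in the dbo schema (adjust to your actual schema), refreshing the metadata of the dependent views looks like this:

    -- Rebuild the persisted metadata (column list) for each dependent view
    EXEC sp_refreshview N'dbo.Y';
    EXEC sp_refreshview N'dbo.Z';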
SQL Server does not cache view data (at least, not in the way you are referring to it).
If a view definition contains SELECT *, then the actual column list is defined when the view is created, i.e. the SELECT * is replaced by the actual column list that exists at the time you create the view. That means if you add columns to the underlying tables referenced by that view, they won't appear in the view.
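Here is a minimal sketch of that behaviour; the table and view names are made up for illustration:

    CREATE TABLE dbo.T (a INT);
    GO
    CREATE VIEW dbo.V AS SELECT * FROM dbo.T;  -- column list is frozen here: just (a)
    GO
    ALTER TABLE dbo.T ADD b INT;
    GO
    SELECT * FROM dbo.V;            -- still returns only column a
    EXEC sp_refreshview N'dbo.V';   -- rebuilds the view's metadata
    SELECT * FROM dbo.V;            -- now returns columns a and b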