I am working with the Linking API and am finding that if I include ANY of the "ds." parameters in the URL, the resulting data source loses all calculated fields from the original data source. Based on the documentation, it seems like we should be able to specify some of the "ds." parameters, and as long as we don't specify ds.connector, it should still update (which would keep the calculated fields?). But I'm finding that the only way to get the resulting data source to include calculated fields is to specify NONE of the ds. parameters. Any assistance would be so appreciated!
This is a bug in Data Studio, thanks for flagging it!
We aren't copying calculated fields or parameters from the original data source unless ds.<alias>.refreshFields is false, and the default is true for Sheets and BigQuery data sources. I suspect you're only seeing this when you add a "ds" parameter because, when you don't have any "ds" parameters and your original data source is reusable, that original data source gets added to the new report instead of a new one being created.
We have a fix that should be going out in the release tomorrow (July 26, ~2pm PDT), but you can work around it in the meantime by adding either
ds.<alias>.refreshFields=false (to fix one data source) or ds.*.refreshFields=false (to fix all data sources, if you have more than one).
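For example, a Linking API URL applying the workaround to a single data source aliased ds0 might look like this (the report ID and alias are placeholders, and the rest of the URL's parameters are omitted):

https://datastudio.google.com/reporting/create?c.reportId=REPORT_ID&ds.ds0.refreshFields=false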
Hope you're well. I'm currently building out a report, but despite my best efforts so far, I can't get some information to populate within it. It does not appear that Salesforce is recognizing the field "Agent Incoming Connecting Time" within the object "AC_Agent_Performance". However, I can pull other fields from the same object into the Agent Performance report, so I'm not clear on why this particular field won't populate. Here are some of the things that I've tried:
I have checked the access to the field. The first photo (Photo 1) shows an example of a working field, and the second one shows an example of one that does not work.
The API name seems correct and is consistent with other fields within the object that do work.
I have checked the page layout for the object (even though I don't think this is the issue), and, to the best of my knowledge, I have mirrored the setup of other fields that ARE populating within the report.
I reviewed the CTI flows on a lark to see if there was something missing in there, but I found nothing that would lead me to believe they were the source of the problem.
I have tried setting up a new formula field in the object that references the field I'm trying to pull in, but that just returns zero for all values.
One thing that does appear to be working: I have set up a joined report that uses both the "AC Agent Performance" object and the "AC Historical Queue Metrics" object. The result it returns appears to be accurate (please see Picture 3). However, I don't think this is the right way to go about it; I want the report to use one object rather than two.
I know that permissions are the most likely issue, so I've taken a close look at these. Please let me know if there is something wrong with how I have the permissions configured. The first image depicts the 'Field Level Security'; the second image depicts the 'Field Accessibility'. They are both like this the whole way down:
Please note one other thing: the last picture shows a different field within the same object displaying in the report.
Does anyone have any ideas on how I can proceed so the field "Agent Incoming Connecting Time" will display within the report?
Please also note that these objects contain data populated from AWS's Amazon Connect.
This last photo shows that the object does not have any information in it within the report.
If the field isn't populated, there's not much you can do on the reporting side of things. You already tried a joined report. You should check why the integration doesn't populate it: read the integration documentation, contact the managed package's support...
The tables are connected with a lookup or master-detail relationship, right? In a pinch you could try making a formula field on "AC Agent Performance" that looks "up" and pulls the value from the related AC Historical Queue Metrics record (see the sketch below). If the relationship goes the other way around (performance -> down to related list -> metrics), you could try to make do with a master-detail relationship and a roll-up summary field. I don't know this package, so I have no idea if you can pull it off when you don't have full control over the fields.
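For example, if AC Agent Performance has a lookup to AC Historical Queue Metrics, the formula field could be a single cross-object reference along these lines (the API names are hypothetical, including any managed-package namespace prefix):

AC_Historical_Queue_Metrics__r.Agent_Incoming_Connecting_Time__c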
If you can't really use the relationships and absolutely need to report on a single table, you could capture intermediate results of the report to a helper table and then report on that; this is called a "reporting snapshot". Or write some nightly (hourly?) batch that recalculates the values and writes a homemade "rollup" into these fields; a rough sketch follows.
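Something like this Apex batch, purely as a sketch of the idea (every object, field, and relationship name here is made up; adjust to the package's actual schema, and kick it off nightly from a Schedulable):

global class AgentMetricsRollupBatch implements Database.Batchable<SObject> {
    // Pull each performance record together with its child metric rows.
    global Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator(
            'SELECT Id, Agent_Incoming_Connecting_Time__c, ' +
            '(SELECT Incoming_Connecting_Time__c FROM Queue_Metrics__r) ' +
            'FROM AC_Agent_Performance__c');
    }
    global void execute(Database.BatchableContext bc, List<AC_Agent_Performance__c> scope) {
        for (AC_Agent_Performance__c perf : scope) {
            Decimal total = 0;
            for (AC_Historical_Queue_Metrics__c m : perf.Queue_Metrics__r) {
                total += (m.Incoming_Connecting_Time__c == null ? 0 : m.Incoming_Connecting_Time__c);
            }
            // Write the homemade "rollup" onto the parent record.
            perf.Agent_Incoming_Connecting_Time__c = total;
        }
        update scope;
    }
    global void finish(Database.BatchableContext bc) {}
}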
I have several "copies" of the same data source that have different calculated fields. Is there a way to merge all of the different calculated fields into a single data source? I'm hoping there's a process to do this other than doing it manually. Any ideas?
This is currently not supported.
I have several transformations set up to test syncing multiple tables to one table, or one table with several fields to one table with fewer fields. All of that works correctly; however, when I try to change the "copy" transform type to "const" or "bsh" in the transform_column table, the source value is still copied into the target field without applying what I write in the "transform_expression" field.
Any idea why is this happening?
Thanks in advance
What version of SymmetricDS were you using? Early on, the staging area was not being cleared, so while developing transforms you could end up with already-transformed data being delivered. There are lots of improvements for transformations in the later 3.6 versions of SymmetricDS.
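For reference, a working column transform row looks roughly like this (a minimal sketch: the transform_id and column names are hypothetical, and the exact column list of sym_transform_column varies slightly between versions):

insert into sym_transform_column
  (transform_id, include_on, target_column_name, source_column_name, pk,
   transform_type, transform_expression, transform_order,
   create_time, last_update_by, last_update_time)
values
  ('my_transform', '*', 'status_code', 'status', 0,
   'const', 'MIGRATED', 1,
   current_timestamp, 'demo', current_timestamp);

For a 'bsh' transform, transform_expression holds a BeanShell script instead of a literal; the incoming value is exposed to the script as currentValue, e.g. 'return currentValue.toUpperCase();'.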
Epicor - what a beastly creature!
Epicor is asking for a password after making a table change. Any idea why?!
We removed the relationship from the Part table and set up a criteria instead. Now it is asking for a password, which should not be happening.
The login prompt happens when I try to run the report; I am trying to figure out what I did to aggravate Epicor. The table was already there. The only reason I did not add the table to the report data definition, as I originally wanted to, is that the Part table could only be added once. That is why I removed the relationship and added a criteria instead.
From your description, it sounds like the problem is related to the xml generated by Epicor for a non-BAQ-based report data definition. Crystal and SSRS reports ask for login information when either more than one datasource is referenced in the report or improper relationships are defined.
Note:
If you are not a report developer and you have modified this in an attempt to change the end data, I recommend you contact the report developer responsible for maintaining these reports before proceeding. Otherwise, read on.
Based on my experience, if you are confident in the new relationship structure you have in the report data definition, the solution to this problem is likely within the report itself. Generate an xml file by running a test report, then open the .rpt (or .rdl) associated with this report and set its datasource to the new xml file. This updates the report to use the new xml schema as its datasource. Even if none of the fields were changed in the data definition, the datasource schema definition stored in these files defines exactly the data formatting the report expects to receive when it is opened by Epicor.
If that doesn't solve the problem and you are using Crystal, the xml relationships may be defined in a way that affects how the data is displayed; this can be adjusted via the Database Expert -> Links tab in Crystal. You should reconnect all of the links to match the report data definition within Epicor.
If none of that works, open up and view the xml file.
It is not unheard of for report data definitions in Epicor to break behind the scenes when altering relationships, and the xml file generated by the test report may not be well-formed. I have seen many xml files with unclosed elements and similar defects (see the made-up fragment below) that cause various problems when attempting to run the report. In this case, my recommendation is to create a completely new report data definition (do not copy it) and re-enter all of the parameters that existed in the former definition. Then repeat the refreshing of the report datasource as described above, and this problem should be fixed.
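For illustration, a generated file that ends mid-element like this will choke the report:

<Part>
  <PartNum>ABC-100</PartNum>

Note the missing closing </Part> tag.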
I am working on converting an Access database to a SQL Server backend. I've got most of it working, but one thing I haven't been able to figure out yet is that on one of the reports that we run, a few fields show up as #Error! The field's Control source is:
=DSum("[CustomerMinutes]","QryOutageSummaryByDateRange","NZ([CityRelated])= 0")
It works fine as shown, but it takes a lot longer to load the report, and since the CityRelated field is NOT NULL, I feel as though I shouldn't need to use the NZ() function. I have opened the query in datasheet view and, appropriately, there aren't any NULLs. I would be more than happy to provide more detail; I just don't know what other information I should provide. Any help or general direction would be greatly appreciated!
The domain aggregate functions (DSum, etc.) are fussy about the use of brackets. Try this:
=DSum("IIF([CustomerMinutes] Is Null,0,[CustomerMinutes])","[QryOutageSummaryByDateRange]","[CityRelated] Is Null Or [CityRelated]=0")
If CustomerMinutes is never NULL then you can just use CustomerMinutes as the first argument.
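Since the question states that CityRelated is a NOT NULL field, the whole expression can likely be reduced to:

=DSum("CustomerMinutes","[QryOutageSummaryByDateRange]","[CityRelated]=0")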
Notice that the square brackets go around the table or query name and are not necessarily required for a single field name. (This is the opposite of how the examples appear in the Help system.)
I always prefer to avoid NZ - it can, in my experience, cause problems with aggregate functions, or when used in a sequence of queries.