While setting the fairness configuration, we are only asked about the label column in the training data and not in the payload data. So I wanted to understand: when the UI calls the /v1/fairness_monitoring POST API, what value is set as class_label in the request payload? Is it hard-coded to prediction, or is something else being done?
WML has a concept of an output schema, which defines the schema of the output of the WML scoring endpoint and the role of each column. The roles are used to identify which column contains the prediction value, which contains the prediction probability, which contains the class label, and so on. The output schema is set automatically for models created using the model builder. It can also be set using the WML Python client, and WML sets it as well when data is sent for scoring.
When the output schema identifies the column that contains the prediction, the UI does not ask for this information. The column name is therefore not hard-coded; it can be anything defined in the output schema. If the output schema does not identify the prediction column, the UI shows a screen allowing the user to select it.
The documentation for the WML Python client is available at http://wml-api-pyclient-dev.mybluemix.net/#repository. Search for “OUTPUT_DATA_SCHEMA” to understand the output schema; the API to use is store_model, which accepts OUTPUT_DATA_SCHEMA as a parameter.
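For illustration, here is a minimal sketch of setting the output schema when storing a model with the WML Python client. The credential keys, column names and modeling roles below are assumptions; the exact meta-prop names and values should be checked against the documentation linked above.

from watson_machine_learning_client import WatsonMachineLearningAPIClient

# Placeholder credentials; use the ones for your WML instance.
wml_credentials = {
    "url": "https://us-south.ml.cloud.ibm.com",
    "apikey": "***",
    "instance_id": "***",
}
client = WatsonMachineLearningAPIClient(wml_credentials)

# One entry per column of the scoring output; modeling_role tells WML (and
# hence the fairness UI) which column is the prediction, the probability, etc.
output_data_schema = {
    "id": "output_schema",
    "fields": [
        {"name": "predictedLabel", "type": "string",
         "metadata": {"modeling_role": "prediction"}},
        {"name": "probability", "type": "array",
         "metadata": {"modeling_role": "probability"}},
    ],
}

model_props = {
    client.repository.ModelMetaNames.NAME: "my model",
    client.repository.ModelMetaNames.OUTPUT_DATA_SCHEMA: output_data_schema,
}

trained_model = ...  # your trained model object (e.g. a scikit-learn pipeline)

# store_model attaches the output schema as part of the model metadata.
stored_model_details = client.repository.store_model(model=trained_model,
                                                      meta_props=model_props)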
Hope you're well. I'm currently building out a report, but despite my best efforts so far, I can't get some information to populate within the report. It does not appear to me that Salesforce is recognizing the field "Agent Incoming Connecting Time" within the object "AC_Agent_Performance". However, I can pull some other fields within the same object into the Agent Performance report, so I'm not clear on why this particular field isn't appearing in the report. Here are some of the things that I've tried:
I have checked access to the field. The first photo (Photo 1) shows an example of a working object, and the second one shows an example of one that does not.
The API name seems to work, and is consistent with other fields within the object that work.
I have checked the page layout for the object (even though I don't think this is the issue), and I have mirrored other fields to the best of my knowledge that ARE populating within the report.
On a lark, I reviewed the CTI flows to see if there was something missing there, but I found nothing that would lead me to believe this was the source of the problem.
I have tried setting up a new (formula) field in the object that references the field I'm trying to pull in, but it just returns a result of zero for all values.
One thing that does appear to work is a joined report that uses both the "AC Agent Performance" and "AC Historical Queue Metrics" objects. The result it returns appears to be accurate (please see picture number 3). However, I don't think this is the right way to go about it, and I'd rather not do it this way; I want the report to use one object rather than two.
I know that permissions are the most likely issue, so I've taken a close look at these. Please let me know if there is something wrong with how I have the permissions configured. The first image depicts the 'Field Level Security'. The second image depicts the 'Field Accessibility'. They are both like this the whole way down:
Please note one other thing: the last picture shows a different field within the object displaying in the report.
Does anyone have any ideas on how I can proceed so the field "Agent Incoming Connecting Time" will display within the report?
Please also note that these objects contain data populated from AWS's Amazon Connect.
This last photo shows that the object does not have any information in it within the report.
If the field isn't populated, there's not much you can do on the reporting side of things. You've already tried a joined report. You should check why the integration doesn't populate it: maybe read the integration documentation, or contact the managed package's support...
The tables are connected with a lookup or master-detail relationship, right? In a pinch you could try making a formula field on "AC Agent Performance" that looks "up" and pulls the value from the related AC Historical Queue Metrics record. If the relationship is the other way around (performance -> down to a related list -> metrics), you could try to make do with a master-detail and a roll-up summary field. I don't know this package, so I have no idea whether you can pull it off when you don't have full control over the fields.
If you can't really use the relationships and absolutely need to report on a single table, you could capture intermediate results of the report into a helper table and then report on that; the feature is called "reporting snapshots". Or write some nightly (hourly?) batch that recalculates things and writes a homemade "rollup" into these fields.
I work for a software house and I'm looking for a way to authenticate database access as follows:
Someone whose personal data are stored in a table needs to read his personal row and he has the right to access the table.
Once the requested row has been retrieved, I need an external check which ensures he has the right to read that specific row, checking some table's fields which are contained in the same row.
It seems to me that the attribute-based access control is what I'm looking for, but I'm not sure. Can you confirm if it's able to do what I need?
Yes, ABAC (the model) can do that. However, the check would typically happen before you get access to the data - and that's better actually (it avoids retrieving data).
ABAC gives you two things:
a policy language to express what can and cannot happen. For instance
A user can view a record they own but not the credit card field
a request/response scheme, i.e. how to enforce the policies.
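To make the request/response part concrete, here is a small, vendor-neutral sketch in Python (attribute names are purely illustrative, not tied to any ABAC product): the application acts as the enforcement point and asks a decision function whether the policy allows the access before any row is fetched.

# Vendor-neutral ABAC sketch: a decision function (PDP) evaluates the policy
# "a user can view a record they own" against subject/resource attributes,
# and the application (PEP) calls it before touching the data.
def is_authorized(subject: dict, action: str, resource: dict) -> bool:
    # Policy: a user may view a record they own.
    return action == "view" and resource.get("owner") == subject.get("user_id")

# The enforcement point builds the request from attributes it already has:
request = {
    "subject": {"user_id": "alice"},
    "action": "view",
    "resource": {"owner": "alice", "type": "personal_record"},
}

allowed = is_authorized(request["subject"], request["action"], request["resource"])
print("permit" if allowed else "deny")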
In the case of data-centric access control, policies are transformed into SQL filters. For instance, you would go from:
SELECT * FROM transactions;
to
SELECT amount, owner, CASE WHEN (0=1) THEN creditcard ELSE 'xxxx' END AS creditcard FROM transactions WHERE owner = 'Alice';
This is called dynamic data filtering and dynamic data masking. Some database vendors have had that capability, e.g. Oracle with VPD or MySQL with FGAC. Lately the trend is to outsource this type of behavior to tools like Informatica DDM or Axiomatics ADAF MD (which is where I work).
Currently we create Jasper PDF reports from a single, simple database table for our customers. This has been done programmatically and is static: if the user wishes to change the query, he/she creates a change request, which we cannot deliver before the end of the next sprint (Scrum).
The tool/library should be straightforward (e.g. convention over configuration), usable from within a Java EE container, and open source.
Is there a dynamic tool that allows our customers to create the simple queries/reports themselves without knowing SQL? That is, they should be able to see the table and then create a query from it, execute it, and print it (we could use JasperReports for the last step).
E.g. select only data from the year 2014, aggregate it by customer group, and select columns x, y, and z.
All of these criteria and the query structure may change, though, not just a value like the year 2014.
Questions:
1) Is there a tool that presents the data in some kind of SAP-cube or something similar where the user could select the structure and attributes?
2) Can that tool save template queries (queries that the user has invoked before)?
thanks
With BIRT you could use parameters in the report... for example, have one report that shows the whole data set or data cube (or at least a bit of all of the fields). Then you could add JavaScript to the report (or do all of the presentation in JavaScript, for that matter) that shows the parameters a user can select from. These parameter values can then either be sent to a new report or update the existing report. Parameters can be used in database queries too.
If that were exposed in JavaScript on a web page, you could save the parameter values to an array and store them in the browser or on the server.
Epicor - what a beastly creature!
Epicor asking for a password after making a table change, any idea why?!?!
We removed the relationship from the part table and set up a criteria instead. Now it is asking for a password, which should not be happening.
The login prompt happens when I try to run the report. I am trying to figure out what I did to aggravate Epicor. The table was already there: I removed the relationship (on the part table) and added a criteria instead; otherwise, that is exactly what I would have done. The only reason I did not add a table to the report data definition, like I originally wanted to, is that the part table could only be added once, which is why I removed the relationship and added a criteria instead.
From your description, it sounds like the problem is related to the XML generated by Epicor for a non-BAQ-based report data definition. Crystal and SSRS reports ask for login information when either more than one datasource is referenced in the report or improper relationships are defined.
Note:
If you are not a report developer and you have modified this in an attempt to change the end data, I recommend you contact the report developer responsible for maintaining these before proceeding. Otherwise, read on.
Based on my experience, if you are confident in the new relationship structure you have in the report data definition, the solution to this problem is likely within the report itself. Generate an XML file by running a test report, then open the .rpt (or .rdl) associated with this report and set the datasource to the new XML file. This updates the XML schema used as the datasource. Even if none of the fields were changed in the data definition, the datasource schema definition stored in these files defines exactly the data formatting that the report expects to receive when it is opened by Epicor.
If that doesn't solve the problem and you are using Crystal, the XML relationships may be defined in a way that affects how the data is displayed; this can be adjusted using the Database Expert -> Links tab in Crystal. You should reconnect all of the links to match the report data definition within Epicor.
If none of that works, open up and inspect the XML file.
It is not unheard of for report data definitions in Epicor to break behind the scenes when relationships are altered, and the XML file generated by the test report may not be well-formed. I have seen many XML files with unclosed elements and similar issues that cause various problems when attempting to run the report. In this case, my recommendation is to create a completely new report data definition (do not copy the old one) and re-enter all of the parameters that existed in the former definition. Repeat the refreshing of the report datasource as described above and the problem should be fixed.
I have a custom object in Salesforce for which I need to set up a master-detail relationship from Accounts, with Accounts being the master and CompHist being the detail. The problem I am running into is that I need the relationship to work off custom fields within the objects. Example:
1.) Accounts has a custom field called CustomerId.
2.) CompHist also has custom field called CustomerId.
3.) I need these to be linked together by the CustomerId field for report generation.
About 2,000 records are inserted into CompHist around the 8th of each month. This is done from a .NET application that kicks off at the scheduled time, collects info from our databases, and then uploads that data to Salesforce via the SOAP API.
Maybe I'm misunderstanding how Salesforce relationships work, as I am fairly new (a couple of months) to Salesforce development.
Thanks,
Randy
There is a way to get this to work without triggers that link the records, and without pre-querying SF from .NET to learn the Account Ids before you push the CompHistories.
Setup
On Account: set the "External ID" checkbox on your CustomerId field. I'd recommend setting "Unique" too.
On CompHist: you'll need to decide whether it's acceptable to move records around, or whether once the relation to Account is set it stays like that forever. When you've made that decision, tick/untick "reparentable master-detail" in the definition of your lookup / master-detail field to Account.
And if you have some Id on these detail records, something like a "line item number", consider making an external Id for it too. It might save your bacon some time in the future when an end user questions the report, or when you have to do some kind of "flush" and push all lines from .NET again (it will help you figure out what to insert and what to update).
At this point it's useful to think about how you are going to fill in the missing data (all the nulls in the external Id field).
Actually establishing the relationship
If you have the external Ids set, it's pretty easy to tell Salesforce to figure out the linking for you. The operation is called upsert (a mix between update and insert) and comes in two flavours.
"Basic" upsert is for create/update solving; means "dear Salesforce, please save this CompHist record with MyId=1234. I don't know what's the Id in your database and frankly I don't care, go figure this out will ya?"
If there was no such record, one will be created.
If there was exactly one match, it will be updated.
If more than one was found, SF won't know which one to update and will throw an error back at you (that's why marking the field as "unique" is a good idea; there's a chance you'll spot errors sooner).
"Advanced" upsert is for maintaining foreign keys, establishing lookups. "Dear SF, please hook this CompHist up to Account which is marked as "ABZ123" in my DB. Did I mention I don't care about your Ids and I can't be bothered to query your database first prior to me uploading my stuff?"
Again, an exact match works as expected.
0 or 2+ Accounts with the same ext. Id value = error.
Code plz
I'd recommend you play with Data Loader or a similar tool first to get a grasp of what exactly happens, how to map fields, and how not to get confused (these two flavours of upsert can be used at the same time). Once you manage to push the changes the way you want, you can modify your integration a bit.
SOAP API upsert: http://www.salesforce.com/us/developer/docs/api/Content/sforce_api_calls_upsert.htm (C# example at the bottom)
REST API: http://www.salesforce.com/us/developer/docs/api_rest/Content/dome_upsert.htm
If you'd prefer a Salesforce Apex example: Can I insert deserialized JSON SObjects from another Salesforce org into my org?
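For what it's worth, here is a minimal sketch of the same idea in Python using the simple_salesforce library (the original integration is .NET; all object and field API names here, such as CompHist__c, CustomerId__c, Some_Metric__c and Account__r, are assumptions to adapt to your org):

from simple_salesforce import Salesforce

# Placeholder credentials for the sketch.
sf = Salesforce(username="user@example.com",
                password="password",
                security_token="token")

# Upsert a CompHist row identified by its own external Id ("basic" flavour),
# and at the same time hook it to the Account whose external CustomerId__c is
# "ABZ123" ("advanced" flavour), without knowing any Salesforce Ids.
sf.CompHist__c.upsert("CustomerId__c/1234", {
    "Some_Metric__c": 42,                       # hypothetical data field
    "Account__r": {"CustomerId__c": "ABZ123"},  # relationship via external Id
})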