Workday SOAP API and multilingual deployments

Looking at Workday's public SOAP API (https://community.workday.com/sites/default/files/file-hosting/productionapi/index.html), the WSDL does not seem to indicate any way to pass in the locale of the data I want. If Workday is configured to be multilingual, how do I specify which locale the data should be returned in when using the API?

According to the documentation, the multilingual support is for display in Workday. I can't find any mention of it within the APIs.
Workday provides 3 levels of international localization support:
Locales enable you to control the display of multinational date, time, and number formats.
Languages enable you to control the display of text in Workday.
Localized fields enable you to track certain types of information depending on the country.

Java backend for qooxdoo with the date hack

Is there a skeleton Java backend (JSON-RPC) for the qooxdoo JS framework?
Could any JSON-RPC backend work with qooxdoo, or do we need the date hack to make it work?
Regards,
TL;DR: If you set the "protocol" property to "2.0", you should be able to interoperate with any standards-based JSON-RPC 2.0 server.
Detailed answer:
The qooxdoo JSON RPC client supports both its original protocol, a variation of JSON-RPC 1.0 called "qx1" (the default, for age-old backward compatibility), and the standardized JSON-RPC 2.0. You'll want to switch it to 2.0 by setting the "protocol" property to "2.0". If I recall correctly, our JSON-RPC client is then fully 2.0 standards-compliant except that we don't support batch requests.
Additionally, as you've noted, qooxdoo used to try to fix the "bug" in JSON/JavaScript, that there is no literal form for a Date object as there is for all other types in JavaScript. The qooxdoo JSON-RPC implementation has provisions for automatically converting Date objects into a string format that is easily parsed.
Many years ago we realized that it was poor form to muck with JSON-RPC, since mucking with it allowed us to communicate only with qooxdoo-enhanced JSON-RPC servers. At that time, we changed the default to not do any date conversions. This is controlled by the static variable qx.io.remote.Rpc.CONVERT_DATES, which can be set to true to "fix the bug" as we did originally, or left at its now-default null (or false) value, which says "do not muck with dates."
That's all a long-winded answer to say that qooxdoo's JSON-RPC client, if you switch it to use the 2.0 protocol, should interoperate fine with any standards-based JSON-RPC 2.0 server.
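On the backend side of your question, there is no single official skeleton I can point to, but any standards-based JSON-RPC 2.0 server will do. Below is a minimal sketch of one in Java, assuming Jackson is on the classpath; the /rpc path, the port, and the "add" method are arbitrary choices for illustration, not anything the qooxdoo client requires:

```java
// Minimal JSON-RPC 2.0 endpoint sketch using the JDK's built-in HttpServer
// and Jackson for JSON handling. Not production-ready: no batch requests,
// no notifications, no input validation.
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;
import com.sun.net.httpserver.HttpServer;

import java.io.OutputStream;
import java.net.InetSocketAddress;

public class JsonRpc2Skeleton {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/rpc", exchange -> {
            JsonNode request = MAPPER.readTree(exchange.getRequestBody());

            ObjectNode response = MAPPER.createObjectNode();
            response.put("jsonrpc", "2.0");
            response.set("id", request.path("id"));   // echo the caller's id back

            String method = request.path("method").asText();
            JsonNode params = request.path("params");
            if ("add".equals(method)) {
                // Positional params: [a, b]
                response.put("result", params.path(0).asLong() + params.path(1).asLong());
            } else {
                // Standard JSON-RPC 2.0 "method not found" error object
                ObjectNode error = response.putObject("error");
                error.put("code", -32601);
                error.put("message", "Method not found: " + method);
            }

            byte[] body = MAPPER.writeValueAsBytes(response);
            exchange.getResponseHeaders().set("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);
            }
        });
        server.start();
    }
}
```

With the client's protocol property set to "2.0" and CONVERT_DATES left at its default, the requests it sends should be plain {"jsonrpc": "2.0", "method": ..., "params": ..., "id": ...} envelopes, which is all this sketch handles.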
Derrell

Custom UIMA annotators in IBM Watson Retrieve&Rank

Is it possible to use custom UIMA annotators in the Retrieve&Rank service?
How can I upload my custom annotator (packaged as jar file) to the service?
I need to create an entity annotator to discover my custom domain entities.
I don't think there is an obvious straightforward way to use a custom UIMA annotator in R&R.
That said, here are some possible approaches you could use if you want to try integrating the two:
Use a UIMA pipeline to annotate your documents before storing them in R&R, or as you query R&R for them. I've not tried this myself, but I've seen references to this sort of thing (e.g. http://wiki.apache.org/solr/SolrUIMA), so there might be some value in trying it; see the annotator sketch after this list.
Use the annotations from your UIMA pipeline to generate additional feature scores that the ranker you train can include in its training. For example, if your annotator detects the presence or absence of a particular custom domain entity, it could turn this into a score that contributes to the feature scores for a search result. For an example of contributing custom feature scorers to R&R, see https://github.com/watson-developer-cloud/answer-retrieval
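For the first approach, the annotator itself is ordinary UIMA Java code that you run in your own pre-indexing (or query-time) pipeline rather than uploading to R&R. A minimal sketch is below; the class name and term dictionary are purely illustrative, and a real pipeline would declare its own annotation type in a type system descriptor instead of the generic Annotation used here.

```java
// Sketch of a dictionary-based custom entity annotator for a UIMA pipeline
// that runs before documents are sent to Solr/R&R.
import org.apache.uima.analysis_component.JCasAnnotator_ImplBase;
import org.apache.uima.analysis_engine.AnalysisEngineProcessException;
import org.apache.uima.jcas.JCas;
import org.apache.uima.jcas.tcas.Annotation;

import java.util.Arrays;
import java.util.List;

public class DomainEntityAnnotator extends JCasAnnotator_ImplBase {

    // Hypothetical domain dictionary -- in practice this would come from a
    // resource file or a configuration parameter of the analysis engine.
    private static final List<String> DOMAIN_TERMS =
            Arrays.asList("widget-x", "acme turbine", "frobulator");

    @Override
    public void process(JCas jCas) throws AnalysisEngineProcessException {
        String text = jCas.getDocumentText().toLowerCase();
        for (String term : DOMAIN_TERMS) {
            int from = 0;
            int idx;
            while ((idx = text.indexOf(term, from)) >= 0) {
                // Mark the matched span; the code that builds the Solr/R&R
                // document can read these spans and copy them into an index
                // field or turn them into a feature score.
                Annotation entity = new Annotation(jCas, idx, idx + term.length());
                entity.addToIndexes();
                from = idx + term.length();
            }
        }
    }
}
```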

Publish one product to multiple sites

Is there a way to have one product definition and have it publish to multiple sites? I am looking for this ability specifically in DNN or Umbraco, either with free or paid extensions. I did install both platforms, played with the free extensions, and looked for any extension offering such functionality, but did not find one. Any links or pointers are highly appreciated!
I had looked for this information in many places before reaching out to the expert pool here, hoping to get some hints.
In Umbraco there is the built-in /base extension (http://our.umbraco.org/wiki/reference/umbraco-base), which enables you to access product data maintained in Umbraco from other websites. Base is REST-ish and well documented - you can access the data as XML or JSON (see "Returning Json instead of XML with Umbraco Base").
Also, as the implementation is REST-ish, the other websites that consume the content maintained in the core site could be written in anything that can consume a REST feed, e.g. HTML and JavaScript.
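As a small illustration, a consuming site could fetch product data through /base roughly like this. The library alias "ProductApi", the method "GetProduct", and the product id are hypothetical; substitute whatever you register in your own /base configuration. The sketch happens to use Java's built-in HTTP client, but as noted above, any stack that can call a REST-ish URL works just as well.

```java
// Sketch of a remote site pulling product data from an Umbraco /base route.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ProductFeedClient {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // Hypothetical /base route: /base/{library alias}/{method}/{args}.aspx
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://products.example.com/base/ProductApi/GetProduct/1234.aspx"))
                .GET()
                .build();

        // /base returns XML by default (or JSON if the method is set up that
        // way); the body is handed to whatever parser the consuming site uses.
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```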
It's not 100% clear to me what setup you're after, but if you're looking to set up a traditional authoring/delivery configuration, one of the few paid offerings Umbraco has is called Courier. It's a very reasonably priced (~USD 135 / EUR 99) deployment manager that handles syncing content between two sites, i.e., an authoring server and a delivery server.
It's a very smart tool that manages content, configuration, and dependencies. It's neat, and it also supports a great open-source project!
If you're looking to set up something more like a centralized product database that is used by many sites, amelvin is on a good track with /base. They have a nice API where you can also set up your own web service (beyond their own web service functionality!).
If you need this centralized product data to notify the other sites to update their caches, I encourage you to look into the 'distributedCall' functionality.
There's a bit of documentation on distributed calls in this load-balancing tutorial that may help you understand the concept a bit better.
Hope this helps get you pointed in the right direction.

Creating a user-configurable New Relic Plugin

I've been playing around with the New Relic Ruby SDK and created a proof-of-concept plugin which gets data out of Graphite, and sends it to New Relic.
Other plugins I've seen target a well-known set of data (e.g. Apache requests or CPU load). However, in this case I cannot pre-configure the dashboards for publishing, because the data for each user will be completely different, depending on how they configure it and the data they store in their Graphite.
Is there a way to publish a plugin without a pre-configured dashboard / charts?
Every published New Relic plugin necessarily includes a dashboard. You could record metrics under a "Component/Graphite/" prefix and then expose the results generically in your associated dashboard with "Component/Graphite/*". Most likely those won't be very useful graphs.
If you treat this as a Graphite plugin SDK for users who want to easily collect Graphite metrics by configuration, it will make more sense. When doing this, you should make the GUID configurable as well and include clear instructions on changing the GUID for each use of the SDK. That way, users will get their own fresh dashboards each time they use your SDK.
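To make the "configuration over code" idea concrete, here is a rough sketch (deliberately not the real New Relic SDK API) of reading a per-deployment GUID and metric list from a properties file and turning them into Component/Graphite/* metric names. The plugin.properties file and its keys are hypothetical:

```java
// Sketch: each deployment supplies its own GUID and its own Graphite series,
// so every user of the "SDK-style" plugin publishes under a fresh GUID.
import java.io.FileInputStream;
import java.util.Properties;

public class GraphitePluginConfig {
    public static void main(String[] args) throws Exception {
        Properties config = new Properties();
        // plugin.properties is a hypothetical user-edited file, e.g.:
        //   guid=com.example.mycompany.graphite
        //   metrics=carbon.agents.*.metricsReceived,stats.timers.api.upper_90
        try (FileInputStream in = new FileInputStream("plugin.properties")) {
            config.load(in);
        }

        String guid = config.getProperty("guid");
        for (String metric : config.getProperty("metrics", "").split(",")) {
            // Each configured Graphite series becomes a New Relic-style
            // "Component/Graphite/<series>" metric name reported for this GUID.
            System.out.println(guid + " -> Component/Graphite/" + metric.trim());
        }
    }
}
```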
Yoav, I strongly suggest you do not distribute your plugin agent with a GUID in place (you currently have com.gingerlime.graphite.graphite).
If another user runs your agent with their New Relic license key, and doesn't change the GUID, they will be unable to customize their plugin dashboards, and any customizing you do will not be seen by them.
In other words, unpublished plugins should not be distributed publicly - anyone who uses an unpublished plugin will have a bad experience (unless they first customize the GUID).

Is there a standard way for calendar data to be passed between modules in DotNetNuke?

After reading a bit on development in DotNetNuke, I would imagine you would just pass it via the specific table in the database, and write some sort of strategy class for pulling the data in the same way from the correct calendar module.
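To make that idea concrete, here is a rough sketch of the kind of strategy abstraction I have in mind; all the names are illustrative and not part of any DNN calendar module's actual API (a real implementation would be in .NET, of course):

```java
// Sketch of the "strategy class" idea: one shared event shape, plus one
// adapter per calendar module that maps its own tables onto that shape.
import java.time.LocalDateTime;
import java.util.List;

class CalendarEvent {
    String title;
    LocalDateTime start;
    LocalDateTime end;
}

interface CalendarEventSource {
    List<CalendarEvent> getEvents(LocalDateTime from, LocalDateTime to);
}

// Example adapter for one hypothetical module; each other module would get
// its own implementation reading its own tables.
class EngageEventsSource implements CalendarEventSource {
    @Override
    public List<CalendarEvent> getEvents(LocalDateTime from, LocalDateTime to) {
        // Query that module's event table here and map rows to CalendarEvent.
        return List.of();
    }
}
```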
That said, is there a particular calendar module that is popular among DNN Users?
"is there a particular calendar module that is popular among DNN Users?"
I know that both Engage: Events and InvenManager's Event Calendar are widely used for calendaring in DNN.
