Comparison between luis.ai vs api.ai vs wit.ai? - artificial-intelligence

Does anyone know the specific differences and features among the three, or whether one has more features or is more flexible to use as a developer?

wit.ai vs Dialogflow vs luis.ai
1. Pricing
   Wit.ai: The API is completely free, with no limits on request rates.
   Api.ai (Dialogflow): Has a paid enterprise option that can be run on a private cloud internally, with more support from their services team. Since the Google acquisition, free usage is provided through integration with Google Cloud services.
   Luis.ai: LUIS is in beta and free to use: 10K transactions per month and up to 5 requests per second per account.

2. Core capabilities
   Wit.ai: Provides a nice combination of voice recognition and machine learning for developers.
   Api.ai (Dialogflow): Speech-to-text and text-to-speech capabilities, along with machine learning.
   Luis.ai: Uses machine-learning-based methods to analyze sentences; to do so, LUIS breaks an utterance into "tokens".

3. Building blocks
   Wit.ai: Has two main elements that you set up within your app, intents and entities; actions are kept separate and used as combined operations.
   Api.ai (Dialogflow): Supports intents, entities and actions, and one key focus area is its "domains".
   Luis.ai: Supports intents, entities and actions.

4. Pre-built entities
   Wit.ai: Has pre-built entities such as temperature, number, URLs, emails, duration, etc.
   Api.ai (Dialogflow): Has pre-built entities such as #sys.date, #sys.color, #sys.unit-currency, etc.
   Luis.ai: Has pre-built entities such as builtin.intent.alarm, builtin.intent.calendar, builtin.intent.email, etc.

5. Integrations and deployment
   Wit.ai: Has no integration module for communicating directly with Facebook Messenger or other messenger APIs, but it exposes a web-service API you can hook into.
   Api.ai (Dialogflow): Has integration modules that connect directly to Facebook Messenger and other messenger APIs, and supports deployment to a Heroku server or the paid enterprise environment.
   Luis.ai: Integrates with Microsoft Azure and other services, and can be deployed on any supported server.

6. Background
   Wit.ai: Joined Facebook early in 2015 and opened up the entire platform for free, for both public and private instances.
   Api.ai (Dialogflow): Created by a team that built personal assistant apps for major mobile platforms with speech- and text-enabled conversations; acquired by Google (Sept 2016).
   Luis.ai: Introduced together with the Microsoft Bot Framework and the Skype Developer Platform, which can be used to create Skype bots.

7. SDKs and developer experience
   Wit.ai: Offers APIs for developers on iOS, Android, Node.js, Raspberry Pi, Ruby, Python, C, Rust and Windows Phone, and even has a JavaScript plugin for front-end developers.
   Api.ai (Dialogflow): Has SDKs for Android, iOS, the Apple Watch, Node.js, Cordova, Unity, C#, Xamarin, Windows Phone, Python and JavaScript; it can also be integrated with Amazon's Echo and Microsoft's Cortana.
   Luis.ai: Applications are built through the LUIS web interface; no coding is needed beyond interpreting and using the returned JSON in your application. The LUIS REST API can also be used to automate applications.
Update: API.AI is now Dialogflow.
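If you want to get a feel for the developer experience behind the comparison above, all of these services are reachable over plain HTTPS. Below is a minimal, hedged sketch in Python using the requests library; the tokens, app ID, region and version parameters are placeholders (my assumptions, not values from this post), and the endpoints shown are the wit.ai /message and LUIS v2 app endpoints as documented around the time of this comparison. Dialogflow's current (v2) API is usually accessed through Google's client libraries instead.

# Minimal sketch: sending one utterance to wit.ai and LUIS over HTTPS.
# All tokens, the app ID and the region are placeholders; endpoint versions
# may differ from what your account exposes, so verify against current docs.
import requests

UTTERANCE = "Book me a flight to Cairo"

# wit.ai: GET /message with a Bearer server access token
wit = requests.get(
    "https://api.wit.ai/message",
    params={"v": "20170307", "q": UTTERANCE},   # 'v' pins an API version date
    headers={"Authorization": "Bearer WIT_SERVER_ACCESS_TOKEN"},
)
print(wit.json())   # parsed intents/entities as JSON

# LUIS (v2-style endpoint): GET with app ID and subscription key;
# verbose=true returns the full list of intents, not just the top one.
luis = requests.get(
    "https://westus.api.cognitive.microsoft.com/luis/v2.0/apps/YOUR_APP_ID",
    params={"subscription-key": "YOUR_LUIS_KEY", "q": UTTERANCE, "verbose": "true"},
)
print(luis.json())  # topScoringIntent, intents[], entities[]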

This blog post has a really good analysis and comparison of Luis, Wit.ai, Api.ai, Amazon Alexa and IBM Watson services. It also has a nice background on why you would want to build a conversational bot in the first place and some of the challenges that come with that. It's written by the people behind YumiBot (a bot that gives you price quotes for app development).
The general gist is that Wit.ai and Luis are great choices if you're experimenting and just want to get something out for free. Api.ai has a great service and user experience but isn't free; the same goes for IBM Watson, which is priced more for enterprise work. Alexa's API is great but only works with Alexa (though given their huge user base, that isn't a bad deal).
Their advice is also to not rely too much on one provider:
We would recommend you store all data needed for your model in a structured way in your own code repository. So later you can retrain the model from scratch, or even change the language understanding provider if needed. You just don’t want to be in a situation when a company shuts down their service and you are completely unprepared. Do you remember Parse?
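As a concrete illustration of that advice, here is a small sketch of keeping intents and example utterances in your own repository in a neutral format. The file name and field names are invented for illustration, not any provider's required schema; the point is that you can regenerate whatever import format wit.ai, LUIS or Dialogflow expects from this single source.

# Provider-agnostic training data kept in your own repo.
# The schema below is illustrative, not any vendor's import format.
import json

TRAINING_DATA = {
    "intents": [
        {
            "name": "BookFlight",
            "utterances": [
                "Book me a flight to Cairo",
                "I need a plane ticket to Berlin tomorrow",
            ],
            "entities": ["destination", "date"],
        },
        {
            "name": "OrderFood",
            "utterances": ["Order a pizza", "Get me some sushi for lunch"],
            "entities": ["dish"],
        },
    ]
}

with open("nlu_training_data.json", "w") as f:
    json.dump(TRAINING_DATA, f, indent=2)

# Later: load this file and convert it into the import/training format of
# whichever NLU provider you are currently using.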
I hope this helped a little! I think the best way to make a choice is to just give these services a try. Given that a lot of them are still heavily under development and adding features/changing pricing models, you should approach them with a specific use case and see which one gets you where you need to be the quickest.

We have recently published an evaluation study of seven NLU API-enabled services: API.ai, Amazon Lex, Microsoft LUIS, IBM Watson Conversation, wit.ai, Recast.ai and Snips.ai.
A brief summary of our findings:
IBM Watson intent detection is the best one, especially on smaller training datasets (although when trained on over 2000 samples the difference is indistinguishable).
API.AI is free, and its performance on a big enough training set matches IBM Watson and Microsoft LUIS.
Microsoft LUIS works significantly faster than others in our tests.
wit.ai has somewhat worse performance and response time than the three above, but it’s free and it provides the best language coverage (some 50 languages).
Amazon Lex has quite strict API limits: the training set size is limited to 200K symbols, which may be insufficient to reach good intent detection quality for a multi-intent assistant; it also requires all training utterances to be labeled with entities, which complicates preparation of the dataset.

One aspect of this question is how efficient these tools are at understanding natural language. In a recent benchmark we (Snips, a French AI company) just published, we tested the built-in natural language engines of Alexa (Amazon), SiriKit (Apple), Luis (Microsoft), and API.ai (Google).
We tested their ability to understand natural queries like “Find me a salad bar I can go to for my lunch meeting”, “Order a cab for 6 people”, as well as 326 other queries.
The overall conclusion is that all solutions are imperfect.
More precisely, they all have similar levels of noise in their responses (between 60% and 90% precision), but there are significant differences in the breadth of language they can support. From this perspective, Luis performs the most poorly: on every use case we tested, it understood less than 14% of the queries. API.ai performs better, although not very reliably: it understands between 0 and 80% of the queries we tested, depending on the use cases. The highest levels of recall can be observed for Alexa (42% and 82% recall) and Siri (61% recall).
More details, and the raw data behind these results, can be found in our blog post, Benchmarking Natural Language Understanding Systems.

I am going to answer the last part of your question, around flexibility and being a developer. IMO it finally comes down to what you are looking for in these platforms.
If you are a developer using Node.js or .NET, LUIS.ai has an extensive library, well-defined code snippets and examples to spin up a decent bot pretty quickly. Its intent and entity recognition is a bit below par compared to Google, but if you are a Microsoft shop there are a lot of 1-click integrations to O365, Teams, Skype, Cortana, etc. The con for LUIS.ai is that the service seems very unstable: as of this writing the LUIS.ai website has been rejecting connections for more than a week, and the Cortana integration is not working. So the platform is still a work in progress.
Api.ai, from a pure NLU perspective, is better than Luis.ai: follow-up intents are very easy to set up, and its speech priming is vastly superior to Luis.ai (even after speech priming). The cons, I would say, are connectivity, and that the API for building a bot is a bit more complicated than building an MS Bot Framework based chatbot.
Another open-source platform that is gaining traction is Rasa NLU (https://rasa.com/). Comparatively, entity recognition and ranking are still a bit sketchy on large datasets, but it is open source, and if you want to get your hands dirty you can fork it on GitHub and improve it.
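For the curious, parsing an utterance against a self-hosted Rasa model is a single HTTP call. The sketch below assumes a locally running Rasa server with its HTTP API enabled; the port and the /model/parse route follow Rasa's documented HTTP API for recent versions, but treat them as assumptions and check the version you deploy.

# Rough sketch: parsing an utterance with a locally running Rasa server,
# e.g. started with `rasa run --enable-api`. Route/payload may vary by version.
import requests

resp = requests.post(
    "http://localhost:5005/model/parse",
    json={"text": "Book me a flight to Cairo"},
)
result = resp.json()
print(result["intent"])             # e.g. {"name": "book_flight", "confidence": 0.93, ...}
print(result.get("entities", []))   # extracted entities, if any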
From a pure development perspective, it's easier to spin up a chatbot on the MS platform (using luis.ai or qnamaker.ai), but be prepared for challenges as they work on stabilizing the platform.
-Kartik

In my opinion LUIS is more robust and can extract entities in different languages.
I've tested api.ai and Dutch did not work for me.
If you need English only, then any one of them should be fine, but if you need to support more languages then better test those languages as well before committing to one service.
Bing speech-to-text is OK, but I think that to get a more robust solution you will need another Microsoft service that cleans up voice and noise.

I was using DialogFlow but I switched to LUIS. Why? Because when you call DetectIntent in DialogFlow you get JSON with only the selected intent and its confidence level, but I need a list of intents with the confidence level of each one. The same applies to wit.ai and api.ai.
LUIS, on the other hand, gives you a list of intents in its response. That way, I can apply further processing on my side.
This is an example LUIS response for the query "Book me a flight to Cairo" (part of the LUIS example):
{
  "query": "Book me a flight to Cairo",
  "topScoringIntent": {
    "intent": "BookFlight",
    "score": 0.9887482
  },
  "intents": [
    {
      "intent": "BookFlight",
      "score": 0.9887482
    },
    {
      "intent": "None",
      "score": 0.04272597
    },
    {
      "intent": "LocationFinder",
      "score": 0.0125702191
    },
    {
      "intent": "Reminder",
      "score": 0.00375502417
    },
    {
      "intent": "FoodOrder",
      "score": 3.765154E-07
    }
  ],
  "entities": [
    {
      "entity": "cairo",
      "type": "Location",
      "startIndex": 20,
      "endIndex": 24,
      "score": 0.956781447
    }
  ]
}
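The intents array is what enables that extra processing. As a small sketch (the 0.2 threshold and the fallback behaviour are arbitrary illustrative choices, not anything LUIS prescribes), you could keep every intent above a confidence cut-off and fall back when nothing is confident enough:

# Sketch: post-processing the LUIS response shown above.
# The threshold and the fallback behaviour are arbitrary illustrative choices.
def candidate_intents(luis_response, threshold=0.2):
    """Return (intent, score) pairs above the threshold, best first."""
    scored = [(i["intent"], i["score"]) for i in luis_response.get("intents", [])]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return [(name, score) for name, score in scored
            if name != "None" and score >= threshold]

response = {   # a trimmed copy of the JSON above, already parsed into a dict
    "topScoringIntent": {"intent": "BookFlight", "score": 0.9887482},
    "intents": [
        {"intent": "BookFlight", "score": 0.9887482},
        {"intent": "None", "score": 0.04272597},
        {"intent": "LocationFinder", "score": 0.0125702191},
    ],
}

candidates = candidate_intents(response)
if candidates:
    print("Possible intents:", candidates)   # [('BookFlight', 0.9887482)]
else:
    print("No confident intent; ask the user to rephrase.")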
On the other hand, the UI for configuring DialogFlow is much more powerful than what you get with LUIS.

Related

Data Vault model for an insurance use case

I am currently working on an insurance use case and would like to apply a Data Vault 2.0 modeling approach to solve this challenge.
The scenario is as follows:
A contract was initially created on 28 March but becomes effective on 01 April.
In June there was an adjustment and the premium was increased from 100 to 125, effective July.
In November there was another adjustment and the premium was increased from 125 to 150, effective December.
To represent the different effective periods of the contract, a possible approach could be a satellite table with multiple active rows.
My experience with Data Vault has always been limited to use cases with one active record in the satellite table.
Question:
This does not feel like a real Data Vault 2.0 modeling approach to me.
How could this use case be simplified and solved with a Data Vault 2.0 modeling approach? What could the data model look like?
Is it even possible to apply an INSERT-only modeling approach to this use case?

Purchase Order (PO) link to Sales Order (SO) breaks / lost - MYOB Exo Business

I'm using MYOB Exo Business. We have Sales Orders in the system which sync from Salesforce through Jitterbit. Our purchasing team creates a Purchase Order against the product lines on the SO, which creates a linked PO. However, we have been experiencing breaks in the link. That is, the PO and the SO are both still there, but they are no longer connected, which makes it appear as if the PO hasn't been created.
This could simply be due to updates in Exo breaking the link, and/or changes in Salesforce that are pushing through to Exo, and then breaking the link.
Has anyone experienced this, or something similar, before? There are just so many potential reasons why this could be happening, but it would be good to hear the most likely causes/ideas.
Thanks.

User database ownership in webapps

I am developing a webapp where each user will access his own data.
Think of Pivotal Tracker and the like as an example, and assume each user will store 2 different data types, like so:
table project
id | name
0  | foo
1  | bar

table story
id | name | effort
1  | baz  | 5
2  | ex   | 2
I can think of 2 solutions:
1) Provide each table with an additional user_id column, so that each row is bound to its owner.
2) Set up a new database schema for each new user.
Personally, I lean towards 2) because it would provide stronger security (not bound to the application level).
What would be the recommended way, and why?
Solution 2 appears rather exotic to me. It means creating a new database schema each time a user is added. Now, if you have very few users, and new users are only added very rarely, this may be feasible. But if you have lots of users to accommodate, you will need an automatism to create those schemas on the fly. Sure, this is possible, but you will leave the grounds of existing tools and frameworks that support your development. E.g., the Java Persistence API links a Java class to a table and won't support dynamic database definition.
Also, I have doubts concerning the security level. In a web app, the database "user" is not the actual human user behind the browser, but the application server, which owns a database connection for its entire runtime. Therefore, individual human users' access rights aren't handled by the database, but by the application.
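For what it is worth, here is a minimal sketch of solution 1 (one shared schema with a user_id column), using SQLite purely for illustration; the table and column names follow the example above, and the key point is that every query is scoped to the owning user at the application level.

# Minimal sketch of solution 1: one shared schema, every row tagged with its
# owner via user_id. SQLite is used only for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE project (id INTEGER PRIMARY KEY, user_id INTEGER NOT NULL, name TEXT);
    CREATE TABLE story   (id INTEGER PRIMARY KEY, user_id INTEGER NOT NULL,
                          project_id INTEGER, name TEXT, effort INTEGER);
""")

conn.execute("INSERT INTO project (user_id, name) VALUES (?, ?)", (42, "foo"))
conn.execute("INSERT INTO story (user_id, project_id, name, effort) VALUES (?, ?, ?, ?)",
             (42, 1, "baz", 5))

def projects_for(user_id):
    # Every read is filtered by the owner; the application enforces isolation.
    return conn.execute(
        "SELECT id, name FROM project WHERE user_id = ?", (user_id,)
    ).fetchall()

print(projects_for(42))   # [(1, 'foo')]
print(projects_for(99))   # [] -- another user sees nothing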

Google Cloud Bigtable vs Google Cloud Datastore

What is the difference between Google Cloud Bigtable and Google Cloud Datastore / App Engine datastore, and what are the main practical advantages/disadvantages? AFAIK Cloud Datastore is built on top of Bigtable.
Based on experience with Datastore and reading the Bigtable docs, the main differences are:
Bigtable was originally designed for HBase compatibility, but now has client libraries in multiple languages. Datastore was originally more geared towards Python/Java/Go web app developers (originally App Engine)
Bigtable is 'a bit more IaaS' than Datastore in that it's not 'just there' but requires a cluster to be configured.
Bigtable supports only one index, the 'row key' (the entity key in Datastore)
This means queries are on the key, unlike Datastore's indexed properties (see the sketch after this list)
Bigtable supports atomicity only on a single row - there are no transactions
Mutations and deletions appear not to be atomic in Bigtable, whereas Datastore provides eventual and strong consistency, depending on the read/query method
The billing model is very different:
Datastore charges for read/write operations, storage and bandwidth
Bigtable charges for 'nodes', storage and bandwidth
Bigtable is optimized for high volumes of data and analytics
Cloud Bigtable doesn’t replicate data across zones or regions (data within a single cluster is replicated and durable), which means Bigtable is faster and more efficient, and costs are much lower, though it is less durable and available in the default configuration
It uses the HBase API - there’s no risk of lock-in or new paradigms to learn
It is integrated with the open-source Big Data tools, meaning you can analyze the data stored in Bigtable in most analytics tools customers use (Hadoop, Spark, etc.)
Bigtable is indexed by a single Row Key
Bigtable is in a single zone
Cloud Bigtable is designed for larger companies and enterprises who often have larger data needs with complex backend workloads.
Datastore is optimized to serve high-value transactional data to applications
Cloud Datastore has extremely high availability with replication and data synchronization
Datastore, because of its versatility and high availability, is more expensive
Datastore is slower writing data due to synchronous replication
Datastore has much better functionality around transactions and queries (since secondary indexes exist)
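To make the indexing difference concrete, here is the sketch referenced above, using the google-cloud-bigtable and google-cloud-datastore Python clients. The project, instance, table and kind names are placeholders, and these client libraries evolve, so treat the exact calls as illustrative rather than canonical.

# Sketch: Bigtable reads are driven by the row key, while Datastore can filter
# on any indexed property. Project/instance/table/kind names are placeholders.
from google.cloud import bigtable
from google.cloud import datastore

# Bigtable: the only "index" is the row key, so access patterns are key scans.
bt_client = bigtable.Client(project="my-project")
table = bt_client.instance("my-instance").table("events")
rows = table.read_rows(start_key=b"user42#", end_key=b"user42#\xff")  # key-prefix scan
for row in rows:
    print(row.row_key)

# Datastore: secondary indexes let you query by property instead of by key.
ds_client = datastore.Client(project="my-project")
query = ds_client.query(kind="Event")
query.add_filter("user_id", "=", 42)     # filter on an indexed property
for entity in query.fetch(limit=10):
    print(entity.key, dict(entity))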
Bigtable and Datastore are extremely different. Yes, Datastore is built on top of Bigtable, but that does not make it anything like it. That is kind of like saying a car is built on top of wheels, and so a car is not much different from wheels.
Bigtable and Datastore provide very different data models and very different semantics in how the data is changed.
The main difference is that the Datastore provides SQL-database-like ACID transactions on subsets of the data known as entity groups (though the query language GQL is much more restrictive than SQL). Bigtable is strictly NoSQL and comes with much weaker guarantees.
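A sketch of that difference in practice, again with the Python clients and placeholder names, so treat it as illustrative: Datastore can commit several entities atomically inside a transaction, while Bigtable's atomicity ends at a single row's mutation.

# Sketch: a Datastore transaction spanning multiple entities vs. a single-row
# Bigtable mutation. Names are placeholders; calls are illustrative.
from google.cloud import bigtable, datastore

ds = datastore.Client(project="my-project")
with ds.transaction():                        # mutations commit atomically when this block exits
    order = datastore.Entity(ds.key("Order", "o-1"))
    order.update({"status": "paid"})
    account = datastore.Entity(ds.key("Account", "a-1"))
    account.update({"balance": 90})
    ds.put_multi([order, account])            # both commit or neither does

bt = bigtable.Client(project="my-project")
row = bt.instance("my-instance").table("orders").direct_row(b"o-1")
row.set_cell("cf1", b"status", b"paid")       # column family "cf1" must already exist
row.commit()                                  # atomic only for this one row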
I am going to try to summarize all the answers above, plus what is given in the Coursera course Google Cloud Platform Big Data and Machine Learning Fundamentals:
+---------------------+---------------------------------------------------+-------------------------------------------+
| Category            | Bigtable                                          | Datastore                                 |
+---------------------+---------------------------------------------------+-------------------------------------------+
| Technology          | Based on HBase (uses the HBase API)               | Uses Bigtable itself                      |
| Access metaphor     | Key/value (column families), like HBase           | Persistent hashmap                        |
| Read                | Scan rows                                         | Filter objects on a property              |
| Write               | Put row                                           | Put object                                |
| Update granularity  | Can't update a row in place (you write a new row) | Can update an attribute                   |
| Capacity            | Petabytes                                         | Terabytes                                 |
| Index               | Row key only (you should design the key properly) | You can index any property of the object  |
| Usage and use cases | High-throughput, scalable, flat data              | Structured data for Google App Engine     |
+---------------------+---------------------------------------------------+-------------------------------------------+
In terms of the published papers, Bigtable is the system described in the Bigtable paper, and Datastore is Megastore. Datastore is Bigtable plus replication, transactions, and indexes (and it is much more expensive).
A relatively minor point to consider: as of November 2016, the Bigtable Python client library is still in alpha, which means future changes might not be backward compatible. Also, the Bigtable Python library is not compatible with App Engine's standard environment; you have to use the flexible environment.
Cloud Datastore is a highly scalable NoSQL database for your applications. Like Cloud Bigtable, there is no need for you to provision database instances.
Cloud Datastore uses a distributed architecture to automatically manage scaling. Your queries scale with the size of your result set, not the size of your data set.
Cloud Datastore runs in Google data centers, which use redundancy to minimize impact from points of failure. Your application can still use Cloud Datastore when the service receives a planned upgrade.
Choose Bigtable if the data is:
● Big: large quantities (>1 TB) of semi-structured or structured data
● Fast: high-throughput or rapidly changing data
● NoSQL: transactions and strong relational semantics are not required
And especially if it is:
● Time series: the data is time-series or has a natural semantic ordering
● Big data: you run asynchronous batch or real-time processing on the data
● Machine learning: you run machine learning algorithms on the data
Bigtable is designed to handle massive workloads at consistent low latency and high throughput, so it's a great choice for both operational and analytical applications, including IoT, user analytics, and financial data analysis.
Datastore is more application-ready and suitable for a wide range of services, especially microservices.
The underlying technology of Datastore is Bigtable, so you can imagine that Bigtable is the more powerful of the two.
Datastore comes with 20K free operations per day, so you can host a server with a reliable DB at zero cost.
You can also check out this Datastore ORM library; it comes with a lot of great features:
https://www.npmjs.com/package/ts-datastore-orm

Integrate multiple instances of java desktop application with online accounting (quickbooks, peachtree)

I am writing a Java desktop application which will connect to a server and pull various data products. Customers will be charged a 'per-click' fee (e.g. Sally at Acme pulls a data report costing $5 anywhere from 1 to several thousand times per day -- I want her name, customer ID, product price, date/time, etc. sent to QuickBooks or Peachtree each time she does this). So we could potentially have anywhere from 10 to eventually several hundred instances of this application out there.
Does anyone have any suggestions as to how I might approach this? I want to use ONLY Java, so ideally there would be an API out there for QuickBooks or Peachtree that would allow for this integration.
Thanks!!!
Marc
I think more detail is needed. The first problem is which application Sally is pulling a report from; if it is in QuickBooks Online, how will you get that information from the website to your application? And why do you want to write that information to QuickBooks? Writing to an independent database may be easier. It sounds like there may be some security concerns with what you are trying to do that may be difficult to overcome in the browser.
