I have a basic question about Azure Cognitive Search: does it count as a PaaS or a SaaS service? My understanding is that it should be PaaS, since it involves configuration and deployment. Please correct me if I am wrong.
Azure Cognitive Search is a PaaS service. While there is a portal experience to help configure and evaluate the service, you need to write your own application to call the service in order to use its functionality.
Azure Cognitive Search is a search-as-a-service.
Search as a service uses a software as a service (SaaS) model.
https://en.wikipedia.org/wiki/Search_as_a_service
As a customer of Azure Cognitive Search, I had to write back-end services that handle creating data sources, indexers, and indexes, and likewise to call the REST API to retrieve the data. So even though much of this can be done through the Azure portal interface, you still need to write code to use some advanced features not yet present in the interface.
So I believe it is a PaaS offering.
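For context, the "write your own code" part is mostly plain REST calls. Here is a minimal sketch in Java of querying an existing index through the REST API; the service name, index name, API version, and query key are placeholders, not anything specific to your service:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SearchQueryExample {
    public static void main(String[] args) throws Exception {
        // Placeholders: replace with your own service name, index name, and query key.
        String serviceName = "my-search-service";
        String indexName = "my-index";
        String queryKey = System.getenv("AZURE_SEARCH_QUERY_KEY");

        // Simple full-text query against the index via the REST API.
        String url = "https://" + serviceName + ".search.windows.net/indexes/"
                + indexName + "/docs?api-version=2020-06-30&search=laptop";

        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                .header("api-key", queryKey)          // query or admin key
                .header("Accept", "application/json")
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println(response.body()); // JSON with a "value" array of matching documents
    }
}
```

Creating data sources, indexers, and indexes works the same way, just with PUT/POST requests against the corresponding REST endpoints.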
I'm looking for examples of how others might have solved this.
Did you build a custom cartridge?
Did you leverage some externally running agent to retrieve data via OCAPI (or the Commerce Cloud APIs), i.e., a "pull" strategy?
I've read the documentation, spent many nights searching Google, searched the Salesforce Commerce Cloud Marketplace, and spoken with several Salesforce Commerce Cloud expert consultants and system-integration firms, but it appears no one is aware of anyone having done this before.
Yes, I was able to create a custom cartridge for Twilio.
Twilio provides many separate REST APIs for sending text messages, making phone calls, looking up phone numbers, managing your accounts, and a whole lot more.
You can go through their REST API documentation at https://www.twilio.com/docs/sms/api, then use those REST APIs and implement the integration in Salesforce B2C Commerce.
Follow the guide on creating a simple web service in Salesforce B2C Commerce:
https://documentation.b2c.commercecloud.salesforce.com/DOC1/topic/com.demandware.dochelp/content/b2c_commerce/topics/web_services/b2c_coding_your_web_service.html
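For reference, the request the cartridge ultimately sends is just Twilio's Messages REST endpoint. Below is a standalone Java sketch of that call (the account SID, auth token, phone numbers, and message text are placeholders); inside B2C Commerce you would wrap the same HTTP request in the web-service framework from the linked guide rather than running Java.

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class TwilioSmsExample {
    public static void main(String[] args) throws Exception {
        // Placeholders: substitute your own Twilio credentials and numbers.
        String accountSid = System.getenv("TWILIO_ACCOUNT_SID");
        String authToken  = System.getenv("TWILIO_AUTH_TOKEN");

        String url = "https://api.twilio.com/2010-04-01/Accounts/" + accountSid + "/Messages.json";

        // Twilio expects a form-encoded body and HTTP Basic auth (SID:token).
        String body = "To="    + URLEncoder.encode("+15558675310", StandardCharsets.UTF_8)
                    + "&From=" + URLEncoder.encode("+15017122661", StandardCharsets.UTF_8)
                    + "&Body=" + URLEncoder.encode("Your order has shipped!", StandardCharsets.UTF_8);

        String basicAuth = Base64.getEncoder()
                .encodeToString((accountSid + ":" + authToken).getBytes(StandardCharsets.UTF_8));

        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                .header("Authorization", "Basic " + basicAuth)
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println(response.statusCode() + " " + response.body());
    }
}
```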
Is it possible for one GAE application to access the datastore of another GAE application (both applications are hosted under the same Google account) using Objectify? If so, how can I pass service account credentials to Objectify (which API calls)?
It is not possible through Objectify itself. Objectify is a very simple, convenient, lightweight ORM that sits on top of the GAE Datastore, shielding the developer from most of the complexities of using JDO/JPA.
Nowhere in its documentation have I seen the scenario you describe mentioned, because that is not the problem it is trying to solve.
I suspect what you will probably need to do is create a Web Service that exposes your GAE application (whose data you want) through an API. Then have your other GAE application call those service methods to obtain the data it needs.
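A minimal sketch of that approach, assuming a plain servlet in the application that owns the data; the Product entity, its fields, and the use of Gson for JSON output are just illustrative choices:

```java
import com.google.gson.Gson;
import com.googlecode.objectify.ObjectifyService;
import com.googlecode.objectify.annotation.Entity;
import com.googlecode.objectify.annotation.Id;

import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.io.IOException;
import java.util.List;

// Hypothetical entity owned by the "data" application.
@Entity
class Product {
    @Id Long id;
    String name;
    double price;
}

// Servlet exposing those entities as JSON so the other GAE app can fetch them over HTTP.
// (Map the servlet in web.xml and configure Objectify's filter as usual.)
public class ProductApiServlet extends HttpServlet {
    static {
        ObjectifyService.register(Product.class);
    }

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        List<Product> products = ObjectifyService.ofy().load().type(Product.class).limit(100).list();
        resp.setContentType("application/json");
        resp.getWriter().print(new Gson().toJson(products));
    }
}
```

The second application then only needs to issue an HTTP GET against that endpoint (for example with URLFetch) and parse the JSON; it never touches the other app's Datastore directly.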
Alternatively, you can use something called remote_api. It allows you to access and manipulate a GAE Datastore remotely.
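If you go the remote_api route from Java, the usual pattern looks roughly like this; the appspot host name and entity kind are placeholders, and note that you work against the low-level Datastore API rather than Objectify:

```java
import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import com.google.appengine.api.datastore.PreparedQuery;
import com.google.appengine.api.datastore.Query;
import com.google.appengine.tools.remoteapi.RemoteApiInstaller;
import com.google.appengine.tools.remoteapi.RemoteApiOptions;

public class RemoteDatastoreRead {
    public static void main(String[] args) throws Exception {
        // Point at the application whose Datastore you want to read (placeholder host name).
        RemoteApiOptions options = new RemoteApiOptions()
                .server("other-app-id.appspot.com", 443)
                .useApplicationDefaultCredential();   // service-account credentials

        RemoteApiInstaller installer = new RemoteApiInstaller();
        installer.install(options);
        try {
            // Low-level Datastore API; Objectify is reported not to work over remote_api.
            DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
            PreparedQuery pq = ds.prepare(new Query("Product")); // "Product" is a placeholder kind
            for (Entity e : pq.asIterable()) {
                System.out.println(e.getKey() + " -> " + e.getProperties());
            }
        } finally {
            installer.uninstall();
        }
    }
}
```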
Below are some links to similar questions that I found after posting my answer:
Can I access Datastore entities of my other Google App Engine Applications
Can one application access other applications data querying the key in Google App Engine?
A solution is to have only one "GAE application" but to create different modules within that application. The Datastore will be shared between the modules.
Another solution is to use the Remote API (https://developers.google.com/appengine/docs/java/tools/remoteapi), but you won't be able to use Objectify, I think...
I would like to import data from flat files stored in Google Drive into the Datastore, then use full-text search and other query options to analyze the data using Apps Script.
The Apps Script API documentation shows how we can access Google Drive data from a script.
Now, is there any API in Apps Script to access the Datastore from a script?
Google provides a (beta) REST API to access your data. Steps to enable are here.
However, BigQuery is usually better for the type of analysis you describe. See:
https://developers.google.com/apps-script/advanced/bigquery
At this point in time, you should consider writing your own web service to give you access to the Datastore. You can then call that web service, hosted in your App Engine application, from your Apps Script. A detailed example is provided over here.
Additionally, Cloud Datastore is now offered under the Google Cloud Platform, and there is an API available to interact with it. The API is extensive and allows both read and write operations, but keep in mind that it is currently in preview.
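As a rough illustration of what a call to that API looks like, here is a Java sketch of the runQuery REST method with a GQL query, shown against the current v1 endpoint; the project ID, the "Record" kind, and the OAuth access token are placeholders. From Apps Script you could issue the same HTTP request with UrlFetchApp.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class DatastoreRestQuery {
    public static void main(String[] args) throws Exception {
        // Placeholders: your Cloud project ID and an OAuth 2.0 access token
        // with the Datastore scope (https://www.googleapis.com/auth/datastore).
        String projectId = "my-project-id";
        String accessToken = System.getenv("GCP_ACCESS_TOKEN");

        String url = "https://datastore.googleapis.com/v1/projects/" + projectId + ":runQuery";

        // GQL query against a placeholder kind called "Record".
        String body = "{ \"gqlQuery\": { \"queryString\": \"SELECT * FROM Record\", \"allowLiterals\": true } }";

        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                .header("Authorization", "Bearer " + accessToken)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // The response JSON contains batch.entityResults with the matching entities.
        System.out.println(response.body());
    }
}
```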
I'm trying to find a good online/cloud database that has its own web-based GUI. If possible, I would like to find one that has APIs so I can access the data programmatically, but that still has a nice frontend that allows users to log in and edit records. I know Google's Fusion Tables does this, but the interface is clumsy and difficult to use for a real database.
I haven't been able to find anything else like this... can someone make a recommendation?
Thanks!
I know RackSpace Cloud has recently rolled out their Cloud Database service. I have had very good experience with their cloud services (Cloud Servers and Cloud Files/CDN); however, I have no experience with the Cloud Database service itself.
RackSpace Cloud Database
I am interested in best practices for accessing the Windows Azure API from a Silverlight application. I am pretty sure that, as an experienced developer, it will require me to build a back-end web service that Silverlight can then use as an interface between the Azure API and Silverlight. But I am concerned about speed and security... For instance, I am guessing I can use WCF, but what is the fastest way to get this communication to occur?
Also, the need for a web service is an assumption on my part; is there any support from Azure for Silverlight? I couldn't find anything on the Microsoft site about this, only how to host a Silverlight application in your Azure Storage blob, which is not what I am asking about...
Thanks!
Your assumption is correct. You will have to create a web service (WCF is considered best practice) that exposes the methods of the Azure API that you want to access.