I have a system that publishes messages to Cloud Pub/Sub, but the message data is passed as query parameters in a GET request (rather than in the body of a POST request).
Does PubSub even accept HTTP GET requests as published messages?
Is there any way to access these query parameters using the Apache Beam Java SDK?
With regard to the first question, the Google Cloud Pub/Sub REST API documentation states which HTTP method is accepted for each API call. Publish requires the POST method.
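As a concrete illustration, here is a minimal sketch of what a well-formed publish POST request looks like; the project and topic names are placeholders, and note that the REST API expects the `data` field to be base64-encoded:

```python
import base64
import json

def build_publish_request(project, topic, payload: bytes):
    # Publish is POST-only; the message payload goes in the JSON body,
    # not in query parameters, and must be base64-encoded.
    url = f"https://pubsub.googleapis.com/v1/projects/{project}/topics/{topic}:publish"
    body = {"messages": [{"data": base64.b64encode(payload).decode("ascii")}]}
    return url, json.dumps(body)

url, body = build_publish_request("my-project", "my-topic", b"hello")
print(url)
print(body)
```

Whatever currently arrives as GET query parameters would have to be repackaged into a body like this before Pub/Sub will accept it.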
I am using a MuleSoft Salesforce connector to integrate a system with Salesforce. Previously, for an operation such as getting assets from Salesforce, I was calling Salesforce three times using the HTTP connector:
to get a token
to get the Salesforce API version
the actual REST API call
To avoid the three calls, I switched to the Salesforce connector, which is easy to configure and easy to use.
My question is: what is the best approach here? Making the three HTTP calls, or using the Salesforce connector?
We did it using the Salesforce connector, and in MuleSoft I would say that's the best way to do it. It simplifies the whole process by handling the OAuth token service, etc.
Since Salesforce bought MuleSoft, I believe more seamless integration options will come in the future. There are many already.
That connector is just a wrapper over those multiple HTTP calls.
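To make the trade-off concrete, here is a rough sketch of the three requests the connector wraps. The endpoint paths follow Salesforce's OAuth username-password flow; the instance hostname and credentials are placeholders:

```python
def token_request(instance, client_id, client_secret, user, password):
    # 1. POST to the OAuth token endpoint to obtain an access token.
    url = f"https://{instance}/services/oauth2/token"
    form = {
        "grant_type": "password",
        "client_id": client_id,
        "client_secret": client_secret,
        "username": user,
        "password": password,
    }
    return url, form

def versions_request(instance):
    # 2. GET the list of available REST API versions.
    return f"https://{instance}/services/data/"

def assets_request(instance, version):
    # 3. The actual REST call, e.g. a SOQL query for assets.
    return f"https://{instance}/services/data/v{version}/query/?q=SELECT+Id+FROM+Asset"

url, form = token_request("example.my.salesforce.com", "id", "secret", "u", "p")
print(url)
```

The connector hides all three steps (plus token refresh), which is why it is usually the simpler choice.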
I am creating an API using Google App Engine (in Go).
The API endpoint I am creating makes an outbound HTTP request.
The problem is that the HTTP request made from this endpoint cannot use keep-alive.
Why?
Is it possible to use keep-alive?
Sample source code is here:
https://github.com/suzuito/gae-example
I am trying to use Google Cloud's Pub/Sub feature to store incoming data from an IoT device. I have an event callback which should be sending a JSON string to the Pub/Sub topic from the IoT device's back-end. The callback looks like this (where {project}, {topic} and {YOUR_API_KEY} are filled in as required):
POST https://pubsub.googleapis.com/v1/projects/{project}/topics/{topic}:publish?key={YOUR_API_KEY}
{"messages":[{"data":"test"}]}
I am invariably getting error 403 with this setup. I have tried various slight variations on this and found other errors. I am very new to this topic; is there an obvious mistake I am making?
API Keys are not sufficient for authenticating to the Google Cloud Pub/Sub API. Only a subset of GCP services allow access using only an API key, as detailed in the Using API Keys documentation. You will need to use a service account to authenticate for using Cloud Pub/Sub. You might also want to consider Google Cloud IoT, which sends telemetry into Cloud Pub/Sub.
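With service-account credentials in hand, the request differs from the one in the question in two ways: authentication moves from the `?key=` query parameter to a Bearer token header, and the `data` field must be base64-encoded. A sketch, where the token value is a placeholder (in practice it would come from a library such as google-auth):

```python
import base64
import json

def publish_request(project, topic, token, text):
    url = f"https://pubsub.googleapis.com/v1/projects/{project}/topics/{topic}:publish"
    headers = {
        "Authorization": f"Bearer {token}",  # replaces ?key=... in the URL
        "Content-Type": "application/json",
    }
    # The data field must be base64-encoded, so "test" becomes "dGVzdA==".
    body = json.dumps({"messages": [{"data": base64.b64encode(text.encode()).decode("ascii")}]})
    return url, headers, body

url, headers, body = publish_request("my-project", "my-topic", "FAKE_TOKEN", "test")
print(headers["Authorization"])
print(body)
```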
For testing/debugging purposes, I want to create a web app that emulates the functionality of one of the third-party actors in our system. It should be able to publish and subscribe to messages sent on the AWS SNS service.
I was planning to make a ReactJS web app that calls an API made in AWS Lambda. Sending messages should be fine, with some buttons in the app, calling a Lambda that publishes SNS messages to a topic.
But what about monitoring the messages sent to the relevant topics that I want to watch? I was thinking about using a websocket that could receive messages. I know I can trigger a Lambda with SNS messages, but how do I make the Lambda deliver these messages to the websocket? Is that possible at all without having a permanent server session running? Should I combine it with other things in the AWS toolbox?
When I originally wrote this answer, WebSocket support for Lambda was not available, but it is now: https://aws.amazon.com/blogs/compute/announcing-websocket-apis-in-amazon-api-gateway/
I was also looking for the exact same thing, but unfortunately AWS SNS doesn't have WebSocket support.
However, I came across a very interesting blog post. What the author has done is use AWS IoT, which supports WebSockets and pub/sub. You can take a look here
Edit: AWS API Gateway now provides the functionality to manage WebSockets in a serverless way. Here is a quick-start guide: API Gateway Websockets
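Combining the two answers, the SNS-to-browser path becomes: SNS triggers a Lambda, and the Lambda pushes each message to open WebSocket connections through the API Gateway Management API. A sketch, where the connection-ID lookup and endpoint URL are assumptions about your deployment; the event-parsing part is plain Python:

```python
import json

def extract_sns_messages(event):
    # An SNS-triggered Lambda receives one record per delivered message.
    return [record["Sns"]["Message"] for record in event.get("Records", [])]

def handler(event, context):
    messages = extract_sns_messages(event)
    # In a real deployment you would look up the active connection IDs
    # (e.g. stored in DynamoDB by the $connect route) and push each
    # message, roughly:
    #
    #   import boto3
    #   api = boto3.client("apigatewaymanagementapi",
    #                      endpoint_url="https://{api-id}.execute-api.{region}.amazonaws.com/{stage}")
    #   for conn_id in active_connections:
    #       for msg in messages:
    #           api.post_to_connection(ConnectionId=conn_id, Data=msg.encode())
    return {"statusCode": 200, "body": json.dumps({"relayed": len(messages)})}

event = {"Records": [{"Sns": {"Message": "hello"}}]}
print(handler(event, None))
```

No permanent server session is needed: API Gateway holds the WebSocket connections, and the Lambda only runs when SNS delivers a message.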
Google Cloud Datastore client libraries are available for some languages only. How can I perform operations such as creating, updating, and deleting an entity using plain HTTP requests, without the client libraries?
The Datastore API is built on HTTP and JSON, so any standard HTTP client can send requests to it and parse the responses. You can find more about building a flexible runtime here
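For example, a create/update maps to a commit request with an upsert mutation. The following sketch builds such a request body from the v1 REST API's JSON shapes; the project ID, kind, and property values are placeholders, and the request would be sent as an authenticated POST:

```python
import json

def build_upsert(project, kind, name, properties):
    # Datastore v1 REST: POST .../projects/{project}:commit with a
    # mutations list; "upsert" creates or updates, "delete" removes.
    url = f"https://datastore.googleapis.com/v1/projects/{project}:commit"
    body = {
        "mode": "NON_TRANSACTIONAL",
        "mutations": [{
            "upsert": {
                "key": {"path": [{"kind": kind, "name": name}]},
                "properties": properties,
            }
        }],
    }
    return url, json.dumps(body)

url, body = build_upsert("my-project", "Task", "task1",
                         {"done": {"booleanValue": False}})
print(url)
```

A delete would use the same commit endpoint with a `"delete": {key}` mutation in place of the upsert.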
One of the classic ways to do this is to expose the Datastore CRUD operations through a set of REST APIs. Google offers Cloud Endpoints, which is a set of "tools, libraries and capabilities that allow you to generate APIs and client libraries from an App Engine application": https://cloud.google.com/appengine/docs/java/endpoints/
You can have a look at this tutorial https://rominirani.com/google-cloud-endpoints-tutorial-part-1-b571ad6c7cd2#.1j9holpdt