For testing/debugging purposes, I want to create a web app that emulates the functionality of one of the third-party actors in our system. It should be able to publish and subscribe to messages sent on AWS SNS.
I was planning to make a ReactJS web app that calls an API built on AWS Lambda. Sending messages should be fine: some buttons in the app would call a Lambda that publishes SNS messages to a topic.
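A minimal sketch of what that publishing Lambda could look like (assuming the Node.js runtime and AWS SDK v3; the TOPIC_ARN environment variable and the request body shape are my own placeholders):

```typescript
// Hedged sketch: a Lambda that publishes an incoming message to SNS.
// TOPIC_ARN is a hypothetical environment variable you would configure.
import { SNSClient, PublishCommand } from "@aws-sdk/client-sns";

const sns = new SNSClient({});

export const handler = async (event: { body: string }) => {
  const { message } = JSON.parse(event.body); // body shape is an assumption
  await sns.send(
    new PublishCommand({ TopicArn: process.env.TOPIC_ARN, Message: message })
  );
  return { statusCode: 200, body: JSON.stringify({ published: true }) };
};
```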
But what about monitoring the messages sent to the relevant topics that I want to watch? I was thinking about using a WebSocket that could receive messages. I know I can trigger a Lambda with SNS messages, but how do I make the Lambda deliver these messages to the WebSocket? Is that possible at all without having a permanent server session running? Should I combine it with other things in the AWS toolbox?
When I originally wrote this answer, WebSocket support for Lambda was not available, but it is now: https://aws.amazon.com/blogs/compute/announcing-websocket-apis-in-amazon-api-gateway/
I was also looking for the exact same thing, but unfortunately AWS SNS doesn't have WebSocket support.
But I came across a very interesting blog post. What the author has done is use AWS IoT, which supports WebSockets and pub/sub. You can take a look here.
Edit:
AWS API Gateway provides the functionality to manage WebSockets in a serverless way. Here is a quick-start guide: API Gateway Websockets.
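To connect the dots with the original question: once clients are connected through an API Gateway WebSocket API, an SNS-triggered Lambda can push messages out to them via the management API. A rough sketch (the endpoint variable and the connection-ID lookup are assumptions; in practice connection IDs are usually stored in DynamoDB on $connect):

```typescript
// Sketch: SNS-triggered Lambda pushing messages to WebSocket clients
// through API Gateway's management API (AWS SDK v3).
import {
  ApiGatewayManagementApiClient,
  PostToConnectionCommand,
} from "@aws-sdk/client-apigatewaymanagementapi";
import type { SNSEvent } from "aws-lambda";

// Hypothetical HTTPS endpoint of your WebSocket API stage.
const client = new ApiGatewayManagementApiClient({
  endpoint: process.env.WEBSOCKET_API_ENDPOINT,
});

// Placeholder: in practice you would read these from DynamoDB,
// saved when each client's $connect route fired.
async function getConnectionIds(): Promise<string[]> {
  return [];
}

export const handler = async (event: SNSEvent) => {
  const message = event.Records[0].Sns.Message;
  for (const connectionId of await getConnectionIds()) {
    await client.send(
      new PostToConnectionCommand({
        ConnectionId: connectionId,
        Data: Buffer.from(message),
      })
    );
  }
};
```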
I have a React frontend on S3 and a Python backend on Lambda. I need to implement asynchronous processing and am trying to find the best way to do this. Would it be possible for my React frontend to subscribe to an AWS SNS topic while it is hosted on S3? Or does that require a server that is always running?
You could link the SNS topic to an SQS queue. You could then make React consume the queue (using a Cognito token to provide credentials for it).
This way you limit your exposed surface but still allow appropriate access to the resources.
The direct answer to your question: No, you can't connect React directly to SNS.
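A minimal sketch of what consuming the queue from the browser could look like, assuming AWS SDK for JavaScript v3 and a Cognito identity pool; the region, pool ID, and queue URL are placeholders:

```typescript
// Sketch: polling an SQS queue from the browser with temporary
// Cognito credentials (AWS SDK for JavaScript v3).
import {
  SQSClient,
  ReceiveMessageCommand,
  DeleteMessageCommand,
} from "@aws-sdk/client-sqs";
import { fromCognitoIdentityPool } from "@aws-sdk/credential-providers";

const sqs = new SQSClient({
  region: "us-east-1",
  credentials: fromCognitoIdentityPool({
    clientConfig: { region: "us-east-1" },
    identityPoolId: "us-east-1:00000000-0000-0000-0000-000000000000",
  }),
});

const QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue";

export async function pollOnce(): Promise<string[]> {
  // Long-poll for up to 20 seconds to avoid hammering the API.
  const { Messages = [] } = await sqs.send(
    new ReceiveMessageCommand({ QueueUrl: QUEUE_URL, WaitTimeSeconds: 20 })
  );
  // Delete each message after handling it so it isn't redelivered.
  for (const m of Messages) {
    await sqs.send(
      new DeleteMessageCommand({
        QueueUrl: QUEUE_URL,
        ReceiptHandle: m.ReceiptHandle!,
      })
    );
  }
  return Messages.map((m) => m.Body ?? "");
}
```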
I am trying to use Google Cloud's Pub/Sub feature to store incoming data from an IoT device. I have an event callback which should be sending a JSON string to the Pub/Sub topic from the IoT device's back-end. The callback looks like this (where {project}, {topic} and {YOUR_API_KEY} are filled in as required):
POST https://pubsub.googleapis.com/v1/projects/{project}/topics/{topic}:publish?key={YOUR_API_KEY}
{"messages":[{"data":"test"}]}
I am invariably getting error 403 with this setup. I have tried various slight variations on this and found other errors. I am very new to this topic; is there an obvious mistake I am making?
API Keys are not sufficient for authenticating to the Google Cloud Pub/Sub API. Only a subset of GCP services allow access using only an API key, as detailed in the Using API Keys documentation. You will need to use a service account to authenticate for using Cloud Pub/Sub. You might also want to consider Google Cloud IoT, which sends telemetry into Cloud Pub/Sub.
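For illustration, here is a minimal sketch of publishing with a service account via the official Node.js client instead of the raw REST call; the project ID, key file path, and topic name are placeholders:

```typescript
// Sketch: publishing with a service account instead of an API key,
// using the official @google-cloud/pubsub client.
import { PubSub } from "@google-cloud/pubsub";

const pubsub = new PubSub({
  projectId: "my-project",
  keyFilename: "./service-account.json", // key downloaded from the GCP console
});

export async function publishTest(): Promise<string> {
  // The client base64-encodes the payload for you; no `key=` parameter needed.
  return pubsub.topic("my-topic").publishMessage({ data: Buffer.from("test") });
}
```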
We have an IoT project here
Facts:
- We have our app running on Google App Engine, PHP runtime.
- The clients are Raspberry Pi or similar boards.
- We are using CloudMQTT (www.cloudmqtt.com) to generate a push event in our C client app, which then runs the sync process with the server.
Is there a Google Cloud replacement for what we are doing?
We tried Google Pub/Sub, but our C app needed to keep polling the service.
We would love to use Google Cloud Messaging, but we could not find any way to use it for push notifications to the client.
Basically, we need to send push messages to a Raspberry Pi. What would you recommend for that? (Remember, our server is on GAE.)
GCM handles the polling mechanism on its own and should let you push notifications as well as messages to the client. Try debugging your application using the documentation.
During I/O 2016, Google also launched Firebase Cloud Messaging (FCM), which is basically a newer version of GCM and is the recommended product to use.
But if you want to use your own deployment rather than a managed service, you can use Google Compute Engine instances to deploy EMQTTD, a highly scalable MQTT broker written in Erlang.
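For the FCM route, the server-side call is small. A hedged sketch using the firebase-admin SDK (the registration token is a placeholder, and getting a token onto a Raspberry Pi client is the harder part):

```typescript
// Sketch: pushing a data message to a device through FCM with the
// firebase-admin SDK, to trigger the client's sync process.
import * as admin from "firebase-admin";

// Uses GOOGLE_APPLICATION_CREDENTIALS (a service account key) by default.
admin.initializeApp();

export async function pushToDevice(registrationToken: string) {
  return admin.messaging().send({
    token: registrationToken, // placeholder: the device's FCM token
    data: { action: "sync" }, // tells the client to start syncing
  });
}
```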
I have a Cloud App hosted on Windows Azure and I need to integrate XMPP with the service. Can I use GAE's XMPP API to achieve this? I'll need to be able to create new Jabber IDs and send & receive messages from other clients like GTalk.
As far as I know, GAE only supports Java and Python, and my expertise is limited to .NET and C#, so I'll have to make my Azure app communicate with the GAE app.
Finally, can I use GAE as an alternative to running ejabberd on Windows Azure Virtual Machines or Amazon EC2?
Thanks in advance... :)
You could, but it would be very limited. You may be better off running ejabberd somewhere else.
With GAE's XMPP API, your username selection is rather limited. See the GAE XMPP Overview API documentation.
Your Cloud App would need to send and receive messages from your GAE app via HTTP accesses. This is no big deal for sending, but you'll have to work out your own way of receiving messages. You could buffer your messages to the datastore and poll for messages. You could also use the Channel API to receive messages directly, but so far the Channel API client is only available in JavaScript, so your app would need some sort of JavaScript interpreter to use the client.
You will be able to send/receive messages from other XMPP addresses like GTalk clients.
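A rough sketch of the buffer-and-poll approach mentioned above, from the client side (the /messages endpoint and its response shape are hypothetical):

```typescript
// Sketch: client-side polling loop against a hypothetical GAE endpoint
// that returns messages buffered in the datastore since a given cursor.
async function pollMessages(since: number): Promise<void> {
  const res = await fetch(`/messages?since=${since}`);
  const { messages, latest } = await res.json(); // response shape is assumed
  for (const m of messages) {
    console.log("received:", m);
  }
  // Poll again after a short delay, resuming from the latest cursor.
  setTimeout(() => pollMessages(latest), 5000);
}

pollMessages(0);
```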
I have a web application running on Google App Engine and need to provide near real time updates to connected web clients. One way would be to use the Google App Engine Channels API, but I'm a bit uneasy about using a proprietary solution.
Are there any reliable hosted services that allow clients to connect using Socket.IO (with all its supported fallback protocols), and that a web server solution running on Google App Engine could publish notifications to? Are there any other alternatives that offer the same functionality?
Are you looking for something like beaconpush.com?
I have the same problem as you.
I've thought about using the Channel API as well, but the free quota is quite low (100 channels created per day, and each client is one channel).
Here's the solution I'm building:
- All of the server logic runs in the App Engine Python runtime.
- App Engine serves all the HTML and client code.
- I run a Node.js socket.io server on dotCloud (using their free tier).
- The Node.js server sets up an HTTP server that listens for GET requests on a few special URL endpoints (e.g. myapp-on.dotcloud.com/room/[room_id]); when one is called, it triggers a socket.io broadcast to the appropriate clients (see the sketch below).
- HTML clients generated on App Engine connect to myapp-on.dotcloud.com.
- All user input in the client is sent to App Engine via a normal AJAX POST/GET.
- When the App Engine server code needs to push something to the client, it makes a URL fetch to the appropriate URL (myapp-on.dotcloud.com/room/[room_id]), which triggers a message push via socket.io to the connected clients.
I've yet to implement this, but it sounds like a workable plan. The idea is to keep all the logic in App Engine and only use the socket.io server as a message pusher.
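A minimal sketch of that message pusher (Express + socket.io; I use a POST body here rather than the GET in the plan so the payload can be carried, and the room-join handshake is an assumption):

```typescript
// Sketch of the message pusher: an HTTP endpoint that App Engine can
// hit with a URL fetch, which broadcasts to the socket.io clients in
// the given room. The /room/:roomId route mirrors the URLs in the plan.
import express from "express";
import { createServer } from "http";
import { Server } from "socket.io";

const app = express();
app.use(express.json());
const httpServer = createServer(app);
const io = new Server(httpServer);

// Clients join a room when they connect.
io.on("connection", (socket) => {
  socket.on("join", (roomId: string) => socket.join(roomId));
});

// App Engine calls this URL to push a message to a room's clients.
app.post("/room/:roomId", (req, res) => {
  io.to(req.params.roomId).emit("message", req.body);
  res.sendStatus(204);
});

httpServer.listen(8080);
```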