I'm planning to run Apollo GraphQL on Google App Engine (GAE) so I don't have to worry about scaling (I'll be using Redis or some other pub/sub). However, the problem is that GAE doesn't support WebSockets, and I heavily use GraphQL subscriptions.
What Google recommends is to move the WebSocket server onto a separate VM (such as Google Compute Engine) and keep the rest inside GAE: https://cloud.google.com/solutions/real-time-gaming-with-node-js-websocket
Is it possible to do this with Apollo Server? I'm using Node.js with apollo-server-express.
It's a good pattern for scaling your infrastructure, and there is nothing about apollo-server-express/Apollo GraphQL that prevents it.
Use the same code base: one deployment handles queries and mutations over HTTP, and the other handles subscriptions over WebSocket. Just route each kind of traffic to the right target (GAE or Google Compute Engine).
Every HTTP query will be handled by GAE, and apollo-client will open its WebSocket subscriptions against Google Compute Engine. When an event is published on Redis (or whichever pub/sub you use), the Apollo server will consume and resolve it only if there is a subscriber on the WebSocket.
So you don't need to connect the pub/sub/Redis instance on GAE.
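As a rough illustration, here is a minimal sketch of that split with apollo-server-express 2.x. It assumes the shared schema lives in a local `./schema` module and that an `ENABLE_WS` environment variable distinguishes the Compute Engine deployment from the GAE one (both names are placeholders, and the Redis pub/sub wiring is left out):

```
// Shared entry point for both deployments; only the GCE one enables WebSockets.
const express = require('express');
const http = require('http');
const { ApolloServer } = require('apollo-server-express');
const { typeDefs, resolvers } = require('./schema'); // shared schema (placeholder module)

const enableWs = process.env.ENABLE_WS === 'true';

const server = new ApolloServer({
  typeDefs,
  resolvers,
  // Apollo Server 2: `false` disables the subscription server entirely.
  subscriptions: enableWs ? { path: '/graphql' } : false,
});

const app = express();
server.applyMiddleware({ app });

const httpServer = http.createServer(app);
if (enableWs) {
  // GCE instance: attach the WebSocket handler for subscriptions.
  server.installSubscriptionHandlers(httpServer);
}

httpServer.listen(process.env.PORT || 8080);
```

The GAE deployment leaves `ENABLE_WS` unset and serves only HTTP queries and mutations; the Compute Engine deployment sets it, attaches the WebSocket handler, and is the only one that needs the Redis connection.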
I was considering using two separate projects for HTTP and WS, but then they wouldn't share the GraphQL schema.
I have four services running within the same app on App Engine. I have a frontend SvelteKit application, and three backend services. If possible, I'd like to set up security in such a way that the backend services will only accept HTTP requests from the frontend application (which sends all API requests via its Node server).
Is there a way of doing this without spending a load of money on a Serverless VPC Access connector?
Ideally I want to keep these all within the same GCP project as well. So far the only solution I can come up with is to ship the services with a secret that they check against when receiving a request, but there must be a better way to do it.
Take a look at Identity-Aware Proxy (IAP).
Pay attention to the part of the above documentation that says
In order to make a resource publicly-accessible (while sibling resources are restricted), grant the IAP-secured Web App User role to allUsers or allAuthenticatedUsers.
Per your use case, your front-end application will be available to the public, while your three backend services will only be available to the front-end application.
Since your backend services are now secured (via IAP), you have to programmatically invoke them in your front end. See documentation on how to do that.
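As a sketch of what that programmatic invocation can look like from the SvelteKit Node server: this assumes `google-auth-library` is installed and that the service account the frontend runs as has the IAP-secured Web App User role; the client ID and backend URL below are placeholders, not values from the question.

```
// Call an IAP-secured App Engine service from the frontend's Node server.
const { GoogleAuth } = require('google-auth-library');

const IAP_CLIENT_ID = 'YOUR_IAP_OAUTH_CLIENT_ID.apps.googleusercontent.com'; // placeholder
const BACKEND_URL = 'https://backend-dot-your-project.appspot.com/api/items'; // placeholder

async function callBackend() {
  const auth = new GoogleAuth();
  // getIdTokenClient mints an OIDC ID token for the given audience (the IAP
  // OAuth client ID) and attaches it as a Bearer token on each request.
  const client = await auth.getIdTokenClient(IAP_CLIENT_ID);
  const res = await client.request({ url: BACKEND_URL });
  return res.data;
}

callBackend().then(console.log).catch(console.error);
```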
I am doing a project to create login and account-creation pages, and also to store various kinds of data that need to be kept securely in a database.
I am using React Native and Amazon Web Services. I am confused about how to use AWS RDS (Microsoft SQL Server) and how to connect it to the front end. Do I use AWS Amplify?
I saw a post that mentioned
Amplify is at the moment tied to dynamoDB in a very strong way. But you can use graphQL queries sent to AppSync (the backend layer of amplify) to trigger lambda functions. From there you can target any type of database you want
Is there a better or shorter process for connecting the front end with the SQL database? Please give me some tips, as it is my first time working with AWS and React Native.
There are many ways to accomplish this with AWS. You essentially need a backend web server component in your architecture. The classical solution would be to set up an EC2 instance and have that server handle REST or GraphQL requests that query your RDS instance.
A more modern "serverless" option would be to route your front-end traffic to an API Gateway that invokes a Lambda function. That Lambda function can use IAM permissions to query your database. There are many resources for this pattern; here's an AWS blog post about it.
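As a rough sketch of that pattern, here is what the Lambda side could look like in Node.js using the `mssql` package against an RDS SQL Server instance. The table name, environment variables, and query are placeholders; in a real setup the credentials would come from something like Secrets Manager and the Lambda would live in the same VPC as the RDS instance.

```
// Lambda handler behind API Gateway that queries RDS SQL Server via `mssql`.
const sql = require('mssql');

const config = {
  server: process.env.DB_HOST,       // e.g. mydb.xxxxxx.us-east-1.rds.amazonaws.com (placeholder)
  database: process.env.DB_NAME,
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  options: { encrypt: true },
};

exports.handler = async () => {
  const pool = await sql.connect(config);
  try {
    // Placeholder query against a hypothetical Users table.
    const result = await pool.request().query('SELECT TOP 10 * FROM Users');
    return { statusCode: 200, body: JSON.stringify(result.recordset) };
  } finally {
    await pool.close();
  }
};
```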
I'm not familiar with the specifics of Amplify, but if you can be flexible about your datastore, it seems easiest just to use one that Amplify plays nicely with. I'd highly recommend trying to get Amplify working if you are not familiar with backend web software and don't have the time or inclination to dig into its complexities.
Google Cloud Datastore client libraries are only available for some languages. How can I do operations like creating, updating, and deleting an entity with plain HTTP requests, without using the client libraries?
The Datastore API is built on HTTP and JSON, so any standard HTTP client can send requests to it and parse the responses. You can find more about building a flexible runtime here.
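As an illustration, here is a sketch of an upsert against the Datastore v1 REST endpoint using nothing but HTTP. It assumes Node 18+ for the built-in fetch, and the project ID, access token, kind, and property values are placeholders; you still need an OAuth2 access token (e.g. from the metadata server or `gcloud auth print-access-token`).

```
// Direct call to the Datastore v1 REST API: projects.commit with an upsert
// mutation. insert, update, and delete mutations follow the same shape.
const PROJECT_ID = 'your-project-id';            // placeholder
const ACCESS_TOKEN = process.env.ACCESS_TOKEN;   // placeholder

async function upsertTask() {
  const res = await fetch(
    `https://datastore.googleapis.com/v1/projects/${PROJECT_ID}:commit`,
    {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${ACCESS_TOKEN}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        mode: 'NON_TRANSACTIONAL',
        mutations: [
          {
            upsert: {
              key: { path: [{ kind: 'Task', name: 'task-1' }] },
              properties: { description: { stringValue: 'Buy milk' } },
            },
          },
        ],
      }),
    }
  );
  return res.json();
}

upsertTask().then(console.log).catch(console.error);
```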
One of the classic ways to do this is to expose the Datastore CRUD operations through a set of REST APIs. Google offers Cloud Endpoints, a set of "tools, libraries and capabilities that allow you to generate APIs and client libraries from an App Engine application": https://cloud.google.com/appengine/docs/java/endpoints/
You can have a look at this tutorial https://rominirani.com/google-cloud-endpoints-tutorial-part-1-b571ad6c7cd2#.1j9holpdt
We need to migrate from one App Engine project to another (due to the constraints on changing a project's region).
The ideal solution would just be to proxy all requests through to the new server; however, we are using Google Cloud Endpoints, whose requests are intercepted by the framework and delivered to our code as POST requests.
We can't redirect as we have mobile apps relying on the API.
Does anyone have a solution for proxying to the new server, other than proxying every API method we have individually?
I would write a ServletFilter on the old app that intercepts /_ah/spi/* and forwards it to the new app, also on /_ah/spi/*. Keep in mind that you'll have to keep the existing Endpoints code in place, or the proxy will delete your configuration and not forward anything.
I have a web application running on Google App Engine and need to provide near-real-time updates to connected web clients. One way would be to use the Google App Engine Channel API, but I'm a bit uneasy about using a proprietary solution.
Are there any reliable hosted services allowing for clients to connect using Socket.IO (with all its supported fallback protocols), and a web server solution running on Google App Engine to publish notifications to it? Any other alternatives that offers the same functionality?
Are you looking for something like beaconpush.com?
I have the same problem as you.
I've thought about using the Channel API as well; however, the free quota is quite low (100 channels created per day, and each client is one channel).
Here's the solution I'm building:
All of the server logic runs in the App Engine Python runtime
App Engine serves all the HTML and client code
I run a Node.js socket.io server on dotCloud (using their free tier)
the Node.js server sets up an HTTP server that listens for GET requests on a few special URL endpoints (e.g. myapp-on.dotcloud.com/room/[room_id]); when one of them is called, it triggers the socket.io broadcast to the appropriate clients (sketched in the code after this list)
HTML clients generated on App Engine connect to myapp-on.dotcloud.com
All user input in the client is sent to App Engine via a normal AJAX POST/GET
when the App Engine server code needs to push something to the client, it makes a URL Fetch to the appropriate URL (myapp-on.dotcloud.com/room/[room_id]), which triggers a message push via socket.io to the connected clients
I've yet to implement this, but it sounds like a workable plan
The idea is to keep all the logic in App Engine and only use the socket.io server as a message pusher
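For what it's worth, here is a minimal sketch of that pusher side, written against the current Express/socket.io APIs rather than whatever was available on dotCloud at the time; the /room/:roomId endpoint, the 'join'/'update' event names, and the message query parameter are all illustrative placeholders.

```
// Standalone push server: App Engine hits /room/:roomId over HTTP, and the
// server broadcasts to the socket.io clients that joined that room.
const express = require('express');
const http = require('http');
const { Server } = require('socket.io');

const app = express();
const httpServer = http.createServer(app);
const io = new Server(httpServer);

// Browser clients join the room they care about right after connecting.
io.on('connection', (socket) => {
  socket.on('join', (roomId) => socket.join(roomId));
});

// App Engine calls this URL (via URL Fetch) when it has something to push.
app.get('/room/:roomId', (req, res) => {
  io.to(req.params.roomId).emit('update', { message: req.query.message });
  res.sendStatus(200);
});

httpServer.listen(process.env.PORT || 8080);
```

App Engine would then hit that endpoint with a simple URL Fetch (e.g. GET myapp-on.dotcloud.com/room/123?message=hello) whenever it has something to broadcast.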