On Google App Engine, when a project has more than one service and those services need to communicate with one another, is there any way to send a message from one service to another (e.g. to invoke a function) other than the URL Fetch API?
One option is to use Task Queues.
The queue definitions are an app-level configuration applicable to all services/modules. So tasks can be enqueued into any queue by any service/module and each queue can be targeted to (serviced by) a specific service/module.
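For example, a queue can be pinned to a specific service with the target setting in queue.yaml (the queue and service names below are just placeholders):

```yaml
# queue.yaml -- app-level configuration shared by all services/modules
queue:
- name: billing-tasks
  rate: 5/s
  target: billing   # tasks in this queue are dispatched to the "billing" service
```

Any service can then enqueue into billing-tasks, and App Engine routes the task's HTTP request to the billing service.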
Related
My project has a microservice architecture running on Google Cloud. I'm thinking about moving from a container running RabbitMQ to the Pub/Sub engine.
The question is: is it possible to receive messages one by one? My code is written in Go, and the docs say
The callback is invoked concurrently by multiple goroutines,
maximizing throughput.
But how many goroutines can be invoked? How can I set the maximum allowed? For example, one of my workers talks to a third-party API that allows only one connection per IP, so I can have only one task at a time for this worker.
The correct solution is App Engine pull queues:
https://cloud.google.com/appengine/docs/standard/go/taskqueue/overview-pull
I'm trying to implement iOS push notifications for a message board app I've written (notifications for new messages and so on) but have no real idea where to start.
A lot of the current documentation seems to be out of date regarding keeping persistent TLS connections to APNs open from App Engine, and links to articles about deprecated backends. I'm using the Go runtime and just keep getting stuck. For instance, creating the socket connection to APNs requires a Context, which can only be obtained from an HTTP request; architecturally this doesn't make a lot of sense, because ideally the socket remains open regardless.
Is there any clearer guides around that I'm missing or right now is it a better idea to set up a separate VPS or compute instance to handle it?
I'm not that familiar with Go but if you cannot figure out how to connect to APNS in Go then I would recommend creating a separate Java Module that would be responsible for sending push notifications to APNS and a Task Queue to send 'hey-send-this-push-notification' messages (tasks) from Go to this Java Module. You could enqueue tasks from Go and process them in your Java Module.
There is an open-source, Java APNS library that you can use to send push notifications. It was specifically designed to work (and be used) on Google App Engine.
Backends are deprecated; use Modules:
https://developers.google.com/appengine/docs/java/modules/
https://developers.google.com/appengine/docs/go/modules/
Regarding enqueuing tasks:
https://developers.google.com/appengine/docs/java/taskqueue/
https://developers.google.com/appengine/docs/go/taskqueue/
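To enqueue a "hey-send-this-push-notification" task from Go, the main thing the two runtimes need is a payload format they both agree on. A hedged sketch of building such a JSON task body (the struct and field names are made up; on App Engine, the resulting string would become the body of a task passed to taskqueue.Add):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// PushTask is a hypothetical payload for a "send this push notification"
// task, enqueued by the Go module and processed by the Java module.
type PushTask struct {
	DeviceToken string `json:"deviceToken"`
	Alert       string `json:"alert"`
	Badge       int    `json:"badge"`
}

// encodeTask marshals the payload that would become the task body.
func encodeTask(t PushTask) (string, error) {
	b, err := json.Marshal(t)
	return string(b), err
}

func main() {
	body, _ := encodeTask(PushTask{DeviceToken: "abc123", Alert: "New message", Badge: 1})
	fmt.Println(body)
	// prints {"deviceToken":"abc123","alert":"New message","badge":1}
}
```

The Java module would lease or receive the task, decode the same JSON, and hold the persistent APNs connection.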
I want to deploy some modules on App Engine. For now I let them "talk" to each other through a simple REST API.
I wonder if there is any kind of "local address" to use instead of the public *.appspot.com domain?
If nothing is available, what is the fastest protocol/method for communication between two modules, not counting sharing the same database and memcache?
The only way to communicate between modules is via HTTP requests, synchronously via the URL Fetch API or asynchronously via the Push Queue API, and this can only be done via *.appspot.com URLs. But these always resolve to local IP addresses, so inter-module communication always goes through the internal App Engine network.
Also, the official docs about module communication use the Modules API, which resolves module addresses to *.appspot.com addresses, so this is the official Google way of addressing modules.
You can share data between modules via datastore/memcache, but I don't consider this communication, as it does not actively notify the receiving party about the data.
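As a concrete illustration of the addressing, each module is reachable at a "module-dot-app.appspot.com" hostname. A small sketch of building those hostnames (the helper name is hypothetical; on App Engine itself you would normally ask the Modules API for a module's hostname rather than build it by hand):

```go
package main

import "fmt"

// moduleHost builds the public hostname for a module using the
// "-dot-" convention. Requests to it from another module of the
// same app still route over Google's internal network.
func moduleHost(module, appID string) string {
	if module == "" || module == "default" {
		return fmt.Sprintf("%s.appspot.com", appID)
	}
	return fmt.Sprintf("%s-dot-%s.appspot.com", module, appID)
}

func main() {
	fmt.Println(moduleHost("api", "my-app"))     // api-dot-my-app.appspot.com
	fmt.Println(moduleHost("default", "my-app")) // my-app.appspot.com
}
```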
I'm looking to move over a messaging system that we have over to the google app engine environment but I have a few questions that I'm hoping someone can help me with.
Our current messaging environment uses RabbitMQ to process messages and then uses about 10 consumers that connect to the queue to send the messages. This works well for us, as having 10 consumer instances to process the messages dramatically increases delivery rates.
I understand that App Engine doesn't support RabbitMQ, so I was wondering what would be the best alternative to achieve the same result. I see that you can run tasks in the background, which is great, but this would only act as one instance, which would slow down the delivery rates.
Are there any other options?
I've never used RabbitMQ, but your requirements look like a good fit for Task Queues and the Pipeline API on App Engine.
Task Queues provide the ability to set up consumers and configure their processing rate.
https://developers.google.com/appengine/docs/python/taskqueue/
With the Task Queue API, applications can perform work outside of a user request, initiated by a user request. If an app needs to execute some background work, it can use the Task Queue API to organize that work into small, discrete units, called tasks. The app adds tasks to task queues to be executed later.
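To mirror the "10 consumers" setup, the queue's processing rate can be configured in queue.yaml; a sketch with a made-up queue name:

```yaml
queue:
- name: message-delivery
  rate: 10/s                  # up to 10 task dispatches per second
  bucket_size: 10
  max_concurrent_requests: 10 # roughly equivalent to 10 parallel consumers
```

App Engine then scales instances to handle the dispatched tasks, so you get parallel delivery without managing consumer processes yourself.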
The Pipeline API is built on Task Queues and provides more features for controlling the flow.
https://code.google.com/p/appengine-pipeline/
The Google App Engine Pipeline API connects together complex, time-consuming workflows (including human tasks). The goals are flexibility, workflow reuse, and testability. A primary use-case of the API is connecting together various App Engine MapReduces into a computational pipeline.
I am currently using google app engine as my mobile application back end. I have a few tasks that can not be performed in the gae environment (mainly image recognition using opencv). My intention is to retain gae and use AWS to perform these specific tasks.
Is there a simple way to pass specific tasks from gae to AWS? E.g. A task queue?
You could either push tasks from GAE towards AWS, or have your AWS instances pull tasks from GAE.
If you push tasks from GAE towards AWS, you could use URLFetch to push your data towards your AWS instances.
If you prefer to have your AWS instances pull tasks from GAE, you could have your GAE instances put their tasks in the GAE Pull Queue, and then have your AWS instances use the Task Queue REST API to lease tasks from the queue.
In either case, the AWS instance could report back the processing result through a simple POST request to your GAE servlets, or through inserting tasks via the abovementioned REST API which would later be leased by your GAE instances. The latter could be useful if you want to control the rate of which your GAE app process the results.
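For the pull direction, leasing boils down to an authenticated POST against the Task Queue REST API. A hedged sketch of building that lease request in Go (the project and queue names are placeholders, and OAuth2 credentials still need to be attached before sending):

```go
package main

import (
	"fmt"
	"net/http"
	"net/url"
)

// leaseRequest builds the HTTP request an AWS worker would send to
// lease tasks from a GAE pull queue via the Task Queue REST API
// (v1beta2). Authorization headers must be added before sending.
func leaseRequest(project, queue string, numTasks, leaseSecs int) (*http.Request, error) {
	u := fmt.Sprintf(
		"https://www.googleapis.com/taskqueue/v1beta2/projects/%s/taskqueues/%s/tasks/lease",
		url.PathEscape(project), url.PathEscape(queue))
	req, err := http.NewRequest("POST", u, nil)
	if err != nil {
		return nil, err
	}
	q := req.URL.Query()
	q.Set("numTasks", fmt.Sprint(numTasks))
	q.Set("leaseSecs", fmt.Sprint(leaseSecs))
	req.URL.RawQuery = q.Encode()
	return req, nil
}

func main() {
	req, _ := leaseRequest("my-project", "pull-queue", 10, 60)
	fmt.Println(req.Method, req.URL.String())
}
```

The worker would then delete each task after successful processing, so crashed workers simply let the lease expire and the task becomes available again.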
Disclaimer: I'm a lead developer on the AppScale project.
One way that you could go is with AppScale - it's an open source implementation of the App Engine APIs that runs over Amazon EC2 (as well as other clouds). Since it's open source, you could alter the AppServer that we ship with it to enable OpenCV to be used. This would require you to run your App Engine app in AWS, but you could get creative and have a copy of your app running with Google, and have it send Task Queue requests to the version of your app running in AWS only when you need to use the OpenCV libraries.
Have you considered using Amazon Simple Queue Service? http://aws.amazon.com/sqs/
You should be able to add items to the queue from GAE using a standard HTTP client.
Sure. AppEngine has a Task Queue, where you can put in your tasks by simply implementing DeferredTask. In that task you can make requests to AWS.
Your intention to retain the application in GAE and use AWS to perform a few tasks that cannot be performed in GAE seems to me like the right approach.
I'd like to share a few ideas along with some resources to answer the main part of your question:
Is there a simple way to pass specific tasks from gae to AWS? E.g. A task queue?
If you need GAE and AWS to perform the tasks all the time (24/7), then your application will depend on a batch schedule or a task queue, both of which GAE provides.
However, if you can arrange for GAE to prepare the task and for AWS to perform it on an interval basis (say, twice a day, at less than an hour each), you may not need them, as long as you can have GAE put the data on Google Cloud Storage (GCS) as a public object.
For this scenario, you need to set up an AWS EC2 instance on an on/off schedule and let the instance run a boot script, via cloud-init, that collects the data through your domain pointed at GCS (c.storage.googleapis.com), like so:
wget -q --read-timeout=0.0 --waitretry=5 --tries=400 \
  --background http://your.domain.com/yourfile?q=XXX...
Once it has the data from GCS, AWS can perform those specific tasks. Afterwards, let it trigger GAE to clean up the data and put the result back on GCS, ready to be used by your mobile application back end.
Following are some options to consider:
Note that not all EC2 types are suitable for an on/off schedule. I recommend using EC2-VPC/EBS if you want to set up an AWS EC2 instance this way.
You may not need to set up EC2 at all if AWS Lambda can perform the task. The cost is lower: a task running twice a day, typically for less than 3 seconds with memory consumption up to 128 MB, typically costs less than $0.0004 USD/month.
As an outcome of rearranging your application in GAE and having AWS perform some of the tasks, your billing rates might rise, so try to optimize the instance class in GAE.