The documentation I have read has only covered asynchronous pull - I'd like to verify that that is the only option for the Python API.
The Cloud Pub/Sub client library only supports asynchronous subscribing, which is the recommended way to run a subscriber. For specific use cases where a synchronous pull is needed, use the REST/HTTP pull method or the gRPC pull method, which requires generating the service code.
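As a rough illustration, here is a minimal sketch of a synchronous pull using the generated gRPC client in the Python library. The project and subscription names are placeholders, and the exact pull signature depends on your client library version:

```python
from google.cloud import pubsub_v1

# Placeholders - substitute your own project and subscription.
subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("my-project", "my-subscription")

# Synchronous (blocking) pull of up to 10 messages.
# Note: the exact signature varies by library version; newer releases
# take a request dict, e.g. subscriber.pull(request={...}).
response = subscriber.pull(subscription_path, max_messages=10)

ack_ids = []
for received in response.received_messages:
    print(received.message.data)
    ack_ids.append(received.ack_id)

# Acknowledge so the messages are not redelivered.
if ack_ids:
    subscriber.acknowledge(subscription_path, ack_ids)
```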
Is there any way that I can use Azure Logic Apps to call streaming APIs? The API doesn't support a chunking parameter, so the Range header is not an option.
I am looking for logic similar to the Python Requests library's Response.iter_content(), so that I can iterate over the response data in chunks and write them back to the DB.
My desired workflow: the Logic App gets triggered once a day at a specific time. It calls the streaming API using the HTTP connector and streams in all the data (the data is much larger than the HTTP connector's internal buffer size). The API supports streaming requests, and I am able to iterate through the response in chunks using Python's Response.iter_content(chunk_size=1048576) to fetch it all.
PS: I know I can use Azure function and have it triggered within my Logic App workflow, but I'd prefer to use Logic Apps native connectors.
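For reference, here is roughly the chunked-download pattern I currently use in Python (the URL, chunk size, and the write_chunk_to_db helper are placeholders):

```python
import requests

CHUNK_SIZE = 1048576  # 1 MiB per chunk

def write_chunk_to_db(chunk):
    # Placeholder: replace with the actual DB insert/append logic.
    pass

# stream=True keeps the full body from being loaded into memory at once.
with requests.get("https://example.com/streaming-api", stream=True) as response:
    response.raise_for_status()
    for chunk in response.iter_content(chunk_size=CHUNK_SIZE):
        if chunk:  # skip keep-alive chunks
            write_chunk_to_db(chunk)
```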
I have a problem with one concept about async microservices.
Assume that all my services are subscribed to some event bus, and I have an exposed API Gateway which takes HTTP requests and translates them into AMQP messages.
How do I handle GET requests to my API Gateway? Should I use RPC? For a single entity it's OK, but what about search or filtering (e.g. get games by genre from the Games service)?
I'm thinking about using RPC for getting single entities by ID and creating a separate Search service with Elastic that will expose some GET endpoints to the API Gateway. But maybe there's a simpler solution to my problem. Any ideas?
Btw, is it correct to translate HTTP requests from the API Gateway into AMQP messages?
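To make the idea concrete, this is roughly the kind of RPC-over-AMQP call I imagine the gateway making for a GET request, sketched with pika against RabbitMQ (the queue name and payload are made up):

```python
import json
import uuid

import pika

class GamesRpcClient:
    """Minimal RPC-over-AMQP client: the gateway publishes a request and
    blocks until the Games service replies on an exclusive callback queue."""

    def __init__(self):
        self.connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
        self.channel = self.connection.channel()
        result = self.channel.queue_declare(queue="", exclusive=True)
        self.callback_queue = result.method.queue
        self.channel.basic_consume(
            queue=self.callback_queue,
            on_message_callback=self._on_response,
            auto_ack=True,
        )
        self.response = None
        self.corr_id = None

    def _on_response(self, ch, method, props, body):
        if props.correlation_id == self.corr_id:
            self.response = body

    def call(self, payload):
        self.response = None
        self.corr_id = str(uuid.uuid4())
        self.channel.basic_publish(
            exchange="",
            routing_key="games.get_by_id",  # hypothetical queue owned by the Games service
            properties=pika.BasicProperties(
                reply_to=self.callback_queue,
                correlation_id=self.corr_id,
            ),
            body=json.dumps(payload),
        )
        while self.response is None:
            self.connection.process_data_events(time_limit=1)
        return json.loads(self.response)

# In the API Gateway: GET /games/42 -> RPC call -> HTTP response
# game = GamesRpcClient().call({"id": 42})
```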
Please take a look at Event Sourcing and CQRS. You can also take a look at my personal coding project:
I have to maintain a database on the Google Cloud Platform and, along with it, a script (preferably in Python) that automatically inserts new values from an API on a daily basis.
I'm confused as to how to go about this. Any suggestions?
You can take advantage of the App Engine platform, which allows you to deploy a Python application. It can be set to simply await instructions from your API or fetch the information directly. With the help of cron, you can schedule a task that takes care of pushing the objects into your database.
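As a rough sketch of that approach (the route, API URL, and Datastore kind are placeholders), the cron-triggered handler could look something like this; App Engine's cron would then be configured to hit the handler's URL once a day:

```python
import requests
from flask import Flask
from google.cloud import datastore

app = Flask(__name__)
client = datastore.Client()

@app.route("/tasks/daily-import")
def daily_import():
    # Placeholder URL: the external API you pull new values from.
    # Assumes the API returns a JSON list of records (dicts).
    payload = requests.get("https://api.example.com/latest-values").json()

    # Store each record as a Datastore entity (kind name is arbitrary here).
    for record in payload:
        entity = datastore.Entity(key=client.key("DailyValue"))
        entity.update(record)
        client.put(entity)

    return "imported %d records" % len(payload)
```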
Another option would be Cloud Functions. Currently, Cloud Functions only supports the Node.js runtime, but it allows you to run a backend application that only runs when triggered. With a simple HTTP trigger from your API, your function can handle the data received and organize it before storing it in your database.
Other options are available, like Cloud Endpoints, the database APIs (Spanner, Cloud SQL, Cloud PostgreSQL, Bigtable), etc. It all depends on the semantics of your project (will it run only once daily, how fast does the whole operation have to be completed, etc.). I would suggest reviewing all of the Google Cloud products in order to find the right solution for you.
I want to access an external Google API through a GAS trigger. Is it possible/advisable to use something like the JavaScript Google API client library, adapted for the GAS context, instead of manually using URL Fetch, as mentioned here?
PS. I am trying to hit the Google App Engine TaskQueue service via its REST API.
In the Apps Script Code Editor, under the Resources > Advanced Google Services menu, you can enable different APIs. I don't see an Advanced Service for anything that resembles a Task Queue. There is a Tasks API, but that's for Task Lists, which is very different from the Task Queue.
So, I don't think you have any choice but to use the REST API with UrlFetchApp.fetch() in server-side .gs code in Apps Script.
As far as the trigger is concerned, you might want to look at quota limits, if you're going to be running it a lot, or running code that takes a long time to run.
You can use external APIs with OAuth2 as outlined here: apps-script-oauth2
It's just not built-in, but you can easily add it as a library as mentioned in the Readme.
In http://blog.notdot.net/2010/09/Under-the-hood-with-App-Engine-APIs, it explains how you can perform an asynchronous datastore get request. I want to perform an asynchronous put request.
How do I do that?
Thanks!
As of GAE 1.5.0 there is 'put_async'.
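A minimal sketch with the ext.db API (the model and field are made up):

```python
from google.appengine.ext import db

class Greeting(db.Model):
    content = db.StringProperty()

entity = Greeting(content="hello")

# Kick off the put without blocking...
rpc = db.put_async(entity)

# ...do other work here...

# ...then block only when you actually need the result (the stored key).
key = rpc.get_result()
```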
Your best option for doing asynchronous calls to the datastore at the moment is to use Guido's experimental NDB project, which is a reworking of the App Engine datastore API to support asynchronicity.
My blog post was intended to be instructional, not a template for something to do directly - reaching down to that level of the code to do asynchronous requests is likely to be very involved and awkward, and you're much better off using a library that does it for you, like NDB.
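For completeness, a minimal sketch of the same idea with NDB (the model and field are made up):

```python
from google.appengine.ext import ndb

class Greeting(ndb.Model):
    content = ndb.StringProperty()

entity = Greeting(content="hello")

# put_async returns a Future immediately; the RPC runs in the background.
future = entity.put_async()

# ...other work can happen here...

key = future.get_result()  # blocks only if the put has not finished yet
```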