We have a project where the data must not leave Canada for any reason because of data residency requirements. Our project uses Google Cloud Vision OCR to scan documents. We have successfully created a project, and within Google Cloud we have restricted the region to the Montreal data center (northamerica-northeast1).
Wondering if there is a URL that will take us to that data center directly. Currently the URL we are accessing is vision.googleapis.com, which appears to be located in the US. So at some point our request crosses over to the US, even if our API key was generated within the Montreal data center.
GCP products are split into regional products and global products. The Cloud Vision API is currently listed as a global product, and accessing it regionally is currently not possible.
I am creating a homepage with a download center for PDFs.
Is it better to store them in the frontend or in the backend, or is this irrelevant?
Edit for the comment:
-> it will be a website planned for 3,000 users per month
-> not sure what PWA is, but the page has a responsive design for computers and smartphones
-> the PDFs should be accessible to everyone, like a manual or a sample contract
Given that the information needs to be accessible to different users, I would say that it has to be stored on a server and managed by the backend, which is equivalent to saying "it should be in the backend" in your own terms. This is the simple answer, but as far as I can see, two questions arise here:
How to store the data
Once it is established that the data resides on the backend part of the system, you have to choose between storing the PDFs in the file system and having the backend serve the files statically, or storing the PDFs as BLOBs in the database. Both have their advantages and drawbacks, more information here.
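For the file-system option, here is a minimal sketch of the backend part with Flask (the framework, route, and directory are assumptions on my side, not something prescribed by your question):

    from flask import Flask, send_from_directory

    app = Flask(__name__)

    # Directory where the PDFs live on the server; the path is a placeholder.
    PDF_DIR = "/var/www/downloads/pdfs"

    @app.route("/downloads/<path:filename>")
    def download_pdf(filename):
        # send_from_directory guards against path traversal ("../") and
        # returns 404 on its own if the file does not exist.
        return send_from_directory(PDF_DIR, filename, as_attachment=True)

In practice you would usually put a web server or CDN in front of this route, so the static files are not served by your application process at all.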
Should it be accessible offline
If the user needs to access some information while offline, then you have to store those PDFs on their device. Another reason to do that would be if the PDFs are very large and don't change often, but could be fetched by a user multiple times a day, and you don't want to keep the backend busy serving the same file every time.
I am trying to deploy a Flask application entirely in the free tier of GCP.
I have deployed it on App Engine Standard in the us-west2 zone, and am now getting charged for Cloud Storage. It turns out Cloud Storage only has a free tier in the us-east1, us-west1, and us-central1 zones.
I cannot seem to figure out how to migrate or redeploy my app in the us-west1 region. There is plenty of documentation around migrating zones, but none of it seems to apply to App Engine Standard. Does GCP allow migration of App Engine Standard apps, and if so, how can I do so?
Indeed, it is not possible to move the region of an app once it is set; the documentation states:
You cannot change an app's region after you set it. App Engine Locations.
But it also states:
Cloud Storage Location
When you create an app, App Engine creates a default bucket in Cloud Storage. Generally, the location of this bucket is the region matching the location of your App Engine app.
Regarding buckets, it seems it is possible to rename a bucket and move it to a different region, so you can try moving your bucket back to a free-tier region and see if that helps with your billing. Otherwise, as stated in the previous answer, you will have to recreate your app basically from scratch.
Moving and renaming buckets
When you create a bucket, you permanently define its name, its geographic location, and the project it is part of. However, you can effectively move or rename your bucket:
-- If there is no data in your old bucket, delete the bucket and create another bucket with a new name, in a new location, or in a new project.
-- If you have data in your old bucket, create a new bucket with the desired name, location, and/or project, copy data from the old bucket to the new bucket, and delete the old bucket and its contents. The steps below describe this process.
If you want your new bucket to have the same name as your old bucket, you must temporarily move your data to a bucket with a different name. This lets you delete the original bucket so that you can reuse the bucket name.
Moving data between locations incurs network usage costs. In addition, moving data between buckets may incur retrieval and early deletion fees, if the data being moved are Nearline Storage, Coldline Storage, or Archive Storage objects.
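As an illustration of that copy-then-delete flow, here is a minimal sketch with the google-cloud-storage Python client (bucket names and the target region are placeholders):

    from google.cloud import storage

    client = storage.Client()

    old = client.bucket("my-old-bucket")  # placeholder names
    new = client.create_bucket("my-new-bucket", location="us-west1")

    # Copy every object into the new bucket, then delete the originals.
    for blob in client.list_blobs(old):
        old.copy_blob(blob, new, blob.name)

    for blob in client.list_blobs(old):
        blob.delete()

    old.delete()  # a bucket must be empty before it can be deleted

If you need to keep the original bucket name, you would run this twice via a temporary bucket, as the quoted steps describe.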
Regards.
First of all, it's not a zone but a region in Google Cloud semantics. But anyway, I understood.
And you can't change the region of an App Engine app. You need to delete your project and create it again, or create a new project and set the correct region from the start. Don't forget to save your data if you delete your project.
App Engine is a 13+ year old product, and Google Cloud didn't plan for this kind of migration from the beginning. It's the weight of legacy!
I am implementing a dictionary website using App Engine and Cloud Storage. App Engine controls the backend, like user authentication etc., and Cloud Storage is used to store a JSON file for each dictionary entry.
I would like to rate limit how much a user can download in a given time period so they can't bulk-download the JSON files and run up a big charge for me. Ideally, the dictionary would display a captcha if a user downloads too much at once, and allow them to keep downloading if they pass the captcha. What is the best way to achieve this?
Is there a specific service for rate limiting based on IP address or authenticated user? Should I do this through App Engine and only access Cloud Storage through App Engine (perhaps slower since it's using some of my dynamic resources to serve static content)? Or is it possible to have the frontend access Cloud Storage and implement the rate limiting on Cloud Storage directly? Is a Cloud Storage bucket the right service for storage here? And how can I allow search engine indexing bots to bypass the rate limiting?
As explained by Doug Stevenson in this post
"There is no configuration for limiting the volume of downloads for
files stored in Cloud Storage."
and explaining further:
"If you want to limit what end users can do, you will need to route
them through some middleware component that you build that tracks how
they're using your provided API to download files, and restrict what
they can do based on their prior behavior. This is obviously
nontrivial to implement, but it's possible."
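As a rough illustration of such a middleware component, here is a Flask sketch that counts downloads per client IP in memory and cuts off after a threshold. All names and limits are assumptions, and on App Engine the process-local dict would have to be replaced with shared state (e.g. Memorystore or Firestore), since instances don't share memory:

    import time
    from collections import defaultdict, deque

    from flask import Flask, abort, request
    from google.cloud import storage

    app = Flask(__name__)
    client = storage.Client()

    WINDOW_SECONDS = 3600         # look-back window (assumption)
    MAX_DOWNLOADS = 100           # allowed downloads per window (assumption)
    history = defaultdict(deque)  # per-process only; real apps need shared state

    @app.route("/entry/<word>")
    def get_entry(word):
        ip = request.remote_addr or "unknown"
        now = time.time()
        hits = history[ip]
        while hits and now - hits[0] > WINDOW_SECONDS:
            hits.popleft()        # forget hits outside the window
        if len(hits) >= MAX_DOWNLOADS:
            abort(429)            # here you could serve a captcha instead
        hits.append(now)
        # Fetch the entry from Cloud Storage only after the check passes.
        blob = client.bucket("my-dictionary-bucket").blob(f"{word}.json")
        return blob.download_as_text(), 200, {"Content-Type": "application/json"}

The captcha flow you describe would hang off the 429 branch: serve a challenge there, and clear the client's history once it is solved.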
I am using Google Cloud Platform to host my services. I have the following services running to support my application:
App Engine
Cloud SQL
Cloud Storage
I need help in understanding which region to use to reduce latency and pricing. The application's users will be based in India. Based on the availability of the above services, below is the best I can do:
App Engine - asia-northeast1
Cloud SQL - asia-east1/asia-northeast1
Cloud Storage - asia-southeast1/asia-east1/asia-northeast1
The region "asia-southeast1" works best for me in terms of distance. But having all the three different regions is not adviced by google.
Is hosting all the three in asia-northeast1 the best option for me?
I'd advise against hosting your application in a different region/data center than your database, as that introduces extra latency for every single database operation.
And as each request in a complex application will most likely involve more database operations than round trips to the user, it makes sense to put the app and DB in the same region, even if it means placing them slightly further away from users.
As far as Cloud Storage is concerned, if you set your storage buckets to Multi-Region availability (which they have by default), the initial region you placed them in makes no difference, as Google will serve each user from the region closest to them.
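If you want to set that explicitly when creating a bucket, here is a minimal sketch with the google-cloud-storage Python client ("ASIA" is the Asia multi-region location; the bucket name is a placeholder):

    from google.cloud import storage

    client = storage.Client()
    # Create the bucket in the ASIA multi-region rather than a single region.
    bucket = client.create_bucket("my-app-assets-bucket", location="ASIA")
    print(bucket.location)  # ASIA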
I used the Google Checkout API to get subscription reports every half hour.
https://checkout.google.com/api/checkout/v2/reports/Merchant/
(https://developers.google.com/checkout/developer/Google_Checkout_XML_API_Order_Report_API)
Our company has its own support team which provides help for users of the application. My script downloaded the latest purchases every half hour and stored them in a DB (time, serial number, email). After that, support could access this information through our web interface (thus, we did not have to give access (login and password) to the main Checkout account).
Now Google Checkout has been shut down, and I don't understand how to do this with Google Wallet. Does anyone know whether it can provide this functionality or not?
AFAIK, there currently is no similar "infrastructure" (reporting, querying, etc., via an API as in Google Checkout) for Google Wallet for Digital Goods, which, based on your comments, sounds like the product you are looking for (to replace your current Google Checkout implementation) - it's specifically for digital goods and does support subscriptions.
You'd manage orders via Merchant Center.
You will get order data via postbacks, so as for ideas: you would need to store order data on your end when you handle the postback(s) from Google - for both "placed" and "cancelled" orders.
This would replace how you currently do it via Order Report API in Google Checkout...
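As a rough sketch of that idea, here is a handler that records each postback in the same (time, serial number, email) shape your old script used. The endpoint path, payload fields, and schema are all assumptions on my part, not the actual Wallet postback format - the real postback carries a signed JWT that you must verify with your seller secret before trusting it:

    import sqlite3
    from flask import Flask, request

    app = Flask(__name__)

    def store_order(state, serial, email, when):
        # Same shape as the old Checkout table: (time, serialnumber, email),
        # plus the order state ("placed" or "cancelled").
        con = sqlite3.connect("orders.db")
        con.execute("CREATE TABLE IF NOT EXISTS orders "
                    "(time TEXT, serial TEXT, email TEXT, state TEXT)")
        con.execute("INSERT INTO orders VALUES (?, ?, ?, ?)",
                    (when, serial, email, state))
        con.commit()
        con.close()

    @app.route("/wallet/postback", methods=["POST"])
    def wallet_postback():
        # Placeholder parsing: decode and verify the JWT here in a real handler.
        data = request.get_json(force=True)
        store_order(data["state"], data["serial"], data["email"], data["time"])
        return "OK"

Your support team's web interface could then keep reading from that table exactly as it did before.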
Hth...