How do you set fixed NS values on Google Cloud DNS? - google-app-engine

I am trying to manage my domain through Cloud DNS on Google Cloud Platform. When I create a new zone for my domain, I get something like this:
Then I proceed to Namecheap to enter these values as my custom DNS servers:
I've done this a few times and noticed that Google Cloud actually rotates between
ns-cloud-aX.googledomains.com,
ns-cloud-bX.googledomains.com,
ns-cloud-cX.googledomains.com
and ns-cloud-dX.googledomains.com
whenever you create a zone for your domain (where X is a number from 1 to 4).
I would like to create a managed zone whose NS records are always ns-cloud-aX.googledomains.com, because I am switching to a new registrar that requires me to manually set these values per domain.
I tried changing the NS values in Google Cloud but I keep seeing this error:
Is it possible to manually set my NS records to always use ns-cloud-aX.googledomains.com?

I would suggest not changing it, as Google Cloud automatically assigns the NS records for your zone; overriding them might cause errors.
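If you want to confirm which name-server set a newly created zone was assigned before updating your registrar, here is a minimal sketch using the Cloud DNS Go client; the project and zone IDs are placeholders:

package main

import (
    "context"
    "fmt"
    "log"

    dns "google.golang.org/api/dns/v1"
)

func main() {
    ctx := context.Background()
    svc, err := dns.NewService(ctx) // uses Application Default Credentials
    if err != nil {
        log.Fatal(err)
    }
    // "my-project" and "my-zone" are placeholders for your own IDs.
    zone, err := svc.ManagedZones.Get("my-project", "my-zone").Do()
    if err != nil {
        log.Fatal(err)
    }
    for _, ns := range zone.NameServers {
        fmt.Println(ns) // e.g. ns-cloud-b1.googledomains.com.
    }
}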

The first step is to add Cloud DNS:
Click CREATE ZONE.
Enter the zone name you recorded.
Then you have to create a load balancer.

Related

Datastore Issue in Google App Engine making the application not allowing the inputs to work

An App Engine application is deployed to the server and connected to the database. All the data is loaded into the list and displays properly, but if we enter input in an input box or any other form field, it is not reflected in the application. The error I got from the App Engine console was this:
com.google.apphosting.api.ApiProxy$RequestTooLargeException: The
request to API call datastore_v3.Put() was too large
On the App Engine server I cleared the Datastore entries, but it didn't help. I disabled write access to the Datastore; for a few minutes it worked and the inputs were reflected, but after some time the URL stopped working. When I re-enabled write access, the application was accessible again.
This error message is caused by the Datastore default limits, most likely the size limit for a Datastore entity, since the exception is thrown by datastore_v3.Put():
Maximum size for an entity: 1,048,572 bytes (1 MiB - 4 bytes)
However, also keep in mind the limits on other Datastore metrics in the linked documentation.
I would recommend double-checking that the size of the entities you are trying to insert into Datastore falls under this value, as well as under the rest of the limits.
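The question's app is Java, but the same pre-check applies in any runtime. A minimal sketch in Go with the Cloud Datastore client, guarding a single large payload property before it reaches datastore_v3.Put(); the project ID, kind name, and headroom value are illustrative assumptions:

package main

import (
    "context"
    "fmt"
    "log"

    "cloud.google.com/go/datastore"
)

const maxEntityBytes = 1048572 // 1 MiB - 4 bytes, the documented entity limit

type Record struct {
    Payload []byte `datastore:",noindex"`
}

func putIfSmallEnough(ctx context.Context, c *datastore.Client, r *Record) error {
    // Rough pre-check: property names and entity metadata add overhead
    // on top of the payload, so leave some headroom below the hard limit.
    if len(r.Payload) > maxEntityBytes-4096 {
        return fmt.Errorf("payload of %d bytes is too close to the 1 MiB entity limit; split it or store it in GCS", len(r.Payload))
    }
    _, err := c.Put(ctx, datastore.IncompleteKey("Record", nil), r)
    return err
}

func main() {
    ctx := context.Background()
    c, err := datastore.NewClient(ctx, "my-project") // "my-project" is a placeholder
    if err != nil {
        log.Fatal(err)
    }
    if err := putIfSmallEnough(ctx, c, &Record{Payload: make([]byte, 2<<20)}); err != nil {
        log.Println(err) // too large: the pre-check fires instead of the API error
    }
}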

Where can I check my application API (request/response time) logs in a Google Cloud project

I have a Google Cloud project and I want to see the logs of all API hits, with request and response parameters, in GCP. In AWS we have the S3 browser to get to the logs folder. What is the equivalent in GCP?
In GCP, logs are not stored on a filesystem and there is no logs folder, so "equivalent" is a bit relative.
Most (if not all) GCP products funnel their logs through Stackdriver Logging, which offers a somewhat consistent interface for viewing and/or further processing/exporting them (see Basic Concepts).
The structure and content of a particular log entry depend on the log type and the particular GCP product (and/or flavour) that produced it. For App Engine, for example, the environment being used (1st generation standard, 2nd generation standard, or flexible) matters for the log entry content.
At least for the 1st generation standard environment (which I use) the request response times (and all other parameters logged/available for all requests and their corresponding replies) are captured in the request logs:
Wallclock time (field 11, always logged): total clock time in milliseconds spent by App Engine on the request. This time duration does not include time spent between the client and the server running the instance of your application. Example: ms=195.
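If you'd rather pull those request logs programmatically than read them in the console, here is a minimal sketch using the Stackdriver Logging admin Go client; the project ID and the entry count are placeholders:

package main

import (
    "context"
    "fmt"
    "log"

    "cloud.google.com/go/logging/logadmin"
    "google.golang.org/api/iterator"
)

func main() {
    ctx := context.Background()
    client, err := logadmin.NewClient(ctx, "my-project") // placeholder project ID
    if err != nil {
        log.Fatal(err)
    }
    defer client.Close()

    // Pull the most recent App Engine entries; the filter syntax is the
    // same one the Logs Viewer uses.
    it := client.Entries(ctx,
        logadmin.Filter(`resource.type="gae_app"`),
        logadmin.NewestFirst(),
    )
    for i := 0; i < 10; i++ {
        entry, err := it.Next()
        if err == iterator.Done {
            break
        }
        if err != nil {
            log.Fatal(err)
        }
        // For request logs, the per-request details (including the
        // wallclock time) live in the entry payload.
        fmt.Printf("%v %+v\n", entry.Timestamp, entry.Payload)
    }
}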

Verifying a domain for Google App Engine

We're having trouble verifying our domain for our Google App Engine application.
We have a domain registered with Hostek, where our name servers are currently:
ns-cloud-c1.googledomains.com
ns-cloud-c2.googledomains.com
ns-cloud-c3.googledomains.com
ns-cloud-c4.googledomains.com
I created a DNS zone in Google Cloud and added the TXT record with the value given to me by Google App Engine, but when I click "Verify" on the Google side, I get the error
Verification failed for cbcdashboard.com using the DNS TXT record method (less than a minute ago). We couldn't find the verification token in your domain's TXT records. You might need to wait a few minutes before Google sees your changes to the TXT records.
Below that, if I click "Show found DNS TXT records", I see
google-site-verification=<the token I was given>
It seems to see the value but not to consider it correct. I tried adding it with and without quotes. Any ideas?
I just did a dig and got this:
"google-site-verification=PDmOnhweMP0C1aXpkNh-4kG-Mlhg3o22viWjGm_gn3U"
So it seems like a propagation issue: DNS changes take a while to spread across resolvers. If you try to verify again now, does it work?
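For reference, the same check as the dig above can be scripted; a minimal sketch in Go using the standard library resolver:

package main

import (
    "fmt"
    "log"
    "net"
)

func main() {
    // Look up the TXT records the resolver currently sees for the domain;
    // the verification token should appear among them once propagation
    // has finished.
    records, err := net.LookupTXT("cbcdashboard.com")
    if err != nil {
        log.Fatal(err)
    }
    for _, txt := range records {
        fmt.Println(txt) // e.g. google-site-verification=<token>
    }
}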

How do I set a cost limit in Google Developers Console

Some functions in the Google Developers Console, like the Analytics API, are free until you reach a quota. Other functions, like Google Cloud Storage, create costs from the first click.
When I upload a file under https://console.developers.google.com/ > Storage > Cloud Storage > Storage Browser and I make this file publicly available, I pay about $0.12 per GB traffic.
But theoretically the traffic to this link could explode, e.g. because of sudden popularity. Therefore I would like to set something like a daily or monthly cost limit.
Q: How do I protect myself from overly high costs in the Google Developers Console?
You cannot. I asked Google about this; here's their response, from May 7, 2016:
(GCE = Google Compute Engine: no spending limits.
GAE = Google App Engine: yes, it has spending limits.)
... you are eligible for support on ... only ...
... [various helpful links] ...
That being said, at the moment there is no feature that allows you to
configure a limited budget on GCE. This feature is certainly available
for GAE [1]. As you mentioned in your comments, you can either totally
shut down your VMs (depending on your use case) or set the VMs to
send you alerts if they reach a certain traffic limit [2].
Sincerely,
Someone's first name
Technical Solutions Representative
Google Cloud Platform
[1] https://cloud.google.com/appengine/docs/quotas
[2] https://cloud.google.com/monitoring/support/notification-options
#wmdry, you wrote: "traffic to this link could explode" — I'm afraid of this too. That's why I asked Google about this. And I'm planning to avoid Google's CDN because of this, and use another CDN provider instead, which has spending limits. Because, unlike Nginx, I don't see any way for me to rate limit / throttle Google's CDN.
I do plan to use GCE (Google Compute Engine), though. So right now I'm reading about how to rate limit my Nginx server, because if I configure Nginx correctly, those $0.12/GB you mentioned cannot possibly explode to, say, $10k in a month. What if Google sends a $10k bill when I'm back from a few weeks' vacation, just because of my hobby project and a few people downloading a 1 MB movie over and over again forever (because: evil)? Hmm, and the bigger and faster my servers, the higher the risk.
I hope Google will add spending limits, because I did want to use Google's CDN.
Update 2020: Apparently this does bite people from time to time — look here:
"Burnt $72k testing Firebase and Cloud Run and almost went bankrupt", Dec 08, 2020, https://news.ycombinator.com/item?id=25372336,
In that case, they could contact Google and in the end didn't need to pay.
As of July 2017 you can set budgets that send notifications via email but do not cap spending:
To set an alert-only budget, which will not cap spending:
Go to the Cloud Platform Console.
Open the console left side menu and click Billing
If you have more than one billing account, click the billing account name.
On the left, click Budgets & alerts.
Official help page: https://support.google.com/cloud/answer/6293540?hl=en
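For reference, the same alert-only budget can also be created programmatically. A minimal sketch with the Cloud Billing Budgets Go client, assuming the newer budgetspb package layout; the billing account ID, amount, and thresholds are placeholders:

package main

import (
    "context"
    "log"

    budgets "cloud.google.com/go/billing/budgets/apiv1"
    "cloud.google.com/go/billing/budgets/apiv1/budgetspb"
    "google.golang.org/genproto/googleapis/type/money"
)

func main() {
    ctx := context.Background()
    client, err := budgets.NewBudgetClient(ctx)
    if err != nil {
        log.Fatal(err)
    }
    defer client.Close()

    // An alert-only budget: it notifies at the thresholds but never
    // caps spending.
    budget, err := client.CreateBudget(ctx, &budgetspb.CreateBudgetRequest{
        Parent: "billingAccounts/000000-000000-000000", // placeholder
        Budget: &budgetspb.Budget{
            DisplayName: "monthly-cap-alert",
            Amount: &budgetspb.BudgetAmount{
                BudgetAmount: &budgetspb.BudgetAmount_SpecifiedAmount{
                    SpecifiedAmount: &money.Money{CurrencyCode: "USD", Units: 100},
                },
            },
            ThresholdRules: []*budgetspb.ThresholdRule{
                {ThresholdPercent: 0.5},
                {ThresholdPercent: 0.9},
            },
        },
    })
    if err != nil {
        log.Fatal(err)
    }
    log.Printf("created budget %s", budget.Name)
}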
I found that Google's documentation now provides two methods to actually limit the cost of a GCP project. It involves the following setup:
Create a Cloud Function that checks the cost against the budget, and carries out a certain action if the cost exceeds the budget. Google's documentation provides a sample code snippet that can either shut down all VM instances in a project or disable billing for a project. Shutting down all VMs stops all VM-related cost, but you keep your data (and still pay for the storage). Disabling billing for a project effectively zaps all cost-related activities, and you could lose data. You can name the Cloud Function "budget-enforcer".
The Google code snippet provided above has a hard-coded ZONE variable. Remember to change it to match your zone!
Create a Service Account to run the Cloud Function "budget-enforcer". For shutting down VMs, the Service Account would need role "Compute Instance Admin (v1)". For disabling billing on a project, the Service Account would need role "Project Billing Manager".
Set a Pub/Sub topic for the Cloud Function (I call mine "proj-name-stop-vm" and "proj-name-disable-bill").
Set up a budget alert as usual, and connect it to one of the Pub/Sub topics above.
Please note that Google's documentation does mention there can be a delay between the cost exceeding a budget and the function being triggered, so you should build in a buffer if you have an absolute hard cost limit. I use 90% of the budget as the trigger line for shutting down my instances.
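A minimal sketch of the budget-checking part of such a "budget-enforcer" function in Go; the notification field names follow the documented budget alert format, the 90% trigger line mirrors this answer, and the actual shutdown call is left as a stub:

package budgetenforcer

import (
    "context"
    "encoding/json"
    "log"
)

// budgetNotification mirrors the JSON payload that budget alerts
// publish to the Pub/Sub topic.
type budgetNotification struct {
    BudgetDisplayName string  `json:"budgetDisplayName"`
    CostAmount        float64 `json:"costAmount"`
    BudgetAmount      float64 `json:"budgetAmount"`
}

// pubSubMessage is the standard Pub/Sub event payload for background
// Cloud Functions.
type pubSubMessage struct {
    Data []byte `json:"data"`
}

// EnforceBudget is the Pub/Sub-triggered entry point. It only decides
// whether the threshold is crossed; stopping VMs or disabling billing
// would happen where the log call is (see Google's samples).
func EnforceBudget(ctx context.Context, m pubSubMessage) error {
    var n budgetNotification
    if err := json.Unmarshal(m.Data, &n); err != nil {
        return err
    }
    // Trigger at 90% of the budget to leave a buffer for alert latency.
    if n.CostAmount >= 0.9*n.BudgetAmount {
        log.Printf("budget %q at %.2f of %.2f: shutting down instances",
            n.BudgetDisplayName, n.CostAmount, n.BudgetAmount)
        // ... call the Compute API to stop instances here ...
    }
    return nil
}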
The API usage can be limited with a hard limit:
Depending on the API, you can explicitly cap requests in a variety of ways, including: requests per day, requests per 100 seconds, and requests per 100 seconds per user. You might want to limit the billable usage by setting caps. For example, to prevent getting billed for usage beyond the free courtesy usage limits, you can set requests per day caps.
Source
You can combine budget pub/sub alerts with a cloud function that can disable billing on your entire account if a threshold is met.
Full Tutorial Here:
https://www.youtube.com/watch?v=KiTg8RPpGG4
GitHub Repo Here: https://github.com/aioverlords/Google-Cloud-Platform-Killswitch
To Disable Billing
// Assumes the googleapis Node.js client, as in Google's own sample.
const {google} = require('googleapis');
const billing = google.cloudbilling('v1').projects;

const _disableBillingForProject = async projectName => {
  const res = await billing.updateBillingInfo({
    name: projectName, // e.g. 'projects/my-project-id'
    resource: {
      billingAccountName: '', // detaching the billing account disables billing
    },
  });
  console.log(res);
  console.log('Billing Disabled');
  return `Billing disabled: ${JSON.stringify(res.data)}`;
};
Simply go to the developer console:
https://console.developers.google.com/project
Select your project.
Select "billings & settings"
Enable billing.
Then go to Compute/AppEngine/Settings and set a daily budget.
Go to Google Cloud console, and then to Billing / Budgets and Alerts and create a new budget for one or all your projects. You can select which services should be included in the limit and set a monthly amount that should not be exceeded.

"Over quota" when using GCS json-api from App Engine

I am using Go on App Engine. In most cases I use the file API to access GCS, which works great, except that deletes don't work, so to delete files I use the JSON API (specifically, the google-go-api-client). To authenticate, I use App Engine service accounts. We are sometimes seeing an error come back of "Over quota:" with nothing after the colon. Since we are a paid app, what quota could this be? Is there a burst limit (e.g. no more than X requests in a single minute)? Are there any places where such applicable quotas are documented?
The caching mechanism is broken for goauth2 and serviceaccount tokens. You can see the issue I created here for more detail: https://code.google.com/p/goauth2/issues/detail?id=28
I came across an "over quota" issue myself when requesting more than 60 service accounts a minute. I opened a ticket with App Engine support (I pay for the silver package) and got this undocumented information out of them.
You can apply the patch yourself in your $GOPATH/src/code.google.com/p/goauth2/appengine/serviceaccount/cache.go file. This fixed the issue you described for my team.
I had the same problem and found two possible causes:
1. Daily budget
2. Logs retention
Solution:
For problem 1, increase the daily budget.
For problem 2, increase the logs retention from 1 GB to a higher value.
