Higher than expected outgoing bandwidth - google-app-engine

We have an application that is experiencing much higher outgoing bandwidth than we would expect given its load. We see over 10 GB/day of outgoing bandwidth with essentially 0 visitors/day on the front end and a bunch of back-end processing (using backend servers and the task queue). We also use memcache.
Google says they bill as follows:
Outgoing Bandwidth (billable)
The amount of data sent by the application in response to requests.
This includes:
data served in response to both secure requests and non-secure requests by application servers, static file servers, or the Blobstore
data sent in email messages
data sent over XMPP or the Channel API
data in outgoing HTTP requests sent by the URL fetch service.
We are not serving static files (the app only exposes a REST API), don't use the Blobstore, don't send emails, and don't use XMPP. We do use the URL Fetch service, but only with GET requests. I find it hard to believe that 6,000 GET requests would amount to 10 GB of data.
Does anyone know how I can track down the details of what goes into our outgoing bandwidth usage?

To get an idea of when this bandwidth is being consumed, you can change the chart's context on the App Engine dashboard to Traffic (Bytes/Second).
Also, within the dashboard I would open the Quota Details page and give it a quick once-over to see if you can isolate which service is consuming the bandwidth.
On a side note, have you reviewed tasks in flight to see if perhaps something is stuck in a queue?
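If the Quota Details page points at URL Fetch, one way to attribute the bytes is to log what each outgoing call sends and receives. A minimal sketch for the Python runtime is below; the fetch_and_log wrapper name is made up, not something App Engine provides.

    # Sketch: route your existing GETs through a wrapper that logs traffic sizes.
    # For URL Fetch, billable outgoing bandwidth is the data in the outgoing
    # request (URL + headers + body), not the response you download, so the
    # request side should stay tiny if URL Fetch is not the culprit.
    import logging
    from google.appengine.api import urlfetch

    def fetch_and_log(url):
        result = urlfetch.fetch(url, method=urlfetch.GET)
        logging.info("urlfetch GET %s: ~%d URL bytes sent, %d response bytes received",
                     url, len(url), len(result.content))
        return result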

Related

Battery-efficient publishing to iot-core

I have an app that, when activated, uploads location data. Currently it sends the data to the server via REST; however, I would like to save on server costs and send the data via iot-core.
Previously, I would queue location updates and only send them about once every few minutes; this way the phone would only turn on its data radio every few minutes instead of keeping it on constantly, saving battery life.
Is there a way to enable similar battery saving when uploading to AWS iot-core? I haven't run tests, but I assume that constantly sending messages via MQTT, WebSockets, or HTTP is just as battery-draining as regular REST messages.
This is somewhat related to AWS IoT Message Delivery.
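To illustrate the batching pattern the question describes (queue location fixes and flush them every few minutes so the radio is mostly idle), here is a rough sketch using the generic paho-mqtt client; the endpoint, certificate paths, topic, and interval are placeholder assumptions, and AWS IoT Core additionally requires mutual TLS with per-device certificates as configured below.

    # Sketch: collect location fixes locally and publish them as one MQTT message
    # every few minutes, so the connection/radio is only used in short bursts.
    # Endpoint, certificate paths, topic and interval are placeholders.
    import json
    import time
    import paho.mqtt.client as mqtt

    ENDPOINT = "xxxxxxxx-ats.iot.us-east-1.amazonaws.com"  # placeholder AWS IoT endpoint
    TOPIC = "devices/phone1/locations"                     # placeholder topic
    FLUSH_INTERVAL_S = 300                                 # e.g. flush every 5 minutes

    client = mqtt.Client()
    client.tls_set(ca_certs="AmazonRootCA1.pem",           # placeholder certificate paths
                   certfile="device.pem.crt",
                   keyfile="private.pem.key")

    pending = []  # queued location fixes

    def flush():
        if not pending:
            return
        client.connect(ENDPOINT, 8883)   # bring the connection up only for the burst
        client.loop_start()              # background network loop
        info = client.publish(TOPIC, json.dumps(pending), qos=1)
        info.wait_for_publish()          # wait for the broker's acknowledgement
        client.loop_stop()
        client.disconnect()
        pending.clear()

    # Usage: append fixes as they arrive and call flush() from a timer/alarm
    # every FLUSH_INTERVAL_S seconds.
    pending.append({"lat": 32.07, "lon": 34.78, "ts": time.time()})
    flush()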

How to load a huge amount of data from Spring Boot to ReactJS?

I have two applications, a Spring Boot backend and a React frontend. I need to load a lot of data (let's say 100,000 objects, each with 3 Integer fields) and present it on a Leaflet map. However, I don't know which protocol I should use. I thought about two approaches:
Do it with REST, 1,000 (or more/fewer) objects per request, and show a progress bar on the front end so the user does not keep refreshing the page thinking something is wrong.
Do it with WebSocket, so it is faster? Same idea with the progress bar; however, I am worried that if the user refreshes the page, the backend will keep streaming the data even though the old frontend connection has died, the process will begin again for the new connection, and so on.
In case it is worth mentioning, I am using Spring Boot 2.3.1 together with Spring Cloud (Eureka, spring-cloud-gateway). The WebSocket library I chose is SockJS, and data is streamed with org.springframework.messaging.simp.SimpMessagingTemplate.
If you have that amount of data and a lot of read/write operations, I would recommend not returning it over either a WebSocket or a REST call (Reactor or MVC); sending a large amount of data over TCP has its issues. What I would recommend is quite simple: save the data to storage (AWS S3, for example), return the S3 object URL, and have the client read the data directly from S3 (see the sketch below).
Alternatively, you can use a message queue the client subscribes to (pub/sub): publish the data on the server side and subscribe to it on the client side, but this may be overkill.
If you are set on REST, you can use multipart data; see the Stack Overflow question here:
Multipart example
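To make the storage hand-off concrete, here is a rough sketch of the flow described above. It is written in Python with boto3 purely for brevity; the AWS SDK for Java exposes the equivalent put-object and presigned-URL operations, and the bucket and key names are placeholders.

    # Sketch: offload the large dataset to S3 and hand the client a short-lived URL,
    # so the ~100k objects never stream through the backend itself.
    import json
    import boto3

    s3 = boto3.client("s3")

    def publish_dataset(objects, bucket="my-map-data", key="datasets/points.json"):
        # 1. Serialize the objects once and store them as a single S3 object.
        s3.put_object(Bucket=bucket, Key=key,
                      Body=json.dumps(objects), ContentType="application/json")

        # 2. Return a presigned GET URL; the browser downloads the file
        #    directly from S3 and feeds it to the Leaflet map.
        return s3.generate_presigned_url(
            "get_object",
            Params={"Bucket": bucket, "Key": key},
            ExpiresIn=900,  # 15 minutes
        )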

Channel API overkill?

Hi, I am currently using the Channel API for my project. My client is a signage player which receives data from the App Engine server only when a user changes the media content. App Engine sends data to the client only once or twice a day. Do you think the Channel API is overkill for this? What are some other alternatives?
Overall, I'd think not. How many clients will be connected?
Per https://cloud.google.com/appengine/docs/quotas?hl=en#Channel the free quota is 200 channel-hours/day (200 / 24 ≈ 8.3), so if you have no more than 8 clients connected at a time you'll be within the free quota -- no "overkill".
Even beyond that, per https://cloud.google.com/appengine/pricing , there's "no additional charge" beyond the computational resources keeping the channel open entails -- I don't have exact numbers but I don't think those resources would be "overkill" compared with alternatives such as reasonably frequent polling by the clients.
According to the Channel API documentation (https://cloud.google.com/appengine/features/#channel), "The Channel API creates a persistent connection between an application and its users, allowing the application to send real time messages without the use of polling." IMHO, yours might not be the best use case for it.
You may want to take a look at the TaskQueue API (https://cloud.google.com/appengine/features/#taskqueue) as an alternative for sending data from App Engine to the client.

How is data cost calculated for the Google Channel API?

I'm writing a p2p chess game that sends 2-byte messages back and forth (e.g. e4 or c4). I'm considering using the GAE Channel API. I noticed that this API causes the browser to send a heartbeat message to the server with the POST URL https://849.talkgadget.google.com/talkgadget/dch/bind?VER=8&clid=...
That fires about every second. I won't be charged for the response data and response headers for those heartbeat requests, correct?
Also, when I send data from the server to a browser over a channel, am I charged for only the JSON string itself or for all HTTP header/payload packets?
Google has a newer (and totally free!) API you should look at instead of the Channel API (unless its restrictions can't be worked around).
GCM (Google Cloud Messaging) is free, with a few restrictions such as payload size (2 KB in some cases), but it will handle everything for you: queuing, broadcast to all, broadcast to topics, one-to-one messaging, battery-efficient mobile libraries (Android and iOS), native Chrome support, etc.
https://developers.google.com/cloud-messaging/
Make sure to also see this Stack Overflow answer for GCM implementation tips: https://stackoverflow.com/a/31848496/2213940
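As a rough illustration of how a 2-byte move could be pushed through GCM instead of a channel, here is a sketch against the GCM HTTP connection-server endpoint; the server API key and the opponent's device registration token are placeholders.

    # Sketch: push a chess move to one device via the GCM HTTP endpoint.
    # The API key and registration token below are placeholders.
    import requests

    GCM_URL = "https://gcm-http.googleapis.com/gcm/send"
    API_KEY = "YOUR_SERVER_API_KEY"               # placeholder
    OPPONENT_TOKEN = "device-registration-token"  # placeholder

    def send_move(move):  # e.g. move = "e4"
        resp = requests.post(
            GCM_URL,
            headers={"Authorization": "key=" + API_KEY,
                     "Content-Type": "application/json"},
            json={"to": OPPONENT_TOKEN, "data": {"move": move}},
        )
        resp.raise_for_status()
        return resp.json()  # success/failure counts and message_id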

Programming a load balancer with sockets using C

I'm trying to make a load balancer for 3 HTTP servers (hosts "web1", "web2", "web3"; load balancer ports "8081", "8082", "8083").
The load balancer forwards HTTP requests randomly to one of the servers and then returns the response to the original sender.
I'm a beginner with sockets, so could anyone tell me what the program would look like?
If it is not clear, I'm ready to give more details.
You need to figure out whether the requests are stateful, i.e. whether they belong to a valid session; such requests should be consistently routed to the same server to avoid failures and inconsistencies. Fresh requests can be routed to any of the servers based on a load-balancing algorithm, e.g. round robin or least-loaded server.
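The question asks for C; purely to show the structure (listen, accept, pick a random backend, forward the request bytes, relay the response back), here is a compact single-connection sketch. It is written in Python to keep it short, but every call used below (socket, bind, listen, accept, connect, recv, send, close) maps one-to-one onto the BSD socket functions you would use in C. The backend hostnames come from the question; the backend port and the single listening port are assumptions.

    # Sketch: a naive random-forwarding proxy that handles one connection at a time.
    # It assumes the backend closes the connection after responding
    # (e.g. HTTP/1.0 or "Connection: close").
    import random
    import socket

    BACKENDS = [("web1", 80), ("web2", 80), ("web3", 80)]  # assumed backend port 80
    LISTEN_PORT = 8081                                     # one of the ports in the question

    listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    listener.bind(("", LISTEN_PORT))
    listener.listen(5)

    while True:
        client, addr = listener.accept()   # wait for a client
        request = client.recv(65536)       # naive: assume the request fits one recv

        host, port = random.choice(BACKENDS)
        backend = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        backend.connect((host, port))
        backend.sendall(request)           # forward the request unchanged

        # Relay the response until the backend closes its side.
        while True:
            chunk = backend.recv(65536)
            if not chunk:
                break
            client.sendall(chunk)

        backend.close()
        client.close()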
