GAE "The API call urlfetch.Fetch() required more quota than is available" when resumable upload Video files - google-app-engine

My GAE application reads files from Drive via the Drive API into a FileStream, and the FileStream is then uploaded to YouTube via the YouTube Data API v3 using "resumable upload". When the file gets larger (e.g. 60 MB), the YouTube API returns this error: "The API call urlfetch.Fetch() required more quota than is available".
I have also tried "direct upload" for the 60 MB video file; the error message then becomes "java.lang.OutOfMemory: Java heap space at com.google.protobuf.ByteString.copyFrom (ByteString.java:178)".
Here is the brief version of my code:
GoogleCredential credential = new GoogleCredential.Builder()
    .setTransport(HTTP_TRANSPORT)
    .setJsonFactory(JSON_FACTORY)
    .setServiceAccountId(SERVICE_ACCOUNT_EMAIL)
    .setServiceAccountScopes(YouTubeScopes.YOUTUBE)
    .setServiceAccountPrivateKeyFromP12File(new File(P12))
    .setServiceAccountUser(account)
    .build();
YouTube service = new YouTube.Builder(HTTP_TRANSPORT, JSON_FACTORY, credential)
    .setApplicationName("VSP")
    .build();
Video videoObjectDefiningMetadata = new Video();
VideoSnippet snippet = new VideoSnippet();
snippet.setTitle(title);
videoObjectDefiningMetadata.setSnippet(snippet);
InputStreamContent mediaContent =
    new InputStreamContent(VIDEO_FILE_FORMAT, new BufferedInputStream(filestream));
mediaContent.setLength(filesize);
YouTube.Videos.Insert videoInsert =
    service.videos().insert("snippet,statistics,status", videoObjectDefiningMetadata, mediaContent);
MediaHttpUploader uploader = videoInsert.getMediaHttpUploader();
uploader.setDirectUploadEnabled(false);
uploader.setChunkSize(7864320);
Video returnedVideo = videoInsert.execute();
The error message "The API call urlfetch.Fetch() required more quota than is available" comes from the last line of the code. Depending on how I set the chunk size, the upload sometimes completes successfully despite the error message, and sometimes does not.
I couldn't find any useful information about this error message, but my guess is that a GAE application can only send a certain amount of requests in a certain amount of time. Since "resumable upload" breaks the filestream into chunks and sends them as a sequence of requests, it reaches the limit easily. If my guess is right, what is the limit, and how do I solve this problem? If my guess is wrong, where do you think the problem is?
Thanks

Thanks guys!
Here is the limit on incoming and outgoing bandwidth for URL Fetch in GAE:
https://developers.google.com/appengine/docs/quotas
By default the limit is 22 MB/minute; with billing enabled it becomes 740 MB/minute. So with the 22 MB/minute limit, a GAE task queue can upload video files of roughly 220 MB to YouTube (22 MB × 10 min).
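The arithmetic behind that figure, as a quick sanity check:

```python
URLFETCH_QUOTA_MB_PER_MIN = 22  # default URL Fetch bandwidth quota
TASK_DEADLINE_MIN = 10          # task queue request deadline

max_upload_mb = URLFETCH_QUOTA_MB_PER_MIN * TASK_DEADLINE_MIN
print(max_upload_mb)  # 220
```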
But this leads to a problem with the code above,
Video returnedVideo = videoInsert.execute();
because we cannot control how many chunks are sent per minute with that call. My solution was to follow the description in the link below and handle each of the requests myself: https://developers.google.com/youtube/v3/guides/using_resumable_upload_protocol
This way, we can control how much of the stream is sent each minute.
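Following that guide, the per-chunk bookkeeping can be sketched in Python (the helper names here are made up for illustration; only the Content-Range header format comes from the linked protocol guide):

```python
def content_range(start, end, total):
    """Content-Range value for one chunk, per the resumable upload protocol."""
    return 'bytes %d-%d/%d' % (start, end, total)

def chunk_ranges(total_size, chunk_size):
    """Yield inclusive (start, end) byte ranges covering the whole file."""
    start = 0
    while start < total_size:
        end = min(start + chunk_size, total_size) - 1
        yield (start, end)
        start = end + 1

# Hypothetical upload loop (upload_url would come from the Location header of
# the session-initiation request described in the guide; pausing between
# chunks keeps the per-minute URL Fetch bandwidth under the quota):
#
# for start, end in chunk_ranges(filesize, 7864320):
#     urlfetch.fetch(upload_url, payload=data[start:end + 1], method='PUT',
#                    headers={'Content-Range': content_range(start, end, filesize)})
#     time.sleep(pause_seconds)
```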

Related

Akka http file streaming not writing bytes more than 'n' size from different clients

Akka http 10.0.6;
max-content-length is set to 6000m
I am using Akka HTTP file streaming to upload huge files (sent as octet-stream to my service), accepting the incoming bytes and writing them to a file sink. Here is what I observe from my experiments. My limited understanding from reading the documents is that the client should be able to keep sending data unless Akka HTTP explicitly applies its backpressure mechanism. I have been searching online to understand this behavior but haven't found an explanation yet. Is there something I am missing in the code? How can I debug this further? Also, via ScalaTest I am able to do it; if someone can throw more light on the difference in behavior between ScalaTest and curl/HTTP clients, thanks.
Through curl, I can stream a file of at most 1 KB. Anything more than that just hangs, and after the timeout (no matter how long I wait: 20 seconds, 5 minutes, 10 minutes, etc.) this message is given: "Sending an 2xx 'early' response before end of request was received... Note that the connection will be closed after this response. Also, many clients will not read early responses! Consider only issuing this response after the request data has been completely read!"

Through the Apache HTTP client, streaming works for files up to 128 KB. Anything more than that hangs, with the same message as in #1 from the Akka HTTP service.

Through a Python client, streaming works for files up to ~26 KB. Anything more than that hangs, with the same message as in #1 from the Akka HTTP service.

Through ScalaTest inside the service, I was able to upload files of 200 MB, 400 MB and more.
Here’s the code..
put {
  withoutSizeLimit {
    extractDataBytes { bytes =>
      implicit val system = ActorSystem()
      implicit val materializer = ActorMaterializer()
      // Tried the system dispatcher also
      implicit val executionContext = system.dispatchers.lookup("dispatcher")

      val sink = FileIO.toPath(Paths.get("/file.out"))
      val action = bytes.runWith(sink).map {
        case ior if ior.wasSuccessful =>
          complete(StatusCodes.OK, s"${ior.count} bytes written")
        case ior =>
          complete(StatusCodes.EnhanceYourCalm, ior.getError.toString)
      }
      Await.result(action, 300.seconds)
    }
  }
}

Set up configuration to Access amazon order from seller account

I want to set up the config.inc.php of the MarketplaceWebServiceOrders library that I use to access orders from an Amazon seller account.
Here is my config.inc.php setting:
/************************************************************************
* REQUIRED
*
* All MWS requests must contain a User-Agent header. The application
* name and version defined below are used in creating this value.
***********************************************************************/
define('APPLICATION_NAME', 'MarketplaceWebServiceOrders');
define('APPLICATION_VERSION', '2013-09-01');
After configuring these settings, I got this error:
Caught Exception: Resource / is not found on this server. API Section is missing or you have provided an invalid operation name. Response Status Code: 404 Error Code: InvalidAddress Error Type: Sender Request ID: 47e5f613-5913-48bb-ac9e-cb00871b36af XML: Sender InvalidAddress Resource / is not found on this server. API Section is missing or you have provided an invalid operation name. 47e5f613-5913-48bb-ac9e-cb00871b36af ResponseHeaderMetadata: RequestId: 47e5f613-5913-48bb-ac9e-cb00871b36af, ResponseContext: 6qut/Q5rGI/7Wa0eutUnNK1+b/1rvHSojYBvlGThEd1wAGdfEtnpP2vbs28T0GNpF9uG82O0/9kq 93XeUIb9Tw==, Timestamp: 2015-09-15T12:47:19.924Z, Quota Max: , Quota Remaining: , Quota Resets At:
Here is the GetOrderSample.php code for the service URL, which I have already set up:
// More endpoints are listed in the MWS Developer Guide
// North America:
$serviceUrl = "https://mws.amazonservices.com/Orders/2013-09-01";
// Europe
//$serviceUrl = "https://mws-eu.amazonservices.com/Orders/2013-09-01";
// Japan
//$serviceUrl = "https://mws.amazonservices.jp/Orders/2013-09-01";
// China
//$serviceUrl = "https://mws.amazonservices.com.cn/Orders/2013-09-01";
$config = array(
    'ServiceURL'    => $serviceUrl,
    'ProxyHost'     => null,
    'ProxyPort'     => -1,
    'ProxyUsername' => null,
    'ProxyPassword' => null,
    'MaxErrorRetry' => 3,
);
$service = new MarketplaceWebServiceOrders_Client(
    AWS_ACCESS_KEY_ID,
    AWS_SECRET_ACCESS_KEY,
    APPLICATION_NAME,
    APPLICATION_VERSION,
    $config);
Caught Exception: Resource / is not found on this server.
This is telling you that the problem is with the path: the service can't find that resource.
Are you sure the class you're trying to use is on the right path? Most classes needed to run the samples are in the "Model" folder.
That is, if you take the sample out of the sample folder and put it elsewhere, it won't be able to find the classes in the "Model" folder.
An easy fix to test it is to put everything you downloaded from the MWS site into your web directory and just edit the config file. It should work that way.
I'm not a PHP guy, but I don't see where you're setting up the access keys, merchant ID, marketplace ID, and most importantly the ServiceURL. The 404 error is the first clue, meaning it can't find the service. Download the PHP client library, which has everything you need to get started.

What is the size limit of email body in messages.send method of Gmail API?

I am using the official .NET API client to send emails with attachments via the messages.send method. When I attach a file larger than approximately 5 MB, I get the following exception:
[JsonReaderException: Unexpected character encountered while parsing value: <. Path '', line 0, position 0.]
Newtonsoft.Json.JsonTextReader.ParseValue() +1187
Newtonsoft.Json.JsonTextReader.ReadInternal() +65
Newtonsoft.Json.JsonTextReader.Read() +28
Newtonsoft.Json.Serialization.JsonSerializerInternalReader.ReadForType(JsonReader reader, JsonContract contract, Boolean hasConverter) +237
Newtonsoft.Json.Serialization.JsonSerializerInternalReader.Deserialize(JsonReader reader, Type objectType, Boolean checkAdditionalContent) +783
Newtonsoft.Json.JsonSerializer.DeserializeInternal(JsonReader reader, Type objectType) +293
Newtonsoft.Json.JsonConvert.DeserializeObject(String value, Type type, JsonSerializerSettings settings) +274
Newtonsoft.Json.JsonConvert.DeserializeObject(String value, JsonSerializerSettings settings) +57
Google.Apis.Json.NewtonsoftJsonSerializer.Deserialize(String input) in c:\code\google.com\google-api-dotnet-client\default\Tools\Google.Apis.Release\bin\Debug\test\default\Src\GoogleApis.Core\Apis\Json\NewtonsoftJsonSerializer.cs:120
Google.Apis.Services.<DeserializeError>d__8.MoveNext() in c:\code\google.com\google-api-dotnet-client\default\Tools\Google.Apis.Release\bin\Debug\test\default\Src\GoogleApis\Apis\Services\BaseClientService.cs:286
[GoogleApiException: An Error occurred, but the error response could not be deserialized]
Google.Apis.Requests.ClientServiceRequest`1.Execute() in c:\code\google.com\google-api-dotnet-client\default\Tools\Google.Apis.Release\bin\Debug\test\default\Src\GoogleApis\Apis\Requests\ClientServiceRequest.cs:102
I think the client uses the metadata URI, meant for metadata-only requests. I am going to try the other option: the upload URI, for media upload requests.
It looks like there is a limit on email size, and exceeding it produces an error response that the client library fails to parse.
So, the first question: is there a size limit?
Second, there is no info about how to use the different upload methods via the client. Do you know of any client library documentation?
Update: I found that the request produced by
var request = gmailService.Users.Messages.Send(message, AccountUserId);
goes to https://www.googleapis.com/gmail/v1/users/me/messages/send. As I supposed, it does not use a media upload request.
I ended up imposing a limit on the total size of attachments. Here is a code snippet.
Class level:
public const int MaxAttachmentsSize = 5 * 1024 * 1024;
Method level:
var attachmentsSize = 0;
foreach (var attachment in attachments)
{
    attachmentsSize += attachment.Item1;
    if (attachmentsSize > MaxAttachmentsSize) break;
    mailMessage.Attachments.Add(attachment.Item2);
}
There is a limit in MB on the whole message. The Google API allows quite large emails, but sending may time out if your service takes too long because of connection speed.
According to the Google docs it is 35 MB:
google api send docs
For anything (uploading) over a few MB you should definitely use the /upload version of the method; otherwise, yes, you may run into those size limitations.
In response to the second part of your question...
Second, there is no info about how to use different upload methods via client, do you know any client library documentation?
I did a little poking around in the API, and I see that there is also a method that takes a stream as the 3rd parameter.
services.Users.Messages.Send( body, userId, stream, contentType)
Digging into the source code of that, I see that it seems to use a URL that looks like:
upload/....
I haven't tried it yet, and I don't know (yet) what it wants for a "stream", but this looks like a good possibility for getting a resumable upload with a bigger limit.
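For comparison, the Python client library exposes the same upload path explicitly via a media_body parameter; a sketch (untested; the size-threshold helper is made up, mirroring the 5 MB constant in the question):

```python
# Hypothetical send via the upload URI (requires google-api-python-client and
# an already-authorized Gmail `service`, so it is shown as comments):
#
# from googleapiclient.http import MediaFileUpload
# media = MediaFileUpload('message.eml', mimetype='message/rfc822', resumable=True)
# service.users().messages().send(userId='me', body={}, media_body=media).execute()

MAX_SIMPLE_SIZE = 5 * 1024 * 1024  # assumed threshold, mirroring the snippet above

def needs_media_upload(size_bytes, limit=MAX_SIMPLE_SIZE):
    """True when a message is large enough to warrant the /upload endpoint."""
    return size_bytes > limit
```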

How to get partial results from Google App Engine's urlfetch?

When I'm using google.appengine.api.urlfetch.fetch (or the asynchronous variant with make_rpc) to fetch a URL that steadily streams data, after a while I will get a google.appengine.api.urlfetch_errors.DeadlineExceededError as expected. Since it is a stream that I want to sample, setting the deadline to a higher value can't ever help, unless the stream finishes (which I do not expect to happen).
It seems there is no possibility of getting the partially downloaded result. At least the API doesn't offer anything. Is it possible to
either request the downloaded part
or only ask for a certain amount of data (since I can estimate the stream's rate) to be downloaded?
[Clarification: Since it is a stream, requests with a Range header will be answered with 200 OK and not 206 Partial Content.]
In your call to urlfetch.fetch, you can set HTTP headers. The Range header is how you specify a partial-download request in HTTP:
resp = urlfetch.fetch(
    url=whatever,
    headers={'Range': 'bytes=100-199'})
if those are the 100 bytes you want. The HTTP status code you get should be 206 for such a partial download, etc. (none of that is GAE-specific). See e.g. http://en.wikipedia.org/wiki/Byte_serving for details.
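To make the 206-vs-200 distinction from the clarification explicit, a small sketch (the `range_header` helper and the fallback logic are illustrative additions, not part of the original answer):

```python
def range_header(start, end):
    """Range header for an inclusive byte span (RFC 7233 syntax)."""
    return {'Range': 'bytes=%d-%d' % (start, end)}

# Hypothetical App Engine usage (urlfetch only exists in that runtime):
#
# resp = urlfetch.fetch(url=whatever, headers=range_header(100, 199))
# if resp.status_code == 206:
#     data = resp.content  # server honored the Range header
# else:
#     # 200 OK: the server ignored the Range header; for an endless stream
#     # this request will still hit the deadline, as the asker notes.
#     data = resp.content[:100]
```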

Sending images to google cloud storage using google app engine

I wrote a simple webapp using google app engine in python that allows users to upload images and have it stored somewhere (for now I'm just using the blob store based on the tutorials).
Now I want to send the images to Google Cloud Storage, but am not sure how to do this. They provide two modes when opening a file: "a" and "r". Neither of them, to my knowledge, is for binary streams.
How can I send an image to google cloud storage? Code snippets or references would be nice.
I am planning to send small audio samples as well, and other binary data.
Also, how can I delete the image if a user wishes to delete it? There doesn't seem to be a delete method available.
Here's a simple upload handler that will write the blob that was uploaded to bigstore. Note you will need to have added your app's service account to the team account that is managing the bigstore bucket.
class UploadHandler(blobstore_handlers.BlobstoreUploadHandler):
    def post(self):
        upload_files = self.get_uploads('file')
        blob_info = upload_files[0]
        in_file_name = files.blobstore.get_file_name(blob_info.key())
        infile = files.open(in_file_name)
        out_file_name = '/gs/mybucket/' + blob_info.filename
        outfile = files.gs.create(out_file_name,
                                  mime_type=blob_info.content_type)
        f = files.open(outfile, 'a')
        chunkSize = 1024 * 1024
        try:
            data = infile.read(chunkSize)
            while data:
                f.write(data)
                data = infile.read(chunkSize)
        finally:
            infile.close()
            f.close()
        files.finalize(outfile)
There isn't a difference between a binary stream and a text stream in cloud storage. You just write strings (or byte strings) to the file opened in "a" mode. Follow the instructions here.
Also, if you are serving images from the blobstore, you are probably better off using get_serving_url() from here, although that depends on your application.
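For what it's worth, the Files API used in the answer above was later deprecated in favor of the GCS client library; a sketch of the same write, plus the delete the asker was missing (assuming the `cloudstorage` package from the appengine-gcs-client library; the bucket and file names are made up):

```python
def gcs_object_path(bucket, name):
    """GCS client-library paths take the form '/bucket/object'."""
    return '/%s/%s' % (bucket, name)

# Hypothetical write and delete (commented out because `cloudstorage` needs
# the App Engine runtime):
#
# import cloudstorage
# path = gcs_object_path('mybucket', blob_info.filename)
# with cloudstorage.open(path, 'w', content_type=blob_info.content_type) as f:
#     f.write(data)          # plain byte strings; no separate binary mode needed
# cloudstorage.delete(path)  # covers the deletion question from the asker
```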
