I am trying to upload multiple files to Google Drive using the PHP SDK. To do this, I call the function below iteratively, passing the required parameters:
function insertFile($driveService, $title, $description, $parentId, $fileUrl) {
    global $header;
    $file = new Google_DriveFile();
    $file->setTitle($title);
    $file->setDescription($description);
    // Default to the folder MIME type; overridden below when a file URL is given.
    $mimeType = "application/vnd.google-apps.folder";
    if ($fileUrl != null) {
        $fileUrl = replaceSpaceWithHtmlCode($fileUrl);
        $header = getUrlHeader($fileUrl);
        $mimeType = $header['content-type'];
    }
    $file->setMimeType($mimeType);
    // Set the parent folder.
    $parent = new Google_ParentReference();
    if ($parentId != null) {
        $parent->setId($parentId);
        $file->setParents(array($parent));
    }
    try {
        $data = null;
        if ($fileUrl != null) {
            if (hasErrors($driveService, $fileUrl) == true) {
                return null;
            }
            $data = file_get_contents($fileUrl);
        }
        $createdFile = $driveService->files->insert($file, array(
            'data' => $data,
            'mimeType' => $mimeType,
        ));
        return $createdFile;
    } catch (Exception $e) {
        echo "Error: 12";
        return null;
    }
}
I am running this app on the Google App Engine.
However, I am unable to upload all the files I pass to it. For example, if I pass about 12-15 files, only 10-11 get uploaded (though sometimes all of them succeed), even though all parameters are correct. When I catch the exception for a file that fails, it only says that it was unable to create the file. I don't see any warnings or errors in the logs on App Engine.
Am I missing something? Can someone please point me to where I should be looking, to correct this and make it reliable enough to upload all the files given to it?
The HTTP response that I get when I try to upload 30 files is this:
PHP Fatal error: The request was aborted because it exceeded the maximum execution time
Check the HTTP response to see the detailed reason. It might be that you are hitting the throttle limit and getting a 403 rate limit response.
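If it is rate limiting, the standard remedy is to retry each failed insert with exponential backoff. Here is a minimal sketch, assuming insertFile() is changed to rethrow instead of swallowing the exception (Google_ServiceException is the exception class this generation of the PHP client throws for API errors):
function insertFileWithRetry($driveService, $title, $description, $parentId, $fileUrl) {
    $maxRetries = 5;
    for ($attempt = 0; $attempt < $maxRetries; $attempt++) {
        try {
            return insertFile($driveService, $title, $description, $parentId, $fileUrl);
        } catch (Google_ServiceException $e) {
            if ($e->getCode() != 403) {
                throw $e; // not rate limiting, give up immediately
            }
            // Exponential backoff with jitter: ~1s, 2s, 4s, 8s, 16s.
            usleep((int)((pow(2, $attempt) + mt_rand(0, 1000) / 1000) * 1000000));
        }
    }
    return null; // still throttled after all retries
}
Note that backing off stretches the total request time, so a large batch will hit App Engine's execution deadline (the fatal error quoted above) even sooner; splitting the batch across task queue tasks is the usual way around that.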
I'm working on my CakePHP project, and I am currently upgrading from 3.3.16 to 3.4.0.
The project uses the cakephp-upload plugin to save an image.
The Upload plugin needs an existing entity to attach a file to, so the request is modified to grab the avatar, which is then unset before saving the user.
I know modifying a request is not good practice, but the code was made this way.
With immutable request objects in version 3.4.0, that is simply not possible anymore, but I don't know how to do it properly.
Here is the error message given by my unit test, run with vendor/bin/phpunit --filter testAdd tests/TestCase/Controller/Api/V1/UsersControllerTest.php:
There was 1 failure:
1) App\Test\TestCase\Controller\Api\V1\UsersControllerTest::testAdd
Failed asserting that file "/home/comptoir/Comptoir-srv/webroot/img/files/Users/photo/5/avatar/correctAvatarLogo.jpg" exists.
/home/comptoir/Comptoir-srv/tests/TestCase/Controller/Api/V1/UsersControllerTest.php:208
Here is the actual code:
public function add()
{
    if (!empty($this->request->data)) {
        $user = $this->Users->newEntity($this->request->data);
    } else {
        $user = $this->Users->newEntity();
    }
    $message = "";
    // Get the avatar before unsetting it, so the user can be saved first.
    // The Upload plugin needs an existing entity to attach a file to.
    if ($this->request->is('post')) {
        if (isset($this->request->data['photo']) && !$user->errors()) {
            $avatar = $this->request->data['photo'];
            $this->request->data['photo'] = "";
        }
        $user = $this->Users->patchEntity($user, $this->request->data);
        if ($this->Users->save($user)) {
            $user = $this->Users->get($user->id, ['contain' => []]);
            isset($avatar) ? $this->request->data['photo'] = $avatar : null;
            $user = $this->Users->patchEntity($user, $this->request->data);
            if ($this->Users->save($user)) {
                $message = "Success";
                $this->Flash->success(__d("Forms", "Your are registred on the Comptoir du Libre, welcome !"));
                if (!$this->request->is('json')) {
                    $this->Auth->setUser($this->Auth->identify());
                    $this->redirect([
                        "prefix" => false,
                        "controller" => "Pages",
                        "language" => $this->request->param("language")
                    ]);
                }
            } else {
                $message = "Error";
            }
        } else {
            $message = "Error";
            $this->Flash->error(__d("Forms", "Your registration failed, please follow rules in red."));
        }
        $message == "Error" ? $this->set('errors', $user->errors()) : null;
    }
    $this->ValidationRules->config('tableRegistry', "Users");
    $rules = $this->ValidationRules->get();
    $userTypes = $this->Users->UserTypes->find('list', ['limit' => 200]);
    $this->set(compact('user', 'userTypes', 'rules', 'message'));
    $this->set('_serialize', ['user', 'userTypes', 'rules', 'message', 'errors']);
}
Does anyone know how to do this while respecting immutability?
Your premise is wrong.
The Upload plugin needs an existing entity to attach a file to it
That's actually not correct: uploading files alongside creating new records works fine. There's no need for this dance in your controller; it should be possible to handle this with a single basic save, so you should investigate the problem that you're having with that, and fix it.
However, looking at your test, it should fail anyway, because the file data that you're passing is invalid: it is not an actual uploaded file for which is_uploaded_file() would return true, and it is not acceptable for user data to be able to define the temporary file path and the error code. If that test passes as is, you're not properly validating the data. Accepting such data is a security vulnerability; it could allow all sorts of attacks, from path traversal to arbitrary file injection!
Ideally your whole upload validation and writing functionality would support \Psr\Http\Message\UploadedFileInterface objects. That would allow for very simple testing, since you could pass instances of that class in the test data; it might be something worth suggesting for the plugin. Without such functionality, your second-best bet is probably to modify the table's validation rules before issuing the test request so that the is_uploaded_file() check is skipped, or to switch to integration tests over HTTP instead of the simulation in CakePHP.
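For comparison, the single-save shape of the action would look something like this (a minimal sketch, assuming the plugin attaches the 'photo' field during the same save; the validation rules and the JSON/redirect handling from the original action are omitted):
public function add()
{
    $user = $this->Users->newEntity();
    if ($this->request->is('post')) {
        // One patch + one save; the Upload plugin processes the 'photo'
        // field as part of this save, so no request mutation is needed.
        $user = $this->Users->patchEntity($user, $this->request->data);
        if ($this->Users->save($user)) {
            $this->Flash->success(__d("Forms", "Your are registred on the Comptoir du Libre, welcome !"));
        } else {
            $this->Flash->error(__d("Forms", "Your registration failed, please follow rules in red."));
            $this->set('errors', $user->errors());
        }
    }
    $this->set(compact('user'));
}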
I can successfully send emails with the App Engine Mail API, but I cannot figure out why the attachment does not get attached to the email. My code looks like this:
// Pull in the raw file data of the image file to attach it to the message.
$image_data = fopen('gs://pro-sitemaps-api/file.csv');
try {
    $message = new Message();
    $message->setSender('myemail@****.com');
    $message->addTo('myemail@****.com');
    $message->setSubject('Subject Google App Engine Test');
    $message->setTextBody('Test body');
    $message->addAttachment('file.csv', $image_data);
    $message->send();
    echo 'Mail Sent';
} catch (InvalidArgumentException $e) {
    echo 'There was an error'.$e;
}
My file is hosted in a Google Cloud Storage bucket, and it is a .csv file.
Can anyone tell me what I am doing wrong?
Edit:
Adding "mode":"r" gives me this error:
$image_data = fopen('gs://pro-sitemaps-api/file.csv', 'r');
Output:
error: {
code: "500",
message: "An error occurred parsing (locally or remotely) the arguments to mail.Send().",
status: "UNKNOWN",
details: [ ]
}
}
Edit:
This works, but it only sends the top line (the header), not all lines. I tried putting the lines into an array and sending the array as the attachment, but then I get the same error:
$fpmail = fopen('gs://pro-sitemaps-api/file.csv', 'r');
//$attach = fread($fpmail);
$attach = fgets($fpmail); // fgets() reads a single line, not the whole file
print_r($attach);
try {
    $message = new Message();
    $message->setSender('asim@redperformance.no');
    $message->addTo('asim@redperformance.no');
    $message->setSubject('Test');
    $message->setTextBody('Test nyeste');
    $message->addAttachment('file.csv', $attach);
    $message->send();
    echo 'Mail Sent';
} catch (InvalidArgumentException $e) {
    echo $e;
}
fclose($fpmail);
I don't ever code in PHP, but it looks like you are not actually reading the contents of the file.
In most programming languages, a call like fopen is just the first step of getting the contents of the file and doesn't actually read the data. You'll probably need something like this:
$f = fopen('gs://pro-sitemaps-api/file.csv', 'r');
$image_data = stream_get_contents($f);
where stream_get_contents() is PHP's way of reading the rest of an open stream into a string.
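Putting that together with the question's code, a corrected version might look like this (a sketch; file_get_contents() reads the whole object in one call, and the sender/recipient are the placeholders from the question):
// Read the entire CSV object from Cloud Storage into a string.
$csv_data = file_get_contents('gs://pro-sitemaps-api/file.csv');

try {
    $message = new Message();
    $message->setSender('myemail@****.com');
    $message->addTo('myemail@****.com');
    $message->setSubject('Subject Google App Engine Test');
    $message->setTextBody('Test body');
    // Attach the full file contents, not a stream handle or a single line.
    $message->addAttachment('file.csv', $csv_data);
    $message->send();
    echo 'Mail Sent';
} catch (InvalidArgumentException $e) {
    echo 'There was an error: ' . $e;
}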
I'm trying to play MP3 files from the server side on the client side, where the client accesses the server passing some ID and the server returns the file.
How is this working right now?
Using Laravel (server-side) and AngularJS (client-side) on distinct URLs, I'm able to play the song.
But if I inspect the request's response, I'm able to download the song.
So what would be a good way to work so that this information isn't visible to the user?
I would write some sort of file proxy.
You have to move your files out of the publicly accessible area, for example one level above the document root, so it is not possible to fetch the data directly.
Then you need a server-side script that reads the data and returns it with the headers you need.
Here is an example (plain PHP):
/**
 * @param string $file_name
 * @param string $mime
 * @param bool $download
 */
public function fileProxyAction($file_name, $mime, $download = false) {
    if (basename($file_name) != $file_name) return 'Filename not valid!';
    $path = '... your path goes here';
    $file = $path.$file_name;
    if (!(file_exists($file) && is_readable($file))) return 'The file "'.$file_name.'" could not be found!';
    ob_clean();
    if ($download === false) {
        // stream the file inline
        header('Content-type: '.$mime);
        header('Content-length: '.filesize($file));
        $open = @fopen($file, 'rb');
        if ($open) {
            fpassthru($open);
            exit;
        }
    } else {
        // force a download dialog
        $path_parts = pathinfo($file);
        header("Content-Disposition: attachment; filename=\"".$path_parts["basename"]."\"");
        header("Content-type: application/octet-stream");
        header("Content-length: " . filesize($file));
        header("Cache-control: private"); // open files directly
        readfile($file);
        die;
    }
}
Laravel has an excellent built-in filesystem. Check it out; I'm sure you can optimize my method with it.
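For instance, a Laravel-flavored variant of the inline branch could be as small as this (a sketch assuming Laravel 5's Storage facade and a hypothetical non-public disk named 'songs' configured in config/filesystems.php):
use Illuminate\Support\Facades\Storage;

public function fileProxyAction($file_name, $mime)
{
    if (basename($file_name) != $file_name) abort(400, 'Filename not valid!');

    $disk = Storage::disk('songs'); // hypothetical non-public disk
    if (!$disk->exists($file_name)) abort(404);

    // Return the raw bytes with the right headers; no public URL is ever exposed.
    return response($disk->get($file_name), 200)
        ->header('Content-Type', $mime)
        ->header('Content-Length', $disk->size($file_name));
}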
EDIT
If you need to check a token or something, you shouldn't call fileProxyAction directly from the router. Instead, let your router call a method which checks the token or whatever you're using ;)
Example (pseudo code):
Route::get('/mp3/{id}/{token}', function($id, $token) {
    if ($token !== Session::get('token')) return App::abort(401);
    $name = Mp3::findOrFail($id)->name;
    $mime = Mp3::findOrFail($id)->mime;
    return $this->fileProxyAction($name, $mime);
});
I have a weird issue. It might be something silly, but I can't find where the problem is.
I'm developing an application on CakePHP 2.x, and when I log data from the controller it appears twice in the log. Something like this:
2013-05-24 11:50:19 Debug: excel file uploaded
2013-05-24 11:50:19 Debug: excel file uploaded
2013-05-24 11:50:19 Debug: fire test
2013-05-24 11:50:19 Debug: fire test
Just to add some fun, it doesn't happen in all functions in that controller, only in two out of six. It annoys me a lot, and I can't see which way I should dig to get rid of it.
Any ideas?
EDIT:
OK, I found that this happens when I log to two different files in one method.
When I change the line CakeLog::write('time', ...); to CakeLog::write('debug', ...);
everything works fine. Like in the following method:
function file_upload() {
    if (!$this->request->data) {
    } else {
        CakeLog::write('time', 'start working at: ' . date('m/d/Y', strtotime("now")));
        $data = Sanitize::clean($this->request->data);
        CakeLog::write('debug', 'test statement');
        if ($data['Scrap']['excel_submittedfile']['type'] === 'application/vnd.ms-excel' && $data['Scrap']['csv_submittedfile']['type'] === 'text/csv') {
            $tmp_xls_file = $data['Scrap']['excel_submittedfile']['tmp_name'];
            $xls_file = $data['Scrap']['excel_submittedfile']['name'];
            $tmp_csv_file = $data['Scrap']['csv_submittedfile']['tmp_name'];
            $csv_file = $data['Scrap']['csv_submittedfile']['name'];
            $upload_dir = WWW_ROOT . "/files/";
            if (file_exists($upload_dir) && is_writable($upload_dir)) {
                if (move_uploaded_file($tmp_xls_file, $upload_dir . $xls_file) && move_uploaded_file($tmp_csv_file, $upload_dir . $csv_file)) {
                    CakeLog::write('debug', 'excel file uploaded');
                    $this->redirect(array('action' => 'edit', $xls_file, $csv_file));
                } else {
                    echo 'upload failed';
                }
            } else {
                echo 'Upload directory is not writable, or does not exist.';
            }
        } else {
            echo 'make sure the files are in correct format';
        }
    }
}
I guess it has something to do with the declarations of log files in bootstrap.php. So it's not that big a problem, just annoying.
This happens because the call
CakeLog::write('time', 'start working at: ' . date('m/d/Y', strtotime("now")));
will attempt to write a log of type "time". Since there is no stream configured to handle that type, CakeLog will create a "default" stream for you to handle this log call.
The problem is that from then on you have a "default" stream configured that catches all logs and doubles them for debug and error messages.
The solution is to properly configure the log in the bootstrap.php file, like this:
CakeLog::config('time_stream', array(
    'engine' => 'FileLog',
    'types' => array('time'), // <-- here is the log type of 'time'
    'file' => 'time',         // <-- this will go to time.log
));
Of course, if you use other log types you will need to configure streams for those as well; otherwise the default catch-all stream will be configured for you and you will have the same problem again.
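For example, explicit streams for the standard types could be declared alongside it (a sketch following the FileLog engine used above; the stream names and log files mirror CakePHP 2.x's defaults):
// Debug, notice and info messages go to debug.log.
CakeLog::config('debug_stream', array(
    'engine' => 'FileLog',
    'types' => array('notice', 'info', 'debug'),
    'file' => 'debug',
));
// Warnings and errors go to error.log.
CakeLog::config('error_stream', array(
    'engine' => 'FileLog',
    'types' => array('warning', 'error', 'critical', 'alert', 'emergency'),
    'file' => 'error',
));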
Good luck!
I am creating a Google App Engine web app to "transform" files of 10K~50M.
Scenario:
1. User opens http://fixdeck.appspot.com in a web browser
2. User clicks on "Browse", selects a file, submits
3. Servlet loads the file as an InputStream
4. Servlet transforms the file
5. Servlet saves the file as an OutputStream
6. The user's browser receives the transformed file and asks where to save it, directly as a response to the request in step 2
(For now I have not implemented step 4; the servlet sends the file back without transforming it.)
Problem: it works for 15MB files but not for a 40MB file, which fails with: "Error: Request Entity Too Large. Your client issued a request that was too large."
Is there any workaround against this?
Source code: https://github.com/nicolas-raoul/transdeck
Rationale: http://code.google.com/p/ankidroid/issues/detail?id=697
GAE has a hard limit of 32MB for HTTP requests and HTTP responses. That limits the size of uploads/downloads directly to/from a GAE app.
Revised Answer (Using Blobstore API.)
Google provides the Blobstore API for handling larger files in GAE (up to 2GB). The overview documentation provides complete sample code. Your web form uploads the file to the blobstore; the Blobstore API then rewrites the POST back to your servlet, where you can do your transformation and save the transformed data back into the blobstore (as a new blob).
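To illustrate the flow on App Engine's PHP runtime (shown here for consistency with the rest of this page; the bucket name and handler path are made up):
require_once 'google/appengine/api/cloud_storage/CloudStorageTools.php';

use google\appengine\api\cloud_storage\CloudStorageTools;

// Blobstore stores the upload, then rewrites the POST to /transform_handler,
// where the blob can be read, transformed, and written back as a new object.
$upload_url = CloudStorageTools::createUploadUrl('/transform_handler', [
    'gs_bucket_name' => 'my-transform-bucket', // hypothetical bucket
]);

echo '<form action="' . $upload_url . '" method="POST" enctype="multipart/form-data">'
   . '<input type="file" name="file">'
   . '<input type="submit" value="Upload">'
   . '</form>';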
Original Answer (Didn't Consider Blobstore as an option.)
For downloading, I think the only GAE workaround would be to break the file up into multiple parts on the server and then reassemble it after downloading. That's probably not doable using a straight browser implementation, though.
(As an alternative design, perhaps you could send the transformed file from GAE to an external download location, such as S3, where it could be downloaded by the browser without GAE's limit restrictions. I don't believe GAE-initiated connections have the same request/response size limitations, but I'm not positive. Regardless, you would still be restricted by the 30-second maximum request time. To get around that, you'd have to look into GAE backend instances and come up with some sort of asynchronous download strategy.)
For uploading larger files, I've read about the possibility of using the HTML5 File API to slice the file into multiple chunks for uploading, and then reconstructing it on the server. Example: http://www.html5rocks.com/en/tutorials/file/dndfiles/#toc-slicing-files . However, I don't know how practical a solution that really is, due to changing specifications and browser capabilities.
You can use the blobstore to upload files as large as 2 gigabytes.
When uploading larger files, you can chunk the file into a series of smaller requests, each below the 32MB limit that Google App Engine currently supports.
Check this package with examples - https://github.com/pionl/laravel-chunk-upload
The following is working code which uses the above package.
View
<div id="resumable-drop" style="display: none">
    <p>
        <button id="resumable-browse" class="btn btn-outline-primary" data-url="{{route('AddAttachments', Crypt::encrypt($rpt->DRAFT_ID))}}" style="width: 100%; height: 91px;">Browse Report File..</button>
    </p>
</div>
JavaScript
<script>
    var $fileUpload = $('#resumable-browse');
    var $fileUploadDrop = $('#resumable-drop');
    var $uploadList = $("#file-upload-list");
    if ($fileUpload.length > 0 && $fileUploadDrop.length > 0) {
        var resumable = new Resumable({
            // Use a chunk size that is smaller than your maximum limit, due to a resumable issue
            // https://github.com/23/resumable.js/issues/51
            chunkSize: 1 * 1024 * 1024, // 1MB
            simultaneousUploads: 3,
            testChunks: false,
            throttleProgressCallbacks: 1,
            // Get the url from the data-url tag
            target: $fileUpload.data('url'),
            // Append token to the request - required for web routes
            query: {_token: $('input[name=_token]').val()}
        });
        // Resumable.js isn't supported, fall back on a different method
        if (!resumable.support) {
            $('#resumable-error').show();
        } else {
            // Show a place for dropping/selecting files
            $fileUploadDrop.show();
            resumable.assignDrop($fileUpload[0]);
            resumable.assignBrowse($fileUploadDrop[0]);
            // Handle file add event
            resumable.on('fileAdded', function (file) {
                $("#resumable-browse").hide();
                // Show progress bar
                $uploadList.show();
                // Show pause, hide resume
                $('.resumable-progress .progress-resume-link').hide();
                $('.resumable-progress .progress-pause-link').show();
                // Add the file to the list
                $uploadList.append('<li class="resumable-file-' + file.uniqueIdentifier + '">Uploading <span class="resumable-file-name"></span> <span class="resumable-file-progress"></span>');
                $('.resumable-file-' + file.uniqueIdentifier + ' .resumable-file-name').html(file.fileName);
                // Actually start the upload
                resumable.upload();
            });
            resumable.on('fileSuccess', function (file, message) {
                // Reflect that the file upload has completed
                location.reload();
            });
            resumable.on('fileError', function (file, message) {
                $("#resumable-browse").show();
                // Reflect that the file upload has resulted in error
                $('.resumable-file-' + file.uniqueIdentifier + ' .resumable-file-progress').html('(file could not be uploaded: ' + message + ')');
            });
            resumable.on('fileProgress', function (file) {
                // Handle progress for both the file and the overall upload
                $('.resumable-file-' + file.uniqueIdentifier + ' .resumable-file-progress').html(Math.floor(file.progress() * 100) + '%');
                $('.progress-bar').css({width: Math.floor(resumable.progress() * 100) + '%'});
            });
        }
    }
</script>
Controller
public function uploadAttachmentAsChunck(Request $request, $id) {
    // create the file receiver
    $receiver = new FileReceiver("file", $request, HandlerFactory::classFromRequest($request));
    // check if the upload is successful, throw an exception or return the response you need
    if ($receiver->isUploaded() === false) {
        throw new UploadMissingFileException();
    }
    // receive the file
    $save = $receiver->receive();
    // check if the upload has finished (in chunk mode it will send smaller files)
    if ($save->isFinished()) {
        // save the file and return any response you need; the current example uses `move`. If you are
        // not using move, you need to manually delete the file by unlink($save->getFile()->getPathname())
        $file = $save->getFile();
        $fileName = $this->createFilename($file);
        // Group files by mime type
        $mime = str_replace('/', '-', $file->getMimeType());
        // Group files by the date (week)
        $dateFolder = date("Y-m-W");
        $disk = Storage::disk('gcs');
        $gurl = $disk->put($fileName, $file);
        $draft = DB::table('draft')->where('DRAFT_ID', '=', Crypt::decrypt($id))->get()->first();
        $prvAttachments = DB::table('attachments')->where('ATTACHMENT_ID', '=', $draft->ATT_ID)->get();
        $seqId = sizeof($prvAttachments) + 1;
        // Save submission info
        DB::table('attachments')->insert(
            ['ATTACHMENT_ID' => $draft->ATT_ID,
             'SEQ_ID' => $seqId,
             'ATT_TITLE' => $fileName,
             'ATT_DESCRIPTION' => $fileName,
             'ATT_FILE' => $gurl
            ]
        );
        return response()->json([
            'path' => 'gc',
            'name' => $fileName,
            'mime_type' => $mime,
            'ff' => $gurl
        ]);
    }
    // we are in chunk mode, let's send the current progress
    /** @var AbstractHandler $handler */
    $handler = $save->handler();
    return response()->json([
        "done" => $handler->getPercentageDone(),
    ]);
}

/**
 * Create a unique filename for the uploaded file
 * @param UploadedFile $file
 * @return string
 */
protected function createFilename(UploadedFile $file)
{
    $extension = $file->getClientOriginalExtension();
    $filename = str_replace("." . $extension, "", $file->getClientOriginalName()); // filename without extension
    // Add a timestamp hash to the name of the file
    $filename .= "_" . md5(time()) . "." . $extension;
    return $filename;
}
You can also use the Blobstore API to upload files directly to Cloud Storage. Below is the link:
https://cloud.google.com/appengine/docs/python/blobstore/#Python_Using_the_Blobstore_API_with_Google_Cloud_Storage
upload_url = blobstore.create_upload_url(
    '/upload_handler',
    gs_bucket_name=YOUR.BUCKET_NAME)
template_values = {'upload_url': upload_url}
_jinjaEnvironment = jinjaEnvironment.JinjaClass.getJinjaEnvironemtVariable()
if _jinjaEnvironment:
    template = _jinjaEnvironment.get_template('import.html')
Then in index.html:
<form action="{{ upload_url }}"
      method="POST"
      enctype="multipart/form-data">
    Upload File:
    <input type="file" name="file">
    <input type="submit" value="Upload">
</form>