I'm not a programmer, just a regular user of Google Drive. I want to check whether my files were uploaded correctly. Right now I go through a whole process in the OAuth 2.0 Playground that lists all files and shows the MD5 checksums, but also a lot of other information per file. If I upload a new file, it's hard to find it in that output and verify its MD5 checksum.
Is there an easier way (through an app, maybe?) to show/list MD5 checksums for the uploaded files? I wonder why the Details pane doesn't show it and only lists the file size in bytes.
edit: NB these instructions have changed slightly for the v3 API
I've figured out a quick way to get the MD5 checksums of the files uploaded and decided to share it here, too. Log into your Google Drive account, then:
Visit: https://developers.google.com/drive/v3/reference/files/list
Scroll down to the Try it! section.
Change "Authorize requests using OAuth 2.0" from OFF to ON by clicking on it, then select:
https://www.googleapis.com/auth/drive.metadata.readonly
and click Authorize.
Choose your account then click Accept.
Fill in the fields field with:
for v2 API:
items(md5Checksum,originalFilename)
for v3 API:
open "Show standard parameters" in GUI to see fields than
files(md5Checksum,originalFilename)
to only get a list of filenames and MD5 checksums.
Click Execute and you'll get a list of all the files uploaded to Google Drive with their MD5 checksums.
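For what it's worth, the same request can be issued from code. Here's a minimal sketch using fetch; ACCESS_TOKEN is a placeholder for a token carrying the drive.metadata.readonly scope (not something the steps above hand you directly), and files.list only returns the first page of results unless you follow nextPageToken.

const ACCESS_TOKEN = "ya29...."; // hypothetical placeholder token
fetch("https://www.googleapis.com/drive/v3/files?fields=files(md5Checksum,originalFilename)", {
  headers: { Authorization: "Bearer " + ACCESS_TOKEN }
})
  .then(function (res) { return res.json(); })
  .then(function (data) {
    data.files.forEach(function (file) {
      // Same "checksum *filename" layout as the console snippet below.
      console.log(file.md5Checksum + " *" + file.originalFilename);
    });
  });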
API instructions
Google Developers - OAuth 2.0 Playground:
https://developers.google.com/oauthplayground/
Step 1: Select & authorize APIs:
Expand "Drive API v3".
Enable "https://www.googleapis.com/auth/drive.metadata.readonly".
Click "Authorize APIs".
Click "Allow".
Step 2: Exchange authorization code for tokens:
Click "Exchange authorization code for tokens".
Step 3: Configure request to API:
Enter the "Request URI".
Click "Send the request".
Request URI instructions
All files in folder
Get specific fields of files in a folder:
https://www.googleapis.com/drive/v3/files?q="folderId"+in+parents&fields=files(md5Checksum,+originalFilename)
Replace "folderId" with the folder ID.
You can use &fields=files(*) to get all of the file's fields.
Single file
Get specific fields of a file:
https://www.googleapis.com/drive/v3/files/fileId?fields=md5Checksum,+originalFilename
Replace "fileId" with the file ID.
You can use &fields=* to get all of the file's fields.
Parsing the JSON response
Open a JavaScript console.
Save the object into a variable.
Map the object.
Copy the result.
Code
var response = {
"files": [
{
"md5Checksum": "0cc175b9c0f1b6a831c399e269772661",
"originalFilename": "a.txt"
},
{
"md5Checksum": "92eb5ffee6ae2fec3ad71c777531578f",
"originalFilename": "b.txt"
}
]
};
// Build one "checksum *filename" line per file.
var result = response.files.map(function (file) {
  return file.md5Checksum + " *" + file.originalFilename;
}).join("\r\n");
console.log(result);
copy(result); // DevTools console helper: copies the string to the clipboard.
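As a side note, that checksum *filename layout is the same format md5sum -c accepts, so if you save the copied output to a file (say drive.md5) you should be able to verify local copies with md5sum -c drive.md5 (this assumes GNU coreutils is available).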
Here are three additional ways to list MD5 checksums.
Install Google skicka, a command-line tool for Google Drive, and run skicka ls -ll /
Although the readme says it's not an official Google product, it is hosted on Google's GitHub account, so I guess it can be trusted.
There is a plugin that lists all files with their checksums in a Google Drive spreadsheet.
Here's my Python 3 script that I created for myself. It's mostly copied from Google's official examples. You'll need to obtain a client_secret.json file and place it in the same directory as the script - here are the instructions for how to do it.
Based on Alex's answer above!
Click the link : https://developers.google.com/drive/v3/reference/files/list
Click the Try it now link in the middle.
(An active window appears in the middle)
Scroll down the left pane in the active window.
Under the fields section in the left pane, fill in:
files(md5Checksum,originalFilename)
Now we will limit the access scopes:
(i) Leave Google OAuth 2.0 selected & clear the box against API key.
(ii) Expand Show scopes under Google OAuth 2.0.
(iii) Clear all the scopes but keep this one selected:
https://www.googleapis.com/auth/drive.metadata.readonly
Now click EXECUTE in blue.
(A new Google Sign In Window will open)
Use that window to sign in with the respective Google account & click Allow to permit the Google APIs Explorer to access files in your Google Drive.
It's done! A new window will open with the results in the lower-right code pane. It will provide the names & md5Checksums for all the files in the respective Google Drive account.
Click outside of the active window to close it & close the Google Drive API tab. Now you can sign out of the Google account if you want!
I combined XP1's and Alex's guides above to work in my scenario: listing MD5 checksums for private folders that are shared with me. The extra request parameters are:
-includeItemsFromAllDrives
-includeTeamDriveItems
-supportsAllDrives
-supportsTeamDrives
Request URI in OAuth 2.0 Playground
https://www.googleapis.com/drive/v3/files?q="folderID"+in+parents&includeItemsFromAllDrives=true&includeTeamDriveItems=true&supportsAllDrives=true&supportsTeamDrives=true&fields=files(md5Checksum%2CoriginalFilename)
Related
I'm looking to build the URL by adding a path, something like the one below, in Google Apps Script:
https://script.google.com/macros/s/APP_ID/exec/fileName.txt
How can I achieve this with the Web Apps service?
I believe your goal is as follows.
You want to access Web Apps using the URL https://script.google.com/macros/s/APP_ID/exec/fileName.txt.
I think that you can achieve your goal using Web Apps. As a sample case, I would like to explain this using a sample script that downloads a text file when a user accesses https://script.google.com/macros/s/APP_ID/exec/fileName.txt.
Usage:
Please follow this flow.
1. Create a new project of Google Apps Script.
The sample Web Apps script is Google Apps Script, so please create a Google Apps Script project.
If you want to create it directly, please access https://script.new/. In this case, if you are not logged in to Google, the login screen opens, so please log in to Google. By this, the script editor of Google Apps Script is opened.
2. Prepare script.
Please copy and paste the following Google Apps Script into the script editor. This script is for the Web Apps.
function doGet(e) {
  // e.pathInfo holds everything after ".../exec/" in the request URL.
  const path = e.pathInfo;
  if (path == "fileName.txt") {
    const sampleTextData = "sample";
    // Return the text and instruct the browser to download it as a file.
    return ContentService.createTextOutput(sampleTextData).downloadAsFile(path);
  }
  return ContentService.createTextOutput("Wrong path.");
}
In order to retrieve the value fileName.txt from https://script.google.com/macros/s/APP_ID/exec/fileName.txt, please use pathInfo.
For example, when you check e of doGet(e) by accessing https://script.google.com/macros/s/APP_ID/exec/fileName.txt, you can retrieve {"contextPath":"","contentLength":-1,"parameter":{},"parameters":{},"queryString":"","pathInfo":"fileName.txt"}.
In this case, the GET method is used.
3. Deploy Web Apps.
On the script editor, open a dialog box via "Publish" -> "Deploy as web app".
Select "Me" for "Execute the app as:".
By this, the script is run as the owner.
Select "Anyone, even anonymous" for "Who has access to the app:".
In this case, no access token is required for the request. I recommend this setting for your goal.
Of course, you can also use an access token. In that case, please set this to "Anyone", and include the scopes https://www.googleapis.com/auth/drive.readonly and https://www.googleapis.com/auth/drive in the access token. These scopes are required to access the Web Apps.
Click "Deploy" button as new "Project version".
Automatically open a dialog box of "Authorization required".
Click "Review Permissions".
Select own account.
Click "Advanced" at "This app isn't verified".
Click "Go to ### project name ###(unsafe)"
Click "Allow" button.
Click "OK".
Copy the URL of Web Apps. It's like https://script.google.com/macros/s/###/exec.
4. Run the function using Web Apps.
Please access https://script.google.com/macros/s/###/exec/fileName.txt using your browser. By this, a text file is downloaded.
Note:
When you modify the script of the Web Apps, please redeploy the Web Apps as a new version. By this, the latest script is reflected in the Web Apps. Please be careful about this.
References:
Web Apps
Taking advantage of Web Apps with Google Apps Script
Updated on February 14, 2023
At the current stage, it seems that pathInfo can be used with an access token. Suppose that the following sample script is used.
function doGet(e) {
  return ContentService.createTextOutput(JSON.stringify(e));
}
When you log in to your Google account and access https://script.google.com/macros/s/###/exec/sample.txt with your browser, {"contextPath":"","parameter":{},"pathInfo":"sample.txt","contentLength":-1,"parameters":{},"queryString":""} can be seen.
In this case, when you access it without logging in to a Google account, the login screen is opened even when the Web Apps is deployed as "Execute as: Me" and "Who has access to the app: Anyone". Please be careful about this.
And if you want to access https://script.google.com/macros/s/###/exec/sample.txt from a script, please request it by including the access token. A sample curl command is as follows. In this case, the access token can be used as a query parameter. Please include one of the scopes of the Drive API in the access token.
curl -L "https://script.google.com/macros/s/###/exec/sample.txt?access_token=###"
By this, the following result is returned.
{"contextPath":"","queryString":"access_token=###"},"pathInfo":"sample.txt","parameters":{"access_token":["###"]},"contentLength":-1}
I am storing my files in Google Cloud Storage and I would like to provide downloadable links. For example, https://yeketakclub.storage.googleapis.com/audios/yeketak.club-dm7aEYv7R53JRlti3HHn.mp3 is one of the audio files stored in Google Cloud Storage. But when it is clicked, the browser tries to open it. Is it possible to force a download?
If you don't want to edit the file's metadata, neither of the other answers is needed! Just add this to the end of any signed URL:
&response-content-disposition=attachment;
This will make all storage links force a download instead of opening.
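In case you're generating signed URLs from code, here's a minimal sketch with the Node.js client library (@google-cloud/storage), assuming default application credentials; the bucket and object names are just the ones from the question.

const { Storage } = require("@google-cloud/storage");

const storage = new Storage();

async function getDownloadUrl() {
  const [url] = await storage
    .bucket("yeketakclub")
    .file("audios/yeketak.club-dm7aEYv7R53JRlti3HHn.mp3")
    .getSignedUrl({
      version: "v4",
      action: "read",
      expires: Date.now() + 15 * 60 * 1000, // valid for 15 minutes
      responseDisposition: "attachment",    // forces download without editing metadata
    });
  console.log(url);
}

getDownloadUrl();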
You can signal browsers to download the object while still retaining an accurate content type by setting the content disposition to attachment. For example, using gsutil you can do this like so:
gsutil setmeta -h 'Content-Disposition:attachment' gs://yeketakclub/audios/yeketak.club-dm7aEYv7R53JRlti3HHn.mp3
Now your object can still have the correct content type of "audio/mpeg3" (or whatever happens to match the object's content).
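The equivalent of that gsutil command with the Node.js client library (@google-cloud/storage) might look like the sketch below; this also assumes default application credentials.

const { Storage } = require("@google-cloud/storage");

new Storage()
  .bucket("yeketakclub")
  .file("audios/yeketak.club-dm7aEYv7R53JRlti3HHn.mp3")
  .setMetadata({ contentDisposition: "attachment" }) // same change as gsutil setmeta
  .then(() => console.log("Content-Disposition set to attachment"));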
I'm not sure this is necessarily a Google Cloud Storage issue (I might be wrong). The link provided there is downloadable; it just happens that your browser "prefers" to play it, most probably because it recognises the MIME type as one it can handle.
In Chrome, for instance, you can force a download of the file by using Alt + click.
Or you can right-click and "Save link as...".
In the Google Console bucket area, you can open the object's menu, choose "Edit metadata", and set the Content-Disposition to attachment:
edit metadata -> set Content-Disposition = attachment
I have the Google Picker set up, as well as the Blobstore. I'm able to upload files from my local machine to the Blobstore, and now that I have the Picker set up and working, I don't know how to use the info it returns (URL? file ID?) to load the selected file into the Blobstore. Any tips on how to do this? I haven't been able to find much of anything on it in Google's resources.
There isn't a direct link between the Google Picker and the App Engine Blobstore. They are kind of different tools for different jobs. The Google Picker is designed as an end-user tool to select data from a user's Google account. It just so happens that the Picker also provides an upload interface (to Google Drive). The Blobstore, on the other hand, is designed as a blob storage mechanism for your App Engine application.
In theory, you could write a script to connect the two, but there are a few considerations:
Your app would need access to the user's Google Drive account using OAuth2. This is necessary because the Picker API is a client-side API, whereas the Blobstore API is a server-side API. You would need to send the selected document URL to the server, then download the document, and finally save it to the Blobstore.
Unless you then deleted the data from Drive (very risky due to point 3), your data would be persisted in two places
You cannot know for sure whether the user selected an existing file or uploaded a new one
Not a great user experience - the user thinks they are uploading to Drive
In essence, this sounds like a bad idea! What is your use case?
#Gwyn - I don't have enough reputation to add a comment to your solution, but I had an idea about problem #3: You cannot know for sure if the user selected an existing file, or uploaded a new one
Would it be possible to use Response.VIEW to see what view they were using when the file was selected? If you have one view constructor for Drive files and one for Upload files, something like
var driveView = new google.picker.View(google.picker.ViewId.DOCS);
var uploadView = new google.picker.DocsUploadView();
would that allow you to know whether the file was a new upload (safe to delete) or an existing file (leave it alone)?
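Not an authoritative answer, but here's a minimal sketch of that idea. It assumes the Picker library is already loaded and that OAUTH_TOKEN holds a valid token obtained elsewhere; the exact shape of the Response.VIEW value (an array whose first element is the view ID) is my assumption and worth verifying against the Picker docs.

var OAUTH_TOKEN = "ya29...."; // hypothetical token obtained elsewhere

var driveView = new google.picker.View(google.picker.ViewId.DOCS);
var uploadView = new google.picker.DocsUploadView();

var picker = new google.picker.PickerBuilder()
  .addView(driveView)
  .addView(uploadView)
  .setOAuthToken(OAUTH_TOKEN)
  .setCallback(function (data) {
    if (data[google.picker.Response.ACTION] === google.picker.Action.PICKED) {
      // Assumption: Response.VIEW is the "view token", whose first
      // element identifies the view the user picked from.
      var viewId = data[google.picker.Response.VIEW][0];
      var doc = data[google.picker.Response.DOCUMENTS][0];
      console.log(doc.id + " picked via view: " + viewId);
    }
  })
  .build();
picker.setVisible(true);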
Assuming that you want to pick a file from your own Google Drive and move it to the Blobstore:
1) First you have to perform OAuth for the Google Drive API.
2) When you select a file from Drive using the Picker, you need to get its ID.
3) Using the ID obtained in step 2, you can programmatically download it using the Drive API.
4) After downloading the file, you can use the FileService (deprecated though) to upload it to the Blobstore.
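If it helps, here's a minimal sketch of step 3 using the Drive v3 REST endpoint (my substitution; the answer itself doesn't name an API version). FILE_ID and ACCESS_TOKEN are placeholders you'd get from the Picker and your OAuth flow respectively.

const FILE_ID = "1abc...";       // hypothetical, from the Picker response
const ACCESS_TOKEN = "ya29...."; // hypothetical OAuth token from step 1

fetch("https://www.googleapis.com/drive/v3/files/" + FILE_ID + "?alt=media", {
  headers: { Authorization: "Bearer " + ACCESS_TOKEN }
})
  .then(function (res) { return res.arrayBuffer(); })
  .then(function (bytes) {
    // From here you would write the bytes to your own storage
    // (the original answer used the now-deprecated Blobstore FileService).
    console.log("Downloaded " + bytes.byteLength + " bytes");
  });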
I want to tell you about the malware attack on my Drupal website, not just for your suggestions but also to create something helpful for anybody who could suffer from the same problems. Well...
INITIAL SETUP
Drupal 7.9
Activated modules:
CORE: Block, Contextual links, Database logging, Field, Field SQL storage, Field UI, File, Filter, Image, List, Locale, Menu, Node, Number, Options, Overlay, Path, PHP Filter, RDF, System, Taxonomy, Text, Toolbar, User
CCK: Multiselect
CHAOS TOOL SUITE: Chaos tools
DATE/TIME: Calendar, Date, Date API, Date Popup, Date views
FIELDS: Email, Field permission, Link
OTHER: Google Plus One +1, Pathauto, Token, Weight
SHARING: Share this, Share this block
TAXONOMY MENU: Taxonomy menu
VIEWS: Views, Views PDF Display, Views PHP, Views UI
OTHER MODULES THAT I REMOVED: CKEDITOR, VIEWS_SLIDESHOW, IMCE, DOMPDF, PRINT, WYSIWIG
MY SETUP ERRORS
In order to satisfy the customer, I modified some of the modules and never updated them (OUCH!)
The customer was in possession of the login data, and maybe his computer wasn't safe (HMM...)
I didn't have a copy of the website, because I relied on the provider's weekly backup (DOH!)
ATTACK EXTERNAL SYMPTOMS
All the links on the homepage redirected to a malware website
Google blacklisted the website
Critical alert on the Google Webmaster Tools panel
FTP SYMPTOMS
Lots of "strange" files: mainma3.php (I found this one in every folder!), functoins.php, sum75.html, wlc.html, aol.zip, chase.zip, chaseverification.zip, 501830549263.php, wp-conf.php and a dozen of wtmXXXXn.php (dove X = numero) in the root folder. All these files was plenty of malicious functions (unescape, base64_decode, eval, etc.)
Install.php was modified with a long line of malicious code
This line of code was appended to EVERY JavaScript file:
;document.write('');
The weekly backup was also infected
Dozen of repeated "strange" request, found on the Drupal log panel (my domain is obscured with the string "-----"):
index.php?q=ckeditor/xss > Notice: Undefined offset: 5 in eval() (linea 29 di /web/htdocs/-----/home/modules/php/php.module(74) : eval()'d code(1) : eval()'d code).
-----/user?destination=node/add > Failed login by shadowke
calendar/week/2012-W19?year=2011&mini=2012-12 > page not found
misc/]};P.optgroup=P.option;P.tbody=P.tfoot=P.colgroup=P.caption=P.thead;P.th=P.td;if(!c.support.htmlSerialize)P._default=[1, > page not found
misc/)h.html(f?c( > page not found
mail.htm > page not found
RECOVERY [Thanks to this article: http://25yearsofprogramming.com/blog/20070705.htm]
I've put the website in maintenance mode (error503.php + .htaccess), with traffic open just for my IP address
[see this useful guide: http://25yearsofprogramming.com/blog/20070704.htm]
I've downloaded the whole website locally
I've searched for and removed the strange files > I found forty of them
I've searched the files for these words [with the freeware AGENT RANSACK]: eval(base64_decode($POST["php"])), eval(, eval (, base64, document.write, iframe, unescape, var div_colors, var _0x, CoreLibrariesHandler, pingnow, serchbot, km0ae9gr6m, c3284d, upd.php, timthumb (see the Node.js sketch after this list) > I've acted in one of the following ways: a) I've replaced eval with php_eval() (the eval-safe version in Drupal); b) I've written down the suspected modules; c) I've compared the code with a freshly downloaded copy of the module; d) I've removed all the malicious code (see the JavaScript mentioned above)
I've searched for changes in the file system [with the freeware WINMERGE]
I've identified some suspected modules, thanks to the list written at point 4 above, and thanks to some research on Google (name_of_the_module security issue, name_of_the_module hacked, etc...) and on Secunia [http://secunia.com/community/advisories/search]
I've scanned my computer (Avast, Search&Destroy, Malwarebytes Antimalware) > I didn't find any viruses or spyware
I've changed all the logins (ftp, cpanel, drupal admin panel)
I've re-uploaded the whole website
I've removed all the suspected modules: CKEDITOR, VIEWS_SLIDESHOW, PRINT, DOMPDF, IMCE, CAPTCHA, WYSIWIG, WEBFORM.
I've told the whole story to the provider's support
I've requested a review from Google (they did it in 12 hours)
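For what it's worth, here's a rough Node.js sketch of the word search in point 4, assuming the site has been downloaded to a local folder; the pattern list is a small illustrative subset of the strings above, not a complete scanner.

const fs = require("fs");
const path = require("path");

// Illustrative subset of the suspicious strings listed in point 4.
const patterns = ["eval(base64_decode", "document.write", "unescape(", "var _0x", "timthumb"];

function scan(dir) {
  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      scan(full);
    } else if (/\.(php|js|html?)$/i.test(entry.name)) {
      const text = fs.readFileSync(full, "utf8");
      for (const p of patterns) {
        if (text.includes(p)) console.log(full + ': contains "' + p + '"');
      }
    }
  }
}

scan("./website-local-copy"); // hypothetical path to the downloaded site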
DRUPAL LOG NOW
Dozens of these messages:
- wtm4698n.php?showimg=1&cookies=1 > page not found
- fhd42i3d.html > page not found
- wp-conf.php?t2471n=1 > page not found
- -----/user?destination=node/add > Failed login by Elovogue
LESSONS LEARNED
Never modify the modules, so that you can keep them updated
Keep all the logins on a safe computer / use a safe computer to work over FTP
Search for any security issues before installing a module
Keep a clean copy of the website somewhere
MY QUESTIONS:
What kind of attack have I received?
Are there other insecure modules in my installation?
What else can I do?
Thanks to everybody for your patience!
If you are using M$ Windows, I think it is a trojan/virus that steals your FTP passwords and automatically edits files. I know many such stories.
Switch to WinSCP.net.
I'm trying to use the "Copy to another app" feature of AppEngine and keep getting an error:
Fetch to http://datastore-admin.moo.appspot.com/_ah/remote_api failed with status 302
This is for a Java app but I followed the instructions on setting up a default Python runtime.
I'm 95% sure it's an authentication issue and the call to remote_api is redirecting to the Google login page. Both apps use Google Apps as the authentication mechanism. I've also tried copying to and from a third app we have which uses Google Accounts for authentication.
Notes:
The user account I log in with is an Owner on all three apps. It's a Google Apps account (if that wasn't obvious).
I have a Gmail account that is an Owner on all three apps as well. When I log in to the admin console with it and click Datastore Admin, I don't see the Datastore Admin console at all.
I'm able to use the remote_api just fine from the command-line after I enter my details
Tried with both the Python remote_api built-in and the Java one.
I've found similar questions/blog posts about this, one of which required logging in from a browser, then manually submitting the ACSID cookie you get after that's done. Can't do that here, obviously.
OK, I think I got this working.
I'll refer to the two appIDs as "source" and "dest".
To enable datastore admin (as you know) you need to upload a Python project with the app.yaml and appengine_config.py files as described in the docs.
Either I misread the docs or there is an error. The "appID" in the .yaml should be the app ID you are uploading to in order to enable DS admin.
The other appID in the appengine_config file, specifically this line:
remoteapi_CUSTOM_ENVIRONMENT_AUTHENTICATION = (
    'HTTP_X_APPENGINE_INBOUND_APPID', ['appID'])
should be the appID of the "source", i.e. the app ID where the data is coming from in the DS copy operation.
I think this line is what allows the source appID to be authenticated as having permissions to write to the "dest" app ID.
So, I changed that .py and uploaded it again to my "dest" app ID. To be sure, I made this dummy Python app the default version and left it like that.
Then on the source app ID I tried the DS copy again, and all the copy jobs were kicked off OK - so it seems to have fixed it.