We have created two microservices: one in Node.js that uploads images to a Google Cloud Storage bucket, and one in Python that returns the serving_url for those images. Both services work. Almost...
Our problem is that if the file in the bucket does not have the correct ACL, we are not allowed to get the serving_url() from the image service.
We are uploading images from the frontend with a resumable upload link, generated in the Node.js file service:
createResumableUpload: async (filename, metadata) => {
  // bucket.file() is synchronous, so no await is needed here
  const file = bucket.file(filename);
  // Resolves to [sessionUri]; the frontend uploads to that URI
  const upload = await file.createResumableUpload({
    metadata,
    public: true, // applies the publicRead predefined ACL
  });
  return upload;
},
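For context, the frontend then PUTs the file to the session URI this resolves with; roughly like this (a sketch run in an async context; imageFile is a placeholder File/Blob):
// Sketch: createResumableUpload resolves to [sessionUri]; the browser
// PUTs the whole file there in one request (chunked uploads would need
// Content-Range headers).
const [sessionUri] = upload;
const response = await fetch(sessionUri, {
  method: 'PUT',
  headers: { 'Content-Type': imageFile.type }, // imageFile is a File/Blob
  body: imageFile,
});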
After inspecting the file in the bucket, we do see allUsers with READER permission (see the image below), but the transformation on the image service still throws an exception due to invalid permissions.
If we instead upload an image directly through the bucket interface, it gets some additional permissions. After some testing, we found that the permission we are looking for is the project editors entry, the second one in the list:
We have tried a lot, both setting the ACL when we create the resumable upload and setting it on the file after creating it, but nothing works. We would really appreciate it if someone could help us set the correct ACL on the file so that we can get the serving_url() from the image service.
[EDIT]
To answer some of the questions: we are 100% certain this is a permission issue. The image service works fine if we add the editors permission, but we need to be able to add this permission when we add images to the bucket. And this is my question:
How do we grant the OWNER permission to the project editors when we upload images to the bucket from our Node.js service?
The Images API uses the App Engine default service account (YOUR_PROJECT_ID@appspot.gserviceaccount.com). This service account does not need to be project owner for get_serving_url() to work as long as you grant it the Storage Object Admin role on the bucket. Alternatively, you can update the object ACLs, granting object owner to the service account.
In order to add the necessary permissions, you can send a PATCH request to the JSON API with a JSON body that defines all the ACLs you wish the object to have.
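For example, a request along these lines (a rough sketch from Node.js; BUCKET, OBJECT, PROJECT_NUMBER and accessToken are placeholders you would supply):
// Sketch: replace the object's ACL list via the GCS JSON API.
const res = await fetch(
  `https://storage.googleapis.com/storage/v1/b/${BUCKET}/o/${encodeURIComponent(OBJECT)}`,
  {
    method: 'PATCH',
    headers: {
      Authorization: `Bearer ${accessToken}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      acl: [
        { entity: `project-editors-${PROJECT_NUMBER}`, role: 'OWNER' },
        { entity: 'allUsers', role: 'READER' },
      ],
    }),
  }
);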
UPDATE:
A feature request has been created on your behalf to make this functionality available in the Node.js client library. Please star it so that you receive updates about it, and do not hesitate to add comments with details of the desired implementation. You can track the feature request by following this link.
After many hours of trying to figure out the Google ACL system, I would like to share how I was able to set up ACLs from Node.js.
After an image file is uploaded to the bucket, it needs to have the project owner ACL added. If the image file does not have this ACL, an exception is thrown when generating the serving_url().
To update ACL I used this function:
updateAcl: async (name: string) => {
  // bucket.file() is synchronous, so no await is needed here
  const file = bucket.file(name);
  // Grant the OWNER role to the project editors group;
  // the number is the GCP project number (use your own)
  const acl = await file.acl.add({
    role: 'OWNER',
    entity: 'project-editors-231929353801'
  });
  return acl;
}
From the storage panel, it's possible to get a list of all the default ACLs.
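They can also be listed programmatically from Node.js; a small sketch, assuming the same bucket object as above:
// Sketch: list the bucket's default object ACLs with the Node.js client.
const [defaultAcls] = await bucket.acl.default.get();
for (const { entity, role } of defaultAcls) {
  console.log(entity, role); // e.g. project-editors-<number> OWNER
}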
Related
I'm currently working on a small project using Firebase. My team member is working on iOS and Android, while I'm trying to build a custom admin page using React.
In the app, users can sign up with their phone and send a request for permission by attaching a few documents.
I have to build an admin page to approve or deny these documents. For that I need to get a list of all users from the User collection, view all the documents that were submitted, and be able to update the user field 'isApproved' to true or false.
I was thinking of simply creating a new admin account directly in Firebase and using that account to sign in to the admin page and perform the actions above (manipulating normal users' info fields). But then I found out about the Firebase Admin SDK. Do I need to use it in my case?
I may also need to send push notifications to all signed-up users and create, update, and delete user accounts later on.
Given the situation, should I use the Firebase Admin SDK?
Can someone give me advice on how to set up the overall structure?
First things first: you should not use the Admin SDK on the frontend. The Admin SDK has privileged access to all Firebase resources and does not follow any security rules. You should always use the Admin SDK in a secure environment like Firebase Cloud Functions or your own server.
I am not entirely sure what actions you need to perform while accepting/rejecting the documents. If you only need to read/write a specific part of the database (which only an admin can access), then you can use Firebase security rules. You would have to add a custom claim to the admin user or store their UID in the database (a sketch of setting such a claim follows below).
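Setting that custom claim is itself an Admin SDK call, run from a trusted environment such as a one-off script or another Cloud Function. A minimal sketch ('some-admin-uid' is a placeholder):
// Sketch: grant the `admin` custom claim from a trusted environment.
const admin = require('firebase-admin');
admin.initializeApp();

// The claim appears in the user's ID token after their next
// sign-in or token refresh.
admin.auth()
  .setCustomUserClaims('some-admin-uid', { admin: true })
  .then(() => console.log('admin claim set'));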
But if you need to do multiple things (maybe sending an email to the user, or calling a 3rd-party API), I'd recommend using Cloud Functions with the Admin SDK.
How will that work?
You will have to create a Cloud Function to accept/reject the documents.
When the admin accepts/rejects a document, you can pass the details of that user (userID, document info, and whether the docs were accepted) to the cloud function and process it there.
The callable function may look like:
exports.verifyDocs = functions.https.onCall((data, context) => {
  // context.auth is undefined for unauthenticated callers, so check it first
  if (!context.auth) return "Unauthorized";
  const { token } = context.auth;
  if (!token.admin) return "Forbidden";
  // The user is an admin:
  // do the database updates and call any third-party APIs here
});
If you use callable functions, Firebase automatically adds the auth info of the user calling the function. In the example above I've assumed the user has an admin custom claim, but if you want to keep things simple based on UIDs, you can do so like this:
const adminUIDs = ["uid1", "uid2"];
if (!adminUIDs.includes(context.auth.uid)) return "Forbidden";
To call the function from your React app:
const verifyDocs = firebase.functions().httpsCallable('verifyDocs');
verifyDocs({ userID: "userID", text: messageText })
  .then((result) => {
    // Read the result of the Cloud Function.
  });
Anything you pass to the function above will be available in your cloud function's data parameter.
I'm trying to upload files directly from the browser to GCS using signed URLs. I'm generating a v4 signed URL from an App Engine Standard PHP application and that seems to be working fine. The problem is when I try to PUT to that URL I get a 403 with the following XML response:
<?xml version='1.0' encoding='UTF-8'?>
<Error>
  <Code>AccessDenied</Code>
  <Message>Access denied.</Message>
  <Details>Anonymous caller does not have storage.objects.create access to <bucket-name>/some-object.txt.</Details>
</Error>
My App Engine service account has Service Account Token Creator, which enabled the URL to be created.
I've enabled CORS on the bucket to accept PUT from any origin (*), which allowed me to get to where I am now.
I've switched from v2 URLs to v4, as an issue on the Go SDK suggested that was a problem.
I'm generating the signed URL using the PHP Google Cloud library like so:
$storage = new StorageClient();
$bucket = $storage->bucket('<bucket-name>');
$object = $bucket->object('some-object.txt');
// v4 signed URL, valid until tomorrow
$url = $object->signedUploadUrl(new \DateTime('tomorrow'), ['version' => 'v4']);
I've tried adding the service account to the bucket's permissions with Storage Object Admin, Storage Object Creator, etc., but nothing seems to get me past this 403 (apart from opening it up to allUsers).
In this article it says:
In addition, within Cloud Storage, you need to grant the following permissions to generate a Signed URL.
storage.buckets.get
storage.objects.create
storage.objects.delete
But I just can't work out which role they need to be added to.
At this point, I think there is one of two possibilities:
The signed URL is not actually working because it should be authenticated as the service account and not anonymous. In this case, what could be causing this?
Signed URLs authenticate as some type of anonymous role but that role does not have the permissions. In which case, how do I add the permissions for that role (allUsers is obviously wrong)?
SOLVED:
There were a number of things wrong with my implementation:
As suggested by Brandon & Charles below, signedUploadUrl is not appropriate for a direct PUT. To get around this, I needed to use beginSignedUploadSession.
As suggested by John below, I needed Storage Object Creator added to the service account user. This, however, is already granted on a GAE default service account, since it is a Project Editor.
Service Account Token Creator needs to be explicitly added to the service account, as Project Editor doesn't seem to cover it.
I was embedding the URL into JavaScript with Twig using const url = '{{ upload_url }}';, but Twig automatically HTML-encodes variables, which was breaking the URL. Instead, I needed to use {{ upload_url|raw }}. This broken format was the reason the message said Anonymous caller (see the sketch below).
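Putting points 1 and 4 together, the working client-side upload looked roughly like this (a sketch; the file input selector is a placeholder):
// Sketch: the URL must reach JS unescaped (|raw), then the file is PUT to it.
const url = '{{ upload_url|raw }}'; // without |raw, Twig HTML-encodes the URL
const fileInput = document.querySelector('input[type="file"]'); // placeholder picker

async function uploadFile() {
  const response = await fetch(url, {
    method: 'PUT',
    headers: { 'Content-Type': 'text/plain' },
    body: fileInput.files[0],
  });
  if (!response.ok) throw new Error(`Upload failed: ${response.status}`);
}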
signedUploadUrl creates a URL for the POST HTTP method (see the library source code at https://github.com/googleapis/google-cloud-php/blob/master/Storage/src/StorageObject.php). You are using that signed URL for a PUT request, so the request isn't permitted. The error message does not show this as the problem, but I think that's what it really is.
You can either look into how to upload a file via POST, or create a signed URL for PUT. I've done the latter in Python, but I don't see a way to do it with this library. I'm not a PHP programmer so I might be missing it.
Or you could create your own code to create a signed URL for PUT, starting with the library code as an example. Signed URLs are extremely tricky to get exactly right, and creating your own code will probably be frustrating. It was for me in Python.
I'm creating a dating React web app where users can upload pictures of themselves to their user profile. I want to use Firebase Storage, and I want to protect the images so that they are only viewable by authenticated users accessing from my web app. Right now I get an image like this:
let storageRef = firebase.storage().ref(`images/${userid}/${userImageName}`);
// Get the download URL
storageRef.getDownloadURL().then(function(url) {
  // Insert url into an <img> tag to "download" it
});
This is great - but once I put the URL in the src attribute of the image tag, anyone who views the source code can copy the URL and send it via email, text message, etc., making it "public". I have also tried uploading images as base64 strings using the putString() function, only for Firebase Storage to yet again create a URL for them, just like a normal image upload with the put() function.
So my question is: can I use Firebase Storage to store images and make them "private" so that only authenticated users of my web app are able to view them? Is there a way to get only the image data and use it to generate the images on the frontend/client, so that no actual URLs are ever placed in the JS code?
The call to getDownloadURL() is protected by Security Rules, which means that if you write a rule like:
service firebase.storage {
  match /b/{bucket}/o {
    match /images/{userId}/{userImageName} {
      // any authenticated user can read the bytes or get a download URL
      allow read: if request.auth != null;
      // only the given user can upload their photo
      allow write: if request.auth.uid == userId;
    }
  }
}
With rules like these in place, unauthenticated users will not be able to read the files or obtain download URLs.
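Since getDownloadURL() is now gated on auth, the client should wait for the sign-in state before requesting it; a rough sketch using the v8-style SDK from the question:
// Sketch: request the download URL only once the user is signed in,
// otherwise the rules above reject the call.
firebase.auth().onAuthStateChanged((user) => {
  if (!user) return; // not signed in: getDownloadURL() would fail
  firebase.storage()
    .ref(`images/${userid}/${userImageName}`)
    .getDownloadURL()
    .then((url) => {
      // set the <img> src to `url`
    });
});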
As for the second issue: once someone can see a file, assume they have already downloaded or screenshotted it and can share it, even if the URL isn't publicly accessible (or even on the page). Viewing something is equivalent to downloading it, so it makes no real difference where it comes from; the end result is the same.
I'm currently using the Dropbox client JS script to push zip files to a folder (in test, a couple of KB; in production, a couple of hundred MB). There currently isn't a server/back end, so I am posting from an ArrayBuffer, not a server URL.
var zip = new JSZip();
zip.file("test.txt", "Hello World\n");
var content = zip.generate({type:"arraybuffer"});
// ... code to pick a dropbox folder ...//
client.writeFile(url+"/"+fileName, content, function(error){ ... etc
This all works fine: the client is able to write the binary file (which Dropbox's own Saver is unfortunately unable to do). I'm trying to see if Kloudless can do the same, since I also need to support Google, Box, etc. at some point. The documentation for https://github.com/kloudless/file-explorer/ says the saver's files option is an array of URLs:
explorer({
  ...
  files: [{
    "url": "http://<your image url>",
    "name": "filename.extension"
  }],
  ...
});
It doesn't seem to like local file references created with URL.createObjectURL(blob), so I'm guessing the API tells the remote services to pull the files rather than having the client push their data.
You are correct that the Kloudless API backend servers stream the file from the URL to the final destination in whichever cloud service you would like the file to be uploaded to (e.g. a folder in a Dropbox account).
If the files are present only on the client side, I would recommend using the Kloudless Chooser to instead prompt the user to choose a folder to save the files in, and then handle uploading the file data to that destination manually from the client side.
To do this, refer to this example configuration: https://jsfiddle.net/PB565/139/embedded/
I have set retrieve_tokens to true so that my client-side JavaScript will receive not just the metadata of the folder the user chooses to upload the data to but also the Bearer token to use to gain access to the user's account. This allows the client-side JavaScript to then make upload or multipart upload requests to Kloudless to upload the file data to that folder. The advantage of multipart uploads is that an error uploading one chunk wouldn't require the whole upload to be retried.
Be sure to add the domain you are hosting the File Explorer on to your Kloudless App's Trusted Domains (on the App Details page) so that it can in fact receive the Bearer token in the response JS callback. In my JSFiddle example, I would have to add 'fiddle.jshell.net' to my app's list of Trusted Domains to be able to receive the Bearer token to perform further requests from the client side to the Kloudless API.
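As a rough illustration of that manual client-side upload once the user has picked a folder (a sketch only: the endpoint and headers are my assumptions about the Kloudless upload API, so double-check the current API docs):
// Sketch: upload the ArrayBuffer to the chosen folder via the Kloudless API.
// accountId, bearerToken, folderId and fileName come from the Chooser
// callback; the endpoint and headers here are assumptions.
const res = await fetch(
  `https://api.kloudless.com/v1/accounts/${accountId}/storage/files/`,
  {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${bearerToken}`,
      'X-Kloudless-Metadata': JSON.stringify({ name: fileName, parent_id: folderId }),
      'Content-Type': 'application/octet-stream',
    },
    body: content, // the JSZip ArrayBuffer from the question
  }
);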
I have a Google Cloud Endpoint which I access from an HTML page through JavaScript and the Google JavaScript client Library.
I authenticate with OAuth2.0 by using the standard
gapi.auth.authorize({client_id: CLIENT_ID, scope: SCOPES, immediate: mode}, callback);
Everything works correctly and I am able to read/write data from/to the underlying Datastore.
In the same AppEngine project I have a servlet that generates a PDF based on data that is in the Datastore.
I would like to be able to call this servlet from my HTML page as the same user that was authenticated through the gapi.auth.authorize() method.
And in the servlet, get the User through
UserService userService = UserServiceFactory.getUserService();
and query the datastore for the data of this user and then generate a PDF showing this data.
I have no idea how to call this URL (servlet) with the credentials of the OAuth-authenticated user.
Can you help me, please?
Thanks in advance!
Note that the same question was asked some months ago but without a "complete" answer: GAE User API with OAuth2
You should look into bucket/object ACLs. When your API endpoint gets the User object, it can use the user's email to set the ACL on the generated PDF. That way, you can serve the PDF file to the user simply via its URL. You could also check with an Endpoints API call whether the user is indeed authenticated as the person allowed to access the requested PDF (having stored a parallel Datastore entry, perhaps), and generate a signed URL once this is confirmed; a sketch of the ACL grant follows below.
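For illustration, the per-user ACL grant can be a single call to the JSON API's ObjectAccessControls endpoint. A sketch in JavaScript (your servlet would do the equivalent in Java; BUCKET, pdfName, userEmail and accessToken are placeholders):
// Sketch: grant READER on the generated PDF to the authenticated user's email.
const res = await fetch(
  `https://storage.googleapis.com/storage/v1/b/${BUCKET}/o/${encodeURIComponent(pdfName)}/acl`,
  {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${accessToken}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ entity: `user-${userEmail}`, role: 'READER' }),
  }
);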