I'm currently using the Dropbox client JS script to push zip files to a folder (a couple of KB in test; a couple of hundred MB in production). There currently isn't a server/back end, so I am posting from an ArrayBuffer, not from a server URL.
var zip = new JSZip();
zip.file("test.txt", "Hello World\n");
var content = zip.generate({type:"arraybuffer"});
// ... code to pick a Dropbox folder ...
client.writeFile(url + "/" + fileName, content, function (error) { ... etc
This all works fine: the client is able to write the binary file (which Dropbox's own Saver is unfortunately unable to do). I'm trying to see whether Kloudless can do the same, since I also need to support Google Drive, Box, etc. at some point. The documentation for https://github.com/kloudless/file-explorer/'s saver says the files are an array of URLs:
explorer({
    ...
    files: [{
        "url": "http://<your image url>",
        "name": "filename.extension"
    }]
});
It doesn't seem to accept local object URLs created with URL.createObjectURL(blob), so I'm guessing the API tells the remote services to pull the files rather than pushing their data.
You are correct that the Kloudless API backend servers stream the file from the URL to the final destination in whichever cloud service you would like the file to be uploaded to (e.g. a folder in a Dropbox account).
If the files are present only on the client side, I would recommend using the Kloudless Chooser instead, to prompt the user to choose a folder to save the files in, and then handle uploading the file data to that destination from the client side manually.
To do this, refer to this example configuration: https://jsfiddle.net/PB565/139/embedded/
I have set retrieve_tokens to true so that my client-side JavaScript will receive not just the metadata of the folder the user chooses to upload the data to but also the Bearer token to use to gain access to the user's account. This allows the client-side JavaScript to then make upload or multipart upload requests to Kloudless to upload the file data to that folder. The advantage of multipart uploads is that an error uploading one chunk wouldn't require the whole upload to be retried.
Be sure to add the domain you are hosting the File Explorer on to your Kloudless App's Trusted Domains (on the App Details page) so that it can in fact receive the Bearer token in the response JS callback. In my JSFiddle example, I would have to add 'fiddle.jshell.net' to my app's list of Trusted Domains to be able to receive the Bearer token to perform further requests from the client side to the Kloudless API.
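The upload step described above can be sketched as follows. This is a minimal sketch, not Kloudless's official sample code: the `account.id` and `account.bearer_token` field names are assumed to come from the Chooser's success callback (with `retrieve_tokens: true`), and the endpoint and `X-Kloudless-Metadata` header follow the Kloudless upload API docs, so verify both against the current documentation.

```javascript
// Build the request options for a simple (non-multipart) Kloudless upload.
// accountId / bearerToken are assumed to come from the Chooser callback;
// folderId is the id of the folder the user chose.
function buildUploadRequest(accountId, bearerToken, folderId, fileName) {
  return {
    url: 'https://api.kloudless.com/v1/accounts/' + accountId + '/storage/files/',
    headers: {
      'Authorization': 'Bearer ' + bearerToken,
      'Content-Type': 'application/octet-stream',
      'X-Kloudless-Metadata': JSON.stringify({ name: fileName, parent_id: folderId })
    }
  };
}

// Browser-side usage (not run here): send the ArrayBuffer produced by JSZip.
function uploadZip(account, folder, fileName, arrayBuffer) {
  var req = buildUploadRequest(account.id, account.bearer_token, folder.id, fileName);
  return fetch(req.url, { method: 'POST', headers: req.headers, body: arrayBuffer });
}
```

For the couple-of-hundred-MB production files, the multipart upload endpoint mentioned above would be the better fit, since a failed chunk can be retried without restarting the whole transfer.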
Related
I am using:
React as the front end,
.NET Core as the back end.
I have an S3 bucket.
Purpose: upload files to S3;
allow users to upload directly from the browser (either via the AWS SDK or an HTTP POST).
What I want to achieve:
Because uploads require the S3 access key, I don't want to give the key to a user; otherwise one user could upload and read other users' files.
I don't want to pass the file to the server and then upload it from the server.
What's the best way for me to control this? Getting a unique key from my backend server for a particular user?
Or is there any link/training you can suggest?
You can generate an S3 presigned POST URL in the backend using the secret access keys of an appropriate IAM user (let's call it User A). Then return this presigned POST URL to the client, and the client can use it to upload files to the S3 bucket on behalf of User A. Here is the documentation describing in detail how to POST an object to S3.
I'm creating a dating React web app where users can upload pictures of themselves to their user profile. I want to use Firebase storage. I want to protect the images so that they are only viewable when accessing from my web app by authenticated users - right now I get an image like this:
let storageRef = firebase.storage().ref(`images/${userid}/${userImageName}`);

// Get the download URL
storageRef.getDownloadURL().then(function (url) {
  // Insert url into an <img> tag to "download"
});
This is great, but once I put the URL in the src attribute of the image tag, anyone who views the source code can copy the URL and send it via email, text message, etc., making it "public". I have also tried uploading images as a base64 string using the putString() function, only for Firebase Storage to yet again create a URL for it, just like a normal image upload with the put() function.
So my question is - can I use Firebase Storage to store images and make them "private" so that only authenticated users of my web app are able to view them? Is there a way to get only the image data and use that to generate the images in the frontend/client so that no actual URLs are ever placed in the JS code?
The call to getDownloadURL() is protected by Security Rules, which means that if you write a rule like:
service firebase.storage {
  match /b/{bucket}/o {
    match /images/{userId}/{userImageName} {
      // any authenticated user can read the bytes or get a download URL
      allow read: if request.auth != null;
      // only the given user can upload their photo
      allow write: if request.auth.uid == userId;
    }
  }
}
then unauthenticated users will not be able to get download URLs.
As for the second issue: once someone can see a file, assume that they have already downloaded/screenshotted it and can share it, even if the URL isn't publicly accessible (or even on the page). Viewing something is equivalent to downloading it, so there's really no difference where it's coming from as the end result can be the same.
I have a REST API extension. When I upload it from the web portal (Resources / REST API), it works well. However, when I upload it from code using PageAPI.createPage(), the upload succeeds but the extension doesn't work. Checking the files on the server, I can't find the extension under ${BONITA_HOME}\bonita\client\tenants\1\work; it only exists in ${BONITA_HOME}\bonita\client\tenants\1\temp. Debugging shows the files go through the pageAPI servlet, which invokes PageDataStore.createEngieenPage(). So my question is: how can I use the REST API to add an extension and deploy it?
In order to deploy a Bonita REST API extension programmatically you need to:
Call the loginservice REST API for authentication
Send the file to a temporary folder on server side using the uploadPage servlet
Register the new REST API extension by calling the portal/page REST API
I created a basic Groovy script that demonstrates this.
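The three steps above can be sketched in JavaScript (Node 18+, for its built-in fetch and FormData) along these lines. The servlet paths, form field names, and JSON body are assumptions based on the Bonita portal and may differ between Bonita versions, so check them against your installation; the session cookie from the login response must be replayed on the later calls.

```javascript
// Assumed Bonita endpoints; verify the paths against your Bonita version.
function endpoints(base) {
  return {
    login: base + '/loginservice',
    upload: base + '/portal/pageUpload?action=add',
    register: base + '/API/portal/page',
  };
}

async function deployRestApiExtension(baseUrl, username, password, zipBlob, zipName) {
  const api = endpoints(baseUrl);

  // 1. Authenticate and capture the session cookie.
  const loginRes = await fetch(api.login, {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: `username=${encodeURIComponent(username)}` +
          `&password=${encodeURIComponent(password)}&redirect=false`,
  });
  const cookie = loginRes.headers.get('set-cookie');

  // 2. Send the zip to the server-side temporary folder via the upload servlet.
  const form = new FormData();
  form.append('file', zipBlob, zipName);
  const uploadRes = await fetch(api.upload, { method: 'POST', headers: { cookie }, body: form });
  const tempFile = await uploadRes.text(); // servlet returns the temp file name

  // 3. Register the extension from the uploaded temp file.
  return fetch(api.register, {
    method: 'POST',
    headers: { cookie, 'Content-Type': 'application/json' },
    body: JSON.stringify({ pageZip: tempFile }),
  });
}
```

Registering via the portal/page API (rather than PageAPI.createPage() alone) is what moves the extension out of the tenant's temp folder and actually deploys it.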
I have one server running the angular application and another one for the web api.
I have a mechanism to upload and save the photos path. For example, I store the path in my database:
http://localhost:37020/App_Data/Tmp/FileUploads/248/tag2.png
So, when I want to display those images, I use
<img ng-src="path">
but I get
GET http://localhost:37020/App_Data/Tmp/FileUploads/248/tag2.png 404 (Not Found)
HTTP Error 404.8 - Not Found The request filtering module is
configured to deny a path in the URL that contains a hiddenSegment
section.
What I have done in my web API is set up a route like:
routeTemplate: "App_Data/Tmp/FileUploads/{listingId}/{file}"
but the controller doesn't seem to pick up the request.
The problem was that I was using the App_Data folder: IIS request filtering treats App_Data as a hidden segment and refuses to serve files from it, which is exactly what the 404.8 error says.
Changing to a different folder worked just fine. I didn't even have to create a controller or action to serve the files.
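For completeness: if you really did need to serve files out of App_Data, request filtering can be told to stop hiding that segment in web.config, along these lines (a sketch, not a recommendation; App_Data is hidden precisely to protect data files, so moving the uploads, as above, is the safer fix):

```xml
<configuration>
  <system.webServer>
    <security>
      <requestFiltering>
        <hiddenSegments>
          <!-- stop treating App_Data as a hidden segment (use with care) -->
          <remove segment="App_Data" />
        </hiddenSegments>
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
```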
I have an admin control panel where admin users can set a few options and then click a button to run a report. The report should return a CSV file download prompt to the user.
I am using ui-router and $resource services. The response headers/mime type are set correctly, but the CSV file is returned as text (no file download initiated) by the $resource handler.
I tried creating the download link directly by forming a querystring from the $scope, but unfortunately my API's authentication scheme uses a custom HTTP header token, and it's not possible to send that to the API (which is also on another subdomain) via an anchor tag, such as this one:
<a href="...">Run</a>
Is there a way to initiate a file download prompt using an XHR request (with custom header)?
I'm using a custom header token so downloading the report via a simple link is impossible; the request has to be made via XHR. My two-part solution:
Returned the CSV data from the API directly as text, removing file attachment headers. This is the correct solution, anyway, because it keeps the REST JSON API unconcerned with file downloads, which are a browser concept.
Wrapped the API response data in a Blob, and then used https://github.com/eligrey/FileSaver.js to initiate a file download.
A drawback is this requires IE10+, but that is okay in my case.
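The two parts above can be sketched like this. The endpoint, header name, and token variable are placeholders, and `$http` and `saveAs` (the FileSaver.js global) exist only in the browser:

```javascript
// Wrap the CSV text the API now returns in a Blob with a CSV MIME type.
function csvToBlob(csvText) {
  return new Blob([csvText], { type: 'text/csv;charset=utf-8' });
}

// Angular-side usage (browser only): fetch the CSV as text with the custom
// auth header attached, then hand the Blob to FileSaver.js to prompt a download.
function downloadReport($http, token, params) {
  return $http.get('https://api.example.com/report', {  // placeholder endpoint
    params: params,
    headers: { 'X-Auth-Token': token },                  // placeholder header name
    responseType: 'text'
  }).then(function (response) {
    saveAs(csvToBlob(response.data), 'report.csv');
  });
}
```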