A gateway application will send a GET request to my Vert.x application. To serve it, I need to read a large zip file from Amazon S3, which I can already do into a BufferedInputStream. I don't want to download this file to disk; rather, I need to send the stream data to the gateway application (NOT a downloadable file with an application/zip content type, but stream data, i.e. byte chunks), which will forward it to the end application, where the stream data will finally be saved as a zip file. So what I need to achieve with Vert.x is sending the zip file to the gateway application as a stream. I have already gone through a lot of documentation and blog posts, but everywhere the file is downloaded, which is not my intention. Could anyone please suggest how I could achieve this streaming of a zip file in the HTTP response of the calling request using Vert.x? Do I need to use Java NIO? If so, could you please give details? Sorry, but I have no code to show yet. Thanks in advance!
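A minimal sketch of one way to do this with plain Vert.x core (no Java NIO needed): put the response into chunked mode and copy the BufferedInputStream to the response in byte chunks, off the event loop. The openS3Stream() helper below is a hypothetical stand-in for your S3 client call, and backpressure handling (writeQueueFull() / drainHandler()) is left out for brevity, so treat this as a starting point rather than a finished implementation.

```java
import io.vertx.core.AbstractVerticle;
import io.vertx.core.buffer.Buffer;
import io.vertx.core.http.HttpServerResponse;

import java.io.BufferedInputStream;
import java.io.InputStream;

public class ZipStreamVerticle extends AbstractVerticle {

  @Override
  public void start() {
    vertx.createHttpServer().requestHandler(req -> {
      HttpServerResponse resp = req.response();
      // Chunked transfer: no Content-Length, bytes go out as they are read.
      resp.setChunked(true);
      // Raw bytes rather than application/zip, per the requirement.
      resp.putHeader("Content-Type", "application/octet-stream");

      // The S3 read is blocking I/O, so keep it off the event loop.
      vertx.executeBlocking(promise -> {
        try (InputStream in = new BufferedInputStream(openS3Stream())) {
          byte[] chunk = new byte[8192];
          int n;
          while ((n = in.read(chunk)) != -1) {
            // Each chunk is written as soon as it is read; nothing hits disk.
            resp.write(Buffer.buffer().appendBytes(chunk, 0, n));
          }
          promise.complete();
        } catch (Exception e) {
          promise.fail(e);
        }
      }, done -> {
        if (done.succeeded()) {
          resp.end();
        } else {
          resp.setStatusCode(500).end();
        }
      });
    }).listen(8080);
  }

  // Hypothetical stand-in: in practice this would be something like
  // s3Client.getObject(bucket, key).getObjectContent().
  private InputStream openS3Stream() throws Exception {
    return getClass().getResourceAsStream("/sample.zip");
  }
}
```

The gateway then just sees a stream of byte chunks with no zip semantics attached; it can pipe the body straight through to the end application, which is the only place the bytes need to be written out as a .zip file.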
I have a zipped file containing images which I am sending as the response to a Python REST API call. I want to create a REST application that consumes the Python REST API in this manner: the response's content should be extracted without downloading (on the browser side) and all the images should be displayed to the user. Is this possible? If so, could you please help me with the implementation? I have been unable to find help anywhere.
I think what you are trying to do is have a backend server (Python) where zip files of images are hosted. You need to create an application (which could be in React) that:
1. Sends HTTP calls to the server to get those .zip files.
2. Unzips them: How to unzip file on javascript
3. Displays the images to the user: https://medium.com/better-programming/how-to-display-images-in-react-dfe22a66d5e7
I'm not sure what UTF-8 has to do with this, but it is possible. A quick Google search gave me the results above.
I need to bulk-upload data from a CSV file to Datastore. The data in the CSV file also includes a field which should be the URL to a file.
Each row (person) is mapped to an associated file, which I can upload to Google Cloud Storage. But at runtime, how can I upload the file, get its URL, and update the CSV file, and then use the CSV file to do the bulk upload?
I need a solution for this.
Thanks for the help.
There are two ways of doing this:
1. Write the logic in your request handler and perform the task there. Raw data can be uploaded to GAE as a project resource; there are some size limits, obviously.
2. The better way is to enable the remote API, then use the remote API Python script to batch-upload the data, or write some Python code that points to your remote data source.
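If you would rather do it from code than from the batch script, the per-row flow from the question (upload the file, get its URL, write the entity) looks roughly like this with the Cloud client libraries. This is a sketch in Java rather than Python, and the bucket name, CSV layout, and entity kind are all assumptions:

```java
import com.google.cloud.datastore.Datastore;
import com.google.cloud.datastore.DatastoreOptions;
import com.google.cloud.datastore.Entity;
import com.google.cloud.datastore.Key;
import com.google.cloud.storage.BlobId;
import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

import java.nio.file.Files;
import java.nio.file.Paths;

public class CsvBulkLoader {
  public static void main(String[] args) throws Exception {
    Storage storage = StorageOptions.getDefaultInstance().getService();
    Datastore datastore = DatastoreOptions.getDefaultInstance().getService();
    String bucket = "my-bucket"; // assumption

    // Naive CSV handling (name,localFilePath per line), for illustration only.
    for (String line : Files.readAllLines(Paths.get("people.csv"))) {
      String[] cols = line.split(",");
      String name = cols[0];
      String localFile = cols[1];

      // 1. Upload the person's file to Cloud Storage.
      BlobId blobId = BlobId.of(bucket, name + "/" + Paths.get(localFile).getFileName());
      storage.create(BlobInfo.newBuilder(blobId).build(),
                     Files.readAllBytes(Paths.get(localFile)));

      // 2. Derive the URL instead of rewriting the CSV.
      String url = "https://storage.googleapis.com/" + bucket + "/" + blobId.getName();

      // 3. Write the Datastore entity with the URL attached.
      Key key = datastore.newKeyFactory().setKind("Person").newKey(name);
      datastore.put(Entity.newBuilder(key)
          .set("name", name)
          .set("fileUrl", url)
          .build());
    }
  }
}
```

Deriving the URL at upload time means you never have to go back and update the CSV file: the URL goes straight into the entity.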
I'm writing a single-page web app (AngularJS) and a server back end (Node.js). The communication between them is done via REST.
Currently I'm trying to implement the following scenario:
1. Upload big files from the browser to a public S3 bucket.
2. Copy the uploaded file to a private bucket on S3.
3. Transcode the uploaded file to an HTML5-compatible format (AWS Elastic Transcoder).
4. Store a meta-object about the file in the DB for later access.
I'm racking my brains to come up with a good design for the communication/data workflow between server and client, but I always get stuck on the following questions:
1. Store the file meta-object at the end or at the beginning of the process? If it is at the beginning, I have to store and handle some state information.
2. Who should start copying uploaded files to the private bucket, the server or the client? If it is the server, how can the client be informed that the job succeeded?
3. Who starts the transcoding process? If it is the server, how can the client be informed that the job succeeded?
How would you do this?
There is a pretty good tutorial which describes the use case you are planning to implement: http://www.bitcodin.com/blog/2015/02/create-mpeg-dash-hls-content-for-amazon-s3-and-cloudfront/
If your transcoding system has a RESTful API (like bitcodin, which is used in this tutorial, or any other service), you can drive your application from the client side and use API calls to get the state of your transcodings, etc. However, using the API you can do the same from the server side; use whatever fits you better.
I personally would store the metadata at the beginning of the process, as this is the point in time where you generate the "asset" in your database/CMS/etc.
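On the second question (who copies to the private bucket): if the server does it, it never has to move the bytes itself, because S3 supports server-side copies. A minimal sketch with the AWS SDK for Java; bucket and key names are placeholders:

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class CopyToPrivateBucket {
  public static void main(String[] args) {
    // Credentials and region come from the default provider chain.
    AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

    // Server-side copy: the bytes never leave S3, so this stays cheap
    // even for big uploads.
    s3.copyObject("public-upload-bucket", "uploads/video.mp4",
                  "private-bucket", "videos/video.mp4");

    // Once the copy succeeds, the server can start the transcoding job.
  }
}
```

The client can then learn that the copy or transcoding succeeded by polling a small status endpoint on your server, or via a push channel (WebSockets/SSE) if polling feels too clumsy.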
I have a REST API which uses Symfony2. I need to upload a file to my REST API and send this file to another service. How can I do that?
Thanks
You may want to use OneupUploaderBundle. It is optimized for a few file upload JavaScript libraries and it supports Gaufrette, which will upload your file anywhere you want.
You can also check FineUploader: it automatically sends your file to S3 or Azure.
Is it possible to send an HTTP upload request with a file to Apache or IIS whose fileName contains "../" or ".." that wouldn't be rejected and would be passed on to the PHP or ASP.NET engine?
Not really, at least not the way you are asking. By the time it gets to the server, the browser has read the file and delivered it as a chunk of content with no information about where it came from other than the original file name, which you can choose to use or discard.
Generally, file uploads go into a temporary storage place (e.g. /tmp) and then need to be moved out of there to somewhere you can control and name.
This storage is configured on the server, so any attempt to put path info into the filename should be blocked by the server's file upload implementation, which should sanitise the filenames again if the browser didn't already do so.
If there's a bug, then all bets are off, though.
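To make that sanitising step concrete, here is one defensive pattern, sketched in Java; the exact rules you enforce are up to you:

```java
import java.nio.file.Paths;

public class UploadNameSanitizer {

  // Reduce a client-supplied filename to its last path element and
  // reject anything that still looks like traversal. Note: on Linux,
  // Windows-style separators ('\') are not path separators, so they
  // are rejected explicitly here.
  static String sanitize(String clientFileName) {
    String name = Paths.get(clientFileName).getFileName().toString();
    if (name.isEmpty() || name.equals("..") || name.contains("\\")) {
      throw new IllegalArgumentException("Bad file name: " + clientFileName);
    }
    return name;
  }

  public static void main(String[] args) {
    System.out.println(sanitize("../../etc/passwd")); // -> "passwd"
    System.out.println(sanitize("photo.jpg"));        // -> "photo.jpg"
  }
}
```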