Send .mat file through Django Rest Framework - reactjs

I have an issue sending the contents of a .mat file to my frontend. My end goal is to allow clients to download the contents of this .mat file at the click of a button, so that they end up with the same file in their possession. I use Next.js + Django Rest Framework.
My first try was as follows:
class Download(APIView):
    def get(self, request):
        with open('file_path.mat', 'rb') as FID:
            fileInstance = FID.read()
        return Response(
            fileInstance,
            status=200,
            content_type="application/octet-stream",
        )
If I print out the fileInstance element I get some binary results:
z\xe1\xfe\xc6\xc6\xd2\x1e_\xda~\xda|\xbf\xb6\x10_\x84\xb5~\xfe\x98\x1e\xdc\x0f\x1a\xee\xe7Y\x9e\xb5\xf5\x83\x9cS\xb3\xb5\xd4\xb7~XK\xaa\xe3\x9c\xed\x07v\xf59Kbn(\x91\x0e\xdb\xbb\xe8\xf5\xc3\xaa\x94Q\x9euQ\x1fx\x08\xf7\x15\x17\xac\xf4\x82\x19\x8e\xc9...
But I can't send it back to my frontend because of a
"UnicodeDecodeError: 'utf-8' codec can't decode byte 0x9c in position 137: invalid start byte"
This error is always the same regardless of which .mat file I try to send in my response.
Next I tried to use the scipy.io.loadmat() method. In this case, fileInstance gives me a much more readable dictionary object, but I still can't get it to transfer to the frontend because of the presence of NaN in my dict:
ValueError: Out of range float values are not JSON compliant
Finally, some suggested using h5py to send back the data, as such:
with h5py.File('file_path.mat', 'r') as fileInstance:
    print(fileInstance)
But in that case the error I get is
Unable to open file (file signature not found)
I know my files are not corrupted because I can open them in Matlab with no problem.
With all this trouble, I'm wondering if I'm using the right approach to this problem. I could technically send the dictionary obtained through scipy.io.loadmat() as a str element instead of binary, but I'd have to figure out a way to convert that text back to binary inside a JavaScript function. Would anybody have some ideas as to how I should proceed?

The problem was in my frontend after all. Still, here's the correct way to go about it:
from django.http import FileResponse
from rest_framework.parsers import FormParser, MultiPartParser
from rest_framework.response import Response
from rest_framework.views import APIView

class Download(APIView):
    parser_classes = [FormParser, MultiPartParser]

    def get(self, request):
        try:
            file_path = "xyz.mat"
            # FileResponse streams the raw bytes, so no text/JSON encoding is applied.
            response = FileResponse(open(file_path, "rb"), content_type="application/octet-stream")
            response["Content-Disposition"] = 'attachment; filename="file_name"'
            return response
        except Exception:
            return Response(status=500)
This should send the right file to the frontend in the right format. No need to worry about encoding and such.
Meanwhile, on the frontend you should receive the file as follows:
onClick={() => {
  const url = '/url_to_your_api/';
  axios({ method: 'get', url: url, responseType: 'blob' })
    .then((response) => {
      const { data } = response;
      const fileName = 'file_name';
      const blob = new Blob([data], { type: 'application/octet-stream' });
      const href = URL.createObjectURL(blob);
      const link = document.createElement('a');
      link.href = href;
      link.download = fileName + '.mat';
      document.body.appendChild(link);
      link.click();
      document.body.removeChild(link);
      URL.revokeObjectURL(href);
    })
    .catch((response) => {
      console.error(response);
    });
}}
Long story short, the part I was missing was telling axios to receive the data as a blob inside the onClick() handler. By default, axios's responseType is 'json', so my file was being decoded as text on reception and was no longer usable in MATLAB afterwards. If you face a similar problem in the future, try the shasum shell command to compare the hash of the original file with that of the downloaded one; that is how I figured out that my API was returning the correct bytes and that the problem was therefore happening on the frontend.

Related

How to send a local image instead of URL to Computer Vision API using React

I would like to upload a local image file and extract text from it. I followed the link below and it works as expected when I pass a URL. https://learn.microsoft.com/en-us/azure/developer/javascript/tutorial/static-web-app/add-computer-vision-react-app
I managed to configure it for a local image and get the base64-encoded data URL of the uploaded image. But when I pass the base64-encoded data URL to the Computer Vision API, it says "Input data is not a valid image" (POST 400 status code). I am getting the error in the line shown below:
const analysis = await computerVisionClient.analyzeImage(urlToAnalyze, { visualFeatures });
The code I have included for handling the local image:
const handleChange = (e) => {
  var file = e.target.files[0];
  var reader = new FileReader();
  reader.onloadend = function () {
    setFileSelected(reader.result); // this is the base64 encoded data URL
  };
  reader.readAsDataURL(file);
}
In the computerVision.js file, I have changed the 'Content-Type' in the header as below.
const computerVisionClient = new ComputerVisionClient(
  new ApiKeyCredentials({ inHeader: {'Ocp-Apim-Subscription-Key': key, 'Content-Type': 'application/octet-stream'} }), endpoint);
I tried replacing client.read() with readTextInStream() as per the docs in computerVision.js (please refer to the above link), but it still throws an error.
May I know why I get the error "Input data is not a valid image"? Thanks.
Here is the link for input requirements.
There is a brand-new online portal provided by Microsoft: https://preview.vision.azure.com/demo/OCR
The advantage is that it directly lists your available resources, so you just have to pick the right one and test; there are also some samples.
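As for the 400 error itself: analyzeImage is meant for an image URL that the service can fetch, so a base64 data URL will not work there. Below is a rough, hedged sketch of the stream-based alternative, assuming the JavaScript SDK's analyzeImageInStream method and reusing the computerVisionClient, visualFeatures and setFileSelected names from the question; verify the exact call against the SDK docs before relying on it.
// Keep the File object itself instead of converting it to a data URL.
const handleChange = (e) => {
  setFileSelected(e.target.files[0]); // a File is also a Blob
};

// Pass the raw bytes to the *InStream variant rather than a URL to analyzeImage.
const analyzeLocalImage = async (file) => {
  const analysis = await computerVisionClient.analyzeImageInStream(file, { visualFeatures });
  return analysis;
};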

How to send a JSON and an image file to a server?

I am trying to send a JSON file and an image file together to a server, but am really struggling.
1) If I send just the quilt item, so skipping the formData and changing the $http part below to $http.post('quilts/create/', quilt), and set the server end point to expect (@RequestBody QuiltRequest quiltRequest) without the bits about transformRequest and headers, it processes the data therein quite happily, but I don't have an image to add to the records.
2) If I don't add the quilt item to the formData, and tell the server to expect (@RequestParam("image") MultipartFile image), I can save the image file on my server and generate a URL string for it, but I have no other quilt information with which to make the corresponding database entry.
How can I send both the quilt and the image in one request, and have the server receive and process both?
Many thanks!
Client-side service:
this.create = function (quilt, image) {
  quilt.size = JSON.parse(quilt.size);
  quilt.maker = JSON.parse(quilt.maker);
  const formData = new FormData();
  formData.append('quiltRequest', quilt);
  formData.append('image', image);
  $http.post('quilts/create/', formData, {
    transformRequest: angular.identity,
    headers: {'Content-Type': undefined}
  }).then(function (response) {
    return window.location = '#!/quilts/created/' + response.data;
  })
};
Server-side end point:
@PostMapping(path = "/create")
public BigInteger create(@RequestPart QuiltRequest quiltRequest, @RequestPart MultipartFile image) throws IOException {
    // do stuff based on parameters received
}
Apart from that, I think you can try encoding the image to a base64 string, send it to the server, and decode it at the server.
The solution I needed was given by a real-world hero, and is posted here in case anyone else with a similar problem stumbles upon this thread :) (But thank you to user3562932 for taking the time to read and make a suggestion.)
On the client side, we have moved the five lines of data preparation into a separate method, so that the original create() now takes a bunch of parameters and jumps straight to $http.post(url, data that has been magically transformed into something appropriate to send, {rules on how to send the data}).
$http.post('quilts/create/', formData(quilt, image), {
  transformRequest: angular.identity,
  headers: {'Content-Type': undefined}
}).then(function (response) {
  return window.location = '#!/quilts/created/' + response.data;
})
The magical transformation happens in the new function formData(), which takes as its parameters the data we want to send and makes the necessary changes:
1) make a formData container for the data to be POSTed.
2) stringify information from the html form (e.g. text, numbers) into a JSON and append to formData.
2a) in this particular case, my quilt structure contains size and maker details which arrived from the backend as JSONs, and were selected in the webpage from drop-down lists of various sizes and makers, hence the parsing rows to get these items ready to be included in the formData.
3) convert files into BLOBs, and then likewise append.
4) return formData, with all required information neatly wrapped up and ready to go!
Note: in the services.js file, this formData() method actually appears above the create() method, but it feels more logical to talk about them this way around.
function formData(quilt, image) {
  let formData = new FormData();
  quilt.size = JSON.parse(quilt.size);
  quilt.maker = JSON.parse(quilt.maker);
  formData.append('quiltRequest', JSON.stringify(quilt));
  formData.append('image', new Blob([image]));
  return formData;
}
On the server side, we can now happily receive this through:
@PostMapping(path = "/create", consumes = MediaType.MULTIPART_FORM_DATA_VALUE)
public BigInteger create(@RequestParam(value = "quiltRequest") String quiltRequest,
                         @RequestParam(value = "image") MultipartFile image) throws IOException {
    QuiltRequest quilt = new ObjectMapper().readValue(quiltRequest, QuiltRequest.class);
    QuiltResponse quiltResponse = quiltService.create(quilt, image);
    return quiltResponse.getQuilt().getId();
}
In order to enable the end point to consume our exciting multimedia input, we have to add the following import at the top of the class:
import org.springframework.http.MediaType;
We use another import to enable the use of the MultipartFile class that we have designated for the incoming image file:
import org.springframework.web.multipart.MultipartFile;
The JSON object from the webpage has come through as a String, but that needs to be parsed into its underlying components to actually be of use. This is where the ObjectMapper comes into play. Call its readValue() method, and pass in the string argument plus a template of what the information should look like when unwrapped (here, a QuiltRequest class with defined properties corresponding to the information we fed into the JSON back on the client side). Remember to include the necessary import to access the ObjectMapper:
import com.fasterxml.jackson.databind.ObjectMapper;
Hopefully this breakdown of the changes makes sense, with enough explanation to help other developers build end-to-end POST requests to suit their own projects.

How to send parsed .csv file as a byte array or ArrayBuffer data from Node.js backend to AngularJS frontend?

I'm working on an AngularJS app.
The module I'm currently working on should be able to either show a preview of a spreadsheet file or allow downloading it.
The steps:
When "Preview File" is clicked, it should send a request with the needed file's name as a parameter of a POST request.
The backend will find the needed file, which is a .csv file, convert it to a byte array and send it to the frontend.
The frontend should handle this byte array and convert it to the .xls or .xlsx filetype.
The spreadsheet data should be opened in some small preview read-only window, like 1000x1000 px.
The POST request line looks like that:
this.$http.post(this.url + 'endpoint/getFile', params,
  {responseType: "arraybuffer", showLoadingOverlay: true}
)
The response does indeed look like an ArrayBuffer: three typed-array views in one object, i.e. Uint8Array, Uint16Array and Uint32Array.
The code which should read this and convert it to content suitable for preview is not working:
const byteArray = new Uint8Array(data);
const blob = new Blob([byteArray], { type: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet' });
const objectUrl = URL.createObjectURL(blob);
this.$window.open(objectUrl, 'C-Sharpcorner', 'width=1000,height=1000');
When the blob is created, it already has a length of 0 bytes, so there's no data inside.
The matter of visualising the .xls in a browser window can, I think, be achieved with the canvas-datagrid library. I haven't used it, but it looks cool.
Also, I have a problem trying to set up mock data for Node.js (and AngularMock), for local testing when there's no data on the Java backend.
I'm using 'fs' and 'csv-parse':
const fs = require('fs');
const csvParse = require("csv-parse/lib/es5");

module.exports = function stir(app) {
  const getFile = () => {
    const csvOutput = csvParse('../static/someData.csv', (parsed) => {
      return parsed;
    });
    fs.readFileSync(csvOutput);
  };
  app.post('/stir/getFile', (req, res) => res.json(getFile()));
};
Which results in error:
TypeError: path must be a string or Buffer
What is the proper way of parsing the .csv using 'csv-parse' and sending parsed data as an ArrayBuffer to frontend in Node and AngularMock?
The csv-parse docs say that, underneath, the lib will convert the parsed object to a Node stream.
So why does that error happen?
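For what it's worth, here is a minimal sketch of the mock endpoint that reads the file first and then parses it, assuming csv-parse's callback form parse(input, callback) and keeping the relative path from the question (which has to resolve from the process working directory):
const fs = require('fs');
const csvParse = require("csv-parse/lib/es5");

module.exports = function stir(app) {
  app.post('/stir/getFile', (req, res) => {
    // csv-parse expects CSV content (a string or Buffer), not a file path,
    // so read the raw bytes first.
    const raw = fs.readFileSync('../static/someData.csv');
    csvParse(raw, (err, records) => {
      if (err) {
        return res.status(500).json({ error: err.message });
      }
      // To match the arraybuffer responseType on the frontend, send raw bytes:
      // res.send(raw);
      // Or, for a simple mock, send the parsed rows as JSON:
      res.json(records);
    });
  });
};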

ReactJS image/pdf file download not working

I want to download a file that can be in any format, viz. pdf, jpeg, png, xlsx, csv etc. The download API on the backend, using the Pyramid framework, sends a FileResponse as below:
def delivery_item_download_view(request, *args, **kw):
    context = request.context
    item_row = context.item_row
    if item_row and item_row["deleted_at"] is None:
        print(request.upload_dir + '/' + item_row["file_name"] + '.' + item_row["file_extension"])
        response = FileResponse(
            request.upload_dir + '/' + item_row["file_name"] + '.' + item_row["file_extension"],
            request=request,
        )
        response.headers["attachment"] = item_row["name"]
        return response
This, when executed using Postman, works as expected, giving the file as output. However, when I tried implementing the same using ReactJS, it's not working as expected. My client code is below:
onDownloadItem = (item) => {
  console.log("item id is:", item.item_id)
  var apiBaseUrl = "https://dev.incodax.com/api/deliveries_items/" + item.item_id + "/download";
  fetch(apiBaseUrl, {
    method: "GET",
  }).then((res) => {
    fileDownload(res, item.file_name)
    console.log(res)
  })
}
This fileDownload function does download a file, but with no content inside. In the downloaded file I could see something like:
[object Response]
I am getting a 200 response from the server, so I don't think there is any issue with the server-side code. How can I handle it on the client?
Thanks in advance
Would it suit you to just redirect the user to a link to the file? The browser will automatically handle it and download it.
The issue is in your fileDownload function, which you do not post here. It's not clear what the first parameter is supposed to be, but it is likely not the response object. You likely need to at least pull the body out of the response and save that! The response body can be converted to a buffer object, which could work (again, it depends on what fileDownload is expecting):
res.arrayBuffer().then(buffer => {
  fileDownload(buffer, item.file_name);
});
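Putting it together, here is a hedged sketch of the click handler, assuming fileDownload accepts a Blob (as the js-file-download helper does); the URL and item shape are taken from the question:
onDownloadItem = (item) => {
  var apiBaseUrl = "https://dev.incodax.com/api/deliveries_items/" + item.item_id + "/download";
  fetch(apiBaseUrl, { method: "GET" })
    .then((res) => res.blob())              // pull the binary body out of the response
    .then((blob) => {
      fileDownload(blob, item.file_name);   // hand the Blob, not the Response, to fileDownload
    })
    .catch((err) => console.error(err));
};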

Uploading file to openstack object storage from JavaScript

I have an OpenStack object storage container to which I'm trying to upload files directly from the browser.
As per the documentation here, I can upload a file using a PUT request, and I'm doing this using the AngularJS-provided $http.put method as shown below.
$http.put(temporaryUploadUrl,
  formData,
  {
    headers: {
      'Content-Type': undefined
    }
  }
).then(function (res) {
  console.log("Success");
});
The file uploads successfully, with no problems in authentication, and gives me a 201 Created response. However, the file now contains junk lines at the top and bottom because it's a multipart request sent using FormData().
Sample file content before upload:
Some sample text
here is more text
here is some other text
File content after downloading it back from the OpenStack container:
------WebKitFormBoundaryTEeQZVW5hNSWtIqS
Content-Disposition: form-data; name="file"; filename="c.txt"
Content-Type: text/plain
Some sample text
here is more text
here is some other text
------WebKitFormBoundaryTEeQZVW5hNSWtIqS--
I tried using FileReader to read the selected file as a binary string and wrote the content to the request body instead of using FormData, which works fine for text files but not for binary files like XLSX or PDF; the data is entirely corrupted this way.
EDIT:
The following answer is now considered a less performant workaround, as it will encode the entire file to base64 multipart form data. I would suggest going ahead with @georgeawg's answer if you are not looking for a formData + POST solution.
OpenStack also provides a different approach, using FormData, for uploading one or more files in a single go, as mentioned in this documentation. Funnily enough, this was never visible in a Google search.
Here is a brief summary of it.
First you need to generate a signature, similar to a temp URL signature, using the following Python procedure.
import hmac
from hashlib import sha1
from time import time

path = '/v1/my_account/container/object_prefix'
redirect = 'https://myserver.com/some-page'
max_file_size = 104857600
max_file_count = 1
expires = 1503124957
key = 'mySecretKey'
hmac_body = '%s\n%s\n%s\n%s\n%s' % (path, redirect,
                                    max_file_size, max_file_count, expires)
signature = hmac.new(key, hmac_body, sha1).hexdigest()
Then, in your JavaScript, POST to the container like this.
var formData = new FormData();
formData.append("max_file_size", '104857600');
formData.append("max_file_count", '1');
formData.append("expires", '1503124957');
formData.append("signature", signature);
formData.append("redirect", redirect);
formData.append("file", fileObject);
$http.post(
  "https://www.example.com/v1/my_account/container/object_prefix",
  formData,
  {
    headers: {'Content-Type': undefined},
    transformRequest: angular.identity
  }
).then(function (res) {
  console.log(res);
});
Points to note:
1) The formData in the POST request should contain only these parameters.
2) The file entry in the formData should be the last one. (Not sure why it doesn't work the other way around.)
3) The formData content, like the path with prefix, epoch time, max file size, max file count and the redirect URL, should be the same as the values that were used to generate the signature. Otherwise you will get a 401 Unauthorized.
I tried using FileReader to read the selected file as a binary string and wrote the content to the request body instead of using FormData, which works fine for text files but not for binary files like XLSX or PDF; the data is entirely corrupted this way.
The default operation for the $http service is to use Content-Type: application/json and to transform objects to JSON strings. For files from a FileList, the defaults need to be overridden:
var config = { headers: {'Content-Type': undefined} };
$http.put(url, fileList[0], config)
  .then(function(response) {
    console.log("Success");
  }).catch(function(response) {
    console.log("Error: ", response.status);
    throw response;
  });
By setting Content-Type: undefined, the XHR send method will automatically set the content type header appropriately.
Be aware that the base64 encoding of 'Content-Type': multipart/form-data adds 33% extra overhead. It is more efficient to send Blobs and File objects directly.
Sending binary data as binary strings, will corrupt the data because the XHR API converts strings from DOMSTRING (UTF-16) to UTF-8. Avoid binary strings as they are non-standard and obsolete.
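If the file content really has to be read on the client before the upload, here is a sketch of the same PUT using an ArrayBuffer instead of a binary string. It reuses temporaryUploadUrl and fileList from the snippets above, and sets transformRequest so $http does not try to JSON-serialize the buffer; treat it as an illustration rather than a drop-in replacement.
var reader = new FileReader();
reader.onload = function () {
  // reader.result is an ArrayBuffer; XHR sends it as raw bytes.
  $http.put(temporaryUploadUrl, reader.result, {
    transformRequest: angular.identity,
    headers: {'Content-Type': undefined}
  }).then(function (res) {
    console.log("Success");
  });
};
reader.readAsArrayBuffer(fileList[0]);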
