React Download JSON File from Controller

I have a React app that contacts a backend .NET controller to download various types of files. For most files everything works fine and the downloaded files are correct, but when I download a JSON file, the file contains only [object Object]. Below is the code in the calling method.
public downloadFile = async (fileId: number) => {
  const response = await HttpUtility.postFileDownload<any>(DOWNLOAD_URL + fileId, {})
  let fileName = response.headers['content-disposition'].split('filename=')[1].split('\'')[2]
  if (fileName === undefined) {
    fileName = `file-id-${fileId}-${moment().format()}`
  }
  fileDownload(response.data, fileName, response.headers['content-type'])
}
When I look at the dev tools in Chrome, the response looks correct. Any pointers on what I need to do to correct this?

This may work:
fileDownload(JSON.stringify(response.data), fileName, response.headers['content-type'])
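The likely cause (an assumption based on the symptom, not confirmed by the asker): for a response with Content-Type: application/json, the HTTP client parses the body into a JavaScript object, and passing that object to fileDownload coerces it to the string [object Object]. JSON.stringify, as above, turns it back into text. An alternative sketch, calling axios directly because HttpUtility is the asker's own wrapper, is to request the body as a Blob so no parsing happens for any file type:
// Hypothetical variant: ask for a Blob so axios never parses the body.
// DOWNLOAD_URL, fileDownload and moment are the same names used in the question.
public downloadFileAsBlob = async (fileId: number) => {
  const response = await axios.post(DOWNLOAD_URL + fileId, {}, { responseType: 'blob' })
  const disposition = response.headers['content-disposition'] ?? ''
  // take whatever follows "filename=", stripping surrounding quotes if present
  const match = disposition.match(/filename="?([^";]+)"?/)
  const fileName = match ? match[1] : `file-id-${fileId}-${moment().format()}`
  fileDownload(response.data, fileName, response.headers['content-type'])
}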

Related

React-Flask download excel file with button click

I'm trying to download an excel file with the click of a button in my web application. I can see the data come across from my api request, but when I download the file and try to open it I either get a:
"We found a problem with some content in ... Do you want us to try to recover as much as possible" YES => "This file is corrupt and cannot be opened"
or
"... the file format or file extension is not valid. Verify that theh file has not been corrupted..."
If I open the original file saved it works fine so it's not the file. I think the problem is somewhere in the React Code.
I've looked at a lot of other questions on Stack Overflow about this same topic but none of the answers seem to be working for me.
React
React.useEffect(() => {
  if (template && downloadBtn.current != null) {
    axios
      .get<Blob>(
        `/template`,
        { params: { filename: template } }
        // responseType: 'blob' or "arraybuffer" don't work for me
      )
      .then((resp) => {
        console.log(resp.data);
        var blob = new Blob([resp.data], {
          type: resp.headers['content-type'] // tried keeping and removing this
        }); // removing this assuming resp.data is already a blob didn't work
        console.log(blob); // PK ... b���C���h����ؒ )���G+N�
        const url = window.URL.createObjectURL(blob);
        console.log(url); // blob:http://localhost:3000/29fd5f64-da6a-4b9c-b4a4-76cce1d691c8
        if (downloadBtn.current != null) {
          downloadBtn.current.download = template;
          downloadBtn.current.href = url;
        }
      });
  }
}, [template, downloadBtn.current]);
Flask
@app.route('/template', methods=['GET'])
def template():
    filename = getRouteData(['filename'])  # helper function I wrote to get request.body data
    print(os.path.join(
        app.config['templates_folder'], filename), file=sys.stderr)
    return send_file(os.path.join(app.config['templates_folder'], filename))
    # adding as_attachment=True doesn't work for me either
    # file path is correct
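No answer was posted for this one, but for reference, a hedged sketch of the pattern that usually fixes corrupted binary downloads with axios: responseType: 'blob' has to sit inside the single config object that also holds params (a separate third argument to axios.get is silently ignored), so the body comes back as raw bytes instead of a UTF-8 decoded string.
// Sketch only, reusing `template` and `downloadBtn` from the question.
axios
  .get('/template', {
    params: { filename: template },
    responseType: 'blob',   // keep the body as binary data
  })
  .then((resp) => {
    const url = window.URL.createObjectURL(resp.data);   // resp.data is already a Blob
    if (downloadBtn.current != null) {
      downloadBtn.current.download = template;
      downloadBtn.current.href = url;
    }
  });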

Writing to files using File System Access API fails in Electron + Create React App

I have a create-react-app that reads and writes local files using File System Access API. When run in a browser (Chrome or Edge that support it), both reading and writing files work fine.
When the app is run in Electron, reading works but writing fails due to: Uncaught (in promise) DOMException: The request is not allowed by the user agent or the platform in the current context.
I am using the latest Electron (12.0.1) which uses the same Chromium (89.0.4389.82) as the one in my Chrome browser.
Below is the relevant code. The console logs after the requestPermission call show true and granted in the browser, but true and denied in Electron.
I tried disabling webSecurity when creating BrowserWindow, disabling sandbox with appendSwitch but nothing helped.
Is there a way to give Chromium in Electron more permissions?
If not, I am willing to handle file writing differently when in Electron. In that case, what should I write in place of the TODO in the code? Note that because it is a create-react-app, the fs module is not available.
export async function chooseAndReadFile() {
  const fileHandle = await window.showOpenFilePicker().then((handles) => handles[0])
  const file = await fileHandle.getFile()
  const contents = await file.text()
  return contents
}

export async function chooseAndWriteToFile(contents: string) {
  const fileHandle = await window.showSaveFilePicker()
  const descriptor: FileSystemHandlePermissionDescriptor = {
    writable: true,
    mode: "readwrite"
  }
  const permissionState = await fileHandle.requestPermission(descriptor)
  console.log(window.isSecureContext)
  console.log(permissionState)
  const writable = await fileHandle.createWritable()
  await writable.write(contents)
  await writable.close()
}

let isElectron = require("is-electron")

export async function chooseAndWriteToFileUniversal(contents: string) {
  if (isElectron()) {
    // TODO: Do what???
  } else {
    chooseAndWriteToFile(contents)
  }
}
Answering my own question, I finally used a solution with HTML download attribute, nicely described here. When this technique is used in Electron, it presents a file save dialog which is exactly what I want. When used in a browser, this technique just downloads the file without a prompt, so I will continue using File System Access API for browser environments.
Here is the code that handles downloading when running in Electron.
function download(filename: string, contents: string) {
  var element = document.createElement('a');
  element.setAttribute('href', 'data:text/plain;charset=utf-8,' + encodeURIComponent(contents));
  element.setAttribute('download', filename);
  element.style.display = 'none';
  document.body.appendChild(element);
  element.click();
  document.body.removeChild(element);
}

let isElectron = require("is-electron");

export async function chooseAndWriteToFileUniversal(contents: string) {
  if (isElectron()) {
    download("data.txt", contents)
  } else {
    chooseAndWriteToFile(contents) // See the original question for implementation of this function
  }
}
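One caveat with the data: URL approach: very large contents can run into data: URL size limits in some browsers. A sketch of a variant that avoids this builds a Blob and an object URL instead, with the same hidden-anchor click:
// Sketch: same idea as download() above, but via Blob + createObjectURL
// so large contents are not packed into a data: URL.
function downloadViaBlob(filename: string, contents: string) {
  const blob = new Blob([contents], { type: 'text/plain;charset=utf-8' });
  const url = URL.createObjectURL(blob);
  const element = document.createElement('a');
  element.href = url;
  element.download = filename;
  element.style.display = 'none';
  document.body.appendChild(element);
  element.click();
  document.body.removeChild(element);
  setTimeout(() => URL.revokeObjectURL(url), 0);  // release the object URL once the download has started
}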
Still, it would be nice to know why/how Chromium in Electron is more restricted than in a normal Chrome or Edge browser, and whether that can be changed.

Extracting zipped files using JSZIP in javascript not working on IE11

In my webpage, a user is supposed to upload a zipped file. Within the zipped file are multiple xlsx files. I am using the code below for reading the ZIP file. It works great on Chrome, but when I try to run it on IE11 it complains that it could not resolve a null object or reference.
var JSZip = require('JSZip');
fs.readFile(filePath, function(err, data) {
  if (!err) {
    var zip = new JSZip();
    zip.loadAsync(data).then(function(contents) {
      Object.keys(contents.files).forEach(function(filename) {
        zip.file(filename).async('nodebuffer').then(function(content) {
          var dest = path + filename;
          fs.writeFileSync(dest, content);
        });
      });
    });
  }
});
When I try to debug, it is not going inside the loadAsync callback.
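No answer is recorded here, but one hedged observation: the snippet reads the upload with Node's fs module, which does not exist in a plain browser page, so on IE11 those objects come up null/undefined. When the zip comes from a user upload, it usually has to be read from the file input itself, for example with FileReader. A sketch under that assumption (the zipInput id is hypothetical; JSZip's loadAsync accepts the resulting ArrayBuffer):
// Sketch: read the uploaded zip from a file input instead of fs.
// Assumes <input type="file" id="zipInput"> exists on the page.
var JSZip = require('jszip');

var input = document.getElementById('zipInput') as HTMLInputElement;
input.addEventListener('change', function () {
  var file = input.files && input.files[0];
  if (!file) { return; }
  var reader = new FileReader();             // available in IE11
  reader.onload = function () {
    JSZip.loadAsync(reader.result as ArrayBuffer).then(function (zip: any) {
      Object.keys(zip.files).forEach(function (filename) {
        zip.file(filename).async('arraybuffer').then(function (content: ArrayBuffer) {
          // content holds the raw bytes of one xlsx entry; process it here
          console.log(filename, content.byteLength);
        });
      });
    });
  };
  reader.readAsArrayBuffer(file);
});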

Create temp file and get file with axios in React and show download pop up

In my app I need to create a binary file with a custom extension on the fly (in tmp, for example) and send it to the client. The client requests it with axios, and the server creates the file on demand.
In Express I have a method like this:
router.post('/send-some-file', (req, res) => {
  const { id } = req.body
  /*
   * Some code for getting information from db is here
   */
  const fileName = "myfile.myextension"
  const filePath = path.join(__dirname + `../${fileName}`)
  fs.writeFile(filePath, "Data From Database", err => {
    if (!err) {
      // send file to client 'axios', I use res.download
      res.download(filePath, err => {})
    }
  })
})
This is my server code that creates the file and sends it. The problem is that I can't tell when the file has been completely downloaded so that I can remove it.
In React with Redux I dispatch an action and call this endpoint with axios:
axios.post('some-url/send-some-file','my id',config()).then(res => {}).catch(er=>{})
Now I want the browser to download the file, or to give the user a way to download it in the custom extension format I described. The response comes back and its headers look right, but I don't know how to pop up a download dialog or make the browser download the file automatically.
Can you try this:
axios.post('some-url/send-some-file', 'my id', config()).then(res => {
  const url = window.URL.createObjectURL(new Blob([res.data]));
  const link = document.createElement('a');
  link.href = url;
  link.setAttribute('download', 'file.pdf');
  document.body.appendChild(link);
  link.click();
}).catch(er => {})
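Two hedged additions (assumptions on my part, not from the original answer): for a binary file with a custom extension, the axios request usually needs responseType: 'blob' so res.data is not decoded as text; and on the Express side, res.download takes a callback that runs once the transfer has finished or failed, which is a natural place to delete the temp file the question asks about.
// Client sketch: ask axios for a Blob so binary data survives intact.
// config() is the asker's own helper; spreading it here is an assumption.
axios.post('some-url/send-some-file', { id: 'my id' }, { ...config(), responseType: 'blob' })
  .then(res => {
    const url = window.URL.createObjectURL(res.data);
    const link = document.createElement('a');
    link.href = url;
    link.setAttribute('download', 'myfile.myextension');  // the custom extension from the question
    document.body.appendChild(link);
    link.click();
    document.body.removeChild(link);
  })
  .catch(er => {});

// Server sketch: remove the temp file once res.download reports completion.
res.download(filePath, err => {
  // this callback runs after the response has been sent (or has errored)
  fs.unlink(filePath, () => {});
});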

Is there a way to dump a thousand images somewhere and extract them using REST Api?

Here is the thing:
I have over a thousand images saved locally on my Mac. I have a landing page that mocks an e-commerce deal site. It would be tedious to manually type the src URL into the img tag for a thousand pictures. So I thought I could dump these images in cloud storage or something similar, use a REST API GET request to retrieve them into response.data, assign that to a $scope variable, and use ng-repeat to bind the images in my landing page view. Is this possible? If not, what are the alternatives? A SQL database?
Appreciate your help. P.S. I am totally a beginner at web development.
Install Node.js. It's JavaScript for the server, which should make it pretty easy since you already know JavaScript.
On a Mac, you can install node like this:
brew install node
Use this Node.js code (credit to codepedia.com, tweaked a little by me):
// include the http, fs, path and url modules
var http = require('http'),
    fs = require('fs'),
    path = require('path'),
    url = require('url'),
    imageDir = './images/';

// create http server listening on port 3333
http.createServer(function (req, res) {
  // use the url module to parse the requested url and get the image name
  var query = url.parse(req.url, true).query;
  var pic = query.image;
  if (typeof pic === 'undefined') {
    getImages(imageDir, function (err, files) {
      var imageList = JSON.stringify(files);
      res.writeHead(200, {'Content-type': 'application/json'});
      res.end(imageList);
    });
  } else {
    // read the image using fs and send the image content back in the response
    fs.readFile(imageDir + pic, function (err, content) {
      if (err) {
        res.writeHead(400, {'Content-type': 'text/html'});
        console.log(err);
        res.end("No such image");
      } else {
        // specify that the content type of the response will be an image
        res.writeHead(200, {'Content-type': 'image/jpg'});
        res.end(content, "binary");
      }
    });
  }
}).listen(3333);

console.log("Server running at http://localhost:3333/");

// get the list of jpg files in the image dir
function getImages(imageDir, callback) {
  var fileType = '.jpg',
      files = [], i;
  fs.readdir(imageDir, function (err, list) {
    for (i = 0; i < list.length; i++) {
      if (path.extname(list[i]) === fileType) {
        files.push(list[i]); // store the file name in the files array
      }
    }
    callback(err, files);
  });
}
Run this from the command line to start you new image server (assuming you named the file "server.js"):
node server.js
You should see this text appear on the command line:
Server running at http://localhost:3333/
You can quickly test it by going to that address in your browser; you should see a JSON array of all the filenames in the "./images" directory. By the way, this program assumes you're putting the images folder in the same directory as "server.js". You can put the images directory anywhere and just change the "imageDir" variable's path.
Now you can load the list of files from Angular using this code in your controller:
$http.get("http://localhost:3333").then(function(response) {
  $scope.images = response.data;
});
In your view, you can now use an ng-repeat like this to display all the images:
<div ng-repeat="image in images" style="padding: 8px">
  <img ng-src="http://localhost:3333/?image={{ image }}">
</div>
Note: this will work if you run it locally on your Mac or if you upload all the images to a server on which you can use Node.js.
