Downloading a .tar.gz file from a Node.js server using the MEAN stack - AngularJS

I am trying to download a .tar.gz file from the server, which gets created at run time. Here is the code on the server side:
function downloadFile(req, res) {
  // some code to generate the tar file
  var file = ... code to compute the path ...
  res.download(file);
}
The method that calls this from the client side looks like this:
continueWithApplication(app) {
  this.$http.get('/api/applications/get-agent/' + app._id + '/' + this.nodeName).then(res => {
    var data = new Blob([res.data], { type: 'application/x-gzip' });
    this.FileSaver.saveAs(data, 'agent-1.0.tar.gz');
  })
  .catch(err => {
    alert('error downloading agent.');
  });
}
I am using angular-file-saver to get the file downloaded.
I can see the file getting downloaded, but when I try to untar it, it doesn't extract the content; instead it creates a file with a .cpgz extension. I have verified that the file generated on the server side is a valid .tar.gz file by unzipping it. The screenshot below shows what happens when I try to untar the downloaded file (agent-1.0.tar.tar.gz -> agent-1.0.tar.gz.cpgz).
Any idea what I am doing wrong? Any pointer is deeply appreciated.
P.S. Please pardon my limited knowledge of Angular and the MEAN stack.

Related

Cut video before uploading in Reactjs

I can use ffmpeg in JS, but how can I use this code in React?
const ffmpegPath = require('@ffmpeg-installer/ffmpeg').path
const ffmpeg = require('fluent-ffmpeg')
ffmpeg.setFfmpegPath(ffmpegPath)
ffmpeg('video.mp4')
  .setStartTime('00:00:03')
  .setDuration('10')
  .output('video_out.mp4')
  .on('end', function(err) {
    if (!err) { console.log('conversion Done') }
  })
  .on('error', function(err) {
    console.log('error: ', err)
  })
  .run()
If I understand correctly, you want to change a video file before uploading it?
I'm afraid this is pretty hard to do in the browser. Browsers usually don't have easy access to the local file system of the computer and have trouble reading and writing to disk.
The code you have included is meant for a Node environment. A hint is the use of the require function on lines 1 & 2, as Node provides this function natively.
My proposed solution would be:
User uploads original (full-length) video to your server through a react app.
Server processes the file (using a Node environment and the code that you have copied) and removes the first three seconds; a rough sketch follows below. A tutorial on how to receive video uploads in Node.js can be found here: froala nodejs video upload
Server saves the new file & deletes the original
I hope this helps a bit.
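To make steps 2 and 3 concrete, here is a minimal sketch of what the server side could look like, assuming an Express app with multer handling the upload; the route path, field name, port, and output location are illustrative assumptions, not part of the original question:

const express = require('express');
const multer = require('multer');
const ffmpeg = require('fluent-ffmpeg');
const fs = require('fs');

const app = express();
const upload = multer({ dest: 'uploads/' }); // temporary storage for the original upload

// Receive a video upload and trim the first three seconds with fluent-ffmpeg.
app.post('/upload', upload.single('video'), (req, res) => {
  const trimmedPath = req.file.path + '_trimmed.mp4';
  ffmpeg(req.file.path)
    .setStartTime('00:00:03')            // skip the first three seconds
    .output(trimmedPath)
    .on('end', () => {
      fs.unlink(req.file.path, () => {}); // delete the original file (step 3)
      res.json({ file: trimmedPath });
    })
    .on('error', err => res.status(500).send(err.message))
    .run();
});

app.listen(3000);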

Downloading an Excel file causes it to corrupt

I have a simple service in Angular 2 and TypeScript that requests Excel files from a server and then opens a download file dialogue for the user. However, as it currently stands, the file becomes corrupt when downloaded.
When downloaded, it opens fine in OpenOffice and derivatives, but throws a "File is Corrupt" error in Microsoft Excel, which asks if the user wants to recover as much as it can.
When Excel is prompted to recover the file, it does so successfully, and the recovered Excel has all the rows and data that are expected for the file. Comparing the recovered file against the file opened in OpenOffice and derivatives shows no notable differences.
The concrete Excel I am trying to download is generated with Apache POI in a microservice, then passed to the main backend and finally served to the frontend for the user to download. Both the backend and microservice are written in Java, through Spark Framework.
I ran some tests on the backends and concluded that the problem is neither the report generation nor the data transfer:
Asking the microservice to save the generated Excel in a file within the server and then opening such file (hereby file A) in Excel shows that file A is not corrupted.
Asking the main backend server to save the Excel file that it receives from the microservice in a file within itself and then opening such file in Excel (hereby file B) shows that file B is not corrupted.
Downloading both file A and file B through FileZilla from their respective servers yields completely uncorrupted files.
As such, I believe it is safe to assume the Excel file becomes corrupted somewhere between the time it is received on the frontend and the time the user downloads it. Additionally, the Catalina logs do not show any error that might potentially be happening.
I have read several posts that deal with the issue, including a bug report (https://github.com/angular/angular/issues/14083) that included a workaround via XMLHttpRequest. However, none of the workarounds detailed were successful in solving my issue.
Attached is the code I am using both to obtain the Excel file from the backend and to serve it to the user. I am including both an XMLHttpRequest and an Angular http call (within comments), since those are the two main ways I have been trying to make this work. Additionally, please take into account that the code has been altered to remove information I do not wish to make public.
download(body) {
  let reply = Observable.create(observer => {
    let xhr = new XMLHttpRequest();
    xhr.open('POST', 'URL', true);
    xhr.setRequestHeader('Content-type', 'application/json;charset=UTF-8');
    xhr.setRequestHeader('Accept', 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
    xhr.setRequestHeader('Authorization', 'REDACTED');
    xhr.responseType = 'blob';
    xhr.onreadystatechange = function () {
      if (xhr.readyState === 4) {
        if (xhr.status === 200) {
          var contentType = 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet';
          var blob = new Blob([xhr.response], { type: contentType });
          observer.next(blob);
          observer.complete();
        } else {
          observer.error(xhr.response);
        }
      }
    };
    xhr.send(JSON.stringify(body));
  });
  return reply;

  /*let headers = new Headers();
  headers.set("Authorization", 'REDACTED');
  headers.set("Accept", 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
  let requestOptions: RequestOptions = new RequestOptions({headers: headers, responseType: ResponseContentType.Blob});
  return this.http.post('URL', body, requestOptions);*/
}
Here is the code that prompts the user to download the Excel file. It is currently set up to work with the XMLHttpRequest. Please note that I have also attempted to download without resorting to FileSaver, with no luck.
downloadExcel(data) {
  let body = {
    /*REDACTED*/
  }
  this.service.download(body)
    .subscribe(data => {
      FileSaver.saveAs(data, "Excel.xlsx");
    });
}
Here are the versions of the tools I am using:
NPM: 5.6.0
NodeJs: 8.11.3
Angular JS: ^6.1.0
Browsers used: Chrome, Firefox, Edge.
Any help on this issue would be appreciated. Any additional information you may need I will be happy to provide.
I think what you want is the CSV format, which opens in Excel; update your service as follows.
You should tell Angular that you are expecting a response of type blob (Binary Large Object), i.e. your Excel/CSV file.
Also make sure the URL/API on your server is set to accept content-type='text/csv'.
Here's an example with Angular 2.
import { Injectable } from '@angular/core';
import { Http, Headers, ResponseContentType } from '@angular/http';
import * as FileSaver from 'file-saver';

@Injectable()
export class YourService {
  constructor(private http: Http) {}

  download() { // get the file from the server
    this.http.get("http://localhost/..", {
      responseType: ResponseContentType.Blob,
      headers: new Headers({ 'Content-Type': 'text/csv' })
    }).subscribe(
      response => {
        var blob = new Blob([response.blob()], { type: 'text/csv' });
        FileSaver.saveAs(blob, 'yourFileName.csv');
      },
      error => {
        console.error('something went wrong');
      }
    );
  }
}
Have you tried uploading/downloading your xls file as base64?
var encodedXLSToUpload = 'data:application/xls;base64,' + btoa(file);
Check this for more details: Creating a Blob from a base64 string in JavaScript
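For the download direction, a minimal sketch of turning a base64 string back into a Blob on the client could look like this; the encodedXLS variable and the MIME type are illustrative assumptions, not from the original answer:

// Hypothetical helper: decode a base64 string into a Blob so FileSaver can save it.
function base64ToBlob(base64, contentType) {
  var binary = atob(base64);                    // base64 -> binary string
  var bytes = new Uint8Array(binary.length);
  for (var i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i);            // copy each character code into the byte array
  }
  return new Blob([bytes], { type: contentType });
}

// Usage (encodedXLS is assumed to be the base64 payload returned by the server):
var blob = base64ToBlob(encodedXLS, 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
FileSaver.saveAs(blob, 'Excel.xlsx');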

How to convert files to a .RAR file in Node.js

In my project I have a table of projects. For each project there is a column for downloading a PDF file. Now I want to be able to download all the files as a single .rar file. Here is the code for downloading a single file:
routes.js
app.get('/api/download/archive/:filename', function(req, res) {
  res.download("public/uploads/" + req.params.filename, req.params.filename);
})
archive.js
$scope.downloadPdf = function(obj) {
  $http.get('api/download/archive/' + obj.Documentation)
    .success(function(data) {
      window.open('api/download/archive/' + obj.Documentation)
    });
}
Unfortunately, RAR is closed-source software, so the only way to create an archive is to install the command-line utility called rar and then run the rar a command in a child process to compress the files.
To install rar on Mac I had to run brew install homebrew/cask/rar. You can find the installation instructions for other platforms here.
After you install it you can make use of child_process like this:
const { exec } = require('child_process');
const { promisify } = require('util');
const fs = require('fs');
const path = require('path');

// Promisify `unlink` and `exec`, as by default they accept callbacks
const unlinkAsync = promisify(fs.unlink);
const execAsync = promisify(exec);

(async () => {
  // Generate a different name each time to avoid any possible collisions
  const archiveFileName = `temp-archive-${(new Date()).getTime()}.rar`;
  // The files that are going to be compressed
  const filePattern = `*.jpg`;

  // Use the `rar` utility in a separate process
  await execAsync(`rar a ${archiveFileName} ${filePattern}`);
  // If no error was thrown, the archive has been created
  console.log('Archive has been successfully created');
  // Now we can allow downloading it

  // Delete the archive when it's not needed anymore
  // await unlinkAsync(path.join(__dirname, archiveFileName));
  console.log('Deleted an archive');
})();
In order to run the example please put some .jpg files into the project directory.
PS: If you choose a different archive format (like .zip) you could make use of something like archiver. That would let you create a zip stream and pipe it to the response directly, so you would not need to create any files on disk; a rough sketch follows below.
But that's a matter of a different question.
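For completeness, a minimal sketch of the archiver approach could look like the following; the route path and file names are illustrative assumptions:

// Stream a .zip of several PDFs straight to the HTTP response, with no temp files on disk.
// Assumes `express` and `archiver` are installed; route and file names are placeholders.
const express = require('express');
const archiver = require('archiver');
const path = require('path');

const app = express();

app.get('/api/download/archive-all', function (req, res) {
  res.attachment('documents.zip');          // sets Content-Disposition for the download
  const archive = archiver('zip');
  archive.on('error', err => res.status(500).send(err.message));
  archive.pipe(res);                        // pipe the zip stream directly to the response
  // Add whichever files need to be bundled (hard-coded here for the example)
  archive.file(path.join('public/uploads', 'doc1.pdf'), { name: 'doc1.pdf' });
  archive.file(path.join('public/uploads', 'doc2.pdf'), { name: 'doc2.pdf' });
  archive.finalize();                       // no more entries; finish the stream
});

app.listen(3000);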
Try WinrarJs.
The project lets you create a RAR file, read a RAR file, and extract a RAR archive.
Here's the sample from GitHub:
/* Create a RAR file */
// assuming the module is available as 'winrarjs'; adjust the name to your install
var Winrar = require('winrarjs');

var winrar = new Winrar('/path/to/file/test.txt');
// add more files
winrar.addFile('/path/to/file/test2.txt');
// add multiple files
winrar.addFile(['/path/to/file/test3.txt', '/path/to/file/test4.txt']);
// set output file
winrar.setOutput('/path/to/output/output.rar');
// set options
winrar.setConfig({
  password: 'testPassword',
  comment: 'rar comment',
  volumes: '10', // split volumes in Mega Byte
  deleteAfter: false, // delete file after rar process completed
  level: 0 // compression level 0 - 5
});
// archiving file
winrar.rar().then((result) => {
  console.log(result);
}).catch((err) => {
  console.log(err);
});
Unfortunately, Node.js doesn't natively support RAR compression/decompression. I was frustrated with this too, so I created a module called "super-winrar" that makes it super easy to deal with rar files in Node.js :)
check it out: https://github.com/KiyotakaAyanokojiDev/super-winrar
Example: creating a rar file "pdfs.rar" and appending all the PDF files into it:
const Rar = require('super-winrar');
const rar = new Rar('pdfs.rar');
rar.on('error', error => console.log(error.message));
rar.append({ files: ['pdf1.pdf', 'pdf2.pdf', 'pdf3.pdf'] }, async (err) => {
  if (err) return console.log(err.message);
  console.log('pdf1, pdf2 and pdf3 got successfully put into rar file!');
  rar.close();
});

Downloaded .pdf files are corrupted when using expressjs

I am working on a MEAN.js application generated using https://github.com/DaftMonk/generator-angular-fullstack. I am trying to generate a .pdf file using PhantomJS and download it to the browser.
The issue is that the downloaded .pdf file always shows blank pages regardless of the number of pages. The original file on the server is not corrupt. When I investigated further, I found that the downloaded file is always much larger than the original file on disk. Also, this issue happens only with .pdf files; other file types are working fine.
I've tried several methods like res.redirect('http://localhost:9000/assets/exports/receipt.pdf');, res.download('client\\assets\\exports\\receipt.pdf'),
var fileSystem = require('fs');
var stat = fileSystem.statSync('client\\assets\\exports\\receipt.pdf');

res.writeHead(200, {
  'Content-Type': 'application/pdf',
  'Content-Length': stat.size
});

var readStream = fileSystem.createReadStream('client\\assets\\exports\\receipt.pdf');
return readStream.pipe(res);
and I've even tried https://github.com/expressjs/serve-static, with no change in the result.
I am new to Node.js. What is the best way to download a .pdf file to the browser?
Update:
I am running this on a Windows 8.1 64-bit computer.
I had corruption when serving static pdfs too. I tried everything suggested above. Then I found this:
https://github.com/intesso/connect-livereload/issues/39
In essence, the usually excellent connect-livereload (package ~0.4.0) was corrupting the PDF.
So just get it to ignore pdfs via:
app.use(require('connect-livereload')({ignore: ['.pdf']}));
now this works:
app.use('/pdf', express.static(path.join(config.root, 'content/files')));
...great relief.
Here is a clean way to serve a file from Express; it uses an attachment header to make sure the file is downloaded:
var path = require('path');
var mime = require('mime');
var fs = require('fs');

app.get('/download', function(req, res) {
  // Here do whatever you need to get your file
  var filename = path.basename(file);
  var mimetype = mime.lookup(file);

  res.setHeader('Content-disposition', 'attachment; filename=' + filename);
  res.setHeader('Content-type', mimetype);

  var filestream = fs.createReadStream(file);
  filestream.pipe(res);
});
There are a couple of ways to do this:
If the file is a static one (like a brochure, readme, etc.), then you can tell Express that a particular folder has static files (and should be available directly) and keep the file there. This is done using the static middleware:
app.use(express.static(pathtofile));
Here is the link: http://expressjs.com/starter/static-files.html
Now you can directly open the file using the url from the browser like:
window.open('http://localhost:9000/assets/exports/receipt.pdf');
or
res.redirect('http://localhost:9000/assets/exports/receipt.pdf');
should be working.
The second way is to read the file; the data will come as a buffer. Actually, it should be recognised if you send it directly, but you can try converting it to base64 encoding using:
var base64String = buf.toString('base64');
then set the content type:
res.writeHead(200, {
  'Content-Type': 'application/pdf',
  'Content-Length': stat.size
});
and send the data as response.
I will try to put an example of this.
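For completeness, a minimal sketch of that second approach (reading the file into a buffer and sending it with explicit headers, without any base64 encoding) might look like this; the route and file path are illustrative assumptions:

// Read the PDF into a buffer and send it with explicit headers.
// Route name and file path are placeholders, not from the original answer.
var fs = require('fs');

app.get('/api/download/receipt.pdf', function (req, res) {
  var filePath = 'client/assets/exports/receipt.pdf'; // illustrative path
  fs.readFile(filePath, function (err, data) {
    if (err) return res.status(500).send(err.message);
    res.writeHead(200, {
      'Content-Type': 'application/pdf',
      'Content-Length': data.length   // length of the buffer we are about to send
    });
    res.end(data); // send the raw buffer; no base64 encoding required
  });
});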
EDIT: You don't even need to encode it. You may still try that, but I was able to make it work without encoding it.
Plus, you also do not need to set the headers; Express does it for you. Following is the snippet of API code written to get the PDF in case it is not public/static. You need an API to serve the PDF:
router.get('/viz.pdf', function(req, res) {
  require('fs').readFile('viz.pdf', function(err, data) {
    res.send(data);
  })
});
Lastly, note that the URL for getting the PDF has the .pdf extension in it; this is for the browser to recognise that the incoming file is a PDF. Otherwise it will save the file without any extension.
Usually, if you are using Phantom to generate a PDF, the file will be written to disk and you have to supply the path and a callback to the render function.
router.get('/pdf', function(req, res) {
  // phantom initialization and generation logic
  // supposing you have the generation code above
  page.render(filePath, function (err) {
    var filename = 'myFile.pdf';
    res.setHeader('Content-type', "application/pdf");
    fs.readFile(filePath, function (err, data) {
      // if the file was read into the buffer without errors you can delete it to save space
      if (err) throw err;
      fs.unlink(filePath, function () {});
      // send the file contents
      res.send(data);
    });
  });
});
I don't have experience with the frameworks that you have mentioned, but I would recommend using a tool like Fiddler to see what is going on. For example, you may not need to add a Content-Length header since you are streaming and your framework does chunked transfer encoding, etc.

How to load an external JSON file using Karma + Jasmine for AngularJS testing?

Can anyone provide me an example in Plunker of how to load a JSON file for a Karma/Jasmine test? I want to read the data from a JSON file for the test cases I am writing. I have been searching, but nowhere is there a clear example of how to do it. I would appreciate it if anyone could provide an example.
You can load an external JSON data file using require:
var data = require('./data.json');
console.log(data);
// Your test cases go here and you can use the data object
Set the path to find your file; in this case my file (staticData.json) is located under the /test folder.
jasmine.getFixtures().fixturesPath = 'base/test/';
staticData = JSON.parse(jasmine.getFixtures().read("staticData.json"));
You also have to add the pattern in the karma.conf.js file, something like:
{ pattern: 'test/**/*.json', included: false, served: true}
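For context, here is a minimal sketch of how the relevant part of karma.conf.js might look, assuming jasmine-jquery is what provides jasmine.getFixtures(); the paths are illustrative, not from the original answer:

// karma.conf.js (excerpt) - a sketch with illustrative paths, not a verbatim config.
module.exports = function (config) {
  config.set({
    frameworks: ['jasmine'],
    files: [
      // fixture helpers such as jasmine-jquery must be loaded before the specs
      'node_modules/jasmine-jquery/lib/jasmine-jquery.js',
      'test/**/*.spec.js',
      // serve the JSON fixtures without including them as script files
      { pattern: 'test/**/*.json', included: false, served: true }
    ]
  });
};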
Do you want to read the JSON file from a web server or from the local file system? No one can give an example of loading from the local file system in Plunker, since it runs in a web browser and is denied access to the file system.
Here is an example of how to load a JSON file from disk in any Node.js program; this should work for Karma/Jasmine:
var fs = require('fs');
var filename = './test.json';

fs.readFile(filename, 'utf8', function (err, data) {
  if (err) {
    console.log('Error: ' + err);
    return;
  }

  data = JSON.parse(data);
  console.dir(data);
});
