I've had a problem recently with users trolling and then deleting images before I can see what they are, so I'm creating a log that downloads everything. (Yes, I've already required fs.) For some reason, though, when writing the file, the file is only 9 bytes big (and the content is just "undefined"). Please help.
var attachment = (message.attachments).array();
attachment.forEach(function(attachment) {
console.log(attachment.url);
tempName = attachment.url.split("/");
attachName = tempName[tempName.length-1]
console.log(attachName);
fs.writeFileSync(dir + "/" + attachName, attachment.file, (err) => {
// throws an error, you could also catch it here
if (err) throw err;
// success case, the file was saved
console.log('attachment saved!');
});
theLog += '<img src="'+ "attachments/" + message.channel.name + "/" + attachName + '"> \n';
//theLog += '<img src="'+ attachment.url + '"> \n';
})
Let's start by answering why it saves the file as "undefined".
If you check the docs for MessageAttachment, message.attachments.first().file is undefined. There are fileName and fileSize properties, but no file.
To save the file you can do one of two things:
Saving the URLs
You can save the url in an array in a JSON file like so:
JSON FILE
{
"images":[]
}
JS FILE
let fs = require('fs');
let imgs = require(JSON_FILE); // JSON_FILE is the path to the JSON file above
imgs.images.push(attachment.url);
fs.writeFile(JSON_FILE, JSON.stringify(imgs, null, 4), (err) => {
    if (err) throw err;
});
Saving the image itself
You can use the request module to pull images from a URL.
JS FILE
//Start of code
let request = require(`request`);
let fs = require(`fs`);
//Later
request.get(attachment.url)
.on('error', console.error)
.pipe(fs.createWriteStream(`Img-${Date.now()}`)); // "Img-${Date.now()}" gives each file a unique, timestamp-based name.
EDIT: request is deprecated; it's been replaced by fetch. I can't confirm this code works with fetch, but the underlying principle is the same.
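If you're on Node 18 or newer, a minimal sketch of the same idea with the built-in fetch might look like this (the saveAttachment helper, destination name, and error handling are illustrative, not part of the original code):
const fs = require('fs');
const { Readable } = require('stream');
const { pipeline } = require('stream/promises');

// A sketch, assuming global fetch (Node 18+) and core Node stream utilities.
async function saveAttachment(url, dest) {
    const res = await fetch(url);
    if (!res.ok) throw new Error(`Request failed: ${res.status}`);
    // res.body is a web ReadableStream; convert it to a Node stream and pipe it to disk
    await pipeline(Readable.fromWeb(res.body), fs.createWriteStream(dest));
}

saveAttachment(attachment.url, `Img-${Date.now()}`).catch(console.error);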
I ended up solving it with a tiny function. Thanks everyone (especially the guy asking what a variable was... that was super helpful)
function downloadAttachment(url, dest, hash){
console.log('initiating download of '+ url +'...');
request(url).pipe(fs.createWriteStream(dest));
}
the "hash" variable is not used right now. I was hungry and craving corned beef hash...
I have the following intent for an Alexa Skill and I need to read a .txt file from an external URL into a variable for Alexa to say it. This is what I have so far...
'PlayVoice': function() {
var url = "https://example.com/myfile.txt";
var output = 'Okay, here is the text file' + url;
this.response.speak(output);
this.emit(':responseReady');
},
Obviously, the only thing it does now is read out the actual URL.
I have tried using fs.readFile but I just get an error in the Alexa Skill. This is the code I tried:
'PlayVoice': function() {
var content;
fs.readFile('https://example.com/myfile.txt', function read(err, data) {
content = data;
this.response.speak(content);
});
this.emit(':responseReady');
},
Any help on how to simply read a text file into a variable I can get Alexa to speak via this.response.speak?
You can use the request package.
Something like this should help:
var request = require('request');
request('url/of/the/file', function (error, response, body) {
console.log('error:', error); // Print the error if one occurred
console.log('statusCode:', response && response.statusCode); // Print the response status code if a response was received
console.log('body:', body); // contents of your file.
});
source : https://www.npmjs.com/package/request#super-simple-to-use
Also, you'll need to add the request package to your skill's lambda.
To do that, install the request package in the folder where your code is (lambda_function.js and all the other files). Then create a zip of all the files (not the folder the files are in) and upload it to your AWS Lambda.
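As a rough sketch of how that might plug into the intent handler from the question (assuming the same alexa-sdk style handlers and var request = require('request'); at the top of the file):
'PlayVoice': function() {
    var self = this; // keep a reference, since `this` changes inside the callback
    request('https://example.com/myfile.txt', function (error, response, body) {
        if (error || response.statusCode !== 200) {
            self.response.speak('Sorry, I could not read the file.');
        } else {
            self.response.speak(body); // body contains the text file's contents
        }
        self.emit(':responseReady');
    });
},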
I am trying to save a variable's data into a text file and update the file every time the variable changes. I found solutions in Node.js and vanilla JavaScript but I cannot find a particular solution in React.js.
Actually, I am trying to store a Facebook long-lived access token in a text file and would like to use it in the future, but when I try importing 'fs' and implementing the createFile and appendFile methods I get an error saying the method doesn't exist.
Please help me out. Here is the code below
window.FB.getLoginStatus((resp) => {
if (resp.status === 'connected') {
const accessToken = resp.authResponse.accessToken;
try {
axios.get(`https://graph.facebook.com/oauth/access_token?client_id=CLIENT_id&client_secret=CLIENT_SECRET&grant_type=fb_exchange_token&fb_exchange_token=${accessToken}`)
.then((response) => {
console.log("Long Live Access Token " + response.data.access_token + " expires in " + response.data.expires_in);
let longLiveAccessToken = response.data.access_token;
let expiresIn = response.data.expires_in;
})
.catch((error) => {
console.log(error);
});
}
catch (e) {
console.log(e.description);
}
}
});
React is a frontend library. It's meant to be executed in the browser, which for security reasons does not have access to the file system. You can make React render on the server, but the example code you're showing is clearly frontend code: it uses the window object. It doesn't even include anything React-related at first sight; it mainly consists of an Ajax call to Facebook made via the Axios library.
So your remaining options are basically these:
Create a text file and let the user download it.
Save the file content in local storage for later access from the same browser.
Save the contents in online storage (which could also be localhost).
Could you specify which of these methods would fit your needs, so I can explain it further with sample code if needed?
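For instance, here is a rough browser-side sketch of options 1 and 2 (the names are illustrative, and it assumes the longLiveAccessToken variable from your .then() callback is in scope):
// Option 2: keep the token in local storage for later visits from the same browser
localStorage.setItem('fbLongLivedToken', longLiveAccessToken);
// ...later...
const savedToken = localStorage.getItem('fbLongLivedToken');

// Option 1: build a text file on the fly and let the user download it
function downloadAsTextFile(filename, text) {
    const blob = new Blob([text], { type: 'text/plain' });
    const a = document.createElement('a');
    a.href = URL.createObjectURL(blob);
    a.download = filename;
    a.click();
    URL.revokeObjectURL(a.href);
}
downloadAsTextFile('token.txt', longLiveAccessToken);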
I have a JSON array of objects that is a result of a function in nodejs. I use json2xls to convert that to an excel file, and it downloads to the server (not in a public folder, and is formatted correctly in Excel).
I would like to send a response to the frontend with the json results (to display as a preview) and show a button they can click to download the xlsx file OR display the JSON results and automatically download the file.
But I can't get it, and I've tried so many things I'm going crazy.
My controller code (the part that creates the xls file):
var xls = json2xls(results,{});
var today = (new Date()).toDateString('yyyy-mm-dd');
var str = today.replace(/\s/g, '');
var fileName = "RumbleExport_"+ str +".xlsx";
var file = fs.writeFileSync(fileName,xls,'binary');
res.download('/home/ubuntu/workspace/'+file);
The frontend controller:
vm.exportData = function(day, event, division) {
console.log('Export registrations button pressed.', vm.export);
//send the search parameters to the backend to run checks
$http.post('/api/exportData', vm.export).then(function(response){
vm.results = response.data;
console.log("Results",response);
vm.exportMessage = "Found " + vm.results.length + " registrations.";
})
.catch(function(error){
vm.exportError = error.data;
});
};
The view:
//display a button to download the export file
<a target="_self" file="{{vm.results}}" download="{{vm.results}}">Download Export File</a>
Someone please put me out of my misery. Of all the classes I've taken, none covered this.
I FINALLY got it! And since I searched forever trying to make something work, I'll share the answer:
On the backend:
//save the file to the public/exports folder
var file = fs.writeFileSync('./public/exports/'+fileName,xls,'binary');
//send the results to the frontend
res.status(200).json({results: results, fileName: fileName});
On the frontend, use HTML to download a link to the file:
<a href="exports/{{fileName}}" download>Save File</a>
I am working on meanjs application generated using https://github.com/DaftMonk/generator-angular-fullstack. I am trying to generate a .pdf file using phantomjs and download it to the browser.
The issue is that the downloaded .pdf file always shows the blank pages regardless of the number of pages. The original file on server is not corrupt. When I investigated further, found that the downloaded file is always much larger than the original file on the disk. Also this issue happens only with .pdf files. Other file types are working fine.
I've tried several methods, like res.redirect('http://localhost:9000/assets/exports/receipt.pdf'); and res.download('client\\assets\\exports\\receipt.pdf'), as well as:
var fileSystem = require('fs');
var stat = fileSystem.statSync('client\\assets\\exports\\receipt.pdf');
res.writeHead(200, {
'Content-Type': 'application/pdf',
'Content-Length': stat.size
});
var readStream = fileSystem.createReadStream('client\\assets\\exports\\receipt.pdf');
return readStream.pipe(res);
and even I've tried with https://github.com/expressjs/serve-static with no changes in the result.
I am new to nodejs. What is the best way to download a .pdf file to the browser?
Update:
I am running this on a Windows 8.1 64bit Computer
I had corruption when serving static pdfs too. I tried everything suggested above. Then I found this:
https://github.com/intesso/connect-livereload/issues/39
In essence, the usually excellent connect-livereload package (~0.4.0) was corrupting the PDF.
So just get it to ignore PDFs via:
app.use(require('connect-livereload')({ignore: ['.pdf']}));
now this works:
app.use('/pdf', express.static(path.join(config.root, 'content/files')));
...great relief.
Here is a clean way to serve a file from Express; it uses an attachment header to make sure the file is downloaded:
var path = require('path');
var mime = require('mime');
var fs = require('fs');
app.get('/download', function(req, res){
//Here do whatever you need to get your file, and put its path in `file`
var filename = path.basename(file);
var mimetype = mime.lookup(file);
res.setHeader('Content-disposition', 'attachment; filename=' + filename);
res.setHeader('Content-type', mimetype);
var filestream = fs.createReadStream(file);
filestream.pipe(res);
});
There are a couple of ways to do this:
If the file is a static one like a brochure, a readme, etc., then you can tell Express that your folder has static files (and they should be available directly) and keep the file there. This is done using the static middleware:
app.use(express.static(pathtofile));
Here is the link: http://expressjs.com/starter/static-files.html
Now you can directly open the file using the url from the browser like:
window.open('http://localhost:9000/assets/exports/receipt.pdf');
or
res.redirect('http://localhost:9000/assets/exports/receipt.pdf');
should be working.
The second way is to read the file; the data will come as a Buffer. Actually, it should be recognised if you send it directly, but you can try converting it to base64 encoding using:
var base64String = buf.toString('base64');
then set the content type:
res.writeHead(200, {
'Content-Type': 'application/pdf',
'Content-Length': stat.size
});
and send the data as response.
I will try to put an example of this.
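A rough sketch of that approach (the route and path are just examples; as the EDIT below notes, the base64 step turns out to be unnecessary):
var fs = require('fs');

app.get('/receipt.pdf', function (req, res) {
    fs.readFile('client/assets/exports/receipt.pdf', function (err, buf) {
        if (err) return res.sendStatus(500);
        res.writeHead(200, {
            'Content-Type': 'application/pdf',
            'Content-Length': buf.length
        });
        res.end(buf); // sending the raw Buffer works; buf.toString('base64') is optional
    });
});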
EDIT: You don't even need to encode it. You may still try that, but I was able to make it work without encoding it.
Plus, you also do not need to set the headers; Express does it for you. Following is a snippet of the API code written to get the PDF in case it is not public/static. You need an API to serve the PDF:
router.get('/viz.pdf', function(req, res){
require('fs').readFile('viz.pdf', function(err, data){
if (err) return res.sendStatus(500); // bail out if the file could not be read
res.send(data);
})
});
Lastly, note that the URL for getting the PDF has a .pdf extension; this is so the browser recognises that the incoming file is a PDF. Otherwise it will save the file without any extension.
Usually, if you are using phantom to generate a PDF, the file will be written to disk and you have to supply the path and a callback to the render function.
router.get('/pdf', function(req, res){
// phantom initialization and generation logic
// supposing you have the generation code above
page.render(filePath, function (err) {
var filename = 'myFile.pdf';
res.setHeader('Content-type', "application/pdf");
fs.readFile(filePath, function (err, data) {
// if the file was read into the buffer without errors you can delete it to save space
if (err) throw err;
fs.unlink(filePath, function () {}); // fs.unlink requires a callback in current Node versions
// send the file contents
res.send(data);
});
});
});
I don't have experience with the frameworks you've mentioned, but I would recommend using a tool like Fiddler to see what is going on. For example, you may not need to add a Content-Length header, since you are streaming and your framework handles chunked transfer encoding, etc.
Can anyone provide an example in Plunker of how to load a JSON file for a Karma/Jasmine test? I want to read the data from a JSON file in the test cases I am writing. I have been searching, but nowhere have I found a clear example of how to do it. I'd appreciate it if anyone could provide an example.
You can load an external JSON data file using require:
var data = require('./data.json');
console.log(data);
// Your test cases goes here and you can use data object
Set the path to find your file; in this case my file (staticData.json) is located under the /test folder.
jasmine.getFixtures().fixturesPath = 'base/test/';
var staticData = JSON.parse(jasmine.getFixtures().read("staticData.json"));
You also have to add the pattern in the karma.conf.js file, something like:
{ pattern: 'test/**/*.json', included: false, served: true}
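As a sketch of where that entry might sit (file names and paths here are only examples):
// karma.conf.js
module.exports = function (config) {
    config.set({
        files: [
            'test/**/*.spec.js',
            { pattern: 'test/**/*.json', included: false, served: true }
        ]
    });
};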
Do you want to read the JSON file from a web server or the local file system? No one can give an example of loading it from the local file system in Plunker, since Plunker runs in a web browser and is denied access to the file system.
Here is an example of how to load a JSON file from disk in any Node.js program; this should work for Karma/Jasmine:
var fs = require('fs');
var filename = './test.json';
fs.readFile(filename, 'utf8', function (err, data) {
if (err) {
console.log('Error: ' + err);
return;
}
data = JSON.parse(data);
console.dir(data);
});