Convert CSV Array To JSON in Node.js

I am scraping data from a URL that returns a CSV; the scraped data is plain comma-separated rows (shown in a screenshot in the original question).
I am doing this in Node.js using the nodejs-requestify package: https://www.npmjs.com/package/requestify
When I console.log response.getBody(), it is in exactly that same comma-separated format.
I am trying to convert this to a JSON array that I can iterate over in a loop to insert the values into a database, but I am struggling to get the data into JSON format.
I've tried splitting the array in multiple ways (on commas, single quotes, and double quotes). I've tried JSON.parse() and JSON.stringify(), separately and in combination.
Here is the code I'm using. When I console.log rows inside the loop, that is where the data should already be in JSON format; instead it still comes through as comma-separated values.
requestify.get('URL').then(function(response) {
    // Get the response body
    var dataBody = response.getBody();
    var lineArray = dataBody.split('\r\n');
    var data = JSON.parse(JSON.stringify(lineArray));
    for (var s = 0; s < data.length; s++) {
        var rows = data[s];
        console.log(rows);
    }
});

I think there is a basic misunderstanding here.
var lineArray = dataBody.split('\r\n');
lineArray is now an array of plain strings, something like
["a", "b", "c"]
but for something like
var data = JSON.parse(lineArray);
to work, lineArray would have to be a JSON string such as
'{ "a": "1", "b": "2", "c": "3" }'
I think you need something like this for each line:
const lineData = lineArray[s].split(',');   // split a single CSV line into its values
const keys = ["name", "age", "gender"];     // example column names
const jsonLineData = {};
keys.forEach((key, index) => {
    jsonLineData[key] = lineData[index];
});
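Extending that idea to every line, a minimal sketch (assuming the first CSV line holds the column names and that no value contains a quoted comma) could look like this:
// Convert a CSV string into an array of plain objects.
// Assumes: first line = header row, '\r\n' line endings, no quoted commas.
function csvToObjects(dataBody) {
    var lineArray = dataBody.split('\r\n').filter(function (line) {
        return line.trim().length > 0;              // skip empty trailing lines
    });
    var keys = lineArray[0].split(',');             // header row: column names
    return lineArray.slice(1).map(function (line) {
        var values = line.split(',');
        var row = {};
        keys.forEach(function (key, index) {
            row[key.trim()] = values[index];
        });
        return row;
    });
}
// Usage inside the requestify callback:
// var data = csvToObjects(response.getBody());
// data.forEach(function (row) { console.log(row); });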

I solved this by using csvtojson and aws-sdk, since my CSV is hosted on S3.
// Requires: const AWS = require('aws-sdk'); const csv = require('csvtojson');
// config and retrievePromise are this project's own credential helpers.
async function startAWS(db) {
    // Retrieve AWS IAM credentials for the 'master' user
    var awsCredentials;
    try {
        awsCredentials = await retrievePromise(config.get('aws'));
    }
    catch (e) {
        console.log({ error: e }, 'startAWS error');
    }
    // Set up the AWS config to access our S3 bucket
    AWS.config = new AWS.Config({
        accessKeyId: awsCredentials.principal,
        secretAccessKey: awsCredentials.credential,
        region: 'us-east-1'
    });
    // Call S3 and specify bucket and file name
    const S3 = new AWS.S3();
    const params = {
        Bucket: '***',
        Key: '***' // filename
    };
    // Convert the CSV file to JSON
    async function csvToJSON() {
        // get the csv file from S3 as a read stream
        const stream = S3.getObject(params).createReadStream();
        // convert the csv stream to an array of JSON objects
        const json = await csv().fromStream(stream);
        // connect to the DB and continue the script
        db.getConnection()
            .then(async (conn) => {
                if (json.length) {
                    for (var s = 0; s < json.length; s++) {
                        var rows = json[s];
                        const insert = await conn.query(
                            'SQL HERE'
                        );
                    }
                }
            });
    }
    csvToJSON();
}
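For the original question's simpler case (a CSV string already in memory rather than an S3 stream), the same csvtojson package can parse a string directly; a minimal sketch, assuming the response body is plain CSV with a header row:
const csv = require('csvtojson');

requestify.get('URL').then(async function (response) {
    // fromString() resolves to an array of objects keyed by the CSV header row
    const rows = await csv().fromString(response.getBody());
    for (const row of rows) {
        console.log(row);   // each row is already a plain JSON object
    }
});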

Related

How to properly store an uploaded file from multipart/form-data in Rust?

I'm trying to build a web server in Rust, and I'm having a few issues trying to upload files to the server. Text-based files upload fine, but whenever I try to upload other types of media (images, videos, etc.), a small enough file will save, but corrupted, as shown in the raw-data comparison below.
Original file raw data (screenshot)
File saved on the server, raw data (screenshot)
And when the file is too big, the multer-rs library panics with "received with incomplete data".
Error log (screenshot)
async fn parse_body(content_type: Option<&String>, body: String) -> HashMap<String, String> {
    match content_type {
        Some(content_type) => {
            let ct = content_type.as_str();
            if ct.contains("application/x-www-form-urlencoded") {
                let buffer = body.replace("\r\n\r\n", "");
                let _body = from_bytes::<Vec<(String, String)>>(buffer.as_bytes()).unwrap();
                return _body.into_iter().collect();
            }
            if ct.contains("multipart/form-data") {
                let boundary = multer::parse_boundary(ct).unwrap();
                let data = once(async move { Result::<Bytes, Infallible>::Ok(Bytes::from(body)) });
                let mut multipart = multer::Multipart::new(data, boundary);
                let mut _body: HashMap<String, String> = HashMap::new();
                // Iterate over the fields, use `next_field()` to get the next field.
                while let Some(mut field) = multipart.next_field().await.unwrap() {
                    // Get field name.
                    let name = field.name().unwrap().to_string();
                    // Get the field's filename if provided in the "Content-Disposition" header.
                    //
                    // Process the field data chunks, e.g. store them in a file.
                    while let Some(chunk) = field.chunk().await.unwrap() {
                        // Do something with the field chunk.
                        if let Some(file_name) = field.file_name() {
                            let file_dir = format!("src\\static\\temp\\{}", file_name);
                            let current_dir: &Path = Path::new(&file_dir);
                            let path = env::current_dir().unwrap().join(current_dir);
                            if let Ok(mut file) = std::fs::File::create(path) {
                                file.write_all(&chunk).unwrap();
                            }
                        } else {
                            _body.insert(name.clone(), String::from_utf8(chunk.to_vec()).unwrap());
                        }
                    }
                }
                return _body;
            }
        },
        None => return HashMap::new()
    }
    HashMap::new()
}

Fetch Data from nested props in ReactJS

I am uploading data from an Excel file using react-excel-renderer, storing the renderer response (columns and rows) in state, and passing it to another component.
Expected use case: I fetch the data from Excel via the renderer and store the values in state (rows). I then pass that state to another component, where I need these values so I can pass them to an API.
The stored data is nested. How can I access the individual values stored inside the arrays in props? A screenshot is attached in the original question.
Excel Render code:-
changeHandler(event) {
    let fileObj = event.target.files[0];
    // just pass the fileObj as parameter
    ExcelRenderer(fileObj, (err, resp) => {
        if (err) {
            console.log(err);
        } else {
            this.setState({
                cols: resp.cols,
                rows: resp.rows,
            });
        }
    });
}
Code to fetch the prop data:
for (let i = 0; i < this.props.data.length; i++) {
    let stDate = this.props.data[i].startDate;
    let TripName = this.props.data[i].TripName;
    let totalFare = this.props.data[i].totalFare;
    let FirstName = this.props.data[i].FirstName;
    let LastName = this.props.data[i].LastName;
    let Currency = this.props.data[i].Currency;
}
You can use an array method.
I'm still not 100% sure what your final data should look like, but it seems you're working with two arrays: one holding the keys and one holding the values.
To combine the two you can use reduce:
const keys = data[0];
const values = data[1];
const result = keys.reduce((acc, key, index) => {
    return { ...acc, [key]: values[index] };
}, {});
That will return an object of key/value pairs.
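If the goal is to do that for every row coming out of react-excel-renderer, a short sketch along the same lines, assuming this.props.data holds resp.rows and that its first row contains the column names:
// this.props.data is assumed to be resp.rows from ExcelRenderer:
// an array of arrays whose first row holds the column names.
const [header, ...body] = this.props.data;

const records = body.map((row) =>
    header.reduce((acc, key, index) => ({ ...acc, [key]: row[index] }), {})
);

// records[0].startDate, records[0].TripName, ... are now available to send to the API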

How to convert Blob back into a file with nodejs?

I'm currently working on an application that lets the user upload a file, which is sent to an Express server that converts the file into a byte array I can store somewhere else.
However, I need to be able to convert this byte array back into a file and send it back to the user. This is my current code in the Express API:
app.post("/upload", async (req, res) => {
    const file = req.files.file;
    const filePath = "divali";
    file.mv(filePath, async (err) => {
        const nfile = fs.readFileSync(filePath);
        let fileData = nfile.toString("hex");
        let result = [];
        for (var i = 0; i < fileData.length; i += 2)
            result.push("0x" + fileData[i] + "" + fileData[i + 1]);
        console.log(result);
        var pfile = new Blob([result], { type: "application/pdf" });
        // var fileURL = URL.createObjectURL(pfile);
        console.log(pfile);
        pfile.lastModifiedDate = new Date();
        pfile.name = "some-name";
        console.log(pfile);
    });
});
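One way back from the byte array to a file, without using Blob at all, is to rebuild a Buffer and let Express send it with a download header. A rough sketch, assuming a hypothetical /download route and that result is the array of "0x.." hex strings built above:
app.get("/download", (req, res) => {
    // result: the array of "0x.." hex strings produced during upload (assumed to be in scope)
    const bytes = result.map((hex) => parseInt(hex, 16));
    const buffer = Buffer.from(bytes);

    res.setHeader("Content-Type", "application/pdf");
    res.setHeader("Content-Disposition", 'attachment; filename="some-name.pdf"');
    res.send(buffer);
});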

Get download URL from multiple file uploads to Firebase Storage

I am new to Firebase and AngularJS, and I am having difficulty getting download URLs from Firebase Storage and storing them in the Firebase Realtime Database.
I was able to upload multiple files to Firebase Storage. The problem is that when I store the download URLs in the Realtime Database, all of the stored URL values are the same; each one should be different, based on each file's downloadURL.
Here is my script:
$scope.submitPhotos = function(file) {
    console.log(file);
    var updateAlbum = [];
    for (var i = 0; i < file.length; i++) {
        var storageRef = firebase.storage().ref(albumtitle).child(file[i].name);
        var task = storageRef.put(file[i]);
        task.on('state_changed', function progress(snapshot) {
            var percentage = (snapshot.bytesTransferred / snapshot.totalBytes) * 100;
            if (percentage == 100) {
                storageRef.getDownloadURL().then(function(url) {
                    var galleryRef = firebase.database().ref('gallery/' + albumkey);
                    var postkey = firebase.database().ref('gallery/' + albumkey).push().key;
                    updateAlbum = { img: url };
                    firebase.database().ref('gallery/' + albumkey + '/' + postkey).update(updateAlbum);
                });
            }
        });
    }
};
As you can see, I was able to store the URLs in the database, but all of the URLs are the same. What I need is for every key to store a different link from Storage.
Any help is appreciated. Thanks.
function uploadImg(file, i) {
    return new Promise((resolve, reject) => {
        var storageRef = firebase.storage().ref("store-images/" + file[i].file.name);
        var task = storageRef.put(file[i].file);
        task.on('state_changed', function progress(snapshot) {
            var percentage = (snapshot.bytesTransferred / snapshot.totalBytes) * 100;
            console.log(percentage);
            // use the percentage as you wish, to show progress of an upload for example
        }, // use the function below for error handling
        function (error) {
            console.log(error);
        },
        function complete() { // This function executes after a successful upload
            task.snapshot.ref.getDownloadURL().then(function (downloadURL) {
                resolve(downloadURL);
            });
        });
    });
}
async function putImage(file) {
    for (var i = 0; i < file.length; i++) {
        var dd = await uploadImg(file, i);
        firebase.database().ref().child('gallery').push(dd);
    }
}
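Because each upload is now wrapped in its own promise (so every completion callback sees its own storageRef and task), the files could also be sent in parallel rather than one at a time; a small, optional variation on putImage, assuming uploadImg stays as above:
async function putImageParallel(file) {
    // Start every upload at once and wait for all download URLs to resolve
    const urls = await Promise.all(
        Array.from(file).map((_, i) => uploadImg(file, i))
    );
    urls.forEach((url) => firebase.database().ref().child('gallery').push(url));
}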
Try using the code below:
$scope.submitPhotos = function(file) {
    console.log(file);
    var updateAlbum = [];
    for (var i = 0; i < file.length; i++) {
        var storageRef = firebase.storage().ref(albumtitle).child(file[i].name);
        var task = storageRef.put(file[i]);
        task.on('state_changed', function progress(snapshot) {
            var percentage = (snapshot.bytesTransferred / snapshot.totalBytes) * 100;
            // use the percentage as you wish, to show progress of an upload for example
        }, // use the function below for error handling
        function (error) {
            switch (error.code) {
                case 'storage/unauthorized':
                    // User doesn't have permission to access the object
                    break;
                case 'storage/canceled':
                    // User canceled the upload
                    break;
                case 'storage/unknown':
                    // Unknown error occurred, inspect error.serverResponse
                    break;
            }
        }, function complete() { // This function executes after a successful upload
            let dwnURL = task.snapshot.downloadURL;
            let galleryRef = firebase.database().ref('gallery/' + albumkey);
            let postkey = firebase.database().ref('gallery/' + albumkey).push().key;
            updateAlbum = { img: dwnURL };
            firebase.database().ref('gallery/' + albumkey + '/' + postkey).update(updateAlbum);
        });
    }
};
All the best!

Uploading CustomData with ng-file-upload and WebApi

I am trying to upload a file along with some metadata to a WebAPI service that I have created, using ng-file-upload and Angular. I am getting the file name and bytes as expected, but I am not able to get the metadata I am passing as well. Here is what I am doing on the Angular side:
Upload.upload({
    url: '/api/FileStorage/AddContent' + location.search,
    data: { file: files, year: vm.year }
})
And on the WebAPI side:
var streamProvider = new CustomMultipartFileStreamProvider();
IEnumerable<HttpContent> parts = null;
Task.Factory
    .StartNew(() => parts = Request.Content.ReadAsMultipartAsync(streamProvider).Result.Contents,
        CancellationToken.None,
        TaskCreationOptions.LongRunning, // guarantees separate thread
        TaskScheduler.Default)
    .Wait();
var customData = streamProvider.CustomData;
Here I am using a MultiStreamProvider to get the file; here is the meat of that provider:
public override Task ExecutePostProcessingAsync()
{
    foreach (var file in Contents)
    {
        var parameters = file.Headers.ContentDisposition.Parameters;
        var filename = GetNameHeaderValue(parameters, "filename");
        var year = GetNameHeaderValue(parameters, "year");
    }
    return base.ExecutePostProcessingAsync();
}
I am able to get filename without issue, but I am never able to get the year. Here is the value of the parameters variable in the debugger (screenshot):
As you can see, the name is "name" and the value is "year", when I would expect the name to be "year" and the value to be "2016" or whatever I am passing in. What am I doing wrong here, and how do I get the metadata included in the same call to the API?
We use a similar approach with ng-file-upload and WebAPI. To get the values out of the form data, we weren't able to use GetNameHeaderValue; we had to do some manual parsing. We decided to use a modified version of what was posted at http://conficient.wordpress.com/2013/07/22/async-file-uploads-with-mvc-webapi-and-bootstrap/ to dynamically take a form and load it into a strongly typed model. Basically, here's what it does in the ExecutePostProcessingAsync method:
public override async Task ExecutePostProcessingAsync()
{
    var formData = new FormCollection();
    for (int index = 0; index < Contents.Count; index++)
    {
        ContentDispositionHeaderValue contentDisposition = Contents[index].Headers.ContentDisposition;
        if (contentDisposition != null)
        {
            HttpContent formContent = Contents[index];
            string formFieldName = UnquoteToken(contentDisposition.Name) ?? String.Empty;
            // Read the contents as string data and add to form data
            string formFieldValue = await formContent.ReadAsStringAsync();
            formData.Add(formFieldName, formFieldValue);
        }
    }
    // For your case
    var filename = formData["filename"];
    var year = formData["year"];
}
This is the UnquoteToken method it uses:
private static string UnquoteToken(string token)
{
    if (String.IsNullOrWhiteSpace(token))
    {
        return token;
    }
    if (token.StartsWith("\"", StringComparison.Ordinal) && token.EndsWith("\"", StringComparison.Ordinal) && token.Length > 1)
    {
        return token.Substring(1, token.Length - 2);
    }
    return token;
}
