Amplify AppSync doesn't upload S3Object file from client - reactjs

First, when the docs at https://aws-amplify.github.io/docs/js/api#complex-objects say:
input CreateTodoInput {
  id: ID
  name: String!
  description: String
  file: S3ObjectInput # This input type will be generated for you
}
I get the error Type "S3ObjectInput" not found in document. and have to add S3ObjectInput manually.
This is my schema (the docs are not very clear on it, so I put it together from similar questions):
type Picture @model {
  id: ID!
  file: S3Object!
  url: String!
  rating: Int
  appearedForRanking: Int
}

type S3Object {
  bucket: String!
  key: String!
  region: String!
}

input CreatePictureInput {
  id: ID
  file: S3ObjectInput!
  url: String!
  rating: Int
  appearedForRanking: Int
}

input S3ObjectInput {
  bucket: String!
  region: String!
  localUri: String
  visibility: Visibility
  key: String
  mimeType: String
}

enum Visibility {
  public
  protected
  private
}
And this is the client code (with React):
class PictureUpload extends Component {
  state = { fileUrl: '', file: '', filename: '' }

  handleChange = e => {
    let file = e.target.files[0]
    let filext = file.name.split('.').pop()
    let filename = uuid() + '.' + filext
    this.setState({
      fileUrl: URL.createObjectURL(file),
      filename: filename
    })
  }

  saveFile = async () => {
    let visibility = 'public'
    let fileObj = {
      bucket: awsConfig.aws_user_files_s3_bucket,
      region: awsConfig.aws_user_files_s3_bucket_region,
      key: visibility + '/' + this.state.filename,
      mimeType: 'image/jpeg',
      localUri: this.state.fileUrl,
      visibility: visibility
    }
    try {
      const picture = await API.graphql(
        graphqlOperation(mutations.createPicture, {
          input: {
            url: this.state.filename,
            file: fileObj
          }
        })
      )
    } catch (err) {
      console.error(err)
    }
  }
The problem is that the mutation runs without errors and sets the DB records, but the file never appears in S3. The docs say "the SDK uploads the file to Amazon S3 for you", so I don't think I forgot to add anything.
Any idea why the upload doesn't happen?

Automatic upload of the file to S3 happens only when using the aws-appsync package; with aws-amplify you need to upload the file yourself using Storage.put(...).
This GitHub issue explains the differences in more detail.
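For illustration, a minimal sketch of the aws-amplify route, assuming the component above: uploadAndSave is a hypothetical helper, and the file argument is the raw File object from the input (the question's component only keeps its object URL in state):
import { Storage } from 'aws-amplify'

// sketch only: upload the bytes yourself with Storage.put,
// then run the mutation with the resulting key
const uploadAndSave = async (file, filename) => {
  // level: 'public' stores the object under the public/ prefix
  await Storage.put(filename, file, { level: 'public', contentType: 'image/jpeg' })
  await API.graphql(
    graphqlOperation(mutations.createPicture, {
      input: {
        url: filename,
        file: {
          bucket: awsConfig.aws_user_files_s3_bucket,
          region: awsConfig.aws_user_files_s3_bucket_region,
          key: 'public/' + filename
        }
      }
    })
  )
}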

For React Native I've found that you can't simply provide a URI; you have to provide a blob. Try this code instead:
const response = await fetch(uri);
const blob = await response.blob();
let file = {
  bucket,
  key,
  region,
  localUri: blob,
  mimeType,
};
This should get the image data to S3 as long as your authentication is properly configured.

Related

How to send attachment from input to an Email using ReactJS

I'm trying to use Nodemailer for this, but I can't send the attachment from the file input, or I don't know how:
https://nodemailer.com/message/attachments/
Please help; anything is helpful.
postuler.js
let details = {
  name: name.value,
  prenom: prenom.value,
  email: email.value,
  telephone: telephone.value,
  cv: cv.file,
  profil: profil.value,
  motivation: motivation.value
};
let response = await fetch("http://localhost:5000/postuler", {
  method: "POST",
  headers: {
    "Content-Type": "application/json;charset=utf-8",
  },
  body: JSON.stringify(details),
});
server.js
router.post("/postuler", (req, res) => {
const name = req.body.name;
const prenom = req.body.prenom
const email = req.body.email;
const telephone = req.body.telephone;
const cv = req.files;
const profil = req.body.profil;
const motivation = req.body.motivation;
const mail = {
from: email,
to: "*************#gmail.com",
subject: `Contact Form ${name} ${prenom}`,
html: `<p>Name et Prenom: ${name} ${prenom}</p>
<p>Email: ${email}</p>
<p>Telephone: ${telephone}</p>
<p>Linkedin: ${profil}</p>
<p>Motivation: ${motivation}</p>
`,
attachments: [
{ // use URL as an attachment
filename: cv.originalname,
contentType: 'application/pdf',
path: cv.path
},
]
};
You are trying to send req.file.path as an attachment. Nodemailer accepts files such as zip, csv, pdf, etc. req.file.path is tricky because Nodemailer doesn't know where that path points.
Try zipping or saving your files into one file in some folder (you can use the fs or jszip modules) and then give that file's path as the path in the attachment object.
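For illustration, a minimal sketch of that suggestion, assuming the CV has already been written to disk on the server (savedCvPath and the uploads/ folder are hypothetical):
const path = require("path");

// hypothetical: the uploaded CV was saved to disk first, e.g. by multer or fs.writeFile
const savedCvPath = path.join(__dirname, "uploads", "cv.pdf");

const mail = {
  from: email,
  to: "*************#gmail.com",
  subject: `Contact Form ${name} ${prenom}`,
  attachments: [
    {
      filename: "cv.pdf",
      contentType: "application/pdf",
      path: savedCvPath // a concrete file on disk that Nodemailer can read when sending
    }
  ]
};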

How do I get the video url with ytdl-core?

const { YTSearcher } = require('ytsearcher');

const searcher = new YTSearcher({
  key: config.YOUTUBE_API_KEY,
  revealed: true
});

let result = await searcher.search(args.join(" "), { type: "video" });
var songInfo = await ytdl.getInfo(result.first.url);
var song = {
  title: songInfo.videoDetails.title,
  url: songInfo.videoDetails.video_url
};
I'm using ytdl-core to play music with a Discord bot. When I try to play a song it throws the error "Cannot read property 'url' of undefined".

Trying to get SalesForce to recognize an Attachment as a PDF

I am able to use sObject to put an Attachment onto one of my records. The problem is that SF is not recognizing the file as a PDF but as a generic file.
const base64data = Buffer.from(pdfBuffer).toString('base64');
try {
  await conn.sobject('Attachment').create({
    ParentId: filename,
    Name: resumeFileName,
    Body: base64data,
    ContentType: fileType,
    Description: 'Resume Attachment',
  });
} catch (e) {
  console.log('Attachment Error', e);
}
When I look at the attachments of my record, the file does not have all of the options that a PDF file has (only download and delete).
Thanks in advance!
It turns out that in order for Salesforce to recognize the PDF correctly, the content type must be set to application/pdf AND the file name must include the .pdf extension. This worked for me:
(async () => {
  const jsforce = require('jsforce');
  const fs = require('fs');

  var conn = new jsforce.Connection({
    instanceUrl: '...',
    accessToken: '...'
  });

  const pdfData = fs.readFileSync('./test.pdf').toString('base64');
  try {
    await conn.sobject('Attachment').create({
      ParentId: '0012300000RWedX',
      Name: 'My Test PDF.pdf', // <= Turns out the name has to have .pdf
      Body: pdfData,
      ContentType: 'application/pdf',
      Description: 'Testing PDF Attachment',
    });
  } catch (err) {
    console.error(err);
  }
})();

Does uuidv1() generate differently?

I am confused about uuidv1(). The following code uses uuidv1() as a salt to encrypt a password. But I thought uuidv1() generates a different string each time, so I wouldn't be able to use it for encrypting a password.
Does uuidv1() always generate the same string?
const mongoose = require("mongoose");
const uuidv1 = require("uuid/v1");
const crypto = require("crypto");
const { ObjectId } = mongoose.Schema;

const userSchema = new mongoose.Schema({
  name: {
    type: String,
    trim: true,
    required: true
  },
  email: {
    type: String,
    trim: true,
    required: true
  },
  hashed_password: {
    type: String,
    required: true
  },
  salt: String,
  ...
});

// virtual field
userSchema
  .virtual("password")
  .set(function(password) {
    // create temporary variable called _password
    this._password = password;
    // generate a timestamp-based v1 UUID to use as the salt
    this.salt = uuidv1();
    // encryptPassword()
    this.hashed_password = this.encryptPassword(password);
  })
  .get(function() {
    return this._password;
  });

// methods
userSchema.methods = {
  authenticate: function(plainText) {
    return this.encryptPassword(plainText) === this.hashed_password;
  },
  encryptPassword: function(password) {
    if (!password) return "";
    try {
      return crypto
        .createHmac("sha1", this.salt)
        .update(password)
        .digest("hex");
    } catch (err) {
      return "";
    }
  }
};
uuidv1 does generate a unique output every time, which is why you save it as the salt in the user model.
uuid creates the salt, which acts like the key material for hashing strings (uuid has v1, v2, and so on; check out its npm docs, they are simple). Your userSchema's encryptPassword then hashes your password using crypto (imported in the user model) keyed on that salt, and you store the outcome as hashed_password, which future logins are compared against by re-hashing with the same saved salt.
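A minimal sketch of why that works, assuming the schema above: the salt is random per user, but because it is stored with the record, hashing the same password with the same salt is deterministic:
const crypto = require("crypto");
const uuidv1 = require("uuid/v1");

// generated once at signup and stored on the user record
const salt = uuidv1();

const encryptPassword = (password) =>
  crypto.createHmac("sha1", salt).update(password).digest("hex");

// the same salt + password always produce the same hash,
// so authenticate() can compare against the stored hashed_password
console.log(encryptPassword("secret") === encryptPassword("secret")); // true
console.log(encryptPassword("secret") === encryptPassword("wrong"));  // false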

How to retrieve multiple image from Amazon S3 using imgURL at once?

I want to retrieve a list of images from Amazon S3 in one go, based on image URLs.
Currently I am able to fetch a single image using the following code:
AWS.config.update({
  accessKeyId: accessKeyId,
  secretAccessKey: secretAccessKey
});
AWS.config.region = region;

var bucketInstance = new AWS.S3();
var params = {
  Bucket: bucketName,
  Key: awsImgUrl
};
bucketInstance.getObject(params, function (err, file) {
  if (file) {
    var dataSrc = "data:" + file.ContentType + ";base64," + EncodeData(file.Body);
    callbackSuccess(dataSrc);
  } else {
    callbackSuccess("Error");
  }
});

EncodeData = function (data) {
  var str = data.reduce(function (a, b) { return a + String.fromCharCode(b) }, '');
  return btoa(str).replace(/.{76}(?=.)/g, '$&\n');
};
In my scenario I have multiple S3 image URLs, like awsImgUrl1, awsImgUrl2, ..., awsImgUrln.
How can I fetch them in one go instead of one by one?
You cannot get more than one image per API call with S3. You can, however, make multiple calls in parallel.
Using promises, this is straightforward:
var bucketInstance = new AWS.S3();
var imageKeys = [awsImgUrl1, awsImgUrl2, awsImgUrl3];

var promisesOfS3Objects = imageKeys.map(function (key) {
  return bucketInstance.getObject({
    Bucket: bucketName,
    Key: key
  }).promise()
    .then(function (file) {
      return "data:" + file.ContentType + ";base64," + EncodeData(file.Body);
    });
});

Promise.all(promisesOfS3Objects)
  .then(callbackSuccess) // callbackSuccess is called with an array of strings
  .catch(function () { callbackSuccess("Error"); });
Alternatively, you can change the way you upload the image data: instead of uploading a single image, upload one document containing the data for multiple images.
const addImageBlock = () => {
  var photoBlock = [
    {
      imageId: 'id',
      type: 'png',
      body: 'data:image/png;base64,iVBORw0K...'
    },
    {
      imageId: 'id2',
      type: 'png',
      body: 'data:image/png;base64,iVBORw0K...'
    },
    {
      imageId: 'id3',
      type: 'png',
      body: 'data:image/png;base64,iVBORw0K...'
    },
    {
      imageId: 'id4',
      type: 'png',
      body: 'data:image/png;base64,iVBORw0K...'
    }
    // ...etc
  ];
  s3.upload({
    Bucket: bucketName,
    Key: photoBlockId + '.json',
    Body: JSON.stringify(photoBlock), // upload the array as a JSON document
    ACL: 'public-read'
  }, function (err, data) {
    if (err) {
      return alert('There was an error: ' + err.message);
    }
  });
};
Then when you receive this data with one S3 call, you can loop through it and render the images on the frontend:
getObject(params, function (err, file) {
  var imageArr = [];
  if (file) {
    JSON.parse(file.Body.toString()).forEach(function (item) {
      var image = new Image();
      image.src = item.body;
      imageArr.push(image);
    });
    callbackSuccess(imageArr);
  } else {
    callbackSuccess("Error");
  }
});
The AWS SDK does not have a method to read multiple files at once, and the same goes for the console: you cannot download multiple files at once.
It only has the GetObject method, which reads a single object from a bucket by key.
So in your case you have to read the objects one by one by key name, given that you already have the key names as a list.
If you want the list of objects instead, you can list the bucket's contents and then loop over the result to download all the files.
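A minimal sketch of that list-then-loop approach, assuming the AWS SDK for JavaScript v2; the bucket name and prefix are placeholders:
var AWS = require('aws-sdk');
var s3 = new AWS.S3();

// list the keys under a prefix, then fetch each object in parallel
s3.listObjectsV2({ Bucket: 'my-bucket', Prefix: 'images/' }).promise()
  .then(function (listing) {
    return Promise.all(listing.Contents.map(function (obj) {
      return s3.getObject({ Bucket: 'my-bucket', Key: obj.Key }).promise();
    }));
  })
  .then(function (files) {
    console.log('Fetched ' + files.length + ' objects');
  })
  .catch(function (err) {
    console.error(err);
  });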
