I'm trying to send a POST request to create a new task for a user. The Mongoose schema of each task is:
let todoSchema = new AppSchema ({
userId: String,
title: String,
completed: Boolean
}, {versionKey: false})
When I send the request, the task is created without the userId.
This is the request:
add = () => {
const task = {
userId: this.props.id,
title: this.state.title,
completed: false,
}
if(task.title) {
axiosUtils.create('http://localhost:8000/todos/', task)
alert('task created!')
}
}
*The typeof task.userId is "string"; it is the MongoDB _id of the specific user, converted with toString()
The function axios.create() does not actually send any request; it just creates an axios instance with the specified config, which you can then use to send requests.
To learn more about axios.create(), refer to the official axios docs:
https://axios-http.com/docs/instance
In order to create a task, you need to make a POST request like:
axios.post("http://localhost:8000/todos/", { taskParam: task })
Here taskParam is most probably the name of the variable your backend uses to read the task from req.body.
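Putting it together, a minimal sketch of the corrected add() handler (assuming axios is imported and the backend reads the task fields straight from req.body, as the original code implied; the alert moves into the promise callback so it only fires on success):
add = () => {
  const task = {
    userId: this.props.id,
    title: this.state.title,
    completed: false,
  }
  if (task.title) {
    // axios.post() actually sends the request; alert only once it succeeds
    axios.post('http://localhost:8000/todos/', task)
      .then(() => alert('task created!'))
      .catch(err => console.error(err));
  }
}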
I must be really stupid, but I have been struggling for weeks to solve this issue, and all the digging I have done (on Stack Overflow and in the MS documentation) has yielded no results (or I'm too stupid to implement auth correctly).
I have a dotnet service which needs to act as an API - both for an application to post data to (an exe which logs exception data) and for a UI (a React app) to get the posted exceptions.
The exe can successfully send data to the dotnet app after first getting a token from login.microsoftonline.com and then sending the token (and secret) in the HTTP request.
A sample Postman pre-request script of the auth used (I've set all the secret stuff as environment variables):
pm.sendRequest({
url: 'https://login.microsoftonline.com/' + pm.environment.get("tenantId") + '/oauth2/v2.0/token',
method: 'POST',
header: 'Content-Type: application/x-www-form-urlencoded',
body: {
mode: 'urlencoded',
urlencoded: [
{key: "grant_type", value: "client_credentials", disabled: false},
{key: "client_id", value: pm.environment.get("clientId"), disabled: false},
{key: "client_secret", value: pm.environment.get("clientSecret"), disabled: false}, //if I don't configure a secret, and omit this, the requests fail (Azure Integration Assistant recommends that you do not configure credentials/secrets, but does not provide clear documentation as to why, or how to use a daemon api without it)
{key: "scope", value: pm.environment.get("scope"), disabled: false}
]
}
}, function (err, res) {
const token = 'Bearer ' + res.json().access_token;
pm.request.headers.add(token, "Authorization");
});
Now in React, I am using MSAL (@azure/msal-browser) to log a user in, get their token, and pass the token to one of the dotnet endpoints using axios as my HTTP wrapper, but no matter what I do it returns HTTP status 401 with WWW-Authenticate: Bearer error="invalid_token", error_description="The signature is invalid".
A simplified code flow to log in a user and request data from the API:
import { publicClientApplication } from "../../components/Auth/Microsoft"; // a preconfigured instance of PublicClientApplication from @azure/msal-browser
const data = await publicClientApplication.loginPopup();
// ... some data validation
publicClientApplication.setActiveAccount(data.account);
// ... some time and other processes may happen here, so we don't access the token directly from loginPopup()
const activeAccount = publicClientApplication.getActiveAccount();
const token = (await publicClientApplication.acquireTokenSilent(activeAccount)).accessToken;
const endpointData = await api()/* an instance of axios.create() with some pre-configuration */.get(
  '/endpoint',
  { headers: { 'Authorization': `bearer ${token}` } }); // returns status 401
The dotnet service has the following configuration:
public void ConfigureServices(IServiceCollection services){
...
var authScheme = services.AddAuthentication(JwtBearerDefaults.AuthenticationScheme);
authScheme.AddMicrosoftIdentityWebApi(Configuration.GetSection("AzureAd"));
...
}
namespace Controllers{
public class EndpointController : ControllerBase{
...
[Authorize]
[HttpGet]
public IActionResult GetEndpoint(){
return Ok("you finally got through");
}
}
}
I've literally tried so many things that I've lost track of what I've done...
I've even cried myself to sleep over this - but that yielded no results
I can confirm that when running the request in Postman with the pre-request script, it is possible to get the response from the endpoint.
So....
After much digging and A/B testing I was able to solve this issue.
I discovered that I was not sending the API scope to the OAuth token endpoint. To fix this, I needed to change the input to acquireTokenSilent.
The updated code flow to log in a user and request data from the API:
import { publicClientApplication } from "../../components/Auth/Microsoft"; // a preconfigured instance of PublicClientApplication from @azure/msal-browser
const data = await publicClientApplication.loginPopup();
// ... some data validation
publicClientApplication.setActiveAccount(data.account);
// ... some time and other processes may happen here, so we don't access the token directly from loginPopup()
const activeAccount = publicClientApplication.getActiveAccount();
const token = (await publicClientApplication.acquireTokenSilent({ scopes: ["api://XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX/.default"], account: activeAccount })).accessToken; // scopes is an array of strings; I used the API URI here, but you could use a scope name like User.Read directly if you had it configured
const endpointData = await api()/* an instance of axios.create() with some pre-configuration */.get(
  '/endpoint',
  { headers: { 'Authorization': `bearer ${token}` } }); // now succeeds
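One more gotcha worth noting (not part of the fix above, just the standard MSAL pattern): acquireTokenSilent returns a promise and throws when no cached token satisfies the request, in which case you fall back to an interactive call. A sketch, reusing the publicClientApplication and scope from above:
import { InteractionRequiredAuthError } from "@azure/msal-browser";

const request = {
  scopes: ["api://XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX/.default"],
  account: publicClientApplication.getActiveAccount()
};

let result;
try {
  result = await publicClientApplication.acquireTokenSilent(request);
} catch (err) {
  // The silent call fails when the cache is empty or user interaction is required
  if (err instanceof InteractionRequiredAuthError) {
    result = await publicClientApplication.acquireTokenPopup(request);
  } else {
    throw err;
  }
}
const token = result.accessToken;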
I'm using Next.js and Stripe webhooks to insert checkout sessions into Supabase to build a customer's order history. I'm able to get the information about the whole order written to a table called 'orders', but I'm wondering what the best way is to add the individual items within each checkout session to another table called 'order_items'. That way I can map through the main orders and then the child items. Appreciate any help provided. Here is what I have for getting orders associated with a customer:
const upsertOrderRecord = async (session: Stripe.Checkout.Session, customerId: string) => {
const { data: customerData, error: noCustomerError } = await supabaseAdmin
.from<Customer>('customers')
.select('id')
.eq('stripe_customer_id', customerId)
.single();
if (noCustomerError) throw noCustomerError;
const { id: uuid } = customerData || {};
const sessionData: Session = {
id: session.id,
amount_total: session.amount_total ?? undefined,
user_id: uuid ?? undefined
};
const { error } = await supabaseAdmin.from<Session>('orders').insert([sessionData], { upsert: true });
if (error) throw error;
console.log(`Product inserted/updated: ${session.id}`);
};
The Checkout Session object contains a line_items field which is a list of each item included in the purchase.
However, this field is not included in the object by default, and therefore won't be part of your webhook payload. Instead, you'll need to make an API call in your webhook handler to retrieve the Checkout Session object, passing the expand parameter to include the line_items field:
const session = await stripe.checkout.sessions.retrieve('cs_test_xxx', {
expand: ['line_items']
});
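Building on that, a sketch of how the webhook handler could write the items to 'order_items', reusing the stripe and supabaseAdmin clients from your file (the column names order_id, description, quantity, and amount_total are assumptions; adjust them to your schema):
const insertOrderItems = async (session: Stripe.Checkout.Session) => {
  // Re-retrieve the session with line_items expanded, since the webhook payload omits them
  const expanded = await stripe.checkout.sessions.retrieve(session.id, {
    expand: ['line_items']
  });
  const items = (expanded.line_items?.data ?? []).map((item) => ({
    order_id: session.id,          // foreign key back to the 'orders' row
    description: item.description,
    quantity: item.quantity,
    amount_total: item.amount_total
  }));
  const { error } = await supabaseAdmin.from('order_items').insert(items);
  if (error) throw error;
};
You can then call insertOrderItems right after upsertOrderRecord in the checkout.session.completed handler and join the two tables on order_id when rendering the history.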
EDIT: Since I wasn't able to find a correct solution, I changed the application's structure a bit and posted another question:
Mongoose - find documents not in a list
I have a MEAN app with three models: User, Task, and for keeping track of which task is assigned to which user I have UserTask, which looks like this:
const mongoose = require("mongoose");
const autopopulate = require("mongoose-autopopulate");
const UserTaskSchema = mongoose.Schema({
completed: { type: Boolean, default: false },
userId: {
type: mongoose.Schema.Types.ObjectId,
ref: "User",
autopopulate: true
},
taskId: {
type: mongoose.Schema.Types.ObjectId,
ref: "Task",
autopopulate: true
}
});
UserTaskSchema.plugin(autopopulate);
module.exports = mongoose.model("UserTask", UserTaskSchema);
In my frontend app I have AngularJS services, and I already have functions for getting all users, all tasks, and the tasks assigned to a particular user (by getting all UserTasks with a given userId). For example:
// user-task.service.js
function getAllUserTasksForUser(userId) {
return $http
.get("http://localhost:3333/userTasks/byUserId/" + userId)
.then(function(response) {
return response.data;
});
}
// task-service.js
function getAllTasks() {
return $http.get("http://localhost:3333/tasks").then(function(response) {
return response.data;
});
}
Then I'm using this data in my controllers like this:
userTaskService
.getAllUserTasksForUser($routeParams.id)
.then(data => (vm.userTasks = data));
...and because of the autopopulate plugin I have complete User and Task objects inside the UserTasks that I get. So far, so good.
Now I need to get all Tasks which are not assigned to a particular User. I guess I should first get all Tasks, then all UserTasks for the given userId, and then compute some kind of difference, with some "where-not-in" kind of filter.
I'm still a newbie with all the MEAN components; I'm not familiar with all those then()s and promises and stuff, and I'm really not sure how to do this. I tried using multiple then()s but with no success. Can anyone give me a hint?
You can do this on the server/API side, which will be more efficient.
If you want to do it on the client side, try something like the below. Note that Task documents don't carry a userId themselves (the assignment lives in UserTask), so you first collect the assigned task ids from the user's UserTasks and then filter them out of the full task list:
var userId = $routeParams.id;
userTaskService
  .getAllUserTasksForUser(userId)
  .then(userTasks => {
    var assignedTaskIds = userTasks.map(ut => ut.taskId._id);
    return taskService.getAllTasks().then(tasks =>
      tasks.filter(task => assignedTaskIds.indexOf(task._id) === -1)
    );
  })
  .then(unassignedTasks => (vm.unassignedTasks = unassignedTasks));
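On the server side, a sketch of an Express route (the route path and model file paths are assumptions) using Mongoose's $nin operator, i.e. the "find documents not in a list" approach from the edit above:
const UserTask = require("../models/user-task"); // hypothetical model paths
const Task = require("../models/task");

app.get("/tasks/notAssignedTo/:userId", async (req, res) => {
  // Collect the ids of the tasks already assigned to this user
  const userTasks = await UserTask.find({ userId: req.params.userId });
  const assignedIds = userTasks.map(ut => ut.taskId._id);
  // Return every task whose _id is NOT in that list
  const tasks = await Task.find({ _id: { $nin: assignedIds } });
  res.json(tasks);
});
This keeps the filtering in the database and spares the client from downloading the full task list.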
I have a Lambda function that handles reading data from a file (stored inside an S3 bucket) as well as inserting data into a DynamoDB table. This Lambda function is exposed as a REST endpoint using API Gateway. The function accepts GET requests as well as POST requests. I'm making GET/POST requests from my React project using the axios and aws4 (for signing) libraries. The GET request reads data from a file stored inside S3 and works just fine. The POST request is for inserting data into the DynamoDB table. However, it doesn't work, and AWS returns an InvalidSignatureException error as a response. This is an excerpt of my code:
createAWSSignedRequest(postData) {
let request = {};
if (postData) {
request = {
host: process.env.AWS_HOST,
method: 'POST',
url: process.env.AWS_URL,
path: process.env.AWS_PATH,
headers: {
'Content-Type': 'application/json'
},
body: JSON.stringify(postData)
}
} else {
request = {
host: process.env.AWS_HOST,
method: 'GET',
url: process.env.AWS_URL,
path: process.env.AWS_PATH
}
}
let signedRequest = aws4.sign(request, {
secretAccessKey: process.env.AWS_SECRET_KEY,
accessKeyId: process.env.AWS_ACCESS_KEY
});
return signedRequest;
}
This is how the GET request is made (no payload is passed, so the GET branch of createAWSSignedRequest is used):
let signedRequest = this.createAWSSignedRequest();
axios(signedRequest)
.then(response => {
})
.catch((error) => {
console.log("error",error);
});
This is how the POST request is made:
const data = {
uuid: "916b7d90-0137-11e8-94e6-116965754e23", //just a mock value
date : "22/jan/2018",
user_response: [
{
question:"this is quesiton1",
choice:"user selected A"
},
{
question:"this is quesiton2",
choice: "user selected b"
},
{
question:"this is quesiton3",
choice: "user selected C"
}
]
};
let signedRequest = this.createAWSSignedRequest(data);
axios(signedRequest)
.then(response => {
......
})
.catch((error) => {
console.log("error",error);
});
As you can see, the code for the GET and POST requests is exactly the same (except for the payload and method type). I'm signing with the same secret access key and access key ID for both requests. I'm not sure why one request results in "InvalidSignatureException" when the other doesn't. Can anyone shed some light on this issue for me?
Thanks
After having a discussion with the aws4 lib developer, I figured out what I did wrong. aws4 uses the "body" attribute of the request to compute the signature, while axios uses the "data" attribute as the payload. My mistake was setting only one of them. When I set just "data", the payload was present in the request and Content-Length was computed correctly, but the signature was incorrect since the payload was not taken into consideration when computing it. When I set just "body", the payload was not present in the request at all, because axios does not use the "body" attribute for the payload. The solution is to set both attributes to the payload. I hope this helps anyone who is having the same issue I had.
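In code, the fix amounts to setting the same serialized payload on both attributes in the POST branch of createAWSSignedRequest:
const payload = JSON.stringify(postData);
request = {
  host: process.env.AWS_HOST,
  method: 'POST',
  url: process.env.AWS_URL,
  path: process.env.AWS_PATH,
  headers: {
    'Content-Type': 'application/json'
  },
  body: payload, // aws4 computes the signature over this
  data: payload  // axios sends this as the request payload
};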
If you use the AWS Amplify library, it has a module called API which should fit your use case, and it will perform SigV4 signing for you with either authenticated or unauthenticated roles. The Auth category uses Cognito as the default implementation. For instance:
npm install aws-amplify --save
Then import and configure the lib:
import Amplify, { API } from 'aws-amplify';
Amplify.configure({
Auth: {
identityPoolId: 'XX-XXXX-X:XXXXXXXX-XXXX-1234-abcd-1234567890ab',
region: 'XX-XXXX-X'
},
API: {
endpoints: [
{
name: "APIName",
endpoint: "https://invokeURI.amazonaws.com"
}
]
}
});
Then for your API Gateway endpoint calling a Lambda:
let apiName = 'MyApiName';
let path = '/path';
let options = {
headers: {...} // OPTIONAL
}
API.get(apiName, path, options).then(response => {
// Add your code here
});
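For the POST that inserts into DynamoDB, the equivalent Amplify call is API.post, which takes the payload under the body key of its init object. A sketch, reusing apiName and path from above and the mock payload from the question:
let postOptions = {
  body: {
    uuid: "916b7d90-0137-11e8-94e6-116965754e23",
    date: "22/jan/2018",
    user_response: [ /* ... */ ]
  }
};
API.post(apiName, path, postOptions).then(response => {
  // Add your code here
});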
More info here: https://github.com/aws/aws-amplify
I'm using Apollo with the Scaphold.io service, and for writing blobs to that service I need to be able to add additional options to the request body.
The full Scaphold example can be found here: https://scaphold.io/docs/#uploading-files
But it does something like this:
form.append("variables", JSON.stringify({
"input": {
"name": "Mark Zuck Profile Picture",
"userId": "VXNlcjoxMA==",
"blobFieldName": "myBlobField"
}
}));
// The file's key matches the value of the field `blobFieldName` in the variables
form.append("myBlobField", fs.createReadStream('./mark-zuckerberg.jpg'));
fetch("https://us-west-2.api.scaphold.io/graphql/scaphold-graphql", {
method: 'POST',
body: form
}).then(function(res) {
return res.text();
}).then(function(body) {
console.log(body);
});
This adds a blob field to the request body.
Based on this, I need to pass a blobFieldName property into the variables, and then add the blob to the request body under the key given by that property. However, using Apollo I can't add anything to the request body. I tried the following, but it's absent from the request body when I check in the network inspector:
export const withCreateCoverPhoto = graphql(CreateCoverPhoto, {
props: ({mutate}) => ({
createCoverPhoto: (name, file) => mutate({
variables: {
input: {
blobFieldName: name,
},
},
[name]: file
}),
}),
});
Please advise.
Thanks for the question. Currently the best way to upload files is by appending the file to FormData and sending it up to the server using fetch (https://medium.com/@danielbuechele/file-uploads-with-graphql-and-apollo-5502bbf3941e). Basically, you won't be able to have caching here for the file itself, but this is the best way to handle uploading files on Scaphold using a multipart request.
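For reference, a sketch of that multipart approach against the Scaphold endpoint shown earlier (the mutation and field names here are hypothetical; the important part is that the file's key matches the blobFieldName value, as in the docs above):
const form = new FormData();
form.append("query", `
  mutation CreateFile($input: CreateFileInput!) {
    createFile(input: $input) { id }
  }
`);
form.append("variables", JSON.stringify({
  input: { blobFieldName: "coverPhoto" }
}));
// The key must match the blobFieldName value so the server can pair file and field
form.append("coverPhoto", file);

fetch("https://us-west-2.api.scaphold.io/graphql/scaphold-graphql", {
  method: "POST",
  body: form
})
  .then(res => res.json())
  .then(body => console.log(body));
This bypasses Apollo for the upload itself, so the mutation result won't land in the Apollo cache; you can refetch the relevant query afterwards if the UI needs it.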