I have a React/Redux front-end with an Express back-end, and I'm rather new to this, so I have a question about how to handle the flow of data.
So, on the front-end side, I have a search bar. When a user enters a search term, I send a POST request from React, which is handled in my Express routes.js file. In that file, I take the search term and look it up in my Mongo database. After that, all I want to do is send an object back indicating whether the term was found in the database.
I have used axios in this application to make an HTTP request to a certain route and pull back some data, but that was handled by an app.get(...) on the Express side, and I used an axios.get(...) on the React side to retrieve the information.
But this situation is slightly different, since the data flows in two directions: first from the front end to the back end, and then from the back end to the front end. And in this case, I'm using app.post(...).
Now my question is: how would I get the data back to the front end? Could I simply do an axios.get(...) against an app.post(...) route, or is there some other way to do this?
If you send a GET from the browser to a back-end route that is implemented to respond to POST only, the request will fail (Express returns a 404 by default, or a 405 Method Not Allowed if you handle unsupported methods explicitly). Implement a POST axios request on the client and a POST Express handler on the server.
You can use either GET or POST, but you need to be consistent between the server and the client side. If you do an HTTP GET from the client, the server will only respond if you have an app.get(...) route defined for that path.
As far as the flow of data is concerned, both a GET and a POST can return data; the response just needs to be sent from the Express route handler.
After the business logic of checking whether the key exists in Mongo, do something like res.send({ found: true }) or res.json({ found: false }). This will ensure that the data gets back to the client.
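For concreteness, here is a minimal sketch of both sides; the Item model and the /api/search path are placeholders for whatever you have in your project:

    // server (routes.js) -- assumes app.use(express.json()) is registered
    const express = require('express');
    const router = express.Router();
    const Item = require('../models/Item'); // hypothetical Mongoose model

    router.post('/api/search', async (req, res) => {
      try {
        const { term } = req.body;                        // search term sent from React
        const match = await Item.findOne({ name: term }); // look the term up in Mongo
        res.json({ found: Boolean(match) });              // send the result object back
      } catch (err) {
        res.status(500).json({ error: 'search failed' });
      }
    });

    module.exports = router;

    // client (React) -- the POST response body is available on response.data
    import axios from 'axios';

    async function searchTerm(term) {
      const response = await axios.post('/api/search', { term });
      return response.data.found; // true or false
    }

So the same axios.post(...) call both sends the term and receives the answer; no separate axios.get(...) is needed.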
If I were to do this, I would:
1.) Use an axios GET request and pass the identifying attributes, such as a related _id or key phrase, as query parameters.
2.) Use MongoDB's query filters to search the collection for matching documents; I would probably use .findOne() or .find().
3.) Send the filtered data back from the route's callback, then dispatch an action on the client to save it to state.
This way you can set up specific terms or keywords to search with and use the fetched data throughout the app; a sketch of that flow follows.
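Something like this, assuming redux-thunk on the client (Item, /api/items and SAVE_ITEMS are placeholder names):

    // server: GET route that filters on a query parameter
    app.get('/api/items', async (req, res) => {
      const { keyword } = req.query;                    // e.g. /api/items?keyword=shoes
      const items = await Item.find({ name: keyword }); // or Item.findOne() for a single match
      res.json(items);
    });

    // client: a thunk action creator that fetches and dispatches the result into the store
    const fetchItems = keyword => dispatch =>
      axios.get('/api/items', { params: { keyword } })
        .then(response => dispatch({ type: 'SAVE_ITEMS', payload: response.data }));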
I have been applying a pattern in some small apps I worked on, using the following tools:
Frontend: ReactJS, Redux, Firebase (only authentication stuff)
Backend: Node with Express and the library for the database I was using (mysql or mongo)
The flow used to be like this:
The page loads and validates whether the user is signed in. If yes, it fetches the app data from the backend. If not, it just checks that the user is not on a protected route and lets them navigate (create account, reset password, etc.).
When the fetch occurs and the user is signed in, the backend sends all of that user's data (categories, stores, products, profile and any other info needed to let the application run fluidly), so while navigating there is no continuous loading to fetch each chunk of data. It also works this way because there is not a big amount of data (at the beginning), let's say maybe 1 or 2 stores with 15-20 products each.
When updating data, like changing a product name or price, or any other kind of data mutation, the frontend sends a request to the backend; the update is performed and a success true/false response is sent back as the answer to the request.
Let's say the previous step is a store creation. The frontend receives the success response and then dispatches an APPEND_STORE action to the Redux store to pick up the new store with its data (only the ID plus any other data created on the backend is received in the HTTP response; data that was generated on the frontend is taken from the app itself), appends it to the stores array, and then sets the stores array again.
The reason the previous step works like that is to avoid having the backend first make the insert and then run a new query to fetch the new record again (e.g.: create new store request -> backend creates the store and retrieves the new ID -> backend finds/queries the new record by its ID -> answers with the data).
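As a sketch of that APPEND_STORE step (the action shape and reducer below are my guesses at the pattern, not your exact code):

    // action creator: merge what the user typed on the frontend with what the backend generated
    const appendStore = (formData, createdFields) => ({
      type: 'APPEND_STORE',
      payload: { ...formData, ...createdFields }, // e.g. createdFields = { id, createdAt }
    });

    // reducer: append the new store without refetching the whole list from the backend
    function storesReducer(state = { stores: [] }, action) {
      switch (action.type) {
        case 'APPEND_STORE':
          return { ...state, stores: [...state.stores, action.payload] };
        default:
          return state;
      }
    }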
So, basically the question is about the pattern. So far this pattern has not given me big trouble, but I'm pretty sure there are details I haven't seen from this point of view, and I also think there are recommended ways to handle this that I haven't been able to find on the internet yet.
I would like to know the best approach for this (or at least a better approach than the one above) in order to apply it to bigger applications that handle more users and data, which will grow over time.
Main questions:
When loading, should the backend process all the data and send it to the frontend, or should it send minimal data to let the application start and then send more as navigation requires? I was also thinking that, as the data grows, I could send chunks of data limited to 10/20 records to keep the same approach, but I want to know the standard/correct way to handle this.
When a data validation fails on the backend, should the backend answer with an OK status code plus a key like success: true/false and additional data so the frontend can handle it, or should the backend answer with an error status code?
As I said, the backend only answers with the data created by the backend itself (ID, creation date, etc.). Should the backend answer only with this, or should it make a new query to get the full record and send it in the response? I initially took this approach to optimize resources (a minimum number of requests and a minimum of data sent per response). I'm also thinking maybe this is a misguided approach, because today's systems have plenty of resources, so the difference should not noticeably affect performance. Also, depending on the answer, my behavior on the Redux side will change. Do you have any comments on the Redux approach as well?
Is it OK to make one query to the database to update and then another query to fetch the record again? I know databases are built to handle multiple queries, but I don't know if there are any downsides to keep in mind when doing it this way.
Thanks a lot!
Sorry for the general question. My situation is this: I have a MongoDB database and 2 ReactJS pages. On each page I want to fetch different information from the database. Based on your experience, what is the best way to fetch data from MongoDB in a ReactJS component?
I would recommend reading up on the MERN stack - there are tons of guides available online via Google and YouTube. The gist is that a typical web application consists of a few key components. In this case:
1 - (React) The client page rendered to the user
2 - (Node + Express) The server, which processes data and exposes endpoints for making changes to your application. These endpoints make the necessary database queries; you can use a database client to write those queries as JavaScript inside your Node.js endpoints.
3 - (MongoDB) Your database.
So for instance a typical CRUD app allows you to create, read, update, and delete. Let's say you are looking at making a standard TODO list app.
You would need to make requests to these endpoints to perform these operations.
You could have a POST to /todo which would then insert a new document into your database.
You would need a way to read the information from the page... say a GET request to /todos to read all items, or a GET request to /todo/:id to get a specific item.
You would need a way to update an existing item... say a PUT request to /todo/:id with the updates you want to take place.
Finally you would need a way to delete an item... a DELETE request to /todo/:id which would delete the item.
Each of these endpoints would run a query to insert / read / update / delete items in the database and return content to the client-side React code, which then displays it to the user, as sketched below.
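Here is a compact sketch of those endpoints; the Todo Mongoose model is assumed, and error handling is left out for brevity:

    const express = require('express');
    const Todo = require('./models/Todo'); // hypothetical Mongoose model
    const app = express();
    app.use(express.json());

    app.post('/todo', async (req, res) => {          // create
      res.status(201).json(await Todo.create(req.body));
    });

    app.get('/todos', async (req, res) => {          // read all
      res.json(await Todo.find());
    });

    app.get('/todo/:id', async (req, res) => {       // read one
      res.json(await Todo.findById(req.params.id));
    });

    app.put('/todo/:id', async (req, res) => {       // update
      res.json(await Todo.findByIdAndUpdate(req.params.id, req.body, { new: true }));
    });

    app.delete('/todo/:id', async (req, res) => {    // delete
      await Todo.findByIdAndDelete(req.params.id);
      res.sendStatus(204);
    });

    app.listen(3000);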
On the frontend side, in React, call your API for the data using the fetch() method and the API's URL (the MongoDB URI string belongs on the server, where the connection is made). If you want the data in slots, use the limit() and skip() functions for pagination.
Follow an MVC pattern where your frontend only calls a controller API, and the controller calls DAO methods for MongoDB. You can use MongoDB Stitch for a serverless app, so data leaks on the frontend side can be avoided. MongoDB keeps a connection pool (maximum of 100 connections by default), so each client request gets a cached connection object from the pool, which further speeds up your connection time.
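As a rough sketch of the pagination idea with the Node MongoDB driver (db is an already-connected database handle, and the collection name is a placeholder):

    // GET /api/products?page=2&pageSize=20 -- one "slot" of results per request
    app.get('/api/products', async (req, res) => {
      const page = parseInt(req.query.page, 10) || 1;
      const pageSize = parseInt(req.query.pageSize, 10) || 20;
      const products = await db.collection('products')
        .find({})
        .skip((page - 1) * pageSize) // skip the earlier pages
        .limit(pageSize)             // cap this page's size
        .toArray();
      res.json(products);
    });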
I am writing a simple filter function that sends a request and gets all entries that partially match the query. The desired behavior is: if I do a GET request to localhost:3000/employees?email=mai, I would like to receive all entries from the database whose email starts with mai. Is there any way to construct an HTTP request for this behavior from the front-end part of the application? Perhaps some headers that tell the server that the search should not be strict? I have tried googling it but failed. I am using AngularJS for the front end and json-server as a back-end mock server.
To my understanding, if you filter data on the front end you can only filter the data you have already fetched and are showing. On the other hand, if you do this with a query on the back end, all of the data gets filtered in the DB.
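If I remember correctly, json-server also supports a _like suffix that does a RegExp match, so you could push the partial matching to the mock server itself; please verify this against your json-server version. A hedged AngularJS sketch:

    // AngularJS: ask json-server for partial matches instead of filtering client-side
    $http.get('http://localhost:3000/employees', {
      params: { email_like: '^mai' } // RegExp: emails starting with "mai"
    }).then(function (response) {
      $scope.employees = response.data;
    });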
I want to make a database query from the frontend (Angular) to the backend, but I need to send lots of parameters for that.
As far as I understand, if we are not making any database changes, it is better to use GET, as it can use cached entries. POST should be used if we need to make changes on the server/DB.
But if I want to send many parameters (some of them serialized objects) and make no server-side changes, is it all right to use a POST request in that case and embed all the parameters in the POST body instead of sending a huge URL-encoded GET request?
To first clear this up: responses to POST requests can be cached, as long as the origin server returns the proper caching response headers. However, browsers and proxy servers generally don't apply caching to POST requests.
That being said, with the proper encoding you can store a lot of information in the ~ 2 KB of a query string, so GET should be the way to go.
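For example, even serialized objects can usually be packed into the query string (a sketch; the endpoint and parameter names are made up):

    // encode a serialized filter object into the query string of a GET request
    const filters = { status: 'active', tags: ['red', 'blue'], range: { min: 1, max: 99 } };
    const qs = new URLSearchParams({ filters: JSON.stringify(filters) }).toString();

    fetch('/api/search?' + qs)   // small objects stay well under the ~2 KB practical limit
      .then(res => res.json())
      .then(data => console.log(data));

    // server side: const filters = JSON.parse(req.query.filters);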
If you're certain you'll go beyond the limits of a GET request and you'll need to go the POST way while remaining cacheable, you could introduce a kind of "nonce", for example using a hash of the search parameters:
Client does a POST to /search, with the search parameters.
Server stores the parameters somewhere, for example in a database.
Server generates a nonce from the parameters, for example by hashing the search parameters, or the row ID.
Server redirects the client to the result page for that nonce: /search/123abc.
Client requests the /search/123abc search results page.
Server performs the search based on the nonce, and returns a response which is cacheable.
This will introduce one additional HTTP roundtrip, but enable caching cross-browser and through proxy servers.
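A rough Express sketch of that flow; the SHA-1 hashing, the in-memory Map and runSearch are only illustrative:

    const express = require('express');
    const crypto = require('crypto');
    const app = express();
    app.use(express.json());

    const savedSearches = new Map(); // a real app would persist this, e.g. in a database
    const runSearch = params => ({ params, results: [] }); // stub: replace with the real query

    // steps 1-4: client POSTs the parameters, server stores them, derives a nonce and redirects
    app.post('/search', (req, res) => {
      const params = req.body;
      const nonce = crypto.createHash('sha1')
        .update(JSON.stringify(params))
        .digest('hex')
        .slice(0, 8);
      savedSearches.set(nonce, params);
      res.redirect(303, '/search/' + nonce);
    });

    // steps 5-6: the GET result page is cacheable by browsers and proxies
    app.get('/search/:nonce', (req, res) => {
      const params = savedSearches.get(req.params.nonce);
      if (!params) return res.sendStatus(404);
      res.set('Cache-Control', 'public, max-age=300');
      res.json(runSearch(params));
    });

    app.listen(3000);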
I think you should use POST in this situation; it is more manageable and looks cleaner. For more on the benefits of POST, follow these links:
Link 1
Link 2
I have a server written in Express that interfaces with an application written in Angular on the client.
My Express server receives a POST from a third-party service at a route that performs some business logic, and here is where I am a little uncertain about the best path forward.
After receiving the POST variables, I want to redirect the request to an Angular route, but I also want to make those variables available to that route.
Somehow, I want to be able to mix the res.json() and res.redirect() methods, but I'm pretty sure each of them ends the response.
What would be a logical way to structure this?
Update: To expand on the issue, imagine I have a route called /receivetransaction which receives some postback variables, including a transaction ID, amount, etc. I want to perform business logic (save to a database) and then redirect the user to /thankyou (an Angular route), but have them be able to access the data that was just received in the postback.
It looks like my best option might be to save to the database and then send the transaction ID as JSON to the Angular view, which will then hit the database and pull the info. It's a little inefficient (not really a big deal), but I would hope there is another way around it.
What I've decided to do is the following:
After the Express route receives the postback variables, it performs the business logic (specifically saving the data to the database) and redirects the request to the Angular route with a query var indicating the ID of the transaction.
The Angular controller uses $location.search() to pull the transaction ID from the query var, and from there it performs a GET request to the Express API, which performs authentication and loads the relevant information into $scope variables to be passed to the view.
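A sketch of that handoff; the route names, the Transaction model and the /api/transactions endpoint are assumptions based on the description above:

    // Express: receive the postback, save it, then redirect to the Angular route with a query var
    router.post('/receivetransaction', async (req, res) => {
      const saved = await Transaction.create(req.body); // hypothetical Mongoose model
      res.redirect('/thankyou?transactionId=' + saved._id);
    });

    // AngularJS controller behind /thankyou: read the query var, then fetch the full record
    angular.module('myApp').controller('ThankYouCtrl', function ($scope, $http, $location) {
      var id = $location.search().transactionId;
      $http.get('/api/transactions/' + id).then(function (response) {
        $scope.transaction = response.data; // available to the view
      });
    });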