I'm developing a small react/redux/saga application. It is basically a multistep form with Back and Next buttons, and a progress bar. Each form step (or form screen) has a number of fields.
According to spec, the form should be able to store the user entered data on server step-by-step – i.e., upon each click on the Next button. This button acts like a typical submit button for the current form step, but it should also take the user further if all requests are successful. If the server returns an error for at least one field on the current step, the user should not have the possibility to progress to the next one.
The catch is that the requests to the server are not predetermined: some entities may be modified, some added, and some deleted. It all depends on the user's choices.
So, technically, before letting the user move to the next step, I must ensure that only needed requests were sent to the server, and, of course, that the server responded with success statuses to those requests.
I'm using redux-saga to manage side effects, and I just don't see how I can make individual requests (like addUserEducation, deleteUserEducation, etc.) from within the component itself and still meet the spec. It feels like this specific case forces me to put a significant part of my "business" logic into a saga that takes all the data from the form screen and uses it to decide what should be sent to the server. Since I use a blocking call for each request, I can, hopefully, rely on the error selector at the end of the saga to conditionally take (or not take) the user to the next step.
This approach works, but I'm not sure it agrees with redux-saga best practices, because a number of my sagas become too "fat" and no longer deal strictly with side effects. I think it may be suboptimal and that there should be more elegant alternatives. Please help!
// Education.js (one of the form screens)
const sendForms = (nextStep, history) => {
handleEducation(
{ userId, educationList, forms, formList },
nextStep,
history,
);
};
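For context, handleEducation here is presumably a bound action creator whose action the worker saga below picks up. A minimal sketch of that wiring follows; the action type constant and watcher name are assumptions for illustration, not code from the original project.
// userActions.js (assumed shape, for illustration only)
export const handleEducation = (screenData, nextStep, history) => ({
  type: HANDLE_EDUCATION, // hypothetical action type constant
  payload: { screenData, nextStep, history },
});

// rootSaga.js (assumed watcher for the worker saga below)
import { takeEvery } from 'redux-saga/effects';
import { handleEducation } from './userSagas';

export function* watchEducation() {
  yield takeEvery(HANDLE_EDUCATION, handleEducation);
}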
// userSagas.js
function* handleEducation({ payload }) {
  try {
    const {
      screenData: { userId, educationList, forms, formList },
      nextStep,
      history,
    } = payload;
    // Collect ids of the forms that were dropped by the user...
    const formsToDelete = educationList.reduce((acc, edObj) => {
      if (!Object.prototype.hasOwnProperty.call(forms, edObj.id)) {
        return [...acc, edObj.id];
      }
      return acc;
    }, []);
    // ...and delete the corresponding educations on the server.
    yield all(
      formsToDelete.map(educationId =>
        call(deleteUserEducation, userId, educationId),
      ),
    );
    // Handle forms that are still present.
    yield all(
      formList.map(([educationId, form], index) => {
        const { school, startYear, endYear, toDate } = form;
        let fields = {
          institution: school,
          enrolled_on: startYear,
          graduated_on: endYear,
          is_current: toDate,
        };
        const isLast = index + 1 === formList.length;
        if (educationId.includes('new')) {
          // Add new education
          if (isLast && toDate === true) {
            // Omit `graduated_on` field
            const { graduated_on: graduatedOn, ...withoutGraduated } = fields;
            fields = withoutGraduated;
            return call(addUserEducation, userId, fields);
          }
          return call(addUserEducation, userId, fields);
        }
        // ...more code here...
      }),
    );
    const errors = yield select(userSelectors.selectErrors);
    if (!errors) {
      history.push(steps[nextStep].route);
    }
  } catch (er) {
    const { response } = er;
    console.log('errors: ', response.data);
  }
}
I'm new to React as we are trying to migrate our app from AngularJS. One thing I'm struggling to wrap my head around is what's the best way to make and cache state mapping requests.
Basically, I do a search that returns a list of objects, and one of the fields is a status code (e.g. 100, 200, 300, etc.), some number. To display the result, I need to map that number to a string, and we do that with an HTTP request to the server, something like this:
GET /lookup/:stateId
So my problem now is:
I have a list of results but not many different states. How can I make that async call (useEffect?) so that the lookup is made only once per distinct stateId? Right now I can get it to work, but the request is made on every single mapping. I'm putting the Axios call in a utility function to try to reuse it across multiple pages doing similar things, but is that the "React" way? In AngularJS, we used the "|" filter to map the code to text.
Once I have that id => string mapping, I want to store it in a cache so the next component that needs to map it no longer makes the HTTP request. Right now, I put the "cache" in the application-level context and use dispatch to update/add values to the cache. Is that more efficient? It appears that if I do a language change (I keep the language in the same application context state), the cache gets re-initialized, and I'm not sure what else would reset it. In AngularJS, we used the $rootState to 'cache'.
Thanks for any pointers!
In a lookupUtil.js
const DoLookupEntry = async (entryId) => {
  const lookupUrl = `/lookup/${entryId}`;
  try {
    const response = await Axios.get(lookupUrl);
    return response.data;
  } catch (expt) {
    console.log('error [DoLookupEntry]:', expt);
  }
};
In a formatUtils.js
const formatLookupValue = (entryId) => {
const appState = useContext(AppContext);
const appDispatch = useContext(DispatchContext);
const language = appState.language;
if (appState.lookupCache
&& appState.lookupCache[entryId]
&& appState.lookupCache[entryId][language]) {
// return cached value
const entry = appState.lookupCache[entryId][language];
return entry.translatedValue;
}
// DoLookup is async, but we are not, so we want to wait...
DoLookupEntry(entryId)
.then((entry) => { // try to save to cache when value returns
appDispatch({type: States.APP_UPDATE_LOOKUP_CACHE,
value:{language, entry}})
return entry.translatedValue;
});
}
And finally results.js, displaying the result along these lines (using formatLookupValue to map the id):
{searchState.pageResults.map((item) => {
  return (
    <tr key={item.id}>
      <td><Link to={`/getItem/${item.id}`}>{item.title}</Link></td>
      <td>{item.detail}</td>
      <td>{formatLookupValue(item.stateId)}</td>
    </tr>
  );
})}
I have a pretty weird case to handle.
We have a few boxes, and we can call some action on each box. When we click the button inside a box, we call some endpoint on the server (using axios). The response from the server returns new, updated information (about all boxes, not only the one on which we called the action).
Issue:
If the user clicks the submit button on many boxes really fast, the requests hit the endpoints one by one. This sometimes causes errors, because things are calculated on the server in the wrong order (the status of a group of boxes depends on the status of a single box). I know it's probably more of a backend issue, but I have to try to fix this on the frontend.
Proposed fix:
In my opinion, the easiest fix in this case is to disable every submit button while any request is in progress. Unfortunately, this solution feels very slow, and the head of the project rejected the proposal.
What we want to achieve:
In some way, we want to queue the requests without disabling every button. The perfect solution for me at this moment:
click the first button - call the endpoint, request pending on the server.
click the second button - the button shows a spinner/loading indicator without calling the endpoint.
the server responds to the first click - only then do we actually send the second request.
I think something like this is a huge antipattern, but I don't set the rules. ;)
I was reading about e.g. redux-observable, but if I don't have to, I don't want to add another Redux middleware (we currently use redux-thunk). redux-saga would be OK, but unfortunately I don't know this tool. I prepared a simple codesandbox example (I added timeouts in the redux actions for easier testing).
I have only one naive proposal: create an array of the data needed to send the correct request and, inside useEffect, check whether the array length is equal to 1. Something like this:
const App = ({ boxActions, inProgress, ended }) => {
  const [queue, setQueue] = useState([]);
  const handleSubmit = async () => {
    // This code does not work correctly; it only shows what I was thinking about.
    if (queue.length === 1) {
      const [data] = queue;
      await boxActions.submit(data.id, data.timeout);
      setQueue(queue.filter((item) => item.id !== data.id));
    }
  };
  useEffect(() => {
    handleSubmit();
  }, [queue]);
  return (
    <>
      <div>
        {config.map((item) => (
          <Box
            key={item.id}
            id={item.id}
            timeout={item.timeout}
            handleSubmit={(id, timeout) => setQueue([...queue, { id, timeout }])}
            inProgress={inProgress.includes(item.id)}
            ended={ended.includes(item.id)}
          />
        ))}
      </div>
    </>
  );
};
Any ideas?
I agree with your assessment that we ultimately need to make changes on the backend. Any user can mess with the frontend and submit requests in any order they want, regardless of how you organize it.
I get it though, you're looking to design the happy path on the frontend such that it works with the backend as it is currently.
It's hard to tell without knowing the use-case exactly, but there may generally be some improvements we can make from a UX perspective that will apply whether we make fixes on the backend or not.
Is there an endpoint to send multiple updates to? If so, we could debounce our network call to submit only when there is a delay in user activity.
Does the user need to be aware of order of selection and the impacts thereof? If so, it sounds like we'll need to update frontend to convey this information, which may then expose a natural solution to the situation.
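On the first point, if such a batch endpoint exists, a rough debounce sketch could look like the following; the endpoint, the payload shape, and the use of lodash's debounce helper are all assumptions for illustration.
import axios from 'axios';
import debounce from 'lodash/debounce';

// Collect pending box updates and flush them in a single request once the
// user has been idle for 500 ms. Endpoint and payload shape are hypothetical.
let pendingUpdates = [];

const flushUpdates = debounce(() => {
  const updates = pendingUpdates;
  pendingUpdates = [];
  axios.post('/api/boxes/batch-update', { updates });
}, 500);

export const queueUpdate = (boxId, action) => {
  pendingUpdates.push({ boxId, action });
  flushUpdates();
};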
It's fairly simple to create a request queue and execute them serially, but it seems potentially fraught with new challenges.
E.g. If a user clicks 5 checkboxes, and order matters, a failed execution of the second update would mean we would need to stop any further execution of boxes 3 through 5 until update 2 could be completed. We'll also need to figure out how we'll handle timeouts, retries, and backoff. There is some complexity as to how we want to convey all this to the end user.
Let's say we're completely set on going that route, however. In that case, your use of Redux for state management isn't terribly important, nor is the library you use for sending your requests.
As you suggested, we'll just create an in-memory queue of updates to be made and dequeue serially. Each time a user makes an update to a box, we'll push to that queue and attempt to send updates. Our processEvents function will retain state as to whether a request is in motion or not, which it will use to decide whether to take action or not.
Each time a user clicks a box, the event is added to the queue, and we attempt processing. If processing is already ongoing or we have no events to process, we don't take any action. Each time a processing round finishes, we check for further events to process. You'll likely want to hook into this cycle with Redux and fire new actions to indicate event success and update the state and UI for each event processed and so on. It's possible one of the libraries you use offer some feature like this as well.
// Get a better Queue implementation if queue size may get high.
class Queue {
_store = [];
enqueue = (task) => this._store.push(task);
dequeue = () => this._store.shift();
length = () => this._store.length;
}
export const createSerialProcessor = (asyncProcessingCallback) => {
const updateQueue = new Queue();
const addEvent = (params, callback) => {
updateQueue.enqueue([params, callback]);
};
const processEvents = (() => {
let isReady = true;
return async () => {
if (isReady && updateQueue.length() > 0) {
const [params, callback] = updateQueue.dequeue();
isReady = false;
await asyncProcessingCallback(params, callback); // retries and all that include
isReady = true;
processEvents();
}
};
})();
return {
process: (params, callback) => {
addEvent(params, callback);
processEvents();
}
};
};
Hope this helps.
Edit: I just noticed you included a codesandbox, which is very helpful. I've created a copy of your sandbox with updates made to achieve your end and integrate it with your Redux setup. There are some obvious shortcuts still being taken, like the Queue class, but it should be about what you're looking for: https://codesandbox.io/s/dank-feather-hqtf7?file=/src/lib/createSerialProcessor.js
In case you would like to use redux-saga, you can use the actionChannel effect in combination with the blocking call effect to achieve your goal:
Working fork:
https://codesandbox.io/s/hoh8n
Here is the code for boxSagas.js:
import {actionChannel, call, delay, put, take} from 'redux-saga/effects';
// import axios from 'axios';
import {submitSuccess, submitFailure} from '../actions/boxActions';
import {SUBMIT_REQUEST} from '../types/boxTypes';
function* requestSaga(action) {
try {
// const result = yield axios.get(`https://jsonplaceholder.typicode.com/todos`);
yield delay(action.payload.timeout);
yield put(submitSuccess(action.payload.id));
} catch (error) {
yield put(submitFailure());
}
}
export default function* boxSaga() {
const requestChannel = yield actionChannel(SUBMIT_REQUEST); // buffers incoming requests
while (true) {
const action = yield take(requestChannel); // takes a request from queue or waits for one to be added
yield call(requestSaga, action); // starts request saga and _waits_ until it is done
}
}
I am using the fact that the box reducer handles the SUBMIT_REQUEST actions immediately (and sets the given id as pending), while the actionChannel+call combination handles them sequentially, so the actions trigger only one HTTP request at a time.
More on action channels here:
https://redux-saga.js.org/docs/advanced/Channels/#using-the-actionchannel-effect
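For illustration, the reducer side mentioned above (not shown in this answer) might look roughly like this, assuming pending and finished boxes are tracked as arrays of ids; the action payload shapes are assumptions.
// boxReducer.js (illustrative sketch; the actual sandbox code may differ)
import { SUBMIT_REQUEST, SUBMIT_SUCCESS, SUBMIT_FAILURE } from '../types/boxTypes';

const initialState = { inProgress: [], ended: [] };

export default function boxReducer(state = initialState, action) {
  switch (action.type) {
    case SUBMIT_REQUEST:
      // Mark the box as pending immediately, even though the saga will only
      // start its request once the earlier ones have finished.
      return { ...state, inProgress: [...state.inProgress, action.payload.id] };
    case SUBMIT_SUCCESS:
      return {
        ...state,
        inProgress: state.inProgress.filter((id) => id !== action.payload.id),
        ended: [...state.ended, action.payload.id],
      };
    case SUBMIT_FAILURE:
      // Assumed handling: clear pending state on failure.
      return { ...state, inProgress: [] };
    default:
      return state;
  }
}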
Just store the promise from the previous request and wait for it to resolve before initiating the next request. The example below uses a global variable for simplicity, but you can use something else to preserve state across requests (e.g. the extraArgument from the thunk middleware).
// boxActions.ts
let submitCall = Promise.resolve();
export const submit = (id, timeout) => async (dispatch) => {
dispatch(submitRequest(id));
submitCall = submitCall.then(() => axios.get(`https://jsonplaceholder.typicode.com/todos`))
try {
await submitCall;
setTimeout(() => {
return dispatch(submitSuccess(id));
}, timeout);
} catch (error) {
return dispatch(submitFailure());
}
};
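If you'd rather not keep that promise in module scope, a rough sketch of the same idea using redux-thunk's withExtraArgument could look like this; the store setup is assumed, and submitRequest/submitSuccess/submitFailure are the existing action creators from the sandbox.
// store.js (sketch)
import { createStore, applyMiddleware } from 'redux';
import thunk from 'redux-thunk';
import rootReducer from './reducers';

// Holds the tail of the request chain across dispatches.
const requestChain = { current: Promise.resolve() };

export const store = createStore(
  rootReducer,
  applyMiddleware(thunk.withExtraArgument(requestChain)),
);

// boxActions.js (sketch)
import axios from 'axios';

export const submit = (id, timeout) => async (dispatch, getState, chain) => {
  dispatch(submitRequest(id));
  // Chain onto whatever request is currently pending so calls run serially.
  chain.current = chain.current.then(() =>
    axios.get('https://jsonplaceholder.typicode.com/todos'),
  );
  try {
    await chain.current;
    dispatch(submitSuccess(id));
  } catch (error) {
    dispatch(submitFailure());
  }
};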
I have an application that triggers many updates, and I would like to know the best way to update the app properly.
In my app, I have 5 slots to fill with books (they can be managed by drag and drop). When the app launches, the user's filled books are loaded and stored in the state.
Problem: when I update a book, for example if I switch the position of 2 books in my list, I must do some operations to say "this book belongs here now and the other one belongs there now, switch!"
I feel like I'm doing some tedious work, because if I just returned the whole data set from my API call (a GET after updating) and called the "load" function (as I do when the app launches), I would not have to handle the update manually.
Plus, it could create bugs if I'm loading correctly but not updating correctly (if I get the position of a book wrong, for example).
The benefit I see in a functional update is that I only update the 2 books I need, instead of reloading all of them again and again.
Which way would be better? Should I get rid of those update functions and just reload the data entirely? I think there could also be libraries that cache the data so that only modified books re-render.
Thank you
Without code it is difficult to fully understand the problem, but getting the data from the server has 2 advantages:
You are sure the UI shows the data as it is on the server.
Your client code does not need to contain the logic of what needs to happen; the server has this logic. When the logic is refactored in some way, the two don't go out of sync.
Because of this I usually choose to get the data as is on the server.
One problem with fetching data based on user interaction is that fetching is async so the following can happen:
The user does action A, a request is made for A, the user does action B, a request is made for B, the B request resolves and the UI is set to the result of request B, then the request for A resolves and the UI is set to the result of A.
So the order the user does the actions does not guarantee the order in which the requests are resolved.
To solve this you can use a helper that resolves only if it was last requested, in the example above when A request resolves the UI does not need to be set with anything because it has already been replaced with another request.
In the example below you can type a search value; when the value is 1 character long it'll take 2 seconds to resolve, so when you type ab the ab request will resolve before the a request. But because the function making the request is wrapped with the last helper, when a resolves it will be rejected, because it has been replaced by the newer request ab.
//constant to reject with when request is replaced with a
// more recent request
const REPLACED = {
message: 'replaced by more recent request',
};
//helper to resolve only last requested promise
const last = (fn) => {
const check = {};
return (...args) => {
const current = {};
check.current = current;
return Promise.resolve()
.then(() => fn(...args))
.then((result) => {
//see if current request is last request
if (check.current === current) {
return result;
}
//was not last request so reject
return Promise.reject(REPLACED);
});
};
};
const later = (howLong, value) =>
new Promise((resolve) =>
setTimeout(() => resolve(value), howLong)
);
const request = (value) =>
later(value.length === 1 ? 2000 : 10, value).then(
(result) => {
console.log('request resolved:', result);
return result;
}
);
const lastRequest = last(request);
const App = () => {
const [search, setSearch] = React.useState('');
const [result, setResult] = React.useState('');
React.useEffect(() => {
//if you use request instead of lastRequest here
// you see it will break, UI is updated as requests
// resolve without checking if it was the last request
lastRequest(search)
.then((result) => setResult(`result:${result}`))
.catch((err) => {
console.log(
'rejected with:',
err,
'for search:',
search
);
if (err !== REPLACED) {
//if the reject reason is not caused because request was
// replaced by a newer then reject this promise
return Promise.reject(err);
}
});
}, [search]);
return (
<div>
<label>
search
<input
type="text"
value={search}
onChange={(e) => setSearch(e.target.value)}
></input>
</label>
<div>{result}</div>
</div>
);
};
ReactDOM.render(<App />, document.getElementById('root'));
<script src="https://cdnjs.cloudflare.com/ajax/libs/react/16.8.4/umd/react.production.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/react-dom/16.8.4/umd/react-dom.production.min.js"></script>
<div id="root"></div>
In my React/Redux app, I make a backend API call to create an entry in a calendar. This is initiated in my handler function which calls the action creator.
Once this initial step is done, I check whether the entry the user has just created has the same date as the current date my calendar component is showing. If so, I call the backend API to get calendar events. I do this to refresh the calendar.
As I step through the process, everything seems to be working fine BUT my calendar does not show updated data.
Here comes the weird part: as I step through this process, everything works and the calendar updates fine. In other words, if I somehow slow down the process, everything seems to be working perfectly fine.
If I don't slow down the process, the calendar fails to update. There are no errors. And as I said, as I step through the process, I see that the API returns the correct data, the SET_CALENDAR_EVENTS action creator gets called, which then hits the reducer, and the reducer sets the data.
Like I said, there are no problems except if I let it happen without slowing down the process, the calendar doesn't update.
Any idea what's causing this? Any suggestions?
My handler function code looks like this:
clickHandleCreateEvent(event) {
// Call API
this.props.actions.createEvent(event);
// Get active date
const activeDate = this.props.activeDate;
if(activeDate === event.eventDate) {
this.props.actions.getCalendarEvents(activeDate);
}
}
UPDATE:
Here's my getCalendarEvents function:
export const getCalendarEntries = (calendarId, date) => {
// Create get calendar entries object
var request = {
id: calendarId,
date: date
};
// Get calendar entries
return (dispatch) => fetch('/api/calendars/entries', fetchOptionsPost(request))
.then((response) => {
if (response.ok) {
// Got events
parseJSON(response)
.then(entries => {
dispatch(setEvents(entries))
})
.then(() => dispatch(setCalendarIsLoading(false)))
} else {
// Couldn't get data
dispatch(setBadRequest(true))
}
})
}
Since both createEvent and getCalendarEvents are async functions involving network communication, there is no guarantee which request reaches the server first. So you might read old data while the createEvent request is still travelling over the wire.
To avoid this you need to synchronize both requests, i.e. call getCalendarEvents only after the server has responded OK to the createEvent request.
clickHandleCreateEvent(event) {
  // Call API
  return this.props.actions
    .createEvent(event)
    .then(() => {
      // Get active date
      const activeDate = this.props.activeDate;
      if (activeDate === event.eventDate) {
        return this.props.actions.getCalendarEvents(activeDate);
      }
    });
}
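Note that this only works if createEvent actually returns the promise from its thunk. A rough sketch of what that might look like, reusing the fetchOptionsPost/parseJSON helpers from the question; the endpoint and the eventCreated action creator are assumptions, not the asker's real code.
// userActions.js (sketch; the real createEvent may differ)
export const createEvent = (event) => (dispatch) =>
  fetch('/api/calendars/entries/create', fetchOptionsPost(event))
    .then((response) => {
      if (response.ok) {
        // Assumed action creator that stores the newly created entry.
        return parseJSON(response).then((created) => dispatch(eventCreated(created)));
      }
      dispatch(setBadRequest(true));
    });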
In my React app I have an input element. The search query should be memoized, which means that if the user has previously searched for 'John' and the API has provided valid results for that query, then the next time the user types 'Joh', there should be a suggestion for the user based on the previously memoized values (in this case 'John' would be suggested).
I am new to React and am trying caching for the first time. I read a few articles but couldn't implement the desired functionality.
You don't clarify which API you're using or which stack; the solution would vary somewhat depending on whether you are using XHR requests or something over GraphQL.
For an asynchronous XHR request to some backend API, I would do something like the example below.
Query the API for the search term
_queryUserXHR = (searchTxt) => {
  jQuery.ajax({
    type: "GET",
    url: url,
    data: searchTxt,
    success: (data) => {
      this.setState({
        previousQueries: this.state.previousQueries.concat([searchTxt]),
      });
    },
  });
};
You would run this function whenever you want to do the check against your API. If the API can find the search string you query, then insert that data into a local state array variable (previousQueries in my example).
You could also return the data to be inserted from the database if there are values unknown to your view (e.g. a database id). Above I just insert the searchTxt, which is what we pass into the function based on what the user typed in the input field. The choice is yours here.
Get suggestions for previously searched terms
I would start by adding an input field that runs a function on the onKeyPress event:
<input type="text" onKeyPress={this._getSuggestions} />
then the function would be something like:
_getSuggestions = (e) => {
  let inputValue = e.target.value;
  let { previousQueries } = this.state;
  let results = [];
  previousQueries.forEach((q) => {
    if (q.toString().indexOf(inputValue) > -1) {
      results.push(q);
    }
  });
  this.setState({ suggestions: results });
};
Then you can output this.state.suggestions somewhere and add behavior there. Perhaps some keyboard navigation or something. There are many different ways to implement how the results are displayed and how you would select one.
Note: I haven't tested the code above
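As a rough illustration (not part of the original answer), rendering the collected suggestions could be as simple as the snippet below; how a suggestion is selected is left up to you.
// Sketch only: one minimal way to display this.state.suggestions
<ul>
  {this.state.suggestions.map((suggestion) => (
    <li key={suggestion}>{suggestion}</li>
  ))}
</ul>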
I guess you have a function somewhere that queries the server, such as:
const queryServer = function(queryString) {
/* access the server */
}
The trick would be to memoize this core function only, so that your UI thinks it's actually accessing the server.
In JavaScript it is very easy to implement your own memoization decorator, but you could use existing ones. For example, lru-memoize looks popular on npm. You use it this way:
const memoize = require('lru-memoize')
const queryServer_memoized = memoize(100)(queryServer)
This code keeps in memory the last 100 request results. Next, in your code, you call queryServer_memoized instead of queryServer.
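For example, in a change handler it would be a drop-in replacement; this is illustrative only and assumes queryServer returns a promise and that setResults is however you store results in component state.
// Sketch: using the memoized query in place of queryServer
const handleSearchChange = (event) => {
  const queryString = event.target.value;
  // Repeated queries for the same string hit the in-memory LRU cache
  // instead of the server.
  queryServer_memoized(queryString).then((results) => setResults(results));
};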
You can create a memoization function:
const memo = (callback) => {
// We will save the key-value pairs in the following variable. It will be our cache storage
const cache = new Map();
return (...args) => {
// The key will be used to identify the different arguments combination. Same arguments means same key
const key = JSON.stringify(args);
// If the cache storage has the key we are looking for, return the previously stored value
if (cache.has(key)) return cache.get(key);
// If the key is new, call the function (in this case fetch)
const value = callback(...args);
// And save the new key-value pair to the cache
cache.set(key, value);
return value;
};
};
const memoizedFetch = memo(fetch);
This memo function will act like a key-value cache. If the params (in our case the URL) of the function (fetch) are the same, the function will not be executed. Instead, the previous result will be returned.
So you can just use this memoized version, memoizedFetch, in your useEffect to make sure network requests are not repeated for that particular request.
For example you can do:
// Place this outside your react element
const memoizedFetchJson = memo((...args) => fetch(...args).then(res => res.json()));
useEffect(() => {
memoizedFetchJson(`https://pokeapi.co/api/v2/pokemon/${pokemon}/`)
.then(response => {
setPokemonData(response);
})
.catch(error => {
console.error(error);
});
}, [pokemon]);
Demo integrated in React