What would be the difference between the two approaches below?
export function* watchLoginUser() {
  yield takeEvery(USER_LOGIN, loginUser)
}

export function* watchLogoutUser() {
  yield takeEvery(USER_LOGOUT, logoutUser)
}

export function* watchGetParties() {
  yield takeEvery(PARTIES_GET, getParties)
}

export default function* root() {
  yield [
    fork(watchLoginUser),
    fork(watchLogoutUser),
    fork(watchGetParties)
  ]
}

export default function* root() {
  yield [
    takeEvery(USER_LOGIN, loginUser),
    takeEvery(USER_LOGOUT, logoutUser),
    takeEvery(PARTIES_GET, getParties)
  ]
}
When do I need to use fork and when not?
In general, fork is useful when a saga needs to start a non-blocking task. Non-blocking here means: the caller starts the task and continues executing without waiting for it to complete.
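As a minimal sketch of that difference (fetchUser and fetchPosts are assumed placeholder sagas, not from your code):

import { call, fork } from 'redux-saga/effects'

// fetchUser and fetchPosts are assumed worker sagas defined elsewhere
function* blockingVersion() {
  yield call(fetchUser)   // blocks: fetchPosts only starts after fetchUser finishes
  yield call(fetchPosts)
}

function* nonBlockingVersion() {
  yield fork(fetchUser)   // returns immediately with a task descriptor
  yield fork(fetchPosts)  // both sagas now run concurrently with this one
}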
There are a variety of situations where this is useful, but the two main ones are:
grouping sagas by logical domain
keeping a reference to a task in order to be able to cancel/join it
Your top-level saga can be an example of the first use-case. You'll likely have something like:
yield fork(authSaga);
yield fork(myDomainSpecificSaga);
// you could also use something like yield [] here,
// but it wouldn't make any difference in this case
Where authSaga will likely include things like:
yield takeEvery(USER_REQUESTED_LOGIN, authenticateUser);
yield takeEvery(USER_REQUESTED_LOGOUT, logoutUser);
You can see that this example is equivalent to what you suggested: calling fork on a saga that yields a takeEvery call. In practice, though, you only need to do this for code organisation purposes; takeEvery is itself a forked task, so in most cases the extra fork is redundant.
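For reference, the redux-saga documentation gives a simplified equivalent of takeEvery that makes this explicit:

import { fork, take } from 'redux-saga/effects'

// simplified equivalent of takeEvery, adapted from the redux-saga docs:
// it is already built on fork, so wrapping it in another fork adds nothing
const takeEveryEquivalent = (pattern, saga, ...args) => fork(function* () {
  while (true) {
    const action = yield take(pattern)
    yield fork(saga, ...args.concat(action))
  }
})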
An example of the second use-case would be something like:
yield take(USER_WAS_AUTHENTICATED);
const task = yield fork(monitorUserProfileUpdates);
yield take(USER_SIGNED_OUT);
yield cancel(task);
You can see in this example that monitorUserProfileUpdates will execute while the caller saga resumes and waits for the USER_SIGNED_OUT action to be dispatched. In addition, the caller keeps a reference to the task so it can cancel it when needed.
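If you need to block on a forked task later instead of cancelling it, join covers the other half of that use-case; a small sketch, with backgroundSync as an assumed placeholder saga:

import { fork, join } from 'redux-saga/effects'

function* parentSaga() {
  const task = yield fork(backgroundSync) // non-blocking start
  // ... do other work here ...
  yield join(task)                        // now block until backgroundSync completes
}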
For the sake of completeness, there is another way to start non-blocking calls: spawn. fork and spawn differ in how errors and cancellations bubble from child to parent saga.
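A rough sketch of that difference, using errorProneSaga as an assumed placeholder:

import { fork, spawn } from 'redux-saga/effects'

function* parentSaga() {
  yield fork(errorProneSaga)  // attached: an uncaught error here also aborts parentSaga,
                              // and cancelling parentSaga cancels this task too
  yield spawn(errorProneSaga) // detached: errors and cancellation do not propagate
                              // between parentSaga and this task
}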
fork is particularly useful when a saga dispatches multiple API calls, because you can cancel those in-flight fetches through the task returned by fork, e.g. cancel(task1).
This is handy if the end user forcefully exits the application, or if one of the tasks fails in a way that breaks your logic and it becomes reasonable to cancel or terminate the tasks the saga is currently running.
There are two ways to cancel such a task.
The first is based on the redux-saga documentation on non-blocking effects and cancellation:
import { take, put, call, fork, cancel } from 'redux-saga/effects'
// ...
function* loginFlow() {
  while (true) {
    const {user, password} = yield take('LOGIN_REQUEST')
    // Non-Blocking Effect which is the fork
    const task = yield fork(authorize, user, password)
    const action = yield take(['LOGOUT', 'LOGIN_ERROR'])
    if (action.type === 'LOGOUT') {
      // cancel the task
      yield cancel(task)
      yield call(Api.clearItem, 'token')
    }
  }
}
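For context, the authorize task that loginFlow forks comes from the same documentation example; a condensed sketch of it (the Api methods and action types are taken from that example, not from your code):

import { call, put, cancelled } from 'redux-saga/effects'

function* authorize(user, password) {
  try {
    const token = yield call(Api.authorize, user, password)
    yield put({type: 'LOGIN_SUCCESS', token})
    yield call(Api.storeItem, {token})
  } catch (error) {
    yield put({type: 'LOGIN_ERROR', error})
  } finally {
    if (yield cancelled()) {
      // the fork was cancelled by loginFlow (e.g. LOGOUT arrived first); clean up here if needed
    }
  }
}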
OR
import {call, put, fork, delay, takeEvery} from 'redux-saga/effects';
import someAction from 'action/someAction';

function* fetchAll() {
  yield fork(fetcher, 'users');
  yield fork(fetcher, 'posts');
  yield fork(fetcher, 'comments');
  yield delay(1500);
}

function* fetcher(endpoint) {
  const res = yield call(fetchAPI, endpoint);
  if (!res.status) {
    throw new Error(`Error: ${res.error}`);
  }
  yield put(someAction({payload: res.payload}));
}

function* worker() {
  try {
    yield call(fetchAll);
  } catch (err) {
    // handle fetchAll errors
  }
}

function* watcher() {
  yield takeEvery(BLOGS.PUSH, worker);
}
You're welcome :)
Related
I have a React project set up to work with redux-saga, but for some reason I'm unable to cancel a running saga/action task. The expectation is that, after some action (like the user navigating away, or clicking a button), the running saga would be cancelled. I tried catching the cancellation, but it only happens after the saga has run completely:
function* fetchOverviewSaga(): SagaReturnType<any> {
  try {
    yield delay(5000);
    console.log('still running');
    const response = yield call(getOverviewData);
    console.log('still running');
    yield all([
      put(updateTagsPendingState(false)),
      put(updateTagsDataState(response.items)),
    ]);
  } finally {
    if (yield cancelled()) {
      console.log('saga task canceled');
    }
  }
}

function* cancelOverviewSaga(): SagaReturnType<any> {
  const runningAction = yield fork(fetchOverviewSaga);
  yield cancel(runningAction);
}

function* overviewSaga() {
  yield all([
    takeLatest(startOverview, fetchOverviewSaga),
    takeLatest(cancelOverview, cancelOverviewSaga)
  ]);
}
The result is that, even after the cancel action is dispatched (cancelOverviewSaga), fetchOverviewSaga keeps running and only hits the if (yield cancelled()) branch after it has completely finished. I'm not sure if this is the expected behaviour; I would have expected it to cancel when requested. Any ideas are most welcome.
*Edit:
Upon dispatching the cancel action, fetchOverviewSaga does seem to be cancelled, since it logs saga task canceled. However, the block that still runs afterwards is the yield all([...]) part, so judging by the console, the problem probably lies within that block.
*Edit2:
To better illustrate the behaviour:
dispatch startOverview action
immediately cancel it
the log: saga task canceled
after 5000ms (when the delay in fetchOverviewSaga finishes):
the log 'still running' appears twice and the yield all block from fetchOverviewSaga dispatches its actions
The same saga can run in multiple instances independently. So e.g. if you do
const task1 = yield fork(mySaga);
const task2 = yield fork(mySaga);
task1 === task2 // false
it will run two independent instances of mySaga, each with its own task. If you cancel one, it doesn't cancel the other one. So forking and immediately canceling your fetchOverviewSaga saga in cancelOverviewSaga will have no effect on the saga that is running as a result of dispatching the startOverview action.
In this case you can instead use e.g. the race effect to achieve your goal:
function* fetchOverviewSaga() {
  try {
    // your fetching logic ...
  } finally {
    if (yield cancelled()) {
      console.log("saga task canceled");
    }
  }
}

function* startOverviewSaga() {
  yield race([
    call(fetchOverviewSaga),
    take(cancelOverview),
  ]);
}

function* overviewSaga() {
  yield takeLatest(startOverview, startOverviewSaga);
}
The race effect waits for one of the items in the array to finish and then cancels all the other ones, and so:
If the cancelOverview action is dispatched before fetchOverviewSaga finishes, it will cancel fetchOverviewSaga.
If fetchOverviewSaga finishes before a cancel action is dispatched, it will simply stop waiting for the cancel action.
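If you also need to know which branch won, race accepts an object of effects instead of an array; a sketch using the same assumed action names:

function* startOverviewSaga() {
  const { cancelled } = yield race({
    result: call(fetchOverviewSaga),
    cancelled: take(cancelOverview),
  });
  if (cancelled) {
    // the cancelOverview action won the race; react to it here if needed
  }
}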
Working demo:
https://codesandbox.io/s/https-stackoverflow-com-questions-69717869-redux-saga-unable-to-cancel-task-vu4b0?file=/src/index.js
When you cancel a saga, all of its sub-tasks are cancelled as well. Example:
function* mainTask() {
  const task = yield fork(subTask1) // or call
  yield cancel(task)
}

function* subTask1() {
  yield fork(subTask2) // or call
}

function* subTask2() {
  ...
}
In this example, since cancel propagates downward, subTask1 will be cancelled through mainTask and subTask2 will be cancelled through subTask1.
However, when you yield put, you dispatch an action, which is then handled by a reducer or listened for in another saga. It is like sending an erroneous e-mail and then sending a second e-mail to correct it. So you can dispatch compensating actions inside if (yield cancelled()) to undo what the earlier actions did.
One way
} finally {
  if (yield cancelled()) {
    yield all([
      put(cancelUpdateTagsPendingState()),
      put(cancelUpdateTagsDataState()),
    ])
  }
}
Helpful docs
I am dispatching an action, let's say "GET_STATUS", in a loop X number of times from a component.
In my saga file I have:
function* actionWatcher() {
  yield all([
    takeLatest(Actions.GET_LATEST, getLatest),
  ]);
}
Inside the getLatest* generator there is this API call:
//Some code
const results = yield call(api, {params});
//code after
callback()
I can clearly see the API being called X number of times in the network tab, and in the Chrome debugger I can see that //Some code is executed X number of times. But //code after is executed only once at the end, and the callback function is called just once at the end.
I am expecting it to be called for each occurrence.
If multiple Actions.GET_LATEST happen in rapid succession, then takeLatest is designed to cancel the old saga, and start a new one. If the saga is canceled while it's executing const results = yield call(api, {params});, that means it will never get to callback()
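For reference, the redux-saga documentation describes takeLatest as roughly the following fork/cancel loop, which is why the older in-flight call never reaches callback():

import { cancel, fork, take } from 'redux-saga/effects'

// simplified equivalent of takeLatest, adapted from the redux-saga docs
const takeLatestEquivalent = (pattern, saga, ...args) => fork(function* () {
  let lastTask
  while (true) {
    const action = yield take(pattern)
    if (lastTask) {
      yield cancel(lastTask) // cancel is a no-op if the task has already terminated
    }
    lastTask = yield fork(saga, ...args.concat(action))
  }
})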
If you don't want them to be canceled, then use takeEvery instead of takeLatest
function* actionWatcher() {
  yield all([
    takeEvery(Actions.GET_LATEST, getLatest),
  ]);
}
If you want to keep the cancellation, but you need the callback to be called even if it's cancelled, you can use a try/finally:
function* getLatest() {
  try {
    const results = yield call(api, {params});
  } finally {
    // This code will run whether it completes successfully, throws an error, or is cancelled
    callback();
  }
}
If you need to specifically check whether it was cancelled in order to perform custom logic, you can do so with an if (yield cancelled()) in the finally block:
function* getLatest() {
  try {
    const results = yield call(api, {params});
    callback(); // This line will only run if it's not cancelled and does not throw
  } finally {
    if (yield cancelled()) {
      // This line will only run if cancelled
    }
  }
}
I have the following scenario:
export function* addCircle(circleApi, { payload }) {
  try {
    const response = yield apply(
      circleApi,
      circleApi.addCircle,
      [payload]
    );
    if (response.error_type) {
      yield put(addCircleFailedAction(response.error));
    } else {
      yield put(addCircleSucceededAction(response));
    }
  } catch (err) {
    console.error(err);
  }
}

export function* addTender(tenderApi, { payload }) {
  try {
    // NOTE: I want this to finish before continuing with rest of saga below.
    yield call(addCircleAction(payload.circlePayload));
    // Rest of saga removed for brevity.
  } catch (err) {
    console.error(err);
  }
}
So, basically addCircle is making an API call and, depending on its success, I dispatch the appropriate Redux action. Now, inside another saga I dispatch the action responsible for the addCircle saga, and I want it to finish execution before I continue with the rest of the saga. I tried to use call, but it basically doesn't wait for the addCircle saga to finish executing. Is there any way to wait for it? I dispatch addCircle from inside my components and there I didn't need to wait for it, but in this specific instance I have to trigger it inside the saga, so I really need to wait for it to finish execution and change the state of the app, so that I can use the updated state in the rest of the addTender saga. Any ideas?
As per your code snippet, your addCircle saga will dispatch either the addCircleFailedAction or the addCircleSucceededAction action creator just before it finishes execution. So we will have to wait for those actions in your addTender saga.
Basically, this is what you should do. I'm just guessing your action types based on action creator names.
yield call(addCircleAction(payload.circlePayload));
yield take([ADD_CIRCLE_FAILED_ACTION, ADD_CIRCLE_SUCCEEDED_ACTION]);
// Rest of the saga
There is one edge case though: you are not dispatching any action in the catch block of your addCircle saga. You could dispatch an action called addCircleExceptionAction inside the catch block and wait for it along with the other actions like this:
yield take([ADD_CIRCLE_FAILED_ACTION, ADD_CIRCLE_SUCCEEDED_ACTION, ADD_CIRCLE_EXCEPTION_ACTION]);
If you are dispatching multiple actions that would trigger addTender, then there is no guarantee that take(...) would actually wait for the action that resulted from the yield call.
export function* addCircle(circleApi, { payload }) {
  try {
    const response = yield apply(
      circleApi,
      circleApi.addCircle,
      [payload]
    );
    if (response.error_type) {
      yield put(addCircleFailedAction(response.error));
      return response;
    } else {
      yield put(addCircleSucceededAction(response));
      return response;
    }
  } catch (err) {
    console.error(err);
    return {err};
  }
}

export function* addTender(tenderApi, { payload }) {
  try {
    // Because the addCircle saga returns something, you can reuse it in other sagas.
    const result = yield call(addCircle, circleApi /* ? */, payload.circlePayload);
    // check for result.error_type here
    // Rest of saga removed for brevity.
  } catch (err) {
    console.error(err);
  }
}
Your code and the accepted answer would result in an error because call does not take an action object as first argument (it does take a {context,fn} type object).
Dispatching an action and then listening to another action that may or may not have been a side effect of the action you just dispatched is bad design. You dispatch these actions asynchronously and there is no guarantee they all take the same time to complete or provide the side effect you are waiting for in the same order as they were started.
I am trying to upload multiple files from my React Native app. It's giving an Unexpected Token error on the yield statement.
Is it possible to do yield inside a loop?
files.map((fileOb) => {
  const response = yield call(FileManager.uploadFile, fileOb)
  yield put(Actions.fileUploaded(response))
})
Thanks,
Sorry for my bad English
In your example above, you're yielding inside the callback passed to files.map. It doesn't work because you can use yield only inside a Generator function.
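If sequential uploads are acceptable, one straightforward fix is a plain for...of loop, which keeps the yields inside the generator itself (this sketch reuses the FileManager and Actions names from your snippet):

function* uploadFilesSequentially(files) {
  // a plain for...of loop keeps the yields inside the generator,
  // so the files are uploaded one after another (not in parallel)
  for (const fileOb of files) {
    const response = yield call(FileManager.uploadFile, fileOb)
    yield put(Actions.fileUploaded(response))
  }
}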
To handle parallel requests, you can either yield arrays of effects
function* uploadFiles(files) {
  const responses = yield files.map(fileOb => {
    return call(FileManager.uploadFile, fileOb)
  })
  yield responses.map(response => {
    return put(Actions.fileUploaded(response))
  })
}
Note that in this case all calls must succeed in order to dispatch the actions, i.e. the actions will not be dispatched until all calls resolve successfully (otherwise the saga will cancel the remaining calls and raise an error).
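As an aside, newer redux-saga versions prefer wrapping the array in an explicit all effect rather than yielding a bare array; a sketch of the same parallel upload in that form:

import { all, call, put } from 'redux-saga/effects'

function* uploadFiles(files) {
  // all([...]) is the explicit parallel-effect form recommended in redux-saga v1+
  const responses = yield all(files.map(fileOb => call(FileManager.uploadFile, fileOb)))
  yield all(responses.map(response => put(Actions.fileUploaded(response))))
}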
Another way (perhaps what you'd expect) is to have parallel sagas for each individual process (call -> put). For example
function* uploadFiles(files) {
  yield files.map(file => call(uploadSingleFile, file))
}

function* uploadSingleFile(file) {
  try {
    const response = yield call(FileManager.uploadFile, file)
    yield put(Actions.fileUploaded(response))
  } catch (err) {
    yield put(Actions.fileUploadedError(err))
  }
}
In the latter example, an upload action will be dispatched as soon as the corresponding call has returned. Also, because we've wrapped each individual process in a try/catch block, any error will be handled individually and won't cause the other upload processes to fail.
This worked for me to pass multiple files to the uploadSingleFile generator function:
function* uploadFiles(files) {
  yield all(files.map(file => call(uploadSingleFile, file)));
}
I'm trying to implement React-boilerplate with redux-saga inside. I'm trying to fetch some data from the server and then redirect to another page. The problem is that before redirecting, the saga makes a second request to the server. I guess there is something wrong with cancelling it. Here is a part of my code:
export function* fetchData() {
  ...
  console.log('fetched');
  yield browserHistory.push('/another-page');
}

export function* fetchDataWatcher() {
  while (yield take('FETCH_DATA')) {
    yield call(fetchData);
  }
}

export function* fetchDataRootSaga() {
  const fetchWatcher = yield fork(fetchDataWatcher);
  yield take(LOCATION_CHANGE);
  yield cancel(fetchWatcher);
}

export default [
  fetchDataRootSaga
]
So in this example I have two console logs, and the second one appears before redirecting. How can I fix it?
And another question. Actually, I have more functions in this file. Should I create a "rootSaga" for each of them, or can I cancel them all in that fetchDataRootSaga()? I mean, is it normal if I cancel sagas this way:
export function* fetchDataRootSaga() {
  const watcherOne = yield fork(fetchDataOne);
  const watcherTwo = yield fork(fetchDataTwo);
  ...
  yield take(LOCATION_CHANGE);
  yield cancel(watcherOne);
  yield cancel(watcherTwo);
  ...
}
Thanks in advance!
P.S. I'm not sure if this code follows best practices. It is inspired by this repository.
Maybe start by adjusting your loop inside fetchDataWatcher to look a little more like this:
export function* fetchDataWatcher() {
  while (true) {
    yield take('FETCH_DATA');
    yield call(fetchData);
  }
}
You can also handle the routing better by doing something like this:
import { push } from 'react-router-redux';
import { put } from 'redux-saga/effects';

export function* fetchData() {
  ...
  console.log('fetched');
  yield put(push('/another-page'));
}
Overall, I would hesitate to dispatch a route change and then, completely separately, do a take on it, unless you really do want to cancel on all location changes (but I assume that's what you're after :) ).
This defeats the purpose of saga, which is to handle potentially long-running async requests and their results. You could instead set a state in your Redux store, like so:
export function* fetchData() {
  ...
  console.log('fetched');
  yield put(setRedirectState('/another-page'));
}
Then check whether the redirect state is set in your container's componentWillUpdate and redirect accordingly with something like this:
import { push } from 'react-router-redux';
dispatch(push(state.redirecturl))
I haven't tried this, but given my experience with React-boilerplate, this is what I would try first.
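As a rough sketch of what that container check could look like (the component name, the redirectUrl state key, and the connect wiring are all assumptions, not code from the boilerplate):

import React from 'react';
import { connect } from 'react-redux';
import { push } from 'react-router-redux';

class MyPageContainer extends React.Component {
  componentWillUpdate(nextProps) {
    // when the saga has set a redirect url in the store, dispatch the route change once
    if (nextProps.redirectUrl && nextProps.redirectUrl !== this.props.redirectUrl) {
      this.props.dispatch(push(nextProps.redirectUrl));
    }
  }

  render() {
    return null; // render the actual page content here
  }
}

const mapStateToProps = (state) => ({
  redirectUrl: state.redirectUrl, // assumed location of the redirect state in the store
});

export default connect(mapStateToProps)(MyPageContainer);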