Create and update inside map function - reactjs

I'm trying to find the right way to create a record and then update a related one inside a map function.
These are the steps I need:
1. The map function reads the array of element ids.
2. A new record is created in the "leads_status" table.
3. Using the new record's id (from "leads_status"), the "leads" table is updated, storing "leads_status.id" in the foreign key column "leads.id_ls".
This is the code I tried.
const [create, { isLoading: isLoadingCreate, error: errorCreate }] = useCreate();
const [record, setRecord] = React.useState(null);

leadsIDS.map((value, index) => {
    create('leads_status', {
        data: {
            id_lead: value,
            id_status: 5
        }
    }, {
        onSuccess: ({ id }) => {
            setRecord([id, value]);
        },
        onError: () => {
            console.log();
        }
    });
    update('leads', {
        id: record[1],
        data: {
            id_ls: record[0]
        }
    }, {
        enabled: !isLoadingCreate && record !== null
    }, {
        onSuccess: () => {
            console.log(record);
        },
        onError: error => notify('Error', { type: 'warning' })
    })
})
I also tried putting the "update" call inside the "create" onSuccess callback, but there too the code does not work as I want.
Records are created in the "leads_status" table for every element of the "leadsIDS" array, but in the "leads" table only 1 record gets updated.
Where am I wrong?

The useCreate and useUpdate hooks are designed for single actions. If you want to chain several actions, I suggest you use the useDataProvider hook instead, which lets you work with Promises directly.
const dataProvider = useDataProvider();
const notify = useNotify();

// ...inside an async handler:
try {
    await Promise.all(leadsIDS.map(async (value, index) => {
        // Create the leads_status record first...
        const { data: leadStatus } = await dataProvider.create('leads_status', {
            data: {
                id_lead: value,
                id_status: 5
            }
        });
        // ...then update the lead with the new record's id.
        await dataProvider.update('leads', {
            id: value,
            data: { id_ls: leadStatus.id }
        });
    }));
} catch (e) {
    notify('Error', { type: 'warning' });
}
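For reference, here is a minimal sketch of how this could be wired into a component. The component name, the handleValidate handler and the call to useRefresh are illustrative assumptions, not part of the original answer; useDataProvider, useNotify and useRefresh are standard react-admin hooks.

import * as React from 'react';
import { useDataProvider, useNotify, useRefresh } from 'react-admin';

// Hypothetical wrapper: `leadsIDS` arrives as a prop.
const BulkStatusButton = ({ leadsIDS }) => {
    const dataProvider = useDataProvider();
    const notify = useNotify();
    const refresh = useRefresh();

    const handleValidate = async () => {
        try {
            await Promise.all(
                leadsIDS.map(async (value) => {
                    // 1. Create the leads_status record.
                    const { data: leadStatus } = await dataProvider.create('leads_status', {
                        data: { id_lead: value, id_status: 5 },
                    });
                    // 2. Point the lead at the newly created status record.
                    await dataProvider.update('leads', {
                        id: value,
                        data: { id_ls: leadStatus.id },
                    });
                })
            );
            notify('Leads updated');
            refresh();
        } catch (e) {
            notify('Error', { type: 'warning' });
        }
    };

    return <button onClick={handleValidate}>Update leads</button>;
};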

Related

How to effectively do optimistic update for deeply nested data in react query?

So I'm making a kanban board style task manager using react and react query. My current implementation of the data fetching is like the following:
const { data } = useQuery('listCollection', getListCollection)
and the content of data is something like this:
// data
{
    listOrder: number[]
    lists: IList[]
}

interface IList {
    id: number
    title: string
    todoOrder: number[]
    todos: ITodo[]
}

interface ITodo {
    id: number
    text: string
    checked: boolean
}
So basically a list collection contains multiple lists, and each list contains multiple todos.
Now, I want this application to do an optimistic update on each mutation (adding a new todo, deleting one, checking a todo, etc.).
Here is my current implementation of the optimistic update for toggling a todo's check:
const mutation = useMutation(
    (data: { todoId: number; checked: boolean }) => editTodo(data),
    {
        onMutate: async (data) => {
            await queryClient.cancelQueries('listCollection')
            const previousState = queryClient.getQueryData('listCollection')
            queryClient.setQueryData('listCollection', (prev: any) => ({
                ...prev,
                lists: prev.lists.map((list: IList) =>
                    list.todoOrder.includes(data.todoId)
                        ? {
                              ...list,
                              todos: list.todos.map((todo) =>
                                  todo.id === data.todoId
                                      ? { ...todo, checked: data.checked }
                                      : todo,
                              ),
                          }
                        : list,
                ),
            }))
            return { previousState }
        },
        onError: (err, newTodo, context) => {
            queryClient.setQueryData(parent, context?.previousState)
        },
        onSuccess: () => queryClient.invalidateQueries(parent),
    },
)
As you can see, that's overly complicated. How should I approach this?
The best way to update deeply nested data in react query is to use the "Immer" library. It is very lightweight and uses proxies so that only the data you actually update gets new references, which reduces re-rendering for the parts that did not change.
import produce from "immer";

const mutation = useMutation(
    (data: { todoId: number; checked: boolean }) => editTodo(data),
    {
        onMutate: async (data) => {
            await queryClient.cancelQueries('listCollection')
            const previousState = queryClient.getQueryData('listCollection')
            // Immer lets us "mutate" a draft; it produces a new state where only
            // the updated parts get new references.
            const updData = produce(previousState, (draftData: any) => {
                draftData.lists.forEach((list: IList) => {
                    // Only touch the list whose todoOrder contains the todo.
                    if (list.todoOrder.includes(data.todoId)) {
                        const todo = list.todos.find((t) => t.id === data.todoId);
                        if (todo) {
                            todo.checked = data.checked;
                        }
                    }
                });
            });
            // Set the query data to the newly produced state.
            queryClient.setQueryData('listCollection', updData);
            return { previousState }
        },
        onError: (err, newTodo, context) => {
            queryClient.setQueryData('listCollection', context?.previousState)
        },
        onSuccess: () => queryClient.invalidateQueries('listCollection'),
    },
)
This should solve your case. You can modify listOrder if needed in the same way lists is updated above; see the sketch below.
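For instance, a minimal sketch of reordering the board inside the same kind of produce callback, assuming a hypothetical mutation payload with fromIndex and toIndex (these names are illustrative, not from the original answer):

const updData = produce(previousState, (draftData) => {
    // Move one list id to a new position; only listOrder gets a new reference.
    const [moved] = draftData.listOrder.splice(data.fromIndex, 1);
    draftData.listOrder.splice(data.toIndex, 0, moved);
});
queryClient.setQueryData('listCollection', updData);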

Update an array relation belongs to many with Strapi controller

I use Strapi v4. I have a link collection and I want to update likes.
How do I update the relation array? When I PUT new data, the old value is replaced by the new one.
Example:
likes: [1]
If I update another time:
likes: [2]
BUT I want this: likes: [1, 2]
I tried this but it doesn't work. Thanks for your reply.
'use strict';

/**
 * link controller
 */

const { createCoreController } = require('@strapi/strapi').factories;

module.exports = createCoreController('api::link.link', ({ strapi }) => ({
    // Method 2: Wrapping a core action (leaves core logic in place)
    async find(ctx) {
        const { data, meta } = await super.find(ctx);
        const linkId = data.map((link) => link.id);
        const allPosts = await strapi.entityService.findMany('api::link.link', {
            fields: ["id"],
            filters: { id: { $in: linkId } },
            populate: {
                likes: { count: true },
            },
        });
        data.forEach(link => {
            link.likes = allPosts.find(({ id }) => id === link.id)?.likes?.count || 0;
        });
        // update value with new array => needs to be fixed
        await strapi.entityService.update("api::link.link", {
            likes: [...allPosts.likes.map(({ id }) => id), ...likes],
        });
        return { data, meta };
    },
}));
This is the part that needs to be fixed. Can you help me? Thanks.
// update value with new array => needs to be fixed
await strapi.entityService.update("api::link.link", {
    likes: [...allPosts.likes.map(({ id }) => id), ...likes],
});
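One possible direction, sketched under assumptions: in Strapi v4, entityService.update takes the entry id as its second argument and the fields inside a data key, and a many-relation can be set by passing the full array of related ids. Merging the existing like ids with the new one before writing should therefore keep the old values. The addLike helper and the likeId parameter below are illustrative, not from the question:

// Hypothetical helper: append one like to a link without dropping the existing ones.
async function addLike(strapi, linkId, likeId) {
    // Read the current related like ids first.
    const link = await strapi.entityService.findOne('api::link.link', linkId, {
        populate: { likes: { fields: ['id'] } },
    });
    const existingIds = (link.likes || []).map(({ id }) => id);

    // Write back the merged array so previous likes are preserved.
    return strapi.entityService.update('api::link.link', linkId, {
        data: {
            likes: [...new Set([...existingIds, likeId])],
        },
    });
}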

Apollo Client delete Item from cache

Hi, I'm using the Apollo Client with React. I query the posts with many different variables, so the same post ends up in several cached lists. Now I want to delete a post, which means I need to remove that specific post from all of those cached lists.
const client = new ApolloClient({
    link: errorLink.concat(authLink.concat(httpLink)),
    cache: new InMemoryCache()
});
Post query:
export const POSTS = gql`
    query posts(
        $after: String
        $orderBy: PostOrderByInput
        $tags: JSONObject
        $search: String
        $orderByTime: Int
    ) {
        posts(
            after: $after
            orderBy: $orderBy
            tags: $tags
            search: $search
            orderByTime: $orderByTime
        ) {
            id
            title
            ...
        }
    }
`;
I tried it with cache.modify(), which is undefined in my mutation (see https://www.apollographql.com/docs/react/caching/cache-interaction/#cachemodify):
const [deletePost] = useMutation(DELETE_POST, {
    onError: (er) => {
        console.log(er);
    },
    update(cache, data) {
        console.log(cache.modify()) // UNDEFINED!!!
        cache.modify({
            id: cache.identify(thread), // identify is UNDEFINED + what is thread
            fields: {
                posts(existingPosts = []) {
                    return existingPosts.filter(
                        postRef => idToRemove !== readField('id', postRef)
                    );
                }
            }
        })
    }
});
I also tried useApolloClient() with the same result.
Thanks for any help.
Instead of using cache.modify you can use cache.evict, which makes the code much shorter:
deletePost({
    variables: { id },
    update(cache) {
        const normalizedId = cache.identify({ id, __typename: 'Post' });
        cache.evict({ id: normalizedId });
        cache.gc();
    }
});
This option worked for me:
const GET_TASKS = gql`
    query tasks($listID: String!) {
        tasks(listID: $listID) {
            _id
            title
            sort
        }
    }
`;

const REMOVE_TASK = gql`
    mutation removeTask($_id: String) {
        removeTask(_id: $_id) {
            _id
        }
    }
`;

const Tasks = () => {
    const { loading, error, data } = useQuery(GET_TASKS, {
        variables: { listID: '1' },
    });
    const [removeTask] = useMutation(REMOVE_TASK);

    const handleRemoveItem = _id => {
        removeTask({
            variables: { _id },
            update(cache) {
                cache.modify({
                    fields: {
                        tasks(existingTaskRefs, { readField }) {
                            return existingTaskRefs.filter(
                                taskRef => _id !== readField('_id', taskRef),
                            );
                        },
                    },
                });
            },
        });
    };

    return (...);
};
You can pass your updater to the useMutation or to the deletePost. It should be easier with deletePost since it probably knows what it tries to delete:
deletePost({
    variables: { idToRemove },
    update(cache) {
        cache.modify({
            fields: {
                posts(existingPosts = [], { readField }) {
                    return existingPosts.filter(
                        postRef => idToRemove !== readField('id', postRef)
                    );
                },
            },
        });
    },
});
You should change variables to match your mutation. This should work since posts is at top level of your query. With deeper fields you'll need a way to get the id of the parent object. readQuery or a chain of readField from the top might help you with that.
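For the nested case, here is a minimal sketch, assuming a hypothetical parent type Thread whose cached posts field holds the post references (the Thread typename and the threadId variable are illustrative, not from the question):

deletePost({
    variables: { idToRemove },
    update(cache) {
        // Identify the parent object first, then modify its nested field.
        const parentId = cache.identify({ __typename: 'Thread', id: threadId });
        cache.modify({
            id: parentId,
            fields: {
                posts(existingPosts = [], { readField }) {
                    return existingPosts.filter(
                        postRef => idToRemove !== readField('id', postRef)
                    );
                },
            },
        });
        // Optionally drop the Post object itself and collect unreachable refs.
        cache.evict({ id: cache.identify({ __typename: 'Post', id: idToRemove }) });
        cache.gc();
    },
});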

How can I see state within a function using hooks?

I'm trying to update the uploadedFiles state inside my updateFile function while a file is uploading. I'm rewriting this component with hooks, but inside the function the state shows up as empty.
const [uploadedFiles, setUploadedFiles] = useState({
    slides: [],
    material: [],
});

const updateFile = useCallback(
    (id, data) => {
        const value = uploadedFiles.slides.map(uploadedFile => {
            return id === uploadedFile.id
                ? { ...uploadedFile, ...data }
                : uploadedFile;
        });
        console.log('value', value);
        console.log('uploadedFilesOnFunction', uploadedFiles);
    },
    [uploadedFiles]
);

function processUpload(upFile, type) {
    const data = new FormData();
    data.append('file', upFile.file, upFile.name);

    api
        .post('dropbox', data, {
            onUploadProgress: e => {
                const progress = parseInt(Math.round((e.loaded * 100) / e.total), 10);
                updateFile(upFile.id, {
                    progress,
                });
            },
        })
        .then(response => {
            updateFile(upFile.id, {
                uploaded: true,
                id: response.data.id,
                url: response.data.url,
                type,
            });
        })
        .catch(response => {
            updateFile(upFile.id, {
                error: true,
            });
        });
}

function handleUpload(files, type) {
    const uploaded = files.map(file => ({
        file,
        id: uniqueId(),
        name: file.name,
        readableSize: filesize(file.size),
        preview: URL.createObjectURL(file),
        progress: 0,
        uploaded: false,
        error: false,
        url: null,
        type,
    }));

    setUploadedFiles({
        slides: uploadedFiles.slides.concat(uploaded),
    });

    uploaded.forEach(e => processUpload(e, type));
}

console.log('slides', uploadedFiles);
I expected the function to see the current state values, so I could manipulate them and set the state.
There might be other issues, but one thing I've noticed is:
const [uploadedFiles, setUploadedFiles] = useState({
    slides: [],
    material: [],
});

// A setState CALL FROM THE useState HOOK REPLACES THE STATE WITH THE NEW VALUE
setUploadedFiles({
    slides: uploadedFiles.slides.concat(uploaded),
});
From: https://reactjs.org/docs/hooks-state.html
State variables can hold objects and arrays just fine, so you can still group related data together. However, unlike this.setState in a class, updating a state variable always replaces it instead of merging it.
The setter from the useState hook doesn't merge the state, because state can hold any type of value, not only objects like we used to have with classes.
So with that call you're erasing the material property from state every time you update like that.
Instead, you should use the functional form of the setter and access the current state (prevState), like:
setUploadedFiles(prevState => ({
    ...prevState,
    slides: prevState.slides.concat(uploaded),
}));
The updated updateFile function:
const updateFile = (id, data) => {
    setUploadedFiles(prevState => {
        const newSlide = prevState.slides.map(slide => {
            return id === slide.id ? { ...slide, ...data } : slide;
        });
        return {
            ...prevState,
            slides: newSlide,
        };
    });
};
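Following the same idea, handleUpload could also use the functional updater so the material key isn't dropped and no stale uploadedFiles value is read; a minimal sketch based on the question's own code, not part of the original answer:

function handleUpload(files, type) {
    const uploaded = files.map(file => ({
        file,
        id: uniqueId(),
        name: file.name,
        readableSize: filesize(file.size),
        preview: URL.createObjectURL(file),
        progress: 0,
        uploaded: false,
        error: false,
        url: null,
        type,
    }));

    // Functional update: keeps `material` and always reads the latest slides.
    setUploadedFiles(prevState => ({
        ...prevState,
        slides: prevState.slides.concat(uploaded),
    }));

    uploaded.forEach(e => processUpload(e, type));
}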

Meteor React publish merged collections

With Meteor (1.4.2.3) and React, I have a collection Objects with an itemId field that refers to the collection Items.
Currently I subscribe to the collection on the client side with:
export default createContainer(() => {
    let objectsSub = Meteor.subscribe('allObjects');

    var objects = Objects.find({}, {
        transform: function (doc) {
            doc.item = Items.findOne({
                _id: doc.itemId
            });
            return doc;
        }
    }).fetch();

    return {
        objects: objects,
    }
}, App);
This works perfectly, but I think it is more elegant to merge the collections on the server side. However, none of the solutions I found seem to work.
Transform at collection definition
const Objects = new Mongo.Collection('objects', {
    transform: function (doc) {
        doc.item = Items.findOne({
            _id: doc.itemId
        })
    }
});
The console gives:
Error: transform must return object
Transform at publish
if (Meteor.isServer) {
    Meteor.publish('allObjects', function () {
        return Objects.find({}, {
            sort: { startedAt: -1 },
            transform: function (doc) {
                doc.item = Items.findOne({
                    _id: doc.itemId
                });
                return doc;
            }
        });
    });
};
TypeError: Cannot read property 'name' of undefined
Where name is a property of Items
I usually do it in the publish function like this:
Meteor.publish('allObjects', function () {
    let cursor = Objects.find({}, {
        sort: { startedAt: -1 }
    });

    let transformData = (fields) => {
        fields.item = Items.findOne({
            _id: fields.itemId
        });
        return fields;
    };

    let handle = cursor.observeChanges({
        added: (id, fields) => {
            fields = transformData(fields);
            this.added('objects', id, fields);
        },
        changed: (id, fields) => {
            fields = transformData(fields);
            this.changed('objects', id, fields);
        },
        removed: (id) => {
            this.removed('objects', id);
        }
    });

    this.ready();

    this.onStop(() => {
        handle.stop();
    });
});
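With a publication like this, the item field arrives already embedded in the documents of the 'objects' collection, so the client-side container from the question could drop the per-document transform. A minimal sketch under that assumption:

export default createContainer(() => {
    Meteor.subscribe('allObjects');

    // `doc.item` is already attached by the publication above.
    return {
        objects: Objects.find({}, { sort: { startedAt: -1 } }).fetch(),
    };
}, App);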
