I want to mutate a path of an array of objects.
The object should change on click, and the markup looks like this:
<iron-icon id="id" icon="icons:arrow-downward" on-click="_sortTags"
class$="arrow [[sortData.id.icon]] [[sortData.id.state]]"></iron-icon>
Here I want to mutate the sortData object. The function below is triggered by a click on the icon above:
_changeSortData(field, order, iconShape, status) { // another function calls this one; omitted here to keep the issue simple
  this.set('sortData[field].sort', order);
  this.set('sortData[field].icon', iconShape);
  this.set('sortData[field].state', status);
}
The sortData property is declared like this:
sortData: {
  type: Object,
  value: function () {
    return {
      "id": {
        "icon": "downward",
        "sort": "default",
        "state": "inactive"
      },
      "date": {
        "icon": "downward",
        "sort": "default",
        "state": "inactive"
      }
    };
  },
},
Now, is it possible to escape the single quotes here so that [field] is applied as a parameter?
this.set('sortData[field].sort', order);
There are two fields in the sortData object (id and date).
In this.set(path, value), path can be specified as a string or an Array. Since you have a dynamic path part, you would use an Array path like this:
this.set(['sortData', field, 'sort'], order); // `field` is dynamic
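To see why the array form works, here is a plain-JavaScript sketch of what a dynamic path update does. Note that setPath is a hypothetical helper for illustration only, not Polymer API: Polymer's this.set() additionally notifies bound templates of the change.

```javascript
// Hypothetical helper showing how an array path like
// ['sortData', field, 'sort'] resolves to a nested assignment.
function setPath(root, path, value) {
  const last = path[path.length - 1];
  // walk down to the object that holds the final key
  const parent = path.slice(0, -1).reduce((obj, key) => obj[key], root);
  parent[last] = value;
}

const element = {
  sortData: {
    id:   { icon: "downward", sort: "default", state: "inactive" },
    date: { icon: "downward", sort: "default", state: "inactive" },
  },
};

const field = "date"; // the dynamic part of the path
setPath(element, ["sortData", field, "sort"], "asc");
```

With a string path like 'sortData[field].sort', Polymer would look for a key literally named "field", which is why the array form is needed for dynamic segments.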
I'm trying to filter the elements of this JSON array to return only the first element it finds.
{
  "elements": [{
    "urn": "urn:li:lyndaCourse:189800",
    "details": {
      "classifications": [{
          "associatedClassification": {
            "urn": "urn:li:lyndaCategory:9331",
            "type": "LIBRARY"
          }
        },
        {
          "associatedClassification": {
            "urn": "urn:li:lyndaCategory:8982",
            "type": "SUBJECT"
          }
        },
        {
          "associatedClassification": {
            "urn": "urn:li:lyndaCategory:8920",
            "type": "LIBRARY"
          }
        }
      ]
    }
  }]
}
But this results in an EMPTY array [].
I tried this JSONPath query at https://jsonpath.herokuapp.com/:
$.elements[0].details.classifications..associatedClassification[?(@.type=='LIBRARY')][0]
Expecting to get:
[{
  "urn": "urn:li:lyndaCategory:9331",
  "type": "LIBRARY"
}]
Another way to filter the information is by filtering on the "classifications" property (without using ".."), and using "associatedClassification.type" in your filter, so you should have something like this:
$.elements[0].details.classifications[?(@.associatedClassification.type=='LIBRARY')]
With the above JSONPath you will get all items whose type is "LIBRARY" (in your example it will return 2 items).
You mentioned you need only the first of the filtered items; as far as I investigated, it seems there's no possible way to return only the first item using JSONPath alone (see the thread below):
https://github.com/json-path/JsonPath/issues/272
The following works on all elements:
$..classifications[?(@.associatedClassification.type=='LIBRARY')]
and returns a list of all matching associatedClassification objects.
A JSONPath is expected to point inside the original document. Therefore, getting the first element from the result list requires post-processing, e.g. a separate JSONPath applied to the result:
$[0]
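Since post-processing is needed anyway, a plain-JavaScript alternative is to skip JSONPath entirely and take the first match with Array.prototype.find. A sketch against the sample document from the question:

```javascript
const doc = {
  elements: [{
    urn: "urn:li:lyndaCourse:189800",
    details: {
      classifications: [
        { associatedClassification: { urn: "urn:li:lyndaCategory:9331", type: "LIBRARY" } },
        { associatedClassification: { urn: "urn:li:lyndaCategory:8982", type: "SUBJECT" } },
        { associatedClassification: { urn: "urn:li:lyndaCategory:8920", type: "LIBRARY" } },
      ],
    },
  }],
};

// find() stops at the first match, unlike filter(), which collects all of them
const first = doc.elements[0].details.classifications
  .find(c => c.associatedClassification.type === "LIBRARY");

console.log(first.associatedClassification);
// { urn: 'urn:li:lyndaCategory:9331', type: 'LIBRARY' }
```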
I am currently trying to loop through an array of objects (each object is a task), where each task contains relevant information such as a name and a date. Information from each task is then used to build an object of arrays keyed by date, where each array contains the objects that fall on that date.
My current code is as follows:
contextTasks.forEach((taskItem) => {
  taskItem["taskSchedule"].forEach((dateItem) => {
    setItems((items) => ({
      ...items,
      [dateItem["date"]]: [
        {
          name: taskItem["taskName"],
          time: new Date(dateItem["time"]).toLocaleTimeString([], {
            hour: "2-digit",
            minute: "2-digit",
          }),
          type: "Task",
        },
      ],
    }));
  });
});
However, if there are multiple tasks with the same date, they override each other and I only end up with one task per date. How would I go about pushing further objects onto the array when there are other entries for that specific date?
Finished object:
Object {
  "2021-04-21": Array [
    Object {
      "name": "Test Class v1",
      "type": "Class",
    },
  ],
  "2021-04-24": Array [
    Object {
      "name": "Test Task v2",
      "type": "Task",
    },
    // I would like to add another object here without overriding existing contents of the array
  ],
}
Have you tried using reduce?
The idea is to end up with something like this in your accumulator:
{"date1": [{val}, {val}, ...], "date2": [{val}, {val}, ...]}
array.reduce((acc, val) => {
  // test whether your accumulator already has an entry for this date
  if (acc[val.date]) {
    acc[val.date] = [...acc[val.date], val];
  } else {
    // no entry for this date yet, so start a new array
    acc[val.date] = [val];
  }
  return acc; // the accumulator must be returned on every iteration
}, {})
Sorry if the code is not perfect, but if you provide your initial array of data I will fix the response code.
The cause of your issue is that you're executing an asynchronous state update inside a synchronous loop. Also, modifying state forces a re-render, and you're attempting to do it many times at once, which can and will cause a bottleneck at some point.
The solution: build your new state first, then call setState once.
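For example, the two nested loops can be collapsed into a single pure build step followed by one state update. A sketch (the toLocaleTimeString formatting is dropped here for brevity):

```javascript
// Build the date-keyed object in one pass, then update state once.
function buildItems(tasks) {
  const items = {};
  for (const task of tasks) {
    for (const dateItem of task.taskSchedule) {
      const entry = { name: task.taskName, type: "Task" };
      // push onto the existing array for this date, or start a new one
      (items[dateItem.date] = items[dateItem.date] || []).push(entry);
    }
  }
  return items;
}

const contextTasks = [
  { taskName: "Test Task v1", taskSchedule: [{ date: "2021-04-24" }] },
  { taskName: "Test Task v2", taskSchedule: [{ date: "2021-04-24" }] },
];

const built = buildItems(contextTasks);
// In the component, a single update:
// setItems((items) => ({ ...items, ...built }));
```

Because buildItems is pure, tasks sharing a date accumulate in the same array instead of overwriting each other, and React re-renders only once.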
I'm new to hooks and I would like to add or delete an item in my array. I don't know how to do it because this array is an attribute of my hook state.
const [questionResponses, setResponses] = useState(null);
I tried this fix, but the syntax doesn't work:
setResponses( questionResponses[idQuestion].responses => [...questionResponses[idQuestion].responses, {
  response_text: itemValue,
  response_type: type,
}]);
I tried to use concat(), but it freezes when the responses array is not empty:
setResponses({
  ...questionResponses[idQuestion].responses,
  [idQuestion]: questionResponses[idQuestion]['responses'].concat([{
    response_text: itemValue,
    response_type: type,
  }])
});
My array has this structure:
[
  {
    "question_id": 1,
    "question_text": "Best time of day",
    "responses": [
      {
        "response_id": 33,
        "response_text": "Morning",
        "response_type": "radio"
      }
    ]
  },
  {
    "question_id": 2,
    "question_text": "I heard about Marie-France Group via",
    "responses": []
  },
  ...
]
Could you help me, please? I don't know how to do it.
Please see the code I added: https://codesandbox.io/s/mystifying-liskov-quv5m.
Use the spread operator to add a value to the existing array.
function addResponse() {
  setResponses([...responses, { name: response }]);
}
For removal, filter the existing array to drop the intended item.
function deleteResponse(itemIndex) {
  const newResponses = responses.filter((item, index) => index !== itemIndex);
  setResponses(newResponses);
}
Check the codesandbox link for the complete code.
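Since the state in the question is an array of question objects rather than a flat array, adding a response for a given question index can be done immutably with map. A sketch using the structure from the question (addResponseTo is an illustrative helper name, not from the codesandbox):

```javascript
// Pure helper: returns a new array in which only the targeted
// question gets a new responses array; all other entries keep
// their original references.
function addResponseTo(questions, idQuestion, newResponse) {
  return questions.map((q, i) =>
    i === idQuestion ? { ...q, responses: [...q.responses, newResponse] } : q
  );
}

const questionResponses = [
  { question_id: 1, question_text: "Best time of day", responses: [] },
  { question_id: 2, question_text: "I heard about Marie-France Group via", responses: [] },
];

const next = addResponseTo(questionResponses, 1, {
  response_text: "Morning",
  response_type: "radio",
});
// In the component:
// setResponses((prev) => addResponseTo(prev, idQuestion, newResponse));
```

The functional updater form of setResponses avoids reading stale state, which is the usual cause of updates appearing to "freeze" or overwrite each other.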
Until recently, I've always just used lodash's cloneDeep to make a copy of my state, then change values and return the cloned state. For example like this:
This would be my initial state:
{
  "id": 1213,
  "title": "Some title...",
  "pages": {
    "page1": {
      "id": 459,
      "title": "Some Page title...",
      "fields": {
        "field_1": {
          "title": "My field",
          "type": "text",
          "value": "my text value..."
        },
        "field_2": {
          "title": "My field 2",
          "type": "text",
          "value": "my text two value..."
        },
        "field_3": {
          "title": "My field 3",
          "type": "text",
          "value": "my text value..."
        }
      }
    }
  }
}
Now, I want to update the value of field_2.
My redux reducer would look like this:
import cloneDeep from 'lodash/fp/cloneDeep';
export default function reducer(state, action) {
  const { type, payload } = action;
  switch (type) {
    case 'UPDATE_FIELD_VALUE': {
      const { pageIdent, fieldIdent, newValue } = payload;
      // This is what I'm doing right now....
      const newState = cloneDeep(state);
      newState.pages[pageIdent].fields[fieldIdent].value = newValue;
      return newState;
      // Instead could I do this?
      return {
        ...state,
        state.pages[pageIdent].fields[fieldIdent].value = newValue;
      }
    }
  }
}
So, I've read that I don't always have to do a deep clone... but elsewhere I've read that you cannot return the same object; you have to return new objects at all times. So what is the right way to do this?
Yeah, don't do that. Quoting the Redux FAQ on whether you should deep-clone state:
Immutably updating state generally means making shallow copies, not deep copies. Shallow copies are much faster than deep copies, because fewer objects and fields have to be copied, and it effectively comes down to moving some pointers around.
However, you do need to create a copied and updated object for each level of nesting that is affected. Although that shouldn't be particularly expensive, it's another good reason why you should keep your state normalized and shallow if possible.
Common Redux misconception: you need to deeply clone the state. Reality: if something inside doesn't change, keep its reference the same!
So, you don't want "deep clones", you need "nested shallow clones".
Deep-cloning is bad for performance in two ways: it takes more work to clone everything, and the new object references will cause UI updates for data that didn't actually change in value (but the new references make the UI think that something changed).
You should read the Redux docs page on "Immutable Update Patterns". Here's the nested state update example from that page:
function updateVeryNestedField(state, action) {
  return {
    ...state,
    first: {
      ...state.first,
      second: {
        ...state.first.second,
        [action.someId]: {
          ...state.first.second[action.someId],
          fourth: action.someValue
        }
      }
    }
  };
}
If you find that to be too tedious or painful, you should either change how you structure your state so it's flatter, or you can use one of the many immutable update utility libraries out there to handle the update process for you.
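If you want to see what those utility libraries do under the hood, the pattern generalizes to a small helper that shallow-copies only the objects along the updated path. This is a hand-rolled sketch for object-shaped state (libraries like Immer handle arrays, freezing, and edge cases far more robustly):

```javascript
// setIn: returns a new state in which only the objects along `path`
// are shallow-copied; every branch not on the path keeps its
// original reference, so connected components bound to untouched
// branches will not re-render.
function setIn(state, path, value) {
  if (path.length === 0) return value;
  const [key, ...rest] = path;
  return { ...state, [key]: setIn(state[key], rest, value) };
}

const state = {
  pages: {
    page1: {
      fields: {
        field_1: { title: "My field", value: "old" },
        field_2: { title: "My field 2", value: "old" },
      },
    },
  },
};

const next = setIn(state, ["pages", "page1", "fields", "field_2", "value"], "new");
// next.pages.page1.fields.field_2 is a new object,
// while next.pages.page1.fields.field_1 is the same reference as before.
```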
You really shouldn't always do a clone of the state object. Redux shines if you can ensure that:
The only section of state that changes is the section that should change, based on the action provided to the reducer. So if you're updating field_1, nothing about field_2 should change.
If there's a section of state that should change, it always changes. So if your action updates field_2, then the field_2 object reference should change.
This is much easier if your state avoids the need for 'deep' updates. State normalization is one of the better 'patterns' used in Redux apps and is described in the docs.
For instance, let's restructure that state of yours a bit (assuming the top level is a book object, and the field IDs are globally unique):
Initial state
"books" : {
"1213": {
"id": 1213,
"title": "Some title...",
"pages: [..., "page1", ...],
}
},
"pages": {
"page1": {
"id": 459,
"title": "Some Page title...",
"fields": [..., "field_1", "field_2", "field_3", ...],
}
},
"fields": {
"field_1": {
"title": "My field",
"type": "text",
"value": "my text value..."
},
"field_2": {
"title": "My field",
"type": "text",
"value": "my text value..."
},
"field_3": {
"title": "My field",
"type": "text",
"value": "my text value..."
}
}
Note that each 'book' entity has a list of page IDs rather than full page objects nested in it. Similarly, each page has a list of field IDs rather than the actual fields. This way all your data-carrying entities are stored at the 'top level' of state and can be independently updated without touching the entire state.
By flattening your state structure, you can create 'sub-reducers' that are only concerned with a small slice of your state. In the above case, I'd have:
import { combineReducers } from 'redux';

// This reducer handles all actions that affect book entities
const books = (state = {}, action = {}) => state;

// This reducer handles all actions that affect page entities
const pages = (state = {}, action = {}) => state;

// This reducer handles all actions that affect field entities.
// For your problem, this would look like:
const fields = (state = initialFields, action = {}) => {
  switch (action.type) {
    case 'UPDATE_FIELD_VALUE': {
      const { fieldIdent, newValue } = action.payload;
      return {
        ...state,
        [fieldIdent]: {
          ...state[fieldIdent],
          value: newValue,
        }
      };
    }
    default:
      return state;
  }
}

// This results in the state structure above
const reducer = combineReducers({
  books,
  pages,
  fields,
});
In the above code, changing a field has no effect on the page or book entities, which prevents unnecessary re-renders. That being said, changing the field_2 value will definitely result in a new field_2 object, and re-render as necessary.
There are libraries that help with structuring your state like this, from JSON API response data. This is a rather good one: https://github.com/paularmstrong/normalizr
Hope this helps!
I think Facebook's Immutable.js solves all your problems. Read the docs here.
I wrote a really simple schema using GraphQL, but somehow all the IDs in the edges are the same.
{
  "data": {
    "imageList": {
      "id": "SW1hZ2VMaXN0Og==",
      "images": {
        "edges": [
          {
            "node": {
              "id": "SW1hZ2U6",
              "url": "1.jpg"
            }
          },
          {
            "node": {
              "id": "SW1hZ2U6",
              "url": "2.jpg"
            }
          },
          {
            "node": {
              "id": "SW1hZ2U6",
              "url": "3.jpg"
            }
          }
        ]
      }
    }
  }
}
I posted the specific details on GitHub; here's the link.
So, globalIdField expects your object to have a field named 'id'. It then takes the string you pass to globalIdField and appends ':' plus your object's id to create its globally unique id.
If your object doesn't have a field called exactly 'id', then it won't append it, and all your globalIdFields will just be the string you pass in plus ':'. So they won't be unique; they will all be the same.
There is a second parameter you can pass to globalIdField: a function that receives your object and returns an id for globalIdField to use. So let's say your object's id field is actually called '_id' (thanks, Mongo!). You would call globalIdField like so:
id: globalIdField('Image', image => image._id)
There you go. Unique IDs for Relay to enjoy.
Here is the link to the relevant source-code in graphql-relay-js: https://github.com/graphql/graphql-relay-js/blob/master/src/node/node.js#L110
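As an illustration of the id format, a global id is just the base64 encoding of "Type:id". A plain Node sketch, loosely mirroring graphql-relay's toGlobalId/fromGlobalId (these one-liners are not the library itself):

```javascript
// A Relay global id is base64("Type:id"). When the id field is
// missing, every record encodes base64("Image:"), which is why
// all the edges share the same "SW1hZ2U6".
const toGlobalId = (type, id) => Buffer.from(`${type}:${id ?? ""}`).toString("base64");
const fromGlobalId = (globalId) => Buffer.from(globalId, "base64").toString("utf8");

console.log(toGlobalId("Image", undefined)); // "SW1hZ2U6"
console.log(fromGlobalId("SW1hZ2U6"));       // "Image:"
console.log(toGlobalId("Image", 189800));    // unique per record once an id is present
```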
Paste the following code into your browser console:
atob('SW1hZ2U6')
You will find that the value of id is "Image:".
This means the id property of every record fetched by (new MyImages()).getAll() is null.
Return unique ids, or, as I would suggest, define images as a GraphQLList:
var ImageListType = new GraphQL.GraphQLObjectType({
  name: 'ImageList',
  description: 'A list of images',
  fields: () => ({
    id: Relay.globalIdField('ImageList'),
    images: {
      type: new GraphQLList(ImageType),
      description: 'A collection of images',
      args: Relay.connectionArgs,
      resolve: (_, args) => Relay.connectionFromPromisedArray(
        (new MyImages()).getAll(),
        args
      ),
    },
  }),
  interfaces: [nodeDefinition.nodeInterface],
});