I'm new to RxJS (^6.5.3). I'm using it to fetch data from an API for my React app.
I am making two requests, one of which depends on the other.
I don't know what I did wrong, but this is what I get in the console:
// console output
Observable {_isScalar: false, _subscribe: f}
Observable {_isScalar: false, _subscribe: f}
.....
Here's an example of what the API responses look like:
// users endpoint
{
"data": {
"total": 130,
"users": [ // this format is also used as the User interface for typescript type check
{"id": 1, "name": "John Doe", "profile_url": "https://myapi.co/user/id/1"},
{"id": 2, "name": "Johny Doe", "profile_url": "https://myapi.co/user/id/2"}, ...
]
}
}
// user details endpoint
{
"data": {
"info":{"name": "John Doe", "age": 50, "gender": "male", "status": "active", ...}
}
}
Here's my code that fetches data from the API:
// User class
class User{
.....
private getAllUsers(): Observable<User[]> {
return from(fetch(api.getUsers()).then(res => res.json()).then(res => res.data.users))
}
private getUserDetails(url: string): Observable<User> {
return from(fetch(api.getUserDetails(url)).then(res => res.json()).then(res => res.data.info))
}
public getUsers(): Observable<User[]> {
return this.getAllUsers()
.map(users => users.map(user => user.profile_url))
.flatMap(profiles => {
// console.log('flatMap: ', profiles.map(profile => this.getUserDetails(profile)))
return profiles.map(profile =>
  this.getUserDetails(profile)
);
})
.map(users => users);
}
}
// index page
import ...
....
const userClass = new User();
userClass.getUsers()
.subscribe(users => {
console.log('users: ', users);
})
I found a similar issue: Observable dependent on another observable
Update 1: replaced the returned Observable type with Observable<User> or Observable<User[]>
I think there are a few issues here. I recommend replacing the Observable<any> type with the type the observable actually returns, which will really help you find errors. It helps me a lot when chaining observables together. Also, take(1) was my friend when observables weren't completing (but I only needed the first result).
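For example (a tiny hypothetical sketch, not code from your app):
import { of } from 'rxjs';
import { take } from 'rxjs/operators';

// take(1) completes the stream after the first emission, which keeps
// downstream operators from waiting on a source that never completes.
const source$ = of(1, 2, 3);
source$.pipe(take(1)).subscribe(value => console.log(value)); // logs 1, then completes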
That being said, this post has some good discussion on mapping.
flatMap/mergeMap - creates an Observable immediately for any source item, all previous Observables are kept alive
concatMap - waits for the previous Observable to complete before creating the next one
switchMap - for any source item, completes the previous Observable and immediately creates the next one
exhaustMap - source items are ignored while the previous Observable is not completed
You want to wait for the outer observable - getAllUsers - to complete before starting the inner observable - getUserDetails - right? You are likely looking for concatMap or switchMap instead of flatMap.
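As a minimal sketch of what that could look like inside your User class with the RxJS 6 pipeable syntax (I'm also adding forkJoin here, which is not in your code: it waits for every detail request and emits the results as one array, instead of emitting the Observables themselves):
import { forkJoin, Observable } from 'rxjs';
import { map, switchMap } from 'rxjs/operators';

public getUsers(): Observable<User[]> {
  return this.getAllUsers().pipe(
    // first response -> list of profile URLs
    map(users => users.map(user => user.profile_url)),
    // wait for that list, then fire one detail request per URL and
    // emit a single User[] once all of them have completed
    switchMap(profiles =>
      forkJoin(profiles.map(profile => this.getUserDetails(profile)))
    )
  );
}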
I am having difficulty inserting data from an observable into an array. What am I doing wrong here?
app.component.ts
const secondNavList = [];
this.appService.issuerList$.subscribe(iss => {
iss.forEach(value => {
console.log(value) //prints {name: 'A', id:'1'} {name: 'B', id:'2'}
secondNavList.push({
config: {
label: value.name,
id: value.id
},
type: 'button'
});
});
});
console.log(secondNavList) // prints []
//But I want
//(2) [{...}, {...}]
appService.ts
get issuerList$(): Observable<Issuer[]>{
return this._issuerList.asObservable();
}
getIssuerList(){
const url = DBUrl
this.httpService.getData(url).subscribe((data:any[]) => {
let issuerList = [];
data.forEach(x=>{
issuerList.push(<Issuer>{name: x.issuerName, id: x.issuerId.toString()});
});
this._issuerList.next(issuerList)
})
}
My secondNavList does contain data, but I can't access it.
The fundamental issue you have is that you're trying to display the value of secondNavList before it is actually set in the subscriber. The rxjs streams are asynchronous, which implies that the callback inside the subscribe method that appends to the list will get executed at some unknown point after subscribe is executed.
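For illustration (a hypothetical trace, not your exact code), the log placed after subscribe runs first, while the one inside the callback only runs whenever the data actually arrives:
const secondNavList = [];
this.appService.issuerList$.subscribe(iss => {
  // (2) runs later, whenever the observable actually emits a value
  iss.forEach(value => secondNavList.push(value));
  console.log('inside subscribe:', secondNavList.length);
});
// (1) runs immediately, before any data has arrived
console.log('after subscribe:', secondNavList.length); // 0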
More importantly, I'd recommend that you try to take advantage of the map operator and the Array.map method, as well as the async pipe.
appService.ts
readonly issueUpdateSubject = new Subject<string>();
readonly issuerList$ = this.issueUpdateSubject.pipe(
switchMap(url => this.httpService.getData(url)),
map((data: any[]) => data.map(x => ({ name: x.issuerName, id: x.issuerId.toString() }))),
shareReplay(1)
);
getIssuerList() {
this.issueUpdateSubject.next(DBUrl);
}
app.component.ts
readonly secondNavList$ = this.appService.issuerList$.pipe(
map(iss => iss.map(value => ({
config: { label: value.name, id: value.id },
type: 'button'
})))
);
In the appService, instead of having an observable update a subject, I just had a subject emit update requests. Then instead of having to convert the subject to an observable, it just is an observable.
The shareReplay operator will share the most recently emitted list to any new subscribers.
Instead of appending to new arrays, I just use the array.map method to map each array element to the new desired object.
Instead of creating new array outside of the observable, and setting them in subscribe, I use the map operator to stream the latest instances of the arrays.
I find that the more comfortable I get with rxjs, the less I actually assign stream values to variables, and I rarely call subscribe - I just connect more and more streams, and their values are used in components via async pipes. It's hard to get your head around at first (or even after a year of using rxjs), but it's worth it in the end.
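As a hypothetical sketch of that last point (the component metadata and the AppService import path are made up), the template reads the stream through the async pipe and the class never calls subscribe:
import { Component } from '@angular/core';
import { map } from 'rxjs/operators';
import { AppService } from './app.service';

@Component({
  selector: 'app-root',
  template: `
    <button *ngFor="let item of secondNavList$ | async">
      {{ item.config.label }}
    </button>
  `,
})
export class AppComponent {
  // The async pipe subscribes for us and unsubscribes when the view is destroyed.
  readonly secondNavList$ = this.appService.issuerList$.pipe(
    map(iss => iss.map(value => ({
      config: { label: value.name, id: value.id },
      type: 'button'
    })))
  );

  constructor(private readonly appService: AppService) {}
}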
The error is because the observable value is an array of objects, and you are trying to add it into a simple object.
Try this.
const secondNavList = [];
this.appService.issuerList$.subscribe(iss => {
iss.forEach(value => {
console.log(value) //prints {name: 'A', id:'1'} {name: 'B', id:'2'}
value.forEach(v => {
secondNavList.push({
config: {
label: v.name,
id: v.id
},
type: 'button'
});
});
});
});
console.log(secondNavList) // prints []
I am currently trying to loop through an array of objects (each object is a task), where each task contains relevant information such as a name and date. Information from each task is then used to build an object of arrays keyed by date, where each array contains the objects that correspond to that date.
My current code is as follows:
contextTasks.forEach((taskItem) => {
taskItem["taskSchedule"].forEach((dateItem) => {
setItems((items) => ({
...items,
[dateItem["date"]]: [
{
name: taskItem["taskName"],
time: new Date(dateItem["time"]).toLocaleTimeString([], {
hour: "2-digit",
minute: "2-digit",
}),
type: "Task",
},
],
}));
});
});
However, if there are multiple tasks with the same date, they will override each other and I only end up with one task per date. How would I go about pushing further objects to the array if there are other entries for that specific date?
Finished object:
Object {
"2021-04-21": Array [
Object {
"name": "Test Class v1",
"type": "Class",
},
],
"2021-04-24": Array [
Object {
"name": "Test Task v2",
"type": "Task",
},
//I would like to add another object here without overriding existing contents of the array
],
}
Have you tried using reduce?
The idea is to have something like this inside your accumulator:
{"date1": [{val}, {val}, ...] , "date2": [{val}, {val}, ...]}
array.reduce((acc, val) => {
  // test if your accumulator already has an array for this date
  if (acc[val.date]) {
    acc[val.date] = [...acc[val.date], val];
  } else {
    // no date found in the accumulator, so start a new array for it
    acc[val.date] = [val];
  }
  return acc;
}, {})
Sorry if the code is not perfect, but if you provide your initial array of data I will fix the code in this answer.
The cause of your issue is the fact that you're executing an asynchronous state update inside a synchronous loop. Also, modifying state forces a re-render, and you're attempting to do that presumably many times at once, which can and will cause a bottleneck at some point.
The solution: build your new state first, and execute a setState once.
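A rough sketch of that approach, reusing the contextTasks / taskSchedule names from your code (grouping with reduce is just one way to do it):
// Build the whole { [date]: entries[] } object first...
const grouped = contextTasks.reduce((acc, taskItem) => {
  taskItem.taskSchedule.forEach((dateItem) => {
    const entry = {
      name: taskItem.taskName,
      time: new Date(dateItem.time).toLocaleTimeString([], {
        hour: "2-digit",
        minute: "2-digit",
      }),
      type: "Task",
    };
    // ...appending to the array for a date instead of replacing it...
    acc[dateItem.date] = [...(acc[dateItem.date] ?? []), entry];
  });
  return acc;
}, {});

// ...then update state (and trigger a re-render) exactly once,
// merging the new entries into whatever was already stored per date.
setItems((items) => {
  const next = { ...items };
  for (const [date, entries] of Object.entries(grouped)) {
    next[date] = [...(next[date] ?? []), ...entries];
  }
  return next;
});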
I followed this video on the best practices for creating flat databases with firestore: Converting SQL structures to Firebase structures
I came up with something that looks like this:
const firestore = {
events: { // Collection
eventID: { // Doc
description: "Event Description", // Field
title: "Event Title", // Field
}
},
eventComments: { // Collection
eventID: { // Doc
comments: { // Field
commentID1: true, // Value
commentID2: true, // Value
commentID3: true, // Value
}
}
},
comments: { // Collection
commentID1: { // Doc
createdAt: "Timestamp", // Field
createdBy: "uid", // Field
content: "Comment Body" // Field
},
commentID2: {...},
commentID3: {...},
},
};
I'm not sure what the best way to get the related data is, however.
I'm using react and react-redux-firestore to access the data. My current setup for the app looks like this:
<EventsDetailPage>
<Comments>
<Comment />
<Comment />
<Comment />
</Comments>
</EventsDetailPage>
I've come up with two potential methods...
Method 1
I have useFirestoreConnect in each component. The top level gets the event and passes the eventID to the comments component; the comments component uses the eventID to get the eventComments list and passes the individual commentID for each comment to the comment component; finally, the individual comment component uses the commentID to get the relevant comment data.
My issue with this: Wouldn't this mean that there is a listener for the event, comment list, and every individual comment? Is that frowned upon?
EX: This would be in the event, comments, and comment components, but each with its respective values
useFirestoreConnect(() => [
{collection: 'events', doc: eventID},
]);
const event = useSelector(({firestore: {data}}) => data.events && data.events[eventID]);
Method 2
Let's say I have a list of events; I can do a query to get the list:
useFirestoreConnect(() => [{
collection: 'events',
orderBy: ["createdAt", "desc"],
limitTo: 10
}]);
const events = useSelector(({ firestore: { ordered } }) => ordered.events);
This is great because I believe it's one listener, but if any of the data in any of the events changes, the listener will still respond to the changes.
My issue with this: I don't know how to do a where clause that would return all events for a given list of IDs.
So, say I wanted to get a list of events with where: ['id', '==', ['eventID1', 'eventID2', 'eventID3']]
To retrieve up to 10 items by their ID, you can use an in query:
.where('id', 'in', ['eventID1', 'eventID2', 'eventID3'])
If you have more than 10 IDs, you'll have to run multiple of these queries.
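A rough sketch of that chunking, assuming the plain Firestore web SDK (v8 style) and an id field on each event document as in the query above (getEventsByIds is a made-up helper name):
import firebase from 'firebase/app';
import 'firebase/firestore';

async function getEventsByIds(ids: string[]) {
  const db = firebase.firestore();
  const results: firebase.firestore.DocumentData[] = [];

  // 'in' accepts at most 10 values, so run one query per chunk of 10 ids.
  for (let i = 0; i < ids.length; i += 10) {
    const chunk = ids.slice(i, i + 10);
    const snapshot = await db.collection('events').where('id', 'in', chunk).get();
    snapshot.forEach(doc => results.push(doc.data()));
  }

  return results;
}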
Example store:
{
todos: {
byId: {
"1": { id: "1", title: "foo" },
"2": { id: "2", title: "bar" }
},
allIds: ["2", "1"] // ordered by `title` property
}
}
Now the user wants to add a new Todo Entry:
dispatch({
type: 'ADD_TODO_REQUEST',
payload: { title: "baz" }
})
This triggers some API request: POST /todos. The state of the request is pending as long as there's no response (success or error). This also means that I have no id yet for the newly created Todo Entry.
Now I already want to add it to the store (and display it). But of course I can't add it to byId and allIds, because it has no id yet.
Question 1: How should I change the layout of my store to make this possible?
After the response arrives, there are two possibilities:
success: Update the store and set the id property of the new Todo Entry. Using dispatch({type:'ADD_TODO_SUCCESS', payload: response.id}).
error: Remove the new Todo Entry from the store. Using dispatch({type:'ADD_TODO_ERROR', payload: ???})
Now the reducer for those two actions has to somehow find the corresponding element in the store. But it has no identifier.
Question 2: How do I find the item in the store if it has no id?
Additional information:
I'm using react with redux-saga
It should be possible to have multiple concurrent ADD_TODO_REQUESTs running at the same time, so it must be possible to have multiple pending Todo Entries within the store. (For example, if the network connection is really slow and the user enters "title1" and hits the "add" button, then "title2" and "add", then "title3" and "add".) Disabling the AddTodo component while a request is pending is not an option.
How do you solve these kind of problems within your applications?
EDIT: There's even more:
The same functionality should be available for "updating" and "deleting" Todo Entries:
When the user edits a Todo Entry and then hits the "save" button, the item should be in the pending state, too, until the response arrives. If it's an error, the old version of the data must be put back into the store (without requesting it from the server).
When the user clicks "delete", then the item will disappear immediately. But if the server response is an error, then the item should be put back into the list.
Both actions should restore the previous data if there's an error response.
I found a simple solution. But I'm sure that there are other possibilities and even better solutions.
Keep the Todo Entries in 2 separate collections:
{
todos: {
byId: {
"1": { id: "1", title: "foo" },
"2": { id: "2", title: "bar" }
},
allIds: ["2", "1"],
pendingItems: [
{ title: "baz" },
{ title: "42" }
]
}
}
Now I can find them in the store "by reference".
// handle 'ADD_TODO_REQUEST':
const newTodoEntry = { title: action.payload.title };
yield put({ type: 'ADD_TODO_PENDING', payload: newTodoEntry });
try {
const response = yield api.addTodoEntry(newTodoEntry);
yield put({ type: 'ADD_TODO_SUCCESS', payload: { id: response.id, ref: newTodoEntry } });
} catch(error) {
yield put({ type: 'ADD_TODO_ERROR', payload: newTodoEntry });
}
The reducer will look like this:
case 'ADD_TODO_PENDING':
  return {
    ...state,
    pendingItems: [...state.pendingItems, action.payload]
  };
case 'ADD_TODO_SUCCESS': {
  const newTodoEntry = { ...action.payload.ref, id: action.payload.id };
  return {
    ...state,
    byId: { ...state.byId, [newTodoEntry.id]: newTodoEntry },
    allIds: [...state.allIds, newTodoEntry.id],
    pendingItems: state.pendingItems.filter(item => item !== action.payload.ref)
  };
}
case 'ADD_TODO_ERROR':
  return {
    ...state,
    pendingItems: state.pendingItems.filter(item => item !== action.payload)
  };
There are 2 problems:
The reducer must use the object reference. The reducer is not allowed to create its own object from the action payload of ADD_TODO_PENDING.
The Todo Entries cannot be sorted easily within the store, because there are two distinct collections.
There are 2 workarounds:
1. Use client-side generated uuids which only exist while the items are in the pending state. This way, the client can easily keep track of everything (see the sketch below).
2. Either:
   a) Add some kind of insertAtIndex property to the pending items. Then the React component code can merge the two collections and display the mixed data in a custom order.
   b) Just keep the items separate, for example the list of pending items on top and, below it, the list of items already persisted in the server database.
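A hypothetical sketch of workaround 1 (uuid() stands for any client-side id generator; the tempId never leaves the client):
import { put } from 'redux-saga/effects';

function* addTodoSaga(action) {
  const tempId = uuid(); // client-side only, never sent to the server
  const newTodoEntry = { tempId, title: action.payload.title };
  yield put({ type: 'ADD_TODO_PENDING', payload: newTodoEntry });
  try {
    const response = yield api.addTodoEntry({ title: newTodoEntry.title });
    yield put({ type: 'ADD_TODO_SUCCESS', payload: { id: response.id, tempId } });
  } catch (error) {
    yield put({ type: 'ADD_TODO_ERROR', payload: { tempId } });
  }
}

// In the reducer the pending entry is now found by tempId instead of by
// object identity; these cases slot into the switch from above.
case 'ADD_TODO_SUCCESS': {
  const pending = state.pendingItems.find(item => item.tempId === action.payload.tempId);
  const newTodoEntry = { id: action.payload.id, title: pending.title };
  return {
    ...state,
    byId: { ...state.byId, [newTodoEntry.id]: newTodoEntry },
    allIds: [...state.allIds, newTodoEntry.id],
    pendingItems: state.pendingItems.filter(item => item.tempId !== action.payload.tempId)
  };
}
case 'ADD_TODO_ERROR':
  return {
    ...state,
    pendingItems: state.pendingItems.filter(item => item.tempId !== action.payload.tempId)
  };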
I wrote a really simple schema using GraphQL, but somehow all the IDs in the edges are the same.
{
"data": {
"imageList": {
"id": "SW1hZ2VMaXN0Og==",
"images": {
"edges": [
{
"node": {
"id": "SW1hZ2U6",
"url": "1.jpg"
}
},
{
"node": {
"id": "SW1hZ2U6",
"url": "2.jpg"
}
},
{
"node": {
"id": "SW1hZ2U6",
"url": "3.jpg"
}
}
]
}
}
}
}
I posted the specific details on GitHub; here's the link.
So, globalIdField expects your object to have a field named 'id'. It then takes the string you pass to globalIdField, adds a ':' and your object's id, and base64-encodes the result to create its globally unique id.
If your object doesn't have a field called exactly 'id', then it won't append anything, and every global id will just be the string you pass in plus ':'. So they won't be unique; they will all be the same.
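You can see exactly that in the IDs from your output (btoa/atob are just the browser's base64 helpers):
btoa('Image:1'); // "SW1hZ2U6MQ==" <- what a unique global id would look like
btoa('Image:');  // "SW1hZ2U6"     <- the value every node in your result has,
                 //                   because the id part is missing
atob('SW1hZ2U6'); // "Image:"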
There is a second parameter you can pass to globalIdField, which is a function that receives your object and returns an id for globalIdField to use. So let's say your object's id field is actually called '_id' (thanks Mongo!). You would call globalIdField like so:
id: globalIdField('Image', image => image._id)
There you go. Unique IDs for Relay to enjoy.
Here is the link to the relevant source-code in graphql-relay-js: https://github.com/graphql/graphql-relay-js/blob/master/src/node/node.js#L110
Paste the following code in the browser console:
atob('SW1hZ2U6')
You will find that the decoded value of the id is "Image:".
It means the id property of every record fetched by (new MyImages()).getAll()
is null.
Return unique ids, or I suggest you define images as a GraphQLList:
var ImageListType = new GraphQL.GraphQLObjectType({
name: 'ImageList',
description: 'A list of images',
fields: () => ({
id: Relay.globalIdField('ImageList'),
images: {
type: new GraphQLList(ImageType),
description: 'A collection of images',
args: Relay.connectionArgs,
resolve: (_, args) => Relay.connectionFromPromisedArray(
(new MyImages()).getAll(),
args
),
},
}),
interfaces: [nodeDefinition.nodeInterface],
});
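One thing to watch out for if you go the GraphQLList route: connectionFromPromisedArray builds a Relay connection ({ edges, pageInfo }), which doesn't match a plain list type. A sketch of the images field in that case (reusing the names above) would resolve the promised array directly:
images: {
  type: new GraphQLList(ImageType),
  description: 'A collection of images',
  // No connection args for a plain list; just return the promised array of images.
  resolve: () => (new MyImages()).getAll(),
},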