Firestore: update a single item in an array of objects

I have a document in Firebase Firestore like the one in the picture above. I want to modify a single element in the "order" array. I know there is no direct way to do it, but is there any way?
I tried the code below:
let ww = db.collection("collectionName").document("docName")
let arrayElement = 0
ww.updateData([
    "order.\(arrayElement).isHasOffer": true,
    "order.\(arrayElement).referenceID": self.referenceID,
    "order.\(arrayElement).DosageForm": "dd",
    "order.\(arrayElement).GenericName": "test",
    "order.\(arrayElement).DISC": 33
]) { err in
    if let err = err {
        print("Error updating document: \(err)")
    } else {
        print("Document successfully updated")
    }
}
It overrides the "order" array (removes all the elements and adds only what I pass in the code). What if there are 100 elements? Should I rewrite them all again?
It also converts the field from the "Array" type to a "map", and once it's converted to a "map" I'm not able to read it back as an array.
Note: I have a unique ID for every element in the array.

Whenever you want to modify individual elements of an array-type field, you have to read the document, modify the array in memory, then write the modified array back to the document.
You will not be able to do this without reading the document first. If you require an atomic update, you can perform this read-then-update procedure in a transaction.
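Just to illustrate, here is a minimal sketch of that read-modify-write flow (not a definitive implementation): it assumes the field names from the question, that "order" is stored as an array of maps, and the standard Firebase iOS SDK calls getDocument(completion:) and updateData(_:completion:). If the update must be atomic, the same read and write can be wrapped in db.runTransaction instead.
let docRef = db.collection("collectionName").document("docName")
let arrayElement = 0 // index of the element to change, as in the question

docRef.getDocument { snapshot, error in
    guard let data = snapshot?.data(),
          // "order" is assumed to be stored as an array of maps
          var order = data["order"] as? [[String: Any]],
          order.indices.contains(arrayElement) else {
        print("Error reading document: \(error?.localizedDescription ?? "unexpected data")")
        return
    }

    // modify just the one element in memory
    order[arrayElement]["isHasOffer"] = true
    order[arrayElement]["referenceID"] = self.referenceID
    order[arrayElement]["DosageForm"] = "dd"
    order[arrayElement]["GenericName"] = "test"
    order[arrayElement]["DISC"] = 33

    // write the whole modified array back; the field stays an array
    docRef.updateData(["order": order]) { err in
        if let err = err {
            print("Error updating document: \(err)")
        } else {
            print("Document successfully updated")
        }
    }
}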

Related

MongoDB update array of documents and replace by an array of replacement documents

Is it possible to bulk update (upsert) an array of documents with MongoDB by an array of replacement fields (documents)?
Basically, to get rid of the for loop in this pseudocode example:
for user in users {
    db.users.replaceOne(
        { "name": user.name },
        user,
        { "upsert": true }
    )
}
The updateMany documentation only documents the following case where all documents are being updated in the same fashion:
db.collection.updateMany(
    <query>,
    { $set: { status: "D" }, $inc: { quantity: 2 } },
    ...
)
I am trying to update (upsert) an array of documents where each document has its own set of replacement fields:
updateOptions := options.UpdateOptions{}
updateOptions.SetUpsert(true)
updateOptions.SetBypassDocumentValidation(false)
_, error := collection.Col.UpdateMany(ctx, bson.M{"name": bson.M{"$in": names}}, bson.M{"$set": users}, &updateOptions)
Where users is an array of documents:
[
    { "name": "A", ...further fields},
    { "name": "B", ...further fields},
    ...
]
Apparently, $set cannot be used for this case since I receive the following error: Error while bulk writing *v1.UserCollection (FailedToParse) Modifiers operate on fields but we found type array instead.
Any help is highly appreciated!
You may use Collection.BulkWrite().
Since you want to update each document differently, you have to prepare a different mongo.WriteModel for each document update.
You may use mongo.ReplaceOneModel for individual document replaces. You may construct them like this:
wm := make([]mongo.WriteModel, len(users))
for i, user := range users {
    wm[i] = mongo.NewReplaceOneModel().
        SetUpsert(true).
        SetFilter(bson.M{"name": user.name}).
        SetReplacement(user)
}
And you may execute all the replaces with one call like this:
res, err := coll.BulkWrite(ctx, wm)
Yes, here we too have a loop, but that is to prepare the write tasks we want to carry out. All of them are sent to the database with a single call, and the database is "free" to carry them out in parallel if possible. This is likely to be significantly faster than calling Collection.ReplaceOne() for each document individually.

Compare two big arrays value for value in Node.js

I have two arrays, one containing 200,000 product objects coming from a CSV file and one containing 200,000 product objects coming from a database.
Both arrays contain objects with the same fields, with one exception: the database objects have a unique ID as well.
I need to compare all 200,000 CSV objects with the 200,000 database objects. If the CSV object already exists in the database objects array, I put it in an "update" array together with the ID from the match; if it doesn't, I put it in a "new" array.
When done, I update all the "update" objects in the database and insert all the "new" ones. This goes fast (a few seconds).
The compare step, however, takes hours. I need to compare three values: the channel (string), date (date) and time (string). If all three are the same, it's a match. If one of those isn't, then it's not a match.
This is the code I have:
const newProducts = [];
const updateProducts = [];
csvProducts.forEach((csvProduct) => {
  // check if there is a match
  const match = dbProducts.find((dbProduct) => {
    return dbProduct.channel === csvProduct.channel &&
      moment(dbProduct.date).isSame(moment(csvProduct.date), 'day') &&
      dbProduct.start_time === csvProduct.start_time;
  });
  if (match) {
    // we found a match, add it to updateProducts array
    updateProducts.push({
      id: match.id,
      ...csvProduct
    });
    // remove the match from the dbProducts array to speed things up
    _.pull(dbProducts, match);
  } else {
    // no match, it's a new product
    newProducts.push(csvProduct);
  }
});
I am using lodash and moment.js libraries.
The bottleneck is in the check if there is a match, any ideas on how to speed this up?
This is a job for the Map collection class. Arrays are a hassle because they must be searched linearly. Maps (and Sets) can be searched fast. You want to do your matching in RAM rather than hitting your db for every single object in your incoming file.
So, first read every record in your database and construct a Map where each key is a composite string built from start_time, date, and channel, and the value is the id. (A string key matters here: a JavaScript Map compares object keys by reference, so two separately built {start_time, date, channel} objects would never match. Also format the date down to the day, so two timestamps on the same day produce the same key.)
Something like this pseudocode.
const productsInDb = new Map()
for (const entry of database) {
  // build the key EXACTLY the same way when you load your Map
  // as when you look entries up later (date formatted to the day)
  const key = `${entry.start_time}|${moment(entry.date).format('YYYY-MM-DD')}|${entry.channel}`
  productsInDb.set(key, entry.id)
}
This will take a whole mess of RAM, but so what? It's what RAM is for.
Then do your matching more or less the way you did it in your example, but using your Map.
const newProducts = [];
const updateProducts = [];
csvProducts.forEach((csvProduct) => {
  // check if there is a match: build the key the same way as when loading the Map
  const key = `${csvProduct.start_time}|${moment(csvProduct.date).format('YYYY-MM-DD')}|${csvProduct.channel}`;
  const id = productsInDb.get(key);
  if (id) {
    // we found a match, add it to the updateProducts array
    updateProducts.push({
      id,
      ...csvProduct
    });
    // don't bother to update your Map here
    // unless you need to do something about dups in your csv file
  } else {
    // no match, it's a new product
    newProducts.push(csvProduct);
  }
});

Reduce the size of an array that comes directly from an API

In my app, I get an array from an API:
do {
    var data = try JSONDecoder().decode([NameInfo].self, from: data!)
    data.sort { $0.updated_at! > $1.updated_at! }
    completion(.success(data))
} catch {
    print(error)
}
For example, this data contains about 600 items, which takes a long time to load in the app. I just want to reduce the array to 50 items right here, so that only the reduced array is passed on to the app to show.
I tried to use this method:
let limitedData = data.prefix(50)
completion(.success(limitedData))
but this error shows up:
Member 'success' in 'Result<[NameInfo], Error>' produces result of type 'Result', but context expects 'Result<[NameInfo], Error>'
Could anyone help me on that?
Thanks
Your Result expects [NameInfo] as its Success type, but the prefix() method returns an ArraySlice<NameInfo>. You need to create an array from your limitedData:
let limitedData = data.prefix(50)
completion(.success(Array(limitedData)))

Reading data from a CSV file and converting it into a multidimensional array in Swift

I'm new to Swift. I can read data (many rows and columns of names and mailing addresses) from the CSV file format. I have several of these files, so I created a function just to read the files and extract the data into a multidimensional array (names, addresses, city, state, country). I read each line from the file and try to append it to the multidimensional array, but I get errors: either index out of range or a type mismatch. What's the best way to do this? See the code below.
func getMailing(fileName: String) -> ([[String]])? {
    let totalList = 243
    var tempList: [String] = []
    var arrayList = [[String]]()
    guard let path = Bundle.main.url(forResource: fileName, withExtension: "csv") else {
        print("File Error")
        arrayList = [[""]]
        return (arrayList)
    }
    do {
        // get mailing data from file
        let content = try String(contentsOf: path, encoding: String.Encoding.utf8)
        // separate each line entry
        tempList = content.components(separatedBy: "\r\n")
        for index in 0...totalList - 1 {
            // get each line from list and post into an array
            let singleLine = tempList[index].components(separatedBy: ",").dropFirst().prefix(5)
            // store each line's data into a multidimensional array for easy retrieval
            arrayList[index].append(singleLine)
        }
        return (arrayList)
    } catch {
        print("File Error")
        arrayList = [[""]]
        return (arrayList)
    }
}
Based on the code you've shown, it looks like you're trying to change the values of two different empty arrays 243 times. You have a loop set up to iterate based on your totalList value, but where you got that number, I have no idea. It would be wise to determine it programmatically if you can.
You're setting both tempList and arrayList as empty arrays:
var tempList: [String] = []
var arrayList = [[String]]()
But then you're going through a loop and trying to change the value of an entry that doesn't even exist, hence your index out of range error. tempList does get filled when you split the file contents, but arrayList never gets anything added to it, so it's probably crashing the first time through the loop at arrayList[index].append(singleLine): you're saying arrayList[0], while there isn't an entry for arrayList at index 0 because it's still empty! (tempList[index] can go out of range the same way if the file has fewer than totalList lines.) If you're going to loop through an array, it's always wise to do it based on the count of the array, or at least use a quick fix like this when you need an index into two different arrays:
// Only iterate as far as the shorter of the two arrays allows
let maxLoop = min(tempList.count, arrayList.count)
for index in 0..<maxLoop {
    // get each line from the list and split it into fields
    let singleLine = tempList[index].components(separatedBy: ",").dropFirst().prefix(5)
    // store each line's data in the multidimensional array for easy retrieval
    arrayList[index].append(contentsOf: singleLine)
}
Now, that little chunk of code above won't even go through the loop once, because arrayList is still empty. You need to take your mailing data and parse it somewhere so that arrayList actually gets populated; one way to do that is sketched below.
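For what it's worth, here is a minimal sketch of one way to do that: build each row and append it to the outer array instead of indexing into an empty one. It assumes the same layout as the question (comma-separated fields, \r\n line endings, drop the first column, keep the next five) and returns nil on failure instead of [[""]].
func getMailing(fileName: String) -> [[String]]? {
    // locate the CSV in the app bundle and read it as a single string
    guard let path = Bundle.main.url(forResource: fileName, withExtension: "csv"),
          let content = try? String(contentsOf: path, encoding: .utf8) else {
        print("File Error")
        return nil
    }

    var arrayList = [[String]]()
    // split the file into lines, skipping empty trailing lines
    let lines = content.components(separatedBy: "\r\n").filter { !$0.isEmpty }
    for line in lines {
        // drop the first column and keep the next five, as in the original code
        let fields = Array(line.components(separatedBy: ",").dropFirst().prefix(5))
        // append the whole row; no hard-coded totalList needed
        arrayList.append(fields)
    }
    return arrayList
}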

How can I tell the index of an array of objects in Ruby on Rails 3?

I have an array of todos [Todo1, Todo2, Todo3]
Each object has an attribute, :done_date
I need to find the first instance of an object where :done_date => nil.
THEN I need to know what index it is, todos[N], so I can find the object before it, todos[N-1].
How can I do that?
You could try going about it in a slightly different way. Making use of Ruby's Enumerable#take_while:
# assuming 'todos' holds your todo objects
todos.take_while { |todo| todo.done_date != nil }.last
This will get all todo objects from todos until it sees a nil done_date, and then grab the last one. You'll have the last todo item before the first nil done_date.
So, if you have
todos = [todo1, todo2, todo3, todo4_with_null_done_date]
the code example above will return todo3.
That said, if you're really looking for something that makes use of the array's indices, you could try something like this as well:
first_nil_index = todos.find_index { |todo| todo.done_date.nil? }
todos[first_nil_index - 1]
