App Engine datastore: model for a progressively updated terrain height map - google-app-engine

Users submit rectangular axis-aligned regions associated with "terrain maps". At any time users can delete regions they have created.
class Region(db.Model):
    terrain_map = reference to TerrainMap
    top_left_x = integer
    top_left_y = integer
    bottom_right_x = integer
    bottom_right_y = integer
I want to maintain a "terrain height map", which progressively updates as new rectangular regions are added or deleted.
class TerrainMap(db.Model):
    terrain_height = blob
Here's a "picture" of what the map could look like with two overlapping regions: http://pastebin.com/4yzXSFC5
So I thought I could do this by adding a RegionUpdate model, created whenever a Region entity is created or deleted, and also enqueuing a Task which would churn through a query for "RegionUpdate.applied = False":
class RegionUpdate(db.Model):
    terrain_map = reference to TerrainMap
    top_left_x = integer
    top_left_y = integer
    bottom_right_x = integer
    bottom_right_y = integer
    operation = string, either "increase" or "decrease"
    applied = True/False
The problem is that it seems all RegionUpdate and Region entities have to be in the same entity group as their TerrainMap: RegionUpdates must only get created when Region entities are created or deleted, so that has to be done transactionally; and TerrainMap.terrain_height is updated in a Task, so that update must be an idempotent operation - the only way I can think of doing that is to transactionally grab a batch of RegionUpdate entities and then apply them to the TerrainMap.
That makes my entity group much larger than the "rule of thumb" size of about "a single user's worth of data or smaller".
Am I overlooking some better way to model this?

As I suggested in the Reddit question, I think that Brett's data pipelines talk has all the information you need to build this. The basic approach is this: every time you insert or update a Region, add a 'marker' entity in the same entity group and enqueue a task. Then, in that task, update the corresponding TerrainMap with the new data, and leave a Marker entity as a child of that, too, indicating that you've applied that update. See the talk for full details.
On the other hand, you haven't specified how many Regions you expect per TerrainMap, or how frequently they'll be updated. If the update rate to a single terrain map isn't huge, you could simply store all the regions as child entities of the TerrainMap to which they apply, and update the map synchronously or on the task queue in a single transaction, which is much simpler.
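For the second suggestion, here is a minimal sketch of what the transactional update might look like with the old db API. It assumes the Region and TerrainMap models from the question are defined with real datastore properties, and apply_region_to_height_map is a hypothetical helper that rewrites the affected cells of terrain_height in memory:
from google.appengine.ext import db

def add_region(terrain_map_key, tlx, tly, brx, bry):
    def txn():
        terrain_map = db.get(terrain_map_key)
        # The Region is created as a child of its TerrainMap, so both writes
        # happen in the same entity group and in the same transaction.
        region = Region(parent=terrain_map,
                        terrain_map=terrain_map,
                        top_left_x=tlx, top_left_y=tly,
                        bottom_right_x=brx, bottom_right_y=bry)
        region.put()
        apply_region_to_height_map(terrain_map, region, delta=1)  # hypothetical helper
        terrain_map.put()

    db.run_in_transaction(txn)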

Related

Update data in an Array of Structs with data from another Array of Structs

Let's say I have a struct:
struct Planet {
    var id: UUID
    var name: String
    ...
}
I have an array of such structs which is constructed from data fetched from a database. I use this for a form in a browser where the user can:
edit the fields (e.g. change the name of a Planet)
create one or more new Planets
at this time the user may not delete a Planet, but it would be great if the solution supported that too
When the form is submitted I get an array with those structures (the order is also not the same as the original). What is the best/most efficient way to update the data in the original array with the data from the second?
My current idea is:
map the original array to a dictionary with key = id, value = the Planet struct
loop over the second array (with the edited data); if that key can be retrieved in the dictionary (i.e. the data from the first array), update the struct there, and if not, create an additional Planet in the first array.
I'm not sure if this is a good approach; it seems like there could be a more efficient way (but I can't think of one). It would also not support deleting a Planet.
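For reference, a minimal sketch of that dictionary-merge idea, assuming the Planet struct above and that id is stable across edits:
func merge(original: [Planet], edited: [Planet]) -> [Planet] {
    // Index the original planets by id for O(1) lookups.
    var byID = Dictionary(uniqueKeysWithValues: original.map { ($0.id, $0) })

    for planet in edited {
        byID[planet.id] = planet   // updates an existing Planet or inserts a new one
    }

    // Note: dictionary values are unordered, so re-sort here if order matters.
    return Array(byID.values)
}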
In general, if you can separate out the elements of the array by action, you'll make your life easier.
For example:
var created = [Planet]()
var updated = [Planet]()
var deleted = [Planet]()
In your UI layer, when an edit is made, add the edited planet to the updated array; when a planet is deleted, add it to the deleted array; and so on.
Submit all 3 arrays with your form.
Loop over the results of each and pass them to your create, update, and delete methods that access your database.
That will require restructuring your form code a bit, but... in general it's easier in your UI layer to tell whether someone is doing a create, an update, or a delete, than it is to mush them all together and try to figure it out after the fact by doing comparisons.
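A rough sketch of that shape; the PlanetStore protocol and its methods are hypothetical stand-ins for your database access code:
protocol PlanetStore {
    func insert(_ planet: Planet)
    func update(_ planet: Planet)
    func delete(_ planet: Planet)
}

struct PlanetChanges {
    var created: [Planet] = []
    var updated: [Planet] = []
    var deleted: [Planet] = []
}

// On form submission, apply each bucket with the matching database call.
func apply(_ changes: PlanetChanges, to store: PlanetStore) {
    changes.created.forEach { store.insert($0) }
    changes.updated.forEach { store.update($0) }
    changes.deleted.forEach { store.delete($0) }
}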

How do I modify/enrich an Eloquent collection?

I'm using eloquent relationships.
When I call $carcollection = $owner->cars()->get(); I have a collection to work with. So let's say that I have, for this particular owner, retrieved three cars. The collection is a collection of three arrays. Each array describes a car.
This is all working fine.
Now I want to add more attributes to the array, without breaking the collection. The additional attributes will come from a different source, in fact another model (e.g. servicehistory)
Either I retrieve the other model and then try to merge() them, or I try to manipulate the arrays within the collection without breaking the collection.
All this activity is taking place in my controller.
Is one way better than another, or is there a totally different approach I could use... perhaps this logic belongs in the models themselves? Looking for some pointers :).
Just to be specific, if you do $owner->cars()->get(); you have a collection of Car models, not arrays.
That being said, you can totally load another relation on your Car model using:
$carcollection = $owner->cars()->with('servicehistory')->get();
$carcollection->first()->servicehistory;
You can try to use the transform method of the collection.
$cars = $owner->cars()->get();
$allServiceHistory = $this->getAllService();

$cars->transform(function ($car) use ($allServiceHistory) {
    // you can do whatever you want here
    $car->someAttribute = $allServiceHistory->find(...);
    // or
    $car->otherAttribute = ServiceHistoryModel::whereCarId($car->getKey())->get();

    return $car;
});
And this way, the $cars collection will be mutated to whatever you want.
Of course, it would be wiser to eager load the data instead of falling into an N+1 queries situation.
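For example, a sketch combining eager loading with transform(), assuming the Car model defines a serviceHistory relation (the relation and attribute names here are illustrative):
$cars = $owner->cars()->with('serviceHistory')->get();

$cars->transform(function ($car) {
    // The relation is already eager loaded, so no extra query runs here.
    $car->historyCount = $car->serviceHistory->count();

    return $car;
});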

Most efficient way to increment a value of everything in Firebase

Say I have a list of entries under an Estimates node, each with a priority field. I want to increment the priority field by 1 for every item in the list of Estimates.
I can grab the estimates like this:
var estimates = firebase.child('Estimates');
After that, how would I auto-increment every Estimate's priority by 1?
FOR THE FIRESTORE API ONLY, NOT THE FIREBASE REALTIME DATABASE
Thanks to the latest Firestore patch (March 13, 2019), you don't need to follow the other answers here.
Firestore's FieldValue class now hosts an increment method that atomically updates a numeric document field in the Firestore database. You can use this FieldValue sentinel with either the set method (with merge options) or the update method of the DocumentReference object.
The usage is as follows (from the official docs, this is all there is):
DocumentReference washingtonRef = db.collection("cities").document("DC");
// Atomically increment the population of the city by 50.
washingtonRef.update("population", FieldValue.increment(50));
If you're wondering, it's available from version 18.2.0 of the firebase-firestore library. For your convenience, the Gradle dependency configuration is implementation 'com.google.firebase:firebase-firestore:18.2.0'
Note: Increment operations are useful for implementing counters, but
keep in mind that you can update a single document only once per
second. If you need to update your counter above this rate, see the
Distributed counters page.
EDIT 1: FieldValue.increment() is purely "server" side (happens in firestore), so you don't need to expose the current value to the client(s).
EDIT 2: While using the admin APIs, you can use admin.firestore.FieldValue.increment(1) for the same functionality. Thanks to @Jabir Ishaq for voluntarily letting me know about the undocumented feature. :)
EDIT 3: If the target field which you want to increment/decrement is not a number or does not exist, the increment method sets the field to the given value. This is helpful when you are creating a document for the first time.
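For completeness, the same increment from a Node.js Admin SDK environment might look like this (the collection and field names are just illustrative):
const admin = require('firebase-admin');

admin.initializeApp();
const db = admin.firestore();

// Atomically increment the population field by 50, server-side.
db.collection('cities').doc('DC').update({
  population: admin.firestore.FieldValue.increment(50),
});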
This is one way to loop over all items and increase their priority:
var estimatesRef = firebase.child('Estimates');
estimatesRef.once('value', function(estimatesSnapshot) {
  estimatesSnapshot.forEach(function(estimateSnapshot) {
    estimateSnapshot.ref().update({
      priority: estimateSnapshot.val().priority + 1
    });
  });
});
It loops over all children of Estimates and increases the priority of each.
You can also combine the calls into a single update() call:
var estimatesRef = firebase.child('Estimates');
estimatesRef.once('value', function(estimatesSnapshot) {
  var updates = {};
  estimatesSnapshot.forEach(function(estimateSnapshot) {
    updates[estimateSnapshot.key + '/priority'] = estimateSnapshot.val().priority + 1;
  });
  estimatesRef.update(updates);
});
The performance will be similar to the first solution (Firebase is very efficient when it comes to handling multiple requests). But in the second case only a single command is sent to the server, so the whole update will either fail or succeed completely.

Django: lock particular rows in table

I have the following django method:
def setCurrentSong(request, player):
    try:
        newCurrentSong = ActivePlaylistEntry.objects.get(
            song__player_lib_song_id=request.POST['lib_id'],
            song__player=player,
            state=u'QE')
    except ObjectDoesNotExist:
        toReturn = HttpResponseNotFound()
        toReturn[MISSING_RESOURCE_HEADER] = 'song'
        return toReturn

    try:
        currentSong = ActivePlaylistEntry.objects.get(song__player=player, state=u'PL')
        currentSong.state = u'FN'
        currentSong.save()
    except ObjectDoesNotExist:
        pass
    except MultipleObjectsReturned:
        # This is bad. It means that this function isn't getting executed
        # atomically like we hoped it would be. I think we may actually need
        # a mutex to protect this critical section :(
        ActivePlaylistEntry.objects.filter(song__player=player, state=u'PL').update(state=u'FN')

    newCurrentSong.state = u'PL'
    newCurrentSong.save()
    PlaylistEntryTimePlayed(playlist_entry=newCurrentSong).save()
    return HttpResponse("Song changed")
Essentially, I want it to be so that, for a given player, there is only one ActivePlaylistEntry that has a 'PL' (playing) state at any given time. However, I have actually experienced cases where, as a result of quickly calling this method twice in a row, I get two songs for the same player with a state of 'PL'. This is bad, as I have other application logic that relies on the fact that a player only has one playing song at any given time (plus, semantically, it doesn't make sense to be playing two different songs at the same time on the same player). Is there a way for me to do this update atomically? Just running the method as a transaction with the commit_on_success decorator doesn't seem to work. Is there a way to lock the table for all songs belonging to a particular player? I was thinking of adding a lock column to my model (a boolean field) and either just spinning on it or pausing the thread for a few milliseconds and checking again, but these feel super hackish and dirty. I was also thinking about creating a stored procedure, but that's not really database independent.
Locking queries (select_for_update()) were added in Django 1.4.
with transaction.commit_manually():
    # The locking queryset must be evaluated for the SELECT ... FOR UPDATE
    # to run and the row locks to be taken.
    list(ActivePlaylistEntry.objects.select_for_update().filter(...))
    aple = ActivePlaylistEntry.objects.get(...)
    aple.state = ...
    aple.save()
    transaction.commit()
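On Django 1.6+ the same idea is usually written with atomic() instead of manual commits; here is a sketch using the models from the question:
from django.db import transaction

def set_current_song(player, lib_id):
    with transaction.atomic():
        # Lock every playing entry for this player until the block commits.
        playing = (ActivePlaylistEntry.objects
                   .select_for_update()
                   .filter(song__player=player, state=u'PL'))
        for entry in playing:
            entry.state = u'FN'
            entry.save()

        new_current = ActivePlaylistEntry.objects.get(
            song__player_lib_song_id=lib_id,
            song__player=player,
            state=u'QE')
        new_current.state = u'PL'
        new_current.save()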
But you should consider refactoring so that a separate table with a ForeignKey is used to indicate the "active" song.

Excel VBA Programming with Arrays: To Pass them or Not To Pass them?

Question: I am wondering which is the optimal solution for dealing with Arrays in Excel 2003 VBA
Background: I have a macro in Excel 2003 that is over 5000 lines. I have built it over the last 2 years, adding new features as new Procedures, which helps to segment the code and debug, change, or add to that feature. The downside is that I am using much of the same base information in multiple procedures, which requires me to load it into arrays, with minor differences, multiple times. I am now running into issues with the length of the run time, so I am now able to do a full rewrite.
This file is used to grab multiple items of manufacturing flows (up to 4 different set-ups with a total of up to 10 distinct flows, of up to 1000 steps each), with the information being Flow specific, Sub-Flow specific for grouping / sorting purposes, and Data (such as movements, inventory, CT, ...).
It then sticks the data onto multiple sheets used to manage the process, utilizing data sheets to be perused, charts, and cell formatting to denote process flow capability / history.
The Flow is in the Excel file, while the manufacturing data is read in with 7 different OO4O Oracle SQL pulls, some reused multiple times.
The Arrays are:
arrrFlow(1 to 1000, 1 to 4) as a Record Type with 4 strings
arrrSubFlow(1 to 1000, 1 to 10) as a Record Type with 4 strings, 2 integers, and 1 single
arrrData(1 to 1000, 1 to 10) as a Record Type with 1 string, 4 integers, 12 longs, and 1 single
arriSort(1 to 1000, 1 to 4) as Integer (Used as a pointer Array to sort the Flow, Sub Flow, and Data in a Group, Sub Group, and Step order while leaving the original arrays in Step order)
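For reference, a rough sketch of what one of these record types and arrays might look like as declarations (member names are illustrative; the question only gives the field counts and the strStep member mentioned later):
Option Explicit

'Declared in a standard module so the type can back a Public array
Public Type FlowRecord
    strStep As String
    strGroup As String
    strSubGroup As String
    strProduct As String
End Type

'Module-level declaration (as in option 3 below), or Dim'ed inside a procedure (option 1)
Public arrrFlow(1 To 1000, 1 To 4) As FlowRecord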
Possibilities:
1) Rewrite the macro into one big procedure that loads the data into master arrays dimensioned within the Procedure once
Pro: Dimensioned in the Procedure rather than as a Public Variable in the Module and not passed.
Con: Harder to debug with one mega procedure instead of multiple smaller ones.
2) Keep macro with multiple procedures but passing the Arrays
Pro: Easier to debug code with multiple smaller procedures.
Con: Passing Arrays (Expensive?)
3) Keep macro with multiple procedures but with the Arrays being Public Dim'ed variables in the Module
Pro: Easier to debug code with multiple smaller procedures.
Con: Public Arrays (Expensive?)
So, what's the community's verdict? Does anyone know the expense of using Public Arrays vs Passing Arrays? Is the Cost of either of these worth losing the ease of having my procedures being focused on one feature?
UPDATE:
I load Inventory Data at a discrete level (multiple per Step), Moves Data at an aggregate level (one per step), and the Beginning of Shift Inventory at an aggregate level. I aggregate the Inventory data by step, placing it in Work State categories (Run, Wait, ...). I create targets off data already on the sheets.
I have a Flow sheet that shows the Work Flows by Type. Currently 3 products have a similar, but not exactly the same, flow, and 2 products have a different flow; those two are similar to each other but, again, not the same. I have assigned each set of steps in the different flows a group and sub-group.
I place this data on multiple sheets, some in Step Order, some in group / sub-group order. I also need the data summed up by group and product, group / sub-group and product, portion of the line and product, and product.
I use Record Types so I actually have a readable three dimensional array, arrSubFlow(1,1).strStep (Step Name of the 1st Step of the 1st Device), arrData(10,5).lngYest (Yesterday's movement for the 10th Step of the 5th Device).
My main point of optimization is going to be the section where I create 10 pages from scratch every single time, with merged cells, borders, headers, ... This is a very time-consuming process. I will add a section that compares my data with the page to see if it needs to be changed and, only if so, recreates it; otherwise, I'll clear each section of data and only write data that changes to the sheet. This will be huge, based on my time-logging data. However, whenever I update code, I always try to improve other aspects of the code as well. I see loading the data into a structure (Array, Recordset, Collection) once as a little bit of optimization, but more so as a data-integrity measure, so that I do not have the opportunity to load it differently for different sheets.
The main issues I see getting away from Arrays right now are:
* Already heavily invested in them, but this is not a good enough reason to not change
* Don't know if there is much cost to passing them, since it will be ByRef
* I use a Sort Function to create a Sorted "Pointer" array that lets me leave the Array in Step Flow order, while easily referencing it by Group / Sub-group order.
Since I am always trying to write my code for both now and the future, I am not against updating the arrays to either Recordsets or Collections, but not merely for the sake of changing them to learn something cool. My arrays work, and from my research they add seconds to the run time, not substantial amounts for this 2-minute report. So if another structure is easier to update in the future than two-dimensional arrays of Record Types, then please let me know, but does anyone know the cost of passing an Array to a procedure, assuming you are not doing a ByVal pass?
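For reference, the calling pattern being asked about looks like the sketch below (procedure names are hypothetical). VBA can only pass arrays ByRef, so the call copies a pointer to the array rather than its contents:
' Illustration only: the callee works on the caller's array directly.
Sub FillSortOrder(ByRef arriSort() As Integer)
    arriSort(1, 1) = 42    ' modifies the caller's array, no copy is made
End Sub

Sub BuildReport()
    Dim arriSort(1 To 1000, 1 To 4) As Integer
    FillSortOrder arriSort    ' cheap regardless of the array's size
End Sub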
You've provided a good bit of detail, but it's still quite difficult to understand exactly what's going on without seeing some code. In your question, I can identify at least 4 big topics that you interweave throughout: Manufacturing, Data Access, VBA, and Coding Best-Practices. It's hard for me to tell exactly what you're asking because your question scope is huge. Either way, I appreciate your trying to write better code in VBA.
It's hard for me to understand exactly what you plan to do with the arrays. You say:
The downside is that I am using much of the same base information in multiple procedures, which requires me to load it into arrays with minor differences multiple times.
I'm not sure what you mean here. Are you using arrays to represent a row of data that you retrieved from a database? If so, you might consider using class modules instead of the usual "macro" modules. These will allow you to work with full-blown objects instead of arrays of values (or references, as the case may be). Classes take more work to set up and consume, but they make your code a lot easier to work with and will greatly help you to segment your code.
As user Emtucifor already pointed out, there may be objects such as ADO Recordset objects (which may require Access to be installed...not sure) that can help greatly. Or you might create your own.
Here's a long example of how using a class might help you. Although this example is lengthy, it will show you how a few principles of object-oriented programming can really help you clean up your code.
In the VBA editor, go to Insert > Class Module. In the Properties window (bottom left of the screen by default), change the name of the module to WorkLogItem. Add the following code to the class:
Option Explicit

Private pTaskID As Long
Private pPersonName As String
Private pHoursWorked As Double

Public Property Get TaskID() As Long
    TaskID = pTaskID
End Property
Public Property Let TaskID(lTaskID As Long)
    pTaskID = lTaskID
End Property

Public Property Get PersonName() As String
    PersonName = pPersonName
End Property
Public Property Let PersonName(lPersonName As String)
    pPersonName = lPersonName
End Property

Public Property Get HoursWorked() As Double
    HoursWorked = pHoursWorked
End Property
Public Property Let HoursWorked(lHoursWorked As Double)
    pHoursWorked = lHoursWorked
End Property
The above code will give us a strongly-typed object that's specific to the data with which we're working. When you use multi-dimensional arrays to store your data, your code resembles this: arr(1,1) is the ID, arr(1,2) is the PersonName, and arr(1,3) is the HoursWorked. Using that syntax, it's hard to know what is what. Let's assume you still load your objects into an array, but instead use the WorkLogItem that we created above. This way, you would be able to do arr(1).PersonName to get the person's name. That makes your code much easier to read.
Let's keep moving with this example. Instead of storing the objects in array, we'll try using a collection.
Next, add a new class module and call it ProcessWorkLog. Put the following code in there:
Option Explicit

Private pWorkLogItems As Collection

Public Property Get WorkLogItems() As Collection
    Set WorkLogItems = pWorkLogItems
End Property
Public Property Set WorkLogItems(lWorkLogItem As Collection)
    Set pWorkLogItems = lWorkLogItem
End Property

Function GetHoursWorked(strPersonName As String) As Double
    On Error GoTo Handle_Errors

    Dim wli As WorkLogItem
    Dim doubleTotal As Double

    doubleTotal = 0
    For Each wli In WorkLogItems
        If strPersonName = wli.PersonName Then
            doubleTotal = doubleTotal + wli.HoursWorked
        End If
    Next wli

Exit_Here:
    GetHoursWorked = doubleTotal
    Exit Function

Handle_Errors:
    'You will probably want to catch the error that will
    'occur if WorkLogItems has not been set
    Resume Exit_Here
End Function
The above class is going to be used to "do something" with a collection of WorkLogItem. Initially, we just set it up to count the total number of hours worked. Let's test the code we wrote. Create a new Module (not a class module this time; just a "regular" module). Paste the following code in the module:
Option Explicit

Function PopulateArray() As Collection
    Dim clnWlis As Collection
    Dim wli As WorkLogItem

    'Put some data in the collection
    Set clnWlis = New Collection

    Set wli = New WorkLogItem
    wli.TaskID = 1
    wli.PersonName = "Fred"
    wli.HoursWorked = 4.5
    clnWlis.Add wli

    Set wli = New WorkLogItem
    wli.TaskID = 2
    wli.PersonName = "Sally"
    wli.HoursWorked = 3
    clnWlis.Add wli

    Set wli = New WorkLogItem
    wli.TaskID = 3
    wli.PersonName = "Fred"
    wli.HoursWorked = 2.5
    clnWlis.Add wli

    Set PopulateArray = clnWlis
End Function

Sub TestGetHoursWorked()
    Dim pwl As ProcessWorkLog
    Dim arrWli() As WorkLogItem

    Set pwl = New ProcessWorkLog
    Set pwl.WorkLogItems = PopulateArray()

    Debug.Print pwl.GetHoursWorked("Fred")
End Sub
In the above code, PopulateArray() simply creates a collection of WorkLogItem. In your real code, you might create class to parse your Excel sheets or your data objects to fill a collection or an array.
The TestGetHoursWorked() code simply demonstrates how the classes were used. You notice that ProcessWorkLog is instantiated as an object. After it is instantiated, a collection of WorkLogItem becomes part of the pwl object. You notice this in the line Set pwl.WorkLogItems = PopulateArray(). Next, we simply call the function we wrote which acts upon the collection WorkLogItems.
Why is this helpful?
Let's suppose your data changes and you want to add a new method. Suppose your WorkLogItem now includes a field for HoursOnBreak and you want to add a new method to calculate that.
All you need to do is add a property to WorkLogItem like so:
Private pHoursOnBreak As Double

Public Property Get HoursOnBreak() As Double
    HoursOnBreak = pHoursOnBreak
End Property
Public Property Let HoursOnBreak(lHoursOnBreak As Double)
    pHoursOnBreak = lHoursOnBreak
End Property
Of course, you'll need to change your method for populating your collection (the sample method I used was PopulateArray(), but you probably should have a separate class just for this). Then you just add your new method to your ProcessWorkLog class:
Function GetHoursOnBreak(strPersonName As String) As Double
    'Code to get hours on break
End Function
Now, if we wanted to update our TestGetHoursWorked() method to return the result of GetHoursOnBreak, all we would have to do is add the following line:
Debug.Print pwl.GetHoursOnBreak("Fred")
If you passed in an array of values that represented your data, you would have to find every place in your code where you used the arrays and then update it accordingly. If you use classes (and their instantiated objects) instead, you can much more easily update your code to work with changes. Also, when you allow the class to be consumed in multiple ways (perhaps one function needs only 4 of the object's properties while another function will need 6), they can still reference the same object. This keeps you from having multiple arrays for different types of functions.
For further reading, I would highly recommend getting a copy of VBA Developer's Handbook, 2nd edition. The book is full of great examples and best practices and tons of sample code. If you're investing a lot of time into VBA for a serious project, it's well worth your time to look into this book.
It sounds like maybe Excel and arrays are not the best tools for the job you're doing. If you could please explain a little bit about the type of data that you're working with and what you're doing, that will really help provide a better answer. Give as much detail as you can about the types of manipulations you're doing on the data and what the inputs and outputs are.
I'm going to give some highlights that I think will help you, and then may edit my answer to be more complete as I get responses from you, and so I have more time to flesh things out a bit.
There is an object that naturally handles the record-type objects you're working with called a Recordset. In the VBA editor, go to Tools -> References and add Microsoft ActiveX Data Objects 2.X Library (the highest one on your machine). You can declare an object of type ADODB.Recordset, then do Recordset.Fields.Append to add fields to it, then .Open it and finally .AddNew, set field values, and .Update. This is a natural object to pass around in programs as an input or output parameter. It has natural traversal and positioning functions (.Eof, .Bof, .AbsolutePosition, .MoveNext, .MoveFirst, .MovePrevious) and supports searching and filtering (.Filter = "Field = 'abc'", .Find and so on).
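For illustration, here is a minimal sketch of fabricating and traversing such a Recordset (the field names are made up, and it assumes the ADO reference above has been added):
Sub RecordsetSketch()
    Dim rs As ADODB.Recordset
    Set rs = New ADODB.Recordset

    'Define the fields, then open the (disconnected) recordset
    rs.Fields.Append "StepName", adVarChar, 100
    rs.Fields.Append "Moves", adInteger
    rs.Open

    'Add a row
    rs.AddNew
    rs!StepName = "Etch"
    rs!Moves = 250
    rs.Update

    'Traverse the rows
    rs.MoveFirst
    Do While Not rs.EOF
        Debug.Print rs!StepName, rs!Moves
        rs.MoveNext
    Loop
End Sub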
I don't recommend using public variables, though without an understanding of what you're doing I can't really advise you well here.
I also would avoid one big procedure. Code should be broken out into reusable functional units that do only one thing, whose names are essentially self-documenting about what they do.
If you want to improve the performance of your code, hit ctrl-break at random times while it's running and break into the code. Then press Ctrl-L to view the call stack. Make a note of what is in the list each time. If any item shows up a majority of the time, it is the bottleneck and is where you should spend your time trying to optimize it. However, I don't advise trying to optimize what you have until you make some higher-level decisions (like whether you will switch to a recordset).
I really need more information to help you better.
If you're interested, I'll work up some demonstration code that will show how useful the Recordset object is. Inserting the data from a Recordset into an Excel range is super easy with Recordset.GetRows or .GetString (though some array transposition may be required, that's not hard, either).
UPDATE: If your goal is to speed up your process, then before doing anything I think it's best to be armed with the knowledge of what is taking the most time. Would you please hit ctrl-break about 10 times and note down the call stack each time, then tell me what the most common items in the call stack are?
In terms of updating the speed of cell formatting, here's my experience:
Merge is the slowest operation you can possibly do. Try to avoid it if at all possible. Using "center across selection" is one alternative. Another is just not merging, but using some combination of sizing properly, borders, cell background color, and turning off gridlines for the entire workbook.
Apply borders or other formatting once to the largest thing possible instead of to many small things such as cell by cell. For example, if most cells have all borders but some don't, then apply all borders to the entire range and during your looping remove the ones you don't want. And even then, try to do entire rows and larger ranges.
Save a template file with borders and formatting already applied. Let's say you put one row in it with the formatting for a certain section. In one step duplicate that row into as many rows are needed for that section, say 20 rows, and they will all have the same formatting. Duplicating rows is MUCH faster than applying formatting cell by cell.
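A couple of illustrative snippets for those last two points (the sheet and range names are made up):
Sub ApplyFormattingInBulk()
    'Apply borders to the whole block once, not cell by cell
    With Worksheets("Report").Range("B2:K40").Borders
        .LineStyle = xlContinuous
        .Weight = xlThin
    End With

    'Duplicate a pre-formatted template row instead of re-applying formats
    Worksheets("Template").Rows(1).Copy _
        Destination:=Worksheets("Report").Rows("5:24")
End Sub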
Also, I wouldn't automatically go for using classes. While OO is great and I do it myself (heck, I just built 8 classes for something the other day to model a hierarchical structure so I could easily expose the parts of it when I needed them), in practice it can be slower. A simple set of public variables in a class is faster than using getters and setters. A user defined Type is even faster than a class, but you can run into gotchas trying to pass around UDTs in classes (they have to be declared in a non-class public module and even then they can give problems).
