Why do HERE Maps routes differ between the app and the web?

We use the HERE API in both our mobile apps (Android and iOS) and our web app (client dashboard). We ran into a problem when adding truck parameters to the API request: the routes sometimes differ between the mobile API and the web API. We used exactly the same parameters, but the routes still differ, and we can't figure out what else to change to make them match.
Here's one test example:
point1 = 37.7793808,-122.4184174(San Francisco),
point2 = 40.7559247,-73.9846107(New York),
vehicleWidth = 300 feet,
vehicleHeight = 400 feet,
vehicleLength = 200 feet,
limitedVehicleWeight = 50 lbs,
Web request:
var routingParameters = {
    'mode': 'fastest;truck;traffic:enabled;boatFerry:-1',
    'representation': 'display',
    'routeAttributes': 'waypoints,summary,shape,legs'
};
var feet = 0.3048;
var ton = 0.00045359237;
routingParameters.vehicleWidth = (300 * feet) + 'm';
routingParameters.vehicleHeight = (400 * feet) + 'm';
routingParameters.vehicleLength = (200 * feet) + 'm';
routingParameters.limitedVehicleWeight = (50 * ton) + 't';
for (var x = 0; x < points.length; x++) {
    var point = points[x];
    routingParameters['waypoint' + x] = 'geo!' + point.lat + ',' + point.lng;
}
var router = platform.getRoutingService();
var onResult = function (result) {
Then we display the route. As a result we get this route:
The same on the app side (iOS example):
var stops = [NMAWaypoint.init(geoCoordinates: NMAGeoCoordinates(latitude: Double(trip.startLatitude) ?? 0, longitude: Double(trip.startLongitude) ?? 0), waypointType: .stopWaypoint)]
stops.append(NMAWaypoint.init(geoCoordinates: NMAGeoCoordinates(latitude: Double(trip.endLatitude) ?? 0, longitude: Double(trip.endLongitude) ?? 0), waypointType: .stopWaypoint))
let routingMode = NMARoutingMode.init(routingType: NMARoutingType.fastest, transportMode: NMATransportMode.truck, routingOptions: NMARoutingOption.avoidBoatFerry)
let dimensions = TruckDimensions.getSingleEntity()
routingMode.vehicleWidth = dimensions.width.floatValue * feet
routingMode.vehicleHeight = dimensions.height.floatValue * feet
routingMode.vehicleLength = dimensions.length.floatValue * feet
routingMode.limitedVehicleWeight = dimensions.weight.floatValue * ton
let penalty = NMADynamicPenalty.init()
penalty.trafficPenaltyMode = NMATrafficPenaltyMode.optimal
coreRouter.dynamicPenalty = penalty
progress = coreRouter.calculateRoute(withStops: stops, routingMode: routingMode) { (result, error) in
And as a result we get a totally different route:
We found all the parameters that are sent by default with the app request and tried to send the same ones from the web (at least the ones we found in the documentation, which is quite sparse):
But nothing helped; the routes are still different. Our guess is that the web and mobile APIs use different calculations on the server side, but we can't find any proof. How can we get both the app-side and web-side APIs to return the same routes for the same truck parameters? This is a critical part of our logic.
Thank you

The base algorithms started out the same, but the code bases of the mobile SDK and the web services have diverged over the last few years.
The web services have newer features like HSP (historical speed patterns) that the mobile SDK service does not (and cannot) support, so you might get slightly different results.
The mobile SDK route calculation might also be done in offline mode, where the algorithm is certainly different.
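Until the services converge, what you can control is that both clients send byte-identical parameters. A minimal sketch of centralizing the unit conversion (parameter names follow the HERE Routing API v7 style used above; the fixed rounding is an assumption, chosen only so web and mobile payloads come out identical):

```javascript
// Convert the question's imperial test values into the metric units HERE
// expects (meters, tonnes), formatted the same way on every client.
const FEET_TO_M = 0.3048;
const LBS_TO_T = 0.00045359237;

function truckParams({ widthFt, heightFt, lengthFt, weightLbs }) {
  // Round to fixed precision so web and mobile send byte-identical strings.
  const m = v => (v * FEET_TO_M).toFixed(2);
  return {
    vehicleWidth: m(widthFt) + 'm', // unit suffix on every dimension
    vehicleHeight: m(heightFt) + 'm',
    vehicleLength: m(lengthFt) + 'm',
    limitedVehicleWeight: (weightLbs * LBS_TO_T).toFixed(4) + 't',
  };
}

const params = truckParams({ widthFt: 300, heightFt: 400, lengthFt: 200, weightLbs: 50 });
console.log(params.vehicleWidth); // "91.44m"
```

If both request builders go through one function like this, any remaining route difference is down to the services themselves, not the payload.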


How fast is looking up a value in localStorage with Javascript?
Does anyone have links to any performance tests that indicate whether or not it is worth caching the data in a JavaScript object? Or does the browser already cache values that are accessed from localStorage anyway?
I am particularly interested in Firefox and Chrome implementations of localStorage.
I just made a small benchmark. I plan to read some data from localStorage quite often, and I wondered whether it would block. localStorage.getItem is a synchronous operation, which may sound scary, but is it?
1st test: call localStorage one million times to get an item that exists.
2nd test: call localStorage one million times to get an item that does NOT exist.
Results:
"Item found: 0.0007991071428571318ms per call"
"Item not found: 0.0006365004639793477ms per call"
In short: just use it. It takes practically no time, 0.0007 of a millisecond per call.
https://jsbin.com/resuziqefa/edit?js,console
let results = [];
let sum = 0;
localStorage.setItem('foo', 'foo bar baz foo bar baz foo bar baz foo');

for (let i = 0; i < 1000000; i++) {
    let a = performance.now();
    localStorage.getItem('foo');
    let result = performance.now() - a;
    sum += result;
    results.push(result);
}
console.log(`Item found: ${sum / results.length}ms per call`);

results = [];
sum = 0;
for (let i = 0; i < 1000000; i++) {
    let a = performance.now();
    localStorage.getItem('bar');
    let result = performance.now() - a;
    sum += result;
    results.push(result);
}
console.log(`Item not found: ${sum / results.length}ms per call`);
For what it's worth, here is a jsperf test.
In this benchmark, localStorage access is significantly slower than access to regular object properties, in both FF7 and IE9. Of course, this is just a micro-benchmark, and does not necessarily reflect real-world usage or performance...
Sample pulled from my FF 7 run to show what "significantly slower" means, in ops/second:
              native     local-storage   notes
small set     374,397    13,657          10 distinct items
large set     2,256      68              100 distinct items
read-bias     10,266     342             1 write, 10 reads, 10 distinct items
Also, there are restrictions on what can be put in localStorage. YMMV.
Apples to apples
There is not much value in comparing localStorage to object storage, the two have different purposes in JavaScript. It is likely that you will only need to touch localStorage a few times in your app and the rest of the work will be done in memory.
Local Storage vs Cookies
A better comparison against localStorage would be that of its classic counterpart, document.cookie. Both localStorage and document.cookie's main purpose is to persist a value across browser refreshes.
I've put together an example on codesandbox.io:
localStorage is two orders of magnitude faster than document.cookie.
Object storage is an order of magnitude faster than localStorage (irrelevant but added for reference).
localStorage is by far the fastest mechanism to persist values across a browser refresh.
Note that I've precompiled cookie regex getters in order to make cookies as fast as possible and used the browser performance API for accurate measurements. All tests do a set of a unique key followed by a get of the same key.
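For reference, the "precompiled cookie regex getter" idea is just building the RegExp once per cookie name instead of on every read. A sketch (hypothetical helper; it takes the cookie string as an argument, so in the browser you would pass it document.cookie):

```javascript
// Compile one RegExp per cookie name up front, reuse it for every read.
// Note: the name is assumed to contain no regex metacharacters.
function makeCookieGetter(name) {
  const re = new RegExp('(?:^|;\\s*)' + name + '=([^;]*)');
  return cookieString => {
    const m = re.exec(cookieString);
    return m ? decodeURIComponent(m[1]) : null;
  };
}

const getFoo = makeCookieGetter('foo');
console.log(getFoo('a=1; foo=bar%20baz; b=2')); // "bar baz"
console.log(getFoo('a=1'));                     // null
```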
I appreciate the efforts of the previous answers, but I found the benchmarks lacking. Here's a better micro-benchmark (note: still a micro-benchmark). Always prefer measuring real performance bottlenecks to premature optimization.
Benchmarks are for reading and writing a single value, a list of a hundred objects, and a list of ten-thousand objects from and to localStorage.
TL;DR:
single read: 0.0004ms, write: 0.0114ms
small list read: 0.0325ms, write: 0.0498ms
large list read: 3.1787ms, write: 3.3190ms
Ran on a 3.1 GHz Quad-Core Intel Core i7, Chrome 79.
read local storage - single x 2,439,715 ops/sec ±0.91% (62 runs sampled)
read local storage - small x 30,742 ops/sec ±0.78% (62 runs sampled)
read local storage - large x 315 ops/sec ±1.30% (60 runs sampled)
write local storage - single x 88,032 ops/sec ±4.25% (33 runs sampled)
write local storage - small x 20,079 ops/sec ±1.89% (59 runs sampled)
write local storage - large x 301 ops/sec ±1.09% (60 runs sampled)
Test: read local storage - single
mean: 0.0004098839352502669ms
margin of error: ±0.000003731514453196282ms
deviation: ±0.00001499080315635531ms
Test: read local storage - small
mean: 0.03252840093744983ms
margin of error: ±0.0002551322114660716ms
deviation: ±0.001024955633672395ms
Test: read local storage - large
mean: 3.1786666666666674ms
margin of error: ±0.041479799689699ms
deviation: ±0.16392915653288143ms
Test: write local storage - single
mean: 0.011359496605398242ms
margin of error: ±0.00048286808926899016ms
deviation: ±0.0014152377493978731ms
Test: write local storage - small
mean: 0.04980309857651518ms
margin of error: ±0.0009408982120607311ms
deviation: ±0.0036873348473201325ms
Test: write local storage - large
mean: 3.31899154589372ms
margin of error: ±0.03605551145015122ms
deviation: ±0.14249224018921072ms
Here's a snippet to run it yourself if you wish.
<script src="https://cdn.jsdelivr.net/npm/lodash@4.17.15/lodash.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/benchmark/2.1.4/benchmark.min.js"></script>
<script>
    const suite = new Benchmark.Suite();

    const genNum = () => Math.floor(Math.random() * 1000000);
    const genObj = () => ({
        target: String(genNum()),
        swap: String(genNum()),
        price: genNum()
    });

    const printStats = test =>
        console.log(
            `Test: ${test.name}
mean: ${test.stats.mean * 1000}ms
margin of error: ±${test.stats.moe * 1000}ms
deviation: ±${test.stats.deviation * 1000}ms`
        );

    const singleNum = String(genNum());
    const smallList = _.range(100).map(genObj);
    const largeList = _.range(10000).map(genObj);

    const singleKey = "single-key";
    const smallKey = "small-key";
    const largeKey = "large-key";

    localStorage.setItem(singleKey, singleNum);
    localStorage.setItem(smallKey, JSON.stringify(smallList));
    localStorage.setItem(largeKey, JSON.stringify(largeList));

    suite
        .add("read local storage - single", function() {
            const readData = localStorage.getItem(singleKey);
        })
        .add("read local storage - small", function() {
            const readData = JSON.parse(localStorage.getItem(smallKey));
        })
        .add("read local storage - large", function() {
            const readData = JSON.parse(localStorage.getItem(largeKey));
        })
        .add("write local storage - single", function() {
            localStorage.setItem("single_write_key", singleNum);
        })
        .add("write local storage - small", function() {
            localStorage.setItem("small_write_key", JSON.stringify(smallList));
        })
        .add("write local storage - large", function() {
            localStorage.setItem("large_write_key", JSON.stringify(largeList));
        })
        .on("cycle", function(event) {
            console.log(String(event.target));
        })
        .on("complete", function() {
            this.forEach(printStats);
        })
        .run({ async: true });
</script>

Google Sheets custom function displays "Loading..." forever in mobile app

I have written a custom function for Google Sheets in Apps Script. The goal is to have a sheet which automatically calculates who owes how much money to whom (e.g. to split a bill).
My sheet looks like this:
The first bill (Restaurant) is to be split among all 5, and the second bill is to be split among all 5 except Peter, because B3 is empty (there is no 0 in it).
The input for my Apps Script function is cells B1 to F3 (thus, values AND names). The function works fine: it calculates the correct results. I open the spreadsheet in a browser (sheets.google.com) AND in the Google Sheets app on my phone. However, on my phone it often happens that the result cell (with the formula =calc_debt(B1:F3)) only displays "Loading...". What's the problem?
For the sake of completeness, here is custom function's code:
function calc_debt(input) {
    var credit = [0, 0, 0, 0, 0]; // credit[0] = Peter, credit[1] = Mark ...
    for (var i = 1; i < input.length; i++) { // start at i = 1 to skip the first row, which holds the names
        // first: calculate how much everybody has to pay
        var sum = 0;
        var people = 0;
        for (var j = 0; j <= 4; j++) {
            if (input[i][j] !== "") {
                sum += input[i][j];
                people += 1;
            }
        }
        var avg_payment = sum / people;
        // second: calculate who has paid too much or too little
        for (var j = 0; j <= 4; j++) {
            if (input[i][j] !== "") {
                credit[j] += input[i][j] - avg_payment;
            }
        }
    }
    // this helper is needed later; treat anything within rounding error of zero as settled
    function test_zero(value) {
        return Math.abs(value) < 0.00001;
    }
    var res = ""; // the result string, something like "Peter to Mark: 13.8 | Katy to ..."
    while (!credit.every(test_zero)) {
        var lowest = credit.indexOf(Math.min.apply(null, credit)); // person with the lowest credit balance (negative)
        var highest = credit.indexOf(Math.max.apply(null, credit)); // person with the highest credit balance (positive)
        var exchange = Math.min(Math.abs(credit[lowest]), Math.abs(credit[highest])); // how much we can settle between the two
        credit[lowest] += exchange;
        credit[highest] -= exchange;
        res += input[0][lowest] + " to " + input[0][highest] + ": " + exchange.toFixed(2) + " | "; // input[0] is the row with the names
    }
    return res;
}
I'm having a similar issue in the Android app: loading a custom formula sometimes just shows 'Loading...', while on the web it always works fine. I've found a workaround to force the formulas to load in the Android app:
Menu -> Export -> Save as -> PDF.
This takes a moment, and behind the modal loading indicator you will see the formulas eventually resolve. You can wait for the export to finish, or cancel it as soon as you see your formula has resolved.
Making the document available offline via the menu toggle can also resolve the formulas.
Another thing you could do is use caching in your script. Then, whenever you use the web version to render more complex formulas, the results are stored and can be loaded immediately by the mobile app. Unfortunately, the Google cache is limited in time and invalidates after a few hours. See here for more information:
https://developers.google.com/apps-script/reference/cache/
These two things work quite well. However, I'm still searching for a better solution. Let me know if you find one.
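The caching idea boils down to "compute once, serve repeats by key". In Apps Script the store would be CacheService.getScriptCache() (string keys and values, with a TTL); the sketch below uses a plain Map instead so it is self-contained, with the key derived from the input range:

```javascript
// Wrap an expensive function so repeated calls with the same input are
// served from a cache. In Apps Script, replace the Map with
// CacheService.getScriptCache() put/get calls (string values, TTL-bound).
function cached(fn, cache = new Map()) {
  return input => {
    const key = JSON.stringify(input); // the input range serves as the key
    if (!cache.has(key)) cache.set(key, fn(input));
    return cache.get(key);
  };
}

let calls = 0;
const calcDebtCached = cached(input => {
  calls++; // stands in for the real calc_debt work
  return 'result for ' + input.length + ' rows';
});
calcDebtCached([[1], [2]]);
calcDebtCached([[1], [2]]); // second call is served from the cache
```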
Solved by following the solution provided here:
Menu -> Export -> Save as -> PDF
This forces the script to run on mobile so the result becomes readable in the mobile sheet.

Koa2 session page view count 2 times for each update

When you load the page the first time, it shows the count correctly. After that, each reload increases the page view count by 2, not 1.
const Koa = require('koa');
const session = require('koa-session');

const app = new Koa();
app.keys = ['Shh, its a secret!'];
app.use(session(app));

app.use(async function(ctx) {
    let n = ctx.session.views || 0;
    ctx.session.views = ++n;
    console.log(`times= ${n}`);
    if (n === 1) {
        ctx.body = 'Welcome here for the first time!';
    } else {
        ctx.body = `You visited this page ${n} times!`;
    }
});

app.listen(3000);
Actually, I tested it just now and it seems to work correctly. What you have to consider is that, when you type a URL into the browser, some browsers already try to preload things. That is what happened here.
To be sure it works as expected, I shut down the node/koa app, cleared the session for that site in the browser (Developer tools - Storage - Sessions), started the node/koa app again, and then hit reload several times on the client side. On the server side I got the correct output:
times= 1
times= 2
times= 3
times= 4
times= 5
and on client side the output was also correct.
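If you want to rule out extra browser requests (a preload, or e.g. a favicon fetch) inflating the counter, you can count only the page path. A sketch of that guard as a plain function (the mock ctx objects are illustrative; in the real app this logic sits inside the Koa middleware):

```javascript
// Only count views for the actual page path, so stray browser requests
// (e.g. /favicon.ico, preloads) don't bump the session counter.
function countView(ctx, path = '/') {
  if (ctx.path !== path) return null; // ignore non-page requests
  const n = (ctx.session.views || 0) + 1;
  ctx.session.views = n;
  return n;
}

// Simulated requests: a page load, a favicon fetch, another page load.
const ctx = { path: '/', session: {} };
console.log(countView(ctx));                                            // 1
console.log(countView({ path: '/favicon.ico', session: ctx.session })); // null
console.log(countView(ctx));                                            // 2
```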

Real-time multiplayer game latency compensation

To be brief, I am working on a real-time multiplayer game. In my game, the clients send updated position and velocity data to the server at a frequency of 20Hz. In the example code below, I am converting data from a Lua table into a C struct using LuaJIT FFI. It's a nifty way of transporting data over a network:
self.dt_f = self.dt_f + dt
if self.dt_f >= self.tick_f and self.id then
    self.dt_f = self.dt_f - self.tick_f
    local player = self.players[self.id]
    local data = {
        type = packets["player_update_f"],
        id = self.id,
        position_x = player.position.x,
        position_y = player.position.y,
        position_z = player.position.z,
        velocity_x = player.velocity.x,
        velocity_y = player.velocity.y,
        velocity_z = player.velocity.z,
    }
    local struct = cdata:set_struct("player_update_f", data)
    local encoded = cdata:encode(struct)
    self.client:send(encoded)
end
When the server receives the packet, it tries to adjust the data to compensate for the latency between that particular client and itself:
local player = self.players[id]
player.position = update.position or player.position
player.velocity = update.velocity or player.velocity
local server = self.server.connection.socket
local peer = server:get_peer(id)
local ping = peer:round_trip_time() / 2 / 1000
player.position = player.position + player.velocity * ping
Once the data is normalized, it then broadcasts the updated position info to all other clients:
local data = {
    type = packets["player_update_f"],
    id = id,
    position_x = player.position.x,
    position_y = player.position.y,
    position_z = player.position.z,
    velocity_x = player.velocity.x,
    velocity_y = player.velocity.y,
    velocity_z = player.velocity.z,
}
local struct = cdata:set_struct("player_update_f", data)
local encoded = cdata:encode(struct)
self.server:send(encoded)
When the other clients finally get the packet, they adjust the data based on their latency with the server:
if id ~= self.id then
    self.players[id] = self.players[id] or {}
    self.players[id].position = update.position or self.players[id].position
    self.players[id].velocity = update.velocity or self.players[id].velocity
    local ping = self.client.connection.peer:round_trip_time() / 2 / 1000
    self.players[id].position = self.players[id].position + self.players[id].velocity * ping
end
So herein lies the problem: the objects are very jittery. Every time I receive a packet, the other players warp a little forward or a little backward, so my latency compensation seems to be off, which throws off my interpolation. Perhaps someone could point out an obvious flaw in my code, or in my understanding of how the process works?
For smooth animation, your server side position updates should take place on a fixed clock using the current values stored for your position/velocity vectors.
When a client update is received you need to make two calculations for the next tick:
First, using the client's vector, find the position the player should be at on the next tick.
Next, calculate a new vector for the server side player to reach that position and use that value to update the server velocity.
The server will then update all client positions in a very uniform way on the next tick. You can smooth the direction changes further by simply projecting 2 or more ticks into the future. The goal when trying to compensate for latency is really just to fall within an acceptable margin of error.
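The two calculations above can be sketched numerically. This is a 1-D illustration (the same formula applies per vector component); the function name and the numbers are assumptions for the sketch:

```javascript
// Given a client update (position + velocity), the estimated one-way latency,
// and the server tick length: (1) project where the player should be on the
// next tick, (2) derive the server-side velocity that carries the currently
// stored server position to that point in one tick.
function reconcile(serverPos, clientPos, clientVel, halfRttSec, tickSec) {
  const target = clientPos + clientVel * (halfRttSec + tickSec); // step 1
  const serverVel = (target - serverPos) / tickSec;              // step 2
  return { target, serverVel };
}

// Player reported at x=10 moving at 5 u/s; 100 ms one-way latency, 50 ms tick.
const r = reconcile(9.8, 10, 5, 0.1, 0.05);
console.log(r.target);    // ≈ 10.75
console.log(r.serverVel); // ≈ 19: briefly faster to catch up, then settles near 5
```

Projecting 2 or more ticks ahead, as suggested above, just means using `halfRttSec + 2 * tickSec` when computing the target.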

Performing Bulk Data Transactions with Salesforce using .NET C#

I am new to Salesforce (3 months).
So far I have been able to create an application in C# that I can use to perform inserts and updates to the Salesforce database, one transaction at a time.
Now I need to perform large-scale transactions, for example updating thousands of records at a time. Doing them one by one would quickly put us over our allotted API calls per 24-hour period.
I want to utilize the available bulk transaction process to cut down on the number of API calls. So far I have not had much luck coding this, nor have I found any documentation for it.
If anyone could provide some generic examples or steer me to reliable documentation on the subject, I would greatly appreciate it.
FYI, the data I need for the updates and inserts comes from an IBM Unidata database sitting on an AIX machine, so direct web-service communication is not really possible. Getting the data out of Unidata was my headache; I have that worked out. Now the Bulk API to Salesforce is my new headache.
Thanks in advance.
Jeff
You don't mention which API you're currently using, but with the SOAP Partner or Enterprise APIs you can write records to Salesforce 200 at a time (the create/update/upsert calls all take an array of SObjects).
Using the bulk API you can send data in chunks of thousands of rows at a time.
You can find the documentation for both sets of APIs here
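Whichever API you use, the client-side work starts with slicing the record set into API-sized chunks: 200 records per SOAP create()/update()/upsert() call, and up to 10,000 rows per Bulk API batch. The slicing itself can be sketched like this (JavaScript for brevity; the record objects are placeholders):

```javascript
// Split a large record set into API-sized batches.
function chunk(records, size) {
  const batches = [];
  for (let i = 0; i < records.length; i += size) {
    batches.push(records.slice(i, i + size));
  }
  return batches;
}

// 450 records at the SOAP limit of 200 per call -> 3 calls instead of 450.
const records = Array.from({ length: 450 }, (_, i) => ({ Id: i }));
const soapBatches = chunk(records, 200);
console.log(soapBatches.length);    // 3
console.log(soapBatches[2].length); // 50
```

Each batch then goes into one create() call (or one Bulk API batch), which is what keeps you under the 24-hour API call limit.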
The answers already given are a good start; however, are you sure you need to actually write a custom app that uses the bulk API? The salesforce data loader is a pretty robust tool, includes a command line interface, and can use either the "normal" or bulk data API's. Unless you are needing to do fancy logic as part of your insert/updates, or some sort of more real-time / on-demand loading, the data loader is going to be a better option than a custom app.
(this is the SOAP code though, not the Salesforce "Bulk API" ; careful not to confuse the two)
The code below might provide clear insight into how to do a bulk insertion.
/// Demonstrates how to create one or more Account records via the API
public void CreateAccountSample()
{
    Account account1 = new Account();
    Account account2 = new Account();

    // Set some fields on the account1 object. Name field is not set
    // so this record should fail as it is a required field.
    account1.BillingCity = "Wichita";
    account1.BillingCountry = "US";
    account1.BillingState = "KA";
    account1.BillingStreet = "4322 Haystack Boulevard";
    account1.BillingPostalCode = "87901";

    // Set some fields on the account2 object
    account2.Name = "Golden Straw";
    account2.BillingCity = "Oakland";
    account2.BillingCountry = "US";
    account2.BillingState = "CA";
    account2.BillingStreet = "666 Raiders Boulevard";
    account2.BillingPostalCode = "97502";

    // Create an array of SObjects to hold the accounts
    sObject[] accounts = new sObject[2];
    // Add the accounts to the SObject array
    accounts[0] = account1;
    accounts[1] = account2;

    // Invoke the create() call
    try
    {
        SaveResult[] saveResults = binding.create(accounts);

        // Handle the results
        for (int i = 0; i < saveResults.Length; i++)
        {
            // Determine whether create() succeeded or had errors
            if (saveResults[i].success)
            {
                // No errors, so retrieve the Id created for this record
                Console.WriteLine("An Account was created with Id: {0}",
                    saveResults[i].id);
            }
            else
            {
                Console.WriteLine("Item {0} had an error updating", i);
                // Handle the errors
                foreach (Error error in saveResults[i].errors)
                {
                    Console.WriteLine("Error code is: {0}",
                        error.statusCode.ToString());
                    Console.WriteLine("Error message: {0}", error.message);
                }
            }
        }
    }
    catch (SoapException e)
    {
        Console.WriteLine(e.Code);
        Console.WriteLine(e.Message);
    }
}
Below is a small piece of code which may help you insert data into Salesforce objects using C# and the WSDL APIs. I struggled quite a bit to write it. I assign fields by direct index after splitting; you can adapt this to your own approach.
I split the columns using | (the pipe sign). You may change this, and also the row and column separators (<br>, \n, etc.).
This means you can enter N rows from your HTML/text file. I wrote the program to add orders placed by my designers, who put the orders on another website; the program fetches the data from the e-commerce website, which has no Salesforce interface for adding or viewing the order records. I created a custom object for this and added the following columns to it.
Your suggestions are welcome.
private SforceService binding; // declare the Salesforce service using your access credentials

try
{
    string stroppid = "111111111111111111";
    System.Net.HttpWebRequest fr;
    Uri targetUri = new Uri("http://abc.xyz.com/test.html");
    fr = (System.Net.HttpWebRequest)System.Net.HttpWebRequest.Create(targetUri);
    if ((fr.GetResponse().ContentLength > 0))
    {
        System.IO.StreamReader str = new System.IO.StreamReader(fr.GetResponse().GetResponseStream());
        string allrow = str.ReadToEnd();
        string stringSeparators = "<br>";
        string[] row1 = Regex.Split(allrow, stringSeparators);
        // rows 1 .. row1.Length - 2 hold data, so the array needs row1.Length - 2 slots
        CDI_Order_Data__c[] cord = new CDI_Order_Data__c[row1.Length - 2];
        for (int i = 1; i < row1.Length - 1; i++)
        {
            string colstr = row1[i].ToString();
            string[] allcols = Regex.Split(colstr, "\\|");
            CDI_Order_Data__c rec = new CDI_Order_Data__c(); // very important to create the object
            rec.Opportunity_Job_Order__c = stroppid;
            rec.jobid__c = stroppid;
            rec.order__c = allcols[0].ToString();
            rec.firstname__c = allcols[1].ToString();
            rec.name__c = allcols[2].ToString();
            DateTime dtDate = Convert.ToDateTime(allcols[3]);
            rec.Date__c = new DateTime(dtDate.Year, dtDate.Month, dtDate.Day, 0, 0, 0);
            rec.clientpo__c = allcols[4].ToString();
            rec.billaddr1__c = allcols[5].ToString();
            rec.billaddr2__c = allcols[6].ToString();
            rec.billcity__c = allcols[7].ToString();
            rec.billstate__c = allcols[8].ToString();
            rec.billzip__c = allcols[9].ToString();
            rec.phone__c = allcols[10].ToString();
            rec.fax__c = allcols[11].ToString();
            rec.email__c = allcols[12].ToString();
            rec.contact__c = allcols[13].ToString();
            rec.lastname__c = allcols[15].ToString();
            rec.Rep__c = allcols[16].ToString();
            rec.sidemark__c = allcols[17].ToString();
            rec.account__c = allcols[18].ToString();
            rec.item__c = allcols[19].ToString();
            rec.kmatid__c = allcols[20].ToString();
            rec.qty__c = Convert.ToDouble(allcols[21]);
            rec.Description__c = allcols[22].ToString();
            rec.price__c = Convert.ToDouble(allcols[23]);
            rec.installation__c = allcols[24].ToString();
            rec.freight__c = allcols[25].ToString();
            rec.discount__c = Convert.ToDouble(allcols[26]);
            rec.salestax__c = Convert.ToDouble(allcols[27]);
            rec.taxcode__c = allcols[28].ToString();
            cord[i - 1] = rec; // fill from index 0 so no slot is left null
        }
        try
        {
            SaveResult[] saveResults = binding.create(cord);
        }
        catch (Exception ce)
        {
            Response.Write("Bulk order update error: " + ce.Message.ToString());
            Response.End();
        }
        if (str != null) str.Close();
    }
