Use GeoJson data with Leaflet.PixiOverlay - reactjs

I am using Leaflet.PixiOverlay (this package) in my app in order to render a large amount of data (such as markers, shapes, GeoJSON data, etc.) without performance issues.
But the documentation of this package does not explain how to use data in GeoJSON format to draw on a Leaflet map.
Does anybody know how to do that?
Or is there a better way to get good performance with large amounts of data on Leaflet maps?
GeoJson Data Sample:
"jsonData": [
{ "type": "Feature",
"geometry": {"type": "Point", "coordinates": [102.0, 0.5]},
"properties": {"prop0": "value0"}
},
{ "type": "Feature",
"geometry": {
"type": "LineString",
"coordinates": [
[102.0, 0.0], [103.0, 1.0], [104.0, 0.0], [105.0, 1.0]
]
},
"properties": {
"prop0": "value0",
"prop1": 0.0
}
}]
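For reference, here is a minimal sketch of the kind of draw callback I have in mind, based on the package's generic L.pixiOverlay(drawCallback, container) usage (Point features only; rebuilding the graphics on every redraw is illustrative, a real implementation would cache them):
var pixiContainer = new PIXI.Container();
var overlay = L.pixiOverlay(function (utils) {
  var container = utils.getContainer();
  var renderer = utils.getRenderer();
  var scale = utils.getScale();
  container.removeChildren();
  jsonData.forEach(function (feature) {
    if (feature.geometry.type === 'Point') {
      // GeoJSON stores [lng, lat]; Leaflet expects [lat, lng].
      var coords = feature.geometry.coordinates;
      var point = utils.latLngToLayerPoint([coords[1], coords[0]]);
      var circle = new PIXI.Graphics();
      circle.beginFill(0x3388ff);
      circle.drawCircle(point.x, point.y, 5 / scale); // keep ~5px radius at any zoom
      circle.endFill();
      container.addChild(circle);
    }
  });
  renderer.render(container);
}, pixiContainer);
overlay.addTo(map);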

Related

JSONSchema keyword "type" when encased inside "items" fails to validate

I'm trying to write a JSON validator to check files before runtime, but there's a really odd issue happening with using "type".
I know "type" is a reserved word, but JSON Schema doesn't have an issue with it as a property name, as I found in this other question: Key values of 'key' and 'type' in json schema.
The solution to their problem was encasing "type": { "type": "string" } inside "properties", and it does work. However, my implementation requires it to be inside an array. Here's the snippet of my code:
{
  "type": "object",
  "additionalProperties": false,
  "properties": {
    "method": {
      "type": "array",
      "items": {
        "type": {
          "type": "string"
        },
        "name": {
          "type": "string"
        },
        "provider": {
          "type": "array"
        }
      }
    }
  }
}
Oddly enough, VS Code doesn't have a problem with it in a file when it's isolated, but when it's included in the main code, it doesn't like it and yields no solution. Regardless, validating it with Python yields:
...
raise exceptions.SchemaError.create_from(error)
jsonschema.exceptions.SchemaError: {'type': 'string'} is not valid under any of the given schemas
Failed validating 'anyOf' in metaschema['allOf'][1]['properties']['properties']['additionalProperties']['$dynamicRef']['allOf'][1]['properties']['items']['$dynamicRef']['allOf'][3]['properties']['type']:
{'anyOf': [{'$ref': '#/$defs/simpleTypes'},
{'items': {'$ref': '#/$defs/simpleTypes'},
'minItems': 1,
'type': 'array',
'uniqueItems': True}]}
On schema['properties']['method']['items']['type']:
{'type': 'string'}
What further confuses me is that https://www.jsonschemavalidator.net/ tells me
Expected array or string for 'type', got StartObject. Path 'properties.method.items.type', line 8, position 17.
yet JSON Schema Faker is able to generate a fake file without any problems. The generated fake JSON also returns the same error when validated with Python and JSONSchemaValidator.
I'm a beginner, and any help or insight will be greatly appreciated. Thanks for your time.
Edit: here's the snippet of the input data as requested.
{
  ...
  "method": [
    {
      "type": "action",
      "name": "name of the chaos experiment to use here",
      "provider": [
      ]
    }
  ]
}
Arrays don't have properties; arrays have items. The schema as you have included it is not valid; are you sure you don't mean to have this?
{
  "type": "object",
  "additionalProperties": false,
  "properties": {
    "method": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "type": {
            "type": "string"
          },
          "name": {
            "type": "string"
          },
          "provider": {
            "type": "array"
          }
        }
      }
    }
  }
}
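As a quick sanity check (an illustrative sketch in JavaScript with the ajv package, not the asker's Python setup), the corrected schema compiles and accepts the sample input:
const Ajv = require('ajv');

// Corrected schema: "items" is now an object schema, so "type" is just a property name.
const schema = {
  type: 'object',
  additionalProperties: false,
  properties: {
    method: {
      type: 'array',
      items: {
        type: 'object',
        properties: {
          type: { type: 'string' },
          name: { type: 'string' },
          provider: { type: 'array' }
        }
      }
    }
  }
};

const data = { method: [{ type: 'action', name: 'example', provider: [] }] };

const validate = new Ajv().compile(schema); // throws if the schema itself is invalid
console.log(validate(data));                // true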

Convert JSON file from flat to nested arrays in Azure Data Factory

I'm trying to copy data from my Oracle DB into a Search Index, and I'm using Azure Data Factory to copy the data from Oracle to Azure Blob Storage.
How can I use it to import the data as a nested JSON file?
Right now, after querying Oracle, I get data like this:
[{"BOOKING_ID":1.0,"REFERENCES":"ABC00001","ROUTES":{"ROUTE":1.0,"DESTINATION":"Atlanta, USA","ORIGIN":"New York, USA"}}
,{"BOOKING_ID":2.0,"REFERENCES":"ABC00322","ROUTES":{"ROUTE":2.0,"DESTINATION":"Las Vegas, USA","ORIGIN":"Los Angeles, USA"}}
,{"BOOKING_ID":3.0,"REFERENCES":"ABC32322","ROUTES":{"ROUTE":3.0,"DESTINATION":"Berlin, GER","ORIGIN":"Moscow, RUS"}}
,{"BOOKING_ID":4.0,"REFERENCES":"ABC543345","ROUTES":{"ROUTE":4.0,"DESTINATION":"Rome, ITA","ORIGIN":"Bejin, CHN"}}
,{"BOOKING_ID":5.0,"REFERENCES":"ABC51145","ROUTES":{"ROUTE":5.0,"DESTINATION":"Warsaw, POL","ORIGIN":"Copenhagen, DEN"}}
,{"BOOKING_ID":5.0,"REFERENCES":"ABC51145","ROUTES":{"ROUTE":6.0,"DESTINATION":"Copenhaged, DEN","ORIGIN":"Paris, FRA"}}
,{"BOOKING_ID":5.0,"REFERENCES":"ABC51145","ROUTES":{"ROUTE":7.0,"DESTINATION":"Paris, FRA","ORIGIN":"Madrid, ESP"}}
]
but I need data like this:
[
  {
    "BOOKING_ID": 1.0,
    "REFERENCES": "ABC00001",
    "ROUTES": [
      {
        "ROUTE": 1.0,
        "DESTINATION": "Atlanta, USA",
        "ORIGIN": "New York, USA"
      }
    ]
  },
  {
    "BOOKING_ID": 2.0,
    "REFERENCES": "ABC00322",
    "ROUTES": [
      {
        "ROUTE": 2.0,
        "DESTINATION": "Las Vegas, USA",
        "ORIGIN": "Los Angeles, USA"
      }
    ]
  },
  {
    "BOOKING_ID": 3.0,
    "REFERENCES": "ABC32322",
    "ROUTES": [
      {
        "ROUTE": 3.0,
        "DESTINATION": "Berlin, GER",
        "ORIGIN": "Moscow, RUS"
      }
    ]
  },
  {
    "BOOKING_ID": 4.0,
    "REFERENCES": "ABC543345",
    "ROUTES": [
      {
        "ROUTE": 4.0,
        "DESTINATION": "Rome, ITA",
        "ORIGIN": "Bejin, CHN"
      }
    ]
  },
  {
    "BOOKING_ID": 5.0,
    "REFERENCES": "ABC51145",
    "ROUTES": [
      {
        "ROUTE": 5.0,
        "DESTINATION": "Warsaw, POL",
        "ORIGIN": "Copenhagen, DEN"
      },
      {
        "ROUTE": 6.0,
        "DESTINATION": "Copenhaged, DEN",
        "ORIGIN": "Paris, FRA"
      },
      {
        "ROUTE": 7.0,
        "DESTINATION": "Paris, FRA",
        "ORIGIN": "Madrid, ESP"
      }
    ]
  }
]
UPDATE
I use Azure Functions with lodash, but now I'm trying to read the JSON from Azure Blob Storage. The problem is that when I try to read the JSON, I get results like this:
"type": "Buffer",
"data": [
239,
187,
191,
91,
123,
...
and all the data is in byte form.
Your requirement is to group by BOOKING_ID and merge the ROUTES objects into one array. That can't be implemented in the copy activity directly.
Two ideas:
1. Use a Web Activity plus an Azure Function Activity. In the Web Activity, encapsulate the query in a REST API that returns the flat JSON data. Pass the output of the Web Activity into the Azure Function Activity. In the Azure Function, loop over the flat JSON array to build the nested arrays you need, then configure the output of the Azure Function to Azure Blob Storage (please refer to this link). A sketch of the grouping step follows this list.
2. Use a Custom Activity. You can execute scripts in an Azure Batch job running on a VM. For example, you could use the cx_Oracle package to query the JSON data ordered by BOOKING_ID, then loop over the result in Python and convert it to the shape you need.
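For idea 1, a minimal sketch of the grouping step with lodash (variable names like flat are illustrative, not from the original pipeline). Regarding the UPDATE: the leading bytes 239, 187, 191 are the UTF-8 byte-order mark, so the blob content is a serialized Buffer that needs to be decoded to a string before parsing.
const _ = require('lodash');

// `flat` is the flat array returned by the Oracle query.
function nestRoutes(flat) {
  return _.map(_.groupBy(flat, 'BOOKING_ID'), function (rows) {
    return {
      BOOKING_ID: rows[0].BOOKING_ID,
      REFERENCES: rows[0].REFERENCES,
      ROUTES: _.map(rows, 'ROUTES') // collect each row's ROUTES object into one array
    };
  });
}

// If the blob arrives as a serialized Buffer, decode it first:
// const flat = JSON.parse(buffer.toString('utf8'));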

Apache Nifi: Parse data with UpdateRecord Processor

I'm trying to parse some data in NiFi (1.7.1) using the UpdateRecord processor.
The original data is in JSON files, which I would like to convert to Avro, based on a schema.
The Avro conversion is OK, but during that conversion I also need to parse one array element from the JSON data into a different structure in Avro.
This is a sample data of the input json:
{ "geometry" : {
"coordinates" : [ [ 4.963087975800593, 45.76365595859971 ], [ 4.962874487781098, 45.76320922779652 ], [ 4.962815443439148, 45.763116079159374 ], [ 4.962744732112515, 45.763010484202866 ], [ 4.962096825239138, 45.762112721939246 ] ]} ...}
This is its schema (specified in the RecordReader):
{ "type": "record",
"name": "features",
"fields": [
{
"name": "geometry",
"type": {
"type": "record",
"name": "geometry",
"fields": [
{
"name": "coordinatesJson",
"type": {
"type": "array",
"items": {
"type": "array",
"items": "double"
}
}
},
]
}
},
....
]
}
As you can see, coordinates is an array of arrays.
And I need to convert that data to Avro, based on this schema (specified in the RecordWriter):
{
"name": "outputdata",
"type": "record",
"fields": [
{"name": "coordinatesAvro",
"type": {
"type": "array",
"items" : {
"type" : "record",
"name" : "coordinatesAvro",
"fields" : [ {
"name" : "X",
"type" : "double"
}, {
"name" : "Y",
"type" : "double"
} ]
}
}
},
.....
]
}
The problem here is that I'm not able to map coordinatesJson to coordinatesAvro using RecordPath functions.
I tried several mappings, like:
Property: Value:
/coordinatesJson[0..-1]/X /geometry/coordinatesAvro[*][0]
/coordinatesJson[0..-1]/Y /geometry/coordinatesAvro[*][1]
It should be a pretty straightforward parsing step, but as I said, I've been going in circles trying to achieve this for a while.
Any help would be really appreciated.
When I run into something like that, I do the following:
1) Transform the JSON into JSON with the structure I need (in your case: coordinatesAvro) using the ExecuteScript processor. I have used ECMAScript because you can simply parse the JSON and work with the objects directly (a sketch of the transform is below).
2) ConvertJSONToAvro with one common schema (coordinatesAvro in your case) for the Reader and Writer.
It works very well and I have used it on big-data cases. This is one possible resolution for your problem.
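An illustrative sketch of the transform from step 1 in plain JavaScript (the function name is hypothetical; a full ExecuteScript body would also read and write the flow file content):
// [[x, y], ...]  ->  [{ "X": x, "Y": y }, ...]
function toCoordinatesAvro(coordinatesJson) {
  return coordinatesJson.map(function (pair) {
    return { X: pair[0], Y: pair[1] };
  });
}

// Example:
// toCoordinatesAvro([[4.9630, 45.7636], [4.9628, 45.7632]])
// => [{ X: 4.9630, Y: 45.7636 }, { X: 4.9628, Y: 45.7632 }]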

Leaflet - Convert lat/lng to standard projection

How do I convert coordinates from Leaflet's coordinate system to the coordinate system Google uses (WGS-84?), if the data is in an external file (GeoJSON)?
In the example external GeoJSON file, I've defined coordinates for Paris and Zagreb, and I'm looking for a solution to transform these coordinates to their accurate locations :)
{
"type": "FeatureCollection",
"features": [
{
"type": "Feature",
"id": "par",
"properties": {
"name": "Paris"
},
"geometry": {
"type": "Point",
"coordinates": [
48.858093,
2.294694
]
}
},
{
"type": "Feature",
"id": "zg",
"properties": {
"name": "Zagreb"
},
"geometry": {
"type": "Point",
"coordinates": [
45.815399,
15.966568
]
}
}
]
}
There is the Proj4js JavaScript library, but I cannot find a similar example for this case (with an external file).
Your GeoJSON can be used directly by Leaflet without converting the lat/lng.
I use Google Maps to get the GPS lat/lng of some point and use them in Leaflet without conversion.
// Edit
For me, Leaflet and Google Maps use the same projection.
//EDIT 2
Html :
<div id="map" class="leaflet-container leaflet-fade-anim"></div>
JS :
// Create the map and add an OSM tile layer.
var map = L.map("map", {
  center: [0, 0],
  zoom: 1, minZoom: 1, maxZoom: 18
});
var base_maps = [];
var layer_OSM = new L.TileLayer('http://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', {
  noWrap: true,
  // continuousWorld: true,
  attribution: '© OpenStreetMap contributors',
  unloadInvisibleTiles: true
});
base_maps['OpenStreetMap'] = layer_OSM;
map.addLayer(layer_OSM);

// Add a marker from plain WGS-84 lat/lng values
// (p_lat and p_lon are placeholders for your own coordinates).
var markersLayer = new L.FeatureGroup();
var marker = L.marker([p_lat, p_lon]);
markersLayer.addLayer(marker);
map.addLayer(markersLayer);
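One caveat worth noting: the GeoJSON spec stores coordinates as [longitude, latitude], while the sample file above writes Paris and Zagreb as [latitude, longitude]. If the markers land in the wrong place, the order can be swapped at load time via Leaflet's coordsToLatLng option (a minimal sketch, assuming the FeatureCollection has been loaded into a variable named data):
L.geoJson(data, {
  // The file stores [lat, lng], so read the pair in that order instead of
  // the spec's default [lng, lat] interpretation.
  coordsToLatLng: function (coords) {
    return L.latLng(coords[0], coords[1]);
  }
}).addTo(map);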

Angular Leaflet geojson custom marker

Is it possible to define your own marker icons in GeoJSON?
I have tried many ways to get the desired effect, but nothing works.
Example code from the GeoJSON FeatureCollection where I want to add a custom icon:
{
  "type": "Feature",
  "id": "Point1",
  "properties": {
    "name": "Last point"
  },
  "geometry": {
    "type": "Point",
    "coordinates": [22.57031047873893, 51.25080964529834]
  }
}
Mapbox has created a CustomMarker plugin for Leaflet that should do the trick.
And there's another great example from Mapbox, GeoJSON Custom Markers and Style.
Here's some sample code from that site:
var geoJsonData = {
"type": "FeatureCollection",
"features": [{
"type": "Feature",
"properties": {
"fillColor": "#eeffee",
"fillOpacity": 0.8
},
"geometry": {
"type": "Polygon",
"coordinates": [
[
[119.2895599, 21.718679],
[119.2895599, 25.373809],
[122.61840, 25.37380917],
[122.61840, 21.71867980],
[119.2895599, 21.718679]
]
]
}
}, {
"type": "Feature",
"properties": {
"marker-color": "#00ff00"
},
"geometry": {
"type": "Point",
"coordinates": [120.89355, 23.68477]
}
}]
};
var geoJson = L.geoJson(geoJsonData, {
pointToLayer: L.mapbox.marker.style,
style: function(feature) { return feature.properties; }
}).addTo(map);
NOTE: This is not part of the core LeafletJS library; it requires mapbox.js (and mapbox.css).
If you update to the latest version (at the time of writing), the pointToLayer callback in your geojson object will start to work. It doesn't seem to be implemented in 0.7.7, but it is currently available on master, and I assume it will be in 0.8+.
If you just assign a GeoJSON definition to your scope's geojson property with the function present, Leaflet will render it properly now.
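In plain Leaflet (no Mapbox plugin), the same effect comes from combining pointToLayer with L.icon; a minimal sketch (the icon URL and sizes are placeholder values):
// Hypothetical icon; point iconUrl at your own image.
var customIcon = L.icon({
  iconUrl: 'my-marker.png',
  iconSize: [25, 41],   // width, height in pixels
  iconAnchor: [12, 41]  // tip of the icon relative to its top-left corner
});

L.geoJson(geoJsonData, {
  pointToLayer: function (feature, latlng) {
    return L.marker(latlng, { icon: customIcon });
  }
}).addTo(map);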
