AnyChart 'undefined is not an object' error occurs when touching the graph on a mobile device - anychart

When viewing an AnyChart graph on a mobile device, touching the graph throws an 'undefined is not an object' error. This occurs on iOS. Does anyone have an idea what causes this and how to resolve it? I am using the latest release, 8.9.0.
I am using iOS 14.4.2 in the Chrome browser, though the same happens in Safari.
Below is the base code for generating the chart:
this.chart = anychart.stock();
var credits = this.chart.credits();
credits.enabled(false);
this.plot = this.chart.plot(0);
this.plot.yScale().ticks().allowFractional(false);
this.plot.yScale().maximumGap(0);
this.plot.yScale().minimumGap(0);
let activeSeries = this.seriesTypeOptions.find(opt => opt.checked);
if (this.selectedRange === 'intra' && activeSeries.ohlc !== undefined) {
  activeSeries = this.seriesTypeOptions.find(opt => opt.value === 'area');
}
let data = activeSeries.ohlc ? this.ohlcMapping : this.valueMapping;
this.series = this.plot[activeSeries.value](data);
this.series.name(this.longName + ' (' + this.symbol + ')');
if (this.mainColor && this.mainColor !== '') {
  this.series.stroke(this.mainColor);
  this.series.fill(this.mainColor, 0.5);
}
let grouping = this.chart.grouping();
grouping.maxVisiblePoints(700);
if (this.selectedRange !== 'intra') {
  switch (this.groupOptions) {
    case 'day': {
      grouping.levels([
        { unit: 'day', count: 1 },
      ]);
      grouping.forced(true);
      break;
    }
    case 'week': {
      grouping.levels([
        { unit: 'week', count: 1 },
      ]);
      grouping.forced(true);
      break;
    }
    case 'month': {
      grouping.levels([
        { unit: 'month', count: 1 },
      ]);
      grouping.forced(true);
      break;
    }
    default: {
      break;
    }
  }
}
this.plot.legend().titleFormat('');
this.plot.yAxis().labels().format("{%value}{decimalsCount:0, groupsSeparator:}");
this.plot.crosshair().yLabel().offsetX(-24);
this.plot.xAxis().labels(true);
this.plot.xAxis().minorLabels(true);
this.plot.xAxis().ticks(true);
this.plot.xAxis().minorTicks(true);
this.plot.yGrid().enabled(true);
if (!this.isIndex) {
  let volPlot = this.chart.plot(1);
  volPlot.legend().titleFormat('');
  let volumeSeries = volPlot.column(this.volumeMapping);
  volumeSeries.name('Volume ' + ' (' + this.symbol + ')');
  volPlot.crosshair().yLabel().offsetX(-24);
  volPlot.height('25%');
  volPlot.xAxis().labels(false);
  volPlot.xAxis().minorLabels(false);
  volPlot.xAxis().ticks(false);
  volPlot.yGrid().enabled(true);
  if (this.mainColor && this.mainColor !== '') {
    volumeSeries.stroke(this.mainColor);
    volumeSeries.fill(this.mainColor, 0.5);
  }
  volPlot.yAxis().labels().format("{%value}{decimalsCount:1, scale: (1)(1000)(1000)(1000)|()(K)(M)(B)}");
  volPlot.enabled(this.isVolume);
}
// add indicators
if (this.selectedRange !== 'intra') {
  let ind = this.indicatorOptions.filter(o => o.checked);
  ind.forEach(i => this.addIndicator(i.type));
}
var scroller = this.chart.scroller();
scroller.area(this.scrollMapping);
scroller.listen('scrollerchange', () => {
  let mv = this.chart.getSelectedRange();
  this.rangeStartDate = new Date(mv.firstVisible);
  this.rangeStopDate = new Date(mv.lastVisible);
});
scroller.enabled(this.isScroller);
let offSet: number = (this.isInfo) ? 425 : 275;
if (this.container !== undefined) {
  this.container.nativeElement.style.height = (window.innerHeight - offSet) + 'px';
  this.chart.container(this.container.nativeElement);
}
// draw chart
this.chart.draw();
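For reference, the mappings used above (this.valueMapping, this.ohlcMapping, this.volumeMapping, this.scrollMapping) are assumed to come from an AnyChart data table; a minimal sketch of how they would typically be built (the rows variable and the column indexes are illustrative):
let table = anychart.data.table(0);  // column 0 holds the timestamp
table.addData(rows);                 // e.g. rows of [timestamp, open, high, low, close, volume]
this.ohlcMapping = table.mapAs({ open: 1, high: 2, low: 3, close: 4 });
this.valueMapping = table.mapAs({ value: 4 });   // close price as a single value
this.volumeMapping = table.mapAs({ value: 5 });
this.scrollMapping = this.valueMapping;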

In case someone encounters the same issue: ignoring the exception thrown by the chart on a mobile device resolved it for me.
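For anyone on 8.9.0 who cannot upgrade yet, a minimal sketch of what ignoring the exception can look like: a global error filter that swallows only this particular message (the message check is an assumption based on the error text above), so other errors still surface:
window.addEventListener('error', function (event) {
  // Suppress only the known AnyChart tap error on mobile devices
  if (event.message && event.message.indexOf('undefined is not an object') !== -1) {
    event.preventDefault();
    return false;
  }
});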

We are glad to notify you that a new version of the AnyChart library, v8.11.0, is available.
The Stock chart no longer throws errors when the user taps it.
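Upgrading is therefore the cleanest fix; a minimal sketch of the npm route (adjust if you load AnyChart from CDN script tags instead, where the version appears in the script URL):
npm install anychart@8.11.0
After reinstalling and rebuilding, the chart code above should not need any changes.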

Related

Is there any way I can pick text from an image (OCR) in React Native?

I want to extract the text shown in the image (my valve serial number). I have tried several libraries:
https://github.com/ashrithks/rn-text-detector
https://github.com/zsajjad/react-native-text-detector
They work on Android but not on iOS; when building for iOS I get this error:
CompileC /Users/Macbook/Library/Developer/Xcode/DerivedData/CRANE-ciqqhizauxjmlaevomkjkiypelyu/Build/Intermediates.noindex/Pods.build/Debug-iphonesimulator/RNTextDetector.build/Objects-normal/x86_64/RNTextDetector.o /Users/Macbook/Documents/GitHub/crane-app/CRANE/node_modules/rn-text-detector/ios/RNTextDetector.m normal x86_64 objective-c com.apple.compilers.llvm.clang.1_0.compiler (in target 'RNTextDetector' from project 'Pods')
(1 failure)
The code I used is below:
// Assumes react-native-image-picker (launchCamera) and react-native-fs (RNFS);
// the permission helpers and state setters come from the component and are not shown here.
import { launchCamera } from "react-native-image-picker";
import RNFS from "react-native-fs";
import RNTextDetector from "rn-text-detector";

const captureImage = async type => {
  let options = {
    mediaType: type,
    maxWidth: 300,
    maxHeight: 550,
    quality: 1,
    videoQuality: 'low',
    durationLimit: 30, // video max duration in seconds
    saveToPhotos: true,
  };
  let isCameraPermitted = await requestCameraPermission();
  let isStoragePermitted = await requestExternalWritePermission();
  if (isCameraPermitted && isStoragePermitted) {
    launchCamera(options, response => {
      console.log('Response = ', response);
      if (response.didCancel) {
        alert('User cancelled camera picker');
        return;
      } else if (response.errorCode == 'camera_unavailable') {
        alert('Camera not available on device');
        return;
      } else if (response.errorCode == 'permission') {
        alert('Permission not satisfied');
        return;
      } else if (response.errorCode == 'others') {
        alert(response.errorMessage);
        return;
      }
      console.log('base64 -> ', response.assets[0].base64);
      console.log('uri -> ', response.assets[0].uri);
      console.log('width -> ', response.assets[0].width);
      console.log('height -> ', response.assets[0].height);
      console.log('fileSize -> ', response.assets[0].fileSize);
      console.log('type -> ', response.assets[0].type);
      console.log('fileName -> ', response.assets[0].fileName);
      setFilePath(response.assets[0].uri);
      (async () => {
        const textRecognition = await RNTextDetector.detectFromUri(response.assets[0].uri);
        console.log('Base64_Image', textRecognition);
        setSerialNumber(textRecognition[0].text);
      })();
      RNFS.readFile(response.assets[0].uri, "base64").then(result => {
        // setSingleFileBase64('data:image/png;base64,' + result)
        setImage('data:/' + response.assets[0].type + ';' + 'base64,' + result);
        // console.log('Base64_Image', 'data:/' + response.assets[0].type + ';' + 'base64,' + result);
        setScreenData({ ...screenData, image1: 'data:/' + response.assets[0].type + ';' + 'base64,' + result });
      });
    });
  }
};
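One small hardening worth adding around the OCR call above (a sketch; if nothing is recognised, textRecognition[0].text would throw, which is easy to hit with low-resolution captures):
const textRecognition = await RNTextDetector.detectFromUri(response.assets[0].uri);
if (Array.isArray(textRecognition) && textRecognition.length > 0) {
  setSerialNumber(textRecognition[0].text);
} else {
  console.log('No text detected in the captured image');
}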

How can I implement OCR in React Native? I have implemented it on Android but it does not work on iOS

https://github.com/ashrithks/rn-text-detector
https://github.com/zsajjad/react-native-text-detector
These are the libraries I used in my project. They work on Android but not on iOS; when building for iOS I get the same CompileC error and use the same captureImage code shown in the question above.

Reflections in Three.js

I have a difficult problem to solve in a React-based application written in TypeScript that manages 3D shapes.
The problem is that the reflections don't look right at all. I have tried everything I can think of to improve the code, but nothing has worked.
The workflow is to create a material holder in Blender, export it, and apply it to an empty shape imported from another glTF.
For the non-reflective materials I tried (paper, with a normal map), everything works fine:
(screenshot: with simple paper material)
When the material is reflective (roughness 0.2 and metalness 0.8), this is the result:
(screenshot: with reflective material)
As you can see, the reflected portion of the environment is dramatically wrong, considering that the cube map is a simple placeholder like this:
(screenshot: one of the six images of the cube map)
The code is very simple. Here I create the cube map:
const loader = new THREE.CubeTextureLoader();
const path = `${this.props.baseUrl}backgrounds/${background.value}`;
const ext = background.value === "test" ? "jpg" : "png";
const bkcg = [
  `${path}/py.${ext}`,
  `${path}/nz.${ext}`,
  `${path}/px.${ext}`,
  `${path}/ny.${ext}`,
  `${path}/pz.${ext}`,
  `${path}/nx.${ext}`,
];
// const loader = new THREE.TextureLoader();
// const bkcg = `${this.props.baseUrl}backgrounds/test_texture.jpg`;
loader.load(bkcg, async (t) => {
  t.minFilter = THREE.LinearFilter;
  t.magFilter = THREE.LinearFilter;
  t.encoding = THREE.sRGBEncoding;
  t.format = THREE.RGBFormat;
  this.setState({
    background: t,
    bckgLoaded: true,
  });
});
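One thing worth double-checking here: THREE.CubeTextureLoader expects the six URLs in the order positive-x, negative-x, positive-y, negative-y, positive-z, negative-z, whereas the array above starts with py. A sketch of the conventional ordering, using the same file names:
const bkcg = [
  `${path}/px.${ext}`, `${path}/nx.${ext}`,
  `${path}/py.${ext}`, `${path}/ny.${ext}`,
  `${path}/pz.${ext}`, `${path}/nz.${ext}`,
];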
Here I assign the material to the different meshes:
public editMeshes = (
  item: THREE.Mesh,
  isOrderPage: boolean,
  design: IDesign,
  materials: {[name: string]: THREE.MeshStandardMaterial | string},
  textures: ITextures,
  renderer: THREE.WebGLRenderer,
  scene: THREE.Scene,
) => {
  if (item.name === "Ground" || item.name === "X_Ground_Freigabe") {
    const color = isOrderPage ? new THREE.Color("#ffffff") : new THREE.Color("#aaaaaa"); // #8f8f8f
    const groundMaterial = new THREE.MeshBasicMaterial({ color });
    item.material = groundMaterial;
  } else {
    item.scale.x = item.scale.x + 0;
    item.scale.y = item.scale.y + 0;
    item.scale.z = item.scale.z + 0;
    if (Object.keys(design).indexOf(item.name) !== -1 || Object.keys(materials).indexOf(item.name) !== -1) {
      if (Object.keys(design).indexOf(item.name) === -1 && Object.keys(materials).indexOf(item.name) !== -1) {
        if (item.material instanceof THREE.MeshStandardMaterial) {
          item.material.color = new THREE.Color(`${materials[item.name]}`);
        }
      } else {
        if (typeof(materials[item.name]) !== "string") {
          item.material = materials[item.name] as THREE.MeshStandardMaterial;
          if (item.material instanceof THREE.MeshStandardMaterial) {
            if (shinyMaterials.indexOf((item.material as THREE.MeshStandardMaterial).name) !== -1) {
              this.createCubeCamera(item, textures, scene, renderer);
              item.material.needsUpdate = true;
            } else {
              (item.material as THREE.MeshStandardMaterial).map = textures ? textures[item.name] :
                (item.material as THREE.MeshStandardMaterial).map;
              (item.material as THREE.MeshStandardMaterial).envMapIntensity = 4;
              (item.material as THREE.MeshStandardMaterial).needsUpdate = true;
              item.material.flatShading = false;
            }
          }
        }
      }
    } else {
      if (shinyMaterials.indexOf((item.material as THREE.MeshStandardMaterial).name) !== -1) {
        this.createCubeCamera(item, textures, scene, renderer);
      }
    }
  }
}
}
Here I create the camera for the reflections:
public createCubeCamera = (
  item: THREE.Mesh,
  textures: ITextures,
  scene: THREE.Scene,
  renderer: THREE.WebGLRenderer) => {
  const cubeCamera = new THREE.CubeCamera(0.001, 10000, 2048);
  scene.add(cubeCamera);
  cubeCamera.name = `cubeCamera_${item.name}`;
  cubeCamera.position.set(item.position.x, item.position.y, item.position.z);
  item.visible = false;
  cubeCamera.update(renderer, scene);
  item.visible = true;
  const renderCamera = cubeCamera.renderTarget.texture as THREE.CubeTexture; // scene.background as THREE.CubeTexture
  (item.material as THREE.MeshStandardMaterial).map =
    textures ? textures[item.name] : (item.material as THREE.MeshStandardMaterial).map;
  (item.material as THREE.MeshStandardMaterial).envMap = renderCamera;
  (item.material as THREE.MeshStandardMaterial).envMapIntensity = 1;
  (item.material as THREE.MeshStandardMaterial).flatShading = false;
  (item.material as THREE.MeshStandardMaterial).needsUpdate = true;
}
And here I render the scene:
if (this.props.background) {
  scene.background = this.props.background;
  const camera = this.props.mainScene.cameras[0];
  scene.children[0].children.map((item: THREE.Object3D) => {
    if (item instanceof THREE.Light) {
      this.props.mockUtils.editLights(
        item as THREE.SpotLight | THREE.DirectionalLight,
        this.props.scene.lights_intensity,
        scene,
      );
    }
    if (item instanceof THREE.Mesh) {
      this.props.mockUtils.editMeshes(
        item,
        this.props.isOrderPage,
        this.props.design,
        this.props.materials,
        this.props.textures,
        // this.props.background,
        renderer,
        scene,
      );
    }
  });
  this.props.mockUtils.addAmbientLight(scene, this.props.scene.ambient_light);
  scene.children.map((obj: THREE.Object3D) => {
    if (obj instanceof THREE.CubeCamera) {
      obj.update(renderer, scene);
    }
  });
  renderer.render(scene, camera);
  if (this.design.current) {
    this.props.mockUtils.finish(
      renderer,
      FinalImage,
      id,
      `${this.props.baseUrl}products/${this.props.scene.product_name}.jpg`,
      this.props.isOrderPage,
      this.props.scene.zoomFactor,
      this.design.current,
      this.assetsLoaderCallback,
    );
  }
}
}
Can anyone help me understand how to get a correct reflection of the cube map that is applied to the scene background?
Thank you,
Michael Moretti
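A simpler variant worth trying, assuming the goal is only to reflect the static background rather than other objects in the scene, is to reuse the loaded CubeTexture directly as the environment map instead of rendering a CubeCamera (a sketch; t is the texture from the CubeTextureLoader callback above):
scene.background = t;
if (item.material instanceof THREE.MeshStandardMaterial) {
  item.material.envMap = t; // reflect the same cube map the background uses
  item.material.envMapIntensity = 1;
  item.material.needsUpdate = true;
}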

Display more than 100 markers in angularjs google maps with rotation

I have been using ngMap in my AngularJS code for displaying markers. However, with more than 100 markers I noticed a considerable decrease in performance, mainly related to ng-repeat and two-way binding. I would like to add markers with custom HTML elements, similar to CustomMarker, but using ordinary marker objects that can be modified from the controller when required.
Challenges faced:
- I have SVG images that need to be dynamically coloured based on conditions (these SVGs are not single-path, so they did not work well when I used them with Symbol).
- These are vehicle markers and therefore need to support rotation.
I solved this by creating a CustomMarker with an OverlayView and then adding only the markers that are inside the current map bounds, so that the map doesn't lag.
Below is the code snippet with which I achieved it.
createCustomMarkerComponent();

/**
 * options : [Object] : options to be passed on
 *   - position  : [Object] : Google maps latLng object
 *   - map       : [Object] : Google maps instance
 *   - markerId  : [String] : Marker id
 *   - innerHTML : [String] : innerHTML string for the marker
 **/
function CustomMarker(options) {
  var self = this;
  self.options = options || {};
  self.el = document.createElement('div');
  self.el.style.display = 'block';
  self.el.style.visibility = 'hidden';
  self.visible = true;
  self.display = false;
  for (var key in options) {
    self[key] = options[key];
  }
  self.setContent();
  google.maps.event.addListener(self.options.map, "idle", function (event) {
    // This is the current user-viewable region of the map
    var bounds = self.options.map.getBounds();
    checkElementVisibility(self, bounds);
  });
  if (this.options.onClick) {
    google.maps.event.addDomListener(this.el, "click", this.options.onClick);
  }
}

function checkElementVisibility(item, bounds) {
  // Checks if the marker is within the viewport and displays it accordingly -
  // triggered by the google.maps "idle" event on the map object
  if (bounds.contains(item.position)) {
    // If the item isn't already being displayed
    if (item.display != true) {
      item.display = true;
      item.setMap(item.options.map);
    }
  } else {
    item.display = false;
    item.setMap(null);
  }
}

var supportedTransform = (function getSupportedTransform() {
  var prefixes = 'transform WebkitTransform MozTransform OTransform msTransform'.split(' ');
  var div = document.createElement('div');
  for (var i = 0; i < prefixes.length; i++) {
    if (div && div.style[prefixes[i]] !== undefined) {
      return prefixes[i];
    }
  }
  return false;
})();

function createCustomMarkerComponent() {
  if (window.google) {
    CustomMarker.prototype = new google.maps.OverlayView();
    CustomMarker.prototype.setContent = function () {
      this.el.innerHTML = this.innerHTML;
      this.el.style.position = 'absolute';
      this.el.style.cursor = 'pointer';
      this.el.style.top = 0;
      this.el.style.left = 0;
    };
    CustomMarker.prototype.getPosition = function () {
      return this.position;
    };
    CustomMarker.prototype.getDraggable = function () {
      return this.draggable;
    };
    CustomMarker.prototype.setDraggable = function (draggable) {
      this.draggable = draggable;
    };
    CustomMarker.prototype.setPosition = function (position) {
      var self = this;
      return new Promise(function () {
        position && (self.position = position); /* jshint ignore:line */
        if (self.getProjection() && typeof self.position.lng == 'function') {
          var setPosition = function () {
            if (!self.getProjection()) {
              return;
            }
            var posPixel = self.getProjection().fromLatLngToDivPixel(self.position);
            var x = Math.round(posPixel.x - (self.el.offsetWidth / 2));
            var y = Math.round(posPixel.y - self.el.offsetHeight + 10); // 10px for anchor; 18px for label if not position-absolute
            if (supportedTransform) {
              self.el.style[supportedTransform] = "translate(" + x + "px, " + y + "px)";
            } else {
              self.el.style.left = x + "px";
              self.el.style.top = y + "px";
            }
            self.el.style.visibility = "visible";
          };
          if (self.el.offsetWidth && self.el.offsetHeight) {
            setPosition();
          } else {
            // Delayed left/top calculation when width/height are not set instantly
            setTimeout(setPosition, 300);
          }
        }
      });
    };
    CustomMarker.prototype.setZIndex = function (zIndex) {
      if (zIndex === undefined) return;
      (this.zIndex !== zIndex) && (this.zIndex = zIndex); /* jshint ignore:line */
      (this.el.style.zIndex !== this.zIndex) && (this.el.style.zIndex = this.zIndex);
    };
    CustomMarker.prototype.getVisible = function () {
      return this.visible;
    };
    CustomMarker.prototype.setVisible = function (visible) {
      if (this.el.style.display === 'none' && visible) {
        this.el.style.display = 'block';
      } else if (this.el.style.display !== 'none' && !visible) {
        this.el.style.display = 'none';
      }
      this.visible = visible;
    };
    CustomMarker.prototype.addClass = function (className) {
      var classNames = this.el.className.trim().split(' ');
      (classNames.indexOf(className) == -1) && classNames.push(className); /* jshint ignore:line */
      this.el.className = classNames.join(' ');
    };
    CustomMarker.prototype.removeClass = function (className) {
      var classNames = this.el.className.split(' ');
      var index = classNames.indexOf(className);
      (index > -1) && classNames.splice(index, 1); /* jshint ignore:line */
      this.el.className = classNames.join(' ');
    };
    CustomMarker.prototype.onAdd = function () {
      this.getPanes().overlayMouseTarget.appendChild(this.el);
      // this.getPanes().markerLayer.appendChild(label-div); // ??
    };
    CustomMarker.prototype.draw = function () {
      this.setPosition();
      this.setZIndex(this.zIndex);
      this.setVisible(this.visible);
    };
    CustomMarker.prototype.onRemove = function () {
      this.el.parentNode.removeChild(this.el);
      // this.el = null;
    };
  } else {
    setTimeout(createCustomMarkerComponent, 200);
  }
}
The checkElementVisibility function determines whether a marker should appear or not.
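A usage sketch for the rotation requirement (the vehicleSvg string, coordinates, and heading are illustrative; rotation is applied with a CSS transform on the marker's inner element, while the overlay itself handles positioning):
var heading = 45; // degrees, e.g. from the vehicle's GPS bearing
var marker = new CustomMarker({
  map: map,
  position: new google.maps.LatLng(12.9716, 77.5946),
  markerId: 'vehicle-42',
  innerHTML: '<div style="width:32px;height:32px;transform:rotate(' + heading + 'deg);">' +
    vehicleSvg + // pre-coloured SVG markup for this vehicle's state
    '</div>',
  onClick: function () { console.log('vehicle-42 clicked'); }
});
marker.setPosition(new google.maps.LatLng(12.9720, 77.5950)); // move it later as updates arrive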
In case there are better solutions, please add them here. Thanks!

customizing ag-grid to set a max number of selectable rows

I am trying to customize a data table using ag-grid in my Angular 1.5 based project. The customization is that the user is allowed to select only up to a maximum number of rows in the table; for example, the maximum is 2.
I have the following code using node.setSelected(false), which I found in the documentation page here, but I get the error node.setSelected is not a function when the selection exceeds the maximum of 2.
var gridOptions = {
  columnDefs: columnDefs,
  rowSelection: 'multiple',
  onRowSelected: onRowSelected
};

function onRowSelected(event) {
  var curSelectedNode = event.node;
  var selectionCounts = vm.gridOptions.api.getSelectedNodes().length;
  if (selectionCounts > 2) {
    var oldestNode = vm.gridOptions.api.getSelectedNodes()[0]; // get the first node, to be popped out
    oldestNode.setSelected(false); // causes the above 'not a function' error
  }
}
Does anyone know what might be wrong with ag-grid's setSelected() API, or is there a better way to do this customization?
It turns out that the setSelected(false) method is outdated in the ag-grid API version I am using, and I found that I can use the deselectIndex() method to deselect the oldest node:
if (selectionCounts > 2) {
  vm.gridOptions.api.deselectIndex(0, true); // This works!
}
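For context, a minimal sketch of the full handler with that change, assuming the same gridOptions as above:
function onRowSelected(event) {
  var selectionCounts = vm.gridOptions.api.getSelectedNodes().length;
  if (selectionCounts > 2) {
    vm.gridOptions.api.deselectIndex(0, true); // same call as above; the second argument mirrors the original usage
  }
}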
Hope this will help someone else in the future!
var columnDefs = [{
  headerName: 'Name',
  field: 'name',
  width: 108,
  minLength: 1,
  maxLength: 20,
  editable: true
}];
- Modify the prototype in the .js file:
TextCellEditor.prototype.init = function (params) {
  var eInput = this.getGui();
  var startValue;
  // Set min & max length
  if (params.column.colDef.maxLength)
    eInput.maxLength = params.column.colDef.maxLength;
  if (params.column.colDef.minLength)
    eInput.minLength = params.column.colDef.minLength;
  // cellStartedEdit is only false if we are doing fullRow editing
  if (params.cellStartedEdit) {
    this.focusAfterAttached = true;
    var keyPressBackspaceOrDelete = params.keyPress === constants_1.Constants.KEY_BACKSPACE
      || params.keyPress === constants_1.Constants.KEY_DELETE;
    if (keyPressBackspaceOrDelete) {
      startValue = '';
    } else if (params.charPress) {
      startValue = params.charPress;
    } else {
      startValue = params.value;
      if (params.keyPress !== constants_1.Constants.KEY_F2) {
        this.highlightAllOnFocus = true;
      }
    }
  } else {
    this.focusAfterAttached = false;
    startValue = params.value;
  }
  if (utils_1.Utils.exists(startValue)) {
    eInput.value = startValue;
  }
  this.addDestroyableEventListener(eInput, 'keydown', function (event) {
    var isNavigationKey = event.keyCode === constants_1.Constants.KEY_LEFT || event.keyCode === constants_1.Constants.KEY_RIGHT;
    if (isNavigationKey) {
      event.stopPropagation();
    }
  });
};
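An alternative sketch that avoids patching the library's TextCellEditor: register a small custom cell editor that applies the same minLength/maxLength from the column definition (this assumes an ag-grid build that supports the standard init/getGui/getValue cell editor contract; the component name is illustrative):
function MaxLengthCellEditor() {}
MaxLengthCellEditor.prototype.init = function (params) {
  this.eInput = document.createElement('input');
  this.eInput.value = params.value != null ? params.value : '';
  if (params.column.colDef.maxLength) {
    this.eInput.maxLength = params.column.colDef.maxLength;
  }
  if (params.column.colDef.minLength) {
    this.eInput.minLength = params.column.colDef.minLength;
  }
};
MaxLengthCellEditor.prototype.getGui = function () { return this.eInput; };
MaxLengthCellEditor.prototype.afterGuiAttached = function () { this.eInput.focus(); };
MaxLengthCellEditor.prototype.getValue = function () { return this.eInput.value; };

// Reference it from the column definition instead of overriding the default editor
var columnDefs = [{
  headerName: 'Name',
  field: 'name',
  minLength: 1,
  maxLength: 20,
  editable: true,
  cellEditor: MaxLengthCellEditor
}];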
