ARKit - Issue with ARSession - iOS 11

I'm trying to create an ARSession using ARKit with Xcode 9 beta (for iOS 11), but it doesn't seem to work properly.
The code I've tried is:
override func viewDidLoad() {
    super.viewDidLoad()

    // Configure the session
    let configuration = ARWorldTrackingSessionConfiguration()
    configuration.planeDetection = .horizontal

    // Run the session
    sceneView.session.run(configuration)
}
Can anyone help? The code looks correct according to Apple's documentation.

You have a problem with the view controller's life cycle. According to Apple's guidelines for ARSession, the session should be run only after the view has loaded completely; in other words, wait until the view has appeared to the user before running your session.
Here is Apple's documentation on this: Building a Basic AR Experience.
Also, look at the following sample.
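For example, a minimal sketch of that pattern, assuming the standard ARKit template with an ARSCNView outlet named sceneView:
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    // Create and run the session only once the view is about to appear.
    let configuration = ARWorldTrackingSessionConfiguration()
    configuration.planeDetection = .horizontal
    sceneView.session.run(configuration)
}

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)

    // Pause the session when the view goes away.
    sceneView.session.pause()
}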

ARWorldTrackingSessionConfiguration is supported only on iOS devices with an A9 processor or later.
As per the Apple docs:
On iOS devices with an A9 processor or later, the ARWorldTrackingSessionConfiguration subclass provides high-precision motion tracking and enables features to help you place virtual content in relation to real-world surfaces.
On other devices supported by ARKit, the ARSessionConfiguration base class provides basic motion tracking that permits slightly less immersive AR experiences.
iPhones and processors:
iPhone 6 and iPhone 6 Plus have an A8 processor.
iPhone SE, iPhone 6s, and iPhone 6s Plus have an A9 processor.
iPhone 7 and iPhone 7 Plus have an A10 processor.
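A hedged sketch of the corresponding runtime check, assuming the iOS 11 beta API in which each configuration class exposes an isSupported class property (in the shipped iOS 11 these classes were renamed ARWorldTrackingConfiguration and AROrientationTrackingConfiguration):
// Prefer 6DOF world tracking on A9-or-later hardware;
// fall back to the basic configuration elsewhere.
if ARWorldTrackingSessionConfiguration.isSupported {
    let configuration = ARWorldTrackingSessionConfiguration()
    configuration.planeDetection = .horizontal
    sceneView.session.run(configuration)
} else {
    // Rotation-only motion tracking; no plane detection.
    sceneView.session.run(ARSessionConfiguration())
}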

There is another problem due to an iOS 11 beta 1 bug; see iOS 11 Beta 1 Release Notes And Known Issues According To Apple.
This means you need an iPhone 6s or better to use ARKit (ARSessionConfiguration) at the current time, at least until the iOS 11 beta 2 release...
2017.07.13 update: My iPhone 6 was updated to iOS 11 beta 3, and it can now run ARWorldTrackingSessionConfiguration. Amazing!
2017.09.07 update: iPhone 6 cannot run ARWorldTrackingConfiguration in the recent iOS 11 betas... :(

You also need to initialize an SCNScene and assign it to the ARSCNView.
let scene = SCNScene()
sceneView.scene = scene
I usually do this before I set up the session too.

Related

Not getting touch events in Google Chrome on Windows 10 after multi-touch or gestures

We are building a kiosk app and noticing some odd behavior with Google Chrome. To debug, we are monitoring all of the incoming events with monitorEvents(document); in the dev console, and we have noticed two different scenarios.
Scenario 1: Eventually, after a few minutes of heavy multi-finger/gesture use, we get stuck in a touchstart/touchend loop where touchend does not propagate down to a mouse/click event (breaking our app, as we only handle onClick). It is continuously stuck in touchstart-touchend-touchstart-touchend for every finger tap.
Scenario 2: As a continuation of scenario 1, with heavier use, we get absolutely no events in the console, not even a touchstart.
We are using Chrome 62 on Windows 10 with the following flags:
C:\Program Files (x86)\Google\Chrome\Application\chrome.exe --kiosk --incognito --disable-pinch --overscroll-history-navigation=0 URL
We have tried adding --touch-events, adding --disable-gpu, and removing our previous flags, but we are still facing the same issue. It seems to be an issue between Chrome and Windows, as we are able to replicate it on google.com (getting no events). When we plug in a mouse, we are able to click the buttons, so the app is not frozen; we are just not getting any touch events (only the mouse works at this point). Touch still works at the Windows and Chrome app level, just not in our app's viewport.
The kiosk is a Dell All-in-One machine. Has anyone else experienced this kind of issue?
Thanks!
UPDATE: Using Chrome 68+ I no longer have this issue. I have not tried versions between 62 and 68 to see where it was fixed but I am glad that it is!
It is a little late now, but maybe this will bring some closure. I experienced the same problem and, like you, only in Chrome. The best I could determine was that this glitch depends on the particular model of touchscreen in the computer.
Check the hardware IDs for the touchscreen in Device Manager. Only the Inspiron 20-3059 models with a touchscreen made by Advanced Silicon (VID_2149&PID_4C39) exhibit this behavior. Other 20-3059 models with a different touchscreen model (VID_0457&PID_1192) work fine.
Oddly enough, later Inspiron 20-3064 models also work fine, though they also had Advanced Silicon touchscreens. The PID was increased to 4C3E so perhaps they discovered this issue and fixed it for the newer model.
I wasn't able to find any combination of driver/firmware/Chrome versions that would fix the problem, but this is as close to a root cause as I could get.

App doesn't work on iPhone 4S

I have an iOS application built with Codename One, which runs normally on iPhone 5 and 6 devices running iOS 10.
However, my app is not working on iPhone 4 and 4s devices, which run iOS 9.
Any idea what might be happening?
Thanks in advance.
Apps should work fine all the way back to iOS 6.0 without a problem (Apple made the cutoff of 6.0 mandatory).
I'm guessing you failed at the install stage, which could mean one of the following:
Missing or incorrect UDID setting - if you got the device UDID from an app, it's probably incorrect; if it's not in the provisioning profile, well... it's missing.
A previously installed app with the same package/ID, either from the App Store or a previous build with a different certificate
Parental control settings or corporate restrictions limiting installs
An Apple OTA install bug - the workaround for this is simple: connect the device with a cable and try to install the IPA through iTunes. Sometimes it starts working OTA after doing this once (FYI: OTA == over the air).

ActionScript 3 AIR - Video makes short blinking jumps

I am making an application for iOS and Android using ActionScript 3 and Adobe AIR (3.7) to build the IPA and APK. In this application, I load a video from an FLV file and add it to the scene.
The problem is that on the emulator, or in the Flash view, everything is OK, but on the iPad (tested on iPad 1, 2, and 3 with the same results) the video makes short jumps (like a sudden freeze followed by a short jump in the timeline) every 2 seconds, approximately.
Of course, I made sure that the video wasn't under other elements or above moving clips. I tried loading the video without the rest of the interface: same result. Changing the renderMode to "direct" or "gpu": no. Exporting the video at different qualities and making sure there is no resizing (even with dimensions that are multiples of 8): no again.
I use something similar to this code to load my video (it's the test code I used to be sure that the problem wasn't elsewhere in my code):
// Handlers must be defined before they are assigned to the client object.
function ns_onMetaData(item:Object):void { }
function ns_onNetStatus(event:NetStatusEvent):void { }

var myVideo:Video = new Video();
this.addChild(myVideo);

var nc:NetConnection = new NetConnection();
nc.connect(null);

var ns:NetStream = new NetStream(nc);
// The client object only takes data callbacks such as onMetaData;
// NetStatusEvent is delivered through addEventListener instead.
ns.client = { onMetaData: ns_onMetaData };
ns.addEventListener(NetStatusEvent.NET_STATUS, ns_onNetStatus);

myVideo.attachNetStream(ns);
ns.play("myLink.flv");
Thanks in advance, and sorry for my bad English.
You should not use FLV on iOS devices; that is my personal guess as to why you are seeing the "jump". FLV is software-decoded, so it is relatively slow, and my guess is you are experiencing dropped frames while the video is being decoded.
On iOS (and all mobile devices, really), you want to use h.264 video with an mp4 extension (m4v will work on iOS, but not on Android, I believe). For playback, you want to use either StageVideo or StageWebView rather than an AS3-based video-player. StageVideo will render using the actual media framework of the device. StageWebView will only work on iOS and certain Android devices, and will render using the actual media player of the device.
The difference between this and Video or FLVPlayback (or the Flex- or OSMF-based video players) is that the video will be hardware accelerated/decoded. This means that your app's render time (and thus the video render time) will not be dictated by how fast the video is decoded, because a separate chip will be handling it.
Additionally, hardware accelerated video will be much, much better on battery life. I ran a test last year on an iPad 3 and the difference between battery life consumed by software/CPU decoded FLV and hardware decoded h.264 was somewhere in the neighborhood of 30%.
Keep in mind that neither of these options renders in the display list: StageWebView renders above the display list, and StageVideo renders below it.
I suggest viewing my previous answers on video rendering in AIR for Mobile apps as well; I have gone into more detail about video in AIR for Mobile in the past. I have built three video-on-demand apps using AIR for Mobile now, and it is definitely a delicate task.
NetStream http Video not playing on IOS device
Optimizing video playback in AIR for iOS
How to implement stagevideo in adobe air for android?
Hopefully this is of some help to you.

iOS 6 Audio multi-route - use external microphone AND internal speaker simultaneously

This presentation on Core Audio in iOS 6, http://www.slideshare.net/invalidname/core-audioios6portland, seems to suggest (slide 87) that it is possible to override the automatic output/input routing of audio devices using AVAudioSession.
So, specifically, is it possible to have an external mic plugged into an iOS 6 device and output sound through the internal speaker? I've seen this asked before on this site (iOS: Route audio-IN thru jack, audio-OUT thru inbuilt speaker), but no answer was forthcoming.
Many thanks !
According to Apple's documentation:
https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVAudioSession_ClassReference/Reference/Reference.html#//apple_ref/occ/instm/AVAudioSession/overrideOutputAudioPort:error:
https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVAudioSession_ClassReference/Reference/Reference.html#//apple_ref/doc/c_ref/AVAudioSessionPortOverride
You can override the output to the speaker, but if you look more closely at the C-based Audio Session Services reference (which is actually being deprecated, but still has helpful information):
https://developer.apple.com/library/ios/documentation/AudioToolbox/Reference/AudioSessionServicesReference/Reference/reference.html#//apple_ref/doc/constant_group/Audio_Session_Property_Identifiers
If a headset is plugged in at the time you set this property's value to kAudioSessionOverrideAudioRoute_Speaker, the system changes the audio routing for input as well as for output: input comes from the built-in microphone; output goes to the built-in speaker.
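For reference, a minimal sketch of the override using the modern Swift spelling of the same AVAudioSession API (the question predates Swift; overrideOutputAudioPort(.speaker) corresponds to kAudioSessionOverrideAudioRoute_Speaker here):
import AVFoundation

// Sketch: route playback to the built-in speaker.
// Per the documented caveat above, if a headset is plugged in, this
// override switches input to the built-in microphone as well, so an
// external mic plus internal speaker is not a supported combination.
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playAndRecord)
    try session.overrideOutputAudioPort(.speaker)
    try session.setActive(true)
} catch {
    print("Audio session setup failed: \(error)")
}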
I would suggest looking at the documentation for iOS 7 to see if they've added any new functionality. I'd also suggest running tests with external devices like the iRiffPort or USB-based inputs (if you have an iPad with the CCK).

Installing a target BAR file on the BlackBerry 10 Alpha Simulator

I have a question regarding BB10 BAR file installation on the Alpha Simulator.
I created a BlackBerry 10 Cascades application in the QNX Momentics IDE. The application works fine on the BlackBerry 10 Alpha Simulator when I use the 'Simulator-Debug' mode.
I also created a BAR file using the 'Export Release Build' wizard from bar-descriptor.xml. The BAR was created successfully. Later, I cleaned up the BB10 Alpha Simulator and was able to install the BAR file using the vnBB10 tool. It works great.
The question here is: why is the app not running in the BB10 Simulator when I create a BAR file using 'Device-Release' mode and sign the BAR? (Actually, I installed the Device-Release BAR file on the Simulator. I can see the application icon and splash screen, and it also shows the BlackBerry permissions window, but after that it stops abruptly. There were no logs to explore.)
-> Do you think a 'Device-Release' BAR is only made for BB10 devices (Z10/Q10/PlayBook), and not for simulators?
-> How can I create a single BAR which can run on both a BB10 device and the Simulator?
I really appreciate you looking into this query.
~albeee~
Device-Release and Device-Debug build ARM binaries, while Simulator-Debug builds x86 binaries. Essentially, the processors in real devices and simulators are completely different, and you need to tell the cross-compiler which architecture you are targeting. That's why Device-Release/Device-Debug builds will never run on the simulator, and why Simulator-Debug builds will never run on a real device.
There might be a way to package both architectures in one BAR, but why would you want to? The simulator doesn't require signed BARs and is only for development.
