Enabling on-screen keyboard in the BlackBerry simulator

How can I enable the on-screen, virtual keyboard in the BlackBerry 10 alpha simulator?
I'm having some trouble inputting text into my ported Android application, and I had read that the PlayBook simulator also had problems with text input through the host PC's keyboard.
I've enabled "Keypress Pop-up" (not knowing what it does) in the "On-screen keyboard" setting in the simulator's settings, but it appears to have no effect:
Is there any way to get the on-screen keyboard to appear, so I can test using that?

I discovered that the on-screen keyboard only appears when dragging in from the bottom-left corner.
Furthermore, I discovered that in portrait mode the keyboard doesn't display correctly.
To get the simulator into landscape mode, drag in from the bottom-right corner; the keyboard then appears correctly.
Incidentally, this didn't resolve my woes with text input. Since it works so sporadically, I believe this is simply due to the simulator still being in alpha. In fact, at the moment various keys map incorrectly, such as "3" mapping to "1" and "4" to "8", while most keys don't map at all. This may have to do with android:inputType="number".

Related

How to enable Screenreader "Browse Mode" in a custom Application?

Screen readers like NVDA implement two modes of operation: Browse Mode and Focus Mode.
Browse Mode is for reading a website/document/application; the screen reader reads all visible content.
To interact with the website/document/application, Focus Mode offers some advantages: only the interactive parts of the content are read. On a website, this means that links, buttons, forms, and navigation are read, but normal text is not.
In NVDA, you can usually switch between Browse and Focus mode with Insert+Space, which is then confirmed by a sound. This works in most applications: Browsers, Windows Explorer, Skype, VS Code.
However, in my own WPF application (which e.g. has accessibility labels), when I press Insert+Space, nothing happens. NVDA seems to always be in Focus Mode, and there is no Browse Mode.
Intuitively this makes sense, because for Browse Mode the screen reader needs to "know" which elements it should read, and in what order.
I have no clue where to begin implementing it. Is this a common WPF problem? Is it a problem of NVDA, which somehow needs to know that the application is capable of Browse Mode?
Is it possible you built your application with the accessibility compiler option turned off? Here are a few things you can check:
Accessibility switches in .NET (a sketch follows this list)
Example of solving an accessibility issue in .NET
Using Accessibility Insights to inspect accessibility properties in a WPF application
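On the first point, the accessibility switches are AppContext switches. The documented place for them is the AppContextSwitchOverrides element in app.config, but they can also be set programmatically very early in startup. A minimal sketch, assuming .NET Framework 4.7.1 or later (switch names are the ones from the .NET accessibility documentation):

    using System;
    using System.Windows;

    public partial class App : Application
    {
        protected override void OnStartup(StartupEventArgs e)
        {
            // Opt in to the accessibility improvements shipped since .NET
            // Framework 4.7.1 (false = do NOT use the legacy behavior).
            // Caveat: if the framework has already read a switch before this
            // runs, the call has no effect; the app.config route is the
            // documented one.
            AppContext.SetSwitch("Switch.UseLegacyAccessibilityFeatures", false);
            AppContext.SetSwitch("Switch.UseLegacyAccessibilityFeatures.2", false);

            base.OnStartup(e);
        }
    }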
If you run the built-in Calculator app on Windows, it has the same problem as your app: you're always in focus mode, and Ins+Space won't switch to browse mode. However, there isn't really any "plain text" to read in the Calculator app; every element is an interactive element.
However, the Settings app does have some plain text and it has the same problem too. I can navigate to all the interactive elements but I can't get to the "Get even more out of Windows" text or the text underneath it. Visually it looks like a heading followed by a paragraph but switching to NVDA browse mode doesn't work.
This seems to be how NVDA detects the accessibility of an application:
"Normally, NVDA uses the IAccessible2 API to get accessibility information from Chrome. With this embedded version of Chrome, NVDA seems to be unable to query the IAccessible2 interfaces and falls back to plain IAccessible/MSAA. I've seen this in embedded Chrome versions in Qt as well. Pretty sure it is a problem in the embedded version of Chrome."
Source: https://github.com/nvaccess/nvda/issues/13493

"Hide" text box from automatic Win10 keyboard showing

My goal is to show the Windows 10 on-screen keyboard when the user taps a text box.
Windows 10 has an option to show its on-screen keyboard automatically, even outside of Tablet mode, if the corresponding option is enabled in Settings.
However, it seems this logic has some serious issues with WPF applications: flickering, the keyboard not showing up at all, etc. You can easily test this on a simple WPF application with several text boxes if you have a touch-screen Win10 device.
So I've decided to control the keyboard myself, which now works perfectly with the automatic keyboard display option disabled. However, I can't ensure that every user's Windows 10 will have this option disabled, so I'd like to make Windows "ignore" clicks on text boxes in my WPF application, so that only the application itself controls keyboard visibility.
So, my question is: is there any way to make Windows ignore focus on particular text boxes?
PS: If there is no clear way to do this, I would be grateful for any hints about how Windows actually gets the information that a WPF text box is being edited, so I can maybe experiment with my own TextBox implementation that does not trigger this logic.
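For context, a common way to control the keyboard manually is to launch TabTip.exe and close its window when done. A rough sketch of that approach (the TabTip.exe path and the "IPTip_Main_Window" class name are typical for Windows 10 and may vary between systems):

    using System;
    using System.Diagnostics;
    using System.Runtime.InteropServices;

    public static class TouchKeyboard
    {
        // Default TabTip.exe location on Windows 10; may differ per system.
        private const string TabTipPath =
            @"C:\Program Files\Common Files\microsoft shared\ink\TabTip.exe";

        private const uint WM_SYSCOMMAND = 0x0112;
        private const int SC_CLOSE = 0xF060;

        [DllImport("user32.dll", CharSet = CharSet.Unicode)]
        private static extern IntPtr FindWindow(string className, string windowName);

        [DllImport("user32.dll")]
        [return: MarshalAs(UnmanagedType.Bool)]
        private static extern bool PostMessage(IntPtr hWnd, uint msg, IntPtr wParam, IntPtr lParam);

        public static void Show() => Process.Start(TabTipPath);

        public static void Hide()
        {
            // "IPTip_Main_Window" is the keyboard's window class on Win 8/10;
            // closing that window hides the keyboard without killing the process.
            IntPtr hwnd = FindWindow("IPTip_Main_Window", null);
            if (hwnd != IntPtr.Zero)
                PostMessage(hwnd, WM_SYSCOMMAND, (IntPtr)SC_CLOSE, IntPtr.Zero);
        }
    }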
Update:
It seems it is possible to remove the "hooks" the keyboard uses to detect that a text box is focused, by returning a FrameworkElementAutomationPeer instead of a TextBoxAutomationPeer in a custom TextBox implementation, though this ruins the possibility of using the text box in UI automation (which I don't like).
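A minimal sketch of that idea (the class name is just a placeholder):

    using System.Windows.Automation.Peers;
    using System.Windows.Controls;

    // By returning a generic peer instead of a TextBoxAutomationPeer, the
    // control stops advertising UIA text patterns, so the automatic touch
    // keyboard no longer recognizes it as an editable text field. The
    // trade-off, as noted above: UI automation tooling is blinded too.
    public class KeyboardSilentTextBox : TextBox
    {
        protected override AutomationPeer OnCreateAutomationPeer()
        {
            return new FrameworkElementAutomationPeer(this);
        }
    }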
I don't have a touch-screen device to test on, but from my quick mouse-clicking tests I've seen there is TextBox.Focusable = false;
https://msdn.microsoft.com/en-us/library/system.windows.uielement.focusable(v=vs.110).aspx
This, however, makes the box unable to receive keyboard input, so maybe add a handler on the TextBox like txtBoxTestFocus_MouseDown or txtBoxTestFocus_TouchDown, which could then set txtBoxTestFocus.Focusable = true;
Not sure if this will help, as I've been unable to test it, sorry.
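A rough, untested sketch of that idea in code-behind (control and handler names are the placeholders used above; assume the box starts with Focusable="False" in XAML):

    using System.Windows;
    using System.Windows.Input;

    public partial class MainWindow : Window
    {
        public MainWindow()
        {
            InitializeComponent();
        }

        // Untested: the box becomes focusable only in direct response to a
        // press, then takes focus explicitly. The Preview* variants may be
        // needed instead, since TextBox swallows some mouse events.
        private void txtBoxTestFocus_MouseDown(object sender, MouseButtonEventArgs e)
        {
            txtBoxTestFocus.Focusable = true;
            txtBoxTestFocus.Focus();
        }

        private void txtBoxTestFocus_TouchDown(object sender, TouchEventArgs e)
        {
            txtBoxTestFocus.Focusable = true;
            txtBoxTestFocus.Focus();
        }

        // Drop focusability again so the box is skipped until the next press.
        private void txtBoxTestFocus_LostFocus(object sender, RoutedEventArgs e)
        {
            txtBoxTestFocus.Focusable = false;
        }
    }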

Full screen touch application hangs when a user touches the lower edge

I am working on a WPF touch-screen application for a tabletop monitor. The monitor is a 46" 3M monitor. All the touch responses seem to work fine until someone touches the lower edge of the screen; then it appears to hang. No button responses or drag operations work until the finger or hand is moved. I think it has something to do with the touch being in the region of the taskbar, even though the taskbar itself does not appear.
The program does not hang if the user touches the side or top in the same way or even if they place their entire hand in the middle of the screen.
Since this is a tabletop monitor, it is very common for a user to lean on the monitor and unknowingly touch the lower edge with their hand.
It appears that this is a bug related to Windows 7 and its multitouch drivers. I can reliably reproduce it on Windows 7 with a multitouch display and a bezel flat enough to allow touch contact on the lower edge.
Using the same program and multitouch display on Windows 8, the problem does not occur.

Can't change orientation to portrait in mobile AIR apps

I am developing a BlackBerry PlayBook application using Flash Builder 4.6 on a Windows 7 x64 machine. I have installed the BlackBerry SDK and can generally test my application, with one catch: the Rotate Left and Rotate Right options in the menu are grayed out. I don't have the actual tablet (and purchasing one is not a solution), so there is absolutely no way for me to test how the app behaves in portrait mode.
Perhaps the problem is that my resolution is 1280x800 with no way to make it bigger (I had expected to be able to do this, which would just activate the peculiar desktop scrolling mechanism). Is there any solution to this problem?
You can rotate the PlayBook emulator in VMware Player (I use Win 7 64-bit too) by pulling its black corner with the mouse.
You can test both portrait and landscape modes in the VMware simulator.
The PlayBook OS is designed to support development at both resolutions; if your resolution is 1280x800, you can reach the rightmost part using the scroll bar in the VMware tool.
The BB PlayBook OS ships with some predefined gestures: swiping along the bottom edge from the far left toward the right opens the virtual keyboard, and swiping from the far right toward the left (much like a swipe-down event) switches the orientation from landscape to portrait or vice versa.

Information on developing WPF touch-screen application for Windows XP

I am currently working on a WPF project which involves creating a touch-screen application for Windows XP Embedded. As Windows XP wasn't built for touch interaction, there are some problems and issues with developing such applications.
An example would be a click: on Windows XP, a click is a mouse-down followed by a mouse-up event. However, if you use your finger instead of the mouse, you might get a drag motion instead of a click: as you press down, your finger may move slightly to the side from the initial position, turning the click into a drag. This is just one example of the problems you get when developing a touch-screen app for Windows XP.
If someone has been working on a WPF touch-screen application for Windows XP, could you share some knowledge and point out the pitfalls you have encountered? If you know of any resources on this topic, please share them.
I would agree with #bflosabre91. With a mouse you could have the same problems, and in fact this happens quite frequently when someone is learning to use a mouse. I think this problem is more relevant at the hardware level and how the touchscreen actually interprets what the user is doing.
On the software side, you COULD add some logic along the lines of:
On mouse down: record the coordinates and maybe the control (button, etc.) that is under the pointer.
On mouse up: compare the recorded coordinates with the current coordinates. If they're within x pixels, either do a "control.click" or move the mouse to the old coordinates and tell the mouse to click.
The hardware may already be doing something like this...
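A rough sketch of that software-side logic in WPF code-behind (the 10-pixel tolerance and the handler wiring are illustrative only):

    using System.Windows;
    using System.Windows.Input;

    public partial class MainWindow : Window
    {
        private Point _downPosition;

        public MainWindow()
        {
            InitializeComponent();
        }

        private void Target_PreviewMouseDown(object sender, MouseButtonEventArgs e)
        {
            // Record where the press started (the control is available via sender).
            _downPosition = e.GetPosition(this);
        }

        private void Target_PreviewMouseUp(object sender, MouseButtonEventArgs e)
        {
            // Treat the gesture as a click only if the pointer stayed within
            // a small tolerance of the press position.
            Vector delta = e.GetPosition(this) - _downPosition;
            if (delta.Length <= 10)
            {
                // Within tolerance: handle as a click.
                MessageBox.Show("Click");
            }
        }
    }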
I have a WPF touch-screen application running on kiosks with XP (although it's not XP Embedded like you said). I haven't had any issues with any type of click event or anything like that. I programmed it using all the normal mouse click events, so it technically works with a mouse or with the touch screen. As long as you build the controls to be large enough to account for the fact that a finger will be touching them instead of a mouse pointer, I did not come across any issues.
