Displaying Android's commands through code in Codename One

I have an Android build of a Codename One app.
Can I show the command menu through code in the Android build of Codename One?
Right now, I need to click on the menu option to access the commands. Is there any code snippet through which I can make the commands appear on the screen (i.e. invoke the menu)?

Not at the moment, since we don't support showing the native command menu, and the command menu might not even exist (think ActionBar on Android 3.x and newer).
However, showing the "lightweight" menu might work for some cases. Be warned that this is officially unsupported, and this is in no way a recommendation to do it!
You can call Form.getMenuBar().showMenu(). Again, this will work great in the simulator, but on actual devices your mileage may vary in terms of device look/feel.
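For illustration, here is a minimal sketch of that workaround in a plain Codename One form; the form title, commands, and button are placeholders, and the same "unofficial" caveat applies:

import com.codename1.ui.Button;
import com.codename1.ui.Command;
import com.codename1.ui.Form;
import com.codename1.ui.layouts.BoxLayout;

public class ShowMenuDemo {
    // Unofficial workaround: pop the lightweight command menu from code.
    // Works in the simulator; device look/feel may differ, as noted above.
    public void showDemoForm() {
        Form f = new Form("Demo", BoxLayout.y());
        f.addCommand(new Command("Settings"));
        f.addCommand(new Command("About"));

        Button open = new Button("Show menu");
        open.addActionListener(e -> f.getMenuBar().showMenu());
        f.add(open);
        f.show();
    }
}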

Related

How to enable Screenreader "Browse Mode" in a custom Application?

Screen readers like NVDA implement two modes of operation: Browse Mode and Focus Mode.
Browse Mode is for reading a website/document/application: the screen reader reads all visible content.
For interacting with the website/document/application, Focus Mode offers some advantages: only the interactive parts of the content are read. On a website, this means that links, buttons, forms, and navigation are read, but normal text is not.
In NVDA, you can usually switch between Browse and Focus mode with Insert+Space, which is then confirmed by a sound. This works in most applications: Browsers, Windows Explorer, Skype, VS Code.
However, in my own WPF application (which e.g. has accessibility labels), when I press Insert+Space, nothing happens. NVDA seems to always be in Focus Mode, and there is no Browse Mode.
Intuitively this makes sense, because for Browse Mode, the screenreader needs to "know" what elements it should read, and in what order.
I have no clue where to begin implementing it. Is this a common WPF problem? Is it a problem of NVDA, which somehow needs to know that the application is capable of Browse Mode?
Is it possible you built your application with the accessibility compiler option turned off? Here are a few things you can check:
Accessibility switches in .NET
Example of solving an accessibility issue in .NET
Using Accessibility Insights to inspect accessibility properties in a WPF app
If you run the built-in calculator app on Windows, it has the same problem as your app. You're always in forms mode and Ins+space won't switch to browse mode. However, there isn't really any "plain text" to read in the calculator app. Every element is an interactive element.
However, the Settings app does have some plain text and it has the same problem too. I can navigate to all the interactive elements but I can't get to the "Get even more out of Windows" text or the text underneath it. Visually it looks like a heading followed by a paragraph but switching to NVDA browse mode doesn't work.
This seems to be how NVDA detects the accessibility of an application:
Normally, NVDA uses the IAccessible2 API to get accessibility information from Chrome. With this embedded version of Chrome, NVDA seems to be unable to query the IAccessible2 interfaces and falls back to plain IAccessible/MSAA. I've seen this in embedded Chrome versions in Qt as well. Pretty sure it is a problem in the embedded version of Chrome.
Source: https://github.com/nvaccess/nvda/issues/13493

Menu in desktop applications made with Codename One

I would like to be able to add a menu to a desktop application that I have started to develop. I mean a classic application menu, which on Windows is displayed immediately below the title bar (with items like "File", "Edit", "View", etc.), while on macOS it is displayed in the top bar common to all applications (the "app menu"). Ideally each menu item could invoke an ActionListener, just like tapping a Button.
However, I have not found any information about this and I do not even know if it is possible at the moment. Thank you for your suggestions.
This is a feature we plan to add early in the 8.0 release cycle, as we already make some (minimal) use of it in our own apps. Right now the only option is to use the Swing APIs to add the menus through native interfaces.
You can get the JFrame instance (of which there can be only one) and add any menu to it. If you use the Mac JMenuBar on a typical Codename One build it will appear in the top bar by default, since we set this implicitly through System.setProperty("apple.laf.useScreenMenuBar", "true");.
You can then just add entries to the menu bar, for example:
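Here is a rough sketch of that Swing workaround. It assumes the desktop build exposes its window as a standard JFrame reachable via Frame.getFrames(); the class and method names are only illustrative, and this should be invoked from desktop-only code (e.g. behind a native interface):

import java.awt.Frame;
import javax.swing.JFrame;
import javax.swing.JMenu;
import javax.swing.JMenuBar;
import javax.swing.JMenuItem;
import javax.swing.SwingUtilities;

public class DesktopMenuInstaller {
    public static void installMenu() {
        // Swing UI changes must happen on the AWT event dispatch thread.
        SwingUtilities.invokeLater(() -> {
            for (Frame frame : Frame.getFrames()) {
                if (frame instanceof JFrame) {
                    JFrame jf = (JFrame) frame;
                    JMenuBar bar = new JMenuBar();
                    JMenu file = new JMenu("File");
                    JMenuItem exit = new JMenuItem("Exit");
                    // Each JMenuItem takes an ActionListener, which covers the
                    // question's request for Button-like callbacks per item.
                    exit.addActionListener(e -> System.exit(0));
                    file.add(exit);
                    bar.add(file);
                    // On a Mac this lands in the global menu bar because the build
                    // sets apple.laf.useScreenMenuBar, as mentioned above.
                    jf.setJMenuBar(bar);
                    jf.revalidate();
                }
            }
        });
    }
}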

How can I make my C editor support Intellij keymap?

I'm used to the keyboard shortcuts used in JetBrains products such as Intellij IDEA and PyCharm.
I'm now using CodeLite to work on C code, and it's really annoying that the keyboard shortcuts are different.
Is there a way to make CodeLite support pretty much the same key mapping as PyCharm? If not, what C editor (apart from the paid-for CLion) supports this?
CodeLite has no built-in keymaps (I might consider this as a feature request if you open one here: https://github.com/eranif/codelite/new).
You can change the current default keyboard bindings from the main menu bar:
Settings->Keyboard Shortcuts
Just type the description you want to change into the search box. For example:
If you type build, you will get a list of all the actions associated with the build keyword; double-click one and the edit dialog will open.
HTH,
Eran

Is it possible to enable hover in the simulator?

I was thinking about creating a sort of animated screenshot of my app running in the simulator, and I created some code which is able to record an animated GIF image where I am also able to draw the current pointer as a gloved hand on top.
Then I realised it would be much better if the pointer was shown even when it is not down.
Therefore: is it possible to enable hover in the simulator?
We have the ability to show it in the JavaSE port, but that isn't exposed in the simulator. If you change the project's compile path you can use:
JavaSEPort.setInvokePointerHover(true);
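As a sketch, assuming the JavaSE port classes are on the project's compile path and that com.codename1.impl.javase.Simulator is the usual simulator entry point, you could wrap the simulator launch like this (the launcher class name is just a placeholder):

import com.codename1.impl.javase.JavaSEPort;
import com.codename1.impl.javase.Simulator;

public class HoverSimulatorLauncher {
    public static void main(String[] args) throws Exception {
        // Draw the pointer even when it isn't pressed, so hover shows up
        // in recordings made from the simulator window.
        JavaSEPort.setInvokePointerHover(true);
        // Hand off to the regular simulator entry point.
        Simulator.main(args);
    }
}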

Simulator options won't show in Xcode 6.4

I don't get the option to run my app on a certain simulator or anything. I'm trying to upload my app to iTunes Connect but it won't even let me.
Try increasing your screen resolution or making the Xcode window wider. Then it will reappear.
If you can't do either of those (due to screen resolution constraints), then use the Product menu and the Destination sub-menu as a workaround.
And by the way, this looks like a duplicate of question #31318316.
