Hello, I am creating financial software that runs on a touch panel machine. I have created a keyboard control for user input. I want to open the Start menu when the Windows button on my keyboard control is clicked. How is that possible?
You can't send keystrokes from a Silverlight app to the OS, nor can you execute shell operations. Just remember that Silverlight apps run in a browser. If you were able to do things like simulate keystrokes, that would be a serious security issue.
Cheers, Alex
Silverlight does not allow you to directly send commands to the OS, as it runs within a secure browser sandbox. Even when the application is running out-of-browser, the majority of the security restrictions remain.
There are workarounds, however. The simplest of these is to implement a service or native application that can execute the commands, and then connect to it from your Silverlight application with a TCP socket or similar.
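For illustration, the client side of that workaround could look roughly like this (a sketch only: the port, the "SHOW_START_MENU" command, and the local helper service are all made up, and Silverlight additionally restricts client sockets to ports 4502-4534 and requires a policy server on port 943):

using System.Net;
using System.Net.Sockets;
using System.Text;

// Connect to a local helper service that performs the real OS action.
var endPoint = new DnsEndPoint("localhost", 4530);
var socket = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
var connectArgs = new SocketAsyncEventArgs { RemoteEndPoint = endPoint };
connectArgs.Completed += (s, e) =>
{
    if (e.SocketError == SocketError.Success)
    {
        // Send an application-defined command; the helper service translates it
        // into the actual keystroke or shell operation.
        var payload = Encoding.UTF8.GetBytes("SHOW_START_MENU");
        var sendArgs = new SocketAsyncEventArgs();
        sendArgs.SetBuffer(payload, 0, payload.Length);
        socket.SendAsync(sendArgs);
    }
};
socket.ConnectAsync(connectArgs);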
I haven't used it myself, but Silverlight 4 has COM interop for out-of-browser applications; maybe you can use that. If it doesn't work directly, you can install another application or service that exposes a COM API, have the SL4 application send commands to it, and let that in turn send the keystrokes.
Here is an article with some ideas on how to do it:
http://elegantcode.com/2010/02/20/silverlight-4-com-interop-and-the-cool-stuff-you-can-do-with-it/
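To tie this back to the original question: with an elevated-trust, out-of-browser SL4 app, the COM automation route looks roughly like the sketch below. "WScript.Shell" is the Windows Script Host object, and Ctrl+Esc is the classic shortcut that opens the Start menu; treat this as an illustration rather than tested code.

using System.Runtime.InteropServices.Automation;

if (AutomationFactory.IsAvailable)  // true only for elevated out-of-browser apps
{
    dynamic shell = AutomationFactory.CreateObject("WScript.Shell");
    shell.SendKeys("^{ESC}");  // Ctrl+Esc opens the Start menu
}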
Related
I have a legacy WinForms desktop app that works perfectly with mouse and keyboard. It has some self-made controls that involve the creation of threads and so on; for example, the longer a button is pressed, the faster a number is incremented.
The application also uses a Win32 DLL. Now the client wants that application to be touch-enabled and to run on a tablet, which also means resizing and rotation capabilities.
My question is: what is the best way to make this application touch-enabled with a responsive design?
I can try to modify the existing WinForms app, but I think that will be a lot of work with poor results. I can also migrate to WPF and reuse the C# code, but I might have trouble with the keyboard, as I have not found a good way to show the on-screen keyboard while keeping the whole app visible. Or I can migrate to a Windows Store app, but then there is the problem of that Win32 DLL, which I'm not sure can be migrated.
The WinForms application is multilingual, so creating my own keyboard is not a valid option.
If the target is a touch screen, then the best option would certainly be a Windows Store app, although there are several limitations.
If you are not going to publish this application in the Windows Store, then you should be able to use all WinAPI functions. (I'm not sure what the Win32 DLL is; if it's your own DLL, then it could be a problem.)
Is there a way to prevent or disable video capture of my WPF application? Perhaps some Win32 API calls or some mask over my WPF content? Or, if that is impossible, is there a way to at least prevent the most popular screen capture programs from recording what is happening in my WPF application?
To prevent an application from capturing window contents, you can call the SetWindowDisplayAffinity Windows API with a WDA_MONITOR affinity. While this prevents applications from capturing a screen, it will not prevent a user from whipping out their smart phone and taking a picture of the screen.
The API is available on systems running Windows 7 and later. It also requires that Desktop Window Manager composition is enabled. Turning off DWM composition will undo the effect, so you need to prevent users from turning it off. If you are running Windows 8 or later, this is not an issue, since the Desktop Window Manager is always on.
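A minimal sketch of the call from WPF (assuming you apply it once the window handle exists, for example in OnSourceInitialized):

using System;
using System.Runtime.InteropServices;
using System.Windows.Interop;

// WDA_MONITOR = 0x1 per WinUser.h: the window content is only visible on the monitor.
private const uint WDA_MONITOR = 0x00000001;

[DllImport("user32.dll", SetLastError = true)]
private static extern bool SetWindowDisplayAffinity(IntPtr hWnd, uint dwAffinity);

protected override void OnSourceInitialized(EventArgs e)
{
    base.OnSourceInitialized(e);
    var hwnd = new WindowInteropHelper(this).Handle;
    SetWindowDisplayAffinity(hwnd, WDA_MONITOR);  // captured frames of this window come out black
}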
I'm building a Silverlight out-of-browser application with elevated permissions and need the application to listen for a keyboard shortcut, so that something like Ctrl + F10 causes a window to take focus of the screen... Personally I am against stealing focus, but this feels alright seeing as the user invokes it themselves.
Some more background: if any of you are familiar with applications like XFire or Steam, I'd like the ability to press a keyboard shortcut and have a window open above all the other applications, the way Steam does with its in-game overlay.
If Silverlight can't do this, can someone point me towards a better platform for creating this sort of application?
If Silverlight can do this, can someone point me in the direction of how to accomplish it?
Silverlight 4 can't do this, and Silverlight is not a "language". Any application development platform (Java, Delphi, VB6, .NET) that has full access to the Windows API could do it.
Silverlight 5 includes support for P/Invoke, so if you're willing to wait for the RC to go to RTM, you may be able to hook the system-level WinAPIs needed to watch for a hotkey.
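For reference, the WinAPI in question is RegisterHotKey; the P/Invoke declarations would look roughly like this. Note that you still need a way to receive the WM_HOTKEY message, which is the hard part from Silverlight, so treat this strictly as a sketch:

using System;
using System.Runtime.InteropServices;

// Values from WinUser.h: MOD_CONTROL = 0x0002, VK_F10 = 0x79.
[DllImport("user32.dll")]
static extern bool RegisterHotKey(IntPtr hWnd, int id, uint fsModifiers, uint vk);

[DllImport("user32.dll")]
static extern bool UnregisterHotKey(IntPtr hWnd, int id);

// Example registration for Ctrl + F10:
// RegisterHotKey(IntPtr.Zero, 1, 0x0002 /* MOD_CONTROL */, 0x79 /* VK_F10 */);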
We would like to be able to communicate with a WPF application from the server.
Is it possible to have a WCF listener / service in a WPF application? And then call this service to open a screen in the WPF application?
Is it possible to have a WCF listener / service in a WPF application
It is fairly simple to create a WCF service listener/server anywhere you want.
var serviceHost = new ServiceHost(typeof(SomeService));
serviceHost.Open();
One problem is that you have to have enough permissions to have your host be visible. You might have to elevate your application, and would definitely have to make sure the firewall (software/hardware) allows traffic to reach it.
This link covers network setup for the WCF MSDN samples, and it applies both to IIS hosting and to your case, non-IIS-hosted WCF:
http://msdn.microsoft.com/en-us/library/ms751527(v=vs.90).aspx
Also, you may run into threading complications, though you'll run into these in any case where you are trying to update the UI from a background thread. If you have problems with this, look into the Dispatcher:
http://msdn.microsoft.com/en-us/magazine/cc163328.aspx
After that, it is up to you to create a client/server design that ensures that your service is created and listening at the right times, torn down at the right times (since ServiceHost is IDisposable), and that it handles state correctly (in case operations get called at times you aren't expecting - there are always bugs in any piece of software).
And then call this service to open a screen in the WPF application
WPF windows can be created and shown in much the same way as in WinForms. You can still do a new MainWindow().Show() call, for example, so simply add such code to your service implementation.
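Put together, a minimal sketch might look like the following. The contract name, operation, and endpoint address are made up for illustration, and note the Dispatcher call mentioned above, since WCF invokes operations on worker threads:

using System;
using System.ServiceModel;
using System.Windows;

[ServiceContract]
public interface IScreenService
{
    [OperationContract]
    void OpenScreen(string screenName);
}

public class ScreenService : IScreenService
{
    public void OpenScreen(string screenName)
    {
        // Marshal to the UI thread before touching any WPF objects.
        Application.Current.Dispatcher.Invoke((Action)(() => new MainWindow().Show()));
    }
}

// Hosting, e.g. in App.OnStartup:
var host = new ServiceHost(typeof(ScreenService),
    new Uri("net.tcp://localhost:8731/ScreenService"));
host.AddServiceEndpoint(typeof(IScreenService), new NetTcpBinding(), "");
host.Open();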
Yes, you can host a WCF service in both WinForms and WPF applications; MSDN contains some samples. Depending on how you host the service, you have to deal with UI interaction differently: there is a difference between hosting on the UI thread and on another thread, because other threads cannot access UI controls directly.
We have a legacy application running on UNIX. Our Windows users log in to the application via a terminal application or command prompt. The app looks like one of those car-dealer-style applications that run in a DOS-mode interface.
I am creating a new WinForms interface to the application but still want to keep the legacy application online for a while. However, I don't want the users to open up a command prompt; instead, they should access the application from within my WinForms app. Is there a user control I can place on the form for command prompt access?
Solution: I ended up using a control from Rebex.net; fast integration and easy to use.
See this blog post from Jeffrey
With his code you can write to a Console window, even from a Windows Forms application.
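If I recall the post correctly, the trick essentially boils down to P/Invoking AllocConsole from kernel32, along these lines:

using System;
using System.Runtime.InteropServices;

[DllImport("kernel32.dll", SetLastError = true)]
static extern bool AllocConsole();

// Somewhere in your WinForms startup code:
AllocConsole();
Console.WriteLine("This now appears in a console window owned by the WinForms process.");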
NOTE
I would NOT recommend doing that.
A Windows application should use windows as its GUI, not the console window...
I've been keeping my eye on this one. It looks promising.
http://www.codeproject.com/KB/IP/Terminal_Control_Project.aspx?fid=469468&df=90&mpp=25&noise=3&sort=Position&view=Quick&select=2857639