I plan to make a project where the user can view the selected items in real time and rotate them around. I would like anyone experienced to tell me which sort of renderer could be used for this. The way I imagine it, upon a click another window appears where one can rotate the object around, fully textured and lit.
What I could think of is Marmoset, which has a decent online real-time viewer; Unreal Engine, if that's possible; or Unity, where I can even use hand-coded .shader files for my materials. Any ideas on which to use and how to achieve the effect I'm after?
Marmoset has a decent viewer that you can integrate into WPF seamlessly by using the Mongoose HTTP server and CefSharp (with WebGL enabled).
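A rough sketch of what that looks like, not production code: serve the Marmoset Viewer page with Mongoose (or any static HTTP server) and point a CefSharp browser at it from your WPF window. The URL and page name are made up, and the exact CefSharp initialization API shifts a little between versions.

    using System.Windows;
    using CefSharp;        // from the CefSharp.Wpf NuGet package
    using CefSharp.Wpf;

    public partial class ViewerWindow : Window
    {
        // Call once at application startup, before any browser is created.
        public static void InitCef()
        {
            var settings = new CefSettings();
            // WebGL is normally on by default; some CefSharp builds needed this
            // switch for the Marmoset viewer to render.
            settings.CefCommandLineArgs.Add("enable-webgl", "1");
            Cef.Initialize(settings);
        }

        public ViewerWindow()
        {
            InitializeComponent();

            // viewer.html is a hypothetical page that embeds the Marmoset Viewer
            // (marmoset.js plus the exported .mview file), served by Mongoose.
            var browser = new ChromiumWebBrowser
            {
                Address = "http://localhost:8080/viewer.html"
            };
            Content = browser;
        }
    }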
I have a legacy WinForms desktop app that works perfectly with mouse and keyboard. It has some self-made controls that involve creating threads and so on; for example, the longer a button is held down, the faster a number is incremented.
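Roughly this kind of pattern (heavily simplified; the real control is thread-based, and the names and numbers here are made up):

    using System;
    using System.Windows.Forms;

    // Simplified stand-in for the self-made control: while the button is held
    // down, a timer ticks faster and faster and increments a value.
    public class RepeatButton : Button
    {
        private readonly Timer _timer = new Timer { Interval = 400 };

        public int Value { get; private set; }

        public RepeatButton()
        {
            _timer.Tick += (s, e) =>
            {
                Value++;
                Text = Value.ToString();
                // Speed up while the button stays pressed, down to a floor.
                _timer.Interval = Math.Max(50, (int)(_timer.Interval * 0.8));
            };
            MouseDown += (s, e) => { _timer.Interval = 400; _timer.Start(); };
            MouseUp += (s, e) => _timer.Stop();
        }
    }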
The application also uses a Win32 DLL. Now the client wants the application to be touch-enabled and to run on a tablet, which also means resizing and rotation capabilities.
My question is: what is the best way to make the application touch-enabled with a responsive design?
I can try to modify the existing WinForms app, but I think it would be a lot of work with poor results. I can also migrate to WPF and reuse the C# code, but I might have trouble with the keyboard, as I have not found a good way to show the keyboard and keep the whole app on the screen. Or I can migrate to a Windows Store app, but then there is the problem of that Win32 DLL, which I'm not sure can be migrated.
The WinForms application is multilingual, so creating my own keyboard is not a valid option.
If the target is a touch screen, then the best option is definitely a Windows Store app, although there are several limitations.
If you are not going to publish this application in the Windows Store, then you should be able to use all the WinAPI functions. (I'm not sure what that Win32 DLL is; if it's your own DLL, then it could be a problem.)
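If you do end up staying with WinForms or WPF, one common workaround for the keyboard problem is to launch the built-in Windows touch keyboard (TabTip.exe) yourself when an input field gets focus, rather than building your own. A minimal sketch; the path below is the usual install location, but it can differ per Windows version:

    using System;
    using System.Diagnostics;
    using System.IO;

    static class TouchKeyboard
    {
        // Shows the built-in Windows touch keyboard (TabTip.exe).
        public static void Show()
        {
            string path = Path.Combine(
                Environment.GetFolderPath(Environment.SpecialFolder.CommonProgramFiles),
                @"microsoft shared\ink\TabTip.exe");

            if (File.Exists(path))
                Process.Start(path);
        }
    }

Calling TouchKeyboard.Show() from a focus handler on your text boxes also sidesteps the multilingual issue, since the OS keyboard supplies the layouts for you.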
This might be a stupid question, but there are so many combinations of approaches (WPF, Silverlight, WinForms, HTML5), with incompatibilities at the mscorlib level, that I got completely lost.
I would like to have a few windows, mainly displaying real-time charts.
Probably with interaction among the windows (click in one, and it pops up and displays a new window).
If it can be viewed on the web, perfect.
But I don't want to have to deal with another layer of nasty stuff for those features (like having to set up WCF on IIS; kill me first).
In the end I was thinking of using FSharpChart on Silverlight.
Is that possible and/or the best option?
Thanks for your suggestions
Update:
I see that System.Drawing, which FSharpChart relies on, is not supported on Silverlight.
Try Dynamic Data Display instead of FSharpChart. It runs on Silverlight: http://research.microsoft.com/en-us/um/cambridge/groups/science/tools/d3/dynamicdatadisplay.htm
It's not as F# Interactive-friendly as FSharpChart, but you can easily wrap it in a handful of functions to make it more usable.
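For example, the basic pattern with the WPF build of D3 looks roughly like this (C# shown; it translates directly to F#, and the Silverlight build exposes a similar but not identical API, so treat this as a sketch). The ChartPlotter passed in is assumed to be declared in your XAML:

    using System;
    using System.Linq;
    using System.Windows.Media;
    using Microsoft.Research.DynamicDataDisplay;              // ChartPlotter, AddLineGraph
    using Microsoft.Research.DynamicDataDisplay.DataSources;  // EnumerableDataSource, Join

    public static class ChartHelpers
    {
        // Plots y = f(x) on an existing ChartPlotter (e.g. one declared in XAML).
        public static void PlotFunction(ChartPlotter plotter, Func<double, double> f,
                                        double from, double to, int samples, string label)
        {
            double[] xs = Enumerable.Range(0, samples)
                                    .Select(i => from + i * (to - from) / (samples - 1))
                                    .ToArray();

            var xDs = new EnumerableDataSource<double>(xs);
            xDs.SetXMapping(x => x);

            var yDs = new EnumerableDataSource<double>(xs.Select(f).ToArray());
            yDs.SetYMapping(y => y);

            // Join the two half-mapped sources into X/Y points and draw a line.
            plotter.AddLineGraph(xDs.Join(yDs), Colors.SteelBlue, 2, label);
        }
    }

Usage would be something like ChartHelpers.PlotFunction(plotter, Math.Sin, 0.0, 10.0, 400, "sin(x)"); for the real-time case you would swap the fixed arrays for D3's ObservableDataSource and append points as they arrive.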
I want to create an automated UI test that will test my Syncfusion grid. My problem is that the recorder can't recognize this control (or any Syncfusion control). I've searched a lot on the internet, but I couldn't find any extension that would make the recorder recognize my controls (I'm using WinForms, not WPF!), or at least a way to extend the recorder's abilities so that Syncfusion's controls are recognized somehow.
Is there any easy way to extend the recorder? Or is there any extension available?
Or maybe I can get the grid object from the WinClient that the recorder generates?
Thanks!
Start your program. Run the Spy++ utility. Press Ctrl+F to start the finder tool, drag the bulls-eye onto your form, click OK, choose Synchronize, and have a look at the windows that are visible in the tree. If you see regular Windows Forms controls, like a Button or a Label, but not any of the SyncFusion controls, then you've probably found the source of the problem.
Component vendors that try to improve on the .NET controls typically do so by creating 'window-less' controls. These are not really controls: they don't derive from the Control class and don't have a Handle property. They use the surface of the parent to draw themselves, making them look just like controls. The .NET ToolStripItem classes do this, and it is also the approach WPF uses.
The big advantage is that they render quickly and support all kinds of effects that regular controls can't, like transparency, rotation and anti-aliased edges. The big disadvantage is that the kind of tool you are using suddenly goes blind and can't find the control anymore. Such tools work by locating an actual Windows window on your form, and for these controls there is no window to find.
This is a hard problem to solve: the 'control' exists only in memory, and there's no good way for a tool to locate it. Using Accessibility is about the only other way I can think of for such a tool to find a control, and that would have to be implemented by the control vendor first; it's a somewhat obscure feature that gets easily overlooked. You really do need the vendor's help to find a workaround for this. That shouldn't be a problem; it's why you paid them the big money.
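If you want to see for yourself what the recorder has to work with, you can dump the UI Automation tree of the running app; the coded UI recorder ultimately relies on UIA/MSAA to find controls. A quick sketch (the window title is made up; add references to UIAutomationClient and UIAutomationTypes):

    using System;
    using System.Windows.Automation;

    class AutomationProbe
    {
        static void Main()
        {
            // Find the app's main window by title (hypothetical title).
            var mainWindow = AutomationElement.RootElement.FindFirst(
                TreeScope.Children,
                new PropertyCondition(AutomationElement.NameProperty, "My Grid App"));

            if (mainWindow == null)
            {
                Console.WriteLine("Window not found - is the app running?");
                return;
            }

            // Dump everything the automation tree can see. If the Syncfusion grid
            // (or its cells) does not show up here, the recorder cannot see it either.
            foreach (AutomationElement element in
                     mainWindow.FindAll(TreeScope.Descendants, Condition.TrueCondition))
            {
                Console.WriteLine("{0,-30} {1}",
                    element.Current.ControlType.ProgrammaticName,
                    element.Current.Name);
            }
        }
    }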
This is Rajadurai from Syncfusion. Thank you for your interest in Syncfusion products. To make UI Test Automation recognize Syncfusion grids (WinForms), some internal support needs to be provided in the grid; its implementation is in progress and about to be completed. Please submit an incident through Direct-Trac at the following link for any further related inquiries:
http://www.syncfusion.com/Account/Logon?ReturnUrl=%2fsupport%2fdirecttrac
You can also contact us through support@syncfusion.com. We are happy to assist you.
Regards,
Rajadurai
I come from mainly a web development background (ASP.NET, ASP.NET MVC, XHTML, CSS, etc.) but have been tasked with creating/designing a Silverlight application. The application uses the Bing Maps control for Silverlight; this will be contained in a user control and will be the 'main' screen in the system.
There will be numerous other user controls on the form that will be used to choose/filter/sort/order the data on the map. I think of it like Visual Studio: the Bing Maps control will be like the code editor window and the other controls will be like Solution Explorer, Find Results, etc. (although a lot fewer of them!).
I have read up and I'm comfortable with the data side (RIA Services) of the application. I've (kinda) got my head around data binding and using a view model to present data and keep the code-behind file light.
What I do need some help with is the UI design/navigation framework, specifically two aspects:
How do I best implement a fluid design so that the various user controls which filter the map data can be resized/pinned/unpinned (for example, like the Solution Explorer in VS)? I made a test using a Grid with a GridSplitter control; is this the best way? Would it be better to create a Grid/GridSplitter with navigation Frames inside the grid to load the content?
Since I have multiple user controls that basically use the same set of data, should I set the DataContext at the highest possible level (e.g. if using a Grid with multiple Frames, at the Grid level)?
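To illustrate that second point, this is roughly what I have in mind (all the names here are invented):

    using System.Collections.ObjectModel;
    using System.Windows.Controls;

    // One view model instance shared by the map and all of the filter panels.
    public class MapScreenViewModel
    {
        public ObservableCollection<MapItem> Items { get; private set; }

        public MapScreenViewModel()
        {
            Items = new ObservableCollection<MapItem>();
        }
    }

    public class MapItem
    {
        public string Name { get; set; }
        public double Latitude { get; set; }
        public double Longitude { get; set; }
    }

    public partial class MainPage : UserControl
    {
        public MainPage()
        {
            InitializeComponent();

            // LayoutRoot is the outer Grid that also hosts the GridSplitter panes.
            // Setting the DataContext once here means the Bing Maps control and
            // every filter/sort user control inside the Grid inherit it.
            LayoutRoot.DataContext = new MapScreenViewModel();
        }
    }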
Any help, tips, links etc. will be very much appreciated!
Microsoft has created a great community site for helping people get started with both design and Silverlight here: http://www.microsoft.com/design/toolbox/
It may be far more than what you need for your current project, but it definitely will give you the training you need to master Design with Silverlight.
While there are a lot of 3D libraries out there, I'm struggling to find one suitable for WPF.
Basically, I want a Character Animation engine, which loads bone hierarchy and allows me to manipulate the skinned mesh.
I know this is a classic topic for all the 3D engines, and that they are made for building games.
How do I display a Skinned Character in a WPF application?
It depends on how broadly you want to distribute your app and provide installation support for it, and on how much work you want to do.
1) You can always do it yourself, but you've probably already decided you don't want to spend 2 years of your life building a render pipeline, learning the vagaries of IK, etc.
2) You could target XNA. This is sort of like WPF, will run on Windows, and on the Xbox to boot. One package you could consider for XNA is Visual 3D; you can find a list of engines here.
3) IF you either can access the target machines directly or can release your app as a standalone WPF application, you've got a lot of options. All you need is a C# wrapper that allows you to call a native implementation. The one thing you'll lose is WPF's ability to superimpose controls, because your render surface will most likely be a WinForms control embedded in a WPF UI (there's a rough sketch of this after the list). You will need to get the wrapper DLL into the GAC if you want to distribute the app broadly.
3a) Check out the Blender community. The entire tool is open source, and there are a lot of smart people playing in that space.
4) I'd tout my own engine, but it's undergoing a thorough revision and won't be out again for quite a few months. We'd provide WPF/Silverlight support via option 3: a .NET wrapper over a C++ core installed directly into the GAC, which makes it available to WPF/Silverlight. I believe we'll still have to pretend to be a WinForms control to allow the D3D render surfaces to punch through onto the screen.
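For option 3, the plumbing usually ends up looking something like the sketch below: a WinForms control hosted inside the WPF window via WindowsFormsHost, with its window handle passed to the native renderer. NativeEngine, MyEngine.dll and HostGrid are placeholders for your own wrapper and XAML, not real APIs:

    using System;
    using System.Runtime.InteropServices;
    using System.Windows;
    using System.Windows.Forms.Integration;

    // Hypothetical P/Invoke wrapper around your C++ renderer.
    static class NativeEngine
    {
        [DllImport("MyEngine.dll")]
        public static extern void Initialize(IntPtr hwnd);
    }

    public partial class MainWindow : Window
    {
        public MainWindow()
        {
            InitializeComponent();

            // The native engine draws onto a plain WinForms panel...
            var renderSurface = new System.Windows.Forms.Panel();

            // ...which is embedded in the WPF visual tree through WindowsFormsHost.
            // This is the "pretend to be a WinForms control" part: WPF cannot
            // overlay its own controls on top of this airspace.
            var host = new WindowsFormsHost { Child = renderSurface };
            HostGrid.Children.Add(host);   // HostGrid: a Grid assumed to exist in the XAML

            Loaded += (s, e) =>
            {
                // Hand the Win32 handle to the engine so it can create its
                // D3D/OpenGL swap chain on that surface.
                NativeEngine.Initialize(renderSurface.Handle);
            };
        }
    }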
Hope this helps
PS - one side question: you capitalized Virtual Human -- you aren't referencing the NIH Visible Human Project, are you? If so, last I knew you had to assemble the geometry/bones yourself; all it supplies is the tomographic slices.
Well you could try this:
OGRE Game engine in WPF
If that OGRE engine supports bones, you should be good to go.
More about the OGRE Engine
Good Luck!
There is no support for skinned meshes in WPF. But you can animate other things, like robots, quite well in WPF (see the ape walk example at www.okino.com).
You can make meshes deform in WPF by changing the position values. I have done this by creating a control which exposes properties to the WPF animation system. It just takes a lot of effort.
Here is my unfinished WPF app. Notice the ball joints in the robot. The only place I really need mesh deformations is in the face, so she can talk.
Since MS dropped support for WPF 3D in Visual Studio, it's really hard to make realistic animations. Although with UIElement3D you could write an app that lets people drag the robot's limbs around and then take a snapshot: an animation system on top of the WPF animation system.
I have written a plugin for Blender which allows you to export 3D models from Blender.
So if you want a skinned mesh for WPF, you will have to write your own. Bones are really just transforms which act on part of the mesh.
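To give an idea of the "control which exposes properties to the animation system" approach mentioned above, here is a minimal, made-up sketch: a single Bulge dependency property rewrites the mesh positions whenever it changes, so it can be driven by a DoubleAnimation or Storyboard like any other WPF property. A skinned mesh is the same idea scaled up, with each bone transform moving its share of the vertices.

    using System.Windows;
    using System.Windows.Media;
    using System.Windows.Media.Media3D;

    // A tiny "deformable" quad: animating Bulge rewrites the mesh positions.
    public class BulgingQuad : ModelVisual3D
    {
        private readonly MeshGeometry3D _mesh = new MeshGeometry3D();

        public static readonly DependencyProperty BulgeProperty =
            DependencyProperty.Register("Bulge", typeof(double), typeof(BulgingQuad),
                new PropertyMetadata(0.0, (d, e) => ((BulgingQuad)d).Rebuild()));

        public double Bulge
        {
            get { return (double)GetValue(BulgeProperty); }
            set { SetValue(BulgeProperty, value); }
        }

        public BulgingQuad()
        {
            _mesh.TriangleIndices = new Int32Collection(new[] { 0, 1, 2, 0, 2, 3 });
            Rebuild();
            Content = new GeometryModel3D(_mesh, new DiffuseMaterial(Brushes.SteelBlue));
        }

        private void Rebuild()
        {
            // Push the two top vertices out along Z by the current Bulge value.
            _mesh.Positions = new Point3DCollection(new[]
            {
                new Point3D(-1, -1, 0),
                new Point3D( 1, -1, 0),
                new Point3D( 1,  1, Bulge),
                new Point3D(-1,  1, Bulge),
            });
        }
    }

Drop an instance into a Viewport3D and animate Bulge with the standard WPF animation machinery; a real face rig would expose one such property per "muscle" and recompute only the affected vertices.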