Is there a recommended approach for updating actively used Silverlight MEF applications? An update might involve a few related MEF components or something more significant. I want to avoid having the runtime attempt to load incompatible components (e.g., component A was loaded before the update, but component B is loaded after it).
Is there a mechanism built into .NET, Silverlight, or MEF to support versioning, or should we have parallel deployments with a launch page that redirects to the latest version of the application?
Are all your components in one XAP, or are you using the DeploymentCatalog to download multiple XAPs, or what? If they are all in one XAP it will all be downloaded as a unit and you shouldn't have to worry about different versions of components being used at once.
If you are using multiple XAPs, then you probably either have a list of additional XAPs to download in your primary XAP, or you have a service that the application calls to figure out what components are available. In either case you can have the list of XAPs be different between the versions of your app and point to the correct versions of your components.
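For example, a rough sketch of the multi-XAP case using DeploymentCatalog, where the XAP names come from the version-specific list in your primary XAP or from a service call (the file name below is made up):

```csharp
using System.ComponentModel.Composition.Hosting;

// Rough sketch for Silverlight's DeploymentCatalog. Only the XAPs listed for
// the current application version get downloaded, so old and new components
// are never mixed in one session.
public class VersionedCatalogLoader
{
    private readonly AggregateCatalog _catalogs = new AggregateCatalog();

    public CompositionContainer Container { get; private set; }

    public VersionedCatalogLoader()
    {
        Container = new CompositionContainer(_catalogs);
    }

    public void DownloadXap(string relativeUri)
    {
        var catalog = new DeploymentCatalog(relativeUri);
        catalog.DownloadCompleted += (s, e) =>
        {
            // e.Error tells you whether the download failed; report it here.
        };
        _catalogs.Catalogs.Add(catalog); // parts become available once downloaded
        catalog.DownloadAsync();
    }
}

// Usage with a hypothetical version-specific file name:
// loader.DownloadXap("ComponentB_v2.xap");
```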
We are in the middle of developing a new CRM, which uses WPF for local users and a Windows Universal App (Store App) for the users in the field. The basic flow is this:
Customer calls in and gets scheduled with a field user in the WPF app.
Field user goes on the service call and records updates on a Surface through the Universal App.
Customer gets billed from the WPF app.
All the modules are in place and working, but I do not seem to be able to integration test the entire project flow due to the different Project Types.
What I need to be able to do is add a reference to my Universal App in the unit test for my WPF app (or the other way around), so I can test the flow through both components.
I have been searching for a solution over the last few weeks and have not been able to come up with any way to do this. If it is not directly possible (which I am guessing it is not), is there any way I can set up a test playlist so that the integration tests run in a particular order?
I am looking to avoid hard coding sample data into the db in order to test the Universal app, as this would not show the true flow of data from one app to another and back again. Any help is appreciated.
UPDATE 8-21-2015:
We realize that because of the framework difference, the best solution is going to be to run some type of ordered test. We are using MSTest. The problem is that ordered tests in Visual Studio are not solution-wide; they are only allowed within a single unit test project, which means the ordered test uses that project's references and cannot include tests from both apps.
We don't care if we have to switch to a different testing framework, as having the proper tests is more important than using a specific framework.
You should use a Portable Class Library (PCL). You can create a PCL in Visual Studio from the Add New Project dialog.
When creating your PCL it will ask you what types of projects you want to target. Since you stated WPF and Windows Store app, you would target .NET 4.5 (or whatever version you are on) and Windows 8.1 (or 8.0).
The PCL only lets you write code against the lowest common set of runtime and language features of the targeted projects; in your case that is the Windows Store app.
Now, instead of keeping your logic inside your Windows Store project, you move that logic into the PCL.
Once you have compiled your PCL, add a reference to it from the Windows Store app so you can use the logic there (you will have to import the PCL's namespace to make this work). You also add a reference to the PCL in your unit test project. Because the PCL targets both runtimes, you can call the same code from your integration tests.
Since all your logic is in the PCL, you should be able to test everything. The only thing left in your Windows Store app should be the view-specific code.
If you ever need to change the targets of your PCL (let's say you want it to work with Windows Phone as well), you can edit the properties of the PCL project and change the targets. Be warned that if you add a target with a more restricted runtime than the ones you currently have selected, you will have to refactor your code to use only features available in that most restricted runtime.
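To make that concrete, here is a minimal sketch: a hypothetical ServiceCall class that would live in the PCL so both apps share it, and an MSTest integration test that references only the PCL (all names are made up):

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

// Hypothetical shared type that would live in the PCL so the WPF app and the
// Universal App both run the same logic.
public class ServiceCall
{
    public string Status { get; private set; }

    public ServiceCall() { Status = "Scheduled"; }          // scheduled in the WPF app
    public void CompleteOnSite() { Status = "Completed"; }  // updated in the Universal App
    public void Bill() { Status = "Billed"; }               // billed from the WPF app
}

// Integration test in the test project, which references only the PCL, so it
// exercises the same code path both apps use.
[TestClass]
public class ServiceCallFlowTests
{
    [TestMethod]
    public void CallFlowsFromSchedulingToBilling()
    {
        var call = new ServiceCall();
        call.CompleteOnSite();
        call.Bill();
        Assert.AreEqual("Billed", call.Status);
    }
}
```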
I am part of a team of .NET developers and we're trying to use the DNN platform as a way to have a website template so that we will not have to spend weeks or months building core functionality such as authentication, permissions, navigation, etc. However, I'm very confused about how the platform works and how it's installed. I've spent many hours researching online at http://www.dnnsoftware.com/ as well as on other sites, which only added to my confusion. Here are some specific questions which are still unanswered:
Do we install the source code or not? http://www.dnnsoftware.com/wiki/how-to-install-the-source-package-of-dotnetnuke says that it's not recommended to install source code. On the other hand, http://www.dnnsoftware.com/wiki/packages says that we should use the source code if we are developers (which we are).
If we don't use source code, how do we write code which will be used to add functionality, style, or business logic to our site? Where exactly do we put this code?
I keep on seeing the term "module" being thrown around. What in the world is a module?? Is it a separate .csproj file? Is it a .cs file saved as part of the website? If so, how would we incorporate it without the source code?
Like any other application, we need to be able to maintain full control of builds and deployments. With this, we can see the history of what we did, roll back changes if necessary, etc. Currently, for our other projects, we build with TeamCity and deploy with OctopusDeploy. Where does that fit into working with DNN without source? I also know that DNN is set up as a web site project, not a web application project (see http://www.dnnsoftware.com/forums/threadid/338902/scope/posts/threadpage/1), and web site projects are a project type that is not really being maintained in newer versions of Visual Studio and may be harder to deploy as well. Assuming I DON'T want to convert (http://blogs.msdn.com/b/webdev/archive/2009/10/29/converting-a-web-site-project-to-a-web-application-project.aspx), how would I build/deploy the web site project?
http://blogs.msdn.com/b/webdev/archive/2009/10/29/converting-a-web-site-project-to-a-web-application-project.aspx seems to state that it's not recommended to remove dependencies from DNN and replace them with other ones. If that's really true, it makes the whole platform seem very fragile and makes me wonder if I'm using the wrong tool altogether. Was DNN really meant for developers or not? (And if not, what was the intended use?)
Start here->
http://www.christoc.com/Tutorials/All-Tutorials/aid/1
1) Don't touch the DNN source, trust me, it isn't worth the headache
2) You add functionality, override styles, etc. through the use of extensions (modules and skins).
3) A separate .csproj (check out my templates: http://www.christoc.com/Tutorials/All-Tutorials/aid/2).
4) You deploy by taking the ZIP file for each extension and either uploading it through the Host > Extensions page, or dropping the ZIP into /install/module/ in the root of your deployment target and then having a process call /install/install.aspx?mode=installresources (see the sketch after this list).
5) DNN is definitely meant for developers, but it is a framework: build on the framework, don't go in and start modifying the framework itself.
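To illustrate point 4, a deployment step (for example, something TeamCity or Octopus Deploy could run) might look roughly like this; the class name, site root path, and site URL are assumptions about your environment, not anything DNN ships:

```csharp
using System.IO;
using System.Net;

// Rough sketch of automating the "drop the ZIP and call installresources" flow.
public static class DnnModuleDeployer
{
    public static void Deploy(string packageZip, string siteRoot, string siteUrl)
    {
        // 1) Drop the extension package where DNN looks for pending installs.
        string installFolder = Path.Combine(siteRoot, "install", "module");
        File.Copy(packageZip,
                  Path.Combine(installFolder, Path.GetFileName(packageZip)),
                  overwrite: true);

        // 2) Ask DNN to install everything waiting in that folder.
        using (var client = new WebClient())
        {
            client.DownloadString(siteUrl + "/install/install.aspx?mode=installresources");
        }
    }
}
```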
I would start by getting a DNN site running locally. This is fairly straightforward for any .NET developer.
A module is an extension for the DNN framework that you can essentially (once correctly installed) drop on a DNN page (referred to as a tab). All your business logic goes in your own modules, and the code for those modules is the only thing you have to put under source control. Do not make core changes to DNN, as they will be blown away if you ever upgrade.
You do not need to use Christoc's module template if your module only needs to be deployed once; I find that it brings in a lot of unnecessary components and references you probably will not need. Create your modules as web user controls that inherit from DotNetNuke.Entities.Modules.PortalModuleBase. Drop the .ascx file in its own folder under DNN's DesktopModules folder and put all required .dlls in DNN's bin folder. In DNN, go to Host > Extensions and create a new extension. Add a module definition to the extension and register your .ascx files as controls (leave your default view's key blank). Other views should have unique keys, and you can navigate to them in DNN using EditUrl("KeyName").
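A minimal code-behind for such a control might look like this; the folder, class name, and "Edit" key are just placeholders:

```csharp
// Code-behind for a hypothetical view control, e.g.
// DesktopModules/MyCompany.Orders/View.ascx (names are illustrative only).
using System;
using DotNetNuke.Entities.Modules;

public partial class View : PortalModuleBase
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // PortalModuleBase gives you ModuleId, TabId, PortalId, settings, etc.

        // Link to the control registered under the key "Edit":
        // string editLink = EditUrl("Edit");

        // Navigate to another DNN page (tab):
        // string pageLink = DotNetNuke.Common.Globals.NavigateURL(targetTabId);
    }
}
```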
Drop your module on a DNN page and go from there.
This is of course an oversimplification, but it should get you going. There are many tutorials online that I advise you to watch to learn the basics, like Globals.NavigateURL() for navigating between tabs and how DNN is put together. This forum topic might also assist you: http://www.dnnsoftware.com/answers/dnn-7-module-development-step-by-step-tutorial
I understand this query may sound very generic, but any pointers or guidance are highly appreciated.
We are building a WPF application based on the MVVM design pattern (with DevExpress controls) and want to achieve the following.
There will be one .EXE file which runs the application.
All my XAML views need to be compiled into DLLs [Why? - Reasons specified below].
Call the DLLs dynamically and load them within the WPF application.
[Why?? - Reasons]
Our application is a client/server intranet application. We are planning to use ClickOnce deployment to deploy it. In case there are any changes to the application, my thought is that we shouldn't have to change, recompile, and redeploy the entire application for the client machines to be updated.
Instead, we can change only those screens or XAML files that need it, update them on the server, and ClickOnce will automatically handle the updates on the clients. This also helps us maintain the application and is less troublesome for the developers and UI designers.
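To make the idea concrete, this is roughly the kind of loading we have in mind (the assembly and type names below are hypothetical):

```csharp
using System;
using System.Reflection;
using System.Windows.Controls;

// Each screen lives in its own view assembly; the shell EXE loads it by name
// at runtime instead of referencing it at compile time.
public static class ViewLoader
{
    public static UserControl LoadView(string assemblyPath, string viewTypeName)
    {
        // Load the view assembly that was deployed alongside the EXE.
        Assembly viewAssembly = Assembly.LoadFrom(assemblyPath);

        // Instantiate the XAML-backed UserControl by its full type name.
        Type viewType = viewAssembly.GetType(viewTypeName, throwOnError: true);
        return (UserControl)Activator.CreateInstance(viewType);
    }
}

// Usage in the shell, with made-up names:
// contentRegion.Content = ViewLoader.LoadView("Orders.Views.dll", "Orders.Views.OrderEntryView");
```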
We are also open to a better approach.
Is it possible to unload a DLL that I previously loaded dynamically into my app?
Background/comments:
1.- We have a requirement that third-party developers will implement a wizard-like activity that will be dynamically loaded and executed inside our Silverlight application.
2.- We will probably use MEF to pull the XAP and DLL catalog into the Silverlight AppDomain.
3.- With MEF it's possible to unload the catalog objects, but the DLLs will remain loaded in the AppDomain.
What we are looking for is a way to get rid of those DLLs in memory, as the appliance that runs the SL application can remain powered on for a long time, and we don't want to pollute its memory with unnecessary DLLs.
Any ideas?
You can't unload a single DLL from an AppDomain on any version of the CLR; the only option for unloading DLLs is unloading the entire AppDomain.
Unfortunately, you can't create your own AppDomains in Silverlight as far as I know, but you could always have multiple Silverlight apps on the same page.
I wonder, though, if it's a long-running app, whether it wouldn't be better to look at desktop .NET. You could use the XBAP deployment format if you still want to run your app in the browser.
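For comparison, on desktop .NET the AppDomain route looks roughly like this; the plug-in type name is made up, and in a real scenario the type would come from a third-party assembly loaded only into the plug-in domain, so unloading that domain frees its DLLs:

```csharp
using System;

// The plug-in type must derive from MarshalByRefObject so it can be called
// across the AppDomain boundary. "WizardActivity" is a hypothetical name; it
// is defined here only to keep the sketch self-contained.
public class WizardActivity : MarshalByRefObject
{
    public void Run()
    {
        Console.WriteLine("Running in: " + AppDomain.CurrentDomain.FriendlyName);
    }
}

public static class PluginHost
{
    public static void Main()
    {
        // Load the plug-in into its own AppDomain instead of the default one.
        AppDomain pluginDomain = AppDomain.CreateDomain("PluginDomain");
        var activity = (WizardActivity)pluginDomain.CreateInstanceAndUnwrap(
            typeof(WizardActivity).Assembly.FullName,  // in practice: the third-party assembly
            typeof(WizardActivity).FullName);

        activity.Run();

        // Unloading the whole domain is the only way to get its DLLs out of memory.
        AppDomain.Unload(pluginDomain);
    }
}
```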
I have a Silverlight project where functionality is segregated across multiple Silverlight libraries due to the size and complexity of the application. I am having problems figuring out the best way to decouple the RIA Domain Service that gets generated from the website project. I need to be able to access data from the other libraries, as they will be loaded dynamically into the main Silverlight application as needed.
I ended up taking the code that Visual Studio generates in the Generated_Code directory of the main Silverlight application and creating multiple Silverlight libraries to separate the RIA DomainContext, the authentication service, the entities, and the other domain services we had written. I then extracted interfaces for the DomainContext, etc. and put them in their own library. Using Microsoft's Unity framework for Silverlight, I was then able to decouple all my modules from the main project. All my modules now use the interfaces. There is one IoC container in the main application where I register all of the classes that implement the interfaces, and they get injected into the pages as they are instantiated. Not that complicated after all. The only thing to remember is to leave the EnableClientAccess attribute on the domain service classes on the server, but remove the ASP.NET server project link from the main Silverlight application. I read that they are planning to make this easier in the final release of RIA Services/Silverlight 3, since other people have complained about the tight coupling created by the current setup.
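As a rough sketch of the registration side (the interface, wrapper class, and member names here are invented for illustration, not the generated ones):

```csharp
using Microsoft.Practices.Unity;

// Hypothetical interface extracted from the generated DomainContext, plus a
// wrapper that implements it by delegating to the generated RIA code.
public interface ICustomerContext
{
    void LoadCustomers();
}

public class CustomerContextWrapper : ICustomerContext
{
    public void LoadCustomers()
    {
        // Internally calls the generated DomainContext, e.g. _context.Load(...).
    }
}

public static class Bootstrapper
{
    public static IUnityContainer Configure()
    {
        // One container in the main application; dynamically loaded modules
        // only ever see the interfaces.
        var container = new UnityContainer();
        container.RegisterType<ICustomerContext, CustomerContextWrapper>(
            new ContainerControlledLifetimeManager()); // single shared instance
        return container;
    }
}

// A module resolves the interface without referencing the web project or the
// generated code directly:
// var context = container.Resolve<ICustomerContext>();
```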