Normally, I put what I consider the constitutional root resources, such as brand colours/brushes, fonts and sizes down in the 'distrib' assembly.
The Distrib assembly (or assemblies) is designed to go out to 3rd party dev shops, so they have access to our contracts, interfaces, and branding styles.
More complex resources are then built up and declared 'nearer' where they're used.
I've come along to an application that has grown organically, err, haphazardly. What's odd is that the modules use resources from the main app executable project, even though the modules don't reference the app.
I assume that because they're 'importing' all the resources into App.xaml, they're available to the pseudo-runtime designer context.
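To illustrate (assembly and file names here are purely illustrative), merging a dictionary into the application-level resources - whether in App.xaml markup or in code like the sketch below - makes it resolvable from every view in every module the app loads, regardless of project references:

```csharp
// Rough sketch only - the assembly name and XAML path are made up for illustration.
using System;
using System.Windows;

public partial class App : Application
{
    protected override void OnStartup(StartupEventArgs e)
    {
        base.OnStartup(e);

        // Pack URI pointing into the 'distrib' assembly's compiled XAML resources.
        var brandResources = new ResourceDictionary
        {
            Source = new Uri("pack://application:,,,/Distrib;component/Themes/Brushes.xaml",
                             UriKind.Absolute)
        };

        // Anything merged here is visible to {StaticResource}/{DynamicResource}
        // lookups from every view in every module the app loads.
        Application.Current.Resources.MergedDictionaries.Add(brandResources);
    }
}
```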
My question is: if this is how MS designed it to work, have I been doing it wrong all along by managing resources the way I manage a type system?
Thanks
Luke
** UPDATE **
So, it was pointed out to me that well organised resources are not the way to go in WPF due to a severe performance problem (much as I had found in a big SL4 app I worked on, but assumed it was an SL thing).
Assuming that managing resources in this highly organised way can still be done with a trick or two, and that modular systems often need to merge dictionaries, I began to look into using Christian Moser's SharedResourceDictionary solution, but I get a problem at design time only:
System.IO.Packaging.PackUriHelper
The URI prefix is not recognized.
at System.Net.WebRequest.Create(Uri requestUri, Boolean useUriBase)
at System.Net.WebRequest.Create(Uri requestUri)
at MS.Internal.WpfWebRequestHelper.CreateRequest(Uri uri)
at System.Windows.ResourceDictionary.set_Source(Uri value)
at CompanyName.Presentation.SharedResourceDictionary.set_Source(Uri value)
It looks like it doesn't understand the pack URIs, which is odd since the SharedResourceDictionary just calls down to the original MS implementation in ResourceDictionary, and registering the pack URI scheme statically doesn't help either!! Grrr.
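For reference, the class in question is essentially just a caching wrapper around ResourceDictionary - roughly this (a sketch after Christian Moser's article, not a verbatim copy), which is why the designer's pack-URI error surfaces through its Source setter:

```csharp
using System;
using System.Collections.Generic;
using System.Windows;

public class SharedResourceDictionary : ResourceDictionary
{
    // Dictionaries already loaded, keyed by source URI, so each XAML file is
    // parsed once per AppDomain instead of once per merge.
    private static readonly Dictionary<Uri, ResourceDictionary> SharedCache =
        new Dictionary<Uri, ResourceDictionary>();

    private Uri sourceUri;

    public new Uri Source
    {
        get { return sourceUri; }
        set
        {
            sourceUri = value;

            ResourceDictionary cached;
            if (!SharedCache.TryGetValue(value, out cached))
            {
                // First use: load through the normal WPF mechanism (this is the
                // call that reaches ResourceDictionary.set_Source), then cache it.
                base.Source = value;
                SharedCache.Add(value, this);
            }
            else
            {
                // Subsequent uses: merge the cached instance instead of re-parsing.
                MergedDictionaries.Add(cached);
            }
        }
    }
}
```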
So I need to crack on, and the second option is to smash everything into App.xaml and avoid merged dictionaries.
This means fewer controls/views, and setting up a design-time dictionary in my distributable library, which I guess does the job of the App.xaml that the 3rd parties won't have access to.
I think that makes sense.
Interesting? Tell Microsoft
It may be for Silverlight, but I'm hoping the WPF folks might be listening, or at least it might fix one platform -- I've added an 'idea' to the UserVoice site that you can vote up.
http://dotnet.uservoice.com/forums/4325-silverlight-feature-suggestions/suggestions/2307678-fix-the-mergeddictionary-perf-problem
Yes, the App.xaml thing seems to be kind of how it's 'supposed' to work, although obviously other ways are possible, as you've found. The performance problem is irritating though, and the App.xaml way is also irritating because the resources don't resolve at design time (at least, they don't for us; if they do for you, I want to know why).
However, putting them in App.xaml is the only technique I've found anything approaching an 'official' statement about.
Related
I've been trying to both convert old code and write new code using GetScopedService().
However, I keep discovering services I didn't know about.
Is there an easy way to find the complete list of services available for 2sxc? For DNN? And maybe even RazorBlade?
If they are not documented somewhere, is there a page of code in the public repositories that I could bookmark where it would be easy to see (and compile) a list of them?
Your best place to start is https://r.2sxc.org/services (which goes to https://docs.2sxc.org/api/dot-net/ToSic.Sxc.Services.html)
This is where we keep all the current services published. Other services should be treated as exotic / rare-use.
Razor-Blade is still mostly non-service, but we plan to fix that.
We're just about to release ServiceKits as a feature, which would make things even more intuitive. For example, ServiceKit14 has all commonly used services on it, and also IScrub from Razor Blade.
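For orientation, usage inside a Razor file looks roughly like the sketch below. It's only a sketch from memory - please verify the exact type and member names against the docs linked above.

```csharp
// Inside a 2sxc Razor14 template (which exposes GetService<T>() and the Kit property).
// Sketch only - confirm names like IPageService and Kit.Scrub against the service docs.

// Resolve a service explicitly...
var pageSvc = GetService<ToSic.Sxc.Services.IPageService>();
pageSvc.SetTitle("My page title");

// ...or use the pre-wired ServiceKit14, which also carries Razor Blade's IScrub.
var plainText = Kit.Scrub.All("<p>Hello <b>world</b></p>");
```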
Good day!
I want to distribute a C# application and want to protect it.
I need:
obfuscation - protection of the source code + text resource files.
error reporting - a report on unhandled errors.
a clear (de-obfuscated) view of the obfuscated stack trace.
assurance that there are no changes to the source code.
What problems can arise due to the obfuscation (e.g. serialization / deserialization / reflection / globalization)? How complex are the solutions to these problems?
What methods / tools / approaches do you recommend?
Thanks for the help!
Disclaimer: I work for Red Gate.
SmartAssembly does what you're after. For your points in turn:
1) It does control flow obfuscation, method / field renaming, compression / encryption of resources and embedded strings, and separation of methods from their containing classes.
2) Automated error reporting automatically detects and reports unhandled exceptions (it also grabs and sends the stack trace, the values of all local variables, and some general system info).
3) The obfuscated stack trace gets decoded again on your machine so you can see it in clear view.
4) Not 100% sure what you mean by this, but tamper protection prevents the app from running at all if any modifications are made to it. If you mean you don't want to make changes to your own source code, it runs as a post-build process so doesn't need any changes to be made to the source.
Re problems you might get with obfuscation, by far the most common are caused by reflection (as a result WPF often causes problems), and data binding causes lots of issues too. Most obfuscators should let you exclude individual types and methods which have problems with reflection, though obviously that leaves those types and methods unprotected.
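As a concrete example of the exclusion approach (hedged - whether the attribute is honoured, and how its Feature string is interpreted, varies by obfuscator, so check your tool's docs or use its own exclusion settings):

```csharp
// System.Reflection's ObfuscationAttribute - recognised by many obfuscators.
using System.Reflection;

// A WPF view-model bound to by property name: renaming its members would
// silently break the bindings, so exclude the type and its members.
[Obfuscation(Exclude = true, ApplyToMembers = true)]
public class CustomerViewModel
{
    public string Name { get; set; }
    public decimal Balance { get; set; }
}

// A DTO that goes through reflection-based serialization: here only renaming
// is excluded (the "renaming" feature string is tool-specific).
[Obfuscation(Feature = "renaming", Exclude = true, ApplyToMembers = true)]
public class PayrollRecord
{
    public int EmployeeId { get; set; }
    public decimal Amount { get; set; }
}
```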
There are other obfuscators too - I know a couple of people who use one from PreEmptive called Dotfuscator.
Crypto Obfuscator supports all the features you are looking for, including obfuscation, code protection, and exception reporting (with automatic de-obfuscation as well as the full values of all method parameters and local variables).
Another unique feature of Crypto Obfuscator is the Warnings tab shown after obfuscation. This lists all lines of code in your assemblies which can potentially cause the obfuscated assembly to fail, so you don't have to shoot in the dark trying to figure out why obfuscated assemblies are not working.
DISCLAIMER: I work for LogicNP Software, the developer of Crypto Obfuscator.
Is there any way in Salesforce to group apex classes under a package or namespace? Can we use managed package for internal organization purpose?
This is a limitation in the force.com stack that makes medium-large size projects painful, if not impractical. Using managed packages in order to get a package prefix doesn't really solve any problems, so it's not really worth the trouble.
I usually try to organize a project into one flat level of namespaces. In lieu of actual namespaces, I'll give each would-be-namespace a 3-5 character name, to be used as a prefix. Any class that belongs in the "namespace" gets prefixed. E.g., if I need a payroll namespace, I'd use a PYRL prefix. A class called PaycheckCalculator becomes PYRL_PaycheckCalculator.
The practical advantage of this type of convention is that it helps prevent name clashes, and classes are grouped by their "namespace" when viewed in a sorted list, such as in an IDE or in Setup > Develop > Apex Classes.
Unfortunately, several basic OO principles are still fundamentally broken. Probably the most important one is that every class forms an implicit dependency on every other class it has visibility to - which is all of them.
I'd love to hear how others have worked around this limitation.
Well, you can use managed packages, but as Jeremy mentioned it doesn't really buy you much. Of course managed packages are essential for developing publicly listed apps to sell on the AppExchange. But internally it's really an org-wide problem since once you create a managed package with a prefix, everything that touches any other part of it gets stamped with the same namespace prefix, including all custom objects. And worse, you can't access code in a managed package from outside the managed package (which is actually the whole point of them in the first place).
Although it's not the prettiest solution, what I personally do is maintain numerous named orgs with different purposes, applications and utility classes. When I need a utility class in one org, say I'm building a new app destined for the AppExchange, I'll do an Eclipse Export/Import from the utility org in question. It definitely seems strange but having a library of orgs is the best way I've managed to keep track of everything and to manage "internal" organization. But the end result is really just a glorified copy-paste operation between arbitrary code stores.
I faced similar challenges while working on big projects, and wrote this blog post some time back to share the approach I am following now: http://www.tgerm.com/2011/11/apex-class-naming-convention-suggestion.html
I have quite a lot of experience programming with VB6, VB.NET, C# and so on, and have used ADO, then SubSonic, and now I am learning NHibernate, since most of the prospective jobs I can go for use NHibernate.
The thing is, I have been programming based on what I have been taught, read or come to understand as best practice. Recently, someone threw a spanner in the works and had me thinking. Up until now, I have been accessing the database(s) from both the core application and attached DLLs that I write.
What this person said is as follows, and hence my question:
I can tell you that you wouldn't normally want to do this - an external class library shouldn't have access to the database
What I was trying to do was to have a shared/static class for NHibernate sessions that could be consumed both in the global scope of the app and from any DLL. This class was to be in a "core" DLL which all DLLs and the application reference. Like I said, I'm learning NHibernate so it may not be the right way.
To say I'm questioning my database access methods is putting it lightly.
Can anyone put me straight on this?
Edit:
I suppose, looking at what has been commented already, it depends on how the database is being accessed. I would tend never to hardcode username/password credentials etc. in any DLL for any reason.
More specifically, my query is in relation to NHibernate's sessions. I have a static helper class which is called at application start; a new session is then created and attached to the current context (in the case of web applications), and whenever I need the session I call "GetCurrentSession". This static class is in the "core" DLL and can be accessed by any DLL that references it. This behaviour is intended. My only question is: is this OK? Should I be doing it another way?
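For clarity, the helper is essentially the following shape - a trimmed sketch, assuming current_session_context_class is set (e.g. to "web") in the NHibernate configuration; the class and method names are illustrative.

```csharp
using NHibernate;
using NHibernate.Cfg;
using NHibernate.Context;

public static class NHibernateHelper
{
    private static readonly ISessionFactory SessionFactory = BuildSessionFactory();

    private static ISessionFactory BuildSessionFactory()
    {
        // Reads hibernate.cfg.xml / web.config; connection details live in config,
        // not hard-coded in any DLL.
        return new Configuration().Configure().BuildSessionFactory();
    }

    // Called at application/request start (e.g. Application_BeginRequest in a web app).
    public static void BindNewSession()
    {
        CurrentSessionContext.Bind(SessionFactory.OpenSession());
    }

    // Called from the core app or any referencing DLL when a session is needed.
    public static ISession GetCurrentSession()
    {
        return SessionFactory.GetCurrentSession();
    }

    // Called at the end of the request / unit of work.
    public static void UnbindAndDisposeSession()
    {
        var session = CurrentSessionContext.Unbind(SessionFactory);
        if (session != null)
        {
            session.Dispose();
        }
    }
}
```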
A couple of reasons would be:
Access to the database - how do you cover off the username/password?
Sharing the DLL - a "bad" application may get hold of your DLL and link with it to get access to your database.
That said, if you have proper security on files etc., then I would have thought using a DLL would probably be a reasonable way to go.
Assuming that the username and password are not stored directly in the DLL (but maybe passed as parameters, or passed as a complete connection object) this isn't so bad.
The possible bad practice here might be accessing the same database for the same purpose from different places - core app and DLL. This could get confusing quickly to a new developer, unless the separation is clear and logical.
Myself, I might try to move ALL (or almost all) data access to a DLL just for that purpose, then have the serious application logic (or as much as possible) in the core app or yet another DLL.
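A small sketch of that shape - all data access behind an interface in its own DLL, with the connection supplied (already opened) by the host application rather than baked into the library; the names are illustrative:

```csharp
using System.Data;

// Lives in the data-access DLL.
public interface ICustomerRepository
{
    string GetCustomerName(int customerId);
}

public class CustomerRepository : ICustomerRepository
{
    private readonly IDbConnection _connection;

    // The host application decides how the connection is built, secured and opened.
    public CustomerRepository(IDbConnection connection)
    {
        _connection = connection;
    }

    public string GetCustomerName(int customerId)
    {
        using (var command = _connection.CreateCommand())
        {
            command.CommandText = "SELECT Name FROM Customers WHERE Id = @id";

            var parameter = command.CreateParameter();
            parameter.ParameterName = "@id";
            parameter.Value = customerId;
            command.Parameters.Add(parameter);

            // Sketch assumes the row exists and the connection is open.
            return (string)command.ExecuteScalar();
        }
    }
}
```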
During localhost testing of modular Prism-based Silverlight applications, the XAP modules download too fast to get a feel for the final result. This makes it difficult to see where progress, splash-screens, or other visual states, needs to be shown.
What is the best (or most standard) method for intentionally slowing down the loading of XAP modules and other content in a local development set-up?
I've been adding the occasional timer delay (via a code-based storyboard), but I would prefer something I can place under the hood (in say the Unity loader?) to add a substantial delay to all module loads and in debug builds only.
Suggestions welcomed*
*Note: I have investigated the "large file" option and it is unworkable for large projects (it even fails to create the XAP with really large files, giving an out-of-memory error). The solution needs to be code based and preferably integrate behind the scenes to slow down module loading in a local-host environment.
**Note: To clarify, we are specifically seeking an answer compatible with the Microsoft PRISM pattern & PRISM/CAL libraries.**
Do not add any files to your module projects. This adds unnecessary regression testing to your module, since you are changing the layout of the module by extending the non-executable portion. Chances are you won't do this regression testing, and who knows whether it will cause a problem. Best to be paranoid.
Instead, come up with a Delay(int milliseconds) step that you compose with the callback you use to retrieve the remote assembly.
In other words, decouple assembly resource acquisition from assembly resource usage. Between these two phases insert arbitrarily random amounts of wait time. I would also recommend logging the actual time it took remote users to get the assembly, and use that for future test points so that your UI Designers & QA Team have valuable information on how long users are waiting. This will allow you to cheaply mock-up the end-user's experience in your QA environment. Just make sure your log includes relevant details like the size of the assembly requested.
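A generic sketch of that idea - deliberately not tied to a particular Prism/CAL extension point; all names and delay figures are illustrative:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

public static class ModuleLoadSimulator
{
    private static readonly Random Jitter = new Random();

    // Wraps a "module downloaded" callback so that, in local DEBUG builds only,
    // it fires after a random network-like delay.
    public static Action<T> WithSimulatedLatency<T>(Action<T> onLoaded,
                                                    int minMs = 1500,
                                                    int maxMs = 5000)
    {
#if DEBUG
        return result =>
        {
            int delay = Jitter.Next(minMs, maxMs);

            // Log the simulated figure so it can be compared with the real
            // download times recorded from production users.
            Debug.WriteLine(string.Format("Simulating {0} ms of module download latency", delay));

            // One-shot timer; avoids blocking the UI thread while we wait.
            Timer timer = null;
            timer = new Timer(_ =>
            {
                timer.Dispose();
                // If the handler touches UI, marshal back to the UI thread here
                // (e.g. Deployment.Current.Dispatcher.BeginInvoke in Silverlight).
                onLoaded(result);
            }, null, delay, Timeout.Infinite);
        };
#else
        return onLoaded;
#endif
    }
}
```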
I posed a question on StackOverflow a few weeks ago about something related to this, and had to deal with the question you posed, so I am confident this is the right answer, born from experience, not cleverness.
You could simply add huge files (such as videos) to your module projects. It'll take longer to build such projects, but they'll also be bigger and therefore take longer to download locally. When you move to production, simply remove the huge files.