Azure AppService plans and Resource Groups - sql-server

I'm trying to get my head around the proper design of my resources in the Azure universe.
I have done the following as pre-reqs for future deployments:
+ created a Resource Group for SQL Server
+ created a SQL server
Let's call it RG-dev-SQL.
At the moment I have built myself deployment templates that kick off the following:
+ creates resource group RG-webapp-dev-someappName
+ creates App Service plans (1 basic / 1 shared): AppSp-someappname-B1 | AppSp-someappname-D1
+ creates a webapp called webapp-dev-someappname
+ uses one of the previously created App Service plans for the new web app
+ performs the deployment
This works - however, my question is whether this is the way to go: using a resource group per application that I deploy, and so repeating the process above for, say, App1...App33?
I'm interested in how other people approach this.
Thanks!

That's totally up to you.
An advantage of keeping related entities in the same resource group is that you can deploy all of them to the same region. So if your web site uses the SQL server you've created and you want response times to be minimal, it's good to use the same resource group.
I'd say that if you plan to use multiple services and combine their functionality, it's better to keep them in the same resource group.
Even if you later decide to split your resources among different resource groups, you can still move them.
Here's more detailed article about it:
https://azure.microsoft.com/en-us/documentation/articles/resource-group-move-resources/
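
If you script the per-app setup from the question, the pattern maps quite directly onto code. Below is a minimal sketch using the Azure Fluent management SDK (Microsoft.Azure.Management.Fluent); the credentials file, region, and all resource names are illustrative placeholders, not taken from the question:

    // Sketch: one resource group per application, holding that app's
    // App Service plan and web app, provisioned in one fluent call.
    using Microsoft.Azure.Management.AppService.Fluent;
    using Microsoft.Azure.Management.Fluent;
    using Microsoft.Azure.Management.ResourceManager.Fluent;
    using Microsoft.Azure.Management.ResourceManager.Fluent.Core;

    class DeployApp
    {
        static void Main()
        {
            // Assumes a service-principal credentials file; placeholder path.
            var credentials = SdkContext.AzureCredentialsFactory
                .FromFile("azureauth.properties");

            var azure = Azure.Configure()
                .Authenticate(credentials)
                .WithDefaultSubscription();

            // WithNewResourceGroup/WithNewWindowsPlan create the group and
            // the plan alongside the web app, all in the same region.
            var app = azure.WebApps
                .Define("webapp-dev-someappname")
                .WithRegion(Region.EuropeWest)
                .WithNewResourceGroup("RG-webapp-dev-someappName")
                .WithNewWindowsPlan(PricingTier.BasicB1)
                .Create();
        }
    }

Repeating this per application gives you exactly the RG-per-app layout described above, and moving resources later remains possible as the linked article explains.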

Related

Automate the execution of C# code that uses Entity Framework to process data?

I have code that uses Entity Framework to process data (it retrieves data from multiple tables, then performs operations on it before saving to a SQL database). The code was supposed to run when a button is clicked in an MVC web application that I created. But now the client wants the data processing to run automatically every day at a set time (like an SSIS package). How do I go about this?
In addition to adding a job scheduler to your MVC application as @Pac0 suggests, here are a couple of other options:
Leave the code in the MVC project and create an API endpoint that you can invoke on some sort of schedule (sketched below). Give the client a PowerShell script that calls the API and let them take it from there.
Or
Refactor the code into a DLL, or copy/paste it into a console application, that can be run on a schedule using the Windows Task Scheduler, SQL Server Agent or some other external scheduler.
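
For the first option, a minimal ASP.NET Web API sketch might look like this; JobsController and DataTreatmentJob are hypothetical names standing in for wherever the existing EF logic lives:

    // Hypothetical endpoint exposing the existing EF processing so an
    // external scheduler (or a PowerShell script) can trigger it.
    using System.Web.Http;

    public class DataTreatmentJob
    {
        public void Run()
        {
            // ... the existing Entity Framework logic goes here ...
        }
    }

    public class JobsController : ApiController
    {
        [HttpPost]
        public IHttpActionResult RunDataTreatment()
        {
            // Placeholder for the button-click logic, refactored
            // into a single entry point.
            new DataTreatmentJob().Run();
            return Ok("Data treatment completed");
        }
    }
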
You could use some tool/lib that does this for you. I could recommend Hangfire, it works fine (there are some others, I have not tried them).
The example on their homepage is pretty explicit:

    RecurringJob.AddOrUpdate(
        () => Console.WriteLine("Recurring!"),
        Cron.Daily);
The above code needs to be executed once when your application has started up, and you're good to go. Just replace the lambda with a call to your method.
Adapt the time parameter to whatever you wish, or even better: make it configurable, because we know customers like to change their minds.
Hangfire needs to create its own database, which usually stays pretty small for this kind of thing. You can also monitor whether the jobs ran well, and check some useful stats on the Hangfire server.
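
Putting the pieces together, a minimal Hangfire bootstrap might look like the sketch below (assumes the Hangfire and Hangfire.SqlServer NuGet packages; the connection string and job id are placeholders - a console host is shown, but the same calls work from an ASP.NET startup):

    // Minimal Hangfire bootstrap sketch. Hangfire creates its own tables
    // in the target database on first run.
    using System;
    using Hangfire;

    class Program
    {
        static void Main()
        {
            // Placeholder connection string; Hangfire stores jobs here.
            GlobalConfiguration.Configuration
                .UseSqlServerStorage("Server=.;Database=HangfireJobs;Integrated Security=true");

            // Register (or update) the daily recurring job; replace the
            // lambda with a call to the EF data-processing method.
            RecurringJob.AddOrUpdate(
                "daily-data-treatment",
                () => Console.WriteLine("Running data treatment..."),
                Cron.Daily());

            // The server polls the storage and executes due jobs.
            using (new BackgroundJobServer())
            {
                Console.WriteLine("Hangfire server running. Press ENTER to exit.");
                Console.ReadLine();
            }
        }
    }
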

commercetools - multiple catalogs

Doing some discovery with commercetools. I notice that in the REST JSON message when fetching a product, there is an element: catalog. It looks to be an array. But I have not found anything in the documentation that indicates whether there is an ability to have, say, different catalogs for a given application - for example, a Master/Child catalog structure.
So the question is: is there such a thing within this tool? And if so, how would one go about setting it up?
Yes, there is a structural element in the product master data that is a catalog-like concept. It is not implemented as a feature with behavior, though, and the current development plans do not intend to activate this structure for multi-catalog / multi-market / multi-XYZ cases.
But there are ongoing development activities to improve support for such requirements in a different way. You'd best watch the release notes to stay up to date.

Using a Reference Table Service in a WPF Application

For past projects (the last few have been ASP.NET MVC web applications) we created a service that caches our reference tables (as required), used primarily for dropdown lists.
Now I'm working on a desktop application - an upgrade from VB6/Sybase to VB.NET/SQL Server.
I'm trying out WPF.
I started down the same path, building up my DAL: one entity for each reference table.
I'm now at the stage where I want to set up the business layer (some reference tables can be edited), and I'm not sure if I should follow the same process, which is to use a ReferenceTableService to "manage" the reference tables (it interacts with the DAL and the controller).
This will be an application that sits on a network share and is run by multiple users.
What's the best way to deal with the reference tables? Caching them doesn't seem to be an option. Should I simply load them as each user opens a new form in the application? Perhaps using a "ReferenceTableService"?
In this case, the Reference Table Service is a thin layer in the application, not a process running elsewhere.
I haven't done much WPF (it will be interesting to see what the WPF gurus think), but I think your existing approach is sound and I don't see why you should deviate from it.
Loading up on app start sounds reasonable; you just have to think about the expected lifetime of a user session vs. the expected frequency of changes to the reference data.
Caching: if the data comes from a central service, you could always introduce caching there.
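
To make that session-lifetime trade-off concrete, here is a minimal sketch of what such an in-app ReferenceTableService could look like, with a per-table cache that reloads after a configurable age; all type names are illustrative, not from the question:

    using System;
    using System.Collections.Generic;

    public class ReferenceItem
    {
        public int Id { get; set; }
        public string Description { get; set; }
    }

    // Abstracts the DAL so the service can be tested; illustrative interface.
    public interface IReferenceTableDal
    {
        IList<ReferenceItem> Load(string tableName);
    }

    public class ReferenceTableService
    {
        private class CacheEntry
        {
            public DateTime LoadedAt;
            public IList<ReferenceItem> Items;
        }

        private readonly IReferenceTableDal _dal;
        private readonly TimeSpan _maxAge;
        private readonly Dictionary<string, CacheEntry> _cache =
            new Dictionary<string, CacheEntry>();

        public ReferenceTableService(IReferenceTableDal dal, TimeSpan maxAge)
        {
            _dal = dal;
            _maxAge = maxAge;
        }

        // Returns the cached table, reloading it when missing or stale, so
        // a long-running desktop session eventually sees other users' edits.
        public IList<ReferenceItem> Get(string tableName)
        {
            CacheEntry entry;
            if (!_cache.TryGetValue(tableName, out entry) ||
                DateTime.UtcNow - entry.LoadedAt > _maxAge)
            {
                entry = new CacheEntry
                {
                    LoadedAt = DateTime.UtcNow,
                    Items = _dal.Load(tableName)
                };
                _cache[tableName] = entry;
            }
            return entry.Items;
        }
    }

A short max age approximates loading per form; a long one approximates load-on-start. Editable reference tables would also invalidate their own cache entry after a save.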

Making an offline database application on WP7 - finding the right way

I need to build an offline database application on WP7.
The app is simple - it's about taking orders from our clients, then transferring them to the main server (MS SQL).
I've spent days reading about existing technologies, but I'm still confused. Which is right for this project?
Sync Framework.
Looks good, but as I understand it, it provides single tables with no references between them. All the references I would have to build on the client side. Sad.
Entity Framework on the server side.
And I have no clue what I can use on the client side. Is there a way to serialize an entity object to Isolated Storage, then restore it and continue working with it? Maybe I can use Sync Framework, but the scheme becomes strange then - kind of one-way.)))
Working with WCF & XML - the simplest option for me. A lot of code and conversion, but in this case I understand the data flow. On the other hand, I already have an app with pure SQL queries. I want to be more advanced. ))))
Using external databases (siaqodb, for example).
Which one? siaqodb supports a "Sync provider", but it doesn't support references between objects - so I have to build them myself? Any gain? I don't know.
Is there another way to build such apps? Please point it out.
If this has to be done offline, then I would generally use something like:
+ storing the minimal amount of required data in isolated storage, using a WP7-specific database like Sterling
+ using either a new REST or a new RIA/WCF service, with objects/functions you define, in order to provide the required data synchronisation
I think this is your option 3?
I've never really liked automatic data synchronisation. I just find it easier to code the sync and deal with the error cases myself - this is especially the case if your WP7 client app uses quite a small footprint of data in relation to the larger main server db.
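
To illustrate the hand-rolled approach, here is a minimal sketch of queuing an order in isolated storage until it can be pushed to the server; OrderDto, its fields, and the file name are placeholders, and DataContractJsonSerializer is used because it is available on WP7:

    // Sketch: persist pending orders in isolated storage so they survive
    // app restarts, then upload and delete them once the server confirms.
    using System.IO;
    using System.IO.IsolatedStorage;
    using System.Runtime.Serialization;
    using System.Runtime.Serialization.Json;

    [DataContract]
    public class OrderDto
    {
        [DataMember] public int ClientId { get; set; }
        [DataMember] public string ProductCode { get; set; }
        [DataMember] public int Quantity { get; set; }
    }

    public static class OfflineOrderStore
    {
        public static void Save(OrderDto order, string fileName)
        {
            using (var store = IsolatedStorageFile.GetUserStoreForApplication())
            using (var stream = store.CreateFile(fileName))
            {
                new DataContractJsonSerializer(typeof(OrderDto))
                    .WriteObject(stream, order);
            }
        }

        public static OrderDto Load(string fileName)
        {
            using (var store = IsolatedStorageFile.GetUserStoreForApplication())
            using (var stream = store.OpenFile(fileName, FileMode.Open))
            {
                return (OrderDto)new DataContractJsonSerializer(typeof(OrderDto))
                    .ReadObject(stream);
            }
        }
    }

The sync loop then becomes explicit: enumerate the stored files, POST each order to your REST/WCF endpoint, and delete the file only on a success response, which keeps the error handling in your own hands.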

FindByIdentity in System.DirectoryServices.AccountManagement Memory Issues

I'm working on an Active Directory management application. In addition to the typical tasks (create a new user, enable/disable an account, reset my password, etc.) it also manages application permissions for all of the client's web applications. Application management is handled by thousands of AD groups, built from 3-letter codes for the application, section and site; there are also hundreds of AD groups which determine which applications and locations a coordinator can grant rights to. All of these groups in turn belong to other groups, so I typically filter the groups list with the MemberOf property to find the groups that a user directly belongs to (otherwise everyone has rights to do everything).
I've made extensive use of the System.DirectoryServices.AccountManagement namespace, using the FindByIdentity method in 31 places throughout the application. This method calls a private method, FindPrincipalByIdentRefHelper, on the internal ADStoreCtx class. A SearchResultCollection is created but not disposed, so eventually - typically once or twice a day - the web server runs out of memory and all of the applications on the web server stop responding until IIS is reset, because the resources used by the COM objects are never released.
There are places where I fall back to the underlying directory objects, but there are a lot of places where I'm using the properties on the Principal - it's a vast improvement over using the esoteric AD property names in the .NET 2.0 Directory Services code.
I've contacted Microsoft about the problem and it's been fixed in .NET 4.0, but they don't currently have plans to fix it in 3.5 unless there is interest in the community.
I only found information about it in a couple of places:
the MSDN documentation - the community content at the bottom states there is a memory leak (guess I should have read that before using the method):
http://msdn.microsoft.com/en-us/library/bb345628.aspx
And the class in question is internal and doesn't expose the SearchResultCollection outside the offending method, so I can't get at the results to dispose them, or inherit from the class and override the method.
So my questions are:
Has anyone else encountered this problem? If so, were you able to work around it?
Do I have any option besides rewriting the application to avoid the .NET 3.5 Active Directory code?
Thanks
I have encountered the same error, and no, I don't have a workaround other than using the DirectoryEntry approach.
Wrap your calls to DirectorySearcher inside a using block, and also wrap the SearchResultCollection inside a using block (or call .Dispose() on the results explicitly). See the answer here:
Memory Leak when using PrincipalSearcher.FindAll()
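
To illustrate the DirectorySearcher workaround, here is a minimal sketch; the LDAP path, filter, and account name are placeholders:

    // Dropping down to System.DirectoryServices and disposing everything
    // deterministically avoids the leaked COM objects behind FindByIdentity.
    using System;
    using System.DirectoryServices;

    class AdLookup
    {
        static void Main()
        {
            using (var root = new DirectoryEntry("LDAP://DC=example,DC=com"))
            using (var searcher = new DirectorySearcher(root))
            {
                searcher.Filter = "(&(objectClass=user)(sAMAccountName=jsmith))";
                searcher.PropertiesToLoad.Add("memberOf");

                // SearchResultCollection holds unmanaged resources; dispose
                // it explicitly instead of waiting for finalization.
                using (SearchResultCollection results = searcher.FindAll())
                {
                    foreach (SearchResult result in results)
                    {
                        foreach (object group in result.Properties["memberOf"])
                            Console.WriteLine(group);
                    }
                }
            }
        }
    }
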
