Setting website Scheme programmatically - Episerver

I have a multisite application with around 40+ HTTPS sites. The Scheme column for all the sites is empty.
I tried creating a scheduled job that loops through all the sites using SiteDefinitionRepository and updates Scheme to HTTPS. The problem is that the Scheme property is read-only, so I cannot set it.
Is there a way I can set Scheme to HTTPS rather than doing it manually for the 40+ sites?

The SiteDefinition and HostDefinition implementations are IReadOnly.
Create a writable clone and set UseSecureConnection to true on each host definition:
// ISiteDefinitionRepository _siteDefinitionRepository (injected)
var sites = _siteDefinitionRepository.List();
foreach (var site in sites)
{
    var writableSite = site.CreateWritableClone();
    if (writableSite.SiteUrl.Scheme == Uri.UriSchemeHttp)
    {
        // Rebuild the URL rather than using string.Replace, which would
        // also corrupt host names that happen to contain "http".
        writableSite.SiteUrl = new UriBuilder(writableSite.SiteUrl)
        {
            Scheme = Uri.UriSchemeHttps,
            Port = -1 // use the default port for the new scheme
        }.Uri;
    }
    var hosts = writableSite.Hosts.Where(x => !x.Name.Equals(HostDefinition.WildcardHostName));
    foreach (var writableHost in hosts)
    {
        writableHost.UseSecureConnection = true;
    }
    _siteDefinitionRepository.Save(writableSite);
}
The code turns each site's empty/HTTP scheme into HTTPS (before/after admin-view screenshots omitted).

For what design reason is Asp.Net Core SessionKey different than the SessionId?

Some Background
In ASP.NET Core, when using SQL Server to store sessions, oddly enough the Id column in the SQL Server table gets set to the value of the sessionKey, which is a GUID generated by the SessionMiddleware. I say oddly because there is also a SessionId, but the Id column isn't set to that; it is set to the SessionKey. (I'm not making this up.)
This sessionKey used for the Id in the table is also the value that is encrypted and placed in the session cookie. Here is that SessionMiddleware code:
var guidBytes = new byte[16];
CryptoRandom.GetBytes(guidBytes);
sessionKey = new Guid(guidBytes).ToString();
cookieValue = CookieProtection.Protect(_dataProtector, sessionKey);
var establisher = new SessionEstablisher(context, cookieValue, _options);
tryEstablishSession = establisher.TryEstablishSession;
isNewSessionKey = true;
The SessionId, however, is a GUID generated by the DistributedSession object in the following line of code:
_sessionId = new Guid(IdBytes).ToString();
Interestingly, the ISession interface provides a property for the SessionId but not the SessionKey. So it's often much easier in code to get access to a SessionId than a SessionKey, for example when you have access to an HttpContext object.
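A quick illustration (mine, not from the original post) of that asymmetry:
// Anywhere you have an HttpContext, e.g. inside a controller action:
var sessionId = HttpContext.Session.Id; // ISession.Id is readily available
// There is no corresponding ISession member exposing the SessionKey.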
This makes it hard to match up the session with the database record if you want to do that. This was noted by another user on Stack Overflow as well: How to Determine Session ID when using SQL Server session storage.
Why?
What I want to know is why the system is designed this way. Why aren't the SessionId and the SessionKey one and the same? Why use two different GUIDs? I ask because I'm creating my own implementation of ISession, and I'm tempted to use the SessionKey as the SessionId in my implementation so that it's easier to match up a record in the database with a session. Would that be a bad idea? Why wasn't the DistributedSession object designed that way, rather than generating a SessionId that is different from the SessionKey? The only reason I can think of is perhaps to increase security by obfuscating the link between the database record and the session it belongs to. But in general, security professionals don't consider security through obscurity effective. So I'm left wondering why such a design was implemented.
I also posted the question on GitHub https://github.com/aspnet/Session/issues/151#issuecomment-287894321 to try to get an answer as well.
@Tratcher answered the question there, so I'm pasting his answer below so that it's available here on Stack Overflow too.
The lifetimes are different. The true lifetime of a session (and SessionId) is controlled by the server. SessionKey is stored in the cookie and lives on the client for an indeterminate amount of time. If the session expires on the server and then the client sends a new request with the old SessionKey, a new session instance with a new SessionId is created, but stored using the old SessionKey so that we don't have to issue a new cookie.
Put another way, don't depend on things outside of your control. The client can keep and replay their SessionKey indefinitely, but it's the server that decides if that is really still the same session.
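To make the lifetime difference concrete, here is a minimal sketch of the behavior Tratcher describes; the types are hypothetical stand-ins, not the actual middleware source:
using System;
using System.Collections.Generic;

class SessionRecord
{
    public string SessionId { get; } = Guid.NewGuid().ToString();
}

class SessionStore
{
    private readonly Dictionary<string, SessionRecord> _sessions = new Dictionary<string, SessionRecord>();

    // Called with the SessionKey recovered from the session cookie.
    public SessionRecord GetOrCreate(string sessionKey)
    {
        // If the server-side entry expired and was evicted, a brand-new
        // record (with a brand-new SessionId) is created under the SAME
        // key, so the client's cookie never needs to be reissued.
        if (!_sessions.TryGetValue(sessionKey, out var record))
        {
            record = new SessionRecord();
            _sessions[sessionKey] = record;
        }
        return record;
    }
}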
In case someone needs to get the session key in ASP.NET Core 3:
Add DI for IDataProtector (important: when creating the protector, the purpose string must be nameof(SessionMiddleware)):
public IDataProtector _dataProtector;

public TestController(IDataProtectionProvider dataProtectionProvider)
{
    _dataProtector = dataProtectionProvider.CreateProtector(nameof(SessionMiddleware));
}
Create a helper that restores the Base64 padding stripped from the cookie value:
private string Pad(string text)
{
    var padding = 3 - ((text.Length + 3) % 4);
    if (padding == 0)
    {
        return text;
    }
    return text + new string('=', padding);
}
Use it
public ActionResult TestSession()
{
    var protectedText = HttpContext.Request.Cookies[".AspNetCore.Session"];
    if (string.IsNullOrEmpty(protectedText))
    {
        return Content(string.Empty);
    }
    var protectedData = Convert.FromBase64String(Pad(protectedText));
    var userData = _dataProtector.Unprotect(protectedData);
    if (userData == null)
    {
        return Content(string.Empty);
    }
    var sessionKey = Encoding.UTF8.GetString(userData);
    return Content(sessionKey);
}

How to list subscriptions with Microsoft.Azure.ResourceManager?

Context
My core goal is to write an Azure WebApps deployment tool in C#. The process will be roughly
User logs in
User selects subscription
User selects or creates resource group
User selects or creates storage for the web app
User selects or creates web service plan
User selects or creates web app
Tool uploads the web app using Kudu to POST a zip
Since the last step can't be done in the portal, my idea was to do everything in a GUI tool.
I started out using Kudu's ARMClient.Authentication and Microsoft.Azure.ResourceManager 1.0.0-preview. However, when it comes to creating a storage account I get a permissions error ("The subscription is not registered to use namespace Microsoft.Storage"), so my plan B was to do the authentication myself, following Brady Gaster's blog post.
The problem
I've set up an application as documented, and using its clientId and tenantId I'm able to log in and list tenants. But I can't list any subscriptions. (NB I've partly elided the clientId and tenantId in case there are security risks with giving them in full).
string clientId = "f62903b9-ELIDED";
string tenantId = "47b6e6c3-ELIDED";
const string redirectUri = "urn:ietf:wg:oauth:2.0:oob";
const string baseAuthUri = "https://login.microsoftonline.com/";
const string resource = "https://management.core.windows.net/";
var ctx = new AuthenticationContext(baseAuthUri + tenantId);
var authResult = ctx.AcquireToken(resource, clientId, new Uri(redirectUri), PromptBehavior.Auto);
var token = new TokenCredentials(authResult.AccessToken);
var subClient = new SubscriptionClient(token);
var tenants = await subClient.Tenants.ListAsync();
foreach (var tenant in tenants) Console.WriteLine(tenant.TenantId);
var subs = await subClient.Subscriptions.ListAsync();
foreach (var sub in subs) Console.WriteLine(sub.DisplayName);
When I run this it prompts me to log in, and it lists the tenants corresponding to the subscriptions I own or co-administer. But it doesn't list a single subscription. If I change the IDs to the commonly used (officially for PowerShell, I think) values
clientId = "1950a258-227b-4e31-a9cf-717495945fc2";
tenantId = "common";
then it's the same.
What is the step I've missed in order to get a list of my subscriptions?
You need to iterate through the tenants, authenticate against each tenant, and get a subscription list for every tenant.
The following code will output the subscriptions the way the Get-AzureRmSubscription PowerShell cmdlet does.
class Program
{
    private static string m_resource = "https://management.core.windows.net/";
    private static string m_clientId = "1950a258-227b-4e31-a9cf-717495945fc2"; // well-known client ID for Azure PowerShell
    private static string m_redirectURI = "urn:ietf:wg:oauth:2.0:oob"; // redirect URI for Azure PowerShell

    static void Main(string[] args)
    {
        try
        {
            var ctx = new AuthenticationContext("https://login.microsoftonline.com/common");
            // This will show the login window
            var mainAuthRes = ctx.AcquireToken(m_resource, m_clientId, new Uri(m_redirectURI), PromptBehavior.Always);
            var subscriptionCredentials = new TokenCloudCredentials(mainAuthRes.AccessToken);
            var cancelToken = new CancellationToken();
            using (var subscriptionClient = new SubscriptionClient(subscriptionCredentials))
            {
                var tenants = subscriptionClient.Tenants.ListAsync(cancelToken).Result;
                foreach (var tenantDescription in tenants.TenantIds)
                {
                    var tenantCtx = new AuthenticationContext("https://login.microsoftonline.com/" + tenantDescription.TenantId);
                    // This will NOT show the login window
                    var tenantAuthRes = tenantCtx.AcquireToken(
                        m_resource,
                        m_clientId,
                        new Uri(m_redirectURI),
                        PromptBehavior.Never,
                        new UserIdentifier(mainAuthRes.UserInfo.DisplayableId, UserIdentifierType.RequiredDisplayableId));
                    var tenantTokenCreds = new TokenCloudCredentials(tenantAuthRes.AccessToken);
                    using (var tenantSubscriptionClient = new SubscriptionClient(tenantTokenCreds))
                    {
                        var tenantSubscriptions = tenantSubscriptionClient.Subscriptions.ListAsync(cancelToken).Result;
                        foreach (var sub in tenantSubscriptions.Subscriptions)
                        {
                            Console.WriteLine($"SubscriptionName : {sub.DisplayName}");
                            Console.WriteLine($"SubscriptionId   : {sub.SubscriptionId}");
                            Console.WriteLine($"TenantId         : {tenantDescription.TenantId}");
                            Console.WriteLine($"State            : {sub.State}");
                            Console.WriteLine();
                        }
                    }
                }
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex.ToString());
        }
        finally
        {
            Console.WriteLine("press something");
            Console.ReadLine();
        }
    }
}
A couple of things you can look into:
1) The error you saw during creation of the storage account is likely due to the resource provider not being registered for use with the subscription. Any RP needs to be registered before use; some clients (the portal, PowerShell) register the RP for you, so you never notice it. See: https://msdn.microsoft.com/en-us/library/azure/dn790548.aspx - you should be able to do that from your code if the user has sufficient permissions.
2) You may not be getting any subscriptions back because your endpoint (management.core.windows.net) is the endpoint for Azure Service Management, not Azure Resource Manager (management.azure.com). If the subscription access is granted via Azure RM and RBAC, the old ASM APIs will not see (i.e. have access to) those subscriptions.
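For point 1, a minimal sketch of registering the provider from code, assuming the Microsoft.Azure.Management.Resources package of that era; "sub" and "tenantAuthRes" refer to the variables in the answer above:
// Registering a resource provider is a one-time, idempotent operation per subscription.
var rpCredentials = new TokenCloudCredentials(sub.SubscriptionId, tenantAuthRes.AccessToken);
using (var resourceClient = new ResourceManagementClient(rpCredentials))
{
    resourceClient.Providers.Register("Microsoft.Storage");
}
For point 2, the usual fix is to acquire the token for the ARM audience (https://management.azure.com/) and use the ARM SubscriptionClient rather than the ASM one.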

Switching or Selecting Database Server's Connection String Best Practices

I'm developing software that will be used in different locations with different servers. The server name, database name, etc. differ per location.
Example:
Location 1 : Server Name: ChinaServer; Database Name: ChinaDB
Location 2 : Server Name: USServer; Database Name: USDB
Currently I am using an .ini file: I store the server name, database name, and other configuration in it, then read it at runtime to build my connection string. The problem here is that every time we switch locations, I need to change/edit the .ini file.
I'm asking everyone who has more experience than I do to give me other options or the best approach on this matter.
Client's Environment : Windows 7
Developers : Windows 7, Visual Studio 2015, MS SQL, VB.NET
Thanks in advance.
There are a couple of ways, each with their own advantages and disadvantages. The configuration file would work; you could also store the connection details in the server's registry and read them from there. Or you could even use a My.Settings variable that can be updated in a settings page (probably not the most suitable for your situation).
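For the registry option, a minimal sketch (the key path and value name are made up for illustration; C# for brevity, though the question's stack is VB.NET):
using Microsoft.Win32;

string connectionString = null;
using (var key = Registry.LocalMachine.OpenSubKey(@"SOFTWARE\MyCompany\MyApp"))
{
    // Returns null if the key or value is missing.
    connectionString = key?.GetValue("ConnectionString") as string;
}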
You can get the basic idea from the Twelve-Factor App "manifesto":
The twelve-factor app stores config in environment variables...
So what you need is to establish a machine-wide (setx /M NAME VALUE) environment variable which you'll later use like this:
var connectionString = Environment.GetEnvironmentVariable("MY_APP_CONNECTION_STRING");
var dbContext = new DbContext(connectionString);
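The variable itself is set once per machine, for example (the name and value here are illustrative):
setx /M MY_APP_CONNECTION_STRING "Server=ChinaServer;Database=ChinaDB;Integrated Security=True"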
You can use the SQL Server enumerator in System.Data.Sql, and run a query to get the database names. From there bind those lists to combo boxes, and use a SqlConnectionStringBuilder to keep track of your connection settings. You can save them to disk, or just ask the user to choose the server and database. Note that the enumerator is not guaranteed to always find all servers, so make sure you have a way to enter it manually if necessary.
private SqlConnectionStringBuilder _connString = new SqlConnectionStringBuilder();

private void RefreshConnectionString()
{
    _connString.ApplicationName = AppDomain.CurrentDomain.ApplicationIdentity.FullName;
    _connString.ApplicationIntent = ApplicationIntent.ReadWrite;
    _connString.DataSource = GetSqlDatasources().FirstOrDefault();
    _connString.InitialCatalog = GetSqlDatabases().FirstOrDefault();
    _connString.AsynchronousProcessing = true;
    _connString.ConnectTimeout = 5;
    _connString.IntegratedSecurity = true;
    _connString.Pooling = true;
}

private IEnumerable<string> GetSqlDatabases()
{
    using (var conn = new SqlConnection(_connString.ConnectionString))
    using (var cmd = new SqlCommand(@"SELECT [name] FROM sys.databases WHERE [name] NOT IN ('master', 'model', 'msdb', 'tempdb')", conn))
    {
        var dbnames = new List<string>();
        try
        {
            conn.Open();
            var reader = cmd.ExecuteReader();
            while (reader.Read()) dbnames.Add(reader.GetString(0));
        }
        catch { }
        return dbnames;
    }
}

private IEnumerable<string> GetSqlDatasources()
{
    var sqlEnum = SqlDataSourceEnumerator.Instance;
    return sqlEnum.GetDataSources().Rows.OfType<DataRow>().Select(row => row[0].ToString());
}

Content migration error from Ektron to EpiServer

I am working on a content migration project from Ektron 9 to EpiServer 8. The first task is to migrate the content of specific pages. To achieve that, I was following Ektron's API guidance: Ektron Developer API.
1- Am I approaching this migration the right way? Right now I just added the Ektron DLL as a reference in my app. I tried to use their web services, but they don't have the data I need (the content of specific pages): Ektron Web Services.
Here's a snippet of my code:
GetAllTemplatesRequest cc = new GetAllTemplatesRequest();
//var UserCRUD = new Ektron.Cms.Framework.User.UserManager();
var UserCRUD = new UserManager();
string Token = UserCRUD.Authenticate("admin", "password");
if (!string.IsNullOrEmpty(Token)) // Success
{
    try
    {
        // Create the Content object, set to observe permissions
        Ektron.Cms.Framework.Content.ContentManager ContentAPI
            = new Ektron.Cms.Framework.Content.ContentManager(ApiAccessMode.Admin);
        // Retrieve the content
        Ektron.Cms.ContentData contentData;
        contentData = ContentAPI.GetItem(30);
        // Output the retrieved item's content
        var cs = contentData.Html;
    }
    catch (Exception)
    {
        throw; // rethrow without resetting the stack trace
    }
}
else // Fail
{
}
But I am getting an error at this point (error screenshot not preserved).
This is what I ended up doing:
As there are many ways to perform the migration, I chose the approach of focusing mainly on the EpiServer APIs to create new content, blocks, and assets, and getting all the content I need from Ektron using SQL statements.
Ektron saves all the content in a table named content.
The pages are organized in a "folder structure" fashion, so every page is in a "folder".
To get the folder ID for a specific page, you can use this query:
select folder_id from content where content_id = 2147485807
With that folder ID you can get all the pages listed under that specific folder; for instance, you will need to get all the pages under "Articles".
Then I used that folder ID in this query:
SELECT [content_id]
      ,[content_title]
      ,[content_html]
      ,[date_created]
      ,[folder_id]
      ,[content_teaser]
      ,[content_text]
      ,[end_date]
      ,[content_type]
      ,[template_id]
      ,[content_status]
FROM content
WHERE folder_id = (SELECT folder_id FROM content WHERE content_id = 2147485807)
ORDER BY content_title
FOR XML PATH('Article'), ROOT('Articles')
This creates an XML document ready to consume in my EpiServer code.
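For reference, the resulting XML has this shape (values are placeholders; one Article element per row):
<Articles>
  <Article>
    <content_id>2147485807</content_id>
    <content_title>Sample article</content_title>
    <content_html>...</content_html>
    <date_created>...</date_created>
    <folder_id>...</folder_id>
    <content_teaser>...</content_teaser>
    <content_text>...</content_text>
    <end_date>...</end_date>
    <content_type>...</content_type>
    <template_id>...</template_id>
    <content_status>...</content_status>
  </Article>
</Articles>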
The first thing I did in EpiServer was to add a new property to the SitePageBase model, a "LegacyContentID" that serves as a mapping entry in case I need to access/modify the content of the newly created pages. It is the link between the data imported from Ektron and the new data I am creating in EpiServer.
[Display(
    Name = "Legacy Content ID",
    Description = "Content ID from Ektron imported data, for migration purposes",
    GroupName = GroupNames.PageSettings,
    Order = 37)]
[ScaffoldColumn(false)]
[Editable(false)]
public virtual string LegacyContentID { get; set; }
Then I created a method to create the article pages. The only parameter it needs is the parent page ID from EpiServer (you can create a new page in EpiServer and then, under the properties of that page, find the page ID).
public void CreateArticlesPages(int parentID)
{
    IContentRepository contentRepository = ServiceLocator.Current.GetInstance<IContentRepository>();
    var parentlink = new ContentReference(parentID);
    XmlDocument doc = new XmlDocument();
    doc.LoadXml(File.ReadAllText(@"Articles.xml"));
    string jsonText = JsonConvert.SerializeXmlNode(doc);
    dynamic data = JsonConvert.DeserializeObject(jsonText);
    for (int i = 0; i < data.Articles.Article.Count; i++)
    {
        var PageImportedObject = data.Articles.Article[i];
        ArticlePage page = contentRepository.GetDefault<ArticlePage>(parentlink);
        page.LegacyContentID = PageImportedObject.content_id;
        page.Name = PageImportedObject.content_title;
        page.PageTitle = PageImportedObject.content_title;
        if (PageImportedObject.content_teaser == null)
            page.Summary = "No Summary from the Ektron DB";
        else
            page.Summary = PageImportedObject.content_teaser;
        page.Description = PageImportedObject.content_html.root.Description;
        contentRepository.Save(page, EPiServer.DataAccess.SaveAction.Publish, EPiServer.Security.AccessLevel.NoAccess);
    }
}
The code above creates a new page of type "ArticlePage" and adds content from the XML generated earlier holding Ektron's info.
Just copying one DLL from an Ektron site into another site will not work.
The web services idea was a better one. There are web service calls to get content by ID.
Alternatively, you could write your own web service that runs inside the Ektron site and uses the Ektron API to expose the data you want, then call that service from the other site; a sketch follows.
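A rough sketch of such a service (a hypothetical Web API controller reusing the Framework calls from the question; hosting and authentication are left out):
public class ContentExportController : ApiController
{
    [HttpGet]
    public string GetContentHtml(long id)
    {
        // Same Framework API the question uses, exposed over HTTP for the migration tool.
        var contentApi = new Ektron.Cms.Framework.Content.ContentManager(ApiAccessMode.Admin);
        Ektron.Cms.ContentData contentData = contentApi.GetItem(id);
        return contentData == null ? null : contentData.Html;
    }
}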
You will want to review the content migration starter kit. https://github.com/egandalf/ContentTransferStarterKit

Why is the Entity Framework inserting when it should update?

I use the following RIA Services call to register and return a Project entity.
// On Server; inside RIA Domain Service
[Invoke]
public Project CreateNewProject(String a_strKioskNumber)
{
    Decimal dProjectID = ObjectContext.RegisterProjectNumber(a_strKioskNumber)
                                      .FirstOrDefault() ?? -1m;
    // Tried this but it returned zero (0)
    //int nChanged = ObjectContext.SaveChanges();
    var project = (from qProject in ObjectContext.Projects.Include("ProjectItems")
                   where qProject.ID == dProjectID
                   select qProject)
                  .FirstOrDefault();
    if (project == null)
        return null;
    return project;
}
As you can see, it calls a stored procedure that returns a project ID. It uses this ID to look up the Project entity itself and return it. When the Project entity is returned to the client it is detached. I attach it to the DomainContext and modify it.
// At Client
_activeProject = a_invokeOperation.Value; // <-- Detached
if (_activeProject != null)
{
    _context.Projects.Attach(_activeProject); // <-- Unmodified
    _activeProject.AuthenticationType = "strong"; // <-- Modified
    _activeProject.OwnerID = customer.ID;
    _projectItems.Do(pi => _activeProject.ProjectItems.Add(pi));
    _activeProject.Status = "calculationrequired";
}
At this point it has an entity state of Modified. When I submit changes, I get an exception about a UNIQUE KEY violation, as if it is trying to insert the entity rather than update it.
// At Client
_context.SubmitChanges(OnProjectSaved, a_callback);
I'm using the same DomainContext instance for all operations. Why should this not work?
What's going wrong? This is rather frustrating.
Edits:
I tried this (as suggested by Jeff):
[Invoke]
public void SaveProject(Project a_project)
{
    var project = (from qProject in ObjectContext.Projects
                   where qProject.ID == a_project.ID
                   select qProject)
                  .FirstOrDefault();
    project.SubmitDate = a_project.SubmitDate;
    project.PurchaseDate = a_project.PurchaseDate;
    project.MachineDate = a_project.MachineDate;
    project.Status = a_project.Status;
    project.AuthenticationType = a_project.AuthenticationType;
    project.OwnerID = a_project.OwnerID;
    project.ProjectName = a_project.ProjectName;
    project.OwnerEmail = a_project.OwnerEmail;
    project.PricePerPart = a_project.PricePerPart;
    project.SheetQuantity = a_project.SheetQuantity;
    project.EdgeLength = a_project.EdgeLength;
    project.Price = a_project.Price;
    project.ShipToStoreID = a_project.ShipToStoreID;
    project.MachiningTime = a_project.MachiningTime;
    int nChangedItems = ObjectContext.SaveChanges();
}
It did absolutely nothing. It didn't save the project.
What happens if you add a SaveProject method on the server side and send the object back to the server for saving?
I've not done EF with RIA Services, but I've always sent my objects back to the server for saving. I'm assuming the SubmitChanges call you are making wires everything up properly for sending it back to the server, but perhaps it is doing something wrong, and handling the save manually will fix it.
I don't have the source at the moment, but I have seen it recommended that you use a new context for each operation in Silverlight. I ran into a similar problem today, and it was because I was using a service-level context that was remembering previous values that I didn't want. I changed to creating a new context for each service call, and the behavior became what I expected.
public void SaveResponses(ICollection<Responses> items, Action<SubmitOperation> callback)
{
    try
    {
        SurveysDomainContext _context = new SurveysDomainContext();
        foreach (Responses item in items)
        {
            _context.Responses.Add(item);
        }
        _context.SubmitChanges(callback, null);
    }
    catch (Exception)
    {
        throw;
    }
}
As for the notion that one can't use a singleton global DomainContext, this is actually debatable. In my project I use a singleton DomainContext with no issues. In other projects, we have created a new DomainContext for different modules in the app where the entities are reused. There are definitely pros and cons. See:
Strategies for Handling Your DomainContext (external blog)
It seems that the problem is that when you attach your project to the DomainContext, it checks the _context.Projects entity set, doesn't find an entity with that primary key, and then assumes that the newly attached entity doesn't exist server-side yet and that submitting changes should insert it. A possible workaround might be to explicitly load the newly created Project into the DomainContext. That would ensure the correct state is set on the entity: that the project already exists on the server, and that it's an update rather than an insert.
So maybe something like:
//after your Project has already been created serverside with the invoke
_context.Load(_context.SomeQueryThatLoadsYourNewlyCreatedProject(), LoadBehavior.RefreshCurrent, (LoadOperation lo) => {
Project project = lo.Entities.FirstOrDefault(); //is attached and has correct state
if (project != null)
{
project.AuthenticationType = "strong";
project.OwnerID = customer.ID;
project.Do(pi => _activeProject.ProjectItems.Add(pi));
project.Status = "calculationrequired";
_context.SubmitChanges(); //hopefully will trigger an update, rather than an insert
}
});
