System.DirectoryServices vs System.DirectoryServices.AccountManagement

I have an array (propertyList) that contains the names of certain Active Directory properties whose data I want to retrieve. Using IronPython and the .NET library System.DirectoryServices, I retrieve the properties to be loaded like this:
for propertyActDir in propertyList:
    obj.PropertiesToLoad.Add(propertyActDir)

res = obj.FindAll()
myDict = {}
for sr in res:
    for prop in propertyList:
        myDict[prop] = getField(prop, sr.Properties[prop][0])
The getField function is my own. How can I solve the same situation using the System.DirectoryServices.AccountManagement library? I think it is not possible.
Thanks.

Yes, you're right - System.DirectoryServices.AccountManagement builds on System.DirectoryServices and was introduced with .NET 3.5. It makes common Active Directory tasks easier, but if you need any special properties you have to fall back to System.DirectoryServices.
See this C# code sample for usage:
// Connect to the current domain using the credentials of the executing user:
PrincipalContext currentDomain = new PrincipalContext(ContextType.Domain);

// Search the entire domain for users with non-expiring passwords:
UserPrincipal userQuery = new UserPrincipal(currentDomain);
userQuery.PasswordNeverExpires = true;

PrincipalSearcher searchForUser = new PrincipalSearcher(userQuery);
foreach (UserPrincipal foundUser in searchForUser.FindAll())
{
    Console.WriteLine("DistinguishedName: " + foundUser.DistinguishedName);

    // To get the countryCode attribute you need to get the underlying DirectoryEntry object:
    DirectoryEntry foundUserDE = (DirectoryEntry)foundUser.GetUnderlyingObject();
    Console.WriteLine("Country Code: " + foundUserDE.Properties["countryCode"].Value);
}

System.DirectoryServices.AccountManagement (there is an excellent MSDN article on it) is designed to help you manage users and groups more easily, e.g.:
find users and groups
create users and groups
set specific properties on users and groups
It is not designed to handle "generic" property management like you describe - in that case, simply keep on using System.DirectoryServices; there's nothing stopping you from doing that!
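For illustration, a minimal System.DirectoryServices sketch that mirrors the property-loading approach from the question (the LDAP path, filter, and property names below are only placeholders):
// Requires a reference to System.DirectoryServices; path, filter and properties are placeholders.
string[] propertyList = { "displayName", "mail", "countryCode" };

using (DirectoryEntry root = new DirectoryEntry("LDAP://DC=example,DC=com"))
using (DirectorySearcher searcher = new DirectorySearcher(root, "(objectClass=user)"))
{
    foreach (string prop in propertyList)
        searcher.PropertiesToLoad.Add(prop);

    using (SearchResultCollection results = searcher.FindAll())
    {
        foreach (SearchResult sr in results)
        {
            foreach (string prop in propertyList)
            {
                if (sr.Properties.Contains(prop))
                    Console.WriteLine("{0} = {1}", prop, sr.Properties[prop][0]);
            }
        }
    }
}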
Marc

Related

Need to supply DB password to run evolutions at run time - Play + Slick

I need to avoid storing plain text passwords in config files, and so I'm storing the Postgres password externally (in AWS Secrets Manager).
Similarly to the solution provided here:
Encrypted database password in Play + Slick + HikariCP application, I've been able to override dbConfig and supply the password to my DAO classes like this:
trait MyDaoSlick extends MyTableDefinitions with HasDatabaseConfig[MyPostgresDriver] {
  protected val dbConfigProvider: DatabaseConfigProvider

  override protected val dbConfig: DatabaseConfig[MyPostgresDriver] = secretDbConfig(dbConfigProvider)

  def secretDbConfig(dbConfigProvider: DatabaseConfigProvider): DatabaseConfig[MyPostgresDriver] = {
    DatabaseConfig.forConfig[MyPostgresDriver]("", dbConfigProvider.get[MyPostgresDriver].config
      .withValue("db.user", ConfigValueFactory.fromAnyRef(getUN))
      .withValue("db.password", ConfigValueFactory.fromAnyRef(getPWD)))
  }
}
This works great for regular DB queries; however, evolutions bypass this and still expect the username and the password to be in application.conf, which rather defeats the purpose of keeping the password secret.
Any advice on how evolutions could get the DB credentials from a function?
I ran into the same issue, and I managed to resolve it like this:
Create a custom application loader, as shown here: https://www.playframework.com/documentation/2.7.x/ScalaDependencyInjection#Advanced:-Extending-the-GuiceApplicationLoader
Inside the custom loader's builder, append the DB configuration parameters for Slick:
val extra = Seq(
  "slick.dbs.default.db.url" -> secrets.url,
  "slick.dbs.default.db.user" -> secrets.user,
  "slick.dbs.default.db.password" -> secrets.pass
)
Nothing else needs to be changed, as you've basically added the configuration needed for anything Slick, evolutions included.
On older versions of Play, we used to do this inside GlobalSettings.onLoadConfig, but, at some point, that has been deprecated in favour of DI. More details here: https://www.playframework.com/documentation/2.7.x/GlobalSettings

Is there any way to trace/log the SQL using Dapper?

Is there a way to dump the generated SQL to the Debug log or something? I'm using it in a WinForms solution, so the mini-profiler idea won't work for me.
I ran into the same issue and, after searching and finding nothing ready to use, implemented some code myself. There is a package on NuGet, MiniProfiler.Integrations, that I would like to share.
Update V2: it supports other database servers; for MySQL it requires MiniProfiler.Integrations.MySql.
Below are steps to work with SQL Server:
1. Instantiate the connection:
var factory = new SqlServerDbConnectionFactory(_connectionString);
using (var connection = ProfiledDbConnectionFactory.New(factory, CustomDbProfiler.Current))
{
    // your code
}
2. After all the work is done, write all commands to a file if you want:
File.WriteAllText("SqlScripts.txt", CustomDbProfiler.Current.ProfilerContext.BuildCommands());
Dapper does not currently have an instrumentation point here. This is perhaps due, as you note, to the fact that we (as the authors) use mini-profiler to handle this. However, if it helps, the core parts of mini-profiler are actually designed to be architecture neutral, and I know of other people using it with winforms, wpf, wcf, etc - which would give you access to the profiling / tracing connection wrapper.
In theory, it would be perfectly possible to add some blanket capture-point, but I'm concerned about two things:
(primarily) security: since dapper doesn't have a concept of a context, it would be really really easy for malign code to attach quietly to sniff all sql traffic that goes via dapper; I really don't like the sound of that (this isn't an issue with the "decorator" approach, as the caller owns the connection, hence the logging context)
(secondary) performance: but... in truth, it is hard to say that a simple delegate-check (which would presumably be null in most cases) would have much impact
Of course, the other thing you could do is: steal the connection wrapper code from mini-profiler, and replace the profiler-context stuff with just: Debug.WriteLine etc.
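For illustration only, here is a minimal sketch of that decorator idea. It is not the mini-profiler code, the class name is made up, and it assumes you also wrap DbConnection so that its CreateDbCommand returns this type (as mini-profiler's ProfiledDbConnection does); the profiler context is simply replaced with Debug.WriteLine:
using System.Data;
using System.Data.Common;
using System.Diagnostics;

public class DebugLoggedCommand : DbCommand
{
    private readonly DbCommand _inner;
    public DebugLoggedCommand(DbCommand inner) { _inner = inner; }

    // The "profiler context" replaced with a plain debug trace of the SQL text.
    private void Log() { Debug.WriteLine(_inner.CommandText); }

    public override int ExecuteNonQuery() { Log(); return _inner.ExecuteNonQuery(); }
    public override object ExecuteScalar() { Log(); return _inner.ExecuteScalar(); }
    protected override DbDataReader ExecuteDbDataReader(CommandBehavior behavior)
    {
        Log();
        return _inner.ExecuteReader(behavior);
    }

    // Everything else just forwards to the wrapped command.
    public override string CommandText { get { return _inner.CommandText; } set { _inner.CommandText = value; } }
    public override int CommandTimeout { get { return _inner.CommandTimeout; } set { _inner.CommandTimeout = value; } }
    public override CommandType CommandType { get { return _inner.CommandType; } set { _inner.CommandType = value; } }
    public override bool DesignTimeVisible { get { return _inner.DesignTimeVisible; } set { _inner.DesignTimeVisible = value; } }
    public override UpdateRowSource UpdatedRowSource { get { return _inner.UpdatedRowSource; } set { _inner.UpdatedRowSource = value; } }
    protected override DbConnection DbConnection { get { return _inner.Connection; } set { _inner.Connection = value; } }
    protected override DbParameterCollection DbParameterCollection { get { return _inner.Parameters; } }
    protected override DbTransaction DbTransaction { get { return _inner.Transaction; } set { _inner.Transaction = value; } }
    public override void Cancel() { _inner.Cancel(); }
    public override void Prepare() { _inner.Prepare(); }
    protected override DbParameter CreateDbParameter() { return _inner.CreateParameter(); }
}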
You should consider using SQL Server Profiler, located in the SQL Server Management Studio menu under Tools → SQL Server Profiler (no Dapper extensions needed; this may also work with other RDBMSs if they have a SQL profiler tool).
Then, start a new session.
You'll get something like this, for example (you can see all parameters and the complete SQL string):
exec sp_executesql N'SELECT * FROM Updates WHERE CAST(Product_ID as VARCHAR(50)) = @appId AND (Blocked IS NULL OR Blocked = 0)
AND (Beta IS NULL OR Beta = 0 OR @includeBeta = 1) AND (LangCode IS NULL OR LangCode IN (SELECT * FROM STRING_SPLIT(@langCode, '','')))',N'@appId nvarchar(4000),@includeBeta bit,@langCode nvarchar(4000)',@appId=N'fea5b0a7-1da6-4394-b8c8-05e7cb979161',@includeBeta=0,@langCode=N'de'
Try Dapper.Logging.
You can get it from NuGet. The way it works is you pass your code that creates your actual database connection into a factory that creates wrapped connections. Whenever a wrapped connection is opened or closed or you run a query against it, it will be logged. You can configure the logging message templates and other settings like whether SQL parameters are saved. Elapsed time is also saved.
In my opinion, the only downside is that the documentation is sparse, but I think that's just because it's a new project (as of this writing). I had to dig through the repo for a bit to understand it and to get it configured to my liking, but now it's working great.
From the documentation:
The tool consists of simple decorators for the DbConnection and DbCommand which track the execution time and write messages to the ILogger<T>. The ILogger<T> can be handled by any logging framework (e.g. Serilog). The result is similar to the default EF Core logging behavior.
The lib declares a helper method for registering the IDbConnectionFactory in the IoC container. The connection factory is SQL provider agnostic. That's why you have to specify the real factory method:
services.AddDbConnectionFactory(prv => new SqlConnection(conStr));
After registration, the IDbConnectionFactory can be injected into classes that need a SQL connection.
private readonly IDbConnectionFactory _connectionFactory;

public GetProductsHandler(IDbConnectionFactory connectionFactory)
{
    _connectionFactory = connectionFactory;
}
The IDbConnectionFactory.CreateConnection will return a decorated version that logs the activity.
using (DbConnection db = _connectionFactory.CreateConnection())
{
    //...
}
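For instance (this snippet is not from the documentation; Product, the SQL text, and the handler shape are just illustrative), a plain Dapper call against the wrapped connection is logged without any further changes:
// Assumes 'using Dapper;' and a matching Product class.
using (DbConnection db = _connectionFactory.CreateConnection())
{
    var products = db.Query<Product>(
        "SELECT Id, Name FROM Products WHERE Price > @minPrice",   // logged by the wrapper
        new { minPrice = 10 });
}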
This is not exhaustive and is essentially a bit of a hack, but if you have your SQL and you want to initialize your parameters, it's useful for basic debugging. Set up this extension method, then call it anywhere as desired.
public static class DapperExtensions
{
    public static string ArgsAsSql(this DynamicParameters args)
    {
        if (args is null) throw new ArgumentNullException(nameof(args));

        var sb = new StringBuilder();
        foreach (var name in args.ParameterNames)
        {
            var pValue = args.Get<dynamic>(name);
            var type = pValue.GetType();
            if (type == typeof(DateTime))
                sb.AppendFormat("DECLARE @{0} DATETIME = '{1}'\n", name, pValue.ToString("yyyy-MM-dd HH:mm:ss.fff"));
            else if (type == typeof(bool))
                sb.AppendFormat("DECLARE @{0} BIT = {1}\n", name, (bool)pValue ? 1 : 0);
            else if (type == typeof(int))
                sb.AppendFormat("DECLARE @{0} INT = {1}\n", name, pValue);
            else if (type == typeof(List<int>))
                sb.AppendFormat("-- REPLACE @{0} IN SQL: ({1})\n", name, string.Join(",", (List<int>)pValue));
            else
                sb.AppendFormat("DECLARE @{0} NVARCHAR(MAX) = '{1}'\n", name, pValue.ToString());
        }
        return sb.ToString();
    }
}
You can then just use this in the immediate or watch windows to grab the SQL.
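For example (the parameter names, SQL, and Order type below are hypothetical, just to show the shape):
// Assumes 'using Dapper;', 'using System.Diagnostics;' and an open IDbConnection named connection.
var args = new DynamicParameters();
args.Add("CustomerId", 42);
args.Add("CreatedAfter", new DateTime(2020, 1, 1));

string sql = "SELECT * FROM Orders WHERE CustomerId = @CustomerId AND Created > @CreatedAfter";

// Paste the output of this straight into SSMS to replay the query with declared parameters.
Debug.WriteLine(args.ArgsAsSql() + sql);

var orders = connection.Query<Order>(sql, args);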
Just to add an update here since I see this question still gets quite a few hits - these days I use either Glimpse (it seems to be dead now) or Stackify Prefix, both of which have SQL command trace capabilities.
It's not exactly what I was looking for when I asked the original question, but they solve the same problem.

Batch create extranet users in Composite C1

I am in the process of building a website that will, in the near future, replace an existing website. The existing website contains approx. 300 users which I need to "import" into the Extranet module (which I have bought) in Composite.
Is there a way to batch-create users in the Extranet module?
Yes, you can import your existing user database. You can either do this by writing a script and having it execute on your website, or by directly manipulating the underlying SQL table / XML file (depending on what you use to store Composite C1 data). You can also build a provider that links your existing user database with the Composite C1 Extranet.
Importing users programmatically: For a script approach please see methods like AddNewUser described on http://docs.composite.net/Packages/Community/Extranet/CompositeCommunityExtranetDeveloperGuide/Using-Extranet-Facade-Methods
You would write this script as a web service, an .aspx page or similar, which executes on the Composite C1 website.
If you are running the Extranet in a default setup, expect the providerName to be "Default".
Manipulating the physical data store directly: This depends on what data store you are running on. I suggest you add the groups you want and a test user to help you recognize data when you look at the underlying XML files / SQL tables.
If you are running on XML (the default) you should focus on the files named Composite.Community.Extranet.DefaultProvider.DataTypes.DefaultProvider*.xml located in the folder ~/App_Data/Composite/DataStores. There are 3 such files: one for groups, one for users and one for the relation between users and groups.
If you are running on SQL Server you should focus on the 3 tables named Composite.Community.Extranet.DefaultProvider.DataTypes.DefaultProvider*.
In both cases you would need to add new entries to the User table / XML file and matching group relations to the GroupUser table / XML file. When you add a user you provide a unique ID, and you reuse this ID to register the user in GroupUser.
When you have made your changes you can force Composite C1 to reload by using the Tools | Restart Server command in the C1 Console. If you make a backup of files/tables before you make changes you can easily revert by restoring the backup (in case you need to start over).
Writing a user/group provider: If your user data is in an external store and you would like to keep it there you could also make a bridge between this existing user store and the Composite C1 Extranet by creating a custom provider. If this is relevant see http://docs.composite.net/Packages/Community/Extranet/CompositeCommunityExtranetDeveloperGuide/Writing-Custom-Extranet-Providers
Thank you. It now works. I imported the users programmatically. I opened the Composite solution in Visual Studio and added an .aspx page. Here is the code-behind.
using System;
using System.Collections.Generic;
using System.IO;
using Composite.Community.Extranet;
using Composite.Community.Extranet.Data;

public partial class ImportUsers : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        string path = Server.MapPath("UsersToImport.csv");
        IList<Guid> userIds = new List<Guid>();

        using (StreamReader streamReader = new StreamReader(path))
        {
            string line;
            while ((line = streamReader.ReadLine()) != null)
            {
                // Each CSV line: name, user name, e-mail
                string[] fields = line.Split(',');

                IExtranetUser extranetUser = new ExtranetUser();
                extranetUser.Name = fields[0];
                extranetUser.UserName = fields[1];
                extranetUser.Email = fields[2];
                extranetUser.IsApproved = true;

                IExtranetUser addedUser = ExtranetFacade.AddNewUser("Default", extranetUser);
                userIds.Add(addedUser.Id);

                // Register all users imported so far with the target group
                ExtranetFacade.SetUsersForGroup("Default", new Guid("bc728100-e28e-4135-a14c-bead6e0b9b00"), userIds);

                Response.Write(string.Format("User: {0} added at {1}", addedUser.UserName, addedUser.CreationDate));
            }
        }
    }
}

Having problems with accessing active directory using C#

My IS manager provided me with parameters in this format, and I am trying to use C# to validate a user against Active Directory.
Here are the parameters (of course not the real credentials). How do I use them against a DirectoryEntry object so I can search for users, etc.?
provider-url=ldap://email.acmetech.com:1111/
base-dn= DC=acmetecg,DC=com
security-authentication= simple
security-principal= CN=ldap,cn=users,DC=acmetech,DC=com
security-credentials= Ldap000
I know this should be simple, but it's been years since I've programmed against Active Directory.
Edit: How do I pass my params to a DirectoryEntry object so I can query objects in AD?
Using .NET 3.5 it's pretty easy.
using (PrincipalContext pc = new PrincipalContext(ContextType.Domain, "acmetecg"))
{
    // check the creds (assuming ldap is the user name, and ldap000 is the password)
    bool isValid = pc.ValidateCredentials("ldap", "ldap000");
}
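If you need more than credential validation and want to query AD with those parameters directly, here is a hedged System.DirectoryServices sketch using the placeholder values from the question (the search filter and account name are only examples):
// provider-url + base-dn, security-principal and security-credentials are taken from the question;
// AuthenticationTypes.None performs a simple LDAP bind, matching "security-authentication = simple".
using (DirectoryEntry entry = new DirectoryEntry(
    "LDAP://email.acmetech.com:1111/DC=acmetech,DC=com",   // provider-url + base-dn
    "CN=ldap,cn=users,DC=acmetech,DC=com",                 // security-principal
    "Ldap000",                                             // security-credentials
    AuthenticationTypes.None))                             // simple bind
using (DirectorySearcher searcher = new DirectorySearcher(entry))
{
    searcher.Filter = "(&(objectClass=user)(sAMAccountName=jsmith))"; // example filter
    SearchResult result = searcher.FindOne();
    if (result != null)
        Console.WriteLine(result.Path);
}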

Com Interop problem Silverlight 4 and MS Access 2010

I am trying to launch an existing MS Access database (Access 2010) from a Silverlight 4 out-of-browser (OOB) application with elevated permissions set, but I keep getting an error. I can create a new Access application using the CreateObject method, but when I try to launch an existing database I get the error: "No object was found registered for specified ProgID."
Any help is appreciated. Here is the code I use:
string sMSAccess = "C:\\Users\\storltx\\Documents\\SL4Demo.accdb";
dynamic MSAccess = ComAutomationFactory.GetObject(sMSAccess);
MSAccess.Visible = true;
I think you should pass the "Access.Application" string to the GetObject call, like this:
dynamic MSAccess = ComAutomationFactory.GetObject("Access.Application");
Try your code like this:
string sMSAccess = "C:\\Users\\storltx\\Documents\\SL4Demo.accdb";
dynamic app = ComAutomationFactory.CreateObject("Access.Application");
app.Visible = true;
app.OpenCurrentDatabase(sMSAccess);
