Active Directory 2008R2 Serving Invalid TLS Certificate Over LDAP

I am creating a simple client to connect to the LDAP servers running on one of my Windows 2008 R2 Active Directory Domain Controllers.
I have successfully connected to the LDAP server over a non-TLS connection. However, whenever I attempt to make a TLS connection, the handshake fails. After some digging, I downloaded the certificate using the following command:
openssl s_client -connect <domain controller>:636
I found that the certificate being served by the LDAP server is invalid. I can see that the certificate is signed by our CA, and my local system, which runs the application, already trusts that CA. However, the certificate is missing all of the subject information, and the client application does not allow for this.
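For reference, the same inspection can be scripted; here is a minimal sketch in Python (the hostname is a placeholder and the cryptography package is assumed to be installed) that dumps the subject and subject alternative names of whatever certificate the DC presents on 636:

```python
# Sketch: print the subject and SANs of the certificate served over LDAPS.
# "dc01.example.local" is a placeholder, not the real domain controller name.
import socket
import ssl

from cryptography import x509

HOST, PORT = "dc01.example.local", 636

# Verification is disabled only so an invalid certificate can still be fetched
# and inspected; never do this for a real connection.
ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

with socket.create_connection((HOST, PORT), timeout=10) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)  # raw DER, even when unverified

cert = x509.load_der_x509_certificate(der)
print("Subject:", cert.subject.rfc4514_string())
try:
    san = cert.extensions.get_extension_for_class(x509.SubjectAlternativeName)
    print("SAN DNS names:", san.value.get_values_for_type(x509.DNSName))
except x509.ExtensionNotFound:
    print("No subjectAltName extension present")
```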
After speaking with the administrator, he indicated that the certificates the domain controllers use to serve TLS over LDAP are generated automatically by our internal Microsoft certificate server. He was not sure how to address this.
After numerous Google searches, I have come up pretty much empty on how to resolve this. Is it something that is addressed on the certificate server? Is something on the domain controller stripping the subject information? Is it some setting or configuration? Since I do not have direct access to these systems, I am at a loss as to where to begin.
Any assistance would be appreciated.
Blindly trusting a certificate that is invalid is not an acceptable solution.

Ask your admin to export the root certificate for your environment (for example, to a .cer file). Then you can install that file as a trusted root certificate on the computer that needs to access the domain controller.
That's how we do it in our environment when we've needed to access an external domain over LDAPS.
Of course, that only works if the application accessing LDAPS uses the Windows certificate store. Some applications, like Java-based apps, don't, and you need to do it another way.
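For an application that bypasses the Windows store, the exported root certificate can usually be handed to the LDAP library directly. A minimal sketch with Python's ldap3 package, purely as an illustration (hostname, file name, and credentials are placeholders):

```python
# Sketch: LDAPS connection that trusts an explicitly supplied root CA file
# instead of the Windows certificate store. All names and paths are placeholders.
import ssl

from ldap3 import ALL, Connection, Server, Tls

tls = Tls(
    validate=ssl.CERT_REQUIRED,            # fail the handshake on an untrusted cert
    ca_certs_file="internal-root-ca.cer",  # the exported root, in Base64/PEM form
)
server = Server("dc01.example.local", port=636, use_ssl=True, tls=tls, get_info=ALL)
conn = Connection(server, user="someuser@example.local", password="secret")

if not conn.bind():
    raise SystemExit(f"LDAPS bind failed: {conn.result}")
print("Bound over a verified TLS connection")
conn.unbind()
```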

I was able to assist my admin in updating the template the certificate server was using so that it includes the subject and subject alternative name.
I found the following articles, which helped determine the problem:
https://blogs.msdn.microsoft.com/microsoftrservertigerteam/2017/04/10/step-by-step-guide-to-setup-ldaps-on-windows-server/
https://social.technet.microsoft.com/wiki/contents/articles/2980.ldap-over-ssl-ldaps-certificate.aspx
https://support.microsoft.com/en-us/help/931351/how-to-add-a-subject-alternative-name-to-a-secure-ldap-certificate
Ultimately it came down to going over each setting until we found what was causing the certificate server to issue an invalid certificate.
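Once the template was fixed and the domain controller had enrolled for a new certificate, the result can be confirmed by repeating the handshake with verification switched on; a short sketch, with placeholder hostname and CA file:

```python
# Sketch: with a correct subject/SAN and a trusted chain, a fully verified
# handshake should now succeed and getpeercert() returns the parsed details.
import socket
import ssl

HOST, PORT = "dc01.example.local", 636

ctx = ssl.create_default_context(cafile="internal-root-ca.pem")  # internal CA chain
with socket.create_connection((HOST, PORT), timeout=10) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        cert = tls.getpeercert()
        print("Subject:        ", cert.get("subject"))
        print("subjectAltName: ", cert.get("subjectAltName"))
```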

Related

Is there a way to specify an SPN when using LinqPad to connect to MSSQL

We are working with SQL Server using Windows authentication and have found that we need to specify the SPN for the connection to work. Our application is partly VB6 connecting via UDL files and partly a series of ad hoc scripts that we run through LinqPad.
Using the Server SPN feature of the UDL file we can get the connection to work, but we cannot find the equivalent for LinqPad.
Does anyone know how to get this working? We have tried ServerSPN= as an additional connection string parameter in the Advanced settings, but it is unrecognised.
Further details:
The client applications are on Windows 7 in domain A. All of the client connections are made from accounts in domain A.
The SQL Server is in domain B.
There is a selective one-way trust between the domains, and the server is added to the group that enables that trust.
The error from the LinqPad connection is: "The target principal name is incorrect. Cannot generate SSPI context".
The UDL file allows the connection to work once it has the Server SPN value set, but otherwise it gets the same error.
Update:
During testing we have found that specifying the SPN results in NTLM authentication to the server. Therefore, if there is a way to force this from the client, that would be a possible route for us.
Update + workaround: We have stumbled across a workaround: adding the server to the hosts file under a different name seems to trigger the same fallback to NTLM authentication, and makes the LinqPad connection work. We would still appreciate it if anyone understands how to fix this properly, but for now we are using the hosts-file workaround.
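For what it's worth, the hosts-file workaround can be reproduced outside LinqPad as well. A minimal sketch with Python and pyodbc, where the alias name, driver, and database are assumptions rather than details from the question:

```python
# Sketch: connect through a hosts-file alias so the client cannot build the
# server's SPN and Windows authentication falls back to NTLM.
# Assumes a hosts entry like:  10.0.0.5   sqlalias
import pyodbc

conn_str = (
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=sqlalias;"            # the alias, not the real FQDN
    "Database=MyDatabase;"        # placeholder
    "Trusted_Connection=yes;"     # integrated (Windows) authentication
)

with pyodbc.connect(conn_str, timeout=10) as conn:
    row = conn.execute("SELECT auth_scheme FROM sys.dm_exec_connections "
                       "WHERE session_id = @@SPID").fetchone()
    print("Authenticated using:", row.auth_scheme)  # expect NTLM, not KERBEROS
```

As an aside, the Microsoft ODBC driver's connection string does accept a ServerSPN keyword, which looks like the closest equivalent to the UDL's Server SPN setting, though, as noted in the question, that keyword is not recognised by LinqPad's connection dialog.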
Any help appreciated
I can think of several possible issues here:
Your one-way trust is set up the wrong way around;
It's too selective;
Incompatibility in AD versions between domains.
I would recommend trying Kerberos Configuration Manager for SQL Server; it might give you some insight into the root cause.
Also, I believe you might have better chances asking this on ServerFault rather than here. By the looks of it, it's a misconfiguration issue, and has nothing to do with programming.

Do I need to register my SSL certificate in IIS and SQL Server?

I have purchased an SSL certificate and installed it using IIS on my remote system, so I can now access my remote system using https://myremotesite.co.uk. All is fine, it seems to work; users can register and log in to my remote site and download my GUI to run my application, which stores and retrieves data from my SQL Server database.
When a user runs my GUI to access my application, it prompts them for their login ID and password and, if they are authenticated, my application pops up on their screen. All is well, it all seems to work fine.
However, I have read that access to the SQL Server database itself can be secured with an SSL certificate, and that to do this I would need "Encrypt=yes" in the connection string which my GUI uses to check authentication.
Is it necessary for me to do this? Or is it safe to just rely on the IIS HTTPS service? So my question is: do I need to register my SSL certificate with BOTH IIS AND SQL Server or just ONE of them, and if so, which ONE?
Thanks for the answers thus far. To explain further, the GUI connects to an IIS-controlled website which has specific handlers written to perform a restricted set of database queries. So my database DOES reside on my server, but it only allows my server's (local) IIS to log in and insert, update and extract data.
Once the IIS website service has extracted data, it returns it to the GUI, so the GUI has no DIRECT access to the database. What I am concerned about is this: if, by some malicious means, the database were copied in its entirety, could/should I use my SSL certificate to encrypt sensitive data against that event?
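For context, "Encrypt=yes" only encrypts the traffic between the application (here, the IIS handlers) and SQL Server; it does not encrypt the database files at rest. A hedged sketch of such a connection string, shown here with Python/pyodbc and placeholder names rather than the actual GUI's data access code:

```python
# Sketch: connection string with connection-level encryption turned on.
# Server, database, and driver version are placeholders.
import pyodbc

conn_str = (
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=localhost;"              # IIS and SQL Server on the same box here
    "Database=MyAppDb;"
    "Trusted_Connection=yes;"
    "Encrypt=yes;"                   # encrypt the client<->SQL Server traffic
    "TrustServerCertificate=no;"     # require a certificate the client trusts
)

with pyodbc.connect(conn_str) as conn:
    print(conn.execute("SELECT encrypt_option FROM sys.dm_exec_connections "
                       "WHERE session_id = @@SPID").fetchone())
```

With TrustServerCertificate left at no, SQL Server has to present a certificate the client trusts, and that certificate is configured on the SQL Server instance itself (via SQL Server Configuration Manager), separately from the one bound in IIS.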

Setting up Azure AD Connect

I'm trying to install the preview of Azure AD Connect:
https://connect.microsoft.com/site1164/program8612
During the setup, you can configure the sign-in method for users: synchronization or federation with AD FS. I want to use AD FS, and I want the setup to configure a new AD FS farm. The setup wants an SSL certificate, so I made a self-signed certificate and exported it as a .PFX file. However, the setup won't accept the certificate; it states "The certificate is invalid or corrupted. Please try another certificate".
I selected another certificate which I've used for a website, and I get the same result. The certificate chain is OK, and I've tried to install the certificate, but no matter what, the setup keeps rejecting it. I can't find any further info in the event log or setup log file, and since the Azure AD Connect software is quite new and still in preview, there's not much info on the web regarding the installation.
Any ideas on how to make this work?
It is mandatory for AD FS to use a third-party signed certificate. If you don't want to pay for a certificate, you can use one from WoSign, which is free and publicly trusted (it chains up to VeriSign or something similar as an intermediate certification authority, I think).
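As a side note (not part of the original answer), a quick way to check that an exported .pfx is readable and actually contains its private key, which the AD FS configuration needs, is Python's cryptography package; the file name and password below are placeholders:

```python
# Sketch: verify that a .pfx/.p12 file loads and includes a private key.
# Path and export password are placeholders.
from cryptography.hazmat.primitives.serialization import pkcs12

with open("adfs-cert.pfx", "rb") as f:
    data = f.read()

key, cert, additional_certs = pkcs12.load_key_and_certificates(
    data, b"pfx-export-password"
)

print("Private key present:", key is not None)
print("Leaf subject:       ", cert.subject.rfc4514_string())
print("Extra chain certs:  ", len(additional_certs))
```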

LDAPS Connection from Local Active Directory Server to External Client

I am looking for a solution to my Active Directory problem.
Environment:
Attempting to authenticate users on an external CentOS 6.4 website (outside our firewall) by connecting to Microsoft Active Directory, which is located behind the firewall.
Currently, we use Active Directory within our firewall via the domain activedirectory.website.local, and it works fine. We are in the process of moving some of our sites to an externally hosted server, so we need SSL. We have generated a self-signed SSL certificate on the Active Directory server and have exported the ca.pem to the CentOS server.
When I try to authenticate against Active Directory from the terminal on the client CentOS machine (located outside our firewall), I get an error:
TLS: hostname (firewall.website.com) does not match common name in certificate (activedirectory.website.local)
This error occurs because:
I am trying to access Active Directory, which is behind our firewall, from a client computer outside it;
The certificate says "Hey, I was generated for activedirectory.website.local, but you are asking for firewall.website.com".
We talked to an SSL company about getting a commercial SSL certificate for the .local server, and they said they could sell us one for a year. Beyond that year they would not be able to renew it due to some sort of regulation.
Due to the complexity of the network, I cannot change the domain name of activedirectory.website.local or firewall.website.com.
I'm sure someone has run into this problem before, but I currently can't find any solutions on the web.
All I need from Active Directory is usernames and passwords for login authentication.
Thank you in advance!
First thing: can't you declare activedirectory.website.local with the right IP address in /etc/hosts?
Another option is to buy a certificate (or to create your own using your own CA) and install it on the Active Directory service. Have a look at How to enable LDAP over SSL with a third-party certification authority.
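Putting those two suggestions together (a hosts entry pointing activedirectory.website.local at an IP the client can reach, plus the exported ca.pem), a credential check from the CentOS box could look roughly like this with Python's ldap3 package; the UPN suffix and file path are assumptions:

```python
# Sketch: validate a username/password over LDAPS, trusting the exported
# ca.pem. Assumes /etc/hosts maps activedirectory.website.local to a reachable
# IP, so the requested name matches the certificate's common name.
import ssl

from ldap3 import Connection, Server, Tls

TLS = Tls(validate=ssl.CERT_REQUIRED, ca_certs_file="/etc/openldap/certs/ca.pem")
SERVER = Server("activedirectory.website.local", port=636, use_ssl=True, tls=TLS)


def authenticate(username: str, password: str) -> bool:
    """Return True when the simple bind succeeds, i.e. the credentials are valid."""
    if not password:
        return False  # guard against unauthenticated (empty-password) binds
    conn = Connection(SERVER, user=f"{username}@website.local", password=password)
    ok = conn.bind()  # False for bad credentials; raises on TLS/network errors
    conn.unbind()
    return ok
```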

Use Azure VM SQL Server database as source for Azure Analysis Services model

Is it possible to use a database created in an Azure VM as a data source for a model created in Azure Analysis Services?
So far, when I specify connection properties for the model in the web designer and test the connection, I get an error stating "A connection was successfully established with the server, but then an error occurred during the login process. (provider: SSL Provider, error: 0 - The certificate chain was issued by an authority that is not trusted.)"
I can connect to the server via SSMS and via RDP.
I created a self-signed certificate in Azure Key Vault and was able to make SQL Server use it. However, I can't seem to find out how to make use of it when connecting the model.
Does anyone know if it's possible and if so, what should I do to make it work?
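Before changing any certificates it can help to confirm that the trust chain really is the only obstacle. A hedged diagnostic sketch using pyodbc and SQL authentication (server name, database, and credentials are placeholders), toggling TrustServerCertificate:

```python
# Sketch: isolate whether the login failure is only about certificate trust.
# Server name, database, and SQL credentials are placeholders.
import pyodbc

BASE = (
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=myvm.westeurope.cloudapp.azure.com,1433;"
    "Database=MyDb;UID=sqluser;PWD=secret;Encrypt=yes;"
)

for trust in ("no", "yes"):
    try:
        pyodbc.connect(BASE + f"TrustServerCertificate={trust};", timeout=10).close()
        print(f"TrustServerCertificate={trust}: connected")
    except pyodbc.Error as exc:
        print(f"TrustServerCertificate={trust}: {exc}")

# If only TrustServerCertificate=yes succeeds, the client simply does not trust
# the certificate SQL Server presents, which matches the error in the question.
```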
In the end I managed to make it work. For anyone with a similar problem, I will write up my solution below.
As for the error "The certificate chain was issued by an authority that is not trusted": just as discussed in the thread linked by TJB in a comment, this was because I did not have a CA-signed certificate, but a self-signed one.
A CA-signed certificate from Azure would probably have solved the issue, but I tried the Let's Encrypt site (also linked in the other thread). The issue I had with Let's Encrypt was that I had a Windows server, while they natively support Linux-based systems.
However, I found an article by Daniel Hutmacher called Encrypting SQL Server connections with Let’s Encrypt certificates which solves the very issue I had.
(As for the client tool, the current version is different from the one described in the article, but you can still download the old version on GitHub; I used the latest November 2017 release.) With this I was able to generate and add a CA-signed certificate to SQL Server.
At this point, I created a model in Azure Analysis Services, used Azure Database as the source/connection type, and filled in the connection to my VM's SQL Server. I saw my database tables, but when I tried to query data, I got a new error stating that AAS needs an on-premises data gateway set up.
The Microsoft docs article Install and configure an on-premises data gateway describes how to install the on-premises data gateway on the VM, but if you are like me and use a personal account for Azure, you will have issues binding your account to the gateway. The solution, as hinted here, is to create a new account in Azure Active Directory (I created a new user and registered it under my Azure custom domain, so the login looked like XXX#zzz.onmicrosoft.com). I gave the user an admin role, so as to temporarily avoid any Azure permission setbacks. Next I added the user to my subscription via Subscriptions -> "My_subscription" -> Access control (IAM) and assigned the Owner role to the AD user.
Now back on my VM I could bind the new user's account to the gateway (don't forget to change the gateway's region to your preferred region before finishing the setup).
Next, in Azure I created an "on-premises data gateway" resource (note that you need to select the same region as the one your VM gateway is located in). I am not sure now whether only the new AD user I created could see the gateway, so if you do not see it, try the AD user as well.
Last but not least, in Azure Analysis Services I went to the "on-premises data gateway" settings and set it to use the one I just created.
With this I was able to create the model and query the data from the database.
Note:
In the model web designer for Analysis Services I happened to be logged in under the AD user, not under my personal account. Attempting to change the account to my personal one ended in a login failure; however, after a few such attempts and after opening multiple web designers in separate tabs, I logged in correctly under my personal account. After a while I could no longer replicate the issue.
I guess the issue may have been that I was logged in to Azure under both my personal account and the AD user at the same time in the same browser when setting everything up.
