I'm trying to integrate IBM Connections and IBM WebSphere Portal (WP) following this documentation. I can't get SSO working between them.
Here is the relevant section about SSO in the documentation.
After implementing all the steps, I get the message "You are not authorized" in the WP Connections portlets.
Knowing that the SSO relies on the LTPA mechanism, I have a couple of questions:
In this particular case, do the WP server and the IBM Connections server need to be in the same domain for LTPA to work?
Should the WP WebSphere server's security be configured to use the same federated repository as the Connections server? (The Connections server uses MAD LDAP.)
And can anybody explain which ID to use to authenticate in WP? (I mean, should it be an LDAP user rather than a local system user?)
1 - They can actually just share the same top-level domain; you only need to change your General Settings > Web SSO settings accordingly.
For instance, I could set the SSO domain to .ibm.com instead of a more specific domain, when my servers are at test.org.conx.ibm.com and portalserver.portal.ibm.com.
2 - It's much easier if they use the same repository, but it is not required, as long as the LTPA token is used to log in to the secondary server, such as Connections (see the wsadmin sketch after this list).
3 - Whatever group you have in your corporate LDAP that is set to manage Portal, and the IDs you use to access the Portal. Generally these should be mail, cn, or uid.
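To expand on point 2: for an LTPA token issued by Portal to be accepted by Connections, both WebSphere cells have to share the same LTPA keys (and the SSO domain from point 1). A rough wsadmin sketch of the key exchange; the install paths, key file location, and password are placeholders, not taken from this thread:

    # On the WebSphere Portal cell: export the LTPA keys (paths/password are placeholders)
    /opt/IBM/WebSphere/AppServer/bin/wsadmin.sh -lang jython -c \
      "AdminTask.exportLTPAKeys('[-ltpaKeyFile file:/tmp/ltpa.key -password changeit]')"

    # On the Connections cell: import the same keys and save, so both cells trust the same tokens
    /opt/IBM/Connections/AppServer/bin/wsadmin.sh -lang jython -c \
      "AdminTask.importLTPAKeys('[-ltpaKeyFile file:/tmp/ltpa.key -password changeit]'); AdminConfig.save()"

Both cells then need a restart for the shared keys and SSO domain to take effect.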
I have Windows Server 2016 running as an AD DC on which NTLM/NTLMv2 is disabled (Kerberos is the way to go). I have successfully joined an Ubuntu machine to it, using the tutorial "Integrate Ubuntu with AD". Everything is working correctly (except Samba): I can view users and groups in AD and can log in to the Ubuntu machine using an AD user.
Now when I try to log in to a Samba share with an AD user, I get NT_STATUS_NTLM_BLOCKED, which is expected, since NTLM is blocked by AD.
My question is: how do I set up (force) Samba to use Kerberos instead of NTLM?
It sounds like you're thinking that the SMB server just receives your password and then uses either NTLM or Kerberos to validate it. That's not how it works.
In SMB, it's the client which speaks NTLM or Kerberos when connecting to the server. You cannot force the server to use Kerberos because that is not the server's decision; it can either offer Kerberos or not, but it cannot make the client support Kerberos if the client doesn't support it.
Most mobile SMB client libraries do not have any Kerberos support (due to complexity); they will only use NTLM.
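If your client does support Kerberos (the standard Linux tools do), you can ask for it explicitly on the client side. A rough sketch, assuming the machine is already joined to the realm; the realm, user, and host names are placeholders:

    # Get a Kerberos ticket for the AD user first
    kinit aduser@EXAMPLE.COM

    # smbclient: request Kerberos authentication (-k on older Samba builds,
    # --use-kerberos=required on newer ones)
    smbclient //fileserver.example.com/share -k

    # mount.cifs: sec=krb5 makes the kernel CIFS client use the ticket instead of NTLM
    sudo mount -t cifs //fileserver.example.com/share /mnt/share -o sec=krb5,cruid=aduser

If the client only speaks NTLM, no server-side Samba option can change that.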
My "solution" to this issue was simply to exclude specific server from NTLM restriction policy.
There are two policies on the Active Directory server, under "Local Group Policy/Computer Configuration/Windows Settings/Security Settings/Local Policies/Security Options":
Network security: Restrict NTLM: Add server exceptions in this domain
Network security: Restrict NTLM: Add remote server exceptions for NTLM authentication
So servers that are defined under those two policies are able to use NTLM.
Not a solution, but for now it's a workaround.
We use LDAP and our local SQL Server database to authenticate our users via Apache Shiro, as the app is developed with Apache Isis. Users in the SQL Server database are REST consumers, while LDAP contains only business users. Now I have been instructed to move my LDAP users to MS AAD.
Is there an architecture that allows me to keep both kinds of users? Business users would access the app through the MS OpenID Connect portal, while other applications continue using DB authentication to consume the REST APIs.
Yes, it's possible. Essentially, your question is "how to enable multiple authentication mechanisms in a web app". Since AAD authentication is claims-based, very different from LDAP, you will certainly need to change your code to upgrade from LDAP to AAD.
Regarding multiple authentication mechanisms, I don't know the exact platform you're using, but here is a sample for ASP.NET Core for your reference: ASP.NET Core: Supporting multiple Authorization.
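Since the question mentions Apache Shiro, the equivalent idea there is to keep multiple realms behind one security manager: the SQL Server (JDBC) realm keeps serving the REST consumers while the interactive path moves to AAD, typically via an OpenID Connect filter such as pac4j (that last part is my assumption, not something from this answer). A minimal shiro.ini sketch with placeholder hosts, DNs, JNDI names, and queries:

    [main]
    # LDAP realm for business users (placeholder host and DN template)
    ldapRealm = org.apache.shiro.realm.ldap.JndiLdapRealm
    ldapRealm.userDnTemplate = uid={0},ou=people,dc=example,dc=com
    ldapRealm.contextFactory.url = ldap://ldap.example.com:389

    # JDBC realm for REST consumers stored in SQL Server (placeholder JNDI resource and query)
    dataSource = org.apache.shiro.jndi.JndiObjectFactory
    dataSource.resourceName = java:comp/env/jdbc/appDb
    jdbcRealm = org.apache.shiro.realm.jdbc.JdbcRealm
    jdbcRealm.dataSource = $dataSource
    jdbcRealm.authenticationQuery = SELECT password FROM rest_users WHERE username = ?

    # Shiro tries each realm in turn until one authenticates the subject
    securityManager.realms = $ldapRealm, $jdbcRealm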
As the title says: if an IIS web server running in Active Directory domain 'domain_A' can authenticate a user (Windows authentication) from domain 'domain_B', does it mean that 'domain_A' can use LDAP to query 'domain_B'?
As an additional note, I probed the 'domain_B' LDAP server on port 389 and it doesn't answer.
Thanks
It is possible, but not the only approach that would yield the same results. The server in domain_A could have something like
<add name="ADConnectionString" connectionString="LDAP://domaincontroller.domain_B.gTLD/DC=domain_B,DC=gTLD" />
in its configuration to perform authentication against domain_B using LDAP. The connectivity would originate from the server in domain_A, so unless you're sourcing the connection attempt from that server, the fact that port 389 (clear-text LDAP) or 636 (LDAP over SSL) appears closed is not indicative of anything.
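For context, such a connection string is typically consumed by a membership provider in the same web.config; a hypothetical sketch (provider name and mapped attribute are placeholders):

    <!-- hypothetical example: membership provider consuming ADConnectionString -->
    <membership defaultProvider="DomainBMembership">
      <providers>
        <add name="DomainBMembership"
             type="System.Web.Security.ActiveDirectoryMembershipProvider, System.Web, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"
             connectionStringName="ADConnectionString"
             attributeMapUsername="sAMAccountName" />
      </providers>
    </membership>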
But it's also possible the two domains are part of the same forest or a trust has been established between the domains. The web servers could be set up to use basic authentication. You'd need to look at the IIS config on the server in domain_A to be certain.
I have a Windows server running an ADFS server. I want to connect to an LDAP server on it. My questions are:
Does a server running ADFS already have an LDAP server running, or do I need to do anything for that? I believe it is running already because I could see that ports 389 and 636 are open.
Assuming the LDAP server is running, I tried to connect to it using Google Apps Directory Sync to get the list of users; however, I was not able to authorize. Are there any default credentials to connect with, or steps to obtain credentials for the LDAP server?
Thanks
An ADFS server is not an Active Directory server - ADFS only extends Active Directory's infrastructure. ADFS itself is not an LDAP server; if ports 389 and 636 appear open on that machine, it is most likely because a domain controller is running on, or colocated with, it. ADFS uses LDAP and LDAPS to retrieve user attributes from Active Directory and to authenticate users against Active Directory. If you already have a directory server running, you need to add it to ADFS as an account store.
There are no default credentials - just use an administrative account that exists in your Active Directory store, as mentioned in point one.
To clarify on terminology for ADFS:
Account Store in ADFS: This is the store that ADFS authenticates the user against with some form of credential (e.g. username/password). By default ADFS connects to Active Directory Domain Services and adds it as a special account store that cannot be deleted, so any user in this Active Directory forest or in its trusted subsystem can authenticate to ADFS. Until Windows Server 2016, ADFS supported only Active Directory as an account store and nothing else; with Windows Server 2016, it can now connect to any LDAP v3 compliant directory as an account store (a PowerShell sketch follows below). ADFS does not open LDAP ports, as it is not an LDAP server; if ADFS were colocated with a domain controller, you would see the LDAP ports open.
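For that Windows Server 2016 LDAP account store, the setup is done with PowerShell rather than the GUI. A rough sketch with placeholder names and a hypothetical third-party directory (parameter names as I recall the 2016 cmdlets; they may differ slightly between builds):

    # Sketch: register an LDAP v3 directory as an ADFS 2016 local claims provider (account store)
    $cred = Get-Credential    # bind account for the LDAP directory
    $ldap = New-AdfsLdapServerConnection -HostName "ldap.example.com" -Port 389 `
                -SslMode None -AuthenticationMethod Basic -Credential $cred

    Add-AdfsLocalClaimsProviderTrust -Name "ExampleLdap" -Identifier "urn:example:ldap" `
        -Type Ldap -LdapServerConnection $ldap `
        -UserObjectClass inetOrgPerson -UserContainer "ou=people,dc=example,dc=com" `
        -AnchorClaimLdapAttribute mail `
        -AnchorClaimType "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress" `
        -AcceptanceTransformRules 'c:[] => issue(claim = c);' -Enabled $true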
Attribute Store in ADFS: This is a store where you can augment additional information about the user AFTER the user authenticates. By default ADFS has an attribute store for AD DS that is set up by virtue of the install. Beyond this, it has built-in adapters that can be instantiated to connect to SQL or AD LDS (Lightweight Directory Services). It also has an extensible .NET API to connect to any other attribute store of your choice; people connect to Oracle/SAP databases, the FIM metaverse, etc.
#Srikanth: You would use the ADFS claims language or the UI to query for additional data using the attribute store model. In the UI, you see this when you configure the issuance authorization rules or the issuance transform (claims) rules.
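As an illustration of that claims-language/attribute-store model, an issuance transform rule that looks up the user's mail from the default Active Directory attribute store after authentication looks roughly like this:

    @RuleName = "Query mail from the Active Directory attribute store"
    c:[Type == "http://schemas.microsoft.com/ws/2008/06/identity/claims/windowsaccountname", Issuer == "AD AUTHORITY"]
     => issue(store = "Active Directory",
              types = ("http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress"),
              query = ";mail;{0}", param = c.Value);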
Hope that helps
Sam (#MrADFS)
Is it possible to use a database created on an Azure VM as a data source for a model created in Azure Analysis Services?
So far, when I specify connection properties for the model in the web designer and test the connection, I get an error stating "A connection was successfully established with the server, but then an error occurred during the login process. (provider: SSL Provider, error: 0 - The certificate chain was issued by an authority that is not trusted.)"
I can connect to the server via SSMS and via RDP.
I created a self-signed certificate in Azure Key Vault and was able to make SQL Server use it. However, I can't seem to find out how to make use of it when connecting the model.
Does anyone know if it's possible and if so, what should I do to make it work?
In the end I managed to make it work. For anyone with a similar problem, I will describe my solution below.
For the error "The certificate chain was issued by an authority that is not trusted": just as discussed in the thread linked by TJB in a comment, this was because I did not have a CA-signed certificate, but a self-signed one.
A CA-signed certificate from Azure would probably solve the issue, but I tried Let's Encrypt (also linked in the other thread). The issue I had with Let's Encrypt was that I had a Windows server, while they natively support Linux-based systems.
However, I found an article by Daniel Hutmacher called Encrypting SQL Server connections with Let’s Encrypt certificates, which addresses the very issue I had.
(As for the client tool, the current version is different from the one described in the article, but you can still download the old version on GitHub; I used the latest November 2017 release.) With this I was able to generate and add a CA-signed certificate to SQL Server.
At this point, I created a model in Azure Analysis Services, used Azure Database as the source/connection type, and filled in the connection to my VM's SQL Server. I saw my database tables, but when I tried to query data, I got a new error stating that AAS needs an on-premises data gateway set up.
The Microsoft docs article Install and configure an on-premises data gateway describes how to install the on-premises data gateway on the VM, but if you are like me and use a personal account for Azure, you will have issues binding your account to the gateway. The solution, as hinted here, is to create a new account in Azure Active Directory (I created a new user and registered it under my Azure custom domain, so the login looked like XXX#zzz.onmicrosoft.com). I gave the user the admin role so as to temporarily avoid any Azure permission setbacks. Next I added the user to my subscription via Subscriptions -> "My_subscription" -> Access Control (IAM) and assigned the Owner role to the AD user (roughly what the CLI sketch below does).
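The user creation and subscription role assignment can also be scripted; a sketch of what I did in the portal, expressed with the Azure CLI and placeholder names (the AAD directory admin role itself I assigned in the portal):

    # Create the work account in the AAD tenant (UPN and password are placeholders)
    az ad user create --display-name "Gateway Admin" \
        --user-principal-name gwadmin@zzz.onmicrosoft.com \
        --password "<strong-password>"

    # Grant that account a role on the subscription (Owner here, to avoid permission setbacks)
    az role assignment create --assignee gwadmin@zzz.onmicrosoft.com \
        --role Owner \
        --scope /subscriptions/<subscription-id>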
Now back on my VM I could bind the new user's account to the gateway (don't forget to change the gateway's region to your preferred region before finishing the setup).
Next, in Azure I created an "on-premises data gateway" resource (note that you need to select the same region as the one your VM gateway is located in). I am not sure whether only the new AD user I created could see the gateway, so in case you do not see it, try the AD user as well.
Last but not least, in Azure Analysis Services I went to the "on-premises data gateway" settings and set it to use the one I had just created.
With this I was able to create the model and query the data from the database.
Note:
In the model web designer for Analysis Services I happened to be logged in under the AD user, not under my personal account. Attempting to change the account to my personal one ended in a login failure; however, after a few such attempts and opening multiple web designers in separate tabs, I was logged in correctly under my personal account. After a while I could no longer replicate the issue.
I guess the issue may have been that I was logged in to Azure under both my personal account and the AD user at the same time in the same browser when setting everything up.