I am trying to get Gerrit to authenticate against an Active Directory LDAP server, but I'm having trouble finding the right combination of LDAP settings to make it work. I'm seeing errors like this in error_log:
WARN com.google.gerrit.server.auth.ldap.LdapRealm : Cannot discover type of LDAP server at ldap://ldapserver.company.com:3268, assuming the server is RFC 2307 compliant.
javax.naming.NamingException: [LDAP: error code 1 - 000004DC: LdapErr: DSID-0C090748, comment: In order to perform this operation a successful bind must be completed on the connection., data 0, v2580^#]; remaining name ''
Is there a "typical" Active Directory config for Gerrit, and should I be using LDAP_BIND authentication?
LDAP (as opposed to LDAP_BIND) is the correct authentication type.
This was almost completely answered here:
[ldap]
server = ldap://dc.ad.company.com:3268
username = ldapuser@ad.company.com
accountBase = DC=ad,DC=company,DC=com
groupBase = DC=ad,DC=company,DC=com
accountPattern = (&(objectClass=person)(sAMAccountName=${username}))
accountFullName = displayName
accountEmailAddress = mail
accountSshUserName = sAMAccountName
groupMemberPattern = (sAMAccountName=${username})
groupName = cn
localUsernameToLowerCase = true
However, in my case, there is no ad component to the LDAP server hostname, so it looks more like:
[ldap]
server = ldap://something.company.com:3268
username = ldapuser@company.com
accountBase = DC=company,DC=com
groupBase = DC=company,DC=com
accountPattern = (&(objectClass=person)(sAMAccountName=${username}))
accountFullName = displayName
accountEmailAddress = mail
accountSshUserName = sAMAccountName
groupMemberPattern = (sAMAccountName=${username})
groupName = cn
localUsernameToLowerCase = true
Also, you need to add the LDAP password to etc/secure.config (or you can use SecureStore), which should only be readable by the Gerrit user:
[ldap]
password = yourpassword
You will see an error like this if this is not done:
ERROR com.google.gerrit.server.auth.ldap.LdapRealm : Cannot query LDAP to autenticate user
javax.naming.NamingException: [LDAP: error code 1 - 000004DC: LdapErr: DSID-0C090748, comment: In order to perform this operation a successful bind must be completed on the connection., data 0, v2580^#]; remaining name 'DC=company,DC=com'
After this, you can log in with the AD username (without any @company.com part, just the username) and your usual password.
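Before wiring this into Gerrit, it can help to verify the bind credentials, base DN, and account pattern independently, e.g. with OpenLDAP's ldapsearch. This is only a sketch: the host, bind user, password, and sAMAccountName below are the placeholder values from the config above, not real ones.

```shell
# Verify the bind works against the Global Catalog port (3268).
# Host, bind user, password, and sAMAccountName are placeholders.
ldapsearch -H ldap://something.company.com:3268 \
  -D 'ldapuser@company.com' -w 'yourpassword' \
  -b 'DC=company,DC=com' \
  '(&(objectClass=person)(sAMAccountName=someuser))' displayName mail
```

If this returns the user's displayName and mail attributes, the same server, username, accountBase, and password values should work in Gerrit's [ldap] section and secure.config.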
I'm able to successfully connect to the Snowflake database through my .NET app, but I'm unable to run a SQL command due to the following error from Snowflake:
Message: Http status: UnprocessableEntity
ResponseContent:
{
  "code" : "391920",
  "message" : "Unable to run the command. You must specify the warehouse to use by either setting the warehouse field in the body of the request or by setting the DEFAULT_NAMESPACE property for the current user.",
  "sqlState" : "57P03",
  "statementHandle" : "01a8
Here is the code I'm using:
public async Task<QueryResult> QuerySnowflake(string statement, string database, string schema)
{
    var content = new
    {
        statement,
        database,
        schema
    };

    return await _httpClient.SnowflakePost<QueryResult>(
        $"https://{_accountId}.snowflakecomputing.com/api/v2/statements",
        content,
        await GetHeaders(),
        _cancellationToken);
}
statement = SELECT * FROM SNOWFLAKE_SAMPLE_DATA.TPCH_SF1.CUSTOMER
database = SNOWFLAKE_SAMPLE_DATA
schema = TPCH_SF1
I have already tried the following:
ALTER USER my_username SET DEFAULT_NAMESPACE = SNOWFLAKE_SAMPLE_DATA.TPCH_SF1
GRANT SELECT ON ALL TABLES IN SCHEMA "TPCH_SF1" TO ROLE sysadmin
ALTER USER my_username SET DEFAULT_ROLE = sysadmin
None of these changed the error response.
I don't think a code change is needed, since the same code works with other Snowflake accounts (I'm using a new trial account). I believe something is wrong with my account (e.g. a missing role, missing warehouse, or missing permission).
Any help would be very much appreciated.
The user does not have a default warehouse, and none is specified in the connection request or by a USE command in the session. You can try sending this command before running your SELECT:
use warehouse MY_WAREHOUSE;
You can also specify it in the connection, or specify a default for the user:
ALTER USER MY_USER SET DEFAULT_WAREHOUSE = MY_WAREHOUSE;
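The error message also mentions the warehouse field in the request body: the SQL API statements endpoint accepts a warehouse property alongside statement, database, and schema. Extending the request body from the question would look roughly like this (MY_WAREHOUSE is a placeholder for your warehouse name):

```json
{
  "statement": "SELECT * FROM SNOWFLAKE_SAMPLE_DATA.TPCH_SF1.CUSTOMER",
  "database": "SNOWFLAKE_SAMPLE_DATA",
  "schema": "TPCH_SF1",
  "warehouse": "MY_WAREHOUSE"
}
```

In the C# code above that would mean adding a warehouse member to the anonymous content object.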
I am accessing the other database using elastic queries. The data source was created like this:
CREATE EXTERNAL DATA SOURCE TheCompanyQueryDataSrc WITH (
TYPE = RDBMS,
--CONNECTION_OPTIONS = 'ApplicationIntent=ReadOnly',
CREDENTIAL = ElasticDBQueryCred,
LOCATION = 'thecompanysql.database.windows.net',
DATABASE_NAME = 'TheCompanyProd'
);
To reduce the database load, a read-only replica was created and should be used. As far as I understand it, I should add CONNECTION_OPTIONS = 'ApplicationIntent=ReadOnly' (commented out in the code above). However, I only get: Incorrect syntax near 'CONNECTION_OPTIONS'.
Both databases (the one that defines the connection and external tables, and the other, to-be-read-only one) are on the same server (thecompanysql.database.windows.net). Both are set to compatibility level SQL Server 2019 (150).
What else should I set to make it work?
The CREATE EXTERNAL DATA SOURCE syntax doesn't support the option CONNECTION_OPTIONS = 'ApplicationIntent=ReadOnly', so we can't use it in these statements.
If you want to achieve a read-only connection, use a user account that only has read-only (db_datareader) permission to log in to the external database.
For example:
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<password>';

CREATE DATABASE SCOPED CREDENTIAL SQL_Credential
WITH
    IDENTITY = '<username>',  -- read-only user account
    SECRET = '<password>';

CREATE EXTERNAL DATA SOURCE MyElasticDBQueryDataSrc
WITH
(
    TYPE = RDBMS,
    LOCATION = '<server_name>.database.windows.net',
    DATABASE_NAME = 'Customers',
    CREDENTIAL = SQL_Credential
);
Since the option is not supported, we can't use it with an elastic query. When connecting to the Azure SQL database directly with SSMS, read-only intent can instead be specified by entering ApplicationIntent=ReadOnly in the Additional Connection Parameters tab of the connection dialog.
HTH.
This question already has an answer here:
Karate - Database testing - getting timestamp displayed as nano
When I use DBUtils.java in Eclipse and run the tests there, everything works fine, but when I run them through Jenkins, the first use of DbUtils fails and the second one works.
The first call to db.readRows fails.
Scenario: Account Create
Given path 'accounts'
And header Authorization = setup.authorization
And request {identifier: KarateCreation, subscribers:[{identifier:KarateCreation, firstName:KarateCreation, lastName:KarateCreation}]}
When method POST
And match response contains { id: '#number', identifier: KarateCreation }
Then status 201
* def id = response.id
* def accountNumber = response.identifier
# use jdbc to validate
* def config = { url: #(dbConnectionString), driverClassName: 'oracle.jdbc.OracleDriver' }
* def DbUtils = Java.type('restapi.util.DbUtils')
* def db = new DbUtils(config)
* def rs = db.readRows("SELECT ACCOUNTID, ACCOUNTNUMBER FROM ACCOUNT WHERE ACCOUNTNUMBER = 'KarateCreation'")
* match rs contains { ACCOUNTID: '#(id)', ACCOUNTNUMBER: KarateCreation }
Error:
* def rs = db.readRows("SELECT ACCOUNTID, ACCOUNTNUMBER FROM ACCOUNT WHERE ACCOUNTNUMBER = 'KarateCreation'")(Scenario: Account Create) Time elapsed: 0.039 sec <<< ERROR!
java.lang.RuntimeException: javascript evaluation failed: db.readRows("SELECT ACCOUNTID, ACCOUNTNUMBER FROM ACCOUNT WHERE ACCOUNTNUMBER = 'KarateCreation'")
at com.intuit.karate.ScriptBindings.eval(ScriptBindings.java:115)
at com.intuit.karate.ScriptBindings.updateBindingsAndEval(ScriptBindings.java:103)
at com.intuit.karate.ScriptBindings.evalInNashorn(ScriptBindings.java:88)
at com.intuit.karate.Script.evalJsExpression(Script.java:362)
at com.intuit.karate.Script.evalKarateExpression(Script.java:284)
at com.intuit.karate.Script.evalKarateExpression(Script.java:170)
at com.intuit.karate.Script.assign(Script.java:598)
at com.intuit.karate.Script.assign(Script.java:524)
at com.intuit.karate.StepDefs.def(StepDefs.java:305)
at ✽.* def rs = db.readRows("SELECT ACCOUNTID, ACCOUNTNUMBER FROM ACCOUNT WHERE ACCOUNTNUMBER = 'KarateCreation'")(restapi/accounts/accounts.feature:31)
Caused by: org.springframework.jdbc.CannotGetJdbcConnectionException: Could not get JDBC Connection; nested exception is java.sql.SQLRecoverableException: IO Error: Connection reset
First, may I gently remind you that DBUtils.java was created as a demo example and is not part of the core of Karate. I am beginning to regret having put it there because of questions like this. See another example.
EDIT - Since this question comes up a lot: you are expected to write your own code to connect to your database, execute SQL, and unpack the results the way you want. Please don't tag questions about this as "karate".
Anyway, please work with somebody in your team or org to fix this problem:
Caused by: org.springframework.jdbc.CannotGetJdbcConnectionException:
Could not get JDBC Connection; nested exception is
java.sql.SQLRecoverableException: IO Error: Connection reset
It is quite possible that your Jenkins box cannot establish a connection to the database, e.g. because the ports are fire-walled off.
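If the first connection attempt from Jenkins fails with a transient "Connection reset" but subsequent ones succeed, one pragmatic option in your own utility code is a retry wrapper around the JDBC call. This is only a sketch under that assumption; Retry and withRetry are made-up names, not part of Karate or the demo DbUtils:

```java
import java.util.concurrent.Callable;

public class Retry {

    // Calls the task, retrying up to maxAttempts times on any exception,
    // with a simple linear backoff between attempts. Useful for transient
    // failures such as "IO Error: Connection reset" on a cold connection.
    public static <T> T withRetry(Callable<T> task, int maxAttempts) throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return task.call();
            } catch (Exception e) {
                last = e;
                Thread.sleep(100L * attempt); // back off before retrying
            }
        }
        throw last; // all attempts failed: rethrow the last exception
    }
}
```

You could then wrap the query in your own DbUtils, e.g. `Retry.withRetry(() -> jdbcTemplate.queryForList(sql), 3)`, so a single reset does not fail the scenario.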
I am trying to configure a SAML non-gallery enterprise app and am having a problem configuring the claims. To summarize the current claims: objectGUID is sent as the name identifier, extensionAttribute6 is sent as OrgID, and givenName, sn, and the e-mail address are sent without any changes.
AD Connect has been configured to sync objectGUID and extensionAttribute6 to AAD, and those attributes are available in the SSO configuration blade for the Enterprise App.
My questions are:
1) Does a Namespace need to be defined for the objectGUID, or can it just be selected from the source attribute in the claim and name identifier?
2) How do I transform extensionAttribute6 into OrgID?
The current claim rules in ADFS are:
1)
c:[Type ==
"http://schemas.microsoft.com/ws/2008/06/identity/claims/windowsaccountname",
Issuer == "AD AUTHORITY"]
=> issue(store = "Active Directory", types = ("GUID"), query = ";objectGuid;{0}", param = c.Value);
2)
c:[Type == "GUID"]
=> issue(Type = "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/nameidentifier",
Issuer = c.Issuer, OriginalIssuer = c.OriginalIssuer, Value = c.Value,
ValueType = c.ValueType,
Properties["http://schemas.xmlsoap.org/ws/2005/05/identity/claimproperties/format"]
= "urn:oasis:names:tc:SAML:1.1:nameid-format:unspecified");
3)
c:[Type ==
"http://schemas.microsoft.com/ws/2008/06/identity/claims/windowsaccountname",
Issuer == "AD AUTHORITY"]
=> issue(store = "Active Directory", types = ("givenName", "sn", "OrgID", "mail"), query =
";givenName,sn,extensionAttribute6,mail;{0}", param = c.Value);
You do not have to specify the namespace when you are mapping User.ObjectID as the NameID claim. Also note: do not select any NameID format, keep it as Default. Azure AD supports the pairwise name identifier, which means that if the service provider sends a NameID format in the SAML request, the app will receive the NameID from Azure AD in that format.
If you are mapping the User.ObjectID claim as another claim, you can add the namespace value as needed, but that depends on how the app needs it back.
Regarding the OrgID transformation, I am not sure what you want to send. You can define OrgID as the claim name and select ExtensionAttribute6 as the value; if the value exists for the user, you should see it in the SAML response.
I hope this helps.
Thanks and Regards,
Jeevan Desarda
I have Airflow successfully set up to work with my AD/LDAP when everyone is a superuser and data profiler. But now I want to define an admin group and a regular user group. I have the following settings.
Working Config Where Everyone Is An Admin:
# set a connection without encryption: uri = ldap://<your.ldap.server>:<port>
uri = ldap://123.456.789:123
user_filter = objectClass=*
# in case of Active Directory you would use: user_name_attr = sAMAccountName
user_name_attr = sAMAccountName
# group_member_attr should be set accordingly with *_filter
# eg :
# group_member_attr = groupMembership
# superuser_filter = groupMembership=CN=airflow-super-users...
group_member_attr = member
group_name_attr = CN
group_filter = objectclass=group
bind_user = CN=blah,OU=foo,DC=us,DC=bar,DC=com
bind_password = yahoo
basedn = DC=us,DC=bar,DC=com
# Set search_scope to one of them: BASE, LEVEL , SUBTREE
# Set search_scope to SUBTREE if using Active Directory, and not specifying an Organizational Unit
search_scope = SUBTREE
New Config With Specific Admin Group Set:
# set a connection without encryption: uri = ldap://<your.ldap.server>:<port>
uri = ldap://123.456.789:123
user_filter = objectclass=*
# in case of Active Directory you would use: user_name_attr = sAMAccountName
user_name_attr = sAMAccountName
# group_member_attr should be set accordingly with *_filter
# eg :
# group_member_attr = groupMembership
# superuser_filter = groupMembership=CN=airflow-super-users...
superuser_filter = memberOf=CN=MyAdminGroupName,OU=foo,DC=us,DC=bar,DC=com
data_profiler_filter = memberOf=CN=MyAdminGroupName,OU=foo,DC=us,DC=bar,DC=com
group_member_attr = member
group_name_attr = CN
group_filter = objectclass=group
bind_user = CN=blah,OU=foo,DC=us,DC=bar,DC=com
bind_password = yahoo
basedn = DC=us,DC=bar,DC=com
# Set search_scope to one of them: BASE, LEVEL , SUBTREE
# Set search_scope to SUBTREE if using Active Directory, and not specifying an Organizational Unit
search_scope = SUBTREE
Resource: https://airflow.apache.org/security.html
With this new configuration I am able to log into the Airflow UI, but I'm no longer able to view the Admin tab. I am 100% sure I am part of the admin group MyAdminGroupName. I'm also not sure where to put my regular user group name MyRegularGroupName.
Can someone please guide me on how to configure my Admin group (MyAdminGroupName) and my regular user group (MyRegularGroupName)?
I also struggled with setting up LDAP in Airflow.
First off: what is group_filter = objectclass=group in your config? I cannot find it specified in the docs or in ldap_auth.py.
Then, your group_member_attr is set to member, but in the filter queries you're using memberOf, so I guess that memberOf should be your group_member_attr (it usually is, if you're using Active Directory).
Your superuser_filter and data_profiler_filter look good to me.
To whoever reads this: in the code, the filters are inserted into a string like this: (&(<FILTER_HERE>)), so if you want to build a more sophisticated filter, take this into account.
E.g. I wanted to give only three users superuser rights (using environment variables for the config):
AIRFLOW__LDAP__SUPERUSER_FILTER: "&(objectCategory=Person)(|(sAMAccountName=user1)(sAMAccountName=user2)(sAMAccountName=user3))(memberOf=CN=MyDepartment,OU=Departments,OU=UserGroup,DC=MyCompany,DC=local)"
Regarding your question about MyRegularGroupName: I guess you can set the user filter to match only people in your regular user group, and then specify the admin group in the superuser and data profiler filters. But that only works if the admin group is a subset of the regular user group.
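Putting those suggestions together, the relevant lines of the [ldap] config could look roughly like this. This is a sketch using the placeholder group names and DCs from the question, and it assumes the memberOf attribute is populated on your user entries:

```
# Regular users: only members of MyRegularGroupName may log in at all
user_filter = memberOf=CN=MyRegularGroupName,OU=foo,DC=us,DC=bar,DC=com
group_member_attr = memberOf
# Admins: members of MyAdminGroupName get superuser and data profiler rights
superuser_filter = memberOf=CN=MyAdminGroupName,OU=foo,DC=us,DC=bar,DC=com
data_profiler_filter = memberOf=CN=MyAdminGroupName,OU=foo,DC=us,DC=bar,DC=com
```

Again, this only behaves as intended if every member of MyAdminGroupName is also a member of MyRegularGroupName.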
Hope that helps.