Kafka Connector for Snowflake keeps failing

When I start the Kafka Connector I keep getting this error:
Caused by: org.apache.kafka.connect.runtime.rest.errors.BadRequestException: Connector configuration is invalid and contains the following 3 error(s):
snowflake.url.name: Cannot connect to Snowflake
snowflake.user.name: Cannot connect to Snowflake
snowflake.private.key: Cannot connect to Snowflake
I have tried setting the URL in the following two ways:
snowflake.url.name=<ID assigned to me>.snowflakecomputing.com:443
snowflake.url.name=<My user id>.us-west-2.snowflakecomputing.com:443
I've set snowflake.user.name as:
snowflake.user.name=<My login id>
I'm not sure exactly how to set 'snowflake.private.key'. I copied the contents of:
~/.ssh/id_rsa
after removing all newline characters, so the value looks something like this:
snowflake.private.key=MIIEowIBAAKCAQEApM9bYyleCC+......... <long string>
I also tried to run the following command in a Snowflake worksheet under the SECURITYADMIN role, but it keeps failing:
alter user <my user id> set rsa_public_key='MIIEowIBAAKCAQEApM9bYyleCC...';
Error message:
SQL access control error: Insufficient privileges to operate on user
What am I doing wrong?

In my case, it worked when I used the 'ACCOUNTADMIN' role.
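A minimal sketch of what that can look like in a worksheet, assuming a placeholder user name and a truncated placeholder key value:
-- switch to a role that is allowed to alter the user, then register the public key
USE ROLE ACCOUNTADMIN;
ALTER USER my_user SET RSA_PUBLIC_KEY='MIIBIjANBgkq...';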

snowflake.url.name should match the account name that you use to get to Snowflake via the UI, not your login name. This account name might have a region in the URL as well, which should be included. For example, xyzcompany.us-east-1.snowflakecomputing.com:443.
snowflake.url.name=<account_name>.snowflakecomputing.com:443
I would make sure that you are setting your role in the correct place in the UI worksheet. The easiest way to check is to run a SELECT CURRENT_ROLE(); command. You can also just run USE ROLE SECURITYADMIN; in the worksheet to make sure you are set correctly. That role should have permissions to alter user parameters.
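As a quick sanity check (the user name below is a placeholder), you can confirm the active role first and, once the ALTER USER succeeds, verify that the key was registered:
-- confirm which role the worksheet is actually using
SELECT CURRENT_ROLE();
USE ROLE SECURITYADMIN;
-- after the ALTER USER, this should list a fingerprint under RSA_PUBLIC_KEY_FP
DESC USER my_user;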

Related

use database with mixed case is not working via ODBC

I have a database with mixed case, i.e. testDATABASE.
I run (using ODBC) the query use database ""testDATABASE";, then I run the query use schema "PUBLIC",
and the query fails with the error:
ERROR: SQL compilation error:
Object does not exist, or operation cannot be performed.
Error Code: 2043
Query = use schema "PUBLIC"
When I run it not via ODBC but in the notebook, it works fine.
The same queries against a database that does not contain mixed case work fine.
If I run use schema "testDATABASE"."PUBLIC", it runs OK via ODBC and in the notebook.
Is there a known issue about this? How can I run it as two queries via ODBC and make it work?
Thanks.
In your question it looks like your use database command had double double quotes but your schema didn't; perhaps that might be the issue.
Overall suggestions:
When you make object names MiXeD-CaSe, it simply makes using the objects more difficult, so I'd recommend not using mixed case if you can avoid it. You may not be able to avoid this; that's OK, it's just a suggestion.
If you can't avoid it, the only time I'd use double quotes is when the object name (in this case, the database name) has mixed case.
In your case, you should be able to run (you may have to double-double quote it in ODBC):
use database "testDATABASE";
and then this (no double quotes needed because it's not mixed case):
use schema PUBLIC;
This document illustrates how you don't need to prefix the schema with the database:
https://docs.snowflake.com/en/sql-reference/sql/use-schema.html
Something else I recommend to folks getting started: for each user, I like to set all the default context items (role, warehouse, namespace):
ALTER USER rich SET DEFAULT_ROLE = 'RICH_ROLE';
ALTER USER rich SET DEFAULT_WAREHOUSE = 'RICH_WH' ;
ALTER USER rich SET DEFAULT_NAMESPACE = 'RICH_DB.TEST_SCHEMA';
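After reconnecting, a quick way to confirm those defaults took effect is Snowflake's context functions:
-- should return the defaults set above for the current session
SELECT CURRENT_ROLE(), CURRENT_WAREHOUSE(), CURRENT_DATABASE(), CURRENT_SCHEMA();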

MSSQL schema & Hibernate 5.1.0.Final (table not found)

I'm trying to understand how to configure my Hibernate to work properly with my MSSQL DB and its schemas.
The problem is that during validation of tables, it logs (for every table):
org.hibernate.tool.schema.extract.internal.InformationExtractorJdbcDatabaseMetaDataImpl
- HHH000262: Table not found SHARED_CONFIGURATION
I debugged Hibernate to find out what causes this and found that it calls something like:
EXECUTE [mydb]..sp_columns N'SHARED_CONFIGURATION',N'',N'mydb'
Notice that the 2nd parameter is the schema name and an empty string is passed there. When I tried to run this query against the DB, it returned an empty result set, but when I passed 'dbo' as the 2nd parameter, the result set was not empty (meaning that Hibernate should call it that way instead).
OK, so it seemed that I needed to define the schema. But both setting hibernate.default_schema and setting schema in the @Table annotation on my entities threw an exception:
Schema-validation: missing table [SHARED_CONFIGURATION]
So now I'm wondering what the real problem is. I also wanted to set the default schema in my DB but was not allowed to (Cannot alter the user 'sa', because it does not exist or you do not have permission.), even when executing as user 'sa' itself:
ALTER USER sa WITH DEFAULT_SCHEMA = dbo;
Note that this happens with any driver (jTDS, the official MS driver, etc.).
Can someone explain what is happening here and how to "correctly" get rid of that warning message in the log that says the table does not exist even though it exists (and the application is able to run properly with the database)?
I had the same problem and solved it by setting the property hibernate.hbm2ddl.jdbc_metadata_extraction_strategy to individually.

Oracle + dbunit throws AmbiguousTableNameException

I'm using DBUnit to populate the database so that its content is known during testing.
The DB schema I'm working on is in an Oracle 11g instance in which other DB schemas also reside. In some of those schemas a table has been defined that is associated with a public synonym and on which SELECT rights have been granted.
When I run the XML that defines how the database must be populated, even though the XML file doesn't reference the table that is defined in several schemas, DBUnit throws the AmbiguousTableNameException on that table.
I found that there are 3 ways to work around this behavior:
Use a database connection credential that has access to only one database schema.
Specify a schema name to the DatabaseConnection or DatabaseDataSourceConnection constructor.
Enable the qualified table name support (see the How-to documentation).
In my case, I can only apply solution 1, but even when I adopt it, I get the same exception.
The table that gives me problems is defined in 3 schemas, and I don't have the option to act on it in any way.
Please, could someone help me?
I found the solution: I specified the schema in the table names and set the property http://www.dbunit.org/features/qualifiedTableNames (corresponding to org.dbunit.database.DatabaseConfig.FEATURE_QUALIFIED_TABLE_NAMES) to true.
This way, my XML code to populate the tables looks like:
<?xml version='1.0' encoding='UTF-8'?>
<dataset>
<SCHEMA.TABLE ID_FIELD="1" />
</dataset>
where SCHEMA is the schema name and TABLE is the table name.
To set the property, I've used the following code:
DatabaseConfig dBConfig = dBConn.getConfig(); // dBConn is a IDatabaseConnection
dBConfig.setProperty(DatabaseConfig.FEATURE_QUALIFIED_TABLE_NAMES, true);
In my case, I had granted the dba role to the user, and thus DBUnit threw AmbiguousTableNameException.
After I revoked the dba role from the user, the problem was solved.
SQL> revoke dba from username;
I had the same AmbiguousTableNameException while executing DBUnit against an Oracle DB. It was working fine and started throwing the error one day.
Root cause: while calling a stored procedure, it had been modified by mistake to lower case. When changed back to upper case, it started working.
I could also solve this by setting the schema name on IDatabaseTester, like iDatabaseTester.setSchema("SCHEMANAMEINCAPS").
Thanks
Smitha
I was using Spring JDBC along with MySQL Connector (v8.0.17). Following the 2 steps explained in this answer alone did not help.
First I had to set the schema on the Spring datasource.
Then I also had to set the property "databaseTerm" to "schema"; by default it is set to "catalogue", as explained here.
We must set this property because, in Spring's implementation of javax.sql.DataSource, if it's not set (i.e. it defaults to "catalogue"), the connection returned by dataSource.getConnection() will not have the schema set on it even if we set it on the dataSource.
@Bean
public DriverManagerDataSource cloudmcDataSource() {
    DriverManagerDataSource dataSource = new DriverManagerDataSource();
    dataSource.setDriverClassName("<driver>");
    dataSource.setUrl("<url>");
    dataSource.setUsername("<uname>");
    dataSource.setPassword("<password>");
    dataSource.setSchema("<schema_name>");
    Properties props = new Properties();
    // the following key-value pair is a constant; it must be set as is
    props.setProperty("databaseTerm", "schema");
    dataSource.setConnectionProperties(props);
    return dataSource;
}
Don't forget to make the changes explained in answer here.

How to determine if a user is a member of a non-fixed sql database role

Executing the following sql:
USE [WSS_Content]
EXEC sp_helplogins
returns two result sets, and within the second result set I get several rows, but one in particular looks like this:
LoginName DBName UserName UserOrAlias
DEMO\SPUser SharePoint_Config SharePoint_Shell_Access MemberOf
This is as expected, and what I want to see. However executing the following sql:
EXEC sp_helprole 'SharePoint_Shell_Access'
gives me an error which says: 'SharePoint_Shell_Access' is not a role. But it is! I can see that it is from SQL Server Management Studio, and it even shows me the role's members right there.
Ideally what I'm trying to accomplish is to be able to use this:
SELECT IS_MEMBER('SharePoint_Shell_Access')
but of course, this returns null because it also thinks that it is not a valid role when it most definitely is. What gives, and how do I best query to see whether a given user (or the current user) is a member of this custom database role?
I am running SQL Server 2008 R2.
As it turns out, sp_helplogins returns results across all databases, which is why that role was showing up when it shouldn't have if it were limited to the current DB like I previously thought.
For whatever reason I just didn't notice that and thought the entry was applicable to WSS_Content; instead, it was from the SharePoint_Config database. The error was correct after all: there was no role by that name in the database where I was looking. I was just getting confused.
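In other words, IS_MEMBER has to be evaluated in the database that actually owns the role. A minimal sketch of the check against the database from this example:
USE SharePoint_Config;
-- 1 = current user is a member, 0 = not a member, NULL = not a valid role in this database
SELECT IS_MEMBER('SharePoint_Shell_Access');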

Some tables in SQL Server require [user].[table] and others don't, why is this and can I force it?

As the title suggests, I am confused as to why some tables in my database fall over if you do something like:
SELECT * FROM [user].[table]
And yet on other tables it works fine.
I am testing some code that will eventually be on a server that cries if you don't use [user].[table], so I would really like to force this on my machine.
Could someone explain why this happens and possibly show me how to fix it?
More Info
Here is the message I get when I try and run a query using [user].[table] instead of just [table]:
[Microsoft][ODBC SQL Server Driver][SQL Server]Invalid object name 'usr.tbl'
The "user" bit is the schema a table belongs to
So you can have dbo.table and user.table in the same database.
By default, SELECT * FROM table will usually look for the dbo.table. However, if the login/user has a different default schema then it will look for thatschema.table
To fix it:
You can use ALTER SCHEMA .. TRANSFER .. to fix the current setup (see the sketch after this list).
Ongoing, ensure every table reference has the correct schema on CREATE, ALTER, SELECT, whatever.
Also see "User-Schema Separation" on MSDN
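A minimal sketch of the ALTER SCHEMA .. TRANSFER .. approach, reusing the hypothetical names from the error message above:
-- moves the table out of dbo and into the usr schema, so [usr].[tbl] resolves
ALTER SCHEMA usr TRANSFER dbo.tbl;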
What you refer to as [user] is actually something called a schema. Every user has a default schema, which means that when you are logged in as that user you can refer to the tables in the default schema without the schema prefix. One way to solve this would be to make sure that no user has as their default schema the schema where the tables are located. Basically, you can just make an empty schema and use it as the default schema for all your users.
Go to YourDatabase -> Security -> Users and open the properties (by right-clicking) to change the default schema for your users.
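A hedged sketch of that suggestion, with hypothetical schema and user names:
-- create an empty schema and make it the user's default,
-- so their unqualified names no longer resolve to the schema that holds the tables
CREATE SCHEMA blank_schema;
ALTER USER some_user WITH DEFAULT_SCHEMA = blank_schema;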
