How can I grant privileges for file formats?

I'm trying to grant privileges for file formats in an automated way; I just need the basic structure for how to do so. Based on the research I've done so far, I think it should look like this:
GRANT ALL PRIVILEGES ON myformat IN mydb.myschema TO myrole;
But I'm getting an error. Any help would be appreciated.
Error: "SQL compilation error: syntax error line 1 at position 34 unexpected 'in'."
thanks

You need to use the following syntax:
GRANT ALL PRIVILEGES ON FILE FORMAT mydb.myschema.myformat TO myrole;

You need to use the three-part qualifier (database.schema.name) instead of IN. You also need to specify that it's a FILE FORMAT, otherwise it will complain that it can't find a table with that name.
GRANT ALL PRIVILEGES ON FILE FORMAT mydb.myschema.myformat TO myrole;
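Since the goal is automation, it may also help to grant at the schema level so new file formats are covered without re-running the script. A sketch, assuming mydb.myschema and myrole are placeholders for your own names:

```sql
-- Cover file formats that already exist in the schema:
GRANT USAGE ON ALL FILE FORMATS IN SCHEMA mydb.myschema TO ROLE myrole;
-- Cover file formats created later, so the automation doesn't need re-running:
GRANT USAGE ON FUTURE FILE FORMATS IN SCHEMA mydb.myschema TO ROLE myrole;
```

USAGE is the only grantable privilege on file formats besides OWNERSHIP, which is why it appears here rather than ALL PRIVILEGES.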

Related

Snowflake: Can't use the stage (S3) - SQL compilation error: Stage does not exist or not authorized

I create an external stage in Snowflake via (I've tried with a public bucket too)
CREATE OR REPLACE stage "DATABASE"."SCHEMA"."STAGE_NAME"
url='s3://bucket'
CREDENTIALS=(AWS_KEY_ID='xxxxxxxxxxxx' AWS_SECRET_KEY='xxxxxxxxxxxx');
I could view the parameters of this stage via
SHOW STAGES
DESC STAGE "DATABASE"."SCHEMA"."STAGE_NAME"
However, I'm getting the error whenever I try to interact with this stage (e.g., LIST @STAGE_NAME or loading a file).
SQL compilation error: Stage 'DATABASE.SCHEMA.STAGE_NAME' does not exist or not authorized.
I've tried different snowflake roles but can't make it work. Could anyone point me where to look? Perhaps I have to assign any permissions to the stage?
Stage privileges are documented here: https://docs.snowflake.com/en/user-guide/security-access-control-privileges.html#stage-privileges
For COPY, LIST and others you need the privileges mentioned there (USAGE, READ and maybe WRITE).
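As a sketch of those grants (the stage and role names are placeholders; READ/WRITE apply to internal stages, while external stages are governed by USAGE):

```sql
-- Required to reference an external stage at all:
GRANT USAGE ON STAGE mydb.myschema.my_stage TO ROLE myrole;
-- For internal stages: listing/downloading needs READ, uploading needs WRITE:
GRANT READ, WRITE ON STAGE mydb.myschema.my_stage TO ROLE myrole;
```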
It's pretty weird, but I can list a stage if the name consists of capital letters. No additional permissions are needed.
Works fine:
CREATE OR REPLACE stage "DATABASE"."SCHEMA"."STAGE_NAME"
url='s3://bucket'
CREDENTIALS=(AWS_KEY_ID='xxxxxxxxxxxx' AWS_SECRET_KEY='xxxxxxxxxxxx');
LIST @STAGE_NAME
Returns "Stage does not exist or not authorized":
CREATE OR REPLACE stage "DATABASE"."SCHEMA"."Stage_Name"
url='s3://bucket'
CREDENTIALS=(AWS_KEY_ID='xxxxxxxxxxxx' AWS_SECRET_KEY='xxxxxxxxxxxx');
LIST @Stage_Name
At the same time, I see all Stages while running the "SHOW STAGES" command.
Are there any constraints on the naming? I haven't found any so far.
If the stage DDL has the name enclosed in double quotes (CREATE OR REPLACE STAGE "DATABASE"."SCHEMA"."STAGE_NAME"), the name becomes case-sensitive, which is why you cannot see it. Do not enclose the stage name in quotes and you should be able to reference it regardless of case.
https://docs.snowflake.com/en/sql-reference/sql/create-stage.html#required-parameters
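A minimal sketch of that quoting behavior (the names and bucket URL are placeholders):

```sql
-- Unquoted names fold to upper case, so either casing resolves to MY_STAGE:
CREATE OR REPLACE STAGE mydb.myschema.my_stage URL = 's3://bucket';
LIST @my_stage;
LIST @MY_STAGE;

-- A double-quoted mixed-case name is case-sensitive and must be quoted
-- exactly the same way on every later reference:
CREATE OR REPLACE STAGE mydb.myschema."Stage_Name" URL = 's3://bucket';
LIST @"Stage_Name";
```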

Kafka Connector for Snowflake keeps failing

When I start the Kafka Connector I keep getting this error:
Caused by: org.apache.kafka.connect.runtime.rest.errors.BadRequestException: Connector configuration is invalid and contains the following 3 error(s):
snowflake.url.name: Cannot connect to Snowflake
snowflake.user.name: Cannot connect to Snowflake
snowflake.private.key: Cannot connect to Snowflake
I have tried setting url in following two ways:
snowflake.url.name=<ID assigned to me>.snowflakecomputing.com:443
snowflake.url.name=<My user id>.us-west-2.snowflakecomputing.com:443
I've set snowflake.user.name as:
snowflake.user.name=<My login id>
Not sure exactly how to set 'snowflake.private.key'. I copied contents of:
~/.ssh/id_rsa
After removing all new line characters so the value looks something like this:
snowflake.private.key=MIIEowIBAAKCAQEApM9bYyleCC+......... <long string>
I also tried to run the following command in a Snowflake worksheet under the SECURITYADMIN role, but it keeps failing:
alter user <my user id> set rsa_public_key='MIIEowIBAAKCAQEApM9bYyleCC...';
Error message:
SQL access control error: Insufficient privileges to operate on user
What am I doing wrong?
In my case, it worked when I used the 'ACCOUNTADMIN' role.
snowflake.url.name should match the account name that you use to get to Snowflake via the UI, not your login name. This account name might have a region in the url, as well, which should be included. For example, xyzcompany.us-east-1.snowflakecomputing.com:443.
snowflake.url.name=<account_name>.snowflakecomputing.com:443
I would make sure that you are setting your role in the correct place in the UI worksheet. Easiest way to check is to run a SELECT CURRENT_ROLE(); command. You can also just run USE ROLE SECURITYADMIN; in the worksheet to make sure you are set correctly. That role should have permissions to alter user parameters.
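On the snowflake.private.key point: the connector expects an RSA private key in PKCS#8 format, and the OpenSSH-formatted key in ~/.ssh/id_rsa usually isn't one. A sketch of generating a suitable key pair with openssl (the file names are arbitrary):

```shell
# Generate an unencrypted RSA key and convert it to PKCS#8, which is the
# format Snowflake key-pair authentication expects:
openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -out rsa_key.p8 -nocrypt
# Derive the matching public key; its base64 body (without the BEGIN/END
# header and footer lines) is what goes into ALTER USER ... SET RSA_PUBLIC_KEY:
openssl rsa -in rsa_key.p8 -pubout -out rsa_key.pub
```

The value for snowflake.private.key is then the base64 body of rsa_key.p8 with the header/footer lines stripped, similar to what you did with id_rsa.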

use database with mixed case is not working via ODBC

I have a database with mixed case, i.e testDATABASE.
I run (using ODBC) the query use database "testDATABASE";, then I run the query use schema "PUBLIC",
and the query fails with the error:
ERROR: SQL compilation error:
Object does not exist, or operation cannot be performed.
Error Code: 2043
Query = use schema "PUBLIC"
When I run it not via ODBC but in the notebook, it works fine.
The same queries against a database that does not contain mixed case work fine.
If I run use schema "testDATABASE"."PUBLIC", it runs OK via both ODBC and the notebook.
Is there a known issue about this? How can I run it as two queries via ODBC and make it work?
Thanks.
In your question it looks like your use database command had doubled double quotes but your schema didn't; perhaps that is the issue.
Overall suggestions:
When you make object names MiXeD-CaSe, it simply makes using the objects more difficult, so I'd recommend avoiding mixed case if you can. You may not be able to avoid it, and that's OK; it's just a suggestion.
If you can't avoid it, the only time I'd use the double quotes is when the object name
(in this case, the database name) has mixed case.
In your case, you should be able to run (you may have to double-double quote it in ODBC):
use database "testDATABASE";
and then this - note that no double quotes are needed because it's not mixed case:
use schema PUBLIC;
This document illustrates how you don't need to prefix the schema with the database:
https://docs.snowflake.com/en/sql-reference/sql/use-schema.html
Something else I recommend to folks getting started: for each user, I like to set all the default context items (role, warehouse, namespace):
ALTER USER rich SET DEFAULT_ROLE = 'RICH_ROLE';
ALTER USER rich SET DEFAULT_WAREHOUSE = 'RICH_WH' ;
ALTER USER rich SET DEFAULT_NAMESPACE = 'RICH_DB.TEST_SCHEMA';

dbms_metadata.get_ddl() fails on a schema different from the current user in Oracle

I need to read the structure of some database objects from an Oracle database, so I want to do the following:
select dbms_metadata.get_ddl('TABLE', 'MY_TABLE', 'OTHERUSER') from dual
That works well if I'm logged in as the OTHERUSER user. But for PROD I only have a user which has synonyms and accesses these objects through those synonyms. I can see the table information in ALL_TABLES, where the owner is shown as OTHERUSER, but when I run the code above as a different user, I get the following error:
ORA-31603: object "MY_TABLE" of type TABLE not found in schema "OTHERUSER"
ORA-06512: at "SYS.DBMS_METADATA", line 6069
ORA-06512: at "SYS.DBMS_METADATA", line 8666
ORA-06512: at line 1
The object is exactly in the schema OTHERUSER. What can I do differently to get this to run? I can select from ALL_TABLES, ALL_INDEXES and so on, so I can read the information anyway; so it might not be a rights problem, or am I wrong?
I know that as a workaround I can use a procedure which runs in the context of OTHERUSER, but this is more than ugly.
One way of doing it is described here:
http://dbmsdirect.blogspot.com/2007/10/using-dbmsmetadata-in-procedure-to.html
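Another option, if a DBA is willing, is SELECT_CATALOG_ROLE: with that role, DBMS_METADATA can read DDL for objects in other schemas. A sketch, with myuser as a placeholder for your PROD account:

```sql
-- Run as a DBA: lets myuser read dictionary metadata for other schemas,
-- which DBMS_METADATA honors when a schema argument is supplied.
GRANT SELECT_CATALOG_ROLE TO myuser;

-- Then, connected as myuser, the original call should succeed:
SELECT dbms_metadata.get_ddl('TABLE', 'MY_TABLE', 'OTHERUSER') FROM dual;
```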

Re-importing specific tables from Oracle dump file

I imported a full Oracle dump file into my database schema using the following command over Linux SSH:
impdp system/password directory=bckup schemas=sch101 dumpfile=sc101.dmp remap_schema=sch101:MY_SCHEMA TABLE_EXISTS_ACTION=APPEND;
This command imported most of the tables into my target schema, but some of the tables were skipped due to constraint errors.
I wanted to import these tables into my database after fixing the problems one by one. I used the following command to do so:
impdp system/password DIRECTORY=bckup TABLES=TBL_NAME DUMPFILE=sch101.dmp remap_schema=sch101:MY_SCHEMA TABLE_EXISTS_ACTION=APPEND;
But this command returns me error:
ORA-39002: invalid operation
ORA-39166: Object SYSTEM.TBL_NAME was not found.
I checked the names of the tables I tried to import against the export log file of the dump file, and they exist in the dump file.
What silly mistake am I making here?
Because you're importing as system from what is presumably a full (not schema) export, you need to specify the schema name in the tables parameter, despite the presence of the schema parameter:
... TABLES=sch101.TBL_NAME ...
The error message you're getting refers to SYSTEM.TBL_NAME, which clearly (or hopefully, anyway) isn't what you want.
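Put together, the retry would look something like this (parameter values taken from the question; an untested sketch, since it needs a live Oracle instance and the original dump file):

```shell
impdp system/password DIRECTORY=bckup DUMPFILE=sch101.dmp \
  TABLES=sch101.TBL_NAME \
  REMAP_SCHEMA=sch101:MY_SCHEMA TABLE_EXISTS_ACTION=APPEND
```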
