I am trying to share my SNOWFLAKE database (the default metadata database) --> ACCOUNT_USAGE schema --> QUERY_HISTORY table with another managed account (i.e., a reader account), but the data is not visible in the other account.
Is there any way to share the SNOWFLAKE database without duplicating the data?
I am getting the error "Already Imported Database (i.e. SNOWFLAKE) cannot be shared" when using the data share option.
In the managed account, the SNOWFLAKE database and its schemas are available, but I am not able to see the data in them.
According to the documentation, you can't re-share any database that is shared with you:
Shared databases and all the objects in the database cannot be forwarded (i.e. re-shared with other accounts).
Since the SNOWFLAKE database is one that is shared with you by Snowflake, this is probably why you're having issues.
If you need to do this, your best bet is to create a table, populate it with the data you need from the SNOWFLAKE database, and share that table instead. Although it is strange that you'd want to share this info with another account.
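For example, a minimal sketch of that approach, with all names (my_db, usage_share, reader_account) hypothetical:
-- Copy the account usage data into a local, shareable table.
CREATE TABLE my_db.public.query_history_copy AS
SELECT * FROM snowflake.account_usage.query_history;
-- Share the local table with the reader account.
CREATE SHARE usage_share;
GRANT USAGE ON DATABASE my_db TO SHARE usage_share;
GRANT USAGE ON SCHEMA my_db.public TO SHARE usage_share;
GRANT SELECT ON TABLE my_db.public.query_history_copy TO SHARE usage_share;
ALTER SHARE usage_share ADD ACCOUNTS = reader_account;
You would need to refresh the copy (e.g. with a scheduled task) to keep it current.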
Your other option would be to create a database/schema in your account with views over the account usage data that you want to share, create a role that can access only that, and then provide a user login with that role only to the group needing to do analytics on your data.
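A rough sketch of that setup, with all names hypothetical (note that the role owning the view needs IMPORTED PRIVILEGES on the SNOWFLAKE database to select from ACCOUNT_USAGE):
-- Database and view over the account usage data.
CREATE DATABASE usage_reports;
CREATE VIEW usage_reports.public.query_history_v AS
SELECT * FROM snowflake.account_usage.query_history;
-- Role that can access only that view.
CREATE ROLE analytics_reader;
GRANT USAGE ON DATABASE usage_reports TO ROLE analytics_reader;
GRANT USAGE ON SCHEMA usage_reports.public TO ROLE analytics_reader;
GRANT SELECT ON VIEW usage_reports.public.query_history_v TO ROLE analytics_reader;
GRANT ROLE analytics_reader TO USER analytics_user;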
I wanted to confirm that it is not possible to create a schema from a share.
I see from the documentation that once a share (of a schema, not the entire database) is created/enabled by the provider account, I (the consumer account) can only leverage the share by creating a new database.
I (the consumer) already have a Snowflake database and would like to just point the schema from the provider that is enabled as a share to a new schema on my side in the existing database. This would avoid the need to maintain multiple databases.
Br,
Noor.
It is not possible.
On the consumer side, the following command is used to consume a share:
CREATE DATABASE <name> FROM SHARE <provider_account>.<share_name>;
Hence, consumption can only be done at the database level. This effectively creates a read-only database (unlike regular/local databases); all of its objects are controlled by the provider.
Additionally, IMPORTED PRIVILEGES is used to grant roles on the consumer side access to the shared database. This privilege cannot be granted on a schema.
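For example (shared_db and analyst are hypothetical names):
GRANT IMPORTED PRIVILEGES ON DATABASE shared_db TO ROLE analyst;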
Is it possible to share a complete database on Snowflake with another Snowflake account,
so that all objects within the shared database are immediately visible in the Snowflake subaccount?
I am unable to create objects (views, file formats, stages, etc.) in a shared sample database (SNOWFLAKE_SAMPLE_DATA).
Kindly let me know what the possible way is to access the data.
Regards,
DB
The SNOWFLAKE_SAMPLE_DATA database contains a schema for each data set, with the sample data stored in the tables in each schema. You can execute queries on the tables in these schemas just as you would on any other database in your account.
The database and schemas do not utilize any data storage so they do not incur storage charges for your account.
However, just as with other databases, executing queries requires a running, current warehouse for your session, which consumes credits.
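For example, a query against one of the sample tables (my_wh is a hypothetical warehouse name):
USE WAREHOUSE my_wh;
SELECT c_name, c_acctbal
FROM snowflake_sample_data.tpch_sf1.customer
LIMIT 10;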
You can refer to the Snowflake documentation: DOCS » USING SNOWFLAKE » SAMPLE DATASETS.
Hope this helps answer your question.
Shared databases are read-only. Users in a consumer account can view/query data, but cannot insert or update data, or create any objects in the database. This is why you cannot create any objects in the shared database (SNOWFLAKE_SAMPLE_DATA).
https://docs.snowflake.com/en/user-guide/data-share-consumers.html#general-limitations-for-shared-databases
You can query the data in shared database like any other database.
https://docs.snowflake.com/en/user-guide/data-share-consumers.html#querying-a-shared-database
I tried searching for Snowflake tags on Meta Stack Exchange and Super User and couldn't find them, hence asking the question here.
I have two Snowflake accounts and I need to copy data from the production account to the testing account.
How can I do that? I read the snowflake documentation for loading and unloading the data using s3, but is there a quick way to get the data across?
As Howard said, you could use data sharing. Create a share with the data you want to copy, grant the testing account access to the share, and create a new database in the testing account from the share. Now you can query that data as if it were in the test account. If you need an actual copy of the data, so you can change it, etc., then you need to do a CTAS (CREATE TABLE ... AS SELECT) from the share into an empty table in another database. This will be much faster than unloading and loading all the data.
Here is the doc to get started: https://docs.snowflake.net/manuals/user-guide/data-sharing-intro.html
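A minimal sketch of both sides, with all account, database, and table names hypothetical:
-- In the production (provider) account:
CREATE SHARE prod_share;
GRANT USAGE ON DATABASE prod_db TO SHARE prod_share;
GRANT USAGE ON SCHEMA prod_db.public TO SHARE prod_share;
GRANT SELECT ON ALL TABLES IN SCHEMA prod_db.public TO SHARE prod_share;
ALTER SHARE prod_share ADD ACCOUNTS = testing_account;
-- In the testing (consumer) account:
CREATE DATABASE prod_copy FROM SHARE prod_account.prod_share;
-- Optional: make a writable copy of a table via CTAS.
CREATE TABLE test_db.public.orders AS
SELECT * FROM prod_copy.public.orders;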
I have MS Access as a front end and PostgreSQL as the back end for my database. So I set up the database in PostgreSQL and linked the tables to MS Access using the ODBC drivers. Everything works great: I can update the tables in MS Access and the records appear in the Postgres database.
Since I can still see the linked tables in MS Access, I feel like it is possible for some users to go in and manually modify the tables without filling out the proper forms. Is it possible to HIDE the tables or lock the tables so that Access users cannot modify the raw data at all? If not, what can I do to secure the integrity of the database?
Thanks!
I would recommend looking at Postgres privileges as a way to lock the tables down.
In short, you could have your backend run as one user that has full access permissions on the tables in question, and when the users log in to the app, they would be connected to Postgres as a user whose privileges are considerably more locked down (say, read-only if you just want to be able to do SELECTs to surface data).
For example, you could run the following SQL against your Postgres server:
REVOKE ALL ON accounts FROM joe;
GRANT SELECT ON accounts TO joe;
This would first remove all privileges from the user joe for the table accounts, and then allow only SELECT privileges for that table.
You could do something similar for all the tables you wish to lock down. You'll also need to do the same for the sequences used by those tables.
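For example, assuming the accounts table uses a serial column backed by a sequence named accounts_id_seq (a hypothetical name):
REVOKE ALL ON SEQUENCE accounts_id_seq FROM joe;
GRANT SELECT ON SEQUENCE accounts_id_seq TO joe;
This leaves joe able to read the sequence's current value but not to advance it with nextval.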
You may wish to create a special readonly user which has only read access across the board, and use those credentials to surface the Postgres data for the users to access.
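A rough sketch of such a readonly user (mydb and the password are hypothetical):
CREATE ROLE readonly LOGIN PASSWORD 'change_me';
GRANT CONNECT ON DATABASE mydb TO readonly;
GRANT USAGE ON SCHEMA public TO readonly;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO readonly;
Note that GRANT SELECT ON ALL TABLES only covers existing tables; you'd repeat it (or use ALTER DEFAULT PRIVILEGES) for tables created later.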
When you need to alter data, your backend could specifically use a power user of sorts which has much greater access.
Here's a link which details creating a readonly Postgres user (for the purposes of backups in this case, but the general concept and the SQL commands should apply; just ignore the stuff about pg_dump).
If you aren't concerned about users modifying the data in those tables via the app (which only permits authorized changes), but are only concerned about them using, say, psql to go in and update them directly, then you probably don't need a readonly user; you can simply lock the tables down and have the backend use a user with sufficient access.