Is there a 'USE DATASET' statement in Google Cloud Platform BigQuery, like we have in Hive?
USE database_name;
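For context, here is the Hive behavior being asked about next to the closest BigQuery equivalent. This is only a sketch with hypothetical project/dataset/table names; BigQuery has no USE statement, so you qualify the dataset in each reference (or set a default dataset on the job/connection):

-- Hive: switch the session's default database
USE sales_db;
SELECT * FROM orders;

-- BigQuery: no USE; qualify the dataset (and optionally the project) instead
SELECT * FROM `my_project.sales_db.orders`;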
I wanted to import a table from Power Apps Dataverse to use it in Azure ML, so I created an Azure Synapse Link for the table in the Azure Portal. Then, from the storage account that holds my table, I tried creating a datastore and selecting the path. But whenever I click on select path, it appears blank and shows this error:
Error when accessing datastore: Unable to access data because it does not exist, is behind a virtual network, or you do not have Storage Blob Data Reader role on this storage account.
I also tried using the Import Data function to link the Synapse Link directly, but that didn't work either.
Can anyone help me out with this?
I have a web app that allows users to edit configuration tables in my database.
For audit purposes, I would like to capture the user_name of the authenticated user rather than the account that the web app uses to execute the queries.
Previously my database was on MSSQL. We could pass the user_name into context_info, and the trigger would then pick up the user's user_name.
Is there a way to achieve similar results on Snowflake? I have explored Streams + QUERY_HISTORY, but that only tells me that a query was fired through the web app account.
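For reference, the MSSQL pattern described above looks roughly like this (the table and column names are hypothetical):

-- Web app sets the end user's name on its connection
DECLARE @who VARBINARY(128) = CAST('jane.doe' AS VARBINARY(128));
SET CONTEXT_INFO @who;

-- An audit trigger can then read it back, e.g.
INSERT INTO audit_log (table_name, changed_by, changed_at)
VALUES ('config_table', CAST(CONTEXT_INFO() AS VARCHAR(128)), GETDATE());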
Snowflake captures all queries executed, and users can view those queries via either the SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY view or the information_schema table functions (e.g. information_schema.query_history_by_user).
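For example, to see the recent queries run by a particular user (the user name here is a placeholder):

SELECT query_text, start_time
FROM TABLE(information_schema.query_history_by_user(
    USER_NAME => 'WEBAPP_SVC',
    RESULT_LIMIT => 100))
ORDER BY start_time DESC;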
If your web app is using a single Snowflake user (sometimes called a service account, batch ID, or similar),
and you need to find out what queries "application end users" are causing that single service account to execute,
I'd suggest leveraging Snowflake's "query tags" for this.
To leverage query tags, simply execute a command like the following before each query (or set of queries) that your service account runs on behalf of an end user:
ALTER SESSION
SET QUERY_TAG = '{"application":"Awesome Web App", "endUser":"Rich Murnane"}';
Note: You don't need to use JSON, but I think it's cool :-)
Once you start tagging the queries performed on behalf of end users,
you can find out what those end users are doing by filtering on the QUERY_TAG field in the SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY view, or in the information_schema query_history/query_history_by_* table functions.
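For example, to pull back everything tagged for a given end user (matching the tag set in the ALTER SESSION example above):

SELECT query_tag, query_text, start_time
FROM snowflake.account_usage.query_history
WHERE query_tag ILIKE '%"endUser":"Rich Murnane"%'
ORDER BY start_time DESC;

Note that the ACCOUNT_USAGE view has some ingestion latency, so very recent queries may take a while to appear there.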
Links:
https://docs.snowflake.com/en/sql-reference/account-usage/query_history.html
https://docs.snowflake.com/en/sql-reference/functions/query_history.html
https://docs.snowflake.com/en/sql-reference/sql/alter-session.html
How can I give additional roles to the default Google App Engine (GAE) service account?
Specifically, I want to give the "Cloud SQL Client" role to the default App Engine service account. When I try to modify the role, I get this message:
As answered in the comment section by @John Hanley: to add roles to a service account, go to IAM & Admin -> IAM, find your service account in the table, and edit it to grant the necessary roles.
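If you prefer the command line, the same grant can be applied with gcloud (PROJECT_ID is a placeholder; the default GAE service account is named PROJECT_ID@appspot.gserviceaccount.com):

gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:PROJECT_ID@appspot.gserviceaccount.com" \
    --role="roles/cloudsql.client"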
In addition, please have a look at the documentation Understanding service accounts, section Granting access to service accounts:
Granting access to a service account to access a resource is similar to granting access to any other identity. For example, if you have an application running on Compute Engine and you want the application to only have access to create objects in Cloud Storage, you can create a service account for the application and grant it the Storage Object Creator role.
Learn about Granting roles to all types of members, including service accounts.
I know that Azure Search works against Azure Blob storage for document search, but does it work similarly against the Azure File Storage service? If yes, how is it different from working against Azure Blob? Are there any limitations?
UPDATE May 21, 2018
The Azure Files datasource is now supported in private preview. Please follow the onboarding instructions here.
Azure Search now has a preview for Azure Files. In order to sign up for the Azure Files indexer private preview, you need to fill out this form.
Once the team has added you to the preview, you will receive an email with instructions on how to consume the capability.
I have a few small Google App Engine apps that access the same Cloud SQL instance (to save resources), and I want to restrict the access of these individual databases per app, e.g.:
1. CoolApp5 ---> global_db_instance -> coolapp5_db
2. EatFood ----> global_db_instance -> eatfood_db
3. WebsiteCo --> global_db_instance -> websiteco_db
Configuring Google Cloud SQL Instance Access
The current system seems to allow apps to access the global instance's databases, but this creates a security vulnerability. If an intruder gains access to the database through a single app (coolapp5, eatfood, or websiteco), they will have access to all the databases.
E.g. the WordPress site for CoolApp5:
<?php // wp-config.php
define('DB_HOST', ':/cloudsql/global_db_instance:db'); // Cloud SQL socket path
define('DB_USER', 'root');        // root user: can reach every database
define('DB_PASSWORD', '');        // no password
define('DB_NAME', 'coolapp5_db');
?>
As you can see, the app coolapp5 has access to coolapp5_db but could also gain access to the other databases (eatfood_db, websiteco_db), as the app connects as root.
Google seems to allow this through an external IP address.
An obvious solution might be to have an individual instance for each app, but this is inefficient for small apps, as a single Cloud SQL instance can run multiple websites at once.
Is there another solution, or will my strategy simply not work for Google Cloud SQL?
Set up MySQL users per database/application (docs) within your single Cloud SQL instance and grant each new user access to only the database it needs.
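A minimal sketch of the idea, using the database names from the question (the user names and passwords are hypothetical):

-- One MySQL user per app, each restricted to its own database
CREATE USER 'coolapp5'@'%' IDENTIFIED BY 'coolapp5-strong-password';
GRANT ALL PRIVILEGES ON coolapp5_db.* TO 'coolapp5'@'%';

CREATE USER 'eatfood'@'%' IDENTIFIED BY 'eatfood-strong-password';
GRANT ALL PRIVILEGES ON eatfood_db.* TO 'eatfood'@'%';

CREATE USER 'websiteco'@'%' IDENTIFIED BY 'websiteco-strong-password';
GRANT ALL PRIVILEGES ON websiteco_db.* TO 'websiteco'@'%';

FLUSH PRIVILEGES;

Each app's wp-config.php would then use its own user (e.g. DB_USER 'coolapp5') instead of root, so compromising one app no longer exposes the other databases.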