No existing service instances found in IBM Watson Studio

When I try to add cloud storage in IBM Watson Studio, it shows:
No existing service instances found
So I add the free Lite plan of cloud storage, but it gets created with the default location "Global". How do I change it to some other country?

IBM Cloud Object Storage is a global service, so the default location of "Global" is to be expected. More information on provisioning COS can be found at https://cloud.ibm.com/docs/services/cloud-object-storage/basics?topic=cloud-object-storage-provision#provision-instance
Buckets, however, can be created in a single region if necessary. More details on buckets can be found at: https://cloud.ibm.com/docs/services/cloud-object-storage?topic=cloud-object-storage-getting-started#gs-create-buckets
When you create your Watson Studio project, you will see that the bucket is created in the region where your Watson Studio instance is located.
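To illustrate the split between the global service instance and regional buckets, here is a minimal sketch, assuming the ibm-cos-sdk (ibm_boto3) package and placeholder region, bucket, and credential values — none of these names come from the question itself:

```python
def cos_endpoint(region):
    """Build the public endpoint URL for an IBM COS region.

    The service instance itself is global; the region only comes
    into play at the endpoint/bucket level.
    """
    return f"https://s3.{region}.cloud-object-storage.appdomain.cloud"


# With the ibm-cos-sdk installed, a regional bucket could then be
# created roughly like this (untested sketch, placeholder credentials):
#
# import ibm_boto3
# from ibm_botocore.client import Config
#
# cos = ibm_boto3.client(
#     "s3",
#     ibm_api_key_id="<api-key>",
#     ibm_service_instance_id="<instance-crn>",
#     config=Config(signature_version="oauth"),
#     endpoint_url=cos_endpoint("eu-de"),
# )
# cos.create_bucket(Bucket="my-regional-bucket")

print(cos_endpoint("eu-de"))
```

The key point is that the region is chosen per bucket via the endpoint you connect to, not on the service instance.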

Related

Cannot see the STORAGE_GCP_SERVICE_ACCOUNT in my google cloud account when setting up storage integration

I am following the steps here: https://docs.snowflake.com/en/user-guide/data-load-snowpipe-auto-gcs.html and am having trouble with step 6.
I ran:
create storage integration my_integration
  type = external_stage
  storage_provider = gcs
  enabled = true
  storage_allowed_locations = ('gcs://<my-bucket>');
which completed successfully. Then DESC STORAGE INTEGRATION MY_INTEGRATION; successfully describes the storage integration and lists a STORAGE_GCP_SERVICE_ACCOUNT.
However, I cannot find this service account in the Google Cloud Platform console of the project that owns that bucket.
My Snowflake account is on AWS, though according to that tutorial page I am allowed to use AWS or GCP for this integration.
Is there somewhere I should indicate in the create integration command which google cloud project I am referring to so that it knows where to create the service account?
Any advice on performing this step?
This issue was because my bucket was not publicly accessible. Once I made it publicly accessible, I was able to find that service account when adding roles.
It appears that the service account is not a service account "on" the Google Cloud Platform account that hosts the bucket, but rather one set up by Snowflake on their own managed services.
So it's like granting permissions to an external service account rather than an internal one.
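That external-vs-internal distinction can be checked mechanically: a service account created inside your own project carries your project ID in its email domain, while the Snowflake-managed one does not. A small illustrative helper — the email addresses below are made-up examples, not values from the question:

```python
def is_external_service_account(email, my_project_id):
    """Return True if the service account is NOT hosted in your own GCP
    project. Project-level service accounts use the domain
    <project-id>.iam.gserviceaccount.com.
    """
    domain = email.split("@", 1)[1]
    return domain != f"{my_project_id}.iam.gserviceaccount.com"


# A Snowflake-managed account lives in a Snowflake-owned project,
# so it reports as external to yours:
print(is_external_service_account(
    "loader@sf-managed-project.iam.gserviceaccount.com", "my-project"))
```

This is why the account only shows up when you grant it a role on the bucket, rather than in your project's own service-account list.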

Permission denied on external access to Google Cloud Datastore

I want to access Datastore (and Storage) data of an App Engine project via google-cloud-datastore and google-cloud-storage with a Python program on my own server.
This works with my AppEngine staging server, by creating a service account and giving it owner access (to the project).
Doing the same thing with the production AppEngine instance fails with
google.api_core.exceptions.PermissionDenied: 403 Missing or insufficient permissions.
Part of the problem might be that I'm using the wrong project to create the service account. There is more than one project with the same name in my cloud console. How do I identify the correct one?
How do I get more details about the problem?
First, note that Datastore and Cloud Storage are two different products with two different access methods.
The Datastore is closely tied to the GAE project - each project has its own datastore. The external access procedure in general is captured in How do I use Google datastore for my web app which is NOT hosted in google app engine?.
When switching the project (staging to production in your case) there are two things to keep in mind:
- as you observed, you need to change the project you're accessing;
- you also need to change the credentials you load and use for access to match the selected project, as each project has its own service account key configured in the above-mentioned procedure.
For the google-cloud-datastore library both of these are configured simultaneously via the datastore.Client() call parameters (emphasis mine):

class google.cloud.datastore.client.Client(project=None, namespace=None, credentials=None, _http=None, _use_grpc=None)

project (str) – (Optional) The project to pass to proxied API methods.
credentials (Credentials) – (Optional) The OAuth2 Credentials to use for this client. If not passed (and if no _http object is passed), falls back to the default inferred from the environment.
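A minimal sketch of switching both parameters together. The project IDs and key-file paths below are placeholder assumptions, and the datastore.Client call itself (shown in comments) requires the google-cloud-datastore package plus valid keys:

```python
# Placeholder mapping of deployment targets to (project id, key file);
# adjust to your actual projects and key locations.
ENVIRONMENTS = {
    "staging": ("my-staging-project", "keys/staging.json"),
    "production": ("my-prod-project", "keys/production.json"),
}


def client_config(env):
    """Return the (project, key_path) pair for the target environment,
    so the project and the credentials always change together."""
    if env not in ENVIRONMENTS:
        raise ValueError(f"unknown environment: {env!r}")
    return ENVIRONMENTS[env]


# With google-cloud-datastore installed, the client would then be built as:
#
# from google.cloud import datastore
# from google.oauth2 import service_account
#
# project, key_path = client_config("production")
# creds = service_account.Credentials.from_service_account_file(key_path)
# client = datastore.Client(project=project, credentials=creds)

print(client_config("production"))
```

Keeping the pair in one place prevents the staging-credentials-against-production-project mismatch that produces the 403.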
Cloud Storage is completely independent from GAE; the GAE project/credentials you use (if any) have no bearing on bucket/object access restrictions whatsoever. There's nothing you need to do from the google-cloud-storage library perspective when switching from one GAE project to another.
To eliminate the confusion created by multiple projects having the same name just go to the IAM & admin Settings page, select the respective projects from the drop-down list on the top blue bar and rename them using meaningful names (click in the Project name box to edit the name, then click SAVE). Then re-check if you're using the right keys for the desired project.

Creating a local environment from an existing GAE installation

I have a website that is currently running under GAE... Unfortunately, neither I nor anyone on the team has access to the local environment that it was created from... Is it possible to create a local environment, or at least get a copy of the application files and database, from an existing GAE installation?
What you need is the application source code, not the "local environment".
Ideally this source code would be in a version control system (e.g. Git or SVN); Google Cloud Platform provides free Git repositories for your projects, so you might try looking there first. There's also a tool, for both Java and Python, that lets you download the source of a deployed version, provided you are authenticated as either the dev who uploaded it or a project owner. EDIT: as stated by Dan Cornilescu, this feature can be disabled.
As for the database info, there are plenty of tools available to "export" your GAE datastore data; just consider that for your project it might be easier to run the queries manually than to actually integrate these tools.
Thanks for the help... But unfortunately, this code is not in Git. Furthermore, being new to Google hosting, I wasn't clear on my setup: my web instance is actually running within Compute Engine, not App Engine.

Be that as it may, with some additional searching I was able to find out how to browse my filesystem by accessing the VM Instances menu option under the Compute Engine section of the Google Cloud Platform interface. The VM Instances page shows your instance and, beside it, a connect option with a drop-down box that lets you open a browser window onto the instance's file system. In addition to this, I found this link https://www.youtube.com/watch?v=9ssfE6ODpak that shows how to configure the FileZilla FTP client to access your server instance - very helpful. From there, I was able to download all of my site files from the var/www directory. Now, onto extracting my data... Thanks again!

Saving images in Azure storage

I am building a web application , where users can upload images & videos and store them in their account. I want to store these files somewhere and save only the URL in the DB.
What is the right way to do it using Azure services? Is there a dedicated server for this, or some VM?
Yes, there is a dedicated service for this purpose: Azure Blob Storage. You are strongly advised to save any and all user-uploaded content to that service instead of to the local file system.
The provided link has samples for almost every language that has a client SDK provided by Microsoft.
If, in the end, you use a platform or language that is not directly supported by an SDK, you can always refer to the Blob Storage REST API documentation.
You will need to go through the Blob service concepts to get a deeper understanding of the service and how to use it.
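Since the question asks about storing only the URL in the DB, note that blob URLs follow a predictable pattern. A small sketch — the account, container, and file names are placeholders, and the upload itself (shown in comments) would go through the azure-storage-blob SDK:

```python
def blob_url(account, container, blob_name):
    """Public URL of a blob -- this string is what you would persist in the DB."""
    return f"https://{account}.blob.core.windows.net/{container}/{blob_name}"


# With the azure-storage-blob package installed, the upload itself would
# look roughly like this (untested sketch, placeholder connection string):
#
# from azure.storage.blob import BlobServiceClient
#
# service = BlobServiceClient.from_connection_string("<connection-string>")
# blob = service.get_blob_client(container="uploads", blob="cat.jpg")
# with open("cat.jpg", "rb") as f:
#     blob.upload_blob(f)

print(blob_url("myaccount", "uploads", "cat.jpg"))
```

Whether the URL is directly readable depends on the container's public-access setting; for private containers you would serve a SAS-signed URL instead of the bare one.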

Scope of ScriptDB instance for Google Apps Spreadsheet Add-on

I was wondering if anyone knew the scope of a Google Apps ScriptDB database when used within an Add-On? The page mentions "Each script project gets a database", so would that mean that every instance of an Add-On shares the same ScriptDB instance? Or would each Add-On instance have its own private ScriptDB instance? I have some concerns about my Add-On accessing a database that contains data for users other than the current user. Thanks!
We found that ScriptDB was specific to a particular instance, i.e. a Doc or Sheet. Try copying a doc or sheet that has an add-on and a ScriptDB: the code for the add-on is persisted, but the ScriptDB is not.
