Hello all. I am using Google Cloud Platform and I store my coupon images in GCS buckets. Does Google provide any API to delete an existing image from a GCS bucket? I searched a lot in the Google docs and read many blogs, but everyone only shows how to delete the record from the database; no one explains how to delete the image from the bucket itself. If anyone has done this, please help; it would be really appreciated.
Thanks
Sure.
From the command line you can use the gsutil tool this way (you just need to install gsutil first).
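For example, assuming a bucket named your-bucket and an object named your-image.png (both placeholders), the delete looks like this:
gsutil rm gs://your-bucket/your-image.png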
Via the REST API you can use the objects.delete method. You can try this API in the API Explorer.
There are also client libraries for Python, Java and other languages.
From @MikeSchwartz's suggestion: using the Cloud Console, you can manage your objects manually.
Update 2: examples in Node.js
We can choose between three options: the request module, the Google Cloud Node.js client, or the Google API Node.js client. But first of all, you should authorize your server to make requests to Google Cloud Storage (GCS). To do that:
Open the Console Credentials page
If it's not already selected, select the project that you're creating credentials for.
Click Create credentials and choose Service account key.
In the dropdown select Compute Engine default service account, then click Create. A JSON file will be downloaded.
In the left panel click Overview and type cloud storage in the search box.
Click Google Cloud Storage and make sure that this API is enabled.
Rename the downloaded JSON to keyfile.json and put it in a path accessible to your Node.js code.
Google Cloud Node.js client: here is the official repository with a lot of samples.
var gcloud = require('gcloud');

var gcs = gcloud.storage({
  projectId: 'your-project',
  keyFilename: '/path/to/keyfile.json'
});

var bucket = gcs.bucket('your-bucket');
var file = bucket.file('your-file');

// Check err in the callback to confirm the object was actually deleted.
file.delete(function(err, apiResponse) {});
Using the request module. First install it:
npm install request
Then in your code:
var request = require('request');

request({
  url: 'https://www.googleapis.com/storage/v1/b/your-bucket/o/your-file',
  qs: {key: 'your-private-key'}, // you can find your private key in your keyfile.json
  method: 'DELETE'
}, function(error, response, body) {});
Using the Google API Node.js client: I don't know how to use this, but there are a lot of examples here.
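For completeness, here is a rough, untested sketch of how the delete might look with the googleapis package, assuming GOOGLE_APPLICATION_CREDENTIALS points at your keyfile.json and that your-bucket / your-file are placeholders:
var {google} = require('googleapis');

async function deleteObject() {
  // Application Default Credentials pick up keyfile.json via GOOGLE_APPLICATION_CREDENTIALS
  var auth = await google.auth.getClient({
    scopes: ['https://www.googleapis.com/auth/devstorage.read_write']
  });
  var storage = google.storage({version: 'v1', auth: auth});
  // Calls the same objects.delete REST method mentioned above
  await storage.objects.delete({bucket: 'your-bucket', object: 'your-file'});
}

deleteObject().catch(console.error);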
Assuming you have your image file's public URL, you can do it like this:
import {Storage} from "@google-cloud/storage";

const storage = new Storage({
  projectId: GCLOUD_PROJECT,
  keyFilename: 'keyfile.json'
});
const bucket = storage.bucket(GCLOUD_BUCKET);

// var image_file = "https://storage.googleapis.com/{bucketname}/parentfolder/childfolder/filename"
var image_file = "https://storage.googleapis.com/1533406597315/5be45c0b8c4ccd001b3567e9/1542186701528/depositphotos_173658708-stock-photo-hotel-room.jpg";

new Promise((resolve, reject) => {
  // Drop the "https://storage.googleapis.com/{bucketname}/" prefix to get the object path
  var imageurl = image_file.split("/");
  imageurl = imageurl.slice(4, imageurl.length + 1).join("/");
  // imageurl = parentfolder/childfolder/filename
  storage
    .bucket(GCLOUD_BUCKET)
    .file(imageurl)
    .delete()
    .then((image) => {
      resolve(image);
    })
    .catch((e) => {
      reject(e);
    });
});
Check Google's official documentation under code samples at this link: https://cloud.google.com/storage/docs/deleting-objects, or on GitHub: https://github.com/googleapis/nodejs-storage/blob/master/samples/files.js
Hi, I am working on a React project and I want to download huge files (more than 2.5 GB) from Azure Blob Storage to the React application. The scenario is: when the user clicks an export button, text files stored in Azure Blob Storage should be downloaded to the local system. I have been looking at a few approaches, but since I am new to Azure I am a bit confused.
Using Azure AD we can get access to Azure Blob Storage, but since my application is hosted on App Service, how can we connect these two together, or can we have direct access to the files through Azure App Service?
The approach I am currently looking at is here.
If all the resources are in Azure, then you should use a managed identity or a service principal (which also uses a managed identity under the hood) in your case.
In your case, you have two Azure resources:
Azure Blob Storage
App Service (which hosts the React.js application)
So here is a step-by-step explanation of how to connect and read a blob.
In App Service (which hosts the React.js application):
Go to your App Service.
Then click on Identity in the left panel.
Then turn on the system-assigned managed identity.
After clicking the Save button, an Object ID is generated.
In Azure Blob Storage:
Go to your Blob Storage account.
Click Access Control (IAM).
Click Role assignments (RBAC).
Click Add > Add role assignment.
Select a role as per your need, like Storage Blob Data Reader.
Click Next > select Managed identity > Select members.
Then select your subscription, then App Service.
Then the list of managed identities is shown; select the App Service one that needs to connect with the storage account.
Then click Select and then Next.
Then you get a review screen; match the object ID generated earlier in the App Service steps to the grid shown there.
Then click Next > Next > Review + assign.
Now, in the React.js application:
Add these two dependencies (@azure/identity and @azure/storage-blob) to package.json and run npm i to install them.
Now connect Blob Storage with DefaultAzureCredential from the @azure/identity package: when we give one Azure resource access to another Azure resource directly using a service principal or managed identity, we use DefaultAzureCredential and Azure validates it automatically.
Code
Import the packages:
import { DefaultAzureCredential } from "@azure/identity";
// we're using these objects from the storage SDK - there are others for different needs
import { BlobServiceClient, BlobItem } from "@azure/storage-blob";
Create the service client and container client:
const blobStorageClient = new BlobServiceClient(
  // this is the blob endpoint of your storage account, available from the portal
  // it follows this format: <accountname>.blob.core.windows.net for Azure global
  // the endpoints may be slightly different for national clouds like US Gov or Azure China
  "https://<your storage account name>.blob.core.windows.net/",
  new DefaultAzureCredential()
);

// this uses the container we created earlier
var containerClient = blobStorageClient.getContainerClient("your container name");
Get the list of blobs:
let i = 1;
let blobs = containerClient.listBlobsFlat();
for await (const blob of blobs) {
  console.log(`Blob ${i++}: ${blob.name}`);
}
Download a blob:
const blobClient = containerClient.getBlobClient(blobName);
// Get blob content from position 0 to the end
// In Node.js, get downloaded data by accessing downloadBlockBlobResponse.readableStreamBody
const downloadBlockBlobResponse = await blobClient.download();
const downloaded = (
await streamToBuffer(downloadBlockBlobResponse.readableStreamBody)
).toString();
console.log("Downloaded blob content:", downloaded);
// [Node.js only] A helper method used to read a Node.js readable stream into a Buffer
async function streamToBuffer(readableStream) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    readableStream.on("data", (data) => {
      chunks.push(data instanceof Buffer ? data : Buffer.from(data));
    });
    readableStream.on("end", () => {
      resolve(Buffer.concat(chunks));
    });
    readableStream.on("error", reject);
  });
}
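Note that readableStreamBody and the streamToBuffer helper above are Node.js-only; in the browser (your React app) the SDK exposes blobBody, a Promise<Blob>, instead. A rough sketch of triggering a local download from the browser might look like this (downloadToDisk is a hypothetical helper name, and keep in mind this buffers the whole blob in memory, which may be a concern for files of 2.5 GB or more):
// Browser-only sketch: fetch a blob and hand it to the browser as a file download
async function downloadToDisk(containerClient, blobName) {
  const blobClient = containerClient.getBlobClient(blobName);
  const response = await blobClient.download();
  const blob = await response.blobBody; // blobBody is only defined in the browser
  const url = URL.createObjectURL(blob);
  const a = document.createElement('a');
  a.href = url;
  a.download = blobName; // suggested file name for the save dialog
  a.click();
  URL.revokeObjectURL(url);
}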
For more details, go through the links below:
Azure Storage Blob client library for JavaScript - version 12.12.0
Quickstart: Manage blobs with JavaScript SDK in Node.js
I have an Amplify application built using React JS. I have a scenario in which I am manually storing API keys in the SSM Parameter Store in my AWS account. However, I want to retrieve those values (a JSON object) by key from my React JS app (client side). So I have installed aws-sdk, the AWS JavaScript SDK, and with the code snippet below I am trying to access the SSM Parameter Store:
const AWS = require('aws-sdk');
AWS.config.update({region: 'us-east-1'});
const ssm = new AWS.SSM();

const getSecret = async (secretName) => {
  console.log(`Getting secret for ${secretName}`);
  const params = {
    Name: secretName,
    WithDecryption: true
  };
  const result = await ssm.getParameter(params).promise();
  return result.Parameter.Value;
};

module.exports = {getSecret};
I am receiving this error when running my application and accessing the store using the getSecret function:
Unhandled Rejection (CredentialsError): Missing credentials in config,
if using AWS_CONFIG_FILE, set AWS_SDK_LOAD_CONFIG=1
I believe that Amplify configures the environment implicitly, but since the SSM Parameter Store is not yet supported by Amplify, I have to use the AWS JS SDK for this purpose. Can anyone help me spot the issue in configuring the service using the AWS SDK? Or is there another or a better way to access the Parameter Store from the client side?
Also, after searching around I found a package named dotenv.
Is it okay to store AWS credentials that way?
Your code to fetch Parameter Store keys/values shouldn't be on the client side, considering the security implications. It should be done server-side, and the functionality can be exposed to the client over an endpoint.
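As a rough sketch of that idea (the /api/parameters route, the Express wiring and the './ssm' path are illustrative assumptions, not part of your current setup), the getSecret helper from your question could stay on the server and be exposed like this:
const express = require('express');
const { getSecret } = require('./ssm'); // hypothetical path to the module from the question

const app = express();

app.get('/api/parameters/:name', async (req, res) => {
  try {
    // Credentials stay on the server (instance role, shared config or env vars);
    // the browser only ever receives the values you explicitly choose to return.
    const value = await getSecret(req.params.name);
    res.json({ value });
  } catch (err) {
    res.status(500).json({ error: 'Could not read parameter' });
  }
});

app.listen(3000);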
You can read the credentials programmatically, something like this:
var AWS = require("aws-sdk");
var credentials = new AWS.SharedIniFileCredentials({profile: 'profile name'});
AWS.config.credentials = credentials;
Reference:
loading-node-credentials-shared
global-config-object
We have an app hosted in GCP App Engine. This app saves images and other data in Datastore. The images are separate entity kinds, say Kind1 and Kind2. We only need to export these two entity kinds and store the export in a storage bucket called datastore-exports. We have exported these two entity kinds manually via the console. We would like to create a Cloud Function that exports the two aforementioned Datastore entity kinds every 24 hours. I need assistance with the files and code logic for this to take place.
Below are two examples that I came across that are close to what we want to accomplish.
I see they are doing that with Firestore HERE.
I also had a look at this doc HERE, but we need to use Python 3.
Any assistance with either Node.js or Python 3 methods will be highly appreciated.
Thanks,
I tried to reproduce your use case:
Run the command:
gcloud app describe
# .......
# locationId: europe-west2
Make sure that your export bucket and your cloud function are deployed in the same location.
Your cloud function will use the App Engine default service account:
PROJECT_ID@appspot.gserviceaccount.com
Assign to this service account the role Datastore Import Export Admin.
(I would recommend creating a new service account for your cloud function instead of using the App Engine default service account.)
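For reference, granting that role from the command line could look roughly like this (the project ID and service account are placeholders):
gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
  --member="serviceAccount:YOUR_PROJECT_ID@appspot.gserviceaccount.com" \
  --role="roles/datastore.importExportAdmin"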
Create the cloud function:
a. main.py
def export_datastore(request):
    import google.auth
    import google.auth.transport.requests
    import json
    import requests

    # Get the access token
    creds, project_id = google.auth.default()
    auth_req = google.auth.transport.requests.Request()
    creds.refresh(auth_req)
    token = creds.token

    output_url_prefix = 'gs://your-bucket'
    url = 'https://datastore.googleapis.com/v1/projects/{}:export'.format(project_id)

    # We export all kinds and all namespaces
    entity_filter = {
        'kinds': [],
        'namespace_ids': []
    }
    # Body of the export request (kept separate from the incoming Cloud Function request argument)
    request_body = {
        'project_id': project_id,
        'output_url_prefix': output_url_prefix,
        'entity_filter': entity_filter
    }
    headers = {
        'Content-Type': 'application/json',
        'Authorization': 'Bearer ' + token,
        'Accept': 'application/json'
    }

    # Call the API to start the Datastore export
    r = requests.post(url, data=json.dumps(request_body), headers=headers)
    print(r.json())
    return 'Export command sent'
b. requirements.txt
# Function dependencies, for example:
google-auth
requests
Use Google Cloud Scheduler to call the cloud function every 24 hours.
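A hedged example of such a Scheduler job (the job name, schedule, region and function URL below are placeholders; adjust the auth flags to match how your function is secured):
gcloud scheduler jobs create http datastore-export-daily \
  --schedule="0 3 * * *" \
  --uri="https://REGION-YOUR_PROJECT_ID.cloudfunctions.net/export_datastore" \
  --http-method=GET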
We have created two microservices: one that uploads images to Google Storage (a bucket) from Node.js, and a Python service that returns the serving_url for an image. Both services work. Almost...
Our problem is that if we don't have the correct ACL on a file in the bucket, we are not allowed to get the serving_url() from the image service.
We are uploading images from the frontend with a resumable upload link, generated in the node.js file service:
createResumeableUpload: async (filename, metadata) => {
  const file = await bucket.file(filename);
  const upload = await file.createResumableUpload({
    metadata,
    public: true,
  });
  return upload;
},
After inspecting the file in the bucket, we do see allUsers with Reader permission (see the image below). But the transformation in the image service still throws an exception due to invalid permissions.
If we instead upload an image directly through the bucket interface, it gets some additional permissions, and after some testing we found that the permission we are looking for is the project editors entry, the second one in the list.
We have tried a lot, both setting the ACL when we create the resumable upload and on the file after creating it, but nothing works. We would really appreciate it if someone could tell us how to set the correct ACL on the file, so that we are able to get the serving_url() from the image service.
[EDIT]
To answer some of the questions: we are 100% certain this is a permission issue. The image service works fine if we add the editors permission, but we need to be able to add this permission when we add images to the bucket. And this is my question:
How do we add the owner permission for project editors when we upload images to the bucket from our Node.js service?
The Images API uses the App Engine default service account (YOUR_PROJECT_ID@appspot.gserviceaccount.com). This service account does not need to be a project owner for get_serving_url() to work, as long as you grant it the Storage Object Admin role on the bucket. Alternatively, you can update the object ACLs, granting object owner to the service account.
In order to add the necessary permissions, you can send a patch request to the JSON API with a JSON body that defines all the ACLs you wish the object to have.
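For illustration, such a patch with the request module could look roughly like this; the bucket, object and entity names are placeholders, and the snippet assumes you already have an OAuth 2.0 access token with a storage scope (for example from google-auth-library):
var request = require('request');
var accessToken = process.env.ACCESS_TOKEN; // assumption: token obtained elsewhere

request({
  url: 'https://www.googleapis.com/storage/v1/b/your-bucket/o/' + encodeURIComponent('your-object'),
  method: 'PATCH',
  headers: {Authorization: 'Bearer ' + accessToken},
  json: {
    // The acl array replaces the object's ACLs with exactly the entries listed here
    acl: [
      {entity: 'project-editors-YOUR_PROJECT_NUMBER', role: 'OWNER'},
      {entity: 'allUsers', role: 'READER'}
    ]
  }
}, function(error, response, body) {
  console.log(error || body);
});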
UPDATE:
A feature request has been created on your behalf to make this functionality available in the Node.js client library. Please star it so that you receive updates about it, and do not hesitate to add comments with details of the desired implementation. You can track the feature request by following this link.
After many hours of trying to figure out the Google ACL system, I would like to share how I was able to set up ACLs from Node.js.
After an image file is uploaded to the bucket, it needs to have the project owner ACL added. If the image file does not have this ACL, generating the serving_url() throws an exception.
To update ACL I used this function:
updateAcl: async (name: string) => {
  const file = await bucket.file(name);
  console.log('file', file);
  const acl = await file.acl.add({
    role: 'OWNER',
    entity: 'project-editors-231929353801'
  });
  return acl;
}
From the storage panel, it's possible to get a list of all the default ACLs.
I am building a web app using AngularJS and am using Firebase as my database and storage.
My issue is that I am trying to get a .txt file from my storage and display it in the browser, but whenever I make a request using the download URL that I got via the Firebase SDK, I get a CORS error.
I am authenticated with Firebase Auth and I have already been able to download images via the 'src' attribute.
This is my request:
$http({
  url: url,
  method: 'GET'
}).then(function(data){
  console.log(data);
}).catch(function(e){
  console.log(e);
});
I don't want to override the CORS options if I don't have to. Does anyone know how I can do this?
I am using Firebase Hosting to host the site, which shares the same URL as my Firebase Storage.
OK.
Thanks Mike Macdonald for your comment.
I'll just do what it says in the other thread you sent me.
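For reference, in case the approach from that thread turns out to be needed after all, CORS on the underlying bucket is usually configured with a small JSON file and gsutil; the origin and bucket name below are placeholders:
# cors.json
[
  {
    "origin": ["https://your-project.firebaseapp.com"],
    "method": ["GET"],
    "responseHeader": ["Content-Type"],
    "maxAgeSeconds": 3600
  }
]
Then apply it with:
gsutil cors set cors.json gs://your-project.appspot.com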