Script to upload files to Azure Storage File Share

Situation:
I just created an Azure Storage file share, and I have already figured out how to upload files via a script (C#) to Azure BLOB storage.
Problem:
I want to upload files via a script to my Azure storage file share (not BLOB). I have already installed the Azure CLI, but the problem is that I have to log in first (az login) before I can take any action.
Is there any way to upload files from a folder on my PC to the Azure storage file share (testuser.file.core.windows.net\test) without mounting it?
Thanks a lot.

There are multiple ways to accomplish this. If your only problem is the login part, then you can use a service principal and a certificate to automate the login.
Example:
az login --service-principal -u http://azure-cli-2016-08-05-14-31-15 -p ~/mycertfile.pem --tenant contoso.onmicrosoft.com
You can see more options by using
az login -h
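Once logged in, the CLI can target the share directly, so nothing needs to be mounted. A minimal sketch, assuming the share is named test on the account testuser and that an account key is at hand; the key and local paths are placeholders:
az storage file upload --account-name testuser --account-key <key> --share-name test --source ./myfile.txt --path myfile.txt
az storage file upload-batch --account-name testuser --account-key <key> --destination test --source ./myfolder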
Hope this helps.

Is there any way to upload files from a folder on my PC to the Azure storage file share (testuser.file.core.windows.net\test) without mounting it?
Yes, you can also use C# code to upload a file to Azure File storage. Please try the following code.
// Requires the WindowsAzure.Storage NuGet package:
// using Microsoft.WindowsAzure.Storage;
// using Microsoft.WindowsAzure.Storage.File;
// using System.IO;
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("storage connection string");
// Create a CloudFileClient object for credentialed access to File storage.
CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
// Get a reference to the file share we created previously.
CloudFileShare share = fileClient.GetShareReference("test");
share.CreateIfNotExists();
// Upload a local file to the root directory of the share.
CloudFile cloudFile = share.GetRootDirectoryReference().GetFileReference("fileName");
using (Stream fileStream = File.OpenRead(@"localpath"))
{
    cloudFile.UploadFromStream(fileStream);
}
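Since the question mentions uploading the files from a folder, here is a small sketch along the same lines; the local folder path is a placeholder, and uploading into the share's root directory is an assumption:
// Upload every file in a local folder to the root of the share (hypothetical folder path).
CloudFileDirectory rootDir = share.GetRootDirectoryReference();
foreach (string localFile in Directory.GetFiles(@"C:\localFolder"))
{
    CloudFile target = rootDir.GetFileReference(Path.GetFileName(localFile));
    using (Stream stream = File.OpenRead(localFile))
    {
        target.UploadFromStream(stream); // creates or overwrites the file on the share
    }
}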

Related

How to allow downloading huge file in a website only after authentication?

I have a website where users need to see a link to download a file (approximately 100 MB in size) only after authenticating themselves (user ID/password) on the website. Users should not be able to copy the link and use it later without authentication.
Can a REST API (with Transfer-Encoding: chunked) return such a large file without timing out?
Note: We currently have Java Spring Boot based APIs for some basic functions that return JSON (text) responses.
How can I prevent the URL from being accessed later without authentication?
Is there any approach to generate dynamic URLs that are valid only for a few minutes? Should this logic be in the app server, or does a CMS like Drupal have this feature?
I am open to storing this file in a DB, in Drupal, or on a file server, as per the recommended approach for securely downloading the file. This file is not text/image/PDF; it will be a binary file.
Note: My system does not use any Public Cloud like AWS/GCP/Azure

Azure Logic Apps-Download File from URL

I have a requirement in Logic Apps where I need to do an HTTP GET from a website URL which returns a file that I need to download to Azure File Storage.
I am able to call the downloadable URL, but I am not sure how to go about downloading the file to an Azure File storage directory.
Please let me know your inputs. Do I need to write an Azure Function, or can I get the HTTP action to do the trick and download the file?
Thanks,
SP
I suppose Logic Apps has moved on a little since you first asked this question.
The short answer is yes, you can do this solely within Logic Apps.
I'm assuming you're making an HTTP request at some point and the downloadable file is being returned with a content type of application/octet-stream.
Use a 'Blob Storage' -> 'Create blob' action; the only thing I needed to do was use the binary function as the content in this action,
e.g. binary(body('HTTP'))
This caused my zip file to be created in the Azure storage account as a blob.
Ping me if you need more info.
1) You need to create a web API method or an Azure Function which returns the file content, like I tried for a zip file (see the sketch after this list).
2) You need to call that method using the HTTP connector.
3) You can use the "Azure File Storage" connector's "Create file" action;
in that, you need to pass the file name and the file content returned from your GET API URL.
If you need more help, feel free to ask.
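For step 1, a minimal sketch of what such a function could look like, assuming an HTTP-triggered Azure Function written in C#; the function name and the local zip path are hypothetical:
// Hypothetical HTTP-triggered Azure Function that returns a zip file's bytes,
// so a Logic App HTTP action can pass them to the "Create file" action.
using System.IO;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class GetFileFunction
{
    [FunctionName("GetFile")]
    public static HttpResponseMessage Run(
        [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequestMessage req)
    {
        byte[] bytes = File.ReadAllBytes(@"D:\home\data\sample.zip"); // placeholder path
        var response = new HttpResponseMessage(HttpStatusCode.OK)
        {
            Content = new ByteArrayContent(bytes)
        };
        response.Content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
        return response;
    }
}
The Logic App's HTTP action would call the function URL, and the response body would then go into the "Create file" action's file content field.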

How to embed a SQLite DB (*.db file) to the AIR application?

I want to embed a SQLite database (*.db file) into the AIR app, like images, MP3s, etc.
This is to ensure that it is impossible for other users to copy my database.
No, you can't. And even if you could package it, the data wouldn't really be protected: since an .air package is in essence nothing more than an enhanced .zip archive, anyone could just extract the database from the .air file.
What you have to do is encrypt your database and protect it with a password so that only authorized users can access the database.
There's ample documentation on how to do this. Here's a tutorial from Adobe to get you started: http://www.adobe.com/devnet/air/flex/quickstart/articles/encrypted_database.html
I think this is not possible. If you embed it in the .air file and install the AIR app, the DB will be copied to the installation folder.
You could create the database in the user storage location:
var file:File = File.applicationStorageDirectory.resolvePath("name.db");
Users could still navigate to the file in Explorer. It will be stored under /users/username/appdata/appname/Local Store.
AppData is a hidden folder.

CakePHP loads old database config file

I developed a small CakePHP website on my local machine. When I uploaded it to my hosting server, I realized that I must use their prefix for the database name and database user. So I went and created a new database name and a new user. When I visit the site, it shows this:
Warning (2): mysql_connect() [function.mysql-connect]: Access denied for user 'old_name'@'localhost' (using password: YES) [CORE/cake/libs/model/datasources/dbo/dbo_mysql.php, line 552]
I am 100% sure I already changed the name to "new_name"; it runs on my local machine, but I don't know why it still loads the old name. I went to FTP and downloaded the database config file, and it clearly states that the database user is "new_name". I don't understand why, when I go to the site, it loads the old database config. I am new to CakePHP; please help.
Try clearing the files from your app/tmp directory and triple-check the configuration.
Sometimes it can take a while for files to become live. One host I use has a folder for upload and then they transfer to the live folder from there automatically. Could it be something like this?
You possibly need to specify an IP for the database instead of localhost. Often the database is hosted on a different server to the code.

Change owner of file uploaded in server

I am trying to overwrite or perform some file operations on files uploaded to a web server.
Previously, I uploaded the files from a Joomla extension. It set their owner to 99. Without changing their owner to my login name, I am unable to perform file operations using FTP and cPanel.
What can be done?
You could enable the FTP layer of Joomla.
It does depend a bit on how your hosting sets permissions (whether they use ACLs, etc.), but the FTP layer of Joomla is designed to get around exactly this issue.
Documentation for this feature is here:
http://help.joomla.org/content/view/1941/302/1/2/
