Where to put SQLite database file in Azure App Service? - database

Q1: Where do you think is the right place to put a SQLite database file (database.sqlite) in the Azure Web App file system? For example:
D:\home\data\database.sqlite
D:\home\site\database.sqlite
D:\home\site\wwwroot\database.sqlite
other?
Q2: What else should be taken into consideration to make sure that the database file won't be accessible to public users and won't be accidentally overwritten during deployments or when the app is scaled up/down? (The Web App is configured for deployments from a Local Git Repository.)
Q3: Where can I learn more about the file system used in Azure App Service (an official source URL)? E.g. how it's shared between multiple VMs within a single Web App, how it behaves when the app is scaled up/down, and what the difference is between D:\home (persistent) and D:\local (non-persistent)...
Note that SQLite does not work in Azure Blob Storage, so that one is not an option. Please, don't suggest alternative storage solutions, this question is specifically about SQLite.
References
Appropriate Uses For SQLite

In a Web App, your app is deployed to d:\home\site\wwwroot. This is the area where you may write files. As an example, the Ghost deployment writes its SQLite database to d:\home\site\wwwroot\content\data\ghost.db (this is easy to see if you open the Kudu console via yourapp.scm.azurewebsites.net).
This file area is shared amongst your web app instances. It is similar to an SMB file share, but specific to web apps (and different from Azure's File service).
The content under wwwroot is durable unless you delete your app service. Scaling up/down affects the amount of space available. (I have no idea what happens if you scale down and the smaller size has less disk space than you're already consuming.)
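As an illustration of building on the persistent D:\home area, here is a minimal Python sketch. It relies on App Service's HOME environment variable (which maps to D:\home on Windows plans) and uses the D:\home\data location from Q1; the "data" subfolder and the sqlite3 usage are illustrative choices, not taken from the answer above:

```python
import os
import sqlite3

# On Azure App Service (Windows), HOME typically resolves to the persistent
# D:\home share; the local-directory fallback just keeps the sketch runnable
# on a developer machine. This is an assumption for illustration.
home = os.environ.get("HOME", ".")

# D:\home\data is the first location listed in Q1; it sits outside wwwroot,
# so a Git deployment that replaces the wwwroot content should not touch it.
db_dir = os.path.join(home, "data")
os.makedirs(db_dir, exist_ok=True)
db_path = os.path.join(db_dir, "database.sqlite")

with sqlite3.connect(db_path) as conn:
    conn.execute("CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)")
```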

I would say the best location would be an App_Data folder under site/wwwroot. Create the folder if it doesn't exist.
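A minimal Python sketch of that suggestion, assuming the standard D:\home\site\wwwroot layout from the first answer; the App_Data naming follows the ASP.NET convention, and the sqlite3 usage is illustrative rather than part of the answer:

```python
import os
import sqlite3

# Assumes the standard App Service layout where HOME is D:\home and the app
# content lives under D:\home\site\wwwroot (see the first answer).
app_data = os.path.join(os.environ.get("HOME", "."), "site", "wwwroot", "App_Data")
os.makedirs(app_data, exist_ok=True)  # create the folder if it doesn't exist

# IIS/ASP.NET blocks direct HTTP requests for files under App_Data, which
# helps address Q2 (keeping database.sqlite away from public users).
connection = sqlite3.connect(os.path.join(app_data, "database.sqlite"))
```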

Web Apps can connect to storage accounts, so you can in fact use Blob storage and connect it to your web app. In terms of learning more about it, look at the appropriate page of the documentation.
In your Web App settings you can then select which storage account to use. You can find this under Settings > Data Connections, where you can select Storage from the drop-down box.

Related

What is the easiest way to push a file to GCP App Engine?

To verify ownership of a domain with a mail service, I need to publish a file with a specific name for verification. Is there a better way than pushing it into my app source repository?
For security reasons you would have to put the file in your source and do a deployment to App Engine. If you've worked with a traditional web server in the past, where you basically dump files into a folder and serve them, this will be a bit of a change: App Engine only serves what you deploy, so you can't drop new files in afterwards. If you want to add other files on the fly you would need a Cloud Storage bucket, but I don't think that will do it for your domain verification.
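For example, here is a minimal sketch assuming a Python/Flask app on App Engine (the original app may use a different runtime, and the file name is a hypothetical placeholder): the verification file ships inside the source tree and is served from a route.

```python
from flask import Flask, send_from_directory

app = Flask(__name__)

# "mail-verification.txt" is a hypothetical example of the exact file name
# the mail provider asks for; keep the real file in your source tree (here,
# a "static" folder) and redeploy whenever it changes.
@app.route("/mail-verification.txt")
def mail_verification():
    return send_from_directory("static", "mail-verification.txt")
```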

How do I preserve certain folders when doing a deployment slot swap on an Azure App Service?

I have an Azure app service with two additional slots - one for QA and testing and one for staging. The deployment process sees me deploying to QA for testing, then to Staging when I'm ready to go live. Staging is then swapped with the Production application to avoid downtime for users.
The problem I have is that I would like to keep some site content - effectively image files uploaded by users. They are located in a specific folder - let's say \wwwroot\images. I can't push these to my TFS system as they are effectively client data files.
Is there a way I can do this deployment without having to back that folder up (using FTP) and restore it after the swap (a 30 minute process)?
In hindsight I could probably have stored the images in the DB - they're not particularly large, but is there another approach that makes more sense? What about using Azure storage? What would you do?
Thanks in advance for any advice!
Generally, it's not recommended to store images in your code by creating folder(s) for them. Since an Azure Web App is a PaaS model, you don't own the server; your code may be deployed to any machine(s) in the Azure data center.
As you mention, you can use Azure Storage.
Azure Blob Storage is what you need to look at.
Here is a good demo of uploading images to Azure Blob Storage.
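To illustrate the Blob Storage route, here is a minimal Python sketch using the azure-storage-blob package; the app setting name, container name, and file name are assumptions, not part of the answer:

```python
import os
from azure.storage.blob import BlobServiceClient

# STORAGE_CONNECTION_STRING is a hypothetical app setting holding the
# storage account connection string.
service = BlobServiceClient.from_connection_string(os.environ["STORAGE_CONNECTION_STRING"])
container = service.get_container_client("user-images")

# Blobs live outside the web app's file system, so user uploads survive
# deployments and slot swaps untouched.
with open("photo.jpg", "rb") as data:
    container.upload_blob(name="photo.jpg", data=data, overwrite=True)
```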

Creating a local environment from an existing GAE installation

I have a website that is currently running under GAE... unfortunately, neither I nor anyone on the team has access to the local environment it was created from... Is it possible to create a local environment, or at least get a copy of the application files and database, from an existing GAE installation?
What you need is the application source code, not the "local environment".
Ideally this source code would be in a version control system (e.g. Git, SVN); Google Cloud Platform provides free Git repositories for your projects, so you might try looking there first. There's also a tool, for both Java and Python, that allows you to download the source of a deployed version, provided you are authenticated as either the dev who uploaded it or a project owner. EDIT: as stated by Dan Cornilescu, this feature can be disabled.
As for the database info, there are plenty of tools available to "export" your GAE Datastore data; just consider whether, for your project, it might be easier to run the queries manually than to actually implement those tools.
Thanks for the help... but unfortunately this code is not in Git. Furthermore, being new to Google hosting, I wasn't clear on my setup: my web instance is actually running on Compute Engine, not App Engine. Be that as it may, with some additional searching I was able to browse my file system through the VM Instances menu option under the Compute Engine section of the Google Cloud Platform interface. The VM Instances page shows your instance and, to its left, a connect option with a drop-down box that opens a browser window showing the instance's file system. In addition to this, I found this link https://www.youtube.com/watch?v=9ssfE6ODpak that shows how to configure the FileZilla FTP client to access your server instance - very helpful. From there, I was able to download all of my site files from the /var/www directory. Now, on to extracting my data... Thanks again!

GWT: how to store information on Google App Engine?

In my GWT application, a 'root' user uploads a specific text file with data, and that data should be available to anyone who has access to the app (using GAE).
What's the classic way to store data that will be available to all users? I don't want to use any database (Objectify!?) since this is a relatively small amount of information and it is changed from time to time by root.
I was wondering if there is some static map at the 'engine level' (not the user's session) where this info can be stored (and if the server goes down - no biggie, root will upload it again).
Thanks
You have three primary options:
Add this file to your /war/ directory and deploy with the app. This is what we typically do with all static files that rarely change (like .css files, images, etc.). This file will be available to all users, whether they are authenticated or not.
Add this file to your /war/WEB-INF/ directory and deploy with the app. This file will be available to your server-side code, so you can read it on the server-side and show to a user. This way you can decide which users can see this file and which users should not have access to it.
Upload this file to Google Cloud Storage. You can do it through an app, or you can simply upload it manually to a bucket using a GCS console or gsutil command-line tool. Then you simply provide a link to your users. The advantage of this option is that you do not have to redeploy your app when a file changes.
The only reason to go with the first two options is to have this file under version control. If you don't need that, I would recommend going with the GCS option.
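For the GCS option, here is a minimal Python sketch using the google-cloud-storage client; the bucket and file names are hypothetical, and since the original app is GWT/Java, treat this purely as an illustration of the idea rather than a drop-in solution:

```python
from google.cloud import storage

# Assumes Application Default Credentials and an existing bucket;
# the names below are placeholders.
client = storage.Client()
bucket = client.bucket("my-app-shared-config")
blob = bucket.blob("root-data.txt")

# Root re-uploads the file whenever the data changes; no app redeploy needed.
blob.upload_from_filename("root-data.txt")
```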

Microsoft Azure and Silverlight

I am interested in developing a site similar to YouTube. I want to have a site where users upload videos.
I imagine that, technically, the website would upload the video to the Azure cloud, where the file would automatically be encoded for Silverlight and hosted.
Can Azure host my site, take care of encoding, and host the videos, all programmatically?
And can Azure host the rest of the website pages that are not part of the app (like a homepage or an about-us page) under a domain name, or do I need a web host?
thanks
Azure can do the lot.
You'll probably want to use Azure Blob Storage for the initial upload, then use queues and the worker role functionality to do the encoding and other processing. Then you can store the resulting file back in Blob storage, and have an index either in Azure Tables or SQL Azure, depending on the architecture of the rest of the application.
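As a rough illustration of the queue-plus-worker pattern described above, here is a minimal Python sketch using the azure-storage-queue package (the original answer predates this SDK and assumes worker roles, so the connection string, queue name, and message format here are purely illustrative):

```python
import json
from azure.storage.queue import QueueClient

# The connection string and queue name are placeholders for this sketch.
conn = "<storage-connection-string>"
queue = QueueClient.from_connection_string(conn, "encode-jobs")

# Web front end: after the raw video lands in Blob storage, enqueue a job.
queue.send_message(json.dumps({"blob": "raw-uploads/video123.mp4"}))

# Worker (the "worker role" in the answer): poll for jobs, encode the video,
# store the result back in Blob storage, then delete the message.
for msg in queue.receive_messages():
    job = json.loads(msg.content)
    # ... download job["blob"], encode it, upload the encoded output ...
    queue.delete_message(msg)
```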
And yes, an Azure Web role can quite happily host static content, standard dynamic ASPX pages, and a whole lot more (and can do it all on your own domain).
I suggest you grab the Windows Azure SDK (from http://www.microsoft.com/windowsazure/) and take a look through the documentation. Your example scenario is pretty simple actually, and working through the samples should give you all the information you need.
Good luck!
Azure can host your site indeed. However, don't forget that the cost will probably be a minimum of roughly $80-90 per month even without any load; if your website gets traffic, this amount will increase.
However, you will have to implement video encoding yourself (or, better yet, find libraries to do it); Azure is purely a host.
