Permissions of the uploads folder and the server-files folder on a dedicated server

I will be running a website on a dedicated server, where users will be allowed to upload files (I am not checking those files for viruses).
Let's say the folders are laid out like so:
parent_folder -> website (folder which contains server files)
parent_folder -> uploads (folder which contains user uploads)
Question: will file permissions of 750 on the website folder and 770 on the uploads folder suffice?
(Note 1: owner: root; group: apache; the website is a social network)
(Note 2: as Apache (www-data) will handle uploads, I believe 770 may be required in place of 740)
(Note 3: as Apache (www-data) will only need to read files in the website folder, 750 should do there)
(Note 4: the server will be maintained by a single user)

I think whether it's good enough depends on what risk you are willing to take. For basic protection, it may be good.
What you have to consider is what would happen if files were stolen, either by an unauthenticated attacker or by another user: I suppose there will be some kind of access control giving users access to their own files, in which case an obvious risk is one user being able to access another user's files. How would it affect your website / business if this happened? Are you willing to take this risk? Who are the attackers you are trying to protect against? Opportunistic script kiddies? Knowledgeable attackers specifically targeting your site? Nation-state-level attackers? You have to answer these questions first.
The drawback of this solution is that any application- or operating-system-level breach will most likely give access to all uploaded files of all users. Should your application be vulnerable to something like local file inclusion, directory traversal, or even SQL injection with certain databases, among many other vulnerabilities, your files can be downloaded. Whether you want to protect them further depends on your use case.
More secure options include storing the files on a separate server (some kind of file repository), encrypting them with user-specific keys (key management, and cryptography in general, is a huge topic in itself), implementing MIME-type identification with a whitelist so that only certain file types are allowed (this is a slightly different aspect, sketched below), or even adding end-to-end encryption to protect against yourself (operations staff). You can go to great lengths to protect these files if you want to, but it gets very complex very quickly.
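As an aside on the whitelist idea: the check is usually done on a file's leading "magic bytes" rather than its extension, which users can freely rename. A minimal sketch in .NET (the same idea applies to any upload handler; the three signatures shown, for PNG, JPEG, and PDF, are just an example whitelist):
Imports System.IO
Imports System.Linq
Imports System.Collections.Generic

Module UploadWhitelist
    ' Leading "magic bytes" of the file types we allow.
    Private ReadOnly Signatures As New Dictionary(Of String, Byte()) From {
        {"png", New Byte() {&H89, &H50, &H4E, &H47}},
        {"jpeg", New Byte() {&HFF, &HD8, &HFF}},
        {"pdf", New Byte() {&H25, &H50, &H44, &H46}}
    }

    ' True if the uploaded file starts with one of the whitelisted signatures.
    Function IsAllowed(ByVal path As String) As Boolean
        Dim header(7) As Byte
        Using fs As New FileStream(path, FileMode.Open, FileAccess.Read)
            fs.Read(header, 0, header.Length)
        End Using
        Return Signatures.Values.Any(Function(sig) header.Take(sig.Length).SequenceEqual(sig))
    End Function
End Module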

Related

VB.NET Copying Database template files to selected folder location during installation

I have a VB.net project as well as a setup project. I also have it so that during installation it asks the user to enter a file location to store their database. The plan is to have an empty .mdf file, with all the tables set up, copied into that folder, and I store the folder path in a config file.
This is mainly because I am planning on having multiple separate applications that all need the ability to access the same database. I have it storing the folder path in my config file; the only things I'm having trouble with are:
where to store the template files (I don't know if I should do this in the setup project or the main project)
how to copy said template files into a new folder
So far I have been unable to find a solution, so any help is appreciated.
Well, here is what I do in a few of my projects, something that has proven reliable enough for me over the years (which you may or may not want to do as well):
I have the program itself create the database files in an initialization routine. First, however, it creates the subfolders in which the database files will be stored, if they don't already exist.
To do this, the program just checks whether the folder and the database file exist, and if they do not, it creates them on the spot:
' Requires: Imports System.IO
' Create the database folder if it doesn't already exist.
If Not Directory.Exists(gSQLDatabasePathName) Then
    Directory.CreateDirectory(gSQLDatabasePathName)
End If
' Likewise for the database file itself.
If Not File.Exists(gSQLiteFullDatabaseName) Then
...
I also have the program do some other stuff in the initialization routine, like creating an encryption key to be used when storing / retrieving the data - but that may be more than you need (also, for full disclosure, this has some rare issues that I haven't been able to pin down).
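As for the copying itself, which the question asks about: if you do ship an empty .mdf as a template, the main application can copy it into place on first run. A minimal sketch, assuming the template is installed in a Templates subfolder of the application directory (the folder and file names here are hypothetical):
Imports System.IO

Module TemplateSetup
    ' Copy the shipped template database into the user-chosen folder, once.
    Sub EnsureDatabase(ByVal targetFolder As String)
        Dim source As String = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Templates", "empty.mdf")
        Dim target As String = Path.Combine(targetFolder, "empty.mdf")
        Directory.CreateDirectory(targetFolder)   ' no-op if the folder already exists
        If Not File.Exists(target) Then
            File.Copy(source, target)
        End If
    End Sub
End Module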
Here too are some additional considerations:
I appreciate you have said that you want to give the user the choice of where to store their database files. However, I would suggest storing them in the standard locations
Where is the correct place to store my application specific data?
and only allowing the users to move them if they really need to (for example, if the database needs to be shared over the network), as it will make supporting your app harder if every user has their data stored in a different place.
I have found that letting the user see in their options/settings window where their database is stored is a good idea.
Also encourage them to back those files/directories up.
Also create automatic backups of several generations for the user; a sketch of that last point follows.
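A minimal sketch of such generational backups (the backup folder and the number of generations kept are arbitrary choices):
Imports System.IO
Imports System.Linq

Module DbBackup
    ' Copy the database file to a timestamped backup, keeping only
    ' the newest "generations" copies.
    Sub BackupDatabase(ByVal dbFile As String, ByVal backupFolder As String, ByVal generations As Integer)
        Directory.CreateDirectory(backupFolder)
        Dim stamp As String = DateTime.Now.ToString("yyyyMMdd_HHmmss")
        File.Copy(dbFile, Path.Combine(backupFolder, stamp & "_" & Path.GetFileName(dbFile)))
        ' Prune the oldest copies beyond the requested number of generations.
        Dim copies = New DirectoryInfo(backupFolder).GetFiles()
        For Each oldCopy In copies.OrderByDescending(Function(f) f.CreationTime).Skip(generations)
            oldCopy.Delete()
        Next
    End Sub
End Module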
Hope this helps.

Is there a standard for protecting application files from user interference outside of application

Sorry if I didn't express myself precisely in the title; I'll try to explain what I meant to say here.
My application uses a lot of small files: DB files, XML files, fonts, etc. There is a folder and file presence check when the application starts, but I would like to make sure that the user cannot accidentally change or delete some important file from disk.
The only thing that comes to mind is archiving the files in a few archives grouped by usage frequency, changing the archive extension to something unfamiliar, and hiding those archives.
But compressing and uncompressing those files all the time in the application doesn't seem like an efficient solution.
Is there some standard procedure for protecting those important files from tampering?
The only thing that comes to mind is archiving the files in a few archives grouped by usage frequency, changing the archive extension to something unfamiliar, and hiding those archives
That is security through obscurity, which is not a recommended practice.
Instead, use the file security mechanisms built into your operating system. Allow appropriate file access only to a specific group/role or user, and ensure your application runs in that group/role or as that user.
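On Windows, for instance, a .NET application can tighten NTFS permissions through System.Security.AccessControl. A minimal sketch (the account name is a placeholder, and the application itself must run under an account not covered by the deny rule):
' File.GetAccessControl is the .NET Framework API; on .NET Core and later the same
' methods live in the System.IO.FileSystem.AccessControl package as FileInfo extensions.
Imports System.IO
Imports System.Security.AccessControl

Module FileProtection
    ' Deny ordinary users the right to delete or overwrite an important file.
    Sub ProtectFile(ByVal filePath As String)
        Dim security As FileSecurity = File.GetAccessControl(filePath)
        security.AddAccessRule(New FileSystemAccessRule(
            "BUILTIN\Users",
            FileSystemRights.Delete Or FileSystemRights.WriteData,
            AccessControlType.Deny))
        File.SetAccessControl(filePath, security)
    End Sub
End Module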

Batch Processing Design Patterns

A partner who cannot support a real-time web service interface must SFTP CSV files to my Linux environment.
The file is zipped and encrypted. The SFTP server is a different virtual server from the one that will process the CSV data into my application's database.
I don't need help with the technical steps (bash script, etc.), but I'm looking for file management conventions that assist with the following requirements:
Good auditability
Non-destructive
Recoverable
Basically I'm trying to figure out when it makes sense to make copies of a file, when to rename it to indicate that some processing step has been completed, and so on (e.g. do I keep the zip files, or do I delete them once they're unzipped?).
There is going to be personal preference in the response, but that's what I'm looking for: to learn from someone who has more experience working with this type of interface. That seems better than inventing something myself.
If the files are encrypted on the network and within the files' settings, then they cannot be successfully transmitted unless the file is parsed within another file. You could try to make the SFTP server forward the file onto a separate machine, but this would only cause more issues because of the encryption type used on the files.

Should data files be stored on the same computer (server) the database is stored in?

Currently in our research group, we have many "data files" stored on three servers and a couple of personal computers running different operating systems.
We want to build a database which would store some information in addition to the URLs of those various "data files". My question is: do we have to copy all the data files and put them in a directory on the same server the database is on? Or can they be left as they are on the different computers? If the second case is OK, what would be the format of the URL of the "data files"?
It really depends on what your intended goal is and what your current setup is like.
If the files are currently sitting somewhere on the network, and you need a path that the application can use to access them, you just need to store the network path (\\server\share\file for Windows environments) in the database, then read it and access that path to get at the files (a sketch follows below). You'll need to make sure everyone has read access to them.
If the files are currently accessible through a website URL, internal or external, then again, you just need to store that URL (or some portion thereof, e.g. http://mywebsite.com/myfile or http://servername/myfile) and access that.
If neither of the above is currently true, but you want it to be, then you'll need to set up a new share/webserver and put the files there. There's no requirement that this be the same server as the database, but it would make for better backups if it were.
If you want the files themselves to be in the database, you should check out Bob Fanger's link.
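For the first case, a minimal sketch of reading such a stored path back and using it (the table and column names are hypothetical):
Imports System.Data.SqlClient
Imports System.IO

Module FileFetch
    ' Look up a stored network path in the database and read the file it points to.
    ' Assumes a table "DataFiles" with columns "Id" and "FilePath" (hypothetical names).
    Sub ReadDataFile(ByVal connStr As String, ByVal id As Integer)
        Using conn As New SqlConnection(connStr)
            conn.Open()
            Using cmd As New SqlCommand("SELECT FilePath FROM DataFiles WHERE Id = @id", conn)
                cmd.Parameters.AddWithValue("@id", id)
                Dim path As String = CStr(cmd.ExecuteScalar())   ' e.g. \\server\share\file.dat
                Dim bytes As Byte() = File.ReadAllBytes(path)    ' requires read access to the share
                Console.WriteLine("Read " & bytes.Length & " bytes from " & path)
            End Using
        End Using
    End Sub
End Module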
Not sure what you're asking here but...
If you want your database engine to read files filled with data, it probably doesn't matter where they are stored, though this may depend on the database you are using. Are you using MySQL? MS SQL Server? Oracle?
Many database vendors provide relatively easy-to-use admin tools that let you choose a file to be loaded, and usually the file-chooser dialog lets you browse the network, so you could load a file over the network. Details on how to do this vary, so consult the manual for your database engine for loading data from a pre-existing file.
Be aware that if the database is on computer A and the data is being loaded from computer B over the network, it will probably be slower than if the data were on the same computer as the database.
It doesn't really matter if the files are stored outside the database anyway.
See Storing Images in DB - Yea or Nay? for more thoughts on that one.
If the files are accessible by a URL, you can store that with the metadata, like
http://server1/folder/file.ext, file://server1/folder/file.ext, or file:///P:/folder/file.ext
Things to consider:
Backups
Performance
Synchronisation between the meta-data and the data

What is the best deployment approach for WPF applications with local database?

I want to make a WPF application that exists in one directory including all files that it needs: .exe, .mdf database, .xml config files, etc.
The application should work no matter what directory it is in, so that it supports this scenario:
person 1 executes the application in c:\temp\wpftool.exe
the application reads and writes to the c:\temp\wpftool.mdf database
person 1 zips up that directory and sends it to person 2 via e-mail
person 2 unzips it to c:\Users\jim\documents\checkout\wpftool.exe, the application reads and writes to the same database in that directory (c:\Users\jim\documents\checkout\wpftool.mdf)
person 2 zips the directory up again and sends it back to person 1 to continue making changes on it
What is the best way to create a WPF application that supports the above scenario, considering:
there should be no hard-coded database connection strings
what is the best deployment method: ClickOnce, or just copying the .exe file out of the /release directory?
reasonable security, so that users have to log in based on passwords in the database, and a third person who happens to intercept the e-mail could not easily look at the data in the database
Some points on the database side:
Assuming the "New user" already has SQL installed, they'd need to attach the (newly copied) database. Besides having sufficient access rights to attach a database, your application would need to configure the call to include the drive\folder containing the database files. If your .exe can identify it's "new home folder" on the fly, you should be able to work that out.
Define "reasonable security". Any database file I get, I can open, review, and ultimately figure out (depends on how obscure the contents are). Can you obfuscate your data, such as using table "A" instead of "Customer"? Would you really want to? The best possible security involves data encryption, and managing that--and in particular, the encryption keys--can be a pretty advanced subject, depending on just how "secure" you want your data to be.
For the database, I would look into using the "user instance" feature in SQL Server Express. Combined with the |DataDirectory| substitution-string support, it makes it very easy for your application to get hooked up.
In all honesty, I have not deployed a ClickOnce app leveraging this approach myself yet, but I thought I would bring it to your attention because it's what I would look into if I were building something like you described. A sketch of the idea follows.
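A minimal sketch of the |DataDirectory| idea, assuming the default SQLEXPRESS instance name and the wpftool.mdf file from the question:
Imports System.Data.SqlClient

Module Startup
    Sub Main()
        ' Make |DataDirectory| resolve to the folder the .exe was launched from,
        ' so the same connection string works wherever the directory is unzipped.
        AppDomain.CurrentDomain.SetData("DataDirectory", AppDomain.CurrentDomain.BaseDirectory)
        Dim connStr As String =
            "Data Source=.\SQLEXPRESS;" &
            "AttachDbFilename=|DataDirectory|\wpftool.mdf;" &
            "Integrated Security=True;User Instance=True"
        Using conn As New SqlConnection(connStr)
            conn.Open()   ' attaches the .mdf sitting next to the .exe
        End Using
    End Sub
End Module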
