PHP Encrypting Files During Downloads and Uploads

This seems like something that should be easy to find, but I've tried every combination of search terms I could think of, and all I could find were answers that were "close but no cigar". After spending over half an hour looking, I finally decided to ask.
What I am trying to do, explicitly worded, is to ensure that the files my users upload to or download from my web pages are encrypted during transfer. I am not satisfied with just throwing https:// onto the beginning of the files' links, because these files need to be password protected. In order to password protect them, of course, I have set the directory permissions such that the files inside cannot be accessed via URLs at all. I am using a PHP script to manage the uploads and downloads.
I have tried checking the php.net pages on topics like header() and mcrypt_encrypt() and have come up empty-handed. The page on header() appears to apply to HTTP only and doesn't tell me how to use an encrypted protocol for a file download (if that's even how one does it). I also can't use mcrypt_encrypt() on the assumption that mcrypt_decrypt() can just be run later to make the files usable, because obviously mcrypt_decrypt() can't be run client-side after a download (nor can mcrypt_encrypt() be run client-side before an upload). So I am left wondering what method I would use to ensure that users' browsers can encrypt and decrypt these files in a way that requires no action from the user - the same way everything else is encrypted and decrypted.
I'd like to assume that enforcing https on these web page URLs will automatically take care of it, the way it takes care of the web page output. However, I observe that files with separate file paths, like images and CSS, are not automatically encrypted, and that the code I'm using to trigger those file download boxes contains header information, implying that it's a separate transaction, and perhaps not an encrypted one.
I have really, really thought about this from a whole bunch of angles and I'm just not seeing the solution. Anyone want to help me?

Use HTTPS for secure (encrypted) delivery of data. Store the files in each user's folder as you're doing, and only allow access after authentication (over HTTPS).
The reason you're having a hard time finding another solution is because HTTPS is the solution.
If you want to store the files encrypted on disk, you can encrypt them with a symmetric cipher (block or stream) as they're uploaded and do the reverse as they're downloaded. You could use a secret key that's unique per user as the symmetric key.
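For the at-rest part, here is a minimal sketch in PHP, assuming PHP 7+ with OpenSSL (the mcrypt extension is deprecated since PHP 7.1). The helper names, the AES-256-CBC choice, and the key handling are illustrative assumptions, not a hardened design:

<?php
// Sketch only: encrypt an upload before storing it outside the web root.
// $key would be the per-user secret mentioned above.
function storeEncrypted($tmpPath, $destPath, $key)
{
    $iv = random_bytes(openssl_cipher_iv_length('aes-256-cbc'));
    $ciphertext = openssl_encrypt(
        file_get_contents($tmpPath), 'aes-256-cbc', $key, OPENSSL_RAW_DATA, $iv
    );
    file_put_contents($destPath, $iv . $ciphertext); // prepend IV for download time
}

// Decrypt a stored file and stream it to the browser. HTTPS encrypts the
// transfer itself, exactly as it does for the page output.
function sendDecrypted($path, $key, $downloadName)
{
    $data  = file_get_contents($path);
    $ivLen = openssl_cipher_iv_length('aes-256-cbc');
    $plain = openssl_decrypt(
        substr($data, $ivLen), 'aes-256-cbc', $key, OPENSSL_RAW_DATA,
        substr($data, 0, $ivLen)
    );
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . $downloadName . '"');
    echo $plain;
}

Note that the header() calls here are the same ones that trigger the download box; as long as the script is reached over https://, that whole response - headers and file body alike - travels encrypted, so no client-side decryption step is needed.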

Related

Dealing with large zip uploads and extracting using Google Cloud

I am trying to create a site for e-learning courses (zips of html/css/js/media) to be uploaded to.
I am using Go on Google App Engine, with Google Cloud Storage to store the zips and extracted courses.
I will explain the development dead ends I have encountered.
My first thought was to use the resumable upload functionality of Cloud Storage to send the zip file, then read it using Go on App Engine, unzip the files, and write them back to Cloud Storage.
It took a while to read and understand the documentation, but this worked perfectly for my 2MB test zip. It failed when I tried it with a modest 67MB zip: I had hit a hidden limitation on accessing Cloud Storage from App Engine. No matter which client I used, there was a 10MB/32MB limit.
I tried both the old and new libraries as well as the Blobstore.
I also looked into creating a custom OAuth2-supporting client library using sockets, but hit too many dead ends.
Giving up on that approach, I thought that even though it would mean more uploading, perhaps just extracting on the client (browser) side and then uploading each file with its own resumable upload would make the most sense. After exploring a few libraries, I had in-browser extraction working and ready to upload.
I wrote my handler that created the datastore entry for the upload, selected a location for the upload and created all the upload urls.
When testing this, I found that it would take a while to work through generating the long lists of upload URLs (anything over 100 files). Since I was using Go, I decided it would make sense to make the requests concurrently, and I spent a day or two getting that working. After dealing with some CORS issues that weirdly had not shown up earlier, I had everything working.
Then I started getting errors when stress testing my approach with a large (500MB) zip/course. The uploads would fail, and I discovered that when trying to send 300+ files to generate upload URLs I was getting the following error:
Post http://localhost:62394: dial tcp [::1]:62394: connectex: No connection could be made because the target machine actively refused it.
Now I have no idea how to diagnose this. I don't know if I am hitting a rate limit, and if I am, I don't know how to avoid it.
Something that seems like it should be simple to create is turning out to be anything but.
I have a few options I can pursue:
1. Try to create the resumable uploads with a batch operation (https://cloud.google.com/storage/docs/json_api/v1/how-tos/batch) - though batch operations to /upload are not supported.
2. Request each upload URL with a one-by-one API call.
3. Request the URLs over a channel (https://cloud.google.com/appengine/docs/go/channel/reference).
4. Spend the next week or more adding layers of retries and fallback error handling.
5. Try another solution.
This should be simple. How should this be done?

How to secure file download?

I have an application written in AngularJS and a Dropwizard backend. All API calls are AJAX, with the exception of file downloads, which are done by redirecting to a standard GET request.
All API calls are secured through a token which is passed as a Token header. We use SSL for all APIs.
The download GET request works, but I'm having a hard time figuring out how to secure it. I have no way of setting a custom header, which is required to pass the token. So theoretically I'm left with two options, neither of them acceptable: 1. Pass the token as one of the GET parameters. 2. Leave the download unsecured.
Any ideas on how to secure the file download?
Putting a secret token in a URL query parameter isn't great, because URLs tend to leak - for example through history, logging, or Referer headers. There are ways to mitigate this: for example, the server side could issue a download token that is only good for one use or for a limited amount of time. Or the client could pass a time-limited token created using a signature over the secret token, which the server side could verify.
Alternatively you could, just for this one interface, put the token in a cookie (e.g. path-limited and quickly expiring).
Another approach is to download the whole file through AJAX, which lets you set the header as normal. But then you have to present the content as a downloadable local resource, which requires a cocktail of browser-specific hacks (e.g. using data: or filesystem: URLs, and potentially links with the download attribute). Given the complication, this usually isn't worth bothering with, especially if the file is very large, which may run into further storage constraints.
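The time-limited signed token idea is quite small in practice. A minimal sketch, shown in PHP only because that is the language of the rest of this page (the same pattern translates directly to Dropwizard); the function names and the $secret handling are illustrative assumptions:

<?php
// Issued by an authenticated API call; $secret is a server-side key.
function makeDownloadToken($fileId, $secret, $ttl = 60)
{
    $expires = time() + $ttl;
    $sig = hash_hmac('sha256', $fileId . '|' . $expires, $secret);
    return $expires . '.' . $sig;
}

// Verified by the plain GET download endpoint before streaming the file.
function verifyDownloadToken($fileId, $token, $secret)
{
    $parts = explode('.', $token, 2);
    if (count($parts) !== 2) {
        return false; // malformed token
    }
    list($expires, $sig) = $parts;
    if ((int)$expires < time()) {
        return false; // token expired
    }
    $expected = hash_hmac('sha256', $fileId . '|' . $expires, $secret);
    return hash_equals($expected, $sig); // constant-time comparison
}

If a token like this does leak from a URL, it names a single file and dies within a minute, so the damage is bounded.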

nginx custom HTTP authorization, using scripts to decide

I want to protect some files in a folder by requiring passwords for download.
However, the list of users that are allowed to download is in a MySQL table, with their passwords stored as MD5 hashes (which means I cannot generate an htpasswd file).
To make it harder, I also need to allow some users to download some files and other users to download other files, without moving files around (i.e. without separating the files into multiple folders).
So what I need is some kind of auth API: when there is a request, nginx asks a script (let's say a PHP script) with the parameters username/password/IP/filename and, depending on the script's response, allows or disallows the download.
Is this possible?
What I've done so far:
1. Looking in the 3rd-party modules list http://wiki.nginx.org/3rdPartyModules, where I found a module using PAM, but my server is Windows.
2. Googling lots of terms without any results.
3. Looking at the module development tutorials http://www.evanmiller.org/nginx-modules-guide.html.
I'm not really good at C, so a pre-made module for Windows that bounces the request to a script (without proxying the download through it) would be best; if not, some pointers on how I should write a module that meets my requirements would be appreciated.
You can use the http_auth_request module: nginx fires a subrequest at a URL of your choice and allows the download only if that subrequest returns a 2xx status.
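A minimal sketch of the wiring, assuming nginx was built with the auth_request module and that credentials arrive via HTTP Basic auth (which nginx forwards with the subrequest); the paths, the port, and check.php are illustrative assumptions:

location /downloads/ {
    auth_request /auth;              # subrequest decides allow/deny
}

location = /auth {
    internal;
    proxy_pass http://127.0.0.1:8080/check.php;
    proxy_pass_request_body off;     # auth subrequests need no body
    proxy_set_header Content-Length "";
    proxy_set_header X-Original-URI $request_uri;
}

check.php (served by any PHP-capable server on port 8080 here) answers 204 to allow and 403 to deny:

<?php
// Illustrative only: table and column names are assumptions.
$user = isset($_SERVER['PHP_AUTH_USER']) ? $_SERVER['PHP_AUTH_USER'] : '';
$pass = isset($_SERVER['PHP_AUTH_PW'])   ? $_SERVER['PHP_AUTH_PW']   : '';
$file = basename($_SERVER['HTTP_X_ORIGINAL_URI']);

$pdo  = new PDO('mysql:host=localhost;dbname=site', 'dbuser', 'dbpass');
$stmt = $pdo->prepare('SELECT 1 FROM users u
                       JOIN file_grants g ON g.user_id = u.id
                       WHERE u.name = ? AND u.pass_md5 = ? AND g.filename = ?');
$stmt->execute(array($user, md5($pass), $file));

http_response_code($stmt->fetch() ? 204 : 403);

Because the decision happens in a subrequest, nginx still serves the file itself - the download is never proxied through the script, which matches your requirement.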
P.S. Are you aware that nginx for Windows is not considered production-ready?

Only logged-in users can play audio from our server

We have made a Silverlight application where users can preview audio files in their browser via the Telerik RadMediaPlayer control.
The files are on a web server, and anyone who sniffs the traffic can download them.
We would like to prevent non-logged-in users from accessing/downloading these files.
Besides providing the application with some sort of temporary valid URL and implementing a custom HttpHandler, what are our options?
It's not too big of a problem if our customers can download the files; we just don't want the rest of the world to also have access.
Any ideas would be more than welcome!
[Update]
The only thing I can come up with is:
host the files in a non-public folder
if a user requests to pre-listen to a file, copy it to a public folder under a new name ([guid].mp3) and return its URL
every X minutes, clean the public folder.
Don't let the web server serve up the files straight out of a directory. Put part of your application in front, and let one of your server-side scripts serve up these files. Keep the raw audio files out of the web root.
For instance, your client-side application would access files like so:
http://someserver/yourscript?audio_asset_id=12345
The code at yourscript would verify the session data, ensuring that a user is logged in, would then figure out the real path to asset ID 12345, and echo its contents to the client. Don't forget to include the proper Content-Type header as well.
Once the accessing of these assets is under your control, you can implement whatever security measures you like. If your sessions are already pretty well safeguarded, this should be fine. I would also recommend implementing sane quotas: if you get 100 requests for an asset using the same session ID from multiple IP addresses, something isn't right.
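A minimal sketch of such a gatekeeper script in PHP, assuming session-based login; the session key, asset path scheme, and quota handling are illustrative assumptions:

<?php
// yourscript: stream an audio asset only to a logged-in session.
session_start();
if (empty($_SESSION['user_id'])) {
    http_response_code(403);     // not logged in
    exit;
}

$assetId = isset($_GET['audio_asset_id']) ? (int)$_GET['audio_asset_id'] : 0;
$path = '/srv/audio/' . $assetId . '.mp3';   // in practice, resolve via your DB
if (!is_file($path)) {
    http_response_code(404);
    exit;
}

header('Content-Type: audio/mpeg');
header('Content-Length: ' . filesize($path));
readfile($path);                 // stream without buffering the whole file in memory

This only gates access, which also removes the need for the copy-to-public-folder workaround; if sniffing the traffic itself is the concern, the files should additionally be served over HTTPS.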

Avoid Google OAuth2 client secret in code

I'm writing a small C program which connects to the Google API via OAuth2.
Therefore I need to send a client secret to Google.
I store this secret in my code, which I want to push to GitHub, but how can I avoid showing my client secret to everybody who looks at my code?
Use a configuration file to store the API key. You have many options, the simplest being writing the key directly into the file, the more sophisticated being some kind of serialization format (JSON, XML, INI files, etc.). The right option is up to you; usually you'll want a structured format if you need to store several options in the file.
You can also pass the key as a program argument, if you don't mind it being visible in the process list of your host.
And be sure not to push your already-existing git history to GitHub - create a new repository instead, or all your previous commits (with the key) will be public ;)
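Concretely, the config-file route can be as small as this; the file name api.cfg is an arbitrary choice:

# api.cfg - never committed; each developer/deployment supplies their own
client_secret=YOUR-CLIENT-SECRET-HERE

# .gitignore - make sure git never picks the file up
api.cfg

Your C program then reads api.cfg at startup instead of having the secret compiled in.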
Storing secrets (and ideally any sensitive string literals) in code is wrong - store them in a resource (text) file and don't push that file to Git.
If you are trying to find the client secret for your Google Drive apps, follow these steps:
1. Go to your project.
2. Click Credentials.
After that you will see all the details about your project, like the client ID, redirect URI, etc. There, click the "Download JSON" button; the downloaded file contains your CLIENT SECRET.
