I want to protect some files in a folder by requiring passwords for download.
However, the list of users that are allowed to download is in a MySQL table, with their passwords stored as MD5 hashes (which means I cannot generate an htpasswd file).
To make it harder, I also need to allow some users to download some files and other users to download other files, without moving the files into separate folders.
So what I need is some kind of auth API: when there is a request, nginx asks a script (let's say a PHP script) with the parameters username/password/IP/filename, and depending on the script's response allows or disallows the download.
Is this possible?
What I've done so far:
1. Looking in the 3rd-party modules list http://wiki.nginx.org/3rdPartyModules
where I found a module with PAM support, but my server runs Windows.
2. Googling lots of terms without any results.
3. Looking at the module development tutorial http://www.evanmiller.org/nginx-modules-guide.html
I'm not really good at C, so a pre-made module for Windows that bounces the request to a script (without proxying the download through it) would be best; if not, some pointers on how to make a module that meets my requirements would be appreciated.
You can use the http_auth_request module.
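With auth_request, nginx makes an internal subrequest for every download: a 2xx response allows it, a 401/403 denies it, and nginx serves the file itself, so nothing is proxied through PHP. A minimal sketch of such an endpoint, assuming HTTP Basic auth and made-up table/column names (the X-Original-URI and X-Real-IP headers are ones you would set yourself in the nginx config, shown here as comments):
(PHP)
<?php
// auth.php - hedged sketch of an auth_request endpoint, not a definitive setup.
// Assumed nginx side:
//   location /files/ { auth_request /auth; }
//   location = /auth {
//       internal;
//       proxy_pass http://127.0.0.1:8080/auth.php;   # wherever PHP is listening
//       proxy_pass_request_body off;
//       proxy_set_header Content-Length "";
//       proxy_set_header X-Original-URI $request_uri;
//       proxy_set_header X-Real-IP $remote_addr;
//   }

if (!isset($_SERVER['PHP_AUTH_USER'], $_SERVER['PHP_AUTH_PW'])) {
    // Any non-2xx status makes nginx deny the download; the Basic-auth
    // challenge itself would be configured on the nginx side.
    http_response_code(401);
    exit;
}

$db = new PDO('mysql:host=localhost;dbname=downloads', 'dbuser', 'dbpass');

// The question stores MD5 password hashes, so compare against md5() of the input.
// Table and column names (users, user_files, ...) are illustrative assumptions.
$stmt = $db->prepare(
    'SELECT COUNT(*) FROM users u
       JOIN user_files uf ON uf.user_id = u.id
      WHERE u.username = ? AND u.password_md5 = ? AND uf.path = ?'
);
$stmt->execute([
    $_SERVER['PHP_AUTH_USER'],
    md5($_SERVER['PHP_AUTH_PW']),
    $_SERVER['HTTP_X_ORIGINAL_URI'] ?? '',
]);

http_response_code($stmt->fetchColumn() > 0 ? 200 : 403);
The client supplies credentials via HTTP Basic auth, the requested filename arrives in the X-Original-URI header, and the client IP is available as $_SERVER['HTTP_X_REAL_IP'] if you want per-IP rules.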
P.S. Are you aware that nginx for Windows is not considered production-ready?
I installed the Google Cloud SDK and it dumped a .boto file directly into the My Documents folder (e.g. C:\Users\John), which is a wildly inappropriate location. I see many references to the .boto file in the SDK's Python files, a couple dozen instances/examples:
return os.path.join(self.LegacyCredentialsDir(account), '.boto')
os.path.expanduser(os.path.join('~', '.boto')),
Where do I go to change the path to something appropriate? An appropriate path would be something such as C:\Users\John\AppData\Roaming\gcloud\.boto, for example.
At the top of the file:
This file contains credentials and other configuration information needed
by the boto library, used by gsutil. You can edit this file (e.g., to add
credentials) but be careful not to mis-edit any of the variable names (like
"gs_access_key_id") or remove important markers (like the "[Credentials]" and
"[Boto]" section delimiters).
[Credentials]
Google OAuth2 credentials are managed by the Cloud SDK and
do not need to be present in this file.
To add HMAC google credentials for "gs://" URIs, edit and uncomment the
following two lines:
The latest versions of Boto don't seem to be a great fit for App Engine. I ran into this issue about a year ago, and I don't remember all of the details, but I avoided Boto3 and stuck with Boto 2.47, which worked well for me.
For my use case, I only needed help with SES. If you need many other AWS services, then YMMV.
Migrating WordPress sites between hosts can take a lot of time, especially when the hosting platforms are different.
I have been trying to migrate my sites from cPanel to MediaTemple, but it seems like I'm just not getting it right.
There are various options:
Use the guide they provide
https://kb.mediatemple.net/questions/1556/Migrating+your+websites+to+the+Grid#gs
When moving the files this way, the permissions of the files are not set properly, and I would have to go back through them and figure out which ones need to change.
The database export from phpMyAdmin also does not look the same as it does in the screenshot.
Using InfiniteWP
To use InfiniteWP you must provide the URL of the site, and since I don't want to change the DNS until the site is moved, this option does not seem ideal.
This option might work if it's OK for the sites to be unavailable for a day or so while the DNS resolves...
But I don't want the sites to be unavailable.
Using MediaTemple's "one-click apps" to install WordPress and then moving only the files that are unique to the site from the old host to the new host.
I would like to use this option
I think that the contents of the wp-content folder need to be moved, and that the database needs to be moved.
My questions are:
- What folders and files in a standard WordPress install typically hardly ever change from one site to another?
- Can I use the WordPress database export and import function to move the database from one site to the other?
Any help will be appreciated
Thank you
With InfiniteWP, you can use the "clone an existing site" command (which can be found in "Tool" -> "Install / Clone WP") to migrate a site to a new server.
You have to use a temporary (sub)domain pointing to the new server.
To answer your questions:
/wp-content/ stores all your uploads, themes, and plugin files; wp-config.php is where your configuration is stored (e.g. the credentials to access the database). Depending on your servers, the .htaccess files may also differ.
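One thing that almost always has to change when wp-config.php moves to a new host is the database credentials block. These define() constants are standard WordPress; the values below are placeholders, not your actual settings:
(PHP)
// Standard wp-config.php constants; update the values for the new host.
define('DB_NAME',     'new_db_name');        // the database you created at the new host
define('DB_USER',     'new_db_user');
define('DB_PASSWORD', 'new_db_password');
define('DB_HOST',     'localhost');          // some hosts use a dedicated DB hostname instead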
I would recommend creating a dump file of your entire database using phpMyAdmin.
This seems like something that should be easy to find, but I've tried every combination of search terms I could think of, and all I could find were answers that were "close but no cigar". After spending over half an hour looking, I finally decided to ask.
What I am trying to do, explicitly worded, is to ensure that the files my users upload to or download from my web pages are encrypted during the transfer. I am not satisfied with just throwing https:// onto the beginning of the files' links, because these files need to be password protected. In order to password protect them, of course, I have set the directory permissions such that the files inside cannot be accessed via URLs at all. I am using a PHP script to manage the uploads and downloads.
I have tried checking the php.net pages on topics like header() and mcrypt_encrypt() and have come up empty-handed. The page on header() appears to apply to HTTP only and doesn't tell me how to use an encrypted protocol for a file download (if that's how one does it). And I can't use mcrypt_encrypt() on the assumption that mcrypt_decrypt() can just be run later to make the files usable, because mcrypt_decrypt() obviously can't be run client-side after a download (nor can mcrypt_encrypt() be run client-side before an upload). So I am left wondering what method would let the users' browsers encrypt and decrypt these files in a way that requires no action from the user, the same way everything else is encrypted and decrypted.
I'd like to assume that the fact that I am enforcing HTTPS on these web page URLs will automatically take care of it, the way it takes care of the web page output. However, I observe that files with separate file paths, like images and CSS, are not automatically encrypted, and that the code I'm using to trigger those file-download boxes contains header information, implying that it's a separate transaction and perhaps not encrypted.
I have really, really thought about this from a whole bunch of angles and I'm just not seeing the solution. Anyone want to help me?
Use HTTPS for secure (encrypted) delivery of data. Store the files in each user's folder as you're doing, and only allow access after authentication (over HTTPS).
The reason you're having a hard time finding another solution is because HTTPS is the solution.
If you want to store the files encrypted on disk, you can encrypt them with a symmetric block or stream cipher as they're uploaded and do the reverse as they're downloaded. You could use a secret key that's unique per user as the symmetric key.
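For that at-rest encryption, here is a minimal sketch using PHP's openssl extension (rather than the now-deprecated mcrypt functions the question mentions); the cipher choice, key handling, and paths are assumptions:
(PHP)
<?php
// Hedged sketch: encrypt a file as it is uploaded, decrypt it as it is downloaded.
// $userKey would be a per-user secret from your records; paths are placeholders.

function encryptFile(string $plainPath, string $encPath, string $userKey): void {
    $iv = random_bytes(openssl_cipher_iv_length('aes-256-cbc'));
    $cipher = openssl_encrypt(file_get_contents($plainPath),
                              'aes-256-cbc', $userKey, OPENSSL_RAW_DATA, $iv);
    file_put_contents($encPath, $iv . $cipher);  // store the IV with the ciphertext
}

function decryptFile(string $encPath, string $userKey): string {
    $data  = file_get_contents($encPath);
    $ivLen = openssl_cipher_iv_length('aes-256-cbc');
    return openssl_decrypt(substr($data, $ivLen),
                           'aes-256-cbc', $userKey, OPENSSL_RAW_DATA,
                           substr($data, 0, $ivLen));
}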
(edit: I'm leaving all the mistaken assumptions in just in case someone else makes the same mistakes)
I have an ancient Typo3 3.8.1 site on a remote server. I don't have access to that server, and the team in charge of maintaining the site doesn't know who to contact to get access to the server. I do have the admin rights on that site, though. (edit: no I don't. oops.)
This is what I see in the (not) admin menu:
I'm not sure if this version supports extensions, I can't find an extension manager anywhere. (because I'm not an admin)
I want to export the site so I can host it on a server on my own domain instead. The problem is that the export file is too large for me to download. Will I destroy the directory structure if I export a bunch of pages at a time?
If you have admin access to the backend, you can try to install Quixplorer (a file manager). Using it, you can try to zip the folders in the main directory (i.e. typo3, typo3conf, fileadmin, etc.) one by one and download them via the browser.
It's important to download and remove typo3conf.zip from the server as soon as possible, because it contains sensitive data.
Additionally, you can also install the phpMyAdmin extension (search the repository) if you don't have another MySQL client.
Edit:
If you can't use Quixplorer, the only way is... to write your own extension and upload it via the Extension Manager; there you'll need to perform primitive file-system operations like:
(PHP)
system('zip -r t3c.zip typo3conf/');  // -r recursively zips the directory
Sometimes the server allows more memory and execution time than the T3D export uses by default. So, if you can change PHP files on that server, try editing typo3/sysext/impexp/class.tx_impexp.php: search for ini_set and change those settings. If the server allows it, you can then create bigger T3D files.
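The lines to look for are plain ini_set() calls; raising them to something like the following is the idea (the exact values, and whether the host honors them, are assumptions):
(PHP)
// Near the existing ini_set calls in class.tx_impexp.php; values are guesses.
ini_set('memory_limit', '512M');         // let the export build a larger T3D file in memory
ini_set('max_execution_time', '3600');   // give the export more time to finish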
And you could try some shell extensions to get your hands on that server:
http://typo3.org/extensions/repository/view/phpshell
http://typo3.org/extensions/repository/view/mw_shell
http://typo3.org/extensions/repository/view/shell
But to answer your initial question: you can create a couple of T3D files and import them again. Just force UIDs when you import them, and install all needed extensions first!
We have made a Silverlight application where users can preview audio files in their browser with the Telerik RadMediaPlayer control.
The files are on a web server, and anyone who sniffs the traffic can download them.
We would like to prevent non-logged-in users from accessing/downloading these files.
Besides providing the application with some sort of temporary valid URL and implementing a custom HttpHandler, what are our options?
It's not too big of a problem if our customers can download the files; we just don't want the rest of the world to also have access.
Any ideas would be more than welcome!
[Update]
The only thing I can come up with is:
host the files in a non-public folder
if a user requests to preview a file, copy it to a public folder under a new name ([guid].mp3) and return its URL
every x minutes clean the public folder.
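A rough sketch of that idea in PHP (the folder names, URL, and cleanup window are placeholder assumptions):
(PHP)
<?php
// Hedged sketch of the temporary-public-copy idea; paths and timing are assumptions.
function publishTemporaryCopy(string $privatePath): string {
    $name = bin2hex(random_bytes(16)) . '.mp3';       // unguessable name in place of a GUID
    copy($privatePath, __DIR__ . '/public_audio/' . $name);
    return 'https://example.com/public_audio/' . $name;
}

// Run every x minutes (e.g. from a scheduled task): delete copies older than 10 minutes.
foreach (glob(__DIR__ . '/public_audio/*.mp3') as $file) {
    if (filemtime($file) < time() - 600) {
        unlink($file);
    }
}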
Don't let the web server serve up the files straight out of a directory. Put part of your application in front, and let one of your server-side scripts serve up these files. Keep the raw audio files out of the web root.
For instance, your client-side application would access files like so:
http://someserver/yourscript?audio_asset_id=12345
The code at yourscript would verify the session data, ensuring that a user is logged in, would then go figure out the real path to asset ID 12345, and echo its contents to the client. Don't forget to include the proper Content-Type header as well.
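A minimal sketch of what yourscript could look like in PHP, assuming a session-based login; the table, column, and path names are made up for illustration:
(PHP)
<?php
// Hedged sketch of a gatekeeper script; adapt the names and auth check to your app.
session_start();
if (empty($_SESSION['user_id'])) {        // assumes your login flow sets user_id
    http_response_code(403);
    exit;
}

$assetId = (int) ($_GET['audio_asset_id'] ?? 0);

// Map the asset ID to a real path outside the web root; names are illustrative.
$db   = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'dbpass');
$stmt = $db->prepare('SELECT path FROM audio_assets WHERE id = ?');
$stmt->execute([$assetId]);
$path = $stmt->fetchColumn();

if ($path === false || !is_file($path)) {
    http_response_code(404);
    exit;
}

header('Content-Type: audio/mpeg');            // use the real type of the asset
header('Content-Length: ' . (string) filesize($path));
readfile($path);                               // stream the raw bytes to the client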
Once the accessing of these assets is under your control, you can implement whatever security measures you like. If your sessions are already pretty well safeguarded, this should be fine. I would also recommend implementing sane quotas. If you get 100 requests for an asset using the same session ID from multiple IP addresses... something isn't right.