I want something that creates a full list of all files/paths at a domain (mine),
including size and modification date. I want the list to begin all the way at
the root - not just past /public_html. I'd want to run this from my Windows 7
64-bit PC and have the list saved on my PC.
I do NOT want to download all the files!
Is there a Windows 7 64-bit tool I can use to accomplish this?
When you say files/paths "at a domain", you have a misunderstanding in general: a domain is basically a name that points to a resource (see here).
If this sounds kind of vague, it's because it is. Multiple computers can host a domain (i.e. serve up resources for the same domain), and the resources they serve don't have to be files at all. You can point your browser at http://somesite/somefile.html, and that "somefile.html" may not exist at all (yet the site could still return a webpage).
You can't (in general) list all the files/paths at a "domain", but if you have access, you can certainly do that for one or more computers. Certain websites may provide a way to get a directory listing, but even then it would just be from the "DocumentRoot" (in Apache terms) of the website (not from root).
EDIT: If your domain is hosted on a single computer and you have full FTP access, you could use something like the Python script in the answer here to get a remote directory listing of that computer. You will probably need to change the line that says this:
ftp.login()
to this:
ftp.login(user='your username', passwd='your password')
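Putting it together, a minimal sketch of such a recursive remote listing might look like the following. This assumes the FTP server supports the MLSD command; the host and credentials are placeholders, not anything from the linked answer.

```python
# Minimal sketch of a recursive remote listing over FTP, assuming the
# server supports the modern MLSD command. Host and credentials below
# are placeholders -- substitute your own.
from ftplib import FTP

def list_recursive(ftp, path="/", out=None):
    """Collect (path, size, modification-time) tuples for every file under path."""
    out = [] if out is None else out
    for name, facts in ftp.mlsd(path):
        if name in (".", ".."):
            continue
        full = path.rstrip("/") + "/" + name
        if facts.get("type") == "dir":
            list_recursive(ftp, full, out)   # descend into subdirectory
        else:
            out.append((full, facts.get("size"), facts.get("modify")))
    return out

if __name__ == "__main__":
    ftp = FTP("ftp.example.com")  # placeholder host
    ftp.login(user="your username", passwd="your password")
    for path, size, modify in list_recursive(ftp):
        print(path, size, modify)
    ftp.quit()
```

You could redirect the output to a file on your PC to get the saved list you asked for.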
While it may seem like the same thing, what you're really asking for is a remote directory listing of a computer, not a domain (even if a dns lookup resolves your domain to a computer).
I have configured Nginx with autoindex module to enable Directory Listing. But I want to extend this feature to enable file editing as well and saving it.
The thing is, I have some private IPs that need to be monitored. I added those IPs to a file and wrote a script that takes the IPs from the file and monitors them by pinging. These IPs sometimes change due to DHCP, and apart from the system admins, no one here is proficient with the terminal. Hence I wanted to provide a web UI, so that the people concerned can change an IP at any time through a webpage. I know this is possible with code, but since I am not a developer, I was looking for a way to do it here. Is it possible?
No, it's not possible using nginx alone.
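To illustrate what the missing piece looks like: nginx can only serve the directory listing, so editing and saving needs a small application behind it. A stdlib-only Python sketch (the file name and port are assumptions, not anything from the question) could be as simple as:

```python
# Sketch of the small backend nginx alone cannot provide: GET shows the
# IP file in an editable form, POST saves the edit. The file name and
# port are illustrative assumptions.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs

IP_FILE = "ips.txt"  # hypothetical path to the monitored-IPs file

class EditHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        with open(IP_FILE) as f:
            body = f.read()
        page = ('<form method="post"><textarea name="ips">%s</textarea>'
                '<input type="submit" value="Save"></form>' % body)
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(page.encode())

    def do_POST(self):
        length = int(self.headers["Content-Length"])
        data = parse_qs(self.rfile.read(length).decode())
        with open(IP_FILE, "w") as f:
            f.write(data.get("ips", [""])[0])
        self.send_response(303)          # redirect back to the form
        self.send_header("Location", "/")
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), EditHandler).serve_forever()
```

nginx would then proxy_pass to this app (and should sit in front of it with authentication, since this sketch has none).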
I will be running a website on a dedicated server, where users will be allowed to upload files (I am not checking those files for viruses).
Let's say the folders are laid out like so -
parent_folder -> website (folder which contains server files)
parent_folder -> uploads (folder which contains user uploads)
Question. Will file permission on website folder as 750, and on uploads folder as 770 suffice?
(Note1: Owner - Root; Group - Apache; website is a social network)
(Note2: As Apache (www-data) will handle uploads, I believe 770 may be required in place of 740)
(Note3: As Apache (www-data) will only need to read files in website folder, therefore 750)
(Note4: Server will be maintained by a single user)
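For concreteness, the 750/770 scheme described in the notes above can be sketched with the stdlib (the directory names are the question's examples; the chown to root:www-data is omitted because it needs root privileges):

```python
# Sketch of the permission scheme above: website readable but not
# writable by the Apache group, uploads writable by it. Directory
# names are the question's examples.
import os

website = "parent_folder/website"
uploads = "parent_folder/uploads"
os.makedirs(website, exist_ok=True)
os.makedirs(uploads, exist_ok=True)

os.chmod(website, 0o750)  # owner rwx, group r-x: Apache can read, not write
os.chmod(uploads, 0o770)  # owner rwx, group rwx: Apache can also write
```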
I think whether it's good enough depends on what risk you are willing to take. As basic protection, it may be good.
What you have to consider is what would happen if files were stolen, either by an unauthenticated attacker or by another user: presumably there is some kind of access control under which users can reach their own files, in which case an obvious risk is one user being able to access another's files. How would it affect your website / business if this happened? Are you willing to take this risk? Who are the attackers you are trying to protect against? Opportunist script kiddies? Knowledgeable attackers specifically targeting your site? Nation-state-level attackers? You have to answer these questions first.
The drawback of this solution is that any application- or operating-system-level breach will most likely give access to all uploaded files of all users. Should your application be vulnerable to local file inclusion, directory traversal, even SQL injection (with certain databases), or many other vulnerability classes, your files can be downloaded. Whether you want to protect them further depends on your use case.
More secure options include storing the files on a separate server (some kind of file repository), encrypting them with user-specific keys (though key management, and cryptography in general, is a huge topic in itself), implementing MIME-type identification with whitelists to only allow certain file types (a slightly different aspect), or even adding end-to-end encryption to protect against yourself (operations staff), etc. You can go to great lengths to protect these files if you want to, but it gets very complex very quickly.
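As a sketch of the MIME-type whitelist idea just mentioned (using only the stdlib's filename-based `mimetypes`; a production check should also inspect file content, e.g. with a magic-number library, and the allowed set here is an illustrative assumption):

```python
# Whitelist sketch: accept an upload only if its guessed MIME type is
# in an allowed set. The allowed set is an illustrative assumption.
import mimetypes

ALLOWED = {"image/png", "image/jpeg", "application/pdf"}

def is_allowed(filename):
    guessed, _ = mimetypes.guess_type(filename)  # filename-based guess only
    return guessed in ALLOWED

print(is_allowed("report.pdf"))   # True
print(is_allowed("malware.exe"))  # False
```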
In a domain with AD Sites and Services configured is it possible to get the Site of a computer from LDAP? Is it stored as an attribute?
Unless this has changed over the last couple of years outside of my knowledge, there is no such attribute. Historically this was never done, because AD site knowledge was considered ephemeral: the assumption was that computers move around, so storing where they are would be silly. There was also no global need for the information.
You could of course add this yourself: extend the schema with a new attribute, and set a start-up script on your domain-joined machines to write the site to the directory (if it has changed since they last wrote it). Obviously you'll want to test this well to ensure it doesn't create more problems than it solves.
From the Win32 point of view, you've got the DsAddressToSiteNamesEx API. I don't know how to find it using pure LDAP.
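As a sketch of that Win32 route, the simpler DsGetSiteName cousin of that API can be called from Python via ctypes. It is only meaningful on a domain-joined Windows machine; the function below returns None elsewhere.

```python
# Sketch: ask netapi32's DsGetSiteNameW for the local machine's AD site.
# Returns None on non-Windows hosts or machines not joined to a domain.
import ctypes
import sys

def get_ad_site():
    if sys.platform != "win32":
        return None
    netapi32 = ctypes.WinDLL("netapi32")
    site = ctypes.c_wchar_p()
    if netapi32.DsGetSiteNameW(None, ctypes.byref(site)) != 0:
        return None  # e.g. no site name available on non-domain machines
    try:
        return site.value
    finally:
        netapi32.NetApiBufferFree(site)  # free the buffer the API allocated

print(get_ad_site())
```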
So the problem is I'm not used to FTP clients and managing files with apps and directories. WebFaction has you connect to the domain and server (username.webfactional.com, port 22), and once you're in, you see the files already there (like bin, lib, webapp -> django, etc.). What I don't get is: how do I get my source code files uploaded so that they appear on my domain (cooldomainname.com)?
I think the part you are missing here is how WebFaction lays out the structure for your account, applications, and web sites. You can see some of the ways they set things up here:
http://docs.webfaction.com/user-guide/examples.html
The answer to your question is:
You need to put the files in the /home/username/webapps/appname directory (depending on the file type, e.g. if it is static or PHP and you have a Static/CGI/PHP application set up). You can view your apps by looking through the information in the WebFaction docs and following them through the control panel to the information you need.
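Since the connection is over SSH (port 22), the upload itself is just a copy into that directory. A sketch of building the scp invocation, where the username "alice" and the app name "django" are placeholders, not your actual account details:

```python
# Build the scp command that uploads a local directory into a
# WebFaction app directory over SSH (port 22). Username, app name,
# and local path are placeholders.
def scp_command(user, local_dir, app):
    host = "%s.webfactional.com" % user
    dest = "/home/%s/webapps/%s/" % (user, app)
    return ["scp", "-P", "22", "-r", local_dir, "%s@%s:%s" % (user, host, dest)]

print(" ".join(scp_command("alice", "./my_site", "django")))
```

Running the printed command (or passing the list to subprocess.run) copies ./my_site into the app directory that serves your domain.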
I'm creating a little app that configures a connected device and then saves the config information in a file. The filename cannot be chosen by the user, but its location can be chosen.
Where is the best place for the app's default save-to folder?
I have seen examples out there where it is the "MyDocuments" location (e.g. Visual Studio does this).
I have seen a folder created right at the top of the C:\ drive. I find that to be a little obnoxious, personally.
It could be in Program Files\[Manufacturer] or Program Files\[Product Name], or wherever the app was installed. I have used this location in the past; I dislike it because Windows Explorer does not let a user browse there very easily ('browsability').
Going with this last notion that 'browsability' is a factor, I suppose MyDocuments is the best choice. Is this the most common, most widely accepted practice?
I think historically we have chosen the install folder because that co-locates the data with the device management utilities. But I would really like to get away from that. I don't want the user to have to go pawing through system files to find his/her data, esp if that person is not too Windows-savvy.
Also, I am using the .NET WinForms FolderBrowserDialog, and the Environment.SpecialFolder enum isn't helpful for pointing the dialog into the Program Files folder.
Thanks for your input!
Suz.
User data belongs in the user's folder. The (utopian) idea there being that they need only back up their personal folder, and should their computer die a sudden fiery death they would have everything they need to get their computer back up in working order. If all their personal data is scattered across the computer it only serves to confuse the user and destabilize your product.
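To illustrate the principle (sketched here in Python; in .NET you would get the real path from Environment.SpecialFolder.MyDocuments, and both the "Documents" folder and the app-name subfolder below are assumptions):

```python
# Default the save location to a folder under the user's profile
# rather than the install directory. "Documents" and "MyDeviceApp"
# are illustrative assumptions.
import os

default_dir = os.path.join(os.path.expanduser("~"), "Documents", "MyDeviceApp")
os.makedirs(default_dir, exist_ok=True)  # create it on first run
print(default_dir)
```

A per-app subfolder keeps the user's Documents tidy while still keeping everything inside the folder they already back up.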
Opinion: all this Documents-and-Settings stuff with lots of spaces in the paths is really misguided, including "My Documents". You always end up having to type the paths manually at the command line. I would choose my own (NIH) structure on the user's hard drive; the user will only say thanks.