Snow Leopard Server, AFP, ACL on share points doesn't seem to work - osx-snow-leopard

I have a server set up with a series of share points on it. One of those folders is "Marketing", and the Marketing team has a boss who is the only person who should be able to create folders within that share.
The people in the Marketing group should be able to write and delete files, but not create folders.
Here's the setup:
Users/Groups:
John (boss)
Roger (one of the Marketing employees)
John and Roger both belong to the Marketing group
Share point ACL:
Marketing / Allow / Custom (full read, write attributes, write extended attributes, create files, delete), but not create folder or delete subfolders and files
John / Allow / Full Access
As a last detail, all users are on Apple computers.
Problem
John is able to do everything he should: create files, create folders, and so on.
Roger, on the other hand, can only read.
Question
Does the ability to write files require the ability to create folders? If I allow the Marketing group to create folders, Roger's access works. If so, is there a workaround?
Thanks!

Just got off the phone with Apple support; they confirmed a bug in the ACL system. Hopefully they patch it soon. The absence of Create Folder adversely affects Create File.
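For anyone who wants to verify this outside Server Admin, the same ACL entries can be inspected and set from Terminal. A minimal sketch, assuming a hypothetical share path of /Shares/Marketing, a group short name of marketing, and a user short name of john:
# Show the existing ACL entries on the share point (hypothetical path)
ls -le /Shares/Marketing
# Marketing group: read, create/delete files, but no add_subdirectory (Create Folder)
# and no delete_child (Delete Subfolders and Files)
sudo chmod +a "group:marketing allow list,search,readattr,readextattr,writeattr,writeextattr,add_file,delete,file_inherit,directory_inherit" /Shares/Marketing
# The boss gets full control
sudo chmod +a "user:john allow list,search,add_file,add_subdirectory,delete,delete_child,readattr,writeattr,readextattr,writeextattr,readsecurity,writesecurity,chown,file_inherit,directory_inherit" /Shares/Marketing
If the bug above is the culprit, the only immediate workaround appears to be adding add_subdirectory back to the group entry (and accepting that the group can create folders) until a patch lands.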

Related

What is the most appropriate way to store files?

I am dealing with a problem, and before starting development I decided to do some research.
The problem: what is an efficient way to store files?
I read about it, and some people were against storing files in the database, as it has a negative impact on backups/restores, adds processing time when reading large files from the database, and so on.
A good option would be to use S3 or another cloud solution to store the files, but for this particular customer the cloud is not an option.
Another option would be to store the files in the file system. The concept is clear, but I am trying to work out what I need to understand before implementing that solution.
For example, we need to consider how to structure the directories: if we store 100,000 files in one directory it can become slow to open, and there is also a maximum number of files that can be stored in one directory.
Are there any third-party tools that help manage files in the file system, automatically partitioning them and placing them in directories?
I work with software that has more than 10 million files in the file system. How you structure the folders depends on your case, but here is what I did:
Create a folder for each entity (Document, Image, ...).
Inside each entity folder, create a folder for each object ID, with the ID as the folder name, and put that object's files inside it (but this could vary).
Example for a Person with ID 15:
ls /storage/Person/15/Image/
will give me these 4 images, which in the database are linked to the person with ID 15:
Output:
1.jpg
2.png
3.bmp
4.bmp
If you have a HUGE number of elements, you could separate each digit of the ID into its own subfolder; for example, a Person with ID 16549 would have this path: /storage/Person/1/6/5/4/9/Image/
Regarding the limit on files per folder, I suggest you read this thread: https://stackoverflow.com/a/466596/12914069
I don't know of a third-party tool for this, but you could certainly build the logic into your own code.
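For example, a rough shell sketch of the digit-splitting idea (the /storage/Person layout is the one from the example above; the ID 16549 and the file name photo.jpg are just placeholders):
# Fan the ID 16549 out into /storage/Person/1/6/5/4/9/Image/
id=16549
dir="/storage/Person/$(echo "$id" | sed 's|.|&/|g')Image"
mkdir -p "$dir"
cp photo.jpg "$dir/1.jpg"   # store the file under its per-object folder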
English isn't my first language, so tell me if you didn't understand something.

Uploading and saving files in CakePHP 2.1

I want to save my uploads in a specific folder for each app_id. For example, when an applicant with an app_id of 45 uploads files, they must be saved in a folder named 45, and all of his/her uploads (birth certificate, college certificate, profile picture, high school certificate) will be in that folder. Likewise, every other app_id, e.g. 89, must have its own folder, with all the upload folders located under one parent folder, for example webroot/files/uploads/....
I managed to implement Uploader v3.6 from Miles Johnson, so any help on that plugin, or any other solution, would be welcome.
Any tutorial or link for tackling this challenge would help too.
Thank you
A huge security fail in your idea is storing high school certificates in the publicly (!) accessible /webroot/files/uploads folder. You can be sure to get into pretty unpleasant situations in the many countries that have privacy laws if everyone can download these private documents from your server.
Storing the files in folders 1, 2, 3... based on the incremental IDs is also a bad idea performance-wise: read the section "File System Performance" of this article.
Storing all the files attached to a user in the same folder might be a good idea if you want to delete all of them in one action when you delete the user, but you should not forget to sanitize the file names properly. And again: if you store thousands of files in the same (sub)folder, you might run into issues.
For a file storage solution for CakePHP 2.x, I suggest you read the readme.md of my FileStorage plugin, which covers nearly all the common problems and gives you a solid base for file storage.
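As a starting point, here is a minimal sketch of a layout that keeps the documents out of webroot (the exact paths and the www-data user are assumptions and will differ per setup); the application then streams files to authenticated users instead of linking to them directly:
# Keep uploads outside the publicly served webroot (hypothetical paths)
mkdir -p /var/www/app/files/uploads
chown -R www-data:www-data /var/www/app/files/uploads
chmod -R 750 /var/www/app/files/uploads   # no access for other users on the box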

Automatically append or prepend a date or random number to the names of files uploaded to SharePoint

I have somehow turned into the person who oversees my organization's SharePoint, and I have been tasked with finding a way to append or prepend a date or random string to the file name of any file uploaded to any of our document libraries, in an effort to prevent naming collisions.
I understand that SharePoint will block uploading files with the same name, but I would like to just append or prepend a date or string to the file name to bypass the whole issue. Our users aren't the most tech-savvy, so automatically renaming their files for them would help both us and them.
Is there currently a way to do this in SharePoint's settings? I've looked into versioning and Document IDs, but so far neither prevents the naming issue. Is there a plug-in of some sort I could use, or do I need to have code written and added to SharePoint?
Thanks!
The Content Organizer feature of SharePoint 2010 allows you to set a Duplicate Submissions option for documents:
"Duplicate Submissions
This option specifies whether to use SharePoint versioning or append unique characters to the end of duplicate file names if a document is uploaded that has the same name as a document that is already in the destination library."
Play around with this to see if it is going to help before worrying about a more complex coded solution. Your users may also appreciate the drop box approach to document uploading.

Informix-SQL (SE) on the cloud with WinTerm thin clients

I have several customers, each running a customized version of my ISQL (SE) desktop app. I would like to replace their desktop apps with WinTerm thin clients connecting to the cloud (my ISQL app on an openSUSE or Red Hat server). If this can be done, I would like each customer to use one standardized version of my app, to simplify updates and support. However, I'm not sure of the best way to design the database. Should each customer have their own database.dbs (DBPATH=), or is there a better design?
You have two options, one of them that you've already considered:
Each customer has their own database with the standard name.
Each customer has their own database with a separate name for each.
The advantage of option 1 is that your existing code would work substantially unchanged; you'd simply have to ensure that the setting of DBPATH is correct for each customer. The downside is that you need to create a separate directory for each customer too. However, this has its merits; you can probably more easily keep Customer A from seeing any of the files generated by Customer B (and vice versa) if each customer has their own separate home directory, and their database is located in that directory.
The advantage of option 2 is that you can put all the customers' databases in a single directory. As mentioned, this is apt to make it easier for Customer A to see stuff that he should not see that belongs to Customer B. You would also have to ensure that you override the default database name every time you run a command - whether that's sperform or sacego or anything else.
Between the two, I would go with option 1 (separate databases with common name in different directories), with rigid walls between customers. Each customer would have their own user name and group, and the permissions on the directories would avoid public access of any sort. They can all still use a single INFORMIXDIR and your associated forms and reports.
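A minimal sketch of option 1, assuming hypothetical customer accounts (custa, custb, ...), a shared INFORMIXDIR of /opt/informix, and the database living in each customer's home directory:
# One Unix account, group, and home directory per customer (run as root)
groupadd custa
useradd -m -g custa -d /home/custa custa
chmod 700 /home/custa                # Customer A cannot see Customer B's files
# In each customer's login profile: one shared product tree, per-customer DBPATH
export INFORMIXDIR=/opt/informix
export PATH=$INFORMIXDIR/bin:$PATH
export DBPATH=/home/custa            # the standard-named database.dbs lives here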

Ubuntu Webserver Permissions

Now I've bought an Ubuntu cloud server. I installed Webmin this morning, and now I have a question. Is it possible to create users that can only use one directory, plus SVN, FTP, PHP, Python, MySQL, and Apache? So, for example, user Kevin may only use /var/www/kevin/? Also, after the directory is created, an SVN checkout should fill it. What is the best way to manage all the databases? Many people will work with the database, so how can I keep it safe? Is it possible to back up the database every time a query is executed?
Many many thanks,
Regards,
Kevin
This might be better suited on https://askubuntu.com/
That said, backing up a database after every query sounds like a recipe for horrible performance and probably no real benefit. Configuring or modifying your application to send audit logs to another machine would probably be more approachable.
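If the goal behind "back up after every query" is being able to recover to any point in time, a more practical combination (assuming MySQL, which the question mentions) is a nightly dump plus the server's binary log, which records every change and can be replayed with mysqlbinlog. A rough sketch; the paths, file names, and timestamp are placeholders:
# In the MySQL server configuration ([mysqld] section): log_bin = /var/log/mysql/mysql-bin
# Take a consistent nightly dump; the binary logs cover everything in between:
mysqldump --all-databases --single-transaction --master-data=2 > /backup/nightly.sql
# To recover, restore the dump and replay the binlog up to the desired moment:
mysqlbinlog --stop-datetime="2012-01-01 12:00:00" /var/log/mysql/mysql-bin.000123 | mysql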
You can configure your standard Unix permissions to allow kevin to write only in /var/www/kevin. Restricting which programs kevin can run would probably require a tool more like AppArmor, SElinux, TOMOYO, or SMACK. Any of these mandatory access control tools can prevent a user from executing untrusted programs or provide an extra layer of security on top of the standard Unix permissions.
I've been working on AppArmor for over a decade now, and it'd be the tool I'd pick first for this job, but the other tools are excellent and might be a better fit for your environment. (AppArmor may already be pre-installed. Check aa-status(8) output to see. :)
But first make sure your Unix permissions are right -- old-school they may be, but they are superb.
Update
But how can I make it so that when 'kevin' signs in over SSH, he automatically goes to the directory /var/www/kevin/ (and can't go to /var/www/ or directories below it)?
You could add a cd /var/www/kevin command to kevin's ~/.bash_profile or ~/.profile file. This might be more annoying than useful. (I don't recommend setting kevin's home directory (in /etc/passwd) to /var/www/kevin because that would store ~/.bash_history and ~/.ssh/* information in /var/www/kevin/.bash_history and /var/www/kevin/.ssh/, potentially exposing too much of kevin's private information.)
To allow kevin to enter into /var/www/kevin/, kevin will need to be able to enter /var/www -- but he doesn't necessarily need to see the contents of /var/www:
root:root 755 /var
root:root 751 /var/www
kevin:kevin 755 /var/www/kevin
other:www 750 /var/www/other
priv:www 750 /var/www/private
If your webserver runs with a group or supplementary group www, it will be able to traverse and read all these directories. Kevin cannot. (Assuming kevin is not in the group or supplementary group www.) Kevin can cd /var/www, and if kevin guesses /var/www/other or /var/www/private, he can determine that they exist, but he cannot actually enter the directories or list their contents.
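The listing above could be applied with something like the following (a sketch; the other and priv accounts are just the placeholders from the example):
# Apply the ownership and modes shown above (run as root)
chown root:root   /var/www         && chmod 751 /var/www
chown kevin:kevin /var/www/kevin   && chmod 755 /var/www/kevin
chown other:www   /var/www/other   && chmod 750 /var/www/other
chown priv:www    /var/www/private && chmod 750 /var/www/private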
