How to create files with specific permissions in Grails

I'm creating an app in Grails that generates multiple files as process output and puts them in folders that other people will later access over FTP.
Everything works great, except that in production the newly created files are readable only by the user that runs Tomcat, meaning that when somebody connects to the folder over FTP the files can't be opened because they don't have permission.
Is there any way to set permissions from Grails, or to configure Tomcat so that every output file can be accessed by other users?

This might help. You can also look into executing shell commands but that may not be the best option.

I found out that there is actually a method on the File class that changes the permissions of a File instance. When I first tried it, it only changed the permissions for the owner, but with a slight change to the parameters you can tell it to apply to other users too:
File.setReadable(boolean readable)
File.setReadable(boolean readable,boolean ownerOnly)
So in my case, file.setReadable(true, false) did the trick.
Check out the methods of java.io.File for more info.
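A minimal sketch of what that can look like from Grails/Groovy code (the output path and the generateOutput() helper below are placeholders for illustration, not part of the original answer):

def outputFile = new File('/var/ftp/output/report.csv') // placeholder path
outputFile.text = generateOutput() // hypothetical helper producing the file contents

// second argument false = apply the permission to everybody, not just the owner
outputFile.setReadable(true, false)
// setWritable and setExecutable take the same ownerOnly flag, if you need them

On POSIX systems, java.nio.file.Files.setPosixFilePermissions offers finer-grained control if the java.io.File setters aren't enough.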

Related

Can one ever access a file stored in Heroku's ephemeral file system via the browser?

I've been writing an app to deploy on Heroku. So far everything has been working great. However, there's one last step that I haven't been able to solve: I need to generate a CSV file on-the-fly from the DB and let the user download the file.
On my local machine I can simply write the file to a folder under the web app root, e.g. /output, and then redirect my browser to http://localhost:4000/output/1.csv to get the file.
However, the same code fails again and again on Heroku no matter how I try to modify it. It either complains about not being able to write the file or not being able to redirect the browser to the correct path.
Even if I manually use heroku run bash and create an /output folder in the project root and create a file there, when I try to direct my browser there (e.g. https://myapp.herokuapp.com/output/1.csv), it simply says "Page not found".
Is it simply impossible to perform such an action on Heroku after all? I thought since we are free to create files on the ephemeral file system we should be able to access it from the browser as well, but things seem to be more complicated than that.
I'm using Phoenix framework and I already added
plug Plug.Static,
at: "/output", from: Path.expand("output/"), gzip: false
to my endpoint.ex. Apparently it works on localhost but not on Heroku?
I'll turn to Amazon S3 if what I'm trying to do is indeed impossible. However, I want to avoid S3 as much as possible, since this should be a really simple task and I don't want another set of credentials and extra complexity to manage. Or is there any other way to achieve what I'm trying to do without having to write the file to the file system first and then redirect the user to it?
I know it doesn't strictly answer your question, but if you don't mind generating the CSV every time it is requested, you can use Phoenix.Controller.send_download/3 and serve an arbitrary payload as a download (in your case, the contents of the CSV).
Naturally, you could store the generated CSVs somewhere (like the database or even ETS) and generate them "lazily".
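A minimal sketch of that approach, assuming a stock Phoenix app (the module names and the CSV-building stub are placeholders):

defmodule MyAppWeb.ReportController do
  use MyAppWeb, :controller

  def download(conn, %{"id" => id}) do
    send_download(conn, {:binary, build_csv(id)},
      filename: "#{id}.csv",
      content_type: "text/csv"
    )
  end

  # Placeholder: in reality this would query the DB and render the rows to CSV.
  defp build_csv(_id), do: "id,name\r\n1,example\r\n"
end

Since the CSV never touches the file system, this sidesteps the ephemeral-storage problem entirely.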

Installation of Joomla 3.0 never finishes

I was trying to install Joomla 3.0. I have done everything necessary, but the page shown in the image stays there for a long time. The database tables are created, but the installer never moves on to the next step. Can anyone help me out? TIA
With the very little information you provide, I can tell you this:
Be sure you are selecting MySQL and not MySQLi, and that the user has all permissions granted on the database you provided.
In your server's home directory (which is the lowest directory for shared hosting users), make a new file called "phprc" inside the .php folder. If the folder doesn't exist yet, create it. The period in front of the folder name means that it's hidden, so make sure you can see hidden files in your FTP client, or use the command "ls -a" to see all files on the command line.
Add the following lines to the phprc file, then save it:
Code:
max_execution_time = 3000
memory_limit = 128M
Normally, on a shared host it can take a few minutes for the change to take effect, so try again after 5 or 10 minutes and you should see it working.

IIS - Releasing handle to folder by w3wp.exe so it can be renamed

I have a TFS build process that drops outputs on sandbox which is another server in the same network. In other words, the build agent and sandbox are separate machines. After the outputs are created, a batch script defined within the build template does the following:
Rename existing deployment folder to some prefix + timestamp (IIS can now no longer find the app when users attempt to access it)
Move newly-created outputs to deployment location
The reason I want to rename and move files instead of copy/delete/overwrite is that the latter takes a lot of time because we have so many files (over 5,500). I'm trying to complete builds in the shortest time possible to increase developer productivity. I plan to create a Windows service that periodically deletes dump folders and drop-folder artifacts so sandbox doesn't fill up.
The problem I'm facing is IIS maintains a handle to the original deployment folder so the batch script cannot rename it. I used Process Explorer to see what process is using the folder. It's w3wp.exe which is a worker process for the application pool my app sits in. I tried killing all w3wp.exe instances before renaming the folder, but this did not work. I then decided to stop the application pool, rename the folder, and start it again. This did not work either.
In either case, Process Explorer showed that there were still open handles to my outputs, except this time the owner wasn't w3wp.exe but something along the lines of "unidentified process". At one point I saw that the owner was System, but killing System's process tree shuts down the server.
Is there any way to properly remove all handles to my deployment folder so the batch script can safely rename it?
Use the Windows Sysinternals tool Handle v4.0: https://technet.microsoft.com/en-us/sysinternals/bb896655.aspx
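For example (the path, PID, and handle value below are placeholders; run from an elevated prompt):

handle.exe D:\Sites\MyApp
handle.exe -c 1A4 -p 5812 -y

The first command lists every process holding a handle under the folder; the second forcibly closes handle 1A4 in process 5812 (-y suppresses the confirmation prompt). As the next answer notes, closing handles out from under a process is a last resort.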
There are tools, like Process Explorer, that can find and forcibly close file handles; however, the state and behaviour of the application (both yours and, in this case, IIS) after doing this is undefined. Some applications won't care, some will error, and others will crash hard.
The correct solution is to allow IIS to cleanly release locks and clean up after itself, to preserve server stability. If this is not possible, you can either create another site on the same box, or set up a new box with the new content, and move the domain name/IP across to "promote" the new content to production.

NServiceBus creates a new logfile at each restart

I think there must be an easy solution to my problem, but I can't figure it out. We're using NServiceBus in a Windows service and we have configured it to use log4net for logging; in code we have this:
SetLoggingLibrary.Log4Net(log4net.Config.XmlConfigurator.Configure);
Configure.With().Log4Net().....
So far so good. The problem is that NServiceBus still creates its own logfile, named "logfile", placed in the same folder the application runs in, i.e. among the binaries. In our development and test environments, where we reinstall and restart the service frequently, this soon pollutes the binaries folder with a lot of logfiles, as a new one is created each day (the old one from a previous date is renamed to, for instance, logfile2012-02-28).
In the config-file of the service we have these lines:
...
<section name="Logging" type="NServiceBus.Config.Logging, NServiceBus.Core" />
<Logging Threshold="OFF" />
So all the logfiles are empty, but how do we stop them from being created, or at least have them created in a separate log folder?
Thanks
Christian
Your calls to SetLoggingLibrary and .Log4Net() are conflicting with each other, and probably also with the profiles (if you're using NServiceBus.Host.exe).
Have you looked through the docs?
http://docs.particular.net/nservicebus/logging/
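A minimal sketch of untangling the two, using only the calls already shown in the question (the elided rest of the fluent chain stays as it is today):

// Wire up log4net exactly once, before the bus is configured:
SetLoggingLibrary.Log4Net(log4net.Config.XmlConfigurator.Configure);

// ...then keep the existing Configure.With() chain, minus the extra
// .Log4Net() call, so only one mechanism owns the log4net appenders:
Configure.With();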

Make a folder like Dropbox that connects with a remote location

How can I make a folder that does things? Dropbox clearly knows when a file is put into its folder, and that file is synced. How can I make a folder that does the same, so that the files I put in it go to my FTP server?
I'm trying to do this on a Mac (surely, Dropbox works fine on a Mac).
I believe what you are looking for is a way to monitor when files are changed. Then, you can simply upload the changed file via FTP like you mentioned. If this is the case, the answer is to tie into the Windows Folder and File events. Here is a good article on how to do so:
http://www.codeproject.com/KB/files/MonitorFolderActivity.aspx
The code needed to FTP a file can be found here:
http://msdn.microsoft.com/en-us/library/ms229715.aspx
All of this is assuming you are going to be using C#. If you are going to use a different language, you will need to perform the same basic actions in the same basic manner but the syntax will be different.
To get started, this is all you need. You watch the folder for changes to any of the files. When you see a change, you upload the changed file via FTP (if that is your desired method of web transport) to the remote location. Of course, you would need to do the opposite for other clients. They would need to subscribe to events on your server that told them to download the latest versions of the changed files. Finally, you would need to apply your own business logic for things like how often you want the uploads to happen, if you want logging enabled for the changes, if you are going to do file versioning, etc.
One solution (Windows-only, .NET) would be to run a client that monitors a folder with FileSystemWatcher and, when the change event fires, performs whatever action is required to sync with FTP, as in the sketch below.
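A minimal sketch of that approach in C# (the watched path, FTP URL, and credentials are placeholders):

using System;
using System.IO;
using System.Net;

class FolderUploader
{
    static void Main()
    {
        // Watch the folder for new and modified files
        var watcher = new FileSystemWatcher(@"C:\SyncedFolder");
        watcher.Created += (s, e) => Upload(e.FullPath);
        watcher.Changed += (s, e) => Upload(e.FullPath);
        watcher.EnableRaisingEvents = true;

        Console.WriteLine("Watching... press Enter to quit.");
        Console.ReadLine();
    }

    static void Upload(string path)
    {
        // WebClient speaks FTP when given an ftp:// URI (see the MSDN link above)
        using (var client = new WebClient())
        {
            client.Credentials = new NetworkCredential("user", "password"); // placeholders
            client.UploadFile("ftp://example.com/" + Path.GetFileName(path), path);
        }
    }
}

One caveat: Changed can fire multiple times while a file is still being written, so production code would typically debounce events and retry uploads that fail because the file is locked.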