I've been writing an app to deploy on Heroku. So far everything has been working great. However, there's one last step that I haven't been able to solve: I need to generate a CSV file on-the-fly from the DB and let the user download the file.
On my local machine I can simply write the file to a folder under the web app root, e.g. /output, and then redirect my browser to http://localhost:4000/output/1.csv to get the file.
However, the same code fails again and again on Heroku no matter how I try to modify it. It either complains about not being able to write the file or not being able to redirect the browser to the correct path.
Even if I manually use heroku run bash, create an /output folder in the project root, and create a file there, when I point my browser at it (e.g. https://myapp.herokuapp.com/output/1.csv), it simply says "Page not found".
Is it simply impossible to perform such an action on Heroku after all? I thought since we are free to create files on the ephemeral file system we should be able to access it from the browser as well, but things seem to be more complicated than that.
I'm using Phoenix framework and I already added
plug Plug.Static,
at: "/output", from: Path.expand("output/"), gzip: false
to my endpoint.ex. Apparently it works on localhost but not on Heroku?
I'll turn to Amazon S3 if indeed what I'm trying to do is impossible. However, I want to avoid using S3 as much as possible since this should be a really simple task and I don't want to add another set of credentials/extra complexity to manage. Or is there any other way to achieve what I'm trying to do without having to write the file to the file system first and then redirecting the user to it?
I know it doesn't strictly answer your question, but if you don't mind generating the CSV every time it is requested, you can use Controller.send_download/3 and serve an arbitrary payload as a download (in your case, the contents of the CSV).
Naturally you could store the generated CSVs somewhere (like the database or even ets) and generate them "lazily".
I'm creating an app using Grails that generates multiple files as process output and puts them in folders that other people will later access via FTP.
Everything works great, except that in production the newly created files are accessible only to the user that runs Tomcat, meaning that when somebody connects to the folder via FTP the files can't be opened because they don't have permission.
Is there any way to set permissions from Grails, or to configure Tomcat so that every output file can be accessed by other users?
This might help. You can also look into executing shell commands but that may not be the best option.
I found out that there is actually a method on the File class that changes the permissions for an instance of File. I was trying to use it, but I noticed it only changed the permissions for the owner. With a slight change to the parameters, though, you can tell it to apply to other users too.
File.setReadable(boolean readable)
File.setReadable(boolean readable, boolean ownerOnly)
So in my case, file.setReadable true, false did the trick.
Check out the java.io.File class methods for more info.
I have a question about Java and databases. I am using a Microsoft Access database, and in order to connect to it I have to give the full path in the driver URL. The path looks like the following, and it doesn't help with the program's portability:
String DBPath = "jdbc:ucanaccess://C://Users//theuser//Desktop//CostumerAppData.accdb";
Can I have the database inside my project and give a simple path in order to connect?
Thank you in advance for your responses.
UCanAccess supports both relative and absolute paths. If you're using a relative path, it must be relative to the current working directory. The Path class (java.nio.file) may help. Maybe I'll be able to simplify this use case in the next versions.
I have a web-app(browser based) which needs to access a folder full of icons that resides outside the web folder.
This folder MUST be outside the web folder, and would ideally exist outside the project folder altogether.
However, when specifying the path to the folder, neither "../" nor a symlink works.
When the page attempts to load the image, I always get
"[web] GET /Project|web/icons/img.png => Could not find asset Project|web/icons/img.png."
even though I set the image source to "../icons/img.png".
How can I get Dart to access this file properly?
PS: I attempted a symlink to another part of the filesystem (where the images would ideally be kept), but this did not work either.
The web server integrated into DartEditor or pub serve only serves directories that are added as folders to the files view. When you add the folder to DartEditor you should be able to access the files. This is just for development.
You also have to find a solution for when you deploy your server app. It would be a hazardous security issue if you could access files outside the project directory. Where should the server draw the line? If this were possible, your entire server would be accessible to the world.
Like #Robert asked, I also have a hard time imagining why the files must not be in the project folder.
If you want to reuse the icons/images between different projects you could create a resource package that contains only those images and add them as a dependency to your project.
If you want a better answer you need to provide more information about your requirements.
If you wrote your own server (using the HttpServer class), it may be possible to use VirtualDirectory to serve your external files.
Looking at the dartiverse_search example may give you some ideas.
You could put them in the lib directory and refer to them via /packages/Project/...
Or in another package, in which case they would be in a different place in the file system. But as other people have said, your requirement seems odd.
I recently had a hard drive crashed and lost all of my source code. Is it possible to pull/checkout the code that I have already uploaded to Google App Engine (like the most recent version)?
Since I just went to all the trouble of figuring out how to do this, I figure I may as well include it as an answer, even if it doesn't apply to you:
Before continuing, swear on your mother's grave that next time you will back your code up, or better, use source control. I mean it: Repeat after me "next time I will use source control". Okay, with that done, let's see if it's possible to recover your code for you...
If your app was written in Java, I'm afraid you're out of luck - the source code isn't even uploaded to App Engine, for Java apps.
If your app was written in Python, and had both the remote_api and deferred handlers defined, it's possible to recover your source code through the interaction of these two APIs. The basic trick goes like this:
Start the remote_api_shell
Create a new deferred task that reads in all your files and writes them to the datastore
Wait for that task to execute
Extract your data from the datastore, using remote_api
Looking at them in order:
Starting the remote_api_shell
Simply type the following from a command line:
remote_api_shell.py your_app_id
If the shell isn't in your path, prefix the command with the path to the App Engine SDK directory.
Writing your source to the datastore
Here we're going to take advantage of the fact that you have the deferred handler installed, that you can use remote_api to enqueue tasks for deferred, and that you can defer an invocation of the Python built-in function 'eval'.
This is made slightly trickier by the fact that 'eval' executes only a single statement, not an arbitrary block of code, so we need to formulate our entire code as a single statement. Here it is:
expr = """
[type(
'CodeFile',
(__import__('google.appengine.ext.db').appengine.ext.db.Expando,),
{})(
name=dp+'/'+fn,
data=__import__('google.appengine.ext.db').appengine.ext.db.Text(
open(dp + '/' + fn).read()
)
).put()
for dp, dns, fns in __import__('os').walk('.')
for fn in fns]
"""
from google.appengine.ext.deferred import defer
defer(eval, expr)
Quite the hack. Let's look at it a bit at a time:
First, we use the 'type' builtin function to dynamically create a new subclass of db.Expando. The three arguments to type() are the name of the new class, the list of parent classes, and the dict of class variables. The entire first 4 lines of the expression are equivalent to this:
from google.appengine.ext import db
class CodeFile(db.Expando): pass
The use of 'import' here is another workaround for the fact that we can't use statements: The expression __import__('google.appengine.ext.db') imports the referenced module, and returns the top-level module (google).
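Both tricks can be checked in a plain Python interpreter, away from App Engine; the dict base class below is just a stand-in for db.Expando:

```python
# Dynamically create a class with type(name, bases, dict) --
# equivalent to `class CodeFile(dict): pass` (dict stands in for db.Expando).
CodeFile = type('CodeFile', (dict,), {})
print(CodeFile.__name__)           # CodeFile
print(issubclass(CodeFile, dict))  # True

# __import__ with a dotted name returns the TOP-LEVEL package, not the leaf
# module, which is why the expression chains .appengine.ext.db after the call.
mod = __import__('os.path')
print(mod.__name__)                # os
```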
Since type() returns the new class, we now have an Expando subclass we can use to store data to the datastore. Next, we call its constructor, passing it two arguments, 'name' and 'data'. The name we construct from the concatenation of the directory and file we're currently dealing with, while the data is the result of opening that filename and reading its content, wrapped in a db.Text object so it can be arbitrarily long. Finally, we call .put() on the returned instance to store it to the datastore.
In order to read and store all the source, instead of just one file, this whole expression takes place inside a list comprehension, which iterates first over the result of os.walk, which conveniently returns all the directories and files under a base directory, then over each file in each of those directories. The return value of this expression - a list of keys that were written to the datastore - is simply discarded by the deferred module. That doesn't matter, though, since it's only the side-effects we care about.
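The walk-and-collect shape of that comprehension can be tried locally; the temporary tree below is just a stand-in for the app's source directory, and the datastore put() is replaced by collecting (name, data) pairs:

```python
import os
import tempfile

# Build a tiny directory tree to stand in for the deployed source.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, 'pkg'))
for rel in ['main.py', os.path.join('pkg', 'util.py')]:
    with open(os.path.join(root, rel), 'w') as fh:
        fh.write('# source of %s\n' % rel)

# The same double loop as the deferred expression, minus the datastore
# write: one (name, data) pair per file found under root.
records = [(os.path.join(dp, fn), open(os.path.join(dp, fn)).read())
           for dp, dns, fns in os.walk(root)
           for fn in fns]
print(len(records))  # 2
```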
Finally, we call the defer function, deferring an invocation of eval, with the expression we just described as its argument.
Reading out the data
After executing the above, and waiting for it to complete, we can extract the data from the datastore, again using remote_api. First, we need a local version of the CodeFile model:
import os
from google.appengine.ext import db

class CodeFile(db.Model):
    name = db.StringProperty(required=True)
    data = db.TextProperty(required=True)
Now, we can fetch all its entities, storing them to disk:
for cf in CodeFile.all():
    dirname = os.path.dirname(cf.name)
    if dirname and not os.path.exists(dirname):
        os.makedirs(dirname)
    fh = open(cf.name, "w")
    fh.write(cf.data)
    fh.close()
That's it! Your local filesystem should now contain your source code.
One caveat: The downloaded code will only contain your code and datafiles. Static files aren't included, though you should be able to simply download them over HTTP, if you remember what they all are. Configuration files, such as app.yaml, are similarly not included, and can't be recovered - you'll need to rewrite them. Still, a lot better than rewriting your whole app, right?
Update: Google App Engine now allows you to download the code (for Python, Java, PHP, and Go apps).
Tool documentation here.
Unfortunately the answer is no. This is a common question on SO and the app engine boards.
See here and here for example.
I'm sure you'll be OK though, because you do keep all your code in source control, right? ;)
If you want this to be an option in the future, you can upload a zip of your src, with a link to it somewhere in your web app, as part of your build/deploy process.
There are also projects out there like this one that automate that process for you.
Found that you can run the following in your console (command line / terminal). Just make sure that appcfg.py is accessible via your $PATH.
locate appcfg.py
By default the code below prints out each file and the download progress.
appcfg.py download_app -A APP_ID -V VERSION_ID ~/Downloads
You CAN get your code, even in Java. It just requires a bit of reverse engineering. You can download the war file using the App Engine SDK by following these instructions: https://developers.google.com/appengine/docs/java/tools/uploadinganapp
Then you at least have the class files, which you can run through JAD to get back to the source files (close to them, at least).
If you're using Python, you might be able to write a script that opens all the files in its current directory and child directories and adds them to a zipfile for you to download.
I don't know much about App Engine or its permissions, but it seems like that could be possible.
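A minimal sketch of such a script (the function and archive names are made up for illustration):

```python
import os
import zipfile

def zip_source_tree(root, archive_path):
    """Walk `root` and add every file under it to a zip archive."""
    with zipfile.ZipFile(archive_path, 'w', zipfile.ZIP_DEFLATED) as zf:
        for dirpath, dirnames, filenames in os.walk(root):
            for name in filenames:
                full = os.path.join(dirpath, name)
                # Store paths relative to root so the archive unpacks cleanly.
                zf.write(full, os.path.relpath(full, root))
    return archive_path
```

You could then serve the resulting archive from a handler, or fetch it however your app exposes files.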
You have to revert to the earlier SDK; appcfg.py is not in the latest SDK. Kind of a pain, but it works. It should be far more prominent in the literature. It cost me an entire day.
Update as of October 2020.
The current version of the Google App Engine SDK still includes the appcfg.py script however when trying to download the files from your site the script will attempt to download them into the root folder of your system.
Example:
/images/some_site_image.png
This is probably related to changes in App Engine: your files might have been in a relative directory before, but they no longer are with the new versions of the system.
To fix the problem you will have to edit the appcfg.py file in:
<path_to_cloud_install_dir>/google-cloud-sdk/platform/google_appengine/google/appengine/tools/appcfg.py
Around line 1634 you will find something that looks like:
full_path = os.path.join(out_dir, path)
The problem is with the path argument that for most files is a root directory.
This causes the join method to ignore the out_dir argument.
To fix this on a *NIX and MacOS type of system you will need to add a line before the above mentioned statement that looks like:
path = re.sub(r'^/', '', path)
This removes the '/' prefix from the path and allows the join method to properly connect the strings.
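You can see the os.path.join behavior that makes this fix necessary in a quick check (the output directory name is illustrative; this applies to POSIX-style paths):

```python
import os.path
import re

out_dir = '/tmp/myapp_download'        # illustrative download target
path = '/images/some_site_image.png'   # absolute path, as appcfg passes it

# A second absolute argument makes join discard out_dir entirely:
print(os.path.join(out_dir, path))     # /images/some_site_image.png

# Stripping the leading '/' restores the intended nesting:
fixed = re.sub(r'^/', '', path)
print(os.path.join(out_dir, fixed))    # /tmp/myapp_download/images/some_site_image.png
```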
Now you should be able to run:
google-cloud-sdk/platform/google_appengine/appcfg.py download_app -A <app> -V <version> <your_directory>