Heroku: where can I save files? - database

I have a Telegram bot that saves users' audio messages and photos in the project directory and stores only the paths in the database. I deployed it on PythonAnywhere and everything works.
But before that, I tried to deploy it on Heroku and ran into the problem that you can't store files there, and everything has to go through a database.
Do I understand correctly that I need to create a field in the database that stores the file itself, or are there other ways?

You can use Cloudinary, for example. It provides 25 GB of bandwidth for free. The service is intended for pictures, but it works well with other files as well, and it has a good API for many programming languages (not sponsored).
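For example, here is a minimal sketch of uploading a saved voice message with Cloudinary's Python package and keeping only the returned URL in your database; the credentials and file name below are placeholders, not values from your project:

import cloudinary
import cloudinary.uploader

# Placeholder credentials - take the real values from your Cloudinary dashboard
cloudinary.config(
    cloud_name="your-cloud-name",
    api_key="your-api-key",
    api_secret="your-api-secret",
)

# resource_type="auto" lets Cloudinary accept non-image files such as audio
result = cloudinary.uploader.upload("voice_message.ogg", resource_type="auto")

# Store the hosted URL in your database instead of a local file path
file_url = result["secure_url"]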

Related

How to transfer live WordPress site to Wamp?

I've got a WordPress site that I have been using for a year now, and it is hosted with HostGator. I have a few tests I would like to run on the site, but I would like to test it offline using WAMP first before making it LIVE.
The problem is that previously I was always making changes to the LIVE site, usually at hours when I get little to no traffic. However, that has changed now and I get traffic most hours throughout a 24-hour day.
So my problem is:
How do I download my existing website to a laptop (WAMP) and make those changes with the new theme? (total newbie, sorry!)
I use Windows 7, so I'm not sure what I need to do to get the site working offline like a live site.
Once I have implemented the new changes, what is the best way to upload the updated site back to the HostGator server without having any downtime or errors for site visitors?
Is there anything else I need to install or do in order for this to work? I hope you can give me as much information as possible or any links to guides or articles that explain how to do this.
Thanks so much for any help you can offer!!!
If you're using Hostgator, the process is simple:
Install XAMPP or WAMP on your computer;
Go to your cPanel, back up and download your website;
Extract the backup to your computer, especially the homedir and the SQL dump;
Go to your local environment and access http://localhost/phpmyadmin;
Create a new database; the name doesn't matter, but for this example let's call it "database";
Inside that database, import the SQL dump taken from the backup;
Create a new folder inside your htdocs with the name of your website, e.g. "example.com";
Extract the content of the homedir there;
Edit wp-config.php with the following data (see the sketch after these steps):
Host: 'localhost'
Username: 'root'
Password: blank
Access http://localhost/example.com
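For reference, a sketch of the corresponding lines in wp-config.php; the database name matches the example above, and root with a blank password is only the default on a fresh local WAMP/XAMPP install, so adjust if yours differs:

define('DB_NAME', 'database');    // the database you created in phpMyAdmin
define('DB_USER', 'root');        // default local MySQL user
define('DB_PASSWORD', '');        // blank password on a default local install
define('DB_HOST', 'localhost');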
You can check a good tutorial about the subject here.
As for putting the site live, I recommend using a Git repository; however, it's understandable that this might be a little complicated and perhaps too much work for what you're trying to achieve.
Try moving your files directly from your local to the live environment using FileZilla or WinSCP; drag and drop should replace the files live and the downtime should be minimal.
Instead of WAMP, you can always use VirtualBox to install CentOS or Ubuntu/Debian.
You can go one step further and install either CentminMod to automate creating a LAMP stack, or a full panel like ISPConfig or Virtualmin.
That takes care of creating the environment.
Create a new account on the LAMP stack, using the same domain name.
You can FTP with Windows to get the files, but networking Windows and Linux is a pain. The better option is to use the command line (CLI) in the Linux VM to ftp the files from Hostgator to the VM. This guide will help with that process: http://www.tldp.org/HOWTO/FTP-3.html
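As a rough sketch of that transfer, assuming your Hostgator FTP credentials and that the site lives in public_html (both placeholders here), wget can also mirror the whole directory tree over FTP in one command, which is often simpler than an interactive ftp session:

wget -m --user=your-ftp-user --ask-password ftp://example.com/public_html/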
Then your only concern is the MySQL database. And for this, you have several options.
For me, the easiest is to buy (or try!) SQLyog on Windows, and then copy the database from the Hostgator source to the localhost destination. Some mild networking is needed for Windows to see the Linux VM, but nothing as complex as file sharing (the FTP issue). SQLyog is far quicker than backing up the database, then restoring it -- especially since you can run into memory issues doing it this way. It fully depends on the size of the database.
The cheap/free backup>restore method is to use phpMyAdmin.
WordPress also has plugins, of varying cost, but you still have the possible backup>restore memory issue there as well.
When done, just copy it the other way, again using SQLyog and CLI ftp. You'll still have some downtime, but it will hopefully be minimal.
As a newbie, this probably seems like rocket science, but at least it gives you a good place to start. Welcome to the world of locally dev'ing sites!

How to download an ephemeral file from Heroku Cedar

I have a Rails project hosted on Heroku Cedar that does the following:
crawls daily newsfeeds and stores them in the database
lets me manually judge the feeds and classify them into categories
uses the judgments to build a classifier that automatically classifies new incoming feeds
iteratively improves the classification with additional judgments
The problem is that the classifier requires writing to a file. However, when I run the scripts on Heroku Cedar, it creates an ephemeral file that isn't permanent.
My questions are:
Is there a way to download the ephemeral file I created by running a script on Heroku?
What's a better way to handle situation like this?
In short, no. You want to store any generated data in some sort of persistent file/data store. You should look at pushing these files to S3 or similar.
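As a rough sketch of that idea, shown here in Python with boto3 (the equivalent AWS SDK for Ruby works the same way); the bucket name and key are placeholders you would create in your own AWS account:

import boto3

# Upload the generated classifier file to a persistent bucket
s3 = boto3.client("s3")
s3.upload_file("classifier_model.dat", "my-app-artifacts", "models/classifier_model.dat")

# Later, any dyno can pull the file back down before using it
s3.download_file("my-app-artifacts", "models/classifier_model.dat", "classifier_model.dat")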

Local GAE datastore does not keep data after computer shuts down

On my local machine (i.e. http://localhost:8080/), I have entered data into my GAE datastore for some entity called Article. After turning off my computer and then restarting the next day, I find the datastore empty: no entities. Is there a way to prevent this in the future?
How do I make a copy of the data in my local datastore? Also, will I be able to upload said data later into both localhost and production?
My model is ndb.
I am using Mac OS X and Python 2.7, if these matter.
I have experienced the same problem. Declaring the datastore path when running dev_appserver.py should fix it. These are the arguments I use when starting the dev_appserver:
python dev_appserver.py --high_replication --use_sqlite --datastore_path=myapp.datastore --blobstore_path=myapp_blobs
This will use SQLite and save the data in the file myapp.datastore. If you want to save it in a different directory, use --datastore_path=/path/to/myapp/myapp.datastore
I also use --blobstore_path to save my blobs in a specific directory. I have found that it is more reliable to declare which directory to save my blobs. Again, that is --blobstore_path=/path/to/myapp/blobs or whatever you would like.
Since declaring blob and datastore paths, I haven't lost any data locally. More info can be found in the App Engine documentation here:
https://developers.google.com/appengine/docs/python/tools/devserver#Using_the_Datastore
Data in the local datastore is preserved unless you start it with the -c flag to clear it, at least on the PC. You therefore probably have a different issue with temp files or permissions or something.
The local data is stored using a different method from that of the actual production servers, so I'm not sure whether you can make a direct backup as such. If you want to upload data to both the local and deployed servers, you can use the bulk loader tool suite: uploading data
The bulk loader tool can upload and download data to and from your application's datastore. With just a little bit of setup, you can upload new datastore entities from CSV and XML files, and download entity data into CSV, XML, and text files. Most spreadsheet applications can export CSV files, making it easy for non-developers and other applications to produce data that can be imported into your app. You can customize the upload and download logic to use different kinds of files, or do other data processing.
So you can 'back up' by downloading the data to a file.
To load/pull data into the local development server, just give it the local URL.
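As a rough sketch, assuming the remote_api builtin is enabled in app.yaml and your-app-id is a placeholder for your real app ID, downloading everything from production and then pushing it into the local server might look like this (depending on SDK version, the local server may expect the app ID prefixed with dev~):

appcfg.py download_data --application=your-app-id --url=http://your-app-id.appspot.com/_ah/remote_api --filename=backup.data
appcfg.py upload_data --application=your-app-id --url=http://localhost:8080/_ah/remote_api --filename=backup.data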
The datastore typically saves to disk when you shut down. If you turned off your computer without shutting down the server, I could see this happening.

My GAE Python development datastore is never persisted to a file

I have just started using GAE (Python 2.7, SDK 1.6.4). I have set up a simple test project using PyDev (latest version) in Eclipse (Indigo) on Windows XP (SP3).
It all works fine: my app can record data in the datastore and the blobstore and then retrieve it, but when I stop the development server and start it again, the data in the datastore is lost. This is not the case for the blobstore, which is retaining blobs fine, and I can see the blobstore folder that gets created in C:\Temp.
I did the sensible thing and looked back through old posts, and found that most people who have this problem solve it by changing the location of the datastore file, so I used the following parameters:
--datastore_path="${workspace_loc}/myproject/datastore"
--blobstore_path="${workspace_loc}/myproject/blobstore"
"${workspace_loc}/myproject/src"
I moved the blobstore at the same time, as you can see.
The blobstore still works, and now the blobstore folder is created in the myproject folder as expected. The datastore file is still not created, however, and when I stop and restart the development server the data is still lost.
The dev server startup logs include the following entry:
WARNING 2012-04-20 10:49:04,513 datastore_file_stub.py:513] Could not read datastore data from C:\myworkspace\myproject\datastore
So I know it is trying to create the datastore in the correct place.
Finally, I lifted the whole Eclipse workspace folder and copied it to another computer with exactly the same setup, except that it is running Windows 7 instead of Windows XP.
Everything works fine there - both the datastore file and the blobstore folder are now created where I expect them to be.
I have set up Eclipse, Python, GAE, my project and my Eclipse launch file in exactly the same way on two computers; it works on one and not the other. Maybe XP has something to do with it, but to be honest I think that's unlikely.
The only other clue I have come up with is that a recent change to the GAE development server stopped writing to the datastore file after every change and now only flushes on exit; this problem may be closely related to mine:
App Engine local datastore content does not persist
However, adding the following to my code did not help at all.
from google.appengine.tools import dev_appserver
import atexit
atexit.register(dev_appserver.TearDownStubs)
So it's not down to an incorrect termination sequence either, as far as I can tell, although it may be that I just added it in the wrong place (I'm new to Python).
Anyway, I am stumped and would be really grateful for any suggestions you guys can come up with.
It's probably http://code.google.com/p/googleappengine/issues/detail?id=7244, which is a bug. Hopefully a fix will be available soon.
Did you try:
--storage_path=...
Path at which all local files (such as the Datastore, Blobstore files, Google Cloud Storage Files, logs, etc) will be stored, unless overridden by --datastore_path, --blobstore_path, --logs_path, etc.
Found at https://developers.google.com/appengine/docs/python/tools/devserver?csw=1
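For example, a sketch with placeholder paths; with the newer dev_appserver this keeps the datastore, blobstore and logs together in one directory:

python dev_appserver.py --storage_path=/path/to/myapp/storage /path/to/myapp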

Common file system API for files in the cloud?

Our app is a sort-of self-service website builder for a particular industry. We need to be able to store the HTML and image files for each customer's site so that users can easily access and edit them. I'd really like to be able to store the files on S3, but potentially other places like Box.net, Google Docs, Dropbox, and Rackspace Cloud Files.
It would be easiest if there were some common file system API that I could use over these repositories, but unfortunately everything is proprietary. So I've got to implement something. FTP or SFTP is the obvious choice, but it's a lot of work. WebDAV will also be a pain.
Our server-side code is Java.
Please someone give me a magic solution which is fast, easy, standards-based, and will solve all my problems perfectly without any effort on my part. Please?
Not sure if this is exactly what you're looking for, but we built http://mover.io to address this kind of thing. We currently support 13 different endpoints, and we have a GUI and an API for interfacing with all these cloud storage providers.
