I'm experimenting with GAE, and I'm looking for a way to import an .sql file into the App Engine datastore. I found ways on the GAE site to import CSV and XML files:
https://developers.google.com/appengine/docs/python/tools/uploadingdata
but nothing about .sql.
I've also come across the so-called "jiqlAdmin Data Querying tool", which claims to import .sql files into Google's datastore:
https://groups.google.com/forum/?fromgroups=#!topic/google-appengine-java/Sfy3jmWhYfI
Has anybody tried this tool? Does it work? Do you have any other suggestions?
Thanks
If the goal is just using SQL with App Engine, I would check out Google Cloud SQL, which works great with App Engine.
If you just want your data in the datastore, converting it to CSV and uploading it as per the first link might be a good choice. This will not work if you have things like foreign keys, as they will refer to IDs in your old SQL database. You also probably won't be happy with your data if it's arranged in such a way that a JOIN is the only way to make sense of it.
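If you do go the CSV route, the conversion itself is usually the easy part. A minimal sketch, assuming the old data lives in a SQLite file (the table and column names are placeholders):

```python
# A minimal sketch: dump a SQL table to CSV for the bulkloader.
# 'legacy.db' and the users table/columns are placeholders.
import csv
import sqlite3

conn = sqlite3.connect('legacy.db')
rows = conn.execute('SELECT id, name, email FROM users')

with open('users.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(['id', 'name', 'email'])  # header row the loader config maps from
    writer.writerows(rows)
```

Denormalizing across tables, so that no JOIN is needed afterwards, is the part that takes real thought.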
Since it sounds like you are just getting started, I think Google Cloud SQL is the better option.
Related
I am looking at a CSV file hosted on an open government portal with daily updates, and would like to build a web app around it. I'm looking for the best approach; please advise. Current options I am considering:
Reading the CSV directly into the web app - this seems like a bad idea because it's a sizeable file (over 70 MB), which would likely make the first page load, or even each query, far too slow
Scheduling a task with something like AWS to read the file once a day and send its contents to a database such as MongoDB, either by overwriting the db completely or by pre-reading it and updating with newer entries, and then querying this db from the web app (see the sketch after this list)
Am I missing any better approaches?
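For concreteness, a rough sketch of what I imagine the second option looking like, assuming pandas and pymongo (the URL, database, and collection names are placeholders):

```python
# A rough sketch of the once-a-day refresh; the URL, database, and
# collection names are placeholders.
import pandas as pd
from pymongo import MongoClient

CSV_URL = 'https://data.example.gov/daily_export.csv'  # placeholder portal URL

def refresh_collection():
    df = pd.read_csv(CSV_URL)  # ~70 MB, so this runs on a schedule, not per request
    coll = MongoClient('mongodb://localhost:27017')['gov_data']['daily']
    coll.delete_many({})                        # overwrite strategy: drop the old snapshot
    coll.insert_many(df.to_dict('records'))     # load the new one
```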
I am deploying a Plotly Dash app on Google App Engine but have run into some difficulties. The data source queried in the dashboard is a BigQuery table whose contents change, and I want the data in the app to always be the latest.
What I tried: at the beginning of main.py, I read in the table from BigQuery via the BigQuery Python API, but after the app was deployed to GAE, I found the data was fixed; even when I deleted the BigQuery table, the app was not affected. May I know the correct way to get data from BigQuery into App Engine? Thanks.
How to connect to BigQuery from GAE using Python might be a bit more of a task than a single question can answer, but here are some hints:
Everything in Google Cloud can (in my opinion) best be understood through the repositories on GitHub. For instance, the Python docs samples contain several examples, of which I think the client example is probably the easiest and most basic. The BigQuery Python samples are here. That will basically answer your question, except for a few gotchas I will mention.
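The core of the client sample is small. A minimal sketch, assuming the google-cloud-bigquery library (the project, dataset, and table names are placeholders):

```python
# A minimal sketch with the google-cloud-bigquery client;
# the project/dataset/table names are placeholders.
from google.cloud import bigquery

def fetch_latest_rows():
    client = bigquery.Client()  # on GAE this picks up the app's service account
    query = '''
        SELECT *
        FROM `my-project.my_dataset.my_table`
        LIMIT 1000
    '''
    return [dict(row) for row in client.query(query).result()]
```

One gotcha relevant here: code at module level runs once per instance, so a query made at the top of main.py gives you a single snapshot that lives as long as the instance does, which matches the "fixed data" symptom you describe. Querying inside the request handler (or, for Dash, inside a callback) is what keeps the data fresh.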
You will of course need to download the client library to do development in a local environment. That is straightforward, but if something seems not to be working, make sure you have enabled the API service account for your project; that can be a little confusing.
Something that is critically important to remember is that your GAE app will not be able to communicate easily with BigQuery if they are in different regions, and, in fact, once you set up a GAE app you cannot move or delete it! So do pay attention to what you are doing as you set up, and if you have a location mismatch, you will need to migrate your BQ instance to the matching location.
My friend and I are working on a GWT/Google App Engine project, using TortoiseSVN and Google Code to synchronize the code.
We also synchronize the local_db.bin file in the appengine-generated folder, but we can't get it to work: after syncing the db file, our local datastore is not updated as we expected.
That is a pain. I'm worried about our future, when our database gets bigger and more complicated.
Can anyone give me some advice? What should I do to synchronize our local datastores?
I have two suggestions:
1) Use the remote API (https://developers.google.com/appengine/articles/remote_api) to share a GAE-hosted db locally; see the sketch after this list.
2) Maybe you can use Google Drive to sync folders.
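For suggestion 1, a minimal sketch with the legacy Python SDK (the app id and credentials are placeholders, and the exact stub signature may vary by SDK version):

```python
# A minimal sketch using the legacy remote_api stub; 'your-app-id' and
# the credentials are placeholders. The SDK also ships an interactive
# shell: remote_api_shell.py -s your-app-id.appspot.com
from google.appengine.ext.remote_api import remote_api_stub

def auth_func():
    return ('user@example.com', 'password')  # placeholder credentials

remote_api_stub.ConfigureRemoteApi(
    None, '/_ah/remote_api', auth_func, 'your-app-id.appspot.com')

# From here on, datastore calls in this process hit the hosted datastore,
# so both of you are reading and writing the same shared data.
```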
This is a really bad idea. Even if you weren't having trouble making both ends read from the same datastore file, the local datastore is in a binary format, and thus you won't both be able to work on the app at the same time, or you'll get merge conflicts you will be unable to resolve.
Instead, both for collaboration purposes and for testing and deployment, you should provide a set of test data you can easily load into the datastore. Store the test data in version control, and load it in using the bulkloader or your own code.
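Loading it with your own code can be a very small script. A sketch, where the Greeting model and the fixtures.json layout are hypothetical:

```python
# A minimal sketch of loading version-controlled test data; the Greeting
# model and the fixtures.json layout are hypothetical.
import json
from google.appengine.ext import ndb

class Greeting(ndb.Model):
    author = ndb.StringProperty()
    content = ndb.TextProperty()

def load_fixtures(path='fixtures.json'):
    with open(path) as f:
        records = json.load(f)  # e.g. [{"author": "bob", "content": "hi"}, ...]
    ndb.put_multi([Greeting(**record) for record in records])
```

Because the fixture file is plain text, it diffs and merges cleanly in version control, which is exactly what the binary datastore file cannot do.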
I've been reading about the remote_api and the bulkloader.yaml configuration file for doing bulk uploads to Google App Engine, but all I really want to do is replace my live datastore in the cloud with the contents of my local datastore. From what I've read, it would seem that I have to first somehow convert my dev_appserver.datastore file into CSV or XML, and then apply all the fancy transforms of bulkloader.yaml, which seems like a lot of unnecessary work.
Anyone know if there's an easier way?
Thanks!
It might work to run download_data against localhost and then upload_data, using that dump, against your live app... does it?
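Something along these lines; the flags are from the legacy bulkloader, 'your-app-id' is a placeholder, and the dev server needs the remote_api builtin enabled in app.yaml:

```
appcfg.py download_data --url=http://localhost:8080/_ah/remote_api \
    --filename=local_dump.sqlite3

appcfg.py upload_data --url=https://your-app-id.appspot.com/_ah/remote_api \
    --filename=local_dump.sqlite3
```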
I am just getting started with Google Web Toolkit and Google App Engine and have a quick question. I think I understand how to use the datastore now, but I was wondering if there is a way I can quickly create a "database" with static data from an Excel sheet? I just need to add some data for a proof of concept later this week.
I am picturing something similar to a SQL database browser where I can just import the data?
I'm developing in Eclipse with the appropriate plugins.
Thanks,
Rob
The easiest way to do this would be to save your spreadsheet as a CSV file, then use the bulkloader to load it into the datastore.
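A minimal bulkloader.yaml sketch for a hypothetical Product kind, where the external_name values match the CSV header columns:

```yaml
# A sketch for a hypothetical Product kind; external_name values must
# match the CSV header columns.
transformers:
- kind: Product
  connector: csv
  property_map:
    - property: __key__
      external_name: sku
    - property: name
      external_name: name
    - property: price
      external_name: price
      import_transform: float
```

Then something like `appcfg.py upload_data --config_file=bulkloader.yaml --filename=products.csv --kind=Product --url=http://your-app-id.appspot.com/_ah/remote_api` (the app id is a placeholder) pushes it into the datastore.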
Your best bet is probably to write something to handle uploading it, or to handle processing it on the server.
However, you should also look at the bulk loader. It might be able to save you a little bit of time.
Here is the API (Google Documents List API) that "allows client applications to programmatically access and manipulate user data stored with Google Documents".