Export & import ODS files in Appengine app - google-app-engine

I am developing an application in Java using GWT and App Engine. One feature my app should have is to import data from .ods files and also to export data in .ods format. I know about the jOpenDocument jar, but it does not work in App Engine apps because it uses some classes that App Engine restricts. Any suggestions for alternative approaches to get this done?

You could use Google Docs: stream your ODS file to Google Docs while converting it to the native Google Docs format, use the Google Docs API to manipulate the data, and then export it from Google Docs in ODS format again.
This might not work if your ODS contains more complex items such as charts or macros, because the conversions will cause problems there. But for plain data in columns, this approach should work.
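The Google Docs-era upload/export APIs have since been folded into the Drive API, which still supports exactly this round trip: upload the ODS with conversion to a Google Sheet, then export the sheet back as ODS. Below is a minimal Java sketch, assuming an already-authorized Drive v3 client built elsewhere; the class and method names are illustrative, and only the Drive calls and MIME types are meant as real API surface.

import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import com.google.api.client.http.InputStreamContent;
import com.google.api.services.drive.Drive;
import com.google.api.services.drive.model.File;

public class OdsRoundTrip {

    private static final String ODS_MIME = "application/x-vnd.oasis.opendocument.spreadsheet";
    private static final String SHEET_MIME = "application/vnd.google-apps.spreadsheet";

    // Upload an .ods stream and ask Drive to convert it to a Google Sheet.
    static String importOds(Drive drive, String name, InputStream odsData) throws Exception {
        File metadata = new File().setName(name).setMimeType(SHEET_MIME); // convert on upload
        InputStreamContent content = new InputStreamContent(ODS_MIME, odsData);
        return drive.files().create(metadata, content).setFields("id").execute().getId();
    }

    // Export the (possibly edited) sheet back to ODS bytes.
    static byte[] exportOds(Drive drive, String fileId) throws Exception {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        drive.files().export(fileId, ODS_MIME).executeMediaAndDownloadTo(out);
        return out.toByteArray();
    }
}

Between the two calls, the converted spreadsheet can be edited with the Sheets API, which plays the role the Google Docs API plays in the answer above.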

Related

How to import Google Sheets Data Into MongoDB from any link?

Context
I'm a web development beginner building a MERN stack web app to help school clubs manage applicants. I want users to be able to input a link to the Google Sheet generated from their Google Form and to see a cleaner, properly formatted version of their data.
Exploration
From my research, it looks like most solutions:
Involve people wanting to automate their own sheets (and therefore being able to use the Google Sheets script editor)
Make use of Tabletop.js, which will soon be deprecated. I looked into PapaParse but really couldn't figure out how to solve my problem using it.
Use MongoDB Stitch, which also relies on accessing the Google Sheets script editor
Require users to download the file as a CSV file (which is non-ideal for live updates)
It also seems like the Google Sheets API requires the user to have access to the script editor? I might not be interpreting the docs correctly. Would having users authenticate to my web app through Google overcome this problem?
My Goal
User pastes the link to their Google Sheets (with view access or published to the web) in my React app
My React app parses the Sheets data and stores it in MongoDB Atlas
[Optional] Update the MongoDB database whenever the original sheet is updated (There is no need for MongoDB to communicate back with the sheet.)
Thank you, any help is much appreciated!
Try out Zapier.
It has a Google Sheets + MongoDB integration.
https://zapier.com/apps/google-sheets/integrations/mongodb
One way you could possibly achieve this is to:
link your Google Sheets account to Zapier, then
collect the links through your React app,
duplicate the sheet from that link, and
set up a Zapier trigger that fires when a new sheet is duplicated and passes its data on to MongoDB

Access BigQuery table in Dash App deployed on Google App Engine

I am deploying a Plotly Dash app on Google App Engine but have run into some difficulties. The data source queried by the dashboard is a BigQuery table whose content keeps changing, and I would like the app to always show the latest data.
What I tried: at the beginning of the main.py code, I read in the table from BigQuery using the BigQuery Python API. But after the app was deployed to GAE, I found the data was fixed; even when I deleted the BigQuery table, the app was not affected. May I know the correct way to get data from BigQuery into App Engine? Thanks.
How to connect to BigQuery from GAE using Python might be a bit more of a task than a single question can answer, but here are some hints:
Everything Google Cloud can (in my opinion) be best understood through the repositories on Github. For instance, the python docs samples contain several examples, out of which I think the client example is probably the easiest and most basic. Bigquery Python Samples are here. That will basically answer your question, except for a few gotchas I will mention.
You will of course need to download the client library to do development on a local environment. That is straightforward, but if something seems not to be working make sure you have enabled the API service account for your project--that can be a little confusing.
Something that is critically important to remember is that your GAE app will not be able to easily communicate with BigQuery if it is in a different region, and, in fact, once you set up a GAE app you cannot move or delete it! So, do pay attention to what you are doing as you set up, and if you have a locations mismatch you will need to migrate your BQ instance to the matching location.
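Region gotchas aside, the basic client call shown in those samples is small. Here is a minimal sketch (Java client shown here; the Python client follows the same pattern), with the project, dataset, and table names as placeholders. For the Dash case, run the query inside the callback or request handler rather than once at module import, otherwise each instance keeps serving whatever it read at start-up.

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.FieldValueList;
import com.google.cloud.bigquery.QueryJobConfiguration;
import com.google.cloud.bigquery.TableResult;

public class LatestRows {

    // Run this inside the request/callback handler, not at module start-up,
    // so every page load sees the current table contents.
    static TableResult fetchLatest() throws InterruptedException {
        BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
        QueryJobConfiguration query = QueryJobConfiguration
            .newBuilder("SELECT * FROM `my_project.my_dataset.my_table`") // placeholder table
            .build();
        return bigquery.query(query);
    }

    public static void main(String[] args) throws InterruptedException {
        for (FieldValueList row : fetchLatest().iterateAll()) {
            System.out.println(row);
        }
    }
}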

How to use appengine Datastore API's with Dataflow?

We have a large dataset from an App Engine app in our datastore. Now I want to do some ETL on it to push it to BigQuery, and I thought of using a Dataflow batch job.
All examples I find are using this class to query the Datastore:
import com.google.api.services.datastore.DatastoreV1.Query;
And that does work. However, I'm not familiar with this DatastoreV1 API and would like to use the API provided with the App Engine SDK, like this:
import com.google.appengine.api.datastore.Query;
The problem is that the DatastoreIO doesn't accept these queries:
PCollection<Entity> projects = p.apply(Read.from(DatastoreIO.source().withQuery(q).withDataset(DATASET_ID)));
It will only take DatastoreV1.Query objects. Is there any way to use the App Engine-provided APIs? I'm much more familiar with those calls. Better yet, if we could use Objectify, that would be awesome :)
Thanks!
This isn't possible with the current implementation of the API. We can look at adding it as a feature, and would gladly accept a pull request to expand the current functionality. The App Engine team is also actively working on increasing interoperability between their SDK and the Datastore API.
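In the meantime, the query passed to DatastoreIO has to be assembled as a DatastoreV1.Query protobuf. A minimal sketch using the same DatastoreIO call as in the question; the kind name and dataset ID are placeholders.

import com.google.api.services.datastore.DatastoreV1.Entity;
import com.google.api.services.datastore.DatastoreV1.KindExpression;
import com.google.api.services.datastore.DatastoreV1.Query;
import com.google.cloud.dataflow.sdk.Pipeline;
import com.google.cloud.dataflow.sdk.io.DatastoreIO;
import com.google.cloud.dataflow.sdk.io.Read;
import com.google.cloud.dataflow.sdk.values.PCollection;

public class DatastoreEtl {

    // Build the protobuf query directly instead of using com.google.appengine.api.datastore.Query.
    static PCollection<Entity> readProjects(Pipeline p, String datasetId) {
        Query q = Query.newBuilder()
            .addKind(KindExpression.newBuilder().setName("Project")) // placeholder kind
            .build();
        return p.apply(Read.from(DatastoreIO.source().withQuery(q).withDataset(datasetId)));
    }
}

Filters and sort orders can be added through the same builder before build().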

How can I import a .sql file into the App Engine datastore

I'm experimenting with GAE, and I'm looking for a way to import an .sql file into the App Engine datastore. I found ways on the GAE site to import CSV and XML files:
https://developers.google.com/appengine/docs/python/tools/uploadingdata
but nothing about .sql.
I've also come across the so-called "jiqlAdmin Data Querying tool", which claims to import .sql files into Google's datastore:
https://groups.google.com/forum/?fromgroups=#!topic/google-appengine-java/Sfy3jmWhYfI
Has anybody tried this tool? Does it work? Do you have any other suggestions?
Thanks
If the goal is just using SQL with App Engine, I would check out Google Cloud SQL, which works great with App Engine.
If you just want your data in the datastore, converting it to CSV and uploading it as per the first link might be a good choice. This will not work if you have things like foreign keys as they will refer to IDs in your old SQL database. You also probably won't be happy with your data if it's arranged in such a way that using a JOIN is the only way to make sense of it.
Since it sounds like you are just getting started, I think Google Cloud SQL is the better option.
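If you go the Cloud SQL route, you can import the .sql dump into a Cloud SQL instance (both the Cloud Console and gcloud support importing SQL dump files) and query it from App Engine over plain JDBC. A minimal sketch, assuming the Cloud SQL MySQL socket factory dependency is on the classpath; the instance connection name, database, table, and credentials are placeholders.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class CloudSqlQuery {

    public static void main(String[] args) throws Exception {
        // Instance connection name, database, table and credentials are placeholders.
        String url = "jdbc:mysql:///mydb"
            + "?cloudSqlInstance=my-project:us-central1:my-instance"
            + "&socketFactory=com.google.cloud.sql.mysql.SocketFactory"
            + "&user=dbuser&password=dbpass";

        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT id, name FROM students")) {
            while (rs.next()) {
                System.out.println(rs.getLong("id") + " " + rs.getString("name"));
            }
        }
    }
}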

Developing for Google App Engine and using the datastore

I am just getting started with Google Web Toolkit and Google App Engine and have a quick question. I think I understand how to use the datastore now, but I was wondering if there is a way to quickly create a "database" with static data from an Excel sheet? I just need to add some data for a proof of concept later this week.
I am picturing something similar to a SQL database browser where I can just import the data?
I'm developing in Eclipse with the appropriate plugins.
Thanks,
Rob
The easiest way to do this would be to save your spreadsheet as a CSV file, then use the bulkloader to load it into the datastore.
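For reference, the upload step looks roughly like the command below; the exact flags depend on your SDK version, the remote_api endpoint must be enabled in the app, and the file name, kind, and app URL are placeholders.

appcfg.py upload_data --config_file=bulkloader.yaml --filename=data.csv --kind=SpreadsheetRow --url=http://your-app-id.appspot.com/_ah/remote_api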
Your best bet is probably to write something to handle uploading it, or to handle processing it on the server.
However, you should also look at the bulk loader. It might be able to save you a little bit of time.
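If you do write the server-side handler suggested above, the low-level datastore API keeps it short. Here is a minimal sketch of a servlet that reads a posted CSV (header row as property names) and writes one entity per row; the kind name and the naive comma splitting are illustrative only.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;

public class CsvImportServlet extends HttpServlet {

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        DatastoreService datastore = DatastoreServiceFactory.getDatastoreService();
        BufferedReader reader = new BufferedReader(
            new InputStreamReader(req.getInputStream(), StandardCharsets.UTF_8));

        String headerLine = reader.readLine();
        if (headerLine == null) {
            resp.sendError(HttpServletResponse.SC_BAD_REQUEST, "empty upload");
            return;
        }
        String[] header = headerLine.split(",");        // first row = property names
        String line;
        int count = 0;
        while ((line = reader.readLine()) != null) {
            String[] values = line.split(",");          // naive split; no quoted fields
            Entity row = new Entity("SpreadsheetRow");  // placeholder kind
            for (int i = 0; i < header.length && i < values.length; i++) {
                row.setProperty(header[i].trim(), values[i].trim());
            }
            datastore.put(row);
            count++;
        }
        resp.getWriter().println("Imported " + count + " rows");
    }
}

For anything beyond a proof of concept you would want a real CSV parser and batched datastore.put() calls, but this is enough to get static data in quickly.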
Here is the API (Google Documents List API) that "allows client applications to programmatically access and manipulate user data stored with Google Documents".
