I am running a Bottle web service on localhost on my system. I have connected it to MySQL and it is working fine, but when it comes to Google App Engine, how can I connect to its NoSQL datastore?
This is my code:
import bottle
from bottle import route, run, template, request, debug
import MySQLdb, MySQLdb.cursors
from google.appengine.ext.webapp.util import run_wsgi_app

db = MySQLdb.connect(user="root", passwd="root", db="delhipoc", host='127.0.0.1')
cur = db.cursor()

@route('/hai/<name>')
def show(name):
    print "the name is \n", name
    cur.execute('SELECT * from poc_people')
    print "i am in show \n"
    for row in cur.fetchall():
        print row[0]
        data = row[0]
        print "hai \n", row[0]
    return template('<b></br></br></br>Hello {{name}}</br></br></br>{{data}}</b>!',
                    name=name, data=data)

def main():
    debug(True)
    run_wsgi_app(bottle.default_app())

if __name__ == '__main__':
    main()
In the above way I am able to connect to my local MySQL database, but how can I connect it to the GAE datastore?
Create a Datastore model corresponding to your table and convert all MySQL API requests to use the Datastore API.
Datastore reference: https://developers.google.com/appengine/docs/python/datastore/
You don't need to explicitly "connect" to the GAE datastore; you can just issue the query, filter, etc.
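For example, here is a minimal sketch of what the poc_people table from the question could look like as a Datastore model. The property name is an assumption, since the original MySQL schema isn't shown; adjust names and types to match your columns.
from bottle import route, template
from google.appengine.ext import ndb

class PocPerson(ndb.Model):
    # Hypothetical property; adjust to match the columns of poc_people.
    name = ndb.StringProperty()

@route('/hai/<name>')
def show(name):
    # No explicit connection step is needed; the query goes
    # straight to the Datastore of the current application.
    person = PocPerson.query().get()
    data = person.name if person else ''
    return template('<b>Hello {{name}}</b> {{data}}', name=name, data=data)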
Related
I have an Entity with ~50k rows in Google Cloud Datastore (the standalone version, not GAE). I am starting development with GAE and would like to query this existing Datastore without having to import it into GAE. I have been unable to find a way to connect to an existing Datastore Kind.
Below is basic code, altered from Hello World and other guides, that I'm trying to get working as a proof of concept.
import webapp2
import json
import time
from google.appengine.ext import ndb

class Product(ndb.Model):
    type = ndb.StringProperty()

    @classmethod
    def query_product(cls):
        return ndb.gql("SELECT * FROM Product WHERE name >= :a LIMIT 5")

class MainPage(webapp2.RequestHandler):
    def get(self):
        self.response.headers['Content-Type'] = 'text/plain'
        query = Product.query_product()
        self.response.write(query)

app = webapp2.WSGIApplication([
    ('/', MainPage),
], debug=True)
The returned error is:
TypeError: Model Product has no property named 'name'
It seems obvious that it's trying to use a GAE datastore with the kind Product instead of my existing Datastore where Product is already defined, but I can't find how to make that connection.
There is only one Google Cloud Datastore. App Engine does not have a datastore of its own - it works with the same Google Cloud Datastore.
All entities in the Datastore are stored for a particular project. If you are trying to access data from a different project, you will not be able to see it without going through special authentication.
I'm not too certain what you're trying to accomplish when you say that you would like to query this existing datastore without having to import it to GAE. I'm guessing that you have project A with the datastore with 50k rows, and you're starting project B, and you want to access the project A datastore from project B. If that is the case, then maybe this previous answer that mentions the remote API can help you.
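For reference, a minimal sketch of what cross-project access via the remote API might look like; the hostname is a placeholder, and the exact setup depends on your SDK version:
from google.appengine.ext.remote_api import remote_api_stub

# Hypothetical hostname; point this at project A's deployed app.
# Project A must have the remote_api builtin enabled in its app.yaml
# (builtins: - remote_api: on). Credentials come from gcloud /
# application default credentials.
remote_api_stub.ConfigureRemoteApiForOAuth(
    'project-a.appspot.com', '/_ah/remote_api')

# After configuration, ndb/db calls in this process are served by
# project A's Datastore instead of the local one.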
Below is working code. I was pretty close at the time I made this original post; the reason I was getting no data back was that I was running my app locally. As soon as I actually deployed my code to App Engine, it pulled from Datastore with no problem.
import webapp2
import json
import time
from google.appengine.datastore.datastore_query import Cursor
from google.appengine.ext import ndb

class Product(ndb.Model):
    name = ndb.StringProperty()

class MainPage(webapp2.RequestHandler):
    def get(self):
        self.response.headers['Content-Type'] = 'text/plain'
        query = ndb.gql("SELECT * FROM Product WHERE name >= 'a' LIMIT 5")
        output = query.fetch()
        #query = Product.query(Product.name == 'zubo - pre-owned - nintendo ds')
        #query = Product.query()
        #output = query.fetch(10)
        self.response.write(output)

app = webapp2.WSGIApplication([
    ('/', MainPage),
], debug=True)
I'm trying to access data that ODK has pushed into the Datastore. The code below works fine when I query an entity that I created via Python, which is called "ProductSalesData". The entity name ODK has given its data is "opendatakit.test1". When I update the data model to class opendatakit.test1(db.Model) it obviously bombs due to a syntax error. How do I call that data?
#!/usr/bin/env python
import webapp2
from google.appengine.ext import db

class ProductSalesData(db.Model):
    product_id = db.IntegerProperty()
    date = db.DateTimeProperty()
    store = db.StringProperty()

q = ProductSalesData.all()

class simplequery(webapp2.RequestHandler):
    def get(self):
        for record in q:
            self.response.out.write('Result:%s<br />' % record.store)

app = webapp2.WSGIApplication(
    [('/', simplequery)],
    debug=True)
I know you tagged GAE, but do you have to access it straight through the datastore?
If not, I've had better success using the API that has already been built into aggregate: https://code.google.com/p/opendatakit/wiki/BriefcaseAggregateAPI
If you need GAE access, I'd suggest the ODK developers group over on Google Groups - they're pretty active.
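If you do need to read it straight from the Datastore, one approach (a sketch, not tested against ODK's actual data) is to keep a valid Python class name but override the kind name, which the old db API allows via the kind() classmethod:
from google.appengine.ext import db

class OdkTest1(db.Expando):
    # Map this model onto the existing kind whose name isn't a valid
    # Python identifier. Expando is used here because ODK's property
    # names and types aren't known in advance.
    @classmethod
    def kind(cls):
        return 'opendatakit.test1'

q = OdkTest1.all()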
I would like to backup my app's datastore programmatically, on a regular basis.
It seems possible to create a cron job that backs up the datastore, according to https://developers.google.com/appengine/articles/scheduled_backups
However, I require a more fine-grained solution: creating different backup files for dynamically changing namespaces.
Is it possible to simply call the /_ah/datastore_admin/backup.create url with GET/POST?
Yes; I'm doing exactly that in order to implement some logic that couldn't be done with cron.
Use the taskqueue API to add the URL request, like this:
from google.appengine.api import taskqueue

taskqueue.add(url='/_ah/datastore_admin/backup.create',
              method='GET',
              target='ah-builtin-python-bundle',
              params={'kind': ('MyKind1', 'MyKind2')})
If you want to use more parameters that would otherwise go into the cron url, like 'filesystem', put those in the params dict alongside 'kind'.
Programmatically back up the datastore based on environment
This comes in addition to Jamie's answer. I needed to back up the datastore to Cloud Storage, based on the environment (staging/production). Unfortunately, this can no longer be achieved via a cron job, so I needed to do it programmatically and point a cron job at my script. I can confirm that the code below is working, since I saw some people complaining that they get a 404; however, it only works on a live environment, not on the local development server.
from datetime import datetime

from flask.views import MethodView
from google.appengine.api import taskqueue
from google.appengine.api.app_identity import app_identity

class BackupDatastoreView(MethodView):

    BUCKETS = {
        'app-id-staging': 'datastore-backup-staging',
        'app-id-production': 'datastore-backup-production'
    }

    def get(self):
        environment = app_identity.get_application_id()
        task = taskqueue.add(
            url='/_ah/datastore_admin/backup.create',
            method='GET',
            target='ah-builtin-python-bundle',
            queue_name='backup',
            params={
                'filesystem': 'gs',
                'gs_bucket_name': self.get_bucket_name(environment),
                'kind': (
                    'Kind1',
                    'Kind2',
                    'Kind3'
                )
            }
        )
        if task:
            return 'Started backing up %s' % environment

    def get_bucket_name(self, environment):
        return "{bucket}/{date}".format(
            bucket=self.BUCKETS.get(environment, 'datastore-backup'),
            date=datetime.now().strftime("%d-%m-%Y %H:%M")
        )
You can now use the managed export and import feature, which can be accessed through gcloud or the Datastore Admin API:
Exporting and Importing Entities
Scheduling an Export
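As a rough sketch, kicking off a managed export programmatically from Python via the Datastore Admin REST API could look like the following. The bucket name is a placeholder, and this assumes the google-auth library is installed and the caller has export permissions:
import google.auth
from google.auth.transport.requests import AuthorizedSession

credentials, project_id = google.auth.default(
    scopes=['https://www.googleapis.com/auth/datastore'])
session = AuthorizedSession(credentials)

# 'my-backup-bucket' is a placeholder; export files are written
# under this Cloud Storage prefix.
response = session.post(
    'https://datastore.googleapis.com/v1/projects/%s:export' % project_id,
    json={
        'outputUrlPrefix': 'gs://my-backup-bucket',
        'entityFilter': {'kinds': ['Kind1', 'Kind2']},
    })
print(response.json())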
I've been unsuccessful at connecting to Google Cloud SQL using SQLAlchemy 0.7.9 from my development workstation (in hopes of generating the schema using create_all()). I can't get past the following error:
sqlalchemy.exc.DBAPIError: (AssertionError) No api proxy found for service "rdbms" None None
I was able to successfully connect to the database instance using google_sql.py instancename, which initially opened a browser to authorize the connection. (It now appears to have cached the authorization, although I don't have the ~/Library/Preferences/com.google.cloud.plist file that https://developers.google.com/cloud-sql/docs/commandline indicates I should.)
Here is the simple application I'm using to test the connection:
from sqlalchemy import create_engine
engine = create_engine('mysql+gaerdbms:///myapp', connect_args={"instance":"test"})
connection = engine.connect()
The full stacktrace is available here - https://gist.github.com/4486641
It turns out the mysql+gaerdbms:/// driver in SQLAlchemy was only set up to use the rdbms_apiproxy DBAPI, which can only be used when accessing Google Cloud SQL from a Google App Engine instance. I submitted a ticket to SQLAlchemy to update the driver to use the OAuth-based rdbms_googleapi when not on Google App Engine, just like the Django driver provided in the App Engine SDK. rdbms_googleapi is also the DBAPI that google_sql.py (the remote SQL console) uses.
The updated dialect is expected to be part of 0.7.10 and 0.8.0 releases, but until they are available, you can do the following:
1 - Copy the updated dialect from the ticket to a file (e.g. gaerdbms_dialect.py)
"""Support for Google Cloud SQL on Google App Engine

Connecting
----------
Connect string format::

    mysql+gaerdbms:///<dbname>?instance=<project:instance>

# Example:
create_engine('mysql+gaerdbms:///mydb?instance=myproject:instance1')
"""
from sqlalchemy.dialects.mysql.mysqldb import MySQLDialect_mysqldb
from sqlalchemy.pool import NullPool
import re

class MySQLDialect_gaerdbms(MySQLDialect_mysqldb):

    @classmethod
    def dbapi(cls):
        # on App Engine, use rdbms_apiproxy; elsewhere, fall back
        # to the OAuth-based rdbms_googleapi
        from google.appengine.api import apiproxy_stub_map
        if apiproxy_stub_map.apiproxy.GetStub('rdbms'):
            from google.storage.speckle.python.api import rdbms_apiproxy
            return rdbms_apiproxy
        else:
            from google.storage.speckle.python.api import rdbms_googleapi
            return rdbms_googleapi

    @classmethod
    def get_pool_class(cls, url):
        # Cloud SQL connections die at any moment
        return NullPool

    def create_connect_args(self, url):
        opts = url.translate_connect_args()
        opts['dsn'] = ''  # unused but required to pass to rdbms.connect()
        opts['instance'] = url.query['instance']
        return [], opts

    def _extract_error_code(self, exception):
        match = re.compile(r"^(\d+):").match(str(exception))
        if match:
            return int(match.group(1))

dialect = MySQLDialect_gaerdbms
2 - Register the custom dialect (you can override the existing scheme)
from sqlalchemy.dialects import registry
registry.register("mysql.gaerdbms", "application.database.gaerdbms_dialect", "MySQLDialect_gaerdbms")
Note: 0.8 allows a dialect to be registered within the current process (as shown above). If you are running an older version of SQLAlchemy, I recommend upgrading to 0.8+ or you'll need to create a separate install for the dialect as outlined here.
3 - Update your create_engine('...') url, as the project and instance are now provided as part of the query string of the URL
mysql+gaerdbms:///<dbname>?instance=<project:instance>
for example:
create_engine('mysql+gaerdbms:///mydb?instance=myproject:instance1')
I think I have a working sample script as follows. Can you try a similar thing and let me know how it goes?
However (and you might already be aware of it), if your goal is to create the schema on a Google Cloud SQL instance, maybe you can create the schema against a local MySQL server, dump it with mysqldump, and then import the schema into Google Cloud SQL. Using a local MySQL server is very convenient for development anyway.
from sqlalchemy import create_engine
from google.storage.speckle.python.api import rdbms as dbi_driver
from google.storage.speckle.python.api import rdbms_googleapi

INSTANCE = 'YOURPROJECT:YOURINSTANCE'
DATABASE = 'YOURDATABASE'

def get_connection():
    return rdbms_googleapi.connect('', instance=INSTANCE, database=DATABASE)

def main():
    engine = create_engine(
        'mysql:///YOURPROJECT:YOURINSTANCE/YOURDATABASE',
        module=dbi_driver,
        creator=get_connection,
    )
    print engine.execute('SELECT 1').scalar()

if __name__ == '__main__':
    main()
Can the Blobstore in GWT/GAE be used as a database, or is a new Blobstore created each time I launch the application? I would like to store information without losing it when the application is closed, but I can't seem to find a way to name a Blobstore and then reference it by its ID. Thanks!
If all you want to do is store a string, I'd still suggest using the datastore.
Here's the complete python source to an App Engine app that retrieves, modifies, and stores some text in the datastore:
from google.appengine.ext import webapp, db
from google.appengine.ext.webapp import util

class TextDoc(db.Model):
    text = db.TextProperty(default="")

class MainHandler(webapp.RequestHandler):
    def get(self):
        my_text_doc = TextDoc.get_or_insert('my_text_doc')
        my_text_doc.text += "Blah, blah, blah. "
        my_text_doc.put()
        self.response.out.write(my_text_doc.text)

def main():
    application = webapp.WSGIApplication([('/', MainHandler)],
                                         debug=True)
    util.run_wsgi_app(application)

if __name__ == '__main__':
    main()
If you're working in Java it would be more verbose, but similar.