I am trying to test Google App Engine's new full text search functionality in Python with the development appserver.
Is there a stub for the search that allows one to test it with the testbed local unit testing?
The following is example code that throws an exception:
#!/usr/bin/python
from google.appengine.ext import testbed
from google.appengine.api import search

def foo():
    d = search.Document(doc_id='X',
                        fields=[search.TextField(name='abc', value='123')])
    s = search.Index(name='one').add(d)

tb = testbed.Testbed()
tb.activate()
# tb.init_search_stub()  ## does this exist?
foo()
The exception thrown by foo() is: AssertionError: No api proxy found for service "search". Has an api proxy been written for search?
Thoughts and comments appreciated.
UPDATE: this was valid in 2012. Things changed in 2013: the stub is now officially supported; see siebz0r's answer.
It's not in the list of supported stubs (yet, I assume), but there's a SearchServiceStub in simple_search_stub.py which looks like what you're after.
I haven't tested it myself, but you could try something like this:
from google.appengine.ext import testbed
from google.appengine.api.search.simple_search_stub import SearchServiceStub

SEARCH_SERVICE_NAME = 'search'
tb = testbed.Testbed()
tb.activate()
stub = SearchServiceStub()
tb._register_stub(SEARCH_SERVICE_NAME, stub)
SEARCH_SERVICE_NAME should be "search", and it should also be present in the SUPPORTED_SERVICES list; otherwise testbed will raise an exception.
To "inject" this new service stub, you can either modify the SDK's testbed/__init__.py or do it from your own code. It's hard to say which approach is better, since either way it's a hack until init_search_stub() officially appears on the list.
Also, the fact that it's not in the list yet probably means it's just not ready :) So, use it at your own risk.
It seems that since SDK 1.8.4 the search stub can be enabled from Testbed:
from google.appengine.api import search
from google.appengine.ext import testbed
tb = testbed.Testbed()
tb.activate()
try:
    tb.init_search_stub()
    index = search.Index(name='test')
    index.put(search.Document())
finally:
    tb.deactivate()
The error I'm getting is:
raise ImportError('No module named %s' % fullname)
ImportError: No module named _sqlite3
from _sqlite3 import *
from dbapi2 import *
import sqlite3
It has something to do with:
import sqlite3
Can anyone help me, please? I'm using Google App Engine for Python on a Windows 7 machine, in case that has something to do with it.
Any help would be much appreciated.
Thanks
Not really sure about your case, but this has helped me a couple of times. You should add _sqlite3 to the _WHITE_LIST_C_MODULES list in the Python sandbox module here:
[path_to_google_app_engine]/google/appengine/tools/devappserver2/sandbox.py
like this:
_WHITE_LIST_C_MODULES = [
    'array',
    '_ast',
    ...
    '_sqlite3',
]
As far as I am aware, Google App Engine does not support sqlite. It has its own database system, which uses a vaguely SQL-like language called GQL.
To prevent you from accidentally using the wrong database, the development environment has intercepted your import of sqlite and raised an error.
You can't directly use sqlite3 from our dev_appserver unless you're willing to modify our source. The reason is that dev_appserver is supposed to give you a development-time experience that simulates what's available (and not available) when you upload code to appspot.com. sqlite3 won't be available then.
Configuring the remote_api for App Engine on Python 2.7, I need to set up the configuration calls that create and configure the database call stubs, so that I don't have to replicate the configuration call in every REST resource and handler. The code I want to have is something similar to this:
def configure_remote_api():
    try:
        from google.appengine.ext.remote_api import remote_api_stub
        remote_api_stub.ConfigureRemoteApi(None, '/_ah/remote_api', auth_func, 'myapp.appspot.com')
    except ImportError:
        pass
What I want is to set it up so it is modularly called, and doesn't have to be replicated all over the application code, not even configure_remote_api(). This way, we can keep our codebase clean and have automatic remote_api use whenever developing locally. How can I do this?
Maybe you can put the call in appengine_config.py. That usually gets loaded pretty early on. (But please check.)
I've been accessing the traditional datastore from the command line as follows:
import os

from google.appengine.api import apiproxy_stub_map
from google.appengine.api.datastore_file_stub import DatastoreFileStub

os.environ['APPLICATION_ID'] = "myapp"
apiproxy_stub_map.apiproxy = apiproxy_stub_map.APIProxyStubMap()
stubname, stub = 'datastore_v3', DatastoreFileStub(os.environ["APPLICATION_ID"], Datastore, "/")
apiproxy_stub_map.apiproxy.RegisterStub(stubname, stub)
I've upgraded to the sqlite datastore and need to update the stub (and maybe stubname), presumably with DatastoreSqliteStub, but I can't seem to initialise it. Any suggestions?
Thanks!
Here is a little module I often reuse in my AppEngine projects: ae.py
It lets me just do:
import ae
ae.connect_local_datastore()
at the top of scripts. Or, with remote_api set up, you can also do:
ae.connect_remote_datastore()
A simple console.py script that makes use of this can be found here
Hope they help.
I'm programming an application with Google App Engine, with Django 1.1 (no Django patch or others). As you know, it's impossible to use Django's login and session features, so I downloaded
GAE Utilities and used its Session object (http://gaeutilities.appspot.com/), but sometimes this object creates 2 sessions instead of 1 session. Here's the code:
def index(request):
    aSWrap = SWrap(SWrap.createSession())
    ....

def login(request):
    aSWrap = SWrap(SWrap.createSession())
    ....

class SWrap(object):
    @classmethod
    def createSession():
        return Session(cookie_name='my_cookie', session_expire_time=7200)
Also, how do I set the session to have no expiration, or a really long expiration?
Thanks
Judging by the code, you're calling createSession twice within the same request. That will cause problems with David's library as well.
Also, gaeutilities session includes a config file where you can modify all the default values as you like.
https://github.com/joerussbowman/gaeutilities/blob/master/appengine_utilities/settings_default.py
gaeutilities session also has security features lacking in gae-sessions. I'm afraid David didn't attempt to answer your question, but rather just suggested you use his library, which under your current implementation would have the exact same problem. You need to be sure you only initiate the session once per HTTP request, no matter what session library you're using.
I'm moving gaeutilities session to a decorator in order to address this issue and to provide better performance. You can watch the master branch on GitHub for updates: https://github.com/joerussbowman/gaeutilities
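One way to enforce the once-per-request rule, whatever library you use, is to memoize the session on the request object. A minimal runnable sketch with a stand-in Session class (not gaeutilities' actual API):

```python
# Stand-in Session class; the real one would come from the session library.
class Session(object):
    def __init__(self, cookie_name, session_expire_time):
        self.cookie_name = cookie_name
        self.session_expire_time = session_expire_time

def get_session(request):
    # Create the session on first access, then reuse it for the rest of
    # the request, so index() and login() can't double-create it.
    if not hasattr(request, '_session'):
        request._session = Session(cookie_name='my_cookie',
                                   session_expire_time=7200)
    return request._session

class FakeRequest(object):
    pass

request = FakeRequest()
assert get_session(request) is get_session(request)  # same object both times
```

Both views would then call get_session(request) instead of constructing the session themselves.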
I suggest using a different sessions library. Check out this comparison of the available sessions libraries for GAE.
I'd recommend gae-sessions - it presents an API almost identical to the library you are currently using, but it is much faster and shouldn't give you headaches like the bug you've encountered above.
Disclaimer: I wrote gae-sessions, but I'm not the only one who would recommend it. Here is a recent thread discussing sessions on the google group for GAE python.
What are you trying to do with SWrap(SWrap.createSession())? It looks like the result of SWrap.createSession() is passed to the SWrap() constructor. Have you omitted part of the definition of SWrap?
Perhaps this is more what you are wanting:
def index(request):
    mysession = SWrap.createSession()
    ....

def login(request):
    mysession = SWrap.createSession()
    ....

class SWrap(object):
    @staticmethod
    def createSession():
        return Session(cookie_name='my_cookie', session_expire_time=7200)
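To see why @staticmethod fits here, a self-contained version of the pattern with a stand-in Session class (the real one needs App Engine):

```python
# Stand-in for gaeutilities' Session, just to make the pattern runnable.
class Session(object):
    def __init__(self, cookie_name, session_expire_time):
        self.cookie_name = cookie_name
        self.session_expire_time = session_expire_time

class SWrap(object):
    # A staticmethod takes no implicit cls/self argument, which matches
    # the zero-argument createSession() signature in the question.
    @staticmethod
    def createSession():
        return Session(cookie_name='my_cookie', session_expire_time=7200)

mysession = SWrap.createSession()
print(mysession.cookie_name)  # my_cookie
```

A @classmethod, by contrast, would receive the class as an implicit first argument, so the original zero-argument definition would fail when called.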
The CherryPy web server can supposedly be deployed in the Google App Engine.
Who has done it, and what was the experience like?
What special effort was required (configuration, etc.)?
Would you recommend it to others?
The article is a good example, but it's slightly out of date now, as the patch is no longer required; the latest version of CherryPy should run without it. I've gotten the sample below running in the development environment.
I've included cherrypy inside a zip file, since Google App Engine has a limit of one thousand files per application; it also makes it easier to deploy.
I'm also using the cherrypy dispatch handler to route the request.
import sys
sys.path.insert(0, 'cherrypy.zip')
import cherrypy

import wsgiref.handlers

class Root:
    exposed = True

    def GET(self):
        return "give a basic description of the service"

d = cherrypy.dispatch.MethodDispatcher()

conf = {
    '/': {
        'request.dispatch': d
    }
}

app = cherrypy.tree.mount(Root(), "/", conf)
wsgiref.handlers.CGIHandler().run(app)
So far I've not come across any particular issues but I have read some people have had issues with sessions.
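The zip trick relies on Python's built-in support for importing from zip archives (zipimport): any sys.path entry may be a .zip file. A self-contained demonstration that builds a throwaway archive at runtime:

```python
import os
import sys
import tempfile
import zipfile

# Build a small archive containing a single module.
tmpdir = tempfile.mkdtemp()
zip_path = os.path.join(tmpdir, 'bundle.zip')
with zipfile.ZipFile(zip_path, 'w') as zf:
    zf.writestr('greeting.py', 'MESSAGE = "hello from inside the zip"\n')

# Putting the archive on sys.path makes its modules importable,
# exactly like sys.path.insert(0, 'cherrypy.zip') above.
sys.path.insert(0, zip_path)
import greeting

print(greeting.MESSAGE)  # hello from inside the zip
```

This is why the cherrypy.zip line must run before `import cherrypy` in the sample above.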
See the boodebr.org article (missing, but available here on the Wayback Machine). It works for me.
If you are looking for an example, look for the condition that accepts ServerMode.GAE in ServerInterface.auto in this example.
There is a good article on how to do this over here (now here). I haven't actually tried it yet; I stuck with Django on App Engine, but it seems to be a solid example.