Writing to filesystem in App Engine development server - google-app-engine

I'm just trying out Scala and the Scalate templating system on an App Engine application. By default, Scalate tries to write the compiled template to the filesystem. Obviously this won't work on App Engine, and there is a way to precompile the templates, but I was wondering whether it is possible to switch off this restriction just during development. It slows down the compile/test cycle quite a bit.

You can in the Python dev server; I use it to log to a file during development:
import os

if os.environ.get('SERVER_SOFTWARE', '').startswith('Dev'):
    from google.appengine.tools.dev_appserver import FakeFile
    FakeFile.ALLOWED_MODES = frozenset(['a', 'r', 'w', 'rb', 'U', 'rU'])
If you want to write binary or unicode files you might need to add 'wb' or 'wU' to that list. Maybe there is something equivalent in the Java dev server.
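As a minimal sketch (the log path below is hypothetical, not from the original answer), once 'a' is in the allowed modes an ordinary file-backed logging handler works under the dev server:
import logging

# Hypothetical path; anything writable by the dev server process will do.
handler = logging.FileHandler('/tmp/devserver.log')
logger = logging.getLogger('devlog')
logger.setLevel(logging.INFO)
logger.addHandler(handler)
logger.info('this append only succeeds because ALLOWED_MODES now includes "a"')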

I'm currently using webpy, which has the same limitation: its templating system can't access the parser module (blocked) and can't write to the filesystem on Google App Engine, so you need to precompile the templates up front.
I have worked around this annoying issue with a Python script that triggers precompilation every time a file in a given directory changes.
I'm on OS X and I'm using FSEvents, but I believe you can find equivalent solutions/libraries on any other platform (incron on Linux, FileSystemWatcher on Windows):
from fsevents import Observer
from fsevents import Stream
from datetime import datetime
import subprocess
import os
import time

PROJECT_PATH = '/Users/.../Project/GoogleAppEngine/stackprinter/'
TEMPLATE_COMPILE_PATH = os.path.join(PROJECT_PATH, 'web', 'template.py')
VIEWS_PATH = os.path.join(PROJECT_PATH, 'app', 'views')

def callback(event):
    if event.name.endswith('.html'):
        subprocess.Popen('python2.5 %s %s %s' % (TEMPLATE_COMPILE_PATH, '--compile', VIEWS_PATH), shell=True)
        print '%s - %s compiled!' % (datetime.now(), event.name.split('/')[-1])

observer = Observer()
observer.start()
stream = Stream(callback, VIEWS_PATH, file_events=True)
observer.schedule(stream)

while not observer.isAlive():
    time.sleep(0.1)

I'd strongly advise against using AppEngine...
If you're just looking for free JVM/webapp hosting, then Stax.net offers a better alternative. Among other features, it allows you to write to the filesystem and to spawn threads.
They also use Scala internally, so they're very accommodating towards other Scala developers :)
Stax.net: http://www.stax.net/
(Note: I'm in no way affiliated with Stax)

Related

How to run Apache CXF wsdl2java with JDK 12?

The following command used to work flawlessly:
C:\tools\apache-cxf-3.3.1\bin\wsdl2java -client -d generated foo.wsdl
It no longer works with the latest version of the JDK (12). I have downloaded the latest version of Apache CXF and still get the same error:
-Djava.endorsed.dirs=C:\tools\apache-cxf-3.3.1\bin\..\lib\endorsed is not supported. Endorsed standards and standalone APIs
in modular form will be supported via the concept of upgradeable modules.
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
Could anyone offer a tip on how to remedy this?
I got the Apache CXF 3.3.1 wsdl2java utility to work with the latest OpenJDK 11 by doing 4 things:
Pull down this jar and place it into the {CXF_HOME}/lib directory: https://mvnrepository.com/artifact/javax.jws/jsr181-api/1.0-MR1
Pull down this jar and also place it in the {CXF_HOME}/lib directory: https://mvnrepository.com/artifact/javax.xml.ws/jaxws-api/2.3.1
In my case, since I'm running on a Mac, I vi'd the wsdl2java script and made sure these two jars are explicitly set on the CXF classpath, by adding the following declaration within the script right before the java command is executed:
cxf_classpath=${cxf_classpath}:../lib/jaxws-api-2.3.1.jar:../lib/jsr181-api-1.0-MR1.jar
Lastly, I removed the '-Djava.endorsed.dirs="${cxf_home}/lib/endorsed"' parameter from the java command at the end of the script, since newer JDKs no longer support this argument. The command now looks like this:
$JAVA_HOME/bin/java -Xmx${JAVA_MAX_MEM} -cp "${cxf_classpath}" -Djava.util.logging.config.file=$log_config org.apache.cxf.tools.wsdlto.WSDLToJava "$@"
Now, using OpenJDK11, I'm able to point to an external WSDL file and successfully generate the client code I need to consume this SOAP service with the following command:
./wsdl2java -client -d src https://somewhere.com/service\?wsdl
Whether or not this all works end-to-end is still TBD in terms of actually calling and consuming the SOAP service I'm coding against, but I've at least overcome the Java 9+ support issue with this tool, specifically for generating client code from a WSDL.
If your needs are different, I would at least remove the '-Djava.endorsed.dirs="${cxf_home}/lib/endorsed"' JVM parameter, start calling the wsdl2java command with the parameters you need, and iteratively add back the missing libs it throws java.lang.NoClassDefFoundError errors for.
Their FAQ specifically says that starting in 3.3.x, Java 9+ will be supported, but someone clearly dropped the ball between the no-longer-supported hardcoded JVM arguments still being passed by the utility and the libraries missing on newer JDKs, where these legacy libs have been removed.
Hope this helps someone out there unfortunate enough to also still be programming against SOAP endpoints while trying to keep the client-side code up to date and take advantage of the newer features of the modern JDK.

Error with module using Cloud Storage with Python and its tutorial

I'm trying out Google Cloud Storage to store images (I need it in an app that I'm developing) and I'm following the Bookshelf App tutorial from their site.
I'm using Python. The packages in requirements.txt all install fine, but when I try to execute the code I see this error:
...sandbox.py", line 948, in load_module
raise ImportError('No module named %s' % fullname)
ImportError: No module named cryptography.hazmat.bindings._openssl
I have tried hundreds of possible solutions: reinstalling only the cryptography package, trying different versions of the same module, and installing other packages that contain it, but nothing resolved the problem.
The requirements contains this:
Flask==0.10.1
gcloud==0.9.0
gunicorn==19.4.5
oauth2client==1.5.2
Flask-SQLAlchemy==2.1
PyMySQL==0.7.1
Flask-PyMongo==0.4.0
PyMongo==3.2.1
six==1.10.0
I'm sure it is a simple error, but I can't find a way to solve it.
Any help will be welcome. Thanks.
EDIT:
When I do this from a standalone Python program it works fine:
import os
from gcloud import storage
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = 'key.json'
client = storage.Client(project='xxxxxxx')
bucket = client.get_bucket('yyyyyyy')
f = open('profile.jpg', 'rb')
blob = bucket.blob(f.name)
blob.upload_from_string(f.read(), 'image/jpeg')
url = blob.public_url
print url
Why can't I use the gcloud library without errors in a GAE app?
It seems you're following the bookshelf tutorial, but according to this line in your stacktrace:
...sandbox.py", line 948, in load_module
It hints that you're using dev_appserver.py to run the code. This isn't necessary for Managed VMs/Flexible unless you're using the compat runtime.
If this is the case, the tutorial provides correct instructions:
$ virtualenv env
$ source env/bin/activate
$ pip install -r requirements.txt
$ python main.py
(If this is not the case, please feel free to comment on this with more details about how you're running your application).

How to export fossil-scm timeline to another format

I'm using Fossil SCM as my only solution for version control and tickets. So far, so good. Its self-contained and minimalist approach suits my needs. But I would like to start doing some analysis of the projects' history and development, and a good source for that is the project timelines. I could go with some HTML parsing, trying to convert the Fossil timeline output to something else, but I would like to know whether there is any option to export that info in another structured format (e.g. JSON or similar). A web search has not produced any useful findings on this issue. Any pointers to a solution?
Thanks,
Offray
Have you tried fossil json timeline branch trunk?
fossil help json
Usage: fossil json SUBCOMMAND ?OPTIONS?
In CLI mode, the -R REPO common option is supported. Due to limitations
in the argument dispatching code, any -FLAGS must come after the final
sub- (or subsub-) command.
The commands include:
anonymousPassword
artifact
branch
cap
config
diff
dir
g
login
logout
query
rebuild
report
resultCodes
stat
tag
timeline
user
version (alias: HAI)
whoami
wiki
Run 'fossil json' without any subcommand to see the full list (but be
aware that some listed might not yet be fully implemented).
Compile json when you build from source:
./configure --json
The key to making this work is to enable JSON support in Fossil by compiling it from source. The current version has it disabled, so looking for any clue about it in the command-line help originally got me nothing. Thanks to user 2612611 for the initial clue. Here is the procedure I followed:
Go to https://www.fossil-scm.org/download.html and download the source tarball package.
Uncompress the package.
Go to the folder where you uncompressed it (let's call it /uncompress-folder).
Run ./configure --json
Run make.
Optional: put your newly created fossil binary on your path or where the previous one was installed (something like sudo mv /uncompress-folder/fossil /usr/bin/fossil).
Open the fossil repository whose history you want to export and launch the fossil web interface (fossil ui).
Go to http://localhost:8080/json/timeline/checkin?limit=0, where http://localhost:8080 is your local fossil ui address and json/timeline/checkin?limit=0 is the JSON API call saying: JSON export of the timeline (/json/timeline) checkins (/checkin) for all history (?limit=0). If you put another integer instead of the 0 at the end of the URL, you will get the last n checkins.
From the command prompt you should be able to get the same result by running fossil json timeline checkin --limit=0 > timeline.json, storing the output in timeline.json instead of viewing it in the browser, but in my local test it didn't work (a short fetch-and-parse sketch follows).
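As an illustration (not part of the original answer), here is a minimal Python sketch that pulls that endpoint and walks the entries; the URL and field names follow the JSON API docs linked below and may differ between Fossil versions:
import json
import urllib2

# Assumes "fossil ui" is serving on localhost:8080.
url = 'http://localhost:8080/json/timeline/checkin?limit=0'
data = json.load(urllib2.urlopen(url))

# Checkin entries are expected under the response payload.
for entry in data.get('payload', {}).get('timeline', []):
    print entry.get('timestamp'), entry.get('comment')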
The API is still a moving target, but you can find documentation for this excellent project at [1] and a demo interface for testing the parameters at [2].
[1] https://docs.google.com/document/d/1fXViveNhDbiXgCuE7QDXQOKeFzf2qNUkBEgiUvoqFN4/view?pli=1#
[2] http://fossil.wanderinghorse.net/repos/fossil-sgb/json/

"ImportError: No module named _ssl" with dev_appserver.py from Google App Engine

Background
"In the Python runtime, we've added support for the Python SSL
Library, so you can now open secure connections to remote services
such as Apple's Push Notification service."
This quote is taken from a recent post on the Google App Engine blog.
Implementation
If you want to use native python ssl, you must enable it using the libraries configuration in your application's app.yaml file where you specify the library name "ssl" . . .
These instructions are provided for developers through the Google App Engine documentation.
The following lines have been added to the app.yaml file:
libraries:
- name: ssl
  version: latest
This much is in line with the advice provided through the Google App Engine documentation.
Problem
I have tried running my project in three different configurations. Two are working, and one is not.
Working ...
After I upload my application to Google App Engine, and run my project through the live server, everything works fine.
Working ...
When I run my project with manage.py runserver and include the Google App Engine SDK in my PYTHONPATH, everything works fine.
Not Working ...
However, when I run my project with dev_appserver.py, I get the following error:
ImportError at /
No module named _ssl
Request Method: GET
Request URL: http://localhost:8080/
Django Version: 1.4.3
Exception Type: ImportError
Exception Value:
No module named _ssl
Exception Location: /usr/local/lib/google_appengine_1.7.7/google/appengine/tools/devappserver2/python/sandbox.py in load_module, line 856
Python Executable: /home/rbose85/Code/venvs/appserver/bin/python
Python Version: 2.7.3
Python Path:
['/home/rbose85/Code/product/site',
'/usr/local/lib/google_appengine_1.7.7',
'/usr/local/lib/google_appengine_1.7.7/lib/protorpc',
'/usr/local/lib/google_appengine_1.7.7',
'/usr/local/lib/google_appengine_1.7.7',
'/usr/local/lib/google_appengine_1.7.7/lib/protorpc',
'/usr/local/lib/google_appengine_1.7.7',
'/usr/local/lib/google_appengine_1.7.7/lib/protorpc',
'/home/rbose85/Code/venvs/appserver/lib/python2.7',
'/home/rbose85/Code/venvs/appserver/lib/python2.7/lib-dynload',
'/usr/lib/python2.7',
'/usr/local/lib/google_appengine',
u'/usr/local/lib/google_appengine_1.7.7/lib/django-1.4',
u'/usr/local/lib/google_appengine_1.7.7/lib/ssl-2.7',
u'/usr/local/lib/google_appengine_1.7.7/lib/webapp2-2.3',
u'/usr/local/lib/google_appengine_1.7.7/lib/webob-1.1.1',
u'/usr/local/lib/google_appengine_1.7.7/lib/yaml-3.10']
Server time: Wed, 24 Apr 2013 11:23:49 +0000
For the current GAE version (1.8.0, at least until 1.8.3), if you want to be able to debug SSL connections in your development environment, you will need to tweak the GAE sandbox a little bit:
Add "_ssl" and "_socket" to the _WHITE_LIST_C_MODULES dictionary in /path-to-gae-sdk/google/appengine/tools/devappserver2/python/sandbox.py.
Replace the socket.py file provided by Google in /path-to-gae-sdk/google/appengine/dist27 with the socket.py file from your Python installation.
IMPORTANT: tweaking the sandbox environment might leave functionality working on your local machine but not in production (for example, GAE only supports outbound sockets in production). I recommend you restore your sandbox once you are done developing that specific part of your app.
The solution by jmg works, but instead of changing the SDK files, you can monkey-patch the relevant modules.
Just put something like this at the beginning of your project setup:
# Just taking Flask as an example
app = Flask('myapp')

if environment == 'DEV':
    import sys
    from google.appengine.tools.devappserver2.python import sandbox
    sandbox._WHITE_LIST_C_MODULES += ['_ssl', '_socket']
    # copy_of_stdlib_socket is a local copy of the stdlib socket.py kept in lib/
    from lib import copy_of_stdlib_socket as patched_socket
    sys.modules['socket'] = patched_socket
    socket = patched_socket
I had to use a slightly different approach to get this working in CircleCI (unsure what peculiarity about their venv config caused this):
appengine_config.py
import os

if os.environ.get('SERVER_SOFTWARE', '').startswith('Development'):
    import imp
    import os.path
    import inspect
    from google.appengine.tools.devappserver2.python import sandbox

    sandbox._WHITE_LIST_C_MODULES += ['_ssl', '_socket']

    # Use the system socket.
    real_os_src_path = os.path.realpath(inspect.getsourcefile(os))
    psocket = os.path.join(os.path.dirname(real_os_src_path), 'socket.py')
    imp.load_source('socket', psocket)
I had this problem because I wasn't vendoring ssl in my app.yaml file. I know the OP did that, but for those landing here for the OP's error, it's worth making sure lines like the following are in your app.yaml file:
libraries:
- name: ssl
  version: latest
Stumbled upon this thread while trying to work with Apple's Push Notification service and App Engine... I was able to get this working without any monkey patching by adding the SSL library in my app.yaml, as recommended in the official docs. Hope that helps someone else :)
I added the code to appengine_config.py as listed by Spain Train, but had to add the following code as well to get this to work:
phttplib = os.path.join(os.path.dirname(real_os_src_path), 'httplib.py')
imp.load_source('httplib', phttplib)
You can test whether ssl is available on your local system by opening a Python shell and typing import ssl. If no error appears, the problem is something else; otherwise you don't have the relevant libraries installed on your system. If you are using a Linux operating system, try sudo apt-get install openssl openssl-devel or the relevant instructions for your operating system to install them locally. If you are using Windows, these are the instructions.
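For example, a quick check in a plain Python shell (outside the dev server sandbox) might look like this:
# If this succeeds, the interpreter itself has the _ssl C module available.
import ssl
print ssl.OPENSSL_VERSION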

import pycrypto in dev_appserver.py on Google App Engine gives IOError

I am trying to test a Google App Engine app with dev_appserver.py, but when I run import Crypto I get the following, excerpted from the IOError (i.e. "No access") traceback:
...
import Crypto
...
File "/System/Library/Frameworks/Python.framework/Versions
/2.7/lib/python2.7/zipfile.py", line 867, in read
return self.open(name, "r", pwd).read()
File "/System/Library/Frameworks/Python.framework/Versions
/2.7/lib/python2.7/zipfile.py", line 882, in open
zef_file = open(self.filename, 'rb')
File "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/
GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google
/appengine/tools/dev_appserver_import_hook.py", line 592, in __init__
raise IOError(errno.EACCES, 'file not accessible', filename)
IOError: [Errno 13] file not accessible: '/Library/Python/2.7/site-packages
/pycrypto-2.3-py2.7-macosx-10.7-intel.egg'
I am on Mac OS X 10.7, with Google App Engine 1.6.6 using Python 2.7.
Since PyCrypto is supported on Google App Engine, I would expect it to work on the development server.
I am aware that dev_appserver.py prevents loading external files. However, I noted that appengine/tools/dev_appserver_import_hook.py seems to have all the requisite files in the whitelist (e.g. _fastmath).
Note, in app.yaml I have
libraries:
- name: pycrypto
  version: latest
It seems as though I am missing something obvious but crucial. Any thoughts would be appreciated.
EDIT For more details see: https://code.google.com/p/googleappengine/issues/detail?id=12129
Yes, you have to install the third-party library yourself.
Google explains exactly which versions they provide on their platform, so this should not be a problem.
The best way to get through this is to create a virtual environment and install pycrypto inside it. The reason the libraries in your app.yaml are not detected is most probably that you have multiple versions of Python installed on your machine, and the version you used to run the program might not be the same version where you installed the libraries.
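As an illustrative check (not from the original answer), you can confirm which interpreter and which PyCrypto installation are actually being picked up:
import sys
print sys.executable        # the Python binary that is running

import Crypto
print Crypto.__file__       # where PyCrypto was imported from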
