Running Lua-like scripts from a Go application on GAE - google-app-engine

I am looking for a scripting language like Lua to use with a Go application running on GAE. I found the golua and luar projects and planned to use them as the solution.
However, once I ran them in the GAE environment, I encountered:
"go-app-builder: Failed parsing input: parser: bad import "unsafe" in github.com/stevedonovan/luar/luar.go"
I was confused at first, but it turns out that GAE deliberately strips the unsafe package out (the sandbox disallows it). Since luar and golua both need that package, I have to find another solution.
Is there any way to use luar and golua on GAE? If not, are there any alternative scripting languages that will run in the GAE environment?

The just-announced Managed VMs will allow you to run App Engine applications on Compute Engine virtual machines. This allows access to the full range of libraries, filesystems, and sockets. As of today (April 20th, 2014), Managed VMs are in Limited Preview, so you'll need to fill out the form here to get access.
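At the time, a module opted into Managed VMs through a flag in its app.yaml. A minimal sketch for a Go app (the runtime and api_version values reflect that era's Go runtime):

runtime: go
api_version: go1
vm: true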

Related

Change runtime from Python to Go in App Engine standard environment

I have a website on App Engine that is 99% static. It is running on the Python 2.7 runtime. Now the time has come to evolve this webapp, and since there is almost no Python code in it, I'd prefer to write it in Go instead.
Can I change runtime from Python 2.7 to Go, while keeping the project intact? Specifically, I want to keep the same app-ID, the same custom domain attached to it, the same SSL certificate, and so on.
What do I have to do in order to do that? I surely have to change runtime in the app.yaml. Is there anything else?
Bonus question: will such change happen without a downtime?
I'd be grateful for any links to documentation on exactly that (swapping runtime on a live app). I can't find any.
Specify the new runtime as well as a new value for version in app.yaml. When deployed, you'll have an older version that is Python and a newer version that is Go. There won't be any downtime (the same as when deploying a newer version of the Python app).
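As a sketch, the app.yaml for the Go side could look like this (assuming the go111 standard runtime; adjust to whichever Go runtime you target):

runtime: go111

handlers:
- url: /.*
  script: auto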
Rather than trusting links/docs (which may be out of date or not 100% what you're trying to do), why not create a new GAE standard project for testing purposes and try it yourself? Having a GAE standard test project is good for testing new functionality (especially by other testers who won't have access to the dev environment on your laptop).
The GAE services offer complete code isolation. So it should be possible to simply deploy a new version of the service, which can be written in a different language or even use a different GAE (standard/flex) environment. Personally, I didn't go through a language change, but I did go through a split of a single-service app into a multi-service one, and I see no reason why the same principles wouldn't apply.
Maybe develop the new version as a separate app first, to be able to test it properly without risking an accidental impact on the old version, and only after that bring the code in as a new version of the old app. That would be using GAE's project-level isolation. You can, in fact, test the entire version migration as a separate app, if you so desire, without even touching the existing app. I am using this technique (a separate app ID) to implement a staging environment for my app, completely isolated from my production app; see How to copy / clone entire Google App Engine Project.
Make sure not to switch traffic to the new version at deployment time. This keeps the app working with the old version. First test that the new version works as expected using Targeted routing. Then maybe use Splitting traffic across multiple versions to perform A/B testing with just a small percentage of the traffic going to the new version. Finally, when happy with the results, switch all traffic to the new version.
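That flow, sketched with the gcloud CLI (the version names go-v1 and py-v1 are placeholders):

# Deploy the Go version without routing any traffic to it
gcloud app deploy --version go-v1 --no-promote

# Targeted routing: exercise the new version directly
curl https://go-v1-dot-YOUR_PROJECT_ID.appspot.com/

# Split traffic for A/B testing, then switch over completely
gcloud app services set-traffic default --splits go-v1=0.1,py-v1=0.9
gcloud app services set-traffic default --splits go-v1=1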
You need to pay special attention to the app-level configs (dispatch, cron, queue, datastore indexes), which are shared by all services/versions. They need to be functionally equivalent in the two versions. Service isolation doesn't apply to them; only project isolation can ensure no impact on the old version.
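For instance, dispatch rules live in a single app-wide dispatch.yaml, so whatever it routes must be handled correctly by both versions (a hypothetical sketch):

dispatch:
- url: "*/api/*"
  service: default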
There should be no need to make any change to the app ID, custom domain mapping or SSL config. The above mentioned tests should confirm that.
A few potentially interesting posts related to re-working services/modules:
Converting App Engine frontend versions to modules
Google App Engine upgrading part by part
Migrating to app engine modules, test versions first?
Advantages of implementing CI/CD environments at GAE project/app level vs service/module level?

Are Cloud Endpoints with Go on Google App Engine Standard possible?

I have implemented a simple API in Go on Google App Engine Standard using just:
func init() {
    http.HandleFunc("/api/v1/resource", submitResource)
}
Nothing special. However, I want to port this code to use Cloud Endpoints instead, in order to get the better monitoring and diagnostics.
Is it even possible with STANDARD instances or must I move to FLEXIBLE?
I can't find any documentation on this, nor answers to this seemingly simple question. At the moment I half wish I had chosen Python, because its support seems more mature. I chose Go because it seemed more appropriate for API-like code; my minimal research suggested Go offered better performance.
If it is possible, are there any pointers to how please?
Only Python and Java are supported on GAE Standard via the Endpoints Frameworks. However, Go is supported on GAE Flexible.
Here is the Go GAE Flexible sample:
https://github.com/GoogleCloudPlatform/golang-samples/tree/master/endpoints/getting-started
After much research and trial and error, the simple answer is "No." - as of Dec 2016.
The longer answer is that it's possible if you want to put far too much effort into maintaining up-to-date libraries of your own. There is basically no support, even in alpha, for the current Google Cloud Endpoints using Go with Google App Engine Standard.
It's possible to run Go + Endpoints on the GAE Standard environment; however, the libraries might be outdated now.
Libraries and sample app can be found on github:
https://github.com/GoogleCloudPlatform/go-endpoints
I have successfully deployed the "Greetings" sample as an App Engine standard environment app, and it works.
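For reference, here is a condensed sketch of a service built with that library, based on its README of the time (type and method names are illustrative, and the library is no longer maintained):

package greetings

import (
    "log"

    "github.com/GoogleCloudPlatform/go-endpoints/endpoints"
    "golang.org/x/net/context"
)

// Greeting is a single greeting message.
type Greeting struct {
    Message string `json:"message"`
}

// GreetingsListReq bounds the number of results returned.
type GreetingsListReq struct {
    Limit int `json:"limit"`
}

// GreetingsList is the response payload for List.
type GreetingsList struct {
    Items []*Greeting `json:"items"`
}

// GreetingService defines the API methods.
type GreetingService struct{}

// List returns the most recent greetings.
func (gs *GreetingService) List(c context.Context, r *GreetingsListReq) (*GreetingsList, error) {
    return &GreetingsList{Items: []*Greeting{{Message: "hello"}}}, nil
}

func init() {
    api, err := endpoints.RegisterService(&GreetingService{},
        "greeting", "v1", "Greetings API", true)
    if err != nil {
        log.Fatalf("register service: %v", err)
    }

    // Expose List as GET /greetings in the discovery document.
    info := api.MethodByName("List").Info()
    info.Name, info.HTTPMethod, info.Path = "greets.list", "GET", "greetings"

    endpoints.HandleHTTP()
}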

A plea for a basic Notebook example getting data into and out of Google Cloud Datalab

I have started to try out Google Cloud Datalab. While I understand it is a Beta product, I find the docs very frustrating, to say the least.
The questions here, the lack of responses, and the lack of new revisions or docs over the several months the project has been available make me wonder whether there is any commitment to the product.
A good beginning would be a notebook that shows data ingestion from external sources into both the Datastore system and the BigQuery system, since that is a common use case. I'd like to use my own data, and it would be great to have a notebook to ingest it. That seems doable without huge effort, and it would get me (and others) out of this mess of trying to link the various terse docs from various products and workspaces up and working together.
In addition, a better explanation of the GitHub connection process would help (see my prior question).
For BigQuery, see here: https://github.com/GoogleCloudPlatform/datalab/blob/master/content/datalab/tutorials/BigQuery/Importing%20and%20Exporting%20Data.ipynb
For GCS, see here: https://github.com/GoogleCloudPlatform/datalab/blob/master/content/datalab/tutorials/Storage/Storage%20Commands.ipynb
Those are the only two storage options currently supported in Datalab (which should not, in any event, be used for large-scale data transfers; these are for small-scale transfers that can fit in memory in the Datalab VM).
For Git support, see https://github.com/GoogleCloudPlatform/datalab/blob/master/content/datalab/intro/Using%20Datalab%20-%20Managing%20Notebooks%20with%20Git.ipynb. Note that this has nothing to do with Github, however.
As for the low level of activity recently, that is because we have been heads down getting ready for GCP Next (which happens this coming week). Once that is over we should be able to migrate a number of new features over to Datalab and get a new public release out soon.
Datalab isn't running on your local machine; just the presentation part is in your browser. So if you mean the browser client machine, that wouldn't be a good solution: you'd be moving data from the local machine to the VM that runs the Datalab Python code (and that VM has limited storage space), and then moving it again to the real destination. Instead, you should use the Cloud Console or (preferably) the gcloud command line on your local machine for this.
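A sketch of that local-machine path with the standard CLI tools (the bucket, dataset, and file names are placeholders):

# Copy a local CSV into Cloud Storage...
gsutil cp ./people.csv gs://my-bucket/people.csv

# ...then load it into a BigQuery table, letting bq infer the schema.
bq load --autodetect --source_format=CSV mydataset.people gs://my-bucket/people.csv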

How to launch app from the web/cloud

I have developed an app with Twilio which I would like to run from the cloud. I tried learning about AWS and Google App Engine but am quite confused at this stage.
I have 2 questions which I hope to get your help on:
1) How can I store my scripts and database in the cloud? Right now everything runs on my local machine, but I would like to transfer the scripts and DB to another server and run my app at a predetermined time of day. What would be the best way to do this?
2) How can I write a batch file to run my app at a predetermined time of day in the cloud?
I understand this question does not include code, but I really hope someone can point me in the right direction. I have spent lots of time trying to understand this myself but am still unsure. Thanks in advance.
Update: The application is a Twilio app that makes calls to people; the script simply applies an algorithm to make the calls in a certain fashion, and the database is a MySQL DB that provides the details of the people to be called.
It is difficult to provide an exact answer without understanding what the application is, what the DB is, or what the script you wish to run does.
I can give you a couple of ideas that might be helpful in such cases.
OpsWorks (http://aws.amazon.com/opsworks/) is a managed service for deploying and operating applications. You can define your stack (multiple layers like web, workers, DB...) and which Chef recipes should run at various points in the life of the instances in each layer (startup, shutdown, app deployment, or stack modification). Then you can use the ability to add instances to each layer on specific days and at specific hours to implement the run-at-predetermined-times functionality you asked for.
In such a solution you can either keep some of your instances (like the DB) always on, or even bootstrap them every day using the Chef recipes, restoring from a snapshot on start and creating a snapshot on shutdown.
Another AWS service that you can use is Data Pipeline (http://aws.amazon.com/datapipeline/). It is designed to move data periodically between data sources, for example from a MySQL database to Amazon Redshift, the data warehouse service. But you can also use it to trigger arbitrary shell scripts (http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-object-shellcommandactivity.html) and schedule them to run under various conditions, like every hour/day or at specific times (http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-concepts-schedules.html).
A simple path here would be just to create an EC2 instance in AWS and put the components needed to run your app there. A thorough walkthrough is here:
http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/get-set-up-for-amazon-ec2.html
Essentially you will create an EC2 virtual machine, which you can for most purposes treat just like any other Linux server. You can install MySQL on it, copy your script there, and run it. Of course whatever container or support libraries your code requires will need to be installed as well.
You don't say what OS you are using locally, but if it is Mac or Linux, you should be able to get your script running on an EC2 instance by following almost the same process you used on your local machine.
As you get to know AWS, there are sophisticated services you can use for deployment, infrastructure orchestration, database services, and so on. But just to get started running a script from a virtual machine should be pretty straightforward.
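For the run-at-a-predetermined-time part of the question, the standard tool on a Linux EC2 instance is cron. A minimal sketch, assuming a Python script (the path and schedule are placeholders); add it with crontab -e:

# Run the calling script every day at 09:00 instance time,
# appending all output to a log file.
0 9 * * * /usr/bin/python /home/ec2-user/twilio_caller.py >> /home/ec2-user/caller.log 2>&1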
I recently developed a Twilio application using Ruby on Rails for the backend and found Heroku extremely simple to set up and launch. While Heroku does cost more than AWS, I found that the time I saved using Heroku more than made up for this. As an early-stage startup, we wanted to spend our time developing important features, not "wasting" time optimizing our AWS cloud.
However, while I believe Heroku is ideal for early-stage websites/startups, I do believe hosting should be reevaluated once a company reaches a certain size. At some point it becomes economically viable to devote resources to optimizing an AWS cloud solution, because it will be cheaper than Heroku in the long run.

Google App Engine for an application in a local network

I have an application that will run on a local network, where a number of devices should interact with a database. I could use XAMPP and go for CherryPy or any other Python framework (Python is usually my choice), but that is the sum of a lot of different things: Python, Apache, MySQL... With GAE, which I have previously used successfully in a number of applications, I feel everything is neatly packed in a single box. That may not be true, but using the Google App Engine Launcher to create a local working copy of an app couldn't be easier.
But is it reliable? Should it be used like that? I know the SDK server is intended for development, so I'm unsure about using it as a local server in production. A few versions ago there was even a nasty bug that flushed the local datastore from time to time, but it seems they fixed it and now data persists.
Would you recommend GAE for an application running in a local network or should I stick to LAMP (P for Python)?
Another alternative is AppScale: http://code.google.com/p/appscale/.
Maybe you can check out the TyphoonAE project. I think it is exactly what you need.
The TyphoonAE project aims at providing a full-featured and productive serving environment to run Google App Engine (Python) applications. It delivers the parts for building your own scalable App Engine while staying compatible with Google's API.
