I am connecting via SSH to one of my App Engine Flex instances, which has a .NET Core application running on it, and I see this:
Where does that Ruby process (with 24% CPU usage) come from? Is it some internal Google service?
The running Ruby process is /usr/sbin/google-fluentd. This package contains the logging agent that underpins Stackdriver Logging, and it is written as a Ruby gem, as explained in this document. In short, the Ruby process is using CPU because of your application's logging.
As an aside, I noticed that the screenshot you uploaded contains your account ID and project ID. I strongly suggest you re-upload the picture without this information, for security and privacy reasons.
I have started trying to use Google Cloud Datalab. While I understand it is a beta product, I find the docs very frustrating, to say the least.
The questions here and the lack of responses, as well as the lack of new revisions or docs over the several months the project has been available, make me wonder whether there is any commitment to the product.
A good starting point would be a notebook that shows data ingestion from external sources into both the Datastore system and the BigQuery system. That is a common use case. I'd like to use my own data, and it would be great to have a notebook to ingest it. It seems that should be doable without huge effort, and it would get me (and others) out of this mess of trying to link the terse docs from the various products and workspaces together.
In addition, a better explanation of the GitHub connection process would help (prior question).
For BigQuery, see here: https://github.com/GoogleCloudPlatform/datalab/blob/master/content/datalab/tutorials/BigQuery/Importing%20and%20Exporting%20Data.ipynb
For GCS, see here: https://github.com/GoogleCloudPlatform/datalab/blob/master/content/datalab/tutorials/Storage/Storage%20Commands.ipynb
Those are the only two storage options currently supported in Datalab (which should not be used in any event for large scale data transfers; these are for small scale transfers that can fit in memory in the Datalab VM).
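If you want to load data into BigQuery programmatically rather than through a notebook, a rough, hedged sketch using the standard google-cloud-bigquery Python client (not the Datalab notebook magics linked above) might look like this; the bucket, dataset, and table names are placeholders:

```python
# Rough sketch (not the Datalab notebook magics linked above): load a CSV that
# already sits in GCS into a BigQuery table with the google-cloud-bigquery
# client library. Bucket, dataset, and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses the project's default credentials

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # skip the header row
    autodetect=True,       # let BigQuery infer the schema
)

load_job = client.load_table_from_uri(
    "gs://my-bucket/my-data.csv",        # placeholder GCS path
    "my_project.my_dataset.my_table",    # placeholder destination table
    job_config=job_config,
)
load_job.result()  # wait for the load job to finish

table = client.get_table("my_project.my_dataset.my_table")
print(f"Loaded {table.num_rows} rows")
```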
For Git support, see https://github.com/GoogleCloudPlatform/datalab/blob/master/content/datalab/intro/Using%20Datalab%20-%20Managing%20Notebooks%20with%20Git.ipynb. Note that this has nothing to do with GitHub, however.
As for the low level of activity recently, that is because we have been heads down getting ready for GCP Next (which happens this coming week). Once that is over we should be able to migrate a number of new features over to Datalab and get a new public release out soon.
Datalab isn't running on your local machine. Just the presentation part is in your browser. So if you mean the browser client machine, that wouldn't be a good solution - you'd be moving data from the local machine to a VM which is running the Datalab Python code (and this VM has limited storage space), and then moving it again to the real destination. Instead, you should use the cloud console or (preferably) gcloud command line on your local machine for this.
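For completeness, here is a small, hypothetical sketch of pushing a local file straight to a GCS bucket from your own machine with the google-cloud-storage Python client, rather than routing it through the Datalab VM; the bucket and file names are placeholders:

```python
# Hypothetical sketch: upload a file from the local machine directly to a GCS
# bucket, bypassing the Datalab VM. Bucket and file names are placeholders.
from google.cloud import storage

client = storage.Client()                       # local gcloud credentials
bucket = client.bucket("my-destination-bucket")
blob = bucket.blob("uploads/local-data.csv")    # object name inside the bucket
blob.upload_from_filename("local-data.csv")     # file on the local machine
print(f"Uploaded to gs://{bucket.name}/{blob.name}")
```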
What is meant by "staging instances" of your application in this Google Cloud Debugger doc? According to this question, it seems that only the default version of an application can be used with the Google Cloud Debugger. Does Google simply intend that you use this with the default version of a separate application that you dedicate to your staging environment?
Google Cloud Debugger can be used with any module or any version of your running application. The question you referred to is obsolete.
Many projects set up a module/version, or simply another project, to be used as their 'staging instance' to test their code before pushing to the production instance. There is no single way to implement a 'staging instance'.
In programming, the term staging mostly refers to a development (testing) environment.
In App Engine, you can have multiple instances of your application: one in a production environment (one instance) and another in a staging environment (another instance).
Production means the version currently running and available to the public. Staging means a version running only for testing purposes (usually new features) before it is deployed to the public.
When Google says:
You can use the Cloud Debugger on both production and staging instances of your application
it simply means that you can use the Cloud Debugger on both the production and the staging (development) versions of your application.
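To make that concrete, here is an illustrative (not official) sketch of how an application might branch on which App Engine version is serving; the environment variable names depend on the runtime generation, and the "staging" version label is just a naming convention you would choose when deploying:

```python
# Illustrative sketch: branch behaviour on the serving App Engine version.
# The environment variable name depends on the runtime generation
# (GAE_VERSION on newer runtimes, CURRENT_VERSION_ID on older ones), and the
# "staging" prefix is only a naming convention chosen at deploy time.
import os

version = os.environ.get("GAE_VERSION") or os.environ.get("CURRENT_VERSION_ID", "")

if version.startswith("staging"):
    # e.g. point at a test database, enable verbose logging, etc.
    DEBUG = True
else:
    DEBUG = False
```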
I have an AngularJS site consuming an API written in Sinatra.
I'm simply trying to deploy these 2 components together on an AWS EC2 instance.
How would one go about doing that? What tools do you recommend? What structure do you think is most suitable?
Cheers
This is based upon my experience of using the HashiCorp line of tools.
Manual: Launch an Ubuntu image, gem install sinatra, and deploy your code. Take a snapshot for safekeeping. This one-off approach is good for a development box, to iron out the configuration process. Write down the commands you run and any options you may need.
Automated: Use the Packer EC2 Builder and Shell Provisioner to automate your commands from the previous manual approach. This will give you a configured AMI that can be launched.
You can apply different methods of getting to an AMI using different toolsets. However, in the end, you want a single immutable image that can be deployed repeatedly.
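As one possible follow-on, here is a hypothetical sketch of launching an instance from the AMI that Packer produced, scripted with boto3; the AMI ID, instance type, region, and key name are placeholders for whatever your build and AWS account actually use:

```python
# Hypothetical sketch: launch an EC2 instance from the AMI that Packer built.
# The AMI ID, region, instance type, and key name are placeholders.
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")

instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder: the AMI Packer produced
    InstanceType="t2.micro",
    KeyName="my-keypair",             # placeholder SSH key pair
    MinCount=1,
    MaxCount=1,
)
print("Launched", instances[0].id)
```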
I am looking for a way to use a scripting language like Lua with a Go application running on GAE. I have found the golua and luar projects and planned to use them as the solution.
However, once I ran them in the GAE environment, I encountered:
"o-app-builder: Failed parsing input: parser: bad import "unsafe" in
github.com/stevedonovan/luar/luar.go"
I was confused, but finally found that GAE apparently strips out the unsafe package for a reason. Since luar and golua need that package, I think I have to find a new solution.
Is there any way to use luar and golua on GAE? If not, are there any alternative scripting languages that will run in the GAE environment?
The just-announced Managed VMs will allow you to run App Engine applications on Compute Engine virtual machines. This allows access to the full range of libraries, filesystems, and sockets. As of today (April 20th, 2014), Managed VMs are in Limited Preview, so you'll need to fill out the form here to get access.
I have this application that will run on a local network, where a number of devices should interact with a database. I could use XAMPP and go for CherryPy or any other Python framework (Python is usually my choice), but that means the sum of a lot of different things: Python, Apache, MySQL... With GAE, which I have previously used successfully in a number of applications, I feel everything is neatly packed in a single box. That may not be true, but using the Google App Engine Launcher to create a local working copy of an app couldn't be easier.
But is it reliable? Should it be used like that? I know it's intended for development, so I'm unsure about using it as a local server in production. A few versions ago there was even a nasty bug that flushed the local datastore from time to time, but it seems they have fixed it and data now persists.
Would you recommend GAE for an application running in a local network or should I stick to LAMP (P for Python)?
Another alternative is AppScale: http://code.google.com/p/appscale/.
Maybe you can check out the TyphoonAE project. I think it is exactly what you need.
The TyphoonAE project aims at providing a full-featured and productive serving environment to run Google App Engine (Python) applications. It delivers the parts for building your own scalable App Engine while staying compatible with Google's API.