I am new to App Engine and have installed google-cloud-sdk from the AUR (Arch User Repository) along with the google-appengine-go extension at /opt/google-cloud-sdk.
Thanks to this I am able to run a dev server using
dev_appserver.py app.yaml
But when I run goapp serve I get
goapp: command not found
After adding /opt/google-cloud-sdk/platform/google_appengine to my $PATH variable in .zshrc and running goapp serve, I now get the error
zsh: permission denied: goapp
and if I run sudo goapp serve:
sudo: goapp: command not found
Because of this I am unable to use the updated SDK to run tests with goapp test.
Thank you in advance for your help.
I had the same problem and I think I figured out how it usually works.
You download the Google Cloud SDK (https://cloud.google.com/sdk/downloads).
After downloading and unzipping it to the folder where you want to use it, you have to execute ./google-cloud-sdk/install.sh.
App Engine is not part of the download.
It can be selected in that install.sh script, which will then download components such as App Engine.
Afterwards you have a folder called
platform/google_appengine
as you mentioned yourself.
You might have to change execution permissions like
chmod 755 platform/google_appengine/go*
Add the folder platform/google_appengine to your PATH if you have not done so already.
The command which will not show non-executable binaries.
If you did not change the permissions, it will not show the path, even though the folder is in your PATH variable.
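Putting it together, a rough end-to-end sketch, assuming the AUR install location /opt/google-cloud-sdk from the question:
# make the Go App Engine binaries executable (sudo may be needed for /opt)
sudo chmod 755 /opt/google-cloud-sdk/platform/google_appengine/go*
# add the folder to PATH (e.g. in ~/.zshrc)
export PATH=$PATH:/opt/google-cloud-sdk/platform/google_appengine
# verify that goapp is now found and executable, then run it
which goapp
goapp serve app.yaml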
I just started working with google cloud on a project (using VM instances). I connected to SSH straight from the browser.
I will have thousands of .txt files in a few directories, and the "Download file" option only allows me to download 1 file at a time.
What's the easiest way to download all those files (or the whole directory) straight to my computer? Or, what method should I use/learn?
The easiest way will be to install the Cloud SDK on your local machine (see installation instructions here) and use the gcloud compute scp command to download your files or directories. For example:
gcloud compute scp --recurse vm-instance:~/remote-directory ~/local-directory
This will copy a remote directory, ~/remote-directory, from vm-instance to the ~/local-directory directory of your local host. You'll find more details about this command usage here.
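If the directory holds thousands of small files, it can be quicker to archive them on the VM first and copy a single file. A sketch using the same illustrative names (vm-instance, ~/remote-directory):
# pack the remote directory into one archive on the VM
gcloud compute ssh vm-instance --command 'tar czf /tmp/remote-directory.tar.gz -C ~ remote-directory'
# copy the single archive to the local machine
gcloud compute scp vm-instance:/tmp/remote-directory.tar.gz ~/local-directory/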
I would like to transfer file from my local machine to Google cloud instance. Here is my command:
gcloud compute scp "C:\Temp\esim_replication.ipynb" nlp-3:
Here is error message:
pscp: unable to open ./esim_replication.ipynb: permission denied
ERROR: (gcloud.compute.scp) [C:\Program Files (x86)\Google\Cloud SDK\google-clou
d-sdk\bin\sdk\pscp.exe] exited with return code [1].
This is brand new error. Everything worked fine 2 weeks ago. I am on Windows 7 locally and ran cmd as Administrator. I tried the above command with and without quotations.
Any suggestions?
Connect to the instance via SSH using gcloud:
gcloud beta compute ssh --zone "your_zone" "instance_name" --project "project_name"
Give full access to your file:
sudo chmod 777 esim_replication.ipynb
In case someone finds this like I did: I had a similar error message, and what did the trick for me was using sudo: sudo gcloud compute scp [LOCAL] [REMOTE]. Apparently there was the need for updating the project ssh metadata (even though copying in the other direction worked just fine).
I encountered the same error while transferring from my local Windows desktop to a Debian VM in GCP.
I changed the permissions of the destination folder on the VM to 777.
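That permission change would look something like this on the VM (the folder name is just a placeholder):
# run on the VM, e.g. after connecting with gcloud compute ssh VM_instance_name
sudo chmod 777 destination_folder
Then I ran the copy again from my local machine: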
gcloud compute scp source_folder/File1.txt VM_instance_name:destination_folder
It worked!
What? From a Windows machine?
'sudo' is not recognized as an internal or external command,
operable program or batch file.
It turned out that I already had an identically named file at the destination. This caused the error. But Patrick W's comment is very helpful.
I am trying to run gcloud init to initialize the Google App Engine SDK by typing ./google-cloud-sdk/bin/gcloud init, but it showed: no such file or directory or command not found. Is something wrong with my PATH? My path is:
/Users/AnneLutz/Documents/google-cloud-sdk\
If you are typing ./google-cloud-sdk/bin/gcloud init and you installed the Cloud SDK in /Users/AnneLutz/Documents/google-cloud-sdk, then your current directory should be /Users/AnneLutz/Documents in order for what you type to work.
That said, you should add /Users/AnneLutz/Documents/google-cloud-sdk/bin to your path. To do this, assuming you are using bash, you can run
source /Users/AnneLutz/Documents/google-cloud-sdk/path.bash.inc
To make this happen every time you start your shell, add it to your shell profile. For example, you can add the above source command at the end of your ~/.bash_profile file.
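A minimal sketch of that step (path taken from the question):
# append the source line to your profile
echo 'source /Users/AnneLutz/Documents/google-cloud-sdk/path.bash.inc' >> ~/.bash_profile
# reload the profile in the current shell
source ~/.bash_profile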
It looks like you used the option to download the SDK zip file and are then trying to configure your environment with that download option. If you aren't comfortable with setting environment variables, you might want to instead try installing using the "interactive" installer, which will automate the steps for making the commands always available on your system.
The directions are here, but for macOS users they are basically:
Enter the following at a command prompt:
curl https://sdk.cloud.google.com | bash
Restart your shell:
exec -l $SHELL
Run gcloud init to initialize the gcloud environment:
gcloud init
For many, this procedure is easier than getting everything configured manually.
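Afterwards you can confirm that the commands are on your PATH with something like:
which gcloud
gcloud version
gcloud components list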
I have to install a plugin on a Red Hat server where Nagios is already configured.
The plugin to be installed is inode_checker, which I got from this link:
how to install inode checker in nagios
When I opened this link I found a shell script there.
Now I am not sure whether I have to place the shell script directly on the server in /usr/local/nagios/libexec/, or whether there is another way to do it, since the other plugins available in this location seem to be different and I am not able to open them.
What am I doing wrong here? Please advise.
Yes, this is a bash script, so simply download it and place it in the folder where your other scripts sit. Make sure to make it executable, like
chmod +x scriptname
Then you should be able to use it in Nagios by creating a Command object (a sketch follows below). You can find the folder where your scripts are located by looking at the resources.cfg file, which should hold something like this:
$USER1$=/usr/lib64/nagios/plugins
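A minimal Command definition could then look something like this (the command_name and arguments are illustrative; check the script itself for the options it actually accepts):
define command{
        command_name    check_inodes
        command_line    $USER1$/inode_checker.sh $ARG1$ $ARG2$
        }
You would then reference check_inodes from the check_command of a service definition.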
Hope this helps.
I deployed an app with gcloud preview app deploy.
Is there a way to download it to another local machine?
How can I get the files? I tried it via ssh with no success (I can't access the Docker directory).
UPDATE:
I found this:
gcloud preview app modules download default --version 1 --output-dir=my_dir
but it does not download any files.
Log:
Downloading module [default] to [my_dir/default]
Fetching file list from server...
|- Downloading [0] files... -|
I am coming to Google App Engine after two years, I see that they have made lots of improvements and added tons of features. But sadly, their documentation sometimes leaves much to be desired.
I used to download the code of the uploaded version with appcfg.py using the following command:
appcfg.py download_app -A <app_id> -V <version> <output-dir>
But of course they have now consolidated everything into gcloud, where appcfg.py is not accessible.
However, the following method helped me to download the deployed code:
Go to the console and into Google App Engine.
Select the project you want to work with.
Once the project's dashboard opens, click on the top right to open the built-in console window.
This should load the Cloud Shell at the bottom; if you check now, appcfg.py is available for you to use in this VM.
Hence, use appcfg.py download_app -A <app_id> -V <version> <output-dir> to download the code.
Now, once you have the code in the desired folder, in order to download it to your local machine you can open the Docker code editor.
Here I assumed that if I right-clicked and exported the desired folder it would work, but instead it gave me the following error message:
{"Error":"'concurrency' must be a number but it is [object Undefined]","Message":"'concurrency' must be a number but it is [object Undefined]"}
So I thought maybe it would play along nicely if the folder were an archive. Go back to the Cloud Shell and, using whatever utility you fancy, make an archive of the folder:
zip -r mycode.zip mycode
Go back to the Docker code editor, then export and download the archive.
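Alternatively, if your Cloud Shell session ships the cloudshell helper, a one-liner like this may work instead of the code editor (not verified on older environments):
cloudshell download mycode.zip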
Of course there might be many more ways to do it (hopefully), but this is what made sense to me after returning to Google App Engine after 2 years.
Currently, the best way to do this is to pull the files out of Docker.
Put instance into self-managed mode, so that you can ssh into it:
$ gcloud preview app modules set-managed-by default --version 1 --self
Find the name of the instance:
$ gcloud compute instances list | grep gae-default-1
Copy it out of the Docker container, change the permissions, and copy it back to your local machine:
$ gcloud compute ssh --zone=us-central1-f gae-default-1-1234 'sudo docker cp gaeapp:/app /tmp'
$ gcloud compute ssh --zone=us-central1-f gae-default-1-1234 "chown -R $USER /tmp/app"
$ gcloud compute copy-files --zone=us-central1-f gae-default-1-1234:/tmp/app /tmp/
$ ls /tmp/app
Dockerfile
[...]
IMHO, the best option today (Aug 2018) is:
Under the main menu, under Products, go to Tools -> Cloud Build -> Build history.
There, click the ID of the build you want.
Then, in the window that opens (Build details), click the source link and the download of your compressed code begins.
As simple as that.
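If you prefer the command line, the same source archive can usually be fetched with gcloud and gsutil. A sketch (BUILD_ID, BUCKET and OBJECT are placeholders; gcloud builds describe prints the real storageSource location):
# list recent builds and pick the one you want
gcloud builds list --limit=5
# show where the source archive for that build is stored
gcloud builds describe BUILD_ID --format='value(source.storageSource.bucket, source.storageSource.object)'
# download the archive
gsutil cp gs://BUCKET/OBJECT ./source.tgz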
HTH.
As of Feb 2021, you can install appengine-sdk using pip
pip install appengine-sdk
Once installed, appcfg can be used to download the app code.
python -m appcfg download_app -A app_id [ -V version ] out-dir
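For example, a hypothetical invocation with placeholder values:
python -m appcfg download_app -A my-project-id -V 20210201t120000 ./downloaded-app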
Nothing else worked for me. Finally I found the source code this way: go to Google Cloud Storage, choose the bucket starting with us.artifacts...., select containers > images, and download the latest object (look at the created date). Unzip the downloaded file; it will contain all the deployed App Engine source code.
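A rough command-line equivalent with gsutil (PROJECT_ID and DIGEST are placeholders; the objects are named by content digest, and each one is typically a gzipped tar):
# find the staging bucket
gsutil ls
# list the stored image layers with their creation times
gsutil ls -l gs://us.artifacts.PROJECT_ID.appspot.com/containers/images/
# download the most recent object and unpack it
gsutil cp gs://us.artifacts.PROJECT_ID.appspot.com/containers/images/sha256:DIGEST ./layer.tar.gz
tar xzf ./layer.tar.gz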