I know this topic has been referenced a few times already. Unfortunately I still wasn't able to find a working solution for my use case.
I can't seem to get vendoring working for my Go application on App Engine Standard. I'm using dep for vendoring.
I'm building a GraphQL API and here is my folder structure:
/GOPATH
└──/src
└──/acme
├──/app
| ├── app.yaml
| └── app.go
├──/src
| ├── /mutations/
| ├── /queries/
| └── /types/
└──/vendor/
Running goapp serve app/app.yaml in Cloud Shell fails with:
INFO 2018-05-14 15:42:08,386 devappserver2.py:764] Skipping SDK update check.
INFO 2018-05-14 15:42:08,471 api_server.py:268] Starting API server at: http://0.0.0.0:47213
INFO 2018-05-14 15:42:08,600 dispatcher.py:199] Starting module "default" running at: http://0.0.0.0:8080
INFO 2018-05-14 15:42:08,601 admin_server.py:116] Starting admin server at: http://0.0.0.0:8000
ERROR 2018-05-14 15:42:13,983 go_runtime.py:181] Failed to build Go application: (Executed command: /google/go_appengine/goroot/bin/go-app-builder -app_base /home/xxx/gopath/src/acme/app -arch 6 -dynamic -goroot /google/go_appengine/goroot -gopath /home/xxx/gopath:/google/gopath -nobuild_files ^^$ -incremental_rebuild -unsafe -binary_name _go_app -extra_imports appengine_internal/init -work_dir /tmp/tmpbt8DA2appengine-go-bin -gcflags -I,/google/go_appengine/goroot/pkg/linux_amd64_appengine -ldflags -L,/google/go_appengine/goroot/pkg/linux_amd64_appengine app.go)
/home/xxx/gopath/src/acme/vendor/github.com/graphql-go/graphql/definition.go:4: can't find import: "context"
2018/05/14 15:42:09 Can't find package "context" in $GOPATH: cannot find package "context" in any of:
/home/xxx/gopath/src/acme/vendor/context (vendor tree)
/google/go_appengine/goroot/src/context (from $GOROOT)
/home/xxx/gopath/src/context (from $GOPATH)
/google/gopath/src/context
It looks like the problem might be that one vendored dependency is not using a fully qualified import path for "context".
(EDIT: probably not the case, though, since I'm using Go 1.8)
Has anyone ever managed to successfully deploy on App Engine Standard using vendoring? I've been pulling my hair out over this all day.
Just in case anyone else struggles with this, here is the approach that seems to work for me.
Directory structure looks like this:
/GOPATH
├──/appengine
| ├──/.git/
| ├──/project1
| | ├── app.yaml
| | └── app.go
| └──/project2
| ├── app.yaml
| └── app.go
└──/src
├──/project1
| ├──/.git/
| ├──/mutations/
| ├──/queries/
| ├──/types/
| ├──/vendor/
| └──/main.go
└──/project2
├──/.git/
├──/foo/
├──/bar/
├──/vendor/
└──/main.go
Each app.go file below the appengine folder contains:
package projectX

import "projectX"

func init() {
    projectX.Run()
}
Each main.go file below src/projectX contains:
package projectX

import (
    // Import whatever you need
    "google.golang.org/appengine"
)

func Run() {
    // Do whatever you need
    appengine.Main()
}
It seems that having the folder that contains app.yaml outside of $GOPATH/src is indeed necessary.
This is also not ideal for version control if you need each project versioned in its own git repo as opposed to one monolithic repo. I solved this by versioning each project AND versioning the appengine folder separately as well.
I was having issues with this and spent ages looking around trying to figure out why it wasn't working. This answer is a little late, but hopefully it will be useful for anyone else who has this issue.
I updated to Go 1.11, which I thought wasn't supported (I found one of the GCP examples on GitHub using it here).
Set runtime: go111 in app.yaml and it will support vendoring and give you a link to a proper build log.
Now my directory structure is as follows.
/GOPATH
└──/src
├──/project1
| ├──/.git/
| ├──/whateverCode/
| ├──/vendor/
| ├──/main.go
| └──/app.yaml
I assume that if it supports Go 1.11 we could also use Go modules for versioning, but I haven't looked into that yet.
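For reference, here is a minimal sketch of what a main.go can look like under the go111 runtime (the handler below is just an illustration, not code from my actual project; app.yaml only needs runtime: go111):
package main

import (
    "fmt"
    "net/http"

    "google.golang.org/appengine"
)

func main() {
    http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
        fmt.Fprintln(w, "hello from the go111 runtime")
    })
    // appengine.Main() starts serving on the port provided by the runtime.
    appengine.Main()
}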
The context package lives inside $GOROOT (not in the vendor directory). Your Go App Engine SDK is probably old and does not support Go 1.8.
Update your SDK to the latest version. The $GOROOT should look like /path/to/somewhere/goroot-1.8.
SonarQube server 7.9.1
SonarQube Scanner 3.2.0.1227
Java 1.8.0_121 Oracle Corporation (64-bit)
Linux 4.15.0-112-generic amd64
I'm using the sonar scanner to analyse my source code. I realised that for two working copies I get different results on the server and was wondering why. I compared the scanner logs for both runs and noticed that this info line is missing from the log of the 2nd working copy:
INFO: SCM provider for this project is: git
The following directory structure should explain the differences between the working copies, where the 2nd is a fork of the base repository:
└── Work1
├── .git
├── build
| └── config1
├── sonar-project.properties
└── src
└── Work2
├── .git
├── build
| └── config2
|     └── .git
├── sonar-project.properties
└── src
I start the analysis of my 1st working copy from within the build folder (Work1/build/config1 $> make sonar -> cd Work1; sonar-scanner-3.2.0.1227-linux/bin/sonar-scanner...), where the scanner finds the sonar-project.properties. The analysis runs without any issues and the report shows perfect results.
Starting the analysis from the fork - also from within its build folder (Work2/build/config2 $> make sonar -> cd Work2; sonar-scanner-3.2.0.1227-linux/bin/sonar-scanner...) - the analysis gives no indication that anything is going wrong. The results are stored on the server, but the report contains suspicious results.
As an example, the following image shows the differences for one source file:
Left side: the file from the base repository (Work1/src/...); right side: the same file from the fork (Work2/src/...).
My impression is that since the log of the 2nd analysis run lacks the line INFO: SCM provider for this project is: git, the scanner cannot associate Work2/build/config2/.git with the sources taken from Work2/src.
Is my assumption correct?
I tried setting the options -Dsonar.scm.provider=git -Dsonar.projectBaseDir=Work2/ explicitly according to the online documentation and here, but with no luck.
How can I change the base folder for the SCM provider?
In the sonar-project.properties file, add:
sonar.scm.disabled=true
or pass
-Dsonar.scm.disabled=true
on the command line.
I'm trying the following tutorial.
Automatic serverless deployments with Cloud Source Repositories and Container Builder
But I got the error below.
$ gcloud container builds submit --config deploy.yaml .
BUILD
Already have image (with digest): gcr.io/cloud-builders/gcloud
ERROR: (gcloud.beta.functions.deploy) Error creating a ZIP archive with the source code for directory .: ZIP does not support timestamps before 1980
ERROR
ERROR: build step 0 "gcr.io/cloud-builders/gcloud" failed: exit status 1
I'm now trying to solve it. Do you have any idea? My gcloud is the latest version.
$ gcloud -v
Google Cloud SDK 193.0.0
app-engine-go
app-engine-python 1.9.67
beta 2017.09.15
bq 2.0.30
core 2018.03.09
gsutil 4.28
Sample Google Cloud Function code from the tutorial:
// index.js
exports.f = function(req, res) {
res.send("hello, gcf!");
};
#deploy.yaml
steps:
- name: gcr.io/cloud-builders/gcloud
args:
- beta
- functions
- deploy
- --trigger-http
- --source=.
- --entry-point=f
- hello-gcf # Function name
#deploying without Cloud Container Builder is fine.
gcloud beta functions deploy --trigger-http --source=. --entry-point=f hello-gcf
Container Builder tars your source folder. Maybe something in your . directory has corrupted timestamps? That would explain why moving the source into the src folder fixes it.
While I don't know the reason, I found a workaround.
(1) Make a src directory and move index.js into it.
├── deploy.yaml
└── src
└── index.js
(2) deploy via Cloud Container Builder.
$ gcloud container builds submit --config deploy.yaml ./src
I ran into the same issue just now. I could not solve it, but at least I found out where it comes from.
When you submit your build locally, a tar archive is created and uploaded to a bucket. In this tar, the folders are dated 01.01.1970:
16777221 8683238 drwxr-xr-x 8 user staff 0 256 "Jan 1 01:00:00 1970" "Jan 1 01:00:00 1970" "May 15 12:42:04 2019" "Jan 1 01:00:00 1970" 4096 0 0 test
This issue only occurs locally. If you use a GitHub build trigger, it works.
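To track down which files or folders in your source directory carry those pre-1980 timestamps before submitting, a quick throwaway check (sketched here in Go; any tool that prints modification times works just as well) could look like this:
package main

import (
    "fmt"
    "os"
    "path/filepath"
    "time"
)

func main() {
    // ZIP archives cannot store timestamps before 1980, which is what gcloud complains about.
    cutoff := time.Date(1980, 1, 1, 0, 0, 0, 0, time.UTC)
    err := filepath.Walk(".", func(path string, info os.FileInfo, err error) error {
        if err != nil {
            return err
        }
        if info.ModTime().Before(cutoff) {
            fmt.Println(path, info.ModTime())
        }
        return nil
    })
    if err != nil {
        fmt.Fprintln(os.Stderr, err)
    }
}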
I recently came across the same issue using Cloud Build (the successor to Container Builder).
What helped was adding a step to list all the files/folders in the Cloud Build environment (default directory is /workspace) to identify the problematic file/folder. You can do this by overriding the gcloud container's entrypoint to execute the ls command.
steps:
- name: gcr.io/cloud-builders/gcloud
entrypoint: "ls"
args: ["-la", "/workspace"]
I'm new to Google App Engine, and I'm running into an issue that I can't solve.
I have a very simple app (developed in Go) structured like this:
main/
| model/
| | user.go
| main.go
| app.yaml
These are the imports in main.go:
import (
"github.com/julienschmidt/httprouter"
"log"
"net/http"
)
My code works well when I run it locally.
But when I try to deploy it to my Google App Engine instance, I receive this error:
$ gcloud app deploy
You are about to deploy the following services:
- <MY_APP_ENGINE_URL> (from [<MY_LOCAL_YAML_PATH>])
Deploying to URL: [<MY_APP_ENGINE_URL>]
Do you want to continue (Y/n)? Y
Beginning deployment of service [default]...
Some files were skipped. Pass `--verbosity=info` to see which ones.
You may also view the gcloud log file, found at
[<LOCAL_APP_ENGINE_LOG>].
File upload done.
Updating service [default]...failed.
ERROR: (gcloud.app.deploy) Error Response: [9] Deployment contains files that cannot be compiled: Compile failed:
2017/05/27 14:48:24 go-app-builder: build timing: 5×compile (301ms total), 0×link (0s total)
2017/05/27 14:48:24 go-app-builder: failed running compile: exit status 2
main.go:4: can't find import: "github.com/julienschmidt/httprouter"
What did I do wrong?
EDIT:
This is the content of my app.yaml file:
runtime: go
api_version: go1
handlers:
- url: /.*
script: _go_app
The App Engine environment doesn't contain your dependencies. You could add a script that runs go get ... for each one, but that's too hacky, and Go has a solution for this: we can save our dependencies in a vendor folder at the root of the project.
Quick solution:
# Install godep:
go get -v -u github.com/tools/godep
cd your/project/path
godep save
Now try to deploy again. You'll see a vendor folder in your project; don't remove it, and add it to your git repository. That folder contains all your third-party dependencies, like httprouter (it's my favorite :) ).
Note: you can use other tools to save your dependencies.
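After godep save, the project layout should look roughly like this (sketch only; the exact packages under vendor/ depend on your imports):
main/
| model/
| | user.go
| Godeps/
| | Godeps.json
| vendor/
| | github.com/
| | | julienschmidt/
| | | | httprouter/
| main.go
| app.yaml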
I haven't used the gcloud tool, but back in the day when goapp was the tool, you had to create github.com/julienschmidt/httprouter (with the lib's source in it, of course) directly under your main directory and then deploy.
AFAIK App Engine's Go version is currently 1.6, which means that while vendoring is on by default, it can be switched off - perhaps that's the case here and that's why @Yandry Pozo's suggestion doesn't work.
Is there any way to run the compiler on an App Engine application written in Go and get an exit code, instead of continuing to serve the application with the development server?
I want to add a check to my automated tests in Travis that the application actually compiles.
To clarify: I have access to the App Engine SDK / development server in Travis, but I don't want to run goapp serve since it never exits.
Without actually implementing tests, your solution looks pretty hacky. Why not use goapp build? Here's my .travis.yml:
language: go
go:
- 1.2.1
# Grab newest version and suck down
install:
- export FILE=go_appengine_sdk_linux_amd64-$(curl https://appengine.google.com/api/updatecheck | grep release | grep -o '[0-9\.]*').zip
- curl -O https://storage.googleapis.com/appengine-sdks/featured/$FILE
- unzip -q $FILE
# Run build and tests
script:
- ./go_appengine/goapp test ./tests; # If you are testing
- ./go_appengine/goapp build ./packagedir; # Wherever you keep your stuff
For reference on tests or just to see a project that builds
Edit:
It has been a while, but I noticed recently that some of my builds randomly break. It is infuriating, and I have occasionally hardcoded SDK values to overcome this. No more. Here's a very hacky way of grabbing the first featured version of the SDK (and thus one that is actually hosted, since /updatecheck doesn't always return a hosted version):
export FILE=$(curl https://storage.googleapis.com/appengine-sdks/ | grep -o 'featured/go_appengine_sdk_linux_amd64-[^\<]*' | head -1)
For just the file:
export FILE=$(curl https://storage.googleapis.com/appengine-sdks/ | grep -oP '(?<=featured/)go_appengine_sdk_linux_amd64-[^\<]*' | head -1)
I solved this by adding an empty unit test at the entry point of the application (main_test.go). This unit test forces the whole application to compile.
Then I execute all unit tests by putting goapp test ./... in the script section.
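For anyone wanting a concrete example, a minimal main_test.go like this is enough (assuming the entry point is in package main; use your actual package name otherwise):
package main

import "testing"

// TestBuild does nothing on its own; it just forces
// `goapp test ./...` to compile the whole package.
func TestBuild(t *testing.T) {}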
I have a repository for one of my projects that has a nested repository using the svn:externals property to update files from an external library. The problem is that I need to comment out one of the function declarations from one of the headers in this library, and carry around the modified header with the root repository.
Is there a way that this could be done, so that when the library is updated, it overrides that specific file with my version of it?
What you want sounds like a "vendor branch" scenario to me.
current repository
root
|-- myproject
|   |-- mycode
|   `-- library  -> svn:externals to a remote svn or your own libraryproject
suggested repository
root
|-- myproject
|   |-- mycode
|   `-- library  -> copied/branched from ^/vendor/library/current (modified at this location as much as you like)
`-- vendor
    `-- library
        |-- current
        |-- imported-version-1.0
        `-- imported-version-1.1
How to create the layout
Create ^/vendor/library/current and DOWNLOAD the original unmodified library code into it.
svn commit ^/vendor/library/current
svn cp ^/vendor/library/current ^/vendor/library/imported-version-1.0 (tag the import)
svn cp ^/vendor/library/current ^/myproject/library (branch the code into your project)
modify ^/myproject/library and commit
How to update the library without losing your modifications
Download the latest original release of the library into ^/vendor/library/current OVERWRITING files.
svn commit ^/vendor/library/current (checks in the difference between the two library releases)
svn cp ^/vendor/library/current ^/vendor/library/imported-version-1.1 (tag the change)
cd /your-local-workspace/myproject/library (will be merge target)
svn merge ^/vendor/library/current (get all CHANGES from the upstream branch and apply them to your modified library)
svn commit
profit
Instead of branching "current" directly into your project you could branch to a "my-modified-libs" directory and make use of it via externals. This would be advised if you have multiple projects that need the same modified version of a library.
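For example, wiring the modified copy into a project via an external could look something like this (paths and names are purely illustrative; ^/ relative externals need SVN 1.5 or later):
cd /your-local-workspace/myproject
svn propset svn:externals '^/my-modified-libs/library library' .
svn commit -m "Pull in the modified library via svn:externals"
svn update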
Keep in mind that vendor branches will have problems dealing with renames and deletes, as those cannot be tracked by overwriting. Cross-repository merging is a different and rather young topic for SVN.
If you try it out, give us feedback how it went :)
Christoph
There isn't a built-in feature to help you with this.
The general way of dealing with this would be: make a branch of the library you're using, make the changes you need there, and use the newly created branch as the external for the root project. In my experience this is a simple and effective solution to the problem you describe.
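A rough sketch of those steps, assuming the library lives in the same repository as your project (svn copy cannot branch across repositories) and using placeholder URLs; full URLs work wherever the ^/ shorthand isn't supported by your client:
# Branch the library:
svn copy ^/library/trunk ^/branches/library-patched -m "Branch the library for a local patch"
# Check out the branch, comment out the offending declaration in the header, commit.
# Then, inside a working copy of the root project, point the external at the branch:
svn propset svn:externals '^/branches/library-patched library' .
svn commit -m "Use the patched library branch as the external"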