I'm trying to use the PR service of Yocto (fido), but each time I launch bitbake on my recipe, the package gets ${PR} = r0.
local.conf
INHERIT += "buildhistory"
BUILDHISTORY_COMMIT = "1"
PRSERV_HOST = "localhost:0"
recipe.bb
SRCREV = "${AUTOREV}"
BPV = "1.1.0"
PV = "${BPV}+gitr${SRCPV}" # I know, I should use a tag instead.
SRC_BRANCH = "master"
SRC_URI = "xxx.git;protocol=ssh;branch=${SRC_BRANCH}"
This produces a package named xxx_1.1.0+gitrAUTOINC+e7de1c757a-r0.0.
I was expecting to get
Build #1
xxx_1.1.0+gitr0+e7de1c757a-r0.0
Build #2
xxx_1.1.0+gitr1+e7de1c757a-r1.0
And so on...
I want to use the PR as the build number, getting something like "1.1.0.453",
where the scheme is "major.minor.revision.build-number".
I see two problems here:
The PR is not incremented, even when I change the recipe or the project source code.
The name of the package is not the one I'm expecting. Why is there an "r0" before the git hash, and why is the revision "r0.0" instead of "r0"?
PR itself is not expected to increment; the PR server increments EXTENDPRAUTO (which is appended after PR to form PKGR).
It is also used by SRCPV to put an always-increasing number in front of the git hash (every time the hash changes to something the PR server hasn't seen for this recipe before, it returns max+1).
And you shouldn't use tags in SRCREV, because bitbake will always run git ls-remote against the remote git repository to resolve tag names to git SHAs (which breaks when you cannot connect to the repository, e.g. when disconnected from the VPN, and also significantly slows down recipe parsing).
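For reference, this is roughly how the pieces fit together in bitbake.conf's defaults (a sketch; exact defaults may differ between releases):

```
# PR stays at "r0" unless bumped in the recipe; the PR server
# supplies the auto-incrementing suffix via EXTENDPRAUTO:
#     PKGR = "${PR}${EXTENDPRAUTO}"    # e.g. "r0" + ".0" -> "r0.0"
# SRCPV contains the AUTOINC placeholder, which the PR server
# replaces with a counter that grows each time SRCREV resolves
# to a git hash it has not seen before for this recipe.
```

So the trailing ".0" in "r0.0" is the PR-server counter, not part of PR.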
I can call the branch-switch interface normally, but when the switch fails I cannot find out which files of the current branch caused the failure. The error info only shows "1 conflict prevents checkout". How can I get the offending file names from the callback function or the return value? (The same applies to merge, reset, ...)
// code
git_reference *lookup = NULL;
git_object *treeish = NULL;
git_checkout_options opts = GIT_CHECKOUT_OPTIONS_INIT;
opts.checkout_strategy = GIT_CHECKOUT_SAFE;
git_branch_lookup(&lookup, repo, branchname, GIT_BRANCH_LOCAL);
git_revparse_single(&treeish, repo, branchname);
if (git_checkout_tree(repo, treeish, &opts) < 0)
{
    /*
     * This just returns "1 conflict prevents checkout",
     * but I want to know which files are wrong.
     */
    const git_error *error = giterr_last();
}
You'll need to set a notify_cb in your git_checkout_options, and set your notify_flags to include GIT_CHECKOUT_NOTIFY_CONFLICT.
The notify callback you provide will be invoked with each file that is changed in your working directory and preventing the checkout from occurring.
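A rough sketch of what that looks like (untested here; error handling is omitted and the function names `conflict_cb`/`checkout_branch` are mine):

```c
#include <stdio.h>
#include <git2.h>

/* Invoked once per notified file; for GIT_CHECKOUT_NOTIFY_CONFLICT,
 * `path` names a workdir file that blocks the checkout. */
static int conflict_cb(git_checkout_notify_t why, const char *path,
                       const git_diff_file *baseline,
                       const git_diff_file *target,
                       const git_diff_file *workdir, void *payload)
{
    if (why == GIT_CHECKOUT_NOTIFY_CONFLICT)
        fprintf(stderr, "conflict: %s\n", path);
    return 0;  /* returning non-zero aborts the checkout */
}

static int checkout_branch(git_repository *repo, git_object *treeish)
{
    git_checkout_options opts = GIT_CHECKOUT_OPTIONS_INIT;
    opts.checkout_strategy = GIT_CHECKOUT_SAFE;
    opts.notify_flags = GIT_CHECKOUT_NOTIFY_CONFLICT;
    opts.notify_cb = conflict_cb;
    opts.notify_payload = NULL;  /* passed back to conflict_cb */
    return git_checkout_tree(repo, treeish, &opts);
}
```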
I'm not so familiar with libgit2, but it looks like you'll have to resolve the conflict manually. The error you get:
1 conflict prevents checkout
tells you that some file has a conflict, but you'll have to iterate through your tree to find which one and resolve it.
The status example from libgit2.org would certainly be a good starting point for you.
My goal is to build a multilingual site using Hugo. For this I would like to:
not touch the theme files
have a config file which defines the overall structure for all languages (config.toml)
have a "strings" file for each language
So for example, I would have a config.toml file like this:
[params.navigation]
brand = "out-website"
[params.navigation.links]
about = $ABOUT_NAME
services = $SERVICES_NAME
team = $TEAM_NAME
impressum = $IMPRESSUM_NAME
an English language file:
ABOUT_NAME=About
SERVICES_NAME=Services
TEAM_NAME=Team
IMPRESSUM_NAME=Impressum
and a German language file like this:
ABOUT_NAME=Über uns
SERVICES_NAME=Dienste
TEAM_NAME=Mitarbeiter
IMPRESSUM_NAME=Impressum
And when I want to compile the project for English, I do something along the lines of:
hugo --theme=... --config=config.toml --config=english.toml
and for German:
hugo --theme=... --config=config.toml --config=german.toml
Or in some similar way.
For this I need to use variables in config.toml that are defined in english.toml or german.toml.
My Google searches so far say that I cannot use variables in TOML.
So is there a different approach with which I could achieve this?
Your approach with variables is not optimal; use the tutorial below.
Multilingual sites are coming as a feature in Hugo 0.16, but I managed to build a multilingual site on current Hugo using this tutorial and some hacks.
The tutorial above requires you to "have a separate domain name for each language". I managed to bypass that and have two sites, one at the root (EN) and one in a folder (/LT/).
Here are the steps I used.
Set up a reliable build runner. You can use Grunt/Gulp, but I used npm scripts: I hacked npm-build-boilerplate and took everything except rendering away from Hugo, so Hugo is used only to generate files. This is important because 1) we will be generating two sites, and 2) the hacks require operations on folders, which is easy with npm scripts (I'm not proficient at building custom Grunt/Gulp scripts).
Set up config_en.toml and config_de.toml in root folder as per tutorial.
Here's an excerpt from my config_en.toml:
contentdir = "content/en"
publishdir = "public"
Here's an excerpt from my config_lt.toml (change it to DE in your case):
contentdir = "content/lt"
publishdir = "public/lt"
Basically, I want my website.com to show the EN version, and website.com/lt/ to show the LT version. This deviates from the tutorial and will require hacks.
Set up translations in the /data folder as per the tutorial.
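For example, a data/translations.toml keyed by locale could look like this (the keys and the Lithuanian strings are made up):

```toml
[en]
about = "About"
team = "Team"

[lt]
about = "Apie"
team = "Komanda"
```

The snippet below then picks the table for the current locale and looks up a key in it.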
Bonus: set up a snippet in Dash:
{{ ( index $.Site.Data.translations $.Site.Params.locale ).#cursor }}
Whenever I type "trn" I get the above; all that's left is to paste in the key from the translations file, and I get the value in the correct language.
Now the hack part. Blog posts will work fine under /lt/, but static pages (landing pages) will not. I use this instruction.
Basically, every static page, for example content/lt/duk.md, will have a slug set:
slug: "lt/duk"
But this slug results in a doubled URL path, for example lt/lt/duk.
I restore this with an npm scripts task, using rsync and a manual command to delete the redundant folder:
"restorefolders": "rsync -a public/lt/lt/ public/lt/ && rm -rf public/lt/lt/",
I'm trying to create a project in GitLab via their API,
with a request (in Angular) like this:
$http.post(
"https://gitlab.com/api/v3/projects",
{private_token: <token>}
)
But then the data returned is a project document with default_branch: null, and it is then impossible to update the project, for example by posting files with the API "https://gitlab.com/api/v3/projects//repository/files", because GitLab returns an error saying that I need to be on a specific branch.
Unfortunately, posting a branch with
$http.post(
"https://gitlab.com/api/v3/projects/<projectId>/repository/branches",
{
private_token: <token>,
branch_name: "master"
}
)
also returns an error... because I need to specify a ref parameter as well, but that wouldn't make any sense, since I don't yet have an origin master branch!
If your project is empty (contains no git objects), you should be able to post to "https://gitlab.com/api/v3/projects/repository/files", specify a branch name, and that will become your default branch. If that isn't working, open a bug at: https://gitlab.com/gitlab-org/gitlab-ce/issues
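To make the sequence concrete, here is a sketch of the two requests (GitLab API v3; the token value, project name, and helper-function names are hypothetical). Only the URL/payload construction is shown, so it can be fed to any HTTP client:

```python
BASE = "https://gitlab.com/api/v3"
TOKEN = "<token>"  # hypothetical private token


def create_project_request(name):
    # POST /projects -- creates an empty project (default_branch is null)
    return BASE + "/projects", {"private_token": TOKEN, "name": name}


def create_file_request(project_id, path, content):
    # POST /projects/:id/repository/files -- committing the first file
    # with a branch_name initializes that branch as the default branch
    return (BASE + "/projects/%s/repository/files" % project_id,
            {"private_token": TOKEN,
             "file_path": path,
             "branch_name": "master",
             "content": content,
             "commit_message": "initial commit"})
```

Note that v3 uses branch_name here; the later v4 API renamed the parameter to branch.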
We keep some Gradle base scripts in a central place. These scripts are included via "apply from:" by a large number of other scripts. The base scripts need access to files relative to themselves. How can I find the location of a base script?
Sample for one build.gradle:
apply from: "../../gradlebase/base1.gradle"
Sample for base1.gradle
println getScriptLocation()
I'm not sure if this is considered an internal interface, but DefaultScriptHandler has a getSourceFile() method, and the current instance is accessible via the buildscript property, so you can just use buildscript.sourceFile. It's a File instance pointing at the current script.
I'm still not sure I understood the question correctly, but you can find the path of the currently running Gradle script using the following piece of code:
println project.buildscript.sourceFile
It gives the full path of the script that is currently running. Is that what you're looking for?
I'm pulling it off the stack.
buildscript {
def root = file('.').toString();
// We have to seek through, since groovy/gradle introduces
// a lot of abstraction that we see in the trace as extra frames.
// Fortunately, the first frame in the build dir is always going
// to be this script.
buildscript.metaClass.__script__ = file(
Thread.currentThread().stackTrace.find { ste ->
ste.fileName?.startsWith root
}.fileName
)
// later, still in buildscript
def libDir = "${buildscript.__script__.parent}/lib"
classpath files("${libDir}/custom-plugin.jar")
}
// This is important to do if you intend to use this path outside of
// buildscript{}, since everything else is pretty asynchronous, and
// they all share the buildscript object.
def __buildscripts__ = buildscript.__script__.parent;
Compact version for those who don't like clutter:
String r = file('.').toString();
buildscript.metaClass.__script__ = file(Thread.currentThread().stackTrace*.fileName?.find { it.startsWith r })
Another solution is to set a property with the location of A.gradle in your global Gradle settings, at {userhome}/.gradle/gradle.properties.
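For example (the property name and path are made up), in {userhome}/.gradle/gradle.properties:

```
gradleBaseDir=/home/me/gradlebase
```

and then every build script can use apply from: "${gradleBaseDir}/base1.gradle". The downside is that each developer's machine must define the property.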
My current workaround is to inject the path from the calling script. This is an ugly hack.
The caller script must know where the base script is located. I save this path in a property before applying it:
ext.scriptPath = '../../gradlebase'
apply from: "${scriptPath}/base1.gradle"
In base1.gradle I can then access the property ${scriptPath} as well.
You could search for the scripts at a relative path, like:
if (new File(rootDir, '../path/A.gradle').exists()) {
    apply from: '../path/A.gradle'
}
This solution has not been tested with 'apply from', but it has been tested with settings.gradle.
Gradle has a Script.file(String path) method. I solved my problem by doing:
def outDir = file("out")
def releaseDir = new File(outDir, "release")
And the 'out' directory is always next to the build.gradle in which this line is called.
I use pysvn 1.7.5 to access my SVN server.
If I want to copy a single file from the SVN server to my local disk, there is no pysvn function implemented for it. But if I make a connection over HTTPS, I can copy the single file without doing a whole checkout of a directory.
from urllib2 import (HTTPPasswordMgrWithDefaultRealm, HTTPBasicAuthHandler,
                     build_opener, URLError)

def fetch_svn_file(self, file_url, local_path):
    local_path = local_path.replace('\\', '/')
    # Set up an HTTPS request with username/password authentication
    try:
        # create a password manager
        password_mgr = HTTPPasswordMgrWithDefaultRealm()
        # Add the username and password.
        password_mgr.add_password(None, 'https://www.xyz.com',
                                  self.default_user, self.default_passwd)
        opener = build_opener(HTTPBasicAuthHandler(password_mgr))
        remote_file = opener.open(file_url)
        content = remote_file.read()
        try:
            local_file = open(local_path, "wb")
            local_file.write(content)
            local_file.close()
        except IOError:
            return -1
    except URLError, e:
        print 'URLError: "%s"' % e
        return -2
    return 0
This is the same way TortoiseSVN does it when I drag a file from the repo browser to my local disk, but TortoiseSVN can also copy single files at another revision. Does anyone know how I can do this in pysvn or in plain Python? If this function is implemented by TortoiseSVN, it has to be possible in pysvn too.
I already got the answer. :-)
There is a standard pysvn function for this, Client.export. The name just seemed so strange for a copy function...
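A sketch of how that looks (the URL, destination path, and revision number are hypothetical; pysvn is imported lazily inside the function, so nothing runs until it is called):

```python
def fetch_svn_file(file_url, local_path, revnum=None):
    """Export a single file from SVN, optionally at a given revision."""
    import pysvn  # lazy import: only needed when actually fetching

    client = pysvn.Client()
    if revnum is None:
        revision = pysvn.Revision(pysvn.opt_revision_kind.head)
    else:
        revision = pysvn.Revision(pysvn.opt_revision_kind.number, revnum)
    # export works on single files as well as directories, and needs
    # no working copy -- exactly what the HTTPS workaround did above.
    client.export(file_url, local_path, revision=revision, force=True)
```

Usage would be something like fetch_svn_file('https://www.xyz.com/svn/trunk/a.txt', 'C:/tmp/a.txt', 453) to get the file as of revision 453.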