I want to use the git diff command instead of SmartGit's native diff because I find it better.
SmartGit has a customizable external diff tool, but as far as I understand, this setting expects an executable file, not an arbitrary command like git diff.
Is there a way to achieve what I want?
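One possible workaround (a sketch, not an official SmartGit feature): since the setting expects an executable, point it at a small wrapper script that hands the two files over to git diff --no-index, which can compare arbitrary files outside a repository. The script name and the way SmartGit passes the left/right file paths are assumptions; check the placeholders SmartGit actually uses for external tools.
#!/bin/sh
# git-diff-wrapper.sh -- hypothetical wrapper to configure as SmartGit's external diff.
# Assumes SmartGit passes the two files to compare as the first two arguments.
left="$1"
right="$2"
# --no-index lets git diff compare files that are not inside a repository
exec git diff --no-index --color=always "$left" "$right"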
I'm using an OpenWRT environment for code development.
The OpenWRT build works by first fetching a package from a remote repository, extracting it, and then applying local patches on top of that code.
What I've noticed is that when a patch fails to apply, the build itself does not always fail, and that creates problems from a whole-system perspective.
I'm looking for a way to make the entire build fail whenever a patch fails to apply.
Thank you all in advance!
According to the documentation, the easiest way to spot build failures would be to run make V=s 2>&1 | tee build.log | grep -i '[^_-"a-z]error[^_-.a-z]'.
If you know you are having issues with a specific package, I would specifically build those packages via make package/<pkgname>/compile V=s and see where it is failing.
Also, I would try testing the image out in qemu before flashing a real device. That way you can verify your build.
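If you want to turn that log check into a hard failure for an automated build, a rough sketch along these lines should work (the grep pattern is the one from the documentation quoted above; the exact text printed when a patch fails can differ between OpenWRT releases, so treat the pattern as something to tune):
#!/bin/sh
# Hypothetical wrapper: run the verbose build, keep the log, and abort with a
# non-zero exit code if the documented error pattern shows up anywhere in it.
make V=s 2>&1 | tee build.log
if grep -i '[^_-"a-z]error[^_-.a-z]' build.log >/dev/null; then
    echo "Errors found in build.log (possibly a failed patch); failing the build." >&2
    exit 1
fi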
I'm using FossilSCM as my only solution for version control and tickets. So far, so good. Its self-contained and minimalist approach suits my needs. But I would like to start doing some analysis of the projects' history and development, and a good source for that is the projects' timelines. I could go with some HTML parsing to convert the Fossil timeline output to something else, but I would like to know if there is any option to export that info in another structured format (e.g. JSON or similar). Web searches have not produced any useful findings on that issue. Any pointers to a solution?
Thanks,
Offray
Have you tried fossil json timeline branch trunk?
fossil help json
Usage: fossil json SUBCOMMAND ?OPTIONS?
In CLI mode, the -R REPO common option is supported. Due to limitations
in the argument dispatching code, any -FLAGS must come after the final
sub- (or subsub-) command.
The commands include:
anonymousPassword
artifact
branch
cap
config
diff
dir
g
login
logout
query
rebuild
report
resultCodes
stat
tag
timeline
user
version (alias: HAI)
whoami
wiki
Run 'fossil json' without any subcommand to see the full list (but be
aware that some listed might not yet be fully implemented).
Compile json when you build from source:
./configure --json
The key to getting this working is to enable JSON support in Fossil by compiling it from source. The current version has it disabled, so looking for any clue about it in the command-line help originally got me nothing. Thanks to user 2612611 for the initial clue about it. Here is the procedure I followed:
Go to https://www.fossil-scm.org/download.html and download the source tarball package.
Uncompress the previous package.
Go to the folder where you uncompressed the package (let's call it /uncompress-folder).
Run ./configure --json
Run make.
Optional: Put your newly created fossil binary in your path or where the previous one was installed (something like sudo mv /uncompress-folder/fossil /usr/bin/fossil).
Open the fossil repository whose history you want to export and launch the fossil web interface (fossil ui).
Go to http://localhost:8080/json/timeline/checkin?limit=0, where http://localhost:8080 is your local machine interface for fossil ui, and json/timeline/checkin?limit=0 is the JSON API call saying: JSON export of the timeline (/json/timeline) checkins (/checkin) for all history (?limit=0). If instead of the 0 at the end of the URL you put another integer n, you will get the last n checkins.
From the command prompt you should be able to get the same result by running fossil json timeline checkin --limit=0 > timeline.json, which stores the output in the file timeline.json instead of going through the web browser, but in my local test it didn't work.
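If the CLI route misbehaves for you too, you can script the HTTP route instead. A small sketch, assuming curl (and optionally jq) are installed and fossil ui is serving on port 8080 as in the steps above; the .payload.timeline field names are my assumption about the response envelope, so adjust them after inspecting the output:
# Fetch the full checkin timeline as JSON while "fossil ui" is running
curl -o timeline.json 'http://localhost:8080/json/timeline/checkin?limit=0'
# Quick sanity check: count the entries (field names assumed, adjust if needed)
jq '.payload.timeline | length' timeline.json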
The API is still a moving target, but you can find documentation on this excellent project at [1] and a demo interface for testing the parameters at [2].
[1] https://docs.google.com/document/d/1fXViveNhDbiXgCuE7QDXQOKeFzf2qNUkBEgiUvoqFN4/view?pli=1#
[2] http://fossil.wanderinghorse.net/repos/fossil-sgb/json/
From the IAR command line, it's easy to build a particular configuration, and obviously, if I want to mimic the "build all" behavior I just run my own batch file with the configs I want.
How do I handle the case where I want to build all configs, but I don't know in advance what configurations are available?
Using Jenkins, for instance, if a developer adds a configuration in the IAR IDE, it won't be included in a build until the Jenkins scripts are manually updated. I just want Jenkins to build all the configurations without caring what they are called. In the IAR GUI for setting up batches, there is an option to rebuild all so there must be something somewhere. Thanks!
You can specify * as configuration name to build everything, like this:
c:\> iarbuild myproject.ewp -build *
One solution I implemented years ago for this problem was to read the configuration names from the .ewp file and use them for the build; see the sketch below.
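A rough sketch of that approach, assuming the .ewp is the usual XML project file in which each build configuration appears as a <configuration> element with a <name> child (the XPath and the use of xmlstarlet are illustrative, not something documented by IAR):
#!/bin/sh
# Hypothetical: list the configuration names from the .ewp and build each one.
# Configuration names containing spaces would need extra quoting.
EWP=myproject.ewp
for cfg in $(xmlstarlet sel -t -v '/project/configuration/name' -n "$EWP"); do
    iarbuild "$EWP" -build "$cfg"
done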
Regards
Yves
I have: gstreamer-sdk, gstreamer-ffmpeg, gstreamer-plugins-good, bad, and ugly. I googled everywhere for this error and have found nothing relevant. I'm going a little nuts trying to figure out this error:
Error received from element decodebin20: Your GStreamer installation is missing a plug-in.
Debugging information: gstdecodebin2.c(3576): gst_decode_bin_expose (): /GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0/GstDecodeBin2:decodebin20:
no suitable plugins found
The error is thrown when I run my GStreamer program. Any ideas why?
You may not be missing any plugins at all.
This error can be a result of just an unlinked pipeline.
Playbin2 (decodebin2) got some changes that made it unable to automatically link up some pipelines that formerly worked, for example linking a decoder to an encoder for transcoding. In my case, explicitly adding the ffdec_h264 element that it used to add automatically fixed it.
Relying on the Playbin2 can be very frustrating when it does not work. Using the setup below, you can create a .png diagram of the pipeline in various phases of construction. It's very helpful in finding why it isn't linking up.
export GST_DEBUG_DUMP_DOT_DIR=~/gstdump
for f in $GST_DEBUG_DUMP_DOT_DIR/*.dot ; do dot -T png $f >$f.png; done
This tool also lets you learn from it how to link up pipelines, and replace them with explicit ones that are easier to debug and less likely to break.
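For instance, a 0.10-era pipeline with the decoder spelled out explicitly might look like the line below (the file name and qtdemux are assumptions for an MP4/H.264 input; swap in the demuxer that matches your container):
gst-launch filesrc location=input.mp4 ! qtdemux ! ffdec_h264 ! ffmpegcolorspace ! autovideosink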
In Fedora, I resolved this issue by removing gstreamer1-vaapi.x86_64:
sudo yum remove gstreamer1-vaapi.x86_64
uridecodebin is part of the "base" plugin set, so make sure you have gstreamer-plugins-base.
Another thing to look into is your LD_LIBRARY_PATH and GST_PLUGIN_PATH. If they point to a different GStreamer installation, it could cause problems like this. Also, if you didn't install GStreamer with a package manager, you may need to set your LD_LIBRARY_PATH to point to it (or better yet, install it with a package manager).
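For example (the prefix here is purely illustrative; point both variables at wherever your GStreamer build actually lives, and use the plugin directory matching your GStreamer version):
# Hypothetical prefix for a manually installed GStreamer 0.10
export LD_LIBRARY_PATH=/opt/gstreamer/lib:$LD_LIBRARY_PATH
export GST_PLUGIN_PATH=/opt/gstreamer/lib/gstreamer-0.10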
Please try using the gst-inspect command to find out whether the environment is set up correctly.
Use gst-launch -v playbin2 uri="your_uri_here" to get more information for tracing this issue.
I have opensuse 11.4 installed. Vim is version 7. Now I normally use it to browse the linux kernel source. So I generated the cscope database inside a directory within my home folder i.e. /home/aijazbaig1/cscope_DB/ and I got 3 files viz. cscope.out, cscope.po.out and cscope.in.out besides the cscope.files file which contains a list of all the relevant files which I want to search.
Additionally I have added the following to my .bashrc:
CSCOPE_DB=/home/aijazbaig1/cscope_DB/cscope.out
export CSCOPE_DB
But when I do a :cscope show from within vim, it says there are no connections. Can anyone please let me know what is going wrong?
Keen to hear from you,
This is mentioned in the comments above, but I want to make sure it's preserved in an answer.
The issue that came up for me was that vim didn't know where to look for the cscope database. Once I added
cs add $CSCOPE_DB
to my .vimrc, everything came out fine.
Since I've made the visit anyway, I figured I would try responding.
I was getting this error when searching using ctrl-space s (or any search for that matter):
E567: no cscope connections
I finally found the full solution at http://cscope.sourceforge.net/cscope_vim_tutorial.html, Step 11.
The idea is that you create a list of source files to be included in the view of cscope, generate the cscope.out in the same location, and update the export path accordingly:
find /my/project/dir -name '*.c' -o -name '*.h' > /foo/cscope.files
cscope -R -b (this may take a while depending on the size of your source)
export CSCOPE_DB=/foo/cscope.out (put this in your .bashrc/.zshrc/other-starting-script if you don't want to repeat this every time you log into the terminal)
You need to add a "cscope connection", like this in vim:
:cscope add $PATH_TO_CSCOPE.out
See :help cs for more examples.
Here's how I explore linux kernel source using cscope:
I use vim as my editor.
From inside the kernel source root directory, run cscope in interactive mode, letting it recurse through subdirectories when searching for source files:
cscope -R
When run for the first time, it will generate the database file with the name: cscope.out inside the current directory. Any subsequent runs will use the already generated database.
Search for anything or any file and open it.
Set cscope tags in vim to make the :tag and CTRL-] commands search through cscope first and then ctags' tags:
:set cscopetag
Set cscope database inside current VIM session:
:cs add cscope.out
Now you can use CTRL-] and CTRL-t as you would do in ctags to navigate around! :)
I have the same issue on my PC. For now, to solve the issue:
In a terminal, run: which cscope
Open your .vimrc file and set the result as the cscope program: set csprg=/usr/bin/cscope
I ran into a similar problem with no cscope connections on Ubuntu 18.04, then I discovered my .vimrc file did not load the CSCOPE_DB variable. I looked around a little and found a solution.
You can just copy this directly into your .vimrc file.
Part of the code loads your cscope file from your directory. The keybinds are just a nice bonus.
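The snippet itself didn't make it into this post, but what I used is essentially the standard autoload block from the cscope vim tutorial (reproduced here from memory with the keybindings left out, so treat it as a sketch): it prefers a cscope.out in the current directory and falls back to $CSCOPE_DB.
if has("cscope")
    set cscopeprg=/usr/bin/cscope
    set cscopetagorder=0
    set cscopetag
    set nocscopeverbose
    " prefer a cscope.out in the current directory...
    if filereadable("cscope.out")
        cs add cscope.out
    " ...otherwise fall back to the database named in $CSCOPE_DB
    elseif $CSCOPE_DB != ""
        cs add $CSCOPE_DB
    endif
    set cscopeverbose
endif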
Hope this helps.