I've got a Maven 2 project where I use the assembly plugin. Everything would be fine if the created assembly file name didn't end with the format extension (e.g. ".zip"). I specified a fileName parameter in the plugin configuration and set appendAssemblyId to false. I have already spent a few hours on this problem... Any ideas?
My answer is a bit of a non-answer - don't remove the format extension. If you are using a specific <format> in your assembly descriptor to produce an artifact (zip, tar.gz, etc.), there is no good reason to remove the extension from the file. If I'm a user of your software and download this binary, I don't want to have to guess what the packaging type is; I should be able to tell just by looking at the filename.
FWIW, setting appendAssemblyId to false means that the <id> of your assembly descriptor will not be included in your file name.
Now, if you really are dead set on making this a pain for you and everyone else, what you probably want to do (as with most nonsensical things people want to make Maven do) is either use the antrun plugin to rename the generated file during your Maven invocation, or simply run a shell script to rename it after your Maven process has finished.
An MSI database contains a set of tables, and I can successfully enumerate the File table, which has the metadata of all deployable files. What I need to extract is the actual contents of those files. msiexec, lessmsi, and 7-Zip can all do it, but I couldn't find any source/API for doing it.
What I've discovered is that all the other (resource) files are in the Binary table, and its Data field can be used to get the content of those files (like icons, custom DLLs, etc.).
Further, I found that the Media table contains information about the .CAB file (the MSI has all content embedded with <MediaTemplate EmbedCab="yes"/>). This simply means the CAB file contains the actual content. I probably need to read the contents from the "structured storage" of the .msi file.
How to extract the contents of CAB/MSI file, using native C Msi* functions?
Phil has given you the easy/simple answer, but I thought I might give you a little more information since you've done some research. Check out:
https://msdn.microsoft.com/en-us/library/windows/desktop/aa372919(v=vs.85).aspx
This is where the structured storage is. You'll see something like Disk1.cab as the Name (primary key) and the binary data. The data is a CAB file, with each file entry in the cab matching the File.File column. From there you can use the File.FileName column to get the short name and long name (you'll want the long name, no doubt) and do a join to the Component table to get the directory table ID.
You'll also need to recurse the directory table to build the tree of directories and know where to put the files.
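As a minimal sketch of the first step using the native database functions - error handling is trimmed, and the ".cab" name filter is an assumption (stream names normally mirror the Media.Cabinet entry without the leading '#') - you can dump the embedded cabinet from the _Streams table and expand it afterwards:

```c
/* Sketch: dump embedded cabinet stream(s) from an .msi to disk so they can
 * be expanded with CAB tooling afterwards. Link against msi.lib. */
#include <windows.h>
#include <msi.h>
#include <msiquery.h>
#include <stdio.h>
#include <string.h>

int main(int argc, char **argv)
{
    MSIHANDLE hDb = 0, hView = 0, hRec = 0;
    if (argc < 2) return 1;

    if (MsiOpenDatabaseA(argv[1], (LPCSTR)MSIDBOPEN_READONLY, &hDb) != ERROR_SUCCESS)
        return 1;

    /* The _Streams table exposes the structured-storage streams,
     * including the embedded cabinet referenced by the Media table. */
    if (MsiDatabaseOpenViewA(hDb, "SELECT `Name`,`Data` FROM `_Streams`", &hView) != ERROR_SUCCESS)
        return 1;
    MsiViewExecute(hView, 0);

    while (MsiViewFetch(hView, &hRec) == ERROR_SUCCESS)
    {
        char name[256];
        DWORD cch = sizeof(name);
        MsiRecordGetStringA(hRec, 1, name, &cch);

        if (strstr(name, ".cab") != NULL)   /* skip icons, binaries, etc. */
        {
            FILE *out = fopen(name, "wb");
            char buf[4096];
            DWORD cb;
            do {
                cb = sizeof(buf);
                MsiRecordReadStream(hRec, 2, buf, &cb);
                if (out && cb) fwrite(buf, 1, cb, out);
            } while (cb == sizeof(buf));
            if (out) fclose(out);
        }
        MsiCloseHandle(hRec);
    }

    MsiCloseHandle(hView);
    MsiCloseHandle(hDb);
    return 0;
}
```

Once the cabinet is on disk, any CAB tooling (or the Cabinet API) can expand the individual files, matching the entries against File.File and File.FileName as described above.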
Fun stuff. There are some libraries in C# that make this WAY simpler. Or just call msiexec /a as Phil says. :)
The most straightforward way to extract all the files to some location is to install the product in "administrative" mode. If you do a:
msiexec /a [path to msi] TARGETDIR=[some folder]
you'll see what happens.
In C++, call MsiInstallProduct() with that command line.
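A minimal sketch of that call - the paths are placeholders and the UI suppression is an optional extra, not part of the original answer:

```c
/* Sketch: programmatic equivalent of "msiexec /a package.msi TARGETDIR=...".
 * Link against msi.lib; the paths below are placeholders. */
#include <windows.h>
#include <msi.h>

int main(void)
{
    /* Optional: suppress the installer UI for an unattended extraction. */
    MsiSetInternalUI(INSTALLUILEVEL_NONE, NULL);

    /* ACTION=ADMIN requests an administrative install, i.e. file extraction. */
    UINT rc = MsiInstallProductA("C:\\path\\to\\package.msi",
                                 "ACTION=ADMIN TARGETDIR=C:\\extracted");
    return (rc == ERROR_SUCCESS) ? 0 : 1;
}
```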
You have gotten many good answers already, including the use of dark.exe from the WiX toolkit. By downloading the WiX source code you should be able to get the code you need ready-made from there. I assume you may already have done this.
Chris has already linked to the DTF code you can check, but here is a link directly to dark.exe as well: https://github.com/wixtoolset/wix3/tree/develop/src/tools/dark. I would try both. This is C#, though, and you seem to want native code.
UPDATE: Before I get to the Win32 features you can use, check out this little summary of the C# DTF features: How to programmatically read the properties inside an MSI file?
Native Win32 functions: the database functions for dealing with an MSI file can be found on MSDN (these treat the MSI file as a database). There are also the MSI Installer Functions (used to deal with the MSI file as an actual installer).
You can certainly find good examples of native code for this with a good Google search. Have fun!
BTW: it would help to have a description of the actual problem you are trying to solve, as well as what you need technically. There could - as always - be less involved ways to achieve what you need, unless you are writing security software or a malware scanner or something similarly involved.
And just so it is clear: WiX's dark.exe fully decompiles MSI files into WiX source files and the resource files used to build them - you can then compare the various types of content (text compare for tables, binary compare for binaries, etc.). The process for doing so via the command line is described in the following answer: How can I compare the content of two (or more) MSI files? (That answer is about comparing MSI files, but one option for doing so is to decompile them - see the section on dark.exe - included here as a reference for others who find your question.)
I like to link things together so we can find content easily at a later point in time. Strictly speaking it doesn't seem necessary here - you have what you need, I think - but others could perhaps benefit from some further reading. Here are some related links:
Extract MSI from EXE.
What is the purpose of administrative installation initiated using msiexec /a?
How do I extract files from an MSI package? (explains why you should not use 7-Zip to extract).
We use Subversion to help us manage our C files (with TortoiseSVN as a front end).
When I want to know the changes in a C module, I (of course) only get the changes in the "body" of the program, not the changes in the include files.
So I wrote a small, simple program that finds all the include files of a C module, checks the last Subversion change date for each include file, and writes the result to an output file.
This way I get a full impression of what has changed recently in the whole module.
But the program is very simple, and I would like to know whether there is a solution out there that handles this "full view" of a C module in a good way.
As I work on multiple independent change requests at a time in one Subversion working folder, just looking at the result of "Check for modifications" does not help.
Thanks a lot in advance.
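For reference, the kind of helper described in the question might look roughly like this in C - gcc's -MM dependency output and a command-line svn client on the PATH are assumptions, and this is an illustration rather than an existing tool:

```c
/* Sketch: list the headers a C file pulls in (via the compiler's -MM
 * dependency output) and print Subversion's "Last Changed Date" for each. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static void print_last_changed(const char *path)
{
    char cmd[1024], line[512];
    snprintf(cmd, sizeof(cmd), "svn info \"%s\" 2>/dev/null", path);
    FILE *p = popen(cmd, "r");
    if (!p) return;
    while (fgets(line, sizeof(line), p))
        if (strncmp(line, "Last Changed Date:", 18) == 0)
            printf("%-40s %s", path, line + 19);
    pclose(p);
}

int main(int argc, char **argv)
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s file.c\n", argv[0]);
        return 1;
    }

    /* "gcc -MM file.c" prints "file.o: file.c a.h b.h \" with continuations. */
    char cmd[1024], tok[512];
    snprintf(cmd, sizeof(cmd), "gcc -MM \"%s\"", argv[1]);
    FILE *p = popen(cmd, "r");
    if (!p) return 1;

    while (fscanf(p, "%511s", tok) == 1) {
        if (strcmp(tok, "\\") == 0 || strchr(tok, ':') != NULL)
            continue;                  /* skip the make target and continuations */
        print_last_changed(tok);
    }
    pclose(p);
    return 0;
}
```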
Some one-time handwork is required, but it can work:
Using file externals (file-level svn:externals for all the files in every "project"; this requires SVN 1.6+), create virtual (or real) folders which include all the files for each project. After that, svn log inside such a folder in the working copy will show only the changes to that project's files.
I make a fair number of portable apps for personal use, and they work perfectly for the most part. I do, however, find it quite frustrating that if I run them on another computer, none of my preferences are retained, as a program always looks in AppData for its configuration files (which obviously don't exist on another system). So I'm wondering whether there is some kind of command line switch to launch an .exe with a custom .ini location.
I'm asking this firstly because Google has proved fruitless (once again), and secondly because I know it's possible - I've actually done this before, but with only one of my apps. I accomplished this by launching the app via the command programFile.exe -f configFile.ini /s (I have also seen programFile.exe -d -f configFile.ini /s elsewhere). Naturally, I thought I would apply this to some other apps, but it seems it only works for that particular app.
So, is there a command/switch that I am unaware of that will do this for an .exe file?
Thanks
It really depends on each executable file you are using. Some have support for what you are looking for, and some don't. Some programs don't even use .ini files. What you should look for is whether each and every program you use has support for a custom user-data location.
Edit
The only case where generic arguments would be available for a group of EXE files is if they were generated with the same tool, which automatically provides these arguments for you. InstallShield and MSI install programs have that kind of feature (with silent and automated installation, for instance).
I suggest you look into the tool you are using to generate your portable apps and see whether it provides those generic arguments for you, and how they work. If it does not have that feature, then look into the apps for which you were able to specify a custom location for the INI file. Somewhere in the code there must be a piece that handles the arguments you pass to the EXE file. You should share that piece of code with your other apps, so that they all provide the same argument list.
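For illustration only, a sketch of what that shared argument-handling piece might look like in C - the -f flag and the %APPDATA% fallback path are assumptions, not a standard Windows convention:

```c
/* Sketch: accept "-f <path>" for a custom .ini and fall back to the usual
 * %APPDATA% location otherwise. Flag name and default path are made up. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static const char *config_path(int argc, char **argv)
{
    for (int i = 1; i < argc - 1; ++i)
        if (strcmp(argv[i], "-f") == 0)
            return argv[i + 1];              /* custom .ini supplied on the command line */

    static char fallback[512];
    const char *appdata = getenv("APPDATA");
    snprintf(fallback, sizeof(fallback), "%s\\MyApp\\config.ini",
             appdata ? appdata : ".");
    return fallback;                         /* per-user default location */
}

int main(int argc, char **argv)
{
    printf("loading settings from: %s\n", config_path(argc, argv));
    return 0;
}
```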
I am writing a terminal-based application, but I want the user to be able to edit certain text data in a separate editor. For example, if the user chooses to edit the list of current usernames, the list should open as a text file in the user's favorite editor (vim, gedit, etc.). This will probably be an environment variable such as $MYAPPEDITOR. This is similar to the way commit messages work in svn.
Is the best way to do this to create a temporary file in /tmp, and read it in when the editor process is terminated? Or is there a better way to approach this problem?
There's already a $EDITOR variable, which is extremely standard and which I have seen working on a wide variety of Unixes. Also, vi is always an option on any flavor of Unix.
Debian has a sensible-editor command that invokes $EDITOR if it can, or falls back to some standard ones otherwise. Freedesktop.org has an xdg-open command that will detect which desktop environment is running and open the file with the associated application. As far as I know, sensible-editor doesn't exist on other distributions, and of course xdg-open will fail in a text-only environment, but it couldn't hurt to try as many options as possible, if you think it's important that a desktop user can see their happy shiny gedit or kate instead of scary old vi or nano. ;)
The way crontab and sudoedit work is also by making a file in /tmp. git puts it under .git, and svn actually puts it in the current directory (not /tmp).
The way svn and mercurial do it is by making a file in /tmp.
BTW, you don't need a MYAPPEDITOR; on *nix there's EDITOR already present.
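To make the temp-file-plus-$EDITOR approach concrete, here is a minimal POSIX sketch; the $MYAPPEDITOR/$EDITOR/vi fallback chain and the sample data are illustrative assumptions:

```c
/* Sketch: seed a temp file, hand it to the user's editor, read it back.
 * POSIX-only; error handling trimmed, sample data is made up. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

int main(void)
{
    char path[] = "/tmp/myapp-edit-XXXXXX";
    int fd = mkstemp(path);                 /* unique temp file under /tmp */
    if (fd < 0) return 1;

    dprintf(fd, "alice\nbob\n");            /* current data the user should edit */
    close(fd);

    const char *editor = getenv("MYAPPEDITOR");
    if (!editor) editor = getenv("EDITOR");
    if (!editor) editor = "vi";             /* vi is a safe last resort on unix */

    char cmd[1024];
    snprintf(cmd, sizeof(cmd), "%s %s", editor, path);
    if (system(cmd) != 0) {                 /* editor reported failure: abort */
        unlink(path);
        return 1;
    }

    FILE *f = fopen(path, "r");             /* read the edited contents back */
    char line[256];
    while (f && fgets(line, sizeof(line), f))
        printf("edited entry: %s", line);
    if (f) fclose(f);
    unlink(path);
    return 0;
}
```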
Since you mention svn in your post, why not just follow the same methodology? svn opens a file with a particular name in whatever $EDITOR (or $SVN_EDITOR) contains - this might actually require some work on your part, namely determining the parameters for each supported editor. In either case, you have the name of the file that was saved (or the error code of the application if something failed), and you can just use that.
I'm doing some Linux kernel development, and I'm trying to use NetBeans. Despite its declared support for Make-based C projects, I cannot create a fully functional NetBeans project. This is despite having NetBeans analyze a kernel binary that was compiled with full debugging information. Problems include:
Files are wrongly excluded: some files are incorrectly greyed out in the project, which means NetBeans does not believe they should be included in the project, when in fact they are compiled into the kernel. The main problem is that NetBeans will miss any definitions that exist in these files, such as data structures and functions, and will also miss macro definitions.
Cannot find definitions: pretty self-explanatory - oftentimes, NetBeans cannot find the definition of something. This is partly a result of the above problem.
Can't find header files: self-explanatory.
I'm wondering if anyone has had success setting up NetBeans for Linux kernel development, and if so, what settings they used. Ultimately, I'm looking for NetBeans to be able to either parse the Makefile (preferred) or extract the debug information from the binary (less desirable, since this can significantly slow down compilation), and automatically determine which files are actually compiled and which macros are actually defined. Then, based on this, I would like to be able to find the definition of any data structure, variable, function, etc. and have complete auto-completion.
Let me preface this question with some points:
I'm not interested in solutions involving Vim/Emacs. I know some people like them, but I'm not one of them.
As the title suggests, I would also be happy to know how to set up Eclipse to do what I need.
While I would prefer perfect coverage, something that only misses one definition in a million is obviously fine.
SO's useful "Related Questions" feature has informed me that the following question is related: https://stackoverflow.com/questions/149321/what-ide-would-be-good-for-linux-kernel-driver-development. Upon reading it, that question is more of a comparison between IDEs, whereas I'm looking for how to set up a particular IDE. Even so, the user Wade Mealing seems to have some expertise in working with Eclipse for this kind of development, so I would certainly appreciate his (and of course all of your) answers.
Cheers
Eclipse seems to be pretty popular for Linux kernel development:
http://cdtdoug.blogspot.com/2008/12/linux-kernel-debugging-with-cdt.html
http://jakob.engbloms.se/archives/338
http://revver.com/video/606464/debugging-the-linux-kernel-using-eclipsecdt-and-qemu/
I previously wrote up an answer. Now I have worked out all the details of the solution and would like to share them. Unfortunately Stack Overflow does not allow me to edit the previous answer, so I am writing it up in this new answer.
It involves a few steps.
[1] The first step is to modify the Linux kernel scripts to leave the dep files in place. By default, those dep files are removed after being used in the build. The dep files contain exact dependency information about which other files a C file depends on; we need them to create a list of all the files involved in a build. Thus, modify the files under linux-x.y.z/scripts so that they do not remove the dep files, like this:
In linux-3.1.2/scripts:
Kbuild.include: change "rm -f $(depfile);" to "echo do_not_rm1 rm -f $(depfile);"
Makefile.build: change "rm -f $(depfile);" to "echo do_not_rm2 rm -f $(depfile);"
The other steps are detailed in my GitHub project file https://github.com/minghuascode/Nbk/blob/master/note-nbkparse. Roughly, you do the following:
[2] Configure with your usual method of configuration, but be sure to use the "O=" option to build the object files into a separate directory.
[3] Then use the same "O=" option plus "V=1" to build Linux, and save the make output into a file.
[4] Run my nbkparse script from the above github project. It does:
[4.1] Read in the make log file, and the dep files. Generate a mirroring command.
[4.2] Run the mirroring command to hard-link the relevant source files into a separate tree, and generate a make-log file for NetBeans to use.
Now create a NetBeans C project using the mirrored source tree and the generated log file. NetBeans should be able to resolve all the kernel symbols, and you will only see the files involved in the build.
The Eclipse wiki has a page about this: HowTo use the CDT to navigate Linux kernel source
I have been doing some embedded Linux development, including kernel module development, and have imported the entire Linux kernel source code into Eclipse as a separate project. I have been building the kernel itself outside of Eclipse (so far), but I don't see any reason why I shouldn't be able to set up the build environment within Eclipse to build the kernel. For my projects, as long as I set up the PATH properties to point to the appropriate Linux source include directories, it seems to be pretty good about name completion for struct fields, etc.
I can't really comment on whether it is picking up the correct defines and not greying out the corresponding sections, as I haven't really paid too much attention to the files within the kernel itself (so far).
I was also wondering about using NetBeans as a Linux C IDE, as I do prefer NetBeans for Java GUI development.
I think this would work (I have done each step for various projects):
[1] Modify the kernel build scripts to leave the .d files in place. By default they are removed.
[2] Log the build process to a file.
[3] Write a script to parse the build log.
[3.1] From the build log, you know every .c file involved.
[3.2] From each .c file, you know the corresponding .d file.
[3.3] Look into the .d files to find all the included .h files.
[3.4] Form a complete list of the .c and .h files.
[4] Now create a new directory, and use "ln -s" or "ln" to pick the files of interest.
Now, create a NetBeans project for existing source code using the directory from [4]. Configure code assistance to use the make-log file. You should see exactly the effective source code as when you built it at [2].
Some explanations to the above steps:
At [2], do a real build so that the log file contains the exact files and flags of interest. Later, NetBeans will be able to use those exact flags when parsing.
At [4], pick only the files you want to see. Incorporating the whole kernel tree into NetBeans would be impractical.
There is a trick to parsing the .d files: many of the dependency entries are not real paths to .h files; they are modified entries representing parts of the kernel configuration from the auto-generated config header. You may need to reverse the modification to figure out which entries are real header files.
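To illustrate that filtering step, here is a small sketch that keeps only the entries that look like real header paths; the include/config/ prefix check is an assumption about how the dependency entries are mangled and may need adjusting for your kernel version:

```c
/* Sketch: read a kernel .d file and print only entries that look like real
 * header paths, skipping the include/config/... markers emitted for
 * CONFIG_* options. */
#include <stdio.h>
#include <string.h>

int main(int argc, char **argv)
{
    if (argc < 2) { fprintf(stderr, "usage: %s file.d\n", argv[0]); return 1; }

    FILE *f = fopen(argv[1], "r");
    if (!f) return 1;

    char tok[1024];
    while (fscanf(f, "%1023s", tok) == 1) {
        size_t len = strlen(tok);
        if (strcmp(tok, "\\") == 0)          continue;  /* line continuation */
        if (tok[len - 1] == ':')             continue;  /* the make target */
        if (strstr(tok, "include/config/"))  continue;  /* config marker, not a real header */
        if (len > 2 && strcmp(tok + len - 2, ".h") == 0)
            printf("%s\n", tok);                        /* a real header dependency */
    }
    fclose(f);
    return 0;
}
```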
Actually, there is a topic about this on the NetBeans site. This is the discussion URL: http://forums.netbeans.org/ntopic3075.html, and there is a wiki page linked from the discussion: wiki.netbeans.org/CNDLinuxKernel. Basically, it asks you to prefix make with CFLAGS="-g3 -gdwarf-2".
I found this link very helpful in setting up proper indexing in Eclipse. It requires running a script to alter the Eclipse environment to match your kernel options; in my case:
$ autoconf-to-eclipse.py ./include/generated/autoconf.h .
An illustrated guide to indexing the linux kernel in eclipse