What files need to be distributed with a zeit-pkg-produced executable?

If I use pkg (https://github.com/zeit/pkg) to build an .exe file, what other files do I need to distribute with the .exe?
It seems the produced .exe includes a copy of the Node.js binary. That made me wonder: shouldn't I include a Node.js license file in addition to the .exe? Or are the copyright and license notices built into (inside) the .exe enough?
Are there any other open source components that might be used BY Node.js which need attribution?
Or is it ok to distribute just the pkg-produced .exe alone? Are there any examples of such applications?

Related

C app deployment and managing dependencies in C

I'm new to C development, but I have some experience in other modern languages, so the first thing I found hard is dependencies and deployment. Elsewhere we have Gradle, Maven, NuGet, pip and the like, but in C I find this process difficult to manage.
For example, I have an app that should use mongo-c-library, log4c and libarchive. In my development environment I download and unzip the tarballs of these libraries, follow their instructions (usually some make steps), and install them so that the code can include them and build.
I have studied CMake a bit, but I couldn't get a clear picture of how it could actually solve the problem.
At the moment my best solution is to create an install bash script, zip all the unpacked dependency folders together with that script, and send it to the production server to deploy.
1. The first question is: is it possible to just copy and paste all the .so, .h, and other files from /path/of/installed/dependencies/include and /path/of/installed/dependencies/lib into the destination server's library paths?
2. If not, what is the faster way?
While I was browsing the CMake sources I found that its developers just use these packages' source code directly: cmxxx contains the xxx sources and header files.
3. How can apt-get and Linux package managers help in the deployment process?
The first two questions were more about dependencies. Imagine we have a simple C app that we want to install (build and produce a usable executable) quickly; how does that relate to .deb packages?
1. The first question is: is it possible to just copy and paste all the .so, .h, and other files from /path/of/installed/dependencies/include and /path/of/installed/dependencies/lib into the destination server's library paths?
Yes, technically it's possible. That's essentially what package managers do under the hood. However, doing it by hand is a colossal mistake and screams bad practice. If that's what you want, then at the very least you should look into package managers to build your own installer, since they already handle this sort of thing for you.
2. If not, what is the faster way?
You're actually asking an entirely different question, which is: how should I distribute my code, and how do I expect users to use/deploy it?
If you want users to access your source code and build it locally, then, as you've mentioned cmake, you just need to set up your project properly, since cmake already supports that use case.
If instead you just want to distribute binaries for a platform, then you'll need to build and package that code. Again, cmake can help you there, as cmake's cpack supports generating several package types, such as the DEB packages used by Debian and Ubuntu and handled by apt.
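For the binary route, here is a minimal sketch of the cpack flow; it assumes the project's CMakeLists.txt already calls include(CPack) and sets the usual CPACK_* metadata (package name, version, maintainer), and the exact output file name will depend on those settings:

    # configure and build out of source
    cmake -S . -B build -DCMAKE_BUILD_TYPE=Release
    cmake --build build
    # generate a .deb from the build tree
    cd build && cpack -G DEB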
3. How can apt-get and Linux package managers help in the deployment process?
apt is designed to download and install packages from a repository.
Under the hood, apt uses DEB packages, which can be installed with dpkg.
If you're targeting a system that uses apt/deb, you can build DEB packages whenever you release a version to allow people to install your software.
You can also go a step beyond and release your DEB packages in a Personal Package Archive.
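As a minimal sketch of that workflow (the package file name below is hypothetical):

    # install a locally built package with dpkg
    sudo dpkg -i myapp_1.0.0_amd64.deb
    # if dpkg reports missing dependencies, let apt resolve them
    sudo apt-get install -f
    # newer apt versions can do both steps at once:
    sudo apt install ./myapp_1.0.0_amd64.deb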
You would typically NOT download and install source packages. Instead you should generally rely on the libraries and development packages of the distribution. When building your own package you would typically just reference the packages or files that your package is dependent on. Then you build your own package and you're done. Upon installation of your package, all dependencies will automatically be resolved in an appropriate order.
What exactly needs to be done is dependent on the package management system, but generally the above statements apply. Be advised, package management apparently is pretty hard, because so many 3rd party developers screw it up.
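For example, you can inspect the dependencies a package declares before installing it; the .deb file name below is hypothetical, and libarchive13 is just an example of a distribution package:

    # dependencies declared by a local .deb
    dpkg -I myapp_1.0.0_amd64.deb | grep Depends
    # dependencies of a package from the distribution's repositories
    apt-cache depends libarchive13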

How to install C source files and headers?

I've been given these source files and headers. In the README.md the authors explain how to launch the test executables without a proper installation; it is just a make command to run. They also explain how to generate the .so files. I think the latter are meant to be used if I wanted to install the APIs at system level (the definitions should be in api.h). My question is: where should I copy the shared objects generated by the Makefile and the api.h header? I aim to write a source file from scratch where I use those APIs (e.g. crypto_sign()) just by including the headers, if that is possible. Thanks
where should I copy the shared objects generated by the Makefile and the api.h header? I aim to write a source file from scratch where I use those APIs (e.g. crypto_sign()) just by including the headers, if that is possible
Nowhere.
The project comes with CMake support. Use CMake in your project and just add_subdirectory the repository directory.
Anyway, if you really wish to install the library system-wide, the FHS specifies the directory structure on Linux: for locally administered software, use /usr/local/lib for the library .so files and /usr/local/include for the C header files.
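If you do go the system-wide route, a minimal sketch, assuming the Makefile produced a library named libapi.so and that api.h is the public header (both names may differ in your build):

    # copy the build artifacts into the local hierarchy
    sudo cp libapi.so /usr/local/lib/
    sudo cp api.h /usr/local/include/
    # refresh the dynamic linker cache so libapi.so is found at runtime
    sudo ldconfig
    # in your own source file: #include <api.h>
    # then build and link against the library:
    cc main.c -lapi -o main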

Are valgrind suppression files included in the installation?

The valgrind tool's source code releases (http://valgrind.org/downloads/current.html) show suppression files in the root directory, such as darwin16.supp.
If I want to use these files do I have to obtain them from a source download or are they added to some location on a machine during installation?
E.g. if I have my CI install Valgrind, will I then be able to reference one of these .supp files from some location within the system, or do I need to make it available in some other way?
Analysis of Valgrind's build system, particularly its top-level Makefile.am, shows that at build time it creates one default suppression file appropriate for the platform, and that it installs that file at the installation stage. The default suppression file seems to be built from zero or more of the individual suppression files you're looking at, but the individual files are not installed.
What is or is not included in a particular pre-built Valgrind package is an entirely different question. You can probably rely on the generated default suppression file to be included, as Valgrind depends on it. For example, the Valgrind 3.12.0 packages for CentOS 7 indeed do include it. Those particular packages do not include the individual component files from the Valgrind source, which is not surprising because the build system does not install them. Other packages might nevertheless provide them.
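A quick way to check and use this on a given machine (the package queries are Debian/Ubuntu and CentOS/RHEL examples; the custom suppression file name is hypothetical):

    # list the .supp files the installed valgrind package ships
    dpkg -L valgrind | grep '\.supp$'    # Debian/Ubuntu
    rpm -ql valgrind | grep '\.supp$'    # CentOS/RHEL
    # the generated default.supp is loaded automatically; extra files,
    # e.g. one you keep in your repository, are passed explicitly:
    valgrind --suppressions=my_project.supp ./my_program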

Why can't I load GLScene as a package in C++Builder?

I am trying to use GLScene (a third-party FireMonkey component for C++Builder and Delphi). I downloaded the zip, and in it, among other things, are .bpl files for use in C++Builder. When I try to install the package in my project (a C++Builder multi-device application) under Component > Install Packages... > Add, I get this error:
My directory structure:
The _Installation directory and the Readmes are not very helpful. I have tried the other .bpl files and also tried importing all of them at once, but the same error occurs. How do I fix this?
There are several problems.
The XE3 suffix is suspicious. I just downloaded and unzipped the GLScene download; the .bpl files in the CBXE3 and CBXE4 directories seem to be compiled for XE3 and XE4 respectively, but packages are version-dependent.
You should recompile the sources (also included) for 10.2 Tokyo. Just copy the DelphiXE4 directory to a new directory and load the .groupproj file or the single .dpk files into the IDE. You might want to change the suffixes (currently XE3) of the packages to, say, Tokyo, in the project options, to avoid DLL (or BPL) hell. Also take a look at the pictures in the _Installation directory for the other options that must be set. Note that your paths may differ; theirs are just an example.
The compiler needs to be able to find the .bpi, .lib, etc. files in order to link. Set the directories containing those files in the project options, as shown in the _Installation pictures.
The newly compiled .bpl files should be compiled to a directory on the Windows PATH, so they can be found by the system at runtime. In their example setup (and in the _Installation pictures) that is the C:\Library\GLScene path shown.

How to build axis2c Unofficial source code

I have to create SOAP services in C using axis2C. But since axis2C is not maintained properly, as per this question, I have to use the unofficial axis2C source code. However, I could not find a configure file to build the sources. How should I build this? I checked all the documentation, both here and in the GitHub repo, but had no luck; everything points to the official axis2C documentation. Should I copy the sources from the unofficial repository into the official one and try the configure script in the official folder?
This project probably uses the GNU build system. In this system, ./configure is a portable shell script that is automatically generated from hand-written input files; the main input file is configure.ac.
So, distribution packages of the source will contain ./configure, enabling anyone on any platform with an sh-compatible shell and a make utility to build the software. But as it is a generated file, you will not find it in source-code repositories, e.g. on GitHub.
To build a package using the GNU build system directly from source control, you have to run the GNU autotools yourself to generate ./configure. In order to do this, install the following packages:
autoconf -- generates ./configure from ./configure.ac.
automake -- generates portable makefile templates Makefile.in from Makefile.am (those templates are then filled with values by the ./configure script to write the final Makefiles)
libtool -- tools for building dynamic shared objects on different platforms
Then running autoreconf -i in the root of your source tree should generate the ./configure script for you.
Note that some packages provide a script named ./autogen.sh (or similar). If it is there, you should call it instead of running autoreconf -i yourself; it might contain additional necessary preparation steps, and it will typically run the generated ./configure for you directly.
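Putting it together, a minimal sketch of the whole sequence (the checkout directory name is hypothetical, and the tool package names are for Debian/Ubuntu):

    sudo apt install autoconf automake libtool
    cd axis2c-unofficial                 # your clone of the repository
    autoreconf -i                        # or ./autogen.sh, if the project provides one
    ./configure --prefix=/usr/local
    make
    sudo make install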
