IHaskell for new-style cabal project? - cabal

When using IHaskell via JupyterLab, there seems to be only partial support for new-style cabal projects.
When creating a notebook in a cabal project's directory, IHaskell picks up the .ghc.environment file, so the kernel sees exactly the same package versions as cabal uses. Nice!
However, some other things work less smoothly:
Module paths: modules from the same cabal package cannot be imported, presumably because IHaskell doesn't know where cabal keeps compiled module files.
Language extensions: My cabal file has ViewPatterns under language-extensions, which is not automatically enabled in the IHaskell session.
Although there is only one ihaskell package installed, this message appears on the JupyterLab console:
Disabling IHaskell widget support due to an encountered error:
The installed IHaskell support libraries do not match the instance of IHaskell you are running.
This *may* cause problems with functioning of widgets or rich media displays.
This is most often caused by multiple copies of IHaskell being installed simultaneously in your environment.
To resolve this issue, clear out your environment and reinstall IHaskell.
If you are installing support libraries, make sure you only do so once:
# Run this without first running `stack install ihaskell`
stack install ihaskell-diagrams
If you continue to have problems, please file an issue on Github.
Are there any known workarounds for these problems?
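A manual stopgap for the extensions point, at least (assuming nothing better exists; IHaskell does accept GHCi-style directives in a cell), is to enable each extension at the top of the notebook:
:set -XViewPatterns
but that obviously does not address the module-path or widget issues.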

Related

C app deployment and managing dependencies in C

I'm new to C development, but I have some experience in other modern languages, so the first thing I found hard is dependencies and deployment. We have Gradle, Maven, NuGet, pip, and so on, but in C I find this process a bit difficult to manage.
For example, I have an app that should use mongo-c-library, log4c, and libarchive. So, in my development environment, I downloaded and unpacked the tar files for all of the above libraries, followed their instructions (usually some make steps), and installed them so I could include them in my code and make it work.
I have studied CMake a bit, but I couldn't get a clear picture of how it would actually solve the problem.
At the moment my best solution is to create an install bash script, zip it together with the unpacked dependency folders, and then send that to the production server to deploy it.
1. The first question is: is it possible to just copy and paste all of the .so, .h, and similar files from /path/of/installed/dependencies/include and /path/of/installed/dependencies/lib into the destination server's library path?
2. If not, what is the faster way?
While I was looking through the CMake source tree I found that its developers just use these packages' source code directly: the cmxxx directories contain the xxx sources and header files.
3. How can apt-get and Linux package managers help in the deployment process?
The first two questions were more about dependencies. Imagine we have a simple C app and we want to install it (build it and make a usable executable file) quickly; how does that relate to .deb packages?
1. The first question is: is it possible to just copy and paste all of the .so, .h, and similar files from /path/of/installed/dependencies/include and /path/of/installed/dependencies/lib into the destination server's library path?
Yes, technically it's possible. That's essentially what package managers do under the hood. However, doing that is a colossal mistake and screams bad practices. If that's what you want, then at the very least you should look into package managers to build your own installer, since they already handle this sort of thing for you.
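Just to make the mechanics concrete, a hedged sketch of that manual approach (the destination paths are illustrative, not taken from the question) would be:
# copy headers and shared libraries to standard locations on the target machine
sudo cp -r /path/of/installed/dependencies/include/. /usr/local/include/
sudo cp -r /path/of/installed/dependencies/lib/. /usr/local/lib/
# refresh the dynamic linker cache so the new .so files are found at runtime
sudo ldconfig
This is exactly the kind of by-hand bookkeeping a package manager would otherwise track for you.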
2. If not, what is the faster way?
You're actually asking an entirely different question, which is: how should I distribute my code, and how do I expect users to use/deploy it?
If you want users to access your source code and build it locally, then, as you've mentioned CMake, you just need to set up your project properly; CMake already supports that use case.
If instead you just want to distribute binaries for a platform, then you'll need to build and package that code. Again, CMake can help you there, as CMake's CPack supports generating several package types, including the DEB packages used by Debian and Ubuntu and handled by apt.
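As a rough illustration of that route (the project name, file names, and Debian package names below are placeholders, not taken from the question), a CMakeLists.txt wired up for CPack's DEB generator might look like this:
cmake_minimum_required(VERSION 3.16)
project(myapp C)
add_executable(myapp main.c)
install(TARGETS myapp RUNTIME DESTINATION bin)
# CPack metadata for the .deb
set(CPACK_GENERATOR "DEB")
set(CPACK_PACKAGE_VERSION "1.0.0")
set(CPACK_DEBIAN_PACKAGE_MAINTAINER "you@example.com") # the DEB generator requires a maintainer
set(CPACK_DEBIAN_PACKAGE_DEPENDS "libarchive13, liblog4c3") # runtime dependencies, assumed names
include(CPack)
After configuring and building (cmake -B build && cmake --build build), running cpack -G DEB inside the build directory should produce the .deb.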
3. How can apt-get and Linux package managers help in the deployment process?
apt is designed to download and install packages from a repository.
Under the hood, apt uses DEB packages, which can be installed with dpkg.
If you're targeting a system that uses apt/deb, you can build DEB packages whenever you release a version, to allow people to install your software.
You can also go a step beyond and release your DEB packages in a Personal Package Archive.
You would typically NOT download and install source packages. Instead you should generally rely on the libraries and development packages of the distribution. When building your own package you would typically just reference the packages or files that your package is dependent on. Then you build your own package and you're done. Upon installation of your package, all dependencies will automatically be resolved in an appropriate order.
What exactly needs to be done is dependent on the package management system, but generally the above statements apply. Be advised, package management apparently is pretty hard, because so many 3rd party developers screw it up.
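For a concrete (made-up) example of such a dependency declaration, a hand-rolled binary .deb carries a DEBIAN/control file along these lines:
Package: myapp
Version: 1.0.0
Architecture: amd64
Maintainer: you@example.com
Depends: libarchive13, liblog4c3
Description: Example application
When the resulting package is installed with apt, the Depends line is what triggers the automatic resolution and installation of those libraries first.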

Could not perform backup protocol version exchange, error code -1

I need to back up an iPhone with libimobiledevice on Ubuntu. The device is detected, but when I go to launch the backup command the following error is displayed:
Started "com.apple.mobilebackup2" service on port 49343.
Could not perform backup protocol version exchange, error code -1
What could be causing this?
Several GitHub issues have reported this problem, like this one.
Solution:
You need to use the latest version of idevicebackup and libimobiledevice.
Indeed, if you use Ubuntu 20.04 (for instance), the libimobiledevice package is outdated, as of now.
If that's your case, you'll have to either wait for the next Ubuntu release (22.04) or compile it from source, which may become necessary at some point after the release of Ubuntu 22.04 anyway.
Disclaimer: the downside of compiling yourself is that your binaries are not managed by the package manager. You'll have to update them yourself, pulling or downloading the newest source code releases and recompiling everything every time, and you might have to redo all of this after a distribution upgrade. The upside is that your binaries do work...
Note: the compilation steps are described on the official site only for Debian; I could perform them equally well on Linux Mint 20.3 (based on Ubuntu, itself based on Debian). The OP does not mention which OS they use, but Debian-based instructions seem to be the only ones available for now, so what follows should work on Debian-based OSes.
Compilation from source, step by step:
uninstall the official package and its dependencies, then:
install the build dependencies: sudo apt install build-essential checkinstall git autoconf automake libtool-bin libplist-dev libusbmuxd-dev libssl-dev usbmuxd (see "from source" here)
get the libimobiledevice source code from its repo, using for instance git clone https://github.com/libimobiledevice/libimobiledevice.git. You might instead go to the releases page and use the latest tar.gz (1.3 at the moment).
also get source code of other libraries required by libimobiledevice: libplist, libimobiledevice-glue and libusbmuxd. (I also compiled usbmuxd instead of using the official package, but I am not sure it is necessary). For each one of them, you can git clone it or download and untar the latest source code release, if available.
choose a prefix directory, where libraries and binaries will go. Create it if necessary (official libimobiledevice site suggests /opt/local and I will use this too in the next steps; in order for the compilation to work, you'll have to sudo mkdir /opt/local and export PKG_CONFIG_PATH=/opt/local/lib/pkgconfig before starting the first compilation)
to compile and install, cd to the root of each git-cloned (or source-downloaded) directory (in this order: libplist, libimobiledevice-glue, libusbmuxd and libimobiledevice, because each one depends on the previous one) and execute, in each one of them: ./autogen.sh --prefix=/opt/local, then make, and finally sudo make install. (Note: the autogen line for libimobiledevice may be ./autogen.sh --prefix=/opt/local --enable-debug, as suggested here.) A consolidated sketch of these steps follows below.
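Put together, and assuming each repository has already been cloned into the current directory, the whole sequence looks roughly like this:
sudo mkdir -p /opt/local
export PKG_CONFIG_PATH=/opt/local/lib/pkgconfig
# build in dependency order; each library needs the ones before it
for lib in libplist libimobiledevice-glue libusbmuxd libimobiledevice; do
    cd "$lib"
    ./autogen.sh --prefix=/opt/local
    make
    sudo make install
    cd ..
done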
Having done all of this, the iPhone was not mounted automatically; I had to run idevicepair pair manually, and then I could mount it using ifuse ./iphone_mount_point/ (do sudo apt install ifuse if necessary) and perform a backup using idevicebackup2 backup --full iphone_backup/. Read the help of idevicebackup2 for more information.

Conan cannot find a certain package for the specified settings, options and dependencies

I am working on a small C executable project using JetBrains CLion 2019.3, MinGW 8.1, and also the Conan C/C++ Package Manager 1.21.1. I am refreshing my knowledge about C and want to learn about new tools like Conan. My main development environment is Windows, but this project is intended to be cross-platform; I would like to be able to build and run the application on Linux/Unix as well.
Since my application needs to compute signatures using HMACSHA1, I want to use the OpenSSL library, so I added the OpenSSL/1.1.1a@conan/stable package to the requires section of my conanfile.txt file, and I also created a Conan profile for MinGW that has the following options:
toolchain=$MINGW64_PATH
target_host=x86_64-w64-mingw32
cc_compiler=gcc
cxx_compiler=g++
[env]
CONAN_CMAKE_FIND_ROOT_PATH=$toolchain
CHOST=$target_host
AR=$target_host-ar
AS=$target_host-as
RANLIB=$target_host-ranlib
CC=$target_host-$cc_compiler
CXX=$target_host-$cxx_compiler
STRIP=$target_host-strip
RC=$target_host-windres
[settings]
os_build=Windows
arch_build=x86_64
# We are cross-building to Windows
os=Windows
arch=x86_64
compiler=gcc
# Adjust to the gcc version of your MinGW package
compiler.version=8.1
compiler.libcxx=libstdc++11
build_type=Release
The MINGW64_PATH points to my MinGW installation folder.
When running conan install it complains about a missing package (obviously a dependency package of OpenSSL) that does not exist:
zlib/1.2.11@conan/stable: WARN: Can't find a 'zlib/1.2.11@conan/stable' package for the specified settings, options and dependencies:
- Settings: arch=x86_64, build_type=Release, compiler=gcc, compiler.version=8.1, os=Windows
- Options: minizip=False, shared=False
- Dependencies:
- Package ID: eb34f13b437ddfd63abb1f884c4b8886c48b74cd
ERROR: Missing prebuilt package for 'zlib/1.2.11@conan/stable'
Try to build it from sources with "--build zlib"
Or read "http://docs.conan.io/en/latest/faq/troubleshooting.html#error-missing-prebuilt-package"
Since I am a noob using Conan, I have no clue how I can fix this problem. What needs to be done to fix this issue, and also can I fix this on my own, or do I need help from the package author?
I found a description of the Missing prebuilt package error at https://docs.conan.io/en/latest/faq/troubleshooting.html#error-missing-prebuilt-package, but it does not help much.
so I added the OpenSSL/1.1.1a@conan/stable package to the requires
That package is obsolete; you can check it in the Conan Community repository. You should try openssl/1.1.1a@ instead, which is maintained by the new Conan Center Index.
conan install openssl/1.1.1d@
Where is the namespace? It has been removed; take a look at the Conan documentation for more information about recipes.
Since I am a noob using Conan, I have no clue how I can fix this problem. What needs to be done to fix this issue, and also can I fix this on my own, or do I need help from the package author?
As the FAQ recommends, you should build it yourself, running the command proposed by the error message:
conan install openssl/1.1.1a@ --build zlib
But I'm sure that won't be enough; you will need to build OpenSSL too. So the best approach in your situation is:
conan install openssl/1.1.1a@ --build missing
Now Conan will build from source anything that is not pre-built on the server side.
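For reference, a minimal setup along those lines could be a conanfile.txt like this (the cmake generator and the profile name mingw are assumptions, not taken from the question):
[requires]
openssl/1.1.1d

[generators]
cmake
followed by:
conan install . --profile mingw --build missing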
To summarize, this is not an error in the sense that something is broken.
When you asked for OpenSSL 1.1.1a, Conan found the recipe on Conan Center, which explains how to build OpenSSL; however, it didn't find a pre-built package matching your settings and options.
Well, MinGW is not used in the Conan Center Index because there is not enough demand; all supported platforms and configurations are listed in the Wiki. But this specific recipe should support MinGW, since when it was part of Conan Community, MinGW was present in the package lists for building.
I would say you can use 1.1.1d instead, which is newer and safer than 1.1.1a.

Installing a hackage package with stack (not in LTS or nightly)

I'm getting started with stack, and I'm not entirely sure how to pull in a package that appears in hackage but not in the curated builds.
In particular, I'd like to pull in thrift-0.10.0. It seems I can't specify it in my project.cabal file, nor does the extra-deps section work since there is no resolver that contains this package.
When I run stack install thrift-0.10.0, I receive the following error:
While constructing the build plan, the following exceptions were encountered:
In the dependencies for thrift-0.10.0:
vector-0.11.0.0 must match ==0.10.12.2 (latest applicable is 0.10.12.2)
I'm not really sure (a) what stack install does, and (b) how to resolve the build plan since the thrift package specifies an equality (==) on the vector-0.10.12.2 package. If I try and include the relevant vector == 0.10.12.2 in my package.cabal, that also fails. Do I need to specify an earlier resolver?
I realize I have much to learn about this build tool, but in this case my primary question, for which no documentation seems readily available, is:
how do I include a hackage package in my stack build?
nor does the extra-deps section work since there is no resolver that contains this package.
extra-deps can contain any hackage package.
(a) what stack install does
It just does a build of the package + copying of executables to .local/bin
stack install shouldn't be used for dependencies; instead, it should be used for your local project or for applications from hackage (packages with executables). There is no benefit to installing the dependencies of your project; they should be specified in stack.yaml instead.
(b) how to resolve the build plan since the thrift package specifies an equality (==) on the vector-0.10.12.2 package.
It seems really ugly for the thrift package to have an (==) constraint like that. To get around it, either set "allow-newer: true" in your stack.yaml (which causes constraints to be ignored) or, probably better, add vector-0.10.12.2 to your extra-deps, as sketched below.
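A hedged sketch of that stack.yaml (the resolver shown is only an example, not from the question):
resolver: lts-8.0
extra-deps:
- thrift-0.10.0
- vector-0.10.12.2
With both packages pinned in extra-deps, stack can satisfy thrift's vector ==0.10.12.2 constraint even though the snapshot ships a different vector.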
for which no documentation seems readily available, is:
how do I include a hackage package in my stack build?
See this section of docs: https://docs.haskellstack.org/en/stable/GUIDE/#external-dependencies

MacPorts warning when installing automake: Warning: Deactivate forced. Proceeding despite dependencies

I just installed MacPorts and issued the command:
sudo port install automake
Throughout the process I see this message:
Warning: Deactivate forced. Proceeding despite dependencies.
What does it mean? Why did it happen? Is it critical and, if so, should I do anything about it?
Thanks,
gb
Based on my own experience this is not critical, and in most cases you don't have to do anything about it. It is only a warning.
This warning is shown when an old version of a software package 'A' is uninstalled and replaced by a newer version, while another previously installed software package 'B' depends on it.
This package 'B' was built with the old version of package 'A' in mind, and might break if the newer version isn't compatible. However, for most updates this is not a problem.
(After upgrading packages you will notice that MacPorts scans your files for linking errors, which, I believe, should detect this kind of problem and fix it automatically.)
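If you want to run that check by hand (assuming the scan referred to is MacPorts' rev-upgrade mechanism), you can do so with:
sudo port rev-upgrade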
