I'm using delete-package as a simple way to unintern a lot of old (user-specified) variable names before loading a project back in after certain edits to the package files. (Otherwise, new values can get pushed onto the old values of those variable names defined by the end user.) But SBCL complains when I try to reload after the deletion.
After a one-time load of an init.lisp file that sets up Quicklisp and ASDF and installs some Quicklisp libraries, I then load the project with (progn (asdf:load-system "my-project") (in-package :my-package)), where the project definition file my-project.asd contains
(when (find-package :my-package)
  (delete-package :my-package))

(defpackage :my-package
  (:use :cl))

(asdf:defsystem "my-project"
  …)
This all works fine on the first load, but stumbles on the second load of the project, because
*PACKAGE* can't be a deleted package:
It has been reset to #<PACKAGE "COMMON-LISP-USER">.
Where is the error coming from? Can it be fixed, retaining the same functionality?
Create a separate package to hold the user-defined symbols, say (defpackage :us), in addition to the working package. (:use :cl) is not required, since the package contains only data. Install the user symbols using (in-package :us) when loading the user files, by interning into :us programmatically, or by direct reference with the package prefix, and access the symbols with the package prefix. (delete-package :us) should then work.
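A minimal sketch of that arrangement (the symbol WIDTH and its value are made up for illustration):

;; a dedicated package that only holds user data; no (:use :cl) needed
(defpackage :us)

;; intern a user symbol programmatically and give it a value
(setf (symbol-value (intern "WIDTH" :us)) 42)

;; read it back from the working package with an explicit package prefix
(print (symbol-value 'us::width))

;; before reloading the edited user files, drop the whole package
(when (find-package :us)
  (delete-package :us))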
Fundamentally, SBCL can delete and recreate packages with the same name. The error message indicates that the current package (*PACKAGE*) has been deleted. Make sure that the current package is a different package, e.g. CL-USER, when deleting your package.
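A sketch of the reload sequence with that precaution, using the same forms as in the question:

(in-package :cl-user)              ; leave the package that is about to be deleted
(asdf:load-system "my-project")    ; my-project.asd can now delete and recreate :my-package
(in-package :my-package)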
Related
I am using MiKTeX 2.9 and Texmaker. I need to download packages, such as the setspace package, manually from https://ctan.org/. I have done that and placed the resulting folder in: C:\Users\User1\AppData\Local\Programs\MiKTeX 2.9\tex\latex
Other previously installed packages are in that location and are recognised by MiKTeX as present.
The setspace folder contains setspace.sty and two test files. However, when I compile in Texmaker it doesn't recognise that the package is present, and wants to download it.
Does the .sty file need to go somewhere specific, and how do I find out where that might be, please?
I have a project which has three project files targeting various .net versions (mylib.net20.csproj, mylib.net40.csproj, and mylib.net40-client.csproj). I have a single nuspec file named mylib.nuspec for packaging them all together.
It might help to see the files section of the nuspec file, so here it is:
<files>
<file src="bin\net20\*.dll" target="lib\net20\" />
<file src="bin\net40-client\*.dll" target="lib\net40-client\" />
<file src="bin\net40\*.dll" target="lib\net40\" />
</files>
Right now I use a static version number in the nuspec file and I can successfully package everything by running nuget pack mylib.nuspec. However, I would like to start using $version$ so I don't have to remember to update the version number in two places.
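For reference, the metadata section of the nuspec would then contain something like this (other fields elided):
<metadata>
  <id>mylib</id>
  <version>$version$</version>
  <authors>...</authors>
  <description>...</description>
</metadata>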
If I simply change the version number to $version$ and build in the same way, I get the predictable error:
Attempting to build package from 'mylib.nuspec'.
The replacement token 'version' has no value.
If I use $version$ and package with nuget pack mylib.net40.csproj, it is successful, but I get a package that completely ignores the nuspec file.
Q: What can I do to get the $version$ variable to work?
Technically, I could rename my nuspec file to mylib.net20.nuspec and then package with nuget pack mylib.net20.nuspec. I really don't want to rename my nuspec file in this way though.
Creating a package can be done in two ways:
Using the nuspec file directly.
Using your csproj file, which in turn uses your nuspec file at pack time.
The first case is pretty straightforward: all of the metadata defined in the nuspec file is used for package creation.
nuget pack nuspecfilename.nuspec
On executing the above command, the hard-coded values in your nuspec file are used for package creation.
The above procedure uses the nuspec file directly.
Now to your requirement: using the $version$ token in your nuspec file.
In order to work with tokens rather than hard-coded values, you need to run the nuget pack command against the csproj file.
nuget pack nameofyourprojectfile.csproj -p "configuration=Release;platform=x64"
To explain the command: the point of packing from the csproj file is that NuGet replaces the tokens in the nuspec file at pack time.
The AssemblyInfo file that belongs to your csproj contains the version metadata that is substituted for $version$ when the package is built.
-p passes build properties; here I assumed the project builds with the Release configuration and the x64 platform, so NuGet knows which artifacts to bundle.
See the NuGet documentation for more details on how to create NuGet packages from a csproj.
Ideally, though, when you pack from a csproj the nuspec file is only picked up if it has the same name as the csproj.
So handling multiple project files targeting different frameworks may not be achievable with a single nuspec file; you would need an individual nuspec per project to do it that way.
But if you still want to stick with the same nuspec file, without renaming it or changing it, I would suggest passing the version at pack time:
nuget pack nameofnuspec.nuspec -v 1.3.4.5
-v (or -Version) is followed by the version of your choice.
After running the above command, the NuGet package is created with the version specified.
To check the created package, rename it to a .zip file and look inside.
I've installed Ocamlodbc using opam install odbc, but I can't work out how to build an app that uses it with ocamlbuild. The examples that come with the source don't build either.
If I put
#require "odbc";;
into my .ocamlinit, I can open Odbc_unixodbc;; in utop, but any reference to functions in that module results in a "Reference to undefined global 'Odbc_unixodbc'" error.
The following snippet also fails with an error about no implementation for "Odbc_unixodbc"
open Odbc_unixodbc
let () = ignore (Odbc_unixodbc.connect "DSN" "UID" "PWD")
Trying
open Odbc
fails with "Unbound module Odbc"
I'm building the code with
ocamlbuild -pkg odbc test.native
The generated documentation for the package seems to suggest I should be opening the "Ocamlodbc" module, but that also results in an "Unbound module" error.
TL;DR
ocamlbuild -use-ocamlfind -pkg odbc test.native
Description
-use-ocamlfind tells ocamlbuild to use the ocamlfind system to find libraries on your system. Otherwise, without this flag, you would need to pass flags with concrete locations yourself and also take care of the package dependencies. So it is a good idea to always use ocamlfind.
If this command still doesn't work for you, make sure that you chose the right package name. You can use ocamlfind list to look at the set of all packages available on your system.
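If you would rather not pass -pkg on the command line every time, the same information can live in a _tags file at the project root (a small sketch, assuming the findlib package name is odbc):

true: package(odbc)

With that in place, ocamlbuild -use-ocamlfind test.native is enough.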
Further reading
While the above is fine for small programs, I would suggest using the OASIS system to handle all the flags for you.
You can start from this example, adapting the dependency list to your needs.
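Roughly, a minimal _oasis file for a single executable might look like this (field names from memory; the project and file names are placeholders, adjust to your project):

OASISFormat: 0.4
Name:        test
Version:     0.1.0
Synopsis:    ODBC test program
Authors:     Your Name
License:     MIT
BuildTools:  ocamlbuild

Executable test
  Path:           .
  MainIs:         test.ml
  CompiledObject: native
  BuildDepends:   odbc

Running oasis setup then generates the setup.ml and build files.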
I'm working on a package for GNU Octave. One of the package functions uses a large, pre-computed table of data. That data is stored in a mat file which I load and unload when the function is called. The problem is that I'm unsure of a good, installation-independent way of doing this. As near as I can tell, I have to give the load command an absolute path to the mat file within the package install directory, and I can see no way of getting that directory at run time. Am I missing something, or am I just going about this the wrong way?
PKG_ADD is executed when the package is loaded, so you can get the path where PKG_ADD resides with __fqp__ = fileparts (mfilename ("fullpath")). From there you can initialize your data.
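A small sketch of that approach (the preference group "mypackage" and the file name table.mat are made up):

%% in the package's PKG_ADD file, executed when the package is loaded
__fqp__ = fileparts (mfilename ("fullpath"));   % directory this package was installed to
setpref ("mypackage", "datadir", __fqp__);      % remember it for the package functions

%% later, inside the function that needs the table
s = load (fullfile (getpref ("mypackage", "datadir"), "table.mat"));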
If you want the install dir of another package, it's possible to load (pkg ("local_list")) and inspect local_packages.
I also think such specific Octave questions should go to the help@octave.org mailing list.
Do you want to distribute your new package on octave-forge?
I have a set of functions saved in a clojure file.
How do I provide a selected set of functions and then import those functions in my other file?
You have a few options.
If it's just a file (not in a package), then in your files you can just use load. If your file was named "fun.clj", you would just use the name of the file without the extension:
(load "fun")
(provided fun.clj was on your classpath). Or
(load "files/fun")
if it wasn’t on your classpath but in the files directory.
Or you could use load-file and pass it the location of your file:
(load-file "./files/fun.clj")
If you wanted to namespace them (put them in a package), then you'd put the ns macro at the beginning of your file and, again, put it on your classpath. Then you could load it via use or require.
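For example (the file, namespace, and function names are made up), src/files/fun.clj could start with an ns form and the consuming file would require it:

;; src/files/fun.clj
(ns files.fun)

(defn greet [name]
  (str "Hello, " name))

;; the other file, e.g. src/my/app.clj
(ns my.app
  (:require [files.fun :as fun]))

(fun/greet "world")   ;; => "Hello, world"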
Here are the docs for the functions I’ve described:
load
load-file
ns
use
require
Besides "load"ing source files, you can also use the leiningen "checkout dependencies" feature. If you have two leiningen projects, say, project A requires B (provider). In the root directory of project A, create a directory called "checkouts". Inside "/checkouts" make a symbolic link to the root directory of project B.
- project A
  - project.clj
  - checkouts
    - symlink to project B
  - src
  - test
In project A's project.clj, declare project B as a dependency in the :dependencies section, as if it were a project on clojars.org. E.g.
(defproject project-a "0.1.0"
  :dependencies [[org.clojure/clojure "1.5.1"]
                 [project-b "0.1.0"]])
The thing is, though, that you have to go into project B and run:
lein install
That will compile project B's files into a jar file, which will appear in your ~/.m2 directory, which is kind of like your local clojars.org cache.
Once you set all this up, in your *.clj src file(s) of project A, you can "require" project B files in the normal way, as if it were from clojars.org:
(ns project-a.core
  (:require [project-b.core :as pb]))
This way, you can refer to functions in your project-b.core the usual way like this:
pb/myfunction1
In my opinion, this is a pretty good way to share libraries and data between Leiningen projects while trying keep each Leiningen project as independent, self-contained, and minimal as possible.
This one solved my problem, and since I have looked through countless other issues here, I would like to clarify.
The easiest way in Emacs (on Linux) is to do something like this:
java -cp "lib/*":. clojure.main -e "(do (require 'swank.swank) (swank.swank/start-repl))"
(note the "lib/*":. given to -cp
Then you can use M-x slime-connect to connect with this instance.
Don't know if it's required, but I have read that it's a good idea to use the same version of clojure, clojure-contrib and swank-clojure on both sides.
You can also set up the path inside Emacs by consing "." onto swank-clojure-classpath.
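Something like this in your Emacs init file should do it, assuming swank-clojure.el is installed and defines the swank-clojure-classpath variable mentioned above:

;; assumes swank-clojure.el is on your load-path
(require 'swank-clojure)
(setq swank-clojure-classpath
      (cons "." swank-clojure-classpath))   ;; add the current directory, as described above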