I have (at least) one package where my main program lives, and another package for running tests. I :use the main program's package in the defpackage form of the test package, but that only makes the exported symbols accessible. So I can't test all of the functions, only the ones I have explicitly exported (the public API). How do I solve this?
You can always refer to internal (un-exported) symbols with a double-colon qualifier:
(package-name::function-name)
You can also import a symbol into your test package (regardless of whether it's been exported from the main package) with import. For instance:
(import 'package-name::function-name)
(fboundp 'function-name) ;; => t
Here's the CLHS entry on import.
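For example (the package and function names here are invented for illustration), a test package can reach an un-exported helper either through :import-from in its defpackage form or with the double colon:
;; Hypothetical main package: only PUBLIC-FN is exported.
(defpackage #:my-app
  (:use #:cl)
  (:export #:public-fn))

(in-package #:my-app)

(defun internal-helper (x) (* x 2))        ; stays internal
(defun public-fn (x) (internal-helper x))

;; Test package: :use makes the exported symbols accessible,
;; :import-from pulls in the internal one by name.
(defpackage #:my-app-test
  (:use #:cl #:my-app)
  (:import-from #:my-app #:internal-helper))

(in-package #:my-app-test)

(internal-helper 21)          ;; => 42, no prefix needed
(my-app::internal-helper 21)  ;; => 42, double-colon access also works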
Also, if you haven't read it, I recommend the Programming in the Large: Packages and Symbols chapter from Practical Common Lisp. It doesn't directly address your question, but I mention it because I've found it very helpful regarding packages and symbols in general.
When building my cn1 project I hit the following exception:
redacted-path\nbproject\mirah-build-cn1.xml:152:
java.lang.RuntimeException: Could not find stub for interface ItemPrice>
at ca.weblite.asm.JavaExtendedStubCompiler$2.visitClassImpl(JavaExtendedStubCompiler.java:764)
at ca.weblite.asm.JavaExtendedStubCompiler$2.visitClass(JavaExtendedStubCompiler.java:695)
at com.sun.tools.javac.tree.JCTree$JCClassDecl.accept(JCTree.java:720)
at com.sun.source.util.TreePathScanner.scan(TreePathScanner.java:68)
at com.sun.source.util.TreeScanner.scan(TreeScanner.java:91)
at com.sun.source.util.TreeScanner.scanAndReduce(TreeScanner.java:99)
at com.sun.source.util.TreeScanner.visitCompilationUnit(TreeScanner.java:120)
at ca.weblite.asm.JavaExtendedStubCompiler$2.visitCompilationUnit(JavaExtendedStubCompiler.java:278)
at com.sun.tools.javac.tree.JCTree$JCCompilationUnit.accept(JCTree.java:550)
at com.sun.source.util.TreePathScanner.scan(TreePathScanner.java:68)
at com.sun.source.util.TreeScanner.scan(TreeScanner.java:91)
at ca.weblite.asm.JavaExtendedStubCompiler.compile(JavaExtendedStubCompiler.java:887)
at ca.weblite.asm.JavaExtendedStubCompiler.compileFile(JavaExtendedStubCompiler.java:176)
at ca.weblite.asm.JavaExtendedStubCompiler.compileDirectory(JavaExtendedStubCompiler.java:213)
at ca.weblite.asm.JavaExtendedStubCompiler.compileDirectory(JavaExtendedStubCompiler.java:216)
at ca.weblite.asm.JavaExtendedStubCompiler.compileDirectory(JavaExtendedStubCompiler.java:216)
at ca.weblite.asm.JavaExtendedStubCompiler.compileDirectory(JavaExtendedStubCompiler.java:216)
at ca.weblite.asm.JavaExtendedStubCompiler.compileDirectory(JavaExtendedStubCompiler.java:216)
at ca.weblite.asm.JavaExtendedStubCompiler.compileDirectory(JavaExtendedStubCompiler.java:195)
at ca.weblite.asm.WLMirahCompiler.compile(WLMirahCompiler.java:208)
at ca.weblite.mirah.ant.MirahcTask.execute(MirahcTask.java:158)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:292)
at sun.reflect.GeneratedMethodAccessor276.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:99)
at org.apache.tools.ant.Task.perform(Task.java:350)
at org.apache.tools.ant.Target.execute(Target.java:449)
at org.apache.tools.ant.Target.performTasks(Target.java:470)
at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1388)
at org.apache.tools.ant.Project.executeTarget(Project.java:1361)
at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41)
at org.apache.tools.ant.Project.executeTargets(Project.java:1251)
at org.apache.tools.ant.module.bridge.impl.BridgeImpl.run(BridgeImpl.java:261)
at org.apache.tools.ant.module.run.TargetExecutor.run(TargetExecutor.java:574)
at org.netbeans.core.execution.RunClassThread.run(RunClassThread.java:128)
The build complains about not finding a stub for interface ItemPrice, but this interface is not referenced from within the package or class I am generating a data_mapper for.
I investigated the build/mirah_tmp folder further. Indeed, Mirah tried to generate stubs for a non-referenced interface in a different package.
To demonstrate the problem, I created the following simplified project (accessible on GitHub) out of a basic Hello World cn1 template.
In this simplified project structure, I have the following three packages:
1. com.company.project
MyApplication.java
2. com.company.project.firstmodel
AuthContext.java
DataMappers.mirah
3. com.company.project.secondmodel
Address.java
DataMappers.mirah references only AuthContext (along with its package), and AuthContext does not reference Address in any way.
package com.company.project.firstmodel
data_mapper AuthContext:AuthContextMapper
On build, looking at build/mirah-tmp, it seems that stubs were also unexpectedly generated for Address.java, as is evident from the uploaded GitHub build folder.
I am under the impression that Mirah attempts to generate stubs for literally every file within my project (yet to be verified, since the build is already failing on the missing interface stub).
Any help is appreciated.
Edit: Screenshot seen on netbeans startup after moving my app source files to a new project.
The Mirah integration works by first compiling stubs for all of the Java classes in the project. It then compiles the Mirah files, using the compiled Java stubs for their dependencies. Finally, it does a complete pass that compiles all of the Java files fully. The first step, compiling the stubs, needs to cover every file in the project because at that point it doesn't yet know which files the Mirah files reference; that is only discovered in step 2.
If you can put together a test case that reproduces your error, I can try to troubleshoot the error further.
I know that lib/ is where we put all our library files and bin/ is where we put the entry point for our command-line app. I know both lib/ and bin/ are public, but I'm unable to understand the convention of using lib/src, which according to the official docs should contain "implementation code".
lib/ is the directory that contains shareable code. It can be shared with other top-level directories (bin/, web/, example/, test/, tool/, ...) in the same package, and with other packages that have this package as a dependency.
lib/src by convention contains the private implementation of the public API exposed by lib/ or lib/xxx where xxx is not src.
bin/ is reserved for command-line apps and contains the Dart entry point scripts that execute them (the files that contain main() {...}).
In pubspec.yaml you can define executables (https://www.dartlang.org/tools/pub/pubspec#executables), which lets you run a script from bin/ by just typing foo to have dart somePath/bin/foo.dart executed (after pub global activate my_package_with_foo).
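For instance (the package and script names are invented for illustration), the mapping from command name to bin/ script looks like this in pubspec.yaml:
name: my_package
executables:
  foo:            # `foo` on the command line runs bin/foo.dart
  greet: my_tool  # `greet` runs bin/my_tool.dart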
See Pub Package Layout Conventions - Implementation files
The libraries inside lib are publicly visible: other packages are free to import them. But much of a package’s code is internal implementation libraries that should only be imported and used by the package itself. Those go inside a subdirectory of lib called src. You can create subdirectories in there if it helps you organize things.
You are free to import libraries that live in lib/src from within other Dart code in the same package (like other libraries in lib, scripts in bin, and tests) but you should never import from another package’s lib/src directory. Those files are not part of the package’s public API, and they might change in ways that could break your code.
When you use libraries from within your own package, even code in src, you can (and should) still use package: to import them.
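To make that concrete, here is a minimal sketch (the package, file, and class names are all invented): the library under lib/ re-exports the parts of lib/src/ that should be public, and a script in bin/ uses a package: import even though it lives in the same package.
// lib/src/greeter.dart -- private implementation under lib/src/
class Greeter {
  String greet(String name) => 'Hello, $name!';
}

// lib/my_package.dart -- the public API; re-export only what should be visible
export 'src/greeter.dart' show Greeter;

// bin/main.dart -- command-line entry point; package: imports work within the same package
import 'package:my_package/my_package.dart';

void main() {
  print(Greeter().greet('world'));
}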
I have this code in a React TypeScript project:
import { Trans, translate, InjectedTranslateProps } from 'react-i18next';
and then:
export const Home: React.SFC<InjectedTranslateProps> = props => (
When I click on InjectedTranslateProps in WebStorm, it takes me to /node_modules/@types/react-i18next/src/props.d.ts.
Why is it taking me to @types and not to the 'react-i18next' package?
I mean, all the links take me to @types and not to the actual react-i18next folder in node_modules.
So everything goes through @types; is it that library that connects things to the real package?
I don't need to see the JavaScript code; I just need to know whether imports resolve to the TypeScript file first, and whether it's that file that does the work for the import.
At runtime, your module loader resolves the import directly to the real implementation. The TypeScript language service, on the other hand, only cares about type information, not the implementation; so if the implementation is a .js file with no .ts or .d.ts alongside it, the language service looks for a .d.ts in an @types package, and when you click on the import it takes you to that .d.ts. As I understand it, the reasons for this behavior are:
1. It was easier to implement, since the language service is already finding the .d.ts and not the .js.
2. Assuming you are writing code against an API (which is more orthodox from a software-engineering point of view than reading the implementation, although I know that in the real world developers often have questions the API documentation won't answer), the .d.ts is more likely to describe that API in human-readable form than the .js. For real-world modules, especially ones that have gone through some transpilation step, the .js may be organized using any number of tricky code patterns, as long as all the right elements end up defined once it finishes loading.
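As an illustration (the module and function names are made up), the editor only ever navigates to the declaration file, never to the shipped JavaScript:
// node_modules/some-lib/index.js -- what actually runs; the language service ignores it
//   exports.formatPrice = function (value) { return '$' + value.toFixed(2); };

// node_modules/@types/some-lib/index.d.ts -- what "Go to definition" lands on
declare module 'some-lib' {
  /** Formats a numeric value as a display price. */
  export function formatPrice(value: number): string;
}

// your code -- the import resolves to the .d.ts for types and to the .js at runtime
import { formatPrice } from 'some-lib';
const label: string = formatPrice(9.99);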
I have a c language program that has the following structure:
src/main/c/main.c
src/main/headers/main.h
src/module_1/c/module_1.c
src/module_1/headers/module_1.h
...
src/module_n/c/module_n.c
src/module_n/headers/module_n.h
In the gradle script I have defined:
components {
    module_1(NativeLibrarySpec)
    ...
    module_n(NativeLibrarySpec)
    main(NativeExecutableSpec) {
        sources {
            c.lib library: "module_1", linkage: "static"
            ...
            c.lib library: "module_n", linkage: "static"
        }
    }
}
The reason for using this structure is to make it easier to create unit tests for each module separately.
The problem comes with the inclusion of the .h files from the modules in the main or in other modules (there are some dependencies between them). I haven't found a way to make the headers of a module available to other modules. I would actually like to make them all "global" to the project (that is, automatically added to the source set for any module).
Thanks in advance
I do not know Gradle, but maybe I can give you some general advice.
I haven't found a way to make the headers of a module available to other modules.
You could make a central directory (a repository) for all .h files of your project, for example src/include. The header files of each module can be placed there (in the version of the current baseline).
I would actually like to make them all "global" to the project (that is, automatically added to the source set for any module).
The above repository can support that. However, including a header in a source file is a manual task. It is also wise not to include all headers in a source file; it may only need a few.
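Just as a sketch (I have not verified this; the directory name is a placeholder): Gradle's native software model appears to let each source set declare extra header directories through exportedHeaders, so the central repository could be wired into a component roughly like this:
components {
    module_1(NativeLibrarySpec) {
        sources {
            c {
                exportedHeaders {
                    // headers under src/include become visible to anything
                    // that declares a dependency on module_1
                    srcDir "src/include"
                }
            }
        }
    }
}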
Ok, so here's my question:
I have a working DUB project which produces an application. I decided I also wanted a "library" configuration in my dub.json file:
"configurations": [
{
"name": "application",
"targetType": "executable"
},
{
"name": "library",
"targetType": "library",
}
],
So, now when I build the project using dub build --config=library, it produces a libXXXX.a file in the same directory.
So far, so good.
I tried using this library (actually a tiny test function marked as extern (C)) from a test C app.
So, I compile my C app with gcc -c ctest.c and then link everything together with dmd libMYLIBRARY.a ctest.o.
Now, here is the problem:
In this last step, the linker complains that many symbols are missing, all coming from external dependencies (2 object files and several .a libraries) that would normally be linked when building the project as an application.
So, the question is... how do I solve this?
I mean... should I just link my test C app against ALL of the original dependencies (admittedly, this would not make the library very portable), or is there a way around it, so that anybody could use my library just by linking against my libXXXXX.a file?
Should I just link my test C app against ALL of the original dependencies (this would not make the library very portable admittedly),
This is the "technically correct" answer. The reason for that is because, otherwise, if the C app wanted to use another D library which had among its dependencies some package that's also a dependency in your library, and if it were linked in the same way (including all of its dependencies in its static library file), this dependency would then occur twice in the linker inputs. Even if you told the linker to discard one copy, there can be problems due to the dependency being of separate incompatible versions, etc. (Note that there is an ongoing D SAOC project to handle this.)
If you were to assume that the only D library the C program will use is your Dub package, then you could conceivably create a static library which includes all dependencies, though it would probably need to include the D standard library and runtime as well.
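In command-line terms (the dependency names below are placeholders for whatever your Dub build actually pulled in), the "link everything" route looks roughly like this:
# compile the C side
gcc -c ctest.c

# link the C object against your library plus every dependency it was built with;
# dmd drives the system linker and adds druntime/Phobos itself
dmd ctest.o libMYLIBRARY.a libdependency1.a libdependency2.a extra1.o extra2.o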