I would like to use ThrowTheSwitch Unity for my unit tests in several projects, so I have this generic structure:
.
├── README.md
├── Makefile
├── src
│   ├── ...
│   └── main.c
└── test
    ├── vendor
    │   ├── unity.c
    │   └── unity.h
    └── test_main.c
To include Unity I could use several different approaches:
Git Submodules
pros: Git only, a single source of truth
cons: pulls in the whole Unity repository while only 3 files are required
Git Submodules with sparse checkout
pros: only fetches what I need
cons: needs some hackery to enable the sparse checkout
Git Subtree
pros: safe when no internet connection is available
cons: keeps local copies of Unity in the history
Wget from a release
pros: very simple, very effective
cons: depends on wget, tar, and an internet connection (no local cache)
Local copy of Unity
pros: simplest solution, effective
cons: no single source of truth; every similar project keeps its own copy of Unity, and it's easy to lose track of which Unity version is synced
It seems none of the above methods is perfect, and I don't know which one to choose. The ideal solution would be a simple package manager with a package.json-style file that lists my dependencies and their versions.
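For illustration, the wget approach could be captured as a Makefile rule like the following sketch. The version number and the archive's internal layout are assumptions to verify against the release you actually pick; note also that unity.h pulls in unity_internals.h, so fetching the whole src directory is safer than cherry-picking two files.

```make
# Sketch only: v2.5.2 and the Unity-<version>/src layout are assumptions
# to check against the release you actually use.
UNITY_VERSION ?= 2.5.2
UNITY_URL = https://github.com/ThrowTheSwitch/Unity/archive/refs/tags/v$(UNITY_VERSION).tar.gz

test/vendor/unity.c:
	mkdir -p test/vendor
	wget -qO- $(UNITY_URL) | tar -xzf - --strip-components=2 -C test/vendor Unity-$(UNITY_VERSION)/src
```

This keeps the download out of version control while still pinning an exact Unity version in the Makefile.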
Is there any other/better solution?
Related
I'm kind of a beginner in C, and I would like to know if there's a way of making a package like in Python, for example:
.
├── main.py
└── pkg
    ├── file1.py
    ├── file2.py
    └── __init__.py
How would that look in C?
I'd imagine something like:
.
├── main.c
└── pkg
    ├── a.c
    ├── a.h
    ├── b.c
    └── b.h
Is this the way? If so, how would that work, and how would I use the stuff inside it?
There is nothing quite like this in C; the language has no concept of packages.
When you want to distribute a "package", you can build it as a library, delivering the header files plus precompiled libraries (static or dynamic, per OS and architecture, sometimes per compiler).
If you want to organize a big project into packages, go ahead with a layout like yours - your tools won't really care. You include the headers of such "packages" by relative path from wherever you use them. Compilation depends on your toolchain, but generally you have one big project containing all the sources.
I'm creating CMake OBJECT libraries which I need to combine later into a single SDK library.
I can install the files and package them with Conan.
add_library(mylib_OBJECTS OBJECT
    # ...
)
install(TARGETS mylib mylib_OBJECTS
    # object libraries
    OBJECTS DESTINATION ${CMAKE_INSTALL_LIBDIR}
)
This makes CMake install the object libraries like this:
├── conaninfo.txt
├── conanmanifest.txt
└── lib
    ├── libmylib.a
    └── objects-Release
        └── mylib_OBJECTS
            ├── aio.o
            ├── atomic.o
            ├── bitmap.o
What cpp_info should I set to make Conan aware of that object library, and generate the correct FindMyLib.cmake file for it?
I have a JavaScript (TypeScript) project that I am using as a git submodule in a React project (also TypeScript).
This all works fine... until I use any node module in the git submodule. I tried, for example, the node modules "moment" and "faker", but I am almost certain that the problem has nothing to do with these specific modules.
Importing for example "moment" in the submodule:
import moment from 'moment';
using it:
moment().format('DD/MM/YYYY');
and running the main react project leads to this error:
Failed to compile.
./src/project_core/node_modules/moment/moment.js
Line 5:37: 'define' is not defined no-undef
Line 5:50: 'define' is not defined no-undef
Removing moment().format('DD/MM/YYYY'); solves it.
If I just run some tests directly in the submodule and use, for example, "moment", it always works; I can use the node modules there. It looks like something only goes wrong once the main project uses the node modules in the git submodule.
This is what my React project looks like. As you can see, the submodule project_core is added in the src folder of the main project.
├── build
│   ├── assets
│   └── static
│       ├── css
│       └── js
├── node_modules
├── public
│   └── assets
└── src
    ├── project_core
    │   ├── node_modules
    │   ├── src
    │   └── tests
    ├── ui
    ├── api
    └── utils
I know... I read everywhere that using git submodules is not recommended. I, however, just want to experiment with git submodules to learn from it.
It looks like you have a node module inside a node module. This probably confuses your bundler and produces the import errors. Try removing the package.json and node_modules from your project_core subdirectory and see if the imports work correctly. If you want to make your core directory a library, think about actually having it as an npm dependency instead of a submodule. You could still reference a git commit hash in the dependency.
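For instance, pinning the core as a git dependency in the main project's package.json could look like this (the package name, URL, and commit are placeholders):

```json
{
  "dependencies": {
    "project-core": "git+https://github.com/your-org/project-core.git#<commit-sha>"
  }
}
```

npm resolves the `#<commitish>` suffix to an exact commit, so you keep the reproducibility of a submodule while letting the bundler treat the core as a normal dependency.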
I'm starting a new C project using CMake, so I created a directory structure very similar to the ones I use in Python (my "main" language). Although it compiles correctly, I'm not certain I'm doing it the right way. This is the current structure:
.
├── CMakeLists.txt
├── dist
│   └── # project will be built here, 'cmake ..'
├── extras
│   ├── CMakeLists.txt
│   ├── extra1
│   │   ├── CMakeLists.txt
│   │   ├── extra1.h
│   │   └── extra1.c
│   └── extra2
│       ├── CMakeLists.txt
│       ├── extra2.h
│       └── extra2.c
├── src
│   ├── CMakeLists.txt
│   ├── main.c
│   ├── module1.h
│   ├── module1.c
│   ├── module2.h
│   └── module2.c
└── test
    ├── CMakeLists.txt
    ├── test_module1.c
    └── test_module2.c
Since the files are distributed across multiple directories, I had to find a way to locate the libraries in extras and the code under test in src. These are my CMakeLists.txt files:
./CMakeLists.txt
cmake_minimum_required(VERSION 2.8)
project(MyProject)
add_definitions(-Wall -std=c99)
# I don't know really why I need this
set(EXECUTABLE_OUTPUT_PATH ${PROJECT_BINARY_DIR}/dist)
add_subdirectory(src)
add_subdirectory(test)
add_subdirectory(extras)
enable_testing()
add_test(NAME DoTestModule1 COMMAND TestModule1)
add_test(NAME DoTestModule2 COMMAND TestModule2)
./src/CMakeLists.txt
macro(make_library target source)
add_library(${target} ${source})
target_include_directories(${target} PUBLIC ${CMAKE_CURRENT_SOURCE_DIR})
endmacro(make_library)
make_library(Module1.o module1.c)
make_library(Module2.o module2.c)
./test/CMakeLists.txt
macro(make_test target source library)
add_executable(${target} ${source})
target_link_libraries(${target} Libtap.o ${library})
target_include_directories(${target} PUBLIC ${CMAKE_CURRENT_SOURCE_DIR})
endmacro(make_test)
make_test(TestModule1 test_module1.c Module1.o)
make_test(TestModule2 test_module2.c Module2.o)
./extras/CMakeLists.txt
# Hopefully you'll never need to change this file
foreach(subdir ${SUBDIRS})
add_subdirectory(${subdir})
endforeach()
(Finally) ./extras/libtap/CMakeLists.txt
add_library(Libtap.o tap.c)
target_include_directories(Libtap.o PUBLIC ${CMAKE_CURRENT_SOURCE_DIR})
So, now the question: the reason I'm worried is that this setup will create a 'public' library for every file I'm using, including extra libs (which are not meant to be distributed). If I have 10 libraries in src, 4 dependencies in extras (including libtap, which I'm using to test) and at least the same amount of test files, I'll end up with 24 compiled artifacts.
Is there any better way to expose libraries to linking?
I'm not compiling "main" yet; what would be the right configuration for that?
Is add_definitions the right way to add flags to the compiler?
How can I make this structure more DRY?
Is there any better way to expose libraries to linking?
No, this seems fine.
You might however want to reconsider the granularity at which you create static libraries. For example, if all applications except the tests will only ever use Module1 and Module2 in combination, you might want to merge them into a single library target. Sure, the tests will link against parts of the component that they do not use, but that is a small price to pay for the decrease in build complexity.
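As a sketch using the names from the question (the target name Modules is made up), the merge could look like:

```cmake
# One combined static library instead of Module1.o and Module2.o.
add_library(Modules STATIC module1.c module2.c)
target_include_directories(Modules PUBLIC ${CMAKE_CURRENT_SOURCE_DIR})

# Tests would then link the combined target:
#   target_link_libraries(TestModule1 Libtap Modules)
```

This halves the number of library targets at the cost of tests linking a little code they don't exercise.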
I'm not compiling "main" yet; what would be the right configuration for that?
There's nothing wrong with adding it to the src/CMakeLists.txt as well:
add_executable(my_main main.c)
target_link_libraries(my_main Module1.o Module2.o)
Is add_definitions the right way to add flags to the compiler?
It can be used for that purpose, but might not be ideal.
Newer CMake scripts should prefer the target_compile_options command for this purpose. The only disadvantage here is that if you want to reuse the same compile options for all targets in your projects, you also have to do the same target_compile_options call for each of those. See below for tips on how to resolve that.
How can I make this structure more DRY?
First of all, unlike most program code, redundancy is often not that big an issue in build system code. The notable thing to look out for here is stuff that gets in the way of maintainability. Getting back to the common compiler options from before: Should you ever want to change those flags in the future, it is likely that you want to change them for every target. Here it makes sense to centralize the knowledge about the options: Either introduce a function at the top-level that sets the option for a given target, or store the options to a global variable.
In either case you will have to write one line per target to get the option, but it will not generate any maintenance overhead after that. As an added bonus, should you actually need to change the option for only one target in the future, you still have the flexibility to do so.
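A possible shape for such a helper (the function name is made up for illustration):

```cmake
# Hypothetical top-level helper: centralizes the common flags,
# one call per target.
function(set_common_options target)
  target_compile_options(${target} PRIVATE -Wall -std=c99)
endfunction()

set_common_options(Module1)
set_common_options(Module2)
```

Changing the flags later then means editing one function body instead of every target.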
Still, take care not to overengineer things. A build system should first and foremost get things done.
If the easiest way to set it up means you copy/paste a lot, go for it! If during maintenance later it turns out that you have some real unnecessary redundancies, you can always refactor.
The sooner you accept the fact that your CMake scripts will never be as pretty as your program code, the better ;)
One small nitpick at the end: Avoid giving your target names extensions. That is, instead of
add_library(Libtap.o tap.c)
consider
add_library(Libtap tap.c)
CMake will automatically append the correct file ending depending on the target platform anyway.
I'll begin by saying that I have no prior GAE experience. I'm trying to get GAE working in IntelliJ 12 but having issues; I was wondering if anyone could look over what I'm doing and tell me if there's anything wonky here.
Steps:
Create a Java project in IntelliJ with JDK 1.7.0_51. Click Next.
Select Web Application > Google App Engine on the desired-technologies page, with the path to appengine-java-sdk-1.8.9. Click Finish.
Copy the files from appengine-java-sdk-1.8.9/demos/new_project_template/ to the project directory.
I now have a main directory structure like:
.
├── COPYING
├── build.xml
├── html
│   └── index.html
├── src
│   ├── META-INF
│   │   └── jdoconfig.xml
│   ├── WEB-INF
│   │   ├── appengine-web.xml
│   │   └── web.xml
│   ├── log4j.properties
│   ├── logging.properties
│   └── org
│       └── example
│           └── HelloAppEngineServlet.java
├── test.iml
└── web
    ├── WEB-INF
    │   ├── appengine-web.xml
    │   └── web.xml
    └── index.jsp
Running this starts the web server serving the index.jsp from the web directory.
A few questions around this: should there be both a 'web' and an 'html' directory? Why are there two WEB-INF directories, and should they both be the same? Should I manually edit both of them each time I update one?
If I then follow the instructions at https://developers.google.com/appengine/docs/java/gettingstarted/creating it mentions a 'war' folder - I confess that I'm confused about the 'war', 'web' and 'html' folders - I think I've also seen a 'www' folder referenced somewhere. Do these folder names actually matter?
Following the tutorial, I create a guestbook folder within the 'src' folder and add the Java file. When I enter the info in the web.xml (both of them), I get an error for the line
<servlet-name>guestbook</servlet-name>
"A field of identity constraint 'web-app-servlet-name-uniqueness' matched element 'web-app', but this element does not have a simple type"
To top that off - guestbook.GuestbookServlet doesn't resolve.
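For comparison, per the servlet spec a well-formed registration for that servlet sits as two sibling elements directly under <web-app> in web.xml, something like the following (the URL pattern is only an example):

```xml
<servlet>
  <servlet-name>guestbook</servlet-name>
  <servlet-class>guestbook.GuestbookServlet</servlet-class>
</servlet>
<servlet-mapping>
  <servlet-name>guestbook</servlet-name>
  <url-pattern>/guestbook</url-pattern>
</servlet-mapping>
```

The "identity constraint" validation error typically means the <servlet-name> element ended up somewhere the schema does not expect it, e.g. nested at the wrong level.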
There has to be a simpler way of getting this running in Intellij - can anyone help me?
Unfortunately, IntelliJ does not make this process simple. It seems like they expect you to use Maven to handle a lot of this. But this makes things a lot harder on people trying to get started with GAE on IntelliJ.
Your project is a mess right now. You have combined things that IntelliJ added for your web module with some of the files from the demo projects. So to start, remove your files and remove your web module from IntelliJ.
Now go back to the demo folder that you want to use; it should include the COPYING file, build.xml, and src and war dirs. Copy all of those to your project. Then go into Project Structure->Modules and import the module. This will allow IntelliJ to detect your web module and avoid creating duplicate files and dirs.
You also need to configure your Application Server under Settings->IDE Settings->Application Servers. Add a Google App Engine Dev Server and specify your SDK directory.
Then go back to your Project Structure->Module->Dependencies and add a Library. Select the Application Server Library that you just defined. If your project uses more advanced features of GAE, you will need to go to Project Structure->Artifacts and add the libraries to your artifact.
Also for the settings on the Artifact, you need to create an 'exploded war' definition that points to your war dir.
There is likely more configuration needed... but I can't think of it all right now. Let me know what you get stuck on next and I can try to help.
IntelliJ IDEA 14 Ultimate has integrated GAE support. How comprehensive this is I'm not totally sure yet. I'll update this answer shortly with more details.
https://www.jetbrains.com/idea/features/google_app_engine.html