On Fri, 2017-01-13 at 08:34 +0000, Niall Douglas wrote:
* Autodiscovers any tests you have and sets them up with ctest
This is interesting, but I don't think it scales. Some tests require linking in multiple sources, or require certain flags to be enabled. I really don't see how to autodiscover tests that will work for most boost libraries. Perhaps a `bcm_auto_test` function could be called by the author to do that (I would like to know what kind of conventions you follow when discovering the tests), and libraries that have more complicated testing infrastructure could add their tests manually with `bcm_add_test`.
The way it scales is that it makes use of directory structure. So if you fire .cpp files into /test, each is considered a ctest target, but if you put them into /test/somedir they get new semantics. In particular, you'll see Boost.AFIO v2 uses a /test structure which says to use Boost.KernelTest which is a new meta-test infrastructure.
That will work for the common case. I assume failure tests are detected by whether the name has "fail" in it. But how do you handle setting flags for certain tests? Compile-only tests?
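To make the convention concrete, here is a rough sketch of what I imagine such directory-based discovery doing (the glob, the one-target-per-file rule, and the "fail" naming convention are my guesses, not necessarily what boost-lite actually does):

```cmake
# Hypothetical sketch of convention-based test discovery: every .cpp
# directly under /test becomes its own ctest target.
file(GLOB test_sources "${CMAKE_CURRENT_SOURCE_DIR}/test/*.cpp")
foreach(src ${test_sources})
    get_filename_component(name "${src}" NAME_WE)
    add_executable(${name} "${src}")
    add_test(NAME ${name} COMMAND ${name})
    # Guessed convention: tests with "fail" in the name are expected to fail.
    if(name MATCHES "fail")
        set_tests_properties(${name} PROPERTIES WILL_FAIL TRUE)
    endif()
endforeach()
```

That covers plain one-file tests, but it is exactly the per-test flags and compile-only cases above that a pure glob can't express.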
* Provides out of the box CI ctest scripting which uploads results to a CDash for your project and updates your github website with the output from doxygen
This is probably useful for libraries that use doxygen. Using sphinx or mkdocs, I don't need to push changes out to github, as ReadTheDocs will update the documentation on push. Plus it will store the documentation for different tagged versions as well. This scales much more nicely than using github pages.
I don't know anybody who has ever used doxygen for anything serious and been happy with it. The good folks over at DoxyPress did an amazing job of refactoring doxygen, but in the end the fundamental design is just broken.
Problem is, and I think most would also agree here, there isn't anything better than doxygen for C++ reference docs. ReadTheDocs + Breathe generates what I find to be unusable reference docs. Formatting which suits Python well suits C++ terribly.
I don't use doxygen at all because it can't generate good reference documentation for my libraries. So I just wrote it in markdown, which seems a lot easier than trying to get doxygen to generate boost-style documentation. I've never tried the Breathe plugin for sphinx, but it would be nice if there were a plugin to use standardese.
Many years ago Stefan (along with Dave Abrahams) championed a new C++ docs tool which was much better than doxygen, but in the end the effort required to finish it proved difficult to make happen. I'm sure most would agree that's a shame.
If anybody knows of a tool which can understand doxygen markup but generates much better reference docs, I would be *extremely* interested. The really key part is that new C++ docs tooling *needs* to grok doxygen markup. So many new tools don't, and therefore get no traction because so many C++ codebases are locked into doxygen markup.
Like I mentioned, there is standardese: https://github.com/foonathan/standardese It's still a WIP, but it looks like it is shaping up nicely.
* Automatically matches git SHA in dependent git subrepos in flat dependency configurations
I am not a fan of git submodules, as it breaks downloading the source tarball files from github.
That's a long standing bug on github. And the fault of github, not of anyone else. I really wish they let you disable the tarball download and let you supply your own tarball URL. The current broken system is very confusing for users.
However, beyond superprojects, I don't think submodules are a good way to manage dependencies. It's best to take an approach similar to llvm. There can be a superproject that has all the components together to build, or each component can be built and installed individually, as sketched below.
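A rough sketch of that llvm-style arrangement (component names purely illustrative): the umbrella project just aggregates components that each remain buildable on their own.

```cmake
# Hypothetical superproject: each component under libs/ has its own
# CMakeLists.txt and can also be configured and installed standalone.
cmake_minimum_required(VERSION 3.0)
project(boost_superproject)
foreach(component core config filesystem)
    add_subdirectory(libs/${component})
endforeach()
```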
* Automatically merges any develop commit passing all tests on all platforms according to CDash into master branch
* Automatically packages up your library and publishes it to tarball, vcpkg (with ubuntu launchpad and homebrew in progress right now)
Adding support for CPack to create tarballs, debian, and fedora packages would be nice to add. Mapping dependency names between different package managers can be handled through convention for boost-only libraries; external dependencies (such as zlib) are not so easy.
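A minimal sketch of the CPack wiring I have in mind (package name, version, and the debian dependency string are illustrative; the zlib mapping is exactly the part that doesn't translate across package managers):

```cmake
# Hypothetical CPack setup producing a source tarball and a .deb.
set(CPACK_PACKAGE_NAME "boost-yourlib")
set(CPACK_PACKAGE_VERSION "1.0.0")
set(CPACK_GENERATOR "TGZ;DEB")
set(CPACK_DEBIAN_PACKAGE_MAINTAINER "you@example.com")
# The hard part: this name is debian-specific and differs per package manager.
set(CPACK_DEBIAN_PACKAGE_DEPENDS "zlib1g-dev")
include(CPack)
```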
I bailed out on that question and simply have each boost-lite library maintain the metadata for each package repo. i.e. it's the long way round.
Also, as the library would support the standard cmake install flow, it can easily be installed with cget (and dependencies can be installed with a requirements.txt file). I find this flow preferable to trying to update system-level package managers like homebrew or vcpkg. Although, from what I've seen of vcpkg, it works very similarly to cget except it is windows-centric.
I'd call cget an external tool dependency personally. I certainly had never heard of it before you mentioning it, and I would have no idea how to install it on Windows. I am assuming it is this: https://linux.die.net/man/1/cget
No, it's here: https://github.com/pfultz2/cget And it can easily be installed with `pip install cget`. I believe on the latest windows, python is included by default so users won't have to install python. Furthermore, I plan on adding support for cget generating a cmake or bash script with the commands it would go through to build and install dependencies. So if you consider python too much of an external dependency then you could commit a script to your repo that will take care of everything.
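Concretely, using the same invocations that come up in this thread, the whole flow is just:

```
pip install cget                      # cget itself comes from pip
cget install ned14/boost.outcome      # fetch, build, and install from github
cget install /path/to/local/clone     # or install from a local checkout
```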
I think this stuff comes back to David Sankel's notion of libraries being anti-social. If you're anti-social, you force library users up this hill of preconfig and build just to test out your library.
All libraries go through the steps of configure, build and install:

```
cmake ..
cmake --build .
cmake --build . --target install
```

Trying to support a non-conventional way to build the library would be anti-social. Furthermore, after I have installed a library I would expect to use it like this:

```cmake
find_package(YourLib)
target_link_libraries(myLib ${YourLib_LIBRARIES})
```

Not supporting that I would consider anti-social as well. And the point of the cmake modules that I am writing is to be able to support this easily, especially for boost libraries.
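For the library author, the export boilerplate those modules would wrap looks roughly like this (target, file, and destination names are hypothetical; this assumes a header-only library):

```cmake
# Minimal sketch of the install/export wiring behind find_package support.
add_library(yourlib INTERFACE)
target_include_directories(yourlib INTERFACE
    $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/include>
    $<INSTALL_INTERFACE:include>)
install(DIRECTORY include/ DESTINATION include)
install(TARGETS yourlib EXPORT yourlib-targets)
install(EXPORT yourlib-targets
        FILE YourLibConfig.cmake
        DESTINATION lib/cmake/YourLib)
```

With that in place, `find_package(YourLib)` resolves in config mode against the installed `YourLibConfig.cmake`.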
Bjarne's been railing against that for years, and it is one of his biggest bugbears with Boost, yet trying out Boost on all platforms except Windows is a simple install from that platform's package repos and therefore is a very low hill to climb. I'm therefore fond of package repositories, end users like them too.
* Libraries based on this are 100% standalone, when you clone the git repo or unpack the tarball you are 100% ready to go. Nothing else needed, not even configure and build. No arcane command line programs to run.
I don't understand this. The focus of these modules is to support the standard configure, build and install flow in cmake. Trying to hack cmake into a different, unconventional flow seems problematic. If users don't like this flow, or are scared of typing, then external tools can be created to automate it. However, creating a different flow in cmake will just cause dissonance with other cmake libraries.
Sorry, you misunderstood me. What I meant above is that the cmake is ready to go. You don't need to run cmake generators, or run some python master cmake control script etc. The libraries themselves are header only currently, but sometime this year I'm going to write a preprocessor stage for cmake which, at dev time, will convert a header-only library that does preprocessor metaprogramming (as Outcome does) into a single large pre-expanded include file. That should reduce the gap between C++ Module include times and non-C++ Module include times for users, plus it means I can provide an easy playpen on gcc.godbolt etc.
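As a sketch of what I mean, assuming gcc/clang preprocessor flags and an illustrative header path, the dev-time step could be wired up like:

```cmake
# Hypothetical: run the C++ preprocessor over the master header at dev time
# to emit one large pre-expanded include file.
add_custom_command(
    OUTPUT "${CMAKE_CURRENT_BINARY_DIR}/outcome_expanded.hpp"
    COMMAND ${CMAKE_CXX_COMPILER} -x c++ -E -P
            -I "${CMAKE_CURRENT_SOURCE_DIR}/include"
            "${CMAKE_CURRENT_SOURCE_DIR}/include/outcome.hpp"
            -o "${CMAKE_CURRENT_BINARY_DIR}/outcome_expanded.hpp"
    DEPENDS "${CMAKE_CURRENT_SOURCE_DIR}/include/outcome.hpp"
    COMMENT "Pre-expanding header-only library into one include file")
add_custom_target(preexpand ALL
    DEPENDS "${CMAKE_CURRENT_BINARY_DIR}/outcome_expanded.hpp")
```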
I wouldn't recommend that anyone else use it yet. It is very much a work in progress, but all the above is working, and you can see it in action in proposed Boost.Outcome. It also has nil documentation.
So I tried to install your Boost.Outcome library with no luck. First, I did `cget install ned14/boost.outcome`, and that didn't work because the git submodules were missing. So I cloned it locally with its submodules, and then did `cget install boost.outcome`. It still didn't work.
I'm not sure about this cget tool, but cmake --build . --target install should work on all platforms after you've done a *recursive* git submodule checkout. By "should" I mean install is not being CI tested yet, and it could be broken after some changes I did earlier this week so caveat emptor.
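In other words, assuming the repo lives at github.com/ned14/boost.outcome, the expected manual flow would be:

```
git clone --recursive https://github.com/ned14/boost.outcome.git
cd boost.outcome
mkdir build && cd build
cmake ..
cmake --build . --target install
```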
I get an error like this:

```
-- Found boost-lite depended upon by boost--outcome at embedded include/boost/outcome/boost-lite
-- CMAKE_BUILD_TYPE =
CMake Error at include/boost/outcome/boost-lite/cmake/BoostLiteUtils.cmake:148 (file):
  file failed to open for reading (No such file or directory):
    /home/paul/tmp/cget/cget/build/tmp-48d80d9e2c734b86800806772ac60260/boost.outcome/include/boost/outcome/boost-lite//home/paul/tmp/boost.outcome/.git/modules/include/boost/outcome/boost-lite/HEAD
Call Stack (most recent call first):
  include/boost/outcome/boost-lite/cmake/BoostLiteUtils.cmake:188 (git_revision_from_path)
  include/boost/outcome/boost-lite/CMakeLists.txt:18 (UpdateRevisionHppFromGit)
```

This explains how cget calls cmake here: http://cget.readthedocs.io/en/latest/src/building.html

Paul