On 05/22/2014 05:27 PM, Niall Douglas wrote:
> On 22 May 2014 at 13:44, Stefan Seefeld wrote:
>> And here we flog that old and very dead horse of C++ package management yet again. I don't see it being quite dead yet. At least, despite all the flogging, not much has changed, and everyone (in particular package and distribution maintainers) still faces the same issues.
>
> There is an argument for making everything, absolutely everything header only in the long run. Modules makes it tractable.
Same problem again: wanting to apply a single policy to every project under the sun. That's adding to the problems, not solving them. Downscale the scope of the problem you want to solve, and you might have a better chance of succeeding.
> Dave even went off and wrote code to implement a C++ package manager; it's since been abandoned.

I wonder how he thinks of this experience in hindsight, and why he abandoned it.

> Around the same time he declared enough of the git migration tool. Enough said.
Right, that's the same issue as above: attempting to solve everyone's problems at once, with a single silver bullet. That has never worked, and I'm trying to learn from all those failures by proposing a different approach.
>> A more realistic alternative is to let each project come up with its own way to deal with this, so that all a project's users have to care about is precisely the information Tom lists above: what are the (coarse-grained) dependencies, and what are the supported platforms (including compilers)?
>
> Thing is, BlackBerry initially let each team use whatever tooling it liked for BB10. The result was a disaster, and a huge amount of later work to unify the disparate build systems under a single meta build framework.
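To make that coarse-grained information concrete: it could be as small as a tiny per-project metadata file. The following sketch is purely illustrative (names, fields, and syntax are all hypothetical, not an actual proposal):

```json
{
  "name": "boost.example",
  "depends": ["boost.config", "boost.core"],
  "platforms": ["linux-gcc >= 4.8", "windows-msvc >= 12.0", "darwin-clang >= 3.4"]
}
```

How a project produces its artifacts behind that small interface would remain entirely its own business.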
Others have succeeded. Consider yocto/openembedded and its "meta build framework". There is a very good discussion of the underlying architecture at http://aosabook.org/en/yocto.html. The crux of the matter is to fully encapsulate the package-specific build systems. The result is a very flexible tool that makes it possible to generate all sorts of Linux distributions with minimal effort. Consider what it would have taken if all those projects had first had to agree on a single build system...
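For a flavour of how that encapsulation works in bitbake: each package is described by a small recipe that declares where the sources come from and which build-system class it inherits, while the actual configure/compile/install steps stay hidden inside the class. A minimal sketch (package name and URL are made up):

```bitbake
SUMMARY = "Example library packaged for an OpenEmbedded-based distribution"
LICENSE = "MIT"
SRC_URI = "https://example.org/releases/libexample-1.0.tar.gz"

# The cmake class encapsulates the package's native build system;
# an autotools-based package would simply 'inherit autotools' instead.
inherit cmake
```

The distribution builder never touches the package's own build machinery directly.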
> I think letting people choose their own tooling is disastrous, unless that choice plays nice with any arbitrary other build system. cmake integrates well with any arbitrary build system, which is why I suggested it.
(At the risk of diverting into a tool-specific discussion: I very much dislike the cmake approach, the same way I dislike automake: both generate Makefiles from templates, but those Makefiles are impossible to manage directly, so in the end it doesn't actually matter that you delegate to (GNU) make for the final build, as you take away all the advantages that tool itself offers. You might just as well have added the 'build' step to cmake itself. But I don't really want to argue about cmake or any other specific tool here; that's against the very premise of the points I'm trying to make. :-) )
> bjam isn't terrible, but I have had problems getting it to play nice with surrounding build systems in the past, e.g. forcing certain custom compiler toolsets from a surrounding cmake.
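For reference, the mechanism cmake offers for wrapping a foreign build system is its ExternalProject module; whether one likes the approach or not, it shows what "playing nice" looks like in practice. A sketch of a cmake superbuild driving a b2/bjam-built library (project name, URL, and flags are illustrative):

```cmake
include(ExternalProject)

# Drive a Boost.Build (b2/bjam) project from a cmake superbuild,
# pinning the toolset so both build systems agree on the compiler.
ExternalProject_Add(example_lib
  URL               https://example.org/example-lib-1.0.tar.gz
  CONFIGURE_COMMAND ""   # b2 has no separate configure step
  BUILD_IN_SOURCE   TRUE
  BUILD_COMMAND     ./b2 toolset=gcc stage
  INSTALL_COMMAND   ./b2 install --prefix=<INSTALL_DIR>
)
```

`<INSTALL_DIR>` is ExternalProject's own placeholder for the install prefix it manages.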
> Your comments are well intentioned, but C++ package management is very hard, much harder than it looks.

Yeah, but that is no reason to make it even harder by trying to come up with a universal solution that can be imposed on everyone.

> Without conformance and imposition there would be no point in joining the Boost libraries.
Consider this in the context of my "focus" remark further below. It is in Boost's focus (its "added value", in marketing slang) that conformity and imposition matter, i.e., good and modern APIs, rigorous design review and testing, thorough API documentation, etc.; not in the choice of tools or in what font the documentation uses.
>> We can all wish for dream solutions, but in the end what we must adopt is what is reasonable, given that people work on this for free during family time. In that line of thought, I think it's important to reconsider what the value (and thus focus) of Boost is. It's not the tools that are used to build the libraries, nor the clever hacks that are used to work around compiler deficiencies; it's the new C++ APIs that are developed and made available to the larger C++ development community.
>
> +10
>> And thus, if someone has great skill in writing (and implementing) such APIs, why should he be forced to learn particular tools to build, test, document, package, etc. his contribution? Why can't he just pick whatever he is already familiar with, as long as the quality of the outcome meets expectations?
>
> +1 in some areas, not others, as per my OP.
Stefan

--

      ...ich hab' noch einen Koffer in Berlin...