On 8/25/2014 11:59 PM, pmenso57@comcast.net wrote:
----- Original Message -----
From: "Edward Diener"
On 8/25/2014 6:04 PM, Rodrigo Madera wrote:
On Mon, Aug 25, 2014 at 3:12 PM, Edward Diener wrote:
The problem of merging VMD into Boost PP is twofold.
First, Boost PP is still Paul Mensonides' library, and his philosophy of safety, which I totally respect, is part of Boost PP.
Secondly, as you have already said and realize, the philosophy of VMD is that preprocessing errors can occur if the functionality in VMD is misused by a programmer. In other words, VMD depends on the constraints discussed in the library's documentation to work correctly. But within those constraints it does work, and it adds a great deal of flexibility to macro metaprogramming and the design of macro input.
Could you please provide a link to this mentioned safety philosophy of Boost PP?
There is no link I can give, other than to say it is my interpretation of the philosophy of Boost PP from looking at code and working/talking privately with Paul.
I do not believe that passing incorrect data to Boost PP functionality will ever produce a preprocessing error. It may produce an undefined result, but that is then the problem of the programmer misusing some functionality of Boost PP.
It can produce preprocessing errors. E.g. nothing stops the user from writing BOOST_PP_CAT(+,-) which technically should produce a preprocessing error.
Thanks for pointing this out.
Another example of the "safety philosophy" of Boost PP is that Paul was well aware that he had written a better variadic macro to test for emptiness, and that Boost PP has an undocumented non-variadic test for emptiness which it uses internally. He did not want the current non-variadic version documented, and he made no move to add the variadic version to Boost PP.
My opinion WRT the above is as follows.
Unconstrained arguments to macros are sequences of preprocessing tokens and whitespace separations (hereafter called "token sequences"). Such a token sequence may be empty, which is no different than passing an empty vector, empty range, or empty string to a function at runtime. A preprocessor tuple, for example, can hold arbitrary unconstrained data (aside from those tokens which would interfere with actual parameter delineation, such as commas and unbalanced parentheses--which I call "pathological input"). Therefore, the library must interpret (,,) as a ternary tuple, (,) as a binary tuple, and () as a unary tuple, and there is no machinery in place for a rogue value representing a nullary tuple. For example, if one is processing a tuple of cv-qualifiers and popping elements off the front as it processes, that tuple might start as (const volatile, volatile, const,), which is a four-element tuple. The front is popped, and it becomes a ternary tuple (volatile, const,). The front is popped again, and it becomes a binary tuple (const,). After a final pop, it becomes a unary tuple () _not_ a nullary tuple. So, with something like MACRO(a, b,, d), the third argument is not elided; it is supplied as an empty token sequence.
One *could* supply a rogue value representing a nullary tuple (and, in fact, chaos-pp does this) and then design all of the tuple-related machinery around this, but this is not in place in the Boost pp-lib.
Even if it were logical to consider () to be a nullary tuple, there is no way to detect emptiness on unconstrained (but not pathological) input. It is flat-out impossible. There are a variety of ways you can *almost* do it. Only if the input is constrained such that those "almost" cases are removed can you do it, but then you must add constraints to (e.g.) tuple elements which you cannot add to such a general-purpose container.
With all of the above said, you can have optional arguments to a macro provided there is at least one non-optional argument. For example, if you have MACRO(a, b, c) where you want c to be optional, you change the definition to MACRO(a, ...), detect whether __VA_ARGS__ is unary or binary (which can be detected), and branch in some way based on that result. Chaos does this too in a number of places.
Once you start constraining data, you can detect lots of things. The basic problems are that (1) you cannot reasonably constrain the data held in general-purpose containers and (2) you cannot change the way arguments to a macro are delineated by the preprocessor.
You can "constrain" data if you document that input "x" must be one of certain preprocessor token types (including possible emptiness), or Boost PP data types, in a particular macro input situation in order for the macro to work correctly. That is the sort of input "constraint" I deal with in VMD. In other words, it is up to the macro designer to document what choices the input data could be, and if the data does not follow that "constraint" the macro "will not work". In VMD, the macro "will not work" sometimes means that a preprocessing error will occur; I accept that as part of what VMD does, and I document that it could happen. The analogy in C++ itself is that if you have an overloaded function name and the types of data passed to that function do not match any of the overload set for that name, the compiler will issue an error.

So, I am 100% against ever treating () as a nullary tuple. Something else, like an empty token sequence representing a nullary tuple? I am okay with that provided all of the tuple-related machinery is updated to deal with this rogue value and provided it works across all supported preprocessors--which is not an easy task in some cases.

I agree with you about a nullary tuple. It does not exist. But in a "constrained" situation you could say that a unary tuple which has empty data means "something". What that "something" means is defined by the situation. I view this as valid macro programming. There are also other ways of passing empty preprocessor data in "constrained" situations other than a unary tuple with empty data. In the VMD documentation I discuss this pretty extensively.
I am also not particularly fond of fractured interfaces. By that I mean that I do not like interfaces that only work on some preprocessors in a library that is targeting preprocessors rather than the standards. I especially don't like scenarios where interface A works, interface B works, but interface A + B does not work--i.e. a combinatorial nightmare (which is what VC++ produces, BTW)--which is what happens when you start exposing the low-level primitives as interfaces. Right now that means that, in particular, variadic macros and placemarkers are not supported well because preprocessors don't support them very well--especially "important" ones like VC++. If this were not the case, and if we had good preprocessors across a broad range of compilers and in particular the important compilers, the Boost pp-lib would simply get a breaking, ground-up reimplementation built to support variadic macros and placemarkers from the start. Which, in essence, is what chaos-pp already is. So, from my point of view, Boost.Preprocessor is a relic that serves as a lowest common denominator. Otherwise, get a real preprocessor (i.e. not VC++) and use better libraries like Chaos.
I tore my hair out in many different situations trying to get VC++ to work reasonably in VMD. As you certainly know, this is not a fun way to program. Even with all the work I did, the VMD documentation mentions a number of situations where one needs to do something slightly different when using VC++. Regarding your Chaos library, I still think that having Chaos as part of Boost is worthwhile even if it works only with C++ standard conforming preprocessors, which obviously means that it will not work with VC++. But I understand that might mean a lot of work for you, submitting Chaos as a Boost library, with what you may perceive as little reward.