[config] Rethinking feature macros?
Now that there are standard feature-testing macros (http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2017/p0096r5.html) that are being implemented by at least g++ and clang++, would it perhaps make sense for us to reevaluate our decision to provide negative macros in Boost.Config and start defining the standard feature macros instead on the compilers that don't support them?

This would seem to require less maintenance, and the feature macro can be used without waiting for Boost.Config to add it.

For a concrete example, let's take noexcept in function types. This is __cpp_noexcept_function_type, and is implemented by g++ 7, clang 4, clang 5 (in C++17 mode), and apparently in the latest VS2017 preview.

noexcept function pointers break Boost.Bind, and to fix it, I need to add overloads for them, but only if they are implemented, otherwise the overloads would be an error.

With the feature macro, I can just #ifdef __cpp_noexcept_function_type and it will immediately work on g++ and clang++ and all compilers that don't support noexcept function types will still work. Only msvc would need to be fixed in some way.

With our traditional approach, I would need to request the addition of BOOST_NO_CXX17_NOEXCEPT_FUNCTION_TYPE, wait for it to be added and to be present on every compiler except the latest ones (which requires changes throughout Boost.Config), and only then be able to use #ifndef BOOST_NO_CXX17_NOEXCEPT_FUNCTION_TYPE. (Then wait for it to be merged to master before merging my changes to master.)
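For illustration only, a minimal sketch of the kind of guard meant here; invoke0 and its noexcept overload are hypothetical stand-ins for Boost.Bind's real overload set, not its actual code:

// Sketch: the extra overload is only visible where noexcept is part of the
// function type, so older compilers never see it.
template<class R> R invoke0( R (*f)() )
{
    return f();
}

#ifdef __cpp_noexcept_function_type

// In C++17 mode on g++ 7 / clang 4+ this is a distinct overload; without the
// guard it would be a redefinition error on compilers where noexcept is not
// part of the type.
template<class R> R invoke0( R (*f)() noexcept )
{
    return f();
}

#endif

void f() {}
void g() noexcept {}

int main()
{
    invoke0( f );
    invoke0( g ); // picks the noexcept overload where the feature exists
}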
AMDG On 11/05/2017 06:15 PM, Peter Dimov via Boost wrote:
Now that there are standard feature-testing macros (http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2017/p0096r5.html) that are being implemented by at least g++ and clang++, would it perhaps make sense for us to reevaluate our decision to provide negative macros in Boost.Config and start defining the standard feature macros instead on the compilers that don't support them?
I think it's probably a bad idea for Boost.Config to try to define the standard feature macros.
- #defining third-party macros in library code is a recipe for ODR violations.
- It's easy for those using compilers with support for these macros to forget to #include <boost/config.hpp>. There's also a good chance that the automated tests will pass for compilers without such support in this case, as it's quite common for the fallback implementation to work even if the feature is actually present. (Or the corresponding tests might just end up being disabled as well.)
This would seem to require less maintenance, and the feature macro can be used without waiting for Boost.Config to add it.
For a concrete example, let's take noexcept in function types. This is __cpp_noexcept_function_type, and is implemented by g++ 7, clang 4, clang 5 (in C++17 mode), and apparently in the latest VS2017 preview.
noexcept function pointers break Boost.Bind, and to fix it, I need to add overloads for them, but only if they are implemented, otherwise the overloads would be an error.
With the feature macro, I can just #ifdef __cpp_noexcept_function_type and it will immediately work on g++ and clang++ and all compilers that don't support noexcept function types will still work. Only msvc would need to be fixed in some way.
With our traditional approach, I would need to request the addition of BOOST_NO_CXX17_NOEXCEPT_FUNCTION_TYPE, wait for it to be added and to be present on every compiler except the latest ones (which requires changes throughout Boost.Config), and only then be able to use #ifndef BOOST_NO_CXX17_NOEXCEPT_FUNCTION_TYPE. (Then wait for it to be merged to master before merging my changes to master.)
In Christ, Steven Watanabe
Steven Watanabe wrote:
I think it's probably a bad idea for Boost.Config to try to define the standard feature macros.
- #defining third-party macros in library code is a recipe for ODR violations.
There is no difference between defining __cpp_foo and BOOST_NO_CXX17_FOO as far as ODR violations are concerned. In both cases, before the inclusion of boost/config.hpp the macro isn't set, and after the inclusion, it may be.
- It's easy for those using compilers with support for these macros to forget to #include <boost/config.hpp>. There's also a good chance that the automated tests will pass for compilers without such support in this case, as it's quite common for the fallback implementation to work even if the feature is actually present. (Or the corresponding tests might just end up being disabled as well.)
This is unfortunately a good point, and this new approach is indeed more fragile than the old one WRT forgetting to include boost/config.hpp. But on the other hand, the old one suffers from the problems I outlined. Ideally, MSFT would be persuaded to implement the macros, but until then...
AMDG On 11/05/2017 07:30 PM, Peter Dimov via Boost wrote:
Steven Watanabe wrote:
I think it's probably a bad idea for Boost.Config to try to define the standard feature macros.
- #defining third-party macros in library code is a recipe for ODR violations.
There is no difference between defining __cpp_foo and BOOST_NO_CXX17_FOO as far as ODR violations are concerned. In both cases, before the inclusion of boost/config.hpp the macro isn't set, and after the inclusion, it may be.
This is only true assuming that
a) The only uses of these macros are inside Boost or by Boost users who explicitly #include Boost.Config to get them, and
b) No other (unrelated to Boost) library chooses to implement the same idea, OR such a library is never used in the same translation unit as Boost.Config, OR the definitions are exactly identical to Boost.Config's. (If the implementation of a feature is incomplete or buggy, different people may make different choices about whether to #define the feature macro.)
In Christ, Steven Watanabe
On 11/5/2017 8:15 PM, Peter Dimov via Boost wrote:
Now that there are standard feature-testing macros (http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2017/p0096r5.html) that are being implemented by at least g++ and clang++, would it perhaps make sense for us to reevaluate our decision to provide negative macros in Boost.Config and start defining the standard feature macros instead on the compilers that don't support them?
This would seem to require less maintenance, and the feature macro can be used without waiting for Boost.Config to add it.
For a concrete example, let's take noexcept in function types. This is __cpp_noexcept_function_type, and is implemented by g++ 7, clang 4, clang 5 (in C++17 mode), and apparently in the latest VS2017 preview.
noexcept function pointers break Boost.Bind, and to fix it, I need to add overloads for them, but only if they are implemented, otherwise the overloads would be an error.
With the feature macro, I can just #ifdef __cpp_noexcept_function_type and it will immediately work on g++ and clang++ and all compilers that don't support noexcept function types will still work. Only msvc would need to be fixed in some way.
With our traditional approach, I would need to request the addition of BOOST_NO_CXX17_NOEXCEPT_FUNCTION_TYPE, wait for it to be added and to be present on every compiler except the latest ones (which requires changes throughout Boost.Config), and only then be able to use #ifndef BOOST_NO_CXX17_NOEXCEPT_FUNCTION_TYPE. (Then wait for it to be merged to master before merging my changes to master.)
I brought up SD-6 for Boost Config some time ago, but the original objection was that a great deal of the standard C++ header files would have to be included by Boost Config to use most of SD-6. While I see in your link that a number of the __cpp_... macros are predefined, like your example __cpp_noexcept_function_type, a number of other __cpp_... macros require header files to be included. So maybe Boost Config can use the macros which are predefined, without incurring the header inclusion overhead of those which require a header file to determine whether the feature is present or not.
Edward Diener wrote:
I brought up SD-6 for Boost Config some time ago, but the original objection was that a great deal of the standard C++ header files would have to be included by Boost Config to use most of SD-6.
Config already uses the SD-6 macros when possible, but what I'm suggesting is for it not to use them, but to define them when the implementation does not support them, so that libraries can use them. For correct use, the library would need to include both the standard header defining the __cpp_lib_... macro and boost/config.hpp, as the macro could be defined by either. This is admittedly a bit more fragile than the present arrangement. Compiler macros don't need a header to be included, though, so for them just boost/config.hpp would be needed.
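A sketch of the usage pattern described above, assuming __cpp_lib_make_unique as the library macro (SD-6 specifies it to be provided by <memory>); the part where boost/config.hpp would also define it is the hypothetical proposal under discussion:

#include <memory>           // a conforming library defines __cpp_lib_make_unique here
#include <boost/config.hpp> // under the proposal, would define it where the library does not
#include <iostream>

int main()
{
#ifdef __cpp_lib_make_unique
    auto p = std::make_unique<int>( 42 );
#else
    // fallback for implementations that don't advertise the feature
    std::unique_ptr<int> p( new int( 42 ) );
#endif
    std::cout << *p << std::endl;
}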
On 06/11/2017 01:15, Peter Dimov via Boost wrote:
Now that there are standard feature-testing macros (http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2017/p0096r5.html) that are being implemented by at least g++ and clang++,
VS2017 also supports them. It actually has a really high quality implementation without the spelling mistakes certain other compilers have :)
would it perhaps make sense for us to reevaluate our decision to provide negative macros in Boost.Config and start defining the standard feature macros instead on the compilers that don't support them?
The answer is no. The SD-6 feature test macros in the latest proposal paper are a tiny subset of those offered by Boost.Config. With the latest SD-6, one ends up testing for some feature which you know is usually associated with the one that you actually want, due to the paucity of feature macros.

I personally think the stripping of all the macros which used to be in the proposal down to the current minimum viable set is a mistake. But the argument is that these macros are for a future C++ standard, not the current standard. You may have heard that the WG21 convenor got annoyed at how much dependence on those macros was already appearing in the C++ ecosystem when the proposed set wasn't close to entering the standard yet. This led to the stripping, to discourage usage until they enter a future standard.

As a result, very recent compilers have actually removed feature test macros, breaking my code, which then thinks such and such a feature is not implemented. Which is quite annoying. Boost.Config, in comparison, will not usually break your code on new compilers.

Niall

--
ned Productions Limited Consulting
http://www.nedproductions.biz/
http://ie.linkedin.com/in/nialldouglas/
Niall Douglas wrote:
On 06/11/2017 01:15, Peter Dimov via Boost wrote:
Now that there are standard feature-testing macros (http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2017/p0096r5.html) that are being implemented by at least g++ and clang++,
VS2017 also supports them.
It does? Which macros does VS2017 define?
VS2017 also supports them.
It does? Which macros does VS2017 define?
The ones in the latest SD-6 proposal. And only those, not the ones which used to be in there.
Do you have specific examples of macros being removed?
You will see a raft of macros in:
https://github.com/ned14/quickcpplib/blob/master/include/cpp_feature.h

... which have been commented out with four ////. Those are vanishing from compilers like clang trunk, so I thought it best to purge the purged macros from my code. That file is probably out of date; they have probably purged some more SD-6 feature test macros by now.

I can see them removing all the C++11 and C++14 feature tests before it goes for standardisation, though I daresay standard library implementers will howl about it. But I guess it stops people turning that SD-6 proposal into a fait accompli standardisation before it's ready.

Niall

--
ned Productions Limited Consulting
http://www.nedproductions.biz/
http://ie.linkedin.com/in/nialldouglas/
Niall Douglas wrote:
VS2017 also supports them.
It does? Which macros does VS2017 define?
The ones in the latest SD-6 proposal. And only those, not the ones which used to be in there.
That's not what I'm seeing here. VS2017 supports structured bindings but __cpp_structured_bindings is not defined.

#include <iostream>

struct X
{
    int a, b;
};

int main()
{
    X x{ 1, 2 };
    auto [a, b] = x;

#ifdef __cpp_structured_bindings
    std::cout << "__cpp_structured_bindings: " << __cpp_structured_bindings << std::endl;
#else
    std::cout << "__cpp_structured_bindings: not defined" << std::endl;
#endif
}
That's not what I'm seeing here. VS2017 supports structured bindings but __cpp_structured_bindings is not defined.
That macro I think was just added literally in the October mailing. I think we can give a pass to Microsoft on that.

Niall

--
ned Productions Limited Consulting
http://www.nedproductions.biz/
http://ie.linkedin.com/in/nialldouglas/
Niall Douglas wrote:
That's not what I'm seeing here. VS2017 supports structured bindings but __cpp_structured_bindings is not defined.
That macro I think was just added literally in the October mailing. I think we can give a pass to Microsoft on that.
OK, can you give me a specific macro that the compiler defines?
Niall Douglas wrote:
That's not what I'm seeing here. VS2017 supports structured bindings but __cpp_structured_bindings is not defined.
That macro I think was just added literally in the October mailing. I think we can give a pass to Microsoft on that.
OK, can you give me a specific macro that the compiler defines?
One of us is confused, Niall, because I just added all of the SD-6 macros to config_info, and none of them showed up on msvc-14.1 with any of /std:c++14, c++17, or latest.

https://github.com/boostorg/config/pull/191

Either I'm doing something wrong, or you're in possession of a super-secret build.
OK, can you give me a specific macro that the compiler defines?
One of us is confused, Niall, because I just added all of the SD-6 macros to config_info, and none of them showed up on msvc-14.1 with any of /std:c++14, c++17, or latest.
https://github.com/boostorg/config/pull/191
Either I'm doing something wrong, or you're in possession of a super-secret build.
Heh! No, I confirm your findings. __has_include() works. Otherwise nothing else seems to, at least as far as VS2017.3.

It must have been the case that my cpp_feature.h was being pulled in by some dependency, and thus it *appeared* to be working on MSVC. Which is a very good illustration of why your idea is a bad one, Peter.

Niall

--
ned Productions Limited Consulting
http://www.nedproductions.biz/
http://ie.linkedin.com/in/nialldouglas/
On 11/06/17 04:15, Peter Dimov via Boost wrote:
Now that there are standard feature-testing macros (http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2017/p0096r5.html) that are being implemented by at least g++ and clang++, would it perhaps make sense for us to reevaluate our decision to provide negative macros in Boost.Config and start defining the standard feature macros instead on the compilers that don't support them?
I think this is a bad idea. Boost.Config macros do not necessarily correspond to what the compiler defines. Compilers lie sometimes by defining a macro while the corresponding feature is broken. Besides, I believe this is not Boost's prerogative to define compiler and standard library-specific macros.

Also, on the std-proposals list (IIRC) there was a discussion about SD-6 macros, and there were even initiatives to entirely drop the whole idea. The future of SD-6 is not quite clear, IMHO. At least, library-specific macros have very limited use and are prime candidates for changes or plain removal.
Andrey Semashev wrote:
Besides, I believe this is not Boost's prerogative to define compiler and standard library-specific macros.
Steven made the same good point - we should not define these macros because someone else may decide to do it too.
Boost.Config macros do not necessarily correspond to what the compiler defines. Compilers lie sometimes by defining a macro while the corresponding feature is broken.
That's true in principle but I've found this practice questionable and can't help but note that in my specific example of __cpp_noexcept_function_type, if Config doesn't define the macro my code will break, regardless of whether the feature is broken. I want to know if the feature is present, not whether someone arbitrarily considers it broken. But that's really a side issue.
Also, on the std-proposals list (IIRC) there was a discussion about SD-6 macros, and there were even initiatives to entirely drop the whole idea.
I don't remember a time when this wasn't the case. The objections to the specific idea of defining SD-6 macros in Config are solid, but nobody has said anything about the inefficiencies in our current approach that are caused by us defining negative macros instead of positive ones. We can avoid the problem of not being allowed to define "foreign" macros by having our own names for them and defining them automatically when the standard macro is defined.

#ifdef __cpp_noexcept_function_type
# define BOOST_CPP_NOEXCEPT_FUNCTION_TYPE
#endif

Positive macros suffer from the problem of one forgetting to include config.hpp a bit more than negative macros do, but using our names avoids the other problem Steven brings up, that things would silently work in this case on g++/clang++.

(Of course the above doesn't work for library macros because we don't want to include the whole STL but we still can use positive macros there, we just need to define them ourselves appropriately.)
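A sketch of the library-macro half mentioned in the parenthesis: since no standard header is included, the positive macro would have to come from what Config already knows about the compiler and standard library. The macro name and the version test here are hypothetical placeholders, not actual Boost.Config contents:

// Hypothetical library-feature macro, defined from compiler/version knowledge
// rather than from an STL header. The condition is purely illustrative.
#if defined(__GNUC__) && (__GNUC__ >= 7) && (__cplusplus >= 201402L)
# define BOOST_CPP_LIB_EXAMPLE_FEATURE
#endif

#ifdef BOOST_CPP_LIB_EXAMPLE_FEATURE
// use the library feature
#else
// fall back to an emulation
#endif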
On 11/6/2017 7:45 AM, Peter Dimov via Boost wrote:
Andrey Semashev wrote:
Besides, I believe this is not Boost's prerogative to define compiler and standard library-specific macros.
Steven made the same good point - we should not define these macros because someone else may decide to do it too.
Boost.Config macros do not necessarily correspond to what the compiler defines. Compilers lie sometimes by defining a macro while the corresponding feature is broken.
That's true in principle but I've found this practice questionable and can't help but note that in my specific example of __cpp_noexcept_function_type, if Config doesn't define the macro my code will break, regardless of whether the feature is broken. I want to know if the feature is present, not whether someone arbitrarily considers it broken. But that's really a side issue.
Also, on the std-proposals list (IIRC) there was a discussion about SD-6 macros, and there were even initiatives to entirely drop the whole idea.
I don't remember a time when this wasn't the case.
The objections to the specific idea of defining SD-6 macros in Config are solid, but nobody has said anything about the inefficiencies in our current approach that are caused by us defining negative macros instead of positive ones. We can avoid the problem of not being allowed to define "foreign" macros by having our own names for them and defining them automatically when the standard macro is defined.
#ifdef __cpp_noexcept_function_type
# define BOOST_CPP_NOEXCEPT_FUNCTION_TYPE
#endif
This is a better idea, if only to align Boost Config macro names with the SD-6 macros.
Positive macros suffer from the problem of one forgetting to include config.hpp a bit more than negative macros do, but using our names avoids the other problem Steven brings up, that things would silently work in this case on g++/clang++.
(Of course the above doesn't work for library macros because we don't want to include the whole STL but we still can use positive macros there, we just need to define them ourselves appropriately.)
Theoretically a good idea, but we then end up defining Boost equivalents of the SD-6 macros, where the compiler macros are based on SD-6 and the library macros are based on Boost Config logic.
On 11/06/17 15:45, Peter Dimov via Boost wrote:
Andrey Semashev wrote:
Boost.Config macros do not necessarily correspond to what the compiler defines. Compilers lie sometimes by defining a macro while the corresponding feature is broken.
That's true in principle but I've found this practice questionable and can't help but note that in my specific example of __cpp_noexcept_function_type, if Config doesn't define the macro my code will break, regardless of whether the feature is broken. I want to know if the feature is present, not whether someone arbitrarily considers it broken. But that's really a side issue.
I think we had multiple occasions when we disabled a feature that was advertised as implemented in a compiler. Having Boost.Config declare the feature as available in such cases would not be helpful because the actual code would still not work.

You need a macro that indicates that noexcept is part of the function type. I'm not sure how any compiler that supports the feature can support it incompletely/incorrectly, but until we have an actual case we can't tell if that level of support is suitable for you.

I'd say add BOOST_NO_CXX17_NOEXCEPT_FUNCTION_TYPES and use it in Boost.Bind the way we always did. When there appears a compiler that doesn't fit, we'll see what is best. It is always possible to add compiler-specific exceptions in Boost.Bind or define a special macro in Boost.Bind.
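For comparison, a sketch of the traditional negative-macro form of the guard being suggested here; the macro name is the one proposed above (not yet in Boost.Config at this point), and invoke0 is a hypothetical stand-in for Boost.Bind's overload set:

#include <boost/config.hpp> // would have to provide the proposed macro

template<class R> R invoke0( R (*f)() )
{
    return f();
}

#ifndef BOOST_NO_CXX17_NOEXCEPT_FUNCTION_TYPES

// Compiled only where Config says noexcept is part of the function type;
// everywhere else the negative macro is defined and this overload vanishes.
template<class R> R invoke0( R (*f)() noexcept )
{
    return f();
}

#endif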
The objections to the specific idea of defining SD-6 macros in Config are solid, but nobody has said anything about the inefficiencies in our current approach that are caused by us defining negative macros instead of positive ones.
I don't see much of a problem with the negative form. The idea is that the macros indicate compiler defects wrt. the latest standard (plus the positive form macros for non-standard features), which I think makes sense. This way the number of defined macros tends to always be low on good compilers, which is probably better than having them continuously grow over time.

Anyway, I'm not particularly tied to either negative or positive form. The process of adding a new macro seems to be the same in either case, so I don't see much difference from the maintenance/submission cost perspective.

What I do prefer though is that we have *one* naming approach, consistent across Boost and C++ versions. That significantly reduces the cost of using the macros. We currently use the negative form, and I'm guessing we're not going to remove the current macros straight away because of breaking tons of code. So I guess that's a point in favor of the negative form.
We can avoid the problem of not being allowed to define "foreign" macros by having our own names for them and defining them automatically when the standard macro is defined.
#ifdef __cpp_noexcept_function_type
# define BOOST_CPP_NOEXCEPT_FUNCTION_TYPE
#endif
Positive macros suffer from the problem of one forgetting to include config.hpp a bit more than negative macros do, but using our names avoids the other problem Steven brings up, that things would silently work in this case on g++/clang++.
I'm not sure "forgetting to include config.hpp" is a valid argument. You're obviously using Boost.Config (that follows from the macro name), so you should include its header. This would be less so obvious if we decided to define __cpp_noexcept_function_type ourselves, which is another point against that approach.
Andrey Semashev wrote:
I don't see much of a problem with the negative form. The idea is that the macros indicate compiler defects wrt. the latest standard (plus the positive form macros for non-standard features), which I think makes sense. This way the number of defined macros tends to always be low on good compilers, which is probably better than having them continuously grow over time.
This was the case ten years ago but not now. You can no longer derive any quality metric from what a compiler supports. gcc 6 is not a non-good compiler, it just defaults to C++14. VS 2017 15.3 is not a non-good compiler, it's just not 15.5 yet. Given the new pace of the standard, there will never again be a point at which a good compiler will be macro-free, as there will always be things left to implement because they were added after the compiler shipped.
On 11/06/17 17:41, Peter Dimov via Boost wrote:
Andrey Semashev wrote:
I don't see much of a problem with the negative form. The idea is that the macros indicate compiler defects wrt. the latest standard (plus the positive form macros for non-standard features), which I think makes sense. This way the number of defined macros tends to always be low on good compilers, which is probably better than having them continuously grow over time.
This was the case ten years ago but not now. You can no longer derive any quality metric from what a compiler supports. gcc 6 is not a non-good compiler, it just defaults to C++14. VS 2017 15.3 is not a non-good compiler, it's just not 15.5 yet. Given the new pace of the standard, there will never again be a point at which a good compiler will be macro-free, as there will always be things left to implement because they were added after the compiler shipped.
I'm not saying any given compiler is good or not, and that there is (or should be) a macro-free compiler. I'm saying that having ~10 macros defined for g++-7 -std=c++17 is probably better than ~100 macros. And having ~10 (other) macros defined for g++-15 -std=c++22 is yet better than ~200.

Also, another minor point. In the user's code, the #ifdef checks can be viewed as workarounds for compilers not supporting a particular language feature. Potentially, you could strip some of the conditionally compiled code over time to raise the minimum bar of supported C++, thus reducing the maintenance cost. This could even be done with a preprocessor. Of course, it is not always as simple as that, but at least partially this can be made to work. This can be done with the positive form as well, but I think it would be more complicated, since you'd have to keep the code instead of removing it.
Andrey Semashev wrote:
I'm saying that having ~10 macros defined for g++-7 -std=c++17 is probably better than ~100 macros. And having ~10 (other) macros defined for g++-15 -std=c++22 is yet better than ~200.
You're refuting your own argument, because if g++-15 would need 200 positive macros compared to g++-7's 100, g++-7 would need 100 negative macros. There is no point in time at which both can do with 10. The good thing about positive macros is that an old compiler never needs maintenance. With negative macros you have to keep adding them to it.
On 11/06/17 18:32, Peter Dimov via Boost wrote:
Andrey Semashev wrote:
I'm saying that having ~10 macros defined for g++-7 -std=c++17 is probably better than ~100 macros. And having ~10 (other) macros defined for g++-15 -std=c++22 is yet better than ~200.
You're refuting your own argument, because if g++-15 would need 200 positive macros compared to g++-7's 100, g++-7 would need 100 negative macros.
g++-7 will be out of wide use by then, so it doesn't matter.
The good thing about positive macros is that an old compiler never needs maintenance. With negative macros you have to keep adding them to it.
That is not more maintenance than adding positive macros for newer compilers.
Andrey Semashev wrote:
You're refuting your own argument, because if g++-15 would need 200 positive macros compared to g++-7's 100, g++-7 would need 100 negative macros.
g++-7 will be out of wide use by then, so it doesn't matter.
That's not true on at least three levels.
The good thing about positive macros is that an old compiler never needs maintenance. With negative macros you have to keep adding them to it.
That is not more maintenance than adding positive macros for newer compilers.
It is. Maintaining the new compilers is constant regardless of the macro type, and maintaining the old compilers is only required for negative macros.
On November 6, 2017 7:13:06 PM, Peter Dimov via Boost wrote:
Andrey Semashev wrote:
You're refuting your own argument, because if g++-15 would need 200 positive macros compared to g++-7's 100, g++-7 would need 100 negative macros.
g++-7 will be out of wide use by then, so it doesn't matter.
That's not true on at least three levels.
Why?
The good thing about positive macros is that an old compiler never needs maintenance. With negative macros you have to keep adding them to it.
That is not more maintenance than adding positive macros for newer compilers.
It is. Maintaining the new compilers is constant regardless of the macro type, and maintaining the old compilers is only required for negative macros.
I don't see how. You have to add new macros as they come with new C++ versions or someone requests them. You have to test them. All this is the same amount of work regardless of whether the macro is positive or negative.
Andrey Semashev wrote:
g++-7 will be out of wide use by then, so it doesn't matter.
That's not true on at least three levels.
Why?
Because

1. C++17 still matters even if g++-15 -std=c++17 is used instead of g++-7 -std=c++17;
2. g++-7 will not be out of wide use as today we still have to care about g++ 4.4;
3. Even if a compiler is out of wide use, we still maintain Config for it.

https://github.com/boostorg/config/blob/develop/include/boost/config/compile...
The good thing about positive macros is that an old compiler never needs maintenance. With negative macros you have to keep adding them to it.
That is not more maintenance than adding positive macros for newer compilers.
It is. Maintaining the new compilers is constant regardless of the macro type, and maintaining the old compilers is only required for negative macros.
I don't see how. You have to add new macros as they come with new C++ versions or someone requests them. You have to test them. All this is the same amount of work regardless of whether the macro is positive or negative.
On an old compiler, you don't need to add new positive macros, so there's less work to do. On a new compiler, you either add the positive macro to -std=c++17 or add the negative macro to -std=c++14 and below, so the work is the same.
AMDG On 11/06/2017 10:02 AM, Peter Dimov via Boost wrote:
Andrey Semashev wrote:
I don't see how. You have to add new macros as they come with new C++ versions or someone requests them. You have to test them. All this is the same amount of work regardless of whether the macro is positive or negative.
On an old compiler, you don't need to add new positive macros, so there's less work to do.
We don't usually define macros independently for every compiler version. The actual implementation generally looks like:

#if COMPILER_VERSION < NNN
#define BOOST_NO_FEATURE
#endif

which is always the same amount of work regardless of whether it's a positive or negative macro.
On a new compiler, you either add the positive macro to -std=c++17 or add the negative macro to -std=c++14 and below, so the work is the same.
In Christ, Steven Watanabe
Steven Watanabe wrote:
We don't usually define macros independently for every compiler version. The actual implementation generally looks like:
#if COMPILER_VERSION < NNN
#define BOOST_NO_FEATURE
#endif
which is always the same amount of work regardless of whether it's a positive or negative macro.
This ignores -std=... though.
On 11/06/17 21:03, Peter Dimov via Boost wrote:
Steven Watanabe wrote:
We don't usually define macros independently for every compiler version. The actual implementation generally looks like:
#if COMPILER_VERSION < NNN
#define BOOST_NO_FEATURE
#endif
which is always the same amount of work regardless of whether it's a positive or negative macro.
This ignores -std=... though.
We also test for BOOST_GCC_CXX11 or __cplusplus to take that into account.
Andrey Semashev wrote:
On 11/06/17 21:03, Peter Dimov via Boost wrote:
Steven Watanabe wrote:
We don't usually define macros independently for every compiler version. The actual implementation generally looks like:
#if COMPILER_VERSION < NNN
#define BOOST_NO_FEATURE
#endif
which is always the same amount of work regardless of whether it's a positive or negative macro.
This ignores -std=... though.
We also test for BOOST_GCC_CXX11 or __cplusplus to take that into account.
Yes, it's the same either way, I guess.

#if COMPILER_VERSION < NNN || COMPILER_CXX < 17
# define BOOST_NO_CXX17_FEATURE
#endif
On 11/06/17 20:02, Peter Dimov via Boost wrote:
Andrey Semashev wrote:
g++-7 will be out of wide use by then, so it doesn't matter.
That's not true on at least three levels.
Why?
Because
1. C++17 still matters even if g++-15 -std=c++17 is used instead of g++-7 -std=c++17;
Not sure how it relates to my point. Surely, g++-15 will have no worse support for C++17 than g++-7, so it will not require more macros. Likely less.
2. g++-7 will not be out of wide use as today we still have to care about g++ 4.4;
First, g++-4.4 is out of wide use currently. Second, we don't have to care about it beyond checking for Boost.Config macros, which were defined and tested years ago. Those macros are also relevant for other compilers, so this is not something we do specifically for gcc 4.4. Third, we test gcc 4.4 only as long as we want to. Or to put it another way, as long as there are people that want to test it. I don't see how those people would benefit from changing macros the other way around.
3. Even if a compiler is out of wide use, we still maintain Config for it.
https://github.com/boostorg/config/blob/develop/include/boost/config/compile...
Sure, as long as we choose to. And we would still have to maintain and test those files with positive macros.
The good thing about positive macros is that an old compiler never needs maintenance. With negative macros you have to keep adding them to it.
That is not more maintenance than adding positive macros for newer compilers.
It is. Maintaining the new compilers is constant regardless of the macro type, and maintaining the old compilers is only required for negative macros.
I don't see how. You have to add new macros as they come with new C++ versions or someone requests them. You have to test them. All this is the same amount of work regardless of whether the macro is positive or negative.
On an old compiler, you don't need to add new positive macros, so there's less work to do.
No, it's not. Either way you add a macro to gcc.hpp, visualc.hpp, etc., only the condition changes. Either way you need to test it, which is arguably the most time consuming part.
On a new compiler, you either add the positive macro to -std=c++17 or add the negative macro to -std=c++14 and below, so the work is the same.
As I've said earlier, I'm specifically against the naming zoo in macros. If we switch to positive macros - fine, but then switch all of them and deprecate the old ones. And PRs to libraries updating to the new convention would be nice, too.
Andrey Semashev wrote:
3. Even if a compiler is out of wide use, we still maintain Config for it.
https://github.com/boostorg/config/blob/develop/include/boost/config/compile...
Sure, as long as we choose to. And we would still have to maintain and test those files with positive macros.
I feel like we're talking past each other here. What is there to maintain about digitalmars.hpp? It stays the same for eternity because the compiler is no longer developed so it's impossible for it to acquire a new feature.
On 11/06/17 20:59, Peter Dimov via Boost wrote:
Andrey Semashev wrote:
3. Even if a compiler is out of wide use, we still maintain Config for it.
https://github.com/boostorg/config/blob/develop/include/boost/config/compile...
Sure, as long as we choose to. And we would still have to maintain and test those files with positive macros.
I feel like we're talking past each other here. What is there to maintain about digitalmars.hpp? It stays the same for eternity because the compiler is no longer developed so it's impossible for it to acquire a new feature.
I'm not aware of the Digital Mars compiler activity, but if it's long abandoned then we might drop that file entirely. Although I can see some C++14 and 17 additions in it. Anyway, if we keep supporting it, adding a #define for each new Boost.Config macro in that file doesn't seem that much of a problem. Naming confusion is worse for users, and mass switching to positive naming is more expensive.

Dead compilers are not the problem. The most time-consuming part is related to the compilers that are still alive. It is those compilers, especially the ones that are hard to come by, that require research about whether they support a feature and since which version. For those compilers you have to do the same amount of work to define the macro either way.
Andrey Semashev wrote:
Dead compilers are not the problem.
Okay, I guess. This was my first argument in favor of positive macros, that old compilers would require no maintenance.

My second argument in favor of positive macros was that they degrade more gracefully when one is using an older Boost.Config. This happens in several scenarios: first, in the course of normal development, when your library needs a macro but Boost.Config doesn't have it yet; second, when you merge to master but Config hasn't been merged yet; third, when a user upgrades your library in an older Boost release to take advantage of bugfixes.

Since your typical code is something like

// old things
#if (new feature is available)
// new things
#endif

with a positive macro and an older Config the new things don't get included even when the compiler supports the new feature, which leaves the user no worse off than before. With a negative macro, the new portion is compiled and errors out when the compiler does not support the new feature.

Steven correctly points out that this is not necessarily a benefit, as it allows one to forget to include Config (or perhaps allows new code to pick up an old Config via a system include instead of the correct one?) But from where I sit, this seems an acceptable tradeoff.

Either way, judging by the reaction and the arguments offered, a switch to positive macros would be unlikely at this time. :-)
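A compact sketch of the degradation difference described above, with hypothetical macro names on both sides:

// Positive macro + older Boost.Config that doesn't define it yet:
// the new branch is simply skipped, so the library keeps its old behaviour.
#ifdef BOOST_CPP_SOME_CXX17_FEATURE      // hypothetical positive macro
// new things (not compiled: no gain, but no harm)
#endif

// Negative macro + older Boost.Config that doesn't define it yet:
// the new branch is compiled everywhere and errors out on compilers
// that lack the feature.
#ifndef BOOST_NO_CXX17_SOME_FEATURE      // hypothetical negative macro
// new things (compiled even where the feature is missing)
#endif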
On 7/11/2017 07:22, Andrey Semashev wrote:
On 11/06/17 20:59, Peter Dimov wrote:
I feel like we're talking past each other here. What is there to maintain about digitalmars.hpp? It stays the same for eternity because the compiler is no longer developed so it's impossible for it to acquire a new feature.
I'm not aware of the Digital Mars compiler activity, but if it's long abandoned then we might drop that file entirely. Although I can see some C++14 and 17 additions in it.
I think the point was that with purely positive feature macros, the digitalmars.hpp header file need never be changed, because it cannot acquire new features. Whereas with negative feature macros, when C++42 adds a new feature someone will still have to go through every old compiler header file (including digitalmars.hpp) and explicitly state "this doesn't support that". And code will break if someone forgets to do so. If nothing else, it seems somewhat like a SOLID violation. (And arguing "let's drop digitalmars" is a strawman; the same thing can apply to any compiler, even one of the current major ones.)
On 11/06/17 05:15, Peter Dimov via Boost wrote:
For a concrete example, let's take noexcept in function types. This is __cpp_noexcept_function_type, and is implemented by g++ 7, clang 4, clang 5 (in C++17 mode), and apparently in the latest VS2017 preview.
noexcept function pointers break Boost.Bind, and to fix it, I need to add overloads for them, but only if they are implemented, otherwise the overloads would be an error.
Not really addressing the main question, but just out of curiosity, what would be the downside of not relying on any macro but detecting the feature automatically? For example, something like this could be done:

void n() noexcept {}
void e() {}

constexpr bool noexcept_function_type = !std::is_same< decltype( n ), decltype( e ) >::value;

Now we can use this to SFINAE out the noexcept overloads when they'd be problematic:

template < typename R >
R invoke0( R (*f)() )
{
    std::cout << "main overload" << std::endl;
    return f();
}

template < typename R >
std::enable_if_t< noexcept_function_type, R > invoke0( R (*f)() noexcept )
{
    std::cout << "noexcept overload" << std::endl;
    return f();
}

void f1() { std::cout << "f1" << std::endl; }
void f2() noexcept { std::cout << "f2" << std::endl; }

int main()
{
    invoke0( f1 );
    invoke0( f2 );
}

Clang gives a warning for the noexcept overload in C++14 mode: "mangled name of 'invoke0' will change in C++17 due to non-throwing exception specification in function signature [-Wc++17-compat-mangling]", but that can be silenced by surrounding that overload with

#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wc++17-compat-mangling"
...
#pragma clang diagnostic pop

Didn't try with other compilers.

Thanks,
Gevorg
On Mon, 2017-11-06 at 03:15 +0200, Peter Dimov via Boost wrote:
Now that there are standard feature-testing macros (http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2017/p0096r5.html) that are being implemented by at least g++ and clang++, would it perhaps make sense for us to reevaluate our decision to provide negative macros in Boost.Config and start defining the standard feature macros instead on the compilers that don't support them?
I think it would be better to provide macros like how the Fit library does: https://github.com/pfultz2/Fit/blob/master/include/fit/config.hpp#L93

The macros are always defined, but can be easily overridden. So if a compiler has a bug with the feature or is lying, then the user can easily build by adding `-DBOOST_HAS_FEATURE=...` to compilation to enable or disable the feature in boost. With the current setup this is not possible, because we always write `#ifdef BOOST_NO_FEATURE` and there is no way to express that the macro should be left undefined even if Boost.Config thinks it should be defined. Instead we should be writing `#if !BOOST_HAS_FEATURE`, and Boost.Config defines it as 0 or 1 if it hasn't already been defined by the user.
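A generic sketch of the override-friendly pattern described here; BOOST_HAS_FEATURE and the detection condition are placeholders, not actual Boost.Config or Fit names:

// Define the macro to 0 or 1 only if the user hasn't already done so
// (e.g. with -DBOOST_HAS_FEATURE=0 to work around a lying compiler).
#ifndef BOOST_HAS_FEATURE
# if defined(__cpp_some_feature)   // placeholder detection condition
#  define BOOST_HAS_FEATURE 1
# else
#  define BOOST_HAS_FEATURE 0
# endif
#endif

// Client code tests the value rather than whether the macro is defined:
#if BOOST_HAS_FEATURE
// feature path
#else
// fallback path
#endif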
This would seem to require less maintenance, and the feature macro can be used without waiting for Boost.Config to add it.
For a concrete example, let's take noexcept in function types. This is __cpp_noexcept_function_type, and is implemented by g++ 7, clang 4, clang 5 (in C++17 mode), and apparently in the latest VS2017 preview.
noexcept function pointers break Boost.Bind, and to fix it, I need to add overloads for them, but only if they are implemented, otherwise the overloads would be an error.
With the feature macro, I can just #ifdef __cpp_noexcept_function_type and it will immediately work on g++ and clang++ and all compilers that don't support noexcept function types will still work. Only msvc would need to be fixed in some way.
With our traditional approach, I would need to request the addition of BOOST_NO_CXX17_NOEXCEPT_FUNCTION_TYPE, wait for it to be added and to be present on every compiler except the latest ones (which requires changes throughout Boost.Config), and only then be able to use #ifndef BOOST_NO_CXX17_NOEXCEPT_FUNCTION_TYPE. (Then wait for it to be merged to master before merging my changes to master.)
Of course, boost shouldn't define `__cpp_noexcept_function_type`, but it could define `BOOST_FEATURE_NOEXCEPT_FUNCTION_TYPE` (or whatever mapping we want to have from SD-6 macros to boost macros). Using the scheme above, Boost.Bind could define it if it hasn't already been defined by Boost.Config or the user. So simply doing:

#ifndef BOOST_FEATURE_NOEXCEPT_FUNCTION_TYPE
#ifdef __cpp_noexcept_function_type
#define BOOST_FEATURE_NOEXCEPT_FUNCTION_TYPE 1
#else
#define BOOST_FEATURE_NOEXCEPT_FUNCTION_TYPE 0
#endif
#endif

This way Boost.Bind can start using that feature while it waits for it to be integrated in Boost.Config, and then it can remove the macro definition above once it's integrated into Boost.Config. This does have a potential problem if two libraries decide to define the macro in different ways, but this should only exist until it's integrated into Boost.Config.
participants (8)
- Andrey Semashev
- Edward Diener
- Gavin Lambert
- Gevorg Voskanyan
- Niall Douglas
- paul
- Peter Dimov
- Steven Watanabe