On 3/30/2016 5:22 PM, Gavin Lambert wrote:
> On 31/03/2016 12:56, Edward Diener wrote:
>> Asking programmers to support another broken C++ preprocessor at this stage of C++'s development history is a travesty. And all simply because Microsoft has refused for a quarter of a century to fix their C++ preprocessor implementation, which they well know has always been non-standard.
Yes.
> Presumably they both have the same motivations: trillions (or likely far more) of lines of existing code, some unknown fraction of which might depend on the quirks of the existing behaviour.
The current state of C++ is a many-to-many-to-many scenario:

    compilers x platforms x codebases x { standard }

An ideal state would be a many-to-one scenario:

    (compilers U platforms U codebases) x { standard }

With the size of these sets growing, the latter is a scalable model; the former is not. While an ideal state may only be achievable in the limit, we should be moving in that direction--not intentionally away from it. If we do continue to move in that direction (which we must, or C++ will die), the aforementioned code must break at some point. With the sets growing, the sooner it breaks the better, because it only gets worse as time goes on.
> Bear in mind that one of clang's goals is not just to compile new portable programs but also existing previously-MSVC-only codebases, with minimal (or no) changes to the code.
Yes, a misguided goal and deplorable mentality. Valuing popularity more highly than doing what is right trades future productivity for short-term expediency. This is an ethical rather than a technical failure, and a disservice to most of the current generation of C++ developers and all of their successors. Shame on them.

Regards,
Paul Mensonides