On 1/6/14, Andrey Semashev wrote:
All these workarounds clutter the code and preclude further development of the libraries.
The code will never be free from preprocessor-controlled branches to support differences even in the newest compilers. There is also the possibility of keeping the old files of the libraries whose authors wish to drop support in a separate sub-folder, and including the old files instead of the new ones from a masterfile. Loki used to do this, and it worked great. Yes, I know they don't do it anymore, but what is relevant is that it solved the clutter problem at the time -- a time when reading the Loki book required an imaginary compiler for me. But messing with Boost.Config and introducing an #error directive upon detection of _MSC_VER <= 1300 is gratuitous and evil.
And since old compilers (the ones that were proposed for dropping) are not tested, you cannot call them supported anyway.
I have been testing (on my lonely computer, I recognize) ever since Boost switched from CVS to SVN, as much as possible against the HEAD revision, on the few compilers I could get my hands on. If it is any help, I would gladly contribute my modest and ugly patches and work-arounds.
Some developers may want to keep the compatibility and cope with inconveniences, and may even find ways to test on the old compilers. But in general I don't think it is good for Boost to be held back by decade-old compilers. The Standard has been out for 15 years, and if some compiler failed to implement crucial parts of it, such as PTS, then this compiler is not worth supporting.
Incidentally, the compilers I mentioned *do* support Partial Template Specialization. Again, I do not ask for support. But I kindly ask for avoiding gratuitous and hardcoded #error directives. If a compiler can process boost::shared_ptr (for example) then I strongly believe that it is fair to allow users of that compiler to use the latest and greatest version of boost::shared_ptr, even if it fails to process some other libraries. Forcing the users to stick to an old, unpatched version of *all* the libraries just because they cannot compile *some* of the libraries does not sound fair at all to me.
From my memory, Borland was one of the worst compilers I had worked with, at least the free version available for download, not sure which version it was (I think it was 5.something).
Yes, that is the one! (-:
It always surprised me why people would pay for it when there are many better alternatives.
One difference I mentioned is compile speed. It does make a difference on a 1600 MHz single-core processor. Another is download size: 4 MB for Digital Mars, 9 MB for Borland, a few gigabytes for new versions of Visual and CodeGear. As for Cygwin and GCC, another difference is acquaintance with the build environment and the debugger... Another is the syntax of the inline assembler (and the amount of already-written inline assembler code that still needs to be processed). And support for Windows-specific features such as the Active Template Library. I guess I am lucky that I have not used VCL!
There are GCC and MSVS Express for Windows, which are free and implement the Standard sufficiently well. And let me say that Windows also costs money, so you could save it by using Linux, which is free and runs on virtually any hardware.
Thank you -- I have kept my license from the days when I was in school. (-: But really, thank you kindly (and in no way ironically) for the good advice.
Most distributions have recent GCC and clang. The Intel compiler is available for free for non-commercial use.
I'm not sure what you refer to as "2-or-3-or-4 times faster than Visual C++"
I mean "2-or-3-or-4 times faster than Visual C++": any version of Borland since 2000 versus any version of Visual since 2003. Linking time is about the same, but I instead hack the output OBJ files and link with Digital Mars' linker, which is 10 times as fast. Literally "10 times as fast". All these *do* make a difference (for me) for an executable file with a 5 MB code section...
but if you're not satisfied with compile times then I suggest you play with
compiler options. At least with GCC, reducing optimization level makes it compile considerably faster.
Staying on an older Boost version is another option. And believe me, if you're highly concerned with compile times you may want to do that anyway since in
some cases compile times get longer with the more recent releases.
Thank you, but I would rather have the latest patches and bug fixes and the features that the compiler affords. Some people are happy with just a few libraries, and even those without all their features. But for the few libraries and the few features that I use, I would very much like to have the latest. And I truly believe that granting this desire means no extra work for any Boost developer or maintainer (as per the Loki approach described above). Otherwise I would not have asked for it.
[...] As I said, the Standard was released 15 years ago. The compilers that are not able to implement it by this time are failures.
I see your point. But I do not think that *age* by itself is that important. For example, Borland 2002 (5.6, aka C++Builder 6) and 2006 (5.8, aka Developer Studio 2006, aka Turbo C++ 2006) have more issues in the front end than the free 2000 version (5.5). As for Microsoft Visual C++, the SSE2 optimizer is better in the 2003 version than in the 2005 version in quite a few cases. Maybe isolated cases, but my point is not to judge and despise a compiler just for its age...