On Wed, 25 Nov 2015 01:50:40 +0530, Andrey Semashev wrote:
On 2015-11-24 19:29, Domagoj Šarić wrote:
On Tue, 17 Nov 2015 06:24:37 +0530, Andrey Semashev wrote:
Personally, I'm in favor of adding these: BOOST_OVERRIDE, BOOST_FINAL. Although their implementation should be similar to other C++11 macros - they should be based on BOOST_NO_CXX11_FINAL and BOOST_NO_CXX11_OVERRIDE.
I agree, but what if you don't have final but do have sealed (with a less recent MSVC)?
As far as I understand, sealed can be used only with C++/CLI, is that right? If so then I'd rather not add a macro for it.
If on the other hand sealed can be used equivalently to final in all contexts, then you could use it to implement BOOST_FINAL.
It is available in (native) C++ but technically it is not _the_ C++ keyword, rather a vendor extension with the same purpose, so 'purists' might mind that BOOST_NO_CXX11_FINAL is left undefined even when BOOST_FINAL is actually defined to sealed instead of final...
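For example, a sketch of how BOOST_FINAL could fall back to the extension (the #if logic below is only an illustration, not the actual Boost.Config implementation):

    #if !defined( BOOST_NO_CXX11_FINAL )
    #   define BOOST_FINAL final
    #elif defined( _MSC_VER )
    #   define BOOST_FINAL sealed // MSVC extension with the same purpose
    #else
    #   define BOOST_FINAL
    #endif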
In release builds asserts are expanded to something like (void)0. Technically, that's nothing, but who knows if it affects optimization.
It doesn't ;) (otherwise plain asserts would also affect optimisation)
The dependency on Boost.Assert is technically only there if you use the 'checked' macros... I agree that it is still 'ugly' (and the user would have to separately/explicitly include boost/assert.hpp to avoid a circular dependency), but so is, to me, the idea of having to manually duplicate/prefix all assumes with asserts (since I like all my assumes verified and this would add so much extra verbosity)...
You can add the checked versions to Boost.Assert with a separate PR.
True, didn't think of that (although that'd still be more verbose than I'd like...BOOST_ASSUME_CHECKED, BOOST_ASSUMED_ASSERT or something like that)... + I'd add a configuration macro to Boost.Assert that would make the regular BOOST_ASSERT use BOOST_ASSUME instead of BOOST_LIKELY...
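To illustrate the idea (BOOST_ASSUMED_ASSERT is just one of the hypothetical names mentioned above, and the expansions below are my sketch, not an existing implementation): verify the condition in debug builds, hand it to the optimiser as an assumption in release builds:

    #include <cassert>
    #ifdef NDEBUG
    #   if defined( _MSC_VER )
    #       define BOOST_ASSUMED_ASSERT( condition ) __assume( condition )
    #   elif defined( __clang__ )
    #       define BOOST_ASSUMED_ASSERT( condition ) __builtin_assume( condition )
    #   elif defined( __GNUC__ )
    #       define BOOST_ASSUMED_ASSERT( condition ) ( ( condition ) ? (void)0 : __builtin_unreachable() )
    #   else
    #       define BOOST_ASSUMED_ASSERT( condition ) (void)0
    #   endif
    #else
    #   define BOOST_ASSUMED_ASSERT( condition ) assert( condition )
    #endif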
I would have liked BOOST_HAS_CXX_RESTRICT to indicate that the compiler has support for the C99 keyword 'restrict' (or an equivalent) in C++ (the CXX in the macro name emphasizes that the feature is available in C++, not C). The BOOST_RESTRICT macro would be defined to that keyword or empty if there is no support.
Sure, I can add the detection macro, but for which 'feature set' (already for the minimal one - pointers only - or only for the full one - pointers, refs and this)?
That's a good question. I'm leaning towards full support, although that will probably not make MSVC users happy.
Yeah, that'd make me pretty unhappy so no can do :P As it is: BOOST_RESTRICT is defined only if there is full support, while the _PTR, _REF and _THIS variants are always defined to whatever the compiler offers...so that's also a possibility (make the 'subfeature' macros always offer what they can and define the HAS macro only for 'all features')...
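For a usage sketch (the BOOST_RESTRICTED_PTR spelling and its expansion below are my assumptions based on the _PTR/_REF/_THIS suffixes, not the actual proposal):

    #if defined( _MSC_VER ) || defined( __GNUC__ )
    #   define BOOST_RESTRICTED_PTR __restrict
    #else
    #   define BOOST_RESTRICTED_PTR
    #endif

    // With no aliasing between input and output the loop below can be vectorised.
    void accumulate( float       * BOOST_RESTRICTED_PTR output,
                     float const * BOOST_RESTRICTED_PTR input,
                     unsigned int count )
    {
        for ( unsigned int i = 0; i != count; ++i )
            output[ i ] += input[ i ];
    }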
I don't see much use in BOOST_ATTRIBUTES and related macros - you can achieve the same results with feature specific-macros (e.g. by using BOOST_NORETURN instead of BOOST_ATTRIBUTES(BOOST_DOES_NOT_RETURN)).
Yes, I may change those...I was however 'forward thinking' WRT attributes standardization (so that BOOST_ATTRIBUTES(BOOST_DOES_NOT_RETURN) reads like the 'one day' [[noreturn]] syntax).
That still doesn't improve over BOOST_NORETURN. If there's a reason to, we could even define BOOST_NORETURN to [[noreturn]].
Well, if shorter prefixes (than BOOST) were used/allowed for attributes (e.g. BFA_ as Boost Function Attribute) then the BOOST_ATTRIBUTES syntax may be less verbose when many attributes are used...but OK, I'll have to rethink this...
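For comparison, on a C++11 compiler both spellings could boil down to the same thing (the definitions below are purely illustrative, not the proposed implementation):

    #define BOOST_DOES_NOT_RETURN   noreturn
    #define BOOST_ATTRIBUTES( ... ) [[ __VA_ARGS__ ]]

    BOOST_ATTRIBUTES( BOOST_DOES_NOT_RETURN )   // expands to [[ noreturn ]]
    void fail( char const * message );

    // vs. the existing Boost.Config spelling:
    // BOOST_NORETURN void fail( char const * message );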
I don't see the benefit of BOOST_NOTHROW_LITE.
It's a nothrow attribute that does not insert runtime checks to call std::terminate...and it is unfortunately not offered by Boost.Config...
Do you have measurments of the possible benefits compared to noexcept? I mean, noexcept was advertised as the more efficient version of throw() already.
What more measurements beyond the disassembly window, which clearly shows unnecessary EH codegen (i.e. bloat), are necessary? (I already talked about this when Beman Dawes was adding the *THROW* macros but was ignored... I'd actually prefer it if BOOST_NOTHROW_LITE simply replaced BOOST_NOTHROW instead of adding a new, more verbose macro...)
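A sketch of the distinction (the expansion below is my illustration of the idea, not a proposed implementation): the vendor 'nothrow' attributes only promise 'does not throw' to the optimiser, without the std::terminate() check that noexcept mandates:

    #if defined( _MSC_VER )
    #   define BOOST_NOTHROW_LITE __declspec( nothrow )
    #elif defined( __GNUC__ )
    #   define BOOST_NOTHROW_LITE __attribute__(( nothrow ))
    #else
    #   define BOOST_NOTHROW_LITE // no lightweight alternative, leave unannotated
    #endif

    BOOST_NOTHROW_LITE void fast_path( int * data, unsigned int count );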
(I need this and the *ALIAS* macros for a rewrite/expansion of Boost.Cast, that includes 'bitwise_cast', a sort of generic, safe&optimal reinterpret_cast)...
Again, it looks like this macro would have a rather specialized use.
You're right - these *ALIAS* related macros may start their life in Boost.Cast and then later be moved to Config if libraries (or client code) start depending on Cast only because of them...
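As an aside, the 'bitwise_cast' mentioned above is essentially the memcpy idiom (this sketch is my illustration, not the actual Boost.Cast rewrite): a well-defined alternative to reinterpret_cast that optimisers reduce to a plain register move:

    #include <cstring>
    #include <type_traits>

    template <typename Target, typename Source>
    Target bitwise_cast( Source const & source )
    {
        static_assert( sizeof( Target ) == sizeof( Source ), "sizes must match" );
        static_assert( std::is_trivially_copyable<Target>::value &&
                       std::is_trivially_copyable<Source>::value, "types must be trivially copyable" );
        Target target;
        std::memcpy( &target, &source, sizeof( target ) );
        return target;
    }

    // e.g. inspect the bit pattern of a float:
    // unsigned int const bits( bitwise_cast<unsigned int>( 1.0f ) );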
I don't think BOOST_OVERRIDABLE_SYMBOL is a good idea, given that the same effect can be achieved in pure C++.
You mean creating a class template with a single dummy template argument and a static data member just so that you can define a global variable in a header w/o linker errors?
Slightly better:
    template< typename T, typename Tag = void >
    struct singleton { static T instance; };

    template< typename T, typename Tag >
    T singleton< T, Tag >::instance;
That's what I meant...and it is really verbose (and slower to compile than a compiler-specific attribute)...
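For contrast, the attribute-based spelling looks roughly like this (the expansion of BOOST_OVERRIDABLE_SYMBOL is my assumption, not the proposed implementation):

    #if defined( _MSC_VER )
    #   define BOOST_OVERRIDABLE_SYMBOL __declspec( selectany )
    #elif defined( __GNUC__ )
    #   define BOOST_OVERRIDABLE_SYMBOL __attribute__(( weak ))
    #else
    #   define BOOST_OVERRIDABLE_SYMBOL
    #endif

    // Defined in a header, included from any number of translation units,
    // without linker errors:
    BOOST_OVERRIDABLE_SYMBOL int global_flag = 0;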
Also, some compilers offer this functionality only as a pragma.
You mean in a way that would require a _BEGIN and _END macro pair?
Maybe for some compilers. I meant this:
https://docs.oracle.com/cd/E19205-01/819-5267/bkbkr/index.html
Oh...so a function macro would be required...
There's just no point in these compiler-specific workarounds when there's a portable solution.
Except maybe when the 'portable solution' is itself a 'hack' (i.e. an 'abuse' of existing language functionality), and a verbose one at that...which may indicate that we need specific/dedicated language functionality...
Calling conventions macros are probably too specialized to functional libraries, I don't think there's much use for these. I would rather not have them in Boost.Config to avoid spreading their use to other Boost libraries.
That's kind of self-contradicting: if there is a 'danger' of them being used in other libraries, that would imply there is a 'danger' of them being useful...
What I mean is that having these macros in Boost.Config might encourage people to use them where they would normally not.
The same as above...I don't see a problem? If they are useful - great; if not and people still use them - 'we have bigger problems'... These may be moved to Functional, but that would eventually make many more libraries depend on Functional just for one or two macros...
In any case, I agree that most of those would mostly be used only in functional libraries, but for HPC and math libraries especially, the *REG*/'fastcall' conventions are useful when one cannot (e.g. on ABI boundaries) or does not want to rely on the compiler (IPO, LTCG etc.) to automatically choose the optimal/custom calling convention... Admittedly this is mostly useful on targets with 'bad default' conventions, like 32-bit x86 and MS x64, but these are still widely used ABIs :/
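For a concrete sketch of what such a macro would do (BOOST_FASTCALL is a hypothetical name here, and the expansion is only illustrative): pass arguments in registers on 32-bit x86 builds and expand to nothing where the default convention is already register-based:

    #if defined( _MSC_VER ) && defined( _M_IX86 )
    #   define BOOST_FASTCALL __fastcall
    #elif defined( __GNUC__ ) && defined( __i386__ )
    #   define BOOST_FASTCALL __attribute__(( fastcall ))
    #else
    #   define BOOST_FASTCALL
    #endif

    // e.g. an ABI-boundary function that avoids pushing its arguments through
    // the stack on 32-bit x86:
    float BOOST_FASTCALL dot3( float const * a, float const * b );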
Non-standard calling conventions give enough headache for users to avoid them as much as possible.
There is no 'standard' calling convention, just the 'default' one...and what headache can a non-default calling convention in an API cause (e.g. the whole Win32 and native NT APIs use the non-default stdcall convention)?
You might use them in library internals but there I think it's better to avoid the call at all - by forcing the hot code inline.
Enter bloatware... A statically dispatched call to a 'near' function has near-zero overhead for any function with half a dozen instructions _if_ the ABI/calling convention does not force the parameters to ping-pong through the stack... Force-inlining is just a primitive, brute-force method in such cases, which eventually makes things even worse (this 'bloatware-ignoring' way of thinking is certainly a major factor in why the dual-core, 1 GB RAM netbook I'm typing on now slows down to a crawl from paging when I open Gmail and three more tabs...). For dynamically dispatched calls (virtual functions), choosing the appropriate calling convention and decorating the function with as many relevant attributes as possible is even more important (as the dynamic dispatch is a firewall for the optimiser, which has to assume that the function 'accesses & throws the whole universe')...
Function optimization macros are probably too compiler and case-specific. Your choice of what is considered fast, small code, acceptable math optimizations may not fit others.
If the indisputable goal (the definition of 'good codegen') is to have fast and small code/binaries, then 'I have to disagree'. For example, a library dev can certainly know that certain code will never be part of a hot block (assuming correct usage of the library) - for example initialisation, cleanup or error/failure-related code - and that it should thus be optimised for size (because that is actually optimising for real-world speed: reducing unnecessary bloat and IO and CPU cache thrashing).
If that code is unimportant then why do you care?
Already explained above - precisely because it is unimportant it is important that it be compiled for size (and possibly moved to the 'cold' section of the binary) to minimise its impact on the performance of the code that does matter; loading speed of the binary; virtual memory; disk space, fragmentation and IO...
Simply organizing code into functions properly and using BOOST_LIKELY/UNLIKELY where needed will do the thing.
No, it will not (at least not w/o PGO), as the compiler cannot deduce these things (except for simple scenarios, like assuming that all noreturn functions are cold)...and saying that we can/should help it with BOOST_LIKELY while arguing that we shouldn't help it with BOOST_COLD/MINSIZE/OPTIMIZE_FOR_* is 'beyond self-contradicting'...
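A sketch of the hot/cold annotations under discussion (BOOST_COLD and BOOST_MINSIZE are the names mentioned above; these expansions are my assumptions): place rarely executed code in the cold section and optimise it for size:

    #if defined( __clang__ )
    #   define BOOST_COLD    __attribute__(( cold ))
    #   define BOOST_MINSIZE __attribute__(( minsize ))
    #elif defined( __GNUC__ )
    #   define BOOST_COLD    __attribute__(( cold ))
    #   define BOOST_MINSIZE __attribute__(( optimize( "Os" ) ))
    #else
    #   define BOOST_COLD
    #   define BOOST_MINSIZE
    #endif

    BOOST_COLD BOOST_MINSIZE
    void report_initialisation_failure( char const * what );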
Also, things like these should have a very limited use, as the user has to have the ultimate control over the build options.
I'm 'more than all for ultimate control' - as explained above this can actually give more control to the user (+ Boost Build was already a major pain when it came to user control over changing compiler optimisation options in preexisting variants)...
What I was saying is that it's the user who has to decide whether to build your code for size or for speed or for debug. That includes the parts of the code that you, the library author, consider performance critical or otherwise.
I'm sorry, I fail to take this as anything other than pointless nagging for the sake of nagging (and we are not talking about debug builds here). That's tantamount to saying that the user has to decide which parts of my library I'll tweak and optimise and which not... As already explained, properly marking the hot and cold parts of the code gives the user _more_ freedom to use the more coarse/global compiler options w/o fear that they will negatively impact the library code.
You may want to restrict his range of choices, e.g. when a certain optimization breaks your code.
More strawman 'ivory towering'...how exactly am I restricting anyone's choices? A real-world example, please?
I guess, you could try to spell these restrictions with these macros, but frankly I doubt it's worth the effort.
Based on what do you doubt? Obviously I am using all this and find it worth the effort... You could throw this generic naysaying just as well at likely, noexcept, restrict, rvalue refs...
I mean, there are so many possibilities on different compilers.
With some you have many possibilities (GCC), with others only a few (Clang, MSVC)...in any case, what difference does that make? Isn't that part of the Boost story - abstraction-on-path-to-standardisation?
One legitimate reason to use these macros that comes to mind is changing the target instruction set for a set of functions that require that (e.g. when a function is optimized for AVX in an application that is supposed to run in the absence of this extension). But then this only seems necessary with gcc, which again makes it a rather specific workaround.
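For illustration (my sketch, not part of the proposal), that per-function target switch would look like this with the GCC/Clang 'target' attribute:

    #if defined( __GNUC__ ) // GCC or Clang
    __attribute__(( target( "avx" ) ))
    void transform_avx( float * data, unsigned int count );     // AVX-specialised version
    #endif
    void transform_generic( float * data, unsigned int count ); // baseline fallback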
GCC and Clang (and possibly others) but that's a whole different story/set of macros (i.e. unrelated to this)...

--
"What Huxley teaches is that in the age of advanced technology, spiritual devastation is more likely to come from an enemy with a smiling face than from one whose countenance exudes suspicion and hate." Neil Postman