Much of my work is with memory-constrained systems, and unknowingly using twice as much memory as expected would be very undesirable. (I noticed Niall Douglas' comment earlier today "There is nothing wrong, *whatsoever*, with a variant which is sized the largest of its possible states multiplied by two" - I strongly disagree with that.) I'd like to at least get a warning if that were going to happen. A specific concern is that I might accidentally get this fallback due to forgetting to write "noexcept" somewhere (e.g. when using types from older code).
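To make the concern concrete, here is a minimal sketch (the type names are hypothetical): a move constructor inherited from older code that is perfectly correct but simply never marked noexcept reports as potentially throwing, and that is exactly what would push such a variant onto the doubled-storage fallback.

```cpp
#include <type_traits>

// Hypothetical legacy type whose move operations were never marked noexcept.
struct Legacy {
    Legacy(Legacy&&) {}                            // noexcept forgotten
    Legacy& operator=(Legacy&&) { return *this; }  // noexcept forgotten
};

// The same shape of type with noexcept spelled out.
struct Fixed {
    Fixed(Fixed&&) noexcept {}
    Fixed& operator=(Fixed&&) noexcept { return *this; }
};

// The trait makes the difference visible at compile time; asserting
// is_nothrow_move_constructible_v<Legacy> directly would refuse to compile,
// which is exactly the early warning I would like to get.
static_assert( std::is_nothrow_move_constructible_v<Fixed>);
static_assert(!std::is_nothrow_move_constructible_v<Legacy>);
```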
Something which I have found is surprisingly uncommon amongst even expert C++ programmers is the judicious use of static_assert to ensure no surprises down the line when junior programmers modify something without realising the consequences. In my own code, you'll find lots of stanzas like this:

```
#ifdef NDEBUG
static_assert(sizeof(X) <= cmax(sizeof(A), sizeof(B), sizeof(C), ...));
static_assert(noexcept(std::declval<A>() = std::move(std::declval<A>())));
#endif
```

... and so on. Yes, there is a strong argument that these really ought to live in unit tests. However, they are *very* cheap tests, so cheap that it costs virtually nothing to sprinkle them all over your headers, right next to the type definitions. And yet they strongly encourage every junior (or senior!) programmer never to commit accidental "inconsequential" changes which blow out the size, or break nothrow, in a hot code path.

Coming back to my point, there really isn't anything wrong with a variant which defines its size to be twice that of its largest alternative state, if it is *documented* to be so. If the small-size optimisation is extremely important to you, then static assert it! Write the requirement into your code! Then you are *guaranteed* the requirement is met; otherwise the code won't compile.

Now back to ACCU talk writing! :)

Niall
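P.S. For concreteness, a minimal sketch of writing the requirement into the code, assuming std::variant and two hypothetical alternatives; the exact size bound you pick depends on the discriminator and padding your implementation adds:

```cpp
#include <algorithm>
#include <type_traits>
#include <variant>

// Hypothetical alternatives; substitute your real payload types.
struct Small  { char buf[16]; };
struct Number { double value; };

using Payload = std::variant<Small, Number>;

// Allow room for the discriminator and padding, but reject anything
// approaching double-buffered storage for the largest alternative.
static_assert(sizeof(Payload) < 2 * std::max(sizeof(Small), sizeof(Number)),
              "Payload appears to reserve storage for two states");

// And make sure nobody quietly removes noexcept from the move operations,
// which is the usual trigger for such a fallback.
static_assert(std::is_nothrow_move_constructible_v<Small> &&
              std::is_nothrow_move_constructible_v<Number>,
              "Payload alternatives must be nothrow-movable");
```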