Bo Persson wrote
The problem is that C and C++ don't mandate any specific representation of data in the underlying hardware. For example, a byte (char) isn't required to be 8 bits, not all bits in a word are required to take part in the value, and the word size isn't given by the standard either.
*IF* the language standard were to let you specify 42-bit padding and 17-bit alignment on any data type, that would disqualify some hardware up front. It would simply be impossible to implement on some systems that have compilers right now.
How is that an improvement? Less portability because there will be fewer compilers?!
Yeah, I've heard this excuse before, and it doesn't fly. If a vendor chooses not to support generic operations, such as reading a binary file or stream of types that are not native, then that is the vendor's choice not to make a compatible compiler; a lesser compliance level can be attached to it if need be. But hurting the interoperability of the rest of the community hurts the progress of the community, and that is not good.

A standard needs to be created for binary interoperability, covering speed, size, and genericity. XML is fine for certain uses where resources are plentiful, but it is too bloated for other things like microcontrollers, where resources are scarce. Communication is key for a thriving community.

--
View this message in context: http://boost.2283326.n4.nabble.com/boost-The-file-boost-detail-endian-hpp-ne...
Sent from the Boost - Dev mailing list archive at Nabble.com.