John Maddock wrote:
I've just committed the code so you can try for yourself, but basically rather than storing a sequence of "limbs" (which may vary in size from one platform/compiler to another), it stores a sequence of bytes instead. The bytes are extracted using high level operations (shifts and bitmasks) so there's no issue with endianness etc.
As I said before - this shouldn't be necessary.
Consider a 128-bit integer type; internally this could be:
* Expressed as 8 16-bit integers (intmax_t = 32 bits).
* Expressed as 4 32-bit integers (intmax_t = 64 bits).
* Expressed as a single 128-bit (native) integer.
How would you serialize from one format to a different one without breaking it down into "portably small" chunks?
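For concreteness, breaking a 128-bit value down into byte-sized chunks with shifts and masks might look roughly like the sketch below. This is only an illustration, not the committed code; it assumes a hypothetical uint128 held as four 32-bit limbs, least significant limb first.

#include <boost/serialization/split_free.hpp>
#include <boost/serialization/vector.hpp>
#include <cstdint>
#include <vector>

struct uint128 {
    std::uint32_t limb[4];   // limb[0] holds the least significant 32 bits
};

namespace boost { namespace serialization {

// save: peel the value apart into 16 bytes, least significant first,
// using only shifts and masks - independent of the internal limb layout
template<class Archive>
void save(Archive & ar, const uint128 & x, const unsigned int /*version*/){
    std::vector<std::uint8_t> bytes(16);
    for(int i = 0; i < 16; ++i)
        bytes[i] = static_cast<std::uint8_t>((x.limb[i / 4] >> (8 * (i % 4))) & 0xffu);
    ar << bytes;
}

// load: reassemble the limbs from the portable byte sequence
template<class Archive>
void load(Archive & ar, uint128 & x, const unsigned int /*version*/){
    std::vector<std::uint8_t> bytes;
    ar >> bytes;
    for(int i = 0; i < 4; ++i)
        x.limb[i] = 0;
    for(int i = 0; i < (int)bytes.size() && i < 16; ++i)
        x.limb[i / 4] |= std::uint32_t(bytes[i]) << (8 * (i % 4));
}

} } // namespace boost::serialization

BOOST_SERIALIZATION_SPLIT_FREE(uint128)

Because the bytes are produced arithmetically, the archived form is the same no matter which of the three internal representations above is in use.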
Of course you have to define the 128-bit integer in terms of some other types, but THOSE types should already be portably serializable.

Another way to deal with this is to work at a more primitive level: mark the 128-bit integer type as "primitive" and define output/input stream operators for it, and text archives should "just work" (see the sketch below). Of course these operators will have to be implemented with portability in mind, so the issue is just moved somewhere else. But one probably needs to define these operators in any case, so for text archives serialization comes for free.

For binary and portable binary archives, one might need to specialize the serialization operators. For binary archives there's not much, if anything, to do. For portable binary archives, one might have to make a minor specialization; the portable binary archive already includes code for handling the endianness of arbitrarily long integer types. (Note that the portable binary archive isn't complete in any case, in that it doesn't handle floating point types - a good GSoC project if one wants one.)

So you've got a couple of ways of addressing this.

Robert Ramey
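To make the "primitive" approach concrete, here is a minimal sketch. It assumes a hypothetical int128 class (not the actual multiprecision type), and the hex stream format is chosen purely for illustration; any portable text representation would do.

#include <boost/archive/text_oarchive.hpp>
#include <boost/archive/text_iarchive.hpp>
#include <boost/serialization/level.hpp>
#include <cstdint>
#include <iostream>
#include <sstream>

// stand-in for a real 128-bit integer class
struct int128 {
    std::uint64_t hi;
    std::uint64_t lo;
};

// portable text representation - hex halves, purely for illustration
std::ostream & operator<<(std::ostream & os, const int128 & x){
    return os << std::hex << x.hi << ' ' << x.lo << std::dec;
}
std::istream & operator>>(std::istream & is, int128 & x){
    return is >> std::hex >> x.hi >> x.lo >> std::dec;
}

// mark the type as "primitive": text archives then fall back on the
// stream operators above, so serialization comes for free
BOOST_CLASS_IMPLEMENTATION(int128, boost::serialization::primitive_type)

int main(){
    std::stringstream ss;
    const int128 a = { 0x0123456789abcdefULL, 0xfedcba9876543210ULL };
    {
        boost::archive::text_oarchive oa(ss);
        oa << a;
    }
    int128 b = { 0, 0 };
    {
        boost::archive::text_iarchive ia(ss);
        ia >> b;
    }
    std::cout << b << '\n';   // prints the round-tripped value
}

Binary and portable binary archives would still need their own treatment of such a type, as described above.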