One of the more annoying things you have to deal with when writing code that targets a variety of platforms is endianness. You see, much as in Gulliver’s Travels, there are two camps in the computing world: those who break numbers into clusters of 8 bits starting at the least significant part of the number and laying it out in memory a byte at a time that way (little-endian), and those who start at the most significant 8 bits and lay them out a byte at a time until they reach the least significant part (big-endian).
(Actually, most chips these days are bi-endian: you can strap a pin to power or ground, or set a configuration bit at boot, to flip the default byte order to whichever you want. But most people just use the default.)
This is really annoying if you’re a game developer. PCs are little-endian. Old-school Macs? Xbox 360? Wii? PS3? Yeah, they’re big-endian. Network developers deal with it too: network byte order is big-endian, so port numbers and addresses always go over the wire that way. And you need to remember that, or you’ll have problems.
This means you end up littering your PC-based editor code (your development environment) with byte-order swaps during serialization, which makes it harder to read than it should be. If you forget a swap, the bugs can be difficult to find. And if you take stock code written for one platform and move it to another, you often have to go in and change all kinds of things.
Let’s fix that.