A curious property of CPU architecture is the argument over whether data should be big-endian or little-endian; that is, should the most significant items be listed first, saving the least for last, or should the least significant items be listed first, saving the best for last? What if the transmission of data is cut off mid-stream? The computer program may be tempted to accept the big-endian dataset because it contains the most significant data, but “the devil is in the details,” as they say, so this could be a fatal mistake. Similarly, the computer program may be tempted to drop the little-endian dataset entirely because it contains no significant data, but the unrevealed data at the end may have led to a different conclusion.
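To make the truncation idea concrete, here is a minimal sketch in Python using the standard-library struct module. It assumes a hypothetical 4-byte unsigned integer whose transmission is cut off after two bytes, with the missing tail padded with zeros before decoding: the big-endian prefix preserves the high-order bytes (a rough sense of magnitude), while the little-endian prefix preserves only the low-order bytes.

```python
import struct

value = 0x12345678  # 305,419,896 in decimal

big = struct.pack(">I", value)     # b'\x12\x34\x56\x78' (most significant byte first)
little = struct.pack("<I", value)  # b'\x78\x56\x34\x12' (least significant byte first)

# Simulate a transmission cut off after the first two bytes,
# zero-padding the missing tail so we can still decode something.
big_cut = struct.unpack(">I", big[:2] + b"\x00\x00")[0]
little_cut = struct.unpack("<I", little[:2] + b"\x00\x00")[0]

print(hex(big_cut))     # 0x12340000 — the magnitude roughly survives
print(hex(little_cut))  # 0x5678 — only the least significant part survives
```

Whether either truncated value is actually *useful* depends entirely on the application, which is rather the point of the paragraph above.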
For the purposes of computer science, it is tempting to say that endianness is a solved problem and Intel won. However, just because Intel is one of the largest manufacturers of computer processors, and its x86 and x86-64 architectures are little-endian, does not mean that is the best way to go. It’s very possible we could have much further advanced computers now if not for Intel’s choice, or much crappier ones. It’s also possible we could have big-endian computers just as advanced as the current ones, whether despite Intel or with Intel’s help: because big-endian is far superior but Intel’s engineers persevered anyway on the hard road, or because big-endian is far inferior and Intel’s engineers took the easy road, never reaching the full potential of human discernment.
It’s also possible that neither path is correct, both are equal, or some are more equal than others, to quote Animal Farm. It’s also possible that different endiannesses are appropriate for different situations, e.g. big-endianness for lossy editing and little-endianness for lossless editing: the difference between editing photos, videos, and music on the one hand, or text on the other. Endianness is also known as byte order, and in a multibyte value, the preference for putting the most significant byte first or the least significant byte first is a product of your upbringing, whether it be your parents, grandparents, uncles, aunts, cousins, friends, strangers, environment, culture, heritage, private school, the streets, or state-run public schools. Your preference is also a product of your static and unchangeable genetics, and I may have just now subtly or overtly influenced your view of the world by writing this paragraph.
The difference between choosing big-endianness or little-endianness might just as well be the difference between being male or female, black or white, gay or straight, tall or short, right-handed or left-handed, right-brained or left-brained, or speaking English or Spanish, or both, or neither. The question isn’t “does it matter?”, because it obviously has a significant impact upon your life and behavior. The question is “why does it matter?”, because that is the only question that makes sense and has efficacy. Similarly, you can’t understand the motivations behind big-endianness without understanding the motivations behind little-endianness, and you can’t understand the motivations behind little-endianness without understanding the motivations behind big-endianness. I’m not a computer science student, so I don’t know much about it, but I could wager it has something to do with the properties of silicon.
In truth, endianness is just another of life’s little arguments to either make you incredibly sane or incredibly insane. No problem can be solved at the same level it was created, so you either have to choose the path of ignorance or choose the path of knowledge. The former divides you into camps of religious zealots, and the latter unifies you into the camp of humanity, not by transcending the need for endianness, but by recognizing that the worst possible action is to stagnate and refuse to make a choice, and that each road is equally valid and you should choose whatever works best for you and your mind. The choice is yours.