Hi to the group,
I am tinkering with some C code that processes microcode from a DG MV/10000
machine. While working on it, I noticed the microcode is stored in
little-endian byte order. That's simple enough to work around, but it got me
wondering: why do we have both big- and little-endian byte orders at all?
What is the benefit of storing the low-order byte first? Or was that simply
an arbitrary decision made by some hardware manufacturers?
I am mostly just curious.
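
For context, the workaround itself seems straightforward: assembling each
word byte by byte with shifts is portable no matter what the host's own byte
order is. Here is a minimal sketch of that idea (the 32-bit word size, the
read_le32 name, and the sample bytes are just for illustration and may not
match the actual microcode format):

#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

/* Assemble a 32-bit value from 4 bytes stored low-order byte first
 * (little-endian). Shifting and OR-ing works the same on any host,
 * big- or little-endian, so no #ifdefs are needed. */
static uint32_t read_le32(const uint8_t *p)
{
    return (uint32_t)p[0]
         | (uint32_t)p[1] << 8
         | (uint32_t)p[2] << 16
         | (uint32_t)p[3] << 24;
}

int main(void)
{
    /* Hypothetical 4-byte word from the file, low-order byte first. */
    uint8_t buf[4] = { 0x78, 0x56, 0x34, 0x12 };
    printf("0x%08" PRIX32 "\n", read_le32(buf));  /* prints 0x12345678 */
    return 0;
}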
Thanks,
Peter / KG4OKG