** This post was originally titled Big Endian vs Little Endian - but this title was not relevant to the post. **
So in the last post I mentioned that Math was causing me a little bit of a problem; well, it was actually quite a large problem. The Math part itself was fairly straightforward, but I had made a very big mistake quite early on.
I had declared this enum:
public enum EightBitLocation : int
{
    Zero = 0,
    One = 1,
    Two = 2,
    Three = 3,
    Four = 4,
    Five = 5,
    Six = 6,
    Seven = 7
}
This made perfect sense, since I was storing the first bit at the Zero index of my Bit Array; however, this is not the convention used by the Z80 CPU.
Instead, the Z80 CPU numbers its bits from right to left: bit 0 is the least significant bit and bit 7 the most significant; the opposite of what I had.
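To make that convention concrete, here's a tiny sketch (in Python, just to illustrate the arithmetic; the project itself is C#) of the weight each bit position carries under the Z80's numbering:

```python
# Under the Z80's numbering, bit n carries a weight of 2**n,
# so bit 0 is the least significant bit and bit 7 the most significant.
weights = {n: 1 << n for n in range(8)}
print(weights)  # {0: 1, 1: 2, 2: 4, 3: 8, 4: 16, 5: 32, 6: 64, 7: 128}
```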
This didn't really cause a problem until I tried to convert my internal representation to a Byte, at which point the semantic differences started to matter.
Previously I had declared a ZByte like so:
new ZByte(Bit.One, Bit.One, Bit.One, Bit.Zero, Bit.One, Bit.One, Bit.Zero, Bit.One);
That in itself is fine, but the specification says this should have a value of EDh (that is, hexadecimal ED).
Instead, the value of my internal Byte came out as B7h.
Essentially I was putting the Bits in backwards.
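You can see the reversal by taking those same eight bits and reading them both ways. A quick sketch (Python here, purely to show the arithmetic):

```python
# The eight Bit values passed to the ZByte constructor, in declaration order.
bits = [1, 1, 1, 0, 1, 1, 0, 1]

# Spec reading: the first argument is bit 7 (most significant),
# the last argument is bit 0 (least significant).
msb_first = sum(b << (7 - i) for i, b in enumerate(bits))

# My original reading: the first argument lands at index 0
# and gets treated as bit 0 (least significant).
lsb_first = sum(b << i for i, b in enumerate(bits))

print(hex(msb_first))  # 0xed
print(hex(lsb_first))  # 0xb7
```

Same bits, two interpretations: one gives EDh and the other B7h, exactly the mismatch I was seeing.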
This is because I had originally placed the Zero Bit at the Zero index of the Bit Array, which made it the most significant bit. But the Zero bit of a Byte is the least significant bit.
So instead I adjusted my enum to reflect the semantics of the Z80 CPU, with the Seventh bit being the most significant, and it started working correctly.
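Under the corrected semantics, each location's numeric value is its Z80 bit position, which also makes it the shift amount when building the Byte. A sketch of that idea (Python, with hypothetical names; the real code uses the C# enum above):

```python
# Hypothetical mapping mirroring the corrected enum: each location's
# value is its Z80 bit position, so Seven (bit 7) is the most
# significant bit and Zero (bit 0) the least.
locations = {"Seven": 7, "Six": 6, "Five": 5, "Four": 4,
             "Three": 3, "Two": 2, "One": 1, "Zero": 0}

def to_byte(bits_by_location):
    """bits_by_location maps a location name to a 0/1 bit value;
    the location's position doubles as the shift amount."""
    return sum(bit << locations[name] for name, bit in bits_by_location.items())

value = to_byte({"Seven": 1, "Six": 1, "Five": 1, "Four": 0,
                 "Three": 1, "Two": 1, "One": 0, "Zero": 1})
print(hex(value))  # 0xed
```

With the bit positions matching the Z80's convention, the same bit pattern now comes out as EDh, as the specification says it should.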