Consider the following code:
    #include <stdio.h>

    void func(void)
    {
        long a = 0x12345678;
        char *p;

        p = (char *) &a;                       /* view a's bytes as stored in memory */
        printf("%02X\n", (unsigned char) *p);  /* print the first byte; cast avoids
                                                  sign-extension of a negative char */
    }
On a little-endian machine, this prints "78", because the least significant byte is stored first; on a big-endian machine with a 4-byte long, it prints "12" (with an 8-byte long, the first byte would be "00"). This dependence on byte order is one of the big (pardon the pun) reasons why structured programmers generally frown on typecasts.