## Since humans use decimal notation and computers use binary notation, converting numbers between the two bases is fundamental to how computers represent numbers.

a) Using the following numbers (show all your working):

i) 1356, show how a decimal number can be converted to binary.

ii) 1011100111, show how a different binary number can be converted to decimal.
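The two standard methods behind part (a) can be sketched in Python. This is an illustrative sketch only (the question asks for hand working); the function names `decimal_to_binary` and `binary_to_decimal` are my own labels, not part of the question.

```python
def decimal_to_binary(n):
    """Repeated division by 2: each remainder is the next least
    significant bit, so the remainders read in reverse give the answer."""
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # remainder = current least significant bit
        n //= 2
    return "".join(reversed(bits)) or "0"

def binary_to_decimal(bits):
    """Positional notation: sum each bit times its power of two."""
    return sum(int(b) << i for i, b in enumerate(reversed(bits)))

print(decimal_to_binary(1356))          # 10101001100
print(binary_to_decimal("1011100111"))  # 743
```

Working by hand follows exactly the same steps: divide 1356 repeatedly by 2 recording remainders, and for 1011100111 add the powers of two under each 1 bit (512 + 128 + 64 + 32 + 4 + 2 + 1 = 743).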

b) Using the following numbers (show all your working without going via decimal):

i) 1011111000, show how a binary number can be converted to octal.

ii) 1110001111, show how a binary number can be converted to hexadecimal.
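The grouping method for part (b) — take the bits in groups of three (octal) or four (hexadecimal), starting from the least significant bit — can be sketched as follows. Again this is only an illustration of the hand method; `binary_to_base` is an illustrative name, not given in the question.

```python
def binary_to_base(bits, group):
    """Convert a binary string to octal (group=3) or hex (group=4) by
    grouping bits from the least significant end, without going via decimal."""
    digits = []
    while bits:
        # Peel off `group` bits from the right; the final chunk may be shorter.
        bits, chunk = bits[:-group], bits[-group:]
        digits.append("0123456789abcdef"[int(chunk, 2)])
    return "".join(reversed(digits))

print(binary_to_base("1011111000", 3))  # 1370 (octal)
print(binary_to_base("1110001111", 4))  # 38f (hexadecimal)
```

By hand: 1 011 111 000 gives octal 1370, and 11 1000 1111 gives hexadecimal 38F, reading each group as a single digit.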

c) Explain why, when using the methods described in part (b), it is important that the grouping starts at the least significant bit.
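The point behind part (c) can be demonstrated by comparing the two grouping directions on a bit string whose length is not a multiple of the group size. Grouping from the most significant bit puts the leftover short group at the bottom, which shifts every other digit's place value and gives a wrong answer. A small sketch (illustrative helper names, not part of the question):

```python
def groups_from_right(b, k):
    """Correct: group from the LSB end; any short group is the top digit."""
    return [b[max(0, i - k):i] for i in range(len(b), 0, -k)]

def groups_from_left(b, k):
    """Incorrect: group from the MSB end; the short group lands at the bottom."""
    return [b[i:i + k] for i in range(0, len(b), k)]

bits = "1011111000"  # 10 bits, not a multiple of 3
right = "".join(str(int(g, 2)) for g in reversed(groups_from_right(bits, 3)))
wrong = "".join(str(int(g, 2)) for g in groups_from_left(bits, 3))
print(right)  # 1370 — agrees with the true octal value
print(wrong)  # 5740 — wrong: every group is misaligned with its place value
```

Starting at the least significant bit guarantees each group of three (or four) bits sits exactly on an octal (or hexadecimal) digit boundary, since 8 = 2^3 and 16 = 2^4.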
