Computers represent all data using binary digits (0s and 1s), because electronic circuits can reliably distinguish only two states, such as high and low voltage. Characters are encoded as patterns of bits and bytes: ASCII covers basic English letters, digits, and symbols, while Unicode extends this to characters from a wide range of languages. For a character to be displayed, it passes through several stages, from the keyboard press to its binary code and finally to the appropriate character rendered on screen.
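As a minimal sketch of this idea (Python is used here purely for illustration), the snippet below maps a character to its code point, shows the underlying bit pattern, and encodes a non-ASCII character to bytes using UTF-8:

```python
# ASCII: the letter 'A' has code point 65, stored as the bit pattern 01000001.
code_point = ord("A")
print(code_point)                  # 65
print(format(code_point, "08b"))   # 01000001

# Unicode via UTF-8: characters outside the ASCII range need more than one byte.
text = "é"                         # U+00E9, not representable in ASCII
encoded = text.encode("utf-8")     # two bytes: 0xC3 0xA9
print(encoded)                     # b'\xc3\xa9'
print(" ".join(format(b, "08b") for b in encoded))  # 11000011 10101001

# Decoding reverses the process, recovering the character from its bytes.
print(encoded.decode("utf-8"))     # é
```

This mirrors the stages described above: a keypress is turned into a numeric code, the code is stored as a pattern of bits, and the same pattern is later decoded back into the character shown on screen.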