Okay, now on to the long explanation. The long explanation starts with an international organisation called 'Unicode'. It's the organisation that handles the international standards for converting numbers into textual characters. Unicode was the solution to an increasingly important problem in the dawn of computing and the internet: how does my computer communicate with another computer on the other side of the world if that computer 'speaks a different language'?

One of the most popular 'languages' in the early 1980s (especially in the USA) was ASCII - the American Standard Code for Information Interchange. ASCII was (and still is) just a simple set of conversion rules to go from numbers to characters. There were 128 characters in the original ASCII specification - and that's because 128 is the number of distinct values you can represent with 7 bits. But isn't it the case that computers tend to like groups of 8 bits (i.e. a 'byte')? Yep, but the 8th bit was used for code pages - that is, the other 128 characters (128 + 128 = 256, the number of distinct values you can make from 8 bits) were used for domain-specific purposes.
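To make that number-to-character idea concrete, here's a minimal Python sketch. It isn't part of the original explanation - it just uses Python's built-in `ord` and `chr`, which follow the same rules as ASCII for the first 128 code numbers:

```python
# Characters and numbers are two views of the same thing.
print(ord("A"))   # 65  - the ASCII number for 'A'
print(chr(65))    # 'A' - and back again

# The original ASCII range is 0..127: 2**7 = 128 distinct values,
# so every ASCII character fits in 7 bits.
print(2 ** 7)              # 128
print((127).bit_length())  # 7 - the highest ASCII code needs only 7 bits

# A full byte (8 bits) gives 2**8 = 256 values; the extra 128 were
# what code pages used for their domain-specific characters.
print(2 ** 8)              # 256
```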