What does 8-bit / 16-bit actually refer to?

When talking about retro games, terms like “8-bit music” or “16-bit graphics” often come up. I myself often use these terms, but I’m not exactly sure what they refer to. What do they mean?

Answer

8-bit and 16-bit, for video games, refer to the processors used in the console. The number is the word size of the data each processor works with. The 8-bit generation of consoles (starting with Nintendo’s Famicom, also called the Nintendo Entertainment System) used 8-bit processors; the 16-bit generation (starting with NEC/Hudson’s PC Engine, also called the TurboGrafx-16) used a 16-bit graphics processor. This affects the quality and variety of the graphics and the music by determining how much data can be handled at once; Oak’s answer details the specifics for graphics.
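To make “word size” concrete, here is a minimal Python sketch (the values are illustrative, not tied to any particular console) showing the range an 8-bit versus a 16-bit word can hold, and how arithmetic wraps around when a result doesn’t fit:

```python
# Illustrative only: mask a result down to a fixed word size,
# mimicking how a real 8-bit or 16-bit register behaves.
def wrap(value, bits):
    return value & ((1 << bits) - 1)

print(wrap(255, 8))         # 255: the largest value an 8-bit word holds
print(wrap(255 + 1, 8))     # 0: one more and the 8-bit word wraps around
print(wrap(255 + 1, 16))    # 256: a 16-bit word still has plenty of room
print(wrap(65535 + 1, 16))  # 0: a 16-bit word wraps at 65536
```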

If you’re not familiar with bits, here is the Wikipedia article on them: http://en.wikipedia.org/wiki/Bit. I’ll quote the first sentence, which is really all one needs to know:

A bit or binary digit is the basic unit of information in computing and telecommunications; it is the amount of information that can be stored by a digital device or other physical system that can usually exist in only two distinct states.
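Since each bit holds one of two states, a group of n bits can represent 2^n distinct values, which is where the jump from 8-bit to 16-bit matters. A quick sketch:

```python
# Each additional bit doubles the number of distinct states a word can hold.
for bits in (1, 2, 8, 16):
    print(f"{bits:2} bit(s): {2 ** bits:5} distinct values "
          f"(0 to {2 ** bits - 1} unsigned)")
```

Running this shows that an 8-bit word distinguishes 256 values while a 16-bit word distinguishes 65,536, which is the underlying reason 16-bit hardware could offer richer graphics and sound.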

Now, note that in modern times, terms like “8-bit music” and “16-bit graphics” don’t necessarily have anything to do with processors or word sizes, as most hardware no longer works at those scales. They may instead refer to the style of music or graphics used in games of those generations, done as an homage to nostalgia. 8-bit music is the standard chiptune fare, and the graphics were simple with a limited colour palette. 16-bit music is higher quality but often still has a distinct electronic feel, while the graphics became much more complex, though still largely 2-dimensional and around 240p resolution.
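For a concrete sense of what “chiptune” means, here is a minimal Python sketch (the filename, sample rate, and tuning are arbitrary choices, not taken from any real console’s sound chip) that writes one second of a square-wave tone, the basic voice of 8-bit-era audio hardware, to an 8-bit WAV file:

```python
import math
import wave

SAMPLE_RATE = 22050  # samples per second (illustrative choice)
FREQUENCY = 440.0    # A4
DURATION = 1.0       # seconds

frames = bytearray()
for n in range(int(SAMPLE_RATE * DURATION)):
    t = n / SAMPLE_RATE
    # A square wave is simply "high" for half the period and "low" for
    # the other half -- trivially cheap for early sound chips to produce.
    high = math.sin(2 * math.pi * FREQUENCY * t) >= 0
    frames.append(200 if high else 55)  # 8-bit unsigned samples

with wave.open("square_a4.wav", "wb") as f:
    f.setnchannels(1)            # mono
    f.setsampwidth(1)            # 1 byte per sample = 8-bit audio
    f.setframerate(SAMPLE_RATE)
    f.writeframes(bytes(frames))
```

Play the resulting file and you’ll hear the buzzy, hollow beep familiar from NES-era soundtracks; richer chiptunes are built by layering a few such simple waveforms.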

Attribution
Source: Link, Question Author: Kevin Yap, Answer Author: Community
