[QUOTE=Steerforth]4 bit - Atari 2600
8 bit - NES
16 bit - SNES
32 bit - PS1
64 bit - N64
128 bit - Dreamcast

If I am mistaken on these someone will correct them.[/QUOTE]
I was going to write a post about this just this morning, but it's a lot more complicated than the bitness of the CPU. CPUs are much more complex these days than they were in the past. They can be multi-core, and can be accompanied by additional special-purpose processing units such as a GPU, FPU, ALU, or SIMD unit, among others. These can be embedded in the CPU itself, or added to the machine's architecture alongside the CPU.
The Atari 2600 was an 8-bit machine, not 4-bit; it used a MOS 6507. There were some calculators built around Intel's early 4-bit processors, and some other scattered use of 4-bit processing, but 4-bit chips were never widely used in gaming consoles or home computers.
The Dreamcast also was not a true 128-bit machine; its SH-4 was a 32-bit CPU with a 128-bit vector FPU attached. The GameCube's CPU was a 32-bit PowerPC core with a 64-bit FPU that could also operate as a 2x32-bit SIMD unit. The Xbox was built around a 32-bit Pentium III hybrid; being a Pentium, it had built-in MMX/SSE SIMD for packed integer and floating-point math, and it was accompanied by a very powerful Nvidia GPU. Note that the PlayStation 2 came closest to a true 128-bit CPU: its "Emotion Engine" paired a 64-bit MIPS core with 128-bit-wide registers and SIMD instructions, plus two built-in 128-bit vector units. Yet it was weaker in processing power, and definitely in combined CPU/GPU output, than the nominally lower-bit machines it was competing against.
That should tell you something about why bitness is no longer the benchmark for processor speed or capability. Each increase in bitness delivers diminishing returns, partly because handling values and addresses wider than 64 bits doesn't streamline things the way the earlier jumps did. Instead, the trend today is to scale a processor by adding cores, speeding up communication between those cores, and providing specialized units which aren't as general-purpose as a CPU, but which are much faster at specific kinds of operations. Again, these include the GPU (graphics processing unit), ALU (arithmetic logic unit), FPU (floating point unit), and SIMD units ("single instruction, multiple data"), the last of which applies one instruction across a whole array of values at once instead of returning a single result.
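To make the SIMD idea concrete, here's a minimal sketch of my own in C using x86 SSE intrinsics (this assumes an SSE-capable x86 compiler and is not from any console SDK). One instruction performs four additions at once:
[code]
/* Minimal SIMD sketch: one instruction operating on four floats at a time.
   Assumes an x86 machine with SSE and a compiler providing <xmmintrin.h>. */
#include <stdio.h>
#include <xmmintrin.h>

int main(void)
{
    float a[4]   = { 1.0f,  2.0f,  3.0f,  4.0f};
    float b[4]   = {10.0f, 20.0f, 30.0f, 40.0f};
    float out[4];

    __m128 va   = _mm_loadu_ps(a);      /* load four floats into one 128-bit register */
    __m128 vb   = _mm_loadu_ps(b);
    __m128 vsum = _mm_add_ps(va, vb);   /* a single instruction does four additions */
    _mm_storeu_ps(out, vsum);           /* the result is an array of values, not one value */

    printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);
    return 0;
}
[/code]
A GPU pushes the same idea much further, running these lane-parallel operations across thousands of values at once.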
A little Bit more?
- VideoGameCritic
A bit is a 1 or a 0 in computer memory. In terms of video games, the term was widely used to describe the number of bits a processor could crunch at one time. All of the early consoles and home computers were 8-bit, including the Atari 2600; that was the standard in computers for years. One 8-bit word can hold a number from 0 to 255, which is a pretty major limitation when it comes to rendering colors, generating high-resolution graphics, and addressing memory. Naturally, programmers came up with all kinds of clever work-arounds.
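Just to illustrate that 0-to-255 ceiling, here's a quick C sketch of my own (nothing console-specific, just a standard C compiler with <stdint.h>):
[code]
/* An 8-bit value can only hold 0-255 and wraps around past that;
   a 16-bit value keeps going up to 65535. */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint8_t color = 255;       /* the largest value 8 bits can hold */
    color = color + 1;         /* wraps back to 0: only 256 distinct values exist */
    printf("255 + 1 in  8 bits = %u\n", (unsigned)color);

    uint16_t wide = 255 + 1;   /* a 16-bit word holds 256 with room to spare */
    printf("255 + 1 in 16 bits = %u\n", (unsigned)wide);
    return 0;
}
[/code]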
So when the first 16-bit CPUs were used in the Atari ST, Amiga, and later the Genesis and SNES, the difference was pretty dramatic. The games were bigger, with more colors, larger sprites, and better resolution. Back then, bits did make a difference. Although subsequent consoles' CPUs were 32 and 64 bits, the law of diminishing returns was at work. Marketers in the mid-'90s really bastardized the whole "bit" definition by adding together the bits of two co-processors, or measuring the width of the data bus in bits. Anything to come up with a bigger number.
The latest consoles have 128-bit CPUs, but that's largely irrelevant. Today's computers and consoles don't rely on a single CPU to do everything; they farm work out to dedicated graphics chips, sound chips, and various other processors. Clock speed is more of a factor, but the architectures are so complex you can't really compare one to the next.