A little Bit more?
Posted: April 22nd, 2007, 5:35 pm
[QUOTE=Steerforth]4 bit - Atari 2600
8 bit - NES
16 bit - SNES
32 bit - PS1
64 bit - N64
128 bit - Dreamcast
if I am mistaken on these someone will correct them.[/QUOTE]
I was going to write a post about this this morning, but the short answer is that it comes down to a lot more than the bitness of the CPU. CPUs are much more complex today than they used to be: they can be multi-core, and they can be accompanied by additional special-purpose processing units such as a GPU, FPU, ALU, or SIMD unit, among others. These can be built into the CPU itself or added elsewhere in the machine's architecture alongside it.
The Atari 2600 was an 8-bit machine, not 4-bit; it used a MOS 6507. There were some calculators built around an early 4-bit Intel processor (the 4004), and a few other scattered uses of 4-bit processing, but 4-bit CPUs were never widely used in gaming consoles or home computers.
The Dreamcast also did not have a true 128-bit CPU; its Hitachi SH-4 was a 32-bit CPU with an FPU capable of 128-bit (4x32-bit) vector operations. The GameCube's Gekko was a 32-bit PowerPC-derived CPU with 32-bit integer ALUs and a 64-bit FPU that could also run as a 2x32-bit SIMD unit. The Xbox used a 32-bit Pentium III hybrid; being a Pentium III, it had built-in MMX/SSE SIMD instructions for integer and floating-point work, and it was paired with a very powerful NVidia GPU. The PlayStation 2 came the closest to a true 128-bit CPU: its "Emotion Engine" was a 64-bit MIPS core with 128-bit-wide registers and SIMD instructions, an FPU, and two built-in 128-bit vector units. Yet it was weaker in raw processing power, and definitely in combined CPU/GPU output, than the two essentially 32-bit machines it was competing against.
That should tell you something about why bitness is no longer the benchmark for processor speed or capability. Each increase in bitness delivers diminishing returns, partly because handling values and addresses wider than 64 bits doesn't provide the same practical benefit that the earlier jumps (8 to 16, 16 to 32, 32 to 64) did. Instead, the trend today is to scale a processor by adding cores, by speeding up communication between those cores, and by adding specialized units that aren't as general-purpose as a CPU but are much faster at specific kinds of operations. Again, these include the GPU (graphics processing unit), ALU (arithmetic logic unit), FPU (floating-point unit), and SIMD ("single instruction, multiple data") units, which apply a single operation to a whole array of values at once instead of returning one value at a time.
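To make the SIMD idea concrete, here is a minimal C sketch using the standard x86 SSE intrinsics (the same instruction family the Xbox's Pentium III supported). It's just an illustration, not code for any particular console: one 128-bit SIMD instruction adds four 32-bit floats at once, where the scalar version needs four separate additions.
[CODE]
#include <stdio.h>
#include <xmmintrin.h>  /* x86 SSE intrinsics: 128-bit registers holding 4 x 32-bit floats */

int main(void)
{
    float a[4] = { 1.0f,  2.0f,  3.0f,  4.0f };
    float b[4] = { 10.0f, 20.0f, 30.0f, 40.0f };
    float scalar_sum[4], simd_sum[4];

    /* Scalar: one addition per element, four separate operations. */
    for (int i = 0; i < 4; i++)
        scalar_sum[i] = a[i] + b[i];

    /* SIMD: a single 128-bit instruction (_mm_add_ps) adds all four floats at once. */
    __m128 va   = _mm_loadu_ps(a);
    __m128 vb   = _mm_loadu_ps(b);
    __m128 vsum = _mm_add_ps(va, vb);
    _mm_storeu_ps(simd_sum, vsum);

    for (int i = 0; i < 4; i++)
        printf("%g + %g = %g (scalar) / %g (SIMD)\n",
               a[i], b[i], scalar_sum[i], simd_sum[i]);
    return 0;
}
[/CODE]
The same "one instruction, many values" idea is what the PS2's vector units and the GameCube's paired-single FPU were built around, just with different instruction sets.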