8-bit, 16-bit, 32-bit, 64-bit, etc...
I remember when games were defined by this "bit" number. What about now? What bit would the PS2 be considered, or the Xbox 360?
I'm pretty sure that both are 128-bit capable, but they may not use it.
I've always been confused as to what the "bit" actually stands for. Surely no system after the 32-bit era (save maybe a few of the 64-bit systems) actually has a 128-bit CPU. Is it perhaps the bit size of the registers in the GPU?
NES: 8-bit 6502 (with integrated sound hardware) at 1.79MHz
SMS: 8-bit Z80 at 3.6MHz
SNES: 16-bit 65c816 hack of the 8-bit 6502 running at 1.79, 2.7, or 3.6MHz
Genesis: 16-bit 68000 (Sega CD: same CPU, but at 12.5MHz rather than 7.6; 32X: two 32-bit SH2 CPUs at 23MHz)
PlayStation: 32-bit MIPS CPU (R3000A) at 33MHz
N64: 64-bit MIPS CPU (R4300i, designed with SGI) at 94MHz
Saturn: dual 32-bit SH2 CPUs running at 28.6MHz
Dreamcast: 32-bit SH4 CPU running at 200MHz
GameCube: 32-bit PowerPC G3-based CPU running at 485MHz
Xbox: 32-bit Intel Celeron running at 733MHz
PS2: some crazy MIPS knock-off that they call the "Emotion Engine" running at 300MHz (it's a 64-bit MIPS core with 128-bit-wide SIMD registers, which is where the "128-bit" marketing came from)
Xbox 360: triple-core 64-bit PowerPC G5-based CPU, each core running at 3.2GHz
PS3: some crazy "Cell" chip with the main core being a 3.2GHz PowerPC G5-based core; there are also 7 vector processing units on the chip which are supposed to help with all of the graphics rendering.
That's pretty much all the consoles from history that matter, and I think it answers the question.
All from memory, though I did check my facts.
Also moved to "General Gaming" where it belongs...
I think that the GBA is 32-bit, if you count the Game Boys...
The "bit" part of the CPU (ie 8-bit) refers to how many bits can be manipulated by a single instruction in the CPU (ie stored, added, subtracted etc).
Although, with more instructions / programming it can go beyond those limits. Some registers in the CPU (registers are basically the CPU's built-in variables) can be bigger than the CPU's nominal bit width (usually stack or address registers).
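To make that concrete, here's a minimal C sketch (purely hypothetical, just demonstrating the idea on a modern machine): a uint8_t behaves like an 8-bit register, so a single 8-bit add wraps around, but you can still get a 16-bit result by chaining two 8-bit adds with a carry, which is exactly the trick 8-bit CPUs like the 6502 use.

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* An 8-bit register only holds 0..255, so a single 8-bit
       add wraps around: 200 + 100 gives 44, not 300. */
    uint8_t a = 200, b = 100;
    uint8_t sum8 = (uint8_t)(a + b);
    printf("8-bit add: %d + %d = %d (wrapped)\n", a, b, sum8);

    /* With more instructions you can go wider: add the low bytes,
       keep the carry, then add the high bytes plus the carry.
       This is how an 8-bit CPU does 16-bit math. */
    uint16_t x = 0x1234, y = 0x0FCD;

    uint8_t x_lo = (uint8_t)x, x_hi = (uint8_t)(x >> 8);
    uint8_t y_lo = (uint8_t)y, y_hi = (uint8_t)(y >> 8);

    uint16_t lo_sum = (uint16_t)(x_lo + y_lo);  /* may exceed 8 bits */
    uint8_t  carry  = lo_sum > 0xFF;            /* simulated carry flag */
    uint8_t  lo     = (uint8_t)lo_sum;
    uint8_t  hi     = (uint8_t)(x_hi + y_hi + carry);

    uint16_t result = (uint16_t)((hi << 8) | lo);
    printf("16-bit add from 8-bit pieces: 0x%04X + 0x%04X = 0x%04X\n",
           (unsigned)x, (unsigned)y, (unsigned)result);
    printf("native 16-bit add for comparison: 0x%04X\n",
           (unsigned)(uint16_t)(x + y));
    return 0;
}
```

The lo_sum > 0xFF check here simulates the carry flag a real CPU keeps in hardware; on a 6502 you'd get the same effect with a CLC / ADC / ADC sequence.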