Small addendum: it would be possible to emulate operations with more bits by splitting them up and doing more calculations, but you would lose a lot of compute power.
So instead of calculating 1 value at 16 bits (65 536 possible values), you would calculate 2x 8 bits and additionally need to interpret these 2 values together, as sketched below.
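A minimal sketch in C of what that splitting looks like, assuming plain 16-bit addition (the function name is made up for illustration):

```c
#include <stdint.h>
#include <stdio.h>

/* Emulate one 16-bit addition using only 8-bit additions.
 * Illustrative sketch; add16_via_8bit is a hypothetical name. */
uint16_t add16_via_8bit(uint16_t a, uint16_t b) {
    uint8_t a_lo = a & 0xFF, a_hi = a >> 8;   /* split each value into two bytes */
    uint8_t b_lo = b & 0xFF, b_hi = b >> 8;

    uint8_t sum_lo = a_lo + b_lo;             /* first 8-bit addition */
    uint8_t carry  = sum_lo < a_lo;           /* wrap-around means a carry out */
    uint8_t sum_hi = a_hi + b_hi + carry;     /* second 8-bit addition */

    return ((uint16_t)sum_hi << 8) | sum_lo;  /* "interpret these 2 values together" */
}

int main(void) {
    printf("%u\n", add16_via_8bit(300, 500)); /* prints 800 */
    return 0;
}
```

One 16-bit add becomes two 8-bit adds plus carry handling, which is exactly where the compute power goes.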
So the jumps from 8 bits (256 values) to 16 (65 536) and from 16 to 32 (4 294 967 296) were huge. From 32 to 64 (18 446 744 073 709 551 616) not so much, but still relevant, and with 64 bits we have enough values for most variables you could need in game development; the few that need more precision can be done with multiple operations. 128-bit will come but won't be as relevant (3.4028236692093846346337460743177e+38 values), and with 256 (about 1.16e+77)... I honestly don't even expect it to reach the mainstream, except maybe if the aim is to work on 2 values in parallel on one core.
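As a sketch of those "multiple operations", here is a hypothetical 128-bit add built from two 64-bit adds, the same splitting idea one step up (u128 and add128 are made-up names):

```c
#include <stdint.h>

typedef struct { uint64_t lo, hi; } u128;  /* two 64-bit halves */

u128 add128(u128 a, u128 b) {
    u128 r;
    r.lo = a.lo + b.lo;            /* low 64-bit addition */
    uint64_t carry = r.lo < a.lo;  /* carry out of the low half */
    r.hi = a.hi + b.hi + carry;    /* high 64-bit addition */
    return r;
}
```

Compilers already emit sequences like this for wider-than-native integer types, which is part of why the occasional high-precision value doesn't need native 128-bit hardware.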
OK, quoting Wikipedia:
"256-bit processors could be used for addressing directly up to 2^256 bytes. Already 2^128 ( 128-bit) would greatly exceed the total data stored on Earth as of 2010, which has been estimated to be around 1.2 zettabytes (over 2^70 bytes).[1]"
With 128-bit there seem to be at least some uses (see the Wikipedia article on 128-bit computing).