@HarneyBA node sizes are determined such that each node size decrease translates to a doubling of transistor density.
Ie for planar transistors, the drop from 45nm to 32nm doubled the number of transistors per square millimeter.
Of course, the cost increase of each node shrink is now greater than the gain in transistor density
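For context, the "doubling per node" claim follows from simple geometry: if a node name were a true linear feature size, areal density would scale with the inverse square of it. A minimal sketch (the function name and the idealized scaling assumption are mine, not an industry formula):

```python
def density_gain(old_nm: float, new_nm: float) -> float:
    """Relative transistor-density gain from shrinking old_nm -> new_nm,
    assuming density scales as 1 / (linear feature size)^2."""
    return (old_nm / new_nm) ** 2

# The planar 45nm -> 32nm shrink: (45/32)^2 is about 1.98, i.e. roughly 2x density.
print(round(density_gain(45, 32), 2))  # -> 1.98
```

Note this idealized scaling is exactly what stops holding once "node" becomes a marketing label rather than a measured dimension.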
@HarneyBA that doesn't hold up anymore, however, in the era of FinFET transistors. Intel has been FinFET since 22nm, and pretty much every node smaller than that in the industry is also FinFET.
Node size in the FinFET era is also mostly meaningless below 15nm, because it doesn't account for fin pitch or density in any way
@HarneyBA cost goes up a huge amount from this point forward. Below 10nm it requires new extreme-ultraviolet (EUV) lithography that is only just now coming to fruition.
450mm wafers are also just around the corner, and when they are ready, the ENTIRE industry will move to them at the same time. That will require ENORMOUS capital expense, the likes of which we haven't seen since 300mm wafers came out
There is actually no more interest in the industry in doing it, despite the obvious benefits. Equipment manufacturers don't want to end-of-life their current 300mm tech, fabs don't want to completely retool, and we would have to relearn the painful lessons of moving up a wafer size.
Going from 200mm to 300mm wafers was a VERY painful experience for the industry, even more so than any previous step up to 200mm.