@HarneyBA node sizes are chosen such that each full node shrink translates to a doubling of transistor density: linear dimensions scale by roughly 0.7x, so area per transistor halves.
I.e., for planar transistors, the drop from 45nm to 32nm (32/45 ≈ 0.71) doubled the number of transistors per square millimeter.
Of course, the cost of each node shrink grows faster than the gain in transistor density
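The arithmetic behind that doubling claim can be sketched in a few lines (a rough back-of-the-envelope model, not a process-accurate one):

```python
# Rough sketch: a full node shrink scales linear dimensions by ~0.7,
# so area per transistor roughly halves and density roughly doubles.
shrink = 32 / 45            # linear scale factor for 45nm -> 32nm, ~0.711
area_factor = shrink ** 2   # area per transistor, ~0.506
density_gain = 1 / area_factor

print(f"linear shrink: {shrink:.3f}")        # ~0.711
print(f"density gain:  {density_gain:.2f}x") # ~1.98x, i.e. a doubling
```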
@HarneyBA that doesn't hold up anymore, however, in the era of FinFET transistors. Intel has been FinFET since 22nm, and pretty much every node smaller than that across the industry is FinFET as well.
Node size in the FinFET era is also mostly meaningless below 15nm, because the number doesn't account for fin pitch or fin density in any way
@HarneyBA cost goes up a huge amount from this point forward. Below 10nm, it requires new extreme-ultraviolet (EUV) lithography that is only just now coming to fruition.
450mm wafers are also just around the corner, and when they are ready, the ENTIRE industry will move to them at the same time. That will require ENORMOUS capital expense, the likes of which we haven't seen since 300mm wafers came out
@HarneyBA why move to 450mm, you ask? 450mm will yield roughly 2.25x more dies per wafer while maintaining the CURRENT cost per wafer.
We saw the same benefits with each wafer size jump, from 100mm -> 150mm -> 200mm -> 300mm.
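That 2.25x figure falls straight out of the geometry: wafer area scales with the square of the diameter. A quick sketch (gross area ratio only, ignoring edge losses, which actually favor the larger wafer slightly):

```python
# Wafer area scales with diameter squared, so each size jump multiplies
# gross dies per wafer by roughly (d_new / d_old)^2.
def area_ratio(d_old_mm: float, d_new_mm: float) -> float:
    return (d_new_mm / d_old_mm) ** 2

for old, new in [(100, 150), (150, 200), (200, 300), (300, 450)]:
    print(f"{old}mm -> {new}mm: ~{area_ratio(old, new):.2f}x dies per wafer")
# 300mm -> 450mm works out to (450/300)^2 = 2.25x
```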
Of course, that ignores the cost of new equipment and R&D.
@HarneyBA Of course.
It's fun to see hobbyists try to make usable devices with non-industry tools, but it can't scale without gigantic capital investment if you're starting from scratch. Even if you borrow a foundry's fab, you'll still need a gigantic financial investment *just* for the photomasks.
Linux geeks doing what Linux geeks do...