A new tool measures the distance between phonon collisions
Today’s computer chips pack billions of tiny transistors onto a plate of silicon no wider than a fingernail. Each transistor, just tens of nanometers across, acts as a switch that, in concert with others, carries out a computer’s computations. As dense forests of transistors signal back and forth, they give off heat, which can fry the electronics if a chip gets too hot.

Manufacturers commonly apply a classical diffusion theory to gauge a transistor’s temperature rise in a computer chip. But now an experiment by MIT engineers suggests that this common theory doesn’t hold up at extremely small length scales. The group’s results indicate that the diffusion theory underestimates the temperature rise of nanoscale heat sources, such as a computer chip’s transistors. Such a miscalculation could affect the reliability and performance of chips and other microelectronic devices.

“We verified that when the heat source is very small, you cannot use...
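To get a feel for why the size of the heat source matters, here is a rough back-of-the-envelope sketch in Python. The power level, the silicon conductivity, the mean free path, the spreading-resistance formula, and the simple "gray" correction for small heaters are all illustrative assumptions for this sketch, not the MIT group's measurements or model; the point is only that a Fourier-diffusion estimate and a ballistic-aware estimate diverge sharply once the heater shrinks below the phonon mean free path.

```python
# Toy comparison: classical (Fourier) temperature-rise estimate for a small
# circular heater versus the same estimate with a crude ballistic correction.
# All numbers and formulas below are textbook-style approximations chosen for
# illustration, not the experiment described in the article.

Q = 100e-6       # heater power in watts (assumed: 100 microwatts)
K_BULK = 140.0   # bulk thermal conductivity of silicon, W/(m*K) (approximate)
MFP = 200e-9     # representative phonon mean free path in silicon, ~200 nm (assumed)

def delta_t_fourier(radius_m):
    """Diffusion-theory estimate: spreading resistance of an isothermal
    circular heater of the given radius on a semi-infinite substrate."""
    return Q / (4.0 * K_BULK * radius_m)

def delta_t_ballistic_corrected(radius_m):
    """Same estimate, but with the effective conductivity suppressed
    (gray-model style) when the heater is smaller than the mean free path."""
    k_eff = K_BULK / (1.0 + MFP / radius_m)
    return Q / (4.0 * k_eff * radius_m)

# Sweep heater radii from microscale down to transistor-like dimensions.
for r in [10e-6, 1e-6, 100e-9, 20e-9]:
    dt_diff = delta_t_fourier(r)
    dt_ball = delta_t_ballistic_corrected(r)
    print(f"radius {r * 1e9:8.0f} nm: Fourier {dt_diff:8.3f} K, "
          f"corrected {dt_ball:8.3f} K ({dt_ball / dt_diff:.1f}x hotter)")
```

Run as written, the two estimates agree for a 10-micrometer heater but differ by roughly an order of magnitude for a 20-nanometer one, which is the qualitative behavior the article describes: the smaller the heat source relative to the phonon mean free path, the more the diffusion theory understates the temperature rise.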