Laboratory setup for using giant “Rydberg” atoms to measure temperature. The glowing red orb is the cloud of approximately one million rubidium atoms at the heart of the experiment.
Credit: N. Schlossberger/NIST
Scientists at the National Institute of Standards and Technology (NIST) have created a new thermometer using atoms boosted to such high energy levels that they are a thousand times larger than normal. By monitoring how these giant “Rydberg” atoms interact with heat in their environment, researchers can measure temperature with remarkable accuracy. The thermometer’s sensitivity could improve temperature measurements in fields ranging from quantum research to industrial manufacturing.
Unlike traditional thermometers, a Rydberg thermometer doesn’t need to be adjusted or calibrated at the factory because it relies on the fundamental principles of quantum physics. Those same principles make its measurements precise and directly traceable to international standards.
“We’re essentially creating a thermometer that can provide accurate temperature readings without the usual calibrations that current thermometers require,” said NIST postdoctoral researcher Noah Schlossberger.
Revolutionizing Temperature Measurement
The research, published in Physical Review Research, is the first successful temperature measurement made with Rydberg atoms. To create this thermometer, researchers filled a vacuum chamber with a gas of rubidium atoms and used lasers and magnetic fields to trap the atoms and cool them to nearly absolute zero, around 0.5 millikelvin (half a thousandth of a degree above absolute zero), at which point the atoms were essentially motionless. Using lasers, they then boosted the atoms’ outermost electrons to very high orbits, making the atoms approximately 1,000 times larger than ordinary rubidium atoms.
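To put a rough number on that factor of 1,000 (an illustrative back-of-the-envelope estimate, not a figure quoted from the paper): in a hydrogen-like atom, the radius of the outer electron’s orbit grows as the square of the principal quantum number n,

\[
r_n \approx n^2 a_0, \qquad a_0 \approx 5.29 \times 10^{-11}\,\mathrm{m},
\]

where a_0 is the Bohr radius. Since the outer electron of an ordinary rubidium atom sits only a few Bohr radii from the nucleus, inflating the atom a thousandfold requires n^2 of a few thousand, i.e. n of several tens — the regime conventionally called a Rydberg state.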
In Rydberg atoms, the outermost electron orbits far from the core of the atom, making it far more responsive to electric fields and other influences from its surroundings — including the heat radiating from nearby objects, which is exactly what the thermometer exploits.
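The link between that sensitivity and temperature can be illustrated with a standard textbook relation (a sketch of the general principle; the paper’s actual analysis may differ in detail). The blackbody radiation filling the chamber drives transitions between neighboring Rydberg levels at a rate proportional to the Planck photon occupation number, which depends only on the transition frequency ω and the temperature T of the surroundings:

\[
\bar{n}(\omega, T) = \frac{1}{e^{\hbar \omega / k_B T} - 1},
\qquad
\Gamma_{\mathrm{BBR}} = \Gamma_{\mathrm{spont}}\,\bar{n}(\omega, T).
\]

Because ħ and Boltzmann’s constant k_B are fixed constants of nature, watching how quickly the atoms are redistributed among nearby Rydberg states lets one solve for T directly — with no factory calibration, which is what makes the measurement inherently traceable.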