In 1916, Einstein postulated the existence of gravitational waves in his General Theory of Relativity. Gravitational waves are ripples in the curvature of spacetime, and they are detected with laser interferometers that measure the quadrupole stretching and squeezing of space the waves produce. Current detectors are limited by thermal noise, which is calculated using the Fluctuation-Dissipation Theorem (FDT). A number of future detectors are planned to operate at cryogenic temperatures to reduce this thermal noise, but cooling introduces a complication: the FDT relies on the system it describes being in thermal equilibrium. In room-temperature detectors, any thermal gradient present is much smaller than the absolute temperature, so thermal equilibrium can be assumed. In cryogenic detectors, substantial thermal gradients arise because heat is extracted through the suspension fibres, and the system can no longer be treated as being in equilibrium. The experiment described in this thesis investigates the validity of the FDT when a system is out of equilibrium. By measuring Johnson noise in a thin-film platinum resistor subjected to a thermal gradient, the thermal noise predicted by the FDT can be compared with the measured noise of the resistor to look for any deviation from the theoretical value. The experiment gave a null result: no excess noise was seen at an absolute temperature of 110 K with a temperature difference of 6 K across the resistor. This places a limit on the size of thermal gradient that produces no excess noise above the theoretically calculated value, i.e. a regime in which the FDT remains valid.
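For reference, the equilibrium Johnson-Nyquist (FDT) prediction against which the measured noise is compared takes the standard form sketched below. The symbols k_B, T, R, T_1 and T_2 are generic notation (Boltzmann's constant, absolute temperature, resistance, and the temperatures at the two ends of the resistor), not quantities quoted from the thesis, and the treatment of the gradient is a simple assumed approximation rather than the thesis's own analysis.

```latex
% Equilibrium Johnson-Nyquist (FDT) prediction for the one-sided voltage noise
% power spectral density of a resistor R at temperature T:
S_V(f) = 4 k_B T R
% Naive extension under a thermal gradient (assumption: uniform resistor with an
% approximately linear temperature profile), replacing T with the mean temperature:
\bar{T} \approx \tfrac{1}{2}\left(T_1 + T_2\right)
```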
Measuring Johnson resistor noise in non-equilibrium conditions