Nuclear Technology / Volume 6 / Number 1 / January 1969 / Pages 73-80
Technical Papers and Notes / dx.doi.org/10.13182/NT69-A28270
Techniques and instrumentation at microwave frequencies show promise for measuring both temperature and gas coolant impurities within high-temperature nuclear reactors. Temperature is measured via the thermal expansion of a metallic sensor, which shifts the resonant frequency of a microwave cavity, while impurities are detected through their effect on the coolant's dielectric constant. An experimental Ni-Cr steel microwave cavity, resonant at 15 GHz, yielded a linear output signal for temperature variations up to 1250°C with a sensitivity of 330 kHz/°C. For gas coolant impurity measurements, both a microwave cavity method and a phase-shift method provided the desired speed of response and sensitivity. Tests with the interferometer-type impurity-measuring instrument indicate a sensitivity of ∼4 × 10⁻⁴ degrees of phase shift per (ppm·m) for water vapor in helium gas and a time constant of 1 sec for step changes in impurity content.
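The two quoted sensitivities imply simple linear conversions from the measured quantities to temperature and impurity concentration. A minimal sketch of those conversions, using only the figures stated in the abstract (the function names, and the assumption of strict linearity over the full range, are illustrative and not from the paper):

```python
# Back-of-envelope conversions based solely on the sensitivities quoted in
# the abstract; linearity over the full range is assumed for illustration.

TEMP_SENSITIVITY_HZ_PER_C = 330e3   # cavity sensitivity: 330 kHz/°C (quoted)
PHASE_SENSITIVITY_DEG = 4e-4        # ~4e-4 deg phase shift per (ppm·m) (quoted)

def temperature_change_from_shift(freq_shift_hz: float) -> float:
    """Temperature change (°C) inferred from a cavity resonance shift (Hz)."""
    return freq_shift_hz / TEMP_SENSITIVITY_HZ_PER_C

def impurity_ppm_from_phase(phase_shift_deg: float, path_length_m: float) -> float:
    """Water-vapor concentration (ppm) in helium inferred from an
    interferometer phase shift (degrees) over a given path length (m)."""
    return phase_shift_deg / (PHASE_SENSITIVITY_DEG * path_length_m)

# A 33 MHz resonance shift corresponds to a 100 °C temperature change:
# temperature_change_from_shift(33e6) -> 100.0
# A 0.4° phase shift over a 1 m path corresponds to 1000 ppm water vapor:
# impurity_ppm_from_phase(0.4, 1.0) -> 1000.0
```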