humy
Jul 29, 2012, 03:26 PM
Suppose we have a single-photon infrared light detector with a flat one-square-millimetre light-detecting surface whose light-absorbing and detecting layer is, let's say, exactly one nanometre thick (one molecule thick) and can detect any infrared photon with a wavelength between w1 and w2 that enters this thin layer. Let's suppose it is a 'perfect' detector in the sense that it detects every photon between w1 and w2 that actually ends up in that light-detecting layer, with no false detections, i.e. no signals indicating a photon in that wavelength range when there isn't one. Now, at a given temperature T of that layer, in each second there will typically be some infrared photons between w1 and w2 generated within the layer itself due to black-body radiation at T, and all of those photons will be detected even though they didn't originate outside the layer.
What I want to know is this: what is the correct mathematical formula describing how many photons between w1 and w2 per second will be detected within that layer that didn't originate outside the layer but within the layer itself, as a result of the layer being at temperature T?
I also want to know specifically how many single photons per second between 1000 nm and 1100 nm wavelength will be detected by the light-detecting layer if the temperature of that layer is 37 °C and no photons in that wavelength range were shining on it from outside the detector, so that all the detected photons came spontaneously from within the layer itself as a result of its own temperature. How, specifically, is that worked out?
I have heard of "thermal noise" but don't understand how that might relate to this, if at all.
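In case it helps show the kind of answer I'm after, here is my very rough guess at how the calculation might go, which may well be wrong: I assume that because the layer is a perfect absorber between w1 and w2, Kirchhoff's law means it also emits like a black body in that band, so I numerically integrate the Planck black-body photon flux over the 1000 nm to 1100 nm band and the 1 mm² area at 310.15 K. The function names and the crude trapezoidal integration below are just my own sketch in Python, not something taken from a textbook, so please correct me if this is the wrong approach.

```python
# Rough guess: black-body photon emission rate of the layer in the band,
# assuming emissivity 1 between w1 and w2 (Kirchhoff's law). This may not be
# exactly the right quantity for internally generated dark counts.

import math

h = 6.62607e-34   # Planck constant, J s
c = 2.99792e8     # speed of light, m/s
k = 1.38065e-23   # Boltzmann constant, J/K

def photon_flux_per_wavelength(lam, T):
    """Black-body photon emission rate per unit area per unit wavelength
    (photons s^-1 m^-2 m^-1), emitted into a hemisphere."""
    return (2.0 * math.pi * c / lam**4) / (math.exp(h * c / (lam * k * T)) - 1.0)

def photons_per_second(lam1, lam2, T, area, steps=1000):
    """Numerically integrate the photon flux from lam1 to lam2 (metres)
    over the detector area (m^2) using the trapezoidal rule."""
    dlam = (lam2 - lam1) / steps
    total = 0.0
    for i in range(steps + 1):
        lam = lam1 + i * dlam
        weight = 0.5 if i in (0, steps) else 1.0
        total += weight * photon_flux_per_wavelength(lam, T)
    return total * dlam * area

# 1000-1100 nm band, 37 C = 310.15 K, 1 mm^2 = 1e-6 m^2
print(photons_per_second(1000e-9, 1100e-9, 310.15, 1e-6))
```

Is that anywhere near the right way to work it out, or is a different formula needed for photons generated and detected inside the layer itself?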