humy
Jul 29, 2012, 03:26 PM
Suppose we have a single-photon infrared light detector with a flat, one-square-millimetre light-detecting surface whose light-absorbing and detecting layer is, let's say, exactly one nanometre thick (one molecule thick), and which can detect any infrared photon with a wavelength between w1 and w2 that enters this thin layer. Let's suppose it is a 'perfect' detector in the sense that it detects every photon between w1 and w2 that actually ends up in that light-detecting layer, with no false detections, i.e. no signals indicating a photon in that wavelength range that isn't there. Now, at a given temperature T of that layer, in each second there would typically be some infrared photons between w1 and w2 generated within the layer itself due to black-body radiation at T, and all those photons will be detected even though they didn't originate outside the layer.
What I want to know is this: what is the correct mathematical formula describing how many photons per second between w1 and w2 will be detected within that layer that didn't originate outside it, but within the layer itself, as a result of the layer being at temperature T?

I also specifically want to know how many single photons per second with wavelengths between 1000 nm and 1100 nm will be detected by the light-detecting layer if the temperature of that layer is 37 °C and no photons in that wavelength range are shining on it from outside the detector, so that all the detected photons came spontaneously from within the layer itself as a result of its own temperature. How specifically is that worked out?

I have heard of “thermal noise” but don't understand how that might relate to this, if at all.

ebaines
Aug 8, 2012, 06:57 AM
What you are describing is precisely what thermal noise is. Photons that emanate from within the detector can cause "false" positive readings by the detector mechanism. For very sensitive detectors (such as those used in astrophotography) this thermal noise can show up as extraneous spots in the digital picture. Consequently it's common practice to cool the detector to reduce the incidence of this spurious radiation.

The phenomenon of electromagnetic radiation due to an object's temperature is known as "black-body radiation." In practice no object is truly "black," and consequently the intensity of the radiation is somewhat less than for a theoretically perfect black body. That theoretical intensity is given by Planck's Law:

I = \frac {2 h \nu^3}{c^2} \, \frac {1}{e^{h \nu / kT} - 1}

where:

I = the spectral intensity, in units of energy per unit time (power) per unit area per unit frequency, emitted in the normal direction per unit solid angle.
h = Planck's constant
\nu = frequency of light
k = Boltzmann constant
c = speed of light
T = temperature in kelvins.
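To get a feel for the numbers, here is a quick sketch in Python (my own illustration, not part of the formula above) that plugs the middle of the 1000–1100 nm band into Planck's law at 37 °C = 310.15 K; the function name `planck_radiance` is just something I made up for this example:

```python
import math

# CODATA values of the physical constants
h = 6.62607015e-34   # Planck's constant, J*s
c = 2.99792458e8     # speed of light, m/s
k = 1.380649e-23     # Boltzmann constant, J/K

def planck_radiance(nu, T):
    """Planck's law: spectral radiance in W per m^2 per steradian per Hz."""
    return (2.0 * h * nu**3 / c**2) / (math.exp(h * nu / (k * T)) - 1.0)

# Evaluate at the middle of the 1000-1100 nm band, at body temperature
nu_mid = c / 1050e-9          # frequency corresponding to lambda = 1050 nm
B = planck_radiance(nu_mid, 310.15)
print(B)                      # on the order of 1e-26 W/(m^2 sr Hz)
```

The exponent h\nu/kT is about 44 here, so the radiance is utterly dominated by the e^{-h\nu/kT} factor; this is why cooling the detector even modestly suppresses the noise so effectively.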

I suppose you could approximate the number of photons per unit time per unit solid angle by dividing the intensity by the energy of a single photon:

E = h \nu = \frac {h c}{\lambda}
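Putting the two formulas together: divide the Planck radiance by the photon energy h\nu to count photons rather than watts, integrate over the frequency band, and multiply by the detector area and a factor of \pi to cover emission into the full hemisphere (for a Lambertian black-body surface). A rough numerical sketch in Python, under my assumption of a perfect 1 mm² black-body emitter at 310.15 K (this is an upper bound; a 1 nm-thick layer would be optically thin, with emissivity far below 1):

```python
import math

h = 6.62607015e-34   # Planck's constant, J*s
c = 2.99792458e8     # speed of light, m/s
k = 1.380649e-23     # Boltzmann constant, J/K

def photon_rate(lam1, lam2, T, area, n_steps=10000):
    """Photons per second emitted from one face of a black-body surface of
    the given area (m^2) between wavelengths lam1 and lam2 (m), found by
    numerically integrating Planck's law divided by the photon energy h*nu."""
    nu_lo = c / lam2                 # longer wavelength -> lower frequency
    nu_hi = c / lam1
    dnu = (nu_hi - nu_lo) / n_steps
    total = 0.0
    for i in range(n_steps):
        nu = nu_lo + (i + 0.5) * dnu                 # midpoint rule
        radiance = (2.0 * h * nu**3 / c**2) / (math.exp(h * nu / (k * T)) - 1.0)
        total += radiance / (h * nu) * dnu           # photons/(s m^2 sr)
    return total * math.pi * area                    # pi: hemispherical emission

# 1000-1100 nm band, 37 C = 310.15 K, 1 mm^2 = 1e-6 m^2
rate = photon_rate(1000e-9, 1100e-9, 310.15, 1e-6)
print(rate)   # on the order of tens of photons per second
```

So even this upper bound works out to only tens of photons per second, which is why the effect only matters for very sensitive (e.g. single-photon) detectors, and why cooling them is so effective.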