Detector Sensitivity

Definition: Detector sensitivity is the ability of a detector to convert incoming light (photons) into an electrical signal. It is defined as the ratio of the change in the detector's output quantity to the change in its input quantity, in which form it is also known as responsivity. In fNIRS devices, it can be expressed as the observed change in the detector's output current per unit change in the power of the incident light (units: ampere/watt for radiometric measurements, or ampere/lumen for photometric ones). Detector sensitivity is affected by several factors, including the detector material, the size and shape of the detector, the quality of the optics, and the electronics used to amplify the detector signal.
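The ratio described above can be sketched in a few lines of code. This is an illustrative example with hypothetical measurement values, not data from any real fNIRS device:

```python
def responsivity(delta_current_a: float, delta_power_w: float) -> float:
    """Responsivity R = change in output current / change in incident
    optical power, in ampere/watt (A/W)."""
    return delta_current_a / delta_power_w

# Hypothetical measurement: incident optical power increases by 2 microwatts
# and the observed photocurrent increases by 1 microampere.
r = responsivity(delta_current_a=1e-6, delta_power_w=2e-6)
print(f"Responsivity: {r} A/W")  # -> Responsivity: 0.5 A/W
```

A higher responsivity means the detector produces a larger current change for the same change in incident light power, i.e. it is more sensitive.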

Alternative definition:



Related terms: Detector, Detectivity, Responsivity, Noise equivalent power
