Time Frame

Definition: Time frame typically refers to a discrete time interval (usually in seconds or milliseconds) over which a set of optical measurements is acquired. Each frame is divided into a series of smaller time intervals called "time bins" or "data points", and the optical signal is sampled at a fixed sampling rate during each time bin.
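For illustration, a minimal sketch in Python (with hypothetical numbers, not taken from the reference below) of how the number of time bins per frame follows from the frame duration and the sampling rate:

```python
# Hypothetical values for illustration only
frame_duration_s = 0.1      # one time frame lasting 100 ms
sampling_rate_hz = 1000.0   # fixed sampling rate within the frame

# Number of time bins (data points) acquired during one frame
n_time_bins = int(frame_duration_s * sampling_rate_hz)

# Start time of each bin, relative to the frame onset
bin_start_times_s = [i / sampling_rate_hz for i in range(n_time_bins)]

print(f"{n_time_bins} time bins per frame")  # -> 100 time bins per frame
```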

Alternative definition:

Synonym:

References:

https://doi.org/10.1117/1.nph.8.1.012101

Related terms:  
