CHAPTER 11 Digital Microscope Cameras
Fluorescence and Brightfield on the Same Microscope
So what if you have both brightfield and fluorescence applications, but only one
microscope? Should you buy a color camera for the brightfield application and
also use it for the fluorescence imaging? Yes, you can do that, but keep the loss
of sensitivity in mind. The best advice: if your microscope is equipped with two
camera ports, install two cameras.
Sensor Size, Pixel Size and Differences in Resolution
Remember the Abbe resolution mentioned in Chapter 2? The resolution of your
complete microscope system depends on the numerical aperture of the objective
you use and on the wavelength of the light. Following this formula, shorter
wavelengths and higher numerical apertures give you better resolution.
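As a quick illustration, the Abbe limit can be worked out directly. A minimal sketch; the wavelength and numerical aperture below are example figures, not values from the text:

```python
def abbe_resolution_nm(wavelength_nm, numerical_aperture):
    """Abbe diffraction limit: d = wavelength / (2 * NA)."""
    return wavelength_nm / (2 * numerical_aperture)

# Example: green light (500 nm) with a high-NA oil-immersion objective (NA 1.4)
print(abbe_resolution_nm(500, 1.4))  # ~178.6 nm
```

Note how the limit shrinks (improves) as the wavelength drops or the numerical aperture rises, just as the formula states.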
As the sensor has to convert a full image, it measures the intensity distribution
of the microscopic image over the whole image area. The pixels are therefore arranged
as a matrix covering that area. This gives you two new quantities: pixel count and
sensor area. And if you divide the sensor area by the pixel count, you get the area
of a single pixel; its square root is the pixel size.
Now it becomes clear: the smaller the pixels, the finer the image content that can
be resolved. The pixel size can therefore be chosen so that the camera picks up the
resolved microscopic image without losing any image content.
The scientists who defined this condition were Harry Nyquist in 1928 and later
Claude Shannon. They found that a given dimension can only be measured correctly
if the measurement tool resolves at least half the distance it needs to measure.
In other words, your pixels should – at most – be half the size of the minimum
resolvable distance of your microscope setup (as we said before, the smaller the
pixel, the better your camera resolution). To be on the safe side, scientists
work with a factor of 2.3 by which the camera pixels should be smaller than the
minimum resolved distance of the microscope.
Camera resolution in sample area = (sensor pixel size × 2.3) / (objective magnification
× camera adapter magnification)
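The formula above can be sketched in code; the pixel size, objective, and adapter magnifications below are example values for illustration:

```python
def camera_resolution_in_sample_um(pixel_size_um, objective_mag,
                                   adapter_mag=1.0, nyquist_factor=2.3):
    """Smallest sample-plane distance the camera samples adequately,
    using the 2.3x Nyquist safety factor from the text."""
    return (pixel_size_um * nyquist_factor) / (objective_mag * adapter_mag)

# Example: 6.45 um pixels behind a 40x objective with a 1x camera adapter
print(camera_resolution_in_sample_um(6.45, 40))  # ~0.371 um, i.e. ~371 nm
```

If this camera-side resolution is coarser than the Abbe limit of the optics, the camera becomes the bottleneck; a higher-magnification adapter or smaller pixels would restore the match.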
But now we have a conflict: this works against the sensitivity requirement. Large
pixels are more sensitive as they collect more photons, are less noisy and ultimately