MEASUREMENT

OVERVIEW

In typical labs, plate readers are very helpful in quantifying green fluorescent protein (GFP) fluorescence and optical density (OD) from a standard 96-well plate; however, laboratory-grade plate readers can cost up to $15,000, which is cost-prohibitive for underfunded labs. To address this issue, we worked with Dr. Bhamla at the Georgia Institute of Technology and his colleague Dr. Chinna Devarapu at Tyndall National Institute to create Plate-Q, a frugal microplate reader. At just under $150, Plate-Q is an inexpensive way to quantify both the fluorescence and the OD of samples. Rather than using the optical sensors found in laboratory-grade plate readers [6], Plate-Q takes advantage of a Raspberry Pi camera to capture images of a well plate and extracts image features using computer vision and machine learning algorithms. A 440 nm excitation light is used to measure GFP fluorescence through a 510 nm emission filter, and a 600 nm light source is used for OD without any emission filter. Plate-Q is completely open-source, and users can customize the design to scan for other fluorescent proteins by replacing the light source and filter with different wavelengths. Users can also retrain the Plate-Q algorithm to take measurements in an affordable and sustainable manner.

FULL PARTS LIST

  • Frosted acrylic plate
  • Raspberry Pi
  • Raspberry Pi HQ Camera with 16 mm lens
  • 440 nm LED strip lights
  • 600 nm LED strip lights
  • 510 nm glass emission filter

DESIGN

Past Design

The design of Plate-Q was revised many times throughout the 2021 season. The initial design contained a single focal light source that did not distribute light evenly across the well plate, and it relied on angle and distance characterization to account for the differences in light intensity caused by the uneven distribution (see Figs. 1-3). After thorough testing and conversations with mentors including Dr. Bhamla, Dr. Devarapu, and Rajas Poorna, we quickly learned that characterizing for angle and distance is impractical and yields inaccurate results.

Figure 1. Testing jig of Plate-Q that revolved light source around a center of rotation to simulate multiple angles of excitation.

Figure 2. A telescopic arm that attaches to the light source to simulate different distances.

Figure 3. Old plate reader design with light sources housed on top, resulting in uneven light distribution on the sample.

Current Design

Plate-Q is broken down into three detachable parts: a top cover (Fig. 4-1), a light-diffusing chamber (Fig. 4-2), and a bottom cap (Fig. 4-3). The Raspberry Pi camera is housed on the top cover and can be moved between four positions. The cover is printed with a matte black filament, which helps prevent the reflection of light into the camera. This part slides onto the light-diffusing chamber (Figures 5-6). The 96-well plate is placed into the slot on top of the light-diffusing chamber, where a frosted acrylic diffuser spreads the light from the LED strips evenly across the well plate (Figures 7-9). The bottom cap, which houses the LED light strips, sits under the diffuser to provide the GFP and OD600 wavelengths needed for quantification.

Camera Movement and Perspective Adjustment:

A common problem with camera-based devices is lens distortion and camera perspective [1]. To counteract lens distortion, we used a 16 mm focal-length lens, which minimizes the fisheye effect compared to other lenses (Figures 10-12). To minimize the camera perspective problem, we moved the camera across multiple positions over the well plate and applied a perspective transformation algorithm (see Perspective Transformation Algorithm below).

Well Plate Cover:

Another common problem with extracting luminescence from a clear well plate image is signal distortion [1]. To solve this problem, we developed a well plate cover that fits most standard 96-well plates (Figure 13). With this cover, the signal from one well will not affect the readings of the surrounding wells.

Figure 13. Image of well plate covers.

Camera Settings

In order to get the most precision from the Raspberry Pi camera in a low-light setting, we adjusted various camera settings, such as ISO, exposure, and shutter speed, to manipulate the output image. The settings used are listed below:

  • Shutter speed: 1000 ms
  • ISO: 800
  • Exposure: Night mode (built into the raspistill tool on the Raspberry Pi)

These camera settings are ideal for photography of low-light scenes, and worked well for capturing well plates for Plate-Q [7].
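
For reference, the same settings can be applied programmatically. Below is a minimal sketch using the picamera Python library (the Python counterpart of raspistill); the output filename and the settling delay are illustrative. Note that picamera caps the shutter speed at the frame interval, so the framerate must be lowered first:

```python
from time import sleep
from picamera import PiCamera

camera = PiCamera()
camera.framerate = 1             # shutter speed is capped by the frame
                                 # interval, so long exposures need a low fps
camera.shutter_speed = 1000000   # 1000 ms, specified in microseconds
camera.iso = 800
camera.exposure_mode = 'night'
sleep(2)                         # let the sensor's gains settle
camera.capture('well_plate.jpg') # output filename is illustrative
```

The equivalent command-line invocation would be raspistill -ss 1000000 -ISO 800 -ex night -o well_plate.jpg.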

SOFTWARE PROCESSING

Overview

Plate-Q relies on a standardized Raspberry Pi HQ camera for the quantification of fluorescence and optical density, which is done through a software pipeline (Figures 14-15) that extracts image features and maps those values to laboratory-grade plate reader values. The pipeline starts by taking pictures in triplicate at four different positions across the well plate. Each image is converted to grayscale and put through a perspective transformation so that perceived brightness can be analyzed. Otsu's thresholding (see Otsu's Method and Well Isolation) is then used to isolate the wells in the image, and a gamma correction is applied to decode the image onto a linear luminance scale. Finally, the brightness values of the triplicates are extracted and averaged to eliminate outliers, and the resulting value is stored in a data frame (CSV file) that is passed through a mathematical regression model to output laboratory-grade plate reader values. A condensed code sketch of these steps is shown after Figure 15.

Figure 14. Visual representation of software pipeline.

Figure 15. Diagram of Plate-Q software pipeline.
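
As a rough end-to-end illustration, the sketch below condenses the pipeline to a single image using OpenCV; the filenames are placeholders, the perspective transformation is omitted (it is covered in the next section), and the gamma convention follows equation (3):

```python
import cv2

def extract_brightness(path, gamma=0.70):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)   # perceived brightness
    # Otsu's thresholding separates the bright wells from the background
    _, mask = cv2.threshold(gray, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # gamma correction decodes pixel values onto a linear luminance scale
    linear = 255.0 * (gray / 255.0) ** (1.0 / gamma)
    return linear[mask > 0].mean()  # mean brightness over well pixels only

# images are captured in triplicate and averaged to suppress outliers
values = [extract_brightness(p) for p in ('cap1.jpg', 'cap2.jpg', 'cap3.jpg')]
print(sum(values) / len(values))
```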

Perspective Transformation Algorithm

To solve the problem of camera perspective, we utilized a perspective transformation algorithm. This type of transformation does not preserve parallelism, lengths, or angles, but it does preserve collinearity and incidence, meaning that straight lines remain straight after the transformation. The perspective transformation can be represented by formula (1) below, where (x, y) are the input points, (x', y') are the transformed points (defined up to a common scale factor), and M is the 3×3 transformation matrix.

\[ \begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix} \sim M \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}, \qquad M = \begin{bmatrix} a_1 & a_2 & b_1 \\ a_3 & a_4 & b_2 \\ c_1 & c_2 & 1 \end{bmatrix} \]   (1)

Matrix M is a combination of an image transformation matrix (a1-a4), a translation vector (b1-b2), and a projection vector (c1-c2). Because M has 8 degrees of freedom, we can select 4 points in the input image and map them to desired locations in the output image. In our case, we identify the corners of interest in the image and compute the perspective-transformed image, minimizing the camera perspective problem.
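
In practice this is a standard OpenCV operation: given the four corners of the well-plate region, cv2.getPerspectiveTransform solves for M and cv2.warpPerspective applies it. A minimal sketch with illustrative coordinates:

```python
import cv2
import numpy as np

img = cv2.imread('plate.jpg')  # filename illustrative

# the four corners of the well-plate region found in the captured image
# (coordinates are illustrative) ...
src = np.float32([[112, 84], [508, 96], [496, 430], [100, 418]])
# ... and where those corners should land in the rectified output image
dst = np.float32([[0, 0], [400, 0], [400, 300], [0, 300]])

M = cv2.getPerspectiveTransform(src, dst)         # the 3x3 matrix M of (1)
warped = cv2.warpPerspective(img, M, (400, 300))  # rectified top-down view
```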

Otsu's Method and Well Isolation

Otsu's Method, a computer vision technique, is used to isolate the wells in an image (Figure 16). Otsu's method [2] is a variance-based technique that finds the threshold at which the variance within the foreground and background pixel classes is minimized. The algorithm iteratively applies different threshold values to find the one that minimizes the within-class variance between the two classes. The within-class variance at any threshold (2) can be represented by the equation below, where t is the threshold value, ωbg and ωfg are the fractions of pixels in the background and foreground classes at threshold t, and σ²bg and σ²fg are the variances of the pixel values within each class [3].

\[ \sigma_w^2(t) = \omega_{bg}(t)\,\sigma_{bg}^2(t) + \omega_{fg}(t)\,\sigma_{fg}^2(t) \]   (2)

For our situation, the image is in grayscale, so pixel values range from 0 to 255. After iterating, the threshold that minimizes the within-class variance is 25: all pixel values greater than or equal to 25 become the foreground of the image, while all other pixels become the background. With this, we are able to isolate the wells from the background, as the well pixels are brighter than the rest of the image.

Figure 16. Example of Otsu’s Method applied to fluorescence data.
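
In OpenCV, this threshold search is available directly. A minimal sketch of how a well mask could be produced (filename illustrative):

```python
import cv2

gray = cv2.imread('plate.jpg', cv2.IMREAD_GRAYSCALE)

# OpenCV sweeps every candidate threshold and returns the one that
# minimizes the within-class variance of equation (2)
t, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
print('Otsu threshold:', t)                     # 25 for the image shown above
wells = cv2.bitwise_and(gray, gray, mask=mask)  # background pixels zeroed out
```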

Gamma Correction

Gamma correction is the process of translating the brightness measured by camera sensors into the brightness perceived by human eyes. It is used to compensate for the non-linear relationship between luminance and the brightness measured by the camera (Figures 17-18). Pictures taken by a camera are gamma-encoded in the sRGB format, which records the red, green, and blue light ranges non-linearly. Gamma correction applies the inverse of this encoding function to linearize the non-linear sRGB values [4].

To apply a gamma correction to an input image, each pixel value is converted from the 0 to 255 scale to a 0 to 1 scale. The following function (3) is then applied to the pixel values of the input image, where I is the input pixel value, G is the gamma value, and O is the output pixel value.

\[ O = I^{1/G} \]   (3)

For optical density, the ideal gamma value is 0.22, whereas for fluorescence imaging, the ideal gamma value is 0.70 (See Regression section for more info).

Figure 17. Image before gamma correction.

Figure 18. Image after gamma correction.
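
A common way to apply equation (3) efficiently is a 256-entry lookup table. The sketch below assumes the O = I^(1/G) convention of (3); the filename is illustrative:

```python
import cv2
import numpy as np

def gamma_correct(image, gamma):
    # lookup table implementing equation (3): normalize each 0-255 value
    # to [0, 1], raise it to 1/gamma, and rescale back to [0, 255]
    table = np.array([255.0 * (i / 255.0) ** (1.0 / gamma)
                      for i in range(256)]).astype('uint8')
    return cv2.LUT(image, table)

gray = cv2.imread('plate.jpg', cv2.IMREAD_GRAYSCALE)
corrected = gamma_correct(gray, 0.70)  # 0.70 for fluorescence, 0.22 for OD
```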

Calculating Fluorescence and Optical Density

In order to calculate fluorescence using brightness values extracted from an image, the following equation is used (4): the gamma-corrected 0 to 255 pixel brightness extracted from each well serves as the raw fluorescence value, which is then mapped to laboratory-grade fluorescence units through the regression model (see Regression).

(4)

In order to calculate the optical density using brightness values extracted from an image, the following equation is used (5), where I0 is the amount of light that initially hits the sample, T is the amount of light that transmits through the sample, and OD is the calculated optical density. I0 is measured by scanning a clear plate or a blank well through Plate-Q and extracting the 0 to 255 pixel brightness value, and T is measured by scanning a filled well through Plate-Q and extracting the brightness value [5].

\[ \mathrm{OD} = -\log_{10}\!\left(\frac{T}{I_0}\right) = \log_{10}\!\left(\frac{I_0}{T}\right) \]   (5)

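A minimal sketch of equation (5) in code, with illustrative example readings:

```python
import numpy as np

def optical_density(T, I0):
    """Equation (5): OD = -log10(T / I0)."""
    return -np.log10(T / I0)

# a filled well reading 120 against a blank-well baseline of 240
# transmits half the light, so OD is about 0.301
print(optical_density(120.0, 240.0))
```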

Regression

To find the optimal gamma correction value and to map Plate-Q values to real plate reader values, we applied regression comparing Plate-Q's calculated values, under a wide range of gamma corrections, against real plate reader values. To find the best gamma correction value, we tried to linearize the data: for each candidate gamma, we applied a linear regression and looked for the gamma whose r² (coefficient of determination) was closest to 1. The resulting linear regression equation was then used to map the values to those of a laboratory-grade plate reader. With this analysis, we determined the optimal gamma correction value for fluorescence to be 0.7, with a regression equation of y = 31.5 + 19.8x, where x is the raw Plate-Q value (Figures 19-20). For optical density, the optimal gamma value was 0.23, with a regression equation of y = 0.0635 + 1.5x. A sketch of this gamma sweep is shown after Figure 20.

Figure 19. Linear regression on raw fluorescence values from Plate-Q.

Figure 20. Linear regression on raw OD values from Plate-Q.
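
A sketch of the gamma sweep described above, assuming the raw Plate-Q brightness values are normalized to [0, 1]; the function name and the search grid are illustrative, and scipy's linregress supplies the fit statistics:

```python
import numpy as np
from scipy import stats

def best_gamma(raw, reference, gammas=np.linspace(0.05, 1.0, 96)):
    """Return the gamma whose corrected values fit the reference most
    linearly, along with the regression line used for the mapping."""
    best = None
    for g in gammas:
        corrected = raw ** (1.0 / g)              # candidate gamma correction
        fit = stats.linregress(corrected, reference)
        r2 = fit.rvalue ** 2                      # r^2 closest to 1 wins
        if best is None or r2 > best[0]:
            best = (r2, g, fit.slope, fit.intercept)
    return best
```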

RESULTS

After obtaining results, we analyzed the fluorescence values and determined that Plate-Q works best at greater fluorescence values, between 400 and 2000, where it achieved a percent error of 4.7% (see Fig. 21). Because Plate-Q overestimates lower fluorescence values (Fig. 21), the camera sensor appears to have trouble quantifying lower brightness values.

Figure 21. Comparison of fluorescence in plate reader and Plate-Q at different concentrations and ranges.

Similar to the fluorescence measurements, Plate-Q's optical density (OD) output is more accurate at higher brightness values than at lower ones. In comparison with a laboratory microplate reader, Plate-Q tends to overestimate OD at values greater than approximately 0.3. This is likely due to the low sensitivity of the Raspberry Pi camera sensor in low-light settings, because higher OD values correspond to less light transmission through a sample, i.e., lower brightness values. As a result, Plate-Q's output is less reliable at higher OD. Overall, Plate-Q measured OD with an average percent error of 18.74%.

Figure 22. Comparison of optical density output between Plate-Q and a laboratory microplate reader.

Overall, Plate-Q produced data with less variability but had trouble with lower brightness values. Moving forward, we plan to utilize either a night-vision camera or the combination of a phone camera and app to better account for Plate-Q's current inability to quantify lower fluorescence values. Additionally, we plan to test different shutter speed settings to manipulate the camera's sensitivity to brightness.

GitHub Judging Release

Plate-Q Judging Release Form

Link: Plate-Q Source Code (github.com)

REFERENCES

[1] Berg, B., Cortazar, B., Tseng, D., Ozkan, H., Feng, S., Wei, Q., ... & Ozcan, A. (2015). Cellphone-based hand-held microplate reader for point-of-care testing of enzyme-linked immunosorbent assays. ACS Nano, 9(8), 7857-7866. Retrieved from https://pubs.acs.org/doi/10.1021/acsnano.5b03203

[2] Otsu, N. (1979). A threshold selection method from gray-level histograms. IEEE Transactions on Systems, Man, and Cybernetics, 9(1), 62-66. Retrieved from https://ieeexplore.ieee.org/document/4310076

[3] Muthukrishnan. (2020). Otsu's method for image thresholding explained and implemented. AI, Computer Vision and Mathematics. Retrieved from https://muthu.co/otsus-method-for-image-thresholding-explained-and-implemented/

[4] McReynolds, T. & Blythe, D. (2005). Gamma correction. Advanced Graphics Programming Using OpenGL. Science Direct. Chapter 3. Retrieved from https://www.sciencedirect.com/topics/computer-science/gamma-correction

[5] Delyfer, M.N. & Delcourt, C. (2014). Optical density. Science Direct. Retrieved from https://www.sciencedirect.com/topics/biochemistry-genetics-and-molecular-biology/optical-density

[6] İlktaç, R. & Henden, E. (2021). Optical sensor. Science Direct. Retrieved from https://www.sciencedirect.com/topics/chemistry/optical-sensor

[7] Williams, B. (2020). The best camera settings for low light photography. Brendan Williams Creative. Retrieved from https://www.bwillcreative.com/the-best-camera-settings-for-low-light-photography/