Team:Lambert GA/Contribution

CONTRIBUTION

OVERVIEW

For our 2020 iGEM project, we developed FLUORO-Q, a mobile phone-based measurement device that uses a phone camera to quantify the fluorescence of samples in cuvettes. This device had limited accuracy in its brightness readings and could only quantify one sample at a time. To improve on this, we developed Plate-Q: a frugal microplate reader, costing just under $150, that provides an inexpensive way to quantify both the fluorescence and optical density (OD) of samples. Rather than using the optical sensors found in laboratory-grade plate readers [6], Plate-Q uses a Raspberry Pi camera to capture images of a well plate and extracts image features using computer vision and mathematical regression algorithms. A 440 nm excitation light paired with a 510 nm emission filter is used to measure GFP fluorescence, and a 600 nm light source without an emission filter is used for OD. Plate-Q is completely open-source, and users can customize the design to scan for other fluorescent proteins by swapping the light source and filter for different wavelengths. Users can also retrain the Plate-Q algorithm, allowing measurements to be taken in an affordable and sustainable manner.
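At its core, converting camera pixel brightness into plate-reader units is a fitted regression, and retraining amounts to refitting that regression against standards measured on a reference instrument. Below is a minimal Python sketch with hypothetical calibration points; the actual model and data in the Plate-Q code may differ.

# Fit a linear calibration from camera well brightness to reference plate-reader
# fluorescence. The calibration points below are hypothetical; retraining Plate-Q
# amounts to refitting against standards measured on a reference instrument.
import numpy as np

brightness = np.array([120.0, 310.0, 640.0, 980.0, 1450.0])    # camera well brightness
reference = np.array([150.0, 400.0, 820.0, 1250.0, 1900.0])    # plate-reader fluorescence

slope, intercept = np.polyfit(brightness, reference, 1)

def to_fluorescence(b):
    """Map a camera brightness value to calibrated fluorescence units."""
    return slope * b + intercept

print(round(to_fluorescence(700.0), 1))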

FULL PARTS LIST

  • Frosted acrylic plate
  • Raspberry Pi
  • Raspberry Pi HQ Camera with 16 mm lens
  • 440 nm LED strip lights
  • 600 nm LED strip lights
  • 510 nm glass emission filter

DESIGN

Current Design

Plate-Q is broken down into three detachable parts: a top cover (Fig. 1-1), a light-diffusing chamber (Fig. 1-2), and a bottom cap (Fig. 1-3). The Raspberry Pi camera is housed on the top cover and can be moved between four positions (Fig. 2). The cover is printed with a matte black filament, which helps prevent light from reflecting into the camera. This part then slides onto the light-diffusing chamber (Figures 4 & 5). The 96-well plate is placed into the slot on top of the light-diffusing chamber. A frosted acrylic diffuser spreads the light from the LED strips evenly across the well plate (Figure 3). The bottom cap, which houses the LED light strips, sits under the diffuser to provide the GFP excitation and OD600 wavelengths needed for quantification.

Well Plate Cover

Another common problem when extracting fluorescence from an image of a clear well plate is signal distortion, where light from one well bleeds into neighboring wells [1]. To solve this problem, we developed a well plate cover that fits most 96-well plates (Figure 7). With the cover in place, the signal from one well does not affect the readings of the surrounding wells.

Figure 7. Well plate cover used to minimize signal interference.

Camera Case

To hold the camera in place over the camera holes, we developed a 3D-printed camera case that secures the camera in its intended position. Four magnet holes at the corners of the case snap onto the corresponding magnets on the top cover of Plate-Q, which also makes it easy to move the camera between the positions over the well plate.

Figure 8. Image of camera with emission filter installed on the camera case.

ASSEMBLY

Users can assemble Plate-Q on their own by collecting the parts shown in the Parts List in Design and downloading the .stl files from this GitHub repository: Github Link. These .stl files can be sliced in a slicing software such as Ultimaker Cura and converted into .gcode for 3D printing. The printed parts are: a top cover, a light-diffusing chamber, the bottom cap (two copies, one per light source; see below), a camera holder, and a well plate cover (see Design).

For the light source, print two bottom caps: one for the 600 nm light source and one for the 440 nm light source. Place LED light strips along the length of each bottom cap (Figure 7). Wire the 440 nm LED strip to a 12 V power supply and the 600 nm LED strip to a 9 V power supply.

To assemble the Plate-Q scanning components, place the light-diffusing chamber on the bottom cap. Insert the well plate into the slot on top of the light-diffusing chamber. Then slide the top cover into the slots on the light-diffusing chamber.

To attach the camera to Plate-Q, attach the camera sensor to the camera holder and screw in the lens to lock the camera into place. Then, glue magnets into the holes in both the camera holder and the top cover of Plate-Q; these allow the camera to be placed precisely over different areas of the well plate. Next, connect the camera to a Raspberry Pi using a ribbon cable.
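As a minimal sketch of the capture step, assuming the standard picamera library on the Raspberry Pi, the snippet below takes a single still of the well plate with exposure and white balance locked so brightness values stay comparable between captures (the exact settings used by Plate-Q may differ).

# Capture one still image of the well plate with the Raspberry Pi camera.
# Locking exposure and white balance keeps pixel brightness comparable between
# captures (hypothetical settings; adjust for your own lighting and lens).
from time import sleep
from picamera import PiCamera

camera = PiCamera(resolution=(2028, 1520))
camera.iso = 100
sleep(2)                                    # let gain and exposure settle
camera.shutter_speed = camera.exposure_speed
camera.exposure_mode = 'off'                # lock exposure
gains = camera.awb_gains
camera.awb_mode = 'off'                     # lock white balance
camera.awb_gains = gains
camera.capture('well_plate.jpg')
camera.close()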

Run the main.py file, found in the Github repository, on the Raspberry Pi and supply the images captured by the Raspberry Pi camera as input. Each well in the image is located using Otsu's method [3], and the output reports the optical density and fluorescence values calculated by the Plate-Q algorithm.
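As an illustration of what this analysis involves, the sketch below locates bright well regions with Otsu's thresholding (via OpenCV) and reports the mean brightness of each region; the segmentation and calibration steps in the actual main.py may differ.

# Locate wells with Otsu's threshold and report the mean brightness of each well
# region (simplified sketch; Plate-Q's main.py may segment and calibrate differently).
import cv2

image = cv2.imread('well_plate.jpg')
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
blurred = cv2.GaussianBlur(gray, (5, 5), 0)

# Otsu's method automatically picks the threshold separating wells from background.
_, mask = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
for c in contours:
    if cv2.contourArea(c) < 200:            # ignore specks of noise
        continue
    x, y, w, h = cv2.boundingRect(c)
    well = gray[y:y + h, x:x + w]
    print(f'well at ({x}, {y}): mean brightness {well.mean():.1f}')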

RESULTS

After obtaining results, we analyzed the fluorescence values and determined that Plate-Q works best at higher brightness values, between 400 and 2000, where it has a percent error of 4.7% (see Fig. 10). Plate-Q overestimates lower fluorescence values, which indicates that the camera sensor has trouble quantifying lower brightness values (see Fig. 8).

Figure 16. Comparison of fluorescence in plate reader and Plate-Q at different concentrations and ranges.

As with fluorescence, Plate-Q measures optical density (OD) more accurately at higher brightness values than at lower ones. Compared with a laboratory microplate reader, Plate-Q tends to overestimate OD at values greater than approximately 0.3. This is likely due to the low sensitivity of the Raspberry Pi camera sensor in low light: higher OD values correspond to less light transmitted through a sample, and therefore to lower brightness values. As a result, Plate-Q's output is less reliable at higher OD. Overall, Plate-Q measured OD with an average percent error of 18.74%.

Figure 17. Comparison of optical density output between Plate-Q and a laboratory microplate reader.
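As a rough sketch of the relationship behind the behavior described above (not necessarily the exact formula used in main.py), OD can be estimated from how much of the 600 nm light a sample transmits relative to a blank, media-only well, so darker wells correspond to higher OD values.

# Estimate OD600 from transmitted brightness relative to a blank (media-only) well.
# Higher OD means less transmitted light, so darker wells give larger OD values
# (illustrative only; Plate-Q's actual calibration may differ).
import math

def estimate_od(sample_brightness, blank_brightness):
    """OD = -log10(transmitted / incident), using the blank well as the reference."""
    return -math.log10(sample_brightness / blank_brightness)

print(estimate_od(sample_brightness=80.0, blank_brightness=200.0))   # ~0.40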

Overall, Plate-Q produced data with relatively low variability but struggled with lower brightness values. Moving forward, we plan to use either a night-vision camera or a phone camera paired with an app to address Plate-Q's current inability to quantify lower fluorescence values.

CONTRIBUTION TO FUTURE iGEM TEAMS

With Plate-Q costing under $150, future iGEM teams without access to a laboratory-grade plate reader will be able to quantify fluorescence and optical density at comparable accuracy. With easy-to-follow assembly steps and open-source CAD files, electronics, and camera setup, iGEM teams around the world can come together and contribute to improving the design. In addition, the easy-to-adjust open-source code will allow teams to provide continued support and improve the accuracy of quantification for years to come.

REFERENCES

[1] Ashour, M., et al. (1987). Use of a 96-well microplate reader for measuring routine enzyme activities. Analytical Biochemistry, 166(2), 353-360. https://doi.org/10.1016/0003-2697(87)90585-9

[2] Berg, B., Cortazar, B., Tseng, D., Ozkan, H., Feng, S., Wei, Q., ... & Ozcan, A. (2015). Cellphone-based hand-held microplate reader for point-of-care testing of enzyme-linked immunosorbent assays. ACS Nano, 9(8), 7857-7866. Retrieved from https://pubs.acs.org/doi/10.1021/acsnano.5b03203

[3] Otsu, N. (1979). A threshold selection method from gray-level histograms. IEEE Transactions on Systems, Man, and Cybernetics, 9(1), 62-66. Retrieved from https://ieeexplore.ieee.org/document/4310076

[4] Muthukrishnan. (2020). Otsu’s method for image thresholding explained and implemented. AI, Computer Vision and Mathematics. Retrieved from https://muthu.co/otsus-method-for-image-thresholding-explained-and-implemented/

[5] McReynolds, T. & Blythe, D. (2005). Gamma correction. Advanced Graphics Programming Using OpenGL. Science Direct. Chapter 3. Retrieved from https://www.sciencedirect.com/topics/computer-science/gamma-correction

[6] Delyfer, M.N. & Delcourt, C. (2014). Optical density. Science Direct. Retrieved from https://www.sciencedirect.com/topics/biochemistry-genetics-and-molecular-biology/optical-density

[7] İlktaç, R. & Henden, E. (2021). Optical sensor. Science Direct. Retrieved from https://www.sciencedirect.com/topics/chemistry/optical-sensor