Development || Progress Updates

Week 1 10-16 May

We kicked off with an initial consultation with Tony in the Making and Tinkering lab. After pitching our idea of a 3D-printed microscope with automated cell counting, we received feedback on the feasibility of the project and guidance on implementation. We will need to discuss the specific requirements and details in order to tackle the hardware component of our project.

 

Initially, we were looking to 3D print a microscope, as open-source laboratory projects are quite interesting and hold potential [1]. During the consultation, we were also introduced to multiple microscope models to get a sense of the direction we would like to take. Amongst these were an AmScope model linked to the computer through an application, a motor-controlled model built by previous students, a hand-held microscope, and another built partially from 3D-printed components.

AmScope Microscope

 

Motor-controlled Microscope

 

The hand-held microscope looked flimsy and would likely have to be mounted, while the AmScope model likely uses proprietary software, which would make automating the extraction of images a challenge. On the other hand, the motor-controlled model held promise: since it was controlled via an Arduino, we could in principle drive it with serial commands. However, it had no camera module, so we would have to figure that out ourselves.

 

Motor-controlled Microscope with Arduino

According to the feedback, 3D printing a microscope would be very challenging, as it would not be as stable as store-bought microscopes. This would affect the focusing of the microscope onto samples, along with the stability of any sample slides we want to image.

 

Other parameters of the microscope, such as image input size and magnification, were also highlighted as important considerations. From our team's past experience, imaging single cells could require a magnification of around 400X, which is typical for a biological microscope. Thus, we will have to decide on what cells we want to image, and it is possible that we will have to culture our own cells.

 

We decided not to build the microscope from 3D-printed components unless necessary. Given a budget of $2000, it is not difficult to purchase biological microscopes. Furthermore, this would sidestep any issues of structural instability: 3D-printed microscopes have to be assembled, and the print quality may not always be comparable to that of store-bought equipment.

 

Moreover, training our machine learning model would require a large quantity of high-quality training data, which a 3D-printed microscope would be unlikely to produce.

 

Subsequently, we prepared a rough initial sketch after the meeting.

Rough Sketch of Microscope Design

Week 2 17-23 May & Week 3 24-30 May

After attending lessons on 3D printing and block diagrams, we set up another project consultation meeting with Dr Ho and Tony, bringing our block diagram and a more detailed plan for our project.

V1 Block Diagram Draft Presented

 

V1 Microscope Draft Diagram (Labelled)

 

Logic Diagram for Microscope Motor Control

We also took the opportunity to clarify funding-related queries. 

 

Possible complications brought up by Dr Ho were overlapping cells and shaky images caused by the movement of cells if they are still alive. For regular cell counting, e.g. of red blood cells, a small, known volume of well-mixed cell suspension is inserted into a hemocytometer, which has a printed grid that allows the cell density of the solution to be estimated.

 

With a sufficiently dilute and well-mixed solution, cells are unlikely to clump. Hemocytometers are usually designed with an interface that keeps the cells stationary, but we can quantify the extent of cell motion in the future. 
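For reference, this is a sketch of the standard hemocytometer calculation. The 0.1 µL figure comes from the usual chamber geometry (a 1 mm x 1 mm large square at 0.1 mm depth); the function name and defaults are illustrative, not code from our project.

```python
def cells_per_ml(total_count, squares_counted, dilution_factor=1,
                 square_volume_ul=0.1):
    """Estimate cell concentration from a hemocytometer count.
    0.1 uL is the volume above one large 1 mm x 1 mm grid square
    at the standard 0.1 mm chamber depth."""
    mean_per_square = total_count / squares_counted
    cells_per_ul = mean_per_square / square_volume_ul * dilution_factor
    return cells_per_ul * 1000  # 1 mL = 1000 uL

# 200 cells over 4 large squares at 2x dilution: about 1,000,000 cells/mL
print(cells_per_ml(200, 4, dilution_factor=2))
```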

 

While we were inspired by a previous project to use gear motors to control the x- and y-axis knobs on the microscope stage, we received feedback that such a mechanism was inadequate for the precision required by cells of our target size. Instead, a belt-driven system controlled by an Arduino with a RAMPS board was recommended.

 

In preparation for the presentation, we worked on an improved version for our block diagram and our labelled diagram of the set-up after receiving feedback from the consultation meeting.

 

V2 Block Diagram Part 1
V2 Block Diagram Part 2

 

V2 Labelled Sketch of Set-up

 


Week 4 31 May - 6 June

After the lab equipment ordered online arrived, we booked a slot to visit the MnT lab to try out the microscopes in person with the yeast solution to determine the required magnification for the lens. 

 

In the process of using the different microscopes, we were able to better understand the pros and cons of each. For instance, the most costly microscope was a work in progress and lacked a controller. It also had an auto-brightness setting, which affected the viewing of samples.

Trying out AmScope Microscope with Prepared Yeast Sample Slides
Trying out Eakins Microscope with Prepared Yeast Sample Slides — A lot of effort was needed to adjust the focus of the microscope. There was also no working controller available at the time for per-axis movement control.
Trying out 3D Printer Frame Microscope with Prepared Yeast Sample Slides — Microscope had lighting from the top of the slides instead of the back, which affected the appearance of the cells in the webcam image.

 

Delivery time was a concern, as waiting for shipments to arrive would impede our progress. Hence, we opted to borrow equipment to bypass this issue; in this case, we were allowed to borrow a microscope used in a previous MnT project. While we initially intended to work with a handheld microscope, in consideration of image quality and magnification options, we pivoted towards a conventional biological microscope.

Trying out Borrowed Microscope with Prepared Yeast Sample Slides
Yeast Sample Image photographed from Eyepiece Lens

While we managed to borrow the biological microscope, the 40x objective lens we needed was faulty and no replacements were available. To circumvent this, we stacked two eyepiece lenses of 30x and 2x over a 10x objective lens to achieve a magnification of 600x. While this is higher than 400x, it is still within the acceptable range. Stacking eyepiece lenses is acceptable in this case as the eyepieces are wide-angle and do not reduce the field of vision to an adverse extent.
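As a quick sanity check, total magnification is simply the product of the objective and eyepiece magnifications (the helper below is illustrative):

```python
def total_magnification(objective, *eyepieces):
    """Total magnification = objective x each eyepiece in the stack."""
    mag = objective
    for m in eyepieces:
        mag *= m
    return mag

# Intended set-up: 40x objective with a standard 10x eyepiece
print(total_magnification(40, 10))     # 400
# Workaround: 10x objective with stacked 30x and 2x eyepieces
print(total_magnification(10, 30, 2))  # 600
```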

Subsequently, we proceeded to photograph the yeast sample slides in preparation for the training dataset for the cell counting model, obtaining a total of 173 images.


Week 5 7-13 June

We first marked out the milestones to achieve in future weeks to establish a clear direction for our project. While we had obtained our microscope, we were still in the midst of sourcing hardware, and delivery time became an impediment to our progress. We also needed to customise our set-up to the existing biological microscope's specifications, and we currently lacked specific dimensions and details for what we intended to build.

Updated Gantt Chart for Goal Setting

We finished ordering biological equipment. While waiting for shipments and finalising hardware decisions, we worked on software, specifically the cell counting model. A toy model for prediction was drafted and developed. However, we noted that data augmentation via a data generator would help improve model performance, as manually labelling the training dataset of hundreds of images (each split into 900 small 50 x 50 pieces) was very time-consuming. In addition, a preprocessing pipeline for the images was drafted.
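The tiling of each image into 50 x 50 pieces can be sketched with NumPy as follows. The function name is illustrative, and we assume for the example a 1500 x 1500 input, which is what would yield the 900 tiles per image mentioned above:

```python
import numpy as np

def split_into_tiles(image, tile=50):
    """Split a grayscale image into non-overlapping tile x tile pieces,
    cropping any remainder at the right/bottom edges."""
    h, w = image.shape[:2]
    h, w = h - h % tile, w - w % tile
    cropped = image[:h, :w]
    # reshape to (tile_rows, tile, tile_cols, tile), then reorder so each
    # 50 x 50 piece is contiguous, and flatten to a list of tiles
    tiles = cropped.reshape(h // tile, tile, w // tile, tile).swapaxes(1, 2)
    return tiles.reshape(-1, tile, tile)

# A 1500 x 1500 image yields 30 * 30 = 900 tiles of 50 x 50
print(split_into_tiles(np.zeros((1500, 1500))).shape)  # (900, 50, 50)
```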

As there were existing cell-counting deep learning models, we started writing code to do transfer learning on the VGG and U-Net models. We also attempted to minimise the effort required for manual labelling by designing a keystroke logger application.

We also flashed Marlin onto a RAMPS board and demonstrated motion control. Next, we needed to design an interface for non-technical users.
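Marlin accepts standard G-code over the serial port, which is what makes driving the stage from a script straightforward. A minimal sketch is below; the helper names, port name, and feed rate are illustrative, not our exact settings:

```python
def format_move(x, y, feed=300):
    """Build a Marlin G1 linear-move command; coordinates in mm.
    The feed rate default here is an example value."""
    return f"G1 X{x:.2f} Y{y:.2f} F{feed}"

def send_gcode(ser, line):
    """Send one G-code line and block until Marlin replies 'ok'."""
    ser.write((line.strip() + "\n").encode("ascii"))
    while not ser.readline().decode("ascii", errors="ignore").strip().startswith("ok"):
        pass

# Usage over a serial port (port name is an example):
#   import serial, time                      # pyserial
#   with serial.Serial("/dev/ttyUSB0", 115200, timeout=2) as ser:
#       time.sleep(2)                        # Marlin resets when the port opens
#       send_gcode(ser, "G28 X Y")           # home the X and Y axes
#       send_gcode(ser, format_move(10, 5))  # move stage to (10 mm, 5 mm)
```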

In addition, we read up on the hardware specifications. In particular, stepper motors have multiple coils that are energised at specific times to rotate the motor. The motors available in the lab are bipolar (two coils) and require a 12 V power supply. We also learnt that we should disconnect the Arduino-PC connection when powering on the motors, to guard against power spikes.

After obtaining the biological microscope, we dismantled it, intending to mount it on the 3D printer frame. We also worked on the 3D-printed designs for the webcam adapter and microscope tube mount after attending the 3D Printing workshop and the Soldering and Circuits workshop.


Week 6 14-20 June

We started 3D printing these parts:

We had to reprint the lens adaptor and the camera holder as some parts did not fit as expected. Instead of reprinting whole components and waiting longer, we isolated the sections of the design that did not fit the microscope or webcam specifications and reprinted only those, for a shorter waiting time.

Isolating Parts of the 3D Printed Camera Holder Prototype for Quicker Reprinting Time and Checking of Fit

 

Rail Acceptor Component (Part Marked in Pen with a Cross) of 3D Printed Microscope Mount Prototype did not Fit Microscope

 

Isolated Reprinted Rail Acceptor Section of 3D Printed Microscope Mount for Test Fit

 

Reprinted Rail Acceptor Component Fits Microscope Specification

 

We also printed the bottom part of the base (made of two parts).

Bottom Part of 3D Printed Microscope Stage Base Rendered

 

In addition, we assembled the Creality Ender-3 3D printing frame and set up the Raspberry Pi with the Raspberry Pi touchscreen. We realised that the touchscreen required a support structure to prop it up and attach it to the 3D printer frame, so we started designing a touchscreen holder to be 3D printed as well. We mounted the RAMPS board onto the aluminium profile frame using an M5 x 8 mm screw and a rotatable nut, and plugged the end stop cables into X_MIN, Y_MIN and Z_MIN.

After obtaining a 3D-printed microscope mount that fit the specifications, we drilled a hole and secured the mount with an M5 screw. It appeared secure, and other M5 capped screws helped secure it further. We also managed to calibrate the webcam.

For software, we manually labelled more data for the training dataset. Model training accuracy stagnated at roughly 0.92 despite the additional training data, making data augmentation worth looking into in subsequent weeks.


Week 7 21-27 June

We mounted the webcam onto the stage successfully and mounted the microscope tube onto the 3D Printer Frame after reprinting.

Mounted Webcam onto Stage and Mounted Microscope Tube 

 

Initially, the microscope tube component of the 3D-printed webcam holder had a smaller diameter (28.3 mm) than the lens and had to be reprinted with a larger diameter.

 

Webcam Mount with Lens Diameter that is Smaller than Actual Microscope Tube Diameter

 

We assembled the top and bottom parts of the 3D Printed Microscope Stage Base and fitted the light in it as well. We realised we needed to design something to affix the sample slide in place and considered designing stage clips.

Assembled 3D Printed Microscope Stage Base with Light Fitted

 

We added some postprocessing to remove distortion from the webcam images as there was a noticeable fisheye effect from the webcam lens. We tested the set-up after preparing sample slides with the hemocytometer and micropipette.
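Our actual postprocessing relies on the webcam calibration; as a simplified illustration of the idea behind radial (barrel/fisheye) undistortion, the sketch below corrects pixel coordinates with a one-parameter radial model. The parameter values and function name are illustrative, not our calibration:

```python
import numpy as np

def undistort_points(points, k1, center, focal):
    """Correct barrel distortion on pixel coordinates using a one-parameter
    radial model: p_corrected = p * (1 + k1 * r^2) in normalized coords.
    A simplified stand-in for a full calibrated undistortion."""
    pts = (np.asarray(points, dtype=float) - center) / focal  # normalize
    r2 = np.sum(pts ** 2, axis=1, keepdims=True)              # squared radius
    corrected = pts * (1.0 + k1 * r2)
    return corrected * focal + center                         # back to pixels

center = np.array([320.0, 240.0])  # example optical centre for a 640x480 frame
# The optical centre maps to itself; off-centre points move inward for k1 < 0
print(undistort_points([[320.0, 240.0], [720.0, 240.0]],
                       k1=-0.2, center=center, focal=400.0))
```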

Testing Webcam using Laptop (Development) with Microscope Set-up

 

In addition, we also added a fan to cool down the Arduino in case it heats up excessively.

Added fan to Arduino for cooling

On the software side, we implemented code for synthetic dataset generation using 22 template cell types and generated 1.5k synthetic cell images, 300 each of 0, 1, 2, 3 and 4 cells. We then trained the model on a 60/20/20 training/validation/test split with these new images. However, we revisited our approach to cell counting and tried out other options that did not involve deep learning, including blob detection, object segmentation with an HSV range, and contour detection. Ultimately, we settled on contour detection as it was more accurate, flexible, and required less computational power. The full details can be found on our software page for cell counting.
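Our implementation uses contour detection (see the software page for the full pipeline); as a self-contained illustration of the underlying counting idea, the sketch below counts connected white regions in a thresholded binary image with a flood fill. This is a simplified stand-in, not our production code:

```python
import numpy as np

def count_blobs(binary, min_area=1):
    """Count 4-connected white regions in a binary image via flood fill.
    A simplified stand-in for library-based contour detection."""
    mask = np.asarray(binary, dtype=bool).copy()
    h, w = mask.shape
    count = 0
    for y in range(h):
        for x in range(w):
            if not mask[y, x]:
                continue
            # flood-fill this blob, erasing it as we go
            area, stack = 0, [(y, x)]
            mask[y, x] = False
            while stack:
                cy, cx = stack.pop()
                area += 1
                for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                    if 0 <= ny < h and 0 <= nx < w and mask[ny, nx]:
                        mask[ny, nx] = False
                        stack.append((ny, nx))
            if area >= min_area:  # ignore specks below the size threshold
                count += 1
    return count

img = np.zeros((8, 8), dtype=int)
img[1:3, 1:3] = 1   # first "cell"
img[5:7, 4:7] = 1   # second "cell"
print(count_blobs(img))  # 2
```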


Week 8 28 June - 4 July

We designed and started 3D printing these parts:

We had to print the parts multiple times. Specifically, the first version of the light blocker aperture prototype had too many holes, and the holes were spaced too closely. More details can be found on the pages linked.

Modifications were made to the microscope mount, including redoing the holes at the back of the mount and reducing the amount of material used. Previously, just one M5 screw drilled into the PLA plastic held it up. As this was not very secure, we extended the rolling screws at the back end of the mount and secured them with a nut. This is more secure, as screw threads cut into plastic wear off if the screw is repeatedly removed.

After printing and testing the light blocker aperture, we realised it was not working as expected. The issue is described in detail on the hardware prototyping log page for the light blocker. Ultimately, we opted for a new design instead of a rotating aperture.

We also tested the set-up with the GUI interface written in Flask. Coordinates can be sent to the controller to adjust the position of the microscope. We also successfully tested image loading within the Flask application.
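A minimal sketch of how such a Flask endpoint might accept stage coordinates is shown below. The route name, travel limits, and feed rate are illustrative assumptions, not our actual interface, and the serial write to the controller is omitted:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Stage travel limits in mm -- example values, not our exact calibration
X_MAX, Y_MAX = 100.0, 100.0

@app.route("/move", methods=["POST"])
def move():
    """Accept target stage coordinates and return the G-code line that
    would be forwarded to the Marlin controller."""
    data = request.get_json(force=True)
    x = min(max(float(data.get("x", 0)), 0.0), X_MAX)  # clamp to travel range
    y = min(max(float(data.get("y", 0)), 0.0), Y_MAX)
    gcode = f"G1 X{x:.2f} Y{y:.2f} F300"
    return jsonify({"gcode": gcode})
```

Clamping the coordinates server-side keeps a non-technical user from driving the stage past its end stops.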

Video Demo of Microscope Axes Control with Interface

Subsequently, we also tested out the focusing algorithm after preparing yeast slides in the MnT lab. 

For software, we refined the contour detection with more preprocessing, adjusting code to suit the webcam images.


Week 9 5-11 July

We 3D printed a new version for these parts:

Testing out Light Blocker with Entire Set-up

After printing these parts, we tested the light blocker bridge prototype with webcam image capture and display. In addition, the cell counting model was tested on the Raspberry Pi. The webcam configuration was changed to video streaming to reduce lag between image captures.

We also started planning and taking short clips for the editing of the blog video.

We decided to try out a different type of light: a filament bulb. One concern, however, is that filament bulbs generate heat, which might melt the 3D-printed plastic of the stage base; addressing this would likely require adding a fan or redesigning the base to improve airflow.

To test how hot the filament bulb could get, we measured it with a thermal thermometer. When off, the bulb measured 22 °C; after approximately 1 minute of being lit, it measured 40 °C.

Measuring Temperature of Filament Light Bulb

In addition, the brightness of the filament bulb cannot be adjusted once it is lit.

On the software side, we added new cell types and backgrounds to the synthetic cell image generation so that it imitates webcam images more closely, and we quantified the cell counting model's performance on the synthetic dataset.

Subsequently, we presented the project overview and current prototype version to Dr Ho, MnT staff and fellow peers.


Week 10 12-18 July

We 3D printed a new version for these parts:

We brought a pond water sample to the MnT lab in hopes of observing how the cell counting algorithm handles different cells. However, there seemed to be no observable cells or living cellular organisms when viewed under the microscope. We will probably try again with new pond samples next week.

 Pond Water Sample

 

Additionally, we continued filming more clips for the blog video.

 Filming in Progress for Blog Video Clips

 

For this week, we worked more intensively on the software side. We tested the focusing algorithm and worked on improving it. More details can be found here for autofocusing.
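The autofocusing details are on the linked page; as an illustration of the general idea, one common focus metric is the variance of the image Laplacian, which scores sharp (in-focus) frames higher than blurry ones. This sketch is not necessarily our exact metric:

```python
import numpy as np

def focus_score(gray):
    """Variance of a 4-neighbour Laplacian: higher means sharper.
    One common focus metric, shown here as an illustration."""
    g = np.asarray(gray, dtype=float)
    lap = (g[:-2, 1:-1] + g[2:, 1:-1] + g[1:-1, :-2] + g[1:-1, 2:]
           - 4.0 * g[1:-1, 1:-1])
    return lap.var()

# A sharp checkerboard should outscore a defocused (blurred) copy of itself
sharp = np.indices((32, 32)).sum(axis=0) % 2 * 255.0
blurred = (sharp + np.roll(sharp, 1, 0) + np.roll(sharp, 1, 1)
           + np.roll(np.roll(sharp, 1, 0), 1, 1)) / 4.0
print(focus_score(sharp) > focus_score(blurred))  # True
```

An autofocus routine can then step the Z axis, evaluate this score at each position, and stop at the maximum.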

While we previously used synthetically generated cells with known cell counts to judge cell counting accuracy, we quickly realised the synthetic dataset is not a reliable substitute for real microscope images, as it did not account for uneven lighting and unfocused cells. Hence, we worked on improving the likeness of the synthetic cell dataset by mimicking unfocused cells with blur filters, which more closely simulates how the webcam image appears in reality, where some cells tend to be out of focus. Moreover, due to the design of the algorithm, fewer cells could be inserted into an image if their row or column coordinates coincided with another inserted cell's. We resolved this by using an H x W array to keep track of cell coordinates instead of sets. More details can be found here under Dataset V2.2 (Synthetic Dataset).
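The H x W occupancy-array idea can be sketched as follows: a boolean array marks every pixel already covered by a cell, so a candidate placement is rejected only when its footprint actually overlaps, not when it merely shares a row or column index. The function and parameters are illustrative:

```python
import numpy as np

def place_cells(height, width, cell_size, candidates):
    """Place square 'cells' at candidate top-left corners, rejecting any
    that would overlap an already-placed cell. Tracking occupancy with an
    H x W array (instead of sets of row/column indices) means two cells
    sharing only a row or column no longer conflict."""
    occupied = np.zeros((height, width), dtype=bool)
    placed = []
    for (r, c) in candidates:
        if r + cell_size > height or c + cell_size > width:
            continue  # would fall outside the image
        footprint = occupied[r:r + cell_size, c:c + cell_size]
        if footprint.any():
            continue  # overlaps an existing cell: skip
        footprint[:] = True  # mark these pixels as taken (view into occupied)
        placed.append((r, c))
    return placed

# (0, 0) and (0, 60) share row 0 but do not overlap, so both fit;
# (10, 10) overlaps the first cell and is rejected.
print(place_cells(100, 120, 50, [(0, 0), (0, 60), (10, 10)]))
```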

After testing cell counting in the lab with real-time webcam images, we noticed that the cell counting model was not resilient to lighting changes: cell counts were inconsistent across multiple images, even though the sample slide remained in the same position and no explicit lighting adjustments were made. This led us to try alternative thresholding algorithms to find one that performs well even with uneven lighting. We also improved image preprocessing by implementing unsharp masking. More details can be found here for thresholding.
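Unsharp masking adds back the difference between the image and a blurred copy of itself, steepening edges. The NumPy sketch below uses a box blur as a stand-in for the Gaussian blur typically used, and is an illustration rather than our exact preprocessing code:

```python
import numpy as np

def unsharp_mask(gray, amount=1.0, radius=1):
    """Sharpen by adding back the difference between the image and a
    blurred copy: out = img + amount * (img - blurred). A box blur
    stands in here for the Gaussian blur typically used."""
    g = np.asarray(gray, dtype=float)
    k = 2 * radius + 1
    # box blur via summing shifted views of an edge-padded copy
    padded = np.pad(g, radius, mode="edge")
    blurred = sum(padded[i:i + g.shape[0], j:j + g.shape[1]]
                  for i in range(k) for j in range(k)) / (k * k)
    return np.clip(g + amount * (g - blurred), 0, 255)

# The jump across a step edge grows after sharpening
step = np.tile(np.repeat([50.0, 200.0], 8), (4, 1))
out = unsharp_mask(step)
```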


Week 11 19-25 July

We finished editing the blog video, which can be found here. In addition, we explored more thresholding variants, such as adaptive thresholding. More details can be found here for thresholding.

Subsequently, we tested the autofocusing algorithm and prepared algae samples to try out different cell types. Compared to our previous attempt, we were able to view cells under the microscope successfully.

 

 Preparation of Algae Sample for Viewing 

 Algae Cells under the Microscope as taken by Webcam

 

 Testing Autofocusing Algorithm with the Full Set-Up

We also tested the different thresholding solutions we had coded out prior to the lab visit. Lastly, we started working on our final presentation slides.

 


References

[1] “OpenFlexure Microscope,” Openflexure.org. [Online]. Available: https://openflexure.org/projects/microscope/. [Accessed: 3-Jul-2021].

 
