ORIGINAL ARTICLE
Year : 2021  |  Volume : 9  |  Issue : 4  |  Page : 170-176

Automated drawing tube (camera lucida) method in light microscopy image analysis can come true


1 Department of Bioelectrics and Biomedical Engineering, School of Advanced Technologies in Medicine, Isfahan University of Medical Sciences, Isfahan, Iran
2 Department of Bioelectrics and Biomedical Engineering, School of Engineering, University of Isfahan, Isfahan, Iran
3 Department of Parasitology and Mycology, School of Medicine, Isfahan University of Medical Sciences, Isfahan, Iran

Date of Submission: 26-Apr-2020
Date of Decision: 15-May-2020
Date of Acceptance: 13-Aug-2020
Date of Web Publication: 22-Nov-2021

Correspondence Address:
Dr. Saeed Kermani
Associate Professor of Biomedical Engineering, Department of Bioelectrics and Biomedical Engineering, School of Advanced Technologies in Medicine, Isfahan University of Medical Sciences, Isfahan
Iran

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/JMAU.JMAU_25_20


  Abstract 


Background: In light microscopy, it is difficult to acquire an image in which components at different depths are all in focus, and there are various approaches to this problem. One of the common approaches is the camera lucida (CL), or drawing tube. This method has several disadvantages: it is time-consuming, depends on manual drawing, is tedious for the user, and produces grayscale output images. Aims and Objectives: In this study, we propose a novel combined hardware and software method; in this article, we present an automated approach for our designed microscope. Materials and Methods: In a previous project (design code number 377,694), we designed and implemented an upgraded light microscope with automatic stage movement under closed-loop control of a servomotor and automatic image capture at predefined positions. That project gave acceptable results in its different parts, which encouraged the present work. This study helps specialists obtain a well-focused view of all components in a sample; in effect, it provides a useful camera lucida (drawing tube) in an automated scheme. Results: The manual method is accepted and routine among microscopy specialists, but it is time-consuming and tedious, which affects the accuracy of the results, so an automated equivalent is attractive. The idea of this study comes from the manual drawing tube (CL) method. In this experimental study, we captured 400 images of microorganisms, showing either the whole body or various organs, at different z-axis positions of the stage so that components at different depths could be brought into focus. Each patch was checked for its edge strength to choose the sharpest sub-image, and the focused image was reconstructed like a puzzle; this process was continued over all areas to merge them into the complete reconstructed output image. Conclusion: Comparing the edge strength with that of the other images, and the mean square error with the manually focused image, confirms our method with satisfactory outcomes. Furthermore, independent focusing of an internal component within a sample body was examined; it gives better resolution of the selected internal component for further analysis, which is then replaced into its original image. This article presents efficient results with good accuracy and reduced processing time, which could be useful for different microscope types and various sample types.

Keywords: Auto-focusing, camera lucida, combination, image stacking, microscopy, position control


How to cite this article:
Vahabi F, Kermani S, Vahabi Z, Pestechian N. Automated drawing tube (camera lucida) method in light microscopy image analysis can come true. J Microsc Ultrastruct 2021;9:170-6





  Introduction


Microscopic imaging systems have become central to scientific analysis in biomedicine, pathology, biochemistry, and many other fields. There have been significant improvements in their design, including recent research on image acquisition and image quality.[1]

Many biological experiments on specimens rely on the analysis of microscopic images. Specimens' bodies have various structural volumes, so when a parasitologist adjusts the microscope stage to focus the image, only a particular region is seen at high resolution, and the process must be repeated for the other parts. A specialist therefore cannot obtain a view that is in focus over all parts of the acquired image.[2]

To address this problem, researchers developed an optical method named the drawing tube (camera lucida [CL]). It is a superimposition scheme that aids the accurate interpretation of perspective.[3]

The drawing tube is a common manual technique for obtaining focused images of a desired field of view, and it is widely used by biochemists, parasitologists, neurobiologists, and others. Its limitations can be avoided by digital autofocusing and reconstruction.

An automated microscope enables an autofocusing process with acceptable image capture and processing. Focusing is a critical step for subsequent segmentation, classification, and diagnosis. Several autofocusing methods have been studied for different biological and biomedical applications.[4],[5],[6],[7]

However, despite these recent studies, a suitable combined system whose results are acceptable when compared with manually focused output images is still lacking.

For this aim, we divided the sample slide into equal regions in the XY plane. Corresponding zones of the images captured at different depths were compared using a sharpness index, and the zone with the highest resolution was selected for reconstructing the final focused image. This process was repeated for all of the predefined regions.

The higher resolution of the reconstructed images, the acceptable accuracy, and the favorable running time of the algorithm motivated us to report this automated digital focusing method as a simulation of the drawing tube (CL).


  Methods


The analysis of microscopic images captured from microorganisms such as parasites is very useful in parasitology. Many specialists believe that studying focused images of specimens during the course of parasitic diseases greatly improves diagnosis.[8] One standard method for microscopic image reconstruction is the drawing tube (CL), in which the specialist draws approximately three-dimensional parasitological figures for better identification of microorganisms: the parasitologist moves the stage and paints an image using the perspective view of the sample. This approach yields a high-resolution drawing of almost all body components of a specimen.[3],[9],[10] An autofocusing microscope for imaging specimens therefore promises improved accuracy and saved time, and our study of an automated equivalent of this method produced satisfactory results.

Our implemented system has hardware and software parts that work together on a NAVITE 2205-XSZ light microscope (Ningbo Shensheng Optics and Electronics Co., Ltd), which was calibrated for imaging.

Motion control could be applied to both the coarse and fine focusing screws of the stage, but in this study our purpose is fine adjustment: the user positions the specimen stage approximately with the coarse screw, and fine tuning is done automatically through the fine screw. In the hardware section, we designed a closed-loop control scheme for a servomotor with an encoder, as shown in [Figure 1]. This part sets the sample stage at predefined positions and triggers image capture at determined times. It provides the image data for the software part, whose image processing algorithm helps the user obtain an enhanced image of each specimen.
Figure 1: Implemented autofocus microscopic system



An embedded Arduino board controls an MG945 servomotor, which lets us reach predefined positions with good accuracy over the movement distances. Based on engineering considerations and parasitologists' opinions, the preferred stage movement step is about 30 µm within a total range of 1200 µm around the reference point, with approximately 90% positioning accuracy, which was measured and verified with a micrometer. For image capture, a Raspberry Pi Camera Module V2 with an 8 MP Sony IMX219 image sensor was used.

With this process, we captured about 400 images from four different parasite types. The captured images were saved on the Raspberry Pi for processing into a single enhanced image.
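A minimal sketch of this acquisition loop, assuming a Raspberry Pi running Python with the picamera and pyserial libraries; the serial port, the "STEP" command protocol, and the file paths are illustrative assumptions, not the authors' actual interface.

```python
# Acquisition loop sketch. Assumptions: the Arduino listens on /dev/ttyACM0 and
# accepts a hypothetical "STEP\n" command that advances the stage one 30-um step;
# step count and file names are illustrative only.
import time
import serial
from picamera import PiCamera

STEP_UM = 30           # stage step reported in the paper
RANGE_UM = 1200        # total travel around the reference point
N_STEPS = RANGE_UM // STEP_UM

arduino = serial.Serial("/dev/ttyACM0", 9600, timeout=1)
camera = PiCamera(resolution=(1024, 768))
time.sleep(2)                          # let the sensor settle

for i in range(N_STEPS):
    arduino.write(b"STEP\n")           # ask the servo controller to move one step
    time.sleep(1.0)                    # 1-s pause so the capture is slip-free
    camera.capture(f"/home/pi/stack/frame_{i:03d}.jpg")

arduino.close()
```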

[Figure 2] shows a block diagram of our implemented autofocus microscope.
Figure 2: Schematic diagram of system



In our proposed method, a focused image is reconstructed from several captured images. They were captured at ×4 magnification while the stage position was changed along the Z axis of the microscope. We therefore need to determine the in-focus regions of each received image and discard the unfocused parts. The focused parts are merged to reconstruct an image that is in focus over all parts of the microscopic sample, as in the drawing tube (CL) method. Comparing the contrast or amount of detail in corresponding parts of the different images helps to choose the focused part for merging. Typical marks of image detail are clearly visible edges and dots, which are not as clear in a blurred image. According to previous studies, edge information reflects the level of the image gradient, so in higher-resolution images this index tends to be larger.[11],[12],[13],[14],[15],[16]

Its definition is:

Q^{AB/F} = \frac{\sum_{x=1}^{X}\sum_{y=1}^{Y}\left[Q^{AF}(x,y)\,w^{A}(x,y)+Q^{BF}(x,y)\,w^{B}(x,y)\right]}{\sum_{x=1}^{X}\sum_{y=1}^{Y}\left[w^{A}(x,y)+w^{B}(x,y)\right]}    (1)

where Q^{AF}(x,y) = Q_{g}^{AF}(x,y)\,Q_{\alpha}^{AF}(x,y); here Q_{g}^{AF} and Q_{\alpha}^{AF} represent the edge strength preservation and the edge orientation preservation of input image A in the fused image F, Q^{BF} is defined similarly for input image B, and w^{A} and w^{B} are the corresponding edge-strength weights. The size of the input image is X × Y.
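As an illustration of the quantities entering Q^{AB/F}, the per-pixel edge strength and orientation can be computed from Sobel gradients. The following Python sketch shows only these building blocks and a simple patch sharpness score; it is a simplified proxy, with the OpenCV usage and function names as our assumptions, not the metric's full implementation.

```python
import cv2
import numpy as np

def edge_strength_and_orientation(img_gray):
    """Sobel-based edge strength g(x, y) and orientation alpha(x, y),
    the quantities from which the Q^{AB/F} preservation terms are built."""
    sx = cv2.Sobel(img_gray, cv2.CV_64F, 1, 0, ksize=3)   # horizontal gradient
    sy = cv2.Sobel(img_gray, cv2.CV_64F, 0, 1, ksize=3)   # vertical gradient
    g = np.hypot(sx, sy)                                  # edge strength
    alpha = np.arctan2(sy, sx)                            # edge orientation
    return g, alpha

def patch_sharpness(img_gray, x0, y0, size=32):
    """Score a sub-window by its mean edge strength (sharper patch -> larger score)."""
    patch = img_gray[y0:y0 + size, x0:x0 + size].astype(np.float64)
    g, _ = edge_strength_and_orientation(patch)
    return float(g.mean())
```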

Higher contrast at any coordinate corresponds to larger variation between a pixel and its neighbors among the captured images. The pixel score is defined as the difference between each pixel and its neighbors, and for each region, the frame with the highest score determines which pixel is chosen for merging. In addition, for more accurate results, the captured images must be normalized for size, brightness, contrast, and color, because the merged image is composed of parts of all of the images.[12],[13],[14] At the starting point, where the parasitologist has adjusted the stage to a good position, the captured image is saved as the reference for these parameters. Then, based on the hardware settings, motor motion control is performed; in each step, the motor is stopped for 1 s to give the camera time to capture the image without slip. This continues until the complete predefined distance along the z-axis has been covered by the stage movement. [Figure 3] and [Figure 4] show different non-focused captured images together with the merged output image for two types of parasites.
Figure 3: Three different selected non-focused captured images at various stage positions (a-c) with the merged output (d) for the throat organ of a parasite

Figure 4: Three different selected non-focused captured images at various stage positions (a-c) with the merged output (d) for the gloves organ of a parasite

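A minimal sketch of the normalization step, assuming a simple per-channel mean and standard-deviation match to the reference frame; the exact normalization used in this study is not specified, so this is only one plausible choice, and the function name is ours.

```python
import numpy as np

def normalize_to_reference(frame, reference):
    """Match each color channel's mean and standard deviation to the reference
    image so that frames captured at different stage heights can be merged."""
    frame = frame.astype(np.float64)
    reference = reference.astype(np.float64)
    out = np.empty_like(frame)
    for c in range(frame.shape[2]):                        # color channels
        f_mean, f_std = frame[..., c].mean(), frame[..., c].std() + 1e-8
        r_mean, r_std = reference[..., c].mean(), reference[..., c].std()
        out[..., c] = (frame[..., c] - f_mean) / f_std * r_std + r_mean
    return np.clip(out, 0, 255).astype(np.uint8)
```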


Our process is based on the structure of microscopic imaging systems. In such systems, the objects of a specimen's body appear at different resolutions in the image depending on the stage position; objects away from the focal plane usually appear blurred. Specialists therefore review a collection of images taken at different stage positions to see the different objects of one specimen at higher resolution. Creating a single focused image from this collection of images captured at different stage heights helps specialists obtain a better-focused view.[17],[18],[19] To bring most objects of the specimen into focus, the microscope stage is moved up or down by the controlled motor as described above. Automatically changing the z-value of the stage does not by itself solve the focusing problem, since there is no single in-focus image that covers all of the microscopic objects in a specimen; we must analyze a collection of microscopic images in which each image may contain a well-focused region of the specimen's objects.

In our proposed approach, the best-focused image is generated from several images captured at different z-values of the stage.

Let F represent the image set (F1, F2, F3, …, Fn), where n is the number of images captured at different z-values. The size of all images is normalized to X × Y, so Fi(x, y) represents the pixel value at coordinate (x, y) in the ith image.

The pixel-based approach is the most common in focal stacking algorithms. In this method, the corresponding pixel values at each position in all of the images are compared with one another to determine the best-focused pixel.[20],[21] A differential function assigns a score to each pixel position: a differential image Di with corresponding pixel values is calculated for each image Fi ∈ F, and the maximum of the differentials is used to select the good pixels,

D_i(x, y) = \sum_{(u, v) \in N(x, y)} \left| F_i(x, y) - F_i(u, v) \right|    (2)

where N(x, y) denotes the neighbors of the pixel at (x, y). Then

k(x, y) = \arg\max_{i \in \{1, \ldots, n\}} D_i(x, y), \qquad R(x, y) = F_{k(x, y)}(x, y)    (3)

where R is the reconstructed (merged) image.
Another method is based on neighboring pixel values. In the pixel-based algorithm, discontinuities may occur because pixels are picked from different images. Neighborhood algorithms minimize this inconsistency by using neighborhood information to obtain a good value for the corresponding pixel. The parameter Q^{AB/F} described in formula (1) was used on a similar basis.
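As a concrete illustration of the pixel-based selection in formulas (2) and (3), a minimal NumPy sketch follows; the 4-neighborhood differential and the function name are our illustrative choices, not the authors' exact code.

```python
import numpy as np

def pixel_based_stack(frames):
    """frames: n grayscale images of identical size X x Y.
    Returns the merged image R with R(x, y) = F_k(x, y), k = argmax_i D_i(x, y)."""
    F = np.stack([f.astype(np.float64) for f in frames])       # shape (n, X, Y)
    # D_i(x, y): sum of absolute differences to the 4 nearest neighbors
    # (np.roll wraps around at the borders; that edge effect is ignored here).
    D = np.zeros_like(F)
    for shift, axis in [(1, 1), (-1, 1), (1, 2), (-1, 2)]:
        D += np.abs(F - np.roll(F, shift, axis=axis))
    k = np.argmax(D, axis=0)                                    # best frame per pixel
    return np.take_along_axis(F, k[None, ...], axis=0)[0].astype(np.uint8)
```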

To evaluate our results, the mean square error (MSE) and a loss function (LS) were used:[19],[20],[21]

MSE = \frac{1}{XY}\sum_{x=1}^{X}\sum_{y=1}^{Y}\left[T(x, y) - G(x, y)\right]^{2}    (4)

where X and Y are the dimensions of the image, T is the input (manually focused) image used as the target, and G is the algorithm's output; the loss function LS is built from the same quantities. Because the two measures express the same concept and the MSE is simpler to compute, we use the MSE in what follows.
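The MSE comparison against the manually focused image maps directly to code; a short sketch with our own variable names:

```python
import numpy as np

def mse(target, output):
    """Mean square error between the manually focused image T and the
    algorithm's merged output G, both X x Y arrays of the same size."""
    T = target.astype(np.float64)
    G = output.astype(np.float64)
    return float(np.mean((T - G) ** 2))
```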

[Table 1] shows MSE-based quantitative results for two different parasite image datasets. The stage is moved by the controlled motor from bottom to top through ten interval positions, with the reference point, predefined by a specialist, at the middle of this range.
Table 1: Results of the focusing process for two randomly selected parasite sample types: images captured while moving the stage with the motor and then merged



[Figure 5] and [Figure 6] show the manually focused image accepted by the specialist and our reconstructed, puzzle-like image assembled from different focused parts, for two parasite samples. The figures also show the differences between these outputs in the three layers of the RGB image, together with their equalized histograms.
Figure 5: Merged image and manually focused image for one type of specimen, with histograms of the absolute differences between the two images. (a) Manually focused image, (b) merged output image, (c) grayscale histogram of the absolute difference between the manually focused image and the merged image, (d) absolute difference of the red layer of the two images, (e) absolute difference of the green layer, (f) absolute difference of the blue layer, (g-i) equalized histograms of (d), (e), and (f), respectively

Figure 6: Merged image and manually focused image for one type of specimen, with histograms of the absolute differences between the two images. (a) Manually focused image, (b) merged output image, (c) grayscale histogram of the absolute difference between the manually focused image and the merged image, (d) absolute difference of the red layer of the two images, (e) absolute difference of the green layer, (f) absolute difference of the blue layer, (g-i) equalized histograms of (d), (e), and (f), respectively



To further simulate the manual drawing tube (CL) method, we tested our proposed algorithm on images to improve an internal part of a parasite's body. In this part of the study, we first have a sample such as [Figure 7]a, which was focused manually at the reference position. Merging of the different parts captured while moving the stage was then performed; [Figure 7]b shows the merged image of this input. As mentioned above, the images captured at different positions were normalized in several parameters, so the merged output in [Figure 7]b has 1024 × 768 pixels. An additional subject of interest for microscopic image analysts is to obtain higher resolution in a specific internal part of the parasite's body, as in [Figure 7]c. To achieve this, the user selects that part with a bounding window, after which our proposed algorithm is applied: the user is asked to adjust the stage to bring that part into good focus, the closed-loop-controlled motor moves the stage, and the captured images are fed into the process to reconstruct a focused internal part of size 1024 × 768. This super-resolved internal part must then be returned to its position in the original image of the whole parasite body. Bicubic-interpolated downsampling was used for this, based on the output image size and the bounding window,[22] which gives the original size of the part and its location in the body of the sample. [Figure 7]d shows the super-resolution of the internal part placed back in the whole body.
Figure 7: An image that was focused manually at the reference position (a). Merging of different parts captured through moving the stage, combined to reconstruct the automated merged image (b). Selected part of the parasite's body for focusing (c). Super-resolution of the selected internal part in the whole body (d)

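A sketch of the reinsertion step, assuming OpenCV bicubic resizing and a user-supplied bounding window (x, y, w, h); it mirrors the workflow described above but is not the authors' implementation.

```python
import cv2

def reinsert_focused_part(whole_image, focused_part, window):
    """Downsample the 1024 x 768 focused reconstruction of an internal part
    back to the size of the user's bounding window and paste it into the
    merged image of the whole body.
    window: (x, y, w, h) of the selected part in the whole image."""
    x, y, w, h = window
    small = cv2.resize(focused_part, (w, h), interpolation=cv2.INTER_CUBIC)
    result = whole_image.copy()
    result[y:y + h, x:x + w] = small
    return result
```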


The size of the sub-image windows that are merged to reconstruct the output image is 32 × 32 pixels. Hence, if the selected internal part is smaller than these areas or has very little depth variation, it cannot reach a much higher resolution.
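A compact sketch of this patch-wise merge, assuming the 32 × 32 windows stated above and a simple gradient-energy sharpness score; it illustrates the idea rather than the exact implementation used in this study.

```python
import numpy as np

def merge_focus_stack(stack, patch=32):
    """stack: list of grayscale frames of identical size (already normalized).
    For every patch x patch window, keep the window from the frame whose
    local gradient energy (a simple sharpness proxy) is largest."""
    stack = [f.astype(np.float64) for f in stack]
    h, w = stack[0].shape
    out = np.zeros((h, w))
    for y in range(0, h, patch):
        for x in range(0, w, patch):
            best_score, best_win = -1.0, None
            for f in stack:
                win = f[y:y + patch, x:x + patch]
                gy, gx = np.gradient(win)
                score = np.mean(gx**2 + gy**2)       # gradient energy of the window
                if score > best_score:
                    best_score, best_win = score, win
            out[y:y + patch, x:x + patch] = best_win
    return out.astype(np.uint8)
```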


  Discussion


In this article, we have presented an autofocus microscope. For this aim, with the help of pathology specialists, we gathered sample data for four types of parasites. The hardware design (design code number 377,694) uses a servomotor with mechanical links to move and stop the stage with acceptable velocity and accuracy and with very little vibration; an encoder was embedded to monitor the system's movements. Different images of every sample were captured at various positions, and the focused parts of these images were used to reconstruct new high-resolution images over approximately all parts of the sample. The basis of this approach is very similar to the camera lucida (drawing tube) method, so this article shows that an automated camera lucida method is feasible.

The presented method could be useful for different microscope types and various samples, with satisfactory results. More precise stage movements, more captured images, and more effective sharpness features would give better-focused microscopic images. Furthermore, smaller sub-images can be used to focus the smaller parts of the images, and for cases with particular characteristics, deblurring algorithms could also be applied to the final reconstructed image.


  Conclusion


This article presents an algorithm that combines engineering techniques and medical diagnostic skills to obtain good-quality images for more accurate parasite diagnosis, in the manner of an automated drawing tube (CL). In this method, high-resolution parts of different images are selected and merged for a distinct view of the specimen's internal components. The proposed algorithm was run on a personal computer with an Intel Core i5 central processing unit and 8 GB of random access memory.

In our implemented system (design code number 377,694, ethics number 1397.462), the sample stage moves under closed-loop control of a servomotor, and pictures are captured at predefined stage positions and specific times. Our method enables scientific research institutes and laboratories to reduce the problems of manual focusing, which is time-consuming and gives the user little access to fine details, while still yielding good results in practical application. The drawing tube is a manual method that produces grayscale, pen-drawn images, whereas our proposed automated method provides naturally colored images on different microscope types, which can make diagnosis more efficient. Furthermore, higher resolution of various organs with less time spent, and hence faster results, encourages laboratories to use this application. Another benefit of the presented method is that it can serve as an educational tool for laboratories, parasitologists, and other users of various microscope types.

Edge strength is used to choose the high-resolution parts of the different images for merging. After reconstructing the puzzle-like image from the selected parts, the MSE between each captured image, the manually focused output chosen by the specialist, and the output of our algorithm was calculated for the different parasite types; this parameter decreases for our algorithm's output. The other parameter is edge strength, which according to previous studies can represent image resolution; it is also reported for the different images and for our proposed algorithm. The acceptable results obtained confirm our study.

Financial support and sponsorship

Nil.

Conflicts of interest

This research was financially supported by Isfahan University of Medical Sciences, Isfahan, Iran (Grant no. 397694). The authors would like to acknowledge the products and services provided by Isfahan University of Medical Sciences.



 
  References

1. Sigdel MS, Sigdel M, Dinç S, Dinç I, Pusey ML, Aygün RS. FocusALL: Focal stacking of microscopic images using modified Harris corner response measure. IEEE/ACM Trans Comput Biol Bioinform 2016;13:326-40.
2. Sands GB, Gerneke DA, Hooks DA, Green CR, Smaill BH, Legrice IJ. Automated imaging of extended tissue volumes using confocal microscopy. Microsc Res Tech 2005;67:227-39.
3. Mandira S. Light and scanning electron microscopic studies of Myxobolus indica n. sp. and a report of three myxozoan (Myxosporea: Bivalvulida) parasites of cultured ornamental goldfish, Carassius auratus L. for the first time in India. Aquaculture Reports 2017;7:66-76.
4. Majumder M, Deen J. Smartphone sensors for health monitoring and diagnosis. Sensors 2019;19:1-45.
5. Yan Z, Chen G, Xu W, Yang C, Lu Y. Study of an image autofocus method based on power threshold function wavelet reconstruction and a quality evaluation algorithm. Appl Opt 2018;57:9714-21.
6. Avbnjh A, Sbghn Z. Autofocus by Bayes spectral entropy applied to optical. Microsc Microanal 2016;22:199-207.
7. Sun J, Han Q, Kou L, Zhang L, Zhang K, Jin Z. Multi-focus image fusion algorithm based on Laplacian pyramids. J Opt Soc Am A Opt Image Sci Vis 2018;35:480-90.
8. Luckner M, Wanner G. From light microscopy to analytical scanning electron microscopy (SEM) and focused ion beam (FIB)/SEM in biology: Fixed coordinates, flat embedding, absolute references. Microsc Microanal 2018;24:526-44.
9. Notsu E, Toida K. Examination of morphological and synaptic features of calbindin-immunoreactive neurons in deep layers of the rat olfactory bulb with correlative laser and volume electron microscopy. Microscopy (Oxf) 2019;68:316-29.
10. Kartik D, Chhveen B, Indu K, Sand Vikrant AA. Microscopic illustration of Pelargonium x hortorum. Asian J Res Botany 2019;2:1-6.
11. Fatma U, Kutay I, Kasim T, Bulent Y. Automated quantification of immunomagnetic beads and leukemia cells from optical microscope images. Biomed Signal Process Control 2019:473-82.
12. Alexandre A, James N, Gabriella D, Rafaela M, Paulo O, Fabio F. A stomata classification and detection system in microscope images of maize cultivars. Microscopy 2019;538165:435-55.
13. Miura K, Nose A, Suzuki H, Okada M. Cutting tool edge and textured surface measurements with a point autofocus probe. Int J Auto Technol 2017;11:38-52.
14. Rakun J, Stajnko D, Zazu D. Plant size estimation based on the construction of high-density corresponding points using image registration. Comput Electron Agric 2019;157:288-304.
15. Theodoly O, Garcia-Seyda N, Bedu F, Luo X, Gabriele S, Mignot T, et al. Live nanoscopic to mesoscopic topography reconstruction with an optical microscope for chemical and biological samples. PLoS One 2018;13:e0207881.
16. Shahid Farid M, Mahmood A, AlMaadeed S. Multi-focus image fusion using content adaptive blurring. Inf Fusion 2018;16:1-17.
17. Sun L, Zhang Y, Wang Y, Yang Y, Zhang C, Weng X, et al. Real-time subcellular imaging based on graphene biosensors. Nanoscale 2018;10:1759-65.
18. Jan K, Khan A, Sajjad M, Muhammad K, Rho S, Mehmood I. A review on automated diagnosis of malaria parasite in microscopic blood smears images. Multimed Tools Appl 2017;77:1-19.
19. Yunfei W, Guoliang C, Yang X. Autofocus optimization algorithm for micro vision system based on micromanipulation, computer measurement. Control 2018;10:24-38.
20. Wentao L, Guijin W, Xinghao C, Xuanwu Y, Xiaowei H. Blurring-effect-free CNN for optimization of structural edges in focus stacking. In: 2019 IEEE International Conference on Image Processing (ICIP); 2019. p. 4634-9.
21. Kou L, Zhang L, Zhang K, Sun J, Han Q, Jin Z. A multi-focus image fusion method via region mosaicking on Laplacian pyramids. PLoS One 2018;13:e0191085.
22. Raza SE, Cheung L, Shaban M, Graham S, Epstein D, Pelengaris S, et al. Micro-Net: A unified model for segmentation of various objects in microscopy images. Med Image Anal 2019;52:160-73.

