
A Microscopic Visual Feedback System

Hiromasa Oku,1 Idaku Ishii,2 and Masatoshi Ishikawa3

1Department of Mathematical Engineering and Information Physics, Graduate School of Engineering, The University of Tokyo, Tokyo, 113-8656 Japan

2Department of Electrical and Electronic Engineering, Tokyo University of Agriculture and Technology, Koganei, 184-8588 Japan

3Department of Information Physics and Computing, Graduate School of Information Science and Technology, The University of Tokyo, Tokyo, 113-8656 Japan

SUMMARY

Over the past few years, there have been advances in research into high-speed vision systems, which can acquire and process video faster than conventional vision systems based on standard video signals, and many areas of applied research have emerged. In this paper we explore high-speed vision systems for controlling and manipulating microscopic objects, such as cells or semiconductors, and propose "microscopic visual feedback" (MVF), a method for controlling microscopic objects using high-speed visual feedback based on high-speed vision systems. We then consider the components needed in an MVF system, examine the specifications required for each component, and construct an actual MVF system based on these specifications. Finally, we conduct control-performance testing to ascertain the MVF system's actual ability to control moving microscopic subjects and, by tracking a paramecium in applied testing, demonstrate that the system is viable for acquiring motion tracks. © 2004 Wiley Periodicals, Inc. Syst Comp Jpn, 35(13): 71–79, 2004; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/scj.10056. Translated from Denshi Joho Tsushin Gakkai Ronbunshi, Vol. J84-D-II, No. 6, June 2001, pp. 994–1002.

Key words: visual feedback; microscopic world; high-speed vision.

1. Introduction

Massively parallel, high-speed vision chips [1] have recently garnered attention as a way to implement vision systems that can quickly acquire and process visual information. These designs integrate image sensor elements and parallel processor elements, linked together 1:1, on a single chip, resulting in a system that directly couples image acquisition with massively parallel processing for considerably greater speeds than previous systems could manage. Alongside the development of the general-purpose digital vision chip by Ishikawa and colleagues, research has progressed into numerous applications for high-speed vision systems: fast target tracking [2] and fast grasping [3] are two areas where high-speed vision systems have been shown to be useful.

Autonomous control in micro-level environments is one suitable application area for such high-speed vision systems. In the microscopic world, where dimensions are typically counted in micrometers, physical movements are much faster than in the everyday world [4]: this is known as the scaling rule [5], whereby the quality of motion changes with the physical scale. Because subjects at the micro level move so quickly, a rapid measurement rate and fast sensor information processing are needed to control their motion. The high-speed vision system meets these requirements and makes possible autonomous control in micro-level environments that had previously been difficult to achieve.




In this paper, we propose controlling micro-level subjects with a system that uses high-speed vision to feed back subject information at the micro level. Specifically, we examined the system specifications needed to implement what we call the microscopic visual feedback (MVF) system and built a prototype based on these specifications. Finally, we conducted a number of motion experiments with the prototype to assess its performance, demonstrating that the MVF system is effective.

2. High-Speed Vision Systems and the Micro World

The past few years have seen an increasing need to manipulate subjects in the micro world, with lengths measured in micrometers. This need comes from a wide range of disciplines, such as biology, which uses micromanipulation [6], and the semiconductor field, which uses microprobes. Micromachines are another area where micro-level manipulation is a key issue, but autonomous action at the micro level has still not been achieved. One reason is that there have been almost no sensors that can adequately measure micro-level subjects and their surrounding environments.

Key features that would apply to any such sensor would be precise measurement, noncontact measurement, and fast sensor data acquisition and processing. Of these, fast sensor data acquisition and processing would be most important, assuming that the goal is to control the movement of micro subjects.

Because of the scaling rule, micrometer-level bodies move at much greater speeds than we experience in our everyday lives. Consider, for example, vibration of a beam supported on one end. The vibration frequency ω varies with scale L as ω ∝ L⁻¹, so a beam at 1/100 the scale of a reference beam will vibrate at 100 times the frequency. Because of this very fast motion at the micro level, sensors used for measurement must have a very high sampling rate. It would be important to have sensors that acquire information quickly, extract useful information from large volumes of sensor data, and process sensor information quickly for real-time feedback.
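As a quick illustration (a standard Euler–Bernoulli beam result, not derived in the paper), the first natural frequency of a cantilever of length L, cross-sectional area A, and second moment of area I is

    \omega_1 = \frac{(\beta_1 L)^2}{L^2}\sqrt{\frac{EI}{\rho A}}, \qquad \beta_1 L \approx 1.875 .

Under isometric scaling, A \propto L^2 and I \propto L^4, so \sqrt{EI/(\rho A)} \propto L and hence \omega_1 \propto L^{-1}, which is exactly the 1/100-scale, 100-times-frequency behavior described above.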

Visual sensors have typically been used for measuring subjects in the micro world. These are noncontact devices, typically coupled to the optics of a microscope, and they can measure subjects with good accuracy, so they satisfy two of the three key conditions. In fact, because of their excellent qualities as sensors for microscopic subjects, there have already been projects to automate and assist micromanipulation using conventional visual sensors such as CCDs [7–10]. But because these operate at the frame rate of a standard video signal (30 or 60 Hz), they have been hampered by their inability to satisfy the third requirement for micro-world sensors: fast sampling. These systems have had a particularly hard time controlling the movements of microscopic subjects.

High-speed vision systems, which accelerate the process of capturing and processing images, can offer the high sampling rate required of micro-world sensors while retaining the noncontact, accurate measurement that conventional image sensors offer. This is what brought high-speed vision systems to our attention as candidate micro-world sensors for controlling microscopic subjects.

3. Microscopic Visual Feedback Proposal

In this paper, we propose microscopic visual feedback (MVF), a novel method for controlling microscopic objects using high-speed visual feedback. Figure 1 shows a conceptual drawing of an MVF system. The system depicted can autonomously manipulate micro-world subjects: thanks to high-speed vision, it can intelligently recognize micro-world bodies and perform actions on them. A typical configuration consists of a high-speed vision system, an actuator, and a control processor.

Advances in biotechnology and semiconductors have created a more pressing need for manipulation of microscopic subjects, and high-speed manipulation that can match the speed of microscopic subjects is necessary for efficient operation. Once high-speed manipulation becomes possible (something that conventional systems cannot manage), we can expect new forms of manipulation, including ultrafast machining and assembly of microscopic structures. The MVF system will make the most of the high-speed characteristics inherent in microscopic subjects, permitting micromanipulation at speeds that previous systems could not achieve.

Fig. 1. MVF system, conceptual drawing.



4. Overview of the Prototype System

To show that the principles of MVF are sound, we constructed a prototype MVF system that can track a micro-order subject under a microscope and keep it within the microscope's field of view.

4.1. The column-parallel vision (CPV) system [11]

For the high-speed vision system, we used a column-parallel vision (CPV) system, which combines column-parallel image transfer with completely parallel processing elements. It uses a 128 × 128 photodetector (PD) array whose pixels are linked 1:1 to a 128 × 128 array of parallel processing elements (PEs). With image data transferred from the PD array to the PE array and then processed in parallel, a high frame rate can be obtained while various forms of image processing are performed within the same system. Figure 2 shows a block diagram of the CPV system. The system samples and processes images with eight levels of grayscale every 1.28 ms.
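The paper does not list the specific image operations executed on the CPV; purely as an illustration, the following sketch mimics one typical per-frame computation, an intensity-weighted centroid on a 128 × 128, 8-level grayscale frame at the 1.28 ms frame period, written with NumPy on a host PC rather than on the chip's parallel processing elements:

import numpy as np

FRAME_PERIOD_S = 1.28e-3   # CPV frame period reported in the paper
H = W = 128                # size of the photodetector / processing-element arrays

def centroid(frame: np.ndarray) -> tuple:
    """Intensity-weighted centroid (x, y) of an 8-level (0-7) grayscale frame."""
    ys, xs = np.mgrid[0:H, 0:W]
    total = frame.sum()
    if total == 0:
        return (W / 2.0, H / 2.0)              # no bright pixels: fall back to the image center
    return ((xs * frame).sum() / total, (ys * frame).sum() / total)

# Example: one synthetic frame containing a bright blob, processed once per frame period.
frame = np.zeros((H, W), dtype=np.uint8)
frame[60:70, 40:50] = 7                        # 8 grayscale levels means pixel values 0..7
cx, cy = centroid(frame)
print(f"centroid = ({cx:.1f}, {cy:.1f}) px every {FRAME_PERIOD_S * 1e3:.2f} ms")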

4.2. System components

Our prototype MVF system is shown in Fig. 3; a block diagram of the system is shown in Fig. 4. Following is an explanation of the system's major components.

a. Movable stage with linear actuators

We used an SMC LAL00-X070 XY translation stage as the actuator for controlling subjects in the micro world; specifications for this device are given in Table 1. Figures 5 and 6 show the measured frequency responses, with the torque command as input, for the X and Y axes, respectively. The XY stage offers sufficient stroke, rapid response, and accurate positioning.

Fig. 2. Column-parallel vision system.

Fig. 3. Prototype MVF system.

Fig. 4. MVF system block diagram.

Table 1. XY stage specifications

                     X axis    Y axis
Stroke               25 mm     25 mm
Movable mass         1.25 kg   0.25 kg
Rated current        1.2 A     1.2 A
Thrust constant      14        7.8
Encoder resolution   1 µm      1 µm


b. Parallel processing DSP control system

To effect high-speed control using the high-speed vision system and actuator, we used a Texas Instruments TMS320C40 parallel-processing DSP system. This system executes multiple processes in parallel and supports asynchronous communication between DSPs, making it suitable for submillisecond real-time control with fast I/O. In our prototype system, we used three DSPs for system control.

c. Microscope system

We used an Olympus BX50WI, an upright microscope with a fixed stage. This model permits observation with both transmitted and reflected light sources and has variable magnification from 50× to 3750×. It comes with an attached CCD camera that can be used as an input source for the high-speed vision system.

5. Implementation

We conducted two sets of tests to determine whether our proposed system would be effective in controlling subjects in the micro world: a test to control a moving subject, and an applied operation test.

5.1. Test to control moving subjects: vibration control of a beam supported on one end

Because visual sensors have almost never been used in high-speed control systems, we had no way of predicting whether quantization errors in visual feature values, among other factors, would affect control when measuring and controlling a moving subject with a high-speed vision system, as in the MVF. We therefore tested whether our MVF system would achieve sufficient linearity; specifically, whether the high-speed vision system would have any problems serving as the control sensor for a subject moving at high speed.

In our experiment, we attempted to control the vibration of a beam supported on one end, a typical vibrating structure in the micromachine field. Observing the vibrating beam with a high-speed vision system connected to a microscope, we attempted to suppress the vibration of the beam using that visual information.

The beam was affixed to an actuator at one end by means of a jig; the actuator could be used to control that end's position. The beam is shown in Fig. 7. The high-speed vision system observes the beam through a 5× microscope, 35 mm from the beam's fixed end, and controls it based on visual information about the beam's vibration in one dimension. The beam's displacement is determined from the image centroid and the actuator's position. Because control is exerted only on the primary horizontal vibration, the actuator uses only one degree of freedom. The beam is a cylinder of soft copper (Young's modulus 118 GPa, density 8.89 Mg/m³), 79 mm long and 260 µm in diameter. From the theory of elastic bodies, the primary vibration of the beam has a natural frequency of 21.1 Hz.
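As a consistency check (our own calculation, not given in the paper), the Euler–Bernoulli formula for a cantilever of circular cross section, with I = \pi d^4/64 and A = \pi d^2/4 so that \sqrt{I/A} = d/4, gives

    f_1 = \frac{(\beta_1 L)^2}{2\pi L^2}\sqrt{\frac{EI}{\rho A}}
        = \frac{1.875^2}{2\pi}\,\frac{d}{4L^2}\sqrt{\frac{E}{\rho}}
        \approx \frac{1.875^2}{2\pi}\cdot\frac{260\times10^{-6}}{4\,(0.079)^2}\,\sqrt{\frac{118\times10^{9}}{8890}}
        \approx 21\ \mathrm{Hz},

in agreement with the 21.1 Hz value quoted above.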

Fig. 5. X-axis frequency response.

Fig. 6. Y-axis frequency response.



To design the controller, we modeled the elements of the MVF system as well as the control subject. The high-speed vision system was treated as an ideal sensor that reads the target's position without affecting it. The actuator was modeled as a second-order system, with its dynamics parameters identified from the measured frequency response. Since we were controlling only the beam's primary vibration mode, the beam was likewise modeled as a second-order system, whose parameters were identified from measurements of its free vibration.

For the controller, we designed a servo controller based on the Smith–Davison servo design approach [12] and combined it with an observer. The combination of actuator and beam was taken as the plant to be controlled, and the control period was set to 1.4 ms based on the I/O response speed, among other factors.
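The paper does not give the controller equations. The sketch below shows only the generic structure of a discrete-time observer combined with an integral-action servo of the kind the Smith–Davison approach produces, running at the 1.4 ms control period; the plant matrices and the gains K, KI, and L_OBS are placeholders for illustration, not the identified or designed values.

import numpy as np

DT = 1.4e-3                       # control period from the paper (1.4 ms)

# Placeholder discrete-time plant x[k+1] = A x[k] + B u[k], y[k] = C x[k]
# (a generic second-order model; the identified values are not given in the paper).
A = np.array([[1.0, DT],
              [-0.02, 0.99]])
B = np.array([[0.0],
              [DT]])
C = np.array([[1.0, 0.0]])

K = np.array([[8.0, 1.5]])        # state-feedback gain (placeholder)
KI = 40.0                         # integral (servo) gain (placeholder)
L_OBS = np.array([[0.6],
                  [4.0]])         # observer gain (placeholder)

x_hat = np.zeros((2, 1))          # observer state estimate
z = 0.0                           # integral of the tracking error
r = 0.0                           # reference: regulate the beam displacement to zero

def control_step(y: float) -> float:
    """One 1.4 ms control cycle: update the observer and integrator, return the command."""
    global x_hat, z
    y_hat = (C @ x_hat).item()                       # displacement predicted by the observer
    u = -(K @ x_hat).item() - KI * z                 # servo law: state feedback plus integral action
    x_hat = A @ x_hat + B * u + L_OBS * (y - y_hat)  # observer update with the vision measurement
    z += DT * (y - r)                                # accumulate the tracking error
    return u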

We conducted an experiment to control and stop the vibration of the single-end supported beam using information from the high-speed vision system. The beam was initially set vibrating by external disturbances, and from this free vibration we estimated its dynamics parameters. We then brought the controller online and attempted to suppress the vibration using the actuator. In this experiment, because of physical constraints, we used a v4.2 vision chip with 24 × 24 resolution [13] in combination with the controller; within the scope of this experiment, this vision chip can be regarded as a lower-resolution equivalent of the CPV system.

The path taken by the beam in this experiment is plotted in Fig. 9. In this chart, the section before time t = 0 was used to identify the dynamics parameters; beginning at t = 0, the control unit was brought online and used the identified parameters to exert control over the vibrations. As the figure makes clear, vibrations in the subject were controlled and stopped. Figure 10 shows a time series of photos taken by a CCD camera used as a monitor.
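The paper does not describe how the dynamics parameters were identified from the pre-t = 0 free vibration. One simple possibility, shown purely as an illustrative assumption rather than the authors' method, is to take the dominant FFT peak of the measured displacement record:

import numpy as np

def estimate_primary_frequency(displacement: np.ndarray, sample_period: float) -> float:
    """Estimate the dominant vibration frequency (Hz) from a free-vibration record."""
    x = displacement - displacement.mean()                # remove the DC offset
    spectrum = np.abs(np.fft.rfft(x * np.hanning(len(x))))
    freqs = np.fft.rfftfreq(len(x), d=sample_period)
    return float(freqs[np.argmax(spectrum[1:]) + 1])      # skip the zero-frequency bin

# Example with a synthetic decaying 21.2 Hz oscillation sampled every 1.4 ms.
t = np.arange(0.0, 4.0, 1.4e-3)
record = np.exp(-0.5 * t) * np.sin(2.0 * np.pi * 21.2 * t)
print(f"estimated primary frequency: {estimate_primary_frequency(record, 1.4e-3):.2f} Hz")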

We repeated the estimation/control experiment five times. Across these runs, the estimated primary vibration frequency of the beam was 21.20 Hz, with a minimum of 21.19 Hz and a maximum of 21.22 Hz. Since the beam's theoretical primary vibration frequency is 21.1 Hz, the MVF system can estimate the frequency to about three significant digits of accuracy. To assess the vibration controller built from these estimates, we measured its step response. Figure 11 shows the theoretical response (as determined by pole assignment) together with the measured response. The rising edge of the measured response was roughly the same as in theory, but some vibration remained; we attribute this to the effect of second- and higher-order vibration modes that we ignored in our model. We believe that control performance can be brought closer to the design by working out these model errors.

Fig. 7. The experimental subject: a beam supported on one end.

Fig. 8. Vibration control block diagram.

Fig. 9. Beam trajectory.



These results show that information from the high-speed vision system can be used to effectively control vibrations in a single-end supported beam. The MVF system we proposed and constructed showed sufficient linearity overall, and the high-speed vision system was shown to be appropriate as a sensor for controlling rapidly moving subjects.

5.2. Application testing: paramecium tracking

Our next step was to test whether the MVF could be used in practical applications to control a microscopic subject. The fields where we foresee the MVF system having the most likely applications are semiconductor inspection, micromachine control, and biotechnology, so for this experiment we chose an application in biotechnology, a field of increasing importance.

Visual inspection of subjects under optical microscopes is an important tool in bioscience. But because microorganisms can quickly move out of the visual field under a microscope, they are difficult to observe in a natural state for any length of time. A bacterium, for example, can move a distance of 50 times its own length in 1 second [14]; if it is observed at a magnification such that it only just fits within the field of view, it can leave the field of view in as little as 1/50 of a second. Because this rapid motion exceeds the limits of conventional CCD-based vision systems running at standard video rates (30 Hz for NTSC), it demands a high-speed visual feedback system. In our experiment, we attempted to track a moving organism so as to keep it within the field of view and measure it.
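To make the numbers concrete (our own arithmetic, using the figures quoted above): an organism of body length \ell that swims at 50 body lengths per second crosses a field of view roughly one body length wide in

    t_{\mathrm{cross}} \approx \frac{\ell}{50\,\ell/\mathrm{s}} = 20\ \mathrm{ms},

which is shorter than the 33 ms frame period of a 30 Hz NTSC camera, so the subject can leave the field of view between two consecutive frames.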

Fig. 10. Time-series CCD photographs of the beam during the experiment.

Fig. 11. Beam step-response.

Fig. 12. Subject tracking block diagram.

Fig. 13. Time-series CCD photographs of the paramecium during the experiment.


Because the MVF system cannot measure depth, it cannot handle bacteria that move freely in all directions. We dealt with this in our experiment by using a larger microorganism, the paramecium, and preparing the slide so that the paramecium could be tracked in focus in the imaging plane and kept in the field of view. By quickly taking visual measurements of the paramecium's position and driving the actuator, we kept the paramecium centered in the field of view. Given our test equipment and imaging needs, we processed only the central 64 × 64-pixel region of the image (roughly 400 µm on a side) and measured the subject's position relative to the image center. Control was by a position-based visual servo; Fig. 12 shows a block diagram of the control system. The microscope's magnification was set to 20×.
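The control law itself is not spelled out in the paper. The following is a minimal sketch of a position-based visual servo of the kind described, in which the measured centroid offset from the image center is converted to micrometers and fed to the XY stage; the pixel pitch follows from the 400 µm / 64-pixel region quoted above, while the proportional gain is an assumed value for illustration only.

import numpy as np

UM_PER_PIXEL = 400.0 / 64.0        # the processed 64 x 64 px region spans roughly 400 um (from the paper)
GAIN = 0.5                         # proportional gain, an assumed value for illustration
CENTER_PX = np.array([32.0, 32.0]) # center of the processed region, in pixels

def tracking_step(centroid_px: np.ndarray, stage_pos_um: np.ndarray) -> np.ndarray:
    """One visual-servo cycle: return the next XY stage position command (in um)
    that moves the subject's image back toward the center of the field of view."""
    error_um = (centroid_px - CENTER_PX) * UM_PER_PIXEL   # subject offset from center, in um
    return stage_pos_um + GAIN * error_um                 # move the stage to cancel the offset

# Example: the paramecium is imaged 10 px right of and 4 px above the center.
command = tracking_step(np.array([42.0, 28.0]), np.array([0.0, 0.0]))
print(command)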

Figure 13 shows a time series of photos taken while tracking the paramecium; Figs. 14 and 15 show the paramecium's track as calculated from the actuator's position and the high-speed vision data; Fig. 16 shows the paramecium as captured by the high-speed vision system. The subject track shown in Fig. 14 corresponds to the series of photos in Fig. 13. Because paramecia have asymmetric shapes, they characteristically move in one direction while turning helically; this is why the track in Fig. 15 meanders.

Based on these results, we found that the MVF system is effective at tracking moving subjects and keeping them within the field of view, and should be able to manipulate and control microorganisms in the future. The MVF system thus has practical applications and can control microscopic subjects. The paramecium could move at five times its own body length (about 200 µm) per second, meaning it can traverse the field of view in roughly 0.4 second. While the MVF system would not always be necessary for paramecium subjects, it would be for faster-moving bacteria.

6. Analysis

To control subjects with the MVF system, the system must be fast enough to measure the subject's motion, and the subject must be larger than the optical subsystem's resolution. In our experiments, measurements were taken at a frame rate of roughly 780 Hz, which by the sampling theorem gives a usable bandwidth of 390 Hz. The optical subsystem had a resolution of roughly 1 µm. The required conditions for a subject are therefore a maximum response frequency no greater than 390 Hz and a size no smaller than 1 µm. For one-dimensional vibration, a bandwidth of 390 Hz corresponds to the primary resonance frequency of a beam at roughly 1/20 the scale of the single-end supported beam used in our vibration-control test.
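In equation form (a restatement of the figures above, not additional data):

    f_{\max} = \frac{f_{\mathrm{frame}}}{2} = \frac{780\ \mathrm{Hz}}{2} = 390\ \mathrm{Hz},
    \qquad
    \frac{f_{\max}}{f_1} = \frac{390\ \mathrm{Hz}}{21.1\ \mathrm{Hz}} \approx 18,

so, with the \omega \propto L^{-1} scaling of Section 2, a 390 Hz bandwidth corresponds to a cantilever at roughly 1/20 the scale of the beam used in Section 5.1.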

Fig. 14. Trajectory of paramecium No. 1.

Fig. 15. Trajectory of paramecium No. 2.

Fig. 16. CPV image of paramecium.


Considering feedback control of MEMS actuators using the MVF system, almost all electrostatic micromotors, which are up to about 100 µm in size and rotate at up to 500 revolutions per second [15], could be controlled. Conversely, an electrostatic linear actuator with a size of about 1 mm and a resonant frequency of several thousand hertz could not be controlled by the MVF system. Looking at the actuators studied in the MEMS field overall, more than half could be controlled using the MVF system; for almost all of those that could not, the reason is that their maximum response frequencies are too high. In other words, the MVF system's performance is determined by the high-speed vision system's frame rate.

7. Conclusion

In this paper, we have explained how high-speed vision systems can be used to measure and control subjects in the micro world, put forward a proposal for a microscopic visual feedback (MVF) system, and described the components that would constitute such a system along with the specifications they would need. We assembled a system based on these specifications and conducted tests to establish its basic control performance and its use in a practical application.

Because the high-speed vision system directly observes only a two-dimensional image, the present system is constrained to capturing two dimensions of subject information. Capturing three dimensions will be important in real-world microscopic applications, making research into a three-dimensional successor a high priority.

Acknowledgments. We acknowledge the invaluable assistance we received from Professor Kamiya of the Laboratory of Molecular Physiology in the Department of Biological Science, Graduate School of Science, University of Tokyo, along with the help from the others in that laboratory.

REFERENCES

1. Komuro T, Kagami S, Ishii I, Ishikawa M. Device and system development of general purpose digital vision chip. J Robotics Mechatron 2000;12:515.

2. Nakabo Y, Ishii I, Ishikawa M. High speed target tracking using 1 ms visual feedback system. Video Proc IEEE Int Conf on Robotics and Automation / Video Proc Abstract, p 6, 1996.

3. Namiki A, Nakabo Y, Ishii I, Ishikawa M. High speed grasping using visual and force feedback. Proc IEEE Int Conf on Robotics and Automation, p 3195, 1999.

4. Harajima F, Ezashi S, Fujita H (editors). Intelligent motion systems. Nikkan Kogyo Shinbunsha; 1991.

5. Trimmer W. Microrobots and micromechanical systems. Sensors Actuators 1989;19:267–287.

6. Tokyo Clinical Medicine General Labs, Test Animal Research Dept (editor). Mouse lab manual. Springer-Verlag; 1998.

7. Nelson B, Zhou Y, Vikramaditya B. Sensor-based microassembly of hybrid MEMS devices. IEEE Control Syst Mag 1998;18(6):35–45.

8. Zhou Y, Nelson B, Vikramaditya B. Fusing force and vision feedback for micromanipulation. Proc IEEE Int Conf on Robotics and Automation 1998;2:1220–1225.

9. Sano T, Nagahata H, Yamamoto H. Automatic micromanipulation system using stereoscopic microscope. Proc IEEE Instrumentation and Measurement Technology Conference 1999;1:327–331.

10. Feddema JT, Simon RW. Visual servoing and CAD-driven microassembly. IEEE Robotics Automation Mag 1998;5(4):18–24.

11. Nakabo Y, Ishikawa M, Toyoda H, Mizuno S. One-ms column parallel vision system and its application to high-speed target tracking. Proc IEEE Int Conf on Robotics and Automation, p 650–655, 2000.

12. Smith H, Davison E. Design of industrial regulators.Proc IEE 1972;119:1210–1216.

13. Ogawa I, Komuro T, Ishii I, Ishikawa M. A digital vision chip with increased integration, based on the S3PE architecture. IEICE Tech Rep, ICD99-4, 1999.

14. Berg H. How to track bacteria. Rev Sci Instrum 1971;42:868–871.

15. Guckel H, Christenson T, Skrobis K, Jung T, Klein J, Hartojo K, Widjaja I. A first functional current excited planar rotational magnetic micromotor. Proc IEEE Workshop on Micro Electro Mechanical Systems, p 7–11, 1993.

16. Fan L, Lane L, Robertson N, Crawforth L, Moser M, Reiley T, Imaino W. Batch-fabricated milli-actuators. Proc IEEE Workshop on Micro Electro Mechanical Systems, p 179–183, 1993.


AUTHORS (from left to right)

Hiromasa Oku graduated from the Department of Physics, University of Tokyo, in 1998, completed his master's and Ph.D. programs in 2000 and 2003, and became a research fellow at PRESTO, Japan Science and Technology Agency. He is interested in micromanipulation using high-speed vision systems.

Idaku Ishii graduated from the Department of Mathematical Engineering and Information Physics, University of Tokyo, in 1992, completed his master's program in 1994, and enrolled in the Mathematical Engineering and Information Physics Ph.D. program. He became an instructor at Tokyo University of Agriculture and Technology in 2000, and joined Hiroshima University as an associate professor in the Department of Artificial Complex Systems Engineering in 2003. His research interests are vision chips, parallel sensor information processing, and robotic control systems. He received a paper award from the Japan Robotics Society in 1998 and special recognition for IP excellence in the 2nd LSI IP Design Awards in 1999. He holds a D.Eng. degree.

Masatoshi Ishikawa (member) graduated from the Department of Mathematical Engineering and Information Physics, University of Tokyo, in 1977, completed his master's program in 1979, and entered MITI's Industrial Product Research Institute. In 1988, he joined the University of Tokyo as an associate professor of engineering/mathematical engineering and information physics. He became a professor and is currently posted to the Graduate School of Information Science and Technology. His research interests include circuit models of organic information-processing structures, massively parallel high-speed vision, optical computing, and sensor fusion. He received a paper award from the Instrumentation and Automatic Control Society in 1983; an Applied Physical Science Optics paper award in 1988; the Advanced Automation Technology Promotion Prize in 1998; a paper award from the Japan Robotics Society in 1998; an achievement award from the Japan Society of Mechanical Engineers, Robotics and Mechatronics division, in 1999; the Kenjiro Sakurai Memorial Award in 1999; and an award for IP excellence in the LSI IP Design Awards in 2000. He holds a D.Eng. degree.
