Fourth International Conference I.TECH 2006

Figure 12. Image sensors applications in medicine

4.1 Wireless Capsule Endoscopy

Conventional medical instrumentation for gastrointestinal tract observation and surgery uses an endoscope that is inserted externally. These systems are well developed and provide a good solution for intra-body observation and surgery. However, the small intestine (bowel) was almost unreachable with this conventional equipment, leaving it observable only through surgery or through an inconvenient and sometimes painful push-endoscopy procedure. A few years ago the field was revolutionized by the invention of the wireless image sensor capsule, which, after being swallowed, continuously transmits a video signal during its travel inside the body. The capsule movement is ensured by natural peristalsis. According to Gavriel Iddan, the founder of Given Imaging™, which commercializes this technology, “The design of the video capsule was made possible by progress in the performance of three technologies: complementary metal oxide silicon (CMOS) image sensors, application-specific integrated circuit (ASIC) devices, and white-light emitting diode (LED) illumination”.
The general architecture of the capsule is shown in Figure 13. It consists of LEDs, optics, a camera, a digital processing system, a transmitter or transceiver, and a power source. The dashed blocks represent additional future requirements for such capsules.
All capsule electronic components are required to consume little power, so that constant video transmission for a prolonged time (about 6-8 hours) is possible without resorting to very high-capacity batteries. An alternative to in-capsule batteries is an external wireless power source that supplies energy to the capsule through electromagnetic coils. Such a solution relaxes the power requirements for the capsule electronics. It also frees space inside the capsule for other useful functions, such as biopsy or medication delivery. In addition, the capsule position can be controlled externally through a strong magnetic field; however, the required strong magnetic field can limit the capsule usage despite the position-control advantages.
Figure 13. The swallowable capsule architecture.
Additional functionality that will be required in the future is shown in the dashed boxes. Currently, the Given Imaging™ capsule developers have achieved very encouraging results with two capsules:
one intended for the esophagus (the upper part of the gastrointestinal tract) and the second for small intestine observation. The first kind of capsule is equipped with two CMOS image sensors and can transmit the video signal for about 20 minutes at 14 frames per second from each camera. The second contains only one CMOS image sensor and can transmit two frames per second for about eight hours. The company is now developing a new capsule generation that can transmit four frames per second.
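The transmission figures above imply very different total frame counts for the two capsule types. As a rough sanity check, a minimal sketch (numbers taken from the text; the function name is illustrative):

```python
def total_frames(fps_per_camera: int, cameras: int, duration_s: int) -> int:
    """Total frames transmitted over the capsule's operating time."""
    return fps_per_camera * cameras * duration_s

# Esophageal capsule: two sensors at 14 frames/s each for ~20 minutes
esophageal = total_frames(14, 2, 20 * 60)

# Small-intestine capsule: one sensor at 2 frames/s for ~8 hours
small_bowel = total_frames(2, 1, 8 * 3600)

print(esophageal, small_bowel)  # 33600 57600
```

So the slow small-bowel capsule actually transmits more frames overall than the fast esophageal one, which illustrates why battery capacity, rather than frame rate alone, dominates the design trade-off.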
Despite these encouraging results, much work remains to allow further miniaturization, integration of image processing and compression algorithms, power reduction by various means (system integration, technology scaling, etc.), frame-rate increase, quality improvement, and usage of alternative power sources with larger capacity. The ultimate goal is full video frame-rate transmission for about 7-8 hours. To achieve these goals, a number of additional research groups worldwide work on wireless capsule development:
eStool by the University of Calgary in Canada, MiRO by the Intelligent Microsystems Center in Korea, and EndoPill by Olympus.
4.2 Artificial Retina

Artificial vision is another example of CMOS image sensor implementation in medical applications. Today millions of people suffer from full or partial blindness caused by various retinal diseases. In the early eighties it was shown that electrical stimulation of the retinal nerves can evoke visual sensation even in patients with fully degraded receptors. Recently, researchers in a number of research institutes have developed miniature devices that can be implanted into the eye and stimulate the remaining retinal neural cells, returning partial vision to blind patients. Such implants are called artificial retinas. Usually they are implanted in the macular area, which is normally densely populated with receptors and enables high-resolution vision. This breakthrough was enabled by progress in electronics, surgical instrumentation, and biocompatible materials. Currently there are two major approaches to artificial retina development (see Figure 14).
Figure 14. Artificial retinas: (a) implantable sensor; (b) external sensor

The first and most promising approach is the integration of sensing and stimulation elements in the same device; the second is the separation of sensing and stimulation. In the first approach, the artificial retina device is an autonomous circuit that does not require external control, and the optics used for sensing is the natural optics of the eye, composed of the cornea and lens. In the second, all sensing and processing is performed outside the eye, and only the stimulating elements are implanted during surgery. The data transfer from the sensing part to the stimulation part is performed through an RF link or through a tiny cable.
Currently, two groups have shown very promising results and are performing clinical trials and commercialization through companies named Optobionics™ and Second Sight. Both groups already have a number of patients with such implants.
The device developed by the Optobionics™ group does not require any power source, integrates about 5000 sensing elements (microphotodiodes) and stimulation elements (electrodes), measures two millimetres in diameter, and is implanted under the retina. The basic artificial silicon retina unit is shown in Figure 15. It is composed of a stimulating electrode and three PIN photodiodes connected in series to increase the output voltage.
Figure 15. Artificial silicon retina – basic unit

The second group decided to follow the second approach and separate sensing from stimulation. The camera with the processor is mounted on the patient's glasses, and the signal is transmitted through a cable to a stimulator implanted in the eye. Currently, the implant resolution is modest compared to the first group's, featuring only 16 electrodes, but the developers plan to increase the resolution in future models to 60 and then 1000 electrodes.
5. Conclusions

In this paper we have presented a brief review of CMOS image sensor utilization in security and medical applications. In these applications image sensors play a major role and usually define the edge of imaging technology. Although CMOS image sensor technology has existed for more than a decade, it is continuously developing and penetrating new fields that were unreachable by its predecessor, CCD technology. Although many successes have been achieved during the last decade, much work still needs to be done in this area. It requires extensive collaboration between various fields, such as electrical engineering, materials science, computer science, medicine, psychology, and chemistry. In the electrical engineering and sensing fields, the work should concentrate on power consumption reduction, functionality improvement, and system integration. However, as in every multidisciplinary field, electrical engineers developing electronic devices for medical purposes must understand all the above-mentioned fields to implement such devices successfully.
References

G. Iddan, G. Meron, A. Glukhovsky, P. Swain, “Wireless capsule endoscopy”, Nature, Vol. 405, p. 417, May 2000.
E. Fossum, "Low power camera-on-a-chip using CMOS Active Pixel Sensor technology", IEEE Symposium on Low Power Electronics, pp. 74-77, 1995.
 O. Yadid-Pecht and R. Etienne-Cummings, "CMOS imagers: from phototransduction to image processing", Kluwer Academic Publishers, 2004.
A. Fish, D. Turchin, O. Yadid-Pecht, "An APS with 2-Dimensional winner-take-all selection employing adaptive spatial filtering and false alarm reduction", IEEE Trans. on Electron Devices, Special Issue on Image Sensors, January 2003.
V. Brajovic and T. Kanade, “Computational sensor for visual tracking with attention”, IEEE Journal of Solid-State Circuits, Vol. 33, No. 8, August 1998.
 T. Horiuchi and E. Niebur, “Conjunction search using a 1-D, analog VLSI-based attentional search/tracking chip,” Conference for Advanced Research in VLSI, D. Scott Wills and Stephen P. DeWeerth, Eds., pp. 276–290. IEEE Computer Society, 1999.
G. Indiveri, “Neuromorphic analog VLSI sensor for visual tracking: circuits and application examples”, IEEE Trans. on Circuits and Systems II, Vol. 46, No. 11, pp. 1337-1347, November 1999.
C. S. Wilson, T. G. Morris, and P. DeWeerth, “A two-dimensional, object-based analog VLSI visual attention system”, Twentieth Anniversary Conference on Advanced Research in VLSI, IEEE Computer Society Press: Los Alamitos, CA, Vol. 20, pp. 291-308, March 1999.
M. Clapp and R. Etienne-Cummings, “A dual pixel-type imager for imaging and motion centroid localization”, Proc. ISCAS'01, Sydney, Australia, May 2001.
N. Mei Yu, T. Shibata and T. Ohmi, “A Real-Time Center-of-Mass Tracker Circuit Implemented by Neuron MOS Technology”, IEEE Transactions on Circuits and Systems II, Vol. 45, No. 4, April 1998.
R. C. Meitzler, K. Strohbehn and A. G. Andreou, “A silicon retina for 2-D position and motion computation”, Proc. ISCAS'95, New York, USA, 1995.
 A. Simoni, G. Torelli, F. Maloberti, A. Sartori, S. E. Plevridis and A. N. Birbas, “A Single-Chip Optical Sensor with Analog Memory for Motion Detection”, IEEE Journal of Solid-State Circuits, Vol. 30, No. 7, July 1995.
M. Clapp and R. Etienne-Cummings, “Dual Pixel Array for Imaging, Motion Detection and Centroid Tracking”, IEEE Sensors Journal, Vol. 2, No. 6, pp. 529-548, December 2002.
S. Kawahito, M. Yoshida, M. Sasaki, K. Umehara, D. Miyazaki, Y. Tadokoro, K. Murata, S. Doushou, and A. Matsuzawa, “A CMOS Image Sensor with Analog Two-Dimensional DCT-Based Compression Circuits for One-Chip Cameras”, IEEE Journal of Solid-State Circuits, Vol. 32, No. 12, 1997.
K. Aizawa, H. Ohno, Y. Egi, T. Hamamoto, M. Hatory, H. Maruyama and J. Yamazaki, “On Sensor Image Compression”, IEEE Transactions on Circuits and Systems for Video Technology, Vol. 7, No. 3, pp. 543-548, June 1997.
O. Yadid-Pecht, A. Belenky, "In-Pixel Autoexposure CMOS APS", IEEE Journal of Solid-State Circuits, Vol. 38, No. 8, pp. 1425-1428, August 2003.
A. Fish, A. Belenky and O. Yadid-Pecht, “Wide Dynamic Range Snapshot APS for Ultra Low-Power Applications”, IEEE Transactions on Circuits and Systems II, Vol. 52, No. 11, pp. 729-733, November 2005.
O. Yadid-Pecht, C. Clark, B. Pain, C. Staller, E. Fossum, “Wide dynamic range APS star tracker”, Proc. SPIE/IS&T Symposium on Electronic Imaging: Science and Technology, San Jose, California, SPIE Vol. 2654, Jan. 29-Feb. 3, 1996, pp. 82-92.
O. Yadid-Pecht, E. Fossum, "Wide Intrascene Dynamic Range CMOS APS Using Dual Sampling", IEEE Trans. Electron Devices, special issue on solid state image sensors, Vol. 44, No. 10, pp. 1721-1724, October 1997.
O. Yadid-Pecht, "Wide dynamic range sensors", Optical Engineering, Vol. 38, No. 10, pp. 1650-1660, October 1999.
 K. Buckley, "Selecting an Analog Front-End for Imaging Applications", Analog Dialogue, vol. 34-6, pp. 1-5, 2000.
 D.X.D. Yang, A. El Gamal, B. Fowler and H. Tian, “A 640x512 CMOS Image Sensor with Ultra Wide Dynamic Range Floating Point Pixel Level ADC,” IEEE ISSCC, WA 17.5, 1999.
 B. Pain, S. Mendis, R. Scober, R. Nixon, and E. Fossum, "Low-power low-noise analog circuits for on-focal-plane signal processing of infrared sensors," IEEE Workshop on Charge Coupled Devices and Advanced Image Sensors, June, 1995.
A. Dickinson, S. Mendis, D. Inglis, K. Azadet, and E. Fossum, "CMOS Digital Camera with Parallel Analog-to-Digital Conversion Architecture", IEEE Workshop on Charge Coupled Devices and Advanced Image Sensors, April 1995.
A. Krymski and N. Tu, "A 9-V/Lux-s 5000-Frames/s 512x512 CMOS Sensor", IEEE Trans. Electron Devices, Vol. 50, pp. 136-143, Jan. 2003.
S. Smith, J. Hurwitz, M. Torrie, D. Baxter, A. Holmes, M. Panaghiston, R. Henderson, A. Murray, S. Anderson, and P. Denyer, "A single-chip 306x244-pixel CMOS NTSC video camera", ISSCC Digest of Technical Papers, pp. 170-171, February 1998.
M. Loinaz, K. Singh, A. Blanksby, D. Inglis, K. Azadet, and B. Acland, "A 200mW 3.3V CMOS color camera IC producing 352x288 24b Video at 30 frames/s", ISSCC Digest of Technical Papers, pp. 186-169, February 1998.
 G. L. Foresti, C. Micheloni, L. Snidaro, P. Remagnino, and T. Ellis, “Active video-based surveillance system”, IEEE Signal Processing Magazine, pp. 25-37, March 2005.
E. Artyomov, Y. Rivenson, G. Levi, O. Yadid-Pecht, “Morton (Z) Scan Based Real-Time Variable Resolution CMOS Image Sensor”, IEEE Transactions on Circuits and Systems for Video Technology, Vol. 15, No. 7, pp. 947-952, July 2005.
 A. Jain, L. Hong, S. Pankanti, “Biometric identification”, Communications of the ACM, Vol. 43, No. 2, 2000.
 K. Uchida, “Fingerprint identification”, NEC Journal of Advanced Technology, Vol. 2, No. 1, pp. 19-27, 2005.
S. J. Kim, K. H. Lee, S. W. Han, and E. Yoon, "A 200x160 Pixel CMOS Fingerprint Recognition SoC with Adaptable Column-Parallel Processors", IEEE International Solid-State Circuits Conference (ISSCC), pp. 250-251, February 2005.
“Robotics road map”, EURON Technology Roadmaps, April 23, 2004, www.org.id.tue.nl/IFIP-SG16/robotics-roadmap-2004.pdf
S. Shigematsu, H. Morimura, Y. Tanabe, T. Adachi, and K. Machida, “A Single-Chip Fingerprint Sensor and Identifier”, IEEE Journal of Solid-State Circuits, Vol. 34, No. 12, pp. 1852-1859, December 1999.
www.givenimaging.com
www.rfnorika.com
K. N. C. Hin, O. Yadid-Pecht, M. Mintchev, “e-Stool: Self-Stabilizing Capsule for Colonic Imaging”, Neuro-stimulation Conference, France, July 2005.