Endoscopy is an imaging procedure used for diagnosis as well as for some surgical purposes. The camera used for endoscopy should be small and able to produce good-quality images or video, both to reduce patient discomfort and to increase the efficiency of the medical team. To meet these goals, a small endoscopy camera with a footprint of 1 mm × 1 mm × 1.65 mm is used. Owing to the physical properties of the sensor and the limitations of the human visual system, several image-processing algorithms, such as noise reduction, demosaicking, and gamma correction, among others, are needed to faithfully reproduce the image or video. A full image-processing pipeline is implemented on a field-programmable gate array (FPGA) to achieve a high frame rate of 60 fps with minimal processing delay. In addition, a viewer has been developed to display the video and control the image-processing pipeline. Control and data transfer are handled by a USB 3.0 endpoint on the computer. The complete system achieves real-time image processing and fits in a Xilinx Spartan-6 LX150 FPGA.
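As a minimal sketch of one stage of such a pipeline, the snippet below builds an 8-bit gamma-correction lookup table, the usual way gamma correction is realized in FPGA pipelines (the table maps directly to a block RAM). The gamma value of 2.2 and the LUT approach are illustrative assumptions, not details taken from the implemented design.

```python
def build_gamma_lut(gamma=2.2, bits=8):
    """Precompute the output code for every input code (LUT-friendly for FPGAs)."""
    levels = 2 ** bits
    return [round(((i / (levels - 1)) ** (1.0 / gamma)) * (levels - 1))
            for i in range(levels)]

def apply_gamma(pixels, lut):
    """Apply the precomputed gamma curve to a flat list of pixel codes."""
    return [lut[p] for p in pixels]

lut = build_gamma_lut()
```

Because the curve is precomputed, the per-pixel work reduces to a single table lookup, which is what makes a 60 fps real-time budget easy to meet in hardware.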
The CMOS image sensor manufacturer AWAIBA supplies the world's smallest digital camera modules for minimally invasive surgery and single-use endoscopic equipment. Building on the world's smallest digital camera head and its evaluation board, the aim of this paper is to demonstrate an advanced, fast-response dynamic control algorithm for the illumination LED source coupled to the camera head, acting through the LED drivers embedded on the evaluation board. Cost-efficient, small endoscopic camera modules nowadays embed minimal-size image sensors capable of adjusting not only gain and exposure time but also LED illumination with adjustable illumination power. The LED illumination power has to be adjusted dynamically while the endoscope is navigated through illumination conditions that change by several orders of magnitude within fractions of a second, to guarantee a smooth viewing experience. The algorithm is centered on pixel analysis of selected ROIs, enabling it to dynamically adjust the illumination intensity based on the measured pixel saturation level. The control core was developed in VHDL and tested in a laboratory environment under changing light conditions. The results show that it achieves correction speeds under 1 s while maintaining a static error below 3% relative to the total number of pixels in the image. This work will allow the integration of millimeter-sized high-brightness LED sources on minimal-form-factor cameras, enabling their use in endoscopic surgical robotics and minimally invasive surgery.
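The saturation-driven control idea can be sketched as a simple proportional loop: count the fraction of saturated pixels in the ROI and nudge the LED drive level toward a target fraction. The threshold, gain, target, and 8-bit drive range below are illustrative assumptions for a software model; they are not taken from the VHDL core.

```python
SAT_THRESHOLD = 250      # pixel codes at or above this count as saturated (assumed)
TARGET_FRACTION = 0.03   # aim for <3% saturated pixels, per the stated static error
GAIN = 400               # proportional gain (assumed)

def saturated_fraction(roi_pixels):
    """Fraction of ROI pixels at or above the saturation threshold."""
    sat = sum(1 for p in roi_pixels if p >= SAT_THRESHOLD)
    return sat / len(roi_pixels)

def update_led_level(level, roi_pixels):
    """One control step: reduce LED drive when too many pixels saturate,
    raise it when the scene is under-illuminated."""
    error = saturated_fraction(roi_pixels) - TARGET_FRACTION
    new_level = level - GAIN * error
    return max(0, min(255, round(new_level)))  # clamp to an 8-bit drive range
```

Running one such step per frame is what bounds the correction time: at 40-60 fps, even a large illumination jump converges within a handful of frames, well under the 1 s figure.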
Centered on Awaiba's NanEye CMOS image sensor family and an FPGA platform with a USB3 interface, the aim of this paper is to demonstrate a new technique to synchronize up to 8 individual self-timed cameras with minimal error. Small-form-factor self-timed camera modules of 1 mm × 1 mm or smaller do not normally allow external synchronization. However, stereo vision or 3D reconstruction with multiple cameras, as well as applications requiring pulsed illumination, require multiple cameras to be synchronized. In this work, the challenge of synchronizing multiple self-timed cameras over only a 4-wire interface has been solved by adaptively regulating the power supply of each camera. To that effect, a control core was created to constantly monitor the operating frequency of each camera by measuring the line period in each frame against a well-defined sampling signal. The frequency is adjusted by varying the voltage level applied to the sensor based on the error between the measured line period and the desired line period. To ensure phase synchronization between frames, a Master-Slave interface was implemented. A single camera is defined as the Master, with its operating frequency controlled directly through a PC-based interface. The remaining cameras are set up in Slave mode and are interfaced directly with the Master camera's control module. This enables them to monitor its line and frame periods and adjust their own to achieve phase and frequency synchronization. This work will allow the implementation of 3D stereo vision equipment of less than 3 mm diameter in a medical endoscopic context, such as endoscopic surgical robotics or minimally invasive surgery.
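One regulation step of the voltage-based frequency control described above can be sketched as follows. A self-timed sensor's internal oscillator speeds up with supply voltage, so a longer-than-target line period (camera running slow) calls for a higher supply. The voltage range, proportional gain, and period units below are illustrative assumptions, not figures from the implemented core.

```python
V_MIN, V_MAX = 1.6, 2.1   # assumed allowable sensor supply range, in volts
KP = 0.0005               # assumed gain: volts per reference-clock cycle of error

def adjust_supply(voltage, measured_period, target_period):
    """One control step: trim the sensor supply voltage proportionally to the
    line-period error. Periods are in reference-clock cycles; a positive error
    means the oscillator is slow, so the supply is raised to speed it up."""
    error = measured_period - target_period
    voltage += KP * error
    return max(V_MIN, min(V_MAX, voltage))  # clamp to the safe supply range
```

Running this loop per frame on each Slave, with the Master's measured line period as the target, is how frequency lock is maintained; phase lock then only requires aligning frame starts once.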
Today's standard procedure for the examination of the colon uses a digital endoscope located at the tip of a tube encasing wires for camera readout, fibers for illumination, and mechanical structures for steering and navigation. On the other hand, there are swallowable capsules incorporating a miniaturized camera, which are more cost-effective, disposable, and less unpleasant for the patient during examination, but which cannot be navigated along the path through the colon. We report on the development of a miniaturized endoscopic camera as part of a completely wireless capsule which can be safely and accurately navigated and controlled from the outside using an electromagnet. The endoscope is based on a global-shutter CMOS imager with 640 × 640 pixels and a pixel size of 3.6 µm, featuring through-silicon vias. Hence, the required electronic connectivity is made at its back side using a ball grid array, enabling the smallest lateral dimensions. The layout of the f/5 objective with a 100° diagonal field of view aims at low production cost and employs polymeric lenses produced by injection molding. Because at least one autoclaving cycle must be survived, high-temperature-resistant polymers were selected. Optical and mechanical design considerations are given along with experimental data obtained from realized demonstrators.
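A back-of-envelope check ties the stated imager and optics figures together. Assuming an ideal distortion-free (rectilinear) lens model, the sensor diagonal and the 100° diagonal field of view fix the focal length, and the f/5 aperture follows from it; these derived values are illustrative, not published design data.

```python
import math

PIXELS = 640          # square imager, pixels per side (from the abstract)
PITCH_MM = 3.6e-3     # 3.6 um pixel pitch, in mm
FOV_DIAG_DEG = 100.0  # diagonal field of view
F_NUMBER = 5.0        # f/5 objective

sensor_side_mm = PIXELS * PITCH_MM              # ~2.3 mm square active area
sensor_diag_mm = sensor_side_mm * math.sqrt(2)  # ~3.26 mm diagonal
# Rectilinear model: half-diagonal = f * tan(half-FOV)
focal_mm = (sensor_diag_mm / 2) / math.tan(math.radians(FOV_DIAG_DEG / 2))
aperture_mm = focal_mm / F_NUMBER               # entrance-pupil diameter
```

The resulting focal length of roughly 1.4 mm and sub-0.3 mm pupil illustrate why injection-molded polymer lenses are attractive at this scale.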
This paper presents a digital image sensor SoC with a total chip area (including dicing tolerances) of 0.34 mm² for endoscopic applications. This extremely small form factor enables integration into endoscopes, guide wires, and locator devices of less than 1 mm outer diameter. The sensor embeds a matrix of 10,000 pixels with a pitch of 3 µm × 3 µm, covered with RGB filters in a Bayer pattern. The sensor operates fully autonomously, controlled by an on-chip ring oscillator and a readout state machine that sequences integration, AD conversion, and data transmission; the sensor therefore requires only 4 pins for power supply and data communication. It provides a frame rate of 40 frames per second over an LVDS serial data link. The endoscopic application requires that the sensor work without any local power-decoupling capacitance at the end of up to 2 m of cabling and sustain data communication over the same wire length without deteriorating image quality. This has been achieved by implementing a current-mode successive-approximation ADC and current-steering LVDS data transmission. A bandgap circuit with -40 dB PSRR at the data frequency was implemented as the on-chip reference to improve robustness against power-supply ringing caused by the high series inductance of the long cables. The B&W version of the sensor provides a conversion gain of 30 DN/nJ/cm² at 550 nm with a read noise in dark of 1.2 DN when operated over a 2 m cable. Using the photon transfer method according to the EMVA1288 standard, the full-well capacity was determined to be 18 ke-. To our knowledge, the presented device is currently the world's smallest fully digital image sensor. The chip was designed along with an aspheric single-surface lens that mounts on the chip without increasing the form factor.
The extremely small form factor of the resulting camera permits visualization with much higher spatial resolution than the state of the art in sub-1 mm endoscopic applications, where so far only optical fiber bundles providing 1k-3k image points could be used. In many applications, such as guide wires and locator devices, the small form factor permits visualization to be implemented for the first time.
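The photon transfer (mean-variance) method named above, as standardized in EMVA 1288, can be sketched briefly: for a shot-noise-limited sensor, the temporal variance of the signal grows linearly with its mean, the slope of that line is the overall conversion gain K in DN/e-, and the full-well capacity is the saturation signal divided by K. The mean/variance data below are synthetic, chosen only to illustrate the fit.

```python
def conversion_gain(means_dn, variances_dn2):
    """Least-squares slope of variance vs. mean signal = conversion gain K (DN/e-)."""
    n = len(means_dn)
    mx = sum(means_dn) / n
    my = sum(variances_dn2) / n
    num = sum((x - mx) * (y - my) for x, y in zip(means_dn, variances_dn2))
    den = sum((x - mx) ** 2 for x in means_dn)
    return num / den

# Synthetic mean/variance pairs for a sensor with K = 0.05 DN/e-:
means = [100, 200, 400, 800]
variances = [m * 0.05 for m in means]   # variance = K * mean for pure shot noise
K = conversion_gain(means, variances)   # recovered gain, DN/e-
full_well_e = 900 / K                   # assumed 900 DN saturation -> full well in e-
```

The appeal of the method is that it needs no absolute light calibration: the gain falls out of the noise statistics alone.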
This paper presents a digital line-scan sensor in standard CMOS technology for high-resolution scanning applications in machine vision, mainly surface inspection of large panels and web materials. Owing to its unprecedented resolution, the sensor also has application potential in earth observation and motion-picture contexts. The sensor features 16384 charge-integrating pixels with a 3.5 µm × 3.5 µm photoactive area. Each pixel has its own charge-integrating transconductance amplifier circuit, a true correlated double sampling stage, a sample-and-hold stage, and a pixel-level 13-bit linear AD converter. Readout is performed over 16 parallel digital output taps operated at a 50 MHz pixel clock. At maximum speed the sensor generates a total data rate of 10.4 Gbit/s. To maximize the integration time, data readout, AD conversion, and integration can be performed simultaneously; therefore, even at the maximum line rate of 43 kScans/s, the integration time can be maintained at 20 µs. To accommodate application scenarios with very different lighting budgets, the sensor's full-well capacity can be programmed by means of a two-step programmable gain from 3000 e- to 40 ke-. Prototype characterization showed a total quantum efficiency of 72% at 625 nm. With the full-well capacity set to 26 ke-, the conversion gain was measured to be 0.13 DN/e- with a read noise in dark of 1.7 DN, or 12 e- dark-noise equivalent. Overall DSNU is reduced to 3 DN rms, independent of the conversion gain, by the on-chip combination of CDS and digital DSNU correction. PRNU was measured according to the EMVA1288 standard at 1.2% rms. The sensor is mounted on an Invar-reinforced COB board without a glass cover for reduced reflections in optical interface stacks. Instead of traditional package leads, SMD-mounted board-to-board connectors are used for the electrical connections.
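The throughput figures above can be reproduced from the tap count, pixel clock, and bit depth alone; the short check below is a sketch using only numbers stated in the abstract.

```python
TAPS = 16               # parallel digital output taps
PIXEL_CLOCK_HZ = 50e6   # per-tap pixel clock
BITS_PER_SAMPLE = 13    # pixel-level ADC resolution
PIXELS = 16384          # pixels per line

# Aggregate output data rate: taps x clock x bit depth -> 10.4 Gbit/s
data_rate_gbit_s = TAPS * PIXEL_CLOCK_HZ * BITS_PER_SAMPLE / 1e9

# Time to read out one line, with the pixels split evenly across the taps
line_readout_us = (PIXELS / TAPS) / PIXEL_CLOCK_HZ * 1e6   # ~20.5 us
```

The line readout time of about 20 µs also shows why the concurrent readout/integration scheme matters: without it, the 20 µs integration time at 43 kScans/s would not fit in the line period.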
CMOS image sensors brought a number of advantages compared to CCD image sensors. Selective readout (ROI), logarithmic compression, and better high-speed performance are just some of the key assets of CMOS technology. Despite these features, CMOS sensors are only rarely used in industrial vision applications. One reason for this gap between potential and realized applications is the lack of industrial cameras with standard interfaces. This paper presents a digital camera with the CameraLink(TM) interface based on a megapixel CMOS image sensor. The CameraLink(TM) standard has the potential to put an end to company-specific interconnect solutions and to the limitations imposed by the analog TV standard. The CameraLink(TM) standard is based on a 7:1 serialization and an LVDS (Low-Voltage Differential Signaling) transmission chipset.
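The 7:1 serialization mentioned above works out as follows in the base Camera Link configuration: four LVDS data pairs each carry 7 bits per pixel-clock cycle, giving 28 bits per clock, of which 24 are pixel data and 4 are framing/enable signals (FVAL, LVAL, DVAL, spare). The tiny sketch below just restates that arithmetic.

```python
DATA_PAIRS = 4             # base-configuration Channel Link data pairs
BITS_PER_PAIR_PER_CLK = 7  # the "7:1" serialization factor
ENABLE_BITS = 4            # FVAL, LVAL, DVAL, spare

bits_per_clock = DATA_PAIRS * BITS_PER_PAIR_PER_CLK  # 28 bits per pixel clock
data_bits_per_clock = bits_per_clock - ENABLE_BITS   # 24 bits of pixel data
```

This is what lets a base-configuration camera move, for example, three 8-bit taps or one 24-bit RGB pixel per clock over just five differential pairs (four data plus one clock).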
CMOS image sensors offer several advantages over the standard and ubiquitous charge-coupled devices in terms of power consumption, miniaturization, and on-chip integration of analog-to-digital converters and signal processing for dedicated functionality. However, because of the typically higher readout noise of CMOS cameras compared to CCD cameras, applications demanding the utmost sensitivity were so far not accessible to CMOS cameras. This paper presents an analysis of the major noise sources, concepts to reduce them, and results obtained on a single-chip digital camera with a QCIF resolution of 144 by 176 pixels and a dynamic range in excess of 120 dB.
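For context on the 120 dB figure: a sensor's dynamic range in decibels maps to a linear intensity ratio via 10^(dB/20), so 120 dB corresponds to a million-to-one ratio between the largest resolvable signal and the noise floor. A one-line check, with the conversion itself being the only assumption beyond the stated figure:

```python
DR_DB = 120.0
ratio = 10 ** (DR_DB / 20)  # largest-to-smallest resolvable signal ratio
```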
This article presents the design and realization of a CMOS digital image sensor optimized for button-battery-powered applications. First, a pixel with local analog memory was designed, allowing efficient global-shutter operation of the sensor. The exposure time becomes independent of the readout speed, and a lower readout frequency can be used without causing image distortion. Second, a multi-path readout architecture was developed, allowing efficient use of the power budget in sub-sampling modes. These techniques were integrated in a 0.5 µm CMOS digital image sensor with a resolution of 648 by 648 pixels. The peak supply current is 7 mA at a readout frequency of 4 Mpixel/s with Vdd equal to 3 V. The die size is 55 mm² and the overall SNR is 55 dB. Global-shutter performance was demonstrated by acquiring pictures of fast-moving objects without observing any distortion, even at a low readout frequency of 4 MHz.
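Two derived figures follow directly from the numbers stated above and make the power-oriented design choice concrete; the arithmetic below is a sketch, not data from the paper.

```python
PIXELS = 648 * 648        # full-frame pixel count
READOUT_RATE = 4e6        # pixels per second, per the abstract
VDD, PEAK_I = 3.0, 0.007  # supply voltage (V) and peak current (A)

frame_time_s = PIXELS / READOUT_RATE  # ~0.105 s per full frame (~9.5 fps)
peak_power_w = VDD * PEAK_I           # ~21 mW peak supply power
```

This is exactly the trade the local analog memory enables: because exposure no longer depends on readout speed, the slow 4 MHz readout (and hence the ~21 mW peak budget) costs nothing in image sharpness.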