Advanced Image Stabilization Techniques for Tablet Camera Performance
Intel processors play a leading role in the tablet and two-in-one device market, especially for higher-performance devices targeted at business environments and high-end consumer applications. One of the more popular applications for these devices is still photography and video capture. Market research indicates that business users and consumers prefer to use their tablets to share high-quality photos or videos on Facebook, Instagram, Snapchat or other popular, visually oriented social media sites. In fact, for many users a tablet serves as a replacement for a digital still camera or an inexpensive video camera.
Not surprisingly, Intel processors help make that possible. The latest generation of the Intel Atom processor, for example, not only improves overall performance and extends battery life, but also supports excellent graphics and video with integrated image signal processing for both still and video capture. By coupling high-resolution, high-pixel-density screens with the graphics-processing capabilities embedded in Intel processors, many of today’s tablets and two-in-one devices deliver extremely high-quality graphics and video.
Whether users are capturing still images or recording video, image stabilization plays a key role in producing a high-quality result by reducing the pixel blurring and unwanted artifacts that camera motion introduces. Typically, standalone cameras and mobile devices offering a photo or video function add some form of image-stabilization capability to compensate for unintentional movements by the user. Intel-based tablets are no exception: the latest Atom processor adds multi-axis digital image stabilization (DIS) and image alignment to help remove blur from moving objects.
However, as tablet and other mobile device developers move to ever-higher levels of resolution, demand is accelerating for more advanced image stabilization techniques. Two of the more common implementations, electronic image stabilization (EIS) and optical image stabilization (OIS), are taking video and still-image photography to a new level of performance.
Image stabilization techniques are designed to reduce the blurring associated with relatively minor shaking of the camera, within a few optical degrees, while the image sensor is exposed. They are not designed to prevent motion blur caused by movement of the subject or by extreme movements of the camera itself. This minor camera movement is characterized by its pan and tilt components, whose angular movements are known as yaw and pitch, respectively. Typically, these image stabilization functions cannot compensate for camera roll: rotation about the optical axis does not displace the optical path relative to the center of the image sensor, so shifting or tilting the lens has no corrective effect on it.
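To see why even small angular movements matter, a back-of-envelope sketch can convert a tilt angle into image displacement on the sensor. The focal length and pixel pitch below are assumed values typical of tablet camera modules, not figures from any specific device:

```python
import math

def blur_displacement_um(focal_length_mm: float, tilt_deg: float) -> float:
    """Approximate image displacement on the sensor (in micrometres)
    caused by a small camera tilt, using d = f * tan(theta)."""
    return focal_length_mm * 1000.0 * math.tan(math.radians(tilt_deg))

# A 0.5-degree hand tremor with an assumed 4 mm lens shifts the image
# by roughly 35 um -- dozens of pixels at an assumed ~1.1 um pixel pitch.
shift_um = blur_displacement_um(4.0, 0.5)
shift_pixels = shift_um / 1.1
```

Even a half-degree tremor thus smears the image across tens of pixels, which is why sub-degree compensation is the target for these systems.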
EIS is a digital compensation technique that uses complex algorithms to compare frame contrast and pixel location from frame to frame. Pixels on the image border provide the buffer needed for motion compensation: an EIS algorithm calculates the subtle differences between successive frames, and the camera uses this information to interpolate new, shifted frames that reduce the sense of motion.
EIS offers distinct advantages and disadvantages. As an image-stabilization scheme, it gives developers a relatively compact, lower-cost option. However, image quality is limited by image scaling and post-processing artifacts, and any incremental improvement in quality requires additional power to capture and process extra image data. In addition, EIS solutions do not perform well at full electronic zoom (narrow field of view) or in low-light conditions.
In comparison, OIS is a mechanical technique used in imaging devices to stabilize the recorded image by controlling the optical path to the image sensor. Two primary methods are used to implement OIS. One, called lens shift, moves the lens within the module. The second, termed module tilt, moves the entire camera module (see Figure 1).
|Figure 1. There are two primary methods of implementing optical image stabilization.|
Camera movements by the user can cause misalignment of the optical path between the focusing lens and the center of the image sensor. In the OIS lens-shift method, only the lens within the camera module is controlled and used to realign the optical path to the center of the image sensor. The module-tilt method, on the other hand, controls the movement of the entire module, including the fixed lens and the image sensor. The module-tilt approach allows a greater range of movement compensation and minimizes image distortion because the focal distance between the lens and the image sensor stays fixed.
Compared to EIS solutions, OIS systems reduce blurring without significantly sacrificing image quality, especially in low-light and long-range image capture. But unlike EIS, which needs no additional hardware, OIS solutions require actuators and driver power sources that tend to add footprint and cost.
An OIS system relies on a complete module of sensing, compensation and control components to accurately correct for unwanted camera movement. This movement, or vibration, is characterized in the X/Y plane, with yaw/pan and pitch/tilt movements detected by dedicated sensors. The lens-shift method uses Hall sensors to detect lens movement, while the module-tilt method uses photo-reflectors to detect module movement. OIS controllers use gyroscope data within a lens target-positioning circuit to predict where the lens needs to move in order to compensate for the user’s natural movement. With lens shift, Hall sensors detect the real-time X/Y location of the lens, after taking into account actuator mechanical variances and the influence of gravity. The controller then uses an internal servo system that combines the lens-position data from the Hall sensors with the target position calculated from the gyroscope data to determine the exact driving power the actuator needs to reposition the lens. The process is similar with module tilt, except that the module’s location, rather than just the lens, is measured and repositioned. With both methods, the new lens position realigns the optical path to the center of the image sensor.
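A simplified, single-axis sketch of that servo loop may help. The PD structure, gains and focal length below are illustrative assumptions, not ROHM's actual control law: the gyroscope sets where the lens should be, the Hall sensor reports where it is, and the servo computes the actuator drive to close the gap:

```python
import math

class OISAxisServo:
    """One-axis sketch of a lens-shift control loop. Gains and scale
    factors are illustrative, not real tuning values."""

    def __init__(self, focal_length_mm=4.0, kp=0.8, kd=0.05):
        self.f_mm = focal_length_mm
        self.kp, self.kd = kp, kd
        self.angle_rad = 0.0   # integrated camera rotation (yaw or pitch)
        self.prev_err = 0.0

    def target_position_mm(self, gyro_rate_dps, dt_s):
        """Integrate angular rate to an angle; the lens must move the
        opposite way to keep the optical path on the sensor centre."""
        self.angle_rad += math.radians(gyro_rate_dps) * dt_s
        return -self.f_mm * math.tan(self.angle_rad)

    def update(self, gyro_rate_dps, hall_position_mm, dt_s):
        """One servo sample: compare the gyro-derived target with the
        Hall-measured position and return the actuator drive command."""
        target = self.target_position_mm(gyro_rate_dps, dt_s)
        err = target - hall_position_mm
        drive = self.kp * err + self.kd * (err - self.prev_err) / dt_s
        self.prev_err = err
        return drive
```

In a real controller this loop runs per axis at a high sample rate on the controller's own processor, with the Hall reading already corrected by per-unit calibration data.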
|Figure 2. ROHM’s OIS system uses a complete module of sensing, compensation and control components to accurately correct for unwanted camera movement.|
OIS control is designed to be very simple from the customer’s standpoint, consisting simply of ON/OFF and enable/power-save modes. The only other commands are optional manual control of the lens in the X/Y plane or altering OIS performance based on ambient conditions such as day, night, sports, picture, video or viewfinder. This minimizes I2C traffic from the host processor to the OIS controller and simplifies software driver development for the end customer. All of the actual OIS control algorithms are performed autonomously on the controller itself, using the internal processor and RAM.
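To make that host-side simplicity concrete, the sketch below shows what such a driver might look like. The register addresses, mode codes and class names are hypothetical, invented for illustration; a real OIS controller defines its own register map:

```python
# Hypothetical register map -- a real controller defines its own.
REG_ENABLE = 0x00   # 1 = OIS on, 0 = power-save
REG_MODE   = 0x01   # scene mode selects a tuning profile on the controller
MODES = {"picture": 0, "video": 1, "viewfinder": 2, "sports": 3, "night": 4}

class OISHostDriver:
    """Sketch of the minimal host-side surface described above: a few
    I2C register writes, with all stabilization math running
    autonomously on the controller. `bus` is any object exposing
    write_byte(register, value), e.g. a thin wrapper over the
    platform's I2C adapter."""

    def __init__(self, bus):
        self.bus = bus

    def enable(self, on):
        self.bus.write_byte(REG_ENABLE, 1 if on else 0)

    def set_mode(self, mode):
        self.bus.write_byte(REG_MODE, MODES[mode])

class _RecordingBus:
    """Stand-in bus for demonstration: records writes instead of
    touching hardware."""
    def __init__(self):
        self.writes = []
    def write_byte(self, reg, value):
        self.writes.append((reg, value))

bus = _RecordingBus()
drv = OISHostDriver(bus)
drv.enable(True)
drv.set_mode("video")
```

The entire driver reduces to a handful of register writes, which is precisely why I2C traffic and software effort stay low for the end customer.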
OIS Controller Considerations
Controller architectures for OIS applications vary significantly. Some combine a programmable core with custom programmable digital signal processing for gyroscope signal processing and servo control. Others integrate programmable gyroscope signal processing and servo control into the core itself. Typically, all OIS memory and control calculations are performed on the OIS controller itself and require neither the host processor’s computational resources nor external memory.
Developers looking for a controller for OIS applications should consider a number of issues. Does the controller offer full control of the X- and Y-axis voice coil motor (VCM) drivers, Hall amplifiers and current drivers, and photo-reflector drivers? Does it feature the wide variety of interfaces and peripherals needed for the application, including I2C, ADCs, PLL oscillators, an SPI master for digital gyroscopes and support for analog gyroscopes? Does the MCU support integrated drivers for autofocus, neutral-density filters or shutter functions? Be aware that some controllers offer digital filter designs in their servo control and gyroscope signal processing circuits that can improve performance by dynamically compensating for gyroscope and actuator temperature drift while not removing intentional pan and tilt movement by the camera user. Others add custom control software for automatic lens control, automatic pan-tilt detection and access to different programmable capturing modes and calibration settings.
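One common way to correct tremor while preserving intentional pan and tilt is to high-pass filter the gyroscope signal: a deliberate pan is a sustained low-frequency rate, while hand shake sits in a higher band (very roughly 1–10 Hz). The sketch below is a generic first-order filter with an illustrative cutoff, not the proprietary filtering of any particular controller:

```python
import math

def hand_shake_component(gyro_samples, fs_hz=1000.0, cutoff_hz=2.0):
    """First-order high-pass filter over gyroscope samples (deg/s).
    Only the high-pass output is fed to the servo, so a sustained
    low-frequency pan passes through uncorrected while higher-
    frequency tremor is compensated. Cutoff is an assumed value."""
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    dt = 1.0 / fs_hz
    alpha = rc / (rc + dt)
    out, prev_in, prev_out = [], gyro_samples[0], 0.0
    for x in gyro_samples:
        prev_out = alpha * (prev_out + x - prev_in)
        prev_in = x
        out.append(prev_out)
    return out
```

Fed a constant rate (a steady pan), the filter output decays toward zero, so the stabilizer leaves the pan alone; fed a rapidly alternating rate (shake), the signal passes through for correction.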
Measuring Image Stabilization
Image stabilization performance is gauged by the suppression ratio (SR). SR is calculated using a spatial test chart with a target pattern: images of the pattern are captured with OIS on and off, and with and without vibration, and the images are then compared to compute the ratio of the amount of blur in each. This test is typically used as a final guarantee that all of the components in the OIS system are functioning properly.
The figure below depicts examples of motion blur in the target pattern. The D_STATIC image represents the ideal result, captured with no vibration or motion. Since an OIS system ideally matches the quality of a motion-free still image, D_STATIC serves as the benchmark for calculating the SR performance of the OIS system; in this example it exhibits the shortest zoomed white-area distance because nothing blurs the captured image. The D_OISoff image shows the appearance of an image captured while the camera is vibrating or moving with image stabilization disabled. As a result, the D_OISoff image exhibits much more blurring than the other images.
|Figure 3. The D_OISoff image exhibits much more blurring compared to the other images in the generic test pattern.|
The observed amount of blur represents what must be corrected, or suppressed, to bring the D_OISoff image back toward the D_STATIC benchmark. The D_OISon image, captured while the camera is vibrating or moving with image stabilization enabled, therefore represents the actual benefit of the OIS system: the stabilizer suppresses blurring, and the zoomed white-area distance is shorter than in the D_OISoff image. Once all three images have been captured, the blur in each is measured as a pixel count across the width of the zoomed white area, and Equation 1 (shown below the diagram in Figure 3) yields the final SR. The process is repeated for each shaking-frequency performance target and for each axis.
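A commonly used form of the suppression-ratio calculation subtracts the static baseline from each measured blur width and expresses the result in decibels; the exact Equation 1 in Figure 3 may differ in form, so treat this as an assumed sketch:

```python
import math

def suppression_ratio_db(d_static_px, d_ois_off_px, d_ois_on_px):
    """Suppression ratio in dB from blur widths measured in pixels.
    Assumed common form: uncorrected blur over residual blur, each
    with the motion-free baseline subtracted."""
    raw_blur = d_ois_off_px - d_static_px       # blur introduced by shake
    residual_blur = d_ois_on_px - d_static_px   # blur left after OIS
    return 20.0 * math.log10(raw_blur / residual_blur)

# Illustrative widths: 10 px static, 110 px with OIS off, 14 px with OIS on
sr_db = suppression_ratio_db(10, 110, 14)   # 20*log10(100/4), about 28 dB
```

A higher SR means the stabilizer removed a larger fraction of the shake-induced blur at that test frequency and axis.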
Proper OIS operation requires simulating the entire system to ensure that all components interact correctly. While most OIS controller suppliers can simulate the ideal performance of “golden” OIS components such as the actuator, ROHM has developed highly specialized simulation tools that model real-world component behavior as well. These real-world results help accelerate the development of custom firmware for customers integrating OIS into their designs (see Figure 4).
|Figure 4. Graph compares real-world OIS performance vs. ROHM’s simulated OIS performance.|
OIS systems also require careful calibration to ensure proper operation. All of the components within the OIS system carry individual manufacturing variances and assembly-misalignment variances. For the system to function properly, the OIS controller must know the subtle sensitivity variances introduced by the manufacturing and assembly processes. Once the calibration process is complete, the OIS controller can use the collected data to adjust its control of the system and its components.
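As one concrete example of what such per-unit calibration data can look like, a gain and offset for a Hall sensor can be derived by fitting raw sensor codes against known lens positions driven during factory test. The procedure and values below are illustrative, not ROHM's calibration flow:

```python
def calibrate_hall(hall_codes, positions_um):
    """Least-squares linear fit mapping raw Hall-sensor codes to lens
    position (um), capturing the per-unit gain and offset that
    manufacturing and assembly variances introduce."""
    n = len(hall_codes)
    mx = sum(hall_codes) / n
    my = sum(positions_um) / n
    sxx = sum((x - mx) ** 2 for x in hall_codes)
    sxy = sum((x - mx) * (y - my) for x, y in zip(hall_codes, positions_um))
    gain = sxy / sxx
    offset = my - gain * mx
    return gain, offset

# Illustrative factory sweep: three known lens positions per unit.
gain, offset = calibrate_hall([100, 300, 500], [-50.0, 0.0, 50.0])
position_um = gain * 300 + offset   # converts a live Hall reading
```

The resulting gain/offset pair is stored per unit so the controller's servo can convert live Hall readings into true lens positions.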
As next-generation tablets and two-in-one devices migrate up the performance curve, users will increasingly demand higher-performance image and video capture. High on users’ list will be crisp, clear, blur-free images. By leveraging the latest advances in optical image stabilization, tablet and two-in-one device designers can meet those expectations.
Mark Aaldering is the senior director of technical product marketing at ROHM Semiconductor where his dedicated team drives new products into development and adoption in the computing, consumer, automotive and industrial markets.