Shenzhen Kai Mo Rui Electronic Technology Co. LTD


CMOS Sensor debugging experience sharing

Source: Shenzhen Kai Mo Rui Electronic Technology Co. LTD, 2020-08-08

Cameras are now used in many multimedia devices, including mobile devices, and they are being updated very quickly. Two sensor types are currently in use: CCD (Charge-Coupled Device) and CMOS (Complementary Metal Oxide Semiconductor), each with its own advantages and disadvantages. CCD is mainly used in high-quality digital still cameras (DC), digital video cameras (DV), and high-end mobile phones; its image quality is better, but the drive circuitry is relatively complicated, and at present only a few Japanese companies master the production technology.


With so few suppliers, the cost is relatively high. Meanwhile, some devices do not have strict image-quality requirements but do have tight volume requirements, and CMOS meets them: a CMOS module is relatively simple, and many manufacturers now integrate the drive circuitry and ISP (Image Signal Processor) inside the module, so the module is smaller. Its production technology is simpler, the process is mature, the cost is low, the peripheral circuit is simple, and the image quality meets general requirements. It therefore occupies a large share of the embedded market, and the quality of some high-end CMOS sensors is now comparable to CCDs.


First, we must understand the structure of the CMOS camera. What we usually get is an integrated packaged module, generally composed of three parts: lens, sensor, and image signal processor. Normally we only see the module's outer lens, interface, and packaging, and the focal length is usually fixed. Some manufacturers supply only the bare chip, and the lens must be fitted separately; choose a lens of a suitable size. If there is no night-vision requirement, it is best to choose a lens with an infrared filter, because ordinary sensors are sensitive to infrared light, and failing to filter it out will affect the image's color. Also, during PCB design make sure the optical center of the lens lands on the center of the sensor's photosensitive array. Apart from these points, a CMOS sensor is handled much like an ordinary IC; just be careful not to stain or scratch the glass on its surface.


Secondly, the CMOS module's output may be analog or digital. The analog output is generally a TV signal, either PAL or NTSC, which can be connected directly to a TV. Digital output comes in parallel and serial forms. Because image sizes differ, the amount of data to transmit differs, and so does the data frequency. For the same amount of data, the pixel clock of a serial interface is higher than that of a parallel one (which is not hard to understand), and the higher frequency places higher demands on the peripheral circuit; the parallel mode's frequency is much lower, but it needs more pin connections, so each approach has its trade-offs. (The system used in my test has an 8-bit parallel interface.) In addition, there are many output signal formats; the main video output formats are RGB, YUV, Bayer pattern, and so on. Generally the CMOS sensor module integrates an ISP and its output format is selectable, so you can make the choice that best suits the interface of the chip you use. To reduce cost, or for technical reasons, some sensors have no ISP or only a simple one and output the Bayer pattern directly. This is the sensor's raw image and requires post-processing, which calls for a dedicated image processor or a connected general-purpose processor with strong computing power (it must run the image-processing algorithms).

No matter what data format the sensor module uses, there are generally three synchronization signals: frame/field sync (VSYNC), horizontal sync (HSYNC), and the pixel clock (PCLK). Make sure each signal's active state matches your own system, for example triggering on the rising (or falling) edge of field sync, or line sync being active high (or low).

With the above in mind, we can choose a suitable sensor module for our system. Select a matching interface (with a parallel interface, if the sensor module outputs more data bits than the receiver accepts, you can connect by discarding the lower bits), make sure the data format can be accepted or processed, that the pixel clock does not exceed the maximum acceptable frequency (on some modules it is adjustable, though the frame rate will be affected), and that field sync and line sync can be configured to match; then the module can be used.


To ensure these conditions are met, the hardware circuit requirements must also be satisfied. First, confirm that the power supply, clock, RESET, and other signals meet the chip's requirements; then check that all pins are connected correctly, so that the image can be displayed correctly once the peripheral circuit is error-free. Products differ between manufacturers: some sensor modules output an image by default, while others must have certain registers configured before an image appears. To tell whether a module outputs an image directly, probe its output pins: if the three sync signals are present and there is activity on the data lines, there is usually a default image output. You can also contact the manufacturer for the relevant information. If there is no default output, registers must be configured, generally over a two-wire serial interface (the I2C bus is used very frequently).
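As a rough illustration of the register setup just described, the sketch below simulates writing an initialization table over a two-wire bus. The device address (0x42) and the register/value pairs are invented placeholders, not taken from any real sensor's datasheet, and MockI2CBus stands in for a real I2C master so the flow can be shown off-hardware.

```python
class MockI2CBus:
    """Stands in for a real I2C/SCCB master; a real driver clocks SDA/SCL."""
    def __init__(self):
        self.registers = {}

    def write_byte_data(self, dev_addr, reg, value):
        # Record the write instead of driving hardware pins.
        self.registers[(dev_addr, reg)] = value

SENSOR_ADDR = 0x42          # placeholder 7-bit device address
INIT_TABLE = [              # placeholder (register, value) pairs
    (0x12, 0x80),           # e.g. software reset
    (0x11, 0x01),           # e.g. clock prescaler
    (0x40, 0xD0),           # e.g. output format select
]

def init_sensor(bus, dev_addr, table):
    """Push the whole init table to the sensor, one register at a time."""
    for reg, value in table:
        bus.write_byte_data(dev_addr, reg, value)

bus = MockI2CBus()
init_sensor(bus, SENSOR_ADDR, INIT_TABLE)
print(len(bus.registers))   # 3 registers written
```

On real hardware the mock would be replaced by the platform's I2C driver; the table-driven structure is how vendor-supplied init sequences are usually applied.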



Summary of camera problems and solutions

I. Terminology

1. White balance

White balance refers to the ability of the sensor to accurately reproduce the color in an environment with changing light. Most camera systems have an automatic white balance function, which can automatically change the white balance value under changing lighting conditions. The image sensor that design engineers are looking for should be equipped with a good automatic white balance (AWB) control to provide correct color reproduction.

2. Dynamic range

Dynamic range measures an image sensor's ability to capture both light and dark objects in the same photo. It is usually defined as the logarithm of the ratio of the brightest signal to the darkest signal (the noise floor); 54 dB is commonly used as a baseline figure for commercial image sensors. A sensor with a wider dynamic range performs better in bright environments (for example, photos taken in bright light with a narrower-dynamic-range sensor will appear washed out or blurred).
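The logarithmic definition above can be checked with a short calculation. The helper name dynamic_range_db is illustrative, but the formula (20·log10 of the signal ratio) is the standard decibel definition:

```python
import math

def dynamic_range_db(full_well_signal, noise_floor):
    """Dynamic range as the log of the brightest-to-darkest signal ratio."""
    return 20 * math.log10(full_well_signal / noise_floor)

# The 54 dB baseline corresponds to roughly a 500:1 signal ratio:
ratio = 10 ** (54 / 20)
print(round(ratio))                      # ~501
print(round(dynamic_range_db(501, 1)))   # ~54
```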


3. Power frequency interference (Banding)

A sensor acquiring image data under a fluorescent lamp will show flicker. The fundamental reason is that different pixels receive different amounts of light energy, and that difference shows up as brightness variation in the image.

Because a CMOS sensor exposes line by line, the exposure time of every pixel is the same; that is, every pixel on the same line starts exposing at exactly the same moment and for the same duration, so all points on a line receive the same energy. Between different lines the exposure time is also the same, but the exposure start point is not, so the energy received by different lines is not necessarily equal. To make it equal, we need a condition under which each row receives the same light energy even though its exposure starts at a different time, thereby avoiding flicker: the exposure time must be an integer multiple of the light-energy period.

Banding is caused by power-frequency interference: an AC-powered light source fluctuates in intensity. In China the AC frequency is 50 Hz, so the light intensity fluctuates at 100 Hz, with a period of 10 ms. If the camera's exposure time is not an integer multiple of 10 ms, different rows of the photosensitive surface will receive different amounts of light energy, producing bright and dark stripes in the image. To eliminate banding, make the exposure time an integer multiple of 10 ms. With 60 Hz mains, the exposure time must be an integer multiple of 8.33 ms.
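The relationship between mains frequency, flicker period, and banding-free exposure times can be sketched as follows (the function names are illustrative, not from any camera API):

```python
def flicker_period_ms(mains_hz):
    """Light intensity fluctuates at twice the mains frequency."""
    return 1000.0 / (2 * mains_hz)

def antibanding_exposure_ms(desired_ms, mains_hz):
    """Largest exposure <= desired that is an integer multiple of the flicker period."""
    period = flicker_period_ms(mains_hz)
    n = int(desired_ms // period)
    return max(n, 1) * period

print(flicker_period_ms(50))               # 10.0 ms under 50 Hz mains
print(antibanding_exposure_ms(66.66, 50))  # 60.0 ms: rounded down to a multiple of 10
```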

Taking 50Hz as an example, there are two ways to achieve this:

1. Constrain the exposure control so the exposure time is forced to an integer multiple of 10 ms. This wastes part of the available exposure time, so the full exposure budget cannot be used and indoor performance naturally suffers.

2. Modify the frame rate so the time allocated to each frame is an integer multiple of 10 ms; then the full per-frame exposure time can be used, giving better indoor results. The frame rate is modified by inserting dummy lines or dummy pixels. This takes a bit of calculation, and the specifics depend on the sensor's output timing.

For example, at 7.14 fps the per-frame exposure time is 140 ms; at 15 fps it is 66.66 ms. If the exposure is forced to an integer multiple of 10 ms, the maximum is 60 ms, so 6.66 ms cannot participate in the exposure and performance is lost.
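The numbers in this example can be reproduced with a small calculation. exposure_budget is a hypothetical helper that assumes the whole frame period is available for exposure, ignoring per-sensor readout and blanking overheads:

```python
def exposure_budget(fps, flicker_ms=10.0):
    """Frame period, largest banding-free exposure, and wasted time, in ms."""
    frame_ms = 1000.0 / fps
    usable_ms = (frame_ms // flicker_ms) * flicker_ms  # round down to a multiple
    return frame_ms, usable_ms, frame_ms - usable_ms

for fps in (7.14, 15):
    frame_ms, usable_ms, lost_ms = exposure_budget(fps)
    print(f"{fps} fps: frame {frame_ms:.2f} ms, "
          f"usable {usable_ms:.0f} ms, lost {lost_ms:.2f} ms")
```

At 15 fps this reproduces the 60 ms usable exposure and the ~6.66 ms loss quoted above; at 7.14 fps almost the entire 140 ms frame can be used.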

The specific method of adjusting the frame rate must be worked out with the sensor's FAE; every sensor may differ, so no generalization is possible. One more principle when adjusting the frame rate: preview should generally not drop below 10 fps or it will look very choppy (14.3 fps and 12.5 fps are common); capture should not drop below 5 fps or handheld photos will be hard to keep sharp (7.14 fps is typical). The frame rate is a trade-off: if it is too high, the exposure time is insufficient and low-light performance is poor; if it is too low, the image stutters.

4. Lens Shading (color shading)

5. Chief Ray Angle

The interface between the imaging lens and the sensor is one of the most important interfaces in a camera-phone system. As lenses become shorter and shorter, the angle at which light reaches each pixel of the sensor becomes larger. Each pixel has a microlens whose main function is to focus light arriving from different angles onto the pixel; but as the incidence angle grows, some light can no longer be focused onto the pixel, causing light loss and reduced pixel response.


Seen from the sensor side of the lens, the maximum angle at which light can still be focused onto the pixel is defined as a parameter called the chief ray angle (CRA). The usual definition is the angle at which the pixel response falls to 80% of the zero-degree response (light perpendicular to the pixel).

The angle at which light enters a pixel depends on where the pixel is located. Light near the lens axis enters at an angle close to zero degrees; as the distance from the axis increases, so does the angle. CRA is therefore related to the pixel's position on the sensor, and the relationship depends on the lens design. Very compact lenses have very complicated CRA profiles. If the lens's CRA does not match the sensor's microlens design, there will be an undesirable variation in light intensity across the sensor (i.e., "shading"). This can be greatly reduced by changing the microlens design and by appropriate processing of the captured image.

Changing the microlens design can greatly reduce shading. However, doing so requires working closely with the lens designer to find a CRA profile suitable for a range of imaging lenses. The camera design engineer should ensure this technical cooperation happens and that the CRA characteristics of sensor and lens are well matched. To help achieve this, Micron has developed related simulation and evaluation tools.

Because light arrives at the sensor over a range of angles, shading is inherent to every lens design. The "cos⁴ law" states that illumination falls off in proportion to the fourth power of the cosine of the incidence angle. In addition, in some lens designs the lens itself may block part of the light (so-called "flaring"), which also causes shading. So even if the microlens design minimizes shading for a short lens, the phenomenon still exists to some extent. To give camera designers an additional means of correction, the image processor embedded in the MT9D111 includes a shading-correction function that can be customized for specific lenses. To help design engineers integrate its sensors into their products, Micron provides development software for all the sensors it produces.


With this software, camera design engineers can easily modify the default values of various chip settings and see the result of each change on a PC monitor. For many new camera lenses, the development system can be used to set parameters that correct lens shading and spatial color distortion. Using a uniformly lit white target, a simple experiment measures the shading response; the development tool displays the analysis of the shading, after which the engineer can apply correction values region by region. The register settings for the correction are saved in the development system for use in the camera design.


6. Binning

Binning adds together the charge collected by adjacent pixels and reads them out as a single pixel. It comes in horizontal and vertical forms: horizontal binning adds the charge of adjacent columns, while vertical binning adds the charge of adjacent rows. Its advantage is that several pixels are combined into one, improving sensitivity and output speed at the cost of resolution. When row and column binning are used together, the image's aspect ratio is preserved; with 2×2 binning, the pixel count drops by 75%. This method is recommended when previewing on a phone's small screen, rather than having the DSP do the decimation.
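A minimal single-channel sketch of 2×2 binning (it ignores the Bayer pattern, where the same-color pixels actually binned together sit two positions apart):

```python
def bin2x2(pixels):
    """Sum the charge of each 2x2 block of neighboring pixels into one value."""
    h, w = len(pixels), len(pixels[0])
    return [[pixels[y][x] + pixels[y][x + 1]
             + pixels[y + 1][x] + pixels[y + 1][x + 1]
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

frame = [[1] * 640 for _ in range(480)]   # a uniform 640x480 test frame
binned = bin2x2(frame)
print(len(binned[0]), len(binned))  # 320 240: 1/4 the pixels, aspect ratio kept
print(binned[0][0])                 # 4: four charges summed -> higher sensitivity
```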

7. IR cut (filter out infrared light)

A sensor is sensitive not only to the visible spectrum but also to the infrared spectrum. IR means infrared light; without an IR-cut filter the image has an obvious reddish cast that cannot be corrected in software. The IR-cut filter's cutoff is typically 650 ± 10 nm. UV (ultraviolet) light carries very little energy and is generally ignored.

In photos taken without an IR-cut filter, the effect on the image's color is clearly visible.


II. Summary of image sensor shooting problems

1. Horizontal stripes appear

For example, horizontal purple or green stripes appear. This usually indicates a timing problem.

The example diagram is as follows:

After the hardware routing of the MCLK and PCLK lines was improved, the green lines essentially disappeared.

When routing, pay attention to MCLK, PCLK, frame sync (VSYNC), and horizontal sync (HSYNC). For essentially every chip on the market these signals must be routed separately, preferably with GND shielding.


Summary:

Phenomenon: flashing horizontal purple or green interference lines.

Cause: HSYNC runs too close to high-speed lines for too long, and coupling occurs (a 10 cm high-speed trace contributes roughly 5 pF of coupling capacitance). This prevents HSYNC from rising quickly to 90% of its full level, the phase loses sync, and the data is ultimately sampled misaligned; the YUV conversion then turns the misalignment into flashing green and purple lines.

Solution: never bundle the three lines HSYNC, PCLK, and MCLK together. 1) Route HSYNC between the low-speed lines SDA and SCL.

2) If PCLK and MCLK must be routed close to each other, it is best to keep some distance and run a ground trace between them.

2. Color and brightness are not continuous

This generally indicates a short, open, or miswired data line. The image shows contour-like "water ripple" patterns or a large-area color shift; losing a data signal entirely also shifts the overall color. For example, with RGB565, if D0~D4 are all disconnected the image appears red because too much of the blue and green signal is lost.
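The red cast can be reasoned about from the RGB565 bit layout. The sketch below assumes each pixel is sent big-endian as two bytes over the 8-bit bus and that D0~D4 are stuck low on both bytes; masking them removes all of blue, the high three bits of green, and only a little of red:

```python
def rgb888_to_rgb565_bytes(r, g, b):
    """Pack 8-bit RGB into the two bytes of an RGB565 pixel (big-endian)."""
    v = ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)
    return v >> 8, v & 0xFF

def decode_rgb565(hi, lo):
    """Unpack a two-byte RGB565 pixel back into its 5/6/5-bit components."""
    v = (hi << 8) | lo
    return (v >> 11) & 0x1F, (v >> 5) & 0x3F, v & 0x1F

hi, lo = rgb888_to_rgb565_bytes(200, 200, 200)  # a light gray pixel
mask = 0xE0                                     # D0~D4 stuck low on the bus
r, g, b = decode_rgb565(hi & mask, lo & mask)
print(r, g, b)  # (24, 2, 0): red nearly intact, green and blue mostly gone
```

A neutral gray thus decodes with red almost untouched while green and blue collapse, which is consistent with the reddish image described above.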

1) An example of contour lines and color distortion caused by a cold solder joint on one data line

contour line

normal image

2) Greenish image caused by two data lines being multiplexed with other devices

Two of the 8 data lines were multiplexed by other devices, so no data came out on those two lines.

3) Data lines connected in reverse order:

4) Data line misalignment

Example 1: I finally got the OV2640 initialized, but the preview image is wrong. Attached is a frame I captured (one of my fingers -_-|||). Analyzing it in Photoshop, I found that only the G channel has a signal; the R and B channels are all black.

I tested the connections between the 2640's 10 data lines and the CSI's 16 data lines and found that the hardware engineer made a mistake in the board layout: the sensor's 10 data lines D[0]~D[9] were connected to CSI D[4]~D[15], while the CSI reads its 8-bit data from D[8]~D[15]. As a result the data bits were misaligned and partially lost, producing the image above.

5) Summary of examples of data line problems

The first image is the raw data captured under very low brightness

The second picture is a phenomenon that occurs after the aperture is increased


3. Only red or green color in the image

Y and U/V are in the wrong order. After changing the camera's sampling format from CbYCrY to YCbYCr, the colors are correct. A sample picture is shown below:
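Such a fix amounts to swapping byte pairs in the interleaved 4:2:2 stream, either in the sensor's output register or in software. A minimal sketch of the software version:

```python
def cbycry_to_ycbycr(data):
    """Swap each byte pair so the stream becomes Y-first.

    CbYCrY groups [Cb, Y0, Cr, Y1] become YCbYCr groups [Y0, Cb, Y1, Cr].
    Assumes an even-length buffer.
    """
    out = bytearray(len(data))
    out[0::2] = data[1::2]   # luma bytes move to even positions
    out[1::2] = data[0::2]   # chroma bytes move to odd positions
    return bytes(out)

# two pixels sharing one Cb/Cr pair:
raw = bytes([0x80, 0x10, 0x80, 0x20])   # Cb Y0 Cr Y1
print(cbycry_to_ycbycr(raw).hex())      # 10802080 -> Y0 Cb Y1 Cr
```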

4. Horizontal irregular stripes

5. Vertical irregular stripes

6. Reddish

7. Thermal noise

The noise gradually increases over time.

The module was normal when it started working, with no colored dots. After working for a while, colored dots began to appear and steadily multiplied, as shown in the figure above. Cause:

The sensor's temperature rises after it has been working for a while, and the higher temperature intensifies the intrinsic excitation of the semiconductor material. This lowers the sensor's S/N ratio and increases noise. The effect depends heavily on the sensor material; back-end or software processing can alleviate it but not eliminate it. These are called hot pixels and are caused by the chip overheating.

8. The analog voltage is too low or unstable

When the analog voltage is too low, only strongly lit areas register an image, and a color cast appears.

Example 1 is shown in the figure below: only the lamp on the ceiling registers an image, and the other parts are blurred.

Example 2: an analog voltage that is too low causes vertical stripes. The problem is solved after raising AVDD.

Example 3: while debugging the OV7725, the image showed stripes when the camera was first turned on and became normal after a while. Does anyone know why? The abnormal images are shown below. The cause was found: an unstable analog voltage.

9. Back plate material too thin, causing "ghosting"

The stiffener surface should be coated with matte black ink to prevent light leakage.

Example 1: abnormal OV2715 image caused by light leaking through the circuit board behind the sensor; the image is as follows:

Example 2: abnormal GC0307 image, shown below, with a line across the middle like a layering boundary that should not be there. GalaxyCore told us to seal the edges with glue, and that fixed it.

10. Image streaks caused by noise

On the new board revision, moving the CMOS sensor away from the main IC made the phenomenon disappear. The sensor had previously been placed on the back side of the main IC; presumably the main IC was affecting it, for example by injecting noise into the analog supply. Example 1 is shown in the figure below.

Example 2:


The CMOS sensor is OmniVision's 300-kpixel OV7141. In use there are obvious horizontal ripples. The supplies are 3.3 V and 2.5 V, with VDD_C and VDD_A both powered from 2.5 V, connected directly to the 2.5 V rail on the PCB. The ground is poured as a single plane, without separating analog and digital ground.


When an external supply feeds AVDD, the phenomenon disappears, so it is clearly caused by power supply noise on the motherboard.

The result after respinning the board is fine; the main changes were:

1) The old design was a two-layer board; the new one is a four-layer board with a dedicated power plane.

2) Large tantalum capacitors filter the LDO output; the power ripple measured on the oscilloscope is smaller than before.

11. Power frequency interference

If the stripes do not appear outdoors under natural light, they are flicker caused by the 50/60 Hz mains.

12. Center of the image brighter than the edges because lens-correction parameters are not tuned well

A picture taken with an OV9650 camera module at 800 × 600; the middle is brighter.

From the hardware side, the lens may not match the sensor, especially in CRA; check the datasheets to see whether the two differ too much.

On the software side, the lens correction may not be tuned properly (personally I think the original poster's case falls in this category). Set the correction regions and then increase the gain values to reduce the brightness difference between center and periphery. If the whole frame is then overexposed, lower the overall gain (you can also adjust the exposure parameter to reduce the frame brightness).

Adjusting the OV9650's lens-correction registers as above makes the brightness of the center and the edges uniform.

13. Noise reduced by lowering the automatic gain

While debugging the OV7675, the left side of the image was fuzzy and the right side normal; the picture is as follows:

After reducing the AGC the artifact no longer appears, though the image is darker than before. The result is as follows:

14. Greenish cast during automatic exposure convergence

OV7670:

When shooting outdoors in bright light, the colors are always normal.

When shooting indoors in low light, the picture is greenish when the camera is first turned on and returns to normal after a few seconds.

The 300-kpixel OV7670 takes a long time to converge its AE calculation, and a color cast easily appears during convergence. Dropping frames or adding a delay solves the problem.

15. Stripes at the top or bottom of the image due to incorrect timing. An example caused by a VSYNC shift is shown in the figure below.

Solution: the camera module's timing could not be adjusted, so the AP's camera controller was modified to shift vertical sync by 12 lines, after which the image output is correct.

16. Glare caused by reflection between lens

This picture was taken with a 5-megapixel module; the ceiling light is just outside the field of view. Why does purple-red light appear in the picture, and what causes it? It is glare, generally caused by reflections between the lens elements. Improving the coating process to increase lens transmittance can alleviate the problem.

In addition, the optical center of this photo is shifted to the left; has the lens holder shifted? The lens's image circle is large enough that this offset can be covered.

Thank you for your attention; the problem has been solved. The phenomenon was caused by the mechanical assembly of the lens on the module.

17. PCLK and VSYNC wiring interference

While debugging a phone camera (OV7675), the picture was not vertically synchronized: the lower part of the picture jittered badly while the upper part was fine.

The problem was found: the frame sync (VSYNC) and PCLK traces were picking up interference.

18. Noise caused by incorrect selection of PCLK sampling edge

Example 1, there is noise in the picture

Inverting the polarity of PCLK solved the noise problem nicely.

Example 2: photos taken by the OV7675 turn green. The PCLK sampling edge may be wrong; try inverting PCLK. It could also be a disconnected data line.

Example 3, shown in the figure below, was solved by changing which edge of PCLK is used.

There are two main approaches: 1. modify the slew rate of PCLK's rising edge; 2. modify the slew rate of the I/O's rising edge.

The cause is that differences in module layout length and FPC thickness between manufacturers can affect PCLK acquisition; excessive FPC tolerances or problems in producing the sensor board can also trigger this problem. If the slew rate of PCLK's rising edge can be changed in hardware, that also solves it.

To close the case: solved by changing the PCLK edges.

19. FPN problem

The problem does not occur in daytime or in brighter places; it only appears when using the flash to take pictures in low light.

FPN (fixed pattern noise), no solution.

20. Step effect

When the gain is too large, the digitizer's quantization step is multiplied up to a large value and a staircase effect appears. It is also related to insufficient internal quantization precision.

In addition, if the gains of the color channels differ (the R/G/B gains computed by white balance differ), a hue error appears.

The schematic is shown below, with only the B and G channels drawn. B_gain is larger than G_gain, so in a gray-scale scene B comes out larger in some places and G larger in others, and the colors keep alternating.

Combined with the staircase effect, it can look like this picture.

21. Vertical stripes caused by power problems

The power supply was determined to be the problem: after connecting a large capacitor to each supply rail, the stripes disappeared. Now I capture through the CPU's I/O, and the results are very good.

22. Part of the reddish phenomenon caused by the mismatch between the lens and the camera

The lower center of the picture is reddish. The OV engineer pushed LENS CORRECTION to its limit and the problem remained, confirming that the lens and sensor do not match. After the module manufacturer replaced the lens, the problem was basically solved.

I downloaded your picture and found the following problems:

1. First, your photo's AWB is wrong; the photo itself has not reached white balance. 2. The jaggedness at the photo's borders is severe.

3. For the color shift, you must first know your sensor's chief ray angle and the lens's CRA. If the lens's CRA is smaller than the sensor's, there will certainly be a color cast; replace the lens. If no suitable lens can be found on the market, that suggests the sensor itself is not of very good quality.

4. In theory, lens shading correction compensates for the lens's uneven transmittance. But everyone can add their own algorithms and experiment.

5. If both lens and sensor are fixed, some artificial workarounds can reduce the color difference: a. make the colors lighter so the cast is less obvious;

b. perform AWB calibration to remove per-sensor differences in RGB response, which can otherwise pull the AWB curve off track.

Generally speaking, CRA is the angle between the lens's optical axis and the most oblique incident ray that still contributes to imaging. Lens manufacturers usually provide a CRA curve, because the CRA differs from the center of the lens to its edges.

Besides shading, you may also need to adjust AWB for the reddish cast: the bottom of the picture is actually plain white, and the sensor renders that white area reddish. Try adjusting AWB, or check in the light box whether the R, G, and B curves coincide.

If it were an AWB problem, why does the image still contain white areas? AWB would not shift the color in some regions while leaving others unshifted; don't talk nonsense if you don't know.

If the CRA did not match, the color cast would be symmetrical: if the bottom were reddish, the top would be too. Personally, I think it is caused by light leakage, either from the lens barrel or stray light entering through the light hole.

23. DOVDD28 trace too thin and long, with unreasonable ground routing. Phenomenon: corrupted, fuzzy image.

Cause: the 2.8 V rail drops across the trace resistance, leaving insufficient drive capability; the ground is pulled up and glitches, which hurts signal integrity and data acquisition.

24. DVDD voltage problem

The bright area in the picture is the office window; everything else is black with no detail. What is the reason? AWB? AGC? Or contrast?

The problem is solved: the DVDD voltage was wrong. The datasheet says 1.8 V, but after asking the FAE it turned out to be 1.2 V.

25. White streak problem caused by small gain

When pointed at a white object, the stripe shown in the figure below appears on first entering preview; moving the phone makes it disappear and it does not come back, though it may appear again on re-entering preview. Could any of the experts here explain the reason?

This problem has now been solved: after increasing the gain in the initialization code, it is fine.

26. Image misalignment caused by frame rate issues

The sensor is the OV9655. When shooting 1.3-megapixel SXGA images, the image sometimes comes out misaligned (as shown in the picture), but VGA does not show the problem. Please help analyze. Thanks!

The frame rate is too high and the exposure time too short. You can adjust VBLANK and HBLANK to solve it, then drop the FPS to 5 and try; also, what about your buffer speed? Thanks! Reducing the frame rate was the effective fix here.

27. Power supply noise

The OV9653 shows horizontal lines as in the figure.

The problem is solved: it was the power supply. Just add a tantalum capacitor to AVDD; presumably the supply ripple was fairly severe.
