We use the EDSDK to control a Canon EOS 7D for taking pictures of a single fixed object.

We try to keep everything the same: camera position, aperture, ISO, shutter speed, focus (manual focusing), no flash, and we take the pictures one right after another so that nothing changes between shots. We expect to obtain nearly identical RGB images every time.

But we found that the JPG images differ every time we capture. For example, we calculate the RGB sums over the whole object block (the block position is fixed, and the background is pure black -- all zeros). The first time we get RGB == (10000, 20000, 15000), the second time (12000, 24000, 17000), the third time (9000, 18000, 13000). We know there must be some small variance/noise when capturing pictures, but the RGB values shift considerably every time (-15% to +15%). This cannot just be noise; we suspect it is caused by some automatic adjustment setting.
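
For reference, the measurement itself is essentially the following (a minimal sketch, not our production code; the file path and block coordinates are placeholders, and GetPixel is used only to keep the example short):

C#
using System;
using System.Drawing;

static class BlockSum
{
    static void Main()
    {
        // Placeholder block position; in our setup it is fixed per object.
        var block = new Rectangle(100, 100, 20, 20);
        using (var bmp = new Bitmap("shot.jpg")) // placeholder path
        {
            long r = 0, g = 0, b = 0;
            for (int y = block.Top; y < block.Bottom; y++)
                for (int x = block.Left; x < block.Right; x++)
                {
                    Color c = bmp.GetPixel(x, y);
                    r += c.R; g += c.G; b += c.B;
                }
            Console.WriteLine($"RGB sums: ({r}, {g}, {b})");
        }
    }
}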

Why do we get different results? Where is our mistake?

We also tried capturing the raw image (.CR2) and then using dcraw.exe to convert it to PPM or TIFF with the same conversion parameters every time (we use -v -k 2400 -S 13000 -W -g 2.222 4.5). But the image RGB values still shift considerably from shot to shot.
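
For what it's worth, we drive dcraw from C# roughly like this (a sketch; the path is supplied by the caller, and dcraw.exe is assumed to be on the PATH):

C#
using System.Diagnostics;

static class RawConvert
{
    // Sketch: run dcraw with the same fixed parameters every time.
    // -k/-S pin the black and saturation levels, -W disables dcraw's
    // auto-brightening, and -g fixes the gamma curve, so the conversion
    // itself is deterministic; any shot-to-shot change comes from capture.
    public static void Convert(string cr2Path)
    {
        var psi = new ProcessStartInfo
        {
            FileName = "dcraw.exe", // assumed to be on the PATH
            Arguments = "-v -k 2400 -S 13000 -W -g 2.222 4.5 \"" + cr2Path + "\"",
            UseShellExecute = false
        };
        using (var p = Process.Start(psi))
            p.WaitForExit();
    }
}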

Below is a snippet of our code (in C#; some details are omitted).

Since our task is to measure the RGB values accurately, this problem is critical for us. Thank you very much for your help!


C#
public void main()
{
    IntPtr cameraList, cam;
    int cameraCount;
    EDSDK.EdsDeviceInfo deviceInfo;

    // Initialize the SDK and take the first connected camera.
    EDSDK.EdsInitializeSDK();
    EDSDK.EdsGetCameraList(out cameraList);
    EDSDK.EdsGetChildCount(cameraList, out cameraCount);
    EDSDK.EdsGetChildAtIndex(cameraList, 0, out cam);
    EDSDK.EdsGetDeviceInfo(cam, out deviceInfo);

    // Register event handlers (handler fields/methods are declared elsewhere).
    EDSDK.EdsSetPropertyEventHandler(cam, EDSDK.PropertyEvent_All, propertyEventHandle, inContext);
    ObjectEventHandle = new EDSDK.EdsObjectEventHandler(ObjectEventCallBack);
    EDSDK.EdsSetObjectEventHandler(cam, EDSDK.ObjectEvent_All, ObjectEventHandle, IntPtr.Zero);
    EDSDK.EdsSetCameraStateEventHandler(cam, EDSDK.StateEvent_All, stateEventHandle, inContext);
    EDSDK.EdsOpenSession(cam);

    // Save captures to the host PC and fix the image quality
    // (0x0013ff0f: large fine JPEG in the EDSDK quality table).
    EDSDK.EdsSetPropertyData(cam, EDSDK.PropID_SaveTo, 0, 4, (uint)EDSDK.EdsSaveTo.Host);
    EDSDK.EdsSetPropertyData(cam, EDSDK.PropID_ImageQuality, 0, 4, (uint)0x0013ff0f);

    // Fix the exposure manually: Av 0x58 = f/32, Tv 0x6b, ISO 0x48 = 100
    // (values from the Canon Av/Tv/ISO code tables).
    EDSDK.EdsSetPropertyData(cam, EDSDK.PropID_Av, 0, 4, (uint)0x58);
    EDSDK.EdsSetPropertyData(cam, EDSDK.PropID_Tv, 0, 4, (uint)0x6b);
    EDSDK.EdsSetPropertyData(cam, EDSDK.PropID_ISOSpeed, 0, 4, (uint)0x48);

    // Report free space on the host so that SaveTo.Host is accepted.
    EDSDK.EdsCapacity capacity = default(EDSDK.EdsCapacity);
    capacity.NumberOfFreeClusters = 0x10000000;
    capacity.BytesPerSector = 0x0200;
    capacity.Reset = 1;
    EDSDK.EdsSetCapacity(cam, capacity);

    EDSDK.EdsSendCommand(cam, EDSDK.CameraCommand_TakePicture, 0);
}


public void DownloadImage(String Path, IntPtr DirItem)
{
    uint Err = 0;

    // Query the size of the item the camera just created.
    EDSDK.EdsDirectoryItemInfo DirInfo;
    Err = EDSDK.EdsGetDirectoryItemInfo(DirItem, out DirInfo);
    if (Err != 0) throw new Exception("EdsGetDirectoryItemInfo failed: 0x" + Err.ToString("X"));

    // Create a file stream on the host and download the image into it.
    IntPtr Stream;
    Err = EDSDK.EdsCreateFileStream(Path, EDSDK.EdsFileCreateDisposition.CreateAlways, EDSDK.EdsAccess.ReadWrite, out Stream);
    if (Err != 0) throw new Exception("EdsCreateFileStream failed: 0x" + Err.ToString("X"));

    Err = EDSDK.EdsDownload(DirItem, DirInfo.Size, Stream);
    if (Err != 0) throw new Exception("EdsDownload failed: 0x" + Err.ToString("X"));

    // Tell the camera the transfer finished, then release the stream.
    Err = EDSDK.EdsDownloadComplete(DirItem);
    if (Err != 0) throw new Exception("EdsDownloadComplete failed: 0x" + Err.ToString("X"));

    Err = EDSDK.EdsRelease(Stream);
    if (Err != 0) throw new Exception("EdsRelease failed: 0x" + Err.ToString("X"));

    // Crude wait until the file shows up on disk.
    while (!System.IO.File.Exists(Path))
        Thread.Sleep(100);
}

public uint ObjectEventCallBack(uint Event, IntPtr Object, IntPtr Context)
{
    switch (Event)
    {
        case EDSDK.ObjectEvent_DirItemCreated:
            // Results and Filepath are fields maintained elsewhere (omitted).
            foreach (EDSFileObject File in Results)
            {
                if (File.mFileInfo.isFolder == 0)
                {
                    DownloadImage(Filepath, File.mFilePointer);
                }
            }
            break;
    }
    return EDSDKLib.EDSDK.EDS_ERR_OK;
}

Solution 1

I don't think you made a programming mistake. The camera sensor is an analog device; it has a certain measurement accuracy and a certain noise level. In other words, its digital output is randomized even under perfectly static conditions. But, additionally, the conditions are not perfectly static. Despite all your efforts, the scene shakes. (By the way, do you keep the mirror lifted during the whole process, or does it flip back and forth on every shot? Unfortunately, in modern cameras this is a usability problem; for example, on my camera the mirror lock-up feature exists, but the mirror still flips during bracketing, which is a real shame. I seriously think there is a lot of stupidity even among the best camera manufacturers.) And the lighting still fluctuates slightly, no matter how hard you try. And, finally, I don't know what your in-camera processing does. Do you shoot in raw?

What can I suggest? Well, learn to live with it. Make your algorithms independent of small variations in pixels. Experiment with statistics: take many shots of the identical scene and average out the differences. Unfortunately, you did not share the ultimate goals of your project; maybe I could give you some ideas if I knew them. Anyway, it sounds interesting!
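
Something like this, for example (a minimal sketch; `measure` stands in for whatever per-shot RGB-sum routine you already have):

C#
using System;

static class ShotAveraging
{
    // Average the per-shot channel measurements over many shots of the
    // identical scene; independent shot noise shrinks roughly as 1/sqrt(N).
    public static (double R, double G, double B) AverageShots(
        string[] imagePaths,
        Func<string, (long R, long G, long B)> measure) // your RGB-sum routine
    {
        double r = 0, g = 0, b = 0;
        foreach (string path in imagePaths)
        {
            var s = measure(path);
            r += s.R; g += s.G; b += s.B;
        }
        int n = imagePaths.Length;
        return (r / n, g / n, b / n);
    }
}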

—SA
 
 
Comments
azure912 28-Apr-13 19:48pm    
Thank you very much, Sergey Alexandrovich. I agree with you that there is some random noise during capture. But I think the variance is too high (sometimes more than 15%), which cannot be normal in this situation.


The goal of our project is to calibrate the RGB pixels of an LED screen, so the RGB measurement must be accurate.

Averaging multiple shots of the identical scene is one option, but it costs a lot of time, and it is not the approach used by the commercial LED-calibration software.

We have fixed the camera and the target LED screen, and all lighting is controlled in a dark room so that the light does not fluctuate.

I guess the phenomenon may be caused by the reasons below:


1. The LED pixels change with time. Is the shutter time long enough to keep the RGB measurement stable?

We set aperture 32, shutter time 1/60 s, ISO 100. Could you give some suggestions about the camera parameter settings?

2. In-camera processing. From the EDSDK API documents, we know that PictureStyle and WhiteBalance operations are performed in the CR2->JPEG conversion.

We chose the default settings: PictureStyle => Standard, WhiteBalance => Daylight. Are those the correct settings? We could also pin them explicitly instead of relying on the defaults, as in the sketch below.
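
A sketch of pinning them explicitly (PropID_WhiteBalance and PropID_PictureStyle are standard EDSDK property IDs; the enum and the numeric picture-style code should be checked against the EDSDK headers for our wrapper):

C#
// Sketch: set white balance and picture style explicitly so the in-camera
// CR2->JPEG processing cannot drift between shots. Check the constants
// against your EDSDK headers; 0x0081 is Standard in the picture-style table.
EDSDK.EdsSetPropertyData(cam, EDSDK.PropID_WhiteBalance, 0, 4,
    (uint)EDSDK.EdsWhiteBalance.Daylight);
EDSDK.EdsSetPropertyData(cam, EDSDK.PropID_PictureStyle, 0, 4,
    (uint)0x0081);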

3. Another option for the second point, as I mentioned in the initial question: we capture in raw format (.CR2) and use dcraw to convert it to PPM or TIFF. Are the dcraw arguments (-v -k 2400 -S 13000 -W -g 2.222 4.5) enough to keep the conversion deterministic?


Thanks again for your help.
Sergey Alexandrovich Kryukov 28-Apr-13 20:08pm    
From the very beginning, your variance values felt higher to me than I would expect. Consider my answer preliminary.

I would need to know your measurement method; this is not as simple as it may seem. Imagine that your image is well focused and high-contrast, and you measure the variance pixel by pixel. In that case you will not be able to separate the different effects. I can explain why: suppose the sensor variance is very low, but the camera shakes by some 0.5-3 pixels. If the picture is high-contrast, a bright pixel may land on a dark area. I hope you see the effect.
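
To put rough numbers on it (a hypothetical illustration, not a measurement): if a measurement block is 20 pixels wide and a sharp bright/dark edge runs along its border, a shift of just 1 pixel moves an entire 20-pixel column of bright pixels into or out of the block, changing the block sum by about 1/20 = 5% on its own; a 3-pixel shift gives about 15%, comparable to the magnitude you report.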

How can you separate the effects? First of all, make a series of measurements with a strongly de-focused uniform background. Uniformity alone does not solve the problem; defocusing is the most important technique.

You can estimate the light fluctuations by varying the exposure time. First of all, compare the shortest possible and the longest possible exposures. Of course, the exposure should be normal in all cases (far from black and far from burned out), and never use auto-exposure. Color balance does not matter much, but it should match the light conditions. Do you have a standard gray card? (You can buy one at a photo store.) Better yet, use a custom color balance, but never change it (with the same light source).
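
A sketch of that experiment, reusing the EDSDK calls from your snippet (the Tv codes in the array are placeholders; take the real ones from the Canon Tv code table, and in real code wait for each download callback before the next shot):

C#
// Sketch: shoot the same scene at several fixed shutter speeds, many
// shots per speed, then compare the per-channel variance offline.
uint[] tvCodes = { 0x60, 0x6b, 0x78 }; // hypothetical: slow, medium, fast
foreach (uint tv in tvCodes)
{
    EDSDK.EdsSetPropertyData(cam, EDSDK.PropID_Tv, 0, 4, tv);
    for (int shot = 0; shot < 10; shot++)
    {
        EDSDK.EdsSendCommand(cam, EDSDK.CameraCommand_TakePicture, 0);
        // ...wait here for the object-event download to complete...
    }
}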

Forget JPEG forever (for measurement applications like this). Shoot in raw and only in raw.

I'm not familiar with the CR2 format. If you could access the raw pixels directly, that would be best.

—SA
azure912 29-Apr-13 11:29am    
Our algorithm measures the color of each LED pixel. For example, to measure how green one LED pixel shines, we drive that LED pixel to output green light, capture the screen (including that pixel) in one image, and then analyze the LED pixel within one rectangular block of the image (we locate the pixel's position accurately). The sum of the green values in the rectangular block describes the green level of that LED pixel. But in our experiments, the green sum varies considerably every time we shoot.

You mentioned measurements with a strongly de-focused uniform background. I don't know anything about this defocusing technique; could you point me to some documentation about it?


The camera is fixed on a tripod, so I think the 0.5-3 pixel movement caused by shake can be avoided. CR2 is Canon's proprietary raw format; it is quite similar to other raw formats.

Many thanks for your kind help.
Sergey Alexandrovich Kryukov 29-Apr-13 12:38pm    
"Defocusing" is not a technique: simply means that you turn your lens focus ring to make picture fuzzy. Bokeh:
http://en.wikipedia.org/wiki/Bokeh

So far, you are focused on your LED-screen pixels, and you are mixing up spatial and temporal effects. Separate them. First, measure the variance of pixel values due to lighting or camera noise, in different modes, on a uniform background. The background can be truly uniform only if its color is uniform and (important!) its image is far out of the lens's focus: diaphragm as wide open as possible, and out of focus.

Separately, study the resolution. Do you have a "mira" (a special target with lines to resolve)? If not, use your LED screen. For example, do you know how you are shooting: how many camera pixels do you get per LED pixel? Photograph just one isolated pixel, or straight vertical or horizontal lines of pixels, with different spacing between the lines and under different conditions. The problem could be there... As a bonus, you would be able to evaluate your distortion...

—SA
azure912 30-Apr-13 11:39am    
For the second point: our LED screen is very big, and the camera is quite far from it. We have designed an algorithm to accurately segment the image by LED pixel. Every LED pixel occupies about 20*20 = 400 image pixels. We sum the green values over the whole rectangular block to describe the green level of that LED pixel.

We take the pictures in a dark room or outside at night, so the background is simply black; I mean, all background pixels are zero. So I think the background cannot be the root cause.

The refresh frequency of the LED is 100 Hz, and our shutter time is 1/60 second. I don't know whether the shutter time is too short to capture stable RGB values.
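
A back-of-envelope check on that (our own rough reasoning, not a measurement): at a 100 Hz refresh, a 1/60 s exposure covers 100/60 ≈ 1.67 refresh cycles, so depending on the phase at which the shutter opens, the sensor integrates a varying fraction of a cycle. That alone could plausibly produce shot-to-shot changes on the order of the ±15% we observe. An exposure equal to an integer number of refresh periods (e.g. 1/50 s = 2 cycles, or 1/25 s = 4 cycles) would remove this phase dependence.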


The aperture number is set as large as possible (about f/32), but the image is still over-exposed: many pixels are at 255.

The ISO is set to 100; should I set it higher?
