mastada
2005-07-12 14:10:47 UTC
Good day.
Well, another one on the Forum. Let's see what we can do. I'm using Measurement Studio and all the CWIMAQ controls, and I want to create my own calibration grid using the
CWIMAQVision1.LearnCalibrationPoints method. I've already done some work with a grid graphic and setting Dx and Dy, and that works. But now I want to pick the pixel points myself and supply the real-world X and Y for each point, as described (only slightly better than vaguely) in the IMAQ manuals. First: how many points do I need to supply to get a decent "Learn Score" from LearnCalibrationPoints? And exactly how does LearnCalibrationPoints work? I presume it somehow interpolates for all the pixels between the points selected?
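For reference, the general math behind point-pair perspective calibration can be sketched in a few lines. This is not NI's documented internal algorithm (the manual doesn't spell it out), just the standard perspective (homography) mapping fitted from four pixel/real-world correspondences, in plain Python:

```python
def solve_linear(A, b):
    # Gaussian elimination with partial pivoting on the augmented matrix
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_homography(pixel_pts, world_pts):
    # Four (u, v) -> (x, y) correspondences give an exact 8x8 system
    # for h11..h32 (with h33 fixed at 1):
    #   x = (h11*u + h12*v + h13) / (h31*u + h32*v + 1)
    #   y = (h21*u + h22*v + h23) / (h31*u + h32*v + 1)
    A, b = [], []
    for (u, v), (x, y) in zip(pixel_pts, world_pts):
        A.append([u, v, 1, 0, 0, 0, -u * x, -v * x]); b.append(x)
        A.append([0, 0, 0, u, v, 1, -u * y, -v * y]); b.append(y)
    return solve_linear(A, b)

def pixel_to_world(h, u, v):
    # Apply the fitted perspective map to one pixel coordinate
    w = h[6] * u + h[7] * v + 1.0
    return ((h[0]*u + h[1]*v + h[2]) / w, (h[3]*u + h[4]*v + h[5]) / w)

# Example: a trapezoid in the image maps to a rectangle on the water
# (pixel coordinates and world distances in feet are made up here)
h = fit_homography(
    [(100, 400), (540, 400), (200, 100), (440, 100)],
    [(-100, 0), (100, 0), (-100, 500), (100, 500)])
```

With more than four points the same system is typically solved in a least-squares sense, which is presumably where the "Learn Score" comes from (how well the extra points fit the model).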
Here is my application (plan): I have an image taken from a seagoing ship, looking out to the horizon. The horizon could be as far as 6 miles away, but I want to create a grid that extends perhaps 500 feet out and possibly 200 feet wide. I can supply many points for this region of the image, and then be less precise with the points that describe the perimeter of the image out to the horizon. I can solve the distance out to the horizon mathematically, and I can likewise solve the distance across the left and right edges of the image at the horizon, all calibrated with perspective.
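The horizon-distance part mentioned above is straightforward: for an observer height h above sea level, the geometric distance to the horizon is about sqrt(2*R*h), where R is the Earth's radius. A minimal sketch (ignoring atmospheric refraction, which extends the figure by several percent):

```python
import math

def horizon_distance_miles(eye_height_ft):
    """Geometric distance to the sea horizon, in statute miles,
    for a camera eye height given in feet: d = sqrt(2*R*h)."""
    R_mi = 3959.0                      # mean Earth radius, statute miles
    h_mi = eye_height_ft / 5280.0      # convert eye height to miles
    return math.sqrt(2.0 * R_mi * h_mi)
```

For a camera mounted about 25 ft above the waterline this gives roughly 6 statute miles, consistent with the 6-mile horizon figure above.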
I've attached a picture. You'll see 19 points supplied in the pixel image, and therefore 19 corresponding real-world positions, all relative to each other.
Please let me know if this will work. LearnCalibrationPoints should pull it off, as I want to build a powerful user-defined tool with the NI tools.
Thanks for the help on previous threads.
UserCalibration.jpg: