kalganian
2008-07-15 18:40:10 UTC
Greetings,

I'm using a PCI-6533 board to acquire digital data from an imager chip, programming it with the MS Visual C API. The chip generates a VSYNC pulse, after which it generates 352*288 = 101376 ticks of a pixel clock; on each tick the data is available on an 8-bit wide bus. Once all the ticks are done, VSYNC pulses again and the next frame is read out.

I have the circuit hooked up as follows:

VSYNC to PFI6
PIXEL CLOCK to PFI2
8-bit output bus to port0

Following is the configuration code for the card:

DAQmxErrChk (DAQmxCreateTask("di_pfi6start_pfi2clk",&taskHandle));
DAQmxErrChk (DAQmxCreateDIChan(taskHandle,"Dev4/port0","",DAQmx_Val_ChanForAllLines));
DAQmxErrChk (DAQmxCfgDigEdgeStartTrig(taskHandle,"/Dev4/PFI6",DAQmx_Val_Rising));
DAQmxErrChk (DAQmxCfgSampClkTiming(taskHandle,"/Dev4/PFI2",DINFREQ,DAQmx_Val_Rising,DAQmx_Val_ContSamps,10*ROW*COL));
DAQmxErrChk (DAQmxRegisterEveryNSamplesEvent(taskHandle,DAQmx_Val_Acquired_Into_Buffer,ROW*COL,0,EveryNCallback,NULL));
DAQmxErrChk (DAQmxRegisterDoneEvent(taskHandle,0,DoneCallback,NULL));
DAQmxErrChk (DAQmxStartTask(taskHandle));

And this is the read call inside the EveryNCallback function:

DAQmxErrChk (DAQmxReadDigitalU32(taskHandle,ROW*COL,10.0,DAQmx_Val_GroupByScanNumber,frame,ROW*COL,&read,NULL));

Now, when the code runs, it appears that the digital acquisition starts a random amount of time after the VSYNC pulse. Basically, the data I read out is not aligned to the beginning of the frame but starts at a different location each time I run the program. The problem persists even if there are several tens of milliseconds between VSYNC and the first PIXEL CLOCK edge. Also, the way the code is configured, it only waits for the first VSYNC before acquiring data; after that it just grabs a frame's worth of data each time. I do not see the frame offset changing from frame to frame. For example:

1. I run the code.
2. I wait for a VSYNC and the card grabs 101376 samples, but instead of samples 1 to 101376, it reads samples 101 to 101476. It looks like the card had some latency after getting triggered by the VSYNC and could not read sample 1; since it was configured to read 101376 samples, it read them, but starting from a random point (101 in this case; if I run the code again, it will be different).
3. After the first set of 101376 samples, the DAQ again has to read 101376 samples, and this time it reads them from sample 101477 onwards. Thus the offset of 100 samples from the VSYNC remains constant.

Any suggestions about what might be happening? Am I right in my understanding of DAQmxCfgDigEdgeStartTrig and DAQmxCfgSampClkTiming? Is there a better way of doing this?

Thanks,
Kartik
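
P.S. In case the full context helps, here is a trimmed-down sketch of how the whole program is put together. The DAQmxErrChk macro, the callback signatures, and the cleanup pattern are the usual boilerplate from the NI-DAQmx C examples; ROW, COL, the DINFREQ value, and the frame/read declarations are my own placeholders, so treat the exact numbers as illustrative.

/* Sketch of the full acquisition program (boilerplate from the NI-DAQmx C examples). */
#include <stdio.h>
#include <NIDAQmx.h>

#define DAQmxErrChk(functionCall) if( DAQmxFailed(error=(functionCall)) ) goto Error; else

#define ROW      288
#define COL      352
#define DINFREQ  1000000.0   /* expected pixel-clock rate in Hz -- placeholder value */

static uInt32 frame[ROW*COL];   /* one 352x288 frame, one uInt32 per 8-bit sample */

int32 CVICALLBACK EveryNCallback(TaskHandle taskHandle, int32 everyNsamplesEventType, uInt32 nSamples, void *callbackData);
int32 CVICALLBACK DoneCallback(TaskHandle taskHandle, int32 status, void *callbackData);

int main(void)
{
    int32       error = 0;
    TaskHandle  taskHandle = 0;
    char        errBuff[2048] = {'\0'};

    /* configure the task: start on VSYNC (PFI6), sample port0 on each pixel clock (PFI2) */
    DAQmxErrChk (DAQmxCreateTask("di_pfi6start_pfi2clk",&taskHandle));
    DAQmxErrChk (DAQmxCreateDIChan(taskHandle,"Dev4/port0","",DAQmx_Val_ChanForAllLines));
    DAQmxErrChk (DAQmxCfgDigEdgeStartTrig(taskHandle,"/Dev4/PFI6",DAQmx_Val_Rising));
    DAQmxErrChk (DAQmxCfgSampClkTiming(taskHandle,"/Dev4/PFI2",DINFREQ,DAQmx_Val_Rising,DAQmx_Val_ContSamps,10*ROW*COL));
    DAQmxErrChk (DAQmxRegisterEveryNSamplesEvent(taskHandle,DAQmx_Val_Acquired_Into_Buffer,ROW*COL,0,EveryNCallback,NULL));
    DAQmxErrChk (DAQmxRegisterDoneEvent(taskHandle,0,DoneCallback,NULL));
    DAQmxErrChk (DAQmxStartTask(taskHandle));

    printf("Acquiring frames. Press Enter to stop.\n");
    getchar();

Error:
    if( DAQmxFailed(error) )
        DAQmxGetExtendedErrorInfo(errBuff,2048);
    if( taskHandle!=0 ) {
        DAQmxStopTask(taskHandle);
        DAQmxClearTask(taskHandle);
    }
    if( DAQmxFailed(error) )
        printf("DAQmx Error: %s\n",errBuff);
    return 0;
}

int32 CVICALLBACK EveryNCallback(TaskHandle taskHandle, int32 everyNsamplesEventType, uInt32 nSamples, void *callbackData)
{
    int32 error = 0;
    int32 read  = 0;
    char  errBuff[2048] = {'\0'};

    /* pull one frame's worth of samples out of the buffer */
    DAQmxErrChk (DAQmxReadDigitalU32(taskHandle,ROW*COL,10.0,DAQmx_Val_GroupByScanNumber,frame,ROW*COL,&read,NULL));

    /* ... process / save the frame here ... */

Error:
    if( DAQmxFailed(error) ) {
        DAQmxGetExtendedErrorInfo(errBuff,2048);
        printf("DAQmx Error: %s\n",errBuff);
    }
    return 0;
}

int32 CVICALLBACK DoneCallback(TaskHandle taskHandle, int32 status, void *callbackData)
{
    if( DAQmxFailed(status) )
        printf("Task stopped with error %d.\n",(int)status);
    return 0;
}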