objective c - Is there a simpler way to capture bytes from an iOS device's camera?


I'm working on image processing with OpenCV on iOS 5.1. I have libraries that detect markers and do other things.

What I need is to take each frame from the video camera and process it further with OpenCV. I have found sample code, for example at https://developer.apple.com/library/ios/#qa/qa1702/_index.html, but it seems unnecessarily difficult. I mean, do I really need to write 500 lines of code just to capture a frame and push it to a view?
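For comparison, the core of the setup can be much shorter than the full QA1702 sample. Below is a minimal sketch, assuming the containing class adopts AVCaptureVideoDataOutputSampleBufferDelegate; the method name setupCapture and the session property are illustrative, not from the original post:

```objc
// Minimal AVFoundation capture pipeline sketch (iOS 5-era syntax).
// Assumes: self conforms to AVCaptureVideoDataOutputSampleBufferDelegate.
#import <AVFoundation/AVFoundation.h>

- (void)setupCapture {
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPreset640x480;

    AVCaptureDevice *camera =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (input) [session addInput:input];

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    // Request a planar YUV format so plane 0 is a ready-made grayscale image.
    output.videoSettings = [NSDictionary
        dictionaryWithObject:[NSNumber numberWithUnsignedInt:
                              kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]
                      forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    output.alwaysDiscardsLateVideoFrames = YES;

    dispatch_queue_t queue = dispatch_queue_create("camera.frames", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    [session addOutput:output];

    self.session = session; // hypothetical property that keeps the session alive
    [session startRunning];
}
```

Each captured frame then arrives in the delegate's captureOutput:didOutputSampleBuffer:fromConnection: method on the queue passed above.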

Can anyone please give me a hint on where to start?

Update: below is the simplest captureOutput:didOutputSampleBuffer:fromConnection: method I've been able to write.

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    @autoreleasepool {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer, 0);
        uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);

        CGRect videoRect = CGRectMake(0.0f, 0.0f,
                                      CVPixelBufferGetWidth(imageBuffer),
                                      CVPixelBufferGetHeight(imageBuffer));
        // Pass bytesPerRow as the step: pixel-buffer rows can be padded,
        // so letting OpenCV assume width * elemSize (step 0) may be wrong.
        cv::Mat mat(videoRect.size.height, videoRect.size.width, CV_8UC1,
                    baseAddress, CVPixelBufferGetBytesPerRow(imageBuffer));
        [self processFrame:mat videoRect:videoRect]; // implemented in a subclass - it's going to be an OpenCV + OpenGL view :)
        mat.release();
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    }
}
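One caveat with wrapping the buffer as CV_8UC1: that only matches if the output was configured for a planar YUV pixel format, and with the biplanar formats the luma plane should be addressed explicitly rather than through CVPixelBufferGetBaseAddress. A sketch of the body under that assumption (kCVPixelFormatType_420YpCbCr8BiPlanarFullRange output):

```objc
// Variant of the delegate body assuming a biplanar 420f pixel buffer:
// plane 0 is the luma (Y) plane, which maps directly onto a CV_8UC1 Mat.
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);

uint8_t *yPlane = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
size_t width   = CVPixelBufferGetWidthOfPlane(imageBuffer, 0);
size_t height  = CVPixelBufferGetHeightOfPlane(imageBuffer, 0);
size_t stride  = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 0);

// No copy: the Mat aliases the Y plane, so it is only valid while locked.
cv::Mat gray((int)height, (int)width, CV_8UC1, yPlane, stride);
// ... run marker detection etc. on gray here ...

CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
```

If the output is BGRA instead (kCVPixelFormatType_32BGRA), the wrap would be CV_8UC4 on the whole buffer, followed by cv::cvtColor to grayscale.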

