I need the same functionality as the app "Instant Heart Rate".

The basic procedure requires the user to:

  • Gently place the tip of the index finger on the camera lens.
  • Apply even pressure, covering the entire lens.
  • Hold steady for 10 seconds to get a heart rate reading.

This works by turning on the flash and watching the light change as blood moves through the index finger.

How do I get light-level data from the video capture? Where should I look? I've gone through the AVCaptureDevice class but haven't found anything useful.

I also found AVCaptureDeviceSubjectAreaDidChangeNotification; would that be useful?

Best Answer

Take a look at this.

    // Create the AVCapture session
    session = [[AVCaptureSession alloc] init];

    // Use a low-resolution preset; we only need average brightness
    [session setSessionPreset:AVCaptureSessionPresetLow];

    // Get the default camera device
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Switch on the flash in torch mode
    if ([camera isTorchModeSupported:AVCaptureTorchModeOn]) {
        [camera lockForConfiguration:nil];
        camera.torchMode = AVCaptureTorchModeOn;
        [camera unlockForConfiguration];
    }

    // Create an AVCaptureInput with the camera device
    NSError *error = nil;
    AVCaptureInput *cameraInput = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:&error];
    if (cameraInput == nil) {
        NSLog(@"Error creating camera capture input: %@", error);
    }

    // Set up the output
    AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];

    // Create a serial queue to run the capture callbacks on
    dispatch_queue_t captureQueue = dispatch_queue_create("captureQueue", NULL);

    // Set up our delegate
    [videoOutput setSampleBufferDelegate:self queue:captureQueue];

    // Configure the pixel format
    videoOutput.videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA],
        (id)kCVPixelBufferPixelFormatTypeKey, nil];

    // Cap the frame rate at 10 fps
    // (minFrameDuration is deprecated on newer SDKs; use the capture
    // device's activeVideoMinFrameDuration instead)
    videoOutput.minFrameDuration = CMTimeMake(1, 10);

    // Add the input and output
    [session addInput:cameraInput];
    [session addOutput:videoOutput];

    // Start the session
    [session startRunning];
    
    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

        // This is the image buffer
        CVImageBufferRef cvimgRef = CMSampleBufferGetImageBuffer(sampleBuffer);

        // Lock the image buffer while we read from it
        CVPixelBufferLockBaseAddress(cvimgRef, 0);

        // Access the frame dimensions
        int width = (int)CVPixelBufferGetWidth(cvimgRef);
        int height = (int)CVPixelBufferGetHeight(cvimgRef);

        // Get the raw image bytes
        uint8_t *buf = (uint8_t *)CVPixelBufferGetBaseAddress(cvimgRef);
        size_t bprow = CVPixelBufferGetBytesPerRow(cvimgRef);

        // Get the average red, green and blue values from the image
        // (pixels are BGRA, 4 bytes per pixel)
        float r = 0, g = 0, b = 0;
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width * 4; x += 4) {
                b += buf[x];
                g += buf[x + 1];
                r += buf[x + 2];
            }
            buf += bprow;
        }
        r /= 255 * (float)(width * height);
        g /= 255 * (float)(width * height);
        b /= 255 * (float)(width * height);

        NSLog(@"%f,%f,%f", r, g, b);

        // Unlock the buffer before returning
        CVPixelBufferUnlockBaseAddress(cvimgRef, 0);
    }
    

Sample code here

Regarding objective-c - detecting heart rate using the camera, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/9274027/
