Question
I'm trying to mirror an iOS device's screen to OS X over a USB connection. QuickTime does this fine, and I read this article, which includes a code example: https://nadavrub.wordpress.com/2015/07/06/macos-media-capture-using-coremediaio/
However, the callback passed to CMIOStreamCopyBufferQueue is never called, and I'm wondering what I'm doing wrong.
Has anyone faced this issue and can provide a working example?
Thanks.
Recommended answer
Well.. eventually I did what Nadav describes in his blog: discover the DAL devices and capture their output using an AVCaptureSession, like this:
-(id) init {
    // Allow iOS devices to be discovered as DAL (screen capture) devices
    CMIOObjectPropertyAddress prop = {
        kCMIOHardwarePropertyAllowScreenCaptureDevices,
        kCMIOObjectPropertyScopeGlobal,
        kCMIOObjectPropertyElementMaster
    };
    UInt32 allow = 1;
    CMIOObjectSetPropertyData( kCMIOObjectSystemObject,
                               &prop, 0, NULL,
                               sizeof(allow), &allow );

    // Get the currently attached muxed (audio+video) devices
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeMuxed];
    BOOL deviceAttached = false;
    for (int i = 0; i < [devices count]; i++) {
        AVCaptureDevice *device = devices[i];
        if ([[device uniqueID] isEqualToString:/*deviceUDID*/]) {
            deviceAttached = true;
            [self startSession:device];
            break;
        }
    }

    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    // Device not attached yet - subscribe to onConnect notifications
    if (!deviceAttached) {
        id deviceWasConnectedObserver = [notificationCenter addObserverForName:AVCaptureDeviceWasConnectedNotification
                                                                         object:nil
                                                                          queue:[NSOperationQueue mainQueue]
                                                                     usingBlock:^(NSNotification *note) {
            AVCaptureDevice *device = note.object;
            [self deviceConnected:device];
        }];
        observers = [[NSArray alloc] initWithObjects:deviceWasConnectedObserver, nil];
    }
    return self;
}
- (void) deviceConnected:(AVCaptureDevice *)device {
    if ([[device uniqueID] isEqualToString:/*deviceUDID*/]) {
        [self startSession:device];
    }
}
- (void) startSession:(AVCaptureDevice *)device {
    // Init capture session
    session = [[AVCaptureSession alloc] init];

    // Start session configuration
    [session beginConfiguration];

    // Add session input
    NSError *error;
    newVideoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (newVideoDeviceInput == nil) {
        dispatch_async(dispatch_get_main_queue(), ^(void) {
            NSLog(@"%@", error);
        });
    } else {
        [session addInput:newVideoDeviceInput];
    }

    // Add session output, requesting 32BGRA frames
    videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    videoDataOutput.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                                 forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    dispatch_queue_t videoQueue = dispatch_queue_create("videoQueue", NULL);
    [videoDataOutput setSampleBufferDelegate:self queue:videoQueue];
    [session addOutput:videoDataOutput];

    // Finish session configuration
    [session commitConfiguration];

    // Start the session
    [session startRunning];
}
#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    NSImage *resultNSImage = [self imageFromSampleBuffer:sampleBuffer];
    /*
     * Here you can do whatever you need with the frame (e.g. convert to JPG)
     */
}
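The -imageFromSampleBuffer: helper called in the delegate above is not shown in the answer. Below is a minimal sketch of one possible implementation, assuming the kCVPixelFormatType_32BGRA output configured in -startSession: and going through Core Image; the method name matches the call above, but the body is illustrative, not from the original post:

// Requires AppKit, CoreImage, CoreMedia and CoreVideo.
- (NSImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    // Grab the pixel buffer that backs this video frame
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (imageBuffer == NULL) {
        return nil;
    }
    // Wrap the pixel buffer in a CIImage and render it into an NSImage
    CIImage *ciImage = [CIImage imageWithCVImageBuffer:imageBuffer];
    NSCIImageRep *rep = [NSCIImageRep imageRepWithCIImage:ciImage];
    NSImage *image = [[NSImage alloc] initWithSize:rep.size];
    [image addRepresentation:rep];
    return image;
}

From there, the "convert to JPG" mentioned in the comment can be done with NSBitmapImageRep, e.g. -representationUsingType:properties: with NSBitmapImageFileTypeJPEG.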