My TwilioVideo module is hitting an error where the capturer (camera or microphone) receives no input. The error started after we switched to CocoaPods to install the SDK and the PureLayout UI library; previously, we had installed all of these dependencies manually in Xcode.

I'm building a React Native 0.40.0 iOS app with react-native-cli 1.0.0, Xcode 8.2.1 (8C1002), and the iPhone 6 simulator running iOS 10.2. I'm using CocoaPods 1.2.0 and TwilioVideo SDK 1.0.0-beta5. There is also a 1.0.0-beta6 release, which I tried with the same result. Reverting to 1.0.0-beta4 does make the error go away, which suggests the problem lies in how I register the audio and video tracks.

Here is my Podfile:

source 'https://github.com/CocoaPods/Specs'
source 'https://github.com/twilio/cocoapod-specs'

target 'MyApp' do
  # Uncomment the next line if you're using Swift or would like to use dynamic frameworks
  # use_frameworks!

  # Pods for MyApp
  pod 'TwilioVideo', '1.0.0-beta5'
  pod 'PureLayout', '~> 3.0'

  target 'MapleNativeProviderTests' do
    inherit! :search_paths
    # Pods for testing
  end

end

I implemented the TwilioVideo module in Xcode based on the react-native-twilio-video-webrtc repository. The author recently updated that repo for React Native 0.40.0, which changed the Xcode import syntax. I've tried both the old and the new import syntax, and I still get the following error when I try to mount the video component:
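For reference, the React Native 0.40.0 import change looks roughly like this (RCTBridgeModule is shown as an example header; substitute whichever React headers the module actually includes):

```
// Before React Native 0.40.0, React headers were imported directly:
#import "RCTBridgeModule.h"

// From 0.40.0 on, they moved under a framework-style path:
#import <React/RCTBridgeModule.h>
```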

(screenshot: runtime error "TwilioVideo iOS SDK's TVIVideoCapturer is not providing a capturer (iPhone camera)")

Here is the documentation for the TwilioVideo SDK, and here is TVIVideoCapturer.

My modified react-native-twilio-video-webrtc is really just a thin wrapper around the TwilioVideo SDK that exposes its key API methods via RCT_EXPORT_METHOD. The library originally set up the audio and video tracks in its init method, which caused some nasty behavior where event listeners were not receiving callbacks at app launch. So I moved that track setup into a custom exposed RCT_EXPORT_METHOD called initialize. I call it from a specific view in the app, which mounts the video and initializes the camera/microphone input.

My TWVideoModule.m implementation is:
#import "TWVideoModule.h"

static NSString* roomDidConnect               = @"roomDidConnect";
static NSString* roomDidDisconnect            = @"roomDidDisconnect";
static NSString* roomDidFailToConnect         = @"roomDidFailToConnect";
static NSString* roomParticipantDidConnect    = @"roomParticipantDidConnect";
static NSString* roomParticipantDidDisconnect = @"roomParticipantDidDisconnect";

static NSString* participantAddedVideoTrack   = @"participantAddedVideoTrack";
static NSString* participantRemovedVideoTrack = @"participantRemovedVideoTrack";
static NSString* participantAddedAudioTrack   = @"participantAddedAudioTrack";
static NSString* participantRemovedAudioTrack = @"participantRemovedAudioTrack";
static NSString* participantEnabledTrack      = @"participantEnabledTrack";
static NSString* participantDisabledTrack     = @"participantDisabledTrack";

static NSString* cameraDidStart               = @"cameraDidStart";
static NSString* cameraWasInterrupted         = @"cameraWasInterrupted";
static NSString* cameraDidStopRunning         = @"cameraDidStopRunning";

@interface TWVideoModule () <TVIParticipantDelegate, TVIRoomDelegate, TVIVideoTrackDelegate, TVICameraCapturerDelegate>

@end

@implementation TWVideoModule

@synthesize bridge = _bridge;

RCT_EXPORT_MODULE();

- (dispatch_queue_t)methodQueue
{
  return dispatch_get_main_queue();
}

- (NSArray<NSString *> *)supportedEvents
{
  return @[roomDidConnect,
           roomDidDisconnect,
           roomDidFailToConnect,
           roomParticipantDidConnect,
           roomParticipantDidDisconnect,
           participantAddedVideoTrack,
           participantRemovedVideoTrack,
           participantAddedAudioTrack,
           participantRemovedAudioTrack,
           participantEnabledTrack,
           participantDisabledTrack,
           cameraDidStopRunning,
           cameraDidStart,
           cameraWasInterrupted];
}


- (instancetype)init
{
  self = [super init];
  if (self) {

    UIView* remoteMediaView = [[UIView alloc] init];
    //remoteMediaView.backgroundColor = [UIColor blueColor];

    //remoteMediaView.translatesAutoresizingMaskIntoConstraints = NO;
    self.remoteMediaView = remoteMediaView;


    UIView* previewView = [[UIView alloc] init];
    //previewView.backgroundColor = [UIColor yellowColor];

    //previewView.translatesAutoresizingMaskIntoConstraints = NO;
    self.previewView = previewView;

  }
  return self;
}

- (void)dealloc
{
  [self.remoteMediaView removeFromSuperview];
  self.remoteMediaView = nil;

  [self.previewView removeFromSuperview];
  self.previewView = nil;

  self.participant = nil;
  self.localMedia = nil;
  self.camera = nil;
  self.localVideoTrack = nil;
  self.videoClient = nil;
  self.room = nil;
}

RCT_EXPORT_METHOD(initialize) {
  self.localMedia = [[TVILocalMedia alloc] init];
  self.camera = [[TVICameraCapturer alloc] init];

  NSLog(@"Camera %@", self.camera);

  self.camera.delegate = self;

  self.localVideoTrack = [self.localMedia addVideoTrack:YES
                                               capturer:self.camera
                                            constraints:[self videoConstraints]
                                                  error:nil];

  self.localAudioTrack = [self.localMedia addAudioTrack:YES];

  if (!self.localVideoTrack) {
    NSLog(@"Failed to add video track");
  } else {
    // Attach view to video track for local preview
    [self.localVideoTrack attach:self.previewView];
  }

}
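Not part of the original module, but since reverting to beta4 makes the error disappear, one way to narrow down where beta5 breaks is to fail fast on each step of the setup. This is a sketch under the assumption that the beta5 initializers can return nil when no capture source is available; the log messages are illustrative:

```
RCT_EXPORT_METHOD(initialize) {
  self.localMedia = [[TVILocalMedia alloc] init];
  self.camera = [[TVICameraCapturer alloc] init];

  // Hypothetical defensive check, not in the original code:
  // bail out early if the capturer itself could not be created.
  if (!self.camera) {
    NSLog(@"TVICameraCapturer could not be created; no capture source available");
    return;
  }
  self.camera.delegate = self;

  NSError *error = nil;
  self.localVideoTrack = [self.localMedia addVideoTrack:YES
                                               capturer:self.camera
                                            constraints:[self videoConstraints]
                                                  error:&error];
  if (!self.localVideoTrack) {
    // Passing an NSError out-param instead of nil surfaces the SDK's
    // own explanation of why the track could not be added.
    NSLog(@"Failed to add video track: %@", error);
  }
}
```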

The rest of the file deals with adding and removing tracks and joining/disconnecting from the Twilio channel, so I haven't included it. I also have TWVideoPreviewManager and TWRemotePreviewManager, which simply provide UIViews for the local and remote media streams.

My TwilioVideoComponent.js component is:
import React, { Component, PropTypes } from 'react'
import {
    NativeModules,
    NativeEventEmitter
} from 'react-native';

import {
    View,
} from 'native-base';

const {TWVideoModule} = NativeModules;

class TwilioVideoComponent extends Component {

    state = {};

    static propTypes = {
        onRoomDidConnect: PropTypes.func,
        onRoomDidDisconnect: PropTypes.func,
        onRoomDidFailToConnect: PropTypes.func,
        onRoomParticipantDidConnect: PropTypes.func,
        onRoomParticipantDidDisconnect: PropTypes.func,
        onParticipantAddedVideoTrack: PropTypes.func,
        onParticipantRemovedVideoTrack: PropTypes.func,
        onParticipantAddedAudioTrack: PropTypes.func,
        onParticipantRemovedAudioTrack: PropTypes.func,
        onParticipantEnabledTrack: PropTypes.func,
        onParticipantDisabledTrack: PropTypes.func,
        onCameraDidStart: PropTypes.func,
        onCameraWasInterrupted: PropTypes.func,
        onCameraDidStopRunning: PropTypes.func,
        ...View.propTypes,
    };

    _subscriptions = [];

    constructor(props) {
        super(props);

        this.flipCamera = this.flipCamera.bind(this);
        this.startCall = this.startCall.bind(this);
        this.endCall = this.endCall.bind(this);

        this._eventEmitter = new NativeEventEmitter(TWVideoModule)
    }

    //
    // Methods

    /**
     * Initializes camera and microphone tracks
     */
    initializeVideo() {
        TWVideoModule.initialize();
    }

    flipCamera() {
        TWVideoModule.flipCamera();
    }

    startCall({roomName, accessToken}) {
        TWVideoModule.startCallWithAccessToken(accessToken, roomName);
    }

    endCall() {
        TWVideoModule.disconnect();
    }

    toggleVideo() {
        TWVideoModule.toggleVideo();
    }

    toggleAudio() {
        TWVideoModule.toggleAudio();
    }

    _unregisterEvents() {
        this._subscriptions.forEach(e => e.remove());
        this._subscriptions = []
    }

    _registerEvents() {

        this._subscriptions = [

            this._eventEmitter.addListener('roomDidConnect', (data) => {
                if (this.props.onRoomDidConnect) {
                    this.props.onRoomDidConnect(data)
                }
            }),

            this._eventEmitter.addListener('roomDidDisconnect', (data) => {
                if (this.props.onRoomDidDisconnect) {
                    this.props.onRoomDidDisconnect(data)
                }
            }),

            this._eventEmitter.addListener('roomDidFailToConnect', (data) => {
                if (this.props.onRoomDidFailToConnect) {
                    this.props.onRoomDidFailToConnect(data)
                }
            }),

            this._eventEmitter.addListener('roomParticipantDidConnect', (data) => {
                if (this.props.onRoomParticipantDidConnect) {
                    this.props.onRoomParticipantDidConnect(data)
                }
            }),

            this._eventEmitter.addListener('roomParticipantDidDisconnect', (data) => {
                if (this.props.onRoomParticipantDidDisconnect) {
                    this.props.onRoomParticipantDidDisconnect(data)
                }
            }),

            this._eventEmitter.addListener('participantAddedVideoTrack', (data) => {
                if (this.props.onParticipantAddedVideoTrack) {
                    this.props.onParticipantAddedVideoTrack(data)
                }
            }),

            this._eventEmitter.addListener('participantRemovedVideoTrack', (data) => {
                if (this.props.onParticipantRemovedVideoTrack) {
                    this.props.onParticipantRemovedVideoTrack(data)
                }
            }),

            this._eventEmitter.addListener('participantAddedAudioTrack', (data) => {
                if (this.props.onParticipantAddedAudioTrack) {
                    this.props.onParticipantAddedAudioTrack(data)
                }
            }),

            this._eventEmitter.addListener('participantRemovedAudioTrack', (data) => {
                if (this.props.onParticipantRemovedAudioTrack) {
                    this.props.onParticipantRemovedAudioTrack(data)
                }
            }),

            this._eventEmitter.addListener('participantEnabledTrack', (data) => {
                if (this.props.onParticipantEnabledTrack) {
                    this.props.onParticipantEnabledTrack(data)
                }
            }),

            this._eventEmitter.addListener('participantDisabledTrack', (data) => {
                if (this.props.onParticipantDisabledTrack) {
                    this.props.onParticipantDisabledTrack(data)
                }
            }),

            this._eventEmitter.addListener('cameraDidStart', (data) => {
                if (this.props.onCameraDidStart) {
                    this.props.onCameraDidStart(data)
                }
            }),

            this._eventEmitter.addListener('cameraWasInterrupted', (data) => {
                if (this.props.onCameraWasInterrupted) {
                    this.props.onCameraWasInterrupted(data)
                }
            }),

            this._eventEmitter.addListener('cameraDidStopRunning', (data) => {
                if (this.props.onCameraDidStopRunning) {
                    this.props.onCameraDidStopRunning(data)
                }
            })

        ]

    }

    componentWillMount() {
        // Note: _registerEvents() already subscribes to 'cameraDidStart';
        // adding a second, untracked listener here would leak on unmount.
        this._registerEvents()
    }

    componentWillUnmount() {
        this._unregisterEvents()
    }

    render() {
        return this.props.children || null
    }
}

export default TwilioVideoComponent;

I'm not sure how to modify the Xcode project to make it compatible with the TwilioVideo beta5 API. Any help would be appreciated.

Best Answer

In your Podfile, find # use_frameworks! and remove the #.
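Applied to the Podfile in the question, the fix looks like this (presumably needed because the TwilioVideo pod is distributed as a dynamic framework, which CocoaPods only links when use_frameworks! is enabled; the rest of the file is unchanged):

```
source 'https://github.com/CocoaPods/Specs'
source 'https://github.com/twilio/cocoapod-specs'

target 'MyApp' do
  # Link pods as dynamic frameworks so the TwilioVideo framework loads
  use_frameworks!

  pod 'TwilioVideo', '1.0.0-beta5'
  pod 'PureLayout', '~> 3.0'

  target 'MapleNativeProviderTests' do
    inherit! :search_paths
  end
end
```

After editing the Podfile, run pod install again and rebuild the workspace.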

Regarding "ios - TwilioVideo iOS SDK's TVIVideoCapturer not providing a capturer (iPhone camera)", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/42075289/
