Looking for help porting this Objective-C class method to JS/NativeScript. Every variation I have tried results in TypeError: undefined is not a function...

https://developer.apple.com/documentation/avfoundation/avvideocomposition/1389556-init

I tried writing it in JS as:

const videoComp = AVVideoComposition.alloc().initWithAssetApplyingCIFiltersWithHandler(asset, (request) => { ... });

// or
const videoComp = AVVideoComposition.alloc().initAssetApplyingCIFiltersWithHandler(asset, (request) => { ... });

// or
const videoComp = AVVideoComposition.alloc().initAssetApplyingCIFiltersWithHandlerApplier(asset, (request) => { ... });

// or
const videoComp = new AVVideoComposition(asset, (request) => { ... });


to name just a few. Essentially, I am trying to port this code to NativeScript/JS:

let blurRadius = 6.0
let asset = AVAsset(url: streamURL)
let item = AVPlayerItem(asset: asset)
item.videoComposition = AVVideoComposition(asset: asset) { request in
    let blurred = request.sourceImage.clampedToExtent().applyingGaussianBlur(sigma: blurRadius)
    let output = blurred.clampedToRect(request.sourceImage.extent)
    request.finish(with: output, context: nil)
}


The snippet above is from this blog post: https://willowtreeapps.com/ideas/how-to-apply-a-filter-to-a-video-stream-in-ios

Best Answer

The JavaScript/TypeScript should look something like this:

let blurRadius = 6.0;
let asset = AVAsset.assetWithURL(streamURL);
let item = AVPlayerItem.alloc().initWithAsset(asset);
// The Objective-C class method +videoCompositionWithAsset:applyingCIFiltersWithHandler:
// is exposed to JS with the selector colons dropped and the parts joined in camelCase
item.videoComposition = AVVideoComposition.videoCompositionWithAssetApplyingCIFiltersWithHandler(asset, request => {
    // clamp to an infinite extent, blur, then clamp back to the original frame rect
    let blurred = request.sourceImage.imageByClampingToExtent().imageByApplyingGaussianBlurWithSigma(blurRadius);
    let output = blurred.imageByClampingToRect(request.sourceImage.extent);
    // hand the filtered frame back to AVFoundation
    request.finishWithImageContext(output, null);
});
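
For completeness, an untested playback sketch follows; none of it is in the original answer. It assumes the item from the snippet above and a host UIView (here called hostView, a hypothetical placeholder) to render into:

// Sketch only (assumed wiring, not from the answer): attach the filtered item
// to an AVPlayer and display it through an AVPlayerLayer.
const player = AVPlayer.playerWithPlayerItem(item);
const playerLayer = AVPlayerLayer.playerLayerWithPlayer(player);
playerLayer.frame = hostView.bounds; // hostView is a placeholder UIView
hostView.layer.addSublayer(playerLayer);
player.play();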


Note: this code is untested; it is a direct translation of the given native code. Use tns-platform-declarations for IntelliSense support.
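
In case it helps, a minimal sketch of that setup, assuming a TypeScript-flavored project: install the package with npm install tns-platform-declarations --save-dev, then reference the iOS typings from a references.d.ts file at the project root:

// references.d.ts: pulls the iOS API typings into the TypeScript project
/// <reference path="./node_modules/tns-platform-declarations/ios.d.ts" />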

A similar question, javascript - Converting AVVideoComposition initializer to Nativescript, can be found on Stack Overflow: https://stackoverflow.com/questions/58082795/
