Problem Description
I am trying to put an .aac audio track into an .mp4 file.
What we know
ffmpeg cmd
ffmpeg -i audio.mp3 -i video.mp4 -c copy output.mp4
How can I do this in C# UWP? I have searched Google but have not found any results about this issue; everyone has only posted about conversion problems.
But how can I merge them with C#? Any sample project or information would be great.
Recommended Answer
UWP provides APIs for this requirement. The feature can be implemented with the BackgroundAudioTracks property of the MediaComposition class. For details on how to do this, please refer to "Add a background audio track to a composition", and you can find a sample in scenario 3 of the MediaEditing official sample.
For example:
// Create the original MediaComposition
var clip = await MediaClip.CreateFromFileAsync(pickedFile);
composition = new MediaComposition();
composition.Clips.Add(clip);
// Add background audio
var picker = new Windows.Storage.Pickers.FileOpenPicker();
picker.SuggestedStartLocation = Windows.Storage.Pickers.PickerLocationId.MusicLibrary;
picker.FileTypeFilter.Add(".mp3");
picker.FileTypeFilter.Add(".wav");
picker.FileTypeFilter.Add(".flac");
var audioFile = await picker.PickSingleFileAsync();
if (audioFile == null)
{
rootPage.NotifyUser("File picking cancelled", NotifyType.ErrorMessage);
return;
}
var backgroundTrack = await BackgroundAudioTrack.CreateFromFileAsync(audioFile);
composition.BackgroundAudioTracks.Add(backgroundTrack);
// Render to MediaElement
mediaElement.Position = TimeSpan.Zero;
mediaStreamSource = composition.GeneratePreviewMediaStreamSource((int)mediaElement.ActualWidth, (int)mediaElement.ActualHeight);
mediaElement.SetMediaStreamSource(mediaStreamSource);
The MediaComposition is created from the video file. The BackgroundAudioTrack is created from the MP3 or other audio file that you want to merge into the video. Finally, we need to render the MediaComposition to a file (the sample renders it to a MediaElement for playback).
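The snippet above only previews the merged result in a MediaElement. To actually produce an .mp4 file, the composition can be rendered with MediaComposition.RenderToFileAsync. Below is a minimal sketch under the same assumptions as the sample above (it reuses the composition field built there); the FileSavePicker flow and the file name "output" are only illustrative:

// Pick a destination file for the merged video (illustrative save-picker flow)
var savePicker = new Windows.Storage.Pickers.FileSavePicker();
savePicker.SuggestedStartLocation = Windows.Storage.Pickers.PickerLocationId.VideosLibrary;
savePicker.FileTypeChoices.Add("MP4 files", new System.Collections.Generic.List<string>() { ".mp4" });
savePicker.SuggestedFileName = "output";
var outputFile = await savePicker.PickSaveFileAsync();
if (outputFile == null)
{
    return; // user cancelled the save dialog
}
// Render the composition (video clip + background audio track) to the picked file
var renderOperation = composition.RenderToFileAsync(outputFile, Windows.Media.Editing.MediaTrimmingPreference.Precise);
renderOperation.Progress = (info, progress) =>
{
    System.Diagnostics.Debug.WriteLine($"Rendering... {progress:F0}%");
};
var result = await renderOperation;
if (result != Windows.Media.Transcoding.TranscodeFailureReason.None)
{
    System.Diagnostics.Debug.WriteLine($"Rendering failed: {result}");
}

MediaTrimmingPreference.Precise favors frame accuracy at clip boundaries, while MediaTrimmingPreference.Fast trades some accuracy for a faster render.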