Background / Requirements
How do we apply a custom rendering effect to a video and display the result?
General workflow
1. Decode the video
2. Extract video frames
3. Render the video frames
4. Display the rendered frames
5. Encode the frames to produce a new video
Fetching video frames in real time with AVPlayer
Core objects: AVPlayer, AVPlayerItemVideoOutput
AVPlayer: the central class that drives playback use cases; it is a controller object used to manage the playback and timing of a media asset.
Here I wrap AVPlayer into a simple player that supports play, pause, and seek, and attach an AVPlayerItemVideoOutput to handle video frame output.
Creating the player
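For context, the snippets below assume an NSObject subclass (so that KVO works) holding roughly the following stored properties. The property names match the code that follows; everything else here is my own sketch, not part of the original post.

    import AVFoundation

    class VideoPlayer: NSObject {
        var videoURL: URL!
        var player: AVPlayer?
        var playerItem: AVPlayerItem?
        var playing = false
        var durationTime: Float64 = 0   // total duration in milliseconds, set once the item is ready
        // playerOutput, onPixelBuffer and the init / play / pause / seek methods shown below live here as well
    }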
    init(videoPath url: URL) {
        super.init()
        videoURL = url
        // Ask for precise duration/timing so seeks and itemTime lookups are accurate
        let urlAsset = AVURLAsset(url: videoURL, options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])
        playerItem = AVPlayerItem(asset: urlAsset)
        // Attach the video output so decoded frames can be pulled later
        playerItem?.add(playerOutput)
        player = AVPlayer(playerItem: playerItem)
        player?.isMuted = false
        // Observe the item's status to know when playback is ready
        playerItem?.addObserver(self, forKeyPath: "status", options: .new, context: nil)
    }
    override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey: Any]?, context: UnsafeMutableRawPointer?) {
        if keyPath == "status" {
            if let item = object as? AVPlayerItem {
                switch item.status {
                case .readyToPlay:
                    // Cache the total duration in milliseconds and notify the delegate
                    durationTime = CMTimeGetSeconds(item.duration) * 1000
                    sendPlayerStauDelegate(videoStatu: .Prepared)
                default:
                    sendPlayerStauDelegate(videoStatu: .Error)
                }
            }
        }
    }
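Since the item's status is observed through KVO, the observer should be removed when the player goes away. A minimal sketch (not in the original post):

    deinit {
        // Balance the addObserver call made in init to avoid KVO crashes and leaks
        playerItem?.removeObserver(self, forKeyPath: "status")
    }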
    func play() {
        if !playing {
            player?.play()
            playing = true
        }
    }

    func pause() {
        if playing {
            player?.pause()
            playing = false
        }
    }

    func seekTo(time: CMTime) {
        // Zero tolerance gives frame-accurate seeking at the cost of some speed
        player?.seek(to: time, toleranceBefore: .zero, toleranceAfter: .zero)
    }
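Typical usage, assuming the class is called VideoPlayer as in the sketch above ("demo.mp4" is just a placeholder resource):

    let videoPlayer = VideoPlayer(videoPath: Bundle.main.url(forResource: "demo", withExtension: "mp4")!)
    videoPlayer.play()
    // ...
    videoPlayer.pause()
    // Jump to the 5-second mark with frame-accurate precision
    videoPlayer.seekTo(time: CMTime(seconds: 5, preferredTimescale: 600))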
Getting video frames with AVPlayerItemVideoOutput
    var playerOutput: AVPlayerItemVideoOutput = {
        // Choose the pixel format you need. RGBA-style data (here 32-bit BGRA) is usually
        // the easiest to process; the system default is biplanar Y/UV data.
        let out = AVPlayerItemVideoOutput(pixelBufferAttributes: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
        return out
    }()
    var onPixelBuffer: CVPixelBuffer? {
        get {
            // Map the current host time to the item's timeline and grab the frame for it
            let time = self.playerOutput.itemTime(forHostTime: CACurrentMediaTime())
            if self.playerOutput.hasNewPixelBuffer(forItemTime: time) {
                return self.playerOutput.copyPixelBuffer(forItemTime: time, itemTimeForDisplay: nil)
            } else {
                return nil
            }
        }
    }
The core tool here is AVPlayerItemVideoOutput. It effectively acts as a video decoding front end: configure its attributes and you can pull a CVPixelBuffer video frame, in the format you want, for any moment of the video.
func copyPixelBuffer(forItemTime itemTime: CMTime, itemTimeForDisplay outItemTimeForDisplay: UnsafeMutablePointer<CMTime>?) -> CVPixelBuffer?
The CVPixelBuffer you obtain can then be fed into custom OpenGL rendering for display.
On the caller's side you need to run a timer to refresh the picture in real time; the timer interval can be derived from the video's FPS, as in the sketch below.
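One way to drive that refresh (my own sketch, not from the original post) is a CADisplayLink whose preferredFramesPerSecond is taken from the video track's nominalFrameRate. PixelBufferRenderer stands in for whatever custom OpenGL view you use; VideoPlayer is the class sketched earlier.

    import UIKit
    import AVFoundation

    protocol PixelBufferRenderer {
        // Stands in for the custom OpenGL view's render entry point (assumption)
        func render(pixelBuffer: CVPixelBuffer)
    }

    final class FrameDriver: NSObject {
        private var displayLink: CADisplayLink?
        private let videoPlayer: VideoPlayer
        private let renderView: PixelBufferRenderer

        init(player: VideoPlayer, renderView: PixelBufferRenderer) {
            self.videoPlayer = player
            self.renderView = renderView
            super.init()
        }

        func start() {
            let link = CADisplayLink(target: self, selector: #selector(refresh))
            // Match the refresh rate to the video's FPS when it is known
            if let track = videoPlayer.playerItem?.asset.tracks(withMediaType: .video).first {
                link.preferredFramesPerSecond = Int(track.nominalFrameRate.rounded())
            }
            link.add(to: .main, forMode: .common)
            displayLink = link
        }

        func stop() {
            displayLink?.invalidate()
            displayLink = nil
        }

        @objc private func refresh() {
            // Pull the newest frame, if any, and hand it to the renderer
            if let pixelBuffer = videoPlayer.onPixelBuffer {
                renderView.render(pixelBuffer: pixelBuffer)
            }
        }
    }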
That is all it takes to obtain video frames. The process is fairly simple because everything is done through system frameworks.
The main points to watch out for:
1鸵荠、AVPlayerItemVideoOutput的獲取的數(shù)據(jù)格式定義,根據(jù)需求設(shè)置RGBA還是YUV420的數(shù)據(jù)伤极。
2蛹找、AVPlayer使用seek時(shí)候,使用精度比較高的方法哨坪,提高在seek時(shí)候的畫(huà)面流暢度
func seek(to time: CMTime, toleranceBefore: CMTime, toleranceAfter: CMTime)
3庸疾、獲取的CVPixelBuffer在Swift語(yǔ)言,不需要手動(dòng)釋放当编。在OC上需要調(diào)用CVPixelBufferRelease()
手動(dòng)釋放
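For example, if you would rather receive biplanar YUV420 frames (the decoder's native format) instead of BGRA, the output could be configured like this (a sketch; playerItem is the property from the player class above):

    // Request NV12 (biplanar YUV 4:2:0, video range) frames instead of 32-bit BGRA
    let yuvOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
    ])
    playerItem?.add(yuvOutput)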