Introduction
First, let's look at what a Live Photo actually does: when you tap the shutter, the system presents you with a short video plus a cover image.
If it were just a video and a photo, we could simply bundle the two together and call it a Live Photo. The hard part is this: of the 3-second video the system shows you, the first 1.5 seconds were captured *before* you tapped the shutter. That is why we need to implement pre-recording.
Implementation Approach
The obvious approach is to start recording video as soon as the user taps the Live button (or opens the camera app), stop recording 1.5 seconds after the shutter is tapped, and finally trim out the last three seconds. That works, but it has a nasty consequence: if the user lingers on the Live screen for too long, the buffer fills up on low-memory devices and the app crashes.
After I asked on Stack Overflow, someone suggested maintaining a buffer that only ever holds the most recent 1.5 seconds of video, and stitching the clips together once the shot is taken.
So the question becomes: how should this buffer be designed? How do we keep only the most recent 1.5 seconds?
1. Setting up the buffer queue
After some thought, I concluded that maintaining a queue of video clips is both the best fit and the easiest to implement.
When the user taps the Live button, recording starts, but with a twist: a timer stops the recording every 1.5 seconds, pushes the finished clip onto the buffer queue, and immediately starts recording again. Whenever the queue grows beyond two clips, the clip at the head of the queue is deleted.
This guarantees the buffer always contains the 1.5 seconds of video preceding the shutter tap, and the buffer can never fill up.
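The queue logic above can be sketched in isolation. This is a minimal sketch with a hypothetical `SegmentBuffer` type (not from the project), using strings to stand in for the clip file URLs the real app stores:

```swift
// Minimal sketch of the rolling buffer described above (hypothetical
// SegmentBuffer type, not from the project). Strings stand in for the
// clip file URLs the real app stores.
struct SegmentBuffer {
    private(set) var clips: [String] = []

    // Called each time a 1.5 s clip finishes recording
    mutating func push(_ clip: String) {
        clips.append(clip)
        if clips.count > 2 {
            // In the real app the head clip's file is also deleted from disk
            clips.removeFirst()
        }
    }
}
```

However long the user stays on the Live screen, the queue never holds more than two clips, i.e. at most 3 seconds of footage.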
2. Handling the shutter tap
When the user taps the shutter, we immediately stop the current recording, save it, and then record one more 1.5-second clip. That gives us the last two segments.
I sketched a simple diagram (the handwriting is ugly, bear with me):
In other words, the moment I tap the shutter is almost never exactly when a 1.5-second clip finishes; say it lands 0.6 seconds in. I then save these three clips, and those three clips are enough to compose one Live Photo.
So the core idea is: use a queue to hold three clips, and at the end, only those three clips need to be trimmed and merged.
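A quick sanity check on the arithmetic: if the shutter lands t seconds into the current clip, the three saved clips have lengths 1.5, t, and 1.5 seconds, and cutting t seconds off the front of the first clip always yields exactly 3 seconds. A hypothetical helper (not from the project) makes this concrete:

```swift
// Total length after trimming the front of the first clip by the middle
// clip's length: (1.5 - t) + t + 1.5 == 3.0 for any 0 <= t <= 1.5.
func mergedLength(middleClipLength t: Double) -> Double {
    let trimmedFirst = 1.5 - t   // front of the first clip cut by t seconds
    return trimmedFirst + t + 1.5
}
```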
3. Implementation code
Here is part of the code. It comes from #折紙相機#, a project live on the App Store; please check it out.
Note that the latest release does not yet ship the Live Photo pre-recording feature.
Variable definitions
// Whether we are currently shooting a Live Photo
var isLivePhoto = false
// Timer for the rolling buffer segments
var liveTimer: Timer?
// Timer for the final 1.5 s segment
var liveTimer2: Timer?
var liveCounter: Double = 0.5
var liveCounter2: Double = 0.5
var liveUrl: URL!
var videoUrls = [URL]()
var saveManager: SaveVieoManager?
Core code
func setLiveMode() {
    if !isLivePhoto {
        isLivePhoto = true
        setLiveStart()
    } else {
        isLivePhoto = false
        movieWriter?.finishRecording()
        liveTimer?.invalidate()
        liveTimer = nil
    }
}
/// Start recording for a Live Photo
func setLiveStart() {
    startRecord()
    videoUrls.append(videoUrl!)
    topView.liveCounter.isHidden = false
    topView.setCounter(text: "0")
    liveTimer = Timer.scheduledTimer(timeInterval: 0.5, target: self, selector: #selector(updateLiveCounter), userInfo: nil, repeats: true)
}
//倒計(jì)時(shí)控制
@objc func updateLiveCounter(){
liveCounter = liveCounter + 0.5
print("正在拍攝LivePhoto: ",liveCounter)
topView.setCounter(text: "\(liveCounter)")
// if liveCounter == 3{
// finishLiveRecord()
// }
if liveCounter == 1.5{
movieWriter?.finishRecording()
deleteLiveBuffer()
startRecord()
videoUrls.append(videoUrl!)
liveCounter = 0
}
}
/// Shutter tapped: stop the current clip and record the trailing 1.5 s
func finishLiveRecord() {
    movieWriter?.finishRecording()
    shotButton.isUserInteractionEnabled = false
    liveTimer?.invalidate()
    startRecord()
    liveCounter2 = 0
    liveTimer2 = Timer.scheduledTimer(timeInterval: 0.5, target: self, selector: #selector(setIntervalFinish), userInfo: nil, repeats: true)
}
// Recording finished: merge the clips and present the result
@objc func setIntervalFinish() {
    liveCounter2 += 0.5
    if liveCounter2 == 1.5 {
        deleteAdditionalBuffer()
        shotButton.isUserInteractionEnabled = false
        movieWriter?.failureBlock = { error in
            print(error as Any)
        }
        self.videoUrls.append(self.videoUrl!)
        self.movieWriter?.finishRecording()
        print("-------------:", self.videoUrls)
        self.topView.liveCounter.isHidden = true
        self.shotButton.isUserInteractionEnabled = true
        self.liveTimer = nil
        self.liveTimer2?.invalidate()
        self.liveTimer2 = nil
        self.liveCounter2 = 0
        setLiveStart()
        // Merge the clips
        saveManager = SaveVieoManager(urls: videoUrls)
        let newUrl = URL(fileURLWithPath: "\(NSTemporaryDirectory())folder_all.mp4")
        unlink(newUrl.path)
        videoUrl = newUrl
        // Trim and merge the three clips
        saveManager?.combineLiveVideos(success: { com in
            self.saveManager?.store(com, storeUrl: newUrl, success: {
                DispatchQueue.main.async {
                    let vc = CheckViewController.init(image: nil, type: 2)
                    vc.videoUrl = newUrl
                    weak var weakSelf = self
                    vc.videoScale = weakSelf?.scaleRate
                    vc.willDismiss = {
                        // Reset the beauty-filter state
                        if (weakSelf?.isBeauty)! {
                            weakSelf?.isBeauty = false
                            weakSelf?.defaultBottomView.beautyButton.isSelected = false
                        }
                        // Via this closure, the bottom bar is hidden when vc
                        // dismisses and un-hidden when the ratio is switched
                        if weakSelf?.scaleRate != 0 {
                            weakSelf?.defaultBottomView.backgroundColor = UIColor.clear
                        }
                    }
                    weakSelf?.present(vc, animated: true, completion: nil)
                }
            })
        })
    }
}
func deleteLiveBuffer() {
    if videoUrls.count >= 2 {
        // Remove the file, then drop the URL either way so the queue stays consistent
        try? FileManager.default.removeItem(atPath: videoUrls.first!.path)
        videoUrls.removeFirst()
    }
}
func deleteAdditionalBuffer() {
    while videoUrls.count >= 3 {
        // Drop the URL even if the file delete fails, so this loop cannot spin forever
        try? FileManager.default.removeItem(atPath: videoUrls.first!.path)
        videoUrls.removeFirst()
    }
}
The trimming and merging code:
Trimming a clip
/// Trim a clip
///
/// - Parameters:
///   - frontOffset: seconds to cut from the front
///   - endOffset: seconds to cut from the end
///   - index: index into videoUrls
/// - Returns: the trimmed composition
func cutLiveVideo(frontOffset: Float64, endOffset: Float64, index: Int) -> AVMutableComposition {
    let composition = AVMutableComposition()
    // Create the video composition track.
    let compositionVideoTrack = composition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)
    // Create the audio composition track.
    let compositionAudioTrack = composition.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)
    let pathUrl = videoUrls[index]
    let asset = AVURLAsset(url: pathUrl, options: nil)
    let videoTrack: AVAssetTrack = asset.tracks(withMediaType: .video)[0]
    let audioTrack: AVAssetTrack = asset.tracks(withMediaType: .audio)[0]
    compositionVideoTrack?.preferredTransform = videoTrack.preferredTransform
    // Build the cut points as CMTimes using the track's own timescale
    let trackDuration: CMTime = videoTrack.timeRange.duration
    let trackTimescale: CMTimeScale = trackDuration.timescale
    let startTime: CMTime = CMTimeMakeWithSeconds(frontOffset, trackTimescale)
    let endTime: CMTime = CMTimeMakeWithSeconds(endOffset, trackTimescale)
    let intendedDuration: CMTime = CMTimeSubtract(asset.duration, CMTimeAdd(startTime, endTime))
    try? compositionVideoTrack?.insertTimeRange(CMTimeRangeMake(startTime, intendedDuration), of: videoTrack, at: kCMTimeZero)
    try? compositionAudioTrack?.insertTimeRange(CMTimeRangeMake(startTime, intendedDuration), of: audioTrack, at: kCMTimeZero)
    return composition
}
Merging the clips
func combineLiveVideos(success: @escaping (_ mixComposition: AVMutableComposition) -> ()) {
    // Only the first clip needs trimming: cutting its front by the length
    // of the middle clip makes the three clips sum to exactly 3 s
    guard videoUrls.count >= 2 else { return }
    let videoAsset2 = AVURLAsset(url: videoUrls[1])
    let tt = videoAsset2.duration
    let lengthOfVideo2 = Double(tt.value) / Double(tt.timescale)
    // Cut lengthOfVideo2 seconds from the front of the first clip
    let video1Composition = cutLiveVideo(frontOffset: lengthOfVideo2, endOffset: 0.0, index: 0)
    let newUrl = URL(fileURLWithPath: "\(NSTemporaryDirectory())foldercut_1.mp4")
    unlink(newUrl.path)
    videoUrls[0] = newUrl
    // Once the trimmed first clip is saved, merge all three clips
    store(video1Composition, storeUrl: newUrl, success: {
        let mixCom = self.combineVideos()
        success(mixCom)
    })
}
Saving the video
/// Save a merged composition to disk
///
/// - Parameters:
///   - mixComposition: the composition to export
///   - storeUrl: destination URL
///   - successBlock: called when the export finishes
func store(_ mixComposition: AVMutableComposition, storeUrl: URL, success successBlock: @escaping () -> ()) {
    let assetExport = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPreset640x480)
    assetExport?.outputFileType = AVFileType.mov
    assetExport?.outputURL = storeUrl
    assetExport?.exportAsynchronously(completionHandler: {
        // In production, check assetExport?.status here before treating
        // the export as successful
        successBlock()
    })
}
That's all for now.