iOS ReplayKit2 Screen Recording + H.264 Encode/Decode + Socket Transmission (Swift)

Preface

There aren't many Swift examples of H.264 encoding; I put this together by following another Jianshu author's article. It runs fine, so feel free to copy and paste.
If you're not familiar with ReplayKit2 and app extensions, look them up first.

Demo implementation

Launching the app extension

let picker = RPSystemBroadcastPickerView(frame: CGRect(x: 0, y: 0, width: 1, height: 1))
picker.preferredExtension = "your app extension bundle id"
picker.showsMicrophoneButton = true

for subView in picker.subviews {
    if let btn = subView as? UIButton {
        btn.sendActions(for: .allTouchEvents)
    }
}

The code above brings up the broadcast picker. Select your extension and tap the start button; the system then launches your app extension and screen recording begins.

Sharing code files with the extension

The app extension can share source files with the host app; just configure each file's target membership as shown in the screenshot below:


(Screenshot: Xnip2024-01-29_21-26-41.jpg, target membership settings in Xcode)

Using the host project's pod libraries in the app extension

Just add the third-party libraries under the app extension's target in the host project's Podfile, as shown below:


(Screenshot: Xnip2024-01-29_21-31-00.jpg, Podfile with an extension target block)
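
In case the screenshot doesn't come through, here is a minimal Podfile sketch of the idea. 'HostApp' and 'BroadcastExtension' are placeholder target names; the pods listed are the ones the shared code in this post imports:

# Minimal Podfile sketch (target names are placeholders)
target 'HostApp' do
  use_frameworks!
  pod 'CocoaAsyncSocket'
  pod 'SVProgressHUD'
end

# The broadcast upload extension gets its own target block,
# listing the pods that the shared code imports.
target 'BroadcastExtension' do
  use_frameworks!
  pod 'CocoaAsyncSocket'
  pod 'SVProgressHUD'
end

Run pod install afterwards so both targets link the same libraries.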

RPBroadcastSampleHandler

import ReplayKit

class SampleHandler: RPBroadcastSampleHandler {
    private var socketManager: SocketManager?
    private var encoder: LYH264Encoder?
    
    override func broadcastStarted(withSetupInfo setupInfo: [String : NSObject]?) {
        // User has requested to start the broadcast. Setup info from the UI extension can be supplied but optional.
        setup()
    }
    
    override func broadcastPaused() {
        // User has requested to pause the broadcast. Samples will stop being delivered.
    }
    
    override func broadcastResumed() {
        // User has requested to resume the broadcast. Samples delivery will resume.
    }
    
    override func broadcastFinished() {
        // User has requested to finish the broadcast.
        socketManager?.dispose()
        encoder?.dispose()
    }
    
    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case RPSampleBufferType.video:
            // Handle video sample buffer
            encode(sampleBuffer)
            break
        case RPSampleBufferType.audioApp:
            // Handle audio sample buffer for app audio
            break
        case RPSampleBufferType.audioMic:
            // Handle audio sample buffer for mic audio
            break
        @unknown default:
            // Handle other sample buffer types
            fatalError("Unknown type of sample buffer")
        }
    }
    
    func setup() {
        socketManager = SocketManager(isServer: true)
        let scale = UIScreen.main.scale
        encoder = LYH264Encoder(width: Int32(UIScreen.main.bounds.width * scale), height: Int32(UIScreen.main.bounds.height * scale), bitRate: nil, fps: 30)
        // Forward every encoded NAL unit to the socket as soon as the encoder emits it
        encoder?.videoEncodeCallback = { [weak self] (data) in
            self?.socketManager?.sendParam([KSocketDataType: LYSocketDataType.sampleBuffer.rawValue, KSocketDataKey: data])
        }
    }
    
    func encode(_ sampleBuffer: CMSampleBuffer) {
        encoder?.encodeVideo(sampleBuffer: sampleBuffer)
    }
    
    // Alternative path: send a single frame as a Base64 string.
    // sampleBufferToBase64(sampleBuffer:) is a helper that is not shown in this post.
    func sendData(_ sampleBuffer: CMSampleBuffer) {
        if let string = self.sampleBufferToBase64(sampleBuffer: sampleBuffer) {
            self.socketManager?.sendParam([KSocketDataType: LYSocketDataType.image.rawValue, KSocketDataKey: string])
        }
    }
}

H.264 encoding

I won't go into the theory here; the code is much the same as other implementations. The following runs fine under Swift 5.x.

import Foundation
import VideoToolbox

class LYH264Encoder {
    public var videoEncodeCallback: ((Data) -> Void)?
    public var width: Int32 = 375
    public var height: Int32 = 852
    
    private var bitRate : Int32 = 375 * 852 * 3 * 4
    private var fps: Int32 = 10
    private var frameID: Int64 = 0
    
    private var encodeCallBack: VTCompressionOutputCallback?
    private var encodeQueue = DispatchQueue(label: "com.ly_encode")
    private var callBackQueue = DispatchQueue(label: "com.ly_encode_callBack")
    private var encodeSession: VTCompressionSession?
    
    public init(width: Int32 = 375, height: Int32 = 852, bitRate: Int32?, fps: Int32?) {
        self.width = width
        self.height = height
        self.bitRate = bitRate ?? width * height * 3 * 4
        self.fps = fps ?? 10
        
        setupCallBack()
        initVideoToolBox()
    }
    
    public func encodeVideo(sampleBuffer: CMSampleBuffer) {
        if self.encodeSession == nil {
            initVideoToolBox()
        }
        
        encodeQueue.async {
            guard let encodeSession = self.encodeSession else {
                return
            }
            
            guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
                return
            }
            
            var time = CMTime(value: self.frameID, timescale: 1000)
            self.frameID += 1
            
            if #available(iOS 15, *) {  // On iOS 15, timescale = 1000 gave a very low bitrate and a blocky picture
                time = CMTime(value: self.frameID, timescale: 100)
            }
            
            var flags: VTEncodeInfoFlags = []
            // Encode the frame
            let state = VTCompressionSessionEncodeFrame(encodeSession, imageBuffer: imageBuffer, presentationTimeStamp: time, duration: .invalid, frameProperties: nil, sourceFrameRefcon: nil, infoFlagsOut: &flags)
            if state != noErr {
                print("VTCompression: encode failed: status \(state)")
            }
        }
    }
    
    func dispose() {
        frameID = 0
        if let session = encodeSession {
            VTCompressionSessionCompleteFrames(session, untilPresentationTimeStamp: .invalid)
            VTCompressionSessionInvalidate(session)
            encodeSession = nil
        }
    }
    
    deinit {
        dispose()
    }
    
    private func initVideoToolBox() {
        // Create the compression session
        let status = VTCompressionSessionCreate(allocator: kCFAllocatorDefault,  // session allocator
                                                width: width,   // frame width
                                                height: height,  // frame height
                                                codecType: kCMVideoCodecType_H264, // codec type: encode as H.264
                                                encoderSpecification: nil,  // request a specific encoder (nil = let the system choose)
                                                imageBufferAttributes: nil,  // required attributes of source pixel buffers, used to create a pixel buffer pool
                                                compressedDataAllocator: nil, // allocator for the compressed data
                                                outputCallback: encodeCallBack, // called once per encoded frame; the data is forwarded from there
                                                refcon: unsafeBitCast(self, to: UnsafeMutableRawPointer.self),
                                                compressionSessionOut: &self.encodeSession
        )
        
        if status != noErr {
            print("Failed to create the compression session")
            return
        }
        
        guard let encodeSession = encodeSession else {
            return
        }
        
        // Real-time encoding output
        VTSessionSetProperty(encodeSession, key: kVTCompressionPropertyKey_RealTime, value: kCFBooleanTrue)
        // Profile / level
        VTSessionSetProperty(encodeSession, key: kVTCompressionPropertyKey_ProfileLevel, value: kVTProfileLevel_H264_Baseline_AutoLevel)
        // Disable B-frames (B-frames are not required for decoding and can be dropped)
        VTSessionSetProperty(encodeSession, key: kVTCompressionPropertyKey_AllowFrameReordering, value: kCFBooleanFalse)
        // Keyframe (GOP) interval
        var frameInterval: Int32 = 10
        let number = CFNumberCreate(kCFAllocatorDefault, CFNumberType.intType, &frameInterval)
        VTSessionSetProperty(encodeSession, key: kVTCompressionPropertyKey_MaxKeyFrameInterval, value: number)
        
        // Expected frame rate (a hint, not the actual frame rate)
        let fpscf = CFNumberCreate(kCFAllocatorDefault, CFNumberType.intType, &fps)
        VTSessionSetProperty(encodeSession, key: kVTCompressionPropertyKey_ExpectedFrameRate, value: fpscf)
        
        // Average bitrate in bps. A higher bitrate gives a sharper picture but more data; a lower bitrate gives a blurrier picture.
        let bitrateAverage = CFNumberCreate(kCFAllocatorDefault, CFNumberType.intType, &bitRate)
        VTSessionSetProperty(encodeSession, key: kVTCompressionPropertyKey_AverageBitRate, value: bitrateAverage)
        
        // Hard bitrate limit: [bytes, seconds]
        let bitRatesLimit: CFArray = [bitRate * 5 / 8, 1] as CFArray
        VTSessionSetProperty(encodeSession, key: kVTCompressionPropertyKey_DataRateLimits, value: bitRatesLimit)
        
        // Compression quality
        var quality: Float = 1
        VTSessionSetProperty(encodeSession, key: kVTCompressionPropertyKey_Quality, value: CFNumberCreate(kCFAllocatorDefault, CFNumberType.floatType, &quality))
    }
    
    private func gotSpsPps(sps: Data, pps: Data) {
        // startcode sps pps i b p startcode .....
        let startCode = Data([0x00, 0x00, 0x00, 0x01])
        var h264Data = Data()
        h264Data.append(startCode)
        h264Data.append(sps)
        callBackQueue.async {
            self.videoEncodeCallback?(h264Data)
        }
        
        var ppsData = Data()
        ppsData.append(startCode)
        ppsData.append(pps)
        callBackQueue.async {
            self.videoEncodeCallback?(ppsData)
        }
    }

    private func gotEncodedData(_ data: Data, isKeyFrame: Bool) {
        let startCode = Data([0x00, 0x00, 0x00, 0x01])
        var h264Data = Data()
        h264Data.append(startCode)
        h264Data.append(data)
        callBackQueue.async {
            self.videoEncodeCallback?(h264Data)
        }
    }
    
    private func setupCallBack() {
        // Compression output callback, invoked once per encoded frame
        encodeCallBack = { (outputCallbackRefCon, sourceFrameRefCon, status, infoFlags, sampleBuffer)  in
            // 1. Make sure encoding succeeded
            if status != noErr {
                print("encode with err")
                return
            }
            
            guard let sampleBuffer = sampleBuffer else {
                print("no buffer")
                return
            }
            
            guard CMSampleBufferDataIsReady(sampleBuffer) else {
                print("didCompressH264 data is not ready ")
                return
            }
            
            // 2. Recover the encoder instance from the refcon
            let encoder: LYH264Encoder = unsafeBitCast(outputCallbackRefCon, to: LYH264Encoder.self)
            
            // 3. Check whether this frame is a keyframe
            let attachmentsArray = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, createIfNecessary: true)
            let attachments = unsafeBitCast(CFArrayGetValueAtIndex(attachmentsArray, 0), to: CFDictionary.self)
            let isKeyframe = !CFDictionaryContainsKey(attachments, Unmanaged.passUnretained(kCMSampleAttachmentKey_NotSync).toOpaque())

            // For keyframes, pull out the SPS & PPS parameter sets
            if isKeyframe {
                print("Got a keyframe")
                // The encoded parameter sets live in the format description
                let format = CMSampleBufferGetFormatDescription(sampleBuffer)
                
                // Extract the SPS
                var sparameterSetSize: Int = 0
                var sparameterSetCount: Int = 0
                var sparameterSet: UnsafePointer<UInt8>?
                CMVideoFormatDescriptionGetH264ParameterSetAtIndex(format!, parameterSetIndex: 0, parameterSetPointerOut: &sparameterSet, parameterSetSizeOut: &sparameterSetSize, parameterSetCountOut: &sparameterSetCount, nalUnitHeaderLengthOut: nil)
                
                // Extract the PPS
                var pparameterSetSize: Int = 0
                var pparameterSetCount: Int = 0
                var pparameterSet: UnsafePointer<UInt8>?
                CMVideoFormatDescriptionGetH264ParameterSetAtIndex(format!, parameterSetIndex: 1, parameterSetPointerOut: &pparameterSet, parameterSetSizeOut: &pparameterSetSize, parameterSetCountOut: &pparameterSetCount, nalUnitHeaderLengthOut: nil)
                
                // Wrap SPS/PPS into Data
                let sps = Data(bytes: sparameterSet!, count: sparameterSetSize)
                let pps = Data(bytes: pparameterSet!, count: pparameterSetSize)
                
                // Emit SPS/PPS (with start codes prepended) through the callback
                encoder.gotSpsPps(sps: sps, pps: pps)
            }
            
            // Get the encoded data block
            guard let dataBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else {
                return
            }
            
            var length: size_t = 0
            var totalLength: size_t = 0
            var dataPointer: UnsafeMutablePointer<Int8>?
            
            let statusCodeRet = CMBlockBufferGetDataPointer(dataBuffer, atOffset: 0, lengthAtOffsetOut: &length, totalLengthOut: &totalLength, dataPointerOut: &dataPointer)
            
            if statusCodeRet == noErr {
                var bufferOffset: size_t = 0
                let AVCCHeaderLength: Int = 4 // The first 4 bytes of each NALU are not the 00 00 00 01 start code but the big-endian NALU length
                
                // Walk the block buffer and pull out each NALU
                while bufferOffset < totalLength - AVCCHeaderLength {
                    var NALUnitLength: UInt32 = 0
                    // Read the NAL unit length
                    memcpy(&NALUnitLength, dataPointer! + bufferOffset, AVCCHeaderLength)
                    
                    // Convert from big-endian to host byte order
                    NALUnitLength = CFSwapInt32BigToHost(NALUnitLength)
                    
                    let data = Data(bytes: dataPointer! + bufferOffset + AVCCHeaderLength, count: Int(NALUnitLength))
                    encoder.gotEncodedData(data, isKeyFrame: isKeyframe)
                    
                    // Move to the next NAL unit in the block buffer
                    bufferOffset += AVCCHeaderLength + Int(NALUnitLength)
                }
            }
        }
    }
}

H.264 decoding

import Foundation
import VideoToolbox

enum LYH264DecodeType {
    /// Decoded output delivered as CVPixelBuffer images
    case imageBuffer
    /// Decoded output delivered as CMSampleBuffer (still H.264-wrapped, for AVSampleBufferDisplayLayer)
    case sampleBuffer
}

class LYH264Decoder {
    /// Output type
    public var returnType: LYH264DecodeType = .sampleBuffer
    /// Callback with the decoded image buffer
    public var videoDecodeCallback: ((CVImageBuffer?) -> Void)?
    /// Callback with the decoded sample buffer
    public var videoDecodeSampleBufferCallback: ((CMSampleBuffer?) -> Void)?
    
    private var width: Int32 = 375
    private var height: Int32 = 852
    
    private var spsData: Data?
    private var ppsData: Data?
    private var decompressionSession: VTDecompressionSession?
    private var decodeDesc: CMVideoFormatDescription?
    private var callback: VTDecompressionOutputCallback?
    private var decodeQueue = DispatchQueue(label: "com.ly_decode")
    private var callBackQueue = DispatchQueue(label: "com.ly_decode_callBack")

    public init(width: Int32, height: Int32) {
        self.width = width
        self.height = height
    }
    
    public func decode(data: Data) {
        decodeQueue.async {
            let length: UInt32 =  UInt32(data.count)
            self.decodeByte(data: data, size: length)
        }
    }
    
    
    private func decodeByte(data: Data, size: UInt32) {
        // The first 4 bytes of the incoming frame are the NALU start code 00 00 00 01.
        // Replace the start code with the NALU length as a 4-byte big-endian integer (AVCC layout).
        let naluSize = size - 4
        let length: [UInt8] = [
            UInt8(truncatingIfNeeded: naluSize >> 24),
            UInt8(truncatingIfNeeded: naluSize >> 16),
            UInt8(truncatingIfNeeded: naluSize >> 8),
            UInt8(truncatingIfNeeded: naluSize)
        ]
        
        var frameByte: [UInt8] = length
        [UInt8](data).suffix(from: 4).forEach { (item) in
            frameByte.append(item)
        }
        
        let bytes = frameByte //[UInt8](frameData)
        // The low 5 bits of the 5th byte give the NAL unit type: 7 = SPS, 8 = PPS, 5 = IDR (keyframe)
        let type: Int  = Int(bytes[4] & 0x1f)
        switch type {
            case 0x05:
                print("- received a keyframe, preparing the decoder -")
                if initDecoder() {
                    print("! received a keyframe, decoding !")
                    decode(frame: bytes, size: size)
                }
                
            case 0x06:
                // SEI (supplemental enhancement information), ignored
                break
            case 0x07:
                print("- sps -")
                spsData = data
            case 0x08:
                print("- pps -")
                ppsData = data
            default:
                if initDecoder() {
                    decode(frame: bytes, size: size)
                }
        }
    }
    
    private func decode(frame: [UInt8], size: UInt32) {
        var blockBuffer: CMBlockBuffer?
        var frame1 = frame
        // Create a CMBlockBuffer wrapping the frame bytes
        /*!
         param 1: structureAllocator   kCFAllocatorDefault
         param 2: memoryBlock          frame
         param 3: frame size
         param 4: blockAllocator       kCFAllocatorNull (the memory is not copied)
         param 5: customBlockSource    nil
         param 6: offsetToData         data offset
         param 7: dataLength           data length
         param 8: flags                feature and control flags
         param 9: newBBufOut           receives the block buffer; must not be nil
         */
        let blockState = CMBlockBufferCreateWithMemoryBlock(allocator: kCFAllocatorDefault,
                                           memoryBlock: &frame1,
                                           blockLength: Int(size),
                                           blockAllocator: kCFAllocatorNull,
                                           customBlockSource: nil,
                                           offsetToData: 0,
                                           dataLength: Int(size),
                                           flags: 0,
                                           blockBufferOut: &blockBuffer)
        if blockState != 0 {
            print("Failed to create the CMBlockBuffer")
        }

        var sampleSizeArray: [Int] = [Int(size)]
        var sampleBuffer: CMSampleBuffer?
        // Create a CMSampleBuffer
        /*
         param 1: allocator              default allocator, kCFAllocatorDefault
         param 2: dataBuffer             the block buffer to decode; must not be nil
         param 3: formatDescription      the video format description
         param 4: sampleCount            number of samples in the CMSampleBuffer
         param 5: sampleTimingEntryCount must be 0, 1 or sampleCount
         param 6: sampleTimingArray      timing array; nil here
         param 7: sampleSizeEntryCount   1 here
         param 8: sampleSizeArray        sample sizes
         param 9: sampleBufferOut        receives the sample buffer
         */
        let readyState = CMSampleBufferCreateReady(allocator: kCFAllocatorDefault,
                                  dataBuffer: blockBuffer,
                                  formatDescription: decodeDesc,
                                  sampleCount: CMItemCount(1),
                                  sampleTimingEntryCount: CMItemCount(),
                                  sampleTimingArray: nil,
                                  sampleSizeEntryCount: CMItemCount(1),
                                  sampleSizeArray: &sampleSizeArray,
                                  sampleBufferOut: &sampleBuffer)
        
        guard let buffer = sampleBuffer, readyState == kCMBlockBufferNoErr else {
            print("Failed to create the CMSampleBuffer")
            return
        }
        
        if returnType == .sampleBuffer {
            if let attachmentArray = CMSampleBufferGetSampleAttachmentsArray(buffer, createIfNecessary: true) {
                let dic = unsafeBitCast(CFArrayGetValueAtIndex(attachmentArray, 0), to: CFMutableDictionary.self)
                CFDictionarySetValue(dic,
                                     Unmanaged.passUnretained(kCMSampleAttachmentKey_DisplayImmediately).toOpaque(),
                                     Unmanaged.passUnretained(kCFBooleanTrue).toOpaque())
            }
            videoDecodeSampleBufferCallback?(sampleBuffer)
            return
        }
    
        // Decode the sample into a CVPixelBuffer
        /*
         param 1: the decompression session
         param 2: source data: a CMSampleBuffer containing one or more video frames
         param 3: decode flags
         param 4: frameRefcon passed through to the output callback
         param 5: receives flags describing synchronous/asynchronous decoding
         */
        let sourceFrame: UnsafeMutableRawPointer? = nil
        var infoFlag = VTDecodeInfoFlags.asynchronous
        let decodeState = VTDecompressionSessionDecodeFrame(self.decompressionSession!,
                                                            sampleBuffer: buffer,
                                                            flags: VTDecodeFrameFlags._EnableAsynchronousDecompression,
                                                            frameRefcon: sourceFrame,
                                                            infoFlagsOut: &infoFlag
        )
        
        if decodeState != 0 {
            print("Decode failed")
        }
    }
    
    private func initDecoder() -> Bool {
        if decompressionSession != nil {
            return true
        }
        
        guard spsData != nil, ppsData != nil else {
            return false
        }

        // Strip the 4-byte start codes from the cached SPS/PPS
        var sps: [UInt8] = []
        [UInt8](spsData!).suffix(from: 4).forEach { (value) in
            sps.append(value)
        }
        
        var pps: [UInt8] = []
        [UInt8](ppsData!).suffix(from: 4).forEach { (value) in
            pps.append(value)
        }
        
        /**
        Create the format description from the SPS/PPS
        param kCFAllocatorDefault   allocator
        param 2                     number of parameter sets
        param parameterSetPointers  pointers to the parameter sets
        param parameterSetSizes     sizes of the parameter sets
        param naluHeaderLen         length of the NALU length header, 4
        param decodeDesc            receives the format description
        return status
        */
        // The SPS/PPS pointers are only valid inside the withUnsafeBufferPointer closures
        let descriptionState = sps.withUnsafeBufferPointer { spsPointer -> OSStatus in
            pps.withUnsafeBufferPointer { ppsPointer -> OSStatus in
                let parameterSetPointers = [spsPointer.baseAddress!, ppsPointer.baseAddress!]
                let parameterSetSizes = [spsPointer.count, ppsPointer.count]
                return CMVideoFormatDescriptionCreateFromH264ParameterSets(allocator: kCFAllocatorDefault,
                                                                           parameterSetCount: 2,
                                                                           parameterSetPointers: parameterSetPointers,
                                                                           parameterSetSizes: parameterSetSizes,
                                                                           nalUnitHeaderLength: 4,
                                                                           formatDescriptionOut: &decodeDesc)
            }
        }
        
        if descriptionState != 0 {
            print("Failed to create the format description")
            return false
        }
        
        // Set up the decompression output callback
        /*
         VTDecompressionOutputCallbackRecord is a simple struct carrying a pointer (decompressionOutputCallback) to the function called when a frame finishes decompressing, plus the instance (decompressionOutputRefCon) through which that callback can be reached.
         */
        setupCallBack()
        var callbackRecord = VTDecompressionOutputCallbackRecord(decompressionOutputCallback: callback, decompressionOutputRefCon: unsafeBitCast(self, to: UnsafeMutableRawPointer.self))
        
        /*
         Decode parameters:
         * kCVPixelBufferPixelFormatTypeKey: output pixel format.
           Values tested to work:
             kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, i.e. 420v
             kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, i.e. 420f
             kCVPixelFormatType_32BGRA (iOS converts YUV to BGRA internally)
           YUV420 is generally used for SD video and YUV422 for HD, which is a surprising restriction; but under the same conditions YUV420 is cheaper to compute and transmit than YUV422.
         
         * kCVPixelBufferWidthKey / kCVPixelBufferHeightKey: resolution of the video source, width * height
         * kCVPixelBufferOpenGLCompatibilityKey: lets the decoded image be drawn directly in an OpenGL context instead of copying data between the bus and the CPU. This is sometimes called a zero-copy path, because no decoded image is copied while drawing.
         
         */
        let imageBufferAttributes = [
            kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
            kCVPixelBufferWidthKey: width,
            kCVPixelBufferHeightKey: height,
//            kCVPixelBufferOpenGLCompatibilityKey:true
            ] as [CFString : Any]
        
        // Create the decompression session
        
        /*!
         @function    VTDecompressionSessionCreate
         @abstract    Creates a session for decompressing video frames.
         @discussion  Decompressed frames are delivered through the output callback.
         @param    allocator  session allocator; kCFAllocatorDefault here
         @param    videoFormatDescription  describes the source video frames
         @param    videoDecoderSpecification  requires a specific video decoder; nil here
         @param    destinationImageBufferAttributes  requirements for the output pixel buffers
         @param    outputCallback  callback invoked with the decompressed frames
         @param    decompressionSessionOut  receives the new decompression session
         */
        let state = VTDecompressionSessionCreate(allocator: kCFAllocatorDefault,
                                                 formatDescription: decodeDesc!,
                                                 decoderSpecification: nil,
                                                 imageBufferAttributes: imageBufferAttributes as CFDictionary,
                                                 outputCallback: &callbackRecord,
                                                 decompressionSessionOut: &decompressionSession
        )
        
        if state != 0 {
            print("創(chuàng)建decodeSession失敗")
        }
        
        VTSessionSetProperty(self.decompressionSession!, key: kVTDecompressionPropertyKey_RealTime, value: kCFBooleanTrue)
        return true
    }

    // Decompression output callback
    private func setupCallBack() {
        /*
         The VTDecompressionOutputCallback takes seven parameters:
                param 1: the refcon passed when creating the session
                param 2: the per-frame refcon
                param 3: a status code
                param 4: flags indicating synchronous/asynchronous decoding, or whether the decoder dropped the frame
                param 5: the decoded image buffer
                param 6: presentation timestamp
                param 7: presentation duration
         */
        //(UnsafeMutableRawPointer?, UnsafeMutableRawPointer?, OSStatus, VTDecodeInfoFlags, CVImageBuffer?, CMTime, CMTime) -> Void
        callback = { decompressionOutputRefCon, sourceFrameRefCon, status,inforFlags, imageBuffer, presentationTimeStamp, presentationDuration in
            let decoder: LYH264Decoder = unsafeBitCast(decompressionOutputRefCon, to: LYH264Decoder.self)
            
            guard imageBuffer != nil else {
                return
            }
            
            if let block = decoder.videoDecodeCallback  {
                decoder.callBackQueue.async {
                    block(imageBuffer)
                }
            }
        }
    }
    
    deinit {
        if decompressionSession != nil {
            VTDecompressionSessionInvalidate(decompressionSession!)
            decompressionSession = nil
        }
        
    }
}

socket

All you really need to handle is TCP packet coalescing (sticky packets); I covered it in my previous article if you're interested.
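
For reference, the wire format used by the SocketManager below is a 4-byte native-endian length header followed by a UTF-8 JSON payload. A minimal sketch of the sender side (the hypothetical frame(_:) helper just illustrates what the real sendData(_:) further down does):

// Sketch of the framing used below: [UInt32 length][JSON payload]
func frame(_ payload: Data) -> Data {
    var length = UInt32(payload.count)                                  // payload size in bytes
    var packet = Data(bytes: &length, count: MemoryLayout<UInt32>.size) // 4-byte length header
    packet.append(payload)                                              // then the JSON body
    return packet
}

The receiver reads the header first, waits until that many payload bytes have arrived, then repeats; that is exactly what the didRead delegate method below does with its dataBuffer.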

import CocoaAsyncSocket
import SVProgressHUD
import Foundation

fileprivate let ipStr = "172.20.10.1"  // IP of the device running the server side (the client connects to this host)
fileprivate let myPort: UInt16 = 12345
/// Dictionary key "type"
let KSocketDataType = "type"
/// Dictionary key "data"
let KSocketDataKey = "data"
fileprivate let bodyLength = 4 // size of the message-length header in bytes

enum LYSocketDataType: String, CaseIterable {
    /// heartbeat ping
    case ping
    /// image
    case image
    /// encoded screen-recording video data
    case sampleBuffer
    /// string message
    case string
    /// single tap
    case touch
    /// double tap
    case dobleTap
    /// swipe
    case slide
}

protocol SocketManagerDelegate: AnyObject {
    func receivedImage(_ image: UIImage)
    func receivedPoint(_ x: Double, _ y: Double)
    func receivedKeyWord(_ keyWord: String)
    func receivedSampleBuffer(_ sampleBuffer: Data)
}

class SocketManager: NSObject {
    // MARK: - property
    private var socket: GCDAsyncSocket!
    // When acting as the server, the connected client socket
    private var clientSocket: GCDAsyncSocket?
    /// Whether this instance acts as the server
    public var isServer: Bool

    /// Delegate
    public weak var delegate: SocketManagerDelegate?
    /// Whether a client has connected (server side)
    private(set) var isConnectClient = false

    // Receive buffer, used to reassemble coalesced or partial packets
    private lazy var dataBuffer: Data = {
        let data = Data()
        return data
    }()

    // Timer for sending heartbeat pings
    private lazy var timer: Timer = {
        let timer = Timer(timeInterval: 30, target: self, selector: #selector(timerAction), userInfo: nil, repeats: true)
        RunLoop.current.add(timer, forMode: .common)
        return timer
    }()
    
    // MARK: - lifecycle
    init(isServer: Bool) {
        self.isServer = isServer
        super.init()
        
        let queue = DispatchQueue(label: "com.lanyou.socket")
        socket = GCDAsyncSocket(delegate: self, delegateQueue: queue)

        if isServer {
            setupServer()
        } else {
            setupClient()
        }
    }
    
    fileprivate func setupServer() {
        do {
            try socket.accept(onPort: myPort)
        } catch {
            print("socket服務器啟動失敗: \(error.localizedDescription)")
        }
    }

    fileprivate func setupClient() {
        do {
            try socket.connect(toHost: ipStr, onPort: myPort)
        } catch {
            print("socket連接服務器失敗: \(error.localizedDescription)")
        }
    }
    
    deinit {
        dispose()
    }
    
    public func dispose() {
        socket.disconnect()
        clientSocket?.disconnect()
        if !isServer {
            timer.invalidate()
        }
    }
    
    // MARK: - sendData
    public func sendParam(_ param: [String : Any]) {
        if isServer && !isConnectClient {
            return
        }
                    
        var dict = param as [String : Any]
        // Convert any Data values to Base64 strings so they can be serialized as JSON
        for (key, value) in dict {
            if let dataValue = value as? Data {
                if let base64String = dataToBase64String(dataValue) {
                    dict[key] = base64String
                }
            }
        }
    
        // dictionary -> JSON data
        do {
            let jsonData = try JSONSerialization.data(withJSONObject: dict, options: [])
            let jsonString = String(data: jsonData, encoding: .utf8)
            guard let json = jsonString else { return }
            if let data = json.data(using: .utf8) {
                sendData(data)
            }
        } catch {
            print("Failed to serialize the dictionary to JSON: \(error)")
        }
    }
    
    fileprivate func dataToBase64String(_ data: Data) -> String? {
        return data.base64EncodedString()
    }
        
    // Frame the message: prepend a 4-byte length header so the receiver can split the TCP stream back into messages
    fileprivate func sendData(_ data: Data) {
        // Build the packet: length header + payload
        var messageLength: UInt32 = UInt32(data.count)
        let lengthData = Data(bytes: &messageLength, count: MemoryLayout<UInt32>.size) // 4 bytes
        var sendData = lengthData
        sendData.append(data)
        
        if isServer {
            clientSocket?.write(sendData, withTimeout: -1, tag: 0)
        } else {
            socket.write(sendData, withTimeout: -1, tag: 0)
        }
    }
    
    fileprivate func jsonToDictionary(_ jsonString: String) -> [String : Any]? {
        if let jsonData = jsonString.data(using: .utf8) {
            do {
                if let jsonDictionary = try JSONSerialization.jsonObject(with: jsonData, options: []) as? [String: Any] {
                    return jsonDictionary
                } else {
                    print("json字符串轉(zhuǎn)字典失敗")
                }
            } catch {
                print("json轉(zhuǎn)字典err: \(error.localizedDescription)")
            }
        }
        return nil
    }
    
    // MARK: - Timer
    @objc fileprivate func timerAction() {
        sendParam([KSocketDataType : LYSocketDataType.ping.rawValue])
    }
}

extension SocketManager: GCDAsyncSocketDelegate {
    func socket(_ sock: GCDAsyncSocket, didAcceptNewSocket newSocket: GCDAsyncSocket) {
        print("didAcceptNewSocket: \(newSocket.connectedHost ?? "")")
        DispatchQueue.main.async {
            SVProgressHUD.showSuccess(withStatus: "didAcceptNewSocket: \(newSocket.connectedHost ?? "")")
        }
        
        if isServer {
            clientSocket = newSocket
            isConnectClient = true
            newSocket.readData(withTimeout: -1, tag: 0)
        }
    }
    
    func socketDidDisconnect(_ sock: GCDAsyncSocket, withError err: Error?) {
        if let errStr = err?.localizedDescription {
            print("連接出錯: \(err?.localizedDescription ?? "")")
            DispatchQueue.main.async {
                SVProgressHUD.showError(withStatus: errStr)
            }
        }
    }

    func socket(_ sock: GCDAsyncSocket, didConnectToHost host: String, port: UInt16) {
        print("成功連接服務器: \(host):\(port)")
        sock.readData(withTimeout: -1, tag: 0)
        if !isServer {
            timer.fire()
        }
    }
    
    // MARK: - Splitting the TCP stream back into framed messages
    func socket(_ sock: GCDAsyncSocket, didRead data: Data, withTag tag: Int) {
        // Append the incoming bytes to the receive buffer first
        dataBuffer.append(data)
        
        while true {
            guard dataBuffer.count >= bodyLength else { break } // need at least the 4-byte length header

            // Read the message length from the header
            var messageLength: UInt32 = 0
            (dataBuffer as NSData).getBytes(&messageLength, length: MemoryLayout<UInt32>.size)

            guard dataBuffer.count >= Int(messageLength) + bodyLength else { break } // wait until the whole message has arrived

            // Extract the complete message
            let messageData = dataBuffer.subdata(in: bodyLength..<(Int(messageLength) + bodyLength))

            // Handle the complete message
            handleData(messageData, socket: sock)

            // Drop the bytes we just consumed
            dataBuffer = Data(dataBuffer.subdata(in: (Int(messageLength) + bodyLength)..<dataBuffer.count))
        }
        
        // Keep listening for data
        sock.readData(withTimeout: -1, tag: 0)
    }
    
    fileprivate func handleData(_ data: Data, socket: GCDAsyncSocket) {
        // dict : eg: {"type" : "image", "data" : "base64"}
        guard let receivedString = String(data: data, encoding: .utf8) else {
            return
        }
        
        guard let dic = jsonToDictionary(receivedString) else {
            return
        }
        
        // Message type
        if let type = dic[KSocketDataType] as? LYSocketDataType.RawValue {
            switch type {
            case LYSocketDataType.string.rawValue: // string messages, e.g. ping/pong
                guard let str = dic[KSocketDataKey] as? String else { return }
                if isServer {
                    delegate?.receivedKeyWord(str)
                }
                /*DispatchQueue.main.async {
                    SVProgressHUD.showSuccess(withStatus: self.isServer ? "Server received: \(str)" : "Client received: \(str)")
                }*/
                
            case LYSocketDataType.image.rawValue:
                // base64 -> image
                let dataString = dic[KSocketDataKey] as? String
                guard let base64Str = dataString else { return }
                guard let imageData = Data(base64Encoded: base64Str) else { return }
                guard let image = UIImage(data: imageData) else { return }
                delegate?.receivedImage(image)
                
            case LYSocketDataType.touch.rawValue, LYSocketDataType.slide.rawValue: // touch events
                handlerTouch(type: type, jsonDic: dic)
                
            case LYSocketDataType.sampleBuffer.rawValue:
                handlerSample(type: type, jsonDic: dic)
                
            default:
                print("- handleData - ")
            }
        }
    }
    
    fileprivate func handlerTouch(type: String, jsonDic: [String : Any]) {
        if isServer {
            guard let dataDic = jsonDic[KSocketDataKey] as? [String : Any] else { return }
            guard let x = dataDic["x"] as? String, let y = dataDic["y"] as? String  else { return }
            delegate?.receivedPoint(Double(x) ?? 0, Double(y) ?? 0)
            
            /*DispatchQueue.main.async {
                SVProgressHUD.showSuccess(withStatus: "client \(type) x: \(x) y:\(y) ")
            }*/
        }
    }
    
    fileprivate func handlerSample(type: String, jsonDic: [String : Any]) {
        let dataString = jsonDic[KSocketDataKey] as? String
        guard let base64Str = dataString else { return }
        guard let imageData = Data(base64Encoded: base64Str) else { return }
        delegate?.receivedSampleBuffer(imageData)

    }

    func socket(_ sock: GCDAsyncSocket, didWriteDataWithTag tag: Int) {
        //print("數(shù)據(jù)發(fā)送成功")
    }
}

Decoding on the receiving side

Implement the socket delegate method:

func receivedSampleBuffer(_ sampleBuffer: Data) {
    self.decoder.decode(data: sampleBuffer)
}
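
The post doesn't show how the viewing side creates its socket. A minimal sketch, assuming the receiving view controller conforms to SocketManagerDelegate (setupSocket() is just a hypothetical hook you'd call from viewDidLoad or similar):

// Sketch: the viewer connects as a client and registers itself as the delegate
private var socketManager: SocketManager?

func setupSocket() {
    socketManager = SocketManager(isServer: false) // connects to ipStr:myPort defined above
    socketManager?.delegate = self                 // receivedSampleBuffer(_:) is then called per message
}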

Playing back with AVSampleBufferDisplayLayer

private var displayLayer: AVSampleBufferDisplayLayer?
private lazy var decoder: LYH264Decoder = {
    let decoder = LYH264Decoder(width: 375, height: 852)
    decoder.returnType = .sampleBuffer
    decoder.videoDecodeSampleBufferCallback = { [weak self] (buffer) in
        guard let displayLayer = self?.displayLayer, let buffer = buffer else {
            return
        }
        
        if displayLayer.isReadyForMoreMediaData {
            displayLayer.enqueue(buffer)
        } else {
            print("Failed to enqueue the H.264 sample for playback")
        }
    }
    return decoder
}()
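
displayLayer is only declared above. A minimal sketch of how it could be created and attached, assuming this runs in the viewer's view controller (setupDisplayLayer() is a hypothetical helper; AVSampleBufferDisplayLayer comes from AVFoundation):

import AVFoundation

// Sketch: create the AVSampleBufferDisplayLayer and add it to the view hierarchy
func setupDisplayLayer() {
    let layer = AVSampleBufferDisplayLayer()
    layer.frame = view.bounds
    layer.videoGravity = .resizeAspect   // keep the broadcaster's aspect ratio
    view.layer.addSublayer(layer)
    displayLayer = layer
}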

Wrapping up

All of the code above is demo-quality, and I've only recently switched to Swift, so you'll need to adapt it to your own project.
