iOS Audio/Video Communication Based on WebRTC: A Summary (2020)
Attached is my Swift project. It covers the overall Swift app architecture, a networking layer, DSBridge for native-to-H5 interaction, uses of reflection, a WCDB database wrapper, a WebRTC audio/video live-streaming demo, socket usage, a socket protocol wrapper, and more. I hope it is useful.
Complete Swift project, continuously updated through 2020
My company needed WebRTC for audio/video communication. After going through many blog posts and demos from home and abroad, here is a summary of what I learned:
The WebRTC official site
WebRTC's notes on iOS usage
Another standalone demo that is a great help for understanding WebRTC: https://github.com/Xianlau/WebRTC_Demo
First, some screenshots of the demo
WebRTC architecture
A complete WebRTC setup splits into two major parts: the server side and the client side.
- Server side:
STUN server: obtains a device's external (public) network address
TURN server: relays the traffic when the point-to-point connection fails
Signaling server: responsible for the end-to-end connection setup. At the start of a connection the two ends must exchange signaling messages (the SDP, candidates, and so on), which are forwarded through the signaling server.
- Client side, four application platforms:
Android, iOS
PC, Browser
Here are the three main WebRTC APIs, followed by the flow for establishing a point-to-point connection.
- MediaStream: through the MediaStream API you obtain synchronized video and audio streams from the device's camera and microphone
- RTCPeerConnection: the component WebRTC uses to build a stable, efficient point-to-point streaming connection
- RTCDataChannel: lets the two peers establish a high-throughput, low-latency channel for transferring arbitrary data.
Of these, RTCPeerConnection is the core WebRTC component.
Flow chart for establishing a WebRTC connection
Walkthrough of the whole WebRTC connection flow
The main flow is shown in the diagram above; the details are as follows:
1. The client establishes a long-lived TCP connection to the server over a socket. WebRTC does not provide an API for this part, so a third-party framework helps here. For Objective-C, CocoaAsyncSocket is the usual choice: https://github.com/robbiehanson/CocoaAsyncSocket
For Swift, engineers abroad tend to favor Starscream (WebSocket): https://github.com/daltoniam/Starscream
2. The client performs the offer SDP handshake through the signaling server.
SDP (Session Description Protocol): describes the properties for establishing the audio/video connection, such as the audio codec, the video codec, whether audio/video is sent and/or received, and so on.
The SDP is created by the PeerConnection in the WebRTC framework; see my demo for the details.
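An SDP body is just plain text made of `k=v` lines. For intuition, here is a Foundation-only sketch (a hypothetical helper, not part of the demo) that pulls the declared media sections out of an SDP string:

```swift
import Foundation

// Hypothetical helper: list the media types ("audio", "video", ...) declared
// in an SDP body by scanning its "m=" lines. Illustration only.
func mediaSections(inSdp sdp: String) -> [String] {
    return sdp
        .components(separatedBy: .newlines)
        .filter { $0.hasPrefix("m=") }  // e.g. "m=audio 9 UDP/TLS/RTP/SAVPF 111"
        .compactMap { $0.dropFirst(2).split(separator: " ").first.map(String.init) }
}

let sampleSdp = """
v=0
o=- 4611731400430051336 2 IN IP4 127.0.0.1
s=-
m=audio 9 UDP/TLS/RTP/SAVPF 111
a=rtpmap:111 opus/48000/2
m=video 9 UDP/TLS/RTP/SAVPF 96
a=rtpmap:96 H264/90000
"""
print(mediaSections(inSdp: sampleSdp)) // ["audio", "video"]
```

In the real flow you never edit this text by hand; RTCPeerConnection generates and consumes it, and you only ferry it through the signaling server.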
3. The client performs the candidate handshake through the signaling server.
Candidate: mainly carries the IP information of the parties involved, including the local LAN IP, the public IP, the TURN server IP, the STUN server IP, etc.
Candidates are likewise created by the PeerConnection in the WebRTC framework; see my demo for the details.
4. Once the SDP and candidate handshakes succeed, a P2P end-to-end link is established and the video stream flows directly between the peers; no server is needed any more.
The SDP and candidate handshakes follow similar flows, but they are a little involved, so here is a brief walkthrough of the SDP handshake:
The diagram below shows WebRTC establishing an SDP handshake through signaling. Only after the SDP handshake do the two sides know each other's information, and that is the foundation for building the p2p channel.
1. The anchor (broadcaster) side generates its SDP description via createOffer
2. The anchor calls setLocalDescription to set the local description
3. The anchor sends the offer SDP to the viewer
4. The viewer calls setRemoteDescription to set the remote description
5. The viewer creates its own SDP description via createAnswer
6. The viewer calls setLocalDescription to set the local description
7. The viewer sends the answer SDP to the anchor
8. The anchor calls setRemoteDescription to set the remote description.
9. After the SDP handshake, a direct end-to-end communication channel exists between the two ends.
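The ordering of the steps above matters: an offer may only be applied from the stable state, and an answer only after the corresponding offer has been set. The following Foundation-only simulation of these signaling-state transitions is an illustration, not the WebRTC API (the real checks live inside RTCPeerConnection):

```swift
// Simplified model of the signaling-state machine during the offer/answer handshake.
enum SignalingState { case stable, haveLocalOffer, haveRemoteOffer }

struct PeerSim {
    private(set) var state: SignalingState = .stable
    mutating func setLocalOffer()   { precondition(state == .stable);          state = .haveLocalOffer }
    mutating func setRemoteOffer()  { precondition(state == .stable);          state = .haveRemoteOffer }
    mutating func setLocalAnswer()  { precondition(state == .haveRemoteOffer); state = .stable }
    mutating func setRemoteAnswer() { precondition(state == .haveLocalOffer);  state = .stable }
}

var anchor = PeerSim()
var viewer = PeerSim()
anchor.setLocalOffer()   // steps 1-2: createOffer + setLocalDescription
viewer.setRemoteOffer()  // steps 3-4: offer arrives, setRemoteDescription
viewer.setLocalAnswer()  // steps 5-6: createAnswer + setLocalDescription
anchor.setRemoteAnswer() // steps 7-8: answer arrives, setRemoteDescription
// step 9: both sides are back in the stable state and the channel can be built
print(anchor.state == .stable && viewer.state == .stable) // true
```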
Because the network environments we sit in are messy and a user may be inside a private intranet, p2p transport runs into NATs and firewalls. That is when, during the SDP handshake, we need the STUN/TURN/ICE family of NAT traversal techniques to guarantee the p2p link can be established.
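Whether a candidate came via STUN or TURN is visible in the candidate string itself: its `typ` field is `host` (a local interface), `srflx` (server-reflexive, i.e. the public address learned from STUN) or `relay` (relayed through TURN). A small Foundation-only sketch (hypothetical helper, not part of the demo) that extracts that field:

```swift
import Foundation

// Hypothetical helper: extract the "typ" value from an ICE candidate line.
func candidateType(of candidate: String) -> String? {
    let tokens = candidate.split(separator: " ").map(String.init)
    guard let i = tokens.firstIndex(of: "typ"), i + 1 < tokens.count else { return nil }
    return tokens[i + 1]
}

let hostCand  = "candidate:1 1 udp 2122260223 192.168.1.2 61349 typ host generation 0"
let srflxCand = "candidate:2 1 udp 1686052607 203.0.113.7 61349 typ srflx raddr 192.168.1.2 rport 61349"
print(candidateType(of: hostCand)!)  // host  (direct, no traversal needed)
print(candidateType(of: srflxCand)!) // srflx (public address discovered via STUN)
```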
1. Establish the long-lived socket connection, paving the way for the signaling that follows.
For the long-lived connection to the server I went with a raw socket via the third-party framework CocoaAsyncSocket; a WebSocket would also work, depending on your team's technology choice.
- Below is the logic for establishing the socket connection and then the WebRTC connection. The socket part needs very little code; the CocoaAsyncSocket README on GitHub covers it, so don't sink much time there. The focus is establishing the WebRTC connection. Also note that when exchanging data with the server you may need a packetization (framing) strategy.
- Most code online is Objective-C, and much of it is outdated and scattered. The OC version is comparatively simple; what follows is a Swift version. Before reading it, please make sure you have studied the two sequence diagrams mentioned above.
// MARK: - Socket state delegate
protocol SocketClientDelegate: AnyObject {
func signalClientDidConnect(_ signalClient: SocketClient)
func signalClientDidDisconnect(_ signalClient: SocketClient)
func signalClient(_ signalClient: SocketClient, didReceiveRemoteSdp sdp: RTCSessionDescription)
func signalClient(_ signalClient: SocketClient, didReceiveCandidate candidate: RTCIceCandidate)
}
final class SocketClient: NSObject {
//socket
var socket: GCDAsyncSocket = {
return GCDAsyncSocket.init()
}()
private var host: String? // server IP
private var port: UInt16? // port
weak var delegate: SocketClientDelegate? // delegate
var receiveHeartBeatDuation = 0 // heartbeat tick count
let heartBeatOverTime = 10 // heartbeat timeout
var sendHeartbeatTimer: Timer? // timer for sending heartbeats
var receiveHeartbearTimer: Timer? // timer for receiving heartbeats
// buffer for received data
var dataBuffer: Data = Data.init()
// our peer_id, obtained at login
var peer_id = 0
// the remote device's peer_id, obtained at login
var remote_peer_id = 0
// MARK: - Initialization
init(hostStr: String , port: UInt16) {
super.init()
self.socket.delegate = self
self.socket.delegateQueue = DispatchQueue.main
self.host = hostStr
self.port = port
// start the socket connection
connect()
}
// MARK: - Connect
func connect() {
do {
try self.socket.connect(toHost: self.host ?? "", onPort: self.port ?? 6868, withTimeout: -1)
}catch {
print(error)
}
}
// MARK: - Send a message
func sendMessage(_ data: Data){
self.socket.write(data, withTimeout: -1, tag: 0)
}
// MARK: - Send an sdp offer/answer
func send(sdp rtcSdp: RTCSessionDescription) {
// convert to our own sdp model
let type = rtcSdp.type
var typeStr = ""
switch type {
case .answer:
typeStr = "answer"
case .offer:
typeStr = "offer"
default:
print("invalid sdp type")
}
let newSDP:SDPSocket = SDPSocket.init(sdp: rtcSdp.sdp, type: typeStr)
let jsonInfo = newSDP.toJSON()
let dic = ["sdp" : jsonInfo]
let info:SocketInfo = SocketInfo.init(type: .sdp, source: self.peer_id, destination: self.remote_peer_id, params: dic as Dictionary<String, Any>)
let data = self.packData(info: info)
//print(data)
self.sendMessage(data)
print("sent SDP")
}
// MARK: - Send an ICE candidate
func send(candidate rtcIceCandidate: RTCIceCandidate) {
let iceCandidateMessage = IceCandidate_Socket(from: rtcIceCandidate)
let jsonInfo = iceCandidateMessage.toJSON()
let dic = ["icecandidate" : jsonInfo]
let info:SocketInfo = SocketInfo.init(type: .icecandidate, source: self.peer_id, destination: self.remote_peer_id, params: dic as Dictionary<String, Any>)
let data = self.packData(info: info)
//print(data)
self.sendMessage(data)
print("sent ICE")
}
}
extension SocketClient: GCDAsyncSocketDelegate {
// MARK: - Socket connected
func socket(_ sock: GCDAsyncSocket, didConnectToHost host: String, port: UInt16) {
debugPrint("socket connected")
self.delegate?.signalClientDidConnect(self)
// log in to obtain our peer_id
login()
// start sending heartbeats
startHeartbeatTimer()
// start the heartbeat-receive timer
startReceiveHeartbeatTimer()
// keep reading data
socket.readData(withTimeout: -1, tag: 0)
}
// MARK: - The socket received a data packet
func socket(_ sock: GCDAsyncSocket, didRead data: Data, withTag tag: Int) {
//debugPrint("socket received a packet")
let _:SocketInfo? = self.unpackData(data)
//let type:SigType = SigType(rawValue: socketInfo?.type ?? "")!
//print(socketInfo ?? "")
//print(type)
// keep reading data
socket.readData(withTimeout: -1, tag: 0)
}
// MARK: - Disconnected
func socketDidDisconnect(_ sock: GCDAsyncSocket, withError err: Error?) {
debugPrint("socket disconnected")
print(err ?? "")
self.disconnectSocket()
// try to reconnect after five seconds
DispatchQueue.global().asyncAfter(deadline: .now() + 5) {
debugPrint("Trying to reconnect to signaling server...")
self.connect()
}
}
}
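The `packData(info:)` / `unpackData(_:)` helpers called above live in the demo and are not shown here; they implement the packetization strategy mentioned earlier. One common approach over a raw TCP socket is a 4-byte big-endian length prefix. The sketch below assumes that layout; it is not necessarily the demo's actual wire format:

```swift
import Foundation

// Assumed framing: [4-byte big-endian payload length][payload bytes].
func packFrame(_ payload: Data) -> Data {
    var length = UInt32(payload.count).bigEndian
    var frame = Data(bytes: &length, count: 4)
    frame.append(payload)
    return frame
}

// TCP is a byte stream, so the reader buffers until whole frames arrive.
// Consumes as many complete frames as the buffer holds; leftovers stay buffered.
func unpackFrames(from buffer: inout Data) -> [Data] {
    var frames: [Data] = []
    while buffer.count >= 4 {
        let bytes = [UInt8](buffer)
        let length = Int(bytes[0]) << 24 | Int(bytes[1]) << 16 | Int(bytes[2]) << 8 | Int(bytes[3])
        let total = 4 + length
        guard bytes.count >= total else { break } // frame not complete yet
        frames.append(Data(bytes[4..<total]))
        buffer = Data(bytes[total...])
    }
    return frames
}

var buffer = Data()
buffer.append(packFrame(Data("hello".utf8)))
buffer.append(packFrame(Data("world".utf8)).prefix(6)) // second frame only partially arrived
let frames = unpackFrames(from: &buffer)
print(frames.map { String(data: $0, encoding: .utf8)! }) // ["hello"]
```

In `didRead` this would be driven by appending the incoming `data` to `dataBuffer` and draining complete frames on every callback.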
2. Run the signaling and establish the end-to-end connection.
import Foundation
import WebRTC
// MARK: - WebRTC connection state delegate
protocol WebRTCClientDelegate: AnyObject {
func webRTCClient(_ client: WebRTCClient, didDiscoverLocalCandidate candidate: RTCIceCandidate)
func webRTCClient(_ client: WebRTCClient, didChangeConnectionState state: RTCIceConnectionState)
func webRTCClient(_ client: WebRTCClient, didReceiveData data: Data)
}
final class WebRTCClient: NSObject {
// MARK: - Lazily created factory
private static let factory: RTCPeerConnectionFactory = {
RTCInitializeSSL()
let videoEncoderFactory = RTCVideoEncoderFactoryH264()
let videoDecoderFactory = RTCVideoDecoderFactoryH264()
let factory = RTCPeerConnectionFactory(encoderFactory: videoEncoderFactory, decoderFactory: videoDecoderFactory)
// let options = RTCPeerConnectionFactoryOptions()
// options.ignoreVPNNetworkAdapter = true
// options.ignoreWiFiNetworkAdapter = true
// options.ignoreCellularNetworkAdapter = true
// options.ignoreEthernetNetworkAdapter = true
// options.ignoreLoopbackNetworkAdapter = true
// factory.setOptions(options)
return factory
}()
weak var delegate: WebRTCClientDelegate?
private let peerConnection: RTCPeerConnection
private let rtcAudioSession = RTCAudioSession.sharedInstance()
private let audioQueue = DispatchQueue(label: "audio")
private let mediaConstrains = [kRTCMediaConstraintsOfferToReceiveAudio: kRTCMediaConstraintsValueTrue,
kRTCMediaConstraintsOfferToReceiveVideo: kRTCMediaConstraintsValueTrue]
private var videoCapturer: RTCVideoCapturer?
private var localVideoTrack: RTCVideoTrack?
private var remoteVideoTrack: RTCVideoTrack?
private var localDataChannel: RTCDataChannel?
private var remoteDataChannel: RTCDataChannel?
@available(*, unavailable)
override init() {
fatalError("WebRTCClient:init is unavailable")
}
required init(iceServers: [String]) {
// // gatherContinually will let WebRTC to listen to any network changes and send any new candidates to the other client
// config.continualGatheringPolicy = .gatherContinually
//config.iceTransportPolicy = .all
// constraints: control the MediaStream contents (media type, resolution, frame rate)
// let constraints = RTCMediaConstraints(mandatoryConstraints: nil,
// optionalConstraints: ["DtlsSrtpKeyAgreement":kRTCMediaConstraintsValueTrue])
let config = RTCConfiguration()
config.iceServers = [RTCIceServer(urlStrings: iceServers)]
// Unified Plan is preferred over Plan B
config.sdpSemantics = .unifiedPlan
// constraints: control the MediaStream contents (media type, resolution, frame rate)
let mediaConstraints = RTCMediaConstraints.init(mandatoryConstraints: nil, optionalConstraints: nil)
self.peerConnection = WebRTCClient.factory.peerConnection(with: config, constraints: mediaConstraints, delegate: nil)
super.init()
self.createMediaSenders()
self.configureAudioSession()
self.peerConnection.delegate = self
}
// MARK: Hang up
func disconnect(){
self.peerConnection.close()
}
// MARK: Signaling - create the local sdp offer to send to the socket server
func offer(completion: @escaping (_ sdp: RTCSessionDescription) -> Void) {
let constrains = RTCMediaConstraints(mandatoryConstraints: self.mediaConstrains,
optionalConstraints: nil)
self.peerConnection.offer(for: constrains) { (sdp, error) in
guard let sdp = sdp else {
return
}
self.peerConnection.setLocalDescription(sdp, completionHandler: { (error) in
completion(sdp)
})
}
}
// MARK: - Reply to the socket server with an sdp answer
func answer(completion: @escaping (_ sdp: RTCSessionDescription) -> Void) {
let constrains = RTCMediaConstraints(mandatoryConstraints: self.mediaConstrains,
optionalConstraints: nil)
// create the local sdp
self.peerConnection.answer(for: constrains) { (sdp, error) in
guard let sdp = sdp else {
return
}
// set the local sdp
self.peerConnection.setLocalDescription(sdp, completionHandler: { (error) in
// hand the sdp back to be sent out
completion(sdp)
})
}
}
// MARK: - Set the remote sdp
func set(remoteSdp: RTCSessionDescription, completion: @escaping (Error?) -> ()) {
self.peerConnection.setRemoteDescription(remoteSdp, completionHandler: completion)
}
// MARK: - Add a remote candidate
func set(remoteCandidate: RTCIceCandidate) {
self.peerConnection.add(remoteCandidate)
}
// MARK: Media
func startCaptureLocalVideo(renderer: RTCVideoRenderer) {
guard let capturer = self.videoCapturer as? RTCCameraVideoCapturer else {
return
}
guard
// pick the front camera; use .back for the rear camera
let frontCamera = (RTCCameraVideoCapturer.captureDevices().first { $0.position == .front }),
// choose highest res
let format = (RTCCameraVideoCapturer.supportedFormats(for: frontCamera).sorted { (f1, f2) -> Bool in
let width1 = CMVideoFormatDescriptionGetDimensions(f1.formatDescription).width
let width2 = CMVideoFormatDescriptionGetDimensions(f2.formatDescription).width
return width1 < width2
}).last,
// choose highest fps
let fps = (format.videoSupportedFrameRateRanges.sorted { return $0.maxFrameRate < $1.maxFrameRate }.last) else {
return
}
capturer.startCapture(with: frontCamera,
format: format,
fps: Int(fps.maxFrameRate))
self.localVideoTrack?.add(renderer)
}
func renderRemoteVideo(to renderer: RTCVideoRenderer) {
self.remoteVideoTrack?.add(renderer)
}
private func configureAudioSession() {
self.rtcAudioSession.lockForConfiguration()
do {
try self.rtcAudioSession.setCategory(AVAudioSession.Category.playAndRecord.rawValue)
try self.rtcAudioSession.setMode(AVAudioSession.Mode.voiceChat.rawValue)
} catch let error {
debugPrint("Error changing AVAudioSession category: \(error)")
}
self.rtcAudioSession.unlockForConfiguration()
}
// MARK: - Create the media streams
private func createMediaSenders() {
let streamId = "stream"
// Audio
let audioTrack = self.createAudioTrack()
self.peerConnection.add(audioTrack, streamIds: [streamId])
// Video
let videoTrack = self.createVideoTrack()
self.localVideoTrack = videoTrack
self.peerConnection.add(videoTrack, streamIds: [streamId])
self.remoteVideoTrack = self.peerConnection.transceivers.first { $0.mediaType == .video }?.receiver.track as? RTCVideoTrack
//samadd
//self.remoteVideoTrack?.source.adaptOutputFormat(toWidth: 960, height: 480, fps: 30)
// Data
if let dataChannel = createDataChannel() {
dataChannel.delegate = self
self.localDataChannel = dataChannel
}
}
// MARK: - Create the audio track
private func createAudioTrack() -> RTCAudioTrack {
let audioConstrains = RTCMediaConstraints(mandatoryConstraints: nil, optionalConstraints: nil)
let audioSource = WebRTCClient.factory.audioSource(with: audioConstrains)
let audioTrack = WebRTCClient.factory.audioTrack(with: audioSource, trackId: "audio0")
return audioTrack
}
// MARK: - Create the video track
private func createVideoTrack() -> RTCVideoTrack {
let videoSource = WebRTCClient.factory.videoSource()
#if targetEnvironment(simulator)
self.videoCapturer = RTCFileVideoCapturer(delegate: videoSource)
#else
self.videoCapturer = RTCCameraVideoCapturer(delegate: videoSource)
#endif
let videoTrack = WebRTCClient.factory.videoTrack(with: videoSource, trackId: "video0")
return videoTrack
}
// MARK: Data Channels
// MARK: - Create the data channel
private func createDataChannel() -> RTCDataChannel? {
let config = RTCDataChannelConfiguration()
guard let dataChannel = self.peerConnection.dataChannel(forLabel: "WebRTCData", configuration: config) else {
debugPrint("Warning: Couldn't create data channel.")
return nil
}
return dataChannel
}
// MARK: - Send data
func sendData(_ data: Data) {
let buffer = RTCDataBuffer(data: data, isBinary: true)
self.remoteDataChannel?.sendData(buffer)
}
}
// MARK:- Audio control
extension WebRTCClient {
func muteAudio() {
self.setAudioEnabled(false)
}
func unmuteAudio() {
self.setAudioEnabled(true)
}
// Fallback to the default playing device: headphones/bluetooth/ear speaker
func speakerOff() {
self.audioQueue.async { [weak self] in
guard let self = self else {
return
}
self.rtcAudioSession.lockForConfiguration()
do {
try self.rtcAudioSession.setCategory(AVAudioSession.Category.playAndRecord.rawValue)
try self.rtcAudioSession.overrideOutputAudioPort(.none)
} catch let error {
debugPrint("Error setting AVAudioSession category: \(error)")
}
self.rtcAudioSession.unlockForConfiguration()
}
}
// Force speaker
func speakerOn() {
self.audioQueue.async { [weak self] in
guard let self = self else {
return
}
self.rtcAudioSession.lockForConfiguration()
do {
try self.rtcAudioSession.setCategory(AVAudioSession.Category.playAndRecord.rawValue)
try self.rtcAudioSession.overrideOutputAudioPort(.speaker)
try self.rtcAudioSession.setActive(true)
} catch let error {
debugPrint("Couldn't force audio to speaker: \(error)")
}
self.rtcAudioSession.unlockForConfiguration()
}
}
private func setAudioEnabled(_ isEnabled: Bool) {
let audioTracks = self.peerConnection.transceivers.compactMap { return $0.sender.track as? RTCAudioTrack }
audioTracks.forEach { $0.isEnabled = isEnabled }
}
}
extension WebRTCClient: RTCDataChannelDelegate {
func dataChannelDidChangeState(_ dataChannel: RTCDataChannel) {
debugPrint("dataChannel did change state: \(dataChannel.readyState)")
}
func dataChannel(_ dataChannel: RTCDataChannel, didReceiveMessageWith buffer: RTCDataBuffer) {
self.delegate?.webRTCClient(self, didReceiveData: buffer.data)
}
}
3. Wrap the WebRTC module in a manager.
import Foundation
import AVFoundation
import WebRTC
// MARK: - Video-link connection state
public enum RtcConnectedState {
case succeeded // connected successfully
case failure // connection failed
case connecting // connecting
}
protocol WebRTCManagerDelegate: AnyObject {
// whether the socket connected
func webRTCManager(_ manager: WebRTCManager, socketConnectState isSucessed: Bool)
// webrtc connection state
func webRTCManager(_ manager: WebRTCManager, didChangeConnectionState state: RTCIceConnectionState)
}
class WebRTCManager {
static let shareInstance:WebRTCManager = WebRTCManager()
//private let signalClient: SignalingClient
var signalClient: SocketClient?
var webRTCClient: WebRTCClient?
/// pass in a config at initialization
var sockitConfig: SocketConfig = SocketConfig.default
// delegate
weak var delegate: WebRTCManagerDelegate?
var remoteCandidate: Int = 0
/// callback invoked when the rtc connection succeeds
var feedbackConnectedBlock: ((_ webClient: WebRTCClient)->())?
// MARK: - Disconnect the socket
public func disconnect(){
self.signalClient?.disconnectSocket()
self.webRTCClient?.disconnect()
self.signalClient?.delegate = nil
self.webRTCClient?.delegate = nil
self.signalClient = nil
self.webRTCClient = nil
remoteCandidate = 0
}
// MARK: - Start the socket connection
public func connect(){
// print RTC logs
//RTCSetMinDebugLogLevel(.verbose)
//let log = RTCFileLogger.init()
//log.start()
// create the socket and rtc objects
signalClient = SocketClient.init(hostStr: sockitConfig.host, port: sockitConfig.port)
webRTCClient = WebRTCClient(iceServers: sockitConfig.webRTCIceServers)
webRTCClient?.delegate = self
signalClient?.delegate = self
self.signalClient?.connect()
}
}
extension WebRTCManager: SocketClientDelegate {
// socket login succeeded
func signalClientdidLogin(_ signalClient: SocketClient) {
logger.info("******** socket login succeeded ************************")
}
// MARK: - Socket connected
func signalClientDidConnect(_ signalClient: SocketClient) {
self.delegate?.webRTCManager(self, socketConnectState: true)
}
// MARK: - Socket disconnected
func signalClientDidDisconnect(_ signalClient: SocketClient) {
self.delegate?.webRTCManager(self, socketConnectState: false)
}
// MARK: - Received the remote sdp
func signalClient(_ signalClient: SocketClient, didReceiveRemoteSdp sdp: RTCSessionDescription) {
logger.info("************ received remote sdp ****************************")
// set the remote sdp
self.webRTCClient?.set(remoteSdp: sdp) { (error) in
self.webRTCClient?.answer { (localSdp) in
self.signalClient?.send(sdp: localSdp)
}
logger.error(error.debugDescription)
}
}
// MARK: - Received a remote ice candidate
func signalClient(_ signalClient: SocketClient, didReceiveCandidate candidate: RTCIceCandidate) {
logger.info("************ received remote ice ****************************")
self.remoteCandidate += 1
// set the remote ice candidate
self.webRTCClient?.set(remoteCandidate: candidate)
}
}
extension WebRTCManager: WebRTCClientDelegate {
// MARK: - Discovered a local ice candidate
func webRTCClient(_ client: WebRTCClient, didDiscoverLocalCandidate candidate: RTCIceCandidate) {
logger.info("******************************** discovered a local ice candidate **********")
self.signalClient?.send(candidate: candidate)
}
// MARK: - RTC connection state
func webRTCClient(_ client: WebRTCClient, didChangeConnectionState state: RTCIceConnectionState) {
self.delegate?.webRTCManager(self, didChangeConnectionState: state)
switch state {
case .connected, .completed:
logger.info("********* RTC connected *****************************************")
if let block = feedbackConnectedBlock {
block(client)
}
case .disconnected:
logger.info("********* RTC disconnected *****************************************")
case .failed, .closed:
logger.info("********* RTC connection failed *****************************************")
case .new, .checking, .count: break
@unknown default: break
}
}
// MARK: - Received data over the rtc data channel
func webRTCClient(_ client: WebRTCClient, didReceiveData data: Data) {
// DispatchQueue.main.async {
// let message = String(data: data, encoding: .utf8) ?? "(Binary: \(data.count) bytes)"
// let alert = UIAlertController(title: "Message from WebRTC", message: message, preferredStyle: .alert)
// alert.addAction(UIAlertAction(title: "OK", style: .cancel, handler: nil))
// self.present(alert, animated: true, completion: nil)
// }
}
}
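The `SocketInfo` / `SDPSocket` model types used throughout are in the demo and not shown here. As a sketch only, with field names assumed from the call sites above rather than taken from the demo's actual definitions, such a signaling envelope could be modeled with Codable:

```swift
import Foundation

// Assumed shape of the sdp payload, mirroring SDPSocket's call site.
struct SdpPayload: Codable {
    let sdp: String
    let type: String // "offer" or "answer"
}

// Assumed shape of the envelope, mirroring SocketInfo's call site.
struct SignalEnvelope: Codable {
    let type: String     // e.g. "sdp", "icecandidate"
    let source: Int      // our peer_id
    let destination: Int // remote peer_id
    let sdp: SdpPayload? // present when type == "sdp"
}

let envelope = SignalEnvelope(type: "sdp", source: 1, destination: 2,
                              sdp: SdpPayload(sdp: "v=0...", type: "offer"))
let data = try! JSONEncoder().encode(envelope)
let decoded = try! JSONDecoder().decode(SignalEnvelope.self, from: data)
print(decoded.type, decoded.sdp!.type) // sdp offer
```

Encoded this way, the envelope body is what the framing layer would wrap with its length prefix before `sendMessage(_:)` writes it to the socket.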
Still being updated...
If you have questions, you can reach me on QQ: 506299396