Preface
Adding audio recording with AV Foundation's AVAudioRecorder class is just as straightforward as playback with AVAudioPlayer; both are built on top of Audio Queue Services and are available on both macOS and iOS. You can record from the built-in microphone, or from a digital audio interface or USB microphone.
The main topics are:
How to create an AVAudioRecorder
1. Audio format
2. Sample rate
3. Number of channels
Building the demo
1. Configuring the audio session
2. Implementing the recording features
3. Using audio metering to drive a level-meter view
Before creating an AVAudioRecorder, let's take a quick look at its properties and methods:
@property (readonly, getter=isRecording) BOOL recording; //whether recording is in progress
@property (readonly) NSDictionary<NSString *, id> *settings; //recorder settings: sample rate, audio format, number of channels, ...
@property (readonly) NSURL *url; //file URL the recording is written to
@property (readonly) NSTimeInterval currentTime; //elapsed recording time
@property (getter=isMeteringEnabled) BOOL meteringEnabled; //whether audio-level metering is enabled
Instance methods of AVAudioRecorder:
- (BOOL)prepareToRecord; //prepare buffers for recording
- (BOOL)record; //start recording; calling it after pause resumes recording
- (BOOL)recordAtTime:(NSTimeInterval)time; //start recording at the specified time
- (BOOL)recordForDuration:(NSTimeInterval)duration; //record for the specified duration
- (BOOL)recordAtTime:(NSTimeInterval)time
forDuration:(NSTimeInterval)duration; //a combination of the two above
- (void)pause; //pause recording
- (void)stop; //stop recording
- (BOOL)deleteRecording; //delete the recording; you must stop recording before deleting
Delegate methods of AVAudioRecorder:
//Called when recording finishes
- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)recorder
successfully:(BOOL)flag;
//Called when an encoding error occurs during recording
- (void)audioRecorderEncodeErrorDidOccur:(AVAudioRecorder *)recorder
error:(NSError *)error;
How to create an AVAudioRecorder
Creating an AVAudioRecorder object requires the following parameters:
- The local file URL the recorded audio stream is written to
- A settings dictionary of key/value pairs: sample rate, audio format, number of channels, and so on
- An NSError pointer in case an error occurs
For example:
/**
 Create the recorder
 */
- (void)createRecorder {
NSString *directory = NSTemporaryDirectory();
NSString *filePath = [directory stringByAppendingPathComponent:@"voice1.m4a"];
NSURL *url = [NSURL fileURLWithPath:filePath];
NSDictionary *setting = @{AVFormatIDKey : @(kAudioFormatMPEG4AAC),
AVSampleRateKey: @22050.0f,
AVNumberOfChannelsKey: @1};
NSError *error;
self.recorder = [[AVAudioRecorder alloc] initWithURL:url
settings:setting
error:&error];
if (self.recorder) {
[self.recorder prepareToRecord];
} else {
NSLog(@"Recorder Create Error: %@", [error localizedDescription]);
}
}
As when creating an AVAudioPlayer in the previous chapter, it is recommended to call [self.recorder prepareToRecord] here to prime the recorder. It performs the necessary initialization of the underlying Audio Queue and also creates a file at the location given by the URL parameter, which reduces the delay when recording actually starts.
Audio format
The AVFormatIDKey key specifies the recording format. Besides the kAudioFormatMPEG4AAC used above, the following formats are available:
CF_ENUM(AudioFormatID)
{
kAudioFormatLinearPCM = 'lpcm',
kAudioFormatAC3 = 'ac-3',
kAudioFormat60958AC3 = 'cac3',
kAudioFormatAppleIMA4 = 'ima4',
kAudioFormatMPEG4AAC = 'aac ',
kAudioFormatMPEG4CELP = 'celp',
kAudioFormatMPEG4HVXC = 'hvxc',
kAudioFormatMPEG4TwinVQ = 'twvq',
kAudioFormatMACE3 = 'MAC3',
kAudioFormatMACE6 = 'MAC6',
kAudioFormatULaw = 'ulaw',
kAudioFormatALaw = 'alaw',
kAudioFormatQDesign = 'QDMC',
kAudioFormatQDesign2 = 'QDM2',
kAudioFormatQUALCOMM = 'Qclp',
kAudioFormatMPEGLayer1 = '.mp1',
kAudioFormatMPEGLayer2 = '.mp2',
kAudioFormatMPEGLayer3 = '.mp3',
kAudioFormatTimeCode = 'time',
kAudioFormatMIDIStream = 'midi',
kAudioFormatParameterValueStream = 'apvs',
kAudioFormatAppleLossless = 'alac',
kAudioFormatMPEG4AAC_HE = 'aach',
kAudioFormatMPEG4AAC_LD = 'aacl',
kAudioFormatMPEG4AAC_ELD = 'aace',
kAudioFormatMPEG4AAC_ELD_SBR = 'aacf',
kAudioFormatMPEG4AAC_ELD_V2 = 'aacg',
kAudioFormatMPEG4AAC_HE_V2 = 'aacp',
kAudioFormatMPEG4AAC_Spatial = 'aacs',
kAudioFormatAMR = 'samr',
kAudioFormatAMR_WB = 'sawb',
kAudioFormatAudible = 'AUDB',
kAudioFormatiLBC = 'ilbc',
kAudioFormatDVIIntelIMA = 0x6D730011,
kAudioFormatMicrosoftGSM = 0x6D730031,
kAudioFormatAES3 = 'aes3',
kAudioFormatEnhancedAC3 = 'ec-3'
};
kAudioFormatLinearPCM writes the uncompressed audio stream to the file. This is the raw data: highest fidelity, but also the largest files. Choosing a format such as AAC (kAudioFormatMPEG4AAC), Apple IMA4 (kAudioFormatAppleIMA4), or Apple Lossless (kAudioFormatAppleLossless) shrinks the file significantly while still preserving good audio quality.
Note:
The audio format you specify must match the file type implied by the URL. Recording an xxx.wav file, for example, means the data must satisfy the Waveform Audio File Format (WAVE) requirements: little-endian, Linear PCM. If AVFormatIDKey is set to anything other than kAudioFormatLinearPCM, creation fails and NSError reports:
The operation couldn't be completed. (OSStatus error 1718449215.)
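To record into a .wav file, then, the settings dictionary has to describe little-endian Linear PCM. Below is a minimal sketch; the file name, sample rate, and bit depth are just example values:
//A minimal sketch: settings that match a .wav destination (little-endian Linear PCM)
NSString *wavPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"voice1.wav"];
NSURL *wavURL = [NSURL fileURLWithPath:wavPath];
NSDictionary *wavSettings = @{AVFormatIDKey : @(kAudioFormatLinearPCM),
                              AVSampleRateKey : @22050.0f,
                              AVNumberOfChannelsKey : @1,
                              AVLinearPCMBitDepthKey : @16,
                              AVLinearPCMIsBigEndianKey : @NO, //WAVE is little-endian
                              AVLinearPCMIsFloatKey : @NO};
NSError *wavError;
AVAudioRecorder *wavRecorder = [[AVAudioRecorder alloc] initWithURL:wavURL
                                                           settings:wavSettings
                                                              error:&wavError];
if (!wavRecorder) {
    NSLog(@"WAV Recorder Create Error: %@", [wavError localizedDescription]);
}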
Sample rate
In the code above, AVSampleRateKey defines the recorder's sample rate: the number of samples taken from the incoming analog audio signal per second. A low sample rate such as 8kHz produces a coarse, AM-radio-quality recording but a small file; a 44.1kHz sample rate (CD quality) gives very high quality content but a larger file. There is no single correct choice, but you should stick to standard rates: 8000Hz, 16000Hz (16kHz), 22050Hz (22.05kHz), or 44100Hz (44.1kHz), plus 48000Hz and 96000Hz. Sampling beyond 48kHz or 96kHz offers no audible benefit; in the end our ears are the judge. (As noted in the previous chapter, human hearing ranges from roughly 20Hz up to 20kHz, so recording at at least twice the highest frequency you care about is a good rule of thumb.)
Number of channels
AVNumberOfChannelsKey defines the number of channels in the recorded audio. The default value of 1 records mono; 2 records stereo. Unless you are recording through external hardware, you should normally record mono. The channel count refers to the number of inputs on the recording device, i.e. the built-in microphone or an external one such as the mic on a plugged-in pair of Apple earphones.
That covers the main concepts of AVAudioRecorder. It supports recordings of unlimited length, and it can also start recording at a future point in time or record for a fixed duration, as the sketch below shows.
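For instance, a scheduled, fixed-length recording could look like this minimal sketch. It assumes the self.recorder created in createRecorder above; the 5-second delay and 10-second duration are arbitrary example values:
//Start recording 5 seconds from now and stop automatically after 10 seconds
NSTimeInterval startTime = self.recorder.deviceCurrentTime + 5.0;
[self.recorder recordAtTime:startTime forDuration:10.0];
//audioRecorderDidFinishRecording:successfully: fires on the delegate when the duration elapses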
Handling network streams
AVAudioPlayer can only play local files, and it loads all of the audio data at once. What if we want to play audio while it is still downloading?
AVAudioPlayer does not support that kind of streaming playback. To play streamed audio we need Audio Queue Services from the AudioToolbox framework.
Audio Queue Services consist of three parts:
- 3 buffers
- 1 buffer queue
- 1 callback
(Diagrams of how the audio queue works for recording and for playback are omitted here.)
Of course we don't have to write those C functions ourselves; there is an open-source library for this, FreeStreamer.
Using FreeStreamer
#import <FreeStreamer/FreeStreamer.h>
- (void)viewDidLoad {
[super viewDidLoad];
[self initAudioStream];
//Play the streamed audio
[self.audioStream play];
}
/* Initialize the audio stream object */
- (void)initAudioStream{
NSString *urlStr = @"http://sc1.111ttt.com/2016/1/02/24/195242042236.mp3";
NSURL *url = [NSURL URLWithString:urlStr];
//Create the FSAudioStream object
self.audioStream = [[FSAudioStream alloc] initWithUrl:url];
//Set the playback-error callback block
self.audioStream.onFailure = ^(FSAudioStreamError error, NSString *description){
NSLog(@"An error occurred during playback: %@", description);
};
//Set the playback-completion callback block
self.audioStream.onCompletion = ^(){
NSLog(@"Playback finished!");
};
[self.audioStream setVolume:0.5]; //Set the volume
}
We've drifted a little off topic, so back to the main subject; this streaming part will not be included in this chapter's demo.
Now let's write an AVAudioRecorder demo that implements the features described above.
Configuring the session
First create an AVAudioRecorderDemo project for the iOS platform; you should be very familiar with that by now. In AppDelegate, add #import <AVFoundation/AVFoundation.h>
and write the following setup code:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
AVAudioSession *session = [AVAudioSession sharedInstance];
NSError *error;
if (![session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error]) {
NSLog(@"Category Error: %@",[error localizedDescription]);
}
//Activate the session
if (![session setActive:YES error:&error]) {
NSLog(@"Activation Error: %@",[error localizedDescription]);
}
return YES;
}
AVAudioSessionCategoryPlayAndRecord is one of the categories discussed in the previous chapter; we need both recording and playback.
Next, configure the microphone access entry in the plist. Open the plist as Source Code and add:
<!-- Microphone -->
<key>NSMicrophoneUsageDescription</key>
<string>$(PRODUCT_NAME) microphone use</string>
This entry is what authorizes microphone access. Remember that if the user denies the permission the first time, they must re-enable it manually in Settings before recording will ever work; and if the usage description is missing altogether, the app will crash when it tries to use the microphone.
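If you prefer to ask for the permission explicitly rather than letting the first recording trigger the system prompt, a minimal sketch using AVAudioSession's requestRecordPermission: looks like this:
AVAudioSession *session = [AVAudioSession sharedInstance];
[session requestRecordPermission:^(BOOL granted) {
    dispatch_async(dispatch_get_main_queue(), ^{
        if (granted) {
            //Safe to start recording
        } else {
            //Ask the user to enable the microphone in Settings > Privacy > Microphone
            NSLog(@"Microphone access denied");
        }
    });
}];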
Implementing the recorder
Let's wrap everything in a class and call it BDRecorder. This class is responsible for all of the recording work: recording, pausing, saving the recorded file, and so on, with completion blocks for callbacks. BDRecorder.h looks like the following; a possible later improvement would be a delegate for handling interruptions or route changes during recording.
//
// BDRecorder.h
// AVAudioRecorderDemo
//
// Created by sunyazhou on 2017/3/29.
// Copyright © 2017 Baidu, Inc. All rights reserved.
//
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>
@class MemoModel;
//Callback invoked when recording stops
typedef void (^BDRecordingStopCompletionHanlder)(BOOL);
//Callback invoked when saving the recording finishes
typedef void (^BDRecordingSaveCompletionHanlder)(BOOL, id);
@interface BDRecorder : NSObject
/**
 * The current recording time, for display by callers.
 * Formatted as hours:minutes:seconds, e.g. 00:03:02; milliseconds could be added later.
 */
@property (nonatomic, readonly) NSString *formattedCurrentTime;
- (BOOL)record; //start recording
- (void)pause; //pause recording
- (void)stopWithCompletionHandler:(BDRecordingStopCompletionHanlder)handler;
- (void)saveRecordingWithName:(NSString *)name
completionHandler:(BDRecordingSaveCompletionHanlder)handler;
/**
 Play back a recorded file

 @param memo The memo model describing the recording to play
 @return Whether playback started successfully
 */
- (BOOL)playbackURL:(MemoModel *)memo;
@end
BDRecorder.m
//
// BDRecorder.m
// AVAudioRecorderDemo
//
// Created by sunyazhou on 2017/3/29.
// Copyright © 2017 Baidu, Inc. All rights reserved.
//
#import "BDRecorder.h"
#import "MemoModel.h"
@interface BDRecorder () <AVAudioRecorderDelegate>
@property (nonatomic, strong) AVAudioPlayer *player;
@property (nonatomic, strong) AVAudioRecorder *recorder;
@property (nonatomic, strong) BDRecordingStopCompletionHanlder completionHandler;
@end
@implementation BDRecorder
- (instancetype)init {
self = [super init];
if (self) {
NSString *temDir = NSTemporaryDirectory();
NSString *filePath = [temDir stringByAppendingPathComponent:@"test1.caf"];
NSURL *fileURL = [NSURL fileURLWithPath:filePath];
NSDictionary *setting = @{AVFormatIDKey: @(kAudioFormatAppleIMA4),
AVSampleRateKey: @44100.0f,
AVNumberOfChannelsKey: @1,
AVEncoderBitDepthHintKey: @16,
AVEncoderAudioQualityKey: @(AVAudioQualityMedium)
};
NSError *error;
self.recorder = [[AVAudioRecorder alloc] initWithURL:fileURL settings:setting error:&error];
if (self.recorder) {
self.recorder.delegate = self;
[self.recorder prepareToRecord];
} else {
NSLog(@"Create Recorder Error: %@",[error localizedDescription]);
}
}
return self;
}
- (BOOL)record {
return [self.recorder record];
}
- (void)pause {
[self.recorder pause];
}
- (void)stopWithCompletionHandler:(BDRecordingStopCompletionHanlder)handler {
self.completionHandler = handler;
[self.recorder stop];
}
- (void)saveRecordingWithName:(NSString *)name
completionHandler:(BDRecordingSaveCompletionHanlder)handler {
NSTimeInterval timestamp = [NSDate timeIntervalSinceReferenceDate];
NSString *filename = [NSString stringWithFormat:@"%@-%f.caf", name, timestamp];
NSString *docDir = [self documentsDirectory];
NSString *destPath = [docDir stringByAppendingPathComponent:filename];
NSURL *srcURL = self.recorder.url;
NSURL *destURL = [NSURL fileURLWithPath:destPath];
NSError *error;
BOOL success = [[NSFileManager defaultManager] copyItemAtURL:srcURL toURL:destURL error:&error];
if (success) {
MemoModel *model = [MemoModel memoWithTitle:name url:destURL];
handler(YES, model);
}
}
- (NSString *)documentsDirectory {
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
return [paths objectAtIndex:0];
}
- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)recorder
successfully:(BOOL)flag {
if (self.completionHandler) { self.completionHandler(flag); }
}
@end
The self.completionHandler is stashed when stopWithCompletionHandler: is called so that, once recording has finished, we can notify the caller, which can then, for example, present an alert for saving the recording.
After recording stops and we move to the naming step of the voice memo, the caller invokes saveRecordingWithName:completionHandler: with the chosen name; we then take self.recorder.url, copy the file into the Documents directory, and give it that name.
Next we implement playbackURL:, which takes a MemoModel parameter.
MemoModel is a model object holding the file's name, URL, and related information:
#import <Foundation/Foundation.h>
@interface MemoModel : NSObject <NSCopying>
@property (copy, nonatomic, readonly) NSString *title;
@property (strong, nonatomic, readonly) NSURL *url;
@property (copy, nonatomic, readonly) NSString *dateString;
@property (copy, nonatomic, readonly) NSString *timeString;
+ (instancetype)memoWithTitle:(NSString *)title url:(NSURL *)url;
- (BOOL)deleteMemo;
@end
//A sketch of the implementation follows; see the final demo for the complete version
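For reference, here is a minimal sketch of what MemoModel.m could look like. The initWithTitle:url: initializer, the date formats, and the NSCoding support (needed because ViewController archives the memos with NSKeyedArchiver) are my assumptions and not necessarily identical to the demo:
#import "MemoModel.h"

@interface MemoModel () <NSCoding>
@end

@implementation MemoModel

+ (instancetype)memoWithTitle:(NSString *)title url:(NSURL *)url {
    return [[self alloc] initWithTitle:title url:url];
}

- (instancetype)initWithTitle:(NSString *)title url:(NSURL *)url {
    self = [super init];
    if (self) {
        _title = [title copy];
        _url = url;
        NSDate *now = [NSDate date];
        NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
        formatter.dateFormat = @"yyyy-MM-dd"; //assumed display format
        _dateString = [formatter stringFromDate:now];
        formatter.dateFormat = @"HH:mm:ss"; //assumed display format
        _timeString = [formatter stringFromDate:now];
    }
    return self;
}

- (BOOL)deleteMemo {
    //Remove the backing audio file from disk
    return [[NSFileManager defaultManager] removeItemAtURL:self.url error:nil];
}

#pragma mark - NSCoding (needed because ViewController archives memos with NSKeyedArchiver)

- (void)encodeWithCoder:(NSCoder *)coder {
    [coder encodeObject:self.title forKey:@"title"];
    [coder encodeObject:self.url forKey:@"url"];
    [coder encodeObject:self.dateString forKey:@"dateString"];
    [coder encodeObject:self.timeString forKey:@"timeString"];
}

- (instancetype)initWithCoder:(NSCoder *)coder {
    self = [super init];
    if (self) {
        _title = [coder decodeObjectForKey:@"title"];
        _url = [coder decodeObjectForKey:@"url"];
        _dateString = [coder decodeObjectForKey:@"dateString"];
        _timeString = [coder decodeObjectForKey:@"timeString"];
    }
    return self;
}

#pragma mark - NSCopying

- (id)copyWithZone:(NSZone *)zone {
    return [[MemoModel allocWithZone:zone] initWithTitle:self.title url:self.url];
}

@end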
To play recordings back we need a player; here we simply create an AVAudioPlayer:
/**
 Play back a recorded file

 @param memo The memo model describing the recording to play
 @return Whether playback started successfully
 */
- (BOOL)playbackURL:(MemoModel *)memo {
[self.player stop];
self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:memo.url error:nil];
if (self.player) {
[self.player prepareToPlay];
return [self.player play];
}
return NO;
}
Here we hand memo.url to the player and start playback. This is intentionally a bare-bones implementation; for a fuller one see the AVAudioPlayer coverage in the previous chapter.
Finally, let's add the time-display part:
/**
 * The current recording time, for display by callers.
 * Formatted as hours:minutes:seconds, e.g. 00:03:02; milliseconds could be added later.
 */
@property (nonatomic, readonly) NSString *formattedCurrentTime;
Here we implement the formattedCurrentTime getter to produce a time string such as 00:00:00:
/**
 Returns the current recording time formatted as HH:mm:ss

 @return The assembled string
 */
- (NSString *)formattedCurrentTime {
NSUInteger time = (NSUInteger)self.recorder.currentTime;
NSInteger hours = (time / 3600);
NSInteger minutes = (time / 60) % 60;
NSInteger seconds = time % 60;
NSString *format = @"%02i:%02i:%02i";
return [NSString stringWithFormat:format, hours, minutes, seconds];
}
That is roughly how BDRecorder is put together.
Next comes the UI work in ViewController. With the time format ready, we need our own timer in ViewController to refresh the recording time shown on screen, because self.recorder.currentTime is a read-only property with no setter, so we cannot use KVO to observe the recorder's changes.
The code is as follows:
//
// ViewController.m
// AVAudioRecorderDemo
//
// Created by sunyazhou on 2017/3/28.
// Copyright © 2017 Baidu, Inc. All rights reserved.
//
#import "ViewController.h"
#import <Masonry/Masonry.h>
#import <AVFoundation/AVFoundation.h>
#import "BDRecorder.h"
#import "LevelMeterView.h"
#import "MemoModel.h"
#import "MemoCell.h"
#import "LevelPair.h"
#define MEMOS_ARCHIVE @"memos.archive"
@interface ViewController () <UITableViewDelegate, UITableViewDataSource>
@property (nonatomic, strong) NSMutableArray <MemoModel *>*memos;
@property (nonatomic, strong) BDRecorder *recorder;
@property (nonatomic, strong) NSTimer *timer;
@property (nonatomic, strong) CADisplayLink *levelTimer;
@property (weak, nonatomic) IBOutlet UIView *containerView;
@property (weak, nonatomic) IBOutlet UIButton *recordButton;
@property (weak, nonatomic) IBOutlet UIButton *stopButton;
@property (weak, nonatomic) IBOutlet UILabel *timeLabel;
@property (weak, nonatomic) IBOutlet LevelMeterView *levelMeterView;
@property (weak, nonatomic) IBOutlet UITableView *tableview;
@end
@implementation ViewController
- (void)viewDidLoad {
[super viewDidLoad];
self.recorder = [[BDRecorder alloc] init];
self.memos = [NSMutableArray array];
self.stopButton.enabled = NO;
UIImage *recordImage = [[UIImage imageNamed:@"record"] imageWithRenderingMode:UIImageRenderingModeAlwaysOriginal];
UIImage *pauseImage = [[UIImage imageNamed:@"pause"] imageWithRenderingMode:UIImageRenderingModeAlwaysOriginal];
UIImage *stopImage = [[UIImage imageNamed:@"stop"] imageWithRenderingMode:UIImageRenderingModeAlwaysOriginal];
[self.recordButton setImage:recordImage forState:UIControlStateNormal];
[self.recordButton setImage:pauseImage forState:UIControlStateSelected];
[self.stopButton setImage:stopImage forState:UIControlStateNormal];
NSData *data = [NSData dataWithContentsOfURL:[self archiveURL]];
if (!data) {
_memos = [NSMutableArray array];
} else {
_memos = [NSKeyedUnarchiver unarchiveObjectWithData:data];
}
[self.tableview registerNib:[UINib nibWithNibName:@"MemoCell" bundle:[NSBundle mainBundle]] forCellReuseIdentifier:@"MemoCell"];
[self layoutSubviews];
}
- (void)layoutSubviews{
[self.containerView mas_makeConstraints:^(MASConstraintMaker *make) {
make.top.equalTo(self.view.mas_top).offset(30);
make.left.equalTo(self.view.mas_left).offset(20);
make.right.equalTo(self.view.mas_right).offset(-20);
make.centerX.equalTo(self.view.mas_centerX);
make.bottom.equalTo(self.tableview.mas_top).offset(-50);
}];
[self.tableview mas_makeConstraints:^(MASConstraintMaker *make) {
make.left.right.bottom.equalTo(self.view);
make.top.equalTo(self.view.mas_top).offset(200);
}];
[self.timeLabel mas_makeConstraints:^(MASConstraintMaker *make) {
make.top.left.right.equalTo(self.containerView);
make.centerX.equalTo(self.containerView.mas_centerX);
make.height.equalTo(@25);
}];
[self.recordButton mas_makeConstraints:^(MASConstraintMaker *make) {
make.left.equalTo(self.containerView.mas_left);
make.bottom.equalTo(self.containerView.mas_bottom);
make.width.height.equalTo(@71);
}];
[self.stopButton mas_makeConstraints:^(MASConstraintMaker *make) {
make.right.equalTo(self.containerView.mas_right);
make.bottom.equalTo(self.containerView.mas_bottom);
make.width.height.equalTo(@71);
}];
[self.levelMeterView mas_makeConstraints:^(MASConstraintMaker *make) {
make.left.right.equalTo(self.view);
make.height.equalTo(@30);
make.bottom.equalTo(self.tableview.mas_top);
}];
}
- (void)startTimer {
[self.timer invalidate];
self.timer = [NSTimer timerWithTimeInterval:0.5
target:self
selector:@selector(updateTimeDisplay)
userInfo:nil
repeats:YES];
[[NSRunLoop mainRunLoop] addTimer:self.timer forMode:NSRunLoopCommonModes];
}
- (void)stopTimer {
[self.timer invalidate];
self.timer = nil;
}
- (void)updateTimeDisplay {
self.timeLabel.text = self.recorder.formattedCurrentTime;
}
- (void)startMeterTimer {
[self.levelTimer invalidate];
self.levelTimer = [CADisplayLink displayLinkWithTarget:self selector:@selector(updateMeter)];
// if ([self.levelTimer respondsToSelector:@selector(setPreferredFramesPerSecond:)]) {
// self.levelTimer.preferredFramesPerSecond = 5;
// } else {
self.levelTimer.frameInterval = 5;
// }
[self.levelTimer addToRunLoop:[NSRunLoop currentRunLoop]
forMode:NSRunLoopCommonModes];
}
- (void)stopMeterTimer {
[self.levelTimer invalidate];
self.levelTimer = nil;
[self.levelMeterView resetLevelMeter];
}
- (void)updateMeter {
LevelPair *levels = [self.recorder levels];
self.levelMeterView.level = levels.level;
self.levelMeterView.peakLevel = levels.peakLevel;
[self.levelMeterView setNeedsDisplay];
}
#pragma mark -
#pragma mark - UITableViewDelegate
- (NSInteger)tableView:(UITableView *)tableView numberOfRowsInSection:(NSInteger)section {
return self.memos.count;
}
- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath {
MemoCell *cell = [tableView dequeueReusableCellWithIdentifier:@"MemoCell"];
cell.model = self.memos[indexPath.row];
return cell;
}
- (void)tableView:(UITableView *)tableView didSelectRowAtIndexPath:(NSIndexPath *)indexPath{
MemoModel *model = self.memos[indexPath.row];
[self.recorder playbackURL:model];
}
- (BOOL)tableView:(UITableView *)tableView canEditRowAtIndexPath:(NSIndexPath *)indexPath {
return YES;
}
- (void)tableView:(UITableView *)tableView commitEditingStyle:(UITableViewCellEditingStyle)editingStyle forRowAtIndexPath:(NSIndexPath *)indexPath {
if (editingStyle == UITableViewCellEditingStyleDelete) {
MemoModel *memo = self.memos[indexPath.row];
[memo deleteMemo];
[self.memos removeObjectAtIndex:indexPath.row];
[self saveMemos];
[tableView deleteRowsAtIndexPaths:@[indexPath] withRowAnimation:UITableViewRowAnimationAutomatic];
}
}
- (CGFloat)tableView:(UITableView *)tableView heightForRowAtIndexPath:(NSIndexPath *)indexPath {
return 80;
}
#pragma mark - Event responses: buttons, notifications, segmented controls, etc.
- (IBAction)record:(UIButton *)sender {
self.stopButton.enabled = YES;
if ([sender isSelected]) {
[self stopMeterTimer];
[self stopTimer];
[self.recorder pause];
} else {
[self startMeterTimer];
[self startTimer];
[self.recorder record];
}
[sender setSelected:![sender isSelected]];
}
- (IBAction)stopRecording:(UIButton *)sender {
[self stopMeterTimer];
self.recordButton.selected = NO;
self.stopButton.enabled = NO;
[self.recorder stopWithCompletionHandler:^(BOOL result) {
double delayInSeconds = 0.01;
dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, (int64_t) (delayInSeconds * NSEC_PER_SEC));
dispatch_after(popTime, dispatch_get_main_queue(), ^{
[self showSaveDialog];
});
}];
}
- (void)showSaveDialog {
UIAlertController *alertController = [UIAlertController alertControllerWithTitle:@"Save Recording" message:@"Enter a name" preferredStyle:UIAlertControllerStyleAlert];
[alertController addTextFieldWithConfigurationHandler:^(UITextField * _Nonnull textField) {
textField.placeholder = @"My Recording";
}];
UIAlertAction *cancelAction = [UIAlertAction actionWithTitle:@"Cancel" style:UIAlertActionStyleCancel handler:nil];
UIAlertAction *okAction = [UIAlertAction actionWithTitle:@"OK" style:UIAlertActionStyleDefault handler:^(UIAlertAction * _Nonnull action) {
NSString *filename = [alertController.textFields.firstObject text];
[self.recorder saveRecordingWithName:filename completionHandler:^(BOOL success, id object) {
if (success) {
[self.memos insertObject:object atIndex:0];
[self saveMemos];
[self.tableview reloadData];
} else {
NSLog(@"Error saving file: %@", [object localizedDescription]);
}
}];
}];
[alertController addAction:cancelAction];
[alertController addAction:okAction];
[self presentViewController:alertController animated:YES completion:nil];
}
#pragma mark - Memo Archiving
//Save the memo models; here we simply persist them with keyed archiving
- (void)saveMemos {
NSData *fileData = [NSKeyedArchiver archivedDataWithRootObject:self.memos];
[fileData writeToURL:[self archiveURL] atomically:YES];
}
//The URL where the archive is stored
- (NSURL *)archiveURL {
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *docsDir = [paths objectAtIndex:0];
NSString *archivePath = [docsDir stringByAppendingPathComponent:MEMOS_ARCHIVE];
return [NSURL fileURLWithPath:archivePath];
}
- (void)didReceiveMemoryWarning {
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
@end
The code is a bit long, so let me walk through it briefly; you can follow along with the final demo.
@property (nonatomic, strong) NSMutableArray <MemoModel *>*memos;
@property (nonatomic, strong) BDRecorder *recorder;
An array holds the memo model objects to play back: name, file URL, date, and so on.
@property (nonatomic, strong) NSTimer *timer;
@property (nonatomic, strong) CADisplayLink *levelTimer;
One timer refreshes the recording-time display.
levelTimer refreshes the level-meter view, i.e. it drives the audio metering.
The following method is added to BDRecorder (it relies on the recorder holding a MeterTable instance in a meterTable property):
- (LevelPair *)levels {
[self.recorder updateMeters];
float avgPower = [self.recorder averagePowerForChannel:0];
float peakPower = [self.recorder peakPowerForChannel:0];
float linearLevel = [self.meterTable valueForPower:avgPower];
float linearPeak = [self.meterTable valueForPower:peakPower];
return [LevelPair levelsWithLevel:linearLevel peakLevel:linearPeak];
}
It uses two AVAudioRecorder methods:
1. averagePowerForChannel: returns the channel's average power
2. peakPowerForChannel: returns the channel's peak power
Both return a floating-point value expressing the sound level in decibels (dB). The range runs from 0dB (full scale) down to -160dB: 0dB is the maximum, -160dB the minimum.
Audio metering has to be turned on in BDRecorder (see the line below). It adds a fair amount of overhead, but I think it is well worth it; the visual feedback is the whole point.
When meteringEnabled is on, the recorder computes dB levels for the audio samples it captures.
Enabling audio metering:
self.recorder.meteringEnabled = YES;
Before the values are read, the following is called:
- (LevelPair *)levels {
[self.recorder updateMeters];
...
}
You must call [self.recorder updateMeters] right before reading the values so that they are current; otherwise they may not be accurate.
Then the valueForPower: method declared by the MeterTable class converts those two dB values to a linear scale, i.e. from the logarithmic -160 to 0 range into a linear 0-to-1 range.
//
// MeterTable.m
// AVAudioRecorderDemo
//
// Created by sunyazhou on 2017/4/5.
// Copyright © 2017 Baidu, Inc. All rights reserved.
//
#import "MeterTable.h"
#define MIN_DB -60.0f
#define TABLE_SIZE 300
@implementation MeterTable {
float _scaleFactor;
NSMutableArray *_meterTable;
}
- (id)init {
self = [super init];
if (self) {
float dbResolution = MIN_DB / (TABLE_SIZE - 1);
_meterTable = [NSMutableArray arrayWithCapacity:TABLE_SIZE];
_scaleFactor = 1.0f / dbResolution;
float minAmp = dbToAmp(MIN_DB);
float ampRange = 1.0 - minAmp;
float invAmpRange = 1.0 / ampRange;
for (int i = 0; i < TABLE_SIZE; i++) {
float decibels = i * dbResolution;
float amp = dbToAmp(decibels);
float adjAmp = (amp - minAmp) * invAmpRange;
_meterTable[i] = @(adjAmp);
}
}
return self;
}
float dbToAmp(float dB) {
return powf(10.0f, 0.05f * dB);
}
- (float)valueForPower:(float)power {
if (power < MIN_DB) {
return 0.0f;
} else if (power >= 0.0f) {
return 1.0f;
} else {
int index = (int) (power * _scaleFactor);
return [_meterTable[index] floatValue];
}
}
@end
This class builds an array, _meterTable, that maps raw decibel values to pre-computed converted values at a fixed resolution, here roughly -0.2dB per step; the resolution can be tuned through the MIN_DB and TABLE_SIZE macros. Each decibel step is run through the dbToAmp() function to get an amplitude, which is normalized into the 0 (-60dB) to 1 (0dB) range and stored in the internal lookup table. Callers read values back out with valueForPower: whenever they need them.
The result is then wrapped in a LevelPair instance and returned. LevelPair is very simple: it just stores two values, level and peakLevel.
@interface LevelPair : NSObject
@property (nonatomic, readonly) float level;
@property (nonatomic, readonly) float peakLevel;
+ (instancetype)levelsWithLevel:(float)level peakLevel:(float)peakLevel;
- (instancetype)initWithLevel:(float)level peakLevel:(float)peakLevel;
@end
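Its implementation is just as small; a minimal sketch:
#import "LevelPair.h"

@implementation LevelPair

+ (instancetype)levelsWithLevel:(float)level peakLevel:(float)peakLevel {
    return [[self alloc] initWithLevel:level peakLevel:peakLevel];
}

- (instancetype)initWithLevel:(float)level peakLevel:(float)peakLevel {
    self = [super init];
    if (self) {
        _level = level;
        _peakLevel = peakLevel;
    }
    return self;
}

@end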
ViewController then updates the related UI:
- (void)startTimer {
[self.timer invalidate];
self.timer = [NSTimer timerWithTimeInterval:0.5
target:self
selector:@selector(updateTimeDisplay)
userInfo:nil
repeats:YES];
[[NSRunLoop mainRunLoop] addTimer:self.timer forMode:NSRunLoopCommonModes];
}
- (void)stopTimer {
[self.timer invalidate];
self.timer = nil;
}
- (void)updateTimeDisplay {
self.timeLabel.text = self.recorder.formattedCurrentTime;
}
- (void)startMeterTimer {
[self.levelTimer invalidate];
self.levelTimer = [CADisplayLink displayLinkWithTarget:self selector:@selector(updateMeter)];
// if ([self.levelTimer respondsToSelector:@selector(setPreferredFramesPerSecond:)]) {
// self.levelTimer.preferredFramesPerSecond = 5;
// } else {
self.levelTimer.frameInterval = 5;
// }
[self.levelTimer addToRunLoop:[NSRunLoop currentRunLoop]
forMode:NSRunLoopCommonModes];
}
- (void)stopMeterTimer {
[self.levelTimer invalidate];
self.levelTimer = nil;
[self.levelMeterView resetLevelMeter];
}
- (void)updateMeter {
LevelPair *levels = [self.recorder levels];
self.levelMeterView.level = levels.level;
self.levelMeterView.peakLevel = levels.peakLevel;
[self.levelMeterView setNeedsDisplay];
}
Those methods handle the two timers.
The event handlers:
#pragma mark - Event responses: buttons, notifications, segmented controls, etc.
- (IBAction)record:(UIButton *)sender {
self.stopButton.enabled = YES;
if ([sender isSelected]) {
[self stopMeterTimer];
[self stopTimer];
[self.recorder pause];
} else {
[self startMeterTimer];
[self startTimer];
[self.recorder record];
}
[sender setSelected:![sender isSelected]];
}
- (IBAction)stopRecording:(UIButton *)sender {
[self stopMeterTimer];
self.recordButton.selected = NO;
self.stopButton.enabled = NO;
[self.recorder stopWithCompletionHandler:^(BOOL result) {
double delayInSeconds = 0.01;
dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, (int64_t) (delayInSeconds * NSEC_PER_SEC));
dispatch_after(popTime, dispatch_get_main_queue(), ^{
[self showSaveDialog];
});
}];
}
The data here is persisted with keyed archiving.
BDRecorder does not handle unexpected interruptions such as plugging in an external microphone or an incoming call. If you need that, you can have BDRecorder observe the relevant audio session notifications, as sketched below. Since this is for learning purposes I'll keep it simple and stop here; if there is enough demand I can come back, fill it in, and open-source it.
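For reference, a minimal sketch of what that could look like inside BDRecorder; registerForSessionNotifications, handleInterruption:, and handleRouteChange: are hypothetical helper names, not part of the demo:
- (void)registerForSessionNotifications {
    NSNotificationCenter *center = [NSNotificationCenter defaultCenter];
    [center addObserver:self
               selector:@selector(handleInterruption:)
                   name:AVAudioSessionInterruptionNotification
                 object:[AVAudioSession sharedInstance]];
    [center addObserver:self
               selector:@selector(handleRouteChange:)
                   name:AVAudioSessionRouteChangeNotification
                 object:[AVAudioSession sharedInstance]];
}

- (void)handleInterruption:(NSNotification *)notification {
    AVAudioSessionInterruptionType type =
        [notification.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
    if (type == AVAudioSessionInterruptionTypeBegan) {
        [self pause]; //e.g. an incoming phone call
    }
}

- (void)handleRouteChange:(NSNotification *)notification {
    AVAudioSessionRouteChangeReason reason =
        [notification.userInfo[AVAudioSessionRouteChangeReasonKey] unsignedIntegerValue];
    if (reason == AVAudioSessionRouteChangeReasonOldDeviceUnavailable) {
        [self pause]; //e.g. a headset with a mic was unplugged
    }
}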
Many people agonize over how to draw a better-looking graph from the level data; here I use the level-meter view from the book author's demo to handle the drawing.
#import "LevelMeterView.h"
#import "LevelMeterColorThreshold.h"
@interface LevelMeterView ()
@property (nonatomic) NSUInteger ledCount;
@property (strong, nonatomic) UIColor *ledBackgroundColor;
@property (strong, nonatomic) UIColor *ledBorderColor;
@property (nonatomic, strong) NSArray *colorThresholds;
@end
@implementation LevelMeterView
- (id)initWithFrame:(CGRect)frame {
self = [super initWithFrame:frame];
if (self) {
[self setupView];
}
return self;
}
- (id)initWithCoder:(NSCoder *)coder {
self = [super initWithCoder:coder];
if (self) {
[self setupView];
}
return self;
}
- (void)setupView {
self.backgroundColor = [UIColor clearColor];
_ledCount = 20;
_ledBackgroundColor = [UIColor colorWithWhite:0.0f alpha:0.35f];
_ledBorderColor = [UIColor blackColor];
UIColor *greenColor = [UIColor colorWithRed:0.458 green:1.000 blue:0.396 alpha:1.000];
UIColor *yellowColor = [UIColor colorWithRed:1.000 green:0.930 blue:0.315 alpha:1.000];
UIColor *redColor = [UIColor colorWithRed:1.000 green:0.325 blue:0.329 alpha:1.000];
_colorThresholds = @[[LevelMeterColorThreshold colorThresholdWithMaxValue:0.5 color:greenColor name:@"green"],
[LevelMeterColorThreshold colorThresholdWithMaxValue:0.8 color:yellowColor name:@"yellow"],
[LevelMeterColorThreshold colorThresholdWithMaxValue:1.0 color:redColor name:@"red"]];
}
- (void)drawRect:(CGRect)rect {
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextTranslateCTM(context, 0, CGRectGetHeight(self.bounds));
CGContextRotateCTM(context, (CGFloat) -M_PI_2);
CGRect bounds = CGRectMake(0., 0., [self bounds].size.height, [self bounds].size.width);
CGFloat lightMinValue = 0.0f;
NSInteger peakLED = -1;
if (self.peakLevel > 0.0f) {
peakLED = self.peakLevel * self.ledCount;
if (peakLED >= self.ledCount) {
peakLED = self.ledCount - 1;
}
}
for (int ledIndex = 0; ledIndex < self.ledCount; ledIndex++) {
UIColor *ledColor = [self.colorThresholds[0] color];
CGFloat ledMaxValue = (CGFloat) (ledIndex + 1) / self.ledCount;
for (int colorIndex = 0; colorIndex < self.colorThresholds.count - 1; colorIndex++) {
LevelMeterColorThreshold *currThreshold = self.colorThresholds[colorIndex];
LevelMeterColorThreshold *nextThreshold = self.colorThresholds[colorIndex + 1];
if (currThreshold.maxValue <= ledMaxValue) {
ledColor = nextThreshold.color;
}
}
CGFloat height = CGRectGetHeight(bounds);
CGFloat width = CGRectGetWidth(bounds);
CGRect ledRect = CGRectMake(0.0f, height * ((CGFloat) ledIndex / self.ledCount), width, height * (1.0f / self.ledCount));
// Fill background color
CGContextSetFillColorWithColor(context, self.ledBackgroundColor.CGColor);
CGContextFillRect(context, ledRect);
// Draw Light
CGFloat lightIntensity;
if (ledIndex == peakLED) {
lightIntensity = 1.0f;
} else {
lightIntensity = clamp((self.level - lightMinValue) / (ledMaxValue - lightMinValue));
}
UIColor *fillColor = nil;
if (lightIntensity == 1.0f) {
fillColor = ledColor;
} else if (lightIntensity > 0.0f) {
CGColorRef color = CGColorCreateCopyWithAlpha([ledColor CGColor], lightIntensity);
fillColor = [UIColor colorWithCGColor:color];
CGColorRelease(color);
}
CGContextSetFillColorWithColor(context, fillColor.CGColor);
UIBezierPath *fillPath = [UIBezierPath bezierPathWithRoundedRect:ledRect cornerRadius:2.0f];
CGContextAddPath(context, fillPath.CGPath);
// Stroke border
CGContextSetStrokeColorWithColor(context, self.ledBorderColor.CGColor);
UIBezierPath *strokePath = [UIBezierPath bezierPathWithRoundedRect:CGRectInset(ledRect, 0.5, 0.5) cornerRadius:2.0f];
CGContextAddPath(context, strokePath.CGPath);
CGContextDrawPath(context, kCGPathFillStroke);
lightMinValue = ledMaxValue;
}
}
CGFloat clamp(CGFloat intensity) {
if (intensity < 0.0f) {
return 0.0f;
} else if (intensity >= 1.0) {
return 1.0f;
} else {
return intensity;
}
}
- (void)resetLevelMeter {
self.level = 0.0f;
self.peakLevel = 0.0f;
[self setNeedsDisplay];
}
@end
This view is driven by the level and peak values; there are plenty of third-party open-source views you can study as well. It all comes down to quantizing those values.
Summary
That wraps up a reasonably complete pass over AVAudioRecorder; I'm jotting down what I learn as I go.