Version History
Version | Date |
---|---|
V1.0 | 2017.12.29 |
Preface
Core Audio interacts with audio streams, complex buffers, and audiovisual timestamps through its own specialized data types. The next few articles analyze this framework in detail; interested readers can also refer to the articles listed above.
1. Core Audio Framework In Depth (Part 1): Basic Overview
The Position of the Core Audio Framework
Core Audio is the foundation of digital audio processing on iOS and macOS. It is the set of software frameworks that applications use to process audio; every audio interface in iOS development is either provided by Core Audio directly or wraps an interface that it provides.
In a word: it is the basis of any audio processing framework on iOS or macOS.
This can be summed up in one diagram from the official documentation.
Let's walk through it together.
High-Level Services
These services sit closest to the top of the stack; a great deal of everyday audio development work can be completed entirely at this layer.
1. Audio Queue Services
It lives in the AudioToolbox framework.
It provides recording, playback, pausing, looping, and synchronization of audio, and it automatically employs the necessary codecs to handle compressed audio formats.
To play and record audio on iOS devices, Apple recommends the AVAudioPlayer and AVAudioRecorder classes in the AVFoundation framework. They are easy to use, but they do not support streaming. That means playback cannot begin until the entire audio file has finished loading, and when recording, the recorded data only becomes available after recording ends, which places a serious limitation on apps. To solve this problem we use Audio Queue Services to play and record audio. If you are interested, see my earlier articles on Audio Queue Services. Here I only show the schematic diagrams for recording and playback; the detailed principles and workflow are covered in those earlier articles.
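As a rough illustration (not the article's own code), the core of an Audio Queue playback setup is a format description, a refill callback, and a few primed buffers. This is a minimal sketch; the PCM format, buffer count, and buffer size chosen here are all assumptions, and real code would check every OSStatus:

```objectivec
#import <AudioToolbox/AudioToolbox.h>

// Hypothetical callback: refill a finished buffer and re-enqueue it.
static void HandleOutputBuffer(void *inUserData,
                               AudioQueueRef inAQ,
                               AudioQueueBufferRef inBuffer) {
    // A real player would copy the next chunk of audio into
    // inBuffer->mAudioData, set inBuffer->mAudioDataByteSize,
    // then hand the buffer back to the queue:
    AudioQueueEnqueueBuffer(inAQ, inBuffer, 0, NULL);
}

static void StartPlaybackQueue(void) {
    // Describe the format the queue will play (16-bit stereo PCM here).
    AudioStreamBasicDescription format = {0};
    format.mSampleRate       = 44100.0;
    format.mFormatID         = kAudioFormatLinearPCM;
    format.mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger |
                               kLinearPCMFormatFlagIsPacked;
    format.mChannelsPerFrame = 2;
    format.mBitsPerChannel   = 16;
    format.mBytesPerFrame    = 4;   // 2 channels * 2 bytes
    format.mFramesPerPacket  = 1;
    format.mBytesPerPacket   = 4;

    AudioQueueRef queue = NULL;
    // NULL run loop: the queue uses its own internal thread for callbacks.
    AudioQueueNewOutput(&format, HandleOutputBuffer, NULL,
                        NULL, NULL, 0, &queue);

    // Allocate and prime a few buffers, then start the queue.
    for (int i = 0; i < 3; i++) {
        AudioQueueBufferRef buffer = NULL;
        AudioQueueAllocateBuffer(queue, 0x10000, &buffer);
        HandleOutputBuffer(NULL, queue, buffer);
    }
    AudioQueueStart(queue, NULL);
}
```

The same three-part structure (format, callback, primed buffers) applies to recording with AudioQueueNewInput, except the callback consumes filled buffers instead of refilling them.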
2. AVAudioPlayer
It lives in the AVFoundation framework.
It is an Objective-C audio playback class provided for the iOS platform. It can play all audio formats supported by iOS, chiefly the following:
AAC
AMR (Adaptive Multi-Rate, a speech format)
ALAC (Apple Lossless Audio Codec)
iLBC (internet Low Bitrate Codec, another speech format)
IMA4 (IMA/ADPCM)
Linear PCM (uncompressed)
µ-law and a-law
MP3 (MPEG-1 Audio Layer 3)
It is implemented in pure Objective-C, and its main appeal is that it is simple to call. Let's take a quick look at its API.
#import <AVFoundation/AVBase.h>
#import <AVFoundation/AVAudioFormat.h>
#import <Foundation/Foundation.h>
#import <AVFAudio/AVAudioSettings.h>
#if (TARGET_OS_IPHONE && __has_include(<AVFoundation/AVAudioSession.h>))
#import <AVFAudio/AVAudioSession.h>
#endif // #if (TARGET_OS_IPHONE && __has_include(<AVFoundation/AVAudioSession.h>))
#import <Availability.h>
NS_ASSUME_NONNULL_BEGIN
@class NSData, NSURL, NSError;
#if (TARGET_OS_IPHONE && __has_include(<AVFoundation/AVAudioSession.h>))
@class AVAudioSessionChannelDescription;
#endif
@protocol AVAudioPlayerDelegate;
NS_CLASS_AVAILABLE(10_7, 2_2) __WATCHOS_AVAILABLE(3_0)
@interface AVAudioPlayer : NSObject {
@private
id _impl;
}
/* For all of these init calls, if a return value of nil is given you can check outError to see what the problem was.
If not nil, then the object is usable for playing
*/
/* all data must be in the form of an audio file understood by CoreAudio */
- (nullable instancetype)initWithContentsOfURL:(NSURL *)url error:(NSError **)outError;
- (nullable instancetype)initWithData:(NSData *)data error:(NSError **)outError;
/* The file type hint is a constant defined in AVMediaFormat.h whose value is a UTI for a file format. e.g. AVFileTypeAIFF. */
/* Sometimes the type of a file cannot be determined from the data, or it is actually corrupt. The file type hint tells the parser what kind of data to look for so that files which are not self identifying or possibly even corrupt can be successfully parsed. */
- (nullable instancetype)initWithContentsOfURL:(NSURL *)url fileTypeHint:(NSString * __nullable)utiString error:(NSError **)outError NS_AVAILABLE(10_9, 7_0);
- (nullable instancetype)initWithData:(NSData *)data fileTypeHint:(NSString * __nullable)utiString error:(NSError **)outError NS_AVAILABLE(10_9, 7_0);
/* transport control */
/* methods that return BOOL return YES on success and NO on failure. */
- (BOOL)prepareToPlay; /* get ready to play the sound. happens automatically on play. */
- (BOOL)play; /* sound is played asynchronously. */
- (BOOL)playAtTime:(NSTimeInterval)time NS_AVAILABLE(10_7, 4_0); /* play a sound some time in the future. time is an absolute time based on and greater than deviceCurrentTime. */
- (void)pause; /* pauses playback, but remains ready to play. */
- (void)stop; /* stops playback. no longer ready to play. */
/* properties */
@property(readonly, getter=isPlaying) BOOL playing; /* is it playing or not? */
@property(readonly) NSUInteger numberOfChannels;
@property(readonly) NSTimeInterval duration; /* the duration of the sound. */
#if !TARGET_OS_IPHONE
/* the UID of the current audio device (as a string) */
@property(copy, nullable) NSString *currentDevice API_AVAILABLE(macos(10.13));
#endif
/* the delegate will be sent messages from the AVAudioPlayerDelegate protocol */
@property(assign, nullable) id<AVAudioPlayerDelegate> delegate;
/* one of these properties will be non-nil based on the init... method used */
@property(readonly, nullable) NSURL *url; /* returns nil if object was not created with a URL */
@property(readonly, nullable) NSData *data; /* returns nil if object was not created with a data object */
@property float pan NS_AVAILABLE(10_7, 4_0); /* set panning. -1.0 is left, 0.0 is center, 1.0 is right. */
@property float volume; /* The volume for the sound. The nominal range is from 0.0 to 1.0. */
- (void)setVolume:(float)volume fadeDuration:(NSTimeInterval)duration API_AVAILABLE(macos(10.12), ios(10.0), watchos(3.0), tvos(10.0)); /* fade to a new volume over a duration */
@property BOOL enableRate NS_AVAILABLE(10_8, 5_0); /* You must set enableRate to YES for the rate property to take effect. You must set this before calling prepareToPlay. */
@property float rate NS_AVAILABLE(10_8, 5_0); /* See enableRate. The playback rate for the sound. 1.0 is normal, 0.5 is half speed, 2.0 is double speed. */
/* If the sound is playing, currentTime is the offset into the sound of the current playback position.
If the sound is not playing, currentTime is the offset into the sound where playing would start. */
@property NSTimeInterval currentTime;
/* returns the current time associated with the output device */
@property(readonly) NSTimeInterval deviceCurrentTime NS_AVAILABLE(10_7, 4_0);
/* "numberOfLoops" is the number of times that the sound will return to the beginning upon reaching the end.
A value of zero means to play the sound just once.
A value of one will result in playing the sound twice, and so on..
Any negative number will loop indefinitely until stopped.
*/
@property NSInteger numberOfLoops;
/* settings */
@property(readonly) NSDictionary<NSString *, id> *settings NS_AVAILABLE(10_7, 4_0); /* returns a settings dictionary with keys as described in AVAudioSettings.h */
/* returns the format of the audio data */
@property(readonly) AVAudioFormat *format API_AVAILABLE(macos(10.12), ios(10.0), watchos(3.0), tvos(10.0));
/* metering */
@property(getter=isMeteringEnabled) BOOL meteringEnabled; /* turns level metering on or off. default is off. */
- (void)updateMeters; /* call to refresh meter values */
- (float)peakPowerForChannel:(NSUInteger)channelNumber; /* returns peak power in decibels for a given channel */
- (float)averagePowerForChannel:(NSUInteger)channelNumber; /* returns average power in decibels for a given channel */
#if (TARGET_OS_IPHONE && __has_include(<AVFoundation/AVAudioSession.h>))
/* The channels property lets you assign the output to play to specific channels as described by AVAudioSession's channels property */
/* This property is nil valued until set. */
/* The array must have the same number of channels as returned by the numberOfChannels property. */
@property(nonatomic, copy, nullable) NSArray<AVAudioSessionChannelDescription *> *channelAssignments NS_AVAILABLE(10_9, 7_0); /* Array of AVAudioSessionChannelDescription objects */
#endif
@end
/* A protocol for delegates of AVAudioPlayer */
__WATCHOS_AVAILABLE(3_0)
@protocol AVAudioPlayerDelegate <NSObject>
@optional
/* audioPlayerDidFinishPlaying:successfully: is called when a sound has finished playing. This method is NOT called if the player is stopped due to an interruption. */
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag;
/* if an error occurs while decoding it will be reported to the delegate. */
- (void)audioPlayerDecodeErrorDidOccur:(AVAudioPlayer *)player error:(NSError * __nullable)error;
#if TARGET_OS_IPHONE
/* AVAudioPlayer INTERRUPTION NOTIFICATIONS ARE DEPRECATED - Use AVAudioSession instead. */
/* audioPlayerBeginInterruption: is called when the audio session has been interrupted while the player was playing. The player will have been paused. */
- (void)audioPlayerBeginInterruption:(AVAudioPlayer *)player NS_DEPRECATED_IOS(2_2, 8_0);
/* audioPlayerEndInterruption:withOptions: is called when the audio session interruption has ended and this player had been interrupted while playing. */
/* Currently the only flag is AVAudioSessionInterruptionFlags_ShouldResume. */
- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player withOptions:(NSUInteger)flags NS_DEPRECATED_IOS(6_0, 8_0);
- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player withFlags:(NSUInteger)flags NS_DEPRECATED_IOS(4_0, 6_0);
/* audioPlayerEndInterruption: is called when the preferred method, audioPlayerEndInterruption:withFlags:, is not implemented. */
- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player NS_DEPRECATED_IOS(2_2, 6_0);
#endif // TARGET_OS_IPHONE
@end
NS_ASSUME_NONNULL_END
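As a quick illustration of how simple the API above is to call, here is a minimal, hypothetical playback setup; the resource name `sound.mp3` is a placeholder:

```objectivec
#import <AVFoundation/AVFoundation.h>

// Minimal sketch: create a player from a bundled file and start playback.
static AVAudioPlayer *MakePlayer(void) {
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"sound"
                                         withExtension:@"mp3"];
    NSError *error = nil;
    AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:url
                                                                   error:&error];
    if (player == nil) {
        NSLog(@"Failed to create player: %@", error);
        return nil;
    }
    player.volume = 1.0;        // nominal range 0.0 to 1.0
    player.numberOfLoops = 0;   // play once; -1 would loop indefinitely
    [player prepareToPlay];     // optional; play calls it automatically
    [player play];              // playback is asynchronous
    return player;
}
```

Note that you must keep a strong reference to the returned player; if it is deallocated, playback stops immediately.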
3. Extended Audio File Services
A combination of Audio File and Audio Converter, it provides the ability to read and write both compressed and uncompressed audio files.
It lives in the AudioToolbox framework alongside Audio File Services, Audio File Stream Services, and Audio Queue Services. Compared with Audio File Services and Audio Converter Services, the ExtendedAudioFile API is much simpler and clearer to call, and there is no need to deal with AudioStreamPacketDescription, so the logic of real-world code stays much cleaner.
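A minimal sketch of that simplicity: open a file and read it back as linear PCM, with decoding applied transparently and no packet descriptions to manage. The client format chosen here is an assumption, and error handling is reduced to a single check:

```objectivec
#import <AudioToolbox/AudioToolbox.h>

// Sketch: open an audio file and read it as linear PCM, letting
// Extended Audio File Services do any decoding internally.
static void ReadAsPCM(CFURLRef fileURL) {
    ExtAudioFileRef file = NULL;
    if (ExtAudioFileOpenURL(fileURL, &file) != noErr) return;

    // Ask for 16-bit stereo PCM regardless of the file's own format;
    // the converter is applied transparently on every read.
    AudioStreamBasicDescription clientFormat = {0};
    clientFormat.mSampleRate       = 44100.0;
    clientFormat.mFormatID         = kAudioFormatLinearPCM;
    clientFormat.mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger |
                                     kLinearPCMFormatFlagIsPacked;
    clientFormat.mChannelsPerFrame = 2;
    clientFormat.mBitsPerChannel   = 16;
    clientFormat.mBytesPerFrame    = 4;
    clientFormat.mFramesPerPacket  = 1;
    clientFormat.mBytesPerPacket   = 4;
    ExtAudioFileSetProperty(file, kExtAudioFileProperty_ClientDataFormat,
                            sizeof(clientFormat), &clientFormat);

    // Read one chunk of converted PCM frames; note there is no
    // AudioStreamPacketDescription anywhere in this flow.
    SInt16 samples[4096];
    AudioBufferList bufferList;
    bufferList.mNumberBuffers = 1;
    bufferList.mBuffers[0].mNumberChannels = 2;
    bufferList.mBuffers[0].mDataByteSize   = sizeof(samples);
    bufferList.mBuffers[0].mData           = samples;
    UInt32 frames = sizeof(samples) / clientFormat.mBytesPerFrame;
    ExtAudioFileRead(file, &frames, &bufferList);

    ExtAudioFileDispose(file);
}
```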
4. OpenAL
It lives in the OpenAL framework.
It is Core Audio's implementation of the OpenAL standard, and it can render positional (3D) audio mixing effects.
OpenAL's main functionality is encoded in source objects, audio buffers, and a listener. A source object holds a pointer to a buffer, plus the velocity, position, direction, and intensity of its sound. The listener object holds the listener's velocity, position, and direction, along with an overall gain applied to all sound. Buffers hold audio data in 8- or 16-bit, mono or stereo PCM format, and the rendering engine performs all the necessary calculations, such as distance attenuation and the Doppler effect.
Unlike the OpenGL specification, the OpenAL specification comprises two API branches: the core, made up of the actual OpenAL functions, and the ALC API, which manages rendering contexts and resource usage and encapsulates platform differences. There is also the "ALUT" library, which provides high-level "easy-to-use" functions; its role corresponds to that of OpenGL's GLUT.
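The source/buffer/listener model above can be sketched in C roughly as follows; `pcmData`, the sample rate, and the position values are placeholder assumptions:

```c
#include <OpenAL/al.h>
#include <OpenAL/alc.h>

// Sketch of the buffer / source / listener model.
// pcmData / pcmBytes stand in for real 16-bit mono PCM samples.
static void PlayPositional(const void *pcmData, ALsizei pcmBytes) {
    // Open the default device and create a rendering context (ALC API).
    ALCdevice *device = alcOpenDevice(NULL);
    ALCcontext *context = alcCreateContext(device, NULL);
    alcMakeContextCurrent(context);

    // Buffer: holds the PCM audio data.
    ALuint buffer;
    alGenBuffers(1, &buffer);
    alBufferData(buffer, AL_FORMAT_MONO16, pcmData, pcmBytes, 44100);

    // Source: the buffer it plays from plus its position in space.
    ALuint source;
    alGenSources(1, &source);
    alSourcei(source, AL_BUFFER, buffer);
    alSource3f(source, AL_POSITION, 2.0f, 0.0f, -1.0f);

    // Listener: where the "ears" are; the engine computes distance
    // attenuation, Doppler shift, etc. from source and listener state.
    alListener3f(AL_POSITION, 0.0f, 0.0f, 0.0f);

    alSourcePlay(source);
}
```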
Mid-Level Services
This layer is fairly comprehensive, covering audio data format conversion, audio file reading and writing, audio stream parsing, plug-in support, and more.
1. Audio Converter Services
It lives in the AudioToolbox framework.
Responsible for converting between audio data formats.
2. Audio File Services
It lives in the AudioToolbox framework.
Responsible for reading and writing audio data.
3. Audio Unit Services and Audio Processing Graph Services
They live in the AudioToolbox framework.
They support digital signal processing plug-ins such as equalizers and mixers.
4. Audio File Stream Services
It lives in the AudioToolbox framework.
Responsible for parsing audio streams.
5. Core Audio Clock Services
It lives in the Core Audio framework.
Responsible for audio clock synchronization.
Low-Level Services
This layer is mainly used in Mac audio apps that need the highest possible real-time performance; most audio apps never need its services. Moreover, iOS provides higher-level APIs with strong real-time performance to meet those needs. OpenAL, for example, offers real-time audio processing in games with direct access to I/O.
1. I/O Kit
It lives in the IOKit framework and interacts with hardware drivers.
It gives user space access to hardware devices and drivers. The I/O Kit framework implements non-kernel access to I/O Kit objects (drivers and nubs) through the device-interface mechanism.
2. Audio HAL
The audio hardware abstraction layer, which decouples API calls from the actual hardware and keeps them independent of it.
3. Core MIDI
It lives in the Core MIDI framework and communicates with MIDI devices such as hardware keyboards and synthesizers.
The Core MIDI framework provides APIs for communicating with MIDI (Musical Instrument Digital Interface) devices, including hardware keyboards and synthesizers. Connections from an iOS device are made through the dock connector or over a network. For more information on using the dock connector, see Apple's MFi program.
4. Host Time Services
Provides access to the computer's hardware clock.
Choosing a Framework for Different Scenarios
1. Scenario One
If you only need to play audio, with no other requirements, AVAudioPlayer is sufficient. Its interface is simple and you need not worry about the details: typically you just give it the URL of a playback source, call its play, pause, and stop methods for control, and observe its playback state to update the UI.
2. Scenario Two
If the app needs to stream audio, you need AudioFileStreamer plus Audio Queue: read the network or local stream into memory and submit it to AudioFileStreamer, which parses it into separate audio frames; the separated frames can then be handed to AudioQueue for decoding and playback, as outlined below.
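A rough sketch of that wiring: AudioFileStreamOpen registers two callbacks, one for parsed stream properties and one for separated packets. The callback bodies here are stubs; a real player would create an AudioQueue once the data format is known and enqueue packets as they arrive:

```objectivec
#import <AudioToolbox/AudioToolbox.h>

// Called when the parser has determined a stream property,
// e.g. kAudioFileStreamProperty_DataFormat.
static void OnProperty(void *clientData, AudioFileStreamID stream,
                       AudioFileStreamPropertyID propertyID,
                       AudioFileStreamPropertyFlags *flags) {
    // Typically: when ReadyToProducePackets fires, create the AudioQueue
    // using the parsed AudioStreamBasicDescription.
}

// Called whenever complete audio packets have been separated out;
// these are what get enqueued on an AudioQueue for decoding/playback.
static void OnPackets(void *clientData, UInt32 numBytes, UInt32 numPackets,
                      const void *inputData,
                      AudioStreamPacketDescription *packetDescriptions) {
    // Copy the packets into an AudioQueue buffer and enqueue it.
}

static AudioFileStreamID OpenParser(void) {
    AudioFileStreamID stream = NULL;
    // 0 = no file type hint; pass e.g. kAudioFileMP3Type when known.
    AudioFileStreamOpen(NULL, OnProperty, OnPackets, 0, &stream);
    return stream;
}

// As each chunk of data arrives from the network, feed it to the parser:
//   AudioFileStreamParseBytes(stream, length, bytes, 0);
```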
3. Scenario Three
If the app needs to apply audio effects (equalizer, reverb), then besides reading and parsing the data you also need AudioConverter or a codec to convert the audio data to PCM, and then AudioUnit plus AUGraph for effect processing and playback, as outlined below.
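A minimal sketch of the AudioUnit + AUGraph side of that pipeline, assuming an Apple N-band EQ effect feeding the output (RemoteIO) unit; error handling and the upstream PCM supply are omitted:

```objectivec
#import <AudioToolbox/AudioToolbox.h>

// Sketch: an AUGraph routing audio through an EQ effect to the speaker.
static void BuildEffectGraph(void) {
    AUGraph graph = NULL;
    NewAUGraph(&graph);

    // Output node (RemoteIO on iOS).
    AudioComponentDescription ioDesc = {0};
    ioDesc.componentType         = kAudioUnitType_Output;
    ioDesc.componentSubType      = kAudioUnitSubType_RemoteIO;
    ioDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
    AUNode ioNode;
    AUGraphAddNode(graph, &ioDesc, &ioNode);

    // Effect node: an N-band EQ unit.
    AudioComponentDescription eqDesc = {0};
    eqDesc.componentType         = kAudioUnitType_Effect;
    eqDesc.componentSubType      = kAudioUnitSubType_NBandEQ;
    eqDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
    AUNode eqNode;
    AUGraphAddNode(graph, &eqDesc, &eqNode);

    // Wire EQ output -> IO input, then open, initialize, and start.
    AUGraphOpen(graph);
    AUGraphConnectNodeInput(graph, eqNode, 0, ioNode, 0);
    AUGraphInitialize(graph);
    AUGraphStart(graph);
    // PCM data (from AudioConverter or a codec) is supplied upstream,
    // e.g. via a render callback installed on the EQ node's input.
}
```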
References
Afterword
Unfinished, to be continued~~~