Like YUV data, PCM data is a raw stream: it carries no information about its sample format, channel count, or sample rate. We can only play PCM data correctly if we know these parameters in advance. In AudioToolbox, the AudioStreamBasicDescription struct describes them. It is defined as follows:
struct AudioStreamBasicDescription
{
    Float64             mSampleRate;        // sample rate
    AudioFormatID       mFormatID;          // data format
    AudioFormatFlags    mFormatFlags;       // how the data is laid out
    UInt32              mBytesPerPacket;    // bytes per packet
    UInt32              mFramesPerPacket;   // frames per packet
    UInt32              mBytesPerFrame;     // bytes per frame
    UInt32              mChannelsPerFrame;  // channels per frame
    UInt32              mBitsPerChannel;    // bits per channel
    UInt32              mReserved;          // struct padding; must be 0
};
The PCM data I am working with is s16le (signed 16-bit little-endian), with 2 channels and a 44.1 kHz sample rate, so I fill in the AudioStreamBasicDescription like this:
int channels = 2;
AudioStreamBasicDescription audioStreamBasicDescription = {0}; // zero-initialize so mReserved is 0
audioStreamBasicDescription.mSampleRate = 44100;
audioStreamBasicDescription.mFormatID = kAudioFormatLinearPCM;
// signed integer samples, packed with no padding; little-endian is the default on iOS
audioStreamBasicDescription.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
audioStreamBasicDescription.mFramesPerPacket = 1;  // uncompressed PCM always has 1 frame per packet
audioStreamBasicDescription.mChannelsPerFrame = channels;
audioStreamBasicDescription.mBitsPerChannel = 16;
audioStreamBasicDescription.mBytesPerPacket = channels * 2; // 2 channels x 2 bytes each = 4 bytes
audioStreamBasicDescription.mBytesPerFrame = channels * 2;
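The byte-layout fields are all derived from the channel count and bit depth: mBytesPerFrame = mChannelsPerFrame × (mBitsPerChannel / 8), and for uncompressed PCM, mBytesPerPacket = mBytesPerFrame × mFramesPerPacket. For s16le stereo that is 2 × 2 = 4 bytes per frame and per packet. If you need other interleaved integer formats, a small helper keeps that arithmetic in one place. This is a minimal sketch; the MakeInterleavedPCMFormat name is mine, not part of AudioToolbox:

// Hypothetical helper: builds an ASBD for interleaved, packed,
// signed-integer little-endian PCM (e.g. s16le at 44.1 kHz stereo).
static AudioStreamBasicDescription MakeInterleavedPCMFormat(Float64 sampleRate,
                                                            UInt32 channels,
                                                            UInt32 bitsPerChannel) {
    AudioStreamBasicDescription asbd = {0};
    asbd.mSampleRate       = sampleRate;
    asbd.mFormatID         = kAudioFormatLinearPCM;
    asbd.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    asbd.mFramesPerPacket  = 1;                                // always 1 for uncompressed PCM
    asbd.mChannelsPerFrame = channels;
    asbd.mBitsPerChannel   = bitsPerChannel;
    asbd.mBytesPerFrame    = channels * (bitsPerChannel / 8);  // interleaved: all channels in one frame
    asbd.mBytesPerPacket   = asbd.mBytesPerFrame * asbd.mFramesPerPacket;
    return asbd;
}

MakeInterleavedPCMFormat(44100, 2, 16) reproduces the struct filled in above.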
With the audio format described, we can use AudioUnit to play the PCM data, step by step.
1围段、初始化AudioUnit
- (void)setupAudioUnit {
    // Describe the unit we want: RemoteIO, the I/O unit that talks to the audio hardware
    AudioComponentDescription audioDesc;
    audioDesc.componentType = kAudioUnitType_Output;
    audioDesc.componentSubType = kAudioUnitSubType_RemoteIO;
    audioDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
    audioDesc.componentFlags = 0;
    audioDesc.componentFlagsMask = 0;
    // Find the matching component and instantiate it
    AudioComponent outputComponent = AudioComponentFindNext(NULL, &audioDesc);
    AudioComponentInstanceNew(outputComponent, &_audioUnit);
}
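Both AudioComponentFindNext and AudioComponentInstanceNew can fail (the former returns NULL, the latter a non-zero OSStatus), and the snippets in this post drop the return values for brevity. A minimal checking pattern might look like the following; the CheckOSStatus helper name is my own, not a system API:

// Hypothetical helper: log any non-zero OSStatus with a label so
// failures in the AudioUnit setup chain are visible during development.
static BOOL CheckOSStatus(OSStatus status, NSString *operation) {
    if (status != noErr) {
        NSLog(@"%@ failed with OSStatus %d", operation, (int)status);
        return NO;
    }
    return YES;
}

// Usage:
// OSStatus status = AudioComponentInstanceNew(outputComponent, &_audioUnit);
// if (!CheckOSStatus(status, @"AudioComponentInstanceNew")) { return; }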
2. Set the AudioUnit properties
- (void)setupAudioUnitProperty:(AudioStreamBasicDescription)audioStreamBasicDescription {
    // Tell the output unit what format we will feed it on bus 0
    AudioUnitSetProperty(self.audioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &audioStreamBasicDescription, sizeof(audioStreamBasicDescription));
    // Register the render callback that supplies PCM data on demand
    AURenderCallbackStruct callbackStruct;
    callbackStruct.inputProc = AudioUnitPlayer_AURenderCallback;
    callbackStruct.inputProcRefCon = (__bridge void * _Nullable)(self);
    AudioUnitSetProperty(self.audioUnit, kAudioUnitProperty_SetRenderCallback, kAudioUnitScope_Global, 0, &callbackStruct, sizeof(callbackStruct));
}
In this step we hand the AudioStreamBasicDescription we built earlier to the AudioUnit, so it knows how to interpret and play the PCM data. We also register a callback, AudioUnitPlayer_AURenderCallback. During playback, whenever the audio device runs out of samples, it invokes this callback to ask us for more PCM data. In other words, this is a pull model: the device requests data from us, rather than us actively pushing data to it.
3. Define the AudioUnitPlayer_AURenderCallback function
static OSStatus AudioUnitPlayer_AURenderCallback(void *inRefCon,
                                                 AudioUnitRenderActionFlags *ioActionFlags,
                                                 const AudioTimeStamp *inTimeStamp,
                                                 UInt32 inBusNumber,
                                                 UInt32 inNumberFrames,
                                                 AudioBufferList *ioData)
{
    // Silence the buffer first so a short read does not leave stale samples
    memset(ioData->mBuffers[0].mData, 0, ioData->mBuffers[0].mDataByteSize);
    AudioUnitPlayer *player = (__bridge AudioUnitPlayer *)inRefCon;
    NSData *data = [player.fileHandle readDataOfLength:ioData->mBuffers[0].mDataByteSize];
    if (data.length < ioData->mBuffers[0].mDataByteSize) {
        // End of file: rewind the handle, then stop playback
        if (@available(iOS 13.0, *)) {
            [player.fileHandle seekToOffset:0 error:nil];
        } else {
            [player.fileHandle seekToFileOffset:0];
        }
        // Stopping the unit from inside its own render callback can deadlock,
        // so hop off the real-time audio thread first
        dispatch_async(dispatch_get_main_queue(), ^{
            [player stopPlayer];
        });
        return noErr;
    }
    memcpy(ioData->mBuffers[0].mData, data.bytes, data.length);
    return noErr;
}
As you can see, this callback is where we hand PCM data to the audio device. One caveat: the callback runs on a high-priority real-time audio thread, so reading straight from a file here keeps the example simple, but production code would normally serve the callback from a buffer that is filled elsewhere.
4. Start or stop playback
- (void)startPlayer {
    AudioOutputUnitStart(self.audioUnit);
}

- (void)stopPlayer {
    AudioOutputUnitStop(self.audioUnit);
}
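Before starting playback on a real device, you will usually also want to configure the app's AVAudioSession; otherwise output can be silent or routed unexpectedly. A minimal setup, assuming playback-only use, might look like this:

#import <AVFoundation/AVFoundation.h>

// Put the app's audio session into playback mode and activate it.
// Call this once before startPlayer; error handling is kept minimal here.
static void ActivatePlaybackSession(void) {
    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayback error:&error];
    [session setActive:YES error:&error];
    if (error) {
        NSLog(@"Failed to activate audio session: %@", error);
    }
}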
The complete code is as follows:
The .h file:
#import <Foundation/Foundation.h>
#import <AudioToolbox/AudioToolbox.h>

NS_ASSUME_NONNULL_BEGIN

@interface AudioUnitPlayer : NSObject

/// Creates a player for the raw PCM file at pcmPath, interpreted with the given format.
+ (instancetype)player:(AudioStreamBasicDescription)audioDesc pcmPath:(NSString *)pcmPath;
- (void)startPlayer;
- (void)stopPlayer;

@end

NS_ASSUME_NONNULL_END
The .m file:
#import "AudioUnitPlayer.h"
@interface AudioUnitPlayer ()
@property (nonatomic, assign) AudioUnit audioUnit;
@property (nonatomic, strong) NSFileHandle *fileHandle;
@end
@implementation AudioUnitPlayer
+ (instancetype)player:(AudioStreamBasicDescription)audioStreamBasicDescription pcmPath:(NSString *)pcmPath
{
AudioUnitPlayer *obj = [[AudioUnitPlayer alloc] init:audioStreamBasicDescription pcmPath:pcmPath];
return obj;
}
- (instancetype)init:(AudioStreamBasicDescription)audioStreamBasicDescription pcmPath:(NSString *)pcmPath
{
self = [super init];
if (self) {
[self setupAudioUnit];
[self setupAudioUnitProperty:audioStreamBasicDescription];
self.fileHandle = [NSFileHandle fileHandleForReadingAtPath:pcmPath];
}
return self;
}
- (void)setupAudioUnit {
AudioComponentDescription audioDesc;
audioDesc.componentType = kAudioUnitType_Output;
audioDesc.componentSubType = kAudioUnitSubType_RemoteIO;
audioDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
audioDesc.componentFlags = 0;
audioDesc.componentFlagsMask = 0;
AudioComponent inputComponent = AudioComponentFindNext(NULL, &audioDesc);
AudioComponentInstanceNew(inputComponent, &_audioUnit);
}
- (void)setupAudioUnitProperty:(AudioStreamBasicDescription)audioStreamBasicDescription {
AudioUnitSetProperty(self.audioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &audioStreamBasicDescription, sizeof(audioStreamBasicDescription));
AURenderCallbackStruct callbackStruct;
callbackStruct.inputProc = (AURenderCallback)AudioUnitPlayer_AURenderCallback;
callbackStruct.inputProcRefCon = (__bridge void * _Nullable)(self);
AudioUnitSetProperty(self.audioUnit, kAudioUnitProperty_SetRenderCallback, kAudioUnitScope_Global, 0, &callbackStruct, sizeof(callbackStruct));
}
static OSStatus AudioUnitPlayer_AURenderCallback(void *inRefCon,
AudioUnitRenderActionFlags *ioActionFlags,
const AudioTimeStamp *inTimeStamp,
UInt32 inBusNumber,
UInt32 inNumberFrames,
AudioBufferList *ioData)
{
memset(ioData->mBuffers[0].mData, 0, ioData->mBuffers[0].mDataByteSize);
AudioUnitPlayer *player = (__bridge AudioUnitPlayer *)inRefCon;
NSData *data = [player.fileHandle readDataOfLength:ioData->mBuffers[0].mDataByteSize];
if (data.length<ioData->mBuffers[0].mDataByteSize) {
if (@available(iOS 13.0, *)) {
[player.fileHandle seekToOffset:0 error:nil];
} else {
[player.fileHandle seekToFileOffset:0];
}
[player stopPlayer];
return noErr;
}
memcpy(ioData->mBuffers[0].mData, data.bytes, data.length);
return noErr;
}
- (void)startPlayer {
AudioOutputUnitStart(self.audioUnit);
}
- (void)stopPlayer {
AudioOutputUnitStop(self.audioUnit);
}
@end
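Putting it together, a caller might look like the sketch below. The file name test.pcm and the call site are my own assumptions; also note the player must be held in a strong reference (here a property) so it is not deallocated while playing.

// Build the format for s16le stereo 44.1 kHz PCM, as described above,
// then create the player and start pulling data from the file.
int channels = 2;
AudioStreamBasicDescription asbd = {0};
asbd.mSampleRate = 44100;
asbd.mFormatID = kAudioFormatLinearPCM;
asbd.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
asbd.mFramesPerPacket = 1;
asbd.mChannelsPerFrame = channels;
asbd.mBitsPerChannel = 16;
asbd.mBytesPerPacket = channels * 2;
asbd.mBytesPerFrame = channels * 2;

NSString *pcmPath = [[NSBundle mainBundle] pathForResource:@"test" ofType:@"pcm"]; // assumed file
self.player = [AudioUnitPlayer player:asbd pcmPath:pcmPath]; // self.player is a strong property
[self.player startPlayer];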