I recently had a requirement come up in a project. The company has a device that captures audio; the audio is streamed over, decoded, and then played. The requirement was to show a waveform animation driven by the sound during playback. Most of the resources online use AVAudioRecorder's averagePowerForChannel method to read the level picked up by the microphone and animate from that. With a raw PCM stream, though, you have to extract the volume information yourself before you can drive an animation with it. After digging through some references I worked out how to parse volume information out of PCM data and render it; the result looks decent, so I'm posting it here for reference.
First, some basics. The sampling rate is the number of sound samples taken per second. A higher sampling rate captures more sound information and sounds better, but it also needs more storage, so higher is not automatically better; choose it based on actual needs.
The sample bit depth is the size of each sample value, usually 8 or 16 bits, giving value ranges of 2^8 and 2^16 respectively. The larger the range, the finer the resolution and the greater the dynamic range it can represent; likewise, a larger bit depth needs more storage.
The channel count is either mono or stereo (two channels).
A few more formulas that take a little working through; the links at the end give the detailed derivations:
dB = 10 × log10(data²) = 20 × log10(|data|), where data is a sample read from the PCM stream at the configured bit depth (one byte for 8-bit, two bytes for 16-bit)
Y = A × sin(2π × X + Phase), where X is the x coordinate and Phase is the phase offset
Y = (cos(π + 2π × X) + 1) / 2, where X is a value between 0 and 1 (the envelope that pinches the wave to zero at both edges of the view)
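A quick numeric check of these formulas (a minimal C sketch; the sample value 1000 is my own arbitrary pick):

#include <math.h>
#include <stdio.h>
static void checkFormulas(void) {
    // dB of a single 16-bit sample, e.g. data = 1000:
    double dB = 10 * log10(1000.0 * 1000.0);           // = 20*log10(1000) = 60 dB
    // The envelope Y = (cos(pi + 2*pi*X) + 1) / 2 pinches the wave
    // shut at both edges of the view and lets it peak mid-view:
    double y0 = (cos(M_PI + 2 * M_PI * 0.0) + 1) / 2;  // 0.0 at the left edge
    double y1 = (cos(M_PI + 2 * M_PI * 0.5) + 1) / 2;  // 1.0 at the midpoint
    double y2 = (cos(M_PI + 2 * M_PI * 1.0) + 1) / 2;  // 0.0 at the right edge
    printf("%.1f dB, envelope: %.1f %.1f %.1f\n", dB, y0, y1, y2);
}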
Now for the specifics of my case. The decoded PCM stream is 16-bit, 16 kHz, mono, and each second of decoded PCM arrives as 5 packets, so each packet is (16 × 16000 × 1 / 8) / 5 = 6400 bytes. For rendering we use CADisplayLink, which fires at the screen refresh rate of 60 Hz, which means the screen should advance to a new value every 1/60 s. Each second of data therefore has to be split into 60 volume values: 12 values per 6400-byte packet, or one average volume roughly every 1600/3 ≈ 533 bytes. That alone is enough to draw a simple waveform animation on screen.
One caveat: refreshing with a brand-new value every 1/60 s keeps the waveform close to the real audio, but it makes the animation jitter. Adjacent volume values can differ a lot, and a large jump between consecutive waveforms reads as a visual snap. The fix is interpolation. First bring down the number of volumes actually measured from the PCM: instead of 12 values per 6400-byte packet, take 4, i.e. one every 6400 / 4 = 1600 bytes; then insert 2 interpolated values between each pair of measured ones. I used simple linear interpolation, which makes the jitter barely noticeable or even invisible. If jitter is still obvious, continue in the same direction: measure fewer values and interpolate more, as in the sketch below.
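Before the project code, here is a minimal self-contained sketch of that bookkeeping (the function names and the linear-interpolation helper are mine, not from the project):

#import <Foundation/Foundation.h>
// Bytes in one packet: (bitDepth / 8) * sampleRate * channels / packetsPerSecond.
// For 16-bit, 16 kHz mono in 5 packets: (16/8 * 16000 * 1) / 5 = 6400 bytes.
static NSUInteger MCPacketSize(NSUInteger bitDepth, NSUInteger sampleRate,
                               NSUInteger channels, NSUInteger packetsPerSecond) {
    return (bitDepth / 8) * sampleRate * channels / packetsPerSecond;
}
// Emit v0 plus `count` linearly interpolated values toward v1
// (v1 itself is emitted as the start of the next segment).
static NSArray<NSNumber *> *MCLerpSegment(double v0, double v1, NSUInteger count) {
    NSMutableArray<NSNumber *> *values = [NSMutableArray array];
    for (NSUInteger i = 0; i <= count; i++) {
        double t = (double)i / (double)(count + 1);
        [values addObject:@(v0 + (v1 - v0) * t)];
    }
    return values;
}

With 4 measured volumes per packet and count = 2, each measured value expands into 3 outputs, which restores the 12 values per packet that a 60 Hz refresh needs.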
/* After the audio decodes successfully, call updateVolume on the main thread to extract the information the waveform animation needs from the PCM data. */
-(void)OnDecodeAudio:(unsigned char*)data Length:(int)length
{
if (![_device SupportFunction:FUNCTION_VIDEO]) {
_isFramePreparedOK = YES;
}
if (_progressView.isProgressing) {
return;
}
// output
if (_audioPlay) {
//
if (_isSpeaking || _isSilence) {
_audioPlay->Silence(true);
}
else{
_audioPlay->Silence(false);
}
//
if (![_device SupportFunction:FUNCTION_VIDEO]) {
/* data is the PCM buffer, length bytes long; copy it into an NSData for easier handling */
_audioData = [NSData dataWithBytes:data length:length];
NSData* copyData = [_audioData copy];
[self performSelectorOnMainThread:@selector(updateVolume:) withObject:copyData waitUntilDone:NO];
}
//NSLog(@"OnDecodeAudio length(%d)",length);
if (!_audioPlay->Show((char*)data, length)) {
HHAudioPresent_Destroy(_audioPlay);
_audioPlay = NULL;
}
}
}
-(void)updateVolume:(NSData*)volumeData
{
if (![_device SupportFunction:FUNCTION_VIDEO]) {
/* Extract normalized amplitude values from the PCM data */
NSArray* ampValueArray = [self pcmToAverageAmplitude:volumeData];
/* Push into the waveform's volume queue; changeVolume: interpolates before enqueuing */
for (NSInteger i = 0; i < ampValueArray.count; i++) {
[_voiceWaveView changeVolume:[ampValueArray[i] floatValue]];
}
}
}
/* Process the PCM data into normalized amplitude values.
 * @param volumeData the PCM data
 */
-(NSArray*)pcmToAverageAmplitude:(NSData*)volumeData
{
NSMutableArray* array = [NSMutableArray array];
short bufferBytes[volumeData.length/2];
memcpy(bufferBytes, volumeData.bytes, volumeData.length);
NSInteger packets = 2; // number of volume values produced per call
// Pull the samples out of the buffer and accumulate the sum of squares
for (int i = 0; i < packets; i++)
{
long long pcmSum = 0;
NSUInteger size = volumeData.length/packets/2;
for (int j = 0; j < size; j++) {
pcmSum += bufferBytes[size*i+j]*bufferBytes[size*i+j];
}
double mean = (double)pcmSum / size; // mean square of the samples
double volume = 10*log10(mean); // average power in dB
double maxVolume = 20*log10(pow(2, 16)-1); // treats 2^16-1 as full scale, ≈ 96.3 dB
[array addObject:[NSNumber numberWithDouble:volume/maxVolume]];
}
return [array copy];
}
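As a sanity check on that normalization (my own arithmetic, not from the original): a constant full-scale 16-bit signal has a mean square of 32767², so

double mean      = 32767.0 * 32767.0;           // mean square at full scale
double volume    = 10 * log10(mean);            // ≈ 90.3 dB
double maxVolume = 20 * log10(pow(2, 16) - 1);  // ≈ 96.3 dB
double ratio     = volume / maxVolume;          // ≈ 0.94

so the loudest possible input maps to roughly 0.94 rather than exactly 1.0, and quieter signals scale down from there.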
For a given volume value we draw the wave with a Bezier path masked through a CAShapeLayer: add a transparent Bezier curve to the layer, where the curve is a sine wave with a fixed frequency plus a phase offset. For the amplitude, use half the view's height as the maximum amplitude and treat the PCM volume as a percentage of that maximum. When the volume rises, the peaks and troughs grow; when it falls, they shrink, which is exactly the effect we want. The percentage is the current volume divided by the maximum volume the format can represent. For 16-bit PCM, the quietest nonzero level relative to full scale is 20 × log10(1 / (2^16 − 1)) ≈ −96.3 dB, so the format spans roughly 96.3 dB of dynamic range. An ordinary indoor level of about −35 dB then maps to an amplitude percentage of roughly 36.4%, i.e. a wave whose peak spans 36.4% of the view's height.
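Expressed as code, that mapping is a one-liner (a sketch under the same assumptions; the function name and the clamping are mine):

#import <UIKit/UIKit.h>
// Map a level in dB below full scale (a negative number) to a 0..1
// fraction of the view's maximum amplitude.
static CGFloat MCAmplitudeRatio(CGFloat levelDb) {
    CGFloat floorDb = 20 * log10(1.0 / (pow(2, 16) - 1)); // ≈ -96.3 dB
    return fmin(fmax(levelDb / floorDb, 0.0), 1.0);
}
// MCAmplitudeRatio(-35.0) ≈ 0.363: a peak spanning about 36% of the view height.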
MCVoiceWaveView.h
//
// MCVoiceWaveView.h
// MCVoiceWave
//
// Created by 朱進(jìn)林 on 10/8/16.
// Copyright © 2016 Martin Choo. All rights reserved.
//
#import <UIKit/UIKit.h>
#pragma mark - HHVolumeQueue
@interface MCVolumeQueue : NSObject
-(void)pushVolume:(CGFloat)volume;
-(void)pushVolumeWithArray:(NSArray*)array;
-(CGFloat)popVolume;
-(void)cleanQueue;
@end
#pragma mark - HHVoiceWaveView
@interface MCVoiceWaveView : UIView
/**
 * Add the wave view to a parent view and initialize it
 * parentView: the parent view
 */
-(void)showInParentView:(UIView*)parentView;
/**
 * Start the wave animation
 */
-(void)startVoiceWave;
/**
 * Change the volume, which changes the wave's amplitude
 * volume: the volume level
 */
-(void)changeVolume:(CGFloat)volume;
/**
 * Stop the wave animation
 */
-(void)stopVoiceWave;
/**
 * Remove the wave view
 */
-(void)removeFromParent;
@end
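A minimal usage sketch of this interface (the container view and the hard-coded volume are my own placeholders):

MCVoiceWaveView *waveView = [[MCVoiceWaveView alloc] init];
[waveView showInParentView:self.view];
[waveView startVoiceWave];
// for each normalized amplitude extracted from the PCM stream:
[waveView changeVolume:0.36f];
// when playback ends:
[waveView stopVoiceWave];
[waveView removeFromParent];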
MCVoiceWaveView.m
//
// MCVoiceWaveView.m
// MCVoiceWave
//
// Created by 朱進(jìn)林 on 10/8/16.
// Copyright © 2016 Martin Choo. All rights reserved.
//
#import "MCVoiceWaveView.h"
#define voiceWaveDisappearDuration 0.25
#define minVolume 0.05
static NSRunLoop* _voiceWaveRunLoop;
#pragma mark - MCVolumeQueue
@interface MCVolumeQueue()
@property (nonatomic, strong) NSMutableArray* volumeArray;
@end
@implementation MCVolumeQueue
-(instancetype)init
{
self = [super init];
if (self) {
self.volumeArray = [NSMutableArray array];
}
return self;
}
-(void)pushVolume:(CGFloat)volume
{
if (volume >= minVolume) {
[_volumeArray addObject:[NSNumber numberWithFloat:volume]];
}
}
-(void)pushVolumeWithArray:(NSArray *)array
{
if (array.count > 0) {
for (NSInteger i = 0; i < array.count; i++) {
CGFloat volume = [array[i] floatValue];
[self pushVolume:volume];
}
}
}
-(CGFloat)popVolume
{
CGFloat volume = -10;
if (_volumeArray.count > 0) {
volume = [[_volumeArray firstObject] floatValue];
[_volumeArray removeObjectAtIndex:0];
}
return volume;
}
-(void)cleanQueue
{
if (_volumeArray) {
[_volumeArray removeAllObjects];
}
}
@end
#pragma mark - MCVoiceWaveView
@interface MCVoiceWaveView(){
CGFloat _idleAmplitude;// minimum amplitude
CGFloat _amplitude;// amplitude factor: the fraction of the view's height the volume occupies
CGFloat _density;// granularity along the x axis; the smaller it is, the smoother the line
CGFloat _waveHeight;// height of the view containing the waveform
CGFloat _waveWidth;// width of the view containing the waveform
CGFloat _waveMid;// horizontal midpoint of the view
CGFloat _maxAmplitude;// maximum amplitude
// drawing a few extra lines makes the wave look more complex and realistic
CGFloat _phase;// initial phase offset
CGFloat _phaseShift;// per-frame increment added to _phase, creating the forward-rolling feel
CGFloat _frequencyFirst;// frequency of firstLine across the view
CGFloat _frequencySecond;// frequency of secondLine across the view
//
CGFloat _currentVolume;// volume state
CGFloat _lastVolume;
CGFloat _middleVolume;
//
CGFloat _maxWidth;// maximum display width of the wave
CGFloat _beginX;// x coordinate where the wave starts
CGFloat _stopAnimationRatio;// decay factor: applied after stopping so a loud volume doesn't leave a large standing wave
BOOL _isStopAnimating;// the disappear animation is in progress
//
UIBezierPath* _firstLayerPath;
UIBezierPath* _secondLayerPath;
}
@property (nonatomic, strong) CADisplayLink* displayLink;
@property (nonatomic, strong) CAShapeLayer* firstShapeLayer;
@property (nonatomic, strong) CAShapeLayer* secondShapeLayer;
@property (nonatomic, strong) CAShapeLayer* fillShapeLayer;
//
@property (nonatomic, strong) UIImageView* firstLine;
@property (nonatomic, strong) UIImageView* secondLine;
@property (nonatomic, strong) UIImageView* fillLayerImage;
//
@property (nonatomic, strong) MCVolumeQueue* volumeQueue;
@end
@implementation MCVoiceWaveView
-(void)setup
{
_frequencyFirst = 2.0f;// 2 cycles across the view
_frequencySecond = 1.8f;// 1.8 cycles: a slight period offset makes the two lines look pleasantly staggered
_amplitude = 1.0f;
_idleAmplitude = 0.01f;
_phase = 0.0f;
_phaseShift = -0.22f;
_density = 1.f;
_waveHeight = CGRectGetHeight(self.bounds);
_waveWidth = CGRectGetWidth(self.bounds);
_waveMid = _waveWidth / 2.0f;
_maxAmplitude = _waveHeight * 0.5;
_maxWidth = _waveWidth + _density;
_beginX = 0.0;
_lastVolume = 0.0;
_currentVolume = 0.0;
_middleVolume = 0.01;
_stopAnimationRatio = 1.0;
[_volumeQueue cleanQueue];
}
-(instancetype)init
{
self = [super init];
if (self) {
[self startVoiceWaveThread];
}
return self;
}
-(void)dealloc
{
[_displayLink invalidate];
}
-(void)voiceWaveThreadEntryPoint:(id)__unused object
{
@autoreleasepool {
[[NSThread currentThread] setName:@"com.anxin-net.VoiceWave"];
_voiceWaveRunLoop = [NSRunLoop currentRunLoop];
[_voiceWaveRunLoop addPort:[NSMachPort port] forMode:NSDefaultRunLoopMode];
[_voiceWaveRunLoop run];
}
}
-(NSThread*)startVoiceWaveThread
{
static NSThread* _voiceWaveThread = nil;
static dispatch_once_t oncePredicate;
dispatch_once(&oncePredicate, ^{
_voiceWaveThread = [[NSThread alloc] initWithTarget:self selector:@selector(voiceWaveThreadEntryPoint:) object:nil];
[_voiceWaveThread start];
});
return _voiceWaveThread;
}
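// Editor's note on this design: a CADisplayLink must be attached to a run
// loop, so the view spins up a single dedicated background thread with its
// own run loop and builds the 60 Hz wave paths there; the finished
// UIBezierPaths are handed back to the main thread in
// updateShapeLayerPath: for the actual CAShapeLayer updates.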
-(void)showInParentView:(UIView *)parentView
{
if (![self.superview isKindOfClass:[parentView class]] || !_isStopAnimating) {
[parentView addSubview:self];
}else {
[self.layer removeAllAnimations];
return;
}
//
self.frame =CGRectMake(0, 0, parentView.bounds.size.width, parentView.bounds.size.height);
[self setup];
//
[self addSubview:self.firstLine];
self.firstLine.frame = self.bounds;
CGFloat firstLineWidth = 5 / [UIScreen mainScreen].scale;
self.firstShapeLayer = [self generateShaperLayerWithLineWidth:firstLineWidth];
self.firstLine.layer.mask = self.firstShapeLayer;
//
[self addSubview:self.secondLine];
self.secondLine.frame = self.bounds;
CGFloat secondLineWidth = 4 / [UIScreen mainScreen].scale;
self.secondShapeLayer = [self generateShaperLayerWithLineWidth:secondLineWidth];
self.secondLine.layer.mask = self.secondShapeLayer;
//
[self addSubview:self.fillLayerImage];
_fillLayerImage.frame = self.bounds;
_fillLayerImage.layer.mask = self.fillShapeLayer;
//
[self updateMeters];
}
-(void)startVoiceWave
{
if (_isStopAnimating) {
return;
}
[self setup];
if (_voiceWaveRunLoop) {
[self.displayLink invalidate];
self.displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(invokeWaveCallback)];
[self.displayLink addToRunLoop:_voiceWaveRunLoop forMode:NSRunLoopCommonModes];
}else {
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.3*NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
if (_voiceWaveRunLoop) {
[self.displayLink invalidate];
self.displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(invokeWaveCallback)];
[self.displayLink addToRunLoop:_voiceWaveRunLoop forMode:NSRunLoopCommonModes];
}
});
}
}
-(void)stopVoiceWave
{
if (_isStopAnimating) {
return;
}
[self.layer removeAllAnimations];
_isStopAnimating = YES;
}
-(void)changeVolume:(CGFloat)volume
{
@synchronized (self) {
_lastVolume = _currentVolume;
_currentVolume = volume;
//
NSArray* volumeArray = [self generatePointsOfSize:6 withPowFactor:1 fromStartY:_lastVolume toEndY:_currentVolume];
[self.volumeQueue pushVolumeWithArray:volumeArray];
}
}
-(void)removeFromParent
{
[_displayLink invalidate];
[self removeFromSuperview];
}
-(void)invokeWaveCallback
{
[self updateMeters];
}
-(void)updateMeters
{
CGFloat volume = [self.volumeQueue popVolume];
if (volume > 0) {
_middleVolume = volume;
}else {
_middleVolume -= 0.01;
}
_phase += _phaseShift;
_amplitude = fmax(_middleVolume, _idleAmplitude);
if (_isStopAnimating) {
_stopAnimationRatio -=0.05;
_stopAnimationRatio = fmax(_stopAnimationRatio, 0.01);
if (_stopAnimationRatio == 0.01) {
[self animationStopped];
}
}
_firstLayerPath = nil;
_secondLayerPath = nil;
_firstLayerPath = [self generateBezierPathWithFrequency:_frequencyFirst maxAmplitude:_maxAmplitude phase:_phase];
_secondLayerPath = [self generateBezierPathWithFrequency:_frequencySecond maxAmplitude:_maxAmplitude * 0.8 phase:_phase+3];
//
NSDictionary* dic = @{@"firstPath":_firstLayerPath,@"secondPath":_secondLayerPath};
[self performSelectorOnMainThread:@selector(updateShapeLayerPath:) withObject:dic waitUntilDone:NO];
}
-(void)updateShapeLayerPath:(NSDictionary*)dic
{
UIBezierPath* firstPath = [dic objectForKey:@"firstPath"];
_firstShapeLayer.path = firstPath.CGPath;
UIBezierPath* secondPath = [dic objectForKey:@"secondPath"];
_secondShapeLayer.path = secondPath.CGPath;
if (firstPath && secondPath) {
UIBezierPath* fillPath = [UIBezierPath bezierPathWithCGPath:firstPath.CGPath];
[fillPath appendPath:secondPath];
[fillPath closePath];
_fillShapeLayer.path = fillPath.CGPath;
}
}
-(void)animationStopped
{
[self.displayLink invalidate];
_isStopAnimating = NO;
//
self.layer.mask = nil;
_lastVolume = 0.0;
_currentVolume = 0.0;
_middleVolume = 0.05;
[_volumeQueue cleanQueue];
}
#pragma mark - generate
-(CAShapeLayer*)generateShaperLayerWithLineWidth:(CGFloat)lineWidth
{
CAShapeLayer* waveLine = [CAShapeLayer layer];
waveLine.lineCap = kCALineCapButt;
waveLine.lineJoin = kCALineJoinRound;
waveLine.strokeColor = [UIColor redColor].CGColor;
waveLine.fillColor = [UIColor clearColor].CGColor;
waveLine.lineWidth = lineWidth;
waveLine.backgroundColor = [UIColor clearColor].CGColor;
return waveLine;
}
/** Build the Bezier path representing the current volume from the frequency, maximum amplitude, and phase.
 */
-(UIBezierPath*)generateBezierPathWithFrequency:(CGFloat)frequency maxAmplitude:(CGFloat)maxAmplitude phase:(CGFloat)phase
{
UIBezierPath* waveLinePath = [UIBezierPath bezierPath];
CGFloat normedAmplitude = fmin(_amplitude, 1.0);// amplitude as a percentage, capped at 1
// connect a point at every x-axis step; strung together they approximate a smooth curve
for (CGFloat x = _beginX; x < _maxWidth; x += _density) {
CGFloat scaling = (1+cosf(M_PI+(x/_maxWidth)*2*M_PI))/2;// envelope: 0 at both edges, 1 at mid-view
CGFloat y = scaling * _maxAmplitude * normedAmplitude * _stopAnimationRatio * sinf(2 * M_PI * (x / _waveWidth) * frequency + phase) + (_waveHeight * 0.5);
if (_beginX == x) {
[waveLinePath moveToPoint:CGPointMake(x, y)];
}else {
[waveLinePath addLineToPoint:CGPointMake(x, y)];
}
}
return waveLinePath;
}
/** Interpolation: insert several values between two adjacent volumes so the transition between waveforms looks smoother. */
-(NSArray*)generatePointsOfSize:(NSInteger)size withPowFactor:(CGFloat)factor fromStartY:(CGFloat)startY toEndY:(CGFloat)endY
{
BOOL factorValid = factor > 0 && factor < 2;
BOOL startYValid = 0 <= startY && startY <= 1;
BOOL endYValid = 0 <= endY && endY <= 1;
if (!(factorValid && startYValid && endYValid)) {
return nil;
}
//
NSMutableArray* mArray = [NSMutableArray arrayWithCapacity:size];
CGFloat startX,endX;
startX = pow(startY, 1/factor);
endX = pow(endY, 1/factor);
//
CGFloat pieceOfX = (endX - startX) / size;
CGFloat x,y;
[mArray addObject:[NSNumber numberWithFloat:startY]];
for (int i = 1; i < size; ++i) {
x = startX + pieceOfX * i;
y = pow(x, factor);
[mArray addObject:[NSNumber numberWithFloat:y]];
}
return [mArray copy];
}
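// Worked example (my own numbers): size 6, factor 1, startY 0.2, endY 0.8
// yields 0.2, 0.3, 0.4, 0.5, 0.6, 0.7 -- with factor 1 this is plain linear
// interpolation; other factors bend the steps along y = pow(x, factor).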
#pragma mark - getter
-(UIImageView*)firstLine
{
if (!_firstLine) {
self.firstLine = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"pic_firstLine.png"]];
_firstLine.layer.masksToBounds = YES;
}
return _firstLine;
}
-(UIImageView*)secondLine
{
if (!_secondLine) {
self.secondLine = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"pic_secondLine.png"]];
_secondLine.layer.masksToBounds = YES;
_secondLine.alpha = 0.6;
}
return _secondLine;
}
-(UIImageView*)fillLayerImage
{
if (!_fillLayerImage) {
self.fillLayerImage = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"pic_fill.png"]];
_fillLayerImage.layer.masksToBounds = YES;
_fillLayerImage.alpha = 0.2;
}
return _fillLayerImage;
}
-(CAShapeLayer*)fillShapeLayer
{
if (!_fillShapeLayer) {
self.fillShapeLayer = [CAShapeLayer layer];
_fillShapeLayer.lineCap = kCALineCapButt;
_fillShapeLayer.lineJoin = kCALineJoinRound;
_fillShapeLayer.strokeColor = [UIColor clearColor].CGColor;
_fillShapeLayer.fillColor = [UIColor redColor].CGColor;
_fillShapeLayer.fillRule = kCAFillRuleEvenOdd;
_fillShapeLayer.lineWidth = 2;
_fillShapeLayer.backgroundColor = [UIColor clearColor].CGColor;
}
return _fillShapeLayer;
}
-(MCVolumeQueue*)volumeQueue
{
if (!_volumeQueue) {
self.volumeQueue = [[MCVolumeQueue alloc] init];
}
return _volumeQueue;
}
@end
Parts of the code above touch on company-confidential material, so I can't post the complete source. The version I've put on GitHub draws the waveform from audio captured on the microphone instead. If you have suggestions for improvement or questions, feel free to contact me. Thanks for reading!
Result:
Capturing microphone audio with AVAudioRecorder and running the PCM data through the processing above produces the effect shown below:
Source code: https://github.com/HaloMartin/MCVoiceWave
References
Volume/decibel calculation: http://www.cnblogs.com/karlchen/archive/2007/04/10/707478.html
Bezier curves: http://blog.csdn.net/likendsl/article/details/7852658
A function-graphing tool was also used to plot the curves above.