Video Capture in iOS: Parameter Settings and Camera Operations

Overview

In live-streaming apps, video capture is generally done with the AVFoundation framework, because it lets us customize the capture parameters; it also supports camera operations such as switching between the front and back cameras, taking photos, and turning on the torch; and, most importantly, it gives us access to the raw video data for encoding and other processing. This article covers the following:

  • The key classes involved in video capture
  • The steps of video capture
  • How to change capture parameters, e.g. resolution, frame rate, zooming the preview layer, and setting the exposure
  • Camera operations in detail, e.g. taking photos, switching between the front and back cameras, and turning the torch on and off

Code:

The Key Classes for Video Capture

AVCaptureDevice

This class represents a hardware device; through it we can access the phone's cameras, microphones, and other capture hardware. Whenever we need to change a device property (for example the flash mode or the focus mode), we must call lockForConfiguration to lock the device before making the change, and call unlockForConfiguration to unlock it afterwards.
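As a minimal sketch (assuming `device` is a video AVCaptureDevice obtained elsewhere), the lock/configure/unlock pattern looks like this:

NSError *error = nil;
if ([device lockForConfiguration:&error]) {
    // Only change properties the device actually supports
    if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {
        device.focusMode = AVCaptureFocusModeContinuousAutoFocus;
    }
    [device unlockForConfiguration];
} else {
    NSLog(@"lockForConfiguration failed: %@", error);
}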

AVCaptureDeviceInput

The input-device management object. An AVCaptureDeviceInput is created from an AVCaptureDevice and then added to an AVCaptureSession, which manages it. It represents an input device and configures the hardware device's ports; typical input devices are the microphone and the camera.

AVCaptureOutput

Represents the output data; the output can be still images (AVCaptureStillImageOutput) or video (AVCaptureMovieFileOutput).

AVCaptureSession

The media capture session, responsible for delivering the captured audio and video data to the outputs. One AVCaptureSession can have multiple inputs and outputs. It is the bridge between AVCaptureInput and AVCaptureOutput, coordinating the flow of data from input to output. The startRunning and stopRunning methods start and stop the session.

Each session object represents one capture session. If you need to change the session's configuration while the app is running (e.g. to switch cameras), first begin the configuration, make the changes, and then commit the configuration.
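For example, swapping an input could be batched like this (a sketch; `session`, `oldInput`, and `newInput` are placeholder variables):

[session beginConfiguration];
[session removeInput:oldInput];
if ([session canAddInput:newInput]) {
    [session addInput:newInput];
}
// All of the changes above take effect atomically here
[session commitConfiguration];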

AVCaptureConnection

AVCaptureConnection represents a connection between an AVCaptureInputPort or ports, and an AVCaptureOutput or AVCaptureVideoPreviewLayer present in an AVCaptureSession. That is, it is the link between input ports and an output, or between the session and its video preview layer.

AVCaptureVideoPreviewLayer

The preview layer. How do the photos and video get displayed on the phone? By adding this object to a UIView's layer.

The Steps of Video Capture

Below is the video capture code; the frame rate is 30 FPS and the resolution is 1920*1080.

#import "MiVideoCollectVC.h"
#import <AVFoundation/AVFoundation.h>

@interface MiVideoCollectVC ()<AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic,strong) AVCaptureVideoDataOutput *video_output;
@property (nonatomic,strong) AVCaptureSession  *m_session;

@property (weak, nonatomic) IBOutlet UIView *m_displayView;
@end

@implementation MiVideoCollectVC

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    
     [self startCaptureSession];
}

- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];
    [self startPreview];
}
- (IBAction)onpressedBtnDismiss:(id)sender {
    [self dismissViewControllerAnimated:YES completion:^{
        [self stopPreview];
    }];
}

- (void)startCaptureSession
{
    NSError *error = nil;
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    if ([session canSetSessionPreset:AVCaptureSessionPreset1920x1080]) {
        session.sessionPreset = AVCaptureSessionPreset1920x1080;
    }else{
        session.sessionPreset = AVCaptureSessionPresetHigh;
    }
    
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (error || !input) {
        NSLog(@"get input device error...");
        return;
    }
    if ([session canAddInput:input]) {
        [session addInput:input];
    }
    
    _video_output = [[AVCaptureVideoDataOutput alloc] init];
    if ([session canAddOutput:_video_output]) {
        [session addOutput:_video_output];
    }
    
    // Specify the pixel format
    _video_output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]
                                                              forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    _video_output.alwaysDiscardsLateVideoFrames = NO;
    dispatch_queue_t video_queue = dispatch_queue_create("MIVideoQueue", NULL);
    [_video_output setSampleBufferDelegate:self queue:video_queue];
    
    CMTime frameDuration = CMTimeMake(1, 30);
    BOOL frameRateSupported = NO;
    
    for (AVFrameRateRange *range in [device.activeFormat videoSupportedFrameRateRanges]) {
        if (CMTIME_COMPARE_INLINE(frameDuration, >=, range.minFrameDuration) &&
            CMTIME_COMPARE_INLINE(frameDuration, <=, range.maxFrameDuration)) {
            frameRateSupported = YES;
        }
    }
    
    if (frameRateSupported && [device lockForConfiguration:&error]) {
        [device setActiveVideoMaxFrameDuration:frameDuration];
        [device setActiveVideoMinFrameDuration:frameDuration];
        [device unlockForConfiguration];
    }
    
    [self adjustVideoStabilization];
    _m_session = session;
    
    
    CALayer *previewViewLayer = [self.m_displayView layer];
    previewViewLayer.backgroundColor = [[UIColor blackColor] CGColor];
    
    AVCaptureVideoPreviewLayer *newPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_m_session];
    
    [newPreviewLayer setFrame:[UIApplication sharedApplication].keyWindow.bounds];
    
    [newPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [previewViewLayer insertSublayer:newPreviewLayer atIndex:0];
}

- (void)adjustVideoStabilization
{
    NSArray *devices = [AVCaptureDevice devices];
    for (AVCaptureDevice *device in devices) {
        if ([device hasMediaType:AVMediaTypeVideo]) {
            if ([device.activeFormat isVideoStabilizationModeSupported:AVCaptureVideoStabilizationModeAuto]) {
                for (AVCaptureConnection *connection in _video_output.connections) {
                    for (AVCaptureInputPort *port in [connection inputPorts]) {
                        if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                            if (connection.supportsVideoStabilization) {
                                connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeStandard;
                                NSLog(@"now videoStabilizationMode = %ld",(long)connection.activeVideoStabilizationMode);
                            }else{
                                NSLog(@"connection does not support video stabilization");
                            }
                        }
                    }
                }
            }else{
                NSLog(@"device does not support video stabilization");
            }
        }
    }
}

- (void)startPreview
{
    if (![_m_session isRunning]) {
        [_m_session startRunning];
    }
}

- (void)stopPreview
{
    if ([_m_session isRunning]) {
        [_m_session stopRunning];
    }
}

#pragma mark -AVCaptureVideoDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"%s",__func__);
}

// This delegate method is called when a frame is dropped
- (void)captureOutput:(AVCaptureOutput *)output didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"MediaIOS: dropped frame...");
}

@end

The steps of video capture, summarized:

  1. Create an AVCaptureSession object, create the input and output devices, and add them to the AVCaptureSession.
  2. Set the session's video resolution.
  3. Set the capture frame rate.
  4. Create the video preview layer and insert it into the view's layer.

Changing Capture Parameters: Resolution and Frame Rate

Before looking at how to change the resolution and frame rate, let's first see how to monitor them: only by observing these values can we verify that our settings actually took effect.

Monitoring the resolution:

It can be read directly from the AVCaptureSession's sessionPreset property, which is a string; just print it after setting it.

Monitoring the frame rate:

The frame rate is the number of video frames captured per second. We can start a timer that fires once a second and prints the current capture frame rate. Below is the code that counts the frames captured within one second:

// Count how many video frames are captured per second
static int captureVideoFPS;
+ (void)calculatorCaptureFPS
{
    static int count = 0;
    static float lastTime = 0;
    CMClockRef hostClockRef = CMClockGetHostTimeClock();
    CMTime hostTime = CMClockGetTime(hostClockRef);
    float nowTime = CMTimeGetSeconds(hostTime);
    if(nowTime - lastTime >= 1)
    {
        captureVideoFPS = count;
        lastTime = nowTime;
        count = 0;
    }
    count++;
}

// Return the measured capture frame rate
+ (int)getCaptureVideoFPS
{
    return captureVideoFPS;
}
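For the counter to work, calculatorCaptureFPS has to be called once per captured frame, e.g. from the sample-buffer delegate (a sketch; MiTool is a hypothetical class holding the two methods above):

- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // Count this frame toward the running FPS measurement
    [MiTool calculatorCaptureFPS];
}

A timer firing once a second can then poll [MiTool getCaptureVideoFPS] and log the value.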

Changing the resolution

/**
 *  Reset resolution
 *
 *  @param m_session     AVCaptureSession instance
 *  @param resolution    Target vertical resolution: 1080, 720, 480, or 360
 */
+ (void)resetSessionPreset:(AVCaptureSession *)m_session resolution:(int)resolution
{
    [m_session beginConfiguration];
    switch (resolution) {
        case 1080:
            m_session.sessionPreset = [m_session canSetSessionPreset:AVCaptureSessionPreset1920x1080] ? AVCaptureSessionPreset1920x1080 : AVCaptureSessionPresetHigh;
            break;
        case 720:
            m_session.sessionPreset = [m_session canSetSessionPreset:AVCaptureSessionPreset1280x720] ? AVCaptureSessionPreset1280x720 : AVCaptureSessionPresetMedium;
            break;
        case 480:
            m_session.sessionPreset = [m_session canSetSessionPreset:AVCaptureSessionPreset640x480] ? AVCaptureSessionPreset640x480 : AVCaptureSessionPresetMedium;
            break;
        case 360:
            m_session.sessionPreset = AVCaptureSessionPresetMedium;
            break;
            
        default:
            break;
    }
    [m_session commitConfiguration];
}


Changing the frame rate

+ (void)settingFrameRate:(int)frameRate
{
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (![captureDevice lockForConfiguration:NULL]) {
        return;
    }
    @try {
        [captureDevice setActiveVideoMinFrameDuration:CMTimeMake(1, frameRate)];
        [captureDevice setActiveVideoMaxFrameDuration:CMTimeMake(1, frameRate)];
    } @catch (NSException *exception) {
        NSLog(@"MediaIOS: the device does not support this frame rate, error: %@", exception.description);
    }
    [captureDevice unlockForConfiguration];
}

Adding a Pinch Gesture to Zoom the Preview

With a pinch gesture, the user can zoom the previewed video in and out.

#define MiMaxZoomFactor 5.0f
#define MiPrinchVelocityDividerFactor 20.0f

+ (void)zoomCapture:(UIPinchGestureRecognizer *)recognizer
{
    
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([recognizer state] == UIGestureRecognizerStateChanged) {
        NSError *error = nil;
        if ([videoDevice lockForConfiguration:&error]) {
            CGFloat desiredZoomFactor = videoDevice.videoZoomFactor + atan2f(recognizer.velocity, MiPrinchVelocityDividerFactor);
            videoDevice.videoZoomFactor = desiredZoomFactor <= MiMaxZoomFactor ? MAX(1.0, MIN(desiredZoomFactor, videoDevice.activeFormat.videoMaxZoomFactor)) : MiMaxZoomFactor ;
            [videoDevice unlockForConfiguration];
        } else {
            NSLog(@"error: %@", error);
        }
    }
    
}

Camera Operations

While capturing video, you may also need to switch cameras, turn the torch on and off, take photos, and so on.

Switching Between the Front and Back Cameras

Here I set the resolution to 720p by default after switching, because some front cameras do not support 1080p. In a real project, this should be whatever value you had configured before, together with a fallback strategy for presets the front camera does not support.

// Switch cameras
- (void)switchCamera
{
    [_m_session beginConfiguration];
    if ([[_video_input device] position] == AVCaptureDevicePositionBack) {
        NSArray * devices = [AVCaptureDevice devices];
        for(AVCaptureDevice * device in devices) {
            if([device hasMediaType:AVMediaTypeVideo]) {
                if([device position] == AVCaptureDevicePositionFront) {
                    [self rePreviewWithCameraType:MiCameraType_Front device:device];
                    break;
                }
            }
        }
    }else{
        NSArray * devices = [AVCaptureDevice devices];
        for(AVCaptureDevice * device in devices) {
            if([device hasMediaType:AVMediaTypeVideo]) {
                if([device position] == AVCaptureDevicePositionBack) {
                    [self rePreviewWithCameraType:MiCameraType_Back device:device];
                    break;
                }
            }
        }
    }
    [_m_session commitConfiguration];
}

- (void)rePreviewWithCameraType:(MiCameraType)cameraType device:(AVCaptureDevice *)device {
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (!input) return;
    
    [_m_session removeInput:_video_input];
    _m_session.sessionPreset = AVCaptureSessionPresetLow;
    if ([_m_session canAddInput:input])  {
        [_m_session addInput:input];
    }else {
        return;
    }
    _video_input      = input;
    _m_cameraType    = cameraType;
    NSString *preset = AVCaptureSessionPreset1280x720;
    if([device supportsAVCaptureSessionPreset:preset] && [_m_session canSetSessionPreset:preset]) {
        _m_session.sessionPreset = preset;
    }else {
        // Fall back to a preset every device supports
        _m_session.sessionPreset = AVCaptureSessionPresetHigh;
    }
}

Turning the Torch On and Off

// Toggle the torch
-(void)switchTorch
{
    [_m_session beginConfiguration];
    [[_video_input device] lockForConfiguration:NULL];
    
    self.m_torchMode = [_video_input device].torchMode == AVCaptureTorchModeOn ? AVCaptureTorchModeOff : AVCaptureTorchModeOn;
    
    if ([[_video_input device] isTorchModeSupported:_m_torchMode ]) {
        [_video_input device].torchMode = self.m_torchMode;
    }
    [[_video_input device] unlockForConfiguration];
    [_m_session commitConfiguration];
}

Taking a Photo and Saving It to the Photo Library

The approach is:

  • Set a flag, watch it in the video capture delegate method, and change its value when the photo action is triggered
  • In the delegate method, check whether the flag indicates a photo is wanted; if so, convert the current CMSampleBufferRef frame into a UIImage and save the UIImage to the photo library

Note: the code below saves a correct color photo only when the capture pixel format is an RGB-family format (e.g. kCVPixelFormatType_32BGRA), which is what the bitmap context expects.

- (UIImage *)convertSameBufferToUIImage:(CMSampleBufferRef)sampleBuffer
{
    // Get the Core Video image buffer backing this sample buffer
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    // Get the number of bytes per row
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the width and height of the pixel buffer
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // Create a bitmap graphics context from the sample buffer's pixel data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);
    // Release the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    // Wrap the Quartz image in a UIImage
    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    // Release the Quartz image
    CGImageRelease(quartzImage);
    return (image);
}

+ (void)saveImageToSysphotos:(UIImage *)image
{
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library writeImageToSavedPhotosAlbum:image.CGImage metadata:nil completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error) {
            NSLog(@"MediaIos, failed to save the photo to the photo library, error: %@",error.description);
        }else{
            NSLog(@"MediaIos, photo saved successfully...");
        }
    }];
}
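The flag check described above can be wired into the sample-buffer delegate like this (a sketch; `photoRequested` is an assumed BOOL property set by the shutter button):

- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    if (self.photoRequested) {
        // Reset the flag so only one frame is captured per request
        self.photoRequested = NO;
        UIImage *image = [self convertSameBufferToUIImage:sampleBuffer];
        [[self class] saveImageToSysphotos:image];
    }
}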

Setting Auto Focus

// Tap to auto-focus
- (void)mifocus:(UITapGestureRecognizer *)sender
{
    CGPoint point = [sender locationInView:self.m_displayView];
    [self miAutoFocusWithPoint:point];
    NSLog(@"MediaIos, auto focus complete...");
}

- (void)miAutoFocusWithPoint:(CGPoint)touchPoint{
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([captureDevice isFocusPointOfInterestSupported] && [captureDevice isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        NSError *error;
        if ([captureDevice lockForConfiguration:&error]) {
            // Set the exposure point
            [captureDevice setExposurePointOfInterest:touchPoint];
            [captureDevice setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
            
            // Set the focus point
            [captureDevice setFocusPointOfInterest:touchPoint];
            [captureDevice setFocusMode:AVCaptureFocusModeAutoFocus];
            [captureDevice unlockForConfiguration];
        }
    }
}

Exposure Adjustment

// Exposure adjustment
- (void)changeExposure:(id)sender
{
    UISlider *slider = (UISlider *)sender;
    [self michangeExposure:slider.value];
    
}

- (void)michangeExposure:(CGFloat)value{
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error;
    if ([device lockForConfiguration:&error]) {
        [device setExposureTargetBias:value completionHandler:nil];
        [device unlockForConfiguration];
    }
}
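setExposureTargetBias: only accepts values between the device's minExposureTargetBias and maxExposureTargetBias, so a safer variant (a sketch; the method name is hypothetical) clamps the slider value first:

- (void)michangeExposureClamped:(CGFloat)value{
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    // Clamp the requested bias to the range the device supports
    CGFloat clamped = MAX(device.minExposureTargetBias, MIN(value, device.maxExposureTargetBias));
    NSError *error;
    if ([device lockForConfiguration:&error]) {
        [device setExposureTargetBias:clamped completionHandler:nil];
        [device unlockForConfiguration];
    }
}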

Adjusting the White Balance

- (AVCaptureWhiteBalanceGains)recalcGains:(AVCaptureWhiteBalanceGains)gains
                                 minValue:(CGFloat)minValue
                                 maxValue:(CGFloat)maxValue
{
    AVCaptureWhiteBalanceGains tmpGains = gains;
    tmpGains.blueGain   = MAX(MIN(tmpGains.blueGain , maxValue), minValue);
    tmpGains.redGain    = MAX(MIN(tmpGains.redGain  , maxValue), minValue);
    tmpGains.greenGain  = MAX(MIN(tmpGains.greenGain, maxValue), minValue);
    return tmpGains;
}

-(void)setWhiteBlanceUseTemperature:(CGFloat)temperature{
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked]) {
        [device lockForConfiguration:nil];
        AVCaptureWhiteBalanceGains currentGains = device.deviceWhiteBalanceGains;
        CGFloat currentTint = [device temperatureAndTintValuesForDeviceWhiteBalanceGains:currentGains].tint;
        AVCaptureWhiteBalanceTemperatureAndTintValues tempAndTintValues = {
            .temperature = temperature,
            .tint        = currentTint,
        };
        
        AVCaptureWhiteBalanceGains gains = [device deviceWhiteBalanceGainsForTemperatureAndTintValues:tempAndTintValues];
        CGFloat maxWhiteBalanceGain = device.maxWhiteBalanceGain;
        gains = [self recalcGains:gains minValue:1 maxValue:maxWhiteBalanceGain];
        
        [device setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:gains completionHandler:nil];
        [device unlockForConfiguration];
    }
}

// White balance adjustment
- (void)whiteBlanceChange:(id)sender
{
    UISlider *slider = (UISlider *)sender;
    [self setWhiteBlanceUseTemperature:slider.value];
}

© Copyright belongs to the author. For reprints or content collaboration, please contact the author.