On the Netty ByteBuf Memory Leak Problem

The Donghua vehicle-management data collection platform I built earlier kept losing data. It didn't happen often, but the cause still deserved a look, so today I raised Netty's log level to find out where the problem was. The code to raise the level:

ServerBootstrap b = new ServerBootstrap();
b.group(bossGroup, workerGroup)
 .channel(NioServerSocketChannel.class)
 .option(ChannelOption.SO_BACKLOG, 2048)
 .handler(new LoggingHandler(LogLevel.DEBUG))
 .childHandler(new ChildChannelHandler());

Setting the LogLevel to DEBUG is all it takes.
Then I settled in to watch the logs:

2017-01-19 10:04:46  [ nioEventLoopGroup-1-0:1625429 ] - [ INFO ]  Message body: 60160308049620860021010707190117020453395443491162627407087d081f00002e37008801008c00f9
2017-01-19 10:04:49  [ nioEventLoopGroup-1-0:1628830 ] - [ ERROR ]  LEAK: ByteBuf.release() was not called before it's garbage-collected. Enable advanced leak reporting to find out where the leak occurred. To enable advanced leak reporting, specify the JVM option '-Dio.netty.leakDetectionLevel=advanced' or call ResourceLeakDetector.setLevel() See http://netty.io/wiki/reference-counted-objects.html for more information.
2017-01-19 10:04:49  [ nioEventLoopGroup-1-0:1628845 ] - [ INFO ]  Cache-queue enqueue result: 9
2017-01-19 10:04:49  [ nioEventLoopGroup-1-0:1628845 ] - [ INFO ]  Message body: 601603080496208600210107071901170204573954434611626262170f88091f00002e37008801008c00fa
2017-01-19 10:04:53  [ nioEventLoopGroup-1-0:1632839 ] - [ INFO ]  Cache-queue enqueue result: 9
2017-01-19 10:04:53  [ nioEventLoopGroup-1-0:1632839 ] - [ INFO ]  Message body: 60160308049620860021010707190117020501395443581162624817108a091f00002e37008801008c00fb
2017-01-19 10:04:55  [ nioEventLoopGroup-1-0:1634196 ] - [ INFO ]  Cache-queue enqueue result: 9
2017-01-19 10:04:55  [ nioEventLoopGroup-1-0:1634196 ] - [ INFO ]  Message body: 601603080496208600210107071901170205023954436011626244571288091f00002e37008801008c00fc
2017-01-19 10:04:56  [ nioEventLoopGroup-1-0:1635288 ] - [ INFO ]  Cache-queue enqueue result: 9
2017-01-19 10:04:56  [ nioEventLoopGroup-1-0:1635288 ] - [ INFO ]  Message body: 60160308049620860021010707190117020503395443651162624107118a091f00002e37008801008c00fd
2017-01-19 10:04:57  [ nioEventLoopGroup-1-0:1636443 ] - [ INFO ]  Cache-queue enqueue result: 9
2017-01-19 10:04:57  [ nioEventLoopGroup-1-0:1636443 ] - [ INFO ]  Message body: 601603080496208600210107071901170205053954437111626234671088091f00002e37008801008c00fe

Note this line:

LEAK: ByteBuf.release() was not called before it's garbage-collected. Enable advanced leak reporting to find out where the leak occurred. To enable advanced leak reporting, specify the JVM option '-Dio.netty.leakDetectionLevel=advanced' or call ResourceLeakDetector.setLevel() See http://netty.io/wiki/reference-counted-objects.html for more information.

This message tells us that simply adding

ResourceLeakDetector.setLevel(ResourceLeakDetector.Level.ADVANCED);

sets the leak-detection level to ADVANCED, which reports far more detailed leak information. I then checked the logs again:
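The log message also offers the JVM option as an alternative. A minimal sketch of where the programmatic call can go (the class name and the placement at the top of main are my assumptions; it just needs to run before any buffers are allocated):

// Enable advanced leak reporting before the server starts.
// Equivalent JVM option: -Dio.netty.leakDetectionLevel=advanced
import io.netty.util.ResourceLeakDetector;

public class ObdServer {  // illustrative class name, not from the original post
    public static void main(String[] args) throws Exception {
        ResourceLeakDetector.setLevel(ResourceLeakDetector.Level.ADVANCED);
        // ... build and bind the ServerBootstrap shown above ...
    }
}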

2017-01-19 10:35:59  [ nioEventLoopGroup-1-0:665092 ] - [ ERROR ]  LEAK: ByteBuf.release() was not called before it's garbage-collected. See http://netty.io/wiki/reference-counted-objects.html for more information.
Recent access records: 5
#5:
    io.netty.buffer.AdvancedLeakAwareByteBuf.readBytes(AdvancedLeakAwareByteBuf.java:435)
    com.dhcc.ObdServer.ObdServerHandler.channelRead(ObdServerHandler.java:31)
    io.netty.channel.ChannelHandlerInvokerUtil.invokeChannelReadNow(ChannelHandlerInvokerUtil.java:84)
    io.netty.channel.DefaultChannelHandlerInvoker.invokeChannelRead(DefaultChannelHandlerInvoker.java:153)
    io.netty.channel.PausableChannelEventExecutor.invokeChannelRead(PausableChannelEventExecutor.java:86)
    io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:389)
    io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:243)
    io.netty.channel.ChannelHandlerInvokerUtil.invokeChannelReadNow(ChannelHandlerInvokerUtil.java:84)
    io.netty.channel.DefaultChannelHandlerInvoker.invokeChannelRead(DefaultChannelHandlerInvoker.java:153)
    io.netty.channel.PausableChannelEventExecutor.invokeChannelRead(PausableChannelEventExecutor.java:86)
    io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:389)
    io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:956)
    io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:127)
    io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:514)
    io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:471)
    io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:385)
    io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:351)
    io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
    io.netty.util.internal.chmv8.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1412)
    io.netty.util.internal.chmv8.ForkJoinTask.doExec(ForkJoinTask.java:280)
    io.netty.util.internal.chmv8.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:877)
    io.netty.util.internal.chmv8.ForkJoinPool.scan(ForkJoinPool.java:1706)
    io.netty.util.internal.chmv8.ForkJoinPool.runWorker(ForkJoinPool.java:1661)
    io.netty.util.internal.chmv8.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:126)
#4:
    Hint: 'ObdServerHandler#0' will handle the message from this point.
    io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:387)
    io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:243)
    io.netty.channel.ChannelHandlerInvokerUtil.invokeChannelReadNow(ChannelHandlerInvokerUtil.java:84)
    io.netty.channel.DefaultChannelHandlerInvoker.invokeChannelRead(DefaultChannelHandlerInvoker.java:153)
    io.netty.channel.PausableChannelEventExecutor.invokeChannelRead(PausableChannelEventExecutor.java:86)
    io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:389)
    io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:956)
    io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:127)
    io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:514)
    io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:471)
    io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:385)
    io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:351)
    io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
    io.netty.util.internal.chmv8.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1412)
    io.netty.util.internal.chmv8.ForkJoinTask.doExec(ForkJoinTask.java:280)
    io.netty.util.internal.chmv8.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:877)
    io.netty.util.internal.chmv8.ForkJoinPool.scan(ForkJoinPool.java:1706)
    io.netty.util.internal.chmv8.ForkJoinPool.runWorker(ForkJoinPool.java:1661)
    io.netty.util.internal.chmv8.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:126)
#3:
    io.netty.buffer.AdvancedLeakAwareByteBuf.release(AdvancedLeakAwareByteBuf.java:721)
    io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:237)
    io.netty.channel.ChannelHandlerInvokerUtil.invokeChannelReadNow(ChannelHandlerInvokerUtil.java:84)
    io.netty.channel.DefaultChannelHandlerInvoker.invokeChannelRead(DefaultChannelHandlerInvoker.java:153)
    io.netty.channel.PausableChannelEventExecutor.invokeChannelRead(PausableChannelEventExecutor.java:86)
    io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:389)
    io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:956)
    io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:127)
    io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:514)
    io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:471)
    io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:385)
    io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:351)
    io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
    io.netty.util.internal.chmv8.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1412)
    io.netty.util.internal.chmv8.ForkJoinTask.doExec(ForkJoinTask.java:280)
    io.netty.util.internal.chmv8.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:877)
    io.netty.util.internal.chmv8.ForkJoinPool.scan(ForkJoinPool.java:1706)
    io.netty.util.internal.chmv8.ForkJoinPool.runWorker(ForkJoinPool.java:1661)
    io.netty.util.internal.chmv8.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:126)
#2:
    io.netty.buffer.AdvancedLeakAwareByteBuf.retain(AdvancedLeakAwareByteBuf.java:693)
    io.netty.handler.codec.DelimiterBasedFrameDecoder.decode(DelimiterBasedFrameDecoder.java:277)
    io.netty.handler.codec.DelimiterBasedFrameDecoder.decode(DelimiterBasedFrameDecoder.java:216)
    io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:316)
    io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:230)
    io.netty.channel.ChannelHandlerInvokerUtil.invokeChannelReadNow(ChannelHandlerInvokerUtil.java:84)
    io.netty.channel.DefaultChannelHandlerInvoker.invokeChannelRead(DefaultChannelHandlerInvoker.java:153)
    io.netty.channel.PausableChannelEventExecutor.invokeChannelRead(PausableChannelEventExecutor.java:86)
    io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:389)
    io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:956)
    io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:127)
    io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:514)
    io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:471)
    io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:385)
    io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:351)
    io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
    io.netty.util.internal.chmv8.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1412)
    io.netty.util.internal.chmv8.ForkJoinTask.doExec(ForkJoinTask.java:280)
    io.netty.util.internal.chmv8.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:877)
    io.netty.util.internal.chmv8.ForkJoinPool.scan(ForkJoinPool.java:1706)
    io.netty.util.internal.chmv8.ForkJoinPool.runWorker(ForkJoinPool.java:1661)
    io.netty.util.internal.chmv8.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:126)
#1:
    io.netty.buffer.AdvancedLeakAwareByteBuf.skipBytes(AdvancedLeakAwareByteBuf.java:465)
    io.netty.handler.codec.DelimiterBasedFrameDecoder.decode(DelimiterBasedFrameDecoder.java:272)
    io.netty.handler.codec.DelimiterBasedFrameDecoder.decode(DelimiterBasedFrameDecoder.java:216)
    io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:316)
    io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:230)
    io.netty.channel.ChannelHandlerInvokerUtil.invokeChannelReadNow(ChannelHandlerInvokerUtil.java:84)
    io.netty.channel.DefaultChannelHandlerInvoker.invokeChannelRead(DefaultChannelHandlerInvoker.java:153)
    io.netty.channel.PausableChannelEventExecutor.invokeChannelRead(PausableChannelEventExecutor.java:86)
    io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:389)
    io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:956)
    io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:127)
    io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:514)
    io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:471)
    io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:385)
    io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:351)
    io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
    io.netty.util.internal.chmv8.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1412)
    io.netty.util.internal.chmv8.ForkJoinTask.doExec(ForkJoinTask.java:280)
    io.netty.util.internal.chmv8.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:877)
    io.netty.util.internal.chmv8.ForkJoinPool.scan(ForkJoinPool.java:1706)
    io.netty.util.internal.chmv8.ForkJoinPool.runWorker(ForkJoinPool.java:1661)
    io.netty.util.internal.chmv8.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:126)
Created at:
    io.netty.buffer.PooledByteBufAllocator.newDirectBuffer(PooledByteBufAllocator.java:250)
    io.netty.buffer.AbstractByteBufAllocator.directBuffer(AbstractByteBufAllocator.java:155)
    io.netty.buffer.AbstractByteBufAllocator.directBuffer(AbstractByteBufAllocator.java:146)
    io.netty.buffer.AbstractByteBufAllocator.ioBuffer(AbstractByteBufAllocator.java:107)
    io.netty.channel.AdaptiveRecvByteBufAllocator$HandleImpl.allocate(AdaptiveRecvByteBufAllocator.java:104)
    io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:113)
    io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:514)
    io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:471)
    io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:385)
    io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:351)
    io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
    io.netty.util.internal.chmv8.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1412)
    io.netty.util.internal.chmv8.ForkJoinTask.doExec(ForkJoinTask.java:280)
    io.netty.util.internal.chmv8.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:877)
    io.netty.util.internal.chmv8.ForkJoinPool.scan(ForkJoinPool.java:1706)
    io.netty.util.internal.chmv8.ForkJoinPool.runWorker(ForkJoinPool.java:1661)
    io.netty.util.internal.chmv8.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:126)

The trace points to this code of mine:

ByteBuf buff = (ByteBuf) msg;
byte[] req = new byte[buff.readableBytes()];
buff.readBytes(req);   // ObdServerHandler.java:31, the readBytes call in record #5 above

So the problem was confirmed to be a ByteBuf memory leak. Investigating from that angle, I found that Netty 5 allocates ByteBufs through PooledByteBufAllocator by default, so the buffers must be released manually; otherwise the pool leaks memory.
Releasing the ByteBuf is the fix:

ReferenceCountUtil.release(buff);
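For context, a minimal Netty 4-style sketch of what the fixed handler can look like (the try/finally arrangement and the queue hand-off comment are my assumptions; the class name and channelRead signature match the leak report above):

import io.netty.buffer.ByteBuf;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;
import io.netty.util.ReferenceCountUtil;

public class ObdServerHandler extends ChannelInboundHandlerAdapter {
    @Override
    public void channelRead(ChannelHandlerContext ctx, Object msg) {
        ByteBuf buff = (ByteBuf) msg;
        try {
            byte[] req = new byte[buff.readableBytes()];
            buff.readBytes(req);
            // ... decode req and push it onto the cache queue ...
        } finally {
            ReferenceCountUtil.release(buff);  // refCnt drops to 0, buffer returns to the pool
        }
    }
}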

Here is one netizen's explanation of ReferenceCountUtil.release():

ReferenceCountUtil.release() is really just a wrapper around ByteBuf.release(), which ByteBuf inherits from the ReferenceCounted interface. ByteBuf in Netty 4 uses reference counting (Netty 4 implements an optional ByteBuf pool): every newly allocated ByteBuf starts with a reference count of 1; each additional reference to the ByteBuf requires a call to ByteBuf.retain(), and each dropped reference requires a call to ByteBuf.release(). When the reference count reaches 0, the object can be reclaimed. I'm only using ByteBuf as the example; other classes implement ReferenceCounted as well, and the same rules apply to them.
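A tiny standalone demonstration of those counting rules (my own illustration, not from the original post):

import io.netty.buffer.ByteBuf;
import io.netty.buffer.PooledByteBufAllocator;

public class RefCountDemo {
    public static void main(String[] args) {
        ByteBuf buf = PooledByteBufAllocator.DEFAULT.buffer(16);
        System.out.println(buf.refCnt());  // 1: freshly allocated
        buf.retain();
        System.out.println(buf.refCnt());  // 2: one extra reference
        buf.release();
        System.out.println(buf.refCnt());  // 1: back to the original reference
        boolean freed = buf.release();     // count hits 0, buffer is reclaimed
        System.out.println(freed);         // true
    }
}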

While chasing this problem I also wondered whether the data loss came from my Netty service using UDP, so here is how to tell whether a Netty program is using TCP or UDP:

About TCP and UDP
A socket can be based on either TCP or UDP. The difference is that UDP does not guarantee every datagram arrives, so it performs better but tolerates errors poorly; TCP guarantees delivery, so its performance is not as good.
UDP is basically only suitable for things like live video streaming; our requirement calls for TCP.

So how do the two differ in code? Here is an explanation I found online:

For the ChannelFactory, UDP communication uses NioDatagramChannelFactory, while for TCP we chose NioServerSocketChannelFactory;
For the Bootstrap, UDP uses ConnectionlessBootstrap, while TCP uses ServerBootstrap (see the sketch just below).
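These class names come from the Netty 3.x API (org.jboss.netty packages), unlike the io.netty code earlier in this post. A minimal sketch contrasting the two styles, with the thread pools and port numbers as illustrative assumptions:

import java.net.InetSocketAddress;
import java.util.concurrent.Executors;
import org.jboss.netty.bootstrap.ConnectionlessBootstrap;
import org.jboss.netty.bootstrap.ServerBootstrap;
import org.jboss.netty.channel.socket.nio.NioDatagramChannelFactory;
import org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory;

public class BootstrapStyles {
    public static void main(String[] args) {
        // TCP: ServerBootstrap over NioServerSocketChannelFactory
        ServerBootstrap tcp = new ServerBootstrap(
                new NioServerSocketChannelFactory(
                        Executors.newCachedThreadPool(),    // boss threads
                        Executors.newCachedThreadPool()));  // worker threads
        tcp.bind(new InetSocketAddress(8080));

        // UDP: ConnectionlessBootstrap over NioDatagramChannelFactory
        ConnectionlessBootstrap udp = new ConnectionlessBootstrap(
                new NioDatagramChannelFactory(
                        Executors.newCachedThreadPool()));
        udp.bind(new InetSocketAddress(8081));
    }
}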

As for the Decoder and Encoder codecs, and the ChannelPipelineFactory, UDP development is no different from TCP, so I won't go into detail here.

The ChannelHandler is where UDP and TCP really differ. As everyone knows, UDP is connectionless: you can still obtain the current channel through the MessageEvent parameter's getChannel() method, but its isConnected() always returns false.
In UDP development, once you have the channel object inside the message-received callback, you can send data to the peer directly with channel.write(message, remoteAddress). The first argument is still the message object to send; the second is the peer's SocketAddress.
The point that needs the most care is the SocketAddress: in TCP communication we can get it from channel.getRemoteAddress(), but in UDP communication we must obtain the peer's SocketAddress by calling getRemoteAddress() on the MessageEvent.
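A minimal Netty 3.x handler sketch of that UDP pattern (the echo behavior is my assumption for illustration):

import org.jboss.netty.channel.ChannelHandlerContext;
import org.jboss.netty.channel.MessageEvent;
import org.jboss.netty.channel.SimpleChannelUpstreamHandler;

public class UdpEchoHandler extends SimpleChannelUpstreamHandler {
    @Override
    public void messageReceived(ChannelHandlerContext ctx, MessageEvent e) {
        // e.getChannel().isConnected() is false here: UDP is connectionless.
        // The peer's address comes from the MessageEvent, not from the channel:
        e.getChannel().write(e.getMessage(), e.getRemoteAddress());
    }
}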

最后編輯于
?著作權歸作者所有,轉載或內容合作請聯(lián)系作者