WebRTC Source Code Analysis: Video Encoding, Part 1

A new year has begun, and before other work pushes it out of my head I want to write down the WebRTC flow I studied before the holidays. I will follow the same order as the WebRTC overview post. The previous two posts finished the analysis of WebRTC video capture, so starting with this one I will look at WebRTC video encoding, planned as a three-part series.

Without further ado: as shown in the figure below, the video encoding module is analyzed along two paths, the initialization flow and the encoding flow. Steps 1~31 form the initialization flow, which mainly creates the relevant objects; steps 32~49 form the encoding flow. This post covers the initialization flow.

The main classes involved in the flow above are:

VideoStreamEncoder implements both the VideoSinkInterface and EncodedImageCallback interfaces. Acting as a VideoSinkInterface, it receives the frames distributed by VideoBroadcaster; acting as an EncodedImageCallback, it receives the bitstream produced by MediaCodecVideoEncoder. MediaCodecVideoEncoder's callback_ member is a VCMEncodedFrameCallback object, and VCMEncodedFrameCallback's post_encode_callback_ is a VideoStreamEncoder object, so the encoded data flows back to VideoStreamEncoder through the callback_ member.
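The dual role described above can be sketched in a few lines of standalone C++. All class names here are simplified stand-ins, not the real WebRTC declarations; the point is only the shape of the wiring, where one object sits on both the input and the output side of the encoder:

```cpp
#include <cassert>
#include <vector>

// Illustrative minimal interfaces: a frame sink (input side) and an
// encoded-image callback (output side).
struct VideoFrame { int id; };
struct EncodedImage { int frame_id; };

class VideoSinkInterface {
 public:
  virtual ~VideoSinkInterface() = default;
  virtual void OnFrame(const VideoFrame& frame) = 0;
};

class EncodedImageCallback {
 public:
  virtual ~EncodedImageCallback() = default;
  virtual void OnEncodedImage(const EncodedImage& image) = 0;
};

// Holds a callback_ pointer and reports finished frames through it, the
// way MediaCodecVideoEncoder reports to VCMEncodedFrameCallback.
class FakeEncoder {
 public:
  void RegisterEncodeCompleteCallback(EncodedImageCallback* cb) { callback_ = cb; }
  void Encode(const VideoFrame& frame) {
    if (callback_) callback_->OnEncodedImage(EncodedImage{frame.id});
  }
 private:
  EncodedImageCallback* callback_ = nullptr;
};

// Plays the VideoStreamEncoder role: one object implements both interfaces.
class StreamEncoder : public VideoSinkInterface, public EncodedImageCallback {
 public:
  explicit StreamEncoder(FakeEncoder* encoder) : encoder_(encoder) {
    encoder_->RegisterEncodeCompleteCallback(this);
  }
  void OnFrame(const VideoFrame& frame) override { encoder_->Encode(frame); }
  void OnEncodedImage(const EncodedImage& image) override {
    encoded_ids.push_back(image.frame_id);
  }
  std::vector<int> encoded_ids;
 private:
  FakeEncoder* encoder_;
};

inline std::vector<int> RunPipelineDemo() {
  FakeEncoder encoder;
  StreamEncoder stream_encoder(&encoder);
  VideoSinkInterface* sink = &stream_encoder;  // how a broadcaster sees it
  sink->OnFrame(VideoFrame{1});
  sink->OnFrame(VideoFrame{2});
  return stream_encoder.encoded_ids;
}
```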

When the video send stream is created, WebRtcVideoSendStream::RecreateWebRtcStream is called; this is where the video encoding module is initialized. Its main code is:

void WebRtcVideoChannel::WebRtcVideoSendStream::RecreateWebRtcStream() {
  stream_ = call_->CreateVideoSendStream(std::move(config),
                                         parameters_.encoder_config.Copy());
  if (source_) {
    stream_->SetSource(this, GetDegradationPreference());
  }
}

It performs two main steps:

  1. Call CreateVideoSendStream on the Call object to create a VideoSendStream object.
  2. Call SetSource to set the frame source; this eventually calls VideoTrack's AddOrUpdateSink to register the sink object, which is a VideoStreamEncoder.

The main code of Call::CreateVideoSendStream is:

webrtc::VideoSendStream* Call::CreateVideoSendStream(
    webrtc::VideoSendStream::Config config,
    VideoEncoderConfig encoder_config) {
  VideoSendStream* send_stream = new VideoSendStream(
      num_cpu_cores_, module_process_thread_.get(), &worker_queue_,
      call_stats_.get(), transport_send_.get(), bitrate_allocator_.get(),
      video_send_delay_stats_.get(), event_log_, std::move(config),
      std::move(encoder_config), suspended_video_send_ssrcs_,
      suspended_video_payload_states_);
  return send_stream;
}

The VideoSendStream constructor is:

VideoSendStream::VideoSendStream(
    int num_cpu_cores,
    ProcessThread* module_process_thread,
    rtc::TaskQueue* worker_queue,
    CallStats* call_stats,
    RtpTransportControllerSendInterface* transport,
    BitrateAllocator* bitrate_allocator,
    SendDelayStats* send_delay_stats,
    RtcEventLog* event_log,
    VideoSendStream::Config config,
    VideoEncoderConfig encoder_config,
    const std::map<uint32_t, RtpState>& suspended_ssrcs,
    const std::map<uint32_t, RtpPayloadState>& suspended_payload_states)
    : worker_queue_(worker_queue),
      thread_sync_event_(false /* manual_reset */, false),
      stats_proxy_(Clock::GetRealTimeClock(),
                   config,
                   encoder_config.content_type),
      config_(std::move(config)),
      content_type_(encoder_config.content_type) {
  video_stream_encoder_.reset(
      new VideoStreamEncoder(num_cpu_cores, &stats_proxy_,
                             config_.encoder_settings,
                             config_.pre_encode_callback,
                             std::unique_ptr<OveruseFrameDetector>()));
  worker_queue_->PostTask(std::unique_ptr<rtc::QueuedTask>(new ConstructionTask(
      &send_stream_, &thread_sync_event_, &stats_proxy_,
      video_stream_encoder_.get(), module_process_thread, call_stats, transport,
      bitrate_allocator, send_delay_stats, event_log, &config_,
      encoder_config.max_bitrate_bps, suspended_ssrcs, suspended_payload_states,
      encoder_config.content_type)));

  // Wait for ConstructionTask to complete so that |send_stream_| can be used.
  // |module_process_thread| must be registered and deregistered on the thread
  // it was created on.
  thread_sync_event_.Wait(rtc::Event::kForever);
  send_stream_->RegisterProcessThread(module_process_thread);
  // TODO(sprang): Enable this also for regular video calls if it works well.
  if (encoder_config.content_type == VideoEncoderConfig::ContentType::kScreen) {
    // Only signal target bitrate for screenshare streams, for now.
    video_stream_encoder_->SetBitrateObserver(send_stream_.get());
  }

  ReconfigureVideoEncoder(std::move(encoder_config));
}

It mainly creates the VideoStreamEncoder and VideoSendStreamImpl objects, then calls ReconfigureVideoEncoder to initialize the encoder. The VideoSendStreamImpl object is created inside ConstructionTask::Run, as shown below:

  bool Run() override {
    send_stream_->reset(new VideoSendStreamImpl(
        stats_proxy_, rtc::TaskQueue::Current(), call_stats_, transport_,
        bitrate_allocator_, send_delay_stats_, video_stream_encoder_,
        event_log_, config_, initial_encoder_max_bitrate_,
        std::move(suspended_ssrcs_), std::move(suspended_payload_states_),
        content_type_));
    return true;
  }
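The post-task-then-wait pattern used here (PostTask followed by thread_sync_event_.Wait) can be reproduced with standard C++ threads. This is a minimal sketch under simplified assumptions, not the rtc::Event/TaskQueue implementation: the constructing thread hands the construction off to a worker and blocks until it has finished, so that the object (playing the role of send_stream_) is safe to use afterwards.

```cpp
#include <cassert>
#include <condition_variable>
#include <mutex>
#include <thread>

// Tiny stand-in for rtc::Event: Set() signals once, Wait() blocks until
// signaled. Names are illustrative.
class OneShotEvent {
 public:
  void Set() {
    std::lock_guard<std::mutex> lock(m_);
    signaled_ = true;
    cv_.notify_all();
  }
  void Wait() {
    std::unique_lock<std::mutex> lock(m_);
    cv_.wait(lock, [this] { return signaled_; });
  }
 private:
  std::mutex m_;
  std::condition_variable cv_;
  bool signaled_ = false;
};

inline int ConstructOnWorker() {
  int* constructed_object = nullptr;  // plays the role of send_stream_
  int storage = 0;
  OneShotEvent done;
  // "PostTask": run the construction on another thread.
  std::thread worker([&] {
    storage = 42;  // construct the object on the worker
    constructed_object = &storage;
    done.Set();    // like thread_sync_event_ being signaled
  });
  done.Wait();     // like thread_sync_event_.Wait(rtc::Event::kForever)
  int value = *constructed_object;  // now safe to dereference
  worker.join();
  return value;
}
```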

The VideoStreamEncoder constructor is:

VideoStreamEncoder::VideoStreamEncoder(
    uint32_t number_of_cores,
    SendStatisticsProxy* stats_proxy,
    const VideoSendStream::Config::EncoderSettings& settings,
    rtc::VideoSinkInterface<VideoFrame>* pre_encode_callback,
    std::unique_ptr<OveruseFrameDetector> overuse_detector)
    : shutdown_event_(true /* manual_reset */, false),
      number_of_cores_(number_of_cores),
      initial_rampup_(0),
      source_proxy_(new VideoSourceProxy(this)),
      sink_(nullptr),
      settings_(settings),
      codec_type_(PayloadStringToCodecType(settings.payload_name)),
      video_sender_(Clock::GetRealTimeClock(), this),
      overuse_detector_(
          overuse_detector.get()
              ? overuse_detector.release()
              : new OveruseFrameDetector(
                    GetCpuOveruseOptions(settings.full_overuse_time),
                    this,
                    stats_proxy)),
      stats_proxy_(stats_proxy),
      pre_encode_callback_(pre_encode_callback),
      max_framerate_(-1),
      pending_encoder_reconfiguration_(false),
      encoder_start_bitrate_bps_(0),
      max_data_payload_length_(0),
      nack_enabled_(false),
      last_observed_bitrate_bps_(0),
      encoder_paused_and_dropped_frame_(false),
      clock_(Clock::GetRealTimeClock()),
      degradation_preference_(
          VideoSendStream::DegradationPreference::kDegradationDisabled),
      posted_frames_waiting_for_encode_(0),
      last_captured_timestamp_(0),
      delta_ntp_internal_ms_(clock_->CurrentNtpInMilliseconds() -
                             clock_->TimeInMilliseconds()),
      last_frame_log_ms_(clock_->TimeInMilliseconds()),
      captured_frame_count_(0),
      dropped_frame_count_(0),
      bitrate_observer_(nullptr),
      encoder_queue_("EncoderQueue") {
  RTC_DCHECK(stats_proxy);
  encoder_queue_.PostTask([this] {
    RTC_DCHECK_RUN_ON(&encoder_queue_);
    overuse_detector_->StartCheckForOveruse();
    video_sender_.RegisterExternalEncoder(
        settings_.encoder, settings_.payload_type, settings_.internal_source);
  });
}

It mainly initializes the source_proxy_ and video_sender_ members. The VideoSender constructor is:

VideoSender::VideoSender(Clock* clock,
                         EncodedImageCallback* post_encode_callback)
    : _encoder(nullptr),
      _mediaOpt(clock),
      _encodedFrameCallback(post_encode_callback, &_mediaOpt),
      post_encode_callback_(post_encode_callback),
      _codecDataBase(&_encodedFrameCallback),
      frame_dropper_enabled_(true),
      current_codec_(),
      encoder_params_({BitrateAllocation(), 0, 0, 0}),
      encoder_has_internal_source_(false),
      next_frame_types_(1, kVideoFrameDelta) {
  _mediaOpt.Reset();
  // Allow VideoSender to be created on one thread but used on another, post
  // construction. This is currently how this class is being used by at least
  // one external project (diffractor).
  sequenced_checker_.Detach();
}

It mainly initializes the _encodedFrameCallback and _codecDataBase members. The VCMEncodedFrameCallback constructor is shown below; post_encode_callback is a VideoStreamEncoder object:

VCMEncodedFrameCallback::VCMEncodedFrameCallback(
    EncodedImageCallback* post_encode_callback,
    media_optimization::MediaOptimization* media_opt)
    : internal_source_(false),
      post_encode_callback_(post_encode_callback),
      media_opt_(media_opt),
      framerate_(1),
      last_timing_frame_time_ms_(-1),
      timing_frames_thresholds_({-1, 0}),
      incorrect_capture_time_logged_messages_(0),
      reordered_frames_logged_messages_(0),
      stalled_encoder_logged_messages_(0) {
}

The VCMCodecDataBase constructor is shown below; encoded_frame_callback_ is a VCMEncodedFrameCallback object:

VCMCodecDataBase::VCMCodecDataBase(
    VCMEncodedFrameCallback* encoded_frame_callback)
    : number_of_cores_(0),
      max_payload_size_(kDefaultPayloadSize),
      periodic_key_frames_(false),
      pending_encoder_reset_(true),
      send_codec_(),
      receive_codec_(),
      encoder_payload_type_(0),
      external_encoder_(nullptr),
      internal_source_(false),
      encoded_frame_callback_(encoded_frame_callback),
      dec_map_(),
      dec_external_map_() {}

In the VideoStreamEncoder constructor, VideoSender::RegisterExternalEncoder is called, which ultimately stores the encoder object in VCMCodecDataBase's external_encoder_ member:

void VCMCodecDataBase::RegisterExternalEncoder(VideoEncoder* external_encoder,
                                               uint8_t payload_type,
                                               bool internal_source) {
  // Since only one encoder can be used at a given time, only one external
  // encoder can be registered/used.
  external_encoder_ = external_encoder;
  encoder_payload_type_ = payload_type;
  internal_source_ = internal_source;
  pending_encoder_reset_ = true;
}

The encoder object itself is created through a WebRtcVideoEncoderFactory:

void WebRtcVideoChannel::WebRtcVideoSendStream::SetCodec(
    const VideoCodecSettings& codec_settings,
    bool force_encoder_allocation) {
  std::unique_ptr<webrtc::VideoEncoder> new_encoder;
  if (force_encoder_allocation || !allocated_encoder_ ||
      allocated_codec_ != codec_settings.codec) {
    const webrtc::SdpVideoFormat format(codec_settings.codec.name,
                                        codec_settings.codec.params);
    new_encoder = encoder_factory_->CreateVideoEncoder(format);

    parameters_.config.encoder_settings.encoder = new_encoder.get();

    const webrtc::VideoEncoderFactory::CodecInfo info =
        encoder_factory_->QueryVideoEncoder(format);
    parameters_.config.encoder_settings.full_overuse_time =
        info.is_hardware_accelerated;
    parameters_.config.encoder_settings.internal_source =
        info.has_internal_source;
  } else {
    new_encoder = std::move(allocated_encoder_);
  }
  parameters_.config.encoder_settings.payload_name = codec_settings.codec.name;
  parameters_.config.encoder_settings.payload_type = codec_settings.codec.id;
}
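The allocation decision in SetCodec can be reduced to a small standalone sketch. The types and names below are simplified stand-ins for the real cricket/webrtc classes: a fresh encoder is allocated only when forced, when none exists yet, or when the codec changed; otherwise the previously allocated one is reused.

```cpp
#include <cassert>
#include <memory>
#include <string>
#include <vector>

struct Encoder { std::string codec; };  // illustrative stand-in

class SendStreamSketch {
 public:
  // Returns true when a fresh encoder was allocated.
  bool SetCodec(const std::string& codec, bool force_encoder_allocation) {
    if (force_encoder_allocation || !allocated_encoder_ ||
        allocated_codec_ != codec) {
      allocated_encoder_ = std::make_unique<Encoder>(Encoder{codec});
      allocated_codec_ = codec;
      return true;
    }
    return false;  // reuse the existing encoder
  }
 private:
  std::unique_ptr<Encoder> allocated_encoder_;
  std::string allocated_codec_;
};

inline std::vector<bool> RunFactoryDemo() {
  SendStreamSketch stream;
  return {
      stream.SetCodec("VP8", false),   // first call: allocates
      stream.SetCodec("VP8", false),   // same codec: reuses
      stream.SetCodec("H264", false),  // codec changed: allocates
      stream.SetCodec("H264", true),   // forced: allocates again
  };
}
```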

Taking the Android MediaCodec encoder as an example, MediaCodecVideoEncoderFactory::CreateVideoEncoder is defined as:

VideoEncoder* MediaCodecVideoEncoderFactory::CreateVideoEncoder(
    const cricket::VideoCodec& codec) {
  if (supported_codecs().empty()) {
    ALOGW << "No HW video encoder for codec " << codec.name;
    return nullptr;
  }
  if (FindMatchingCodec(supported_codecs(), codec)) {
    ALOGD << "Create HW video encoder for " << codec.name;
    JNIEnv* jni = AttachCurrentThreadIfNeeded();
    ScopedLocalRefFrame local_ref_frame(jni);
    return new MediaCodecVideoEncoder(jni, codec, egl_context_);
  }
  ALOGW << "Can not find HW video encoder for type " << codec.name;
  return nullptr;
}

So VCMCodecDataBase's external_encoder_ is a MediaCodecVideoEncoder object.

Back in the VideoSendStream constructor, the call to ReconfigureVideoEncoder eventually reaches VCMCodecDataBase::SetSendCodec, shown below. It mainly creates and initializes a VCMGenericEncoder object, where external_encoder_ is a MediaCodecVideoEncoder object and encoded_frame_callback_ is a VCMEncodedFrameCallback object.

bool VCMCodecDataBase::SetSendCodec(const VideoCodec* send_codec,
                                    int number_of_cores,
                                    size_t max_payload_size) {
  ptr_encoder_.reset(new VCMGenericEncoder(
      external_encoder_, encoded_frame_callback_, internal_source_));
  encoded_frame_callback_->SetInternalSource(internal_source_);
  if (ptr_encoder_->InitEncode(&send_codec_, number_of_cores_,
                               max_payload_size_) < 0) {
    RTC_LOG(LS_ERROR) << "Failed to initialize video encoder.";
    DeleteEncoder();
    return false;
  }
  return true;
}

The VCMGenericEncoder constructor is shown below; encoder_ and vcm_encoded_frame_callback_ hold the MediaCodecVideoEncoder object and the VCMEncodedFrameCallback object, respectively.

VCMGenericEncoder::VCMGenericEncoder(
    VideoEncoder* encoder,
    VCMEncodedFrameCallback* encoded_frame_callback,
    bool internal_source)
    : encoder_(encoder),
      vcm_encoded_frame_callback_(encoded_frame_callback),
      internal_source_(internal_source),
      encoder_params_({BitrateAllocation(), 0, 0, 0}),
      streams_or_svc_num_(0) {}

VCMGenericEncoder::InitEncode is defined as:

int32_t VCMGenericEncoder::InitEncode(const VideoCodec* settings,
                                      int32_t number_of_cores,
                                      size_t max_payload_size) {
  RTC_DCHECK_RUNS_SERIALIZED(&race_checker_);
  TRACE_EVENT0("webrtc", "VCMGenericEncoder::InitEncode");
  streams_or_svc_num_ = settings->numberOfSimulcastStreams;
  codec_type_ = settings->codecType;
  if (settings->codecType == kVideoCodecVP9) {
    streams_or_svc_num_ = settings->VP9().numberOfSpatialLayers;
  }
  if (streams_or_svc_num_ == 0)
    streams_or_svc_num_ = 1;

  vcm_encoded_frame_callback_->SetTimingFramesThresholds(
      settings->timing_frame_thresholds);
  vcm_encoded_frame_callback_->OnFrameRateChanged(settings->maxFramerate);

  if (encoder_->InitEncode(settings, number_of_cores, max_payload_size) != 0) {
    RTC_LOG(LS_ERROR) << "Failed to initialize the encoder associated with "
                         "payload name: "
                      << settings->plName;
    return -1;
  }
  vcm_encoded_frame_callback_->Reset();
  encoder_->RegisterEncodeCompleteCallback(vcm_encoded_frame_callback_);
  return 0;
}

It mainly calls MediaCodecVideoEncoder::InitEncode to create and initialize the encoder, and registers the VCMEncodedFrameCallback with the MediaCodecVideoEncoder to receive the encoded data.
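The stream-count logic at the top of InitEncode is worth isolating, since it is easy to misread: the number of encoded streams comes from simulcast for most codecs, from spatial layers for VP9, and defaults to 1. The enum and field names below are simplified stand-ins for the VideoCodec settings struct:

```cpp
#include <cassert>

enum class CodecType { kVP8, kVP9, kH264 };  // illustrative subset

struct CodecSettings {
  CodecType codec_type;
  int number_of_simulcast_streams;
  int vp9_number_of_spatial_layers;
};

// Mirrors the streams_or_svc_num_ computation in InitEncode.
inline int StreamsOrSvcNum(const CodecSettings& settings) {
  int n = settings.number_of_simulcast_streams;
  if (settings.codec_type == CodecType::kVP9)
    n = settings.vp9_number_of_spatial_layers;  // VP9 uses spatial layers
  if (n == 0) n = 1;  // a single stream when nothing is configured
  return n;
}
```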

InitEncode eventually calls the Java-level MediaCodecVideoEncoder.initEncode, shown below, which is where the Android MediaCodec encoder is created and configured.

  @CalledByNativeUnchecked
  boolean initEncode(VideoCodecType type, int profile, int width, int height, int kbps, int fps,
      EglBase14.Context sharedContext) {
    final boolean useSurface = sharedContext != null;
    Logging.d(TAG,
        "Java initEncode: " + type + ". Profile: " + profile + " : " + width + " x " + height
            + ". @ " + kbps + " kbps. Fps: " + fps + ". Encode from texture : " + useSurface);

    this.profile = profile;
    this.width = width;
    this.height = height;
    if (mediaCodecThread != null) {
      throw new RuntimeException("Forgot to release()?");
    }
    EncoderProperties properties = null;
    String mime = null;
    int keyFrameIntervalSec = 0;
    boolean configureH264HighProfile = false;
    if (type == VideoCodecType.VIDEO_CODEC_VP8) {
      mime = VP8_MIME_TYPE;
      properties = findHwEncoder(
          VP8_MIME_TYPE, vp8HwList(), useSurface ? supportedSurfaceColorList : supportedColorList);
      keyFrameIntervalSec = 100;
    } else if (type == VideoCodecType.VIDEO_CODEC_VP9) {
      mime = VP9_MIME_TYPE;
      properties = findHwEncoder(
          VP9_MIME_TYPE, vp9HwList, useSurface ? supportedSurfaceColorList : supportedColorList);
      keyFrameIntervalSec = 100;
    } else if (type == VideoCodecType.VIDEO_CODEC_H264) {
      mime = H264_MIME_TYPE;
      properties = findHwEncoder(
          H264_MIME_TYPE, h264HwList, useSurface ? supportedSurfaceColorList : supportedColorList);
      if (profile == H264Profile.CONSTRAINED_HIGH.getValue()) {
        EncoderProperties h264HighProfileProperties = findHwEncoder(H264_MIME_TYPE,
            h264HighProfileHwList, useSurface ? supportedSurfaceColorList : supportedColorList);
        if (h264HighProfileProperties != null) {
          Logging.d(TAG, "High profile H.264 encoder supported.");
          configureH264HighProfile = true;
        } else {
          Logging.d(TAG, "High profile H.264 encoder requested, but not supported. Use baseline.");
        }
      }
      keyFrameIntervalSec = 20;
    }
    if (properties == null) {
      throw new RuntimeException("Can not find HW encoder for " + type);
    }
    runningInstance = this; // Encoder is now running and can be queried for stack traces.
    colorFormat = properties.colorFormat;
    bitrateAdjustmentType = properties.bitrateAdjustmentType;
    if (bitrateAdjustmentType == BitrateAdjustmentType.FRAMERATE_ADJUSTMENT) {
      fps = BITRATE_ADJUSTMENT_FPS;
    } else {
      fps = Math.min(fps, MAXIMUM_INITIAL_FPS);
    }

    forcedKeyFrameMs = 0;
    lastKeyFrameMs = -1;
    if (type == VideoCodecType.VIDEO_CODEC_VP8
        && properties.codecName.startsWith(qcomVp8HwProperties.codecPrefix)) {
      if (Build.VERSION.SDK_INT == Build.VERSION_CODES.LOLLIPOP
          || Build.VERSION.SDK_INT == Build.VERSION_CODES.LOLLIPOP_MR1) {
        forcedKeyFrameMs = QCOM_VP8_KEY_FRAME_INTERVAL_ANDROID_L_MS;
      } else if (Build.VERSION.SDK_INT == Build.VERSION_CODES.M) {
        forcedKeyFrameMs = QCOM_VP8_KEY_FRAME_INTERVAL_ANDROID_M_MS;
      } else if (Build.VERSION.SDK_INT > Build.VERSION_CODES.M) {
        forcedKeyFrameMs = QCOM_VP8_KEY_FRAME_INTERVAL_ANDROID_N_MS;
      }
    }

    Logging.d(TAG, "Color format: " + colorFormat + ". Bitrate adjustment: " + bitrateAdjustmentType
            + ". Key frame interval: " + forcedKeyFrameMs + " . Initial fps: " + fps);
    targetBitrateBps = 1000 * kbps;
    targetFps = fps;
    bitrateAccumulatorMax = targetBitrateBps / 8.0;
    bitrateAccumulator = 0;
    bitrateObservationTimeMs = 0;
    bitrateAdjustmentScaleExp = 0;

    mediaCodecThread = Thread.currentThread();
    try {
      MediaFormat format = MediaFormat.createVideoFormat(mime, width, height);
      format.setInteger(MediaFormat.KEY_BIT_RATE, targetBitrateBps);
      format.setInteger("bitrate-mode", VIDEO_ControlRateConstant);
      format.setInteger(MediaFormat.KEY_COLOR_FORMAT, properties.colorFormat);
      format.setInteger(MediaFormat.KEY_FRAME_RATE, targetFps);
      format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, keyFrameIntervalSec);
      if (configureH264HighProfile) {
        format.setInteger("profile", VIDEO_AVCProfileHigh);
        format.setInteger("level", VIDEO_AVCLevel3);
      }
      Logging.d(TAG, "  Format: " + format);
      mediaCodec = createByCodecName(properties.codecName);
      this.type = type;
      if (mediaCodec == null) {
        Logging.e(TAG, "Can not create media encoder");
        release();
        return false;
      }
      mediaCodec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

      if (useSurface) {
        eglBase = new EglBase14(sharedContext, EglBase.CONFIG_RECORDABLE);
        // Create an input surface and keep a reference since we must release the surface when done.
        inputSurface = mediaCodec.createInputSurface();
        eglBase.createSurface(inputSurface);
        drawer = new GlRectDrawer();
      }
      mediaCodec.start();
      outputBuffers = mediaCodec.getOutputBuffers();
      Logging.d(TAG, "Output buffers: " + outputBuffers.length);

    } catch (IllegalStateException e) {
      Logging.e(TAG, "initEncode failed", e);
      release();
      return false;
    }
    return true;
  }

MediaCodecVideoEncoder::RegisterEncodeCompleteCallback is defined as follows; as you can see, the callback_ member is a VCMEncodedFrameCallback object.

int32_t MediaCodecVideoEncoder::RegisterEncodeCompleteCallback(
    EncodedImageCallback* callback) {
  RTC_DCHECK_CALLED_SEQUENTIALLY(&encoder_queue_checker_);
  JNIEnv* jni = AttachCurrentThreadIfNeeded();
  ScopedLocalRefFrame local_ref_frame(jni);
  callback_ = callback;
  return WEBRTC_VIDEO_CODEC_OK;
}

Back in WebRtcVideoSendStream::RecreateWebRtcStream, VideoSendStream::SetSource is called to set the source. This ends up in VideoSourceProxy::SetSource, which calls WebRtcVideoSendStream's AddOrUpdateSink to register the VideoStreamEncoder sink on the source, so that the source's frames are distributed to the VideoStreamEncoder for encoding.
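The sink registration just described follows a simple broadcaster pattern: a source keeps a list of sinks and fans each captured frame out to all of them, and the encoder is just one registered sink. A minimal sketch with illustrative names only:

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

struct Frame { int id; };

class FrameSink {
 public:
  virtual ~FrameSink() = default;
  virtual void OnFrame(const Frame& frame) = 0;
};

// Plays the VideoBroadcaster role: AddOrUpdateSink registers a sink once,
// DeliverFrame distributes a frame to every registered sink.
class Broadcaster {
 public:
  void AddOrUpdateSink(FrameSink* sink) {
    if (std::find(sinks_.begin(), sinks_.end(), sink) == sinks_.end())
      sinks_.push_back(sink);
  }
  void DeliverFrame(const Frame& frame) {
    for (FrameSink* sink : sinks_) sink->OnFrame(frame);
  }
 private:
  std::vector<FrameSink*> sinks_;
};

class CountingSink : public FrameSink {  // stands in for the encoder sink
 public:
  void OnFrame(const Frame&) override { ++frames; }
  int frames = 0;
};

inline int RunBroadcastDemo() {
  Broadcaster source;
  CountingSink encoder_sink;
  source.AddOrUpdateSink(&encoder_sink);
  source.AddOrUpdateSink(&encoder_sink);  // duplicate add is a no-op
  source.DeliverFrame(Frame{1});
  source.DeliverFrame(Frame{2});
  return encoder_sink.frames;
}
```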

Summary

This post analyzed the initialization flow of the WebRTC video encoding module. The flow creates a series of related objects and wires up the encoder's input and output, with VideoStreamEncoder bridging the two sides. On the input side, VideoStreamEncoder is registered with the VideoTrack to receive frames, acting as a VideoSinkInterface. On the output side, VCMEncodedFrameCallback is registered with MediaCodecVideoEncoder, and the encoded bitstream flows back through VideoStreamEncoder to VideoSendStreamImpl, which handles packetization and transmission; here VideoStreamEncoder acts as an EncodedImageCallback.
