Introduction
The MTK HAL algorithm integration series consists of three articles:
- MTK HAL Algorithm Integration: Single-Frame Algorithms
- MTK HAL Algorithm Integration: Multi-Frame Algorithms
- MTK HAL Algorithm Integration: Dual-Camera Algorithms
The whole series is based on Android 9.0 and the MT6763 platform, with HAL3 as the HAL version.
This article is the last of the series and covers dual-camera algorithm integration. Topics such as algorithm categories and algorithm evaluation are not repeated here; if you need them, see "MTK HAL Algorithm Integration: Single-Frame Algorithms".
1. Overview of dual-camera algorithms
Dual-camera algorithms are considerably more complex than single-frame and multi-frame algorithms. Whether used for night shots, HDR, or bokeh (depth/portrait/large aperture), a dual-camera algorithm generally needs synchronized frames from the main and auxiliary cameras. In addition, since every pair of camera modules differs slightly, a dedicated calibration program is usually developed and run on the factory production line. The calibration program writes the calibration parameters (i.e., the calibration result) to a partition that is not easily erased (such as the NV partition). At capture time, the dual-camera algorithm uses the calibration parameters to compensate for module differences, computes quantities such as depth and exposure from the main and auxiliary images, and then applies those results to the main image to produce effects such as enhanced night shots, HDR, and background blur (depth/portrait/large aperture).
From the integration point of view, there are generally two parts:
- Integrating the calibration program: including the calibration APP and its SELinux permissions.
- Integrating the dual-camera algorithm itself: as with single-frame and multi-frame algorithms, choose the appropriate feature, implement the corresponding plugin, and hook up the algorithm.
Since preloading the calibration APP is straightforward, it is not covered here; for configuring the calibration APP's SELinux permissions, see my other article on SELinux permissions.
Because I cannot provide a real dual-camera algorithm, I again provide a mock algorithm library, just as in the single-frame article. The mock library stitches the main and auxiliary images together, pasting the auxiliary image into the middle of the main image to produce a picture-in-picture effect.
2. Selecting a feature and configuring the feature table
2.1 Selecting a feature
Dual-camera algorithms are very common, and MTK already ships a number of dual-camera features. Roughly, the following features are intended for dual-camera use:
vendor/mediatek/proprietary/hardware/mtkcam3/include/mtkcam3/3rdparty/mtk/mtk_feature_type.h:
MTK_FEATURE_DEPTH = 1ULL << 8,
MTK_FEATURE_BOKEH = 1ULL << 9,
MTK_FEATURE_VSDOF = (MTK_FEATURE_DEPTH|MTK_FEATURE_BOKEH),
MTK_FEATURE_DUAL_YUV = 1ULL << 14,
MTK_FEATURE_DUAL_HWDEPTH = 1ULL << 15,
Among these, MTK_FEATURE_DEPTH and MTK_FEATURE_BOKEH are used for dual-camera bokeh, with depth computation and blur processing performed at two separate mount points.
vendor/mediatek/proprietary/hardware/mtkcam3/include/mtkcam3/3rdparty/customer/customer_feature_type.h:
TP_FEATURE_DEPTH = 1ULL << 37,
TP_FEATURE_BOKEH = 1ULL << 38,
TP_FEATURE_VSDOF = (TP_FEATURE_DEPTH|TP_FEATURE_BOKEH),
TP_FEATURE_FUSION = 1ULL << 39,
TP_FEATURE_HDR_DC = 1ULL << 40,
TP_FEATURE_DUAL_YUV = 1ULL << 41,
TP_FEATURE_DUAL_HWDEPTH = 1ULL << 42,
TP_FEATURE_PUREBOKEH = 1ULL << 43,
These are the features defined on the customer side. TP_FEATURE_DEPTH and TP_FEATURE_BOKEH are likewise used for dual-camera bokeh, again with depth computation and blur processing done in two separate nodes. TP_FEATURE_FUSION and TP_FEATURE_PUREBOKEH are also intended for dual-camera bokeh, but they perform depth computation and blur processing at a single mount point. TP_FEATURE_HDR_DC is for dual-camera HDR algorithms.
Judging by MTK's design intent, MTK_FEATURE_DUAL_YUV and TP_FEATURE_DUAL_YUV should also be usable for dual-camera algorithms, but I have not tried them; I normally use TP_FEATURE_FUSION or TP_FEATURE_PUREBOKEH. Feel free to experiment if you are interested.
Since MTK has these features predefined, we just pick the one that fits and do not need to add a new feature. Because this is a third-party algorithm, we choose TP_FEATURE_PUREBOKEH.
2.2 Configuring the feature table
In the previous step we chose TP_FEATURE_PUREBOKEH, and MTK has thoughtfully already defined MTK_FEATURE_COMBINATION_TP_PUREBOKEH in vendor/mediatek/proprietary/hardware/mtkcam3/3rdparty/mtk/mtk_scenario_mgr.cpp, so that definition step is saved as well. We only need to add MTK_FEATURE_COMBINATION_TP_PUREBOKEH to the corresponding MTK_CAMERA_SCENARIO_CAPTURE_DUALCAM scenario. Since we have no other dual-camera algorithm here, the other two lines are commented out:
diff --git a/vendor/mediatek/proprietary/hardware/mtkcam3/3rdparty/mtk/mtk_scenario_mgr.cpp b/vendor/mediatek/proprietary/hardware/mtkcam3/3rdparty/mtk/mtk_scenario_mgr.cpp
index 38365e0602..7adc2a76db 100755
--- a/vendor/mediatek/proprietary/hardware/mtkcam3/3rdparty/mtk/mtk_scenario_mgr.cpp
+++ b/vendor/mediatek/proprietary/hardware/mtkcam3/3rdparty/mtk/mtk_scenario_mgr.cpp
@@ -363,8 +363,9 @@ const std::vector<std::unordered_map<int32_t, ScenarioFeatures>> gMtkScenarioFe
CAMERA_SCENARIO_END
//
CAMERA_SCENARIO_START(MTK_CAMERA_SCENARIO_CAPTURE_DUALCAM)
- ADD_CAMERA_FEATURE_SET(MTK_FEATURE_MFNR, MTK_FEATURE_COMBINATION_TP_VSDOF_MFNR)
- ADD_CAMERA_FEATURE_SET(NO_FEATURE_NORMAL, MTK_FEATURE_COMBINATION_TP_VSDOF)
+ //ADD_CAMERA_FEATURE_SET(MTK_FEATURE_MFNR, MTK_FEATURE_COMBINATION_TP_VSDOF_MFNR)
+ //ADD_CAMERA_FEATURE_SET(NO_FEATURE_NORMAL, MTK_FEATURE_COMBINATION_TP_VSDOF)
+ ADD_CAMERA_FEATURE_SET(NO_FEATURE_NORMAL, MTK_FEATURE_COMBINATION_TP_PUREBOKEH)
CAMERA_SCENARIO_END
//
CAMERA_SCENARIO_START(MTK_CAMERA_SCENARIO_CAPTURE_CSHOT)
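For reference, MTK_FEATURE_COMBINATION_TP_PUREBOKEH is defined in the same mtk_scenario_mgr.cpp and, like MTK_FEATURE_VSDOF above, is simply a bitwise OR of feature bits: TP_FEATURE_PUREBOKEH plus whatever single-frame post-processing features the platform enables by default. The line below is only my illustration of the shape; the actual members vary per platform and version, so read the real definition rather than copying this:
// Shape only (illustration) - the real macro in mtk_scenario_mgr.cpp lists the
// platform's default single-frame features alongside TP_FEATURE_PUREBOKEH.
#define MTK_FEATURE_COMBINATION_TP_PUREBOKEH_EXAMPLE (MTK_FEATURE_NR | TP_FEATURE_PUREBOKEH)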
Note:
On the Android 9.0 code base the feature table is keyed by camera id, so this configuration must be made in the MTK_CAMERA_SCENARIO_CAPTURE_DUALCAM entry under openId = 4.
As a side note, on a phone with four cameras the logical camera ids are usually assigned as follows:
- 0: rear main camera
- 1: front main camera
- 2: rear auxiliary camera
- 3: rear wide-angle camera
- 4: dual-camera mode, with cameras 0 and 2 opened together
Phones with five or six cameras are already on the market, and some ship multiple dual-camera combinations, e.g. main + auxiliary for bokeh, wide-angle + telephoto, or main + macro; some even have dual front cameras. I have not worked on a project with multiple dual-camera modes or with a front dual camera, and every company, even every project, may differ, so the list above is not necessarily complete or accurate; corrections and additions from readers who know more are welcome.
3. Hooking up the algorithm
3.1 Choosing a plugin for the algorithm
In vendor/mediatek/proprietary/hardware/mtkcam3/include/mtkcam3/3rdparty/plugin/PipelinePluginType.h, MTK HAL3 divides the mount points for third-party algorithms roughly into the following categories:
- BokehPlugin: mount point for the bokeh (blur) part of a dual-camera depth algorithm.
- DepthPlugin: mount point for the depth-computation part of a dual-camera depth algorithm.
- FusionPlugin: mount point for a combined dual-camera depth algorithm, where depth and bokeh are handled by one algorithm.
- JoinPlugin: mount point for streaming-related algorithms; preview algorithms are mounted here.
- MultiFramePlugin: mount point for multi-frame algorithms, both YUV and RAW, e.g. MFNR/HDR.
- RawPlugin: mount point for RAW algorithms, e.g. remosaic.
- YuvPlugin: mount point for single-frame YUV algorithms, e.g. beautification or wide-angle lens distortion correction.
Pick the plugin that matches the algorithm you are integrating. The mock library here handles the whole dual-camera processing at a single mount point, so we choose FusionPlugin.
3.2 Adding a global build switch
To control whether a given project integrates this algorithm, add a switch to device/mediateksample/[platform]/ProjectConfig.mk that gates the build of the newly added algorithm:
QXT_DUALCAMERA_SUPPORT = yes
When a project does not need the algorithm, simply set QXT_DUALCAMERA_SUPPORT to no in device/mediateksample/[platform]/ProjectConfig.mk.
3.3 Writing the integration files
vendor/mediatek/proprietary/hardware/mtkcam3/3rdparty/customer/cp_dualcamera/
├── Android.mk
├── DualCameraCapture.cpp
├── include
│ └── dual_camera.h
└── lib
├── arm64-v8a
│ └── libdualcamera.so
└── armeabi-v7a
└── libdualcamera.so
File descriptions:
Android.mk configures the algorithm library, the header file, and the integration source file DualCameraCapture.cpp, and builds them into the library libmtkcam.plugin.tp_dc for libmtkcam_3rdparty.customer to depend on.
libdualcamera.so stitches the main and auxiliary images into a single picture-in-picture image; it stands in for the third-party dual-camera algorithm library to be integrated. dual_camera.h is its header file.
DualCameraCapture.cpp is the integration source file.
3.3.1 mtkcam3/3rdparty/customer/cp_dualcamera/Android.mk
ifeq ($(QXT_DUALCAMERA_SUPPORT),yes)
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
LOCAL_MODULE := libdualcamera
LOCAL_SRC_FILES_32 := lib/armeabi-v7a/libdualcamera.so
LOCAL_SRC_FILES_64 := lib/arm64-v8a/libdualcamera.so
LOCAL_MODULE_TAGS := optional
LOCAL_MODULE_CLASS := SHARED_LIBRARIES
LOCAL_MODULE_SUFFIX := .so
LOCAL_PROPRIETARY_MODULE := true
LOCAL_MULTILIB := both
include $(BUILD_PREBUILT)
################################################################################
#
################################################################################
include $(CLEAR_VARS)
#-----------------------------------------------------------
-include $(TOP)/$(MTK_PATH_SOURCE)/hardware/mtkcam/mtkcam.mk
#-----------------------------------------------------------
LOCAL_SRC_FILES += DualCameraCapture.cpp
#-----------------------------------------------------------
LOCAL_C_INCLUDES += $(MTKCAM_C_INCLUDES)
LOCAL_C_INCLUDES += $(TOP)/$(MTK_PATH_SOURCE)/hardware/mtkcam3/include $(MTK_PATH_SOURCE)/hardware/mtkcam/include
LOCAL_C_INCLUDES += $(TOP)/$(MTK_PATH_COMMON)/hal/inc
LOCAL_C_INCLUDES += $(TOP)/$(MTK_PATH_CUSTOM_PLATFORM)/hal/inc
#
LOCAL_C_INCLUDES += system/media/camera/include
LOCAL_C_INCLUDES += $(TOP)/$(MTK_PATH_SOURCE)/hardware/mtkcam3/3rdparty/customer/cp_dualcamera/include
#-----------------------------------------------------------
LOCAL_CFLAGS += $(MTKCAM_CFLAGS)
#
#-----------------------------------------------------------
LOCAL_STATIC_LIBRARIES +=
#
LOCAL_WHOLE_STATIC_LIBRARIES +=
#-----------------------------------------------------------
LOCAL_SHARED_LIBRARIES += liblog
LOCAL_SHARED_LIBRARIES += libutils
LOCAL_SHARED_LIBRARIES += libcutils
LOCAL_SHARED_LIBRARIES += libmtkcam_metadata
LOCAL_SHARED_LIBRARIES += libmtkcam_imgbuf
#LOCAL_SHARED_LIBRARIES += libmtkcam_3rdparty
#-----------------------------------------------------------
LOCAL_HEADER_LIBRARIES := libutils_headers liblog_headers libhardware_headers
#-----------------------------------------------------------
LOCAL_MODULE := libmtkcam.plugin.tp_dc
LOCAL_PROPRIETARY_MODULE := true
LOCAL_MODULE_OWNER := mtk
LOCAL_MODULE_TAGS := optional
include $(MTK_STATIC_LIBRARY)
################################################################################
#
################################################################################
include $(call all-makefiles-under,$(LOCAL_PATH))
endif
3.3.2 mtkcam3/3rdparty/customer/cp_dualcamera/include/dual_camera.h
#ifndef QXT_DUAL_CAMERA_H
#define QXT_DUAL_CAMERA_H
typedef unsigned char uchar;
// Paste position of the auxiliary image inside the main image
// (the value stored in the mock calibration file).
#define CENTER 0
#define LEFT_TOP 1
#define LEFT_BOTTOM 2
#define RIGHT_TOP 3
#define RIGHT_BOTTOM 4
class DualCamera {
public:
DualCamera();
~DualCamera();
void processI420(uchar *main, int mainWidth, int mainHeight,
uchar *sub, int subWidth, int subHeight);
void processI420(uchar *mainY, uchar *mainU, uchar *mainV, int mainWidth, int mainHeight,
uchar *subY, uchar *subU, uchar *subV, int subWidth, int subHeight);
void processNV21(uchar *main, int mainWidth, int mainHeight,
uchar *sub, int subWidth, int subHeight);
void processNV21(uchar *mainY, uchar *mainUV, int mainWidth, int mainHeight,
uchar *subY, uchar *subUV, int subWidth, int subHeight);
private:
int position;
};
#endif //QXT_DUAL_CAMERA_H
The interface functions in the header:
- DualCamera: the constructor. It simulates reading a calibration parameter file; the mock calibration file contains nothing but a single number that selects the position of the auxiliary image (a small helper that generates this file is sketched right after this list).
- processI420: stitches the main and auxiliary images into a picture-in-picture image; input and output must be in I420 format.
- processNV21: stitches the main and auxiliary images into a picture-in-picture image; input and output must be in NV21 format.
- ~DualCamera(): the destructor; it does nothing of note.
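Since the mock calibration file is just a raw binary int, it is easy to generate one on a host PC for testing. The helper below is only a sketch of that idea; the file name and the push step are my own suggestion, not part of the original project:
// make_calibration.cpp - writes the mock "calibration" file used by the demo library.
// Build on a host PC together with dual_camera.h, then push the output to
// /vendor/persist/camera/calibration.cfg on the device.
#include <cstdio>
#include "dual_camera.h"  // for the CENTER / LEFT_TOP / ... position constants

int main() {
    int position = RIGHT_BOTTOM;                      // pick any of the five positions
    std::FILE *fp = std::fopen("calibration.cfg", "wb");
    if (fp == nullptr) {
        std::perror("fopen");
        return 1;
    }
    std::fwrite(&position, sizeof(position), 1, fp);  // matches the fread in DualCamera()
    std::fclose(fp);
    return 0;
}
Without this file on the device, the constructor falls back to CENTER, as the implementation below shows.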
For anyone interested, the implementation file dual_camera.cpp is included as well:
#include <cstring>
#include <cstdio>
#include "dual_camera.h"
#include "logger.h"
using namespace std;
DualCamera::DualCamera() {
    const char *path = "/vendor/persist/camera/calibration.cfg";
    FILE *fp;
    if ((fp = fopen(path, "r")) != nullptr) {
        int value = CENTER;
        if (fread(&value, sizeof(value), 1, fp) == 1) {
            position = value;
        } else {
            LOGE("Failed to read: %s", path);
            position = CENTER;
        }
        fclose(fp);  // close the file; also avoids the small heap allocation the original code leaked
    } else {
        LOGE("Failed to open: %s", path);
        position = CENTER;
    }
}
DualCamera::~DualCamera() = default;
void DualCamera::processI420(uchar *main, int mainWidth, int mainHeight,
uchar *sub, int subWidth, int subHeight) {
uchar *mainY = main;
uchar *mainU = main + mainWidth * mainHeight;
uchar *mainV = main + mainWidth * mainHeight * 5 / 4;
uchar *subY = sub;
uchar *subU = sub + subWidth * subHeight;
uchar *subV = sub + subWidth * subHeight * 5 / 4;
processI420(mainY, mainU, mainV, mainWidth, mainHeight, subY, subU, subV, subWidth, subHeight);
}
void
DualCamera::processI420(uchar *mainY, uchar *mainU, uchar *mainV, int mainWidth, int mainHeight,
uchar *subY, uchar *subU, uchar *subV, int subWidth, int subHeight) {
int mainUVHeight = mainHeight / 2;
int mainUVWidth = mainWidth / 2;
int subUVHeight = subHeight / 2;
int subUVWidth = subWidth / 2;
//merge
unsigned char *pDstY;
unsigned char *pSrcY;
for (int i = 0; i < subHeight; i++) {
pSrcY = subY + i * subWidth;
if (position == LEFT_TOP) {
pDstY = mainY + i * mainWidth;
} else if (position == LEFT_BOTTOM) {
pDstY = mainY + i * mainWidth + ((mainHeight - subHeight) * mainWidth);
} else if (position == RIGHT_TOP) {
pDstY = mainY + i * mainWidth + (mainWidth - subWidth);
} else if (position == RIGHT_BOTTOM) {
pDstY = mainY + i * mainWidth + ((mainHeight - subHeight) * mainWidth) +
(mainWidth - subWidth);
} else if (position == CENTER) {
pDstY = mainY + i * mainWidth + ((mainHeight - subHeight) / 2 * mainWidth) +
(mainWidth - subWidth) / 2;
} else {
LOGE("Unsupported position: %d", position);
return;
}
memcpy(pDstY, pSrcY, subWidth);
}
unsigned char *pDstU;
unsigned char *pDstV;
unsigned char *pSrcU;
unsigned char *pSrcV;
for (int i = 0; i < subUVHeight; i++) {
pSrcU = subU + i * subUVWidth;
pSrcV = subV + i * subUVWidth;
if (position == LEFT_TOP) {
pDstU = mainU + i * mainUVWidth;
pDstV = mainV + i * mainUVWidth;
} else if (position == LEFT_BOTTOM) {
pDstU = mainU + ((mainUVHeight - subUVHeight) * mainUVWidth) + i * mainUVWidth;
pDstV = mainV + ((mainUVHeight - subUVHeight) * mainUVWidth) + i * mainUVWidth;
} else if (position == RIGHT_TOP) {
pDstU = mainU + i * mainUVWidth + mainUVWidth - subUVWidth;
pDstV = mainV + i * mainUVWidth + mainUVWidth - subUVWidth;
} else if (position == RIGHT_BOTTOM) {
pDstU = mainU + ((mainUVHeight - subUVHeight) * mainUVWidth) +
i * mainUVWidth + (mainUVWidth - subUVWidth);
pDstV = mainV + ((mainUVHeight - subUVHeight) * mainUVWidth) +
i * mainUVWidth + (mainUVWidth - subUVWidth);
} else if (position == CENTER) {
pDstU = mainU + ((mainUVHeight - subUVHeight) / 2 * mainUVWidth) +
i * mainUVWidth + (mainUVWidth - subUVWidth) / 2;
pDstV = mainV + ((mainUVHeight - subUVHeight) / 2 * mainUVWidth) +
i * mainUVWidth + (mainUVWidth - subUVWidth) / 2;
} else {
LOGE("Unsupported position: %d", position);
return;
}
memcpy(pDstU, pSrcU, subUVWidth);
memcpy(pDstV, pSrcV, subUVWidth);
}
}
void DualCamera::processNV21(uchar *main, int mainWidth, int mainHeight,
uchar *sub, int subWidth, int subHeight) {
uchar *mainY = main;
uchar *mainUV = main + mainWidth * mainHeight;
uchar *subY = sub;
uchar *subUV = sub + subWidth * subHeight;
processNV21(mainY, mainUV, mainWidth, mainHeight, subY, subUV, subWidth, subHeight);
}
void DualCamera::processNV21(uchar *mainY, uchar *mainUV, int mainWidth, int mainHeight,
uchar *subY, uchar *subUV, int subWidth, int subHeight) {
LOGD("[processNV21] mainY:%p, mainUV:%p, mainWidth:%d, mainHeight:%d, subY:%p, subUV:%p, subWidth:%d, subHeight:%d, position:%d",
mainY, mainUV, mainWidth, mainHeight, subY, subUV, subWidth, subHeight, position);
int mainUVHeight = mainHeight / 2;
int mainUVWidth = mainWidth / 2;
unsigned char *pDstY;
unsigned char *pSrcY;
for (int i = 0; i < subHeight; i++) {
pSrcY = subY + i * subWidth;
if (position == LEFT_TOP) {
pDstY = mainY + i * mainWidth;
} else if (position == LEFT_BOTTOM) {
pDstY = mainY + i * mainWidth + ((mainHeight - subHeight) * mainWidth);
} else if (position == RIGHT_TOP) {
pDstY = mainY + i * mainWidth + (mainWidth - subWidth);
} else if (position == RIGHT_BOTTOM) {
pDstY = mainY + i * mainWidth + ((mainHeight - subHeight) * mainWidth) +
(mainWidth - subWidth);
} else if (position == CENTER) {
pDstY = mainY + i * mainWidth + ((mainHeight - subHeight) / 2 * mainWidth) +
(mainWidth - subWidth) / 2;
} else {
LOGE("Unsupported position: %d", position);
return;
}
memcpy(pDstY, pSrcY, subWidth);
}
int subUVHeight = subHeight / 2;
int subUVWidth = subWidth / 2;
unsigned char *pDstUV;
unsigned char *pSrcUV;
for (int i = 0; i < subUVHeight; i++) {
pSrcUV = subUV + i * subUVWidth * 2;
if (position == LEFT_TOP) {
pDstUV = mainUV + i * mainUVWidth * 2;
} else if (position == LEFT_BOTTOM) {
pDstUV = mainUV + ((mainUVHeight - subUVHeight) * mainUVWidth + i * mainUVWidth) * 2;
} else if (position == RIGHT_TOP) {
pDstUV = mainUV + (i * mainUVWidth + mainUVWidth - subUVWidth) * 2;
} else if (position == RIGHT_BOTTOM) {
pDstUV = mainUV + ((mainUVHeight - subUVHeight) * mainUVWidth +
i * mainUVWidth + mainUVWidth - subUVWidth) * 2;
} else if (position == CENTER) {
pDstUV = mainUV + ((mainUVHeight - subUVHeight) / 2 * mainUVWidth +
i * mainUVWidth + (mainUVWidth - subUVWidth) / 2) * 2;
} else {
LOGE("Unsupported position: %d", position);
return;
}
memcpy(pDstUV, pSrcUV, subUVWidth * 2);
}
}
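Before hooking the library into the HAL, it can be worth sanity-checking it off-device. The snippet below is only a sketch: it assumes you compile dual_camera.cpp on a host PC and supply a trivial logger.h that maps LOGD/LOGE onto printf; the buffer sizes are arbitrary test values.
// host_test.cpp - quick host-side check of the picture-in-picture stitching.
#include <cstdio>
#include <vector>
#include "dual_camera.h"

int main() {
    const int mainW = 640, mainH = 480;   // main (output) frame
    const int subW  = 320, subH  = 240;   // auxiliary frame pasted into it
    // NV21 layout: Y plane (w*h bytes) followed by an interleaved VU plane (w*h/2 bytes)
    std::vector<uchar> mainImg(mainW * mainH * 3 / 2, 0x10);
    std::vector<uchar> subImg(subW * subH * 3 / 2, 0x80);

    DualCamera dc;  // no calibration file on the host, so it falls back to CENTER
    dc.processNV21(mainImg.data(), mainW, mainH, subImg.data(), subW, subH);

    // The center of the main image should now carry the auxiliary image's Y value.
    std::printf("center Y = 0x%02x (expect 0x80)\n",
                (unsigned)mainImg[(mainH / 2) * mainW + mainW / 2]);
    return 0;
}
On the device, the LOGD line at the top of processNV21 serves a similar purpose.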
3.3.3 mtkcam3/3rdparty/customer/cp_dualcamera/DualCameraCapture.cpp
#define LOG_TAG "DualCamera"
// Standard C header file
#include <stdlib.h>
#include <time.h>     // time() is used to seed srand() in init()
#include <chrono>
#include <random>
#include <thread>
// Android system/core header file
#include <cutils/properties.h>  // property_get_int32() used in the constructor
// mtkcam custom header file
// mtkcam global header file
#include <mtkcam/utils/std/Log.h>
// Module header file
#include <mtkcam/drv/iopipe/SImager/IImageTransform.h>
#include <mtkcam/utils/metastore/IMetadataProvider.h>
#include <mtkcam3/3rdparty/plugin/PipelinePlugin.h>
#include <mtkcam3/3rdparty/plugin/PipelinePluginType.h>
//
#include <mtkcam/utils/metadata/client/mtk_metadata_tag.h>
#include <mtkcam/utils/metadata/hal/mtk_platform_metadata_tag.h>
// Local header file
#include <dual_camera.h>
using namespace NSCam;
using namespace android;
using namespace std;
using namespace NSCam::NSPipelinePlugin;
/******************************************************************************
*
******************************************************************************/
#define MY_LOGV(fmt, arg...) CAM_LOGV("(%d)[%s] " fmt, ::gettid(), __FUNCTION__, ##arg)
#define MY_LOGD(fmt, arg...) CAM_LOGD("(%d)[%s] " fmt, ::gettid(), __FUNCTION__, ##arg)
#define MY_LOGI(fmt, arg...) CAM_LOGI("(%d)[%s] " fmt, ::gettid(), __FUNCTION__, ##arg)
#define MY_LOGW(fmt, arg...) CAM_LOGW("(%d)[%s] " fmt, ::gettid(), __FUNCTION__, ##arg)
#define MY_LOGE(fmt, arg...) CAM_LOGE("(%d)[%s] " fmt, ::gettid(), __FUNCTION__, ##arg)
//
#define MY_LOGV_IF(cond, ...) do { if( (cond) ) { MY_LOGV(__VA_ARGS__); } }while(0)
#define MY_LOGD_IF(cond, ...) do { if( (cond) ) { MY_LOGD(__VA_ARGS__); } }while(0)
#define MY_LOGI_IF(cond, ...) do { if( (cond) ) { MY_LOGI(__VA_ARGS__); } }while(0)
#define MY_LOGW_IF(cond, ...) do { if( (cond) ) { MY_LOGW(__VA_ARGS__); } }while(0)
#define MY_LOGE_IF(cond, ...) do { if( (cond) ) { MY_LOGE(__VA_ARGS__); } }while(0)
/*******************************************************************************
* MACRO Utilities Define.
********************************************************************************/
namespace { // anonymous namespace for debug MARCO function
using AutoObject = std::unique_ptr<const char, std::function<void(const char*)>>;
//
auto
createAutoScoper(const char* funcName) -> AutoObject
{
CAM_LOGD("[%s] +", funcName);
return AutoObject(funcName, [](const char* p)
{
CAM_LOGD("[%s] -", p);
});
}
#define SCOPED_TRACER() auto scoped_tracer = ::createAutoScoper(__FUNCTION__)
//
auto
createAutoTimer(const char* funcName, const char* text) -> AutoObject
{
using Timing = std::chrono::time_point<std::chrono::high_resolution_clock>;
using DuationTime = std::chrono::duration<float, std::milli>;
Timing startTime = std::chrono::high_resolution_clock::now();
return AutoObject(text, [funcName, startTime](const char* p)
{
Timing endTime = std::chrono::high_resolution_clock::now();
DuationTime duationTime = endTime - startTime;
CAM_LOGD("[%s] %s, elapsed(ms):%.4f",funcName, p, duationTime.count());
});
}
#define AUTO_TIMER(TEXT) auto auto_timer = ::createAutoTimer(__FUNCTION__, TEXT)
//
#define UNREFERENCED_PARAMETER(param) (param)
//
} // end anonymous namespace for debug MARCO function
/*******************************************************************************
* Alias.
********************************************************************************/
using namespace NSCam;
using namespace NSCam::NSPipelinePlugin;
using namespace NSCam::NSIoPipe::NSSImager;
//++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
// Type Alias..
//++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
using Property = FusionPlugin::Property;
using Selection = FusionPlugin::Selection;
using RequestPtr = FusionPlugin::Request::Ptr;
using RequestCallbackPtr = FusionPlugin::RequestCallback::Ptr;
//
template<typename T>
using AutoPtr = std::unique_ptr<T, std::function<void(T*)>>;
//
using ImgPtr = AutoPtr<IImageBuffer>;
using MetaPtr = AutoPtr<IMetadata>;
using ImageTransformPtr = AutoPtr<IImageTransform>;
/*******************************************************************************
* Namespace Start.
********************************************************************************/
namespace { // anonymous namespace
/*******************************************************************************
* Class Definition
********************************************************************************/
/**
* @brief third party pure bokeh algo. provider
*/
class DualCameraCapture final: public FusionPlugin::IProvider
{
//++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
// Instantiation.
//++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
public:
DualCameraCapture();
//++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
// FusionPlugin::IProvider Public Operations.
//++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
public:
void set(MINT32 iOpenId, MINT32 iOpenId2) override;
const Property& property() override;
MERROR negotiate(Selection& sel) override;
void init() override;
MERROR process(RequestPtr requestPtr, RequestCallbackPtr callbackPtr) override;
void abort(vector<RequestPtr>& requestPtrs) override;
void uninit() override;
~DualCameraCapture();
//++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
// DualCameraCapture Private Operator.
//++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
private:
MERROR processDone(const RequestPtr& requestPtr, const RequestCallbackPtr& callbackPtr, MERROR status);
//++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
// DualCameraCapture Private Data Members.
//++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
private:
MINT32 mEnable;
//
MINT32 mOpenId;
MINT32 mOpenId2;
MINT32 mDump;
DualCamera* mDualCamera = NULL;
};
REGISTER_PLUGIN_PROVIDER(Fusion, DualCameraCapture);
/**
* @brief utility class
*/
class DualCameraUtility final
{
//++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
// Instantiation.
//++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
public:
DualCameraUtility() = delete;
//++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
// DualCameraUtility Public Operations.
//++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
public:
static inline ImageTransformPtr createImageTransformPtr();
static inline ImgPtr createImgPtr(BufferHandle::Ptr& hangle);
static inline MetaPtr createMetaPtr(MetadataHandle::Ptr& hangle);
static inline MVOID dump(const IImageBuffer* pImgBuf, const std::string& dumpName);
static inline MVOID dump(IMetadata* pMetaData, const std::string& dumpName);
static inline const char * format2String(MINT format);
static inline MVOID saveImg(NSCam::IImageBuffer* pImgBuf, const std::string& fileName);
};
//++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
// DualCameraUtility implementation.
//+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
ImageTransformPtr
DualCameraUtility::
createImageTransformPtr()
{
return ImageTransformPtr(IImageTransform::createInstance(), [](IImageTransform *p)
{
p->destroyInstance();
});
}
ImgPtr
DualCameraUtility::
createImgPtr(BufferHandle::Ptr& hangle)
{
return ImgPtr(hangle->acquire(), [hangle](IImageBuffer* p)
{
UNREFERENCED_PARAMETER(p);
hangle->release();
});
};
MetaPtr
DualCameraUtility::
createMetaPtr(MetadataHandle::Ptr& hangle)
{
return MetaPtr(hangle->acquire(), [hangle](IMetadata* p)
{
UNREFERENCED_PARAMETER(p);
hangle->release();
});
};
MVOID
DualCameraUtility::
dump(const IImageBuffer* pImgBuf, const std::string& dumpName)
{
MY_LOGD("dump image info, dumpName:%s, info:[a:%p, si:%dx%d, st:%zu, f:0x%x, va:%p]",
dumpName.c_str(), pImgBuf,
pImgBuf->getImgSize().w, pImgBuf->getImgSize().h,
pImgBuf->getBufStridesInBytes(0),
pImgBuf->getImgFormat(),
reinterpret_cast<void*>(pImgBuf->getBufVA(0)));
}
MVOID
DualCameraUtility::
dump(IMetadata* pMetaData, const std::string& dumpName)
{
MY_LOGD("dump meta info, dumpName:%s, addr::%p, count:%u",
dumpName.c_str(), pMetaData, pMetaData->count());
}
MVOID
DualCameraUtility::
saveImg(NSCam::IImageBuffer* pImgBuf, const std::string& fileName)
{
char path[256];
snprintf(path, sizeof(path), "/data/vendor/camera_dump/%s_%zu_%dx%d.%s", fileName.c_str(), pImgBuf->getBufStridesInBytes(0),
pImgBuf->getImgSize().w, pImgBuf->getImgSize().h, format2String(pImgBuf->getImgFormat()));
pImgBuf->saveToFile(path);
}
const char*
DualCameraUtility::
format2String(MINT format) {
switch(format) {
case NSCam::eImgFmt_RGBA8888: return "rgba";
case NSCam::eImgFmt_RGB888: return "rgb";
case NSCam::eImgFmt_RGB565: return "rgb565";
case NSCam::eImgFmt_STA_BYTE: return "byte";
case NSCam::eImgFmt_YVYU: return "yvyu";
case NSCam::eImgFmt_UYVY: return "uyvy";
case NSCam::eImgFmt_VYUY: return "vyuy";
case NSCam::eImgFmt_YUY2: return "yuy2";
case NSCam::eImgFmt_YV12: return "yv12";
case NSCam::eImgFmt_YV16: return "yv16";
case NSCam::eImgFmt_NV16: return "nv16";
case NSCam::eImgFmt_NV61: return "nv61";
case NSCam::eImgFmt_NV12: return "nv12";
case NSCam::eImgFmt_NV21: return "nv21";
case NSCam::eImgFmt_I420: return "i420";
case NSCam::eImgFmt_I422: return "i422";
case NSCam::eImgFmt_Y800: return "y800";
case NSCam::eImgFmt_BAYER8: return "bayer8";
case NSCam::eImgFmt_BAYER10: return "bayer10";
case NSCam::eImgFmt_BAYER12: return "bayer12";
case NSCam::eImgFmt_BAYER14: return "bayer14";
case NSCam::eImgFmt_FG_BAYER8: return "fg_bayer8";
case NSCam::eImgFmt_FG_BAYER10: return "fg_bayer10";
case NSCam::eImgFmt_FG_BAYER12: return "fg_bayer12";
case NSCam::eImgFmt_FG_BAYER14: return "fg_bayer14";
default: return "unknown";
}
}
//++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
// DualCameraCapture implementation.
//+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
DualCameraCapture::
DualCameraCapture()
: mEnable(-1)
, mOpenId(-1)
, mOpenId2(-1)
, mDump(0)  // dump input images only when vendor.debug.camera.dualcamera.dump is set
{
// on:1/off:0/auto:-1
mEnable = ::property_get_int32("vendor.debug.camera.dualcamera.enable", mEnable);
mDump = ::property_get_int32("vendor.debug.camera.dualcamera.dump", mDump);
mDualCamera = new DualCamera();
MY_LOGD("ctor:%p, mEnable:%d", this, mEnable);
}
void
DualCameraCapture::
set(MINT32 iOpenId, MINT32 iOpenId2)
{
mOpenId = iOpenId;
mOpenId2 = iOpenId2;
MY_LOGD("set openId:%d openId2:%d", mOpenId, mOpenId2);
}
const Property&
DualCameraCapture::
property()
{
static const Property prop = []() -> const Property
{
Property ret;
ret.mName = "DualCamera";
ret.mFeatures = TP_FEATURE_PUREBOKEH;
ret.mFaceData = eFD_Cache;
ret.mBoost = eBoost_CPU;
ret.mInitPhase = ePhase_OnPipeInit;
return ret;
}();
return prop;
}
MERROR
DualCameraCapture::
negotiate(Selection& sel)
{
SCOPED_TRACER();
if( mEnable == 0 )
{
MY_LOGD("force off tp dual camera");
return BAD_VALUE;
}
// INPUT
{
sel.mIBufferFull
.setRequired(MTRUE)
.addAcceptedFormat(eImgFmt_NV21)
.addAcceptedSize(eImgSize_Full);
sel.mIBufferFull2
.setRequired(MTRUE)
.addAcceptedFormat(eImgFmt_NV21)
.addAcceptedSize(eImgSize_Full);
sel.mIMetadataApp.setRequired(MTRUE);
sel.mIMetadataHal.setRequired(MTRUE);
sel.mIMetadataHal2.setRequired(MTRUE);
sel.mIMetadataDynamic.setRequired(MTRUE);
sel.mIMetadataDynamic2.setRequired(MTRUE);
}
// OUTPUT
{
sel.mOBufferFull
.setRequired(MTRUE)
.addAcceptedFormat(eImgFmt_NV21)
.addAcceptedSize(eImgSize_Full);
sel.mOMetadataApp.setRequired(MTRUE);
sel.mOMetadataHal.setRequired(MTRUE);
}
return OK;
}
void
DualCameraCapture::
init()
{
SCOPED_TRACER();
::srand(time(nullptr));
}
MERROR
DualCameraCapture::
process(RequestPtr requestPtr, RequestCallbackPtr callbackPtr)
{
SCOPED_TRACER();
auto isValidInput = [](const RequestPtr& requestPtr) -> MBOOL
{
const MBOOL ret = requestPtr->mIBufferFull != nullptr
&& requestPtr->mIBufferFull2 != nullptr
&& requestPtr->mIMetadataApp != nullptr
&& requestPtr->mIMetadataHal != nullptr
&& requestPtr->mIMetadataHal2 != nullptr;
if( !ret )
{
MY_LOGE("invalid request with input, req:%p, inFullImg:%p, inFullImg2:%p, inAppMeta:%p, inHalMeta:%p, inHalMeta2:%p",
requestPtr.get(),
requestPtr->mIBufferFull.get(),
requestPtr->mIBufferFull2.get(),
requestPtr->mIMetadataApp.get(),
requestPtr->mIMetadataHal.get(),
requestPtr->mIMetadataHal2.get());
}
return ret;
};
auto isValidOutput = [](const RequestPtr& requestPtr) -> MBOOL
{
const MBOOL ret = requestPtr->mOBufferFull != nullptr
&& requestPtr->mOMetadataApp != nullptr
&& requestPtr->mOMetadataHal != nullptr;
if( !ret )
{
MY_LOGE("invalid request with input, req:%p, outFullImg:%p, outAppMeta:%p, outHalMeta:%p",
requestPtr.get(),
requestPtr->mOBufferFull.get(),
requestPtr->mOMetadataApp.get(),
requestPtr->mOMetadataHal.get());
}
return ret;
};
MY_LOGD("process, reqAdrr:%p", requestPtr.get());
if( !isValidInput(requestPtr) )
{
return processDone(requestPtr, callbackPtr, BAD_VALUE);
}
if( !isValidOutput(requestPtr) )
{
return processDone(requestPtr, callbackPtr, BAD_VALUE);
}
//
//
{
// note: we can just call createXXXXPtr one time for a specified handle
ImgPtr inMainImgPtr = DualCameraUtility::createImgPtr(requestPtr->mIBufferFull);
ImgPtr inSubImgPtr = DualCameraUtility::createImgPtr(requestPtr->mIBufferFull2);
ImgPtr outFSImgPtr = DualCameraUtility::createImgPtr(requestPtr->mOBufferFull);
//
MetaPtr inAppMetaPtr = DualCameraUtility::createMetaPtr(requestPtr->mIMetadataApp);
MetaPtr inMainHalMetaPtr = DualCameraUtility::createMetaPtr(requestPtr->mIMetadataHal);
MetaPtr inSubHalMetaPtr = DualCameraUtility::createMetaPtr(requestPtr->mIMetadataHal2);
MetaPtr outAppMetaPtr = DualCameraUtility::createMetaPtr(requestPtr->mOMetadataApp);
MetaPtr outHalMetaPtr = DualCameraUtility::createMetaPtr(requestPtr->mOMetadataHal);
// dump info
{
DualCameraUtility::dump(inMainImgPtr.get(), "inputMainImg");
DualCameraUtility::dump(inSubImgPtr.get(), "inputSubImg");
DualCameraUtility::dump(outFSImgPtr.get(), "outFSImg");
//
DualCameraUtility::dump(inAppMetaPtr.get(), "inAppMeta");
DualCameraUtility::dump(inMainHalMetaPtr.get(), "inMainHalMeta");
DualCameraUtility::dump(inSubHalMetaPtr.get(), "inSubHalMeta");
DualCameraUtility::dump(outAppMetaPtr.get(), "outAppMeta");
DualCameraUtility::dump(outHalMetaPtr.get(), "outHalMeta");
}
//dual camera algo
{
AUTO_TIMER("proces dual camera algo.");
NSCam::IImageBuffer* inMainImgBuf = inMainImgPtr.get();
NSCam::IImageBuffer* inSubImgBuf = inSubImgPtr.get();
NSCam::IImageBuffer* outImgBuf = outFSImgPtr.get();
if (mDump) {
DualCameraUtility::saveImg(inMainImgBuf, "inputMainImg");
DualCameraUtility::saveImg(inSubImgBuf, "inputSubImg");
}
memcpy(reinterpret_cast<uchar*>(outImgBuf->getBufVA(0)), reinterpret_cast<uchar*>(inMainImgBuf->getBufVA(0)), inMainImgBuf->getBufSizeInBytes(0));
memcpy(reinterpret_cast<uchar*>(outImgBuf->getBufVA(1)), reinterpret_cast<uchar*>(inMainImgBuf->getBufVA(1)), inMainImgBuf->getBufSizeInBytes(1));
if (mDualCamera != NULL) {
mDualCamera->processNV21(reinterpret_cast<uchar*>(outImgBuf->getBufVA(0)), reinterpret_cast<uchar*>(outImgBuf->getBufVA(1)),
outImgBuf->getImgSize().w, outImgBuf->getImgSize().h,
reinterpret_cast<uchar*>(inSubImgBuf->getBufVA(0)), reinterpret_cast<uchar*>(inSubImgBuf->getBufVA(1)),
inSubImgBuf->getImgSize().w, inSubImgBuf->getImgSize().h);
}
}
}
return processDone(requestPtr, callbackPtr, OK);
}
MERROR
DualCameraCapture::
processDone(const RequestPtr& requestPtr, const RequestCallbackPtr& callbackPtr, MERROR status)
{
SCOPED_TRACER();
MY_LOGD("process done, call complete, reqAddr:%p, callbackPtr:%p, status:%d",
requestPtr.get(), callbackPtr.get(), status);
if( callbackPtr != nullptr )
{
callbackPtr->onCompleted(requestPtr, status);
}
return OK;
}
void
DualCameraCapture::
abort(vector<RequestPtr>& requestPtrs)
{
SCOPED_TRACER();
for(auto& item : requestPtrs)
{
MY_LOGD("abort request, reqAddr:%p", item.get());
}
}
void
DualCameraCapture::
uninit()
{
SCOPED_TRACER();
}
DualCameraCapture::
~DualCameraCapture()
{
MY_LOGD("dtor:%p", this);
if (mDualCamera != NULL) {
delete mDualCamera;
mDualCamera = NULL;
}
}
} // anonymous namespace
Key functions:
- property(): sets the feature type to TP_FEATURE_PUREBOKEH and fills in the name and other attributes.
- negotiate(): configures the formats and sizes of the input and output images the algorithm needs. Note that a dual-camera algorithm has two input buffers but only one output buffer.
- process(): where the algorithm is actually plugged in; it calls the algorithm interface processNV21 to do the work.
When integrating, you can refer to MTK's sample files TPPureBokehImpl.cpp and TPFusionImpl.cpp.
3.3.4 mtkcam3/3rdparty/customer/Android.mk
The shared library ultimately needed in vendor.img is libmtkcam_3rdparty.customer.so, so we also have to modify Android.mk to make the libmtkcam_3rdparty.customer module depend on libmtkcam.plugin.tp_dc.
At the same time, to avoid conflicts and needless extra processing, we remove MTK's sample libmtkcam.plugin.tp_purebokeh.
diff --git a/vendor/mediatek/proprietary/hardware/mtkcam3/3rdparty/customer/Android.mk b/vendor/mediatek/proprietary/hardware/mtkcam3/3rdparty/customer/Android.mk
index 5e5dd6524f..bf2f6ffeae 100755
--- a/vendor/mediatek/proprietary/hardware/mtkcam3/3rdparty/customer/Android.mk
+++ b/vendor/mediatek/proprietary/hardware/mtkcam3/3rdparty/customer/Android.mk
@@ -65,7 +65,7 @@ LOCAL_WHOLE_STATIC_LIBRARIES += libmtkcam.plugin.tp_bokeh
LOCAL_WHOLE_STATIC_LIBRARIES += libmtkcam.plugin.tp_depth
LOCAL_WHOLE_STATIC_LIBRARIES += libmtkcam.plugin.tp_fusion
LOCAL_WHOLE_STATIC_LIBRARIES += libmtkcam.plugin.tp_dc_hdr
-LOCAL_WHOLE_STATIC_LIBRARIES += libmtkcam.plugin.tp_purebokeh
+#LOCAL_WHOLE_STATIC_LIBRARIES += libmtkcam.plugin.tp_purebokeh
#
LOCAL_SHARED_LIBRARIES += libcam.iopipe
LOCAL_SHARED_LIBRARIES += libmtkcam_modulehelper
@@ -83,6 +83,11 @@ LOCAL_SHARED_LIBRARIES += libyuv.vendor
LOCAL_WHOLE_STATIC_LIBRARIES += libmtkcam.plugin.tp_mfnr
endif
+ifeq ($(QXT_DUALCAMERA_SUPPORT), yes)
+LOCAL_SHARED_LIBRARIES += libdualcamera
+LOCAL_WHOLE_STATIC_LIBRARIES += libmtkcam.plugin.tp_dc
+endif
+
Since MTK has already defined the relevant metadata, there is no need to define custom metadata either.
With the steps above done, the integration work is essentially complete. Rebuild the source tree; to save time you can build only vendor.img.
4. Invoking the algorithm from the APP
Since MTK's stock Camera APP already provides a dual-camera stereo mode, there is no need to write an APP to verify the algorithm. Flash the full system image or just vendor.img onto the device, boot it, open the stock MTK Camera APP, switch to stereo mode, and take a shot to check the result:
The auxiliary image's colors look a little off, but in any case the mock algorithm library runs correctly and has stitched the main and auxiliary images into a picture-in-picture result.
5. Conclusion
Dual-camera algorithms are the most complex of all the algorithm types, involving calibration, main/auxiliary synchronization, depth computation, blur tuning, edge handling, and more. Even a small slip in either the algorithm or the integration can make the dual-camera result dramatically worse, so when integrating a dual-camera algorithm, be careful, careful, and careful again.
This brings the three-part MTK HAL algorithm integration series to a close. Lunar year 2020 is about to end, and this should be my last article of the lunar year. With the holidays coming up, I wish everyone a happy break in advance!
6. References
This article is mainly based on the Camera quick start section of MTK-Online, where MTK provides detailed articles and tutorials (kudos to MTK):
https://online.mediatek.com/QuickStart/2a17666a-9d46-4686-9222-610ec0f087cc
Comments, likes, and reposts are welcome. Writing these up takes effort, so please credit the source when reposting.