The previous article covered how to compile an FFmpeg .so file usable on Android and how to drive it by passing a command string, which turns out to be really convenient. Here I'll use that .so to build a simple short-video creation feature.
Target features:
1. Extract the video's original audio.
2. Produce a video file with no audio.
3. Adjust the volume of the video's original audio.
4. Adjust the volume of the background music.
5. Mix the original audio with the background music.
6. Merge the audio back into the video.
1. Open activity_main.xml and build the layout
<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:orientation="vertical"
android:layout_width="match_parent"
android:layout_height="match_parent">
<SurfaceView
android:id="@+id/video_surface_view"
android:layout_width="match_parent"
android:layout_height="match_parent" />
<LinearLayout
android:id="@+id/rec_layout"
android:layout_gravity="bottom"
android:orientation="vertical"
android:padding="20dp"
android:layout_width="match_parent"
android:layout_height="wrap_content">
<LinearLayout
android:layout_gravity="end"
android:gravity="center_vertical"
android:layout_marginBottom="10dp"
android:orientation="horizontal"
android:layout_width="wrap_content"
android:layout_height="wrap_content">
<View
android:layout_width="10dp"
android:layout_height="10dp"
android:layout_marginRight="5dp"
android:background="@drawable/ripple_circle"/>
<TextView
android:id="@+id/time"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:textColor="@android:color/white"
android:textSize="16sp"
android:text="00:00"/>
</LinearLayout>
<RelativeLayout
android:layout_width="match_parent"
android:layout_height="wrap_content">
<ProgressBar
android:id="@+id/progress"
android:layout_width="match_parent"
android:layout_height="5dp"
android:layout_centerVertical="true"
android:alpha="0.8"
style="@style/progressBarHorizontal_color"
android:max="30" />
<View
android:id="@+id/marking"
android:layout_width="2dp"
android:layout_height="5dp"
android:alpha="0.7"
android:layout_marginLeft="45dp"
android:layout_centerVertical="true"
android:background="@android:color/holo_red_dark" />
</RelativeLayout>
<RelativeLayout
android:layout_marginTop="20dp"
android:layout_width="match_parent"
android:layout_height="wrap_content">
<ImageView
android:id="@+id/start_video"
android:layout_width="60dp"
android:layout_height="60dp"
android:layout_centerInParent="true"
android:src="@mipmap/bt_start"/>
<ImageView
android:id="@+id/start_video_ing"
android:visibility="gone"
android:layout_width="60dp"
android:layout_height="60dp"
android:layout_centerInParent="true"
android:src="@mipmap/icon_video_ing"/>
</RelativeLayout>
</LinearLayout>
<LinearLayout
android:id="@+id/top_layout"
android:layout_gravity="end"
android:layout_margin="10dip"
android:gravity="center"
android:layout_width="wrap_content"
android:layout_height="wrap_content">
<ImageView
android:id="@+id/inversion"
android:layout_width="37dp"
android:layout_height="37dp"
android:padding="6dp"
android:src="@mipmap/icon_fanzhuan"/>
<ImageView
android:id="@+id/close"
android:layout_width="37dip"
android:layout_height="37dip"
android:padding="10dp"
android:layout_gravity="end"
android:src="@mipmap/live_close_icon" />
</LinearLayout>
</FrameLayout>
ripple_circle.xml
<?xml version="1.0" encoding="utf-8"?>
<shape xmlns:android="http://schemas.android.com/apk/res/android"
android:shape="oval">
<solid android:color="@android:color/holo_red_dark" />
<stroke
android:width="1dp"
android:color="@android:color/white" />
<size
android:height="14dp"
android:width="14dp" />
</shape>
@style/progressBarHorizontal_color
<style name="progressBarHorizontal_color" parent="android:Widget.ProgressBar.Horizontal">
<item name="android:indeterminateOnly">false</item>
<item name="android:progressDrawable">@drawable/progress_color_horizontal</item>
<item name="android:minHeight">5dip</item>
<item name="android:maxHeight">5dip</item>
</style>
progress_color_horizontal.xml
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">
<item android:id="@android:id/background">
<shape>
<corners android:radius="2dip" />
<gradient
android:startColor="#555555"
android:centerColor="#555555"
android:centerY="0.75"
android:endColor="#555555"
android:angle="270" />
</shape>
</item>
<item android:id="@android:id/secondaryProgress">
<clip>
<shape>
<corners android:radius="5dip" />
<gradient
android:startColor="#80C07AB8"
android:centerColor="#80C07AB8"
android:centerY="0.75"
android:endColor="#a0C07AB8"
android:angle="270" />
</shape>
</clip>
</item>
<item android:id="@android:id/progress">
<clip>
<shape>
<corners android:radius="2dip" />
<gradient
android:startColor="@android:color/holo_red_dark"
android:centerColor="@android:color/holo_red_dark"
android:centerY="0.75"
android:endColor="@android:color/holo_red_dark"
android:angle="270" />
</shape>
</clip>
</item>
</layer-list>
Image resources (the mipmap icons referenced in the layout)
2. Implement a class that drives the camera to record the short video
I've written a simple helper class that uses Camera together with SurfaceView to do the recording. I won't explain those APIs in detail here; just read the source.
package com.tangyx.video.ffmpeg;
import android.app.Activity;
import android.hardware.Camera;
import android.media.MediaRecorder;
import android.view.GestureDetector;
import android.view.MotionEvent;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import java.io.File;
import java.io.IOException;
import java.util.List;
/**
* Created by tangyx on 2017/8/2.
*
*/
public class MediaHelper implements SurfaceHolder.Callback {
private Activity activity;
private MediaRecorder mMediaRecorder;
private Camera mCamera;
private SurfaceView mSurfaceView;
private SurfaceHolder mSurfaceHolder;
private File targetDir;
private String targetName;
private File targetFile;
private boolean isRecording;
private GestureDetector mDetector;
private boolean isZoomIn = false;
private int or = 90;
private int position = Camera.CameraInfo.CAMERA_FACING_BACK;
public MediaHelper(Activity activity) {
this.activity = activity;
}
public void setTargetDir(File file) {
this.targetDir = file;
}
public void setTargetName(String name) {
this.targetName = name;
}
public String getTargetFilePath() {
return targetFile.getPath();
}
public boolean deleteTargetFile() {
if (targetFile.exists()) {
return targetFile.delete();
} else {
return false;
}
}
public void setSurfaceView(SurfaceView view) {
this.mSurfaceView = view;
mSurfaceHolder = mSurfaceView.getHolder();
mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
mSurfaceHolder.addCallback(this);
mDetector = new GestureDetector(activity, new ZoomGestureListener());
mSurfaceView.setOnTouchListener(new View.OnTouchListener() {
@Override
public boolean onTouch(View v, MotionEvent event) {
mDetector.onTouchEvent(event);
return true;
}
});
}
public boolean isRecording() {
return isRecording;
}
public void record() {
if (isRecording) {
try {
mMediaRecorder.stop(); // stop the recording
} catch (RuntimeException e) {
e.printStackTrace();
targetFile.delete();
}
releaseMediaRecorder(); // release the MediaRecorder object
mCamera.lock(); // take camera access back from MediaRecorder
isRecording = false;
} else {
startRecordThread();
}
}
private boolean prepareRecord() {
try {
mMediaRecorder = new MediaRecorder();
mCamera.unlock();
mMediaRecorder.setCamera(mCamera);
mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.DEFAULT);
mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
// mMediaRecorder.setProfile(profile);
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mMediaRecorder.setVideoSize(1280, 720);
// mMediaRecorder.setVideoSize(640, 480);
mMediaRecorder.setVideoEncodingBitRate(2 * 1024 * 1024);
mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
if (position == Camera.CameraInfo.CAMERA_FACING_BACK) {
mMediaRecorder.setOrientationHint(or);
} else {
mMediaRecorder.setOrientationHint(270);
}
targetFile = new File(targetDir, targetName);
mMediaRecorder.setOutputFile(targetFile.getPath());
} catch (Exception e) {
e.printStackTrace();
releaseMediaRecorder();
return false;
}
try {
mMediaRecorder.prepare();
} catch (IllegalStateException e) {
e.printStackTrace();
releaseMediaRecorder();
return false;
} catch (IOException e) {
e.printStackTrace();
releaseMediaRecorder();
return false;
}
return true;
}
public void stopRecordSave() {
if (isRecording) {
isRecording = false;
try {
mMediaRecorder.stop();
} catch (RuntimeException r) {
r.printStackTrace();
} finally {
releaseMediaRecorder();
}
}
}
public void stopRecordUnSave() {
if (isRecording) {
isRecording = false;
try {
mMediaRecorder.stop();
} catch (RuntimeException r) {
if (targetFile.exists()) {
//not saving, delete the file directly
targetFile.delete();
}
} finally {
releaseMediaRecorder();
}
if (targetFile.exists()) {
//not saving, delete the file directly
targetFile.delete();
}
}
}
private void startPreView(SurfaceHolder holder) {
if (mCamera == null) {
mCamera = Camera.open(position);
}
if (mCamera != null) {
mCamera.setDisplayOrientation(or);
try {
mCamera.setPreviewDisplay(holder);
Camera.Parameters parameters = mCamera.getParameters();
List<Camera.Size> mSupportedPreviewSizes = parameters.getSupportedPreviewSizes();
if (mSupportedPreviewSizes != null) {
int width = mSurfaceView.getWidth();
int height = mSurfaceView.getHeight();
Camera.Size mPreviewSize = getOptimalPreviewSize(mSupportedPreviewSizes,
Math.max(width, height), Math.min(width, height));
parameters.setPreviewSize(mPreviewSize.width, mPreviewSize.height);
}
List<String> focusModes = parameters.getSupportedFocusModes();
if (focusModes != null) {
for (String mode : focusModes) {
if(mode.contains(Camera.Parameters.FOCUS_MODE_AUTO)){
parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
}
}
}
mCamera.setParameters(parameters);
mCamera.startPreview();
} catch (IOException e) {
e.printStackTrace();
}
}
}
public Camera.Size getOptimalPreviewSize(List<Camera.Size> sizes, int w, int h) {
final double ASPECT_TOLERANCE = 0.1;
double targetRatio = (double) w / h;
if (sizes == null) {
return null;
}
Camera.Size optimalSize = null;
double minDiff = Double.MAX_VALUE;
int targetHeight = h;
for (Camera.Size size : sizes) {
double ratio = (double) size.width / size.height;
if (Math.abs(ratio - targetRatio) > ASPECT_TOLERANCE)
continue;
if (Math.abs(size.height - targetHeight) < minDiff) {
optimalSize = size;
minDiff = Math.abs(size.height - targetHeight);
}
}
if (optimalSize == null) {
minDiff = Double.MAX_VALUE;
for (Camera.Size size : sizes) {
if (Math.abs(size.height - targetHeight) < minDiff) {
optimalSize = size;
minDiff = Math.abs(size.height - targetHeight);
}
}
}
return optimalSize;
}
private void releaseMediaRecorder() {
if (mMediaRecorder != null) {
mMediaRecorder.reset();
mMediaRecorder.release();
mMediaRecorder = null;
}
}
public void releaseCamera() {
if (mCamera != null) {
mCamera.release();
mCamera = null;
}
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
mSurfaceHolder = holder;
startPreView(holder);
}
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
if (mCamera != null) {
releaseCamera();
}
if (mMediaRecorder != null) {
releaseMediaRecorder();
}
}
private void startRecordThread() {
if (prepareRecord()) {
try {
mMediaRecorder.start();
isRecording = true;
} catch (RuntimeException r) {
r.printStackTrace();
releaseMediaRecorder();
}
}
}
private class ZoomGestureListener extends GestureDetector.SimpleOnGestureListener {
//double-tap gesture event
@Override
public boolean onDoubleTap(MotionEvent e) {
super.onDoubleTap(e);
if (!isZoomIn) {
setZoom(20);
isZoomIn = true;
} else {
setZoom(0);
isZoomIn = false;
}
return true;
}
}
private void setZoom(int zoomValue) {
if (mCamera != null) {
Camera.Parameters parameters = mCamera.getParameters();
if (parameters.isZoomSupported()) {
int maxZoom = parameters.getMaxZoom();
if (maxZoom == 0) {
return;
}
if (zoomValue > maxZoom) {
zoomValue = maxZoom;
}
parameters.setZoom(zoomValue);
mCamera.setParameters(parameters);
}
}
}
public void autoChangeCamera() {
if (position == Camera.CameraInfo.CAMERA_FACING_BACK) {
position = Camera.CameraInfo.CAMERA_FACING_FRONT;
} else {
position = Camera.CameraInfo.CAMERA_FACING_BACK;
}
releaseCamera();
stopRecordUnSave();
startPreView(mSurfaceHolder);
}
}
3. With the layout and the helper class ready, the next step is to use the helper in MainActivity.java to record a clip of at most 30 seconds and at least 8 seconds.
Add the camera permission and the read/write storage permissions to AndroidManifest.xml:
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.RECORD_AUDIO"/>
<uses-permission android:name="android.permission.CAMERA"/>
<uses-feature android:name="android.hardware.camera.autofocus"/>
<uses-feature android:name="android.hardware.camera"/>
On Android 6.0+ devices, declaring the permissions in the manifest is not enough; the user also has to grant them at runtime (what a hassle), so we also need a small permission-helper class plus an activity that checks the permissions and prompts the user.
Create PermissionHelper.java with the following content:
public class PermissionHelper {
private final Context mContext;
public PermissionHelper(Context context) {
mContext = context.getApplicationContext();
}
// check a set of permissions: returns true if any one is missing
public boolean lacksPermissions(String... permissions) {
for (String permission : permissions) {
if (lacksPermission(permission)) {
return true;
}
}
return false;
}
// whether a single permission has not been granted
private boolean lacksPermission(String permission) {
return ContextCompat.checkSelfPermission(mContext, permission) ==
PackageManager.PERMISSION_DENIED;
}
}
Create PermissionsActivity.java with the following content (don't forget to register it in AndroidManifest.xml):
/**
* Permission request screen
* <p/>
*/
public class PermissionsActivity extends AppCompatActivity {
//base permissions that must be granted
public final static String[] PERMISSIONS = new String[]{
Manifest.permission.WRITE_EXTERNAL_STORAGE,
Manifest.permission.RECORD_AUDIO,
Manifest.permission.CAMERA
};
public static final int PERMISSIONS_GRANTED = 1010; // permissions granted
public static final int PERMISSIONS_DENIED = 1011; // permissions denied
public static final int REQUEST_CODE = 1012; // request code
private static final int PERMISSION_REQUEST_CODE = 0; // request code for the system permission dialog
private static final String EXTRA_PERMISSIONS =
"megawave.permission.extra_permission"; // extra key carrying the permission list
private static final String PACKAGE_URL_SCHEME = "package:"; // uri scheme for the app settings page
private PermissionHelper mChecker; // permission checker
private boolean isRequireCheck; // whether to run the check; avoids overlapping with the system dialog
private static boolean isShowSetting=true;
// public entry point for launching this permission screen
public static void startActivityForResult(Activity activity, int requestCode, String... permissions) {
startActivityForResult(activity,requestCode,true,permissions);
}
public static void startActivityForResult(Activity activity, int requestCode,boolean showSetting,String... permissions) {
Intent intent = new Intent(activity, PermissionsActivity.class);
intent.putExtra(EXTRA_PERMISSIONS, permissions);
ActivityCompat.startActivityForResult(activity, intent, requestCode, null);
isShowSetting = showSetting;
}
@Override
protected void onCreate(@Nullable Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
if (getIntent() == null || !getIntent().hasExtra(EXTRA_PERMISSIONS)) {
throw new RuntimeException("PermissionsActivity需要使用靜態(tài)startActivityForResult方法啟動!");
}
setContentView(R.layout.activity_permissions);
mChecker = new PermissionHelper(this);
isRequireCheck = true;
}
@Override
protected void onResume() {
super.onResume();
if (isRequireCheck) {
String[] permissions = getPermissions();
if (mChecker.lacksPermissions(permissions)) {
requestPermissions(permissions); // request the permissions
} else {
allPermissionsGranted(); // all permissions already granted
}
} else {
isRequireCheck = true;
}
}
// returns the permissions passed in via the intent
private String[] getPermissions() {
return getIntent().getStringArrayExtra(EXTRA_PERMISSIONS);
}
// request the permissions (compatible with lower versions)
private void requestPermissions(String... permissions) {
ActivityCompat.requestPermissions(this, permissions, PERMISSION_REQUEST_CODE);
}
// all permissions have been granted
private void allPermissionsGranted() {
setResult(PERMISSIONS_GRANTED);
finish();
}
/**
* Handle the user's permission result.
* If everything was granted, continue directly.
* If anything is missing, show a dialog.
*
* @param requestCode request code
* @param permissions permissions
* @param grantResults grant results
*/
@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
if (requestCode == PERMISSION_REQUEST_CODE && hasAllPermissionsGranted(grantResults)) {
isRequireCheck = true;
allPermissionsGranted();
} else {
isRequireCheck = false;
if(isShowSetting){
showMissingPermissionDialog();
}
}
}
// whether all permissions were granted
private boolean hasAllPermissionsGranted(@NonNull int[] grantResults) {
for (int grantResult : grantResults) {
if (grantResult == PackageManager.PERMISSION_DENIED) {
return false;
}
}
return true;
}
// show the missing-permission dialog
public void showMissingPermissionDialog() {
AlertDialog.Builder builder = new AlertDialog.Builder(PermissionsActivity.this);
builder.setTitle(R.string.label_help);
builder.setMessage(R.string.tips_permissions);
// deny: quit the app
builder.setNegativeButton(R.string.label_quit, new DialogInterface.OnClickListener() {
@Override public void onClick(DialogInterface dialog, int which) {
setResult(-100);
finish();
}
});
builder.setPositiveButton(R.string.label_setting, new DialogInterface.OnClickListener() {
@Override public void onClick(DialogInterface dialog, int which) {
startAppSettings();
}
});
builder.setCancelable(false);
builder.show();
}
// open the app's settings page
private void startAppSettings() {
Intent intent = new Intent(Settings.ACTION_APPLICATION_DETAILS_SETTINGS);
intent.setData(Uri.parse(PACKAGE_URL_SCHEME + getPackageName()));
startActivity(intent);
}
}
Back in MainActivity, grab the SurfaceView, initialize the MediaHelper helper class, and start the camera.
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
requestWindowFeature(Window.FEATURE_NO_TITLE);
WindowManager.LayoutParams p = this.getWindow().getAttributes();
p.flags |= WindowManager.LayoutParams.FLAG_FULLSCREEN;//|= adds the fullscreen flag to the existing window flags
getWindow().setAttributes(p);
setContentView(R.layout.activity_main);
mSurfaceView = (SurfaceView) findViewById(R.id.video_surface_view);
mStartVideo = (ImageView) findViewById(R.id.start_video);
mStartVideoIng = (ImageView) findViewById(R.id.start_video_ing);
mProgress = (ProgressBar) findViewById(R.id.progress);
mTime = (TextView) findViewById(R.id.time);
findViewById(R.id.close).setOnClickListener(this);
findViewById(R.id.inversion).setOnClickListener(this);
mStartVideo.setOnClickListener(this);
mStartVideoIng.setOnClickListener(this);
//initialize the helper class
mMediaHelper = new MediaHelper(this);
//set the root directory where recorded videos are stored
mMediaHelper.setTargetDir(new File(new FileUtils(this).getStorageDirectory()));
//set the file name of the recorded video
mMediaHelper.setTargetName(UUID.randomUUID() + ".mp4");
mPermissionHelper = new PermissionHelper(this);
}
@Override
protected void onResume() {
super.onResume();
if(mPermissionHelper.lacksPermissions(PermissionsActivity.PERMISSIONS)){
PermissionsActivity.startActivityForResult(this,PermissionsActivity.REQUEST_CODE,PermissionsActivity.PERMISSIONS);
}else{
//start the camera
mMediaHelper.setSurfaceView(mSurfaceView);
}
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if(resultCode == PermissionsActivity.PERMISSIONS_GRANTED){
//start the camera
mMediaHelper.setSurfaceView(mSurfaceView);
}else if(resultCode == -100){
finish();
}
}
The FileUtils.java helper class:
public class FileUtils {
/**
* root directory of the SD card
*/
private static String mSdRootPath = Environment.getExternalStorageDirectory().getPath();
/**
* root of the app's internal cache directory
*/
private static String mDataRootPath = null;
/**
* folder names used for the app's files
*/
private final static String FOLDER_NAME = "/ffmpeg";
public final static String IMAGE_NAME = "/cache";
public FileUtils(Context context){
mDataRootPath = context.getCacheDir().getPath();
makeAppDir();
}
public String makeAppDir(){
String path = getStorageDirectory();
File folderFile = new File(path);
if(!folderFile.exists()){
folderFile.mkdir();
}
path = path + IMAGE_NAME;
folderFile = new File(path);
if(!folderFile.exists()){
folderFile.mkdir();
}
return path;
}
/**
* get the app's storage directory
* @return
*/
public String getStorageDirectory(){
String localPath = Environment.getExternalStorageState().equals(Environment.MEDIA_MOUNTED) ?
mSdRootPath + FOLDER_NAME : mDataRootPath + FOLDER_NAME;
File folderFile = new File(localPath);
if(!folderFile.exists()){
folderFile.mkdir();
}
return localPath;
}
/**
* delete the files under deletePath, keeping videoPath
*/
public void deleteFile(String deletePath,String videoPath) {
File file = new File(deletePath);
if (file.exists()) {
File[] files = file.listFiles();
for (File f : files) {
if(f.isDirectory()){
if(f.listFiles().length==0){
f.delete();
}else{
deleteFile(f.getAbsolutePath(),videoPath);
}
}else if(!f.getAbsolutePath().equals(videoPath)){
f.delete();
}
}
}
}
}
4. Once the camera is up and running, tap the record button to start recording.
@Override
public void onClick(View view) {
switch (view.getId()){
case R.id.close:
mMediaHelper.stopRecordUnSave();
finish();
break;
case R.id.start_video:
mProgressNumber = 0;
mProgress.setProgress(0);
mMediaHelper.record();
startView();
break;
case R.id.start_video_ing:
if(mProgressNumber == 0){
stopView(false);
break;
}
if (mProgressNumber < 8) {
//too short, don't save
Toast.makeText(this,"請至少錄制到紅線位置",Toast.LENGTH_LONG).show();
mMediaHelper.stopRecordUnSave();
stopView(false);
break;
}
//stop recording and save
mMediaHelper.stopRecordSave();
stopView(true);
break;
case R.id.inversion:
mMediaHelper.stopRecordUnSave();
stopView(false);
mMediaHelper.autoChangeCamera();
break;
}
}
private void startView(){
mStartVideo.setVisibility(View.GONE);
mStartVideoIng.setVisibility(View.VISIBLE);
mProgressNumber = 0;
mTime.setText("00:00");
handler.removeMessages(0);
handler.sendMessage(handler.obtainMessage(0));
}
private void stopView(boolean isSave){
int timer = mProgressNumber;
mProgressNumber = 0;
mProgress.setProgress(0);
handler.removeMessages(0);
mTime.setText("00:00");
if(isSave) {
String path = mMediaHelper.getTargetFilePath();
Intent intent = new Intent(this,MakeVideoActivity.class);
intent.putExtra("path",path);
intent.putExtra("time",timer);
startActivity(intent);
}
mStartVideoIng.setVisibility(View.GONE);
mStartVideo.setVisibility(View.VISIBLE);
}
Handler handler = new Handler() {
@Override
public void handleMessage(Message msg) {
switch (msg.what) {
case 0:
mProgress.setProgress(mProgressNumber);
mTime.setText("00:"+(mProgressNumber<10?"0"+mProgressNumber:mProgressNumber));
if(mProgress.getProgress() >= mProgress.getMax()){
mMediaHelper.stopRecordSave();
stopView(true);
}else if (mMediaHelper.isRecording()){
mProgressNumber = mProgressNumber + 1;
sendMessageDelayed(handler.obtainMessage(0), 1000);
}
break;
}
}
};
At this point the short video basically records successfully. When you stop recording, if the clip is longer than 8 seconds we jump to the next screen to edit it. (It's only a demo, but the details still need handling; otherwise the demo would be full of little problems and I'd get flamed.)
5. In the stopView logic above, when recording ends we jump to a MakeVideoActivity, passing along the video's file path and its duration.
Create MakeVideoActivity (remember to register it in AndroidManifest.xml). This is where the main target features, the audio/video processing, are implemented; it is the most important part of this article, and FFmpeg is used throughout.
Open the FFmpegRun class and add the following code.
package com.tangyx.video.ffmpeg;
import android.os.AsyncTask;
/**
* Created by tangyx
* Date 2017/8/1
* email tangyx@live.com
*/
public class FFmpegRun {
static {
System.loadLibrary("ffmpeg");
System.loadLibrary("ffmpeginvoke");
}
public static void execute(String[] commands, final FFmpegRunListener fFmpegRunListener) {
new AsyncTask<String[], Integer, Integer>() {
@Override
protected void onPreExecute() {
if (fFmpegRunListener != null) {
fFmpegRunListener.onStart();
}
}
@Override
protected Integer doInBackground(String[]... params) {
return run(params[0]);
}
@Override
protected void onPostExecute(Integer integer) {
if (fFmpegRunListener != null) {
fFmpegRunListener.onEnd(integer);
}
}
}.execute(commands);
}
public native static int run(String[] commands);
public interface FFmpegRunListener{
void onStart();
void onEnd(int result);
}
}
There isn't much to it: it wraps the native call so it's easy to use from the outside and loads the required .so files.
Because we drive FFmpeg by passing it a command, and executing a command takes some time, the work is done on a background thread; the outcome is handed back through a callback interface, and the caller checks the result parameter of onEnd to see whether the command succeeded.
result: 0 means success, anything else means failure.
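As a quick illustration, a caller uses the wrapper like the sketch below; the command array here is only a made-up placeholder (the real arrays are built later by the FFmpegCommands class), and Log is just android.util.Log:
// made-up command purely for illustration; real commands come from FFmpegCommands
String[] commands = {"ffmpeg", "-i", "/sdcard/ffmpeg/in.mp4", "-acodec", "copy", "-vn", "-y", "/sdcard/ffmpeg/out.aac"};
FFmpegRun.execute(commands, new FFmpegRun.FFmpegRunListener() {
    @Override
    public void onStart() {
        // runs on the UI thread before the command executes, e.g. show a loading dialog
    }

    @Override
    public void onEnd(int result) {
        // runs on the UI thread when the command finishes; 0 means success
        Log.e("FFmpegRun", "ffmpeg finished, result=" + result);
    }
});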
6. Build the editing layout for MakeVideoActivity
<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:orientation="vertical"
android:gravity="center"
android:background="@android:color/black"
android:layout_width="match_parent"
android:layout_height="match_parent">
<VideoView
android:id="@+id/video"
android:layout_gravity="center"
android:layout_width="match_parent"
android:layout_height="match_parent" />
<RelativeLayout
android:id="@+id/title_layout"
android:background="#50000000"
android:paddingRight="15dp"
android:paddingLeft="5dp"
android:layout_width="match_parent"
android:layout_height="wrap_content">
<ImageView
android:id="@+id/back"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:padding="10dp"
android:layout_centerVertical="true"
android:src="@mipmap/icon_back_white" />
<TextView
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_centerInParent="true"
android:textColor="@android:color/white"
android:text="制作" />
<TextView
android:id="@+id/next"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentRight="true"
android:textColor="@android:color/white"
android:padding="10dp"
android:text="下一步" />
</RelativeLayout>
<LinearLayout
android:id="@+id/editor_layout"
android:orientation="vertical"
android:gravity="center"
android:paddingTop="10dp"
android:paddingBottom="10dp"
android:paddingLeft="30dp"
android:paddingRight="30dp"
android:background="#50000000"
android:layout_gravity="bottom"
android:layout_width="match_parent"
android:layout_height="wrap_content">
<LinearLayout
android:id="@+id/video_layout"
android:gravity="center"
android:layout_width="match_parent"
android:layout_height="wrap_content">
<TextView
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:textColor="@android:color/white"
android:textSize="13sp"
android:text="原音" />
<android.support.v7.widget.AppCompatSeekBar
android:id="@+id/video_seek_bar"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:maxHeight="1.5dp"
style="@style/video_seek_bar"
android:progress="50"
android:max="100" />
</LinearLayout>
<LinearLayout
android:gravity="center"
android:layout_marginTop="10dp"
android:orientation="horizontal"
android:layout_width="match_parent"
android:layout_height="wrap_content">
<TextView
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:textColor="@android:color/white"
android:textSize="13sp"
android:text="伴唱" />
<android.support.v7.widget.AppCompatSeekBar
android:id="@+id/music_seek_bar"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:maxHeight="1.5dp"
style="@style/video_seek_bar"
android:progress="50"
android:max="100" />
</LinearLayout>
<RelativeLayout
android:padding="30dp"
android:layout_width="match_parent"
android:layout_height="wrap_content">
<TextView
android:id="@+id/local_music"
android:layout_centerInParent="true"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:drawablePadding="10dp"
android:paddingLeft="10dp"
android:paddingRight="10dp"
android:paddingBottom="5dp"
android:paddingTop="5dp"
android:background="@android:color/white"
android:textColor="@android:color/black"
android:text="選擇本地音樂" />
</RelativeLayout>
</LinearLayout>
</FrameLayout>
<style name="video_seek_bar">
<item name="android:thumb">@mipmap/kaibo_icon_huakuai</item>
<item name="android:progressDrawable">@drawable/video_seekbar</item>
</style>
video_seekbar.xml
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">
<item android:id="@android:id/background">
<shape>
<corners android:radius="5dp" />
<solid android:color="#f3f3f3" />
</shape>
</item>
<item android:id="@android:id/secondaryProgress">
<clip>
<shape>
<corners android:radius="5dp" />
<solid android:color="#f3f3f3" />
</shape>
</clip>
</item>
<item android:id="@android:id/progress">
<clip>
<shape>
<corners android:radius="5dp" />
<solid android:color="#f15a23" />
</shape>
</clip>
</item>
</layer-list>
Resource files
A quick outline of the idea (roughly):
Original audio: the sound recorded with the video; its volume is changed in real time by dragging the original-audio SeekBar.
Backing music: see the "select local music" button at the bottom (the white one with black text)? Tapping it opens a list of local music; picking a song returns to this page and starts playing it, and its volume is controlled in real time by the music SeekBar.
The two sounds play at the same time, and each volume can be changed without affecting the other. Both audio tracks (original audio and backing music) are played with MediaPlayer (see the official docs if you're not familiar with it).
So the first step is to split the audio out of the video (done with FFmpeg); the video is played by the VideoView control, and the extracted audio is played and controlled by a MediaPlayer.
7. Since driving FFmpeg only takes passing it a command and waiting for the result, create a new class (FFmpegCommands.java) to hold the FFmpeg commands we need.
Open FFmpegCommands.java and add the following two methods:
/**
* extract the audio track on its own
*
* @param videoUrl
* @param outUrl
* @return
*/
public static String[] extractAudio(String videoUrl, String outUrl) {
String[] commands = new String[8];
commands[0] = "ffmpeg";
commands[1] = "-i";
commands[2] = videoUrl;
commands[3] = "-acodec";
commands[4] = "copy";
commands[5] = "-vn";
commands[6] = "-y";
commands[7] = outUrl;
return commands;
}
What this method assembles is simply an ffmpeg command: ffmpeg -i videoUrl (path of the recorded video) -acodec copy -vn -y outUrl (path where the extracted audio is written)
A quick explanation of the parameters:
-i the input file
-acodec which audio codec to use
copy copy the original encoded data without re-encoding
-vn drop the video stream (only the audio is kept)
-y overwrite the output if a file with the same name already exists
For more parameters, see the official documentation -----> FFmpeg docs
Here I only implement the target features of this article; once you've got the basics you can dig deeper into the syntax yourself.
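For instance, with made-up paths the array boils down to this command line:
ffmpeg -i /sdcard/ffmpeg/xxx.mp4 -acodec copy -vn -y /sdcard/ffmpeg/audio.aac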
/**
* extract the video stream on its own, without audio
*
* @param videoUrl
* @param outUrl
* @return
*/
public static String[] extractVideo(String videoUrl, String outUrl) {
String[] commands = new String[8];
commands[0] = "ffmpeg";
commands[1] = "-i";
commands[2] = videoUrl;
commands[3] = "-vcodec";
commands[4] = "copy";
commands[5] = "-an";
commands[6] = "-y";
commands[7] = outUrl;
return commands;
}
These two methods produce, respectively, the video file without sound and the audio file (the video's original sound).
Back in MakeVideoActivity, start calling FFmpeg.
Initialize the layout:
@Override
protected void onCreate(@Nullable Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_make_video);
mVideoView = (VideoView) findViewById(R.id.video);
mAudioSeekBar = (AppCompatSeekBar) findViewById(R.id.video_seek_bar);
mMusicSeekBar = (AppCompatSeekBar) findViewById(R.id.music_seek_bar);
mAudioSeekBar.setOnSeekBarChangeListener(this);
mMusicSeekBar.setOnSeekBarChangeListener(this);
mNext = (TextView) findViewById(R.id.next);
mNext.setOnClickListener(this);
findViewById(R.id.back).setOnClickListener(this);
findViewById(R.id.local_music).setOnClickListener(this);
isPlayer = getIntent().getBooleanExtra(this.getClass().getSimpleName(), false);
if (isPlayer) {
findViewById(R.id.title_layout).setVisibility(View.GONE);
findViewById(R.id.editor_layout).setVisibility(View.GONE);
mVideoView.setVideoPath(getIntent().getStringExtra("path"));
mVideoView.start();
}else{
mFileUtils = new FileUtils(this);
mTargetPath = mFileUtils.getStorageDirectory();
extractVideo();
}
}
The other buttons need no explanation. The parameter worth noting is isPlayer: after tapping next and generating the final file, I Intent back into this same MakeVideoActivity with isPlayer=true, and it then simply plays the finished video. (A lazy shortcut that saves creating one more Activity.)
Add the following methods:
/**
* extract the video (without audio)
*/
private void extractVideo() {
final String outVideo = mTargetPath + "/video.mp4";
String[] commands = FFmpegCommands.extractVideo(getIntent().getStringExtra("path"), outVideo);
FFmpegRun.execute(commands, new FFmpegRun.FFmpegRunListener() {
@Override
public void onStart() {
mMediaPath = new ArrayList<>();
Log.e(TAG,"extractVideo ffmpeg start...");
}
@Override
public void onEnd(int result) {
Log.e(TAG,"extractVideo ffmpeg end...");
mMediaPath.add(outVideo);
extractAudio();
}
});
}
/**
* extract the audio
*/
private void extractAudio() {
final String outVideo = mTargetPath + "/audio.aac";
String[] commands = FFmpegCommands.extractAudio(getIntent().getStringExtra("path"), outVideo);
FFmpegRun.execute(commands, new FFmpegRun.FFmpegRunListener() {
@Override
public void onStart() {
mAudioPlayer = new MediaPlayer();
}
@Override
public void onEnd(int result) {
Log.e(TAG,"extractAudio ffmpeg end...");
mMediaPath.add(outVideo);
String path = mMediaPath.get(0);
mVideoView.setVideoPath(path);
try {
mAudioPlayer.setDataSource(mMediaPath.get(1));
mAudioPlayer.setLooping(true);
mAudioPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer mediaPlayer) {
mAudioPlayer.setVolume(0.5f, 0.5f);
mAudioPlayer.start();
}
});
mAudioPlayer.prepare();
} catch (IOException e) {
e.printStackTrace();
}
}
});
}
extractVideo is called in onCreate to kick off the FFmpeg commands. If nothing unexpected happens, once they finish the audio file starts playing through MediaPlayer, the video file is loaded into the VideoView, and the audio defaults to 50% volume.
Implement the SeekBar change listener:
@Override
public void onProgressChanged(SeekBar seekBar, int i, boolean b) {
float volume = i / 100f;
if (mAudioSeekBar == seekBar) {
mAudioPlayer.setVolume(volume, volume);
} else if(mMusicPlayer!=null){
mMusicPlayer.setVolume(volume, volume);
}
}
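mAudioPlayer is only created once extraction has started, so if you want to guard against the slider being dragged before that, a slightly more defensive variant of the listener could look like this (an optional tweak, not required by the original flow):
@Override
public void onProgressChanged(SeekBar seekBar, int i, boolean b) {
    float volume = i / 100f;
    if (seekBar == mAudioSeekBar && mAudioPlayer != null) {
        // original-audio volume, guarded against the player not being ready yet
        mAudioPlayer.setVolume(volume, volume);
    } else if (seekBar == mMusicSeekBar && mMusicPlayer != null) {
        // background-music volume; mMusicPlayer exists only after a song was picked
        mMusicPlayer.setVolume(volume, volume);
    }
}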
At this point the video and its audio are already playing as separate streams; it looks roughly like this (the gif quality is poor and carries no sound, so bear with it).
Getting this far actually proves that the FFmpeg .so we compiled works fine and can take more commands to do more, so on to the next step: pick a piece of music and mix it into the video, so the final video carries both the original sound and the music, each at the volume chosen on its slider.
8. Pick a local music file and trim it to the same length as the video.
Create a MusicActivity to display the local music list:
public class MusicActivity extends AppCompatActivity {
private ListView mListView;
private MusicAdapter mAdapter;
@Override
protected void onCreate(@Nullable Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_music);
mListView = (ListView) findViewById(R.id.list);
findViewById(R.id.back).setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
finish();
}
});
new SongTask().execute();
}
private class SongTask extends AsyncTask<Void, Void, List<Music>> implements AdapterView.OnItemClickListener{
@Override
protected void onPreExecute() {
super.onPreExecute();
}
@Override
protected List<Music> doInBackground(Void... voids) {
List<Music> musics = new ArrayList<>();
Cursor cursor = getApplicationContext().getContentResolver().query(
MediaStore.Audio.Media.EXTERNAL_CONTENT_URI, null,
MediaStore.Audio.Media.DATA + " like ?",
new String[]{Environment.getExternalStorageDirectory() + File.separator + "%"},
MediaStore.Audio.Media.DEFAULT_SORT_ORDER);
if (cursor != null) {
for (cursor.moveToFirst(); !cursor.isAfterLast(); cursor.moveToNext()) {
Music music = new Music();
String isMusic = cursor.getString(cursor.getColumnIndexOrThrow(MediaStore.Audio.Media.IS_MUSIC));
if (isMusic != null && isMusic.equals("")) continue;
// int duration = cursor.getInt(cursor.getColumnIndexOrThrow(MediaStore.Audio.Media.DURATION));
String path = cursor.getString(cursor.getColumnIndexOrThrow(MediaStore.Audio.Media.DATA));
Log.e("SLog","music:"+path);
if (!path.endsWith(".mp3")) {
continue;
}
String title = cursor.getString(cursor.getColumnIndexOrThrow(MediaStore.Audio.Media.TITLE));
String artist = cursor.getString(cursor.getColumnIndexOrThrow(MediaStore.Audio.Media.ARTIST));
music.setId(cursor.getString(cursor.getColumnIndexOrThrow(MediaStore.Audio.Media._ID)));
music.setName(title);
music.setSingerName(artist);
music.setSongUrl(path);
musics.add(music);
}
cursor.close();
}
return musics;
}
@Override
protected void onPostExecute(List<Music> musics) {
super.onPostExecute(musics);
mAdapter = new MusicAdapter(MusicActivity.this,musics);
mListView.setAdapter(mAdapter);
mListView.setOnItemClickListener(this);
}
@Override
public void onItemClick(AdapterView<?> adapterView, View view, int i, long l) {
Music music = mAdapter.getItem(i);
Intent intent = new Intent();
intent.putExtra("music",music.getSongUrl());
setResult(10000,intent);
finish();
}
}
}
The inner class SongTask scans the local music on a background thread so the UI doesn't freeze. Many audio formats could be mixed in, but they might need different handling, so for now I only pick up .mp3 files.
The adapter is straightforward, so I won't paste it here; it is included in the source uploaded at the end.
9. The video is at most 30 seconds while a typical mp3 runs several minutes, so after picking a song and returning to MakeVideoActivity the music has to be trimmed with an FFmpeg command before it is played.
Open FFmpegCommands and add the trimming command.
/**
* trim the music to length
*/
public static String[] cutIntoMusic(String musicUrl, long second, String outUrl) {
String[] commands = new String[10];
commands[0] = "ffmpeg";
commands[1] = "-i";
commands[2] = musicUrl;
commands[3] = "-ss";
commands[4] = "00:00:10";
commands[5] = "-t";
commands[6] = String.valueOf(second);
commands[7] = "-acodec";
commands[8] = "copy";
commands[9] = outUrl;
return commands;
}
Parameter notes:
-ss where to start in the source audio; here I start from second 10
-t how much to keep; here I pass in the video's duration
so the resulting audio covers the mp3 from second 10 to second 10 + second.
The trimmed audio therefore ends up exactly as long as the video.
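For example, with a 29-second video the array corresponds to the command below, so the clip covers 0:10 to 0:39 of the song (paths made up):
ffmpeg -i /sdcard/Music/song.mp3 -ss 00:00:10 -t 29 -acodec copy /sdcard/ffmpeg/bgMusic.aac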
Add a new method:
private void cutSelectMusic(String musicUrl) {
final String musicPath = mTargetPath + "/bgMusic.aac";
long time = getIntent().getIntExtra("time",0);
String[] commands = FFmpegCommands.cutIntoMusic(musicUrl, time, musicPath);
FFmpegRun.execute(commands, new FFmpegRun.FFmpegRunListener() {
@Override
public void onStart() {
Log.e(TAG,"cutSelectMusic ffmpeg start...");
}
@Override
public void onEnd(int result) {
Log.e(TAG,"cutSelectMusic ffmpeg end...");
if(mMusicPlayer!=null){//remove the previously selected background music
mMediaPath.remove(mMediaPath.size()-1);
}
mMediaPath.add(musicPath);
stopMediaPlayer();
mMusicPlayer = new MediaPlayer();
try {
mMusicPlayer.setDataSource(musicPath);
mMusicPlayer.setLooping(true);
mMusicPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer mediaPlayer) {
mediaPlayer.setVolume(0.5f, 0.5f);
mediaPlayer.start();
mMusicSeekBar.setProgress(50);
}
});
mMusicPlayer.prepareAsync();
} catch (IOException e) {
e.printStackTrace();
}
}
});
}
Call it from onActivityResult:
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (resultCode == 10000) {
String music = data.getStringExtra("music");
cutSelectMusic(music);
}
}
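cutSelectMusic also calls a stopMediaPlayer() helper that isn't shown above; a minimal sketch of it, plus the matching cleanup when the screen is destroyed, could look like this (using the mAudioPlayer and mMusicPlayer fields already in use):
private void stopMediaPlayer() {
    // release the previously selected background-music player, if any
    if (mMusicPlayer != null) {
        if (mMusicPlayer.isPlaying()) {
            mMusicPlayer.stop();
        }
        mMusicPlayer.release();
        mMusicPlayer = null;
    }
}

@Override
protected void onDestroy() {
    super.onDestroy();
    // release both players when leaving the editing screen
    stopMediaPlayer();
    if (mAudioPlayer != null) {
        mAudioPlayer.release();
        mAudioPlayer = null;
    }
}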
The editing screen now works with three separate files: the silent video being played, the video's original audio, and the background music just selected. Everything we need is in place; set the volumes you want and tap next to start composing.
10. As a beginner this part was genuinely painful, almost "from getting started to giving up". I still haven't figured out how to mux multiple audio tracks into one video directly; I went down plenty of dead ends that either failed outright or didn't meet the requirements. I'll skip those detours so I don't drag you into the same ditches, and just give the approach that finally worked for combining the video, the original audio and the background music:
1. Using the two SeekBar positions, change the volume of the original audio and of the background music with FFmpeg commands.
2. Use FFmpeg to mix the original audio and the background music into a single audio file.
3. Take that new audio file and merge it into the video with an FFmpeg command.
Open FFmpegCommands and add three methods.
/**
* change the volume of an audio file
* @param audioOrMusicUrl
* @param vol
* @param outUrl
* @return
*/
public static String[] changeAudioOrMusicVol(String audioOrMusicUrl, int vol, String outUrl) {
if (SLog.debug)
SLog.w("audioOrMusicUrl:" + audioOrMusicUrl + "\nvol:" + vol + "\noutUrl:" + outUrl);
String[] commands = new String[8];
commands[0] = "ffmpeg";
commands[1] = "-i";
commands[2] = audioOrMusicUrl;
commands[3] = "-vol";
commands[4] = String.valueOf(vol);
commands[5] = "-acodec";
commands[6] = "copy";
commands[7] = outUrl;
return commands;
}
-vol is the option that actually changes the volume.
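As far as I understand it, -vol (deprecated in newer FFmpeg builds in favor of the volume filter) treats 256 as the 100% reference, so the progress-times-ten mapping used further down works out roughly like this:
SeekBar progress 50 -> -vol 500 (about 2x the original level)
SeekBar progress 25 -> -vol 250 (close to the original level)
SeekBar progress 10 -> -vol 100 (roughly 40% of the original level)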
/**
* @param audio1
* @param audio2
* @param outputUrl
* @return
*/
public static String[] composeAudio(String audio1, String audio2, String outputUrl) {
Log.w("SLog","audio1:" + audio1 + "\naudio2:" + audio2 + "\noutputUrl:" + outputUrl);
String[] commands = new String[10];
commands[0] = "ffmpeg";
//first input
commands[1] = "-i";
commands[2] = audio1;
//second input
commands[3] = "-i";
commands[4] = audio2;
//mix the two inputs with the amix filter
commands[5] = "-filter_complex";
commands[6] = "amix=inputs=2:duration=first:dropout_transition=2";
commands[7] = "-strict";
commands[8] = "-2";
//output file
commands[9] = outputUrl;
return commands;
}
Mix and merge two audio files.
-filter_complex is very powerful, with a lot of filter effects; more of them can be learned from the official docs.
Here I use it to mix the 2 audio inputs; the resulting audio's length follows the first input (duration=first).
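Assembled, the array corresponds to a command of this shape (the file names are the temporary ones used further down):
ffmpeg -i tempAudio.aac -i tempMusic.aac -filter_complex amix=inputs=2:duration=first:dropout_transition=2 -strict -2 audioMusic.aac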
/**
* merge audio and video
* @param videoUrl
* @param musicOrAudio
* @param outputUrl
* @param second
* @return
*/
public static String[] composeVideo(String videoUrl, String musicOrAudio, String outputUrl, long second) {
Log.w("SLog","videoUrl:" + videoUrl + "\nmusicOrAudio:" + musicOrAudio + "\noutputUrl:" + outputUrl + "\nsecond:" + second);
String[] commands = new String[14];
commands[0] = "ffmpeg";
//input video
commands[1] = "-i";
commands[2] = videoUrl;
//input audio (the music or the mixed track)
commands[3] = "-i";
commands[4] = musicOrAudio;
commands[5] = "-ss";
commands[6] = "00:00:00";
commands[7] = "-t";
commands[8] = String.valueOf(second);
//copy the video and audio streams without re-encoding
commands[9] = "-vcodec";
commands[10] = "copy";
commands[11] = "-acodec";
commands[12] = "copy";
//output file
commands[13] = outputUrl;
return commands;
}
This merges the audio file into the silent video.
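As an illustration, with a 29-second clip and the file names used further down, the array corresponds to:
ffmpeg -i video.mp4 -i audioMusic.aac -ss 00:00:00 -t 29 -vcodec copy -acodec copy videoMusicAudio.mp4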
Back in MakeVideoActivity, tapping next starts building the final video.
Modify onClick to add the next-button handling:
@Override
public void onClick(View view) {
switch (view.getId()){
case R.id.back:
finish();
mFileUtils.deleteFile(mTargetPath,null);
break;
case R.id.local_music:
Intent intent = new Intent(this,MusicActivity.class);
startActivityForResult(intent,0);
break;
case R.id.next:
composeVideoAudio();
mNext.setTextColor(Color.parseColor("#999999"));
mNext.setEnabled(false);
break;
}
}
/**
* process the video's original audio (apply its chosen volume)
*/
private void composeVideoAudio() {
int mAudioVol = mAudioSeekBar.getProgress();
String audioUrl = mMediaPath.get(1);
final String audioOutUrl = mTargetPath + "/tempAudio.aac";
String[] common = FFmpegCommands.changeAudioOrMusicVol(audioUrl, mAudioVol * 10, audioOutUrl);
FFmpegRun.execute(common, new FFmpegRun.FFmpegRunListener() {
@Override
public void onStart() {
Log.e(TAG,"changeAudioVol ffmpeg start...");
handler.sendEmptyMessage(0);
}
@Override
public void onEnd(int result) {
Log.e(TAG,"changeAudioVol ffmpeg end...");
if (mMediaPath.size() == 3) {
composeVideoMusic(audioOutUrl);
} else {
composeMusicAndAudio(audioOutUrl);
}
}
});
}
There is a branch in onEnd: if no local music was selected there is no background music to mix, so the volume-adjusted original audio is merged straight into the video and we're done; otherwise we go on to process the selected background music.
/**
* process the background music (apply its chosen volume)
*/
private void composeVideoMusic(final String audioUrl) {
final int mMusicVol = mMusicSeekBar.getProgress();
String musicUrl;
if (audioUrl == null) {
musicUrl = mMediaPath.get(1);
} else {
musicUrl = mMediaPath.get(2);
}
final String musicOutUrl = mTargetPath + "/tempMusic.aac";
final String[] common = FFmpegCommands.changeAudioOrMusicVol(musicUrl, mMusicVol * 10, musicOutUrl);
FFmpegRun.execute(common, new FFmpegRun.FFmpegRunListener() {
@Override
public void onStart() {
Log.e(TAG,"changeMusicVol ffmpeg start...");
handler.sendEmptyMessage(0);
}
@Override
public void onEnd(int result) {
Log.e(TAG,"changeMusicVol ffmpeg end...");
composeAudioAndMusic(audioUrl, musicOutUrl);
}
});
}
With the original audio and the background music both processed, mix the two into a single audio file:
/**
* mix the original audio and the background music
*/
public void composeAudioAndMusic(String audioUrl, String musicUrl) {
if (audioUrl == null) {
composeMusicAndAudio(musicUrl);
} else {
final String musicAudioPath = mTargetPath + "/audioMusic.aac";
String[] common = FFmpegCommands.composeAudio(audioUrl, musicUrl, musicAudioPath);
FFmpegRun.execute(common, new FFmpegRun.FFmpegRunListener() {
@Override
public void onStart() {
Log.e(TAG,"composeAudioAndMusic ffmpeg start...");
handler.sendEmptyMessage(0);
}
@Override
public void onEnd(int result) {
Log.e(TAG,"composeAudioAndMusic ffmpeg end...");
composeMusicAndAudio(musicAudioPath);
}
});
}
}
Finally, merge the resulting audio file into the silent video:
/**
* merge the video with the final soundtrack
*
* @param bgMusicAndAudio
*/
private void composeMusicAndAudio(String bgMusicAndAudio) {
final String videoAudioPath = mTargetPath + "/videoMusicAudio.mp4";
final String videoUrl = mMediaPath.get(0);
final int time = getIntent().getIntExtra("time",0) - 1;
String[] common = FFmpegCommands.composeVideo(videoUrl, bgMusicAndAudio, videoAudioPath, time);
FFmpegRun.execute(common, new FFmpegRun.FFmpegRunListener() {
@Override
public void onStart() {
Log.e(TAG,"videoAndAudio ffmpeg start...");
handler.sendEmptyMessage(0);
}
@Override
public void onEnd(int result) {
Log.e(TAG,"videoAndAudio ffmpeg end...");
handleVideoNext(videoAudioPath);
}
});
}
If nothing goes wrong you end up with the finished video file and can move on to whatever logic comes next.
/**
* processing finished, move on to the next step
*/
private void handleVideoNext(String videoUrl) {
Message message = new Message();
message.what = 1;
message.obj = videoUrl;
handler.sendMessage(message);
}
Handler handler = new Handler() {
@Override
public void handleMessage(Message msg) {
super.handleMessage(msg);
switch (msg.what) {
case 0:
showProgressLoading();
break;
case 1:
dismissProgress();
String videoPath = (String) msg.obj;
Intent intent = new Intent(MakeVideoActivity.this,MakeVideoActivity.class);
intent.putExtra("path",videoPath);
intent.putExtra(MakeVideoActivity.class.getSimpleName(),true);
startActivity(intent);
finish();
break;
case 2:
dismissProgress();
break;
}
}
};
private void showProgressLoading(){
}
private void dismissProgress(){
}
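These two methods are left empty above; a minimal version using a plain ProgressDialog could look like this (the mProgressDialog field is an extra addition, not shown elsewhere in the code):
private ProgressDialog mProgressDialog;

private void showProgressLoading() {
    if (mProgressDialog == null) {
        mProgressDialog = new ProgressDialog(this);
        mProgressDialog.setMessage("Processing...");
        mProgressDialog.setCancelable(false);
    }
    if (!mProgressDialog.isShowing()) {
        mProgressDialog.show();
    }
}

private void dismissProgress() {
    if (mProgressDialog != null && mProgressDialog.isShowing()) {
        mProgressDialog.dismiss();
    }
}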
Here it simply jumps back into the same screen and plays the result directly.
With that, all of the target features are implemented. The FFmpeg features used here are only the tip of the iceberg and there is much more to explore; I hope these two articles are enough to get you started with FFmpeg on Android.
When I have time I'll also put together an Android OpenCV demo.
Source code
Compiling the FFmpeg .so file in Android Studio
Composing multiple short video clips with FFmpeg on Android
Reading may not change your fate, but learning will always move you forward.