
Author: 编程

There are plenty of custom-camera examples online; this is just a small demo I wrote recently, for reference only. It uses the following two frameworks:

```
#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h>
```

I modified the way GPUImage is imported and also fixed the problem of the video coming out rotated 90° after editing. There are many popular beauty filters around; my demo ships with two (GPUImageBeautifyFilter and FSKGPUImageBeautyFilter). I originally meant to write an album manager, but project requirements turned it into short-video handling, so the project name never changed; the demo link is given below. It's only meant to share the general approach — please don't flame. Straight to the important parts of the demo: the short-video feature uses GPUImageVideoCamera. The project needed a 1:1 aspect ratio, so I added a GPUImageCropFilter (its crop region is given in normalized 0–1 coordinates); if your needs differ, change this filter's settings or drop it. The camera setup code is shown below.

Now for the Android side: as of today, Android cannot record a square video directly. Even though we can't record it directly, there are ways to process the video after recording. I previously wrote an article, Android自定义Camera(一), which you can read first to see how to build a simple custom camera demo. Recording video also requires starting the camera preview; the steps and the points to watch out for follow below.

Back to the iOS demo for a moment: to use it you need to add the relevant permission entries to Info.plist: `Privacy - Microphone Usage Description`, `Privacy - Photo Library Usage Description`, and `Privacy - Camera Usage Description`.
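In raw plist source those three entries correspond to the `NSMicrophoneUsageDescription`, `NSPhotoLibraryUsageDescription`, and `NSCameraUsageDescription` keys. A minimal example — the description strings here are placeholders, write your own:

```
<key>NSCameraUsageDescription</key>
<string>Used to capture photos and record video</string>
<key>NSMicrophoneUsageDescription</key>
<string>Used to record audio for short videos</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>Used to save recorded videos to your photo library</string>
```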

```
// Camera setup
- (void)customSystemSession {
    WS
    self.imgView = [UIImageView new];
    self.imgView.clipsToBounds = YES;
    [self.view addSubview:self.imgView];
    [self.imgView mas_makeConstraints:^(MASConstraintMaker *make) {
        make.top.equalTo(weakSelf.topView.mas_bottom).offset(0); // offset values were stripped in the original; 0 is assumed
        make.left.equalTo(weakSelf.view.mas_left).offset(0);
        make.right.equalTo(weakSelf.view.mas_right).offset(0);
        make.height.mas_equalTo(SCREEN_WIDTH);
    }];
    // Beauty camera
    self.kj_videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1280x720 cameraPosition:AVCaptureDevicePositionBack];
    self.kj_videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
    self.kj_videoCamera.horizontallyMirrorFrontFacingCamera = YES;
    self.kj_filterView = [[GPUImageView alloc] initWithFrame:CGRectMake(0, 0, SCREEN_WIDTH, SCREEN_WIDTH)];
    self.kj_filterView.fillMode = kGPUImageFillModePreserveAspectRatioAndFill;
//    self.filterView.center = self.view.center;
    [self.imgView addSubview:self.kj_filterView];
    // Crop filter: if you need full screen, change this filter's settings or remove it
    self.kj_cropFilter = [[GPUImageCropFilter alloc] initWithCropRegion:CGRectMake(0, 44/SCREEN_HEIGHT, 1, SCREEN_WIDTH/SCREEN_HEIGHT)];
    // Beauty filter
    self.kj_beautifyFilter = [[FSKGPUImageBeautyFilter alloc] init];
    self.kj_beautifyFilter.beautyLevel = 0.9f; // smoothing level
    self.kj_beautifyFilter.brightLevel = 0.7f; // whitening level
    self.kj_beautifyFilter.toneLevel = 0.9f;   // tone level
//    self.kj_filter = [[GPUImageSaturationFilter alloc] init];
//    // Filter group
//    self.kj_filterGroup = [[GPUImageFilterGroup alloc] init];
//    [self.kj_filterGroup addFilter:self.kj_cropFilter];
//    [self.kj_filterGroup addFilter:self.kj_beautifyFilter];
//    [self openBeautify];
    [self.kj_videoCamera addAudioInputsAndOutputs];
    [self.kj_videoCamera addTarget:self.kj_cropFilter];
    [self.kj_cropFilter addTarget:self.kj_beautifyFilter];
    [self.kj_beautifyFilter addTarget:self.kj_filterView];
    [self.kj_videoCamera startCameraCapture];
}
```

Now the Android steps — 1. Get a Camera instance

```
/**
 * Get a Camera instance
 */
private Camera getCamera(int id) {
    Camera camera = null;
    try {
        camera = Camera.open(id);
    } catch (Exception e) {
        // Camera is in use or unavailable; camera stays null
    }
    return camera;
}
```

2. Start the preview

Note that the preview should be started in the Activity's onResume and the camera released in onPause. A simple example of why: if you press the home key while previewing and then reopen the app, and you started the camera in onCreate, the preview screen will appear frozen when you come back.
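A minimal sketch of that lifecycle handling, reusing getCamera and startPreview from this article (mCamera, mCameraId, and mSurfaceHolder are assumed fields; mSurfaceHolder would be the SurfaceView's holder):

```
@Override
protected void onResume() {
    super.onResume();
    if (mCamera == null) {
        mCamera = getCamera(mCameraId);            // re-acquire the camera every time we come back
        if (mSurfaceHolder != null) {
            startPreview(mCamera, mSurfaceHolder); // restart the preview
        }
    }
}

@Override
protected void onPause() {
    super.onPause();
    if (mCamera != null) {
        mCamera.stopPreview();  // release in onPause so the camera isn't held while backgrounded
        mCamera.release();
        mCamera = null;
    }
}
```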

```
/**
 * Start the camera preview
 */
private void startPreview(Camera camera, SurfaceHolder holder) {
    try {
        setupCamera(camera);
        camera.setPreviewDisplay(holder);
        // Get the recorder rotation; needed later when recording video
        recorderRotation = CameraUtil.getInstance().getRecorderRotation(mCameraId);
        CameraUtil.getInstance().setCameraDisplayOrientation(this, mCameraId, camera);
        camera.startPreview();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
```

```
/**
 * Configure camera parameters
 */
private void setupCamera(Camera camera) {
    if (camera != null) {
        Camera.Parameters parameters = camera.getParameters();

        List<String> focusModes = parameters.getSupportedFocusModes();
        if (focusModes != null && focusModes.contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE)) {
            // Enable continuous auto focus
            parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
        }

        // All video sizes the camera supports
        List<Camera.Size> videoSizes = parameters.getSupportedVideoSizes();
        if (videoSizes != null && videoSizes.size() > 0) {
            // Pick a video size whose width is at least 720 pixels
            Camera.Size videoSize = CameraUtil.getInstance().getPropVideoSize(videoSizes, 720);
            video_width = videoSize.width;
            video_height = videoSize.height;
            LogUtils.i("video_width===" + video_width);
            LogUtils.i("video_height===" + video_height);
        }

        // The third argument is the minimum size; getPropPreviewSize sorts the supported
        // sizes in ascending order and picks the smallest one that meets this minimum
        Camera.Size previewSize = CameraUtil.getInstance().getPropPreviewSize(parameters.getSupportedPreviewSizes(), video_width);
        parameters.setPreviewSize(previewSize.width, previewSize.height);

        Camera.Size pictureSize = CameraUtil.getInstance().getPropPictureSize(parameters.getSupportedPictureSizes(), video_width);
        parameters.setPictureSize(pictureSize.width, pictureSize.height);

        camera.setParameters(parameters);

        /*
         * Size the SurfaceView. The camera is landscape by default, so the supported sizes
         * are landscape too; startPreview corrects the orientation, but when sizing the
         * SurfaceView note that previewSize.height < previewSize.width, and previewSize.width
         * is actually the on-screen height. Here the width is the screen width and the
         * height follows from the aspect ratio; set whatever size you want.
         */
        FrameLayout.LayoutParams params = new FrameLayout.LayoutParams(screenWidth, (screenWidth * video_width) / video_height);
        // You can of course position the preview, e.g. centered; here it is pinned to the top
        surfaceView.setLayoutParams(params);

        RelativeLayout.LayoutParams layoutParams = new RelativeLayout.LayoutParams(screenWidth, screenheight - screenWidth);
        layoutParams.addRule(RelativeLayout.BELOW, surfaceView.getId());
        layoutParams.addRule(RelativeLayout.ALIGN_PARENT_BOTTOM, RelativeLayout.TRUE);
        bottomLayout.setLayoutParams(layoutParams);
    }
}
```
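The CameraUtil helpers aren't listed in this article. As a rough idea of what getPropPreviewSize does, based on the description above (sort ascending, return the smallest supported size at least as wide as the minimum), here is a hypothetical reconstruction — not the demo's actual code (uses java.util.Collections/Comparator):

```
// Hypothetical reconstruction of CameraUtil.getPropPreviewSize as described above
public Camera.Size getPropPreviewSize(List<Camera.Size> sizes, int minWidth) {
    Collections.sort(sizes, new Comparator<Camera.Size>() {
        @Override
        public int compare(Camera.Size a, Camera.Size b) {
            return a.width - b.width; // ascending by width
        }
    });
    for (Camera.Size size : sizes) {
        if (size.width >= minWidth) {
            return size; // the smallest size that satisfies the minimum
        }
    }
    return sizes.get(sizes.size() - 1); // nothing large enough: fall back to the largest
}
```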

As for the iOS demo, I modeled it on WeChat: tap to take a photo, long-press to record video, and play the video back immediately after recording. A simple player is wrapped up for this (shown further below).

I won't cover switching between front and rear cameras, the flash, and so on here. For short video my approach is to merge multiple clips: each clip is recorded with GPUImageVideoCamera, with the beauty filter applied. You could apply real-time filters here too, but since the company only needed beautification I didn't handle other filters (switching filters works the same way as the beauty filter: pick the filter before each clip is recorded and addTarget it — and if one is already attached, removeTarget it first, then add the new one). Later I also wrote a separate, simple video-editing feature that applies filters to local videos and images; more on that below. The most important part of the short-video feature, merging the clips, is shown a bit further down after the Android recording section.

3.开头摄像

The thing to be careful about here is that the order in which MediaRecorder's methods are called must not be mixed up; see the official API documentation for details.
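In sketch form, the order that works is: attach the unlocked camera first, set the sources before the output format, the format before the encoders, and everything before prepare/start (camera, outputPath, and surface stand in for your own objects; AAC/H264 match the profile settings used further below):

```
MediaRecorder recorder = new MediaRecorder();
camera.unlock();
recorder.setCamera(camera);                                   // 1. attach the unlocked camera
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);    // 2. sources first
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);  // 3. output format before the encoders
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);     // 4. encoders
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
recorder.setOutputFile(outputPath);                           // 5. output file
recorder.setPreviewDisplay(surface);
recorder.prepare();                                           // 6. prepare, then start
recorder.start();
```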

Now start recording:

```
protected void start() {
    try {
        pathName = System.currentTimeMillis() + "";
        // Output path for the video
        file = new File(MyApplication.getInstance().getTempPath() + File.separator + pathName + AppConfig.MP4);

        // Create the directory if it doesn't exist
        BitmapUtils.makeDir(file);

        // Initialize a MediaRecorder
        if (mediaRecorder == null) {
            mediaRecorder = new MediaRecorder();
        } else {
            mediaRecorder.reset();
        }

        mCamera.unlock();
        mediaRecorder.setCamera(mCamera);
        // Orientation of the output video; many devices need this for playback — it is a file property
        mediaRecorder.setOrientationHint(recorderRotation);

        // Video and audio sources
        mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        mediaRecorder.setAudioChannels(2);
        // Output format of the recorded media
        //            mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        // Audio encoder
        //            mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        // Video encoder
        //            mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_720P)) {
            profile = CamcorderProfile.get(CamcorderProfile.QUALITY_720P);
        } else if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_1080P)) {
            profile = CamcorderProfile.get(CamcorderProfile.QUALITY_1080P);
        } else if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_HIGH)) {
            profile = CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH);
        } else if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_LOW)) {
            profile = CamcorderProfile.get(CamcorderProfile.QUALITY_LOW);
        }

        if (profile != null) {
            profile.audioCodec = MediaRecorder.AudioEncoder.AAC;
            profile.audioChannels = 1;
            profile.audioSampleRate = 16000;

            profile.videoCodec = MediaRecorder.VideoEncoder.H264;
            mediaRecorder.setProfile(profile);
        }

        // Video size
        mediaRecorder.setVideoSize(video_width, video_height);

        // Higher bit rate, higher quality
        mediaRecorder.setVideoEncodingBitRate(5 * 1024 * 1024);

        // Frame rate
        //            mediaRecorder.setVideoFrameRate(5);
        // Output path of the recorded file
        mediaRecorder.setOutputFile(file.getAbsolutePath());
        mediaRecorder.setMaxDuration(2000);

        // Preview surface while capturing
        mediaRecorder.setPreviewDisplay(surfaceView.getHolder().getSurface());

        mediaRecorder.setOnErrorListener(new MediaRecorder.OnErrorListener() {

            @Override
            public void onError(MediaRecorder mr, int what, int extra) {
                // An error occurred — stop recording
                if (mediaRecorder != null) {
                    mediaRecorder.stop();
                    mediaRecorder.release();
                    mediaRecorder = null;
                    LogUtils.i("Error");
                }
            }
        });

        mediaRecorder.setOnInfoListener(new MediaRecorder.OnInfoListener() {
            @Override
            public void onInfo(MediaRecorder mr, int what, int extra) {
                // Recording finished
            }
        });

        // Prepare, then start
        mediaRecorder.prepare();
        mediaRecorder.start();

        // Drive the progress bar from a worker thread
        new Thread(new Runnable() {
            @Override
            public void run() {
                for (int i = 0; i < PROGRESS_MAX; i++) {
                    try {
                        Thread.sleep(20);

                        Message message = new Message();
                        message.what = 1;
                        message.obj = i;
                        handler.sendMessage(message);
                    } catch (InterruptedException e) {
                        e.printStackTrace();
                    }
                }
            }
        }).start();

    } catch (Exception e) {
        e.printStackTrace();
    }
}
```
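The matching stop step isn't shown in the article; a minimal sketch, assuming the same mediaRecorder and mCamera fields as in start(), would be:

```
protected void stop() {
    if (mediaRecorder != null) {
        try {
            mediaRecorder.stop();     // throws if called too soon after start()
        } catch (RuntimeException e) {
            e.printStackTrace();      // recording was too short; the output file is unusable
        }
        mediaRecorder.reset();
        mediaRecorder.release();
        mediaRecorder = null;
    }
    if (mCamera != null) {
        mCamera.lock();               // take the camera back from MediaRecorder
    }
}
```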

Returning to iOS: here is the simple player mentioned above (HAVPlayer.m):

#import "HAVPlayer.h"#import <AVFoundation/AVFoundation.h>@interface HAVPlayer ()@property (nonatomic,strong) AVPlayer *player;//播放器对象@end@implementation HAVPlayer/*// Only override drawRect: if you perform custom drawing.// An empty implementation adversely affects performance during animation.- drawRect:rect { // Drawing code}*/- (instancetype)initWithFrame:frame withShowInView:bgView url:url { if (self = [self initWithFrame:frame]) { //创建播放器层 AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player]; playerLayer.frame = self.bounds; [self.layer addSublayer:playerLayer]; if  { self.videoUrl = url; } [bgView addSubview:self]; } return self;}- dealloc { [self removeAvPlayerNtf]; [self stopPlayer]; self.player = nil;}- (AVPlayer *)player { if  { _player = [AVPlayer playerWithPlayerItem:[self getAVPlayerItem]]; [self addAVPlayerNtf:_player.currentItem]; } return _player;}- (AVPlayerItem *)getAVPlayerItem { AVPlayerItem *playerItem=[AVPlayerItem playerItemWithURL:self.videoUrl]; return playerItem;}- setVideoUrl:videoUrl { _videoUrl = videoUrl; [self removeAvPlayerNtf]; [self nextPlayer];}- nextPlayer { [self.player seekToTime:CMTimeMakeWithSeconds(0, _player.currentItem.duration.timescale)]; [self.player replaceCurrentItemWithPlayerItem:[self getAVPlayerItem]]; [self addAVPlayerNtf:self.player.currentItem]; if (self.player.rate == 0) { [self.player play]; }}-  addAVPlayerNtf:(AVPlayerItem *)playerItem { //监控状态属性 [playerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil]; //监控网络加载情况属性 [playerItem addObserver:self forKeyPath:@"loadedTimeRanges" options:NSKeyValueObservingOptionNew context:nil]; [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playbackFinished:) name:AVPlayerItemDidPlayToEndTimeNotification object:self.player.currentItem];}- removeAvPlayerNtf { AVPlayerItem *playerItem = self.player.currentItem; [playerItem removeObserver:self forKeyPath:@"status"]; [playerItem removeObserver:self forKeyPath:@"loadedTimeRanges"]; [[NSNotificationCenter defaultCenter] removeObserver:self];}- stopPlayer { if (self.player.rate == 1) { [self.player pause];//如果在播放状态就停止 }}/** * 通过KVO监控播放器状态 * * @param keyPath 监控属性 * @param object 监视器 * @param change 状态改变 * @param context 上下文 */-observeValueForKeyPath:(NSString *)keyPath ofObject:object change:(NSDictionary *)change context:context{ AVPlayerItem *playerItem = object; if ([keyPath isEqualToString:@"status"]) { AVPlayerStatus status= [[change objectForKey:@"new"] intValue]; if(status==AVPlayerStatusReadyToPlay){ NSLog(@"正在播放...,视频总长度:%.2f",CMTimeGetSeconds(playerItem.duration)); } }else if([keyPath isEqualToString:@"loadedTimeRanges"]){ NSArray *array=playerItem.loadedTimeRanges; CMTimeRange timeRange = [array.firstObject CMTimeRangeValue];//本次缓冲时间范围 float startSeconds = CMTimeGetSeconds(timeRange.start); float durationSeconds = CMTimeGetSeconds(timeRange.duration); NSTimeInterval totalBuffer = startSeconds + durationSeconds;//缓冲总长度 NSLog(@"共缓冲:%.2f",totalBuffer); }}- playbackFinished:(NSNotification *)ntf { Plog(@"视频播放完成"); [self.player seekToTime:CMTimeMake]; [self.player play];}@end

And here is the key part of the short-video feature promised earlier — the Confirm handler that merges the recorded clips:

```
// Confirm: merge the recorded clips
- (void)onCompleteAction:(UIButton *)sender {
    if (self.kj_videoArray.count > 0) {
        if (self.kj_videoArray.count > 1) {
            // Multiple clips need merging
            if (!self.kj_outPath) {
                self.kj_outPath = [self getVideoOutPath]; // output path for the merged video
            }
            // If a previously merged file exists locally, delete it and merge again
            if ([[NSFileManager defaultManager] fileExistsAtPath:self.kj_outPath]) {
                [[NSFileManager defaultManager] removeItemAtPath:self.kj_outPath error:nil];
            }
            // Composition for merging audio and video
            AVMutableComposition *kj_composition = [AVMutableComposition composition];
            // Audio track
            AVMutableCompositionTrack *kj_audioTrack = [kj_composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
            // Video track
            AVMutableCompositionTrack *kj_videoTrack = [kj_composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
            // Start merging
            [KJUtility showProgressDialogText:@"视频处理中..."];
            CMTime kj_totalDuration = kCMTimeZero;
            for (int i = 0; i < self.kj_videoArray.count; i++) {
                NSDictionary *localDict = self.kj_videoArray[i];
                NSDictionary *options = @{AVURLAssetPreferPreciseDurationAndTimingKey: @YES};
                AVAsset *kj_asset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:localDict[@"path"]] options:options];
                // Extract the audio from kj_asset
                NSArray *audioArray = [kj_asset tracksWithMediaType:AVMediaTypeAudio];
                AVAssetTrack *kj_assetAudio = audioArray.firstObject;
                // Append the audio to kj_audioTrack
                NSError *kj_audioError = nil;
                BOOL isComplete_audio = [kj_audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, kj_asset.duration) ofTrack:kj_assetAudio atTime:kj_totalDuration error:&kj_audioError];
                NSLog(@"加入音频%d isComplete_audio:%d error:%@", i, isComplete_audio, kj_audioError);
                // Extract the video from kj_asset
                NSArray *videoArray = [kj_asset tracksWithMediaType:AVMediaTypeVideo];
                AVAssetTrack *kj_assetVideo = videoArray.firstObject;
                // Append the video to kj_videoTrack
                NSError *kj_videoError = nil;
                BOOL isComplete_video = [kj_videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, kj_asset.duration) ofTrack:kj_assetVideo atTime:kj_totalDuration error:&kj_videoError];
                NSLog(@"加入视频%d isComplete_video:%d error:%@", i, isComplete_video, kj_videoError);
                kj_totalDuration = CMTimeAdd(kj_totalDuration, kj_asset.duration);
            }
            // A watermark could be added here; not handled in this demo
            // Export the merged video
            AVAssetExportSession *kj_export = [AVAssetExportSession exportSessionWithAsset:kj_composition presetName:AVAssetExportPreset1280x720];
            kj_export.outputURL = [NSURL fileURLWithPath:self.kj_outPath];
            kj_export.outputFileType = AVFileTypeMPEG4;
            kj_export.shouldOptimizeForNetworkUse = YES;
            WS
            [kj_export exportAsynchronouslyWithCompletionHandler:^{
                dispatch_async(dispatch_get_main_queue(), ^{
                    [KJUtility hideProgressDialog];
                    if (weakSelf.kjFileDelegate && [weakSelf.kjFileDelegate respondsToSelector:@selector(kj_videoFileCompleteLocalPath:)]) {
                        // Merge succeeded: delete the individual clips
                        [weakSelf clearAllVideo];
                        NSLog(@"%@", weakSelf.kj_outPath);
                        [weakSelf.kjFileDelegate kj_videoFileCompleteLocalPath:weakSelf.kj_outPath];
                    } else {
                        [weakSelf saveVideoToLibrary];
                    }
                });
            }];
        } else {
            // Only one clip
            [KJUtility hideProgressDialog];
            NSDictionary *dict = self.kj_videoArray.firstObject;
            self.kj_outPath = dict[@"path"];
            if (self.kjFileDelegate && [self.kjFileDelegate respondsToSelector:@selector(kj_videoFileCompleteLocalPath:)]) {
                [self.kjFileDelegate kj_videoFileCompleteLocalPath:self.kj_outPath];
            } else {
                [self saveVideoToLibrary];
            }
        }
    }
}
```

Android again — 4. After recording succeeds, the key part comes next

Use FFmpeg to crop the video to a square. FFmpeg performs video operations via shell-style commands and runs very efficiently. First you need to integrate FFmpeg into your Android project; I downloaded a prebuilt library and imported it into the project directly — interested readers can compile the library themselves. Here is the command, explained piece by piece:

```
ffmpeg -threads 4 -y -i /storage/emulated/0/CustomCamera/temp/1476598263062.mp4 -metadata:s:v rotate="0" -vf transpose=1,crop=width:height:x:y -preset ultrafast -tune zerolatency -r 25 -vcodec libx264 -acodec copy /storage/emulated/0/CustomCamera/VIDEO/1476598263062.mp4
```

-y: overwrite the output file if it already exists

-i: the input file

-metadata:s:v rotate="0": re-encode and clear the rotate metadata — recordings are landscape by default, so the video is re-encoded here

transpose=1: rotation mode; 0 = 90CounterClockwise and Vertical Flip (the default), 1 = 90Clockwise, 2 = 90CounterClockwise, 3 = 90Clockwise and Vertical Flip

crop=width:height:x:y: width and height are the size after cropping, and x:y is the top-left corner of the crop area

-preset ultrafast -tune zerolatency: speeds up encoding

-r 25: the frame rate

-vcodec libx264 -acodec copy: encode video with libx264 and copy the audio stream. If you want to learn more about ffmpeg, visit the official site: https://ffmpeg.org/
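As a concrete (hypothetical) example: a 1280×720 landscape recording becomes a 720×1280 portrait frame after transpose=1, so a centered 720×720 square crop starts at y = (1280 − 720) / 2 = 280:

```
ffmpeg -y -i in.mp4 -metadata:s:v rotate="0" -vf transpose=1,crop=720:720:0:280 -preset ultrafast -tune zerolatency -r 25 -vcodec libx264 -acodec copy out.mp4
```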
Here is how I call it from code:

```
try {
    fc.compress_clipVideo(file.getAbsolutePath(), file2.getAbsolutePath(), mCameraId, video_width, video_height, 0, 0, new ShellUtils.ShellCallback() {

        @Override
        public void shellOut(String shellLine) {}

        @Override
        public void processComplete(int exitValue) {
            dialog.dismiss();
            if (exitValue != 0) {
                ToastFactory.showLongToast(context, getResources().getString(R.string.state_compress_error));
                mHandler.sendEmptyMessage(R.string.state_compress_error);
            } else {
                mHandler.sendEmptyMessage(R.string.state_compress_end);
            }
        }
    });
} catch (Exception e) {
    e.printStackTrace();
}
```

The FFmpeg library compiled for Android is still fairly large, roughly 15 MB, which undeniably increases the APK size. If your product focuses on video/image/GIF format conversion and similar features, it's worth considering.

If you're interested in this feature, you can dig into the full source:

GitHub source address

As for the WeChat-style circular progress arc that appears around the record button while long-pressing, here is its implementation (HProgressView.m):

#import "HProgressView.h"@interface HProgressView ()/** * 进度值0-1.0之间 */@property (nonatomic,assign)CGFloat progressValue;@property (nonatomic, assign) CGFloat currentTime;@end@implementation HProgressView// Only override drawRect: if you perform custom drawing.// An empty implementation adversely affects performance during animation.- drawRect:rect { // Drawing code CGContextRef ctx = UIGraphicsGetCurrentContext();//获取上下文 Plog(@"width = %f",self.frame.size.width); CGPoint center = CGPointMake(self.frame.size.width/2.0, self.frame.size.width/2.0); //设置圆心位置 CGFloat radius = self.frame.size.width/2.0-5; //设置半径 CGFloat startA = - M_PI_2; //圆起点位置 CGFloat endA = -M_PI_2 + M_PI * 2 * _progressValue; //圆终点位置 UIBezierPath *path = [UIBezierPath bezierPathWithArcCenter:center radius:radius startAngle:startA endAngle:endA clockwise:YES]; CGContextSetLineWidth; //设置线条宽度 [[UIColor whiteColor] setStroke]; //设置描边颜色 CGContextAddPath(ctx, path.CGPath); //把路径添加到上下文 CGContextStrokePath; //渲染}- setTimeMax:(NSInteger)timeMax { _timeMax = timeMax; self.currentTime = 0; self.progressValue = 0; [self setNeedsDisplay]; self.hidden = NO; [self performSelector:@selector(startProgress) withObject:nil afterDelay:0.1];}- clearProgress { _currentTime = _timeMax; self.hidden = YES;}- startProgress { _currentTime += 0.1; if (_timeMax > _currentTime) { _progressValue = _currentTime/_timeMax; Plog(@"progress = %f",_progressValue); [self setNeedsDisplay]; [self performSelector:@selector(startProgress) withObject:nil afterDelay:0.1]; } if (_timeMax <= _currentTime) { [self clearProgress]; }}@end

Next is the camera view controller itself. It was written in a hurry, so it uses a xib — don't use it directly in your project. Here is the .m file:

#import "HVideoViewController.h"#import <AVFoundation/AVFoundation.h>#import "HAVPlayer.h"#import "HProgressView.h"#import <Foundation/Foundation.h>#import <AssetsLibrary/AssetsLibrary.h>typedef void(^PropertyChangeBlock)(AVCaptureDevice *captureDevice);@interface HVideoViewController ()<AVCaptureFileOutputRecordingDelegate>//轻触拍照,按住摄像@property (strong, nonatomic) IBOutlet UILabel *labelTipTitle;//视频输出流@property (strong,nonatomic) AVCaptureMovieFileOutput *captureMovieFileOutput;//图片输出流//@property (strong,nonatomic) AVCaptureStillImageOutput *captureStillImageOutput;//照片输出流//负责从AVCaptureDevice获得输入数据@property (strong,nonatomic) AVCaptureDeviceInput *captureDeviceInput;//后台任务标识@property (assign,nonatomic) UIBackgroundTaskIdentifier backgroundTaskIdentifier;@property (assign,nonatomic) UIBackgroundTaskIdentifier lastBackgroundTaskIdentifier;@property (weak, nonatomic) IBOutlet UIImageView *focusCursor; //聚焦光标//负责输入和输出设备之间的数据传递@property(nonatomic)AVCaptureSession *session;//图像预览层,实时显示捕获的图像@property(nonatomic)AVCaptureVideoPreviewLayer *previewLayer;@property (strong, nonatomic) IBOutlet UIButton *btnBack;//重新录制@property (strong, nonatomic) IBOutlet UIButton *btnAfresh;//确定@property (strong, nonatomic) IBOutlet UIButton *btnEnsure;//摄像头切换@property (strong, nonatomic) IBOutlet UIButton *btnCamera;@property (strong, nonatomic) IBOutlet UIImageView *bgView;//记录录制的时间 默认最大60秒@property (assign, nonatomic) NSInteger seconds;//记录需要保存视频的路径@property (strong, nonatomic) NSURL *saveVideoUrl;//是否在对焦@property (assign, nonatomic) BOOL isFocus;@property (strong, nonatomic) IBOutlet NSLayoutConstraint *afreshCenterX;@property (strong, nonatomic) IBOutlet NSLayoutConstraint *ensureCenterX;@property (strong, nonatomic) IBOutlet NSLayoutConstraint *backCenterX;//视频播放@property (strong, nonatomic) HAVPlayer *player;@property (strong, nonatomic) IBOutlet HProgressView *progressView;//是否是摄像 YES 代表是录制 NO 表示拍照@property (assign, nonatomic) BOOL isVideo;@property (strong, nonatomic) UIImage *takeImage;@property (strong, nonatomic) UIImageView *takeImageView;@property (strong, nonatomic) IBOutlet UIImageView *imgRecord;@end//时间大于这个就是视频,否则为拍照#define TimeMax 1@implementation HVideoViewController-dealloc{ [self removeNotification]; }- viewDidLoad { [super viewDidLoad]; // Do any additional setup after loading the view. 
UIImage *image = [UIImage imageNamed:@"sc_btn_take.png"]; self.backCenterX.constant = -(SCREEN_WIDTH/2/2)-image.size.width/2/2; self.progressView.layer.cornerRadius = self.progressView.frame.size.width/2; if (self.HSeconds == 0) { self.HSeconds = 60; } [self performSelector:@selector(hiddenTipsLabel) withObject:nil afterDelay:4];}- hiddenTipsLabel { self.labelTipTitle.hidden = YES;}- didReceiveMemoryWarning { [super didReceiveMemoryWarning]; // Dispose of any resources that can be recreated.}- viewWillAppear:animated { [super viewWillAppear:animated]; [[UIApplication sharedApplication] setStatusBarHidden:YES]; [self customCamera]; [self.session startRunning];}-viewDidAppear:animated{ [super viewDidAppear:animated];}-viewDidDisappear:animated{ [super viewDidDisappear:animated]; [self.session stopRunning];}- viewWillDisappear:animated { [super viewWillDisappear:animated]; [[UIApplication sharedApplication] setStatusBarHidden:NO];}- customCamera { //初始化会话,用来结合输入输出 self.session = [[AVCaptureSession alloc] init]; //设置分辨率 (设备支持的最高分辨率) if ([self.session canSetSessionPreset:AVCaptureSessionPresetHigh]) { self.session.sessionPreset = AVCaptureSessionPresetHigh; } //取得后置摄像头 AVCaptureDevice *captureDevice = [self getCameraDeviceWithPosition:AVCaptureDevicePositionBack]; //添加一个音频输入设备 AVCaptureDevice *audioCaptureDevice=[[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject]; //初始化输入设备 NSError *error = nil; self.captureDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:captureDevice error:&error]; if  { Plog(@"取得设备输入对象时出错,错误原因:%@",error.localizedDescription); return; } //添加音频 error = nil; AVCaptureDeviceInput *audioCaptureDeviceInput=[[AVCaptureDeviceInput alloc]initWithDevice:audioCaptureDevice error:&error]; if  { NSLog(@"取得设备输入对象时出错,错误原因:%@",error.localizedDescription); return; } //输出对象 self.captureMovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];//视频输出 //将输入设备添加到会话 if ([self.session canAddInput:self.captureDeviceInput]) { [self.session addInput:self.captureDeviceInput]; [self.session addInput:audioCaptureDeviceInput]; //设置视频防抖 AVCaptureConnection *connection = [self.captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo]; if ([connection isVideoStabilizationSupported]) { connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeCinematic; } } //将输出设备添加到会话 (刚开始 是照片为输出对象) if ([self.session canAddOutput:self.captureMovieFileOutput]) { [self.session addOutput:self.captureMovieFileOutput]; } //创建视频预览层,用于实时展示摄像头状态 self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session]; self.previewLayer.frame = self.view.bounds;//CGRectMake(0, 0, self.view.width, self.view.height); self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;//填充模式 [self.bgView.layer addSublayer:self.previewLayer]; [self addNotificationToCaptureDevice:captureDevice]; [self addGenstureRecognizer];}- onCancelAction:(UIButton *)sender { [self dismissViewControllerAnimated:YES completion:^{ [Utility hideProgressDialog]; }];}- touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event { if ([[touches anyObject] view] == self.imgRecord) { Plog; //根据设备输出获得连接 AVCaptureConnection *connection = [self.captureMovieFileOutput connectionWithMediaType:AVMediaTypeAudio]; //根据连接取得设备输出的数据 if (![self.captureMovieFileOutput isRecording]) { //如果支持多任务则开始多任务 if ([[UIDevice currentDevice] isMultitaskingSupported]) { self.backgroundTaskIdentifier = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:nil]; } if (self.saveVideoUrl) { 
[[NSFileManager defaultManager] removeItemAtURL:self.saveVideoUrl error:nil]; } //预览图层和视频方向保持一致 connection.videoOrientation = [self.previewLayer connection].videoOrientation; NSString *outputFielPath=[NSTemporaryDirectory() stringByAppendingString:@"myMovie.mov"]; NSLog(@"save path is :%@",outputFielPath); NSURL *fileUrl=[NSURL fileURLWithPath:outputFielPath]; NSLog(@"fileUrl:%@",fileUrl); [self.captureMovieFileOutput startRecordingToOutputFileURL:fileUrl recordingDelegate:self]; } else { [self.captureMovieFileOutput stopRecording]; } }}- touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event { if ([[touches anyObject] view] == self.imgRecord) { Plog; if (!self.isVideo) { [self performSelector:@selector(endRecord) withObject:nil afterDelay:0.3]; } else { [self endRecord]; } }}- endRecord { [self.captureMovieFileOutput stopRecording];//停止录制}- onAfreshAction:(UIButton *)sender { Plog; [self recoverLayout];}- onEnsureAction:(UIButton *)sender { Plog(@"确定 这里进行保存或者发送出去"); if (self.saveVideoUrl) { WS [Utility showProgressDialogText:@"视频处理中..."]; ALAssetsLibrary *assetsLibrary=[[ALAssetsLibrary alloc]init]; [assetsLibrary writeVideoAtPathToSavedPhotosAlbum:self.saveVideoUrl completionBlock:^(NSURL *assetURL, NSError *error) { Plog(@"outputUrl:%@",weakSelf.saveVideoUrl); [[NSFileManager defaultManager] removeItemAtURL:weakSelf.saveVideoUrl error:nil]; if (weakSelf.lastBackgroundTaskIdentifier!= UIBackgroundTaskInvalid) { [[UIApplication sharedApplication] endBackgroundTask:weakSelf.lastBackgroundTaskIdentifier]; } if  { Plog(@"保存视频到相簿过程中发生错误,错误信息:%@",error.localizedDescription); [Utility showAllTextDialog:KAppDelegate.window Text:@"保存视频到相册发生错误"]; } else { if (weakSelf.takeBlock) { weakSelf.takeBlock; } Plog(@"成功保存视频到相簿."); [weakSelf onCancelAction:nil]; } }]; } else { //照片 UIImageWriteToSavedPhotosAlbum(self.takeImage, self, nil, nil); if (self.takeBlock) { self.takeBlock(self.takeImage); } [self onCancelAction:nil]; }}//前后摄像头的切换- onCameraAction:(UIButton *)sender { Plog; AVCaptureDevice *currentDevice=[self.captureDeviceInput device]; AVCaptureDevicePosition currentPosition=[currentDevice position]; [self removeNotificationFromCaptureDevice:currentDevice]; AVCaptureDevice *toChangeDevice; AVCaptureDevicePosition toChangePosition = AVCaptureDevicePositionFront;//前 if (currentPosition == AVCaptureDevicePositionUnspecified || currentPosition == AVCaptureDevicePositionFront) { toChangePosition = AVCaptureDevicePositionBack;//后 } toChangeDevice=[self getCameraDeviceWithPosition:toChangePosition]; [self addNotificationToCaptureDevice:toChangeDevice]; //获得要调整的设备输入对象 AVCaptureDeviceInput *toChangeDeviceInput=[[AVCaptureDeviceInput alloc]initWithDevice:toChangeDevice error:nil]; //改变会话的配置前一定要先开启配置,配置完成后提交配置改变 [self.session beginConfiguration]; //移除原有输入对象 [self.session removeInput:self.captureDeviceInput]; //添加新的输入对象 if ([self.session canAddInput:toChangeDeviceInput]) { [self.session addInput:toChangeDeviceInput]; self.captureDeviceInput = toChangeDeviceInput; } //提交会话配置 [self.session commitConfiguration];}- onStartTranscribe:fileURL { if ([self.captureMovieFileOutput isRecording]) { -- self.seconds; if (self.seconds > 0) { if (self.HSeconds - self.seconds >= TimeMax && !self.isVideo) { self.isVideo = YES;//长按时间超过TimeMax 表示是视频录制 self.progressView.timeMax = self.seconds; } [self performSelector:@selector(onStartTranscribe:) withObject:fileURL afterDelay:1.0]; } else { if ([self.captureMovieFileOutput isRecording]) { [self.captureMovieFileOutput stopRecording]; } } }}#pragma mark - 
视频输出代理-captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:fileURL fromConnections:(NSArray *)connections{ Plog(@"开始录制..."); self.seconds = self.HSeconds; [self performSelector:@selector(onStartTranscribe:) withObject:fileURL afterDelay:1.0];}-captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error{ Plog(@"视频录制完成."); [self changeLayout]; if (self.isVideo) { self.saveVideoUrl = outputFileURL; if (!self.player) { self.player = [[HAVPlayer alloc] initWithFrame:self.bgView.bounds withShowInView:self.bgView url:outputFileURL]; } else { if (outputFileURL) { self.player.videoUrl = outputFileURL; self.player.hidden = NO; } } } else { //照片 self.saveVideoUrl = nil; [self videoHandlePhoto:outputFileURL]; } }- videoHandlePhoto:url { AVURLAsset *urlSet = [AVURLAsset assetWithURL:url]; AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:urlSet]; imageGenerator.appliesPreferredTrackTransform = YES; // 截图的时候调整到正确的方向 NSError *error = nil; CMTime time = CMTimeMake;//缩略图创建时间 CMTime是表示电影时间信息的结构体,第一个参数表示是视频第几秒,第二个参数表示每秒帧数.(如果要获取某一秒的第几帧可以使用CMTimeMake方法) CMTime actucalTime; //缩略图实际生成的时间 CGImageRef cgImage = [imageGenerator copyCGImageAtTime:time actualTime:&actucalTime error:&error]; if  { Plog(@"截取视频图片失败:%@",error.localizedDescription); } CMTimeShow(actucalTime); UIImage *image = [UIImage imageWithCGImage:cgImage]; CGImageRelease; if  { Plog(@"视频截取成功"); } else { Plog(@"视频截取失败"); } self.takeImage = image;//[UIImage imageWithCGImage:cgImage]; [[NSFileManager defaultManager] removeItemAtURL:url error:nil]; if (!self.takeImageView) { self.takeImageView = [[UIImageView alloc] initWithFrame:self.view.frame]; [self.bgView addSubview:self.takeImageView]; } self.takeImageView.hidden = NO; self.takeImageView.image = self.takeImage;}#pragma mark - 通知//注册通知- setupObservers{ NSNotificationCenter *notification = [NSNotificationCenter defaultCenter]; [notification addObserver:self selector:@selector(applicationDidEnterBackground:) name:UIApplicationWillResignActiveNotification object:[UIApplication sharedApplication]];}//进入后台就退出视频录制- applicationDidEnterBackground:(NSNotification *)notification { [self onCancelAction:nil];}/** * 给输入设备添加通知 */-addNotificationToCaptureDevice:(AVCaptureDevice *)captureDevice{ //注意添加区域改变捕获通知必须首先设置设备允许捕获 [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) { captureDevice.subjectAreaChangeMonitoringEnabled=YES; }]; NSNotificationCenter *notificationCenter= [NSNotificationCenter defaultCenter]; //捕获区域发生改变 [notificationCenter addObserver:self selector:@selector(areaChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:captureDevice];}-removeNotificationFromCaptureDevice:(AVCaptureDevice *)captureDevice{ NSNotificationCenter *notificationCenter= [NSNotificationCenter defaultCenter]; [notificationCenter removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:captureDevice];}/** * 移除所有通知 */-removeNotification{ NSNotificationCenter *notificationCenter= [NSNotificationCenter defaultCenter]; [notificationCenter removeObserver:self];}-addNotificationToCaptureSession:(AVCaptureSession *)captureSession{ NSNotificationCenter *notificationCenter= [NSNotificationCenter defaultCenter]; //会话出错 [notificationCenter addObserver:self selector:@selector(sessionRuntimeError:) name:AVCaptureSessionRuntimeErrorNotification object:captureSession];}/** * 设备连接成功 * * @param notification 通知对象 
*/-deviceConnected:(NSNotification *)notification{ NSLog(@"设备已连接...");}/** * 设备连接断开 * * @param notification 通知对象 */-deviceDisconnected:(NSNotification *)notification{ NSLog(@"设备已断开.");}/** * 捕获区域改变 * * @param notification 通知对象 */-areaChange:(NSNotification *)notification{ NSLog(@"捕获区域改变...");}/** * 会话出错 * * @param notification 通知对象 */-sessionRuntimeError:(NSNotification *)notification{ NSLog(@"会话发生错误.");}/** * 取得指定位置的摄像头 * * @param position 摄像头位置 * * @return 摄像头设备 */-(AVCaptureDevice *)getCameraDeviceWithPosition:(AVCaptureDevicePosition )position{ NSArray *cameras= [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]; for (AVCaptureDevice *camera in cameras) { if ([camera position] == position) { return camera; } } return nil;}/** * 改变设备属性的统一操作方法 * * @param propertyChange 属性改变操作 */-changeDeviceProperty:(PropertyChangeBlock)propertyChange{ AVCaptureDevice *captureDevice= [self.captureDeviceInput device]; NSError *error; //注意改变设备属性前一定要首先调用lockForConfiguration:调用完之后使用unlockForConfiguration方法解锁 if ([captureDevice lockForConfiguration:&error]) { //自动白平衡 if ([captureDevice isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance]) { [captureDevice setWhiteBalanceMode:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance]; } //自动根据环境条件开启闪光灯 if ([captureDevice isFlashModeSupported:AVCaptureFlashModeAuto]) { [captureDevice setFlashMode:AVCaptureFlashModeAuto]; } propertyChange(captureDevice); [captureDevice unlockForConfiguration]; }else{ NSLog(@"设置设备属性过程发生错误,错误信息:%@",error.localizedDescription); }}/** * 设置闪光灯模式 * * @param flashMode 闪光灯模式 */-setFlashMode:(AVCaptureFlashMode )flashMode{ [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) { if ([captureDevice isFlashModeSupported:flashMode]) { [captureDevice setFlashMode:flashMode]; } }];}/** * 设置聚焦模式 * * @param focusMode 聚焦模式 */-setFocusMode:(AVCaptureFocusMode )focusMode{ [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) { if ([captureDevice isFocusModeSupported:focusMode]) { [captureDevice setFocusMode:focusMode]; } }];}/** * 设置曝光模式 * * @param exposureMode 曝光模式 */-setExposureMode:(AVCaptureExposureMode)exposureMode{ [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) { if ([captureDevice isExposureModeSupported:exposureMode]) { [captureDevice setExposureMode:exposureMode]; } }];}/** * 设置聚焦点 * * @param point 聚焦点 */-focusWithMode:(AVCaptureFocusMode)focusMode exposureMode:(AVCaptureExposureMode)exposureMode atPoint:point{ [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {// if ([captureDevice isFocusPointOfInterestSupported]) {// [captureDevice setFocusPointOfInterest:point];// }// if ([captureDevice isExposurePointOfInterestSupported]) {// [captureDevice setExposurePointOfInterest:point];// } if ([captureDevice isExposureModeSupported:exposureMode]) { [captureDevice setExposureMode:exposureMode]; } if ([captureDevice isFocusModeSupported:focusMode]) { [captureDevice setFocusMode:focusMode]; } }];}/** * 添加点按手势,点按时聚焦 */-addGenstureRecognizer{ UITapGestureRecognizer *tapGesture=[[UITapGestureRecognizer alloc]initWithTarget:self action:@selector(tapScreen:)]; [self.bgView addGestureRecognizer:tapGesture];}-tapScreen:(UITapGestureRecognizer *)tapGesture{ if ([self.session isRunning]) { CGPoint point= [tapGesture locationInView:self.bgView]; //将UI坐标转化为摄像头坐标 CGPoint cameraPoint= [self.previewLayer captureDevicePointOfInterestForPoint:point]; [self setFocusCursorWithPoint:point]; [self focusWithMode:AVCaptureFocusModeContinuousAutoFocus 
exposureMode:AVCaptureExposureModeContinuousAutoExposure atPoint:cameraPoint]; }}/** * 设置聚焦光标位置 * * @param point 光标位置 */-setFocusCursorWithPoint:point{ if (!self.isFocus) { self.isFocus = YES; self.focusCursor.center=point; self.focusCursor.transform = CGAffineTransformMakeScale(1.25, 1.25); self.focusCursor.alpha = 1.0; [UIView animateWithDuration:0.5 animations:^{ self.focusCursor.transform = CGAffineTransformIdentity; } completion:^(BOOL finished) { [self performSelector:@selector(onHiddenFocusCurSorAction) withObject:nil afterDelay:0.5]; }]; }}- onHiddenFocusCurSorAction { self.focusCursor.alpha=0; self.isFocus = NO;}//拍摄完成时调用- changeLayout { self.imgRecord.hidden = YES; self.btnCamera.hidden = YES; self.btnAfresh.hidden = NO; self.btnEnsure.hidden = NO; self.btnBack.hidden = YES; if (self.isVideo) { [self.progressView clearProgress]; } self.afreshCenterX.constant = -(SCREEN_WIDTH/2/2); self.ensureCenterX.constant = SCREEN_WIDTH/2/2; [UIView animateWithDuration:0.25 animations:^{ [self.view layoutIfNeeded]; }]; self.lastBackgroundTaskIdentifier = self.backgroundTaskIdentifier; self.backgroundTaskIdentifier = UIBackgroundTaskInvalid; [self.session stopRunning];}//重新拍摄时调用- recoverLayout { if (self.isVideo) { self.isVideo = NO; [self.player stopPlayer]; self.player.hidden = YES; } [self.session startRunning]; if (!self.takeImageView.hidden) { self.takeImageView.hidden = YES; }// self.saveVideoUrl = nil; self.afreshCenterX.constant = 0; self.ensureCenterX.constant = 0; self.imgRecord.hidden = NO; self.btnCamera.hidden = NO; self.btnAfresh.hidden = YES; self.btnEnsure.hidden = YES; self.btnBack.hidden = NO; [UIView animateWithDuration:0.25 animations:^{ [self.view layoutIfNeeded]; }];}/*#pragma mark - Navigation// In a storyboard-based application, you will often want to do a little preparation before navigation- prepareForSegue:(UIStoryboardSegue *)segue sender:sender { // Get the new view controller using [segue destinationViewController]. // Pass the selected object to the new view controller.}*/@end

采纳也挺轻易:

```
- (void)onCameraAction:(UIButton *)sender {
    // Er... it uses a xib because it was written as a demo; change it to suit your needs.
    // This demo only provides an idea — don't drag it into a project as-is.
    HVideoViewController *ctrl = [[NSBundle mainBundle] loadNibNamed:@"HVideoViewController" owner:nil options:nil].lastObject;
    ctrl.HSeconds = 30; // maximum recording duration
    ctrl.takeBlock = ^(id item) { // block parameter stripped in the original; an id that is either an NSURL or a UIImage
        if ([item isKindOfClass:[NSURL class]]) {
            NSURL *videoURL = item;
            // video url
        } else {
            // photo
        }
    };
    [self presentViewController:ctrl animated:YES completion:nil];
}
```

The demo link is given below as well — if you don't like it, just move on -_-

That's it for this part. It's fairly simple; I hope it helps — thanks!

The comments already explain everything, so I won't annotate the code line by line. The key pieces are the basic use of AVMutableComposition and AVMutableCompositionTrack; a quick search will turn up documentation on the relevant properties, and the rest is no different from ordinary camera handling. The other topic is video trimming: for videos uploaded to a server, one that is too long or too large needs compressing and trimming. I won't go into the time-selection UI here — see the UI handling in the demo. Below is the trimming code:

Here's a GIF of the result, too:

(Figure: KJCamera.gif)

```
- (void)onCompleteButtonAction:(UIButton *)sender {
    // Start trimming
    [self.kj_player pause];
    // Start time
    CMTime startTime = CMTimeMakeWithSeconds((self.collectionView.contentOffset.x + self.btnStart.frame.origin.x) * self.pixel_time, self.kj_player.currentItem.duration.timescale);
    // Length
    CGFloat length = (self.btnEnd.frame.origin.x + self.btnEnd.frame.size.width - self.btnStart.frame.origin.x) * self.pixel_time;
    CMTime time_total = [self.kj_player.currentItem duration];
    if (length == 1.0 * time_total.value / time_total.timescale) {
        if (self.kj_videoCapturedelegate && [self.kj_videoCapturedelegate respondsToSelector:@selector(kj_didCaptureCompleteForPath:)]) {
            AVURLAsset *urlAsset = (AVURLAsset *)self.kj_player.currentItem.asset;
            [self.kj_videoCapturedelegate kj_didCaptureCompleteForPath:urlAsset.URL.path];
        } else {
            [self onCancelButtonAction:nil];
        }
        return;
    }
    [KJUtility showProgressDialogText:@"开始处理"];
    if (length > self.kj_maxTime) {
        length = self.kj_maxTime;
    }
    CMTime videoLenth = CMTimeMakeWithSeconds(length, self.kj_player.currentItem.duration.timescale);
    CMTimeRange videoTimeRange = CMTimeRangeMake(startTime, videoLenth);
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:self.kj_player.currentItem.asset presetName:AVAssetExportPresetMediumQuality];
    exportSession.timeRange = videoTimeRange;
    NSString *path = [KJUtility kj_getKJAlbumFilePath];
    NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
    [formatter setDateFormat:@"yyyyMMddHHmmss"];
    NSString *fileName = [NSString stringWithFormat:@"%@-%@", [formatter stringFromDate:[NSDate date]], @"kj_video.mp4"];
    path = [path stringByAppendingPathComponent:fileName];
    exportSession.outputURL = [NSURL fileURLWithPath:path];
    exportSession.outputFileType = AVFileTypeMPEG4;
    __block BOOL completeOK = NO;
    WS
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        switch (exportSession.status) {
            case AVAssetExportSessionStatusUnknown: break;
            case AVAssetExportSessionStatusWaiting: break;
            case AVAssetExportSessionStatusExporting: break;
            case AVAssetExportSessionStatusCompleted: completeOK = YES; break;
            case AVAssetExportSessionStatusFailed: break;
            case AVAssetExportSessionStatusCancelled: break;
        };
        if (completeOK) {
            dispatch_async(dispatch_get_main_queue(), ^{
                [KJUtility showAllTextDialog:weakSelf.view Text:@"视频截取成功"];
                if (weakSelf.kj_videoCapturedelegate && [weakSelf.kj_videoCapturedelegate respondsToSelector:@selector(kj_didCaptureCompleteForPath:)]) {
                    [KJUtility hideProgressDialog];
                    [weakSelf.kj_videoCapturedelegate kj_didCaptureCompleteForPath:path];
                } else {
                    // Save to the photo library
                    [KJUtility kj_saveVideoToLibraryForPath:path completeHandler:^(NSString *localIdentifier, BOOL isSuccess) {
                        if (isSuccess) {
                            NSFileManager *fileManger = [[NSFileManager alloc] init];
                            [fileManger removeItemAtPath:path error:nil];
                        }
                        dispatch_async(dispatch_get_main_queue(), ^{
                            [KJUtility hideProgressDialog];
                            [weakSelf onCancelButtonAction:nil];
                        });
                    }];
                }
            });
        } else {
            [KJUtility showAllTextDialog:weakSelf.view Text:@"视频截取失败"];
        }
        dispatch_async(dispatch_get_main_queue(), ^{
            [KJUtility hideProgressDialog];
        });
    }];
}
```

Next, applying filters to a local video with GPUImage. If you want to see the filter effect in real time after picking one, use GPUImageMovie for the preview; compositing the filter also requires GPUImageMovieWriter. GPUImageMovie previews the video without sound, so you also need a player to play the video — see how the demo handles this. Below is the code that composites a filter onto a local video:

```
// Composite a filter onto the video
- (void)filterCompositionForFilter:(GPUImageOutput<GPUImageInput> *)filter withVideoUrl:(NSURL *)videoUrl {
    if (filter && videoUrl) { // condition stripped in the original; guarding both arguments is assumed
        WS
        GPUImageOutput<GPUImageInput> *tmpFilter = filter;
        kj_movieComposition = [[GPUImageMovie alloc] initWithURL:videoUrl];
        kj_movieComposition.runBenchmark = YES;
        kj_movieComposition.playAtActualSpeed = NO;
        [kj_movieComposition addTarget:tmpFilter];
        // Output path for the composited video
        NSString *newPath = [KJUtility kj_getKJAlbumFilePath];
        newPath = [newPath stringByAppendingPathComponent:[KJUtility kj_getNewFileName]];
        unlink([newPath UTF8String]);
        NSLog(@"%f,%f", self.kj_player.currentItem.presentationSize.height, self.kj_player.currentItem.presentationSize.width);
        CGSize videoSize = self.kj_player.currentItem.presentationSize;
        NSURL *tmpUrl = [NSURL fileURLWithPath:newPath];
        [self.kj_newVideoPathArray addObject:tmpUrl];
        kj_movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:tmpUrl size:videoSize];
        kj_movieWriter.shouldPassthroughAudio = YES;
        // Stock GPUImage doesn't have this property; it exists only in my modified copy
        kj_movieWriter.allowWriteAudio = YES;
        kj_movieComposition.audioEncodingTarget = kj_movieWriter;
        [tmpFilter addTarget:kj_movieWriter];
        [kj_movieComposition enableSynchronizedEncodingUsingMovieWriter:kj_movieWriter];
        [kj_movieWriter startRecording];
        [kj_movieComposition startProcessing];
        __weak GPUImageMovieWriter *weakmovieWriter = kj_movieWriter;
        [kj_movieWriter setCompletionBlock:^{
            NSLog(@"滤镜添加成功");
            [tmpFilter removeTarget:weakmovieWriter];
            [weakmovieWriter finishRecording];
            dispatch_async(dispatch_get_main_queue(), ^{
                if (weakSelf.kj_selectedMusic) {
                    // Composite the music
                    [weakSelf musicCompositionForMusicInfo:weakSelf.kj_selectedMusic withVideoPath:weakSelf.kj_newVideoPathArray.lastObject];
                } else {
                    [weakSelf saveVideoToLib];
                }
            });
        }];
        [kj_movieWriter setFailureBlock:^(NSError *error) {
            dispatch_async(dispatch_get_main_queue(), ^{
                NSLog(@"滤镜添加失败:%@", error);
                if ([[NSFileManager defaultManager] fileExistsAtPath:newPath]) {
                    NSError *delError = nil;
                    [[NSFileManager defaultManager] removeItemAtPath:newPath error:&delError];
                    if (delError) {
                        NSLog(@"删除沙盒路径失败:%@", delError);
                    }
                }
                [weakSelf.kj_newVideoPathArray removeLastObject];
                [KJUtility hideProgressDialog];
            });
        }];
    }
}
```

Now let's talk about adding music to a local video. The comments here explain it clearly, so straight to the code (it uses the same classes mentioned earlier; the principle is to extract the video and the audio separately and merge them in a composition):

```
// Composite music onto the video
- (void)musicCompositionForMusicInfo:(NSDictionary *)musicInfo withVideoPath:(NSURL *)videoUrl {
    if (musicInfo && videoUrl) {
        // The music file
        NSString *audioPath = [[NSBundle mainBundle] pathForResource:musicInfo[@"music"] ofType:@"mp3"];
        NSURL *audioUrl = [NSURL fileURLWithPath:audioPath];
        // Output path for the composited video
        NSString *newPath = [KJUtility kj_getKJAlbumFilePath];
        newPath = [newPath stringByAppendingPathComponent:[KJUtility kj_getNewFileName]];
        unlink([newPath UTF8String]);
        NSURL *newVideoPath = [NSURL fileURLWithPath:newPath];
        // Composition
        AVMutableComposition *kj_composition = [AVMutableComposition composition];
        // Audio track
        AVMutableCompositionTrack *kj_audioTrack = [kj_composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        // Video track
        AVMutableCompositionTrack *kj_videoTrack = [kj_composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        NSDictionary *kj_options = @{AVURLAssetPreferPreciseDurationAndTimingKey: @YES};
        // Video AVAsset
        AVURLAsset *kj_videoAsset = [[AVURLAsset alloc] initWithURL:videoUrl options:kj_options];
        // Video time range (the composited music must not exceed this range)
        CMTimeRange kj_videoTimeRange = CMTimeRangeMake(kCMTimeZero, kj_videoAsset.duration);
        // Extract the video from kj_videoAsset
        NSArray *videoArray = [kj_videoAsset tracksWithMediaType:AVMediaTypeVideo];
        AVAssetTrack *kj_assetVideo = videoArray.firstObject;
        // Insert the extracted video into the video track kj_videoTrack
        NSError *kj_videoError = nil;
        BOOL isComplete_video = [kj_videoTrack insertTimeRange:kj_videoTimeRange ofTrack:kj_assetVideo atTime:kCMTimeZero error:&kj_videoError];
        NSLog(@"加入视频isComplete_video:%d error:%@", isComplete_video, kj_videoError);
        // Audio AVAsset
        AVURLAsset *kj_audioAsset = [[AVURLAsset alloc] initWithURL:audioUrl options:kj_options];
        // Extract the audio from kj_audioAsset
        NSArray *audioArray = [kj_audioAsset tracksWithMediaType:AVMediaTypeAudio];
        AVAssetTrack *kj_assetAudio = audioArray.firstObject;
        // Audio time range
        CMTimeRange kj_audioTimeRange = CMTimeRangeMake(kCMTimeZero, kj_audioAsset.duration);
        if (CMTimeCompare(kj_audioAsset.duration, kj_videoAsset.duration)) {
            // When the video is shorter than the audio, clamp to the video duration
            kj_audioTimeRange = CMTimeRangeMake(kCMTimeZero, kj_videoAsset.duration);
        }
        // Insert the extracted audio into the audio track kj_audioTrack
        NSError *kj_audioError = nil;
        BOOL isComplete_audio = [kj_audioTrack insertTimeRange:kj_audioTimeRange ofTrack:kj_assetAudio atTime:kCMTimeZero error:&kj_audioError];
        NSLog(@"加入音频isComplete_audio:%d error:%@", isComplete_audio, kj_audioError);
        // High quality because it will be saved to the photo library; adjust to your needs
        WS
        [KJUtility kj_compressedVideoAsset:kj_composition withPresetName:AVAssetExportPresetHighestQuality withNewSavePath:newVideoPath withCompleteBlock:^(NSError *error) {
            dispatch_async(dispatch_get_main_queue(), ^{
                if (error) {
                    NSLog(@"转码失败:%@", error);
                    [KJUtility hideProgressDialog];
                } else {
                    [weakSelf.kj_newVideoPathArray addObject:newVideoPath];
                    [weakSelf saveVideoToLib];
                }
            });
        }];
    }
}
```

Video compression needs little explanation — there's plenty online, and iOS provides it out of the box in a few very simple lines:

```
/**
 Video transcoding/compression

 @param asset         AVAsset
 @param presetName    video quality (AVAssetExportPresetMediumQuality is suggested for compression,
                      AVAssetExportPreset1920x1080 for saving to the photo library; set it to your needs)
 @param savePath      output path
 @param completeBlock completion callback
 */
+ (void)kj_compressedVideoAsset:(AVAsset *)asset withPresetName:(NSString *)presetName withNewSavePath:(NSURL *)savePath withCompleteBlock:(void (^)(NSError *error))completeBlock {
    AVAssetExportSession *kj_export = [AVAssetExportSession exportSessionWithAsset:asset presetName:presetName];
    kj_export.outputURL = savePath;
    kj_export.outputFileType = AVFileTypeMPEG4;
    kj_export.shouldOptimizeForNetworkUse = YES;
    [kj_export exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            if (kj_export.status == AVAssetExportSessionStatusCompleted) {
                if (completeBlock) {
                    completeBlock(nil); // argument stripped in the original; a nil error on success is assumed
                }
            } else if (kj_export.status == AVAssetExportSessionStatusFailed) {
                if (completeBlock) {
                    completeBlock(kj_export.error);
                }
            } else {
                NSLog(@"当前压缩进度:%f", kj_export.progress);
            }
        });
    }];
}
```

Everything above is about the short-video pipeline. The demo actually includes photo-album handling too, but without live updates, so if you need the display to refresh in real time, add your own change listener and update the UI. My UI certainly won't suit everyone, so this only demonstrates an approach — it's not a drop-in tool. The demo supports iOS 8+. The whole demo is on GitHub: Kegen. It's quite rough, but everything is commented — don't use it directly. GPUImage is very powerful and much of it goes untouched here; it's very practical and worth studying if you're interested. Note that the demo saves two copies of the video to the album, one compressed and one not; that was for my own testing, and if you don't need it you can drop the save call when invoking. My calling code:

```
- (void)onVideoButtonAction:(UIButton *)sender {
    KJVideoAlbumController *ctrl = [[KJVideoAlbumController alloc] init];
    ctrl.kj_minTime = 2.0;
    ctrl.kj_maxTime = 15.0f;
    WS
    ctrl.kj_complete = ^(NSURL *outPath) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [weakSelf editViewPath:outPath];
        });
    };
    UINavigationController *navc = [[UINavigationController alloc] initWithRootViewController:ctrl];
    navc.modalTransitionStyle = UIModalTransitionStyleCrossDissolve;
    [self presentViewController:navc animated:YES completion:nil];
}

- (void)editVideo:(NSString *)localIdentifier {
    WS
    [KJUtility kj_getAssetForLocalIdentifier:localIdentifier completionHandler:^(PHAsset *kj_object) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [KJUtility kj_requestVideoForAsset:kj_object completion:^(AVURLAsset *asset) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    KJEditVideoViewController *ctrl = [[KJEditVideoViewController alloc] init];
                    ctrl.kj_localVideo = asset;
                    ctrl.modalTransitionStyle = UIModalTransitionStyleCrossDissolve;
                    UINavigationController *navc = [[UINavigationController alloc] initWithRootViewController:ctrl];
                    [weakSelf presentViewController:navc animated:YES completion:nil];
                });
            }];
        });
    }];
}

- (void)editViewPath:(NSURL *)path {
    KJEditVideoViewController *ctrl = [[KJEditVideoViewController alloc] init];
    ctrl.kj_localVideo = path;
    ctrl.kj_isSelectCover = YES;
    WS
    /**
     * videoPath        the compressed video
     * localIdentifier  the high-quality video saved to the album (nil if no filter or music was added)
     * kj_cover         the cover image
     */
    ctrl.editCompleteBlock = ^(NSURL *videoPath, NSString *localIdentifier, UIImage *kj_cover) {
        [weakSelf saveVideoToLibVideoUrl:videoPath];
    };
    ctrl.modalTransitionStyle = UIModalTransitionStyleCrossDissolve;
    UINavigationController *navc = [[UINavigationController alloc] initWithRootViewController:ctrl];
    [self presentViewController:navc animated:YES completion:nil];
}

// Save to the photo library
- (void)saveVideoToLibVideoUrl:(NSURL *)url {
    [KJUtility kj_saveVideoToLibraryForPath:url.path completeHandler:^(NSString *localIdentifier, BOOL isSuccess) {
        if (isSuccess) {
            NSLog(@"保存到相册成功");
        } else {
            NSLog(@"保存到相册失败");
        }
    }];
}
```

You may also notice that a lot of this code calls into KJUtility; posting its header makes everything clear:

```
//
//  KJUtility.h
//  KJAlbumDemo
//
//  Created by JOIN iOS on 2017/9/5.
//  Copyright © 2017年 Kegem. All rights reserved.
//

#import <Foundation/Foundation.h>
#import <Photos/Photos.h>
#import "UIKit+BaseExtension.h"
#import <YYWebImage.h>
#import <Masonry.h>
#import <YYAnimatedImageView.h>
#import <GPUImage/GPUImage.h>

@protocol KJCustomCameraDelegate <NSObject>
- (void)kj_didStartTakeAction;
- (void)kj_didResetTakeAction;
- (void)kj_didCompleteAction;
- (void)kj_didCancelAction;
@end

@protocol KJVideoFileDelegate <NSObject>
// Not saved to the album; returns the local path of the finished recording
- (void)kj_videoFileCompleteLocalPath:(NSString *)kj_outPath;
@end

// Status bar height
#define StatusBarHeight [[UIApplication sharedApplication] statusBarFrame].size.height
// Screen bounds
#define SCREEN_SIZE [UIScreen mainScreen].bounds
// Screen height
#define SCREEN_HEIGHT [UIScreen mainScreen].bounds.size.height
// Screen width
#define SCREEN_WIDTH [UIScreen mainScreen].bounds.size.width
// Weak self
#define WS __weak typeof(self) weakSelf = self;
// Color
#define sYellowColor 0xffd700

@interface KJUtility : NSObject

+ (void)showAllTextDialog:(UIView *)view Text:(NSString *)text;
+ (void)showProgressDialogText:(NSString *)text;
+ (void)hideProgressDialog;

// Convert seconds into "xx min xx s"
+ (NSString *)getMMSSFromSS:(NSInteger)seconds;

/** Directory where captured photos/videos are stored. @return file path */
+ (NSString *)kj_getKJAlbumFilePath;

/** @return a new video file name */
+ (NSString *)kj_getNewFileName;

/**
 Get an image for a PHAsset
 @param asset         PHAsset
 @param isSynchronous synchronous - YES, asynchronous - NO
 @param completion    returns the image
 */
+ (void)kj_requestImageForAsset:(PHAsset *)asset withSynchronous:(BOOL)isSynchronous completion:(void (^)(UIImage *image))completion;

/**
 Get a video for a PHAsset
 @param kj_asset   PHAsset
 @param completion AVURLAsset
 */
+ (void)kj_requestVideoForAsset:(PHAsset *)kj_asset completion:(void (^)(AVURLAsset *asset))completion;

/**
 Get a thumbnail of a video
 @param urlAsset  the local video asset
 @param start     start time
 @param timescale scale
 @return the video snapshot
 */
+ (UIImage *)kj_getScreenShotImageFromVideoPath:(AVURLAsset *)urlAsset withStart:(CGFloat)start withTimescale:(CGFloat)timescale;

/**
 Save an image to the system photo library
 @param image             the image
 @param completionHandler the result
 */
+ (void)kj_savePhotoToLibraryForImage:(UIImage *)image completeHandler:(void (^)(NSString *localIdentifier, BOOL isSuccess))completionHandler;

/**
 Save a video to the system photo library
 @param path              the video path
 @param completionHandler the result
 */
+ (void)kj_saveVideoToLibraryForPath:(NSString *)path completeHandler:(void (^)(NSString *localIdentifier, BOOL isSuccess))completionHandler;

/**
 Get a PHAsset for an album localIdentifier
 @param localIdentifier   the album id
 @param completionHandler returns the PHAsset
 */
+ (void)kj_getAssetForLocalIdentifier:(NSString *)localIdentifier completionHandler:(void (^)(PHAsset *kj_object))completionHandler;

/**
 Video transcoding/compression
 @param asset         AVAsset
 @param presetName    video quality (AVAssetExportPresetMediumQuality is suggested for compressed uploads; set it to your needs)
 @param savePath      output path
 @param completeBlock completion callback
 */
+ (void)kj_compressedVideoAsset:(AVAsset *)asset withPresetName:(NSString *)presetName withNewSavePath:(NSURL *)savePath withCompleteBlock:(void (^)(NSError *error))completeBlock;

/**
 Photo library authorization
 @param ctrl          the current view controller
 @param completeBlock whether access is allowed
 */
+ (void)kj_photoLibraryAuthorizationStatus:(UIViewController *)ctrl completeBlock:(void (^)(BOOL allowAccess))completeBlock;

/**
 Camera authorization
 */
+ (void)kj_cameraAuthorizationStatus:(UIViewController *)ctrl completeBlock:(void (^)(BOOL allowAccess))completeBlock;

/**
 Microphone authorization
 */
+ (void)kj_requestRecordPermission:(UIViewController *)ctrl completeBlock:(void (^)(BOOL allowAccess))completeBlock;

/**
 Authorization alert (jumps to Settings - app permissions)
 @param ctrl  the current view controller
 @param title the tip message
 */
+ (void)kj_authorizationAlert:(UIViewController *)ctrl tipMessage:(NSString *)title;

/**
 Apply a filter to an image
 @param image      the source image
 @param filterName the filter name
 @return the image with the filter applied
 */
+ (UIImage *)kj_imageProcessedUsingGPUImage:(UIImage *)image withFilterName:(NSString *)filterName;

@end
```

The intent here is to delete the kjalbum folder from the sandbox once the short-video feature is done being used. Enough chatter — just download the demo and see for yourself. Below is a GIF of the result; it was too large, so it was compressed and may look a bit blurry:

(Figure: KJlbum.gif)

