

Implementing WeChat Moments-style video trimming on iOS

2019-10-21 18:40:18
Source: repost, contributed by a reader

Preface

WeChat is everywhere these days, and its features keep getting more capable. You may have noticed the video-trimming step when posting a video to Moments, or the similar editing step when Apple's own apps handle a recorded video (my guess is that WeChat's version imitates Apple's). It is a genuinely handy feature, and since I have been studying audio and video lately, I decided to implement it myself.

The feature looks simple, but the implementation had its share of pitfalls. This write-up records them, and walking through the process once more should also make the code easier to follow.

Result

First, here is the effect I achieved:

(Demo GIF: trimming a video with the Moments-style editor)

Implementation

Analysis of the implementation

The whole feature breaks down into three parts:

  • Video playback

For this part, a self-contained video player wrapper is enough.

  • The scrubbing view at the bottom

This is the most involved part, and it is itself split into four pieces: the gray overlays, the left and right handle sliders, the top and bottom lines between the handles, and the image-strip view.

  • Controller-level view assembly and feature logic

Encapsulating the video player

Playback is implemented with three classes: AVPlayer, AVPlayerLayer, and AVPlayerItem. Since all of the player's events arrive via KVO, block callbacks are exposed so callers can observe them from outside.

#import "FOFMoviePlayer.h"

@interface FOFMoviePlayer () {
    AVPlayerLooper *_playerLooper;
    AVPlayerItem *_playItem;
    BOOL _loop;
}
@property (nonatomic, strong) NSURL *url;
@property (nonatomic, strong) AVPlayer *player;
@property (nonatomic, strong) AVPlayerLayer *playerLayer;
@property (nonatomic, strong) AVPlayerItem *playItem;
@property (nonatomic, assign) CMTime duration;
@end

@implementation FOFMoviePlayer

- (instancetype)initWithFrame:(CGRect)frame url:(NSURL *)url superLayer:(CALayer *)superLayer {
    self = [super init];
    if (self) {
        [self initplayers:superLayer];
        _playerLayer.frame = frame;
        self.url = url;
    }
    return self;
}

- (instancetype)initWithFrame:(CGRect)frame url:(NSURL *)url superLayer:(CALayer *)superLayer loop:(BOOL)loop {
    self = [self initWithFrame:frame url:url superLayer:superLayer];
    if (self) {
        _loop = loop;
    }
    return self;
}

- (void)initplayers:(CALayer *)superLayer {
    self.player = [[AVPlayer alloc] init];
    self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    self.playerLayer.videoGravity = AVLayerVideoGravityResize;
    [superLayer addSublayer:self.playerLayer];
}

- (void)initLoopPlayers:(CALayer *)superLayer {
    self.player = [[AVQueuePlayer alloc] init];
    self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    self.playerLayer.videoGravity = AVLayerVideoGravityResize;
    [superLayer addSublayer:self.playerLayer];
}

- (void)fof_play {
    [self.player play];
}

- (void)fof_pause {
    [self.player pause];
}

#pragma mark - Observe

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"status"]) {
        AVPlayerItem *item = (AVPlayerItem *)object;
        AVPlayerItemStatus status = [[change objectForKey:@"new"] intValue]; // the new status after the change
        if (status == AVPlayerItemStatusReadyToPlay) {
            _duration = item.duration; // only valid in this state; not right after the AVPlayerItem is created
            NSLog(@"Ready to play");
            if (self.blockStatusReadyPlay) {
                self.blockStatusReadyPlay(item);
            }
        } else if (status == AVPlayerItemStatusFailed) {
            if (self.blockStatusFailed) {
                self.blockStatusFailed();
            }
            NSLog(@"%@", item.error);
            NSLog(@"AVPlayerItemStatusFailed");
        } else {
            if (self.blockStatusUnknown) {
                self.blockStatusUnknown();
            }
            NSLog(@"%@", item.error);
            NSLog(@"AVPlayerItemStatusUnknown");
        }
    } else if ([keyPath isEqualToString:@"tracking"]) {
        NSInteger status = [change[@"new"] integerValue];
        if (self.blockTracking) {
            self.blockTracking(status);
        }
        if (status) { // dragging in progress
            [self.player pause];
        } else { // drag ended
        }
    } else if ([keyPath isEqualToString:@"loadedTimeRanges"]) {
        NSArray *array = _playItem.loadedTimeRanges;
        CMTimeRange timeRange = [array.firstObject CMTimeRangeValue]; // the currently buffered time range
        CGFloat startSeconds = CMTimeGetSeconds(timeRange.start);
        CGFloat durationSeconds = CMTimeGetSeconds(timeRange.duration);
        NSTimeInterval totalBuffer = startSeconds + durationSeconds; // total buffered length
        double progress = totalBuffer / CMTimeGetSeconds(_duration);
        if (self.blockLoadedTimeRanges) {
            self.blockLoadedTimeRanges(progress);
        }
        NSLog(@"Current buffered time: %f", totalBuffer);
    } else if ([keyPath isEqualToString:@"playbackBufferEmpty"]) {
        NSLog(@"Buffer ran dry; cannot play!");
    } else if ([keyPath isEqualToString:@"playbackLikelyToKeepUp"]) {
        if (self.blockPlaybackLikelyToKeepUp) {
            self.blockPlaybackLikelyToKeepUp([change[@"new"] boolValue]);
        }
    }
}

- (void)setUrl:(NSURL *)url {
    _url = url;
    [self.player replaceCurrentItemWithPlayerItem:self.playItem];
}

- (AVPlayerItem *)playItem {
    _playItem = [[AVPlayerItem alloc] initWithURL:_url];
    // Observe the item's status: ready to play, failed, or unknown
    [_playItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];
    // Observe buffering progress
    [_playItem addObserver:self forKeyPath:@"loadedTimeRanges" options:NSKeyValueObservingOptionNew context:nil];
    // Observe the case where the buffer runs dry and the video cannot load
    [_playItem addObserver:self forKeyPath:@"playbackBufferEmpty" options:NSKeyValueObservingOptionNew context:nil];
    // Observe whether the buffer is full enough to keep playing
    [_playItem addObserver:self forKeyPath:@"playbackLikelyToKeepUp" options:NSKeyValueObservingOptionNew context:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(private_playerMovieFinish) name:AVPlayerItemDidPlayToEndTimeNotification object:nil];
    return _playItem;
}

- (void)private_playerMovieFinish {
    NSLog(@"Playback finished");
    if (self.blockPlayToEndTime) {
        self.blockPlayToEndTime();
    }
    if (_loop) { // loop playback is provided by default
        [self.player pause];
        CMTime time = CMTimeMake(1, 1);
        __weak typeof(self) this = self;
        [self.player seekToTime:time completionHandler:^(BOOL finished) {
            [this.player play];
        }];
    }
}

- (void)dealloc {
    NSLog(@"----- deallocated -----");
}

@end

I won't go into the player in depth here; a separate article dedicated to the video player is planned.

The scrubbing view at the bottom

Gray overlays

The gray overlays are simple; a plain UIView does the job:

self.leftMaskView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 0, height)];
self.leftMaskView.backgroundColor = [UIColor grayColor];
self.leftMaskView.alpha = 0.8;
[self addSubview:self.leftMaskView];

self.rightMaskView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 0, height)];
self.rightMaskView.backgroundColor = [UIColor grayColor];
self.rightMaskView.alpha = 0.8;
[self addSubview:self.rightMaskView];

The lines above and below, between the handles

These two lines are wrapped in a small Line view of their own. A plain UIView seemed sufficient at first, but the line's movement then failed to keep pace with the handle's: the line lagged behind.

@implementation Line

- (void)setBeginPoint:(CGPoint)beginPoint {
    _beginPoint = beginPoint;
    [self setNeedsDisplay];
}

- (void)setEndPoint:(CGPoint)endPoint {
    _endPoint = endPoint;
    [self setNeedsDisplay];
}

- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(context, 3);
    CGContextSetStrokeColorWithColor(context, [UIColor colorWithWhite:0.9 alpha:1].CGColor);
    CGContextMoveToPoint(context, self.beginPoint.x, self.beginPoint.y);
    CGContextAddLineToPoint(context, self.endPoint.x, self.endPoint.y);
    CGContextStrokePath(context);
}

@end

The image-strip view

A VideoPieces view assembles the handles, lines, and overlays, and displays the thumbnails. Since there are only 10 thumbnails, it is merely a for loop adding 10 UIImageViews.

@interface VideoPieces () {
    CGPoint _beginPoint;
}
@property (nonatomic, strong) Haft *leftHaft;
@property (nonatomic, strong) Haft *rightHaft;
@property (nonatomic, strong) Line *topLine;
@property (nonatomic, strong) Line *bottomLine;
@property (nonatomic, strong) UIView *leftMaskView;
@property (nonatomic, strong) UIView *rightMaskView;
@end

@implementation VideoPieces

- (instancetype)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        [self initSubViews:frame];
    }
    return self;
}

- (void)initSubViews:(CGRect)frame {
    CGFloat height = CGRectGetHeight(frame);
    CGFloat width = CGRectGetWidth(frame);
    CGFloat minGap = 30;
    CGFloat widthHaft = 10;
    CGFloat heightLine = 3;

    _leftHaft = [[Haft alloc] initWithFrame:CGRectMake(0, 0, widthHaft, height)];
    _leftHaft.alpha = 0.8;
    _leftHaft.backgroundColor = [UIColor colorWithWhite:0.9 alpha:1];
    _leftHaft.rightEdgeInset = 20;
    _leftHaft.lefEdgeInset = 5;
    __weak typeof(self) this = self;
    [_leftHaft setBlockMove:^(CGPoint point) {
        // Don't let the left handle cross the right one
        CGFloat maxX = this.rightHaft.frame.origin.x - minGap;
        if (point.x < maxX) {
            this.topLine.beginPoint = CGPointMake(point.x, heightLine / 2.0);
            this.bottomLine.beginPoint = CGPointMake(point.x, heightLine / 2.0);
            this.leftHaft.frame = CGRectMake(point.x, 0, widthHaft, height);
            this.leftMaskView.frame = CGRectMake(0, 0, point.x, height);
            if (this.blockSeekOffLeft) {
                this.blockSeekOffLeft(point.x);
            }
        }
    }];
    [_leftHaft setBlockMoveEnd:^{
        if (this.blockMoveEnd) {
            this.blockMoveEnd();
        }
    }];

    // The right handle mirrors the left; parts of its setup were garbled in
    // the original listing and are reconstructed here by symmetry.
    _rightHaft = [[Haft alloc] initWithFrame:CGRectMake(width - widthHaft, 0, widthHaft, height)];
    _rightHaft.alpha = 0.8;
    _rightHaft.backgroundColor = [UIColor colorWithWhite:0.9 alpha:1];
    _rightHaft.lefEdgeInset = 20;
    _rightHaft.rightEdgeInset = 5;
    [_rightHaft setBlockMove:^(CGPoint point) {
        // Don't let the right handle cross the left one
        CGFloat minX = this.leftHaft.frame.origin.x + minGap;
        if (point.x >= minX) {
            this.topLine.endPoint = CGPointMake(point.x - widthHaft, heightLine / 2.0);
            this.bottomLine.endPoint = CGPointMake(point.x - widthHaft, heightLine / 2.0);
            this.rightHaft.frame = CGRectMake(point.x, 0, widthHaft, height);
            this.rightMaskView.frame = CGRectMake(point.x + widthHaft, 0, width - point.x - widthHaft, height);
            if (this.blockSeekOffRight) {
                this.blockSeekOffRight(point.x);
            }
        }
    }];
    [_rightHaft setBlockMoveEnd:^{
        if (this.blockMoveEnd) {
            this.blockMoveEnd();
        }
    }];

    _topLine = [[Line alloc] init];
    _topLine.alpha = 0.8;
    _topLine.frame = CGRectMake(widthHaft, 0, width - 2 * widthHaft, heightLine);
    _topLine.beginPoint = CGPointMake(0, heightLine / 2.0);
    _topLine.endPoint = CGPointMake(CGRectGetWidth(_topLine.bounds), heightLine / 2.0);
    _topLine.backgroundColor = [UIColor clearColor];
    [self addSubview:_topLine];

    _bottomLine = [[Line alloc] init];
    _bottomLine.alpha = 0.8;
    _bottomLine.frame = CGRectMake(widthHaft, height - heightLine, width - 2 * widthHaft, heightLine);
    _bottomLine.beginPoint = CGPointMake(0, heightLine / 2.0);
    _bottomLine.endPoint = CGPointMake(CGRectGetWidth(_bottomLine.bounds), heightLine / 2.0);
    _bottomLine.backgroundColor = [UIColor clearColor];
    [self addSubview:_bottomLine];

    [self addSubview:_leftHaft];
    [self addSubview:_rightHaft];

    self.leftMaskView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 0, height)];
    self.leftMaskView.backgroundColor = [UIColor grayColor];
    self.leftMaskView.alpha = 0.8;
    [self addSubview:self.leftMaskView];

    self.rightMaskView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 0, height)];
    self.rightMaskView.backgroundColor = [UIColor grayColor];
    self.rightMaskView.alpha = 0.8;
    [self addSubview:self.rightMaskView];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = touches.anyObject;
    _beginPoint = [touch locationInView:self];
}

Implementing the handles

The handles received one optimization: drag responsiveness. Initially they were not very sensitive; a finger would often move while the handle stayed put.

Sensitivity was improved simply by enlarging the touch-receiving area, overriding -(BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event:

@implementation Haft

- (instancetype)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        self.userInteractionEnabled = true;
    }
    return self;
}

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // Expand the hit area beyond the view's bounds by the configured edge insets
    CGRect rect = CGRectMake(self.bounds.origin.x - self.lefEdgeInset,
                             self.bounds.origin.y - self.topEdgeInset,
                             CGRectGetWidth(self.bounds) + self.lefEdgeInset + self.rightEdgeInset,
                             CGRectGetHeight(self.bounds) + self.bottomEdgeInset + self.topEdgeInset);
    return CGRectContainsPoint(rect, point);
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"Touch began");
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = touches.anyObject;
    CGPoint point = [touch locationInView:self.superview];
    CGFloat maxX = CGRectGetWidth(self.superview.bounds) - CGRectGetWidth(self.bounds);
    if (point.x > maxX) {
        point.x = maxX;
    }
    if (point.x >= 0 && point.x <= maxX && self.blockMove) {
        self.blockMove(point);
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (self.blockMoveEnd) {
        self.blockMoveEnd();
    }
}

- (void)drawRect:(CGRect)rect {
    // Draw the two short vertical grip lines on the handle
    CGFloat width = CGRectGetWidth(self.bounds);
    CGFloat height = CGRectGetHeight(self.bounds);
    CGFloat lineWidth = 1.5;
    CGFloat lineHeight = 12;
    CGFloat gap = (width - lineWidth * 2) / 3.0;
    CGFloat lineY = (height - lineHeight) / 2.0;

    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(context, lineWidth);
    CGContextSetStrokeColorWithColor(context, [[UIColor grayColor] colorWithAlphaComponent:0.8].CGColor);
    CGContextMoveToPoint(context, gap + lineWidth / 2, lineY);
    CGContextAddLineToPoint(context, gap + lineWidth / 2, lineY + lineHeight);
    CGContextStrokePath(context);

    CGContextMoveToPoint(context, gap * 2 + lineWidth + lineWidth / 2, lineY);
    CGContextAddLineToPoint(context, gap * 2 + lineWidth + lineWidth / 2, lineY + lineHeight);
    CGContextStrokePath(context);
}

@end

Controller-level view assembly and feature logic

This part of the logic is the most important and the most complex.

Getting the 10 thumbnails

- (NSArray *)getVideoThumbnail:(NSString *)path count:(NSInteger)count splitCompleteBlock:(void (^)(BOOL success, NSMutableArray *splitimgs))splitCompleteBlock {
    AVAsset *asset = [AVAsset assetWithURL:[NSURL fileURLWithPath:path]];
    NSMutableArray *arrayImages = [NSMutableArray array];
    [asset loadValuesAsynchronouslyForKeys:@[@"duration"] completionHandler:^{
        AVAssetImageGenerator *generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
        // generator.maximumSize = CGSizeMake(480, 136); // with CGSizeMake(480, 136) the returned image is {240, 136}: scaled proportionally to the actual size
        generator.appliesPreferredTrackTransform = YES; // keeps the image orientation correct, e.g. for videos recorded with the device rotated
        // Without the two tolerance settings below the capture times can be noticeably off; with them the error is tiny
        generator.requestedTimeToleranceAfter = kCMTimeZero;
        generator.requestedTimeToleranceBefore = kCMTimeZero;
        Float64 seconds = CMTimeGetSeconds(asset.duration);
        NSMutableArray *array = [NSMutableArray array];
        for (int i = 0; i < count; i++) {
            CMTime time = CMTimeMakeWithSeconds(i * (seconds / count), 1); // the timestamp to capture
            [array addObject:[NSValue valueWithCMTime:time]];
        }
        __block int i = 0;
        [generator generateCGImagesAsynchronouslyForTimes:array completionHandler:^(CMTime requestedTime, CGImageRef _Nullable imageRef, CMTime actualTime, AVAssetImageGeneratorResult result, NSError * _Nullable error) {
            i++;
            if (result == AVAssetImageGeneratorSucceeded) {
                UIImage *image = [UIImage imageWithCGImage:imageRef];
                [arrayImages addObject:image];
            } else {
                NSLog(@"Failed to get the image!");
            }
            if (i == count) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    splitCompleteBlock(YES, arrayImages);
                });
            }
        }];
    }];
    // Note: the array fills asynchronously; rely on splitCompleteBlock rather than this return value
    return arrayImages;
}

The 10 thumbnails are easy to obtain, but note one thing: dispatch the callback asynchronously onto the main queue, or the images will appear with a noticeable delay.
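The capture times themselves are just even samples across the clip's duration. A minimal sketch of that arithmetic (Python here, with hypothetical names; the article's loop uses count = 10):

```python
def thumbnail_times(duration_seconds, count=10):
    """Evenly spaced capture timestamps, matching the
    i * (seconds / count) loop in the generator code above."""
    return [i * (duration_seconds / count) for i in range(count)]

# A 20 s clip sampled 10 times starts at 0 s and steps by 2 s.
print(thumbnail_times(20.0))
```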

Listening for the handle drag events

[_videoPieces setBlockSeekOffLeft:^(CGFloat offX) {
    this.seeking = true;
    [this.moviePlayer fof_pause];
    this.lastStartSeconds = this.totalSeconds * offX / CGRectGetWidth(this.videoPieces.bounds);
    [this.moviePlayer.player seekToTime:CMTimeMakeWithSeconds(this.lastStartSeconds, 1) toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];
}];

[_videoPieces setBlockSeekOffRight:^(CGFloat offX) {
    this.seeking = true;
    [this.moviePlayer fof_pause];
    this.lastEndSeconds = this.totalSeconds * offX / CGRectGetWidth(this.videoPieces.bounds);
    [this.moviePlayer.player seekToTime:CMTimeMakeWithSeconds(this.lastEndSeconds, 1) toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];
}];

By listening to the left and right handle events, the drag offset is converted into a time, which in turn sets the player's start and end times.
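The offset-to-time conversion is plain proportionality. A minimal sketch with hypothetical names (Python rather than the article's Objective-C):

```python
def offset_to_seconds(off_x, strip_width, total_seconds):
    """Map a handle's x offset within the thumbnail strip to a playback
    time, matching totalSeconds * offX / width in the blocks above."""
    return total_seconds * off_x / strip_width

# A handle dragged to 75 pt in a 300 pt strip of a 60 s video
# selects the 15 s mark.
print(offset_to_seconds(75, 300, 60))
```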

Loop playback

self.timeObserverToken = [self.moviePlayer.player addPeriodicTimeObserverForInterval:CMTimeMakeWithSeconds(0.5, NSEC_PER_SEC) queue:dispatch_get_main_queue() usingBlock:^(CMTime time) {
    if (!this.seeking) {
        if (fabs(CMTimeGetSeconds(time) - this.lastEndSeconds) <= 0.02) {
            [this.moviePlayer fof_pause];
            [this private_replayAtBeginTime:this.lastStartSeconds];
        }
    }
}];

Two points to note here:

1. The observer returned by addPeriodicTimeObserverForInterval must be removed, otherwise it leaks memory:

- (void)dealloc {
    [self.moviePlayer.player removeTimeObserver:self.timeObserverToken];
}

2. The playback time is observed to determine whether it has reached the time set by the right handle; if so, playback restarts from the start time. I puzzled over how to "trim while playing" for quite a while and nearly went down the wrong path of actually cutting the video. There is no need: during preview it is enough to control the start and end times, and the real trim is performed only once at the end.
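The check inside the periodic observer reduces to: if the user is not scrubbing and the playhead is within a small tolerance of the right-handle time, pause and seek back to the left-handle time. A minimal sketch of that predicate (Python, hypothetical names):

```python
def should_restart(current_s, end_s, seeking=False, tol=0.02):
    """Mirror of the periodic-observer check above: restart only when
    not actively scrubbing and the playhead has reached the end time
    chosen with the right handle (within tol seconds)."""
    return (not seeking) and abs(current_s - end_s) <= tol

# Playhead at 14.99 s with the right handle at 15.0 s -> restart.
print(should_restart(14.99, 15.0))
```

One caveat: since the observer above fires only every 0.5 s, a 0.02 s tolerance can miss the boundary entirely; a `current >= end` comparison may be more robust in practice.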

Summary

Quite a few small problems came up while implementing this WeChat-style short-video editor, but with careful digging it all came together nicely in the end, which felt like a weight off my shoulders. Ha.

Source code

Source code on GitHub


