How to Generate a GIF from an Online Video on iOS
Many video apps offer a feature that turns a segment of an online video into a GIF. Here is the idea and an implementation. We already know how to generate a GIF from a local video, so the trick is to first trim the online video down to a local clip. From a quick comparison, the Tencent Video app appears to do it the same way. Without further ado, on to the code:
Step 1: Trim the video
#pragma mark - Trim video
- (void)interceptVideoAndVideoUrl:(NSURL *)videoUrl withOutPath:(NSString *)outPath outputFileType:(NSString *)outputFileType range:(NSRange)videoRange intercept:(InterceptBlock)interceptBlock {
    _interceptBlock = interceptBlock;
    // No background music is added
    NSURL *audioUrl = nil;
    // AVURLAsset provides access to the media's content, including its video and audio tracks
    AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audioUrl options:nil];
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];
    // The AVMutableComposition holds the AVMutableCompositionTracks for the video and audio resources
    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    // CMTimeRangeMake(start, duration): start time and duration, both CMTime values
    // CMTimeMake(int64_t value, int32_t timescale): value is a count of time units, timescale is the number of units per second; (value / timescale) is the actual duration in seconds. The timescale normally stays fixed, and the trimmed length is controlled through value.
    // CMTimeMakeWithSeconds(Float64 seconds, int32_t preferredTimeScale): seconds is the length in seconds, preferredTimeScale is the number of units per second
    // Start position
    CMTime startTime = CMTimeMakeWithSeconds(videoRange.location, videoAsset.duration.timescale);
    // Trimmed length
    CMTime videoDuration = CMTimeMakeWithSeconds(videoRange.length, videoAsset.duration.timescale);
    CMTimeRange videoTimeRange = CMTimeRangeMake(startTime, videoDuration);
    // Video track
    AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    // tracksWithMediaType returns an empty array when no matching track exists, so guard against out-of-bounds access
    // timeRange: the range to copy; ofTrack: the source track; atTime: where to insert it in the composition
    [compositionVideoTrack insertTimeRange:videoTimeRange ofTrack:([videoAsset tracksWithMediaType:AVMediaTypeVideo].count > 0) ? [videoAsset tracksWithMediaType:AVMediaTypeVideo].firstObject : nil atTime:kCMTimeZero error:nil];
    // Copy the video's own audio track (skip this block if you don't want the original sound in the merged clip)
    AVMutableCompositionTrack *compositionVoiceTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionVoiceTrack insertTimeRange:videoTimeRange ofTrack:([videoAsset tracksWithMediaType:AVMediaTypeAudio].count > 0) ? [videoAsset tracksWithMediaType:AVMediaTypeAudio].firstObject : nil atTime:kCMTimeZero error:nil];
    // The audio trim range matches the video length
    CMTimeRange audioTimeRange = CMTimeRangeMake(kCMTimeZero, videoDuration);
    // Background-audio track (empty here because audioUrl is nil)
    AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionAudioTrack insertTimeRange:audioTimeRange ofTrack:([audioAsset tracksWithMediaType:AVMediaTypeAudio].count > 0) ? [audioAsset tracksWithMediaType:AVMediaTypeAudio].firstObject : nil atTime:kCMTimeZero error:nil];
    // AVAssetExportSession merges the composition and writes it out; presetName selects the output preset
    AVAssetExportSession *assetExportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetPassthrough];
    // Output path for the merged video
    NSURL *outPutURL = [NSURL fileURLWithPath:outPath];
    if ([[NSFileManager defaultManager] fileExistsAtPath:outPath]) {
        [[NSFileManager defaultManager] removeItemAtPath:outPath error:nil];
    }
    // Output container format
    assetExportSession.outputFileType = outputFileType;
    assetExportSession.outputURL = outPutURL;
    // Optimize the output file for network playback
    assetExportSession.shouldOptimizeForNetworkUse = YES;
    [assetExportSession exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            switch (assetExportSession.status) {
                case AVAssetExportSessionStatusFailed:
                    if (_interceptBlock) {
                        _interceptBlock(assetExportSession.error, outPutURL);
                    }
                    break;
                case AVAssetExportSessionStatusCancelled: {
                    logdebug(@"Export Status: Cancelled");
                    break;
                }
                case AVAssetExportSessionStatusCompleted: {
                    if (_interceptBlock) {
                        _interceptBlock(nil, outPutURL);
                    }
                    break;
                }
                case AVAssetExportSessionStatusUnknown: {
                    logdebug(@"Export Status: Unknown");
                    break;
                }
                case AVAssetExportSessionStatusExporting: {
                    logdebug(@"Export Status: Exporting");
                    break;
                }
                case AVAssetExportSessionStatusWaiting: {
                    logdebug(@"Export Status: Waiting");
                    break;
                }
            }
        });
    }];
}
Step 2: Generate a GIF from the local video
/**
 Generate a GIF image
 @param videoURL URL of the source video
 @param loopCount number of times the GIF repeats
 @param time delay between frames, defaulting to 0.25 s
 @param imagePath file path where the GIF is written
 @param completeBlock completion callback
 */
#pragma mark - Create GIF
- (void)createGIFfromURL:(NSURL *)videoURL loopCount:(int)loopCount delayTime:(CGFloat)time gifImagePath:(NSString *)imagePath complete:(CompleteBlock)completeBlock {
    _completeBlock = completeBlock;
    float delayTime = time ?: 0.25;
    // Create the file- and frame-level GIF property dictionaries
    NSDictionary *fileProperties = [self filePropertiesWithLoopCount:loopCount];
    NSDictionary *frameProperties = [self framePropertiesWithDelayTime:delayTime];
    AVURLAsset *asset = [AVURLAsset assetWithURL:videoURL];
    float videoWidth = [[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize].width;
    float videoHeight = [[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize].height;
    // Pick an output scale based on the source resolution: the larger the video, the more we downscale
    GIFSize optimalSize = GIFSizeMedium;
    if (videoWidth >= 1200 || videoHeight >= 1200)
        optimalSize = GIFSizeVeryLow;
    else if (videoWidth >= 800 || videoHeight >= 800)
        optimalSize = GIFSizeLow;
    else if (videoWidth >= 400 || videoHeight >= 400)
        optimalSize = GIFSizeMedium;
    else
        optimalSize = GIFSizeHigh;
    // Get the length of the video in seconds
    float videoLength = (float)asset.duration.value / asset.duration.timescale;
    int framesPerSecond = 4;
    int frameCount = videoLength * framesPerSecond;
    // How far along the video track we want to move per frame, in seconds
    float increment = (float)videoLength / frameCount;
    // Collect the capture time points
    NSMutableArray *timePoints = [NSMutableArray array];
    for (int currentFrame = 0; currentFrame < frameCount; ++currentFrame) {
        float seconds = (float)increment * currentFrame;
        // timeInterval is a file-scope NSNumber constant; this code appears to be adapted from NSGIF, which defines it as #define timeInterval @(600)
        CMTime time = CMTimeMakeWithSeconds(seconds, [timeInterval intValue]);
        [timePoints addObject:[NSValue valueWithCMTime:time]];
    }
    NSURL *gifURL = [self createGIFforTimePoints:timePoints fromURL:videoURL fileProperties:fileProperties frameProperties:frameProperties gifImagePath:imagePath frameCount:frameCount gifSize:_gifSize ?: GIFSizeMedium];
    if (_completeBlock) {
        // Hand the GIF URL (and any error) back to the caller
        _completeBlock(_error, gifURL);
    }
}
With those two steps you get both a trimmed local video and a GIF, which can simply be stored in the sandbox. Here are the helper methods both steps rely on:
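As a rough sketch of how the two steps chain together at a call site (the sandbox paths, the `onlineVideoURL` variable, and the exact `InterceptBlock`/`CompleteBlock` signatures below are assumptions inferred from the method declarations above, not part of the original):

```objectivec
// Hypothetical call site: trim 5 seconds starting at second 10, then turn the clip into a GIF.
NSString *docs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).firstObject;
NSString *videoPath = [docs stringByAppendingPathComponent:@"clip.mp4"];
NSString *gifPath = [docs stringByAppendingPathComponent:@"clip.gif"];

[self interceptVideoAndVideoUrl:onlineVideoURL
                    withOutPath:videoPath
                 outputFileType:AVFileTypeMPEG4
                          range:NSMakeRange(10, 5)
                      intercept:^(NSError *error, NSURL *clipURL) {
    if (error) { return; }
    [self createGIFfromURL:clipURL
                 loopCount:0   // a GIF loop count of 0 means loop forever
                 delayTime:0.25
              gifImagePath:gifPath
                  complete:^(NSError *gifError, NSURL *gifURL) {
        // gifURL now points at the finished GIF inside the sandbox
    }];
}];
```

Note that both completion blocks are assumed to fire on the main queue, as the export handler above dispatches there.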
#pragma mark - Base methods
- (NSURL *)createGIFforTimePoints:(NSArray *)timePoints fromURL:(NSURL *)url fileProperties:(NSDictionary *)fileProperties frameProperties:(NSDictionary *)frameProperties gifImagePath:(NSString *)imagePath frameCount:(int)frameCount gifSize:(GIFSize)gifSize {
    NSURL *fileURL = [NSURL fileURLWithPath:imagePath];
    if (fileURL == nil)
        return nil;
    CGImageDestinationRef destination = CGImageDestinationCreateWithURL((__bridge CFURLRef)fileURL, kUTTypeGIF, frameCount, NULL);
    CGImageDestinationSetProperties(destination, (CFDictionaryRef)fileProperties);
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
    AVAssetImageGenerator *generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES;
    // tolerance and timeInterval are file-scope NSNumber constants (from the NSGIF source this appears to be based on)
    CMTime tol = CMTimeMakeWithSeconds([tolerance floatValue], [timeInterval intValue]);
    generator.requestedTimeToleranceBefore = tol;
    generator.requestedTimeToleranceAfter = tol;
    NSError *error = nil;
    CGImageRef previousImageRefCopy = nil;
    for (NSValue *time in timePoints) {
        CGImageRef imageRef;
#if TARGET_OS_IPHONE || TARGET_IPHONE_SIMULATOR
        // On iOS, optionally downscale the frame according to the chosen GIF size
        imageRef = (float)gifSize / 10 != 1 ? createImageWithScale([generator copyCGImageAtTime:[time CMTimeValue] actualTime:nil error:&error], (float)gifSize / 10) : [generator copyCGImageAtTime:[time CMTimeValue] actualTime:nil error:&error];
#elif TARGET_OS_MAC
        imageRef = [generator copyCGImageAtTime:[time CMTimeValue] actualTime:nil error:&error];
#endif
        if (error) {
            _error = error;
            logdebug(@"Error copying image: %@", error);
            CFRelease(destination);
            return nil;
        }
        if (imageRef) {
            // Keep a copy of the latest good frame so a missing frame can be filled by duplication
            CGImageRelease(previousImageRefCopy);
            previousImageRefCopy = CGImageCreateCopy(imageRef);
        } else if (previousImageRefCopy) {
            imageRef = CGImageCreateCopy(previousImageRefCopy);
        } else {
            _error = [NSError errorWithDomain:NSStringFromClass([self class]) code:0 userInfo:@{NSLocalizedDescriptionKey: @"Error copying image and no previous frames to duplicate"}];
            logdebug(@"Error copying image and no previous frames to duplicate");
            CFRelease(destination);
            return nil;
        }
        CGImageDestinationAddImage(destination, imageRef, (CFDictionaryRef)frameProperties);
        CGImageRelease(imageRef);
    }
    CGImageRelease(previousImageRefCopy);
    // Finalize the GIF
    if (!CGImageDestinationFinalize(destination)) {
        _error = error;
        logdebug(@"Failed to finalize GIF destination: %@", error);
        if (destination != nil) {
            CFRelease(destination);
        }
        return nil;
    }
    CFRelease(destination);
    return fileURL;
}
#pragma mark - Helpers
CGImageRef createImageWithScale(CGImageRef imageRef, float scale) {
#if TARGET_OS_IPHONE || TARGET_IPHONE_SIMULATOR
    CGSize newSize = CGSizeMake(CGImageGetWidth(imageRef) * scale, CGImageGetHeight(imageRef) * scale);
    CGRect newRect = CGRectIntegral(CGRectMake(0, 0, newSize.width, newSize.height));
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0);
    CGContextRef context = UIGraphicsGetCurrentContext();
    if (!context) {
        UIGraphicsEndImageContext();
        return nil;
    }
    // Set the quality level to use when rescaling
    CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
    // Core Graphics uses a flipped coordinate system, so flip vertically before drawing
    CGAffineTransform flipVertical = CGAffineTransformMake(1, 0, 0, -1, 0, newSize.height);
    CGContextConcatCTM(context, flipVertical);
    // Draw into the context; this scales the image
    CGContextDrawImage(context, newRect, imageRef);
    // Release the old image
    CFRelease(imageRef);
    // Get the resized image back out of the context
    imageRef = CGBitmapContextCreateImage(context);
    UIGraphicsEndImageContext();
#endif
    return imageRef;
}
#pragma mark - Properties
- (NSDictionary *)filePropertiesWithLoopCount:(int)loopCount {
    return @{(NSString *)kCGImagePropertyGIFDictionary:
                 @{(NSString *)kCGImagePropertyGIFLoopCount: @(loopCount)}
             };
}
- (NSDictionary *)framePropertiesWithDelayTime:(float)delayTime {
    return @{(NSString *)kCGImagePropertyGIFDictionary:
                 @{(NSString *)kCGImagePropertyGIFDelayTime: @(delayTime)},
             (NSString *)kCGImagePropertyColorModel: (NSString *)kCGImagePropertyColorModelRGB
             };
}
Finally, the trimmed local video can be played back with AVPlayer, and the generated GIF can be displayed with UIWebView, WKWebView, or YYImage.
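As an illustrative sketch of the WKWebView option (the `gifPath` variable is an assumption standing in for wherever step 2 wrote the GIF):

```objectivec
// Load the GIF bytes from the sandbox and hand them to a WKWebView as image/gif data.
NSData *gifData = [NSData dataWithContentsOfFile:gifPath];
WKWebView *webView = [[WKWebView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:webView];
// WKWebView animates GIF data when given the image/gif MIME type
[webView loadData:gifData
             MIMEType:@"image/gif"
characterEncodingName:@"UTF-8"
              baseURL:[NSURL fileURLWithPath:NSTemporaryDirectory()]];
```

`loadData:MIMEType:characterEncodingName:baseURL:` requires iOS 9 or later; on earlier systems, fall back to UIWebView or YYImage.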
That is all for this article. Hopefully it helps with your studies, and please continue to support 腳本之家.