iOS GPUImage Study 7: A First Look at Dynamic Photo Albums (Watermarks)

Preface:

This is about more than just watermarks; it also covers the ideas behind 3D photo-album effects, all of which can be built with GPUImage.
Imagine recording a video on your phone, adding your own effects, and turning it into a dynamic photo album. Exciting, isn't it?

The effect:

(screenshot of the demo effect)

Explanation:

This is a simple animation. The logic: view A follows view B's rotation, while A's frame stays just inside B's and changes along with it.
It is actually quite simple:
imageView1.frame = CGRectMake(imageView2.frame.origin.x + 1, imageView2.frame.origin.y + 1, imageView2.frame.size.width - 1, imageView2.frame.size.height - 1);
imageView2.layer.transform = CATransform3DRotate(imageView2.layer.transform, M_PI/100, 0, 0, 1);
A simple CATransform3D animation.

For video watermarking, see the previous post:
http://blog.csdn.net/xoxo_x/article/details/71055867

An introduction to GPUImageUIElement:

GPUImageUIElement is an input class: it renders a UIView (or CALayer) into a texture and feeds it into output targets such as GPUImageView, GPUImageMovieWriter, and filters.

Its initializers and instance methods:

// Initialization and teardown
- (id)initWithView:(UIView *)inputView;
- (id)initWithLayer:(CALayer *)inputLayer;
// Layer management
- (CGSize)layerSizeInPixels;
- (void)update;
- (void)updateUsingCurrentTime;
- (void)updateWithTimestamp:(CMTime)frameTime;
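
As a minimal sketch of how these methods fit together (the label and `cameraFilter` here are illustrative, not from the demo below), a plain UIView can be blended over video like this:

```objectivec
// Render a UILabel into the GPUImage pipeline as a watermark overlay.
UILabel *label = [[UILabel alloc] initWithFrame:CGRectMake(0, 0, 240, 60)];
label.text = @"watermark";
label.textColor = [UIColor whiteColor];

GPUImageUIElement *element = [[GPUImageUIElement alloc] initWithView:label];
GPUImageAlphaBlendFilter *blend = [[GPUImageAlphaBlendFilter alloc] init];

// Input order matters for a two-input blend filter:
// first target = the video frames, second target = the UI texture.
[cameraFilter addTarget:blend];   // cameraFilter: some filter fed by the camera
[element addTarget:blend];
[element update];                 // re-render the view whenever it changes
```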

Note:

We want the watermark to animate in sync with the video, so we need the video's timestamps, which is why we use the updateWithTimestamp: method.
If we did not, we could drive the updates with a timer instead, but obtaining the timestamp of the frame currently being processed that way is cumbersome.
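
For completeness, the timer-based alternative mentioned above could look like the following sketch. It keeps the overlay animating, but updateUsingCurrentTime stamps frames with the host clock rather than the video's own timeline:

```objectivec
// Drive the UI element from the screen refresh instead of the video clock.
- (void)startTimerDrivenUpdates {
    CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self
                                                      selector:@selector(refreshElement)];
    [link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)refreshElement {
    // Stamps the frame with the current host time, which generally
    // does not match the timestamps of the video being processed.
    [pictureView updateUsingCurrentTime];
}
```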

Monitoring the video time:

This is also a callback for tracking the video's processing progress.
GPUImageFilter *progressFilter = [[GPUImageFilter alloc] init];
[videoCamera addTarget:progressFilter];
[progressFilter addTarget:filter];
// gives us the timestamp of the frame currently being processed
__weak typeof(self) weakSelf = self;
[progressFilter setFrameProcessingCompletionBlock:^(GPUImageOutput *output, CMTime time) {
    __strong typeof(weakSelf) strongSelf = weakSelf;
    imageView1.frame = CGRectMake(imageView2.frame.origin.x + 1, imageView2.frame.origin.y + 1, imageView2.frame.size.width - 1, imageView2.frame.size.height - 1);
    imageView2.layer.transform = CATransform3DRotate(imageView2.layer.transform, M_PI/100, 0, 0, 1);
    [strongSelf->pictureView updateWithTimestamp:time];
}];

The full code:

//
//  ViewController.m
//  WatermarkDemo
//
//  Created by 馮士魁 on 2017/5/1.
//  Copyright © 2017 xoxo_x. All rights reserved.
//
/**
 *  You will find more fun things here:
 *  http://blog.csdn.net/xoxo_x/article
 */
#import "ViewController.h"
#import "GPUImage.h"
@interface ViewController () {
//    GPUImagePicture *pictureFile;
    GPUImageOutput<GPUImageInput> *filter;
    GPUImageVideoCamera *videoCamera;
    GPUImageView *filterView;
    GPUImageUIElement *pictureView;
    GPUImageMovie *movieFile;
}
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    [self initGPUImageView];
    [self initFilter];
    [self initCamera];

    UIView *view = [[UIView alloc] initWithFrame:self.view.bounds];
    UIImageView *imageView1 = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, self.view.bounds.size.width/2, self.view.bounds.size.height/2)];
    imageView1.image = [UIImage imageNamed:@"美女1.jpg"];
    [view addSubview:imageView1];
    UIImageView *imageView2 = [[UIImageView alloc] initWithFrame:CGRectMake(self.view.bounds.size.width/4, self.view.bounds.size.height/4, self.view.bounds.size.width/2, self.view.bounds.size.height/2)];
    imageView2.image = [UIImage imageNamed:@"美女2.jpg"];
    [view addSubview:imageView2];

    pictureView = [[GPUImageUIElement alloc] initWithView:view];
    GPUImageFilter *progressFilter = [[GPUImageFilter alloc] init];
    [videoCamera addTarget:progressFilter];
    [progressFilter addTarget:filter];
    [pictureView addTarget:filter];
    [filter addTarget:filterView];
    [videoCamera startCameraCapture];

    __weak typeof(self) weakSelf = self;   // avoid a retain cycle through the block
    [progressFilter setFrameProcessingCompletionBlock:^(GPUImageOutput *output, CMTime time) {
        __strong typeof(weakSelf) strongSelf = weakSelf;
        imageView1.frame = CGRectMake(imageView2.frame.origin.x + 1, imageView2.frame.origin.y + 1, imageView2.frame.size.width - 1, imageView2.frame.size.height - 1);
        imageView2.layer.transform = CATransform3DRotate(imageView2.layer.transform, M_PI/100, 0, 0, 1);
        [strongSelf->pictureView updateWithTimestamp:time];
    }];
}

- (void)initGPUImageView {
    filterView = [[GPUImageView alloc] initWithFrame:self.view.frame];
    [self.view addSubview:filterView];
}

- (void)initFilter {
    filter = [[GPUImageAlphaBlendFilter alloc] init];
}

- (void)initCamera {
    videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
    videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
    videoCamera.horizontallyMirrorFrontFacingCamera = YES;
}

@end
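
The preface mentions saving the result as a dynamic album. As a sketch (the file path and output size here are illustrative), the blended output could also be recorded to disk by adding a GPUImageMovieWriter as a further target of the blend filter:

```objectivec
// Record the watermarked frames to a movie file.
NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"watermarked.m4v"];
unlink([path UTF8String]);   // the writer fails if the file already exists
GPUImageMovieWriter *movieWriter =
    [[GPUImageMovieWriter alloc] initWithMovieURL:[NSURL fileURLWithPath:path]
                                             size:CGSizeMake(480.0, 640.0)];
[filter addTarget:movieWriter];            // same blended output the screen shows
videoCamera.audioEncodingTarget = movieWriter;
[movieWriter startRecording];

// later, e.g. when the user taps stop:
[movieWriter finishRecordingWithCompletionHandler:^{
    videoCamera.audioEncodingTarget = nil;
    NSLog(@"saved to %@", path);
}];
```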

Donations welcome. Add me as a friend after donating O(∩_∩)O haha~
