
IRCameraSticker

  • IRCameraSticker is a powerful face-sticker and camera-sticker framework for iOS.

Screenshots

Features

  • Supports face stickers.
  • Supports GPUImage.

Future

  • Support more customization options.

Install

Git

  • Git clone this project.
  • Copy this project into your own project.
  • Add the .xcodeproj to your project and link it as an embedded framework.

Options

  • You can remove the demo and ScreenShots folders.

Cocoapods

  • Add pod 'IRCameraSticker' to your Podfile.
  • Run pod install.

How does it work?

  • Use VNDetectFaceLandmarksRequestRevision3.
Starting with iOS 13, Vision returns a different set of landmark points when the request runs as VNDetectFaceLandmarksRequestRevision3.
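A minimal sketch of pinning the revision explicitly (the Vision symbols are real API; the surrounding setup is an assumption):

// Pin the landmarks request to revision 3 where available (iOS 13+),
// so the point constellation stays consistent across OS versions.
VNDetectFaceLandmarksRequest *request = [VNDetectFaceLandmarksRequest new];
if (@available(iOS 13.0, *)) {
    request.revision = VNDetectFaceLandmarksRequestRevision3;
}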

  • Get the size of the camera output.
AVCaptureVideoDataOutput *output = [[[self.videoCamera captureSession] outputs] lastObject];
NSDictionary *outputSettings = [output videoSettings];

// @"Width" and @"Height" are the string values behind kCVPixelBufferWidthKey / kCVPixelBufferHeightKey.
long width  = [[outputSettings objectForKey:@"Width"]  longValue];
long height = [[outputSettings objectForKey:@"Height"] longValue];
  • Convert points

    • There are two ways to convert the normalized bounding box into view coordinates.
    1. Method one
// pixelBuffer is the CVPixelBufferRef of the current camera frame.
size_t width = CVPixelBufferGetWidth(pixelBuffer);
size_t height = CVPixelBufferGetHeight(pixelBuffer);

CGSize size = CGSizeMake(width, height);

CGFloat scaleX = self.filterView.layer.frame.size.width / size.width;
CGFloat scaleY = self.filterView.layer.frame.size.height / size.height;

CGAffineTransform transform = CGAffineTransformTranslate(CGAffineTransformMakeScale(scaleX, -scaleY), 0, -1);

CGRect faceBoundingBoxOnScreen = VNImageRectForNormalizedRect(CGRectApplyAffineTransform(observedFace.boundingBox, transform), size.width, size.height);
    2. Method two
CGRect faceBoundingBoxOnScreen = CGRectZero;
faceBoundingBoxOnScreen.size.height = self.filterView.layer.frame.size.height * observedFace.boundingBox.size.height;
faceBoundingBoxOnScreen.size.width = self.filterView.layer.frame.size.width * observedFace.boundingBox.size.width;
faceBoundingBoxOnScreen.origin.x = observedFace.boundingBox.origin.x * self.filterView.layer.frame.size.width;
faceBoundingBoxOnScreen.origin.y = observedFace.boundingBox.origin.y * self.filterView.layer.frame.size.height;
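For reuse, method one above can be folded into a small helper; a sketch under the same math, with IRBoundingBoxOnScreen as a hypothetical name:

// Hypothetical helper: maps a Vision normalized bounding box into view
// coordinates, flipping the Y axis (Vision's origin is bottom-left).
static CGRect IRBoundingBoxOnScreen(CGRect normalizedBox, CGSize bufferSize, CGSize viewSize) {
    CGFloat scaleX = viewSize.width / bufferSize.width;
    CGFloat scaleY = viewSize.height / bufferSize.height;
    CGAffineTransform transform = CGAffineTransformTranslate(CGAffineTransformMakeScale(scaleX, -scaleY), 0, -1);
    CGRect flipped = CGRectApplyAffineTransform(normalizedBox, transform);
    return VNImageRectForNormalizedRect(flipped, (size_t)bufferSize.width, (size_t)bufferSize.height);
}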
  • Eyes

    • There are two ways to compute the eye points: method one transforms eye.normalizedPoints (which are relative to the face bounding box) manually, while method two lets pointsInImageOfSize: do that conversion.
    1. Method one
for (int i = 0; i < eye.pointCount; i++) {
    CGPoint eyePoint = eye.normalizedPoints[i];
    CGRect faceBounds = VNImageRectForNormalizedRect(screenBoundingBox, size.width, size.height);

    CGAffineTransform transform = CGAffineTransformScale(CGAffineTransformMakeTranslation(faceBounds.origin.x, faceBounds.origin.y), faceBounds.size.width, faceBounds.size.height);

    eyePoint = CGPointApplyAffineTransform(eyePoint, transform);

    CGFloat scaleX = self.filterView.layer.frame.size.width / size.width;
    CGFloat scaleY = self.filterView.layer.frame.size.height / size.height;

    transform = CGAffineTransformTranslate(CGAffineTransformMakeScale(scaleX, -scaleY), 0, -size.height);

    eyePoint = CGPointApplyAffineTransform(eyePoint, transform);
}
    2. Method two
CGPoint *newEyePoints = malloc(sizeof(CGPoint) * eye.pointCount); // remember to free() when done
NSMutableArray *newEyePointsArray = [NSMutableArray array];
const CGPoint *pointsInImage = [eye pointsInImageOfSize:CGSizeMake(size.width, size.height)];
for (int i = 0; i < eye.pointCount; i++) {
    CGPoint eyePoint = pointsInImage[i];

    CGFloat scaleX = self.filterView.layer.frame.size.width / size.width;
    CGFloat scaleY = self.filterView.layer.frame.size.height / size.height;
    
    CGAffineTransform transform = CGAffineTransformTranslate(CGAffineTransformMakeScale(scaleX, -scaleY), 0, -size.height);

    eyePoint = CGPointApplyAffineTransform(eyePoint, transform);

    newEyePoints[i] = eyePoint;
    [newEyePointsArray addObject:[NSValue valueWithCGPoint:eyePoint]];
}
  • All points
NSMutableArray *newAllPointsArray = [NSMutableArray array];
const CGPoint *pointsInImage = [landmarks.allPoints pointsInImageOfSize:CGSizeMake(size.width, size.height)];
for (int i = 0; i < landmarks.allPoints.pointCount; i++) {
    CGPoint eyePoint = pointsInImage[i];

    CGFloat scaleX = (self.filterView.layer.frame.size.width / size.width) * (size.height / self.filterView.layer.frame.size.width);
    CGFloat scaleY = (self.filterView.layer.frame.size.height / size.height) * (size.width / self.filterView.layer.frame.size.height);
    
    CGAffineTransform transform = CGAffineTransformTranslate(CGAffineTransformMakeScale(scaleX, -scaleY), 0, -size.height);
    
    eyePoint = CGPointApplyAffineTransform(eyePoint, transform);
    
    [newAllPointsArray addObject:[NSValue valueWithCGPoint:eyePoint]];
}
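The same per-point conversion can be wrapped once and reused for any landmark region; a sketch generalizing method two, with IRConvertPoints as a hypothetical name:

// Hypothetical helper: converts a landmark region's points from image
// coordinates into view-layer space.
static NSArray<NSValue *> *IRConvertPoints(VNFaceLandmarkRegion2D *region, CGSize size, CGSize viewSize) {
    NSMutableArray<NSValue *> *result = [NSMutableArray arrayWithCapacity:region.pointCount];
    const CGPoint *points = [region pointsInImageOfSize:size];
    CGFloat scaleX = viewSize.width / size.width;
    CGFloat scaleY = viewSize.height / size.height;
    CGAffineTransform transform = CGAffineTransformTranslate(CGAffineTransformMakeScale(scaleX, -scaleY), 0, -size.height);
    for (NSUInteger i = 0; i < region.pointCount; i++) {
        [result addObject:[NSValue valueWithCGPoint:CGPointApplyAffineTransform(points[i], transform)]];
    }
    return result;
}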

Usage

Basic

  • Setup Camera
#import <GPUImage/GPUImage.h>

#import "IRCameraSticker.h"
#import "IRCameraStickerFilter.h"
#import "IRCameraStickersManager.h"

self.videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPresetHigh cameraPosition:AVCaptureDevicePositionFront];
self.videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
self.videoCamera.horizontallyMirrorFrontFacingCamera = YES;
self.videoCamera.delegate = self;
  • Setup IRCameraStickerFilter and GPUImageView
self.stickerFilter = [IRCameraStickerFilter new];
[self.videoCamera addTarget:self.stickerFilter];

self.filterView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
self.filterView.fillMode = kGPUImageFillModePreserveAspectRatioAndFill;
self.filterView.center = self.view.center;

[self.view addSubview:self.filterView];

self.filterView.layer.frame = self.view.frame;

[self.stickerFilter addTarget:self.filterView];
  • Start Camera
[self.videoCamera startCameraCapture];
  • Load a sticker
[IRCameraStickersManager loadStickersWithCompletion:^(NSArray<IRCameraSticker *> *stickers) {
    self.stickers = stickers;
    self.stickerFilter.sticker = [stickers firstObject];
}];
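Switching stickers at runtime is just reassigning the filter's sticker; a sketch (the action method is an assumption, not library API):

// Hypothetical action: cycle through the stickers loaded above.
- (IBAction)nextStickerTapped:(id)sender {
    if (self.stickers.count == 0) return;
    NSUInteger current = [self.stickers indexOfObject:self.stickerFilter.sticker];
    NSUInteger next = (current == NSNotFound) ? 0 : (current + 1) % self.stickers.count;
    self.stickerFilter.sticker = self.stickers[next];
}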
  • Set the camera delegate callback, detect faces, then handle the results.
#pragma mark - GPUImageVideoCameraDelegate
- (void)willOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef frame = CMSampleBufferGetImageBuffer(sampleBuffer);
    [self detectFace:frame];
}

- (void)detectFace:(CVPixelBufferRef)image {
    size_t width = CVPixelBufferGetWidth(image);
    size_t height = CVPixelBufferGetHeight(image);
    
    VNDetectFaceLandmarksRequest *faceDetectionRequest = [[VNDetectFaceLandmarksRequest alloc] initWithCompletionHandler:^(VNRequest * _Nonnull request, NSError * _Nullable error) {
        dispatch_async(dispatch_get_main_queue(), ^{
            if ([request.results isKindOfClass:[NSArray<VNFaceObservation *> class]]) {
                [self handleFaceDetectionResults:request.results size:CGSizeMake(width, height)];
            } else {
                [self clearDrawings];
            }
        });
    }];
    
    VNImageRequestHandler *imageRequestHandler = [[VNImageRequestHandler alloc] initWithCVPixelBuffer:image orientation:kCGImagePropertyOrientationLeftMirrored options:@{}];
    [imageRequestHandler performRequests:@[faceDetectionRequest] error:nil];
}

- (void)handleFaceDetectionResults:(NSArray<VNFaceObservation *> *)observedFaces size:(CGSize)size {
    [self clearDrawings];
    
    NSMutableArray<CAShapeLayer *> *facesBoundingBoxes = [NSMutableArray array];
    
    // Collect the feature layers for every detected face.
    for (VNFaceObservation *observedFace in observedFaces) {
        if (observedFace.landmarks) {
            [facesBoundingBoxes addObjectsFromArray:[self drawFaceFeatures:observedFace.landmarks WithBoundingBox:observedFace.boundingBox size:size]];
        }
    }
    
    // Attach all layers once, then remember them so they can be cleared next frame.
    for (CAShapeLayer *faceBoundingBox in facesBoundingBoxes) {
        [self.filterView.layer addSublayer:faceBoundingBox];
    }
    
    self.drawings = facesBoundingBoxes;
}
    
- (void)clearDrawings {
    for (CAShapeLayer *drawing in self.drawings) {
        [drawing removeFromSuperlayer];
    }
}

- (NSMutableArray<CAShapeLayer *> *)drawFaceFeatures:(VNFaceLandmarks2D *)landmarks WithBoundingBox:(CGRect)screenBoundingBox size:(CGSize)size {
    NSMutableArray<CAShapeLayer *> *faceFeaturesDrawings = [NSMutableArray array];
    if (landmarks.allPoints) {
        NSMutableArray *newAllPointsArray = [NSMutableArray array];
        const CGPoint *pointsInImage = [landmarks.allPoints pointsInImageOfSize:CGSizeMake(size.width, size.height)];
        for (int i = 0; i < landmarks.allPoints.pointCount; i++) {
            CGPoint eyePoint = pointsInImage[i];

            CGFloat scaleX = (self.filterView.layer.frame.size.width / size.width) * (size.height / self.filterView.layer.frame.size.width);
            CGFloat scaleY = (self.filterView.layer.frame.size.height / size.height) * (size.width / self.filterView.layer.frame.size.height);
            
            CGAffineTransform transform = CGAffineTransformTranslate(CGAffineTransformMakeScale(scaleX, -scaleY), 0, -size.height);
            
            eyePoint = CGPointApplyAffineTransform(eyePoint, transform);
            
            [newAllPointsArray addObject:[NSValue valueWithCGPoint:eyePoint]];
        }
        
        self.stickerFilter.faces = @[newAllPointsArray];
    }
    
    return faceFeaturesDrawings;
}
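When leaving the screen, capture should be stopped; a minimal sketch (stopCameraCapture is GPUImage API, the view-controller hook is an assumption):

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    // Stop the GPUImage capture session while the view is off screen.
    [self.videoCamera stopCameraCapture];
}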

Advanced settings

  • Draw the face's bounding box.
CGFloat scaleX = self.filterView.layer.frame.size.width / size.width;
CGFloat scaleY = self.filterView.layer.frame.size.height / size.height;

CGAffineTransform transform = CGAffineTransformTranslate(CGAffineTransformMakeScale(scaleX, -scaleY), 0, -1);
CGRect faceBoundingBoxOnScreen = VNImageRectForNormalizedRect(CGRectApplyAffineTransform(observedFace.boundingBox, transform), size.width, size.height);

CGPathRef faceBoundingBoxPath = CGPathCreateWithRect(faceBoundingBoxOnScreen, NULL);
CAShapeLayer *faceBoundingBoxShape = [CAShapeLayer layer];
faceBoundingBoxShape.path = faceBoundingBoxPath;
faceBoundingBoxShape.fillColor = [UIColor clearColor].CGColor;
faceBoundingBoxShape.strokeColor = [UIColor greenColor].CGColor;
CGPathRelease(faceBoundingBoxPath); // the layer retains its own reference
NSMutableArray<CAShapeLayer *> *newDrawings = [NSMutableArray array];
[newDrawings addObject:faceBoundingBoxShape];
  • Draw the eyes' bounds.
...
if (landmarks.leftEye) {
    CAShapeLayer *eyeDrawing = [self drawEye:landmarks.leftEye WithBoundingBox:screenBoundingBox size:size];
    [faceFeaturesDrawings addObject:eyeDrawing];
}
if (landmarks.rightEye) {
    CAShapeLayer *eyeDrawing = [self drawEye:landmarks.rightEye WithBoundingBox:screenBoundingBox size:size];
    [faceFeaturesDrawings addObject:eyeDrawing];
}
...

- (CAShapeLayer *)drawEye:(VNFaceLandmarkRegion2D *)eye WithBoundingBox:(CGRect)screenBoundingBox size:(CGSize)size {
    CGMutablePathRef eyePath = CGPathCreateMutable();
    
    CGPoint *newEyePoints = malloc(sizeof(CGPoint) * eye.pointCount);
    NSMutableArray *newEyePointsArray = [NSMutableArray array];

    const CGPoint *pointsInImage = [eye pointsInImageOfSize:CGSizeMake(size.width, size.height)];
    for (int i = 0; i < eye.pointCount; i++) {
        CGPoint eyePoint = pointsInImage[i];

        CGFloat scaleX = self.filterView.layer.frame.size.width / size.width;
        CGFloat scaleY = self.filterView.layer.frame.size.height / size.height;
        
        CGAffineTransform transform = CGAffineTransformTranslate(CGAffineTransformMakeScale(scaleX, -scaleY), 0, -size.height);

        eyePoint = CGPointApplyAffineTransform(eyePoint, transform);

        newEyePoints[i] = eyePoint;
        [newEyePointsArray addObject:[NSValue valueWithCGPoint:eyePoint]];
    }
    
    CGPathAddLines(eyePath, NULL, newEyePoints, eye.pointCount);
    CGPathCloseSubpath(eyePath);
    free(newEyePoints); // the points have been copied into the path
    
    CAShapeLayer *eyeDrawing = [CAShapeLayer layer];
    eyeDrawing.anchorPoint = CGPointMake(0.5, 0.5);
    eyeDrawing.position = CGPointMake(size.width / 2, size.height / 2);
    eyeDrawing.bounds = CGRectMake(0, 0, size.width, size.height);
    eyeDrawing.path = eyePath;
    eyeDrawing.fillColor = [UIColor clearColor].CGColor;
    eyeDrawing.strokeColor = [UIColor greenColor].CGColor;
    CGPathRelease(eyePath); // the layer retains its own reference
    
    return eyeDrawing;
}

Thanks list

This project was inspired by FaceTracker and CameraStickerDemo.
