
lflivekit's Introduction

LFLiveKit


LFLiveKit is an open-source RTMP streaming SDK for iOS.

Features

  • Background recording
  • Horizontal and vertical recording
  • Beauty face filter via GPUImage
  • H.264 + AAC hardware encoding
  • Frame dropping on poor networks
  • Dynamic bitrate switching
  • Audio configuration
  • Video configuration
  • RTMP transport
  • Camera switching (front/back)
  • Audio mute
  • Send buffer
  • Watermark support
  • Swift support
  • Video-only or audio-only streaming
  • External video or audio input (screen recording or peripherals)
  • FLV packaging and sending

Requirements

- iOS 7.0+
- Xcode 7.3

Installation

CocoaPods

To integrate LFLiveKit into your Xcode project using CocoaPods, specify it in your Podfile:

source 'https://github.com/CocoaPods/Specs.git'
platform :ios, '7.0'
pod 'LFLiveKit'

Then run the following command:

$ pod install

Carthage

1. Add `github "LaiFengiOS/LFLiveKit"` to your Cartfile.
2. Run `carthage update --platform ios` and add the framework to your project.
3. Import `<LFLiveKit/LFLiveKit.h>`.

Manually

1. Download all the files in the `LFLiveKit` subdirectory.
2. Add the source files to your Xcode project.
3. Link with required frameworks:
    * UIKit
    * Foundation
    * AVFoundation
    * VideoToolbox
    * AudioToolbox
    * libz
    * libstdc++

Usage example

Objective-C

- (LFLiveSession*)session {
	if (!_session) {
	    _session = [[LFLiveSession alloc] initWithAudioConfiguration:[LFLiveAudioConfiguration defaultConfiguration] videoConfiguration:[LFLiveVideoConfiguration defaultConfiguration]];
	    _session.preView = self;
	    _session.delegate = self;
	}
	return _session;
}

- (void)startLive {	
	LFLiveStreamInfo *streamInfo = [LFLiveStreamInfo new];
	streamInfo.url = @"your server rtmp url";
	[self.session startLive:streamInfo];
}

- (void)stopLive {
	[self.session stopLive];
}

//MARK: - CallBack:
- (void)liveSession:(nullable LFLiveSession *)session liveStateDidChange: (LFLiveState)state;
- (void)liveSession:(nullable LFLiveSession *)session debugInfo:(nullable LFLiveDebug*)debugInfo;
- (void)liveSession:(nullable LFLiveSession*)session errorCode:(LFLiveSocketErrorCode)errorCode;

Swift

// import LFLiveKit in [ProjectName]-Bridging-Header.h
#import <LFLiveKit.h> 

//MARK: - Getters and Setters
lazy var session: LFLiveSession = {
	let audioConfiguration = LFLiveAudioConfiguration.defaultConfiguration()
	let videoConfiguration = LFLiveVideoConfiguration.defaultConfigurationForQuality(LFLiveVideoQuality.Low3, landscape: false)
	let session = LFLiveSession(audioConfiguration: audioConfiguration, videoConfiguration: videoConfiguration)
	    
	session?.delegate = self
	session?.preView = self.view
	return session!
}()

//MARK: - Event
func startLive() -> Void { 
	let stream = LFLiveStreamInfo()
	stream.url = "your server rtmp url";
	session.startLive(stream)
}

func stopLive() -> Void {
	session.stopLive()
}

//MARK: - Callback
func liveSession(session: LFLiveSession?, debugInfo: LFLiveDebug?) 
func liveSession(session: LFLiveSession?, errorCode: LFLiveSocketErrorCode)
func liveSession(session: LFLiveSession?, liveStateDidChange state: LFLiveState)
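
For reference, below is a minimal sketch of how these callbacks might be implemented. It assumes the delegate protocol is named LFLiveSessionDelegate and that LiveViewController is your own view controller class; adapt the names to your project.

extension LiveViewController: LFLiveSessionDelegate {

	func liveSession(session: LFLiveSession?, liveStateDidChange state: LFLiveState) {
		// Track connection progress (pending, start, stop, error, ...).
		print("liveStateDidChange: \(state.rawValue)")
	}

	func liveSession(session: LFLiveSession?, debugInfo: LFLiveDebug?) {
		// Periodic statistics (bitrate, dropped frames) that are handy while tuning.
	}

	func liveSession(session: LFLiveSession?, errorCode: LFLiveSocketErrorCode) {
		// Socket-level failures, e.g. the errorCode 203 seen in the RTMP auth issue below.
		print("errorCode: \(errorCode.rawValue)")
	}
}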

Release History

* 2.0.0
    * Bug fixes; support for live streaming on iOS 7.
* 2.2.4.3
    * Bug fixes; support for importing from Swift.
* 2.5
    * Bug fixes; bitcode support.

License

LFLiveKit is released under the MIT license. See LICENSE for details.

lflivekit's People

Contributors

bunnyirsa, hannseman, kciter, linyehui, michael-ioser, toss156, zenonhuang, zhangyu528


lflivekit's Issues

The pod can't be found

arondeMacBook-Pro:MiaowShow aron$ pod search LFLive
[!] Unable to find a pod with name matching `LFLiveKit'

Resolving dependencies of Podfile
[!] Unable to find a specification for LFLiveKit

The library can't be found via pod.
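
A common first step when `pod search` cannot find a published pod (a general CocoaPods tip, not something suggested in this thread) is to refresh the local spec repo and search again:

$ pod repo update
$ pod search LFLiveKit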

RTMP Auth problem

How do I get rtmp authentication working?

LFLiveStreamInfo *stream = [LFLiveStreamInfo new];
stream.url = @"rtmp://test:test@myhost/test";

Does not work.

It prints
2016-08-04 16:37:58.737 LFLiveKitDemo[1447:493017] MicrophoneSource: startRunning
2016-08-04 16:37:58.936 LFLiveKitDemo[1447:493066] handleRouteChange reason is The category of the session object changed.
2016-08-04 16:38:22.112 LFLiveKitDemo[1447:492940] liveStateDidChange: 1
ERROR: Problem accessing the DNS. (addr: test)
2016-08-04 16:38:22.130 LFLiveKitDemo[1447:492940] errorCode: 203
2016-08-04 16:38:22.130 LFLiveKitDemo[1447:492940] liveStateDidChange: 4

Thanks

How to resolve Broken pipe errors

I'm using your library as a reference to learn about RTMP streaming, and I hit the following errors while pushing the stream:
2016-07-18 09:32:47.802 LiveStudy[923:159136] packet
2016-07-18 09:32:47.802 LiveStudy[923:159136] timestamp 52582565
ERROR: WriteN, PILI_RTMP send error 32, Broken pipe, (144 bytes)
ERROR: WriteN, PILI_RTMP send error 32, Broken pipe, (39 bytes)
ERROR: WriteN, PILI_RTMP send error 32, Broken pipe, (42 bytes)

My study repository: https://github.com/mengxiangyue/LiveStudy
If you have time, please take a look; if not, any pointers would also be much appreciated.

My pipeline captures data from the camera (video only for now), hardware-encodes it, and finally sends it over RTMP. The RTMP sending code is essentially the same as your library's, and all of my operations run on the main thread.

Many thanks.

Works fine in a project I created myself, but errors when imported into my company's project; not sure why

Failed to compile fragment shader
2016-08-20 14:57:37.193 SystemTeq[589:132757] Program link log: (null)
2016-08-20 14:57:37.193 SystemTeq[589:132757] Fragment shader compile log: ERROR: 0:1: 'mainScreen' : syntax error: syntax error
2016-08-20 14:57:37.194 SystemTeq[589:132757] Vertex shader compile log: (null)
2016-08-20 14:57:37.194 SystemTeq[589:132757] *** Assertion failure in -[LFGPUImageBeautyFilter initWithVertexShaderFromString:fragmentShaderFromString:], /Users/macbookpro/Downloads/work/XimalayaSDK_iOS_2.12/SystemTeq/SystemTeq/VideoLibrary/LFLiveKit/Vendor/GPUImage/GPUImageFilter.m:94

Live streaming won't stop!!!

After calling [_self.session stopLive], the live state changes from connected to connecting and then back to connected; it just won't stop!
(screenshot omitted)

Couldn't get the project to run after several attempts...

I feel like such a failure...
ld: library not found for -lPods
clang: error: linker command failed with exit code 1 (use -v to see invocation)

Has anyone hit the same error? Please let me know, thanks.
I'm on pod 1.0.1.

After downloading the source, I only ran pod install in the demo directory, and building the project failed.

Memory blows up on weak networks

Hi, during testing we found that when the network is poor, memory keeps growing and reaches over 100 MB. There's no such problem when the network is good. What bug could this be?
Thanks for any explanation.

Please advise on RTMP playback

Hello
Thank you for your awesome library, it's so much better than VideoCore!

But it's hard to find a reliable library to actually view the stream. Could you please recommend the best one for iOS, in your opinion?

How do I add a watermark?

- (void)setWaterMark
{
    GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
    blendFilter.mix = 1.0;
    NSDate *startTime = [NSDate date];

    UILabel *timeLabel = [[UILabel alloc] initWithFrame:CGRectMake(0.0, 0.0, 240.0f, 320.0f)];
    timeLabel.font = [UIFont systemFontOfSize:17.0f];
    timeLabel.text = @"Time: 0.0 s";
    timeLabel.textAlignment = NSTextAlignmentCenter;
    timeLabel.backgroundColor = [UIColor clearColor];
    timeLabel.textColor = [UIColor whiteColor];

    GPUImageUIElement *uiElementInput = [[GPUImageUIElement alloc] initWithView:timeLabel];

    [_filter addTarget:blendFilter];
    [uiElementInput addTarget:blendFilter];

    [blendFilter addTarget:_gpuImageView];

    __unsafe_unretained GPUImageUIElement *weakUIElementInput = uiElementInput;
    __weak typeof(self) _self = self;
    [_filter setFrameProcessingCompletionBlock:^(GPUImageOutput *filter, CMTime frameTime) {
        timeLabel.text = [NSString stringWithFormat:@"Time: %f s", -[startTime timeIntervalSinceNow]];
        [weakUIElementInput update];
        [_self processVideo:filter];
    }];
    [_videoCamera addTarget:_gpuImageView];
}
This is my code for adding a watermark, but it crashes at [weakUIElementInput update].

EXC_BAD_ACCESS (code=EXC_ARM_DA_ALIGN, address=0x103b)

A memory error occurs when testing on a real iPhone 5 connected to Xcode.
It runs without issue once the cable is disconnected.
The exception details are below. Thank you very much for open-sourcing this; I hope it keeps getting better.

xcode Version 7.3.1 (7D1014)
iPhone5 iOS 8.3
pod 'LFLiveKit', '~> 1.6'

Code location:
GPUImageView.m Line 168
[[[GPUImageContext sharedImageProcessingContext] context] renderbufferStorage:GL_RENDERBUFFER fromDrawable:(CAEAGLLayer*)self.layer];

Log info:
device iPhone5,1
model iPhone5
num 5

Thread info:

  • thread #1: tid = 0x55eaf, 0x0145a3bc libglInterpose.dylib`EAGLContext_renderbufferStorageFromDrawable(EAGLContext*, objc_selector*, unsigned int, id<EAGLDrawable>) + 204, queue = 'com.sunsetlakesoftware.GPUImage.openGLESContextQueue', stop reason = EXC_BAD_ACCESS (code=EXC_ARM_DA_ALIGN, address=0x103b)
      frame #0: 0x0145a3bc libglInterpose.dylib`EAGLContext_renderbufferStorageFromDrawable(EAGLContext*, objc_selector*, unsigned int, id<EAGLDrawable>) + 204
      frame #1: 0x00364402 -[GPUImageView createDisplayFramebuffer](self=0x073eed60, _cmd="createDisplayFramebuffer") + 342 at GPUImageView.m:168
      frame #2: 0x00363efc __26-[GPUImageView commonInit]_block_invoke(.block_descriptor=0x0128a4f8) + 1608 at GPUImageView.m:129
      frame #3: 0x014fdea4 libdispatch.dylib`_dispatch_barrier_sync_f_invoke + 96
      frame #4: 0x0033f70c runSynchronouslyOnVideoProcessingQueue(block=0x0128a4f8) + 102 at GPUImageOutput.m:44
      frame #5: 0x003638a0 -[GPUImageView commonInit](self=0x073eed60, _cmd="commonInit") + 810 at GPUImageView.m:97
      frame #6: 0x00363492 -[GPUImageView initWithFrame:](self=0x073eed60, _cmd="initWithFrame:", frame=(origin = (x = 0, y = 0), size = (width = 320, height = 568))) + 168 at GPUImageView.m:63
      frame #7: 0x00306ce6 -[LFVideoCapture initWithVideoConfiguration:](self=0x073d39c0, _cmd="initWithVideoConfiguration:", configuration=0x073cf280) + 874 at LFVideoCapture.m:37
      frame #8: 0x002fa960 -[LFLiveSession videoCaptureSource](self=0x073cf6b0, _cmd="videoCaptureSource") + 132 at LFLiveSession.m:219
      frame #9: 0x002fa40a -[LFLiveSession setRunning:](self=0x073cf6b0, _cmd="setRunning:", running=YES) + 204 at LFLiveSession.m:173

Manual Installation

CocoaPods and Carthage are awesome tools and make our lives much easier, but some devs still don't know how to use them.

It would be cool to add a manual installation guide to your README.md. You can take a look at my iOS Readme Template to see how to do it.

The best framework!

Just wanted to say: keep up the good work. This is the best live-streaming framework I have found so far, and it makes things easier than ever. Thank you 👍

The watermark is very blurry

When I set a UILabel as the waterMarkView, the text is noticeably blurry.

Streaming fails on a 4G connection

When capturing and streaming on 4G, the console prints:
handleRouteChange reason is The category of the session object changed.
handleRouteChange reason is The output route was overridden by the app.
It looks like the data is never actually pushed out!

I can't install the pod

pod install
Unable to find a specification for `LFLiveKit`

pod search LFLiveKit
[!] Unable to find a pod with name, author, summary, or description matching `LFLiveKit`

I use Xcode 7.2.

Question about the supportSessionPreset method in LFLiveVideoConfiguration

(screenshot of the supportSessionPreset: implementation omitted)

Isn't there a problem with the logic in that code? The first `if (![session canSetSessionPreset:avSessionPreset])` check is meant to detect that the configured resolution isn't supported and step it down, but after assigning `sessionPreset = LFCaptureSessionPreset540x960` the very same `if (![session canSetSessionPreset:avSessionPreset])` check is evaluated again, and then `sessionPreset = LFCaptureSessionPreset360x640` is set. Isn't that second check flawed? (See the sketch below.)
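
A rough reconstruction of the control flow described above, written as a sketch rather than the library source; `CapturePreset` and `canSet` are stand-ins for the real `LFCaptureSessionPreset` values and `-[AVCaptureSession canSetSessionPreset:]`.

enum CapturePreset { case preset720x1280, preset540x960, preset360x640 }

func downgradedPreset(avSessionPreset: String, canSet: (String) -> Bool) -> CapturePreset {
	var sessionPreset = CapturePreset.preset720x1280
	if !canSet(avSessionPreset) {
		sessionPreset = .preset540x960
		// avSessionPreset still describes the original preset here, so this second
		// check repeats the first one and the code falls straight through to
		// 360x640 without ever actually testing 540x960 (the reporter's concern).
		if !canSet(avSessionPreset) {
			sessionPreset = .preset360x640
		}
	}
	return sessionPreset
}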

Accept other video input

Great library! I wonder if you could make it more flexible by allowing users to specify the video input, for example by making LFVideoCapture a protocol and then letting LFLiveSession accept any LFVideoCapture implementation. The implementation could call `- (void)captureOutput:(nullable LFVideoCapture*)capture pixelBuffer:(nullable CVImageBufferRef)pixelBuffer;` to supply the video input. A rough sketch of the idea is included below.

Thanks!
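
A rough sketch of what such a protocol could look like; the protocol and method names below are illustrative only and are not part of the current LFLiveKit API.

import CoreVideo

protocol VideoCapturingDelegate: class {
	// Mirrors the existing captureOutput:pixelBuffer: callback mentioned above.
	func videoCapture(capture: VideoCapturing, didOutputPixelBuffer pixelBuffer: CVPixelBuffer)
}

protocol VideoCapturing: class {
	var delegate: VideoCapturingDelegate? { get set }
	var running: Bool { get set }
}

LFLiveSession could then accept any VideoCapturing conformer instead of constructing LFVideoCapture directly.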

The picture looks blurry on the viewer side

Is there something wrong with the pushed stream? No matter how I configure it, the picture on the viewing side is often very blurry, clearly not as sharp as the preview on the broadcasting side, and when the broadcaster moves, the viewer's picture is completely smeared. What could the reason be?
Also, the filter seems to produce very strong glare; it feels worse than the earliest version.

1.8.0 demo: after stopping the live stream it still reconnects to the server automatically, and the state shows connected

Also, if the server isn't running, tapping start live and then tapping it again crashes in LFStreamRtmpSocket.m:

- (void)_stop{
    if(self.delegate && [self.delegate respondsToSelector:@selector(socketStatus:status:)]){
        [self.delegate socketStatus:self status:LFLiveStop];
    }
    if(_rtmp != NULL){
        PILI_RTMP_Close(_rtmp, &_error);
        PILI_RTMP_Free(_rtmp);
        _rtmp = NULL;
    }
} 

Failed to start Video Camera on iOS7

Hi,
I'm using iOS 7.1 on an iPad mini 2.

My camera video fails to start:

Assert - (kernResult == kIOReturnSuccess) - f: /SourceCache/AppleVXE380/AppleVXE380-403/Library/AppleVXE380UserLandLibrary.cpp l: 494
AppleVXE380VA ERROR: IOServiceOpen failed
Assert - (false) - f: /SourceCache/AppleVXE380/AppleVXE380-403/Library/AppleVXE380FIGwrapper.cpp l: 3106

VXE FIG: H264VideoEncoderVA_DriverCreate failed.

Any suggestions?
Thanks.

Please add face detection

Filters can be written to fit different needs, but I still hope face detection gets added; that would make the library more convenient to use.
