
OpenTok iOS SDK Samples

This repository provides examples to help you better understand the features of the OpenTok iOS SDK. The sample applications are meant to be used with the latest version of the OpenTok iOS SDK. Feel free to copy and modify the source code herein for your own projects. Please consider sharing your modifications with us, especially if they might benefit other developers using the OpenTok iOS SDK. See the License for more information.

Quick Start

  1. Get values for your OpenTok API key, session ID, and token. See Obtaining OpenTok Credentials for important information.

  2. Install CocoaPods as described in CocoaPods Getting Started.

  3. In Terminal, cd to your project directory and run pod install.

  4. Open your project in Xcode using the new .xcworkspace file in the project directory.

  5. Set up the config settings for the app. These vary depending on the project:

  • For the Archiving, Basic-Video-Chat, and Signaling projects, in the Config.h file, replace the placeholder with the base URL of the server that implements the learning-opentok-php or learning-opentok-node project:

    #define SAMPLE_SERVER_BASE_URL @"https://YOUR-SERVER-URL"

    For more information, see the instructions on setting up these servers in the OpenTok tutorials at the OpenTok developer center. (A Swift sketch of fetching credentials from such a server appears after this list.)

  • For all other projects, in the ViewController.m file, replace the following empty strings with the corresponding API key, session ID, and token values:

  ```objc
  // *** Fill the following variables using your own Project info  ***
  // ***          https://dashboard.tokbox.com/projects            ***
  // Replace with your OpenTok API key
  static NSString* const kApiKey = @"";
  // Replace with your generated session ID
  static NSString* const kSessionId = @"";
  // Replace with your generated token
  static NSString* const kToken = @"";
  ```
  6. Use Xcode to build and run the app on an iOS simulator or device.
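If you are using one of the sample servers (see step 5), you can fetch the credentials at runtime instead of hard-coding them. Below is a minimal Swift sketch, assuming the GET /session endpoint exposed by the learning-opentok-php/learning-opentok-node projects and its apiKey/sessionId/token JSON fields (the endpoint and field names are assumptions, so check your server implementation):

```swift
// Hedged sketch: fetch OpenTok credentials from a sample server at runtime.
// The "/session" path and JSON field names are assumptions based on the
// learning-opentok-php / learning-opentok-node projects.
import Foundation

struct SessionCredentials: Decodable {
    let apiKey: String
    let sessionId: String
    let token: String
}

func fetchCredentials(from baseURL: URL,
                      completion: @escaping (SessionCredentials?) -> Void) {
    let url = baseURL.appendingPathComponent("session")
    URLSession.shared.dataTask(with: url) { data, _, _ in
        let credentials = data.flatMap {
            try? JSONDecoder().decode(SessionCredentials.self, from: $0)
        }
        completion(credentials)
    }.resume()
}
```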

What's Inside

Archiving - This application shows you how to record an OpenTok session.

Basic Video Chat - This basic application demonstrates a short path to getting started with the OpenTok iOS SDK.

Custom Video Driver - This project provides classes that implement the OTVideoCapture and OTVideoRender interfaces of the core Publisher and Subscriber classes. Using these modules, you can see the basic workflow of sourcing video frames from the device camera and passing them into and out of OpenTok via the OTPublisherKit and OTSubscriberKit interfaces.
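As a rough illustration, wiring the project's capturer and renderer into the Kit classes might look like the following Swift fragment, assumed to live in a view controller with a connected session (initializer forms are assumptions based on the SDK headers):

```swift
// Hedged sketch: attach this project's capture/render classes to the
// low-level *Kit classes (the TBExample class names come from this repo).
let publisher = OTPublisherKit(delegate: self)
publisher?.videoCapture = TBExampleVideoCapture()            // adopts OTVideoCapture

let subscriber = OTSubscriberKit(stream: stream, delegate: self)
subscriber?.videoRender = TBExampleVideoRender(frame: .zero) // adopts OTVideoRender
```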

Custom Audio Driver - This project demonstrates how to use an external audio source with the OpenTok SDK. This project utilizes Core Audio and the AUGraph API to create an audio session suitable for voice and video communications.

Screen Sharing - This project demonstrates how to use a custom video capturer to publish a stream that uses a UI view (instead of a camera) as the video source.

Live Photo Capture - This project extends the video capture module implemented in project 2, and demonstrates how the AVFoundation media capture APIs can be used to simultaneously stream video and capture high-resolution photos from the same camera.

Simple Multiparty - This project demonstrates how to use the OpenTok iOS SDK for a multi-party call. The application publishes audio/video from an iOS device and can connect to multiple subscribers. However, it shows only one subscriber video at a time due to CPU limitations on iOS devices.

Signaling - This project shows you how to implement text chat using the OpenTok signaling API.
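As a rough sketch of what the sample does (the Swift renaming of signalWithType:string:connection:error: and the delegate callback name are assumptions):

```swift
// Hedged sketch: send a "chat" signal to everyone connected to the session.
var error: OTError?
session.signal(withType: "chat", string: "Hello, world!", connection: nil, error: &error)

// Incoming signals arrive via OTSessionDelegate:
func session(_ session: OTSession, receivedSignalType type: String?,
             from connection: OTConnection?, with string: String?) {
    print("Received '\(type ?? "")' signal: \(string ?? "")")
}
```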

Overlay Graphics - This project shows how to overlay graphics for the following:

  • A button for muting the publisher microphone

  • A button for muting the subscriber audio

  • Stream quality notification icons for the subscriber video

  • Archive recording icons

This project borrows the publisher and subscriber modules implemented in project 2.

Audio Levels - This project demonstrates how to use the OpenTok iOS SDK for audio-only multi-party calls. Both the publisher and subscribers are audio-only. This application also shows how to use the audio level API along with an audio meter UI to visualize publisher and subscriber audio levels.
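For a sense of the API, here is a minimal Swift sketch of a publisher audio level observer (the Swift renaming of the delegate method is an assumption):

```swift
import Foundation

// Hedged sketch: OTPublisherKitAudioLevelDelegate reports a linear level in
// [0.0, 1.0] for each audio sample window; drive a meter UI from it.
class MeterDriver: NSObject, OTPublisherKitAudioLevelDelegate {
    func publisher(_ publisher: OTPublisherKit, audioLevelUpdated audioLevel: Float) {
        // Convert to dBFS for a more natural-looking meter.
        let db = 20 * log10(max(audioLevel, 0.0001))
        print("publisher level: \(db) dBFS")
    }
}
// Attach with: publisher.audioLevelDelegate = meterDriver
```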

Ringtones - This project builds on Project 3 (Custom Audio Driver), extending the sample audio driver with an AVAudioPlayer controller that plays a short ringtone while waiting for the subscriber to connect to the client device.

FrameMetadata - This project shows how to set metadata (limited to 32 bytes) on a video frame, as well as how to read metadata from a video frame.
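As a rough sketch of the capture-side call, written as a fragment inside a custom capturer (the consumeImageBuffer signature comes from the samples in this repo; the Swift renaming and the .up case are assumptions):

```swift
import Foundation
import CoreMedia

// Hedged sketch: attach a timestamp string (well under the 32-byte limit)
// to each frame handed to the capture consumer.
let metadata = ISO8601DateFormatter().string(from: Date()).data(using: .utf8)!
assert(metadata.count <= 32, "frame metadata is limited to 32 bytes")

videoCaptureConsumer?.consumeImageBuffer(pixelBuffer,
                                         orientation: .up, // OTVideoOrientationUp
                                         timestamp: CMClockGetTime(CMClockGetHostTimeClock()),
                                         metadata: metadata)
```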

Obtaining OpenTok Credentials

To use the OpenTok platform you need a session ID, token, and API Key. You can get these values by creating a project on your OpenTok Account Page and scrolling down to the Project Tools section of your Project page. For production deployment, you must generate the session ID and token values using one of the OpenTok Server SDKs.

Development and Contributing

Interested in contributing? We ❤️ pull requests! See the Contribution guidelines.

Getting Help

We love to hear from you, so if you have questions or comments, or find a bug in the project, let us know!


opentok-ios-sdk-samples's Issues

otk_console_set_logger

How do you address the following warning displayed in the console? I'm using Swift here.

*******************************************************
NOTICE: OPENTOK CONSOLE LOGGER HAS NOT BEEN SET.
PLEASE USE otk_console_set_logger(otk_console_logger)
TO SET YOUR LOGGER.
*******************************************************

Extension Crashes on iPad Mini with ios 14.x

Hi, the extension keeps crashing and closing on iPad Mini.

The same code works on iPhone X and 8 Plus, so I think the system may be killing it because it exceeds 50 MB.

How can I shrink the resolution? Can you please point me in the right direction so I can check whether the extension is being killed?

Thanks.

How can we detect the real reason behind the subscriber's video being disabled due to a quality change?

We can detect that the subscriber's video was disabled from a protocol callback that includes the reason it was disabled. One of the reasons is "Quality Changed".

As per the documentation, this reason is triggered whenever network or CPU quality degrades on either the publisher's or the subscriber's device. How do we know the exact cause here, i.e. which device has the problem, so that we can inform the user that either their own or their partner's device has low bandwidth? That would help users understand why the video feed was suspended.

Is there any API which can tell us more detail about this quality-change trigger?

BroadcastUpload SampleHandler.m swift conversion

I am trying to convert the SampleHandler to Swift 3 and I'm running into the following challenges:

How do you define the videoCaptureConsumer in SampleHandler.h?

@property(atomic, weak) id<OTVideoCaptureConsumer> _Nullable videoCaptureConsumer;

[self.videoCaptureConsumer consumeImageBuffer:_pixelBuffer
                                  orientation:OTVideoOrientationUp
                                    timestamp:ts
                                     metadata:nil];

Swift suggests the following template, but what should be returned?

	var videoCaptureConsumer: OTVideoCaptureConsumer? {
		get {
			<#code#>
		}
		set(videoCaptureConsumer) {
			<#code#>
		}
	}
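For reference, one common way to satisfy this requirement in Swift (a sketch, not an official answer) is a stored property rather than a computed one:

```swift
// Sketch: a stored weak property satisfies the OTVideoCapture requirement and
// mirrors the atomic/weak Objective-C declaration above. The SDK assigns the
// consumer when the capturer is attached to the publisher, so nothing needs
// to be "returned" by the app.
class SampleHandler: RPBroadcastSampleHandler, OTVideoCapture {
    weak var videoCaptureConsumer: OTVideoCaptureConsumer?
    // ... implement the remaining OTVideoCapture methods here ...
}
```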

Merging `(CMSampleBufferRef)sampleBuffer` and `UIImage` and sending the merged `CVPixelBufferRef` to the CaptureConsumer: the final publisher stream seems to have a blue filter applied

In our project, we need to display an image in the publisher stream, so I adapted the TBExampleVideoCapture from this repo to customize the video capture.

The trick is to convert the video and image buffers to CIImages and render them into a CVPixelBufferRef using CIContext; finally, send the CVPixelBufferRef to the CaptureConsumer.

Steps

The following are my main changes to add an image to the publisher stream:

  1. Changed the video format to ARGB:
-(id)init {
    self = [super init];
    if (self) {
       // ....
        _videoFrame = [[OTVideoFrame alloc] initWithFormat:
                      [OTVideoFormat videoFormatARGBWithWidth:_captureWidth
                                                       height:_captureHeight]];
       // ....
    }
    return self;
}


- (int32_t)captureSettings:(OTVideoFormat*)videoFormat {
    videoFormat.pixelFormat = OTPixelFormatARGB;
    videoFormat.imageWidth = _captureWidth;
    videoFormat.imageHeight = _captureHeight;
    return 0;
}
  2. Added a method to convert an image to a CVPixelBufferRef:
- (void)fillImagePixelBufferFromCGImage:(CGImageRef)image videoWidth:(CGFloat)videoWidth videoHeight:(CGFloat)videoHeight
{
    if (imagePixelBuffer == nil) {
        NSDictionary *options = @{
                                  (NSString *)kCVPixelBufferCGImageCompatibilityKey: @NO,
                                  (NSString *)kCVPixelBufferCGBitmapContextCompatibilityKey: @NO
                                  };
        
        CVPixelBufferCreate(kCFAllocatorDefault,
                            videoWidth,
                            videoHeight,
                            kCVPixelFormatType_32ARGB,
                            (__bridge CFDictionaryRef)(options),
                            &(imagePixelBuffer));
    }

    CVPixelBufferLockBaseAddress(imagePixelBuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(imagePixelBuffer);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context =
    CGBitmapContextCreate(pxdata,
                          videoWidth,
                          videoHeight,
                          8,
                          CVPixelBufferGetBytesPerRow(imagePixelBuffer),
                          rgbColorSpace,
                          kCGImageAlphaPremultipliedFirst |
                          kCGBitmapByteOrder32Big);
    
    if ([self.delegate respondsToSelector:@selector(frameForImageInVideo)]) {
        CGRect frame = [self.delegate frameForImageInVideo];
        CGContextDrawImage(context, CGRectMake(frame.origin.y, frame.origin.x, frame.size.width, frame.size.height), image);
    }
    
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    
    CVPixelBufferUnlockBaseAddress(imagePixelBuffer, 0);
}
  3. Merged the video sampleBuffer and imagePixelBuffer:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
           // ....   
    CVImageBufferRef videoBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    size_t videoWidth = CVPixelBufferGetWidth(videoBuffer);
    size_t videoHeight = CVPixelBufferGetHeight(videoBuffer);

    // create the finalPixelBuffer which will be consumed by CaptureConsumer finally
    CVPixelBufferRef finalPixelBuffer;
    NSDictionary *options = @{
                              (NSString *)kCVPixelBufferCGImageCompatibilityKey: @NO,
                              (NSString *)kCVPixelBufferCGBitmapContextCompatibilityKey: @NO
                              };
    CVPixelBufferCreate(kCFAllocatorDefault,
                        videoWidth,
                        videoHeight,
                        kCVPixelFormatType_32ARGB,
                        (__bridge CFDictionaryRef)(options),
                        &(finalPixelBuffer));
    
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    
   // render video to finalPixelBuffer
    CIImage *videoImage = [CIImage imageWithCVImageBuffer:videoBuffer];
    [context
     render:videoImage
     toCVPixelBuffer:finalPixelBuffer
     bounds:CGRectMake(0, 0, videoWidth, videoHeight)
     colorSpace:rgbColorSpace];

   
     // render image to finalPixelBuffer
    if ([self.delegate respondsToSelector:@selector(imageToMergeIntoVideo)] &&
        [self.delegate respondsToSelector:@selector(frameForImageInVideo)] &&
        [self.delegate imageToMergeIntoVideo]) {
        
        UIImage *image = [self.delegate imageToMergeIntoVideo];
        CGRect frame = [self.delegate frameForImageInVideo];
        
        [self fillImagePixelBufferFromCGImage:[image CGImage] videoWidth:videoWidth videoHeight:videoHeight];
        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imagePixelBuffer];
        [context
         render:ciImage
         toCVPixelBuffer:finalPixelBuffer
         bounds:CGRectMake(frame.origin.y, frame.origin.x, frame.size.width, frame.size.height)
         colorSpace:rgbColorSpace];
    }

  // send finalPixelBuffer to CaptureConsumer

    CVPixelBufferLockBaseAddress(finalPixelBuffer, 0);
    uint8_t *planes[1];
    planes[0] = CVPixelBufferGetBaseAddress(finalPixelBuffer);
    
    CMTime time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    _videoFrame.timestamp = time;
    _videoFrame.orientation = [self currentDeviceOrientation];
    [_videoFrame setPlanesWithPointers:planes numPlanes:1];

    [_videoCaptureConsumer consumeFrame:_videoFrame];
    
    CVPixelBufferUnlockBaseAddress(finalPixelBuffer, 0);
    CVPixelBufferRelease(finalPixelBuffer);
    CGColorSpaceRelease(rgbColorSpace);
}

Result

When I test the changes, the publisher stream looks like this: (screenshot attached)

It looks like a blue filter has been applied to the stream.

At first I thought something was wrong in the merge process. But when I convert the finalPixelBuffer to a UIImage using the following code:

CGImageRef testImageRef;
VTCreateCGImageFromCVPixelBuffer(finalPixelBuffer, nil, &testImageRef);
UIImage *testImage = [UIImage imageWithCGImage:testImageRef];

the result told me the merge process works well; the resulting image is correct (screenshot attached).

Question

Do you have any ideas to remove the blue filter? Thanks!

Broadcast extension invalid session

Today I tried to stream screen sharing, but I had a problem: the stream is always broken and the audio is distorted. Does anyone have a solution for scaling the image? Thank you so much.

Screen share is not working

We have implemented screen share in iOS and Android.

  1. In iOS it's not working; only the time is displayed. Can you provide more detail or documentation on how screen sharing should be implemented and how it will work if we want to share the screen?

  2. In Android, if we want to share the device screen, how do we achieve it? Currently it shares views only.

Important changes to iOS 14 networking affecting relayed sessions

With iOS 14, Apple introduces local network privacy (see this video).

Beginning in iOS 14, the operating system will prompt the user for permission when an application attempts to subscribe to clients on the same local network in a relayed session.

If your application uses a relayed session, it is encouraged to add a descriptive custom usage string to inform the user why the application needs access to their local area network. The Vonage Video API uses the local network to discover and connect to video participants on your same network where possible. The Apple video (above) shows how you can edit the string displayed in this message.

If the user does not accept the permission, the attempt to subscribe will fail. After the permission is rejected, any future attempts to subscribe to clients on the same network will also fail unless the user changes the permission in Settings. Unfortunately, iOS does not provide an API for an application to determine if the user has accepted or rejected this permission.

It is important to note that this does not apply to video sessions that use the OpenTok Media Router, as media is sent over the internet rather than the local network.

For applications which cannot use routed sessions and do not wish the user to ever be prompted for local network access, it is recommended to use TURN by using the following API:

    OTSessionICEConfig *myICEServerConfiguration = [[OTSessionICEConfig alloc] init];
    myICEServerConfiguration.transportPolicy = OTSessionICETransportRelay;

    OTSessionSettings *settings = [[OTSessionSettings alloc] init];
    settings.iceConfig = myICEServerConfiguration;

    _session = [[OTSession alloc] initWithApiKey:kApiKey
                                       sessionId:kSessionId
                                        delegate:self
                                        settings:settings];
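For Swift projects, the equivalent configuration would look roughly like the following sketch (the Swift renaming of OTSessionICETransportRelay is an assumption):

```swift
// Hedged Swift sketch of the relay-only ICE configuration shown above.
let iceConfig = OTSessionICEConfig()
iceConfig.transportPolicy = .relay   // assumed Swift import of OTSessionICETransportRelay

let settings = OTSessionSettings()
settings.iceConfig = iceConfig

let session = OTSession(apiKey: kApiKey, sessionId: kSessionId,
                        delegate: self, settings: settings)
```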

The provided API key does not match this token

I am trying the Signaling app provided in this repo. I have added a valid token, session ID, and API key, but I am getting this error:

(Error Domain=OTSessionErrorDomain Code=1004 "The provided API key does not match this token" UserInfo={NSLocalizedDescription=The provided API key does not match this token})

Swift Compiler errors

I am using OpenTok version 2.16, and I frequently face this issue while compiling:

[Screenshot of the Swift compiler errors]

Sometimes it works if I clean the project five or six times, but the issue is very frequent.

OpenTok LTE/3G Issues...

Hello.

I have a problem when I use the OpenTok iOS (and Android) SDK. On LTE/3G, I can't see the subscribers (the other clients' video), but on WiFi it works fine.

What is the cause? I think it may be because LTE/3G bandwidth is lower than WiFi. Please let me know how to solve this as soon as possible.

Regards.

ERROR[OpenTok]:Audio device error: startCapture.AudioOutputUnitStart returned error: -66637

I have implemented OpenTok with CallKit. Whenever I start an outgoing call, my stream is never published. The inspector dashboard says:
Attempting to publish with stream 54B02465-0E26-4AF0-8715-9D333BF6E9FC (has not started publishing to session yet)

It never succeeds in publishing.

Moreover, I get the error in the log:

ERROR[OpenTok]:Audio device error: startCapture.AudioOutputUnitStart returned error: -66637

Is the failure to publish caused by this error?

I have added OTDefaultAudioDevice file from the demo provided on GitHub.

Below is my code:

func provider(_ provider: CXProvider, perform action: CXStartCallAction) {
        // Create & configure an instance of SpeakerboxCall, the app's model class representing the new outgoing call.
        let call = SpeakerboxCall(uuid: action.callUUID, isOutgoing: true)
        call.handle = action.handle.value

        /*
            Configure the audio session, but do not start call audio here, since it must be done once
            the audio session has been activated by the system after having its priority elevated.
         */
        // https://forums.developer.apple.com/thread/64544
        // we can't configure the audio session here for the case of launching it from locked screen
        // instead, we have to pre-heat the AVAudioSession by configuring as early as possible, didActivate do not get called otherwise
        // please look for  * pre-heat the AVAudioSession *
        configureAudioSession()
        
        /*
            Set callback blocks for significant events in the call's lifecycle, so that the CXProvider may be updated
            to reflect the updated state.
         */
        call.hasStartedConnectingDidChange = {
            provider.reportOutgoingCall(with: call.uuid, startedConnectingAt: call.connectingDate)
        }
        call.hasConnectedDidChange = {
            provider.reportOutgoingCall(with: call.uuid, connectedAt: call.connectDate)
        }

        self.outgoingCall = call
        
        // Signal to the system that the action has been successfully performed.
        action.fulfill()
    }

func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        print("Received \(#function)")

        // If we are returning from a hold state
        if answerCall?.hasConnected ?? false {
            //configureAudioSession()
            // See more details on how this works in the OTDefaultAudioDevice.m method handleInterruptionEvent
            sendFakeAudioInterruptionNotificationToStartAudioResources();
            return
        }
        
        if outgoingCall?.hasConnected ?? false {
            //configureAudioSession()
            // See more details on how this works in the OTDefaultAudioDevice.m method handleInterruptionEvent
            sendFakeAudioInterruptionNotificationToStartAudioResources()
            return
        }
        
        if outgoingCall != nil{
            startCall(withAudioSession: audioSession) { success in
                if success {
                    self.outgoingCall?.hasConnected = true
                    self.addCall(self.outgoingCall!)
                    self.startAudio()
                }
            }
        }
        
        if answerCall != nil{
            answerCall(withAudioSession: audioSession) { success in
                if success {
                    self.answerCall?.hasConnected = true
                    self.startAudio()
                }
            }
        }
        
    }

func sendFakeAudioInterruptionNotificationToStartAudioResources() {
        var userInfo = Dictionary<AnyHashable, Any>()
        let interruptionEndedRaw = AVAudioSession.InterruptionType.ended.rawValue
        userInfo[AVAudioSessionInterruptionTypeKey] = interruptionEndedRaw
        NotificationCenter.default.post(name: AVAudioSession.interruptionNotification, object: self, userInfo: userInfo)
}

func configureAudioSession() {
    // See https://forums.developer.apple.com/thread/64544
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(AVAudioSession.Category.playAndRecord, mode: .default)
        try session.setActive(true)
        try session.setMode(AVAudioSession.Mode.voiceChat)
        try session.setPreferredSampleRate(44100.0)
        try session.setPreferredIOBufferDuration(0.005)
    } catch {
        print(#file)
        print(#function)
        print(error)
    }
}

Any help is urgently needed.

TIA.

iOS publisher camera issue

Hi,
I am facing a random issue here. If I lock my iPhone or put the application in the background during a video call, then sometimes the publisher camera freezes or blacks out. I went through the documentation but could not find a solution.

Can you please help me?

OTSubscriberKitNetworkStatsDelegate and OTSubscriberKitRtcStatsReportDelegate Methods are not called

Issue

OTSubscriberKitNetworkStatsDelegate and OTSubscriberKitRtcStatsReportDelegate methods are not called unless the subscriber's delegate (OTSubscriberKitDelegate) conforms to the protocol and implements the method. This is fine if the subscriber's networkStatsDelegate and rtcStatsReportDelegate are the same object as the delegate, but it can fail if they are different objects.

A crash can also occur if the delegate does conform to the stats delegate protocols and implements the methods, but the actual networkStatsDelegate and rtcStatsReportDelegate do not implement the methods.

Steps to Reproduce

  • Make sure the OTSubscriberKitDelegate doesn't implement the OTSubscriberKitNetworkStatsDelegate and OTSubscriberKitRtcStatsReportDelegate methods.
  • Create a separate delegate object that implements the OTSubscriberKitNetworkStatsDelegate and OTSubscriberKitRtcStatsReportDelegate methods.
  • Observe that the stats delegate methods do not get executed.
  • Notice that implementing the methods in the OTSubscriberKitDelegate will cause the methods in the stats delegate to get executed.
  • Notice that implementing the methods in the OTSubscriberKitDelegate, but not in the stats delegate, will cause the app to crash.
class SubscriberDelegate: NSObject,
    OTSubscriberKitDelegate,
    OTSubscriberKitNetworkStatsDelegate,
    OTSubscriberKitRtcStatsReportDelegate
{
    ...
    
    // Uncomment these lines to get the StatsDelegate to work.
    // func subscriber(_ subscriber: OTSubscriberKit, videoNetworkStatsUpdated stats: OTSubscriberKitVideoNetworkStats) {}
    // func subscriber(_ subscriber: OTSubscriberKit, audioNetworkStatsUpdated stats: OTSubscriberKitAudioNetworkStats) {}
    // func subscriber(_ subscriber: OTSubscriberKit, rtcStatsReport jsonArrayOfReports: String) {}
    
    ...
    
}
class StatsDelegate: NSObject,
    OTSubscriberKitNetworkStatsDelegate,
    OTSubscriberKitRtcStatsReportDelegate
{
    // Comment out these methods and uncomment the methods above to observe a crash.
    func subscriber(_ subscriber: OTSubscriberKit, videoNetworkStatsUpdated stats: OTSubscriberKitVideoNetworkStats) {
        print("subscriber video stats")
    }
    
    func subscriber(_ subscriber: OTSubscriberKit, audioNetworkStatsUpdated stats: OTSubscriberKitAudioNetworkStats) {
        print("subscriber audio stats")
    }
    
    func subscriber(_ subscriber: OTSubscriberKit, rtcStatsReport jsonArrayOfReports: String) {
        print("subscriber rtc stats")
    }
}
class SessionDelegate: NSObject,
    OTSessionDelegate
{
    
    ...
    
    let subscriberDelegate = SubscriberDelegate()
    let statsDelegate = StatsDelegate()

    func session(_ session: OTSession, streamCreated stream: OTStream) {
        guard let subscriber = OTSubscriber(stream: stream) else { return }
        subscriber.delegate = self.subscriberDelegate
        subscriber.networkStatsDelegate = self.statsDelegate
        subscriber.rtcStatsReportDelegate = self.statsDelegate
        DispatchQueue.main.asyncAfter(deadline: .now() + .seconds(1), execute: {
            subscriber.getRtcStatsReport()
        })
    }
    
    ...
    
}

Solution

I don't have access to the source code, but I am guessing this happens because, when deciding whether or not to call the stats delegate methods, a guard incorrectly checks whether the subscriber's delegate conforms to the protocol and implements the method; this should be changed to check the networkStatsDelegate/rtcStatsReportDelegate as appropriate.

Build problems with Opentok.framework file.

Hi,
I recently had to add a Cocoa Touch Framework to my iOS project (Swift 3.0) which uses OpenTok as an internal dependency. Everything works great, but the only problem I'm having is my build on CircleCI. We use Fastlane and CircleCI to distribute our app to various environments, and since the OpenTok.framework (static library) is 130 MB in size and GitHub has a limit of 100 MB, I'm having trouble making my builds work on CircleCI. Could you please suggest a workaround? I'd appreciate any sort of help.

With thanks in advance,
Tejas

setOutputMode

In Android one can switch between the handset and speakerphone by calling AudioDeviceManager.getAudioDevice().setOutputMode. However, in iOS this requires using a custom audio driver. Why isn't this implemented in the iOS SDK similarly to the Android SDK?
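For reference, the usual native route on iOS goes through AVAudioSession rather than an SDK-level call; a minimal sketch:

```swift
// Hedged sketch: toggle between the receiver (handset) and speakerphone
// using AVAudioSession's output override.
import AVFoundation

func setSpeakerphone(_ enabled: Bool) throws {
    let session = AVAudioSession.sharedInstance()
    try session.overrideOutputAudioPort(enabled ? .speaker : .none)
}
```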

Crash iOS 14 [VGRTCDefaultVideoEncoderFactory supportedCodecs]

Describe the bug
Hello, we're getting quite a few crash reports on iOS 14, using the OTXCFramework (2.24.1).

Stack trace:

Fatal Exception: NSInvalidArgumentException
0  CoreFoundation                 0x129dc0 __exceptionPreprocess
1  libobjc.A.dylib                0x287a8 objc_exception_throw
2  CoreFoundation                 0x19c5a0 -[__NSCFString characterAtIndex:].cold.1
3  CoreFoundation                 0x1a85f8 -[__NSPlaceholderDictionary initWithCapacity:].cold.1
4  CoreFoundation                 0x16b90 -[__NSPlaceholderDictionary initWithObjects:forKeys:count:]
5  CoreFoundation                 0x94b0 +[NSDictionary dictionaryWithObjects:forKeys:count:]
6  MeetingRoom                    0x251e6c +[VGRTCDefaultVideoEncoderFactory supportedCodecs]
7  MeetingRoom                    0x25224c -[VGRTCDefaultVideoEncoderFactory supportedCodecs]
8  MeetingRoom                    0x739680 webrtc::ObjCVideoEncoderFactory::GetSupportedFormats() const
9  MeetingRoom                    0x3959f4 cricket::WebRtcVideoEngine::send_codecs() const
10 MeetingRoom                    0x1a920c webrtc::PeerConnectionFactory::GetVideoEncoderSupportedCodecs()
11 MeetingRoom                    0x1a9170 webrtc::PeerConnectionFactory::PeerConnectionFactory(rtc::scoped_refptr<webrtc::ConnectionContext>, webrtc::PeerConnectionFactoryDependencies*)
12 MeetingRoom                    0x1ad2f4 rtc::RefCountedObject<webrtc::PeerConnectionFactory>::RefCountedObject<rtc::scoped_refptr<webrtc::ConnectionContext>&, webrtc::PeerConnectionFactoryDependencies*>(rtc::scoped_refptr<webrtc::ConnectionContext>&, webrtc::PeerConnectionFactoryDependencies*&&)
13 MeetingRoom                    0x1a9078 rtc::scoped_refptr<webrtc::PeerConnectionFactory> rtc::make_ref_counted<webrtc::PeerConnectionFactory, rtc::scoped_refptr<webrtc::ConnectionContext>&, webrtc::PeerConnectionFactoryDependencies*, (webrtc::PeerConnectionFactory*)0>(rtc::scoped_refptr<webrtc::ConnectionContext>&, webrtc::PeerConnectionFactoryDependencies*&&)
14 MeetingRoom                    0x1a9000 webrtc::PeerConnectionFactory::Create(webrtc::PeerConnectionFactoryDependencies)
15 MeetingRoom                    0x1a8ee8 webrtc::CreateModularPeerConnectionFactory(webrtc::PeerConnectionFactoryDependencies)
16 MeetingRoom                    0x1aa530 rtc::scoped_refptr<webrtc::PeerConnectionFactoryInterface> rtc::FunctionView<rtc::scoped_refptr<webrtc::PeerConnectionFactoryInterface> ()>::CallVoidPtr<webrtc::CreateModularPeerConnectionFactory(webrtc::PeerConnectionFactoryDependencies)::$_2>(rtc::FunctionView<rtc::scoped_refptr<webrtc::PeerConnectionFactoryInterface> ()>::VoidUnion)
17 MeetingRoom                    0x1aa4bc rtc::scoped_refptr<webrtc::PeerConnectionFactoryInterface> rtc::Thread::Invoke<rtc::scoped_refptr<webrtc::PeerConnectionFactoryInterface>, void>(rtc::Location const&, rtc::FunctionView<rtc::scoped_refptr<webrtc::PeerConnectionFactoryInterface> ()>)::'lambda'()::operator()() const
18 MeetingRoom                    0x2880fc webrtc::webrtc_new_closure_impl::ClosureTask<rtc::Thread::Send(rtc::Location const&, rtc::MessageHandler*, unsigned int, rtc::MessageData*)::$_2>::Run()
19 MeetingRoom                    0x2871d0 rtc::Thread::QueuedTaskHandler::OnMessage(rtc::Message*)
20 MeetingRoom                    0x28693c rtc::Thread::Dispatch(rtc::Message*)
21 MeetingRoom                    0x2859c8 rtc::Thread::ProcessMessages(int)
22 MeetingRoom                    0x286dcc rtc::Thread::PreRun(void*)
23 libsystem_pthread.dylib        0x1bfc (Missing UUID 496dc4232dd43031bccf93e889035a34)
24 libsystem_pthread.dylib        0xa758 (Missing UUID 496dc4232dd43031bccf93e889035a34)

Expected behavior
Shouldn't crash

Device (please complete the following information):

  • sessionId, if applicable: 1_MX40NDc3MzQxMn5-MTY3NzY5MjI1MjM0NH5MdVBKNUUyTVFMaVA0YVdoNkcxWENhSnl-UH5-
  • OS and version: iOS 14, OTXCFramework (2.24.1)

[AVCaptureSession startRunning] startRunning crash

Sometimes when publishing a stream, I am getting the following error:

[AVCaptureSession startRunning] startRunning may not be called between calls to beginConfiguration and commitConfiguration

I have not found specific steps to reproduce it. It does not happen often, but it is happening in my production app as well. In the live app I am using OpenTok 2.24.1, but I upgraded to 2.24.2 locally and the issue remains.

  • iOS SDK version: Opentok 2.24.1, Opentok 2.24.2
  • OS and version: iOS 14, iOS 15, iOS 16

Logs from Firebase: (screenshot attached)

FrameMetadata doesn't actually work?

Obj-C FrameMetadata doesn't actually send metadata to remote subscribers. It all appears to be a local loop-back in the example project. When I connect the Windows SDK to the same session, the frame.metadata is null. Or maybe this is just the same problem as below:

Swift FrameMetadata doesn't work either. But sometimes, when I set a breakpoint in the Windows client and inspect the frame.metadata property (it's null) and then step over, suddenly there's metadata. But the timestamps start repeating, the same two timestamps over and over. Then, with the FrameMetadata app still running on the phone, I disconnect the Windows client from the OT session and reconnect, and there is no metadata again.

I created a new session and token; video works between the Windows and iOS clients, but there is no metadata. Then suddenly the metadata shows up, but it's more than 30 seconds old and it keeps repeating. Everything has now been running for 5 minutes and I keep getting this in the Windows client over and over: 2020-05-11T03:26:28-04:00

Now I'm getting this over and over:
Subscriber video frame metadata: 2020-05-11T03:30:13-04:00
Subscriber video frame metadata: 2020-05-11T03:26:28-04:00
Subscriber video frame metadata: 2020-05-11T03:28:58-04:00
Subscriber video frame metadata: 2020-05-11T03:30:13-04:00

Is this a server issue? The only changes I've made to all the example code is the Key, SessionId, and Token.

Disabling all voice processing and filtering in custom audio driver

Hello,

We are having some issues with poor audio quality when playing instruments over Opentok. The suggestion in the React Native issue I opened was to implement the custom audio driver and use kAudioUnitSubType_RemoteIO only.

We tested the Custom Audio Driver in this repo with the following results. Attached are Opentok Playground archive samples to illustrate the issues: archive-tests 2.zip

  1. with-voiceprocessing.webm - uses the default kAudioUnitSubType_VoiceProcessingIO. Result: Bad overall quality
  2. remoteio.webm - uses kAudioUnitSubType_RemoteIO Result: Better quality, but being filtered (more on this below)
  3. remoteio-AVAudioSessionModeMeasurement.webm - uses kAudioUnitSubType_RemoteIO and sets AVAudioSession to setMode:AVAudioSessionModeMeasurement in order to bypass the high-pass filter that is apparently still applied with RemoteIO (per https://stackoverflow.com/q/32227585/193210). (no difference in quality that I can tell from #2).

If you noticed with the kAudioUnitSubType_RemoteIO recordings, there is some sort of distortion and/or filter happening that makes higher ranges, e.g. the higher notes on the guitar sound poor. I'm wondering if someone could at least point me in the right direction on what to try next.

I've tried messing with some of the stream_format settings, but things just get crazy sounding :). Thanks for any insight.

Support for iOS SPM?

Hello, I would like to know when you will be adding support for Swift Package Manager on iOS. It would make updating much simpler than dropping the updated version of "OpenTok.framework" into the project.

Getting error in publisher(_:didFailWithError:) -> 1541 - Timed out while attempting to publish

My app's requirement is to test audio and video quality first, then start the video call. The network test completes properly.
But when I start the actual call I get the error below, and I am unable to publish my video and audio to the subscriber.

publisher(_:didFailWithError:)
1541 - Timed out while attempting to publish.

https://github.com/opentok/opentok-network-test/tree/master/iOS-Sample

I've tried in opentok version 2.19.1, 2.20.0

Thread warning when initialising OTPublisher

Initialising an OTPublisher freezes the main thread for a short time and prints out the following warning:

Thread Performance Checker: -[AVCaptureSession startRunning] should be called from background thread. Calling it on the main thread can lead to UI unresponsiveness.

Initialising the OTPublisher creates and starts an AVCaptureSession.
The [AVCaptureSession startRunning] method is synchronous and blocks until the session starts running, so it should be called from a background thread.
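For context, the pattern the Thread Performance Checker asks for in app-level AVFoundation code looks like the sketch below; inside the SDK, the startRunning call is not under app control:

```swift
// Hedged sketch: AVCaptureSession.startRunning() blocks until the session is
// running, so Apple recommends calling it from a dedicated serial queue.
import AVFoundation

let sessionQueue = DispatchQueue(label: "com.example.capture-session") // hypothetical label
let captureSession = AVCaptureSession()

sessionQueue.async {
    captureSession.startRunning()
}
```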

Crash on OTAudioDevice protocol with the func estimatedRenderDelay()

Hi, we are experiencing some crashes with OTAudioDevice.

The call stack is the following:

Crashed: WebRTCWorkerThread

  • 0 Riverside 0xe4ff4 @objc RVSAudioDeviceManager.estimatedRenderDelay() + 4377530356
  • 1 Riverside 0x38b20c otc_audio_proxy_playout_delay + 4380307980
  • 2 Riverside 0x3f98ec callback_playout_delay(otk_audio_device*) + 4380760300
  • 3 Riverside 0x3f92ac webrtc::OTAudioDevice::PlayoutDelay(unsigned short&) const + 4380758700
  • 4 Riverside 0x39637c webrtc::OTAudioDeviceModule::PlayoutDelay(unsigned short*) const + 4380353404
  • 5 Riverside 0x420ea4 webrtc::voe::(anonymous namespace)::ChannelReceive::UpdatePlayoutTimestamp(bool, long long) + 4380921508
  • 6 Riverside 0x41fd08 webrtc::voe::(anonymous namespace)::ChannelReceive::OnRtpPacket(webrtc::RtpPacketReceived const&) + 4380917000
  • 7 Riverside 0x697ab4 webrtc::RtpDemuxer::OnRtpPacket(webrtc::RtpPacketReceived const&) + 4383505076
  • 8 Riverside 0x699488 webrtc::RtpStreamReceiverController::OnRtpPacket(webrtc::RtpPacketReceived const&) + 4383511688
  • 9 Riverside 0x4279dc webrtc::internal::Call::DeliverRtp(webrtc::MediaType, rtc::CopyOnWriteBuffer, long long) + 4380948956
  • 10 Riverside 0x427d9c webrtc::internal::Call::DeliverPacket(webrtc::MediaType, rtc::CopyOnWriteBuffer, long long) + 4380949916
  • 11 Riverside 0x65cd34 cricket::WebRtcVoiceMediaChannel::OnPacketReceived(rtc::CopyOnWriteBuffer, long long) + 4383264052
  • 12 Riverside 0x4dacc4 rtc::FireAndForgetAsyncClosure<cricket::BaseChannel::OnRtpPacket(webrtc::RtpPacketReceived const&)::$_6>::Execute() + 4381682884
  • 13 Riverside 0x4faeec rtc::AsyncInvoker::OnMessage(rtc::Message*) + 4381814508
  • 14 Riverside 0x512fa0 rtc::Thread::Dispatch(rtc::Message*) + 4381912992
  • 15 Riverside 0x512014 rtc::Thread::ProcessMessages(int) + 4381909012
  • 16 Riverside 0x5133a4 rtc::Thread::PreRun(void*) + 4381914020
  • 17 libsystem_pthread.dylib 0x19ac _pthread_start + 148
  • 18 libsystem_pthread.dylib 0xe68 thread_start + 8

We are using pod 'OpenTok', '~> 2.22.3'

Want to start a call by clicking a Bluetooth button, but unable to get sound on the web end

Describe the bug
We are working on a lone-worker safety app in which we want to start a call by clicking a Bluetooth button. It works fine when the app is open, but when the app is in the background the call starts and no sound is heard on the web end.

To Reproduce
Start a call by clicking the Bluetooth button while the app is in the background.

Expected behavior
Sound should be heard on the web end, even when the app is in the background with background audio services enabled.

Device (please complete the following information):

  • sessionId, if applicable:
  • iOS SDK version: 2.11.5
  • OS and version: 15.5


OTSubscriberKitDelegate methods not being called

In particular, I am unable to get subscriberDidDisconnectFromStream and subscriberDidReconnectToStream to run when a device loses network stability. Those specific methods only trigger on the device that is experiencing the network problem, suggesting every other subscriber is having trouble connecting to the session. This is not what the documentation implies.
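For reference, a minimal sketch of the delegate wiring in question (the Swift renamings of the Objective-C selectors are assumptions):

```swift
// Hedged sketch: the OTSubscriberKitDelegate callbacks this report is about.
class SubscriberObserver: NSObject, OTSubscriberKitDelegate {
    func subscriberDidConnect(toStream subscriber: OTSubscriberKit) {}

    func subscriber(_ subscriber: OTSubscriberKit, didFailWithError error: OTError) {
        print("subscriber failed: \(error)")
    }

    // Expected when a subscribed stream drops due to network instability:
    func subscriberDidDisconnect(fromStream subscriber: OTSubscriberKit) {
        print("subscriber disconnected from stream")
    }

    func subscriberDidReconnect(toStream subscriber: OTSubscriberKit) {
        print("subscriber reconnected to stream")
    }
}
```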

Hi, all friends

I downloaded the example and it runs fine.

But when using VoIP, after didReceiveIncomingPushWithPayload is received and the video view is opened, doSubscribe (the OTSubscriber delegate callbacks) does not run asynchronously.

Thanks, all.

Custom-Video-Capture example memory leak/crash when used as Xamarin Native Binding

I'm trying to use the TBExampleVideoCapture in a native Xamarin.iOS project via Native Bindings (https://docs.microsoft.com/en-us/xamarin/ios/platform/binding-objective-c/walkthrough?tabs=macos) and it results in a run-away memory leak and app crash.

I've tried both the following scenarios and they both crash:

  • Binding to the TBExampleVideoCapture.m directly and doing all other OpenTok work in Xamarin.iOS including setting the Publisher.VideoCapture property to the bound TBExampleVideoCapture object.

  • Changing the ViewController.m class from a UIViewController to UIView, then binding to that. I moved all OpenTok logic from Xamarin.iOS into native Objective-C UIView.

I should note that the TBExampleVideoCapture works as a Native iOS application. But there is something wrong when it's exposed as a native binding to Xamarin that causes the memory to run away. I thought perhaps there was something wrong with binding directly to the TBExampleVideoCapture via the Publisher.VideoCapture property, so I tried the UIView route thinking that all the rendering would occur on the native side and that would fix things, but it doesn't.

I'm really stuck and this is a pivotal thing for our application. If there's no way to resolve this we'll have to investigate other platforms unfortunately. We need the ability to draw on the Publisher's video before it's sent across the network. Thank you!

This SessionId can not be used with OpenTok 2.0 clients

Got this issue in iOS: didFailWithError: (Error Domain=OTSessionErrorDomain Code=1004 "This SessionId can not be used with OpenTok 2.0 clients" UserInfo=0x17531c60 {NSLocalizedDescription=This SessionId can not be used with OpenTok 2.0 clients})

Screen freeze when creating a publisher

In the latest version of the iOS SDK (v2.24.0) when initializing the publisher, the screen freezes and becomes unresponsive for a few seconds.

In previous versions, there was a thread warning when initializing the publisher; that has been fixed in the latest one.
Related issue: #267

Since the fix, the thread warning no longer appears; however, the screen freeze remains and is now more noticeable than before. It now takes about 10 s for the publisher stream to be initialized, and during that time the screen is not responsive. In previous versions, despite the thread warning, the freeze and init time was about 2-3 s.

Multiple sample projects in this repo are failing with `EXC_BAD_ACCESS`

Hi, at least one other person and I are unable to run the sample projects in this repository.

To reproduce

  • Run either Basic-Video-Chat or Custom-Audio-Driver after entering the OpenTok API key, session ID, and token in ViewController.m.
  • The projects build successfully, then request video and mic permissions on the iOS device (iOS 13.3, iPhone 7 Plus), but then crash with EXC_BAD_ACCESS.

Xcode logs up till crash:

2020-01-16 09:01:19.487426-0500 Basic-Video-Chat[21204:6399032] 
2020-01-16 09:01:19.487494-0500 Basic-Video-Chat[21204:6399032] ------------------------------------------------
2020-01-16 09:01:19.487521-0500 Basic-Video-Chat[21204:6399032] OpenTok iOS Library, Rev.2
2020-01-16 09:01:19.487546-0500 Basic-Video-Chat[21204:6399032] This build was born on May  8 2019 at 16:57:23
2020-01-16 09:01:19.612931-0500 Basic-Video-Chat[21204:6399032] Version: 2.16.1.7383-ios
2020-01-16 09:01:19.612988-0500 Basic-Video-Chat[21204:6399032] libOpenTokObjC:b4f4e66c2a07ac12693b38cc2722a3061a9165ea
2020-01-16 09:01:19.613016-0500 Basic-Video-Chat[21204:6399032] Copyright 2019 TokBox, Inc.
2020-01-16 09:01:19.613039-0500 Basic-Video-Chat[21204:6399032] Licensed under the Apache License, Version 2.0
2020-01-16 09:01:19.613061-0500 Basic-Video-Chat[21204:6399032] ------------------------------------------------
2020-01-16 09:01:21.490235-0500 Basic-Video-Chat[21204:6399032] sessionDidConnect (sessionid-redacted)
2020-01-16 09:01:21.533336-0500 Basic-Video-Chat[21204:6399032] Metal GPU Frame Capture Enabled
2020-01-16 09:01:21.533878-0500 Basic-Video-Chat[21204:6399032] Metal API Validation Enabled
2020-01-16 09:01:23.643116-0500 Basic-Video-Chat[21204:6399032] Publishing

This is happening with the two sample projects mentioned above, but it is likely an issue for more or all of the projects in this repo. Can someone please take a look? Thanks.

Image stretched in latest iOS SDK version

In the latest version (2.16) of OpenTok we are randomly getting an image-stretch problem for both the publisher and the subscriber; OpenTok 2.15.3 works fine, with no image stretch. Any suggestions for this?

Using the OTDefaultAudioDevice breaks AirPlay support

In the latest OpenTok version (2.12.0) from September 2017, fixes to AirPlay support were introduced. However, if you use the OTDefaultAudioDevice.h and OTDefaultAudioDevice.m files as an audio driver, AirPlay support is broken.

I am unable to choose any AirPlay devices for mirroring when using this driver and streaming from TokBox. However, if I make some changes (mainly changing the session category to AVAudioSessionCategoryPlayback, as we do not use microphone input from our end users), mirroring does work, but only for about 20-30 seconds, and then the connection is lost.

Would it be possible to update the audio driver examples to match the 2.12.0 OpenTok version, i.e. to have an example where AirPlay streaming works?

Subscriber Video Enabling Problem

When we start the conversation with only the audio stream enabled (video disabled) and then activate the video, the subscriber view doesn't appear in the iOS app. But when we send the app to the background and back to the foreground just once, the video appears in the view.

When the subscriber video is enabled, we want to see it in the view.

  • iOS SDK version: 2.24.0
  • OS and version: 16.0

How to set the "return to call" status bar title?

Hi guys,

Is it possible to set the title of the double-height status bar that appears when backgrounding your app while in a call? I'm having trouble finding any mention of it in the Apple documentation.

If this is possible, I'd appreciate some guidance on how to do it.

Thanks!

Open Tok ReplayKit Integration

Hi @robjperez ,

The screen-sharing module contains an old code base, and there is no Swift support for it. Can you please provide a Swift example of the module with the latest updates of OpenTok?

Best,

Add possibility to use PIP, Screenshare and Camera with iOS 16

Hello everyone,

As far as we can see, with iOS 16 it is possible to have camera access even when the app is not in the foreground:

https://developer.apple.com/documentation/bundleresources/entitlements/com_apple_developer_avfoundation_multitasking-camera-access

I played a bit with the iOS SDK and PiP mode and could add it. However, if I put the app in the background, the camera stream drops and I can no longer see my UIViews.

That is normal behavior, and I could work around it on Android, where it works quite well.

Will it be possible to have this AVFoundation extension supported in the iOS SDK as well?

Is it somehow possible to receive the raw stream data without the subscriber view, so that I can add it to PiP mode and the user can see the other participant while the app is in the background sharing the screen?

Kind regards

Set Subscriber Audio Volume

Hi,

I have been trying to set the OTSubscriber or OTSession volume level, like subscriber.setAudioVolume in JS.

Issue: as soon as the subscriber is connected, the hardware volume keys control the session volume, but I need to control the volume natively in iOS using a UISlider.

Can you please tell me how I can control the session audio volume using a UISlider or MPVolumeView in iOS? I need this urgently.

Thanks
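For reference, a minimal sketch of the MPVolumeView approach mentioned above, written as a fragment inside a view controller (note that MPVolumeView tracks the system output volume, which is what the hardware keys control):

```swift
// Hedged sketch: MPVolumeView provides a native slider bound to the
// system output volume.
import MediaPlayer

let volumeView = MPVolumeView(frame: CGRect(x: 16, y: 120, width: 288, height: 44))
view.addSubview(volumeView)
```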

Crash: OTSession initWithToken:error:

Describe the bug
When I start connecting to a session via a token, the app crashes after the OTSession instance's init flow.
It started raising this error when updating the OpenTok version from 2.21.3 to 2.22.0, 2.23.0, or 2.24.0.

To Reproduce
Related to sdk version 2.22.0

OTError *error = nil;

if (session.sessionConnectionStatus == OTSessionConnectionStatusConnected) { return; }

[session connectWithToken:token error:&error];

Application(11135,0x170373000) malloc: *** error for object 0x2822a1b48: pointer being freed was not allocated
Application(11135,0x170373000) malloc: *** error for object 0x2822a1b48: pointer being freed was not allocated
Application(11135,0x170373000) malloc: *** set a breakpoint in malloc_error_break to debug


Device (please complete the following information):

  • OpenTok: 2.22.0
  • OS and version: 16.1

On version 2.21.3 all works fine.

Session stream destroyed when reconnecting after no network

Version:
Opentok iOS verison 2.23.1

Steps to reproduce

  1. Connect to session and start publishing
  2. Turn off the internet connection
  3. Wait for the session to disconnect
  4. Turn on the internet connection
  5. Connect to session again as soon as you turn on the internet
  6. The app successfully connects to session and starts publishing
  7. After a few seconds a stream destroyed event is received for the publisher stream without any error or action by the user

What is the current bug behaviour?
Even though the session is connected successfully and the publisher stream is created and publishing, a stream destroyed event is received and it terminates the video call.

The bug was recreated on the Basic-Video-Chat sample.

What is the expected correct behaviour?
If the session is connected and already publishing, a stream destroyed event should not be received for no reason.

Possible related issue: opentok/opentok-react-native#201
