
audiograph's Introduction

audioGraph:
README.txt
5/14/2014
version 1.2 - tested on iOS 7 - compiles and runs with several warnings about deprecated features
This is a temporary fix, as there were issues with the current version in the App Store. More work will be done on this app over the coming months to update its Core Audio features.
----

iOS audio processing graph demonstration

"An audio processing graph is a Core Foundation opaque type, AUGraph, that you use to construct and manage an audio unit processing chain. A graph can leverage the capabilities of multiple audio units and multiple render callback functions, allowing you to create nearly any audio processing solution you can imagine." - From Apple's "Audio Unit Hosting Guide For iOS" 

AudioGraph is a superset of Apple's MixerHost application. 

Features include:

* Mono and stereo mic/line input
* Audio effects including:
	o Ring modulator
	o FFT passthrough using the Accelerate vDSP framework.
	o Real-time variable pitch shifting and detection using STFT
	o Simple variable speed delay using a ring buffer 
	o Recursive moving average filter with variable number of points
	o Convolution example with variable filter cutoff frequency
* Stereo level meter
* Synthesizer example - sine wave with an envelope generator
* iOS 5 features including:
	o MIDI sampler audio unit
	o file player audio unit
	o audio unit effects
* Runs on iPad, iPhone, and iPod touch
* Open source
* Available as free download from iTunes App Store
* Music by Van Lawton
* Plus everything from MixerHost 
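Of the effects listed above, the variable-speed delay is simple enough to sketch in portable C. The following ring buffer is illustrative only; the names, buffer size, and function signatures are hypothetical and not taken from the audioGraph source:

```c
#include <stddef.h>
#include <string.h>

/* Hypothetical sketch of a delay line built on a ring buffer,
   in the spirit of the "simple variable speed delay" effect above. */

#define RING_SIZE 8192  /* power of two, so wraparound is a bit mask */

typedef struct {
    float buf[RING_SIZE];
    size_t write_pos;
} RingDelay;

void ring_delay_init(RingDelay *d) {
    memset(d->buf, 0, sizeof d->buf);
    d->write_pos = 0;
}

/* Write one input sample and read back the sample that was written
   delay_samples ago (delay_samples must be < RING_SIZE). */
float ring_delay_process(RingDelay *d, float in, size_t delay_samples) {
    d->buf[d->write_pos] = in;
    size_t read_pos = (d->write_pos + RING_SIZE - delay_samples) & (RING_SIZE - 1);
    d->write_pos = (d->write_pos + 1) & (RING_SIZE - 1);
    return d->buf[read_pos];
}
```

Varying delay_samples from one callback to the next is what makes the delay "variable speed"; in a real-time context the change would be smoothed to avoid clicks.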

Requirements:

* A device that runs iOS 5.x 
* Headphones.

Instructions:

Launch the app and press Play. 

Source code, documentation, support, downloads:

http://zerokidz.com/audiograph (website)
https://github.com/tkzic/audiograph (github)
[email protected] (support/questions)
http://itunes.apple.com/app/audiograph/id486193487 (app)

Credits:

This project is derived from work by:

Chris Adamson
Stephan M. Bernsee
Michael Tyson
Stephen Smith
Apple Core-Audio mailing list
stackoverflow.com
Apple's IOS developer program

Thank you.

- Tom Zicarelli 12/12/2011

Here is the original README file for MixerHost
----------------------------------------------

===========================================================================
DESCRIPTION:

MixerHost demonstrates how to use the Multichannel Mixer audio unit in an iOS application. It also demonstrates how to use a render callback function to provide audio to an audio unit input bus. In this sample, the audio delivered by the callback comes from two short loops read from disk. You could use a similar callback, however, to synthesize sounds to feed into a mixer unit. 

This sample is described in Audio Unit Hosting Guide for iOS.

The code in MixerHost instantiates two system-supplied audio units--the Multichannel Mixer unit (of subtype kAudioUnitSubType_MultichannelMixer) and the Remote I/O unit (of subtype kAudioUnitSubType_RemoteIO)--and connects them together using an audio processing graph (an AUGraph opaque type). The app functions as an audio mixer, letting a user control the playback levels of two sound loops.

The sample provides a user interface for controlling the following Multichannel Mixer unit parameters:

    * input bus enable
    * input bus gain
    * output bus gain

This sample shows how to:

    * Write an input render callback function
    * Locate system audio units at runtime and then load, instantiate, configure, 
        and connect them
    * Correctly use audio stream formats in the context of an audio processing
       graph
    * Instantiate, open, initialize, and start an audio processing graph
    * Control a Multichannel Mixer unit through a user interface

This sample also shows how to:

    * Configure an audio application for playing in the background by adding the 
        "app plays audio" key to the info.plist file
    * Use the AVAudioSession class to configure audio behavior, set hardware
        sample rate, and handle interruptions
    * Make the app eligible for its audio session to be reactivated while in the 
        background
    * Respond to remote-control events as described in Event Handling Guide for 
        iOS
    * Allocate memory for an AudioBufferList struct so that it can handle more
        than one channel of audio
    * Use the C interface from Audio Session Services to handle audio hardware 
        route changes
    * Use Cocoa notifications to communicate state changes from the audio object 
        back to the controller object

To test how this app can reactivate its audio session while in the background in iOS 4.0 or later:

    1. Launch the app and start playback.
    2. Press the Home button. MixerHost continues to play in the background.
    3. Launch the Clock app and set a one-minute countdown timer. Leave the 
        Clock app running in the foreground.
    4. When the timer expires, an alarm sounds, which interrupts MixerHost and 
        stops its audio.
    5. Tap OK to dismiss the Timer Done alert. The MixerHost audio resumes 
        playback while the app remains in the background.

To test how this app responds to remote-control events:

    1. Launch the app and start playback.
    2. Press the Home button. MixerHost continues to play in the background.
    3. Double-press the Home button to display the running apps.
    4. Swipe right to expose the audio transport controls.
    5. Notice the MixerHost icon at the bottom-right of the screen. This 
        indicates that MixerHost is the current target of remote-control
        events.
    6. Tap the play/pause toggle button; MixerHost stops. Tap it again;
        MixerHost resumes playback. Tap the MixerHost icon; MixerHost
        comes to the foreground.


===========================================================================
RELATED INFORMATION:

Audio Unit Hosting Guide for iOS, May 2010
Audio Session Programming Guide, April 2010


===========================================================================
BUILD REQUIREMENTS:

Mac OS X v10.6.4, Xcode 3.2, iOS 4.0


===========================================================================
RUNTIME REQUIREMENTS:

Simulator: Mac OS X v10.6.4
Devices:   iOS 4.0


===========================================================================
PACKAGING LIST:

MixerHostAppDelegate.h
MixerHostAppDelegate.m

The MixerHostAppDelegate class defines the application delegate object, responsible for instantiating the controller object (defined in the MixerHostViewController class) and adding the application's view to the application window.

MixerHostViewController.h
MixerHostViewController.m

The MixerHostViewController class defines the controller object for the application. The object helps set up the user interface, responds to and manages user interaction, responds to notifications from the MixerHostAudio object to handle audio interruptions and audio route changes, and handles various housekeeping duties.

MixerHostAudio.h
MixerHostAudio.m

The MixerHostAudio class encapsulates all of the audio capabilities for the application. It handles audio session configuration, use of the ExtAudioFileRef opaque type for reading audio files from disk into memory, and construction and management of the audio processing graph. It detects interruptions and audio route changes and uses notifications to communicate audio state changes back to the MixerHostViewController object.


===========================================================================
CHANGES FROM PREVIOUS VERSIONS:

Version 1.0. New sample application that demonstrates how to host a Multichannel Mixer unit.
 
================================================================================
Copyright (C) 2010 Apple Inc. All rights reserved.


audiograph's Issues

UIViewControllerHierarchyInconsistency

I just cloned your project last night and I hit a UIViewControllerHierarchyInconsistency error when I ran. I can't figure out where it's coming from. I see this in the console:

reason: 'A view can only be associated with at most one view controller at a time! View <UIView: 0x1f87ff00; frame = (0 0; 320 480); autoresize = W+H; layer = <CALayer: 0x1f872310>> is associated with <UIViewController: 0x1eda4c40>. Clear this association before associating this view with <MixerHostViewController: 0x1f870710>.'

I made the following change in application:didFinishLaunching:

// [window addSubview: viewController.view];
window.rootViewController = self.viewController;

But I still get the error. I looked all through the object graph in the debugger and I can't find where the view is associated with another UIViewController. I'm running Xcode 4.5.1 on OS X 10.7.5 and deploying to an iPhone 4S.

Performance Improvements

Hi,

Thanks for creating this; it has been a great help in understanding the phase vocoder and Apple's APIs.

I used Instruments to see what makes smbPitchShift slow. Your change to use the Apple Accelerate framework was a very big step.
A further improvement: replace double with float. The audio is 16-bit, and float has 24 bits of precision, which is sufficient.
The same goes for the C math functions: atan -> atanf, cos -> cosf, etc.

After moving to float, the biggest remaining cost will be cosf. It comes from recomputing the window function twice on each call, and it can be reduced to a single vector multiplication.

Apple supplies 3 window functions.
vDSP_blkman_window
vDSP_hamm_window
vDSP_hann_window
The last one is equivalent to the current implementation. The window function should be computed only once, at init time, or passed in like the FFTSetup.
The first for loop would be reduced to:
vDSP_vmul(gInFIFO, 1, window, 1, gFFTworksp, 1, fftFrameSize);

After this, the phase vocoder should be much faster (4x or even more).

Clarification on comment for soundStruct

Hi,
Thanks so much for this project - it's a HUGE help in learning Core Audio.

In the MixerAudioHost.h file, just before the declaration of the soundStruct, there's a comment:

// Note: this is used by the callbacks for playing looped files (old way)

Can you please clarify what's meant by 'old way'? I'm working on an app that plays files and want to ensure I'm doing it the 'new way'. :-)

Thanks in advance!

How do I access the bottom and rear mics simultaneously?

Hello, I came across this project and I want to try some modifications. How can I modify the input to access both the bottom and rear mics on an iPhone 6 simultaneously? Does this app record and play back? If not, can I add record and playback functions? I am new to iPhone app development.
