audio_session's Issues

Crash when setting AVAudioSessionRouteSharingPolicy for iOS

On an iOS 14.5 simulator, setting the AVAudioSessionRouteSharingPolicy causes an exception or crash.

  • AVAudioSessionRouteSharingPolicy.defaultPolicy works fine
  • AVAudioSessionRouteSharingPolicy.longFormAudio and AVAudioSessionRouteSharingPolicy.longFormVideo cause an OSStatus error -50 platform exception*
  • AVAudioSessionRouteSharingPolicy.independent causes a "Lost connection to device." message from the IDE, without any Flutter or system logs.

*more info: https://stackoverflow.com/questions/26718103/what-does-osstatus-error-50-mean

Android OS 5 Crash

The audio_session library version 0.1.3 causes a crash whenever it is initialized on Android OS 5; downgrading fixed the issue.

TODO: Investigate AVAudioSession.setCategory defaults

To allow the Dart API to pass in any combination of parameters with default values, the plugin chooses AVAudioSessionCategoryOptions.none as the default for the options parameter. However, the documentation says that allowBluetoothA2dp is effectively switched on by default "if you configure an app’s audio session to use the AVAudioSessionCategoryAmbient, AVAudioSessionCategorySoloAmbient, or AVAudioSessionCategoryPlayback categories". I may need to either emulate iOS's default values, or avoid setting a default myself and let iOS choose it.
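
A sketch of what emulating the iOS defaults could look like on the Dart side. The helper name is hypothetical, and the mapping is an assumption based on the documentation quoted above:

import 'package:audio_session/audio_session.dart';

// Hypothetical helper (not part of the plugin): mirror iOS's documented
// behaviour, where allowBluetoothA2dp is effectively enabled for the
// ambient, soloAmbient and playback categories, and off otherwise.
AVAudioSessionCategoryOptions defaultOptionsFor(
    AVAudioSessionCategory category) {
  switch (category) {
    case AVAudioSessionCategory.ambient:
    case AVAudioSessionCategory.soloAmbient:
    case AVAudioSessionCategory.playback:
      return AVAudioSessionCategoryOptions.allowBluetoothA2dp;
    default:
      return AVAudioSessionCategoryOptions.none;
  }
}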

How to use this without CocoaPods?

How can I use this plugin in a project without CocoaPods?

Steps to reproduce:

  1. flutter create . --platforms ios
  2. Add the plugin
  3. cd ios
  4. pod deintegrate
  5. Open Xcode and run the app

[Screenshot attached: 2022-01-11 10:05:55]

NullPointerException in unregisterNoisyReceiver

Fatal Exception: java.lang.NullPointerException: Attempt to invoke virtual method 'void android.content.Context.unregisterReceiver(android.content.BroadcastReceiver)' on a null object reference
       at com.ryanheise.audio_session.AndroidAudioManager.unregisterNoisyReceiver(AndroidAudioManager.java:110)
       at com.ryanheise.audio_session.AndroidAudioManager.abandonAudioFocus(AndroidAudioManager.java:85)
       at com.ryanheise.audio_session.AndroidAudioManager.lambda$requestAudioFocus$0(AndroidAudioManager.java:66)
       at com.ryanheise.audio_session.-$$Lambda$AndroidAudioManager$vTsUKVRsAvXdPLcXpQfkDXFTmrU.onAudioFocusChange(-.java:2)
       at android.media.AudioManager$ServiceEventHandlerDelegate$1.handleMessage(AudioManager.java:2886)
       at android.os.Handler.dispatchMessage(Handler.java:106)
       at android.os.Looper.loop(Looper.java:223)
       at android.app.ActivityThread.main(ActivityThread.java:7660)
       at java.lang.reflect.Method.invoke(Method.java)
       at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:592)
       at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:947)

This happens when applicationContext becomes null.

This occurs in the latest version, 0.0.10.

improve OS classes docs

Many of the classes that mirror the OS's native classes are almost completely undocumented, for example AVAudioSessionCategory.

We have two options for handling this:

  1. Copy all the documentation from the official docs.
  2. Copy only the most relevant information (there's sometimes too much of it) and link to the original doc under "See also".

I would personally prefer the latter.

(And of course we can always edit the text and add our own comments as well.)

module 'audio_session' not found

Hi,
I encounter this issue on the new Mac with the M1 chip.

Launching lib/main.dart on iPhone 12 in debug mode...
Running Xcode build...                                                  
 └─Compiling, linking and signing...                        340ms
Xcode build done.                                            4.7s
Failed to build iOS app
Error output from Xcode build:
↳
    objc[7636]: Class AMSupportURLConnectionDelegate is implemented in both ?? (0x1f2fe0188) and ?? (0x117e8c2b8). One of the two will be used. Which one is undefined.
    objc[7636]: Class AMSupportURLSession is implemented in both ?? (0x1f2fe01d8) and ?? (0x117e8c308). One of the two will be used. Which one is undefined.
    ** BUILD FAILED **


Xcode's output:
↳
    /Users/user/Projects/myproject/ios/Runner/GeneratedPluginRegistrant.m:10:9: fatal error: module 'audio_session' not found
    @import audio_session;
     ~~~~~~~^~~~~~~~~~~~~
    1 error generated.
    note: Using new build system
    note: Building targets in parallel
    note: Planning build
    note: Constructing build description

Could not build the application for the simulator.
Error launching application on iPhone 12.

[Question] Audio player becomes unresponsive when paused for too long

I’m using just_audio and audio_service for a radio streaming app I’m building.

It uses an HLS stream, and when I pause the audio with the phone locked and wait a couple of minutes before hitting play again, the audio player (?) stops working.

  • The lockscreen controls go grey after Play is tapped.
  • The player controls in the app don’t play audio. The UI is still responsive (I can use print statements, etc.), and the audio handler's play method is being called, but the player doesn't respond.

I’m wondering if this has to do with using an HLS stream; HLS still seems less common, and I can’t find anyone else with this issue online. The amount of time between pausing and things breaking seems to relate to the amount of buffered audio. For example, if only 45 seconds of audio has been buffered and we stay paused for 1 minute, things break; paused for less than a minute, everything works fine.

I’ve tried running in Xcode to see if anything shows up in the console. There are some logs, but nothing seems directly related. I’m happy to provide what’s there if it could help.

Is it necessary to call AudioSession.setActive(true)?

In my app, I only want to detect whether another app is playing audio, so that I can show a toast.

But if I only use AudioSession.interruptionEventStream.listen, I don't receive any event when another app starts playing music. If I call AudioSession.setActive(true), I do receive the interruption event, but that interrupts the other app's audio.

It seems that I need an API like AudioSession.isOtherAudioPlaying().

[android] delayed focus request

Android audio focus support is missing the delayed request:
https://developer.android.com/reference/android/media/AudioManager#AUDIOFOCUS_REQUEST_DELAYED

This will probably require introducing a new enum containing these values:

  • AUDIOFOCUS_REQUEST_DELAYED
  • AUDIOFOCUS_REQUEST_FAILED
  • AUDIOFOCUS_REQUEST_GRANTED

Currently the latter two are handled as a boolean value, and delayed focus is simply not supported (it requires opting in via https://developer.android.com/reference/android/media/AudioFocusRequest.Builder#setAcceptsDelayedFocusGain(boolean)); see the sketch below.
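
A sketch of what that enum might look like on the Dart side (hypothetical; it does not exist in the plugin yet):

// Hypothetical enum mirroring Android's three focus request outcomes.
enum AndroidAudioFocusRequestResult {
  failed,  // AUDIOFOCUS_REQUEST_FAILED
  granted, // AUDIOFOCUS_REQUEST_GRANTED
  delayed, // AUDIOFOCUS_REQUEST_DELAYED; needs setAcceptsDelayedFocusGain(true)
}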

Crash on iOS 12

Hi,

When trying to run my app on iOS 12, it crashes with the following message:
dyld: Library not loaded: /System/Library/Frameworks/AVFAudio.framework/AVFAudio
Referenced from: /Users/hennie/Library/Developer/CoreSimulator/Devices/E29CF1A8-DF5D-4202-8758-649CEB6077CB/data/Containers/Bundle/Application/BA9BDD13-3C82-4C97-BF4A-E9A4AD758415/Runner.app/Frameworks/audio_session.framework/audio_session
Reason: image not found

Use --dart-define to optionally disable the microphone code

Theoretically, it seems like this idea might work.

As it stands, when apps get submitted to the App Store, Apple's automated analyser looks at which APIs are statically compiled into your code, and if you symbolically reference an API such as the microphone, it asks you to include a usage string explaining your use of the microphone. This analyser is not smart enough (and I guess cannot be smart enough) to know that audio_service is like a 3rd-party SDK and that an app isn't necessarily going to use all features of the SDK, so it will ask you to explain your use of the microphone even if you don't use it.

So with --dart-define we may be able to pass in a compile-time option and then use the preprocessor in the iOS code to selectively disable parts of the code.

This isn't strictly necessary; an app can always just include a usage string for the microphone that says "This app does not use the microphone" or something similar. But maybe it'd still be a good idea to implement the --dart-define option sometime down the road; a sketch of the Dart side follows.
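
As a sketch of just the Dart half of the idea, the flag could be read with bool.fromEnvironment; the define name here is an assumption for illustration, and the iOS half would still need a matching preprocessor symbol wired into the build:

// Read a flag passed via:
//   flutter build ios --dart-define=AUDIO_SESSION_MICROPHONE=false
// The define name is hypothetical.
const bool microphoneEnabled =
    bool.fromEnvironment('AUDIO_SESSION_MICROPHONE', defaultValue: true);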

NSMicrophoneUsageDescription still required after compiling out the microphone option via the Podfile (iOS)

Hi,

I am having an issue with the iOS NSMicrophoneUsageDescription.

I am using a Flutter TTS plugin together with audio_session to duck background sounds while the TTS plugin is speaking:

initAudioSession() async {
  session = await AudioSession.instance;
  await session.configure(AudioSessionConfiguration(
    avAudioSessionCategory: AVAudioSessionCategory.playback,
    avAudioSessionCategoryOptions: AVAudioSessionCategoryOptions.duckOthers,
    avAudioSessionMode: AVAudioSessionMode.defaultMode,
    avAudioSessionRouteSharingPolicy: AVAudioSessionRouteSharingPolicy.defaultPolicy,
    avAudioSessionSetActiveOptions: AVAudioSessionSetActiveOptions.none,
    androidAudioAttributes: const AndroidAudioAttributes(
      contentType: AndroidAudioContentType.speech,
      flags: AndroidAudioFlags.none,
      usage: AndroidAudioUsage.media,
    ),
    androidAudioFocusGainType: AndroidAudioFocusGainType.gainTransientMayDuck,
    androidWillPauseWhenDucked: true,
  ));
}

I do not use the microphone at all (unless the duck attribute uses the microphone), and I have no microphone permission enabled on either iOS or Android.

I have edited my Podfile as suggested (to compile out the mic option):

post_install do |installer|
  installer.pods_project.targets.each do |target|
    flutter_additional_ios_build_settings(target)
    target.build_configurations.each do |config|
      config.build_settings.delete 'IPHONEOS_DEPLOYMENT_TARGET'
      config.build_settings['GCC_PREPROCESSOR_DEFINITIONS'] ||= [
        '$(inherited)',
        'AUDIO_SESSION_MICROPHONE=0'
      ]
    end
  end
end

But I am still getting "ITMS-90683: Missing Purpose String in Info.plist - The app's Info.plist file should contain an NSMicrophoneUsageDescription" from Apple.

The version of audio_session is the latest: 0.1.6+1

My flutter doctor -v output:
[✓] Flutter (Channel stable, 2.5.3, on macOS 12.0.1 21A559 darwin-arm, locale en-SG)
• Flutter version 2.5.3 at /Users/remy/Desktop/flutter
• Upstream repository https://github.com/flutter/flutter.git
• Framework revision 18116933e7 (4 weeks ago), 2021-10-15 10:46:35 -0700
• Engine revision d3ea636dc5
• Dart version 2.14.4

[✓] Android toolchain - develop for Android devices (Android SDK version 31.0.0)
• Android SDK at /Users/remy/Library/Android/sdk
• Platform android-31, build-tools 31.0.0
• Java binary at: /Applications/Android Studio.app/Contents/jre/Contents/Home/bin/java
• Java version OpenJDK Runtime Environment (build 11.0.10+0-b96-7281165)
• All Android licenses accepted.

[✓] Xcode - develop for iOS and macOS
• Xcode at /Applications/Xcode.app/Contents/Developer
• Xcode 13.1, Build version 13A1030d
• CocoaPods version 1.10.1

[✓] Chrome - develop for the web
• Chrome at /Applications/Google Chrome.app/Contents/MacOS/Google Chrome

[✓] Android Studio (version 2020.3)
• Android Studio at /Applications/Android Studio.app/Contents
• Flutter plugin can be installed from:
🔨 https://plugins.jetbrains.com/plugin/9212-flutter
• Dart plugin can be installed from:
🔨 https://plugins.jetbrains.com/plugin/6351-dart
• Java version OpenJDK Runtime Environment (build 11.0.10+0-b96-7281165)

[✓] Connected device (3 available)
• iPad (9th generation) (mobile) • 7D9BF1E7-CF23-4185-A364-031987D00AD3 • ios • com.apple.CoreSimulator.SimRuntime.iOS-15-0 (simulator)
• macOS (desktop) • macos • darwin-arm64 • macOS 12.0.1 21A559 darwin-arm
• Chrome (web) • chrome • web-javascript • Google Chrome 95.0.4638.69

• No issues found!

Any advice?

Thanks a lot
Remy

Please disable logs by default

We often see these messages, originating from audio_session, a transitive dependency:

2021-09-15 15:58:12.148450+0100 Runner[10580:833980] routeChange detected
2021-09-15 15:58:13.457703+0100 Runner[10580:833980] routeChange detected
2021-09-15 15:58:14.760813+0100 Runner[10580:833980] routeChange detected
2021-09-15 15:58:21.482768+0100 Runner[10580:833980] routeChange detected

Play loudly and use the microphone at the same time

How can I play audio and record at the same time? I am using the configuration below, but the playback volume is low. @ryanheise


  await session.configure(const AudioSessionConfiguration(
    avAudioSessionCategory: AVAudioSessionCategory.playAndRecord,
    avAudioSessionCategoryOptions: AVAudioSessionCategoryOptions.defaultToSpeaker,
    avAudioSessionMode: AVAudioSessionMode.videoChat,
    avAudioSessionRouteSharingPolicy: AVAudioSessionRouteSharingPolicy.defaultPolicy,
    avAudioSessionSetActiveOptions: AVAudioSessionSetActiveOptions.none,
    androidAudioAttributes: AndroidAudioAttributes(
      contentType: AndroidAudioContentType.speech,
      flags: AndroidAudioFlags.none,
      usage: AndroidAudioUsage.voiceCommunication,
    ),
    androidAudioFocusGainType: AndroidAudioFocusGainType.gain,
    androidWillPauseWhenDucked: true,
  ));

How to change output between speaker and headphones in iOS

I am having trouble figuring out how to change the audio output on iOS.

I thought the following would change it, but it gives me an error. I also only see the speaker option, so I assume there must be a different way to do this.

await AVAudioSession().overrideOutputAudioPort(AVAudioSessionPortOverride.speaker);

Can anyone provide some assistance for this?
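
Not an authoritative answer, but one thing worth checking: on iOS, overriding the output port generally only takes effect under the playAndRecord category. A minimal sketch under that assumption:

import 'package:audio_session/audio_session.dart';

// Sketch only: configure playAndRecord first, then override the port.
// The assumption is that the error comes from overriding the port while
// the session is still in a playback-only category.
Future<void> switchToSpeaker() async {
  final session = await AudioSession.instance;
  await session.configure(const AudioSessionConfiguration(
    avAudioSessionCategory: AVAudioSessionCategory.playAndRecord,
  ));
  await AVAudioSession()
      .overrideOutputAudioPort(AVAudioSessionPortOverride.speaker);
}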

module 'audio_session' not found

We get this error once we install the audio_session plugin:

Xcode's output:

/Users/birendrayadav/Desktop/wemeetnew/wemeet/ios/Runner/GeneratedPluginRegistrant.m:10:9:
fatal error: module 'audio_session' not found
@import audio_session;
~~~~~~~^~~~~~~~~~~~~
1 error generated.
note: Using new build system
note: Building targets in parallel
note: Planning build
note: Analyzing workspace
note: Constructing build description
note: Build preparation complete

Could not build the application for the simulator.
Error launching application on iphone.

ExoPlayer init multiple times?

I/ExoPlayerImpl(11716): Init bff6ae6 [ExoPlayerLib/2.11.7] [coral, Pixel 4 XL, Google, 29]
I/ExoPlayerImpl(11716): Init 236c827 [ExoPlayerLib/2.11.7] [coral, Pixel 4 XL, Google, 29]
I/ExoPlayerImpl(11716): Init 7794ad4 [ExoPlayerLib/2.11.7] [coral, Pixel 4 XL, Google, 29]
I/ExoPlayerImpl(11716): Init 769a47d [ExoPlayerLib/2.11.7] [coral, Pixel 4 XL, Google, 29]
I/ExoPlayerImpl(11716): Init dc74f72 [ExoPlayerLib/2.11.7] [coral, Pixel 4 XL, Google, 29]
I/ExoPlayerImpl(11716): Init c3dbfc3 [ExoPlayerLib/2.11.7] [coral, Pixel 4 XL, Google, 29]
I/ExoPlayerImpl(11716): Init a8a4040 [ExoPlayerLib/2.11.7] [coral, Pixel 4 XL, Google, 29]
I/ExoPlayerImpl(11716): Init df10b79 [ExoPlayerLib/2.11.7] [coral, Pixel 4 XL, Google, 29]
I/ExoPlayerImpl(11716): Init 9b6f0be [ExoPlayerLib/2.11.7] [coral, Pixel 4 XL, Google, 29]
I/ExoPlayerImpl(11716): Init 132951f [ExoPlayerLib/2.11.7] [coral, Pixel 4 XL, Google, 29]
I/ExoPlayerImpl(11716): Init f3206c [ExoPlayerLib/2.11.7] [coral, Pixel 4 XL, Google, 29]
I/ExoPlayerImpl(11716): Init aede635 [ExoPlayerLib/2.11.7] [coral, Pixel 4 XL, Google, 29]
I/ExoPlayerImpl(11716): Init 56d5aca [ExoPlayerLib/2.11.7] [coral, Pixel 4 XL, Google, 29]
I/ExoPlayerImpl(11716): Init a3b0b1 [ExoPlayerLib/2.11.7] [coral, Pixel 4 XL, Google, 29]
I/ExoPlayerImpl(11716): Init f6e5996 [ExoPlayerLib/2.11.7] [coral, Pixel 4 XL, Google, 29]
I/ExoPlayerImpl(11716): Init 6ef1104 [ExoPlayerLib/2.11.7] [coral, Pixel 4 XL, Google, 29]
I/ExoPlayerImpl(11716): Init 811a6ed [ExoPlayerLib/2.11.7] [coral, Pixel 4 XL, Google, 29]

Setup code is below.

I just want to know whether these logs are normal or whether I have made a mistake in my implementation.

bgTaskEntryPoint() {
  print('coming inside backgroundTaskEntrypoint');
  AudioServiceBackground.run(() => PlayerBackgroundTask());
}

class Home extends StatefulWidget {
  @override
  _HomeState createState() => _HomeState();
}

class _HomeState extends State<Home> {
  @override
  void initState() {
    super.initState();
    AudioSession.instance.then((audioSession) async {
      await audioSession.configure(AudioSessionConfiguration.music());
    });
  }
  @override
  Widget build(BuildContext context) {
    return Scaffold(
      backgroundColor: Colors.white,
      body: BuildAppBody(
        context: context,
      ),
    );
  }
}

Crash on iOS

Hi,

My app occasionally crashes on iOS and I am not entirely sure how to go about resolving it. I've narrowed the crash down to this library, but I am not entirely sure why it happens. Here is the trace from Xcode.

[Screenshot: Xcode stack trace]

Compatibility with τ-Sound

Hi @ryanheise ,

This is not an issue, just congratulations on this plugin idea.

When I began to work on flutter_sound, I had a terrific idea, something really great: implement the concept of an audio session, where each recorder or player would have its own session. It was a brilliant idea ... until I realized that iOS can manage only one session per app. I realized that too late, so the flutter_sound API implements the audio-session concept but behaves very surprisingly when the app opens several players or recorders.

I am now rewriting the τ-Sound API and trying to handle the session concept correctly, but I am realizing that it could be a problem if the app also uses other plugins like just_audio.

And now I am discovering this plugin. Federating the sessions of all audio plugins is a bright idea. I will certainly add a dependency on audio_session from τ-Sound.

Thanks

interruptionEventStream is not working on iOS

Hi. For me, interruptionEventStream is no longer working on iOS and I do not know why (I use the audioplayers package). In other words, if my audio is interrupted, nothing happens. :(
In my initState(), after super.initState(), I call Provider.of(context, listen: false).audioInterruptedListener();

And this is the function:

  void audioInterruptedListener() {
    AudioSession.instance.then(
      (audioSession) async {
        bool playInterrupted = false;
        bool narratorInterrupted = false;
        await audioSession.configure(AudioSessionConfiguration.speech());
        audioSession.interruptionEventStream.listen(
          (event) {
            if (event.begin) {
              switch (event.type) {
                case AudioInterruptionType.duck:
                case AudioInterruptionType.pause:
                case AudioInterruptionType.unknown:
                  if (playInterrupted) {
                    playerBackground.resume();
                    if (playerState == AudioPlayerState.PAUSED &&
                        narratorInterrupted) {
                      playerNarrator.resume();
                      narratorInterrupted = false;
                    }
                    playInterrupted = false;
                    break;
                  }
                  if (!playInterrupted) {
                    playInterrupted = true;
                    playerBackground.pause();
                    if (playerState == AudioPlayerState.PLAYING) {
                      playerNarrator.pause();
                      narratorInterrupted = true;
                    }
                  }
                  break;
              }
            } else {
              switch (event.type) {
                case AudioInterruptionType.duck:
                case AudioInterruptionType.pause:
                  if (playInterrupted) {
                    playerBackground.resume();
                    if (playerState == AudioPlayerState.PAUSED &&
                        narratorInterrupted) {
                      playerNarrator.resume();
                      narratorInterrupted = false;
                    }
                    playInterrupted = false;
                  }
                  break;
                case AudioInterruptionType.unknown:
                // The interruption ended but we should not resume.
              }
            }
          },
        );
      },
    );
  }

[Question] How to set output audio device?

Thanks to this package, we can get the audio devices as a Future or a Stream. I'm trying to cast audio to one of those devices, but I couldn't figure out how. Is it possible with this package?

interruptionEventStream doesn't emit a new item when TalkBack is on (Android)

It seems that interruptionEventStream doesn't emit a new interruption event when certain conditions are met:

  • App is running on Android (described behavior happens at least on Android 11 and 12)
  • TalkBack is on (Android 12: Settings > Accessibility > TalkBack > Use TalkBack)
  • TalkBack's audio ducking is off (Android 12: Settings > Accessibility > TalkBack > Settings > Sound and Verbosity > Audio ducking). Interruption stream works as expected when ducking is on.

Reproduction path:

  • Enable TalkBack and disable audio ducking (see above)
  • Open app
  • Request audio focus
  • Change TalkBack focus to a different widget (e.g. Text) so TalkBack can start speaking and current playback should be interrupted

Observed behavior:

When changing focus, the app doesn't receive a new AudioInterruptionEvent. This makes it impossible to make the necessary changes to audio playback (pausing playback, etc.), and the app is unable to update the playback state (I'm using this plugin along with audio_service). The result is a playback state mismatch: the app shows that playback is ongoing although it actually isn't.

Expected behavior:

App gets notified about new audio interruption via AudioSession.interruptionEventStream.

My audio session configuration:

final config = const AudioSessionConfiguration.speech()
    .copyWith(androidWillPauseWhenDucked: true);

add some APIs from audio_service

I think you have already considered this but haven't had time for it yet.

I suggest that this plugin provide a comprehensive API for all media session interactions, which would mean moving some implementations from audio_service into here:

  • notification handling
  • media button click handling
  • (probably something else I missed)

With this we could make the APIs broader than we can in audio_service, and at the same time audio_service wouldn't lose anything: it would still provide the same interface for these interactions, just built on top of the audio_session API.

Upgrade Android dependency to media2-session 1.2.0

The new version of this dependency requires upgrading the compile SDK version to 31, so unfortunately this would be a breaking change. I'll hold off until just_audio and audio_service are prepared for a major version bump.

Toggling audio output between earpiece and speaker

Context

I am developing a mobile application that plays audio with the packages just_audio and audio_service. I am trying to add a feature that uses the device's proximity sensor, so that every time the user brings the phone closer to their ear, the audio plays from the earpiece.

Dependencies

  • audio_service: 0.15.2
  • just_audio: 0.5.5
  • audio_session: 0.0.9

My current state

When the service starts, I configure the AudioSession as follows:

    final session = await AudioSession.instance;
    await session.configure(AudioSessionConfiguration.music());

And I have a custom action for when the proximity sensor is activated:

  final session = await AudioSession.instance;
  final AudioSessionConfiguration currentConfiguration = session.configuration;
  await session.configure(
    session.configuration.copyWith(
      androidAudioAttributes: AndroidAudioAttributes(
        usage: AndroidAudioUsage.voiceCommunication,
        flags: currentConfiguration.androidAudioAttributes.flags,
        contentType: currentConfiguration.androidAudioAttributes.contentType,
      ),
    ),
  );

It works properly on Android, but I am not sure it is the recommended way, and I have no idea how to do it on iOS with the available options avAudioSessionCategory, avAudioSessionCategoryOptions or avAudioSessionMode (see the sketch below).

Thank you in advance.
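
Not verified advice, but a sketch of the iOS side using the same low-level call that appears elsewhere in this tracker. The assumption is that under the playAndRecord category, AVAudioSessionPortOverride.none routes to the default receiver (the earpiece) while .speaker forces the loudspeaker:

import 'package:audio_session/audio_session.dart';

// Sketch only: assumes the session has been configured with
// AVAudioSessionCategory.playAndRecord, under which the default route
// is the earpiece.
Future<void> routeToEarpiece() =>
    AVAudioSession().overrideOutputAudioPort(AVAudioSessionPortOverride.none);

Future<void> routeToSpeaker() =>
    AVAudioSession().overrideOutputAudioPort(AVAudioSessionPortOverride.speaker);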

Currently playing info?

Hi

I'm looking for a Flutter plugin that allows my app to display information about what's currently playing via the speaker or connected bluetooth devices - artist, track, cover art, that sort of thing. (Potentially some limited control too - next track, pause, resume; but not necessarily.)

audio_session looks like it gets close to what I'm looking for, but (if I've read it correctly) focuses on the app that's running rather than system-wide audio that might come from players currently backgrounded.

I'm pretty new to this low-level side of audio, so I wondered if you'd be able to tell me if you thought that kind of information was potentially available, and if so whether audio_session was an appropriate package in which to surface them for Flutter consumption? If you think it is, I'll happily fork and try to get it to work.

Thanks for any insights - and apologies if this is exactly what audio_session already provides, and I'm too ignorant about the iOS/Android audio internals to tell.

Cheers

Nic Ford

Memory Leak on Android

Used version audio_session 0.0.9.

┬───
│ GC Root: Local variable in native code
│
├─ android.net.ConnectivityThread instance
│    Leaking: NO (PathClassLoader↓ is not leaking)
│    Thread name: 'ConnectivityThread'
│    ↓ ConnectivityThread.contextClassLoader
├─ dalvik.system.PathClassLoader instance
│    Leaking: NO (AndroidAudioManager↓ is not leaking and A ClassLoader is
│    never leaking)
│    ↓ PathClassLoader.runtimeInternalObjects
├─ java.lang.Object[] array
│    Leaking: NO (AndroidAudioManager↓ is not leaking)
│    ↓ Object[].[2988]
├─ com.ryanheise.audio_session.AndroidAudioManager class
│    Leaking: NO (a class is never leaking)
│    ↓ static AndroidAudioManager.noisyReceiver
│                                 ~~~~~~~~~~~~~
├─ com.ryanheise.audio_session.AndroidAudioManager$1 instance
│    Leaking: UNKNOWN
│    Retaining 49041 bytes in 865 objects
│    Anonymous subclass of android.content.BroadcastReceiver
│    ↓ AndroidAudioManager$1.this$0
│                            ~~~~~~
├─ com.ryanheise.audio_session.AndroidAudioManager instance
│    Leaking: UNKNOWN
│    Retaining 49021 bytes in 864 objects
│    applicationContext instance of com.company.app.MainApplication
│    ↓ AndroidAudioManager.messenger
│                          ~~~~~~~~~
├─ io.flutter.embedding.engine.dart.DartExecutor instance
│    Leaking: UNKNOWN
│    Retaining 48977 bytes in 862 objects
│    ↓ DartExecutor.flutterJNI
│                   ~~~~~~~~~~
├─ io.flutter.embedding.engine.FlutterJNI instance
│    Leaking: UNKNOWN
│    Retaining 128 bytes in 10 objects
│    ↓ FlutterJNI.localizationPlugin
│                 ~~~~~~~~~~~~~~~~~~
├─ io.flutter.plugin.localization.LocalizationPlugin instance
│    Leaking: UNKNOWN
│    Retaining 16 bytes in 1 objects
│    context instance of com.company.app.MainActivity with mDestroyed =
│    true
│    ↓ LocalizationPlugin.context
│                         ~~~~~~~
╰→ com.company.app.MainActivity instance
​     Leaking: YES (ObjectWatcher was watching this because com.mynextbase.
​     connect.MainActivity received Activity#onDestroy() callback and
​     Activity#mDestroyed is true)
​     Retaining 41732 bytes in 612 objects
​     key = c9ca838e-bc2f-4f78-a78e-a77abef2e000
​     watchDurationMillis = 20643
​     retainedDurationMillis = 15642
​     mApplication instance of com.company.app.MainApplication
​     mBase instance of android.app.ContextImpl, not wrapping known Android
​     context

METADATA

Build.VERSION.SDK_INT: 29
Build.MANUFACTURER: samsung
LeakCanary version: 2.5
App process name: com.company.app
Stats: LruCache[maxSize=3000,hits=2458,misses=79998,hitRate=2%]
RandomAccess[bytes=3743193,reads=79998,travel=30190866019,range=22454797,size=28
937691]
Analysis duration: 4149 ms

Instead of registering the receiver every time the plugin is instantiated, the native side of the plugin should expose a single event channel per use case, and the Dart side should expose a broadcast stream to the application.

The native side should only register the receiver while it has active stream subscriptions, and unregister it when they are cancelled.
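
A minimal sketch of the Dart half of that design (the channel name is hypothetical):

import 'package:flutter/services.dart';

// Hypothetical channel name, for illustration only. The native side would
// register its BroadcastReceiver in onListen and unregister it in onCancel,
// so the receiver only exists while Dart subscriptions exist.
const EventChannel _noisyChannel =
    EventChannel('com.ryanheise.audio_session.becoming_noisy');

Stream<void> get becomingNoisyStream =>
    _noisyChannel.receiveBroadcastStream().map((_) {});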

@ryanheise do you have time to work on this any time soon?

Feature Request: iOS Bluetooth Latency

Hey there!

I LOVE audio_session and just_audio - I use them both for our breathing app.

In our app, we have several breathing animations that are synced to audio.

This works great with wired speakers/headphones/phone speakers, but when the device is connected via Bluetooth/AirPlay, the latency is large enough that users have started complaining.

While all Bluetooth devices have some latency, it varies relatively widely.

For this reason, I was wondering if it would be possible to implement a method to determine the output device or the output latency?

I already implemented this in a little sample project just by creating a new method in DarwinAudioSession.m:

- (void)getOutputLatency:(NSArray *)args result:(FlutterResult)result {
    result(@([[AVAudioSession sharedInstance] outputLatency]));
}

On the iOS devices I tested, it returns a double that seems to be fairly accurate.

AudioSession.instance creates new listeners or instance every time called on iOS

Which API doesn't behave as documented, and how does it misbehave?
Each time we call await AudioSession.instance in the code, or at startup, initWithRegistrar is called.

This causes the following code to run every time, and we end up with many listeners of the same type running at the same time:

    [AVAudioSession sharedInstance];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(audioInterrupt:) name:AVAudioSessionInterruptionNotification object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(routeChange:) name:AVAudioSessionRouteChangeNotification object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(silenceSecondaryAudio:) name:AVAudioSessionSilenceSecondaryAudioHintNotification object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(mediaServicesLost:) name:AVAudioSessionMediaServicesWereLostNotification object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(mediaServicesReset:) name:AVAudioSessionMediaServicesWereResetNotification object:nil];

Minimal reproduction project
The audio_service example project

To Reproduce (i.e. user steps, not code)

  1. Press AudioPlayer
  2. Press Stop
    ... repeat a few times

Error messages

We can see the listeners stack up, and we get a new listener for every start/stop.

When a route change happens after 3 starts/stops, we get the following:

routeChange detected
routeChange detected
routeChange detected
routeChange detected

We can also put an NSLog print statement in initWithRegistrar() in DarwinAudioSession.m.

Expected behavior
Each listener should only be registered once.

Runtime Environment (please complete the following information if relevant):

  • Device: iPhone SE (second generation)
  • OS: iOS 14.0.1

Flutter SDK version

Doctor summary (to see all details, run flutter doctor -v):
[✓] Flutter (Channel stable, 1.22.0, on Mac OS X 10.15.7 19H2, locale en-GB)

[✓] Android toolchain - develop for Android devices (Android SDK version 29.0.2)
[✓] Xcode - develop for iOS and macOS (Xcode 12.0.1)
[✓] Android Studio (version 4.0)
[✓] IntelliJ IDEA Community Edition (version 2018.2.4)
[✓] VS Code (version 1.49.3)
[✓] Connected device (1 available)
    ! Error: Auglysingadeild4thgen is not connected. Xcode will continue when Auglysingadeild4thgen is
      connected. (code -13)
    ! Error: Living Room is not connected. Xcode will continue when Living Room is connected. (code
      -13)
    ! Error: Living Room is not connected. Xcode will continue when Living Room is connected. (code
      -13)

• No issues found!

turn off music when in background

I am using just_audio to play music in a Flutter app on Android, but it keeps playing even when the app is in the background. I want to pause the music whenever I am outside the app and resume when I return, but I am unable to find any option for this. Any help?
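
audio_session itself doesn't pause players, but one approach is to watch the app lifecycle. A minimal sketch, assuming just_audio's AudioPlayer:

import 'package:flutter/widgets.dart';
import 'package:just_audio/just_audio.dart';

// Sketch only: pause when the app leaves the foreground and resume when
// it returns. The player is assumed to be created elsewhere in the app.
class LifecyclePauser with WidgetsBindingObserver {
  LifecyclePauser(this.player) {
    WidgetsBinding.instance.addObserver(this);
  }

  final AudioPlayer player;

  @override
  void didChangeAppLifecycleState(AppLifecycleState state) {
    if (state == AppLifecycleState.paused) {
      player.pause();
    } else if (state == AppLifecycleState.resumed) {
      player.play();
    }
  }

  void dispose() => WidgetsBinding.instance.removeObserver(this);
}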

Warnings from Xcode

The plugin seems to work as expected, but Xcode shows a couple of warnings during the build. I'm posting them here in case you weren't aware of them.

Warning 1

/users/andreivolgin/flutter/.pub-cache/hosted/pub.dartlang.org/audio_session-0.0.9/ios/Classes/AudioSessionPlugin.m:16:5: Expression result unused

@implementation AudioSessionPlugin {
    DarwinAudioSession *_darwinAudioSession;
    FlutterMethodChannel *_channel;
}

+ (void)registerWithRegistrar:(NSObject<FlutterPluginRegistrar>*)registrar {
    if (!plugins) {
        plugins = [NSHashTable weakObjectsHashTable];
    }
    [[AudioSessionPlugin alloc] initWithRegistrar:registrar]; // This line is marked as "Expression result unused"
}

Warning 2

/users/andreivolgin/flutter/.pub-cache/hosted/pub.dartlang.org/audio_session-0.0.9/ios/Classes/DarwinAudioSession.m:402:22: Incompatible pointer types assigning to 'NSNumber *' from 'NSNull * _Nonnull'

This is the relevant code:

    if (wasSuspended == nil) {
        wasSuspended = [NSNull null];
    }

AVAudioSessionInterruptionWasSuspendedKey not handled on iOS

Which API doesn't behave as documented, and how does it misbehave?
When pausing from the lock screen, locking the phone, and then pressing play from the lock screen, an interruption is sent carrying AVAudioSessionInterruptionWasSuspendedKey.

To Reproduce (i.e. user steps, not code)

  1. Run audio_service example on iOS with audio_session 0.8
  2. Start playing audio
  3. Go to lock screen
  4. Pause on lock screen
  5. Lock screen / Turn off screen
  6. Wait about 30 sec
  7. Press play from lock screen
  8. An interruption is received right after playback starts; the music is paused and the play button is disabled

Error messages

AudioInterrupt(begin:true)

Expected behavior
Start playing

Runtime Environment (please complete the following information if relevant):

  • Device: iPhone SE (second generation)
  • OS: iOS 14.0.1

Flutter SDK version

Doctor summary (to see all details, run flutter doctor -v):
[✓] Flutter (Channel stable, 1.22.0, on Mac OS X 10.15.7 19H2, locale en-GB)

[✓] Android toolchain - develop for Android devices (Android SDK version 29.0.2)
[✓] Xcode - develop for iOS and macOS (Xcode 12.0.1)
[✓] Android Studio (version 4.0)
[✓] IntelliJ IDEA Community Edition (version 2018.2.4)
[✓] VS Code (version 1.49.3)
[!] Connected device
    ! Error: Auglysingadeild4thgen is not connected. Xcode will continue when
      Auglysingadeild4thgen is connected. (code -13)
    ! Error: Living Room is not connected. Xcode will continue when Living Room
      is connected. (code -13)
    ! Error: Living Room is not connected. Xcode will continue when Living Room
      is connected. (code -13)

! Doctor found issues in 1 category.

Additional context
My workaround is the following code, which works as intended:

- (void) audioInterrupt:(NSNotification*)notification {
    NSNumber *interruptionType = (NSNumber*)[notification.userInfo valueForKey:AVAudioSessionInterruptionTypeKey];
    NSLog(@"audioInterrupt");
    switch ([interruptionType integerValue]) {
        case AVAudioSessionInterruptionTypeBegan:
        {
            if (@available(iOS 10.3, *)) {
                if ([notification.userInfo valueForKey:AVAudioSessionInterruptionWasSuspendedKey]) {
                    NSLog(@"audioInterrupt - DISCARDED BECAUSE OF SUSPENDED KEY");
                    break;
                }
            } else {
                // Fallback on earlier versions
            }
            [self invokeMethod:@"onInterruptionEvent" arguments:@[@(0), @(0)]];
            break;
        }
        case AVAudioSessionInterruptionTypeEnded:
        {
            if ([(NSNumber*)[notification.userInfo valueForKey:AVAudioSessionInterruptionOptionKey] intValue] == AVAudioSessionInterruptionOptionShouldResume) {
                [self invokeMethod:@"onInterruptionEvent" arguments:@[@(1), @(1)]];
            } else {
                [self invokeMethod:@"onInterruptionEvent" arguments:@[@(1), @(0)]];
            }
            break;
        }
        default:
            break;
    }
}

Android 5 ExoPlayer crash

Hi, the recent update 0.1.4 broke my app on old Android devices, but this was later fixed by 0.1.5, so now it is all good.
To investigate the issue I set up an Android 5 emulator (I didn't have one before, so I wouldn't know whether it used to work on versions lower than 0.1.4), and it turns out that, after the initial issues were fixed by upgrading to 0.1.5, the plugin still crashes upon trying to play some audio.

...
I/OMXClient(20748): Using client-side OMX mux.
E/AudioTrack(20748): AudioFlinger could not create track, status: -12
E/AudioTrack-JNI(20748): Error -12 initializing AudioTrack
E/android.media.AudioTrack(20748): Error code -20 when initializing AudioTrack.
E/ExoPlayerImplInternal(20748): Playback error
E/ExoPlayerImplInternal(20748):   com.google.android.exoplayer2.ExoPlaybackException: MediaCodecAudioRenderer error, index=1, format=Format(null, null, null, audio/raw, null, -1, null, [-1, -1, -1.0], [2, 44100]), format_supported=YES
E/ExoPlayerImplInternal(20748):       at com.google.android.exoplayer2.ExoPlayerImplInternal.handleMessage(ExoPlayerImplInternal.java:554)
E/ExoPlayerImplInternal(20748):       at android.os.Handler.dispatchMessage(Handler.java:98)
E/ExoPlayerImplInternal(20748):       at android.os.Looper.loop(Looper.java:135)
E/ExoPlayerImplInternal(20748):       at android.os.HandlerThread.run(HandlerThread.java:61)
E/ExoPlayerImplInternal(20748):   Caused by: com.google.android.exoplayer2.audio.AudioSink$InitializationException: AudioTrack init failed 0 Config(44100, 12, 44100)
E/ExoPlayerImplInternal(20748):       at com.google.android.exoplayer2.audio.DefaultAudioSink$Configuration.buildAudioTrack(DefaultAudioSink.java:1999)
E/ExoPlayerImplInternal(20748):       at com.google.android.exoplayer2.audio.DefaultAudioSink.buildAudioTrack(DefaultAudioSink.java:821)
E/ExoPlayerImplInternal(20748):       at com.google.android.exoplayer2.audio.DefaultAudioSink.initializeAudioTrack(DefaultAudioSink.java:626)
E/ExoPlayerImplInternal(20748):       at com.google.android.exoplayer2.audio.DefaultAudioSink.handleBuffer(DefaultAudioSink.java:699)
E/ExoPlayerImplInternal(20748):       at com.google.android.exoplayer2.audio.MediaCodecAudioRenderer.processOutputBuffer(MediaCodecAudioRenderer.java:630)
E/ExoPlayerImplInternal(20748):       at com.google.android.exoplayer2.mediacodec.MediaCodecRenderer.drainOutputBuffer(MediaCodecRenderer.java:1854)
E/ExoPlayerImplInternal(20748):       at com.google.android.exoplayer2.mediacodec.MediaCodecRenderer.render(MediaCodecRenderer.java:824)
E/ExoPlayerImplInternal(20748):       at com.google.android.exoplayer2.ExoPlayerImplInternal.doSomeWork(ExoPlayerImplInternal.java:947)
E/ExoPlayerImplInternal(20748):       at com.google.android.exoplayer2.ExoPlayerImplInternal.handleMessage(ExoPlayerImplInternal.java:477)
E/ExoPlayerImplInternal(20748):       ... 3 more
E/AudioPlayer(20748): TYPE_RENDERER: AudioTrack init failed 0 Config(44100, 12, 44100)
...

I see this message repeated multiple times (3?) every time my app tries to play some audio (simple .mp3 files).

Is there anything that can be done about this?

Duck iOS external music when my Flutter app starts playing audio

I was able to achieve the above scenario, but I am not able to get it working on iOS.
Here is my config code:

await session.configure(AudioSessionConfiguration(
  avAudioSessionCategory: AVAudioSessionCategory.playback,
  avAudioSessionCategoryOptions: AVAudioSessionCategoryOptions.duckOthers,
  avAudioSessionMode: AVAudioSessionMode.spokenAudio,
  avAudioSessionRouteSharingPolicy: AVAudioSessionRouteSharingPolicy.defaultPolicy,
  avAudioSessionSetActiveOptions: AVAudioSessionSetActiveOptions.notifyOthersOnDeactivation,
  androidAudioAttributes: const AndroidAudioAttributes(
    contentType: AndroidAudioContentType.music,
    flags: AndroidAudioFlags.none,
    usage: AndroidAudioUsage.notification,
  ),
  androidAudioFocusGainType: AndroidAudioFocusGainType.gainTransientMayDuck,
  androidWillPauseWhenDucked: true,
));

Ducking Spotify/music players

I'm trying to make an app similar to Google Maps where some audio should duck Spotify/any music player's output temporarily on Android.

androidAudioFocusGainType: AndroidAudioFocusGainType.gainTransientMayDuck,

but this doesn't work for Spotify, while it does work with my local music player (Poweramp). However, once the audio is ducked, it does not go back to its full volume even after the audio my app plays is complete.

How do I achieve a Google Maps-like effect with my audio? I wonder if androidWillPauseWhenDucked has anything to do with it?

I also tried

androidAudioAttributes: const AndroidAudioAttributes(
  usage: AndroidAudioUsage.assistanceNavigationGuidance,
),

but it had the same issues
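
For what it's worth, one pattern that may explain the volume never being restored: other apps only unduck when your app abandons audio focus, and audio_session abandons focus when the session is deactivated. A minimal sketch, assuming just_audio for playback:

import 'package:audio_session/audio_session.dart';
import 'package:just_audio/just_audio.dart';

// Sketch only: activate the session (requesting transient-may-duck focus)
// before playing, and deactivate it afterwards so that focus is abandoned
// and other apps can restore their volume.
Future<void> playPrompt(AudioPlayer player, String url) async {
  final session = await AudioSession.instance;
  if (await session.setActive(true)) {
    await player.setUrl(url);
    await player.play(); // completes when playback finishes
    await session.setActive(false);
  }
}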
