25 changes: 25 additions & 0 deletions Docs/audio.md
@@ -8,6 +8,23 @@ By default, the SDK automatically configures the `AVAudioSession`. However, this
AudioManager.shared.audioSession.isAutomaticConfigurationEnabled = false
```

## Audio engine observers

The SDK uses an `AudioEngineObserver` chain to configure and wire up the audio engine.
On iOS/visionOS/tvOS, the default is:

```swift
AudioManager.shared.set(engineObservers: [AudioManager.shared.audioSession, AudioManager.shared.mixer])
```

If you only want to manage `AVAudioSession` yourself, prefer setting
`AudioManager.shared.audioSession.isAutomaticConfigurationEnabled = false` and keep the
default observers.

Setting `AudioManager.shared.set(engineObservers: [])` disables all observers (no session
handling and no mixer setup). Do this only once early in app startup; changing the chain
while the engine is in use can cause rebuilds or unexpected behavior.
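
As a minimal sketch of configuring the chain once at startup, assuming a hypothetical UIKit `AppDelegate` as the app's entry point:

```swift
import UIKit
import LiveKit

// Sketch: set the observer chain once, early in app startup, before any
// room is connected. Pass [] instead to disable all observers (no session
// handling and no mixer setup).
class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Restore the iOS/visionOS/tvOS defaults explicitly.
        AudioManager.shared.set(engineObservers: [AudioManager.shared.audioSession,
                                                  AudioManager.shared.mixer])
        return true
    }
}
```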

## Disabling Voice Processing

Apple's voice processing, such as echo cancellation and automatic gain control, is enabled by default.
@@ -71,6 +88,14 @@ try AudioManager.shared.set(microphoneMuteMode: .voiceProcessing)
| `.restart` | No | Turns off | Slow |
| `.inputMixer` | No | Remains on | Fast |

Notes:

- `.voiceProcessing` uses `AVAudioInputNode.isVoiceProcessingInputMuted`. In our testing,
this can behave like an app-wide mute for voice-processing input, affecting other
`AVAudioEngine` instances that use the mic. If you have another engine that uses mic
input, consider `.inputMixer` or manage muting yourself.
- If your other engines are playback-only, there is typically no impact.
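
If the app-wide behavior of `.voiceProcessing` is a concern, `.inputMixer` can be selected the same way; a sketch (error handling is app-specific):

```swift
import LiveKit

// Sketch: prefer `.inputMixer` when other AVAudioEngine instances in the
// app also capture microphone input; it mutes quickly and keeps voice
// processing untouched.
do {
    try AudioManager.shared.set(microphoneMuteMode: .inputMixer)
} catch {
    // Handle the error as appropriate for your app.
    print("Failed to set microphone mute mode: \(error)")
}
```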

If you disable automatic audio session configuration (`AudioManager.shared.audioSession.isAutomaticConfigurationEnabled = false`), the SDK will not touch the session category. Make sure your app sets `.playAndRecord` before unmuting or publishing the mic.

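
With automatic configuration disabled, manual session setup might look like the following sketch using Apple's `AVAudioSession` API (the category options shown are an assumption; adjust for your app):

```swift
import AVFoundation

// Sketch: manual AVAudioSession setup when automatic configuration is
// disabled. Run this before unmuting or publishing the microphone.
let session = AVAudioSession.sharedInstance()
try session.setCategory(.playAndRecord,
                        mode: .voiceChat,
                        options: [.allowBluetooth, .defaultToSpeaker])
try session.setActive(true)
```
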
## Capturing Audio Buffers

2 changes: 1 addition & 1 deletion README.md
@@ -180,7 +180,7 @@ AudioManager.shared.audioSession.isAutomaticConfigurationEnabled = false

- `AVAudioSession` must be configured and activated with category `.playAndRecord` and mode `.voiceChat` or `.videoChat` before enabling/publishing the microphone (so the audio engine can start).

To get specific timings of the audio engine lifecycle, you can provide your own `AudioEngineObserver` chain with `AudioManager.shared.set(engineObservers:)`.
To get specific timings of the audio engine lifecycle, you can provide your own `AudioEngineObserver` chain with `AudioManager.shared.set(engineObservers:)`. The default on iOS is `AudioManager.shared.set(engineObservers: [AudioManager.shared.audioSession, AudioManager.shared.mixer])`; configure this once early in app startup and avoid changing it while the engine is in use.

See the default `AudioSessionEngineObserver` for an example of how an `AudioEngineObserver` can configure the audio session.
