
Commit a114ac2

[camera_avfoundation] Implementation swift migration - part 11 (#9690)
Migrates the camera implementation as part of flutter/flutter#119109. Resolves flutter/flutter#170439.

This PR migrates the last of the problematic methods (`startImageStream`) to Swift, which resolves the issue.

This PR migrates the 8th chunk of the `FLTCam` class to Swift:

* `startImageStream`
* `setUpCaptureSessionForAudioIfNeeded`
* `reportErrorMessage` (ObjC implementation removal)

Some properties of `FLTCam` have to be made temporarily public so that they are accessible in `DefaultCamera`.

## Pre-Review Checklist

**Note**: The Flutter team is currently trialing the use of [Gemini Code Assist for GitHub](https://developers.google.com/gemini-code-assist/docs/review-github-code). Comments from the `gemini-code-assist` bot should not be taken as authoritative feedback from the Flutter team. If you find its comments useful you can update your code accordingly, but if you are unsure or disagree with the feedback, please feel free to wait for a Flutter team member's review for guidance on which automated comments should be addressed.

[^1]: Regular contributors who have demonstrated familiarity with the repository guidelines only need to comment if the PR is not auto-exempted by repo tooling.
1 parent bc04e36 commit a114ac2
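As an illustrative aside (not part of this commit's diff): a minimal sketch of how the newly migrated Swift entry points might be invoked from plugin code. The method signatures come from the `DefaultCamera.swift` diff below; `camera` and `messenger` are assumed placeholders and the code is only meaningful inside the plugin target where `DefaultCamera` is visible.

```swift
import Flutter

// Hypothetical call sites; `camera` is assumed to be an already-configured
// DefaultCamera and `messenger` a FlutterBinaryMessenger from the plugin registrar.
func startStreamingImages(camera: DefaultCamera, messenger: FlutterBinaryMessenger) {
  // Opens the "plugins.flutter.io/camera_avfoundation/imageStream" event channel
  // and marks the camera as streaming on the capture session queue.
  camera.startImageStream(with: messenger) { error in
    if let error = error {
      print("startImageStream failed: \(error)")
    }
  }
}

func prepareAudioIfNeeded(camera: DefaultCamera) {
  // No-op when audio is disabled in the media settings or already set up;
  // otherwise attaches the audio input/output to the audio capture session.
  camera.setUpCaptureSessionForAudioIfNeeded()
}
```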

File tree

6 files changed: +146, -154 lines changed

packages/camera/camera_avfoundation/CHANGELOG.md

Lines changed: 5 additions & 0 deletions
@@ -1,3 +1,8 @@
+## 0.9.21+1
+
+* Migrates `startImageStream` and `setUpCaptureSessionForAudioIfNeeded` methods to Swift.
+* Removes Objective-C implementation of `reportErrorMessage` method.
+
 ## 0.9.21
 
 * Fixes crash when streaming is enabled during recording.

packages/camera/camera_avfoundation/ios/camera_avfoundation/Sources/camera_avfoundation/DefaultCamera.swift

Lines changed: 136 additions & 1 deletion
@@ -92,6 +92,92 @@ final class DefaultCamera: FLTCam, Camera {
     return (captureVideoInput, captureVideoOutput, connection)
   }
 
+  func setUpCaptureSessionForAudioIfNeeded() {
+    // Don't setup audio twice or we will lose the audio.
+    guard !mediaSettings.enableAudio || !isAudioSetup else { return }
+
+    let audioDevice = audioCaptureDeviceFactory()
+    do {
+      // Create a device input with the device and add it to the session.
+      // Setup the audio input.
+      let audioInput = try captureDeviceInputFactory.deviceInput(with: audioDevice)
+
+      // Setup the audio output.
+      let audioOutput = AVCaptureAudioDataOutput()
+
+      let block = {
+        // Set up options implicit to AVAudioSessionCategoryPlayback to avoid conflicts with other
+        // plugins like video_player.
+        DefaultCamera.upgradeAudioSessionCategory(
+          requestedCategory: .playAndRecord,
+          options: [.defaultToSpeaker, .allowBluetoothA2DP, .allowAirPlay]
+        )
+      }
+
+      if !Thread.isMainThread {
+        DispatchQueue.main.sync(execute: block)
+      } else {
+        block()
+      }
+
+      if audioCaptureSession.canAddInput(audioInput) {
+        audioCaptureSession.addInput(audioInput)
+
+        if audioCaptureSession.canAddOutput(audioOutput) {
+          audioCaptureSession.addOutput(audioOutput)
+          audioOutput.setSampleBufferDelegate(self, queue: captureSessionQueue)
+          isAudioSetup = true
+        } else {
+          reportErrorMessage("Unable to add Audio input/output to session capture")
+          isAudioSetup = false
+        }
+      }
+    } catch let error as NSError {
+      reportErrorMessage(error.description)
+    }
+  }
+
+  // This function, although slightly modified, is also in video_player_avfoundation (in ObjC).
+  // Both need to do the same thing and run on the same thread (for example main thread).
+  // Configure application wide audio session manually to prevent overwriting flag
+  // MixWithOthers by capture session.
+  // Only change category if it is considered an upgrade which means it can only enable
+  // ability to play in silent mode or ability to record audio but never disables it,
+  // that could affect other plugins which depend on this global state. Only change
+  // category or options if there is change to prevent unnecessary lags and silence.
+  private static func upgradeAudioSessionCategory(
+    requestedCategory: AVAudioSession.Category,
+    options: AVAudioSession.CategoryOptions
+  ) {
+    let playCategories: Set<AVAudioSession.Category> = [.playback, .playAndRecord]
+    let recordCategories: Set<AVAudioSession.Category> = [.record, .playAndRecord]
+    let requiredCategories: Set<AVAudioSession.Category> = [
+      requestedCategory, AVAudioSession.sharedInstance().category,
+    ]
+
+    let requiresPlay = !requiredCategories.isDisjoint(with: playCategories)
+    let requiresRecord = !requiredCategories.isDisjoint(with: recordCategories)
+
+    var finalCategory = requestedCategory
+    if requiresPlay && requiresRecord {
+      finalCategory = .playAndRecord
+    } else if requiresPlay {
+      finalCategory = .playback
+    } else if requiresRecord {
+      finalCategory = .record
+    }
+
+    let finalOptions = AVAudioSession.sharedInstance().categoryOptions.union(options)
+
+    if finalCategory == AVAudioSession.sharedInstance().category
+      && finalOptions == AVAudioSession.sharedInstance().categoryOptions
+    {
+      return
+    }
+
+    try? AVAudioSession.sharedInstance().setCategory(finalCategory, options: finalOptions)
+  }
+
   func reportInitializationState() {
     // Get all the state on the current thread, not the main thread.
     let state = FCPPlatformCameraState.make(
@@ -257,7 +343,6 @@ final class DefaultCamera: FLTCam, Camera {
       newAudioWriterInput.expectsMediaDataInRealTime = true
       mediaSettingsAVWrapper.addInput(newAudioWriterInput, to: videoWriter)
       self.audioWriterInput = newAudioWriterInput
-      audioOutput.setSampleBufferDelegate(self, queue: captureSessionQueue)
     }
 
     if flashMode == .torch {
@@ -728,6 +813,53 @@ final class DefaultCamera: FLTCam, Camera {
     completion(nil)
   }
 
+  func startImageStream(
+    with messenger: any FlutterBinaryMessenger, completion: @escaping (FlutterError?) -> Void
+  ) {
+    startImageStream(
+      with: messenger,
+      imageStreamHandler: FLTImageStreamHandler(captureSessionQueue: captureSessionQueue),
+      completion: completion
+    )
+  }
+
+  func startImageStream(
+    with messenger: FlutterBinaryMessenger,
+    imageStreamHandler: FLTImageStreamHandler,
+    completion: @escaping (FlutterError?) -> Void
+  ) {
+    if isStreamingImages {
+      reportErrorMessage("Images from camera are already streaming!")
+      completion(nil)
+      return
+    }
+
+    let eventChannel = FlutterEventChannel(
+      name: "plugins.flutter.io/camera_avfoundation/imageStream",
+      binaryMessenger: messenger
+    )
+    let threadSafeEventChannel = FLTThreadSafeEventChannel(eventChannel: eventChannel)
+
+    self.imageStreamHandler = imageStreamHandler
+    threadSafeEventChannel.setStreamHandler(imageStreamHandler) { [weak self] in
+      guard let strongSelf = self else {
+        completion(nil)
+        return
+      }
+
+      strongSelf.captureSessionQueue.async { [weak self] in
+        guard let strongSelf = self else {
+          completion(nil)
+          return
+        }
+
+        strongSelf.isStreamingImages = true
+        strongSelf.streamingPendingFramesCount = 0
+        completion(nil)
+      }
+    }
+  }
+
   func stopImageStream() {
     if isStreamingImages {
       isStreamingImages = false
@@ -989,6 +1121,9 @@ final class DefaultCamera: FLTCam, Camera {
     }
   }
 
+  /// Reports the given error message to the Dart side of the plugin.
+  ///
+  /// Can be called from any thread.
   private func reportErrorMessage(_ errorMessage: String) {
     FLTEnsureToRunOnMainQueue { [weak self] in
      self?.dartAPI?.reportError(errorMessage) { _ in
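As an aside (not part of the diff above): a small self-contained sketch of the category-upgrade rule that `upgradeAudioSessionCategory` implements, written against explicit inputs instead of the shared `AVAudioSession` so the merge behaviour is easy to see. The function and the example values are illustrative only.

```swift
import AVFoundation

// Illustrative only: mirrors the upgrade rules above — the result can only
// widen capabilities (add playback or recording), never remove them.
func mergedCategory(
  current: AVAudioSession.Category,
  requested: AVAudioSession.Category
) -> AVAudioSession.Category {
  let playCategories: Set<AVAudioSession.Category> = [.playback, .playAndRecord]
  let recordCategories: Set<AVAudioSession.Category> = [.record, .playAndRecord]
  let required: Set<AVAudioSession.Category> = [current, requested]

  let requiresPlay = !required.isDisjoint(with: playCategories)
  let requiresRecord = !required.isDisjoint(with: recordCategories)

  if requiresPlay && requiresRecord { return .playAndRecord }
  if requiresPlay { return .playback }
  if requiresRecord { return .record }
  return requested
}

// E.g. another plugin already set `.playback`; the camera requests `.playAndRecord`:
// the merged category keeps playback and adds recording instead of downgrading either.
assert(mergedCategory(current: .playback, requested: .playAndRecord) == .playAndRecord)
// A pure `.record` request over an existing `.playback` session is upgraded the same way.
assert(mergedCategory(current: .playback, requested: .record) == .playAndRecord)
```

This is why the real method also unions the category options and returns early when nothing would change: other plugins (such as video_player) depend on this shared global state, and an unnecessary `setCategory` call can cause lags or audio dropouts.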

packages/camera/camera_avfoundation/ios/camera_avfoundation/Sources/camera_avfoundation_objc/FLTCam.m

Lines changed: 1 addition & 142 deletions
@@ -28,17 +28,11 @@ @interface FLTCam () <AVCaptureVideoDataOutputSampleBufferDelegate,
 @property(strong, nonatomic)
     NSObject<FLTAssetWriterInputPixelBufferAdaptor> *assetWriterPixelBufferAdaptor;
 @property(strong, nonatomic) AVCaptureVideoDataOutput *videoOutput;
-@property(assign, nonatomic) BOOL isAudioSetup;
 
 /// A wrapper for CMVideoFormatDescriptionGetDimensions.
 /// Allows for alternate implementations in tests.
 @property(nonatomic, copy) VideoDimensionsForFormat videoDimensionsForFormat;
-/// A wrapper for AVCaptureDevice creation to allow for dependency injection in tests.
-@property(nonatomic, copy) AudioCaptureDeviceFactory audioCaptureDeviceFactory;
-/// Reports the given error message to the Dart side of the plugin.
-///
-/// Can be called from any thread.
-- (void)reportErrorMessage:(NSString *)errorMessage;
+
 @end
 
 @implementation FLTCam
@@ -308,139 +302,4 @@ - (BOOL)setCaptureSessionPreset:(FCPPlatformResolutionPreset)resolutionPreset
   return bestFormat;
 }
 
-- (void)startImageStreamWithMessenger:(NSObject<FlutterBinaryMessenger> *)messenger
-                           completion:(void (^)(FlutterError *))completion {
-  [self startImageStreamWithMessenger:messenger
-                   imageStreamHandler:[[FLTImageStreamHandler alloc]
-                                          initWithCaptureSessionQueue:_captureSessionQueue]
-                           completion:completion];
-}
-
-- (void)startImageStreamWithMessenger:(NSObject<FlutterBinaryMessenger> *)messenger
-                   imageStreamHandler:(FLTImageStreamHandler *)imageStreamHandler
-                           completion:(void (^)(FlutterError *))completion {
-  if (!_isStreamingImages) {
-    id<FLTEventChannel> eventChannel = [FlutterEventChannel
-        eventChannelWithName:@"plugins.flutter.io/camera_avfoundation/imageStream"
-             binaryMessenger:messenger];
-    FLTThreadSafeEventChannel *threadSafeEventChannel =
-        [[FLTThreadSafeEventChannel alloc] initWithEventChannel:eventChannel];
-
-    _imageStreamHandler = imageStreamHandler;
-    __weak typeof(self) weakSelf = self;
-    [threadSafeEventChannel setStreamHandler:_imageStreamHandler
-                                  completion:^{
-                                    typeof(self) strongSelf = weakSelf;
-                                    if (!strongSelf) {
-                                      completion(nil);
-                                      return;
-                                    }
-
-                                    dispatch_async(strongSelf.captureSessionQueue, ^{
-                                      // cannot use the outter strongSelf
-                                      typeof(self) strongSelf = weakSelf;
-                                      if (!strongSelf) {
-                                        completion(nil);
-                                        return;
-                                      }
-
-                                      strongSelf.isStreamingImages = YES;
-                                      strongSelf.streamingPendingFramesCount = 0;
-                                      completion(nil);
-                                    });
-                                  }];
-  } else {
-    [self reportErrorMessage:@"Images from camera are already streaming!"];
-    completion(nil);
-  }
-}
-
-// This function, although slightly modified, is also in video_player_avfoundation.
-// Both need to do the same thing and run on the same thread (for example main thread).
-// Configure application wide audio session manually to prevent overwriting flag
-// MixWithOthers by capture session.
-// Only change category if it is considered an upgrade which means it can only enable
-// ability to play in silent mode or ability to record audio but never disables it,
-// that could affect other plugins which depend on this global state. Only change
-// category or options if there is change to prevent unnecessary lags and silence.
-static void upgradeAudioSessionCategory(AVAudioSessionCategory requestedCategory,
-                                        AVAudioSessionCategoryOptions options) {
-  NSSet *playCategories = [NSSet
-      setWithObjects:AVAudioSessionCategoryPlayback, AVAudioSessionCategoryPlayAndRecord, nil];
-  NSSet *recordCategories =
-      [NSSet setWithObjects:AVAudioSessionCategoryRecord, AVAudioSessionCategoryPlayAndRecord, nil];
-  NSSet *requiredCategories =
-      [NSSet setWithObjects:requestedCategory, AVAudioSession.sharedInstance.category, nil];
-  BOOL requiresPlay = [requiredCategories intersectsSet:playCategories];
-  BOOL requiresRecord = [requiredCategories intersectsSet:recordCategories];
-  if (requiresPlay && requiresRecord) {
-    requestedCategory = AVAudioSessionCategoryPlayAndRecord;
-  } else if (requiresPlay) {
-    requestedCategory = AVAudioSessionCategoryPlayback;
-  } else if (requiresRecord) {
-    requestedCategory = AVAudioSessionCategoryRecord;
-  }
-  options = AVAudioSession.sharedInstance.categoryOptions | options;
-  if ([requestedCategory isEqualToString:AVAudioSession.sharedInstance.category] &&
-      options == AVAudioSession.sharedInstance.categoryOptions) {
-    return;
-  }
-  [AVAudioSession.sharedInstance setCategory:requestedCategory withOptions:options error:nil];
-}
-
-- (void)setUpCaptureSessionForAudioIfNeeded {
-  // Don't setup audio twice or we will lose the audio.
-  if (!_mediaSettings.enableAudio || _isAudioSetup) {
-    return;
-  }
-
-  NSError *error = nil;
-  // Create a device input with the device and add it to the session.
-  // Setup the audio input.
-  NSObject<FLTCaptureDevice> *audioDevice = self.audioCaptureDeviceFactory();
-  NSObject<FLTCaptureInput> *audioInput =
-      [_captureDeviceInputFactory deviceInputWithDevice:audioDevice error:&error];
-  if (error) {
-    [self reportErrorMessage:error.description];
-  }
-  // Setup the audio output.
-  _audioOutput = [[AVCaptureAudioDataOutput alloc] init];
-
-  dispatch_block_t block = ^{
-    // Set up options implicit to AVAudioSessionCategoryPlayback to avoid conflicts with other
-    // plugins like video_player.
-    upgradeAudioSessionCategory(AVAudioSessionCategoryPlayAndRecord,
-                                AVAudioSessionCategoryOptionDefaultToSpeaker |
-                                    AVAudioSessionCategoryOptionAllowBluetoothA2DP |
-                                    AVAudioSessionCategoryOptionAllowAirPlay);
-  };
-  if (!NSThread.isMainThread) {
-    dispatch_sync(dispatch_get_main_queue(), block);
-  } else {
-    block();
-  }
-
-  if ([_audioCaptureSession canAddInput:audioInput]) {
-    [_audioCaptureSession addInput:audioInput];
-
-    if ([_audioCaptureSession canAddOutput:_audioOutput]) {
-      [_audioCaptureSession addOutput:_audioOutput];
-      _isAudioSetup = YES;
-    } else {
-      [self reportErrorMessage:@"Unable to add Audio input/output to session capture"];
-      _isAudioSetup = NO;
-    }
-  }
-}
-
-- (void)reportErrorMessage:(NSString *)errorMessage {
-  __weak typeof(self) weakSelf = self;
-  FLTEnsureToRunOnMainQueue(^{
-    [weakSelf.dartAPI reportError:errorMessage
-                       completion:^(FlutterError *error){
-                           // Ignore any errors, as this is just an event broadcast.
-                       }];
-  });
-}
-
 @end

packages/camera/camera_avfoundation/ios/camera_avfoundation/Sources/camera_avfoundation_objc/include/camera_avfoundation/FLTCam.h

Lines changed: 3 additions & 5 deletions
@@ -64,16 +64,14 @@ NS_ASSUME_NONNULL_BEGIN
 @property(readonly, nonatomic) FLTCamMediaSettingsAVWrapper *mediaSettingsAVWrapper;
 @property(readonly, nonatomic) FCPPlatformMediaSettings *mediaSettings;
 @property(nonatomic, copy) InputPixelBufferAdaptorFactory inputPixelBufferAdaptorFactory;
-@property(strong, nonatomic) AVCaptureAudioDataOutput *audioOutput;
+@property(assign, nonatomic) BOOL isAudioSetup;
+/// A wrapper for AVCaptureDevice creation to allow for dependency injection in tests.
+@property(nonatomic, copy) AudioCaptureDeviceFactory audioCaptureDeviceFactory;
 
 /// Initializes an `FLTCam` instance with the given configuration.
 /// @param error report to the caller if any error happened creating the camera.
 - (instancetype)initWithConfiguration:(FLTCamConfiguration *)configuration error:(NSError **)error;
 
-- (void)startImageStreamWithMessenger:(NSObject<FlutterBinaryMessenger> *)messenger
-                           completion:(nonnull void (^)(FlutterError *_Nullable))completion;
-- (void)setUpCaptureSessionForAudioIfNeeded;
-
 // Methods exposed for the Swift DefaultCamera subclass
 - (void)updateOrientation;

packages/camera/camera_avfoundation/ios/camera_avfoundation/Sources/camera_avfoundation_objc/include/camera_avfoundation/FLTCam_Test.h

Lines changed: 0 additions & 5 deletions
@@ -31,9 +31,4 @@
 @property(readonly, nonatomic)
     NSMutableDictionary<NSNumber *, FLTSavePhotoDelegate *> *inProgressSavePhotoDelegates;
 
-/// Start streaming images.
-- (void)startImageStreamWithMessenger:(NSObject<FlutterBinaryMessenger> *)messenger
-                   imageStreamHandler:(FLTImageStreamHandler *)imageStreamHandler
-                           completion:(void (^)(FlutterError *))completion;
-
 @end

packages/camera/camera_avfoundation/pubspec.yaml

Lines changed: 1 addition & 1 deletion
@@ -2,7 +2,7 @@ name: camera_avfoundation
 description: iOS implementation of the camera plugin.
 repository: https://github.com/flutter/packages/tree/main/packages/camera/camera_avfoundation
 issue_tracker: https://github.com/flutter/flutter/issues?q=is%3Aissue+is%3Aopen+label%3A%22p%3A+camera%22
-version: 0.9.21
+version: 0.9.21+1
 
 environment:
   sdk: ^3.6.0

0 commit comments