How can I capture the microphone input stream while using the Agora RTC SDK on iOS (Flutter)?
What I want to do
Hi. I'm building a voice-call app that uses Google Speech-to-Text. To make this work, I need to continuously feed the microphone input stream to Google Speech-to-Text.
Problem
Unfortunately, the Agora RTC SDK and capturing the microphone input stream seem to conflict on iOS. Both work together fine on Android devices.
Environment
iPhone X (14.4.2)
❯ flutter doctor
Doctor summary (to see all details, run flutter doctor -v):
[✓] Flutter (Channel stable, 2.2.0, on macOS 11.2.3 20D91 darwin-x64, locale en-JP)
[✓] Android toolchain - develop for Android devices (Android SDK version 30.0.3)
[✓] Xcode - develop for iOS and macOS
[✓] Chrome - develop for the web
[✓] Android Studio (version 4.1)
[✓] VS Code (version 1.56.2)
[✓] Connected device (3 available)
• No issues found!
I followed these two projects: https://github.com/CasperPas/flutter-sound-stream/tree/master/example to capture the microphone input stream, and https://github.com/Meherdeep/Agora-Voice-Bot to initialize Agora RTC.
Code
main.dart (insert your own Agora App ID and token). This example doesn't use RtcEngineEventHandler yet, but I want to start speech-to-text only from the joinChannelSuccess callback of RtcEngineEventHandler.
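For reference, the flow I'm aiming for looks roughly like this. This is only a sketch against the 1.x static-callback API of agora_rtc_engine used in the code below (where the callback is a static field rather than an RtcEngineEventHandler object); the exact callback name and signature may differ in newer SDK versions, and the speech-to-text hookup is left as a placeholder:

```dart
// Sketch: start capturing the mic only once the channel join has
// succeeded. Assumes the 1.x static-callback style
// (AgoraRtcEngine.onJoinChannelSuccess); newer agora_rtc_engine
// versions register an RtcEngineEventHandler object instead.
Future<void> initialize() async {
  const String appID = 'putYourAgoraRtcAppID';
  await AgoraRtcEngine.create(appID);
  await AgoraRtcEngine.setChannelProfile(ChannelProfile.LiveBroadcasting);

  AgoraRtcEngine.onJoinChannelSuccess =
      (String channel, int uid, int elapsed) {
    // Only start recording once we are actually in the channel;
    // piping _recorder.audioStream into Google Speech-to-Text would
    // happen here (hypothetical, not implemented in this sample).
    _recorder.start();
  };

  await AgoraRtcEngine.joinChannel('putYourAgoraRtcToken', 'test', null, 0);
}
```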
import 'dart:async';
import 'dart:typed_data';
import 'package:flutter/material.dart';
import 'package:sound_stream/sound_stream.dart';
import 'package:agora_rtc_engine/agora_rtc_engine.dart';
import 'package:samplestream/utils/appID.dart';

void main() {
  runApp(MyApp());
}

class MyApp extends StatefulWidget {
  @override
  _MyAppState createState() => _MyAppState();
}

class _MyAppState extends State<MyApp> {
  RecorderStream _recorder = RecorderStream();
  PlayerStream _player = PlayerStream();

  List<Uint8List> _micChunks = [];
  bool _isRecording = false;
  bool _isPlaying = false;

  StreamSubscription _recorderStatus;
  StreamSubscription _playerStatus;
  StreamSubscription _audioStream;

  @override
  void initState() {
    super.initState();
    initPlugin();
    initialize();
  }

  @override
  void dispose() {
    _recorderStatus?.cancel();
    _playerStatus?.cancel();
    _audioStream?.cancel();
    // destroy sdk
    AgoraRtcEngine.leaveChannel();
    AgoraRtcEngine.destroy();
    super.dispose();
  }

  // Platform messages are asynchronous, so we initialize in an async method.
  Future<void> initPlugin() async {
    _recorderStatus = _recorder.status.listen((status) {
      if (mounted)
        setState(() {
          _isRecording = status == SoundStreamStatus.Playing;
        });
    });

    _audioStream = _recorder.audioStream.listen((data) {
      print(data);
      if (_isPlaying) {
        _player.writeChunk(data);
      } else {
        _micChunks.add(data);
      }
    });

    _playerStatus = _player.status.listen((status) {
      if (mounted)
        setState(() {
          _isPlaying = status == SoundStreamStatus.Playing;
        });
    });

    await Future.wait([
      _recorder.initialize(),
      _player.initialize(),
    ]);
  }

  void _play() async {
    await _player.start();

    if (_micChunks.isNotEmpty) {
      for (var chunk in _micChunks) {
        await _player.writeChunk(chunk);
      }
      _micChunks.clear();
    }
  }

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Scaffold(
        appBar: AppBar(
          title: const Text('Plugin example app'),
        ),
        body: Row(
          mainAxisAlignment: MainAxisAlignment.spaceAround,
          children: [
            IconButton(
              iconSize: 96.0,
              icon: Icon(_isRecording ? Icons.mic_off : Icons.mic),
              onPressed: _isRecording ? _recorder.stop : _recorder.start,
            ),
            IconButton(
              iconSize: 96.0,
              icon: Icon(_isPlaying ? Icons.pause : Icons.play_arrow),
              onPressed: _isPlaying ? _player.stop : _play,
            ),
          ],
        ),
      ),
    );
  }

  Future<void> initialize() async {
    const String appID = 'putYourAgoraRtcAppID';
    await AgoraRtcEngine.setChannelProfile(ChannelProfile.LiveBroadcasting);
    await AgoraRtcEngine.create(appID);
    // await AgoraRtcEngine.enableWebSdkInteroperability(true);
    await AgoraRtcEngine.joinChannel('putYourAgoraRtcToken', 'test', null, 0);
  }
}
pubspec.yaml
name: samplestream
description: Demonstrates how to use the sound_stream plugin.
publish_to: 'none'

environment:
  sdk: ">=2.7.0 <3.0.0"

dependencies:
  flutter:
    sdk: flutter
  sound_stream: ^0.2.0
  cupertino_icons: ^0.1.3
  web_socket_channel: ^1.1.0
  agora_rtc_engine: ^1.0.12

dev_dependencies:
  flutter_test:
    sdk: flutter

flutter:
  uses-material-design: true
Log
Performing hot restart...
Restarted application in 610ms.
plugin handleMethodCall: setChannelProfile,argus: {
profile = 1;
}
plugin handleMethodCall: create,argus: {
appId = b7...c;
}
plugin handleMethodCall: joinChannel,argus: {
channelId = test;
info = "<null>";
token = "006b7...hg";
uid = 0;
}
[avae] AVAEInternal.h:76 required condition is false: [AVAudioIONodeImpl.mm:1158:SetOutputFormat: (format.sampleRate == hwFormat.sampleRate)]
*** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio',reason: 'required condition is false: format.sampleRate == hwFormat.sampleRate'
*** First throw call stack:
(0x194d319d8 0x1a90b4b54 0x194c4050c 0x1a4c97984 0x1a4d41998 0x1a4cf5d24 0x1a4cddb3c 0x1a4d5ade8 0x1a4d3c1e4 0x1015d3d20 0x1015d4644 0x1015d11d0 0x1015d1920 0x101d2ad60 0x101a3c07c 0x101d45370 0x101cdf4d4 0x101ce1cfc 0x194cb23e0 0x194cb1fe4 0x194cb14c4 0x194cab850 0x194caaba0 0x1aba10598 0x19759c2f4 0x1975a1874 0x1005de114 0x194989568)
libc++abi.dylib: terminating with uncaught exception of type NSException
* thread #1,queue = 'com.apple.main-thread',stop reason = signal SIGABRT
frame #0: 0x00000001c0bf284c libsystem_kernel.dylib`__pthread_kill + 8
libsystem_kernel.dylib`__pthread_kill:
-> 0x1c0bf284c <+8>: b.lo 0x1c0bf2868 ; <+36>
0x1c0bf2850 <+12>: stp x29,x30,[sp,#-0x10]!
0x1c0bf2854 <+16>: mov x29,sp
0x1c0bf2858 <+20>: bl 0x1c0bcff5c ; cerror_nocancel
Target 0: (Runner) stopped.
Lost connection to device.
Thanks for reading!