
AVAudioEngine: playing a stream of Int16 PCM samples

I'm receiving a stream of 16-bit / 48 kHz stereo PCM samples as Int16s and trying to play them with AVAudioEngine, but I can't hear any sound at all. I think it's either the way I set up the player or the way I'm pushing the data into the buffer.

I've read a lot about alternative solutions using Audio Queue Services, but all the sample code I can find is either Objective-C or iOS-only.

Even if I had a frameSize problem or something like that, shouldn't I still at least hear garbage coming out of the speakers?

Here is my code:


import Foundation
import AVFoundation

class VoicePlayer {
    
    var engine: AVAudioEngine
    
    let format = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: 48000.0, channels: 2, interleaved: true)!
    let playerNode: AVAudioPlayerNode!
    var audioSession: AVCaptureSession = AVCaptureSession()
    
    init() {
        
        self.audioSession = AVCaptureSession()
        
        self.engine = AVAudioEngine()
        self.playerNode = AVAudioPlayerNode()
        
        self.engine.attach(self.playerNode)
        //engine.connect(self.playerNode, to: engine.mainMixerNode, format: AVAudioFormat(standardFormatWithSampleRate: 48000, channels: 2))
        /* If I set my custom format here, AVFoundation complains about the format not being available */
        engine.connect(self.playerNode, to: engine.outputNode, format: AVAudioFormat(standardFormatWithSampleRate: 48000, channels: 2))
        engine.prepare()
        try! engine.start()
        self.playerNode.play()
        
    }
    
    
    
    
    func play(buffer: [Int16]) {
        let interleavedChannelCount = 2
        let frameLength = buffer.count / interleavedChannelCount
        let audioBuffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: AVAudioFrameCount(frameLength))!
        print("audio buffer size in frames is \(AVAudioFrameCount(frameLength))")
        // buffer contains 2 channel interleaved data
        // audioBuffer contains 2 channel interleaved data
        var buf = buffer
        let size = MemoryLayout<Int16>.stride * interleavedChannelCount * frameLength
        
        
        memcpy(audioBuffer.mutableAudioBufferList.pointee.mBuffers.mData, &buf, size)
        audioBuffer.frameLength = AVAudioFrameCount(frameLength)
        
        /* Implemented an AVAudioConverter for testing
         Input: 16 bit PCM 48kHz stereo interleaved
         Output: whatever the standard format for the system is

         Maybe this is somehow needed as my audio interface doesn't directly support 16 bit audio and can only run at 24 bit?
         */
        let normalBuffer = AVAudioPCMBuffer(pcmFormat: AVAudioFormat(standardFormatWithSampleRate: 48000, channels: 2)!, frameCapacity: AVAudioFrameCount(frameLength))
        normalBuffer?.frameLength = AVAudioFrameCount(frameLength)
        let converter = AVAudioConverter(from: format, to: AVAudioFormat(standardFormatWithSampleRate: 48000, channels: 2)!)
        var gotData = false

        let inputBlock: AVAudioConverterInputBlock = { inNumPackets, outStatus in
            if gotData {
                outStatus.pointee = .noDataNow
                return nil
            }
            gotData = true
            outStatus.pointee = .haveData
            return audioBuffer
        }

        var error: NSError? = nil
        let status: AVAudioConverterOutputStatus = converter!.convert(to: normalBuffer!, error: &error, withInputFrom: inputBlock)
         
        // Play the output buffer, in this case the audioBuffer, otherwise the normalBuffer
        // Playing the raw audio buffer causes an EXC_BAD_ACCESS on playback; playing back the buffer from the converter doesn't, but it still doesn't sound anything like a human voice
        self.playerNode.scheduleBuffer(audioBuffer) {
            print("Played")
        }
        
        
    }
    
    
}

Any help would be greatly appreciated.

Solution

After copying the data into the AVAudioPCMBuffer, you need to set its frameLength property to indicate how much valid audio it contains.

func play(buffer: [Int16]) {
    let interleavedChannelCount = 2
    let frameLength = buffer.count / interleavedChannelCount
    let audioBuffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: AVAudioFrameCount(frameLength))!

    // buffer contains 2 channel interleaved data
    // audioBuffer contains 2 channel interleaved data

    var buf = buffer
    memcpy(audioBuffer.mutableAudioBufferList.pointee.mBuffers.mData, &buf, MemoryLayout<Int16>.stride * interleavedChannelCount * frameLength)

    audioBuffer.frameLength = AVAudioFrameCount(frameLength)

    self.playerNode.scheduleBuffer(audioBuffer) {
        print("Played")
    }
}
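If the output hardware really does reject the Int16 format (as the question speculates), one option is to convert each incoming chunk to the engine's standard Float32 format before scheduling it. This is a sketch built from the converter code already in the question, not part of the original answer; everything except the AVFoundation APIs themselves (the function name, parameter names) is illustrative:

```swift
import AVFoundation

// Sketch: convert one chunk of interleaved Int16 samples to the converter's
// Float32 output format, then schedule it on the player node.
func scheduleConverted(_ samples: [Int16],
                       on player: AVAudioPlayerNode,
                       from inputFormat: AVAudioFormat,
                       using converter: AVAudioConverter) {
    let frames = AVAudioFrameCount(samples.count / Int(inputFormat.channelCount))
    let inBuffer = AVAudioPCMBuffer(pcmFormat: inputFormat, frameCapacity: frames)!
    samples.withUnsafeBytes { raw in
        memcpy(inBuffer.mutableAudioBufferList.pointee.mBuffers.mData, raw.baseAddress, raw.count)
    }
    inBuffer.frameLength = frames          // mark the buffer as containing valid audio

    let outBuffer = AVAudioPCMBuffer(pcmFormat: converter.outputFormat, frameCapacity: frames)!
    var fed = false
    var error: NSError?
    converter.convert(to: outBuffer, error: &error) { _, status in
        // Hand the converter exactly one input buffer per call, then signal "no data".
        if fed {
            status.pointee = .noDataNow
            return nil
        }
        fed = true
        status.pointee = .haveData
        return inBuffer
    }
    if error == nil {
        player.scheduleBuffer(outBuffer, completionHandler: nil)
    }
}
```

Note that `frameLength` is set on the input buffer before conversion for the same reason as in the answer above: the converter only reads as many frames as the buffer claims to contain.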

Edit: updated to reflect changes to the question. The old, now irrelevant part:

Part of the problem is that your formats are inconsistent. `format` is declared as non-interleaved, but `buffer` is a single Int16 array, so it presumably represents interleaved data. Copying one directly into the other is probably incorrect.
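To illustrate the mismatch: an interleaved stereo stream stores one frame at a time (L0 R0 L1 R1 …), while a non-interleaved "standard" format expects one contiguous buffer per channel. A minimal, framework-free sketch of splitting interleaved Int16 data into per-channel Float arrays, using the usual 1/32768 Int16-to-float scaling (the function name is illustrative):

```swift
// Sketch: deinterleave stereo Int16 samples into per-channel Float arrays,
// the layout a non-interleaved (standard) AVAudioFormat expects.
func deinterleave(_ samples: [Int16], channels: Int) -> [[Float]] {
    let frames = samples.count / channels
    var out = [[Float]](repeating: [Float](repeating: 0, count: frames), count: channels)
    for frame in 0..<frames {
        for ch in 0..<channels {
            // Interleaved index: frame * channels + channel
            out[ch][frame] = Float(samples[frame * channels + ch]) / 32768.0
        }
    }
    return out
}

let stereo: [Int16] = [0, 16384, -16384, 32767]   // L0 R0 L1 R1
let split = deinterleave(stereo, channels: 2)
// split[0] is the left channel, split[1] the right
```

Copying the interleaved array into a buffer whose format claims non-interleaved data produces exactly the "garbage that doesn't sound like a voice" symptom, because the channel boundaries land in the wrong places.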
