CallKit and WebRTC: no audio when accepting a call from the lock screen
Asked: 2019-07-19 13:14:57

Question: I am trying to get CallKit to work with WebRTC for incoming calls, but when I receive a call and accept it from the lock screen there is no audio until I bring the app to the foreground. I have configured the audio session and forward the notifications to RTCAudioSession, but it does not work. Do you have a workaround?
func configureAudioSession() {
    let sharedSession = AVAudioSession.sharedInstance()
    do {
        try sharedSession.setCategory(AVAudioSessionCategoryPlayAndRecord, mode: AVAudioSessionModeVideoChat, options: .mixWithOthers)
        try sharedSession.setMode(AVAudioSessionModeVideoChat)
        // try sharedSession.setAggregatedIOPreference(AVAudioSessionIOType.aggregated)
    } catch {
        debugPrint("Failed to configure `AVAudioSession`")
    }
}
func handleIncomingCall(spaceName: String) {
    if callUUID != nil {
        oldCallUUID = callUUID
    }
    callUUID = UUID()
    print("CallManager handle uuid = \(callUUID?.description)")

    let update = CXCallUpdate()
    update.hasVideo = true
    update.remoteHandle = CXHandle(type: .generic, value: spaceName)

    self.configureAudioSession()
    provider?.reportNewIncomingCall(with: callUUID!, update: update, completion: { error in
        print("CallManager report new incoming call completion")
    })
}
func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
    print("CallManager didActivate")
    RTCAudioSession.sharedInstance().audioSessionDidActivate(audioSession)
    RTCAudioSession.sharedInstance().isAudioEnabled = true
    self.callDelegate?.callIsAnswered()
}

func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
    print("CallManager didDeactivate")
    RTCAudioSession.sharedInstance().audioSessionDidDeactivate(audioSession)
    RTCAudioSession.sharedInstance().isAudioEnabled = false
}
Answer 1:

OK, I found the cause of the problem. On iOS 12 there is an issue with WebRTC: when you start WebRTC from the lock screen and try to access the camera, the audio output breaks. So the solution is to check whether the screen is active, and if it is not, do not request the camera and do not add the local RTCVideoTrack to your RTCStream yet.
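A minimal sketch of that idea, not the answerer's actual code: it assumes you decide "screen active" yourself (for example from the application state, or using UIApplication.shared.isProtectedDataAvailable as a rough proxy for the device being unlocked), and that `factory` and `stream` are your existing WebRTC factory and stream objects.

import WebRTC

// Hypothetical helper: always attach audio, but only attach the local video
// track when the screen is active, to avoid the iOS 12 lock-screen audio break.
func attachLocalTracks(to stream: RTCMediaStream,
                       factory: RTCPeerConnectionFactory,
                       isScreenActive: Bool) {
    // Audio can always be added; it works from the lock screen.
    let audioConstraints = RTCMediaConstraints(mandatoryConstraints: nil, optionalConstraints: nil)
    let audioSource = factory.audioSource(with: audioConstraints)
    stream.addAudioTrack(factory.audioTrack(with: audioSource, trackId: "audio0"))

    // Creating the camera capture / local video track while the device is
    // locked is what breaks the audio output, so defer it.
    guard isScreenActive else { return }
    let videoSource = factory.videoSource()
    stream.addVideoTrack(factory.videoTrack(with: videoSource, trackId: "video0"))
}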
Comments:
How do we check whether the screen is active?

Hello my hero, I have a CallScreen that contains the RTC functionality (it is only started from there). When the phone is locked I use CallKit to show the call, and when the user accepts it I present the CallScreen from the window root view. So I want to ask: when the screen is locked and the user swipes to answer the call but does not tap the button that opens the CallScreen, how do I start the RTC? Thanks.

You just add the audio track to your video-view stub, and when the app is started on screen, add the video track to …
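A rough sketch of what the last comment hints at, under the assumption that the call starts audio-only from the lock screen and the local video track is attached once the app becomes active; `addLocalVideoTrack()` is a hypothetical stand-in for whatever method your WebRTC handler uses to create and add the camera track.

import UIKit

final class CallScreenPresenter {
    private var didBecomeActiveObserver: NSObjectProtocol?

    // Called when the call is answered from the lock screen: audio only for now.
    func startCallAudioOnly() {
        didBecomeActiveObserver = NotificationCenter.default.addObserver(
            forName: UIApplication.didBecomeActiveNotification,
            object: nil,
            queue: .main
        ) { [weak self] _ in
            // The app is now in the foreground with an active screen,
            // so it is safe to request the camera and add the local video track.
            self?.addLocalVideoTrack()
            if let observer = self?.didBecomeActiveObserver {
                NotificationCenter.default.removeObserver(observer)
            }
        }
    }

    private func addLocalVideoTrack() {
        // Hypothetical: create the RTCCameraVideoCapturer / RTCVideoTrack here
        // and add it to the existing stream or peer connection.
    }
}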
Answer 2:

What iOS version is your test iPhone running?

Comments:
It is 12.3.1, on an iPhone XS Max with two SIM cards.

Answer 3:

Please note that I am sharing code that fits my own needs; I share it for reference, and you will need to change it according to your requirements.
When you receive the VoIP notification, create a new instance of your WebRTC handling class and add these two lines to that code block, because enabling the audio session directly from the VoIP notification fails:
RTCAudioSession.sharedInstance().useManualAudio = true
RTCAudioSession.sharedInstance().isAudioEnabled = false
In the didReceive method:
func pushRegistry(_ registry: PKPushRegistry, didReceiveIncomingPushWith payload: PKPushPayload, for type: PKPushType, completion: @escaping () -> Void) {
    let state = UIApplication.shared.applicationState

    if payload.dictionaryPayload["hangup"] == nil && state != .active {
        Globals.voipPayload = payload.dictionaryPayload as! [String: Any] // I pass parameters to the WebRTC handler via a Globals singleton, to create the answer according to the SDP sent in the payload.

        RTCAudioSession.sharedInstance().useManualAudio = true
        RTCAudioSession.sharedInstance().isAudioEnabled = false

        Globals.sipGateway = SipGateway() // my WebRTC and Janus gateway handler class
        Globals.sipGateway?.configureCredentials(true) // I check the Janus gateway credentials stored in shared preferences, initiate the websocket connection and create the peer connection to my Janus gateway, which is the signaling server in my environment

        initProvider() // Creating the CallKit provider

        self.update.remoteHandle = CXHandle(type: .generic, value: String(describing: payload.dictionaryPayload["caller_id"]!))
        Globals.callId = UUID()

        Globals.provider.reportNewIncomingCall(with: Globals.callId, update: self.update, completion: { error in
        })
    }
}
func initProvider() {
    let config = CXProviderConfiguration(localizedName: "ulakBEL")
    config.iconTemplateImageData = UIImage(named: "ulakbel")!.pngData()
    config.ringtoneSound = "ringtone.caf"
    // config.includesCallsInRecents = false
    config.supportsVideo = false

    Globals.provider = CXProvider(configuration: config)
    Globals.provider.setDelegate(self, queue: nil)

    update = CXCallUpdate()
    update.hasVideo = false
    update.supportsDTMF = true
}
Modify your didActivate and didDeactivate delegate functions as below:
func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
    print("CallManager didActivate")
    RTCAudioSession.sharedInstance().audioSessionDidActivate(audioSession)
    RTCAudioSession.sharedInstance().isAudioEnabled = true
    // self.callDelegate?.callIsAnswered()
}

func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
    print("CallManager didDeactivate")
    RTCAudioSession.sharedInstance().audioSessionDidDeactivate(audioSession)
    RTCAudioSession.sharedInstance().isAudioEnabled = false
}
Configure the media senders and the audio session in the WebRTC handler class:
private func createPeerConnection(webRTCCallbacks: PluginHandleWebRTCCallbacksDelegate) {
    let rtcConfig = RTCConfiguration()
    rtcConfig.iceServers = server.iceServers
    rtcConfig.bundlePolicy = RTCBundlePolicy.maxBundle
    rtcConfig.rtcpMuxPolicy = RTCRtcpMuxPolicy.require
    rtcConfig.continualGatheringPolicy = .gatherContinually
    rtcConfig.sdpSemantics = .planB

    let constraints = RTCMediaConstraints(mandatoryConstraints: nil,
                                          optionalConstraints: ["DtlsSrtpKeyAgreement": kRTCMediaConstraintsValueTrue])
    pc = sessionFactory.peerConnection(with: rtcConfig, constraints: constraints, delegate: nil)

    self.createMediaSenders()
    self.configureAudioSession()

    if webRTCCallbacks.getJsep() != nil {
        handleRemoteJsep(webrtcCallbacks: webRTCCallbacks)
    }
}
Media senders:
private func createMediaSenders() {
    let streamId = "stream"

    // Audio
    let audioTrack = self.createAudioTrack()
    self.pc.add(audioTrack, streamIds: [streamId])

    // Video
    /*
    let videoTrack = self.createVideoTrack()
    self.localVideoTrack = videoTrack
    self.peerConnection.add(videoTrack, streamIds: [streamId])
    self.remoteVideoTrack = self.peerConnection.transceivers.first { $0.mediaType == .video }?.receiver.track as? RTCVideoTrack

    // Data
    if let dataChannel = createDataChannel() {
        dataChannel.delegate = self
        self.localDataChannel = dataChannel
    }
    */
}
private func createAudioTrack() -> RTCAudioTrack {
    let audioConstrains = RTCMediaConstraints(mandatoryConstraints: nil, optionalConstraints: nil)
    let audioSource = sessionFactory.audioSource(with: audioConstrains)
    let audioTrack = sessionFactory.audioTrack(with: audioSource, trackId: "audio0")
    return audioTrack
}
Audio session:
private func configureAudioSession() {
    self.rtcAudioSession.lockForConfiguration()
    do {
        try self.rtcAudioSession.setCategory(AVAudioSession.Category.playAndRecord.rawValue)
        try self.rtcAudioSession.setMode(AVAudioSession.Mode.voiceChat.rawValue)
    } catch let error {
        debugPrint("Error changing AVAudioSession category: \(error)")
    }
    self.rtcAudioSession.unlockForConfiguration()
}
Please take into account that, because I use callbacks and delegates, the code includes delegate and callback blocks; you can ignore them accordingly.
For reference, you can also check the sample at the link.