
[Repost] Learning live555 -- H264 Data Processing, Part 1

In "Learning live555 (7) -- Handling the DESCRIBE Command", we described how a file is opened and its SDP information obtained. Here we take a closer look at how H264 data in particular is handled.

When the RTSPServer receives a DESCRIBE request for some medium, it finds the corresponding ServerMediaSession and calls ServerMediaSession::generateSDPDescription(). That function iterates over all of the ServerMediaSubsessions in the ServerMediaSession, obtains each subsession's SDP via subsession->sdpLines(), and concatenates them into one complete SDP description, which it returns. We can be fairly confident that opening and parsing the file happens inside each subsession's sdpLines(), so let's look at that function:

char const* OnDemandServerMediaSubsession::sdpLines()
{
    if (fSDPLines == NULL) {
        // We need to construct a set of SDP lines that describe this
        // subsession (as a unicast stream). To do so, we first create
        // dummy (unused) source and "RTPSink" objects,
        // whose parameters we use for the SDP lines:
        unsigned estBitrate;
        FramedSource* inputSource = createNewStreamSource(0, estBitrate);
        if (inputSource == NULL)
            return NULL; // file not found

        struct in_addr dummyAddr;
        dummyAddr.s_addr = 0;
        Groupsock dummyGroupsock(envir(), dummyAddr, 0, 0);
        unsigned char rtpPayloadType = 96 + trackNumber() - 1; // if dynamic
        RTPSink* dummyRTPSink = createNewRTPSink(&dummyGroupsock,
                rtpPayloadType, inputSource);

        setSDPLinesFromRTPSink(dummyRTPSink, inputSource, estBitrate);
        Medium::close(dummyRTPSink);
        closeStreamSource(inputSource);
    }

    return fSDPLines;
}
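Stepping back to the DESCRIBE handling described above: the way generateSDPDescription() concatenates every subsession's sdpLines() into one SDP can be pictured with a minimal, self-contained mock (MockSubsession and the free function below are invented for illustration; they are not live555's real API):

```cpp
#include <string>
#include <vector>

// Stand-in for a ServerMediaSubsession: each one knows its own media-level SDP.
struct MockSubsession {
    std::string sdp;                                   // what sdpLines() would return
    const std::string& sdpLines() const { return sdp; }
};

// Stand-in for ServerMediaSession::generateSDPDescription(): emit the
// session-level lines, then append each subsession's "m=..." block in order.
std::string generateSDPDescription(const std::vector<MockSubsession>& subs) {
    std::string sdp = "v=0\r\ns=MockSession\r\n";      // minimal session-level header
    for (const auto& s : subs)
        sdp += s.sdpLines();                           // one media description per track
    return sdp;
}
```

With a video and an audio subsession, the result contains one m= block per track, in the order the subsessions were added to the session.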

The subsession caches its media file's SDP directly, but on the first request fSDPLines is NULL, so it must first be generated. The way this is done is rather laborious: temporary dummy source and RTPSink objects are created, connected together, and "played" for a while before fSDPLines can be extracted. createNewStreamSource() and createNewRTPSink() are both virtual functions, so the source and sink created here are whatever the derived class specifies. Since we are analyzing H264, they are the ones specified by H264VideoFileServerMediaSubsession. Let's look at these two functions:

FramedSource* H264VideoFileServerMediaSubsession::createNewStreamSource(
        unsigned /*clientSessionId*/,
        unsigned& estBitrate)
{
    estBitrate = 500; // kbps, estimate

    // Create the video source:
    ByteStreamFileSource* fileSource = ByteStreamFileSource::createNew(envir(), fFileName);
    if (fileSource == NULL)
        return NULL;
    fFileSize = fileSource->fileSize();

    // Create a framer for the Video Elementary Stream:
    return H264VideoStreamFramer::createNew(envir(), fileSource);
}

RTPSink* H264VideoFileServerMediaSubsession::createNewRTPSink(
        Groupsock* rtpGroupsock,
        unsigned char rtpPayloadTypeIfDynamic,
        FramedSource* /*inputSource*/)
{
    return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
}
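The division of labor these two factory functions set up, a framer source that pulls raw bytes from an inner ByteStreamFileSource and emits whole NAL units, can be illustrated with a toy, self-contained sketch (frameNALUnits and the "001" start-code stand-in are invented for illustration; the real H264VideoStreamFramer is far more involved):

```cpp
#include <string>
#include <vector>

// Toy version of the framer idea: the inner source hands us an unstructured
// byte stream; the framer cuts it at start codes so the downstream consumer
// (the RTP sink) sees complete frames. Here the literal "001" stands in for
// the real three-byte H264 start code 0x00 0x00 0x01.
std::vector<std::string> frameNALUnits(const std::string& byteStream) {
    std::vector<std::string> frames;
    const std::string startCode = "001";
    size_t pos = byteStream.find(startCode);
    while (pos != std::string::npos) {
        size_t begin = pos + startCode.size();
        size_t next  = byteStream.find(startCode, begin);
        size_t end   = (next == std::string::npos) ? byteStream.size() : next;
        frames.push_back(byteStream.substr(begin, end - begin)); // one NAL unit
        pos = next;
    }
    return frames;
}
```

This is why the framer exists as a separate source: ByteStreamFileSource knows nothing about H264, and the framer layers the stream-structure knowledge on top of it.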

As you can see, an H264VideoStreamFramer and an H264VideoRTPSink are created, respectively. The H264VideoStreamFramer is clearly itself a source, yet internally it relies on yet another source, ByteStreamFileSource. We will analyze why later; ignore it for now. We still have not seen the code that actually opens the file, so let's keep digging:

void OnDemandServerMediaSubsession::setSDPLinesFromRTPSink(
        RTPSink* rtpSink,
        FramedSource* inputSource,
        unsigned estBitrate)
{
    if (rtpSink == NULL)
        return;

    char const* mediaType = rtpSink->sdpMediaType();
    unsigned char rtpPayloadType = rtpSink->rtpPayloadType();
    struct in_addr serverAddrForSDP;
    serverAddrForSDP.s_addr = fServerAddressForSDP;
    char* const ipAddressStr = strDup(our_inet_ntoa(serverAddrForSDP));
    char* rtpmapLine = rtpSink->rtpmapLine();
    char const* rangeLine = rangeSDPLine();
    char const* auxSDPLine = getAuxSDPLine(rtpSink, inputSource);
    if (auxSDPLine == NULL)
        auxSDPLine = "";

    char const* const sdpFmt = "m=%s %u RTP/AVP %d\r\n"
            "c=IN IP4 %s\r\n"
            "b=AS:%u\r\n"
            "%s"
            "%s"
            "%s"
            "a=control:%s\r\n";
    unsigned sdpFmtSize = strlen(sdpFmt) + strlen(mediaType) + 5 /* max short len */
            + 3 /* max char len */
            + strlen(ipAddressStr) + 20 /* max int len */
            + strlen(rtpmapLine) + strlen(rangeLine) + strlen(auxSDPLine)
            + strlen(trackId());
    char* sdpLines = new char[sdpFmtSize];
    sprintf(sdpLines, sdpFmt,
            mediaType,      // m= <media>
            fPortNumForSDP, // m= <port>
            rtpPayloadType, // m= <fmt list>
            ipAddressStr,   // c= address
            estBitrate,     // b=AS:<bandwidth>
            rtpmapLine,     // a=rtpmap:... (if present)
            rangeLine,      // a=range:... (if present)
            auxSDPLine,     // optional extra SDP line
            trackId());     // a=control:<track-id>
    delete[] (char*) rangeLine;
    delete[] rtpmapLine;
    delete[] ipAddressStr;

    fSDPLines = strDup(sdpLines);
    delete[] sdpLines;
}

This function obtains the subsession's SDP and saves it to fSDPLines. The actual opening of the file should happen inside rtpSink->rtpmapLine(), or perhaps even earlier, when the source was created. Let's set that aside for the moment and instead work through the SDP acquisition process thoroughly, focusing on getAuxSDPLine():

char const* OnDemandServerMediaSubsession::getAuxSDPLine(
        RTPSink* rtpSink,
        FramedSource* /*inputSource*/)
{
    // Default implementation:
    return rtpSink == NULL ? NULL : rtpSink->auxSDPLine();
}

Very simple: it calls rtpSink->auxSDPLine(), so next we would look at H264VideoRTPSink::auxSDPLine(). Actually there is no need; it simply takes the PPS, SPS, etc. saved in the source and forms an a=fmtp line. In practice, however, it is not that simple: H264VideoFileServerMediaSubsession overrides getAuxSDPLine()! If it were not overridden, that would mean the aux SDP line had already been obtained while parsing the file earlier; the fact that it is overridden means it was not, and can only be obtained inside this override. Here is the function in H264VideoFileServerMediaSubsession:

char const* H264VideoFileServerMediaSubsession::getAuxSDPLine(
        RTPSink* rtpSink,
        FramedSource* inputSource)
{
    if (fAuxSDPLine != NULL)
        return fAuxSDPLine; // it's already been set up (for a previous client)

    if (fDummyRTPSink == NULL) { // we're not already setting it up for another, concurrent stream
        // Note: For H264 video files, the 'config' information ("profile-level-id" and
        // "sprop-parameter-sets") isn't known until we start reading the file.
        // This means that "rtpSink"s "auxSDPLine()" will be NULL initially,
        // and we need to start reading data from our file until this changes.
        fDummyRTPSink = rtpSink;

        // Start reading the file:
        fDummyRTPSink->startPlaying(*inputSource, afterPlayingDummy, this);

        // Check whether the sink's 'auxSDPLine()' is ready:
        checkForAuxSDPLine(this);
    }

    envir().taskScheduler().doEventLoop(&fDoneFlag);

    return fAuxSDPLine;
}
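For context, the aux SDP line that all this machinery waits for is, in the H264 case, an a=fmtp line built from the profile-level-id and the base64-encoded SPS/PPS. A minimal sketch of assembling such a line (makeFmtpLine is invented here for illustration; in live555 the equivalent work happens inside H264VideoRTPSink::auxSDPLine(), and the parameter values in the example are placeholders, not real SPS/PPS data):

```cpp
#include <cstdio>
#include <string>

// Assemble an H264 "a=fmtp:" SDP line from already-extracted parameters.
// This helper is only a simplified illustration of the line's format.
std::string makeFmtpLine(unsigned payloadType,
                         const std::string& profileLevelId,
                         const std::string& spropParameterSets) {
    char buf[512];
    std::snprintf(buf, sizeof buf,
                  "a=fmtp:%u packetization-mode=1"
                  ";profile-level-id=%s"
                  ";sprop-parameter-sets=%s\r\n",
                  payloadType, profileLevelId.c_str(), spropParameterSets.c_str());
    return std::string(buf);
}
```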
The comment in getAuxSDPLine() explains it clearly: for H264 the PPS/SPS cannot be read from a file header; the file must actually be "played" for a while first (naturally, since it is a raw stream file with no header). In other words, they cannot be taken from the rtpSink right away. To guarantee the aux SDP is obtained before the function returns, the big event loop is moved into this function. afterPlayingDummy() runs when playback ends, i.e., after the aux SDP has been obtained. So what does checkForAuxSDPLine(), called before the event loop, actually do?
void H264VideoFileServerMediaSubsession::checkForAuxSDPLine1()
{
    char const* dasl;

    if (fAuxSDPLine != NULL) {
        // Signal the event loop that we're done:
        setDoneFlag();
    } else if (fDummyRTPSink != NULL
            && (dasl = fDummyRTPSink->auxSDPLine()) != NULL) {
        fAuxSDPLine = strDup(dasl);
        fDummyRTPSink = NULL;

        // Signal the event loop that we're done:
        setDoneFlag();
    } else {
        // try again after a brief delay:
        int uSecsToDelay = 100000; // 100 ms
        nextTask() = envir().taskScheduler().scheduleDelayedTask(uSecsToDelay,
                (TaskFunc*) checkForAuxSDPLine, this);
    }
}

It checks whether the aux SDP has already been obtained; if so, it sets the done flag and returns. If not, it checks whether the sink now has the aux SDP; if so, it likewise sets the done flag and returns. If it is still unavailable, the check function adds itself to the scheduler as a delayed task, retrying every 100 ms; each retry essentially just calls fDummyRTPSink->auxSDPLine() again. The event loop stops when it detects that fDoneFlag has changed, at which point the aux SDP is in hand. If, however, the end of the file is reached without ever obtaining the aux SDP, afterPlayingDummy() runs and stops the event loop from there. The parent subsession class then closes these temporary source and sink objects; they are recreated when real playback begins.
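The retry-until-ready pattern just described can be boiled down to a self-contained sketch (MiniScheduler and AuxSDPPoller are invented stand-ins for live555's TaskScheduler and the subsession; the real scheduler fires tasks after a timed delay rather than draining a plain queue):

```cpp
#include <functional>
#include <queue>
#include <string>

// Trivial stand-in for live555's TaskScheduler: tasks run in FIFO order and
// the "event loop" exits as soon as the caller's done-flag is set.
struct MiniScheduler {
    std::queue<std::function<void()>> tasks;
    void scheduleDelayedTask(std::function<void()> t) { tasks.push(std::move(t)); }
    void doEventLoop(const bool& doneFlag) {
        while (!doneFlag && !tasks.empty()) {
            std::function<void()> t = std::move(tasks.front());
            tasks.pop();
            t(); // in live555 this would fire only after the scheduled delay
        }
    }
};

// Mirrors checkForAuxSDPLine1(): poll for the aux SDP line, and if it is not
// ready yet, reschedule this same check as a delayed task.
struct AuxSDPPoller {
    MiniScheduler& sched;
    int attemptsUntilReady;   // simulates the framer needing time to see SPS/PPS
    std::string auxSDPLine;
    bool doneFlag = false;

    void check() {
        if (--attemptsUntilReady <= 0) {
            auxSDPLine = "a=fmtp:96 ..."; // the sink finally produced it
            doneFlag = true;              // setDoneFlag(): stops the event loop
        } else {
            // try again after a brief (here, simulated) delay:
            sched.scheduleDelayedTask([this] { check(); });
        }
    }
};
```

Calling check() once and then entering doEventLoop() reproduces the structure of getAuxSDPLine(): the loop keeps running rescheduled checks until doneFlag flips.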

http://blog.csdn.net/nkmnkm/article/details/6931400

