In the previous section we covered the client's DESCRIBE command; now let's continue with the remaining RTSP commands.
Sending the SETUP command tells the server that we are ready to establish a connection for data transfer. Afterwards, the client receives the stream over the RTP/RTCP ports negotiated earlier and saves or plays the data; in live555 terms, this job is done by a FileSink.
void createPeriodicOutputFiles() {
  // Create a filename suffix that notes the time interval that's being recorded:
  char periodicFileNameSuffix[100];
  snprintf(periodicFileNameSuffix, sizeof periodicFileNameSuffix, "-%05d-%05d",
           fileOutputSecondsSoFar, fileOutputSecondsSoFar + fileOutputInterval);
  createOutputFiles(periodicFileNameSuffix); // slice the output files by duration

  // Schedule an event for writing the next output file:
  periodicFileOutputTask =
    env->taskScheduler().scheduleDelayedTask(fileOutputInterval*1000000,
                                             (TaskFunc*)periodicFileOutputTimerHandler,
                                             (void*)NULL);
}
void createOutputFiles(char const* periodicFilenameSuffix) {
  char outFileName[1000]; // the output file gets created here

  if (outputQuickTimeFile || outputAVIFile) {
    if (periodicFilenameSuffix[0] == '\0') {
      // Normally (unless the '-P <interval-in-seconds>' option was given) we output to 'stdout':
      sprintf(outFileName, "stdout");
    } else {
      // Otherwise output to a type-specific file name, containing "periodicFilenameSuffix":
      char const* prefix = fileNamePrefix[0] == '\0' ? "output" : fileNamePrefix;
      snprintf(outFileName, sizeof outFileName, "%s%s.%s", prefix, periodicFilenameSuffix,
               outputAVIFile ? "avi" : generateMP4Format ? "mp4" : "mov");
    }

    // Wrap the stream into a QuickTime/MP4 file:
    if (outputQuickTimeFile) {
      qtOut = QuickTimeFileSink::createNew(*env, *session, outFileName, fileSinkBufferSize,
                                           movieWidth, movieHeight, movieFPS,
                                           packetLossCompensate, syncStreams,
                                           generateHintTracks, generateMP4Format);
      if (qtOut == NULL) {
        *env << "Failed to create a \"QuickTimeFileSink\" for outputting to \""
             << outFileName << "\": " << env->getResultMsg() << "\n";
        shutdown();
      } else {
        *env << "Outputting to the file: \"" << outFileName << "\"\n";
      }

      // Start data transfer: kick off the stream-receiving pipeline
      qtOut->startPlaying(sessionAfterPlaying, NULL);
    } else { // outputAVIFile
      aviOut = AVIFileSink::createNew(*env, *session, outFileName, fileSinkBufferSize,
                                      movieWidth, movieHeight, movieFPS,
                                      packetLossCompensate);
      if (aviOut == NULL) {
        *env << "Failed to create an \"AVIFileSink\" for outputting to \""
             << outFileName << "\": " << env->getResultMsg() << "\n";
        shutdown();
      } else {
        *env << "Outputting to the file: \"" << outFileName << "\"\n";
      }

      aviOut->startPlaying(sessionAfterPlaying, NULL);
    }
  }
}
With the FileSink created, the next step is simply to send the PLAY command, telling the server to start data transfer.
// Send the PLAY command: one overload takes relative times, the other absolute times
void startPlayingSession(MediaSession* session, double start, double end, float scale,
                         RTSPClient::responseHandler* afterFunc) {
  ourRTSPClient->sendPlayCommand(*session, afterFunc, start, end, scale, ourAuthenticator);
}

void startPlayingSession(MediaSession* session, char const* absStartTime,
                         char const* absEndTime, float scale,
                         RTSPClient::responseHandler* afterFunc) {
  ourRTSPClient->sendPlayCommand(*session, afterFunc, absStartTime, absEndTime, scale,
                                 ourAuthenticator);
}
After that, the client continuously receives the RTP packets sent by the server and processes the data; finally, when it receives a BYE packet from the server (sent over RTCP), the whole flow ends.