As the Internet has developed, live broadcasting has become part of everyday life, and the live streaming function is what makes a broadcast work. PHP is a powerful tool for web application development, and it can be applied to live streaming as well. This article introduces how to implement the live streaming function with PHP.
1. Understand the basic principles of live streaming
Before implementing the live streaming function, you need to understand its basic principles. Live streaming is the process of uploading a broadcaster's video data to a server and distributing it to other users. It consists of two basic links: stream upload and stream distribution.
Stream upload means receiving and storing the video data uploaded by the broadcaster, and transcoding it in real time so it can be played on different terminal devices. Stream distribution means delivering the processed video data to viewers; it is usually divided into two approaches: server distribution and P2P distribution.
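To make the two links concrete, the sketch below shows a hypothetical ingest endpoint (all names and paths are illustrative, not from the original): each POST request carries one recorded chunk, which is appended to a per-stream spool file that a separate distribution process can transcode and relay.

<?php
// upload.php - minimal, hypothetical ingest endpoint for chunked uploads.
$streamId = preg_replace('/[^A-Za-z0-9_-]/', '', $_GET['stream'] ?? 'default');
$chunk = file_get_contents('php://input');   // raw body of this POST
if ($chunk !== false && $chunk !== '') {
    // Append the chunk; the distribution side tails this file.
    file_put_contents("/var/spool/live/{$streamId}.ts", $chunk, FILE_APPEND | LOCK_EX);
}
http_response_code(204);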
2. Use PHP to implement live stream upload
In PHP, several libraries can be used to implement stream upload; commonly used ones include FFmpeg, Libav and the H264 Streaming Module. These libraries are written in C/C++, and PHP can call into them through its FFI (Foreign Function Interface) extension. The implementation process is as follows:
1. Install the FFI extension
To use FFI in PHP, you first need to install the extension. You can follow the installation notes on PHP's official website (https://www.php.net/manual/zh/ffi.installation.php) or install it with your system's package manager. After the installation is complete, add the following line to the php.ini file:
extension=ffi.so
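Once enabled, FFI can be verified with a quick smoke test; the printf binding below is the example from the PHP manual and assumes a Linux libc. Also note the ffi.enable ini setting: its default, "preload", restricts FFI to preloaded scripts, so set ffi.enable=true during development to call FFI from ordinary scripts.

<?php
if (!extension_loaded('ffi')) {
    exit("FFI extension is not loaded; check extension=ffi.so in php.ini\n");
}
// Bind a libc function and call it as a smoke test (Linux library name assumed).
$libc = FFI::cdef('int printf(const char *format, ...);', 'libc.so.6');
$libc->printf("FFI works: %s\n", 'hello');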
2. Use FFmpeg to implement video encoding
To encode video data with FFmpeg, the stream is processed frame by frame: each raw frame is converted to the encoder's pixel format and then encoded. The implementation process is as follows:
use FFI;

// NOTE: this is an abbreviated sketch. The struct typedefs below are opaque,
// so field access such as $codec_ctx->width, $frame->data or $pkt->data only
// works once the full struct definitions from your exact FFmpeg version's
// headers are pasted into the cdef() string. FFI::cdef() also binds a single
// library per instance; in a real project create one instance each for
// libavcodec (e.g. libavcodec.so.58.54.100), libavutil and libswscale, and
// split the declarations accordingly.
$ffi = FFI::cdef('
    typedef struct AVRational { int num; int den; } AVRational;
    typedef struct AVCodecContext AVCodecContext;
    typedef struct AVPacket AVPacket;
    typedef struct AVFrame AVFrame;
    typedef struct AVCodec AVCodec;
    typedef struct AVDictionary AVDictionary;
    typedef struct SwsContext SwsContext;
    typedef struct SwsFilter SwsFilter;

    AVCodec *avcodec_find_encoder(int id);
    AVCodecContext *avcodec_alloc_context3(const AVCodec *codec);
    int avcodec_open2(AVCodecContext *avctx, const AVCodec *codec, AVDictionary **options);
    int avcodec_send_frame(AVCodecContext *avctx, const AVFrame *frame);
    int avcodec_receive_packet(AVCodecContext *avctx, AVPacket *avpkt);
    AVPacket *av_packet_alloc(void);
    void av_packet_unref(AVPacket *pkt);
    AVFrame *av_frame_alloc(void);
    int av_frame_get_buffer(AVFrame *frame, int align);
    int av_image_fill_arrays(uint8_t *dst_data[4], int dst_linesize[4],
                             const uint8_t *src, int pix_fmt,
                             int width, int height, int align);
    SwsContext *sws_getContext(int srcW, int srcH, int srcFormat,
                               int dstW, int dstH, int dstFormat, int flags,
                               SwsFilter *srcFilter, SwsFilter *dstFilter,
                               const double *param);
    int sws_scale(SwsContext *c, const uint8_t *const srcSlice[],
                  const int srcStride[], int srcSliceY, int srcSliceH,
                  uint8_t *const dst[], const int dstStride[]);
', 'libavcodec.so.58.54.100');

// Constant values copied from the FFmpeg 4.x headers; verify against your build.
const AV_CODEC_ID_H264 = 27;
const AV_PIX_FMT_YUV420P = 0;
const AV_PIX_FMT_RGB24 = 2;
const AV_CODEC_FLAG_GLOBAL_HEADER = 1 << 22;
const FF_PROFILE_H264_BASELINE = 66;
const SWS_FAST_BILINEAR = 1;

$width = 640;
$height = 480;
$frame_rate = 25;
$bit_rate = 400000;

$codec = $ffi->avcodec_find_encoder(AV_CODEC_ID_H264);
$codec_ctx = $ffi->avcodec_alloc_context3($codec);
$codec_ctx->width = $width;
$codec_ctx->height = $height;
$codec_ctx->pix_fmt = AV_PIX_FMT_YUV420P;
$codec_ctx->bit_rate = $bit_rate;
$codec_ctx->time_base->num = 1;
$codec_ctx->time_base->den = $frame_rate;
$codec_ctx->gop_size = $frame_rate * 2;   // one keyframe every two seconds
$codec_ctx->max_b_frames = 1;
$codec_ctx->rc_buffer_size = $bit_rate;
$codec_ctx->rc_max_rate = $bit_rate;
$codec_ctx->rc_min_rate = $bit_rate;
$codec_ctx->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;
$codec_ctx->profile = FF_PROFILE_H264_BASELINE;
$codec_ctx->level = 30;

if ($ffi->avcodec_open2($codec_ctx, $codec, null) < 0) {
    throw new Exception('Cannot init encoder');
}

$pkt = $ffi->av_packet_alloc();
$frame = $ffi->av_frame_alloc();
$frame->format = AV_PIX_FMT_YUV420P;
$frame->width = $width;
$frame->height = $height;
$ffi->av_frame_get_buffer($frame, 32);

$sws = $ffi->sws_getContext(
    $width, $height, AV_PIX_FMT_RGB24,
    $width, $height, AV_PIX_FMT_YUV420P,
    SWS_FAST_BILINEAR, null, null, null);

$rgb_size = $width * $height * 3;
$rgb_buffer = $ffi->new("uint8_t[$rgb_size]");
$src_data = $ffi->new('uint8_t*[4]');
$src_linesize = $ffi->new('int[4]');

$i = 0;
while ($i < 120) {
    $i++;
    // ... fill $rgb_buffer with this frame's RGB24 pixels (the original used
    // GD to prepare the frame; code omitted) ...
    $ffi->av_image_fill_arrays($src_data, $src_linesize,
        $ffi->cast('uint8_t *', $rgb_buffer),
        AV_PIX_FMT_RGB24, $width, $height, 1);

    // Convert RGB24 -> YUV420P directly into the encoder frame.
    $ffi->sws_scale($sws, $src_data, $src_linesize, 0, $height,
        $frame->data, $frame->linesize);
    $frame->pts = $i;

    if ($ffi->avcodec_send_frame($codec_ctx, $frame) < 0) {
        throw new Exception('Cannot encode video frame');
    }
    while ($ffi->avcodec_receive_packet($codec_ctx, $pkt) >= 0) {
        // Push the encoded data in $pkt->data ($pkt->size bytes) to the
        // server as a stream (code omitted in the original).
        $ffi->av_packet_unref($pkt);
    }
}
The above code uses FFmpeg to convert and encode the RGB24 pixel data, producing H.264 packets to push to the server. To use a different encoder, replace AV_CODEC_ID_H264 (and the matching profile settings) in the code above.
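The push step itself is elided above. As a minimal sketch of one option (the ingest address is an assumption, and production setups more commonly push RTMP to a media server such as nginx-rtmp), the encoded packets can be copied out of FFI memory and written to a plain TCP socket:

// Connect once, before the encoding loop; host and port are illustrative.
$sock = stream_socket_client('tcp://ingest.example.com:9000', $errno, $errstr, 5);
if ($sock === false) {
    throw new RuntimeException("Connect failed: $errstr ($errno)");
}
// Then, inside the avcodec_receive_packet() loop, before av_packet_unref():
fwrite($sock, FFI::string($pkt->data, $pkt->size));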
3. Use PHP to implement live stream distribution
There are several ways to implement stream distribution with PHP, the two main ones being server distribution and P2P distribution. In server distribution, the processed video data is streamed to a server, which can transcode it again before relaying it to viewers; in P2P distribution, video data is sent directly between users, typically over the UDP protocol.
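As a minimal sketch of the P2P path (the peer addresses are illustrative, and a real implementation needs peer discovery, sequencing and loss handling on top), one encoded chunk can be fanned out to known peers over UDP:

<?php
// Send one chunk of encoded video to each known peer over UDP.
$peers = [['203.0.113.10', 5004], ['203.0.113.11', 5004]];
$sock = socket_create(AF_INET, SOCK_DGRAM, SOL_UDP);
$chunk = '...'; // one encoded packet; keep it under ~1400 bytes to avoid IP fragmentation
foreach ($peers as [$ip, $port]) {
    socket_sendto($sock, $chunk, strlen($chunk), 0, $ip, $port);
}
socket_close($sock);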
The following is the code to implement server distribution through the Ratchet framework:
use Ratchet\Http\HttpServer;
use Ratchet\Server\IoServer;
use Ratchet\WebSocket\WsServer;
use App\VideoServer;

require dirname(__DIR__) . '/vendor/autoload.php';

$server = IoServer::factory(new HttpServer(new WsServer(new VideoServer())), 8080);
$server->run();
use Ratchet\ConnectionInterface;
use Ratchet\MessageComponentInterface;
use React\ChildProcess\Process;
use React\EventLoop\Factory;
use React\EventLoop\LoopInterface;

class VideoServer implements MessageComponentInterface
{
    protected $clients;
    /** @var LoopInterface */
    protected $loop;
    protected $sourceUrl;
    /** @var Process|null */
    protected $sourceRenderer;

    public function __construct()
    {
        $this->clients = new \SplObjectStorage();
        // NB: in practice this must be the same loop that IoServer runs,
        // e.g. created once and passed to both; a second loop created here
        // is never run by IoServer::factory().
        $this->loop = Factory::create();
        $this->sourceUrl = '';
        $this->sourceRenderer = null;
    }

    public function onOpen(ConnectionInterface $conn)
    {
        $this->clients->attach($conn);
        if ($this->sourceRenderer === null) {
            $conn->send('Waiting for source video stream...');
        }
    }

    public function onMessage(ConnectionInterface $from, $msg)
    {
        if ($this->sourceRenderer !== null) {
            return;
        }
        // Treat the first message as the source URL: spawn FFmpeg to pull it,
        // transcode to low-latency H.264 and emit an MPEG-TS byte stream.
        $this->sourceUrl = trim($msg);
        $this->sourceRenderer = new Process(
            'ffmpeg -i ' . escapeshellarg($this->sourceUrl) .
            ' -c:v libx264 -preset superfast -tune zerolatency' .
            ' -b:v 400k -pix_fmt yuv420p -r 30 -f mpegts -'
        );
        $this->sourceRenderer->start($this->loop);
        // Fan FFmpeg's stdout out to every connected client.
        $this->sourceRenderer->stdout->on('data', function ($chunk) {
            foreach ($this->clients as $client) {
                $client->send($chunk);
            }
        });
    }

    public function onClose(ConnectionInterface $conn)
    {
        $this->clients->detach($conn);
        // Stop FFmpeg when the last viewer disconnects.
        if ($this->clients->count() === 0 && $this->sourceRenderer !== null) {
            $this->sourceRenderer->terminate();
            $this->sourceRenderer = null;
        }
    }

    public function onError(ConnectionInterface $conn, \Exception $e)
    {
        $conn->close();
    }
}
The above code implements a WebSocket server with the Ratchet framework. When the first client message arrives carrying a source URL, the server starts an FFmpeg child process to transcode the video stream and pushes the transcoded bytes to users over WebSocket. If multiple users are connected, the server fans the processed stream out to each of them in real time.
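A short usage sketch, assuming the separate ratchet/pawl client package (the source URL is illustrative): connect to the server above, send a source URL so FFmpeg starts pulling it, and append the received MPEG-TS chunks to a file.

<?php
require __DIR__ . '/vendor/autoload.php';

\Ratchet\Client\connect('ws://127.0.0.1:8080')->then(
    function ($conn) {
        // The first message tells VideoServer which source to transcode.
        $conn->send('rtmp://source.example.com/live/stream1');
        $conn->on('message', function ($msg) {
            // Each message is one MPEG-TS chunk from the server's FFmpeg process.
            file_put_contents('out.ts', (string) $msg, FILE_APPEND);
        });
    },
    function (\Exception $e) {
        echo "Could not connect: {$e->getMessage()}\n";
    }
);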
4. Summary
This article introduced how to implement the live streaming function with PHP: the video data is encoded by calling FFmpeg through FFI, the encoded video data is pushed to a server as a stream, and the server distributes the processed stream to users in real time through the Ratchet framework. Together these techniques cover the core of an efficient live streaming pipeline and can help developers build live streaming applications.