
H5 live video broadcast

大家讲道理 (Original)
2017-05-28 10:56:44

In order to keep up with the trend, this article introduces the basic process and main technical points of live video streaming, including but not limited to front-end technology.

1 Can H5 do live video streaming?

Of course. H5 has been popular for a long time and covers every part of the technology involved.

For video recording, you can use the powerful webRTC (Web Real-Time Communication), a technology that lets web browsers conduct real-time voice or video conversations. The disadvantage is that it is only well supported on desktop Chrome; mobile support is not ideal.

For video playback, you can use the HLS (HTTP Live Streaming) protocol to play the live stream. Both iOS and Android support this protocol natively. Configuration is simple: just use the video tag directly.

webRTC compatibility:

video tag plays hls protocol video:



<video controls autoplay>
    <source src="http://10.66.69.77:8080/hls/mystream.m3u8" type="application/vnd.apple.mpegurl" />
    <p class="warning">Your browser does not support HTML5 video.</p>
</video>


2 What is the HLS protocol?

To put it simply, HLS divides the whole stream into small, HTTP-based files to download, only a few of which are downloaded at a time. As mentioned earlier, playing live video in H5 involves a .m3u8 file; this file is defined by the HLS protocol and stores the video stream's metadata.

Each .m3u8 file references several ts files, which hold the actual video data; the m3u8 file itself only stores configuration information and the paths of those ts files. When the video is played, the .m3u8 file changes dynamically, and the video tag parses it to find the corresponding ts files to play. Therefore, to speed things up, the .m3u8 file is generally placed on the web server while the ts files are placed on a CDN.

A .m3u8 file is actually an m3u file encoded in UTF-8. It cannot be played by itself; it is just a text file that stores playback information:



#EXTM3U                     m3u file header
#EXT-X-MEDIA-SEQUENCE       sequence number of the first TS segment
#EXT-X-TARGETDURATION       maximum duration of each TS segment
#EXT-X-ENDLIST              m3u8 file end marker
#EXTINF                     duration (seconds) of each media segment (ts); valid only for the URI that follows it
mystream-12.ts


ts file:

The request process of HLS is:
1 http request m3u8 url.
2 The server returns an m3u8 playlist. This playlist is updated in real time. Generally, 5 data URLs are given at a time.
3 The client parses the m3u8 playlist, then requests the url of each segment in sequence to obtain the ts data stream.
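To make step 3 concrete, here is a minimal sketch (Python, not part of the original article) that pairs each #EXTINF duration with the segment URI that follows it; the sample playlist contents are illustrative:

```python
# Minimal HLS playlist parser: pairs each #EXTINF duration
# with the segment URI on the following line.
SAMPLE_M3U8 = """\
#EXTM3U
#EXT-X-MEDIA-SEQUENCE:12
#EXT-X-TARGETDURATION:5
#EXTINF:5.0,
mystream-12.ts
#EXTINF:5.0,
mystream-13.ts
"""

def parse_playlist(text):
    segments = []
    duration = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("#EXTINF:"):
            # "#EXTINF:5.0," -> 5.0
            duration = float(line[len("#EXTINF:"):].rstrip(",").split(",")[0])
        elif line and not line.startswith("#"):
            segments.append((line, duration))
            duration = None
    return segments

print(parse_playlist(SAMPLE_M3U8))
# -> [('mystream-12.ts', 5.0), ('mystream-13.ts', 5.0)]
```

A real client would then request each URI in order and re-fetch the playlist as it updates.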

Simple process:

3 HLS live broadcast delay

We know that the HLS protocol divides the live stream into short clips to download and play. Assuming the list contains 5 ts files and each ts file holds 5 seconds of video, the overall delay is 25 seconds: by the time you see these clips, the host has already recorded and uploaded them, which is where the delay comes from. Of course, you can shorten the length of the list and the duration of a single ts file to reduce latency; in the extreme, you can reduce the list length to 1 and the ts duration to 1 second. However, this increases the number of requests and the pressure on the server, and causes more buffering when the network is slow. Apple's officially recommended ts duration is 10s, which implies a delay of around 30s. Reference: https://developer.apple.com/library/ios/documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/FrequentlyAskedQuestions/FrequentlyAskedQuestions.html
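The latency arithmetic above fits in a couple of lines (Python, illustrative only): the worst-case delay is roughly the playlist length times the segment duration, since a viewer joining the stream starts from the oldest segment in the list:

```python
def hls_delay(list_length, ts_duration_s):
    # Rough worst-case HLS delay: the viewer starts at the oldest
    # segment in the list, so delay ~= list length * segment duration.
    return list_length * ts_duration_s

print(hls_delay(5, 5))    # the article's example: 5 segments of 5s -> 25
print(hls_delay(3, 10))   # 10s segments; the text's 30s figure implies a 3-item list
```

The second call assumes a three-segment list, which is what the 30-second figure in the text implies for 10-second segments.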

4 What is the entire process of live video streaming?

Live video can be roughly divided into:

1 Video recording end: usually the audio and video input devices on a computer, or the camera and microphone on a mobile phone. Currently, mobile phone video is the main source.

2 Video playback end: this can be a player on a computer, a native player on a mobile phone, or the h5 video tag. Currently, native players on mobile phones are the main ones.

3 Video server: usually an nginx server, used to accept the video source provided by the video recording end and provide streaming services to the video playback end.

Simple process:

5 How to collect audio and video?

Let’s first clarify a few concepts:

Video encoding: converting a file from one video format into another through a specific compression technology. The video we record on an iPhone must be encoded, uploaded, and decoded before it can actually be played in the user-side player.

Codec standards: the most important codec standards in video stream transmission are ITU's H.261, H.263, and H.264, among which the HLS protocol supports H.264 encoding.

Audio encoding: similar to video encoding, the original audio stream is encoded according to a certain standard, then uploaded, decoded, and played in the player. There are also many audio encoding standards, such as PCM, WMA, and AAC; the audio encoding supported by the HLS protocol is AAC.

The following uses the camera on iOS to collect audio and video data, which is mainly divided into the following steps:

1 Audio and video collection: on iOS, the raw audio and video data streams can be collected using AVCaptureSession and AVCaptureDevice.
2 Encoding: encode the video as H.264 and the audio as AAC; iOS has encapsulated encoding libraries for both.
3 Assemble the encoded audio and video data into packets.
4 Establish an RTMP connection and push the stream to the server.
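Step 3 above (assembling the encoded data into packets) can be illustrated with a minimal chunker. This is a Python sketch, not the iOS code; real RTMP packets also carry headers, timestamps, and stream IDs, and the 4000-byte size here just mirrors the chunk_size used in the server configuration later in the article:

```python
def packetize(data, chunk_size=4000):
    # Split an encoded audio/video byte stream into fixed-size chunks.
    # Real RTMP packetization also adds headers and timestamps; this
    # only illustrates the chunking step.
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

frame = bytes(10000)                # stand-in for one encoded H.264 frame
packets = packetize(frame)
print([len(p) for p in packets])    # -> [4000, 4000, 2000]
```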

PS: Since most encoding libraries are written in C, you need to compile them before use. For iOS, you can use already-compiled encoding libraries:

x264 encoding: https://github.com/kewlbear/x264-ios
faac encoding: https://github.com/fflydev/faac-ios-build
ffmpeg encoding: https://github.com/kewlbear/FFmpeg-iOS-build-script

If you want to add special effects to the video, such as filters, you generally apply the filter library before encoding; however, this costs some time and introduces a certain delay in uploading the video data.

Simple process:

6 What is the ffmpeg mentioned earlier?

Like the previous x264, ffmpeg is actually a set of encoding libraries. Similar to Xvid, Xvid is a codec based on the MPEG4 protocol, and x264 is an encoder based on the H.264 protocol. ffmpeg integrates various audio and video encoding and decoding protocols. By setting parameters, encoding and decoding based on MPEG4, H.264 and other protocols can be completed. The demo here uses the x264 encoding library.

7 What is RTMP?

Real Time Messaging Protocol (RTMP for short) is a set of video live broadcast protocols developed by Macromedia and now owned by Adobe. Like HLS, it can be applied to live video broadcasts. The difference is that RTMP is based on flash and cannot be played in iOS browsers, but its real-time performance is better than HLS. Therefore, this protocol is generally used to upload video streams, that is, the video streams are pushed to the server.

Here is a comparison between hls and rtmp:

8 Pushing the stream

So-called stream pushing means sending our encoded audio and video data to the video streaming server. RTMP is generally used. You can push the stream with the third-party library librtmp-iOS, which encapsulates some core APIs for users to call. If you find that troublesome, you can use a ready-made iOS video streaming SDK, also based on RTMP: https://github.com/runner365/LiveVideoCoreSDK

9 Push server setup

Setting up a simple push server: since the video streams we upload are based on the RTMP protocol, the server must also support RTMP. You probably need the following steps:

1 Install an nginx server.

2 Install nginx’s rtmp extension. Currently, the most commonly used one is https://github.com/arut/nginx-rtmp-module

3 Configure the nginx conf file:



rtmp {
    server {
        listen 1935;        # listening port
        chunk_size 4000;
        application hls {   # rtmp push request path
            hls on;
            hls_path /usr/local/var/www/hls;
            hls_fragment 5s;
        }
    }
}
4 Restart nginx and use the RTMP push address rtmp://ip:1935/hls/mystream, where hls_path is the storage location of the generated .m3u8 and ts files, hls_fragment is the slice duration, and mystream is an instance name; you can use any name you like for the files to be generated. For more configuration, see: https://github.com/arut/nginx-rtmp-module/wiki/

Based on the above steps, a video server that supports rtmp has basically been implemented.
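Given that configuration, the push address and the playback address share the application name (hls) and stream name (mystream). A small illustrative helper (Python; the host and ports are the article's example values) shows the mapping:

```python
def rtmp_push_url(ip, app, stream, port=1935):
    # Address the recording end pushes to (nginx-rtmp listens on 1935).
    return f"rtmp://{ip}:{port}/{app}/{stream}"

def hls_play_url(ip, app, stream, port=8080):
    # Address the H5 page plays; nginx serves the .m3u8 generated
    # under hls_path for the same app/stream names.
    return f"http://{ip}:{port}/{app}/{stream}.m3u8"

print(rtmp_push_url("10.66.69.77", "hls", "mystream"))
# -> rtmp://10.66.69.77:1935/hls/mystream
print(hls_play_url("10.66.69.77", "hls", "mystream"))
# -> http://10.66.69.77:8080/hls/mystream.m3u8
```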

10 How to play live video on an H5 page?

To put it simply, you can directly use the video tag to play the live video of the HLS protocol:



<video autoplay webkit-playsinline>
    <source src="http://10.66.69.77:8080/hls/mystream.m3u8" type="application/vnd.apple.mpegurl" />
    <p class="warning">Your browser does not support HTML5 video.</p>
</video>


Note the webkit-playsinline attribute added to the video tag. It allows the video to play inline (not fullscreen) in a UIWebView on iOS; by default iOS plays video fullscreen, and you also need to set allowsInlineMediaPlayback = YES on the UIWebView. The relatively mature video.js library can choose different strategies for different platforms, for example the video tag on iOS and Flash on PC.

11 Summary of pitfalls

Based on the steps above, the author wrote a demo implementing the complete process: iOS video recording and collection, uploading, an nginx server delivering the live stream, and playing the live video on an h5 page. The following pitfalls are summarized:

1 When using AVCaptureSession to capture video, you need to implement the AVCaptureVideoDataOutputSampleBufferDelegate protocol and capture the video stream in - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection. Note that it is the didOutputSampleBuffer method, not the didDropSampleBuffer method; the latter only triggers once. I wrote didDropSampleBuffer at first and wasted half a day before discovering the wrong method was being used.

2 When using RTMP to push the stream, the RTMP address must start with rtmp://, the IP address must be the actual IP rather than localhost, and the port number must be included, because the mobile phone cannot resolve localhost.

More pitfalls will be added here later; some require pasting code, so only these are listed for now.

12 Industry support

Currently, Tencent Cloud, Baidu Cloud, and Alibaba Cloud all have live video solutions, with a series of SDKs available covering everything from video recording to video playback and streaming. The disadvantage is that they are paid. If conditions permit, implementing one yourself is not difficult.

demo address: https://github.com/lvming6816077/LMVideoTest/


