User:Joger/FFmpeg

FFprobe

FFprobe is used to inspect media, for example to view format or stream information. The following command line prints details about a file's streams, such as video and audio streams, as well as the container format.

ffprobe -v error -show_streams -show_format -print_format json <name_of_media>

Useful options

  • -v error suppresses FFprobe's banner and log messages, so only the requested information is printed.
  • -print_format json formats the output as JSON, which is easier to read and to parse programmatically.
  • -select_streams v prints only the video streams. The v can be replaced by a to show only audio streams or s to show only subtitle streams (see the example below).
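
For example, the options can be combined with -show_entries to restrict the output to a few fields of the video streams only (the file name is a placeholder, as above):

ffprobe -v error -select_streams v -show_entries stream=codec_name,width,height,avg_frame_rate -print_format json <name_of_media>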

FFmpeg

Low latency streaming

# mediamtx is an RTSP media server; it receives the published stream and hands it out to players
> mediamtx
# Capture the desktop with ddagrab (Windows Desktop Duplication), download the frames from GPU memory, encode with x264 tuned for zero latency, and publish over RTSP
> ffmpeg -filter_complex ddagrab=video_size=1024x890:output_idx=0:framerate=60,hwdownload,format=bgra -fflags nobuffer -vcodec libx264 -tune zerolatency -f rtsp rtsp://127.0.0.1:8554/webcam.h264
# Play the stream back with buffering disabled, low-delay decoding and frame dropping to keep latency down
> ffplay rtsp://127.0.0.1:8554/webcam.h264 -fflags nobuffer -flags low_delay -framedrop

Streaming protocols

RTMP

The Real-Time Messaging Protocol (RTMP) is a TCP-based protocol known for its low latency. It was created by Macromedia/Adobe for use with Adobe Flash and, although it remained popular for a long time, it has not been updated to support newer codecs. Traditionally, RTMP was used for both ingestion and distribution of streaming media, but its use on the distribution side has declined because playback requires a browser plugin and an RTMP server. Despite the discontinuation of Flash support, RTMP remains widely used for ingest thanks to its low latency and is still the de facto standard for sending streams to platforms such as YouTube and Facebook Live.
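
As a rough sketch of RTMP ingest with FFmpeg (the server URL and stream key below are placeholders; real platforms hand out their own ingest URL), the stream is encoded to H.264/AAC and muxed into FLV, the container RTMP expects:

> ffmpeg -re -i input.mp4 -c:v libx264 -c:a aac -f flv rtmp://live.example.com/app/<stream_key>

The -re option reads the input at its native frame rate, which is what a live ingest endpoint expects from a file-based source.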

HTTP

HTTP-based streaming runs over TCP, offers the broadest reach, and is unlikely to be blocked anywhere. It requires neither a separate streaming server nor a browser plugin, thanks to HTML5 video support. Media Source Extensions (MSE) allow browsers to play HLS and MPEG-DASH, which is how JavaScript players typically implement playback. While HTTP is the standard on the distribution side for the major streaming platforms, it is less common on the ingestion side because its latency is higher than RTMP's, making it less favored for high-quality content ingestion.
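
As an illustration of the distribution side, FFmpeg can package a file into MPEG-DASH, which an ordinary web server can then serve over HTTP to an MSE-based player (the file names are placeholders, and the build is assumed to include the dash muxer):

> ffmpeg -i input.mp4 -c:v libx264 -c:a aac -f dash manifest.mpd

The command writes manifest.mpd plus the media segments alongside it in the current directory.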

SRT

Secure Reliable Transport (SRT) is a UDP-based protocol recognized for its low latency, typically lower than RTMP's. Although it runs over UDP, SRT provides reliable streaming through its retransmission and error-correction mechanisms. SRT is seldom used on the distribution side because browsers do not support it, but its popularity is growing on the ingest side due to its low latency, positioning it as a potential replacement for RTMP ingest.
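
A minimal sketch, assuming FFmpeg and FFplay were built with libsrt (--enable-libsrt) and using placeholder addresses, ports and file names; one side listens and the other calls:

> ffmpeg -re -i input.mp4 -c:v libx264 -c:a aac -f mpegts "srt://127.0.0.1:9998?mode=listener"
> ffplay "srt://127.0.0.1:9998?mode=caller"

Start the listener first. SRT payloads are commonly carried in an MPEG-TS container, hence -f mpegts.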

RTP

The Real-Time Transport Protocol (RTP) is a network protocol for delivering audio and video over IP networks. It is designed for end-to-end, real-time transfer of streaming media: its sequence numbers and timestamps let receivers detect packet loss, compensate for jitter, and reorder out-of-order packets, all of which are common over UDP. RTP is often used in systems that require streaming media, such as telephony, video teleconferencing, and web-based push-to-talk features. It typically runs over the User Datagram Protocol (UDP) and is used in conjunction with the RTP Control Protocol (RTCP), which carries transmission statistics and quality-of-service information.
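
A minimal sketch of sending a single video stream over RTP (addresses, ports and file names are placeholders). The rtp muxer carries one stream per session, so audio is dropped here with -an, and the global -sdp_file option writes the session description the receiver needs:

> ffmpeg -sdp_file stream.sdp -re -i input.mp4 -an -c:v libx264 -f rtp rtp://127.0.0.1:5004
> ffplay -protocol_whitelist file,udp,rtp stream.sdp

The -protocol_whitelist option is required because FFplay is conservative about which protocols an SDP file may reference.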

RTSP

RTSP, or Real-Time Streaming Protocol, is a network control protocol designed for streaming media systems. It enables the control and delivery of real-time multimedia content, such as live video and audio, across a network. Standardized by the Internet Engineering Task Force (IETF) in 1998 (RFC 2326), RTSP lets users interact with media streams much as they would with a local media player. The protocol's primary function is to establish and control media sessions between endpoints, which can be multimedia servers or clients. RTSP is known for its low latency, making it suitable for applications where real-time playback is crucial, such as live video streaming, teleconferencing, and surveillance.

RTP (Real-Time Transport Protocol) is used for the transport of real-time data, such as audio and video, and is designed to work over UDP/IP. It does not reserve bandwidth or guarantee Quality of Service (QoS). RTSP (Real-Time Streaming Protocol), on the other hand, is a control protocol used for commanding servers to set up, play, pause, or tear down streaming sessions. It is an application-level protocol that works in conjunction with RTP and RSVP to provide complete streaming services over the internet.
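
A small sketch of consuming an RTSP stream with FFmpeg, assuming a server such as mediamtx is publishing at the URL used in the low-latency example above (the URL and duration are placeholders). -rtsp_transport tcp forces interleaved TCP transport, which is more firewall-friendly than UDP, and -c copy records the stream without re-encoding:

> ffmpeg -rtsp_transport tcp -i rtsp://127.0.0.1:8554/webcam.h264 -c copy -t 60 recording.mp4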

WebRTC

WebRTC, short for Web Real-Time Communication, is an open-source project that enables real-time communication (RTC) via application programming interfaces (APIs) in web browsers and mobile applications. It is both an API and a protocol, allowing secure, bi-directional, real-time communication between two WebRTC agents. The protocol is the set of rules the agents use to negotiate that communication, while the API lets developers drive the protocol from JavaScript, with implementations in other languages available as well. WebRTC is known for being an open standard with mandatory encryption, NAT traversal, and sub-second latency, making it a powerful tool for peer-to-peer communication.

HLS

HLS, which stands for HTTP Live Streaming, is a widely used media streaming protocol that delivers audio and video content to viewers over the internet. Initially developed by Apple, an HLS stream is broken into short media segments listed in an m3u8 playlist, and both are delivered as ordinary files over HTTP. This approach works for both live and on-demand video streaming, providing a flexible and reliable way to distribute content across various devices and network conditions.
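
As a sketch of packaging a file into HLS with FFmpeg (file names are placeholders), the hls muxer writes the playlist and its media segments, which any web server can then deliver over HTTP:

> ffmpeg -i input.mp4 -c:v libx264 -c:a aac -f hls -hls_time 4 -hls_playlist_type vod playlist.m3u8

-hls_time sets the target segment duration in seconds, and -hls_playlist_type vod keeps every segment in the playlist so the whole file remains seekable.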