Represents a video source that can be added to a WebRTC call.
#include <VideoSource.h>
Pass a shared_ptr to an instance of this class to the StreamClient and call sendFrame for each of your frames.
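A minimal creation sketch is shown below. The VideoSourceConfiguration::create call and the way the source is handed to the StreamClient are assumptions based on typical usage; check the StreamClient and configuration documentation of your library version for the exact types and parameters.

```cpp
#include <VideoSource.h>

#include <memory>

// Minimal sketch: create a video source and hold it in a shared_ptr so the
// StreamClient can share ownership of it. VideoSourceConfiguration::create and
// the exact StreamClient constructor are assumptions; adapt them to your
// version of the library.
std::shared_ptr<opentera::VideoSource> makeCameraSource()
{
    auto configuration = opentera::VideoSourceConfiguration::create(
        false,   // needsDenoising: do not ask the transport layer to denoise
        false);  // isScreencast: a camera stream, not a screen capture

    return std::make_shared<opentera::VideoSource>(configuration);
}

// The returned shared_ptr is then passed to the StreamClient at construction
// time; signaling and other StreamClient parameters are omitted here.
```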
◆ VideoSource()
Creates a VideoSource.
- Parameters
  - configuration: The configuration applied to the video stream by the image transport layer
◆ is_screencast()
bool opentera::VideoSource::is_screencast() const [inline, override]
Indicates if this source is screencast.
- Returns
- true if this source is a screencast
◆ needs_denoising()
absl::optional<bool> opentera::VideoSource::needs_denoising() const [inline, override]
Indicates if this source needs denoising.
- Returns
- true if this source needs denoising
◆ remote()
bool opentera::VideoSource::remote() const [inline, override]
Indicates if this source is remote.
- Returns
- Always false, the source is local
◆ sendFrame()
void VideoSource::sendFrame(const cv::Mat& bgrImg, int64_t timestampUs)
Sends a frame to the WebRTC transport layer.
The frame may or may not be sent depending on the transport layer state. The frame will be resized to match the size requested by the transport layer.
- Parameters
  - bgrImg: BGR8-encoded frame data
  - timestampUs: Frame timestamp in microseconds
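A hedged sketch of a capture loop that feeds frames to sendFrame follows; the cv::VideoCapture source and the steady_clock-based timestamp are illustrative choices, not requirements of the API.

```cpp
#include <VideoSource.h>

#include <opencv2/videoio.hpp>

#include <chrono>
#include <memory>

// Sketch of a capture loop feeding a VideoSource. Any BGR8 cv::Mat works; the
// camera capture and the steady_clock timestamp are only illustrative choices.
void captureLoop(const std::shared_ptr<opentera::VideoSource>& videoSource)
{
    cv::VideoCapture capture(0);  // default camera
    cv::Mat bgrFrame;

    while (capture.read(bgrFrame))
    {
        // sendFrame expects the timestamp in microseconds.
        int64_t timestampUs = std::chrono::duration_cast<std::chrono::microseconds>(
                                  std::chrono::steady_clock::now().time_since_epoch())
                                  .count();
        videoSource->sendFrame(bgrFrame, timestampUs);
    }
}
```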
◆ state()
webrtc::MediaSourceInterface::SourceState opentera::VideoSource::state() const [inline, override]
Indicates if this source is live.
- Returns
- Always kLive, the source is live