Posts by vgo*_*129

Persistent connection with QNetworkAccessManager in Qt

I am trying to maintain a persistent connection between a client and a remote server using Qt. The server side is fine; it is the client I am building in Qt. There, I use QNetworkAccessManager's get() method (with a QNetworkRequest) to send requests to the server, and I am able to send and receive requests successfully.

However, after a while (about 2 minutes) the client tells the server to close the connection by automatically issuing a request. I suspect QNetworkAccessManager sets a timeout on this connection. I want to keep a persistent connection between the two ends.

Is my approach correct, and if not, can someone point me down the right path?

qt qnetworkaccessmanager qnetworkrequest

5 votes | 1 answer | 1474 views

Changing FocusMode not working using MediaStream API in Google Chrome

In the Google Chrome browser I was able to get a live feed from my connected USB camera using the getUserMedia() API. I have a slider to change the brightness value, and this works fine. I also want focusMode to toggle from continuous to manual (the camera always starts in continuous focusMode).

I have the JavaScript code below to change focusMode.

const video_constraints ={};

//Create the following keys for Constraint
video_constraints.video = {};
video_constraints.video.width = {};
video_constraints.video.width.exact = 1920; //set video width …
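A likely reason the toggle has no effect is that focusMode is usually applied to the already-running video track via applyConstraints(), rather than passed at stream creation. The sketch below shows that pattern; `buildFocusConstraints` is a hypothetical helper (not from the snippet above), and focusMode support depends on the browser and camera, so the capability should be checked first.

```javascript
// Build a constraints patch requesting a specific focus mode.
// focusMode comes from the Image Capture extensions to
// MediaTrackConstraints; it goes in the `advanced` list.
function buildFocusConstraints(mode) {
  return { advanced: [{ focusMode: mode }] };
}

// In the browser (sketch, assuming `stream` came from getUserMedia):
//
//   const [track] = stream.getVideoTracks();
//   const caps = track.getCapabilities();
//   if (caps.focusMode && caps.focusMode.includes("manual")) {
//     await track.applyConstraints(buildFocusConstraints("manual"));
//   }

console.log(buildFocusConstraints("manual"));
```

If getCapabilities() does not report "manual" for this camera, Chrome will ignore (or reject) the constraint, which would also explain the behavior described above.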

javascript w3c image-capture getusermedia mediastream

5 votes | 0 answers | 461 views

Streaming an MP4 video over RTP using GStreamer in Ubuntu

I am trying to take a video file from a local directory, stream it from the server, and capture the frames on the client. I used the following pipeline:

Server side:

gst-launch -v  filesrc location=/home/gokul/Videos/Econ_TestVideo/radioactive.mp4 ! qtdemux ! rtpmp4vpay ! udpsink host=192.168.7.61 port=5000 sync=true


Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstRtpMP4VPay:rtpmp4vpay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP4V-ES, profile-level-id=(string)1, config=(string)000001b001000001b58913000001000000012000c48d8800f528045a1443000001b24c61766335332e33352e30, payload=(int)96, ssrc=(uint)3003638799, clock-base=(uint)1542273545, seqnum-base=(uint)49176
/GstPipeline:pipeline0/GstRtpMP4VPay:rtpmp4vpay0.GstPad:sink: caps = video/mpeg, mpegversion=(int)4, systemstream=(boolean)false, profile=(string)simple, level=(string)1, codec_data=(buffer)000001b001000001b58913000001000000012000c48d8800f528045a1443000001b24c61766335332e33352e30, width=(int)1280, height=(int)720, framerate=(fraction)91/3, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstRtpMP4VPay:rtpmp4vpay0: timestamp = 1542273545
/GstPipeline:pipeline0/GstRtpMP4VPay:rtpmp4vpay0: seqnum = 49176
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP4V-ES, profile-level-id=(string)1, config=(string)000001b001000001b58913000001000000012000c48d8800f528045a1443000001b24c61766335332e33352e30, payload=(int)96, ssrc=(uint)3003638799, clock-base=(uint)1542273545, seqnum-base=(uint)49176
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ... …
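On the receiving side, RTP alone does not carry the MPEG-4 stream configuration, so the client's udpsrc needs the caps that the server printed (the `udpsink0.GstPad:sink: caps = ...` line above). An untested sketch of a matching client pipeline, using the same legacy gst-launch 0.10 elements as the server command (rtpmp4vdepay and ffdec_mpeg4 are assumed to be installed):

```shell
# Client side: receive the RTP/MP4V-ES stream sent by udpsink on port 5000.
# The caps= value is copied from the server's log output; config= and
# payload= are the parts the depayloader actually needs.
gst-launch -v udpsrc port=5000 \
    caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP4V-ES, profile-level-id=(string)1, config=(string)000001b001000001b58913000001000000012000c48d8800f528045a1443000001b24c61766335332e33352e30, payload=(int)96" \
  ! rtpmp4vdepay ! ffdec_mpeg4 ! autovideosink
```

Because the caps depend on the input file, the pipeline must be updated whenever a different MP4 is streamed, which is why RTSP (or an SDP file) is often used instead of raw RTP for this kind of setup.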

ubuntu rtp video-streaming gstreamer gst-launch

2 votes | 1 answer | 9494 views

Configuring the frame format with the getUserMedia API

I have the following code to stream a connected video source in my Google Chrome browser; WebRTC's getUserMedia does exactly that. The snippet below configures the resolution and frame rate of my external camera device.

function configureVideo()
{
      const video_constraints = {};

      //Create the following keys for the constraint
      video_constraints.video = {};

      //set camera name
      video_constraints.video.deviceId = {};
      video_constraints.video.deviceId.exact = <device_id_comes_here>;

      //set resolution width
      video_constraints.video.width = {};
      video_constraints.video.width.exact = 640;

      //set resolution height (height must be an object before .exact is
      //assigned; setting .exact on the number 480 silently does nothing)
      video_constraints.video.height = {};
      video_constraints.video.height.exact = 480;

      //set fps (same fix as for height)
      video_constraints.video.frameRate = {};
      video_constraints.video.frameRate.exact = 60;

      console.log("Selected constraints:", video_constraints);

      navigator.mediaDevices.getUserMedia(video_constraints).then(streamCallback).catch(handleError);
}

Yes, I can successfully stream video from my external camera device. The camera offers two frame formats, YUYV and BY8, but I have no way of telling which frame format is currently being streamed.

Is there any way in WebRTC to configure the video frame format I am interested in?
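As far as the Media Capture spec goes, getUserMedia does not expose the camera's native pixel format (YUYV, BY8, MJPEG, ...): the browser negotiates the format internally and hands the page decoded frames. What is observable are the negotiated track settings via track.getSettings(). A small sketch of inspecting those; `describeTrack` is a hypothetical helper, not a standard API:

```javascript
// Summarize the negotiated settings of a video track.
// `settings` is the plain object returned by MediaStreamTrack.getSettings(),
// which reports width/height/frameRate but not the pixel format.
function describeTrack(settings) {
  return `${settings.width}x${settings.height}@${settings.frameRate}fps`;
}

// In the browser (sketch, assuming `stream` came from getUserMedia):
//
//   const [track] = stream.getVideoTracks();
//   console.log(describeTrack(track.getSettings()));

console.log(describeTrack({ width: 640, height: 480, frameRate: 60 }));
// → 640x480@60fps
```

So the frame format itself cannot be selected from JavaScript; on Linux it can be checked or pinned outside the browser (e.g. with v4l2-ctl), which the browser then inherits.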

javascript video-streaming webrtc getusermedia

1 vote | 1 answer | 590 views