gstreamer: linking a bin with two sinks to playbin2


I want to read in an SDP file and encode the video stream into H.264 and the audio stream into AAC. Then I want to multiplex those streams into an AVI stream and write that to a file. I don't know the contents of the SDP file ahead of time, so it seems easiest to use playbin2.

So, I thought I could do something like this:

                             RawToAviMux Bin
                       ______________________________________
                ----- |ghostpad----x264enc
              /       |                   \
playbin2------        |                    avimux--filesink
              \       |                    /
               -------| ghostpad----ffenc_aac
                      |_______________________________________

 setting playbin2's videosink to an instance of RawToAviMux
      and playbin2's audiosink to the same instance of RawToAviMux

However, I cannot get the pipeline into the playing state.

Here is the code:

            recorder = new Gst.BasePlugins.PlayBin2();
            recorder.PlayFlags &= ~((Gst.BasePlugins.PlayBin2.PlayFlagsType)(1 << 2));
            recorder.Bus.AddSignalWatch();
            recorder.Bus.EnableSyncMessageEmission();

            RawToAviMuxer aviMuxer = new RawToAviMuxer(fileName);               

            recorder.VideoSink = aviMuxer;
            recorder.AudioSink = aviMuxer;

            recorder.SetState(Gst.State.Ready);
            recorder.Uri = @"file:///" + filePath.Replace('\\', '/');
            recorder.SetState(Gst.State.Paused);
            recorder.SetState(Gst.State.Playing);

            Gst.State currentState;
            Gst.State playingState = Gst.State.Playing;
            Gst.StateChangeReturn stateReturn = recorder.GetState(out currentState, out playingState, Gst.Clock.Second);

            if (stateReturn != Gst.StateChangeReturn.Failure)
                return true;

            return false;

With RawToAviMuxer as

public class RawToAviMuxer : Gst.Bin
{
    bool test = false;

    public RawToAviMuxer(string outputFileName)
        : base("rawToAviMux")
    {
        Gst.Element x264Enc = Gst.ElementFactory.Make("x264enc");
        Gst.Element ffenc_aac = Gst.ElementFactory.Make("ffenc_aac");

        x264Enc["bframes"] = (uint)0;
        x264Enc["b-adapt"] = false;
        x264Enc["bitrate"] = (uint)1024;
        x264Enc["tune"] = 0x4;
        x264Enc["speed-preset"] = 3;
        x264Enc["sliced-threads"] = false;
        x264Enc["profile"] = 0;
        x264Enc["key-int-max"] = (uint)30;

        Gst.GhostPad videoToX264Pad = new Gst.GhostPad("video_sink", x264Enc.GetStaticPad("sink"));
        Gst.GhostPad audioToAACPad = new Gst.GhostPad("audio_sink", ffenc_aac.GetStaticPad("sink"));

        test = this.AddPad(videoToX264Pad);
        test = this.AddPad(audioToAACPad);

        Gst.Element aviMux = Gst.ElementFactory.Make("avimux");
        Gst.Element fileSink = Gst.ElementFactory.Make("filesink");

        test = this.Add(new Gst.Element[]{x264Enc, ffenc_aac, aviMux, fileSink});
        test = x264Enc.Link(aviMux);
        test = ffenc_aac.Link(aviMux);
        test = aviMux.Link(fileSink);

        fileSink["location"] = outputFileName;
    }
}

I have stepped through in the debugger and all of the links are successful.

Update

Ok, so I tried the following pipeline with Gst.Parse.Launch:

uridecodebin uri=file:///C:/Users/Jonathan/AppData/Local/Temp/192.168.0.215_5000.sdp !
x264enc byte-stream=true bframes=0 b-adapt=0 tune=0x4 speed-preset=3 sliced-threads=false 
profile=0 ! mux. ffenc_aac ! mux. avimux name=mux ! filesink location=C:\Users\Jonathan\Desktop\test.avi

I still can't get it out of paused.

I am using the Windows build, so I am worried that maybe something is wrong with that.

I also can't attach a message handler to the bus so that I can figure out what is going on, which is really starting to get annoying.
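
For what it's worth, hooking a handler to the bus from Gst# looks roughly like the sketch below. It rests on two assumptions: that the 0.10-era gstreamer-sharp bindings used above expose a Message event on Gst.Bus (exact event and member names may differ between binding versions), and that a GLib main loop is iterating the default main context, because a signal watch never fires without one; the latter is easy to miss in a plain Windows application.

    recorder.Bus.AddSignalWatch();
    recorder.Bus.Message += delegate(object o, Gst.MessageArgs args)
    {
        // Error and state-changed messages are usually enough to see why a
        // pipeline refuses to leave PAUSED.
        Console.WriteLine("bus: {0} from {1}", args.Message.Type, args.Message.Src);
    };

    // The signal watch only dispatches while a GLib main loop is running;
    // in a real application this would run on its own thread.
    GLib.MainLoop loop = new GLib.MainLoop();
    loop.Run();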

I did just find this, however:

If I directly grab the streams via a udpsrc element, knowing what the formats are ahead of time, it does not work with just an rtph264depay element. There must be an h264parse element in the pipeline. This may be the reason that uridecodebin isn't working for me?

Solved: I ended up doing the following:

            if (!gst_is_init)
            {                
                Gst.Application.Init();
                gst_is_init = true;
            }

            if(recorder != null)
            {
                recorder.SetState(Gst.State.Null);
                recorder.Dispose();
            }

            string videoDepay, audioDepay, strExtension, strMuxer;

            GetGstElements(stream.VideoCaps, out videoDepay, out strMuxer, out strExtension);
            GetGstElements(stream.AudioCaps, out audioDepay, out strMuxer, out strExtension);

            fileName = Path.ChangeExtension(fileName, strExtension);

            //recorder = new Gst.Pipeline("recordingPipe");
            string pipeString = String.Format("udpsrc port={0} ! {1} {2} ! {3} ! queue ! mux. udpsrc port={4} ! {1} {5} ! {6} ! mux. {7} name=mux ! filesink location={8}",
                portToUse, "application/x-rtp,", stream.VideoCaps, videoDepay, (portToUse + 2), stream.AudioCaps, audioDepay, strMuxer, fileName.Replace("\\", "\\\\"));

            recorder = (Gst.Pipeline)Gst.Parse.Launch(pipeString);                

            recordingFileName = fileName;

            recorder.SetState(Gst.State.Ready);
            recorder.SetState(Gst.State.Paused);                
            recorder.SetState(Gst.State.Playing);

            Gst.State currentState;                

            Gst.StateChangeReturn stateReturn = recorder.GetState(out currentState, Gst.Clock.Second);            

            if (stateReturn != Gst.StateChangeReturn.Failure)
                return true;

            return false;

You have to have a parser for all streams for the pipeline to go to playing. So, in the case of an incoming H.264 stream, I would need to use rtph264depay ! h264parse. In addition, the NALU and byte-stream settings must match for the timing to be right.
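
For example (purely illustrative: the real ports, payload types, and caps fields come from the SDP, and a different audio packetization would need a different depayloader), the formatted pipeString above might expand to something like the following. The doubled backslashes in the location match the Replace call above, since gst_parse_launch treats a single backslash as an escape character.

    udpsrc port=5000 ! application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, payload=96 !
        rtph264depay ! h264parse ! queue ! mux.
    udpsrc port=5002 ! application/x-rtp, media=audio, clock-rate=44100, encoding-name=MPEG4-GENERIC, payload=97 !
        rtpmp4gdepay ! aacparse ! mux.
    avimux name=mux ! filesink location=C:\\Users\\Jonathan\\Desktop\\test.avi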

Also, in order for the file to be usable, you have to send an EOS downstream before disposing of the pipeline.

                recorder.SendEvent(Gst.Event.NewEos());
                System.Threading.Thread.Sleep(100);
                recorder.SetState(Gst.State.Paused);
                recorder.SetState(Gst.State.Ready);
                recorder.SetState(Gst.State.Null);
                recorder.Dispose();              
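
A slightly more robust variant of that teardown, offered only as a sketch and not something from the original answer: instead of sleeping for a fixed 100 ms, wait for the EOS message to reach the bus, so the muxer has definitely finished writing the AVI index before the state drops. It reuses the bus-watch assumptions from earlier (a Message event on Gst.Bus, a GLib main loop running on another thread, and binding member names such as Gst.MessageType.Eos that may differ by version):

    System.Threading.ManualResetEvent gotEos = new System.Threading.ManualResetEvent(false);
    recorder.Bus.Message += delegate(object o, Gst.MessageArgs args)
    {
        // The EOS message is posted on the bus once every sink has received the EOS event.
        if (args.Message.Type == Gst.MessageType.Eos)
            gotEos.Set();
    };

    recorder.SendEvent(Gst.Event.NewEos());
    gotEos.WaitOne(5000);   // wait up to five seconds, then give up

    recorder.SetState(Gst.State.Null);
    recorder.Dispose();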

Solution

Yes, this won't work. Here is how you can do it. Playbin2 is a modular component: it consists of a uridecodebin and a playsink bin. You can just use a uridecodebin, set your media file URI, add a signal handler for pad-added, and connect the newly created pads to the sink pads of your rawtoavimux component.
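
A minimal Gst# sketch of that approach, reusing the RawToAviMuxer bin from the question (it already exposes the video_sink and audio_sink ghost pads and contains the filesink). The PadAdded event, the PadAddedArgs.Pad member, and the caps inspection are assumptions about the 0.10-era gstreamer-sharp bindings and may need adjusting:

    Gst.Element decode = Gst.ElementFactory.Make("uridecodebin");
    decode["uri"] = @"file:///" + filePath.Replace('\\', '/');

    RawToAviMuxer aviMuxer = new RawToAviMuxer(fileName);

    Gst.Pipeline pipeline = new Gst.Pipeline("recordingPipe");
    pipeline.Add(new Gst.Element[] { decode, aviMuxer });

    decode.PadAdded += delegate(object o, Gst.PadAddedArgs args)
    {
        // uridecodebin creates one source pad per decoded stream; route each
        // new pad to the matching ghost pad on the bin based on its caps.
        string media = args.Pad.Caps[0].Name;   // e.g. "video/x-raw-yuv" or "audio/x-raw-int"
        Gst.Pad target = media.StartsWith("video")
            ? aviMuxer.GetStaticPad("video_sink")
            : aviMuxer.GetStaticPad("audio_sink");

        if (target != null)
            args.Pad.Link(target);
    };

    pipeline.SetState(Gst.State.Playing);

Because the filesink already lives inside the bin, nothing else has to be linked by hand; the bin's internal x264enc / ffenc_aac ! avimux ! filesink chain takes over from the ghost pads.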

An alternative to rawtoavimux would be to use encodebin. Using uridecodebin ! encodebin can potentially do smart transcoding, which would avoid decoding and re-encoding if one or more of the streams are already in the correct format.