2010-04-06, 18:47 | Posts: 14 | Thanked: 15 times | Joined on Feb 2010 @ bay area, us | #22
Ok, so I just spent the ENTIRE day working on this! I got a lot farther, but it's just not working quite right. I was able to get the video to play in a separate window, but when I try to put it inside a QWidget on my GUI, the screen goes black and starts flashing, and I have to restart the phone. Here's the code I have:
class Vid:
    def __init__(self, windowId):
        self.player = gst.Pipeline("player")
        self.source = gst.element_factory_make("v4l2src", "vsource")
        self.sink = gst.element_factory_make("autovideosink", "outsink")
        self.source.set_property("device", "/dev/video0")
        self.scaler = gst.element_factory_make("videoscale", "vscale")
        self.window_id = None
        self.windowId = windowId
        self.player.add(self.source, self.scaler, self.sink)
        gst.element_link_many(self.source, self.scaler, self.sink)
        bus = self.player.get_bus()
        bus.add_signal_watch()
        bus.enable_sync_message_emission()
        bus.connect("message", self.on_message)
        bus.connect("sync-message::element", self.on_sync_message)

    def on_message(self, bus, message):
        t = message.type
        if t == gst.MESSAGE_EOS:
            self.player.set_state(gst.STATE_NULL)
        elif t == gst.MESSAGE_ERROR:
            err, debug = message.parse_error()
            print "Error: %s" % err, debug
            self.player.set_state(gst.STATE_NULL)

    def on_sync_message(self, bus, message):
        if message.structure is None:
            return
        message_name = message.structure.get_name()
        if message_name == "prepare-xwindow-id":
            win_id = self.windowId
            assert win_id
            imagesink = message.src
            imagesink.set_property("force-aspect-ratio", True)
            imagesink.set_xwindow_id(win_id)

    def startPrev(self):
        self.player.set_state(gst.STATE_PLAYING)
        print "should be playing"

vidStream = Vid(wId)
vidStream.startPrev()
where wId is the window id of the widget where I want the video displayed. Any ideas, anyone? Thanks!
2010-04-29, 18:58 | Posts: 58 | Thanked: 10 times | Joined on Dec 2009 | #23
2010-05-02, 15:06 | Posts: 141 | Thanked: 267 times | Joined on May 2010 @ Germany | #24
2010-05-03, 17:44 | Posts: 58 | Thanked: 10 times | Joined on Dec 2009 | #25
Hi, I'm facing the same problem right now and am also getting a segmentation fault, so the full code would be very helpful.
Thanks!
# Initialize GLib threading before any GStreamer calls from a GUI app
import gobject
gobject.threads_init()
2010-05-31, 02:13 | Posts: 143 | Thanked: 99 times | Joined on Jun 2009 @ Houston | #26
I found the solution to the second problem. You do not need to use the Phonon library; you can just pass the winId of the QWidget in which you want to display the content of your gst pipeline sink.
The post above led me to the solution.
1.) Create your pipeline.
2.) Create the widget in which you want to display the camera stream.
3.) Set the application attribute Qt::AA_NativeWindows in your main function:
QCoreApplication::setAttribute(Qt::AA_NativeWindows, true);
4.) Call:
QApplication::syncX();
5.) Set the xoverlay using the winId of your widget:
gst_x_overlay_set_xwindow_id(GST_X_OVERLAY(GST_MESSAGE_SRC(message)), widget->winId());
6.) Start playing your pipeline:
gst_element_set_state(pipeline, GST_STATE_PLAYING);
#define VIDEO_SRC "v4l2src"
#define VIDEO_SINK "xvimagesink"

/* Initialize Gstreamer */
gst_init(NULL, NULL);

/* Create pipeline and attach a callback to its message bus */
m_pipeline = gst_pipeline_new("test-camera");
bus = gst_pipeline_get_bus(GST_PIPELINE(m_pipeline));
//gst_bus_add_watch(bus, (GstBusFunc)bus_callback, &m_appData);
gst_object_unref(GST_OBJECT(bus));

/* Create elements */
/* Camera video stream comes from a Video4Linux driver */
camera_src = gst_element_factory_make(VIDEO_SRC, "camera_src");
/* Colorspace filter is needed to make sure that sinks understand
 * the stream coming from the camera */
csp_filter = gst_element_factory_make("ffmpegcolorspace", "csp_filter");
/* Tee that copies the stream to multiple outputs */
tee = gst_element_factory_make("tee", "tee");
/* Queue creates a new thread for the stream */
screen_queue = gst_element_factory_make("queue", "screen_queue");
/* Sink that shows the image on screen. Xephyr doesn't support the XVideo
 * extension, so it needs to use ximagesink, but the device uses xvimagesink */
m_videoSink = gst_element_factory_make(VIDEO_SINK, "screen_sink");
/* Creates a separate thread for the stream from which the image is captured */
image_queue = gst_element_factory_make("queue", "image_queue");
/* Filter to convert the stream to a format that the gdkpixbuf library can use */
image_filter = gst_element_factory_make("ffmpegcolorspace", "image_filter");
/* A dummy sink for the image stream. Goes to bitheaven */
image_sink = gst_element_factory_make("fakesink", "image_sink");

/* Check that elements are correctly initialized */
if (!(m_pipeline && camera_src && m_videoSink && screen_queue && csp_filter
      && image_queue && image_filter && image_sink))
{
    qDebug() << "Couldn't create pipeline elements";
    return FALSE;
}

/* Set image sink to emit handoff-signal before throwing away its buffer */
g_object_set(G_OBJECT(image_sink), "signal-handoffs", TRUE, NULL);

/* Add elements to the pipeline. This has to be done prior to linking them */
gst_bin_add_many(GST_BIN(m_pipeline), camera_src, csp_filter, tee,
                 screen_queue, m_videoSink, image_queue, image_filter,
                 image_sink, NULL);

/* Specify what kind of video is wanted from the camera */
caps = gst_caps_new_simple("video/x-raw-yuv",
                           "width", G_TYPE_INT, 640,
                           "height", G_TYPE_INT, 480,
                           NULL);

/* Link the camera source and colorspace filter using the capabilities specified */
if (!gst_element_link_filtered(camera_src, csp_filter, caps))
{
    return FALSE;
}
gst_caps_unref(caps);

/* Connect Colorspace Filter -> Tee -> Screen Queue -> Screen Sink.
 * This finalizes the initialization of the screen part of the pipeline */
if (!gst_element_link_many(csp_filter, tee, screen_queue, m_videoSink, NULL))
{
    qDebug() << "gst video sink init fail";
    return FALSE;
}

/* gdkpixbuf requires 8 bits per sample, which is 24 bits per pixel */
caps = gst_caps_new_simple("video/x-raw-yuv",
                           "width", G_TYPE_INT, 640,
                           "height", G_TYPE_INT, 480,
                           "bpp", G_TYPE_INT, 24,
                           "depth", G_TYPE_INT, 24,
                           "framerate", GST_TYPE_FRACTION, 15, 1,
                           NULL);

/* Link the image branch of the pipeline. The pipeline is ready after this */
if (!gst_element_link_many(tee, image_queue, image_filter, NULL))
{
    qDebug() << "gst tee init fail";
    return FALSE;
}
if (!gst_element_link_filtered(image_filter, image_sink, caps))
{
    qDebug() << "gst filter init fail";
    return FALSE;
}
gst_caps_unref(caps);

gst_element_set_state(m_pipeline, GST_STATE_NULL);
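Steps 3.) to 6.) assume that something catches the "prepare-xwindow-id" element message on the pipeline's bus, and the code above has its bus callback commented out. So here is a minimal sketch of such a sync handler, assuming GStreamer 0.10 and Qt 4; the handler name and the videoWidget pointer passed as user data are hypothetical, not from the post:

#include <QWidget>
#include <gst/gst.h>
#include <gst/interfaces/xoverlay.h>

/* Sync handler: runs on the streaming thread for every message posted on
 * the bus. When the video sink announces "prepare-xwindow-id", hand it
 * the winId() of the target widget (passed as hypothetical user_data). */
static GstBusSyncReply bus_sync_handler(GstBus *bus, GstMessage *message,
                                        gpointer user_data)
{
    const GstStructure *s = gst_message_get_structure(message);
    if (GST_MESSAGE_TYPE(message) != GST_MESSAGE_ELEMENT || s == NULL ||
        !gst_structure_has_name(s, "prepare-xwindow-id"))
        return GST_BUS_PASS;  /* not ours; deliver it normally */

    QWidget *widget = static_cast<QWidget *>(user_data);
    gst_x_overlay_set_xwindow_id(GST_X_OVERLAY(GST_MESSAGE_SRC(message)),
                                 widget->winId());
    gst_message_unref(message);
    return GST_BUS_DROP;  /* handled; don't deliver it again */
}

/* Installed once, right after creating the pipeline, e.g.:
 * gst_bus_set_sync_handler(bus, bus_sync_handler, videoWidget); */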
2010-06-01, 15:59 | Posts: 8 | Thanked: 1 time | Joined on May 2010 | #27
#include <QtGui/QApplication>
#include <gst/gst.h>
#include "mainwindow.h"
#ifdef HAVE_CONFIG_H
#include "config.h"
#endif
#include "main.h"
#include <QApplication>
#include <QTimer>
#include <gst/interfaces/xoverlay.h>
#include <stdlib.h>
#include "fast.h"

#define DEFAULT_VIDEOSINK "autovideosink"
#define IMGWIDTH 400
#define IMGHEIGHT 256

float *img;
byte *myimg;
int ret_num_corners, b = 30;
xy *result;
GMainLoop *loop;
GstElement *pipeline, *camsource, *caps_yuv, *caps_rgb, *colorspace2, *colorspace, *xvsink;
GstBus *bus;
int rgb = 1;

static gboolean bus_call(GstBus *bus, GstMessage *msg, gpointer data)
{
    GMainLoop *loop = (GMainLoop *) data;

    switch (GST_MESSAGE_TYPE(msg)) {
    case GST_MESSAGE_EOS:
        g_print("End of stream\n");
        g_main_loop_quit(loop);
        break;
    case GST_MESSAGE_ERROR: {
        gchar *debug;
        GError *error;
        gst_message_parse_error(msg, &error, &debug);
        g_free(debug);
        g_printerr("Error: %s\n", error->message);
        g_error_free(error);
        g_main_loop_quit(loop);
        break;
    }
    default:
        break;
    }
    return TRUE;
}

SinkPipeline::SinkPipeline(QGraphicsView *parent) : QObject(parent)
{
    GstStateChangeReturn sret;

    /* Create gstreamer elements */
    pipeline = gst_pipeline_new("gst-test");
    camsource = gst_element_factory_make("v4l2camsrc", NULL); //v4l2camsrc
    caps_rgb = gst_element_factory_make("capsfilter", NULL);
    colorspace = gst_element_factory_make("ffmpegcolorspace", NULL);
    //colorspace2 = gst_element_factory_make("ffmpegcolorspace", NULL);
    xvsink = gst_element_factory_make("xvimagesink", NULL);

    /* NOTE: caps_yuv is never created and colorspace2 is commented out
     * above, so this check always reports a failure */
    if (!(pipeline && camsource && caps_yuv && colorspace && caps_rgb && colorspace2 && xvsink)) {
        g_printerr("One element could not be created. Exiting.\n");
    }

    /* Set up the pipeline: we set the input filename to the source element */
    char yuvcapsstr[256], rgbcapsstr[256];
    sprintf(yuvcapsstr, "video/x-raw-yuv,width=%d,height=%d,bpp=24,depth=24,framerate=25/1", IMGWIDTH, IMGHEIGHT);
    sprintf(rgbcapsstr, "video/x-raw-rgb,width=%d,height=%d,bpp=32,depth=24,framerate=25/1", IMGWIDTH, IMGHEIGHT);
    g_object_set(G_OBJECT(caps_yuv), "caps", gst_caps_from_string(yuvcapsstr), NULL);
    g_object_set(G_OBJECT(caps_rgb), "caps", gst_caps_from_string(rgbcapsstr), NULL);

    /* we add a message handler */
    bus = gst_pipeline_get_bus(GST_PIPELINE(pipeline));
    gst_bus_add_watch(bus, bus_call, loop);
    gst_object_unref(bus);

    if (rgb) {
        /* We add a buffer probe RGB */
        GstPad *pad = gst_element_get_pad(caps_rgb, "src");
        gst_object_unref(pad);
        g_print("RGB\n");
    } else {
        /* We add a buffer probe YUV */
        GstPad *pad = gst_element_get_pad(caps_yuv, "src");
        gst_object_unref(pad);
        g_print("YUV\n");
    }

    /* Create pipeline & test source */
    pipeline = gst_pipeline_new("xvoverlay");
    src = gst_element_factory_make("videotestsrc", NULL);

    if ((sink = gst_element_factory_make("xvimagesink", NULL))) {
        sret = gst_element_set_state(sink, GST_STATE_READY);
        if (sret != GST_STATE_CHANGE_SUCCESS) {
            gst_element_set_state(sink, GST_STATE_NULL);
            gst_object_unref(sink);
            if ((sink = gst_element_factory_make("ximagesink", NULL))) {
                sret = gst_element_set_state(sink, GST_STATE_READY);
                if (sret != GST_STATE_CHANGE_SUCCESS) {
                    gst_element_set_state(sink, GST_STATE_NULL);
                    gst_object_unref(sink);
                    if (strcmp(DEFAULT_VIDEOSINK, "xvimagesink") != 0 &&
                        strcmp(DEFAULT_VIDEOSINK, "ximagesink") != 0) {
                        if ((sink = gst_element_factory_make(DEFAULT_VIDEOSINK, NULL))) {
                            if (!GST_IS_BIN(sink)) {
                                sret = gst_element_set_state(sink, GST_STATE_READY);
                                if (sret != GST_STATE_CHANGE_SUCCESS) {
                                    gst_element_set_state(sink, GST_STATE_NULL);
                                    gst_object_unref(sink);
                                    sink = NULL;
                                }
                            } else {
                                gst_object_unref(sink);
                                sink = NULL;
                            }
                        }
                    }
                }
            }
        }
    }

    if (sink == NULL)
        g_error("Couldn't find a working video sink.");

    gst_bin_add_many(GST_BIN(pipeline), src, sink, caps_rgb, caps_yuv, colorspace, NULL);
    gst_element_link_many(src, colorspace, caps_rgb, sink, NULL);

    xwinid = parent->winId();
}

SinkPipeline::~SinkPipeline()
{
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
}

/* Essentially just sets the pipeline to the Playing state */
void SinkPipeline::startPipeline()
{
    GstStateChangeReturn sret;

    /* we know what the video sink is in this case (xvimagesink), so we can
     * just set it directly here now (instead of waiting for a prepare-xwindow-id
     * element message in a sync bus handler and setting it there) */
    gst_x_overlay_set_xwindow_id(GST_X_OVERLAY(sink), xwinid);

    sret = gst_element_set_state(pipeline, GST_STATE_PLAYING);
    if (sret == GST_STATE_CHANGE_FAILURE) {
        gst_element_set_state(pipeline, GST_STATE_NULL);
        gst_object_unref(pipeline);
        /* Exit application */
        QTimer::singleShot(0, QApplication::activeWindow(), SLOT(quit()));
    }

    /* Allow the display to be delayed */
    g_object_set(G_OBJECT(xvsink), "sync", FALSE, NULL);
}

int main(int argc, char *argv[])
{
    QApplication a(argc, argv);
    MainWindow w;

    QGraphicsScene scene;
    scene.setSceneRect(-100.0, -100.0, 200.0, 200.0);

    QGraphicsView graphicsView(&scene);
    graphicsView.resize(400, 256); //800,480
    graphicsView.setWindowTitle("Fancy application");
    graphicsView.show();

    img = (float *) malloc(sizeof(float) * 400 * 256);
    myimg = (byte *) malloc(sizeof(byte) * 400 * 256);

    loop = g_main_loop_new(NULL, FALSE);

    /* Initialisation */
    gst_init(&argc, &argv); //init gstreamer

    SinkPipeline sinkpipe(&graphicsView);
    sinkpipe.startPipeline();

    /* Iterate */
    g_print("Running...\n");
    g_main_loop_run(loop);

    /* Out of the main loop, clean up nicely */
    g_print("Returned, stopping playback\n");
    gst_element_set_state(pipeline, GST_STATE_NULL);
    g_print("Deleting pipeline\n");
    gst_object_unref(GST_OBJECT(pipeline));

    free(img);
    free(myimg);
    a.quit();
}
INCLUDEPATH += /usr/include/gstreamer-0.10 /usr/include/glib-2.0 /usr/lib/glib-2.0/include /usr/include/libxml2
LIBS += -lgstreamer-0.10 -lgobject-2.0 -lgmodule-2.0 -lgthread-2.0 -lrt -lxml2 -lglib-2.0 -lgstinterfaces-0.10
2010-06-02, 00:41 | Posts: 143 | Thanked: 99 times | Joined on Jun 2009 @ Houston | #28
Hi,
I think it would be very useful if someone who has a working Qt camera example published it (that's the best way for beginners to learn). I'd really appreciate it as well.
The funny thing is that it works fine with the test source, but only displays a white screen when used with the actual camera source. Any way to fix this?
2010-06-02, 15:29 | Posts: 124 | Thanked: 213 times | Joined on Dec 2009 | #29
gst_init(NULL, NULL);
pipeline = gst_parse_launch("v4l2src device=/dev/video[0|1] ! xvimagesink", NULL);

GstIterator *iter = gst_bin_iterate_sinks((GstBin *) pipeline);
GstElement *thisElem;

// Find the sink at the end of the pipeline - there should be only one!
if (gst_iterator_next(iter, (void **) &thisElem) == GST_ITERATOR_OK) {
    sink = thisElem;
    QApplication::syncX();
    gst_element_set_state(pipeline, GST_STATE_READY);
    gst_x_overlay_set_xwindow_id(GST_X_OVERLAY(sink), Widget.winId());
    gst_element_set_state(pipeline, GST_STATE_PLAYING);
}
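For reference, a minimal sketch of how this snippet might sit inside a complete Qt program, assuming GStreamer 0.10 and Qt 4; the videoWidget object, the plain v4l2src device selection, and the shutdown code are my assumptions, not part of the post:

#include <QApplication>
#include <QWidget>
#include <gst/gst.h>
#include <gst/interfaces/xoverlay.h>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    /* Hypothetical display widget; any native QWidget works */
    QWidget videoWidget;
    videoWidget.resize(640, 480);
    videoWidget.show();

    gst_init(&argc, &argv);
    GstElement *pipeline = gst_parse_launch("v4l2src ! xvimagesink", NULL);

    /* Find the sink at the end of the pipeline - there should be only one */
    GstElement *sink = NULL;
    GstIterator *iter = gst_bin_iterate_sinks(GST_BIN(pipeline));
    if (gst_iterator_next(iter, (void **) &sink) == GST_ITERATOR_OK) {
        QApplication::syncX();  /* flush X requests before handing over the window */
        gst_element_set_state(pipeline, GST_STATE_READY);
        gst_x_overlay_set_xwindow_id(GST_X_OVERLAY(sink), videoWidget.winId());
        gst_element_set_state(pipeline, GST_STATE_PLAYING);
    }
    gst_iterator_free(iter);

    int ret = app.exec();

    /* Stop and free the pipeline on exit */
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return ret;
}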
2010-06-02, 17:50 | Posts: 14 | Thanked: 15 times | Joined on Feb 2010 @ bay area, us | #30
Cheers,
Klen