maemo.org - Talk

maemo.org - Talk (https://talk.maemo.org/index.php)
-   Development (https://talk.maemo.org/forumdisplay.php?f=13)
-   -   Fremantle GStreamer (https://talk.maemo.org/showthread.php?t=31666)

qole 2009-09-14 17:14

Fremantle GStreamer
 
Hi all,

I'm willing to test GStreamer applications on Fremantle hardware.

It would really help multimedia developers if someone could provide GStreamer Editor for Fremantle, to help figure out how to build correct pipelines.

qole 2009-09-14 17:26

Re: Fremantle GStreamer
 
Ok, basic gstreamer pipeline from camera to screen:

v4l2src device=/dev/video0 ! autovideosink

For the high-resolution (back) camera it is /dev/video0, and for the front webcam it is /dev/video1.

The output needs to be massaged so that the aspect ratio is correct, etc.

daperl 2009-09-14 17:41

Re: Fremantle GStreamer
 
Can you change the self.player line to the one below and try again? Post the output here.

Code:

self.player = gst.parse_launch ('v4l2src device=/dev/video0 ! video/x-raw-yuv,width=320,height=240,framerate=(fraction)15/1 ! autovideosink')

qole 2009-09-14 17:56

Re: Fremantle GStreamer
 
Just trying that pipeline from the command line (gst-launch), I start getting "cannot negotiate format" as soon as I start adding the width, height, or framerate values...

How do you check what the device's capabilities are? There's got to be a way to query the device for acceptable parameters...

daperl 2009-09-14 18:08

Re: Fremantle GStreamer
 
Use the following generic string instead. It should either work or dump out an error message with some width/height info.

Code:

self.player = gst.parse_launch ('v4l2src device=/dev/video0 ! autovideosink')

Is the lens cap off? :)

qole 2009-09-14 18:44

Re: Fremantle GStreamer
 
Yes, that very simple pipeline works. It displays the back camera's view in a square window in the middle of the screen (above your two buttons). I'd post a screenshot, but it appears that the screenshot utility doesn't capture the picture, just a black square where the picture is.

There was no output on the command line.

qole 2009-09-17 22:20

Re: Fremantle GStreamer
 
Just a note for future reference, since it is in a completely different thread; daperl's original camera script can be found here.

lardman 2009-09-18 07:39

Re: python / gstreamer / camera / xvimagesink issues
 
Ok, so the output of the large camera is 640 x 492 encoded as YUYV.

lardman 2009-09-18 07:56

Re: python / gstreamer / camera / xvimagesink issues
 
Actually, although that gives you good Y data, the U and V data are probably arranged a little differently in the data stream (for each of these you get two images side by side).

More testing...
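For reference, packed YUYV interleaves bytes as Y0 U0 Y1 V0, so the luma and chroma samples separate with simple stride slicing. A standalone sketch in plain Python over synthetic data, not tied to the camera:

```python
# Extract the planes from a packed YUYV frame.
# Byte order is Y0 U0 Y1 V0 | Y2 U1 Y3 V1 | ...
# so luma sits at even offsets, chroma at odd offsets.

def yuyv_luma(frame):
    return frame[0::2]      # every other byte, starting at 0

def yuyv_chroma(frame):
    u = frame[1::4]         # one U sample per 2 pixels
    v = frame[3::4]         # one V sample per 2 pixels
    return u, v

# Synthetic 4-pixel frame: Y = 10,20,30,40; U = 100,101; V = 200,201
frame = bytes([10, 100, 20, 200, 30, 101, 40, 201])
print(list(yuyv_luma(frame)))   # [10, 20, 30, 40]
u, v = yuyv_chroma(frame)
print(list(u), list(v))         # [100, 101] [200, 201]
```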

daperl 2009-09-19 00:04

Re: Fremantle GStreamer
 
The script you pointed to doesn't really do anything. The following really, really bad hack will actually take a picture and create a .png file on an n800. As is, it won't work on an n900, but the changes should be obvious.

Also, the file creation is really, really slow: about 25 seconds. It's strictly a hard-coded proof of concept that a picture can be taken on a tablet using just PyGTK and GStreamer. Obviously, plenty of room for improvement.

Code:

#! /usr/bin/env python

import string
import gtk
import gst

class ShowMe:
        def __init__(self):
                window = gtk.Window(gtk.WINDOW_TOPLEVEL)
                window.set_title("Webcam-Viewer")
                window.connect("destroy", gtk.main_quit, "WM destroy")
                vbox = gtk.VBox()
                window.add(vbox)
                self.movie_window = gtk.DrawingArea()
                vbox.add(self.movie_window)
                hbox = gtk.HBox()
                vbox.pack_start(hbox, False)
                hbox.set_border_width(10)
                hbox.pack_start(gtk.Label())
                self.takePicture = 0
                self.button0 = gtk.Button("Snap")
                self.button0.connect("clicked", self.onTakePicture)
                hbox.pack_start(self.button0, False)
                self.button = gtk.Button("Start")
                self.button.connect("clicked", self.start_stop)
                hbox.pack_start(self.button, False)
                self.button2 = gtk.Button("Quit")
                self.button2.connect("clicked", self.exit)
                hbox.pack_start(self.button2, False)
                hbox.add(gtk.Label())
                window.show_all()

                if 1 == 1:
                    self.player = gst.Pipeline('ThePipe')
                    src = gst.element_factory_make("gconfv4l2src", "src")
                    self.player.add(src)
                    for p in src.pads():
                        #print p.get_caps().to_string()
                        print p.get_name()
                    caps = gst.element_factory_make("capsfilter", "caps")
                    caps.set_property('caps', gst.caps_from_string(
                        'video/x-raw-rgb,width=352,height=288,bpp=16,depth=16,\
                        framerate=15/1'))
                    #caps.set_property('caps', gst.caps_from_string(
                        #'video/x-raw-rgb,width=352,height=288,\
                        #framerate=15/1'))
                        #red_mask=224,green_mask=28,blue_mask=3,framerate=15/1'))
                    self.player.add(caps)
                    filt = gst.element_factory_make("ffmpegcolorspace", "filt")
                    self.player.add(filt)
                    caps2 = gst.element_factory_make("capsfilter", "caps2")
                    caps2.set_property('caps', gst.caps_from_string(
                        'video/x-raw-rgb,width=352,height=288,bpp=16,depth=16,\
                        framerate=15/1'))
                    self.player.add(caps2)
                    sink = gst.element_factory_make("xvimagesink", "sink")
                    self.player.add(sink)
                    pad = src.get_pad('src')
                    pad.add_buffer_probe(self.doBuffer)
                    src.link(caps)
                    caps.link(filt)
                    filt.link(caps2)
                    caps2.link(sink)

                # Set up the gstreamer pipeline
                #self.player = gst.parse_launch ('gconfv4l2src ! video/x-raw-yuv,width=352,height=288,framerate=(fraction)15/1 ! autovideosink')
                #self.player = gst.parse_launch ('gconfv4l2src ! video/x-raw-yuv,width=352,height=288,framerate=(fraction)15/1 ! tee name=qole qole. ! ffmpegcolorspace ! queue ! filesink location=qole.raw qole. ! queue ! autovideosink')
                #self.player = gst.parse_launch ('gconfv4l2src ! video/x-raw-rgb,width=352,height=288,framerate=(fraction)15/1 ! tee name=qole qole. ! ffmpegcolorspace ! jpegenc ! filesink location=qole.raw qole. ! queue ! autovideosink')
                #self.player = gst.parse_launch ('v4l2src ! autovideosink')

                bus = self.player.get_bus()
                bus.add_signal_watch()
                bus.enable_sync_message_emission()
                bus.connect("message", self.on_message)
                bus.connect("sync-message::element", self.on_sync_message)

        def onTakePicture(self, w):
            self.takePicture = 1

        def doBuffer(self, pad, buffer):
            if self.takePicture:
                self.takePicture = 0
                #print buffer.get_caps()
                # 63488 2016 31
                # 0xf8  0x07,0xe0  0x1f
                p = gtk.gdk.Pixbuf(gtk.gdk.COLORSPACE_RGB,False,8,352,288)
                pa = p.get_pixels()
                pal = list(pa)
                for i in range(0,len(buffer)/2):
                    # Unpack little-endian RGB565: byte1 = RRRRRGGG, byte0 = GGGBBBBB
                    pal[i*3] = "%c" % (0xf8 & ord(buffer[i*2+1]))
                    pal[i*3+1] = "%c" % (((0x07 & ord(buffer[i*2+1])) << 5) |\
                        ((0xe0 & ord(buffer[i*2])) >> 3))
                    pal[i*3+2] = "%c" % ((0x1f & ord(buffer[i*2])) << 3)
                js = string.join(pal,'')
                pb = gtk.gdk.pixbuf_new_from_data(js, gtk.gdk.COLORSPACE_RGB,
                    False, 8, 352, 288, 1056)
                pb.save('/home/user/MyDocs/.images/daperl00.png','png')
                print pb.get_width(),pb.get_height()
            return True

        def start_stop(self, w):
                if self.button.get_label() == "Start":
                        self.button.set_label("Stop")
                        self.player.set_state(gst.STATE_PLAYING)
                else:
                        self.player.set_state(gst.STATE_NULL)
                        self.button.set_label("Start")

        def exit(self, widget, data=None):
                gtk.main_quit()

        def on_message(self, bus, message):
                t = message.type
                if t == gst.MESSAGE_EOS:
                        self.player.set_state(gst.STATE_NULL)
                        self.button.set_label("Start")
                elif t == gst.MESSAGE_ERROR:
                        err, debug = message.parse_error()
                        print "Error: %s" % err, debug
                        self.player.set_state(gst.STATE_NULL)
                        self.button.set_label("Start")

        def on_sync_message(self, bus, message):
                if message.structure is None:
                        return
                message_name = message.structure.get_name()
                if message_name == "prepare-xwindow-id":
                        # Assign the viewport
                        imagesink = message.src
                        imagesink.set_property("force-aspect-ratio", True)
                        imagesink.set_xwindow_id(self.movie_window.window.xid)

if __name__ == "__main__":
    gtk.gdk.threads_init()
    ShowMe()
    gtk.main()

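The bit-twiddling in doBuffer unpacks little-endian RGB565 pixels (the 63488/2016/31 masks noted in the comments) into 8-bit-per-channel RGB. The same conversion as a standalone sketch in plain Python, no GTK or GStreamer required:

```python
# Little-endian RGB565 -> RGB888.
# Masks: red 0xF800 (63488), green 0x07E0 (2016), blue 0x001F (31).

def rgb565_to_rgb888(lo, hi):
    """lo = first byte in the stream, hi = second (little-endian)."""
    pix = (hi << 8) | lo
    r = (pix >> 11) & 0x1F          # 5 bits
    g = (pix >> 5) & 0x3F           # 6 bits
    b = pix & 0x1F                  # 5 bits
    # Scale each channel up to 8 bits by shifting into the high bits.
    return (r << 3, g << 2, b << 3)

# Pure red 0xF800 is stored little-endian as bytes 0x00, 0xF8:
print(rgb565_to_rgb888(0x00, 0xF8))  # (248, 0, 0)
# Pure green 0x07E0 -> bytes 0xE0, 0x07:
print(rgb565_to_rgb888(0xE0, 0x07))  # (0, 252, 0)
```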

qole 2009-09-19 03:26

Re: Fremantle GStreamer
 
Quote:

Originally Posted by daperl (Post 329558)
As is, it won't work on an n900, but the changes should be obvious.

If you make the "obvious" (to you, maybe! :confused:) changes, I'll test it out for you and even time how long it takes to make the png. I'm curious to know how much faster the process is...

daperl 2009-09-19 05:53

Re: Fremantle GStreamer
 
Run this script and post the output. It won't save a picture; it should just give me information about the captured buffer. Then I'll post back an n900 save solution while I work on a more general one. This is fun; I haven't flipped bits in a long while.

Code:

#! /usr/bin/env python

import string
import platform
import gtk
import gst

class ShowMe:
        def __init__(self):
                window = gtk.Window(gtk.WINDOW_TOPLEVEL)
                window.set_title("Webcam-Viewer")
                window.connect("destroy", gtk.main_quit, "WM destroy")
                vbox = gtk.VBox()
                window.add(vbox)
                self.movie_window = gtk.DrawingArea()
                vbox.add(self.movie_window)
                hbox = gtk.HBox()
                vbox.pack_start(hbox, False)
                hbox.set_border_width(10)
                hbox.pack_start(gtk.Label())
                self.takePicture = 0
                self.button0 = gtk.Button("Snap")
                self.button0.connect("clicked", self.onTakePicture)
                hbox.pack_start(self.button0, False)
                self.button = gtk.Button("Start")
                self.button.connect("clicked", self.start_stop)
                hbox.pack_start(self.button, False)
                self.button2 = gtk.Button("Quit")
                self.button2.connect("clicked", self.exit)
                hbox.pack_start(self.button2, False)
                hbox.add(gtk.Label())
                window.show_all()
                self.machine = platform.uname()[4]

                if self.machine == 'armv6l':
                    self.player = gst.Pipeline('ThePipe')
                    src = gst.element_factory_make("gconfv4l2src","src")
                    self.player.add(src)
                    for p in src.pads():
                        #print p.get_caps().to_string()
                        print p.get_name()
                    caps = gst.element_factory_make("capsfilter", "caps")
                    caps.set_property('caps', gst.caps_from_string(
                        'video/x-raw-rgb,width=352,height=288,bpp=16,depth=16,\
                        framerate=15/1'))
                    #caps.set_property('caps', gst.caps_from_string(
                        #'video/x-raw-rgb,width=352,height=288,\
                        #framerate=15/1'))
                        #red_mask=224,green_mask=28,blue_mask=3,framerate=15/1'))
                    self.player.add(caps)
                    filt = gst.element_factory_make("ffmpegcolorspace", "filt")
                    self.player.add(filt)
                    caps2 = gst.element_factory_make("capsfilter", "caps2")
                    caps2.set_property('caps', gst.caps_from_string(
                        'video/x-raw-rgb,width=352,height=288,bpp=16,depth=16,\
                        framerate=15/1'))
                    self.player.add(caps2)
                    #sink = gst.element_factory_make("xvimagesink", "sink")
                    sink = gst.element_factory_make("autovideosink", "sink")
                    self.player.add(sink)
                    pad = src.get_pad('src')
                    pad.add_buffer_probe(self.doBuffer)
                    src.link(caps)
                    caps.link(filt)
                    filt.link(caps2)
                    caps2.link(sink)
                else:
                    self.player = gst.Pipeline('ThePipe')
                    src = gst.element_factory_make("v4l2src","src")
                    src.set_property('device','/dev/video0')
                    self.player.add(src)
                    sink = gst.element_factory_make("autovideosink", "sink")
                    self.player.add(sink)
                    pad = src.get_pad('src')
                    pad.add_buffer_probe(self.doBuffer)
                    src.link(sink)

                # Set up the gstreamer pipeline
                #self.player = gst.parse_launch ('gconfv4l2src ! video/x-raw-yuv,width=352,height=288,framerate=(fraction)15/1 ! autovideosink')
                #self.player = gst.parse_launch ('gconfv4l2src ! video/x-raw-yuv,width=352,height=288,framerate=(fraction)15/1 ! tee name=qole qole. ! ffmpegcolorspace ! queue ! filesink location=qole.raw qole. ! queue ! autovideosink')
                #self.player = gst.parse_launch ('gconfv4l2src ! video/x-raw-rgb,width=352,height=288,framerate=(fraction)15/1 ! tee name=qole qole. ! ffmpegcolorspace ! jpegenc ! filesink location=qole.raw qole. ! queue ! autovideosink')
                #self.player = gst.parse_launch ('v4l2src ! autovideosink')

                bus = self.player.get_bus()
                bus.add_signal_watch()
                bus.enable_sync_message_emission()
                bus.connect("message", self.on_message)
                bus.connect("sync-message::element", self.on_sync_message)

        def onTakePicture(self, w):
            self.takePicture = 1

        def doBuffer(self, pad, buffer):
            if self.takePicture:
                self.takePicture = 0
                print 'buffer length =',len(buffer)
                caps = buffer.get_caps()
                #struct = caps.get_structure(0)
                struct = caps[0]
                print 'caps',caps
                for i in range(0,struct.n_fields()):
                    fn = struct.nth_field_name(i)
                    print '  ',fn,'=',struct[fn]
                # 63488 2016 31
                # 0xf8  0x07,0xe0  0x1f
                return True
                p = gtk.gdk.Pixbuf(gtk.gdk.COLORSPACE_RGB,False,8,352,288)
                pa = p.get_pixels()
                pal = list(pa)
                for i in range(0,len(buffer)/2):
                    # Unpack little-endian RGB565: byte1 = RRRRRGGG, byte0 = GGGBBBBB
                    pal[i*3] = "%c" % (0xf8 & ord(buffer[i*2+1]))
                    pal[i*3+1] = "%c" % (((0x07 & ord(buffer[i*2+1])) << 5) |\
                        ((0xe0 & ord(buffer[i*2])) >> 3))
                    pal[i*3+2] = "%c" % ((0x1f & ord(buffer[i*2])) << 3)
                js = string.join(pal,'')
                pb = gtk.gdk.pixbuf_new_from_data(js, gtk.gdk.COLORSPACE_RGB,
                    False, 8, 352, 288, 1056)
                pb.save('/home/user/MyDocs/.images/daperl00.png','png')
                print pb.get_width(),pb.get_height()
            return True

        def start_stop(self, w):
                if self.button.get_label() == "Start":
                        self.button.set_label("Stop")
                        self.player.set_state(gst.STATE_PLAYING)
                else:
                        self.player.set_state(gst.STATE_NULL)
                        self.button.set_label("Start")

        def exit(self, widget, data=None):
                gtk.main_quit()

        def on_message(self, bus, message):
                t = message.type
                if t == gst.MESSAGE_EOS:
                        self.player.set_state(gst.STATE_NULL)
                        self.button.set_label("Start")
                elif t == gst.MESSAGE_ERROR:
                        err, debug = message.parse_error()
                        print "Error: %s" % err, debug
                        self.player.set_state(gst.STATE_NULL)
                        self.button.set_label("Start")

        def on_sync_message(self, bus, message):
                if message.structure is None:
                        return
                message_name = message.structure.get_name()
                if message_name == "prepare-xwindow-id":
                        # Assign the viewport
                        imagesink = message.src
                        imagesink.set_property("force-aspect-ratio", True)
                        imagesink.set_xwindow_id(self.movie_window.window.xid)

if __name__ == "__main__":
    gtk.gdk.threads_init()
    ShowMe()
    gtk.main()


daperl 2009-09-19 17:09

Re: Fremantle GStreamer
 
1 Attachment(s)
I've cut file creation down from 13 to under 10 seconds. It's just number crunching, so I'm guessing it should fly on the n900. Also, there should be other options, like the numpy module and GStreamer's jpegenc. jpegenc would be interesting 'cause it would force us to learn more about pipeline control flow, which would be a good thing.

numpy could be just what the doctor ordered.
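As a sketch of the numpy idea: the per-pixel RGB565 loop vectorizes into a handful of array operations. This assumes numpy is available on the device; the function name and the 2x2 test frame are illustrative:

```python
import numpy as np

def rgb565_to_rgb888_np(raw, width, height):
    """raw: width*height*2 bytes of little-endian RGB565; returns (h, w, 3) uint8."""
    pix = np.frombuffer(raw, dtype='<u2').reshape(height, width)
    rgb = np.empty((height, width, 3), dtype=np.uint8)
    rgb[..., 0] = (pix >> 11).astype(np.uint8) << 3          # 5-bit red
    rgb[..., 1] = ((pix >> 5) & 0x3F).astype(np.uint8) << 2  # 6-bit green
    rgb[..., 2] = (pix & 0x1F).astype(np.uint8) << 3         # 5-bit blue
    return rgb

# One all-green 2x2 frame: 0x07E0 stored little-endian is b'\xe0\x07'.
frame = b'\xe0\x07' * 4
print(rgb565_to_rgb888_np(frame, 2, 2)[0, 0])  # red=0, green=252, blue=0
```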

qgil 2009-09-20 10:14

Re: Fremantle GStreamer
 
If you want to go beyond http://wiki.maemo.org/Documentation/...mera_API_Usage then I recommend asking on maemo-developers, where some of our multimedia developers are following along.

daperl 2009-09-21 05:20

Re: Fremantle GStreamer
 
1 Attachment(s)
Code:

/* Initialize the Gstreamer pipeline. Below is a diagram
 * of the pipeline that will be created:
 *
 *                            |Screen|  |Screen|
 *                          ->|queue |->|sink  |-> Display
 * |Camera|  |CSP   |  |Tee|/
 * |src   |->|Filter|->|   |\  |Image |  |Image |  |Image|
 *                          ->|queue |->|filter|->|sink  |-> JPEG file
 */

So, this is the pipeline from example_camera.c. I basically implemented this in the attached Python text file, with the addition of scaling down the displayed dimensions (nearest-neighbour). This cuts file creation time in half, but as I feared, it also unnecessarily sucks on the CPU.

I quickly tried modifying things so that the image queue path doesn't run until a snapshot is requested, but I haven't had luck there yet. So, even though a file isn't created until the button's clicked, the Image queue and the Image filter are processing every frame. On my n800, that's 640x480 @ 15 fps. For the n900, that would be more like 2592x1944 @ 25 fps. According to the specs, the supplied video recorder does 848x480 @ 25 fps with unknown sound quality. Pretty good, and I'm guessing the supplied camera app doesn't use the above pipeline. But if it does, which I doubt, there seems to be room for improvement.

The problem here is that you really only want to be pushing the decimated view finder pixels until it's picture time, so I really just want some hardware decimated camera buffer. Instead, with a pipeline like this, I'm using the CPU to decimate the feed. Currently, I'm using a view window of 320x240. From the n900 demos, it looks like they're using close to the whole 800x480. And since I haven't figured out how to no-op the Image queue until photo request, I have two unnecessary loads.

My next two steps are to see if I can change the Camera src capabilities on-the-fly, and also see if I can dynamically link and unlink the Image branch from the tee when taking a picture. The first might need a pipeline start-and-stop, but the second might just need a simple switch that I haven't found yet.

But most importantly, are the camera and video apps open source? :)

daperl 2009-09-21 05:33

Re: python / gstreamer / camera / xvimagesink issues
 
1 Attachment(s)
Attached is a hard-coded, proof-of-concept that works on my n800. Look at it closely 'cause you'll probably have to change a few things to get it working on the n900. Including, but not limited to, anything that says:

Code:

if self.machine == 'armv6l':
:)

More info about what I've been doing can be found here.

abbra 2009-09-21 05:41

Re: python / gstreamer / camera / xvimagesink issues
 
I tried several Vala Gstreamer samples from http://live.gnome.org/Vala/GStreamerSample -- the last one is working well with N900 when v4l2camsrc is used instead of videotestsrc.

While this is not Python, you can easily see how to use GStreamer correctly, and I can confirm that this approach works well on the device with both cameras (front and rear).

qole 2009-09-21 06:09

Re: python / gstreamer / camera / xvimagesink issues
 
Just a note, guys. The original post (from the thread that this post was originally in before sjgadsby unceremoniously dumped everything into the right thread) is talking about Diablo and the tablets, not Fremantle and the N900...

I know that's where everyone's head is these days, but read closely... :)

daperl 2009-09-21 06:31

Re: python / gstreamer / camera / xvimagesink issues
 
Quote:

Originally Posted by qole (Post 330533)
Just a note, guys. The original post is talking about Diablo and the tablets, not Fremantle and the N900...

I know that's where everyone's head is these days, but read closely... :)

Well, he should fire my sh*t up then. And where is my output from you? Did cut-and-paste stop working on your n900? :)

abbra 2009-09-21 07:54

Re: python / gstreamer / camera / xvimagesink issues
 
I wish I had looked at the original post's timestamp :)

daperl 2009-09-21 07:59

Re: python / gstreamer / camera / xvimagesink issues
 
Quote:

Originally Posted by abbra (Post 330565)
I wish I had looked at the original post's timestamp :)

Doh! That's twice this month I've done that.

qole 2009-09-21 18:31

Re: python / gstreamer / camera / xvimagesink issues
 
It's lardman's fault. He should have posted on my thread, not this ancient, crusty thread.

And sorry, daperl, I haven't tested your scripts yet. My brain was refusing to go into "work" mode all weekend. It just wanted to sit around on the couch and drink beer. :(

lardman 2009-09-21 19:18

Re: python / gstreamer / camera / xvimagesink issues
 
Sorry, I thought this was the thread you were talking about!

sjgadsby 2009-09-21 20:10

Re: python / gstreamer / camera / xvimagesink issues
 
Quote:

Originally Posted by qole (Post 330817)
It's lardman's fault. He should have posted on my thread...

Quote:

Originally Posted by lardman (Post 330839)
Sorry, I thought this was the thread you were talking about!

Okay, enough of that; I've moved the posts to the proper, non-"crusty" thread. So, now you're all in the right place, but your side discussion about being in the wrong place will confound future readers.

jcharpak 2009-09-21 21:28

Re: python / gstreamer / camera / xvimagesink issues
 
Quote:

Originally Posted by sjgadsby (Post 330877)
Okay, enough of that; I've moved the posts to the proper, non-"crusty" thread. So, now you're all in the right place, but your side discussion about being in the wrong place will confound future readers.

Like me :)

lardman 2009-09-21 22:51

Re: Fremantle GStreamer
 
So while we're at it, what are the thoughts on camera focusing?

There are v4l2 hooks there in ad5820.c iirc, and it looks like the camerabin component (or photography interface) can use them.

However, as I'm currently writing in Python, that's not so great. There is v4l2camsrc, which looks like it's linked to the above, but I'm not sure I can do anything with it (as it just appears as a source, afaict).

Otherwise I'm considering talking to the ad5820 directly using ioctls, but then I need to work out how to pull in the isp_af functionality, or implement my own (which would be interesting, but ultimately a waste of time).

So, anyone have any bright ideas?
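For the ioctl route, something along these lines might work from Python; this is a hypothetical, untested-on-hardware sketch. The 0x9a090a id is the "Focus, Absolute" control the omap3 driver reports, and VIDIOC_S_CTRL is the standard V4L2 set-control ioctl:

```python
# Hypothetical sketch: drive the ad5820 "Focus, Absolute" control
# (id 0x9a090a per the driver's control listing) via VIDIOC_S_CTRL.
import fcntl
import os
import struct

def _IOWR(type_ch, nr, size):
    # Linux ioctl request encoding: dir(2 bits) | size(14) | type(8) | nr(8)
    _IOC_WRITE, _IOC_READ = 1, 2
    return ((_IOC_READ | _IOC_WRITE) << 30) | (size << 16) | (ord(type_ch) << 8) | nr

# struct v4l2_control { __u32 id; __s32 value; } -> 8 bytes
VIDIOC_S_CTRL = _IOWR('V', 28, struct.calcsize('Ii'))
V4L2_CID_FOCUS_ABSOLUTE = 0x009a090a    # as reported by the driver

def set_focus(dev, value):
    """Set absolute focus (0..1023 per the driver) on e.g. '/dev/video0'."""
    fd = os.open(dev, os.O_RDWR)
    try:
        fcntl.ioctl(fd, VIDIOC_S_CTRL,
                    struct.pack('Ii', V4L2_CID_FOCUS_ABSOLUTE, value))
    finally:
        os.close(fd)

print(hex(VIDIOC_S_CTRL))  # 0xc008561c
```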

qole 2009-09-22 05:07

Re: Fremantle GStreamer
 
lardman: this is output from Fremantle VLC that you might find useful:

Code:

opening device '/dev/video0'
 V4L2 device: omap3/et8ek8/ad5820/adp1653 using driver: omap3 (version: 0.0.0) on
 the device has the capabilities: (X) Video Capture, ( ) Audio, ( ) Tuner
 supported I/O methods are: ( ) Read/Write, (X) Streaming, ( ) Asynchronous
 video input 0 (camera) has type: External analog input *
 device supports chroma UYVY [UYVY, packed, UYVY]
    device supports size 2592x1968
    device supports size 1296x984
    device supports size 864x656
    device supports size 640x492
 device supports chroma YUY2 [YUYV (YUV 4:2:2), packed, YUYV]
    device supports size 2592x1968
    device supports size 1296x984
    device supports size 864x656
    device supports size 640x492
 device codec BA10 (Bayer10 (GrR/BGb)) not supported
 '/dev/video0' is a video device
 Extended control API supported by v4l2 driver
 Available control: Brightness (980900)
    integer control
    valid values: 0 to 255 by steps of 1
    default value: 0
    current value: 0
 Available control: Contrast (980901)
    integer control
    valid values: 0 to 255 by steps of 1
    default value: 16
    current value: 16
 Available control: Exposure time [us] (980911)
    integer control
    valid values: 33 to 33132 by steps of 33
    default value: 33132
    current value: 33133
 Available control: Gain [0.1 EV] (980913)
    integer control
    valid values: 0 to 40 by steps of 1
    default value: 0
    current value: 0
 Available control: Color Effects (98091f)
    menu control
        0: None
        1: B&W
        2: Sepia
    default value: 0
    current value: 0
 Available private control: Focus, Absolute (9a090a)
    integer control
    valid values: 0 to 1023 by steps of 1
    default value: 0
    current value: 79
 Available private control: Flash strobe (9a090d)
    button control
 Available private control: Flash timeout [us] (9a090e)
    integer control
    valid values: 1000 to 500000 by steps of 54600
    default value: 500000
    current value: 500000
 Available private control: Flash intensity (9a090f)
    integer control
    valid values: 12 to 19 by steps of 1
    default value: 12
    current value: 12
 Available private control: Torch intensity (9a0910)
    integer control
    valid values: 0 to 1 by steps of 1
    default value: 0
    current value: 0
 Available private control: Indicator intensity (9a0911)
    integer control
    valid values: 0 to 7 by steps of 1
    default value: 0
    current value: 0
 Available private control: Test pattern mode (9a107e)
    menu control
        0: Normal
        1: Vertical colorbar
        2: Horizontal colorbar
        3: Scale
        4: Ramp
        5: Small vertical colorbar
        6: Small horizontal colorbar
        7: Small scale
        8: Small ramp
    default value: 0
    current value: 0
 Available private control: Focus ramping time [us] (9a10af)
    integer control
    valid values: 0 to 3200 by steps of 50
    default value: 0
    current value: 0
 Available private control: Focus ramping mode (9a10b0)
    menu control
        0: Linear ramp
        1: 64/16 ramp
    default value: 0
    current value: 0


lardman 2009-09-22 08:56

Re: Fremantle GStreamer
 
Good, at least the focus components are available through v4l2 on the device. No auto focus there, but just altering it would be good as a starting point.

lardman 2009-09-22 09:25

Re: Fremantle GStreamer
 
Hmm, using v4l2camsrc, I get a frame which is 614400 bytes in size.

Now that is 640 x 480 x 2

But the camera is not supposed to return that frame size, and it also seems an odd encoding: either you get one-to-one YUV, or the U and V data are 1/4 the size of the Y data, so I'd expect the total frame to be 640 x 480 x 1.5.

Though I guess there are other encodings available (and this seems to be the same one I saw before).

Perhaps it's just the fact it's morning ;)

Indeed just morning. As the encoding is UYVY (or something like that), every 2 pixels are packed into 4 bytes (U Y V Y), i.e. 2 bytes per pixel -> the frame is twice the size of the resolution in pixels. Doh!
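That packing is easy to sanity-check: UYVY stores 4 bytes per 2-pixel group, so 2 bytes per pixel. A minimal check in plain Python (640x492 being the large camera's native size mentioned earlier in the thread):

```python
# Packed UYVY/YUYV: each 2-pixel group is U Y V Y (or Y U Y V) = 4 bytes,
# so bytes_per_pixel = 2 and a frame is width * height * 2 bytes.
def uyvy_frame_size(width, height):
    return width * height * 2

print(uyvy_frame_size(640, 480))  # 614400
print(uyvy_frame_size(640, 492))  # 629760
```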

qole 2009-09-22 15:57

Re: Fremantle GStreamer
 
What does the line "device codec BA10 (Bayer10 (GrR/BGb)) not supported" mean, I wonder? What's Bayer10 (GrR/BGb)?

abbra 2009-09-22 18:52

Re: Fremantle GStreamer
 
Bayer10 is a raw sensor data format. Most camera sensors can export data either in an already pre-processed format (some form of YUV) or as the raw, unprocessed output of the sensor.

A sensor typically has a number of color filters, which need to be placed in a certain order and in certain quantities to capture light in a way similar to what our eye does. Bryce E. Bayer of Eastman Kodak developed a specific arrangement in which 50% of the sites are green and 25% each are red and blue. This pattern became quite popular, so the output format of any sensor that uses this arrangement is called a Bayer pattern.

Bayer10 means 10-bit values for the Green-Red/Blue-Green output from the sensor. "not supported" here means that the V4L2 driver for this particular sensor does not give access to the raw output of the sensor's Bayer filter. The raw output may be obtainable some other way, but the V4L2 driver implementation simply does not expose it.
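The arrangement abbra describes can be sketched as a tiny pattern generator; this illustrates the mosaic layout only, not the driver's actual code (the GR/BG tile ordering is an assumption matching the log's GrR/BGb label):

```python
# Build the color-filter layout of a Bayer mosaic by tiling a 2x2 cell:
# row 0: G R G R ..., row 1: B G B G ...  (50% green, 25% red, 25% blue)
def bayer_pattern(width, height, tile=("GR", "BG")):
    return [[tile[y % 2][x % 2] for x in range(width)] for y in range(height)]

pat = bayer_pattern(4, 4)
for row in pat:
    print(''.join(row))
# GRGR
# BGBG
# GRGR
# BGBG

# Sanity-check the 50/25/25 split:
flat = [c for row in pat for c in row]
print(flat.count('G'), flat.count('R'), flat.count('B'))  # 8 4 4
```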

daperl 2009-09-22 23:44

Re: Fremantle GStreamer
 
Note to self: just create JPEGs.

I just switched from creating .png files to creating .jpeg files, and creation time is now two blinks of an eye. And that's with quality set to 100%.

qole 2009-10-27 19:58

Re: Fremantle GStreamer
 
This seems to be good news to me: GDigicam at maemo.gitorious.org

daperl 2009-10-27 20:30

Re: Fremantle GStreamer
 
5 Attachment(s)
Do you not have a Fremantle dev environment? If not, why not?

I've attached the non-doc/dbg debs if you wanted to try it out.

qole 2009-10-27 21:40

Re: Fremantle GStreamer
 
I just meant it is good news to see fresh source code in this area being posted and developed in real time.

EDIT: The examples and tests look interesting, though...


All times are GMT. The time now is 13:00.

vBulletin® Version 3.8.8