* Re: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)
@ 2009-06-11 9:33 Hans Verkuil
2009-06-11 9:40 ` Hans de Goede
0 siblings, 1 reply; 16+ messages in thread
From: Hans Verkuil @ 2009-06-11 9:33 UTC (permalink / raw)
To: Hans de Goede
Cc: Jean-Philippe François, Karicheri, Muralidharan,
davinci-linux-open-source@linux.davincidsp.com,
Muralidharan Karicheri, Guennadi Liakhovetski,
linux-media@vger.kernel.org
>
>
> On 06/11/2009 10:35 AM, Hans Verkuil wrote:
>>> Karicheri, Muralidharan a écrit :
>>>>>> We need
>>>>>> streaming capability in the driver. This is how our driver works
>>>>>> with mt9t031 sensor
>>>>>> raw-bus (10 bit)
>>>>>> vpfe-capture ----------------- mt9t031 driver
>>>>>> | |
>>>>>> V V
>>>>>> VPFE MT9T031
>>>>>>
>>>>>> VPFE hardware has internal timing and DMA controller to
>>>>>> copy data frame by frame from the sensor output to SDRAM.
>>>>>> The PCLK from the sensor is used to generate the internal
>>>>>> timing.
>>>>> So, what is missing in the driver apart from the ability to specify
>>>>> a frame-rate?
>>>>>
>>>> [MK] Does the mt9t031 output one frame (snapshot) like in a camera or
>>>> can it output frames continuously along with PCLK, Hsync and Vsync
>>>> signals like in a video streaming device. VPFE capture can accept
>>>> frames
>>>> continuously from the sensor synchronized to PCLK, HSYNC and VSYNC and
>>>> output frames to application using QBUF/DQBUF. In our implementation,
>>>> we
>>>> have timing parameters for the sensor to do streaming at various
>>>> resolutions and fps. So the application calls S_STD to set these timings.
>>>> I
>>>> am not sure if this is an acceptable way of implementing it. Any
>>>> comments?
>>>>
>>> PCLK, HSYNC, VSYNC are generated by the CMOS sensor. I don't think you
>>> can set the timings. Depending on sensor settings, pixel clock speed
>>> etc
>>> .., the frame rate will vary.
>>>
>>> You could perhaps play with the CMOS sensor registers so that when
>>> setting a standard, the driver somehow sets the various exposure
>>> parameters and PLL settings to get a specified framerate.
>>>
>>> This will vary with each sensor and each platform (because of
>>> pixelclock). Moreover, chances are that it will conflict with
>>> other controls.
>>>
>>> For example, if you set a fixed gain and autoexposure, some sensors will see
>>> a drop in fps under low light conditions. I think this kind of
>>> arbitration should be left to userspace.
>>>
>>> Unless the sensor supports a specific standard, I don't think the
>>> driver
>>> should try to make behind-the-scenes modifications to camera sensor
>>> registers in response to an S_STD ioctl.
>>
>> The S_STD call is hopelessly inadequate to deal with these types of
>> devices. What is needed is a new call that allows you to set the exact
>> timings you want: frame width/height, back/front porch, h/vsync width,
>> pixelclock. It is my opinion that the use of S_STD should be limited to
>> standard definition type inputs, and not used for other devices like
>> sensors or HD video.
>>
>> Proposals for such a new ioctl are welcome :-)
>>
>
> Hmm,
>
> Why would we want the *application* to set things like this *at all* ?
> with sensors, hsync, vsync and other timings are something between
> the bridge and the sensor; actually, in my experience the correct
> hsync / vsync timings to program the sensor to are very much bridge
> specific. So IMHO this should not be exposed to userspace at all.
>
> All userspace should be able to control is the resolution and the
> framerate. Although controlling the framerate in many cases also
> means controlling the maximum exposure time. So in many cases
> one cannot even control the framerate. (Asking for 30 fps in an
> artificially illuminated room will get you a very dark, useless
> picture, with most sensors). Yes, this means that with cams which
> use autoexposure (which is something we really want wherever
> possible), the framerate can and will change while streaming.
I think we have three possible use cases here:
- old-style standard definition video: use S_STD
- webcam-like devices: a combination of S_FMT and S_PARM I think? Correct
me if I'm wrong. S_STD is useless for this, right?
- video streaming devices like the davinci videoports where you can hook
up HDTV receivers or FPGAs: here you definitely need a new API to setup
the streaming parameters, and you want to be able to do that from the
application as well. Actually, sensors are also hooked up to these devices
in practice. And there you also want to be able to setup these parameters.
You will see this mostly (only?) on embedded platforms.
Regards,
Hans
--
Hans Verkuil - video4linux developer - sponsored by TANDBERG
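To make the proposal above concrete, here is a rough sketch of the kind of explicit-timings structure Hans is asking for. Every name and field below is invented for illustration; no such ioctl existed in V4L2 at the time of this thread.

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical argument for a "set detailed timings" ioctl: instead of a
 * v4l2_std_id, the application passes the exact frame geometry, blanking
 * and pixel clock. All names here are illustrative, not an existing API. */
struct hypothetical_bt_timings {
	uint32_t width;        /* active width in pixels */
	uint32_t height;       /* active height in lines */
	uint32_t hfrontporch;  /* pixels from active video to hsync */
	uint32_t hsync;        /* hsync width in pixels */
	uint32_t hbackporch;   /* pixels from hsync to active video */
	uint32_t vfrontporch;  /* lines from active video to vsync */
	uint32_t vsync;        /* vsync width in lines */
	uint32_t vbackporch;   /* lines from vsync to active video */
	uint64_t pixelclock;   /* pixel clock in Hz */
};

/* The frame rate follows from the timings: pixel clock divided by the
 * total (active + blanking) frame size. Returned as fps x 100 to avoid
 * floating point. */
static uint64_t frame_rate_x100(const struct hypothetical_bt_timings *t)
{
	uint64_t htotal = t->width + t->hfrontporch + t->hsync + t->hbackporch;
	uint64_t vtotal = t->height + t->vfrontporch + t->vsync + t->vbackporch;

	return t->pixelclock * 100 / (htotal * vtotal);
}
```

With the standard 720p60 numbers (1650x750 total at 74.25 MHz) this yields exactly 60 fps, which is the kind of consistency check a driver could perform on requested timings.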
^ permalink raw reply [flat|nested] 16+ messages in thread
* Re: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)
2009-06-11 9:33 mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device) Hans Verkuil
@ 2009-06-11 9:40 ` Hans de Goede
2009-06-11 14:43 ` Karicheri, Muralidharan
0 siblings, 1 reply; 16+ messages in thread
From: Hans de Goede @ 2009-06-11 9:40 UTC (permalink / raw)
To: Hans Verkuil
Cc: Jean-Philippe François, Karicheri, Muralidharan,
davinci-linux-open-source@linux.davincidsp.com,
Muralidharan Karicheri, Guennadi Liakhovetski,
linux-media@vger.kernel.org
On 06/11/2009 11:33 AM, Hans Verkuil wrote:
>>
>> On 06/11/2009 10:35 AM, Hans Verkuil wrote:
<snip (a lot)>
>> Hmm,
>>
>> Why would we want the *application* to set things like this *at all* ?
>> with sensors, hsync, vsync and other timings are something between
>> the bridge and the sensor; actually, in my experience the correct
>> hsync / vsync timings to program the sensor to are very much bridge
>> specific. So IMHO this should not be exposed to userspace at all.
>>
>> All userspace should be able to control is the resolution and the
>> framerate. Although controlling the framerate in many cases also
>> means controlling the maximum exposure time. So in many cases
>> one cannot even control the framerate. (Asking for 30 fps in an
>> artificially illuminated room will get you a very dark, useless
>> picture, with most sensors). Yes, this means that with cams which
>> use autoexposure (which is something we really want wherever
>> possible), the framerate can and will change while streaming.
>
> I think we have three possible use cases here:
>
> - old-style standard definition video: use S_STD
>
Ack
> - webcam-like devices: a combination of S_FMT and S_PARM I think? Correct
> me if I'm wrong. S_STD is useless for this, right?
>
Ack
> - video streaming devices like the davinci videoports where you can hook
> up HDTV receivers or FPGAs: here you definitely need a new API to setup
> the streaming parameters, and you want to be able to do that from the
> application as well. Actually, sensors are also hooked up to these devices
> in practice. And there you also want to be able to setup these parameters.
> You will see this mostly (only?) on embedded platforms.
>
I agree we need an in-kernel API for this, but why expose it to
userspace? As you say, this will only happen on embedded systems;
shouldn't the info then go in a board_info file / struct?
Regards,
Hans
* RE: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)
2009-06-11 9:40 ` Hans de Goede
@ 2009-06-11 14:43 ` Karicheri, Muralidharan
0 siblings, 0 replies; 16+ messages in thread
From: Karicheri, Muralidharan @ 2009-06-11 14:43 UTC (permalink / raw)
To: Hans de Goede, Hans Verkuil
Cc: Jean-Philippe François,
davinci-linux-open-source@linux.davincidsp.com,
Muralidharan Karicheri, Guennadi Liakhovetski,
linux-media@vger.kernel.org
>
>> - video streaming devices like the davinci videoports where you can hook
>> up HDTV receivers or FPGAs: here you definitely need a new API to setup
>> the streaming parameters, and you want to be able to do that from the
>> application as well. Actually, sensors are also hooked up to these
>devices
>> in practice. And there you also want to be able to setup these parameters.
>> You will see this mostly (only?) on embedded platforms.
>>
>
>I agree we need an in-kernel API for this, but why expose it to
>userspace? As you say, this will only happen on embedded systems;
>shouldn't the info then go in a board_info file / struct?
>
No, we still need a way for the application to set these timings at the device. For example, it needs to tell a TVP7002 device to scan at 720p/1080p, similar to S_STD. From the user's perspective, it is just like S_STD. See my email for the details...
>Regards,
>
>Hans
* Re: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)
@ 2009-06-11 10:39 Hans Verkuil
2009-06-11 14:40 ` Karicheri, Muralidharan
0 siblings, 1 reply; 16+ messages in thread
From: Hans Verkuil @ 2009-06-11 10:39 UTC (permalink / raw)
To: Hans de Goede
Cc: Jean-Philippe François, Karicheri, Muralidharan,
davinci-linux-open-source@linux.davincidsp.com,
Muralidharan Karicheri, Guennadi Liakhovetski,
linux-media@vger.kernel.org
>
>
> On 06/11/2009 11:33 AM, Hans Verkuil wrote:
>>>
>>> On 06/11/2009 10:35 AM, Hans Verkuil wrote:
>
> <snip (a lot)>
>
>>> Hmm,
>>>
>>> Why would we want the *application* to set things like this *at all* ?
>>> with sensors, hsync, vsync and other timings are something between
>>> the bridge and the sensor; actually, in my experience the correct
>>> hsync / vsync timings to program the sensor to are very much bridge
>>> specific. So IMHO this should not be exposed to userspace at all.
>>>
>>> All userspace should be able to control is the resolution and the
>>> framerate. Although controlling the framerate in many cases also
>>> means controlling the maximum exposure time. So in many cases
>>> one cannot even control the framerate. (Asking for 30 fps in an
>>> artificially illuminated room will get you a very dark, useless
>>> picture, with most sensors). Yes, this means that with cams which
>>> use autoexposure (which is something we really want wherever
>>> possible), the framerate can and will change while streaming.
>>
>> I think we have three possible use cases here:
>>
>> - old-style standard definition video: use S_STD
>>
>
> Ack
>
>> - webcam-like devices: a combination of S_FMT and S_PARM I think?
>> Correct
>> me if I'm wrong. S_STD is useless for this, right?
>>
>
> Ack
>
>> - video streaming devices like the davinci videoports where you can hook
>> up HDTV receivers or FPGAs: here you definitely need a new API to setup
>> the streaming parameters, and you want to be able to do that from the
>> application as well. Actually, sensors are also hooked up to these
>> devices
>> in practice. And there you also want to be able to setup these
>> parameters.
>> You will see this mostly (only?) on embedded platforms.
>>
>
> I agree we need an in-kernel API for this, but why expose it to
> userspace? As you say, this will only happen on embedded systems;
> shouldn't the info then go in a board_info file / struct?
These timings are not fixed. E.g. a 720p60 video stream has different
timings compared to a 1080p60 stream. So you have to be able to switch
from userspace. It's like PAL and NTSC, but then many times worse :-)
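The point is easy to illustrate with the well-known CEA-861 numbers: 720p60 and 1080p60 both run at 60 fps, yet their frame totals and pixel clocks differ completely, so a fixed board-level setting cannot describe both. The struct below is purely illustrative.

```c
#include <assert.h>
#include <stdint.h>

/* Total frame size (active + blanking) and pixel clock per mode.
 * The numbers are the standard CEA-861 values. */
struct mode { uint32_t htotal, vtotal; uint64_t pixclk_hz; };

static const struct mode m720p60  = { 1650,  750,  74250000ULL };
static const struct mode m1080p60 = { 2200, 1125, 148500000ULL };

/* Frame rate = pixel clock / (total pixels per frame). */
static uint64_t fps(const struct mode *m)
{
	return m->pixclk_hz / ((uint64_t)m->htotal * m->vtotal);
}
```

Both compute to 60 fps, but everything the hardware must actually be programmed with (blanking, sync placement, clock) differs, which is why the selection has to be switchable at runtime from userspace.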
Regards,
Hans
--
Hans Verkuil - video4linux developer - sponsored by TANDBERG
* RE: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)
2009-06-11 10:39 Hans Verkuil
@ 2009-06-11 14:40 ` Karicheri, Muralidharan
0 siblings, 0 replies; 16+ messages in thread
From: Karicheri, Muralidharan @ 2009-06-11 14:40 UTC (permalink / raw)
To: Hans Verkuil, Hans de Goede
Cc: Jean-Philippe François,
davinci-linux-open-source@linux.davincidsp.com,
Muralidharan Karicheri, Guennadi Liakhovetski,
linux-media@vger.kernel.org
Hi,
In our experience, we have the following cases where we need to set the device to capture/display at a certain resolution and fps.
Capture side
-------------
1) IP netcam applications. If the solution uses Micron sensors such as the MT9T001/031/P031 etc., the application will need to stream at various resolutions ranging from VGA and SVGA to UXGA, or 480p and 576p to 1080p. In this case, what we have implemented is a table of timings looked up by the STD string. Though this is not a standard, it helps in achieving streaming at the desired frame rate. Exposure is used to get the desired frame rate and video quality. The VPFE has modules to fine-tune the image quality.
2) The TVP7002 decoder chip supports the following analog video standards:
SDTV(YPbPr component) - 480i/576i
EDTV (YPbPr component) - 480p/576p
HDTV (YPbPr component) - 720p@50/60, 1080i@50/60, 1080p@50/60
PC graphics (RGB Component) - VGA to UXGA
3) TVP5146/TVP5147 - supports NTSC/PAL standards from SDTV
4) Camera applications that do preview and take snapshots. We don't support this in Linux, but have solutions based on other OSes.
Display side
------------
1) The VPBE (video processing back end) can generate NTSC/PAL timing signals directly from the SoC.
2) By connecting a daughter card that does voltage translation to the digital LCD port of the VPBE, it can support PC graphics timings. Examples are the Logic PD LCD / Avnet LCD kits that can be connected using these daughter cards.
3) The digital LCD port of the VPBE can generate BT.656/BT.1120 timings, so you could connect an encoder chip such as the THS8200 to generate 720p/1080p YPbPr component outputs. This can support any encoder chip that accepts YUV, RGB666 or RGB888 data along with timing signals and outputs PC graphics, YPbPr component, or standard NTSC/PAL video.
As you can see, S_STD can be used only for 3) on the capture side and 1) on the display side, since it doesn't cover all the above timings; it is therefore of limited use here. So we need an API that can do the following...
Query the available timing settings from the encoder/decoder/sensors. Since the raw timings are not relevant to the application domain, they can be defined in the driver, which exposes only the following as part of the query and uses them to look up the correct timings:
1) resolution (VGA, 720p, 1080p, 576p etc.)
2) frame rate
Set the timing by specifying the above.
Detect the signal for capture, similar to QUERYSTD??
Get the current timing...
Were VIDIOC_S_PARM & G_PARM added for this purpose? Maybe they need to be enhanced to include the resolution as well, or a new set of APIs added...
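The query-then-set flow described above might look like the following sketch. The preset table and function name are hypothetical; the idea is only that the driver keeps the full timing tables internal and exposes nothing but resolution and frame rate.

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical per-driver preset table: the application enumerates
 * (resolution, fps) pairs by index; the driver maps the chosen preset
 * to its internal timing parameters. */
struct preset { const char *name; unsigned width, height, fps; };

static const struct preset presets[] = {
	{ "480p",   720,  480, 60 },
	{ "576p",   720,  576, 50 },
	{ "720p",  1280,  720, 60 },
	{ "1080p", 1920, 1080, 60 },
};

/* ENUM-style query by index, like the VIDIOC_ENUM_* ioctls: an
 * out-of-range index ends the enumeration (a real ioctl would
 * return -EINVAL instead of NULL). */
static const struct preset *enum_preset(unsigned index)
{
	if (index >= sizeof(presets) / sizeof(presets[0]))
		return NULL;
	return &presets[index];
}
```

A "set" call would then take the same (resolution, fps) pair, and the driver would look up and program the full timings itself.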
Murali Karicheri
Software Design Engineer
Texas Instruments Inc.
email: m-karicheri2@ti.com
>-----Original Message-----
>From: linux-media-owner@vger.kernel.org [mailto:linux-media-
>owner@vger.kernel.org] On Behalf Of Hans Verkuil
>Sent: Thursday, June 11, 2009 6:40 AM
>To: Hans de Goede
>Cc: Jean-Philippe François; Karicheri, Muralidharan; davinci-linux-open-
>source@linux.davincidsp.com; Muralidharan Karicheri; Guennadi Liakhovetski;
>linux-media@vger.kernel.org
>Subject: Re: mt9t031 (was RE: [PATCH] adding support for setting bus
>parameters in sub device)
>
>
>>
>>
>> On 06/11/2009 11:33 AM, Hans Verkuil wrote:
>>>>
>>>> On 06/11/2009 10:35 AM, Hans Verkuil wrote:
>>
>> <snip (a lot)>
>>
>>>> Hmm,
>>>>
>>>> Why would we want the *application* to set things like this *at all* ?
>>>> with sensors, hsync, vsync and other timings are something between
>>>> the bridge and the sensor; actually, in my experience the correct
>>>> hsync / vsync timings to program the sensor to are very much bridge
>>>> specific. So IMHO this should not be exposed to userspace at all.
>>>>
>>>> All userspace should be able to control is the resolution and the
>>>> framerate. Although controlling the framerate in many cases also
>>>> means controlling the maximum exposure time. So in many cases
>>>> one cannot even control the framerate. (Asking for 30 fps in an
>>>> artificially illuminated room will get you a very dark, useless
>>>> picture, with most sensors). Yes, this means that with cams which
>>>> use autoexposure (which is something we really want wherever
>>>> possible), the framerate can and will change while streaming.
>>>
>>> I think we have three possible use cases here:
>>>
>>> - old-style standard definition video: use S_STD
>>>
>>
>> Ack
>>
>>> - webcam-like devices: a combination of S_FMT and S_PARM I think?
>>> Correct
>>> me if I'm wrong. S_STD is useless for this, right?
>>>
>>
>> Ack
>>
>>> - video streaming devices like the davinci videoports where you can hook
>>> up HDTV receivers or FPGAs: here you definitely need a new API to setup
>>> the streaming parameters, and you want to be able to do that from the
>>> application as well. Actually, sensors are also hooked up to these
>>> devices
>>> in practice. And there you also want to be able to setup these
>>> parameters.
>>> You will see this mostly (only?) on embedded platforms.
>>>
>>
>> I agree we need an in-kernel API for this, but why expose it to
>> userspace? As you say, this will only happen on embedded systems;
>> shouldn't the info then go in a board_info file / struct?
>
>These timings are not fixed. E.g. a 720p60 video stream has different
>timings compared to a 1080p60 stream. So you have to be able to switch
>from userspace. It's like PAL and NTSC, but then many times worse :-)
>
>Regards,
>
> Hans
>
>--
>Hans Verkuil - video4linux developer - sponsored by TANDBERG
>
>--
>To unsubscribe from this list: send the line "unsubscribe linux-media" in
>the body of a message to majordomo@vger.kernel.org
>More majordomo info at http://vger.kernel.org/majordomo-info.html
* Re: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)
@ 2009-06-11 8:35 Hans Verkuil
2009-06-11 9:13 ` Hans de Goede
0 siblings, 1 reply; 16+ messages in thread
From: Hans Verkuil @ 2009-06-11 8:35 UTC (permalink / raw)
To: Jean-Philippe François
Cc: Karicheri, Muralidharan,
davinci-linux-open-source@linux.davincidsp.com,
Muralidharan Karicheri, Guennadi Liakhovetski,
linux-media@vger.kernel.org
> Karicheri, Muralidharan a écrit :
>>
>>>> We need
>>>> streaming capability in the driver. This is how our driver works
>>>> with mt9t031 sensor
>>>> raw-bus (10 bit)
>>>> vpfe-capture ----------------- mt9t031 driver
>>>> | |
>>>> V V
>>>> VPFE MT9T031
>>>>
>>>> VPFE hardware has internal timing and DMA controller to
>>>> copy data frame by frame from the sensor output to SDRAM.
>>>> The PCLK from the sensor is used to generate the internal
>>>> timing.
>>> So, what is missing in the driver apart from the ability to specify
>>> a frame-rate?
>>>
>> [MK] Does the mt9t031 output one frame (snapshot) like in a camera or
>> can it output frames continuously along with PCLK, Hsync and Vsync
>> signals like in a video streaming device. VPFE capture can accept frames
>> continuously from the sensor synchronized to PCLK, HSYNC and VSYNC and
>> output frames to application using QBUF/DQBUF. In our implementation, we
>> have timing parameters for the sensor to do streaming at various
>> resolutions and fps. So the application calls S_STD to set these timings. I
>> am not sure if this is an acceptable way of implementing it. Any
>> comments?
>>
> PCLK, HSYNC, VSYNC are generated by the CMOS sensor. I don't think you
> can set the timings. Depending on sensor settings, pixel clock speed etc
> .., the frame rate will vary.
>
> You could perhaps play with the CMOS sensor registers so that when
> setting a standard, the driver somehow sets the various exposure
> parameters and PLL settings to get a specified framerate.
>
> This will vary with each sensor and each platform (because of
> pixelclock). Moreover, chances are that it will conflict with
> other controls.
>
> For example, if you set a fixed gain and autoexposure, some sensors will see
> a drop in fps under low light conditions. I think this kind of
> arbitration should be left to userspace.
>
> Unless the sensor supports a specific standard, I don't think the driver
> should try to make behind-the-scenes modifications to camera sensor
> registers in response to an S_STD ioctl.
The S_STD call is hopelessly inadequate to deal with these types of
devices. What is needed is a new call that allows you to set the exact
timings you want: frame width/height, back/front porch, h/vsync width,
pixelclock. It is my opinion that the use of S_STD should be limited to
standard definition type inputs, and not used for other devices like
sensors or HD video.
Proposals for such a new ioctl are welcome :-)
Regards,
Hans
>
> JP François
>
>
>> Thanks
>>
>> Murali
>>
>>> Thanks
>>> Guennadi
>>> ---
>>> Guennadi Liakhovetski, Ph.D.
>>> Freelance Open-Source Software Developer
>>> http://www.open-technology.de/
>>
>> _______________________________________________
>> Davinci-linux-open-source mailing list
>> Davinci-linux-open-source@linux.davincidsp.com
>> http://linux.davincidsp.com/mailman/listinfo/davinci-linux-open-source
>>
>
>
> _______________________________________________
> Davinci-linux-open-source mailing list
> Davinci-linux-open-source@linux.davincidsp.com
> http://linux.davincidsp.com/mailman/listinfo/davinci-linux-open-source
>
>
--
Hans Verkuil - video4linux developer - sponsored by TANDBERG
* Re: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)
2009-06-11 8:35 Hans Verkuil
@ 2009-06-11 9:13 ` Hans de Goede
0 siblings, 0 replies; 16+ messages in thread
From: Hans de Goede @ 2009-06-11 9:13 UTC (permalink / raw)
To: Hans Verkuil
Cc: Jean-Philippe François, Karicheri, Muralidharan,
davinci-linux-open-source@linux.davincidsp.com,
Muralidharan Karicheri, Guennadi Liakhovetski,
linux-media@vger.kernel.org
On 06/11/2009 10:35 AM, Hans Verkuil wrote:
>> Karicheri, Muralidharan a écrit :
>>>>> We need
>>>>> streaming capability in the driver. This is how our driver works
>>>>> with mt9t031 sensor
>>>>> raw-bus (10 bit)
>>>>> vpfe-capture ----------------- mt9t031 driver
>>>>> | |
>>>>> V V
>>>>> VPFE MT9T031
>>>>>
>>>>> VPFE hardware has internal timing and DMA controller to
>>>>> copy data frame by frame from the sensor output to SDRAM.
>>>>> The PCLK from the sensor is used to generate the internal
>>>>> timing.
>>>> So, what is missing in the driver apart from the ability to specify
>>>> a frame-rate?
>>>>
>>> [MK] Does the mt9t031 output one frame (snapshot) like in a camera or
>>> can it output frames continuously along with PCLK, Hsync and Vsync
>>> signals like in a video streaming device. VPFE capture can accept frames
>>> continuously from the sensor synchronized to PCLK, HSYNC and VSYNC and
>>> output frames to application using QBUF/DQBUF. In our implementation, we
>>> have timing parameters for the sensor to do streaming at various
>>> resolutions and fps. So the application calls S_STD to set these timings. I
>>> am not sure if this is an acceptable way of implementing it. Any
>>> comments?
>>>
>> PCLK, HSYNC, VSYNC are generated by the CMOS sensor. I don't think you
>> can set the timings. Depending on sensor settings, pixel clock speed etc
>> .., the frame rate will vary.
>>
>> You could perhaps play with the CMOS sensor registers so that when
>> setting a standard, the driver somehow sets the various exposure
>> parameters and PLL settings to get a specified framerate.
>>
>> This will vary with each sensor and each platform (because of
>> pixelclock). Moreover, chances are that it will conflict with
>> other controls.
>>
>> For example, if you set a fixed gain and autoexposure, some sensors will see
>> a drop in fps under low light conditions. I think this kind of
>> arbitration should be left to userspace.
>>
>> Unless the sensor supports a specific standard, I don't think the driver
>> should try to make behind-the-scenes modifications to camera sensor
>> registers in response to an S_STD ioctl.
>
> The S_STD call is hopelessly inadequate to deal with these types of
> devices. What is needed is a new call that allows you to set the exact
> timings you want: frame width/height, back/front porch, h/vsync width,
> pixelclock. It is my opinion that the use of S_STD should be limited to
> standard definition type inputs, and not used for other devices like
> sensors or HD video.
>
> Proposals for such a new ioctl are welcome :-)
>
Hmm,
Why would we want the *application* to set things like this *at all* ?
with sensors, hsync, vsync and other timings are something between
the bridge and the sensor; actually, in my experience the correct
hsync / vsync timings to program the sensor to are very much bridge
specific. So IMHO this should not be exposed to userspace at all.
All userspace should be able to control is the resolution and the
framerate. Although controlling the framerate in many cases also
means controlling the maximum exposure time. So in many cases
one cannot even control the framerate. (Asking for 30 fps in an
artificially illuminated room will get you a very dark, useless
picture, with most sensors). Yes, this means that with cams which
use autoexposure (which is something we really want wherever
possible), the framerate can and will change while streaming.
Regards,
Hans
* [PATCH] adding support for setting bus parameters in sub device
@ 2009-06-09 20:54 m-karicheri2
2009-06-10 18:32 ` Guennadi Liakhovetski
0 siblings, 1 reply; 16+ messages in thread
From: m-karicheri2 @ 2009-06-09 20:54 UTC (permalink / raw)
To: linux-media
Cc: davinci-linux-open-source, Muralidharan Karicheri,
Muralidharan Karicheri
From: Muralidharan Karicheri <a0868495@gt516km11.gt.design.ti.com>
This patch adds support for setting bus parameters such as bus type
(BT.656, BT.1120 etc), width (example 10 bit raw image data bus)
and polarities (vsync, hsync, field etc) in sub device. This allows
the bridge driver to configure the sub device for a specific set of bus
parameters through the s_bus() function call.
Reviewed-by: Hans Verkuil
Signed-off-by: Muralidharan Karicheri <m-karicheri2@ti.com>
---
Applies to v4l-dvb repository
include/media/v4l2-subdev.h | 36 ++++++++++++++++++++++++++++++++++++
1 files changed, 36 insertions(+), 0 deletions(-)
diff --git a/include/media/v4l2-subdev.h b/include/media/v4l2-subdev.h
index 1785608..c1cfb3b 100644
--- a/include/media/v4l2-subdev.h
+++ b/include/media/v4l2-subdev.h
@@ -37,6 +37,41 @@ struct v4l2_decode_vbi_line {
u32 type; /* VBI service type (V4L2_SLICED_*). 0 if no service found */
};
+/*
+ * Some sub-devices are connected to the bridge device through a bus that
+ * carries the clock, vsync, hsync and data. Some interfaces such as BT.656
+ * carry the sync embedded in the data, whereas others have separate lines
+ * carrying the sync signals. The structure below is used by the bridge
+ * driver to set the desired bus parameters in the sub device to work with it.
+ */
+enum v4l2_subdev_bus_type {
+ /* BT.656 interface. Embedded sync */
+ V4L2_SUBDEV_BUS_BT_656,
+ /* BT.1120 interface. Embedded sync */
+ V4L2_SUBDEV_BUS_BT_1120,
+ /* 8 bit muxed YCbCr bus, separate sync and field signals */
+ V4L2_SUBDEV_BUS_YCBCR_8,
+ /* 16 bit YCbCr bus, separate sync and field signals */
+ V4L2_SUBDEV_BUS_YCBCR_16,
+	/* Raw Bayer image data bus, 8 - 16 bit wide, sync signals */
+ V4L2_SUBDEV_BUS_RAW_BAYER
+};
+
+struct v4l2_subdev_bus {
+ enum v4l2_subdev_bus_type type;
+ u8 width;
+ /* 0 - active low, 1 - active high */
+ unsigned pol_vsync:1;
+ /* 0 - active low, 1 - active high */
+ unsigned pol_hsync:1;
+ /* 0 - low to high , 1 - high to low */
+ unsigned pol_field:1;
+ /* 0 - sample at falling edge , 1 - sample at rising edge */
+ unsigned pol_pclock:1;
+ /* 0 - active low , 1 - active high */
+ unsigned pol_data:1;
+};
+
/* Sub-devices are devices that are connected somehow to the main bridge
device. These devices are usually audio/video muxers/encoders/decoders or
sensors and webcam controllers.
@@ -109,6 +144,7 @@ struct v4l2_subdev_core_ops {
int (*querymenu)(struct v4l2_subdev *sd, struct v4l2_querymenu *qm);
int (*s_std)(struct v4l2_subdev *sd, v4l2_std_id norm);
long (*ioctl)(struct v4l2_subdev *sd, unsigned int cmd, void *arg);
+ int (*s_bus)(struct v4l2_subdev *sd, struct v4l2_subdev_bus *bus);
#ifdef CONFIG_VIDEO_ADV_DEBUG
int (*g_register)(struct v4l2_subdev *sd, struct v4l2_dbg_register *reg);
int (*s_register)(struct v4l2_subdev *sd, struct v4l2_dbg_register *reg);
--
1.6.0.4
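As a rough illustration of how a bridge driver would exercise the s_bus() op from this patch, the standalone sketch below re-declares the structures so it compiles outside the kernel tree, and stands in a sub-device-side validation routine. The 10-bit raw Bayer check mimics what an mt9t031-style sensor driver might accept, but is invented here for illustration.

```c
#include <assert.h>

/* Re-declaration of the types from the patch so the example is
 * self-contained outside the kernel tree. */
enum v4l2_subdev_bus_type {
	V4L2_SUBDEV_BUS_BT_656,
	V4L2_SUBDEV_BUS_BT_1120,
	V4L2_SUBDEV_BUS_YCBCR_8,
	V4L2_SUBDEV_BUS_YCBCR_16,
	V4L2_SUBDEV_BUS_RAW_BAYER
};

struct v4l2_subdev_bus {
	enum v4l2_subdev_bus_type type;
	unsigned char width;
	unsigned pol_vsync:1;
	unsigned pol_hsync:1;
	unsigned pol_field:1;
	unsigned pol_pclock:1;
	unsigned pol_data:1;
};

/* Invented stand-in for a sensor's s_bus op: a 10-bit raw Bayer sensor
 * rejects any configuration it cannot provide, so the bridge learns at
 * setup time whether its request is workable (-1 here plays the role
 * of -EINVAL in the kernel). */
static int sensor_s_bus(const struct v4l2_subdev_bus *bus)
{
	if (bus->type != V4L2_SUBDEV_BUS_RAW_BAYER || bus->width != 10)
		return -1;
	return 0;
}
```

This is exactly the one-shot accept/reject model Guennadi objects to later in the thread: the bridge can only probe by trial and error rather than query what the sensor supports.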
* Re: [PATCH] adding support for setting bus parameters in sub device
2009-06-09 20:54 [PATCH] adding support for setting bus parameters in sub device m-karicheri2
@ 2009-06-10 18:32 ` Guennadi Liakhovetski
2009-06-10 20:28 ` Karicheri, Muralidharan
0 siblings, 1 reply; 16+ messages in thread
From: Guennadi Liakhovetski @ 2009-06-10 18:32 UTC (permalink / raw)
To: Muralidharan Karicheri
Cc: linux-media, davinci-linux-open-source, Muralidharan Karicheri
On Tue, 9 Jun 2009, m-karicheri2@ti.com wrote:
> From: Muralidharan Karicheri <a0868495@gt516km11.gt.design.ti.com>
>
> This patch adds support for setting bus parameters such as bus type
> (BT.656, BT.1120 etc), width (example 10 bit raw image data bus)
> and polarities (vsync, hsync, field etc) in sub device. This allows
> the bridge driver to configure the sub device for a specific set of bus
> parameters through the s_bus() function call.
Yes, this is required, but this is not enough. Firstly, you're missing at
least one more flag - master or slave. Secondly, it is not enough to
provide an s_bus function. Many hosts and sensors can configure one of
several alternate configurations - they can select signal polarities, data
widths, master / slave role, etc. Whereas others have some or all of these
parameters fixed. That's why we have a query method in soc-camera, which
delivers all supported configurations, and then the host can select some
mutually acceptable subset. No, just returning an error code is not
enough.
So, you would need to request supported flags, the sensor would return a
bitmask, and the host would then issue a s_bus call with a selected subset
and configure itself. And then you realise, that one bit per parameter is
not enough - you need a bit for both, e.g., vsync active low and active
high.
Have a look at
include/media/soc_camera.h::soc_camera_bus_param_compatible() and macros
defined there, then you might want to have a look at
drivers/media/video/pxa_camera.c::pxa_camera_set_bus_param() to see how
the whole machinery works. And you also want inverter flags, see
drivers/media/video/soc_camera.c::soc_camera_apply_sensor_flags().
So, this is a step in the right direction, but it doesn't seem final yet.
Thanks
Guennadi
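A minimal sketch of the negotiation scheme described here, with one bit per value (separate bits for active-high and active-low) and an intersection step on the host side. The flag names are invented for illustration and loosely modeled on the soc-camera approach, not its actual API.

```c
#include <assert.h>

/* Invented capability flags: note one bit per *value*, so both
 * polarities of each signal can be advertised independently. */
#define BUS_VSYNC_ACTIVE_HIGH (1u << 0)
#define BUS_VSYNC_ACTIVE_LOW  (1u << 1)
#define BUS_HSYNC_ACTIVE_HIGH (1u << 2)
#define BUS_HSYNC_ACTIVE_LOW  (1u << 3)
#define BUS_MASTER            (1u << 4)
#define BUS_SLAVE             (1u << 5)

/* Intersect what the host and the sensor each support; the result is
 * usable only if every parameter group still has at least one option
 * left. Returns 0 when the two sides are incompatible. */
static unsigned negotiate(unsigned host_caps, unsigned sensor_caps)
{
	unsigned common = host_caps & sensor_caps;

	if (!(common & (BUS_VSYNC_ACTIVE_HIGH | BUS_VSYNC_ACTIVE_LOW)) ||
	    !(common & (BUS_HSYNC_ACTIVE_HIGH | BUS_HSYNC_ACTIVE_LOW)) ||
	    !(common & (BUS_MASTER | BUS_SLAVE)))
		return 0;
	return common;
}
```

The host would then pick one concrete option per group from the surviving set and program both sides accordingly.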
>
> Reviewed-by: Hans Verkuil
> Signed-off-by: Muralidharan Karicheri <m-karicheri2@ti.com>
> ---
> Applies to v4l-dvb repository
>
> include/media/v4l2-subdev.h | 36 ++++++++++++++++++++++++++++++++++++
> 1 files changed, 36 insertions(+), 0 deletions(-)
>
> diff --git a/include/media/v4l2-subdev.h b/include/media/v4l2-subdev.h
> index 1785608..c1cfb3b 100644
> --- a/include/media/v4l2-subdev.h
> +++ b/include/media/v4l2-subdev.h
> @@ -37,6 +37,41 @@ struct v4l2_decode_vbi_line {
> u32 type; /* VBI service type (V4L2_SLICED_*). 0 if no service found */
> };
>
> +/*
> + * Some sub-devices are connected to the bridge device through a bus that
> + * carries the clock, vsync, hsync and data. Some interfaces such as BT.656
> + * carry the sync embedded in the data, whereas others have separate lines
> + * carrying the sync signals. The structure below is used by the bridge
> + * driver to set the desired bus parameters in the sub device to work with it.
> + */
> +enum v4l2_subdev_bus_type {
> + /* BT.656 interface. Embedded sync */
> + V4L2_SUBDEV_BUS_BT_656,
> + /* BT.1120 interface. Embedded sync */
> + V4L2_SUBDEV_BUS_BT_1120,
> + /* 8 bit muxed YCbCr bus, separate sync and field signals */
> + V4L2_SUBDEV_BUS_YCBCR_8,
> + /* 16 bit YCbCr bus, separate sync and field signals */
> + V4L2_SUBDEV_BUS_YCBCR_16,
> + /* Raw Bayer image data bus, 8-16 bit wide, sync signals */
> + V4L2_SUBDEV_BUS_RAW_BAYER
> +};
> +
> +struct v4l2_subdev_bus {
> + enum v4l2_subdev_bus_type type;
> + u8 width;
> + /* 0 - active low, 1 - active high */
> + unsigned pol_vsync:1;
> + /* 0 - active low, 1 - active high */
> + unsigned pol_hsync:1;
> + /* 0 - low to high , 1 - high to low */
> + unsigned pol_field:1;
> + /* 0 - sample at falling edge , 1 - sample at rising edge */
> + unsigned pol_pclock:1;
> + /* 0 - active low , 1 - active high */
> + unsigned pol_data:1;
> +};
> +
> /* Sub-devices are devices that are connected somehow to the main bridge
> device. These devices are usually audio/video muxers/encoders/decoders or
> sensors and webcam controllers.
> @@ -109,6 +144,7 @@ struct v4l2_subdev_core_ops {
> int (*querymenu)(struct v4l2_subdev *sd, struct v4l2_querymenu *qm);
> int (*s_std)(struct v4l2_subdev *sd, v4l2_std_id norm);
> long (*ioctl)(struct v4l2_subdev *sd, unsigned int cmd, void *arg);
> + int (*s_bus)(struct v4l2_subdev *sd, struct v4l2_subdev_bus *bus);
> #ifdef CONFIG_VIDEO_ADV_DEBUG
> int (*g_register)(struct v4l2_subdev *sd, struct v4l2_dbg_register *reg);
> int (*s_register)(struct v4l2_subdev *sd, struct v4l2_dbg_register *reg);
> --
> 1.6.0.4
>
> --
> To unsubscribe from this list: send the line "unsubscribe linux-media" in
> the body of a message to majordomo@vger.kernel.org
> More majordomo info at http://vger.kernel.org/majordomo-info.html
>
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/
^ permalink raw reply	[flat|nested] 16+ messages in thread

* RE: [PATCH] adding support for setting bus parameters in sub device
2009-06-10 18:32 ` Guennadi Liakhovetski
@ 2009-06-10 20:28 ` Karicheri, Muralidharan
2009-06-10 21:09 ` mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device) Guennadi Liakhovetski
0 siblings, 1 reply; 16+ messages in thread
From: Karicheri, Muralidharan @ 2009-06-10 20:28 UTC (permalink / raw)
To: Guennadi Liakhovetski
Cc: linux-media@vger.kernel.org,
davinci-linux-open-source@linux.davincidsp.com,
Muralidharan Karicheri
Guennadi,
Thanks for responding. I acknowledge I need to add
master & slave information in the structure. Querying
the capabilities from the sub device is a good feature.
I will look into your references and let you know if I
have any question.
I will send an updated patch based on this.
BTW, I have a question about the mt9t031.c driver. Could
I use this driver to stream VGA frames from the sensor?
Is it possible to configure the driver to stream at a
specific fps? We have a version of the driver used internally
and it can do capture and stream of Bayer RGB data at VGA,
480p, 576p and 720p resolutions. I have started integrating
your driver with my vpfe capture driver and it wasn't very
clear to me. Looks like driver calculates the timing parameters
based on the width and height of the capture area. We need
streaming capability in the driver. This is how our driver works
with mt9t031 sensor
raw-bus (10 bit)
vpfe-capture ----------------- mt9t031 driver
| |
V V
VPFE MT9T031
VPFE hardware has internal timing and DMA controller to
copy data frame by frame from the sensor output to SDRAM.
The PCLK from the sensor is used to generate the internal
timing.
Thanks.
Murali
email: m-karicheri2@ti.com
>-----Original Message-----
>From: Guennadi Liakhovetski [mailto:g.liakhovetski@gmx.de]
>Sent: Wednesday, June 10, 2009 2:32 PM
>To: Karicheri, Muralidharan
>Cc: linux-media@vger.kernel.org; davinci-linux-open-
>source@linux.davincidsp.com; Muralidharan Karicheri
>Subject: Re: [PATCH] adding support for setting bus parameters in sub
>device
>
>On Tue, 9 Jun 2009, m-karicheri2@ti.com wrote:
>
>> From: Muralidharan Karicheri <a0868495@gt516km11.gt.design.ti.com>
>>
>> This patch adds support for setting bus parameters such as bus type
>> (BT.656, BT.1120 etc), width (example 10 bit raw image data bus)
>> and polarities (vsync, hsync, field etc) in sub device. This allows
>> bridge driver to configure the sub device for a specific set of bus
>> parameters through s_bus() function call.
>
>Yes, this is required, but this is not enough. Firstly, you're missing at
>least one more flag - master or slave. Secondly, it is not enough to
>provide a s_bus function. Many hosts and sensors can configure one of
>several alternate configurations - they can select signal polarities, data
>widths, master / slave role, etc. Whereas others have some or all of these
>parameters fixed. That's why we have a query method in soc-camera, which
>delivers all supported configurations, and then the host can select some
>mutually acceptable subset. No, just returning an error code is not
>enough.
>
>So, you would need to request supported flags, the sensor would return a
>bitmask, and the host would then issue a s_bus call with a selected subset
>and configure itself. And then you realise, that one bit per parameter is
>not enough - you need a bit for both, e.g., vsync active low and active
>high.
>
>Have a look at
>include/media/soc_camera.h::soc_camera_bus_param_compatible() and macros
>defined there, then you might want to have a look at
>drivers/media/video/pxa_camera.c::pxa_camera_set_bus_param() to see how
>the whole machinery works. And you also want inverter flags, see
>drivers/media/video/soc_camera.c::soc_camera_apply_sensor_flags().
>
>So, this is a step in the right direction, but it doesn't seem final yet.
>
>Thanks
>Guennadi
>
>>
>> Reviewed-by: Hans Verkuil
>> Signed-off-by: Muralidharan Karicheri <m-karicheri2@ti.com>
>> ---
>> Applies to v4l-dvb repository
>>
>> include/media/v4l2-subdev.h | 36 ++++++++++++++++++++++++++++++++++++
>> 1 files changed, 36 insertions(+), 0 deletions(-)
>>
>> diff --git a/include/media/v4l2-subdev.h b/include/media/v4l2-subdev.h
>> index 1785608..c1cfb3b 100644
>> --- a/include/media/v4l2-subdev.h
>> +++ b/include/media/v4l2-subdev.h
>> @@ -37,6 +37,41 @@ struct v4l2_decode_vbi_line {
>> u32 type; /* VBI service type (V4L2_SLICED_*). 0 if no
>service found */
>> };
>>
>> +/*
>> + * Some sub-devices are connected to the bridge device through a bus that
>> + * carries the clock, vsync, hsync and data. Some interfaces such as BT.656
>> + * carry the sync embedded in the data, whereas others have separate lines
>> + * carrying the sync signals. The structure below is used by the bridge
>> + * driver to set the desired bus parameters in the sub-device to work with it.
>> + */
>> +enum v4l2_subdev_bus_type {
>> + /* BT.656 interface. Embedded sync */
>> + V4L2_SUBDEV_BUS_BT_656,
>> + /* BT.1120 interface. Embedded sync */
>> + V4L2_SUBDEV_BUS_BT_1120,
>> + /* 8 bit muxed YCbCr bus, separate sync and field signals */
>> + V4L2_SUBDEV_BUS_YCBCR_8,
>> + /* 16 bit YCbCr bus, separate sync and field signals */
>> + V4L2_SUBDEV_BUS_YCBCR_16,
>> + /* Raw Bayer image data bus, 8-16 bit wide, sync signals */
>> + V4L2_SUBDEV_BUS_RAW_BAYER
>> +};
>> +
>> +struct v4l2_subdev_bus {
>> + enum v4l2_subdev_bus_type type;
>> + u8 width;
>> + /* 0 - active low, 1 - active high */
>> + unsigned pol_vsync:1;
>> + /* 0 - active low, 1 - active high */
>> + unsigned pol_hsync:1;
>> + /* 0 - low to high , 1 - high to low */
>> + unsigned pol_field:1;
>> + /* 0 - sample at falling edge , 1 - sample at rising edge */
>> + unsigned pol_pclock:1;
>> + /* 0 - active low , 1 - active high */
>> + unsigned pol_data:1;
>> +};
>> +
>> /* Sub-devices are devices that are connected somehow to the main bridge
>> device. These devices are usually audio/video
>muxers/encoders/decoders or
>> sensors and webcam controllers.
>> @@ -109,6 +144,7 @@ struct v4l2_subdev_core_ops {
>> int (*querymenu)(struct v4l2_subdev *sd, struct v4l2_querymenu *qm);
>> int (*s_std)(struct v4l2_subdev *sd, v4l2_std_id norm);
>> long (*ioctl)(struct v4l2_subdev *sd, unsigned int cmd, void *arg);
>> + int (*s_bus)(struct v4l2_subdev *sd, struct v4l2_subdev_bus *bus);
>> #ifdef CONFIG_VIDEO_ADV_DEBUG
>> int (*g_register)(struct v4l2_subdev *sd, struct v4l2_dbg_register
>*reg);
>> int (*s_register)(struct v4l2_subdev *sd, struct v4l2_dbg_register
>*reg);
>> --
>> 1.6.0.4
>>
>> --
>> To unsubscribe from this list: send the line "unsubscribe linux-media" in
>> the body of a message to majordomo@vger.kernel.org
>> More majordomo info at http://vger.kernel.org/majordomo-info.html
>>
>
>---
>Guennadi Liakhovetski, Ph.D.
>Freelance Open-Source Software Developer
>http://www.open-technology.de/
* mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)
2009-06-10 20:28 ` Karicheri, Muralidharan
@ 2009-06-10 21:09 ` Guennadi Liakhovetski
2009-06-10 21:29 ` Karicheri, Muralidharan
0 siblings, 1 reply; 16+ messages in thread
From: Guennadi Liakhovetski @ 2009-06-10 21:09 UTC (permalink / raw)
To: Karicheri, Muralidharan
Cc: linux-media@vger.kernel.org,
davinci-linux-open-source@linux.davincidsp.com,
Muralidharan Karicheri
On Wed, 10 Jun 2009, Karicheri, Muralidharan wrote:
> Guennadi,
>
> Thanks for responding. I acknowledge I need to add
> master & slave information in the structure. Querying
> the capabilities from the sub device is a good feature.
> I will look into your references and let you know if I
> have any question.
>
> I will send an updated patch based on this.
>
> BTW, I have a question about the mt9t031.c driver. Could
> I use this driver to stream VGA frames from the sensor?
Sure, any size the chip supports (up to 2048x1536) and your host can
handle.
> Is it possible to configure the driver to stream at a
> specific fps?
No, patches welcome.
> We have a version of the driver used internally
> and it can do capture and stream of Bayer RGB data at VGA,
> 480p, 576p and 720p resolutions. I have started integrating
> your driver with my vpfe capture driver and it wasn't very
> clear to me. Looks like driver calculates the timing parameters
> based on the width and height of the capture area.
Yes, it provides exposure control by setting shutter timing, and it
emulates autoexposure by calculating shutter times from window geometry.
> We need
> streaming capability in the driver. This is how our driver works
> with mt9t031 sensor
> raw-bus (10 bit)
> vpfe-capture ----------------- mt9t031 driver
> | |
> V V
> VPFE MT9T031
>
> VPFE hardware has internal timing and DMA controller to
> copy data frame by frame from the sensor output to SDRAM.
> The PCLK from the sensor is used to generate the internal
> timing.
So, what is missing in the driver apart from the ability to specify
a frame-rate?
Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/
* RE: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)
2009-06-10 21:09 ` mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device) Guennadi Liakhovetski
@ 2009-06-10 21:29 ` Karicheri, Muralidharan
2009-06-10 21:37 ` Guennadi Liakhovetski
2009-06-11 8:01 ` Jean-Philippe François
0 siblings, 2 replies; 16+ messages in thread
From: Karicheri, Muralidharan @ 2009-06-10 21:29 UTC (permalink / raw)
To: Guennadi Liakhovetski
Cc: linux-media@vger.kernel.org,
davinci-linux-open-source@linux.davincidsp.com,
Muralidharan Karicheri
>> We need
>> streaming capability in the driver. This is how our driver works
>> with mt9t031 sensor
>> raw-bus (10 bit)
>> vpfe-capture ----------------- mt9t031 driver
>> | |
>> V V
>> VPFE MT9T031
>>
>> VPFE hardware has internal timing and DMA controller to
>> copy data frame by frame from the sensor output to SDRAM.
>> The PCLK from the sensor is used to generate the internal
>> timing.
>
>So, what is missing in the driver apart from the ability to specify
>a frame-rate?
>
[MK] Does the mt9t031 output one frame (snapshot) like in a camera, or can it output frames continuously along with PCLK, HSYNC and VSYNC signals like a video streaming device? VPFE capture can accept frames continuously from the sensor, synchronized to PCLK, HSYNC and VSYNC, and output frames to the application using QBUF/DQBUF. In our implementation, we have timing parameters for the sensor to stream at various resolutions and frame rates, and the application calls S_STD to set these timings. I am not sure if this is an acceptable way of implementing it. Any comments?
Thanks
Murali
>Thanks
>Guennadi
>---
>Guennadi Liakhovetski, Ph.D.
>Freelance Open-Source Software Developer
>http://www.open-technology.de/
* RE: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)
2009-06-10 21:29 ` Karicheri, Muralidharan
@ 2009-06-10 21:37 ` Guennadi Liakhovetski
2009-06-10 21:45 ` Karicheri, Muralidharan
2009-06-11 8:01 ` Jean-Philippe François
1 sibling, 1 reply; 16+ messages in thread
From: Guennadi Liakhovetski @ 2009-06-10 21:37 UTC (permalink / raw)
To: Karicheri, Muralidharan
Cc: linux-media@vger.kernel.org,
davinci-linux-open-source@linux.davincidsp.com,
Muralidharan Karicheri
On Wed, 10 Jun 2009, Karicheri, Muralidharan wrote:
>
>
> >> We need
> >> streaming capability in the driver. This is how our driver works
> >> with mt9t031 sensor
> >> raw-bus (10 bit)
> >> vpfe-capture ----------------- mt9t031 driver
> >> | |
> >> V V
> >> VPFE MT9T031
> >>
> >> VPFE hardware has internal timing and DMA controller to
> >> copy data frame by frame from the sensor output to SDRAM.
> >> The PCLK from the sensor is used to generate the internal
> >> timing.
> >
> >So, what is missing in the driver apart from the ability to specify
> >a frame-rate?
> >
> [MK] Does the mt9t031 output one frame (snapshot) like in a camera or
> can it output frames continuously along with PCLK, HSYNC and VSYNC
> signals like in a video streaming device. VPFE capture can accept frames
> continuously from the sensor synchronized to PCLK, HSYNC and VSYNC and
> output frames to application using QBUF/DQBUF. In our implementation, we
> have timing parameters for the sensor to do streaming at various
> resolutions and fps. So application calls S_STD to set these timings. I
> am not sure if this is an acceptable way of implementing it. Any
> comments?
Yes, it is streaming.
Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/
* RE: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)
2009-06-10 21:37 ` Guennadi Liakhovetski
@ 2009-06-10 21:45 ` Karicheri, Muralidharan
2009-06-10 23:13 ` Guennadi Liakhovetski
0 siblings, 1 reply; 16+ messages in thread
From: Karicheri, Muralidharan @ 2009-06-10 21:45 UTC (permalink / raw)
To: Guennadi Liakhovetski
Cc: linux-media@vger.kernel.org,
davinci-linux-open-source@linux.davincidsp.com,
Muralidharan Karicheri
>> >So, what is missing in the driver apart from the ability to specify
>> >a frame-rate?
>> >
>> [MK] Does the mt9t031 output one frame (snapshot) like in a camera or
>> can it output frames continuously along with PCLK, HSYNC and VSYNC
>> signals like in a video streaming device. VPFE capture can accept frames
>> continuously from the sensor synchronized to PCLK, HSYNC and VSYNC and
>> output frames to application using QBUF/DQBUF. In our implementation, we
>> have timing parameters for the sensor to do streaming at various
>> resolutions and fps. So application calls S_STD to set these timings. I
>> am not sure if this is an acceptable way of implementing it. Any
>> comments?
>
>Yes, it is streaming.
>
So how do I know what frame rate I get? The sensor output frame rate depends on the resolution of the frame, blanking, exposure time, etc.
Murali
>Thanks
>Guennadi
>---
>Guennadi Liakhovetski, Ph.D.
>Freelance Open-Source Software Developer
>http://www.open-technology.de/
* RE: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)
2009-06-10 21:45 ` Karicheri, Muralidharan
@ 2009-06-10 23:13 ` Guennadi Liakhovetski
2009-06-11 15:00 ` Karicheri, Muralidharan
0 siblings, 1 reply; 16+ messages in thread
From: Guennadi Liakhovetski @ 2009-06-10 23:13 UTC (permalink / raw)
To: Karicheri, Muralidharan
Cc: linux-media@vger.kernel.org,
davinci-linux-open-source@linux.davincidsp.com
On Wed, 10 Jun 2009, Karicheri, Muralidharan wrote:
> So how do I know what frame-rate I get? Sensor output frame rate depends
> on the resolution of the frame, blanking, exposure time etc.
This is not supported.
Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/
* RE: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)
2009-06-10 23:13 ` Guennadi Liakhovetski
@ 2009-06-11 15:00 ` Karicheri, Muralidharan
2009-06-11 15:58 ` Guennadi Liakhovetski
0 siblings, 1 reply; 16+ messages in thread
From: Karicheri, Muralidharan @ 2009-06-11 15:00 UTC (permalink / raw)
To: Guennadi Liakhovetski
Cc: linux-media@vger.kernel.org,
davinci-linux-open-source@linux.davincidsp.com
>On Wed, 10 Jun 2009, Karicheri, Muralidharan wrote:
>
>> So how do I know what frame-rate I get? Sensor output frame rate depends
>> on the resolution of the frame, blanking, exposure time etc.
>
>This is not supported.
>
I am still not clear. You said in an earlier email that it supports streaming, which means an application can stream frames from the capture device.
I know you don't have support for setting a specific frame rate, but it must be outputting frames at some rate, right?
Here is my usecase.
open capture device,
set resolutions (say VGA) for capture (S_FMT ???)
request buffer for streaming & mmap & QUERYBUF
start streaming (STREAMON)
DQBUF/QBUF in a loop -> get VGA buffers at some fps.
STREAMOFF
close device
Is this possible with the mt9t031 driver currently in the tree? This requires the sensor device to output frames continuously on the bus, using PCLK/HSYNC/VSYNC timing, to the bridge device connected to the bus. Can you give a use case like the above that you are using? I just want to estimate how much effort is required to add this support to the mt9t031 driver.
Thanks
Murali
>Thanks
>Guennadi
>---
>Guennadi Liakhovetski, Ph.D.
>Freelance Open-Source Software Developer
>http://www.open-technology.de/
* RE: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)
2009-06-11 15:00 ` Karicheri, Muralidharan
@ 2009-06-11 15:58 ` Guennadi Liakhovetski
2009-06-11 16:30 ` Karicheri, Muralidharan
0 siblings, 1 reply; 16+ messages in thread
From: Guennadi Liakhovetski @ 2009-06-11 15:58 UTC (permalink / raw)
To: Karicheri, Muralidharan; +Cc: linux-media@vger.kernel.org
On Thu, 11 Jun 2009, Karicheri, Muralidharan wrote:
> >On Wed, 10 Jun 2009, Karicheri, Muralidharan wrote:
> >
> >> So how do I know what frame-rate I get? Sensor output frame rate depends
> >> on the resolution of the frame, blanking, exposure time etc.
> >
> >This is not supported.
> >
> I am still not clear. You had said in an earlier email that it can
> support streaming. That means application can stream frames from the
> capture device.
> I know you don't have support for setting a specific frame rate, but it
> must be outputting frame at some rate right?
I am sorry, I do not know how I can explain myself clearer.
Yes, you can stream video with mt9t031.
No, you neither get the framerate measured by the driver nor can you set a
specific framerate. Frames are produced as fast as it goes, depending on
clock settings, frame size, black areas, autoexposure.
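The dependence on clock settings, frame size and blanking can be made concrete with a back-of-the-envelope estimate. The relation below is generic for a progressive-scan sensor; the numbers in the test are illustrative, not MT9T031 datasheet values.

```c
/* Rough sensor frame-rate estimate: the pixel clock must scan the
 * active window plus the horizontal and vertical blanking regions once
 * per frame, so fps = pclk / ((width + hblank) * (height + vblank)).
 * Exposure longer than a frame time would lower the rate further. */
double estimate_fps(double pclk_hz, unsigned int width, unsigned int hblank,
		    unsigned int height, unsigned int vblank)
{
	double pixels_per_frame = (double)(width + hblank) * (height + vblank);
	return pclk_hz / pixels_per_frame;
}
```

This is why a larger capture window at the same pixel clock necessarily streams at a lower frame rate, and why the driver cannot promise a fixed fps without also controlling blanking and exposure.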
Thanks
Guennadi
>
> Here is my usecase.
>
> open capture device,
> set resolutions (say VGA) for capture (S_FMT ???)
> request buffer for streaming & mmap & QUERYBUF
> start streaming (STREAMON)
> DQBUF/QBUF in a loop -> get VGA buffers at some fps.
> STREAMOFF
> close device
>
> Is this possible with the mt9t031 driver currently in the tree? This requires the sensor device to output frames continuously on the bus, using PCLK/HSYNC/VSYNC timing, to the bridge device connected to the bus. Can you give a use case like the above that you are using? I just want to estimate how much effort is required to add this support to the mt9t031 driver.
>
> Thanks
>
> Murali
>
> >Thanks
> >Guennadi
> >---
> >Guennadi Liakhovetski, Ph.D.
> >Freelance Open-Source Software Developer
> >http://www.open-technology.de/
>
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/
* RE: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)
2009-06-11 15:58 ` Guennadi Liakhovetski
@ 2009-06-11 16:30 ` Karicheri, Muralidharan
0 siblings, 0 replies; 16+ messages in thread
From: Karicheri, Muralidharan @ 2009-06-11 16:30 UTC (permalink / raw)
To: Guennadi Liakhovetski; +Cc: linux-media@vger.kernel.org
>I am sorry, I do not know how I can explain myself clearer.
>
Thanks for helping me to understand better :)
>Yes, you can stream video with mt9t031.
>
>No, you neither get the framerate measured by the driver nor can you set a
>specific framerate. Frames are produced as fast as it goes, depending on
>clock settings, frame size, black areas, autoexposure.
>
Ok. It is now clear to me.
Thanks for all your help.
>Thanks
>Guennadi
>
>>
>> Here is my usecase.
>>
>> open capture device,
>> set resolutions (say VGA) for capture (S_FMT ???)
>> request buffer for streaming & mmap & QUERYBUF
>> start streaming (STREAMON)
>> DQBUF/QBUF in a loop -> get VGA buffers at some fps.
>> STREAMOFF
>> close device
>>
>> Is this possible with mt9t031 available currently in the tree? This
>requires sensor device output frames continuously on the bus using
>PCLK/HSYNC/VSYNC timing to the bridge device connected to the bus. Can you
>give a use case like above that you are using. I just want to estimate how
>much effort is required to add this support in the mt9t031 driver.
>>
>> Thanks
>>
>> Murali
>>
>> >Thanks
>> >Guennadi
>> >---
>> >Guennadi Liakhovetski, Ph.D.
>> >Freelance Open-Source Software Developer
>> >http://www.open-technology.de/
>>
>
>---
>Guennadi Liakhovetski, Ph.D.
>Freelance Open-Source Software Developer
>http://www.open-technology.de/
* Re: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)
2009-06-10 21:29 ` Karicheri, Muralidharan
2009-06-10 21:37 ` Guennadi Liakhovetski
@ 2009-06-11 8:01 ` Jean-Philippe François
1 sibling, 0 replies; 16+ messages in thread
From: Jean-Philippe François @ 2009-06-11 8:01 UTC (permalink / raw)
To: Karicheri, Muralidharan
Cc: Guennadi Liakhovetski,
davinci-linux-open-source@linux.davincidsp.com,
Muralidharan Karicheri, linux-media@vger.kernel.org
Karicheri, Muralidharan a écrit :
>
>>> We need
>>> streaming capability in the driver. This is how our driver works
>>> with mt9t031 sensor
>>> raw-bus (10 bit)
>>> vpfe-capture ----------------- mt9t031 driver
>>> | |
>>> V V
>>> VPFE MT9T031
>>>
>>> VPFE hardware has internal timing and DMA controller to
>>> copy data frame by frame from the sensor output to SDRAM.
>>> The PCLK from the sensor is used to generate the internal
>>> timing.
>> So, what is missing in the driver apart from the ability to specify
>> a frame-rate?
>>
> [MK] Does the mt9t031 output one frame (snapshot) like in a camera, or can it output frames continuously along with PCLK, HSYNC and VSYNC signals like a video streaming device? VPFE capture can accept frames continuously from the sensor, synchronized to PCLK, HSYNC and VSYNC, and output frames to the application using QBUF/DQBUF. In our implementation, we have timing parameters for the sensor to stream at various resolutions and frame rates, and the application calls S_STD to set these timings. I am not sure if this is an acceptable way of implementing it. Any comments?
>
PCLK, HSYNC and VSYNC are generated by the CMOS sensor. I don't think you
can set the timings. Depending on sensor settings, pixel clock speed,
etc., the frame rate will vary.
You could perhaps play with the CMOS sensor registers so that, when
setting a standard, the driver somehow sets the various exposure
parameters and PLL settings to get a specified frame rate.
This will vary with each sensor and each platform (because of the
pixel clock). Moreover, chances are that it will conflict with
other controls.
For example, if you set a fixed gain and auto-exposure, some sensors will
see a drop in fps under low-light conditions. I think this kind of
arbitration should be left to userspace.
Unless the sensor supports a specific standard, I don't think the driver
should try to make behind-the-scenes modifications to camera sensor
registers in response to an S_STD ioctl.
JP François
> Thanks
>
> Murali
>
>> Thanks
>> Guennadi
>> ---
>> Guennadi Liakhovetski, Ph.D.
>> Freelance Open-Source Software Developer
>> http://www.open-technology.de/
>
> _______________________________________________
> Davinci-linux-open-source mailing list
> Davinci-linux-open-source@linux.davincidsp.com
> http://linux.davincidsp.com/mailman/listinfo/davinci-linux-open-source
>
end of thread, other threads: [~2009-06-11 16:30 UTC | newest]
Thread overview: 16+ messages
2009-06-11 9:33 mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device) Hans Verkuil
2009-06-11 9:40 ` Hans de Goede
2009-06-11 14:43 ` Karicheri, Muralidharan
-- strict thread matches above, loose matches on Subject: below --
2009-06-11 10:39 Hans Verkuil
2009-06-11 14:40 ` Karicheri, Muralidharan
2009-06-11 8:35 Hans Verkuil
2009-06-11 9:13 ` Hans de Goede
2009-06-09 20:54 [PATCH] adding support for setting bus parameters in sub device m-karicheri2
2009-06-10 18:32 ` Guennadi Liakhovetski
2009-06-10 20:28 ` Karicheri, Muralidharan
2009-06-10 21:09 ` mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device) Guennadi Liakhovetski
2009-06-10 21:29 ` Karicheri, Muralidharan
2009-06-10 21:37 ` Guennadi Liakhovetski
2009-06-10 21:45 ` Karicheri, Muralidharan
2009-06-10 23:13 ` Guennadi Liakhovetski
2009-06-11 15:00 ` Karicheri, Muralidharan
2009-06-11 15:58 ` Guennadi Liakhovetski
2009-06-11 16:30 ` Karicheri, Muralidharan
2009-06-11 8:01 ` Jean-Philippe François