* [RFC] Multi-Touch (MT) support - arbitration or not
@ 2010-11-05 18:47 Ping Cheng
  2010-11-06 15:16 ` Chris Bagwell
  ` (3 more replies)
  0 siblings, 4 replies; 44+ messages in thread
From: Ping Cheng @ 2010-11-05 18:47 UTC (permalink / raw)
To: X.Org Devel List
Cc: Dmitry Torokhov, Peter Hutterer, Daniel Stone, linux-input

Recent changes and discussion about MT support at LKML, UDS, and xorg-devel encouraged me to migrate Wacom MT devices to the slot-based MT protocol (introduced in kernel 2.6.36). Since Wacom supports both digitizer and touch devices, I need to decide how to report touch data when the pen is in proximity.

My goal is to understand how the X server would like the MT data to be reported from the kernel. I hope to keep kernel and X server driver MT support in sync so we can avoid unnecessary confusion or extra work in userland.

The existing solution for single touch events is to arbitrate touch when the pen is in prox. This is based on the assumption that we do not want two cursors competing on the screen.

With the introduction of MT, the touch data are most likely translated into something other than pointer events. So, reporting both pen and touch data makes sense now. However, I want to ensure a smooth transition from single touch to MT for end users so they still get the single touch behavior they are used to. I gathered the following approaches:

1. Arbitrate all touch data in the kernel.

   This is the simplest solution for device driver developers, but I do not feel it is end user and userland client friendly.

2. Report the first finger touch as ABS_X/Y events when the pen is not in prox. Arbitrate single touch data when the pen is in prox. Pen data is reported as ABS_X/Y events. Both ABS_X/Y (for the pen or the first finger) and ABS_MT_* for MT data are reported.

   This approach reduces the overhead in dealing with two cursors in userland.

3. Report the first finger touch as ABS_X/Y events when the pen is not in prox; report pen data as ABS_X/Y events when there is no finger touch; report touch data as MT_TOOL_TOUCH and pen data as MT_TOOL_PEN events when both pen and touch data are received. No ABS_X/Y are reported when pen and touch or multi-touch data are received.

   I feel this one makes sense to userland since the pen can be considered as another touch.

4. Report the first finger touch as ABS_X/Y events when the pen is not in prox; report pen data as ABS_X/Y events when there is no finger touch; report touch data as MT_TOOL_TOUCH and pen data as MT_TOOL_PEN events when both pen and touch data are received. ABS_X/Y are also reported for the pen when both pen and touch data are received.

   This one makes sense to userland too. It eases backward compatibility for those clients that don't support MT at all.

Which approach do you like? Or do you have other suggestions to share?

Ping

^ permalink raw reply [flat|nested] 44+ messages in thread
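[Editorial note, not part of the original mail: to make option 4 concrete, here is a minimal sketch of roughly what a kernel driver frame could look like with the slot-based protocol. The helper name, slot numbers, and arguments are hypothetical; the axis and tool constants are the in-tree spellings from linux/input.h (the constant corresponding to the RFC's "MT_TOOL_TOUCH" is MT_TOOL_FINGER).]

```c
/*
 * Hypothetical sketch of option 4: pen and first finger are both
 * reported as MT slots (distinguished by ABS_MT_TOOL_TYPE), and the
 * pen is additionally mirrored onto the legacy single-touch ABS_X/Y
 * axes for clients that do not understand MT.  Not taken from the
 * actual wacom driver.
 */
#include <linux/input.h>

#define PEN_SLOT	0
#define FINGER_SLOT	1

static void report_pen_and_first_finger(struct input_dev *dev,
					int pen_x, int pen_y,
					int finger_x, int finger_y)
{
	/* Pen as an MT contact of type MT_TOOL_PEN */
	input_mt_slot(dev, PEN_SLOT);
	input_report_abs(dev, ABS_MT_TRACKING_ID, 0);
	input_report_abs(dev, ABS_MT_TOOL_TYPE, MT_TOOL_PEN);
	input_report_abs(dev, ABS_MT_POSITION_X, pen_x);
	input_report_abs(dev, ABS_MT_POSITION_Y, pen_y);

	/* First finger as an MT contact of type MT_TOOL_FINGER */
	input_mt_slot(dev, FINGER_SLOT);
	input_report_abs(dev, ABS_MT_TRACKING_ID, 1);
	input_report_abs(dev, ABS_MT_TOOL_TYPE, MT_TOOL_FINGER);
	input_report_abs(dev, ABS_MT_POSITION_X, finger_x);
	input_report_abs(dev, ABS_MT_POSITION_Y, finger_y);

	/* Option 4: the pen also drives the legacy single-touch axes */
	input_report_abs(dev, ABS_X, pen_x);
	input_report_abs(dev, ABS_Y, pen_y);
	input_report_key(dev, BTN_TOOL_PEN, 1);

	input_sync(dev);
}
```

Under option 3 the last three input_report_* calls before input_sync() would simply be dropped while both tools are present; options 1 and 2 would instead suppress the finger slot entirely while the pen is in prox.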
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-05 18:47 [RFC] Multi-Touch (MT) support - arbitration or not Ping Cheng @ 2010-11-06 15:16 ` Chris Bagwell 2010-11-06 15:53 ` Michal Suchanek ` (2 subsequent siblings) 3 siblings, 0 replies; 44+ messages in thread From: Chris Bagwell @ 2010-11-06 15:16 UTC (permalink / raw) To: Ping Cheng Cc: X.Org Devel List, Dmitry Torokhov, Peter Hutterer, Daniel Stone, linux-input On Fri, Nov 5, 2010 at 1:47 PM, Ping Cheng <pinglinux@gmail.com> wrote: > Recent changes and discussion about MT support at LKML, UDS, and > xorg-devel encouraged me to migrate Wacom MT devices to the slot-based > MT protocol (introduced in kernel 2.6.36). Since Wacom supports both > digitizer and touch devices, I need to decide how to report touch data > when the pen is in proximity. > > My goal is to understand how X server would like the MT data to be > reported from the kernel. I hope to keep kernel and X server driver MT > support in sync so we can avoid unnecessary confusion or extra work in > the userland. > > The existing solution for single touch events is to arbitrate touch > when pen is in prox. This is based on the assumption that we do not > want to have two cursors competing on the screen. > > With the introduction of MT, the touch data are most likely translated > into something other than pointer events. So, reporting both pen and > touch data makes sense now. However, I want to assure a smooth > tansition from single touch to MT for end users so they still get the > single touch behavior as they used to be. I gathered the following > approaches: > > 1. Arbitrate all touch data in the kernel. > > This is the simplest solution for device driver developers. But I do > not feel it is end user and userland client friendly. > > 2. Report first finger touch as ABS_X/Y events when pen is not in > prox. Arbitrating single touch data when pen is in prox. Pen data is > reported as ABS_X/Y events. Both ABS_X/Y for pen or the first finger > and ABS_MT_* for MT data are reported. > > This approach reduces the overhead in dealing with two cursors in userland. > > 3. Report first finger touch as ABS_X/Y events when pen is not in prox; > Report pen data as ABS_X/Y events when there is no finger touch; > Report touch data as MT_TOOL_TOUCH and pen data as MT_TOOL_PEN > events when both pen and touch data are received. No ABS_X/Y are > reported when pen and tocuh or multi-touch data are received. > > I feel this one makes sense to userland since pen can be considered as > another touch. > > 4. Report first finger touch as ABS_X/Y events when pen is not in prox; > Report pen data as ABS_X/Y events when there is no finger touch; > Report touch data as MT_TOOL_TOUCH and pen data as MT_TOOL_PEN > events when both pen and touch data are received. ABS_X/Y are also > reported for pen when both pen and tocuh data are received. > > This one makes sense to userland too. It eases the backward > compatibility support for those clients that don't support MT at all. > > Which approach do you like? Or do you have other suggestions share? > > Ping Here is my input on topic. I'll summarize that I'm leaning towards your option #1 right now. First, I think we need to decide if non-MT aware apps/X drivers are to be supported or if we require them to be "fixed" to work reliably with devices such as Wacom Bamboo that split device into two inputs yet need coordination across inputs. 
I prefer that non-MT-aware apps/X drivers are supported, and so we should continue the current arbitration in the kernel over the ST events. Sending un-arbitrated MT events to userland is, I think, mostly OK.

Next, you bring up an interesting point on the tablet side. Currently, it doesn't send MT events because it only tracks 1 stylus tool at a time. Should it send MT events anyway for a single tool, to help userland, because its events are meant to be associated with another input's MT events? That's a good question for the MT people to consider.

Thinking about that issue, though, makes me think of an option #5 to consider. We could combine these 2 input devices into a single input device and send all events as MT events. I assume userland has an easier time arbitrating within a single input device. I'm not sure what it would take to combine these in the kernel though... and there is some value in 2 inputs, since just the touchpad can currently be re-routed to xf86-input-synaptics.

Chris
--
To unsubscribe from this list: send the line "unsubscribe linux-input" in the body of a message to majordomo@vger.kernel.org More majordomo info at http://vger.kernel.org/majordomo-info.html

^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not
  2010-11-05 18:47 [RFC] Multi-Touch (MT) support - arbitration or not Ping Cheng
  2010-11-06 15:16 ` Chris Bagwell
@ 2010-11-06 15:53 ` Michal Suchanek
  2010-11-06 20:38   ` Rafi Rubin
  2010-11-07 23:32   ` Ping Cheng
  [not found] ` <AANLkTin1svszp87Pysi5OCt5=JTSB-yVaAWF-42gfn9T-JsoAwUIsXosN+BqQ9rBEUg@public.gmane.org>
  2010-11-08 3:51 ` Peter Hutterer
  3 siblings, 2 replies; 44+ messages in thread
From: Michal Suchanek @ 2010-11-06 15:53 UTC (permalink / raw)
To: Ping Cheng; +Cc: X.Org Devel List, Dmitry Torokhov, Daniel Stone, linux-input

On 5 November 2010 19:47, Ping Cheng <pinglinux@gmail.com> wrote:
> Recent changes and discussion about MT support at LKML, UDS, and xorg-devel encouraged me to migrate Wacom MT devices to the slot-based MT protocol (introduced in kernel 2.6.36). Since Wacom supports both digitizer and touch devices, I need to decide how to report touch data when the pen is in proximity.
>
> My goal is to understand how X server would like the MT data to be reported from the kernel. I hope to keep kernel and X server driver MT support in sync so we can avoid unnecessary confusion or extra work in the userland.
>
> The existing solution for single touch events is to arbitrate touch when pen is in prox. This is based on the assumption that we do not want to have two cursors competing on the screen.

As a tablet user, not an X developer, I don't think it is desirable or expected to register pen touch and finger touch together as multitouch. That is, if I wanted to trigger a multitouch feature I would use fingers, not pen and finger. While it would be technically possible to use pen touch and finger touch together to trigger multitouch, I don't think it makes much sense.

Also, it is a good idea to turn off (single) touch when the pen is in proximity, because it is quite easy to accidentally touch the surface while moving the pen.

This means that a protocol that sends abs events from the pen when it is in proximity, and from the finger otherwise, would work well.

However, there is one feature that makes some sense and does not work when touch is completely off. It might be possible to use touch for a button and the pen for moving the cursor. I tried to use the pad button on my Bamboo together with pen motion for right drags, and it seems somewhat easier than using the pen button.

So I would say that a protocol that allows events from both pen and touch at the same time is desirable in the long run, but with some options to filter the events when both pen and touch are used together.

This gives basically three options:

1) leave as is, turn off touch when pen is in proximity

2) report everything from the kernel, put some filter on the touch events in the X driver (which defaults to killing most touch stuff when the pen is in proximity)

3) the same as above but put the filter in the kernel driver

2 or 3 is more flexible, as people with some special uses can just turn off the filter and get all events.

WRT ease of X development, I would suggest that the same device should never report different events based on pen proximity. Either the device is in some "compatibility mode" and reports only one pointer as per 1, or there are two separate pointers and they are just reported out of proximity when pointer positioning is not desirable (i.e. when the pen is out, or when touch is off due to the pen being in).
If somebody wants to have multitouch combining two different pointers there is nothing stopping them, either with P&T pen and touch pointers or completely unrelated devices. Thanks Michal ^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-06 15:53 ` Michal Suchanek @ 2010-11-06 20:38 ` Rafi Rubin 2010-11-08 3:39 ` Peter Hutterer 2010-11-07 23:32 ` Ping Cheng 1 sibling, 1 reply; 44+ messages in thread From: Rafi Rubin @ 2010-11-06 20:38 UTC (permalink / raw) To: Michal Suchanek Cc: Ping Cheng, X.Org Devel List, Dmitry Torokhov, Daniel Stone, linux-input On 11/06/10 11:53, Michal Suchanek wrote: > On 5 November 2010 19:47, Ping Cheng <pinglinux@gmail.com> wrote: >> Recent changes and discussion about MT support at LKML, UDS, and >> xorg-devel encouraged me to migrate Wacom MT devices to the slot-based >> MT protocol (introduced in kernel 2.6.36). Since Wacom supports both >> digitizer and touch devices, I need to decide how to report touch data >> when the pen is in proximity. >> >> My goal is to understand how X server would like the MT data to be >> reported from the kernel. I hope to keep kernel and X server driver MT >> support in sync so we can avoid unnecessary confusion or extra work in >> the userland. >> >> The existing solution for single touch events is to arbitrate touch >> when pen is in prox. This is based on the assumption that we do not >> want to have two cursors competing on the screen. > > As a tablet user, not X developer I don't think it is desirable and > expected to register pen touch and finger touch together as a > multitouch. That is, if I wanted to trigger a multitouch feature I > would use fingers, not pen and finger. While it would be technically > possible to use the pen touch and finger touch together to trigger > multitouch I don't think it makes much sense. Actually, a finger is a far more clumsy than a pen and beyond the biomechanics in my experience pen sensors are also a lot more accurate and stable. As such, I could see people using one finger as an anchor while using a pen for better control during a two tool manipulation. This is merely a hypothetical usage, realistically, I don't think that pen+finger actions are likely to be all that popular. Some of the earlier single touch n-trig firmwares supported simultaneous reporting of pen and finger. Newer firmwares all send the touch device a special event to indicate the pen is in range and then report no further touch activity until after both sensors go idle. The kernel driver still lets simultaneous events through to the two separate event device nodes. Given that was my only option at the time, I used that with mpx to play with multitouch. I did not really find it useful, and the only feedback I got from users was questions asking how to disable touch when they wanted to use the pen. > Also it is a good idea to turn off (single) touch when pen is in > proximity because it is quite easy to accidentally touch the surface > while moving the pen. Definitely simplifies using things like Xournal, where sometimes a finger is fine, but when you're writing with the pen you don't want to have to worry about accidentally making skin contact with the screen. > This means that the protocol that sends abs events from pen when in > proximity and from finger otherwise would work well. > > However, there is one feature that makes sort of sense and does not > work when touch is completely off. It might be possible to use touch > for button and pen for moving the cursor. I tried to use the pad > button on my Bamboo together with pen motion for right drags and it > seems somewhat easier than using the pen button. 
> > So I would say that protocol that allows events from both pen and > touch at the same time is desirable in the long run but with some > options to filter the events when both pen and touch are used > together. > > This gives basically three options > > 1) leave as is, turn off touch when pen is in proximity This would be simplest. Furthermore, if you want to block touch when the pen is in range, it would be easiest to catch that in firmware or the kernel, before the events are split off into separate streams. > 2) report everything by kernel, put some filter on the touch events in > the X driver (which defaults to killing most touch stuff when pen is > in proximity) Hard to coordinate when the events come in from separate device nodes, and may even use different drivers for pen and touch. > 3) the same as above but put the filter in the kernel driver > > > 2 or 3 is more flexible as people with some special uses can just turn > off the filter and get all events. > > WRT X ease of development I would suggest that the same device should > never report different events based on pen proximity. Either the > device is in some "compatibility mode" and reports only one pointer as > per 1 or there are two separate pointers and they are just reported > out of proximity when the pointer positioning is not desirable (ie > when pen is out or when touch is off due to pen being in). > > If somebody wants to have multitouch combining two different pointers > there is nothing stopping them, either with P&T pen and touch pointers > or completely unrelated devices. Multiplexing pen and finger events as separate MT tracks where each specifies the current tool at the start of the stream should work, but it might get a little messy. Downstream applications would be responsible for deciding if they care about the tool type and respond accordingly. Also pen and touch have different characteristics (resolution, physical dimensions, pressure) and an application would have to adjust behavior specially for every track. I have seen one really positive example where a machine had a number different of tools, all simultaneously active. However, that was a custom setup created by an artist/computer scientist for his own use and even then only for a single purpose machine. Trying to support the mixed tool streams for the general case seems tricky and prone to many mistakes. Rafi ^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-06 20:38 ` Rafi Rubin @ 2010-11-08 3:39 ` Peter Hutterer 0 siblings, 0 replies; 44+ messages in thread From: Peter Hutterer @ 2010-11-08 3:39 UTC (permalink / raw) To: Rafi Rubin Cc: Michal Suchanek, X.Org Devel List, Dmitry Torokhov, Daniel Stone, linux-input On Sat, Nov 06, 2010 at 04:38:51PM -0400, Rafi Rubin wrote: > On 11/06/10 11:53, Michal Suchanek wrote: > > On 5 November 2010 19:47, Ping Cheng <pinglinux@gmail.com> wrote: > >> Recent changes and discussion about MT support at LKML, UDS, and > >> xorg-devel encouraged me to migrate Wacom MT devices to the slot-based > >> MT protocol (introduced in kernel 2.6.36). Since Wacom supports both > >> digitizer and touch devices, I need to decide how to report touch data > >> when the pen is in proximity. > >> > >> My goal is to understand how X server would like the MT data to be > >> reported from the kernel. I hope to keep kernel and X server driver MT > >> support in sync so we can avoid unnecessary confusion or extra work in > >> the userland. > >> > >> The existing solution for single touch events is to arbitrate touch > >> when pen is in prox. This is based on the assumption that we do not > >> want to have two cursors competing on the screen. > > > > As a tablet user, not X developer I don't think it is desirable and > > expected to register pen touch and finger touch together as a > > multitouch. That is, if I wanted to trigger a multitouch feature I > > would use fingers, not pen and finger. While it would be technically > > possible to use the pen touch and finger touch together to trigger > > multitouch I don't think it makes much sense. > > Actually, a finger is a far more clumsy than a pen and beyond the biomechanics > in my experience pen sensors are also a lot more accurate and stable. As such, > I could see people using one finger as an anchor while using a pen for better > control during a two tool manipulation. I don't think the kernel (or X, for that matter) should care about clumsyness or biomechanics. If the data is valid, pass it on. Ikbel has presented one case already where the data of both can be useful, I guess over the next years we'll see more use-cases. > This is merely a hypothetical usage, realistically, I don't think that > pen+finger actions are likely to be all that popular. > > Some of the earlier single touch n-trig firmwares supported simultaneous > reporting of pen and finger. Newer firmwares all send the touch device a > special event to indicate the pen is in range and then report no further touch > activity until after both sensors go idle. > > The kernel driver still lets simultaneous events through to the two separate > event device nodes. Given that was my only option at the time, I used that with > mpx to play with multitouch. I did not really find it useful, and the only > feedback I got from users was questions asking how to disable touch when they > wanted to use the pen. > > > Also it is a good idea to turn off (single) touch when pen is in > > proximity because it is quite easy to accidentally touch the surface > > while moving the pen. > > Definitely simplifies using things like Xournal, where sometimes a finger is > fine, but when you're writing with the pen you don't want to have to worry about > accidentally making skin contact with the screen. > > > This means that the protocol that sends abs events from pen when in > > proximity and from finger otherwise would work well. 
> > > > However, there is one feature that makes sort of sense and does not > > work when touch is completely off. It might be possible to use touch > > for button and pen for moving the cursor. I tried to use the pad > > button on my Bamboo together with pen motion for right drags and it > > seems somewhat easier than using the pen button. > > > > So I would say that protocol that allows events from both pen and > > touch at the same time is desirable in the long run but with some > > options to filter the events when both pen and touch are used > > together. > > > > This gives basically three options > > > > 1) leave as is, turn off touch when pen is in proximity > > This would be simplest. Furthermore, if you want to block touch when the pen is > in range, it would be easiest to catch that in firmware or the kernel, before > the events are split off into separate streams. I'm not sure the splitting off into separate streams is a good idea long-term. I have a serial tablet here where the data comes out of a single kernel device and it's beautifully easy to hook onto whatever you need and filter what you don't. with multiple devices, that's not as clear-cut anymore. > > 2) report everything by kernel, put some filter on the touch events in > > the X driver (which defaults to killing most touch stuff when pen is > > in proximity) > > Hard to coordinate when the events come in from separate device nodes, and may > even use different drivers for pen and touch. > > > 3) the same as above but put the filter in the kernel driver > > > > > > 2 or 3 is more flexible as people with some special uses can just turn > > off the filter and get all events. > > > > WRT X ease of development I would suggest that the same device should > > never report different events based on pen proximity. Either the > > device is in some "compatibility mode" and reports only one pointer as > > per 1 or there are two separate pointers and they are just reported > > out of proximity when the pointer positioning is not desirable (ie > > when pen is out or when touch is off due to pen being in). > > > > If somebody wants to have multitouch combining two different pointers > > there is nothing stopping them, either with P&T pen and touch pointers > > or completely unrelated devices. > > Multiplexing pen and finger events as separate MT tracks where each specifies > the current tool at the start of the stream should work, but it might get a > little messy. Downstream applications would be responsible for deciding if they > care about the tool type and respond accordingly. Also pen and touch have > different characteristics (resolution, physical dimensions, pressure) and an > application would have to adjust behavior specially for every track. you mean, just like they have to do if they were two physical devices that were used simultaneously? :) > I have seen one really positive example where a machine had a number different > of tools, all simultaneously active. However, that was a custom setup created > by an artist/computer scientist for his own use and even then only for a single > purpose machine. > > Trying to support the mixed tool streams for the general case seems tricky and > prone to many mistakes. at the same time, we're only starting to see these devices to become commonplace. a lot of the infrastructure is missing, hence we don't have any applications that support it right now. but is there any reason why that setup you described above can't be commonplace (maybe not in this exact incarnation, but generalised). 
Cheers, Peter ^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-06 15:53 ` Michal Suchanek 2010-11-06 20:38 ` Rafi Rubin @ 2010-11-07 23:32 ` Ping Cheng 1 sibling, 0 replies; 44+ messages in thread From: Ping Cheng @ 2010-11-07 23:32 UTC (permalink / raw) To: Michal Suchanek Cc: X.Org Devel List, Dmitry Torokhov, Daniel Stone, linux-input On Sat, Nov 6, 2010 at 8:53 AM, Michal Suchanek <hramrach@centrum.cz> wrote: > > As a tablet user, not X developer I don't think it is desirable and > expected to register pen touch and finger touch together as a > multitouch. That is, if I wanted to trigger a multitouch feature I > would use fingers, not pen and finger. While it would be technically > possible to use the pen touch and finger touch together to trigger > multitouch I don't think it makes much sense. > > Also it is a good idea to turn off (single) touch when pen is in > proximity because it is quite easy to accidentally touch the surface > while moving the pen. > > This means that the protocol that sends abs events from pen when in > proximity and from finger otherwise would work well. > > However, there is one feature that makes sort of sense and does not > work when touch is completely off. It might be possible to use touch > for button and pen for moving the cursor. I tried to use the pad > button on my Bamboo together with pen motion for right drags and it > seems somewhat easier than using the pen button. Thank you for sharing your thoughts. This is the kind of "third party" feedback that I wished for. So, you do see the value of reporting and using both MT and pen data simultaneously. > So I would say that protocol that allows events from both pen and > touch at the same time is desirable in the long run but with some > options to filter the events when both pen and touch are used > together. > > This gives basically three options > > 1) leave as is, turn off touch when pen is in proximity > > 2) report everything by kernel, put some filter on the touch events in > the X driver (which defaults to killing most touch stuff when pen is > in proximity) > > 3) the same as above but put the filter in the kernel driver > > > 2 or 3 is more flexible as people with some special uses can just turn > off the filter and get all events. Then, which option, 2 or 3, do you prefer? To share where I am with you: I am 60% with option 2 and 40% with 3 (I was 100% with 3 before) > WRT X ease of development I would suggest that the same device should > never report different events based on pen proximity. Either the > device is in some "compatibility mode" and reports only one pointer as > per 1 or there are two separate pointers and they are just reported > out of proximity when the pointer positioning is not desirable (ie > when pen is out or when touch is off due to pen being in). > > If somebody wants to have multitouch combining two different pointers > there is nothing stopping them, either with P&T pen and touch pointers > or completely unrelated devices. We'll take care of this step once we know how to report the data. Ping -- To unsubscribe from this list: send the line "unsubscribe linux-input" in the body of a message to majordomo@vger.kernel.org More majordomo info at http://vger.kernel.org/majordomo-info.html ^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not [not found] ` <AANLkTin1svszp87Pysi5OCt5=JTSB-yVaAWF-42gfn9T-JsoAwUIsXosN+BqQ9rBEUg@public.gmane.org> @ 2010-11-07 16:30 ` Daniel Stone 2010-11-07 21:07 ` Michal Suchanek ` (2 more replies) 0 siblings, 3 replies; 44+ messages in thread From: Daniel Stone @ 2010-11-07 16:30 UTC (permalink / raw) To: Ping Cheng; +Cc: X.Org Devel List, Dmitry Torokhov, linux-input [-- Attachment #1.1: Type: text/plain, Size: 1892 bytes --] Hi, On Fri, Nov 05, 2010 at 11:47:28AM -0700, Ping Cheng wrote: > Recent changes and discussion about MT support at LKML, UDS, and > xorg-devel encouraged me to migrate Wacom MT devices to the slot-based > MT protocol (introduced in kernel 2.6.36). Nice! > My goal is to understand how X server would like the MT data to be > reported from the kernel. I hope to keep kernel and X server driver MT > support in sync so we can avoid unnecessary confusion or extra work in > the userland. :) > The existing solution for single touch events is to arbitrate touch > when pen is in prox. This is based on the assumption that we do not > want to have two cursors competing on the screen. What do you mean by 'arbitrate' here? I guess you're talking about which events should send ABS_[XY]? > 1. Arbitrate all touch data in the kernel. > > This is the simplest solution for device driver developers. But I do > not feel it is end user and userland client friendly. > > [...] > > 3. Report first finger touch as ABS_X/Y events when pen is not in prox; > Report pen data as ABS_X/Y events when there is no finger touch; > Report touch data as MT_TOOL_TOUCH and pen data as MT_TOOL_PEN > events when both pen and touch data are received. No ABS_X/Y are > reported when pen and tocuh or multi-touch data are received. > > I feel this one makes sense to userland since pen can be considered as > another touch. I'd say that either #1 or #3 is the best idea here, simply because they seem to be the most straightforward, and thus easier to support. Assuming that xf86-input-wacom will track the state itself and decide (using whatever criteria) which events should send core x/y motion, then all you need to do with ABS_[XY] is just make a best effort to have it more or less work for dumb userspace clients. Cheers, Daniel [-- Attachment #1.2: Digital signature --] [-- Type: application/pgp-signature, Size: 198 bytes --] [-- Attachment #2: Type: text/plain, Size: 219 bytes --] _______________________________________________ xorg-devel-go0+a7rfsptAfugRpC6u6w@public.gmane.org: X.Org development Archives: http://lists.x.org/archives/xorg-devel Info: http://lists.x.org/mailman/listinfo/xorg-devel ^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not
  2010-11-07 16:30 ` Daniel Stone
@ 2010-11-07 21:07 ` Michal Suchanek
  2010-11-07 23:38 ` Ping Cheng
  2010-11-07 23:49 ` Ping Cheng
  2 siblings, 0 replies; 44+ messages in thread
From: Michal Suchanek @ 2010-11-07 21:07 UTC (permalink / raw)
To: Daniel Stone, Ping Cheng, X.Org Devel List, Dmitry Torokhov, linux-input

Hello

On 7 November 2010 17:30, Daniel Stone <daniel@fooishbar.org> wrote:
> Hi,
>
> On Fri, Nov 05, 2010 at 11:47:28AM -0700, Ping Cheng wrote:
>> The existing solution for single touch events is to arbitrate touch when pen is in prox. This is based on the assumption that we do not want to have two cursors competing on the screen.
>
> What do you mean by 'arbitrate' here? I guess you're talking about which events should send ABS_[XY]?
>
>> 1. Arbitrate all touch data in the kernel.
>>
>> This is the simplest solution for device driver developers. But I do not feel it is end user and userland client friendly.
>>
>> [...]
>>
>> 3. Report first finger touch as ABS_X/Y events when pen is not in prox; Report pen data as ABS_X/Y events when there is no finger touch; Report touch data as MT_TOOL_TOUCH and pen data as MT_TOOL_PEN events when both pen and touch data are received. No ABS_X/Y are reported when pen and tocuh or multi-touch data are received.
>>
>> I feel this one makes sense to userland since pen can be considered as another touch.
>
> I'd say that either #1 or #3 is the best idea here, simply because they seem to be the most straightforward, and thus easier to support. Assuming that xf86-input-wacom will track the state itself and decide (using whatever criteria) which events should send core x/y motion, then all you need to do with ABS_[XY] is just make a best effort to have it more or less work for dumb userspace clients.
>

This is horrible. If I understand this correctly, with #1 you can never get events from both pen and touch. With #3, devices would suddenly start to send completely different events depending on the state of *other* devices: if you are moving the cursor with the pen and touch the pad with your hand, you create

a) an MT event with pen and finger, which does not make sense at all

b) an out-of-prox for the ABS device, and effectively lose input for non-MT clients

I think that there should be either #1 or separate devices for pen and touch, both with their own ABS events. In no event should devices report different events at random; decoding that from the client side would be a nightmare.

Thanks

Michal
--
To unsubscribe from this list: send the line "unsubscribe linux-input" in the body of a message to majordomo@vger.kernel.org More majordomo info at http://vger.kernel.org/majordomo-info.html

^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-07 16:30 ` Daniel Stone 2010-11-07 21:07 ` Michal Suchanek @ 2010-11-07 23:38 ` Ping Cheng 2010-11-07 23:49 ` Ping Cheng 2 siblings, 0 replies; 44+ messages in thread From: Ping Cheng @ 2010-11-07 23:38 UTC (permalink / raw) To: Daniel Stone, Ping Cheng, X.Org Devel List, Dmitry Torokhov, linux-input On Sun, Nov 7, 2010 at 8:30 AM, Daniel Stone <daniel@fooishbar.org> wrote: >> The existing solution for single touch events is to arbitrate touch >> when pen is in prox. This is based on the assumption that we do not >> want to have two cursors competing on the screen. > > What do you mean by 'arbitrate' here? I guess you're talking about which > events should send ABS_[XY]? It means ignoring touch data when pen is in prox. >> 1. Arbitrate all touch data in the kernel. >> >> This is the simplest solution for device driver developers. But I do >> not feel it is end user and userland client friendly. >> >> [...] >> >> 3. Report first finger touch as ABS_X/Y events when pen is not in prox; >> Report pen data as ABS_X/Y events when there is no finger touch; >> Report touch data as MT_TOOL_TOUCH and pen data as MT_TOOL_PEN >> events when both pen and touch data are received. No ABS_X/Y are >> reported when pen and tocuh or multi-touch data are received. >> >> I feel this one makes sense to userland since pen can be considered as >> another touch. > > I'd say that either #1 or #3 is the best idea here, simply because they > seem to be the most straightforward, and thus easier to support. There are two major difference between # > Assuming that xf86-input-wacom will track the state itself and decide > (using whatever criteria) which events should send core x/y motion, then > all you need to do with ABS_[XY] is just make a best effort to have it > more or less work for dumb userspace clients. -- To unsubscribe from this list: send the line "unsubscribe linux-input" in the body of a message to majordomo@vger.kernel.org More majordomo info at http://vger.kernel.org/majordomo-info.html ^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not
  2010-11-07 16:30 ` Daniel Stone
  2010-11-07 21:07 ` Michal Suchanek
  2010-11-07 23:38 ` Ping Cheng
@ 2010-11-07 23:49 ` Ping Cheng
  2010-11-08 0:53 ` Mohamed Ikbel Boulabiar
  2 siblings, 1 reply; 44+ messages in thread
From: Ping Cheng @ 2010-11-07 23:49 UTC (permalink / raw)
To: Daniel Stone, X.Org Devel List, Dmitry Torokhov, linux-input

On Sun, Nov 7, 2010 at 8:30 AM, Daniel Stone <daniel@fooishbar.org> wrote:
>> 1. Arbitrate all touch data in the kernel.
>>
>> This is the simplest solution for device driver developers. But I do not feel it is end user and userland client friendly.
>>
>> [...]
>>
>> 3. Report first finger touch as ABS_X/Y events when pen is not in prox; Report pen data as ABS_X/Y events when there is no finger touch; Report touch data as MT_TOOL_TOUCH and pen data as MT_TOOL_PEN events when both pen and touch data are received. No ABS_X/Y are reported when pen and tocuh or multi-touch data are received.
>>
>> I feel this one makes sense to userland since pen can be considered as another touch.
>
> I'd say that either #1 or #3 is the best idea here, simply because they seem to be the most straightforward, and thus easier to support. Assuming that xf86-input-wacom will track the state itself and decide (using whatever criteria) which events should send core x/y motion, then all you need to do with ABS_[XY] is just make a best effort to have it more or less work for dumb userspace clients.

Sorry for the noise (hit a wrong key).

There are two major differences between #1 and #3:

1. #3 does not report any ABS_X/Y data when both pen and touch data are received (#1 still sends pen ABS_X/Y);

2. #3 sends both pen and touch data as MT events when both types of data are received (#1 only sends pen data).

Which option do you like? We could go with a simple option, but that would close the door to those feature-rich applications.

Ping
--
To unsubscribe from this list: send the line "unsubscribe linux-input" in the body of a message to majordomo@vger.kernel.org More majordomo info at http://vger.kernel.org/majordomo-info.html

^ permalink raw reply [flat|nested] 44+ messages in thread
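[Editorial note, not part of the original mail: the reason #3/#4 "make sense to userland" is that an MT-aware client can tell pen contacts from finger contacts purely by reading ABS_MT_TOOL_TYPE per slot. A minimal userspace sketch follows; the device node path is hypothetical, and the constants are the in-tree names (the RFC's "MT_TOOL_TOUCH" corresponds to MT_TOOL_FINGER).]

```c
/* Minimal evdev reader distinguishing pen vs. finger MT contacts.
 * The device path is a placeholder; error handling is kept short. */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <linux/input.h>

int main(void)
{
    int fd = open("/dev/input/event5", O_RDONLY);   /* hypothetical node */
    if (fd < 0) {
        perror("open");
        return 1;
    }

    struct input_event ev;
    int slot = 0;

    while (read(fd, &ev, sizeof(ev)) == sizeof(ev)) {
        if (ev.type != EV_ABS)
            continue;
        switch (ev.code) {
        case ABS_MT_SLOT:
            slot = ev.value;            /* following MT events apply to this slot */
            break;
        case ABS_MT_TOOL_TYPE:
            printf("slot %d: %s\n", slot,
                   ev.value == MT_TOOL_PEN ? "pen" : "finger");
            break;
        case ABS_MT_POSITION_X:
            printf("slot %d: x=%d\n", slot, ev.value);
            break;
        case ABS_MT_POSITION_Y:
            printf("slot %d: y=%d\n", slot, ev.value);
            break;
        }
    }
    close(fd);
    return 0;
}
```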
* Re: [RFC] Multi-Touch (MT) support - arbitration or not
  2010-11-07 23:49 ` Ping Cheng
@ 2010-11-08 0:53 ` Mohamed Ikbel Boulabiar
  0 siblings, 0 replies; 44+ messages in thread
From: Mohamed Ikbel Boulabiar @ 2010-11-08 0:53 UTC (permalink / raw)
To: Ping Cheng; +Cc: Daniel Stone, X.Org Devel List, Dmitry Torokhov, linux-input

I don't know, but it seems like a decision is being made here without any future possibility of change in userland.

What if someone wants to create an application which uses pen+touch input like this one? http://goo.gl/ydkl

The paper is here: http://www.billbuxton.com/manual%20deskterity.pdf

But I confirm that the use case of disabling touch input when using the pen is the more common one.

i

^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-05 18:47 [RFC] Multi-Touch (MT) support - arbitration or not Ping Cheng ` (2 preceding siblings ...) [not found] ` <AANLkTin1svszp87Pysi5OCt5=JTSB-yVaAWF-42gfn9T-JsoAwUIsXosN+BqQ9rBEUg@public.gmane.org> @ 2010-11-08 3:51 ` Peter Hutterer 2010-11-08 8:08 ` Benjamin Tissoires 3 siblings, 1 reply; 44+ messages in thread From: Peter Hutterer @ 2010-11-08 3:51 UTC (permalink / raw) To: Ping Cheng; +Cc: X.Org Devel List, Dmitry Torokhov, Daniel Stone, linux-input fwiw, I'm not sure "arbitrate" is the right word here, filtering seems easier to understand in this context. I guess "arbitrate" would apply more if we emit the events across multiple devices like in the bamboo case. that's mostly bikeshedding though, my points below apply regardless of what word we choose :) note that we also have two different approaches - single kernel device or multiple kernel devices and depending on the approach the device uses the options below have different advantages and disadvantages. the tablets I've dealt with so far exposed a single event device, so that's what I'm focusing on in this email. On Fri, Nov 05, 2010 at 11:47:28AM -0700, Ping Cheng wrote: > Recent changes and discussion about MT support at LKML, UDS, and > xorg-devel encouraged me to migrate Wacom MT devices to the slot-based > MT protocol (introduced in kernel 2.6.36). Since Wacom supports both > digitizer and touch devices, I need to decide how to report touch data > when the pen is in proximity. > > My goal is to understand how X server would like the MT data to be > reported from the kernel. I hope to keep kernel and X server driver MT > support in sync so we can avoid unnecessary confusion or extra work in > the userland. > > The existing solution for single touch events is to arbitrate touch > when pen is in prox. This is based on the assumption that we do not > want to have two cursors competing on the screen. > > With the introduction of MT, the touch data are most likely translated > into something other than pointer events. So, reporting both pen and > touch data makes sense now. However, I want to assure a smooth > tansition from single touch to MT for end users so they still get the > single touch behavior as they used to be. I gathered the following > approaches: > > 1. Arbitrate all touch data in the kernel. > > This is the simplest solution for device driver developers. But I do > not feel it is end user and userland client friendly. I'm strongly opposed to this. kernel filtering of these devices is hard to circumvent and there _will_ be use-cases where we need more than one tool to work simultaneously. right now we're worrying about pen + touch, but what stops tablets from becoming large enough to be used by 2+ users with 2+ pens simultaneously? from a purely event-stream focused viewpoint: why do we even care whether something is a pen or a touch? both are just tools and how these should be used is mostly up to the clients anyway. IMO, the whole point of MT_TOOL_TYPE is that we don't have to assume use-cases for the tools but just forward the information to someone who knows how to deal with this. > 2. Report first finger touch as ABS_X/Y events when pen is not in > prox. Arbitrating single touch data when pen is in prox. Pen data is > reported as ABS_X/Y events. Both ABS_X/Y for pen or the first finger > and ABS_MT_* for MT data are reported. > > This approach reduces the overhead in dealing with two cursors in userland. > > 3. 
Report first finger touch as ABS_X/Y events when pen is not in prox; > Report pen data as ABS_X/Y events when there is no finger touch; > Report touch data as MT_TOOL_TOUCH and pen data as MT_TOOL_PEN > events when both pen and touch data are received. No ABS_X/Y are > reported when pen and tocuh or multi-touch data are received. > > I feel this one makes sense to userland since pen can be considered as > another touch. > > 4. Report first finger touch as ABS_X/Y events when pen is not in prox; > Report pen data as ABS_X/Y events when there is no finger touch; > Report touch data as MT_TOOL_TOUCH and pen data as MT_TOOL_PEN > events when both pen and touch data are received. ABS_X/Y are also > reported for pen when both pen and tocuh data are received. I'd vote for this one. It provides all the data necessary for MT clients (and all the data the device can support) but has a reasonable single-touch strategy. Given that wacom tablets are still primarily pen-centric tablets, the emphasis on pen overriding touch makes sense to me. Cheers, Peter > This one makes sense to userland too. It eases the backward > compatibility support for those clients that don't support MT at all. > > Which approach do you like? Or do you have other suggestions share? ^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-08 3:51 ` Peter Hutterer @ 2010-11-08 8:08 ` Benjamin Tissoires 2010-11-08 21:54 ` Chris Bagwell 0 siblings, 1 reply; 44+ messages in thread From: Benjamin Tissoires @ 2010-11-08 8:08 UTC (permalink / raw) To: Peter Hutterer Cc: Ping Cheng, X.Org Devel List, Dmitry Torokhov, Daniel Stone, linux-input Le 08/11/2010 04:51, Peter Hutterer a écrit : > fwiw, I'm not sure "arbitrate" is the right word here, filtering seems > easier to understand in this context. I guess "arbitrate" would apply more > if we emit the events across multiple devices like in the bamboo case. > that's mostly bikeshedding though, my points below apply regardless of what > word we choose :) > > note that we also have two different approaches - single kernel device or > multiple kernel devices and depending on the approach the device uses the > options below have different advantages and disadvantages. > > the tablets I've dealt with so far exposed a single event device, so that's > what I'm focusing on in this email. > > On Fri, Nov 05, 2010 at 11:47:28AM -0700, Ping Cheng wrote: >> Recent changes and discussion about MT support at LKML, UDS, and >> xorg-devel encouraged me to migrate Wacom MT devices to the slot-based >> MT protocol (introduced in kernel 2.6.36). Since Wacom supports both >> digitizer and touch devices, I need to decide how to report touch data >> when the pen is in proximity. >> >> My goal is to understand how X server would like the MT data to be >> reported from the kernel. I hope to keep kernel and X server driver MT >> support in sync so we can avoid unnecessary confusion or extra work in >> the userland. >> >> The existing solution for single touch events is to arbitrate touch >> when pen is in prox. This is based on the assumption that we do not >> want to have two cursors competing on the screen. >> >> With the introduction of MT, the touch data are most likely translated >> into something other than pointer events. So, reporting both pen and >> touch data makes sense now. However, I want to assure a smooth >> tansition from single touch to MT for end users so they still get the >> single touch behavior as they used to be. I gathered the following >> approaches: >> >> 1. Arbitrate all touch data in the kernel. >> >> This is the simplest solution for device driver developers. But I do >> not feel it is end user and userland client friendly. > > I'm strongly opposed to this. kernel filtering of these devices is hard to > circumvent and there _will_ be use-cases where we need more than one tool to > work simultaneously. right now we're worrying about pen + touch, but what > stops tablets from becoming large enough to be used by 2+ users with 2+ > pens simultaneously? > > from a purely event-stream focused viewpoint: why do we even care whether > something is a pen or a touch? both are just tools and how these should be > used is mostly up to the clients anyway. IMO, the whole point of > MT_TOOL_TYPE is that we don't have to assume use-cases for the tools but > just forward the information to someone who knows how to deal with this. > >> 2. Report first finger touch as ABS_X/Y events when pen is not in >> prox. Arbitrating single touch data when pen is in prox. Pen data is >> reported as ABS_X/Y events. Both ABS_X/Y for pen or the first finger >> and ABS_MT_* for MT data are reported. >> >> This approach reduces the overhead in dealing with two cursors in userland. >> >> 3. 
Report first finger touch as ABS_X/Y events when pen is not in prox;
>> Report pen data as ABS_X/Y events when there is no finger touch;
>> Report touch data as MT_TOOL_TOUCH and pen data as MT_TOOL_PEN events when both pen and touch data are received. No ABS_X/Y are reported when pen and tocuh or multi-touch data are received.
>>
>> I feel this one makes sense to userland since pen can be considered as another touch.
>>
>> 4. Report first finger touch as ABS_X/Y events when pen is not in prox; Report pen data as ABS_X/Y events when there is no finger touch; Report touch data as MT_TOOL_TOUCH and pen data as MT_TOOL_PEN events when both pen and touch data are received. ABS_X/Y are also reported for pen when both pen and tocuh data are received.
>
> I'd vote for this one. It provides all the data necessary for MT clients (and all the data the device can support) but has a reasonable single-touch strategy. Given that wacom tablets are still primarily pen-centric tablets, the emphasis on pen overriding touch makes sense to me.

Hi,

I'd also vote for this.

I don't think that the kernel should make any assumptions about the final application. The data are available, so we have to pass them on.

1. I read that people worry about sending "false" (touch) events while using the pen. But in my mind, this is a _design_ problem of the final application. I think the final application will have to filter these events: for instance, what happens if the user is too lazy to move his pen out of the proximity range (or just wants to keep the hover over the application) and wants to move the digital sheet of paper in his (her) design application? The final application will have to choose whether or not to use the touch features (depending on the pressure, for instance...).

Solution 4 (the *technical solution*) addresses the problem of the "false" events for applications (the *design problem*) that are not designed to use multitouch at all: they will just ignore the touch data. So I think it's a good start.

2. I would also add that multitouch is not only available on trackpads: there are also direct devices in absolute coordinate mode. With those devices, the touch data can be directed to another X client that is used by another user if the surface is large enough. Currently we only see relatively small surfaces (Bamboo, N-Trig devices), but in the future we can easily imagine a whole table with both pen and touch.

And this solves Michal's problem, as he will be able to use buttons in the application with a finger.

Cheers,
Benjamin

>
>> This one makes sense to userland too. It eases the backward compatibility support for those clients that don't support MT at all.
>>
>> Which approach do you like? Or do you have other suggestions share?
>
> _______________________________________________
> xorg-devel@lists.x.org: X.Org development
> Archives: http://lists.x.org/archives/xorg-devel
> Info: http://lists.x.org/mailman/listinfo/xorg-devel

--
To unsubscribe from this list: send the line "unsubscribe linux-input" in the body of a message to majordomo@vger.kernel.org More majordomo info at http://vger.kernel.org/majordomo-info.html

^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-08 8:08 ` Benjamin Tissoires @ 2010-11-08 21:54 ` Chris Bagwell 2010-11-08 23:33 ` Ping Cheng 2010-11-09 3:31 ` Peter Hutterer 0 siblings, 2 replies; 44+ messages in thread From: Chris Bagwell @ 2010-11-08 21:54 UTC (permalink / raw) To: Benjamin Tissoires Cc: Peter Hutterer, Ping Cheng, X.Org Devel List, Dmitry Torokhov, Daniel Stone, linux-input On Mon, Nov 8, 2010 at 2:08 AM, Benjamin Tissoires <tissoire@cena.fr> wrote: > Le 08/11/2010 04:51, Peter Hutterer a écrit : >> >> fwiw, I'm not sure "arbitrate" is the right word here, filtering seems >> easier to understand in this context. I guess "arbitrate" would apply more >> if we emit the events across multiple devices like in the bamboo case. >> that's mostly bikeshedding though, my points below apply regardless of >> what >> word we choose :) >> >> note that we also have two different approaches - single kernel device or >> multiple kernel devices and depending on the approach the device uses the >> options below have different advantages and disadvantages. >> >> the tablets I've dealt with so far exposed a single event device, so >> that's >> what I'm focusing on in this email. >> >> On Fri, Nov 05, 2010 at 11:47:28AM -0700, Ping Cheng wrote: >>> >>> Recent changes and discussion about MT support at LKML, UDS, and >>> xorg-devel encouraged me to migrate Wacom MT devices to the slot-based >>> MT protocol (introduced in kernel 2.6.36). Since Wacom supports both >>> digitizer and touch devices, I need to decide how to report touch data >>> when the pen is in proximity. >>> >>> My goal is to understand how X server would like the MT data to be >>> reported from the kernel. I hope to keep kernel and X server driver MT >>> support in sync so we can avoid unnecessary confusion or extra work in >>> the userland. >>> >>> The existing solution for single touch events is to arbitrate touch >>> when pen is in prox. This is based on the assumption that we do not >>> want to have two cursors competing on the screen. >>> >>> With the introduction of MT, the touch data are most likely translated >>> into something other than pointer events. So, reporting both pen and >>> touch data makes sense now. However, I want to assure a smooth >>> tansition from single touch to MT for end users so they still get the >>> single touch behavior as they used to be. I gathered the following >>> approaches: >>> >>> 1. Arbitrate all touch data in the kernel. >>> >>> This is the simplest solution for device driver developers. But I do >>> not feel it is end user and userland client friendly. >> >> I'm strongly opposed to this. kernel filtering of these devices is hard to >> circumvent and there _will_ be use-cases where we need more than one tool >> to >> work simultaneously. right now we're worrying about pen + touch, but what >> stops tablets from becoming large enough to be used by 2+ users with 2+ >> pens simultaneously? >> >> from a purely event-stream focused viewpoint: why do we even care whether >> something is a pen or a touch? both are just tools and how these should be >> used is mostly up to the clients anyway. IMO, the whole point of >> MT_TOOL_TYPE is that we don't have to assume use-cases for the tools but >> just forward the information to someone who knows how to deal with this. >> >>> 2. Report first finger touch as ABS_X/Y events when pen is not in >>> prox. Arbitrating single touch data when pen is in prox. Pen data is >>> reported as ABS_X/Y events. 
Both ABS_X/Y for pen or the first finger >>> and ABS_MT_* for MT data are reported. >>> >>> This approach reduces the overhead in dealing with two cursors in >>> userland. >>> >>> 3. Report first finger touch as ABS_X/Y events when pen is not in >>> prox; >>> Report pen data as ABS_X/Y events when there is no finger touch; >>> Report touch data as MT_TOOL_TOUCH and pen data as MT_TOOL_PEN >>> events when both pen and touch data are received. No ABS_X/Y are >>> reported when pen and tocuh or multi-touch data are received. >>> >>> I feel this one makes sense to userland since pen can be considered as >>> another touch. >>> >>> 4. Report first finger touch as ABS_X/Y events when pen is not in >>> prox; >>> Report pen data as ABS_X/Y events when there is no finger touch; >>> Report touch data as MT_TOOL_TOUCH and pen data as MT_TOOL_PEN >>> events when both pen and touch data are received. ABS_X/Y are also >>> reported for pen when both pen and tocuh data are received. >> >> I'd vote for this one. It provides all the data necessary for MT clients >> (and all the data the device can support) but has a reasonable >> single-touch >> strategy. Given that wacom tablets are still primarily pen-centric >> tablets, >> the emphasis on pen overriding touch makes sense to me. > > Hi, > > I'd also vote for this. > > I don't think that the kernel should make any assumption on the final > application. The data are available, so we have to pass them. > > 1. I read that people worry about sending "false" events (touch) while using > the pen. But in my mind, this is a _design_ problem of the final > application. I think the final application will have to filter these events: > for instance, what happens if the user is too lazy to remove his pen (or > just want to keep the hover on the application) out of the proximity range > and want to move its digital sheet of paper in his (her) design application? > The final application will have to choose whether using or not the touch > features (depending on the pressure for instance...). > > The solution 4. (*technical solution*) addresses the problem of the "false" > events for the applications (*design problem*) that are not designed to used > multitouch. They will just ignore the touch data. > So I think, it's a good start > > > 2. I would also add that multitouch is not only available for trackpads: > there are also direct devices in absolute coordinate mode. With those > device, the touch data can be directed to an other Xclient that is used by > an other user if the surface is large enough. Currently we only see > relatively small surfaces (bamboo, ntrig devices), but in the future, we can > easily imagine a whole table with both pen and touch. > > And this solve Michal's problem as he will be able to use buttons in the > application with the finger. > > Cheers, > Benjamin > >> >>> This one makes sense to userland too. It eases the backward >>> compatibility support for those clients that don't support MT at all. >>> >>> Which approach do you like? Or do you have other suggestions share? >> I think we may be mixing some topics and so I'd like to try to re-frame the discussion. There are two different cases and they may have different answers because of it. Case 1) 1 input device can support multiple tools that are in proximity at same time. I believe this is currently a theoretical example (no driver exists like this). In RFC example, this input devices has a pen and 2 finger touches. They all share ABS_X/Y/PRESSURE values. 
The single touch (ST) input filtering breaks being able to support this case and what multitouch events (MT) were added for. To date, when converting drivers over to MT events the guideline is *always* send MT events (because what app wants to randomly switch between MT event processing and ST event processing for same X/Y/PRESSURE?) and send something sane for ST events to be backwards compatible with older apps. I think everyone is happy in this thread to always send pen+touch MT events and let X drivers or similar filter/arbitrate out unwanted touch events as needed. The ideal "sane" behavior for touch ST events has been leaning towards tracking 1st touch and continue sending 1st touch during multi-touch but there is some debate because tracking can be expensive in kernel. In case of pen+touch, the sane may change to prefer pen over touch and prefer first touch when 2 touches exist. Or "sane" can mean let the ST values go crazy during multi-touch and hope user can use GUI enough after new kernel install to get a MT-aware X driver. Its easy to implement preferring pen then preferring 1st touch so I suggest doing that. This is for backwards compatibility only (un-modified xf86-input-wacom/synaptics/evdev/etc). The future is MT events, in which case the ST events are meaningless and we are hiding nothing to applications that look at MT events. Case 2) 2 input devices can support multiple tools in proximity at same time. I believe it was Rafi that brought up point that dual pen+touch interfaces will have different properties. Touch will be lower resolution then Pen and maybe different fuzz factor. Also, on tablets I would think pretty easy to have different dimensions (one tool works over larger area of tablet). This is easy to expose to user when 2 input devices. Combining into single input to user would be nice but at least when dimensions are different, we probably do not want to remove that visibility to user and so must keep 2 input devices. In this case, the RFC example becomes 2 touches on 1 input device and 1 pen on another input device. So using same MT guidelines, the touch input device would always send MT events and always send ST events tracking the first touch. For pen input, ST-only events are OK because its not competing with anything being in proximity at same time. But we may wish to also send MT events for this 1 tool/slot as a hint to X drivers that somehow this input corresponds with another input device and so it needs to do filtering/arbitration. We also need to somehow give info to applications so they can bind these 2 inputs. Also, non-MT ware drivers are also same apps that will not know how to bind 2 input devices and so can't filter/arbitrate the unwanted touches. So problem, we do want to filter ST events on touch input when pen is in proximity. There are lots of things needing to be addressed for this 2nd case so I'll not really give a personal opinion. My opinion is likely to change as we make individual decisions. Chris -- To unsubscribe from this list: send the line "unsubscribe linux-input" in the body of a message to majordomo@vger.kernel.org More majordomo info at http://vger.kernel.org/majordomo-info.html ^ permalink raw reply [flat|nested] 44+ messages in thread
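[Editorial sketch, not from the thread: a minimal illustration in C of the "prefer pen, then first touch" single-touch fallback described above for case 1. The struct and function names are made up for illustration; only the input_report_abs()/input_report_key() calls are real kernel input API.]

#include <linux/input.h>

struct tool_state {
	bool pen_in_prox;
	int  pen_x, pen_y, pen_pressure;
	bool finger_down[2];
	int  finger_x[2], finger_y[2];
};

static void report_legacy_st(struct input_dev *dev, struct tool_state *s)
{
	if (s->pen_in_prox) {
		/* Pen wins: legacy ABS_X/Y follow the pen. */
		input_report_abs(dev, ABS_X, s->pen_x);
		input_report_abs(dev, ABS_Y, s->pen_y);
		input_report_abs(dev, ABS_PRESSURE, s->pen_pressure);
	} else if (s->finger_down[0]) {
		/* Otherwise the legacy events track the first touch only. */
		input_report_abs(dev, ABS_X, s->finger_x[0]);
		input_report_abs(dev, ABS_Y, s->finger_y[0]);
	}
	input_report_key(dev, BTN_TOUCH,
			 s->pen_in_prox || s->finger_down[0]);
}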
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-08 21:54 ` Chris Bagwell @ 2010-11-08 23:33 ` Ping Cheng 2010-11-09 3:42 ` Chris Bagwell 2010-11-09 3:31 ` Peter Hutterer 1 sibling, 1 reply; 44+ messages in thread From: Ping Cheng @ 2010-11-08 23:33 UTC (permalink / raw) To: Chris Bagwell Cc: Benjamin Tissoires, Peter Hutterer, X.Org Devel List, Dmitry Torokhov, Daniel Stone, linux-input On Mon, Nov 8, 2010 at 1:54 PM, Chris Bagwell <chris@cnpbagwell.com> wrote: > I think we may be mixing some topics and so I'd like to try to > re-frame the discussion. > > There are two different cases and they may have different answers > because of it. I'd like to cover both cases with an unified answer. There is only one physical device no matter which case we are in, especially in the kernel. The only difference is whether it is a serial or USB device. So, my comments applies to both types of devices. > Case 1) 1 input device can support multiple tools that are in > proximity at same time. > > I believe this is currently a theoretical example (no driver exists like this). > > In RFC example, this input devices has a pen and 2 finger touches. > They all share ABS_X/Y/PRESSURE values. The single touch (ST) input > filtering breaks being able to support this case and what multitouch > events (MT) were added for. > > To date, when converting drivers over to MT events the guideline is > *always* send MT events (because what app wants to randomly switch > between MT event processing and ST event processing for same > X/Y/PRESSURE?) and send something sane for ST events to be backwards > compatible with older apps. > > I think everyone is happy in this thread to always send pen+touch MT > events and let X drivers or similar filter/arbitrate out unwanted > touch events as needed. Ok, the done deal so far is: we'll send MT events regardless of pen position. > The ideal "sane" behavior for touch ST events has been leaning towards > tracking 1st touch and continue sending 1st touch during multi-touch > but there is some debate because tracking can be expensive in kernel. > In case of pen+touch, the sane may change to prefer pen over touch and > prefer first touch when 2 touches exist. > > Or "sane" can mean let the ST values go crazy during multi-touch and > hope user can use GUI enough after new kernel install to get a > MT-aware X driver. > > Its easy to implement preferring pen then preferring 1st touch so I > suggest doing that. This is for backwards compatibility only > (un-modified xf86-input-wacom/synaptics/evdev/etc). The future is MT > events, in which case the ST events are meaningless and we are hiding > nothing to applications that look at MT events. > > Case 2) 2 input devices can support multiple tools in proximity at same time. > > I believe it was Rafi that brought up point that dual pen+touch > interfaces will have different properties. Touch will be lower > resolution then Pen and maybe different fuzz factor. Also, on tablets > I would think pretty easy to have different dimensions (one tool works > over larger area of tablet). This is easy to expose to user when 2 > input devices. Kernel driver doesn't need to worry about this detail. X server driver or clients should take care of them. Each tool has . Dual pen+touch (ST) has been supported for a while.... > Combining into single input to user would be nice but at least when > dimensions are different, we probably do not want to remove that > visibility to user and so must keep 2 input devices. 
> > In this case, the RFC example becomes 2 touches on 1 input device and > 1 pen on another input device. > > So using same MT guidelines, the touch input device would always send > MT events and always send ST events tracking the first touch. > > For pen input, ST-only events are OK because its not competing with > anything being in proximity at same time. But we may wish to also > send MT events for this 1 tool/slot as a hint to X drivers that > somehow this input corresponds with another input device and so it > needs to do filtering/arbitration. We also need to somehow give info > to applications so they can bind these 2 inputs. > > Also, non-MT ware drivers are also same apps that will not know how to > bind 2 input devices and so can't filter/arbitrate the unwanted > touches. So problem, we do want to filter ST events on touch input > when pen is in proximity. Filtering touch events when pen is in prox is reasonable for existing apps. We will add a new property in the X driver so future MT-aware apps can turn ST/cursor events on/off when necessary. By default, the ST/cursor will be on to support backward compatibility. > There are lots of things needing to be addressed for this 2nd case so > I'll not really give a personal opinion. My opinion is likely to > change as we make individual decisions. You'll see that my opinion changes too. That's why we need this [RFC] ;). I still have one decision to make though: Do we really want to repeat the pen data with MT_TOOL_PEN? It is already reported by ST events. From my comments for proposal #4, you can tell that I was in favor of repeating them. After comparing #2 with #4 a bit further, I see both options have its pros and cons. Which way should we go? #2 or #4? Ping -- To unsubscribe from this list: send the line "unsubscribe linux-input" in the body of a message to majordomo@vger.kernel.org More majordomo info at http://vger.kernel.org/majordomo-info.html ^ permalink raw reply [flat|nested] 44+ messages in thread
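[Editorial sketch of what one event frame could look like under proposal #4 on a single-device tablet, with pen and one finger in proximity: the pen is repeated in an MT slot and also drives the legacy ABS_X/Y. Coordinates are invented; note the kernel constant for what the thread calls MT_TOOL_TOUCH is MT_TOOL_FINGER. The slot change is written as a plain ABS_MT_SLOT report, equivalent to the input_mt_slot() helper added with the 2.6.36 slot protocol.]

#include <linux/input.h>

static void report_pen_plus_finger(struct input_dev *dev)
{
	/* slot 0: the pen */
	input_report_abs(dev, ABS_MT_SLOT, 0);
	input_report_abs(dev, ABS_MT_TRACKING_ID, 100);
	input_report_abs(dev, ABS_MT_TOOL_TYPE, MT_TOOL_PEN);
	input_report_abs(dev, ABS_MT_POSITION_X, 5000);
	input_report_abs(dev, ABS_MT_POSITION_Y, 4000);

	/* slot 1: the first finger */
	input_report_abs(dev, ABS_MT_SLOT, 1);
	input_report_abs(dev, ABS_MT_TRACKING_ID, 101);
	input_report_abs(dev, ABS_MT_TOOL_TYPE, MT_TOOL_FINGER);
	input_report_abs(dev, ABS_MT_POSITION_X, 9000);
	input_report_abs(dev, ABS_MT_POSITION_Y, 7000);

	/* legacy single-touch view follows the pen, per proposal #4 */
	input_report_abs(dev, ABS_X, 5000);
	input_report_abs(dev, ABS_Y, 4000);

	input_sync(dev);
}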
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-08 23:33 ` Ping Cheng @ 2010-11-09 3:42 ` Chris Bagwell 0 siblings, 0 replies; 44+ messages in thread From: Chris Bagwell @ 2010-11-09 3:42 UTC (permalink / raw) To: Ping Cheng Cc: Benjamin Tissoires, Peter Hutterer, X.Org Devel List, Dmitry Torokhov, Daniel Stone, linux-input On Mon, Nov 8, 2010 at 5:33 PM, Ping Cheng <pinglinux@gmail.com> wrote: > On Mon, Nov 8, 2010 at 1:54 PM, Chris Bagwell <chris@cnpbagwell.com> wrote: >> I think we may be mixing some topics and so I'd like to try to >> re-frame the discussion. >> >> There are two different cases and they may have different answers >> because of it. > > I'd like to cover both cases with an unified answer. There is only one > physical device no matter which case we are in, especially in the > kernel. The only difference is whether it is a serial or USB device. > So, my comments applies to both types of devices. > >> Case 1) 1 input device can support multiple tools that are in >> proximity at same time. >> >> I believe this is currently a theoretical example (no driver exists like this). >> >> In RFC example, this input devices has a pen and 2 finger touches. >> They all share ABS_X/Y/PRESSURE values. The single touch (ST) input >> filtering breaks being able to support this case and what multitouch >> events (MT) were added for. >> >> To date, when converting drivers over to MT events the guideline is >> *always* send MT events (because what app wants to randomly switch >> between MT event processing and ST event processing for same >> X/Y/PRESSURE?) and send something sane for ST events to be backwards >> compatible with older apps. >> >> I think everyone is happy in this thread to always send pen+touch MT >> events and let X drivers or similar filter/arbitrate out unwanted >> touch events as needed. > > Ok, the done deal so far is: we'll send MT events regardless of pen position. > >> The ideal "sane" behavior for touch ST events has been leaning towards >> tracking 1st touch and continue sending 1st touch during multi-touch >> but there is some debate because tracking can be expensive in kernel. >> In case of pen+touch, the sane may change to prefer pen over touch and >> prefer first touch when 2 touches exist. >> >> Or "sane" can mean let the ST values go crazy during multi-touch and >> hope user can use GUI enough after new kernel install to get a >> MT-aware X driver. >> >> Its easy to implement preferring pen then preferring 1st touch so I >> suggest doing that. This is for backwards compatibility only >> (un-modified xf86-input-wacom/synaptics/evdev/etc). The future is MT >> events, in which case the ST events are meaningless and we are hiding >> nothing to applications that look at MT events. >> >> Case 2) 2 input devices can support multiple tools in proximity at same time. >> >> I believe it was Rafi that brought up point that dual pen+touch >> interfaces will have different properties. Touch will be lower >> resolution then Pen and maybe different fuzz factor. Also, on tablets >> I would think pretty easy to have different dimensions (one tool works >> over larger area of tablet). This is easy to expose to user when 2 >> input devices. > > Kernel driver doesn't need to worry about this detail. X server driver > or clients should take care of them. Each tool has . Dual pen+touch > (ST) has been supported for a while.... Kernel driver informs clients this information with EVIOCGABS so clients can work with future hardware. 
Driver can probably scale resolution up to same and hide difference to client but dimensions are different story. > >> Combining into single input to user would be nice but at least when >> dimensions are different, we probably do not want to remove that >> visibility to user and so must keep 2 input devices. >> >> In this case, the RFC example becomes 2 touches on 1 input device and >> 1 pen on another input device. >> >> So using same MT guidelines, the touch input device would always send >> MT events and always send ST events tracking the first touch. >> >> For pen input, ST-only events are OK because its not competing with >> anything being in proximity at same time. But we may wish to also >> send MT events for this 1 tool/slot as a hint to X drivers that >> somehow this input corresponds with another input device and so it >> needs to do filtering/arbitration. We also need to somehow give info >> to applications so they can bind these 2 inputs. >> >> Also, non-MT ware drivers are also same apps that will not know how to >> bind 2 input devices and so can't filter/arbitrate the unwanted >> touches. So problem, we do want to filter ST events on touch input >> when pen is in proximity. > > Filtering touch events when pen is in prox is reasonable for existing > apps. We will add a new property in the X driver so future MT-aware > apps can turn ST/cursor events on/off when necessary. By default, the > ST/cursor will be on to support backward compatibility. I'm not sure I'd add an option to look at ST events when MT are around but I do agree with need some TBD knobs to tweak user visible behaviour. > >> There are lots of things needing to be addressed for this 2nd case so >> I'll not really give a personal opinion. My opinion is likely to >> change as we make individual decisions. > > You'll see that my opinion changes too. That's why we need this [RFC] ;). > > I still have one decision to make though: > > Do we really want to repeat the pen data with MT_TOOL_PEN? It is > already reported by ST events. Just to be clear, for case #1 we have to send it. For case #2, its surely optional and I've not made up my mind either. On one side, if we send then it make case #1 and case #2 have similar code flow to client but on the other hand it looks a little odd if you look at pen input device by itself (for example, if pen goes to xf86-input-wacom and touch goes to xf86-input-evdev). > > From my comments for proposal #4, you can tell that I was in favor of > repeating them. After comparing #2 with #4 a bit further, I see both > options have its pros and cons. If I understand #4, its what we would use in case #1. For that, we want to always send some ST events that thats pen first priority and 1st touch second prority. For case #2 (at least Bamboo, Wacom Tablet PC's, and NTrig are all these 2 input devices BTW), I'm leaning towards #2. Chris > > Which way should we go? #2 or #4? > > Ping > -- To unsubscribe from this list: send the line "unsubscribe linux-input" in the body of a message to majordomo@vger.kernel.org More majordomo info at http://vger.kernel.org/majordomo-info.html ^ permalink raw reply [flat|nested] 44+ messages in thread
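[Editorial sketch of the EVIOCGABS query mentioned above, from the userspace side: a client reads the range, fuzz, and resolution of an MT axis so it can cope with pen and touch interfaces that advertise different dimensions. The device path is a placeholder.]

#include <fcntl.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <linux/input.h>

int main(void)
{
	struct input_absinfo abs;
	int fd = open("/dev/input/event5", O_RDONLY);

	if (fd < 0)
		return 1;
	if (ioctl(fd, EVIOCGABS(ABS_MT_POSITION_X), &abs) == 0)
		printf("MT X: min %d max %d fuzz %d res %d\n",
		       abs.minimum, abs.maximum, abs.fuzz, abs.resolution);
	return 0;
}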
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-08 21:54 ` Chris Bagwell 2010-11-08 23:33 ` Ping Cheng @ 2010-11-09 3:31 ` Peter Hutterer 2010-11-09 4:14 ` Chris Bagwell 2010-11-09 6:59 ` Dmitry Torokhov 1 sibling, 2 replies; 44+ messages in thread From: Peter Hutterer @ 2010-11-09 3:31 UTC (permalink / raw) To: Chris Bagwell Cc: Benjamin Tissoires, Ping Cheng, X.Org Devel List, Dmitry Torokhov, Daniel Stone, linux-input On Mon, Nov 08, 2010 at 03:54:51PM -0600, Chris Bagwell wrote: > On Mon, Nov 8, 2010 at 2:08 AM, Benjamin Tissoires <tissoire@cena.fr> wrote: > > Le 08/11/2010 04:51, Peter Hutterer a écrit : > >> > >> fwiw, I'm not sure "arbitrate" is the right word here, filtering seems > >> easier to understand in this context. I guess "arbitrate" would apply more > >> if we emit the events across multiple devices like in the bamboo case. > >> that's mostly bikeshedding though, my points below apply regardless of > >> what > >> word we choose :) > >> > >> note that we also have two different approaches - single kernel device or > >> multiple kernel devices and depending on the approach the device uses the > >> options below have different advantages and disadvantages. > >> > >> the tablets I've dealt with so far exposed a single event device, so > >> that's > >> what I'm focusing on in this email. > >> > >> On Fri, Nov 05, 2010 at 11:47:28AM -0700, Ping Cheng wrote: > >>> > >>> Recent changes and discussion about MT support at LKML, UDS, and > >>> xorg-devel encouraged me to migrate Wacom MT devices to the slot-based > >>> MT protocol (introduced in kernel 2.6.36). Since Wacom supports both > >>> digitizer and touch devices, I need to decide how to report touch data > >>> when the pen is in proximity. > >>> > >>> My goal is to understand how X server would like the MT data to be > >>> reported from the kernel. I hope to keep kernel and X server driver MT > >>> support in sync so we can avoid unnecessary confusion or extra work in > >>> the userland. > >>> > >>> The existing solution for single touch events is to arbitrate touch > >>> when pen is in prox. This is based on the assumption that we do not > >>> want to have two cursors competing on the screen. > >>> > >>> With the introduction of MT, the touch data are most likely translated > >>> into something other than pointer events. So, reporting both pen and > >>> touch data makes sense now. However, I want to assure a smooth > >>> tansition from single touch to MT for end users so they still get the > >>> single touch behavior as they used to be. I gathered the following > >>> approaches: > >>> > >>> 1. Arbitrate all touch data in the kernel. > >>> > >>> This is the simplest solution for device driver developers. But I do > >>> not feel it is end user and userland client friendly. > >> > >> I'm strongly opposed to this. kernel filtering of these devices is hard to > >> circumvent and there _will_ be use-cases where we need more than one tool > >> to > >> work simultaneously. right now we're worrying about pen + touch, but what > >> stops tablets from becoming large enough to be used by 2+ users with 2+ > >> pens simultaneously? > >> > >> from a purely event-stream focused viewpoint: why do we even care whether > >> something is a pen or a touch? both are just tools and how these should be > >> used is mostly up to the clients anyway. IMO, the whole point of > >> MT_TOOL_TYPE is that we don't have to assume use-cases for the tools but > >> just forward the information to someone who knows how to deal with this. 
> >> > >>> 2. Report first finger touch as ABS_X/Y events when pen is not in > >>> prox. Arbitrating single touch data when pen is in prox. Pen data is > >>> reported as ABS_X/Y events. Both ABS_X/Y for pen or the first finger > >>> and ABS_MT_* for MT data are reported. > >>> > >>> This approach reduces the overhead in dealing with two cursors in > >>> userland. > >>> > >>> 3. Report first finger touch as ABS_X/Y events when pen is not in > >>> prox; > >>> Report pen data as ABS_X/Y events when there is no finger touch; > >>> Report touch data as MT_TOOL_TOUCH and pen data as MT_TOOL_PEN > >>> events when both pen and touch data are received. No ABS_X/Y are > >>> reported when pen and tocuh or multi-touch data are received. > >>> > >>> I feel this one makes sense to userland since pen can be considered as > >>> another touch. > >>> > >>> 4. Report first finger touch as ABS_X/Y events when pen is not in > >>> prox; > >>> Report pen data as ABS_X/Y events when there is no finger touch; > >>> Report touch data as MT_TOOL_TOUCH and pen data as MT_TOOL_PEN > >>> events when both pen and touch data are received. ABS_X/Y are also > >>> reported for pen when both pen and tocuh data are received. > >> > >> I'd vote for this one. It provides all the data necessary for MT clients > >> (and all the data the device can support) but has a reasonable > >> single-touch > >> strategy. Given that wacom tablets are still primarily pen-centric > >> tablets, > >> the emphasis on pen overriding touch makes sense to me. > > > > Hi, > > > > I'd also vote for this. > > > > I don't think that the kernel should make any assumption on the final > > application. The data are available, so we have to pass them. > > > > 1. I read that people worry about sending "false" events (touch) while using > > the pen. But in my mind, this is a _design_ problem of the final > > application. I think the final application will have to filter these events: > > for instance, what happens if the user is too lazy to remove his pen (or > > just want to keep the hover on the application) out of the proximity range > > and want to move its digital sheet of paper in his (her) design application? > > The final application will have to choose whether using or not the touch > > features (depending on the pressure for instance...). > > > > The solution 4. (*technical solution*) addresses the problem of the "false" > > events for the applications (*design problem*) that are not designed to used > > multitouch. They will just ignore the touch data. > > So I think, it's a good start > > > > > > 2. I would also add that multitouch is not only available for trackpads: > > there are also direct devices in absolute coordinate mode. With those > > device, the touch data can be directed to an other Xclient that is used by > > an other user if the surface is large enough. Currently we only see > > relatively small surfaces (bamboo, ntrig devices), but in the future, we can > > easily imagine a whole table with both pen and touch. > > > > And this solve Michal's problem as he will be able to use buttons in the > > application with the finger. > > > > Cheers, > > Benjamin > > > >> > >>> This one makes sense to userland too. It eases the backward > >>> compatibility support for those clients that don't support MT at all. > >>> > >>> Which approach do you like? Or do you have other suggestions share? > >> > > I think we may be mixing some topics and so I'd like to try to > re-frame the discussion. 
> > There are two different cases and they may have different answers > because of it. > > Case 1) 1 input device can support multiple tools that are in > proximity at same time. > > I believe this is currently a theoretical example (no driver exists like this). if you consider touch to be just another tool, we already have devices that support proximity of multiple tools. This isn't theoretical anymore. > In RFC example, this input devices has a pen and 2 finger touches. > They all share ABS_X/Y/PRESSURE values. The single touch (ST) input > filtering breaks being able to support this case and what multitouch > events (MT) were added for. > > To date, when converting drivers over to MT events the guideline is > *always* send MT events (because what app wants to randomly switch > between MT event processing and ST event processing for same > X/Y/PRESSURE?) and send something sane for ST events to be backwards > compatible with older apps. > > I think everyone is happy in this thread to always send pen+touch MT > events and let X drivers or similar filter/arbitrate out unwanted > touch events as needed. > > The ideal "sane" behavior for touch ST events has been leaning towards > tracking 1st touch and continue sending 1st touch during multi-touch > but there is some debate because tracking can be expensive in kernel. > In case of pen+touch, the sane may change to prefer pen over touch and > prefer first touch when 2 touches exist. > > Or "sane" can mean let the ST values go crazy during multi-touch and > hope user can use GUI enough after new kernel install to get a > MT-aware X driver. > > Its easy to implement preferring pen then preferring 1st touch so I > suggest doing that. This is for backwards compatibility only > (un-modified xf86-input-wacom/synaptics/evdev/etc). The future is MT > events, in which case the ST events are meaningless and we are hiding > nothing to applications that look at MT events. > > Case 2) 2 input devices can support multiple tools in proximity at same time. > > I believe it was Rafi that brought up point that dual pen+touch > interfaces will have different properties. Touch will be lower > resolution then Pen and maybe different fuzz factor. Also, on tablets > I would think pretty easy to have different dimensions (one tool works > over larger area of tablet). This is easy to expose to user when 2 > input devices. Yes and no. We're talking about kernel level here and I don't think this should be done at this level. The current behaviour of the X driver is to split multiple tools up into multiple X devices, so the points above can easily be achieved in userspace. > Combining into single input to user would be nice but at least when > dimensions are different, we probably do not want to remove that > visibility to user and so must keep 2 input devices. If we run into issues with different axis ranges/resolutions for multiple MT_SLOT devices, this should be addressed in the kernel as well. I feel uncomfortable about splitting up a physical device into multiple devices, it takes information away that cannot easily be re-created in the userspace. Even with a method of associating multiple event devices to the same physical device, the parsing of simultaneous events is harder because you're essentially deserialising event streams. In userspace, you have to re-serialize based on parallel inputs. That said, it also goes counter the whole multi-touch approach - allowing more than one device on a single physical device. 
Cheers, Peter > In this case, the RFC example becomes 2 touches on 1 input device and > 1 pen on another input device. > > So using same MT guidelines, the touch input device would always send > MT events and always send ST events tracking the first touch. > > For pen input, ST-only events are OK because its not competing with > anything being in proximity at same time. But we may wish to also > send MT events for this 1 tool/slot as a hint to X drivers that > somehow this input corresponds with another input device and so it > needs to do filtering/arbitration. We also need to somehow give info > to applications so they can bind these 2 inputs. > > Also, non-MT ware drivers are also same apps that will not know how to > bind 2 input devices and so can't filter/arbitrate the unwanted > touches. So problem, we do want to filter ST events on touch input > when pen is in proximity. > > There are lots of things needing to be addressed for this 2nd case so > I'll not really give a personal opinion. My opinion is likely to > change as we make individual decisions. > -- To unsubscribe from this list: send the line "unsubscribe linux-input" in the body of a message to majordomo@vger.kernel.org More majordomo info at http://vger.kernel.org/majordomo-info.html ^ permalink raw reply [flat|nested] 44+ messages in thread
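[Editorial sketch of the "re-serialising" burden described above: once pen and touch are split across two event nodes, a client has to poll both fds and interleave the frames itself. The device paths are placeholders; a real client would first discover and pair the nodes.]

#include <fcntl.h>
#include <poll.h>
#include <unistd.h>
#include <linux/input.h>

int main(void)
{
	struct pollfd pfd[2] = {
		{ .fd = open("/dev/input/event4", O_RDONLY | O_NONBLOCK),
		  .events = POLLIN },	/* pen interface */
		{ .fd = open("/dev/input/event5", O_RDONLY | O_NONBLOCK),
		  .events = POLLIN },	/* touch interface */
	};
	struct input_event ev;
	int i;

	for (;;) {
		poll(pfd, 2, -1);
		for (i = 0; i < 2; i++) {
			if (!(pfd[i].revents & POLLIN))
				continue;
			while (read(pfd[i].fd, &ev, sizeof(ev)) == sizeof(ev)) {
				/* merge pen (i == 0) and touch (i == 1)
				 * events back into one logical stream */
			}
		}
	}
}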
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-09 3:31 ` Peter Hutterer @ 2010-11-09 4:14 ` Chris Bagwell 2010-11-10 4:46 ` Peter Hutterer 2010-11-09 6:59 ` Dmitry Torokhov 1 sibling, 1 reply; 44+ messages in thread From: Chris Bagwell @ 2010-11-09 4:14 UTC (permalink / raw) To: Peter Hutterer Cc: Benjamin Tissoires, Ping Cheng, X.Org Devel List, Dmitry Torokhov, Daniel Stone, linux-input On Mon, Nov 8, 2010 at 9:31 PM, Peter Hutterer <peter.hutterer@who-t.net> wrote: > On Mon, Nov 08, 2010 at 03:54:51PM -0600, Chris Bagwell wrote: >> >> I think we may be mixing some topics and so I'd like to try to >> re-frame the discussion. >> >> There are two different cases and they may have different answers >> because of it. >> >> Case 1) 1 input device can support multiple tools that are in >> proximity at same time. >> >> I believe this is currently a theoretical example (no driver exists like this). > > if you consider touch to be just another tool, we already have devices that > support proximity of multiple tools. This isn't theoretical anymore. Yes, I totally agree there. I meant more a MT driver with both pen and touch or really any case were one tool in proximity can invalidate meaning of other tools in proximity. 2 paragraphs down describes todays 1 input device with MT behaviour. > >> In RFC example, this input devices has a pen and 2 finger touches. >> They all share ABS_X/Y/PRESSURE values. The single touch (ST) input >> filtering breaks being able to support this case and what multitouch >> events (MT) were added for. >> >> To date, when converting drivers over to MT events the guideline is >> *always* send MT events (because what app wants to randomly switch >> between MT event processing and ST event processing for same >> X/Y/PRESSURE?) and send something sane for ST events to be backwards >> compatible with older apps. >> >> I think everyone is happy in this thread to always send pen+touch MT >> events and let X drivers or similar filter/arbitrate out unwanted >> touch events as needed. >> >> The ideal "sane" behavior for touch ST events has been leaning towards >> tracking 1st touch and continue sending 1st touch during multi-touch >> but there is some debate because tracking can be expensive in kernel. >> In case of pen+touch, the sane may change to prefer pen over touch and >> prefer first touch when 2 touches exist. >> >> Or "sane" can mean let the ST values go crazy during multi-touch and >> hope user can use GUI enough after new kernel install to get a >> MT-aware X driver. >> >> Its easy to implement preferring pen then preferring 1st touch so I >> suggest doing that. This is for backwards compatibility only >> (un-modified xf86-input-wacom/synaptics/evdev/etc). The future is MT >> events, in which case the ST events are meaningless and we are hiding >> nothing to applications that look at MT events. >> >> Case 2) 2 input devices can support multiple tools in proximity at same time. >> >> I believe it was Rafi that brought up point that dual pen+touch >> interfaces will have different properties. Touch will be lower >> resolution then Pen and maybe different fuzz factor. Also, on tablets >> I would think pretty easy to have different dimensions (one tool works >> over larger area of tablet). This is easy to expose to user when 2 >> input devices. > > Yes and no. We're talking about kernel level here and I don't think this > should be done at this level. 
The current behaviour of the X driver is to > split multiple tools up into multiple X devices, so the points above can > easily be achieved in userspace. > > >> Combining into single input to user would be nice but at least when >> dimensions are different, we probably do not want to remove that >> visibility to user and so must keep 2 input devices. > > If we run into issues with different axis ranges/resolutions for multiple > MT_SLOT devices, this should be addressed in the kernel as well. Yes, that seems a fair statement. > I feel uncomfortable about splitting up a physical device into multiple > devices, it takes information away that cannot easily be re-created in the > userspace. Even with a method of associating multiple event devices to the > same physical device, the parsing of simultaneous events is harder because > you're essentially deserialising event streams. In userspace, you have to > re-serialize based on parallel inputs. > > That said, it also goes counter the whole multi-touch approach - allowing > more than one device on a single physical device. Hmm, does this sum up your opinion? You are a strong proponent of having all related tools sent over a single input device so you get natural context of events. When you do it this way, todays sample MT implementation for touchpad "just work" for pen+touch as well. That behaviour can basically be summed up with "send MT events for all tools and let clients figure it out. For older ST events do something sane to help older apps." So I get and do agree with that part but you've not clearly stated if your also saying something like refuse to support split input solutions and we should fix kernel instead of defining a behaviour for this case. If we are forced to support split inputs, I suspect your basically OK with behaviour #2 because its effectively emulating single input behaviour as best it can and we are just picking what "sane" means in this case odd case. I've copied #2 below and added my own text in "[]" to be sure and clarify text in context of case #2. 2. Report first finger touch as ABS_X/Y events [on touch input device] when pen is not in prox. Arbitrating single touch data [on touch input device] when pen is in prox. Pen data is reported as ABS_X/Y events [on pen input device]. Both ABS_X/Y for pen or the first finger and ABS_MT_* for MT data are reported [each MT send]. [MT are even sent on touch device even though only 1 in proximity tool possible so that client can combine both inputs' events and see same behaviour as if it was a single input device.] Chris > > Cheers, > Peter > >> In this case, the RFC example becomes 2 touches on 1 input device and >> 1 pen on another input device. >> >> So using same MT guidelines, the touch input device would always send >> MT events and always send ST events tracking the first touch. >> >> For pen input, ST-only events are OK because its not competing with >> anything being in proximity at same time. But we may wish to also >> send MT events for this 1 tool/slot as a hint to X drivers that >> somehow this input corresponds with another input device and so it >> needs to do filtering/arbitration. We also need to somehow give info >> to applications so they can bind these 2 inputs. >> >> Also, non-MT ware drivers are also same apps that will not know how to >> bind 2 input devices and so can't filter/arbitrate the unwanted >> touches. So problem, we do want to filter ST events on touch input >> when pen is in proximity. 
>> >> There are lots of things needing to be addressed for this 2nd case so >> I'll not really give a personal opinion. My opinion is likely to >> change as we make individual decisions. >> > -- To unsubscribe from this list: send the line "unsubscribe linux-input" in the body of a message to majordomo@vger.kernel.org More majordomo info at http://vger.kernel.org/majordomo-info.html ^ permalink raw reply [flat|nested] 44+ messages in thread
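[Editorial sketch of option #2 in the split two-input-device case as restated above: the touch interface always sends MT events, but the legacy ST events are suppressed while the pen interface has flagged proximity through some shared per-tablet state. The struct and function names are illustrative only, not the actual wacom driver code.]

#include <linux/input.h>

struct tablet_shared {
	bool stylus_in_prox;		/* set by the pen interface */
};

static void touch_report_frame(struct input_dev *dev,
			       struct tablet_shared *shared,
			       int x, int y, bool touching)
{
	/* MT events always go out, so MT-aware clients see everything. */
	input_report_abs(dev, ABS_MT_SLOT, 0);
	input_report_abs(dev, ABS_MT_POSITION_X, x);
	input_report_abs(dev, ABS_MT_POSITION_Y, y);

	/* ST events are filtered while the pen is in prox, so unmodified
	 * evdev/synaptics clients never see a competing cursor. */
	if (!shared->stylus_in_prox) {
		input_report_abs(dev, ABS_X, x);
		input_report_abs(dev, ABS_Y, y);
		input_report_key(dev, BTN_TOUCH, touching);
	}
	input_sync(dev);
}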
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-09 4:14 ` Chris Bagwell @ 2010-11-10 4:46 ` Peter Hutterer 2010-11-11 3:57 ` Chris Bagwell 0 siblings, 1 reply; 44+ messages in thread From: Peter Hutterer @ 2010-11-10 4:46 UTC (permalink / raw) To: Chris Bagwell Cc: Benjamin Tissoires, Ping Cheng, X.Org Devel List, Dmitry Torokhov, Daniel Stone, linux-input On Mon, Nov 08, 2010 at 10:14:56PM -0600, Chris Bagwell wrote: > > I feel uncomfortable about splitting up a physical device into multiple > > devices, it takes information away that cannot easily be re-created in the > > userspace. Even with a method of associating multiple event devices to the > > same physical device, the parsing of simultaneous events is harder because > > you're essentially deserialising event streams. In userspace, you have to > > re-serialize based on parallel inputs. > > > > That said, it also goes counter the whole multi-touch approach - allowing > > more than one device on a single physical device. > > Hmm, does this sum up your opinion? You are a strong proponent of > having all related tools sent over a single input device so you get > natural context of events. When you do it this way, todays sample MT > implementation for touchpad "just work" for pen+touch as well. That > behaviour can basically be summed up with "send MT events for all > tools and let clients figure it out. For older ST events do something > sane to help older apps." yes, that sums it up nicely. > So I get and do agree with that part but you've not clearly stated if > your also saying something like refuse to support split input > solutions and we should fix kernel instead of defining a behaviour for > this case. If we are forced to support split inputs, I suspect your > basically OK with behaviour #2 because its effectively emulating > single input behaviour as best it can and we are just picking what > "sane" means in this case odd case. If I read this correctly, then yes, I say don't try to find a split device solution but fix up the per-MT-tool axis range problem instead. Once that's done, we can do #4 as Ping suggested for ST. Splitting gives us some benefits now, but long-term it'll be harder to coordinate the different tools and devices. Unfortunately, my crystal ball is cloudy, so I can't prove that case :) Either way, I think even the current bamboo case should be merged back into one single device. > I've copied #2 below and added my own text in "[]" to be sure and > clarify text in context of case #2. > > 2. Report first finger touch as ABS_X/Y events [on touch input > device] when pen is not in > prox. Arbitrating single touch data [on touch input device] when pen > is in prox. Pen data is > reported as ABS_X/Y events [on pen input device]. Both ABS_X/Y for pen > or the first finger > and ABS_MT_* for MT data are reported [each MT send]. [MT are even > sent on touch device even though only 1 in proximity tool possible so > that client can combine both inputs' events and see same behaviour as > if it was a single input device.] Assuming a split device - why do any filtering at all? Report ABS_X/Y for pen on the pen device, touch as MT events on the touch device _and_ first finger touch as ABS_X/Y on the touch device. If userspace can somehow couple the two devices then it's easy enough to filter touch events when necessary. Don't report pen as the only MT set on the pen device, that's just confusing. Does this answer your question? 
Cheers, Peter > >> In this case, the RFC example becomes 2 touches on 1 input device and > >> 1 pen on another input device. > >> > >> So using same MT guidelines, the touch input device would always send > >> MT events and always send ST events tracking the first touch. > >> > >> For pen input, ST-only events are OK because its not competing with > >> anything being in proximity at same time. But we may wish to also > >> send MT events for this 1 tool/slot as a hint to X drivers that > >> somehow this input corresponds with another input device and so it > >> needs to do filtering/arbitration. We also need to somehow give info > >> to applications so they can bind these 2 inputs. > >> > >> Also, non-MT ware drivers are also same apps that will not know how to > >> bind 2 input devices and so can't filter/arbitrate the unwanted > >> touches. So problem, we do want to filter ST events on touch input > >> when pen is in proximity. > >> > >> There are lots of things needing to be addressed for this 2nd case so > >> I'll not really give a personal opinion. My opinion is likely to > >> change as we make individual decisions. > >> > > > -- To unsubscribe from this list: send the line "unsubscribe linux-input" in the body of a message to majordomo@vger.kernel.org More majordomo info at http://vger.kernel.org/majordomo-info.html ^ permalink raw reply [flat|nested] 44+ messages in thread
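[Editorial sketch of one way userspace could do the coupling mentioned above: sibling USB interfaces of one tablet usually share the same phys string up to a trailing "/inputN", so comparing EVIOCGPHYS output lets a client guess that a pen node and a touch node belong together. The exact phys format is an assumption; udev properties would be a more robust mechanism.]

#include <string.h>
#include <sys/ioctl.h>
#include <linux/input.h>

static int same_tablet(int fd_a, int fd_b)
{
	char phys_a[64] = "", phys_b[64] = "";
	char *slash;

	ioctl(fd_a, EVIOCGPHYS(sizeof(phys_a)), phys_a);
	ioctl(fd_b, EVIOCGPHYS(sizeof(phys_b)), phys_b);

	/* strip the "/inputN" suffix before comparing */
	if ((slash = strrchr(phys_a, '/')))
		*slash = '\0';
	if ((slash = strrchr(phys_b, '/')))
		*slash = '\0';

	return phys_a[0] && strcmp(phys_a, phys_b) == 0;
}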
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-10 4:46 ` Peter Hutterer @ 2010-11-11 3:57 ` Chris Bagwell 2010-11-11 18:23 ` Ping Cheng 0 siblings, 1 reply; 44+ messages in thread From: Chris Bagwell @ 2010-11-11 3:57 UTC (permalink / raw) To: Peter Hutterer Cc: Benjamin Tissoires, Ping Cheng, X.Org Devel List, Dmitry Torokhov, Daniel Stone, linux-input On Tue, Nov 9, 2010 at 10:46 PM, Peter Hutterer <peter.hutterer@who-t.net> wrote: > On Mon, Nov 08, 2010 at 10:14:56PM -0600, Chris Bagwell wrote: >> >> I've copied #2 below and added my own text in "[]" to be sure and >> clarify text in context of case #2. >> >> 2. Report first finger touch as ABS_X/Y events [on touch input >> device] when pen is not in >> prox. Arbitrating single touch data [on touch input device] when pen >> is in prox. Pen data is >> reported as ABS_X/Y events [on pen input device]. Both ABS_X/Y for pen >> or the first finger >> and ABS_MT_* for MT data are reported [each MT send]. [MT are even >> sent on touch device even though only 1 in proximity tool possible so >> that client can combine both inputs' events and see same behaviour as >> if it was a single input device.] > > Assuming a split device - why do any filtering at all? > Report ABS_X/Y for pen on the pen device, touch as MT events on the touch > device _and_ first finger touch as ABS_X/Y on the touch device. If userspace > can somehow couple the two devices then it's easy enough to filter touch > events when necessary. Who is audience of these ABS_X/Y events? We are not sending them for MT-aware application benefit. Matter of fact, it slightly complicates MT-aware apps because they have to filter them in addition to filtering MT events. The real audience of ABS_X/Y is either older apps or "simple" apps (people that chose to keep it simple and not support MT events). This class of apps can't really be expected to bind two devices logically and mask. So without filtering ABS_X/Y on kernel side, we've basically made Bamboo drivers unusable with a range of apps; which probably means xf86-input-{evdev|synaptics} (I can't justify adding binding logic to those two). I think we also have some high level agreements that we should combine input devices in kernel long term with minor technical decisions to be made. So that makes any logical binding code in xf86-input-* transitional only. So by masking in kernel for these special pen+touch 2 input case, its simple, keeps Bamboo/Ntrig usable with existing xf86-input-{evdev|synaptics|wacom} and keeps user land from writing that transitional logic. > > Don't report pen as the only MT set on the pen device, that's just > confusing. Yeah, agree. Its mainly only if it solves some large applications binding issues but thats theoretically right now. No real need to do it. > > Does this answer your question? Yes. Thanks. Chris -- To unsubscribe from this list: send the line "unsubscribe linux-input" in the body of a message to majordomo@vger.kernel.org More majordomo info at http://vger.kernel.org/majordomo-info.html ^ permalink raw reply [flat|nested] 44+ messages in thread
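[Editorial companion to the touch-side sketch earlier: the pen interface updates the shared proximity flag, which is what allows the kernel-side masking argued for above to work for non-MT-aware apps. The names (same hypothetical struct tablet_shared) are illustrative, not the real wacom code.]

#include <linux/input.h>

struct tablet_shared {
	bool stylus_in_prox;
};

static void pen_report_prox(struct input_dev *dev,
			    struct tablet_shared *shared, bool in_prox)
{
	shared->stylus_in_prox = in_prox;	/* seen by the touch interface */

	input_report_key(dev, BTN_TOOL_PEN, in_prox);
	input_sync(dev);
}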
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-11 3:57 ` Chris Bagwell @ 2010-11-11 18:23 ` Ping Cheng 0 siblings, 0 replies; 44+ messages in thread From: Ping Cheng @ 2010-11-11 18:23 UTC (permalink / raw) To: Chris Bagwell Cc: Peter Hutterer, Benjamin Tissoires, X.Org Devel List, Dmitry Torokhov, Daniel Stone, linux-input On Wed, Nov 10, 2010 at 7:57 PM, Chris Bagwell <chris@cnpbagwell.com> wrote: > On Tue, Nov 9, 2010 at 10:46 PM, Peter Hutterer > <peter.hutterer@who-t.net> wrote: >> On Mon, Nov 08, 2010 at 10:14:56PM -0600, Chris Bagwell wrote: >>> >>> I've copied #2 below and added my own text in "[]" to be sure and >>> clarify text in context of case #2. >>> >>> 2. Report first finger touch as ABS_X/Y events [on touch input >>> device] when pen is not in >>> prox. Arbitrating single touch data [on touch input device] when pen >>> is in prox. Pen data is >>> reported as ABS_X/Y events [on pen input device]. Both ABS_X/Y for pen >>> or the first finger >>> and ABS_MT_* for MT data are reported [each MT send]. [MT are even >>> sent on touch device even though only 1 in proximity tool possible so >>> that client can combine both inputs' events and see same behaviour as >>> if it was a single input device.] >> >> Assuming a split device - why do any filtering at all? >> Report ABS_X/Y for pen on the pen device, touch as MT events on the touch >> device _and_ first finger touch as ABS_X/Y on the touch device. If userspace >> can somehow couple the two devices then it's easy enough to filter touch >> events when necessary. > > Who is audience of these ABS_X/Y events? We are not sending them for > MT-aware application benefit. Matter of fact, it slightly complicates > MT-aware apps because they have to filter them in addition to > filtering MT events. > > The real audience of ABS_X/Y is either older apps or "simple" apps > (people that chose to keep it simple and not support MT events). This > class of apps can't really be expected to bind two devices logically > and mask. So without filtering ABS_X/Y on kernel side, we've > basically made Bamboo drivers unusable with a range of apps; which > probably means xf86-input-{evdev|synaptics} (I can't justify adding > binding logic to those two). I agree with Chris. Filtering touch ABS_X/Y when pen in prox is necessary for non-MT aware apps. It also eases backward support. Ping -- To unsubscribe from this list: send the line "unsubscribe linux-input" in the body of a message to majordomo@vger.kernel.org More majordomo info at http://vger.kernel.org/majordomo-info.html ^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-09 3:31 ` Peter Hutterer 2010-11-09 4:14 ` Chris Bagwell @ 2010-11-09 6:59 ` Dmitry Torokhov 2010-11-09 16:28 ` Chris Bagwell 2010-11-09 18:10 ` Ping Cheng 1 sibling, 2 replies; 44+ messages in thread From: Dmitry Torokhov @ 2010-11-09 6:59 UTC (permalink / raw) To: Peter Hutterer Cc: Chris Bagwell, Benjamin Tissoires, Ping Cheng, X.Org Devel List, Daniel Stone, linux-input On Tue, Nov 09, 2010 at 01:31:49PM +1000, Peter Hutterer wrote: > > That said, it also goes counter the whole multi-touch approach - allowing > more than one device on a single physical device. > So maybe we should teach wacom to handle all devices as a single input device even in cases when they use several USB interfaces? We should be able to detect related interfaces by examining intf->intf_assoc (I hope) and using usb_driver_claim_interface() to claim them. -- Dmitry ^ permalink raw reply [flat|nested] 44+ messages in thread
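[Editorial sketch of the direction suggested above: from the probe() of one USB interface, claim the sibling interface so pen and touch end up handled by a single driver instance that could feed one input device. Interface number 1 for the sibling is only an assumption; a real driver would look it up, e.g. via the interface association (intf->intf_assoc). The driver and function names are hypothetical.]

#include <linux/usb.h>

extern struct usb_driver example_tablet_driver;		/* hypothetical */

static int example_probe(struct usb_interface *intf,
			 const struct usb_device_id *id)
{
	struct usb_device *udev = interface_to_usbdev(intf);
	struct usb_interface *sibling = usb_ifnum_to_if(udev, 1);

	if (sibling && sibling != intf)
		if (usb_driver_claim_interface(&example_tablet_driver,
					       sibling, NULL))
			return -ENODEV;

	/* ... set up one shared input device fed by both interfaces ... */
	return 0;
}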
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-09 6:59 ` Dmitry Torokhov @ 2010-11-09 16:28 ` Chris Bagwell 2010-11-09 20:10 ` Ping Cheng 2010-11-09 18:10 ` Ping Cheng 1 sibling, 1 reply; 44+ messages in thread From: Chris Bagwell @ 2010-11-09 16:28 UTC (permalink / raw) To: Dmitry Torokhov Cc: Peter Hutterer, Benjamin Tissoires, Ping Cheng, X.Org Devel List, Daniel Stone, linux-input On Tue, Nov 9, 2010 at 12:59 AM, Dmitry Torokhov <dmitry.torokhov@gmail.com> wrote: > On Tue, Nov 09, 2010 at 01:31:49PM +1000, Peter Hutterer wrote: >> >> That said, it also goes counter the whole multi-touch approach - allowing >> more than one device on a single physical device. >> > > So maybe we should teach wacom to handle all devices as a single input device even > in cases when they use several USB interfaces? We should be able to > detect related interfaces by examining intf->intf_assoc (I hope) and > using usb_driver_claim_interface() to claim them. Thanks for tips. I may try it just to prove its possible. Here is extra info on resolution/dimension issue to also solve when combining. Taken from current logic on touch input of Wacom Bamboo: input_mt_create_slots(input_dev, 2); input_set_abs_params(input_dev, ABS_MT_POSITION_X, 0, features->x_max, features->x_fuzz, 0); Combining 2 inputs means MT slots increases from 2 to 3 (2 touches and 1 stylus). Today, Pen has x_max=17420, x_fuzz=4 and resolution of 2540. Also today, Touch has x_max=15360, x_fuzz=128 and resolution=dunno (we are scaling up touch x_max in driver and I haven't calculated its affects on resolution). I believe that a normalized value of x_max would show that there is a greater area of tablet can be used for touch then pen. To handle this difference, we can scale reported values in driver such that x_max=x_max for all slots. I'm not clear on fuzz logic so don't know what 4 vs 128 does. Or we can maybe update MT interface so you can have per slot values for clients to query (or MT logic automatically scales for driver as another option). Then slots 0-1 are reserved for touch and slot 3 is reserved for pen. Chris ^ permalink raw reply [flat|nested] 44+ messages in thread
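[Editorial sketch of the scaling mentioned above: if a combined device advertises the pen range for ABS_MT_POSITION_X, touch coordinates reported in the touch range have to be scaled up before being placed in a slot. The maxima are the Bamboo values quoted in the message; the helper name is made up.]

#define PEN_X_MAX	17420
#define TOUCH_X_MAX	15360

static int touch_to_pen_x(int touch_x)
{
	/* 32-bit arithmetic is enough: 15360 * 17420 fits in an int */
	return touch_x * PEN_X_MAX / TOUCH_X_MAX;
}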
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-09 16:28 ` Chris Bagwell @ 2010-11-09 20:10 ` Ping Cheng 2010-11-10 5:02 ` Peter Hutterer 0 siblings, 1 reply; 44+ messages in thread From: Ping Cheng @ 2010-11-09 20:10 UTC (permalink / raw) To: Chris Bagwell Cc: Dmitry Torokhov, Peter Hutterer, X.Org Devel List, linux-input, Henrik Rydberg On Tue, Nov 9, 2010 at 8:28 AM, Chris Bagwell <chris@cnpbagwell.com> wrote: > On Tue, Nov 9, 2010 at 12:59 AM, Dmitry Torokhov > <dmitry.torokhov@gmail.com> wrote: >> On Tue, Nov 09, 2010 at 01:31:49PM +1000, Peter Hutterer wrote: >>> >>> That said, it also goes counter the whole multi-touch approach - allowing >>> more than one device on a single physical device. >>> >> >> So maybe we should teach wacom to handle all devices as a single input device even >> in cases when they use several USB interfaces? We should be able to >> detect related interfaces by examining intf->intf_assoc (I hope) and >> using usb_driver_claim_interface() to claim them. > > Thanks for tips. I may try it just to prove its possible. > > Here is extra info on resolution/dimension issue to also solve when > combining. Taken from current logic on touch input of Wacom Bamboo: > > input_mt_create_slots(input_dev, 2); > input_set_abs_params(input_dev, ABS_MT_POSITION_X, 0, features->x_max, > features->x_fuzz, 0); > > Combining 2 inputs means MT slots increases from 2 to 3 (2 touches and > 1 stylus). > > Today, Pen has x_max=17420, x_fuzz=4 and resolution of 2540. Also > today, Touch has x_max=15360, x_fuzz=128 and resolution=dunno (we are > scaling up touch x_max in driver and I haven't calculated its affects > on resolution). > > I believe that a normalized value of x_max would show that there is a > greater area of tablet can be used for touch then pen. Normalization can be a way to address this issue. But, I am not sure if we want normalization in the kernel. You are onto the right point though. How are we going to pass the different resolution and x/y_max with the same SLOT array? This is an issue we can not avoid for serial MT devices since all data are from the same interface (your case #1). The USB case (your case #2) can work around it since pen and touch are on different logical port. But, I think we want to have an unified approach for both serial and USB to avoid userland inconsistency. What I did in the current code is to report touch x/y_max through ABS_RX/Y so we don't step on ABS_X/Y for serial devices. I know there are people don't like this idea either. So, any more suggestions? Specifying individual resolution and x/y_max for each MT slot is another way to address this issue. But that requires MT implementation and (maybe) spec change. Thank you for thinking out loud. I am sure you understood why I haven't picked up that Bamboo for so long ;). Ping > To handle this difference, we can scale reported values in driver such > that x_max=x_max for all slots. I'm not clear on fuzz logic so don't > know what 4 vs 128 does. > > Or we can maybe update MT interface so you can have per slot values > for clients to query (or MT logic automatically scales for driver as > another option). Then slots 0-1 are reserved for touch and slot 3 is > reserved for pen. > > Chris > -- To unsubscribe from this list: send the line "unsubscribe linux-input" in the body of a message to majordomo@vger.kernel.org More majordomo info at http://vger.kernel.org/majordomo-info.html ^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-09 20:10 ` Ping Cheng @ 2010-11-10 5:02 ` Peter Hutterer 2010-11-10 10:00 ` Henrik Rydberg 0 siblings, 1 reply; 44+ messages in thread From: Peter Hutterer @ 2010-11-10 5:02 UTC (permalink / raw) To: Ping Cheng Cc: Chris Bagwell, Dmitry Torokhov, X.Org Devel List, linux-input, Henrik Rydberg On Tue, Nov 09, 2010 at 12:10:26PM -0800, Ping Cheng wrote: > On Tue, Nov 9, 2010 at 8:28 AM, Chris Bagwell <chris@cnpbagwell.com> wrote: > > On Tue, Nov 9, 2010 at 12:59 AM, Dmitry Torokhov > > <dmitry.torokhov@gmail.com> wrote: > >> On Tue, Nov 09, 2010 at 01:31:49PM +1000, Peter Hutterer wrote: > >>> > >>> That said, it also goes counter the whole multi-touch approach - allowing > >>> more than one device on a single physical device. > >>> > >> > >> So maybe we should teach wacom to handle all devices as a single input device even > >> in cases when they use several USB interfaces? We should be able to > >> detect related interfaces by examining intf->intf_assoc (I hope) and > >> using usb_driver_claim_interface() to claim them. > > > > Thanks for tips. I may try it just to prove its possible. > > > > Here is extra info on resolution/dimension issue to also solve when > > combining. Taken from current logic on touch input of Wacom Bamboo: > > > > input_mt_create_slots(input_dev, 2); > > input_set_abs_params(input_dev, ABS_MT_POSITION_X, 0, features->x_max, > > features->x_fuzz, 0); > > > > Combining 2 inputs means MT slots increases from 2 to 3 (2 touches and > > 1 stylus). > > > > Today, Pen has x_max=17420, x_fuzz=4 and resolution of 2540. Also > > today, Touch has x_max=15360, x_fuzz=128 and resolution=dunno (we are > > scaling up touch x_max in driver and I haven't calculated its affects > > on resolution). > > > > I believe that a normalized value of x_max would show that there is a > > greater area of tablet can be used for touch then pen. > > Normalization can be a way to address this issue. But, I am not sure > if we want normalization in the kernel. You are onto the right point > though. How are we going to pass the different resolution and x/y_max > with the same SLOT array? This is an issue we can not avoid for serial > MT devices since all data are from the same interface (your case #1). > > The USB case (your case #2) can work around it since pen and touch are > on different logical port. But, I think we want to have an unified > approach for both serial and USB to avoid userland inconsistency. yes please! in the X driver, I shouldn't have to care if the tablet is usb, serial or pigeon based. likewise I shouldn't really have to worry about I4, I3, Bamboo, Cintiq or whatever, just about the capabilities that the kernel driver reports. Right now, that's not the case and the amount of model checks we have in the X driver is amazing. I realize that we may never get down to zero model-specific code paths, but we could at least aim to get there. > What I did in the current code is to report touch x/y_max through ABS_RX/Y > so we don't step on ABS_X/Y for serial devices. > > I know there are people don't like this idea either. So, any more suggestions? > > Specifying individual resolution and x/y_max for each MT slot is > another way to address this issue. But that requires MT implementation > and (maybe) spec change. well, we can add to the spec. lucky us. ;) reporting touch input through and axis that's labelled as rotation is the fast-track to pain. it makes generic user-space code really hard. 
Cheers, Peter > Thank you for thinking out loud. I am sure you understood why I > haven't picked up that Bamboo for so long ;). > > Ping > > > To handle this difference, we can scale reported values in driver such > > that x_max=x_max for all slots. I'm not clear on fuzz logic so don't > > know what 4 vs 128 does. > > > > Or we can maybe update MT interface so you can have per slot values > > for clients to query (or MT logic automatically scales for driver as > > another option). Then slots 0-1 are reserved for touch and slot 3 is > > reserved for pen. > > > > Chris > > > -- To unsubscribe from this list: send the line "unsubscribe linux-input" in the body of a message to majordomo@vger.kernel.org More majordomo info at http://vger.kernel.org/majordomo-info.html ^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-10 5:02 ` Peter Hutterer @ 2010-11-10 10:00 ` Henrik Rydberg 2010-11-10 23:53 ` Peter Hutterer 0 siblings, 1 reply; 44+ messages in thread From: Henrik Rydberg @ 2010-11-10 10:00 UTC (permalink / raw) To: Peter Hutterer Cc: Ping Cheng, Chris Bagwell, Dmitry Torokhov, X.Org Devel List, linux-input A comment on pixels and resolution: A pen and a thumb may have different resolution (signal-to-noise ratio), but there is no reason they cannot be reported on the same scale. In fact, it could be argued that it is natural for objects on the same surface to be reported in the coordinate system of the surface. So, if anything, the resolution is object/sensor dependent, and adding a possibility to specify resolution per object type would be good. It would also be good to know the physical dimensions of the surface. Cheers, Henrik ^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-10 10:00 ` Henrik Rydberg @ 2010-11-10 23:53 ` Peter Hutterer 2010-11-11 0:48 ` Henrik Rydberg 0 siblings, 1 reply; 44+ messages in thread From: Peter Hutterer @ 2010-11-10 23:53 UTC (permalink / raw) To: Henrik Rydberg Cc: Ping Cheng, Chris Bagwell, Dmitry Torokhov, X.Org Devel List, linux-input On Wed, Nov 10, 2010 at 11:00:05AM +0100, Henrik Rydberg wrote: > A comment on pixels and resolution: > > A pen and a thumb may have different resolution (signal-to-noise ratio), but > there is no reason they cannot be reported on the same scale. In fact, it could > be argued that it is natural for objects on the same surface to be reported in > the coordinate system of the surface. it may be natural from a human perspective, but the computer doesn't care about it. And given that most input device interpretation is done in software, the scale used doesn't matter as long as it's correct. in the UI, even with different ranges for different tools, top-left should refer to whatever coordinate that is. in other words, if the pure numbers matter in the UI, we've done something wrong. what benefit do we get from reporting tools on the same scale if the HW doesn't do so? > So, if anything, the resolution is object/sensor dependent, and adding a > possibility to specify resolution per object type would be good. It would also > be good to know the physical dimensions of the surface. well, the physical dimensions are exported through the resolution, isn't it? if I have a range of 0-1000 and a resolution of 100 units per cm, I can guesstimate the phys size of the device. I know that in the X protocol resolution is specified in units/m, but given previous threads the kernel seems to be ambiguous here, alternating between in and mm. Cheers, Peter ^ permalink raw reply [flat|nested] 44+ messages in thread
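The guesstimate Peter mentions is simple arithmetic. A sketch, assuming the resolution field is reported in units per millimetre for the main axes (the convention the kernel documentation later settled on); the helper name is purely illustrative:

    #include <linux/input.h>

    /* rough physical extent of one axis, in mm, from its absinfo */
    static int axis_size_mm(const struct input_absinfo *abs)
    {
    	if (!abs->resolution)
    		return 0;	/* resolution not reported */
    	return (abs->maximum - abs->minimum) / abs->resolution;
    }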
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-10 23:53 ` Peter Hutterer @ 2010-11-11 0:48 ` Henrik Rydberg 2010-11-11 1:22 ` Dmitry Torokhov 0 siblings, 1 reply; 44+ messages in thread From: Henrik Rydberg @ 2010-11-11 0:48 UTC (permalink / raw) To: Peter Hutterer Cc: Ping Cheng, Chris Bagwell, Dmitry Torokhov, X.Org Devel List, linux-input On 11/11/2010 12:53 AM, Peter Hutterer wrote: > On Wed, Nov 10, 2010 at 11:00:05AM +0100, Henrik Rydberg wrote: >> A comment on pixels and resolution: >> >> A pen and a thumb may have different resolution (signal-to-noise ratio), but >> there is no reason they cannot be reported on the same scale. In fact, it could >> be argued that it is natural for objects on the same surface to be reported in >> the coordinate system of the surface. > > it may be natural from a human perspective, but the computer doesn't care > about it. The fact that we discuss a computer protocol suggests the computer does care. ;-) The question is whether we need to be able to support different scales for different tools types. I argue that among the three things value range, physical range and SN ratio, the one most naturally seen as an attribute of a tool is the SN ratio. > And given that most input device interpretation is done in > software, the scale used doesn't matter as long as it's correct. > in the UI, even with different ranges for different tools, top-left should > refer to whatever coordinate that is. > > in other words, if the pure numbers matter in the UI, we've done something > wrong. > > what benefit do we get from reporting tools on the same scale if the HW > doesn't do so? The question is, given a set of tool types reported via MT events, what additional information is needed. Having tools share the same ABS axes, I would like to see that as a good thing. So what is missing from that picture? >> So, if anything, the resolution is object/sensor dependent, and adding a >> possibility to specify resolution per object type would be good. It would also >> be good to know the physical dimensions of the surface. > > well, the physical dimensions are exported through the resolution, isn't it? > if I have a range of 0-1000 and a resolution of 100 units per cm, I can > guesstimate the phys size of the device. I see, we are apparently using different meanings of resolution. I meant signal-to-noise ratio, the number of distinctly measurable different values. If you mean how many ticks on your scale there are in an inch, you are of course right. The SN ratio is still unknown, though - that would be how many _different_ ticks there are in an inch. > I know that in the X protocol resolution is specified in units/m, but given > previous threads the kernel seems to be ambiguous here, alternating between > in and mm. Using finger width is another means of estimation. Cheers, Henrik ^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-11 0:48 ` Henrik Rydberg @ 2010-11-11 1:22 ` Dmitry Torokhov 2010-11-11 8:06 ` Michal Suchanek 0 siblings, 1 reply; 44+ messages in thread From: Dmitry Torokhov @ 2010-11-11 1:22 UTC (permalink / raw) To: Henrik Rydberg Cc: Peter Hutterer, Ping Cheng, Chris Bagwell, X.Org Devel List, linux-input On Thu, Nov 11, 2010 at 01:48:44AM +0100, Henrik Rydberg wrote: > On 11/11/2010 12:53 AM, Peter Hutterer wrote: > > > On Wed, Nov 10, 2010 at 11:00:05AM +0100, Henrik Rydberg wrote: > >> A comment on pixels and resolution: > >> > >> A pen and a thumb may have different resolution (signal-to-noise ratio), but > >> there is no reason they cannot be reported on the same scale. In fact, it could > >> be argued that it is natural for objects on the same surface to be reported in > >> the coordinate system of the surface. > > > > it may be natural from a human perspective, but the computer doesn't care > > about it. > > > The fact that we discuss a computer protocol suggests the computer does care. > ;-) The question is whether we need to be able to support different scales for > different tools types. I argue that among the three things value range, physical > range and SN ratio, the one most naturally seen as an attribute of a tool is the > SN ratio. > > > And given that most input device interpretation is done in > > software, the scale used doesn't matter as long as it's correct. > > > in the UI, even with different ranges for different tools, top-left should > > refer to whatever coordinate that is. > > > > in other words, if the pure numbers matter in the UI, we've done something > > wrong. > > > > > what benefit do we get from reporting tools on the same scale if the HW > > doesn't do so? > > > The question is, given a set of tool types reported via MT events, what > additional information is needed. Having tools share the same ABS axes, I would > like to see that as a good thing. So what is missing from that picture? I think that with devices that share the same working surface with different tools (touch/pen) it would make sense to normalize the data in the kernel. If for no other reason but to simplify the protocol. In cases when tools are way to different driver writer might opt to export the device as 2 separate logical input devices (and then take care of coordinating them in userspace), in other cases treat them as one input device and scale to the same range. I tend to think [at the moment] that the latter would be most sensible option for Bamboo... Thanks. -- Dmitry ^ permalink raw reply [flat|nested] 44+ messages in thread
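A sketch of the normalization Dmitry suggests, using the Bamboo ranges quoted earlier; the helper is hypothetical and not taken from any driver. Note that a plain rescale like this assumes both sensors cover the same physical area, which is exactly what is questioned in the following messages.

    /* map a raw touch coordinate onto the pen's value range so that
     * both tools report on one shared scale (rounding to nearest) */
    static int scale_touch(int value, int touch_max, int pen_max)
    {
    	return (value * pen_max + touch_max / 2) / touch_max;
    }

    /* e.g. for the Bamboo X axis: scale_touch(x, 15360, 17420) */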
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-11 1:22 ` Dmitry Torokhov @ 2010-11-11 8:06 ` Michal Suchanek 2010-11-11 8:26 ` Dmitry Torokhov 0 siblings, 1 reply; 44+ messages in thread From: Michal Suchanek @ 2010-11-11 8:06 UTC (permalink / raw) To: Dmitry Torokhov Cc: Henrik Rydberg, X.Org Devel List, Chris Bagwell, linux-input On 11 November 2010 02:22, Dmitry Torokhov <dmitry.torokhov@gmail.com> wrote: > On Thu, Nov 11, 2010 at 01:48:44AM +0100, Henrik Rydberg wrote: >> On 11/11/2010 12:53 AM, Peter Hutterer wrote: >> >> > On Wed, Nov 10, 2010 at 11:00:05AM +0100, Henrik Rydberg wrote: >> >> A comment on pixels and resolution: >> >> >> >> A pen and a thumb may have different resolution (signal-to-noise ratio), but >> >> there is no reason they cannot be reported on the same scale. In fact, it could >> >> be argued that it is natural for objects on the same surface to be reported in >> >> the coordinate system of the surface. >> > >> > it may be natural from a human perspective, but the computer doesn't care >> > about it. >> >> >> The fact that we discuss a computer protocol suggests the computer does care. >> ;-) The question is whether we need to be able to support different scales for >> different tools types. I argue that among the three things value range, physical >> range and SN ratio, the one most naturally seen as an attribute of a tool is the >> SN ratio. >> >> > And given that most input device interpretation is done in >> > software, the scale used doesn't matter as long as it's correct. >> >> > in the UI, even with different ranges for different tools, top-left should >> > refer to whatever coordinate that is. >> > >> > in other words, if the pure numbers matter in the UI, we've done something >> > wrong. >> >> > >> > what benefit do we get from reporting tools on the same scale if the HW >> > doesn't do so? >> >> >> The question is, given a set of tool types reported via MT events, what >> additional information is needed. Having tools share the same ABS axes, I would >> like to see that as a good thing. So what is missing from that picture? > > I think that with devices that share the same working surface with > different tools (touch/pen) it would make sense to normalize the data in > the kernel. If for no other reason but to simplify the protocol. > > In cases when tools are way to different driver writer might opt to > export the device as 2 separate logical input devices (and then take > care of coordinating them in userspace), in other cases treat them as > one input device and scale to the same range. > > I tend to think [at the moment] that the latter would be most sensible > option for Bamboo... > What is the purpose of scaling the inouts to the same range? Is that a workaround for na issue with current protocol design or does anybody think that it makes some sort of sense? I am asking because to me it does not make any sense to scale the values. Devices that have a touch layer superimposed over LCD screen have to be calibrated for the system to know which ABS_X/Y corresponds to which pixel. I suspect that Bamboo P&T may have similar issue with its two superimposed input devices. Alos somebody suggested that the input area usable for pen might be smaller than area usable for touch. This means we are most likely talking apples and oranges, even if we scale them to be the same size. 
So I think it is most sensible to leave the scales as reported by the hardware, because that gives a hint to the application which receives the event that the inputs are in fact different. They will still be different when the events are scaled; it will just be harder to see from the client's point of view. Thanks Michal ^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-11 8:06 ` Michal Suchanek @ 2010-11-11 8:26 ` Dmitry Torokhov 2010-11-11 9:35 ` Michal Suchanek 2010-11-11 19:01 ` Ping Cheng 0 siblings, 2 replies; 44+ messages in thread From: Dmitry Torokhov @ 2010-11-11 8:26 UTC (permalink / raw) To: Michal Suchanek Cc: Henrik Rydberg, X.Org Devel List, Chris Bagwell, linux-input On Thu, Nov 11, 2010 at 09:06:14AM +0100, Michal Suchanek wrote: > On 11 November 2010 02:22, Dmitry Torokhov <dmitry.torokhov@gmail.com> wrote: > > On Thu, Nov 11, 2010 at 01:48:44AM +0100, Henrik Rydberg wrote: > >> On 11/11/2010 12:53 AM, Peter Hutterer wrote: > >> > >> > On Wed, Nov 10, 2010 at 11:00:05AM +0100, Henrik Rydberg wrote: > >> >> A comment on pixels and resolution: > >> >> > >> >> A pen and a thumb may have different resolution (signal-to-noise ratio), but > >> >> there is no reason they cannot be reported on the same scale. In fact, it could > >> >> be argued that it is natural for objects on the same surface to be reported in > >> >> the coordinate system of the surface. > >> > > >> > it may be natural from a human perspective, but the computer doesn't care > >> > about it. > >> > >> > >> The fact that we discuss a computer protocol suggests the computer does care. > >> ;-) The question is whether we need to be able to support different scales for > >> different tools types. I argue that among the three things value range, physical > >> range and SN ratio, the one most naturally seen as an attribute of a tool is the > >> SN ratio. > >> > >> > And given that most input device interpretation is done in > >> > software, the scale used doesn't matter as long as it's correct. > >> > >> > in the UI, even with different ranges for different tools, top-left should > >> > refer to whatever coordinate that is. > >> > > >> > in other words, if the pure numbers matter in the UI, we've done something > >> > wrong. > >> > >> > > >> > what benefit do we get from reporting tools on the same scale if the HW > >> > doesn't do so? > >> > >> > >> The question is, given a set of tool types reported via MT events, what > >> additional information is needed. Having tools share the same ABS axes, I would > >> like to see that as a good thing. So what is missing from that picture? > > > > I think that with devices that share the same working surface with > > different tools (touch/pen) it would make sense to normalize the data in > > the kernel. If for no other reason but to simplify the protocol. > > > > In cases when tools are way to different driver writer might opt to > > export the device as 2 separate logical input devices (and then take > > care of coordinating them in userspace), in other cases treat them as > > one input device and scale to the same range. > > > > I tend to think [at the moment] that the latter would be most sensible > > option for Bamboo... > > > > What is the purpose of scaling the inouts to the same range? > > Is that a workaround for na issue with current protocol design or does > anybody think that it makes some sort of sense? > I do not believe that current protocol design has an issue. MT protocol is really for representing devices reporting multiple touches on the same working surface which means that size stays the same for all touches. > I am asking because to me it does not make any sense to scale the values. > > Devices that have a touch layer superimposed over LCD screen have to > be calibrated for the system to know which ABS_X/Y corresponds to > which pixel. 
I suspect that Bamboo P&T may have similar issue with its > two superimposed input devices. Alos somebody suggested that the input > area usable for pen might be smaller than area usable for touch. > > This means we are most likely talking apples and oranges, even if we > scale them to be the same size. > No, I do not believe this is true. Consider touchscreen that you can touch with either stylus or finger, with stylus having better resolution suited for drawing/writing and finger just used for selecting UI objects. Obviously they work with the same number of pixels even though ranges reported by the hardware might differ. > So I think it is most sensible to leave the scales are reported by the > hardware because that gives a hint to the application which receives > the event that the inputs are in fact different. They will still be > when the events are scaled, it will be just harder to see from the > client point of view. It really depends on the device in question and we have several options: 1. Single multitouch device - unified working surface with the same characteristics, kernel scales properly. 2. 2+ logical devices - may have different size/resolutions, userspace will have to do arbitration if they are in the same physical package. With Bamboo I think option 1 makes more sense and is easier on everyone. -- Dmitry ^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-11 8:26 ` Dmitry Torokhov @ 2010-11-11 9:35 ` Michal Suchanek 2010-11-11 19:01 ` Ping Cheng 1 sibling, 0 replies; 44+ messages in thread From: Michal Suchanek @ 2010-11-11 9:35 UTC (permalink / raw) To: Dmitry Torokhov Cc: X.Org Devel List, Chris Bagwell, Henrik Rydberg, linux-input On 11 November 2010 09:26, Dmitry Torokhov <dmitry.torokhov@gmail.com> wrote: > On Thu, Nov 11, 2010 at 09:06:14AM +0100, Michal Suchanek wrote: >> On 11 November 2010 02:22, Dmitry Torokhov <dmitry.torokhov@gmail.com> wrote: >> > On Thu, Nov 11, 2010 at 01:48:44AM +0100, Henrik Rydberg wrote: >> >> On 11/11/2010 12:53 AM, Peter Hutterer wrote: >> >> >> >> > On Wed, Nov 10, 2010 at 11:00:05AM +0100, Henrik Rydberg wrote: >> >> >> A comment on pixels and resolution: >> >> >> >> >> >> A pen and a thumb may have different resolution (signal-to-noise ratio), but >> >> >> there is no reason they cannot be reported on the same scale. In fact, it could >> >> >> be argued that it is natural for objects on the same surface to be reported in >> >> >> the coordinate system of the surface. >> >> > >> >> > it may be natural from a human perspective, but the computer doesn't care >> >> > about it. >> >> >> >> >> >> The fact that we discuss a computer protocol suggests the computer does care. >> >> ;-) The question is whether we need to be able to support different scales for >> >> different tools types. I argue that among the three things value range, physical >> >> range and SN ratio, the one most naturally seen as an attribute of a tool is the >> >> SN ratio. >> >> >> >> > And given that most input device interpretation is done in >> >> > software, the scale used doesn't matter as long as it's correct. >> >> >> >> > in the UI, even with different ranges for different tools, top-left should >> >> > refer to whatever coordinate that is. >> >> > >> >> > in other words, if the pure numbers matter in the UI, we've done something >> >> > wrong. >> >> >> >> > >> >> > what benefit do we get from reporting tools on the same scale if the HW >> >> > doesn't do so? >> >> >> >> >> >> The question is, given a set of tool types reported via MT events, what >> >> additional information is needed. Having tools share the same ABS axes, I would >> >> like to see that as a good thing. So what is missing from that picture? >> > >> > I think that with devices that share the same working surface with >> > different tools (touch/pen) it would make sense to normalize the data in >> > the kernel. If for no other reason but to simplify the protocol. >> > >> > In cases when tools are way to different driver writer might opt to >> > export the device as 2 separate logical input devices (and then take >> > care of coordinating them in userspace), in other cases treat them as >> > one input device and scale to the same range. >> > >> > I tend to think [at the moment] that the latter would be most sensible >> > option for Bamboo... >> > >> >> What is the purpose of scaling the inouts to the same range? >> >> Is that a workaround for na issue with current protocol design or does >> anybody think that it makes some sort of sense? >> > > I do not believe that current protocol design has an issue. MT protocol > is really for representing devices reporting multiple touches on the same > working surface which means that size stays the same for all touches. > >> I am asking because to me it does not make any sense to scale the values. 
>> >> Devices that have a touch layer superimposed over LCD screen have to >> be calibrated for the system to know which ABS_X/Y corresponds to >> which pixel. I suspect that Bamboo P&T may have similar issue with its >> two superimposed input devices. Alos somebody suggested that the input >> area usable for pen might be smaller than area usable for touch. >> >> This means we are most likely talking apples and oranges, even if we >> scale them to be the same size. >> > > No, I do not believe this is true. Consider touchscreen that you can > touch with either stylus or finger, with stylus having better > resolution suited for drawing/writing and finger just used for selecting > UI objects. Obviously they work with the same number of pixels even > though ranges reported by the hardware might differ. Still, you have to perform some calibration to match the pen and finger inputs to screen objects, and, when they are implemented as different sensors, to match the sensors to one another. This calibration needs to be done separately for each input sensor if the resulting data is to be precise within the limits of the device resolution. Where would this calibration (which includes scaling) happen? In the kernel? Does that mean that only the administrator can calibrate the device? Or do we do one scaling in the kernel and then rescale the data in X or another client again? Thanks Michal ^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-11 8:26 ` Dmitry Torokhov 2010-11-11 9:35 ` Michal Suchanek @ 2010-11-11 19:01 ` Ping Cheng 2010-11-11 19:24 ` Dmitry Torokhov 1 sibling, 1 reply; 44+ messages in thread From: Ping Cheng @ 2010-11-11 19:01 UTC (permalink / raw) To: Dmitry Torokhov Cc: Michal Suchanek, Henrik Rydberg, X.Org Devel List, Chris Bagwell, linux-input On Thu, Nov 11, 2010 at 12:26 AM, Dmitry Torokhov <dmitry.torokhov@gmail.com> wrote: > > I do not believe that current protocol design has an issue. MT protocol > is really for representing devices reporting multiple touches on the same > working surface which means that size stays the same for all touches. When we say multiple touches, do we mean multiple touches of the same type or different types? If we mean different types (pen and finger), the size could be different due to the differences in technology. For Bamboo, the size is different for pen and touch. >> I am asking because to me it does not make any sense to scale the values. >> >> Devices that have a touch layer superimposed over LCD screen have to >> be calibrated for the system to know which ABS_X/Y corresponds to >> which pixel. I suspect that Bamboo P&T may have similar issue with its >> two superimposed input devices. Alos somebody suggested that the input >> area usable for pen might be smaller than area usable for touch. >> >> This means we are most likely talking apples and oranges, even if we >> scale them to be the same size. >> > > No, I do not believe this is true. Consider touchscreen that you can > touch with either stylus or finger, with stylus having better > resolution suited for drawing/writing and finger just used for selecting > UI objects. Obviously they work with the same number of pixels even > though ranges reported by the hardware might differ. > >> So I think it is most sensible to leave the scales are reported by the >> hardware because that gives a hint to the application which receives >> the event that the inputs are in fact different. They will still be >> when the events are scaled, it will be just harder to see from the >> client point of view. > > It really depends on the device in question and we have several options: > > 1. Single multitouch device - unified working surface with the same > characteristics, kernel scales properly. > > 2. 2+ logical devices - may have different size/resolutions, userspace > will have to do arbitration if they are in the same physical package. > > With Bamboo I think option 1 makes more sense and is easier on everyone. I agree with you on "different size/resolutions" and "unified working surface" for the two opitons above. And I also agree with you that we should leave the arbitration to the userland for MT events for both opitons. But, I think ABS_X/Y arbitration should be considered in the kernel to reduce the overhead in userland. Ping -- To unsubscribe from this list: send the line "unsubscribe linux-input" in the body of a message to majordomo@vger.kernel.org More majordomo info at http://vger.kernel.org/majordomo-info.html ^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-11 19:01 ` Ping Cheng @ 2010-11-11 19:24 ` Dmitry Torokhov 2010-11-11 19:41 ` Henrik Rydberg 0 siblings, 1 reply; 44+ messages in thread From: Dmitry Torokhov @ 2010-11-11 19:24 UTC (permalink / raw) To: Ping Cheng Cc: Michal Suchanek, Henrik Rydberg, X.Org Devel List, Chris Bagwell, linux-input On Thu, Nov 11, 2010 at 11:01:11AM -0800, Ping Cheng wrote: > On Thu, Nov 11, 2010 at 12:26 AM, Dmitry Torokhov > <dmitry.torokhov@gmail.com> wrote: > > > > I do not believe that current protocol design has an issue. MT protocol > > is really for representing devices reporting multiple touches on the same > > working surface which means that size stays the same for all touches. > > When we say multiple touches, do we mean multiple touches of the same > type or different types? If we mean different types (pen and finger), > the size could be different due to the differences in technology. For > Bamboo, the size is different for pen and touch. > I want to say that generally for same tool, when it makes sense we can scale so that different tools could pretend to have the same range. If there are really 2 vastly different devices superimposed together then maybe the best option is to treat them as 2 separate input devices. > >> I am asking because to me it does not make any sense to scale the values. > >> > >> Devices that have a touch layer superimposed over LCD screen have to > >> be calibrated for the system to know which ABS_X/Y corresponds to > >> which pixel. I suspect that Bamboo P&T may have similar issue with its > >> two superimposed input devices. Alos somebody suggested that the input > >> area usable for pen might be smaller than area usable for touch. > >> > >> This means we are most likely talking apples and oranges, even if we > >> scale them to be the same size. > >> > > > > No, I do not believe this is true. Consider touchscreen that you can > > touch with either stylus or finger, with stylus having better > > resolution suited for drawing/writing and finger just used for selecting > > UI objects. Obviously they work with the same number of pixels even > > though ranges reported by the hardware might differ. > > > >> So I think it is most sensible to leave the scales are reported by the > >> hardware because that gives a hint to the application which receives > >> the event that the inputs are in fact different. They will still be > >> when the events are scaled, it will be just harder to see from the > >> client point of view. > > > > It really depends on the device in question and we have several options: > > > > 1. Single multitouch device - unified working surface with the same > > characteristics, kernel scales properly. > > > > 2. 2+ logical devices - may have different size/resolutions, userspace > > will have to do arbitration if they are in the same physical package. > > > > With Bamboo I think option 1 makes more sense and is easier on everyone. > > I agree with you on "different size/resolutions" and "unified working > surface" for the two opitons above. And I also agree with you that we > should leave the arbitration to the userland for MT events for both > opitons. > > But, I think ABS_X/Y arbitration should be considered in the kernel to > reduce the overhead in userland. With option 1 you have natural ST arbitration - the first touch gets to report (ABS_X, ABS_Y), the rest will have to report ABS_MT_* till the first touch is lifted, right? Thanks. 
-- Dmitry -- To unsubscribe from this list: send the line "unsubscribe linux-input" in the body of a message to majordomo@vger.kernel.org More majordomo info at http://vger.kernel.org/majordomo-info.html ^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-11 19:24 ` Dmitry Torokhov @ 2010-11-11 19:41 ` Henrik Rydberg 2010-11-11 19:55 ` Dmitry Torokhov 2010-11-11 21:25 ` Ping Cheng 0 siblings, 2 replies; 44+ messages in thread From: Henrik Rydberg @ 2010-11-11 19:41 UTC (permalink / raw) To: Dmitry Torokhov Cc: Ping Cheng, Michal Suchanek, X.Org Devel List, Chris Bagwell, linux-input >> >> But, I think ABS_X/Y arbitration should be considered in the kernel to >> reduce the overhead in userland. > > With option 1 you have natural ST arbitration - the first touch gets to > report (ABS_X, ABS_Y), the rest will have to report ABS_MT_* till the > first touch is lifted, right? Right. Perhaps this is a good time to discuss some additional helper functions Chris and myself are playing with right now. /** * input_mt_report_state() - report contact state * @dev: input device with allocated MT slots * @active: true if contact is active, false otherwise * * Reports an active touch via ABS_MT_TRACKING_ID. If active is * true and the slot is currently inactive, a new tracking id is * assigned to the slot. */ void input_mt_report_state(struct input_dev *dev, bool active); Since all tracking-capable drivers we have seen so far have a contact id with the same semantics as the slot id, it makes sense to have the above function to remove the need for individual drivers to assign tracking ids. With the input core (or input-mt.c) controlling the tracking id, it is easy to add more goodness, such as /** * input_mt_report_pointer_emulation() - common pointer emulation * @dev: input device with allocated MT slots * * Performs legacy pointer emulation via BTN_TOUCH, etc. */ void input_mt_report_pointer_emulation(struct input_dev *dev); Since the input device knows the MT state and what buttons and abs axes are in use, it can provide the right set of data to emit to emulate pointer and touch events. The track-oldest-contact logic can be readily implemented. >From what I can see, this would remove two complications from each MT slots driver, removing both code and headaches. What do you think? Henrik ^ permalink raw reply [flat|nested] 44+ messages in thread
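For illustration, a rough sketch of how a slots driver's report path might use these helpers, with the signatures exactly as proposed above (not from any merged API). input_mt_slot() is the existing slot selector; struct bamboo_contact is a made-up stand-in for whatever per-contact state a driver keeps.

    struct bamboo_contact {
    	bool active;
    	int x, y;
    };

    static void report_contacts(struct input_dev *dev,
    				const struct bamboo_contact *c, int n)
    {
    	int i;

    	for (i = 0; i < n; i++) {
    		input_mt_slot(dev, i);		/* select slot i */
    		input_mt_report_state(dev, c[i].active);
    		if (c[i].active) {
    			input_report_abs(dev, ABS_MT_POSITION_X, c[i].x);
    			input_report_abs(dev, ABS_MT_POSITION_Y, c[i].y);
    		}
    	}
    	input_mt_report_pointer_emulation(dev);	/* BTN_TOUCH etc. */
    	input_sync(dev);
    }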
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-11 19:41 ` Henrik Rydberg @ 2010-11-11 19:55 ` Dmitry Torokhov 2010-11-11 21:25 ` Ping Cheng 1 sibling, 0 replies; 44+ messages in thread From: Dmitry Torokhov @ 2010-11-11 19:55 UTC (permalink / raw) To: Henrik Rydberg Cc: Ping Cheng, Michal Suchanek, X.Org Devel List, Chris Bagwell, linux-input On Thu, Nov 11, 2010 at 08:41:25PM +0100, Henrik Rydberg wrote: > >> > > >> But, I think ABS_X/Y arbitration should be considered in the kernel to > >> reduce the overhead in userland. > > > > With option 1 you have natural ST arbitration - the first touch gets to > > report (ABS_X, ABS_Y), the rest will have to report ABS_MT_* till the > > first touch is lifted, right? > > > Right. Perhaps this is a good time to discuss some additional helper functions > Chris and myself are playing with right now. > > /** > * input_mt_report_state() - report contact state > * @dev: input device with allocated MT slots > * @active: true if contact is active, false otherwise > * > * Reports an active touch via ABS_MT_TRACKING_ID. If active is > * true and the slot is currently inactive, a new tracking id is > * assigned to the slot. > */ > void input_mt_report_state(struct input_dev *dev, bool active); > > Since all tracking-capable drivers we have seen so far have a contact id with > the same semantics as the slot id, it makes sense to have the above function to > remove the need for individual drivers to assign tracking ids. > > With the input core (or input-mt.c) controlling the tracking id, it is easy to > add more goodness, such as > > /** > * input_mt_report_pointer_emulation() - common pointer emulation > * @dev: input device with allocated MT slots > * > * Performs legacy pointer emulation via BTN_TOUCH, etc. > */ > void input_mt_report_pointer_emulation(struct input_dev *dev); > > Since the input device knows the MT state and what buttons and abs axes are in > use, it can provide the right set of data to emit to emulate pointer and touch > events. The track-oldest-contact logic can be readily implemented. > > From what I can see, this would remove two complications from each MT slots > driver, removing both code and headaches. > > What do you think? > input-mt.c seems like a good idea for keeping a common library of goodies for MT drivers. -- Dmitry ^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-11 19:41 ` Henrik Rydberg 2010-11-11 19:55 ` Dmitry Torokhov @ 2010-11-11 21:25 ` Ping Cheng 2010-11-11 21:38 ` Henrik Rydberg 1 sibling, 1 reply; 44+ messages in thread From: Ping Cheng @ 2010-11-11 21:25 UTC (permalink / raw) To: Henrik Rydberg Cc: Dmitry Torokhov, Michal Suchanek, X.Org Devel List, Chris Bagwell, linux-input On Thu, Nov 11, 2010 at 11:41 AM, Henrik Rydberg <rydberg@euromail.se> wrote: > > Right. Perhaps this is a good time to discuss some additional helper functions > Chris and myself are playing with right now. > > /** > * input_mt_report_state() - report contact state > * @dev: input device with allocated MT slots > * @active: true if contact is active, false otherwise > * > * Reports an active touch via ABS_MT_TRACKING_ID. If active is > * true and the slot is currently inactive, a new tracking id is > * assigned to the slot. > */ > void input_mt_report_state(struct input_dev *dev, bool active); > > Since all tracking-capable drivers we have seen so far have a contact id with > the same semantics as the slot id, it makes sense to have the above function to > remove the need for individual drivers to assign tracking ids. > > With the input core (or input-mt.c) controlling the tracking id, it is easy to > add more goodness, such as I agree that centralizing the common code is a good way to go. > /** > * input_mt_report_pointer_emulation() - common pointer emulation > * @dev: input device with allocated MT slots > * > * Performs legacy pointer emulation via BTN_TOUCH, etc. > */ > void input_mt_report_pointer_emulation(struct input_dev *dev); > > Since the input device knows the MT state and what buttons and abs axes are in > use, it can provide the right set of data to emit to emulate pointer and touch > events. The track-oldest-contact logic can be readily implemented. Are we going to do the touch and pen arbitration in input-mt.c? I need to understand this to make my patches. Sounds a bit complicated if we do the arbitration here, considering the differences between pen and touch. If not, we will need to tell input_mt_report_pointer_emulation if another tool (such as pen) has already sent pointer events. That is, input_mt_report_pointer_emulation would only be called when needed. Ping -- To unsubscribe from this list: send the line "unsubscribe linux-input" in the body of a message to majordomo@vger.kernel.org More majordomo info at http://vger.kernel.org/majordomo-info.html ^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-11 21:25 ` Ping Cheng @ 2010-11-11 21:38 ` Henrik Rydberg 2010-11-11 22:10 ` Ping Cheng 2010-11-14 20:40 ` Ping Cheng 0 siblings, 2 replies; 44+ messages in thread From: Henrik Rydberg @ 2010-11-11 21:38 UTC (permalink / raw) To: Ping Cheng Cc: Dmitry Torokhov, Michal Suchanek, X.Org Devel List, Chris Bagwell, linux-input > > Are we going to do the touch and pen arbitration in input-mt.c? I need > to understand this to make my patches. If the pen is implemented as just another contact, it will also be treated as such, and arbitration becomes the same thing as single-pointer emulation. Cheers, Henrik ^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-11 21:38 ` Henrik Rydberg @ 2010-11-11 22:10 ` Ping Cheng 2010-11-11 23:21 ` Michal Suchanek 2010-11-12 8:21 ` Henrik Rydberg 1 sibling, 2 replies; 44+ messages in thread From: Ping Cheng @ 2010-11-11 22:10 UTC (permalink / raw) To: Henrik Rydberg Cc: Dmitry Torokhov, Michal Suchanek, X.Org Devel List, Chris Bagwell, linux-input On Thu, Nov 11, 2010 at 1:38 PM, Henrik Rydberg <rydberg@euromail.se> wrote: >> >> Are we going to do the touch and pen arbitration in input-mt.c? I need >> to understand this to make my patches. > > If the pen is implemented as just another contact, it will also be treated as > such, and arbitration becomes the same thing as single-pointer emulation. As we discussed earlier, pen is on a separate logical port for USB devices. But that should not be an issue in the kernel driver, i.e., input-mt.c. I'd love to wait for the implementation of input-mt.c. Do you have a rough timeline for it? If we want to merge pen, and other tool types in the future, into the same MT slot, a way to tell the userland of the different size and resolution of the tool is a must. Or maybe you have other suggestions on "pen is implemented as just another contact"? Ping ^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-11 22:10 ` Ping Cheng @ 2010-11-11 23:21 ` Michal Suchanek 2010-11-12 8:21 ` Henrik Rydberg 1 sibling, 0 replies; 44+ messages in thread From: Michal Suchanek @ 2010-11-11 23:21 UTC (permalink / raw) To: Ping Cheng Cc: Henrik Rydberg, Dmitry Torokhov, X.Org Devel List, Chris Bagwell, linux-input On 11 November 2010 23:10, Ping Cheng <pinglinux@gmail.com> wrote: > On Thu, Nov 11, 2010 at 1:38 PM, Henrik Rydberg <rydberg@euromail.se> wrote: >>> >>> Are we going to do the tocuh and pen aribtration in input-mt.c? I need >>> to understand this to make my patches. >> >> If the pen is implemented as just another contact, it will also be treated as >> such, and arbitration becomes the same thing as single-pointer emulation. > > As we discussed earlier, pen is on a separate logical port for USB > devices. But that should not be an issue in the kernel driver, i.e., > input-mt.c. > > I'd love to wait for the implementation of input-mt.c. Do you have a > rough timeline for it? > > If we want to merge pen, and other tool types in the future, into the > same MT slot, a way to tell the userland of the different size and > resolution of the tool is a must. Or maybe you have other suggestions > on "pen is implemented as just another contact"? > I would suggest to report the data in the resolution and scale they are obtained. Most attempts to 'cook' data in the kernel eventually result in issues. In no use I know of the scale really matters. Either the data is used for relative motion and then different tools need different acceleration formulas depending on their properties like fuzz or it is used as absolute pointer and then it is scaled to the desktop or widow size regardless of the original range. When used for gestures again fuzz is more relevant than range of possible values received. Thanks Michal ^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-11 22:10 ` Ping Cheng 2010-11-11 23:21 ` Michal Suchanek @ 2010-11-12 8:21 ` Henrik Rydberg 1 sibling, 0 replies; 44+ messages in thread From: Henrik Rydberg @ 2010-11-12 8:21 UTC (permalink / raw) To: Ping Cheng Cc: Dmitry Torokhov, Michal Suchanek, X.Org Devel List, Chris Bagwell, linux-input > > I'd love to wait for the implementation of input-mt.c. Do you have a > rough timeline for it? Well... we have patches already, so we are talking weeks. Cheers, Henrik ^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-11 21:38 ` Henrik Rydberg 2010-11-11 22:10 ` Ping Cheng @ 2010-11-14 20:40 ` Ping Cheng 1 sibling, 0 replies; 44+ messages in thread From: Ping Cheng @ 2010-11-14 20:40 UTC (permalink / raw) To: Henrik Rydberg, Dmitry Torokhov; +Cc: linux-input On Thu, Nov 11, 2010 at 1:38 PM, Henrik Rydberg <rydberg@euromail.se> wrote: > >> Are we going to do the touch and pen arbitration in input-mt.c? I need >> to understand this to make my patches. > > If the pen is implemented as just another contact, it will also be treated as > such, and arbitration becomes the same thing as single-pointer emulation. New questions occurred to me while thinking about input-mt.c. How are we going to pass MT_TOOL types and resolution to the userland? We need to pass MT_TOOL types to the userland since there would be different tools in the same slot array. Something like: input_report_key(input, MT_TOOL_PEN, prox); is needed. To do so, we need to enable them by a call to __set_bit(MT_TOOL_.... first. But, __set_bit wasn't meant for MT_TOOL_ since 0 (MT_TOOL_FINGER) and 1 (MT_TOOL_PEN) are used for KEY_RESERVED and KEY_ESC. One way to make this work is to increase MT_TOOL_FINGER from 0 to something like 0x2ff (we need to increase KEY_MAX as well). Luckily MT_TOOL_ is not in any of the existing HID drivers. Another solution could be to pass the tool type as ABS values since we have ABS_MT_TOOL_TYPE defined already: input_report_abs(dev, ABS_MT_TOOL_TYPE, MT_TOOL_FINGER); ......... input_set_abs_params(dev, ABS_MT_TOOL_TYPE, MT_TOOL_FINGER, MT_TOOL_PEN ....) . We need to add more MT tool types, such as MT_TOOL_AIRBRUSH...., for styli. Not sure if we want to go that far or not. I have thought about the resolution issue for a while. I didn't want to touch it since it affects all HID drivers. Resolution has been added to absinfo since kernel 2.6.31. But no kernel driver has passed resolution to the userland yet. We (linuxwacom project) still keep a resolution table in the X driver. Passing resolution to the userland from the kernel can get rid of those tables and avoid retrieving that information from the HID descriptor. One simple way to support resolution is to add "int res" to input_set_abs_params(...). This means an interface change. All HID drivers that call input_set_abs_params() have to be updated. We could introduce a new input_set_abs_something so that only those drivers that want to pass the resolution switch to the new routine. Maybe you guys have other suggestions? Ping ^ permalink raw reply [flat|nested] 44+ messages in thread
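As a sketch of the ABS_MT_TOOL_TYPE option described above: the axis and the MT_TOOL_* constants are already in the headers, but the slot layout, the helper names and the finger/pen split are illustrative assumptions only, not an agreed interface.

    /* setup path: advertise the tool-type axis, range finger..pen */
    static void setup_tool_type(struct input_dev *dev)
    {
    	input_set_abs_params(dev, ABS_MT_TOOL_TYPE,
    			     MT_TOOL_FINGER, MT_TOOL_PEN, 0, 0);
    }

    /* report path: e.g. fingers in slots 0-1, the pen in slot 2 */
    static void report_tool(struct input_dev *dev, int slot,
    			    bool is_pen, int x, int y)
    {
    	input_mt_slot(dev, slot);
    	input_report_abs(dev, ABS_MT_TOOL_TYPE,
    			 is_pen ? MT_TOOL_PEN : MT_TOOL_FINGER);
    	input_report_abs(dev, ABS_MT_POSITION_X, x);
    	input_report_abs(dev, ABS_MT_POSITION_Y, y);
    }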
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-09 6:59 ` Dmitry Torokhov 2010-11-09 16:28 ` Chris Bagwell @ 2010-11-09 18:10 ` Ping Cheng 2010-11-09 20:36 ` Michal Suchanek 1 sibling, 1 reply; 44+ messages in thread From: Ping Cheng @ 2010-11-09 18:10 UTC (permalink / raw) To: Dmitry Torokhov Cc: Peter Hutterer, X.Org Devel List, Daniel Stone, linux-input On Mon, Nov 8, 2010 at 10:59 PM, Dmitry Torokhov <dmitry.torokhov@gmail.com> wrote: > On Tue, Nov 09, 2010 at 01:31:49PM +1000, Peter Hutterer wrote: >> >> That said, it also goes counter the whole multi-touch approach - allowing >> more than one device on a single physical device. > > So maybe we should teach wacom to handle all devices as a single input device even > in cases when they use several USB interfaces? Hehe, we got you talking again. I need to understand if the lessons are for hardware/firmware or driver engineers ;). > We should be able to detect related interfaces by examining intf->intf_assoc (I hope) and > using usb_driver_claim_interface() to claim them. I guess you mean we do this in the kernel driver. Providing more than one type of HID device from one physical device is not Wacom specific. A standalone keyboard with a touchpad has the same feature/issue. When I plug a USB keyboard (which has a touchpad) into a Linux system, I see two input ports (/dev/input/input9 and /dev/input/event10) displayed. So, unless we only show one logical port for the same device (no matter how many types of devices it supports), userland clients would not be able to get the data from the same port. That is what we get from the Linux kernel, right? The HID specification does not exclude one physical device supporting more than one type of HID device. So, it is a question of how we represent the device to the userland on Linux (Windows and Mac have their own ways). To me, both one and two logical ports have pros and cons. Sorry to talk before listening. I am all ears now ;). Ping ^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [RFC] Multi-Touch (MT) support - arbitration or not 2010-11-09 18:10 ` Ping Cheng @ 2010-11-09 20:36 ` Michal Suchanek 0 siblings, 0 replies; 44+ messages in thread From: Michal Suchanek @ 2010-11-09 20:36 UTC (permalink / raw) To: Ping Cheng; +Cc: Dmitry Torokhov, X.Org Devel List, Daniel Stone, linux-input On 9 November 2010 19:10, Ping Cheng <pinglinux@gmail.com> wrote: > On Mon, Nov 8, 2010 at 10:59 PM, Dmitry Torokhov > <dmitry.torokhov@gmail.com> wrote: >> On Tue, Nov 09, 2010 at 01:31:49PM +1000, Peter Hutterer wrote: >>> >>> That said, it also goes counter the whole multi-touch approach - allowing >>> more than one device on a single physical device. >> >> So maybe we should teach wacom to handle all devices as a single input device even >> in cases when they use several USB interfaces? > > Hehe, we got you talking again. I need to understand if the lessons > are for hardware/firmware or driver engineers ;). > >> We should be able to detect related interfaces by examining intf->intf_assoc (I hope) and >> using usb_driver_claim_interface() to claim them. > > I guess you mean we do this in the kernel driver. > > Providing more than one type of HID devices from one physical device > is not Wacom specific. Standalone keyboard with touchpad has the same > feature/issue. When I plug an USB keyboard (whcih has a touchpad) on a > Linux system, I see two input ports (/dev/input/input9 and > /dev/input/event10) displayed. So, unless we only show one logical > port for the same device (no matter how many types of devices it > supports), userland clients would not be able to get the data from the > same port. That is what we get from the Linux kernel, right? The thing is that while the keyboard has separate devices for a touchpad and for keys these devices really are separate, logically from the user's point of view and physically. A keyboard with a trackpoint is not physically separate from the pointing device but it is still logically separate. It has only two axes on the single pointer device which are somewhat related but other than that you can have separate device for every button and key and lose no information whatsoever. This same applies to tablets with one tool and buttons. Tablets with multiple tools don't pose much problem either. The tools are related and should ideally be reported from the same device but they also behave the same and don't pose any difficulties when collected from multiple related devices, they have the same range of motion, resolution, etc. The situation with a P&T is special in that two largely disparate devices are superimposed physically so that using one hinders using the other to the point that measures have to be taken and some events filtered when both devices are in proximity to make the hardware usable. The implication of the situation is that if you want to make any sense of the input events you - absolutely have to know these two devices are related at some level - at a level where you know the devices are related you have to filter out some events - when using the devices at the same time for different purposes (button clicks and gestures vs moves) the difference in resolution and range does not pose a problem but when the user moves both pen and finger it might be challenging or even impossible to determine whether the touchdown points moved towards each other or away from each other due to differences in available motion range, resolution and hardware manufacture Thanks Michal ^ permalink raw reply [flat|nested] 44+ messages in thread