* Jack event API - decision needed
@ 2011-06-20 13:37 David Henningsson
2011-06-20 17:07 ` Mark Brown
2011-06-27 12:07 ` Mark Brown
0 siblings, 2 replies; 57+ messages in thread
From: David Henningsson @ 2011-06-20 13:37 UTC (permalink / raw)
To: ALSA Development Mailing List, Takashi Iwai, Kay Sievers,
Lennart Poettering, broonie
Hi,
I'm still new in the community, in the sense that I'm not sure how
decisions are made. But I could use the outcome of such a decision.
Background: I'm trying to pull together the missing pieces of jack
detection on both kernel / plumbing / application layers, so that when
you plug something in (headset, microphone etc), userspace is notified
and can take appropriate actions (e.g. routing decisions).
As part of that I wrote a udev patch a few days ago, which nobody
commented on in alsa-devel [1], but which was somewhat disliked by at
least Kay Sievers, who maintains udev [2] and would prefer that we
rewrite our input layer to do something else within ALSA.
So before I proceed further I'd like to know whether:
1) We're continuing the path with /dev/input devices
2a) We'll rewrite these devices to be read-only ALSA mixer controls
2b) We'll rewrite these devices to be something within ALSA, but not
exactly mixer controls.
For options 2a) and 2b) I guess the existing /dev/input thing should be
deprecated and/or removed. So part of the decision should maybe be based
on information about how widespread the usage of these devices currently
is...?
--
David Henningsson, Canonical Ltd.
http://launchpad.net/~diwic
[1]
http://mailman.alsa-project.org/pipermail/alsa-devel/2011-June/040916.html
[2] e.g. http://www.spinics.net/lists/hotplug/msg04949.html
^ permalink raw reply [flat|nested] 57+ messages in thread
* Re: Jack event API - decision needed
@ 2011-06-20 13:56 Mark Brown
2011-06-20 14:11 ` David Henningsson
2011-06-20 14:19 ` Kay Sievers
0 siblings, 2 replies; 57+ messages in thread
From: Mark Brown @ 2011-06-20 13:56 UTC (permalink / raw)
To: David Henningsson, ALSA Development Mailing List, Takashi Iwai,
Kay Sievers, Lennart Poettering
Sorry about the top posting, but as I wasn't involved in any of the discussions, am on a mobile device right now, and your mail isn't directly legible, it would be enormously helpful if you could summarize the issues you're trying to address, your proposed solution and the problems Kay had.
David Henningsson <david.henningsson@canonical.com> wrote:
>Hi,
>
>I'm still new in the community in that sense that I'm not sure how
>decisions are made. But I could use the outcome of such a decision.
>
>Background: I'm trying to pull together the missing pieces of jack
>detection on both kernel / plumbing / application layers, so that when
>you plug something in (headset, microphone etc), userspace is notified
>and can take appropriate actions (e g routing decisions).
>
>As part of that I wrote a udev patch a few days ago, which nobody
>commented on in alsa-devel [1], but was somewhat disliked by at least
>Kay Sievers who maintains udev [2], who preferred we would rewrite our
>input layer to do something else within ALSA.
>
>So before I proceed further I'd like to know if
>
>1) We're continuing the path with /dev/input devices
>
>2a) We'll rewrite these devices to be read-only ALSA mixer controls
>
>2b) We'll rewrite these devices to be something within ALSA, but not
>exactly mixer controls.
>
>For options 2a) and 2b) I guess the existing /dev/input thing should be
>deprecated and/or removed. So part of decision should maybe be based on
>information about how widespread the usage of these devices are
>currently...?
>
>--
>David Henningsson, Canonical Ltd.
>http://launchpad.net/~diwic
>
>[1]
>http://mailman.alsa-project.org/pipermail/alsa-devel/2011-June/040916.html
>
>[2] e g http://www.spinics.net/lists/hotplug/msg04949.html
* Re: Jack event API - decision needed
2011-06-20 13:56 Mark Brown
@ 2011-06-20 14:11 ` David Henningsson
2011-06-20 14:19 ` Kay Sievers
1 sibling, 0 replies; 57+ messages in thread
From: David Henningsson @ 2011-06-20 14:11 UTC (permalink / raw)
To: Mark Brown
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Lennart Poettering
On 2011-06-20 15:56, Mark Brown wrote:
> Sorry about the top posting, but as I wasn't involved in any of the discussions, am on a mobile device right now, and your mail isn't directly legible, it would be enormously helpful if you could summarize the issues you're trying to address, your proposed solution and the problems Kay had.
This particular issue and the udev patch were about letting the
currently logged-in user access the input devices, which are currently
only accessible by root.
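For context, in 2011-era udev the per-seat ACL mechanism was driven by rules tagging devices. A purely hypothetical sketch of what such a rule could look like follows; this is NOT the actual patch, and the real match keys and tag may well differ:

```
# HYPOTHETICAL sketch, not the actual patch: let the udev ACL helper
# grant the active seat's user access to a jack input device.
SUBSYSTEM=="input", KERNEL=="event*", ATTRS{name}=="*Headphone*", TAG+="udev-acl"
```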
I'm sure there will be quite a few more issues before we have something
that "just works" for the vast majority of people!
I'll leave it to Kay to explain what problems he had with accepting the
patch.
--
David Henningsson, Canonical Ltd.
http://launchpad.net/~diwic
* Re: Jack event API - decision needed
2011-06-20 13:56 Mark Brown
2011-06-20 14:11 ` David Henningsson
@ 2011-06-20 14:19 ` Kay Sievers
2011-06-20 15:35 ` Takashi Iwai
2011-06-20 16:47 ` Mark Brown
1 sibling, 2 replies; 57+ messages in thread
From: Kay Sievers @ 2011-06-20 14:19 UTC (permalink / raw)
To: Mark Brown
Cc: Takashi Iwai, ALSA Development Mailing List, Lennart Poettering,
David Henningsson
On Mon, Jun 20, 2011 at 15:56, Mark Brown
<broonie@opensource.wolfsonmicro.com> wrote:
> Sorry about the top posting, but as I wasn't involved in any of the discussions, am on a mobile device right now, and your mail isn't directly legible, it would be enormously helpful if you could summarize the issues you're trying to address, your proposed solution and the problems Kay had.
Domain-specific events should not be 'faked' as 'input' devices. No
non-input subsystem should do that if it can be avoided. That 'input'
supports nice and easy events to userspace should not be a reason to
misuse it for pretty much unrelated things.
There are patches to have the ALSA control device emit these ALSA
related events natively. That would be just better, simpler and more
correct than any additionally created input device.
If Takashi can make that possible in a reasonable time frame, we
should not even start handling the (currently not handled) input
devices in upstream projects like udev and PulseAudio, and focus right
away on the native ALSA control events.
If we can't have the native ALSA events anytime soon for some reason,
we might need to merge the input device support, but I would like to
avoid that.
Kay
* Re: Jack event API - decision needed
2011-06-20 14:19 ` Kay Sievers
@ 2011-06-20 15:35 ` Takashi Iwai
2011-06-20 16:52 ` Mark Brown
2011-06-20 18:24 ` David Henningsson
2011-06-20 16:47 ` Mark Brown
1 sibling, 2 replies; 57+ messages in thread
From: Takashi Iwai @ 2011-06-20 15:35 UTC (permalink / raw)
To: Kay Sievers
Cc: ALSA Development Mailing List, Mark Brown, David Henningsson,
Lennart Poettering
At Mon, 20 Jun 2011 16:19:51 +0200,
Kay Sievers wrote:
>
> On Mon, Jun 20, 2011 at 15:56, Mark Brown
> <broonie@opensource.wolfsonmicro.com> wrote:
> > Sorry about the top posting, but as I wasn't involved in any of the discussions, am on a mobile device right now, and your mail isn't directly legible, it would be enormously helpful if you could summarize the issues you're trying to address, your proposed solution and the problems Kay had.
>
> Domain-specific events should not be 'faked' as 'input' devices. No
> non-input subsystem should do that if it can be avoided. That 'input'
> supports nice and easy events to userspace should not be a reason to
> misuse it for pretty much unrelated things.
>
> There are patches to have the ALSA control device emit these ALSA
> related events natively. That would be just better, simpler and more
> correct than any additionally created input device.
>
> If Takashi can make that possible in a reasonable time frame, we
> should not even start handling the (currently not handled) input
> devices in upstream projects like udev and PulseAudio, and focus right
> away on the native ALSA control events.
>
> If we can't have the native ALSA events anytime soon for some reason,
> we might need to merge the input device support, but I would like to
> avoid that.
Well, the implementation would be relatively easy. There was already
a patch, so not too hard to revisit.
But, there are still some open questions. For example, what
information is mandatory and what information is merely preferred about
the pin. HD-audio provides the location, type, color, etc. We can
encode these into a single name string, but is that the preferred way?
Alternatively, the control element can provide the HD-audio pin config
via an extra TLV data, so that apps can refer to it if needed in
addition to the name string.
Also, we may consider some way to expose the correlation of the jack
control element and other mixer elements. Though, this sounds
optional to me for the time being.
These questions are basically requirements from the apps, so I'd like
to know the exact demands before starting the implementation.
thanks,
Takashi
* Re: Jack event API - decision needed
2011-06-20 14:19 ` Kay Sievers
2011-06-20 15:35 ` Takashi Iwai
@ 2011-06-20 16:47 ` Mark Brown
2011-06-20 16:59 ` Takashi Iwai
1 sibling, 1 reply; 57+ messages in thread
From: Mark Brown @ 2011-06-20 16:47 UTC (permalink / raw)
To: Kay Sievers
Cc: Takashi Iwai, ALSA Development Mailing List, Lennart Poettering,
David Henningsson
On Mon, Jun 20, 2011 at 04:19:51PM +0200, Kay Sievers wrote:
> Domain-specific events should not be 'faked' as 'input' devices. No
> non-input subsystem should do that if it can be avoided. That 'input'
> supports nice and easy events to userspace should not be a reason to
> misuse it for pretty much unrelated things.
OK, but since jacks aren't at all audio specific (the most obvious
additional thing that goes over them is video for cases like composite
out on mic and the modern digital standards like HDMI) that wouldn't
address the issue. There's also things like docking stations which can
present very much like big fancy jacks.
> There are patches to have the ALSA control device emit these ALSA
> related events natively. That would be just better, simpler and more
> correct than any additionally created input device.
That's not really terribly clever for the non-audio users; they'd have
to implement ALSA support.
> If we can't have the native ALSA events anytime soon for some reason,
> we might need to merge the input device support, but I would like to
> avoid that.
The input device usage has been present and in use in a basic form
since 2.6.18, and the ALSA integration of it since 2.6.27, so this isn't
terribly new.
It might be nice to have the information available via ALSA but it
doesn't seem reasonable to expect anything that might need the
information to have to go through ALSA.
* Re: Jack event API - decision needed
2011-06-20 15:35 ` Takashi Iwai
@ 2011-06-20 16:52 ` Mark Brown
2011-06-20 17:01 ` Takashi Iwai
2011-06-20 18:24 ` David Henningsson
1 sibling, 1 reply; 57+ messages in thread
From: Mark Brown @ 2011-06-20 16:52 UTC (permalink / raw)
To: Takashi Iwai
Cc: ALSA Development Mailing List, Kay Sievers, David Henningsson,
Lennart Poettering
On Mon, Jun 20, 2011 at 05:35:00PM +0200, Takashi Iwai wrote:
> Also, we may consider some way to expose the correlation of the jack
> control element and other mixer elements. Though, this sounds
> optional to me for the time being.
From that point of view it'd be much better to do this as part of
exposing the full routing map and positioning of the controls within the
map to applications - there has been some discussion about using the
media controller API to do that within the context of ASoC though not
much work on actually implementing it yet. This would solve a lot of
problems with figuring out how the card fits together that we have
currently.
* Re: Jack event API - decision needed
2011-06-20 16:47 ` Mark Brown
@ 2011-06-20 16:59 ` Takashi Iwai
2011-06-20 17:17 ` Mark Brown
0 siblings, 1 reply; 57+ messages in thread
From: Takashi Iwai @ 2011-06-20 16:59 UTC (permalink / raw)
To: Mark Brown
Cc: ALSA Development Mailing List, Kay Sievers, Lennart Poettering,
David Henningsson
At Mon, 20 Jun 2011 17:47:14 +0100,
Mark Brown wrote:
>
> On Mon, Jun 20, 2011 at 04:19:51PM +0200, Kay Sievers wrote:
>
> > Domain-specific events should not be 'faked' as 'input' devices. No
> > non-input subsystem should do that if it can be avoided. That 'input'
> > supports nice and easy events to userspace should not be a reason to
> > misuse it for pretty much unrelated things.
>
> OK, but since jacks aren't at all audio specific (the most obvious
> additional thing that goes over them is video for cases like composite
> out on mic and the modern digital standards like HDMI) that wouldn't
> address the issue. There's also things like docking stations which can
> present very much like big fancy jacks.
>
> > There are patches to have the ALSA control device emit these ALSA
> > related events natively. That would be just better, simpler and more
> > correct than any additionally created input device.
>
> That's not really terribly clever for the non-audio users, they'd have
> to implement ALSA support.
>
> > If we can't have the native ALSA events anytime soon for some reason,
> > we might need to merge the input device support, but I would like to
> > avoid that.
>
> The input device usage has been present and in use in a basic form since
> 2.6.18, the ALSA integration of it since 2.6.27 so this isn't terribly
> new.
>
> It might be nice to have the information available via ALSA but it
> doesn't seem reasonable to expect anything that might need the
> information to have to go through ALSA.
Note that the issue came up because David posted a patch for udev
to change the device permissions of these input-jack devices via ACL,
together with the normal sound devices. Then the question arose whether
udev really needs this for now.
If udev is being used by real users of such input devices, it'd be a
good justification. In the previous thread, I also gave some examples
where the input-jack device was used for HD-audio media-PC devices, but
it's not clear whether udev is used there. And I'm not sure whether
there are real users of embedded devices with the input-jack layer that
require udev's ACL handling.
OTOH, if this is mainly targeted for the future extension of
PulseAudio, the primary question would become different. Possibly the
path via ALSA control API might give more information than the current
implementation with the input-jack layer.
thanks,
Takashi
* Re: Jack event API - decision needed
2011-06-20 16:52 ` Mark Brown
@ 2011-06-20 17:01 ` Takashi Iwai
0 siblings, 0 replies; 57+ messages in thread
From: Takashi Iwai @ 2011-06-20 17:01 UTC (permalink / raw)
To: Mark Brown
Cc: ALSA Development Mailing List, Kay Sievers, David Henningsson,
Lennart Poettering
At Mon, 20 Jun 2011 17:52:37 +0100,
Mark Brown wrote:
>
> On Mon, Jun 20, 2011 at 05:35:00PM +0200, Takashi Iwai wrote:
>
> > Also, we may consider some way to expose the correlation of the jack
> > control element and other mixer elements. Though, this sounds
> > optional to me for the time being.
>
> From that point of view it'd be much better to do this as part of
> exposing the full routing map and positioning of the controls within the
> map to applications - there has been some discussion about using the
> media controller API to do that within the context of ASoC though not
> much work on actually implementing it yet. This would solve a lot of
> problems with figuring out how the card fits together that we have
> currently.
Yeah, that'd be another option. Though I don't know what PA would
actually need for jack detection in the current implementation...
Takashi
* Re: Jack event API - decision needed
2011-06-20 13:37 Jack event API - decision needed David Henningsson
@ 2011-06-20 17:07 ` Mark Brown
2011-06-20 17:12 ` Takashi Iwai
2011-06-20 18:53 ` David Henningsson
2011-06-27 12:07 ` Mark Brown
1 sibling, 2 replies; 57+ messages in thread
From: Mark Brown @ 2011-06-20 17:07 UTC (permalink / raw)
To: David Henningsson
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Lennart Poettering
On Mon, Jun 20, 2011 at 03:37:25PM +0200, David Henningsson wrote:
> I'm still new in the community in that sense that I'm not sure how
> decisions are made. But I could use the outcome of such a decision.
So, I'm still not 100% sure what the actual technical issue is here?
> As part of that I wrote a udev patch a few days ago, which nobody
> commented on in alsa-devel [1], but was somewhat disliked by at
Having dug out the mailing list archive I guess the fact that you just
posted it to the list and didn't CC anyone on the patch didn't help
here. Looking at the patch the main thing that jumps out at me without
any knowledge of the udev code is that the patch will end up classifying
video output jacks as audio even if they've no audio capability, which
is obviously not correct.
I still don't entirely understand the technical issue you're trying to
address here - this doesn't seem specific to audio. As far as I can
tell the issue is that some of the input devices on the system aren't
being made available to the console user, presumably because there are
some that shouldn't be made available to them, so the issue is that the
heuristics that udev uses to decide if an input device should be root
only aren't working correctly in at least this case. It feels like if
we understood why the heuristics are making a bad call here we might be
able to come up with a better solution.
> least Kay Sievers who maintains udev [2], who preferred we would
> rewrite our input layer to do something else within ALSA.
It seems like Kay's issues are based on a misunderstanding about what a
jack might be.
> For options 2a) and 2b) I guess the existing /dev/input thing should
> be deprecated and/or removed. So part of decision should maybe be
> based on information about how widespread the usage of these devices
> are currently...?
There's a reasonable amount of usage in the embedded space.
* Re: Jack event API - decision needed
2011-06-20 17:07 ` Mark Brown
@ 2011-06-20 17:12 ` Takashi Iwai
2011-06-20 17:31 ` Mark Brown
2011-06-20 18:53 ` David Henningsson
1 sibling, 1 reply; 57+ messages in thread
From: Takashi Iwai @ 2011-06-20 17:12 UTC (permalink / raw)
To: Mark Brown
Cc: ALSA Development Mailing List, Kay Sievers, Lennart Poettering,
David Henningsson
At Mon, 20 Jun 2011 18:07:40 +0100,
Mark Brown wrote:
>
> On Mon, Jun 20, 2011 at 03:37:25PM +0200, David Henningsson wrote:
>
> > I'm still new in the community in that sense that I'm not sure how
> > decisions are made. But I could use the outcome of such a decision.
>
> So, I'm still not 100% sure what the actual technical issue is here?
The device permissions. They are set inaccessible to the normal
desktop user by default.
...
> > For options 2a) and 2b) I guess the existing /dev/input thing should
> > be deprecated and/or removed. So part of decision should maybe be
> > based on information about how widespread the usage of these devices
> > are currently...?
>
> There's a reasonable amount of usage in the embedded space.
Do they use udev? And don't they need to tweak the device permissions
of these input-jack devices on the fly per user?
thanks,
Takashi
* Re: Jack event API - decision needed
2011-06-20 16:59 ` Takashi Iwai
@ 2011-06-20 17:17 ` Mark Brown
2011-06-20 17:38 ` Takashi Iwai
0 siblings, 1 reply; 57+ messages in thread
From: Mark Brown @ 2011-06-20 17:17 UTC (permalink / raw)
To: Takashi Iwai
Cc: ALSA Development Mailing List, Kay Sievers, Lennart Poettering,
David Henningsson
On Mon, Jun 20, 2011 at 06:59:46PM +0200, Takashi Iwai wrote:
> If udev is being used by real users of such input devices, it'd be a
> good justification. In the previous thread, I also gave some
> examples that the input-jack device was used for HD-audio media-PC
> devices, but it's not clear whether udev is used there. And I'm not
> sure whether there are real users of embedded devices with the
> input-jack layer that requires udev's ACL handling.
I don't think the embedded space cares desperately about the udev stuff
since most of the time the system management daemon which owns this
stuff is running as root anyway, PulseAudio or otherwise.
> OTOH, if this is mainly targeted for the future extension of
> PulseAudio, the primary question would become different. Possibly the
> path via ALSA control API might give more information than the current
> implementation with the input-jack layer.
I think they can both either give equivalent information or work in
concert; I don't see a need to pick between the two. Like I say, the
jacks aren't exclusively for audio use, so if we have to rely on the ALSA
APIs to get information about them it seems like we're messing up. For
things like working out where the jack fits into the audio routing map
we'd certainly want to have ALSA API integration but that doesn't mean
everything about the jack has to go that way.
* Re: Jack event API - decision needed
2011-06-20 17:12 ` Takashi Iwai
@ 2011-06-20 17:31 ` Mark Brown
2011-06-20 17:37 ` Takashi Iwai
0 siblings, 1 reply; 57+ messages in thread
From: Mark Brown @ 2011-06-20 17:31 UTC (permalink / raw)
To: Takashi Iwai
Cc: ALSA Development Mailing List, Kay Sievers, Lennart Poettering,
David Henningsson
On Mon, Jun 20, 2011 at 07:12:21PM +0200, Takashi Iwai wrote:
> Mark Brown wrote:
> > On Mon, Jun 20, 2011 at 03:37:25PM +0200, David Henningsson wrote:
> > > I'm still new in the community in that sense that I'm not sure how
> > > decisions are made. But I could use the outcome of such a decision.
> > So, I'm still not 100% sure what the actual technical issue is here?
> The device permissions. They are set inaccessible to the normal
> desktop user by default.
Right, I got that much but what I was trying to say in the rest of the
paragraph was that it wasn't immediately obvious to me that this was due
to needing to do something special for audio jacks. That's certainly a
symptom but what's not clear to me is why we think the current setup is
a good call for the generic input device that udev didn't have specific
magic for. We seem to have jumped straight into some very non-generic
solutions, skipping quite a few steps in understanding what the current
situation is.
> > There's a reasonable amount of usage in the embedded space.
> Do they use udev? And don't they need to tweak the device permissions
> of these input-jack devices on the fly per user?
No and no. The physical user owns the whole device and typically there
aren't user accounts in that sense on the system.
* Re: Jack event API - decision needed
2011-06-20 17:31 ` Mark Brown
@ 2011-06-20 17:37 ` Takashi Iwai
0 siblings, 0 replies; 57+ messages in thread
From: Takashi Iwai @ 2011-06-20 17:37 UTC (permalink / raw)
To: Mark Brown
Cc: ALSA Development Mailing List, Kay Sievers, Lennart Poettering,
David Henningsson
At Mon, 20 Jun 2011 18:31:01 +0100,
Mark Brown wrote:
>
> On Mon, Jun 20, 2011 at 07:12:21PM +0200, Takashi Iwai wrote:
> > Mark Brown wrote:
> > > On Mon, Jun 20, 2011 at 03:37:25PM +0200, David Henningsson wrote:
>
> > > > I'm still new in the community in that sense that I'm not sure how
> > > > decisions are made. But I could use the outcome of such a decision.
>
> > > So, I'm still not 100% sure what the actual technical issue is here?
>
> > The device permissions. They are set inaccessible to the normal
> > desktop user by default.
>
> Right, I got that much but what I was trying to say in the rest of the
> paragraph was that it wasn't immediately obvious to me that this was due
> to needing to do something special for audio jacks. That's certainly a
> symptom but what's not clear to me is why we think the current setup is
> a good call for the generic input device that udev didn't have specific
> magic for. We seem to have jumped straight into some very non-generic
> solutions, skipping quite a few steps in understanding what the current
> situation is.
True. OTOH, the input-jack devices of the sound driver belong to the
sound device (from a device-tree POV), so it's natural to apply the same
kind of control to them as to the sound devices. So I initially thought
it'd be OK that way.
But, maybe the situation will change when the jack layer is used more
widely for other device classes.
> > > There's a reasonable amount of usage in the embedded space.
>
> > Do they use udev? And don't they need to tweak the device permissions
> > of these input-jack devices on the fly per user?
>
> No and no. The physical user owns the whole device and typically there
> aren't user accounts in that sense on the system.
OK, thanks for the clarification.
Takashi
* Re: Jack event API - decision needed
2011-06-20 17:17 ` Mark Brown
@ 2011-06-20 17:38 ` Takashi Iwai
0 siblings, 0 replies; 57+ messages in thread
From: Takashi Iwai @ 2011-06-20 17:38 UTC (permalink / raw)
To: Mark Brown
Cc: ALSA Development Mailing List, Kay Sievers, Lennart Poettering,
David Henningsson
At Mon, 20 Jun 2011 18:17:45 +0100,
Mark Brown wrote:
>
> On Mon, Jun 20, 2011 at 06:59:46PM +0200, Takashi Iwai wrote:
>
> > If udev is being used by real users of such input devices, it'd be a
> > good justification. In the previous thread, I also gave some
> > examples that the input-jack device was used for HD-audio media-PC
> > devices, but it's not clear whether udev is used there. And I'm not
> > sure whether there are real users of embedded devices with the
> > input-jack layer that requires udev's ACL handling.
>
> I don't think the embedded space cares desperately about the udev stuff
> since most of the time the system management daemon which owns this
> stuff is running as root anyway, PulseAudio or otherwise.
>
> > OTOH, if this is mainly targeted for the future extension of
> > PulseAudio, the primary question would become different. Possibly the
> > path via ALSA control API might give more information than the current
> > implementation with the input-jack layer.
>
> I think they can both either give equivalent information or work in
> concert, I don't see a need to pick between the two.
Yes, we can implement both concurrently. They don't conflict.
> Like I say the
> jacks aren't exclusively for audio use so if we have to rely on the ALSA
> APIs to get information about them it seems like we're messing up. For
> things like working out where the jack fits into the audio routing map
> we'd certainly want to have ALSA API integration but that doesn't mean
> everything about the jack has to go that way.
Agreed.
Takashi
* Re: Jack event API - decision needed
2011-06-20 15:35 ` Takashi Iwai
2011-06-20 16:52 ` Mark Brown
@ 2011-06-20 18:24 ` David Henningsson
2011-06-21 0:29 ` Mark Brown
1 sibling, 1 reply; 57+ messages in thread
From: David Henningsson @ 2011-06-20 18:24 UTC (permalink / raw)
To: Takashi Iwai
Cc: ALSA Development Mailing List, Kay Sievers, Lennart Poettering,
Mark Brown
On 2011-06-20 17:35, Takashi Iwai wrote:
> At Mon, 20 Jun 2011 16:19:51 +0200,
> Kay Sievers wrote:
>>
>> On Mon, Jun 20, 2011 at 15:56, Mark Brown
>> <broonie@opensource.wolfsonmicro.com> wrote:
>>> Sorry about the top posting, but as I wasn't involved in any of the discussions, am on a mobile device right now, and your mail isn't directly legible, it would be enormously helpful if you could summarize the issues you're trying to address, your proposed solution and the problems Kay had.
>>
>> Domain-specific events should not be 'faked' as 'input' devices. No
>> non-input subsystem should do that if it can be avoided. That 'input'
>> supports nice and easy events to userspace should not be a reason to
>> misuse it for pretty much unrelated things.
>>
>> There are patches to have the ALSA control device emit these ALSA
>> related events natively. That would be just better, simpler and more
>> correct than any additionally created input device.
>>
>> If Takashi can make that possible in a reasonable time frame, we
>> should not even start handling the (currently not handled) input
>> devices in upstream projects like udev and PulseAudio, and focus right
>> away on the native ALSA control events.
>>
>> If we can't have the native ALSA events anytime soon for some reason,
>> we might need to merge the input device support, but I would like to
>> avoid that.
>
> Well, the implementation would be relatively easy. There was already
> a patch, so not too hard to revisit.
>
> But, there are still some open questions. For example, what
> information is mandatory and what information is preferred about the
> pin. HD-audio provides the location, type, color, etc. We can encode
> these into a single name string, but is it a preferred way?
I was thinking the same thing. We don't want another parser like the
one we now have for the volume control names in PulseAudio. [1] Better
to encode the information in binary.
> Alternatively, the control element can provide the HD-audio pin config
> via an extra TLV data, so that apps can refer to it if needed in
> addition to the name string.
>
> Also, we may consider some way to expose the correlation of the jack
> control element and other mixer elements. Though, this sounds
> optional to me for the time being.
This correlation is what I'm missing the most, I think. We need,
e.g., a way to figure out that if you plug something into HDMI port
number 2, we should output through PCM device 2 on that card. Exposing
which mixers affect this port is highly desirable as well, to be more
accurate than the name-based algorithm PulseAudio currently uses.
For the jack itself, the most important info would be:
1) Type (Headphone / Line-out / etc)
2) Channel allocation (Front / Side / LFE / etc)
3) Location (Rear / Docking station / and can also be Front, just to add
to the confusion)
> These questions are basically requirements from the apps; so I'd like
> to know the exact demands before going to implementation.
Hopefully this mail gives a little more insight from the Pulseaudio
viewport at least.
Should we design something new, I think we should start with a pin/port
concept rather than a jack concept. "Internal speaker" would then be a
port that does not have a jack, whereas "Headphone" would be a port that
has a jack.
(Btw, I don't know much about DAPM and how well it scales to cope with
these requirements; it looks very ASoC-specific to me, but perhaps it's
just the SND_SOC_DAPM_* naming that fools me. But can DAPM, e.g., send
events to userspace?)
--
David Henningsson, Canonical Ltd.
http://launchpad.net/~diwic
[1] It becomes even worse when some things such as "I control headphones
and line-out but not internal speaker" cannot be expressed with a name;
at least there is no current standard for doing so. :-(
* Re: Jack event API - decision needed
2011-06-20 17:07 ` Mark Brown
2011-06-20 17:12 ` Takashi Iwai
@ 2011-06-20 18:53 ` David Henningsson
2011-06-20 23:40 ` Mark Brown
1 sibling, 1 reply; 57+ messages in thread
From: David Henningsson @ 2011-06-20 18:53 UTC (permalink / raw)
To: Mark Brown
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Lennart Poettering
On 2011-06-20 19:07, Mark Brown wrote:
> On Mon, Jun 20, 2011 at 03:37:25PM +0200, David Henningsson wrote:
>> As part of that I wrote a udev patch a few days ago, which nobody
>> commented on in alsa-devel [1], but was somewhat disliked by at
>
> Having dug out the mailing list archive I guess the fact that you just
> posted it to the list and didn't CC anyone on the patch didn't help
> here.
Therefore I cc:ed more people on this thread, which seems to have at
least gotten more attention, but risks falling into the ditch of endless
discussion instead. Please prove me wrong :-)
> Looking at the patch the main thing that jumps out at me without
> any knowledge of the udev code is that the patch will end up classifying
> video output jacks as audio even if they've no audio capability which is
> obviously not correct.
In the suggested implementation, they will only be marked as audio jacks
if their parent object is a sound card.
> I still don't entirely understand the technical issue you're trying to
> address here - this doesn't seem specific to audio.
I think this is a difference between the embedded space and the desktop
space. In the embedded space, a hardware designer can decide to connect
the "take a camera picture" button to "line in jack detect" just because
that saves him an extra GPIO chip (and "line in" isn't connected anyway).
Those things don't happen on a normal PC [1]; instead, things must be
auto-detectable.
> As far as I can
> tell the issue is that some of the input devices on the system aren't
> being made available to the console user, presumably because there are
> some that shouldn't be made available to them, so the issue is that the
> heuristics that udev uses to decide if an input device should be root
> only aren't working correctly in at least this case. It feels like if
> we understood why the heuristics are making a bad call here we might be
> able to come up with a better solution.
AFAICT, the current "heuristics" is to assign all input devices to root,
and root only.
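For reference, jack state on the current path travels as switch events through the evdev interface; a consumer with read access to /dev/input/eventN would decode fixed-size records like the sketch below. The layout assumes a 64-bit Linux system; EV_SW and SW_HEADPHONE_INSERT are the real linux/input.h constants.

```python
import struct

# struct input_event on 64-bit Linux:
#   struct timeval time; __u16 type; __u16 code; __s32 value;
EVENT_FMT = "llHHi"

EV_SW = 0x05                  # switch events (jack insert/remove)
SW_HEADPHONE_INSERT = 0x02    # headphone jack switch code

def decode_event(buf):
    """Decode one raw record as read from /dev/input/eventN."""
    sec, usec, ev_type, code, value = struct.unpack(EVENT_FMT, buf)
    return {"type": ev_type, "code": code, "value": value}

# Synthetic "headphone plugged in" event, as the kernel would emit it:
raw = struct.pack(EVENT_FMT, 0, 0, EV_SW, SW_HEADPHONE_INSERT, 1)
ev = decode_event(raw)
```

The permission question in the thread is exactly about who is allowed to perform that read on the device node.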
>> For options 2a) and 2b) I guess the existing /dev/input thing should
>> be deprecated and/or removed. So part of decision should maybe be
>> based on information about how widespread the usage of these devices
>> are currently...?
> There's a reasonable amount of usage in the embedded space.
But maintaining two different implementations of input jacks, without at
least strongly deprecating one of them, leads to application programmers
being confused, the kernel being big and bloated, and so on...
--
David Henningsson, Canonical Ltd.
http://launchpad.net/~diwic
[1] Just because I said so, presumably someone comes up with a story
about a PC vendor who has actually done just that...
^ permalink raw reply [flat|nested] 57+ messages in thread
* Re: Jack event API - decision needed
2011-06-20 18:53 ` David Henningsson
@ 2011-06-20 23:40 ` Mark Brown
2011-06-21 12:11 ` David Henningsson
0 siblings, 1 reply; 57+ messages in thread
From: Mark Brown @ 2011-06-20 23:40 UTC (permalink / raw)
To: David Henningsson
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Lennart Poettering
On Mon, Jun 20, 2011 at 08:53:46PM +0200, David Henningsson wrote:
> On 2011-06-20 19:07, Mark Brown wrote:
> >On Mon, Jun 20, 2011 at 03:37:25PM +0200, David Henningsson wrote:
> >Looking at the patch the main thing that jumps out at me without
> >any knowledge of the udev code is that the patch will end up classifying
> >video output jacks as audio even if they've no audio capability which is
> >obviously not correct.
> In the suggested implementation, they will only be marked as audio jacks
> if their parent object is a sound card.
Oh, then in that case we've got another issue in that jacks that happen
not to be associated directly with a sound card for some reason (they're
implemented by the embedded controller and exposed via ACPI for example)
won't be made available to applications.
> >I still don't entirely understand the technical issue you're trying to
> >address here - this doesn't seem specific to audio.
> I think this is a difference between embedded space and desktop
> space. In embedded space, a hardware designer can decide to connect
> the "take a camera picture" button to "line in jack detect" just
> because that saves him an extra GPIO chip (and "line in" isn't
> connected anyway). Those things don't happen on normal PC [1], but
> instead, things must be auto-detectable.
I'm sorry but I'm not sure I follow the connection between the text
you've written above and the point made in the text you're quoting. The
physical implementation of the hardware doesn't seem strongly related to
how Linux distributions manage permissions for the interfaces exposed by
the drivers that manage that hardware.
If you're talking about the support for buttons that's nothing to do
with wiring random unrelated controls to jack detection circuits (which
would be highly unusual as pretty much any detection specific
electronics are highly specialised and difficult to use for anything
else). It's there because most headsets have at least one button on
them, implemented by shorting the mic to ground and used for
play/pause/call functionality, and some have more complex arrangements.
I've got a laptop sitting on this table which implements such
functionality.
As far as the implementation stuff goes you do get all sorts of odd
stuff on PCs, anything you've done that involves ACPI interaction (like
the Thinkpad extra volume control that was being discussed recently) is
going to be that sort of oddity.
> >only aren't working correctly in at least this case. It feels like if
> >we understood why the heuristics are making a bad call here we might be
> >able to come up with a better solution.
> AFAICT, the current "heuristics", is to assign all input devices to
> root and root only.
OK, well that doesn't seem like an immediately obvious choice. I can
imagine there might be some concern around keyboards due to passwords
but in general it doesn't seem obvious that the console user shouldn't
be able to read physical input devices on the box (which is the case
we're trying to implement here; we don't need write support). Do all
classes of input device currently have some root only service to mediate
access to them, and if that is the model we're using, perhaps it'd make
sense to have one for this with a dbus interface or whatever that
PulseAudio can talk to, rather than have it talking directly to the
device?
> >>For options 2a) and 2b) I guess the existing /dev/input thing should
> >>be deprecated and/or removed. So part of decision should maybe be
> >>based on information about how widespread the usage of these devices
> >>are currently...?
> >There's a reasonable amount of usage in the embedded space.
> But maintaining two different implementations of input jacks without
> at least strongly deprecating one of them, lead to application
> programmers being confused, kernel being big and bloated, and so
> on...
"But"?
^ permalink raw reply [flat|nested] 57+ messages in thread
* Re: Jack event API - decision needed
2011-06-20 18:24 ` David Henningsson
@ 2011-06-21 0:29 ` Mark Brown
2011-06-21 6:57 ` David Henningsson
0 siblings, 1 reply; 57+ messages in thread
From: Mark Brown @ 2011-06-21 0:29 UTC (permalink / raw)
To: David Henningsson
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Lennart Poettering
On Mon, Jun 20, 2011 at 08:24:14PM +0200, David Henningsson wrote:
> Exposing what mixers affect this port is highly desirable as well to
> get more accurate than the current name-based algorithm Pulseaudio
> currently uses.
Of course given the potential for internal routing within the card you
can only really say that about the very edge controls that are fully
committed to a given path - hence my comment about exposing all the
routing being much better.
> Should we design something new, I think we should start with a
> pin/port concept rather than a jack concept. "Internal speaker"
> would then be a port that does not have a jack, whereas "Headphone"
> would be a port that has a jack.
I think that's too edge node focused, if we're going to define new
interfaces we may as well cover everything rather than doing half the
job.
> (Btw, I don't know much about DAPM and how well that scales to cope
> with these requirements, it looks very ASoC to me, but perhaps it's
> just the SND_SOC_DAPM_* naming that fools me. But can DAPM e g send
> events to userspace?)
No, there is no userspace visibility of the routing map. It's pretty
much exactly equivalent to the data HDA CODECs expose, but shipped in
source rather than parsed out of the device at runtime.
^ permalink raw reply [flat|nested] 57+ messages in thread
* Re: Jack event API - decision needed
2011-06-21 0:29 ` Mark Brown
@ 2011-06-21 6:57 ` David Henningsson
2011-06-21 10:40 ` Mark Brown
0 siblings, 1 reply; 57+ messages in thread
From: David Henningsson @ 2011-06-21 6:57 UTC (permalink / raw)
To: Mark Brown
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Lennart Poettering
On 2011-06-21 02:29, Mark Brown wrote:
> On Mon, Jun 20, 2011 at 08:24:14PM +0200, David Henningsson wrote:
>
>> Exposing what mixers affect this port is highly desirable as well to
>> get more accurate than the current name-based algorithm Pulseaudio
>> currently uses.
>
> Of course given the potential for internal routing within the card you
> can only really say that about the very edge controls that are fully
> committed to a given path - hence my comment about exposing all the
> routing being much better.
I'm ok with exposing all the routing. Perhaps we could then add some
convenience functions in alsa-lib (and/or UCM?) that makes it easier for
applications to figure out what they need to know without having to
parse the entire graph.
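One possible shape for such a convenience helper, assuming the routing were exposed as a graph. Everything below — the graph encoding, the node names, and the function itself — is hypothetical, sketched only to show what applications could be spared from implementing:

```python
from collections import deque

# Hypothetical routing graph: nodes are PCM devices, mixer controls and
# ports; edges follow the audio path from source toward the port.
ROUTES = {
    "PCM 0":            ["Master Volume"],
    "Master Volume":    ["Headphone Volume", "Speaker Switch"],
    "Headphone Volume": ["Headphone port"],
    "Speaker Switch":   ["Internal speaker port"],
}

def controls_on_path(graph, source, port):
    """Breadth-first search returning the controls between a PCM and a port."""
    queue = deque([(source, [])])
    seen = {source}
    while queue:
        node, controls = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt == port:
                return controls          # controls sitting on the path
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, controls + [nxt]))
    return None                          # no route to that port

# controls_on_path(ROUTES, "PCM 0", "Headphone port")
# -> ["Master Volume", "Headphone Volume"]
```

With a helper like this in alsa-lib, an application could ask "which controls affect the headphone port?" without walking the raw graph itself.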
>> Should we design something new, I think we should start with a
>> pin/port concept rather than a jack concept. "Internal speaker"
>> would then be a port that does not have a jack, whereas "Headphone"
>> would be a port that has a jack.
>
> I think that's too edge node focused, if we're going to define new
> interfaces we may as well cover everything rather than doing half the
> job.
Rather, the ports are what we're missing. We already have PCM devices
(which correspond to the DAC nodes) and mixer controls, so the ports are
the only objects/nodes we're missing.
Maybe there are some other types of objects we're missing as well, but I
don't think they're as common or important.
We're also missing the links between the nodes (they're partially exposed
through the mixer control naming, but that's error prone).
>> (Btw, I don't know much about DAPM and how well that scales to cope
>> with these requirements, it looks very ASoC to me, but perhaps it's
>> just the SND_SOC_DAPM_* naming that fools me. But can DAPM e g send
>> events to userspace?)
>
> >No, there is no userspace visibility of the routing map. It's pretty
> much exactly equivalent to the data HDA CODECs expose but shipped in
> source rather than parsed out of the device at runtime.
Sorry, I mixed up DAPM and the Media Controller API, but they seem
partially overlapping to me (both are supposed to expose this codec graph).
So, if we were to use the new Media Controller API (which I think you
have suggested?), could you explain a little where the
boundaries/intersections between this API and the current objects (ALSA
PCM objects, ALSA mixer control objects) would be, and how the Media
controller API would interact with the existing ALSA APIs?
--
David Henningsson, Canonical Ltd.
http://launchpad.net/~diwic
^ permalink raw reply [flat|nested] 57+ messages in thread
* Re: Jack event API - decision needed
2011-06-21 6:57 ` David Henningsson
@ 2011-06-21 10:40 ` Mark Brown
0 siblings, 0 replies; 57+ messages in thread
From: Mark Brown @ 2011-06-21 10:40 UTC (permalink / raw)
To: David Henningsson
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Lennart Poettering
On Tue, Jun 21, 2011 at 08:57:39AM +0200, David Henningsson wrote:
> On 2011-06-21 02:29, Mark Brown wrote:
> >Of course given the potential for internal routing within the card you
> >can only really say that about the very edge controls that are fully
> >committed to a given path - hence my comment about exposing all the
> >routing being much better.
> I'm ok with exposing all the routing. Perhaps we could then add some
> convenience functions in alsa-lib (and/or UCM?) that makes it easier
> for applications to figure out what they need to know without having
> to parse the entire graph.
I'd guess so, unless the raw interface is just easy enough to use - but
first we need to get the data in place; then we can worry about
convenience functions.
UCM wouldn't be an appropriate place to put this, it's all about
applying configurations something else has generated and has no reason
to care about routing itself.
> >I think that's too edge node focused, if we're going to define new
> >interfaces we may as well cover everything rather than doing half the
> >job.
> Rather, the ports are what we're missing. We already have PCM
> devices (which correspond to the DAC nodes) and mixer controls, so
> the ports are the only objects/nodes we're missing.
PCM devices correspond to digital links to the CPU, not to DACs and
ADCs. You can have digital routing between the two similar to what
you can get in the analogue domain.
> We're also missing the links between the nodes (it's partially
> exposed through the mixer control naming, but that's error prone).
It's not just error prone, it just plain doesn't exist in any real sense
at the minute. The name based approach can work only for the most basic
sound cards with simple routes, and even then it can't tell applications
how the controls are ordered in the audio paths.
> >>(Btw, I don't know much about DAPM and how well that scales to cope
> >>with these requirements, it looks very ASoC to me, but perhaps it's
> >>just the SND_SOC_DAPM_* naming that fools me. But can DAPM e g send
> >>events to userspace?)
> >No, there is no userspace visibility of the routing map. It's pretty
> >much exactly equivalent to the data HDA CODECs expose but shipped in
> >source rather than parsed out of the device at runtime.
> Sorry, I mixed up DAPM and the Media Controller API, but they seem
> partially overlapping to me (both are supposed to expose this codec
> graph).
No, DAPM is not supposed to expose anything to the application layer.
It's purely an implementation detail of the drivers and it is only
concerned with power so has no information about anything that doesn't
affect power.
> So, if we were to use the new Media Controller API (which I think
> you have suggested?), could you explain a little where the
> boundaries/intersections between this API and the current objects
> (ALSA PCM objects, ALSA mixer control objects) would be, and how the
> Media controller API would interact with the existing ALSA APIs?
My expectation would be that whatever interface we use for the graph
would, in the first instance, just point at the existing user-visible
objects where they exist from the nodes and edges of the graph. I'd not
expect any interaction as such, otherwise we get an API break.
^ permalink raw reply [flat|nested] 57+ messages in thread
* Re: Jack event API - decision needed
2011-06-20 23:40 ` Mark Brown
@ 2011-06-21 12:11 ` David Henningsson
2011-06-21 12:39 ` Mark Brown
0 siblings, 1 reply; 57+ messages in thread
From: David Henningsson @ 2011-06-21 12:11 UTC (permalink / raw)
To: Mark Brown
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Lennart Poettering
On 2011-06-21 01:40, Mark Brown wrote:
> On Mon, Jun 20, 2011 at 08:53:46PM +0200, David Henningsson wrote:
>> On 2011-06-20 19:07, Mark Brown wrote:
>>> On Mon, Jun 20, 2011 at 03:37:25PM +0200, David Henningsson wrote:
>
>>> Looking at the patch the main thing that jumps out at me without
>>> any knowledge of the udev code is that the patch will end up classifying
>>> video output jacks as audio even if they've no audio capability which is
>>> obviously not correct.
>
>> In suggested implementation, they will only be marked audio jacks if
>> their parent object is a sound card.
>
> Oh, then in that case we've got another issue in that jacks that happen
> not to be associated directly with a sound card for some reason (they're
> implemented by the embedded controller and exposed via ACPI for example)
> won't be made available to applications.
Which is not a problem IMO, because in the desktop world this does not
happen (or at least I've never seen it), and in the embedded world you
have PA running as root anyway.
>>> I still don't entirely understand the technical issue you're trying to
>>> address here - this doesn't seem specific to audio.
>
>> I think this is a difference between embedded space and desktop
>> space. In embedded space, a hardware designer can decide to connect
>> the "take a camera picture" button to "line in jack detect" just
>> because that saves him an extra GPIO chip (and "line in" isn't
>> connected anyway). Those things don't happen on normal PC [1], but
>> instead, things must be auto-detectable.
>
> I'm sorry but I'm not sure I follow the connection between the text
> you've written above and the point made in the text you're quoting. The
> physical implementation of the hardware doesn't seem strongly related to
> how Linux distributions manage permissions for the interfaces exposed by
> the drivers that manage that hardware.
>
> If you're talking about the support for buttons that's nothing to do
> with wiring random unrelated controls to jack detection circuits (which
> would be highly unusual as pretty much any detection specific
> electronics are highly specialised and difficult to use for anything
> else). It's there because most headsets have at least one button on
> them, implemented by shorting the mic to ground and used for
> play/pause/call functionality, and some have more complex arrangements.
> I've got a laptop sitting on this table which implements such
> functionality.
> As far as the implementation stuff goes you do get all sorts of odd
> stuff on PCs, anything you've done that involves ACPI interaction (like
> the Thinkpad extra volume control that was being discussed recently) is
> going to be that sort of oddity.
If you ask me, I think the Thinkpad stuff should be implemented with
connections between the hda-intel and thinkpad-acpi driver inside the
kernel, and we can leave userspace out of it - userspace would just see
the hda-intel card, possibly with an extra mute or volume control. But
that's another story.
>>> only aren't working correctly in at least this case. It feels like if
>>> we understood why the heuristics are making a bad call here we might be
>>> able to come up with a better solution.
>
>> AFAICT, the current "heuristics", is to assign all input devices to
>> root and root only.
>
> OK, well that doesn't seem like an immediately obvious choice. I can
> imagine there might be some concern around keyboards due to passwords
> but in general it doesn't seem obvious that the console user shouldn't
> be able to read physical input devices on the box (which is the case
> we're trying to implement here; we don't need write support). Do all
> classes of input device currently have some root only service to mediate
> access to them, and if that is the model we're using perhaps it'd make
> sense to have one for this with a dbus interface or whatever that
> PulseAudio can talk to rather than to have it talking directly to the
> device?
If I look at what /dev/input devices I have here, those are the keyboard,
mouse, and ACPI buttons (power, lid close etc). AFAIK, all of those go
through the X input layer. Play/pause keys should go through there as
well, but are you saying that audio jack events should also go through
the X input layer?
If you are, you're welcome to try to fit it into the 248 keycodes that are
already occupied, design a new X input extension, or whatever is the
right solution for that. I don't know myself.
If you're not, then that's why this is specific to audio.
>>>> For options 2a) and 2b) I guess the existing /dev/input thing should
>>>> be deprecated and/or removed. So part of decision should maybe be
>>>> based on information about how widespread the usage of these devices
>>>> are currently...?
>
>>> There's a reasonable amount of usage in the embedded space.
>
>> But maintaining two different implementations of input jacks without
>> at least strongly deprecating one of them, lead to application
>> programmers being confused, kernel being big and bloated, and so
>> on...
> "But"?
Do you agree that maintaining two different implementations of input
jacks is something we should avoid?
--
David Henningsson, Canonical Ltd.
http://launchpad.net/~diwic
^ permalink raw reply [flat|nested] 57+ messages in thread
* Re: Jack event API - decision needed
2011-06-21 12:11 ` David Henningsson
@ 2011-06-21 12:39 ` Mark Brown
2011-06-22 10:47 ` David Henningsson
0 siblings, 1 reply; 57+ messages in thread
From: Mark Brown @ 2011-06-21 12:39 UTC (permalink / raw)
To: David Henningsson
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Lennart Poettering
On Tue, Jun 21, 2011 at 02:11:15PM +0200, David Henningsson wrote:
> On 2011-06-21 01:40, Mark Brown wrote:
> >Oh, then in that case we've got another issue in that jacks that happen
> >not to be associated directly with a sound card for some reason (they're
> >implemented by the embedded controller and exposed via ACPI for example)
> >won't be made available to applications.
> Which is not a problem IMO, because in the desktop world this does
> not happen (or at least I've never seen it), and in the embedded
> world you have PA running as root anyway.
This seems rather short sighted - the desktop and embedded worlds aren't
as clearly divided as all that. You've already got things like the
Genesi systems on the market at the minute running Ubuntu on ARM
hardware, for example, and you've also got a bunch of laptop
manufacturers putting ARM systems into their laptops to dual boot into.
> >OK, well that doesn't seem like an immediately obvious choice. I can
> >imagine there might be some concern around keyboards due to passwords
> >but in general it doesn't seem obvious that the console user shouldn't
> >be able to read physical input devices on the box (which is the case
> >we're trying to implement here; we don't need write support). Do all
> >classes of input device currently have some root only service to mediate
> >access to them, and if that is the model we're using perhaps it'd make
> >sense to have one for this with a dbus interface or whatever that
> >PulseAudio can talk to rather than to have it talking directly to the
> >device?
> If I look at what /dev/input devices I have here, that's keyboard,
> mouse, and ACPI buttons (power, lid close etc). AFAIK, all of those
> go through the X input layer. Play/pause keys should go through
> there as well, but are you saying that audio jack events should also
> go through the X input layer?
I'm asking what the sensible model for handling this stuff is and why we
need to use a different model for this. Perhaps this should go through
the X server, perhaps somewhere else.
It does strike me that the UI may want to be able to join the buttons up
with the particular jack - for example, if you've got a headset that
you've got assigned to telephony functions only perhaps the mic short
button on that headset should only do telephony things.
> If you are, you're welcome to try to fit it in the 248 keycodes that
> are already occupied, design a new X input extension, or whatever is
> the right solution for that. I don't know myself.
> If you're not, then that's why this is specific to audio.
It doesn't strike me as an obviously awful solution if that's what we're
using for things like the ACPI buttons which seem pretty similar - like
I say one of my questions here is why we need to do something domain
specific here. There are certainly a bunch of other switch types in the
input API which look like the desktop might want to look at them (things
like SW_DOCK and SW_ROTATE_LOCK for example) so presumably we ought to
handle them somehow too.
It does sound like we need some input from the udev and more general
userspace stack people about how this stuff is usually handled.
> >>>>For options 2a) and 2b) I guess the existing /dev/input thing should
> >>>>be deprecated and/or removed. So part of decision should maybe be
> >>>>based on information about how widespread the usage of these devices
> >>>>are currently...?
> >>>There's a reasonable amount of usage in the embedded space.
> >>But maintaining two different implementations of input jacks without
> >>at least strongly deprecating one of them, lead to application
> >>programmers being confused, kernel being big and bloated, and so
> >>on...
> >"But"?
> Do you agree that maintaining two different implementations of input
> jacks is something we should avoid?
Show me the interfaces and I'll tell you if they look like a bad idea.
Do we have two identical APIs, or do we have two APIs that sit next to
each other, one providing more detail on the data from the other?
I don't see what that's got to do with what I said above, though?
^ permalink raw reply [flat|nested] 57+ messages in thread
* Re: Jack event API - decision needed
2011-06-21 12:39 ` Mark Brown
@ 2011-06-22 10:47 ` David Henningsson
2011-06-22 11:48 ` Mark Brown
0 siblings, 1 reply; 57+ messages in thread
From: David Henningsson @ 2011-06-22 10:47 UTC (permalink / raw)
To: Mark Brown
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Lennart Poettering
On 2011-06-21 14:39, Mark Brown wrote:
> On Tue, Jun 21, 2011 at 02:11:15PM +0200, David Henningsson wrote:
>> On 2011-06-21 01:40, Mark Brown wrote:
>
>>> Oh, then in that case we've got another issue in that jacks that happen
>>> not to be associated directly with a sound card for some reason (they're
>>> implemented by the embedded controller and exposed via ACPI for example)
>>> won't be made available to applications.
>
>> Which is not a problem IMO, because in the desktop world this does
>> not happen (or at least I've never seen it), and in the embedded
>> world you have PA running as root anyway.
>
> This seems rather short sighted - the desktop and embedded worlds aren't
> as clearly divided as all that, you've already got things like the
> Genesi systems on the market at the minute running Ubuntu on ARM
> hardware for example and you've also got a bunch of laptop manufacturers
> putting ARM systems in their systems to dual boot into.
If you have a better idea of how to determine whether a SW_VIDEOOUT jack
is actually an audio jack or not, let me know.
>>> OK, well that doesn't seem like an immediately obvious choice. I can
>>> imagine there might be some concern around keyboards due to passwords
>>> but in general it doesn't seem obvious that the console user shouldn't
>>> be able to read physical input devices on the box (which is the case
>>> we're trying to implement here; we don't need write support). Do all
>>> classes of input device currently have some root only service to mediate
>>> access to them, and if that is the model we're using perhaps it'd make
>>> sense to have one for this with a dbus interface or whatever that
> PulseAudio can talk to rather than to have it talking directly to the
>>> device?
>
>> If I look at what /dev/input devices I have here, that's keyboard,
>> mouse, and ACPI buttons (power, lid close etc). AFAIK, all of those
>> go through the X input layer. Play/pause keys should go through
>> there as well, but are you saying that audio jack events should also
>> go through the X input layer?
>
> I'm asking what the sensible model for handling this stuff is and why we
> need to use a different model for this. Perhaps this should go through
> the X server, perhaps somewhere else.
>
> It does strike me that the UI may want to be able to join the buttons up
> with the particular jack - for example, if you've got a headset that
> you've got assigned to telephony functions only perhaps the mic short
> button on that headset should only do telephony things.
>
>> If you are, you're welcome to try to fit it in the 248 keycodes that
>> are already occupied, design a new X input extension, or whatever is
>> the right solution for that. I don't know myself.
>
>> If you're not, then that's why this is specific to audio.
>
> It doesn't strike me as an obviously awful solution if that's what we're
> using for things like the ACPI buttons which seem pretty similar - like
> I say one of my questions here is why we need to do something domain
> specific here. There are certainly a bunch of other switch types in the
> input API which look like the desktop might want to look at them (things
> like SW_DOCK and SW_ROTATE_LOCK for example) so presumably we ought to
> handle them somehow too.
>
> It does sound like we need some input from the udev and more general
> userspace stack people about how this stuff is usually handled.
I was hoping for Lennart or Kay to answer here, as they are more "udev
and general userspace stack" people than I am, but lacking that, I can
only note that on my machine, the only processes listening to
/dev/input files are upowerd (for the lid switch) and Xorg (for
everything else). [1] Acpid seems to be able to listen to these events
too, but does not seem to do so on my machine (as determined by "sudo
fuser -v").
Xorg can only handle keypresses and mouse events, and AFAICT doesn't care
about switch events. So, if the way things are usually handled is to add
another u-audiojack-d just to forward events to PulseAudio, I would
prefer giving the user access to the relevant input devices directly,
out of pure simplicity.
--
David Henningsson, Canonical Ltd.
http://launchpad.net/~diwic
[1] On a slightly older machine there is also hald-addon-input, but
since that's deprecated, let's ignore that for now.
^ permalink raw reply [flat|nested] 57+ messages in thread
* Re: Jack event API - decision needed
2011-06-22 10:47 ` David Henningsson
@ 2011-06-22 11:48 ` Mark Brown
2011-06-22 12:50 ` Kay Sievers
0 siblings, 1 reply; 57+ messages in thread
From: Mark Brown @ 2011-06-22 11:48 UTC (permalink / raw)
To: David Henningsson
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Lennart Poettering
On Wed, Jun 22, 2011 at 12:47:35PM +0200, David Henningsson wrote:
> On 2011-06-21 14:39, Mark Brown wrote:
> >On Tue, Jun 21, 2011 at 02:11:15PM +0200, David Henningsson wrote:
[Using the parent device to work out if a jack is an audio jack.]
> If you have a better idea about how to determine if a SW_VIDEOOUT
> jack is actually an audio jack or not, let me know.
I'd have done this by checking for any audio capabilities - if the jack
can detect an audio input or output, then it's an audio-capable jack.
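A sketch of that capability-based check: look at which SW_* switch bits the jack device advertises (the constants below are the real linux/input.h values; how the capability set is obtained, e.g. via the EVIOCGBIT ioctl, is left out) and classify it as audio if any audio-related switch is present.

```python
# Switch codes from linux/input.h relevant to jack classification:
SW_HEADPHONE_INSERT  = 0x02
SW_MICROPHONE_INSERT = 0x04
SW_LINEOUT_INSERT    = 0x06
SW_VIDEOOUT_INSERT   = 0x08
SW_LINEIN_INSERT     = 0x0d

AUDIO_SWITCHES = {SW_HEADPHONE_INSERT, SW_MICROPHONE_INSERT,
                  SW_LINEOUT_INSERT, SW_LINEIN_INSERT}

def is_audio_jack(capabilities):
    """True if the device advertises at least one audio-related switch."""
    return bool(AUDIO_SWITCHES & set(capabilities))

# A video-only jack is not classified as audio:
# is_audio_jack({SW_VIDEOOUT_INSERT}) -> False
# A combined A/V jack is:
# is_audio_jack({SW_VIDEOOUT_INSERT, SW_HEADPHONE_INSERT}) -> True
```

Unlike the parent-device check, this would also cover jacks exposed via ACPI or an embedded controller with no sound card parent.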
> >It does sound like we need some input from the udev and more general
> >userspace stack people about how this stuff is usually handled.
> I was hoping for Lennart or Kay to answer here as they are more
> "udev and general userspace stack people" than I am, but lacking
Yes, me too. Kay, does any of this ring any bells from a udev point of
view?
> that, I can only notice that on my machine, the only processes
> listening to /dev/input files are upowerd (for the lid switch) and
> Xorg (for everything else). [1] Acpid seems to be able to listen to
> these events too, but does not seem to do that on my machine (as
> determined by "sudo fuser -v").
Hrm, right. So upowerd is an example of a little daemon running as root
which provides an interface to the applications running as users,
separately from the X server, and it was added recently too, so it should
be an example of current best practice. However, with the lid there is
an additional motivation for doing this as a system daemon, as we need to
do power handling even when there is nobody logged in, which is less of
an issue for audio, especially with the current model we have.
> Xorg can only handle keypresses and mouse events, and AFAICT doesn't
> care about switch events, so, if the way things are usually handled
> is to add another u-audiojack-d just to forward events to
> pulseaudio, I would prefer giving the user access to the relevant
> input devices directly, out of pure simplicity.
I tend to agree. I think my preference would be to open up access to
input (or possibly just switch) devices in general, or if there's a good
reason not to do that then do the daemon thing.
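[Editor's note: for context, what a consumer of such a switch device would
read from /dev/input/eventN is a stream of fixed-size struct input_event
records. A minimal decoder sketch, assuming a 64-bit machine and the
EV_SW/SW_* constants from linux/input.h:]

```python
import struct

EV_SW = 0x05                 # switch event type, from linux/input.h
SW_HEADPHONE_INSERT = 0x02   # headphone jack switch code

# struct input_event on a 64-bit machine: struct timeval (two longs),
# then __u16 type, __u16 code, __s32 value.
EVENT_FMT = "llHHi"
EVENT_SIZE = struct.calcsize(EVENT_FMT)

def switch_events(buf):
    """Yield (code, value) for every EV_SW event in a raw buffer
    read from an input device node."""
    for off in range(0, len(buf) - EVENT_SIZE + 1, EVENT_SIZE):
        _sec, _usec, etype, code, value = struct.unpack_from(EVENT_FMT, buf, off)
        if etype == EV_SW:
            yield code, value

# Example: a headphone-insert event packed the way the kernel emits it.
raw = struct.pack(EVENT_FMT, 0, 0, EV_SW, SW_HEADPHONE_INSERT, 1)
print(list(switch_events(raw)))  # -> [(2, 1)]
```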
* Re: Jack event API - decision needed
2011-06-22 11:48 ` Mark Brown
@ 2011-06-22 12:50 ` Kay Sievers
2011-06-22 13:25 ` Mark Brown
0 siblings, 1 reply; 57+ messages in thread
From: Kay Sievers @ 2011-06-22 12:50 UTC (permalink / raw)
To: Mark Brown
Cc: Takashi Iwai, ALSA Development Mailing List, Lennart Poettering,
David Henningsson
On Wed, Jun 22, 2011 at 13:48, Mark Brown
<broonie@opensource.wolfsonmicro.com> wrote:
> On Wed, Jun 22, 2011 at 12:47:35PM +0200, David Henningsson wrote:
>> On 2011-06-21 14:39, Mark Brown wrote:
>> >On Tue, Jun 21, 2011 at 02:11:15PM +0200, David Henningsson wrote:
>
> [Using the parent device to work out if a jack is an audio jack.]
>
>> If you have a better idea about how to determine if a SW_VIDEOOUT
>> jack is actually an audio jack or not, let me know.
>
> I'd have done this by checking for any audio capabilities - if the jack
> can detect an audio input or output then it's an audio capable jack.
>
>> >It does sound like we need some input from the udev and more general
>> >userspace stack people about how this stuff is usually handled.
>
>> I was hoping for Lennart or Kay to answer here as they are more
>> "udev and general userspace stack people" than I am, but lacking
>
> Yes, me too. Kay, does any of this ring any bells from a udev point of
> view?
At this point, I wouldn't really expect udev to be directly involved.
Udev does not really know anything about audio, video, buttons or
input devices.
I'm still pretty sure that there should be no fake input devices
involved for any jack related things. Especially not in the simple and
common desktop case, where the input devices are created by, and
direct child devices of the ALSA card device (like Intel HDA). I'm
sure such things just belong to the ALSA control device directly.
We should get the needed patches Takashi already has, refreshed and
merged. For PulseAudio we would just add the support for the control
device instead of input device support. Udev would not do anything
special here, just manage the access permissions for the control
device as it has done for years.
There might be odd use cases or use cases where multiple device
classes are involved for the same type of jacks, which can probably
not be covered by the ALSA-only control device, and where some
video/multimedia event channel makes sense too. I don't necessarily
see a conflict with the ALSA control device event here, not even with
events for the jacks coming out of ALSA control and the multimedia
device at the same time.
To me the ALSA control device events for audio-related jack events
seem like the natural interface and the best option currently. Unless
there are good technical reasons not to do that, I'd like to see them
made to work, instead of adding support for input devices, waiting for
some (magic) meta multimedia device to be defined, or requiring an
additional userspace event daemon.
Thanks,
Kay
_______________________________________________
Alsa-devel mailing list
Alsa-devel@alsa-project.org
http://mailman.alsa-project.org/mailman/listinfo/alsa-devel
* Re: Jack event API - decision needed
2011-06-22 12:50 ` Kay Sievers
@ 2011-06-22 13:25 ` Mark Brown
2011-06-22 13:55 ` Kay Sievers
2011-06-22 21:01 ` Lennart Poettering
0 siblings, 2 replies; 57+ messages in thread
From: Mark Brown @ 2011-06-22 13:25 UTC (permalink / raw)
To: Kay Sievers
Cc: Takashi Iwai, ALSA Development Mailing List, Lennart Poettering,
David Henningsson
On Wed, Jun 22, 2011 at 02:50:59PM +0200, Kay Sievers wrote:
> I'm still pretty sure that there should be no fake input devices
> involved for any jack related things. Especially not in the simple and
> common desktop case, where the input devices are created by, and
> direct child devices of the ALSA card device (like Intel HDA). I'm
> sure such things just belong to the ALSA control device directly.
It would be really helpful if you could engage with some of the discussion
about this... Apart from anything else "fake" seems a bit misleading;
a jack is a physical thing which really exists, which I can point to
on the system, and which in many cases has buttons on it.
> There might be odd use cases or use cases where multiple device
> classes are involved for the same type of jacks, which can probably
> not be covered by the ALSA-only control device, and where some
Like I said in reply to your earlier mail on this subject, that's not a
terribly unusual case for PC class hardware: HDMI is very widely
deployed and obviously carries both audio and video over a single
physical connector.
> To me the ALSA control device events for audio-related jack events
> seem like the natural interface and the best option currently. Unless
> there are good technical reasons not to do that, I'd like to see them
> made to work, instead of adding support for input devices, waiting for
> some (magic) meta multimedia device to be defined, or requiring an
> additional userspace event daemon.
To summarise some of the issues from the rest of the thread:
- This isn't a new interface at all, it's been implemented in the
kernel for quite some time now (the first switch was added in
2.6.18). All that's changed here is that PulseAudio is trying to
use the information that the kernel is exporting.
- Mixed function jacks aren't that unusual; if anything they're more
common on PCs than anything else. HDMI is the obvious one for PC
class hardware and where these things happen we do need to be able to
join the audio and the video up with each other.
- Many jacks include buttons so need an input device anyway which again
we want to join up with the other functions.
- We still need to handle cases where the jack isn't particularly
closely connected to the audio subsystem.
- There are other non-audio non-jack things being exposed via the same
interface which we need to have some method for handling even if we
end up doing something audio specific for some subset of jacks.
and probably some others I forget.
* Re: Jack event API - decision needed
2011-06-22 13:25 ` Mark Brown
@ 2011-06-22 13:55 ` Kay Sievers
2011-06-22 15:11 ` Mark Brown
2011-06-22 21:01 ` Lennart Poettering
1 sibling, 1 reply; 57+ messages in thread
From: Kay Sievers @ 2011-06-22 13:55 UTC (permalink / raw)
To: Mark Brown
Cc: Takashi Iwai, ALSA Development Mailing List, Dmitry Torokhov,
Lennart Poettering, David Henningsson
On Wed, Jun 22, 2011 at 15:25, Mark Brown
<broonie@opensource.wolfsonmicro.com> wrote:
> On Wed, Jun 22, 2011 at 02:50:59PM +0200, Kay Sievers wrote:
>
>> I'm still pretty sure that there should be no fake input devices
>> involved for any jack related things. Especially not in the simple and
>> common desktop case, where the input devices are created by, and
>> direct child devices of the ALSA card device (like Intel HDA). I'm
>> sure such things just belong to the ALSA control device directly.
>
> It would be really helpful if you could engage with some of the discussion
> about this... Apart from anything else "fake" seems a bit misleading;
> a jack is a physical thing which really exists, which I can point to
> on the system, and which in many cases has buttons on it.
Physical does not imply button, and even then, button does not imply
input. It's really a domain of the individual subsystem it is
associated with. CD-ROM media eject buttons are not input devices; the
events come out of SCSI.
We should stop misusing input for anything that isn't a real user
input device like a keyboard button, mouse, joystick, screen, or
tablet-like interface. Otherwise, I fear, with that logic anything with a
physical state change could become an input device. USB would then
make a good case too, to have all ports as individual input devices,
telling us if anything is plugged in or out, and so on ...
The kernel's input layer just has a sensible event relay interface, but
that should not be a reason to expose everything that has a state and
can change its state as an input device. People should come up with
their own event channel instead of lazily misusing input for unrelated
things.
>> There might be odd use cases or use cases where multiple device
>> classes are involved for the same type of jacks, which can probably
>> not be covered by the ALSA-only control device, and where some
>
> Like I said in reply to your earlier mail on this subject that's not a
> terribly unusual case for PC class hardware, HDMI is very widely
> deployed and obviously carries both audio and video over a single
> physical connector.
HDMI has no buttons, no virtual ones, no physical ones. :)
HDMI exposes an ALSA interface if audio is involved, and that ALSA can
send events, just as the associated video interface can send
video-related events. That there is a single connector again does not
make a case to use input for it; it's a hot-pluggable audio/video
connection, not an input device.
>> To me the ALSA control device events for audio-related jack events
>> seem like the natural interface and the best option currently. Unless
>> there are good technical reasons not to do that, I'd like to see them
>> made to work, instead of adding support for input devices, waiting for
>> some (magic) meta multimedia device to be defined, or requiring an
>> additional userspace event daemon.
>
> To summarise some of the issues from the rest of the thread:
>
> - This isn't a new interface at all, it's been implemented in the
> kernel for quite some time now (the first switch was added in
> 2.6.18).
Which does not make it correct, or any more tasteful. It's just wrong
to misuse input for subsystem-specific plugging events not related to
input.
Whether these devices should ever be removed, and if so when, is not
something we need to discuss now. But we should not start using them in
_new_ stuff. They were just wrong from the beginning.
> All that's changed here is that PulseAudio is trying to
> use the information that the kernel is exporting.
Right, and we don't want to use mis-designed interfaces if we don't
need to. And as we are not in a hurry and have the option of a native
ALSA control interface, we should focus on that.
> - Mixed function jacks aren't that unusual; if anything they're more
> common on PCs than anything else. HDMI is the obvious one for PC
> class hardware and where these things happen we do need to be able to
> join the audio and the video up with each other.
Sure. But again, HDMI has no buttons or input at all. This is about
hot-plugging of audio/video devices. Just let ALSA _and_ the video
device send the event.
> - Many jacks include buttons so need an input device anyway which again
> we want to join up with the other functions.
I disagree. They are states for the associated class of devices; they
are not sensible input devices today and probably never really will be
such.
> - We still need to handle cases where the jack isn't particularly
> closely connected to the audio subsystem.
Sure, that might need to be solved by some mapping, but that still makes no
case to misuse input. In many modern connector cases the jack senses
are not even buttons anymore, and they should not become buttons for
userspace.
> - There are other non-audio non-jack things being exposed via the same
> interface which we need to have some method for handling even if we
> end up doing something audio specific for some subset of jacks.
>
> and probably some others I forget.
Sure, they all might need to rethink their use of input, and ideally
stop doing that. I expect most of them are not right in doing what
they do.
Thanks,
Kay
* Re: Jack event API - decision needed
2011-06-22 13:55 ` Kay Sievers
@ 2011-06-22 15:11 ` Mark Brown
2011-06-22 21:41 ` Dmitry Torokhov
0 siblings, 1 reply; 57+ messages in thread
From: Mark Brown @ 2011-06-22 15:11 UTC (permalink / raw)
To: Kay Sievers
Cc: Takashi Iwai, ALSA Development Mailing List, Dmitry Torokhov,
Lennart Poettering, David Henningsson
On Wed, Jun 22, 2011 at 03:55:46PM +0200, Kay Sievers wrote:
> On Wed, Jun 22, 2011 at 15:25, Mark Brown
> > It would be really helpful if you could engage with some of the discussion
> > about this... Apart from anything else "fake" seems a bit misleading;
> > a jack is a physical thing which really exists, which I can point to
> > on the system, and which in many cases has buttons on it.
> We should stop misusing input for anything that isn't a real user
> input device like a keyboard button, mouse, joystick, screen, or
> tablet-like interface. Otherwise, I fear, with that logic anything with a
> physical state change could become an input device. USB would then
> make a good case too, to have all ports as individual input devices,
> telling us if anything is plugged in or out, and so on ...
The idea of reporting presence status for USB ports doesn't seem
obviously bad.
> The kernel's input just has some sensible event relay interface, that
> should not be the reason to put everything that has a state and can
> change its state as an input device. People should come up with their
> own event channel instead of just lazily misuse input for unrelated
> things.
So if what you want to do is create a new API for this sort of switch,
rather than do some audio-specific thing like you keep saying, I guess we
could merge some version of the switch API that the Android guys wrote
because they didn't notice the existing functionality. Currently it'd
need quite a bit of work from what I remember (it's very basic), and
there's going to be a transition period while we move all or most of the
various existing users of switches out of the input API.
One can, of course, flip this around and say that what we should really
be doing is factoring the event delivery mechanism out of the input
layer and into a generic subsystem where it can be reused so we don't
have to reinvent the wheel. The transition issues do apply there too
but probably less so and it seems like we'd end up with less redundant
code.
If we did one of those two things we would still need to deal with all
the same permission management issues that we've got for input devices
at some point.
> HDMI exposes an ALSA interface if audio is involved, and that ALSA can
> send events, just as the associated video interface can send video
> related events. That there is a single connector does again not make a
> case to use input for it; it's a hot-pluggable audio/video
> connection, not an input device.
There are two orthogonal issues here which you keep confusing.
One is that you keep saying to add a new audio-specific API to replace
the existing API. This seems like a clear loss to me: we're currently
able to represent multi-function jacks as a single object to the
application layer, and we'd lose that ability. That seems like a
failure.
The other is that you don't like the fact that the input layer has been
used for this stuff. This seems like a more valid point (the main
issues are implementation and deployment), but you keep suggesting an
audio-specific solution.
> > - This isn't a new interface at all, it's been implemented in the
> > kernel for quite some time now (the first switch was added in
> > 2.6.18).
> Which does not make it correct, or any more tasteful. It's just wrong
> to misuse input for subsystem specific plugging events not related to
> input.
You keep saying subsystem-specific here; like I keep on saying, I don't
think that reflects the reality of the hardware we've got currently.
> If these devices should be removed ever, or at what time if so, is
> nothing we need to discuss now. But we should not start using them in
> _new_ stuff. They are just wrong from the beginning.
Well, deciding that we want to ditch the current API doesn't really help
people like David who are trying to use the current kernel - userspace
has a lot of the information right now, it's just having a hard time
figuring out how to talk to itself. That's a third mostly orthogonal
problem.
> > All that's changed here is that PulseAudio is trying to
> > use the information that the kernel is exporting.
> Right, and we don't want to use mis-designed interfaces if we don't
> need to. And as we are not in a hurry and have the option of a native
> ALSA control interface, we should focus on that.
I don't think it makes any sense to push for an audio-specific solution.
> > - Mixed function jacks aren't that unusual; if anything they're more
> > common on PCs than anything else. HDMI is the obvious one for PC
> > class hardware and where these things happen we do need to be able to
> > join the audio and the video up with each other.
> Sure. But again, HDMI has in no way any button or input. This is about
> hot-plugging of audio/video devices. Just let ALSA _and_ the video
> device send the event.
I'm not so sure about that; I believe there's some mechanism for
propagating events through HDMI links for things like volume changes,
which we don't currently support, and I'd not be surprised if they
looked like buttons. Besides, if you don't like HDMI as an example there's
also four pole analogue jacks that can do video, audio and buttons. I'm
mostly just mentioning HDMI because it's a blatantly obvious example
seen on PCs.
Besides, userspace needs some way to figure out that the two events are
connected and that they're also connected with anything else that
appeared as part of the same accessory detection.
> > - Many jacks include buttons so need an input device anyway which again
> > we want to join up with the other functions.
> I disagree. They are states for the associated class of devices, they
> are not sensible input devices today and probably will never really be
> such.
Please read what I wrote, this is for *buttons*. For example, I'm
sitting here with a headset that has play/pause, rewind and fast forward
buttons on it. Pretty much every headset used with phones has at least
one such button.
> > - We still need to handle cases where the jack isn't particularly
> > closely connected to the audio subsystem.
> Sure, that might need to be solved by some mapping, still makes no
So again, why the push to come up with an audio-specific solution? We
can see right now that it's got some issues with current hardware.
> case to misuse input. In many modern connector cases the jack senses
> are not even buttons anymore, and they should not become buttons for
> userspace.
I do have some passing familiarity with the physical implementations,
thanks. Note that the presence detection stuff is all reported as
*switches* rather than buttons.
* Re: Jack event API - decision needed
2011-06-22 13:25 ` Mark Brown
2011-06-22 13:55 ` Kay Sievers
@ 2011-06-22 21:01 ` Lennart Poettering
2011-06-22 21:57 ` Stephen Warren
2011-06-23 1:10 ` Mark Brown
1 sibling, 2 replies; 57+ messages in thread
From: Lennart Poettering @ 2011-06-22 21:01 UTC (permalink / raw)
To: Mark Brown
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
David Henningsson
On Wed, 22.06.11 14:25, Mark Brown (broonie@opensource.wolfsonmicro.com) wrote:
>
> On Wed, Jun 22, 2011 at 02:50:59PM +0200, Kay Sievers wrote:
>
> > I'm still pretty sure that there should be no fake input devices
> > involved for any jack related things. Especially not in the simple and
> > common desktop case, where the input devices are created by, and
> > direct child devices of the ALSA card device (like Intel HDA). I'm
> > sure such things just belong to the ALSA control device directly.
>
> It would be really helpful if you could engage with some of the discussion
> about this... Apart from anything else "fake" seems a bit misleading;
> a jack is a physical thing which really exists, which I can point to
> on the system, and which in many cases has buttons on it.
I fully agree with Kay here btw.
Input devices should really just be used for input, i.e. interfacing
with humans via keyboards, mice, joysticks, touchscreens and
suchlike. However, jack sensing does not qualify as such. The main reason
why input devices exist to do jack sensing right now appears to be that
there was no other infrastructure readily available for it. But that is
a poor excuse, which might be acceptable early in the game where things
aren't really clear yet, but not for the long run. On top of that there
actually always has been a more appropriate place for signalling audio
jack sensing events: the ALSA control layer. Because that is what jack
sensing ultimately is: audio control. And if you have it in audio
control this has many benefits: for example, it's in the same namespace
as the other audio controls (a very weakly defined namespace, but still
the same namespace), so you could, by employing some rules, match up mute
and volume controls with their matching jacks.
On top of that Takashi has patches ready to port HDA over to jack
sensing via control devices, and hence I believe this is definitely the
way to go.
Jack sensing for video plugs should similarly be handled by the video
layer -- and actually is handled like that for VGA hotplug already.
HDMI is a composite technology, but I don't think this should be used as
an excuse to implement jack sensing via an independent subsystem. If it is
composite it should simply use the native sensing interfaces for its
technologies, i.e. send plug notifications via both video and audio
control, the way they want.
There are a number of other non-input technologies that have been bolted
into the input subsystem, but that is really not an excuse to bolt even
more into it.
> To summarise some of the issues from the rest of the thread:
>
> - This isn't a new interface at all, it's been implemented in the
> kernel for quite some time now (the first switch was added in
> 2.6.18). All that's changed here is that PulseAudio is trying to
> use the information that the kernel is exporting.
It's not a new interface, but it will be used for the first time in a
big way that will find its way into generic distros. That means it is
still time to get this right before it is too late.
You should also not forget that having a fucked up kernel interface
usually means a lot of additional work to work around that in userspace:
i.e. in PA we need some way to match up input devices with alsa control
devices and specific controls. That is messy and requires knowledge and
logic that would be much, much simpler if we could just access jack
sensing via the same interface as the rest of the audio control stuff we do.
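[Editor's note: to illustrate the kind of matching logic userspace would
need with the input-device approach, one heuristic is to treat an input
device and an ALSA control device as related when their sysfs DEVPATHs
share an ancestor, e.g. the same PCI function. A minimal sketch; the
example paths are hypothetical:]

```python
def common_parent(devpath_a, devpath_b):
    """Longest shared ancestor of two sysfs DEVPATHs. A userspace
    matcher could consider two devices related when this ancestor is,
    say, the same PCI function."""
    a = devpath_a.strip("/").split("/")
    b = devpath_b.strip("/").split("/")
    common = []
    for x, y in zip(a, b):
        if x != y:
            break
        common.append(x)
    return "/" + "/".join(common)

# Hypothetical HDA layout: control device and jack input device
# hanging off the same PCI audio function.
parent = common_parent(
    "/devices/pci0000:00/0000:00:1b.0/sound/card0/controlC0",
    "/devices/pci0000:00/0000:00:1b.0/input/input7/event7")
print(parent)  # -> /devices/pci0000:00/0000:00:1b.0
```

This is exactly the extra knowledge Lennart is objecting to: with jack
events on the control device itself, no such cross-subsystem matching
would be needed.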
> - Mixed function jacks aren't that unusual; if anything they're more
> common on PCs than anything else. HDMI is the obvious one for PC
> class hardware and where these things happen we do need to be able to
> join the audio and the video up with each other.
As mentioned, this is not really an excuse, as nothing stops you from
sending out plug events via both the audio and the video subsystems.
> - Many jacks include buttons so need an input device anyway which again
> we want to join up with the other functions.
Well, if you are speaking of headsets here with specific Play and Stop
buttons then I believe those actually qualify as proper input keys, and
it is the right thing to route them via the input layer. And if we do,
X, GNOME and Rhythmbox will actually already handle them as they should.
> - We still need to handle cases where the jack isn't particularly
> closely connected to the audio subsystem.
Well, I am tempted to say that this should be handled in the kernel. I am
not convinced that it is a good idea to expose particularities of the hw
design too much in userspace. In fact, I agree with David that the
thinkpad laptop driver would best integrate directly with HDA so that in
userspace no knowledge about their relation would be necessary.
> - There are other non-audio non-jack things being exposed via the same
> interface which we need to have some method for handling even if we
> end up doing something audio specific for some subset of jacks.
This exists for VGA hotplug at least. I see no reason why this shouldn't
be available for HDMI as well.
Lennart
--
Lennart Poettering - Red Hat, Inc.
* Re: Jack event API - decision needed
2011-06-22 15:11 ` Mark Brown
@ 2011-06-22 21:41 ` Dmitry Torokhov
2011-06-23 0:15 ` Mark Brown
0 siblings, 1 reply; 57+ messages in thread
From: Dmitry Torokhov @ 2011-06-22 21:41 UTC (permalink / raw)
To: Mark Brown
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Lennart Poettering, David Henningsson
On Wed, Jun 22, 2011 at 04:11:30PM +0100, Mark Brown wrote:
> On Wed, Jun 22, 2011 at 03:55:46PM +0200, Kay Sievers wrote:
> > On Wed, Jun 22, 2011 at 15:25, Mark Brown
>
> > > It would be really helpful if you could engage with some of the discussion
> > > about this... Apart from anything else "fake" seems a bit misleading;
> > > a jack is a physical thing which really exists, which I can point to
> > > on the system, and which in many cases has buttons on it.
>
> > We should stop misusing input for anything that isn't a real user
> > input device like a keyboard button, mouse, joystick, screen, or
> > tablet-like interface. Otherwise, I fear, with that logic anything with a
> > physical state change could become an input device. USB would then
> > make a good case too, to have all ports as individual input devices,
> > telling us if anything is plugged in or out, and so on ...
>
> The idea of reporting presence status for USB ports doesn't seem
> obviously bad.
And so is reporting whether a network cable is plugged in. That does
not mean it should be routed through input, though.
>
> > The kernel's input just has some sensible event relay interface, that
> > should not be the reason to put everything that has a state and can
> > change its state as an input device. People should come up with their
> > own event channel instead of just lazily misuse input for unrelated
> > things.
>
> So if what you want to do is create a new API for this sort of switch,
> rather than do some audio-specific thing like you keep saying, I guess we
> could merge some version of the switch API that the Android guys wrote
> because they didn't notice the existing functionality. Currently it'd
> need quite a bit of work from what I remember (it's very basic), and
> there's going to be a transition period while we move all or most of the
> various existing users of switches out of the input API.
I do not think you want to have a switch API; it is more a 'connection'
API that is needed (i.e. we need a way to report and query whether
something is connected to something else or not).
>
> One can, of course, flip this around and say that what we should really
> be doing is factoring the event delivery mechanism out of the input
> layer and into a generic subsystem where it can be reused so we don't
> have to reinvent the wheel. The transition issues do apply there too
> but probably less so and it seems like we'd end up with less redundant
> code.
>
> If we did one of those two things we would still need to deal with all
> the same permission management issues that we've got for input devices
> at some point.
Hmm, the connection changes should normally be infrequent; maybe
delivering them over netlink in the same fashion we deliver uevents
would make sense. In fact, maybe uevents are all that is needed here.
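[Editor's note: for reference, a kernel uevent as delivered over netlink
is an "ACTION@DEVPATH" header followed by NUL-separated KEY=VALUE
properties. A minimal parser sketch; the sample message below is a
hypothetical illustration, not captured from a real device:]

```python
def parse_uevent(raw):
    """Split a kernel uevent netlink message into its action, devpath
    and property dictionary. The wire format is 'ACTION@DEVPATH' then
    NUL-separated KEY=VALUE pairs."""
    parts = raw.split(b"\0")
    header = parts[0].decode()
    action, _, devpath = header.partition("@")
    props = {}
    for p in parts[1:]:
        if b"=" in p:
            k, _, v = p.partition(b"=")
            props[k.decode()] = v.decode()
    return action, devpath, props

# Hypothetical "jack state changed" event on a sound card device.
msg = (b"change@/devices/pci0000:00/0000:00:1b.0/sound/card0\0"
       b"ACTION=change\0SUBSYSTEM=sound\0")
print(parse_uevent(msg)[0])  # -> change
```

A scheme like this would reuse the existing uevent plumbing rather than
adding a new event channel, at the cost of uevents being unordered and
relatively heavyweight for frequent state changes.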
>
> > HDMI exposes an ALSA interface if audio is involved, and that ALSA can
> > send events, just as the associated video interface can send video
> > related events. That there is a single connector does again not make a
> > case to use input for it; it's a hot-pluggable audio/video
> > connection, not an input device.
>
> There's two orthogonal issues here which you keep confusing.
>
> One is that you keep saying to add a new audio-specific API to replace
> the existing API. This seems like a clear loss to me: we're currently
> able to represent multi-function jacks as a single object to the
> application layer, and we'd lose that ability. That seems like a
> failure.
>
> The other is that you don't like the fact that the input layer has been
> used for this stuff. This seems like a more valid point (the main
> issues are implementation and deployment), but you keep suggesting an
> audio-specific solution.
>
> > > - This isn't a new interface at all, it's been implemented in the
> > > kernel for quite some time now (the first switch was added in
> > > 2.6.18).
>
> > Which does not make it correct, or any more tasteful. It's just wrong
> > to misuse input for subsystem specific plugging events not related to
> > input.
>
> You keep saying subsystem-specific here; like I keep on saying, I don't
> think that reflects the reality of the hardware we've got currently.
>
> > If these devices should be removed ever, or at what time if so, is
> > nothing we need to discuss now. But we should not start using them in
> > _new_ stuff. They are just wrong from the beginning.
>
> Well, deciding that we want to ditch the current API doesn't really help
> people like David who are trying to use the current kernel - userspace
> has a lot of the information right now, it's just having a hard time
> figuring out how to talk to itself. That's a third mostly orthogonal
> problem.
>
> > > All that's changed here is that PulseAudio is trying to
> > > use the information that the kernel is exporting.
>
> > Right, and we don't want to use mis-designed interfaces if we don't
> > need to. And as we are not in a hurry and have the option of a native
> > ALSA control interface, we should focus on that.
>
> I don't think it makes any sense to push for an audio-specific solution.
>
> > > - Mixed function jacks aren't that unusual; if anything they're more
> > > common on PCs than anything else. HDMI is the obvious one for PC
> > > class hardware and where these things happen we do need to be able to
> > > join the audio and the video up with each other.
>
> > Sure. But again, HDMI has in no way any button or input. This is about
> > hot-plugging of audio/video devices. Just let ALSA _and_ the video
> > device send the event.
>
> I'm not so sure about that, I believe there's some mechanism for
> propagating events through HDMI links for things like volume changes
> which we don't currently support, and I'd not be surprised if they
> looked like buttons. Besides, if you don't like HDMI as an example there's
> also four pole analogue jacks that can do video, audio and buttons. I'm
> mostly just mentioning HDMI because it's a blatantly obvious example
> seen on PCs.
>
> Besides, userspace needs some way to figure out that the two events are
> connected and that they're also connected with anything else that
> appeared as part of the same accessory detection.
>
> > > - Many jacks include buttons so need an input device anyway which again
> > > we want to join up with the other functions.
>
> > I disagree. They are states for the associated class of devices, they
> > are not sensible input devices today and probably will never really be
> > such.
>
> Please read what I wrote, this is for *buttons*. For example, I'm
> sitting here with a headset that has play/pause, rewind and fast forward
> buttons on it. Pretty much every headset used with phones has at least
> one such button.
>
> > > - We still need to handle cases where the jack isn't particularly
> > > closely connected to the audio subsystem.
>
> > Sure, that might need to be solved by some mapping, still makes no
>
> So again why the push to come up with an audio specific solution? We
> can see right now that it's got some issues with current hardware.
>
> > case to misuse input. In many modern connector cases the jack senses
> > are not even buttons anymore, and they should not become buttons for
> > userspace.
>
> I do have some passing familiarity with the physical implementations,
> thanks. Note that the presence detection stuff is all reported as
> *switches* rather than buttons.
Buttons or switches do not really make much difference. They are not
really HID devices (unless there are physical switches that can be
toggled by the user while the device is still plugged into a socket).
--
Dmitry
^ permalink raw reply [flat|nested] 57+ messages in thread
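For readers following the switch-vs-button distinction above: the input-layer reporting under discussion delivers jack state as EV_SW events, with codes such as SW_HEADPHONE_INSERT. A minimal sketch of decoding such events follows; the constants are from linux/input.h, and a synthetic buffer stands in for a live /dev/input/eventN node:

```python
import struct

# Switch codes from linux/input.h that relate to jack/connector state.
EV_SW = 0x05
JACK_SWITCHES = {
    0x02: "SW_HEADPHONE_INSERT",
    0x04: "SW_MICROPHONE_INSERT",
    0x05: "SW_DOCK",
    0x06: "SW_LINEOUT_INSERT",
    0x07: "SW_JACK_PHYSICAL_INSERT",
    0x08: "SW_VIDEOOUT_INSERT",
}

# struct input_event: struct timeval (two native longs), then
# __u16 type, __u16 code, __s32 value.
EVENT_FMT = "llHHi"
EVENT_SIZE = struct.calcsize(EVENT_FMT)

def decode_jack_events(buf):
    """Yield (switch_name, plugged) for every EV_SW jack event in buf."""
    for off in range(0, len(buf) - EVENT_SIZE + 1, EVENT_SIZE):
        _sec, _usec, etype, code, value = struct.unpack_from(EVENT_FMT, buf, off)
        if etype == EV_SW and code in JACK_SWITCHES:
            yield JACK_SWITCHES[code], bool(value)

# In real use the buffer would come from read()ing the event node;
# here we decode a synthetic "headphone plugged in" event.
event = struct.pack(EVENT_FMT, 0, 0, EV_SW, 0x02, 1)
print(list(decode_jack_events(event)))  # → [('SW_HEADPHONE_INSERT', True)]
```

This is exactly the kind of consumer that would need the permission and match-up work discussed later in the thread.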
* Re: Jack event API - decision needed
2011-06-22 21:01 ` Lennart Poettering
@ 2011-06-22 21:57 ` Stephen Warren
2011-06-23 1:10 ` Mark Brown
1 sibling, 0 replies; 57+ messages in thread
From: Stephen Warren @ 2011-06-22 21:57 UTC (permalink / raw)
To: Lennart Poettering, Mark Brown
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Henningsson, David
Lennart Poettering wrote at Wednesday, June 22, 2011 3:02 PM:
> On Wed, 22.06.11 14:25, Mark Brown (broonie@opensource.wolfsonmicro.com) wrote:
> > On Wed, Jun 22, 2011 at 02:50:59PM +0200, Kay Sievers wrote:
> >
> > > I'm still pretty sure that there should be no fake input devices
> > > involved for any jack related things. Especially not in the simple and
> > > common desktop case, where the input devices are created by, and
> > > direct child devices of the ALSA card device (like Intel HDA). I'm
> > > sure, such things just belong to the ALSA control device directly.
> >
> > It would be really helpful if you could engage with some of the discussion
> > about this... Apart from anything else "fake" seems a bit misleading,
> > a jack is a physical thing which really exists and which I can point to
> > on the system and in many cases has buttons on it.
>
> I fully agree with Kay here btw.
>
> Input devices should really just be used for input, i.e. interfacing
> with humans via keyboards, mice, joysticks, touchscreens and
> suchlike. However, jack sensing does not qualify as such. The main reason
> why input devices exist to do jack sensing right now appears to be that
> there was no other infrastructure readily available for it. But that is
> a poor excuse, which might be acceptable early in the game where things
> aren't really clear yet but not for the long run. On top of that there
> actually always has been a more appropriate place for signalling audio
> jack sensing events: the alsa control layer. Because that is what Jack
> sensing ultimately is: audio control. And if you have it in audio
> control this has many benefits: for example it's in the same namespace
> as the other audio controls (a very weakly defined namespace, but still
> the same namespace), so you could by employing some rules match up mute
> and volume controls with their matching jacks.
>
> On top of that Takashi has patches ready to port HDA over to jack
> sensing via control devices, and hence I believe this is definitely the
> way to go.
>
> Jack sensing for video plugs should similarly be handled by the video
> layer -- and actually is handled like that for VGA hotplug already.
>
> HDMI is a composite technology, but I don't think this should be used as
> an excuse to implement jack sensing via an independent subsystem. If it is
> composite it should simply use the native sensing interfaces for its
> technologies, i.e. send plug notifications via both video and audio
> control, the way they want.
For composite devices such as HDMI, with audio and video "sub ports",
user space needs to know which audio port and which video port are the
same physical port. For example, when displaying UI to select an audio
output device, this information is needed to display an X display ID,
monitor name, etc. as an option, which is much more informative to a
user than some random GPU's ALSA PCM device name/number.
It seems like this correlation would be easier to represent with a
single unified jack model, with "sub ports" on each jack for audio,
video, etc.
However, if the jack information is pushed into separate APIs, can they
please at least have some unified naming for the physical ports, so that
user-space can determine which audio jacks are the same as which video
jacks, etc.?
Thanks!
--
nvpublic
* Re: Jack event API - decision needed
2011-06-22 21:41 ` Dmitry Torokhov
@ 2011-06-23 0:15 ` Mark Brown
2011-06-23 8:42 ` Dmitry Torokhov
0 siblings, 1 reply; 57+ messages in thread
From: Mark Brown @ 2011-06-23 0:15 UTC (permalink / raw)
To: Dmitry Torokhov
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Lennart Poettering, David Henningsson
On Wed, Jun 22, 2011 at 02:41:54PM -0700, Dmitry Torokhov wrote:
> On Wed, Jun 22, 2011 at 04:11:30PM +0100, Mark Brown wrote:
> > The idea of reporting presence status for USB ports doesn't seem
> > obviously bad.
> And so is reporting whether a network cable is plugged in. Does not mean
> that it should be routed through input though.
Like I said in the message you're replying to I don't really disagree
with that. There's a whole section of the mail (which you quoted but I
didn't respond to) where I tried to clarify the several issues here.
Just to reiterate once more it's not the fact that people don't like the
input API, it's the fact that people are proposing solutions which are
obviously less functional than what we've got right now.
> > So if what you want to do is create a new API for this sort of switch
> > rather than do some audio specific thing like you keep saying I guess we
> > could merge some version of the switch API that the Android guys wrote
> I do not think you want to have switch API, it is more a 'connection'
> API that is needed (i.e. we need a way to report and query whether
> something is connected to something else or not).
The reason I called it switch is because that's what both the existing
Android API and the events within input are called. It's also not just
a case of reporting if things are connected, we need to group these
things together into bundles and link them to multiple other devices in
the system.
> > If we did one of those two things we would still need to deal with all
> > the same permission management issues that we've got for input devices
> > at some point.
> Hmm, the connection changes should normally be infrequent; maybe
> delivering them over netlink in the same fashion we deliver uevents
> would make sense. In fact, maybe uevents are all that is needed here.
uevents are what the Android switch API uses. I think that would have
some of the daemon issues that David raised, though?
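For context on what such uevent delivery looks like: kernel uevents arrive as a datagram of the form "action@devpath" followed by NUL-separated KEY=VALUE properties. A sketch of parsing one such datagram; the device path and property values are illustrative, not something this thread defines:

```python
def parse_uevent(payload: bytes):
    """Split a kernel uevent datagram into (action, devpath, properties)."""
    fields = payload.split(b"\0")
    action, _, devpath = fields[0].decode().partition("@")
    props = {}
    for field in fields[1:]:
        if b"=" in field:
            key, _, val = field.partition(b"=")
            props[key.decode()] = val.decode()
    return action, devpath, props

# A synthetic datagram in the wire format the kernel broadcasts over
# NETLINK_KOBJECT_UEVENT; a real listener would recv() this from a
# netlink socket (which is where the permission questions come in).
msg = (b"change@/devices/pci0000:00/0000:00:1b.0/sound/card0\0"
       b"ACTION=change\0SUBSYSTEM=sound\0")
action, devpath, props = parse_uevent(msg)
print(action, props["SUBSYSTEM"])  # → change sound
```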
> > > case to misuse input. In many modern connector cases the jack senses
> > > are not even buttons anymore, and they should not become buttons for
> > > userspace.
> > I do have some passing familiarity with the physical implementations,
> > thanks. Note that the presence detection stuff is all reported as
> > *switches* rather than buttons.
> Buttons or switches do not really make much difference. They are not
> really HID devices (unless there are physical switches that can be
> toggled by the user while the device is still plugged into a socket).
There are such switches on a vast proportion of jacks out there,
typically presented physically as a button, so we really do want input
devices in the mix. To repeat again I'm not saying that they need to be
the only way a jack is represented.
* Re: Jack event API - decision needed
2011-06-22 21:01 ` Lennart Poettering
2011-06-22 21:57 ` Stephen Warren
@ 2011-06-23 1:10 ` Mark Brown
2011-06-23 7:01 ` Clemens Ladisch
2011-06-23 9:49 ` Lennart Poettering
1 sibling, 2 replies; 57+ messages in thread
From: Mark Brown @ 2011-06-23 1:10 UTC (permalink / raw)
To: Lennart Poettering
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
David Henningsson
On Wed, Jun 22, 2011 at 11:01:31PM +0200, Lennart Poettering wrote:
> On Wed, 22.06.11 14:25, Mark Brown (broonie@opensource.wolfsonmicro.com) wrote:
> > On Wed, Jun 22, 2011 at 02:50:59PM +0200, Kay Sievers wrote:
> Input devices should really just be used for input, i.e. interfacing
> with humans via keyboards, mice, joysticks, touchscreens and
Once more: there are several mostly orthogonal issues here which are
getting conflated a bit. The main ones are:
- We need to represent the fact that multiple subsystems are working
with the same jack.
- People don't like the fact that we're currently reporting state via
the input API.
- Even if we invent new interfaces for this we really ought to be able
to teach userspace about existing kernels.
Like Stephen says the first point is vital and it's the major thing I'm
missing with the idea of doing everything per subsystem but it's the one
which you and Kay aren't really addressing. We need some way for
userspace to figure out that a jack exists and that it contains multiple
functionalities from multiple subsystems. If we have a proposal for how
we do that which seems viable then great, but we don't have that yet.
> suchlike. However, jack sensing does not qualify as such. The main reason
> why input devices exist to do jack sensing right now appears to be that
> there was no other infrastructure readily available for it. But that is
No, that's not the case. If you look at the changelogs when this was
added the Zaurus systems that were the main driving force had a bunch of
switches in them, one of which was a microswitch for detecting physical
insertion of jacks, so implementing them as switches was fairly natural.
When I did the more detailed jack support within ALSA and ASoC I looked
at what we were doing at the time, didn't see any massive problems with
it (it probably wouldn't have been my first choice) and just worked with
the existing ABI.
Personally it's not using the input API in particular that I'm concerned
about, it's the fact that the way we're using it at the minute gives us
a single thing to point at in userspace. Although that said some of the
discussion came from the fact that looking at this does make me think we
have several other problems with how we're using and handling input in
both kernel and userspace right now, and perhaps it's best to fix up the
current situation (at least in userspace) even if we change things
later.
> aren't really clear yet but not for the long run. On top of that there
> actually always has been a more appropriate place for signalling audio
> jack sensing events: the alsa control layer. Because that is what Jack
> sensing ultimately is: audio control. And if you have it in audio
No, not at all - it's easy to make this assumption in the basic PC space
(though you do still have to dismiss HDMI) but as soon as you start to
look at a wider range of systems you just start to see issues.
If you've got a jack which only carries audio data then obviously that's
all it's there for but as you start to carry additional functionality
over the one connector it becomes more and more difficult to take that
view.
> On top of that Takashi has patches ready to port HDA over to jack
> sensing via control devices, and hence I believe this is definitely the
> way to go.
I really hope it's not just for HDA, that would again be a loss.
> Jack sensing for video plugs should similarly be handled by the video
> layer -- and actually is handled like that for VGA hotplug already.
Actually one other issue here is that physically the jack sensing is
generally done over the jack as a whole in some way so even if we don't
make it user visible we're going to want some general interface within
the kernel to glue stuff together.
> HDMI is a composite technology, but I don't think this should be used as
> an excuse to implement jack sensing via an independent subsystem. If it is
> composite it should simply use the native sensing interfaces for its
> technologies, i.e. send plug notifications via both video and audio
> control, the way they want.
That's certainly simpler for programs which only work within a single
subsystem but as soon as you care about the system as a whole it becomes
less clear, especially given that the proposals to do subsystem specific
things haven't included any suggestion about how userspace should work
out that the different objects are connected to each other.
One of the use cases I thought about with this was that if you're on a
phone call with headset and simultaneously playing a movie via HDMI it's
going to be important to make sure that you work out which jack any
button press events came from in order to decide if you should pause the
movie or hang up the call.
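The per-jack dispatch this use case needs could be sketched as follows, once userspace can tell which jack an input device belongs to; every name below is hypothetical:

```python
def route_button(jack_of_input, action_for_jack, input_dev):
    """Dispatch a button press to the action bound to the jack it
    physically came from, rather than treating it system-wide."""
    jack = jack_of_input[input_dev]          # e.g. "headset" or "hdmi0"
    return action_for_jack.get(jack, "ignore")

# Hypothetical state: the headset jack carries a phone call, the HDMI
# jack carries movie playback.
jack_of_input = {"event3": "headset", "event4": "hdmi0"}
action_for_jack = {"headset": "hang-up-call", "hdmi0": "pause-movie"}
print(route_button(jack_of_input, action_for_jack, "event3"))  # → hang-up-call
```

The hard part the thread is debating is precisely how userspace would populate the `jack_of_input` mapping.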
> > To summarise some of the issues from the rest of the thread:
> > - This isn't a new interface at all, it's been implemented in the
> > kernel for quite some time now (the first switch was added in
> > 2.6.18). All that's changed here is that PulseAudio is trying to
> > use the information that the kernel is exporting.
> It's not a new interface, but it will be used for the first time in a
> big way that will find its way into generic distros. That means it is
> still time to get this right before this is too late.
Generic *desktop* distros. There's also the issue I see with other
non-jack but similar functionality which is currently made available via
the same mechanism - the question which I was asking earlier was how
things currently work for them or if we're just happening to look at
this one problem first.
> You should also not forget that having a fucked up kernel interface
> usually means a lot of additional work to work around that in userspace:
> i.e. in PA we need some way to match up input devices with alsa control
> devices and specific controls. That is messy, requires knowledge and
> logic that would be much much simpler if we could just access jack
> sensing via the same interface as the rest of the audio control stuff we do.
Right, but hopefully this means you can appreciate the concerns that I
and Stephen have expressed about being able to join the different bits
of the jack up with each other for programs which do care about things
over more than one subsystem?
> > - Many jacks include buttons so need an input device anyway which again
> > we want to join up with the other functions.
> Well, if you are speaking of headsets here with specific Play and Stop
> buttons then I believe those actually qualify as proper input keys, and
> it is the right thing to route them via the input layer. And if we do X
> and GNOME and Rhythmbox will actually already handle them as they should.
Yes, they do - on a system wide basis.
> > - We still need to handle cases where the jack isn't particularly
> > closely connected to the audio subsystem.
> Well, I am tempted to say that this should be handled in kernel. I am
> not convinced that it is a good idea to expose particularities of the hw
> design too much in userspace. In fact, I agree with David that the
> thinkpad laptop driver would best integrate directly with HDA so that in
> userspace no knowledge about their relation would be necessary.
Yeah, that's just one example - I was actually thinking of the sort of
large multi-function connectors in the style of docking stations you can
get (and I guess docking stations themselves) that we need to get to play
together.
> > - There are other non-audio non-jack things being exposed via the same
> > interface which we need to have some method for handling even if we
> > end up doing something audio specific for some subset of jacks.
> This exists for VGA hotplug at least. I see no reason why this shouldn't
> be available for HDMI as well.
That's not my point - I'm talking about things like the docking station
or front proximity detection that are clearly not at all related to
jacks but are going out via the same interface. If jacks don't
currently work in the application layer due to the way input is handled
then like I said above it looks like we have a bunch of other issues we
also need to cope with.
* Re: Jack event API - decision needed
2011-06-23 1:10 ` Mark Brown
@ 2011-06-23 7:01 ` Clemens Ladisch
2011-06-23 7:24 ` Takashi Iwai
2011-06-23 9:49 ` Lennart Poettering
1 sibling, 1 reply; 57+ messages in thread
From: Clemens Ladisch @ 2011-06-23 7:01 UTC (permalink / raw)
To: Mark Brown
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
David Henningsson, Lennart Poettering
Mark Brown wrote:
> We need some way for userspace to figure out that a jack exists and
> that it contains multiple functionalities from multiple subsystems.
> If we have a proposal for how we do that which seems viable then
> great, but we don't have that yet.
Last November I posted a patch to add ALSA entity types to the media
API; it stalled due to lack of time. What's still missing is
a mechanism to associate entities with ALSA controls.
Regards,
Clemens
* Re: Jack event API - decision needed
2011-06-23 7:01 ` Clemens Ladisch
@ 2011-06-23 7:24 ` Takashi Iwai
0 siblings, 0 replies; 57+ messages in thread
From: Takashi Iwai @ 2011-06-23 7:24 UTC (permalink / raw)
To: Clemens Ladisch
Cc: ALSA Development Mailing List, Mark Brown, Kay Sievers,
Lennart Poettering, David Henningsson
At Thu, 23 Jun 2011 09:01:33 +0200,
Clemens Ladisch wrote:
>
> Mark Brown wrote:
> > We need some way for userspace to figure out that a jack exists and
> > that it contains multiple functionalities from multiple subsystems.
> > If we have a proposal for how we do that which seems viable then
> > great, but we don't have that yet.
>
> Last November I posted a patch to add ALSA entity types to the media
> API; it stalled due to lack of time. What's still missing is
> a mechanism to associate entities with ALSA controls.
The association of ALSA stuff is the biggest obstacle, IMHO.
The implementation of the notification mechanism itself is trivial, no
matter what way is used. But creating the association between jacks
and various components (PCM, control elements, whatever) is new ground.
Such information is completely missing in the current driver, thus it
must be added from scratch.
Basically this was the reason I didn't go further for the jack
notification implementation with ALSA control API (besides the lack of
time). Which information should be coupled, and how -- this
has to be discussed in detail beforehand.
thanks,
Takashi
* Re: Jack event API - decision needed
2011-06-23 0:15 ` Mark Brown
@ 2011-06-23 8:42 ` Dmitry Torokhov
2011-06-23 10:47 ` Mark Brown
0 siblings, 1 reply; 57+ messages in thread
From: Dmitry Torokhov @ 2011-06-23 8:42 UTC (permalink / raw)
To: Mark Brown
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Lennart Poettering, David Henningsson
On Thu, Jun 23, 2011 at 01:15:01AM +0100, Mark Brown wrote:
> On Wed, Jun 22, 2011 at 02:41:54PM -0700, Dmitry Torokhov wrote:
> > On Wed, Jun 22, 2011 at 04:11:30PM +0100, Mark Brown wrote:
>
> > > The idea of reporting presence status for USB ports doesn't seem
> > > obviously bad.
>
> > And so is reporting whether a network cable is plugged in. Does not mean
> > that it should be routed through input though.
>
> Like I said in the message you're replying to I don't really disagree
> with that. There's a whole section of the mail (which you quoted but I
> didn't respond to)
That is because I mostly agree with you saying that it should be a
generic API, not an alsa- or other subsystem-specific solution.
> where I tried to clarify the several issues here.
> Just to reiterate once more it's not the fact that people don't like the
> input API, it's the fact that people are proposing solutions which are
> obviously less functional than what we've got right now.
>
> > > So if what you want to do is create a new API for this sort of switch
> > > rather than do some audio specific thing like you keep saying I guess we
> > > could merge some version of the switch API that the Android guys wrote
>
> > I do not think you want to have switch API, it is more a 'connection'
> > API that is needed (i.e. we need a way to report and query whether
> > something is connected to something else or not).
>
> The reason I called it switch is because that's what both the existing
> Android API and the events within input are called.
And I think we should stop calling them switches, if only to avoid
confusion. The input layer originally had a notion of switches representing
physical objects that can be toggled on and off and retain their state
(unlike a button, which is expected to auto-release). Later jacks were
added as "switches" (which is something I never liked but ended up adding
since we did not have anything better at the time - still don't).
> It's also not just
> a case of reporting if things are connected, we need to group these
> things together into bundles and link them to multiple other devices in
> the system.
Hmm, I am not sure I follow. We usually need to classify and group
devices; there is nothing new here...
>
> > > If we did one of those two things we would still need to deal with all
> > > the same permission management issues that we've got for input devices
> > > at some point.
>
> > Hmm, the connection changes should normally be infrequent; maybe
> > delivering them over netlink in the same fashion we deliver uevents
> > would make sense. In fact, maybe uevents are all that is needed here.
>
> uevents are what the Android switch API uses. I think that would have
> some of the daemon issues that David raised, though?
I'll have to locate original email; I am not subscribed to alsa list.
>
> > > > case to misuse input. In many modern connector cases the jack senses
> > > > are not even buttons anymore, and they should not become buttons for
> > > > userspace.
>
> > > I do have some passing familiarity with the physical implementations,
> > > thanks. Note that the presence detection stuff is all reported as
> > > *switches* rather than buttons.
>
> > Buttons or switches do not really make much difference. They are not
> > really HID devices (unless there are physical switches that can be
> > toggled by the user while the device is still plugged into a socket).
>
> There are such switches on a vast proportion of jacks out there,
> typically presented physically as a button, so we really do want input
> devices in the mix. To repeat again I'm not saying that they need to be
> the only way a jack is represented.
And I agree here as well.
Thanks.
--
Dmitry
* Re: Jack event API - decision needed
2011-06-23 1:10 ` Mark Brown
2011-06-23 7:01 ` Clemens Ladisch
@ 2011-06-23 9:49 ` Lennart Poettering
2011-06-23 11:43 ` Mark Brown
2011-06-23 15:32 ` Stephen Warren
1 sibling, 2 replies; 57+ messages in thread
From: Lennart Poettering @ 2011-06-23 9:49 UTC (permalink / raw)
To: Mark Brown
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
David Henningsson
On Thu, 23.06.11 02:10, Mark Brown (broonie@opensource.wolfsonmicro.com) wrote:
>
> On Wed, Jun 22, 2011 at 11:01:31PM +0200, Lennart Poettering wrote:
> > On Wed, 22.06.11 14:25, Mark Brown (broonie@opensource.wolfsonmicro.com) wrote:
> > > On Wed, Jun 22, 2011 at 02:50:59PM +0200, Kay Sievers wrote:
>
> > Input devices should really just be used for input, i.e. interfacing
> > with humans via keyboards, mice, joysticks, touchscreens and
>
> Once more: there are several mostly orthogonal issues here which are
> getting conflated a bit. The main ones are:
>
> - We need to represent the fact that multiple subsystems are working
> with the same jack.
Tbh I am not entirely convinced that this is really such an important
thing. We can't even map audio controls to PCM devices, which would be
vastly more interesting.
I can give you a thousand real-life use cases for wanting to match up
PCM devices with controls, but the one for wanting to match up HDMI
audio with HDMI video is much weaker, since machines usually have
multiple PCM streams, but not multiple HDMI ports.
I am not saying that such a match-up shouldn't be possible, but I'd say
it could be relatively easy to implement. One option would be to go by
names, i.e. simply say that if an alsa control device is called "HDMI1
Jack Sensing", and an X11 XRANDR port is called "HDMI1", then they
should be the same, and apps should compare the first words of these
names, and that both can be traced to the same udev originating device.
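The first-word match-up proposed here is simple enough to sketch; note that the control name "HDMI1 Jack Sensing" is the hypothetical convention from this mail, not an existing ALSA control name:

```python
def same_port(alsa_ctl_name: str, xrandr_output: str) -> bool:
    """Heuristic match-up by naming convention: compare the first word
    of an ALSA control name ("HDMI1 Jack Sensing") against an X11
    XRANDR output name ("HDMI1")."""
    words = alsa_ctl_name.split()
    return bool(words) and words[0] == xrandr_output

print(same_port("HDMI1 Jack Sensing", "HDMI1"))  # → True
print(same_port("HDMI1 Jack Sensing", "HDMI2"))  # → False
```

As the mail notes, a real implementation would additionally confirm that both names trace back to the same originating udev device, since name collisions across unrelated cards are possible.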
In fact, something similar is now already in place for external usb
speakers with built-in volume keys to map X11 XInput keypresses up to PA
audio devices so that we can map volume keypresses to the appropriate
audio device in gnome-settings-daemon (note that the latter is not
making use of this yet, as the infrastructure is still very new): for
each keypress we can find the originating XInput device, from that we
can query the kernel device, which we can then find in sysfs. Similarly
we can find the sysfs device from PA and then do a match-up.
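The sysfs match-up described above boils down to finding a shared ancestor device for two kernel devices. A sketch under the assumption that both sysfs paths hang off one physical device; the USB paths are hypothetical:

```python
def common_device(sysfs_path_a: str, sysfs_path_b: str) -> str:
    """Longest shared ancestor of two sysfs device paths. If an input
    device and a sound card both hang off the same physical device,
    that ancestor identifies the device they share."""
    a = sysfs_path_a.rstrip("/").split("/")
    b = sysfs_path_b.rstrip("/").split("/")
    shared = []
    for x, y in zip(a, b):
        if x != y:
            break
        shared.append(x)
    return "/".join(shared)

# Hypothetical paths for a USB speaker exposing both a sound card and
# a volume-key input device under one USB interface.
usb = "/sys/devices/pci0000:00/0000:00:1d.0/usb2/2-1/2-1:1.0"
print(common_device(usb + "/sound/card1", usb + "/input/input7") == usb)  # → True
```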
So, what I am saying here is that doing this cross-subsystem match-up in
other areas is already possible. It's not beautiful, but it works.
If you are thinking about matching up devices across subsystems, you
should not limit your focus to jack sensing events only: the above
mentioned key-press use is a lot more interesting -- and is solved.
Also, I'd claim that adding an additional subsystem that covers only
jack sensing would complicate things even further, since now you have to
match up audio, video and input devices with this jack sensing device,
instead of just matching them up directly.
> - Even if we invent new interfaces for this we really ought to be able
> to teach userspace about existing kernels.
See, I don't buy this actually. It wouldn't be a regression if we don't
support the input device based scheme in PA.
> > > - There are other non-audio non-jack things being exposed via the same
> > > interface which we need to have some method for handling even if we
> > > end up doing something audio specific for some subset of jacks.
>
> > This exists for VGA hotplug at least. I see no reason why this shouldn't
> > be available for HDMI as well.
>
> That's not my point - I'm talking about things like the docking station
> or front proximity detection that are clearly not at all related to
> jacks but are going out via the same interface. If jacks don't
> currently work in the application layer due to the way input is handled
> then like I said above it looks like we have a bunch of other issues we
> also need to cope with.
I think bolting proximity detection and docking station stuff into the
input layer is very wrong too. Both deserve probably their own kind of
class devices.
Lennart
--
Lennart Poettering - Red Hat, Inc.
* Re: Jack event API - decision needed
2011-06-23 8:42 ` Dmitry Torokhov
@ 2011-06-23 10:47 ` Mark Brown
0 siblings, 0 replies; 57+ messages in thread
From: Mark Brown @ 2011-06-23 10:47 UTC (permalink / raw)
To: Dmitry Torokhov
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Lennart Poettering, David Henningsson
On Thu, Jun 23, 2011 at 01:42:36AM -0700, Dmitry Torokhov wrote:
> On Thu, Jun 23, 2011 at 01:15:01AM +0100, Mark Brown wrote:
> > a case of reporting if things are connected, we need to group these
> > things together into bundles and link them to multiple other devices in
> > the system.
> Hmm, I am not sure I follow. We usually need to classify and group
> devices; there is nothing new here...
Usually we do this per device type and by control path but what we need
to do here is say that a given jack has audio, video and input
functionality - it's a different grouping to our normal ones.
> > uevents are what the Android switch API uses. I think that would have
> > some of the daemon issues that David raised, though?
> I'll have to locate original email; I am not subscribed to alsa list.
The issue was that he would really prefer PulseAudio (which runs as a
regular user) to be able to read the events directly.
* Re: Jack event API - decision needed
2011-06-23 9:49 ` Lennart Poettering
@ 2011-06-23 11:43 ` Mark Brown
2011-06-23 15:32 ` Stephen Warren
1 sibling, 0 replies; 57+ messages in thread
From: Mark Brown @ 2011-06-23 11:43 UTC (permalink / raw)
To: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
David Henningsson
On Thu, Jun 23, 2011 at 11:49:25AM +0200, Lennart Poettering wrote:
> On Thu, 23.06.11 02:10, Mark Brown (broonie@opensource.wolfsonmicro.com) wrote:
> > - We need to represent the fact that multiple subsystems are working
> > with the same jack.
> Tbh I am not entirely convinced that this is really such an important
> thing. We can't even map audio controls to PCM devices, which would be
> vastly more interesting.
No disagreement that being able to expose the audio routing within the
devices is needed.
> I can give you a thousand real-life use cases for wanting to match up
> PCM devices with controls, but the one for wanting to match up HDMI
> audio with HDMI video is much weaker, since machines usually have
> multiple PCM streams, but not multiple HDMI ports.
Apparently that's becoming an issue on desktops with nVidia chipsets -
there's been a moderate amount of discussion on that recently on the
list.
> I am not saying that such a match-up shouldn't be possible, but I'd say
> it could be relatively easy to implement. One option would be to go by
> names, i.e. simply say that if an alsa control device is called "HDMI1
> Jack Sensing", and an X11 XRANDR port is called "HDMI1", then they
> should be the same, and apps should compare the first words of these
> names, and that both can be traced to the same udev originating device.
You certainly can't assume that the same device will originate all the
functionality on the jack - in embedded systems the way functionality is
built up from blocks on the SoC means that the control bus routing is of
essentially no use in working out how the system looks to the user.
> In fact, something similar is now already in place for external usb
> speakers with built-in volume keys to map X11 XInput keypresses up to PA
> audio devices so that we can map volume keypresses to the appropriate
> audio device in gnome-settings-daemon (note that the latter is not
> making use of this yet, as the infrastructure is still very new): for
> each keypress we can find the originating XInput device, from that we
> can query the kernel device, which we can then find in sysfs. Similarly
> we can find the sysfs device from PA and then do a match-up.
OK, can you point us to a description of what the actual mechanism is
here? This sounds hopeful, though the fact that nobody knows about it
isn't inspiring.
> If you are thinking about matching up devices across subsystems, you
> should not limit your focus to jack sensing events only: the above
> mentioned key-press use is a lot more interesting -- and is solved.
I have *repeatedly* talked about the need to hook the button presses up
with the rest of the jacks. Including in the mail you're replying to.
> Also, I'd claim that adding an additional subsystem that covers only
> jack sensing would complicate things even further, since now you have to
> match up audio, video and input devices with this jack sensing device,
> instead of just matching them up directly.
On the other hand it also means you now have to work through any number
of separate APIs (the jack could acquire other functionality beyond
those three) and manually build up a picture of what's there. I like
the idea that we can point application developers to a thing rather than
them having to infer what's going on from a bunch of separate devices -
the fact that nobody else seems to know about the support you mention is
one of the potential issues with this.
> > - Even if we invent new interfaces for this we really ought to be able
> > to teach userspace about existing kernels.
> See, I don't buy this actually. It wouldn't be a regression if we don't
> support the input device based scheme in PA.
On the other hand if we go for doing something completely new it's going
to push back support by I'd guess at least six months. Which doesn't
seem great.
> > currently work in the application layer due to the way input is handled
> > then like I said above it looks like we have a bunch of other issues we
> > also need to cope with.
> I think bolting proximity detection and docking station stuff into the
> input layer is very wrong too. Both probably deserve their own kind of
> class devices.
I don't really disagree, I'm just saying that we have a broader problem
here, so we should probably look at those too.
^ permalink raw reply [flat|nested] 57+ messages in thread
* Re: Jack event API - decision needed
2011-06-23 9:49 ` Lennart Poettering
2011-06-23 11:43 ` Mark Brown
@ 2011-06-23 15:32 ` Stephen Warren
1 sibling, 0 replies; 57+ messages in thread
From: Stephen Warren @ 2011-06-23 15:32 UTC (permalink / raw)
To: Lennart Poettering, Mark Brown
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Henningsson, David
Lennart Poettering wrote at Thursday, June 23, 2011 3:49 AM:
> On Thu, 23.06.11 02:10, Mark Brown (broonie@opensource.wolfsonmicro.com) wrote:
> I can give you a thousand of real-life usecases for wanting to match up
> PCM devices with controls, but the one for wanting to match up HDMI
> audio with HDMI video is much weaker, since machines usually have
> multiple PCM streams, but not multiple HDMI ports.
As an example, all NVIDIA GPUs, and some chipsets, built within the last
perhaps 2 or 3 years contain an HD-audio controller with 4 PCM devices.
Any/all of these could be routed out to actual connectors on the board.
For audio purposes, HDMI, DVI (connected to an HDMI sink), and DisplayPort
connectors all can carry audio and end up being exposed identically by
ALSA. In practice, it's very common to have at least 2 such digital
connectors on a GPU.
I believe both Intel and AMD/ATI chipsets and GPUs that support audio
are in exactly the same boat, although perhaps only supporting up to
2 or 3 PCMs total?
> I am not saying that such a match-up shouldn't be possible, but I'd say
> it could be relatively easy to implement. One option would be to go by
> names. i.e. simply say that if an alsa control device is called "HDMI1
> Jack Sensing", and an X11 XRANDR port is called "HDMI1", then they
> should be the same, and apps should compare the first words of these
> names, and that both can be traced to the same udev originating device.
It'll be challenging to implement this based on device name alone. The
X driver probably has enough information to know the difference between
DVI, HDMI, and DP ports and hence could name ports appropriately.
However, the generic HD-audio HDMI driver has basically zero knowledge
of which HDA pin complexes are routed to connectors at all, or for those
that are, whether they're routed to DVI, HDMI, or DP ports, and equally
no idea what port number X might choose if there are say 2 DVI ports.
(well, HDMI vs. DP is a pin complex property, but HDMI vs. DVI certainly
isn't, the numbering order when there are n HDMI pins isn't defined, and
only very recent NVIDIA GPUs mark unconnected pin complexes)
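For reference, the first-word matching heuristic quoted above is trivial to express; per the caveats just described, the hard part is getting the audio driver and the X driver to agree on names in the first place. A minimal sketch (all control and port names here are hypothetical):

```python
def match_jack_to_port(control_name, xrandr_ports):
    """First-word heuristic: an ALSA control named "HDMI1 Jack Sensing"
    is assumed to describe the XRANDR port named "HDMI1"."""
    first_word = control_name.split()[0]
    return next((p for p in xrandr_ports if p == first_word), None)

# hypothetical names, for illustration only
ports = ["HDMI1", "HDMI2", "DVI1"]
```

Note that this returns nothing at all when the driver cannot produce a matching prefix, which is exactly the failure mode described above for the generic HD-audio HDMI driver.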
This is what I attempted to start talking about in my somewhat recent
email titled "EDID-like data for audio: Port correlation between ALSA
and X"; see:
http://mailman.alsa-project.org/pipermail/alsa-devel/2011-May/040186.html
The basic idea: This X vs. ALSA correlation is probably going to be
through ELD data, at least internally to the kernel.
* Re: Jack event API - decision needed
2011-06-20 13:37 Jack event API - decision needed David Henningsson
2011-06-20 17:07 ` Mark Brown
@ 2011-06-27 12:07 ` Mark Brown
2011-06-27 17:01 ` Colin Guthrie
2011-06-28 16:27 ` David Henningsson
1 sibling, 2 replies; 57+ messages in thread
From: Mark Brown @ 2011-06-27 12:07 UTC (permalink / raw)
To: David Henningsson
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Lennart Poettering
On Mon, Jun 20, 2011 at 03:37:25PM +0200, David Henningsson wrote:
> 1) We're continuing the path with /dev/input devices
> 2a) We'll rewrite these devices to be read-only ALSA mixer controls
> 2b) We'll rewrite these devices to be something within ALSA, but not
> exactly mixer controls.
> For options 2a) and 2b) I guess the existing /dev/input thing should be
> deprecated and/or removed. So part of decision should maybe be based on
> information about how widespread the usage of these devices are
> currently...?
So, this discussion seems to have ground to a halt. Summarising
briefly: Lennart and Kay both seem adamant that they don't want a
unified jack object visible to userspace; they want to glue the objects
together via some implicit means. Lennart mentioned that there is some
support for at least some of this functionality in PulseAudio already
but didn't provide any details yet - does anyone else know what happens
here or have time to go look at the code?
In kernel I'm concerned that we'll still need to have a unified object,
due to the need to share the detection logic when a single driver
reports for multiple subsystems. This seems likely to
mean that we'll end up with some sort of in kernel object for many if
not all jacks.
I'm also concerned that an implicit system may be hard to use from
userspace, if only in terms of figuring out what the rules are for
matching things up.
* Re: Jack event API - decision needed
2011-06-27 12:07 ` Mark Brown
@ 2011-06-27 17:01 ` Colin Guthrie
2011-06-28 16:20 ` Mark Brown
2011-07-09 3:38 ` Mark Brown
2011-06-28 16:27 ` David Henningsson
1 sibling, 2 replies; 57+ messages in thread
From: Colin Guthrie @ 2011-06-27 17:01 UTC (permalink / raw)
To: Mark Brown
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Lennart Poettering, David Henningsson
'Twas brillig, and Mark Brown at 27/06/11 13:07 did gyre and gimble:
> On Mon, Jun 20, 2011 at 03:37:25PM +0200, David Henningsson wrote:
>
>> 1) We're continuing the path with /dev/input devices
>
>> 2a) We'll rewrite these devices to be read-only ALSA mixer controls
>
>> 2b) We'll rewrite these devices to be something within ALSA, but not
>> exactly mixer controls.
>
>> For options 2a) and 2b) I guess the existing /dev/input thing should be
>> deprecated and/or removed. So part of decision should maybe be based on
>> information about how widespread the usage of these devices are
>> currently...?
>
> So, this discussion seems to have ground to a halt. Summarising briefly
> it seems that Lennart and Kay both seem adamant that they don't want a
> unified jack object visible to userspace, they want to glue the objects
> together via some implicit means. Lennart mentioned that there is some
> support for at least some of this functionality in PulseAudio already
> but didn't provide any details yet - does anyone else know what happens
> here or have time to go look at the code?
So AFAIK, there is no support in PulseAudio per se for this.
My understanding is that the component which deals with handling the
volume keyboard controls (e.g. gnome-settings-daemon?) is the bit that
actually does this gluing.
I believe it works like this:
1. It gets a keyboard event for VOLUME_UP.
2. It checks which device this keypress came from.
3. If it's a generic keyboard or the laptop keyboard, then it likely
just picks the "default" sound device.
4. If it's a USB HID that happens to have an "Originating Device", then
g-s-d looks through the various PA sinks and sees if one of them has the
same "Originating Device". If we get a match, then this is the device
we'll use.
5. g-s-d issues a volume increment or decrement to PA with the
appropriate device.
The above is a very high-level, hand-wavy description. I don't think PA
has (or needs) specific support to tie these things together.
I'm not 100% sure which elements of a sink are used to do this matching,
but I'd take a wild stab in the dark at the "device.bus_path" entry in
the proplist, which can look something like:
device.bus_path = "pci-0000:00:1d.7-usb-0:7.1:1.0"
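The matching described in steps 2-4 above could be sketched roughly as follows. The `device.bus_path` property name comes from the proplist example above, but the sink names, the sample paths, and the use of an exact comparison (a prefix match may be needed in practice) are assumptions for illustration:

```python
def sink_for_keypress(input_bus_path, sinks, default_sink):
    """Map a volume keypress to a PA sink: if the originating input
    device shares a bus path with some sink, use that sink; otherwise
    fall back to the default sink (steps 3-4 above)."""
    for name, props in sinks.items():
        if props.get("device.bus_path") == input_bus_path:
            return name
    return default_sink

# hypothetical proplists, modelled on the example above
sinks = {
    "usb_speakers": {"device.bus_path": "pci-0000:00:1d.7-usb-0:7.1:1.0"},
    "internal":     {"device.bus_path": "pci-0000:00:1b.0"},
}
```

As noted later in the thread, an exact bus-path comparison breaks down when the jack is not part of the same device as the sink it reports on, so treat this purely as a sketch of the desktop USB case.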
But I really don't know what the xinput2 or gtk3 side of things looks
like for this, so I'm just going on descriptions Lennart has given in
the past.
> I'm also concerned that an implicit system may be hard to use from
> userspace, if only in terms of figuring out what the rules are for
> matching things up.
I doubt this will be much of a problem once a few clean examples are out
there. The logic to match things up should be pretty trivial.
Col
--
Colin Guthrie
gmane(at)colin.guthr.ie
http://colin.guthr.ie/
Day Job:
Tribalogic Limited [http://www.tribalogic.net/]
Open Source:
Mageia Contributor [http://www.mageia.org/]
PulseAudio Hacker [http://www.pulseaudio.org/]
Trac Hacker [http://trac.edgewall.org/]
* Re: Jack event API - decision needed
2011-06-27 17:01 ` Colin Guthrie
@ 2011-06-28 16:20 ` Mark Brown
2011-07-09 3:38 ` Mark Brown
1 sibling, 0 replies; 57+ messages in thread
From: Mark Brown @ 2011-06-28 16:20 UTC (permalink / raw)
To: Colin Guthrie
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Lennart Poettering, David Henningsson
On Mon, Jun 27, 2011 at 06:01:55PM +0100, Colin Guthrie wrote:
> I'm not 100% sure what elements of a sink is used to do this matching
> but I'd take a wild stab in the dark guess at the "device.bus_path"
> entry in the proplis which can look something like:
> device.bus_path = "pci-0000:00:1d.7-usb-0:7.1:1.0"
Ah, if it's doing that, that's not going to work in general - the jack
may not be part of the same device as the thing we're reporting on.
> > I'm also concerned that an implicit system may be hard to use from
> > userspace, if only in terms of figuring out what the rules are for
> > matching things up.
> I doubt this will be much of a problem once a few clean examples are out
> there. The logic to match things up should be pretty trivial.
One of my biggest concerns here is people even finding the relevant bit
of the stack, let alone working out what to do with it - bear in mind
that my main focus is on the embedded side so people may not be familiar
with Linux or using the standard desktop stack at all.
* Re: Jack event API - decision needed
2011-06-27 12:07 ` Mark Brown
2011-06-27 17:01 ` Colin Guthrie
@ 2011-06-28 16:27 ` David Henningsson
2011-06-28 16:34 ` Liam Girdwood
` (2 more replies)
1 sibling, 3 replies; 57+ messages in thread
From: David Henningsson @ 2011-06-28 16:27 UTC (permalink / raw)
To: Mark Brown
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Lennart Poettering
On 2011-06-27 13:07, Mark Brown wrote:
> On Mon, Jun 20, 2011 at 03:37:25PM +0200, David Henningsson wrote:
>
>> 1) We're continuing the path with /dev/input devices
>
>> 2a) We'll rewrite these devices to be read-only ALSA mixer controls
>
>> 2b) We'll rewrite these devices to be something within ALSA, but not
>> exactly mixer controls.
>
>> For options 2a) and 2b) I guess the existing /dev/input thing should be
>> deprecated and/or removed. So part of decision should maybe be based on
>> information about how widespread the usage of these devices are
>> currently...?
>
> So, this discussion seems to have ground to a halt.
Yes, unfortunately, and without a clear consensus. That puts me in a
difficult position, because I'm trying to get the job done, preferably
in the next release of Ubuntu (which is due in October this year). I've
been talking to a few of my colleagues here at Canonical, and here's how
we reason currently:
We have the input layer. We also have a set of patches for supporting
this in PulseAudio (although currently unmerged and I'm not sure about
their current quality/state).
We don't have the alternative Alsa Control API discussed in this thread.
Without taking a stance of whether that would be a better solution or
not, designing it will take some time - somewhat depending on the set of
problems we're aiming to solve. Regardless, it will definitely not reach
the 3.0 kernel (which is what we will ship in the next release of Ubuntu).
As such, it makes the most sense for me to continue working on the
existing input layer API for the time being. If or when a new API is
announced and finished, rewriting the pulseaudio patches to target that
API will probably make sense. But we're not there today, and the time
schedule for getting there is unknown.
If upstream udev/PulseAudio would be willing to merge my (existing and
upcoming) patches, I would appreciate that, as I believe that would make
both our lives easier. If not, well, at least make a note that I /tried/
to do things the right way, and to make everybody happy - which is not
always possible.
--
David Henningsson
http://launchpad.net/~diwic
* Re: Jack event API - decision needed
2011-06-28 16:27 ` David Henningsson
@ 2011-06-28 16:34 ` Liam Girdwood
2011-06-28 16:35 ` Mark Brown
2011-06-28 16:35 ` Takashi Iwai
2 siblings, 0 replies; 57+ messages in thread
From: Liam Girdwood @ 2011-06-28 16:34 UTC (permalink / raw)
To: David Henningsson
Cc: ALSA Development Mailing List, Kay Sievers, Takashi Iwai,
Mark Brown, Graeme Gregory, Lennart Poettering
On 28/06/11 17:27, David Henningsson wrote:
> On 2011-06-27 13:07, Mark Brown wrote:
>> On Mon, Jun 20, 2011 at 03:37:25PM +0200, David Henningsson wrote:
>>
>>> 1) We're continuing the path with /dev/input devices
>>
>>> 2a) We'll rewrite these devices to be read-only ALSA mixer controls
>>
>>> 2b) We'll rewrite these devices to be something within ALSA, but not
>>> exactly mixer controls.
>>
>>> For options 2a) and 2b) I guess the existing /dev/input thing should be
>>> deprecated and/or removed. So part of decision should maybe be based on
>>> information about how widespread the usage of these devices are
>>> currently...?
>>
>> So, this discussion seems to have ground to a halt.
>
> Yes, unfortunately, and without a clear consensus. That puts me in a
> difficult position, because I'm trying to get the job done, preferably
> in the next release of Ubuntu (which is due in October this year). I've
> been talking to a few of my colleagues here at Canonical, and here's how
> we reason currently:
>
> We have the input layer. We also have a set of patches for supporting
> this in PulseAudio (although currently unmerged and I'm not sure about
> their current quality/state).
I know that these are working on Panda and that Graeme is addressing review comments for upstream. Don't know his ETA yet though.
Liam
* Re: Jack event API - decision needed
2011-06-28 16:27 ` David Henningsson
2011-06-28 16:34 ` Liam Girdwood
@ 2011-06-28 16:35 ` Mark Brown
2011-06-28 16:35 ` Takashi Iwai
2 siblings, 0 replies; 57+ messages in thread
From: Mark Brown @ 2011-06-28 16:35 UTC (permalink / raw)
To: David Henningsson
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Lennart Poettering
On Tue, Jun 28, 2011 at 05:27:15PM +0100, David Henningsson wrote:
> As such, it makes the most sense for me to continue working on the
> existing input layer API for the time being. If or when a new API is
> announced and finished, rewriting the pulseaudio patches to target
> that API will probably make sense. But we're not there today, and
> the time schedule for getting there is unknown.
> If upstream udev/PulseAudio would be willing to merge my (existing
> and upcoming) patches, I would appreciate that, as I believe that
> would make both our lives easier. If not, well at least make a note
> that I /tried/ to do things the right way, and to make everybody
> happy - which is not always possible.
FWIW I tend to agree that pragmatically it'd make a lot of sense to
support the current kernel interfaces in userspace even if we're also
going to decide to deprecate them, if only for the deployment reasons
you mention.
* Re: Jack event API - decision needed
2011-06-28 16:27 ` David Henningsson
2011-06-28 16:34 ` Liam Girdwood
2011-06-28 16:35 ` Mark Brown
@ 2011-06-28 16:35 ` Takashi Iwai
2011-06-29 2:59 ` Mark Brown
2011-06-29 7:13 ` David Henningsson
2 siblings, 2 replies; 57+ messages in thread
From: Takashi Iwai @ 2011-06-28 16:35 UTC (permalink / raw)
To: David Henningsson
Cc: ALSA Development Mailing List, Mark Brown, Kay Sievers,
Lennart Poettering
At Tue, 28 Jun 2011 17:27:15 +0100,
David Henningsson wrote:
>
> On 2011-06-27 13:07, Mark Brown wrote:
> > On Mon, Jun 20, 2011 at 03:37:25PM +0200, David Henningsson wrote:
> >
> >> 1) We're continuing the path with /dev/input devices
> >
> >> 2a) We'll rewrite these devices to be read-only ALSA mixer controls
> >
> >> 2b) We'll rewrite these devices to be something within ALSA, but not
> >> exactly mixer controls.
> >
> >> For options 2a) and 2b) I guess the existing /dev/input thing should be
> >> deprecated and/or removed. So part of decision should maybe be based on
> >> information about how widespread the usage of these devices are
> >> currently...?
> >
> > So, this discussion seems to have ground to a halt.
>
> Yes, unfortunately, and without a clear consensus. That puts me in a
> difficult position, because I'm trying to get the job done, preferably
> in the next release of Ubuntu (which is due in October this year). I've
> been talking to a few of my colleagues here at Canonical, and here's how
> we reason currently:
>
> We have the input layer. We also have a set of patches for supporting
> this in PulseAudio (although currently unmerged and I'm not sure about
> their current quality/state).
So, are the targets already defined, i.e. what information is actually
required? This wasn't very clear to me until now.
> We don't have the alternative Alsa Control API discussed in this thread.
> Without taking a stance of whether that would be a better solution or
> not, designing it will take some time - somewhat depending on the set of
> problems we're aiming to solve. Regardless, it will definitely not reach
> the 3.0 kernel (which is what we will ship in the next release of Ubuntu).
If only the same functionality is required as currently provided by
the input-jack layer, re-implementing the jack-detection elements in
the ALSA control API is pretty easy. It means that the control element
would have some jack name with a location prefix or such, report the
current jack status, and notify about jack changes. That's all.
Optionally, it can have TLV data giving the HD-audio jack attribute,
too.
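If such read-only jack elements existed, even a shell-level client could poll them by parsing `amixer -c0 contents`-style output. A sketch of that parsing follows; the element names and the sample output are assumptions modelled on amixer's usual format, since no such controls exist in mainline at the time of writing:

```python
import re

# fabricated output, in the style of `amixer contents`
SAMPLE = """\
numid=30,iface=CARD,name='Headphone Jack'
  ; type=BOOLEAN,access=r-------,values=1
  : values=on
numid=31,iface=CARD,name='Mic Jack'
  ; type=BOOLEAN,access=r-------,values=1
  : values=off
"""

def parse_jack_states(text):
    """Return {jack control name: plugged-in?} from amixer-style output."""
    jacks, current = {}, None
    for line in text.splitlines():
        m = re.search(r"name='([^']*Jack[^']*)'", line)
        if m:
            current = m.group(1)
        elif current and line.lstrip().startswith(": values="):
            jacks[current] = line.split("values=")[1] == "on"
            current = None
    return jacks
```

The point of the sketch is only that a boolean control element with a conventional "... Jack" name would be trivially consumable by existing control-API tooling, with no new plumbing required.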
As already mentioned, the difficulty of implementation mainly
depends on the requirements. If just a flat array without structural
connections is enough, it's easy, of course.
But it definitely won't reach the 3.0 kernel. So maybe this won't
help in your case, anyway...
thanks,
Takashi
* Re: Jack event API - decision needed
2011-06-28 16:35 ` Takashi Iwai
@ 2011-06-29 2:59 ` Mark Brown
2011-06-29 5:34 ` Takashi Iwai
2011-06-29 7:13 ` David Henningsson
1 sibling, 1 reply; 57+ messages in thread
From: Mark Brown @ 2011-06-29 2:59 UTC (permalink / raw)
To: Takashi Iwai
Cc: ALSA Development Mailing List, Kay Sievers, Lennart Poettering,
David Henningsson
On Tue, Jun 28, 2011 at 06:35:33PM +0200, Takashi Iwai wrote:
> If only the same functionality is required as currently provided by
> the input-jack layer, re-implementing the jack-detection elements in
> the ALSA control API is pretty easy. It means that the control element
> would have some jack name with a location prefix or such, report the
> current jack status, and notify about jack changes. That's all.
> Optionally, it can have TLV data giving the HD-audio jack attribute,
> too.
It needs a subset of the current information - it should report only
audio events, so pretty much only headphone, microphone or line out
presence. Anything else needs to be reported via a different API and
figured out by userspace, and the input device should stay there for
button press events.
* Re: Jack event API - decision needed
2011-06-29 2:59 ` Mark Brown
@ 2011-06-29 5:34 ` Takashi Iwai
2011-06-29 6:59 ` Mark Brown
0 siblings, 1 reply; 57+ messages in thread
From: Takashi Iwai @ 2011-06-29 5:34 UTC (permalink / raw)
To: Mark Brown
Cc: ALSA Development Mailing List, Kay Sievers, Lennart Poettering,
David Henningsson
At Tue, 28 Jun 2011 19:59:18 -0700,
Mark Brown wrote:
>
> On Tue, Jun 28, 2011 at 06:35:33PM +0200, Takashi Iwai wrote:
>
> > If only the same functionality is required as currently provided by
> > the input-jack layer, re-implementing the jack-detection elements in
> > the ALSA control API is pretty easy. It means that the control
> > element would have some jack name with a location prefix or such,
> > report the current jack status, and notify about jack changes.
> > That's all. Optionally, it can have TLV data giving the HD-audio
> > jack attribute, too.
>
> It needs a subset of the current information - it should report only
> audio events, so pretty much only headphone, microphone or line out
> presence.
Is that really all for PA, for now and even for the future?
That's my primary question. Was this clarified in the thread...?
> Anything else needs to be reported via a different API and
> figured out by userspace, and the input device should stay there for
> button press events.
The first and the second part are independent, IMO.
That is, whether to keep using the input-jack device in the future is
an open question. It might be better to re-implement it in a new API
in a unified way, for example.
Of course, keeping the old API is mandatory, so we'll still keep the
input-jack code in the future, though.
thanks,
Takashi
* Re: Jack event API - decision needed
2011-06-29 5:34 ` Takashi Iwai
@ 2011-06-29 6:59 ` Mark Brown
2011-06-29 7:03 ` Takashi Iwai
0 siblings, 1 reply; 57+ messages in thread
From: Mark Brown @ 2011-06-29 6:59 UTC (permalink / raw)
To: Takashi Iwai
Cc: ALSA Development Mailing List, Kay Sievers, Lennart Poettering,
David Henningsson
On Wed, Jun 29, 2011 at 07:34:11AM +0200, Takashi Iwai wrote:
> Mark Brown wrote:
> > It needs a subset of the current information - it should report only
> > audio events, so pretty much only headphone, microphone or line out
> > presence.
> Is that really all for PA, for now, and even for future?
> That's my primary question. Was this clarified in the thread...?
Well, that was what Kay and Lennart were keen on - the information for a
given subsystem should be reported via that subsystem so all the other
information we're reporting currently should be going via some other
subsystem.
Like I keep saying, I still don't have a clear picture as to how this
would actually be implemented in userspace and how practical it is.
> > Anything else needs to be reported via a different API and
> > figured out by userspace, and the input device should stay there for
> > button press events.
> The first and the second part are independent, IMO.
> That is, whether using the input-jack device in future is an open
> question. It might be better to re-implement in a new API in a
> unified way, for example.
You're missing my point here. The point is not the switches, the point
is the buttons on the jacks - they need to go via the input API and not
the ALSA API.
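Concretely, both the jack switches and the jack buttons arrive on /dev/input in the same fixed-size record, so the split being argued for here is about which event type carries what, not about the transport. A sketch of decoding that record (the constants are from linux/input.h; the record layout shown assumes native 64-bit longs, and the sample event is fabricated):

```python
import struct

# struct input_event: struct timeval (two longs) + type and code
# (u16 each) + value (s32); 24 bytes with 64-bit longs.
EVENT_FMT = "llHHi"
EV_KEY, EV_SW = 0x01, 0x05
SW_HEADPHONE_INSERT = 0x02  # the switch jack presence uses today
BTN_0 = 0x100               # where a jack button would land, as EV_KEY

def decode_events(buf):
    """Split one read() from /dev/input/eventN into (type, code, value)."""
    size = struct.calcsize(EVENT_FMT)
    return [struct.unpack_from(EVENT_FMT, buf, off)[2:]
            for off in range(0, len(buf) - size + 1, size)]

# fabricated "headphone plugged in" event
sample = struct.pack(EVENT_FMT, 0, 0, EV_SW, SW_HEADPHONE_INSERT, 1)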
> Of course, keeping the old API is mandatory, so we'll still keep
> the input-jack code in future, though.
That too.
* Re: Jack event API - decision needed
2011-06-29 6:59 ` Mark Brown
@ 2011-06-29 7:03 ` Takashi Iwai
0 siblings, 0 replies; 57+ messages in thread
From: Takashi Iwai @ 2011-06-29 7:03 UTC (permalink / raw)
To: Mark Brown
Cc: ALSA Development Mailing List, Kay Sievers, Lennart Poettering,
David Henningsson
At Tue, 28 Jun 2011 23:59:41 -0700,
Mark Brown wrote:
>
> > > Anything else needs to be reported via a different API and
> > > figured out by userspace, and the input device should stay there for
> > > button press events.
>
> > The first and the second part are independent, IMO.
> > That is, whether to keep using the input-jack device in the future
> > is an open question. It might be better to re-implement it in a new
> > API in a unified way, for example.
>
> You're missing my point here. The point is not the switches, the point
> is the buttons on the jacks - they need to go via the input API and not
> the ALSA API.
OK, buttons. How confusing the terminology is...
thanks,
Takashi
* Re: Jack event API - decision needed
2011-06-28 16:35 ` Takashi Iwai
2011-06-29 2:59 ` Mark Brown
@ 2011-06-29 7:13 ` David Henningsson
2011-06-29 7:21 ` Mark Brown
1 sibling, 1 reply; 57+ messages in thread
From: David Henningsson @ 2011-06-29 7:13 UTC (permalink / raw)
To: Takashi Iwai
Cc: ALSA Development Mailing List, Mark Brown, Kay Sievers,
Lennart Poettering
On 2011-06-28 17:35, Takashi Iwai wrote:
> At Tue, 28 Jun 2011 17:27:15 +0100,
> David Henningsson wrote:
>>
>> On 2011-06-27 13:07, Mark Brown wrote:
>>> On Mon, Jun 20, 2011 at 03:37:25PM +0200, David Henningsson wrote:
>>>
>>>> 1) We're continuing the path with /dev/input devices
>>>
>>>> 2a) We'll rewrite these devices to be read-only ALSA mixer controls
>>>
>>>> 2b) We'll rewrite these devices to be something within ALSA, but not
>>>> exactly mixer controls.
>>>
>>>> For options 2a) and 2b) I guess the existing /dev/input thing should be
>>>> deprecated and/or removed. So part of decision should maybe be based on
>>>> information about how widespread the usage of these devices are
>>>> currently...?
>>>
>>> So, this discussion seems to have ground to a halt.
>>
>> Yes, unfortunately, and without a clear consensus. That puts me in a
>> difficult position, because I'm trying to get the job done, preferably
>> in the next release of Ubuntu (which is due in October this year). I've
>> been talking to a few of my colleagues here at Canonical, and here's how
>> we reason currently:
>>
>> We have the input layer. We also have a set of patches for supporting
>> this in PulseAudio (although currently unmerged and I'm not sure about
>> their current quality/state).
>
> So, are the targets already defined, i.e. what information is actually
> required? This wasn't very clear to me until now.
Well, for basic jack detection the current input layer will work. We
might want to add some more information to it (e.g. by changing the name
string or using the additional fields available), but that should not be
very controversial, I think.
However, if we manage to expose all the routing between PCMs, volume
controls and ports, we can make an even better PulseAudio, because we
can solve other problems as well, such as the volume control naming
problem. But figuring out how to do that in the best way API-wise and so
on, I guess that will take some time.
>> We don't have the alternative Alsa Control API discussed in this thread.
>> Without taking a stance of whether that would be a better solution or
>> not, designing it will take some time - somewhat depending on the set of
>> problems we're aiming to solve. Regardless, it will definitely not reach
>> the 3.0 kernel (which is what we will ship in the next release of Ubuntu).
>
> If only the same functionality is required as currently provided by
> the input-jack layer, re-implementing the jack-detection elements in
> the ALSA control API is pretty easy.
Yes, but it is not clear to me that this is the decision, and what we
actually want to do. So far I have only seen Kay and Lennart advocating
for it and Mark against it. I started off this thread because I needed a
decision, but so far no clear decision has been made.
> It means that the control element would
> have some jack name with a location prefix or such, report the
> current jack status, and notify about jack changes. That's all.
> Optionally, it can have TLV data giving the HD-audio jack
> attribute, too.
I guess this will require support in alsa-lib as well then? At least if
you want to pass through new types of TLV data.
> As already mentioned, the difficulty of implementation mainly
> depends on the requirements. If just a flat array without structural
> connections is enough, it's easy, of course.
>
> But it definitely won't reach the 3.0 kernel. So maybe this won't
> help in your case, anyway...
Well, if the changes are not too invasive, there is also the possibility
of backporting things into the Ubuntu version of the 3.0 kernel.
--
David Henningsson
http://launchpad.net/~diwic
* Re: Jack event API - decision needed
2011-06-29 7:13 ` David Henningsson
@ 2011-06-29 7:21 ` Mark Brown
2011-06-29 8:52 ` David Henningsson
0 siblings, 1 reply; 57+ messages in thread
From: Mark Brown @ 2011-06-29 7:21 UTC (permalink / raw)
To: David Henningsson
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Lennart Poettering
On Wed, Jun 29, 2011 at 08:13:42AM +0100, David Henningsson wrote:
> However, if we manage to expose all the routing between PCMs, volume
> controls and ports, we can make an even better PulseAudio, because
> we can solve other problems as well, such as the volume control
> naming problem. But figuring out how to do that in the best way
> API-wise and so on, I guess that will take some time.
This is really a totally separate question: there's so much routing
within the devices that the jacks are just not a particularly
interesting part of it.
> >If only the same functionality is required as currently done in the
> >input-jack layer, re-implementing the jack-detection elements in ALSA
> >control API is pretty easy.
> Yes, but it is not clear to me that this is the decision, and what
> we actually want to do. So far I have only seen Kay and Lennart
> advocating for it and Mark against it. I started off this thread
> because I needed a decision, but so far no clear decision has been
> made.
I'm not against having the information in ALSA, or really against using
separate APIs. My main concern is being able to group the objects back
together again for use both in kernel (when the hardware overlaps) and
in userspace in an understandable fashion, and I've not seen anything
that I'm as comfortable with as a directly visible object exported from
the kernel. If ALSA chooses to export some subset of the information
that's kind of orthogonal to that issue.
I do have to say I like not having to use alsa-lib to get the
information but that's even less of a big deal.
* Re: Jack event API - decision needed
2011-06-29 7:21 ` Mark Brown
@ 2011-06-29 8:52 ` David Henningsson
2011-06-29 17:00 ` Mark Brown
0 siblings, 1 reply; 57+ messages in thread
From: David Henningsson @ 2011-06-29 8:52 UTC (permalink / raw)
To: Mark Brown
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Lennart Poettering
On 2011-06-29 08:21, Mark Brown wrote:
> On Wed, Jun 29, 2011 at 08:13:42AM +0100, David Henningsson wrote:
>
>> However, if we manage to expose all the routing between PCMs, volume
>> controls and ports, we can make an even better PulseAudio, because
>> we can solve other problems as well, such as the volume control
>> naming problem. But figuring out how to do that in the best way
>> API-wise and so on, I guess that will take some time.
>
> This is really a totally separate question,
The question is not totally separate: if we design an ALSA API, we
should have this in the back of our minds, so that we do not build an
API to which adding this information later will be very difficult.
> there's so much routing
> within the devices that the jacks are just not a particularly
> interesting part of it.
I'm probably misunderstanding you here, because that sounds crazy. What
sense would it make to display a routing graph if the jacks are not part
of that graph?
>>> If only the same functionality is required as currently done in the
>>> input-jack layer, re-implementing the jack-detection elements in ALSA
>>> control API is pretty easy.
>
>> Yes, but it is not clear to me that this is the decision, and what
>> we actually want to do. So far I have only seen Kay and Lennart
>> advocating for it and Mark against it. I started off this thread
>> because I needed a decision, but so far no clear decision has been
>> made.
>
> I'm not against having the information in ALSA, or really against using
> separate APIs.
Ok.
> My main concern is being able to group the objects back
> together again for use both in kernel (when the hardware overlaps) and
> in userspace in an understandable fashion and I've not seen anything
> that I'm as comfortable with as a directly visible object exported from
> the kernel.
I'm not following. For userspace, you need to match the input layer
against an ALSA sound card. You need to do that regardless of whether
the buttons and the jacks are in the input layer, or if only the buttons
are there. No difference in difficulty as I see it.
For the kernel part, I guess that if ASoC core (or ALSA core) is
responsible for the splitting, and that the actual driver has an object
covering both types of input (i e both jacks and buttons), that would be
the most obvious way to solve the problem.
> I do have to say I like not having to use alsa-lib to get the
> information but that's even less of a big deal.
Ok, is that because Android does not use alsa-lib?
--
David Henningsson
http://launchpad.net/~diwic
^ permalink raw reply [flat|nested] 57+ messages in thread
* Re: Jack event API - decision needed
2011-06-29 8:52 ` David Henningsson
@ 2011-06-29 17:00 ` Mark Brown
0 siblings, 0 replies; 57+ messages in thread
From: Mark Brown @ 2011-06-29 17:00 UTC (permalink / raw)
To: David Henningsson
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
Lennart Poettering
On Wed, Jun 29, 2011 at 09:52:13AM +0100, David Henningsson wrote:
> On 2011-06-29 08:21, Mark Brown wrote:
> >>However, if we manage to expose all the routing between PCMs, volume
> >>controls and ports, we can make an even better PulseAudio, because
> >>we can solve other problems as well, such as the volume control
> >>naming problem. But figuring out how to do that in the best way
> >>API-wise and so on, I guess that will take some time.
> >This is really a totally separate question,
> The question is not totally separate: if we design an ALSA API, we
> should have this in the back of our minds, so that we do not build an
> API to which adding this information later will be very difficult.
I think if we make that difficult we've messed up massively anyway.
> >there's so much routing
> >within the devices that the jacks are just not a particularly
> >interesting part of it.
> I'm probably misunderstanding you here, because that sounds crazy.
> What sense would it make to display a routing graph if the jacks are
> not part of that graph?
I said they weren't an interesting part of it, not that they're not part
of it. It's a very small detail relative to all the other stuff we need
to worry about.
> >My main concern is being able to group the objects back
> >together again for use both in kernel (when the hardware overlaps) and
> >in userspace in an understandable fashion and I've not seen anything
> >that I'm as comfortable with as a directly visible object exported from
> >the kernel.
> I'm not following. For userspace, you need to match the input layer
> against an ALSA sound card. You need to do that regardless of
> whether the buttons and the jacks are in the input layer, or if only
> the buttons are there. No difference in difficulty as I see it.
As repeatedly discussed, we need to do at least a three-way match -
currently we also need to glue both input and audio to graphics.
It seems likely we'll end up with other things on there too.
> For the kernel part, I guess that if ASoC core (or ALSA core) is
> responsible for the splitting, and that the actual driver has an
> object covering both types of input (i e both jacks and buttons),
> that would be the most obvious way to solve the problem.
The multiple subsystems thing also applies here - you really can't rely
on doing this all in one driver.
> >I do have to say I like not having to use alsa-lib to get the
> >information but that's even less of a big deal.
> Ok, is that because Android does not use alsa-lib?
No, it's because it's a pretty big library with its own programming
interface that's more complicated than your standard poll and read, so
it's not quite so natural to integrate into your system management
daemon (the one that looks after the system as a whole) as it could be.
^ permalink raw reply [flat|nested] 57+ messages in thread
* Re: Jack event API - decision needed
2011-06-27 17:01 ` Colin Guthrie
2011-06-28 16:20 ` Mark Brown
@ 2011-07-09 3:38 ` Mark Brown
1 sibling, 0 replies; 57+ messages in thread
From: Mark Brown @ 2011-07-09 3:38 UTC (permalink / raw)
To: Colin Guthrie
Cc: Takashi Iwai, ALSA Development Mailing List, Kay Sievers,
David Henningsson, Lennart Poettering
On Mon, Jun 27, 2011 at 06:01:55PM +0100, Colin Guthrie wrote:
> 'Twas brillig, and Mark Brown at 27/06/11 13:07 did gyre and gimble:
> > So, this discussion seems to have ground to a halt. Summarising briefly
> > it seems that Lennart and Kay both seem adamant that they don't want a
> > unified jack object visible to userspace, they want to glue the objects
> > together via some implicit means. Lennart mentioned that there is some
> > support for at least some of this functionality in PulseAudio already
> > but didn't provide any details yet - does anyone else know what happens
> > here or have time to go look at the code?
> So AFAIK, there is no support in PulseAudio per se for this.
Hrm, nobody else seems to have any idea...
> I'm not 100% sure what elements of a sink is used to do this matching
> but I'd take a wild stab in the dark guess at the "device.bus_path"
> entry in the proplis which can look something like:
> device.bus_path = "pci-0000:00:1d.7-usb-0:7.1:1.0"
...this being the key bit.
^ permalink raw reply [flat|nested] 57+ messages in thread
end of thread, other threads:[~2011-07-11 15:28 UTC | newest]
Thread overview: 57+ messages (download: mbox.gz follow: Atom feed
-- links below jump to the message on this page --
2011-06-20 13:37 Jack event API - decision needed David Henningsson
2011-06-20 17:07 ` Mark Brown
2011-06-20 17:12 ` Takashi Iwai
2011-06-20 17:31 ` Mark Brown
2011-06-20 17:37 ` Takashi Iwai
2011-06-20 18:53 ` David Henningsson
2011-06-20 23:40 ` Mark Brown
2011-06-21 12:11 ` David Henningsson
2011-06-21 12:39 ` Mark Brown
2011-06-22 10:47 ` David Henningsson
2011-06-22 11:48 ` Mark Brown
2011-06-22 12:50 ` Kay Sievers
2011-06-22 13:25 ` Mark Brown
2011-06-22 13:55 ` Kay Sievers
2011-06-22 15:11 ` Mark Brown
2011-06-22 21:41 ` Dmitry Torokhov
2011-06-23 0:15 ` Mark Brown
2011-06-23 8:42 ` Dmitry Torokhov
2011-06-23 10:47 ` Mark Brown
2011-06-22 21:01 ` Lennart Poettering
2011-06-22 21:57 ` Stephen Warren
2011-06-23 1:10 ` Mark Brown
2011-06-23 7:01 ` Clemens Ladisch
2011-06-23 7:24 ` Takashi Iwai
2011-06-23 9:49 ` Lennart Poettering
2011-06-23 11:43 ` Mark Brown
2011-06-23 15:32 ` Stephen Warren
2011-06-27 12:07 ` Mark Brown
2011-06-27 17:01 ` Colin Guthrie
2011-06-28 16:20 ` Mark Brown
2011-07-09 3:38 ` Mark Brown
2011-06-28 16:27 ` David Henningsson
2011-06-28 16:34 ` Liam Girdwood
2011-06-28 16:35 ` Mark Brown
2011-06-28 16:35 ` Takashi Iwai
2011-06-29 2:59 ` Mark Brown
2011-06-29 5:34 ` Takashi Iwai
2011-06-29 6:59 ` Mark Brown
2011-06-29 7:03 ` Takashi Iwai
2011-06-29 7:13 ` David Henningsson
2011-06-29 7:21 ` Mark Brown
2011-06-29 8:52 ` David Henningsson
2011-06-29 17:00 ` Mark Brown
-- strict thread matches above, loose matches on Subject: below --
2011-06-20 13:56 Mark Brown
2011-06-20 14:11 ` David Henningsson
2011-06-20 14:19 ` Kay Sievers
2011-06-20 15:35 ` Takashi Iwai
2011-06-20 16:52 ` Mark Brown
2011-06-20 17:01 ` Takashi Iwai
2011-06-20 18:24 ` David Henningsson
2011-06-21 0:29 ` Mark Brown
2011-06-21 6:57 ` David Henningsson
2011-06-21 10:40 ` Mark Brown
2011-06-20 16:47 ` Mark Brown
2011-06-20 16:59 ` Takashi Iwai
2011-06-20 17:17 ` Mark Brown
2011-06-20 17:38 ` Takashi Iwai
This is a public inbox, see mirroring instructions
for how to clone and mirror all data and code used for this inbox