* Re: Question about your DSP topic branch
[not found] ` <1295973973.3322.347.camel@odin>
@ 2011-01-27 23:41 ` Patrick Lai
2011-03-15 7:08 ` [alsa-devel] " Patrick Lai
0 siblings, 1 reply; 5+ messages in thread
From: Patrick Lai @ 2011-01-27 23:41 UTC (permalink / raw)
To: Liam Girdwood; +Cc: alsa-devel, alsa-devel, Mark Brown, linux-arm-msm
On 1/25/2011 8:46 AM, Liam Girdwood wrote:
> Hi Patrick,
>
> CCing in Mark and alsa-devel@alsa-project.org (preferred list)
>
> On Mon, 2011-01-24 at 23:01 -0800, Patrick Lai wrote:
>> Hi Liam,
>>
>> I have two more questions about your DSP topic branch
>>
>> 7. I see in sdp4430.c, SDP4430 MODEM front-end dailink no_host_mode is
>> set to SND_SOC_DAI_LINK_NO_HOST. What is the purpose of no_host_mode?
>
> No host mode means no audio is transferred to the host CPU in this case.
> i.e. the MODEM routes audio directly between the DSP and mic/speaker.
>
> This flag also tells the ASoC core that no DMA will be required, hence
> the code in the DMA branch will not start any DMA. This part had to be
> cleaned up for upstream.
>
>> Is it for use case that two physical devices can exchange audio data
>> without host-processor intervention? If so, if user-space application
>> tries to write to PCM buffer, will framework reject the buffer?
>>
>
> Yes, that's correct. The PCM core will also not complain here when it
> receives no data either.
I experimented with the NO_HOST option and found that the platform driver's
PCM copy function is still called. Is it part of the cleanup you were talking about?
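For reference, the flag in question is set on the machine driver's DAI link. A simplified sketch, with mocked struct layouts rather than the real <sound/soc.h> definitions (field names follow the topic branch as discussed; the DAI names are illustrative):

```c
#include <assert.h>

/* Simplified mock of the relevant snd_soc_dai_link fields from the DSP
 * topic branch -- NOT the real ASoC headers. */
enum { SND_SOC_DAI_LINK_NO_HOST = 1 };

struct snd_soc_dai_link {
	const char *name;
	const char *stream_name;
	const char *cpu_dai_name;
	const char *platform_name;
	int no_host_mode;	/* no audio transferred to the host CPU */
};

/* MODEM FE: audio flows directly between DSP and modem; the host never
 * touches the data, so the core must not start any DMA for this link. */
static const struct snd_soc_dai_link modem_fe = {
	.name		= "MODEM",		/* illustrative name */
	.stream_name	= "MODEM",
	.cpu_dai_name	= "MODEM",
	.platform_name	= "omap-aess-audio",
	.no_host_mode	= SND_SOC_DAI_LINK_NO_HOST,
};
```

With this set, the trigger/DMA paths in the core are expected to skip DMA setup entirely for the link.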
>> 8. I see there is a dmic codec (dmic.c) under sound/soc/codecs which is
>> pretty much just a dummy codec driver. I suppose the configuration of the
>> DMIC is done in another driver. Would it be better if we could have
>> something like the fixed voltage regulator, so there is no need to
>> duplicate the effort?
>>
>
> Yeah, this is a generic DMIC driver. It's designed to have a very wide
> coverage and should cover most DMICs out there. So it should also be
> able to fit into your architecture too.
>
>> Look forward to seeing your reply soon
>>
>> Thanks
>>
>> On 1/6/2011 3:39 PM, Patrick Lai wrote:
>>> Hi Liam,
>>>
>>> I synced to your kernel DSP topic branch two days ago in an attempt to
>>> understand the upcoming ASoC DSP design. I have a few questions to get
>>> clarification from you.
>>>
>>> 1. In sdp4430.c, both the FE DAI link and the BE DAI link have
>>> omap-aess-audio as the platform driver. How is the omap-aess-audio platform
>>> driver used in both the front-end and the back-end?
>>>
>
> The MODEM and Low Power (LP) Front Ends (FE) use the AESS platform
> driver since they do not require DMA, whilst the other FEs use the
> normal DMA platform driver since they do require DMA.
>
PCM functions inside omap-abe-dsp.c seem to be no-ops if the DAI ID is not
MODEM or LP. I guess reusing the omap-aess-audio platform driver for the
back-end DAI link serves the architectural requirement that each
DAI link must have a platform driver. Do I get the right impression?
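The FE split Liam describes would look roughly like this in the machine driver (struct layout mocked for illustration, not the real ASoC header; link names are illustrative):

```c
#include <assert.h>
#include <string.h>

/* Simplified mock of snd_soc_dai_link -- not the real ASoC headers. */
struct snd_soc_dai_link {
	const char *name;
	const char *platform_name;
};

/* Ordinary FE: PCM data is DMAed from the ALSA buffer into the DSP FIFO,
 * so this link uses the DMA-based platform driver. */
static const struct snd_soc_dai_link multimedia_fe = {
	.name		= "Multimedia",
	.platform_name	= "omap-pcm-audio",
};

/* MODEM/LP FE: no DMA is required, so this link points at the AESS (DSP)
 * platform driver instead. */
static const struct snd_soc_dai_link modem_fe = {
	.name		= "MODEM",
	.platform_name	= "omap-aess-audio",
};
```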
>
>>> 2. The front-end DAI with stream name "Multimedia" has omap-pcm-audio,
>>> which is DMA based, as its platform driver. This front-end DAI is mapped to
>>> a back-end (i.e. PDM-DL1) whose platform driver is omap-aess-audio.
>>> omap-aess-audio looks to me like the DSP platform driver. If a stream is DMA
>>> based, it seems strange to have a DSP-based back-end.
>>>
>
> The DMA is used to send the PCM data from the ALSA PCM device to the DSP
> FIFO.
>
>>> 3. To the best of my knowledge, omap-abe.c is the front-end
>>> implementation. playback_trigger() seems to go ahead and enable all
>>> back-ends linked to the given front-end. However, a front-end may not be
>>> routed to all back-ends. Where in the code is it ensured that a BE is only
>>> activated when a stream is being routed to that particular back-end?
>>>
>
> This is all done in soc-dsp.c now. We use the DAPM graph to work out all
> valid routes from FE to BE and vice versa.
>
I experimented with dynamic routing. If I issue a mixer command to route AIF_IN
to AIF_OUT through the mixer widget before starting playback, it works.
Otherwise, I get an I/O error from aplay. I think it's acceptable to
require the application to set up the path before the start of playback. For the
device switch case, is it mandatory to route the stream to the mixer of BE
DAI-LINK2 before disconnecting the stream from the mixer of BE DAI-LINK1?
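The route-validity logic Liam describes can be pictured as a walk over the DAPM graph: a stream only activates a BE if some chain of enabled routes connects the FE to it. A toy, self-contained model (widget names invented; the real soc-dsp.c walks the actual DAPM widget graph):

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Represent the DAPM graph as sink <- source routes and walk it to
 * decide whether an FE stream actually reaches a given BE. */
struct route { const char *sink; const char *source; };

/* Currently enabled routes (e.g. mixer switches turned on by userspace). */
static const struct route routes[] = {
	{ "DL1 Mixer",  "Multimedia FE" },
	{ "PDM-DL1 BE", "DL1 Mixer" },
	{ "DL2 Mixer",  "Tones FE" },
	{ "BT-VX BE",   "DL2 Mixer" },
};

/* Depth-first search from src towards dst over the route table. */
static int reaches(const char *src, const char *dst)
{
	size_t i;

	if (!strcmp(src, dst))
		return 1;
	for (i = 0; i < sizeof(routes) / sizeof(routes[0]); i++)
		if (!strcmp(routes[i].source, src) &&
		    reaches(routes[i].sink, dst))
			return 1;
	return 0;
}
```

In this model, "Multimedia FE" reaches "PDM-DL1 BE" through "DL1 Mixer", so that BE would be activated, while "BT-VX BE" would not.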
>>> 4. omap-abe.c manages activation of the BE DAIs and omap-abe-dsp.c manages
>>> routing of front-end to back-end DAPM widgets and routing map. Am I
>>> correct?
>
> Yes, although the routing management is now all in soc-dsp.c
>
>>> This leads to the previous question. How do the two drivers work
>>> together to make sure BEs are only activated if a front-end has been
>>> routed to them?
>>>
>
> soc-dsp.c now marshals all the PCM ops and makes sure that only valid
> paths have their DAIs activated.
>
>
>>> 5. Is there a mechanism for a front-end to switch between the DMA and DSP
>>> platform drivers? It looks to me that the mapping of front-end to platform
>>> driver is predetermined based on the use case. For example, HDMI goes
>>> through DMA as a legacy dailink.
>
> There is no way to dynamically switch platform drivers atm, but this can
> be solved by having a mutually exclusive FE for each use case.
>
>>>
>>> 6. struct abe_data has an array member called dapm. It looks to me that
>>> this array simply tracks DAPM component status, but I don't see it
>>> really being used in a meaningful way in omap-abe-dsp.c.
>>>
>
> It is used by the OMAP4 ABE to work out the OPP power level and to work out
> the routing between FE and BE.
>
> Liam
--
Sent by an employee of the Qualcomm Innovation Center, Inc.
The Qualcomm Innovation Center, Inc. is a member of the Code Aurora Forum.
^ permalink raw reply [flat|nested] 5+ messages in thread
* Re: [alsa-devel] Question about your DSP topic branch
2011-01-27 23:41 ` Question about your DSP topic branch Patrick Lai
@ 2011-03-15 7:08 ` Patrick Lai
2011-03-15 11:25 ` Mark Brown
0 siblings, 1 reply; 5+ messages in thread
From: Patrick Lai @ 2011-03-15 7:08 UTC (permalink / raw)
To: Liam Girdwood; +Cc: linux-arm-msm, alsa-devel, Mark Brown, alsa-devel
Hi Liam,
In the DSP routing scheme, what is the expectation of the back-end CPU/codec
drivers in terms of determining which hardware parameters, especially the
channel mode, to use? The DSP in question is capable of resampling and
down-mixing, so the hardware configuration of a front-end DAI does not need
to match that of the back-end.
Here are the scenarios for which I need clarification:
1. If there are two front-ends routed to the same back-end, which front-end's
hardware parameters should the back-end DAI be based on? For example, one
front-end is mono and another front-end is stereo.
2. Depending on the device mode/use case, I would like to configure the BE to a
different channel mode irrespective of the front-end configuration (i.e.
configuring the back-end for handset mode). Where is the hook to do so under the
ASoC DSP framework?
Thanks
Patrick
On 1/27/2011 3:41 PM, Patrick Lai wrote:
> On 1/25/2011 8:46 AM, Liam Girdwood wrote:
>> Hi Patrick,
>>
>> CCing in Mark and alsa-devel@alsa-project.org (preferred list)
>>
>> On Mon, 2011-01-24 at 23:01 -0800, Patrick Lai wrote:
>>> Hi Liam,
>>>
>>> I have two more questions about your DSP topic branch
>>>
>>> 7. I see in sdp4430.c, SDP4430 MODEM front-end dailink no_host_mode is
>>> set to SND_SOC_DAI_LINK_NO_HOST. What is the purpose of no_host_mode?
>>
>> No host mode means no audio is transferred to the host CPU in this case.
>> i.e. the MODEM routes audio directly between the DSP and mic/speaker.
>>
>> This flag also tells the ASoC core that no DMA will be required, hence
>> the code in the DMA branch will not start any DMA. This part had to be
>> cleaned up for upstream.
>>
>>> Is it for use case that two physical devices can exchange audio data
>>> without host-processor intervention? If so, if user-space application
>>> tries to write to PCM buffer, will framework reject the buffer?
>>>
>>
>> Yes, that's correct. The PCM core will also not complain here when it
>> receives no data either.
>
> I experimented with the NO_HOST option and found that the platform driver's
> PCM copy function is still called. Is it part of the cleanup you were talking about?
>
>>> 8. I see there is a dmic codec (dmic.c) under sound/soc/codecs which is
>>> pretty much just a dummy codec driver. I suppose the configuration of the
>>> DMIC is done in another driver. Would it be better if we could have
>>> something like the fixed voltage regulator, so there is no need to
>>> duplicate the effort?
>>>
>>
>> Yeah, this is a generic DMIC driver. It's designed to have a very wide
>> coverage and should cover most DMICs out there. So it should also be
>> able to fit into your architecture too.
>>
>>> Look forward to seeing your reply soon
>>>
>>> Thanks
>>>
>>> On 1/6/2011 3:39 PM, Patrick Lai wrote:
>>>> Hi Liam,
>>>>
>>>> I synced to your kernel DSP topic branch two days ago in an attempt to
>>>> understand the upcoming ASoC DSP design. I have a few questions to get
>>>> clarification from you.
>>>>
>>>> 1. In sdp4430.c, both the FE DAI link and the BE DAI link have
>>>> omap-aess-audio as the platform driver. How is the omap-aess-audio platform
>>>> driver used in both the front-end and the back-end?
>>>>
>>
>> The MODEM and Low Power (LP) Front Ends (FE) use the AESS platform
>> driver since they do not require DMA, whilst the other FEs use the
>> normal DMA platform driver since they do require DMA.
>>
> PCM functions inside omap-abe-dsp.c seem to be no-ops if the DAI ID is not
> MODEM or LP. I guess reusing the omap-aess-audio platform driver for the
> back-end DAI link serves the architectural requirement that each
> DAI link must have a platform driver. Do I get the right impression?
>
>
>>
>>>> 2. The front-end DAI with stream name "Multimedia" has omap-pcm-audio,
>>>> which is DMA based, as its platform driver. This front-end DAI is mapped to
>>>> a back-end (i.e. PDM-DL1) whose platform driver is omap-aess-audio.
>>>> omap-aess-audio looks to me like the DSP platform driver. If a stream is DMA
>>>> based, it seems strange to have a DSP-based back-end.
>>>>
>>
>> The DMA is used to send the PCM data from the ALSA PCM device to the DSP
>> FIFO.
>>
>>>> 3. To the best of my knowledge, omap-abe.c is the front-end
>>>> implementation. playback_trigger() seems to go ahead and enable all
>>>> back-ends linked to the given front-end. However, a front-end may not be
>>>> routed to all back-ends. Where in the code is it ensured that a BE is only
>>>> activated when a stream is being routed to that particular back-end?
>>>>
>>
>> This is all done in soc-dsp.c now. We use the DAPM graph to work out all
>> valid routes from FE to BE and vice versa.
>>
>
> I experimented with dynamic routing. If I issue a mixer command to route AIF_IN
> to AIF_OUT through the mixer widget before starting playback, it works.
> Otherwise, I get an I/O error from aplay. I think it's acceptable to
> require the application to set up the path before the start of playback. For the
> device switch case, is it mandatory to route the stream to the mixer of BE
> DAI-LINK2 before disconnecting the stream from the mixer of BE DAI-LINK1?
>
>>>> 4. omap-abe.c manages activation of the BE DAIs and omap-abe-dsp.c manages
>>>> routing of front-end to back-end DAPM widgets and routing map. Am I
>>>> correct?
>>
>> Yes, although the routing management is now all in soc-dsp.c
>>
>>>> This leads to the previous question. How do the two drivers work
>>>> together to make sure BEs are only activated if a front-end has been
>>>> routed to them?
>>>>
>>
>> soc-dsp.c now marshals all the PCM ops and makes sure that only valid
>> paths have their DAIs activated.
>>
>>
>>>> 5. Is there a mechanism for a front-end to switch between the DMA and DSP
>>>> platform drivers? It looks to me that the mapping of front-end to platform
>>>> driver is predetermined based on the use case. For example, HDMI goes
>>>> through DMA as a legacy dailink.
>>
>> There is no way to dynamically switch platform drivers atm, but this can
>> be solved by having a mutually exclusive FE for each use case.
>>
>>>>
>>>> 6. struct abe_data has an array member called dapm. It looks to me that
>>>> this array simply tracks DAPM component status, but I don't see it
>>>> really being used in a meaningful way in omap-abe-dsp.c.
>>>>
>>
>> It is used by the OMAP4 ABE to work out the OPP power level and to work out
>> the routing between FE and BE.
>>
>> Liam
>
>
--
Sent by an employee of the Qualcomm Innovation Center, Inc.
The Qualcomm Innovation Center, Inc. is a member of the Code Aurora Forum.
^ permalink raw reply [flat|nested] 5+ messages in thread
* Re: [alsa-devel] Question about your DSP topic branch
2011-03-15 7:08 ` [alsa-devel] " Patrick Lai
@ 2011-03-15 11:25 ` Mark Brown
2011-03-17 7:21 ` Patrick Lai
0 siblings, 1 reply; 5+ messages in thread
From: Mark Brown @ 2011-03-15 11:25 UTC (permalink / raw)
To: Patrick Lai
Cc: Liam Girdwood, linux-arm-msm, alsa-devel, Mark Brown, alsa-devel
On Tue, Mar 15, 2011 at 12:08:39AM -0700, Patrick Lai wrote:
> 1. If there are two front-ends routed to the same back-end, which front-end's
> hardware parameters should the back-end DAI be based on? For example, one
> front-end is mono and another front-end is stereo.
> 2. Depending on the device mode/use case, I would like to configure the BE to a
> different channel mode irrespective of the front-end configuration (i.e.
> configuring the back-end for handset mode). Where is the hook to do so under the
> ASoC DSP framework?
If you can configure the two completely separately then you can just
have a DAPM path between the two devices and don't need to link the DAI
configurations at all - WM8994 is an example of doing this: there is a
bunch of DSP and mixing in the digital section between the DAIs, but
because the DSP includes sample rate conversion the DAIs can be
configured separately and we don't have to worry about propagating
configuration through the DSP.
Thinking off the top of my head, if you do need to link the
configurations sometimes, the first thing that springs to mind is to set
constraints so broad that they're no-ops. We're going to want to support
propagating constraints through, as well as specific settings, anyway,
since some stuff has restrictions like "route DAI 1 to DAI 2 with DAI 2
at any sample rate that's an integer division of the DAI 1 rate", though
that sort of stuff may well be a second pass.
One thing I should mention which I've been working on and which may
(depending on the design of your hardware) be a better match for you is
providing a mechanism for setting up DAI links and starting streams on
them in kernel space based on DAPM routing, with DAPM propagating over
these links automatically. This is mostly intended for representing
things like basebands where you'll have a DAI link that's going to be
running in one of a small number of configurations (often only one)
connected to something else along the lines of:
CODEC < 8kHz mono > Baseband
so the idea is that to the user you can have a couple of pin switches
(or whatever else can be represented) in the baseband to say that the
baseband is passing audio and then DAPM will be extended to figure out
that it should start up the audio interfaces and configure their params
without needing userspace to do it just like it can if the link between
the two were analogue.
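A hypothetical sketch of such a fixed-configuration link - the struct layout is invented for illustration, since at the time of this thread the mechanism was still being designed:

```c
#include <assert.h>

/* Invented model of a fixed-configuration codec<->baseband link: the
 * link only ever runs 8kHz mono, so its params can live in the kernel
 * and DAPM can bring the DAIs up without userspace involvement. */
struct fixed_stream_params {
	unsigned int rate;
	unsigned int channels;
};

struct codec_link {
	const char *name;
	struct fixed_stream_params params;
	int active;	/* flipped when DAPM decides audio is routed */
};

static struct codec_link bb_link = {
	.name	= "CODEC-Baseband",
	.params	= { .rate = 8000, .channels = 1 },	/* 8kHz mono */
	.active	= 0,
};

/* DAPM sees the baseband pin switch enabled -> power the link up. */
static void dapm_power_link(struct codec_link *l, int routed)
{
	l->active = routed;
}
```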
This will complement the work Liam's doing - Liam's work ties together
the configuration within a device; this will help with inter-device links.
^ permalink raw reply [flat|nested] 5+ messages in thread
* Re: [alsa-devel] Question about your DSP topic branch
2011-03-15 11:25 ` Mark Brown
@ 2011-03-17 7:21 ` Patrick Lai
2011-03-17 23:58 ` Mark Brown
0 siblings, 1 reply; 5+ messages in thread
From: Patrick Lai @ 2011-03-17 7:21 UTC (permalink / raw)
To: Mark Brown
Cc: Liam Girdwood, linux-arm-msm, alsa-devel, Mark Brown, alsa-devel
On 3/15/2011 4:25 AM, Mark Brown wrote:
> On Tue, Mar 15, 2011 at 12:08:39AM -0700, Patrick Lai wrote:
>
>> 1. If there are two front-ends routed to the same back-end, which front-end's
>> hardware parameters should the back-end DAI be based on? For example, one
>> front-end is mono and another front-end is stereo.
>
>> 2. Depending on the device mode/use case, I would like to configure the BE to a
>> different channel mode irrespective of the front-end configuration (i.e.
>> configuring the back-end for handset mode). Where is the hook to do so under the
>> ASoC DSP framework?
>
> If you can configure the two completely separately then you can just
> have a DAPM path between the two devices and don't need to link the DAI
> configurations at all - WM8994 is an example of doing this: there is a
> bunch of DSP and mixing in the digital section between the DAIs, but
> because the DSP includes sample rate conversion the DAIs can be
> configured separately and we don't have to worry about propagating
> configuration through the DSP.
Essentially, I have the front-end DAI link representing a PCM stream while
the back-end DAI link represents the physical hardware path. Irrespective
of the PCM configuration of the front-end DAI link, I want the CPU DAI and
CODEC DAI of the back-end DAI link configured to the same PCM configuration.
The DSP can handle resampling/downmixing. How to achieve this use case in the
soc-dsp implementation is not obvious to me. In the existing soc-dsp
implementation, a back-end does not get activated unless a front-end is
routed to it. Then, the PCM parameters of the front-end get propagated
to the CPU driver and codec driver of the back-end DAI link. I don't see how I
would be able to pass a PCM configuration specifically for the back-end.
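One plausible shape for such a hook - later revisions of the ASoC DSP code grew a per-link back-end hw_params fixup callback for exactly this purpose - modeled here with mocked types rather than the real API:

```c
#include <assert.h>

/* Toy model of a back-end hw_params fixup: whatever the front-end
 * negotiated, the BE link rewrites the params to its fixed hardware
 * configuration before they reach the BE CPU/codec DAIs. Types are
 * mocked for illustration, not taken from a real header. */
struct pcm_params {
	unsigned int rate;
	unsigned int channels;
};

/* FE params propagate in; the BE forces its own fixed configuration,
 * relying on the DSP to resample and downmix/upmix in between. */
static int be_fixup_48k_stereo(struct pcm_params *params)
{
	params->rate = 48000;
	params->channels = 2;
	return 0;
}

/* Example: a mono 8kHz voice FE still yields a 48kHz stereo BE. */
static int fixup_forces_48k_stereo(void)
{
	struct pcm_params p = { 8000, 1 };

	be_fixup_48k_stereo(&p);
	return p.rate == 48000 && p.channels == 2;
}
```

Installed on the BE DAI link, such a callback would decouple the BE hardware configuration from whichever FE happens to be routed to it.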
>
> One thing I should mention which I've been working on and which may
> (depending on the design of your hardware) be a better match for you is
> providing a mechanism for setting up DAI links and starting streams on
> them in kernel space based on DAPM routing, with DAPM propagating over
> these links automatically. This is mostly intended for representing
> things like basebands where you'll have a DAI link that's going to be
> running in one of a small number of configurations (often only one)
> connected to something else along the lines of:
>
> CODEC < 8kHz mono > Baseband
>
> so the idea is that to the user you can have a couple of pin switches
> (or whatever else can be represented) in the baseband to say that the
> baseband is passing audio and then DAPM will be extended to figure out
> that it should start up the audio interfaces and configure their params
> without needing userspace to do it just like it can if the link between
> the two were analogue.
>
Yes, I do have a use case in mind which requires the functionality you
described, activating a back-end DAI link without a front-end DAI being
activated. DAPM seems to be the most appropriate approach.
--
Sent by an employee of the Qualcomm Innovation Center, Inc.
The Qualcomm Innovation Center, Inc. is a member of the Code Aurora Forum.
^ permalink raw reply [flat|nested] 5+ messages in thread
* Re: [alsa-devel] Question about your DSP topic branch
2011-03-17 7:21 ` Patrick Lai
@ 2011-03-17 23:58 ` Mark Brown
0 siblings, 0 replies; 5+ messages in thread
From: Mark Brown @ 2011-03-17 23:58 UTC (permalink / raw)
To: Patrick Lai; +Cc: Liam Girdwood, linux-arm-msm, alsa-devel, alsa-devel
On Thu, Mar 17, 2011 at 12:21:57AM -0700, Patrick Lai wrote:
> Yes, I do have a use case in mind which requires the functionality you
> described, activating a back-end DAI link without a front-end DAI being
> activated. DAPM seems to be the most appropriate approach.
OK, cool - I'll try to remember to CC you on the patches when I get them
done. I need to get it done for .40 so it should be soonish.
^ permalink raw reply [flat|nested] 5+ messages in thread
end of thread, other threads:[~2011-03-17 23:58 UTC | newest]
Thread overview: 5+ messages
[not found] <4D2652C8.7030701@codeaurora.org>
[not found] ` <4D3E7536.9070906@codeaurora.org>
[not found] ` <1295973973.3322.347.camel@odin>
2011-01-27 23:41 ` Question about your DSP topic branch Patrick Lai
2011-03-15 7:08 ` [alsa-devel] " Patrick Lai
2011-03-15 11:25 ` Mark Brown
2011-03-17 7:21 ` Patrick Lai
2011-03-17 23:58 ` Mark Brown