From: Imre Deak <imre.deak@intel.com>
To: Jani Nikula <jani.nikula@intel.com>
Cc: intel-gfx@lists.freedesktop.org, intel-xe@lists.freedesktop.org
Subject: Re: [PATCH 05/14] drm/i915/dp: Move max DSC BPP reduction one level higher
Date: Fri, 31 Jan 2025 16:26:24 +0200
Message-ID: <Z5zdkN58_u4gdozL@ideak-desk.fi.intel.com>
In-Reply-To: <62fa7f18ea49dce24c5d0ee7b2f0cbde9e2b609c.1738327620.git.jani.nikula@intel.com>
On Fri, Jan 31, 2025 at 02:49:58PM +0200, Jani Nikula wrote:
> Now that {icl,xelpd}_dsc_compute_link_config() take a .4 fixed-point
> value as a parameter, move the common max DSC BPP reduction one level
> higher. Use intel_dp_dsc_bpp_step_x16() to compute the step, and pass it
> on to both platform-specific functions. (Though it's unused for now in
> icl_dsc_compute_link_config().)
>
> We can drop the pipe_bpp and connector parameters.
>
> Signed-off-by: Jani Nikula <jani.nikula@intel.com>
Reviewed-by: Imre Deak <imre.deak@intel.com>
> ---
> drivers/gpu/drm/i915/display/intel_dp.c | 32 +++++++++++--------------
> 1 file changed, 14 insertions(+), 18 deletions(-)
>
> diff --git a/drivers/gpu/drm/i915/display/intel_dp.c b/drivers/gpu/drm/i915/display/intel_dp.c
> index b13d806c9de7..4e7b3dd4067c 100644
> --- a/drivers/gpu/drm/i915/display/intel_dp.c
> +++ b/drivers/gpu/drm/i915/display/intel_dp.c
> @@ -2079,14 +2079,10 @@ icl_dsc_compute_link_config(struct intel_dp *intel_dp,
> const struct link_config_limits *limits,
> int min_bpp_x16,
> int max_bpp_x16,
> - int pipe_bpp,
> + int bpp_step_x16,
> int timeslots)
> {
> int i, ret;
> - int output_bpp = intel_dp_output_bpp(pipe_config->output_format, pipe_bpp);
> -
> - /* Compressed BPP should be less than the Input DSC bpp */
> - max_bpp_x16 = min(max_bpp_x16, fxp_q4_from_int(output_bpp - 1));
>
> for (i = ARRAY_SIZE(valid_dsc_bpp) - 1; i >= 0; i--) {
> if (valid_dsc_bpp[i] < fxp_q4_to_int(min_bpp_x16) ||
> @@ -2116,24 +2112,17 @@ icl_dsc_compute_link_config(struct intel_dp *intel_dp,
> */
> static int
> xelpd_dsc_compute_link_config(struct intel_dp *intel_dp,
> - const struct intel_connector *connector,
> struct intel_crtc_state *pipe_config,
> const struct link_config_limits *limits,
> int min_bpp_x16,
> int max_bpp_x16,
> - int pipe_bpp,
> + int bpp_step_x16,
> int timeslots)
> {
> struct intel_display *display = to_intel_display(intel_dp);
> - int output_bpp = intel_dp_output_bpp(pipe_config->output_format, pipe_bpp);
> - int bpp_x16, bpp_step_x16;
> + int bpp_x16;
> int ret;
>
> - bpp_step_x16 = intel_dp_dsc_bpp_step_x16(connector);
> -
> - /* Compressed BPP should be less than the Input DSC bpp */
> - max_bpp_x16 = min(max_bpp_x16, fxp_q4_from_int(output_bpp) - bpp_step_x16);
> -
> for (bpp_x16 = max_bpp_x16; bpp_x16 >= min_bpp_x16; bpp_x16 -= bpp_step_x16) {
> if (intel_dp->force_dsc_fractional_bpp_en &&
> !fxp_q4_to_frac(bpp_x16))
> @@ -2165,9 +2154,10 @@ static int dsc_compute_compressed_bpp(struct intel_dp *intel_dp,
> {
> struct intel_display *display = to_intel_display(intel_dp);
> const struct drm_display_mode *adjusted_mode = &pipe_config->hw.adjusted_mode;
> + int output_bpp;
> int dsc_min_bpp;
> int dsc_max_bpp;
> - int min_bpp_x16, max_bpp_x16;
> + int min_bpp_x16, max_bpp_x16, bpp_step_x16;
> int dsc_joiner_max_bpp;
> int num_joined_pipes = intel_crtc_num_joined_pipes(pipe_config);
>
> @@ -2182,11 +2172,17 @@ static int dsc_compute_compressed_bpp(struct intel_dp *intel_dp,
> min_bpp_x16 = fxp_q4_from_int(dsc_min_bpp);
> max_bpp_x16 = fxp_q4_from_int(dsc_max_bpp);
>
> + bpp_step_x16 = intel_dp_dsc_bpp_step_x16(connector);
> +
> + /* Compressed BPP should be less than the Input DSC bpp */
> + output_bpp = intel_dp_output_bpp(pipe_config->output_format, pipe_bpp);
> + max_bpp_x16 = min(max_bpp_x16, fxp_q4_from_int(output_bpp) - bpp_step_x16);
> +
> if (DISPLAY_VER(display) >= 13)
> - return xelpd_dsc_compute_link_config(intel_dp, connector, pipe_config, limits,
> - min_bpp_x16, max_bpp_x16, pipe_bpp, timeslots);
> + return xelpd_dsc_compute_link_config(intel_dp, pipe_config, limits,
> + min_bpp_x16, max_bpp_x16, bpp_step_x16, timeslots);
> return icl_dsc_compute_link_config(intel_dp, pipe_config, limits,
> - min_bpp_x16, max_bpp_x16, pipe_bpp, timeslots);
> + min_bpp_x16, max_bpp_x16, bpp_step_x16, timeslots);
> }
>
> int intel_dp_dsc_min_src_input_bpc(void)
> --
> 2.39.5
>