Consistent Quantity–Quality Control across Scenes for Deployment-Aware
Gaussian Splatting

1Tsinghua University, 2Pengcheng Laboratory

Abstract

To reduce storage and computational costs, 3D Gaussian splatting (3DGS) seeks to minimize the number of Gaussians used while preserving high rendering quality, introducing an inherent trade-off between Gaussian quantity and rendering quality. Existing methods strive for better quantity–quality performance but do not let users intuitively adjust this trade-off to suit deployment under diverse hardware and communication constraints. Here, we present ControlGS, a 3DGS optimization method that achieves semantically meaningful and cross-scene consistent quantity–quality control. With a single training run under a fixed setup and a user-specified hyperparameter reflecting quantity–quality preference, ControlGS automatically finds desirable trade-off points across scenes, from compact objects to large outdoor environments, outperforms baselines by achieving higher rendering quality with fewer Gaussians, and supports stepless control over the trade-off.

Method Overview

Illustration of our proposed ControlGS:

Starting from a sparse point cloud reconstructed via SfM, we initialize an anisotropic Gaussian set and alternate between uniform octree-style subdivision and sparsity-driven pruning.

  • Uniform Gaussian Branching: Whenever the number of pruned Gaussians falls below a threshold, we split every surviving Gaussian into eight children in an octree fashion; the children inherit their parent's attributes, yielding a coarse-to-fine frequency progression without scene-specific heuristics (see the sketch after this list).
  • Gaussian Atrophy: We add an L1 opacity regularization that applies a constant negative gradient to each Gaussian’s opacity, causing low-contribution Gaussians to “self-shrink” and be removed once their opacity drops below a threshold.
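
For concreteness, below is a minimal PyTorch-style sketch of the uniform eight-way split. The function name, tensor layout, corner offsets, and scale-halving rule are illustrative assumptions, not the released implementation.

```python
import torch

def branch_gaussians(positions, scales, opacities, features):
    """Uniform octree-style branching sketch: every surviving Gaussian is
    replaced by eight children that inherit the parent's attributes.
    The corner-offset and scale-halving rules below are illustrative
    assumptions, not the paper's exact formulation."""
    n = positions.shape[0]
    # The eight octant directions (corners of a cube centered on the parent).
    octants = torch.tensor(
        [[sx, sy, sz] for sx in (-1.0, 1.0)
                      for sy in (-1.0, 1.0)
                      for sz in (-1.0, 1.0)],
        device=positions.device,
    )  # shape (8, 3)

    # Place each child at one corner of the parent's extent.
    child_pos = positions[:, None, :] + 0.5 * scales[:, None, :] * octants[None, :, :]
    child_pos = child_pos.reshape(8 * n, 3)

    # Children inherit parent attributes; halving the scale refines detail
    # coarse-to-fine with each branching round.
    child_scales = (0.5 * scales).repeat_interleave(8, dim=0)
    child_opacities = opacities.repeat_interleave(8, dim=0)
    child_features = features.repeat_interleave(8, dim=0)
    return child_pos, child_scales, child_opacities, child_features
```

In the full training loop, this branching step would be triggered whenever the number of recently pruned Gaussians falls below a threshold, alternating with the atrophy-driven pruning described above.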

The controllable core of ControlGS is a single hyperparameter, λ_α, which scales the atrophy loss:

  • Increasing λ_α strengthens pruning, yielding more compact models;
  • Decreasing λ_α preserves more Gaussians for higher-fidelity rendering.

By training only once under a fixed setup and adjusting λ_α, ControlGS enables consistent, stepless, and linear control of the trade-off between Gaussian quantity and rendering quality across diverse scenes, making it efficient to generate multiple model variants tailored to different deployment needs.
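
As a minimal sketch, assuming opacities are stored as a flat tensor, the atrophy term could enter the training objective as shown below; the function names, the pruning threshold, and the example λ_α values are placeholders rather than the paper's settings.

```python
import torch

def total_loss(render_loss, opacities, lambda_alpha):
    # L1 opacity regularization: contributes a constant gradient of
    # magnitude lambda_alpha to every (positive) opacity, so weakly
    # contributing Gaussians steadily atrophy toward zero.
    return render_loss + lambda_alpha * opacities.abs().sum()

def prune_mask(opacities, eps=0.005):
    # Keep only Gaussians whose opacity is still above the removal
    # threshold (eps is a placeholder value, not the paper's setting).
    return opacities > eps

# Each deployment variant needs only a single training run under the same
# fixed setup; only lambda_alpha changes (example values, illustrative).
lambda_alpha_choices = {"high_fidelity": 1e-4, "balanced": 1e-3, "compact": 5e-3}
```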

Results

Quantity–Quality Control Performance

ControlGS achieves smooth, stepless, and predictable control over the trade-off between rendering quality and Gaussian quantity across diverse scenes, spanning high-fidelity reconstructions to highly compressed models, and significantly outperforms baseline methods in control consistency, range, and precision.


Comparison with SOTA Methods

Compared to existing methods, ControlGS achieves higher rendering quality with fewer Gaussians on unseen test views, consistently preserving intricate structures and high-frequency textures across diverse scenes.


Interactive Demo (Coming Soon)

We are building an interactive demo system that will automatically detect your device’s performance and display 3DGS models from various scenes, each optimized for the best quantity–quality balance on your device.


You will also be able to manually select different trade-off points to explore how different scenes perform under varying quantity–quality settings.


This demo is currently under development and will be available soon.