This is the official implementation of our paper "Riemannian Optimization on Relaxed Indicator Matrix Manifold". We propose the Relaxed Indicator Matrix (RIM) manifold, a fundamental manifold in machine learning, and develop a set of Riemannian optimization methods on it.
Optimization over the indicator matrix manifold is an NP-hard discrete problem. We relax it into a continuous form in which each row is a probability vector summing to 1 and each column sum is constrained to lie within a given range.
These column constraints encode prior information that can be incorporated into machine learning systems. Specifically, when the column constraints are sufficiently loose, the manifold degenerates into the single-stochastic manifold, and when they are sufficiently tight, it degenerates into the double-stochastic manifold.
Therefore, our manifold generalizes both.
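Concretely, the RIM constraint set can be written as follows (our notation, a sketch; see the paper for the exact definition used there):

```latex
\mathcal{M} \;=\; \Bigl\{\, M \in \mathbb{R}^{n \times c} \;:\;
  M \ge 0,\quad M \mathbf{1}_c = \mathbf{1}_n,\quad
  l \;\le\; M^{\top} \mathbf{1}_n \;\le\; u \,\Bigr\}
```

Here $n$ is the number of samples, $c$ is the number of classes, and $l, u \in \mathbb{R}^c$ bound the column sums. Taking $l = \mathbf{0}$ and $u = n\mathbf{1}$ removes the column constraints (the single-stochastic case), while taking $l = u$ pins every column sum (the double-stochastic-type case).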
- Broad application space: In most tasks such as clustering and classification that require learning an indicator matrix, relaxing to the RIM manifold may be a viable alternative to relaxing to the single-stochastic manifold (K-means) or the Stiefel manifold (spectral clustering).
- Introducing column sum information: Compared to other relaxation methods, this approach incorporates column sum information into the indicator matrix, representing the probability distribution of each category. This provides learning systems with additional prior information that can be leveraged.
- Complete theory: We provide a full proof, including the Riemannian gradient and Retraction mapping, ensuring the completeness of our work.
- Efficiency and effectiveness of the algorithm: In some tests, our method runs in $\mathcal{O}(n)$ time, compared with $\mathcal{O}(n^3)$ for the double-stochastic manifold. Compared to various existing algorithms, it also achieves a lower optimization objective value, significantly improves clustering performance, and can serve as a post-processing tool to enhance clustering results.
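As a concrete illustration of the constraint set (not the authors' Riemannian method or the released toolbox), the RIM constraints are convex, so one can project onto them with Dykstra's alternating projections between the row-simplex constraint and the column-sum box constraint. A minimal NumPy sketch; all helper names here are our own:

```python
import numpy as np

def project_row_simplex(M):
    # Euclidean projection of each row onto the probability simplex,
    # using the standard sort-based algorithm.
    n, c = M.shape
    s = -np.sort(-M, axis=1)                 # rows sorted descending
    cssv = np.cumsum(s, axis=1) - 1.0
    ind = np.arange(1, c + 1)
    cond = s - cssv / ind > 0
    rho = cond.sum(axis=1)                   # support size per row
    theta = cssv[np.arange(n), rho - 1] / rho
    return np.maximum(M - theta[:, None], 0.0)

def project_column_box(M, low, high):
    # Euclidean projection onto {M : low_j <= sum_i M_ij <= high_j}:
    # shift each violating column uniformly toward the nearest bound.
    s = M.sum(axis=0)
    target = np.clip(s, low, high)
    return M + (target - s) / M.shape[0]

def dykstra_rim_projection(M, low, high, iters=200):
    # Dykstra's alternating projections onto the intersection of the
    # two convex sets above, i.e. the RIM constraint set.
    X = M.copy()
    P = np.zeros_like(M)
    Q = np.zeros_like(M)
    for _ in range(iters):
        Y = project_row_simplex(X + P)
        P = X + P - Y
        X = project_column_box(Y + Q, low, high)
        Q = Y + Q - X
    return X
```

This is only meant to make the feasible set tangible; the paper's Riemannian gradient and retraction operate on this set far more efficiently than generic projection loops.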
- arXiv Preprint: arXiv:2503.20505
- Ranking 3rd in clustering-algorithms: the paper has garnered 163 likes on alphaXiv as of Apr. 3, 2025.
- Ranking 2nd in clustering-algorithms: the paper has garnered 203 likes on alphaXiv as of Apr. 11, 2025.
- ACCEPT!: the paper has been accepted by ICLR 2026.
- Please stay tuned for further updates…
For the image denoising problem, we compare the RIM manifold with the double-stochastic manifold: the RIM manifold took 29.77 s with a loss value of 1.05e5, while the double-stochastic manifold took 85.33 s with a loss value of 1.17e5.
Here is the comparison table of various optimization algorithms' time and loss values for the Ratio Cut problem:
Here is the loss function descent curve for some datasets:
For additional information, including images and tables, please refer to the paper.
- Code:
Relaxd_Indicator_Matrix_factory.m: This is the core file, a Riemannian geometry toolbox compatible with Manopt, specifically designed for the RIM manifold.
- demo:
RIM_denoising.m: This is the file for quickly testing the RIM manifold for denoising.
yjh.png: This is the same cartoon image used for testing in the paper.
- Manopt: To use the RIM manifold Riemannian toolbox based on Manopt, you first need to ensure that Manopt is installed.
- Download file: Please download the RIM manifold files as follows.
```
# Clone the repository
git clone https://github.com/Yuan-Jinghui/Riemannian-Optimization-on-Relaxed-Indicator-Matrix-Manifold.git
```
- Place core files: Please place the files as follows.
```
-manopt
--manifolds
---multinomial
----Relaxd_Indicator_Matrix_factory.m
-manopt
--examples
---RIM_denoising.m
---yjh.png
```
- Run the code!: Add all the files to the path and run `RIM_denoising.m` in the MATLAB command line.
If you find our work useful in your research, please consider citing:
```
@misc{yuan2025riemannianoptimizationrelaxedindicator,
      title={Riemannian Optimization on Relaxed Indicator Matrix Manifold},
      author={Jinghui Yuan and Fangyuan Xie and Feiping Nie and Xuelong Li},
      year={2025},
      eprint={2503.20505},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2503.20505},
}
```
We would like to thank the manifold optimization community for their support, including Manopt, Manopt.jl, and their authors. We also thank everyone who reads and cites the paper and uses the RIM manifold in their research.
This project is licensed under the MIT License - see the LICENSE file for details.
For questions and inquiries about this research, please refer to the contact information in the paper.




