Liren Jin
Ph.D. Student
Contact:
Email: ljin@uni-bonn.de
Tel: +49 228 73 29 04
Fax: +49 228 73 27 12
Office: Nussallee 15, 1. OG, room 1.001
Address:
University of Bonn
Photogrammetry, IGG
Nussallee 15
53115 Bonn

Profiles: Google Scholar | GitHub | LinkedIn
Research Interests
- Active Perception
- 3D Reconstruction
- Implicit Neural Representations
Short CV
Liren Jin has been a Ph.D. student at the University of Bonn since March 2021, advised by Prof. Marija Popović and Prof. Cyrill Stachniss. He received his Master’s degree from RWTH Aachen University in 2020 and his Bachelor’s degree in 2016, both in Mechanical Engineering. During his Master’s studies, he spent six months as an intern at Volkswagen AG working on multi-robot navigation.
Projects
- PhenoRob – Robotics and Phenotyping for Sustainable Crop Production (DFG Cluster of Excellence)
Publications
2025
- Y. Pan, X. Zhong, L. Jin, L. Wiesmann, M. Popović, J. Behley, and C. Stachniss, “PINGS: Gaussian Splatting Meets Distance Fields within a Point-Based Implicit Neural Map,” arXiv preprint arXiv:2502.05752, 2025.
[BibTeX] [PDF]
Robots require high-fidelity reconstructions of their environment for effective operation. Such scene representations should be both geometrically accurate and photorealistic to support downstream tasks. While this can be achieved by building distance fields from range sensors and radiance fields from cameras, the scalable incremental mapping of both fields consistently and at the same time with high quality remains challenging. In this paper, we propose a novel map representation that unifies a continuous signed distance field and a Gaussian splatting radiance field within an elastic and compact point-based implicit neural map. By enforcing geometric consistency between these fields, we achieve mutual improvements by exploiting both modalities. We devise a LiDAR-visual SLAM system called PINGS using the proposed map representation and evaluate it on several challenging large-scale datasets. Experimental results demonstrate that PINGS can incrementally build globally consistent distance and radiance fields encoded with a compact set of neural points. Compared to the state-of-the-art methods, PINGS achieves superior photometric and geometric rendering at novel views by leveraging the constraints from the distance field. Furthermore, by utilizing dense photometric cues and multi-view consistency from the radiance field, PINGS produces more accurate distance fields, leading to improved odometry estimation and mesh reconstruction.
@article{pan2025arxiv,
  author  = {Y. Pan and X. Zhong and L. Jin and L. Wiesmann and M. Popovi\'c and J. Behley and C. Stachniss},
  title   = {{PINGS: Gaussian Splatting Meets Distance Fields within a Point-Based Implicit Neural Map}},
  journal = arxiv,
  year    = 2025,
  volume  = {arXiv:2502.05752},
  url     = {https://arxiv.org/pdf/2502.05752},
}
2024
- L. Jin, X. Zhong, Y. Pan, J. Behley, C. Stachniss, and M. Popović, “ActiveGS: Active Scene Reconstruction using Gaussian Splatting,” arXiv preprint arXiv:2412.17769, 2024.
[BibTeX] [PDF]
@article{jin2024arxiv,
  author  = {L. Jin and X. Zhong and Y. Pan and J. Behley and C. Stachniss and M. Popovi\'c},
  title   = {{ActiveGS: Active Scene Reconstruction using Gaussian Splatting}},
  journal = arxiv,
  year    = 2024,
  volume  = {arXiv:2412.17769},
  url     = {https://arxiv.org/pdf/2412.17769},
}
- L. Jin, H. Kuang, Y. Pan, C. Stachniss, and M. Popović, “STAIR: Semantic-Targeted Active Implicit Reconstruction,” in Proc. of the IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems (IROS), 2024. doi:10.1109/IROS58592.2024.10801401
[BibTeX] [PDF] [Code]
@inproceedings{jin2024iros,
  author    = {L. Jin and H. Kuang and Y. Pan and C. Stachniss and M. Popovi\'c},
  title     = {{STAIR: Semantic-Targeted Active Implicit Reconstruction}},
  booktitle = iros,
  year      = 2024,
  codeurl   = {https://github.com/dmar-bonn/stair},
  doi       = {10.1109/IROS58592.2024.10801401},
}
- S. Pan, L. Jin, X. Huang, C. Stachniss, M. Popović, and M. Bennewitz, “Exploiting Priors from 3D Diffusion Models for RGB-Based One-Shot View Planning,” in Proc. of the IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems (IROS), 2024. doi:10.1109/IROS58592.2024.10802551
[BibTeX] [PDF]
@inproceedings{pan2024iros,
  author    = {S. Pan and L. Jin and X. Huang and C. Stachniss and M. Popovi\'c and M. Bennewitz},
  title     = {{Exploiting Priors from 3D Diffusion Models for RGB-Based One-Shot View Planning}},
  booktitle = iros,
  year      = 2024,
  doi       = {10.1109/IROS58592.2024.10802551},
}
- S. Pan, L. Jin, X. Huang, C. Stachniss, M. Popović, and M. Bennewitz, “Exploiting Priors from 3D Diffusion Models for RGB-Based One-Shot View Planning,” in Proc. of the ICRA Workshop on Neural Fields in Robotics (RoboNerF), 2024.
[BibTeX]
@inproceedings{pan2024icraws,
  title     = {{Exploiting Priors from 3D Diffusion Models for RGB-Based One-Shot View Planning}},
  author    = {S. Pan and L. Jin and X. Huang and C. Stachniss and M. Popovi\'c and M. Bennewitz},
  booktitle = {Proc. of the ICRA Workshop on Neural Fields in Robotics (RoboNerF)},
  year      = {2024},
}
2023
- L. Jin, X. Chen, J. Rückin, and M. Popović, “NeU-NBV: Next Best View Planning Using Uncertainty Estimation in Image-Based Neural Rendering,” in Proc. of the IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems (IROS), 2023.
[BibTeX] [PDF] [Code]
@inproceedings{jin2023iros,
  title     = {{NeU-NBV: Next Best View Planning Using Uncertainty Estimation in Image-Based Neural Rendering}},
  booktitle = iros,
  author    = {Jin, Liren and Chen, Xieyuanli and R{\"u}ckin, Julius and Popovi\'c, Marija},
  year      = {2023},
  codeurl   = {https://github.com/dmar-bonn/neu-nbv},
}
2022
- J. Rückin, L. Jin, F. Magistri, C. Stachniss, and M. Popović, “Informative Path Planning for Active Learning in Aerial Semantic Mapping,” in Proc. of the IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems (IROS), 2022.
[BibTeX] [PDF] [Code]
@inproceedings{rueckin2022iros,
  author    = {J. R{\"u}ckin and L. Jin and F. Magistri and C. Stachniss and M. Popovi\'c},
  title     = {{Informative Path Planning for Active Learning in Aerial Semantic Mapping}},
  booktitle = iros,
  year      = {2022},
  codeurl   = {https://github.com/dmar-bonn/ipp-al},
}
- J. Rückin, L. Jin, and M. Popović, “Adaptive Informative Path Planning Using Deep Reinforcement Learning for UAV-based Active Sensing,” in Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2022.
[BibTeX] [PDF] [Code]
@inproceedings{rueckin2022icra,
  author    = {R{\"u}ckin, Julius and Jin, Liren and Popovi\'c, Marija},
  booktitle = icra,
  title     = {{Adaptive Informative Path Planning Using Deep Reinforcement Learning for UAV-based Active Sensing}},
  year      = {2022},
  codeurl   = {https://github.com/dmar-bonn/ipp-rl},
}
- L. Jin, J. Rückin, S. H. Kiss, T. Vidal-Calleja, and M. Popović, “Adaptive-resolution field mapping using Gaussian process fusion with integral kernels,” IEEE Robotics and Automation Letters (RA-L), vol. 7, iss. 3, pp. 7471–7478, 2022.
[BibTeX] [PDF] [Code]
@article{jin2022ral,
  title   = {{Adaptive-resolution field mapping using Gaussian process fusion with integral kernels}},
  author  = {Jin, Liren and R{\"u}ckin, Julius and Kiss, Stefan H. and Vidal-Calleja, Teresa and Popovi{\'c}, Marija},
  journal = ral,
  volume  = {7},
  number  = {3},
  pages   = {7471--7478},
  year    = {2022},
  codeurl = {https://github.com/dmar-bonn/argpf_mapping},
}
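Note that the entries above use BibTeX string abbreviations (`arxiv`, `iros`, `icra`, `ral`) in their `journal` and `booktitle` fields rather than literal strings. To compile these entries standalone, matching `@string` definitions must be provided first. A minimal sketch is below; the exact expansions used by this site's bibliography are an assumption, inferred from the rendered citations:

```bibtex
% Hypothetical @string definitions for the abbreviations used in the
% entries above; the site's actual expansions may differ slightly.
@string{arxiv = {arXiv preprint}}
@string{iros  = {Proc.~of the IEEE/RSJ Intl.~Conf.~on Intelligent Robots and Systems (IROS)}}
@string{icra  = {Proc.~of the IEEE Intl.~Conf.~on Robotics \& Automation (ICRA)}}
@string{ral   = {IEEE Robotics and Automation Letters (RA-L)}}
```

Placing these definitions at the top of the `.bib` file (or in a shared abbreviations file loaded before it) lets every entry resolve its venue macro consistently.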