Real-Time Rendering for AR/VR with Novel View Synthesis
(2025) MAMM15 20251
Ergonomics and Aerosol Technology
Department of Design Sciences
- Abstract
- Real-time 3D reconstruction is crucial for immersive AR/VR but is hard to achieve with traditional photogrammetry and mesh-based pipelines due to latency and limited scalability. Neural Radiance Fields (NeRF) enable high-quality novel view synthesis but suffer from long training and slow rendering, while Gaussian Splatting (GS) achieves real-time rendering at the cost of heavy preprocessing and memory usage. This thesis presents a comparative study of NeRF and GS in terms of visual quality, efficiency, and scalability, and proposes an optimized GS-based pipeline for real-time AR/VR. The system introduces three main techniques: (1) clustering-based reduction of COLMAP points to cut preprocessing to under two minutes; (2) gradient-aware clustering of Spherical Harmonic coefficients to shrink storage; and (3) quantization-aware training to reduce precision and memory requirements without harming visual quality. The optimized pipeline lowers total processing time to as little as 5 minutes, reaches up to 180 FPS at 1080p, reduces model size by 6–7×, and maintains competitive fidelity (PSNR > 26 dB, SSIM ≈ 0.80), bringing practical real-time novel view synthesis closer to deployable AR/VR systems.
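As a rough illustration of technique (1), the sketch below shows one plausible way to reduce a COLMAP sparse point cloud by clustering before Gaussian Splatting initialisation. This is a minimal sketch under assumptions: the choice of MiniBatchKMeans, the target cluster count, and the function name are illustrative and are not the pipeline implemented in the thesis.

```python
# Minimal sketch: cluster a COLMAP sparse point cloud and keep one
# representative point per cluster, so Gaussian Splatting starts from a
# much smaller initialisation set. Illustrative only, not the thesis code.
import numpy as np
from sklearn.cluster import MiniBatchKMeans

def reduce_colmap_points(points_xyz: np.ndarray,
                         colors_rgb: np.ndarray,
                         n_clusters: int = 50_000):
    """Return (reduced_xyz, reduced_rgb) with one point per cluster."""
    km = MiniBatchKMeans(n_clusters=n_clusters, batch_size=4096)
    labels = km.fit_predict(points_xyz)

    # Cluster centres become the reduced point positions.
    reduced_xyz = km.cluster_centers_

    # Average the colours of the points assigned to each cluster.
    reduced_rgb = np.zeros((n_clusters, 3), dtype=np.float64)
    for c in range(n_clusters):
        mask = labels == c
        if mask.any():
            reduced_rgb[c] = colors_rgb[mask].mean(axis=0)
    return reduced_xyz, reduced_rgb
```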
Please use this URL to cite or link to this publication:
http://lup.lub.lu.se/student-papers/record/9215547
- author
- Maithani, Garima LU
- supervisor
- organization
- course
- MAMM15 20251
- year
- 2025
- type
- H2 - Master's Degree (Two Years)
- subject
- keywords
- Real-time Rendering, 3D reconstruction, Gaussian Splatting, Novel View Synthesis, Neural Radiance Fields, Virtual Reality, Augmented Reality, Artificial Intelligence, Machine Learning
- language
- English
- id
- 9215547
- date added to LUP
- 2025-11-19 10:08:19
- date last changed
- 2025-11-19 10:08:19
@misc{9215547,
abstract = {{Real-time 3D reconstruction is crucial for immersive AR/VR but is hard to achieve with traditional photogrammetry and mesh-based pipelines due to latency and limited scalability. Neural Radiance Fields (NeRF) enable high-quality novel view synthesis but suffer from long training and slow rendering, while Gaussian Splatting (GS) achieves real-time rendering at the cost of heavy preprocessing and memory usage. This thesis presents a comparative study of NeRF and GS in terms of visual quality, efficiency, and scalability, and proposes an optimized GS-based pipeline for real-time AR/VR. The system introduces three main techniques: (1) clustering-based reduction of COLMAP points to cut preprocessing to under two minutes; (2) gradient-aware clustering of Spherical Harmonic coefficients to shrink storage; and (3) quantization-aware training to reduce precision and memory requirements without harming visual quality. The optimized pipeline lowers total processing time to as little as 5 minutes, reaches up to 180 FPS at 1080p, reduces model size by 6–7×, and maintains competitive fidelity (PSNR > 26 dB, SSIM ≈ 0.80), bringing practical real-time novel view synthesis closer to deployable AR/VR systems.}},
author = {{Maithani, Garima}},
language = {{eng}},
note = {{Student Paper}},
title = {{Real-Time Rendering for AR/VR with Novel View Synthesis}},
year = {{2025}},
}