Ziyi Yang
I am a master's student at Zhejiang University, advised by Professor Xiaogang Jin.
My research lies at the intersection of neural rendering, inverse rendering, and computer graphics.
I am currently interested in 3D Gaussian Splatting (though I am not optimistic about it).
I am now a research intern at ByteDance MMLab, where I am having a very pleasant time.
Prior to joining ZJU, I received my Bachelor's degree from Shanghai Jiao Tong University in 2022.
I am looking for a Fall 2025 Ph.D. position in computer graphics / 3D vision. Feel free to contact me!
Email /
Google Scholar /
Github
- [2024.09] Two papers have been accepted to NeurIPS 2024.
- [2024.07] One paper has been accepted to SIGGRAPH Asia 2024 (TOG).
- [2024.02] Two papers have been accepted to CVPR 2024.
- [2023.12] One paper has been accepted to AAAI 2024.
Publications
Spec-Gaussian: Anisotropic View-Dependent Appearance for 3D Gaussian Splatting
Ziyi Yang, Xinyu Gao, Yang-Tian Sun, Yi-Hua Huang, Xiaoyang Lyu, Wen
Zhou, Shaohui Jiao, Xiaojuan Qi†, Xiaogang Jin†
NeurIPS, 2024.
Paper /
Project Page /
Code
Spec-Gaussian aims to tackle scenes with specular highlights and anisotropy. The key idea is to employ an anisotropic spherical Gaussian (ASG) appearance field instead of spherical harmonics (SH) to model the appearance of 3D Gaussians.
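As a rough illustration of what an ASG lobe looks like, here is a minimal sketch using the standard ASG formulation; the function and parameter names are hypothetical and this is not the released code.

    import torch

    def eval_asg(view_dir, axis_z, axis_x, axis_y, lam, mu, amplitude):
        """Evaluate an anisotropic spherical Gaussian (ASG) lobe.

        view_dir:   (N, 3) unit view directions.
        axis_z/x/y: (N, 3) orthonormal lobe frame (z is the lobe axis).
        lam, mu:    (N, 1) sharpness along the x and y tangent axes.
        amplitude:  (N, C) per-lobe amplitude (e.g. RGB).
        """
        # Smooth term: the lobe only responds on the hemisphere around axis_z.
        smooth = torch.clamp((view_dir * axis_z).sum(-1, keepdim=True), min=0.0)
        # Anisotropic falloff along the two tangent axes.
        dx = (view_dir * axis_x).sum(-1, keepdim=True)
        dy = (view_dir * axis_y).sum(-1, keepdim=True)
        falloff = torch.exp(-lam * dx ** 2 - mu * dy ** 2)
        return amplitude * smooth * falloff

Unlike low-order SH, the two independent sharpness parameters let a single lobe stretch along one tangent direction, which is what makes ASGs a good fit for anisotropic highlights.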
RobIR: Robust Inverse Rendering for High-Illumination Scenes
Ziyi Yang, Yanzhen Chen, Xinyu Gao, Yazhen Yuan, Yu Wu, Xiaowei Zhou,
Xiaogang Jin†
NeurIPS, 2024.
Project Page /
arXiv /
Code
We propose a novel neural field-based inverse rendering framework for high-illumination scenes. We employ a scene-specific ACES tone mapping and regularized visibility estimation to eliminate shadows baked into the PBR materials.
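For intuition, here is a minimal sketch of what a learnable, scene-specific ACES-style tone mapping could look like. The initial coefficients are the widely used ACES filmic approximation; making them per-scene learnable parameters is an assumption for illustration, not the paper's exact design.

    import torch
    import torch.nn as nn

    class SceneACESToneMap(nn.Module):
        """Hypothetical scene-specific ACES-style tone mapping.

        Starts from the filmic approximation y = x(ax + b) / (x(cx + d) + e)
        and lets the coefficients be optimized per scene (illustrative only).
        """

        def __init__(self):
            super().__init__()
            init = torch.tensor([2.51, 0.03, 2.43, 0.59, 0.14])
            self.coeffs = nn.Parameter(init.clone())

        def forward(self, hdr_rgb):
            a, b, c, d, e = self.coeffs
            x = torch.clamp(hdr_rgb, min=0.0)
            ldr = (x * (a * x + b)) / (x * (c * x + d) + e)
            return torch.clamp(ldr, 0.0, 1.0)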
3DGSR: Implicit Surface Reconstruction with 3D Gaussian Splatting
Xiaoyang Lyu, Yang-Tian Sun, Yi-Hua Huang, Xiuzhe Wu, Ziyi Yang, Yilun
Chen, Jiangmiao Pang, Xiaojuan Qi†
SIGGRAPH Asia (TOG), 2024.
Project Page /
arXiv /
Code
We propose a joint reconstruction technique that couples 3D Gaussian Splatting with a neural SDF to achieve high-quality surface reconstruction.
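One simple way to picture such a coupling is to tie Gaussian opacity to the SDF value at each Gaussian center, so that Gaussians far from the surface fade out. The sketch below is only illustrative; the exact coupling used in 3DGSR may differ.

    import torch

    def sdf_to_opacity(sdf, beta=0.05):
        """Map SDF values at Gaussian centers to opacities (illustrative only).

        A bell-shaped function that peaks at the zero level set; beta controls
        how tightly opacity hugs the surface.
        """
        return torch.exp(-(sdf ** 2) / (2 * beta ** 2))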
SC-GS: Sparse-Controlled Gaussian Splatting for Editable Dynamic Scenes
Yi-Hua Huang*, Yang-Tian Sun*, Ziyi Yang*, Xiaoyang Lyu, Yan-Pei Cao†, Xiaojuan Qi†
CVPR, 2024.
Paper /
Project Page /
Code
We propose a new representation that explicitly decomposes the motion and appearance of dynamic scenes into sparse control points and dense Gaussians, respectively. Our key idea is to use sparse control points, significantly fewer in number than the Gaussians, to learn compact 6-DoF transformation bases, which can be locally interpolated through learned interpolation weights to yield the motion field of the 3D Gaussians. Please visit the project page for more demos, and see the sketch below for the flavor of this interpolation.
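The following is a minimal, simplified sketch of the LBS-style interpolation described above, assuming each Gaussian blends the rigid transforms of its k nearest control points with RBF weights. Names are hypothetical, and details such as rotating covariances and the learned weight parameterization are omitted.

    import torch

    def warp_gaussians(centers, ctrl_pts, ctrl_rot, ctrl_trans, ctrl_radius, k=4):
        """LBS-style warping of Gaussian centers by sparse control points (sketch).

        centers:     (N, 3) canonical Gaussian centers.
        ctrl_pts:    (M, 3) control point positions.
        ctrl_rot:    (M, 3, 3) per-control-point rotations (from learned 6-DoF).
        ctrl_trans:  (M, 3) per-control-point translations.
        ctrl_radius: (M,)   learned influence radii used for interpolation weights.
        """
        # K nearest control points per Gaussian.
        d2 = torch.cdist(centers, ctrl_pts) ** 2            # (N, M)
        knn_d2, knn_idx = d2.topk(k, dim=1, largest=False)  # (N, k)

        # RBF interpolation weights from the learned radii, normalized per Gaussian.
        w = torch.exp(-knn_d2 / (2 * ctrl_radius[knn_idx] ** 2))
        w = w / (w.sum(dim=1, keepdim=True) + 1e-8)

        # Apply each neighbor's rigid transform to the Gaussian center, then blend.
        rel = centers[:, None, :] - ctrl_pts[knn_idx]        # (N, k, 3)
        moved = torch.einsum('nkij,nkj->nki', ctrl_rot[knn_idx], rel) \
                + ctrl_pts[knn_idx] + ctrl_trans[knn_idx]    # (N, k, 3)
        return (w[..., None] * moved).sum(dim=1)             # (N, 3)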
Deformable 3D Gaussians for High-Fidelity Monocular Dynamic Scene Reconstruction
Ziyi Yang, Xinyu Gao, Wen Zhou, Shaohui Jiao, Yuqing Zhang, Xiaogang
Jin†
CVPR, 2024.
Final score: 5, 5, 5
arXiv /
Project Page /
Code
The first deformation-based Gaussian splatting method for dynamic scenes. We propose Deformable 3D Gaussian Splatting, which reconstructs monocular dynamic scenes by learning 3D Gaussians in canonical space together with a deformation field. We also introduce an annealing smoothing training strategy to mitigate the impact of inaccurate poses in real-world datasets.
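A toy sketch of the two ideas above: a deformation field that maps a canonical position and timestamp to offsets for position, rotation, and scale, with the time input perturbed by linearly decaying noise (the annealing smoothing). Layer sizes and names are illustrative, and positional encoding is omitted.

    import torch
    import torch.nn as nn

    class DeformField(nn.Module):
        """Toy deformation field: (canonical position, time) -> offsets for
        position, rotation (quaternion), and scale. Illustrative sizes only."""

        def __init__(self, hidden=128):
            super().__init__()
            self.mlp = nn.Sequential(
                nn.Linear(4, hidden), nn.ReLU(),
                nn.Linear(hidden, hidden), nn.ReLU(),
                nn.Linear(hidden, 3 + 4 + 3),
            )

        def forward(self, x, t, iteration, total_iters, noise_scale=0.1):
            # Annealing smoothing: perturb the time input with linearly decaying
            # noise to regularize training against inaccurate real-world poses.
            decay = max(0.0, 1.0 - iteration / total_iters)
            t = t + torch.randn_like(t) * noise_scale * decay
            out = self.mlp(torch.cat([x, t], dim=-1))
            return out[..., :3], out[..., 3:7], out[..., 7:]  # d_xyz, d_rot, d_scale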
A General Implicit Framework for Fast NeRF Composition and Rendering
Xinyu Gao, Ziyi Yang, Yunlu Zhao, Yuxiang Sun, Xiaogang Jin†,
Changqing Zou†
AAAI, 2024.
arXiv
We propose a general implicit pipeline for fast composition of NeRF objects.
Open-source Contributions
1. depth-diff-gaussian-rasterization
Adds several extensions to the vanilla Gaussian rasterization pipeline used in 3D Gaussian Splatting, including a depth forward pass, a depth backward pass, and 4th-degree SH.
Code
2. Awesome-Inverse-Rendering
A collection of papers on NeRF-Based Inverse Rendering.
Code
3. floater-free-gaussian-splatting
Fixes a densification bug to eliminate floaters in 3D-GS.
Code
4. My-exp-Gaussian
An early attempt to model specular highlights with ASG.
Code