Matterport3D 360° RGBD Dataset
This dataset is an extension of Matterport3D containing data to train and validate high-resolution 360° monocular depth estimation models. The data is organised into 90 folders, one per building, storing a total of 9684 samples. Each sample consists of four files: the RGB equirectangular 360° image (.png), its depth ground truth (.dpt), a visualisation of the depth ground truth (.png), and the camera-to-world extrinsic parameters for the image (.txt), saved as 7 values: 3 for the camera centre and 4 for the XYZW rotation quaternion.
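For illustration, below is a minimal Python sketch of loading one sample. It assumes the Sintel-style .dpt layout (a float32 sanity tag, int32 width and height, then row-major float32 depth values) and the XYZW quaternion ordering described above; the file names are hypothetical placeholders. Check against the dataset's README before relying on these details.

```python
import struct
import numpy as np
from PIL import Image

def read_dpt(path):
    """Read a .dpt depth map, assuming the Sintel-style layout:
    float32 tag (202021.25), int32 width, int32 height,
    then width*height float32 depth values in row-major order."""
    with open(path, "rb") as f:
        tag, = struct.unpack("<f", f.read(4))
        assert abs(tag - 202021.25) < 1e-3, "unexpected .dpt header tag"
        width, height = struct.unpack("<ii", f.read(8))
        depth = np.fromfile(f, dtype=np.float32, count=width * height)
    return depth.reshape(height, width)

def read_extrinsics(path):
    """Parse the 7-value camera-to-world extrinsics file:
    camera centre (cx, cy, cz) followed by a rotation quaternion."""
    vals = np.loadtxt(path).ravel()
    return vals[:3], vals[3:7]

def quat_to_matrix(q):
    """Convert a unit quaternion (XYZW ordering assumed)
    to a 3x3 rotation matrix."""
    x, y, z, w = q / np.linalg.norm(q)
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

# Example usage (hypothetical file names):
rgb = np.asarray(Image.open("sample_rgb.png"))       # H x W x 3 equirectangular image
depth = read_dpt("sample_depth.dpt")                 # H x W float32 depth map
centre, quat = read_extrinsics("sample_pose.txt")    # camera centre + rotation
R = quat_to_matrix(quat)                             # camera-to-world rotation matrix
```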
Cite this dataset as:
Rey-Area, M., Yuan, M. and Richardt, C., 2022. Matterport3D 360° RGBD Dataset. Bath: University of Bath Research Data Archive. Available from: https://doi.org/10.15125/BATH-01126.
Data
All archives are licensed under Creative Commons: Attribution 4.0.
data_00.zip (application/zip, 12GB)
data_01.zip (application/zip, 13GB)
data_02.zip (application/zip, 11GB)
data_03.zip (application/zip, 11GB)
data_04.zip (application/zip, 11GB)
data_05.zip (application/zip, 11GB)
data_06.zip (application/zip, 9GB)
Creators
Manuel Rey-Area, University of Bath
Mingze Yuan, University of Bath
Christian Richardt, University of Bath
Contributors
University of Bath (Rights Holder)
Matterport3D authors (Data Collector)
Documentation
Methodology link:
Rey-Area, M., Yuan, M., and Richardt, C., 2021. 360MonoDepth: High-Resolution 360° Monocular Depth Estimation. Version 2. arXiv. Available from: https://doi.org/10.48550/ARXIV.2111.15669.
Documentation Files
README.md (text/plain, 3kB), Creative Commons: Attribution 4.0
Funders
Engineering and Physical Sciences Research Council (https://doi.org/10.13039/501100000266):
EPSRC Centre for Doctoral Training in Digital Entertainment, EP/L016540/1
Fellowship - Towards Immersive 360° VR Video with Motion Parallax, EP/S001050/1
Centre for the Analysis of Motion, Entertainment Research and Applications (CAMERA) - 2.0, EP/T022523/1
Publication details
Publication date: 25 March 2022
Published by: University of Bath
Version: 1
DOI: https://doi.org/10.15125/BATH-01126
URL for this record: https://researchdata.bath.ac.uk/1126
Related papers and books
Rey-Area, M., Yuan, M., and Richardt, C., 2022. 360MonoDepth: High-Resolution 360° Monocular Depth Estimation. In: 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 3752-3762. Available from: https://doi.org/10.1109/cvpr52688.2022.00374.
Related datasets and code
Chang, A., Dai, A., Funkhouser, T., Halber, M., Niessner, M., Savva, M., Song, S., Zeng, A., and Zhang, Y., n.d. Matterport3D. GitHub. Available from: https://github.com/niessner/Matterport.
Related online resources
Rey-Area, M., Yuan, M., and Richardt, C., 2022. 360MonoDepth: High-Resolution 360° Monocular Depth Estimation. Project website. Available from: https://manurare.github.io/360monodepth/.
Contact information
Please contact the Research Data Service in the first instance for all matters concerning this item.
Contact person: Manuel Rey-Area
Departments: Faculty of Science; Computer Science
Research Centres & Institutes: Centre for Digital Entertainment (CDE); Centre for the Analysis of Motion, Entertainment Research & Applications
