Generating Large-Scale Datasets for Spacecraft Pose Estimation via a High-Resolution Synthetic Image Renderer

dc.contributor.author: Warunyu Hematulin
dc.contributor.author: Patcharin Kamsing
dc.contributor.author: Thaweerath Phisannupawong
dc.contributor.author: Thanayuth Panyalert
dc.contributor.author: Shariff Manuthasna
dc.contributor.author: Peerapong Torteeka
dc.contributor.author: Pisit Boonsrimuang
dc.date.accessioned: 2025-07-21T06:12:49Z
dc.date.issued: 2025-04-12
dc.description.abstract: This paper addresses the trend toward vision-based spacecraft pose estimation with deep neural networks, which requires accurately labeled datasets for training. A method is proposed for generating a regression-labeled image dataset for spacecraft pose estimation through simulations in Unreal Engine 5. This work provides detailed algorithms for pose sampling and image generation, making the dataset easy to reproduce. The dataset consists of images rendered under harsh lighting conditions against high-resolution backgrounds, featuring the Dragon, Soyuz, and Tianzhou spacecraft models and the ascent vehicle of Chang’E-6. It comprises 40,000 high-resolution images, evenly distributed with 10,000 images per spacecraft model, in scenes containing both the Earth and the Moon. Each image is labeled with a multivariate pose vector representing the relative position and attitude of the spacecraft with respect to the camera. This work emphasizes the critical role of realistic simulations in creating cost-effective synthetic datasets for training neural network-based pose estimators, and the dataset is made publicly available for further study.
dc.identifier.doi: 10.3390/aerospace12040334
dc.identifier.uri: https://dspace.kmitl.ac.th/handle/123456789/14378
dc.subject.classification: Space Satellite Systems and Control
dc.title: Generating Large-Scale Datasets for Spacecraft Pose Estimation via a High-Resolution Synthetic Image Renderer
dc.type: Article
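
Note on the label format: the abstract describes pose sampling and per-image regression labels (relative position and attitude of the spacecraft with respect to the camera), but this record does not include the generation code. Below is a minimal Python sketch of what such a pose-sampling and labeling step could look like; the field names (r_xyz, q_wxyz), the sampling ranges, and the helper functions are illustrative assumptions, not the authors' released tooling.

# Minimal sketch (not the authors' code) of sampling a relative pose and
# packaging it as a regression label: position in the camera frame plus a
# unit quaternion for attitude. Ranges and field names are assumptions.
import json
import numpy as np

def sample_pose(rng, min_range_m=5.0, max_range_m=50.0):
    """Sample one relative pose: a position vector and a unit quaternion."""
    # Position: random direction scaled to a distance within the assumed range.
    direction = rng.normal(size=3)
    direction /= np.linalg.norm(direction)
    position = direction * rng.uniform(min_range_m, max_range_m)

    # Attitude: uniformly random unit quaternion (w, x, y, z).
    q = rng.normal(size=4)
    q /= np.linalg.norm(q)
    return position, q

def make_label(image_name, position, quaternion):
    """Bundle one image's regression label as a JSON-serializable record."""
    return {
        "filename": image_name,
        "r_xyz": position.tolist(),     # relative position [m], camera frame
        "q_wxyz": quaternion.tolist(),  # relative attitude, unit quaternion
    }

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    labels = [make_label(f"img_{i:06d}.png", *sample_pose(rng)) for i in range(5)]
    print(json.dumps(labels, indent=2))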
