Free-viewpoint AR human-motion reenactment based on a single RGB-D video stream
Description
When observing a person (an actor) performing or demonstrating an activity in order to learn it, viewers ideally share the same time and place as the actor; otherwise, a video must be recorded. Conventional video, however, captures only two-dimensional (2D) motion and discards the third dimension, so an ambiguous pose can be hard to comprehend and the action harder to learn. This paper proposes an augmented reality system that reenacts such actions whenever the viewer wants, to aid comprehension of 3D motion. In the proposed system, a user first captures the actor's motion and appearance with a single RGB-D camera. Upon a viewer's request, the system displays the motion from an arbitrary viewpoint, using a rough 3D model of the subject composed of cylinders and selecting the most appropriate textures according to the viewpoint and the subject's pose. We evaluate the usefulness of the system and the quality of the displayed images through a user study.
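As an illustration of the texture-selection step described above, the following is a minimal sketch (not the authors' implementation) of choosing, for a requested virtual viewpoint and body pose, the recorded frame whose capture direction and subject pose match best. The scoring that combines view-direction similarity with joint-angle distance is a hypothetical assumption, as are the name best_texture_frame, the weights w_view and w_pose, and the joint-angle pose representation; this record does not specify the paper's actual criterion.

    import numpy as np

    def best_texture_frame(query_view_dir, query_joint_angles,
                           frame_view_dirs, frame_joint_angles,
                           w_view=1.0, w_pose=0.5):
        """Return the index of the recorded frame that best matches the
        requested viewpoint and the currently rendered pose.

        NOTE: hypothetical scoring for illustration only; the paper's
        actual selection criterion is not given in this record.
        """
        # Cosine similarity between the requested view direction and the
        # (unit) direction from which each frame was captured.
        view_sim = frame_view_dirs @ query_view_dir            # shape (N,)

        # Pose mismatch: mean absolute joint-angle difference per frame.
        pose_dist = np.abs(frame_joint_angles - query_joint_angles).mean(axis=1)

        # Higher is better: favour similar viewpoints, penalise pose drift.
        score = w_view * view_sim - w_pose * pose_dist
        return int(np.argmax(score))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        n_frames, n_joints = 200, 15

        # Unit view directions under which each frame was recorded.
        dirs = rng.normal(size=(n_frames, 3))
        dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)

        # Joint angles (radians) of the subject in each recorded frame.
        angles = rng.uniform(-np.pi, np.pi, size=(n_frames, n_joints))

        query_dir = np.array([0.0, 0.0, 1.0])   # requested viewpoint
        query_pose = angles[42]                 # a pose known to exist

        print("selected frame:", best_texture_frame(query_dir, query_pose,
                                                    dirs, angles))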
Journal
- 2014 IEEE International Conference on Multimedia and Expo (ICME), pp. 1-6, IEEE, 2014
Keywords
- augmented reality
- image colour analysis
- image motion analysis
- image sensors
- image texture
- solid modelling
- video signal processing
- 3D motion comprehension
- action learning
- augmented reality system
- free-viewpoint AR human-motion reenactment
- rough 3D model
- single RGB-D camera
- single RGB-D video stream
- texture selection
- Cameras
- Joints
- Sensors
- Solid modeling
- Streaming media
- Three-dimensional displays
- Augmented reality
- free-viewpoint image generation
- human motion capture
Details
- CRID: 1050858784329715968
- NII Article ID: 120006659630
- HANDLE: 10061/11256
- Text Lang: en
- Article Type: conference paper
- Data Source: IRDB, Crossref, CiNii Articles, KAKEN, OpenAIRE