Hand motion based 3D printing slicer
- College of Engineering, Department of Mechanical and Aerospace Engineering
- Issue Date: February 2018
- Graduate School, Seoul National University
- 3D modeling; 3D printing slicer; CAD; Hand gesture recognition; Hand tracking; Hand motion interface; Leap Motion
- Thesis (Master's) -- Graduate School, Seoul National University: College of Engineering, Department of Mechanical and Aerospace Engineering, February 2018. Advisor: Kunwoo Lee (이건우).
- With the advancement of computer-aided design (CAD), 3D modeling has expanded beyond manufacturing and product design into architecture, civil engineering, and even art. The public has become familiar with 3D modeling as self-production technologies, such as 3D printing and 3D scanning, have grown popular. However, 3D modeling is still considered a difficult task for users. One reason for this perception is the conventional mouse-driven 2D work environment, which creates a dimensional gap between the input device and the 3D model in the stereoscopic workspace with which the user wants to interact.
This paper describes an intuitive, easy-to-use 3D work environment for 3D modeling, using a 3D printing slicer as an example of a 3D modeler. The proposed slicer uses the user's hand as the 3D input device and is operated by hand motion. Because the hand is an optimal input device that can transmit user intention through simple gestures without distortion, this research focuses on providing intuitive interaction between the human hand and 3D models.
The proposed 3D printing slicer collects data on the hand and each finger joint through a ready-made hand-tracking device, the Leap Motion, and uses these data to recognize hand gestures. The mesh-processing function associated with the recognized gesture is then activated, enabling easy deformation of the model for 3D printing. As a result, hand motion makes it possible to implement functions that were difficult to use in the existing 2D work environment. Furthermore, hand motion enables more efficient interaction with the 3D model by conveying the user's intent to slicer functions that simple algorithms alone could not handle efficiently. Therefore, the contribution of this paper is more than the implementation of a hand-controlled slicer: it shows that, for interaction with 3D models, hand motion can be superior to existing 2D input devices.
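The pipeline above (fingertip data → gesture classification → dispatch to an associated mesh-processing function) can be sketched in miniature as follows. This is a hypothetical illustration, not the thesis's actual implementation: the pinch threshold, the gesture set, and the mesh operations are all assumed for the example, and plain coordinate tuples stand in for Leap Motion tracking frames.

```python
import math

PINCH_THRESHOLD = 30.0  # mm; assumed value, not taken from the thesis

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify_gesture(thumb_tip, index_tip):
    """Label a frame 'pinch' or 'open' from two fingertip positions."""
    return "pinch" if distance(thumb_tip, index_tip) < PINCH_THRESHOLD else "open"

def scale_mesh(vertices, factor):
    """Uniformly scale mesh vertices about the origin."""
    return [tuple(c * factor for c in v) for v in vertices]

# Dispatch table: each recognized gesture activates a mesh-processing function.
GESTURE_ACTIONS = {
    "pinch": lambda verts: scale_mesh(verts, 0.9),  # pinching shrinks the model
    "open": lambda verts: verts,                    # open hand leaves it unchanged
}

def process_frame(thumb_tip, index_tip, vertices):
    """One step of the pipeline: classify the gesture, apply its action."""
    gesture = classify_gesture(thumb_tip, index_tip)
    return gesture, GESTURE_ACTIONS[gesture](vertices)
```

For example, fingertips 10 mm apart classify as a pinch, so the associated scaling function fires on the mesh, whereas fingertips 50 mm apart classify as open and the mesh passes through untouched. The real system would run this per tracking frame and draw on many more joints than two fingertips.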