Publications¶
This is a list of scientific publications related to the Augmented Carpentry research thesis. During the development of the research project, we publish and containerize each component independently and share it with the community, along with code that is as framework-agnostic as possible. Here you can find the list of publications and the related code repositories.
TSlam: a tag-based object-centered monocular navigation system for augmented manual woodworking¶
Andrea Settimi, Hong-Bin Yang, Julien Gamerro, Yves Weinand
Content: If you are interested in the self-localization system and the tag system we developed for Augmented Carpentry, have a look at this paper.
Abstract: TimberSLAM (TSLAM) is an object-centered, tag-based visual self-localization and mapping (SLAM) system for monocular RGB cameras. It was specifically developed to support a robust and augmented reality pipeline for close-range, noisy, and cluttered fabrication sequences that involve woodworking operations, such as cutting, drilling, sawing, and screwing with multiple tools and end-effectors. By leveraging and combining multiple open-source projects, we obtain a functional pipeline that can map, three-dimensionally reconstruct, and finally provide a robust camera pose stream during fabrication time to overlay an execution model with its digital-twin model, even under close-range views, dynamic environments, and heavy scene obstructions. To benchmark the proposed navigation system under real fabrication scenarios, we produce a data set of 1344 closeups of different woodworking operations with multiple tools, tool heads, and varying parameters (e.g., tag layout and density). The evaluation campaign indicates that TSLAM is satisfyingly capable of detecting the camera’s millimeter position and subangular rotation during the majority of fabrication sequences. The reconstruction algorithm’s accuracy is also gauged and yields results that demonstrate its capacity to acquire shapes of timber beams with up to two preexisting joints. We have made the entire source code, evaluation pipeline, and data set open to the public for reproducibility and the benefit of the community.
TTool: A Supervised Artificial Intelligence-Assisted Visual Pose Detector for Tool Heads in Augmented Reality Woodworking¶
Andrea Settimi, Naravich Chutisilp, Florian Aymanns, Julien Gamerro, Yves Weinand
Content: This paper presents the object-tracking system we developed for tracking tool heads in Augmented Carpentry.
Abstract: We present TimberTool (TTool v2.1.1), a software designed for woodworking tasks assisted by augmented reality (AR), emphasizing its essential function of the real-time localization of a tool head’s poses within camera frames. The localization process, a fundamental aspect of AR-assisted tool operations, enables informed integration with contextual tracking, facilitating the computation of meaningful feedback for guiding users during tasks on the target object. In the context of timber construction, where object pose tracking has been predominantly explored in additive processes, TTool addresses a noticeable gap by focusing on subtractive tasks with manual tools. The proposed methodology utilizes a machine learning (ML) classifier to detect tool heads, offering users the capability to input a global pose and utilizing an automatic pose refiner for final pose detection and model alignment. Notably, TTool boasts adaptability through a customizable platform tailored to specific tool sets, and its open accessibility encourages widespread utilization. To assess the effectiveness of TTool in AR-assisted woodworking, we conducted a preliminary experimental campaign using a set of tools commonly employed in timber carpentry. The findings suggest that TTool can effectively contribute to AR-assisted woodworking tasks by detecting the six-degrees-of-freedom (6DoF) pose of tool heads to a satisfactory level, with a positional error of 3.9 ± 1 mm (with large room for improvement) and an angular error of 1.19 ± 0.6°.
Augmented-reality-assisted timber drilling with smart retrofitted tools¶
Andrea Settimi, Julien Gamerro, Yves Weinand
Content: In this publication you will find the very first feasibility study that led to the beginning of Augmented Carpentry. The pilot study explored augmented-reality-assisted fabrication for one of the simplest operations in timber carpentry: drilling.
Abstract: An ordinary electric drill was integrated into a context-aware augmented reality (AR) framework to assist in timber-drilling tasks. This study is a preliminary evaluation to detail technical challenges, potential bottlenecks, and accuracy of the proposed object- and tool-aware AR-assisted fabrication systems. In the designed workflow, computer vision tools and sensors are used to implement an inside-out tracking technique for retrofitted drills based on a reverse engineering approach. The approach allows workers to perform complex drilling angle operations according to computer-processed feedback instead of drawing, marking, or jigs. First, the developed methodology was presented, and its various devices and functioning phases were detailed. Second, this first proof of concept was evaluated by experimentally scanning produced drillings and comparing the discrepancies with their respective three-dimensional execution models. This study outlined improvements in the proposed tool-aware fabrication process and clarified the potential role of augmented carpentry in the digital fabrication landscape.