
Wood fingerprint recognition and detection of thin cracks

The first part of this thesis deals with recognition of wood fingerprints extracted from timber surfaces. It presents different methods to track sawn wood products through an industrial process using cameras. The possibility of identifying individual wood products comes from the biological variation of trees, where the genetic code, environment, and breakdown process mean that every board has a unique appearance. Wood fingerprint recognition faces many of the same challenges found in human biometrics applications.

The vision for the future is to be able to utilize existing imaging sensors in the production line to track individual products through a disordered and diverging product flow. The flow speed in the wood industry is usually very high, 2-15 meters per second, with a high degree of automation. Wood fingerprints combined with automated inspection make it possible to tailor subsequent processing steps for each product and can be used to deliver customized products. Wood tracking can also give the machine operators vital feedback on the process parameters.

The motivation for recognition comes from the wood industry's need to keep track of products without using invasive methods, such as bar code stickers or painted labels. In the project Hol-i-Wood Patching Robot, an automatic scanner and robot system was developed. In this project, there was a need to keep track of the shuttering panels that were to be repaired by the automatic robots.

In this thesis, three different strategies to recognize previously scanned sawn wood products are presented. The first approach uses feature detectors to find matching features between two images. This approach proved to be robust, even when subjected to moderate geometric and radiometric image distortions. The recognition accuracy reached 100% when using high-quality scans of Scots pine boards with more than 20 knots.

The second approach uses local knot neighborhood geometry to find point matches between images. The recognition accuracy reached above 99% when matching simulated Scots pine panels with realistically added noise to the knot positions, under the assumption that 85% of the knots could be detected. The third approach uses template matching to match a small part of a board against a large set of full-length boards. In this study, the query images were cropped and heavily downsampled. The intensity-normalized algorithms using cross-correlation (CC-N) and the correlation coefficient (CCF-N) obtained the highest recognition accuracy and had very similar overall performance. For instance, the matching accuracy for the CCF-N method reached above 99% for query images of length 1 m when the pixel density was above 0.08 pixels/mm.

The last part of this thesis deals with the detection of thin cracks on oak flooring lamellae using ultrasound-excited thermography and machine learning. Today, grading and defect detection on wooden lamellae in the parquet flooring industry are largely performed manually. The last appended paper investigates the possibility of using the ensemble methods random forests and boosting to automate the process. When friction occurs in thin cracks, they heat up and thus become visible to a thermographic camera. Several image processing techniques were used to suppress noise and enhance likely cracks in the images. The best ensemble methods reached an average classification accuracy of 0.8, which was very close to the author's own manual attempt at separating the images (0.83).
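As a rough illustration of the first approach, the sketch below matches local features between two grayscale board images and counts the matches that survive a ratio test. The choice of the ORB detector, the 0.75 ratio threshold, and all function names are assumptions for illustration, not the detector or settings used in the thesis.

    # Illustrative sketch only: feature-based matching between two grayscale
    # board images. ORB and the 0.75 ratio are assumptions, not the thesis setup.
    import cv2

    def count_good_matches(img_a, img_b, ratio=0.75):
        orb = cv2.ORB_create(nfeatures=2000)
        _, des_a = orb.detectAndCompute(img_a, None)
        _, des_b = orb.detectAndCompute(img_b, None)
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
        pairs = matcher.knnMatch(des_a, des_b, k=2)
        # Keep a match only if it is clearly better than the second-best candidate.
        return sum(1 for m, n in pairs if m.distance < ratio * n.distance)

A query board would then be identified as the stored board yielding the highest number of good matches.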
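The third approach is essentially normalized template matching. A minimal sketch using OpenCV's normalized correlation coefficient (cv2.TM_CCOEFF_NORMED) as a stand-in for CCF-N is given below; the helper names and the assumption of grayscale float32 images are mine.

    # Minimal sketch: match a short query strip against full-length board images
    # with a normalized correlation coefficient, in the spirit of CCF-N.
    import cv2
    import numpy as np

    def peak_score(query, board):
        # Slide the query over the board and take the best local score.
        result = cv2.matchTemplate(board, query, cv2.TM_CCOEFF_NORMED)
        return float(result.max())

    def identify(query, boards):
        # Return the index of the board that matches the query best.
        scores = [peak_score(query, b) for b in boards]
        return int(np.argmax(scores))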
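For the crack-detection part, the classification step could look roughly like the scikit-learn sketch below. The feature extraction, model parameters, and cross-validation setup are assumptions for illustration, not the exact pipeline used in the appended paper.

    # Rough sketch of the crack / no-crack classification step with two ensemble
    # methods. Feature vectors are assumed to be extracted beforehand from the
    # preprocessed thermography images; all parameters are illustrative.
    from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    def compare_ensembles(features, labels):
        models = {
            "random_forest": RandomForestClassifier(n_estimators=500, random_state=0),
            "boosting": GradientBoostingClassifier(random_state=0),
        }
        for name, model in models.items():
            scores = cross_val_score(model, features, labels, cv=5)
            print(f"{name}: mean accuracy {scores.mean():.2f}")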

Identifier oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:ltu-65701
Date January 2017
Creators Pahlberg, Tobias
Publisher Luleå tekniska universitet, Träteknik, Luleå University of Technology, Luleå
Source Sets DiVA Archive at Upsalla University
Language English
Detected Language English
Type Doctoral thesis, comprehensive summary, info:eu-repo/semantics/doctoralThesis, text
Format application/pdf
Rights info:eu-repo/semantics/openAccess
Relation Doctoral thesis / Luleå University of Technology 1 jan 1997 → …, 1402-1544, info:eu-repo/grantAgreement/EC/FP7/284573
