Wireless capsule endoscopy (WCE) has great advantages over traditional endoscopy
because it is portable and easy to use, especially for remote health-monitoring services.
However, the large volume of video data captured during a WCE procedure demands
significant computation to analyze and retrieve informative video frames. To
facilitate efficient WCE data collection and browsing, we present a resource- and
bandwidth-aware WCE video summarization framework that extracts the representative
keyframes of the WCE video contents by removing redundant and non-informative frames.
For redundancy elimination, we use the Jeffrey divergence between color histograms and
an inter-frame Boolean-series correlation of the color channels (sketched below). To remove non-informative
frames, multi-fractal texture features are extracted to assist the classification using an
ensemble-based classifier. Owing to the limited resources of the WCE device, it is infeasible
for the WCE system to perform computationally intensive video summarization tasks. To resolve
this computational challenge, a mobile-cloud architecture is incorporated, which provides resizable
computing capacity by adaptively offloading video summarization tasks between the client
and the cloud server. The qualitative and quantitative results are encouraging and show that
the proposed framework saves information transmission cost and bandwidth, as well as the
valuable time of data analysts in browsing the remotely sensed data.

Supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (2013R1A1A2012904).
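As a rough illustration of the redundancy check mentioned in the abstract, the Python sketch below computes the Jeffrey divergence, as commonly defined for histogram comparison, between the color histograms of consecutive frames and flags a frame as redundant when the divergence falls under a threshold. This is a minimal sketch, not the authors' implementation: the bin count, the threshold value, and the helper names (color_histogram, jeffrey_divergence, is_redundant) are illustrative assumptions, and the Boolean-series correlation, texture classification, and cloud-offloading stages are omitted.

```python
import numpy as np

def color_histogram(frame, bins=32):
    """Normalized per-channel color histogram of an HxWx3 uint8 frame,
    concatenated into a single feature vector. The bin count is an
    illustrative choice, not taken from the paper."""
    hists = []
    for c in range(3):
        h, _ = np.histogram(frame[..., c], bins=bins, range=(0, 256))
        hists.append(h)
    hist = np.concatenate(hists).astype(np.float64)
    return hist / max(hist.sum(), 1.0)

def jeffrey_divergence(p, q, eps=1e-12):
    """Jeffrey divergence between two normalized histograms:
    J(P, Q) = sum_i [ p_i * log(p_i / m_i) + q_i * log(q_i / m_i) ],
    with m_i = (p_i + q_i) / 2. Symmetric and numerically stable,
    unlike the raw Kullback-Leibler divergence."""
    p = np.asarray(p, dtype=np.float64) + eps
    q = np.asarray(q, dtype=np.float64) + eps
    m = (p + q) / 2.0
    return float(np.sum(p * np.log(p / m) + q * np.log(q / m)))

def is_redundant(prev_frame, curr_frame, threshold=0.05):
    """Flag the current frame as redundant when its color distribution is
    close to the last kept frame. The threshold is a placeholder that
    would need tuning on real WCE footage."""
    d = jeffrey_divergence(color_histogram(prev_frame),
                           color_histogram(curr_frame))
    return d < threshold

if __name__ == "__main__":
    # Toy example: keep only frames that differ enough from the last kept one.
    rng = np.random.default_rng(0)
    frames = rng.integers(0, 256, size=(5, 64, 64, 3), dtype=np.uint8)
    kept = [frames[0]]
    for f in frames[1:]:
        if not is_redundant(kept[-1], f):
            kept.append(f)
    print(f"kept {len(kept)} of {len(frames)} frames")
```

In a full pipeline, frames surviving this check would still pass through the Boolean-series color-channel correlation and the multi-fractal texture classification described in the abstract before being accepted as keyframes.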
Identifier | oai:union.ndltd.org:BRADFORD/oai:bradscholars.brad.ac.uk:10454/17184 |
Date | 18 July 2019 |
Creators | Mehmood, Irfan, Sajjad, M., Baik, S.W. |
Source Sets | Bradford Scholars |
Language | English |
Detected Language | English |
Type | Article, Published version |
Rights | © 2014 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/)., CC-BY |