Fog Computing is an emerging computing paradigm that shifts certain processing closer to the Edge of the network, generally within one network hop of the data source, where latency is minimized and results are obtained most quickly. However, little research has examined the effectiveness of Fog in real-world applications. The aim of this research is to show the effectiveness of the Fog Computing paradigm as the middle layer in a 3-tier architecture between the Internet of Things (IoT) and the Cloud. Two applications were developed: one utilizing Fog in a 3-tier architecture and another using only IoT and the Cloud, with no Fog layer. The application development was followed by a quantitative and qualitative analysis, with studies focused on application response time and walkthroughs of AWS Greengrass and Amazon Machine Learning.
Furthermore, the application itself demonstrates an architecture that is of both business and research value, providing a real-life coffee shop use case and utilizing a newly available Fog offering from Amazon known as Greengrass. At the Cloud level, the newly available Amazon Machine Learning API was used to perform predictive analytics on the data provided by the IoT devices. Results suggest that Fog-enabled applications have a much narrower range of response times as well as lower response times overall. These results suggest that Fog-enabled solutions are suitable for applications that require network stability and reliably low latency.
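
To make the 3-tier pattern described above concrete, the following is a minimal Python sketch of a Fog-layer AWS Greengrass Lambda function: it acknowledges sensor readings locally (within one network hop) for low latency, then forwards the reading to the Cloud tier, where an Amazon Machine Learning real-time endpoint scores it. The topic names, model id, endpoint URL, and record fields are illustrative assumptions rather than details taken from the thesis, and in the thesis itself the prediction step runs at the Cloud level; here the Fog Lambda simply issues the call for brevity.

# Hypothetical sketch of the Fog (Greengrass) layer; names and fields are assumed, not from the thesis.
import json

import boto3          # Cloud-side call to the Amazon Machine Learning API
import greengrasssdk  # Local (Fog) MQTT-style messaging on the Greengrass core

gg_client = greengrasssdk.client('iot-data')
ml_client = boto3.client('machinelearning')

ML_MODEL_ID = 'ml-coffee-demand-model'  # assumed model id for the coffee shop use case
PREDICT_ENDPOINT = 'https://realtime.machinelearning.us-east-1.amazonaws.com'


def function_handler(event, context):
    """Invoked by Greengrass for each message published on the local sensor topic."""
    reading = json.loads(event) if isinstance(event, str) else event

    # 1. Respond locally at the Fog layer: one network hop, lowest latency.
    gg_client.publish(
        topic='coffeeshop/actuators/ack',
        payload=json.dumps({'deviceId': reading.get('deviceId'), 'status': 'received'}),
    )

    # 2. Forward the reading to the Cloud tier for predictive analytics
    #    via the Amazon Machine Learning real-time prediction API.
    prediction = ml_client.predict(
        MLModelId=ML_MODEL_ID,
        Record={key: str(value) for key, value in reading.items()},
        PredictEndpoint=PREDICT_ENDPOINT,
    )

    # 3. Publish the prediction back onto a local topic for in-shop devices.
    gg_client.publish(
        topic='coffeeshop/predictions',
        payload=json.dumps(prediction['Prediction']),
    )

In a deployment, this function would be attached to a Greengrass group with a subscription routing the IoT devices' sensor topic into the function and its output topics back to the local devices; the Cloud-only comparison application would instead send readings directly to AWS IoT Core and perform the same prediction there.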
Identifier | oai:union.ndltd.org:unf.edu/oai:digitalcommons.unf.edu:etd-1889
Date | 01 January 2018 |
Creators | Wheeler, Nathan |
Publisher | UNF Digital Commons |
Source Sets | University of North Florida |
Detected Language | English |
Type | text |
Format | application/pdf |
Source | UNF Graduate Theses and Dissertations |