Research Area:  Fog Computing
Traditional cloud computing provides resources and services to a plethora of applications, supporting numerous industries with computation and data storage. However, the drawback of the cloud computing framework is its inadequate flexibility and its inability to accommodate the diverse requirements generated by an IoT-based environment. Cloud computing is evolving with new paradigms to ensure that connected heterogeneous systems can achieve high-performance computing (HPC). Furthermore, many of today's requirements call for geographically distributed resources located close to the end devices. Hence, the new fog computing paradigm provides innovative solutions for real-time applications. The fog computing framework's prime agenda is to support latency-sensitive applications by utilizing all available resources. In this paper, TRAM, a novel technique for resource allocation and management, is proposed to ensure resource utilization at the fog layer. The approach tracks the intensity level of existing tasks using the expectation maximization (EM) algorithm and calculates the current status of resources. All the available resources are managed using a wireless system. The paper also provides a scheduling algorithm for the resource grading process in the fog computing environment. The performance of the approach is tested on the iFogSim simulator, and the results are compared with SJF, FCFS, and MPSO. The experimental results demonstrate that TRAM effectively minimizes execution time, network consumption, energy consumption, and average loop delay of tasks.
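To illustrate the kind of EM-based task-intensity tracking the abstract refers to, the sketch below fits a two-component 1D Gaussian mixture to hypothetical task intensities (e.g., required MIPS of queued fog tasks) and labels each task as low- or high-intensity. This is only a minimal illustration of the EM idea under assumed data and parameters, not the authors' TRAM implementation; the function name, task values, and two-cluster assumption are all illustrative.

```python
# Minimal sketch: EM for a two-component 1D Gaussian mixture over task intensities.
# Not the TRAM algorithm itself; data and component count are assumptions.
import numpy as np

def em_gaussian_mixture(x, n_iter=100, tol=1e-6):
    """Fit a 2-component 1D Gaussian mixture to x via expectation maximization."""
    x = np.asarray(x, dtype=float)
    # Initialize one component near the smallest value and one near the largest.
    mu = np.array([x.min(), x.max()])
    var = np.array([x.var(), x.var()]) + 1e-6
    pi = np.array([0.5, 0.5])          # mixture weights
    prev_ll = -np.inf
    for _ in range(n_iter):
        # E-step: responsibility of each component for each task.
        dens = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances.
        nk = resp.sum(axis=0)
        pi = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
        # Stop when the log-likelihood no longer improves.
        ll = np.log(dens.sum(axis=1)).sum()
        if abs(ll - prev_ll) < tol:
            break
        prev_ll = ll
    return pi, mu, var, resp

# Hypothetical task intensities for queued fog tasks (illustrative values only).
intensities = np.array([120, 130, 115, 900, 950, 870, 125, 910])
pi, mu, var, resp = em_gaussian_mixture(intensities)
labels = resp.argmax(axis=1)  # most likely component per task (low vs. high intensity)
print("cluster means:", mu, "labels:", labels)
```

A scheduler could then grade resources by assigning the high-intensity cluster to fog nodes with more spare capacity, which is the general spirit of the resource grading step described in the abstract.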
Keywords:  
Author(s) Name:  Heena Wadhwa & Rajni Aron
Journal name:  The Journal of Supercomputing
Conference name:  
Publisher name:  Springer
DOI:  10.1007/s11227-021-03885-3
Volume Information:  volume 78, pages 667–690 (2022)
Paper Link:   https://link.springer.com/article/10.1007/s11227-021-03885-3