Article Specifications | |
Publication | 2017 article |
Length of English article | 7 pages |
Published in | IEEE |
Article type | ISI |
عنوان انگلیسی مقاله | Resource Aware Placement of IoT Application Modules in Fog-Cloud Computing Paradigm |
Related disciplines | Information Technology Engineering |
Related specializations | Internet and Wide Area Networks |
Journal | Integrated Network and Service Management (IM) |
Institution | Telecommunications Software and Systems Group – Ireland |
Keywords | Fog computing, Cloud computing, latency sensitivity, application module, resource-aware placement |
Excerpt from the article: |
I. INTRODUCTION
The Internet of Things (IoT) has reshaped the future of connectivity and reachability. It aims to bring every object online, generating a huge amount of data that can overwhelm storage systems and cause a significant surge in application response time. With IoT in play, the near future will involve billions of interconnected IoT devices emitting large volumes of data streams for processing. In its report, McKinsey [1] estimates that the user base will comprise 1 trillion interconnected IoT devices by 2025, further substantiating this impending scenario. According to this estimate, by 2025 the IoT will have a potential economic impact of USD 11 trillion per year, nearly 11 percent of the world economy. Cloud computing can help here by offering on-demand, scalable storage and processing services that can meet IoT requirements; however, for latency-sensitive applications, the delay caused by the communication distance between the user base and the Cloud remains unacceptable. Cloud computing has its advantages, but the accelerating growth of ubiquitous mobile and sensing devices, coupled with advances in technology, means the emerging IoT ecosystem challenges the traditional Cloud computing network architecture. The need of IoT applications to address these challenges, namely dynamic scalability, efficient in-network processing, and latency-sensitive communication, has led to the evolution of the Fog computing paradigm [2], [3]. Fog computing aims to extend Cloud services and utilities to the network edge, thus catering to the needs of latency-sensitive applications and providing real-time data processing and dispatching. In this new paradigm, computing is dynamically distributed across Cloud sites and network elements based on Quality of Service (QoS) requirements.
Together, these two paradigms can offer a fruitful interplay between the Fog and the Cloud, particularly when it comes to catering to the needs of latency-sensitive applications. However, the devices closer to the network edge (routers, access points, gateways, etc.) are traditionally not computationally powerful enough to host all the modules of an application (or heterogeneous modules of various applications, for that matter) in the IoT ecosystem. Hence, the placement strategy needs to be formulated with these constraints in mind, i.e., it iterates from the Fog layer towards the Cloud, first trying to place the modules on the available resources in the Fog layer and thereafter moving towards the Cloud. It is beyond doubt that further research will be needed to meet the challenges related to the evolving Fog-Cloud architecture. |
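The edge-ward placement strategy described above can be sketched as a simple first-fit iteration over a node hierarchy ordered from Fog to Cloud. This is a minimal illustrative sketch, not the paper's algorithm; the node names, capacity units, and module sizes are all assumptions made for the example.

```python
def place_modules(modules, hierarchy):
    """First-fit, edge-ward placement sketch.

    modules:   list of (module_name, required_capacity) tuples.
    hierarchy: list of {'name': str, 'capacity': int} dicts, ordered
               from the Fog layer towards the Cloud. Capacities are in
               arbitrary illustrative units (e.g. MIPS).
    Returns a dict mapping each module to the node that hosts it.
    """
    placement = {}
    for mod_name, required in modules:
        # Try the resource-constrained Fog nodes first, Cloud last.
        for node in hierarchy:
            if node['capacity'] >= required:
                node['capacity'] -= required  # reserve the resources
                placement[mod_name] = node['name']
                break
        else:
            raise RuntimeError(f"no node can host module {mod_name}")
    return placement


# Hypothetical usage: two light modules fit on the Fog gateway, while
# the heavy analytics module falls through to the Cloud.
hierarchy = [
    {'name': 'fog-gateway', 'capacity': 1000},
    {'name': 'cloud', 'capacity': 10_000},
]
modules = [('sensing', 400), ('filtering', 500), ('analytics', 3000)]
print(place_modules(modules, hierarchy))
```

The design choice mirrors the constraint stated in the text: because edge devices cannot host every module, placement exhausts Fog capacity before spilling the remaining modules over to the Cloud.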