
Free English-language article on set-membership adaptive kernel NLMS algorithms – Elsevier 2019

 

Article details
Translated article title: Set-membership adaptive kernel NLMS algorithms: Design and analysis
English article title: Set-membership adaptive kernel NLMS algorithms: Design and analysis
Publication year: 2019
English article length: 33 pages
Download cost: The English article is free to download.
Publisher database: Elsevier
Article type: Research Article
Base article: Yes, this is a base article.
Indexing: Scopus – Master Journal List – JCR
Paper category: ISI
English article format: PDF
Impact Factor (IF): 3.933 in 2017
H-index: 105 in 2019
SJR: 0.940 in 2017
ISSN: 0165-1684
Quartile: Q1 in 2017
Related fields: Computer Engineering
Related specializations: Algorithms and Computation, Software Engineering, Artificial Intelligence
Presentation type: Journal
Journal: Signal Processing
Affiliation: Centre for Telecommunications Studies (CETUC), PUC-Rio, Rio de Janeiro, Brazil
Keywords: Adaptive algorithms, Set-membership algorithms, Data-selective techniques, Kernel methods, Statistical analysis
DOI: https://doi.org/10.1016/j.sigpro.2018.07.007
Product code: E11146
Translation status: No prepared translation of this article is available; you can order one via the button below.
Free download: Download the free English article
Order translation: Order a translation of this article

 

Article table of contents:
Abstract

1- Introduction

2- Principles of kernel methods and set-membership techniques

3- Proposed centroid-based set-membership kernel normalized least-mean-square algorithm

4- Proposed nonlinear regression-based SM-KNLMS algorithm

5- Analysis

6- Simulations

7- Conclusions

References

Excerpt from the article:

Abstract

In the last decade, a considerable research effort has been devoted to developing adaptive algorithms based on kernel functions. One of the main features of these algorithms is that they form a family of universal approximation techniques, solving problems with nonlinearities elegantly. In this paper, we present data-selective adaptive kernel normalized least-mean square (KNLMS) algorithms that can increase their learning rate and reduce their computational complexity. In fact, these methods deal with kernel expansions, creating a growing structure also known as the dictionary, whose size depends on the number of observations and their innovation. The algorithms described herein use an adaptive step size to accelerate the learning and can offer an excellent tradeoff between convergence speed and steady-state performance, which allows them to solve nonlinear filtering and estimation problems with a large number of parameters without requiring a large computational cost. The data-selective update scheme also limits the number of operations performed and the size of the dictionary created by the kernel expansion, saving computational resources and dealing with one of the major problems of kernel adaptive algorithms. A statistical analysis is carried out along with a computational complexity analysis of the proposed algorithms. Simulations show that the proposed KNLMS algorithms outperform existing algorithms in examples of nonlinear system identification and prediction of a time series originating from a nonlinear difference equation.
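The abstract's core mechanism — update (and grow the kernel dictionary) only when the a-priori error exceeds an error bound, using an adaptive step size — can be sketched as follows. This is a minimal illustration in Python/NumPy of the generic set-membership KNLMS idea, not the paper's specific centroid-based or nonlinear-regression-based variants; all class, parameter, and function names are illustrative.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Gaussian (RBF) kernel measuring similarity between two input vectors
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return np.exp(-np.dot(d, d) / (2.0 * sigma**2))

class SetMembershipKNLMS:
    """Illustrative set-membership kernel NLMS filter.

    Updates, and grows the dictionary of kernel centers, only when
    the a-priori error magnitude exceeds the bound gamma, so
    uninformative samples cost no coefficient update and no growth.
    """
    def __init__(self, gamma=0.05, sigma=0.5, eps=1e-6):
        self.gamma = gamma      # error bound (set-membership threshold)
        self.sigma = sigma      # kernel bandwidth
        self.eps = eps          # regularization in the normalized step
        self.dictionary = []    # stored input vectors (kernel centers)
        self.alpha = []         # kernel expansion coefficients

    def predict(self, x):
        # Kernel expansion over the current dictionary (0 if empty)
        return sum(a * gaussian_kernel(c, x, self.sigma)
                   for a, c in zip(self.alpha, self.dictionary))

    def update(self, x, d):
        e = d - self.predict(x)         # a-priori error
        if abs(e) <= self.gamma:        # innovation too small: skip update
            return e
        mu = 1.0 - self.gamma / abs(e)  # adaptive step size in (0, 1)
        k_xx = gaussian_kernel(x, x, self.sigma)  # normalization term (= 1 here)
        self.dictionary.append(np.asarray(x, dtype=float))
        self.alpha.append(mu * e / (k_xx + self.eps))
        return e
```

For example, identifying the hypothetical nonlinearity d = tanh(2x) from random samples: early errors trigger updates and dictionary growth, while later samples increasingly fall inside the error bound and are discarded, which is how the data-selective scheme caps both the operation count and the dictionary size.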

Introduction

Adaptive filtering algorithms have been the focus of a great deal of research in the past decades, and the machine learning community has embraced and further advanced the study of these methods. In fact, adaptive algorithms are often designed with linear structures, which limits their performance and leaves aside the nonlinear problems that arise in various applications. In order to deal with nonlinear problems, a family of nonlinear adaptive algorithms based on kernels has been developed. In particular, a kernel is a function that compares the similarity between two inputs and can be used for filtering, estimation and classification tasks. Kernel adaptive filtering (KAF) algorithms have been tested in many different scenarios and applications [1], [2], [3], [4], [5], showing very good results. One of the main advantages of KAF algorithms is that they are universal approximators [1], which gives them the ability to address complex and nonlinear problems. However, their computational complexity is much higher than that of their linear counterparts [1]. One of the first KAF algorithms to appear, which is widely adopted in the KAF family because of its simplicity, is the kernel least-mean square (KLMS) algorithm proposed in [6] and later extended in [7]. The KLMS algorithm was inspired by the least-mean square (LMS) algorithm and, thanks to its good performance, has led many researchers to work on the development of kernel versions of conventional adaptive algorithms. For instance, a kernel version of the NLMS algorithm has been proposed in [5] using a nonlinear regression approach for time series prediction. In [8], [9], the affine projection algorithm (APA) has been used as the basis of the derivation of kernel affine projection (KAP) algorithms. Adaptive projection algorithms using kernel techniques have been reported in [10], [11].
The recursive least squares (RLS) algorithm has been extended in [12], where the kernel recursive least squares (KRLS) algorithm has been described. Later, the authors of [13] proposed an extension of the KRLS algorithm, and the use of multiple kernels has been studied in [14] and [15].
