Free English paper on the weighted probabilistic neural network – Elsevier 2018

Paper details
Publication year: 2018
English paper length: 12 pages
Download cost: the English paper is free to download
Published by: Elsevier
Paper type: ISI
English title: Weighted probabilistic neural network
Persian title: شبکه عصبی احتمالاتی وزن دار
English paper format: PDF
Related fields: Computer Engineering, Information Technology
Related specializations: Computer Networks, Artificial Intelligence
Journal: Information Sciences
University: Rzeszow University of Technology, Rzeszow, Poland
Keywords: Probabilistic neural network, Weights, Sensitivity analysis, Classification, Accuracy
Product code: E6035
Translation status: no ready Persian translation of this paper is available; a translation can be ordered.


Excerpt from the paper:
1. Introduction

Classical feedforward neural networks such as the multilayer perceptron or the radial basis function network have their layers linked by weighted connections. During training, these weights must first be initialized and then iteratively recomputed to optimize some assumed performance measure of the model on the given training data. The probabilistic neural network (PNN) [36,37], however, is a model with no additional weighting factors inside its structure. PNN is therefore free from time-consuming weight updates, which contributes to its popularity in applications. Uses of PNN can be found in medical diagnosis and prediction [9,15,17,18,20], image classification and recognition [4,25,45], earthquake magnitude prediction [1], classification of multiple partial discharge sources [43], interval information processing [12–14], phoneme recognition [7], email security enhancement [41], intrusion detection systems [40], classification in a time-varying environment [27,28], and hardware implementation [46]. The regression counterpart of PNN, known as the general regression neural network [38], is also studied by many authors, e.g. in function approximation [10] or knowledge discovery in data streams [6].

A conventional PNN is built from four layers: the input layer represented by the data attributes, the pattern layer composed of as many neurons as there are training patterns, the summation layer containing a single neuron for each class, and the output layer with a decision neuron that determines the classification outcome.

In research centered on the PNN architecture, comparatively little emphasis has been placed on applying weights inside this network. The first contribution related to PNN weights is presented in [22]; however, the coefficients are not computed explicitly. That network operates with anisotropic Gaussians, producing the output for a considered sample by using a covariance matrix instead of a single smoothing parameter in the activation function. In [34] and [35], weighting coefficients are introduced directly into the PNN model. They are placed between the pattern and summation layers and are calculated from a soft-labeling probability matrix based on a Bayesian algorithm. In turn, the authors of [23–25] create a weighted PNN based on class separability in the data. The weight coefficients are defined as the ratio of the between-class variance to the within-class variance for a particular training pattern and are inserted between the pattern and summation layers of the PNN.
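To make the four-layer decision rule above concrete, the following is a minimal sketch of a PNN classifier in Python with NumPy. It is an illustration only, not the authors' implementation: the isotropic Gaussian kernel with a single smoothing parameter `sigma`, the function name `pnn_classify`, and the optional per-pattern `weights` argument are assumptions made for this example. With all weights equal to one the sketch behaves as a conventional PNN; the weighted variants cited above would instead place per-pattern coefficients between the pattern and summation layers.

```python
import numpy as np

def pnn_classify(X_train, y_train, x, sigma=0.5, weights=None):
    """Minimal PNN decision rule (illustrative sketch).

    Each training pattern contributes a Gaussian kernel centred on itself
    (pattern layer); the kernels of each class are summed (summation layer);
    the class with the largest, optionally weighted, sum wins (output layer).
    """
    classes = np.unique(y_train)
    if weights is None:
        # Conventional PNN: every training pattern contributes equally.
        weights = np.ones(len(X_train))
    scores = []
    for c in classes:
        idx = np.where(y_train == c)[0]
        # Squared Euclidean distances from x to all training patterns of class c.
        d2 = np.sum((X_train[idx] - x) ** 2, axis=1)
        # Pattern-layer activations (isotropic Gaussians with one smoothing
        # parameter), weighted before the class-wise summation.
        scores.append(np.sum(weights[idx] * np.exp(-d2 / (2.0 * sigma ** 2))))
    return classes[int(np.argmax(scores))]

# Toy usage: two 2-D classes, query point near class 1.
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(pnn_classify(X, y, np.array([0.95, 1.0])))  # expected output: 1
```

In the toy usage the query point lies near the class-1 training patterns, so the class-1 summation neuron receives the larger kernel sum and the decision neuron outputs 1.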
