Free English article on ANN-MIND: Training Neural Networks with Incomplete Datasets – IEEE 2018
Article details | |
Translated article title | ANN-MIND: A Comparative Study on the Training of Neural Networks with Incomplete Datasets |
English article title | ANN-MIND: A Comparative Study on the Training of Neural Networks with Incomplete Datasets |
Publication | 2018 article |
English article length | 8 pages |
Cost | The English article is free to download. |
Database | IEEE |
Base article | This is not a base article |
English article format | |
Related fields | Computer and Information Technology Engineering |
Related specializations | Computer Networks and Artificial Intelligence |
Presentation type | Conference |
Journal / Conference | IST-Africa Week Conference |
University | University of Johannesburg – Kingsway and University Road – South Africa |
Keywords | Artificial neural networks, neural network dropout, missing data |
English keywords | Artificial neural networks, neural network dropout, missing data |
Product code | E9514 |
Translation status | A ready-made translation of this article is not available; you can order one via the button below. |
Free article download | Free download of the English article |
Order a translation of this article | Order a translation of this article |
Article table of contents: |
Abstract; 1 Introduction; 2 Objectives; 3 Methodology; 4 Developments; 5 Results; 6 Business Benefits; 7 Conclusions; References
Excerpt from the article: |
Abstract
The quality of a dataset plays a central role in the results and conclusions that can be drawn from the analysis of such a dataset. As is often said: garbage in, garbage out. In recent years, neural networks have displayed good performance in solving a diverse range of problems. Unfortunately, artificial neural networks are not immune to the misfortune presented by missing values. Furthermore, in most real-world settings, it is often the case that the only data available for training artificial neural networks contains a significant number of missing values. In such cases, we are left with little choice but to use this data for training, although doing so may result in a poorly performing neural network. In this paper, we describe the use of neural network dropout as a technique for training neural networks in the presence of missing values. We test the performance of different neural network architectures on different levels of artificially generated missing values introduced into the MNIST handwriting recognition dataset, CIFAR-10, and the Pima Indians Diabetes dataset, and find that in most cases this results in significantly better performance of the neural network compared to other missing-data handling techniques. Introduction For one reason or another, some records in a database might contain missing fields. A malfunctioning data-collection device might record some pieces of data and not others. In a questionnaire, for one reason or another, some respondents might choose not to answer some of the questions. This missing information is what we call missing data. Regardless of the reason(s) that led to it, missing data is a common occurrence in real-life settings, and it plays a crucial role in the quality of machine learning models built on top of such datasets.
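The abstract mentions testing architectures on different levels of artificially generated missing values. The paper's exact masking procedure is not reproduced here; as a minimal sketch, values that are missing completely at random at a chosen rate could be introduced as follows (`introduce_missing` and the `rate` parameter are illustrative names, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def introduce_missing(X, rate):
    """Replace each entry of X with NaN independently with probability `rate` (MCAR)."""
    mask = rng.random(X.shape) < rate
    X_missing = X.astype(float).copy()
    X_missing[mask] = np.nan
    return X_missing

# Example: a tiny stand-in for an image or feature matrix.
X = np.arange(12, dtype=float).reshape(3, 4)
X_20 = introduce_missing(X, rate=0.20)  # roughly 20% of entries become NaN
```

Applying this at several rates (e.g. 10%, 20%, 40%) to a clean dataset such as MNIST would yield the kind of graded incomplete datasets the study compares.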
Most systems currently in use merely discard the incomplete observations from the training dataset, while others simply proceed to use the data, ignoring the problems presented by the missing values. Still other approaches impute the missing values with fixed constants such as means and modes. Like most machine learning algorithms, most neural network architectures work under the assumption that the supplied data contains no missing values. Whilst neural networks are generally known to be resilient to noise, in the presence of a significant amount of missing values this resilience soon fades away [1], [2]. This paper explores a method for training neural networks when the training dataset contains missing values. Its contribution is the introduction of four new datasets containing artificially generated missing values and the proposed use of neural network dropout (a widely used regularisation technique) as a technique for handling missing values. We refer to this use of neural network dropout as Artificial Neural Network Missing INputs Dropout (ANN-MIND). |
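The paragraph above contrasts fixed-constant imputation with a dropout-style treatment of missing inputs. A minimal numpy sketch of both, assuming our reading of the ANN-MIND intuition (a missing input is treated like a dropped unit: zeroed out, with the observed inputs rescaled as in inverted dropout so the expected pre-activation of the next layer is preserved); all names here are illustrative, not taken from the paper:

```python
import numpy as np

# Toy feature matrix with missing entries encoded as NaN.
X = np.array([[1.0, np.nan, 3.0],
              [4.0, 5.0, np.nan],
              [np.nan, 8.0, 9.0]])

# Baseline mentioned in the text: impute with a fixed constant (column mean).
col_means = np.nanmean(X, axis=0)              # per-column mean of observed values
X_mean = np.where(np.isnan(X), col_means, X)

# Dropout-style treatment (sketch of the ANN-MIND idea): zero the missing
# inputs and rescale the observed ones by the fraction observed, as
# inverted dropout does with randomly dropped units.
observed = ~np.isnan(X)
keep_frac = observed.mean(axis=1, keepdims=True)  # fraction observed per row
X_drop = np.where(observed, X, 0.0) / keep_frac
```

The design difference is that mean imputation invents plausible values, whereas the dropout-style view feeds the network only the observed inputs, relying on the same mechanism that makes dropout-trained networks robust to absent units.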