Article details |
Article title | Bagging exponential smoothing methods using STL decomposition and Box–Cox transformation
Persian translation of the title | روش‌های هموارسازی نمایی باگینگ با تجزیه STL و تبدیل باکس–کاکس
Article format |
Article type | ISI
Article category | Research article
Year of publication |
Number of pages | 10 pages
Related disciplines | Economics, Management, Information Technology (IT) Engineering, and Computer Engineering
Journal | International Journal of Forecasting
University | Faculty of Information Technology, Monash University, Melbourne, Australia
Keywords | Bagging, bootstrapping, exponential smoothing, STL decomposition
Product code | E4026
Publisher | Elsevier
Source link | Link to this article on ScienceDirect (Elsevier)
Translation status | A prepared Persian translation of this article is not available; it can be ordered.
Excerpt from the article:
1. Introduction
After more than 50 years of widespread use, exponential smoothing is still one of the most practically relevant forecasting methods available (Goodwin, 2010). This is because of its simplicity and transparency, as well as its ability to adapt to many different situations. It also has a solid theoretical foundation in ETS state space models (Hyndman & Athanasopoulos, 2013; Hyndman, Koehler, Ord, & Snyder, 2008; Hyndman, Koehler, Snyder, & Grose, 2002). Here, the acronym ETS stands both for ExponenTial Smoothing and for Error, Trend, and Seasonality, which are the three components that define a model within the ETS family. Exponential smoothing methods obtained competitive results in the M3 forecasting competition (Koning, Franses, Hibon, & Stekler, 2005; Makridakis & Hibon, 2000), and the forecast package (Hyndman, 2014; Hyndman & Khandakar, 2008) in the programming language R (R Core Team, 2014) means that fully automated software for fitting ETS models is available. Thus, ETS models are both usable and highly relevant in practice, and have a solid theoretical foundation, which makes any attempt to improve their forecast accuracy a worthwhile endeavour.

Bootstrap aggregating (bagging), as proposed by Breiman (1996), is a popular method in machine learning for improving the accuracy of predictors (Hastie, Tibshirani, & Friedman, 2009) by addressing potential instabilities. These instabilities typically stem from sources such as data uncertainty, parameter uncertainty, and model selection uncertainty. An ensemble of predictors is estimated on bootstrapped versions of the input data, and the output of the ensemble is calculated by combining the individual predictions (using the median, mean, trimmed mean, or weighted mean, for example), often yielding better point predictions.

In this work, we propose a bagging methodology for exponential smoothing methods, and evaluate it on the M3 data. As our input data are non-stationary time series, both serial dependence and non-stationarity have to be taken into account. We resolve these issues by applying a seasonal-trend decomposition based on loess (STL, Cleveland, Cleveland, McRae, & Terpenning, 1990) and a moving block bootstrap (MBB, see, e.g., Lahiri, 2003) to the residuals of the decomposition.
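To make the described pipeline concrete, the following is a minimal R sketch using only base R and functions from the forecast package mentioned in the text (BoxCox.lambda(), BoxCox(), InvBoxCox(), ets(), forecast()). It is not the authors' implementation: the function name bagged_ets_forecast(), the number of bootstrap replicates B, the block size, and the restriction of the Box–Cox parameter to [0, 1] are illustrative assumptions. The sketch Box–Cox transforms and STL-decomposes a seasonal series, resamples the remainder with a simple moving block bootstrap, fits an automatically selected ETS model to each bootstrapped series, and combines the point forecasts with the median.

library(forecast)   # provides BoxCox.lambda(), BoxCox(), InvBoxCox(), ets(), forecast()

# Illustrative bagged-ETS forecaster for a seasonal series (e.g. monthly data).
# Steps: Box-Cox transform, STL decomposition, moving block bootstrap (MBB)
# of the remainder, ETS fit per bootstrapped series, median combination.
bagged_ets_forecast <- function(y, h = 12, B = 30, block_size = 2 * frequency(y)) {
  lambda <- BoxCox.lambda(y, lower = 0, upper = 1)   # Box-Cox parameter restricted to [0, 1]
  y_bc   <- BoxCox(y, lambda)
  dec    <- stl(y_bc, s.window = "periodic")         # STL: trend + seasonal + remainder
  trend  <- dec$time.series[, "trend"]
  seas   <- dec$time.series[, "seasonal"]
  rem    <- as.numeric(dec$time.series[, "remainder"])
  n      <- length(rem)

  fc <- replicate(B, {
    # Moving block bootstrap: draw overlapping blocks of the remainder at random
    n_blocks <- ceiling(n / block_size) + 1
    starts   <- sample.int(n - block_size + 1, n_blocks, replace = TRUE)
    boot_rem <- unlist(lapply(starts, function(s) rem[s:(s + block_size - 1)]))[1:n]
    # Reassemble a bootstrapped series and map it back to the original scale
    y_boot <- InvBoxCox(trend + seas +
                          ts(boot_rem, start = start(y), frequency = frequency(y)),
                        lambda)
    forecast(ets(y_boot), h = h)$mean   # point forecasts from an automatically chosen ETS model
  })

  # Combine the ensemble of point forecasts with the median
  ts(apply(fc, 1, median),
     start = tsp(y)[2] + 1 / frequency(y), frequency = frequency(y))
}

# Example usage on the built-in monthly AirPassengers series
print(bagged_ets_forecast(AirPassengers, h = 12, B = 10))

More recent releases of the forecast package also ship a built-in bagged ETS routine (baggedETS()) along these lines, which avoids hand-rolling the bootstrap loop above.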