A New Approach to Smoothing Time Series Data
2020
Abstract
In the study of time series, new techniques for smoothing the data are important and continually evolving. This study presents a new technique for smoothing time series data. It is based on the average and works in the time domain. It aims to create a new series by redistributing the average of the reduced (halved) series of the variable of interest, and it has the potential to reduce large peaks. In this work, simulated stock data (S&P 500 quarterly stock values for 2005:1-2014:4) are smoothed and a residual analysis is carried out to determine the performance of the technique. Among other findings, the residuals ε_t from the technique were found to be white noise, which is the building block for time series models.
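One plausible reading of the halve-and-redistribute idea in the abstract can be sketched as follows. The function name and the exact redistribution rule are assumptions made for illustration, not taken from the paper: adjacent pairs are averaged to halve the series, and each pair average is then written back to both original positions, which caps large peaks at the pair average.

```python
# Hypothetical sketch of the averaging-based smoother described in the
# abstract: the series is halved by averaging adjacent pairs, and each
# pair average is then redistributed to both original positions.
# The redistribution rule is an assumption, not the paper's exact method.

def halve_and_redistribute(series):
    """Smooth a series by pairwise averaging and redistribution."""
    if len(series) % 2 != 0:
        raise ValueError("series length must be even for pairwise halving")
    smoothed = []
    for i in range(0, len(series), 2):
        avg = (series[i] + series[i + 1]) / 2.0  # halved-series value
        smoothed.extend([avg, avg])              # redistribute the average
    return smoothed

print(halve_and_redistribute([1.0, 3.0, 10.0, 2.0]))  # [2.0, 2.0, 6.0, 6.0]
```

Note how the peak of 10 is pulled down to 6, the average of its pair, which matches the abstract's claim of reducing large peaks.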
Related papers
Applied Stochastic Models in Business and Industry, 2009
We propose to decompose a financial time series into trend plus noise by means of the exponential smoothing filter. This filter produces statistically efficient estimates of the trend that can be calculated by a straightforward application of the Kalman filter. It can also be interpreted in the context of penalized least squares, in which a function of a smoothing constant is minimized by trading off fitness against smoothness of the trend. The smoothing constant is crucial in deciding the degree of smoothness, and the problem is how to choose it objectively. We suggest a procedure that allows the user to decide at the outset the desired percentage of smoothness and to derive from it the corresponding value of that constant. A definition of smoothness is first proposed, as well as an index of relative precision attributable to the smoothing element of the time series. The procedure is extended to series with different frequencies of observation, so that comparable trends can be obtained for, say, daily, weekly or intraday observations of the same variable. The theoretical results are derived from an integrated moving average model of order (1, 1) underlying the statistical interpretation of the filter. Expressions for equivalent smoothing constants are derived for series generated by temporal aggregation or systematic sampling of another series. Hence, comparable trend estimates can be obtained for the same time series with different lengths, for different time series of the same length, and for series with different frequencies of observation of the same variable. This allows the analyst to study the behavior of each component separately, on the assumption that they have different causal forces.
2021
Time series occupies a leading position in statistical analysis. Nowadays, many economic and industrial operations are built on time series, including predicting variation in product demand, oscillation of future product prices, stock storage control, etc. This paper presents a study showing the effect of transformation and smoothing on the performance of time series. The results show that a significant improvement in time series operations can be achieved when the principles of transformation and smoothing are applied. Introduction: Time series is one of the essential topics nowadays because it is beginning to be applied widely to different types of science. Mathematical-statistical processes for analyzing time series have begun to provide important estimation functions. Furthermore, other significant considerations have informed important decisions and have been used to simulate some mathematical and statistical samples for th...
Verslas: teorija ir praktika, 2011
The financial crisis of 2008-2009 caused many discussions in academia, and as a result, research on the possibility of predicting financial crises and bubbles appeared. Academia has shown growing interest in the issue during the last decade. The majority of studies are based on different forms of forecasting. Some previous studies claim that the trend of the stock market can be forecast using the moving average method. After the financial markets crashed, a need arose to forecast further possible bubbles. As the economies of the Baltic States are very sensitive to such bubbles, it is very important to forecast the trends of the financial markets in advance and to plan the right actions in order to temper the influence of such bubbles on the national economies. Although economic theory is opposed to technical analysis theory, which is the main tool of traders in stock markets, it is widely used. This paper examines whether a proper technical analysis rule such as the Exponential Moving Average (EMA) has predictive power on the stock markets of the Baltic States. The method is applied to the OMX Baltic Benchmark Index and industrial indexes, as they are more or less sensitive to the main index's fluctuations. The results were compared using systematic error measures (mean square error, mean absolute deviation, mean forecast error, mean absolute percentage error), tracking signal evaluation, the CAPM method, and the search for the appropriate EMA period for each market forecast. A graphical analysis was used to determine whether EMA can forecast the main trends of stock market fluctuations. The conclusions made during the research suggest new research issues and new hypotheses for further testing. (Audrius Dzikevičius and Svetlana Šaranda, Verslas: Teorija ir Praktika / Business: Theory and Practice, ISSN 1648-0627 print / ISSN 1822-4202 online, 2011, 12(1): 63-74.)
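The forecast-accuracy measures used in the comparison above (mean square error, mean absolute deviation, mean forecast error, mean absolute percentage error) can be sketched for a one-step-ahead EMA forecast. The period-to-alpha rule alpha = 2/(n+1) is the conventional one; the data values are made up for illustration:

```python
# Sketch of common forecast-accuracy measures applied to a one-step-ahead
# EMA forecast: MSE, MAD (mean absolute deviation), MFE (mean forecast
# error) and MAPE (mean absolute percentage error).

def ema(prices, n):
    """Exponential moving average with the conventional alpha = 2/(n+1)."""
    alpha = 2.0 / (n + 1)
    out = [prices[0]]
    for p in prices[1:]:
        out.append(alpha * p + (1 - alpha) * out[-1])
    return out

def forecast_errors(actual, forecast):
    """Systematic error measures over paired actual/forecast values."""
    errs = [a - f for a, f in zip(actual, forecast)]
    mse = sum(e * e for e in errs) / len(errs)
    mad = sum(abs(e) for e in errs) / len(errs)
    mfe = sum(errs) / len(errs)
    mape = sum(abs(e) / abs(a) for e, a in zip(errs, actual)) / len(errs) * 100
    return {"MSE": mse, "MAD": mad, "MFE": mfe, "MAPE": mape}

prices = [100.0, 102.0, 101.0, 105.0]   # illustrative price series
smoothed = ema(prices, 3)               # EMA(3): alpha = 0.5
# use the EMA at t-1 as the forecast for t
metrics = forecast_errors(prices[1:], smoothed[:-1])
print(metrics)
```

A persistently positive MFE (as here) is exactly what a tracking-signal evaluation flags: the EMA forecast lags a rising market.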
2013
Time series analysis deals with records that are collected over time. The objectives of time series analysis depend on the application, but one of the main goals is to predict future values of the series. These values depend, usually in a stochastic manner, on the observations available at present. Such dependence has to be considered when predicting the future from the past, taking into account trend, seasonality and other features of the data. Some of the most successful forecasting methods are based on the concept of exponential smoothing. There is a variety of methods that fall into the exponential smoothing family, each having the property that forecasts are weighted combinations of past observations. But time series analysis needs proper statistical modeling. The model that best describes the behavior of the series under study can be crucial in obtaining "good" forecasts.
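The "weighted combinations of past observations" property can be made explicit: for simple exponential smoothing, the recursive form is algebraically identical to the weighted sum alpha·Σ_j (1 − alpha)^j · x_{t−j} plus a geometrically vanishing initial term. A small check that the two forms agree (function names and data are illustrative):

```python
# Simple exponential smoothing two ways: the recursion
# level_t = alpha*x_t + (1-alpha)*level_{t-1}, and the equivalent
# explicit weighted combination of past observations.

def ses_recursive(x, alpha, init):
    """Run the smoothing recursion from the initial level."""
    level = init
    for obs in x:
        level = alpha * obs + (1 - alpha) * level
    return level

def ses_weighted(x, alpha, init):
    """Explicit weighted-sum form: weights alpha*(1-alpha)^j by age j."""
    t = len(x)
    total = (1 - alpha) ** t * init          # vanishing initial term
    for j, obs in enumerate(reversed(x)):    # j = age of observation
        total += alpha * (1 - alpha) ** j * obs
    return total

x = [3.0, 5.0, 4.0, 6.0]
a = ses_recursive(x, 0.3, 4.0)
b = ses_weighted(x, 0.3, 4.0)
print(abs(a - b) < 1e-12)  # True: the two forms coincide
```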
International Journal of Mathematics and Statistics Invention (IJMSI), 2019
In this paper, three existing methods used to estimate the degree of smoothness in spline smoothing techniques are compared with a proposed smoothing method for time series data under the assumption that the error terms are independent. The intention is to investigate which method is most effective and consistent in estimating smoothing parameters. A simulation program written in R provides a comparison of the GCV, GML and UBR methods and the proposed method, based on sample sizes of 20, 60 and 100, for four smoothing parameters (λ = 1, 2, 3 and 4) under two sigma levels, i.e., 0.8 and 1.0. It was discovered that when the sample size is small (n = 20), UBR and GCV are equally preferred, and for n = 60 and 100 at smoothing parameters λ = 1, 2, 3 and 4, the UBR method is the best for estimating the degree of smoothness.
Statistical Analysis and …
We adapt smoothing methods to histogram-valued time series (HTS) by introducing a barycentric histogram that emulates the "average" operation, which is the key to any smoothing filter. We show that, due to its linear properties, only the Mallows barycenter is acceptable if we wish to preserve the essence of any smoothing mechanism. We implement barycentric exponential smoothing to forecast the HTS of daily histograms of intradaily returns of both the S&P 500 and IBEX 35 indexes. We construct a one-step-ahead histogram forecast, from which we retrieve a desired Value-at-Risk forecast. In the case of the S&P 500 index, barycentric exponential smoothing delivers a better forecast, in the MSE sense, than those derived from vector autoregression models, especially for the 5% Value-at-Risk. In the case of the IBEX 35, the forecasts from both methods are equally good.
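For distributions on the real line, the Mallows (Wasserstein-2) barycenter that serves as the "average" above is obtained by averaging quantile functions pointwise. A minimal sketch, assuming each histogram is summarized by an equal-length vector of quantiles (that representation is a simplification made here for brevity):

```python
# Sketch of the Mallows (Wasserstein-2) barycenter of one-dimensional
# distributions: the pointwise average of their quantile functions.
# Representing each distribution by a fixed grid of quantiles is an
# assumption made for this illustration.

def mallows_barycenter(quantile_vectors):
    """Pointwise average of quantile vectors = Mallows barycenter."""
    n = len(quantile_vectors)
    return [sum(q[i] for q in quantile_vectors) / n
            for i in range(len(quantile_vectors[0]))]

# two "daily return" distributions, each summarized by 5 quantiles
day1 = [-2.0, -1.0, 0.0, 1.0, 2.0]
day2 = [-4.0, -1.0, 1.0, 2.0, 3.0]
print(mallows_barycenter([day1, day2]))  # [-3.0, -1.0, 0.5, 1.5, 2.5]
```

The barycenter's lowest quantile (here −3.0) is what a Value-at-Risk forecast would read off at the corresponding tail probability.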
Lambert Academic Publishing, 2023
Spline smoothing is a technique used to filter out noise in time series observations when fitting nonparametric regression models. Its performance depends on the choice of the smoothing parameter lambda. Most existing smoothing methods applied to time series data tend to overfit in the presence of autocorrelated errors. The aim of this study is to propose a smoothing method that is the arithmetic weighted value of the Generalized Cross-Validation (GCV) and Unbiased Risk (UBR) methods. The objectives of the study were to (i) determine the best-fit smoothing method for the time series observations; (ii) identify the best smoothing method that does not overfit time series data when autocorrelation is present in the error term; (iii) establish the optimum value of the proposed smoothing method; (iv) compare the GCV, GML and UBR smoothing methods to the proposed smoothing method in terms of sample size; and (v) test the simulation results using real-life data. A hybrid smoothing method was developed by adding the weighted values of GCV and UBR. The Proposed Smoothing Method (PSM) was compared with the Generalized Maximum Likelihood (GML), GCV and UBR smoothing methods. A Monte Carlo experiment of 1,000 trials was carried out at three sample sizes (20, 60 and 100), three levels of autocorrelation (0.2, 0.5 and 0.8), and four degrees of smoothing (1, 2, 3 and 4). Real-life data on Standard International Trade Classification (SITC) export and import price indices in Nigeria between 1970 and 2018, extracted from the CBN 2019 edition, were also used. The performances of the four smoothing methods were estimated and compared using the Predictive Mean Squared Error (PMSE) criterion.
The findings of the study revealed that: (i) for a time series observation with autocorrelated errors, the PSM, a weighted combination of the GCV and UBR criteria, provides the best-fit smoothing method for the model; (ii) the PSM does not overfit the data at any of the autocorrelation levels considered (0.2, 0.5 and 0.8); (iii) the optimum value of the PSM was at the weighted value of 0.04, with the equation given as PSM(λ) = (0.04)[GCV(λ)] + (0.96)[UBR(λ)]; (iv) when there is autocorrelation in the error term, PSM performed better than the GCV, GML and UBR smoothing methods at all time series sizes considered (T = 20, 60 and 100); (v) for the real-life data employed in the study, PSM proved to be the most efficient among the GCV, GML and UBR smoothing methods compared. The study concluded that the PSM provides the best fit as a smoothing method, works well at all autocorrelation levels (ρ = 0.2, 0.5 and 0.8), and does not overfit time series observations. The study recommended the proposed smoothing method for time series observations with autocorrelation in the error term and for real-life econometric data. This study can be applied to nonparametric regression, nonparametric forecasting, spatial, survival and econometric analysis.
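The hybrid criterion described above can be sketched as a weighted sum of the two selection scores, minimized over candidate smoothing parameters. The `gcv` and `ubr` functions below are toy stand-ins (in practice both are computed from the smoothing-spline hat matrix), and the weight 0.04 is the optimum reported by the study:

```python
# Hedged sketch of the Proposed Smoothing Method (PSM): a weighted
# combination of the GCV and UBR selection criteria,
#   PSM(lambda) = w*GCV(lambda) + (1-w)*UBR(lambda),  w = 0.04,
# minimized over candidate lambdas. The criterion functions used here
# are illustrative stand-ins, not real spline-smoothing scores.

W = 0.04  # optimum weight reported by the study

def psm_score(gcv_value, ubr_value, w=W):
    """Weighted GCV/UBR criterion value."""
    return w * gcv_value + (1 - w) * ubr_value

def select_lambda(lambdas, gcv, ubr):
    """Pick the lambda that minimizes the PSM criterion."""
    return min(lambdas, key=lambda lam: psm_score(gcv(lam), ubr(lam)))

# toy criteria with distinct minima (assumptions, for illustration only)
gcv = lambda lam: (lam - 2.0) ** 2 + 1.0
ubr = lambda lam: (lam - 3.0) ** 2 + 0.5
print(select_lambda([1.0, 2.0, 3.0, 4.0], gcv, ubr))  # 3.0
```

With w = 0.04 the criterion leans heavily on UBR, which is consistent with the study's finding that UBR dominates at moderate and large sample sizes.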
International Journal of Forecasting, 2000
In this work we derive an analytical relationship between exact fixed-interval smoothed moments and those obtained from an arbitrarily initialized smoother. Combining this result with a conventional smoother we obtain an exact algorithm that can be applied to stationary, non-stationary or partially non-stationary systems. Other advantages of our method are its computational efficiency and numerical stability. Its extension to forecasting, filtering, fixed-point and fixed-lag smoothing is immediate, as it only requires modification of a conditioning information set. Three examples illustrate the adverse effect of an inadequate initialization on smoothed estimates.
In this paper we present an algorithm that can be implemented recursively or iteratively, to smooth waves by filtering out "noise" until the base case is reached, a canonical form that we call the wave's imprint. Unlike other wave smoothing algorithms that consider extrema as outliers or noise, our wave smoothing algorithm considers extrema to be essential data as is the case with seismic activity, epileptic seizures, and daily highs and lows of the DJIA. It is applicable to any wave structure that has a fixed time period during which a high and low are recorded. We limit the scope of this paper to the analysis of financial markets, demonstrating commonality over a broad spectrum of financial markets including indexes, equities, commodities and currencies. As an application of the algorithm, we devise a simple trading system that is profitable over multiple markets.
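The recursive reduction described above can be sketched on (high, low) bars: each pass merges adjacent bars while preserving the extrema (max high, min low) instead of discarding them, and recursion stops at the base case of a single bar, the "imprint". The merge rule below is an assumption for illustration; the paper's exact reduction is not reproduced here:

```python
# Hypothetical sketch of a recursive wave smoother on (high, low) bars:
# adjacent bars are merged keeping the extrema (max high, min low), and
# recursion bottoms out at a single bar, the wave's "imprint".
# The pairwise merge rule is an assumption made for this illustration.

def smooth_wave(bars):
    """Recursively merge (high, low) bars, preserving extrema."""
    if len(bars) == 1:
        return bars[0]  # base case: the wave's imprint
    merged = []
    for i in range(0, len(bars) - 1, 2):
        (h1, l1), (h2, l2) = bars[i], bars[i + 1]
        merged.append((max(h1, h2), min(l1, l2)))  # extrema survive the merge
    if len(bars) % 2:            # carry an unpaired trailing bar forward
        merged.append(bars[-1])
    return smooth_wave(merged)

daily = [(10, 8), (12, 9), (11, 7), (13, 10)]
print(smooth_wave(daily))  # (13, 7)
```

The result is the overall high and low of the window, illustrating the paper's point that extrema are treated as essential data rather than as noise.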
