Aspects of Time Series Analysis with Entropies and Complexity Measures
This exploratory work examines time series complexity, with direct application to detecting changes in the structure and properties of signals. The objective of the paper is to compare evaluations of the complexity of time series data obtained with entropy-based measures and with data complexity indices. As test signals, deterministic, random, and chaotic signals are considered, both independently and in probabilistic mixtures. The complexity descriptors are based on entropies (Rényi, Tsallis, and the multiscale technique) and on two data complexity indices (Lempel-Ziv complexity and the Lyapunov exponent). High values of the complexity measures are expected in all cases where random or chaotic components are dominant, i.e., have greater amplitudes than the deterministic components. The complexity measures are evaluated in terms of monotonicity, sensitivity to the length of the time series, and ability to detect changes in the structure of the analyzed signal. The results of the computer-based simulation experiments are presented with fuzzy labels and are mixed, i.e., good for some cases and poor for others. These results suggest an aggregate criterion for change detection with at least two terms: one based on entropy and another on complexity indices.
systems, signals, deterministic, random, chaos, complexity, time series, change detection
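As an illustration of one of the data complexity indices named in the abstract, the sketch below counts Lempel-Ziv (LZ76) phrases using the Kaspar-Schuster parsing scheme. This is a generic minimal implementation, not the paper's own code; the function name and the binary test strings are illustrative assumptions.

```python
def lz76_complexity(s: str) -> int:
    """Number of distinct phrases in the LZ76 parsing of sequence s
    (Kaspar-Schuster counting scheme). Generic sketch, not the paper's code."""
    n = len(s)
    if n <= 1:
        return n
    i, k, l = 0, 1, 1    # i: candidate copy start, k: match length, l: current phrase start
    c, k_max = 1, 1      # c: phrase counter, k_max: longest match found for this phrase
    while True:
        if s[i + k - 1] == s[l + k - 1]:
            k += 1                      # extend the current match by one symbol
            if l + k > n:               # phrase runs past the end of the sequence
                c += 1
                break
        else:
            k_max = max(k_max, k)       # remember the longest match so far
            i += 1                      # try the next candidate copy position
            if i == l:                  # no earlier copy extends further: close the phrase
                c += 1
                l += k_max              # next phrase starts after the longest match
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1   # restart the search for the new phrase
            else:
                k = 1
    return c
```

A periodic sequence such as `"ab" * 8` yields a small count, while an aperiodic sequence yields a larger one, which is the kind of monotonic sensitivity to signal structure the abstract evaluates.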