[1] T Cover, J Thomas, Elements of information theory, J Wiley, New York (2006).

[2] I Grosse, P Bernaola-Galván, P Carpena, R Román-Roldán, J Oliver, H E Stanley, Analysis of symbolic sequences using the Jensen-Shannon divergence, Phys. Rev. E 65, 041905 (2002).

[3] M A Ré, R K Azad, Generalization of entropy based divergence measures for symbolic sequence analysis, PLoS ONE 9, e93532 (2014).

[4] B W Silverman, Density estimation for statistics and data analysis, Chapman and Hall, London (1986).

[5] R Steuer, J Kurths, C O Daub, J Weise, J Selbig, The mutual information: Detecting and evaluating dependencies between variables, Bioinformatics 18, S231 (2002).

[6] B C Ross, Mutual Information between discrete and continuous data sets, PLoS ONE 9, e87357 (2014).

[7] A Kraskov, H Stögbauer, P Grassberger, Estimating mutual information, Phys. Rev. E 69, 066138 (2004).

[8] W Gao, S Kannan, S Oh, P Viswanath, Estimating mutual information for discrete-continuous mixtures, 31st Conference on Neural Information Processing Systems (NIPS), 5986 (2017).

[9] A Moreira, P Prats-Iraola, M Younis, G Krieger, I Hajnsek, K P Papathanassiou, A tutorial on synthetic aperture radar, IEEE Geosci. Remote Sens. Mag. 1, 6 (2013).

[10] Y Liu, J Hallett, On size distributions of cloud droplets growing by condensation: a new conceptual model, J. Atmos. Sci. 55, 527 (1998).

[11] Y Liu, P Daum, J Hallett, A generalized systems theory for the effect of varying fluctuations on cloud droplet size distributions, J. Atmos. Sci. 59, 2279 (2002).

[12] M E Pereyra, P W Lamberti, O A Rosso, Wavelet Jensen-Shannon divergence as a tool for studying the dynamics of frequency band components in EEG epileptic seizures, Physica A 379, 122 (2007).

[13] D M Mateos, L E Riveaud, P W Lamberti, Detecting dynamical changes in time series by using the Jensen-Shannon divergence, Chaos 27, 083118 (2017).

[14] S J Sheather, Density estimation, Stat. Sci. 19, 588 (2004).

[15] A Papoulis, Probability, random variables and stochastic processes, McGraw-Hill, New York (1991).

[16] J Burbea, C R Rao, On the convexity of some divergence measures based on entropy functions, IEEE T. Inform. Theory 28, 489 (1982).

[17] J Lin, Divergence measures based on the Shannon entropy, IEEE T. Inform. Theory 37, 145 (1991).

[18] S Kullback, R A Leibler, On information and sufficiency, Ann. Math. Stat. 22, 79 (1951).