Theory and Methods of Entropy Estimation of Discrete Manipulated Signals (Теорія та методи оцінки ентропії дискретних маніпульованих сигналів)
Saved in:
| Date: | 2015 |
|---|---|
| Authors: | , , |
| Format: | Article |
| Language: | Ukrainian |
| Published: | Vinnytsia National Technical University, 2015 |
| Topics: | |
| Online Access: | https://oeipt.vntu.edu.ua/index.php/oeipt/article/view/398 |
| Journal Title: | Optoelectronic Information-Power Technologies |
| Abstract: | The paper presents the theoretical basis for the analytical assessment of the entropy of random processes. The entropy estimation methods developed by R. Hartley, C. Kramp, N. Kolmogorov, C. Shannon, G. Longo, G. Shults, B. Oliver, D. Middleton, W. Tuller, V. Boyun and Ya. Nykolaychuk are systematized. The analysis of these entropy estimates shows that they are founded on a logarithmic function of the probabilities of the information source, on multivariate statistical distributions of increments, on the mathematical expectation, variance and standard deviation, and on various analytical expressions of the autocorrelation function. It is shown that Hartley's entropy measure is an upper bound on the information volumes of G. Longo and G. Shults, which take into account coefficients of information utility. Shannon's measure accounts for the probabilities of the states, while Ya. Nykolaychuk's measure accounts for the variance and the autocorrelation function and comes closest to the intrinsic entropy of the sources. |
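The abstract contrasts the Hartley and Shannon entropy measures, with Hartley's measure serving as an upper bound on Shannon's. The following minimal Python sketch (not taken from the article; the function names and the example signal are illustrative, and it assumes a memoryless discrete source with empirical probability estimates) shows the two measures and the bound:

```python
import math
from collections import Counter

def hartley_entropy(symbols):
    """Hartley measure: log2 of the number of distinct states (ignores probabilities)."""
    return math.log2(len(set(symbols)))

def shannon_entropy(symbols):
    """Shannon measure: -sum(p * log2 p) over the empirical symbol probabilities."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Example: a non-uniform discrete source over 4 states.
signal = list("AAAABBCD")
h_hartley = hartley_entropy(signal)   # log2(4) = 2.0 bits
h_shannon = shannon_entropy(signal)   # 1.75 bits for this distribution
assert h_shannon <= h_hartley + 1e-12  # Shannon entropy never exceeds the Hartley bound
print(f"Hartley: {h_hartley:.3f} bits, Shannon: {h_shannon:.3f} bits")
```

For equiprobable, independent symbols the two measures coincide; the variance- and autocorrelation-based measures surveyed in the article (e.g. Ya. Nykolaychuk's) are not reproduced here.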