Theory and Methods of Entropy Estimation of Discrete Manipulated Signals
| Date: | 2015 |
|---|---|
| Main Authors: | , , |
| Format: | Article |
| Language: | Ukrainian |
| Published: | Vinnytsia National Technical University, 2015 |
| Subjects: | |
| Online Access: | https://oeipt.vntu.edu.ua/index.php/oeipt/article/view/398 |
| Journal Title: | Optoelectronic Information-Power Technologies |
| Summary: | The theoretical basis of the analytical assessment of the entropy of random processes is presented. The entropy estimation methods developed by R. Hartley, C. Kramp, A. N. Kolmogorov, C. Shannon, G. Longo, G. Shults, B. Oliver, D. Middleton, W. Tuller, V. Boyun, and Ya. Nykolaychuk are systematized. Analysis of these entropy estimates reveals that they are founded on logarithmic functions of the probabilities of information sources, multivariate statistical distributions of increments, mathematical expectation, variance, standard deviation, and various analytical expressions of autocorrelation functions. It is shown that R. Hartley's entropy measure is an upper bound on the information volumes of G. Longo and G. Shults, which take into account coefficients of information utility. C. Shannon's measure takes into account the probabilities of states, while Ya. Nykolaychuk's measure takes into account the variance and autocorrelation function and comes closest to the intrinsic entropy of the sources. |
|---|---|
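The summary states that Hartley's measure is an upper bound on the entropy of a source whose state probabilities are accounted for by Shannon's measure. A minimal sketch of that relationship (not from the article itself; the distribution below is an illustrative assumption) can be written as:

```python
import math

def hartley_entropy(n_states):
    # Hartley measure: log2 of the number of possible (equally likely) states
    return math.log2(n_states)

def shannon_entropy(probs):
    # Shannon measure: expected information over the state probabilities
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical 4-state source with non-uniform state probabilities
probs = [0.5, 0.25, 0.125, 0.125]

h_shannon = shannon_entropy(probs)        # 1.75 bits
h_hartley = hartley_entropy(len(probs))   # 2.0 bits

# Shannon entropy never exceeds the Hartley bound,
# with equality only for a uniform distribution
assert h_shannon <= h_hartley
```

For a uniform distribution the two measures coincide, which is why Hartley's logarithmic count of states acts as the upper estimate of the information volume.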