Abstract
In the first part of this paper a new method of applying the Maximum Entropy Principle (MEP) is presented, which makes use of a “frequency related” entropy and which is valid for all stationary processes. The method is believed valid only in the case of discrete spectra. In the second part of the paper, a method of estimating continuous spectra in the presence of noise is presented, which makes use of the Mutual Information Principle (MIP). Although the method proceeds smoothly in mathematical terms, there appear to be some difficulties in interpreting the physical meaning of some of the expressions. Examples of the use of both methods are presented for the usual practical problem of estimating a power spectrum for a process whose autocorrelation function is partially known a priori.
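For context, the classical MEP answer to this practical problem is Burg's maximum entropy spectral analysis: among all spectra consistent with a finite set of autocorrelation lags, the entropy-maximizing one is autoregressive, and its coefficients follow from the Levinson–Durbin recursion. The sketch below illustrates that baseline, not the paper's "frequency related" variant; the function names and example lags are illustrative only.

```python
import numpy as np

def levinson_durbin(r):
    """Solve the Yule-Walker equations for the AR coefficients a[1..p]
    and the prediction-error variance, given autocorrelation lags r[0..p]."""
    p = len(r) - 1
    a = np.zeros(p + 1)
    a[0] = 1.0
    err = r[0]
    for m in range(1, p + 1):
        k = -(r[m] + np.dot(a[1:m], r[m - 1:0:-1])) / err
        a[1:m] = a[1:m] + k * a[m - 1:0:-1]
        a[m] = k
        err *= 1.0 - k * k
    return a, err

def mem_spectrum(r, freqs):
    """Classical maximum-entropy (AR) spectrum consistent with the lags r."""
    a, err = levinson_durbin(np.asarray(r, dtype=float))
    # S(f) = err / |A(f)|^2, with A(f) = sum_k a[k] exp(-j 2 pi f k)
    k = np.arange(len(a))
    A = np.exp(-2j * np.pi * np.outer(freqs, k)) @ a
    return err / np.abs(A) ** 2

# Illustrative partially known autocorrelation (lags 0..3)
r = [1.0, 0.6, 0.2, -0.1]
f = np.linspace(0.0, 0.5, 256)   # normalized frequency
S = mem_spectrum(r, f)
```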
J.F. CYRANSKI and N.S. TZANNES
Abstract
The Mutual Information Principle (MIP) was proposed as a method of inferring the pdf of a continuous random variable based on discrete observations. Its main disadvantage has been the unavailability of closed form solutions. The purpose of this paper is to present some new, easily obtainable closed form solutions, which are based on a new result in Rate Distortion Theory (RDT). The solutions shed new light on the workings of the MIP, but are not unique. This lack of uniqueness is explained and its effects are discussed.
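For orientation, the variational problem the MIP poses can be sketched as follows (my notation, not necessarily the authors'): the inferred density minimizes the mutual information between the continuous variable X and its discrete observation Y, subject to an average-error constraint,

```latex
\min \; I(X;Y)
\quad \text{subject to} \quad
\mathbb{E}\left[\, d(X,Y) \,\right] \le D .
```

Written this way it is formally a rate distortion problem, and the stationarity conditions of RDT yield conditional densities of the exponential form q(x|y) ∝ q(x) e^{-s d(x,y)}, which is presumably the kind of closed form being exploited here.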
N.A. KATSAKOS‐MAVROMICHALIS, M.A. TZANNES and N.S. TZANNES
Abstract
Four entropy methods (MESA, SMESA, MCESA and SMCESA) are reviewed and then used in the problem of resolving two sinusoids in the presence of additive white and 1/f noise. SMCESA appears to have the overall edge in this study. The frequency resolution problem is, of course, of wide practical interest, arising in communications, radar, and related fields.
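A minimal sketch of the test setup described here, assuming unit-amplitude sinusoids, FFT-shaped 1/f noise, and placeholder frequencies and noise levels (the four estimators themselves are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(0)

def one_over_f_noise(n, rng):
    """Approximate 1/f noise by shaping white Gaussian noise in the
    frequency domain (amplitude ~ 1/sqrt(f), so power ~ 1/f)."""
    white = rng.standard_normal(n)
    spec = np.fft.rfft(white)
    f = np.fft.rfftfreq(n)
    f[0] = f[1]              # avoid division by zero at DC
    spec /= np.sqrt(f)
    x = np.fft.irfft(spec, n)
    return x / x.std()

n = 512
t = np.arange(n)
f1, f2 = 0.20, 0.22          # two closely spaced frequencies (placeholders)
signal = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)
x = signal + 0.5 * rng.standard_normal(n) + 0.5 * one_over_f_noise(n, rng)
# x is the record handed to each spectral estimator under comparison
```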
Abstract
Rate Distortion Theory is a branch of Information Theory applicable to the case when the entropy of the source exceeds the capacity of the channel. A rate distortion function R(D) is defined between the input and output alphabets X, Y of a channel. It can be shown that it is possible to design a communication system which achieves a fidelity D when the capacity C of the channel is greater than R(D). In this paper, the formulation of Rate Distortion Theory is used for the problem of derived probability models. The variables X, Y and the channel are given new interpretations, and the result is an ability to pick a derived probability model for Y when X has a known probability structure. The fidelity criterion assumes the role of an error function in this terminology. Two specific cases are discussed.
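In standard notation, R(D) is the minimum of I(X;Y) over test channels meeting the fidelity D, and a point on the curve can be computed numerically with Blahut's alternating-minimization algorithm. The sketch below is a generic discrete implementation for an assumed source and distortion matrix, not the paper's reinterpretation:

```python
import numpy as np

def blahut_point(p_x, d, s, iters=500):
    """One point on R(D) = min_{q(y|x): E[d] <= D} I(X;Y) for a discrete
    source p_x and distortion matrix d[x, y], at slope parameter s > 0.
    Alternates between the optimal test channel and output marginal."""
    q_y = np.full(d.shape[1], 1.0 / d.shape[1])
    for _ in range(iters):
        w = q_y * np.exp(-s * d)                  # shape (X, Y)
        q_yx = w / w.sum(axis=1, keepdims=True)   # optimal channel for q_y
        q_y = p_x @ q_yx                          # optimal marginal for channel
    D = np.sum(p_x[:, None] * q_yx * d)
    R = np.sum(p_x[:, None] * q_yx * np.log(q_yx / q_y))
    return R, D

# Illustrative binary source with Hamming distortion
p_x = np.array([0.5, 0.5])
d = np.array([[0.0, 1.0],
              [1.0, 0.0]])
R, D = blahut_point(p_x, d, s=2.0)   # rate in nats at distortion D
```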
Abstract
A simple extension of the concept of the inner product leads to orthonormal expansions of time-shifted signals with coefficients dependent on the shift variable. It is shown that such expansions have their counterparts of Parseval's identity and Bessel's inequality. The Projection Theorem holds, and versions of Mercer's theorem and the Karhunen–Loève expansion are also shown to hold, in a non-stochastic regime. The approach leads to new interpretations of time correlation functions and Fourier series expansions.
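In symbols, the construction described here can be sketched as follows (my notation, assuming a complete orthonormal system {φ_n}): the shifted signal expands with shift-dependent coefficients, and Parseval's identity holds for every value of the shift,

```latex
f(t-\tau) = \sum_n c_n(\tau)\,\phi_n(t),
\qquad
c_n(\tau) = \int f(t-\tau)\,\overline{\phi_n(t)}\,dt,
\qquad
\int |f(t-\tau)|^2\,dt = \sum_n |c_n(\tau)|^2 .
```

For an incomplete system the last equality relaxes to Bessel's inequality, with the sum bounded above by the integral.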
RALLIS C. PAPADEMETRIOU, THOMAS J. KETSEOGLOU and NICOLAOS S. TZANNES
Abstract
The Mutual Information Principle (MIP) is reviewed as a method of assigning a prior probability mass or density function to a random variable in the presence of some prior information. It is compared to the Maximum Entropy (ME) method and shown to be more general and inclusive of the prior data available to the investigator. The image restoration problem is outlined as an inverse source problem with insufficient data for yielding a unique solution.
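The inverse source problem mentioned in the last sentence can be sketched with a generic linear degradation model (assumed here only for illustration):

```latex
g(x) = \int h(x, x')\, f(x')\, dx' + n(x),
```

where g is the recorded image, h the point-spread function, f the unknown object, and n the noise; many objects f are consistent with a given g, so an inference principle such as the MIP is needed to single out one solution.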
Abstract
The problem of modeling the performance distributions of queueing systems, on the basis of partial knowledge of the service time distribution, is examined from an information theory point of view. A new method is proposed, based on the Mutual Information Principle (MIP), which generalizes the Maximum Entropy Principle (MEP) approach proposed by Shore. An example is given to illustrate the method, and its advantages are discussed.
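As a reminder of the MEP baseline being generalized (the flavor of Shore's result, not necessarily the exact constraints used in the paper): if only the mean number in the system, n̄, is known, maximizing the entropy of the queue-length distribution yields the geometric law,

```latex
\max_{p} \; -\sum_{n\ge 0} p(n)\ln p(n)
\quad \text{s.t.} \quad \sum_{n\ge 0} n\,p(n) = \bar{n}
\;\Longrightarrow\;
p(n) = (1-\rho)\,\rho^{\,n}, \qquad \rho = \frac{\bar{n}}{1+\bar{n}} .
```

The proposed method generalizes this by working with a mutual information functional in place of the entropy.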
Abstract
The Mutual Information Principle (MIP) has already been used in various areas, as a generalization of the Maximum Entropy Principle (MEP), in the very common situation where our measurements of a random variable contain errors having some known average value. An axiomatic derivation of the MIP is given below, in order to place it in a rigorous mathematical framework with the least possible intuitive arguments. The procedure followed is similar to the one proposed by Shore and Johnson for the Minimum Cross-entropy Principle, and some relationships between the two methods of inductive inference are pointed out.
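For orientation, the two functionals being related are, in standard notation: the Minimum Cross-entropy Principle selects the density q closest to a prior p in the Kullback–Leibler sense, while the MIP minimizes the mutual information between the variable and its error-contaminated observation,

```latex
\text{MCE:}\;\; \min_q \int q(x)\ln\frac{q(x)}{p(x)}\,dx,
\qquad
\text{MIP:}\;\; \min \; I(X;Y) = \iint p(x,y)\ln\frac{p(x,y)}{p(x)\,p(y)}\,dx\,dy,
```

each minimization being taken subject to the constraints supplied by the available measurements.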
Abstract
The possibility of improving the quality of the minimum relative entropy spectral estimates by properly selecting the set of autocorrelation values is demonstrated. The study concentrates on two aspects: resolvability and accuracy of peak location. Several numerical examples are given.
JOSEPH P. NOONAN and JAMES R. MARCUS
Abstract
The problem of modelling stochastic systems when only a partial statistical description is available is considered. Specifically, a procedure is proposed for assigning an optimal joint probability model relating the input and output of the system, where the partial statistical description becomes the set of constraints. The Mutual Information functional is used to establish the model, leading to a criterion which is optimal in an information-theoretic sense. Results showing general solutions for cases of interest in digital communications, as well as for continuous systems with noise variance knowledge, are given.
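For the continuous case with noise variance knowledge, the kind of closed form one expects is the Gaussian one (quoted here as the standard rate-distortion result, not necessarily the paper's exact expression): for a Gaussian input of variance σ² and a mean-square error allowance D ≤ σ²,

```latex
\min_{p(y\mid x)\,:\,\mathbb{E}[(X-Y)^2]\le D} I(X;Y) = \frac{1}{2}\ln\frac{\sigma^2}{D},
```

with the minimum achieved by a jointly Gaussian model for (X, Y).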