Abstract
Purpose
The purpose of this paper is to explain why fractals, self-similarity, and fractional Brownian motion are so pervasive in human systems.
Design/methodology/approach
The analysis mainly involves relative observation, Minkowskian observation, Euclidean observation, and fractional calculus.
Findings
It is shown that observation with informational invariance, which is a model of subjectivity, creates fractals and self-similarity.
Research limitations/implications
This result could have an application to the quantitative analysis of volatility in finance, for instance.
Practical implications
The paper supports the use of fractional dynamics to describe human systems.
Originality/value
The paper provides practical arguments that may explain why fractals are so pervasive in natural science, and especially in systems involving human factors.
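As a concrete illustration of the processes in question (added here, not drawn from the paper itself), a fractional Brownian motion path can be sampled directly from its standard covariance function; the Hurst exponent H controls the self-similarity, with H = 1/2 recovering ordinary Brownian motion.

```python
import numpy as np

def fbm_cholesky(n, hurst, t_max=1.0, seed=0):
    """Sample fractional Brownian motion on (0, t_max] by Cholesky
    factorization of its covariance matrix:
    Cov(B_H(t), B_H(s)) = (|t|^2H + |s|^2H - |t - s|^2H) / 2."""
    t = np.linspace(t_max / n, t_max, n)
    cov = 0.5 * (t[:, None] ** (2 * hurst) + t[None, :] ** (2 * hurst)
                 - np.abs(t[:, None] - t[None, :]) ** (2 * hurst))
    rng = np.random.default_rng(seed)
    path = np.linalg.cholesky(cov) @ rng.standard_normal(n)
    return np.concatenate([[0.0], path])   # B_H(0) = 0

# H > 1/2 gives persistent, long-memory increments of the kind
# invoked for volatility in finance; H = 1/2 is ordinary Brownian motion.
path = fbm_cholesky(512, hurst=0.7)
```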
Abstract
Surveys some of the important contributions of information theory (IT) to the understanding of systems science and cybernetics. Presents a short background on the main definitions of IT, and examines the ways in which IT can be thought of as a unified approach to general systems. Analyses the topics of syntax and semantics in information, information and self-organization, entropy of forms (entropy of non-random functions), and information in dynamical systems. Enumerates some suggestions for further research and takes this opportunity to describe new points of view, mainly by using the entropy of non-random functions.
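For reference, the central definition underlying the survey is the Shannon entropy of a discrete probability distribution; a minimal computation, added here for the reader's convenience:

```python
import numpy as np

def shannon_entropy(p, base=2):
    """Shannon entropy H(p) = -sum_i p_i log p_i of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                        # convention: 0 log 0 = 0
    return float(-np.sum(p * np.log(p)) / np.log(base))

print(shannon_entropy([0.5, 0.25, 0.25]))  # 1.5 bits
```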
Abstract
In this paper, information theory, and more particularly the concept of entropy, is combined with the statistical theory of decision to derive new criteria for pattern recognition. A generalized definition of entropy is considered as a risk function, and the generalized decision rules so obtained contain the family of Bayesian decisions as special cases. These criteria may help to check the results obtained by the usual techniques; they can be used in adaptive and learning systems, and more generally they can be useful in cybernetic systems.
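The paper's generalized entropies are not reproduced here, but the flavour of "entropy as a risk function" can be conveyed by a standard device, used purely as an illustration: a Bayes decision rule with an entropy-based reject option, which abstains when the posterior is too uncertain.

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def decide(posterior, reject_threshold=0.8):
    """Maximum-posterior (Bayes) decision, but abstain when the
    posterior entropy (playing the role of a risk) exceeds a threshold."""
    if entropy(posterior) > reject_threshold:
        return None                     # defer the decision
    return int(np.argmax(posterior))

print(decide([0.9, 0.05, 0.05]))  # 0    (confident)
print(decide([0.4, 0.35, 0.25]))  # None (too uncertain)
```

With the threshold set to infinity the rule reduces to the ordinary Bayesian decision, mirroring the way the paper's generalized rules contain the Bayesian family as special cases.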
Abstract
The theory of possibility (Zadeh, Sugeno) and the theory of relative information (Jumarie) both aim to deal with the meaning of information, but their mathematical frameworks are quite different. In the first approach, possibility is described either by fuzziness (Zadeh) or by generalized measures (Sugeno), and in the second, possibility is obtained as the result of observing probability via an observation process with informational invariance. Shows that a combination of (classical) information theory with generalized maximum likelihood via geometric programming exhibits a link between relative information, fuzziness and possibility. Some consequences are outlined.
Abstract
The problem of expanding a meaningful entropic theory for fuzzy information cannot be thought of as a mere (more or less formal) extension of Shannon theory. By using the information theory of deterministic functions, the present author had already obtained some results in this direction, and he herein continues this approach. After a short background on the different entropies of deterministic functions and on the membership entropy of fuzzy sets, the mixed entropy of fuzzy sets, the joint membership functions of independent fuzzy sets, and the conditional entropy of fuzzy sets with respect to other fuzzy sets are considered in turn; the problem of defining transinformation between fuzzy sets, as a generalisation of the well-known Shannon concept, is then examined. One of the conclusions of the article is that it is possible to build up a meaningful information theory of fuzzy sets by using the entropy of deterministic functions.
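For comparison (this is the classical De Luca and Termini membership entropy, not the author's entropy of deterministic functions), the uncertainty carried by the membership values of a fuzzy set can be computed as follows:

```python
import numpy as np

def membership_entropy(mu):
    """De Luca-Termini entropy of a fuzzy set from its membership
    values mu_i in [0, 1]; maximal when every mu_i = 0.5."""
    mu = np.clip(np.asarray(mu, dtype=float), 1e-12, 1.0 - 1e-12)
    return float(-np.sum(mu * np.log2(mu) + (1 - mu) * np.log2(1 - mu)))

print(membership_entropy([0.1, 0.5, 0.9]))  # ~1.94; a crisp set gives ~0
```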
Abstract
In the present literature on fuzzy sets and fuzzy information, there is much confusion between entropies of fuzzy sets and fuzzy sets of entropies. After a thorough critical review of this question, proposes a unified approach based on the theory of deterministic functions. One must carefully distinguish between index of fuzziness, uncertainty of fuzziness and uncertainty of randomness on the one hand; and uncertainty of fuzzy sets and uncertainty of possibility on the other hand. This new framework could provide new approaches to management of uncertainty originating from both probability and possibility distributions.
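The first distinction can be made concrete: an index of fuzziness measures how far a fuzzy set lies from its nearest crisp set, which is not the same thing as an entropy. A sketch using Kaufmann's linear index (a standard definition, given here only for illustration):

```python
import numpy as np

def index_of_fuzziness(mu):
    """Kaufmann's linear index: normalized distance from the fuzzy
    set to the nearest crisp (0/1) set; 0 for crisp sets, 1 when all mu = 0.5."""
    mu = np.asarray(mu, dtype=float)
    return float(2.0 * np.mean(np.minimum(mu, 1.0 - mu)))

print(index_of_fuzziness([0.1, 0.5, 0.9]))  # ~0.47
print(index_of_fuzziness([0.0, 1.0, 1.0]))  # 0.0 (crisp)
```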
Abstract
By combining the theory of relative information and the information of deterministic functions, one can obtain a model of possibility-probability transformation. Relative information defines possibility in terms of the syntax-semantics coupling of natural languages, while the entropy of deterministic functions refers to the maximum conditional entropy principle. These two theories use the basic concepts of Shannon information theory, and do not invoke any of the more recent notions such as belief, necessity, confusion, dissonance, nonspecificity, and so on. The model appears as a direct consequence of the Shannon theory itself.
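The model's own formulae follow from relative information and the maximum conditional entropy principle and are not reproduced here; as a simpler stand-in, the familiar ratio-scale transformation shows what a possibility-probability conversion looks like in practice:

```python
import numpy as np

def prob_to_poss(p):
    """Ratio-scale transformation pi_i = p_i / max(p): the most
    probable event gets possibility 1 (a common textbook choice)."""
    p = np.asarray(p, dtype=float)
    return p / p.max()

def poss_to_prob(pi):
    """Converse direction by renormalization."""
    pi = np.asarray(pi, dtype=float)
    return pi / pi.sum()

p = np.array([0.5, 0.3, 0.2])
print(prob_to_poss(p))                 # [1.  0.6 0.4]
print(poss_to_prob(prob_to_poss(p)))   # [0.5 0.3 0.2], i.e. it round-trips
```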
Abstract
In the framework of his theory of relative information, derives formulae to convert probability into possibility and conversely, and shows how they can be utilized to reconsider many questions related to fuzzy sets and approximate reasoning from a meaningful information-theoretic standpoint. The key to this approach is that the membership of a fuzzy set is now thought of as equivalent to the pair (probability, fuzziness coefficient). A sound new concept of informational membership entropy, fully consistent with Shannon theory, is thereby developed. In this new paradigm, the fuzzy object is simultaneously defined by the set itself and its complement. Thus obtains a new modelling of the union and the intersection of fuzzy sets, and new approaches to the quantitative modelling of “If A then B”.
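For contrast, the classical Zadeh connectives that this new modelling departs from act pointwise on memberships (shown here only as the baseline, not the paper's construction):

```python
import numpy as np

# Zadeh's union and intersection: pointwise max and min of memberships.
mu_a = np.array([0.2, 0.7, 1.0])
mu_b = np.array([0.5, 0.4, 0.3])

print(np.maximum(mu_a, mu_b))  # union:        [0.5 0.7 1. ]
print(np.minimum(mu_a, mu_b))  # intersection: [0.2 0.4 0.3]
```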
Abstract
By combining the subjective probabilistic viewpoint of fuzziness with the entropy of deterministic functions, it is possible to expand an information theory of fuzzy sets which is fully compatible and consistent with the classical Shannonian information-theoretic framework. A model of transinformation between fuzzy sets, which could be of help in approximate reasoning, can be obtained; an interesting feature is that it can be duplicated in the framework of fuzzy set theory.
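The classical Shannon transinformation (mutual information) that such a model generalizes can be computed directly from a joint distribution; a minimal sketch:

```python
import numpy as np

def transinformation(p_xy):
    """Shannon transinformation I(X;Y) = sum p(x,y) log2[p(x,y) / (p(x)p(y))]."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    nz = p_xy > 0                       # marginals are nonzero there too
    return float(np.sum(p_xy[nz] * np.log2(p_xy[nz] / (p_x * p_y)[nz])))

print(transinformation([[0.25, 0.25], [0.25, 0.25]]))  # 0.0 (independent)
print(transinformation([[0.5, 0.0], [0.0, 0.5]]))      # 1.0 bit
```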
Abstract
The complexity of a general system is identified with its temperature and, by analogy with Boltzmann's probability density in thermodynamics, this temperature is related to the informational entropy of the system. The concept of informational entropy of deterministic functions provides a straightforward modelling of Brillouin's negentropy (negative entropy), so that a system can be characterized by its complexity and its dual complexity. States composition laws for complexities expressed in terms of Shannonian entropy with or without probability, and then extends the approach to the quantum entropy of non-probabilistic data. Outlines some suggestions for future investigation.
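On one common convention (used here only for illustration), Brillouin's negentropy is the gap between the maximal and the actual Shannon entropy of a distribution, so that order shows up as positive negentropy:

```python
import numpy as np

def negentropy(p):
    """Negentropy J(p) = log2(n) - H(p): zero for the uniform
    (maximally disordered) distribution, maximal for a point mass."""
    p = np.asarray(p, dtype=float)
    h_max = np.log2(p.size)
    q = p[p > 0]
    return float(h_max + np.sum(q * np.log2(q)))

print(negentropy(np.array([0.25, 0.25, 0.25, 0.25])))  # 0.0
print(negentropy(np.array([1.0, 0.0, 0.0, 0.0])))      # 2.0 (fully ordered)
```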