  1. Entropy | An Open Access Journal from MDPI

    Entropy is an international and interdisciplinary peer-reviewed open access journal of entropy and information studies, published monthly online by MDPI.

  2. Entropy - MDPI

    The concept of entropy constitutes, together with energy, a cornerstone of contemporary physics and related areas. It was originally introduced by Clausius in 1865 along abstract lines …

  3. Entropy | Aims & Scope - MDPI

    Entropy (ISSN 1099-4300), an international and interdisciplinary journal of entropy and information studies, publishes reviews, regular research papers and short notes.

  4. Methods to Calculate Entropy Generation - MDPI

    Mar 7, 2024 · Entropy generation, formulated by combining the first and second laws of thermodynamics with an appropriate thermodynamic potential, emerges as the difference …

  5. Shannon Entropy in Uncertainty Quantification for the Physical ...

    Feb 6, 2025 · This model is randomized here using Monte Carlo simulation to estimate the Shannon entropy of all these physical parameters, which is the crucial … (an illustrative estimation sketch follows after this results list)

  6. A Brief Review of Generalized Entropies - MDPI

    Oct 23, 2018 · Entropy appears in many contexts (thermodynamics, statistical mechanics, information theory, measure-preserving dynamical systems, topological dynamics, etc.) as a …

  7. Entropy and the Brain: An Overview - MDPI

    Aug 21, 2020 · Entropy is a powerful tool for quantification of the brain function and its information processing capacity. This is evident in its broad domain of applications that range from …

  8. Applications of Entropy in Data Analysis and Machine Learning: A …

    Dec 23, 2024 · More precisely, this paper aims to provide an up-to-date overview of the applications of entropy in data analysis and machine learning, where entropy stands here not only for the …

  9. Entropy: From Thermodynamics to Information Processing - MDPI

    Oct 14, 2021 · Entropy is most commonly defined as “disorder”, although it is not a good analogy since “order” is a subjective human concept, and “disorder” cannot always be obtained from …

  10. A Review of Shannon and Differential Entropy Rate Estimation

    Aug 13, 2021 · Entropy rate, which measures the average information gain from a stochastic process, quantifies the uncertainty and complexity of that process. We discuss the …
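
Result 5 describes estimating the Shannon entropy of sampled physical parameters via Monte Carlo simulation. As a minimal illustration of that general idea, and not the estimator used in that paper, the sketch below bins Monte Carlo samples into a histogram and applies the plug-in formula H = -Σ p_i log2 p_i. The function name, bin count, and Gaussian test input are assumptions made for the example.

```python
import numpy as np

def shannon_entropy_from_samples(samples, bins=64):
    # Plug-in estimate of Shannon entropy (in bits) from Monte Carlo samples:
    # bin the samples, normalize counts to probabilities, and apply
    # H = -sum(p_i * log2(p_i)).
    # Note: for a continuous quantity this value depends on the bin width.
    counts, _ = np.histogram(samples, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                      # drop empty bins to avoid log(0)
    return -np.sum(p * np.log2(p))

# Example with a hypothetical "physical parameter" sampled from a Gaussian.
rng = np.random.default_rng(0)
samples = rng.normal(loc=0.0, scale=1.0, size=100_000)
print(f"Estimated entropy: {shannon_entropy_from_samples(samples):.3f} bits")
```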
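
Result 10 treats the entropy rate as the average information gain per symbol of a stochastic process. One common plug-in approximation, sketched below for a discrete symbol sequence (a generic estimator, not the methods reviewed in that paper), is the conditional block entropy h_k = H(k) - H(k-1), which approaches the true rate for large k given enough data.

```python
import numpy as np
from collections import Counter

def block_entropy(sequence, k):
    # Plug-in Shannon entropy (bits) of the distribution of length-k blocks.
    blocks = [tuple(sequence[i:i + k]) for i in range(len(sequence) - k + 1)]
    counts = np.array(list(Counter(blocks).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def entropy_rate_estimate(sequence, k):
    # Conditional block entropy h_k = H(k) - H(k-1); for a stationary source
    # this approaches the entropy rate as k grows (given enough data).
    return block_entropy(sequence, k) - block_entropy(sequence, k - 1)

# Example: biased i.i.d. binary source with true rate
# -(0.3*log2(0.3) + 0.7*log2(0.7)) ≈ 0.881 bits/symbol.
rng = np.random.default_rng(1)
seq = rng.choice([0, 1], size=200_000, p=[0.7, 0.3])
print(f"Estimated entropy rate: {entropy_rate_estimate(seq, 3):.3f} bits/symbol")
```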