## Entropy

The concept of entropy arose in the physical sciences during the nineteenth century, particularly in thermodynamics and statistical physics, as a measure of the equilibria and evolution of thermodynamic systems. Two main views developed: the macroscopic view, formulated originally by Carnot, Clausius, Gibbs, Planck, and Carathéodory, and the microscopic approach associated with Boltzmann and Maxwell. Since then, both approaches have made possible deep insights into the nature and behavior of thermodynamic and other microscopically unpredictable processes. However, the mathematical tools involved later developed independently of their original physical background, leading to a plethora of methods and differing conventions.

### Contents

| Chapter | Page |
| --- | --- |
| Introduction | 1 |
| Fundamental Concepts | 17 |
| Entropy: A Subtle Concept in Thermodynamics | 19 |
| Probabilistic Aspects of Entropy | 37 |
| Entropy in Thermodynamics | 55 |
| Phenomenological Thermodynamics and Entropy Principles | 57 |
| Entropy in Nonequilibrium | 79 |
| Entropy for Hyperbolic Conservation Laws | 107 |
| Irreversibility and the Second Law of Thermodynamics | 121 |
| The Entropy of Classical Thermodynamics | 147 |
| Entropy in Stochastic Processes | 197 |
| Large Deviations and Entropy | 199 |
| Relative Entropy for Random Motion in a Random Medium | 215 |
| Metastability and Entropy | 233 |
| Entropy Production in Driven Spatially Extended Systems | 251 |
| Entropy: A Dialogue | 269 |
| Entropy and Information | 277 |
| Classical and Quantum Entropies: Dynamics and Information | 279 |
| Complexity and Information in Data | 299 |
| Entropy in Dynamical Systems | 313 |
| Entropy in Ergodic Theory | 329 |