Arieh Ben-Naim

The Hebrew University of Jerusalem




This book provides a clear and mystery-free presentation of the central concepts in thermodynamics — probability, entropy, Helmholtz energy and Gibbs energy. It presents the concepts of entropy, free energy and various formulations of the Second Law in a friendly, simple language. It is devoid of all kinds of fancy and pompous statements made by authors of popular science books who write on this subject.

The book focuses on the Four Laws of Thermodynamics. As stated on the dedication page, this book is addressed to readers who may already have been exposed to Atkins' book of a similar title. It challenges both the title and the contents of Atkins' book, Four Laws That Drive The Universe. One can glean from the title of this new book that the author's views are diametrically opposed to those of Atkins.

The book is addressed to any curious and intelligent reader. It aims to tickle, and hopefully satisfy, your curiosity. It also aims to challenge your gray matter and to enrich your knowledge with facts and ideas regarding the Four Laws of Thermodynamics.

Readership: Anyone interested in the sciences, students, and researchers, as well as laypersons.

This book is about the definition of the Shannon Measure of Information (SMI), and some derived quantities such as conditional information and mutual information. Unlike many books, which refer to the SMI as "Entropy," this book makes a clear distinction between the SMI and Entropy. In the last chapter, Entropy is derived as a special case of the SMI. Ample examples are provided to help the reader understand the different concepts discussed in the book. As with previous books by the author, this book aims at a clear and mystery-free presentation of the central concept in Information theory: the Shannon Measure of Information. It presents the fundamental concepts of Information theory in a friendly, simple language and is devoid of all kinds of fancy and pompous statements made by authors of popular science books who write on this subject. It is unique in its presentation of the Shannon Measure of Information and in the clear distinction it draws between this concept and the thermodynamic entropy. Although some mathematical knowledge is required of the reader, the emphasis is on the concepts and their meaning rather than on the mathematical details of the theory.

Dear Visitor,
Welcome to my website. This site shares my interests, publications, and research activities. If you are looking for information about my books, my teaching, or anything else not available on this website, feel free to contact me at

My interests in science range from the theory of liquids and solutions, to theories of water and aqueous solutions, to theoretical problems in biochemistry and biophysics.

  1. Classical music, particularly the clarinet; I love Schubert
  2. Israeli folk dancing (that is what keeps me fit)
  3. Travelling (I am always thrilled to visit, or better, to reside in a new place)
My CV, list of publications, and list of courses I have given are included on this website.

Please peruse my other books.


This book discusses the proper definitions of entropy, the valid interpretation of entropy, and some useful applications of the concept of entropy. Unlike many books, which apply the concept of entropy to systems for which it is not even defined (such as living systems, black holes, and the entire universe), this book restricts itself to applications that help the reader understand the meaning of entropy. It also emphasizes the limits of applicability of the concept of entropy and of the Second Law of Thermodynamics. As with the author's previous books, this book aims at a clear and mystery-free presentation of the central concept in thermodynamics: entropy. The concepts of entropy and the Second Law are presented in a friendly, simple language, devoid of all kinds of fancy and pompous statements made by authors of popular science books who write on this subject.

This book is unique in the following senses: First, it provides three different, but equivalent, definitions of Entropy. Second, it provides a simple, valid, and proven interpretation of Entropy and the Second Law of Thermodynamics based on Shannon's Measure of Information. Third, it is the only book that proves that Entropy is not a function of time. It also shows that Clausius over-generalized when he formulated the Second Law, and that Boltzmann misinterpreted his own H-theorem.

This book discusses entropy and the Second Law of Thermodynamics in such a way that everyone can understand its subject matter. Entropy is one of the most interesting concepts in physics. Although it is a well-defined concept, it is still perceived by even well-known scientists as a concept cloaked in mystery. It is also the most misused, and often abused, concept in physics. In order to understand entropy, one needs to understand the Shannon measure of information, and in order to grasp this idea, one must be familiar with some basic concepts of probability. Therefore, this book consists of three chapters: the first discusses probability, the second addresses Information Theory, and the third considers entropy and the Second Law of Thermodynamics. Readers will discover that the Second Law is nothing but a law of probability.

The greatest blunder ever in the history of science.

The Second Law of thermodynamics, the law of entropy, is one of the longest-standing laws of physics, unchanged even by the last century’s two drastic revolutions in physics.

However, the concept of entropy has long been misinterpreted and misused – making it the greatest ever blunder in the history of science, propagated for decades by scientists and non-scientists alike.

This blunder was initially and primarily brought on by a deep misunderstanding of the concept of entropy. Ironically, ignorance about the meaning of entropy has led some scientists to associate entropy with ignorance, and the Second Law with the “law of spreading ignorance.”

In his book, Arieh Ben-Naim, a respected professor of physical chemistry, attempts to right these wrongs. He scrutinizes twelve misguided definitions and interpretations of entropy, brings order to the chaos, and finally lays out the true meaning of entropy in clear and accessible language anyone can understand.

This book is unique in presenting all aspects of water. It includes a discussion of the theory of the water molecule and of its properties, both in the pure state and as a solvent. In particular, it emphasizes the relevance of water to life. Water is the most important liquid. It is also a vital component of all living systems. It has very unusual properties, which make it the most interesting liquid for research and study.


This book addresses all aspects of Time, and is addressed to all readers, from laypersons to research scientists. Setting the tone for the introduction are some amusing notes about Time in the Bible, which dovetail into the elusive question of the "definition" of time. Here, the author suggests a practical, or operational, "definition" of Time. This definition involves counting the cycles of any periodic process, and is later used to debunk one of the greatest blunders ever in the history of science: conflating entropy with time. In addition, the book surveys the history of thinking about time, the perception of time, the development of devices for measuring time, and the portrayal of time in art and humor. All this and more is presented in simple, accessible language for a wide range of audiences. Please check all the graphic figures for the book.

This monograph presents the molecular theory and the tools necessary for the study of solvent-induced interactions and forces. After introducing the reader to the basic definitions of solvent-induced interactions, the author provides a brief analysis of the relevant statistical thermodynamics. The book thoroughly reviews the connection between these interactions and thermodynamics, and then focuses on hydrophobic and hydrophilic interactions and forces. The importance of hydrophilic interactions and forces in various biochemical processes is thoroughly analyzed, and evidence from theory, experiment, and simulation, showing that hydrophilic interactions and forces are far more important than the corresponding hydrophobic effects in many biochemical processes (such as protein folding, self-assembly of proteins, and molecular recognition), is described in detail. This title is of great interest to students and researchers working in the fields of chemistry, physics, biochemistry, and molecular biology.

This book focuses on analysing the applications of the Shannon Measure of Information (SMI). The book introduces the concept of frustration and discusses the question of the quantification of this concept within information theory (IT), while it also focuses on the interpretation of the entropy of systems of interacting particles in terms of the SMI and of mutual information. The author examines the question of the possibility of measuring the extent of frustration using mutual information and discusses some classical examples of processes of mixing and assimilation for which the entropy changes are interpreted in terms of SMI. A description of a few binding systems and the interpretation of cooperativity phenomena in terms of mutual information are also presented, along with a detailed discussion on the general method of using maximum SMI in order to find the “best-guess” probability distribution. This book is a valuable contribution to the field of information theory and will be of great interest to any scientist who is interested in IT and in its potential applications.