What do we do?

  • We help people act with insight.
  • We help companies grow from the inside.
  • We help employees turn into thinkers.

We ignite thought

If nature has made any one thing less susceptible than all others of exclusive property, it is the action of the thinking power called an idea, which an individual may exclusively possess as long as he keeps it to himself; but the moment it is divulged, it forces itself into the possession of every one, and the receiver cannot dispossess himself of it. Its peculiar character, too, is that no one possesses the less, because every other possesses the whole of it. He who receives an idea from me, receives instruction himself without lessening mine; as he who lights his taper at mine, receives light without darkening me.

- Thomas Jefferson, on Patents and Freedom of Ideas

Insights in Science Lecture Abstracts

Abstracts of currently available lectures are provided below. All lectures will be delivered by Prof. Rajendra Bera. This list is subject to change.

To learn more about the background and objectives of these lectures, please visit Insights in Science.

From hunter-gatherer to knowledge-worker

In very broad terms, the world's economic development can be divided into four stages: hunter-gatherer (until about 12,000 years ago; more than 99% of our time on earth), agricultural (from about 12,000 years ago until about 1500 AD), industrial (from about 1500 AD to the latter half of the 20th century), and post-industrial (the latter half of the 20th century and continuing)¹, although a substantial commingling of two or more stages can be seen even today in many countries, including the world's most advanced nations. The hunter-gatherer stage can support only about one inhabitant per square mile and demands a nomadic life of extraordinarily land-intensive activity. In the post-industrial information (knowledge-gatherer) age, we are primarily concerned with creating knowledge and using it to produce marketable products and services as quickly and economically as possible. The focus is therefore on knowledge workers. The knowledge-gatherer stage can support several orders of magnitude more inhabitants per square mile than was possible in the hunter-gatherer stage.

Axiomatic mathematics

Euclid's geometry is the first specific evidence of an axiomatic treatment of mathematics. Some 2000 years after Euclid, several mathematicians reexamined its axioms and discovered non-Euclidean geometry. One such geometry forms the space-time geometry of Einstein's general theory of relativity. The discovery of non-Euclidean geometry was a revolution in mathematics, and it led to what now forms the heart of mathematics: formal axiomatic systems. Formal systems form the basis of reasoning in mathematics and of all the computations we do on digital computers.

How reliably can we compute?

Several simple computations, as implemented on digital computers, will be examined. Their surprising common feature is that while there is no flaw in the coded logic, the computations fail. The reason for their failure and their remedies will be discussed. The lesson: programming is not about coding; it is about algorithms and their error propagation characteristics. We shall also take a look at some unusual ways humans prove mathematical propositions.
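
A minimal sketch in Python (the coefficients are illustrative choices, not taken from the lecture) of the kind of failure examined: the textbook quadratic formula, coded without any logical flaw, loses a root of x² + 10⁹x + 1 = 0 to catastrophic cancellation, while an algebraically equivalent rearrangement recovers it.

    import math

    # Solve x^2 + b*x + c = 0 with b = 1e9, c = 1.0.
    b, c = 1e9, 1.0
    d = math.sqrt(b * b - 4.0 * c)   # in double precision this rounds to exactly b

    # Textbook formula: -b + d subtracts two nearly equal numbers.
    naive_root = (-b + d) / 2.0      # prints 0.0, but the true root is about -1e-9

    # Remedy: compute the well-conditioned root first, then use x1 * x2 = c.
    x1 = (-b - d) / 2.0
    x2 = c / x1                      # about -1e-9, correct to full precision
    print(naive_root, x2)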

On symmetry

The notion of symmetry plays a central role in theoretical physics. The central theme of this lecture is Emmy Noether's theorem, which states that for every observable symmetry in Nature there is a corresponding quantity that is conserved, and for every conservation law there is a corresponding symmetry. For example, the law of conservation of angular momentum is a consequence of the isotropy of space.
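
The mechanism can be sketched in one line (notation mine, for a single degree of freedom): if the Lagrangian L(q, q̇) is unchanged to first order under q → q + εK(q), then along any solution of the Euler-Lagrange equation the charge Q = (∂L/∂q̇)K is conserved:

    % Noether's theorem, minimal version:
    % the Euler-Lagrange equation replaces d/dt(dL/d(q-dot)) by dL/dq,
    % and invariance of L makes the first-order variation vanish.
    \frac{dQ}{dt}
      = \frac{d}{dt}\!\left(\frac{\partial L}{\partial \dot q}\right) K
        + \frac{\partial L}{\partial \dot q}\,\dot K
      = \frac{\partial L}{\partial q}\,K + \frac{\partial L}{\partial \dot q}\,\dot K
      = \frac{\delta L}{\epsilon} = 0.
    % With L invariant under rotations (isotropy of space),
    % Q is precisely the angular momentum.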

Quantum cryptography and quantum teleportation

The world of quantum mechanics is truly magical. In this lecture we will look at the basic mathematical framework around which quantum mechanics is built, and then look at the amazingly simple solutions to two problems: (i) the safe exchange of keys for encrypted messages, and (ii) the teleportation of matter. In both these solutions, Charles Bennett, a distinguished IBM researcher, played a pioneering role.
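
The key-exchange solution referred to is the Bennett-Brassard protocol of 1984 (BB84). Its sifting step can be mimicked classically in a few lines of Python (a toy sketch with invented variable names, not a simulation of the quantum physics): sender and receiver keep only the bits for which their randomly chosen bases happened to agree.

    import random

    n = 16
    alice_bits  = [random.randint(0, 1) for _ in range(n)]
    alice_bases = [random.choice("+x") for _ in range(n)]   # encoding bases
    bob_bases   = [random.choice("+x") for _ in range(n)]   # measurement bases

    # Measuring in the right basis recovers the bit; in the wrong basis
    # quantum mechanics yields a random outcome.
    bob_bits = [b if ab == bb else random.randint(0, 1)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

    # Sifting: bases (not bits) are compared over a public channel, and
    # only the positions where the bases agreed are kept.
    key_a = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_b = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
    assert key_a == key_b      # a shared secret key of roughly n/2 bits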

Set theory

The boldness with which Georg Cantor looked at the notion of infinity was a defining moment in mathematics. In a very real sense, he is the father of set theory. His proof that the set of real numbers is uncountable, and his proof that the set of points called the Cantor set is also uncountable and as numerous as the points on the real line, are two remarkable examples of ingenious mathematical proofs. Of Cantor's work on set theory, Hilbert was to say, "No one will drive us from the paradise that Cantor has created." There were others who disagreed, Leopold Kronecker among them. The Cantor set plays an important role in non-linear dynamics and is a famous example of a fractal object.
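
The Cantor set itself is easy to probe numerically (a sketch; the digit cutoff is an arbitrary tolerance): a point of [0, 1] survives the repeated deletion of middle thirds exactly when some ternary expansion of it avoids the digit 1.

    def in_cantor_set(x, digits=40):
        """True if x in [0, 1] has a ternary expansion free of the digit 1."""
        for _ in range(digits):
            if x in (0.0, 1.0):     # endpoints 0.000... and 0.222... survive
                return True
            d = int(x * 3)          # next ternary digit of x
            if d == 1:
                return False        # x falls in a deleted middle third
            x = x * 3 - d
        return True                 # no digit 1 found within the tolerance

    print(in_cantor_set(1/4))       # True: 1/4 = 0.020202..._3
    print(in_cantor_set(1/2))       # False: 1/2 = 0.111..._3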

Fractals – Part I: A new geometry

Benoit Mandelbrot, a distinguished IBM researcher, observed, "Clouds are not spheres, mountains are not cones, coastlines are not circles, bark is not smooth, nor does lightning travel in a straight line." Indeed, Mandelbrot saw jaggedness almost everywhere: in the charts of stock market prices, of river water levels, and so on. Remarkably, he also noticed that they displayed self-similarity: not only did they produce detail at finer and finer scales, they produced details with certain constant measurements. Mandelbrot coined the word fractal. The word now stands for a way of describing, calculating, and thinking about shapes that are irregular and fragmented, jagged and broken up: shapes from the crystalline curves of snowflakes to the discontinuous dusts of galaxies. A fractal curve implies an organizing structure that lies hidden among the hideous complication of such shapes. These shapes live in a world of fractional dimensions!
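
Those fractional dimensions are easy to compute for exactly self-similar shapes: a shape built from N copies of itself, each scaled down by a factor s, has similarity dimension log N / log s. A few standard examples (the function name is mine):

    import math

    def similarity_dimension(copies, scale):
        """Dimension of a shape made of `copies` pieces, each `scale` times smaller."""
        return math.log(copies) / math.log(scale)

    print(similarity_dimension(4, 3))   # Koch curve: 4 pieces at 1/3 scale, ~1.262
    print(similarity_dimension(2, 3))   # Cantor set: 2 pieces at 1/3 scale, ~0.631
    print(similarity_dimension(3, 2))   # Sierpinski triangle: ~1.585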

Fractals – Part II: The Mandelbrot set

The creation of the Mandelbrot set is simple. Given zₙ = zₙ₋₁² + c, pick a value for the complex number c and iterate with the seed z₀ = 0. If the iterations diverge, then c is not in the Mandelbrot set; otherwise (even when the orbit is trapped in some repeating loop, or wanders chaotically) it is in the set. Many claim that the Mandelbrot set is the most complex object in mathematics. You can spend a lifetime and more studying "its disks studded with prickly thorns, its spirals and filaments curling outward and around, bearing bulbous molecules that hang, infinitely variegated, like grapes on God's personal vine," as James Gleick wrote in his famous book Chaos. Those who were first to understand the way the set commingles complexity and simplicity were caught unprepared, including Mandelbrot!
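
The recipe translates almost directly into code. A minimal escape-time sketch in Python (the iteration cap is a conventional choice; the escape test |z| > 2 is exact, since such orbits must diverge):

    def in_mandelbrot(c, max_iter=100):
        """Iterate z -> z*z + c from z = 0 and watch for divergence."""
        z = 0j
        for _ in range(max_iter):
            z = z * z + c
            if abs(z) > 2.0:         # once |z| exceeds 2 the orbit escapes
                return False
        return True                   # still bounded: presumed in the set

    print(in_mandelbrot(0j))          # True: the orbit stays at 0
    print(in_mandelbrot(-1 + 0j))     # True: trapped in the loop 0, -1, 0, -1, ...
    print(in_mandelbrot(1 + 0j))      # False: 0, 1, 2, 5, 26, ... diverges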

Information theory – Part I: Information is physical

While computer science was making rapid progress in the 1940s, a parallel revolution was taking place in communication theory. Modern information theory began with Shannon's famous twin papers, A Mathematical Theory of Communication, which appeared in the Bell System Technical Journal in July and October 1948. His seminal contribution was, first, to define the concept of information mathematically, and then to treat the transmission of information as a statistical phenomenon and give communications engineers a method to determine the capacity of a communication channel in terms of the currency of bits. In another seminal paper, titled Irreversibility and Heat Generation in the Computing Process, which appeared in the IBM Journal of Research and Development, Vol. 5, No. 3, 1961, Rolf Landauer, an IBM researcher, showed the intimate relationship between the concept of information and physics. These contributions now form the foundations of information theory.

Information theory – Part II: Shannon entropy

In information theory, entropy quantifies the actual amount of information in a piece of data. Shannon's entropy is closely related to thermodynamic entropy as defined by Boltzmann in statistical thermodynamics. Indeed, Shannon was inspired by Boltzmann's work and adopted the term entropy from there. Similarly, Landauer attempted to apply thermodynamic reasoning to digital computers. The relationship between entropy in the informational and thermodynamic senses is deep, in fact so deep that Charles Bennett, an IBM researcher, was eventually able to solve the famous paradox known as Maxwell's demon in theoretical physics.
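
For a source emitting symbols with probabilities p₁, ..., pₙ, Shannon's entropy is H = −Σ pᵢ log₂ pᵢ bits per symbol. A small computation (the example distributions are of my choosing):

    import math

    def shannon_entropy(probs):
        """H = -sum(p * log2(p)) in bits, skipping zero-probability outcomes."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))    # 1.0 bit: a fair coin toss
    print(shannon_entropy([0.9, 0.1]))    # ~0.469 bits: a biased coin tells us less
    print(shannon_entropy([0.25] * 4))    # 2.0 bits: four equally likely outcomes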

Chaotic dynamical systems – Part I: The logistic map

In the 20th century three revolutionary discoveries took place in science: quantum mechanics (to deal with microscopic-level phenomena, < 10⁻⁸ cm), the theory of relativity (to deal with objects moving close to the speed of light, ~ 10¹⁰ cm/sec), and non-linear dynamics. All three have presented unexpected concepts, insights, and stunning results. However, the third revolution, the development of non-linear dynamics, has been more recent and rather quiet. Yet, unlike quantum mechanics and the theory of relativity, non-linear dynamics covers systems of every scale and any speed, and it encompasses all the existing disciplines in science (both natural and social). There is therefore a clear opportunity to unify a large number of diverse phenomena. In this lecture, we shall study the unusual behavior seen in a very simple system described by the logistic equation. It models, among other things, population growth in a limited environment and light amplification in a laser, and it is one of the simplest examples of a dynamical system that shows chaotic behavior.
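
The map itself is one line: xₙ₊₁ = r·xₙ·(1 − xₙ). A short Python sketch (parameter and seeds are illustrative) exhibits the hallmark of chaos at r = 4, sensitive dependence on initial conditions: two orbits starting 10⁻¹⁰ apart disagree completely within about 40 steps.

    def logistic_orbit(r, x0, steps):
        """Iterate the logistic map x -> r * x * (1 - x)."""
        xs = [x0]
        for _ in range(steps):
            xs.append(r * xs[-1] * (1 - xs[-1]))
        return xs

    a = logistic_orbit(4.0, 0.3, 50)
    b = logistic_orbit(4.0, 0.3 + 1e-10, 50)
    for n in (10, 30, 50):
        print(n, a[n], b[n])   # close at n=10, drifting at n=30, unrelated at n=50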

Chaotic dynamical systems – Part II: Chaos in higher dimensions

We shall study two cases: (a) in two dimensions, the behavior of the nonlinear, damped, driven pendulum, and (b) in three dimensions, the behavior of an idealized weather system (the famous Lorenz equations). We shall also see that the Lorenz equations describe not only an idealized weather system but also the dynamics of a waterwheel!
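
The Lorenz system is three coupled ordinary differential equations; a crude forward-Euler sketch in Python (classic parameter values σ = 10, ρ = 28, β = 8/3; the step size is an arbitrary choice) is enough to trace an orbit onto the butterfly-shaped attractor.

    def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        """One Euler step of dx/dt = sigma(y - x), dy/dt = x(rho - z) - y,
        dz/dt = x*y - beta*z."""
        x, y, z = state
        return (x + dt * sigma * (y - x),
                y + dt * (x * (rho - z) - y),
                z + dt * (x * y - beta * z))

    state = (1.0, 1.0, 1.0)
    for _ in range(10000):     # the orbit settles onto the strange attractor
        state = lorenz_step(state)
    print(state)               # bounded, but never repeating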

Chaotic dynamical systems – Part III: Quantum chaos

Quantum mechanics does not admit chaotic behavior in the classical sense. However, it turns out that there are features of a quantum system that correspond to chaos in a classical system. This correspondence is revealed in an intriguing fashion when one asks: "Do quantum versions of classically chaotic systems behave differently from quantum versions of classically regular systems?"

Some tantalizing possibilities have emerged. For example, it has been found that the eigenvalues of a chaotic quantum system (the quantum version of a classically chaotic system) have different statistical properties than do the eigenvalues of a regular quantum system (the quantum version of a regular system). Another interesting development is that the distribution of eigenvalues in a chaotic quantum system can be determined using information about periodic orbits of the classical system!
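
One can get a taste of those statistics with random matrices, a standard surrogate for chaotic quantum systems (this sketch uses crude unfolding and is purely illustrative): nearest-neighbour level spacings of a random symmetric (GOE) matrix repel each other, while independent random levels, mimicking a regular system, show no such repulsion.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 2000

    # "Chaotic" surrogate: eigenvalues of a random real symmetric (GOE) matrix.
    a = rng.normal(size=(n, n))
    chaotic = np.sort(np.linalg.eigvalsh((a + a.T) / 2))

    # "Regular" surrogate: independent uniform levels (Poisson statistics).
    regular = np.sort(rng.uniform(0.0, 1.0, n))

    for name, levels in (("chaotic", chaotic), ("regular", regular)):
        s = np.diff(levels)
        s = s / s.mean()                  # normalize to unit mean spacing
        print(name, (s < 0.1).mean())     # fraction of very small gaps
    # Level repulsion: tiny spacings are rare in the chaotic case but
    # make up roughly 10% of the regular case.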

Formal systems – Part I: Meaning and form in mathematics

Our ability to reason has often been claimed to be what distinguishes us from other species, so it seems somewhat paradoxical to mechanize that which is most human. Yet even the ancient Greeks knew that reasoning is a patterned process, at least partially governed by statable laws. Aristotle codified syllogisms, and Euclid codified geometry; but thereafter, many centuries had to pass before further progress was made in the study of axiomatic reasoning. In this lecture we shall try to understand how we reason using formal axiomatic systems, and how we struggle to find meaning when confronted with an unknown formal system.

Formal systems – Part II: Consistency and completeness

The human mind instinctively seeks consistency and completeness in axiomatic systems. We shall see why achieving these properties is so difficult. In particular, we seek answers to the following two questions: (1) How is meaning mediated in formal systems by isomorphism? (2) What is the relationship between meaning and form?

Formal systems – Part III: (a) Propositional calculus, (b) arithmetization of axiomatic systems

(a) Propositional calculus gives us a set of rules for producing statements that would be true in all conceivable worlds. It does this by specifying the form of statements that are universally true: no matter how we complete the interpretation, the final result will be a true statement.
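
"True in all conceivable worlds" means true under every assignment of truth values, which a machine can check by brute force. A small Python sketch (the formulas are illustrative):

    from itertools import product

    def is_tautology(formula, num_vars):
        """Evaluate a propositional formula under every truth assignment."""
        return all(formula(*values)
                   for values in product([False, True], repeat=num_vars))

    # (P and (P implies Q)) implies Q -- modus ponens as a single formula.
    print(is_tautology(lambda p, q: not (p and (not p or q)) or q, 2))  # True
    # P implies Q by itself is not universally true.
    print(is_tautology(lambda p, q: not p or q, 2))                     # False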

(b) The enormous power of arithmetic is displayed through examples by showing that axiomatic systems can be converted into arithmetical systems and hence into computer programs.
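
The conversion in (b) runs through Gödel numbering: give each symbol a numeric code and pack a formula into a single integer as a product of prime powers, so that statements about formulas become statements about numbers. A toy sketch (the symbol table is my own invention):

    PRIMES = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
    CODES = {"0": 1, "S": 2, "=": 3, "+": 4, "(": 5, ")": 6}   # illustrative codes

    def godel_number(formula):
        """Encode a symbol string as 2^c1 * 3^c2 * 5^c3 * ... ."""
        n = 1
        for p, ch in zip(PRIMES, formula):
            n *= p ** CODES[ch]
        return n

    # "S0+S0=SS0" (i.e. 1 + 1 = 2) becomes one integer; by unique
    # factorization the formula can be recovered from the number.
    print(godel_number("S0+S0=SS0"))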

Formal systems – Part IV: Limits to axiomatic reasoning

In 1900, David Hilbert addressed the International Congress of Mathematicians and posed a series of problems for mathematicians to investigate. One of them was truly breathtaking in its ambition: to prove the truth or falsity of all propositions in mathematics using, say, the axioms that formed the basis of Principia Mathematica by Bertrand Russell and Alfred North Whitehead (1910-1913), or some other formal system of logic. If this could be done, its consequences would be truly enormous. It would, in principle, allow the possibility of relegating proof and disproof to a formal procedure that could be carried out by a computer! Mathematicians would then be free to make conjectures, and computers could do the slave work of proving or disproving those conjectures. In this lecture we shall see whether such mechanization of proofs is possible.

The central mystery of quantum mechanics

The two-slit experiment has played a crucial role in both classical physics and quantum mechanics. In classical physics the experiment was used to establish that light has wave behavior; in quantum physics it was used to establish that even quantum particles such as electrons have wave behavior. Of the two-slit experiment, Richard Feynman said that it is

"a phenomenon which is impossible, absolutely impossible, to explain in any classical way, and which has in it the heart of quantum mechanics. In reality, it contains the only mystery. We cannot make the mystery go away by "explaining" how it works. We will just tell you how it works. In telling you how it works we will have told you about the basic peculiarities of all quantum mechanics."

The mysterious world of quantum computing

Even to physicists, the world of quantum mechanics is immensely mysterious. No one really understands it, and yet its predictions are the most accurate in the whole of physics. It is now possible to build quantum computers based on the laws of quantum mechanics and use their amazing power to run computations in parallel. Quantum computers derive their enormous computing power from two unique quantum phenomena called superposition and entanglement. This lecture describes the laws of quantum mechanics, a new interpretation of those laws, the quantum bit (called the qubit, the counterpart of the classical computer bit), and the role superposition and entanglement play in manipulating the state of qubits. Finally, a few simple quantum algorithms are described.
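
In simulation, qubit states are just complex vectors and gates are unitary matrices, so both phenomena can be sketched in a few lines of numpy (a classical simulation, of course, not a quantum computer):

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard: creates superposition
    CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                     [0, 0, 0, 1], [0, 0, 1, 0]])      # flips qubit 2 when qubit 1 is 1

    ket00 = np.array([1, 0, 0, 0], dtype=complex)      # the state |00>

    # Hadamard on qubit 1, then CNOT, yields the Bell state (|00> + |11>)/sqrt(2).
    bell = CNOT @ np.kron(H, np.eye(2)) @ ket00
    print(bell.real)   # [0.707 0 0 0.707]: the qubits are entangled -- measuring
                       # one instantly fixes the outcome of measuring the other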

Quantum computing – Part I

The axioms of quantum mechanics permit some unusual algorithms to be developed for solving standard mathematical problems. We shall look at some basic algorithms that provide the foundation for more complex algorithms in the later lectures. These include random number generation, computing xy and x+y, the Deutsch-Jozsa algorithm, and computing f(x) in parallel for a set of x values.
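
As a taste, here is a numpy simulation of the simplest (one-bit) case of the Deutsch-Jozsa algorithm, in one common textbook formulation: a single oracle call decides whether f: {0,1} → {0,1} is constant or balanced, where any classical method needs two evaluations.

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    H2 = np.kron(H, H)

    def oracle(f):
        """U_f |x, y> = |x, y XOR f(x)> as a 4x4 permutation matrix."""
        u = np.zeros((4, 4))
        for x in (0, 1):
            for y in (0, 1):
                u[2 * x + (y ^ f(x)), 2 * x + y] = 1
        return u

    def deutsch(f):
        state = np.kron([1, 0], [0, 1])          # start in |0>|1>
        state = H2 @ state                        # superpose both qubits
        state = oracle(f) @ state                 # one oracle call (phase kickback)
        state = np.kron(H, np.eye(2)) @ state     # interfere on the first qubit
        p0 = state[0] ** 2 + state[1] ** 2        # P(first qubit measures 0)
        return "constant" if p0 > 0.5 else "balanced"

    print(deutsch(lambda x: 0))    # constant
    print(deutsch(lambda x: x))    # balanced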

Quantum computing – Part II

The quantum Fourier transform (QFT) will be discussed. In particular, we shall describe a particular decomposition of it into a product of simpler unitary matrices and how it can be implemented using Hadamard and phase shift gates. We shall also look at some applications of QFT, such as in phase estimation, and order finding algorithms.
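
Concretely, the QFT on n qubits is the N × N unitary with entries ω^(jk)/√N, where ω = e^(2πi/N) and N = 2ⁿ. A quick numpy check (sign and normalization conventions vary between texts) confirms unitarity and its kinship with the classical discrete Fourier transform:

    import numpy as np

    def qft_matrix(n_qubits):
        """F[j, k] = omega**(j*k) / sqrt(N), with omega = exp(2*pi*i/N)."""
        N = 2 ** n_qubits
        j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
        return np.exp(2j * np.pi * j * k / N) / np.sqrt(N)

    F = qft_matrix(3)
    print(np.allclose(F.conj().T @ F, np.eye(8)))    # True: F is unitary

    state = np.zeros(8); state[5] = 1.0              # the basis state |5>
    # Up to scaling, applying F matches numpy's inverse FFT.
    print(np.allclose(F @ state, np.fft.ifft(state) * np.sqrt(8)))   # True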

Quantum computing – Part III

In 1994, Peter Shor surprised the world by describing a polynomial-time quantum algorithm for factoring integers. This became a killer application. The difficulty of factorization underpins the security of many common methods of encryption; for example RSA, the most popular public-key cryptosystem, which is often used to protect electronic bank accounts, derives its security from the difficulty of factoring large numbers. The potential use of quantum computation for code-breaking purposes then provided the much-needed impetus for building quantum computers.

On December 19, 2001, IBM announced that it had built a quantum computer based on seven atoms which, because of their physical properties, are able to work together as both the computer's processor and its memory, and that it had used the computer to show that Shor's algorithm works, correctly identifying 3 and 5 as the factors of 15.
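
The quantum speed-up lives entirely in one subroutine, order finding; the rest of Shor's algorithm is classical number theory. Replacing that subroutine with brute force (the base a = 7 is an illustrative choice) factors 15 exactly the way the algorithm does:

    from math import gcd

    def order(a, n):
        """Smallest r > 0 with a**r = 1 (mod n): the step a quantum computer speeds up."""
        r, x = 1, a % n
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def shor_classical(n, a):
        if gcd(a, n) > 1:
            return gcd(a, n)          # lucky guess: a shares a factor with n
        r = order(a, n)               # the order of 7 mod 15 is 4
        if r % 2 == 1:
            return None               # odd order: retry with another a
        y = pow(a, r // 2, n)         # 7^2 mod 15 = 4
        if y == n - 1:
            return None               # trivial square root: retry
        return gcd(y - 1, n)          # gcd(3, 15) = 3, a nontrivial factor

    f = shor_classical(15, 7)
    print(f, 15 // f)                 # 3 5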

Quantum computing – Part IV

In 1996, Lov Grover described an algorithm for finding a given item in an unsorted list of n items. Classical methods require n/2 searches on average; Grover's quantum algorithm requires O(√n) steps. Moreover, Grover's algorithm is provably the fastest among all possible quantum mechanical algorithms for this problem.

In 1997, Isaac L. Chuang (IBM, Almaden), Neil A. Gershenfeld (MIT, Cambridge), and Mark G. Kubinec (Univ. of California, Berkeley) actually built at IBM a simple two-qubit NMR quantum computer using liquid chloroform, and successfully ran Grover's algorithm on it.
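
A numpy simulation of Grover's search over N = 16 items (the marked index is an arbitrary choice) shows how roughly (π/4)√N ≈ 3 iterations concentrate almost all amplitude on the target:

    import numpy as np

    N, target = 16, 11
    state = np.ones(N) / np.sqrt(N)        # uniform superposition over all items

    for _ in range(3):                      # about (pi/4) * sqrt(N) iterations
        state[target] *= -1                 # oracle: flip the marked item's phase
        state = 2 * state.mean() - state    # diffusion: inversion about the mean

    print(np.argmax(state ** 2))            # 11, the marked item
    print((state ** 2)[target])             # ~0.96 probability of success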

Quantum computing – Part V

Quantum computers are difficult to build because of their need to be isolated from the environment to avoid decoherence. Currently available computing times before decoherence sets in are very short. However, if error-correcting codes are available, then computing times can be extended. The discovery of quantum error-correcting codes is a major advance, since for quite some time it was thought that error correction would not be possible because of the restrictions imposed by the no-cloning theorem. Quantum error-correcting codes work by encoding quantum states in a special way and then decoding when it is required to recover the original state.
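
The flavor of the simplest such code, the three-qubit bit-flip code, can be conveyed with a classical caricature (a real quantum decoder measures error syndromes without ever reading the data, which is how it sidesteps no-cloning; the error rate below is an illustrative choice): spread one logical bit over three physical bits and take a majority vote.

    import random

    def encode(bit):
        return [bit, bit, bit]              # logical 0 -> 000, logical 1 -> 111

    def noisy_channel(code, p_flip=0.1):
        return [b ^ (random.random() < p_flip) for b in code]

    def decode(code):
        return int(sum(code) >= 2)          # majority vote corrects any single flip

    trials = 100_000
    errors = sum(decode(noisy_channel(encode(1))) != 1 for _ in range(trials))
    print(errors / trials)    # ~0.028 = 3p^2 - 2p^3, down from the raw p = 0.1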

Fundamental limits in computing – Part I: Computability

Is there a limit to what we can, in principle, compute? We answer this question in terms of Gödel's incompleteness theorems, and Turing's halting theorem. The discussion revolves around Hilbert's second and tenth problems, recursive sets, recursive processes, recursive functions, and Turing machines.
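
Turing's halting argument fits in a few lines of Python (a sketch of the contradiction, not a practical test): any claimed halting oracle can be fed a program built from the oracle itself, and its verdict on that program is wrong either way.

    def adversary(halts):
        """Build a program g that does the opposite of whatever `halts` predicts."""
        def g():
            if halts(g):          # the oracle says g halts...
                while True:       # ...so g loops forever;
                    pass
            return                # the oracle says g loops, so g halts instead.
        return g

    g = adversary(lambda prog: True)    # this oracle claims everything halts,
    print("but calling g() would loop forever")
    g = adversary(lambda prog: False)   # this oracle claims nothing halts,
    g()                                 # yet g returns immediately
    print("and g() just halted")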

Fundamental limits in computing – Part II: Thermodynamic considerations

We discuss the remarkable connection between information and thermodynamic entropy first noted by Rolf Landauer of IBM and succinctly captured by the aphorism "Information is physical". The connection was later used by Charles Bennett, also of IBM, to finally resolve the famous Maxwell's demon paradox.
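
Landauer's principle puts a number on the connection: erasing one bit of information must dissipate at least kT ln 2 of energy as heat. At room temperature the floor is tiny but real (a direct computation):

    import math

    k_B = 1.380649e-23     # Boltzmann constant, J/K
    T = 300.0              # room temperature, K

    bound = k_B * T * math.log(2)
    print(bound)           # ~2.87e-21 J: the minimum cost of erasing one bit

    # Erasing a gigabyte (8e9 bits) costs at least ~2.3e-11 J -- far below what
    # present hardware dissipates, but a genuine physical limit.
    print(8e9 * bound)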

Fundamental limits in computing – Part III: Complexity

The computational complexity of a problem is determined by the amount of computational resources (time and memory space) required to complete a given computational task. Unfortunately, complexity theory provides strong evidence that many optimization problems are likely to be intractable and have no efficient algorithm. That is, each such problem is effectively impossible to solve, not because we cannot find an algorithm to solve the problem, but because all known algorithms consume such vast amounts of computer memory space or computing time as to render them practically useless. It is likely that questions about intractability and quantum computation may help to shed light on the fundamental properties of matter.
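
A feel for intractability (the instance below is a toy of my choosing): exhaustive search for subset sum examines up to 2ⁿ subsets, so the same correct program that answers instantly for n = 6 would run for decades at n = 60.

    from itertools import combinations

    def subset_sum(weights, target):
        """Exhaustive search: up to 2^n candidate subsets."""
        for r in range(len(weights) + 1):
            for combo in combinations(weights, r):
                if sum(combo) == target:
                    return combo
        return None

    print(subset_sum([3, 34, 4, 12, 5, 2], 9))   # (4, 5): instant for n = 6
    # For n = 60 there are 2^60 > 1e18 subsets; at a billion subsets per
    # second that is over 36 years. The algorithm is correct but useless.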

Why does software development go awry?

Software development has at least two sources of complexity: complexity inherent in the problem domain, over which the software developer has no control, and self-inflicted complexity arising from the activities of the software development team. It is the second that is of greater concern, because it is so widespread; its origin is the absence of a sophisticated symbolic system for the development of software products. Many quality-related problems in the software industry are directly related to this fact. Since the richness of a symbolic system is closely tied to the conceptual levels at which a professional can work, it is not surprising that the opportunity to work with current programming languages seldom attracts intellectual giants. Consequently, in no other responsible profession or industry does one find a 20:1 variation in productivity. Such a variation would be unacceptable elsewhere.

Chemical bonds

The smallest unit of a chemical element is the atom. Atoms can react chemically with other atoms to form molecules, and molecules have properties distinct from those of their constituent atoms. An atom possesses a central nucleus, a tight mass of zero or more electrically neutral neutrons and one or more positively charged protons, surrounded by a cloud of orbiting, negatively charged electrons whose number matches that of the protons. The protons determine the identity of the element; the electrons determine the interactions (chemical bonds) between atoms. We shall look at chemical bonds from a quantum mechanical point of view.

Organic molecules

Carbon compounds form the physical framework for all biological molecules. The ability of carbon atoms to form large, complex polymers plays an important role in the generation of diverse biological structures. Organic molecules contain carbon and hydrogen; in addition, they may contain other elements.

Molecular biology – Part I: The double helix and beyond the double helix

The year 1953 marks the beginning of a new era in biology. On pages 737-738 of the April 25, 1953 issue of Nature there appeared a short paper titled Molecular Structure of Nucleic Acids by J. D. Watson and F. H. C. Crick. Watson was a postdoc and Crick a PhD student at the Cavendish. In this paper they made the now famous understated remark, "It has not escaped our notice that the specific pairing we have postulated immediately suggests a possible copying mechanism for the genetic material." In 1962, Crick and Watson shared the Nobel Prize for Physiology or Medicine with M. H. F. Wilkins "for their discoveries concerning the molecular structure of nucleic acids and its significance for information transfer in living material". It was the potential for explaining the biological function of DNA, rather than any compelling structural evidence, that led to the widespread acceptance of the Watson-Crick model; the DNA structure was not rigorously determined by X-ray crystallography until the late 1970s.

Molecular biology – Part II: Genes and proteins

The discovery of genes and the genetic code, and how they are related to the remarkable process of protein synthesis will be discussed. Computer scientists and information theorists can learn much from it. Robert W. Holley, Har Gobind Khorana, and Marshall W. Nirenberg received the Nobel Prize for Physiology or Medicine, 1968 "for their interpretation of the genetic code and its function in protein synthesis".

Molecular biology – Part III: Energy transfer

The temporary storage and transfer of energy in the cell depends on several so-called energy carrier molecules, of which the most important is adenosine triphosphate (ATP). These specialized molecules transfer the chemical energy in their covalent bonds. In the case of ATP, energy is stored in the form of covalent bonds between phosphate groups; hence such bonds are often called energy-rich bonds. In this lecture we shall discuss the role of ATP and how it is synthesized in the cell.

Molecular biology – Part IV: Metabolism

Metabolism involves both the assimilation and the degradation of molecules. Thus an important function of metabolism is to transform food molecules into the molecules that the organism needs. Since the processes of taking molecules apart (catabolism) and reassembling the pieces (anabolism) require energy, the other major functions of metabolism are the extraction of energy stored in chemical compounds and the conversion of that energy into useful forms. The central issue therefore is the transformation of energy from one form to another, and the efficiency of the processes that convert energy into work.

Molecular biology – Part V: The immune response system

Immunity is a state of heightened resistance or accelerated reactivity toward micro-organisms, transplants, or any other "non-self" substances that gain access to the body. Among the system's remarkable characteristics is its ability to distinguish between "self" and "non-self", and remember previous experiences (such as infections) and react accordingly. The system displays both enormous diversity and extraordinary specificity; not only is it able to recognize many millions of distinctive non-self molecules, it can produce molecules and cells to match up with and counteract each one of them. And it has at its command an impressive array of weapons. Its success comes from an incredibly elaborate and dynamic regulatory-communications network. Computer scientists can learn much from the immune system and apply the lessons learnt when designing self-healing computing systems.

Knowledge, reasoning, explanation

Many scientists are ambivalent about the question "Can machines think?" Also, there is a widespread belief (without an iota of proof) that animals cannot think. So here is a poser: "If genetic engineering succeeds in cloning humans or in creating transgenic humans with the ability to speak, can we still claim that machines cannot think?"

We now recognize that all our knowledge begins with beliefs. That is, we constrain the development of knowledge so that it does not violate any of our held beliefs. The belief system is granted immunity from questioning, or held to be self-evident and therefore requiring no further explanation. The demolition or alteration of a widely held belief system is an epochal event in human history. Each such epochal event in the history of science has left philosophers bewildered as to what they understood by objective truth.

The world is ruled by ideas

"In the field of observation, chance favors the prepared mind." - Louis Pasteur

To find science-based, innovative solutions to problems, one often requires insight into the laws of nature and proficiency in the art of axiomatic thinking (that is, mathematics). Creating a scientific culture in an organization is therefore a way of preparing the mind. Invention mining and identifying potential inventors must become ongoing activities in an innovative organization. There are tremendous opportunities for innovation in healthcare, life sciences, government, education, transportation and manufacturing as technology becomes infused into the business of society.


¹ One may infer its advent to coincide with the elimination of armies of telephone operators due to the introduction of satellite, cellular, and fiber-optic networks and of automatic telephone exchanges.