Invited talks

Video links

Lecture Series on Bosonization of chiral fermions in 1D

Lecture slides

Proofs by Prof. Haldane

Research Interests

I work in the field of Theoretical Condensed Matter Physics. I am interested in understanding and accounting for the properties of everyday bulk materials from a knowledge of the fundamental constituents of the substance and the fundamental physical laws governing those constituents. I am the inventor of a new technique called 'non-chiral bosonization', which is uniquely suited to studying strongly inhomogeneous Luttinger liquids. I also invented the notion of a non-local particle-hole creation operator and showed that it may be used to diagonalize interacting Fermi systems in any dimension. I am also interested in topological materials, specifically their nonlinear optical properties.
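For context, the standard Abelian bosonization identity for a right-moving chiral fermion in one dimension (textbook material underlying these techniques, not the non-chiral variant itself; conventions for signs, Klein factors and cutoffs vary between texts) reads:

```latex
\psi_{R}(x) \;\sim\; \frac{F_{R}}{\sqrt{2\pi\alpha}}\; e^{i k_{F} x}\; e^{i\phi_{R}(x)},
\qquad
\rho_{R}(x) \;=\; \frac{1}{2\pi}\,\partial_{x}\phi_{R}(x),
```

where $F_{R}$ is a Klein factor and $\alpha$ a short-distance cutoff. Since density-density interactions are quadratic in $\rho_{R}$, they become quadratic in the boson field $\phi_{R}$, which is what makes Luttinger-liquid problems tractable in this language.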

About my research: Taking the bull by the horns

I am interested in understanding and accounting for the properties of everyday bulk materials from a knowledge of the fundamental constituents of the substance and the fundamental physical laws governing those constituents. This subject is known as theoretical condensed matter physics. But the way in which it is practiced differs substantially from researcher to researcher. The main difficulty in this subject comes from the ambiguity associated with identifying the fundamental constituents of a material. Should they be molecules, atoms, electrons and protons, or even quarks and leptons? The vast majority of practicing condensed matter theorists do not hold a rigid view on exactly where one should draw the line and what should be regarded as the fundamental or elementary constituent. Indeed, the prevailing view is that each phenomenon comes with typical length and energy scales (which are experimentally determined) that tell us whether to regard molecules, atoms, or even nuclei and electrons as fundamental. In fact, in many situations only some aspects of an atom, such as its spin, may be important, and its other attributes may be ignored entirely. Generations upon generations of theorists have trained themselves to think along these lines, and as a result there are today innumerable unrelated theories that purport to account for various specific properties of the same material. These theories assume that only a few degrees of freedom are relevant to the phenomenon being studied, and these differ from phenomenon to phenomenon even though all the phenomena occur in the same material. Such theories may appear economical in their use of concepts, since for a given phenomenon only a small number of carefully chosen degrees of freedom are invoked. The price one pays for this is twofold.
One is an overreliance on experiments to tell us exactly which degrees of freedom are relevant and what length and energy scales are involved. The second drawback is a proliferation of unrelated theories, each purporting to describe a tiny aspect of the system, so that one loses the overall picture (think of the blind men and the elephant). There are further drawbacks to the traditional approach. Since the theories are phenomenon-specific, the parameters that enter them have to be fitted to experiment. Thus the predictive power of these theories is limited to describing trends and qualitative behavior (decreases/increases, power laws, etc.) rather than actual numbers. Many physicists regard these attributes as more valuable and robust than actual numbers.

My view, which is probably shared by no one, is that it is about time the traditional approach was given a rest. My fresh approach involves taking the bull by the horns, to use a metaphor. For the purposes of condensed matter physics, I regard nuclei and electrons as point charges. The claim is that since all condensed matter phenomena arise from electromagnetic forces, instead of repeatedly appealing to experiment for input we should simply work out the properties of neutral matter made of positive and negative charges using powerful mathematical techniques. One such technique is known as higher dimensional bosonization. I am involved in developing such tools, which will hopefully avoid altogether the need to calibrate models against experiments. Indeed, there will no longer be `models' but just one model. Such an approach is usually dismissed as unworkable by most, who may even regard it as the sign of an immature thinker. The usual rebuttal to my ideas (from those who are charitable enough to speak to me at all) is to react with remarks such as `Surely core electrons do not participate in the low energy physics that matters in condensed matter; aren't you being naive by including them?' or `The free electron model of a metal is so successful, why bother considering electron-electron interactions?' and many more. I can respond to the first objection by pointing out that the term `core electron' betrays a bias toward the mental picture of independent electrons in an isolated atom. Richard Burton observes eloquently in `Where Eagles Dare' that a ``hole is a hole is a hole''. I couldn't have said it better myself. In other words, an electron is an electron, indistinguishable from the others. As for the second objection, the free particle picture works not by some miracle but through the phenomenon of screening, which is a rather subtle many-body effect.
Thus, to strictly justify the free electron picture one has to first invoke the concept of screening and only then rely on the simplified model. One cannot simply say `let's try some really simple model and if it works we are through'. If an over-simplified model accounts for all the details then that is, in my view, a cause for concern rather than celebration, since we now have a bigger mystery on our hands: we have to explain why a model that we know overlooks many effects that should otherwise be present is nevertheless so successful.
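To illustrate the point with a textbook Thomas-Fermi estimate (standard material, in SI units, not a result specific to my program): linear screening in a metal replaces the bare Coulomb potential of a static test charge by an exponentially damped one,

```latex
V(r) \;=\; \frac{e^{2}}{4\pi\varepsilon_{0}\, r}\; e^{-k_{TF}\, r},
\qquad
k_{TF}^{2} \;=\; \frac{e^{2}\, g(E_{F})}{\varepsilon_{0}},
```

where $g(E_F)$ is the density of states per unit volume at the Fermi energy. Only after this many-body effect has done its work does a picture of nearly free quasiparticles become defensible.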

Therefore my view is that condensed matter theorists should step back, take a deep breath, and ask themselves what it means to understand something. Understanding has always meant, since the time of the ancient Greeks, accounting for the behavior of big things in terms of the little things they are made of. One is not entitled to change the definition of understanding itself simply because the going has got tough. Before the reader exits this webpage thinking that all this is empty bragging, I have a few published papers to support my claims.


  1. The paper that started it all. [Phys. Rev. B 57, 15144 (1998)]
  2. The paper that fixes the mathematical difficulties in (1). [Phys. Rev. B 65, 165111 (2002)]
  3. The paper that finally gets the physics right too! [Pramana 62, 115 (2004)]
  4. The paper that shows how gauge fields may be naturally incorporated in the bosonization language. [Pramana 62, 101 (2004)]
  5. The work that generalizes (4) to fermions that has appeared in print. [Pramana 66, 575 (2006)]
  6. It appears that the mathematical difficulties in (1) were not fixed by (2) even though later papers circumvented these difficulties and got the physics right. These difficulties have partially been fixed in my latest effort. [see this link]
  7. The Green function of a Luttinger liquid with a single impurity (Kane Fisher problem) has been reduced to a closed form. [APS March Meeting 2012 Boston Talk]
  8. Finally applying my theory called ‘Non chiral bosonization’ to study a one-step fermionic ladder [Physica E 94, 216 (2017)] and slowly moving impurities in a Luttinger liquid. [EPL 123, 27002]




