Grand Unified Theory

The Grand Unified Theory is a vision of a physics theory that combines three of the four fundamental forces into a single framework. The four forces are the strong nuclear force, the weak nuclear force, the electromagnetic force, and the gravitational force. The electromagnetic and weak forces were initially thought to be two separate forces until physicists found a single theory (the electroweak theory) to explain both of them, and then went on to observe this unified force in action with the discovery of the W and Z bosons in 1983 (much as Maxwell unified the electric and magnetic forces into the electromagnetic force).

Apparently M-Theory is making some progress towards a TOE (Theory Of Everything), but I don't know much about that because I don't really know what M-Theory is.

A unified field theory may lie beyond our current means of scientifically understanding and describing the universe. A fundamentally new approach and outlook on the universe may be required for an understanding of the link between the four fundamental forces in quantum theory. This is where the theory formerly known as Super Strings (now M-Theory) comes in.

One of the most dramatic confirmations of quantum theory is the Aspect experiment, named after the French quantum physicist Alain Aspect. In 1982, he and his research team successfully carried out a test that had been long in the making, starting with a thought experiment suggested by Einstein. Greatly simplified: Aspect and his colleagues created pairs of photons from the same quantum event and observed them as they sped off in opposite directions. After the photons had traveled some distance at the speed of light, the researchers measured the polarization of each one, switching the orientation of the analyzers while the photons were still in flight. (Polarization is the orientation of the wave that corresponds to each photon.) The results for the two photons turned out to be correlated more strongly than any local theory allows, even though they were far apart. Relativity theory tells us that no information can travel faster than light. What kind of interaction between the two photons could account for this?

Measured the polarization, not changed it; there is a difference. Each photon's individual result is random, so the correlation cannot be used to send a signal.
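A minimal worked sketch of the formalism, using the textbook anti-correlated (singlet-type) polarization state for concreteness (an idealization; Aspect's calcium-cascade source actually produced the correlated analogue, but the logic is identical):

  |\psi\rangle = \frac{1}{\sqrt{2}} \left( |H\rangle_1 |V\rangle_2 - |V\rangle_1 |H\rangle_2 \right)

  P(\text{both photons pass analyzers at angles } a, b) = \frac{1}{2} \sin^2(a - b)

Each side alone sees a 50/50 outcome regardless of the other side's analyzer setting, which is why no usable signal crosses between the photons; but no pre-assigned local polarizations can reproduce the sin^2 dependence for all angle pairs, which is the content of Bell's theorem.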

The best explanation of this phenomenon is given by Everett's Many-Worlds theory of quantum mechanics.

[This was a good philosophical interpretation (not a "theory of quantum mechanics") prior to about 1970. It is now largely obsolete, having been replaced with an actual theory (see quantum entanglement).] -- You can interpret entanglement according to many-worlds or Copenhagen as easily as you could original QM. Mathematics was never the problem here.

Reference: Quantum Religion (http://home.sprynet.com/~jowolf/essay.htm). This is not serious metaphysics, not unless you badly abuse the word. At best, it's a few trivial facts dressed up as profound insights.

Alain Aspect, Atom Optics group, Institut d'Optique. http://atomoptic.iota.u-psud.fr/people/aa.html


The Grand Unified Theory is analogous to the rules of chess: always remember that merely knowing the rules is not enough to make you an expert player. Chemistry, biology, psychology, and politics are emergent phenomena (see EmergentBehaviors); solving physics won't solve chemistry, let alone the rest. And of course, physics is far from solved.

The theory formerly known as superstrings (M-theory) aims to unify all four forces, is hoped to be background-independent, and its proponents hold that the math all works. Previously there were five competing superstring theories; they have since been understood as different limits of a single theory, together with eleven-dimensional supergravity. The problem is that, although the math works and top theorists agree it is elegant and probably true, the strings are too small to be directly observable. As a result, some people question whether it is a valid theory, though most believe M-theory will indeed be tested by astronomical observations of the early universe. In any case, superstrings is the first candidate for a complete grand theory.

Last I heard, M-theory was still descriptive rather than predictive in general. This is a much stronger issue than whether the strings are too small to test. It isn't even close to parameter-free.

What parameters are you talking about? So far as I know, there's only one: alpha-prime, the Regge slope, which sets the string tension.

Besides, a scientific theory doesn't need to predict a single goddamn thing to be a valid physical theory. It only needs to explain, to "describe" as you so disparage it. The requirement that a theory explain something you've never seen (predict) is only a practical measure to keep scientists honest, to keep them from tailoring an arbitrary theory to fit their observations. This is not necessary in the case of superstrings, since they cannot be tailored even in principle, and even if they could have been, the math is much too complicated to ever do so in practice. Come on, people, this is basic philosophy of science stuff!

Example: do singularities actually exist, naked or not? Is something like a GravaStar the only possible physical embodiment of what was previously considered to be a BlackHole? No and yes, respectively.

Example: is time travel, and therefore FTL, truly impossible? If not, does it in fact unravel causality?

Example: why is there asymmetry under CP (charge-parity), even though the combined CPT (charge-parity-time) symmetry appears to be exact?

Example: why is there a vast preponderance of matter over antimatter? Not a question of fundamental physics; this is addressed at the second level by big-bang cosmology (baryogenesis), not something M-theory should address directly.

Example: what is causality?

In increasing order of difficulty. These are open questions, and M-theory cannot address them. -- So you say. I find that claim to be incredibly stupid.


Maybe somebody can clarify or point to some web resources. I understand that physicists are going out on a limb to find a "unifying theory". I graduated in mathematics, but I've long been exposed only to the discrete domain of computer science, so I am either puzzled or biased. I understand that they want to find a "nice" set of equations, probably something as nice-looking as Maxwell's equations of the electromagnetic field, that will "explain everything". But isn't this some kind of metaphysical belief, that a theory of everything will necessarily look nice? Or what exactly are the criteria for a unifying theory of everything?

In other words, in computer science we have no problem looking at a function like f x = if isGravity x then gravityCase x else quantumCase x, just like any other function, without any big expectation of finding something fundamental and unifying at the bottom of it. Why is there a belief among physicists that a theory that reads like "if gravityCase then gravityTheory else quantumTheory" is not a theory of everything? (A runnable sketch of this piecewise view follows below.)
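To make the question concrete, here is that piecewise construct in runnable form. This is a toy sketch: the helper names and the length-scale cutoff are invented for illustration and are not physics.

 -- A toy version of the "piecewise" theory from the question above.
 -- All names and the 1e-10 m cutoff are hypothetical, chosen only
 -- to make the dispatch structure explicit.

 -- Classify a phenomenon by its length scale in metres.
 isGravity :: Double -> Bool
 isGravity scale = scale > 1e-10   -- large scales: gravity dominates

 gravityCase :: Double -> String
 gravityCase s = "apply general relativity at " ++ show s ++ " m"

 quantumCase :: Double -> String
 quantumCase s = "apply quantum field theory at " ++ show s ++ " m"

 -- The piecewise "theory of everything" the question describes.
 f :: Double -> String
 f x = if isGravity x then gravityCase x else quantumCase x

 main :: IO ()
 main = do
   putStrLn (f 1.0)     -- a falling apple
   putStrLn (f 1e-15)   -- a proton
   -- The catch (see the reply below): near 1.6e-35 m both branches
   -- claim jurisdiction and give contradictory answers, so no such
   -- clean dispatch exists in nature.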

Because the cases are not mutually exclusive, and all existing theories fail in the area of overlap.

Maybe I'm incredibly naive and unread, but what's a good book or web source describing where they overlap and where they fail?

Last time I read something at the "popularization of science" level, there were the four forces and three of them had been unified, but their ranges of application ensured that they did not overlap; for example, gravitational effects among atomic particles were negligible. And if these theories do overlap in some phenomena and fail to explain them, why is there an expectation that collapsing the equations (does this involve reducing the number of variables too?) into a "theory of everything" will lead to a solution, rather than perhaps adding another entity?
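As a concrete answer to "where do they overlap": a standard back-of-the-envelope estimate (not specific to any one candidate theory) combines the constants of quantum mechanics (\hbar), relativity (c), and gravity (G) into the scale at which both sets of effects become comparable:

  l_P = \sqrt{\hbar G / c^3} \approx 1.6 \times 10^{-35} \text{ m}

  E_P = \sqrt{\hbar c^5 / G} \approx 1.2 \times 10^{19} \text{ GeV}

At the Planck scale, neither quantum field theory (which ignores spacetime curvature) nor general relativity (which ignores quantum fluctuations) can be trusted; the same breakdown appears near singularities and in the very early universe, which is where the overlap matters physically.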

Can someone explain the mathematical nature of the theory they are searching for? For somebody accustomed to mathematical logic, universal algebra, algebraic theories, model theory, category theory, etc., how would a physicist describe what they are looking for?

Interesting. Where did that "popular notion" come from, I wonder? Theoretical physics, pure maths, and applied maths have been leapfrogging one another for centuries (if not millennia, although they get hard to tell apart after too many centuries).

It probably comes from the facts that non-Euclidean geometries are centuries old, that no new math had been invented by physics in the century before M-theory came along, and that people don't have very good memories of what came before they were born.


Maybe the universe is organically designed/evolved such that there is no simple model, only semi-close approximations. It's presumptuous to assume there is a simple model behind it all; there is no universal law that says behind everything there must be a simple model. DNA is turning out messier than originally thought, for example: it doesn't just code for proteins but has all kinds of flags and conditionals, such that it is more like GO-TO code than functional or declarative programming. I suspect the future of science is in analyzing and tracking complexity, not in finding simple models like E=mc^2; there probably aren't many of those left. Economics is moving that way also, using more person-level simulations (not unlike The Sims) rather than only aggregate models and formulas.


See also TheoryOfEverything, EverythingIsa

CategoryPhysics

