
It's basically the opposite situation from 150 years ago.

Back then, we thought our theory was more or less complete while having experimental data which disproved it (Michelson-Morley experiment, Mercury perihelion, I am sure there are others).

Right now, we know our theories are incomplete (since GR and QFT are incompatible) while having no experimental data which contradicts them.





I wouldn't say that we have no experimental data which contradicts them. Rather, we do have experimental data which contradicts them, but no experimental data that points us in the direction of a solution (and whenever we go looking for the latter, we fail).

Consider e.g. neutrino masses. We have plenty of experimental data indicating that neutrinos oscillate and therefore have mass. This poses a problem for the standard model (problems arise unless the mass comes from the Higgs mechanism, but in the standard model neutrinos can't participate in the Higgs mechanism because they are always left-handed). But whenever we do experiments to test one of the proposed fixes -- are there separate right-handed neutrinos we didn't know about, or were the right-handed neutrinos just antineutrinos all along? -- we turn up nothing.
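To make the quantitative side of this concrete: in the usual two-flavour approximation, the oscillation probability depends only on a mixing angle and a squared-mass *difference*, which is why oscillation data establishes that the masses are not all zero without pinning down their absolute scale. A minimal sketch, where the default parameter values are merely illustrative (roughly atmospheric-scale numbers, not fitted results):

```python
import math

def p_osc(L_km, E_GeV, sin2_2theta=0.85, dm2_eV2=2.5e-3):
    """Two-flavour oscillation probability.

    P = sin^2(2*theta) * sin^2(1.27 * dm2[eV^2] * L[km] / E[GeV])

    The default mixing and mass-splitting values are illustrative only
    (roughly atmospheric-scale), not fitted results.
    """
    return sin2_2theta * math.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

# Depends on dm2, a mass-squared difference: oscillation shows the masses
# differ (hence are not all zero), but is silent on the absolute scale.
print(p_osc(295.0, 0.6))  # a T2K-like baseline/energy, purely as an example
```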


> the standard model neutrinos can't participate in the Higgs mechanism due to always being left-handed

This again? It's only true if you insist on sticking with the original form of Weinberg's "model of leptons" from 1967 [1], which was written when massless neutrinos were consistent with available experimental data. Adding quark-style (i.e. Dirac) neutrino mass terms to the Standard Model is a trivial exercise. If doing so offends some prejudice of yours that right-handed neutrinos cannot exist because they have no electric or weak charge (in which case you must really hate photons too, not to mention gravity), you can resort to a Majorana mass term [2] instead.

That question (are neutrinos Dirac or Majorana?) is not a "contradiction", it's an uncertainty caused by how difficult it is to experimentally rule out either option. It is most certainly not "a problem for the standard model".

[1] https://journals.aps.org/prl/pdf/10.1103/PhysRevLett.19.1264

[2] https://en.wikipedia.org/wiki/Majorana_equation#Mass_term


It's trivial to add a matrix to account for neutrino masses, but that doesn't explain their origin.

That is not a trivial problem at all. It certainly has not been solved, and it's possible experiments will say "Both the current ideas are wrong."


> It's trivial to add a matrix to account for neutrino masses

The matrix you are thinking of is presumably the PMNS matrix [1]. It's equivalent to the CKM matrix for quarks [2]. The purpose of both is to parametrize the mismatch between flavor [3] and mass eigenstates, not "to account for neutrino masses" or "explain their origin".

As far as the standard model is concerned, neutrino masses and quark masses all originate from Yukawa couplings [4] with the Higgs field. Adding such terms to Weinberg's original model of leptons is very much a trivial exercise, and was done already well before there was solid evidence for non-zero neutrino masses.
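The "mismatch between flavor and mass eigenstates" can be made concrete with the standard three-angles-plus-one-phase parametrization. A rough numeric sketch (the angle and phase values below are placeholders, not fitted numbers); the matrix is unitary by construction, which is exactly why it parametrizes mixing rather than accounting for the masses themselves:

```python
import numpy as np

def pmns(th12, th13, th23, delta):
    # Standard three-angle, one-phase parametrization: U = R23 . U13(delta) . R12
    c12, s12 = np.cos(th12), np.sin(th12)
    c13, s13 = np.cos(th13), np.sin(th13)
    c23, s23 = np.cos(th23), np.sin(th23)
    e = np.exp(1j * delta)
    R23 = np.array([[1, 0, 0], [0, c23, s23], [0, -s23, c23]], dtype=complex)
    U13 = np.array([[c13, 0, s13 / e], [0, 1, 0], [-s13 * e, 0, c13]])
    R12 = np.array([[c12, s12, 0], [-s12, c12, 0], [0, 0, 1]], dtype=complex)
    return R23 @ U13 @ R12

# Placeholder angles and phase, chosen for illustration only.
U = pmns(0.59, 0.15, 0.84, 3.4)
print(np.allclose(U.conj().T @ U, np.eye(3)))  # unitary by construction: True
```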

> it's possible experiments will say "Both the current ideas are wrong."

Assuming that by "Both current ideas" you mean Dirac vs Majorana mass, those are the only available relativistic invariants. For both to be wrong, special relativity would have to be wrong. Hopefully I don't need to explain how extraordinarily unlikely that is.

[1] https://en.wikipedia.org/wiki/Pontecorvo%E2%80%93Maki%E2%80%...

[2] https://en.wikipedia.org/wiki/Cabibbo%E2%80%93Kobayashi%E2%8...

[3] https://en.wikipedia.org/wiki/Flavour_(particle_physics)

[4] https://en.wikipedia.org/wiki/Yukawa_coupling


Thanks Lord Kelvin

I disagree, but maybe only because we are using different definitions. For example, we have neutrino oscillations, which require neutrino mass, which is not part of the standard model of particle physics. In cosmology, there is the "lithium problem" (amongst others), which cannot be explained by Lambda-CDM. We know our physical theories are incomplete not only because our mathematical frameworks (GR & QFT) are incompatible (similar to the incompatibility of Maxwell's equations with the Galilean transformations that form the basis of Newtonian mechanics), but also because there are these unexplained phenomena, much like blackbody radiation at the turn of the previous century.

> neutrino mass, which is not part of the standard model of particle physics

This is getting tiresome...

https://news.ycombinator.com/item?id=46956197


What about underexplained cosmological epicycles like dark matter (in explaining long-standing divergences of gravitational theory from observation), or the Hubble tension?

The dark matter theory, broadly, is that there is an amount of invisible matter that obeys the laws of Einsteinian gravity but isn't otherwise visible. By itself, it has considerable experimental evidence. It doesn't resemble Ptolemaic theories of planetary motion, notably in that it doesn't and hasn't required regular updating as new data arrives.

It fits well with the OP's comment: nothing really contradicts the theory, but there's no deeper theory beneath it. Another comment mentioned the "nightmare" scenario of dark matter having only gravitational interactions with other matter. That would be very unsatisfying for physicists, but it wouldn't be something that really disproves any given theory.


When you say dark matter theory doesn't require updates when new data arrives, it sounds like you don't count the parameters that describe the dark matter distribution to be part of the theory.

This is your regular reminder that epicycles were not an incorrect addition to the theory until an alternative hypothesis could explain the same behavior without requiring them.

Sure, but in that regard dark matter is even more unsatisfying than (contemporary) epicycles, because not only does it add extra complexity, it doesn't even characterize the source of that complexity beyond its gravitational effects.

FYI, very recently (as in this has been in the news the past few days, and the article is from December) an article was published that suggested we might already have experimental evidence for dark matter being primordial black holes, though there are reasons to doubt it as well. I just posted the article: https://news.ycombinator.com/item?id=46955545

But this might be easier to read: https://www.space.com/astronomy/black-holes/did-astronomers-...


Even better, there are the "nightmare" scenarios where dark matter can only interact gravitationally with Standard Model particles.

Personally—and this is where I expect to lose the materialists that I imagine predominate HN—I think we are already in a nightmare scenario with regard to another area: the science of consciousness.

The following seem likely to me: (1) Consciousness exists, and is not an illusion that doesn't need explaining (a la Daniel Dennett), nor does it drop out of some magical part of physical theory we've somehow overlooked until now; (2) Mind-matter interactions do not exist, that is, purely physical phenomena can be perfectly explained by appeals to purely physical theories.

Such is the position of "naturalistic dualist" thinkers like David Chalmers. But if this is the case, it implies that the physics of matter and the physics of consciousness are orthogonal to each other. Much like it would be a nightmare to stipulate that dark matter is a purely gravitational interaction and that's that, it would be a nightmare to stipulate that consciousness and qualia arise noninteractionally from certain physical processes just because. And if there is at least one materially noninteracting orthogonal component to our universe, what if there are more that we can't even perceive?


I don't think any of this is particularly nightmarish. Just because we don't yet know how this complex system arises from another lower level one doesn't make it new physics. There's no evidence of it being new or orthogonal physics.

Imagine trying to figure out what is happening on someone's computer screen with only physical access to their hardware minus the screen, and an MRI scanner. And that's a system we built! We've come exceedingly far with brains and minds considering the tools we have to peer inside.


Knowing how to build a brain is different from knowing whether that brain has consciousness in the sense that you or I do. The question of consciousness appears to demand new/orthogonal physics because according to our existing physics, there's no sense in which you or I should "feel" any differently than a rock does, or a computer does, or Searle's room does, or a Chinese brain does, or the universe as a whole does, etc.

I don't believe in the hard problem of consciousness. Yes, materialist. And yes, it might be that we can never actually trace the path from the physical level to how it feels, just like we might never find the fundamental physical rules of the universe. At this time, both our positions are beliefs.

> The question of consciousness appears to demand new/orthogonal physics because according to our existing physics, there's no sense in which you or I should "feel" any differently than a rock does,

deepak chopra may interest you


I don't think there is any mystery to what we call "consciousness". Our senses and brain have evolved so we can "sense" the external world, so we can live in it and react to it. So why couldn't we also sense what is happening inside our brains?

Our brain needs to sense our "inner talk" so we can let it guide our decision-making and actions. If we couldn't remember sentences, we couldn't remember "facts" and would be much worse for that. And talking with our "inner voice" and hearing it, isn't that what most people would call consciousness?


This is not nearly as profound as you make it out to be: a computer program also doesn't sense the hardware that it runs on; from its point of view the hardware is invisible until it is made explicit, as peripherals.

You also don’t consciously use your senses until you actively think about them. Same as “you are now aware of your breathing”. Sudden changes in a sensation may trigger them to be conscious without “you” taking action, but that’s not so different. You’re still directing your attention to something that’s always been there.

I agree with the poster (and Daniel Dennett and others) that there isn't anything that needs explaining. It's just a question-framing problem, much like the measurement problem in quantum mechanics.


another one that thinks they solved the hard problem of consciousness by addressing the easy problem. how on earth does a feedback system cause matter to "wake up"? we are making lots of progress on the easy problem though

This is not as good a rebuttal as you think it is. To me (and, I imagine, the parent poster) there is no extra logical step needed. The problem IS solved in this sense.

If it’s completely impossible to even imagine what the answer to a question is, as is the case here, it’s probably the wrong question to pose. Is there any answer you’d be satisfied by?

To me the hard problem is more or less akin to looking for the true boundaries of a cloud: a seemingly valid quest, but one that can’t really be answered in a satisfactory sense, because it’s not the right one to pose to make sense of clouds.


> If it’s completely impossible to even imagine what the answer to a question is, as is the case here, it’s probably the wrong question to pose. Is there any answer you’d be satisfied by?

I would be very satisfied to have an answer, or even just convincing heuristic arguments, for the following:

(1) What systems experience consciousness? For example, is a computer as conscious as a rock, as conscious as a human, or somewhere in between?

(2) What are the fundamental symmetries and invariants of consciousness? Does it impact consciousness whether a system is flipped in spacetime, skewed in spacetime, isomorphically recast in different physical media, etc.?

(3) What aspects of a system's organization give rise to different qualia? What does the possible parameter space (or set of possible dynamical traces, or what have you) of qualia look like?

(4) Is a consciousness a distinct entity, like some phase transition with a sharp boundary, or is there no fundamentally rigorous sense in which we can distinguish each and every consciousness in the universe?

(5) What explains the nature of phenomena like blindsight or split-brain patients, where seemingly high-level recognition, coordination, and/or intent occurs in the absence of any conscious awareness? Generally, what behavior-affecting processes in our brains do and do not affect our conscious experience?

And so on. I imagine you'll take issue with all of these questions, perhaps saying that "consciousness" isn't well defined, or that an "explanation" can only refer to functional descriptions of physical matter, but I figured I would at least answer your question honestly.


I think most of them are valid questions!

(1) is perhaps more of a question requiring a strict definition of consciousness in the first place, making it mostly circular. (2) and especially (3) are the most interesting, but they seem part of the easy problem instead. And I’d say we already have indications that the latter option of (4) is true, given your examples from (5) and things like sleep (the most common reason for humans to be unconscious) being in distinct phases with different wake up speed (pun partially intended). And if you assume animals to be conscious, then some sleep with only one hemisphere at a time. Are they equally as conscious during that?

My imaginary timeline of the future has scientific advancements leading us to notice what's different between a person's brain in their conscious and unconscious states, then somehow generalize that to a more abstract model of cognition decoupled from our biological implementation, and then eventually tackle all your questions from there. But I suspect the person I originally replied to would dismiss these as part of the easy problem instead, i.e. completely useless for tackling the hard problem! As far as I'm concerned, it's the hard problem that I take issue with, and the one that I claim isn't real.


I much agree, especially on the importance of defining what we mean by the word "consciousness" before we say we cannot explain it. Is a rock conscious? Sure, according to some definition of the word. Probably everybody would agree that there are different levels of consciousness, and maybe we'd need different names for them.

Animals are clearly conscious in that they observe the world and react to it and even try to proactively manipulate it.

The next level of consciousness, and what most people probably mean when they use the word, is the human ability to "think in language". That opens up a whole new level of consciousness, because now we can be conscious of our inner voice. We are conscious of ourselves, apart from the world. Our inner voice can say things about the thing which seems to be the thing uttering the words in our mind. Me.

Is there anything more to consciousness than us being aware that we are conscious? It is truly a wondrous experience which may seem like a hard problem to explain, hence the "Hard Problem of Consciousness", right? But it's not so mysterious if we think of it in terms of being able to use and hear and understand language. Without language our consciousness would be on the level of most animals I assume. Of course it seems that many animals use some kind of language. But, do they hear their "inner voice"? Hard to say. I would guess not.

And so again, in simple terms, what is the question?


How can consciousness have information about the material world if it doesn't interact with it in any way?

And when your fingers type that you experience qualia, are they bullshitting because your fingers have never actually received any signals from your consciousness in any direct or indirect way?


I think the old theory of the planes of existence has a lot of utility here - if you substitute "the dimensionality at which you're analyzing your dataset" for the hermetic concept of "planes of existence" you get essentially the same thing, at least in lower dimensions like one (matter) or two (energy). Mind, specifically a human mind, would be four-dimensional under the old system, which feels about right. No idea how you'd set up an experiment to test that theory, though. It may be completely impossible, because experiments only work when they work in all contexts, and only matter is ever the same regardless of context.

That would certainly be a difficult scenario. But it doesn't seem very likely. For example, consciousness and material systems seem to interact. Putting drugs in your blood changes your conscious experience etc.

Yes, but it doesn't even need mysticism or duality.

There's a more straightforward problem, which is that all of science is limited by our ability to generate and test mental models, and there's been no research into the accuracy and reliability of our modelling processes.

Everything gets filtered through human consciousness - math, experiment, all of it. And our definition of "objective" is literally just "we cross-check with other educated humans and the most reliable and consistent experience wins, for now."

How likely is it that human consciousness is the most perfect of all possible lenses, doesn't introduce distortions, and has no limits, questionable habits, or blind spots?


Yeah - the nightmare situation doesn't exist if you take a materialist approach. Maybe that's evidence for it?

I've thought about this possibility but come to reject it. If mind-matter interactions did not exist, then matter could not detect the presence of mind. And if the brain cannot detect the mind then we wouldn't be able to talk or write about the mind.

Or, the mind is in spectator mode?

From a physics point of view it should be, as every effect is caused by the previous state. And the next tick is always the next tick, except quantum mechanics has some randomness, but let's assume it's a seeded randomness.

I think every tick is predictable from previous state. Inevitable. Therefore I really like how you put it: mind is just spectating.
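The "seeded randomness" idea can be illustrated with a toy sketch (the evolve rule here is hypothetical, not a physical model): once the seed is fixed, every tick is a pure function of the previous state, so the entire trajectory is reproducible:

```python
import random

def evolve(state, rng):
    # Hypothetical toy rule: the next state is a deterministic function of
    # the current state plus one draw from a seeded RNG.
    return state + rng.choice([-1, 1])

def run(seed, ticks=10):
    rng = random.Random(seed)  # fixing the seed fixes every "random" draw
    state, history = 0, []
    for _ in range(ticks):
        state = evolve(state, rng)
        history.append(state)
    return history

print(run(42) == run(42))  # same seed, same trajectory: True
```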


That doesn't answer the question though.

If a rock starts moving in one tick, it affects other things in the next tick. Despite being deterministic, that rock is not a spectator.

So if the mind is a spectator, it's not for that reason, it's some other reason.


Scientific theories are not curve-fitting.

>GR and QFT are incompatible

I did physics at uni and kind of dropped out when it got too hard.

I've long guessed the incompatibility is because the maths is just too hard for human brains, though I'm probably biased there, and we'll get a breakthrough when AI can handle much more complex maths than us. Probably not so long till we find out on that one.

I once tried to write a simplified explanation for why a spin-2 quantum theory naturally results in something like general relativity and totally failed - man that stuff's hard.


The math is hard, but I don't think that's the problem. Hard math eventually succumbs.

I think that even if AI were to find a good unification of GR and QM, we wouldn't be able to test it. We might accept it without additional confirmation if it were sufficiently natural-feeling (the way we accepted Newtonian gravity long before we could measure G), but there's no guarantee that we'd ever be able to meaningfully test it.

We could get lucky -- such a theory might point at a solution to some of the few loose threads we get out of existing collider and cosmological measurements -- but we might not. We could be stuck wishing we had a galaxy-sized collider.


It might explain some of the many physics observations that we don't have explanations for, like why we have the particles we have, and why they have the properties they do.

Doesn't that imply our theories are "good enough" for all practical purposes? If they're impossible to empirically disprove?

Yes, for all practical purposes. This is the position of physicist Sean Carroll and probably others. We may not know what is happening in the middle of a black hole, or very close to the big bang, but here on Earth we do.

"in the specific regime covering the particles and forces that make up human beings and their environments, we have good reason to think that all of the ingredients and their dynamics are understood to extremely high precision"[0]

0: https://philpapers.org/archive/CARCAT-33


ER=EPR says something completely shocking about the nature of the universe. If there is anything to it, we have almost no clue about how it works or what its consequences are.

Sean Carroll's own favorite topics (emergent gravity, and the many worlds interpretation) are also things that we don't have any clue about.

Yes there is stuff we can calculate to very high precision. Being able to calculate it, and understanding it, are not necessarily the same thing.


The fundamental theories are good enough in that we can't find a counterexample, but they're only useful up to a certain scale before the computational power needed becomes infeasible. We're still hoping to find higher-level emergent theories to describe larger systems. By analogy, in principle you could use Newton's laws of motion (1687) to predict what a gas in a room is going to do, or how fluid will flow in a pipe, but in practice it's intractable and we prefer the higher-level language of fluid mechanics: the ideal gas law, the Navier-Stokes equations, etc.
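As a tiny worked example of that higher-level shortcut: the ideal gas law gives a pressure from bulk quantities alone, with no need to integrate ~10^24 Newtonian trajectories (the numbers below are chosen purely for illustration):

```python
R = 8.314  # gas constant, J/(mol*K)

def pressure_pa(n_mol, t_kelvin, v_m3):
    # Ideal gas law P = nRT/V: a macroscopic shortcut that needs no
    # knowledge of the individual molecular trajectories.
    return n_mol * R * t_kelvin / v_m3

# 1 mol of gas at 293 K in a 1 m^3 box
print(round(pressure_pa(1.0, 293.0, 1.0)))  # ≈ 2436 Pa
```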

Typically whenever you look closely at an object with complex behavior, there is a system inside made of smaller, simpler objects interacting to produce the complexity.

You'd expect that at the bottom, the smallest objects would be extremely simple and would follow some single physical law.

But the smallest objects we know of still have pretty complex behavior! So there's probably another layer underneath that we don't know about yet, maybe more than one.


I agree, and I think that your claim is compatible with the comment that you are responding to. Indeed, perhaps it's turtles all the way down and there is systematic complexity upon systematic complexity governing our universe that humanity has been just too limited to experience.

For a historical analogy, classical physics was and is sufficient for most practical purposes, and we didn't need relativity or quantum mechanics until we had instruments that could manipulate them, or that at least experienced them. While I guess that there were still macroscopic quantum phenomena, perhaps they could have just been treated as empirical material properties without a systematic universal theory accounting for them, when instruments would not have been precise enough to explore and exploit predictions of a systematic theory.


The experiments that led to the invention of quantum theory are relatively simple and involve objects you can touch with your bare hands without damaging them. Some are done in high school, e.g. the photoelectric effect.

Although I did hedge my point regarding macroscopic quantum phenomena, I think that the quantum nature of the photoelectric effect would have been harder to discern without modern access to pure-wavelength lighting. But you could still rely on precise optics to purify mixed light, I suppose. Without even optics it would be harder still.

All the 19th-century experiments that required monochromatic light, including those that characterized the photoelectric effect, used dispersive prisms, which separated the light from the Sun or from a candle into its monochromatic components. These are simple components, easily available.

This allowed experiments where the frequency of light was varied continuously, by rotating the prism.

Moreover, already during the first half of the 19th century, it became known that using gas-discharge lamps with various gases or by heating certain substances in a flame you can obtain monochromatic light corresponding to certain spectral lines specific to each substance. This allowed experiments where the wavelength of the light used in them was known with high accuracy.

Already in 1827, Jacques Babinet proposed replacing the platinum meter standard with the wavelength of some spectral line as the base for the unit of length. This proposal was developed and refined later by Maxwell, in 1870, who proposed using both the wavelength and the period of some spectral line for the units of length and time. Babinet's proposal was adopted in SI in 1960, 133 years later, while Maxwell's was adopted in SI in 1983, 113 years later.

So there were no serious difficulties in the 19th century for using monochromatic light. The most important difficulty was that their sources of monochromatic light had very low intensities, in comparison with the lasers that are available today. The low intensity problem was aggravated when coherent light was needed, as that could be obtained only by splitting the already weak light beam that was available. Lasers also provide coherent light, not only light with high intensity, thus they greatly simplify experiments.


> You'd expect that at the bottom, the smallest objects would be extremely simple and would follow some single physical law.

That presupposes that there's a bottom, and that each subsequent layer gets simpler. Neither proposition is guaranteed, indeed the latter seems incorrect since quantum chromodynamics governing the internal structure of the proton is much more complex than the interactions governing its external behavior.


Yeah that's the outcome theorized by Gödel.

Incompleteness is inherent to our understanding as the universe is too vast and endless for us to ever capture a holistic model of all the variables.

Gödel says something specific about human axiomatic systems, akin to a special relativity, but it generalizes to physical reality too. A written system is made physical by writing it out, and is never complete. That demonstrates that our grasp of physical systems themselves is always incomplete.


Gödel’s incompleteness says almost nothing about this. I wish people wouldn’t try to apply it in ways that it very clearly is not applicable to.

An environment living in Conway’s Game of Life could be quite capable of hypothesizing that it is implemented in Conway’s Game of Life.
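For reference, the rules such an environment would be hypothesizing about fit in a few lines. A minimal sketch of one Game of Life generation over a sparse set of live cells:

```python
from collections import Counter

def step(alive):
    # One Game of Life generation over a sparse set of live (x, y) cells:
    # a live cell survives with 2-3 live neighbours; a dead cell with
    # exactly 3 live neighbours is born.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in alive
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in alive)}

blinker = {(0, 1), (1, 1), (2, 1)}     # horizontal bar of three cells
print(step(blinker))                   # flips to a vertical bar
print(step(step(blinker)) == blinker)  # period-2 oscillator: True
```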


That's not what they were saying.

Systems can hypothesize about themselves but they cannot determine why the rules they can learn exist in the first place. Prior states are no longer observable so there is always incomplete history.

Conway's Game of Life can't explain its own origins, just itself, because the origins are no longer observable after they occur.

What are the origins of our universe? We can only guess without the specificity of direct observation. Understanding is incomplete with only simulation and theory.

So the comment is right. We would expect to be able to define what is now but not completely know what came before.


If they had a correct point, the appeal to Gödel’s results did not help to justify it.

Indeed, as I think I commented before here, this kind of self-reference is exactly what makes Gödel's proof work.

Now the question is are we in Conways Game of Life?

The point is not to make better predictions of the things we already know how to predict. The point is to determine what abstractions link the things we don't presently understand--because these abstraction tend to open many new doors in other directions. This has been the story of physics over and over: relativity, quantum theory, etc, not only answered the questions they were designed to answer but opened thousands of new doors in other directions.

Maybe? We seem to be able to characterize all the stuff we have access to. That doesn't mean we couldn't, say, produce new and interesting materials with new knowledge. Before we knew about nuclear fission, we couldn't predict that anything would happen with a big chunk of uranium, or foresee the useful applications of that. New physics might be quite subtle or specific but still useful.

All the stuff we have access to?

There isn't even a general physical theory of window glass -- i.e. of how to resolve the Kauzmann paradox and define the nature of the glass transition. Glass is one of man's oldest materials, and yet it's still not understood.

There's also, famously, no general theory for superconducting materials, so superconductors are found via alchemical trial-and-error processes. (Quite famously a couple of years ago, if you remember that circus.)

Solid-state physics has a lot of big holes.


The existing theories are extremely far from being good enough for practical purposes.

There exists a huge number of fundamental quantities that should be calculated from the parameters of the "standard model", but we cannot compute them, we can only measure them experimentally.

For instance, the masses and magnetic moments of the proton, of the neutron and of all other hadrons, the masses and magnetic moments of the nuclei, the energy spectra of nuclei, of atoms, of ions, of molecules, and so on.

The "standard model" can compute only things of negligible practical importance, like the statistical properties of the particle collisions that are performed at LHC.

It cannot compute anything of value for practical engineering. All semiconductor devices, lasers and any other devices where quantum physics matters are not designed using any consistent theory of quantum physics; they are designed using models based on a great number of empirical parameters determined by measurement, for which quantum physics is only an inspiration for how the model should look, not a base from which the model can be derived rigorously.


This depends very much on what "practical purposes" are. For almost all conceivable technology, relativistic quantum mechanics for electrons and light, i.e. QED, is a sufficient fundamental theory. This is unlike before quantum mechanics, when we basically didn't have fundamental laws for chemistry and solid-state physics.

The vast majority of useful things cannot be computed with QED from fundamental principles. You cannot compute even simple atomic energy spectra.

The fundamental laws of chemistry have not been changed much by quantum physics, they just became better understood and less mysterious. Quantum mechanics has explained various cases of unusual chemical bonds that appeared to contradict the simpler rules that were believed to be true before the development of quantum physics, but not much else has practical importance.

Solid-state physics is a much better example, because little of it existed before quantum physics.

Nevertheless, solid-state physics is also the most obvious example that the current quantum physics cannot be used to compute anything of practical value from first principles.

All solid-state physics is based on experimentally-measured parameters, which cannot be computed. All mathematical models used in solid-state physics are based on guesses about how the solutions might behave, e.g. by introducing various fictitious averaged potentials into equations like the Schroedinger equation. They are not derived from the primary laws: the only justification for the guesses is that, once the model is completed with experimentally-measured values for its parameters, it makes reasonably accurate predictions.

Using empirical mathematical models of semiconductor materials, e.g. for designing transistors, is perfectly fine and entire industries have been developed with such empirical models.

However, the fact that one must develop custom empirical models for every kind of application, instead of being able to derive them from what are believed to be the universal laws of quantum physics, demonstrates that those laws are not good enough in practice.

We can live and progress very well with what we have. But if someone were to discover a better theory, or a mathematical strategy for obtaining solutions, that could compute the parameters we must now measure, and that could model everything we need with guarantees that the model is adequate, that would be a great advance in physics.


You seem to be familiar with the field, yet this is a very strange view? I work on exactly this slice of solid state physics and semiconductor devices. I’m not sure what you mean here.

The way we construct Hamiltonians is indeed somewhat ad hoc sometimes, but that’s not because of lack of fundamental knowledge. In fact, the only things you need are the mass of the electron/proton and the quantum of charge. Everything else is fully derived and justified, as far as I can think of. There’s really nothing other than the extremely low energy limit of QED in solid state devices, then it’s about scaling it up to many body systems which are computationally intractable but fully justified.

We don’t even use relativistic QM 95% of the time. Spin-orbit terms require it, but once you’ve derived the right coefficients (only needed once) you can drop the Dirac equation and go back to Schrödinger. The need for empirical models has nothing to do with fundamental physics, and everything to do with the exorbitant complexity of many-body systems. We don’t use QFT and the Standard Model simply because, as far as I can tell, the computation would never scale. That's not really a fault of the Standard Model.
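As a concrete illustration of that workflow, here is a minimal Python sketch of the standard effective-mass shortcut: the band structure supplies a single derived coefficient, the effective mass, and from there an ordinary particle-in-a-box Schrödinger solution gives a usable energy level. The GaAs effective mass (~0.067 electron masses) and the 10 nm well width are assumed textbook values, not outputs of the code.

```python
import math

# Infinite square well with an effective mass: the band-structure
# detail is folded into the single coefficient m_eff, and the rest
# is ordinary textbook quantum mechanics, E_n = (n*pi*hbar)^2 / (2 m L^2).
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837e-31      # electron mass, kg
EV = 1.602176634e-19     # J per eV

def well_level_meV(n, m_eff_ratio, width_m):
    """Energy of level n in an infinite well, in meV."""
    m = m_eff_ratio * M_E
    e_joule = (n * math.pi * HBAR) ** 2 / (2 * m * width_m ** 2)
    return e_joule / EV * 1e3

# Assumed values: GaAs conduction-band effective mass ~0.067 m_e,
# a 10 nm quantum well; the ground state comes out in the tens of meV.
print(well_level_meV(1, 0.067, 10e-9))
```

The point is that nothing in this calculation requires QFT; the one number carrying the many-body physics is the effective-mass ratio.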


> The fundamental laws of chemistry have not been changed much by quantum physics, they just became better understood and less mysterious. Quantum mechanics has explained various cases of unusual chemical bonds that appeared to contradict the simpler rules that were believed to be true before the development of quantum physics, but not much else has practical importance.

Um, false? The fundamentals of chemistry are about electron orbitals (especially the valence ones) and their interactions between atoms to form molecules. All of my college chemistry courses delved somewhat into quantum mechanics, with the biggest helping being in organic chemistry. And modern computational chemistry is basically modeling QED as applied to atoms.


What are you talking about? The spectrum of hydrogen is very well understood and a textbook example for students to calculate.

We use spectra to test QED calculations to something like 14 digits.


The hydrogenoid atoms and ions, with a single electron, are the exception that proves the rule, because anything more complex cannot be computed accurately.

The spectrum of hydrogen (ignoring the fine structure) could already be computed with Rydberg's empirical formula before quantum physics existed. Quantum physics just explained it in terms of simpler assumptions.
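For reference, that pre-quantum empirical calculation is essentially one line. A quick Python sketch (using the standard Rydberg constant for an infinitely heavy nucleus; the exact digits, and the small reduced-mass correction they ignore, are assumptions here) reproduces the visible Balmer lines:

```python
# Rydberg's empirical formula for hydrogen lines:
#   1/lambda = R * (1/n1^2 - 1/n2^2),  with n2 > n1
R_INF = 1.0973731568e7  # Rydberg constant, 1/m (infinite nuclear mass)

def line_nm(n1, n2):
    """Wavelength in nm of the n2 -> n1 transition."""
    return 1e9 / (R_INF * (1 / n1**2 - 1 / n2**2))

# Balmer series (n1 = 2): the visible hydrogen lines,
# e.g. H-alpha near 656 nm and H-beta near 486 nm
for n2 in range(3, 7):
    print(f"{n2} -> 2: {line_nm(2, n2):.1f} nm")
```

No quantum mechanics enters anywhere; the formula was fitted to measured spectra decades before Schrödinger.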

Quantum physics explains a great number of features of the atomic spectra, but it is unable to compute anything for complex atoms with an accuracy comparable to the experimental measurements.

The QED calculations with "14 digits" of precision are for things that are far simpler than atomic spectra, e.g. for the gyromagnetic ratio of the electron, and even for such things the computations are extremely difficult and error-prone.


> The hydrogenoid atoms and ions, with a single electron, are the exception that proves the rule, because anything more complex cannot be computed accurately.

Rather: there is no known closed-form solution (and there likely won't be any).


Lattice QCD can, by now, actually calculate the masses of the proton and neutron from first principles pretty accurately.

This is of course a brute-force approach. We currently lack, in all fields, theory for emergent properties. And the mass of the proton definitely is such.


There have been claims about this, starting with "Ab Initio Determination of Light Hadron Masses" (Science, 2008).

Nevertheless, until now I have not seen anything that qualifies as "computing the masses".

Research papers like that do not contain any information that would allow someone to verify their claims. Moreover, such papers are much more accurately described as "fitting the parameters of the Standard Model, such as quark masses, to approximately match the measured masses", and not as actually computing the masses.

The published results for hadron masses are not much more accurate than what you could compute mentally, without using any QCD, much less lattice QCD, by estimating approximate quark masses from the quark composition of the hadrons and summing them. What complicates the mass computations is that while the heavy quarks have masses that do not vary much, the effective masses of the light quarks (especially u and d, which compose the protons and neutrons) vary a lot between different particles. Because of this, there is a very long way between a vague estimate of the mass and an accurate value.
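To make the "compute mentally" point concrete, here is that back-of-envelope sum in Python. The constituent-quark masses are rough, commonly quoted ballpark figures; they are assumptions fitted to hadron data, not outputs of any QCD calculation, which is exactly the point:

```python
# Rough constituent-quark masses in MeV (ballpark figures that are
# themselves fitted to hadron masses, not computed from QCD)
CONSTITUENT_MEV = {"u": 336, "d": 340, "s": 486}

def naive_mass(quark_content):
    """Sum of constituent masses: the 'mental' hadron-mass estimate."""
    return sum(CONSTITUENT_MEV[q] for q in quark_content)

print(naive_mass("uud"), "MeV vs measured 938.3 (proton)")
print(naive_mass("udd"), "MeV vs measured 939.6 (neutron)")
```

The sums land within roughly 10% of the measured values, which illustrates how much of the published "first-principles" accuracy is already captured by a fitted estimate.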


The theories don't answer all the questions we can ask, namely questions about how gravity behaves at the quantum scale. (These questions pop up when exploring extremely dense regions of space - the very early universe and black holes).

Classical physics was indeed "good enough for all practical purposes" at the time as well... but those purposes didn't include electronics, nuclear power, most of our basic understanding of materials, chemistry, and a tremendous number of other things.

The point being, it's not at all clear what we might be missing without these impractical little mysteries that are, so far, very distant from everyday life.


If I had to make a guess, we are at a pre-Copernicus level in particle physics.

We are finding local maxima (induction), but the establishment cannot handle deduction.

Everything is an overly complex bandaid. At some point someone will find something elegant that predicts 70% as well, and at some point we will realize: 'Oh that's great, the sun actually is at the center of the solar system. Copernicus was just slightly wrong in thinking the planets move in circles; we needed ellipses!'

But with particles.


The sun is not at the center of the solar system. The intellectual leap was not to replace the earth with the sun. The earth does not "revolve around the sun". The intellectual leap was to realize that the situation is somewhat symmetric: they both attract each other, and they orbit around their center of gravity (which, yes, is inside the sun, but not because the sun is the center).

This sounds like a distinction without consequence, but I think that's wrong. The sun is not special; it just has a lot of mass. If somebody learns "the earth orbits the sun", they don't understand how two black holes can orbit each other. If somebody learns "the sun and the earth orbit their common center of mass", they will be able to understand that.


There are still huge gaps in our understanding: quantum gravity, dark matter, what happens before the Planck time, the thermodynamics of life, and many others.

Part of the problem is that building bigger colliders, telescopes, and gravitational wave detectors requires huge resources and very powerful computers to store and crunch all the data.

We're cutting research instead of funding it right now and sending our brightest researchers to Europe and China...


I think the problem is that GR and QFT are at odds with each other? (I am not quite versed in the subject and this is my high-level understanding of the “problem”)

Absolutely not. Newtonian physics was 'good enough' until we disproved it. Imagine where we would be if all we had was Newtonian physics.

You would still make it to the moon (so I've heard). Maybe you wouldn't have GPS systems?

Newtonian physics is good enough for almost everything that humans do. It's not good for predicting the shit we see in telescopes, and apparently it's not good for GPS, although honestly I think without general relativity, GPS would still have gotten made; there'd just be a fudge factor that people shrug about.
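For what it's worth, that "fudge factor" is easy to estimate. A back-of-envelope Python sketch (the orbit radius and the other constants are rounded assumptions) reproduces the well-known net GPS clock drift of roughly +38 microseconds per day:

```python
import math

# Rounded constants; treat these as assumptions for a rough check
c = 2.998e8          # speed of light, m/s
GM = 3.986e14        # Earth's gravitational parameter, m^3/s^2
r_earth = 6.371e6    # Earth radius, m (ground clock)
r_sat = 2.656e7      # GPS orbit radius, m (~20,200 km altitude)

v_sat = math.sqrt(GM / r_sat)  # orbital speed, roughly 3.9 km/s

# Special relativity: the moving satellite clock runs slow
sr_rate = -v_sat**2 / (2 * c**2)
# General relativity: the clock higher in Earth's potential runs fast
gr_rate = (GM / c**2) * (1 / r_earth - 1 / r_sat)

drift_us_per_day = (sr_rate + gr_rate) * 86400 * 1e6
print(f"net drift: {drift_us_per_day:+.1f} microseconds/day")
```

Whether engineers would have arrived at this number empirically without the theory is the interesting counterfactual; at GPS timing precision, nanoseconds of clock error translate to meters of position error, so the drift would have been noticed immediately.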

For just about anything else, Newton has us covered.


Microchips? A lot of quantum physics is applied here from the top of my mind.

Quantum mechanics is relevant to humanity because we build things which are very small. General relativity is not, because we're more or less incapable of actually doing things on a scale where it matters.

General relativity is pretty relevant to GPS satellites.

Oh sure, nothing major. Just transistors, lasers, MRI, GPS, nuclear power, photovoltaics, LEDs, X-rays, and pretty much anything requiring Maxwell's equations.

Nothing major.


quantum mechanics (also very much not Newtonian) is much more important to our day-to-day lives.

this kind of distinction is quite stupid in general, as plenty of things we rely on for day-to-day activities, such as our houses, desks, chairs, beds, shoes, clothes, etc., are all based on Newtonian/classical mechanics. Basically everything we use that existed pre-transistor strictly speaking required only classical physics.

I mean sure, but the transistor is pretty important to the way I live my life now!

I'd argue so is the bed you sleep in every night, and the roof over your head. Best not to take those for granted, as I don't think the transistor would last so long if it wasn't sheltered from the environment.

The argument is that these kinds of distinctions between how "classical" and "quantum" physics affect our lives are just a pointless endeavor that even academics don't waste their time on.


Is it?

Flash memory (quantum tunneling), lasers (stimulated emission), transistors (band theory), MRI machines (nuclear spin), GPS (atomic transitions), LEDs (band gap), digital cameras (photoelectric effect)... the list does, in fact, go on, and on, and on.

Did you intentionally list things that are clearly not essential to day-to-day life?

I'd argue flash memory and transistors certainly are.

This era might be one where we have to earn the next clue much more slowly

I find the idea that reality might be quantized fascinating: it would mean that all the information that exists could be stored in a big enough storage medium.

It's also kind of interesting how causality allegedly has a speed limit, and it's rather slow, all things considered.

Anyway, in 150 years we have come a long way. We'll figure that out eventually too, but as always, figuring it out might lead to even bigger questions and mysteries...


Note that "reality" is not quantized in any existing theory. Even in QM/QFT, only certain properties are quantized, such as mass or charge. Others, like position or time, are very much not quantized - the distance between two objects can very well be 2.5π Planck lengths. And not only are they not quantized, the math of these theories does not work if you try to discretize space, time, or other properties.

> all information that exists could be stored in a storage medium big enough

Why is quantization necessary for information storage? If you're speculating about a storage device external to our universe, it need not be constrained by any of our physical laws and their consequences, such as by being made up of finitely many atoms or whatever. It might have components like arbitrary precision real number registers.

And if you're speculating about a storage device that lives within our universe, you have a contradiction, because its maximum information capacity can't exceed the information content of its own description.


If reality is quantized, how can you store all the information out there without creating a real simulation? (Essentially cloning the environment you want stored)


