The existing theories are extremely far from being good enough for practical purposes.
There exists a huge number of fundamental quantities that should be calculable from the parameters of the "standard model", but we cannot compute them; we can only measure them experimentally.
For instance: the masses and magnetic moments of the proton, the neutron and all other hadrons; the masses and magnetic moments of nuclei; the energy spectra of nuclei, atoms, ions, molecules, and so on.
The "standard model" can compute only things of negligible practical importance, like the statistical properties of the particle collisions that are performed at the LHC.
It cannot compute anything of value for practical engineering. Semiconductor devices, lasers and all other devices where quantum physics matters are not designed using any consistent theory of quantum physics. They are designed using models with a great number of empirical parameters determined by measurement, for which quantum physics is only an inspiration for what the model should look like, not a base from which the model can be derived rigorously.
This depends very much on what "practical purposes" are. For almost all conceivable technology, relativistic quantum mechanics for electrons and light, i.e. QED, is a sufficient fundamental theory. This is unlike the situation before quantum mechanics, when we basically had no fundamental laws for chemistry and solid-state physics.
The vast majority of useful things cannot be computed with QED from fundamental principles. You cannot compute even simple atomic energy spectra.
The fundamental laws of chemistry have not been changed much by quantum physics; they just became better understood and less mysterious. Quantum mechanics has explained various cases of unusual chemical bonds that appeared to contradict the simpler rules believed to be true before the development of quantum physics, but not much else of practical importance.
Solid-state physics is a much better example, because little of it existed before quantum physics.
Nevertheless, solid-state physics is also the most obvious example that the current quantum physics cannot be used to compute anything of practical value from first principles.
All of solid-state physics is based on experimentally measured parameters, which cannot be computed. The mathematical models used in solid-state physics rest on guesses about how the solutions might behave, e.g. on introducing various fictitious averaged potentials into equations like the Schrödinger equation. They are not derived from the primary laws; the only justification for the guesses is that, once the model is completed with the experimentally measured values of its parameters, it makes reasonably accurate predictions.
Using empirical mathematical models of semiconductor materials, e.g. for designing transistors, is perfectly fine and entire industries have been developed with such empirical models.
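A concrete illustration of such an empirical model is the Varshni formula for the temperature dependence of a semiconductor band gap: its three parameters are fitted to measurement for each material, not derived from any fundamental theory. A minimal sketch in Python (the silicon parameter values below are commonly quoted fits, assumed here for illustration):

```python
def varshni_band_gap(e_g0, alpha, beta, temp_k):
    """Varshni empirical formula: Eg(T) = Eg(0) - alpha*T^2 / (T + beta).

    All three parameters are fitted to experimental data for each
    material; none of them is computed from the Standard Model.
    """
    return e_g0 - alpha * temp_k**2 / (temp_k + beta)

# Commonly quoted fitted parameters for silicon (empirical, not derived):
#   Eg(0) ~ 1.17 eV, alpha ~ 4.73e-4 eV/K, beta ~ 636 K
eg_si_300k = varshni_band_gap(1.17, 4.73e-4, 636.0, 300.0)
print(f"Si band gap at 300 K: {eg_si_300k:.3f} eV")  # close to the usual ~1.12 eV
```

The formula predicts nothing until the fitted constants are supplied, which is exactly the pattern described above: the theory inspires the functional form, experiment supplies the numbers.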
However, the fact that one must develop custom empirical models for every kind of application, instead of being able to derive them from what are believed to be the universal laws of quantum physics, demonstrates that those laws, as we can currently use them, are not good enough.
We can live and progress very well with what we have. But if someone discovered a better theory, or a mathematical strategy for obtaining solutions, that could compute the parameters we must now measure and could model everything we need with guarantees that the model is adequate, that would be a great advance in physics.
> The fundamental laws of chemistry have not been changed much by quantum physics, they just became better understood and less mysterious. Quantum mechanics has explained various cases of unusual chemical bonds that appeared to contradict the simpler rules that were believed to be true before the development of quantum physics, but not much else has practical importance.
Um, false? The fundamentals of chemistry are about electron orbitals (especially the valence ones) and their interactions between atoms to form molecules. All of my college chemistry courses delved somewhat into quantum mechanics, with the biggest helping in organic chemistry. And modern computational chemistry is basically modeling QED as applied to atoms.
Hydrogen-like atoms and ions, with a single electron, are the exception that proves the rule: anything more complex cannot be computed accurately.
The spectrum of hydrogen (ignoring the fine structure) could be computed with the empirical rules of Rydberg before the existence of quantum physics. Quantum physics has just explained it in terms of simpler assumptions.
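To make the point concrete, the pre-quantum Rydberg formula already predicts the hydrogen lines from a single empirically fitted constant; a short sketch:

```python
def rydberg_wavelength_nm(n_lower, n_upper, r_inf=1.0973731568e7):
    """Hydrogen line wavelength from the empirical Rydberg formula:
        1/lambda = R * (1/n_lower^2 - 1/n_upper^2)
    R (per meter) was fitted to measured spectra decades before
    quantum mechanics; quantum theory later derived it from simpler
    assumptions."""
    inv_lambda = r_inf * (1.0 / n_lower**2 - 1.0 / n_upper**2)
    return 1e9 / inv_lambda  # convert meters to nanometers

# Balmer series (visible lines), transitions n -> 2:
for n in (3, 4, 5):
    print(f"{n} -> 2: {rydberg_wavelength_nm(2, n):.1f} nm")
```

The first line printed is the well-known H-alpha line near 656 nm; the formula needed no quantum mechanics at all, only the fitted constant R.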
Quantum physics explains a great number of features of the atomic spectra, but it is unable to compute anything for complex atoms with an accuracy comparable with the experimental measurements.
The QED calculations with "14 digits" of precision are for things that are far simpler than atomic spectra, e.g. for the gyromagnetic ratio of the electron, and even for such things the computations are extremely difficult and error-prone.
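For scale: the leading QED contribution to the electron's anomalous magnetic moment is Schwinger's one-loop result a = alpha/(2*pi), and the famous many-digit precision comes from adding hundreds of higher-order diagrams to this single term. A quick comparison (constant values are standard published figures):

```python
import math

ALPHA = 1.0 / 137.035999  # fine-structure constant (standard value)

# Schwinger's 1948 one-loop QED result for the electron anomaly:
a_leading = ALPHA / (2.0 * math.pi)

# Measured value of a_e = (g-2)/2, for comparison:
a_measured = 1.15965218e-3

print(f"leading QED term: {a_leading:.8f}")
print(f"measured a_e:     {a_measured:.8f}")
# The one-loop term alone already agrees to about three significant
# digits; every further digit costs vastly more diagrams.
```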
Lattice QCD can by now actually calculate the masses of the proton and neutron from first principles pretty accurately.
This is of course a brute-force approach. We currently lack, in all fields, a theory of emergent properties, and the mass of the proton is definitely such a property.
There have been claims about this, starting with "Ab Initio Determination of Light Hadron Masses" (Science, 2008).
Nevertheless, until now I have not seen anything that qualifies as "computing the masses".
Research papers like that do not contain enough information for someone else to verify their claims. Moreover, such papers are much more accurately described as "fitting the parameters of the Standard Model, such as quark masses, to approximately match the measured masses" than as actually computing the masses.
The published results for hadron masses are not much more accurate than what you could compute mentally, without any QCD, much less Lattice QCD, by estimating approximate quark masses from the quark composition of the hadrons and summing them. What complicates the mass computations is that while the heavy quarks have masses that do not vary much, the effective masses of the light quarks (especially u and d, which compose the protons and neutrons) vary a lot between different particles. Because of this, there is a very long way from a vague estimate of the mass to an accurate value.
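The "mental estimate" above can be made explicit with so-called constituent quark masses, which are themselves empirical effective values (roughly 340 MeV for u and d in light baryons), not the Standard Model's input quark masses. A sketch under those assumed values:

```python
# Effective ("constituent") quark masses in MeV -- empirical values
# fitted to light-hadron data, NOT the Standard Model's much smaller
# current-quark masses (u ~ 2 MeV, d ~ 5 MeV):
CONSTITUENT_MASS = {"u": 336.0, "d": 340.0}

def naive_mass(quarks):
    """Back-of-the-envelope hadron mass: sum of constituent masses."""
    return sum(CONSTITUENT_MASS[q] for q in quarks)

print(f"proton (uud):  ~{naive_mass('uud'):.0f} MeV (measured 938.3)")
print(f"neutron (udd): ~{naive_mass('udd'):.0f} MeV (measured 939.6)")
```

The naive sum lands within roughly ten percent of the measured values, which is the point being made: closing the remaining gap from first principles, rather than by fitting, is where the real difficulty lies.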