ALPHA observes light spectrum of antimatter for first time (phys.org)
199 points by dnetesn on Dec 19, 2016 | 82 comments


> Antihydrogen is made by mixing plasmas of about 90,000 antiprotons from the Antiproton Decelerator with positrons, resulting in the production of about 25,000 antihydrogen atoms per attempt. Antihydrogen atoms can be trapped if they are moving slowly enough when they are created. Using a new technique in which the collaboration stacks anti-atoms resulting from two successive mixing cycles, it is possible to trap on average 14 anti-atoms per trial

The amount of work that goes into producing 14 anti-hydrogen atoms is astonishing. It's simultaneously the height of human technical accomplishment, yet it yields vanishingly small quantities of the simplest element possible.


To help appreciate the difficulty: antimatter can only be created from matter by smashing particles together with enough energy to spontaneously produce matter-antimatter pairs (more or less). The energy needed to create an antiproton is about a billion electron volts, while the binding energy holding a hydrogen atom (or anti-atom) together is just 13.6 electron volts. So you have to cram this fantastic amount of energy into individual particles, then cool them down by about eight orders of magnitude without touching them (to avoid annihilating them).
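The gap between those two energy scales can be sanity-checked in a few lines (round textbook values, not figures from the article):

```python
# Rough scale of the problem: creating an antiproton costs about its
# rest-mass energy (~938 MeV, "about a billion eV"), while antihydrogen's
# binding energy is only ~13.6 eV.
antiproton_rest_energy_ev = 938e6   # ~rest-mass energy of the (anti)proton, eV
binding_energy_ev = 13.6            # ionization energy of (anti)hydrogen, eV

ratio = antiproton_rest_energy_ev / binding_energy_ev
print(f"creation/binding energy ratio: {ratio:.1e}")  # ~7e7, i.e. ~8 orders of magnitude
```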


If you enjoyed that, you may also enjoy the discussions of how people synthesize super heavy elements, such as in this video: https://www.youtube.com/watch?v=z3oY-XHwss8

Superheavy elements even beat antihydrogen on this scale; one of the ones recently named was confirmed to have been synthesized based on only three atoms!


Yes, it's amazing work. Also, don't assume that because antimatter is so difficult to produce today it can't be scaled immensely. Plutonium production was scaled from nanogram levels to ton levels in less than a decade.


Well... 14 anti-hydrogen atoms is a lot less than a nanogram. It's roughly 2 x 10^-14 nanograms, a few hundredths of a trillionth of a nanogram.

If we scale up production in the next decade at the same pace you're describing, we might get to nanogram levels.
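A quick back-of-envelope check of that figure, using the rough textbook mass of a hydrogen atom:

```python
# Mass of 14 antihydrogen atoms, expressed in nanograms.
atom_mass_g = 1.66e-24          # ~mass of one (anti)hydrogen atom, grams
trapped_atoms = 14

mass_ng = trapped_atoms * atom_mass_g / 1e-9   # grams -> nanograms
print(f"{mass_ng:.1e} ng")                     # ~2.3e-14 ng
```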


> we might get to nanogram levels

Right! Now we're talking about some antimatter!

Kidding of course. But there is no reason to believe ab initio that it could not be scaled to whatever quantity is useful given enough incentive for that use.


A nanogram of antimatter represents about 90 kilojoules of mass-energy waiting to be released if it touches any regular matter (about 180 kilojoules counting the equal mass of ordinary matter it annihilates with). Not the biggest bomb in the world, but something to be treated with at least mild respect.
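That figure is just E = mc^2; a rough check with round constants:

```python
# Energy released when a nanogram of antimatter annihilates, via E = m c^2.
# The annihilating ordinary matter contributes an equal mass, so the total
# release is twice the antimatter's own mass-energy.
c = 2.998e8                     # speed of light, m/s
m_antimatter = 1e-12            # one nanogram, in kg

e_antimatter_kj = m_antimatter * c**2 / 1e3     # ~90 kJ from the antimatter alone
e_total_kj = 2 * e_antimatter_kj                # ~180 kJ counting both partners
print(f"{e_antimatter_kj:.0f} kJ, {e_total_kj:.0f} kJ total")
```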


In case anyone is wondering what antimatter actually is: antimatter is basically like normal matter, but with the electrical charges reversed. Take hydrogen, for example. It is a massive, positive particle (the proton) orbited by a light, negative particle (the electron). Anti-hydrogen is a massive, negative particle (the antiproton) orbited by a light, positive particle (the positron) [1].

Why do matter and antimatter annihilate each other? Matter and antimatter are very similar, except for their opposite electrical charges. Opposite charges attract. If antihydrogen meets normal matter, the positrons quickly find and combine with the electrons, and the antiprotons do the same with protons. These combinations all release radiation. Therefore, one of the main challenges in studying antimatter is making sure that it doesn't touch normal matter, which is basically everything in the lab.

Why don't electrons and protons normally combine, being oppositely charged? Electrons sit in stable orbitals, volumes of space with associated energies. These orbitals are like valleys: while an electron nearing the nucleus will fall into an orbital, it would take extra energy to get the electron out of the orbital and into the nucleus [2].

[1]: https://en.wikipedia.org/wiki/Antihydrogen

[2]: http://physics.stackexchange.com/questions/30939/what-keeps-...


"Falling into the nucleus", as stated in that Stack Exchange question, is a classical (as in "wrong" :) ) way to look at it.

The 1s orbital of the electron is spherically symmetric, with a non-zero probability of the electron being right at the center of the atom (so within the nucleus), and decreasing probability of it being farther away. So e.g. a good approximation of the He atom is two neutrons, two protons, and two electrons, all in exactly the same position (with the electrons having "more spread").
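A small sketch of that point, using the textbook hydrogen-like 1s wavefunction in atomic units (a0 = 1):

```python
# Hydrogen-like 1s wavefunction: psi(r) = (1/sqrt(pi a0^3)) * exp(-r/a0).
# Its probability density |psi|^2 is largest at r = 0, i.e. right at (and
# inside) the nucleus; it only falls off exponentially with distance.
import math

a0 = 1.0  # Bohr radius, in atomic units

def density_1s(r):
    """|psi_1s(r)|^2 for a hydrogen-like atom (a0 = 1)."""
    psi = math.exp(-r / a0) / math.sqrt(math.pi * a0**3)
    return psi * psi

print(density_1s(0.0) > density_1s(a0))  # True: the density peaks at the center
```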

At the end of the day, there are interactions that are seen in Nature and interactions that aren't. We codify the patterns as conservation of lepton number, baryon number, etc. If the numbers of what you have add up to zero, they can annihilate.

These conservation laws also accommodate other weird things, like beta decay, in which a neutron decays ("splits") into a proton, an electron, and an anti-neutrino. It's not that the neutron is "composed" of those three things. But its rest energy is higher than the sum of the rest energies of those three particles: whenever that's the case and the conservation laws allow it, we see it happen naturally.
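The energy bookkeeping for beta decay can be checked directly from the standard rest masses (values in MeV/c^2, rounded; the antineutrino's mass is negligible):

```python
# Beta decay is energetically allowed because the neutron's rest energy
# exceeds the sum of the products' rest energies.
m_neutron = 939.565    # MeV/c^2
m_proton = 938.272     # MeV/c^2
m_electron = 0.511     # MeV/c^2

q_value = m_neutron - (m_proton + m_electron)
print(f"Q = {q_value:.3f} MeV")   # ~0.78 MeV shared among the decay products
assert q_value > 0                # positive -> the decay can occur spontaneously
```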


As someone who hasn't studied this (yet) does that mean that the reason protons and electrons don't annihilate is because their Baryon Number wouldn't sum up to zero? (With Protons having 1 and Electrons 0 if I understood correctly)


That's what we currently observe, yes. The particles of matter in the Standard Model are quarks (that join to form protons and neutrons among other things) and leptons (electrons, muons, taus, and neutrinos). The number of {quarks - antiquarks}, as well as that of {leptons - antileptons}, is conserved.

If you start with a proton and an electron, you have 3 quarks and 1 lepton, so whatever the interaction you have to end up with 3 quarks and 1 lepton (plus any number of quark-antiquark and lepton-antilepton pairs). E.g. for beta decay:

neutron (3 quarks) -> proton (3 quarks) + electron (lepton) + antineutrino (antilepton)
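That accounting can be sketched as a toy bookkeeping check (the particle contents are the standard ones; the code itself is just illustrative):

```python
# Count (quarks - antiquarks) and (leptons - antileptons) on each side of
# a reaction; both net numbers must be conserved.
particles = {
    # name: (net quark number, net lepton number)
    "neutron":      (3, 0),
    "proton":       (3, 0),
    "electron":     (0, 1),
    "antineutrino": (0, -1),
}

def totals(names):
    """Sum the net quark and lepton numbers for a list of particles."""
    q = sum(particles[n][0] for n in names)
    l = sum(particles[n][1] for n in names)
    return q, l

before = totals(["neutron"])
after = totals(["proton", "electron", "antineutrino"])
print(before, after)   # (3, 0) (3, 0): both numbers conserved in beta decay
assert before == after
```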


Neutron has a baryon number of 1 and so it is conserved in your example. Even in normal matter, some elements undergo Electron Capture (p + e- -> n + ν).

But baryon number conservation is an asserted symmetry, and there is no fundamental reason it must hold. Finding proton decay would demonstrate that it does not, but so far there is no evidence that protons (or bound neutrons) spontaneously decay.


> At the end of the day, there are interactions that are seen in Nature and interactions that aren't.

That's provably false. Every reaction occurs in nature, even the vanishingly improbable ones, just at an extremely low reaction coefficient. They're still there, just at a probability of 10^-14 or whatever.


What? There are interactions that can never occur because they're forbidden by the laws of physics - you'll never violate conservation of [energy, baryon number, momentum, etc.] even with probability 10^-14.


Sure, technically correct is the best kind of correct I suppose.

But if the reaction is the least bit reversible (and most are) then the reverse reaction is still proceeding even though the forward reaction is stronger. It's just doing so at an incredibly low rate, and the products are likely to be immediately converted by the forward reaction.

You are looking at averages and claiming they hold for every single event in a stochastic simulation. Taking the example of entropy - it's not impossible that entropy decreases in a system, it's just less likely than it increasing. On a stochastic level, entropy decreases all the time, it's just that on average it increases more than it decreases.

Reactions work the same way. You're not actually making chemicals react as a singular act, you're creating a forward reaction that occurs more rapidly than the reverse reaction.

If we had a hypothetical chemical Maxwell's Demon - you could "bottle" up the tiny bits of those outputs from the reverse reaction before they underwent the forward reaction again.

https://en.wikipedia.org/wiki/Maxwell's_demon


You're assuming one can expect the knowledge of chemical reactions to apply to particle interactions, and that is not the case.

For example particle accelerators can shoot particles against some target one by one, and so there's no need to look at averages. Some interactions never occur, to the best of our measuring ability (i.e. highest energies and number of repetitions). Some others are right away forbidden by laws much stronger than entropy in thermodynamics, in the sense that they're not averages, but mathematical derivations off the symmetries of the universe.


I'm very interested in seeing that proof.


> The concentrations of reactants and products in an equilibrium mixture are determined by the analytical concentrations of the reagents (A and B or C and D) and the equilibrium constant, K. The magnitude of the equilibrium constant depends on the Gibbs free energy change for the reaction.[2] So, when the free energy change is large (more than about 30 kJ mol−1), then the equilibrium constant is large (log K > 3) and the concentrations of the reactants at equilibrium are very small. Such a reaction is sometimes considered to be an irreversible reaction, although in reality small amounts of the reactants are still expected to be present in the reacting system. A truly irreversible chemical reaction is usually achieved when one of the products exits the reacting system, for example, as does carbon dioxide (volatile) in the reaction

https://en.wikipedia.org/wiki/Reversible_reaction

Assuming you do not outright lose some reactants from the system, the reverse reaction is still occurring. However, because the reaction constant is so small, the resulting product is highly likely to undergo the forward reaction essentially immediately. Still, it will be present in some equilibrium, just an incredibly low one, set by the ratio of the reaction rate coefficients. You'll have 10^14 times as much of the forward reaction, or whatever.
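The quoted rule of thumb connects the free-energy change and the equilibrium constant via Delta G = -RT ln K; a quick check at room temperature:

```python
# For a favorable free-energy change of ~30 kJ/mol at room temperature,
# the equilibrium constant K is large (log10 K well above 3), so only
# traces of the reactants remain at equilibrium.
import math

R = 8.314       # gas constant, J/(mol K)
T = 298.0       # room temperature, K
delta_g = 30e3  # favorable free-energy change, J/mol

K = math.exp(delta_g / (R * T))
print(f"log10 K = {math.log10(K):.1f}")   # ~5.3, comfortably above 3
```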


Chemical reactions, ignoring those with nuclear interactions, are bound only by the law of conservation of energy and the second law of thermodynamics (entropy increase). A chemist looks at them on the macro scale, in which macroscopic properties like temperature and pressure are set, and thus each molecule's energy isn't a fixed value (instead being drawn from a distribution, per the Maxwell-Boltzmann probability distribution). This, and the probabilistic nature of the law of entropy, leads to all allowed reactions happening somewhere at the micro scale, even if extremely rarely.

If you look at the micro scale, though, and instead of having a soup of molecules you deal with single molecules, whether a reaction is possible or not isn't a probabilistic thing anymore. The energies your molecules have are actual numbers, and if they don't add up, you won't have a reaction.


And what if there is a micro-scale reversal of entropy, as I previously mentioned? Let's say a collision of two electrons, or an electron with a surrounding gas molecule, that results in a sudden increase in orbital energy of an electron (plus another particle losing all its energy of course). Vanishingly unlikely of course, but with an uncountable number of collisions occurring every instant it will occur somewhere, sometime.

You say that it's "observable" with single-particle experiments. Are you sure you've performed enough of those experiments to unconditionally guarantee that such situations will never under any circumstances occur?

As an example of how you can be wrong on this: since the 1940s, Bi-209 was believed to be stable. However, more recent research determined that it is very slightly unstable, with a half-life of approximately 2 x 10^19 years. That's more than a billion times the current estimated age of the universe. It's not stable at all; it just had unmeasurably low amounts of instability.
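Even a half-life that long yields measurable activity in a bulk sample, which is how such decays get detected at all (the measured value for Bi-209's alpha decay is roughly 2 x 10^19 years):

```python
# Expected decays per gram per year for Bi-209, given its half-life.
import math

half_life_yr = 2.0e19           # ~measured half-life of Bi-209, years
avogadro = 6.022e23             # atoms per mole
molar_mass = 209.0              # g/mol for Bi-209

atoms_per_gram = avogadro / molar_mass
decay_const = math.log(2) / half_life_yr          # decay probability per year
decays_per_gram_year = decay_const * atoms_per_gram
print(f"{decays_per_gram_year:.0f} decays/g/yr")  # ~100: rare, but detectable
```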

If you sat around looking at a single-molecule sample of that reaction, you would see absolutely nothing, unless you had a few trillion years to sit around. That particular experiment doesn't exhibit the behavior you're trying to model. It's like saying that because Newton's laws of motion adequately explain everything on Earth, they could never be superseded by General Relativity.

I don't accept that other physical reactions categorically cannot occur at similarly improbable rates. The decay of smaller "stable" atoms, for example, may occur with a half-life of 10^100 or 10^1000 years that is simply beyond our current ability to measure.

With a sufficient number of simulations you can come up all heads any arbitrary number of times that you want to name. You can even roll a perfect 20 on a D20 any number of times. Even something like spontaneous fission is theoretically possible; it's just vanishingly unlikely.


> And what if there is a micro-scale reversal of entropy, as I previously mentioned? Let's say a collision of two electrons, or an electron with a surrounding gas molecule, that results in a sudden increase in orbital energy of an electron (plus another particle losing all its energy of course). Vanishingly unlikely of course, but with an uncountable number of collisions occurring every instant it will occur somewhere, sometime.

Sorry, I'm not sure what you mean by this.

> You say that it's "observable" with single-particle experiments. Are you sure you've performed enough of those experiments to unconditionally guarantee that such situations will never under any circumstances occur?

I said (in the other reply) it's observable "to the best of our measuring ability" (repetitions, and energy), precisely because we'll never reach 100% certainty.

But your wrong claim wasn't "things that we don't know yet might be happening", rather "all particle interactions imaginable are happening all the time, just with a small probability". The former is tautologically true. The latter is just pseudoscience, as it is:

1. Unfalsifiable: the more we keep measuring that only some interactions happen in Nature, the further you'd just push that small probability.

2. Without predictive power.

It's not like superseding Newton's laws of motion with GR. It's like saying "beyond the speeds and masses we've observed, objects are free from any laws of motion whatsoever". And furthermore adding that it can be proved.

And that's only for laws like conservation of quark and lepton numbers. For conservation of energy and momentum, the prohibition is much stronger: Noether proved mathematically that they're another way of saying the laws of Nature are the same today as yesterday, and the same here as a meter away. Claiming they're being broken all the time is the same as saying the laws of physics are different all over the place, and from one moment to the next. The slightest evidence or proof of something of the sort, and we pretty much start all physics from scratch :)


In no way have I ever claimed that conservation of momentum is violated. I've specifically disclaimed that fact - with the caveat that dice are played constantly on a galactic scale, and anything that is stochastically possible will eventually occur - even at probabilities as absurdly low as Bi-209's decay. I also happen to think that smaller molecules may also decay, and that you just haven't happened to observe enough single-particle data to observe an event that occurs at 10^-1000. Sue me.


Yes you did, and that was the whole point of the thread?

>> At the end of the day, there are interactions that are seen in Nature and interactions that aren't.

> That's provably false. Every reaction occurs in nature, even the vanishingly improbable ones, just at an extremely low reaction coefficient. They're still there, just at a probability of 10^-14 or whatever.

To the best of our knowledge, not every reaction occurs in Nature, and chemical equilibrium has nothing to do with it. The patterns we observe as to which can occur and which can't, we call conservation laws. You might want to accept that or not; doesn't change what the experiments output. And of course if there was proof of the contrary you'd be onto something very very big.


> Today's ALPHA result is the first observation of a spectral line in an antihydrogen atom, allowing the light spectrum of matter and antimatter to be compared for the first time. Within experimental limits, the result shows no difference compared to the equivalent spectral line in hydrogen. This is consistent with the Standard Model of particle physics


>consistent with the Standard Model

Is generally one of the most disappointing phrases one can read in science news. Hooray for progress and all, but everybody is hoping for exactly the opposite result.


Maybe we're running out of surprising things.


Lord Kelvin said something similar in 1900 about there being nothing left to discover. Within 30 years quantum mechanics would be discovered/created.


At that time everyone knew there was something up with blackbody radiation; classical physics gave no hint as to why the equipartition theorem should stop applying at short wavelengths.


And now we know something's up with gravity, because we can't seem to reconcile it with quantum mechanics.


I know, and the Lord Kelvin quote is too suspicious to be credible, a man like him would never have said such a thing.

If we can trust Wikipedia he never did: https://en.wikiquote.org/wiki/William_Thomson#Misattributed


Didn't Isaac Newton die thinking that he solved all of physics though?


Newton was the first to experiment with the light reflectance of glass of various thicknesses. He found the relationship but had no idea how a light corpuscle could know the thickness of the glass when it decided whether to reflect or not. Newton published his corpuscular theory of light, but I certainly don't think he would have claimed to know what was going on.


Our current theory of gravity doesn't even explain how galaxies form and rotate unless over a quarter of the universe is made of unspecified "dark matter." We are children playing on the shores of undiscovered oceans.


I'm with you, I'm uncertain we even have the right question...


Now we know something is up with dark energy and early inflation of the universe, essentially fudge-factors to make certain things appear nice.


We are probably just looking in the wrong places.


I remember Feynman said to check the spin of the photons emitted from alien species. If they're 90 degrees out of phase with ours then they're made of antimatter and we shouldn't shake hands. (At least I think it was spin, been a while since I read QED)


Actually his story was about trying to explain the difference between left and right in terms of nuclear spin experiments. You also explain that humans greet by shaking their right hands. If you meet the alien and he extends his left hand to shake, he must be made of antimatter (since the combination of parity and charge conjugation is a symmetry of the universe).

edit: jess is correct, and that's the whole point of the story actually. if CP was invariant you wouldn't be able to tell they were antimatter by describing the nuclear physics decay experiment. instead the antimatter version of the parity experiment gives the opposite result. I shouldn't try to physics so early in the morning :)


> combination of parity and antimatter conjugation is a symmetry of the universe

It's only an approximate symmetry. Look up CP symmetry violations.


CP symmetry violation is the reason that the universe appears to be made more of matter than antimatter. Though if I'm not mistaken, the scale of the violations that we have seen, and the violations that must exist in order for the observable universe to be mostly matter aren't in agreement. We don't see enough CP violating processes to explain this asymmetry in the universe.

One of the big open problems in Physics.


CPT still holds, though, right?


Yes


Assuming a two handed alien of course.


In Feynman's story, the "alien" we meet is an artificial human (a biological Locutus, if you will) built using a description we sent them rather than a coincidentally humanoid life form. For a throw-away illustration taking only a couple of minutes in the middle of a lecture about symmetry, the guy filled in a lot of the obvious gaping holes.


And also attached to different sides, not in a vertical row straight up the front of the creature, for instance... (When asked "What do you look for in a woman?" the Mathematician answers "Bilateral symmetry, of course!")


I'm just proud when a physicist assumes a cow's not a perfect spheroid.


Many handed


But only bilaterally symmetrical. Radially symmetric aliens with hands give no indication!

(Well, aside from whether or not they blow up whatever local matter surface they're standing on.)


That's neither necessary nor sufficient! Notice that they could be bilaterally symmetric with two arms/hands positioned vertically on their front (anterior?), so no left/right hand distinction (but top/bottom instead); or they could be asymmetric but have two hands/arms on one side and one on the other, still giving them a left/right side choice.


Would antimatter interact with gravity opposite to how matter does, prohibiting an alien (or anything) from being made of antimatter?


No, it interacts identically.

This is a good question - so much so that there was an experiment[0] done at CERN before this one to find the answer. You are in good company asking this, despite downvotes!

[0] https://en.wikipedia.org/wiki/Gravitational_interaction_of_a...


So, if antimatter has the same light spectrum as matter, what makes us theorize that there is currently more matter than antimatter in the universe?


If you're asking how do we know there is more matter than antimatter in the universe the answer is that it is consistent with the observations/calculations surrounding Big Bang nucleosynthesis and CMB anisotropies. If you're asking what do we theorize is the reason there is more matter than antimatter in the universe there isn't a satisfactory answer yet. It's one of the biggest open problems in physics.


If there were more antimatter than theorized, we'd expect to see much more radiation due to matter-antimatter annihilation. If we don't see this, then there must be a heavy bias in one direction or the other.


Dumb question: Is it easier to create antimatter if you already have some? I mean using it as a nucleation point, or perhaps energy source.


Nope. It's a sensible question, but it doesn't help.


It would be easier if you could build your entire experimental apparatus out of antimatter, but if you could do that you're probably beyond making experiments with single atoms of it... ;-)


I would not want to be in charge of filing the Environmental Impact Statement for such a facility.


Yeah, you should see the paperwork they make us fill out for something as simple as a few hundred lbs of mercury.


:)


Simply amazing. I still thought antimatter was a theory. Can anyone recommend a book on this topic for the layman?


It is a theory, like everything else in physics, including the workings of gravity.

It's a bit of a shame that in elementary school and high school, people are taught "facts" such as Newtonian physics without also exploring some of the weirdness of quantum physics that challenges classical physics. I.e. I had no idea about all the controversy around how gravity "actually" works until I read a book on string theory - I went 25 years just assuming "yup, gravity is a thing and we understand 100% how it works."

Sorry, I don't have a book to recommend, just babbling away.

EDIT: Maybe the string theory book I read would be a good place, actually? I can't remember if it delves much into anti matter - The Elegant Universe: Superstrings, Hidden Dimensions, and the Quest for the Ultimate Theory (Paperback) https://www.goodreads.com/book/show/8049273-the-elegant-univ...


>"I had no idea about all the controversy around how gravity "actually" works until I read a book on string theory"

FYI, there seems to be quite a bit of controversy regarding whether string theory is even science:

>'Many of today’s theorists — chief among them the proponents of string theory and the multiverse hypothesis — appear convinced of their ideas on the grounds that they are beautiful or logically compelling, despite the impossibility of testing them. Ellis and Silk accused these theorists of “moving the goalposts” of science and blurring the line between physics and pseudoscience. “The imprimatur of science should be awarded only to a theory that is testable,” Ellis and Silk wrote, thereby disqualifying most of the leading theories of the past 40 years. “Only then can we defend science from attack.”'

https://www.quantamagazine.org/20151216-physicists-and-philo...


I think it's not correct to hold this kind of physics to the standard of science that other disciplines use.

The standard, scientific-method version of science is "guess a model of how the world works, then run an experiment to see if it's true". This packages (hides, even) a bunch of principles of rational thought inside of it - for instance, that a stance about how the world works has to be able to be wrong, and that you should pick your opinions about how the world works based on what you can (repeatedly) demonstrate.

But there are other paths to knowledge- and revelation- gaining that are performed by scientists all the time, yet don't fit this model. It's perfectly legitimate to get a grant to run an experiment to just look at something closely, such as a star or a blank patch of sky, or a material, or an organism. 'I want to collect data on X' is perfectly legitimate as a way to learn about the world. After all you need observations about something in order to build the initial model that you use to generate hypotheses in the Scientific Method (tm) anyway. Another example: sometimes experiments are done just to find more accurate readings of numerical constants.

Anyway, mathematicians and the more theoretical physicists aren't really looking to run experiments to test hypotheses. Instead their 'experiments' are finding new models for looking at things and their 'results' are finding new mathematical statements, or finding ways to prove things that were previously hard to prove, or just finding new ways of looking at things that make thinking about them easier. This is 'output', and a net gain in human knowledge, without being a testable hypothesis, and I think that's fine. It's still subject to the underlying rationalism behind science. But validation is entirely theoretical: a good theory makes things make sense, and doesn't make things not make sense in ways that disagree with physical experiments, and makes things simpler and better. And it's fine that these criteria are abstract and to an extent subjective.

Of course it's still necessary to have a way to say whether theorists are failing, or wasting their time, or producing too little or too low-quality output, and I don't know how that's done or how it ought to be done. But it doesn't bother me that they don't produce physically testable results.


I think it is important to keep in mind that theories can sometimes only become testable after sometimes unforeseen advances in technology. Consider the Higgs field, theorized in 1964 [1]. This field was only detected, via the Higgs boson, in 2012 [2]. The Large Hadron Collider (LHC) [3] and its computing grid, which detected the Higgs boson, generated 25 PB of data per year, had 150 PB storage, and 200,000 processing cores [4] (which I assume were at least 1 GHz on average). In the 1960s, the CDC 7600 supercomputer (designed by Seymour Cray) ran at 36.4 MHz with 65 kword memory [5], and IBM produced the 1311 HDD with about 12 MB storage [6]. Thus, the LHC required advances in fiber optic cables, a 5-million-times increase in computing over existing state-of-the-art processors, an 89-million-times increase in computer storage (from 12 MB to assuming 1 TB per HDD), and integrated-circuit RAM. More directly, the 1971 ISR hadron collider had energies of 31.5 GeV [7] compared to the LHC's 6.5 TeV [8], a roughly 200x increase.

What other theories might become testable if we could reach 130 TeV, analyzed by a computing grid with 1 trillion processors and 1 yottabyte of storage?

[1]: https://en.wikipedia.org/wiki/1964_PRL_symmetry_breaking_pap...

[2]: http://www.smithsonianmag.com/science-nature/how-the-higgs-b...

[3]: https://en.wikipedia.org/wiki/Large_Hadron_Collider

[4]: https://en.wikipedia.org/wiki/Worldwide_LHC_Computing_Grid

[5]: https://en.wikipedia.org/wiki/CDC_7600

[6]: https://en.wikipedia.org/wiki/History_of_IBM_magnetic_disk_d...

[7]: https://en.wikipedia.org/wiki/Intersecting_Storage_Rings

[8]: https://en.wikipedia.org/wiki/Large_Hadron_Collider


Antimatter is old news. When CERN discovered the higgs boson a few years back, we did so by looking for the higgs decay products (the higgs is really unstable), a significant fraction of which are antimatter.

For example: One of the most important higgs signals was the Higgs -> 2 Z boson -> 2 electron + 2 anti-electron decay chain. Or in particle physics jargon H -> ZZ, Z -> e+e-. The important thing is that e+ is an anti-particle, one of the few that was discovered early enough to get its own name (the "positron" was discovered in 1932 [1]).

Since the positron discovery we've discovered so many anti-particles that we stopped giving them special names. We just call them e+, mu+, tau+, p-, etc, to say nothing of the composite particles that are composed of both matter and antimatter.

[1]: https://en.wikipedia.org/wiki/Positron#Experimental_clues_an...


Not only is it not just a theory, it's actually used in Medicine!

For example, PET Scans use positrons: https://en.wikipedia.org/wiki/Positron_emission_tomography


Not a theory at all. You get antimatter out of many common radioactive materials: https://en.wikipedia.org/wiki/Positron_emission

It's the building atoms out of it that's tricky, but antimatter is nothing new.


Not only that, but a PET scan (a common medical imaging technique, tangentially similar to an MRI or CT scan) relies on antimatter. The name stands for Positron Emission Tomography, in fact.

The general idea: You get injected with a tracer containing a β+ emitter. This produces positrons (antimatter) through radioactive decay. When the antimatter collides with regular matter inside your body, it annihilates, producing a pair of gamma rays moving in opposite directions. Those gamma rays can be detected and used to triangulate where the annihilation occurred, generating a 3D image of where the tracer has accumulated in your body.

Typically, the tracer will be something that looks like glucose to the body, so it's accumulated in areas of high metabolic activity. This allows us to see what parts of your body are active. (For example, seeing which neurons in your brain are firing.)

More reading: https://en.wikipedia.org/wiki/Positron_emission_tomography
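The reason PET scanners look for 511 keV photon pairs in particular: each annihilation photon carries the electron's rest energy, E = m_e c^2, and momentum conservation sends the two photons out back-to-back. A quick check with textbook constants:

```python
# Energy of each photon from electron-positron annihilation at rest.
m_e = 9.109e-31        # electron mass, kg
c = 2.998e8            # speed of light, m/s
eV = 1.602e-19         # joules per electron volt

photon_energy_kev = m_e * c**2 / eV / 1e3
print(f"{photon_energy_kev:.0f} keV")   # ~511 keV per photon
```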


And with practical applications as well: https://en.wikipedia.org/wiki/Positron_emission_tomography


According to the Wikipedia articles on the topic, antimatter (specifically, the positron) was first suggested as possible in 1928 (a consequence of the Dirac equation); the first observation of a positron was in 1929, and the linkage between theory and experiment was made in 1932.

Antimatter, then, predates the discovery of quarks.


Scientific theories are not hypotheses.


Right. Science uses the word "theory" in the same sense as music theory. A scientific theory is a collection of related ideas that form a robustly cohesive whole, with the power to explain some set of phenomena.


Theory literally means observation in Ancient Greek, look it up


We don't speak Ancient Greek.


> I still thought antimatter was a theory.

Your use of the word "theory" here seems to be a misnomer akin to the common statement "But X is just a theory."


Perhaps better to say "I thought [the creation of] antimatter was still theoretical." i.e. as opposed to practical?


[flagged]


I work at CERN. Before I worked there I worked on trapping "normal" atoms (as opposed to antimatter ones). So some of us do understand this stuff.


I understood the article, and I'm not a physicist or anything like that. And I suspect almost everyone else here who read it understood it too. It's not uncommon for HN readers to be interested in how the Universe works, and therefore this wouldn't be the first physics article they've ever read.


You may want to reread the HN guidelines. And although I am not a physicist, some HN readers are.


Reading is how you learn.


Reading real physics, not "popular science" articles.



