I followed the links to the CMS, ATLAS and other detectors that the images appear to be from, but couldn't find any information on what these images actually mean. From the layman's descriptions (and a layman I most certainly am), I assume that the images show where along the "surface", if there is such a thing, of the detector various particles were detected, and that the length of the bars shows either how much energy the particle(s) have or the density of particles at that point.
Is there anywhere I can go to get a more detailed description of these without delving into academic papers that I have no hope of understanding?
A detector is typically a series of concentric cylinders, with the beam pipe, where the collisions occur, running through the center. The inner layers are tracking chambers, which detect the paths of charged particles. This is what produces all the curved lines radiating from the center.
The outer layers are calorimeters, which catch particles and measure their kinetic energy. As you correctly assumed, these produce the bar plots. Often there will be one layer of calorimeters for photons and electrons, and a second for hadrons (protons, mesons, etc.).
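To make the "curved lines" part concrete: the tracker sits inside a strong solenoid magnet, and the radius of a track's curve encodes the particle's transverse momentum via the standard relation pT [GeV/c] ≈ 0.3 · q · B[T] · r[m]. A toy Python sketch of that relation (the 3.8 T field value is CMS-like; the function itself is just my illustration):

    # Toy illustration of how curved tracks encode momentum: a charged
    # particle in a solenoidal field B bends with radius r, and its
    # transverse momentum follows pT [GeV/c] ~ 0.3 * q * B[T] * r[m].

    def pt_from_curvature(radius_m, b_field_t=3.8, charge=1):
        """Transverse momentum (GeV/c) of a track bending with radius radius_m."""
        return 0.3 * charge * b_field_t * radius_m

    print(pt_from_curvature(1.0))    # tight ~1 m curl: ~1.1 GeV/c
    print(pt_from_curvature(100.0))  # nearly straight track: ~380 GeV/c

So the tight spirals in an event display are low-momentum particles, and the near-straight lines are the high-momentum ones.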
Very helpful, thank you. Do you know if they have the ability to plot the various events of the collision by time, such that the bar plots could be shown appearing one after another as collision events are recorded?
I'm not familiar with the nitty-gritty of the LHC experiments, but in general the various detector subsystems have different response times and rate capabilities. There are certain types of detectors with explicit time granularity (e.g. time of flight) but for most detectors there would be no time structure within a single collision record ("event").
But you could work backwards to make an animation for each event. From a scientific perspective it's not that interesting though.
You've got the right general idea. It's a tricky thing to simultaneously display three-dimensional tracks and the amount of energy deposited in each detector system.
I'm surprised that ATLAS and CMS don't have a better viewer's guide immediately at hand, but perhaps this article will help:
I've worked at two accelerator facilities (RHIC and CEBAF) writing software for doing particle tracking and pseudo-data generation, and I always wished that they could divert more time into preparing articles like the one you linked. They're super helpful to laypeople, especially family and friends, but they're also great for getting students at universities interested in maybe coming to pitch in.
Sadly, the lab personnel (especially those coordinating outreach) are generally crazy overworked and don't have a lot of time, and the interns are just too inexperienced and lack the requisite ___domain knowledge to do it themselves.
If you ever feel like writing a paper on this, I highly recommend Ludwik Fleck as a starting point (his writing on thought collectives).
At least that's where I started when I needed a concept of a marketplace for ideas and of why laypeople matter.
Thanks! This is exactly what I was looking for. I may be missing something, but I found it odd that the CERN website didn't have any more detailed breakdowns of the experiments and setups like the article you linked.
The amount of data that comes out of the different detectors on the LHC is staggering. It takes a fair amount of post-processing to get meaningful results from collisions, and there can be millions of collisions in a run. There's some more information about how they process these results on the LHC computing grid page: http://wlcg-public.web.cern.ch/.
When I did my traineeship at ATLAS DAQ around 10 years ago, one issue was how to pre-process the data fast enough to identify what could be considered invalid, so it could be thrown away before it arrived at the cluster.
Speaking of the ATLAS experiment: proton bunches in the LHC cross and collide 40 million times a second, but "only" around 100 events per second are stored for later analysis. The LHC grid handles those 100; three levels of trigger (one hardware and two software levels) are used to select the useful 100 out of the 40 million, in real time.
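For intuition, here's a toy Python sketch of the cascade idea: each level applies a cheap test, and only survivors reach the next, more expensive level. Every field, threshold and distribution below is invented purely for illustration; real trigger menus are vastly more elaborate.

    import random

    def l1_hardware(ev):      # coarse, fast test (e.g. total calorimeter energy)
        return ev["et_sum"] > 50.0

    def l2_software(ev):      # partial reconstruction in regions of interest
        return ev["n_high_pt_tracks"] >= 2

    def event_filter(ev):     # near-offline-quality reconstruction
        return ev["reco_mass"] > 100.0

    def fake_event():
        """Randomly generated stand-in for one bunch crossing."""
        return {
            "et_sum": random.expovariate(1 / 20.0),
            "n_high_pt_tracks": random.randint(0, 4),
            "reco_mass": random.expovariate(1 / 40.0),
        }

    n, kept = 1_000_000, 0
    for _ in range(n):
        ev = fake_event()
        # Short-circuit `and` mirrors the cascade: the expensive tests
        # only ever run on events the cheap tests already accepted.
        if l1_hardware(ev) and l2_software(ev) and event_filter(ev):
            kept += 1
    print(f"kept {kept} of {n} simulated crossings")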
I understood more after using a citizen science project where you classified particles by their trajectories using images like these. Basically, in case you hadn't figured out that part: the images are cross sections of the vacuum pipe and show the trajectories of the particles that come into existence in the collision of the accelerated particles.
The LHC is a big ring around which particles travel at high speed and, from time to time, collide with one another.
Given that the particles were not travelling around the ring 100 years ago, and will probably no longer be travelling around the ring 100 years from now, there needs to be a way of getting them in and out.
In (INJECT): beams of particles (travelling rapidly but in a straight line) are produced by other accelerators and then diverted into the LHC ring.
Out (DUMP): the particles are diverted out of the LHC ring, the exiting beam is spread out using electromagnets, and it is sent into a big block of graphite encased in concrete.
I don't know, I'm afraid. According to the Wikipedia page on the LHC, though, injection takes "several minutes", acceleration after that takes "20 minutes", and then the beams circulate for 5-24 hours while the detectors watch for products of collisions. (Beams plural because the LHC has two beams circulating in opposite directions.)
What does the raw data that generates these renders look like, or do their dashboards literally spit out images that look like that? I'm guessing their instrumentation can roughly figure out what energy/size/charge of particle is where, but the how just seems like magic to me.
This one bothered me around the time of the Higgs announcement.
I did some research and found a number of exciting and upsetting results:
The exciting:
* SCOAP3 (i)(ii): CERN's commitment to open-access publishing
* public data samples (iii)
* open data portal (iv)
The upsetting:
"LHC data are exotic, they are complicated and they are big. At peak performance, about one billion proton collisions take place every second inside the CMS detector at the LHC. CMS has collected around 64 petabytes (or over 64,000 terabytes) of analysable data from these collisions so far."
* exotic: requires special analysis tools that need VMs to install smoothly (v)
* big: by the nature of personal computing, it would be impossible for you to get all of the data, even on a single event
Why is this upsetting? It seems possible, even probable, that the data released will only be data that correlates with 'discovery', meaning there is little that can be done to aid the effort from home, and seemingly even less of a chance of finding counter-theories or errors.
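To put the "big" point in numbers, a back-of-envelope sketch in Python, assuming a generous 100 Mbit/s home connection (my assumption, not a quoted figure):

    # ~64 PB of analysable data vs. an assumed 100 Mbit/s home downlink.
    dataset_bytes = 64e15        # from the CMS quote above
    bandwidth_bps = 100e6        # assumption: 100 Mbit/s

    seconds = dataset_bytes * 8 / bandwidth_bps
    print(f"~{seconds / (3600 * 24 * 365):.0f} years of nonstop downloading")
    # -> roughly 160 years, and that ignores storage entirely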
If one had to explain to somebody who has absolutely no clue whatsoever what the LHC does and why, one could do worse than point them to this video. :)
Well, there's been some alarmism about AI recently, and in that same vein I'll admit it - even with a physics degree I'm uneasy about the LHC. By definition we don't know what will happen. We are generating kinetic energies that haven't been seen since literally the dawn of time. I would just be a lot more comfortable if these experiments were being done in space, far away from life. It's just that even 99.999% certainty that nothing catastrophic will happen really isn't high enough considering the stakes.
On the one hand, yes. There should always be a healthy concern for safety.
But keep in mind these sorts of things happen relatively frequently: https://en.wikipedia.org/wiki/Oh-My-God_particle
Furthermore, they have a really good idea of what's going to happen. With bowling, I can't tell you if I'm going to roll a strike or a split, but I'm sure I won't hit you with a bowling ball.
To put it another way, if there was some scary possible outcome, there would be evidence of that from the many particles that hit us regularly. Mini black holes sound scary, but they're really tiny and just evaporate because they're, well, mini. Nuclear reactions are well understood. If there was some other effect, we'd see it in the sun. Just think about how hard it is to get fusion power working. What are the chances of that happening accidentally?
So yes, you and I are right to be concerned; we're facing the unknown. But if you look into it, you'll see that for the folks doing the research, well, it's not unknown to them. They're looking for small details in well-understood systems. Maybe something bad will happen, but they're not shooting from the hip.
Re: "We are generating kinetic energies that haven't been seen since literally the dawn of time."
Is 13 TeV really considered a lot of energy in terms of what happens daily with particles hitting our atmosphere? http://en.wikipedia.org/wiki/Cosmic_ray seems to suggest that we've observed naturally occurring 3 × 10^20 eV, which is a lot larger than 13 TeV (13 × 10^12 eV).
You have to look at center-of-mass energy, the energy actually available for reactions in a collision. The collisions of ultra-high-energy cosmic rays (UHECRs) with air occur at hundreds of TeV center-of-mass energy. And they may be nuclei, in which case this energy is spread over several nucleons. So assuming that Lorentz symmetry holds (probably the safest assumption in modern physics), 13 TeV is in a similar ballpark to UHECR collisions.
> For the Oh-My-God particle, this gives 7.5×10^14 eV, roughly 50 times the collision energy of the Large Hadron Collider.
> The speed of the particle (0.999 999 999 999 999 999 999 9951c), if it was a proton, is so high that it would experience relativistic time dilation by a factor of about 320 billion. At that rate, the particle could have traveled for the entire duration of the universe's existence while experiencing less than sixteen days subjective time.
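The quoted figure is easy to verify: for a particle of energy E_lab striking a nucleon at rest, the center-of-mass energy is √s ≈ √(2 · E_lab · m_p c²) once E_lab dwarfs the proton mass. A quick Python check, using the numbers from the comments above:

    import math

    E_lab = 3e20      # eV: the cosmic-ray energy quoted above
    m_p   = 0.938e9   # eV: proton rest energy

    # Fixed-target collision, E_lab >> m_p: sqrt(s) ~ sqrt(2 * E_lab * m_p)
    sqrt_s = math.sqrt(2 * E_lab * m_p)
    print(f"sqrt(s) ~ {sqrt_s:.1e} eV")                # ~7.5e14 eV
    print(f"vs. LHC's 13 TeV: {sqrt_s / 13e12:.0f}x")  # ~58x, the '~50 times' ballpark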