By now, you must have heard about distributed computing – projects such as SETI@home and Folding@home are mentioned frequently in the media. We also covered the subject in some depth previously on Instant Fundas.
LHC@home is another distributed computing project run by volunteers on behalf of the European Organization for Nuclear Research (CERN). The project was initially launched in 2004 to help engineers process the gigantic amount of data generated by the world’s largest collider. That information is used to maintain and improve the operation and efficiency of the accelerator, and to predict problems that could arise from adjustment or modification of the LHC’s equipment.
One of the first projects deployed as a part of LHC@home, called SixTrack, typically let users simulate about 60 particles whizzing around the collider’s ring for 10 seconds, or up to a million loops. SixTrack helped the engineers at CERN design stable beam conditions for the LHC, so today the beams stay on track and don’t fly off course into the walls of the vacuum tube, causing serious damage.
Last week, CERN began public testing of the next version of LHC@home – LHC@home 2.0, centered on a new project called Test4Theory. It allows science-minded users to run simulations of high-energy particle physics on their home computers. The results are submitted to a central database used as a common resource by both experimental and theoretical scientists working on the Large Hadron Collider at CERN.
Computer simulations of high-energy particle collisions provide a detailed theoretical reference for the measurements performed at accelerators like the Large Hadron Collider (LHC), against which models of both known and ‘new’ physics can be tested, down to the level of individual particles.
By looking for discrepancies between the simulations and the data, we are searching for any sign of disagreement between the current theories and the physical universe. Ultimately, such a disagreement could lead us to the discovery of new phenomena, which may be associated with new fundamental principles of Nature.
Less spectacular discrepancies also help guide us towards the most accurate possible description of the Standard Model of Particle Physics and its phenomena – refining the simulations of the known physical laws, by pointing to areas where current simulations succeed and where they fail.
To run LHC@home 2.0, you need a computer connected to the Internet with at least 512MB of RAM and 9GB of free hard disk space. You first install VirtualBox and then BOINC, a standard volunteer computing application that lets you share your PC with research projects you choose to attach to. VirtualBox allows you to run CernVM – the software that runs the project’s simulations – independently of the operating system you use.
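On a Debian or Ubuntu machine, those setup steps might look like the sketch below. The package names and the final attach step are assumptions for illustration – the project’s own download pages are the authoritative source for installers on each platform.

```shell
# Rough sketch of the LHC@home 2.0 setup on a Debian/Ubuntu system.
# Package names below are assumptions; other platforms use the
# installers from the VirtualBox and BOINC download pages instead.

# 1. Install VirtualBox, which will host the CernVM image.
sudo apt-get install virtualbox

# 2. Install the BOINC client and its graphical manager.
sudo apt-get install boinc-client boinc-manager

# 3. Launch the BOINC Manager and attach to the Test4Theory
#    project via its "Add project" dialog.
boincmgr
```

Once BOINC is attached to the project, it downloads the CernVM image through VirtualBox and runs the simulations in the background, using only the idle capacity you allow it.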
The Large Hadron Collider has its own distributed computing network – the Worldwide LHC Computing Grid (WLCG), one of the largest in the world – but that resource is almost exclusively reserved for the experiments, which pump data out of their detectors at about 300 MB per second. Little is left over for theoretical physicists to run simulations, so LHC@home could make a real difference.
The simulation LHC@home uses can produce one collision per millisecond, on average, while the LHC produces 40 million collisions per second. If the project can get just 40,000 volunteers running the simulations at the same time, the researchers would have a full-fledged virtual atom smasher.
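The volunteer count quoted above follows directly from the two rates in the article – a quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the figures quoted in the article:
# one simulated collision per millisecond per volunteer machine,
# versus 40 million real collisions per second at the LHC.

sim_rate_per_volunteer = 1_000     # collisions/second (1 per millisecond)
lhc_collision_rate = 40_000_000    # collisions/second

volunteers_needed = lhc_collision_rate // sim_rate_per_volunteer
print(volunteers_needed)  # → 40000
```

In other words, each volunteer machine covers a thousandth of a second of the collider’s output every second, so forty thousand of them running in parallel match the LHC’s collision rate.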
Recommended Reading: How does the LHC work? Learn it through a game