Einstein@Home

From Wikipedia, the free encyclopedia

Einstein@Home
Area: Astronomy
Goal: Detection of gravitational waves, search for binary radio pulsars
Operator: LIGO Scientific Collaboration (LSC)
Country: International
Platform: BOINC
Website: einstein.phys.uwm.edu
Project status
Status: active
Start: February 19, 2005
End: still active

Einstein@Home is a volunteer computing project operated by the LIGO Scientific Collaboration, an international collaboration of physics research institutes. It is a complex data-analysis project whose most time-consuming parts run on otherwise underutilized computers belonging to members of the public who take part.

The project searches data collected by the Laser Interferometer Gravitational-Wave Observatory (LIGO) in the United States and the German detector GEO600 for signs of gravitational waves from pulsars with an atypical asymmetry. Since March 2009, data from the Arecibo telescope has also been searched for binary radio pulsars. In addition, data from the Fermi Gamma-ray Space Telescope is searched for gamma-ray-emitting pulsars.


Einstein@Home serves basic astrophysical research. According to the general theory of relativity, massive accelerated objects such as deformed neutron stars distort the space-time around them, and measurable gravitational waves should arise. These waves are to be measured with gravitational-wave detectors based on Michelson interferometers. Because of the great distances involved, only a fraction of such stars can be seen with telescopes.

The search for such waves follows two approaches. On the one hand, the entire sky is scanned for pulsars with methods more sensitive than classical telescope observation. On the other hand, the detectors are used to search directly for gravitational waves. The main goal is to find continuously emitted waves, so the search focuses on the vicinity of pulsars and comparable objects with a known position.


Because parameters such as position and mass are unknown, this produces large amounts of data whose full analysis would occupy even today's supercomputers or clusters for long periods. In order to achieve the required computing power for the analyses inexpensively, at the cost of higher software maintenance effort, the approach of distributed computing was adopted.

Another project goal is to increase the number of active participants in order to be able to analyze the data obtained by the detectors in almost real time.

Organization and participants

On the German side, the Max Planck Institute for Gravitational Physics (Albert Einstein Institute) is heavily involved in the project; a large part of the scientific software is developed there. Internationally, there is a cooperation with the University of Wisconsin–Milwaukee. The project is directed by Bruce Allen. The institutes are part of the LIGO Scientific Collaboration, a consortium of several hundred experts in fields such as physics and computer science, belonging to research institutions in numerous countries, chiefly the United States. These institutions also provide a considerable share of the computers the project requires. The remaining computers are provided by volunteer participants, i.e. private companies or individuals from a comparable range of countries.


In the data center of the ATLAS cluster

The project was officially started on February 19, 2005 as part of the World Year of Physics 2005.

In March 2008, the Albert Einstein Institute, with its Merlin (180 dual Athlon XP machines) and Morgane (615 AMD Opteron nodes) clusters, was the second-largest individual participant in the project. The D-Grid initiative also made computing time available to the project on a similar scale. To evaluate the data, the Albert Einstein Institute operates the computer cluster ATLAS in Hanover. Most of the work, however, is carried out cumulatively by the computers of several hundred thousand volunteers, of whom only several tens of thousands, sometimes with several devices, are regularly active. In September 2010, around 118,000 computers provided the project with over 300 teraflops of computing power, which at that time corresponded to 14th place in the list of the world's fastest supercomputers. In line with Moore's Law, the performance increases steadily. In 2016, the project achieved more than 1,600 teraflops, of which the ATLAS cluster, with approximately 3,000 Intel Xeon CPUs, contributed about a quarter, making it the fastest individual participant and also the fastest computer in gravitational-wave research.



The BOINC platform is used to manage the work packages. After installing the BOINC software and selecting the Einstein@Home project, participants automatically receive data packages, which are processed on their desktops or smartphones during otherwise unused computing time. On devices such as smartphones, tablets or the Raspberry Pi, the ARM architecture processes the data particularly energy-efficiently but comparatively slowly, while the fastest calculations are carried out on desktops using GPGPU.

The data packages and the calculation results are stored on servers of the Albert Einstein Institute. The data center is designed as a server farm whose devices are interconnected by Gigabit Ethernet. The tasks to be processed within the data center are distributed using HTCondor. Each server can be monitored individually via Ganglia and the Intelligent Platform Management Interface. A small part of the servers organizes the tasks for Einstein@Home; the largest part processes the project's tasks. The system is not designed for high-performance computing, i.e. for the fastest possible processing of a single task, but for high throughput of many different tasks processed in parallel.


An essential sub-goal is separating the measurement signal from interfering influences, since the expected signal is very weak. For this purpose, the data recorded by the telescopes is divided into segments. The ends of the segments overlap to avoid false-negative results, and a window function prevents these ends from being overweighted. The segments undergo a fast Fourier transform and, after sorting, a chi-squared test. Analyses of continuous gravitational waves also involve a complex statistical calculation such as a Hough transform, which identifies signals that stand out from the white noise. These are finally compared, by pattern matching with an optimal (matched) filter, against theoretically expected signals whose mathematical pattern was calculated from the amplitude, the phase and the polarization of the gravitational wave resulting from the rotation axis of the neutron star.
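The chain of windowing and Fourier transforming a segment can be illustrated with a toy calculation. The following is only a schematic sketch with made-up numbers (segment length, injected tone, noise level), not the project's actual pipeline, which uses optimized FFT libraries and far more elaborate statistics:

```python
import cmath
import math
import random

def hann(n):
    # Hann window: tapers the segment ends so the overlapping edges
    # are not overweighted in the spectrum
    return [0.5 - 0.5 * math.cos(2 * math.pi * i / (n - 1)) for i in range(n)]

def dft_power(x):
    # Direct discrete Fourier transform, O(n^2); shown for clarity only
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) ** 2
            for k in range(n // 2)]

random.seed(0)
n, signal_bin = 256, 32                       # illustrative segment length / frequency bin
noise = [random.gauss(0.0, 1.0) for _ in range(n)]
tone = [math.sin(2 * math.pi * signal_bin * t / n) for t in range(n)]
segment = [a + b for a, b in zip(noise, tone)]

window = hann(n)
spectrum = dft_power([s * w for s, w in zip(segment, window)])
peak = max(range(len(spectrum)), key=spectrum.__getitem__)
print(peak)   # the injected tone dominates the white noise
```

A matched filter then compares such outstanding spectral candidates against theoretically expected waveforms; here the peak-picking step stands in for that far more involved comparison.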

During the evaluation, the limited sensitivity of the detectors degrades the signal-to-noise ratio. This is to be countered on the one hand by continuous technical improvements to the detectors, and on the other hand by having computers identify the numerous spurious findings that until now had to be sorted out manually. To do this, a considerable number of manually classified findings must be recorded, which the software can later use as a training database. To speed this up, the citizen-science concept was employed and the "Gravity Spy" project was started on the Zooniverse platform in 2016.

Raw data

The observatories are each used by several research projects. Moreover, the amount of continuously accumulating data is considerable, at a few megabytes per second: the LIGO raw data is sampled at 16 kHz, well above the Nyquist rate required so that millisecond pulsars are also reliably captured. The raw data is therefore cached. Data from the Arecibo Observatory is transferred from the data center at Cornell University to the Einstein@Home project, where it is subject to hierarchical storage management. Data from the Large Area Telescope is transmitted to the Einstein@Home project from the data center of the Jodrell Bank Radio Observatory. Data from the LIGO observatories can be transmitted to the project by the California Institute of Technology or by the mirror server of the participating University of Wisconsin–Milwaukee. Where the original instrument data is in 16-bit or 4-bit format, it is converted to IEEE 754 floating-point numbers on the project servers.
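The relation between the sampling rate and the signals of interest can be checked with a back-of-the-envelope calculation. This sketch assumes the fastest known pulsar spin (716 Hz, PSR J1748-2446ad) and the standard rule that a non-axisymmetric rotator emits gravitational waves at twice its spin frequency:

```python
sample_rate_hz = 16_000            # LIGO raw-data rate stated above
nyquist_hz = sample_rate_hz / 2    # highest frequency representable without aliasing

spin_hz = 716                      # fastest known pulsar spin (PSR J1748-2446ad)
gw_hz = 2 * spin_hz                # emission at twice the spin frequency

# Even the fastest millisecond pulsar stays well below the Nyquist limit
print(nyquist_hz, gw_hz, gw_hz < nyquist_hz)
```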


Since computing capacity is ultimately limited, raw data is collected over a limited period and then analyzed in a workflow. Depending on the objective, such a run is called a test run or a science run (hence the "S" in the run names). A workflow consists essentially of three steps, the first and third of which are carried out by experts, while the second runs on the computers of the participating laypeople. The first step comprises creating the algorithm, preparing the data and adapting it to the performance of the different computers and their Linux, Windows, macOS and Android operating systems, as well as configuring the servers. In the second step, the Einstein@Home software processes the tasks prepared in the first step, i.e. the work units, on the participants' devices and uploads the results to the server. The third step comprises storing the results of the second step, including evaluation and follow-up, as well as the scientific publication.
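The three-step workflow can be summarized schematically. Every function name and number below is illustrative, not taken from the project's code base:

```python
def prepare_work_units(raw_data, segment_length):
    """Step 1 (experts): split the prepared data into work units."""
    return [raw_data[i:i + segment_length]
            for i in range(0, len(raw_data), segment_length)]

def analyze(unit):
    """Step 2 (volunteer devices): search one work unit for candidates."""
    threshold = 2.0   # illustrative detection threshold
    return [x for x in unit if abs(x) > threshold]

def collect(uploaded):
    """Step 3 (experts): merge uploaded results for evaluation and follow-up."""
    return sorted(abs(x) for unit in uploaded for x in unit)

data = [0.1, -2.5, 0.3, 3.1, -0.2, 0.05, 2.2, -0.4]
units = prepare_work_units(data, 4)
candidates = collect([analyze(u) for u in units])
print(candidates)   # [2.2, 2.5, 3.1]
```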


Einstein@Home began its analyses with 600 hours of data from the LIGO run S3, whose accuracy, however, was still far from the targeted precision of the LIGO detector. The data had already been examined on a computer cluster beforehand, with no anomalies discovered. The first Einstein@Home run on S3 data therefore served primarily to test the scientific application and to improve calibration. Numerous interfering signals were discovered and removed. These interference signals arise from the sensitivity of the detectors: mainly seismic disturbances, but also signals from the power grid or ocean surf, constantly cause deflections. Each detector is affected by these disturbances individually, whereas a gravitational wave would reveal itself by deflecting all detectors worldwide simultaneously. After the S3 data had been "cleaned up", the new version was analyzed again. In addition, some artificial signals were injected in order to be able to make statements about the probability of detecting relevant signals among the disturbances.

From the end of June 2005 to mid-2006, Einstein@Home analyzed the data of LIGO run S4, recorded at the beginning of 2005, which was expected to improve the accuracy by a factor of 2.

S5 was to be the first run to reach the targeted accuracy. As part of S5, the LIGO detectors were operated continuously for one year. Analysis of the S5 data began in June 2006. The first search run on this data set, S5R1, was completed in February 2007. It was followed by a short S5RI search in a limited frequency range with a modified parameter set, which lasted until mid-April 2007. Meanwhile, the detectors continued to collect data as part of S5. The computational effort increases exponentially.

In order to cope with the calculations, a new application implementing a hierarchical search was tested in search run S5R2. At first, the search is carried out only on a coarse grid; it then concentrates on the most promising positions.
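The coarse-to-fine idea can be sketched in one dimension. The real search spans sky position, frequency and spin-down parameters; the score function below is a made-up stand-in for the detection statistic:

```python
def score(f):
    # Illustrative detection statistic peaking near f = 0.3
    return -(f - 0.3) ** 2

def hierarchical_search(lo, hi, grid_points, refine_rounds):
    """Evaluate a coarse grid, then repeatedly zoom in on the best point."""
    best = lo
    for _ in range(refine_rounds):
        step = (hi - lo) / (grid_points - 1)
        grid = [lo + i * step for i in range(grid_points)]
        best = max(grid, key=score)
        # Narrow the search band around the most promising grid point
        lo, hi = best - step, best + step
    return best

print(round(hierarchical_search(0.0, 1.0, 11, 4), 3))   # converges near 0.3
```

Compared with evaluating a single fine grid over the full band, far fewer score evaluations are spent on unpromising regions, which is the point of the hierarchical scheme.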

On September 23, 2007, the S5R3 search began with a second version of the hierarchical search algorithm, which improves the sensitivity by a factor of around 6. The S5R3b run is a seamless continuation of S5R3 in the frequency band above 800 Hz. The S5R4a search run began in August 2008.

After test runs for the search for binary radio pulsars in December 2008, work units of the Arecibo Binary Pulsar Search application have been distributed to all Einstein@Home participants since the end of March 2009, unless participants deactivate them in their participant-specific settings. After a few weeks of testing, GPU applications for the Arecibo Binary Pulsar Search were released for Windows and Linux on November 26, 2009. While the majority of the calculations still take place on the CPU, the fast Fourier transforms are now computed on the GPU, which considerably reduces computing time, at least for this part of the task.

In June 2010, the project found the previously unknown pulsar PSR J2007+2722 in the constellation Vulpecula. A second success followed in March 2011 with the discovery of the pulsar PSR J1952+2630 in 2005 data from the Arecibo Observatory. By August 2012, 46 new pulsars had been discovered by the project. In 2013, the discovery of 24 further pulsars was published, based on an analysis of data collected with the Parkes Observatory.

In terms of computing power, the project exceeded 1 petaflop in January 2013, putting the combined grid on a par with the 23 most powerful supercomputers in the world. The project is also linked to the detection of gravitational waves.

The discoveries include a pulsar detected in 2015 that lay hidden behind another star. In 2016, attempts to detect gravitational waves from the relatively young object Cassiopeia A were unsuccessful. In the same year, 17 gamma-ray-emitting pulsars were discovered, one of which went through a glitch. Also in 2016, this method led to the discovery of two neutron stars forming a binary system.

In 2018, two more pulsars were detected using the previous five years of data from the Fermi Gamma-ray Space Telescope. For the first time, these were pulsars that emit gamma radiation but whose radio emission is either too weak to detect with radio telescopes or not beamed towards Earth. This demonstrated a tool with which objects of this type can be detected even when scattering influences or dark matter lie between the telescope and the object.

As of early 2019, continuous gravitational waves had not been detected, although the search focused on several supernova remnants such as Cassiopeia A. Nevertheless, the detection limit was lowered.

The results thus concern:

  • the search for gravitational waves (Gravitational Wave search)
  • the search for binary pulsars in the data of the Arecibo telescope (Binary Radio Pulsar Search)
  • the search for gamma-ray pulsars (Gamma-ray Pulsar search)

Web links

Commons: Einstein@Home - collection of pictures, videos and audio files
