Integration Of The SAM-grid Infrastructure To The D0 Data Reprocessing Effort
2005
Metric Options: Counts
Selecting the 1-year or 3-year option changes the metrics count to percentiles, showing how an article or review compares to other articles or reviews published in the same journal within the selected time period. The 1-year option compares the metrics against other articles/reviews published in the same calendar year. The 3-year option compares the metrics against other articles/reviews published in the same calendar year plus the two years prior.
Example: if you select the 1-year option for an article published in 2019 and a metric category shows 90%, that means that the article or review is performing better than 90% of the other articles/reviews published in that journal in 2019. If you select the 3-year option for the same article published in 2019 and the metric category shows 90%, that means that the article or review is performing better than 90% of the other articles/reviews published in that journal in 2019, 2018 and 2017.
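The percentile comparison described above can be sketched in a few lines. This is only an illustrative reconstruction, not PlumX's actual implementation; the function name and the citation counts are hypothetical.

```python
# Illustrative sketch (assumed logic, not PlumX's actual code): a metric
# percentile is the share of peer articles/reviews, from the same journal
# and time window, that this article outperforms.

def citation_percentile(article_count, peer_counts):
    """Percent of peer articles whose metric count this article exceeds."""
    if not peer_counts:
        return 0.0
    outperformed = sum(1 for c in peer_counts if article_count > c)
    return 100.0 * outperformed / len(peer_counts)

# Hypothetical counts for ten peer articles published in the same
# journal in 2019; an article with 10 beats 9 of the 10 peers.
peers_2019 = [2, 5, 7, 11, 3, 8, 1, 4, 6, 9]
print(citation_percentile(10, peers_2019))  # → 90.0
```

For the 3-year option, the same computation would simply be run with `peer_counts` pooled across the publication year and the two years prior.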
Citation Benchmarking is provided by Scopus and SciVal and is different from the metrics context provided by PlumX Metrics.
Metrics Details
- Usage: 16
- Downloads: 16
Thesis / Dissertation Description
The D0 experiment is one of the two high energy physics experiments currently being conducted at the Fermi National Accelerator Laboratory in Batavia, IL, on what is currently the world's highest-energy particle accelerator, the Tevatron. The experiment produces vast amounts of raw data, on the order of several hundred terabytes. This data must be converted from the raw format produced by the detector, i.e., digitized data, to a format that is close to the physics, i.e., data that can be subjected to analysis. This process, called reconstruction, is performed according to constantly evolving and improving reconstruction algorithms. The conversion from raw detector data to analyzable data is done periodically to obtain data of very high quality, and it is applied to the entire existing dataset to obtain consistent results. The scale of computation necessary to process such a large dataset is enormous and presents a challenge. This thesis discusses the computation problem that must be solved to achieve such a high level of scalability, the design of a computational model that solves this problem, the issues encountered during implementation of the design, and the steps taken to resolve them.