Even bigger data: preparing for the LHC/ATLAS upgrade
The data volume of the Large Hadron Collider (LHC) experiments is expected to grow by one order of magnitude following the upgrade of the machine's operating conditions in 2013-2014. The challenge to our team's scientific results is: i) how to deal with a 10-fold increase in the data volume that must be processed for each analysis, while ii) supporting the increasing complexity of the analysis applications, iii) reducing the turnaround time of the results, and iv) addressing these issues with limited additional resources, given Europe's present political and economic panorama. In this paper we take a position on this challenge and on the research directions to be explored. A systematic study of the analysis applications is presented to identify optimization opportunities in the applications and in the underlying running system. Then a new local system architecture is proposed to increase resource usage efficiency and to provide a gradual upgrade route from current systems.

Funding: FCT grants SFRH/BPD/63495/2009 and SFRH/BPD/47928/2008, the UT Austin | Portugal FCT grant SFRH/BD/47840/2008, and FCT project ...