Conference Paper – Poster
During its first three years of operation at the Large Hadron Collider, the Compact Muon Solenoid (CMS) experiment collected data under evolving conditions of center-of-mass energy, instantaneous luminosity, and collision pile-up. Throughout this evolution, the CMS collaboration has constantly strived to guarantee the prompt availability of high-quality reconstructed data, in order to ensure early and sound physics results. This has relied on a few key areas of constant attention: careful preparation and maintenance of the event simulation and reconstruction algorithms; efficient and robust strategies and algorithms for the calibration and alignment of the detector elements; and continuous scrutiny of data quality and validation of any changes to the software or calibrations introduced during operations. This contribution covers the major development and operational aspects of the CMS offline workflows during the 2010-2013 data taking, underlining their essential role in the main physics achievements and discoveries of the CMS experiment.