Running applications on large data sets is inevitably more demanding than running them on smaller ones. Resource usage is heavier, which can mean longer elapsed times, degraded machine performance, and ultimately failure of the job. CCDBIG does not give a benchmark result as such, but provides a tool to gauge resource usage when reducing frames of varying size with various options, so that the capability of a given platform to handle data sets of a given size can be assessed, and the reasons for any failures can be determined.
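The kind of measurement CCDBIG performs can be illustrated generically. The sketch below is not part of CCDBIG; it is a minimal, hedged example of gauging elapsed time and peak memory growth for processing steps of increasing size, using only the Python standard library (`fake_frame` is a hypothetical stand-in for a reduction step).

```python
# Hedged sketch (not CCDBIG itself): gauge the resource cost of processing
# "frames" of increasing size, using only the standard library.
import resource
import time

def measure(task, *args):
    """Run task and report (result, elapsed seconds, peak-RSS growth).

    Note: ru_maxrss is reported in kilobytes on Linux but in bytes on
    macOS, so the memory figure is platform-dependent.
    """
    before = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    start = time.perf_counter()
    result = task(*args)
    elapsed = time.perf_counter() - start
    after = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    return result, elapsed, after - before

# Hypothetical stand-in for a reduction step: build an n x n frame of floats.
def fake_frame(n):
    return [[0.0] * n for _ in range(n)]

for size in (256, 512, 1024):
    _, t, mem = measure(fake_frame, size)
    print(f"{size}x{size}: {t:.4f} s, ~{mem} peak-RSS units extra")
```

Plotting such figures against frame size gives a rough idea of how resource demand scales, which is the sort of question CCDBIG is designed to answer for real CCDPACK reductions.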
Typically, CCDBIG will be run by a user who is about to embark on the reduction of unusually large images, or by a system manager who expects the system to be used for such work, or who suspects that problems are being encountered as a result of it. Its use requires some care and can disrupt the system being assessed, so novice users may wish to seek assistance from their system manager in using the package.
How the results of such an investigation are used is outside the scope of this document, but at the coarsest level, if an application fails or the machine crashes because of the size of a job, then the job is too big. At a more sophisticated level, if a job takes ``too long'', or degrades the performance of the machine to an unacceptable degree, then it may be too big. In either case, examining the log file generated by CCDBIG should give some insight into the cause of the problem. The approach can then be modified, either by editing the script as described in sections 2.1 and 2.2, or by reconfiguring or changing the platform it runs on, to find a happier combination.
This guide gives brief instructions for using CCDBIG and interpreting its output. For further discussion of the issues involved in running large data reduction jobs in general, and in using CCDPACK in particular, see SC/5.
CCDBIG: assessing CCDPACK resource usage for large data sets