The next generation of instruments at the OT will achieve data rates on the order of 70 TB per day. Given several such instruments from different partners, and in anticipation of EST, where the expected data rate is closer to 1 - 2 PB per day, restricting data storage to a single site is no longer adequate. Data volumes of this magnitude require flexible placement of partial data holdings at different locations, with guaranteed redundancy and retention lifetimes.
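The redundancy and lifetime guarantees for distributed partial holdings could be expressed as a per-dataset placement policy. The following is a minimal sketch; all names (`PlacementPolicy`, `is_compliant`) and the specific policy values are illustrative assumptions, not part of any existing system.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical per-dataset policy expressing the guarantees described above.
@dataclass
class PlacementPolicy:
    min_replicas: int    # guaranteed number of redundant copies across sites
    retention_days: int  # guaranteed lifetime at the storage locations

def is_compliant(replica_sites: list, created: date,
                 policy: PlacementPolicy, today: date) -> bool:
    """A dataset is compliant while enough replicas exist within its lifetime."""
    within_lifetime = today <= created + timedelta(days=policy.retention_days)
    return within_lifetime and len(replica_sites) >= policy.min_replicas

policy = PlacementPolicy(min_replicas=2, retention_days=365)
ok = is_compliant(["site_A", "site_B"], date(2024, 1, 1), policy, date(2024, 6, 1))
```

A monitoring service could periodically evaluate such a predicate and trigger re-replication whenever a dataset falls out of compliance.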
Data injection from the different instruments, initial calibration, (redundant) distribution to various locations, and the generation of standard data products must all be automated.
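The automated chain above can be sketched as four stages executed without manual intervention. This is a schematic sketch only; the stage functions and the `Dataset` record are hypothetical names, and each stage stands in for what would be a substantial subsystem.

```python
from dataclasses import dataclass, field

@dataclass
class Dataset:
    name: str
    calibrated: bool = False
    replicas: list = field(default_factory=list)
    products: list = field(default_factory=list)

def ingest(name: str) -> Dataset:
    """Register a newly observed dataset arriving from an instrument."""
    return Dataset(name)

def calibrate(ds: Dataset) -> Dataset:
    """Apply the initial, instrument-level calibration."""
    ds.calibrated = True
    return ds

def replicate(ds: Dataset, sites: list) -> Dataset:
    """Distribute redundant copies to the given storage locations."""
    ds.replicas = list(sites)
    return ds

def make_products(ds: Dataset) -> Dataset:
    """Generate standard data products from the calibrated data."""
    ds.products = [f"{ds.name}.l1"]
    return ds

def run_pipeline(name: str, sites: list) -> Dataset:
    # The four automated steps, chained end to end.
    return make_products(replicate(calibrate(ingest(name)), sites))

ds = run_pipeline("scan_001", ["site_A", "site_B"])
```

In practice each stage would be an asynchronous service; the chained form only illustrates the required ordering of the automated steps.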
Ideally, calculations should be performed on these data sets close to where they are stored. Where this is not possible, transporting the data to the analysis location should be transparent to the user, taking into account the available resources, the bandwidth, and the costs incurred.
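Such a transparent placement decision amounts to scoring candidate sites by data locality, transfer time, and cost. The sketch below assumes a simple weighted score; the site records, field names, and the weight on compute cost are all illustrative assumptions.

```python
def transfer_seconds(data_gb: float, bandwidth_gbps: float) -> float:
    """Time to move the data to a site, in seconds (8 bits per byte)."""
    return data_gb * 8 / bandwidth_gbps

def choose_site(sites: list, data_gb: float) -> dict:
    """Pick the site minimizing combined transfer time and compute cost."""
    def score(site: dict) -> float:
        # No transfer needed if the data already resides at the site.
        move = 0.0 if site["has_data"] else transfer_seconds(
            data_gb, site["bandwidth_gbps"])
        # Weighted sum; the factor 100.0 converting cost to a time-like
        # penalty is an arbitrary illustrative choice.
        return move + 100.0 * site["cost_per_hour"]
    return min(sites, key=score)

sites = [
    {"name": "site_A", "has_data": True,  "bandwidth_gbps": 1.0,  "cost_per_hour": 2.0},
    {"name": "site_B", "has_data": False, "bandwidth_gbps": 10.0, "cost_per_hour": 0.5},
]
best = choose_site(sites, data_gb=500.0)
```

For the sample values, moving 500 GB to site_B would take 400 s, so the scheduler keeps the computation at site_A, where the data already resides; with a smaller dataset the cheaper remote site would win.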