Florian Uhlig authored
The latency value is used to decide when data can be moved from the DAQ buffer
to the timeslice. It accounts for the time disordering between digis in the
data sent to the DAQ buffer during digitization. For the MVD detector this
delay can be tens of microseconds, so if the MVD is part of the setup, the
latency must be increased from its default value.
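A minimal sketch of how such a latency cut can be applied when flushing the
DAQ buffer into a timeslice. The class and member names (DaqBufferSketch,
ExtractSafe, fNewestTime) are illustrative assumptions, not the actual CbmDaq
interface:

```cpp
#include <map>
#include <vector>

struct Digi { double time = 0.; /* payload omitted */ };

// Illustrative only: digis are kept sorted by time; a digi may leave the
// buffer once it is older than the newest digi seen so far minus the latency,
// because no later-arriving digi can still be placed before it.
class DaqBufferSketch {
 public:
  void Insert(const Digi& d) {
    fBuffer.emplace(d.time, d);
    if (d.time > fNewestTime) fNewestTime = d.time;
  }

  // Extract all digis older than (newest time - latency); anything younger
  // could still be overtaken by a late, out-of-order digi.
  std::vector<Digi> ExtractSafe(double latency) {
    std::vector<Digi> out;
    auto last = fBuffer.upper_bound(fNewestTime - latency);
    for (auto it = fBuffer.begin(); it != last; ++it) out.push_back(it->second);
    fBuffer.erase(fBuffer.begin(), last);
    return out;
  }

 private:
  std::multimap<double, Digi> fBuffer;  // digis ordered by time (assumed non-negative)
  double fNewestTime = 0.;
};
```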

This commit makes it possible for each detector system to define its own
latency. The latency used in CbmDaq is then the maximum of all defined
detector latencies. Currently only the MVD detector defines a non-standard
latency.
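A sketch of the per-detector latency lookup under the assumptions that each
digitizer exposes a virtual getter and that the DAQ takes the maximum over all
of them. The names (DigitizerSketch, GetLatency, MaxLatency) and the numeric
values are placeholders, not the committed API or defaults:

```cpp
#include <algorithm>
#include <vector>

// Hypothetical digitizer base class: each detector system may override
// GetLatency() to report its own time disordering.
class DigitizerSketch {
 public:
  virtual ~DigitizerSketch() = default;
  virtual double GetLatency() const { return 2000.; }  // default latency [ns], assumed value
};

class MvdDigitizerSketch : public DigitizerSketch {
 public:
  // MVD digis can be delayed by tens of microseconds.
  double GetLatency() const override { return 50000.; }  // [ns], assumed value
};

// CbmDaq-like consumer: use the maximum of all defined detector latencies.
double MaxLatency(const std::vector<const DigitizerSketch*>& digitizers) {
  double latency = 0.;
  for (const auto* d : digitizers)
    latency = std::max(latency, d->GetLatency());
  return latency;
}
```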