o2FairMQHeaderSizeTest.cxx File Reference

Just a simple workflow to test how many messages can be stored internally when nothing is consumed. Used for tuning the parameter shm-message-metadata-size. More...

#include "Framework/ConfigParamSpec.h"
#include "Framework/ControlService.h"
#include "Framework/CallbackService.h"
#include "Framework/EndOfStreamContext.h"
#include "Framework/DeviceSpec.h"
#include "Framework/runDataProcessing.h"
#include <chrono>
#include <thread>
#include <vector>
#include <random>


Functions

std::string random_string (size_t length)
 
std::string filename ()
 
WorkflowSpec defineDataProcessing (ConfigContext const &specs)
 This function hooks the workflow specifications into the DPL driver.
 

Detailed Description

Just a simple workflow to test how many messages can be stored internally when nothing is consumed. Used for tuning the parameter shm-message-metadata-size.

Author
Michal Tichak, michal.tichak@cern.ch

Definition in file o2FairMQHeaderSizeTest.cxx.
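
The idea can be illustrated with a minimal sketch (this is not the actual contents of the file; device names, data origin/description and message size are illustrative assumptions): a producer keeps publishing messages into shared memory while a sink deliberately never reads them, so the framework has to hold a growing number of messages internally, which is exactly what the shm-message-metadata-size parameter has to accommodate.

#include "Framework/runDataProcessing.h"
#include <chrono>
#include <thread>

using namespace o2::framework;

WorkflowSpec defineDataProcessing(ConfigContext const&)
{
  return WorkflowSpec{
    DataProcessorSpec{
      "producer",
      Inputs{},
      Outputs{OutputSpec{"TST", "DATA", 0, Lifetime::Timeframe}},
      AlgorithmSpec{[](ProcessingContext& pc) {
        // allocate a fresh shared-memory message on every invocation
        pc.outputs().make<char>(Output{"TST", "DATA", 0}, 1024);
      }}},
    DataProcessorSpec{
      "lazy-sink",
      Inputs{InputSpec{"in", "TST", "DATA", 0, Lifetime::Timeframe}},
      Outputs{},
      AlgorithmSpec{[](ProcessingContext&) {
        // never read the input, so messages pile up in shared memory
        std::this_thread::sleep_for(std::chrono::seconds(1));
      }}}};
}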

Function Documentation

◆ defineDataProcessing()

WorkflowSpec defineDataProcessing (ConfigContext const &specs)

This function hooks the workflow specifications into the DPL driver.

To be implemented by the user to specify one or more DataProcessorSpec.

Use the ConfigContext passed as input to get the values of global configuration properties such as command line options, the number of available CPUs, or anything else that can affect the creation of the actual workflow.

Returns
a std::vector of DataProcessorSpec which represents the actual workflow to be executed
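
As a sketch of how such an option could be consumed, a workflow option registered via customize() before including runDataProcessing.h can be read back from the ConfigContext; the option name "how-many" below is purely illustrative and not part of this file.

#include "Framework/ConfigParamSpec.h"
#include <vector>

// register a hypothetical workflow option; this must happen before
// the inclusion of runDataProcessing.h
void customize(std::vector<o2::framework::ConfigParamSpec>& workflowOptions)
{
  workflowOptions.push_back(
    o2::framework::ConfigParamSpec{"how-many", o2::framework::VariantType::Int, 1, {"number of producer devices"}});
}

#include "Framework/runDataProcessing.h"

o2::framework::WorkflowSpec defineDataProcessing(o2::framework::ConfigContext const& config)
{
  // read the option back through the ConfigContext
  auto howMany = config.options().get<int>("how-many");
  o2::framework::WorkflowSpec workflow;
  // ... push_back one DataProcessorSpec per requested device here ...
  return workflow;
}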

Definition at line 62 of file o2FairMQHeaderSizeTest.cxx.

◆ filename()

std::string filename ()

Definition at line 55 of file o2FairMQHeaderSizeTest.cxx.

◆ random_string()

std::string random_string (size_t length)

Definition at line 36 of file o2FairMQHeaderSizeTest.cxx.
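
The body of random_string() is not reproduced in this reference; given the <random> include above, a plausible sketch of a helper with this signature might look like the following (the character set and generator choice are assumptions, not necessarily what the file uses).

#include <cstddef>
#include <random>
#include <string>

// hypothetical sketch only; the actual implementation in
// o2FairMQHeaderSizeTest.cxx may differ
std::string random_string(std::size_t length)
{
  static constexpr char charset[] =
    "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz";
  static std::mt19937 gen{std::random_device{}()};
  std::uniform_int_distribution<std::size_t> pick(0, sizeof(charset) - 2); // skip the trailing '\0'
  std::string result(length, '\0');
  for (auto& c : result) {
    c = charset[pick(gen)];
  }
  return result;
}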