This page explains how run history records are created in the online MongoDB configuration database and how they are replicated to UconDB.
Run history records, or run_records, are generated by the DAQInterface at the start of each run during the bookkeeping process. By default, these records are stored in the /daq/run_records/ directory, with a subdirectory for each run number. If the artdaq_database feature is enabled in the user_sourcefile configuration file in the DAQ Shifter area (e.g., ~/DAQ_DevAreas/DAQ_YYYY-MM-DD_USR_vN_NN_NN/DAQInterface), the DAQInterface also creates an entry in the online configuration database hosted on MongoDB, known as artdaq_database. To enable this feature, uncomment line 27 in the user_sourcefile configuration file. With this line uncommented, run configurations are no longer loaded from ~/DAQ_DevAreas/DAQ_YYYY-MM-DD_USR_vN_NN_NN/DAQInterface/configs but are instead loaded from the online configuration database (artdaq_database). This feature can be used when running the DAQ from both the console and the Run Control GUI.
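For reference, each run gets its own subdirectory under /daq/run_records/; listing it is a quick way to confirm that the records were written. Run number 10000 is used as a placeholder here, and the exact file names depend on the DAQ configuration:
ls /daq/run_records/10000
# Expect the run's FHiCL documents and bookkeeping files; exact contents vary by configuration.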
Below is a snippet from the user_sourcefile where artdaq_database is enabled:
25 │ ######################################################
26 │ ##Uncomment the line below to enable artdaq_database.#
27 │ export DAQINTERFACE_FHICL_DIRECTORY=IGNORED
28 │ ######################################################
For run history records to be accessible via FTS, they must be replicated from MongoDB to UconDB. A cron job handles this, running periodically on sbnd-evb04 (SBND) or icarus-evb06 (ICARUS) under the DAQ user account. To see the scheduled cron jobs, use crontab -l. Example output includes the following:
*/25 * * * * ARTDAQ_DATABASE_TOOLS_ENV=~/.artdaq_database_tools.env ~/cronjobs/copyRunHistory2UconDB-cron.sh >> /daq/log/dbtools/database-ucondb.log 2>&1
*/20 * * * * ARTDAQ_DATABASE_TOOLS_ENV=~/.artdaq_database_tools_pending.env ~/cronjobs/copyRunHistory2UconDB-cron.sh >> /daq/log/dbtools/database-ucondb-pending.log 2>&1
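To confirm that the cron jobs are actually running, inspect the log files they append to, for example:
tail -n 50 /daq/log/dbtools/database-ucondb.log
tail -n 50 /daq/log/dbtools/database-ucondb-pending.log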
If run record replication fails (e.g., UconDB did not respond), the transfer is retried on the cron job's next scheduled run, i.e., after 20 or 25 minutes depending on the job. If artdaq_database was enabled during the run, the data will appear in UconDB. If it was not, run records can be imported into MongoDB afterwards using ~/cronjobs/importRunHistory2ArtdaqDB.sh. This script can be run from the command line or set up as a cron job.
Note: Due to MongoDB performance issues, the cron job is temporarily disabled and will be restored when the DBA group resolves the problem.
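As a sketch, and assuming the import script reads the same ARTDAQ_DATABASE_TOOLS_ENV environment file as the replication scripts (check the script itself for its actual inputs), a command-line invocation would look like:
# Assumption: same environment file as the replication scripts; adjust as needed.
ARTDAQ_DATABASE_TOOLS_ENV=~/.artdaq_database_tools.env ~/cronjobs/importRunHistory2ArtdaqDB.sh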
The replication scripts are in ~/cronjobs and on GitHub at https://github.com/SBNSoftware/sbndaq/tree/develop/configDB_tools.
The process involves exporting run history records in the FHiCL file format, concatenating them into a single text document, and posting it to UconDB using the UConDB Web API. Web API details are available in the UConDB Web API Documentation. Details on conftool.py and on the combined script are available in the ConfigDB Documentation and in copyRunHistory2UconDB-cron-2.sh.
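In outline, and assuming conftool.py behaves as in the troubleshooting steps below (the export writes the run's FHiCL files into the current directory; the output file name here is arbitrary), the manual equivalent of what the cron script automates is roughly:
# Sketch only: the cron script also handles authentication, bookkeeping, and error checking.
conftool.py exportArchivedRunConfiguration <10000/config>
cat *.fcl > run_10000.txt
# The concatenated document is then posted to UconDB via the Web API
# (see the cron script and the UConDB Web API documentation for the exact call).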
To check whether a run history record has been transferred to UconDB, run:
export UCONDB_URL=https://dbdata0vm.fnal.gov:9443
export EXPERIMENT=sbnd
#export EXPERIMENT=icarus
export RUN_NUMBER=10000
curl ${UCONDB_URL}/${EXPERIMENT}_on_ucon_prod/app/data/run_records_pending/configuration/key=${RUN_NUMBER}
# Alternative command with filtering
curl ${UCONDB_URL}/${EXPERIMENT}_on_ucon_prod/app/data/run_records_pending/configuration/key=${RUN_NUMBER} | grep -E '^(config_name|components|sbndaq_commit_or_version|metadata)'
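If you only need to check whether the record exists, the HTTP status code is usually sufficient (typically 200 when the key is present and a non-200 status otherwise, depending on the UConDB server's behavior):
curl -s -o /dev/null -w '%{http_code}\n' ${UCONDB_URL}/${EXPERIMENT}_on_ucon_prod/app/data/run_records_pending/configuration/key=${RUN_NUMBER}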
If 30 minutes have passed since the run started and the run history record has not been transferred to UconDB, troubleshoot as follows:
- Check whether the record exists in the online run history database (artdaq_database) using conftool.py:
export MY_DAQ_AREA=~/DAQ_DevAreas/DAQ_YYYY-MM-DD_USR_vN_NN_NN/
cd ${MY_DAQ_AREA}/DAQInterface/
source ./setup_daqinterface.sh
conftool.py getListOfArchivedRunConfigurations 10000/
mkdir -p ${MY_DAQ_AREA}/test_config
cd ${MY_DAQ_AREA}/test_config
conftool.py exportArchivedRunConfiguration <10000/config>
Replace <10000/config> with the actual configuration path.
- If the record exists in the artdaq_database but not in UconDB, review the cron job log at /daq/log/dbtools/database-ucondb-pending.log for further troubleshooting.
- You can manually run the replication script (~/cronjobs/copyRunHistory2UconDB-cron-2.sh) from the command line using the experiment's user account. Ensure the cron job running the same script is disabled to prevent conflicts, allowing for manual troubleshooting and verification of the replication process (see the note after the commands below).
ARTDAQ_DATABASE_TOOLS_ENV=~/.artdaq_database_tools_pending.env ~/cronjobs/copyRunHistory2UconDB-cron.sh
ARTDAQ_DATABASE_TOOLS_ENV=~/.artdaq_database_tools.env ~/cronjobs/copyRunHistory2UconDB-cron.sh
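To disable the cron job temporarily while troubleshooting, edit the crontab and comment out the relevant line (remember to restore it afterwards):
crontab -e
# prefix the copyRunHistory2UconDB-cron.sh line(s) with '#' and save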
Note: A “_pending” record is created at the run’s start and includes only the begin time. The actual record, without the “_pending” suffix, is created at the run’s end and contains both the begin and end times. This approach ensures the FTS does not wait until the run ends to start moving data files to dCache.
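After the run has ended, the final record can be checked in the same way; assuming the non-pending records are published under a run_records folder that follows the same URL pattern as run_records_pending, the query would be:
curl ${UCONDB_URL}/${EXPERIMENT}_on_ucon_prod/app/data/run_records/configuration/key=${RUN_NUMBER}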