
IceCube

Data Acquisition :: Status

Presentation before the IceCube Science Advisory Committee

W. R. Edwards / K. Hanson for DAQ

Mar 29, 2006 – Madison, WI


DAQ Capabilities

• Ideally, the DAQ should be a transparent layer between signal and analysis; analyzers should not have to worry about things like deadtime, saturation, charge/time resolution, or other detector effects.

• We must live with physical detector elements: ice, PMT, …

• DAQ hardware (and, downstream, software) is designed to faithfully pass on photon arrival time information:

– Waveform digitization at ~300 MHz to 600 MHz

– Dynamic range of 1 – 10,000 p.e.

– Low noise background: in an event window of 10 µs, 0.5 noise hits per string

– Time-stamping of detector ‘hits’ to a global precision of ~3 ns

– Low deadtime: dominated by readout of the ATWD (30 µs per channel); two ATWDs can be operated in “ping-pong” mode.

• One potential problem is the depth of the ATWD digitizer – only 128 samples per channel (see the arithmetic sketch below). For longer pulses one must fall back on the FADC, which is slower and has less dynamic range.
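To make the 128-sample limitation concrete, here is a quick back-of-the-envelope calculation of the ATWD capture window; the operating sampling rate is an assumption taken from the low end of the range quoted above.

```python
# Back-of-the-envelope ATWD capture window, using figures from the bullets
# above.  The exact operating sampling rate is an assumption (300 MHz, the
# low end of the ~300-600 MHz range quoted).
ATWD_SAMPLES = 128           # samples per ATWD channel
SAMPLING_RATE_HZ = 300e6     # assumed sampling rate

window_ns = ATWD_SAMPLES / SAMPLING_RATE_HZ * 1e9
print(f"ATWD capture window: {window_ns:.0f} ns")   # ~427 ns at 300 MHz
# Pulses longer than this must rely on the FADC, which samples more slowly
# and has less dynamic range.
```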

[Figure: Sample ATWD waveform from a DOM showing the high, medium, and low gain channels for a very large pulse; inset shows a typical SPE pulse.]


DAQ - Overview

• DOMs asynchronously collect hits at an approx 500 Hz rate (an optional LC trigger requirement limits the rate to 5 – 30 Hz)

– Hard LC mode – only hits with coincident hits in neighbor DOMs (up to 4 DOMs distant) are sent up. This results in some loss of hits.

– Soft LC mode – hit compression is dictated by the presence of a neighbor hit, but some information is propagated for all hits. No loss of hit information (except possibly in the SP), but it carries higher overhead on the DAQ to support the additional data rate.

• Periodic readout into the Hub – the hub sends data packets out on TCP sockets to the String Processor (SP).

• Time transformation is done in the SP using RAPCal information (0.2 – 1 Hz rate of TCAL). Local oscillator / cable delays are accounted for in real time! (A clock-transformation sketch follows this list.)

• The SP merges hits on the global timestamp and sends trigger packets to the trigger processors*

• The trigger processors merge hits from the various SPs and form a trigger based on the trigger requirement, then send a readout request to the EB.

• The EB reads out data from the SPs* and builds the event.
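As a rough illustration of the time-transformation step above, the sketch below maps a DOM-local clock reading onto the global (surface master clock) time axis using two RAPCal calibration points. The two-point linear fit and all names are illustrative assumptions, not the production SP algorithm.

```python
# Minimal sketch of a RAPCal-style clock transformation (illustrative only):
# map a DOM-local timestamp onto global time using two calibration points,
# each a (dom_clock, global_clock) pair already corrected for cable delay.

def rapcal_transform(dom_time, cal_a, cal_b):
    """Linearly map a DOM-local timestamp to global time.

    The slope corrects for local-oscillator drift relative to the master
    clock; the offset ties the two time axes together.
    """
    dom_a, glob_a = cal_a
    dom_b, glob_b = cal_b
    slope = (glob_b - glob_a) / (dom_b - dom_a)
    return glob_a + slope * (dom_time - dom_a)

# Hypothetical example: two RAPCal exchanges ~1 s apart (times in ns).
cal_a = (1_000_000, 42_000_000)
cal_b = (1_001_000_000, 1_042_000_050)   # 50 ns of accumulated drift
print(rapcal_transform(500_000_000, cal_a, cal_b))
```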


DOM Mainboard

• In the final leg of production – MBs for 70 – 75 strings will be produced by early next year. No significant changes from 5.0 (the first production article).

• DOM firmware ‘complete’ with some TODOs:

– DOM mainboard compression – this will reduce hit size by approximately a factor of 3. This is lossless compression – no information is thrown away, the bits are just packed more efficiently (an illustrative sketch follows below).

– IceTop-specific enhancements; the requirements are not yet well understood.

• Some worrisome failures observed at Pole this year (1%) – under investigation – not necessarily MB per se:

– 2 DOMs drawing higher-than-normal current – they appear to be operating normally; however, one of them, perhaps not coincidentally, also has LC problems

– 4 DOMs do not power up

– Broken LC – with the Hard LC requirement these DOMs are lost channels; this will change with the Soft LC DAQ

– Bad flash sector on one DOM – this DOM is out of operation for DAQ currently but can be brought back.
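For illustration only – the actual DOM mainboard firmware algorithm is not described here – the sketch below shows one generic lossless scheme in the same spirit: delta-encode the waveform samples so that the (typically small) differences can be packed into far fewer bits, with perfect reconstruction on the surface.

```python
# Generic lossless waveform compression sketch (NOT the DOM firmware
# algorithm): store the first sample plus sample-to-sample differences.
# Small deltas can then be bit-packed into far fewer bits than raw samples,
# and decoding recovers the waveform exactly.

def delta_encode(samples):
    deltas = [samples[0]]
    for prev, cur in zip(samples, samples[1:]):
        deltas.append(cur - prev)
    return deltas

def delta_decode(deltas):
    samples = [deltas[0]]
    for d in deltas[1:]:
        samples.append(samples[-1] + d)
    return samples

waveform = [130, 131, 133, 140, 181, 260, 221, 172, 151, 140, 135, 131]
assert delta_decode(delta_encode(waveform)) == waveform   # lossless round trip
```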


Surface Software

• DAQ S/W was not delivered in time for last year's deployment. Data were taken on String-21 with TestDAQ, which was the primary DAQ tool in the DOM testing arena.

• Shortcomings of TestDAQ: the FPGA image was not optimized for rapid data taking – several kludges required to take data meant that deadtime was large (0.5 ms) – and the surface component did not trigger in real time: hits were written to disk in 15-min chunks and analyzed by a background process (monolith).

• However, TestDAQ is still a viable lightweight testing tool which is flexible and easy to deploy – so it will continue to be used for ‘odd jobs’ such as commissioning and debugging.

• For this pole season, a de-scoped DAQ delivery (called Plan A-): the original design called for a ‘lookback’ mode from the EB to the SP to gather the full hit information, with the SP passing only trigger primitives downstream to the triggers to reduce bandwidth. In ‘Hard’ LC mode the network and processors are able to handle transmission of the full hit info; the EB collects all hit information internally and self-references during the event-building process.

• DAQ code was delivered just in time (the final cut of the first release right before station close). It has been the primary data-taking application since 2/13.


PY05 DAQ Software activities

• Improve stability of current software base (see later slide)

• Implement new features through a series of major releases:

– Supernova scalers

– AMANDA integration

– Hit compression / SLC support

– Plan A+

– IceTop-specific features

• Migration to 64-bit computing platforms and certification of DAQ support for 25 strings next season

• Continue to work on components to improve efficiency, reliability, and maintainability for future operations mode

• Primary focus in near term is on improving stability of DAQ

[Figure: DAQ Issues and Bugs by Date – number of issues / feature requests (Open, Resolved, Closed, Total) vs. date, 03/01/06 – 03/29/06.]


IceCube Events


Simple Majority Trigger – IceCube 9 strings

[Figure: Event rate (Hz) and data rate (bytes/s) vs. SMT multiplicity setting (8 – 24).]

The SMT is currently set to 8 hits in a time window of 2 µs (with hits in a 10 µs window built into the event); a sketch of this window logic follows below.

We don't have the plot, but this agrees reasonably well with MC data: a 30% normalization error, but the shape agrees.
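A minimal sketch of the simple-majority logic described above, assuming the settings quoted (8 hits in a sliding 2 µs window, readout of a surrounding 10 µs window); names and structure are illustrative, not the production trigger code.

```python
# Sketch of a simple majority trigger (SMT), illustrative only.
# Require >= threshold hits inside a sliding trigger window, then read out
# all hits in a wider window around the triggering hit.

def simple_majority_trigger(hit_times_ns, threshold=8,
                            trigger_window_ns=2_000,    # 2 us
                            readout_window_ns=10_000):  # 10 us
    hits = sorted(hit_times_ns)
    triggers = []
    j = 0
    for i, t0 in enumerate(hits):
        # Two-pointer sliding window: advance j past hits within the window.
        while j < len(hits) and hits[j] <= t0 + trigger_window_ns:
            j += 1
        if j - i >= threshold:
            lo, hi = t0 - readout_window_ns / 2, t0 + readout_window_ns / 2
            triggers.append((t0, [t for t in hits if lo <= t <= hi]))
    return triggers

# Overlapping triggers from a single burst would be merged in a real system;
# that bookkeeping is omitted here.
```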


DOM Noise Rates

[Figure: DOM noise rates – DOMs deployed Jan 2005.]


DAQ Livetime – Last 2 Weeks

[Figure: daily DAQ livetime fraction (0 – 100%), 3/10/2006 – 3/27/2006.]

• Since 2/13 the DAQ has collected 200 million events (cf. 80 million for all of 2005).

• DAQ livetime averaged over the period 3/10 to 3/23 was 42% – some small portion of the loss was due to sharing the detector with verification activities.

• DAQ run scripts were modified 3/24 to detect run crashes sooner and restart runs more quickly

• DAQ livetime averaged over period since then has improved considerably (77% average livetime immediately following upgrade).

• Still short of target 90% livetime.

This estimate is calculated by counting events in a 24-hour period. At the current trigger setting we should have a steady rate of 138 events per second (83 physics events per sec), i.e. roughly 11.8 million events per day.
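The livetime estimate above is simple arithmetic: compare the events actually counted in a day against the number expected at the steady trigger rate. A short sketch follows; the observed-event count is a placeholder chosen to reproduce the ~42% figure quoted above.

```python
# Livetime estimate from event counting, following the recipe above.
SECONDS_PER_DAY = 86_400
TRIGGER_RATE_HZ = 138          # total event rate at the current SMT setting

expected_per_day = TRIGGER_RATE_HZ * SECONDS_PER_DAY
observed_per_day = 5_000_000   # placeholder daily event count

livetime = observed_per_day / expected_per_day
print(f"expected {expected_per_day:,} events/day, livetime ≈ {livetime:.0%}")
```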

Since 3/24, 30% of runs terminate abnormally before completion. The major contributors to crashes are (1) JVM crashes and (2) stalled splicer queues.


Conclusions

• DAQ H/W is in very good shape – no indications that the DOM design will fail to meet the science goals for IceCube

• Production of DAQ H/W (DOM-MB, DOR, DSB, Hubs, Master Clock) is well underway – nothing to indicate serious problems here: DOM-DOR communication could be better understood, but it currently meets our in-ice requirement of a 1 Mb/s data rate.

• DAQ S/W had problems with delivery and is still struggling to catch up, but it currently functions for data taking. It’s a large animal, however, and we need to proceed with measured progress. Some concern remains about the long-term maintainability of this software.