
Transcript of "The D0 Level3 Software Review," Daniel Claes, University of Nebraska-Lincoln, April 17, 1999.

Page 1:

The D0 Level3 Software Review

Daniel Claes, University of Nebraska-Lincoln

April 17, 1999

Page 2:

The D0 Level3 Software Trigger

Go ahead…..make my data!!!

• High Level trigger

• fully programmable

• limited reconstruction of the event

• <100 msec decision

• 1000 Hz input → 50 Hz output

Page 3:

The D0 Level3 Group

Go ahead…..make my data!!!

Management: Amber Boehnlein [FNAL], Dan Claes [UNL]

L3 Infrastructure: Gustaaf Brooijmans [FNAL], Moacyr Souza, Carmen Silva [LAFEX]

NT Platform Support & Release: FNAL Computing Division, Paul Padley [Rice], Kors Bos, Onne Peters [NIKHEF]

Triggerlist Programming: Carmen Silva [LAFEX], Elizabeth Gallas [FNAL], Bornali Bhattacherjee [NIU], Hurol Aslan [UNL]

Online Monitoring (L3 Examine): Milan Sinor [Czech Tech]

Unpacking: Ela Barberis, Charles Leggett [LBL], Paul Padley [Rice], Peter Van Gemmeren [FNAL]

Tool Authors: Volker Beuscher, Juan Estrada, Vishnu Zutshi [Rochester]; Ray Beuselinck [Imperial College]; Kors Bos, Onne Peters [NIKHEF]; Gustaaf Brooijmans, Dmitri Denisov [FNAL]; Ken Del Signore, Andre Turcot [Michigan]; Guilherme Lima [LAFEX]; Paul Padley + student [Rice]; Lee Sawyer [LaTech], Andy White [UTA]; Steve Wasserbaech, Andy Haas [UWash]; Terry Wyatt [Manchester]; Daria Zieminska, Chunhui Luo [Indiana]

Page 4:

Run 1 Lessons

As a general rule: a long lead time forces conservatism.

Anticipate your needs and incorporate them into your design. Late ideas, as well as those that simply require a long period of development, become a hard sell once the run has begun.

Casualties:

• L2 CDC vertexing
• hit finding / data compression
• occasionally even new tool parameters

Page 5:

Run 1 Lessons: General System Operations

• No archive and retrieval of run trigger info from analysis code (inadequate record keeping for users)
• No offline database of triggerlists or tools
• No master database of Run / triggerlist version / L2 version

The triggerlist version was never stamped on the event or archived by run number in a database.

Failed to tag candidates with the L2 bit. Filter bit names were more stable than the bit numbers, BUT the names didn't change when their meaning changed.

The trigparse files themselves were archived in the CVS library CONFIGS. Triggermeister/L2 personnel kept hand-made tables as documentation.

The TSUM bank carried the passed trigger/filter names.

The internal “parameter set” number was indecipherable to users.

A need for actual NUMBERS (parameters and thresholds) remains.

Page 6:

Run 1 Lessons: Triggermeister Duties

Too many separate tools and too little documentation for

• implementing and certifying new triggerlists
• monitoring online rates
• documenting changes

The triggerlist tables used by most users were hand-built and did not always reflect what was running online. Not every downloaded change was accompanied by a new version number.

Trigparse provided a single text source for all trigger levels, complete with defaults for the numerous parameters, but it was error prone. Tools had LOTS of parameters… a proliferation of levels of physics-object quality. Proof-reading was instituted, but it was not foolproof and was skipped when rushed.

Run summaries were not designed into the system. The automated end-of-run report we did use was text and had to be parsed to extract numbers into an ntuple.

Page 7:

Run 1 Lessons: Trigger Simulator

Though a separate project, L3 has a vested interest:

• Study efficiency vs. rejection for designing and tuning tools and filters
• Calculate final trigger efficiencies (turn-on curves)
• Estimate trigger rates
• Timing studies
• Implement and test new triggers
• VERIFY behavior by (re)running the trigger (simulator) on real data
• VERIFY: every user becomes a potential verifier of the system!

Background studies require vast statistics, which is tough when they require raw data.

Needs Mark & Pass (monitor stream) data

Running the same filtering code as the online trigger executable helps find triggerlist-dependent bugs.

A bit-by-bit comparison of ONLINE vs. SIMULATION is the only way to find history-dependent bugs.

• The ONLINE and OFFLINE environments differed; the ONLINE data layout was never fully replicated.
• The Run I simulator was available on only a single platform!

Page 8:

Run 1 Lessons: ONLINE Bugs

•Releases were driven by ONLINE needs

Not counted in this tally:
• new tool releases and tool improvements/speedups
• simple added protection (error checking) or expanded banks
• simple compilation/link errors discovered in RELEASE
• ERMSG-handling bugs

Errors that took several tries to fix were NOT double counted.

But most released changes were to the SIMULATOR. No simulator bugs are recorded here, only errors that made it ONLINE into an L2 node.

Fixes to excessive or irrelevant ERMSGs were not counted.

By definition, no accounting of errors found by OFFLINE verification.

Page 9:

Run 1 Lessons: ONLINE Bugs (Problem Seen / Error Isolated & Fixed)

Inspection of CODE or LINK/Release setup
Running SIMULATOR (examining rates/output)   11
Debugging SIMULATOR
Inspecting real DATA                         12
SHADOW tests of released/running code         8
ONLINE rates                                  4
ONLINE ERMSGs                                 3
ONLINE crashes                               19
EXAMINE plots                                 5
ONLINE Edebug (REGULAR nodes)
Monitor stream VERIFICATION                   4

Page 10:

Run 1 Lessons: ONLINE Bugs (Problem Seen / Error Isolated & Fixed)

Inspection of CODE or LINK/Release setup     35
Running SIMULATOR (examining rates/output)   11
Debugging SIMULATOR                          19
Inspecting real DATA                         12
SHADOW tests of released/running code         8   5
ONLINE rates                                  4
ONLINE ERMSGs                                 3
ONLINE crashes                               19
EXAMINE plots                                 5
ONLINE Edebug (REGULAR nodes)                 6
Monitor stream VERIFICATION                   4

Page 11:

Run 1 Lessons: ONLINE Bugs (Problem Seen / Error Isolated & Fixed)

Inspection of CODE or LINK/Release setup     20 / 15
Running SIMULATOR (examining rates/output)    6 / 5
Debugging SIMULATOR                          10 / 9
Inspecting real DATA                          8 / 4
SHADOW tests of released/running code         5 / 3    3 / 2
ONLINE rates                                  1 / 3
ONLINE ERMSGs                                 1 / 2
ONLINE crashes                               10 / 9
EXAMINE plots                                 3 / 2
ONLINE Edebug (REGULAR nodes)                 2 / 4
Monitor stream VERIFICATION                   4

Page 12:

Run 1 Lessons: Verification (Monitoring Data)

• Calculate final trigger efficiencies (turn-on curves)
• VERIFY behavior by (re)running the trigger (simulator) on real data

Mark & Pass data: record all L3 trigger information and mark the event with the final trigger decision, but write every event regardless of trigger decision.

• Could not ALWAYS dump crashed events.
• As long as the ONLINE and OFFLINE environments differ, a PLAYBACK mode is needed.
• Trigger studies: SPECIAL dedicated (100%) Mark & Pass runs
• Monitoring: SPECIAL prescaled (few %) stream

Developers must be able to test code in an environment as close to the real L3 as possible: PLAYBACK.

A data-taking node configured to read events from an archived file (standard collider data sets or crashed events dumped to disk). Allows repeated re-execution on crashed events to find the cause. Could provide additional certification of new executables as part of a baseline check of new tools and/or bug fixes.

Page 13:

Run 1 Lessons: Verification (Pre-Release Verification)

• Intended to catch all errors before release.
• Intended as the final integrated test of individually author-tested code.
• IN PRACTICE: too much code was released too early, without adequate unit testing.
• Need a checklist of unit tests to be performed.

• Requires LARGE and VARIED sets of DATA and Monte Carlo:
  • Need to adequately cover the phase space of all parameter changes.
  • Need up-to-date data samples (to reflect hardware changes and be sensitive to bug fixes).

• A large number of errors were found by simple inspection, which suggests frequent CODE reviews.

Page 14:

Run 1 Lessons: Verification (Pre-Release Verification)

• Too much code was released too early, without adequate unit testing. Need more systematic testing before release, with a checklist of unit tests.

• Bugs found by simple inspection suggest frequent CODE reviews.

The L3 Filtering Group was an early advocate of Dave Adams' CTEST interface, which supports component-level testing as part of a release; it has been adopted and enforced as a Level 3 standard. We require test cases for all components, including a ScriptRunner emulation that is triggerlist driven, as well as integrated tests (which run a framework package over multiple event samples). A sketch of such a component test follows.
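For illustration only, a minimal sketch of what such a component-level unit test might look like; none of these names come from the actual CTEST interface, and the "tool" under test is a stand-in:

// Minimal sketch of a component-level test in the spirit of the CTEST
// requirement above: run one tool over a canned event and check its
// output. All names here are illustrative stand-ins.
#include <cassert>
#include <cstddef>
#include <vector>

struct Candidate { float ET; float eta; };

// Stand-in for the component under test: pairs of (ET, eta) become
// candidates if they pass an illustrative seed threshold.
std::vector<Candidate> runJetTool(const std::vector<float>& rawData) {
    std::vector<Candidate> out;
    for (std::size_t i = 0; i + 1 < rawData.size(); i += 2) {
        Candidate c = { rawData[i], rawData[i + 1] };
        if (c.ET > 3.0f) out.push_back(c);
    }
    return out;
}

int main() {
    // Canned "event": (ET, eta) pairs.
    std::vector<float> event;
    event.push_back(15.0f); event.push_back(0.4f);
    event.push_back(2.0f);  event.push_back(1.1f);   // below threshold

    std::vector<Candidate> jets = runJetTool(event);
    assert(jets.size() == 1);          // exactly one candidate survives
    assert(jets[0].ET == 15.0f);
    return 0;
}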

• The ScriptRunner design and implementation was externally reviewed.
• With C++ programming NEW to most developers, internal CODE reviews are natural, and a useful learning experience both for the author under review and for the new developers in attendance.

Page 15:

Run 1 Lessons: Code Releases

• Must have well-controlled, version-stamped archives.
• Code & executables accessible "for all time."

•MUST have a SEPARATE L3 production release

•Could not always enforce: code ONLY from official CVS

•Difficult to keep up-to-date with the main OFFLINE archive

Retrofits of fixes to the simulator were necessary even long after the run finished.

The time scale (urgency) for L3 releases differs from OFFLINE. Last run, L2PRODUCTION got a late start (after Run 1 was already in progress).

Must never link in PRIVATE code.

In Run 1 we "froze" the entire D0Library as our baseline: 40 separate libraries, 5000 routines.

It took 3 months to update/reset the base libraries and certify the ONLINE executable. Next run, if possible: work with a clearly delineated subset of libraries under the control of the L3 release! Make compilation on multiple platforms part of the release procedures!

Page 16:

Run 2 Level 3 Packages: Library Organization

ScriptRunner     Driver for the L3 filtering system: executes filter scripts in triggerlist-directed order, returns the event's trigger decision
l3exceptions     L3ErrHandleTool
l3fMcTool        Monte Carlo tools to exercise the framework (Ele, Muo, Pho, Tau)
l3fMuonTools     newly created
l3fTestClasses   CalReco, doReco
l3filters        classes that call tools, make cuts, and function as nodes in the traversal tree
l3fparser        parses triggerlists
l3registry       filter map & tool map code
l3fresults       L3ToolResults class and specific tool-inherited classes
l3fshop          Atool
l3ftoolbase      bTool, dataTool
l3ftools         example tools, utility tools to exercise test packages

Page 17:

The D0 Level3 Software Trigger

• TOOLS perform partial reconstruction. Their code implements the algorithms that identify the physics objects:

• JET
• ELECTRON
• PHOTON
• CENTRAL PRESHOWER
• Fiber TRACKER
• MUON (Forward/Central)
• TAU
• GLOBAL (Missing ET, Scalar ET)

• Return a list of candidates, but make no event selection or trigger decision.

• Multiple INSTANCES of any tool distinguished by different refsets.

Page 18:

Level 3 Tools

bTool: the fundamental tool base class, from which are derived:

  triggerTools: general infrastructure functionality used by many tools (L3Prescale, L3MarkandPass, L3ErrHandleTool)

  dataTools: supply the physics information that FILTERS will cut on:

    physicsTools: contain a templated physics results class

    unpackTools: the only tool base class that accesses the event data
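A minimal C++ sketch of this hierarchy; only the class names (bTool, triggerTool, dataTool, physicsTool, unpackTool) come from the slide, while the constructors and the execute() entry point are illustrative assumptions:

// Hypothetical sketch of the Level 3 tool hierarchy described above.
#include <string>

class bTool {                          // fundamental tool base class
public:
    explicit bTool(const std::string& name) : _name(name) {}
    virtual ~bTool() {}
    virtual int execute() = 0;         // assumed entry point a FILTER calls
    const std::string& name() const { return _name; }
private:
    std::string _name;
};

class triggerTool : public bTool {     // L3Prescale, L3MarkandPass, L3ErrHandleTool
public:
    explicit triggerTool(const std::string& n) : bTool(n) {}
};

class dataTool : public bTool {        // supplies physics information FILTERS cut on
public:
    explicit dataTool(const std::string& n) : bTool(n) {}
};

class physicsTool : public dataTool {  // holds a (templated) physics results class
public:
    explicit physicsTool(const std::string& n) : dataTool(n) {}
};

class unpackTool : public dataTool {   // the only base class that touches raw event data
public:
    explicit unpackTool(const std::string& n) : dataTool(n) {}
};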

Page 19:

physicsTools

L3PhysicsResults contains the basic physics results each tool must calculate. The results class for any specific physicsTool inherits from it, e.g. L3ElePhysicsResults, L3JetsPhysicsResults, and contains the following attributes:

  LorentzVector _kineResults;
  SpaceVector _vertexUsed;
  float _detectorEta;
  float _ET;
  l3string _name;

in addition to attributes unique to each specific physicsTool (track parameters, energy fractions, etc.), with methods to return these values, e.g.

  const LorentzVector* get_kineResults( );
  const SpaceVector* get_vertexUsed( );
  etc.
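Assembled as compilable C++, the pattern might look like this sketch; the structs are simple stand-ins for the ZOOM LorentzVector/SpaceVector classes and the l3string type, and the extra attribute in the derived class is illustrative:

// Sketch of the results-class pattern described above. The attribute and
// accessor names come from the slide; the types are stand-ins.
#include <string>

typedef std::string l3string;
struct LorentzVector { double px, py, pz, E; };
struct SpaceVector   { double x, y, z; };

class L3PhysicsResults {
public:
    const LorentzVector* get_kineResults() const { return &_kineResults; }
    const SpaceVector*   get_vertexUsed()  const { return &_vertexUsed; }
    float get_detectorEta() const { return _detectorEta; }
    float get_ET()          const { return _ET; }
    const l3string& get_name() const { return _name; }
protected:
    LorentzVector _kineResults;
    SpaceVector   _vertexUsed;
    float         _detectorEta;
    float         _ET;
    l3string      _name;
};

// A specific physicsTool's results class inherits and adds its own
// attributes (track parameters, energy fractions, ...):
class L3ElePhysicsResults : public L3PhysicsResults {
public:
    float get_emFraction() const { return _emFraction; }  // illustrative extra
private:
    float _emFraction;
};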

Page 20:

The D0 Level3 Software Trigger

• Calls to TOOLS are made by FILTERS.

• FILTERS check returned candidates against the parameter set (i.e. number, threshold, range) to make the actual pass or fail decision (a sketch of this check follows the example below).

• FILTERS correspond to individual line entries in the trigger programming.

Filter 4 ele_jet_max

L3Ele ‘e3’ ELE_LOOSE N=1 ET_min=15.0 ETA_max=2.5

L3Jet ‘j4’ JET_07 N=2 ET_min=16.0 ETA_max=2.5
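As a hedged sketch of the pass/fail logic such a filter line implies (the struct and function names are illustrative, not the actual l3filters code): count the tool's candidates that pass the cuts and require at least N of them.

// Sketch of the decision a line such as
//   L3Ele 'e3' ELE_LOOSE N=1 ET_min=15.0 ETA_max=2.5
// implies: the tool supplies candidates, the filter makes the decision.
#include <cmath>
#include <cstddef>
#include <vector>

struct Candidate { float ET; float eta; };   // stand-in physics object

bool filterPasses(const std::vector<Candidate>& candidates,
                  int N, float ET_min, float ETA_max)
{
    int nPassed = 0;
    for (std::size_t i = 0; i < candidates.size(); ++i) {
        if (candidates[i].ET > ET_min && std::fabs(candidates[i].eta) < ETA_max)
            ++nPassed;
    }
    return nPassed >= N;   // the FILTER, not the TOOL, passes or fails the event
}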

Page 21:

Level 3 Filters

Trigger Filters make no physics judgement of the event:
• prescaling: pass_1_of_n (Monitor events)
• Mark and Pass
• streaming

Physics Object Filters determine whether any returned physics candidates pass the trigger selection criteria.

Relational Filters combine passed candidates from one or more filters into a composite quantity (see the sketch after this list):
• rapidity differences
• azimuthal differences
• invariant mass
• transverse mass
• HT
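As an example of the kinematics behind a relational filter such as L3InvMass (used by filter 9 JPSI_ee later in the talk), a small sketch; the four-vector struct is a stand-in for the ZOOM LorentzVector:

// Combine two passed candidates into a composite quantity and cut on it.
#include <cmath>

struct P4 { double px, py, pz, E; };

double invariantMass(const P4& a, const P4& b)
{
    const double E  = a.E  + b.E;
    const double px = a.px + b.px;
    const double py = a.py + b.py;
    const double pz = a.pz + b.pz;
    const double m2 = E*E - px*px - py*py - pz*pz;
    return m2 > 0.0 ? std::sqrt(m2) : 0.0;   // guard against round-off
}

// An L3InvMass-style filter would then require
//   MASS_min < m && m < MASS_max
// for some pair of candidates passed by the referenced filters.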

Page 22:

D0 Run I Partial Trigger List

Trig Bit 1
  HW trigger condition: 1 Large Tile > 15 GeV
  Filters: JET_30 (1 jet > 30.0 GeV)

Trig Bit 2
  HW trigger condition: 2 μ, |η| < 1.7; 1 Large Tile > 10 GeV
  Filters: MU_1_JET (L15_confirm, 1 μ (tight) > 9.0 GeV)
           DIMU_JET (L15_confirm, 1 μ (tight) > 9.0 GeV, 2 μ (loose) > 9.0 GeV)

Trig Bit 3
  HW trigger condition: 1 e, Et > 12.0 GeV
  Filters: EM1_EISTRKCC_MS (1 e > 20 GeV, Isolation, Track; EtMISS > 15 GeV)
           EM1_EIS06_MS (1 e > 20 GeV, Isolation, 0.6 Cone; EtMISS > 15 GeV)
           EM1_ESC (1 e > 60 GeV)
           EM1_EISTRKCC_ESC (1 e > 20 GeV, Isolation, Track; 2 e > 16 GeV)
           EM1_2EISTRK_MS (1 e > 12 GeV, Isolation; 2 e > 7 GeV, Isolation, Track; EtMISS > 7 GeV)

Page 23:

The D0 Level3 Software Trigger

• Tools use information across detector subsystems. Example: an electron combines tracking, calorimeter, and preshower information.

• Full unpacked raw data should be available to all tools

• Tools accept parameters governing the execution of their algorithm (quality of the physics object)

• Each unique set of parameters defines a new instance of each tool

JET_07      JET(JT_SEEDS, VTX_DEFAULT, CONE=0.7, EMFR_MIN=0.5, EMFR_MAX=0.95)

ELE_TIGHT   ELE(EM_SEEDS, EM_TRACK, ISO=0.12, EM_FRAC=0.95, ETA_MAX=1.0)

ELE_LOOSE   ELE(EM_SEEDS, EM_TRACK, ISO=0.15, EM_FRAC=0.90, ETA_MAX=1.0)

ELE_ESCAPE  ELE(EM_SEEDS, EM_TRACK, ISO=0.20, EM_FRAC=0.90, ETA_MAX=1.0)

Page 24:

L3refset
  JT_SEEDS   JTSEED(3.0)
  EM_SEEDS   EMSEED(1.5)
  EM_TRACK   TRACK(0., 'FTF')

  JET_07     JET(JT_SEEDS, VTX_DEFAULT, CONE_SIZE=0.7, EM_FR_MIN=0.5, EM_FRAC_MAX=0.95)
  ELE_TIGHT  ELE(EM_SEEDS, EM_TRACK, ISO=0.12, EM_FRAC=0.95, ETA_MAX=1.0)
  ELE_LOOSE  ELE(EM_SEEDS, EM_TRACK, ISO=0.15, EM_FRAC=0.90, ETA_MAX=0.5)
  ELE_ESCAPE ELE(EM_SEEDS, EM_TRACK, ISO=0.20, EM_FRAC=0.90, ETA_MAX=1.0)

TRIG_1
  filter 1 jet_min
    L3Jet 'jmn' JET_07 N=1 ET_min=15.0 ETA_max=4.0

TRIG_2
  filter 2 jet30
    L3Mark&Pass 50000 'Monitor'
    L3Jet 'j30' JET_07 N=1 ET_min=30.0 ETA_max=2.5

TRIG_3
  filter 3 ele_jet_high
    L3Ele 'e3' ELE_LOOSE N=1 ET_min=15.0 ETA_max=2.5
    L3Jet 'j3' JET_07 N=2 ET_min=10.0 ETA_max=2.5
  filter 4 ele_jet_max
    L3Ele 'e4' ELE_LOOSE N=1 ET_min=15.0 ETA_max=2.5
    L3Jet 'j4' JET_07 N=2 ET_min=16.0 ETA_max=2.5
  filter 5 ele_jet_1
    L3Ele 'e5' ELE_LOOSE N=1 ET_min=30.0 ETA_max=4.0
    L3Jet 'j5' JET_07 N=1 ET_min=30.0 ETA_max=4.0

Page 25:

The D0 Level3 Software Trigger

• A TOOL performs calculations on the unpacked RAW DATA.

• Each TOOL generates a list of candidate physics objects satisfying the quality defined by its parameter set.

• Each TOOL instance caches its results for re-use in the case of multiple calls (a sketch follows below).

• The L3ToolResults holds a pointer to L3PhysicsResults, which contains the ZOOM class LorentzVector (ET, η, φ) and an abstract base ToolSpecifics* class.

• Failure to find satisfactory candidates of the quality defined by the parameters terminates further execution of the algorithm
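A minimal sketch of the per-instance caching described above, assuming results are keyed by event number; all names are illustrative, not the actual tool-base code:

// The algorithm runs at most once per event; repeat calls from other
// filters return the cached candidate list.
#include <vector>

struct Candidate { float ET; float eta; };   // stand-in physics object

class CachingTool {
public:
    CachingTool() : _cachedEvent(-1) {}
    const std::vector<Candidate>& candidates(long eventNumber) {
        if (eventNumber != _cachedEvent) {   // first call for this event
            _results = runAlgorithm();       // expensive partial reconstruction
            _cachedEvent = eventNumber;
        }
        return _results;                     // cheap on subsequent calls
    }
private:
    std::vector<Candidate> runAlgorithm() { return std::vector<Candidate>(); } // placeholder
    std::vector<Candidate> _results;
    long _cachedEvent;
};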

Page 26:

L3refset
  JT_SEEDS   JTSEED(3.0)
  EM_SEEDS   EMSEED(1.5)
  EM_TRACK   TRACK(0., 'FTF')

  JET_07     JET(JT_SEEDS, VTX_DEFAULT, CONE_SIZE=0.7, EM_FR_MIN=0.5, EM_FRAC_MAX=0.95)
  ELE_TIGHT  ELE(EM_SEEDS, EM_TRACK, ISO=0.12, EM_FRAC=0.95, ETA_MAX=1.0)
  ELE_LOOSE  ELE(EM_SEEDS, EM_TRACK, ISO=0.15, EM_FRAC=0.90, ETA_MAX=0.5)
  ELE_ESCAPE ELE(EM_SEEDS, EM_TRACK, ISO=0.20, EM_FRAC=0.90, ETA_MAX=1.0)

TRIG_1
  filter 1 jet_min
    L3Jet 'jmn' JET_07 N=1 ET_min=15.0 ETA_max=4.0

TRIG_2
  filter 2 jet30
    L3Mark&Pass 50000 'Monitor'
    L3Jet 'j30' JET_07 N=1 ET_min=30.0 ETA_max=2.5

TRIG_3
  filter 3 ele_jet_high
    L3Ele 'e3' ELE_LOOSE N=1 ET_min=15.0 ETA_max=2.5
    L3Jet 'j3' JET_07 N=2 ET_min=10.0 ETA_max=2.5
  filter 4 ele_jet_max
    L3Ele 'e4' ELE_LOOSE N=1 ET_min=15.0 ETA_max=2.5
    L3Jet 'j4' JET_07 N=2 ET_min=16.0 ETA_max=2.5
  filter 5 ele_jet_1
    L3Ele 'e5' ELE_LOOSE N=1 ET_min=30.0 ETA_max=4.0
    L3Jet 'j5' JET_07 N=1 ET_min=30.0 ETA_max=4.0

Page 27:

physicsTools: Code Development

Understanding efficiencies is easier the closer the ONLINE tools are to the OFFLINE reconstruction; every trigger level bumps thresholds up 2-3, in addition to any systematics introduced by the use of different algorithms.

But L3 will see 50x as much data as OFFLINE, and with limited resources (time!), SPEED dictates that tools have MEMORY of previous processing.

A weak point identified in Run 1 Level 2 was communication from/between tools: ESUM proved insufficient for all applications, tool outputs were tricky for others to use again, and tool outputs were difficult to decode offline.

Page 28:

physicsTool Developers

Calorimetry
  Jet                   Volker Beuscher (Rochester)
  Electron              Juan Estrada, Vishnu Zutshi (Rochester)
  Photon                Greg Landsberg (Brown)
  Central PreShower     Andre Turcot (Michigan)
  Tau                   Paul Padley + student (Rice)
  Global (MET, ScalET)  Andy White (UTA), Lee Sawyer + student (LaTech)

Muon
  Central Muon          Daria Zieminska, Chunhui Luo (Indiana); Steve Wasserbaech, Andy Haas (UWash); Kors Bos, Onne Peters (NIKHEF)
  Forward Muon          Dmitri Denisov (FNAL)

Tracking
  Vertex                Guilherme Lima (LAFEX)
  Secondary Vertex      Gustaaf Brooijmans (FNAL), Terry Wyatt (Manchester)
  Fiber Tracker         Ken Del Signore (Michigan), Ray Beuselinck + student (Imperial College)

Page 29:

physicsTool Code Development: USER SUPPORT

LEVEL 3 ONLINE Tutorials: http://www-d0.fnal.gov/~hur/L3mainpage.html

Users Support Mailing List: [email protected] (staffed by Amber, Gustaaf, and Dan)
and an associated filtered archive of FAQs: http://www.egroups.com/list/d0l3fsup

Documentation:
  D0Note #3266: "Description of D0 L3 Trigger Software"
  D0Note #3630: "Description of D0 L3 Trigger Software Components"
  D0Note #xxxx: "Description of D0 L3 Software Development Environment"
  "Interactions Between L3 Framework and ScriptRunner"

Page 30:

Page 31:

Framework ScriptRunner interactions

• Initialization
• Download Trigger Configuration
• BeginRun
• ProcessEvent
• MonitorL3
• Disable/EnableL3
• JobSummary
• ChangeParameters
• EndRun
• Drop/AddL2Bits

Page 32:

Framework ScriptRunner interactions

• Initialization: the L3 framework passes ScriptRunner the location of the calibration and geometry information, which is cached at this time.

• Download Trigger Configuration: the L3 framework passes the L3-relevant pieces of the trigger programming to ScriptRunner.

Page 33:

Framework ScriptRunner interactions

• Download Trigger Configuration

• A fixed registry of all officially released tools will be kept with a list of their legal parameters.

• ScriptRunner parses the trigger programming line by line, demanding a one-to-one correspondence in sequence between strings in the refset, filter lines, and the fixed registry's parameter list.

Page 34:

Framework ScriptRunner interactions

• Download Trigger Configuration

• All Filters and Tools are promptly instantiated as Singletons, caching their parameters.

• For Filters, a pointer to a Tool object, keyed by name, is returned. An element is appended to the linked-list execution tree.

• Note: "key"ed filters will allow filters to call filters.
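A rough sketch of this download step, assuming a map-backed singleton registry and a std::list for the execution tree; none of these names are the actual l3registry interfaces:

// One singleton Tool per refset name; filter nodes appended to the
// linked-list execution tree at download time.
#include <list>
#include <map>
#include <string>
#include <utility>

class Tool { /* caches its parameters at download time */ };

Tool* toolInstance(const std::string& key) {
    static std::map<std::string, Tool*> registry;   // one singleton per key
    std::map<std::string, Tool*>::iterator it = registry.find(key);
    if (it == registry.end())
        it = registry.insert(std::make_pair(key, new Tool())).first;
    return it->second;
}

struct FilterNode {
    Tool* tool;   // pointer to the Tool object, keyed by name
    // cut values parsed from the filter line would be cached here
};

int main() {
    std::list<FilterNode> executionTree;            // the linked-list execution tree
    FilterNode node = { toolInstance("ELE_LOOSE") };
    executionTree.push_back(node);                  // appended during download
    return 0;
}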

Page 35:

Framework ScriptRunner interactions

• BeginRun: the framework passes the run number and the list of active L2 bits.

• ProcessEvent: ScriptRunner traverses the execution tree.
  • FILTER NODES launch traversal through the filters that were defined by their filter scripts.
  • FILTERS kick their tools, check the returned STATUS, compare candidates against cuts, and generate a list of indices pointing into the tool candidate list for the passed objects.
  • A virtual method in the Filters base class provides a public accessor returning the address of the tool used.

Page 36:

Framework ScriptRunner interactions

• ScriptRunner traverses the execution tree. On the RETURN PATH:
  • the filters' (*mother*) status is updated; scalars (#calls, #passed) and timing are updated
  • after any L3 branch is successfully traversed (at least one L3 filter bit passed), invoke:
    Report( ): traverse all filters again, writing each NodeStatus and *mother* to the event
    It.Flush( ): loop over all registered tools, writing out data for each with "IHaveData" = true
    It.ToolsReset and It.FiltersReset
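A rough sketch of this traversal and return-path bookkeeping, under the assumption that the filters within a script are ANDed while the L3 branches are ORed; the names and the exact combination logic are assumptions, not the actual ScriptRunner code:

#include <cstddef>
#include <vector>

struct Node {
    std::vector<Node*> children;   // filters launched by this filter script
    int nCalls, nPassed;
    bool status;
    Node() : nCalls(0), nPassed(0), status(false) {}
    bool cut() { return true; }    // placeholder for a leaf filter's decision
};

// Depth-first traversal: a node passes only if all filters in its script
// pass; scalars and status are updated on the return path.
bool traverse(Node& n) {
    bool passed = n.cut();
    for (std::size_t i = 0; i < n.children.size(); ++i)
        passed = traverse(*n.children[i]) && passed;
    ++n.nCalls;                    // scalars updated as the recursion unwinds
    if (passed) ++n.nPassed;
    n.status = passed;             // the *mother* filter's status
    return passed;
}

// Top level: if at least one L3 branch (filter bit) passed, Report() and
// Flush() would run here, followed by ToolsReset and FiltersReset.
bool processEvent(std::vector<Node*>& branches) {
    bool anyPassed = false;
    for (std::size_t i = 0; i < branches.size(); ++i)
        if (traverse(*branches[i])) anyPassed = true;
    return anyPassed;
}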

Page 37:

Framework ScriptRunner interactions

ProcessEvent: FILTERS return a STATUS indicating

• Passed
• Failed

and, in addition, for events with no valid trigger decision:

• Bad data: unpacking errors detected
• L2 Unbiased: no valid L2 trigger decision
• L3 Unbiased: special Mark&Pass data, force passed
• Out of time: event flushed for lack of resources
• Out of memory: event flushed for lack of resources
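Written as a C++ enumeration, the status set above might look like this sketch (the enumerator names are illustrative, not the actual L3 types):

enum L3FilterStatus {
    kPassed,        // event passed the filter
    kFailed,        // event failed the filter
    kBadData,       // unpacking errors detected: no valid trigger decision
    kL2Unbiased,    // no valid L2 trigger decision
    kL3Unbiased,    // special Mark&Pass data, force passed
    kOutOfTime,     // event flushed for lack of resources
    kOutOfMemory    // event flushed for lack of resources
};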

Page 38:

Framework ScriptRunner interactions

• MonitorL3: the framework requests a binary file with information on pass rates and timing spanning the entire execution tree.

• Disable/EnableL3: allows the disabling of filters when necessitated by hardware problems.

• JobSummary

• ChangeParameters: allows the resetting of prescales.

• EndRun: ScriptRunner resets counters.

• Drop/AddL2Bits: drops the list of trigger nodes from the execution tree.

Page 39:

TRIG_4
  filter 6 jet_multi
    L3Jet 'j6' JET_07 N=5 ET_min=10.0 ETA_max=2.5
  filter 7 jet_multi_x
    L3Jet 'j7' JET_07 N=5 ET_min=15.0 ETA_max=4.0

TRIG_5
  filter 8 di_ele_high
    L3Mark&Pass 100000 'Monitor'
    L3Ele 'ze1' ELE_TIGHT N=1 ET_min=20.0 ETA_max=1.1
    L3Ele 'ze2' ELE_LOOSE N=2 ET_min=15.0 ETA_max=2.5
  filter 9 JPSI_ee
    L3Prescale 100
    L3Ele 'e11' ELE_LOOSE N=1 ET_min=3.0 ETA_max=2.5
    L3Ele 'e12' ELE_ESCAPE N=2 ET_min=3.0 ETA_max=2.5
    L3InvMass 'e11' 'e12' MASS_min=1.5 MASS_max=4.0

Page 40:

filter 9 JPSI_ee
  L3Prescale 100
  L3Ele 'e11' ELE_LOOSE N=1 ET_min=3.0 ETA_max=2.5
  L3Ele 'e12' ELE_ESCAPE N=2 ET_min=3.0 ETA_max=2.5
  L3InvMass 'e11' 'e12' MASS_min=1.5 MASS_max=4.0

“KEY”ed filters

build list of indices for passed candidates

provide public accessor for address of tool used

Page 41:

L3refset
  EM_SEEDS   EMSEED(3.0)
  EM_TRACK   TRACK(0., 'FTF')
  ELE_TIGHT  ELE(EM_SEEDS, EM_TRACK, ISO=0.12, EM_FRAC=0.95, ETA_max=1.0)
  ELE_LOOSE  ELE(EM_SEEDS, EM_TRACK, ISO=0.15, EM_FRAC=0.90, ETA_max=0.5)
  ELE_ESCAPE ELE(EM_SEEDS, EM_TRACK, ISO=0.20, EM_FRAC=0.90, ETA_max=1.0)

  ## "Atool" is instantiated for use with the L3Ele test filter
  Atool TEST_ myELE(EM_SEEDS, EM_TRACK, ISO=0.15, EM_FRAC=0.90, ETA_max=1.0)

TRIG_1
  filter 1 di_electron_high
    L3Ele 'ze1' ELE_LOOSE N=1 ET_min=20.0 ETA_max=1.1
  filter 2 di_electron_esc
    L3Ele 'ze3' ELE_ESCAPE N=1 ET_min=20.0 ETA_max=1.1

TRIG_2
  filter 3 Whatever
    L3Ele 'ze2' ELE_TIGHT N=1 ET_min=20.0 ETA_max=1.1
    L3Ele 'test' Atool N=1 ET_min=20.0 ETA_max=1.1

Page 42:

Test-Development Shop: introducing test versions of a tool at the triggerlist level

Parser recognizes TOOL types prefixed by TEST_ as special

Atool TEST_ (EM_SEEDS, EMTRK,0.15,0.90,1.0)

calls

Asingleton::Instance("Atool", "myELE") (the name can't be the same as that of any officially registered tool)

then a FILTER like

  L3Ele 'test' Atool N=1 ET_min=20.0 ETA_max=1.1

will actually run myELE (implemented in Atool.h, Atool.c) in place of the official electron tool L3Ele.
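A sketch of how this TEST_ dispatch might be wired, with stand-ins for the official registry and the Asingleton factory; all function names are assumptions:

#include <string>

class Tool { };

// Stand-ins for the official registry and the test-shop factory
// (Asingleton::Instance in the slide).
Tool* makeOfficialTool(const std::string& /*type*/) { return new Tool(); }
Tool* makeTestTool(const std::string& /*toolClass*/,
                   const std::string& /*instanceName*/) { return new Tool(); }

// When the parser sees "Atool TEST_ myELE(...)", it routes the request to
// the test shop instead of the official registry; a later filter line
// naming Atool then runs myELE in place of the official tool.
Tool* resolveTool(const std::string& toolClass, bool isTest,
                  const std::string& instanceName) {
    if (isTest)
        return makeTestTool(toolClass, instanceName);  // e.g. ("Atool", "myELE")
    return makeOfficialTool(toolClass);
}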

Page 43:

The D0 Level3 Software Trigger

• Development and online DAQ will be under the Windows NT operating system

• An OFFLINE version of the LEVEL3 executable must be available for:
  • studying rates and efficiencies from Monte Carlo data for developing trigger lists
  • measuring trigger "turn-on" curves from collider data
  • understanding trigger bias
  • monitoring/verifying online results
  • bit-by-bit comparison of offline vs. online L3 output, which certifies the simulator and detects history-dependent bugs

• This OFFLINE version must be available on multiple platforms (WinNT and UNIX) for collaboration-wide use.

Page 44:

The D0 Level3 Software Trigger

Conclusions

• Under the control of ScriptRunner we have implemented a scheme of TOOLS generating candidate lists and FILTERS applying cuts to candidates,

• sequenced by an execution tree of linked lists built when the trigger programming is parsed.

• ScriptRunner will provide timing and pass rates at both the tool and filter level.

• Multiplatform OFFLINE versions of the ONLINE executable will facilitate development, debugging, verification, monitoring, and physics studies of efficiencies and rates.