SC02 meeting, FNAL, June 3 '02

Introduction 

We went through each of the panels to identify what content would be included and how it fits into the overall concept. Each panel's primary coordinator talks to the graphic design folks.

1. Why are we here?

Coordinators: Richard Mount & Joel Butler.

Physics of the universe, global science; plasma panel, no door, no projector.

2. Accelerator design

Coordinator: Spence.

Add Global Accelerator; update the simulations to reflect this year's work. Investigate participation from the SLAC group. Static display with monitors, new simulation animation.

3. Detector design & physics simulation

Coordinators: Makoto Asai, Daniel Elvira.

Introduction, complexity, detector simulation, typical detector characterization; simulation before (design) and after (simulated data for error analysis). Example detector (future or operating, or both). Why do we need simulation; model validation (comparing data and simulation); how much computing is involved.

4. Control systems, DAQ, trigger

Coordinator: Steffan Luitz.

Triggering in general, decision times, number of collisions, computing farms for triggering, network processing, disk, BTeV at home, blurring of offline & online, experiment neutral, monitor, control room log book (interactive monitor); emphasize size, complexity, and uptime.
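One way the panel could quantify "decision times, number of collisions, and computing farms for triggering" is the standard farm-sizing estimate below. This is only a sketch with assumed, illustrative rates, not numbers from the meeting:

    # Back-of-the-envelope trigger-farm sizing (illustrative numbers only).
    input_rate_hz = 50_000      # events/s entering the software trigger (assumed)
    decision_time_s = 0.010     # CPU seconds to accept/reject one event (assumed)

    # Little's law: events in flight = arrival rate * time in system,
    # so with one event per node this is also the node count needed.
    nodes_needed = input_rate_hz * decision_time_s
    print(f"~{nodes_needed:.0f} farm nodes")   # -> ~500 farm nodes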

5. Data Storage

Coordinators: Don Petrovic & Yemi Adesanya

Data movement is interesting; actual storage is a bit boring. Data management, access, latency, size of store, Computer World award, interviews with KRON TV. Don Petrovic will add new data technologies; these need to be tied to the physics demands in size, distribution, and access speeds.

6. Grid fabric, infrastructure, networking

Coordinators: Les Cottrell, Dane Skow.

Two SciDAC-funded projects: Rice, LANL & SLAC (INCITE), and SLAC & FNAL. Response times from the booth to the worldwide HEP collaborators. Throughput to the Grid/Tier 0 and 1 sites. Topologies with ISP, RTT, and bandwidth, with drill-down; the topologies will be interactive, with someone to assist. Will also add the growth in throughput achievable across the Atlantic. It is unclear whether we need an overhead display; we are still looking for a killer display for overhead. Add in security and AAA (authorization, authentication, and accounting). Work in the importance of the end points to the physics.
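As a rough sketch of the kind of live round-trip-time measurement the interactive topology display implies (the host names below are hypothetical placeholders, not the actual Tier 0/1 sites, and the real displays would use dedicated monitoring tools rather than this minimal probe):

    import socket
    import time

    # Hypothetical stand-ins for the Grid/Tier 0,1 sites the display would poll.
    SITES = {
        "Tier 0": ("tier0.example.org", 80),
        "Tier 1 (FNAL)": ("tier1-fnal.example.org", 80),
        "Tier 1 (SLAC)": ("tier1-slac.example.org", 80),
    }

    def tcp_rtt_ms(host, port, timeout=5.0):
        """Approximate the RTT as the cost of one TCP handshake."""
        start = time.time()
        with socket.create_connection((host, port), timeout=timeout):
            pass
        return (time.time() - start) * 1000.0

    for label, (host, port) in SITES.items():
        try:
            print(f"{label:15s} {host:25s} ~{tcp_rtt_ms(host, port):6.1f} ms")
        except OSError as err:
            print(f"{label:15s} {host:25s} unreachable ({err})")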

7. Grid distributed production

Coordinator: Greg Graham.

CMS/D0 production, US MOP production. MOP (from PPDG) is a thin-layer front end to Condor-G and Globus, and is now in production for CMS. Also talk about the Globus/Condor collaboration, and something about the Chimera virtual data system coming out of GriPhyN. Footnote on the institutions involved in the collaborations. Tie the content of this panel in with the other panels. Needs two normal monitors: the first will have rotating displays detailing functional aspects; the second will show status displays of distributed production in progress. There appears to be no SLAC component for this panel.
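To give a flavor of what a "thin-layer front end to Condor-G and Globus" means in practice, here is a minimal sketch, not MOP's actual code: the gatekeeper, script, and tag names are hypothetical, and it assumes condor_submit is installed. It writes a Condor-G submit description (universe = globus routes the job through Globus GRAM at the named gatekeeper) and hands it off:

    import subprocess
    import tempfile
    import textwrap

    # Condor-G submit description; names here are hypothetical placeholders.
    SUBMIT_TEMPLATE = textwrap.dedent("""\
        universe        = globus
        globusscheduler = {gatekeeper}/jobmanager-condor
        executable      = {script}
        output          = {tag}.out
        error           = {tag}.err
        log             = {tag}.log
        queue
    """)

    def submit_remote_job(gatekeeper, script, tag):
        """Write the submit description and hand it off to condor_submit."""
        with tempfile.NamedTemporaryFile("w", suffix=".sub", delete=False) as f:
            f.write(SUBMIT_TEMPLATE.format(gatekeeper=gatekeeper,
                                           script=script, tag=tag))
            path = f.name
        subprocess.run(["condor_submit", path], check=True)

    submit_remote_job("gatekeeper.example.edu", "run_production.sh", "prod_001")

Per these notes, MOP wraps this kind of submission path in additional production machinery; the details above are only illustrative.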

8. Grid distributed analysis

Coordinators: Lee Lueking, Adil Hasan

Enabling the worldwide HEP community to perform analysis on extended data sets. Demo of data flowing both ways between two sites. Build on last year's demo (a map of the world with data moving around) on a rear projection, with a replay of old data (not simulated). This would be D0, and might also include BaBar; the BaBar SRB might also fit into this panel. It would use a regular monitor. A second demo on a regular monitor would be the SAM Grid project, which has a test bed working now from which one can submit jobs remotely. Will also have web-based monitoring tools (being developed over the summer). This might also fit in panel 7. The tools use Globus/Condor and are PPDG projects. Somewhere we need to address the collaborations with the computer science people. Panels 7 & 8 need new titles; Greg, Lee, and Adil need to coordinate on this.

9. Physics analysis tools

Coordinators: Joe Perl, Jeff Kallenbach. 

JAS, visualization, Wired, MiniBooNE (neutrino events with Cerenkov rings, which are easy to see with the naked eye and easy to explain; the event rate is OK for a real-time, interactive display), CLARENS (Caltech analysis tool). Three monitors and projection.

10. Theory / Lattice QCD

Coordinators: Don Holmgren, Jim Simone.

Similar to last year: why one needs computing; the physics parameters and associated Tflop-years; a lattice QCD slide show covering what it is, the institutions, funding, goals, software/hardware, and status. Bridge between theory and reality.
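For the "physics parameters and associated Tflop-years" item, a back-of-the-envelope conversion may help visitors. The budget and machine numbers below are made-up placeholders, not figures from the meeting:

    # One Tflop-year as a raw operation count.
    SECONDS_PER_YEAR = 3.156e7
    OPS_PER_TFLOP_YEAR = 1.0e12 * SECONDS_PER_YEAR   # ~3.2e19 operations

    def wall_clock_years(goal_tflop_years, sustained_tflops):
        """Years of running needed on a machine sustaining `sustained_tflops`."""
        return goal_tflop_years / sustained_tflops

    # e.g. a lattice parameter set budgeted at 10 Tflop-years
    # on a cluster sustaining 0.5 Tflop/s:
    print(wall_clock_years(10.0, 0.5))   # -> 20.0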

Graphical Design of the Booth

Two mockups were presented. One had a background of accelerator components and site photos; the other had event displays in the background. The vote was 11:4 in favor of the event displays. We then discussed the color scheme, which for the event displays was black and red, similar to last year. There were concerns that the tracks and text might not show up if we change the dark background to a lighter one. We agreed that a different (lighter) color would be good for the cantilever, and that the text should be in boxes that are transparent to the background.

Schedule

Today we identify the panel coordinators. Between now (June 3) and July 19 the coordinators create/develop the contents, or get them created, in consultation with VMS (FNAL Video Media Services). This includes how many monitors, where one expects the monitors to be located, a rough idea of what is on each monitor, and the associated back-panel text (hard copy). VMS completes the design of the background, and edits are completed (PDFs of the design background are posted), by June 24 '02. From August 9 '02 through August 30, VMS will complete the panels, working closely with the coordinators. FNAL VMS will hand off the files to Freeman for printing 8/9/02. Week of Sep 16 '02, VMS is on hand for printing inspection and travels to Freeman Chicago for the print run. Weeks of Sep 23 through Nov 15 '02, Freeman stores the panels and ships them to Baltimore. Week of Nov 16 '02, Freeman/Phillippe/VMS assemble the display at the conference location in Baltimore.

We have agreed with Freeman to use the same foreman as last year. We do not know if there is a printing facility in Baltimore; this was important last year for last-minute problems (there was a question of where to cut the pictures for printing, which required reprinting on the show floor).

Artifacts

Cloud chamber, cosmic ray detector, neutrino detector.

Administrivia

Send these notes to jeffk@fnal.gov and pcanal@fnal.gov. The next face-to-face meeting is at FNAL between August 12 and 21 '02. All coordinators should plan on attending.

