Berkeley COMPSCI C267 - Big Bang, Big Iron

Contents

Big Bang, Big Iron: High Performance Computing and the Cosmic Microwave Background
The Cosmic Microwave Background
CMB Physics Drivers
The Concordance Cosmology
Observing The CMB
CMB Satellite Evolution
The Planck Satellite
Beyond Planck
CMB Data Analysis
The CMB Data Challenge
CMB Data Analysis Evolution
Scaling In Practice
Aside: HPC System Evaluation
MADbench2 I/O Evaluation
MADmap for Planck Map Making
Planck First Light Survey
Planck Sim/Map Target
Slide 18
On-The-Fly Simulation
Current Planck State-Of-The-Art: CTP3
MADmap Scaling Profile
Next Generation HPC Systems
Ongoing Research
Conclusions


Big Bang, Big Iron: High Performance Computing and the Cosmic Microwave Background

Julian Borrill
Computational Cosmology Center, LBL
Space Sciences Laboratory, UCB
with Chris Cantalupo, Ted Kisner, Radek Stompor, Rajesh Sudarsan and the BOOMERanG, MAXIMA, Planck, EBEX, PolarBear & other experimental collaborations

CS267 - April 20th, 2010


The Cosmic Microwave Background

About 400,000 years after the Big Bang, the expanding Universe cools through the ionization temperature of hydrogen: p+ + e- -> H. Without free electrons to scatter off, CMB photons free-stream to us today.
• COSMIC - filling all of space.
• MICROWAVE - redshifted by the expansion of the Universe from 3000K to 3K.
• BACKGROUND - primordial photons coming from "behind" all astrophysical sources.


CMB Physics Drivers

• It is the earliest possible photon image of the Universe.
• Its existence supports a Big Bang over a Steady State cosmology (NP1).
• Tiny fluctuations in the CMB temperature (NP2) and polarization encode details of
  – cosmology: geometry, topology, composition, history
  – ultra-high energy physics: fundamental forces, beyond the standard model, inflation & the dark sector (NP3)


The Concordance Cosmology

• Supernova Cosmology Project (1998): Cosmic Dynamics (Ω_Λ - Ω_m)
• BOOMERanG & MAXIMA (2000): Cosmic Geometry (Ω_Λ + Ω_m)
• 70% Dark Energy + 25% Dark Matter + 5% Baryons = 95% Ignorance
• What (and why) is the Dark Universe?


Observing The CMB

The CMB is 1% of the static on an (untuned) TV.


CMB Satellite Evolution


The Planck Satellite

• The primary driver for HPC CMB work for the last decade.
• A joint ESA/NASA satellite mission performing a 2-year+ all-sky survey from L2.
• All-sky survey at 9 microwave frequencies from 30 to 857 GHz.
• The biggest data set to date:
  – O(10^12) observations
  – O(10^8) sky pixels
  – O(10^4) spectral multipoles


Beyond Planck

• EBEX (1x Planck) - Antarctic long-duration balloon flight in 2012.
• PolarBear (10x Planck) - Atacama desert, ground-based, 2010-13.
• QUIET-II (100x Planck) - Atacama desert, ground-based, 2012-15.
• CMBpol (1000x Planck) - L2 satellite, 2020-ish?


CMB Data Analysis

• In principle very simple - assume Gaussianity and maximize the likelihood
  1. of maps given the data and its noise statistics (analytic).
  2. of power spectra given maps and their noise statistics (iterative).
• In practice very complex
  – Foregrounds, asymmetric beams, non-Gaussian noise, etc.
  – Algorithm & implementation scaling with the evolution of data volume and HPC architecture.


The CMB Data Challenge

• Extracting fainter signals (polarization mode, angular resolution) from the data requires:
  – larger data volumes to provide higher signal-to-noise.
  – more complex analyses to remove fainter systematic effects.
• 1000x data increase over the next 15 years - we need to continue to scale on the bleeding edge through the next 10 M-foldings!

Experiment   Date    Time Samples   Sky Pixels   Gflop/Map
COBE         1989    10^9           10^3         1*
BOOMERanG    2000    10^9           10^5         10^3
WMAP         2001    10^10          10^6         10^4
Planck       2009    10^11          10^7         10^5
PolarBear    2012    10^12          10^6         10^6
QUIET-II     2015    10^13          10^6         10^7
CMBpol       2020+   10^14          10^8         10^8


CMB Data Analysis Evolution

Data volume & computational capability dictate the analysis approach.

• 2000, B98, Cray T3E x 700: Map - Explicit Maximum Likelihood (Matrix Invert, Np^3); Power Spectrum - Explicit Maximum Likelihood (Matrix Cholesky + Tri-solve, Np^3)
• 2002, B2K2, IBM SP3 x 3,000: Map - Explicit Maximum Likelihood (Matrix Invert, Np^3); Power Spectrum - Explicit Maximum Likelihood (Matrix Invert + Multiply, Np^3)
• 2003-7, Planck subsets, IBM SP3 x 6,000: Map - PCG Maximum Likelihood (FFT, Nt log Nt); Power Spectrum - Monte Carlo (Sim + Map, many Nt)
• 2007+, Planck full & EBEX, Cray XT4 x 40,000: Map - PCG Maximum Likelihood (FFT, Nt log Nt); Power Spectrum - Monte Carlo (Sim + Map, many Nt)
• 2010+, towards CMBpol: addressing the challenges of 1000x data & the next 10 generations of HPC systems, starting with Hopper, Blue Waters, etc.


Scaling In Practice

• 2000: BOOMERanG-98 temperature map (10^8 samples, 10^5 pixels) calculated on 128 Cray T3E processors.
• 2005: A single-frequency Planck temperature map (10^10 samples, 10^8 pixels) calculated on 6,000 IBM SP3 processors.
• 2008: EBEX temperature and polarization maps (10^11 samples, 10^6 pixels) calculated on 15,360 Cray XT4 cores.


Aside: HPC System Evaluation

• Scientific applications provide realistic benchmarks:
  – They exercise all components of a system, both individually and collectively.
  – Performance evaluation can be fed back into the application codes.
• MADbench2:
  – Based on the MADspec CMB power spectrum estimation code.
  – Full computational complexity (calculation, communication & I/O).
  – Scientific complexity removed: reduces lines of code by 90%; runs on self-generated pseudo-data.
  – Used for the NERSC-5 & -6 procurements.
  – First friendly-user Franklin system crash (90 minutes after access).


MADbench2 I/O Evaluation

• I/O performance comparison: 6 HPC systems; read & write; unique & shared files.
• Asynchronous I/O experiment: N bytes asynchronous read/write; N flops simultaneous work; measure the time spent waiting on I/O.


MADmap for Planck Map Making

A massively parallel, highly optimized PCG solver for the maximum likelihood map(s) given a time-stream of observations and their noise statistics.
• 2005: First Planck-scale map
  – 75 billion observations mapped to 150 million pixels.
  – First science code to use all 6,000 CPUs of Seaborg.
• 2007: First full Planck map-set (FFP)
  – 750 billion observations mapped to 150 million pixels.
  – Using 16,000 cores of Franklin.
  – I/O doesn't scale: write-dominated simulations; read-dominated mappings.
• May 14th 2009: Planck launches!


Planck First Light Survey


Planck Sim/Map Target

• By the end
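The PCG map making the slides describe can be sketched in miniature. For Gaussian noise, the maximum likelihood map solves (A^T N^-1 A) m = A^T N^-1 d, where A is the pointing matrix, d the timestream, and N the noise covariance; for stationary noise, N^-1 is applied with FFTs, which is the Nt log Nt cost quoted above. The toy below is illustrative only, not MADmap itself: the scan pattern, 1/f noise model, and all numbers are invented for the example, and the preconditioner is a simple hit-count (Jacobi) weight.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samp, n_pix = 4096, 64

# Toy scan: each time sample observes one pixel, sweeping repeatedly across the map.
pointing = (np.arange(n_samp) // 4) % n_pix
true_map = rng.normal(size=n_pix)

# Stationary white + 1/f noise, defined by its power spectral density (toy numbers).
freq = np.fft.rfftfreq(n_samp, d=1.0)
psd = 1.0 + 0.01 / np.maximum(freq, freq[1])
noise = np.fft.irfft(np.fft.rfft(rng.normal(size=n_samp)) * np.sqrt(psd), n_samp)
data = true_map[pointing] + noise          # d = A s + n

def apply_Ninv(tod):
    """Apply N^-1 to a timestream: circulant noise is diagonal in Fourier space,
    so this costs one forward and one inverse FFT (Nt log Nt)."""
    return np.fft.irfft(np.fft.rfft(tod) / psd, n_samp)

def apply_AtNinvA(m):
    """A^T N^-1 A m: project map to timestream, noise-weight, bin back to pixels."""
    return np.bincount(pointing, weights=apply_Ninv(m[pointing]), minlength=n_pix)

b = np.bincount(pointing, weights=apply_Ninv(data), minlength=n_pix)  # A^T N^-1 d

# Preconditioned conjugate gradient with a hit-count (Jacobi) preconditioner.
hits = np.bincount(pointing, minlength=n_pix).astype(float)
m = np.zeros(n_pix)
r = b - apply_AtNinvA(m)
z = r / hits
p = z.copy()
rz = r @ z
for _ in range(300):
    Ap = apply_AtNinvA(p)
    alpha = rz / (p @ Ap)
    m += alpha * p
    r -= alpha * Ap
    if np.linalg.norm(r) < 1e-8 * np.linalg.norm(b):
        break
    z = r / hits
    rz_new = r @ z
    p = z + (rz_new / rz) * p
    rz = rz_new
```

At Planck scale the same iteration runs with Nt ~ 10^11-10^12 samples distributed over tens of thousands of cores, which is why the FFT-based N^-1 application and the I/O pattern, rather than the linear algebra, dominate the engineering effort.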

