September 20, 2023
Present
Geoffrey Fox, Piotr Luszczek, Wes Brewer, Gregor von Laszewski, Christine Kirkpatrick, Gregg Barrett, Ying-Jung Chen
Apologies
Mallikarjun Shankar
Tentative Agenda
- Any new members
- Foundation Models
- How do we get started -- IBM-NASA Model
- White Papers
- Other Benchmarks
- AOB
- Second Order Methods?
Discussion of Foundation Models
- We agreed to contact Foundation Model projects and discuss the form of interaction. This includes:
- Microsoft DeepSpeed, from the new announcement: Announcing the DeepSpeed4Science Initiative: Enabling large-scale scientific discovery through sophisticated AI system technologies - Microsoft Research
- IBM-NASA: either Manil Maskey or Yuhan Rao, who joined earlier calls
- TPC: perhaps Murali can suggest the best method
- Wes noted an interesting paper on foundation models for climate: [2301.10343] ClimaX: A foundation model for weather and climate
- He arranged a talk next time on a climate foundation model from Valentine (Val) Anantharaj, PhD \<vga@ornl.gov>, Computational Climate Scientist, National Center for Computational Sciences, Oak Ridge National Laboratory, TN
- Wes also discussed a foundation model for turbulence (see last meeting) built to insert into OpenFOAM
- Geoffrey noted the work of Stephan Hoyer at Google (https://research.google/people/StephanHoyer/), where they build models in Julia/Python that can be trained using JAX to differentiate simulation code; see his talk "Deep learning and differentiable simulations"
- Piotr noted
- [2306.00258] Towards Foundation Models for Scientific Machine Learning: Characterizing Scaling and Transfer Behavior
- [2108.07258] On the Opportunities and Risks of Foundation Models
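As context for the differentiable-simulation work noted above, the core idea can be sketched in a few lines of JAX: a simulator written with JAX operations can be differentiated end-to-end, so physical parameters inside it can be fit by gradient descent. This toy 1-D diffusion problem is an illustrative assumption, not taken from the talk or the meeting.

```python
import jax
import jax.numpy as jnp

def step(u, viscosity):
    # One explicit step of 1-D diffusion: u_t = viscosity * u_xx
    u_xx = jnp.roll(u, -1) - 2.0 * u + jnp.roll(u, 1)
    return u + viscosity * u_xx

def rollout(viscosity, u0, n_steps=10):
    # Run the simulator forward; every step is differentiable
    for _ in range(n_steps):
        u0 = step(u0, viscosity)
    return u0

def loss(viscosity, u0, target):
    return jnp.mean((rollout(viscosity, u0) - target) ** 2)

u0 = jnp.sin(jnp.linspace(0.0, 2.0 * jnp.pi, 32))
target = rollout(0.1, u0)      # synthetic "observations" from a true viscosity
grad_loss = jax.grad(loss)     # gradient flows through the whole rollout

v = 0.3                        # initial guess for the unknown viscosity
for _ in range(200):
    v = v - 0.5 * grad_loss(v, u0, target)   # gradient descent on the parameter
```

In Hoyer's setting the learned component is a neural network embedded in the solver rather than a single scalar, but the mechanism — differentiating through the simulation — is the same.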
White Papers
- Christine led a discussion of the mlcommons_data_energy_usage_paper, which is near completion
- We identified sections that Gregor and Geoffrey needed to work on. Geoffrey will update the efficiency analysis.
- We may submit it to Frontiers in HPC and switch to Overleaf
Second Order Methods
- Geoffrey noted that MLCommons does not directly benchmark second-order methods, although many papers report that they can sometimes obtain better performance
- To benchmark them, one needs to submit to the Open Division
- There was quite a bit of interest, although this does not clearly belong in the Science WG
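To make the second-order point concrete (a toy illustration assumed here, not discussed in the meeting): on an ill-conditioned quadratic, Newton's method uses Hessian information to rescale the gradient and reaches the minimum in one step, while first-order gradient descent is throttled by the largest curvature.

```python
import jax
import jax.numpy as jnp

def loss(w):
    # Ill-conditioned quadratic: curvature differs by 100x between axes
    return 0.5 * (100.0 * w[0] ** 2 + w[1] ** 2)

grad = jax.grad(loss)
hess = jax.hessian(loss)

# First-order: step size is limited by the stiffest direction (curvature 100)
w_gd = jnp.array([1.0, 1.0])
for _ in range(10):
    w_gd = w_gd - 0.009 * grad(w_gd)

# Second-order: one Newton step solves a quadratic exactly
w_newton = jnp.array([1.0, 1.0])
w_newton = w_newton - jnp.linalg.solve(hess(w_newton), grad(w_newton))
```

At scale the Hessian is never formed explicitly; methods like K-FAC or Shampoo approximate this curvature rescaling, which is why such submissions would land in the Open Division.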