Commit

update 8th Apr talk
kmario23 committed Mar 14, 2024
1 parent f0c8f03 commit 0b21e06
Showing 4 changed files with 121 additions and 41 deletions.
100 changes: 59 additions & 41 deletions Gemfile.lock
@@ -1,52 +1,65 @@
GEM
remote: https://rubygems.org/
specs:
activesupport (7.0.4)
activesupport (7.1.3.2)
base64
bigdecimal
concurrent-ruby (~> 1.0, >= 1.0.2)
connection_pool (>= 2.2.5)
drb
i18n (>= 1.6, < 2)
minitest (>= 5.1)
mutex_m
tzinfo (~> 2.0)
addressable (2.8.1)
addressable (2.8.6)
public_suffix (>= 2.0.2, < 6.0)
bibtex-ruby (6.0.0)
base64 (0.2.0)
bibtex-ruby (6.1.0)
latex-decode (~> 0.0)
racc (~> 1.7)
bigdecimal (3.1.7)
citeproc (1.0.10)
namae (~> 1.0)
citeproc-ruby (1.1.14)
citeproc (~> 1.0, >= 1.0.9)
csl (~> 1.6)
colorator (1.1.0)
concurrent-ruby (1.1.10)
concurrent-ruby (1.2.3)
connection_pool (2.4.1)
crass (1.0.6)
csl (1.6.0)
namae (~> 1.0)
rexml
csl-styles (1.0.1.11)
csl (~> 1.0)
cssminify2 (2.0.1)
drb (2.2.1)
em-websocket (0.5.3)
eventmachine (>= 0.12.9)
http_parser.rb (~> 0)
eventmachine (1.2.7)
execjs (2.8.1)
feedjira (3.2.2)
loofah (>= 2.3.1)
sax-machine (>= 1.0)
ffi (1.15.5)
execjs (2.9.1)
feedjira (3.2.3)
loofah (>= 2.3.1, < 3)
sax-machine (>= 1.0, < 2)
ffi (1.16.3)
forwardable-extended (2.6.0)
gemoji (4.0.1)
google-protobuf (3.21.12-x86_64-linux)
gemoji (4.1.0)
google-protobuf (4.26.0-arm64-darwin)
rake (>= 13)
google-protobuf (4.26.0-x86_64-linux)
rake (>= 13)
html-pipeline (2.14.3)
activesupport (>= 2)
nokogiri (>= 1.4)
htmlcompressor (0.4.0)
http_parser.rb (0.8.0)
httparty (0.20.0)
mime-types (~> 3.0)
httparty (0.21.0)
mini_mime (>= 1.0.0)
multi_xml (>= 0.5.2)
i18n (1.12.0)
i18n (1.14.4)
concurrent-ruby (~> 1.0)
jekyll (4.3.1)
jekyll (4.3.3)
addressable (~> 2.4)
colorator (~> 1.0)
em-websocket (~> 0.5)
@@ -82,7 +95,7 @@ GEM
jekyll (>= 3.0, < 5.0)
jekyll-sass-converter (3.0.0)
sass-embedded (~> 1.54)
jekyll-scholar (7.1.1)
jekyll-scholar (7.1.3)
bibtex-ruby (~> 6.0)
citeproc-ruby (~> 1.0)
csl-styles (~> 1.0)
@@ -96,59 +109,64 @@ GEM
gemoji (>= 3, < 5)
html-pipeline (~> 2.2)
jekyll (>= 3.0, < 5.0)
json (2.6.3)
json (2.7.1)
json-minify (0.0.3)
json (> 0)
kramdown (2.4.0)
rexml
kramdown-parser-gfm (1.1.0)
kramdown (~> 2.0)
latex-decode (0.4.0)
libv8-node (16.10.0.0-x86_64-linux)
liquid (4.0.3)
listen (3.7.1)
libv8-node (18.16.0.0-arm64-darwin)
libv8-node (18.16.0.0-x86_64-linux)
liquid (4.0.4)
listen (3.9.0)
rb-fsevent (~> 0.10, >= 0.10.3)
rb-inotify (~> 0.9, >= 0.9.10)
loofah (2.19.1)
loofah (2.22.0)
crass (~> 1.0.2)
nokogiri (>= 1.5.9)
nokogiri (>= 1.12.0)
mercenary (0.4.0)
mime-types (3.4.1)
mime-types-data (~> 3.2015)
mime-types-data (3.2022.0105)
mini_racer (0.6.3)
libv8-node (~> 16.10.0.0)
minitest (5.16.3)
mini_mime (1.1.5)
mini_racer (0.8.0)
libv8-node (~> 18.16.0.0)
minitest (5.22.3)
multi_xml (0.6.0)
namae (1.1.1)
nokogiri (1.13.10-x86_64-linux)
mutex_m (0.2.0)
namae (1.2.0)
racc (~> 1.7)
nokogiri (1.16.2-arm64-darwin)
racc (~> 1.4)
nokogiri (1.16.2-x86_64-linux)
racc (~> 1.4)
pathutil (0.16.2)
forwardable-extended (~> 2.6)
public_suffix (5.0.1)
racc (1.6.2)
rake (13.0.6)
public_suffix (5.0.4)
racc (1.7.3)
rake (13.1.0)
rb-fsevent (0.11.2)
rb-inotify (0.10.1)
ffi (~> 1.0)
rexml (3.2.5)
rouge (4.0.1)
rexml (3.2.6)
rouge (4.2.0)
safe_yaml (1.0.5)
sass-embedded (1.57.1)
google-protobuf (~> 3.21)
rake (>= 10.0.0)
sass-embedded (1.72.0-arm64-darwin)
google-protobuf (>= 3.25, < 5.0)
sass-embedded (1.72.0-x86_64-linux-gnu)
google-protobuf (>= 3.25, < 5.0)
sax-machine (1.3.2)
terminal-table (3.0.2)
unicode-display_width (>= 1.1.1, < 3)
tzinfo (2.0.5)
tzinfo (2.0.6)
concurrent-ruby (~> 1.0)
uglifier (4.2.0)
execjs (>= 0.3.0, < 3)
unicode-display_width (2.3.0)
unicode-display_width (2.5.0)
unicode_utils (1.4.0)
webrick (1.7.0)
webrick (1.8.1)

PLATFORMS
arm64-darwin-23
x86_64-linux

DEPENDENCIES
9 changes: 9 additions & 0 deletions _news/announcement_8.md
@@ -0,0 +1,9 @@
---
layout: post
title: How Temporal Unrolling Supports Neural Physics Simulators
date: 2024-04-08
inline: true
---

[Björn List will give a talk on "How Temporal Unrolling Supports Neural Physics Simulators"](projects/temp_unroll_neuralphys_blist/) :fire:

53 changes: 53 additions & 0 deletions _projects/temp_unroll_neuralphys_blist.md
@@ -0,0 +1,53 @@
---
layout: page
title: How Temporal Unrolling Supports Neural Physics Simulators
description: by Björn List (Technical University Munich)
img: assets/img/talks/temp-unroll-neural-physics-simulator-blist.png
importance: 1
category: pde
---



<div class="row">
<div class="col-sm mt-3 mt-md-0">
{% include figure.html path="assets/img/talks/temp-unroll-neural-physics-simulator-blist.png" title="How Temporal Unrolling Supports Neural Physics Simulators" class="img-fluid rounded z-depth-1" %}
</div>
</div>
<hr>



**Topic**: [**How Temporal Unrolling Supports Neural Physics Simulators**](https://arxiv.org/abs/2402.12971)



<hr>

**Abstract:**

Unrolling training trajectories over time strongly influences the inference accuracy of neural network-augmented physics simulators. We analyze these effects by studying three variants of training neural networks on discrete ground truth trajectories. In addition to commonly used one-step setups and fully differentiable unrolling, we include a third, less widely used variant: unrolling without temporal gradients. Comparing networks trained with these three modalities makes it possible to disentangle the two dominant effects of unrolling, training distribution shift and long-term gradients. We present a detailed study across physical systems, network sizes, network architectures, training setups, and test scenarios. It provides an empirical basis for our main findings: A non-differentiable but unrolled training setup supported by a numerical solver can yield 4.5-fold improvements over a fully differentiable prediction setup that does not utilize this solver. We also quantify a difference in the accuracy of models trained in a fully differentiable setup compared to their non-differentiable counterparts. While differentiable setups perform best, the accuracy of unrolling without temporal gradients comes comparatively close. Furthermore, we empirically show that these behaviors are invariant to changes in the underlying physical system, the network architecture and size, and the numerical scheme. These results motivate integrating non-differentiable numerical simulators into training setups even if full differentiability is unavailable. We also observe that the convergence rate of common neural architectures is low compared to numerical algorithms. This encourages the use of hybrid approaches combining neural and numerical algorithms to utilize the benefits of both.
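The three training modalities contrasted in the abstract — one-step training, fully differentiable unrolling, and unrolling without temporal gradients — can be illustrated with a minimal toy sketch (not from the paper; all names and constants here are illustrative assumptions). A scalar "simulator" `x_{t+1} = theta * x_t` learns ground-truth dynamics `x_{t+1} = A * x_t`; the only difference between the variants is which states the gradient is allowed to flow through:

```python
# Toy sketch of the three training modalities on a scalar linear system.
# "theta" plays the role of the network; A is the ground-truth dynamics.
A = 0.9       # ground-truth coefficient
X0 = 1.0      # initial state
M = 4         # unrolling horizon (rollout steps)
LR = 0.05     # gradient-descent step size
STEPS = 2000  # optimisation iterations

def grad_one_step(theta):
    """One-step training: every step starts from a ground-truth state."""
    g, x_true = 0.0, X0
    for _ in range(M):
        # loss term: (theta * x_true - A * x_true)^2
        g += 2.0 * (theta * x_true - A * x_true) * x_true
        x_true *= A
    return g

def grad_unrolled(theta):
    """Fully differentiable unrolling: gradients flow through the whole
    predicted trajectory, so d/dtheta of theta^k is k * theta^(k-1)."""
    g = 0.0
    for k in range(1, M + 1):
        g += 2.0 * (theta**k - A**k) * X0**2 * k * theta**(k - 1)
    return g

def grad_unrolled_no_temporal(theta):
    """Unrolled rollout without temporal gradients: the incoming predicted
    state is treated as a constant (a stop-gradient), so only the last
    application of theta is differentiated at each step."""
    g, x_pred = 0.0, X0
    for k in range(1, M + 1):
        x_target = A**k * X0
        g += 2.0 * (theta * x_pred - x_target) * x_pred  # x_pred held constant
        x_pred *= theta  # roll the prediction forward
    return g

def train(grad_fn, theta=0.5):
    """Plain gradient descent on the chosen training modality."""
    for _ in range(STEPS):
        theta -= LR * grad_fn(theta)
    return theta

for name, fn in [("one-step", grad_one_step),
                 ("unrolled", grad_unrolled),
                 ("no-temporal-grads", grad_unrolled_no_temporal)]:
    print(f"{name:>18}: theta = {train(fn):.4f}")
```

All three variants recover `theta` close to `A = 0.9` on this trivial system; the paper's point is that on real neural simulators their training distributions and gradient signals differ, which is what the one-step vs. unrolled vs. stop-gradient comparison disentangles.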



<hr>


| | |
| ------------------- | ------------------------------------------------------------ |
| **Topic** | [**How Temporal Unrolling Supports Neural Physics Simulators**](https://arxiv.org/abs/2402.12971) |
| | |
| **Slides** | **TBA** |
| | |
| **When** | **08.04.2024, 15:00 - 16:15 (CEST) / 10:00 - 11:15 (EST) / 08:00 - 09:15 (MST)** |
| | |
| **Where** | [**https://us02web.zoom.us/j/85216309906?pwd=cVB0SjNDR2tYOGhIT0xqaGZ2TzlKUT09**](https://us02web.zoom.us/j/85216309906?pwd=cVB0SjNDR2tYOGhIT0xqaGZ2TzlKUT09) |
| | |
| **Video Recording** | **TBA** |

<hr>
**Speaker(s):**

Björn List is a PhD candidate at the Technical University Munich. He previously completed his M.Sc. at Imperial College London. His research focuses on the intersection of computational fluid dynamics and machine learning, specifically learning-based methods for turbulence modelling, with the aim of improving simulation accuracy and efficiency.

