<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="https://jekyllrb.com/" version="3.10.0">Jekyll</generator><link href="https://fkoehler.site/feed.xml" rel="self" type="application/atom+xml" /><link href="https://fkoehler.site/" rel="alternate" type="text/html" /><updated>2026-02-18T22:16:04-08:00</updated><id>https://fkoehler.site/feed.xml</id><title type="html">Felix Köhler’s Website</title><subtitle>My personal homepage.</subtitle><author><name>Felix Matteo Köhler</name></author><entry><title type="html">From Numerical Simulators to Neural Emulators and Back (Talk at RISE ML Seminar)</title><link href="https://fkoehler.site/posts/2025/11/rise-talk/" rel="alternate" type="text/html" title="From Numerical Simulators to Neural Emulators and Back (Talk at RISE ML Seminar)" /><published>2025-11-14T00:00:00-08:00</published><updated>2025-11-14T00:00:00-08:00</updated><id>https://fkoehler.site/posts/2025/11/rise-talk</id><content type="html" xml:base="https://fkoehler.site/posts/2025/11/rise-talk/"><![CDATA[<p>I had the honor to be invited to the <a href="https://www.ri.se/en/learningmachinesseminars/felix-kohler-from-numerical-simulators-of-pdes-to-neural-emulators-and-back">RISE ML seminar
series</a>
and speak about my current research. You can find the <a href="https://www.youtube.com/watch?v=olpXyDARMJI&amp;list=PLqLiVcF3GKy1tuQFoDu5QKOM6S33t_4R1&amp;index=1">recording
here</a>
and the <a href="https://fkoehler.site/files/from_numerical_simulators_to_neural_emulators_and_back_at_RISE.pdf">slides
here</a>.</p>

<p>In it, I build a broader narrative arc around the results from
<a href="https://tum-pbs.github.io/apebench-paper/">APEBench</a> and my recent NeurIPS 2025
paper on <a href="https://tum-pbs.github.io/emulator-superiority/">Neural Emulator Superiority</a>.</p>

<p>Below is the abstract of the talk:</p>

<blockquote>
  <p>The potential for computational speedups and for tackling unsolved problems
has motivated the use of neural networks (NNs) to help solve PDEs. In
particular, image-like models that operate on discretized state representations
and advance in time autoregressively have gained popularity over the past
years.</p>

  <p>In this talk, I will present a holistic perspective on learning autoregressive
neural emulators from simulated data, starting with synthetic data generation
using classical numerical simulators, covering the training process, and
ultimately examining how such emulators are benchmarked. Using a wide range of
experiments with different PDEs and neural architectures, I will highlight the
similarities between emulators and simulators. This shows how emulator
architectures were inspired by classical schemes for solving the laws of
nature and thereby inherit both their merits and limitations.</p>

  <p>Moreover, I will elaborate on the impact of reference data fidelity and
discuss a counterintuitive yet interesting finding: emulators can become
better than their training data source.</p>
</blockquote>]]></content><author><name>Felix Matteo Köhler</name></author><category term="PDE" /><summary type="html"><![CDATA[]]></summary></entry><entry><title type="html">My NeurIPS 2024 Paper: APEBench</title><link href="https://fkoehler.site/posts/2024/12/apebench/" rel="alternate" type="text/html" title="My NeurIPS 2024 Paper: APEBench" /><published>2024-12-02T00:00:00-08:00</published><updated>2024-12-02T00:00:00-08:00</updated><id>https://fkoehler.site/posts/2024/12/apebench</id><content type="html" xml:base="https://fkoehler.site/posts/2024/12/apebench/"><![CDATA[<p>🎉 I am happy to announce my NeurIPS 2024 paper: <a href="https://arxiv.org/abs/2411.00180">APEBench</a>.</p>

<p>🧵 Check out the <a href="https://tum-pbs.github.io/apebench-paper/">project page</a></p>

<p>👉 <a href="https://github.com/tum-pbs/apebench">Code</a></p>

<p>🤖 Or install via pip (requires JAX): <code class="language-plaintext highlighter-rouge">pip install apebench</code></p>
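<p>To make the autoregressive emulation setting that APEBench benchmarks concrete, here is a library-free numpy sketch; the linear-shift "simulator" and the damped "emulator" below are illustrative stand-ins, not APEBench's actual solvers or API:</p>

```python
# Sketch of the autoregressive emulation setting: a "simulator" advances
# a 1D periodic advection state by one step, and an "emulator" is judged
# by how well its repeated (autoregressive) application tracks a long
# simulator rollout.
import numpy as np

def simulator_step(u):
    # Exact advection on a periodic grid with CFL = 1: shift by one cell.
    return np.roll(u, 1)

def rollout(step_fn, u0, n_steps):
    # Autoregressive rollout: feed each prediction back in as input.
    u = u0
    trajectory = [u]
    for _ in range(n_steps):
        u = step_fn(u)
        trajectory.append(u)
    return np.stack(trajectory)

x = np.linspace(0.0, 1.0, 64, endpoint=False)
u0 = np.sin(2 * np.pi * x)

ref = rollout(simulator_step, u0, 10)

# A hypothetical "emulator" with a small per-step amplitude error:
def emulator_step(u):
    return 0.99 * np.roll(u, 1)

pred = rollout(emulator_step, u0, 10)

# Per-step errors compound over the autoregressive rollout:
errors = np.linalg.norm(pred - ref, axis=1)
assert errors[1] < errors[-1]
```

<p>The point of the sketch: even a small per-step error compounds over the rollout, which is why benchmarking over long autoregressive rollouts, rather than single steps, matters.</p>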

<p>In this <a href="https://bsky.app/profile/felix-m-koehler.bsky.social/post/3lhyknm5aw22e">Bluesky
thread</a>,
I summarize APEBench’s main contributions.</p>

<p>Check out the Twitter/X <a href="https://x.com/felix_m_koehler/status/1862034228642238872">thread with more
background and acknowledgements</a>.</p>]]></content><author><name>Felix Matteo Köhler</name></author><category term="PDE" /><summary type="html"><![CDATA[]]></summary></entry><entry><title type="html">Lecture: Automatic Differentiation &amp;amp; Adjoint Methods in Differentiable Physics</title><link href="https://fkoehler.site/posts/2024/07/autodiff-lecture/" rel="alternate" type="text/html" title="Lecture: Automatic Differentiation &amp;amp; Adjoint Methods in Differentiable Physics" /><published>2024-07-08T00:00:00-07:00</published><updated>2024-07-08T00:00:00-07:00</updated><id>https://fkoehler.site/posts/2024/07/autodiff-lecture</id><content type="html" xml:base="https://fkoehler.site/posts/2024/07/autodiff-lecture/"><![CDATA[<p>As part of our master course in <a href="https://ge.in.tum.de/teaching/">Advanced Deep Learning for Physics (IN2298)</a>, I gave a lecture on autodiff and adjoint methods. You can find the lecture slides <a href="https://fkoehler.site/files/autodiff_and_adjoints_lecture.pdf">here</a>. The lecture was recorded and is available on YouTube:</p>

<p><a href="https://www.youtube.com/watch?v=N7nVoyR0qO4"><img src="https://img.youtube.com/vi/N7nVoyR0qO4/0.jpg" alt="Link to the YouTube video" /></a></p>
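<p>As a small taste of the functional viewpoint from the lecture, the Jvp/Pushforward and vJp/Pullback primitives can be exercised in a few lines of JAX (a sketch for illustration, not code from the lecture):</p>

```python
# jax.jvp is the Jvp/pushforward (forward mode); jax.vjp is the
# vJp/pullback (reverse mode). Both avoid materializing the Jacobian.
import jax
import jax.numpy as jnp

def f(x):
    return jnp.sum(x ** 2)

x = jnp.array([1.0, 2.0, 3.0])
v = jnp.array([1.0, 0.0, 0.0])  # tangent direction

# Pushforward: returns (f(x), J @ v).
primal_out, jvp_out = jax.jvp(f, (x,), (v,))

# Pullback: returns f(x) and a function computing w^T J for a cotangent w.
primal_out2, pullback = jax.vjp(f, x)
(grad_x,) = pullback(1.0)

# For f(x) = sum(x**2): J = 2 x^T, so J @ v = 2*x[0] and grad_x = 2*x.
```

<p>For a scalar-valued function, a single pullback with cotangent 1.0 yields the full gradient, which is why reverse mode dominates deep learning.</p>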

<p>In it, I cover:</p>

<ol>
  <li>A functional (JAX/Julia-inspired) viewpoint on autodiff in terms of Jvp/Pushforward and vJp/Pullback</li>
  <li>Identifying hierarchy levels in autodiff (scalar-mode, vector-mode, continuous-mode)</li>
  <li>A comparison of Optimize-then-Discretize (OtD) vs. Discretize-then-Optimize (DtO)</li>
  <li>Special aspects of differentiable physics like differentiating over linear and nonlinear solvers</li>
  <li>Advanced topics and recent research directions</li>
</ol>]]></content><author><name>Felix Matteo Köhler</name></author><category term="autodiff" /><category term="adjoint" /><category term="differentiable physics" /><category term="PDE" /><summary type="html"><![CDATA[]]></summary></entry><entry><title type="html">My new Website and Blog</title><link href="https://fkoehler.site/posts/2023/02/my-new-website-and-blog/" rel="alternate" type="text/html" title="My new Website and Blog" /><published>2023-02-26T00:00:00-08:00</published><updated>2023-02-26T00:00:00-08:00</updated><id>https://fkoehler.site/posts/2023/02/my-blog</id><content type="html" xml:base="https://fkoehler.site/posts/2023/02/my-new-website-and-blog/"><![CDATA[<p>This is my new website and blog. I am still working on it, but it is already online. I will post more information about it soon.</p>

<p>A quick test of the math mode: $\int_0^1 x^2 dx$ inline</p>

<p>and full line:</p>

\[\int_0^1 x^2 dx\]

<p>And the code snippet:</p>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">import</span> <span class="nn">numpy</span> <span class="k">as</span> <span class="n">np</span>
<span class="kn">import</span> <span class="nn">matplotlib.pyplot</span> <span class="k">as</span> <span class="n">plt</span>

<span class="n">x</span> <span class="o">=</span> <span class="n">np</span><span class="p">.</span><span class="n">linspace</span><span class="p">(</span><span class="mi">0</span><span class="p">,</span> <span class="mi">1</span><span class="p">,</span> <span class="mi">100</span><span class="p">)</span>
<span class="n">y</span> <span class="o">=</span> <span class="n">x</span><span class="o">**</span><span class="mi">2</span>

<span class="n">plt</span><span class="p">.</span><span class="n">plot</span><span class="p">(</span><span class="n">x</span><span class="p">,</span> <span class="n">y</span><span class="p">)</span>
<span class="n">plt</span><span class="p">.</span><span class="n">show</span><span class="p">()</span>
</code></pre></div></div>]]></content><author><name>Felix Matteo Köhler</name></author><category term="a tag?" /><summary type="html"><![CDATA[This is my new website and blog. I am still working on it, but it is already online. I will post more information about it soon.]]></summary></entry></feed>