TL;DR
Multi-physics simulations often require coupling multiple specialized solvers. preCICE is a mature, open-source coupling library for partitioned multi-physics simulations, with Python bindings that pair naturally with FiPy. This tutorial shows how to couple FiPy with another solver using preCICE, covering installation, adapter implementation, configuration, and common pitfalls. You'll learn when to use partitioned coupling, how to set up data exchange, and how to debug typical issues.
Introduction: When One Solver Is Not Enough
Many scientific problems involve multiple physical processes that interact. For example:
- Heat transfer coupled with fluid flow (conjugate heat transfer)
- Structural mechanics interacting with fluid dynamics (fluid-structure interaction)
- Electrochemistry coupled with transport phenomena (battery modeling)
FiPy excels at solving partial differential equations (PDEs) for transport, diffusion, and phase-field problems. However, some multi-physics scenarios require coupling FiPy with a specialized solver for another physics domain—like OpenFOAM for fluid dynamics or a custom structural code.
Code coupling is the practice of connecting two or more independent simulation codes so they exchange data during the computation. This is where preCICE (Precise Code Interaction Coupling Environment) comes in.
Why Use preCICE?
preCICE provides:
- Language-agnostic coupling: Connect solvers written in C++, Python, Fortran, etc.
- Flexible data mapping: Handle mesh-to-mesh interpolation between non-matching grids
- Time-stepping coordination: Support explicit and implicit coupling schemes, with quasi-Newton acceleration
- Parallel scalability: Designed for high-performance computing environments
- Active community: Used in research and industry, with good documentation
For FiPy users, preCICE opens the door to robust multi-physics workflows without building coupling infrastructure from scratch.
Partitioned vs Monolithic Solvers
Before diving in, understand the two main approaches to multi-physics:
| Aspect | Partitioned (preCICE) | Monolithic |
|---|---|---|
| Structure | Separate solvers coupled via data exchange | Single solver handles all physics together |
| Code reuse | High – use existing specialized solvers | Low – need unified implementation |
| Development effort | Moderate – write adapters, configure coupling | Very high – implement all physics in one code |
| Performance | Good, but communication overhead | Potentially better for tightly coupled problems |
| Flexibility | Easy to swap or upgrade individual solvers | Rigid – changes affect entire codebase |
| Use case | Mature solvers exist for each physics | New or highly integrated physics models |
When to choose partitioned coupling with preCICE:
- You have reliable, optimized solvers for individual physics (e.g., FiPy for diffusion, OpenFOAM for flow)
- The physics are loosely or moderately coupled
- You want to leverage existing codebases and communities
- You need flexibility to try different solver combinations
When monolithic might be better:
- Extremely tight coupling requiring implicit treatment at the linear algebra level
- Performance-critical HPC simulations where communication overhead is unacceptable
- Very new physics where no specialized solver exists yet
For many research applications, partitioned coupling with preCICE is the pragmatic choice.
Conceptual Overview: How preCICE Works
preCICE is a library, not a central server: each solver links against it and becomes a coupling participant:
- Solver A (e.g., FiPy) runs as one preCICE participant
- Solver B (another code) runs as another participant
- The participants communicate directly with each other (peer-to-peer, via sockets or MPI); there is no separate coupling process, and preCICE manages communication and data mapping from within each solver
- At coupling interfaces, solvers exchange boundary data (e.g., temperature, heat flux, velocity) through preCICE
- preCICE handles:
- Mesh mapping when interface meshes don’t match
- Time-step synchronization
- Under-relaxation and quasi-Newton acceleration to improve convergence
- Coupling schemes (explicit and implicit)
The key benefit: each solver remains unaware of the other’s internal details. They only need to know how to send/receive data at their boundary via preCICE’s API.
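In code, every participant follows the same lifecycle. A minimal sketch (method names follow the preCICE v3 Python API; `run_coupled` and its callback arguments are hypothetical helpers for illustration, not part of preCICE):

```python
def run_coupled(participant, solver_dt, read_bc, solve_step, write_bc):
    """Drive one solver through the generic preCICE participant lifecycle."""
    participant.initialize()
    while participant.is_coupling_ongoing():
        # Never step past the current coupling time window
        dt = min(solver_dt, participant.get_max_time_step_size())
        read_bc(dt)      # pull boundary data from the other solver
        solve_step(dt)   # advance this solver's own physics
        write_bc()       # push boundary data back
        participant.advance(dt)
    participant.finalize()
```

The same skeleton appears in both adapters later in this tutorial: clamp the step, read, solve, write, advance.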
Prerequisites
Before starting, ensure you have:
- FiPy installed and working (see Installing and Setting Up FiPy for the First Time for setup guidance)
- Python programming experience with FiPy (familiarity with FiPy’s core concepts is essential)
- A second solver ready for coupling (this tutorial uses a simple Python solver as a stand-in, but the same pattern works with OpenFOAM, CalculiX, or other preCICE-enabled codes)
- preCICE installed on your system (see preCICE installation guide)
- Basic understanding of multi-physics (review What Is Scientific Simulation and Why It Matters if needed)
Step 1: Install preCICE Python Bindings
preCICE provides Python bindings (the pyprecice package; the module you import is still called precice) so FiPy or any Python code can act as a coupling participant.
# Using conda (recommended)
conda install -c conda-forge pyprecice
# Or using pip
pip install pyprecice
Verify the installation:
python -c "import precice; print('preCICE Python version:', precice.__version__)"
You should see the preCICE version printed without errors.
Step 2: Design Your Coupling Interface
Identify where the two solvers interact. Common scenarios:
- FiPy domain → temperature field
- Other solver domain → heat flux or temperature
Define:
- Interface mesh on each side (the two meshes may differ)
- Coupling data: the quantities exchanged (e.g., Temperature, Heat-Flux)
- Coordinate systems: ensure the interface coordinates align between solvers
For this tutorial, we’ll couple a FiPy heat diffusion solver with a simple Python convective cooling solver. The interface is a 1D line where temperature and heat flux are exchanged.
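In practice, each side describes its interface to preCICE as an array of vertex coordinates. A minimal NumPy sketch (the five-vertex count is arbitrary; preCICE meshes are 2-D or 3-D, so a 1-D interface is embedded in a plane):

```python
import numpy as np

# Five interface vertices along the line y = 0 (count and spacing are arbitrary)
x = np.linspace(0.0, 1.0, 5)
coords = np.column_stack([x, np.zeros_like(x)])  # shape (5, 2): (x, y) pairs
print(coords.shape)  # (5, 2)
```

An array like this is what gets registered with preCICE in the adapter below.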
Step 3: Write the FiPy preCICE Adapter
The adapter is the code that connects your solver (FiPy) to preCICE. It handles:
- Initializing the preCICE connection
- Defining the coupling interface mesh
- Reading incoming data from the other solver
- Writing outgoing data to send
- Advancing the coupling time step
Create a file fipy_precice_adapter.py:
import numpy as np
from fipy import CellVariable, DiffusionTerm, Grid1D, TransientTerm, Variable
import precice

# --- Configuration ---
# Names must match precice-config.xml (preCICE v3 uses names, not numeric IDs)
MESH_NAME = "Fluid-Solid-Mesh-FiPy"
READ_DATA = "Heat-Flux"
WRITE_DATA = "Temperature"

# Simulation parameters
nx = 50
dx = 1.0
dt_solver = 0.1

# --- FiPy setup ---
mesh = Grid1D(nx=nx, dx=dx)
T = CellVariable(name="Temperature", mesh=mesh, value=300.0)  # Initial temperature in K
k = 1.0  # Thermal conductivity

# Right boundary: fixed temperature
T.constrain(300.0, mesh.facesRight)

# Left boundary: Neumann condition driven by the coupled heat flux.
# q_out is the flux leaving through the left face; with outward normal -x,
# q_out = +k * dT/dx there, so we constrain dT/dx = q_out / k.
q_out = Variable(0.0)
T.faceGrad.constrain([q_out / k], mesh.facesLeft)

# Equation: transient diffusion
eq = TransientTerm() == DiffusionTerm(coeff=k)

# --- preCICE setup (v3 Participant API) ---
participant = precice.Participant("FiPy-Solver", "precice-config.xml", 0, 1)

# Register the coupling interface: one vertex at the left boundary.
# preCICE meshes are 2-D or 3-D, so the point is embedded as (x, y).
vertex_ids = participant.set_mesh_vertices(MESH_NAME, np.array([[0.0, 0.0]]))

participant.initialize()

# Main coupling loop: preCICE decides when the coupled run is finished
while participant.is_coupling_ongoing():
    # Never step past the current coupling time window
    dt = min(dt_solver, participant.get_max_time_step_size())

    # 1. Read the heat flux sent by the other solver
    heat_flux = participant.read_data(MESH_NAME, READ_DATA, vertex_ids, dt)

    # 2. Update the Neumann boundary condition (the constraint holds q_out
    #    by reference, so setValue is enough)
    q_out.setValue(heat_flux[0])

    # 3. Advance FiPy one time step
    eq.solve(var=T, dt=dt)

    # 4. Write the interface temperature back (leftmost cell value here; a
    #    careful implementation would interpolate to the boundary face)
    participant.write_data(MESH_NAME, WRITE_DATA, vertex_ids, T.value[:1])

    # 5. Advance the coupling
    participant.advance(dt)
    print(f"Max T: {T.value.max():.2f}, Min T: {T.value.min():.2f}")

participant.finalize()
Important notes:
- This is a minimal example with a single interface vertex. Real implementations need proper mesh mapping and careful boundary treatment.
- The precice-config.xml file defines the participants, meshes, data, and coupling scheme, and the names used in the adapter must match it exactly. See Step 4.
- The boundary handling here is simplified; for production use, implement the Neumann condition carefully (e.g., via a face-gradient constraint or a source term, and interpolate between cell and face values at the interface).
Step 4: Create the preCICE Configuration File
preCICE uses an XML configuration file (precice-config.xml) that defines:
- Participants (solvers)
- Coupling interfaces
- Mesh and data definitions
- Communication settings
- Coupling scheme (explicit, implicit, quasi-Newton)
A minimal configuration for FiPy + Python solver coupling (element names follow preCICE v3; validate it against the preCICE reference for your version):
<?xml version="1.0" encoding="UTF-8"?>
<precice-configuration>
  <!-- Data exchanged across the interface -->
  <data:scalar name="Temperature" />
  <data:scalar name="Heat-Flux" />

  <!-- Interface meshes (preCICE meshes are 2-D or 3-D) -->
  <mesh name="Fluid-Solid-Mesh-FiPy" dimensions="2">
    <use-data name="Temperature" />
    <use-data name="Heat-Flux" />
  </mesh>
  <mesh name="Fluid-Solid-Mesh-Simple" dimensions="2">
    <use-data name="Temperature" />
    <use-data name="Heat-Flux" />
  </mesh>

  <!-- Participants -->
  <participant name="FiPy-Solver">
    <provide-mesh name="Fluid-Solid-Mesh-FiPy" />
    <receive-mesh name="Fluid-Solid-Mesh-Simple" from="SimpleSolver" />
    <write-data name="Temperature" mesh="Fluid-Solid-Mesh-FiPy" />
    <read-data name="Heat-Flux" mesh="Fluid-Solid-Mesh-FiPy" />
    <mapping:nearest-neighbor direction="read" from="Fluid-Solid-Mesh-Simple"
                              to="Fluid-Solid-Mesh-FiPy" constraint="consistent" />
  </participant>
  <participant name="SimpleSolver">
    <provide-mesh name="Fluid-Solid-Mesh-Simple" />
    <receive-mesh name="Fluid-Solid-Mesh-FiPy" from="FiPy-Solver" />
    <write-data name="Heat-Flux" mesh="Fluid-Solid-Mesh-Simple" />
    <read-data name="Temperature" mesh="Fluid-Solid-Mesh-Simple" />
    <mapping:nearest-neighbor direction="read" from="Fluid-Solid-Mesh-FiPy"
                              to="Fluid-Solid-Mesh-Simple" constraint="consistent" />
  </participant>

  <!-- Communication between the two participants (both run in the same directory) -->
  <m2n:sockets acceptor="FiPy-Solver" connector="SimpleSolver" exchange-directory="." />

  <!-- Coupling scheme -->
  <coupling-scheme:serial-explicit>
    <participants first="FiPy-Solver" second="SimpleSolver" />
    <max-time value="10.0" />
    <time-window-size value="0.1" />
    <exchange data="Temperature" mesh="Fluid-Solid-Mesh-FiPy"
              from="FiPy-Solver" to="SimpleSolver" />
    <exchange data="Heat-Flux" mesh="Fluid-Solid-Mesh-Simple"
              from="SimpleSolver" to="FiPy-Solver" />
  </coupling-scheme:serial-explicit>
</precice-configuration>
Key elements:
- Two participants: FiPy-Solver and SimpleSolver, each providing its own interface mesh and receiving the other's
- Data exchange: Temperature (FiPy → SimpleSolver) and Heat-Flux (SimpleSolver → FiPy)
- serial-explicit coupling: straightforward explicit time stepping (a good starting point)
- time-window-size: the coupling window at which the two solvers synchronize
You can sanity-check the file before running with precice-tools check precice-config.xml.
For more advanced coupling, consider:
- serial-implicit with quasi-Newton acceleration for tighter coupling
- Under-relaxation to improve convergence
- A different mapping method if the interface meshes don't align well
Step 5: Implement the Second Solver (Simple Example)
To test the coupling, you need a second solver. Here’s a minimal Python solver that exchanges heat flux with FiPy:
solver_simple.py:
import numpy as np
import precice

# Names must match precice-config.xml
MESH_NAME = "Fluid-Solid-Mesh-Simple"
READ_DATA = "Temperature"
WRITE_DATA = "Heat-Flux"

participant = precice.Participant("SimpleSolver", "precice-config.xml", 0, 1)

# Interface mesh: may be completely different from FiPy's. One vertex at the
# interface, embedded as (x, y) since preCICE meshes are 2-D or 3-D.
vertex_ids = participant.set_mesh_vertices(MESH_NAME, np.array([[0.0, 0.0]]))

participant.initialize()

h = 10.0           # Heat transfer coefficient
T_ambient = 250.0  # Ambient fluid temperature (cooler than the solid)

while participant.is_coupling_ongoing():
    dt = participant.get_max_time_step_size()

    # 1. Read the interface temperature computed by FiPy
    T_from_fipy = participant.read_data(MESH_NAME, READ_DATA, vertex_ids, dt)

    # 2. Newton's law of cooling: flux leaving the solid, q = h * (T_solid - T_ambient)
    heat_flux = h * (T_from_fipy - T_ambient)

    # 3. Write the heat flux for FiPy to pick up
    participant.write_data(MESH_NAME, WRITE_DATA, vertex_ids, heat_flux)

    # 4. Advance the coupling
    participant.advance(dt)
    print(f"[SimpleSolver] T_fipy: {T_from_fipy.mean():.2f}, q: {heat_flux.mean():.2f}")

participant.finalize()
This simple solver represents a convective cooling environment that applies a heat flux proportional to the temperature difference from the FiPy solid.
Step 6: Run the Coupled Simulation
- Start both solvers (preCICE connects them once both are running):
# Terminal 1
python fipy_precice_adapter.py
# Terminal 2
python solver_simple.py
- Watch the output. You should see time advancing in both solvers and the temperature and heat-flux values evolving.
- The simulation ends when max-time is reached; is_coupling_ongoing() then returns false and both loops exit.
Expected behavior:
- FiPy temperature should gradually decrease from the initial 300 K toward the cooler ambient
- Heat flux from SimpleSolver should be positive (heat leaving FiPy)
- Both solvers should stay synchronized in time
Step 7: Visualize Results
After running, you can plot FiPy's temperature profile. One simple approach: save the final profile from the adapter, then load and plot it:
import numpy as np
import matplotlib.pyplot as plt
# In the adapter, just before finalizing preCICE, save the profile:
#   np.save("T_final.npy", np.array([np.array(mesh.cellCenters[0]), T.value]))
x, T_final = np.load("T_final.npy")
plt.plot(x, T_final)
plt.xlabel("x")
plt.ylabel("Temperature (K)")
plt.show()
Troubleshooting: Common Issues and Solutions
| Symptom | Likely Cause | Solution |
|---|---|---|
| precice: error: XML file not found | Configuration file missing or wrong path | Ensure precice-config.xml is in the working directory or pass a full path |
| Mesh vertex count mismatch | Data arrays don't match the registered interface vertices | Check that set_mesh_vertices() returned the expected number of IDs and size the data arrays accordingly |
| Data not found in configuration | Data or mesh name doesn't match the XML | Verify the names used in the adapter match the data and mesh entries in precice-config.xml |
| Coupling diverges (temperatures explode) | Explicit scheme unstable for this coupling strength | Switch to implicit coupling with under-relaxation or quasi-Newton acceleration |
| Time step mismatch | A solver steps past the coupling window | Clamp each step with participant.get_max_time_step_size() |
| Segmentation fault in preCICE | Mismatched preCICE versions between participants | Reinstall so all participants link the same version |
| FiPy boundary not updating | Adapter not applying the received flux | Implement the Neumann condition via a face-gradient constraint or a source term |
| Communication timeout | One solver crashed or never started | Check that both solvers are running and call advance() regularly |
Debugging tips:
- Increase preCICE's log verbosity via the log section of precice-config.xml
- Start with a coarse mesh and a short total time to test quickly
- Use serial-explicit coupling first (it is the most forgiving) before trying implicit schemes
- Validate each solver standalone, without coupling, before running them together
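To turn up verbosity, a sketch of a log section at the top of precice-config.xml (attribute names follow the preCICE logging documentation; an empty filter lets everything through):

```xml
<precice-configuration>
  <log>
    <!-- An empty filter lets all messages through, including debug output -->
    <sink type="stream" output="stdout" filter="" enabled="true" />
  </log>
  <!-- ... rest of the configuration ... -->
</precice-configuration>
```

Full debug-level output may additionally require a debug build of preCICE.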
Advanced Topics
Once the basic coupling works, explore:
Implicit Coupling with Quasi-Newton
For tightly coupled problems, explicit schemes may require many sub-iterations or become unstable. preCICE supports implicit coupling with quasi-Newton acceleration:
<coupling-scheme:serial-implicit>
  <participants first="FiPy-Solver" second="SimpleSolver" />
  <max-time value="10.0" />
  <time-window-size value="0.1" />
  <max-iterations value="30" />
  <exchange data="Temperature" mesh="Fluid-Solid-Mesh-FiPy"
            from="FiPy-Solver" to="SimpleSolver" />
  <exchange data="Heat-Flux" mesh="Fluid-Solid-Mesh-Simple"
            from="SimpleSolver" to="FiPy-Solver" />
  <relative-convergence-measure limit="1e-4" data="Temperature" mesh="Fluid-Solid-Mesh-FiPy" />
  <acceleration:IQN-ILS>
    <data name="Heat-Flux" mesh="Fluid-Solid-Mesh-Simple" />
    <initial-relaxation value="0.5" />
  </acceleration:IQN-ILS>
</coupling-scheme:serial-implicit>
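Implicit coupling also puts one extra obligation on each adapter: when preCICE repeats a time window, the solver must roll its state back to the window start. A sketch using the preCICE v3 checkpoint API (run_implicit, state, and solve_step are hypothetical stand-ins for the real solver):

```python
def run_implicit(participant, state, solve_step):
    """Implicit coupling loop with checkpointing (preCICE v3 method names)."""
    participant.initialize()
    checkpoint = state.copy()
    while participant.is_coupling_ongoing():
        if participant.requires_writing_checkpoint():
            checkpoint = state.copy()     # save state at the window start
        dt = participant.get_max_time_step_size()
        state = solve_step(state, dt)     # advance the solver
        participant.advance(dt)
        if participant.requires_reading_checkpoint():
            state = checkpoint.copy()     # window repeats: roll back
    participant.finalize()
    return state
```

Without the rollback, repeated windows would advance the solver twice and silently corrupt the solution.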
Mesh-to-Mesh Mapping
If FiPy's interface mesh doesn't match the other solver's mesh, preCICE performs mesh mapping (interpolation). The mapping is configured inside the participant that receives the other mesh, between the received mesh and its own:
<participant name="FiPy-Solver">
  <provide-mesh name="Fluid-Solid-Mesh-FiPy" />
  <receive-mesh name="Fluid-Solid-Mesh-Simple" from="SimpleSolver" />
  <mapping:nearest-neighbor direction="read" from="Fluid-Solid-Mesh-Simple"
                            to="Fluid-Solid-Mesh-FiPy" constraint="consistent" />
</participant>
Common methods: nearest-neighbor, nearest-projection, and radial-basis-function (RBF) variants. Choose based on mesh quality and the physics.
Handling Non-Matching Time Steps
preCICE allows each solver to use its own time step and subcycle within a coupling window. Clamp every step with get_max_time_step_size() so you never overshoot the window boundary; preCICE v3 can also provide read data at intermediate times inside a window. Large differences between step sizes can still reduce accuracy.
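The clamping logic a subcycling solver applies inside one coupling window can be sketched as a pure function (no preCICE required; subcycle_steps is a hypothetical helper mirroring min(dt, get_max_time_step_size())):

```python
def subcycle_steps(window_size, solver_dt, eps=1e-12):
    """Steps a solver takes inside one coupling window: its own dt,
    with the final step clamped so the window boundary is hit exactly."""
    steps, remaining = [], window_size
    while remaining > eps:
        dt = min(solver_dt, remaining)  # mirrors clamping against preCICE
        steps.append(dt)
        remaining -= dt
    return steps

print(subcycle_steps(0.1, 0.03))  # four steps: three of 0.03, one clamped
```

The steps always sum to the window size, so the solver and preCICE agree on where each window ends.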
Performance Considerations
- Communication overhead can dominate for small problems. Profile to ensure coupling isn't the bottleneck.
- Under-relaxation improves stability but slows convergence.
- Mesh resolution on the interface affects mapping accuracy. Use sufficiently refined interface meshes.
- Parallel scaling: preCICE works with MPI-enabled solvers. Ensure both FiPy (e.g., with PETSc) and the other solver run in parallel if needed.
Next Steps
After mastering basic coupling:
- Try a real second solver (e.g., OpenFOAM for fluid flow) instead of the Python placeholder
- Explore advanced coupling schemes (serial-implicit, multi)
- Add more physics (e.g., couple three solvers: thermal, fluid, structural)
- Implement proper boundary conditions in FiPy using FaceVariable or custom terms
- Optimize performance with parallel execution and profiling
- Study case studies like thermal-fluid coupling or fluid-structure interaction
- Study case studies like thermal-fluid coupling or fluid-structure interaction
For deeper FiPy knowledge, review From Equations to Simulations: The Modeling Pipeline and Finite Volume Method Explained Simply.
Further Reading
- What Is FiPy and When Should You Use It? – FiPy fundamentals and use cases
- Installing and Setting Up FiPy for the First Time – Environment setup and troubleshooting
- What Is Scientific Simulation and Why It Matters – Simulation methodology and best practices
- From Equations to Simulations: The Modeling Pipeline – End-to-end workflow from equations to validated results
- Finite Volume Method Explained Simply – Core numerical method behind FiPy
- Understanding Phase-Field Models in Materials Science – Multi-physics phase-field applications
- Why Issue Tracking Is Critical in Scientific Projects – Reproducibility and collaboration practices
Need Help with Your Multi-Physics Project?
Building robust coupled simulations requires deep expertise in both numerical methods and software integration. If you’re tackling a complex multi-physics problem and need guidance on:
- Designing a coupling workflow with preCICE
- Implementing FiPy adapters for your specific physics
- Debugging convergence issues
- Optimizing performance for large-scale simulations
MatForge offers consultation services for research software workflows. We can help you set up reliable, reproducible multi-physics simulations tailored to your project. Visit MatForge homepage to learn more and discuss your specific needs.
Summary
- preCICE enables partitioned multi-physics coupling, allowing FiPy to work with other solvers
- Partitioned approach is practical for research: reuse existing codes, maintain flexibility
- Key steps: install preCICE, write adapter, configure XML, implement boundary data exchange
- Start simple: explicit coupling with a minimal example, then progress to implicit schemes
- Validate each solver standalone first, then together
- Monitor convergence and use under-relaxation if needed
With this foundation, you can extend FiPy to virtually any multi-physics scenario.