Light Propagation
Jon is interested in adding a better light calculation to his PV code. At the moment the Beer-Lambert law is being used for the light. Firstly, we need to understand what is wrong with the current technique. We've been discussing the use of the level set method / fast marching method as an alternative to FiPy's basic first order upwind scheme, which is quite inefficient and is not working for some problems. Some more thoughts:
- We don't have a second order accurate implicit scheme in FiPy.
- Would using a second order explicit scheme be very inefficient? It is a one-time cost since the light field is static, but it could get prohibitively bad for the sort of meshes Jon may need for accurate solutions.
- The FMM will be way more efficient and we have second order schemes available, but not on unstructured meshes.
- Can we even solve the Beer-Lambert equation using the Eikonal equation? (A sketch of the connection follows this list.)
- We need to understand what is going wrong with the current solution that Jon showed me in 1D and why it seems to deviate from the analytical solution. I believe it is due to the grid spacing.
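On the Eikonal question above, here is a rough sketch of the connection (my own reasoning, not anything taken from the PV code). For light travelling along a fixed direction n̂, the Beer-Lambert law reads n̂·∇I = -αI. Writing the intensity in terms of an optical depth, I = I₀ exp(-τ), gives

\[ \hat{n} \cdot \nabla \tau = \alpha , \]

and if τ increases along the ray direction (so ∇τ is parallel to n̂) this is just the Eikonal equation |∇τ| = α, i.e. a front propagating with speed 1/α, which is exactly the kind of problem the fast marching method solves.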
Given the Beer-Lambert equation (in its standard differential form, dI/dx = -αI), let's try to solve a 1D problem with a shock using the implicit upwind scheme. See trunk/misc/light.py. With a fixed grid things seem quite good even though this is only a first order scheme; I'm not sure whether this carries over to higher dimensions. Basically the velocity goes from 10 to 1 half way across the domain and we see a shock as expected (everything else is held constant). The results are not so bad.
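For reference, here is a minimal sketch of this kind of setup in FiPy. It is not the actual trunk/misc/light.py; the mesh size, attenuation coefficient and boundary value are illustrative choices.

from fipy import Grid1D, CellVariable, FaceVariable
from fipy import UpwindConvectionTerm, ImplicitSourceTerm

nx, L = 100, 1.0
mesh = Grid1D(nx=nx, dx=L / nx)

# convection coefficient ("velocity") drops from 10 to 1 half way across
velocity = FaceVariable(mesh=mesh, rank=1, value=(10.0,))
velocity.setValue((1.0,), where=mesh.faceCenters[0] > L / 2)

alpha = 1.0                        # attenuation coefficient (illustrative)

I = CellVariable(mesh=mesh, name="intensity")
I.constrain(1.0, mesh.facesLeft)   # incoming light at x = 0

# steady state d(v I)/dx + alpha I = 0, first order implicit upwind
eq = UpwindConvectionTerm(coeff=velocity) + ImplicitSourceTerm(coeff=alpha)
eq.solve(var=I)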
The upwind implicit scheme can deal with shocks (at least in 1D). Let's observe how it deals with a large change in the mesh density (see trunk/misc/lightVariableMesh.py, and the mesh construction sketch after these observations). Let's try a 100-fold change: a small jump occurs.
A ten-fold increase: seems fine.
A thousand-fold increase: big jump.
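For concreteness, an abrupt jump in mesh density can be set up in FiPy by passing an array of cell spacings to Grid1D. This is only a sketch of the idea, not the contents of trunk/misc/lightVariableMesh.py; the spacings and cell counts are illustrative.

import numpy as np
from fipy import Grid1D

dx_fine, dx_coarse = 0.001, 0.1     # a 100-fold jump in spacing
nx_fine, nx_coarse = 500, 5         # each half of the domain spans 0.5
dx = np.concatenate([np.repeat(dx_fine, nx_fine),
                     np.repeat(dx_coarse, nx_coarse)])
mesh = Grid1D(dx=dx)                # Grid1D accepts an array of spacings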
The jumps are not surprising and are consistent with a first order scheme. The tail has the right shape even though it has a large displacement. The PV 1D result did not have a tail that modeled the local solution correctly. Maybe that is due to a gradual change in mesh density rather than a single jump.
Adding a gradual change example (trunk/misc/lightVariableMeshGradual.py) shows this; a sketch of such a graded mesh follows.
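Again only a sketch (not the actual trunk/misc/lightVariableMeshGradual.py; the growth ratio is an illustrative choice), using a geometric progression of spacings:

import numpy as np
from fipy import Grid1D

nx, ratio = 200, 1.02                 # spacing grows by 2% per cell
dx = 0.001 * ratio ** np.arange(nx)   # dx_i = dx_0 * ratio**i
mesh = Grid1D(dx=dx)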
This could be the issue with the PV code.
Tom's Electrical Problem
Tom has come up with a new problem that involves solving for the electric field. Here are the equations at the electrode interface:
and in the bulk:
on the top electrode:
At first glance, I didn't see that this system is closed. Since the equation is steady state and surrounded by Neumann boundary conditions, it only needs one boundary condition to fix the value (it is an initial value problem for the slope of the potential rather than a boundary value problem for the potential itself).
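To convince myself of that, here is a generic 1D illustration (not Tom's actual equations): integrating a Poisson-type equation once from a known flux at one electrode fixes the slope everywhere, so the slope is determined as an initial value problem,

\[ \psi''(x) = f(x), \quad \psi'(0) = g_0 \;\Longrightarrow\; \psi'(x) = g_0 + \int_0^x f(s)\, \mathrm{d}s . \]

A single condition fixing the value of ψ somewhere then pins down the remaining additive constant, which is why only one boundary is needed to fix the value.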
Solving for the 1D problem
We have (assuming the normal points out of the electrolyte)
where the length parameter is the distance between the electrodes.
This leads to an ODE for the potential.
This can be solved with scipy's ODE integrator. To do this I adapted the example at <http://docs.scipy.org/doc/scipy/reference/generated/scipy.integrate.ode.html>. The code is source:trunk/moffat/electrical/1D.py.
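As a reminder of the pattern, here is a minimal sketch of the scipy.integrate.ode usage (following the linked docs example). The right hand side, the coefficient kappa, the electrode spacing L and the initial values below are illustrative placeholders, not Tom's actual equations or the contents of source:trunk/moffat/electrical/1D.py.

from scipy.integrate import ode

kappa = 1.0       # illustrative coefficient
L = 1.0           # distance between the electrodes (illustrative)

def rhs(x, y):
    """y[0] is the potential, y[1] is its slope (illustrative ODE)."""
    psi, dpsi = y
    return [dpsi, kappa ** 2 * psi]        # placeholder: psi'' = kappa**2 psi

solver = ode(rhs).set_integrator('vode', method='bdf')
solver.set_initial_value([0.0, 1.0], 0.0)  # value and slope at the first electrode

xs, psis = [0.0], [0.0]
dx = L / 100.0
while solver.successful() and solver.t < L:
    solver.integrate(solver.t + dx)
    xs.append(solver.t)
    psis.append(solver.y[0])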
The results are shown in two plots. With the chosen input numbers we get the corresponding steady state values.
Installing SAMRAI
In order to use lsmlib in parallel, I need to install SAMRAI. Of course this is easier said than done. I downloaded SAMRAI 2.4.4 and did the following on bunter:
$ mkdir build
$ cd build
$ ../configure --prefix=/users/wd15/.virtualenvs/lsmlib --with-F77=/usr/bin/gfortran
checking build system type... x86_64-unknown-linux-gnu
...
config.status: executing default commands
$ make
make NDIM=1 compile || exit 1
...
g++ -g -Wall -fno-implicit-templates -I. -I../../../include -I../../../../include -I/usr/lib/openmpi/include -c ../../../../source/toolbox/restartdb/HDFDatabaseFactory.C -o HDFDatabaseFactory.o
../../../../source/toolbox/restartdb/HDFDatabaseFactory.C: In member function ‘virtual SAMRAI::tbox::Pointer<SAMRAI::tbox::Database> SAMRAI::tbox::HDFDatabaseFactory::allocate(const std::string&)’:
../../../../source/toolbox/restartdb/HDFDatabaseFactory.C:20: error: ‘HDFDatabase’ was not declared in this scope
../../../../source/toolbox/restartdb/HDFDatabaseFactory.C:20: error: template argument 1 is invalid
../../../../source/toolbox/restartdb/HDFDatabaseFactory.C:20: error: invalid type in declaration before ‘=’ token
../../../../source/toolbox/restartdb/HDFDatabaseFactory.C:20: error: expected type-specifier before ‘HDFDatabase’
../../../../source/toolbox/restartdb/HDFDatabaseFactory.C:20: error: invalid conversion from ‘int*’ to ‘int’
../../../../source/toolbox/restartdb/HDFDatabaseFactory.C:20: error: expected ‘,’ or ‘;’ before ‘HDFDatabase’
../../../../source/toolbox/restartdb/HDFDatabaseFactory.C:21: error: invalid conversion from ‘int’ to ‘SAMRAI::tbox::Database*’
../../../../source/toolbox/restartdb/HDFDatabaseFactory.C:21: error: initializing argument 1 of ‘SAMRAI::tbox::Pointer<TYPE>::Pointer(TYPE*, bool) [with TYPE = SAMRAI::tbox::Database]’
make[4]: *** [HDFDatabaseFactory.o] Error 1
make[4]: Leaving directory `/users/wd15/packages/SAMRAI/build/source/toolbox/restartdb'
make[3]: *** [libXd] Error 1
make[3]: Leaving directory `/users/wd15/packages/SAMRAI/build/source/toolbox'
make[2]: *** [libXd] Error 1
make[2]: Leaving directory `/users/wd15/packages/SAMRAI/build/source'
make[1]: *** [compile] Error 1
make[1]: Leaving directory `/users/wd15/packages/SAMRAI/build'
make: *** [library] Error 1
So that didn't work. I had the following exchange with someone on the SAMRAI list. I wrote
Hi,

I'm trying to build SAMRAI version 2.4.4. The configure step seems to have no issues, but the compilation step throws the following error.

$ make
make NDIM=1 compile || exit 1
make[1]: Entering directory `/users/wd15/Documents/python/SAMRAI'
(cd source && make libXd) || exit 1
...
...
...
g++ -g -Wall -fno-implicit-templates -I. -I../../../include -I../../../include -I/usr/lib/openmpi/include -c HDFDatabaseFactory.C -o HDFDatabaseFactory.o
HDFDatabaseFactory.C: In member function ‘virtual SAMRAI::tbox::Pointer<SAMRAI::tbox::Database> SAMRAI::tbox::HDFDatabaseFactory::allocate(const std::string&)’:
HDFDatabaseFactory.C:20: error: ‘HDFDatabase’ was not declared in this scope
HDFDatabaseFactory.C:20: error: template argument 1 is invalid
HDFDatabaseFactory.C:20: error: invalid type in declaration before ‘=’ token
HDFDatabaseFactory.C:20: error: expected type-specifier before ‘HDFDatabase’
HDFDatabaseFactory.C:20: error: invalid conversion from ‘int*’ to ‘int’
HDFDatabaseFactory.C:20: error: expected ‘,’ or ‘;’ before ‘HDFDatabase’
HDFDatabaseFactory.C:21: error: invalid conversion from ‘int’ to ‘SAMRAI::tbox::Database*’
HDFDatabaseFactory.C:21: error: initializing argument 1 of ‘SAMRAI::tbox::Pointer<TYPE>::Pointer(TYPE*, bool) [with TYPE = SAMRAI::tbox::Database]’
make[4]: *** [HDFDatabaseFactory.o] Error 1
...
...
...
make[1]: *** [compile] Error 1
make[1]: Leaving directory `/users/wd15/Documents/python/SAMRAI'
make: *** [library] Error 1

Any ideas what the issue may be? Other details:

$ uname -a
Linux bunter 2.6.32-5-amd64 #1 SMP Mon Oct 3 03:59:20 UTC 2011 x86_64 GNU/Linux
$ lsb_release -a
No LSB modules are available.
Distributor ID: Debian
Description: Debian GNU/Linux 6.0.3 (squeeze)
Release: 6.0.3
Codename: squeeze

All other packages are at standard versions for Debian Squeeze including the HDF5 libraries.

Thanks
and I got the reply
Daniel,

HDFDatabase.h is guarded with a #ifdef HAVE_HDF5. From the error message it appears that HAVE_HDF5 is not defined. Grep for HAVE_HDF5 in SAMRAI_config.h to see if it is defined. I suspect that something went wrong during configuration and you are not getting HDF5. Another clue that this is likely the problem is that there is no hdf5 include directory on the compile line.

Let us know what you find.

-- Bill Arrighi
Lawrence Livermore National Lab
So I tried a new configure step to include HDF5:
$ ../configure --prefix=/users/wd15/.virtualenvs/lsmlib --with-F77=/usr/bin/gfortran --with-hdf5
checking build system type... x86_64-unknown-linux-gnu
...
checking for HDF5 installation... /usr
checking whether HDF5 link works... no
configure: error: HDF5 compile/link test failed
Andrew suggested debugging the configure script with the -x flag. The "set -x" was placed on line 19422. Anyway, it showed that the configure script is trying to build and link the following C++ file (conftest.cpp):
#include "hdf5.h" #define FILE "file.h5" int main() { hid_t file_id; /* file identifier */ herr_t status; /* Create a new file using default properties. */ file_id = H5Fcreate(FILE, H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT); /* Terminate access to the file. */ status = H5Fclose(file_id); }
with
$ g++ -o conftest -g -O2 conftest.cpp -lz -ldl -lm -lz -lm
/tmp/ccn31txn.o: In function `main':
/users/wd15/packages/SAMRAI/build/conftest.cpp:10: undefined reference to `H5check_version'
/users/wd15/packages/SAMRAI/build/conftest.cpp:10: undefined reference to `H5Fcreate'
/users/wd15/packages/SAMRAI/build/conftest.cpp:13: undefined reference to `H5Fclose'
collect2: ld returned 1 exit status
if I do
$ g++ -o conftest -g -O2 conftest.cpp -lz -ldl -lm -lz -lm -lhdf5
$ ./conftest
it works fine. Looking at the configure code, it looks like it wants a directory for --with-hdf5:
../configure --prefix=/users/wd15/.virtualenvs/lsmlib --with-F77=/usr/bin/gfortran --with-hdf5=/usr
gets through the configure step without issue. However, make fails with loads of errors:
g++ -g -Wall -fno-implicit-templates -I. -I../../../include -I../../../../include -I/usr/include -I/usr/lib/openmpi/include -c ../../../../source/toolbox/restartdb/HDFDatabase.C -o HDFDatabase.o
/usr/include/H5Gpublic.h: In static member function ‘static herr_t SAMRAI::tbox::HDFDatabase::iterateKeys(hid_t, const char*, void*)’:
/usr/include/H5Gpublic.h:148: error: too many arguments to function ‘hid_t H5Gopen1(hid_t, const char*)’
../../../../source/toolbox/restartdb/HDFDatabase.C:160: error: at this point in file
/usr/include/H5Gpublic.h:148: error: too many arguments to function ‘hid_t H5Gopen1(hid_t, const char*)’
../../../../source/toolbox/restartdb/HDFDatabase.C:180: error: at this point in file
../../../../source/toolbox/restartdb/HDFDatabase.C:204: error: invalid conversion from ‘herr_t (**)(hid_t, void*)’ to ‘void**’
/usr/include/H5Epublic.h:212: error: too many arguments to function ‘herr_t H5Eget_auto1(herr_t (**)(void*), void**)’
../../../../source/toolbox/restartdb/HDFDatabase.C:204: error: at this point in file
/usr/include/H5Epublic.h:216: error: too many arguments to function ‘herr_t H5Eset_auto1(herr_t (*)(void*), void*)’
...
make[4]: *** [HDFDatabase.o] Error 1
make[4]: Leaving directory `/users/wd15/packages/SAMRAI/build/source/toolbox/restartdb'
make[3]: *** [libXd] Error 1
make[3]: Leaving directory `/users/wd15/packages/SAMRAI/build/source/toolbox'
make[2]: *** [libXd] Error 1
make[2]: Leaving directory `/users/wd15/packages/SAMRAI/build/source'
make[1]: *** [compile] Error 1
make[1]: Leaving directory `/users/wd15/packages/SAMRAI/build'
make: *** [library] Error 1