#1 2016-07-11 19:19:21

reiteres
Member
Registered: 2013-06-03
Posts: 557

Parallel installation of aster-13.2: blacs not found

Hi!

I have the following problem:
I installed the aster-full package 13.2 without problems. When I try to install the MPI version as usual, I run into the problem that
waf does not find blacs, although it should be there (either in the library path or as part of scalapack). See here:

[maldun@drachenhorst ~/Salome_Meca/devel/13.2/src]$ ./waf configure --use-config-dir=$ASTER_ROOT/13.2/share/aster/ --use-config=Fedora_gnu_mpi --prefix=$ASTER_ROOT/13.2mpi
Setting top to                           : /home/maldun/Salome_Meca/devel/13.2/src 
Setting out to                           : /home/maldun/Salome_Meca/devel/13.2/src/build 
Setting prefix to                        : /home/maldun/Salome_Meca/devel/13.2/13.2mpi 
Checking for custom configuration        : Fedora_gnu_mpi 
Checking for 'mpicc'                     : yes 
Checking for 'mpif90'                    : yes 
Checking for 'gcc' (c compiler)          : mpicc 
Checking for 'g++' (c++ compiler)        : mpicxx 
Checking for 'gfortran' (fortran compiler) : mpif90 
Checking for header mpi.h                : yes 
Checking for C compiler version          : gcc 5.3.1 
Checking for Fortran compiler version    : gfortran 5.3.1 
fortran link verbose flag                : -v 
Checking for OpenMP flag -fopenmp        : yes 
Getting fortran runtime link flags       : ok (-L/usr/lib64/openmpi/lib -L/lib64/codeaster/mumps-5.0.1-openmpi/ -L/lib64/codeaster/scalapack-openmpi-2.0.2/lib/ -L/lib64/codeaster/petsc-3.4.5/lib -L/home/maldun/Salome_Meca/devel/13.2/public/hdf5-1.8.14/lib -L/home/maldun/Salome_Meca/devel/13.2/public/med-3.2.0/lib -L/usr/lib64/ -L/home/maldun/Salome_Meca/devel/13.2/public/mfront-2.0.3/lib -L/home/maldun/Salome_Meca/devel/13.2/public/hdf5-1.8.14/lib -L/home/maldun/Salome_Meca/devel/13.2/public/med-3.2.0/lib -L/home/maldun/Salome_Meca/devel/13.2/public/mumps-5.0.1/lib -L/home/maldun/Salome_Meca/devel/13.2/public/metis-5.1.0/lib -L/home/maldun/Salome_Meca/devel/13.2/public/mfront-2.0.3/lib -L/home/maldun/Salome_Meca/devel/13.2/public/scotch-6.0.4/lib -L/home/maldun/Salome_Meca/devel/13.2/public/hdf5-1.8.14/lib64 -L/home/maldun/Salome_Meca/devel/13.2/public/med-3.2.0/lib64 -L/usr/lib64/openmpi/lib -L/usr/lib64/openmpi/lib -L/lib64/codeaster/mumps-5.0.1-openmpi/ -L/lib64/codeaster/scalapack-openmpi-2.0.2/lib/ -L/lib64/codeaster/petsc-3.4.5/lib -L/home/maldun/Salome_Meca/devel/13.2/public/hdf5-1.8.14/lib -L/home/maldun/Salome_Meca/devel/13.2/public/med-3.2.0/lib -L/usr/lib64/ -L/home/maldun/Salome_Meca/devel/13.2/public/mfront-2.0.3/lib -L/home/maldun/Salome_Meca/devel/13.2/public/hdf5-1.8.14/lib -L/home/maldun/Salome_Meca/devel/13.2/public/med-3.2.0/lib -L/home/maldun/Salome_Meca/devel/13.2/public/mumps-5.0.1/lib -L/home/maldun/Salome_Meca/devel/13.2/public/metis-5.1.0/lib -L/home/maldun/Salome_Meca/devel/13.2/public/mfront-2.0.3/lib -L/home/maldun/Salome_Meca/devel/13.2/public/scotch-6.0.4/lib -L/home/maldun/Salome_Meca/devel/13.2/public/hdf5-1.8.14/lib64 -L/home/maldun/Salome_Meca/devel/13.2/public/med-3.2.0/lib64 -L/usr/lib64/openmpi/lib -L/usr/lib64/openmpi/lib -L/lib64/codeaster/mumps-5.0.1-openmpi/ -L/lib64/codeaster/scalapack-openmpi-2.0.2/lib/ -L/lib64/codeaster/petsc-3.4.5/lib -L/home/maldun/Salome_Meca/devel/13.2/public/hdf5-1.8.14/lib -L/home/maldun/Salome_Meca/devel/13.2/public/med-3.2.0/lib -L/usr/lib64/ -L/home/maldun/Salome_Meca/devel/13.2/public/mfront-2.0.3/lib -L/home/maldun/Salome_Meca/devel/13.2/public/hdf5-1.8.14/lib -L/home/maldun/Salome_Meca/devel/13.2/public/med-3.2.0/lib -L/home/maldun/Salome_Meca/devel/13.2/public/mumps-5.0.1/lib -L/home/maldun/Salome_Meca/devel/13.2/public/metis-5.1.0/lib -L/home/maldun/Salome_Meca/devel/13.2/public/mfront-2.0.3/lib -L/home/maldun/Salome_Meca/devel/13.2/public/scotch-6.0.4/lib -L/home/maldun/Salome_Meca/devel/13.2/public/hdf5-1.8.14/lib64 -L/home/maldun/Salome_Meca/devel/13.2/public/med-3.2.0/lib64 -L/usr/lib64/openmpi/lib -L/usr/lib/gcc/x86_64-redhat-linux/5.3.1 -L/usr/lib/gcc/x86_64-redhat-linux/5.3.1/../../../../lib64 -L/lib/../lib64 -L/usr/lib/../lib64 -L. -L/usr/lib/gcc/x86_64-redhat-linux/5.3.1/../../.. -lX11 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lm -lquadmath -lm -lpthread) 
Checking for library pthread             : yes 
Checking for library dl                  : yes 
Checking for library util                : yes 
Checking for program python              : /usr/bin/python 
Checking for python version              : (2, 7, 11, 'final', 0) 
Checking for library python2.7 in LIBDIR : yes 
Checking for program /usr/bin/python-config,python2.7-config,python-config-2.7,python2.7m-config : /usr/bin/python-config 
Checking for header Python.h             : yes 
Checking for numpy                       : yes 
Checking for numpy version               : 1.9.2 
Checking for numpy include               : ['/usr/lib64/python2.7/site-packages/numpy/core/include'] 
Getting platform                         : LINUX64 
Checking for library pthread             : yes 
Checking for library m                   : yes 
Checking for static library openblas     : yes 
Setting libm after files                 : yes ("-lm" removed from LINKFLAGS_CLIB) 
Checking for a program using blas/lapack : yes 
Checking for a program using blacs       : no 
The configuration failed
(complete log in /home/maldun/Salome_Meca/devel/13.2/src/build/config.log)

Normally scalapack should contain the missing reference, and this worked for 13.0 and 13.1 without problems.
I have attached the log file. Please help.


Attachments:
config.log, Size: 82.6 KiB, Downloads: 153

#2 2016-07-14 13:32:03

reiteres
Member
Registered: 2013-06-03
Posts: 557

Re: Parallel installation of aster-13.2: blacs not found

Hi again!

The problem seems to be in waf. I just commented out the critical line in the blacs test, and everything built fine.
I also compiled the test program manually, and it worked fine.
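
If you want to check by hand that the blacs symbols are really exported by scalapack before touching waf, a quick sketch like the one below can help (the path is the scalapack location from the configure log above, and the library file name libscalapack.so is an assumption; adjust both to your installation):

# Sketch: look for the blacs_pinfo symbol in the scalapack shared library.
# Path and file name are assumptions taken from the configure log above.
import subprocess

lib = "/lib64/codeaster/scalapack-openmpi-2.0.2/lib/libscalapack.so"
symbols = subprocess.check_output(["nm", "-D", lib], universal_newlines=True)
hits = [line for line in symbols.splitlines() if "blacs_pinfo" in line.lower()]
print("\n".join(hits) if hits else "blacs_pinfo not found in " + lib)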

#3 2016-07-14 19:30:44

3rav
Member
From: Poland - Cracow
Registered: 2016-03-30
Posts: 58

Re: Parallel installation of aster-13.2: blacs not found

Hello,
I have the same problem. Which file should be modified?

Regards
Rafal

#4 2016-07-15 14:32:41

reiteres
Member
Registered: 2013-06-03
Posts: 557

Re: Parallel installation of aster-13.2: blacs not found

Hi!

You have to change the lines at the end of the file waftools/mathematics.py
namely

# program testing a blacs call, output is 0 and 1
blacs_fragment = r"""
program test_blacs
    integer iam, nprocs
    call blacs_pinfo (iam, nprocs)
    print *,iam
    print *,nprocs
end program test_blacs
"""

to

# program testing a blacs call, output is 0 and 1
blacs_fragment = r"""
program test_blacs
    integer iam, nprocs
    !call blacs_pinfo (iam, nprocs)
    !print *,iam
    !print *,nprocs
end program test_blacs
"""

But more trouble is on the way: hdf5 is not correctly recognized with MPI; one has to add the -ldl flag in the config file for waf:

self.env.append_unique('LINKFLAGS', ['-Wl,--allow-multiple-definition','-ldl'])
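
For example, in my case this goes into the configure() function of the waf configuration file (Fedora_gnu_mpi.py); the lines below are only a sketch of such a file, adjust to your own setup:

# Sketch of a waf configuration file (e.g. Fedora_gnu_mpi.py) used with
# ./waf configure --use-config=Fedora_gnu_mpi; only the relevant part is shown.
def configure(self):
    # allow duplicate symbols between blacs/scalapack and link libdl explicitly
    self.env.append_unique('LINKFLAGS',
                           ['-Wl,--allow-multiple-definition', '-ldl'])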

I still have trouble with mumps, though...

#5 2016-07-18 07:07:43

3rav
Member
From: Poland - Cracow
Registered: 2016-03-30
Posts: 58

Re: Parallel installation of aster-13.2: blacs not found

Hi,

Thanks for the advice.

Regards

Rafal

#6 2016-07-22 17:31:15

TMK
Member
Registered: 2015-05-11
Posts: 36

Re: Parallel installation of aster-13.2: blacs not found

Hi,

I am also having this same problem. When building MUMPS 5.0.1 with waf, I get an error when checking the blacs library: the scalapack library is missing. I also tried changing waftools/mathematics.py and the config file for waf as reiteres suggested, but it still does not work. Am I doing something wrong here, or has somebody managed to install parallel aster 13.2, or at least MUMPS 5.0.1 in parallel?

Br,

TMK


Attachments:
config.log, Size: 116.17 KiB, Downloads: 141

#7 2016-07-28 08:15:05

mathieu.courtois
Administrator
From: France
Registered: 2007-11-21
Posts: 1,146

Re: Parallel installation of aster-13.2: blacs not found

Of course, if you comment out the symbol call, waf will no longer check whether blacs is found!
(blacs is only required for mumps, not for code_aster itself.)


reiteres wrote:

Hi!

You have to change the lines at the end of the file waftools/mathematics.py
namely

# program testing a blacs call, output is 0 and 1
blacs_fragment = r"""
program test_blacs
    integer iam, nprocs
    call blacs_pinfo (iam, nprocs)
    print *,iam
    print *,nprocs
end program test_blacs
"""

to

# program testing a blacs call, output is 0 and 1
blacs_fragment = r"""
program test_blacs
    integer iam, nprocs
    !call blacs_pinfo (iam, nprocs)
    !print *,iam
    !print *,nprocs
end program test_blacs
"""

But more trouble is on the way: hdf5 is not correctly recognized with MPI; one has to add the -ldl flag in the config file for waf:

self.env.append_unique('LINKFLAGS', ['-Wl,--allow-multiple-definition','-ldl'])

I still have trouble with mumps, though...


Code_Aster release : last unstable on Ubuntu 16.04 64 bits - GNU Compilers

Please do not forget to tag your first post as *SOLVED* when it is!

#8 2016-08-01 08:31:50

TMK
Member
Registered: 2015-05-11
Posts: 36

Re: Parallel installation of aster-13.2: blacs not found

Hi,

Sorry, but I am very inexperienced when it comes to installing things on Linux. Is it possible to provide some example of how to get this working? Which lines should I comment out, do I use the # mark for comments, and what do these exclamation marks mean?

Br,

TMK

#9 2016-08-04 23:09:17

katoulos
Member
Registered: 2011-06-10
Posts: 61

Re: Parallel installation of aster-13.2: blacs not found

I had the same problem. I added an environment variable (which waf reads, I think):

export LIBPATH=/pfs/nobackup/home/m/mimis14/opt/ScaLAPACK/2.0.2/lib/

That seems to have done the trick.
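
If you would rather not rely on the environment, the same path can probably be added in the waf configuration file instead; a sketch (using my ScaLAPACK path as the example):

# Sketch of the equivalent setting inside a waf configuration file;
# replace the path with your own ScaLAPACK lib directory.
def configure(self):
    self.env.append_value('LIBPATH',
                          ['/pfs/nobackup/home/m/mimis14/opt/ScaLAPACK/2.0.2/lib/'])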

mimis

#10 2016-11-01 09:55:46

TMK
Member
Registered: 2015-05-11
Posts: 36

Re: Parallel installation of aster-13.2: blacs not found

Hi, I have now managed to install MUMPS (it works with Ubuntu 14), but I still have problems when configuring aster. Now I get:

Checking for a program using blas/lapack : yes
Checking for a program using blacs       : yes
Checking for a program using omp thread  : yes (on 8 threads)
Checking for static library hdf5         : yes
Checking for static library z            : yes
Checking for header hdf5.h               : yes
Checking hdf5 version                    : 1.8.14
Checking for API hdf5 v18                : default v18
Checking for static library med          : yes
Checking for static library stdc++       : yes
Checking for header med.h                : yes
Checking size of med_int integers        : 4
Checking med version                     : 3.2.0
Checking for library metis               : yes
Checking for header metis.h              : yes
Checking metis version                   : 5.1.0
Checking for smumps_struc.h              : no
The configuration failed

What might cause this problem, and does anyone have a solution for it?

Br,

TMK


Attachments:
config.log, Size: 89.09 KiB, Downloads: 95

#11 2016-11-28 10:10:49

TOBI
Member
Registered: 2013-12-02
Posts: 44

Re: Parallel installation of aster-13.2: blacs not found

Hello everybody,
Hello TMK,
I installed my first parallel version of Code_Aster 13.2, also on Ubuntu 16.04, on Friday. During the installation, everything worked well until it came to configuring waf for the parallel build. I followed the instructions here:
https://sites.google.com/site/codeaster … aster-12-6
First, I also got the error:

Checking for a program using blacs       : no

But I managed it by changing waftools/mathematics.py as reiteres did.
After this I got the same error as TMK:

Checking for smumps_struc.h              : no 

Searching for smumps_struc.h, I found out that the directory where this file is located is not included in the INCLUDES definition in the configuration file Ubuntu_gnu_mpi.py. So I changed:

   self.env.prepend_value('INCLUDES', [
        '/opt/petsc-3.6.4/arch-linux2-c-opt/include',
        '/opt/petsc-3.6.4/include'])

to

   self.env.prepend_value('INCLUDES', [
        '/opt/mumps-5.0.1_mpi/include', # add this line
        '/opt/petsc-3.6.4/arch-linux2-c-opt/include',
        '/opt/petsc-3.6.4/include'])

This solves the problem with smumps_struc.h, and is maybe the solution for you too, TMK?

But I got an additional error that no hdf5 version was found. I'm a little bit inexperienced with this, but I tried commenting out the line

 # opts.embed_hdf5 = True # commented out

in Ubuntu_gnu.py.
After this the configuration and installation finished successfully. The parallel version works well with MUMPS and PETSc.

But I have one final problem: I cannot set mpi_nbcpu higher than 1, so I cannot benefit from the parallel processing. I don't know whether it depends on the calculation or on my settings in ASTK and the COMM file. So I started another topic for this on Friday, here:
http://web-code-aster.org/forum2/viewtopic.php?id=21201
I hope my explanations help to solve your problem; maybe you have the same problem as me after installation? Maybe someone has an idea for me. Attached you will find my config.log.

Best regards
TOBI


Attachments:
config.log, Size: 273.97 KiB, Downloads: 98

#12 2016-12-12 12:47:28

TMK
Member
Registered: 2015-05-11
Posts: 36

Re: Parallel installation of aster-13.2: blacs not found

Hi,

Thanks, TOBI, for the hints. I managed to install CA 13.2 in parallel by changing the Ubuntu_gnu_mpi.py file as you suggested. I did not have problems with hdf5, but when running a parallel calculation the code stops with an error during AFFE_MODELE:

"The file of name fort.99.part.4 associated with the logical unit ??????? does not exist."

Anyway, after changing

DISTRIBUTION=_F(PARTITIONNEUR='METIS',
                METHODE='SOUS_DOMAINE',),

into

DISTRIBUTION=_F(PARTITIONNEUR='SCOTCH',
                METHODE='SOUS_DOMAINE',),

the calculation runs to the end with mpi_nbcpu=4.
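
For context, here is roughly how that keyword sits inside AFFE_MODELE in a command file (only a sketch; the mesh name MAIL and the 3D mechanical modelling are just placeholders, adjust them to your own model):

MODELE = AFFE_MODELE(MAILLAGE=MAIL,
                     AFFE=_F(TOUT='OUI',
                             PHENOMENE='MECANIQUE',
                             MODELISATION='3D',),
                     DISTRIBUTION=_F(METHODE='SOUS_DOMAINE',
                                     PARTITIONNEUR='SCOTCH',),)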

Does this error with METIS relate somehow to my installation, or is it something else?

Br,

TMK

#13 2016-12-20 16:53:54

TOBI
Member
Registered: 2013-12-02
Posts: 44

Re: Parallel installation of aster-13.2: blacs not found

Hi TMK,

Thanks for the solution! With the default settings (METIS) I had exactly the same error message for mpi_nbcpu=4, but changing to SCOTCH makes the calculation run.
I'm no expert on this, but in my opinion the calculation should also run with METIS, because it's the default. I think there is something wrong with the installation or linking of METIS in both of our installations!
It would be a good idea to run some test cases and check whether they run normally with our installations. Maybe from here:
http://www.code-aster.org/spip.php?article252
or maybe the petscXX or mumpsXX test cases with mpi_nbcpu > 1.
But at the moment I lack the time; I will test this as soon as possible. Maybe you are faster!

Did you have an error during the compilation of mumps? During my installation I had to delete the option -lmpi_f77 in line 89 of Makefile.inc to make the compilation run. See below:

INCPAR = -I/usr/include/openmpi $(IPORD)
LIBPAR = $(SCALAP)  -L/usr/lib -lmpi -lmpi_f77

Unfortunately, I don't know why. But METIS is also linked here.
I also have the feeling that my parallel version (with mpi_nbcpu=4) is a bit slower than the sequential one, which seems strange. On the other hand, not all calculations are suitable for parallelization. Maybe I need to do more research on this.

Best regards

TOBI
