
iTOUGH2 parallelization with PVM

Hello,

I would appreciate a response on some aspects of PVM parallelization, in case you have experience with it.

Is it possible to do the parallelization on a single machine with a large number of cores, say 16 or 32? If so, should the same installation instructions as for multi-node parallelization be followed, or what should be done differently?

Thanks

Keshvad

7 replies

    • Senior Geothermal Scientist - Vatnaskil Consulting Engineers
    • Andri_Arnaldsson
    • 6 yrs ago

    Hello Keshvad

    If you are using a single multi-core machine, you may want to look at the >>> PARALLEL option rather than PVM - no need to install PVM.

    Cheers!
     

    • Godarzi
    • 6 yrs ago

    Thank you, Andri, but it seems this doesn't work for inverse simulation. Is there an alternative for that as well?

    Regards

    Keshvad

    • Senior Geothermal Scientist - Vatnaskil Consulting Engineers
    • Andri_Arnaldsson
    • 6 yrs ago

    I'm sure it works for inverse simulations; there is no mechanism for solver parallelization in the standard iTOUGH2. Also, the documentation mentions the Levenberg-Marquardt algorithm. But then again, I have never tried it ...

    Cheers!

    • Finsterle GeoConsulting
    • Stefan_Finsterle
    • 6 yrs ago

    Keshvad,

     

    Andri is right; I just repeat for clarification:

    (1) iTOUGH2 embarrassingly parallelizes multiple forward runs that are independent of each other (e.g., for the evaluation of the Jacobian in the LM algorithm, for sensitivity analysis, and Monte Carlo simulations). Please see the iTOUGH2-PVM manual for a detailed description. Parallelization is exactly for "inverse simulations"; however, iTOUGH2 does NOT parallelize the forward run.

    (2) You could use PVM to access multiple processors on a single machine (a short sketch follows below).

    (3) You could also use the PARALLEL feature (by including it2parallel.f into your build; only available for iTOUGH2 V7.1); it would not require the installation of PVM (see the sketch below).
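
    For (2), a single multi-core host can form a one-machine PVM "virtual machine". A hedged sketch of the setup (the hostfile name is arbitrary; see the iTOUGH2-PVM manual for launching the actual runs):

        $ echo $(hostname) > pvm_hosts    # hostfile listing only this machine
        $ pvm pvm_hosts                   # starts the pvmd daemon and the PVM console
        pvm> conf                         # verify the one-host configuration
        pvm> quit                         # leave the console; pvmd keeps running

    For (3), the PARALLEL feature is enabled through a command in the iTOUGH2 input file. A minimal sketch, assuming the command sits under > COMPUTATION and takes the number of simultaneous forward runs (check the iTOUGH2 V7.1 documentation for the exact placement and arguments):

        > COMPUTATION
          >> OPTION
             >>> PARALLEL: 8    (up to 8 independent forward runs at a time)
             <<<
          <<
        <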

    Stefan

    • Godarzi
    • 6 yrs ago

    Thank you both indeed. We understand now and are going to test it.

    I might use the opportunity to also ask about alternative approaches to a multi-node solution. We currently have different approaches: VPM for iTOUGH2 v7.1, and OPM for TOUGHREACT. We have some challenges incorporating these into our IT queuing system (which is governed by LSF), meaning we might have to re-write/code them in order to enable smooth co-existence with other ongoing simulations on the same cluster. Beyond that, the VPM code seemed not quite up to the task regarding network vulnerability. I assume there is no easy solution here, but is there a potential one? Is it possible to re-write OPM and use it for iTOUGH2 instead of VPM? Is there any other parallelization code close to VPM but more secure that could be used or rewritten? If you wanted the TOUGH2 package (at minimum iTOUGH2 + TOUGHREACT) working within the same IT environment and able to run in parallel (decoupled), what would your advice be?

    Thanks again

    Keshvad

    • Finsterle GeoConsulting
    • Stefan_Finsterle
    • 6 yrs ago

    Keshvad,

    I assume that with "VPM" you mean "PVM" and with "OPM" you mean "OMP". PVM has been used as a very secure message passing tool across heterogeneous clusters (which is more challenging than a single multi-core machine). I am not aware of any network vulnerability issues (there won't be any network access if you run on a single multi-core machine, will there?), and would not know how to address them. All queuing systems I have worked with were able to handle MPI, OMP, and PVM. Please contact your system administrator.
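
    As a hedged illustration (the queue setup, slot count, and the itough2 invocation below are assumptions - adapt the file names and the EOS number to your site and problem), a PVM-based iTOUGH2 run can be wrapped in an ordinary LSF batch script:

        #!/bin/bash
        #BSUB -J itough2_pvm          # job name
        #BSUB -n 16                   # slots for the parallel forward runs
        #BSUB -R "span[hosts=1]"      # keep all slots on a single node
        #BSUB -o itough2.%J.log       # combined job log

        echo $(hostname) > pvm_hosts  # one-host PVM configuration
        pvmd3 pvm_hosts &             # start the PVM daemon directly
        sleep 5                       # give the daemon time to come up

        itough2 myprobi myprob 1 &    # hypothetical inverse run (EOS1)
        wait                          # block until iTOUGH2 finishes

        echo halt | pvm               # shut PVM down when the job ends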

    The PARALLEL option mentioned previously is integrated into iTOUGH2 and does message passing through standard text files. If you can write and read text files, you can use PARALLEL.
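
    Conceptually (this is an illustration of the idea only, not iTOUGH2's actual implementation, and "tough2" below is a placeholder solver call), dispatching independent forward runs that communicate through plain text files can look like:

        for i in 1 2 3 4; do                          # four independent forward runs
            mkdir -p run$i && cp base_input run$i/INFILE
            ( cd run$i && tough2 < INFILE > OUTFILE ) &
        done
        wait                                          # block until all runs finish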

    Good luck,

    Stefan

    • Godarzi
    • 6 yrs ago

    Thank you Stefan.

    Keshvad
