
mvapich2_2_3_7-gnu-hpc-doc-2.3.7-150500.1.37 RPM for ppc64le

From openSUSE Leap 15.5 for ppc64le

Name: mvapich2_2_3_7-gnu-hpc-doc
Distribution: SUSE Linux Enterprise 15
Version: 2.3.7
Vendor: SUSE LLC <https://www.suse.com/>
Release: 150500.1.37
Build date: Thu May 18 17:47:56 2023
Group: Development/Libraries/Parallel
Build host: mourvedre
Size: 1603255
Source RPM: mvapich2_2_3_7-gnu-hpc-2.3.7-150500.1.37.src.rpm
Packager: https://www.suse.com/
Url: http://mvapich.cse.ohio-state.edu
Summary: OSU MVAPICH2 MPI package - Documentation
This is an MPI-3 implementation which includes all MPI-1 and MPI-2 features. It
is based on MPICH2 and MVICH. This package contains the documentation.
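
For orientation, the mpicc and mpiexec wrappers documented by this package are
used from C as in the following minimal, generic example (plain MPI usage, not
MVAPICH2-specific):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size;

        MPI_Init(&argc, &argv);                /* start the MPI runtime */
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* this process's rank */
        MPI_Comm_size(MPI_COMM_WORLD, &size);  /* total number of ranks */
        printf("Hello from rank %d of %d\n", rank, size);
        MPI_Finalize();                        /* shut down the MPI runtime */
        return 0;
    }

Built and launched with the documented wrappers, e.g. "mpicc hello.c -o hello"
and "mpiexec -n 4 ./hello".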

Provides

Requires

License

BSD-3-Clause

Changelog

* Wed Jul 06 2022 nmoreychaisemartin@suse.com
  - Add mvapich2-allow-building-with-external-hwloc.patch
    to allow building against an external hwloc library
  - Build mvapich2 HPC flavors against pmix and hwloc system libraries
* Wed Jun 29 2022 kkaempf@suse.com
  - Add pass-correct-size-to-snprintf.patch to fix potential buffer
    overflows (required to make the 'sundials' testsuite pass; see the
    first sketch after this entry)
  - Update to mvapich2 2.3.7
    * Features and Enhancements (since 2.3.6):
    - Added support for systems with Rockport's switchless networks
      * Added automatic architecture detection
      * Optimized performance for point-to-point operations
    - Added support for the Cray Slingshot 10 interconnect
    - Enhanced support for blocking collective offload using
      Mellanox SHARP
      * Scatter and Scatterv
    - Enhanced support for non-blocking collective offload using
      Mellanox SHARP (see the second sketch after this entry)
      * Iallreduce, Ibarrier, Ibcast, and Ireduce
    * Bug Fixes (since 2.3.6):
    - Removed several deprecated functions
    - Thanks to Honggang Li @RedHat for the report
    - Fixed a bug where tools like CMake FindMPI would not
      detect MVAPICH when compiled without Hydra mpiexec
    - Thanks to Chris Chambreau and Adam Moody @LLNL for the report
    - Fixed compilation error when building with mpirun and without hydra
    - Thanks to James Long @University of Illinois for the report
    - Fixed issue with setting RoCE mode correctly without RDMA_CM
    - Thanks to Nicolas Gagnon @Rockport Networks for the report
    - Fixed an issue on heterogeneous clusters where QP attributes were
      set incorrectly
    - Thanks to X-ScaleSolutions for the report and fix
    - Fixed a memory leak in improbe on the PSM channel
    - Thanks to Gregory Lee @LLNL and Beichuan Yan @University of Colorado
      for the report
    - Added retry logic for PSM connection establishment
    - Thanks to Gregory Lee @LLNL for the report and X-ScaleSolutions
      for the patch
    - Fixed an initialization error when using PSM and gcc's -pg option
    - Thanks to Gregory Lee @LLNL for the report and X-ScaleSolutions for
      the patch
    - Fixed a potential integer overflow when transferring large arrays
    - Thanks to Alexander Melnikov for the report and patch
  - Fix Url: link
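
The pass-correct-size-to-snprintf.patch item above addresses a common C
defect: passing something other than the destination buffer's size as the
bound argument of snprintf(). A minimal sketch of the bug class, with
hypothetical names (this is not the actual patched MVAPICH2 code):

    #include <stdio.h>

    /* Format "host:rank" into out, which holds out_len bytes. */
    static void format_peer(char *out, size_t out_len,
                            const char *host, int rank)
    {
        /* Buggy pattern: a bound larger than the real buffer, e.g.
         *   snprintf(out, SOME_BIGGER_SIZE, "%s:%d", host, rank);
         * lets snprintf write past the end of 'out'. */

        /* Correct: bound the write by the destination buffer's size. */
        snprintf(out, out_len, "%s:%d", host, rank);
    }

    int main(void)
    {
        char buf[64];
        format_peer(buf, sizeof buf, "node0001", 3);
        puts(buf);
        return 0;
    }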
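At the application level, the non-blocking collective offload noted above is
reached through the standard MPI-3 non-blocking collective calls; whether
SHARP hardware offload is used is a transport-level detail. A minimal,
generic calling-pattern sketch:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank;
        double local = 1.0, global = 0.0;
        MPI_Request req;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Start the reduction without blocking; an offload-capable
           transport can progress it while the application computes. */
        MPI_Iallreduce(&local, &global, 1, MPI_DOUBLE, MPI_SUM,
                       MPI_COMM_WORLD, &req);

        /* ... overlapping computation would go here ... */

        MPI_Wait(&req, MPI_STATUS_IGNORE);  /* complete the collective */
        if (rank == 0)
            printf("sum = %g\n", global);

        MPI_Finalize();
        return 0;
    }
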
* Wed Feb 16 2022 nmoreychaisemartin@suse.com
  - Disable dlopen for verbs library (bsc#1196019)
* Tue Oct 19 2021 nmoreychaisemartin@suse.com
  - Move rpm macros to %_rpmmacrodir (bsc#1191386)
* Tue Sep 28 2021 nmoreychaisemartin@suse.com
  - Remove obsolete python dependency (bsc#1190996)
* Tue May 18 2021 nmoreychaisemartin@suse.com
  - Update to mvapich2 2.3.6
    - Enhanced performance for UD-Hybrid code
    - Add multi-rail support for UD-Hybrid code
    - Enhanced performance for shared-memory collectives
    - Enhanced job-startup performance for flux job launcher
    - Use PMI2 by default when SLURM is selected as process manager
    - Add support to use aligned memory allocations for multi-threaded
      applications
    - Architecture detection and enhanced point-to-point tuning for
      Oracle BM.HPC2 cloud shape
    - Add support for GCC compiler v11
    - Update hwloc v1 code to v1.11.14
    - Update hwloc v2 code to v2.4.2
  - Drop obsolete patches:
    - fix-missing-return-code.patch as it was fixed upstream
    - mvapich2-remove-deprecated-sys_siglist.patch
    - rdma_find_network_type-return-MV2_NETWORK_CLASS_UNKNOWN-when-dev_list-is-freed.patch
  - Refresh reproducible.patch
* Wed Mar 24 2021 eich@suse.com
  - Update mvapich2 to 2.3.5.
    * Enhanced performance for MPI_Allreduce and MPI_Barrier
    * Support collective offload using Mellanox's SHARP for Barrier
      - Enhanced tuning framework for Barrier using SHARP
    * Remove dependency on underlying libibverbs, libibmad, libibumad, and
      librdmacm libraries using dlopen
    * Add support for Broadcom NetXtreme RoCE HCA
      - Enhanced inter-node point-to-point support
    * Support architecture detection for Fujitsu A64fx processor
    * Enhanced point-to-point and collective tuning for Fujitsu A64fx processor
    * Enhanced point-to-point and collective tuning for AMD ROME processor
    * Add support for process placement aware HCA selection
      - Add "MV2_PROCESS_PLACEMENT_AWARE_HCA_MAPPING" environment variable to
        enable process placement aware HCA mapping
    * Add support to auto-detect RoCE HCAs and auto-detect GID index
    * Add support to use RoCE/Ethernet and InfiniBand HCAs at the same time
    * Add architecture-specific flags to improve performance of certain CUDA
      operations
      - Thanks to Chris Chambreau @LLNL for the report
    * Read MTU and maximum outstanding RDMA operations from the device
    * Improved performance and scalability for UD-based communication
    * Update maximum HCAs supported by default from 4 to 10
    * Enhanced collective tuning for Frontera@TACC, Expanse@SDSC,
      Ookami@StonyBrook, and bb5@EPFL
    * Enhanced support for SHARP v2.1.0
    * Generalize code for GPU support
  - Obsolete: wrapper-revert-ldflag-order-change.patch.
  - Replace: mvapich2-fix-double-free.patch by
    rdma_find_network_type-return-MV2_NETWORK_CLASS_UNKNOWN-when-dev_list-is-freed.patch
* Thu Feb 18 2021 nmoreychaisemartin@suse.com
  - Re-add mvapich2-fix-double-free.patch as the bug was
    somehow reintroduced (bsc#1144000)
  - Add mvapich2-remove-deprecated-sys_siglist.patch to
    fix compilation errors with newer glibc
* Sun Nov 29 2020 eich@suse.com
  - HPC: Fix environment module settings for MANPATH.
* Sat Jul 25 2020 eich@suse.com
  - For HPC builds, check for GNU compiler suite version >= 10 and
    set -fallow-argument-mismatch.
* Thu Jul 23 2020 eich@suse.com
  - Add build support for gcc8/9/10 to HPC build (bsc#1174439).
* Wed Jun 03 2020 nmoreychaisemartin@suse.com
  - Update to mvapich2 2.3.4
    - See CHANGELOG for fixes and new features
  - Add fix-missing-return-code.patch to fix compilation errors
  - Add 0001-Drop-Real-16.patch to disable Real(16) support on Armv7
  - Add wrapper-revert-ldflag-order-change.patch to revert LDFLAGS order
    change done in 2.3.4.
    This allows legacy builds to work without adding a -lmpi flag
* Tue Jan 21 2020 nmoreychaisemartin@suse.com
  - Update to mvapich2 2.3.3 (jsc#SLE-8497)
    - See CHANGELOG for fixes and new features
* Fri Sep 06 2019 nmoreychaisemartin@suse.com
  - Update to mvapich2 2.3.2 (jsc#SLE-8544)
    - See CHANGELOG for fixes and new features
  - Drop mvapich2-make-sure-ibv_get_device_list-returned-one-before-freeing-it.patch
    as it was fixed upstream.
  - Use FAT LTO objects in order to provide proper static library.
  - Add 0001-Drop-real128.patch to fix compilation on armv7
* Fri Aug 02 2019 nmoreychaisemartin@suse.com
  - Add mvapich2-make-sure-ibv_get_device_list-returned-one-before-freeing-it.patch
    to fix a segfault when ib_uverbs is not loaded (bsc#1144000)
* Mon May 27 2019 bwiedemann@suse.com
  - Add reproducible.patch to sort readdir output to make the package build
    reproducible
    (boo#1041090)
* Mon May 13 2019 nmoreychaisemartin@suse.com
  - Update to mvapich2 2.3.1
    - See CHANGELOG for fixes and new features
  - Refreshed patches against the new version:
    - 0001-Drop-GCC-check.patch
    - mvapich2-arm-support.patch
    - mvapich2-s390_get_cycles.patch
  - Drop mvapich2-fix-double-free.patch as it was merged upstream
* Thu May 02 2019 nmoreychaisemartin@suse.com
  - Add mvapich2-fix-double-free.patch to fix a segfault
    when running on a machine with no RDMA hardware (bsc#1133797)
* Wed Mar 20 2019 aguerrero@suse.com
  - Add patch to remove obsolete GCC check (bnc#1129421). It also patches
    autogen.sh to get the autotools working in SLE12SP4.
    * 0001-Drop-GCC-check.patch
  - Force re-running autotools to properly regenerate the files after
    patching src/binding/cxx/buildiface
* Sun Nov 18 2018 eich@suse.com
  - Add macro _hpc_mvapich2_modules for modules support (bsc#1116458).
* Mon Sep 10 2018 nmoreychaisemartin@suse.com
  - Remove bashism in postun scriptlet
* Wed Sep 05 2018 nmoreychaisemartin@suse.com
  - Fix handling of mpi-selector during updates (bsc#1098653)
* Sun Aug 19 2018 eich@suse.com
  - macros.hpc-mvapich2:
    replace %%compiler_family by %%hpc_compiler_family
* Mon Jul 16 2018 msuchanek@suse.com
  - Use sched_yield instead of pthread_yield (boo#1102421).
    - drop mvapich2-pthread_yield.patch
* Mon Jun 18 2018 nmoreychaisemartin@suse.com
  - Add missing bsc and fate references to changelog
* Tue Jun 12 2018 nmoreychaisemartin@suse.com
  - Disable HPC builds for SLE12 (fate#323655)
* Sun Mar 25 2018 kasimir_@outlook.de
  - Change mvapich2-arm-support.patch to provide missing functions for
    armv6hl
* Fri Feb 09 2018 cgoll@suse.com
  - Fix summary in module files (bnc#1080259)
* Tue Jan 30 2018 eich@suse.com
  - Use macro in mpivars.(c)sh to be independent of changes to the module
    setup for the compiler (boo#1078364).
* Fri Jan 05 2018 eich@suse.com
  - Switch from gcc6 to gcc7 as additional compiler flavor for HPC on SLES.
  - Fix library package requires - use HPC macro (boo#1074890).
* Fri Oct 06 2017 nmoreychaisemartin@suse.com
  - Add conflicts between the macros-devel packages
* Thu Oct 05 2017 nmoreychaisemartin@suse.com
  - Add BuildRequires on libibmad-devel for older releases (SLE <= 12.2, Leap <= 42.2)
* Tue Sep 12 2017 eich@suse.com
  - Add HPC specific build targets using environment modules
    (FATE#321712).
* Tue Sep 12 2017 nmoreychaisemartin@suse.com
  - Drop unnecessary dependency on xorg-x11-devel
* Mon Sep 11 2017 nmoreychaisemartin@suse.com
  - Only require verbs libraries for the verbs build.
    libibverbs-devel causes a SEGV when run in a chroot using the
    psm or psm2 conduits
  - Add testsuite packages for all build flavours
* Thu Jul 13 2017 nmoreychaisemartin@suse.com
  - Add LD_LIBRARY_PATH to mpivars.sh and mpivars.csh
* Thu Jul 13 2017 nmoreychaisemartin@suse.com
  - Disable rpath in pkgconfig files
* Wed Jul 05 2017 nmoreychaisemartin@suse.com
  - Remove redundant configure options already passed by %configure
* Mon Jun 26 2017 nmoreychaisemartin@suse.com
  - Change install dir to allow multiple flavors to be installed
    at the same time (bsc#934090)
  - Fix bsc#1045955
    - Fix mvapich2-psm package to use libpsm (TrueScale)
    - Add mvapich2-psm2 package using libpsm2 (OmniPath)
* Mon Jun 26 2017 nmoreychaisemartin@suse.com
  - Use _multibuild to build the various mvapich2-flavours
* Fri Jun 23 2017 nmoreychaisemartin@suse.com
  - Replace dependency on libibmad-devel with infiniband-diags-devel
* Wed Jun 14 2017 nmoreychaisemartin@suse.com
  - Have mvapich2 and mvapich2-psm conflict with each other
  - Cleanup spec file
  - Remove mvapich2-testsuite RPM
* Thu Jun 08 2017 nmoreychaisemartin@suse.com
  - Reenable arm compilation
  - Rename and clean up mvapich-s390_get_cycles.patch to
    mvapich2-s390_get_cycles.patch for consistency
  - Cleanup mvapich2-pthread_yield.patch
  - Add mvapich2-arm-support.patch to provide missing functions for
    armv7hl and aarch64
* Thu Jun 08 2017 nmoreychaisemartin@suse.com
  - Remove version dependencies on libibumad, libibverbs and librdmacm
* Tue May 16 2017 nmoreychaisemartin@suse.com
  - Fix mvapich2-testsuite packaging
  - Disable build on armv7
* Wed Mar 29 2017 pth@suse.de
  - Version the dependencies on libraries that now come from rdma-core.
* Tue Nov 29 2016 pth@suse.de
  - Create environment module (bsc#1004628).
* Wed Nov 23 2016 pth@suse.de
  - Fix URL.
  - Update to mvapich2 2.2 GA. Changes since rc1:
    MVAPICH2 2.2 (09/07/2016)
    * Features and Enhancements (since 2.2rc2):
    - Single node collective tuning for Bridges@PSC, Stampede@TACC and other
      architectures
    - Enable PSM builds when both PSM and PSM2 libraries are present
    - Add support for HCAs that return result of atomics in big endian notation
    - Establish loopback connections by default if HCA supports atomics
    * Bug Fixes (since 2.2rc2):
    - Fix minor error in use of communicator object in collectives
    - Fix missing u_int64_t declaration with PGI compilers
    - Fix memory leak in RMA rendezvous code path
    MVAPICH2 2.2rc2 (08/08/2016)
    * Features and Enhancements (since 2.2rc1):
    - Enhanced performance for MPI_Comm_split through new bitonic algorithm
      (see the sketch after this entry)
    - Enable graceful fallback to Shared Memory if LiMIC2 or CMA transfer fails
    - Enable support for multiple MPI initializations
    - Unify process affinity support in Gen2, PSM and PSM2 channels
    - Remove verbs dependency when building the PSM and PSM2 channels
    - Allow processes to request MPI_THREAD_MULTIPLE when socket or NUMA node
      level affinity is specified
    - Point-to-point and collective performance optimization for Intel Knights
      Landing
    - Automatic detection and tuning for InfiniBand EDR HCAs
    - Warn user to reconfigure library if rank type is not large enough to
      represent all ranks in job
    - Collective tuning for Opal@LLNL, Bridges@PSC, and Stampede-1.5@TACC
    - Tuning and architecture detection for Intel Broadwell processors
    - Add ability to avoid using --enable-new-dtags with ld
    - Add LIBTVMPICH specific CFLAGS and LDFLAGS
    * Bug Fixes (since 2.2rc1):
    - Disable optimization that removes use of calloc in ptmalloc hook
      detection code
    - Fix weak alias typos (allows successful compilation with CLANG compiler)
    - Fix issues in PSM large message gather operations
    - Enhance error checking in collective tuning code
    - Fix issues with UD based communication in RoCE mode
    - Fix issues with PMI2 support in singleton mode
    - Fix default binding bug in hydra launcher
    - Fix issues with Checkpoint Restart when launched with mpirun_rsh
    - Fix fortran binding issues with Intel 2016 compilers
    - Fix issues with socket/NUMA node level binding
    - Disable atomics when using Connect-IB with RDMA_CM
    - Fix hang in MPI_Finalize when using hybrid channel
    - Fix memory leaks
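
The MPI_Comm_split enhancement noted above changes the implementation, not
the API; application code calls it as before. A minimal, generic sketch
splitting MPI_COMM_WORLD into groups of four ranks:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int world_rank, sub_rank;
        MPI_Comm sub_comm;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &world_rank);

        /* Ranks with the same color land in the same new communicator;
           the key (world_rank) orders the ranks within it. */
        int color = world_rank / 4;
        MPI_Comm_split(MPI_COMM_WORLD, color, world_rank, &sub_comm);

        MPI_Comm_rank(sub_comm, &sub_rank);
        printf("world rank %d -> group %d, local rank %d\n",
               world_rank, color, sub_rank);

        MPI_Comm_free(&sub_comm);
        MPI_Finalize();
        return 0;
    }
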
* Tue Nov 15 2016 pth@suse.de
  - Update to version 2.2rc1 (fate#319240). Changes since 2.1:
    MVAPICH2 2.2rc1 (03/29/2016)
    * Features and Enhancements (since 2.2b):
    - Support for OpenPower architecture
    - Optimized inter-node and intra-node communication
    - Support for Intel Omni-Path architecture
    - Thanks to Intel for contributing the patch
    - Introduction of a new PSM2 channel for Omni-Path
    - Support for RoCEv2
    - Architecture detection for PSC Bridges system with Omni-Path
    - Enhanced startup performance and reduced memory footprint for storing
      InfiniBand end-point information with SLURM
    - Support for shared memory based PMI operations
    - Availability of an updated patch from the MVAPICH project website
      with this support for SLURM installations
    - Optimized pt-to-pt and collective tuning for Chameleon InfiniBand
      systems at TACC/UoC
    - Enable affinity by default for TrueScale(PSM) and Omni-Path(PSM2)
      channels
    - Enhanced tuning for shared-memory based MPI_Bcast
    - Enhanced debugging support and error messages
    - Update to hwloc version 1.11.2
    * Bug Fixes (since 2.2b):
    - Fix issue in some of the internal algorithms used for MPI_Bcast,
      MPI_Alltoall and MPI_Reduce
    - Fix hang in one of the internal algorithms used for MPI_Scatter
    - Thanks to Ivan Raikov@Stanford for reporting this issue
    - Fix issue with rdma_connect operation
    - Fix issue with Dynamic Process Management feature
    - Fix issue with de-allocating InfiniBand resources in blocking mode
    - Fix build errors caused by improper compile-time guards
    - Thanks to Adam Moody@LLNL for the report
    - Fix finalize hang when running in hybrid or UD-only mode
    - Thanks to Jerome Vienne@TACC for reporting this issue
    - Fix issue in MPI_Win_flush operation (see the sketch after this entry)
    - Thanks to Nenad Vukicevic for reporting this issue
    - Fix out of memory issues with non-blocking collectives code
    - Thanks to Phanisri Pradeep Pratapa and Fang Liu@GaTech for
      reporting this issue
    - Fix fall-through bug in external32 pack
    - Thanks to Adam Moody@LLNL for the report and patch
    - Fix issue with on-demand connection establishment and blocking mode
    - Thanks to Maksym Planeta@TU Dresden for the report
    - Fix memory leaks in hardware multicast based broadcast code
    - Fix memory leaks in TrueScale(PSM) channel
    - Fix compilation warnings
    MVAPICH2 2.2b (11/12/2015)
    * Features and Enhancements (since 2.2a):
    - Enhanced performance for small messages
    - Enhanced startup performance with SLURM
    - Support for PMIX_Iallgather and PMIX_Ifence
    - Support to enable affinity with asynchronous progress thread
    - Enhanced support for MPIT based performance variables
    - Tuned VBUF size for performance
    - Improved startup performance for QLogic PSM-CH3 channel
    - Thanks to Maksym Planeta@TU Dresden for the patch
    * Bug Fixes (since 2.2a):
    - Fix issue with MPI_Get_count in QLogic PSM-CH3 channel with very large
      messages (>2GB)
    - Fix issues with shared memory collectives and checkpoint-restart
    - Fix hang with checkpoint-restart
    - Fix issue with unlinking shared memory files
    - Fix memory leak with MPIT
    - Fix minor typos and usage of inline and static keywords
    - Thanks to Maksym Planeta@TU Dresden for the patch and suggestions
    - Fix missing MPIDI_FUNC_EXIT
    - Thanks to Maksym Planeta@TU Dresden for the patch
    - Remove unused code
    - Thanks to Maksym Planeta@TU Dresden for the patch
    - Continue with warning if user asks to enable XRC when the system does not
      support XRC
    MVAPICH2 2.2a (08/17/2015)
    * Features and Enhancements (since 2.1 GA):
    - Based on MPICH 3.1.4
    - Support for backing on-demand UD CM information with shared memory
      for minimizing memory footprint
    - Reorganized HCA-aware process mapping
    - Dynamic identification of maximum read/atomic operations supported by HCA
    - Enabling support for intra-node communications in RoCE mode without
      shared memory
    - Updated to hwloc 1.11.0
    - Updated to sm_20 kernel optimizations for MPI Datatypes
    - Automatic detection and tuning for 24-core Haswell architecture
    * Bug Fixes (since 2.1 GA):
    - Fix for error with multi-vbuf design for GPU based communication
    - Fix bugs with hybrid UD/RC/XRC communications
    - Fix for MPICH putfence/getfence for large messages
    - Fix for error in collective tuning framework
    - Fix validation failure with Alltoall with IN_PLACE option
    - Thanks to Mahidhar Tatineni @SDSC for the report
    - Fix bug with MPI_Reduce with IN_PLACE option (see the sketch after
      this entry)
    - Thanks to Markus Geimer for the report
    - Fix for compilation failures with multicast disabled
    - Thanks to Devesh Sharma @Emulex for the report
    - Fix bug with MPI_Bcast
    - Fix IPC selection for shared GPU mode systems
    - Fix for build time warnings and memory leaks
    - Fix issues with Dynamic Process Management
    - Thanks to Neil Spruit for the report
    - Fix bug in architecture detection code
    - Thanks to Adam Moody @LLNL for the report
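
The MPI_Win_flush fix noted above sits in the passive-target RMA path. A
minimal, generic sketch of the calling pattern it covers (lock, put, flush,
unlock; run with at least two ranks):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, size, value = 42;
        int *base;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Every rank exposes one int through an RMA window. */
        MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &base, &win);
        *base = 0;
        MPI_Barrier(MPI_COMM_WORLD);  /* initialization done everywhere */

        if (rank == 0 && size > 1) {
            MPI_Win_lock(MPI_LOCK_SHARED, 1, 0, win);
            MPI_Put(&value, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
            MPI_Win_flush(1, win);  /* complete the put at the target */
            MPI_Win_unlock(1, win);
        }

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }
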
* Fri Oct 14 2016 pth@suse.de
  - Create and include modules file for Mvapich2 (bsc#1004628).
  - Remove mvapich2-fix-implicit-decl.patch as the fix is upstream.
  - Adapt spec file to the changed micro benchmark install directory.
* Sun Jul 24 2016 p.drouand@gmail.com
  - Update to version 2.1
    * Features and Enhancements (since 2.1rc2):
    - Tuning for EDR adapters
    - Optimization of collectives for SDSC Comet system
    - Based on MPICH-3.1.4
    - Enhanced startup performance with mpirun_rsh
    - Checkpoint-Restart Support with DMTCP (Distributed MultiThreaded
      CheckPointing)
    - Thanks to the DMTCP project team (http://dmtcp.sourceforge.net/)
    - Support for handling very large messages in RMA
    - Optimize size of buffer requested for control messages in large message
      transfer
    - Enhanced automatic detection of atomic support
    - Optimized collectives (bcast, reduce, and allreduce) for 4K processes
    - Introduce support to sleep for a user-specified period before aborting
    - Disable PSM from setting CPU affinity
    - Install PSM error handler to print more verbose error messages
    - Introduce retry mechanism to perform psm_ep_open in PSM channel
    * Bug-Fixes (since 2.1rc2):
    - Relocate reading environment variables in PSM
    - Fix issue with automatic process mapping
    - Fix issue with checkpoint restart when full path is not given
    - Fix issue with Dynamic Process Management
    - Fix issue in CUDA IPC code path
    - Fix corner case in CMA runtime detection
    * Features and Enhancements (since 2.1rc1):
    - Based on MPICH-3.1.4
    - Enhanced startup performance with mpirun_rsh
    - Checkpoint-Restart Support with DMTCP (Distributed MultiThreaded
      CheckPointing)
    - Support for handling very large messages in RMA
    - Optimize size of buffer requested for control messages in large message
      transfer
    - Enhanced automatic detection of atomic support
    - Optimized collectives (bcast, reduce, and allreduce) for 4K processes
    - Introduce support to sleep for a user-specified period before aborting
    - Disable PSM from setting CPU affinity
    - Install PSM error handler to print more verbose error messages
    - Introduce retry mechanism to perform psm_ep_open in PSM channel
    * Bug-Fixes (since 2.1rc1):
    - Fix failures with shared memory collectives with checkpoint-restart
    - Fix failures with checkpoint-restart when using internal communication
      buffers of different size
    - Fix undeclared variable error when --disable-cxx is specified with
      configure
    - Fix segfault seen during connect/accept with dynamic processes
    - Fix errors with large messages pack/unpack operations in PSM channel
    - Fix for bcast collective tuning
    - Fix assertion errors in one-sided put operations in PSM channel
    - Fix issue with code getting stuck in infinite loop inside ptmalloc
    - Fix assertion error in shared memory large message transfers
    - Fix compilation warnings
    * Features and Enhancements (since 2.1a):
    - Based on MPICH-3.1.3
    - Flexibility to use internal communication buffers of different size for
      improved performance and memory footprint
    - Improve communication performance by removing locks from critical path
    - Enhanced communication performance for small/medium message sizes
    - Support for linking Intel Trace Analyzer and Collector
    - Increase the number of connect retry attempts with RDMA_CM
    - Automatic detection and tuning for Haswell architecture
    * Bug-Fixes (since 2.1a):
    - Fix automatic detection of support for atomics
    - Fix issue with void pointer arithmetic with PGI
    - Fix deadlock in ctxidup MPICH test in PSM channel
    - Fix compile warnings
    * Features and Enhancements (since 2.0):
    - Based on MPICH-3.1.2
    - Support for PMI-2 based startup with SLURM
    - Enhanced startup performance for Gen2/UD-Hybrid channel
    - GPU support for MPI_Scan and MPI_Exscan collective operations
    - Optimize creation of 2-level communicator
    - Collective optimization for PSM-CH3 channel
    - Tuning for IvyBridge architecture
    - Add -export-all option to mpirun_rsh
    - Support for additional MPI-T performance variables (PVARs)
      in the CH3 channel
    - Link with libstdc++ when building with GPU support
      (required by CUDA 6.5)
    * Bug-Fixes (since 2.0):
    - Fix error in large message (>2GB) transfers in CMA code path
    - Fix memory leaks in OFA-IB-CH3 and OFA-IB-Nemesis channels
    - Fix issues with optimizations for broadcast and reduce collectives
    - Fix hang at finalize with Gen2-Hybrid/UD channel
    - Fix issues for collectives with non power-of-two process counts
    - Make ring startup use HCA selected by user
    - Increase counter length for shared-memory collectives
  - Use download URL as source
  - Some other minor improvements
  - Add mvapich2-fix-implicit-decl.patch
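
The MPI_Reduce IN_PLACE fix noted above concerns the standard in-place
reduction pattern, where the root supplies MPI_IN_PLACE as its send buffer
and reduces into its receive buffer. A minimal, generic sketch:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, value;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        value = rank + 1;

        if (rank == 0) {
            /* Root reduces into its own buffer. */
            MPI_Reduce(MPI_IN_PLACE, &value, 1, MPI_INT, MPI_SUM,
                       0, MPI_COMM_WORLD);
            printf("sum = %d\n", value);
        } else {
            /* Non-roots: the receive buffer argument is ignored. */
            MPI_Reduce(&value, NULL, 1, MPI_INT, MPI_SUM,
                       0, MPI_COMM_WORLD);
        }

        MPI_Finalize();
        return 0;
    }
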

Files

/usr/share/doc/mvapich2_2_3_7-gnu-hpc
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/index.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/install.pdf
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/logging.pdf
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/user.pdf
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www1
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www1/index.htm
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www1/mpicc.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www1/mpicxx.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www1/mpiexec.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www1/mpif77.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www1/mpifort.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/Constants.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPIX_Comm_agree.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPIX_Comm_failure_ack.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPIX_Comm_failure_get_acked.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPIX_Comm_revoke.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPIX_Comm_shrink.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Abort.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Accumulate.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Add_error_class.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Add_error_code.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Add_error_string.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Address.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Aint_add.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Aint_diff.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Allgather.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Allgatherv.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Alloc_mem.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Allreduce.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Alltoall.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Alltoallv.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Alltoallw.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Attr_delete.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Attr_get.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Attr_put.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Barrier.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Bcast.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Bsend.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Bsend_init.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Buffer_attach.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Buffer_detach.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Cancel.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Cart_coords.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Cart_create.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Cart_get.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Cart_map.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Cart_rank.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Cart_shift.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Cart_sub.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Cartdim_get.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Close_port.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_accept.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_call_errhandler.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_compare.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_connect.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_create.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_create_errhandler.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_create_group.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_create_keyval.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_delete_attr.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_disconnect.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_dup.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_dup_with_info.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_free.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_free_keyval.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_get_attr.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_get_errhandler.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_get_info.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_get_name.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_get_parent.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_group.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_idup.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_join.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_rank.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_remote_group.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_remote_size.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_set_attr.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_set_errhandler.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_set_info.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_set_name.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_size.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_spawn.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_spawn_multiple.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_split.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_split_type.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Comm_test_inter.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Compare_and_swap.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Dims_create.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Dist_graph_create.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Dist_graph_create_adjacent.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Dist_graph_neighbors.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Dist_graph_neighbors_count.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Errhandler_create.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Errhandler_free.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Errhandler_get.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Errhandler_set.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Error_class.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Error_string.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Exscan.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Fetch_and_op.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_c2f.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_call_errhandler.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_close.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_create_errhandler.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_delete.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_f2c.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_get_amode.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_get_atomicity.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_get_byte_offset.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_get_errhandler.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_get_group.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_get_info.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_get_position.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_get_position_shared.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_get_size.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_get_type_extent.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_get_view.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_iread.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_iread_all.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_iread_at.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_iread_at_all.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_iread_shared.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_iwrite.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_iwrite_all.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_iwrite_at.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_iwrite_at_all.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_iwrite_shared.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_open.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_preallocate.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_read.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_read_all.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_read_all_begin.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_read_all_end.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_read_at.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_read_at_all.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_read_at_all_begin.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_read_at_all_end.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_read_ordered.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_read_ordered_begin.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_read_ordered_end.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_read_shared.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_seek.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_seek_shared.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_set_atomicity.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_set_errhandler.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_set_info.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_set_size.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_set_view.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_sync.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_write.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_write_all.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_write_all_begin.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_write_all_end.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_write_at.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_write_at_all.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_write_at_all_begin.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_write_at_all_end.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_write_ordered.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_write_ordered_begin.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_write_ordered_end.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_File_write_shared.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Finalize.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Finalized.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Free_mem.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Gather.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Gatherv.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Get.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Get_accumulate.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Get_address.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Get_count.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Get_elements.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Get_elements_x.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Get_library_version.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Get_processor_name.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Get_version.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Graph_create.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Graph_get.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Graph_map.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Graph_neighbors.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Graph_neighbors_count.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Graphdims_get.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Grequest_complete.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Grequest_start.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Group_compare.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Group_difference.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Group_excl.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Group_free.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Group_incl.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Group_intersection.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Group_range_excl.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Group_range_incl.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Group_rank.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Group_size.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Group_translate_ranks.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Group_union.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Iallgather.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Iallgatherv.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Iallreduce.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Ialltoall.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Ialltoallv.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Ialltoallw.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Ibarrier.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Ibcast.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Ibsend.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Iexscan.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Igather.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Igatherv.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Improbe.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Imrecv.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Ineighbor_allgather.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Ineighbor_allgatherv.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Ineighbor_alltoall.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Ineighbor_alltoallv.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Ineighbor_alltoallw.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Info_create.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Info_delete.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Info_dup.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Info_free.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Info_get.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Info_get_nkeys.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Info_get_nthkey.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Info_get_valuelen.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Info_set.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Init.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Init_thread.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Initialized.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Intercomm_create.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Intercomm_merge.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Iprobe.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Irecv.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Ireduce.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Ireduce_scatter.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Ireduce_scatter_block.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Irsend.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Is_thread_main.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Iscan.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Iscatter.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Iscatterv.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Isend.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Issend.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Keyval_create.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Keyval_free.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Lookup_name.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Mprobe.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Mrecv.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Neighbor_allgather.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Neighbor_allgatherv.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Neighbor_alltoall.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Neighbor_alltoallv.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Neighbor_alltoallw.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Op_commute.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Op_create.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Op_free.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Open_port.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Pack.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Pack_external.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Pack_external_size.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Pack_size.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Pcontrol.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Probe.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Publish_name.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Put.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Query_thread.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Raccumulate.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Recv.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Recv_init.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Reduce.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Reduce_local.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Reduce_scatter.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Reduce_scatter_block.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Register_datarep.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Request_free.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Request_get_status.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Rget.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Rget_accumulate.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Rput.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Rsend.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Rsend_init.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Scan.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Scatter.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Scatterv.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Send.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Send_init.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Sendrecv.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Sendrecv_replace.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Ssend.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Ssend_init.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Start.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Startall.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Status_set_cancelled.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Status_set_elements.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Status_set_elements_x.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_category_changed.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_category_get_categories.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_category_get_cvars.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_category_get_index.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_category_get_info.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_category_get_num.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_category_get_pvars.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_cvar_get_index.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_cvar_get_info.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_cvar_get_num.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_cvar_handle_alloc.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_cvar_handle_free.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_cvar_read.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_cvar_write.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_enum_get_info.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_enum_get_item.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_finalize.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_init_thread.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_pvar_get_index.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_pvar_get_info.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_pvar_get_num.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_pvar_handle_alloc.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_pvar_handle_free.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_pvar_read.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_pvar_readreset.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_pvar_reset.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_pvar_session_create.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_pvar_session_free.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_pvar_start.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_pvar_stop.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_T_pvar_write.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Test.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Test_cancelled.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Testall.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Testany.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Testsome.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Topo_test.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_commit.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_contiguous.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_create_darray.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_create_hindexed.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_create_hindexed_block.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_create_hvector.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_create_indexed_block.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_create_keyval.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_create_resized.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_create_struct.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_create_subarray.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_delete_attr.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_dup.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_extent.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_free.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_free_keyval.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_get_attr.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_get_contents.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_get_envelope.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_get_extent.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_get_extent_x.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_get_name.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_get_true_extent.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_get_true_extent_x.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_hindexed.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_hvector.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_indexed.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_lb.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_match_size.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_set_attr.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_set_name.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_size.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_size_x.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_struct.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_ub.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Type_vector.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Unpack.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Unpack_external.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Unpublish_name.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Wait.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Waitall.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Waitany.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Waitsome.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_allocate.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_allocate_shared.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_attach.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_call_errhandler.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_complete.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_create.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_create_dynamic.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_create_errhandler.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_create_keyval.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_delete_attr.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_detach.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_fence.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_flush.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_flush_all.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_flush_local.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_flush_local_all.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_free.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_free_keyval.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_get_attr.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_get_errhandler.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_get_group.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_get_info.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_get_name.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_lock.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_lock_all.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_post.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_set_attr.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_set_errhandler.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_set_info.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_set_name.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_shared_query.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_start.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_sync.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_test.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_unlock.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_unlock_all.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Win_wait.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Wtick.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/MPI_Wtime.html
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/index.htm
/usr/share/doc/mvapich2_2_3_7-gnu-hpc/www3/mpi.cit

