Memory exhaustion by cell optimization
Date: 2020/03/28 20:46
Name: Hikaru Sawahata   <sawahata@cphys.s.kanazawa-u.ac.jp>

Dear OpenMX developers,

Hi, I have run into a problem.

I am computing a supercell system with Atoms.Number = 16 on the ISSP supercomputer System B.
When I use the cell optimization "OptC5", the calculation stops around MD steps 150-160.
The problem seems to be that the memory limit is exceeded (resources_used.mem goes over 100 GB).

Do you have any idea how to avoid this problem?
Please let me know.

Best regards,

Hikaru Sawahata

Re: Memory exhaustion by cell optimization ( No.1 )
Date: 2020/03/28 21:16
Name: Mitsuaki Kawamura  <mkawamura@issp.u-tokyo.ac.jp>

Dear Hikaru Sawahata

Hello,
The solutions I know of are:
1. Increase the number of nodes.
2. Decrease the number of MPI processes and increase the number of OpenMP threads.
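As a rough sketch of option 2 on a PBS-style batch system such as ISSP System B (the queue directives, node counts, and file names here are assumptions, not a tested script), the idea is to request fewer MPI ranks per node so that each rank has more memory available:

```shell
#!/bin/sh
# Hypothetical resource request: 2 nodes, only 8 MPI ranks per node,
# 8 OpenMP threads per rank (adjust to your system's cores per node).
#PBS -l select=2:mpiprocs=8:ompthreads=8

cd "${PBS_O_WORKDIR:-.}"
export OMP_NUM_THREADS=8

# OpenMX takes the OpenMP thread count via the -nt option;
# fewer ranks per node leaves more memory for each rank.
mpirun -np 16 openmx input.dat -nt 8 > openmx.std
```

The trade-off is that total memory per node is fixed, so spreading the same work over fewer, fatter processes reduces the per-process duplication of large arrays.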

Best regards,
Mitsuaki Kawamura (ISSP, U-Tokyo)
Re: Memory exhaustion by cell optimization ( No.2 )
Date: 2020/03/28 23:15
Name: T. Ozaki

Hi,

The advice from Dr. Kawamura can be effective.
However, a more effective way is to reduce the number of MPI processes per node
while keeping a single OpenMP thread.

Also, even if your job is terminated, you can restart the optimization exactly using
the dat# file, which allows you to restart with exactly the same set of parameters, such as
the approximate Hessian and the variable prefactors generated during the optimization.
With the dat# file, the optimization resumes from the step at which it was terminated.
See http://www.openmx-square.org/openmx_man3.9/node52.html
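For instance, if the system name were "met" (a hypothetical name; use your own System.Name), resuming from the regenerated input could look like this sketch:

```shell
# OpenMX regenerates the input as <System.Name>.dat# during the run;
# rerunning with it resumes the optimization from the terminated step.
# Combined with the per-node memory advice above: single OpenMP thread.
mpirun -np 16 openmx met.dat# -nt 1 > met.restart.std
```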

Regards,

TO