Sent to CCL by: "Pablo Echenique" [echenique.p]^[gmail.com]
Dear CCLers,
Excuse my newbie question, but I would like to know whether the memory and, especially, the disk requirements of a job are distributed among the machines if I launch it in parallel rather than in serial.
To give an example, say I want to perform a CCSD single-point energy calculation that requires 200 GB of disk space. If I launch the job on a machine with 100 GB of space available in the scratch folder, the job dies. But what if I launch the same job in parallel on, say, 4 machines with 60 GB of available space each? Will the disk requirements be split so that the job finishes successfully, or will it still die?
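A quick back-of-the-envelope check of the numbers above, assuming (and this is only an assumption; real quantum chemistry codes may not split their scratch files evenly, or at all) that the program distributes its scratch perfectly evenly across the nodes:

```python
# Hypothetical even-split check for the example in the question.
total_disk_gb = 200       # scratch space the CCSD job needs
nodes = 4                 # machines in the parallel run
per_node_avail_gb = 60    # free scratch space on each machine

# Under a perfectly even split, each node would hold:
per_node_needed_gb = total_disk_gb / nodes   # 50.0 GB

# Does each node have enough room under that assumption?
fits = per_node_needed_gb <= per_node_avail_gb
print(per_node_needed_gb, fits)
```

So under the even-split assumption the job would fit (50 GB needed vs. 60 GB available per node), but whether the code actually behaves this way is exactly the question.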
And what about RAM?
Thank you very much in advance for your help and best regards from Spain,
Pablo Echenique
--
Pablo Echenique
Instituto de Biocomputación y
Física de los Sistemas Complejos (BIFI)
Departamento de Física Teórica
Universidad de Zaragoza
Pedro Cerbuna 12, 50009 Zaragoza
Spain
Tel.: +34 976761260
Fax: +34 976761264
echenique.p_-_gmail.com
http://www.pabloechenique.com
http://www.ccl.net/cgi-bin/ccl/send_ccl_message
http://www.ccl.net/chemistry/sub_unsub.shtml
http://www.ccl.net/spammers.txt