
arc's Introduction

Automated Rate Calculator | ARC



The Automated Rate Calculator (ARC) software is a tool for automating electronic structure calculations and attaining thermo-kinetic data relevant for chemical kinetic modeling.

ARC has many advanced features, yet at its core it is simple: It accepts 2D graph representations of chemical species (i.e., SMILES, InChI, or RMG's adjacency list), and automatically executes, tracks, and processes relevant electronic structure calculation jobs on user-defined server(s). The principal outputs of ARC are thermodynamic properties (H, S, Cp) and high-pressure limit kinetic rate coefficients of species and reactions of interest.
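A minimal input file gives a project name, the job types to run, and the species of interest. The sketch below is illustrative only; the key names follow common ARC usage, but the exact schema may differ between versions:

```yaml
project: demo_thermo

job_types:
  opt: true
  fine: true
  freq: true
  sp: true
  rotors: true

species:
- label: ethanol
  smiles: CCO
```

ARC then dispatches the electronic structure jobs and processes the results into thermodynamic (and, for reactions, kinetic) data.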

Mission

ARC's mission is to provide the kinetics community with a well-documented and extensible codebase for automatically calculating species thermochemistry and reaction rate coefficients.

Documentation

Visit our documentation pages for installation instructions, examples, the API, advanced features, and more.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contributing

Developers and contributors: Visit ARC's Developer's Guide on the wiki page.

If you have a suggestion or find a bug, please post to our Issues page.

Questions

If you are having issues, please post to our Issues page. We will do our best to assist.

arc's People

Contributors

alongd, amarkpayne, calvinp0, codacy-badger, goldmanm, jintaowu98, josemvdh, kfir4444, kspieks, leegoldfryd, lgtm-migrator, lilachn91, mbprend, mefuller, michalkesl, mjohnson541, mliu49, naddeu, oscarwumit, xiaoruidong


arc's Issues

self.mol_from_xyz is None

self.mol_from_xyz is None for certain xyz inputs.

Example SMILES: CN[CH]c1ccccc1
xyz:
N 2.32397400 -0.37071000 0.05181600
C 3.69770300 0.04615800 -0.17159700
C -0.06407100 0.26034800 0.05446700
C 1.31495400 0.56397900 0.10172300
C -1.02031200 1.31529800 0.08296600
C -0.57124100 -1.06787500 -0.02897900
C -2.37619400 1.05920800 0.01429000
C -2.85272500 -0.25556300 -0.08271800
C -1.93443600 -1.30859800 -0.09906600
H 4.36668600 -0.79830800 0.00251100
H 3.96008400 0.83096500 0.54189900
H 3.87534700 0.43142200 -1.18618800
H 2.07908800 -1.25655500 -0.36716800
H 1.63554500 1.58877900 0.24552000
H -0.66693200 2.33923800 0.15352300
H -3.07748200 1.88673800 0.03538900
H -3.91667800 -0.45271600 -0.13706000
H -2.29038900 -2.33193300 -0.15889800
H 0.10769900 -1.91377400 -0.00893900

Rotor troubleshooting

If there's a huge gap in a-b or a-z, then freeze all other rotors:

D 6 1 7 4 S 72        5.0
D 7 1 6 2 F
D 6 2 3 16 F
D 3 2 6 1 F
D 5 4 7 1 F
D 7 4 5 10 F
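A freeze list like the one above can be generated programmatically. Below is a sketch (a hypothetical helper, not ARC's actual API) that emits Gaussian-style modredundant directives, scanning one dihedral while freezing all the others:

```python
def rotor_directives(rotors, scan_index, step_deg=5.0, n_steps=72):
    """Build Gaussian modredundant lines: scan one dihedral, freeze the rest.

    rotors: list of 4-tuples of 1-based atom indices defining each dihedral.
    scan_index: index into `rotors` of the dihedral to scan.
    """
    lines = []
    for i, (a, b, c, d) in enumerate(rotors):
        if i == scan_index:
            # Scanned rotor: n_steps increments of step_deg degrees.
            lines.append(f"D {a} {b} {c} {d} S {n_steps} {step_deg:8.1f}")
        else:
            # All other rotors are frozen at their current value.
            lines.append(f"D {a} {b} {c} {d} F")
    return lines

print("\n".join(rotor_directives([(6, 1, 7, 4), (7, 1, 6, 2)], scan_index=0)))
```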

Job type '1d_rotors' is not supported

I recently installed ARC on the rmg server and tried to run it (I checked that 'make test' worked fine).

When I ran ARC on the rmg server using Slurm, I got the following error:

File "/home/jeehyun/ARC/arc/common.py", line 460, in initialize_job_types
raise InputError(f"Job type '{job_type}' is not supported. Check the job types dictionary "
arc.exceptions.InputError: Job type '1d_rotors' is not supported. Check the job types dictionary (either in ARC's input or in default_job_types under settings).

adaptive distance for auto1dmin

When using auto1dmin, ARC uses a minimum distance between the bath gas and the molecule of interest of 2 Å and a maximum distance of 5 Å. For larger molecules, it makes sense for the distance between the two molecules to increase, so auto1dmin could be made more reliable by adapting the distance range to the size of the molecule and the bath gas. For example, for a C4H10O3 species the original settings failed, but adjusting the range to 3 Å and 6 Å gave Lennard-Jones parameters.

It also seems that the wider the range given, the longer it takes to run.
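One simple way to implement this is to shift the distance window with the number of heavy atoms while keeping its width fixed, so wider ranges don't slow runs down. The heuristic below is purely illustrative (the function name and per-atom increment are assumptions):

```python
def auto1dmin_range(n_heavy_atoms, base_min=2.0, base_max=5.0, per_atom=0.25):
    """Hypothetical heuristic: shift the bath-gas distance window (Angstrom)
    outward as the molecule grows, keeping the window width constant."""
    shift = per_atom * max(0, n_heavy_atoms - 2)
    return base_min + shift, base_max + shift

# e.g., C4H10O3 has 7 heavy atoms:
print(auto1dmin_range(7))  # -> (3.25, 6.25)
```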

Restart.yml file cannot be read in due to ConstructorError

I was running a large ARC job that crashed due to an unrelated and now fixed issue, and I tried to restart from the restart file generated by ARC previously. However, it appears that something is not getting saved correctly to the restart file. Below is the Traceback (I have edited to reduce the lengths of file paths, but I checked that everything was being called from the correct place).

Traceback (most recent call last):
  File "ARC/ARC.py", line 73, in <module>
    main()
  File "ARC/ARC.py", line 53, in main
    input_dict = read_yaml_file(input_file)
  File "ARC/arc/common.py", line 265, in read_yaml_file
    content = yaml.load(stream=f, Loader=yaml.FullLoader)
  File "anaconda3/envs/arc_env/lib/python3.7/site-packages/yaml/__init__.py", line 114, in load
    return loader.get_single_data()
  File "anaconda3/envs/arc_env/lib/python3.7/site-packages/yaml/constructor.py", line 43, in get_single_data
    return self.construct_document(node)
  File "anaconda3/envs/arc_env/lib/python3.7/site-packages/yaml/constructor.py", line 52, in construct_document
    for dummy in generator:
  File "anaconda3/envs/arc_env/lib/python3.7/site-packages/yaml/constructor.py", line 399, in construct_yaml_seq
    data.extend(self.construct_sequence(node))
  File "anaconda3/envs/arc_env/lib/python3.7/site-packages/yaml/constructor.py", line 122, in construct_sequence
    for child in node.value]
  File "anaconda3/envs/arc_env/lib/python3.7/site-packages/yaml/constructor.py", line 122, in <listcomp>
    for child in node.value]
  File "anaconda3/envs/arc_env/lib/python3.7/site-packages/yaml/constructor.py", line 94, in construct_object
    data = constructor(self, tag_suffix, node)
  File "anaconda3/envs/arc_env/lib/python3.7/site-packages/yaml/constructor.py", line 624, in construct_python_object_apply
    instance = self.make_python_instance(suffix, node, args, kwds, newobj)
  File "anaconda3/envs/arc_env/lib/python3.7/site-packages/yaml/constructor.py", line 570, in make_python_instance
    node.start_mark)
yaml.constructor.ConstructorError: while constructing a Python instance
expected a class, but found <class 'builtin_function_or_method'>
  in "restart.yml", line 9491, column 5

The relevant lines from the restart file are given below:

  mol: |
    1  C u0 p0 c0 {2,D} {6,S} {8,S}
    2  C u0 p0 c0 {1,D} {3,S} {9,S}
    3  C u0 p0 c0 {2,S} {4,D} {10,S}
    4  C u0 p0 c0 {3,D} {5,S} {7,S}
    5  C u0 p0 c0 {4,S} {6,D} {11,S}
    6  C u0 p0 c0 {1,S} {5,D} {12,S}
    7  C u0 p0 c0 {4,S} {13,S} {14,S} {15,S}
    8  H u0 p0 c0 {1,S}
    9  H u0 p0 c0 {2,S}
    10 H u0 p0 c0 {3,S}
    11 H u0 p0 c0 {5,S}
    12 H u0 p0 c0 {6,S}
    13 H u0 p0 c0 {7,S}
    14 H u0 p0 c0 {7,S}
    15 H u0 p0 c0 {7,S}
  most_stable_conformer: 0
  multiplicity: 1
  neg_freqs_trshed:
  - !!python/object/apply:numpy.core.multiarray.scalar  # This is line 9491
    - &id002 !!python/object/apply:numpy.dtype
      args:
      - f8
      - 0
      - 1
      state: !!python/tuple
      - 3
      - <
      - null
      - null
      - null
      - -1
      - -1
      - 0
    - !!binary |
      7FG4HoVrOMA=
  number_of_rotors: 0
  opt_level: b3lyp/def2tzvp empiricaldispersion=gd3bj
  optical_isomers: null
  t1: null

Is this an ARC issue or a YAML issue?
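For what it's worth, the `!!python/object/apply:numpy.core.multiarray.scalar` tag appears because a raw numpy scalar (here in `neg_freqs_trshed`) was dumped to the restart file, and `yaml.FullLoader` refuses to reconstruct arbitrary Python objects. One possible fix on the ARC side, sketched below (not ARC's actual code), is to convert values to native Python types before dumping:

```python
def to_native(obj):
    """Recursively convert numpy objects (anything exposing .tolist(), i.e.
    numpy scalars and arrays) to plain Python types, so that yaml.dump writes
    portable tags instead of !!python/object/apply:numpy... constructors."""
    if isinstance(obj, dict):
        return {key: to_native(val) for key, val in obj.items()}
    if isinstance(obj, (list, tuple)):
        return [to_native(val) for val in obj]
    if hasattr(obj, "tolist"):  # numpy scalar or array
        return to_native(obj.tolist())
    return obj
```

With only native types in the dictionary, `yaml.safe_dump` and `yaml.safe_load` can round-trip the restart file without python-object tags.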

Why does ARC attempt to run species I did not ask for? (frequency scaling factors)

Originally posted by @NtuCheRoy in #196 (comment)


I tried to run hydrogen with an sp job, but in the log I see that ARC runs opt jobs for molecules I didn't specify, submitting too many jobs at once and hitting the QOS job-submission limit. Why does ARC run these opt jobs for these molecules instead of the job I specified in the yml file?

Using Theano backend.
ARC execution initiated on Tue Sep 17 06:29:57 2019

###############################################################
#                                                             #
#                 Automatic Rate Calculator                   #
#                            ARC                              #
#                                                             #
#   Version: 1.1.0                                            #
#                                                             #
###############################################################

The current git HEAD for ARC is:
    ccdb557aa3f369b0de24da9a84ce9f9c3ad6dbb8
    Sun Sep 15 21:43:52 2019 -0400


Starting project arc_demo_1

Considering the following job types: ['sp']


Using the following ESS settings:
{'qchem': ['local']}



Warning: Not using a fine grid for geometry optimization jobs




Warning: Not running rotor scans. This might compromise finding the best conformer, as dihedral angles won't be corrected. Also, entropy won't be accurate.


Using b3lyp/6-31g* for refined conformer searches (after filtering via force fields)
Using b3lyp/6-31g* for TS guesses comparison of different methods
Using b3lyp/6-31g* for geometry optimizations
Using b3lyp/6-31g* for frequency calculations
Using b3lyp/6-31g* for single point calculations
Using b3lyp/6-31g* for rotor scans



Warning: Could not determine appropriate Model Chemistry to be used in Arkane for thermochemical parameter calculations.
Not using atom energy corrections and bond additivity corrections!


Using b3lyp/6-31g*//b3lyp/6-31g* as a model chemistry in Arkane
Warning: :root:No frequency scaling factor found for model chemistry b3lyp/6-31g*. Assuming a value of unity. This will affect the partition function and all quantities derived from it (thermo quantities and rate coefficients).
Could not determine the harmonic frequencies scaling factor for b3lyp/6-31g* from Arkane.
Calculating it using Truhlar's method:






FREQ: A PROGRAM FOR OPTIMIZING SCALE FACTORS (Version 1)
                 written by
Haoyu S. Yu, Lucas J. Fiedler, I.M. Alecu, and Donald G. Truhlar
Department of Chemistry and Supercomputing Institute
University of Minnesota, Minnesota 55455-0431
CITATIONS:
1. I.M., Alecu, J. Zheng, Y. Zhao, D.G. Truhlar, J. Chem. Theory Comput. 2010, 6, 9, 2872-2887,
   DOI: 10.1021/ct100326h
2. H.S. Yu, L.J. Fiedler, I.M. Alecu,, D.G. Truhlar, Computer Physics Communications 2017, 210, 132-138,
   DOI: 10.1016/j.cpc.2016.09.004




starting ARC...


Considering the following job types: [u'opt', u'conformers', u'sp', u'1d_rotors', u'freq', u'fine']


Computing scaling factors at the b3lyp/6-31g* level of theory...



Using the following ESS settings:
{'qchem': ['local']}

Running job opt_a45 for C2H2
Running job opt_a46 for CH4
Running job opt_a47 for CO2
Running job opt_a48 for CO
Running job opt_a49 for F2
Running job opt_a50 for CH2O
sbatch: error: QOSMaxSubmitJobPerUserLimit
sbatch: error: Batch job submission failed: Job violates accounting/QOS policy (job submit limit, user's size and/or time limits)
Error: The following command is erroneous:

Traceback (most recent call last):
  File "/global/homes/s/shcheng/ARC/ARC.py", line 76, in <module>
    main()
  File "/global/homes/s/shcheng/ARC/ARC.py", line 69, in main
    arc_object = ARC(input_dict=input_dict, project_directory=project_directory)
  File "/global/u2/s/shcheng/ARC/arc/main.py", line 405, in __init__
    self.check_freq_scaling_factor()
  File "/global/u2/s/shcheng/ARC/arc/main.py", line 1015, in check_freq_scaling_factor
    level, ess_settings=self.ess_settings, init_log=False)[0]
  File "/global/u2/s/shcheng/ARC/arc/utils/scale.py", line 108, in determine_scaling_factors
    ess_settings=ess_settings, job_types=job_types, allow_nonisomorphic_2d=True)
  File "/global/u2/s/shcheng/ARC/arc/scheduler.py", line 363, in __init__
    self.run_opt_job(species.label)
  File "/global/u2/s/shcheng/ARC/arc/scheduler.py", line 805, in run_opt_job
    job_type='opt', fine=False)
  File "/global/u2/s/shcheng/ARC/arc/scheduler.py", line 659, in run_job
    self.job_dict[label][job_type][job.job_name].run()
  File "/global/u2/s/shcheng/ARC/arc/job/job.py", line 805, in run
    self.job_status[0], self.job_id = submit_job(path=self.local_path)
  File "/global/u2/s/shcheng/ARC/arc/job/local.py", line 98, in submit_job
    stdout = execute_command(cmd)[0]
  File "/global/u2/s/shcheng/ARC/arc/job/local.py", line 43, in execute_command
    ' sbatch path required in the submit_command dictionary.'.format(e.message))
arc.arc_exceptions.SettingsError: The following command is erroneous:

To correct the command, modify settings.py
Tips: use "which" command to locate cluster software commands on server.
Example: type "which sbatch" on a server running Slurm to find the correct sbatch path required in the submit_command dictionary.

Conformer generation fails for a polycyclic species without torsions

Current ARC master errors when trying to generate conformers for C1=CC2=CC=C1C2.


Generating conformers for C7H6
Species C7H6 has 7 heavy atoms and 0 torsions. Using 100 random conformers.
Traceback (most recent call last):
  File "../../ARC.py", line 73, in <module>
    main()
  File "../../ARC.py", line 67, in main
    arc_object.execute()
  File "/home/mjliu/Code/ARC/arc/main.py", line 719, in execute
    dont_gen_confs=self.dont_gen_confs)
  File "/home/mjliu/Code/ARC/arc/scheduler.py", line 398, in __init__
    self.schedule_jobs()
  File "/home/mjliu/Code/ARC/arc/scheduler.py", line 414, in schedule_jobs
    self.run_conformer_jobs()
  File "/home/mjliu/Code/ARC/arc/scheduler.py", line 784, in run_conformer_jobs
    self.project_directory, 'output', 'Species', label, 'geometry', 'conformers'))
  File "/home/mjliu/Code/ARC/arc/species/species.py", line 776, in generate_conformers
    plot_path=plot_path)
  File "/home/mjliu/Code/ARC/arc/species/conformers.py", line 179, in generate_conformers
    smeared_scan_res, plot_path=plot_path)
  File "/home/mjliu/Code/ARC/arc/species/conformers.py", line 252, in deduce_new_conformers
    torsion_angles = get_torsion_angles(label, conformers, torsions)  # get all wells per torsion
  File "/home/mjliu/Code/ARC/arc/species/conformers.py", line 897, in get_torsion_angles
    'Consider calling `determine_dihedrals()` first.'.format(label))
arc.exceptions.ConformerError: Could not determine dihedral torsion angles for C7H6. Consider calling `determine_dihedrals()` first.

FileNotFoundError: Could not find SSH key for the server ARC is running on

It looks like ARC cannot find the SSH key when I try to submit jobs using Slurm inside the RMG server.

Below are the settings in settings.py in ~/ARC/arc. I also finished copying the RSA keys on both servers.

servers = {
    'pharos': {
        'cluster_soft': 'OGE',
        'address': 'pharos.mit.edu',
        'un': 'jeehyun',
        'key': '~/jeehyun/.ssh/id_rsa',
    },
    'rmg': {
        'cluster_soft': 'Slurm',
        'address': 'rmg.mit.edu',
        'un': 'jeehyun',
        'key': '~/jeehyun/.ssh/id_rsa',
    },
}

# List here servers you'd like to associate with specific ESS.
# An ordered list of servers indicates priority
# Keeping this dictionary empty will cause ARC to scan for software on the servers defined above
global_ess_settings = {
    'gaussian': ['rmg', 'pharos'],
    'molpro': 'rmg',
    'qchem': 'pharos',
    'onedmin': 'pharos',
}

Rotor detection fails in some special cases

The rotor detector fails in some special cases:
(1) When one end of the dihedral has a triple bond, e.g., H-C#C-O-H. It will take C#C-O-H as a rotor.
(2) The original structure may appear to have rotors, but looking at other resonance structures shows that some of them are fake, e.g., O=[C]C=O. It looks like O=[C]-C=O is a rotor, but it has a resonance structure, O=C=C-[O], which indicates that it is not really a rotor (the actual O-C-C angle is 180 degrees in the optimized structure).

Suggestion:
Both cases fail when a bond angle equals 180 degrees. We could check for triple bonds or resonance structures, but there may be other cases that are impossible to identify beforehand. So instead we could check whether any bond angle in the dihedral about to be scanned equals 180 degrees, and if so, ignore the rotor.
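The suggested angle check is straightforward to implement from the Cartesian coordinates. A sketch (helper names are hypothetical) that flags a rotor whose dihedral contains a near-linear bond angle:

```python
import math

def bond_angle(p1, p2, p3):
    """Angle at p2 (in degrees) formed by p1-p2-p3."""
    v1 = [a - b for a, b in zip(p1, p2)]
    v2 = [a - b for a, b in zip(p3, p2)]
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(a * a for a in v2))
    # Clamp to [-1, 1] to guard against floating-point drift.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def rotor_is_linear(coords, dihedral, tol_deg=5.0):
    """True if either internal angle of the a-b-c-d dihedral is within
    tol_deg of 180 degrees, i.e. the rotor should be ignored."""
    a, b, c, d = (coords[i] for i in dihedral)
    return (abs(bond_angle(a, b, c) - 180.0) < tol_deg
            or abs(bond_angle(b, c, d) - 180.0) < tol_deg)
```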

Some tests from "make test" fail

After pulling the latest version of master and running 'make test', some of the tests failed due to precision errors. An example is shown below.

FAIL: Test converting a string xyz format to the ARC xyz format
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/kevin/Documents/RMG/ARC/arc/species/converterTest.py", line 491, in test_str_to_xyz
    self.assertEqual(xyz5, expected_xyz5)
AssertionError: {'sym[204 chars]531455, -0.5143966785845655, -0.36373303270134[65 chars]64))} != {'sym[204 chars]531454, -0.5143966785845655, -0.36373303270134[65 chars]64))}
  {'coords': ((0.0, 0.0, -6.987742560341984e-08),
              (0.0, 0.0, 1.0911999301225743),
              (0.0, 1.0287933571691315, -0.3637330327013464),
-             (0.8909611825531455, -0.5143966785845655, -0.3637330327013464),
?                               ^

+             (0.8909611825531454, -0.5143966785845655, -0.3637330327013464),
?                               ^

              (-0.8909611825531449, -0.5143966785845662, -0.3637330327013464)),
   'isotopes': (12, 1, 1, 1, 1),
   'symbols': ('C', 'H', 'H', 'H', 'H')}
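Such last-digit differences are platform-dependent rounding, so coordinate comparisons in tests should use a tolerance rather than exact equality. A sketch of a tolerant comparison for ARC-style xyz dicts (the helper name is hypothetical):

```python
import math

def xyz_almost_equal(xyz1, xyz2, rel_tol=1e-9, abs_tol=1e-12):
    """Compare two ARC-style xyz dicts, tolerating floating-point noise
    in the coordinates (symbols and isotopes must match exactly)."""
    if xyz1["symbols"] != xyz2["symbols"] or xyz1["isotopes"] != xyz2["isotopes"]:
        return False
    return all(
        math.isclose(a, b, rel_tol=rel_tol, abs_tol=abs_tol)
        for c1, c2 in zip(xyz1["coords"], xyz2["coords"])
        for a, b in zip(c1, c2)
    )
```

In a unittest, the same idea is `assertAlmostEqual(a, b, places=12)` per coordinate instead of `assertEqual` on the whole dict.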

Syntax error when ARC send server commands (local server)

I hit another error. I have set up submit_command in arc/settings.py as shown below.

submit_command = {'OGE': 'export SGE_ROOT=/opt/sge; /opt/sge/bin/lx24-amd64/qsub',
                  'Slurm': '/usr/bin/sbatch'}

I typed which sbatch on my server, and it shows /usr/bin/sbatch, the same as the one in settings.py. I have no idea what is going wrong. Below is the traceback.

Running job sp_a51 for hydrogen
/bin/sh: -c: line 0: syntax error near unexpected token `newline'
/bin/sh: -c: line 0: `/usr/bin/squeue -u <username>'
Error: The following command is erroneous:

Traceback (most recent call last):
  File "/global/homes/s/shcheng/ARC/ARC.py", line 76, in <module>
    main()
  File "/global/homes/s/shcheng/ARC/ARC.py", line 70, in main
    arc_object.execute()
  File "/global/u2/s/shcheng/ARC/arc/main.py", line 718, in execute
    dont_gen_confs=self.dont_gen_confs)
  File "/global/u2/s/shcheng/ARC/arc/scheduler.py", line 391, in __init__
    self.schedule_jobs()
  File "/global/u2/s/shcheng/ARC/arc/scheduler.py", line 413, in schedule_jobs
    self.get_servers_jobs_ids()  # updates `self.servers_jobs_ids`
  File "/global/u2/s/shcheng/ARC/arc/scheduler.py", line 1965, in get_servers_jobs_ids
    self.servers_jobs_ids.extend(check_running_jobs_ids())
  File "/global/u2/s/shcheng/ARC/arc/job/local.py", line 81, in check_running_jobs_ids
    stdout = execute_command(cmd)[0]
  File "/global/u2/s/shcheng/ARC/arc/job/local.py", line 43, in execute_command
    ' sbatch path required in the submit_command dictionary.'.format(e.message))
arc.arc_exceptions.SettingsError: The following command is erroneous:

To correct the command, modify settings.py
Tips: use "which" command to locate cluster software commands on server.
Example: type "which sbatch" on a server running Slurm to find the correct sbatch path required in the submit_command dictionary.

Empirical dispersion does not make it to the QM input file

Describe the bug
Following the documentation here to add in the empirical dispersion model, I am finding that the dispersion keyword is not actually being added to the QM (in my case Gaussian) input file.

Looking at job.py, it appears that self.dispersion is being set properly, but this variable doesn't appear to be used elsewhere at first glance.

How to reproduce
Here is some of the code you need to reproduce this bug:

project = 'dispersion_test'
ess_settings = {'gaussian': ['local']}
job_types = {'rotors': False,
             'conformers': False,
             'fine': True,
             'freq': True,
             'opt': True,
             'sp': True,
             }
max_job_time = 100
model_chemistry = 'user_defined'

# Level of Theory
lot = {'method': 'b3lyp', 'basis': 'def2tzvp', 'dispersion': 'empiricaldispersion=gd3bj'}

opt_level = lot
freq_level = lot
scan_level = lot
sp_level = lot
conformer_level = lot
ts_guess_level = lot

# arc_species_list is a list of one species. smiles="O=CCl"

arc = ARC(project=project,
          arc_species_list=arc_species_list,
          ess_settings=ess_settings,
          job_types=job_types,
          max_job_time=max_job_time,
          model_chemistry=model_chemistry,
          opt_level=opt_level,
          freq_level=freq_level,
          scan_level=scan_level,
          sp_level=sp_level,
          conformer_level=conformer_level,
          ts_guess_level=ts_guess_level,
          compare_to_rmg=False,
          #calc_freq_factor=False,
          )

arc.execute()

Expected behavior
Specifying the dispersion should mean that the dispersion model is used in QM calculations.
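For reference, once `self.dispersion` is propagated to input-file generation, appending it to the Gaussian route section is a small change. The sketch below is illustrative only (the function and keyword layout are not ARC's actual code):

```python
def gaussian_route(method, basis, job_keywords="opt freq", dispersion=""):
    """Assemble a Gaussian route line, appending the dispersion
    keyword only when one was specified."""
    route = f"# {method}/{basis} {job_keywords}"
    if dispersion:
        route += f" {dispersion}"
    return route

print(gaussian_route("b3lyp", "def2tzvp",
                     dispersion="empiricaldispersion=gd3bj"))
# -> # b3lyp/def2tzvp opt freq empiricaldispersion=gd3bj
```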

RMG import errors when running ARC

Posted on behalf of @NtuCheRoy:

"When I run my ARC job, an 'ImportError: No module named util' error appeared; util is a package of Arkane. I installed RMG with 'conda install -c rmg rmg', and I added to the PYTHONPATH with 'export PYTHONPATH=$PYTHONPATH:~/.conda/envs/arc_env/bin'. Could you tell me how to solve this problem?"

Need to improve molpro memory trsh

Warning: Troubleshooting iC3H6CNOOH job sp_a32590 which failed with status: "errored,"
with keywords: ['NoOutput']
in molpro.
The error "Log file could not be read" was derived from the following line in the log file:
"".
Troubleshooting sp job in molpro for iC3H6CNOOH using shift
Running job sp_a32594 for iC3H6CNOOH
Currently running jobs:
{'iC3H6CNOOH': ['scan_a32587', 'freq_a32589', 'scan_a32591', 'scan_a32592', 'scan_a32593', 'sp_a32594']}
  Ending job freq_a32589 for iC3H6CNOOH (run time: 0:11:45)
  Ending job sp_a32594 for iC3H6CNOOH (run time: 0:25:50)


Warning: Troubleshooting iC3H6CNOOH job sp_a32594 which failed with status: "errored,"
with keywords: ['Memory']
in molpro.
The error "Additional memory required: 135.10 MW" was derived from the following line in the log file:
" A further 135.10 Mwords of memory are needed for the triples to run. Increase memory to 359.38 Mwords.".
Troubleshooting sp job in molpro for iC3H6CNOOH using memory: 20.5625 GB instead of 14 GB
Running job sp_a32595 for iC3H6CNOOH
  Ending job sp_a32595 for iC3H6CNOOH (run time: 0:21:48)


Warning: Troubleshooting iC3H6CNOOH job sp_a32595 which failed with status: "errored,"
with keywords: ['Memory']
in molpro.
The error "Additional memory required: 30.10 MW" was derived from the following line in the log file:
" A further 30.10 Mwords of memory are needed for the triples to run. Increase memory to 359.38 Mwords.".
Troubleshooting sp job in molpro for iC3H6CNOOH using memory: 26.34375 GB instead of 20.5625 GB
Running job sp_a32596 for iC3H6CNOOH
Currently running jobs:
{'iC3H6CNOOH': ['scan_a32587', 'scan_a32591', 'scan_a32592', 'scan_a32593', 'sp_a32596']}
Currently running jobs:
{'iC3H6CNOOH': ['scan_a32587', 'scan_a32591', 'scan_a32592', 'scan_a32593', 'sp_a32596']}

The first job returned an empty output.out file. The output file on the server was empty as well, and the Slurm message was:

0:Child process terminated prematurely, status=: 0
(rank:0 hostname:node05 pid:28293):ARMCI DASSERT fail. src/common/signaltrap.c:SigChldHandler():178 cond:0
application called MPI_Abort(comm=0x84000007, -1) - process 0

The other jobs ran fine, but Molpro repeatedly asked for more memory. Perhaps a better and simpler strategy for memory troubleshooting would be to increase job_mem to the maximum allowed by the server whenever the ESS requests more (the quantity it asks for might grow down the road, so don't try to predict how much it needs). If it fails again due to memory, try changing servers if the ESS is available on more than one, but do so only if the other server has a larger memory allocation.

Another solution would be to teach ARC to parallelize over several nodes.
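The jump-to-server-max strategy could look like the sketch below (a hypothetical helper; the regex matches the Molpro message quoted above):

```python
import re

_MEM_RE = re.compile(r"Increase memory to\s+[\d.]+\s+Mwords")

def next_molpro_memory_gb(log_line, current_gb, server_max_gb):
    """If the Molpro log asks for more memory, go straight to the server
    maximum rather than stepping up incrementally (the requested amount can
    grow on reruns, so predicting it is unreliable).

    Returns the new memory in GB, or None if no (further) increase applies."""
    if _MEM_RE.search(log_line) is None:
        return None
    return server_max_gb if current_gb < server_max_gb else None

line = (" A further 30.10 Mwords of memory are needed for the triples to run."
        " Increase memory to 359.38 Mwords.")
print(next_molpro_memory_gb(line, current_gb=20.5625, server_max_gb=32.0))  # -> 32.0
```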

Need to run sp after conf DFT

After optimizing the N conformers at the DFT level (which should be apfd/def2svp instead of b3lyp/6-31g(d,p) EmpiricalDispersion=GD3BJ), the selection should be made according to a higher-level sp energy. The recommendation is DLPNO-CCSD(T), the same as the new recommendation for the legacy sp level. This should also (and especially) be implemented for TSs.
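The selection step itself is simple once the single-point energies are in hand; a sketch (hypothetical helper, energies in any consistent unit):

```python
def select_conformer(conformers, sp_energies):
    """Pick the conformer with the lowest high-level single-point energy
    (rather than the lowest DFT optimization energy).

    conformers: list of geometries (opaque here).
    sp_energies: parallel list of single-point energies, e.g. in Hartree.
    """
    lowest = min(range(len(sp_energies)), key=sp_energies.__getitem__)
    return lowest, conformers[lowest]

idx, best = select_conformer(["confA", "confB", "confC"],
                             [-230.0012, -230.0047, -230.0031])
print(idx)  # -> 1
```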

Can't run only a specific job type (e.g., freq) w/o first running opt

I tried to run a freq job for hydrogen, giving xyz as input. However, it says that xyz is not defined, which is very strange. Below are my input file and the traceback. Could you help me solve this problem? Thank you so much.
By the way, when I run a single point job, this error doesn't appear.

project: arc_demo_1

ess_settings:
        qchem:
        - local

job_types:
        conformers: false
        opt: false
        fine: false
        freq: true
        sp: false
        1d_rotors: false
        orbitals: false

level_of_theory: b3lyp/6-31g*
conformer_level: b3lyp/6-31g*
scan_level: b3lyp/6-31g*
ts_guess_level: b3lyp/6-31g*
calc_freq_factor: false

species:
- label: hydrogen
  xyz:
  - |
    H         -4.12347       -0.08887        0.00000
    H         -3.70381        0.48134        0.00000
Using Theano backend.
ARC execution initiated on Wed Sep 18 11:51:40 2019

###############################################################
#                                                             #
#                 Automatic Rate Calculator                   #
#                            ARC                              #
#                                                             #
#   Version: 1.1.0                                            #
#                                                             #
###############################################################

The current git HEAD for ARC is:
    9fc5c7722322bc0a0b74c82652cc8401f8fa3fae
    Mon Sep 16 22:39:52 2019 -0400


Starting project arc_demo_1

Considering the following job types: ['freq']


Using the following ESS settings:
{'qchem': ['local']}



Warning: Not using a fine grid for geometry optimization jobs




Warning: Not running rotor scans. This might compromise finding the best conformer, as dihedral angles won't be corrected. Also, entropy won't be accurate.


Using b3lyp/6-31g* for refined conformer searches (after filtering via force fields)
Using b3lyp/6-31g* for TS guesses comparison of different methods
Using b3lyp/6-31g* for geometry optimizations
Using b3lyp/6-31g* for frequency calculations
Using b3lyp/6-31g* for single point calculations
Using b3lyp/6-31g* for rotor scans



Warning: Could not determine appropriate Model Chemistry to be used in Arkane for thermochemical parameter calculations.
Not using atom energy corrections and bond additivity corrections!


Using b3lyp/6-31g*//b3lyp/6-31g* as a model chemistry in Arkane
Warning: :root:No frequency scaling factor found for model chemistry b3lyp/6-31g*. Assuming a value of unity. This will affect the partition function and all quantities derived from it (thermo quantities and rate coefficients).
Could not determine the harmonic frequencies scaling factor for b3lyp/6-31g* from Arkane.
Not calculating it, assuming a frequencies scaling factor of 1.


Considering species: hydrogen
<Molecule "[H][H]">


Only one conformer is available for species hydrogen, using it as initial xyz
The only conformer for species hydrogen was found to be isomorphic with the 2D graph representation [H][H]

Traceback (most recent call last):
  File "/global/homes/s/shcheng/ARC/ARC.py", line 76, in <module>
    main()
  File "/global/homes/s/shcheng/ARC/ARC.py", line 70, in main
    arc_object.execute()
  File "/global/u2/s/shcheng/ARC/arc/main.py", line 718, in execute
    dont_gen_confs=self.dont_gen_confs)
  File "/global/u2/s/shcheng/ARC/arc/scheduler.py", line 391, in __init__
    self.schedule_jobs()
  File "/global/u2/s/shcheng/ARC/arc/scheduler.py", line 407, in schedule_jobs
    self.run_conformer_jobs()
  File "/global/u2/s/shcheng/ARC/arc/scheduler.py", line 788, in run_conformer_jobs
    self.process_conformers(label)
  File "/global/u2/s/shcheng/ARC/arc/scheduler.py", line 1385, in process_conformers
    self.run_freq_job(label)
  File "/global/u2/s/shcheng/ARC/arc/scheduler.py", line 872, in run_freq_job
    level_of_theory=self.freq_level, job_type='freq')
  File "/global/u2/s/shcheng/ARC/arc/scheduler.py", line 683, in run_job
    directed_dihedral=directed_dihedral)
  File "/global/u2/s/shcheng/ARC/arc/job/job.py", line 217, in __init__
    raise InputError('{0} Job of species {1} got None for xyz'.format(self.job_type, self.species_name))
arc.arc_exceptions.InputError: freq Job of species hydrogen got None for xyz

Error in converting xyz to 2D molecule leads to false isomorphism check failure

Describe the bug
I am trying to use ARC for a slightly unconventional species, chloromethylene, which has the SMILES [CH]CCl. Below are the xyz coordinates, optimized at wb97m-v/def2-tzvpd:

C	1.1709385492000002	0.17631434110000002	-0.0
Cl	-0.5031634975000001	-0.010943003600000003	0.0
H	1.5281481620000001	-0.8718549847000001	0.0

Looking at this geometry in Avogadro, this appears to be the right geometry for this molecule. However, when I try to create an ARC species, I get the following errors:

ERROR:root:Could not update atomtypes for this molecule:
multiplicity -187
1 C  u0 p0 c0 {2,T} {3,S}
2 Cl u0 p2 c0 {1,T}
3 H  u0 p0 c0 {1,S}

ERROR:root:Could not update atomtypes for this molecule:
1 C  u0 p0 c0 {2,T} {3,S}
2 Cl u0 p2 c0 {1,T}
3 H  u0 p0 c0 {1,S}

ERROR:root:Could not update atomtypes for this molecule:
1 C  u0 p0 c0 {2,T} {3,S}
2 Cl u0 p2 c0 {1,T}
3 H  u0 p0 c0 {1,S}

WARNING:arc:XYZ and the 2D graph representation for spcs54 are not isomorphic.
Got xyz:
{'symbols': ('C', 'Cl', 'H'), 'isotopes': (12, 35, 1), 'coords': ((1.1709385492000002, 0.17631434110000002, -0.0), (-0.5031634975000001, -0.010943003600000003, 0.0), (1.5281481620000001, -0.8718549847000001, 0.0))}

which corresponds to C#Cl
1 Cl u0 p2 c0 {2,T}
2 C  u0 p0 c0 {1,T} {3,S}
3 H  u0 p0 c0 {2,S}


and: [CH]Cl
multiplicity 3
1 Cl u0 p3 c0 {2,S}
2 C  u2 p0 c0 {1,S} {3,S}
3 H  u0 p0 c0 {2,S}

WARNING:arc:Element order in xyz (('C', 'Cl', 'H')) differs from mol (['Cl', 'C', 'H'])

To be clear, this is likely an error with RDKit/Openbabel, but I am curious how we should handle this inside of ARC. For example, here ARC can probably work out the correct connectivity, just not the correct bond orders. Should we change ARC to fall back to a warning if the isomorphism check fails but the connectivity seems right?

How to reproduce
Try executing the following:

ARCSpecies(smiles='[CH]CCl',  xyz='C\t1.1709385492000002\t0.17631434110000002\t-0.0\nCl\t-0.5031634975000001\t-0.010943003600000003\t0.0\nH\t1.5281481620000001\t-0.8718549847000001\t0.0\n')

Expected behavior
ARC should let this pass with at most a warning
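A connectivity-only fallback could compare element-labeled adjacency while ignoring bond orders. The sketch below computes a cheap, order-independent signature; equal signatures are a necessary (not sufficient) condition for matching connectivity, so it would only soften a failed isomorphism check into a warning, as suggested:

```python
def connectivity_signature(symbols, bonds):
    """Order-independent signature of element-labeled connectivity, ignoring
    bond orders: a sorted multiset of (element, sorted neighbor elements).

    symbols: element symbols indexed by atom; bonds: (i, j) index pairs.
    """
    neighbors = {i: [] for i in range(len(symbols))}
    for i, j in bonds:
        neighbors[i].append(symbols[j])
        neighbors[j].append(symbols[i])
    return sorted((symbols[i], tuple(sorted(neighbors[i])))
                  for i in range(len(symbols)))

# Same connectivity for CHCl whether perceived as C#Cl or [CH]Cl,
# and regardless of atom ordering:
triple = connectivity_signature(("C", "Cl", "H"), [(0, 1), (0, 2)])
carbene = connectivity_signature(("Cl", "C", "H"), [(0, 1), (1, 2)])
print(triple == carbene)  # -> True
```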

Error related to model chemistries

I don't really have time to look at this now, but so it doesn't get lost I'll put it down here. I ran the ARC reaction notebook, except that I gave it the TS in the yaml as an initial guess. I believe this issue is related to the fact that I set sp_level='B3LYP/6-311++G(3df,3pd)' in the ARC object construction.

Loading statistical mechanics parameters for TS...


Exception Traceback (most recent call last)
in ()
30 arc_rxn_list=arc_rxn_list, sp_level='B3LYP/6-311++G(3df,3pd)',
31 use_bac=False,scan_rotors=False, fine=False,ess_settings={"gaussian":"pharos"})
---> 32 arc0.execute()

/Users/mattjohnson/RMGCODE/ARC/arc/main.pyc in execute(self)
564 lib_long_desc=self.lib_long_desc, rmgdatabase=self.rmgdb, t_min=self.t_min, t_max=self.t_max,
565 t_count=self.t_count)
--> 566 prc.process()
567 self.summary()
568 self.log_footer()

/Users/mattjohnson/RMGCODE/ARC/arc/processor.pyc in process(self)
220 stat_mech_job.modelChemistry = self.model_chemistry
221 stat_mech_job.frequencyScaleFactor = assign_frequency_scale_factor(self.model_chemistry)
--> 222 stat_mech_job.execute(outputFile=None, plot=False)
223 for spc in rxn.r_species + rxn.p_species:
224 if spc.label not in arkane_spc_dict.keys():

/Users/mattjohnson/RMGCODE/RMG-Py/arkane/statmech.pyc in execute(self, outputFile, plot, pdep)
187 logging.error("pdep loading:")
188 logging.error(pdep)
--> 189 self.load(pdep)
190 if outputFile is not None:
191 self.save(outputFile)

/Users/mattjohnson/RMGCODE/RMG-Py/arkane/statmech.pyc in load(self, pdep)
418 atomEnergies=self.atomEnergies,
419 applyAtomEnergyCorrections=self.applyAtomEnergyCorrections,
--> 420 applyBondEnergyCorrections=self.applyBondEnergyCorrections)
421 if len(number) > 1:
422 ZPE = statmechLog.loadZeroPointEnergy() * self.frequencyScaleFactor

/Users/mattjohnson/RMGCODE/RMG-Py/arkane/statmech.pyc in applyEnergyCorrections(E0, modelChemistry, atoms, bonds, atomEnergies, applyAtomEnergyCorrections, applyBondEnergyCorrections)
774
775 else:
--> 776 raise Exception('Unknown model chemistry "{}".'.format(modelChemistry))
777
778 for symbol, count in atoms.items():

Exception: Unknown model chemistry "".
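
The crash above happens because an empty modelChemistry string reaches Arkane. A defensive check on the ARC side could fail fast with a clearer message; the sketch below is a hypothetical helper, not ARC's actual code, and `known` is only an illustrative subset of recognized model chemistries.

```python
def validate_model_chemistry(model_chemistry,
                             known=('cbs-qb3', 'ccsd(t)-f12/cc-pvtz-f12')):
    """Fail fast, with a clear message, before a job reaches Arkane.
    `known` is only an illustrative subset of the model chemistries for
    which Arkane has atom-energy corrections."""
    mc = (model_chemistry or '').strip().lower()
    if not mc:
        raise ValueError('Empty model chemistry: when sp_level has no built-in '
                         'energy corrections (e.g. B3LYP/6-311++G(3df,3pd)), '
                         'set model_chemistry explicitly or disable corrections.')
    if mc not in known:
        raise ValueError('Unknown model chemistry "{0}".'.format(mc))
    return mc

print(validate_model_chemistry('CBS-QB3'))  # cbs-qb3
```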

ARC could not determine the correct structure when two dihedrals need to change simultaneously

AIBN is a radical initiator
ARC oscillated between two structures; both have a rotor (the central N-N rotor) with a more stable conformer.
(screenshots of the two structures omitted)
Levels used:

Using default level wb97xd/6-311++g(d,p) for geometry optimizations
Using default level wb97xd/6-311++g(d,p) for frequency calculations
Using b3lyp/6-311+g(3df,2p) for single point calculations
Using wb97xd/6-311++g(d,p) for rotor scans

The correct structure of AIBN is:

N       0.43620300   -0.16213600    0.39977000
N      -0.43609600    0.16200900   -0.39967300
N       3.47047000   -1.53123100    0.03196200
N      -3.46996200    1.53150000   -0.03036300
C       1.77826200    0.47460500    0.12212700
C      -1.77820800   -0.47465200   -0.12199900
C       2.06997600    1.34762500    1.36001200
C       1.86126100    1.27708200   -1.18253100
C      -1.86100900   -1.27778400    1.18227200
C      -2.07041800   -1.34697300   -1.36024900
C       2.72158400   -0.65672400    0.08094100
C      -2.72128800    0.65684600   -0.07998100
H       3.08053300    1.75482400    1.30309300
H       1.97462700    0.76131600    2.27429700
H       1.35568300    2.17301000    1.39068400
H       1.63043300    0.65189300   -2.04518700
H       2.86999700    1.67776300   -1.29694300
H       1.14801100    2.10104700   -1.15797500
H      -1.14806900   -2.10199900    1.15698900
H      -2.86984100   -1.67817400    1.29685000
H      -1.62961800   -0.65314800    2.04517300
H      -1.35626800   -2.17245900   -1.39156500
H      -3.08102200   -1.75404500   -1.30325100
H      -1.97523500   -0.76020200   -2.27425500
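
One way to avoid the oscillation is to scan the two coupled dihedrals on a joint 2D grid rather than relaxing one at a time. Below is a minimal sketch with a mock energy function; a real implementation would call the ESS for each grid point, and the toy surface and 60-degree grid are assumptions for illustration only.

```python
import math
from itertools import product

def joint_dihedral_scan(energy, angles=range(0, 360, 60)):
    """Scan two dihedrals on a joint grid and return the lowest-energy
    (phi1, phi2) pair, instead of relaxing one dihedral at a time."""
    return min(product(angles, angles), key=lambda p: energy(*p))

def mock_energy(phi1, phi2):
    # Toy surface with a coupling term between the two dihedrals.
    return (math.cos(math.radians(phi1)) + math.cos(math.radians(phi2))
            + 0.5 * math.cos(math.radians(phi1 - phi2)))

print(joint_dihedral_scan(mock_energy))  # (180, 180)
```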

Some species aren't processed well in py3Dmol

Could also be related to atom mapping to / from RDKit

O      -0.98526500    1.74795900    0.84621100
C      -1.15584400    0.77365100   -0.08180100
H      -2.21814200    0.49837200   -0.21097000
H      -0.86607200    1.22917700   -1.05225000
C      -0.28520800   -0.49049600    0.12747900
C       1.20033300   -0.14651100    0.18038000
H       1.79449700   -1.04204200    0.37859600
H       1.52877700    0.29545700   -0.76226600
H       1.38216500    0.57981500    0.97402900
C      -0.74563500   -1.24665600    1.37315400
H      -0.63589300   -0.59990000    2.24646500
H      -1.79478200   -1.53611900    1.28112200
H      -0.14716900   -2.14752600    1.52233500
O      -0.58693200   -1.42025300   -0.93733200
O      -0.25339300   -0.81210500   -2.21637000
H       0.52665200   -1.32818200   -2.46076100

(screenshot of the mis-rendered structure omitted)

A network error causes ARC to crash

When the machine sending commands to the server loses connectivity, ARC crashes with:

---------------------------------------------------------------------------
gaierror                                  Traceback (most recent call last)
<ipython-input-5-f62c9e69185c> in <module>()
      1 arc0 = arc.ARC(project='Arc-cooptima-methylpropylether-cbsqb3', composite_method='cbs-qb3', rmg_species_list=[], arc_species_list=arc_species_list,ess_settings = {'gaussian': 'c3ddb', 'molpro': 'pharos', 'qchem': 'pharos'})
----> 2 arc0.execute()
/home/dranasinghe/Software/ARC/arc/main.pyc in execute(self)
    234                                    scan_level=self.scan_level, fine=self.fine, settings=self.settings,
    235                                    generate_conformers=self.generate_conformers, scan_rotors=self.scan_rotors,
--> 236                                    initial_trsh=self.initial_trsh)
    237         prc = Processor(project=self.project, species_dict=self.scheduler.species_dict, output=self.scheduler.output,
    238                         use_bac=self.use_bac, model_chemistry=self.model_chemistry)
/home/dranasinghe/Software/ARC/arc/scheduler.pyc in __init__(self, project, settings, species_list, composite_method, conformer_level, opt_level, freq_level, sp_level, scan_level, fine, generate_conformers, scan_rotors, initial_trsh)
    151                 self.species_dict[species.label].generate_conformers()
    152         self.timer = True
--> 153         self.schedule_jobs()
    154 
    155     def schedule_jobs(self):
/home/dranasinghe/Software/ARC/arc/scheduler.pyc in schedule_jobs(self)
    261                             and not self.job_dict[label]['scan'][job_name].job_id in self.servers_jobs_ids:
    262                         job = self.job_dict[label]['scan'][job_name]
--> 263                         successful_server_termination = self.end_job(job=job, label=label, job_name=job_name)
    264                         if successful_server_termination:
    265                             self.check_scan_job(label=label, job=job)
/home/dranasinghe/Software/ARC/arc/scheduler.pyc in end_job(self, job, label, job_name)
    325                          fine=job.fine, software=job.software, shift=job.shift, trsh=job.trsh, memory=job.memory,
    326                          conformer=job.conformer, ess_trsh_methods=job.ess_trsh_methods, scan=job.scan,
--> 327                          pivots=job.pivots, occ=job.occ)
    328         self.running_jobs[label].pop(self.running_jobs[label].index(job_name))
    329         self.timer = False
/home/dranasinghe/Software/ARC/arc/scheduler.pyc in run_job(self, label, xyz, level_of_theory, job_type, fine, software, shift, trsh, memory, conformer, ess_trsh_methods, scan, pivots, occ)
    303                 self.job_dict[label][job_type] = dict()
    304             self.job_dict[label][job_type][job.job_name] = job
--> 305             self.job_dict[label][job_type][job.job_name].run()
    306         else:
    307             # Running a conformer job. Append differently to job_dict.
/home/dranasinghe/Software/ARC/arc/job/job.pyc in run(self)
    508             logging.info('Running job {name} for {label}'.format(name=self.job_name, label=self.species_name))
    509         logging.debug('writing submit script...')
--> 510         self.write_submit_script()
    511         logging.debug('writing input file...')
    512         self.write_input_file()
/home/dranasinghe/Software/ARC/arc/job/job.pyc in write_submit_script(self)
    285             f.write(self.submit)
    286         if self.settings['ssh']:
--> 287             self._upload_submit_file()
    288 
    289     def write_input_file(self):
/home/dranasinghe/Software/ARC/arc/job/job.pyc in _upload_submit_file(self)
    483         ssh.send_command_to_server(command='mkdir -p {0}'.format(self.remote_path))
    484         remote_file_path = os.path.join(self.remote_path, submit_filename[servers[self.server]['cluster_soft']])
--> 485         ssh.upload_file(remote_file_path=remote_file_path, file_string=self.submit)
    486 
    487     def _upload_input_file(self):
/home/dranasinghe/Software/ARC/arc/job/ssh.pyc in upload_file(self, remote_file_path, local_file_path, file_string)
     64             raise InputError('Cannot upload a non-existing file.'
     65                              ' Check why file in path {0} is missing.'.format(local_file_path))
---> 66         sftp, ssh = self.connect()
     67         with sftp.open(remote_file_path, "w") as f_remote:
     68             if file_string:
/home/dranasinghe/Software/ARC/arc/job/ssh.pyc in connect(self)
    168         ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    169         ssh.load_system_host_keys(filename=self.key)
--> 170         ssh.connect(hostname=self.address, username=self.un)
    171         sftp = ssh.open_sftp()
    172         return sftp, ssh
/home/dranasinghe/anaconda2/envs/rmg_env/lib/python2.7/site-packages/paramiko/client.pyc in connect(self, hostname, port, username, password, pkey, key_filename, timeout, allow_agent, look_for_keys, compress, sock, gss_auth, gss_kex, gss_deleg_creds, gss_host, banner_timeout, auth_timeout, gss_trust_dns, passphrase)
    332             errors = {}
    333             # Try multiple possible address families (e.g. IPv4 vs IPv6)
--> 334             to_try = list(self._families_and_addresses(hostname, port))
    335             for af, addr in to_try:
    336                 try:
/home/dranasinghe/anaconda2/envs/rmg_env/lib/python2.7/site-packages/paramiko/client.pyc in _families_and_addresses(self, hostname, port)
    202         guess = True
    203         addrinfos = socket.getaddrinfo(
--> 204             hostname, port, socket.AF_UNSPEC, socket.SOCK_STREAM
    205         )
    206         for (family, socktype, proto, canonname, sockaddr) in addrinfos:
gaierror: [Errno -2] Name or service not known

or:

Traceback (most recent call last):
  File "/home/alongd/Code/ARC//ARC.py", line 95, in <module>
    main()
  File "/home/alongd/Code/ARC//ARC.py", line 78, in main
    arc_object.execute()
  File "/home/alongd/Code/ARC/arc/main.py", line 413, in execute
    project_directory=self.project_directory)
  File "/home/alongd/Code/ARC/arc/scheduler.py", line 175, in __init__
    self.schedule_jobs()
  File "/home/alongd/Code/ARC/arc/scheduler.py", line 253, in schedule_jobs
    successful_server_termination = self.end_job(job=job, label=label, job_name=job_name)
  File "/home/alongd/Code/ARC/arc/scheduler.py", line 349, in end_job
    pivots=job.pivots, occ=job.occ)
  File "/home/alongd/Code/ARC/arc/scheduler.py", line 327, in run_job
    self.job_dict[label][job_type][job.job_name].run()
  File "/home/alongd/Code/ARC/arc/job/job.py", line 507, in run
    self.write_submit_script()
  File "/home/alongd/Code/ARC/arc/job/job.py", line 284, in write_submit_script
    self._upload_submit_file()
  File "/home/alongd/Code/ARC/arc/job/job.py", line 482, in _upload_submit_file
    ssh.upload_file(remote_file_path=remote_file_path, file_string=self.submit)
  File "/home/alongd/Code/ARC/arc/job/ssh.py", line 66, in upload_file
    sftp, ssh = self.connect()
  File "/home/alongd/Code/ARC/arc/job/ssh.py", line 180, in connect
    ssh.connect(hostname=self.address, username=self.un)
  File "/home/alongd/anaconda2/envs/rmg_env/lib/python2.7/site-packages/paramiko/client.py", line 334, in connect
    to_try = list(self._families_and_addresses(hostname, port))
  File "/home/alongd/anaconda2/envs/rmg_env/lib/python2.7/site-packages/paramiko/client.py", line 204, in _families_and_addresses
    hostname, port, socket.AF_UNSPEC, socket.SOCK_STREAM
socket.gaierror: [Errno -2] Name or service not known

or:

---------------------------------------------------------------------------
gaierror                                  Traceback (most recent call last)
<ipython-input-3-580c043d58f4> in <module>()
      3 #                     fine=True, generate_conformers=True, scan_rotors=True, use_bac=True, model_chemistry
      4 arc0 = arc.ARC(project='ArcDemo', rmg_species_list=rmg_species_list, arc_species_list=arc_species_list)
----> 5 arc0.execute()

/home/alongd/Code/ARC/arc/main.pyc in execute(self)
    411                                    generate_conformers=self.generate_conformers, scan_rotors=self.scan_rotors,
    412                                    initial_trsh=self.initial_trsh, restart_dict=self.restart_dict,
--> 413                                    project_directory=self.project_directory)
    414         prc = Processor(project=self.project, species_dict=self.scheduler.species_dict, output=self.scheduler.output,
    415                         use_bac=self.use_bac, model_chemistry=self.model_chemistry)

/home/alongd/Code/ARC/arc/scheduler.pyc in __init__(self, project, settings, species_list, composite_method, conformer_level, opt_level, freq_level, sp_level, scan_level, project_directory, fine, generate_conformers, scan_rotors, initial_trsh, restart_dict)
    173                 self.species_dict[species.label].generate_conformers()
    174         self.timer = True
--> 175         self.schedule_jobs()
    176 
    177     def schedule_jobs(self):

/home/alongd/Code/ARC/arc/scheduler.pyc in schedule_jobs(self)
    180         """
    181         if self.generate_conformers:
--> 182             self.run_conformer_jobs()
    183         while self.running_jobs != {}:  # loop while jobs are still running
    184             logging.debug('Currently running jobs:\n{0}'.format(self.running_jobs))

/home/alongd/Code/ARC/arc/scheduler.pyc in run_conformer_jobs(self)
    370                     for i, xyz in enumerate(self.species_dict[label].conformers):
    371                         self.run_job(label=label, xyz=xyz, level_of_theory=self.conformer_level, job_type='conformer',
--> 372                                      conformer=i)
    373                 else:
    374                     if 'opt' not in self.job_dict[label] and 'composite' not in self.job_dict[label]\

/home/alongd/Code/ARC/arc/scheduler.pyc in run_job(self, label, xyz, level_of_theory, job_type, fine, software, shift, trsh, memory, conformer, ess_trsh_methods, scan, pivots, occ)
    330             self.running_jobs[label].append('conformer{0}'.format(conformer))  # mark as a running job
    331             self.job_dict[label]['conformers'][conformer] = job  # save job object
--> 332             self.job_dict[label]['conformers'][conformer].run()  # run the job
    333         if job.server not in self.servers:
    334             self.servers.append(job.server)

/home/alongd/Code/ARC/arc/job/job.pyc in run(self)
    507         self.write_submit_script()
    508         logging.debug('writing input file...')
--> 509         self.write_input_file()
    510         if self.settings['ssh']:
    511             ssh = SSH_Client(self.server)

/home/alongd/Code/ARC/arc/job/job.pyc in write_input_file(self)
    474             f.write(self.input)
    475         if self.settings['ssh']:
--> 476             self._upload_input_file()
    477 
    478     def _upload_submit_file(self):

/home/alongd/Code/ARC/arc/job/job.pyc in _upload_input_file(self)
    486         ssh.send_command_to_server(command='mkdir -p {0}'.format(self.remote_path))
    487         remote_file_path = os.path.join(self.remote_path, input_filename[self.software])
--> 488         ssh.upload_file(remote_file_path=remote_file_path, file_string=self.input)
    489 
    490     def _download_output_file(self):

/home/alongd/Code/ARC/arc/job/ssh.pyc in upload_file(self, remote_file_path, local_file_path, file_string)
     64             raise InputError('Cannot upload a non-existing file.'
     65                              ' Check why file in path {0} is missing.'.format(local_file_path))
---> 66         sftp, ssh = self.connect()
     67         with sftp.open(remote_file_path, "w") as f_remote:
     68             if file_string:

/home/alongd/Code/ARC/arc/job/ssh.pyc in connect(self)
    178             # This sometimes gives "SSHException: Error reading SSH protocol banner[Error 104] Connection reset by peer"
    179             # Try again:
--> 180             ssh.connect(hostname=self.address, username=self.un)
    181         sftp = ssh.open_sftp()
    182         return sftp, ssh

/home/alongd/anaconda2/envs/rmg_env/lib/python2.7/site-packages/paramiko/client.pyc in connect(self, hostname, port, username, password, pkey, key_filename, timeout, allow_agent, look_for_keys, compress, sock, gss_auth, gss_kex, gss_deleg_creds, gss_host, banner_timeout, auth_timeout, gss_trust_dns, passphrase)
    332             errors = {}
    333             # Try multiple possible address families (e.g. IPv4 vs IPv6)
--> 334             to_try = list(self._families_and_addresses(hostname, port))
    335             for af, addr in to_try:
    336                 try:

/home/alongd/anaconda2/envs/rmg_env/lib/python2.7/site-packages/paramiko/client.pyc in _families_and_addresses(self, hostname, port)
    202         guess = True
    203         addrinfos = socket.getaddrinfo(
--> 204             hostname, port, socket.AF_UNSPEC, socket.SOCK_STREAM
    205         )
    206         for (family, socktype, proto, canonname, sockaddr) in addrinfos:

gaierror: [Errno -3] Temporary failure in name resolution
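
All three tracebacks die inside paramiko's connect with a transient DNS failure. A generic mitigation is to wrap the connection attempt in a retry loop with exponential backoff; this is a sketch of the pattern, not ARC's actual SSH code, and the retry counts/delays are placeholder assumptions.

```python
import socket
import time

def connect_with_retry(connect, attempts=5, base_delay=30):
    """Retry a flaky network call with exponential backoff so a transient
    DNS failure (socket.gaierror is a subclass of OSError) doesn't kill
    the whole run. Re-raises only after the last attempt."""
    for i in range(attempts):
        try:
            return connect()
        except OSError:
            if i == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** i)
```

In ARC this would wrap `ssh.connect(hostname=..., username=...)` so a momentary loss of connectivity pauses the scheduler instead of crashing it.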

ARC creates two sets of calculations in the input file

When ARC was running a single-point energy calculation, it created input files like the one below.

***,name
memory,224,m;
geometry={angstrom;
I       0.00000000    0.00000000    0.00000000
}

basis=cc-pVDZ
int;
{hf;wf,spin=1,charge=0;}
uccsd(t);

basis=6-311g**
int;
{hf;
maxit,1000;
wf,spin=1,charge=0;}

uccsd(t);


---;

Since I set ccsd(t)/6-311g**, I can understand the bottom block, but I cannot figure out where the cc-pVDZ block came from.
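
The duplicated block looks like a default template (cc-pVDZ) being emitted alongside the user-requested level of theory. A builder that renders exactly one method/basis block per input rules this out by construction; the helper below is hypothetical, not ARC's job module.

```python
def molpro_sp_input(method, basis, xyz, spin=1, charge=0, memory_mw=224):
    """Render a Molpro single-point input with exactly one method/basis
    block, so a leftover default (e.g. cc-pVDZ) can never be emitted
    alongside the user-requested level of theory. Hypothetical helper."""
    return ('***,name\n'
            'memory,{mem},m;\n'
            'geometry={{angstrom;\n'
            '{xyz}\n'
            '}}\n\n'
            'basis={basis}\n'
            'int;\n'
            '{{hf;\n'
            'maxit,1000;\n'
            'wf,spin={spin},charge={charge};}}\n\n'
            '{method};\n\n'
            '---;\n').format(mem=memory_mw, xyz=xyz, basis=basis,
                             spin=spin, charge=charge, method=method)

inp = molpro_sp_input('uccsd(t)', '6-311g**',
                      'I       0.00000000    0.00000000    0.00000000')
print(inp.count('basis='))  # 1
```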

Add option to only perform a fine optimization

Is your feature request related to a problem? Please describe.
Usually, a user needs ARC to start from the very beginning: generate a plausible conformer for the molecule, then use progressively higher levels of theory to optimize that geometry.

However, this is not always the case. For example, it is possible that the user has a very good geometry for the molecule already (for example, perhaps they have already performed a calculation for this species using a good DFT functional, but want to compare with another method). In this case, the user does not want to waste any computational time by starting from the very beginning.

ARC is fairly accommodating here. For example, you can tell ARC not to perform conformer searches. One thing ARC does not currently let you do, though, is bypass the initial optimization step before performing a fine optimization. Ideally, the user should be able to tell ARC to go directly to the fine optimization when they think the supplied geometry is already very close.

Describe the solution you'd like
I would like to add a keyword to ARC's input file called fine_only that, if True, will only perform a fine optimization.

Additional context
I am planning to implement this myself, but I would appreciate guidance on which parts of ARC will likely need changing, as well as comments on how this feature should be implemented to align with the vision for ARC going forward (if it is a feature that should be implemented at all).
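
The scheduling change being proposed can be sketched as a small decision function. This is an illustration of the intended behavior only, not ARC's scheduler API; the job names and the fine_only keyword are assumptions from this feature request.

```python
def plan_opt_jobs(fine=True, fine_only=False, generate_conformers=True):
    """Decide which geometry jobs to run. `fine_only` (the proposed
    keyword) skips conformer generation and the coarse optimization,
    jumping straight to a fine optimization of the supplied geometry."""
    if fine_only:
        return ['opt_fine']
    jobs = (['conformers'] if generate_conformers else []) + ['opt']
    if fine:
        jobs.append('opt_fine')
    return jobs

print(plan_opt_jobs(fine_only=True))  # ['opt_fine']
```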

Bug: ARC accepts non-isomorphic 2D and 3D when defining a species

Should raise an error instead.

The following input was allowed (6 H's vs. 5 H's):

species:
- label: bOOH
  multiplicity: 1
  directed_rotors:
    ess:
    - - - 6
        - 7
      - - 7
        - 8
  smiles: c1ccccc1OO
  xyz: |-
    C       0.08059628    1.32037195   -0.29800610
    C      -1.28794158    1.26062641   -0.03029704
    C      -1.89642539    0.02969442    0.21332426
    C      -1.13859198   -1.14110023    0.18344613
    C       0.23092154   -1.08214781   -0.08393383
    C       0.84282867    0.15119681   -0.31285027
    O       2.17997981    0.29916802   -0.59736431
    O       2.90066125   -0.82056323   -0.00921949
    H       0.55201906    2.27952184   -0.49410221
    H      -1.87925130    2.17240519   -0.01581738
    H      -2.96278939   -0.01860646    0.41888241
    H      -1.61463364   -2.10195688    0.36125669
    H       0.80478689   -2.00346200   -0.12519327

Can I run only the freq job, using QChem as my only ESS?

I want to run a simple job, using QChem to analyze hydrogen at the b3lyp/6-31g* level of theory. I set up my yml file with the code below.

job_types:
    freq: true

However, the output file shows that ARC runs the opt, conformers, sp, 1d_rotors, freq, and fine jobs together. My first question is: can I run only the freq job? Below is the message from the output file.
(screenshot of the output log omitted)
And my server only has QChem as an ESS. Looking at the arc/settings.py file, line 84 shows that levels_ess is the default mapping relating levels and basis sets to a specific ESS. If I want to use QChem to analyze molecules at the b3lyp/6-31g* level of theory, it seems I have to set up levels_ess; otherwise I get KeyError: 'gaussian'. I tried to set levels_ess by typing the code below in my yml file, but I still get KeyError: 'gaussian'. Could you tell me how to run jobs using only QChem as the ESS?

levels_ess:
    qchem:
    - b3lyp
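
Based only on the keys quoted in this thread (job_types, ess_settings, levels_ess) and the job-type names printed in ARC's logs, a freq-only QChem input might look like the sketch below. Every key here is an assumption to verify against ARC's documentation; the schema may differ between versions.

```yaml
# Untested sketch -- check ARC's docs for the authoritative schema.
project: hydrogen_freq_only
job_types:          # assumed: switch off everything except freq
    conformers: false
    opt: false
    fine: false
    freq: true
    sp: false
    1d_rotors: false
ess_settings:       # only QChem, running locally
    qchem:
    - local
levels_ess:         # route b3lyp jobs to QChem so ARC never looks up 'gaussian'
    qchem:
    - b3lyp
```
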

SP calculation attempt during composite jobs

There seems to be some path in a composite job (at least a restarted composite job) that causes it to attempt an sp calculation, which automatically triggers an error. It might be related to ARC's troubleshooting recipes.

After restart:

Traceback (most recent call last):
  File "../../ARC.py", line 74, in <module>
    main()
  File "../../ARC.py", line 68, in main
    arc_object.execute()
  File "/Users/mattjohnson/RMGCODE/ARC/arc/main.py", line 619, in execute
    adaptive_levels=self.adaptive_levels)
  File "/Users/mattjohnson/RMGCODE/ARC/arc/scheduler.py", line 327, in __init__
    self.run_sp_job(species.label)
  File "/Users/mattjohnson/RMGCODE/ARC/arc/scheduler.py", line 716, in run_sp_job
    label))
arc.arc_exceptions.SchedulerError: run_sp_job() was called for O=COCOO which has a composite method level of theory

SettingsError: The submit_command is erroneous

I tried to run a simple job (the script is in #194). Before running, I modified lines 51~53 and 87 in arc/settings.py and line 198 in arc/job/submit.py. I typed which sbatch on the server, and I am sure that the sbatch path required in the submit_command dictionary (line 94 in arc/settings.py) is correct. Do you know how to solve this error?
(screenshot of the SettingsError omitted)

Getting `KeyError: 'memory'` when ARC tries to troubleshoot jobs

Traceback (most recent call last)
<ipython-input-3-0d19388948f2> in <module>
     29 ess_settings = {'gaussian': 'local', 'molpro': 'pharos', 'qchem': 'pharos'}
     30 arc0 = arc.ARC(project='phenyl_radical_recombination_2ndtry_022220', arc_species_list=arc_species_list, ess_settings=ess_settings)
---> 31 arc0.execute()

~/ARC/arc/main.py in execute(self)
    549                                    max_job_time=self.max_job_time, allow_nonisomorphic_2d=self.allow_nonisomorphic_2d,
    550                                    memory=self.memory, adaptive_levels=self.adaptive_levels,
--> 551                                    confs_to_dft=self.confs_to_dft, dont_gen_confs=self.dont_gen_confs)
    552 
    553         save_yaml_file(path=os.path.join(self.project_directory, 'output', 'status.yml'), content=self.scheduler.output)

~/ARC/arc/scheduler.py in __init__(self, project, ess_settings, species_list, project_directory, composite_method, conformer_level, opt_level, freq_level, sp_level, scan_level, ts_guess_level, irc_level, orbitals_level, adaptive_levels, rmgdatabase, job_types, job_additional_options, solvent, job_shortcut_keywords, rxn_list, bath_gas, restart_dict, max_job_time, allow_nonisomorphic_2d, memory, testing, dont_gen_confs, confs_to_dft)
    420         self.timer = True
    421         if not self.testing:
--> 422             self.schedule_jobs()
    423 
    424     def schedule_jobs(self):

~/ARC/arc/scheduler.py in schedule_jobs(self)
    503                         successful_server_termination = self.end_job(job=job, label=label, job_name=job_name)
    504                         if successful_server_termination:
--> 505                             self.check_sp_job(label=label, job=job)
    506                         self.timer = False
    507                         break

~/ARC/arc/scheduler.py in check_sp_job(self, label, job)
   2057                 self.output[label]['paths']['geo'] = job.local_path_to_output_file
   2058         else:
-> 2059             self.troubleshoot_ess(label=label, job=job, level_of_theory=job.job_level_of_theory_dict)
   2060 
   2061     def post_sp_actions(self, label, job):

~/ARC/arc/scheduler.py in troubleshoot_ess(self, label, job, level_of_theory, conformer)
   2595                          num_heavy_atoms=self.species_dict[label].number_of_heavy_atoms, software=job.software,
   2596                          fine=job.fine, memory_gb=job.total_job_memory_gb, cpu_cores=job.cpu_cores,
-> 2597                          ess_trsh_methods=job.ess_trsh_methods, available_ess=list(self.ess_settings.keys()))
   2598         for output_error in output_errors:
   2599             self.output[label]['errors'] += output_error

~/ARC/arc/job/trsh.py in trsh_ess_job(label, level_of_theory_dict, server, job_status, job_type, software, fine, memory_gb, num_heavy_atoms, cpu_cores, ess_trsh_methods, available_ess, is_h)
    787             # Increase memory allocation, also run with a shift
    788             ess_trsh_methods.append('memory')
--> 789             memory = servers[server]['memory']  # set memory to the value of an entire node (in GB)
    790             logger.info('Troubleshooting {type} job in {software} for {label} using memory: {mem:.2f} GB instead of '
    791                         '{old} GB'.format(type=job_type, software=software, mem=memory, old=memory_gb,

KeyError: 'memory'
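
The crash comes from an unconditional servers[server]['memory'] lookup. A defensive dict.get() with a fallback would let troubleshooting proceed when a server definition lacks a 'memory' entry; this sketch is hypothetical, and the 32 GB default is an arbitrary placeholder.

```python
def node_memory_gb(server_info, current_gb, default_gb=32):
    """Troubleshooting step that bumps a job's memory toward a full node.
    Uses dict.get() with a fallback so a server definition lacking a
    'memory' entry (the KeyError above) degrades gracefully instead of
    crashing. The 32 GB default is an arbitrary placeholder."""
    return server_info.get('memory', max(current_gb, default_gb))

print(node_memory_gb({'address': 'server.host.edu'}, current_gb=14))  # 32
```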

ts yaml from ARC jobs

Currently ARC can't generate TS yaml files, and even once arkane_ts_yaml is merged into RMG-Py it still won't be able to, because it doesn't provide Arkane with proper Molecule objects for each reactant and product.

"Error: Re-submitting job" when running opt jobs in QChem

I tried to submit an opt job and got a Re-submitting job error. However, when I run an sp job, this error doesn't occur. Below is the trace message.

ARC execution initiated on Thu Sep 19 06:53:09 2019

###############################################################
#                                                             #
#                 Automatic Rate Calculator                   #
#                            ARC                              #
#                                                             #
#   Version: 1.1.0                                            #
#                                                             #
###############################################################

The current git HEAD for ARC is:
    9fc5c7722322bc0a0b74c82652cc8401f8fa3fae
    Mon Sep 16 22:39:52 2019 -0400


Starting project hydrogen

Considering the following job types: ['opt']


Using the following ESS settings:
{'qchem': ['local']}



Warning: Not using a fine grid for geometry optimization jobs




Warning: Not running rotor scans. This might compromise finding the best conformer, as dihedral angles won't be corrected. Also, entropy won't be accurate.


Using b3lyp/6-31g* for refined conformer searches (after filtering via force fields)
Using b3lyp/6-31g* for TS guesses comparison of different methods
Using b3lyp/6-31g* for geometry optimizations
Using b3lyp/6-31g* for frequency calculations
Using b3lyp/6-31g* for single point calculations
Using b3lyp/6-31g* for rotor scans



Warning: Could not determine appropriate Model Chemistry to be used in Arkane for thermochemical parameter calculations.
Not using atom energy corrections and bond additivity corrections!


Using b3lyp/6-31g*//b3lyp/6-31g* as a model chemistry in Arkane
Warning: :root:No frequency scaling factor found for model chemistry b3lyp/6-31g*. Assuming a value of unity. This will affect the partition function and all quantities derived from it (thermo quantities and rate coefficients).
Could not determine the harmonic frequencies scaling factor for b3lyp/6-31g* from Arkane.
Not calculating it, assuming a frequencies scaling factor of 1.


Considering species: hydrogen
<Molecule "[H][H]">


Only one conformer is available for species hydrogen, using it as initial xyz
The only conformer for species hydrogen was found to be isomorphic with the 2D graph representation [H][H]

Running job opt_a72 for hydrogen
  Ending job opt_a72 for hydrogen (run time: 0:00:22)


Warning: Troubleshooting hydrogen job opt_a72 which failed with status "errored" with keywords [u'Unknown'] in qchem. The error "QChem job terminated for an unknown reason." was derived from the following line in the log file: "".
Error: Re-submitting job opt_a72 on local
Traceback (most recent call last):
  File "/global/homes/s/shcheng/ARC/ARC.py", line 76, in <module>
    main()
  File "/global/homes/s/shcheng/ARC/ARC.py", line 70, in main
    arc_object.execute()
  File "/global/u2/s/shcheng/ARC/arc/main.py", line 718, in execute
    dont_gen_confs=self.dont_gen_confs)
  File "/global/u2/s/shcheng/ARC/arc/scheduler.py", line 391, in __init__
    self.schedule_jobs()
  File "/global/u2/s/shcheng/ARC/arc/scheduler.py", line 454, in schedule_jobs
    success = self.parse_opt_geo(label=label, job=job)
  File "/global/u2/s/shcheng/ARC/arc/scheduler.py", line 1759, in parse_opt_geo
    self.troubleshoot_opt_jobs(label=label)
  File "/global/u2/s/shcheng/ARC/arc/scheduler.py", line 2306, in troubleshoot_opt_jobs
    self.troubleshoot_ess(label=label, job=job, level_of_theory=self.opt_level)
  File "/global/u2/s/shcheng/ARC/arc/scheduler.py", line 2334, in troubleshoot_ess
    job.troubleshoot_server()
  File "/global/u2/s/shcheng/ARC/arc/job/job.py", line 1118, in troubleshoot_server
    server_nodes=self.server_nodes)
  File "/global/u2/s/shcheng/ARC/arc/job/trsh.py", line 708, in trsh_job_on_server
    ssh = SSHClient(server)
  File "/global/u2/s/shcheng/ARC/arc/job/ssh.py", line 49, in __init__
    self.address = servers[server]['address']
KeyError: u'address'

Atom ordering by xyz is buggy

The following code does not yield a molecule with the same atom order as the xyz:

xyz5 = str_to_xyz(""" C                  1.50048866   -0.50848248   -0.64006761
F                  0.27568368    0.01156702   -0.41224910
Cl                 3.09727149   -1.18647281   -0.93707550
O                  1.93561914    0.74741610   -1.16759033
O                  2.33727805    1.90670710   -1.65453438
H                  3.25580294    1.83954553   -1.92546105""")

spc5 = ARCSpecies(label='chiralOOH', smiles='OO[C](F)Cl', xyz=xyz5)
for atom, symbol in zip(spc5.mol.atoms, xyz5['symbols']):
    self.assertEqual(atom.symbol, symbol)

The problem is in converter.molecules_from_xyz(), and specifically in converter.order_atoms() where the if mol_is_iso_copy.is_isomorphic(ref_mol_is_iso_copy, save_order=True, strict=False): check gives False: One mol has the radical, the other doesn't.
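
The element-wise reordering itself is simple, as the sketch below shows; this is not converter.order_atoms(), and a real fix must also verify connectivity, since symbol order alone is ambiguous whenever an element appears more than once.

```python
def order_by_symbols(xyz_symbols, mol_symbols):
    """Return indices that reorder the mol atoms to match the xyz symbol
    order (stable per element). A fallback sketch for when strict graph
    isomorphism fails, e.g. because radical perception differs between
    the two representations. Symbol-only: does not check connectivity."""
    pools = {}
    for i, symbol in enumerate(mol_symbols):
        pools.setdefault(symbol, []).append(i)
    return [pools[symbol].pop(0) for symbol in xyz_symbols]

# mol atom order Cl, C, H vs. xyz order C, Cl, H:
print(order_by_symbols(['C', 'Cl', 'H'], ['Cl', 'C', 'H']))  # [1, 0, 2]
```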

KeyError: 'TS0' when running examples/rates_demo/input.yml

I ran examples/rates_demo/input.yml as a test, and it raised a KeyError. Below is the trace message. Because I saw self.output[species.label]['restart'] += 'Restarted ARC at {0}; '.format( followed by KeyError: u'TS0', I printed self.output and species.label, which appear after the trace message. Indeed, the key TS0 is not in the self.output dictionary for H2O2.
By the way, it seems there is no problem for some molecules, such as N2H4, NH, N2H3, and NH2 labeled in input.yml. I also tested the dehydrogenation of ethane (ethane <=> ethene + hydrogen), and it raised KeyError: u'TS0' as well while finding rotors for ethane.

Using Theano backend.
ARC execution initiated on Wed Sep 18 09:07:26 2019
###############################################################
#                                                             #
#                 Automatic Rate Calculator                   #
#                            ARC                              #
#                                                             #
#   Version: 1.1.0                                            #
#                                                             #
###############################################################
The current git HEAD for ARC is:
    ccdb557aa3f369b0de24da9a84ce9f9c3ad6dbb8
    Sun Sep 15 21:43:52 2019 -0400


Starting project demo

Considering the following job types: ['opt', '1d_rotors', 'freq', u'fine', 'sp', 'conformers']


Using the following ESS settings:
{'onedmin': ['server1'], 'molpro': ['server2'], 'qchem': ['server1'], 'gaussian': ['local', 'server2']}

Using default level b3lyp/6-31g(d,p) EmpiricalDispersion=GD3BJ for refined conformer searches (after filtering via force fields)
Using default level b3lyp/6-31g(d,p) EmpiricalDispersion=GD3BJ for TS guesses comparison of different methods
Using default level wb97xd/def2tzvp for geometry optimizations
Using default level wb97xd/def2tzvp for frequency calculations
Using default level ccsd(t)-f12/cc-pvtz-f12 for single point calculations
Using default level wb97xd/def2tzvp for rotor scans
Using ccsd(t)-f12/cc-pvtz-f12//wb97xd/def2tzvp as a model chemistry in Arkane


Considering species: N2H4
<Molecule "NN">
Considering species: NH
<Molecule "[NH]">
Considering species: N2H3
<Molecule "N[NH]">
Considering species: NH2
<Molecule "[NH2]">
Considering species: c4rad
<Molecule "[CH]=CC=C">
Considering species: HO2
<Molecule "[O]O">
Considering species: c4birad
<Molecule "[CH]=C[C]=C">
Considering species: H2O2
<Molecule "OO">



Loading RMG's families...



Considering reaction: c4rad + HO2 <=> c4birad + H2O2
(identified as belonging to RMG family H_Abstraction, which is its own reverse)
c4rad + HO2 <=> c4birad + H2O2
Trying to generating a TS guess for H_Abstraction reaction c4rad + HO2 <=> c4birad + H2O2 using AutoTST...
Using Theano backend.
reaction.py:169 load_databases INFO Loading RMG database from '/global/u2/s/shcheng/RMG/RMG-database/input'
thermo.py:844 loadLibraries INFO Loading thermodynamics library from primaryThermoLibrary.py in /global/u2/s/shcheng/RMG/RMG-database/input/thermo/libraries...
thermo.py:844 loadLibraries INFO Loading thermodynamics library from thermo_DFT_CCSDTF12_BAC.py in /global/u2/s/shcheng/RMG/RMG-database/input/thermo/libraries...
thermo.py:844 loadLibraries INFO Loading thermodynamics library from CBS_QB3_1dHR.py in /global/u2/s/shcheng/RMG/RMG-database/input/thermo/libraries...
thermo.py:861 loadGroups INFO Loading thermodynamics group database from /global/u2/s/shcheng/RMG/RMG-database/input/thermo/groups...
transport.py:294 loadGroups INFO Loading transport group database from /global/u2/s/shcheng/RMG/RMG-database/input/transport/groups...
statmech.py:529 loadGroups INFO Loading frequencies group database from /global/u2/s/shcheng/RMG/RMG-database/input/statmech/groups...
base.py:220 load INFO Loading transitions state family groups from /global/homes/s/shcheng/AutoTST/database/R_Addition_MultipleBond/TS_groups.py
base.py:220 load INFO Loading transitions state family groups from /global/homes/s/shcheng/AutoTST/database/H_Abstraction/TS_groups.py
base.py:220 load INFO Loading transitions state family groups from /global/homes/s/shcheng/AutoTST/database/intra_H_migration/TS_groups.py
reaction.py:341 get_labeled_reaction INFO Matched reaction to H_Abstraction family
reaction.py:341 get_labeled_reaction INFO Matched reaction to H_Abstraction family
reaction.py:250 generate_distance_data INFO Distance between *1 and *3 is too small, setting it to lower bound of uncertainty
reaction.py:255 generate_distance_data INFO The distance data is as follows: DistanceData(distances={'d12': 1.431964,'d13': 2.406448,'d23': 1.106812,}, uncertainties={'d12': 0.428770,'d13': 0.245552,'d23': 0.337577,}, comment='Matched node Cd_Cds/Cd/H ([<Entry index=195 label="Cd_Cds/Cd/H">, <Entry index=17 label="Cj">])\nMatched node OjO ([<Entry index=126 label="C/H2/Cd/Cd">, <Entry index=32 label="OjO">])\n')
Traceback (most recent call last):
  File "/global/u2/s/shcheng/ARC/arc/ts/run_autotst.py", line 92, in <module>
    main()
  File "/global/u2/s/shcheng/ARC/arc/ts/run_autotst.py", line 79, in main
    positions = reaction.ts.ase_ts.get_positions()
AttributeError: 'dict' object has no attribute 'ase_ts'
Trying to generating a TS guess for H_Abstraction reaction c4rad + HO2 <=> c4birad + H2O2 using AutoTST in the reverse direction...
Using Theano backend.
reaction.py:169 load_databases INFO Loading RMG database from '/global/u2/s/shcheng/RMG/RMG-database/input'
thermo.py:844 loadLibraries INFO Loading thermodynamics library from primaryThermoLibrary.py in /global/u2/s/shcheng/RMG/RMG-database/input/thermo/libraries...
thermo.py:844 loadLibraries INFO Loading thermodynamics library from thermo_DFT_CCSDTF12_BAC.py in /global/u2/s/shcheng/RMG/RMG-database/input/thermo/libraries...
thermo.py:844 loadLibraries INFO Loading thermodynamics library from CBS_QB3_1dHR.py in /global/u2/s/shcheng/RMG/RMG-database/input/thermo/libraries...
thermo.py:861 loadGroups INFO Loading thermodynamics group database from /global/u2/s/shcheng/RMG/RMG-database/input/thermo/groups...
transport.py:294 loadGroups INFO Loading transport group database from /global/u2/s/shcheng/RMG/RMG-database/input/transport/groups...
statmech.py:529 loadGroups INFO Loading frequencies group database from /global/u2/s/shcheng/RMG/RMG-database/input/statmech/groups...
base.py:220 load INFO Loading transitions state family groups from /global/homes/s/shcheng/AutoTST/database/R_Addition_MultipleBond/TS_groups.py
base.py:220 load INFO Loading transitions state family groups from /global/homes/s/shcheng/AutoTST/database/H_Abstraction/TS_groups.py
base.py:220 load INFO Loading transitions state family groups from /global/homes/s/shcheng/AutoTST/database/intra_H_migration/TS_groups.py
reaction.py:341 get_labeled_reaction INFO Matched reaction to H_Abstraction family
reaction.py:341 get_labeled_reaction INFO Matched reaction to H_Abstraction family
reaction.py:250 generate_distance_data INFO Distance between *1 and *3 is too small, setting it to lower bound of uncertainty
reaction.py:255 generate_distance_data INFO The distance data is as follows: DistanceData(distances={'d12': 1.108432,'d13': 2.406223,'d23': 1.430548,}, uncertainties={'d12': 0.337116,'d13': 0.246053,'d23': 0.428882,}, comment='Matched node OOH ([<Entry index=29 label="OOH">, <Entry index=261 label="Cdj_CddH">])\nMatched node Cdj_CdsCd ([<Entry index=29 label="OOH">, <Entry index=263 label="Cdj_CdsCd">])\n')
Traceback (most recent call last):
  File "/global/u2/s/shcheng/ARC/arc/ts/run_autotst.py", line 92, in <module>
    main()
  File "/global/u2/s/shcheng/ARC/arc/ts/run_autotst.py", line 79, in main
    positions = reaction.ts.ase_ts.get_positions()
AttributeError: 'dict' object has no attribute 'ase_ts'



Considering reaction: N2H4 + NH <=> N2H3 + NH2
(identified as belonging to RMG family H_Abstraction, which is its own reverse)
N2H4 + NH <=> N2H3 + NH2
Setting multiplicity of reaction N2H4 + NH <=> N2H3 + NH2 to 3
Trying to generating a TS guess for H_Abstraction reaction N2H4 + NH <=> N2H3 + NH2 using AutoTST...
Using Theano backend.
reaction.py:169 load_databases INFO Loading RMG database from '/global/u2/s/shcheng/RMG/RMG-database/input'
thermo.py:844 loadLibraries INFO Loading thermodynamics library from primaryThermoLibrary.py in /global/u2/s/shcheng/RMG/RMG-database/input/thermo/libraries...
thermo.py:844 loadLibraries INFO Loading thermodynamics library from thermo_DFT_CCSDTF12_BAC.py in /global/u2/s/shcheng/RMG/RMG-database/input/thermo/libraries...
thermo.py:844 loadLibraries INFO Loading thermodynamics library from CBS_QB3_1dHR.py in /global/u2/s/shcheng/RMG/RMG-database/input/thermo/libraries...
thermo.py:861 loadGroups INFO Loading thermodynamics group database from /global/u2/s/shcheng/RMG/RMG-database/input/thermo/groups...
transport.py:294 loadGroups INFO Loading transport group database from /global/u2/s/shcheng/RMG/RMG-database/input/transport/groups...
statmech.py:529 loadGroups INFO Loading frequencies group database from /global/u2/s/shcheng/RMG/RMG-database/input/statmech/groups...
base.py:220 load INFO Loading transitions state family groups from /global/homes/s/shcheng/AutoTST/database/R_Addition_MultipleBond/TS_groups.py
base.py:220 load INFO Loading transitions state family groups from /global/homes/s/shcheng/AutoTST/database/H_Abstraction/TS_groups.py
base.py:220 load INFO Loading transitions state family groups from /global/homes/s/shcheng/AutoTST/database/intra_H_migration/TS_groups.py
Traceback (most recent call last):
  File "/global/u2/s/shcheng/ARC/arc/ts/run_autotst.py", line 92, in <module>
    main()
  File "/global/u2/s/shcheng/ARC/arc/ts/run_autotst.py", line 79, in main
    positions = reaction.ts.ase_ts.get_positions()
  File "/global/homes/s/shcheng/AutoTST/autotst/reaction.py", line 118, in ts
    for direction, complex in self.get_rmg_complexes().items():
  File "/global/homes/s/shcheng/AutoTST/autotst/reaction.py", line 517, in get_rmg_complexes
    self.get_labeled_reaction()
  File "/global/homes/s/shcheng/AutoTST/autotst/reaction.py", line 333, in get_labeled_reaction
    test_reaction.reactants, test_reaction.products)
  File "/global/homes/s/shcheng/RMG/RMG-Py/rmgpy/data/kinetics/family.py", line 2592, in getLabeledReactantsAndProducts
    mappingsB = self.__matchReactantToTemplate(moleculeB, template.reactants[1].item)
IndexError: list index out of range
Trying to generating a TS guess for H_Abstraction reaction N2H4 + NH <=> N2H3 + NH2 using AutoTST in the reverse direction...
Using Theano backend.
reaction.py:169 load_databases INFO Loading RMG database from '/global/u2/s/shcheng/RMG/RMG-database/input'
thermo.py:844 loadLibraries INFO Loading thermodynamics library from primaryThermoLibrary.py in /global/u2/s/shcheng/RMG/RMG-database/input/thermo/libraries...
thermo.py:844 loadLibraries INFO Loading thermodynamics library from thermo_DFT_CCSDTF12_BAC.py in /global/u2/s/shcheng/RMG/RMG-database/input/thermo/libraries...
thermo.py:844 loadLibraries INFO Loading thermodynamics library from CBS_QB3_1dHR.py in /global/u2/s/shcheng/RMG/RMG-database/input/thermo/libraries...
thermo.py:861 loadGroups INFO Loading thermodynamics group database from /global/u2/s/shcheng/RMG/RMG-database/input/thermo/groups...
transport.py:294 loadGroups INFO Loading transport group database from /global/u2/s/shcheng/RMG/RMG-database/input/transport/groups...
statmech.py:529 loadGroups INFO Loading frequencies group database from /global/u2/s/shcheng/RMG/RMG-database/input/statmech/groups...
base.py:220 load INFO Loading transitions state family groups from /global/homes/s/shcheng/AutoTST/database/R_Addition_MultipleBond/TS_groups.py
base.py:220 load INFO Loading transitions state family groups from /global/homes/s/shcheng/AutoTST/database/H_Abstraction/TS_groups.py
base.py:220 load INFO Loading transitions state family groups from /global/homes/s/shcheng/AutoTST/database/intra_H_migration/TS_groups.py
Traceback (most recent call last):
  File "/global/u2/s/shcheng/ARC/arc/ts/run_autotst.py", line 92, in <module>
    main()
  File "/global/u2/s/shcheng/ARC/arc/ts/run_autotst.py", line 79, in main
    positions = reaction.ts.ase_ts.get_positions()
  File "/global/homes/s/shcheng/AutoTST/autotst/reaction.py", line 118, in ts
    for direction, complex in self.get_rmg_complexes().items():
  File "/global/homes/s/shcheng/AutoTST/autotst/reaction.py", line 517, in get_rmg_complexes
    self.get_labeled_reaction()
  File "/global/homes/s/shcheng/AutoTST/autotst/reaction.py", line 333, in get_labeled_reaction
    test_reaction.reactants, test_reaction.products)
  File "/global/homes/s/shcheng/RMG/RMG-Py/rmgpy/data/kinetics/family.py", line 2592, in getLabeledReactantsAndProducts
    mappingsB = self.__matchReactantToTemplate(moleculeB, template.reactants[1].item)
IndexError: list index out of range

Found one possible rotor for N2H4
Pivot list(s) for N2H4: [[1, 2]]


Found one possible rotor for N2H3
Pivot list(s) for N2H3: [[1, 2]]


Found one possible rotor for c4rad
Pivot list(s) for c4rad: [[1, 2]]


Found 2 possible rotors for c4birad
Pivot list(s) for c4birad: [[1, 3], [1, 4]]


Found one possible rotor for H2O2
Pivot list(s) for H2O2: [[1, 2]]

Traceback (most recent call last):
  File "/global/homes/s/shcheng/ARC/ARC.py", line 76, in <module>
    main()
  File "/global/homes/s/shcheng/ARC/ARC.py", line 70, in main
    arc_object.execute()
  File "/global/u2/s/shcheng/ARC/arc/main.py", line 718, in execute
    dont_gen_confs=self.dont_gen_confs)
  File "/global/u2/s/shcheng/ARC/arc/scheduler.py", line 316, in __init__
    self.output[species.label]['restart'] += 'Restarted ARC at {0}; '.format(
KeyError: u'TS0'

The printed self.output and species.label are:

self.output is: {'NH': {u'info': u'', u'paths': {u'composite': u'', u'freq': u'', u'sp': u'', u'geo': u''},
 u'isomorphism': u'', u'warnings': u'', u'errors': u'', u'job_types': {'opt': False, 'bde': True, 
u'onedmin': False, u'composite': False, u'fine': False, 'sp': False, '1d_rotors': True, 'orbitals': False, 
'freq': False, 'conformers': False}, u'convergence': False, u'conformers': u'', u'restart': u'Restarted 
ARC at 2019-09-18 09:44:37.264770; '}, 'c4birad': {u'info': u'', u'paths': {u'composite': u'', u'freq': 
u'', u'sp': u'', u'geo': u''}, u'isomorphism': u'', u'warnings': u'', u'errors': u'', u'job_types': {'opt': False, 
'bde': True, u'onedmin': False, u'composite': False, u'fine': False, 'sp': False, '1d_rotors': True, 
'orbitals': False, 'freq': False, 'conformers': False}, u'convergence': False, u'conformers': u'', 
u'restart': u'Restarted ARC at 2019-09-18 09:44:37.265836; '}, 'H2O2': {u'info': u'', u'paths': 
{u'composite': u'', u'freq': u'', u'sp': u'', u'geo': u''}, u'isomorphism': u'', u'warnings': u'', u'errors': 
u'', u'job_types': {'opt': False, 'bde': True, u'onedmin': False, u'composite': False, u'fine': False, 'sp': 
False, '1d_rotors': True, 'orbitals': False, 'freq': False, 'conformers': False}, u'convergence': False, 
u'conformers': u'', u'restart': u'Restarted ARC at 2019-09-18 09:44:37.266186; '}, 'c4rad': {u'info': 
u'', u'paths': {u'composite': u'', u'freq': u'', u'sp': u'', u'geo': u''}, u'isomorphism': u'', u'warnings': u'', 
u'errors': u'', u'job_types': {'opt': False, 'bde': True, u'onedmin': False, u'composite': False, u'fine': 
False, 'sp': False, '1d_rotors': True, 'orbitals': False, 'freq': False, 'conformers': False}, u'convergence': 
False, u'conformers': u'', u'restart': u'Restarted ARC at 2019-09-18 09:44:37.265421; '}, 'N2H4': 
{u'info': u'', u'paths': {u'composite': u'', u'freq': u'', u'sp': u'', u'geo': u''}, u'isomorphism': u'', 
u'warnings': u'', u'errors': u'', u'job_types': {'opt': False, 'bde': True, u'onedmin': False, u'composite': 
False, u'fine': False, 'sp': False, '1d_rotors': True, 'orbitals': False, 'freq': False, 'conformers': False}, 
u'convergence': False, u'conformers': u'', u'restart': u'Restarted ARC at 2019-09-18 
09:44:37.264134; '}, 'HO2': {u'info': u'', u'paths': {u'composite': u'', u'freq': u'', u'sp': u'', u'geo': u''}, 
u'isomorphism': u'', u'warnings': u'', u'errors': u'', u'job_types': {'opt': False, 'bde': True, u'onedmin': 
False, u'composite': False, u'fine': False, 'sp': False, '1d_rotors': True, 'orbitals': False, 'freq': False, 
'conformers': False}, u'convergence': False, u'conformers': u'', u'restart': u'Restarted ARC at 2019-
09-18 09:44:37.265715; '}, 'NH2': {u'info': u'', u'paths': {u'composite': u'', u'freq': u'', u'sp': u'', 
u'geo': u''}, u'isomorphism': u'', u'warnings': u'', u'errors': u'', u'job_types': {'opt': False, 'bde': True, 
u'onedmin': False, u'composite': False, u'fine': False, 'sp': False, '1d_rotors': True, 'orbitals': False, 
'freq': False, 'conformers': False}, u'convergence': False, u'conformers': u'', u'restart': u'Restarted 
ARC at 2019-09-18 09:44:37.265308; '}, 'N2H3': {u'info': u'', u'paths': {u'composite': u'', u'freq': u'', 
u'sp': u'', u'geo': u''}, u'isomorphism': u'', u'warnings': u'', u'errors': u'', u'job_types': {'opt': False, 
'bde': True, u'onedmin': False, u'composite': False, u'fine': False, 'sp': False, '1d_rotors': True, 
'orbitals': False, 'freq': False, 'conformers': False}, u'convergence': False, u'conformers': u'', 
u'restart': u'Restarted ARC at 2019-09-18 09:44:37.264914; '}}

species.label is: TS0
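A possible fix (a sketch, not ARC's actual code) is to initialize a default output entry for any species label, including TS labels like 'TS0' that were created after the restart file was written, before appending the restart note:

```python
# Sketch of a defensive restart update; the default entry structure mirrors
# the printed self.output above, trimmed to the fields used here.
def default_output_entry():
    return {'restart': '', 'errors': '', 'warnings': '', 'info': '',
            'convergence': False, 'paths': {}, 'job_types': {}}

def mark_restarted(output, label, timestamp):
    # setdefault avoids the KeyError for labels missing from the restart dict
    entry = output.setdefault(label, default_output_entry())
    entry['restart'] += 'Restarted ARC at {0}; '.format(timestamp)
    return output

output = {}
mark_restarted(output, 'TS0', '2019-09-18 09:44:37')  # no KeyError
```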

run conformers as jobs

There should be an option to submit conformer generation using force fields as an ARC job, so that these can run in parallel. Useful for large molecules.
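As a rough illustration of the request (not ARC code), force-field conformer scoring could be farmed out to parallel workers, whether as separate ARC jobs or local processes:

```python
# Sketch: score conformers in parallel. ff_energy stands in for a real
# force-field call (e.g. MMFF/UFF via RDKit); here it is a dummy function.
from multiprocessing.dummy import Pool  # thread pool; a process pool works similarly

def ff_energy(conformer):
    # placeholder for a force-field energy evaluation over (x, y, z) tuples
    return sum(x * x for xyz in conformer for x in xyz)

def rank_conformers(conformers, n_workers=4):
    with Pool(n_workers) as pool:
        energies = pool.map(ff_energy, conformers)
    # return (energy, index) pairs, lowest-energy conformer first
    return sorted(zip(energies, range(len(conformers))))

conformers = [[(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
              [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0)]]
ranked = rank_conformers(conformers)  # -> [(1.0, 0), (4.0, 1)]
```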

When should a server's name be "local"?

Posted on behalf of @NtuCheRoy:

"If I just want to run my job on a server, using the QChem installed on that server as the electronic structure software, is it right to just type the code below in my yml file? Or should it be local?"

ess_settings:
  qchem:
  - server1
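To answer in context: "local" is meant for the machine ARC itself runs on. If ARC runs on the same server where QChem is installed, the setting would look like the sketch below (exact behavior may differ between ARC versions); if ARC runs elsewhere and connects over SSH, use the server's name as defined in your settings (e.g. server1), as in the snippet above.

```yaml
ess_settings:
  qchem:
  - local    # ARC and QChem run on the same machine
```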

Restarting ARC job adds extra entries to initiated_jobs.csv

When restarting an ARC job, I noticed that the job numbers jump up by a lot. When job objects are recreated for currently running jobs, they are written to the initiated_jobs.csv file. Since the job number is determined by counting lines in that file, the next new job gets a much higher number.

I think the desired behavior is to not log these recreated jobs to initiated_jobs.csv, which should resolve the job numbering automatically.
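The numbering behavior described above, and the suggested fix, can be sketched as follows (the file name comes from the issue; the columns and helper names are illustrative, not ARC's actual schema):

```python
import csv
import os
import tempfile

CSV_NAME = 'initiated_jobs.csv'

def next_job_number(project_dir):
    path = os.path.join(project_dir, CSV_NAME)
    if not os.path.isfile(path):
        return 0
    with open(path) as f:
        return sum(1 for _ in f)  # job number = number of logged jobs so far

def log_job(project_dir, job_name, restored=False):
    number = next_job_number(project_dir)
    if restored:
        return number  # suggested fix: don't re-log jobs recreated on restart
    with open(os.path.join(project_dir, CSV_NAME), 'a', newline='') as f:
        csv.writer(f).writerow([number, job_name])
    return number

d = tempfile.mkdtemp()
log_job(d, 'opt_a0')                 # -> 0
log_job(d, 'opt_a0', restored=True)  # restart: counted but not re-logged
log_job(d, 'freq_a1')                # -> 1, not 2
```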

Save an RMG dictionary for species

ARC doesn't save the 2D connectivity information for species (it only appears in the libraries after thermo is calculated). We think ARC should also output an RMG dictionary by default for each species.
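A minimal sketch of the requested behavior, assuming each species provides a label and an RMG adjacency-list string (the helper name and file layout here are hypothetical, modeled on RMG's species_dictionary.txt format, not ARC's API):

```python
import os
import tempfile

# Sketch: write an RMG-style species dictionary (label followed by its
# adjacency list, with blank-line separators between species).
def save_rmg_dictionary(species, path):
    with open(path, 'w') as f:
        for label, adjlist in species:
            f.write('{0}\n{1}\n\n'.format(label, adjlist.strip()))

# hard-coded adjacency list for OH, for illustration only
oh_adjlist = """multiplicity 2
1 O u1 p2 c0 {2,S}
2 H u0 p0 c0 {1,S}"""

path = os.path.join(tempfile.mkdtemp(), 'species_dictionary.txt')
save_rmg_dictionary([('OH', oh_adjlist)], path)
```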
