CustomHelper

class optking.opt_helper.CustomHelper(mol_src, params={}, **kwargs)[source]

Bases: Helper

Class that allows for easy setup of optking. Accepts custom forces, energies,
and hessians from the user. The user will need to write a loop to perform the optimization.

>>> import optking
>>> opt_input = {
...     "initial_molecule": {
...         "symbols": ["O", "O", "H", "H"],
...         "geometry": [
...             0.0000000000,
...             0.0000000000,
...             0.0000000000,
...             -0.0000000000,
...             -0.0000000000,
...             2.7463569188,
...             1.3013018774,
...             -1.2902977124,
...             2.9574871774,
...             -1.3013018774,
...             1.2902977124,
...             -0.2111302586,
...         ],
...         "fix_com": True,
...         "fix_orientation": True,
...     },
...     "input_specification": {
...         "model": {"method": "hf", "basis": "sto-3g"},
...         "driver": "gradient",
...         "keywords": {"d_convergence": "1e-7"},
...     },
...     "keywords": {"g_convergence": "GAU_TIGHT", "program": "psi4"},
... }
>>> opt = optking.CustomHelper(opt_input)
>>> for step in range(30):
...     # Compute one's own energy and gradient
...     E, gX = optking.lj_functions.calc_energy_and_gradient(opt.geom, 2.5, 0.01, True)
...     # Insert these values into the 'user' computer.
...     opt.E = E
...     opt.gX = gX
...     opt.compute()  # process input. Get ready to take a step
...     opt.take_step()
...     conv = opt.test_convergence()
...     if conv is True:
...         print("Optimization SUCCESS:")
...         break
... else:
...     print("Optimization FAILURE:")

>>> json_output = opt.close()  # create an unvalidated OptimizationOutput-like object
>>> E = json_output["energies"][-1]

Overrides gX, HX, and E to allow for user input.
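The doctest above depends on optking itself. Purely to illustrate the control flow of the user-driven loop (set E and gX, call compute, take a step, test convergence), here is a self-contained sketch with a stand-in class exposing the same surface. ToyHelper and its steepest-descent step are inventions for this illustration, not optking code.

```python
import numpy as np

class ToyHelper:
    """Stand-in mimicking the CustomHelper surface (E, gX, compute,
    take_step, test_convergence). Not optking code: it drives plain
    steepest descent so the user-loop pattern can run on its own."""

    def __init__(self, geom, step_size=0.5, g_tol=1e-6):
        self.geom = np.asarray(geom, dtype=float)
        self.E = None    # user-supplied energy
        self.gX = None   # user-supplied gradient
        self._step = step_size
        self._g_tol = g_tol

    def compute(self):
        # optking would validate/process the supplied values here;
        # the toy only checks that the user provided them.
        if self.E is None or self.gX is None:
            raise ValueError("set E and gX before calling compute()")

    def take_step(self):
        # Toy step: move downhill along the supplied gradient.
        self.geom = self.geom - self._step * self.gX

    def test_convergence(self):
        return bool(np.linalg.norm(self.gX) < self._g_tol)

def energy_and_gradient(x):
    # Toy quadratic surface: E = |x - 1|^2 / 2, gradient = x - 1.
    d = x - 1.0
    return 0.5 * float(d @ d), d

# Same loop shape as in the docstring above.
opt = ToyHelper(np.zeros(3))
for step in range(200):
    E, gX = energy_and_gradient(opt.geom)
    opt.E = E
    opt.gX = gX
    opt.compute()
    opt.take_step()
    if opt.test_convergence():
        break
```

The point is only the division of labor: the caller owns the energy/gradient evaluation, while the helper owns stepping and convergence testing.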

Attributes Summary

E

HX

gX

Methods Summary

calculations_needed()

Assumes the gradient is always needed.

from_dict(d)

Constructs the helper as far as possible.

Attributes Documentation

E: Optional[float]
HX: Optional[ndarray]
gX: Optional[ndarray]

Methods Documentation

calculations_needed()[source]

Assumes the gradient is always needed. Provides a tuple with keys for the required properties.
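A user loop can branch on the returned tuple to avoid computing properties the helper does not currently want (e.g. a Hessian). The sketch below is hypothetical: the key names ("energy", "gradient", "hessian") and the `supply` driver are assumptions for illustration, not confirmed optking API, and `Stub` is a mock standing in for the helper.

```python
def supply(opt, energy_fn, gradient_fn, hessian_fn=None):
    # Hypothetical driver: ask the helper which properties it needs,
    # then attach only those. Property key names are assumed.
    needed = opt.calculations_needed()
    if "energy" in needed:
        opt.E = energy_fn()
    if "gradient" in needed:
        opt.gX = gradient_fn()
    if "hessian" in needed and hessian_fn is not None:
        opt.HX = hessian_fn()

class Stub:
    """Mock helper for demonstration; gradient steps never need HX."""
    def calculations_needed(self):
        return ("energy", "gradient")

s = Stub()
supply(s, lambda: -1.0, lambda: [0.0, 0.1])
```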

classmethod from_dict(d)[source]

Constructs the helper as far as possible. A child class will need to update the computer.
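The from_dict pattern supports rebuilding a helper from serialized state, e.g. for restarting an optimization; anything non-serializable (such as the computer) must be re-attached afterward. The round trip below uses a stand-in class with illustrative field names, not optking's actual serialization schema, and assumes a matching to_dict on the helper.

```python
class MiniHelper:
    """Stand-in illustrating the to_dict/from_dict round trip only."""

    def __init__(self, geom):
        self.geom = list(geom)
        self.E = None

    def to_dict(self):
        # Illustrative state; optking's real schema is richer.
        return {"geom": self.geom, "E": self.E}

    @classmethod
    def from_dict(cls, d):
        # Rebuild as far as possible; a child class would still need
        # to re-attach anything non-serializable (e.g. the computer).
        obj = cls(d["geom"])
        obj.E = d.get("E")
        return obj

state = MiniHelper([0.0, 1.0]).to_dict()
restored = MiniHelper.from_dict(state)
```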