OptimizationAlgorithm

class optking.stepAlgorithms.OptimizationAlgorithm(molsys, history, params)[source]

Bases: OptimizationInterface

The standard minimization and transition-state algorithms inherit from this class. It defines take_step for those algorithms; backstepping and trust-radius management are performed here.

All child classes implement a step() method that uses the forces and Hessian to compute a step direction and, possibly, a step length. Setting trust_radius_on = False allows a child class to bypass the basic trust-radius enforcement.
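As a rough illustration of this contract, a child class might look like the minimal sketch below. The class name GradientFollower and its steepest-descent-style step are assumptions made purely for illustration; they are not an optking algorithm.

    import numpy as np

    from optking.stepAlgorithms import OptimizationAlgorithm


    class GradientFollower(OptimizationAlgorithm):
        """Hypothetical child class used only to illustrate the interface."""

        def requires(self, **kwargs):
            # This toy algorithm only needs the energy and the gradient.
            return ("energy", "gradient")

        def step(self, fq, H, *args, **kwargs):
            # fq holds the internal-coordinate forces; simply follow them.
            dq = np.asarray(fq, dtype=float).copy()
            # Unless trust_radius_on is set to False, the base class applies
            # its basic trust-radius enforcement to the returned step.
            return dq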

Methods Summary

apply_interfrag_step_scaling(dq)

Check the size of the interfragment modes.

apply_intrafrag_step_scaling(dq)

Apply maximum step limit by scaling.

assess_previous_step()

Determine whether the last step was acceptable, print a summary, and adjust the trust radius.

backstep()

Take a partial step backwards.

backstep_needed()

Simple logic for whether a backstep is advisable (or too many have been taken).

converged(dq, fq, step_number[, str_mode])

decrease_trust_radius()

Scale trust radius by 0.25

expected_energy(step, grad, hess)

Compute the expected energy given the model for the step

increase_trust_radius()

Increase trust radius by a factor of 3

requires(**kwargs)

Return a tuple of strings ('energy', 'gradient', 'hessian') indicating what the algorithm needs to compute a new point.

step(fq, H, *args, **kwargs)

Basic form of the algorithm

take_step([fq, H, energy, return_str])

Compute a step and take it.

update_history(delta_e, achieved_dq, ...)

Basic history update method.

Methods Documentation

apply_interfrag_step_scaling(dq)[source]

Check the size of the interfragment modes. They can inadvertently represent very large motions.

Returns

dq – step scaled according to the trust radius

Return type

np.ndarray

apply_intrafrag_step_scaling(dq)[source]

Apply maximum step limit by scaling.
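The limiting amounts to something like the stand-alone sketch below. The helper name scale_to_trust_radius and the norm-based criterion are illustrative assumptions; the actual routine reads its limit from the optimizer's parameters and may measure step size differently.

    import numpy as np

    def scale_to_trust_radius(dq, trust_radius):
        # Illustrative helper: if the step exceeds the trust radius,
        # shrink it uniformly so its norm equals the trust radius.
        norm = np.linalg.norm(dq)
        if norm > trust_radius:
            dq = dq * (trust_radius / norm)
        return dq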

assess_previous_step()[source]

Determine whether the last step was acceptable, print a summary, and adjust the trust radius.
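A common ratio-based policy, sketched below purely for illustration, compares the achieved energy change with the projected one. The 0.25 and 3 factors match decrease_trust_radius and increase_trust_radius documented below; the ratio thresholds are assumptions, not necessarily optking's exact criteria.

    def adjust_trust_radius(delta_e, projected_de, trust):
        # Hypothetical acceptance test: how well did the quadratic model
        # predict the actual energy change?
        ratio = delta_e / projected_de if projected_de != 0.0 else 0.0
        if ratio < 0.25:
            return trust * 0.25   # poor prediction: shrink (cf. decrease_trust_radius)
        if ratio > 0.75:
            return trust * 3.0    # good prediction: expand (cf. increase_trust_radius)
        return trust              # otherwise keep the current trust radius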

backstep()[source]

Take a partial step backwards. fq and H should correspond to the previous point, not the current one.

Notes

Take a partial backward step: halve the last step, displace from the old geometry, and update the current step in history.

History: consecutiveBacksteps is increased by 1.

The step record contains: forces, geom, E, followedUnitVector, oneDgradient, oneDhessian, Dq, and projectedDE. Of these, Dq is cut in half and projectedDE is recomputed; the remaining entries are left unchanged.

backstep_needed()[source]

Simple logic for whether a backstep is advisable (or too many have been taken).

Returns

True if a backstep should be taken

Return type

bool
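That logic is roughly of the following form; this is only a sketch, and both the energy test and the limit on consecutive backsteps are assumptions made for illustration.

    def backstep_advisable(delta_e, consecutive_backsteps, max_backsteps=3):
        # Illustrative criterion: the last step raised the energy and we have
        # not yet exhausted the allowed number of consecutive backsteps.
        return delta_e > 0.0 and consecutive_backsteps < max_backsteps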

converged(dq, fq, step_number, str_mode=None)[source]
decrease_trust_radius()[source]

Scale trust radius by 0.25

expected_energy(step, grad, hess)[source]

Compute the expected energy given the model for the step

Parameters
  • step (float) – normalized step (unit length)

  • grad (np.ndarray) – projection of gradient onto step

  • hess (np.ndarray) – projection of hessian onto step
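For a plain quadratic (Newton-like) model the expectation reduces to the sketch below; whether expected_energy uses exactly this expression or an algorithm-specific variant (for example with an RFO-style denominator) is an assumption here.

    def quadratic_expected_energy(step, grad, hess):
        # Second-order Taylor estimate of the energy change along the step.
        return step * grad + 0.5 * step**2 * hess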

increase_trust_radius()[source]

Increase trust radius by a factor of 3

abstract requires(**kwargs)[source]

Return a tuple of strings ('energy', 'gradient', 'hessian') indicating what the algorithm needs to compute a new point.
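On the caller side the tuple can be used to decide which quantities to evaluate before stepping, as in this hypothetical driver fragment; optimizer, compute_energy, compute_gradient, and compute_hessian are placeholders, not part of optking's API.

    needed = optimizer.requires()
    energy = compute_energy() if "energy" in needed else None
    fq = -compute_gradient() if "gradient" in needed else None   # forces are the negative gradient
    H = compute_hessian() if "hessian" in needed else None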

abstract step(fq: ndarray, H: ndarray, *args, **kwargs) → ndarray[source]

Basic form of the algorithm

take_step(fq=None, H=None, energy=None, return_str=False, **kwargs)[source]

Compute a step and take it.
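Continuing the hypothetical driver fragment from requires() above, a single optimization cycle might then call take_step as shown below. Treating the return value as the step taken, and return_str as requesting a printable report, are inferences from the names rather than documented behavior.

    dq = optimizer.take_step(fq=fq, H=H, energy=energy)
    # To also obtain a text summary of the step (assumption from the argument name):
    # dq, report = optimizer.take_step(fq=fq, H=H, energy=energy, return_str=True)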

update_history(delta_e, achieved_dq, unit_dq, projected_f, projected_hess)[source]

Basic history update method. This should be expanded here and in child classes in the future.