OptimizationAlgorithm
- class optking.stepAlgorithms.OptimizationAlgorithm(molsys, history, params)[source]
Bases: OptimizationInterface
The standard minimization and transition-state algorithms inherit from this class, which defines take_step() for them; backstepping and trust-radius management are also performed here.
All child classes implement a step() method that uses the forces and Hessian to compute a step direction and, possibly, a step length. Setting trust_radius_on = False allows a child class to override the basic trust-radius enforcement.
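As a sketch of the subclassing contract described above, a trivial steepest-descent child might look like the following. This is illustrative only: a real subclass would inherit from OptimizationAlgorithm, and everything here except the step()/requires() interface is assumed.

```python
import numpy as np

class SteepestDescent:
    """Minimal illustration of the child-class contract described above.

    A real child class would inherit from OptimizationAlgorithm; this
    standalone sketch only mimics the step()/requires() interface.
    """

    trust_radius_on = True  # a child may set False to override basic trust-radius enforcement

    def requires(self, **kwargs):
        # Steepest descent needs only energies and gradients, not a Hessian.
        return ("energy", "gradient")

    def step(self, fq, H, *args, **kwargs):
        # Step along the forces (fq is the negative gradient); ignore H.
        return np.asarray(fq, dtype=float)

algo = SteepestDescent()
dq = algo.step(np.array([0.1, -0.2]), np.eye(2))
```

take_step() in the base class would then scale this raw step against the trust radius before displacing the geometry.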
Methods Summary
- apply_interfrag_step_scaling(dq): Check the size of the interfragment modes.
- apply_intrafrag_step_scaling(dq): Apply the maximum step limit by scaling.
- assess_previous_step(): Determine whether the last step was acceptable; print a summary and adjust the trust radius.
- backstep(): Takes a partial step backwards.
- backstep_needed(): Simple logic for whether a backstep is advisable (or too many have been taken).
- converged(dq, fq, step_number[, str_mode])
- decrease_trust_radius(): Scale the trust radius by 0.25.
- expected_energy(step, grad, hess): Compute the expected energy given the model for the step.
- increase_trust_radius(): Increase the trust radius by a factor of 3.
- requires(**kwargs): Returns a tuple of strings ('energy', 'gradient', 'hessian') naming what the algorithm needs to compute a new point.
- step(fq, H, *args, **kwargs): Basic form of the algorithm.
- take_step([fq, H, energy, return_str]): Compute the step and take it.
- update_history(delta_e, achieved_dq, ...): Basic history update method.
Methods Documentation
- apply_interfrag_step_scaling(dq)[source]
Check the size of the interfragment modes. They can inadvertently represent very large motions.
- Returns
dq – the step, scaled according to the trust radius
- Return type
np.ndarray
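The scaling itself amounts to shrinking the step uniformly whenever its norm exceeds the trust radius. A minimal sketch of that idea (illustrative only; optking's apply_*_step_scaling methods operate on specific coordinate ranges and log what they do):

```python
import numpy as np

def scale_to_trust_radius(dq, trust):
    """Shrink the step dq uniformly if its norm exceeds the trust radius.

    A sketch of trust-radius step scaling, not optking's actual routine.
    """
    norm = np.linalg.norm(dq)
    if norm > trust:
        dq = dq * (trust / norm)
    return dq

step = scale_to_trust_radius(np.array([3.0, 4.0]), trust=1.0)  # norm 5 scaled to 1
```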
- assess_previous_step()[source]
Determine whether the last step was acceptable; print a summary and adjust the trust radius.
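The trust-radius adjustment can be pictured as comparing the achieved energy change against the change projected by the quadratic model. The following is a hypothetical sketch in that spirit; the growth factor 3 and shrink factor 0.25 come from the method summaries above, but the ratio thresholds are assumptions, not optking's actual criteria:

```python
def adjust_trust_radius(trust, delta_e, projected_de):
    """Hypothetical trust-radius update: grow by 3 when the quadratic
    model predicted the energy change well, shrink by 0.25 when the
    energy moved opposite to the prediction. Thresholds are illustrative."""
    if projected_de == 0.0:
        return trust
    ratio = delta_e / projected_de
    if 0.75 < ratio < 1.25:   # model was accurate: expand the radius
        return trust * 3.0
    if ratio < 0.0:           # energy moved the wrong way: contract it
        return trust * 0.25
    return trust              # otherwise leave it unchanged
```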
- backstep()[source]
Takes a partial step backwards. fq and H should correspond to the previous point, not the current one.
Notes
Take a partial backward step: halve the last step size and displace from the old geometry, updating the current step in the history. History contains:
consecutiveBacksteps: increased by 1
- Step contains:
forces, geom, E, followedUnitVector, oneDgradient, oneDhessian, Dq, and projectedDE
- update:
Dq: cut in half
projectedDE: recomputed
all remaining entries are left unchanged
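The bookkeeping above can be sketched as halving the stored step and recomputing the projected energy change for the shorter step from the one-dimensional gradient and Hessian along the step direction. Function and argument names here are illustrative, not optking's internals:

```python
import numpy as np

def half_backstep(dq, one_d_gradient, one_d_hessian):
    """Halve the stored step Dq and recompute projectedDE for the
    shorter step, using the quadratic model along the step direction.
    A sketch of the backstep bookkeeping, not optking's code."""
    dq_new = np.asarray(dq, dtype=float) / 2.0
    length = np.linalg.norm(dq_new)
    projected_de = one_d_gradient * length + 0.5 * one_d_hessian * length**2
    return dq_new, projected_de

dq_half, de = half_backstep([0.2, 0.0], one_d_gradient=-1.0, one_d_hessian=2.0)
```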
- backstep_needed()[source]
Simple logic for whether a backstep is advisable (or too many have been taken).
- Returns
True if a backstep should be taken
- Return type
bool
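The "simple logic" might look like the following sketch: back up when the energy rose, unless too many consecutive backsteps have already been taken. The limit of 5 is an assumption for illustration, not optking's actual value:

```python
def backstep_needed(delta_e, consecutive_backsteps, max_backsteps=5):
    """Illustrative backstep check: advise a backstep when the energy
    increased, but give up after too many consecutive backsteps."""
    return delta_e > 0.0 and consecutive_backsteps < max_backsteps
```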
- expected_energy(step, grad, hess)[source]
Compute the expected energy given the model for the step
- Parameters
step (float) – normalized step (unit length)
grad (np.ndarray) – projection of gradient onto step
hess (np.ndarray) – projection of hessian onto step
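The quadratic model implied by these parameters is the second-order Taylor expansion along the step direction: with g and h the projections of the gradient and Hessian onto the unit step, a step of length s changes the energy by g*s + (1/2)*h*s**2. A one-line sketch of that model (not optking's exact routine):

```python
def expected_energy_change(step, grad, hess):
    """Second-order Taylor model along the step direction: grad and hess
    are the gradient and Hessian projected onto the unit step, and step
    is the step length. Illustrative sketch of the model described above."""
    return grad * step + 0.5 * hess * step**2
```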
- abstract requires(**kwargs)[source]
Returns a tuple of strings ('energy', 'gradient', 'hessian') naming what the algorithm needs to compute a new point.
- abstract step(fq: ndarray, H: ndarray, *args, **kwargs) ndarray [source]
Basic form of the algorithm
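One concrete realization of this abstract contract is a Newton-Raphson step, dq = H^{-1} f, where fq holds the forces (the negative gradient). This is a sketch under that assumption; optking's actual NR/RFO subclasses add trust-radius and eigenvector-following logic:

```python
import numpy as np

def newton_raphson_step(fq, H):
    """Solve H dq = f for the Newton-Raphson step toward a stationary
    point. Illustrative example of the abstract step() contract."""
    return np.linalg.solve(np.asarray(H, dtype=float),
                           np.asarray(fq, dtype=float))

# For E = x^2 + y^2 at (1, 1): forces = (-2, -2), H = 2I, so the step
# lands exactly at the origin.
dq = newton_raphson_step(np.array([-2.0, -2.0]), 2.0 * np.eye(2))
```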