"""
Functions
---------
.. autosummary::
   :toctree: generated/

   line_search_armijo
   line_search_wolfe1
   line_search_wolfe2
   scalar_search_wolfe1
   scalar_search_wolfe2

"""
from warnings import warn

from ._dcsrch import DCSRCH
import numpy as np

__all__ = ['LineSearchWarning', 'line_search_wolfe1', 'line_search_wolfe2',
           'scalar_search_wolfe1', 'scalar_search_wolfe2',
           'line_search_armijo']


class LineSearchWarning(RuntimeWarning):
    pass


def _check_c1_c2(c1, c2):
    if not (0 < c1 < c2 < 1):
        raise ValueError("'c1' and 'c2' do not satisfy "
                         "'0 < c1 < c2 < 1'.")


def line_search_wolfe1(f, fprime, xk, pk, gfk=None,
                       old_fval=None, old_old_fval=None,
                       args=(), c1=1e-4, c2=0.9, amax=50, amin=1e-8,
                       xtol=1e-14):
    """
    As `scalar_search_wolfe1` but do a line search to direction `pk`

    Parameters
    ----------
    f : callable
        Function `f(x)`
    fprime : callable
        Gradient of `f`
    xk : array_like
        Current point
    pk : array_like
        Search direction
    gfk : array_like, optional
        Gradient of `f` at point `xk`
    old_fval : float, optional
        Value of `f` at point `xk`
    old_old_fval : float, optional
        Value of `f` at point preceding `xk`

    The rest of the parameters are the same as for `scalar_search_wolfe1`.

    Returns
    -------
    stp, f_count, g_count, fval, old_fval
        As in `line_search_wolfe1`
    gval : array
        Gradient of `f` at the final point

    Notes
    -----
    Parameters `c1` and `c2` must satisfy ``0 < c1 < c2 < 1``.

    """
    if gfk is None:
        gfk = fprime(xk, *args)

    gval = [gfk]
    gc = [0]
    fc = [0]

    def phi(s):
        fc[0] += 1
        return f(xk + s*pk, *args)

    def derphi(s):
        gval[0] = fprime(xk + s*pk, *args)
        gc[0] += 1
        return np.dot(gval[0], pk)

    derphi0 = np.dot(gfk, pk)

    stp, fval, old_fval = scalar_search_wolfe1(
            phi, derphi, old_fval, old_old_fval, derphi0,
            c1=c1, c2=c2, amax=amax, amin=amin, xtol=xtol)

    return stp, fc[0], gc[0], fval, old_fval, gval[0]
def scalar_search_wolfe1(phi, derphi, phi0=None, old_phi0=None, derphi0=None,
                         c1=1e-4, c2=0.9,
                         amax=50, amin=1e-8, xtol=1e-14):
    """
    Scalar function search for alpha that satisfies strong Wolfe conditions

    alpha > 0 is assumed to be a descent direction.

    Parameters
    ----------
    phi : callable phi(alpha)
        Function at point `alpha`
    derphi : callable phi'(alpha)
        Objective function derivative. Returns a scalar.
    phi0 : float, optional
        Value of phi at 0
    old_phi0 : float, optional
        Value of phi at previous point
    derphi0 : float, optional
        Value of derphi at 0
    c1 : float, optional
        Parameter for Armijo condition rule.
    c2 : float, optional
        Parameter for curvature condition rule.
    amax, amin : float, optional
        Maximum and minimum step size
    xtol : float, optional
        Relative tolerance for an acceptable step.

    Returns
    -------
    alpha : float
        Step size, or None if no suitable step was found
    phi : float
        Value of `phi` at the new point `alpha`
    phi0 : float
        Value of `phi` at `alpha=0`

    Notes
    -----
    Uses routine DCSRCH from MINPACK.

    Parameters `c1` and `c2` must satisfy ``0 < c1 < c2 < 1`` as described
    in [1]_.

    References
    ----------
    .. [1] Nocedal, J., & Wright, S. J. (2006). Numerical optimization.
       In Springer Series in Operations Research and Financial Engineering.
       Springer Nature.

    """
    _check_c1_c2(c1, c2)

    if phi0 is None:
        phi0 = phi(0.)
    if derphi0 is None:
        derphi0 = derphi(0.)

    if old_phi0 is not None and derphi0 != 0:
        alpha1 = min(1.0, 1.01*2*(phi0 - old_phi0)/derphi0)
        if alpha1 < 0:
            alpha1 = 1.0
    else:
        alpha1 = 1.0

    maxiter = 100

    dcsrch = DCSRCH(phi, derphi, c1, c2, xtol, amin, amax)
    stp, phi1, phi0, task = dcsrch(
        alpha1, phi0=phi0, derphi0=derphi0, maxiter=maxiter
    )

    return stp, phi1, phi0


line_search = line_search_wolfe1
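As a standalone illustration (not part of this module), the two strong Wolfe tests that the searches above enforce can be written out directly; `wolfe_conditions` below is a hypothetical helper, exercised on the test function phi(alpha) = (alpha - 1)**2:

```python
# Standalone check of the strong Wolfe conditions for a candidate step.
# phi(alpha) = (alpha - 1)**2, so phi'(alpha) = 2*(alpha - 1).

def wolfe_conditions(phi, derphi, alpha, c1=1e-4, c2=0.9):
    """Return (sufficient_decrease_ok, curvature_ok) for step size alpha."""
    phi0, derphi0 = phi(0.0), derphi(0.0)
    armijo = phi(alpha) <= phi0 + c1 * alpha * derphi0
    curvature = abs(derphi(alpha)) <= c2 * abs(derphi0)
    return armijo, curvature

phi = lambda a: (a - 1.0) ** 2
derphi = lambda a: 2.0 * (a - 1.0)

print(wolfe_conditions(phi, derphi, 1.0))   # -> (True, True): exact minimizer
print(wolfe_conditions(phi, derphi, 0.01))  # -> (True, False): step too short
```

The curvature condition is what rejects the tiny step: it decreases phi, but the slope there is still nearly as steep as at alpha = 0.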
def line_search_wolfe2(f, myfprime, xk, pk, gfk=None, old_fval=None,
                       old_old_fval=None, args=(), c1=1e-4, c2=0.9, amax=None,
                       extra_condition=None, maxiter=10):
    """Find alpha that satisfies strong Wolfe conditions.

    Parameters
    ----------
    f : callable f(x,*args)
        Objective function.
    myfprime : callable f'(x,*args)
        Objective function gradient.
    xk : ndarray
        Starting point.
    pk : ndarray
        Search direction. The search direction must be a descent direction
        for the algorithm to converge.
    gfk : ndarray, optional
        Gradient value for x=xk (xk being the current parameter estimate).
        Will be recomputed if omitted.
    old_fval : float, optional
        Function value for x=xk. Will be recomputed if omitted.
    old_old_fval : float, optional
        Function value for the point preceding x=xk.
    args : tuple, optional
        Additional arguments passed to objective function.
    c1 : float, optional
        Parameter for Armijo condition rule.
    c2 : float, optional
        Parameter for curvature condition rule.
    amax : float, optional
        Maximum step size
    extra_condition : callable, optional
        A callable of the form ``extra_condition(alpha, x, f, g)``
        returning a boolean. Arguments are the proposed step ``alpha``
        and the corresponding ``x``, ``f`` and ``g`` values. The line search
        accepts the value of ``alpha`` only if this
        callable returns ``True``. If the callable returns ``False``
        for the step length, the algorithm will continue with
        new iterates. The callable is only called for iterates
        satisfying the strong Wolfe conditions.
    maxiter : int, optional
        Maximum number of iterations to perform.

    Returns
    -------
    alpha : float or None
        Alpha for which ``x_new = x0 + alpha * pk``,
        or None if the line search algorithm did not converge.
    fc : int
        Number of function evaluations made.
    gc : int
        Number of gradient evaluations made.
    new_fval : float or None
        New function value ``f(x_new)=f(x0+alpha*pk)``,
        or None if the line search algorithm did not converge.
    old_fval : float
        Old function value ``f(x0)``.
    new_slope : float or None
        The local slope along the search direction at the
        new value ``<myfprime(x_new), pk>``,
        or None if the line search algorithm did not converge.

    Notes
    -----
    Uses the line search algorithm to enforce strong Wolfe
    conditions. See Wright and Nocedal, 'Numerical Optimization',
    1999, pp. 59-61.

    The search direction `pk` must be a descent direction (e.g.
    ``-myfprime(xk)``) to find a step length that satisfies the strong Wolfe
    conditions. If the search direction is not a descent direction (e.g.
    ``myfprime(xk)``), then `alpha`, `new_fval`, and `new_slope` will be None.

    Examples
    --------
    >>> import numpy as np
    >>> from scipy.optimize import line_search

    An objective function and its gradient are defined.

    >>> def obj_func(x):
    ...     return (x[0])**2+(x[1])**2
    >>> def obj_grad(x):
    ...     return [2*x[0], 2*x[1]]

    We can find alpha that satisfies strong Wolfe conditions.

    >>> start_point = np.array([1.8, 1.7])
    >>> search_gradient = np.array([-1.0, -1.0])
    >>> line_search(obj_func, obj_grad, start_point, search_gradient)
    (1.0, 2, 1, 1.1300000000000001, 6.13, [1.6, 1.4])

    """
    fc = [0]
    gc = [0]
    gval = [None]
    gval_alpha = [None]

    def phi(alpha):
        fc[0] += 1
        return f(xk + alpha * pk, *args)

    fprime = myfprime

    def derphi(alpha):
        gc[0] += 1
        gval[0] = fprime(xk + alpha * pk, *args)  # store for later use
        gval_alpha[0] = alpha
        return np.dot(gval[0], pk)

    if gfk is None:
        gfk = fprime(xk, *args)
    derphi0 = np.dot(gfk, pk)

    if extra_condition is not None:
        # Add the current gradient as argument, to avoid needless
        # re-evaluations
        def extra_condition2(alpha, phi):
            if gval_alpha[0] != alpha:
                derphi(alpha)
            x = xk + alpha * pk
            return extra_condition(alpha, x, phi, gval[0])
    else:
        extra_condition2 = None

    alpha_star, phi_star, old_fval, derphi_star = scalar_search_wolfe2(
            phi, derphi, old_fval, old_old_fval, derphi0, c1, c2, amax,
            extra_condition2, maxiter=maxiter)

    if derphi_star is None:
        warn('The line search algorithm did not converge',
             LineSearchWarning, stacklevel=2)
    else:
        # derphi_star is a number (derphi); use the most recently
        # calculated gradient used in computing it: this is the gradient
        # at the next step, so there is no need to compute it again in
        # the outer loop.
        derphi_star = gval[0]

    return alpha_star, fc[0], gc[0], phi_star, old_fval, derphi_star
def scalar_search_wolfe2(phi, derphi, phi0=None,
                         old_phi0=None, derphi0=None,
                         c1=1e-4, c2=0.9, amax=None,
                         extra_condition=None, maxiter=10):
    """Find alpha that satisfies strong Wolfe conditions.

    alpha > 0 is assumed to be a descent direction.

    Parameters
    ----------
    phi : callable phi(alpha)
        Objective scalar function.
    derphi : callable phi'(alpha)
        Objective function derivative. Returns a scalar.
    phi0 : float, optional
        Value of phi at 0.
    old_phi0 : float, optional
        Value of phi at previous point.
    derphi0 : float, optional
        Value of derphi at 0
    c1 : float, optional
        Parameter for Armijo condition rule.
    c2 : float, optional
        Parameter for curvature condition rule.
    amax : float, optional
        Maximum step size.
    extra_condition : callable, optional
        A callable of the form ``extra_condition(alpha, phi_value)``
        returning a boolean. The line search accepts the value
        of ``alpha`` only if this callable returns ``True``.
        If the callable returns ``False`` for the step length,
        the algorithm will continue with new iterates.
        The callable is only called for iterates satisfying
        the strong Wolfe conditions.
    maxiter : int, optional
        Maximum number of iterations to perform.

    Returns
    -------
    alpha_star : float or None
        Best alpha, or None if the line search algorithm did not converge.
    phi_star : float
        phi at alpha_star.
    phi0 : float
        phi at 0.
    derphi_star : float or None
        derphi at alpha_star, or None if the line search algorithm
        did not converge.

    Notes
    -----
    Uses the line search algorithm to enforce strong Wolfe
    conditions. See Wright and Nocedal, 'Numerical Optimization',
    1999, pp. 59-61.

    """
    _check_c1_c2(c1, c2)

    if phi0 is None:
        phi0 = phi(0.)

    if derphi0 is None:
        derphi0 = derphi(0.)

    alpha0 = 0
    if old_phi0 is not None and derphi0 != 0:
        alpha1 = min(1.0, 1.01*2*(phi0 - old_phi0)/derphi0)
    else:
        alpha1 = 1.0

    if alpha1 < 0:
        alpha1 = 1.0

    if amax is not None:
        alpha1 = min(alpha1, amax)

    phi_a1 = phi(alpha1)

    phi_a0 = phi0
    derphi_a0 = derphi0

    if extra_condition is None:
        def extra_condition(alpha, phi):
            return True

    for i in range(maxiter):
        if alpha1 == 0 or (amax is not None and alpha0 == amax):
            # alpha1 == 0: This shouldn't happen. Perhaps the increment has
            # slipped below machine precision?
            alpha_star = None
            phi_star = phi0
            phi0 = old_phi0
            derphi_star = None

            if alpha1 == 0:
                msg = 'Rounding errors prevent the line search from converging'
            else:
                msg = "The line search algorithm could not find a solution " + \
                      "less than or equal to amax: %s" % amax

            warn(msg, LineSearchWarning, stacklevel=2)
            break

        not_first_iteration = i > 0
        if (phi_a1 > phi0 + c1 * alpha1 * derphi0) or \
           ((phi_a1 >= phi_a0) and not_first_iteration):
            alpha_star, phi_star, derphi_star = \
                        _zoom(alpha0, alpha1, phi_a0,
                              phi_a1, derphi_a0, phi, derphi,
                              phi0, derphi0, c1, c2, extra_condition)
            break

        derphi_a1 = derphi(alpha1)
        if (abs(derphi_a1) <= -c2*derphi0):
            if extra_condition(alpha1, phi_a1):
                alpha_star = alpha1
                phi_star = phi_a1
                derphi_star = derphi_a1
                break

        if (derphi_a1 >= 0):
            alpha_star, phi_star, derphi_star = \
                        _zoom(alpha1, alpha0, phi_a1,
                              phi_a0, derphi_a1, phi, derphi,
                              phi0, derphi0, c1, c2, extra_condition)
            break

        alpha2 = 2 * alpha1  # increase by factor of two on each iteration
        if amax is not None:
            alpha2 = min(alpha2, amax)
        alpha0 = alpha1
        alpha1 = alpha2
        phi_a0 = phi_a1
        phi_a1 = phi(alpha1)
        derphi_a0 = derphi_a1

    else:
        # stopping test maxiter reached
        alpha_star = alpha1
        phi_star = phi_a1
        derphi_star = None
        warn('The line search algorithm did not converge',
             LineSearchWarning, stacklevel=2)

    return alpha_star, phi_star, phi0, derphi_star
def _cubicmin(a, fa, fpa, b, fb, c, fc):
    """
    Finds the minimizer for a cubic polynomial that goes through the
    points (a,fa), (b,fb), and (c,fc) with derivative at a of fpa.

    If no minimizer can be found, return None.

    """
    # f(x) = A*(x-a)**3 + B*(x-a)**2 + C*(x-a) + D
    with np.errstate(divide='raise', over='raise', invalid='raise'):
        try:
            C = fpa
            db = b - a
            dc = c - a
            denom = (db * dc) ** 2 * (db - dc)
            d1 = np.empty((2, 2))
            d1[0, 0] = dc ** 2
            d1[0, 1] = -db ** 2
            d1[1, 0] = -dc ** 3
            d1[1, 1] = db ** 3
            [A, B] = np.dot(d1, np.asarray([fb - fa - C * db,
                                            fc - fa - C * dc]).flatten())
            A /= denom
            B /= denom
            radical = B * B - 3 * A * C
            xmin = a + (-B + np.sqrt(radical)) / (3 * A)
        except ArithmeticError:
            return None
    if not np.isfinite(xmin):
        return None
    return xmin
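A standalone sketch of the same cubic-interpolation step (with the 2x2 solve written out in closed form; `cubic_interp_min` is a hypothetical name, not part of this module). For data sampled from an exactly cubic function the fitted polynomial is the function itself, so the interpolated minimizer is the true one:

```python
import numpy as np

def cubic_interp_min(a, fa, fpa, b, fb, c, fc):
    # Fit f(x) ~= A*(x - a)**3 + B*(x - a)**2 + C*(x - a) + D through
    # (a, fa) with slope fpa at a, plus (b, fb) and (c, fc), and return
    # the stationary point a + (-B + sqrt(B**2 - 3*A*C)) / (3*A).
    C = fpa
    db, dc = b - a, c - a
    denom = (db * dc) ** 2 * (db - dc)
    t1 = fb - fa - C * db
    t2 = fc - fa - C * dc
    A = (dc ** 2 * t1 - db ** 2 * t2) / denom
    B = (-dc ** 3 * t1 + db ** 3 * t2) / denom
    return a + (-B + np.sqrt(B * B - 3 * A * C)) / (3 * A)

# f(x) = x**3/3 - x has f'(x) = x**2 - 1, so its local minimum is at x = 1.
f = lambda x: x ** 3 / 3.0 - x
print(cubic_interp_min(0.0, f(0.0), -1.0, 2.0, f(2.0), 3.0, f(3.0)))
```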
r`rar5@N)r$rfrkrl) rmrnrorprqDrsrtryr{s r_quadminrs G'7 C AAQWBa!b&R"W-AqC!G}$D  ;;t  K   s(A;(A$$ A8-A;7A88A;;Bc d} d} d}d}|}d} ||z }|dkr||}}n||}}| dkDr||z}t|||||||}| dk(s||z kDs|||zkr.||z}t|||||}||||z kDs|||zkr|d|zz}||}||| |z|zzkDs||k\r |}|}|}|}nH||}t|| |zkr| ||r|}|}|}n0|||z zdk\r |}|}|}|}n|}|}|}|}|}| dz } | | kDrd}d}d}n|||fS)a Zoom stage of approximate linesearch satisfying strong Wolfe conditions. Part of the optimization algorithm in `scalar_search_wolfe2`. Notes ----- Implements Algorithm 3.6 (zoom) in Wright and Nocedal, 'Numerical Optimization', 1999, pp. 61. rg?皙?N?r)r|rrU)a_loa_hiphi_lophi_hi derphi_lor!r)r8r0rrrGr9rZdelta1delta2phi_reca_recdalpharmrpcchka_jqchkphi_aj derphi_aja_starval_star valprime_stars rrTrTsG A F FG E  A:qAqA EF?DD&)T6!7,C F q4xS1t8^F?D4D&AC qv34<SZ'S TBsF7N* *&0@GEDFs I9~"W,f1M! ) $+&!+  DF!I Q KFH M A B 8] **rc tjdg  fd}| |d} n|} tj|} t|| | ||\} } | d| fS)aMinimize over alpha, the function ``f(xk+alpha pk)``. Parameters ---------- f : callable Function to be minimized. xk : array_like Current point. pk : array_like Search direction. gfk : array_like Gradient of `f` at point `xk`. old_fval : float Value of `f` at point `xk`. args : tuple, optional Optional arguments. c1 : float, optional Value to control stopping criterion. alpha0 : scalar, optional Value of `alpha` at start of the optimization. Returns ------- alpha f_count f_val_at_alpha Notes ----- Uses the interpolation algorithm (Armijo backtracking) as suggested by Wright and Nocedal in 'Numerical Optimization', 1999, pp. 
def line_search_armijo(f, xk, pk, gfk, old_fval, args=(), c1=1e-4, alpha0=1):
    """Minimize over alpha, the function ``f(xk+alpha pk)``.

    Parameters
    ----------
    f : callable
        Function to be minimized.
    xk : array_like
        Current point.
    pk : array_like
        Search direction.
    gfk : array_like
        Gradient of `f` at point `xk`.
    old_fval : float
        Value of `f` at point `xk`.
    args : tuple, optional
        Optional arguments.
    c1 : float, optional
        Value to control stopping criterion.
    alpha0 : scalar, optional
        Value of `alpha` at start of the optimization.

    Returns
    -------
    alpha
    f_count
    f_val_at_alpha

    Notes
    -----
    Uses the interpolation algorithm (Armijo backtracking) as suggested by
    Wright and Nocedal in 'Numerical Optimization', 1999, pp. 56-57

    """
    xk = np.atleast_1d(xk)
    fc = [0]

    def phi(alpha1):
        fc[0] += 1
        return f(xk + alpha1*pk, *args)

    if old_fval is None:
        phi0 = phi(0.)
    else:
        phi0 = old_fval  # compute f(xk) -- done in past loop

    derphi0 = np.dot(gfk, pk)
    alpha, phi1 = scalar_search_armijo(phi, phi0, derphi0, c1=c1,
                                       alpha0=alpha0)
    return alpha, fc[0], phi1


def line_search_BFGS(f, xk, pk, gfk, old_fval, args=(), c1=1e-4, alpha0=1):
    """
    Compatibility wrapper for `line_search_armijo`
    """
    r = line_search_armijo(f, xk, pk, gfk, old_fval, args=args, c1=c1,
                           alpha0=alpha0)
    return r[0], r[1], 0, r[2]


def scalar_search_armijo(phi, phi0, derphi0, c1=1e-4, alpha0=1, amin=0):
    """Minimize over alpha, the function ``phi(alpha)``.

    Uses the interpolation algorithm (Armijo backtracking) as suggested by
    Wright and Nocedal in 'Numerical Optimization', 1999, pp. 56-57

    alpha > 0 is assumed to be a descent direction.

    Returns
    -------
    alpha
    phi1

    """
    phi_a0 = phi(alpha0)
    if phi_a0 <= phi0 + c1*alpha0*derphi0:
        return alpha0, phi_a0

    # Otherwise, compute the minimizer of a quadratic interpolant:

    alpha1 = -(derphi0) * alpha0**2 / 2.0 / (phi_a0 - phi0 - derphi0 * alpha0)
    phi_a1 = phi(alpha1)

    if (phi_a1 <= phi0 + c1*alpha1*derphi0):
        return alpha1, phi_a1

    # Otherwise, loop with cubic interpolation until we find an alpha which
    # satisfies the first Wolfe condition (since we are backtracking, we will
    # assume that the value of alpha is not too small and satisfies the second
    # condition.

    while alpha1 > amin:       # we are assuming alpha>0 is a descent direction
        factor = alpha0**2 * alpha1**2 * (alpha1 - alpha0)
        a = alpha0**2 * (phi_a1 - phi0 - derphi0*alpha1) - \
            alpha1**2 * (phi_a0 - phi0 - derphi0*alpha0)
        a = a / factor
        b = -alpha0**3 * (phi_a1 - phi0 - derphi0*alpha1) + \
            alpha1**3 * (phi_a0 - phi0 - derphi0*alpha0)
        b = b / factor

        alpha2 = (-b + np.sqrt(abs(b**2 - 3 * a * derphi0))) / (3.0*a)
        phi_a2 = phi(alpha2)

        if (phi_a2 <= phi0 + c1*alpha2*derphi0):
            return alpha2, phi_a2

        if (alpha1 - alpha2) > alpha1 / 2.0 or (1 - alpha2/alpha1) < 0.96:
            alpha2 = alpha1 / 2.0

        alpha0 = alpha1
        alpha1 = alpha2
        phi_a0 = phi_a1
        phi_a1 = phi_a2

    # Failed to find a suitable step length
    return None, phi_a0
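The core of Armijo backtracking can be sketched standalone with plain step-halving in place of the quadratic/cubic interpolation used above (`backtrack_armijo` is a hypothetical name, not part of this module):

```python
def backtrack_armijo(phi, phi0, derphi0, c1=1e-4, alpha0=1.0, shrink=0.5):
    # Shrink the step until the sufficient-decrease (Armijo) condition
    # phi(alpha) <= phi(0) + c1*alpha*phi'(0) holds.
    alpha = alpha0
    while phi(alpha) > phi0 + c1 * alpha * derphi0:
        alpha *= shrink
    return alpha

# phi(alpha) = (alpha - 0.1)**2: the unit step overshoots the minimizer,
# so the step is halved until the condition is met.
phi = lambda a: (a - 0.1) ** 2
alpha = backtrack_armijo(phi, phi(0.0), -0.2)
print(alpha)  # -> 0.125
```

The interpolation used by `scalar_search_armijo` reaches an acceptable step in fewer `phi` evaluations than plain halving, which is why the module goes to that extra trouble.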
def _nonmonotone_line_search_cruz(f, x_k, d, prev_fs, eta,
                                  gamma=1e-4, tau_min=0.1, tau_max=0.5):
    """
    Nonmonotone backtracking line search as described in [1]_

    Parameters
    ----------
    f : callable
        Function returning a tuple ``(f, F)`` where ``f`` is the value
        of a merit function and ``F`` the residual.
    x_k : ndarray
        Initial position.
    d : ndarray
        Search direction.
    prev_fs : list of float
        List of previous merit function values. Should have
        ``len(prev_fs) <= M`` where ``M`` is the nonmonotonicity window
        parameter.
    eta : float
        Allowed merit function increase, see [1]_
    gamma, tau_min, tau_max : float, optional
        Search parameters, see [1]_

    Returns
    -------
    alpha : float
        Step length
    xp : ndarray
        Next position
    fp : float
        Merit function value at next position
    Fp : ndarray
        Residual at next position

    References
    ----------
    .. [1] "Spectral residual method without gradient information for solving
           large-scale nonlinear systems of equations." W. La Cruz,
           J.M. Martinez, M. Raydan. Math. Comp. **75**, 1429 (2006).

    """
    f_k = prev_fs[-1]
    f_bar = max(prev_fs)

    alpha_p = 1
    alpha_m = 1
    alpha = 1

    while True:
        xp = x_k + alpha_p * d
        fp, Fp = f(xp)

        if fp <= f_bar + eta - gamma * alpha_p**2 * f_k:
            alpha = alpha_p
            break

        alpha_tp = alpha_p**2 * f_k / (fp + (2*alpha_p - 1)*f_k)

        xp = x_k - alpha_m * d
        fp, Fp = f(xp)

        if fp <= f_bar + eta - gamma * alpha_m**2 * f_k:
            alpha = -alpha_m
            break

        alpha_tm = alpha_m**2 * f_k / (fp + (2*alpha_m - 1)*f_k)

        alpha_p = np.clip(alpha_tp, tau_min * alpha_p, tau_max * alpha_p)
        alpha_m = np.clip(alpha_tm, tau_min * alpha_m, tau_max * alpha_m)

    return alpha, xp, fp, Fp
def _nonmonotone_line_search_cheng(f, x_k, d, f_k, C, Q, eta,
                                   gamma=1e-4, tau_min=0.1, tau_max=0.5,
                                   nu=0.85):
    """
    Nonmonotone line search from [1]

    Parameters
    ----------
    f : callable
        Function returning a tuple ``(f, F)`` where ``f`` is the value
        of a merit function and ``F`` the residual.
    x_k : ndarray
        Initial position.
    d : ndarray
        Search direction.
    f_k : float
        Initial merit function value.
    C, Q : float
        Control parameters. On the first iteration, give values
        Q=1.0, C=f_k
    eta : float
        Allowed merit function increase, see [1]_
    nu, gamma, tau_min, tau_max : float, optional
        Search parameters, see [1]_

    Returns
    -------
    alpha : float
        Step length
    xp : ndarray
        Next position
    fp : float
        Merit function value at next position
    Fp : ndarray
        Residual at next position
    C : float
        New value for the control parameter C
    Q : float
        New value for the control parameter Q

    References
    ----------
    .. [1] W. Cheng & D.-H. Li, ''A derivative-free nonmonotone line
           search and its application to the spectral residual
           method'', IMA J. Numer. Anal. 29, 814 (2009).

    """
    alpha_p = 1
    alpha_m = 1
    alpha = 1

    while True:
        xp = x_k + alpha_p * d
        fp, Fp = f(xp)

        if fp <= C + eta - gamma * alpha_p**2 * f_k:
            alpha = alpha_p
            break

        alpha_tp = alpha_p**2 * f_k / (fp + (2*alpha_p - 1)*f_k)

        xp = x_k - alpha_m * d
        fp, Fp = f(xp)

        if fp <= C + eta - gamma * alpha_m**2 * f_k:
            alpha = -alpha_m
            break

        alpha_tm = alpha_m**2 * f_k / (fp + (2*alpha_m - 1)*f_k)

        alpha_p = np.clip(alpha_tp, tau_min * alpha_p, tau_max * alpha_p)
        alpha_m = np.clip(alpha_tm, tau_min * alpha_m, tau_max * alpha_m)

    # Update C and Q
    Q_next = nu * Q + 1
    C = (Q * (C + eta) + fp) / Q_next
    Q = Q_next

    return alpha, xp, fp, Fp, C, Q
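The nonmonotone acceptance test shared by the two searches above can be shown in isolation (`accept_nonmonotone` is a hypothetical helper, not part of this module): a trial step is accepted even if the merit function rises, as long as it stays below the recent maximum plus the allowance `eta`, minus a small forcing term:

```python
def accept_nonmonotone(f_new, prev_fs, eta, alpha, gamma=1e-4):
    # Accept when the new merit value stays below the maximum over the
    # recent history plus eta, minus the forcing term gamma*alpha**2*f_k.
    f_k = prev_fs[-1]
    return f_new <= max(prev_fs) + eta - gamma * alpha ** 2 * f_k

prev_fs = [4.0, 2.5, 3.0]   # recent merit values (window M = 3)
print(accept_nonmonotone(3.9, prev_fs, eta=0.0, alpha=1.0))  # True: below max
print(accept_nonmonotone(4.5, prev_fs, eta=0.0, alpha=1.0))  # False: above max
```

Note that 3.9 is worse than the latest value 3.0 yet is still accepted; this tolerance of temporary increases is what distinguishes the nonmonotone searches from the Wolfe and Armijo searches earlier in the module.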