"""Isotonic regression for obtaining monotonic fit to data."""

import math
import warnings
from numbers import Real

import numpy as np
from scipy import interpolate, optimize
from scipy.stats import spearmanr

from sklearn.utils import metadata_routing

from ._isotonic import _inplace_contiguous_isotonic_regression, _make_unique
from .base import BaseEstimator, RegressorMixin, TransformerMixin, _fit_context
from .utils import check_array, check_consistent_length
from .utils._param_validation import Interval, StrOptions, validate_params
from .utils.fixes import parse_version, sp_base_version
from .utils.validation import _check_sample_weight, check_is_fitted

__all__ = ["IsotonicRegression", "check_increasing", "isotonic_regression"]


@validate_params(
    {"x": ["array-like"], "y": ["array-like"]},
    prefer_skip_nested_validation=True,
)
def check_increasing(x, y):
    """Determine whether y is monotonically correlated with x.

    y is found increasing or decreasing with respect to x based on a Spearman
    correlation test.

    Parameters
    ----------
    x : array-like of shape (n_samples,)
        Training data.

    y : array-like of shape (n_samples,)
        Training target.

    Returns
    -------
    increasing_bool : boolean
        Whether the relationship is increasing or decreasing.

    Notes
    -----
    The Spearman correlation coefficient is estimated from the data, and the
    sign of the resulting estimate is used as the result.

    In the event that the 95% confidence interval based on Fisher transform
    spans zero, a warning is raised.

    References
    ----------
    Fisher transformation. Wikipedia.
    https://en.wikipedia.org/wiki/Fisher_transformation

    Examples
    --------
    >>> from sklearn.isotonic import check_increasing
    >>> x, y = [1, 2, 3, 4, 5], [2, 4, 6, 8, 10]
    >>> check_increasing(x, y)
    np.True_
    >>> y = [10, 8, 6, 4, 2]
    >>> check_increasing(x, y)
    np.False_
    """
    # Calculate Spearman rho estimate and set return accordingly.
    rho, _ = spearmanr(x, y)
    increasing_bool = rho >= 0

    # Run Fisher transform to get the rho CI, but handle rho=+/-1
    if rho not in [-1.0, 1.0] and len(x) > 3:
        F = 0.5 * math.log((1.0 + rho) / (1.0 - rho))
        F_se = 1 / math.sqrt(len(x) - 3)

        # Use a 95% CI, i.e., +/-1.96 S.E.
        # https://en.wikipedia.org/wiki/Fisher_transformation
        rho_0 = math.tanh(F - 1.96 * F_se)
        rho_1 = math.tanh(F + 1.96 * F_se)

        # Warn if the CI spans zero.
        if np.sign(rho_0) != np.sign(rho_1):
            warnings.warn(
                "Confidence interval of the Spearman "
                "correlation coefficient spans zero. "
                "Determination of ``increasing`` may be suspect."
            )

    return increasing_bool


@validate_params(
    {
        "y": ["array-like"],
        "sample_weight": ["array-like", None],
        "y_min": [Interval(Real, None, None, closed="both"), None],
        "y_max": [Interval(Real, None, None, closed="both"), None],
        "increasing": ["boolean"],
    },
    prefer_skip_nested_validation=True,
)
def isotonic_regression(
    y, *, sample_weight=None, y_min=None, y_max=None, increasing=True
):
    """Solve the isotonic regression model.

    Read more in the :ref:`User Guide <isotonic>`.

    Parameters
    ----------
    y : array-like of shape (n_samples,)
        The data.

    sample_weight : array-like of shape (n_samples,), default=None
        Weights on each point of the regression.
        If None, weight is set to 1 (equal weights).

    y_min : float, default=None
        Lower bound on the lowest predicted value (the minimum value may
        still be higher). If not set, defaults to -inf.

    y_max : float, default=None
        Upper bound on the highest predicted value (the maximum may still be
        lower). If not set, defaults to +inf.

    increasing : bool, default=True
        Whether to compute ``y_`` is increasing (if set to True) or decreasing
        (if set to False).

    Returns
    -------
    y_ : ndarray of shape (n_samples,)
        Isotonic fit of y.

    References
    ----------
    "Active set algorithms for isotonic regression; A unifying framework"
    by Michael J. Best and Nilotpal Chakravarti, section 3.

    Examples
    --------
    >>> from sklearn.isotonic import isotonic_regression
    >>> isotonic_regression([5, 3, 1, 2, 8, 10, 7, 9, 6, 4])
    array([2.75 , 2.75 , 2.75 , 2.75 , 7.33, 7.33, 7.33, 7.33, 7.33, 7.33])
    """
    y = check_array(y, ensure_2d=False, input_name="y", dtype=[np.float64, np.float32])
    if sp_base_version >= parse_version("1.12.0"):
        res = optimize.isotonic_regression(
            y=y, weights=sample_weight, increasing=increasing
        )
        y = np.asarray(res.x, dtype=y.dtype)
    else:
        # Fall back to scikit-learn's Cython PAVA implementation when SciPy
        # does not yet provide scipy.optimize.isotonic_regression.
        order = np.s_[:] if increasing else np.s_[::-1]
        y = np.array(y[order], dtype=y.dtype)
        sample_weight = _check_sample_weight(sample_weight, y, dtype=y.dtype, copy=True)
        sample_weight = np.ascontiguousarray(sample_weight[order])
        _inplace_contiguous_isotonic_regression(y, sample_weight)
        y = y[order]

    if y_min is not None or y_max is not None:
        # Older versions of np.clip don't accept None as a bound, so use np.inf
        if y_min is None:
            y_min = -np.inf
        if y_max is None:
            y_max = np.inf
        np.clip(y, y_min, y_max, y)
    return y


class IsotonicRegression(RegressorMixin, TransformerMixin, BaseEstimator):
    """Isotonic regression model.

    Read more in the :ref:`User Guide <isotonic>`.

    .. versionadded:: 0.13

    Parameters
    ----------
    y_min : float, default=None
        Lower bound on the lowest predicted value (the minimum value may
        still be higher). If not set, defaults to -inf.

    y_max : float, default=None
        Upper bound on the highest predicted value (the maximum may still be
        lower). If not set, defaults to +inf.

    increasing : bool or 'auto', default=True
        Determines whether the predictions should be constrained to increase
        or decrease with `X`. 'auto' will decide based on the Spearman
        correlation estimate's sign.

    out_of_bounds : {'nan', 'clip', 'raise'}, default='nan'
        Handles how `X` values outside of the training domain are handled
        during prediction.

        - 'nan', predictions will be NaN.
        - 'clip', predictions will be set to the value corresponding to
          the nearest train interval endpoint.
        - 'raise', a `ValueError` is raised.

    Attributes
    ----------
    X_min_ : float
        Minimum value of input array `X_` for left bound.

    X_max_ : float
        Maximum value of input array `X_` for right bound.

    X_thresholds_ : ndarray of shape (n_thresholds,)
        Unique ascending `X` values used to interpolate
        the y = f(X) monotonic function.

        .. versionadded:: 0.24

    y_thresholds_ : ndarray of shape (n_thresholds,)
        De-duplicated `y` values suitable to interpolate the y = f(X)
        monotonic function.

        .. versionadded:: 0.24

    f_ : function
        The stepwise interpolating function that covers the input domain ``X``.

    increasing_ : bool
        Inferred value for ``increasing``.

    See Also
    --------
    sklearn.linear_model.LinearRegression : Ordinary least squares Linear
        Regression.
    sklearn.ensemble.HistGradientBoostingRegressor : Gradient boosting that
        is a non-parametric model accepting monotonicity constraints.
    isotonic_regression : Function to solve the isotonic regression model.

    Notes
    -----
    Ties are broken using the secondary method from de Leeuw, 1977.

    References
    ----------
    Isotonic Median Regression: A Linear Programming Approach
    Nilotpal Chakravarti
    Mathematics of Operations Research
    Vol. 14, No. 2 (May, 1989), pp. 303-308

    Isotone Optimization in R : Pool-Adjacent-Violators
    Algorithm (PAVA) and Active Set Methods
    de Leeuw, Hornik, Mair
    Journal of Statistical Software 2009

    Correctness of Kruskal's algorithms for monotone regression with ties
    de Leeuw, Psychometrica, 1977

    Examples
    --------
    >>> from sklearn.datasets import make_regression
    >>> from sklearn.isotonic import IsotonicRegression
    >>> X, y = make_regression(n_samples=10, n_features=1, random_state=41)
    >>> iso_reg = IsotonicRegression().fit(X, y)
    >>> iso_reg.predict([.1, .2])
    array([1.8628, 3.7256])
    """

    # T should have been called X
    __metadata_request__predict = {"T": metadata_routing.UNUSED}
    __metadata_request__transform = {"T": metadata_routing.UNUSED}

    _parameter_constraints: dict = {
        "y_min": [Interval(Real, None, None, closed="both"), None],
        "y_max": [Interval(Real, None, None, closed="both"), None],
        "increasing": ["boolean", StrOptions({"auto"})],
        "out_of_bounds": [StrOptions({"nan", "clip", "raise"})],
    }

    def __init__(self, *, y_min=None, y_max=None, increasing=True, out_of_bounds="nan"):
        self.y_min = y_min
        self.y_max = y_max
        self.increasing = increasing
        self.out_of_bounds = out_of_bounds

    def _check_input_data_shape(self, X):
        if not (X.ndim == 1 or (X.ndim == 2 and X.shape[1] == 1)):
            msg = (
                "Isotonic regression input X should be a 1d array or "
                "2d array with 1 feature"
            )
            raise ValueError(msg)

    def _build_f(self, X, y):
        """Build the f_ interp1d function."""
        bounds_error = self.out_of_bounds == "raise"
        if len(y) == 1:
            # single y, constant prediction
            self.f_ = lambda x: y.repeat(x.shape)
        else:
            self.f_ = interpolate.interp1d(
                X, y, kind="linear", bounds_error=bounds_error
            )

    def _build_y(self, X, y, sample_weight, trim_duplicates=True):
        """Build the y_ IsotonicRegression."""
        self._check_input_data_shape(X)
        X = X.reshape(-1)  # use 1d view

        # Determine increasing if auto-determination requested
        if self.increasing == "auto":
            self.increasing_ = check_increasing(X, y)
        else:
            self.increasing_ = self.increasing

        # If sample_weights is passed, removed zero-weight values and clean
        # order
        sample_weight = _check_sample_weight(sample_weight, X, dtype=X.dtype)
        mask = sample_weight > 0
        X, y, sample_weight = X[mask], y[mask], sample_weight[mask]

        order = np.lexsort((y, X))
        X, y, sample_weight = [array[order] for array in [X, y, sample_weight]]
        unique_X, unique_y, unique_sample_weight = _make_unique(X, y, sample_weight)

        X = unique_X
        y = isotonic_regression(
            unique_y,
            sample_weight=unique_sample_weight,
            y_min=self.y_min,
            y_max=self.y_max,
            increasing=self.increasing_,
        )

        # Handle the left and right bounds on X
        self.X_min_, self.X_max_ = np.min(X), np.max(X)

        if trim_duplicates:
            # Remove unnecessary points for faster prediction: aside from the
            # first and last point, drop points whose y value equals both the
            # point before and the point after it.
            keep_data = np.ones((len(y),), dtype=bool)
            keep_data[1:-1] = np.logical_or(
                np.not_equal(y[1:-1], y[:-2]), np.not_equal(y[1:-1], y[2:])
            )
            return X[keep_data], y[keep_data]
        else:
            return X, y

    @_fit_context(prefer_skip_nested_validation=True)
    def fit(self, X, y, sample_weight=None):
        """Fit the model using X, y as training data.

        Parameters
        ----------
        X : array-like of shape (n_samples,) or (n_samples, 1)
            Training data.

            .. versionchanged:: 0.24
               Also accepts 2d array with 1 feature.

        y : array-like of shape (n_samples,)
            Training target.

        sample_weight : array-like of shape (n_samples,), default=None
            Weights. If set to None, all weights will be set to 1 (equal
            weights).

        Returns
        -------
        self : object
            Returns an instance of self.

        Notes
        -----
        X is stored for future use, as :meth:`transform` needs X to
        interpolate new input data.
        """
        check_params = dict(accept_sparse=False, ensure_2d=False)
        X = check_array(
            X, input_name="X", dtype=[np.float64, np.float32], **check_params
        )
        y = check_array(y, input_name="y", dtype=X.dtype, **check_params)
        check_consistent_length(X, y, sample_weight)

        # Transform y by running the isotonic regression algorithm and
        # transform X accordingly.
        X, y = self._build_y(X, y, sample_weight)

        # Store the non-redundant part of the training set on the model to
        # support model persistence: the object built by
        # scipy.interpolate.interp1d is not picklable directly.
        self.X_thresholds_, self.y_thresholds_ = X, y

        # Build the interpolation function
        self._build_f(X, y)
        return self

    def _transform(self, T):
        """`_transform` is called by both `transform` and `predict` methods.

        Since `transform` is wrapped to output arrays of specific types (e.g.
        NumPy arrays, pandas DataFrame), we cannot make `predict` call
        `transform` directly.

        The above behaviour could be changed in the future, if we decide to
        output other type of arrays when calling `predict`.
        """
        if hasattr(self, "X_thresholds_"):
            dtype = self.X_thresholds_.dtype
        else:
            dtype = np.float64

        T = check_array(T, dtype=dtype, ensure_2d=False)
        self._check_input_data_shape(T)
        T = T.reshape(-1)  # use 1d view

        if self.out_of_bounds == "clip":
            T = np.clip(T, self.X_min_, self.X_max_)

        res = self.f_(T)

        # on scipy 0.17, interp1d up-casts to float64, so we cast back
        res = res.astype(T.dtype)

        return res

    def transform(self, T):
        """Transform new data by linear interpolation.

        Parameters
        ----------
        T : array-like of shape (n_samples,) or (n_samples, 1)
            Data to transform.

            .. versionchanged:: 0.24
               Also accepts 2d array with 1 feature.

        Returns
        -------
        y_pred : ndarray of shape (n_samples,)
            The transformed data.
        """
        return self._transform(T)

    def predict(self, T):
        """Predict new data by linear interpolation.

        Parameters
        ----------
        T : array-like of shape (n_samples,) or (n_samples, 1)
            Data to transform.

        Returns
        -------
        y_pred : ndarray of shape (n_samples,)
            Transformed data.
        """
        return self._transform(T)

    def get_feature_names_out(self, input_features=None):
        """Get output feature names for transformation.

        Parameters
        ----------
        input_features : array-like of str or None, default=None
            Ignored.

        Returns
        -------
        feature_names_out : ndarray of str objects
            An ndarray with one string i.e. ["isotonicregression0"].
        """
        check_is_fitted(self, "f_")
        class_name = self.__class__.__name__.lower()
        return np.asarray([f"{class_name}0"], dtype=object)

    def __getstate__(self):
        """Pickle-protocol - return state of the estimator."""
        state = super().__getstate__()
        # remove the unpicklable interpolation method
        state.pop("f_", None)
        return state

    def __setstate__(self, state):
        """Pickle-protocol - set state of the estimator.

        We need to rebuild the interpolation function.
        """
        super().__setstate__(state)
        if hasattr(self, "X_thresholds_") and hasattr(self, "y_thresholds_"):
            self._build_f(self.X_thresholds_, self.y_thresholds_)

    def __sklearn_tags__(self):
        tags = super().__sklearn_tags__()
        tags.input_tags.one_d_array = True
        tags.input_tags.two_d_array = False
        return tags
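The public API above (``check_increasing``, ``isotonic_regression``, and the ``IsotonicRegression`` estimator) can be exercised end to end with a short sketch against the installed scikit-learn package. The numbers below follow directly from the docstring examples and the pool-adjacent-violators semantics: the first four points pool to their mean 2.75 and the remaining six to 44/6.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression, check_increasing, isotonic_regression

# Pool-adjacent-violators fit: the first four values pool to mean 2.75,
# the remaining six pool to mean 44/6 (about 7.33).
y_fit = isotonic_regression([5, 3, 1, 2, 8, 10, 7, 9, 6, 4])

# Direction detection via the sign of the Spearman correlation estimate;
# a perfectly decreasing target yields a falsy result.
decreasing = check_increasing([1, 2, 3, 4, 5], [10, 8, 6, 4, 2])

# Estimator interface; out_of_bounds="clip" snaps queries outside
# [X_min_, X_max_] to the nearest training-interval endpoint.
iso = IsotonicRegression(out_of_bounds="clip").fit([1, 2, 3, 4], [1, 3, 2, 4])
pred = iso.predict([0.0, 2.5, 10.0])
```

On the last line, the fitted thresholds are y = [1, 2.5, 2.5, 4]: the query at 0.0 is clipped to X_min_ = 1 and the query at 10.0 to X_max_ = 4, while 2.5 is linearly interpolated between the two tied thresholds.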