import warnings
from numbers import Real

import numpy as np
from scipy import sparse
from scipy.optimize import linprog

from ..base import BaseEstimator, RegressorMixin, _fit_context
from ..exceptions import ConvergenceWarning
from ..utils import _safe_indexing
from ..utils._param_validation import Interval, StrOptions
from ..utils.fixes import parse_version, sp_version
from ..utils.validation import _check_sample_weight, validate_data
from ._base import LinearModel


class QuantileRegressor(LinearModel, RegressorMixin, BaseEstimator):
    """Linear regression model that predicts conditional quantiles.

    The linear :class:`QuantileRegressor` optimizes the pinball loss for a
    desired `quantile` and is robust to outliers.

    This model uses an L1 regularization like
    :class:`~sklearn.linear_model.Lasso`.

    Read more in the :ref:`User Guide <quantile_regression>`.

    .. versionadded:: 1.0

    Parameters
    ----------
    quantile : float, default=0.5
        The quantile that the model tries to predict. It must be strictly
        between 0 and 1. If 0.5 (default), the model predicts the 50%
        quantile, i.e. the median.

    alpha : float, default=1.0
        Regularization constant that multiplies the L1 penalty term.

    fit_intercept : bool, default=True
        Whether or not to fit the intercept.

    solver : {'highs-ds', 'highs-ipm', 'highs', 'interior-point', \
            'revised simplex'}, default='highs'
        Method used by :func:`scipy.optimize.linprog` to solve the linear
        programming formulation.

        It is recommended to use the highs methods because they are the
        fastest ones. Solvers "highs-ds", "highs-ipm" and "highs" support
        sparse input data and, in fact, always convert to sparse csc.

        From `scipy>=1.11.0`, "interior-point" is not available anymore.

        .. versionchanged:: 1.4
           The default of `solver` changed to `"highs"` in version 1.4.

    solver_options : dict, default=None
        Additional parameters passed to :func:`scipy.optimize.linprog` as
        options. If `None` and if `solver='interior-point'`, then
        `{"lstsq": True}` is passed to :func:`scipy.optimize.linprog` for the
        sake of stability.

    Attributes
    ----------
    coef_ : array of shape (n_features,)
        Estimated coefficients for the features.

    intercept_ : float
        The intercept of the model, aka bias term.

    n_features_in_ : int
        Number of features seen during :term:`fit`.

        .. versionadded:: 0.24

    feature_names_in_ : ndarray of shape (`n_features_in_`,)
        Names of features seen during :term:`fit`. Defined only when `X`
        has feature names that are all strings.

        .. versionadded:: 1.0

    n_iter_ : int
        The actual number of iterations performed by the solver.

    See Also
    --------
    Lasso : The Lasso is a linear model that estimates sparse coefficients
        with l1 regularization.
    HuberRegressor : Linear regression model that is robust to outliers.

    Examples
    --------
    >>> from sklearn.linear_model import QuantileRegressor
    >>> import numpy as np
    >>> n_samples, n_features = 10, 2
    >>> rng = np.random.RandomState(0)
    >>> y = rng.randn(n_samples)
    >>> X = rng.randn(n_samples, n_features)
    >>> # the two following lines are optional in practice
    >>> from sklearn.utils.fixes import sp_version, parse_version
    >>> reg = QuantileRegressor(quantile=0.8).fit(X, y)
    >>> np.mean(y <= reg.predict(X))
    np.float64(0.8)
    """

    _parameter_constraints: dict = {
        "quantile": [Interval(Real, 0, 1, closed="neither")],
        "alpha": [Interval(Real, 0, None, closed="left")],
        "fit_intercept": ["boolean"],
        "solver": [
            StrOptions(
                {
                    "highs-ds",
                    "highs-ipm",
                    "highs",
                    "interior-point",
                    "revised simplex",
                }
            ),
        ],
        "solver_options": [dict, None],
    }

    def __init__(
        self,
        *,
        quantile=0.5,
        alpha=1.0,
        fit_intercept=True,
        solver="highs",
        solver_options=None,
    ):
        self.quantile = quantile
        self.alpha = alpha
        self.fit_intercept = fit_intercept
        self.solver = solver
        self.solver_options = solver_options

    @_fit_context(prefer_skip_nested_validation=True)
    def fit(self, X, y, sample_weight=None):
        """Fit the model according to the given training data.

        Parameters
        ----------
        X : {array-like, sparse matrix} of shape (n_samples, n_features)
            Training data.

        y : array-like of shape (n_samples,)
            Target values.

        sample_weight : array-like of shape (n_samples,), default=None
            Sample weights.

        Returns
        -------
        self : object
            Returns self.
        """
        X, y = validate_data(
            self,
            X,
            y,
            accept_sparse=["csc", "csr", "coo"],
            y_numeric=True,
            multi_output=False,
        )
        sample_weight = _check_sample_weight(sample_weight, X)

        n_features = X.shape[1]
        n_params = n_features

        if self.fit_intercept:
            n_params += 1

        # The objective is defined as 1/n * sum(pinball loss) + alpha * L1.
        # So we rescale the penalty term, which is equivalent.
        alpha = np.sum(sample_weight) * self.alpha

        if self.solver == "interior-point" and sp_version >= parse_version("1.11.0"):
            raise ValueError(
                f"Solver {self.solver} is not anymore available in SciPy >= 1.11.0."
            )

        if sparse.issparse(X) and self.solver not in [
            "highs",
            "highs-ds",
            "highs-ipm",
        ]:
            raise ValueError(
                f"Solver {self.solver} does not support sparse X. "
                "Use solver 'highs' for example."
            )
        # make default solver more stable
        if self.solver_options is None and self.solver == "interior-point":
            solver_options = {"lstsq": True}
        else:
            solver_options = self.solver_options

        # Use the linear programming formulation of quantile regression:
        #     min_x c x
        #           A_eq x = b_eq
        #                0 <= x
        # with the decision variables x = (s0, s, t0, t, u, v), all >= 0, where
        #     intercept = s0 - t0
        #     coef = s - t
        #     residual = y - X @ coef - intercept = u - v
        # Filtering out zero sample weights from the beginning makes life
        # easier for the linprog solver.
        indices = np.nonzero(sample_weight)[0]
        n_indices = len(indices)
        if n_indices < len(sample_weight):
            sample_weight = sample_weight[indices]
            X = _safe_indexing(X, indices)
            y = _safe_indexing(y, indices)
        c = np.concatenate(
            [
                np.full(2 * n_params, fill_value=alpha),
                sample_weight * self.quantile,
                sample_weight * (1 - self.quantile),
            ]
        )
        if self.fit_intercept:
            # do not penalize the intercept
            c[0] = 0
            c[n_params] = 0

        if self.solver in ["highs", "highs-ds", "highs-ipm"]:
            # The highs methods internally use a sparse CSC memory layout,
            # so build CSC matrices as early as possible to avoid repeated
            # memory copies.
            eye = sparse.eye(n_indices, dtype=X.dtype, format="csc")
            if self.fit_intercept:
                ones = sparse.csc_matrix(np.ones(shape=(n_indices, 1), dtype=X.dtype))
                A_eq = sparse.hstack([ones, X, -ones, -X, eye, -eye], format="csc")
            else:
                A_eq = sparse.hstack([X, -X, eye, -eye], format="csc")
        else:
            eye = np.eye(n_indices)
            if self.fit_intercept:
                ones = np.ones((n_indices, 1))
                A_eq = np.concatenate([ones, X, -ones, -X, eye, -eye], axis=1)
            else:
                A_eq = np.concatenate([X, -X, eye, -eye], axis=1)

        b_eq = y

        result = linprog(
            c=c,
            A_eq=A_eq,
            b_eq=b_eq,
            method=self.solver,
            options=solver_options,
        )
        solution = result.x
        if not result.success:
            failure = {
                1: "Iteration limit reached.",
                2: "Problem appears to be infeasible.",
                3: "Problem appears to be unbounded.",
                4: "Numerical difficulties encountered.",
            }
            warnings.warn(
                "Linear programming for QuantileRegressor did not succeed.\n"
                f"Status is {result.status}: "
                + failure.setdefault(result.status, "unknown reason")
                + "\n"
                + "Result message of linprog:\n"
                + result.message,
                ConvergenceWarning,
            )

        # positive slack - negative slack
        params = solution[:n_params] - solution[n_params : 2 * n_params]

        self.n_iter_ = result.nit

        if self.fit_intercept:
            self.coef_ = params[1:]
            self.intercept_ = params[0]
        else:
            self.coef_ = params
            self.intercept_ = 0.0
        return self

    def __sklearn_tags__(self):
        tags = super().__sklearn_tags__()
        tags.input_tags.sparse = True
        return tags
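# A minimal, dependency-free sketch of the pinball loss that this estimator
# minimizes (plus an L1 penalty); the helper name `pinball_loss` is ours and
# is not part of scikit-learn. It illustrates why quantile=0.5 recovers the
# median: among constant predictions, the median minimizes this loss.

```python
def pinball_loss(y_true, y_pred, quantile):
    """Mean pinball loss: q * max(y - p, 0) + (1 - q) * max(p - y, 0)."""
    total = 0.0
    for y, p in zip(y_true, y_pred):
        diff = y - p
        # underprediction is weighted by q, overprediction by (1 - q)
        total += quantile * diff if diff >= 0 else (quantile - 1) * diff
    return total / len(y_true)


y = [1.0, 2.0, 3.0, 10.0]
# A constant prediction at the median (2.5) scores better than a
# non-median constant (4.0) for quantile=0.5.
print(pinball_loss(y, [2.5] * 4, 0.5))  # 1.25
print(pinball_loss(y, [4.0] * 4, 0.5))  # 1.5
```

# Note the asymmetry for quantile != 0.5: with quantile=0.8,
# underpredicting costs four times as much as overpredicting, which pushes
# the optimal constant toward the 80th percentile of y.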