"""Base class for mixture models."""

import warnings
from abc import ABCMeta, abstractmethod
from numbers import Integral, Real
from time import time

import numpy as np
from scipy.special import logsumexp

from .. import cluster
from ..base import BaseEstimator, DensityMixin, _fit_context
from ..cluster import kmeans_plusplus
from ..exceptions import ConvergenceWarning
from ..utils import check_random_state
from ..utils._param_validation import Interval, StrOptions
from ..utils.validation import check_is_fitted, validate_data


def _check_shape(param, param_shape, name):
    """Validate the shape of the input parameter 'param'.

    Parameters
    ----------
    param : array

    param_shape : tuple

    name : str
    """
    param = np.array(param)
    if param.shape != param_shape:
        raise ValueError(
            f"The parameter '{name}' should have the shape of {param_shape}, "
            f"but got {param.shape}"
        )


class BaseMixture(DensityMixin, BaseEstimator, metaclass=ABCMeta):
    """Base class for mixture models.

    This abstract class specifies an interface for all mixture classes and
    provides basic common methods for mixture models.
    """

    _parameter_constraints: dict = {
        "n_components": [Interval(Integral, 1, None, closed="left")],
        "tol": [Interval(Real, 0.0, None, closed="left")],
        "reg_covar": [Interval(Real, 0.0, None, closed="left")],
        "max_iter": [Interval(Integral, 0, None, closed="left")],
        "n_init": [Interval(Integral, 1, None, closed="left")],
        "init_params": [
            StrOptions({"kmeans", "random", "random_from_data", "k-means++"})
        ],
        "random_state": ["random_state"],
        "warm_start": ["boolean"],
        "verbose": ["verbose"],
        "verbose_interval": [Interval(Integral, 1, None, closed="left")],
    }

    def __init__(
        self,
        n_components,
        tol,
        reg_covar,
        max_iter,
        n_init,
        init_params,
        random_state,
        warm_start,
        verbose,
        verbose_interval,
    ):
        self.n_components = n_components
        self.tol = tol
        self.reg_covar = reg_covar
        self.max_iter = max_iter
        self.n_init = n_init
        self.init_params = init_params
        self.random_state = random_state
        self.warm_start = warm_start
        self.verbose = verbose
        self.verbose_interval = verbose_interval

    @abstractmethod
    def _check_parameters(self, X):
        """Check initial parameters of the derived class.

        Parameters
        ----------
        X : array-like of shape (n_samples, n_features)
        """

    def _initialize_parameters(self, X, random_state):
        """Initialize the model parameters.

        Parameters
        ----------
        X : array-like of shape (n_samples, n_features)

        random_state : RandomState
            A random number generator instance that controls the random seed
            used for the method chosen to initialize the parameters.
        """
        n_samples, _ = X.shape

        if self.init_params == "kmeans":
            resp = np.zeros((n_samples, self.n_components), dtype=X.dtype)
            label = (
                cluster.KMeans(
                    n_clusters=self.n_components, n_init=1, random_state=random_state
                )
                .fit(X)
                .labels_
            )
            resp[np.arange(n_samples), label] = 1
        elif self.init_params == "random":
            resp = np.asarray(
                random_state.uniform(size=(n_samples, self.n_components)),
                dtype=X.dtype,
            )
            resp /= resp.sum(axis=1)[:, np.newaxis]
        elif self.init_params == "random_from_data":
            resp = np.zeros((n_samples, self.n_components), dtype=X.dtype)
            indices = random_state.choice(
                n_samples, size=self.n_components, replace=False
            )
            resp[indices, np.arange(self.n_components)] = 1
        elif self.init_params == "k-means++":
            resp = np.zeros((n_samples, self.n_components), dtype=X.dtype)
            _, indices = kmeans_plusplus(
                X, self.n_components, random_state=random_state
            )
            resp[indices, np.arange(self.n_components)] = 1

        self._initialize(X, resp)

    @abstractmethod
    def _initialize(self, X, resp):
        """Initialize the model parameters of the derived class.

        Parameters
        ----------
        X : array-like of shape (n_samples, n_features)

        resp : array-like of shape (n_samples, n_components)
        """

    def fit(self, X, y=None):
        """Estimate model parameters with the EM algorithm.

        Parameters
        ----------
        X : array-like of shape (n_samples, n_features)
            List of n_features-dimensional data points. Each row
            corresponds to a single data point.

        y : Ignored
            Not used, present for API consistency by convention.

        Returns
        -------
        self : object
            The fitted mixture.
        """
        # The parameters are fitted by fit_predict; the predicted labels
        # are simply discarded here.
        self.fit_predict(X, y)
        return self

    @_fit_context(prefer_skip_nested_validation=True)
    def fit_predict(self, X, y=None):
        """Estimate model parameters using X and predict the labels for X.

        The method fits the model n_init times and sets the parameters with
        which the model has the largest likelihood or lower bound. Within each
        trial, the method iterates between E-step and M-step for `max_iter`
        times until the change of likelihood or lower bound is less than
        `tol`, otherwise, a :class:`~sklearn.exceptions.ConvergenceWarning` is
        raised. After fitting, it predicts the most probable label for the
        input data points.

        .. versionadded:: 0.20

        Parameters
        ----------
        X : array-like of shape (n_samples, n_features)
            List of n_features-dimensional data points. Each row
            corresponds to a single data point.

        y : Ignored
            Not used, present for API consistency by convention.

        Returns
        -------
        labels : array, shape (n_samples,)
            Component labels.
        """
        X = validate_data(
            self, X, dtype=[np.float64, np.float32], ensure_min_samples=2
        )
        if X.shape[0] < self.n_components:
            raise ValueError(
                "Expected n_samples >= n_components "
                f"but got n_components = {self.n_components}, "
                f"n_samples = {X.shape[0]}"
            )
        self._check_parameters(X)

        # If we enable warm_start, we will have a unique initialization.
        do_init = not (self.warm_start and hasattr(self, "converged_"))
        n_init = self.n_init if do_init else 1

        max_lower_bound = -np.inf
        best_lower_bounds = []
        self.converged_ = False

        random_state = check_random_state(self.random_state)

        n_samples, _ = X.shape
        for init in range(n_init):
            self._print_verbose_msg_init_beg(init)

            if do_init:
                self._initialize_parameters(X, random_state)

            lower_bound = -np.inf if do_init else self.lower_bound_
            current_lower_bounds = []

            if self.max_iter == 0:
                best_params = self._get_parameters()
                best_n_iter = 0
            else:
                converged = False
                for n_iter in range(1, self.max_iter + 1):
                    prev_lower_bound = lower_bound

                    log_prob_norm, log_resp = self._e_step(X)
                    self._m_step(X, log_resp)
                    lower_bound = self._compute_lower_bound(
                        log_resp, log_prob_norm
                    )
                    current_lower_bounds.append(lower_bound)

                    change = lower_bound - prev_lower_bound
                    self._print_verbose_msg_iter_end(n_iter, change)

                    if abs(change) < self.tol:
                        converged = True
                        break

                self._print_verbose_msg_init_end(lower_bound, converged)

                if lower_bound > max_lower_bound or max_lower_bound == -np.inf:
                    max_lower_bound = lower_bound
                    best_params = self._get_parameters()
                    best_n_iter = n_iter
                    best_lower_bounds = current_lower_bounds
                    self.converged_ = converged

        if not self.converged_ and self.max_iter > 0:
            warnings.warn(
                (
                    "Best performing initialization did not converge. "
                    "Try different init parameters, or increase max_iter, "
                    "tol, or check for degenerate data."
                ),
                ConvergenceWarning,
            )

        self._set_parameters(best_params)
        self.n_iter_ = best_n_iter
        self.lower_bound_ = max_lower_bound
        self.lower_bounds_ = best_lower_bounds

        # Always do a final e-step to guarantee that the labels returned by
        # fit_predict(X) are always consistent with fit(X).predict(X).
        _, log_resp = self._e_step(X)

        return log_resp.argmax(axis=1)

    def _e_step(self, X):
        """E step.

        Parameters
        ----------
        X : array-like of shape (n_samples, n_features)

        Returns
        -------
        log_prob_norm : float
            Mean of the logarithms of the probabilities of each sample in X

        log_responsibility : array, shape (n_samples, n_components)
            Logarithm of the posterior probabilities (or responsibilities) of
            the point of each sample in X.
        """
        log_prob_norm, log_resp = self._estimate_log_prob_resp(X)
        return np.mean(log_prob_norm), log_resp

    @abstractmethod
    def _m_step(self, X, log_resp):
        """M step.

        Parameters
        ----------
        X : array-like of shape (n_samples, n_features)

        log_resp : array-like of shape (n_samples, n_components)
            Logarithm of the posterior probabilities (or responsibilities) of
            the point of each sample in X.
        """

    @abstractmethod
    def _get_parameters(self):
        pass

    @abstractmethod
    def _set_parameters(self, params):
        pass

    def score_samples(self, X):
        """Compute the log-likelihood of each sample.

        Parameters
        ----------
        X : array-like of shape (n_samples, n_features)
            List of n_features-dimensional data points. Each row
            corresponds to a single data point.

        Returns
        -------
        log_prob : array, shape (n_samples,)
            Log-likelihood of each sample in `X` under the current model.
        """
        check_is_fitted(self)
        X = validate_data(self, X, reset=False)

        return logsumexp(self._estimate_weighted_log_prob(X), axis=1)

    def score(self, X, y=None):
        """Compute the per-sample average log-likelihood of the given data X.

        Parameters
        ----------
        X : array-like of shape (n_samples, n_dimensions)
            List of n_features-dimensional data points. Each row
            corresponds to a single data point.

        y : Ignored
            Not used, present for API consistency by convention.

        Returns
        -------
        log_likelihood : float
            Log-likelihood of `X` under the Gaussian mixture model.
        """
        return self.score_samples(X).mean()

    def predict(self, X):
        """Predict the labels for the data samples in X using trained model.

        Parameters
        ----------
        X : array-like of shape (n_samples, n_features)
            List of n_features-dimensional data points. Each row
            corresponds to a single data point.

        Returns
        -------
        labels : array, shape (n_samples,)
            Component labels.
        """
        check_is_fitted(self)
        X = validate_data(self, X, reset=False)
        return self._estimate_weighted_log_prob(X).argmax(axis=1)

    def predict_proba(self, X):
        """Evaluate the components' density for each sample.

        Parameters
        ----------
        X : array-like of shape (n_samples, n_features)
            List of n_features-dimensional data points. Each row
            corresponds to a single data point.

        Returns
        -------
        resp : array, shape (n_samples, n_components)
            Density of each Gaussian component for each sample in X.
        """
        check_is_fitted(self)
        X = validate_data(self, X, reset=False)
        _, log_resp = self._estimate_log_prob_resp(X)
        return np.exp(log_resp)

    def sample(self, n_samples=1):
        """Generate random samples from the fitted Gaussian distribution.

        Parameters
        ----------
        n_samples : int, default=1
            Number of samples to generate.

        Returns
        -------
        X : array, shape (n_samples, n_features)
            Randomly generated sample.

        y : array, shape (nsamples,)
            Component labels.
        """
        check_is_fitted(self)

        if n_samples < 1:
            raise ValueError(
                "Invalid value for 'n_samples': %d . The sampling requires at "
                "least one sample." % (n_samples)
            )

        _, n_features = self.means_.shape
        rng = check_random_state(self.random_state)
        n_samples_comp = rng.multinomial(n_samples, self.weights_)

        if self.covariance_type == "full":
            X = np.vstack(
                [
                    rng.multivariate_normal(mean, covariance, int(sample))
                    for (mean, covariance, sample) in zip(
                        self.means_, self.covariances_, n_samples_comp
                    )
                ]
            )
        elif self.covariance_type == "tied":
            X = np.vstack(
                [
                    rng.multivariate_normal(mean, self.covariances_, int(sample))
                    for (mean, sample) in zip(self.means_, n_samples_comp)
                ]
            )
        else:
            X = np.vstack(
                [
                    mean
                    + rng.standard_normal(size=(sample, n_features))
                    * np.sqrt(covariance)
                    for (mean, covariance, sample) in zip(
                        self.means_, self.covariances_, n_samples_comp
                    )
                ]
            )

        y = np.concatenate(
            [
                np.full(sample, j, dtype=int)
                for j, sample in enumerate(n_samples_comp)
            ]
        )

        return (X, y)

    def _estimate_weighted_log_prob(self, X):
        """Estimate the weighted log-probabilities, log P(X | Z) + log weights.

        Parameters
        ----------
        X : array-like of shape (n_samples, n_features)

        Returns
        -------
        weighted_log_prob : array, shape (n_samples, n_components)
        """
        return self._estimate_log_prob(X) + self._estimate_log_weights()

    @abstractmethod
    def _estimate_log_weights(self):
        """Estimate log-weights in EM algorithm, E[ log pi ] in VB algorithm.

        Returns
        -------
        log_weight : array, shape (n_components, )
        """

    @abstractmethod
    def _estimate_log_prob(self, X):
        """Estimate the log-probabilities log P(X | Z).

        Compute the log-probabilities per each component for each sample.

        Parameters
        ----------
        X : array-like of shape (n_samples, n_features)

        Returns
        -------
        log_prob : array, shape (n_samples, n_components)
        """

    def _estimate_log_prob_resp(self, X):
        """Estimate log probabilities and responsibilities for each sample.

        Compute the log probabilities, weighted log probabilities per
        component and responsibilities for each sample in X with respect to
        the current state of the model.

        Parameters
        ----------
        X : array-like of shape (n_samples, n_features)

        Returns
        -------
        log_prob_norm : array, shape (n_samples,)
            log p(X)

        log_responsibilities : array, shape (n_samples, n_components)
            logarithm of the responsibilities
        """
        weighted_log_prob = self._estimate_weighted_log_prob(X)
        log_prob_norm = logsumexp(weighted_log_prob, axis=1)
        with np.errstate(under="ignore"):
            # ignore underflow
            log_resp = weighted_log_prob - log_prob_norm[:, np.newaxis]
        return log_prob_norm, log_resp

    def _print_verbose_msg_init_beg(self, n_init):
        """Print verbose message on initialization."""
        if self.verbose == 1:
            print("Initialization %d" % n_init)
        elif self.verbose >= 2:
            print("Initialization %d" % n_init)
            self._init_prev_time = time()
            self._iter_prev_time = self._init_prev_time

    def _print_verbose_msg_iter_end(self, n_iter, diff_ll):
        """Print verbose message on iteration."""
        if n_iter % self.verbose_interval == 0:
            if self.verbose == 1:
                print("  Iteration %d" % n_iter)
            elif self.verbose >= 2:
                cur_time = time()
                print(
                    "  Iteration %d\t time lapse %.5fs\t ll change %.5f"
                    % (n_iter, cur_time - self._iter_prev_time, diff_ll)
                )
                self._iter_prev_time = cur_time

    def _print_verbose_msg_init_end(self, lb, init_has_converged):
        """Print verbose message on the end of iteration."""
        converged_msg = "converged" if init_has_converged else "did not converge"
        if self.verbose == 1:
            print(f"Initialization {converged_msg}.")
        elif self.verbose >= 2:
            t = time() - self._init_prev_time
            print(
                f"Initialization {converged_msg}. time lapse {t:.5f}s\t "
                f"lower bound {lb:.5f}."
            )