# Natural Language Toolkit: Expectation Maximization Clusterer

try:
    import numpy
except ImportError:
    pass

from nltk.cluster.util import VectorSpaceClusterer


class EMClusterer(VectorSpaceClusterer):
    """
    The Gaussian EM clusterer models the vectors as being produced by
    a mixture of k Gaussian sources. The parameters of these sources
    (prior probability, mean and covariance matrix) are then found to
    maximise the likelihood of the given data. This is done with the
    expectation maximisation algorithm. It starts with k arbitrarily
    chosen means, priors and covariance matrices. It then calculates
    the membership probabilities for each vector in each of the
    clusters; this is the 'E' step. The cluster parameters are then
    updated in the 'M' step using the maximum likelihood estimate from
    the cluster membership probabilities. This process continues until
    the likelihood of the data does not significantly increase.
    """

    def __init__(
        self,
        initial_means,
        priors=None,
        covariance_matrices=None,
        conv_threshold=1e-6,
        bias=0.1,
        normalise=False,
        svd_dimensions=None,
    ):
        """
        Creates an EM clusterer with the given starting parameters,
        convergence threshold and vector mangling parameters.

        :param  initial_means: the means of the gaussian cluster centers
        :type   initial_means: [seq of] numpy array or seq of SparseArray
        :param  priors: the prior probability for each cluster
        :type   priors: numpy array or seq of float
        :param  covariance_matrices: the covariance matrix for each cluster
        :type   covariance_matrices: [seq of] numpy array
        :param  conv_threshold: maximum change in likelihood before deemed
                    convergent
        :type   conv_threshold: int or float
        :param  bias: variance bias used to ensure non-singular covariance
                    matrices
        :type   bias: float
        :param  normalise: should vectors be normalised to length 1
        :type   normalise: boolean
        :param  svd_dimensions: number of dimensions to use in reducing vector
                    dimensionsionality with SVD
        :type   svd_dimensions: int
        """
        VectorSpaceClusterer.__init__(self, normalise, svd_dimensions)
        self._means = numpy.array(initial_means, numpy.float64)
        self._num_clusters = len(initial_means)
        self._conv_threshold = conv_threshold
        self._covariance_matrices = covariance_matrices
        self._priors = priors
        self._bias = bias

    def num_clusters(self):
        return self._num_clusters

    def cluster_vectorspace(self, vectors, trace=False):
        assert len(vectors) > 0

        # set the parameters to initial values
        dimensions = len(vectors[0])
        means = self._means
        priors = self._priors
        if not priors:
            priors = self._priors = (
                numpy.ones(self._num_clusters, numpy.float64) / self._num_clusters
            )
        covariances = self._covariance_matrices
        if not covariances:
            covariances = self._covariance_matrices = [
                numpy.identity(dimensions, numpy.float64)
                for i in range(self._num_clusters)
            ]

        # do the E and M steps until the likelihood plateaus
        lastl = self._loglikelihood(vectors, priors, means, covariances)
        converged = False

        while not converged:
            if trace:
                print("iteration; loglikelihood", lastl)
            # E-step, calculate hidden variables, h[i, j]
            h = numpy.zeros((len(vectors), self._num_clusters), numpy.float64)
            for i in range(len(vectors)):
                for j in range(self._num_clusters):
                    h[i, j] = priors[j] * self._gaussian(
                        means[j], covariances[j], vectors[i]
                    )
                h[i, :] /= sum(h[i, :])

            # M-step, update the priors, means and covariance matrices
            for j in range(self._num_clusters):
                new_covariance = numpy.zeros((dimensions, dimensions), numpy.float64)
                new_mean = numpy.zeros(dimensions, numpy.float64)
                sum_hj = 0.0
                for i in range(len(vectors)):
                    delta = vectors[i] - means[j]
                    new_covariance += h[i, j] * numpy.multiply.outer(delta, delta)
                    sum_hj += h[i, j]
                    new_mean += h[i, j] * vectors[i]
                covariances[j] = new_covariance / sum_hj
                means[j] = new_mean / sum_hj
                priors[j] = sum_hj / len(vectors)

                # bias term to stop covariance matrix being singular
                covariances[j] += self._bias * numpy.identity(dimensions, numpy.float64)

            # check for convergence
            l = self._loglikelihood(vectors, priors, means, covariances)
            if abs(lastl - l) < self._conv_threshold:
                converged = True
            lastl = l

    def classify_vectorspace(self, vector):
        best = None
        for j in range(self._num_clusters):
            p = self._priors[j] * self._gaussian(
                self._means[j], self._covariance_matrices[j], vector
            )
            if not best or p > best[0]:
                best = (p, j)
        return best[1]

    def likelihood_vectorspace(self, vector, cluster):
        cid = self.cluster_names().index(cluster)
        return self._priors[cid] * self._gaussian(
            self._means[cid], self._covariance_matrices[cid], vector
        )

    def _gaussian(self, mean, cvm, x):
        m = len(mean)
        assert cvm.shape == (m, m), "bad sized covariance matrix, %s" % str(cvm.shape)
        try:
            det = numpy.linalg.det(cvm)
            inv = numpy.linalg.inv(cvm)
            a = det**-0.5 * (2 * numpy.pi) ** (-m / 2.0)
            dx = x - mean
            b = -0.5 * numpy.dot(numpy.dot(dx, inv), dx)
            return a * numpy.exp(b)
        except OverflowError:
            # happens when the exponent is negative infinity, i.e. b = 0
            # i.e. the inverse of cvm is huge (cvm is almost zero)
            return 0

    def _loglikelihood(self, vectors, priors, means, covariances):
        llh = 0.0
        for vector in vectors:
            p = 0
            for j in range(len(priors)):
                p += priors[j] * self._gaussian(means[j], covariances[j], vector)
            llh += numpy.log(p)
        return llh

    def __repr__(self):
        return "<EMClusterer means=%s>" % list(self._means)


def demo():
    """
    Non-interactive demonstration of the clusterers with simple 2-D data.
    """

    from nltk import cluster

    # use a small set of 2-D vectors and two initial means
    vectors = [numpy.array(f) for f in [[0.5, 0.5], [1.5, 0.5], [1, 3]]]
    means = [[4, 2], [4, 2.01]]

    clusterer = cluster.EMClusterer(means, bias=0.1)
    clusters = clusterer.cluster(vectors, True, trace=True)

    print("Clustered:", vectors)
    print("As:       ", clusters)
    print()

    for c in range(2):
        print("Cluster:", c)
        print("Prior:  ", clusterer._priors[c])
        print("Mean:   ", clusterer._means[c])
        print("Covar:  ", clusterer._covariance_matrices[c])
        print()

    # classify a new vector
    vector = numpy.array([2, 2])
    print("classify(%s):" % vector, end=" ")
    print(clusterer.classify(vector))

    # show the classification probabilities
    vector = numpy.array([2, 2])
    print("classification_probdist(%s):" % vector)
    pdist = clusterer.classification_probdist(vector)
    for sample in pdist.samples():
        print(f"{sample} => {pdist.prob(sample) * 100:.0f}%")


if __name__ == "__main__":
    demo()
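The E and M steps that `cluster_vectorspace` performs can be sketched in plain numpy. This is a minimal illustration of the same algorithm, not part of the NLTK API; the helper name `em_sketch` and its return values are assumptions made for the example.

```python
# Minimal sketch of the EM loop for a Gaussian mixture, assuming only numpy.
# `em_sketch` is a hypothetical helper, not an NLTK function.
import numpy


def em_sketch(vectors, means, bias=0.1, conv_threshold=1e-6, max_iter=100):
    """Return (priors, means, covariances, responsibilities)."""
    vectors = numpy.asarray(vectors, numpy.float64)
    means = numpy.array(means, numpy.float64)
    k = len(means)
    n, d = vectors.shape
    priors = numpy.ones(k) / k                      # uniform starting priors
    covariances = [numpy.identity(d) for _ in range(k)]

    def gaussian(mean, cvm, x):
        # multivariate normal density at x
        det = numpy.linalg.det(cvm)
        inv = numpy.linalg.inv(cvm)
        a = det ** -0.5 * (2 * numpy.pi) ** (-d / 2.0)
        dx = x - mean
        return a * numpy.exp(-0.5 * dx @ inv @ dx)

    def loglikelihood():
        return sum(
            numpy.log(sum(priors[j] * gaussian(means[j], covariances[j], v)
                          for j in range(k)))
            for v in vectors
        )

    lastl = loglikelihood()
    h = numpy.zeros((n, k))
    for _ in range(max_iter):
        # E-step: membership probabilities h[i, j], normalised per vector
        for i in range(n):
            for j in range(k):
                h[i, j] = priors[j] * gaussian(means[j], covariances[j], vectors[i])
            h[i, :] /= h[i, :].sum()

        # M-step: maximum-likelihood re-estimates weighted by h
        for j in range(k):
            sum_hj = h[:, j].sum()
            means[j] = (h[:, j][:, None] * vectors).sum(axis=0) / sum_hj
            delta = vectors - means[j]
            covariances[j] = (
                h[:, j][:, None, None] * numpy.einsum("ni,nj->nij", delta, delta)
            ).sum(axis=0) / sum_hj
            covariances[j] += bias * numpy.identity(d)  # keep non-singular
            priors[j] = sum_hj / n

        # stop once the likelihood plateaus
        l = loglikelihood()
        if abs(lastl - l) < conv_threshold:
            break
        lastl = l
    return priors, means, covariances, h
```

After convergence the priors sum to one and each row of the responsibility matrix sums to one, mirroring the invariants the class maintains internally.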