"""Spectral Embedding."""

import warnings
from numbers import Integral, Real

import numpy as np
from scipy import sparse
from scipy.linalg import eigh
from scipy.sparse.csgraph import connected_components
from scipy.sparse.csgraph import laplacian as csgraph_laplacian
from scipy.sparse.linalg import eigsh, lobpcg

from ..base import BaseEstimator, _fit_context
from ..metrics.pairwise import rbf_kernel
from ..neighbors import NearestNeighbors, kneighbors_graph
from ..utils import check_array, check_random_state, check_symmetric
from ..utils._arpack import _init_arpack_v0
from ..utils._param_validation import Interval, StrOptions, validate_params
from ..utils.extmath import _deterministic_vector_sign_flip
from ..utils.fixes import parse_version, sp_version
from ..utils.validation import validate_data


def _graph_connected_component(graph, node_id):
    """Find the largest graph connected component that contains one given node.

    Parameters
    ----------
    graph : array-like of shape (n_samples, n_samples)
        Adjacency matrix of the graph, non-zero weight means an edge
        between the nodes.

    node_id : int
        The index of the query node of the graph.

    Returns
    -------
    connected_components_matrix : array-like of shape (n_samples,)
        An array of bool values indicating the indexes of the nodes belonging
        to the largest connected component containing the given query node.
    """
    n_node = graph.shape[0]
    if sparse.issparse(graph):
        # speed up row-wise access to boolean connectivity mask
        graph = graph.tocsr()
    connected_nodes = np.zeros(n_node, dtype=bool)
    nodes_to_explore = np.zeros(n_node, dtype=bool)
    nodes_to_explore[node_id] = True
    for _ in range(n_node):
        last_num_component = connected_nodes.sum()
        np.logical_or(connected_nodes, nodes_to_explore, out=connected_nodes)
        if last_num_component >= connected_nodes.sum():
            break
        indices = np.where(nodes_to_explore)[0]
        nodes_to_explore.fill(False)
        for i in indices:
            if sparse.issparse(graph):
                neighbors = graph[[i], :].toarray().ravel()
            else:
                neighbors = graph[i]
            np.logical_or(nodes_to_explore, neighbors, out=nodes_to_explore)
    return connected_nodes


def _graph_is_connected(graph):
    """Return whether the graph is connected (True) or not (False).

    Parameters
    ----------
    graph : {array-like, sparse matrix} of shape (n_samples, n_samples)
        Adjacency matrix of the graph, non-zero weight means an edge
        between the nodes.

    Returns
    -------
    is_connected : bool
        True means the graph is fully connected and False means not.
    """
    if sparse.issparse(graph):
        # Before SciPy 1.11.3, `connected_components` only supported
        # 32-bit indices.
        accept_large_sparse = sp_version >= parse_version("1.11.3")
        graph = check_array(
            graph, accept_sparse=True, accept_large_sparse=accept_large_sparse
        )
        # for a sparse graph, find all the connected components
        n_connected_components, _ = connected_components(graph)
        return n_connected_components == 1
    else:
        # for a dense graph, find all connected components starting from node 0
        return _graph_connected_component(graph, 0).sum() == graph.shape[0]


def _set_diag(laplacian, value, norm_laplacian):
    """Set the diagonal of the laplacian matrix and convert it to a sparse
    format well suited for eigenvalue decomposition.

    Parameters
    ----------
    laplacian : {ndarray, sparse matrix}
        The graph laplacian.

    value : float
        The value of the diagonal.

    norm_laplacian : bool
        Whether the value of the diagonal should be changed or not.

    Returns
    -------
    laplacian : {array, sparse matrix}
        An array or matrix in a form that is well suited to fast eigenvalue
        decomposition, depending on the bandwidth of the matrix.
    """
    n_nodes = laplacian.shape[0]
    # We need all entries in the diagonal to be set to `value`
    if not sparse.issparse(laplacian):
        if norm_laplacian:
            laplacian.flat[:: n_nodes + 1] = value
    else:
        laplacian = laplacian.tocoo()
        if norm_laplacian:
            diag_idx = laplacian.row == laplacian.col
            laplacian.data[diag_idx] = value
        # If the matrix has a small number of diagonals (as in the case of
        # structured matrices coming from images), the dia format might be
        # best suited for matvec products:
        n_diags = np.unique(laplacian.row - laplacian.col).size
        if n_diags <= 7:
            # 3 or fewer outer diagonals on each side
            laplacian = laplacian.todia()
        else:
            # csr has the fastest matvec and is thus best suited to arpack
            laplacian = laplacian.tocsr()
    return laplacian


@validate_params(
    {
        "adjacency": ["array-like", "sparse matrix"],
        "n_components": [Interval(Integral, 1, None, closed="left")],
        "eigen_solver": [StrOptions({"arpack", "lobpcg", "amg"}), None],
        "random_state": ["random_state"],
        "eigen_tol": [
            Interval(Real, 0, None, closed="left"),
            StrOptions({"auto"}),
        ],
        "norm_laplacian": ["boolean"],
        "drop_first": ["boolean"],
    },
    prefer_skip_nested_validation=True,
)
def spectral_embedding(
    adjacency,
    *,
    n_components=8,
    eigen_solver=None,
    random_state=None,
    eigen_tol="auto",
    norm_laplacian=True,
    drop_first=True,
):
    """Project the sample on the first eigenvectors of the graph Laplacian.

    The adjacency matrix is used to compute a normalized graph Laplacian
    whose spectrum (especially the eigenvectors associated to the
    smallest eigenvalues) has an interpretation in terms of minimal
    number of cuts necessary to split the graph into comparably sized
    components.

    This embedding can also 'work' even if the ``adjacency`` variable is
    not strictly the adjacency matrix of a graph but more generally an
    affinity or similarity matrix between samples (for instance the
    heat kernel of a euclidean distance matrix or a k-NN matrix).

    However care must be taken to always make the affinity matrix symmetric
    so that the eigenvector decomposition works as expected.

    Note : Laplacian Eigenmaps is the actual algorithm implemented here.

    Read more in the :ref:`User Guide <spectral_embedding>`.
    Parameters
    ----------
    adjacency : {array-like, sparse graph} of shape (n_samples, n_samples)
        The adjacency matrix of the graph to embed.

    n_components : int, default=8
        The dimension of the projection subspace.

    eigen_solver : {'arpack', 'lobpcg', 'amg'}, default=None
        The eigenvalue decomposition strategy to use. AMG requires pyamg
        to be installed. It can be faster on very large, sparse problems,
        but may also lead to instabilities. If None, then ``'arpack'`` is
        used.

    random_state : int, RandomState instance or None, default=None
        A pseudo random number generator used for the initialization
        of the lobpcg eigen vectors decomposition when
        `eigen_solver == 'amg'`, and for the K-Means initialization.
        Use an int to make the results deterministic across calls
        (See :term:`Glossary <random_state>`).

        .. note::
            When using `eigen_solver == 'amg'`,
            it is necessary to also fix the global numpy seed with
            `np.random.seed(int)` to get deterministic results. See
            https://github.com/pyamg/pyamg/issues/139 for further
            information.

    eigen_tol : float, default="auto"
        Stopping criterion for eigendecomposition of the Laplacian matrix.
        If `eigen_tol="auto"` then the passed tolerance will depend on the
        `eigen_solver`:

        - If `eigen_solver="arpack"`, then `eigen_tol=0.0`;
        - If `eigen_solver="lobpcg"` or `eigen_solver="amg"`, then
          `eigen_tol=None` which configures the underlying `lobpcg` solver to
          automatically resolve the value according to their heuristics. See,
          :func:`scipy.sparse.linalg.lobpcg` for details.

        Note that when using `eigen_solver="amg"` values of `tol<1e-5` may lead
        to convergence issues and should be avoided.

        .. versionadded:: 1.2
           Added 'auto' option.

    norm_laplacian : bool, default=True
        If True, then compute symmetric normalized Laplacian.

    drop_first : bool, default=True
        Whether to drop the first eigenvector.
        For spectral embedding, this should be True as the first eigenvector
        should be a constant vector for a connected graph, but for spectral
        clustering, this should be kept as False to retain the first
        eigenvector.

    Returns
    -------
    embedding : ndarray of shape (n_samples, n_components)
        The reduced samples.

    Notes
    -----
    Spectral Embedding (Laplacian Eigenmaps) is most useful when the graph
    has one connected component. If the graph has many components, the first
    few eigenvectors will simply uncover the connected components of the
    graph.

    References
    ----------
    * https://en.wikipedia.org/wiki/LOBPCG

    * :doi:`"Toward the Optimal Preconditioned Eigensolver: Locally Optimal
      Block Preconditioned Conjugate Gradient Method",
      Andrew V. Knyazev
      <10.1137/S1064827500366124>`

    Examples
    --------
    >>> from sklearn.datasets import load_digits
    >>> from sklearn.neighbors import kneighbors_graph
    >>> from sklearn.manifold import spectral_embedding
    >>> X, _ = load_digits(return_X_y=True)
    >>> X = X[:100]
    >>> affinity_matrix = kneighbors_graph(
    ...     X, n_neighbors=int(X.shape[0] / 10), include_self=True
    ... )
    >>> # make the matrix symmetric
    >>> affinity_matrix = 0.5 * (affinity_matrix + affinity_matrix.T)
    >>> embedding = spectral_embedding(affinity_matrix, n_components=2, random_state=42)
    >>> embedding.shape
    (100, 2)
    """
    random_state = check_random_state(random_state)

    return _spectral_embedding(
        adjacency,
        n_components=n_components,
        eigen_solver=eigen_solver,
        random_state=random_state,
        eigen_tol=eigen_tol,
        norm_laplacian=norm_laplacian,
        drop_first=drop_first,
    )


def _spectral_embedding(
    adjacency,
    *,
    n_components=8,
    eigen_solver=None,
    random_state=None,
    eigen_tol="auto",
    norm_laplacian=True,
    drop_first=True,
):
    adjacency = check_symmetric(adjacency)

    if eigen_solver == "amg":
        try:
            from pyamg import smoothed_aggregation_solver
        except ImportError as e:
            raise ValueError(
                "The eigen_solver was set to 'amg', but pyamg is not available."
            ) from e

    if eigen_solver is None:
        eigen_solver = "arpack"

    n_nodes = adjacency.shape[0]
    # Whether to drop the first eigenvector
    if drop_first:
        n_components = n_components + 1

    if not _graph_is_connected(adjacency):
        warnings.warn(
            "Graph is not fully connected, spectral embedding may not work as expected."
        )

    laplacian, dd = csgraph_laplacian(
        adjacency, normed=norm_laplacian, return_diag=True
    )
    if (
        eigen_solver == "arpack"
        or eigen_solver != "lobpcg"
        and (not sparse.issparse(laplacian) or n_nodes < 5 * n_components)
    ):
        # lobpcg used with eigen_solver='amg' has bugs for low number of nodes
        laplacian = _set_diag(laplacian, 1, norm_laplacian)

        # Here we'll use shift-invert mode for fast eigenvalues: because the
        # normalized Laplacian has eigenvalues between 0 and 2, I - L has
        # eigenvalues between -1 and 1.  ARPACK is most efficient at finding
        # eigenvalues of largest magnitude, so we apply shift-invert
        # (sigma=1.0, which='LM') to the opposite of the Laplacian.
        try:
            # We are computing the opposite of the laplacian in place so as
            # to spare a memory allocation of a possibly very large array
            tol = 0 if eigen_tol == "auto" else eigen_tol
            laplacian *= -1
            v0 = _init_arpack_v0(laplacian.shape[0], random_state)
            laplacian = check_array(
                laplacian, accept_sparse="csr", accept_large_sparse=False
            )
            _, diffusion_map = eigsh(
                laplacian, k=n_components, sigma=1.0, which="LM", tol=tol, v0=v0
            )
            embedding = diffusion_map.T[n_components::-1]
            if norm_laplacian:
                # recover u = D^-1/2 x from the eigenvector output x
                embedding = embedding / dd
        except RuntimeError:
            # When submatrices are exactly singular, an LU decomposition
            # in arpack fails.  We fall back to lobpcg
            eigen_solver = "lobpcg"
            # Revert the laplacian to its opposite to have lobpcg work
            laplacian *= -1

    elif eigen_solver == "amg":
        # Use AMG to get a preconditioner and speed up the eigenvalue problem.
        if not sparse.issparse(laplacian):
            warnings.warn("AMG works better for sparse matrices")
        laplacian = check_array(
            laplacian, dtype=[np.float64, np.float32], accept_sparse=True
        )
        laplacian = _set_diag(laplacian, 1, norm_laplacian)

        # The Laplacian matrix is always singular, having at least one zero
        # eigenvalue, corresponding to the trivial (constant) eigenvector.
        # Using a singular matrix for preconditioning may result in random
        # failures in LOBPCG, so shift the Laplacian before computing the
        # preconditioner and shift it back afterwards.
        diag_shift = 1e-5 * sparse.eye(laplacian.shape[0])
        laplacian += diag_shift
        if hasattr(sparse, "csr_array") and isinstance(laplacian, sparse.csr_array):
            # `pyamg` does not work with `csr_array`; convert to `csr_matrix`
            laplacian = sparse.csr_matrix(laplacian)
        ml = smoothed_aggregation_solver(check_array(laplacian, accept_sparse="csr"))
        laplacian -= diag_shift

        M = ml.aspreconditioner()
        # Create initial approximation X to eigenvectors
        X = random_state.standard_normal(size=(laplacian.shape[0], n_components + 1))
        X[:, 0] = dd.ravel()
        X = X.astype(laplacian.dtype)
        tol = None if eigen_tol == "auto" else eigen_tol
        _, diffusion_map = lobpcg(laplacian, X, M=M, tol=tol, largest=False)
        embedding = diffusion_map.T
        if norm_laplacian:
            # recover u = D^-1/2 x from the eigenvector output x
            embedding = embedding / dd
        if embedding.shape[0] == 1:
            raise ValueError

    if eigen_solver == "lobpcg":
        laplacian = check_array(
            laplacian, dtype=[np.float64, np.float32], accept_sparse=True
        )
        if n_nodes < 5 * n_components + 1:
            # lobpcg has problems with small numbers of nodes: it would fall
            # back to eigh internally, so we short-circuit it here
            if sparse.issparse(laplacian):
                laplacian = laplacian.toarray()
            _, diffusion_map = eigh(laplacian, check_finite=False)
            embedding = diffusion_map.T[:n_components]
            if norm_laplacian:
                # recover u = D^-1/2 x from the eigenvector output x
                embedding = embedding / dd
        else:
            laplacian = _set_diag(laplacian, 1, norm_laplacian)
            # Increase the number of eigenvectors requested, as lobpcg
            # doesn't behave well in low dimension, and create an initial
            # approximation X to the eigenvectors
            X = random_state.standard_normal(
                size=(laplacian.shape[0], n_components + 1)
            )
            X[:, 0] = dd.ravel()
            X = X.astype(laplacian.dtype)
            tol = None if eigen_tol == "auto" else eigen_tol
            _, diffusion_map = lobpcg(
                laplacian, X, tol=tol, largest=False, maxiter=2000
            )
            embedding = diffusion_map.T[:n_components]
            if norm_laplacian:
                # recover u = D^-1/2 x from the eigenvector output x
                embedding = embedding / dd
            if embedding.shape[0] == 1:
                raise ValueError

    embedding = _deterministic_vector_sign_flip(embedding)
    if drop_first:
        return embedding[1:n_components].T
    else:
        return embedding[:n_components].T


class SpectralEmbedding(BaseEstimator):
    """Spectral embedding for non-linear dimensionality reduction.

    Forms an affinity matrix given by the specified function and
    applies spectral decomposition to the corresponding graph laplacian.
    The resulting transformation is given by the value of the
    eigenvectors for each data point.

    Note : Laplacian Eigenmaps is the actual algorithm implemented here.

    Read more in the :ref:`User Guide <spectral_embedding>`.

    Parameters
    ----------
    n_components : int, default=2
        The dimension of the projected subspace.

    affinity : {'nearest_neighbors', 'rbf', 'precomputed', \
            'precomputed_nearest_neighbors'} or callable, \
            default='nearest_neighbors'
        How to construct the affinity matrix.
         - 'nearest_neighbors' : construct the affinity matrix by computing a
           graph of nearest neighbors.
         - 'rbf' : construct the affinity matrix by computing a radial basis
           function (RBF) kernel.
         - 'precomputed' : interpret ``X`` as a precomputed affinity matrix.
         - 'precomputed_nearest_neighbors' : interpret ``X`` as a sparse graph
           of precomputed nearest neighbors, and constructs the affinity matrix
           by selecting the ``n_neighbors`` nearest neighbors.
         - callable : use passed in function as affinity
           the function takes in data matrix (n_samples, n_features)
           and return affinity matrix (n_samples, n_samples).

    gamma : float, default=None
        Kernel coefficient for rbf kernel. If None, gamma will be set to
        1/n_features.

    random_state : int, RandomState instance or None, default=None
        A pseudo random number generator used for the initialization
        of the lobpcg eigen vectors decomposition when
        `eigen_solver == 'amg'`, and for the K-Means initialization.
        Use an int to make the results deterministic across calls
        (See :term:`Glossary <random_state>`).

        .. note::
            When using `eigen_solver == 'amg'`,
            it is necessary to also fix the global numpy seed with
            `np.random.seed(int)` to get deterministic results. See
            https://github.com/pyamg/pyamg/issues/139 for further
            information.

    eigen_solver : {'arpack', 'lobpcg', 'amg'}, default=None
        The eigenvalue decomposition strategy to use. AMG requires pyamg
        to be installed. It can be faster on very large, sparse problems.
        If None, then ``'arpack'`` is used.

    eigen_tol : float, default="auto"
        Stopping criterion for eigendecomposition of the Laplacian matrix.
        If `eigen_tol="auto"` then the passed tolerance will depend on the
        `eigen_solver`:

        - If `eigen_solver="arpack"`, then `eigen_tol=0.0`;
        - If `eigen_solver="lobpcg"` or `eigen_solver="amg"`, then
          `eigen_tol=None` which configures the underlying `lobpcg` solver to
          automatically resolve the value according to their heuristics. See,
          :func:`scipy.sparse.linalg.lobpcg` for details.

        Note that when using `eigen_solver="lobpcg"` or `eigen_solver="amg"`
        values of `tol<1e-5` may lead to convergence issues and should be
        avoided.
        .. versionadded:: 1.2

    n_neighbors : int, default=None
        Number of nearest neighbors for nearest_neighbors graph building.
        If None, n_neighbors will be set to max(n_samples/10, 1).

    n_jobs : int, default=None
        The number of parallel jobs to run.
        ``None`` means 1 unless in a :obj:`joblib.parallel_backend` context.
        ``-1`` means using all processors. See :term:`Glossary <n_jobs>`
        for more details.

    Attributes
    ----------
    embedding_ : ndarray of shape (n_samples, n_components)
        Spectral embedding of the training matrix.

    affinity_matrix_ : ndarray of shape (n_samples, n_samples)
        Affinity matrix constructed from samples or precomputed.

    n_features_in_ : int
        Number of features seen during :term:`fit`.

        .. versionadded:: 0.24

    feature_names_in_ : ndarray of shape (`n_features_in_`,)
        Names of features seen during :term:`fit`. Defined only when `X`
        has feature names that are all strings.

        .. versionadded:: 1.0

    n_neighbors_ : int
        Number of nearest neighbors effectively used.

    See Also
    --------
    Isomap : Non-linear dimensionality reduction through Isometric Mapping.

    References
    ----------

    - :doi:`A Tutorial on Spectral Clustering, 2007
      Ulrike von Luxburg
      <10.1007/s11222-007-9033-z>`

    - On Spectral Clustering: Analysis and an algorithm, 2001
      Andrew Y. Ng, Michael I.
      Jordan, Yair Weiss

    - :doi:`Normalized cuts and image segmentation, 2000
      Jianbo Shi, Jitendra Malik
      <10.1109/34.868688>`

    Examples
    --------
    >>> from sklearn.datasets import load_digits
    >>> from sklearn.manifold import SpectralEmbedding
    >>> X, _ = load_digits(return_X_y=True)
    >>> X.shape
    (1797, 64)
    >>> embedding = SpectralEmbedding(n_components=2)
    >>> X_transformed = embedding.fit_transform(X[:100])
    >>> X_transformed.shape
    (100, 2)
    """

    _parameter_constraints: dict = {
        "n_components": [Interval(Integral, 1, None, closed="left")],
        "affinity": [
            StrOptions(
                {
                    "nearest_neighbors",
                    "rbf",
                    "precomputed",
                    "precomputed_nearest_neighbors",
                }
            ),
            callable,
        ],
        "gamma": [Interval(Real, 0, None, closed="left"), None],
        "random_state": ["random_state"],
        "eigen_solver": [StrOptions({"arpack", "lobpcg", "amg"}), None],
        "eigen_tol": [Interval(Real, 0, None, closed="left"), StrOptions({"auto"})],
        "n_neighbors": [Interval(Integral, 1, None, closed="left"), None],
        "n_jobs": [None, Integral],
    }

    def __init__(
        self,
        n_components=2,
        *,
        affinity="nearest_neighbors",
        gamma=None,
        random_state=None,
        eigen_solver=None,
        eigen_tol="auto",
        n_neighbors=None,
        n_jobs=None,
    ):
        self.n_components = n_components
        self.affinity = affinity
        self.gamma = gamma
        self.random_state = random_state
        self.eigen_solver = eigen_solver
        self.eigen_tol = eigen_tol
        self.n_neighbors = n_neighbors
        self.n_jobs = n_jobs

    def __sklearn_tags__(self):
        tags = super().__sklearn_tags__()
        tags.input_tags.sparse = True
        tags.input_tags.pairwise = self.affinity in [
            "precomputed",
            "precomputed_nearest_neighbors",
        ]
        return tags

    def _get_affinity_matrix(self, X, Y=None):
        """Calculate the affinity matrix from data.

        Parameters
        ----------
        X : array-like of shape (n_samples, n_features)
            Training vector, where `n_samples` is the number of samples
            and `n_features` is the number of features.

            If affinity is "precomputed"
            X : array-like of shape (n_samples, n_samples),
            Interpret X as precomputed adjacency graph computed from samples.

        Y : Ignored

        Returns
        -------
        affinity_matrix of shape (n_samples, n_samples)
        """
        if self.affinity == "precomputed":
            self.affinity_matrix_ = X
            return self.affinity_matrix_
        if self.affinity == "precomputed_nearest_neighbors":
            estimator = NearestNeighbors(
                n_neighbors=self.n_neighbors, n_jobs=self.n_jobs, metric="precomputed"
            ).fit(X)
            connectivity = estimator.kneighbors_graph(X=X, mode="connectivity")
            self.affinity_matrix_ = 0.5 * (connectivity + connectivity.T)
            return self.affinity_matrix_
        if self.affinity == "nearest_neighbors":
            if sparse.issparse(X):
                warnings.warn(
                    "Nearest neighbors affinity currently does not support "
                    "sparse input, falling back to rbf affinity"
                )
                self.affinity = "rbf"
            else:
                self.n_neighbors_ = (
                    self.n_neighbors
                    if self.n_neighbors is not None
                    else max(int(X.shape[0] / 10), 1)
                )
                self.affinity_matrix_ = kneighbors_graph(
                    X, self.n_neighbors_, include_self=True, n_jobs=self.n_jobs
                )
                # currently only symmetric affinity_matrix supported
                self.affinity_matrix_ = 0.5 * (
                    self.affinity_matrix_ + self.affinity_matrix_.T
                )
                return self.affinity_matrix_
        if self.affinity == "rbf":
            self.gamma_ = self.gamma if self.gamma is not None else 1.0 / X.shape[1]
            self.affinity_matrix_ = rbf_kernel(X, gamma=self.gamma_)
            return self.affinity_matrix_
        self.affinity_matrix_ = self.affinity(X)
        return self.affinity_matrix_

    @_fit_context(prefer_skip_nested_validation=True)
    def fit(self, X, y=None):
        """Fit the model from data in X.
        Parameters
        ----------
        X : {array-like, sparse matrix} of shape (n_samples, n_features)
            Training vector, where `n_samples` is the number of samples
            and `n_features` is the number of features.

            If affinity is "precomputed"
            X : {array-like, sparse matrix}, shape (n_samples, n_samples),
            Interpret X as precomputed adjacency graph computed from
            samples.

        y : Ignored
            Not used, present for API consistency by convention.

        Returns
        -------
        self : object
            Returns the instance itself.
        """
        X = validate_data(self, X, accept_sparse="csr", ensure_min_samples=2)

        random_state = check_random_state(self.random_state)

        affinity_matrix = self._get_affinity_matrix(X)
        self.embedding_ = _spectral_embedding(
            affinity_matrix,
            n_components=self.n_components,
            eigen_solver=self.eigen_solver,
            eigen_tol=self.eigen_tol,
            random_state=random_state,
        )
        return self

    def fit_transform(self, X, y=None):
        """Fit the model from data in X and transform X.

        Parameters
        ----------
        X : {array-like, sparse matrix} of shape (n_samples, n_features)
            Training vector, where `n_samples` is the number of samples
            and `n_features` is the number of features.

            If affinity is "precomputed"
            X : {array-like, sparse matrix} of shape (n_samples, n_samples),
            Interpret X as precomputed adjacency graph computed from
            samples.

        y : Ignored
            Not used, present for API consistency by convention.

        Returns
        -------
        X_new : array-like of shape (n_samples, n_components)
            Spectral embedding of the training matrix.
        """
        self.fit(X)
        return self.embedding_
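The module's docstrings note that the eigenvectors of the graph Laplacian uncover cut structure: the first non-trivial eigenvector (the Fiedler vector) separates weakly connected groups of nodes. As a quick illustrative sketch (not part of scikit-learn itself; the toy adjacency matrix and edge weights below are made up for the demo), this uses the public `spectral_embedding` function on a graph of two dense communities joined by one weak edge:

```python
import numpy as np
from sklearn.manifold import spectral_embedding

# Two fully connected "communities" of three nodes each (nodes 0-2 and 3-5),
# joined by a single weak edge between nodes 2 and 3 so the graph stays
# connected.  The matrix is symmetric with a zero diagonal.
A = np.zeros((6, 6))
A[:3, :3] = 1.0
A[3:, 3:] = 1.0
A[2, 3] = A[3, 2] = 0.1
np.fill_diagonal(A, 0.0)

# With drop_first=True (the default) the constant eigenvector is dropped, so
# the single returned component is the Fiedler vector.
emb = spectral_embedding(A, n_components=1, random_state=0)
print(emb.shape)  # (6, 1)

# The Fiedler vector takes opposite signs on the two communities: nodes 0-2
# land on one side of zero and nodes 3-5 on the other.
print((emb[:3, 0] > 0).all() != (emb[3:, 0] > 0).all())  # True
```

The weak bridge weight (0.1 here) controls how cleanly the two groups separate; as it approaches the within-community weight, the gap between the second and third Laplacian eigenvalues closes and the split becomes ambiguous.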