Algorithms and classes to support enumerative combinatorics.  Currently
just multiset partitions, but more could be added.

Terminology (following Knuth, algorithm 7.1.2.5M TAOCP)

*multiset* aaabbcccc has a *partition* aaabc | bccc

The submultisets, aaabc and bccc, of the partition are called
*parts*, or sometimes *vectors*.  (Knuth notes that multiset
partitions can be thought of as partitions of vectors of integers,
where the ith element of the vector gives the multiplicity of
element i.)

The values a, b and c are *components* of the multiset.  These
correspond to elements of a set, but in a multiset can be present
with a multiplicity greater than 1.

The algorithm deserves some explanation.

Think of the part aaabc from the multiset above.  If we impose an
ordering on the components of the multiset, we can represent a part
with a vector, in which the value of the first element of the vector
corresponds to the multiplicity of the first component in that
part.  Thus, aaabc can be represented by the vector [3, 1, 1].  We
can also define an ordering on parts, based on the lexicographic
ordering of the vector (leftmost vector element, i.e., the element
with the smallest component number, is the most significant), so
that [3, 1, 1] > [3, 1, 0] and [3, 1, 1] > [2, 1, 4].  The ordering
on parts can be extended to an ordering on partitions: First, sort
the parts in each partition, left-to-right in decreasing order.  Then
partition A is greater than partition B if A's leftmost/greatest
part is greater than B's leftmost part.  If the leftmost parts are
equal, compare the second parts, and so on.

In this ordering, the greatest partition of a given multiset has only
one part.  The least partition is the one in which the components
are spread out, one per part.

The enumeration algorithms in this file yield the partitions of the
argument multiset in decreasing order.  The main data structure is a
stack of parts, corresponding to the current partition.
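The vector representation and part ordering just described can be made concrete with a minimal stdlib sketch (the helper name ``part_vector`` is hypothetical, invented for this illustration, and is not part of this module):

```python
from collections import Counter

def part_vector(part, components):
    # Hypothetical helper: the multiplicity vector of `part` under the
    # given component ordering, e.g. 'aaabc' over 'abc' -> [3, 1, 1].
    counts = Counter(part)
    return [counts[c] for c in components]

v = part_vector('aaabc', 'abc')
print(v)                              # [3, 1, 1]
# Python's list comparison is lexicographic with the leftmost element
# most significant, which matches the part ordering in the text:
print(v > [3, 1, 0], v > [2, 1, 4])   # True True
```

Because Python compares sequences lexicographically, the part ordering on vectors comes for free from ordinary ``>`` comparisons.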
An important invariant is that the parts on the stack are themselves
in decreasing order.  This data structure is decremented to find
the next smaller partition.  Most often, decrementing the partition
will only involve adjustments to the smallest parts at the top of
the stack, much as adjacent integers *usually* differ only in their
last few digits.

Knuth's algorithm uses two main operations on parts:

Decrement - change the part so that it is smaller in the
  (vector) lexicographic order, but reduced by the smallest amount
  possible.  For example, if the multiset has vector [5, 3, 1], and
  the bottom/greatest part is [4, 2, 1], this part would decrement
  to [4, 2, 0], while [4, 0, 0] would decrement to [3, 3, 1].  A
  singleton part is never decremented -- [1, 0, 0] is not
  decremented to [0, 3, 1].  Instead, the decrement operator needs
  to fail for this case.  In Knuth's pseudocode, the decrement
  operator is step m5.

Spread unallocated multiplicity - Once a part has been decremented,
  it cannot be the rightmost part in the partition.  There is some
  multiplicity that has not been allocated, and new parts must be
  created above it in the stack to use up this multiplicity.  To
  maintain the invariant that the parts on the stack are in
  decreasing order, these new parts must be less than or equal to
  the decremented part.  For example, if the multiset is
  [5, 3, 1], and its most significant part has just been decremented
  to [5, 3, 0], the spread operation will add a new part so that the
  stack becomes [[5, 3, 0], [0, 0, 1]].  If the most significant
  part (for the same multiset) has been decremented to [2, 0, 0]
  the stack becomes [[2, 0, 0], [2, 0, 0], [1, 3, 1]].  In the
  pseudocode, the spread operation for one part is step m2.  The
  complete spread operation is a loop of steps m2 and m3.
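The two operations can be modeled by brute force over plain vectors, reproducing the examples in the text.  This is only a sketch of the *specification* of decrement and spread, assuming small inputs and a valid stack state; it is not Knuth's efficient sparse implementation:

```python
from itertools import product

def decrement(m, part):
    # Brute-force model of the decrement step: the lexicographically
    # greatest vector v < part with v <= m componentwise and a nonzero
    # leading component.  Returns None when the part cannot be
    # decremented (the singleton case in the text).
    best = None
    for v in product(*(range(c + 1) for c in m)):
        if v[0] >= 1 and v < tuple(part) and (best is None or v > best):
            best = v
    return list(best) if best else None

def spread(m, stack):
    # Brute-force model of the spread step: repeatedly push the largest
    # part that fits the unallocated multiplicity and does not exceed
    # the part below it in the lexicographic part ordering.
    while True:
        left = [c - sum(p[i] for p in stack) for i, c in enumerate(m)]
        if not any(left):
            return stack
        best = max(v for v in product(*(range(r + 1) for r in left))
                   if v <= tuple(stack[-1]))
        stack.append(list(best))

print(decrement([5, 3, 1], [4, 2, 1]))  # [4, 2, 0]
print(decrement([5, 3, 1], [4, 0, 0]))  # [3, 3, 1]
print(decrement([5, 3, 1], [1, 0, 0]))  # None (singleton part)
print(spread([5, 3, 1], [[5, 3, 0]]))   # [[5, 3, 0], [0, 0, 1]]
print(spread([5, 3, 1], [[2, 0, 0]]))   # [[2, 0, 0], [2, 0, 0], [1, 3, 1]]
```

All five outputs match the worked examples above.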
In order to facilitate the spread operation, Knuth stores, for each
component of each part, not just the multiplicity of that component
in the part, but also the total multiplicity available for this
component in this part or any lesser part above it on the stack.

One added twist is that Knuth does not represent the part vectors as
arrays.  Instead, he uses a sparse representation, in which a
component of a part is represented as a component number (c), plus
the multiplicity of the component in that part (v) as well as the
total multiplicity available for that component (u).  This saves
time that would be spent skipping over zeros.

``class PartComponent``

Internal class used in support of the multiset partitions
enumerators and the associated visitor functions.

Represents one component of one part of the current partition.

A stack of these, plus an auxiliary frame array, f, represents a
partition of the multiset.

Knuth's pseudocode makes c, u, and v separate arrays.  Here they are
the three slots of a ``PartComponent``: ``__init__`` zeroes them,
``__repr__`` renders them as ``'c:%d u:%d v:%d'`` for debug/algorithm
animation purposes, and ``__eq__`` compares all three slots for value
equality.

``multiset_partitions_taocp(multiplicities)``

Examples
========

>>> from sympy.utilities.enumerative import list_visitor
>>> from sympy.utilities.enumerative import multiset_partitions_taocp
>>> # variables components and multiplicities represent the multiset 'abb'
>>> components = 'ab'
>>> multiplicities = [1, 2]
>>> states = multiset_partitions_taocp(multiplicities)
>>> list(list_visitor(state, components) for state in states)
[[['a', 'b', 'b']],
[['a', 'b'], ['b']],
[['a'], ['b', 'b']],
[['a'], ['b'], ['b']]]

See Also
========

sympy.utilities.iterables.multiset_partitions: Takes a multiset as
    input and directly yields multiset partitions.  It dispatches to
    a number of functions, including this one, for implementation.
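The doctest above can be cross-checked without sympy by a brute-force enumeration over part vectors.  The helpers ``brute_partitions`` and ``as_letters`` are hypothetical names invented for this sketch and do not use the module's sparse data structure:

```python
from itertools import product

def brute_partitions(mult):
    # Hypothetical brute-force helper: every partition of the multiset
    # with component multiplicities `mult`, as a tuple of part-vectors
    # in non-increasing lexicographic order.
    def rec(left, bound):
        if not any(left):
            yield ()
            return
        for v in product(*(range(c + 1) for c in left)):
            if any(v) and v <= bound:
                rest = tuple(c - x for c, x in zip(left, v))
                for tail in rec(rest, v):
                    yield (v,) + tail
    return list(rec(tuple(mult), tuple(mult)))

def as_letters(partition, components):
    # Expand each part-vector back into component letters.
    return [[c for c, n in zip(components, part) for _ in range(n)]
            for part in partition]

# Multiset 'abb': components 'ab' with multiplicities [1, 2].
# Sorting in reverse gives the decreasing order used by the enumerators.
for p in sorted(brute_partitions([1, 2]), reverse=True):
    print(as_letters(p, 'ab'))
```

This prints the same four partitions, in the same decreasing order, as the doctest.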
Most users will find it more convenient to use than
multiset_partitions_taocp.

``factoring_visitor(state, primes)``

Use with multiset_partitions_taocp to enumerate the ways a number
can be expressed as a product of factors.  For this usage, the
exponents of the prime factors of a number are arguments to the
partition enumerator, while the corresponding prime factors are
input here.

Examples
========

To enumerate the factorings of a number we can think of the elements of the
partition as being the prime factors and the multiplicities as being their
exponents.

>>> from sympy.utilities.enumerative import factoring_visitor
>>> from sympy.utilities.enumerative import multiset_partitions_taocp
>>> from sympy import factorint
>>> primes, multiplicities = zip(*factorint(24).items())
>>> primes
(2, 3)
>>> multiplicities
(3, 1)
>>> states = multiset_partitions_taocp(multiplicities)
>>> list(factoring_visitor(state, primes) for state in states)
[[24], [8, 3], [12, 2], [4, 6], [4, 2, 3], [6, 2, 2], [2, 2, 2, 3]]

Each part contributes one factor: the product of ``primes[c]**v``
over the part's components with nonzero multiplicity v.

``list_visitor(state, components)``

Return a list of lists to represent the partition.
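The seven factorings of 24 listed in the doctest can be reproduced with a short stdlib recursion.  The helper ``factorings`` is hypothetical, written for this sketch; it enumerates factor multisets directly rather than going through partitions of prime exponents:

```python
def factorings(n, largest=None):
    # Hypothetical helper: all multisets of integer factors >= 2 whose
    # product is n, generated with factors in non-increasing order so
    # that each multiset appears exactly once (brute force, small n).
    if largest is None:
        largest = n
    if n == 1:
        return [[]]
    result = []
    for d in range(min(n, largest), 1, -1):
        if n % d == 0:
            for tail in factorings(n // d, d):
                result.append([d] + tail)
    return result

print(len(factorings(24)))  # 7 factorings, matching the doctest above
```

The seven results are the same multisets as in the doctest (the order within each factoring differs, since this sketch lists factors in non-increasing order).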
Examples
========

>>> from sympy.utilities.enumerative import list_visitor
>>> from sympy.utilities.enumerative import multiset_partitions_taocp
>>> states = multiset_partitions_taocp([1, 2, 1])
>>> s = next(states)
>>> list_visitor(s, 'abc')  # for multiset 'a b b c'
[['a', 'b', 'b', 'c']]
>>> s = next(states)
>>> list_visitor(s, [1, 2, 3])  # for multiset '1 2 2 3'
[[1, 2, 2], [3]]

``class MultisetPartitionTraverser``

Has methods to ``enumerate`` and ``count`` the partitions of a
multiset.

This implements a refactored and extended version of Knuth's algorithm
7.1.2.5M [AOCP]_.

The enumeration methods of this class are generators and return
data structures which can be interpreted by the same visitor
functions used for the output of ``multiset_partitions_taocp``.

Examples
========

>>> from sympy.utilities.enumerative import MultisetPartitionTraverser
>>> m = MultisetPartitionTraverser()
>>> m.count_partitions([4, 4, 4, 2])
127750
>>> m.count_partitions([3, 3, 3])
686

See Also
========

multiset_partitions_taocp
sympy.utilities.iterables.multiset_partitions

References
==========

.. [AOCP] Algorithm 7.1.2.5M in Volume 4A, Combinatorial Algorithms,
       Part 1, of The Art of Computer Programming, by Donald Knuth.

.. [Factorisatio] On a Problem of Oppenheim concerning
       "Factorisatio Numerorum" E. R. Canfield, Paul Erdos, Carl
       Pomerance, JOURNAL OF NUMBER THEORY, Vol. 17, No. 1. August
       1983.  See section 7 for a description of an algorithm
       similar to Knuth's.

.. [Yorgey] Generating Multiset Partitions, Brent Yorgey, The
       Monad.Reader, Issue 8, September 2007.

The traverser's ``__init__`` sets ``debug`` to False, zeroes the
tracing counters ``k1``, ``k2``, ``p1`` and ``discarded``, resets
``pstack``, ``f`` and ``lpart``, sets the dynamic-programming stack
``dp_stack`` to an empty list, and creates the memoization dictionary
``dp_map`` if the instance does not already have one.

``db_trace(msg)``

Useful for understanding/debugging the algorithms.
Not generally activated in end-user code.

``_initialize_enumeration(multiplicities)``

Allocates and initializes the partition stack.

This is called from the enumeration/counting routines, so there is
no need to call it separately.

``decrement_part(part)``

Decrements part (a subrange of pstack), if possible, returning
True iff the part was successfully decremented.

If you think of the v values in the part as a multi-digit integer
(least significant digit on the right) this is basically
decrementing that integer, but with the extra constraint that the
leftmost digit cannot be decremented to 0.

Parameters
==========

part
    The part, represented as a list of PartComponent objects, which
    is to be decremented.

``decrement_part_small(part, ub)``

Decrements part (a subrange of pstack), if possible, returning
True iff the part was successfully decremented.

Parameters
==========

part
    part to be decremented (topmost part on the stack)

ub
    the maximum number of parts allowed in a partition returned by
    the calling traversal.

Notes
=====

The goal of this modification of the ordinary decrement method is
to fail (meaning that the subtree rooted at this part is to be
skipped) when it can be proved that this part can only have child
partitions which are larger than allowed by ``ub``.
If a decision is made to fail, it must be accurate, otherwise the
enumeration will miss some partitions.  But, it is OK not to
capture all the possible failures -- if a part is passed that
should not be, the resulting too-large partitions are filtered by
the enumeration one level up.  However, as is usual in constrained
enumerations, failing early is advantageous.

The tests used by this method catch the most common cases,
although this implementation is by no means the last word on this
problem.  The tests include:

1) ``lpart`` must be less than ``ub`` by at least 2.  This is because
   once a part has been decremented, the partition will gain at
   least one child in the spread step.

2) If the leading component of the part is about to be decremented,
   check for how many parts will be added in order to use up the
   unallocated multiplicity in that leading component, and fail if
   this number is greater than allowed by ``ub``.  (See code for the
   exact expression.)  This test is given in the answer to Knuth's
   problem 7.2.1.5.69.

3) If there is *exactly* enough room to expand the leading component
   by the above test, check the next component (if it exists) once
   decrementing has finished.  If this has ``v == 0``, this next
   component will push the expansion over the limit by 1, so fail.

``decrement_part_large(part, amt, lb)``

Decrements part, while respecting size constraint.

A part can have no children which are of sufficient size (as
indicated by ``lb``) unless that part has sufficient unallocated
multiplicity.  When enforcing the size constraint, this method
will decrement the part (if necessary) by an amount needed to
ensure sufficient unallocated multiplicity.
Returns True iff the part was successfully decremented.

Parameters
==========

part
    part to be decremented (topmost part on the stack)

amt
    Can only take values 0 or 1.  A value of 1 means that the part
    must be decremented, and then the size constraint is enforced.
    A value of 0 means just to enforce the ``lb`` size constraint.

lb
    The partitions produced by the calling enumeration must have
    more parts than this value.

``decrement_part_range(part, lb, ub)``

Decrements part (a subrange of pstack), if possible, returning
True iff the part was successfully decremented.

Parameters
==========

part
    part to be decremented (topmost part on the stack)

ub
    the maximum number of parts allowed in a partition returned by
    the calling traversal.

lb
    The partitions produced by the calling enumeration must have
    more parts than this value.

Notes
=====

Combines the constraints of _small and _large decrement methods.
If returns success, part has been decremented at least once, but
perhaps by quite a bit more if needed to meet the lb constraint.

``spread_part_multiplicity()``

Returns True if a new part has been created, and adjusts pstack,
f and lpart as needed.

Notes
=====

Spreads unallocated multiplicity from the current top part into a
new part created above the current on the stack.  This new part is
constrained to be less than or equal to the old in terms of the
part ordering.
This call does nothing (and returns False) if the current top part
has no unallocated multiplicity.

``top_part()``

Return current top part on the stack, as a slice of pstack.

``enum_all(multiplicities)``

Enumerate the partitions of a multiset.

Examples
========

>>> from sympy.utilities.enumerative import list_visitor
>>> from sympy.utilities.enumerative import MultisetPartitionTraverser
>>> m = MultisetPartitionTraverser()
>>> states = m.enum_all([2, 2])
>>> list(list_visitor(state, 'ab') for state in states)
[[['a', 'a', 'b', 'b']],
[['a', 'a', 'b'], ['b']],
[['a', 'a'], ['b', 'b']],
[['a', 'a'], ['b'], ['b']],
[['a', 'b', 'b'], ['a']],
[['a', 'b'], ['a', 'b']],
[['a', 'b'], ['a'], ['b']],
[['a'], ['a'], ['b', 'b']],
[['a'], ['a'], ['b'], ['b']]]

See Also
========

multiset_partitions_taocp:
    which provides the same result as this method, but is about
    twice as fast.  Hence, enum_all is primarily useful for testing.
    Also see the function for a discussion of states and visitors.

``enum_small(multiplicities, ub)``

Enumerate multiset partitions with no more than ``ub`` parts.

Equivalent to enum_range(multiplicities, 0, ub)

Parameters
==========

multiplicities
    list of multiplicities of the components of the multiset.
ub
    Maximum number of parts

Examples
========

>>> from sympy.utilities.enumerative import list_visitor
>>> from sympy.utilities.enumerative import MultisetPartitionTraverser
>>> m = MultisetPartitionTraverser()
>>> states = m.enum_small([2, 2], 2)
>>> list(list_visitor(state, 'ab') for state in states)
[[['a', 'a', 'b', 'b']],
[['a', 'a', 'b'], ['b']],
[['a', 'a'], ['b', 'b']],
[['a', 'b', 'b'], ['a']],
[['a', 'b'], ['a', 'b']]]

The implementation is based, in part, on the answer given to
exercise 69, in Knuth [AOCP]_.

See Also
========

enum_all, enum_large, enum_range

``enum_large(multiplicities, lb)``

Enumerate the partitions of a multiset with lb < num(parts)

Equivalent to enum_range(multiplicities, lb, sum(multiplicities))

Parameters
==========

multiplicities
    list of multiplicities of the components of the multiset.

lb
    Number of parts in the partition must be greater than this
    lower bound.

Examples
========

>>> from sympy.utilities.enumerative import list_visitor
>>> from sympy.utilities.enumerative import MultisetPartitionTraverser
>>> m = MultisetPartitionTraverser()
>>> states = m.enum_large([2, 2], 2)
>>> list(list_visitor(state, 'ab') for state in states)
[[['a', 'a'], ['b'], ['b']],
[['a', 'b'], ['a'], ['b']],
[['a'], ['a'], ['b', 'b']],
[['a'], ['a'], ['b'], ['b']]]

See Also
========

enum_all, enum_small, enum_range
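The bounded enumerations above can be cross-checked by filtering a brute-force enumeration on the number of parts.  This stdlib-only sketch (the helper ``brute_partitions`` is a hypothetical name, not part of the module) reproduces the partition counts in the enum_all, enum_small and enum_large doctests for the multiset [2, 2]:

```python
from itertools import product

def brute_partitions(mult):
    # All partitions of the multiset with multiplicities `mult`, as
    # tuples of part-vectors in non-increasing order (brute force).
    def rec(left, bound):
        if not any(left):
            yield ()
            return
        for v in product(*(range(c + 1) for c in left)):
            if any(v) and v <= bound:
                rest = tuple(c - x for c, x in zip(left, v))
                for tail in rec(rest, v):
                    yield (v,) + tail
    return list(rec(tuple(mult), tuple(mult)))

parts = brute_partitions([2, 2])
print(len(parts))                                  # 9, as in enum_all([2, 2])
print(len([p for p in parts if len(p) <= 2]))      # 5, as in enum_small([2, 2], 2)
print(len([p for p in parts if len(p) > 2]))       # 4, as in enum_large([2, 2], 2)
print(len([p for p in parts if 1 < len(p) <= 2]))  # 4, as in enum_range([2, 2], 1, 2)
```

The last count corresponds to enum_range, described next, which generalizes the other three traversals.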
``enum_range(multiplicities, lb, ub)``

Enumerate the partitions of a multiset with
``lb < num(parts) <= ub``.

In particular, if partitions with exactly ``k`` parts are desired,
call with ``(multiplicities, k - 1, k)``.  This method generalizes
enum_all, enum_small, and enum_large.

Examples
========

>>> from sympy.utilities.enumerative import list_visitor
>>> from sympy.utilities.enumerative import MultisetPartitionTraverser
>>> m = MultisetPartitionTraverser()
>>> states = m.enum_range([2, 2], 1, 2)
>>> list(list_visitor(state, 'ab') for state in states)
[[['a', 'a', 'b'], ['b']],
[['a', 'a'], ['b', 'b']],
[['a', 'b', 'b'], ['a']],
[['a', 'b'], ['a', 'b']]]

``count_partitions(multiplicities)``

Counts the partitions of a multiset.

Examples
========

>>> from sympy.utilities.enumerative import MultisetPartitionTraverser
>>> m = MultisetPartitionTraverser()
>>> m.count_partitions([9, 8, 2])
288716
>>> m.count_partitions([2, 2])
9
>>> del m

Notes
=====

If one looks at the workings of Knuth's algorithm M [AOCP]_, it can
be viewed as a traversal of a binary tree of parts.  A part has (up
to) two children, the left child resulting from the spread
operation, and the right child from the decrement operation.  The
ordinary enumeration of multiset partitions is an in-order
traversal of this tree, with the partitions corresponding to paths
from the root to the leaves.  The mapping from paths to partitions
is a little complicated, since the partition would contain only
those parts which are leaves or the parents of a spread link, not
those which are parents of a decrement link.

For counting purposes, it is sufficient to count leaves, and this
can be done with a recursive in-order traversal.
The number of leaves of a subtree rooted at a particular part is a
function only of that part itself, so memoizing has the potential to
speed up the counting dramatically.

This method follows a computational approach which is similar to the
hypothetical memoized recursive function, but with two differences:

1) This method is iterative, borrowing its structure from the other
   enumerations and maintaining an explicit stack of parts which are
   in the process of being counted.  (There may be multisets which
   can be counted reasonably quickly by this implementation, but
   which would overflow the default Python recursion limit with a
   recursive implementation.)

2) Instead of using the part data structure directly, a more compact
   key is constructed.  This saves space, but more importantly
   coalesces some parts which would remain separate with physical
   keys.

Unlike the enumeration functions, there is currently no _range
version of count_partitions.  If someone wants to stretch their
brain, it should be possible to construct one by memoizing with a
histogram of counts rather than a single count, and combining the
histograms.

``part_key(part)``

Helper for MultisetPartitionTraverser.count_partitions that creates
a key for ``part``, that only includes information which can affect
the count for that part.  (Any irrelevant information just reduces
the effectiveness of dynamic programming.)

It flattens the relevant fields of the part's components into a
tuple, which is hashable and hence usable as a memoization key.

Notes
=====

This member function is a candidate for future exploration.  There
are likely symmetries that can be exploited to coalesce some
``part_key`` values, and thereby save space and improve performance.
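The hypothetical memoized recursive function described in the Notes can be sketched directly.  This stdlib-only sketch (``count_mp`` is an invented name) memoizes on plain vector tuples rather than the compact ``part_key``, and is recursive rather than iterative, so it models the idea rather than the class's implementation:

```python
from functools import lru_cache
from itertools import product

def count_mp(multiplicities):
    # Memoized recursive counter: count(left, bound) is the number of
    # partitions of the remaining multiset `left` whose greatest part
    # is at most `bound` in the lexicographic part ordering.
    @lru_cache(maxsize=None)
    def count(left, bound):
        if not any(left):
            return 1
        total = 0
        for v in product(*(range(c + 1) for c in left)):
            if any(v) and v <= bound:
                rest = tuple(c - x for c, x in zip(left, v))
                total += count(rest, v)
        return total
    m = tuple(multiplicities)
    return count(m, m)

print(count_mp([2, 2]))     # 9, matching count_partitions([2, 2])
print(count_mp([3, 3, 3]))  # 686, matching count_partitions([3, 3, 3])
```

The cache key ``(left, bound)`` plays the role that ``part_key`` plays in the iterative implementation: distinct stack states that can only produce the same number of leaves share one cache entry.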