# Source skeleton recovered from the compiled module
# site-packages/optuna/_hypervolume/hssp.py (Python 3.12 bytecode).
# The imports, signatures, and docstrings survive in the compiled file; the
# function bodies are not recoverable verbatim and are left as stubs.
from __future__ import annotations

import math

import numpy as np

from optuna._hypervolume.wfg import compute_hypervolume


def _solve_hssp_2d(
    rank_i_loss_vals: np.ndarray,
    rank_i_indices: np.ndarray,
    subset_size: int,
    reference_point: np.ndarray,
) -> np.ndarray:
    # Specialization of the hypervolume subset selection to the two-objective case.
    ...


def _lazy_contribs_update(
    contribs: np.ndarray,
    pareto_loss_values: np.ndarray,
    selected_vecs: np.ndarray,
    reference_point: np.ndarray,
    hv_selected: float,
) -> np.ndarray:
    """Lazy update the hypervolume contributions.

    (1) Lazy update of the hypervolume contributions
        S = selected_indices - {indices[max_index]}, T = selected_indices, and S' is a
        subset of S. As we would like to know argmax H(T v {i}) in the next iteration, we
        can skip HV calculations for j if
        H(T v {i}) - H(T) > H(S' v {j}) - H(S') >= H(T v {j}) - H(T).
        We use submodularity for the inequality above. As the upper bound of contribs[i] is
        H(S' v {j}) - H(S'), we start to update from i with a higher upper bound so that we
        can skip more HV calculations.

    (2) A simple cheap-to-evaluate contribution upper bound
        The HV difference using only the latest selected point and a candidate is a simple,
        yet obvious, contribution upper bound. Denote t as the latest selected index and j
        as an unselected index. Then H(T v {j}) - H(T) <= H({t} v {j}) - H({t}) holds, where
        the inequality comes from submodularity. We use the inclusion-exclusion principle
        to calculate the RHS.
    """
    ...


def _solve_hssp(
    rank_i_loss_vals: np.ndarray,
    rank_i_indices: np.ndarray,
    subset_size: int,
    reference_point: np.ndarray,
) -> np.ndarray:
    """Solve a hypervolume subset selection problem (HSSP) via a greedy algorithm.

    This method is a 1-1/e approximation algorithm to solve HSSP.

    For further information about algorithms to solve HSSP, please refer to the following
    paper:

    - `Greedy Hypervolume Subset Selection in Low Dimensions `__
    """
    # Greedy 1-1/e approximation described in the docstring; a minimal illustrative
    # sketch of the idea is given at the end of this file.
    ...
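

# ---------------------------------------------------------------------------
# Illustrative sketch (NOT part of Optuna's API): a minimal greedy HSSP for two
# minimization objectives, assuming a reference point that upper-bounds every
# point. The helper names `_hypervolume_2d_sketch` and `_greedy_hssp_sketch`
# are made up for this example; Optuna's actual implementation lives in the
# stubs above and additionally uses the lazy contribution updates described in
# the `_lazy_contribs_update` docstring.
# ---------------------------------------------------------------------------
def _hypervolume_2d_sketch(loss_vals: np.ndarray, reference_point: np.ndarray) -> float:
    # Area dominated by 2D minimization points, computed with a left-to-right sweep.
    pts = loss_vals[np.all(loss_vals < reference_point, axis=1)]
    if len(pts) == 0:
        return 0.0
    pts = pts[np.argsort(pts[:, 0])]  # sort by the first objective
    hv, best_y = 0.0, reference_point[1]
    for x, y in pts:
        if y < best_y:  # point is non-dominated among those already swept
            hv += (reference_point[0] - x) * (best_y - y)
            best_y = y
    return hv


def _greedy_hssp_sketch(
    loss_vals: np.ndarray, subset_size: int, reference_point: np.ndarray
) -> np.ndarray:
    # Greedy 1-1/e approximation: repeatedly add the point whose inclusion
    # increases the hypervolume of the selected subset the most.
    selected: list[int] = []
    remaining = list(range(len(loss_vals)))
    for _ in range(subset_size):
        current_hv = _hypervolume_2d_sketch(loss_vals[selected], reference_point)
        gains = [
            _hypervolume_2d_sketch(loss_vals[selected + [j]], reference_point) - current_hv
            for j in remaining
        ]
        selected.append(remaining.pop(int(np.argmax(gains))))
    return np.asarray(selected)


if __name__ == "__main__":
    # Example usage: keep the 3 points (out of 10 random ones) that together
    # dominate the largest area below the reference point (1.1, 1.1).
    rng = np.random.default_rng(0)
    print(_greedy_hssp_sketch(rng.random((10, 2)), 3, np.array([1.1, 1.1])))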