from __future__ import annotations

from collections import UserDict
from collections.abc import Sequence
import copy
import datetime
from typing import Any
from typing import overload
import warnings

import optuna
from optuna import distributions
from optuna import logging
from optuna import pruners
from optuna._convert_positional_args import convert_positional_args
from optuna._deprecated import deprecated_func
from optuna.distributions import BaseDistribution
from optuna.distributions import CategoricalChoiceType
from optuna.distributions import CategoricalDistribution
from optuna.distributions import FloatDistribution
from optuna.distributions import IntDistribution
from optuna.trial import FrozenTrial
from optuna.trial._base import _SUGGEST_INT_POSITIONAL_ARGS
from optuna.trial._base import BaseTrial


_logger = logging.get_logger(__name__)

_suggest_deprecated_msg = "Use suggest_float{args} instead."


class Trial(BaseTrial):
    """A trial is a process of evaluating an objective function.

    This object is passed to an objective function and provides interfaces to get parameter
    suggestion, manage the trial's state, and set/get user-defined attributes of the trial.

    Note that the direct use of this constructor is not recommended.
    This object is seamlessly instantiated and passed to the objective function behind
    the :func:`optuna.study.Study.optimize()` method; hence library users do not care about
    instantiation of this object.

    Args:
        study:
            A :class:`~optuna.study.Study` object.
        trial_id:
            A trial ID that is automatically generated.

    """

    def __init__(self, study: "optuna.study.Study", trial_id: int) -> None:
        self.study = study
        self._trial_id = trial_id

        self.storage = self.study._storage
        self._cached_frozen_trial = self.storage.get_trial(self._trial_id)

        study = pruners._filter_study(self.study, self._cached_frozen_trial)
        self.study.sampler.before_trial(study, self._cached_frozen_trial)

        self._relative_params: dict[str, Any] | None = None
        self._fixed_params = self._cached_frozen_trial.system_attrs.get("fixed_params", {})

    @property
    def relative_params(self) -> dict[str, Any]:
        if self._relative_params is None:
            study = pruners._filter_study(self.study, self._cached_frozen_trial)
            self.relative_search_space = self.study.sampler.infer_relative_search_space(
                study, self._cached_frozen_trial
            )
            self._relative_params = self.study.sampler.sample_relative(
                study, self._cached_frozen_trial, self.relative_search_space
            )
        return self._relative_params

    def suggest_float(
        self,
        name: str,
        low: float,
        high: float,
        *,
        step: float | None = None,
        log: bool = False,
    ) -> float:
        """Suggest a value for the floating point parameter.

        Example:
            Suggest a momentum, learning rate and scaling factor of learning rate
            for neural network training.

            .. testcode::

                import numpy as np
                from sklearn.datasets import load_iris
                from sklearn.model_selection import train_test_split
                from sklearn.neural_network import MLPClassifier

                import optuna

                X, y = load_iris(return_X_y=True)
                X_train, X_valid, y_train, y_valid = train_test_split(X, y, random_state=0)


                def objective(trial):
                    momentum = trial.suggest_float("momentum", 0.0, 1.0)
                    learning_rate_init = trial.suggest_float(
                        "learning_rate_init", 1e-5, 1e-3, log=True
                    )
                    power_t = trial.suggest_float("power_t", 0.2, 0.8, step=0.1)
                    clf = MLPClassifier(
                        hidden_layer_sizes=(100, 50),
                        momentum=momentum,
                        learning_rate_init=learning_rate_init,
                        solver="sgd",
                        random_state=0,
                        power_t=power_t,
                    )
                    clf.fit(X_train, y_train)

                    return clf.score(X_valid, y_valid)


                study = optuna.create_study(direction="maximize")
                study.optimize(objective, n_trials=3)

        Args:
            name:
                A parameter name.
            low:
                Lower endpoint of the range of suggested values. ``low`` is included in the
                range. ``low`` must be less than or equal to ``high``. If ``log`` is
                :obj:`True`, ``low`` must be larger than 0.
            high:
                Upper endpoint of the range of suggested values. ``high`` is included in the
                range. ``high`` must be greater than or equal to ``low``.
            step:
                A step of discretization.

                .. note::
                    The ``step`` and ``log`` arguments cannot be used at the same time. To set
                    the ``step`` argument to a float number, set the ``log`` argument to
                    :obj:`False`.
            log:
                A flag to sample the value from the log domain or not.
                If ``log`` is true, the value is sampled from the range in the log domain.
                Otherwise, the value is sampled from the range in the linear domain.

                .. note::
                    The ``step`` and ``log`` arguments cannot be used at the same time. To set
                    the ``log`` argument to :obj:`True`, set the ``step`` argument to
                    :obj:`None`.

        Returns:
            A suggested float value.

        .. seealso::
            :ref:`configurations` tutorial describes more details and flexible usages.
        """
        distribution = FloatDistribution(low, high, log=log, step=step)
        suggested_value = self._suggest(name, distribution)
        self._check_distribution(name, distribution)
        return suggested_value

    @deprecated_func("3.0.0", "6.0.0", text=_suggest_deprecated_msg.format(args=""))
    def suggest_uniform(self, name: str, low: float, high: float) -> float:
        """Suggest a value for the continuous parameter.

        The value is sampled from the range :math:`[\\mathsf{low}, \\mathsf{high})`
        in the linear domain. When :math:`\\mathsf{low} = \\mathsf{high}`, the value of
        :math:`\\mathsf{low}` will be returned.

        Args:
            name:
                A parameter name.
            low:
                Lower endpoint of the range of suggested values. ``low`` is included in the
                range.
            high:
                Upper endpoint of the range of suggested values. ``high`` is included in the
                range.

        Returns:
            A suggested float value.
        """
        return self.suggest_float(name, low, high)

    @deprecated_func(
        "3.0.0", "6.0.0", text=_suggest_deprecated_msg.format(args="(..., log=True)")
    )
    def suggest_loguniform(self, name: str, low: float, high: float) -> float:
        """Suggest a value for the continuous parameter.

        The value is sampled from the range :math:`[\\mathsf{low}, \\mathsf{high})`
        in the log domain. When :math:`\\mathsf{low} = \\mathsf{high}`, the value of
        :math:`\\mathsf{low}` will be returned.

        Args:
            name:
                A parameter name.
            low:
                Lower endpoint of the range of suggested values. ``low`` is included in the
                range.
            high:
                Upper endpoint of the range of suggested values. ``high`` is included in the
                range.

        Returns:
            A suggested float value.
        """
        return self.suggest_float(name, low, high, log=True)

    @deprecated_func(
        "3.0.0", "6.0.0", text=_suggest_deprecated_msg.format(args="(..., step=...)")
    )
    def suggest_discrete_uniform(self, name: str, low: float, high: float, q: float) -> float:
        """Suggest a value for the discrete parameter.

        The value is sampled from the range :math:`[\\mathsf{low}, \\mathsf{high}]`,
        and the step of discretization is :math:`q`.

        Args:
            name:
                A parameter name.
            low:
                Lower endpoint of the range of suggested values. ``low`` is included in the
                range.
            high:
                Upper endpoint of the range of suggested values. ``high`` is included in the
                range.
            q:
                A step of discretization.

        Returns:
            A suggested float value.
        """
        return self.suggest_float(name, low, high, step=q)

    @convert_positional_args(previous_positional_arg_names=_SUGGEST_INT_POSITIONAL_ARGS)
    def suggest_int(
        self, name: str, low: int, high: int, *, step: int = 1, log: bool = False
    ) -> int:
        """Suggest a value for the integer parameter.

        The value is sampled from the integers in :math:`[\\mathsf{low}, \\mathsf{high}]`.

        Example:
            Suggest the number of trees in `RandomForestClassifier `__.

            .. testcode::

                import numpy as np
                from sklearn.datasets import load_iris
                from sklearn.ensemble import RandomForestClassifier
                from sklearn.model_selection import train_test_split

                import optuna

                X, y = load_iris(return_X_y=True)
                X_train, X_valid, y_train, y_valid = train_test_split(X, y)


                def objective(trial):
                    n_estimators = trial.suggest_int("n_estimators", 50, 400)
                    clf = RandomForestClassifier(n_estimators=n_estimators, random_state=0)
                    clf.fit(X_train, y_train)
                    return clf.score(X_valid, y_valid)


                study = optuna.create_study(direction="maximize")
                study.optimize(objective, n_trials=3)

        Args:
            name:
                A parameter name.
            low:
                Lower endpoint of the range of suggested values. ``low`` is included in the
                range. ``low`` must be less than or equal to ``high``. If ``log`` is
                :obj:`True`, ``low`` must be larger than 0.
            high:
                Upper endpoint of the range of suggested values. ``high`` is included in the
                range. ``high`` must be greater than or equal to ``low``.
            step:
                A step of discretization.

                .. note::
                    Note that :math:`\\mathsf{high}` is modified if the range is not divisible
                    by :math:`\\mathsf{step}`. Please check the warning messages to find the
                    changed values.

                .. note::
                    The method returns one of the values in the sequence
                    :math:`\\mathsf{low}, \\mathsf{low} + \\mathsf{step}, \\mathsf{low} +
                    2 * \\mathsf{step}, \\dots, \\mathsf{low} + k * \\mathsf{step} \\le
                    \\mathsf{high}`, where :math:`k` denotes an integer.

                .. note::
                    The ``step != 1`` and ``log`` arguments cannot be used at the same time.
                    To set the ``step`` argument :math:`\\mathsf{step} \\ge 2`, set the
                    ``log`` argument to :obj:`False`.
            log:
                A flag to sample the value from the log domain or not.

                .. note::
                    If ``log`` is true, at first, the range of suggested values is divided into
                    grid points of width 1.
                    The range of suggested values is then converted to a log domain, from which
                    a value is sampled. The uniformly sampled value is re-converted to the
                    original domain and rounded to the nearest grid point that we just split,
                    and the suggested value is determined.
                    For example, if `low = 2` and `high = 8`, then the range of suggested
                    values is `[2, 3, 4, 5, 6, 7, 8]` and lower values tend to be sampled more
                    often than higher values.

                .. note::
                    The ``step != 1`` and ``log`` arguments cannot be used at the same time.
                    To set the ``log`` argument to :obj:`True`, set the ``step`` argument to 1.

        .. seealso::
            :ref:`configurations` tutorial describes more details and flexible usages.
        """
        distribution = IntDistribution(low=low, high=high, log=log, step=step)
        suggested_value = int(self._suggest(name, distribution))
        self._check_distribution(name, distribution)
        return suggested_value

    @overload
    def suggest_categorical(self, name: str, choices: Sequence[None]) -> None:
        ...

    @overload
    def suggest_categorical(self, name: str, choices: Sequence[bool]) -> bool:
        ...

    @overload
    def suggest_categorical(self, name: str, choices: Sequence[int]) -> int:
        ...

    @overload
    def suggest_categorical(self, name: str, choices: Sequence[float]) -> float:
        ...

    @overload
    def suggest_categorical(self, name: str, choices: Sequence[str]) -> str:
        ...

    @overload
    def suggest_categorical(
        self, name: str, choices: Sequence[CategoricalChoiceType]
    ) -> CategoricalChoiceType:
        ...

    def suggest_categorical(
        self, name: str, choices: Sequence[CategoricalChoiceType]
    ) -> CategoricalChoiceType:
        """Suggest a value for the categorical parameter.

        The value is sampled from ``choices``.

        Example:
            Suggest a kernel function of `SVC `__.

            .. testcode::

                import numpy as np
                from sklearn.datasets import load_iris
                from sklearn.model_selection import train_test_split
                from sklearn.svm import SVC

                import optuna

                X, y = load_iris(return_X_y=True)
                X_train, X_valid, y_train, y_valid = train_test_split(X, y)


                def objective(trial):
                    kernel = trial.suggest_categorical("kernel", ["linear", "poly", "rbf"])
                    clf = SVC(kernel=kernel, gamma="scale", random_state=0)
                    clf.fit(X_train, y_train)
                    return clf.score(X_valid, y_valid)


                study = optuna.create_study(direction="maximize")
                study.optimize(objective, n_trials=3)

        Args:
            name:
                A parameter name.
            choices:
                Parameter value candidates.

                .. seealso::
                    :class:`~optuna.distributions.CategoricalDistribution`.

        Returns:
            A suggested value.

        .. seealso::
            :ref:`configurations` tutorial describes more details and flexible usages.
        """
        return self._suggest(name, CategoricalDistribution(choices=choices))

    def report(self, value: float, step: int) -> None:
        """Report an objective function value for a given step.

        The reported values are used by the pruners to determine whether this trial should be
        pruned.

        .. seealso::
            Please refer to :class:`~optuna.pruners.BasePruner`.

        .. note::
            The reported value is converted to ``float`` type by applying the ``float()``
            function internally. Thus, it accepts all float-like types
            (e.g., ``numpy.float32``). If the conversion fails, a ``TypeError`` is raised.

        .. note::
            If this method is called multiple times at the same ``step`` in a trial, only the
            ``value`` reported the first time is stored; values reported at that ``step``
            afterwards are ignored.

        .. note::
            :func:`~optuna.trial.Trial.report` does not support multi-objective optimization.

        Example:
            Report intermediate scores of `SGDClassifier `__ training.

            .. testcode::

                import numpy as np
                from sklearn.datasets import load_iris
                from sklearn.linear_model import SGDClassifier
                from sklearn.model_selection import train_test_split

                import optuna

                X, y = load_iris(return_X_y=True)
                X_train, X_valid, y_train, y_valid = train_test_split(X, y)


                def objective(trial):
                    clf = SGDClassifier(random_state=0)
                    for step in range(100):
                        clf.partial_fit(X_train, y_train, np.unique(y))
                        intermediate_value = clf.score(X_valid, y_valid)
                        trial.report(intermediate_value, step=step)
                        if trial.should_prune():
                            raise optuna.TrialPruned()

                    return clf.score(X_valid, y_valid)


                study = optuna.create_study(direction="maximize")
                study.optimize(objective, n_trials=3)

        Args:
            value:
                A value returned from the objective function.
            step:
                Step of the trial (e.g., Epoch of neural network training). Note that pruners
                assume that ``step`` starts at zero. For example,
                :class:`~optuna.pruners.MedianPruner` simply checks if ``step`` is less than
                ``n_warmup_steps`` as the warmup mechanism. ``step`` must be a positive integer.
        """
        if len(self.study.directions) > 1:
            raise NotImplementedError(
                "Trial.report is not supported for multi-objective optimization."
            )

        try:
            # Float-like values (e.g., numpy.float32) are accepted and converted here.
            value = float(value)
        except (TypeError, ValueError):
            message = (
                f"The `value` argument is of type '{type(value).__name__}' but supposed to "
                "be a float."
            )
            raise TypeError(message) from None

        try:
            step = int(step)
        except (TypeError, ValueError):
            message = (
                f"The `step` argument is of type '{type(step).__name__}' but supposed to "
                "be an int."
            )
            raise TypeError(message) from None

        if step < 0:
            raise ValueError(f"The `step` argument is {step} but cannot be negative.")

        if step in self._cached_frozen_trial.intermediate_values:
            # Only the first value reported at a given step is stored.
            warnings.warn(
                f"The reported value is ignored because this `step` {step} is already reported."
            )
            return

        self.storage.set_trial_intermediate_value(self._trial_id, step, value)
        self._cached_frozen_trial.intermediate_values[step] = value

    def should_prune(self) -> bool:
        """Suggest whether the trial should be pruned or not.

        The suggestion is made by a pruning algorithm associated with the trial and is based on
        previously reported values. The algorithm can be specified when constructing a
        :class:`~optuna.study.Study`.

        .. note::
            If no values have been reported, the algorithm cannot make meaningful suggestions.
            Similarly, if this method is called multiple times with the exact same set of
            reported values, the suggestions will be the same.

        .. seealso::
            Please refer to the example code in :func:`optuna.trial.Trial.report`.

        .. note::
            :func:`~optuna.trial.Trial.should_prune` does not support multi-objective
            optimization.

        Returns:
            A boolean value. If :obj:`True`, the trial should be pruned according to the
            configured pruning algorithm. Otherwise, the trial should continue.
        """
        if len(self.study.directions) > 1:
            raise NotImplementedError(
                "Trial.should_prune is not supported for multi-objective optimization."
            )

        trial = self._get_latest_trial()
        return self.study.pruner.prune(self.study, trial)

    def set_user_attr(self, key: str, value: Any) -> None:
        """Set user attributes to the trial.

        The user attributes in the trial can be accessed via
        :func:`optuna.trial.Trial.user_attrs`.

        .. seealso::
            See the recipe on :ref:`attributes`.

        Example:
            Save fixed hyperparameters of neural network training.

            .. testcode::

                import numpy as np
                from sklearn.datasets import load_iris
                from sklearn.model_selection import train_test_split
                from sklearn.neural_network import MLPClassifier

                import optuna

                X, y = load_iris(return_X_y=True)
                X_train, X_valid, y_train, y_valid = train_test_split(X, y, random_state=0)


                def objective(trial):
                    trial.set_user_attr("BATCHSIZE", 128)
                    momentum = trial.suggest_float("momentum", 0, 1.0)
                    clf = MLPClassifier(
                        hidden_layer_sizes=(100, 50),
                        batch_size=trial.user_attrs["BATCHSIZE"],
                        momentum=momentum,
                        solver="sgd",
                        random_state=0,
                    )
                    clf.fit(X_train, y_train)

                    return clf.score(X_valid, y_valid)


                study = optuna.create_study(direction="maximize")
                study.optimize(objective, n_trials=3)
                assert "BATCHSIZE" in study.best_trial.user_attrs.keys()
                assert study.best_trial.user_attrs["BATCHSIZE"] == 128

        Args:
            key:
                A key string of the attribute.
            value:
                A value of the attribute. The value should be JSON serializable.
        """
        self.storage.set_trial_user_attr(self._trial_id, key, value)
        self._cached_frozen_trial.user_attrs[key] = value

    @deprecated_func("3.1.0", "5.0.0")
    def set_system_attr(self, key: str, value: Any) -> None:
        """Set system attributes to the trial.

        Note that Optuna internally uses this method to save system messages such as failure
        reason of trials. Please use :func:`~optuna.trial.Trial.set_user_attr` to set users'
        attributes.

        Args:
            key:
                A key string of the attribute.
            value:
                A value of the attribute. The value should be JSON serializable.
        """
        self.storage.set_trial_system_attr(self._trial_id, key, value)
        self._cached_frozen_trial.system_attrs[key] = value

    def _suggest(self, name: str, distribution: BaseDistribution) -> Any:
        storage = self.storage
        trial_id = self._trial_id
        trial = self._get_latest_trial()

        if name in trial.distributions:
            # No need to sample if the parameter was already suggested in this trial.
            distributions.check_distribution_compatibility(
                trial.distributions[name], distribution
            )
            param_value = trial.params[name]
        else:
            if self._is_fixed_param(name, distribution):
                param_value = self._fixed_params[name]
            elif distribution.single():
                param_value = distributions._get_single_value(distribution)
            elif self._is_relative_param(name, distribution):
                param_value = self.relative_params[name]
            else:
                study = pruners._filter_study(self.study, trial)
                param_value = self.study.sampler.sample_independent(
                    study, trial, name, distribution
                )

            # `param_value` is validated here (invalid values raise an error).
            param_value_in_internal_repr = distribution.to_internal_repr(param_value)
            storage.set_trial_param(trial_id, name, param_value_in_internal_repr, distribution)

            self._cached_frozen_trial.distributions[name] = distribution
            self._cached_frozen_trial.params[name] = param_value
        return param_value

    def _is_fixed_param(self, name: str, distribution: BaseDistribution) -> bool:
        if name not in self._fixed_params:
            return False

        param_value = self._fixed_params[name]
        param_value_in_internal_repr = distribution.to_internal_repr(param_value)
        contained = distribution._contains(param_value_in_internal_repr)
        if not contained:
            warnings.warn(
                f"Fixed parameter '{name}' with value {param_value} is out of range "
                f"for distribution {distribution}."
            )
        return contained

    def _is_relative_param(self, name: str, distribution: BaseDistribution) -> bool:
        if name not in self.relative_params:
            return False

        if name not in self.relative_search_space:
            raise ValueError(
                f"The parameter '{name}' was sampled by `sample_relative` method but it is "
                "not contained in the relative search space."
            )

        relative_distribution = self.relative_search_space[name]
        distributions.check_distribution_compatibility(relative_distribution, distribution)

        param_value = self.relative_params[name]
        param_value_in_internal_repr = distribution.to_internal_repr(param_value)
        return distribution._contains(param_value_in_internal_repr)

    def _check_distribution(self, name: str, distribution: BaseDistribution) -> None:
        old_distribution = self._cached_frozen_trial.distributions.get(name, distribution)
        if old_distribution != distribution:
            warnings.warn(
                f"Inconsistent parameter values for distribution with name '{name}'. "
                "Optuna uses only the values of the first call and ignores all following "
                f"calls. Using these values: {old_distribution}.",
                RuntimeWarning,
            )

    @property
    def distributions(self) -> dict[str, BaseDistribution]:
        """Return distributions of parameters to be optimized.

        Returns:
            A dictionary containing all distributions.
        """
        return copy.deepcopy(self._cached_frozen_trial.distributions)

    @property
    def user_attrs(self) -> dict[str, Any]:
        """Return user attributes.

        Returns:
            A dictionary containing all user attributes.
        """
        return copy.deepcopy(self._cached_frozen_trial.user_attrs)

    @property
    def system_attrs(self) -> dict[str, Any]:
        """Return system attributes.

        Returns:
            A dictionary containing all system attributes.
        """
        return copy.deepcopy(self.storage.get_trial_system_attrs(self._trial_id))

    @property
    def datetime_start(self) -> datetime.datetime | None:
        """Return start datetime.

        Returns:
            Datetime where the :class:`~optuna.trial.Trial` started.
        """
        return self._cached_frozen_trial.datetime_start

    @property
    def number(self) -> int:
        """Return trial's number which is consecutive and unique in a study.

        Returns:
            A trial number.
        """
        return self._cached_frozen_trial.number

    def _get_latest_trial(self) -> FrozenTrial:
        # Return a copy of the cached frozen trial whose system attributes are fetched lazily
        # from the storage, so that samplers and pruners see up-to-date values.
        latest_trial = copy.copy(self._cached_frozen_trial)
        latest_trial.system_attrs = _LazyTrialSystemAttrs(  # type: ignore[assignment]
            self._trial_id, self.storage
        )
        return latest_trial


class _LazyTrialSystemAttrs(UserDict):
    def __init__(self, trial_id: int, storage: optuna.storages.BaseStorage) -> None:
        super().__init__()
        self._trial_id = trial_id
        self._storage = storage
        self._initialized = False

    def __getattribute__(self, key: str) -> Any:
        if key == "data":
            if not self._initialized:
                # Populate the underlying dict from the storage on first access.
                self._initialized = True
                super().__getattribute__("data").update(
                    self._storage.get_trial_system_attrs(self._trial_id)
                )
        return super().__getattribute__(key)
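

# A minimal, self-contained usage sketch (not part of the Optuna library source) showing how
# the Trial API defined above is consumed from an objective function via the public
# `optuna.create_study` / `Study.optimize` entry points. The parameter names ("x", "bias",
# "kind") and the toy objective are illustrative assumptions, not library defaults.
if __name__ == "__main__":
    import optuna

    def _example_objective(trial: "optuna.trial.Trial") -> float:
        # Each suggest_* call registers a distribution for the trial and returns a concrete
        # value sampled by the study's sampler.
        x = trial.suggest_float("x", -10.0, 10.0)
        bias = trial.suggest_int("bias", 0, 4)
        kind = trial.suggest_categorical("kind", ["quadratic", "abs"])
        value = (x - 2.0) ** 2 + bias if kind == "quadratic" else abs(x - 2.0) + bias

        # Intermediate reports feed the pruner; should_prune() asks it whether to stop early.
        trial.report(value, step=0)
        if trial.should_prune():
            raise optuna.TrialPruned()
        return value

    study = optuna.create_study(direction="minimize")
    study.optimize(_example_objective, n_trials=10)
    print(study.best_params)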