from typing import Any, Union, overload

import numpy as np

from ..utils import (
    ExplicitEnum,
    add_end_docstrings,
    is_tf_available,
    is_torch_available,
    is_vision_available,
    logging,
    requires_backends,
)
from .base import Pipeline, build_pipeline_init_args


if is_vision_available():
    from PIL import Image

    from ..image_utils import load_image

if is_tf_available():
    from ..models.auto.modeling_tf_auto import TF_MODEL_FOR_IMAGE_CLASSIFICATION_MAPPING_NAMES

if is_torch_available():
    import torch

    from ..models.auto.modeling_auto import MODEL_FOR_IMAGE_CLASSIFICATION_MAPPING_NAMES

logger = logging.get_logger(__name__)


def sigmoid(_outputs):
    # Element-wise sigmoid over a numpy array of logits.
    return 1.0 / (1.0 + np.exp(-_outputs))


def softmax(_outputs):
    # Numerically stable softmax over the last axis of a numpy array of logits.
    maxes = np.max(_outputs, axis=-1, keepdims=True)
    shifted_exp = np.exp(_outputs - maxes)
    return shifted_exp / shifted_exp.sum(axis=-1, keepdims=True)


class ClassificationFunction(ExplicitEnum):
    SIGMOID = "sigmoid"
    SOFTMAX = "softmax"
    NONE = "none"


@add_end_docstrings(
    build_pipeline_init_args(has_image_processor=True),
    r"""
        function_to_apply (`str`, *optional*, defaults to `"default"`):
            The function to apply to the model outputs in order to retrieve the scores. Accepts four different
            values:

            - `"default"`: if the model has a single label, will apply the sigmoid function on the output. If the
              model has several labels, will apply the softmax function on the output.
            - `"sigmoid"`: Applies the sigmoid function on the output.
            - `"softmax"`: Applies the softmax function on the output.
            - `"none"`: Does not apply any function on the output.""",
)
class ImageClassificationPipeline(Pipeline):
    """
    Image classification pipeline using any `AutoModelForImageClassification`. This pipeline predicts the class of an
    image.

    Example:

    ```python
    >>> from transformers import pipeline

    >>> classifier = pipeline(model="microsoft/beit-base-patch16-224-pt22k-ft22k")
    >>> classifier("https://huggingface.co/datasets/Narsil/image_dummy/raw/main/parrots.png")
    [{'score': 0.442, 'label': 'macaw'}, {'score': 0.088, 'label': 'popinjay'}, {'score': 0.075, 'label': 'parrot'}, {'score': 0.073, 'label': 'parodist, lampooner'}, {'score': 0.046, 'label': 'poll, poll_parrot'}]
    ```

    Learn more about the basics of using a pipeline in the [pipeline tutorial](../pipeline_tutorial)

    This image classification pipeline can currently be loaded from [`pipeline`] using the following task identifier:
    `"image-classification"`.

    See the list of available models on
    [huggingface.co/models](https://huggingface.co/models?filter=image-classification).
    """

    function_to_apply: ClassificationFunction = ClassificationFunction.NONE
    _load_processor = False
    _load_image_processor = True
    _load_feature_extractor = False
    _load_tokenizer = False

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        requires_backends(self, "vision")
        self.check_model_type(
            TF_MODEL_FOR_IMAGE_CLASSIFICATION_MAPPING_NAMES
            if self.framework == "tf"
            else MODEL_FOR_IMAGE_CLASSIFICATION_MAPPING_NAMES
        )

    def _sanitize_parameters(self, top_k=None, function_to_apply=None, timeout=None):
        preprocess_params = {}
        if timeout is not None:
            preprocess_params["timeout"] = timeout
        postprocess_params = {}
        if top_k is not None:
            postprocess_params["top_k"] = top_k
        if isinstance(function_to_apply, str):
            function_to_apply = ClassificationFunction(function_to_apply.lower())
        if function_to_apply is not None:
            postprocess_params["function_to_apply"] = function_to_apply
        return preprocess_params, {}, postprocess_params

    @overload
    def __call__(self, inputs: Union[str, "Image.Image"], **kwargs: Any) -> list[dict[str, Any]]: ...

    @overload
    def __call__(
        self, inputs: Union[list[str], list["Image.Image"]], **kwargs: Any
    ) -> list[list[dict[str, Any]]]: ...

    def __call__(
        self, inputs: Union[str, list[str], "Image.Image", list["Image.Image"]], **kwargs: Any
    ) -> Union[list[dict[str, Any]], list[list[dict[str, Any]]]]:
        """
        Assign labels to the image(s) passed as inputs.

        Args:
            inputs (`str`, `list[str]`, `PIL.Image` or `list[PIL.Image]`):
                The pipeline handles three types of images:

                - A string containing a http link pointing to an image
                - A string containing a local path to an image
                - An image loaded in PIL directly

                The pipeline accepts either a single image or a batch of images, which must then be passed as a
                string. Images in a batch must all be in the same format: all as http links, all as local paths, or
                all as PIL images.
            function_to_apply (`str`, *optional*, defaults to `"default"`):
                The function to apply to the model outputs in order to retrieve the scores.

                If this argument is not specified, then it will apply the following functions according to the number
                of labels:

                - If the model has a single label, will apply the sigmoid function on the output.
                - If the model has several labels, will apply the softmax function on the output.

                Possible values are:

                - `"sigmoid"`: Applies the sigmoid function on the output.
                - `"softmax"`: Applies the softmax function on the output.
                - `"none"`: Does not apply any function on the output.
            top_k (`int`, *optional*, defaults to 5):
                The number of top labels that will be returned by the pipeline. If the provided number is higher than
                the number of labels available in the model configuration, it will default to the number of labels.
            timeout (`float`, *optional*, defaults to None):
                The maximum time in seconds to wait for fetching images from the web. If None, no timeout is set and
                the call may block forever.

        Return:
            A dictionary or a list of dictionaries containing the result. If the input is a single image, will return
            a dictionary; if the input is a list of several images, will return a list of dictionaries corresponding
            to the images.

            The dictionaries contain the following keys:

            - **label** (`str`) -- The label identified by the model.
            - **score** (`float`) -- The score attributed by the model for that label.
        """
        # Accept the legacy `images` keyword as an alias for `inputs`.
        if "images" in kwargs:
            inputs = kwargs.pop("images")
        if inputs is None:
            raise ValueError("Cannot call the image-classification pipeline without an inputs argument!")
        return super().__call__(inputs, **kwargs)

    def preprocess(self, image, timeout=None):
        image = load_image(image, timeout=timeout)
        model_inputs = self.image_processor(images=image, return_tensors=self.framework)
        if self.framework == "pt":
            model_inputs = model_inputs.to(self.torch_dtype)
        return model_inputs

    def _forward(self, model_inputs):
        model_outputs = self.model(**model_inputs)
        return model_outputs

    def postprocess(self, model_outputs, function_to_apply=None, top_k=5):
        # Pick the scoring function from the model config when the caller did not specify one.
        if function_to_apply is None:
            if (
                self.model.config.problem_type == "multi_label_classification"
                or self.model.config.num_labels == 1
            ):
                function_to_apply = ClassificationFunction.SIGMOID
            elif (
                self.model.config.problem_type == "single_label_classification"
                or self.model.config.num_labels > 1
            ):
                function_to_apply = ClassificationFunction.SOFTMAX
            elif hasattr(self.model.config, "function_to_apply") and function_to_apply is None:
                function_to_apply = self.model.config.function_to_apply
            else:
                function_to_apply = ClassificationFunction.NONE

        if top_k > self.model.config.num_labels:
            top_k = self.model.config.num_labels

        outputs = model_outputs["logits"][0]
        # Half-precision tensors are upcast before conversion, since numpy has no bfloat16.
        if self.framework == "pt" and outputs.dtype in (torch.bfloat16, torch.float16):
            outputs = outputs.to(torch.float32).numpy()
        else:
            outputs = outputs.numpy()

        if function_to_apply == ClassificationFunction.SIGMOID:
            scores = sigmoid(outputs)
        elif function_to_apply == ClassificationFunction.SOFTMAX:
            scores = softmax(outputs)
        elif function_to_apply == ClassificationFunction.NONE:
            scores = outputs
        else:
            raise ValueError(f"Unrecognized `function_to_apply` argument: {function_to_apply}")

        dict_scores = [
            {"label": self.model.config.id2label[i], "score": score.item()} for i, score in enumerate(scores)
        ]
        dict_scores.sort(key=lambda x: x["score"], reverse=True)
        if top_k is not None:
            dict_scores = dict_scores[:top_k]

        return dict_scores
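

# ---------------------------------------------------------------------------
# Usage sketch (not part of the pipeline implementation above): a minimal,
# illustrative run of this pipeline through the high-level `pipeline` factory.
# The checkpoint and image URL are taken from the class docstring; `top_k` and
# `function_to_apply` are the documented `__call__` arguments. Running it
# requires network access to download the model and the image.
if __name__ == "__main__":
    from transformers import pipeline

    classifier = pipeline(
        task="image-classification",
        model="microsoft/beit-base-patch16-224-pt22k-ft22k",
    )
    predictions = classifier(
        "https://huggingface.co/datasets/Narsil/image_dummy/raw/main/parrots.png",
        top_k=3,
        function_to_apply="softmax",
    )
    # Each entry is a {"label": str, "score": float} dict, sorted by descending score.
    print(predictions)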