Python, at least in the current version (as of April 2025), is raising an error when top_p and top_k are set as parameters, as if those parameters were not part of the model configuration (via code).
Traceback (most recent call last):
File "C:\curso_gemini\curso_gemini\Lib\site-packages\proto\marshal\rules\message.py", line 36, in to_proto
return self._descriptor(**value)
~~~~~~~~~~~~~~~~^^^^^^^^^
ValueError: Protocol message GenerationConfig has no "top-k" field.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "c:\curso_gemini\main.py", line 29, in <module>
resposta = llm.generate_content (pergunta)
File "C:\curso_gemini\curso_gemini\Lib\site-packages\google\generativeai\generative_models.py", line 305, in generate_content
request = self._prepare_request(
contents=contents,
...<3 lines>...
tool_config=tool_config,
)
File "C:\curso_gemini\curso_gemini\Lib\site-packages\google\generativeai\generative_models.py", line 165, in _prepare_request
return protos.GenerateContentRequest(
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^
model=self._model_name,
^^^^^^^^^^^^^^^^^^^^^^^
...<6 lines>...
cached_content=self.cached_content,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
)
^
File "C:\curso_gemini\curso_gemini\Lib\site-packages\proto\message.py", line 728, in __init__
pb_value = marshal.to_proto(pb_type, value)
File "C:\curso_gemini\curso_gemini\Lib\site-packages\proto\marshal\marshal.py", line 235, in to_proto
pb_value = self.get_rule(proto_type=proto_type).to_proto(value)
File "C:\curso_gemini\curso_gemini\Lib\site-packages\proto\marshal\rules\message.py", line 46, in to_proto
return self._wrapper(value)._pb
~~~~~~~~~~~~~^^^^^^^
File "C:\curso_gemini\curso_gemini\Lib\site-packages\proto\message.py", line 724, in __init__
raise ValueError(
"Unknown field for {}: {}".format(self.__class__.__name__, key)
)
ValueError: Unknown field for GenerationConfig: top-k
That is the output produced.
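The traceback itself suggests the cause: the field was passed as "top-k" (with a hyphen), but the GenerationConfig message expects snake_case field names, i.e. "top_k" and "top_p". A minimal sketch of a correctly spelled config follows; the model name, API key, and prompt are assumptions, and the actual API call is commented out so the snippet runs without credentials:

```python
# GenerationConfig fields use snake_case: "top_k", not "top-k".
generation_config = {
    "temperature": 0.7,
    "top_p": 0.95,  # NOT "top-p"
    "top_k": 40,    # NOT "top-k" -- a hyphen triggers the ValueError above
}

# With the config built, the call would look like this (requires the
# google-generativeai package and an API key; names are assumptions):
# import google.generativeai as genai
# genai.configure(api_key="SUA_CHAVE")
# llm = genai.GenerativeModel("gemini-1.5-flash",
#                             generation_config=generation_config)
# resposta = llm.generate_content("Qual a capital do Brasil?")
# print(resposta.text)
```

The same keys can also be passed as keyword arguments to `genai.GenerationConfig(...)`; either way, underscores are required.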