
Model llama3-70b-8192

When running the code, I got this message:

BadRequestError: Error code: 400 - {'error': {'message': 'The model llama3-70b-8192 has been decommissioned and is no longer supported. Please refer to https://console.groq.com/docs/deprecations for a recommendation on which model to use instead.', 'type': 'invalid_request_error', 'code': 'model_decommissioned'}}

I found this in the Groq documentation:

August 30, 2025: llama3-70b-8192 and llama3-8b-8192
In line with our commitment to bringing you cutting-edge models, on May 31, 2025, we emailed users to announce the deprecation of llama3-70b-8192 and llama3-8b-8192 in favor of llama-3.3-70b-versatile and llama-3.1-8b-instant respectively. The newer Llama 3.3 70B and Llama 3.1 8B models deliver exceptional performance, enabling your applications to harness state-of-the-art text generation with unparalleled speed on our platform.

Deprecated Model     Shutdown Date    Recommended Replacement Model ID
llama3-70b-8192      08/30/25         llama-3.3-70b-versatile
llama3-8b-8192       08/30/25         llama-3.1-8b-instant
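The replacement table above can also be encoded defensively, so older code that still references a decommissioned model id falls back to the recommended replacement automatically. This is a minimal sketch of my own; the MODEL_REPLACEMENTS dict and resolve_model helper are not part of the Groq SDK, just a convenience built from the deprecation notice:

```python
# Replacement table taken from Groq's deprecation notice (shutdown 08/30/25).
MODEL_REPLACEMENTS = {
    "llama3-70b-8192": "llama-3.3-70b-versatile",
    "llama3-8b-8192": "llama-3.1-8b-instant",
}

def resolve_model(model_id: str) -> str:
    """Return the recommended replacement for a decommissioned model id,
    or the id unchanged if it is not in the deprecation table."""
    return MODEL_REPLACEMENTS.get(model_id, model_id)

print(resolve_model("llama3-70b-8192"))   # llama-3.3-70b-versatile
print(resolve_model("llama-3.1-8b-instant"))  # unchanged
```

You would then pass `resolve_model(model_id)` instead of `model_id` when constructing the LLM client.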

So I replaced the model in the code:

from llama_index.core import Settings
from llama_index.llms.groq import Groq

# Use the recommended replacement for the decommissioned llama3-70b-8192
Settings.llm = Groq(model='llama-3.3-70b-versatile', api_key=key)

It worked!
