This Model's Maximum Context Length Is…

This error message is not generated by AI Engine; it comes directly from the OpenAI API. Currently, no check is performed on our end at this stage of the request.

If this worked with versions prior to 1.9.94, the difference may be that newer versions removed the internal check on Max Tokens (which previously capped it at approximately 4096 tokens).

You can refer to the recommended values for each context size displayed in the Chatbot section of the AI Engine settings. Switching to a model that offers a larger context window may also help.

If changing the model does not change the query parameters, an optimization or cache-related plugin may be retaining the old parameters. Please disable such plugins and try again.


The error can be attributed to two factors: messages and completion. The messages factor is the number of tokens used by the conversation context (every message sent so far), while the completion factor is the Max Tokens value, i.e., how many tokens the model is allowed to generate. The sum of both must fit within the model's context length, so make sure both values are reasonable.
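To make this concrete, here is a minimal sketch of the check involved. The model names and context limits are illustrative, and the ~4 characters per token estimate is only a rough heuristic (a real tokenizer such as tiktoken would give exact counts):

```python
# Illustrative context limits; real values depend on the model you use.
CONTEXT_LIMITS = {
    "gpt-3.5-turbo": 4096,
    "gpt-4": 8192,
}

def estimate_tokens(text: str) -> int:
    """Very rough estimate: ~4 characters per token for English text."""
    return max(1, len(text) // 4)

def fits_in_context(messages: list[str], max_tokens: int, model: str) -> bool:
    """True if the conversation (messages) plus the completion budget
    (max_tokens) fit within the model's context length."""
    prompt_tokens = sum(estimate_tokens(m) for m in messages)
    return prompt_tokens + max_tokens <= CONTEXT_LIMITS[model]

# A short conversation with a modest Max Tokens fits...
print(fits_in_context(["Hello!", "Hi, how can I help?"], 512, "gpt-4"))  # True
# ...but an oversized Max Tokens alone can exceed the context length.
print(fits_in_context(["Hello!"], 9000, "gpt-4"))  # False
```

If the sum exceeds the limit, either trim the conversation context or lower Max Tokens.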

ℹ️ If you are interested in learning which settings to tweak to avoid this problem, you can read this documentation.