This model has a maximum context length of 4097 tokens.

Both the question (the prompt) and the answer (the completion) count against those 4097 tokens, so a longer prompt leaves less room for the response.
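The budget arithmetic can be sketched as follows. This is a minimal illustration, not an API call: the `prompt_tokens` figures are hypothetical, and in practice you would measure them with the model's actual tokenizer (for OpenAI models, the `tiktoken` library).

```python
# The prompt and the completion share one context window:
#   prompt_tokens + completion_tokens <= CONTEXT_LIMIT
CONTEXT_LIMIT = 4097  # maximum context length for this model

def max_completion_tokens(prompt_tokens: int) -> int:
    """Largest completion the model can return for a prompt of this size.

    Returns 0 when the prompt alone already fills (or exceeds) the window.
    """
    return max(CONTEXT_LIMIT - prompt_tokens, 0)

# A 3000-token prompt leaves at most 1097 tokens for the answer.
print(max_completion_tokens(3000))  # 1097
# A 5000-token prompt exceeds the window outright; no completion fits.
print(max_completion_tokens(5000))  # 0
```

In practice you would cap the request's `max_tokens` parameter at this value (or below) to avoid a context-length error.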