Quote:
Originally Posted by robertsconley
I haven't gotten any answer to that question other than that each prompt can't be longer than 4,000 tokens. I believe I posted more than that in total, but in separate prompts, and it still remembered. But....
I switched away and reloaded the conversation so that it may be that when you return, it only processes the last 4k tokens.
The token limit IS a bummer, and clearly holds it back a lot. But I still believe this approach to AI is an early evolutionary dead end. It gives some good experimental results, though, and some entertaining responses!
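If robertsconley's guess is right (that on reload it only processes the most recent 4,000 tokens), the behavior would look like this toy sketch. To be clear, this is speculation about the behavior, not how the service actually works, and "tokens" here are just whitespace-split words for illustration:

```python
# Hypothetical rolling context window: only the newest `limit` tokens
# of the conversation survive when it is reloaded. The 4,000 figure
# comes from the post above; real tokenizers split text differently.
MAX_TOKENS = 4000

def visible_context(conversation_tokens, limit=MAX_TOKENS):
    """Keep only the most recent `limit` tokens of the conversation."""
    return conversation_tokens[-limit:]

# A conversation of 10,000 "tokens": everything before token 6000
# would simply be forgotten after a reload.
history = [f"tok{i}" for i in range(10_000)]
window = visible_context(history)
print(len(window))  # 4000
print(window[0])    # tok6000
```

That would explain why earlier prompts seem remembered while you stay in one session, but drop out after switching away and coming back.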