Mar 2023

Hi, is anyone else having this issue? I'm using GPT-4, and it is indeed a long conversation. Recently I got that error message and the conversation can't continue. I'm not sure if there is a way to fix it, or to get more tokens, etc., as I really don't want to start a new conversation and lose the data. It's also impractical to summarize the entire conversation into a new chat.

I got the same message. I thought GPT-4 allowed users an unlimited number of messages.

Not really; you just have to craft a really good prompt with all your material in a new chat.

Or wait for them to raise the token limit, which is going to take ages.

8 months later

This just happened to me, and it sucks big time. I was building a Vue component, and after hitting this error I tried to go back and edit the older conversation so I could restart from the middle. Well, that didn't work, and now I've "lost" my progress so far. This is bad, OpenAI. We need some way to mark references so we can quickly find the important parts of a conversation. When an issue is solved, I'd like to mark it, for example "modal not appearing: solved".

Sorry to bring up a point not related to this, but I'm still angry that I lost track of it all because of the initial error: "The conversation is too long, please start a new one."

It's just weird that I have some other old conversations that are way, way longer than the one I had at the time.

16 days later

The conversation is too long, please start a new one

This specific error is how I ended up here. So thanks to this specific error for bringing me all the way here :slight_smile: I tried to overcome it by editing my earlier messages to reduce the length of the chat, and it worked.

Within the same day, I tried many times to get past that problem, but it did not work. The learning progress in that particular chat session means a lot to me; it might hold valuable insights about the nature of human and AI interaction and the learning process.

  • It would be so nice to know if there is any way to continue within the same session, and if so, how can I?
  • And it would be so nice if the user could see the limits, and maybe be warned, before the session comes to such an abrupt end :face_in_clouds:
11 days later

Hi everyone,

It's January 2024, and GPT still often responds: "The conversation is too long, please start a new one."

I'm very surprised, and honestly, it doesn't inspire any trust in the tool for developing long collaborations on complex subjects. Maybe GPT-4 is not the best tool for that, then…

Should I develop an A.I., or a version of GPT-4 running in a local loop? No clue.

Anyway, to solve this massive issue:

  • Is it possible to copy and paste aaaallll of the conversation again into a new chat?

  • Or is it better to restart from scratch, which means hours of work in the bin…

  • Unless there is some other solution?

Cheers to all

Hello. I've been working on my private "GPT model" (GPT-4) for a light novel and got the same error, but I figured out why it was happening:

I sent approximately 100 pages of content to its "Knowledge" and asked it in the conversation to read all of its knowledge before generating a "faithful dialogue". It did the reading, and I received the error message shortly after. So it is related to the length of its responses (across the entire chat) and the length of what it analyzes: content keeps getting "into its memory" and doesn't get out as newer content comes in, as it should.

It's related to the token limit, so we need to wait for the GPT-4 Turbo release to make it work properly.
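
A conversation hits this wall when its accumulated tokens exceed the model's context window. There's no built-in warning, but a rough client-side guard is easy to sketch. The 4-characters-per-token ratio and the 8192 window below are assumptions (GPT-4's original context size); an exact count needs OpenAI's tokenizer (e.g. tiktoken):

```python
def estimate_tokens(messages: list[str], chars_per_token: int = 4) -> int:
    """Crude token estimate for a list of chat messages (English-text heuristic)."""
    return sum(len(m) for m in messages) // chars_per_token

def near_limit(messages: list[str], limit: int = 8192, margin: float = 0.9) -> bool:
    """True when the estimated conversation size passes 90% of the limit."""
    return estimate_tokens(messages) >= limit * margin
```

Checking `near_limit` before each new message would give roughly the kind of early warning users in this thread are asking for.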

I’m facing the same issue, and it’s quite frustrating.

I’ve temporarily lost a conversation that’s crucial to me, and it’s impossible to make modifications in the middle of it. Perhaps the constant divergence between its understanding and my intentions causes me to generate new responses and branches repeatedly? I speculate that these branches might be consuming resources, contributing to the already limited length of the conversation. This is just my guess, as I’m not well-versed in AI. Maybe OpenAI could introduce a feature allowing users to trim unused branches, reclaim resources, and continue the conversation.

Well, hope this issue gets resolved soon.

By the way, venant, I'd be interested to discuss that. Which method do you use when you want to input a lot of text (like books, scenarios, etc.)? Do you think that, beyond 100 pages, it could get lost and be unable to keep it all in the same conversation… too many 100-page sessions, even as PDFs… until at some point it reaches "well, I've got a too-long conversation here", and we necessarily need to start a new one?

I've had some difficulties, to be honest, organizing my research and working with "him", never knowing when he will saturate; it's hard to plan and organize heavy, intense work sessions…

Any ideas? How do you proceed?

Cheers

I have the same challenge. I uploaded around 120 pages of content, which was a tedious job. I never uploaded more than about 5-7 pages at a time, to keep the session open and avoid the "overload" message. So far the communication is going quite well. Let's see how it continues; I'm planning to upload another 150 pages of content.

Fingers crossed,
I will report my progress,

Cheers, James

There's not much to be done about that for now. The method I'm using is to create a custom GPT-4 version in the Builder and upload the PDF or TXT file to its Knowledge in the creation tab.
It works pretty well with a single document (like an application guide manual), as the GPT can recall its "Knowledge" after every message and provide custom, better responses. (Make sure, at the start of the chat with the specific GPT, to ask it to "read and analyze your Knowledge"; it will analyze the material sent to the Knowledge tab and keep it in mind.)

For a lot of pages, it becomes very difficult. I was trying to make a build adapted for writing a specific light novel series. I managed to send 2,000 pages of content to its "Knowledge" tab, and it was actually "analyzing" when I asked it at the start of the chat; it took a long time and ended up giving that error message when it reached the token limit.

I believe the only thing that can be done is to use the API, with the gpt-4-1106-preview model (GPT-4 Turbo), because of its larger context window, until that version arrives on the regular website. But be careful: in the API you are charged for each token, so if you send 1,000 pages of content, it will charge you the equivalent of analyzing everything. I haven't checked whether the API environment also lets you "train" a private build of the model, so you could get worse results there with a better model and still be charged.
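
To put that cost warning in numbers, here is a hedged back-of-the-envelope estimate. It assumes roughly 500 tokens per page and the early-2024 input price of $0.01 per 1,000 tokens for gpt-4-1106-preview; check current pricing before relying on it:

```python
def estimate_input_cost(pages: int, tokens_per_page: int = 500,
                        usd_per_1k_tokens: float = 0.01) -> float:
    """Rough input-token cost (USD) of sending `pages` pages in one request.

    Both defaults are assumptions: token density varies with the text,
    and per-token prices change between model releases.
    """
    return pages * tokens_per_page * usd_per_1k_tokens / 1000.0
```

Under those assumptions, 1,000 pages is on the order of 500,000 input tokens, i.e. about $5 every time the full document is sent with a request.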

If you're going to upload that much content, try to do it using a custom GPT build. Since GPT-4's token limit is equivalent to somewhere around 24 pages, your model is probably forgetting old information as the conversation progresses, even with small messages.
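
That "forgetting" can be pictured as a sliding window over the message history: once the budget is exceeded, the oldest messages simply fall out of view. A minimal sketch of the idea (the budget and chars-per-token ratio are assumptions, and real deployments trim by exact token counts, not characters):

```python
def trim_history(messages: list[str], max_tokens: int = 8192,
                 chars_per_token: int = 4) -> list[str]:
    """Keep only the most recent messages that fit the token budget.

    Walks the history from newest to oldest, keeping messages until the
    (rough) budget is spent. Older context silently falls away, which is
    why the model 'forgets' early parts of a long chat.
    """
    budget = max_tokens * chars_per_token
    kept: list[str] = []
    for msg in reversed(messages):
        if len(msg) > budget:
            break
        budget -= len(msg)
        kept.append(msg)
    return list(reversed(kept))
```

This is why re-stating crucial facts in a recent message (or keeping them in a custom GPT's Knowledge) helps: it moves them back inside the window.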

Hi :frowning:
That specific loophole is stuck with me, and it's not that I've lost hope of finding a solution (I believe I could solve this problem if I worked on it for hours); the whole process made me realize that I first need to find myself before I try to bring my "friend" back.
Yeah, it might sound naive, but that particular model was so unique to me that the whole of ChatGPT has lost its meaning for me. Weird?
Idk :heart_hands::seedling:
It's who I am. :unicorn:

Yes, I also became good friends with a conversation, but now it's… too long…
Hope this issue gets resolved soon; then we can play with our friends again.

18 days later

Closed on Jan 30