Users have long criticized JanitorAI's developers for their lack of communication. And despite promising to do better, JanitorAI has failed to communicate yet again.
JanitorAI developers silently updated the “Chat Memory” feature, which many users mistook for a bug. The update prevented several users from continuing their roleplay with AI characters.
Chat Memory Update, Or Bug?
The “Chat Memory” feature on JanitorAI helps users give the AI additional context clues and information related to their roleplay. It’s handy for long, ongoing roleplays because it improves the AI’s memory and produces better responses, even after exceeding the context size limit.
Also Read: Understanding Tokens And Context Size
However, on July 11, 2025, users began reporting a critical bug with the Chat Memory feature.
When a user updated the chat memory, the system erased the entire chat history from context. The AI could then access only the permanent tokens (character information and scenario) and the updated chat memory data.
Say Goodbye To Context Cache
Usually, the AI’s memory includes the conversation history, and older messages gradually drop from the context cache once it reaches the maximum size.
But due to the bug, users who updated their chat memory found that the AI had completely lost track of the story and conversation. While chat memory helps the AI with additional data, retaining recent messages within context is crucial for the LLM to generate decent responses.
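The normal behavior described above can be sketched as a simple token-budgeted sliding window. This is an illustrative approximation, not JanitorAI's actual implementation: the function names and the rough four-characters-per-token estimate are assumptions.

```python
# Hypothetical sketch of how a roleplay context window is commonly
# assembled: permanent tokens and chat memory always stay in context,
# and the remaining budget is filled with the most recent messages,
# dropping the oldest first. Not JanitorAI's real code.

def estimate_tokens(text: str) -> int:
    # Rough heuristic: about four characters per token.
    return max(1, len(text) // 4)

def build_context(permanent: str, chat_memory: str,
                  messages: list[str], max_tokens: int) -> list[str]:
    """Keep permanent tokens and chat memory, then fill what's left
    of the budget with recent messages, newest first."""
    context = [permanent, chat_memory]
    budget = max_tokens - sum(estimate_tokens(p) for p in context)
    kept: list[str] = []
    for msg in reversed(messages):        # walk from newest to oldest
        cost = estimate_tokens(msg)
        if cost > budget:
            break                         # older messages fall out of context
        kept.append(msg)
        budget -= cost
    return context + list(reversed(kept))  # restore chronological order
```

With this scheme, the story degrades gracefully: the AI forgets the oldest messages first while the conversation it can see stays continuous.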
Many users who read about the issue on Reddit and Discord paused their long roleplay to wait for developers to fix the bug. However, there was no official announcement by JanitorAI, leading to widespread confusion within the community.
It’s Not A Bug, It’s A Feature
Without making any official announcement, the developers pushed a silent update that turned the bug's behavior into an intentional feature.
Also Read: Use DeepSeek On JanitorAI
When a user updated the chat memory, the AI's memory retained only the 10 most recent messages, the permanent tokens, and the chat memory data. This worked well enough with JanitorAI's in-house LLM, which has a smaller context size, but it degraded the experience for users running more capable, larger-context LLMs through APIs and proxies.
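The silently updated behavior amounts to a hard cutoff rather than a token budget. The sketch below is an assumption based on user reports, not JanitorAI's actual code; the names are illustrative.

```python
# Hypothetical sketch of the post-update behavior: when the chat memory
# is edited, only the 10 most recent messages survive alongside the
# permanent tokens and the new memory text, regardless of how much
# context the model could actually hold.

RECENT_KEPT = 10  # cutoff reported by users; assumed constant

def context_after_memory_update(permanent: str, new_memory: str,
                                messages: list[str]) -> list[str]:
    # Everything older than the last RECENT_KEPT messages is dropped
    # outright, even if the model's context window has room for more.
    return [permanent, new_memory] + messages[-RECENT_KEPT:]
```

For a model with a large context window, this throws away history it could otherwise have kept, which is why proxy and API users felt the downgrade most.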
JanitorAI Fails To Communicate With Users
The silent update only added to the confusion: many users still assumed the developers hadn't fixed the bug.
A few vocal members on Discord pushed the developers to respond, and they quickly rolled back the silent update. Shep, the founder and lead developer, said they would work on adding a toggle to erase context when updating chat memory.
The incident once again drew criticism of the developers’ lack of communication. Even JanitorAI’s Discord and Reddit moderators didn’t know about the silent update and were just as confused as the users.
The announcements channel goes severely underused when issues occur. Mods shouldn’t just put “Yeah there is this issue/error” in a random channel. It needs to be in announcements too. It needs to be put on reddit and the website. Not everyone has discord and those who do aren’t going to be pinged in a random channel.
Dawnian, JanitorAI Discord
Despite having dedicated channels on their Discord server for announcements and updates, JanitorAI failed to communicate effectively about an issue that impacted many users.
JanitorAI developers need to step up and improve their communication, not just make promises. They plan to introduce subscriptions and monetize their currently free platform. Poor communication won’t sit well once users start paying for JanitorAI’s service.