People are letting the viral AI assistant formerly known as Clawdbot run their lives, despite the privacy concerns.
Talking to AI bots can lead to unhealthy emotional attachments or even breaks with reality. Some people affected by chatbot interactions, their own or a loved one's, are turning to one another for support.
Clawdbot can automate large parts of your digital life, but researchers caution that demonstrated security flaws mean users should think twice before trusting it with sensitive systems.
Early users praise Moltbot as a more useful and proactive form of AI. Relying on the tool, however, could carry security risks.