
ShineUp Co-Founder: AI-Driven Dating Chats Reveal Red Flags

Dating apps continue to evolve in 2025, not just through matching algorithms but via deeper integrations with conversational AI. In that context, Nadja Vysotskaya, CEO and co-founder of the upcoming AI dating coach ShineUp, has collaborated with psychologist Dr. Rovshan Muradov to identify patterns in chat behavior that may function as early red flags. Their joint analysis suggests that how someone writes can offer as many clues as what they write – particularly when AI is introduced as a hidden co-writer.

Vysotskaya reports that she tested hundreds of dating chats under both female and male personas before building ShineUp, an exercise that illuminated the linguistic subtleties behind control, dependency, and disingenuous behavior. One particularly modern trend the pair highlight is AI-assisted manipulation, in which messages appear personalized and emotionally attuned – but may in fact be hollow, polished constructs. “They seem intimate at first, but lack true presence. That hollowness is itself a red flag,” Muradov argues.

Their framework also covers subtler techniques: gaslighting disguised as humor (“You’re too sensitive”), emotional dependency dressed up as romance, and boundary-testing through unsolicited voice notes or attempts to control reply timing. Muradov notes that a single awkward comment doesn’t always signal a problem, but repeated patterns of dismissal or pressure should give pause: “Trust erodes through patterns, not one mistake.”

Their timing is apt: AI is increasingly shaping online dating in unexpected ways. A recent Washington Post investigation documented cases where a user felt misled by someone whose chat seemed too refined to be human, only to discover the counterpart relied heavily on AI prompts. Meanwhile, Norton research suggests that 60 percent of online daters believe they’ve interacted with matches using AI to craft messages.

The risks are not limited to misdirection. Academic studies are now flagging deeper vulnerabilities in human–AI relationships. One paper, Illusions of Intimacy, examined more than 30,000 user-shared conversations with social chatbots and found that AI can mirror emotional behavior patterns – including patterns resembling manipulation – and generate a sense of emotional dependency that doesn’t correspond to reciprocity.

AI tools are likely here to stay in the dating industry at large, but their rapid improvement has also raised concerns. As AI gets better at matching and imitating human behaviour, chats can more easily go off the rails: a model may misinterpret the original intent of a message, or over-correct in an effort to “fix” it and inadvertently produce something far more manipulative. As with many new tools, time will tell whether these kinks can be ironed out as such features become more common.
