Primetime Living | A Special Advertising Section of Baltimore Sun Media Group | Sunday, February 22, 2026
TECHNOLOGY
Are Chatbots for You or Against You?
While they sound fun, danger lurks, too
By Margit B. Weisgal, Contributing Writer
The first article appeared more than a year ago, on January 15, 2025, in the New York Times. Titled “She Is in Love with ChatGPT,” the opening paragraphs read like this:
“A 28-year-old woman with a busy social life spends hours on end talking to her A.I. boyfriend for advice and consolation. And yes, they do have sex.

“While scrolling on Instagram, she stumbled upon a video of a woman asking ChatGPT to play the role of a neglectful boyfriend.

“‘Sure, kitten, I can play that game,’ a coy humanlike baritone responded.

“Ayrin watched the woman’s other videos, including one with instructions on how to customize the artificially intelligent chatbot to be flirtatious.”
The next one appeared in the Baltimore Sun on October 19, 2025. This one was titled “You’re the Only One I Can Talk To.” Here’s how that article started:
“Juliana Peralta, 13, kept slipping further from reality – from her parents, her friends, and anything she could actually touch – and into a virtual world where an artificial intelligence chatbot enveloped her with what she mistook for empathy.

“‘You’re the only one I can truly talk to,’ the Colorado girl messaged the chatbot in an app called Character.AI, according to a federal lawsuit filed in September by her family.
“Sometime in October 2023, she confided to the bot that she was ‘going to write my god damn suicide letter in red ink.’ In November, Juliana – young enough to have gone trick-or-treating with her friends – took her life after writing a note (‘I felt/feel so meaningless’) in red ink, with her underlined name at the top and a tiny heart beside it, according to the suit.
“With the nation facing acute mental health provider shortages, Americans are increasingly turning to artificial intelligence chatbots not only for innocuous tasks such as writing resumes or social media posts, but for companionship and therapy.

“Communicating with chatbots may hold promise for counseling, but it is mostly unregulated and – based on tragic cases like Juliana’s – potentially hazardous.”
Then a friend sent me an article from the Washington Post, published on December 23, 2025. It was titled “Her daughter was unraveling, and she didn’t know why. Then she found the AI chat logs.” This one began much like the others, with one caveat: The Washington Post identified the mother and daughter only by their middle initials because of the sensitive nature of their account, and because R is a minor.
“The changes were subtle at first, beginning in the summer after her fifth-grade graduation. She had always been an athletic and artistic girl, gregarious with her friends and close to her family, but now she was spending more and more time shut away in her room. She seemed unusually quiet and withdrawn. She didn’t want to play outside or go to the pool.

“The girl, R, was rarely without the iPhone that she’d received for her 11th birthday, and her mother, H, had grown suspicious of the device. It felt to H as though her child was fading somehow, receding from her own life, and H wanted to understand why.”
The next article is one I found in The Independent while looking for more information, trying to understand why so many children (and many adults) were turning to ChatGPT (from OpenAI), Character.AI (also known as C.AI), or other AI bots to converse. Titled “Oversharing with AI. How your ChatGPT conversations could be used against you,” it began like this:
“In the early hours of 28 August, a quiet car park on a college campus in Missouri became the scene of a violent vandalism rampage. In the space of 45 minutes, 17 cars were left with shattered windows, broken mirrors, ripped-off wipers and dented chassis, causing tens of thousands of dollars’ worth of damage.

“After a month-long investigation, police had gathered evidence that included shoe prints, witness statements and security camera footage. But it was an alleged confession to ChatGPT that eventually led to charges against 19-year-old college student Ryan Schaefer.
AI Chatbots, continued on page 24