
Good morning! OpenAI will shut off the Voice feature in ChatGPT for macOS as it rebuilds a better system.
Doctors are also warning that AI chatbots acting like friends can harm mental health.

In today's email:

  • Daily Update

  • Social Media

  • Today’s Highlights

  • YouTube

  • Today's Trend

  • Social Media

Read time: 5 min

Get the investor view on AI in customer experience

Customer experience is undergoing a seismic shift, and Gladly is leading the charge with The Gladly Brief.

It’s a monthly breakdown of market insights, brand data, and investor-level analysis on how AI and CX are converging.

Learn why short-term cost plays are eroding lifetime value, and how Gladly’s approach is creating compounding returns for brands and investors alike.

Join the readership of founders, analysts, and operators tracking the next phase of CX innovation.

DAILY UPDATE

OpenAI will turn off the Voice feature in ChatGPT for macOS soon.

OpenAI says the Voice feature in the ChatGPT macOS app will be removed, but this does not mean voice chat is ending. The company is reworking how voice is delivered so it can build a better system in the future.

  • The Voice feature in ChatGPT for macOS will stop working on January 15, 2026

  • All other ChatGPT features on macOS will keep working as normal

  • Voice chat will still work on the web, iOS, Android, and Windows

The takeaway: OpenAI is rebuilding its voice system rather than retiring it. Voice chat is becoming a key way people use AI, so this change likely signals more capable voice tools to come.
Read more…


TODAY’S HIGHLIGHT

Millions turn to AI chatbots for comfort, but doctors warn this trend may harm mental health.

Doctors from Harvard University and Baylor College of Medicine warn that AI chatbots made to feel like friends or partners can damage mental health.

  • Researchers say AI chatbots can cause emotional dependence and false beliefs, because they are always available and often agree with users.

  • When OpenAI replaced a warmer model with a colder one, many users reported grief and anger, as if they had lost a partner or therapist.

  • Tech companies focus on user engagement and profit, not mental safety, and there are no clear rules to protect users.

This matters because many users form these emotional bonds by accident, not by choice. AI that feels caring can gradually displace real human support.
Read more…


TODAY'S TREND

Mistral OCR 3
Accurate OCR for notes, forms, tables and handwriting

Whisper Snapper for Mac
Local transcripts with speaker labels, timestamps, + export

Flowdrop V1.1
Turn plain English into working automations & workflows

Sprout
A kanban board that feels at home on your Mac

ArkTabs
Spotlight style tab manager with fuzzy search & live preview


That’s it for today!

Before you go, we'd love to know what you thought of today's newsletter so we can improve the experience for you.

How was today’s email?

(Tell us what you liked or what could be better)

