
Is Big Tech Using You as a Guinea Pig and Not Telling You?

A chat app used for emotional support used a popular chatbot to write answers for humans to select. Controversy followed.
February 9th, 2023

David Ingram tells the story of Koko, a San Francisco-based online emotional support chat service. Users “can ask for relationship advice, discuss their depression or find support for nearly anything else — a kind of free, digital shoulder to lean on.” In October 2022, Koko ran an experiment in which an artificial intelligence chatbot wrote part or all of the replies sent to Koko users, but the company did not disclose this to those users, raising serious ethical questions. What safeguards exist to stop this from happening again?

Interestingly, people who saw the responses co-written by GPT-3 rated them significantly higher than the ones written by humans alone. But their opinion quickly changed when they found out the messages were co-created by a machine. “Simulated empathy feels weird, empty,” wrote Koko co-founder Robert Morris.

Read More: NBC News
