Driven in part by frustration with the
medical system, more and more Americans are seeking advice from artificial intelligence.
Last year, about 1 in 6 adults—and about a quarter of adults under 30—used
chatbots to find health information at least once a month, according to a
survey from KFF, a health policy research group. Those numbers are probably
higher now, said Liz Hamel, who directs survey research at the group.
In dozens of interviews, Americans
described using chatbots to try to compensate for the health system’s
shortcomings. A self-employed woman in Wisconsin routinely asked ChatGPT
whether it was safe to forgo (to not have or do something enjoyable) expensive
appointments. A writer in rural Virginia used ChatGPT to navigate surgical
recovery in the weeks before a doctor could see her. A psychologist in Georgia sought answers after her
providers brushed off (to refuse to listen to what someone says, or to refuse to think about something seriously) concerns about a side effect of her
cancer treatment.
Some know that AI can get things wrong. But
they appreciate that it is available at all hours, charges next to nothing and makes
them feel seen with convincing impressions of empathy.
“All of us now are starting to put so much stock in (if you put stock in something that someone says or does, you have a high opinion of it) this that it’s a little bit worrisome,” said Rick Bisaccia, 70, of Ojai, California. “But it’s very addicting because it presents itself as being so sure of itself.”
Chatbots routinely suggest diagnoses,
interpret lab results and advise on treatment, even offering scripts to help
persuade doctors to follow AI-generated treatment plans. But AI is not well
suited for the kinds of questions it is often asked. Somewhat
counterintuitively, chatbots may excel at solving difficult diagnostic quandaries (a state of not being able to decide what to do about a situation in which you are involved), but they often struggle with basic health decisions, like whether
to stop taking blood thinners before surgery.
Representatives for OpenAI, which makes
ChatGPT, and for Microsoft, which makes Copilot, said the companies take the accuracy of health
information seriously and are working with medical experts to improve
responses.
For all the limitations, it’s not hard to
understand why people are turning to chatbots, said Dr. Robert Wachter, chair
of the department of medicine at the University of California, San Francisco.
Americans sometimes wait months to see a specialist, pay hundreds of dollars
per visit and feel that their concerns are not taken seriously.
“If the system worked, the need for these
tools would be far less,”
Wachter said. “But in many cases, the alternative is either bad or nothing.” (Teddy
Rosenbluth and Maggie Astor)
Empathetic: having the ability to imagine how someone else feels
