Experts weighed the potential benefits and drawbacks of using artificial intelligence in counseling and health care settings. (photo: Getty Images)
When a client told Curtis Taylor he had downloaded an app that billed itself as an AI therapist, the licensed Erie counselor knew he had to check it out for himself.
Taylor said the app “crashed” when he mentioned self-harm and didn’t correct him when he referred to the chatbot as a counselor, prompting him to file an ethics complaint against the company.
“It had a lot of fun pretending to be a counselor… and I can say it’s not one,” Taylor said. “I have been vetted. I completed two master’s-level programs to obtain my Ph.D. With my counseling license, I have completed 3,000 (supervised) hours.”

“I’m licensed. I’m a mandated reporter. And these are all things that artificial intelligence just isn’t and won’t be,” he continued.
Across the country, some people have died by suicide in high-profile cases in which the deceased had used an artificial intelligence chatbot for help with their mental health. In the wake of these deaths, several states have introduced or advanced laws restricting their use, and Pennsylvania could be next.
Pennsylvania House Bill 2100 would set new standards for AI chatbots and prohibit companies from providing mental health services to Pennsylvanians unless they operate under the guidance of a licensed therapist.
Even then, AI could not make individual treatment decisions, interact directly with clients in therapeutic communications, or generate recommendations or treatment plans under the proposal. During a Tuesday hearing before a panel of House Democrats, testimony considered the uses and limitations of artificial intelligence in the mental health space, shortly after another House committee considered the technology’s role in health care.
Taylor said he is not “anti-AI” and added that he uses it as a documentation or spreadsheet tool.
“I personally think it’s appropriate for students trained to be counselors to be able to use AI to role-play as clients. However, I don’t think AI has a place for role-playing as a counselor,” Taylor said.
He was clear that he does not want insurers to view AI chatbots as a way to “triage” clients with mental health issues in order to bypass a counselor, even amid a nationwide shortage of providers.
More to consider
Madeliene Stevens, chair of the government relations committee at the Pennsylvania Counseling Association, shared other concerns about the use of artificial intelligence, including potential breaches of confidentiality and a lack of oversight.
“There are no current parameters on the types of information and data that AI technology can ingest and what it then does with it,” Stevens said.
Molly Cowan, director of professional affairs for the Pennsylvania Psychological Association, added that AI chatbots are “designed to keep people engaged” and using them for as long as possible.
“They don’t challenge false assumptions. They don’t give you new coping skills. They tell you to talk to them,” Cowan said.
She added that potential regulations should still allow the use of technologies such as note-taking tools under human supervision.
However, Cowan cautioned that in settings with multiple types of providers – such as hospitals where doctors and psychologists work together – there may be situations in which one doctor can provide AI-enabled care while another cannot do so under the current proposal.
Other health care settings
Earlier Tuesday, a bipartisan group of lawmakers heard about ongoing applications of artificial intelligence in general health care settings, particularly hospitals.
As of 2024, 71% of hospitals were using predictive AI, compared to 32% of hospitals using generative AI, according to Paige Nong, an assistant professor at the University of Minnesota’s school of public health.
“When it comes to implementing technology in the healthcare system, it’s moving really fast,” Nong said.

Hospitals operating within larger systems or with higher operating margins were more likely to implement AI, while rural or critical access hospitals were less likely to do so. The most common applications were scheduling, monitoring, and predicting adverse events. One of the fastest-growing applications is billing.
Standardizing “model cards” that summarize basic information about an AI tool — which Nong compared to nutrition labels on food — could help rural or critical access hospitals looking to adopt the new technology.
However, she described a divide in oversight depending on whether AI is used for clinical purposes – such as supporting diagnosis or identifying risk factors – or for administrative purposes – such as documentation.
The former is covered by guidelines from the US Food and Drug Administration, which also approves artificial intelligence-enabled devices, while the latter is not.
“It’s not clear who they might actually be looking for… (they) just don’t have that clarity,” Nong said.
Additionally, she noted that few systems were able to effectively evaluate their use of AI, adding that only 57% of hospitals using AI also evaluate these tools.
For Peter Lazes, a clinical and industrial psychologist, the biggest concerns were a lack of input from frontline caregivers during development and too much focus on making money or cutting costs.
“So the tools are then not focused on the concerns of frontline staff or on the effective implementation of those tools,” Lazes said. “It’s not about pain points or problems for patients or staff… it’s about ways to make money.”
He said electronic health records – a “serious waste of time” – are now used more to document billing codes for reimbursement rather than clinical issues. Artificial intelligence tools such as transcriptionists are touted as effective time-saving solutions, similar to electronic health records, but “we see the same scenario with artificial intelligence.”
“If doctors and nurses are fired, the policy of these hospitals will be, ‘By the way, now that you have a few more minutes, you can admit three more patients.’ That doesn’t help the situation,” Lazes said.
Nong noted that while clinician satisfaction seemed to improve with such documentation tools, it did not appear to result in time savings. Lazes added that some doctors report that AI transcription takes longer because they have to correct AI errors.
Lazes proposed using state dollars to encourage co-designed AI, in which development is driven by workers – such as doctors and nurses – rather than by hospital executives looking for ways to cut costs.

