How Medical Faculty and Students Are Using AI Today

A group of School of Medicine professors and students took the pulse of their community to draft a roadmap for integrating the rapidly changing technology into the school's curriculum.

Two-thirds of U.S. doctors see the benefits of using artificial intelligence (AI) in their work, according to a survey conducted last year by the American Medical Association. But only about one-third said they were already using it at the time. Some of that gulf comes down to training. Medical schools, like many industries, are grappling with the opportunities and challenges AI presents as they work out how to teach future doctors to use the technology.

Tufts University School of Medicine adopted an AI-focused learning objective in the fall of 2024. This spring, a group of faculty members and students at the School of Medicine, led by Maria Blanco, professor of psychiatry and associate dean for faculty development, published a paper exploring use of and attitudes toward the technology among the school’s faculty and students, based on a needs-assessment survey the team distributed.

Blanco and her coauthors knew that students had begun experimenting with AI on their own and wanted to understand how they were using the tool and how that could be improved. The result helped the team outline a roadmap for better integrating AI into the school’s curriculum. 

“This technology has the potential to change the way medicine is practiced, and even to change how students learn, so we cannot disregard it,” she says. 

To conduct the survey, Blanco and her coauthors reached out to all medical students and a select group of faculty. They heard back from 128 faculty members and 138 medical students, with student respondents evenly distributed across academic years. They found that many respondents were already using AI, but few felt particularly comfortable or familiar with the technology. Only 12% of students and 9% of faculty said they were proficient with the tool.

“AI technologies simulate knowledge generation and clinical reasoning with a human-like fluency — this can be deceiving. We have to train everyone to know how to use it, because it has assets, but it also has significant pitfalls and risks.”

- Maria Blanco, professor of psychiatry and associate dean for faculty development at Tufts University School of Medicine

Use Cases and Ethics

Faculty at the medical school currently use artificial intelligence most often in their clinical practice, the survey showed, such as for taking notes or helping analyze patient charts. They also use the technology in their research and to enhance their classroom teaching. AI can help with curriculum development tasks such as writing test questions, as long as a faculty member checks the accuracy of the outputs.

Students appear to use the technology most often to deepen their understanding of course material, through interactive learning or content summaries.

Encouragingly, both the surveyed faculty and students showed concern about the ethical implications of integrating artificial intelligence into their work and requested training on how to use AI ethically. Last spring, Blanco said, the School of Medicine’s AI & Biotech Medicine Club sponsored its own forum and invited faculty members to discuss the ethics of AI.

Blanco hopes that awareness of ethical considerations will extend to a broader recognition of the limitations of artificial intelligence. The technology’s powerful capabilities, she says, can make it easy to believe in its infallibility, but outputs can still include biases, inaccuracies, and hallucinations. In integrating AI into the School of Medicine curriculum, she wants to ensure that students learn to treat AI-generated content and analysis with appropriate skepticism and appraisal, training them in the critical thinking skills they need to discern fact from fiction.

“These technologies simulate knowledge generation and clinical reasoning with a human-like fluency — this can be deceiving,” she says. “We have to make sure that we train everyone to know how to use it, because it has assets, but it also has significant pitfalls and risks.”

Students and faculty, she believes, must learn to use AI as a tool to support and improve their work, rather than relying on it as the backbone of clinical and professional practice. Blanco is not alone there: surveyed faculty expressed concerns about the potential for AI to erode the human connection inherent to the medical field.

“At the end of the day, we have to make sure this is what is going to help us refocus on our human-centered care,” says Blanco. “My hope is that this will also help us by pushing us to focus on critical thinking, making sure that we refine those skills.”

Experimentation and Training

Both students and faculty who responded to the survey noted that they lacked the time to experiment with AI to better understand how to use it. Two-thirds of faculty and students suggested that training and hands-on workshops could improve competency.

That work has already begun. Based in part on the survey results, the School of Medicine now hosts faculty and student seminars on the technology.

The school also updated its policy and guidelines, originally issued in July 2024, regarding the use of AI by students and faculty, and launched several pilot projects. These include librarians teaching about using AI in the Problem-Based Learning course, a chatbot designed to help students practice interviewing skills in the Medical Interviewing and the Doctor-Patient Relationship course, an app to support student learning in the Neuroscience course, and the integration of AI into case studies in the Introduction to Clinical Reasoning course, among others.

More broadly, Tufts has provided guidelines on how to discuss AI in course syllabi, along with resources for faculty and staff on appropriate uses of the technology.

“Medical schools and medical organizations are embracing AI to help physicians and trainees make the most of these resources,” says Blanco. “The key is that we are all learning from each other: students, faculty, and of course, our technology department.”
