In a Systems Physiology course, biomedical engineering students train a neural network to figure out what a human heart is doing. Using electrocardiogram (ECG) data, they build models that can spot irregular rhythms. Until recently, courses in the Wallace H. Coulter Department of Biomedical Engineering included little coding.
But a new initiative aims to change that. BME faculty, including Dr. Laura Christian and Dr. Todd Fernandez, are leading one of several efforts across the department to integrate data skills, coding, and machine learning into BME's core curriculum.
Instead of adding separate AI courses, faculty are embedding data skills directly into four required classes—grounding them in clinical challenges and engineering design decisions students are already learning to tackle.
“More and more in engineering, [students] use data. They use tools to make sense of data,” said Fernandez, who teaches in the department. “We can teach them to do it in a way that causes them to think with the tool, not transfer the responsibility of thinking to the tool.”
Christian said feedback from the BME Advisory Board spurred the decision to retool the course curriculum so that students are trained early and often to code, manage data sets, and use AI and machine learning: critical skills that help engineers and researchers make informed decisions.
In an ECG exercise, for example, students analyze raw electrocardiogram data to diagnose atrial fibrillation, a condition in which the heart's upper chambers beat irregularly. To do this, they revisit core concepts: what a normal ECG should look like, how electrical signals map onto physical contractions, and what it means when those signals go awry. Working in teams, they match cardiologist-level accuracy, identifying about 75% of cases by eye.
Then, working mathematically rather than by eye, they train a neural network on preprocessed ECG data where human eyes can hardly spot a difference, and the network reaches over 90% accuracy. “We ask the students, ‘Can you tell these signals apart?’ They say no. ‘Could a doctor?’ Also no,” Fernandez said. “But the computer can.”
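The article doesn't include the course's actual code, but a minimal sketch of that kind of classifier, using scikit-learn's MLPClassifier on synthetic stand-in feature vectors rather than real ECG recordings, might look like this:

```python
# Minimal sketch (not the course's actual code): train a small neural network
# to separate two classes of "preprocessed ECG" feature vectors.
# The data below is synthetic stand-in data, not real ECG recordings.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Pretend each row is a feature vector extracted from one ECG segment
# (e.g., RR-interval statistics); class 1 loosely mimics the higher
# variability seen in atrial fibrillation.
n_per_class, n_features = 500, 8
normal = rng.normal(loc=0.0, scale=1.0, size=(n_per_class, n_features))
afib = rng.normal(loc=0.5, scale=1.8, size=(n_per_class, n_features))

X = np.vstack([normal, afib])
y = np.array([0] * n_per_class + [1] * n_per_class)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

# A small fully connected network, similar in spirit to a classroom example.
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)

print(f"Test accuracy: {clf.score(X_test, y_test):.2f}")
```

The point of an exercise like this is less the specific architecture than the workflow: extract features, split the data, fit a model, and check how it generalizes to segments it has never seen.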
Working with a neural network forces students to distinguish what human eyes see from how computers interpret patterns. “It’s easy to look at the parts of an ECG and see what’s right and wrong,” Christian added. “But generalizing that into a tool that can diagnose something? That’s hard.”
In the activity, students learn how to spot irregularities like a missing P wave, but they also begin to think about those abnormalities in mathematical terms—like entropy. “Tying ECG interpretation to entropy, and then to machine learning, helps them realize that this isn’t a separate skill set. It’s part of the same way of thinking,” Fernandez said.
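The article doesn't specify which entropy measure the exercise uses; sample entropy of the RR-interval series is one common way to quantify rhythm irregularity, and a rough illustrative sketch (not the course's assignment code) could look like this:

```python
# Rough illustration (not the course's assignment code): sample entropy of an
# RR-interval series. Irregular rhythms such as atrial fibrillation tend to
# produce higher entropy than a steady sinus rhythm.
import numpy as np

def sample_entropy(series, m=2, r=None):
    """Sample entropy: -log of the ratio of (m+1)-length to m-length matches."""
    x = np.asarray(series, dtype=float)
    if r is None:
        r = 0.2 * x.std()  # a common default tolerance

    def count_matches(length):
        # Build all overlapping templates of the given length.
        templates = np.array([x[i:i + length] for i in range(len(x) - length + 1)])
        # Chebyshev distance between every pair of templates.
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        # Count matches within tolerance r, excluding self-matches.
        return np.sum(dist <= r) - len(templates)

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(0)
steady = 0.80 + 0.02 * rng.standard_normal(200)    # regular RR intervals (seconds)
erratic = 0.80 + 0.20 * rng.standard_normal(200)   # highly variable RR intervals

# Use the same fixed tolerance for both series so the comparison is fair.
print(f"steady rhythm:  {sample_entropy(steady, r=0.05):.2f}")
print(f"erratic rhythm: {sample_entropy(erratic, r=0.05):.2f}")
```

With the same tolerance applied to both series, the erratic rhythm scores markedly higher, which is the intuition behind treating entropy as a feature rather than relying on visual inspection alone.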
The exercise pushes students to reframe what they know—not just seeing the heart as a biological organ, but also as an electrical signal, a physical system, and a mathematical pattern.
Fernandez added, “Students thought they understood the material—but it’s this activity that made it click. One student even said aloud, mid-lab, ‘I just realized the heart is an electric signal.’ These moments of realization are the goal. Not mastery over data, necessarily—but fluency and exposure.”