'I can see my bones!'
X-ray machines were second-generation diagnostic technology by 1923, but at least some physicians then would have remembered the shock of early 1896.
Wilhelm Conrad Roentgen, a physics professor at the University of Würzburg in Germany, discovered that an electrical current passing through low-pressure gas in a glass tube emitted rays of unknown origin, which he called X-rays. They passed through human flesh, but not bone, casting a distinctive shadow on photographic plates. The world’s first “roentgenogram” was an image of the hand of his wife, Bertha. Her wedding ring was clearly visible — and so were the bones beneath it.
“What was it like to be the first person to take this thing from theory to reality? You know, put your hand between a film and an X-ray generator and then develop this plate and say, ‘Oh my God, I can see my bones,’ ” says Doug Walled, M.D., a radiologist and senior member of IEEE, the Institute of Electrical and Electronics Engineers. “I can’t even imagine how mind-blowing that must have been.”
Roentgen never patented his methods, so by the time he received the first Nobel Prize in Physics in 1901, X-ray technology, and public fascination with it, had spread around the world.
The diagnostic machinery improved, images got better and, once scientists began to understand potential dangers of radiation exposure, physicians, their staff and patients were better shielded from X-rays. But by the 1960s, a doctor still could complain to a research engineer “that X-ray images of the brain were too grainy and only two-dimensional,” according to IEEE.
That research engineer, Godfrey Hounsfield, pondered the problem of 3D imagery at his job at Electric and Musical Industries, in Hayes, England.
Music fans knew the company as EMI, the corporate parent of the record label that produced the Beatles. The British Invasion of the 1960s left EMI flush with cash to fund research, so Hounsfield prototyped a rotating X-ray machine with a sensor to measure X-ray absorption and a computer that could combine multiple scans, or “slices,” into 3D images.
The Fab Four had disbanded by October 1971, when a woman in her 40s with a suspected brain tumor became the first patient to go under the EMI scanner. “There was a beautiful picture of a circular cyst right in the middle of the frontal lobe,” Hounsfield recalled in a perspective article published by the Nobel Foundation, “and, of course, it excited everyone in the hospital who knew about this project.”
The EMI scanner would become better known as the computed tomography, or CT, scanner. In his 1979 Nobel Prize lecture, Hounsfield compared it with another form of medical imaging that was moving from concept to laboratory to hospital at the same time.
In 1971, chemistry professor Paul C. Lauterbur at the State University of New York at Stony Brook wrote down his ideas about magnetic fields, radiofrequency currents and sensors that could detect energy at the molecular level. Once the components were assembled into a scanner that could accommodate a human, the results were images unlike anything from X-rays or CT.
“Suddenly one could distinguish, for instance, gray and white matter in the brain and even see tissues completely surrounded by bone, which were not visible with X-ray examinations,” says Dr. Peter A. Rinck, professor and author of a European textbook on medical imaging, who studied with Lauterbur.
In 2003, Lauterbur and Sir Peter Mansfield, whose calculations were used in creating MRI scanners, shared the Nobel Prize in Physiology or Medicine for their work. Within Lauterbur’s lifetime, people who learned of his achievements told him how MRI had kept them playing the sports they loved, spared them a needless operation or saved their lives by detecting cancerous tumors and other maladies, says his daughter Elise Lauterbur, Ph.D.
“That’s one of the things he talked about a lot,” she says. “He didn’t see why we should have to cut somebody open to figure out what’s gone wrong inside.”
The 1970s brought another breakthrough, this one in seeing with sound. Researchers had begun using ultrasound for medical imaging in the 1940s, and by the mid-1960s machines could produce several images per second, mainly for gynecology and liver examinations. Improved transducer technology in the 1970s was the needed breakthrough, with digitization following in the 1980s, says Ingo Zenger, author and technology expert at the Historical Institute of the Siemens Healthineers medical technology company.
Ultrasound allows physicians to see tissue in real time. Now the units are portable enough to take to points of care, such as athletic fields, to assess injuries immediately, says Hollis G. Potter, M.D., chair of the Department of Radiology and Imaging at the Hospital for Special Surgery in New York City.
She and Walled talked about the transition from imaging to interventional radiology: the ability to operate while using scanning technology to view what is happening inside the body in real time. The technology is already getting ahead of illness, a far cry from the era when physicians could not see beneath the skin.
“You think about 100 years ago, doctors would see patients at the end stages of their disease, when their bowel is obstructed, when they couldn’t get up out of a chair because their hip arthritis was so bad, or they presented with an enormous soft tissue mass that was palpable,” Potter says. “Now we have the ability to actually act upon those imaging findings and intervene in the course of the disease, rather than waiting until its end stage.”