For millennia, advances in the treatment of diseases or disorders have been the result of accidental discovery (for example, penicillin)1, pattern matching (for example, cauterization of bleeding wounds)2 or blind experimentation (Ehrlich’s testing of specific binding)3. Recent knowledge and technical developments have allowed for a change in methodology: advances in medicine are now more intentional, the result of design as much as discovery. This sets the stage for an increase in engineering involvement in the treatment of disease and in the translation of new therapeutic approaches. In addition, recent economic forces are reducing the research time available to clinicians while simultaneously reducing medical-center-based basic research staff. While those changes can be seen as potentially retarding the advancement of medical knowledge, they provide an opportunity for solution-based engineers to partner with knowledge-based scientists to advance the treatment of disease. Finally, attempts to manage health care costs have led to reimbursement being tied not to what treatment was provided but to what improvement in health was accomplished. This will drive the assessment of new, presumably more efficient, methods of patient treatment.
In his book Product Esthetics: An Interpretation for Designers, Lewalski encapsulates solution-based engineering with “Engineers use science to solve their problems if the science is available. But available or not, the problem must be solved and whatever form the solution takes under those conditions is called engineering.” 4
However, if engineers are to be involved in medical applications, additional burdens fall on them. Involvement in therapeutic devices, methodologies and systems is mission-critical engineering. In mission-critical engineering the performance of a device or system is judged not on its average performance but on its worst-case performance. In addition, therapeutic devices have to be robust in the face of inadequate or low signal-to-noise data.
In order for the process to have true value in patient treatment, the engineer has to push beyond the traditional academic boundaries. The process or device has to be translated from the laboratory to the hospital where it can be used on patients. This often requires laborious Institutional Review Board (IRB) protocol development and review. The engineer also has to be cleared to have a presence in the operating room to watch his or her design being used.
In addition, the difference in styles of learning adds a barrier to engineer/physician interaction. The nature of medical education is experiential; engineers learn by doing, and early failure is built into the very nature of the engineering process. Therefore, when an engineer proposes a new technique, it is difficult for clinicians such as surgeons to break away from what they know works with some degree of success toward a new, only possibly more successful, paradigm. The engineer has to watch and later debrief the surgeon to optimize the design of the new system or device. But even when a system works with a particular clinician in a particular setting, the work is not done. To truly maximize the value of a new step forward, the new approach has to be distributed to a number of settings. This both maximizes the value of the new design and identifies remaining challenges or flaws in it. Such distribution requires working with a commercial entity, either an existing one or one of the engineer’s making. Examples are described below.
The connection between imaging and surgery was almost instantaneous. Roentgen published his discovery of X-rays on December 28, 1895. Approximately two weeks later, on January 13, 1896, casualty surgeon J. H. Clayton used an X-ray to guide the removal of a needle from a woman’s hand5. Twelve years later, Victor Horsley and Robert Clarke published the results of a system for guiding tools into the brains of animals, with the intent of reproducibly reaching the same target area in each animal6. While Horsley and Clarke get credit for creating an external device to guide a tool to an interior target, in reality the device failed to accomplish what they desired: the interior of the subject could not simply be predicted from the exterior. Perhaps paradoxically, their use of serial sections of earlier subjects’ brains, kept in an atlas format that presaged modern tomograms, is often overlooked. The X-ray went on to become a commonplace diagnostic tool. The development of pneumoencephalography7 and ventriculography8 allowed surgeons to make quantitative determinations of target locations in the brain, but such techniques still lacked the third dimension of data.
It wasn’t until 1947, when Spiegel and Wycis9 combined the idea of the stereotactic frame pioneered by Horsley and Clarke with the patient-specific information provided by X-ray imaging, that imaging and surgery were rejoined. They took perpendicular shadowgram X-rays; if a target point could be unambiguously identified in both images, then a surgical probe could be guided to that location. While researchers such as Leksell, Riechert and Mundinger, Narabayashi, Talairach, Cooper and others all developed stereotactic frames, the frames only moved into general use when they became commercially available10. Lars Leksell founded Elekta in 1972; Talairach licensed his techniques to SORMEL (now DIXI Medical).
But stereotaxy was primarily a method for tissue ablation and, as elegantly demonstrated by Phil Gildenberg11, the commercial release of L-Dopa significantly attenuated its use. It wasn’t until the development of CT scanning in 1973 and the subsequent creation of the N-bar system by Russell Brown11 that CT drove increased stereotactic applications. Critical to that increase was that, while Brown was involved in a new type of frame (the BRW), the N-bar system was licensed by a number of frame manufacturers.
Even the now-ubiquitous CT scanner faced a rocky start. Refused by the medical X-ray companies, EMI (Electrical and Musical Industries) spent the money being generated by its top product, the Beatles, to create the CT market. Once the market was shown to exist, the big medical device companies waded in and used their superior marketing resources to claim it.
Image-Guided Neurosurgery (IGN) began in a similar fashion. An idea simultaneously developed in four labs across the world12,13,14,15, it was the reverse of stereotaxy: instead of finding a location on an image and moving to it, the present surgical position and trajectory were digitized and the location displayed on the images. One remarkable aspect of this development was that it arose from surgeon/engineer teams. From the metal bending of stereotactic frames, surgical guidance had progressed to electronic devices, registration algorithms, image processing routines and digital displays. It required the surgeon to accept an engineer as a partner while simultaneously requiring the engineer to learn medical fundamentals and to understand surgical workflow so the device could be inserted into the process. While image-guided neurosurgery has progressed to being sold and supported by large international companies, development has slowed. Present IGN systems trail research systems in functionality by a decade and have made considerable concessions to price at the cost of reduced patient care. Even so attenuated, image-guided surgery has become a $2.2 billion international business, and IGN is the standard of care for intracranial surgery.
The last example is image-guided liver surgery, which arose a decade after image-guided neurosurgery was established as the standard of care for intracranial surgery. Studies have shown that surgical resection has as much as a 10-to-1 advantage in five-year progression-free survival versus other therapeutic processes such as ablation, radiation, or chemotherapy16. Other studies show that only about 12% of the people who could potentially be helped by liver surgery receive it17. There is therefore a real possibility of dramatically improving the outcomes of the other 88% of liver cancer patients. However, commercial entities look at Medicare data for the number of liver surgeries performed at present and claim there is very little market, or they say “we hear no cry for this technology”. So it is left to surgeon/engineer teams to create the market. Figure 2 shows the rise in papers using the phrases “Image-Guidance” and “Liver Surgery”.
In this chart (Figure 2) I show data from Google Scholar (GS) and PubMed (PM).
While its growth is steady, image-guided liver surgery as a research area presently generates fewer than 350 publications a year. When compared to IGN, however, we see that the idea has exploded. See Figure 3.
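As an aside on method, annual counts like those behind these charts can be pulled programmatically from PubMed. The sketch below is a hypothetical illustration, not the procedure actually used for Figures 2 and 3: it builds query URLs for the NCBI E-utilities esearch endpoint, which can return just the number of matching records for a phrase restricted to a publication year. The function name and the example phrase are my own choices for illustration.

```python
from urllib.parse import urlencode

# NCBI E-utilities esearch endpoint (public, documented by NCBI).
EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_count_url(phrase: str, year: int) -> str:
    """Build an esearch URL that asks PubMed for only the count of
    records matching `phrase` published in `year` (rettype=count)."""
    term = f'"{phrase}" AND {year}[pdat]'  # [pdat] = publication date field
    query = urlencode({"db": "pubmed", "term": term, "rettype": "count"})
    return f"{EUTILS}?{query}"

# One URL per year; fetching each (e.g. with urllib.request) and parsing
# the returned <Count> element yields an annual publication series.
url = pubmed_count_url("image-guided liver surgery", 2015)
```

Sweeping the year across a range and plotting the counts reproduces the kind of trend line shown in the figures, though Google Scholar offers no comparable public API and its counts must be collected by hand.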
A Modest Proposal
While a rapidly increasing publication rate shows interest and development by multiple research groups, the important step is getting these techniques to patients. So how do we bridge the gap between medical need and research and development? Where that sort of translation has worked, surgeon and engineer teams demonstrate strong integration and frequent interaction. Secondly, new developments have to move from the laboratory into local hospitals for testing. This means that the teams have to work with Institutional Review Boards (IRBs) to maintain patient safety. However, IRBs (and their parent agency, the FDA) have become overly litigation-averse. Obviously the best way to completely avoid the potential for a lawsuit is to approve nothing, a clearly flawed risk/reward assessment: it completely lacks any accounting for the potential rewards of improved medical techniques and devices. Engineers need to help IRBs carefully examine new devices and weigh the potential risk to patients against the potential reward for patients. Requiring every new device brought into an OR to hold an Investigational Device Exemption (IDE) with hundreds of pages of Good Manufacturing Practice documents does not make the patient safer or healthier; it makes defense lawyers happier.
Once a device or process is developed and tested, it should go through a series of initial clinical tests and later a full clinical trial. While the NIH is to be commended for its innovative R21 Early Phase Clinical Trial grant mechanism, that mechanism sits alone. There is a step where technology meets medicine, where engineers work with surgeons not on the idea but on the implementation; where graduate students work with nurses and other OR staff to integrate the process into the workflow. We all know of cases where a technique or system had potentially great value to healthcare but that value was dampened by the complexity of its implementation. Having the time and support to clean up that middle step will improve healthcare. If a process is not easy to integrate into applications, it won’t be used, and we won’t deliver the value that we claimed in the research and development.
Finally, to bring these devices into widespread use we need knowledgeable, bold corporate leadership. Without EMI’s willingness to gamble on CT we might not have it now. Where is that leadership? If a company is going to call itself a “medical” company, then it must place improved health care delivery at the top of its priorities. This requires a method for bringing new ideas to market.
Often, as academics, we treat our motivation as the advancement of one algorithm over another, or the design of a novel device that wins federal grant monies. In reality, our competition is the treatment and prevention of disease. If we sit in our ivory towers tweaking an algorithm to get one more paper, or in our corporate board rooms more concerned with finding a new application for a suboptimal existing device than with getting the best possible device to patients, then we have lost. If we say “medical” then we need to be prepared to learn, to think, to fight, to deliver our ideas not just to the journals but to the patients.