5. Living Systems

New industries such as biotechnology are emerging as knowledge is gained about the highly energy- and material-efficient processes employed by living systems for self-propagation and the formation of functional structures such as organs, shells, or musculoskeletal systems. This knowledge is being applied not only to the agricultural and aquacultural production of food plants and animals, but also to human health protection and medical care.

The Living Systems technology category includes four technology areas:

Biotechnology
Medical technology
Agriculture and food technology
Human systems

Biotechnology is enabling both established and newly emerging industries to design, create, and produce highly specific substances derived from molecular structures and processes in naturally occurring biological systems. Currently, the two most important areas of biotechnology application are human health care and agriculture.

While the United States is the overall world leader in biotechnology, Europe and Japan pose strong competition in specific areas. The United States does more basic research in genetic engineering and molecular biology than any other country. Computer-based methods for analyzing and modeling molecular sequences and interactions, as well as for facilitating collaboration among widely dispersed groups, have contributed to the rapid evolution and application of knowledge in these areas.

The integration of knowledge and practice of many technologies is essential to an effective and efficient system for protecting the public health and delivering health care services. Innovative biomedical research and information-based integrated decision support systems--leading to prevention, more effective therapies, and a reduced need for long-term care--hold the key to advances that can restrain costs while enhancing the quality of public health and health care. Key technologies in this area are integrated information systems, functional diagnostic imaging, biocompatible materials, and the rapid identification of bacterial and viral infectious agents.

Global agriculture is facing the challenges of an increasing human population, an accelerating need for food, fiber, feed, and raw materials for other industries, and a declining amount of cultivated land per capita. Sustainable agricultural systems must be environmentally sound, productive, economically viable, and socially desirable. Aquaculture is currently the most rapidly growing agricultural segment and will play a significant role in providing a stable source of fish protein in the face of declining yields from oceanic fisheries.

The human-machine interface is a critical component in a number of complex integrated systems such as power and communications grids, air traffic control systems, and highly automated process control and manufacturing systems. Such systems typically produce much more data than a human is able to digest in a time-critical situation, so the main job of the interface is to present the data in a form easily understandable by the human and to provide an easy means of interacting with the system to ensure continued safe and reliable operations. The United States has a pre-eminent technical position in understanding human capabilities, behavior and performance while interacting with engineered systems and environments, and in implementing advanced human-machine interfaces.

For the most part, the U.S. is in a strong position in the technologies included in the Living Systems category. The specifics of the assessment are presented in the text of the section. The summary of the U.S. relative position and trends from 1990 to 1994 is shown in Figure 5.1.


Biotechnology

Biotechnology is enabling both established and newly emerging industries to design, create, and produce highly specific substances derived from molecular structures and processes in naturally occurring biological systems. Currently, the two most important areas of biotechnology application are human health care and agriculture. Other application areas, including biomining, specialty chemicals, energy, environment, electronics, and advanced materials, make up significantly smaller segments of the biotechnology industry.

While the United States is the overall world leader in biotechnology (primarily because of its extensive medical applications), Europe and Japan pose strong competition in specific areas. The United States does more basic research in genetic engineering and molecular biology than any other country. Europe, primarily Germany, France, and the United Kingdom, also conducts world-class basic research. Japan, which has always emphasized applied research, continues to be a world leader in microbial strain selection and manipulation, and has a very strong background in bioprocess engineering, particularly in the food, brewing, and antibiotics industries. Japanese firms have not established basic research programs in areas such as genetic and protein engineering that are comparable to those in Europe or the United States.

European countries have made significant advances in recent years in both human-health-related and agricultural biotechnologies and are slowly beginning to narrow the gap with the United States. Technical advances are likely to continue with the support of increased government funding and greater industry involvement. In addition, the EU is beginning a slow process of amending biotechnology regulations to better support the industry. Japanese firms have had some success as manufacturers and marketers but are now addressing the perception that they lack quality pharmaceutical and biotechnological R&D. For the most part, Japanese biotechnologies have been obtained through alliances with smaller U.S. biotechnology firms, rather than through in-house R&D. As a result, Japanese researchers are behind their peers in the United States and Europe in most areas of biotechnology. In addition, Japanese industry R&D expenditures, a very important component of R&D support, are low in comparison to those in both the United States and Europe, leaving Japanese researchers at a significant disadvantage.


Bioprocessing

Bioprocessing is the use of microbial, plant, or animal cells for the production of chemical compounds. Bioprocessing exploits a range of biological phenomena, extending from the fermentation processes used to produce beer, wine, and commercial ethanol to state-of-the-art processes for the production of specialty chemicals such as enzymes, amino acids, biocatalysts, and pharmaceuticals. Such production methods can be more energy-efficient, product-specific, and environmentally friendly than traditional methods of organic synthesis.

One specific bioprocessing technology is mineral extraction, or biomining. Biomining uses microbes to leach out minerals without the harsh conditions of physical mining methods, while improving recovery rates and reducing capital expenses and operating costs. The technique is now widely used: about 25 percent of worldwide copper production is based on bioprocessing, and applications to gold and phosphate extraction are promising.

Bioprocessing supports a number of national goals. It creates jobs in the food, pharmaceutical, chemical, mining, and biotechnology industries, and contributes to the competitiveness of those industries in global markets. By making possible the production of new and highly specific chemicals, bioprocessing enables the development of new drugs, which supports the health of U.S. citizens. It also supports national security by providing better medicines and organic compounds for the use of the military.

European and Japanese technical capabilities in human therapeutics lag those of the United States; however, Europe lags only slightly and is improving. Efforts in human therapeutics in Europe have been led by the United Kingdom--which has Europe's largest biotechnology industry--but significant technical capabilities also exist in both France and Germany. New biopharmaceuticals are being developed in Europe for such wide-ranging applications as treatments for shock, asthma, and cancer. Much of European biotechnology research is still being done by large pharmaceutical and chemical firms such as Glaxo (UK), Rhone-Poulenc (France), and Schering AG (Germany); however, innovations may increasingly come from new, smaller companies. For example, British Biotechnology has a number of biopharmaceuticals at various stages of development, including an anti-cancer compound and a drug that reduces the detrimental effects of chemotherapy agents.

Japan has only limited capabilities in nearly all segments of human therapeutics. While assessment teams and reviewers of Japanese programs have found little cutting-edge basic science or discovery of new compounds, they have seen impressive implementations and applications of automation and robotic techniques to technology largely developed outside Japan. By mastering current production processes, the Japanese are extremely well prepared to support new products coming from their discovery programs.

The Japanese are also making significant investments in the application of bioprocessing techniques to energy and environmental issues such as the desulfurization and denitrification of coal. They see this as a way to address the acid rain problem they encounter with Chinese coal combustion.

The Japanese approach to bioprocessing curiously does not focus on basic engineering principles but is more biologically oriented, emphasizing screening, selection, and medium development. Biotechnology is well accepted in Japan as an extension of the country's historical involvement in fermentation products, and bio-derived detergents and cosmetics are actively marketed by emphasizing the bioprocessing technology used in their production. This stands in marked contrast to European attitudes, which seem deeply distrustful of biotechnology processes and products and have created a political environment that has substantially inhibited European corporate activities in these areas--but encouraged those companies to invest in R&D and production facilities in the U.S.

Monoclonal antibody production

While antibodies alone can be used to kill cancer cells, they are most often used as carriers of other substances for either therapeutic or diagnostic purposes. Chemotherapeutic agents can be attached to monoclonal antibodies to deliver high concentrations of these toxic substances directly to tumor cells. Theoretically, this approach is more effective than conventional chemotherapy and less toxic because the delivery of harmful agents to normal tissues is decreased. For diagnostic purposes, they can be radioactively labeled and used to locate metastases previously undetectable by other methods.

Monoclonal antibody therapies involve the development of specific antibodies directed against antigens located on the surface of tumor cells. These techniques are highly patient-specific and require that samples of the patient's tumor cells be taken and processed to produce antibodies specific to the tumor-associated antigen. For this approach to work, a sufficient quantity of antigens unique to the tumor cells must be present, and the tumor antigens must be sufficiently different from the antigens elaborated by normal cells to provoke the desired antibody response.

Monoclonal antibody treatments still have significant limitations. Because monoclonal antibodies are made using mouse antibodies, they are foreign proteins that often trigger an immune response and can be neutralized before any therapeutic effect occurs. They may also lack sufficient specificity for tumor antigens, which must be distinct enough to ensure that only cancer cells are destroyed. These problems may be resolved in the future. Studies of monoclonal antibodies are in progress in the treatment of T cell lymphoma, chronic and acute lymphocytic leukemia, melanoma, colorectal cancer, and neuroblastoma.

The primary contribution of monoclonal antibody technology is to the health of the U.S. population. This technology also contributes to job creation and global competitiveness of the drug and medical diagnostics industries, and greater efficiency of the health care industry. By providing better tools for military doctors, it also contributes to U.S. warfighting capabilities.

While there are noteworthy British and French programs in these areas, the bulk of the research and subsequent exploitation by biotechnology firms is occurring in the U.S., supported by venture capital and alliances with the large pharmaceutical companies.

Protein engineering

Protein engineering, that is, creating protein sequences whose specific functions are determined by their three-dimensional shape, holds the key to tailoring protein catalysts and to rational drug design. While it is now well established that more than 500 proteins each routinely fold into a single conformation out of hundreds of millions of possibilities, the general folding rules have yet to be defined. While gene sequences are converted to protein sequences by ribosomes, the process by which a sequence determines its folding pattern is not well understood. The sequence of amino acids in a protein regulates the folding of the protein chain, with the folding pattern dependent on interactions among the amino acids' side chains. The final shape of the protein determines its ability to selectively bind and interact with specific sites. Once the protein folding riddle is solved, it should facilitate the development of synthetic and entirely new proteins and materials. Promising results are just beginning to become available from protein crystals grown in space.
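The scale of the folding problem described above can be illustrated with some simple arithmetic (the classic Levinthal argument). The chain length, the number of backbone states per residue, and the sampling rate below are assumed round numbers chosen for illustration, not figures from this report:

```python
# Illustrative estimate of the conformational search space of a protein.
# Assumptions (not from the report): a 100-residue chain, ~3 backbone
# states per residue, and a generous sampling rate of 1e13 states/second.

def conformation_count(residues: int, states_per_residue: int = 3) -> int:
    """Rough count of distinct conformations for a chain of amino acids."""
    return states_per_residue ** residues

def exhaustive_search_years(residues: int,
                            states_per_residue: int = 3,
                            samples_per_second: float = 1e13) -> float:
    """Years needed to enumerate every conformation at the assumed rate."""
    seconds = conformation_count(residues, states_per_residue) / samples_per_second
    return seconds / (3600 * 24 * 365)

print(f"{conformation_count(100):.3e} conformations")        # ~5.154e+47
print(f"{exhaustive_search_years(100):.1e} years to search")  # ~1.6e+27
```

Since real proteins fold in seconds or less, they clearly do not search this space exhaustively, which is why uncovering the general folding rules is such an attractive target.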

Protein engineering technology contributes to several national goals. By permitting the precise design of more specific and more potent drugs, protein engineering improves the health of the U.S. population as well as the warfighting capabilities of the U.S. military. New and more potent drugs bring the possibility of increased sales and more jobs in the pharmaceutical industry. Because protein engineering is on the leading edge of understanding molecular structure, it contributes to maintaining U.S. leadership in science and engineering.

The bulk of the physics of protein folding is being done in U.S. universities and national labs. Approaches range from computational simulation to the application of tools developed for the study of disordered systems in condensed-matter physics. The two predominant areas have been protein folding and the behavior of folded proteins. The primary basic physics work has been done in the U.S., with some mathematical elaboration in the former Soviet Union, Japan, and Switzerland.

Recombinant DNA technologies

Recombinant DNA techniques involve the transfer of genetic material between differing organisms, a process popularly referred to as genetic engineering. This transplanted genetic material contains encoded instructions for characteristics of the original cell, namely the production of specific proteins. This is done to enable recipient organisms to synthesize increased yields of compounds, to form entirely new compounds, or to adapt to different environments.

Research in gene therapy has grown dramatically since 1990, with more than 40 therapeutic gene transfer protocols approved since that time. Pending positive results in animal models, many researchers believe that gene transfer could potentially be used to remedy serious human diseases caused by genetic mutations, including sickle-cell anemia, emphysema, hemophilia, and even extremely high levels of cholesterol. Work is in progress on developing therapies for hepatitis and other liver diseases, AIDS, and diseases of the cardiovascular and central nervous systems, as well as on inserting genes that stimulate the production of immune cells that fight cancer. Other approaches involve introducing genes through viral vectors such as adenovirus or through protein-binding "receptors." Another approach, called "antisense," tries to do the opposite of what other techniques do: it tries to turn off genes that code for the production of harmful proteins.

Targeted gene replacement provides access to more than 5,000 human disorders attributed to genetic defects. As the genes and disorders are identified, the same mutations can be produced in mice, and the mouse models should make it possible to trace the events leading up to the manifestation of the disease. Understanding the molecular pathology of the disease should facilitate the development of more effective therapies. The various mutations of the cystic fibrosis gene are being studied this way. The study of mammalian neurobiology also looks attractive for this technique.

Transgenic animals are being developed for a variety of purposes, from mice for specific disease research to livestock with leaner meat or increased milk production. Transgenic plants such as the rot-resistant tomato and insect- and herbicide-resistant corn and cotton have been aggressively pursued. The pursuit of herbicide-resistant crops is considered controversial because of concerns that such crops may encourage increased use of herbicides. Progress in animals and plants has been somewhat slower than in cell culture preparations, as it is frequently constrained by gestation periods in animals or growing seasons for plants.

One of the most intriguing applications of rDNA technology in the past year is biomolecular electronics. It is conceivable that for certain kinds of algorithms, molecular computation might compete with electronic computation; the energy efficiency and storage density would be quite impressive. Storing information in DNA molecules, at an information density of approximately 1 bit per cubic nanometer, would be a substantial improvement over conventional video tape, which stores data at a density of approximately 1 bit per 10^12 cubic nanometers.
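The density comparison above is easy to check. This sketch simply restates the two figures quoted in the text; the cubic-millimeter extrapolation at the end is an added illustration, not a claim from the report:

```python
# Back-of-the-envelope comparison of the two storage densities quoted
# in the text: DNA at ~1 bit per cubic nanometer versus conventional
# video tape at ~1 bit per 10^12 cubic nanometers.

DNA_NM3_PER_BIT = 1.0       # cubic nanometers per bit in DNA
TAPE_NM3_PER_BIT = 1e12     # cubic nanometers per bit on video tape

improvement = TAPE_NM3_PER_BIT / DNA_NM3_PER_BIT
print(f"DNA is denser by a factor of about {improvement:.0e}")  # 1e+12

# Added illustration: one cubic millimeter is 1e18 nm^3, so at this
# density it could in principle hold ~1e18 bits, i.e. ~125 petabytes.
bits_per_mm3 = 1e18 / DNA_NM3_PER_BIT
print(f"{bits_per_mm3 / 8 / 1e15:.0f} petabytes per cubic millimeter")
```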

Progress in recombinant DNA technology will contribute first and foremost to the health of the U.S. population. In addition, by facilitating care of chronic diseases and more productive agriculture, it contributes to a more productive economy. Agriculture-related biotechnology applications are potentially extremely important. As in human-health-care related biotechnology, there are currently only a handful of products on the market. However, researchers are developing a variety of transgenic animals and crops that will probably have significant market impacts after the turn of the century.

The U.S. is a world leader in rDNA technology, although the extent of the U.S. lead varies from area to area. Europe is only slightly behind the United States in the development of transgenic animals, with research taking place primarily in the United Kingdom and the Netherlands.

European firms have also developed significant capabilities in transgenic plant technology, and, although they lag the United States slightly, they are likely to improve over the next five years. Technical capabilities lie primarily within the United Kingdom, France, Belgium, and the Netherlands. The United Kingdom--which has conducted the largest number of field trials in the European Union (EU)--and France have developed and are field testing transgenic varieties of many different plants with many different traits. These countries are also working on collaborative projects, including R&D on the Euromelon, a joint British, French, Spanish, and Greek research project now being field tested. Belgium has a number of important achievements, perhaps the most significant of which is the development of a new type of oilseed rape genetically engineered to be herbicide resistant. The crop was recommended for general sale and use for the first time in the EU by a panel of government experts and now awaits approval by the Environment Secretary, the Agriculture Minister, and the EU. If approval is given, as industry experts believe likely, the scale of genetic crop development will change from small scale to mass production.

In the Netherlands, biotechnology has been applied in plant breeding and seed propagation for the last 15 years. While the Netherlands has been a leader in these technological developments, strong public opposition has hindered field trials and thus slowed research. The Dutch company Van der Have Groep, for example, had to abandon trials of corn engineered to be herbicide resistant after environmental activists objected to the release of the plants' pollen into the environment.

While some feel that Japan has virtually no technical capabilities in the development of transgenic animals and has not begun significant research in this area, its potential in transgenic plant development should not be underestimated. Small, highly focused efforts such as the Rice Genome Project were designed to make Japan pre-eminent in rice research. The Japanese map of rice genome linkages and markers (the rice genome is the smallest of the major cereal grasses, one-sixth the size of wheat's) has been highly productive, with 53 scientists and a budget of $5.5 million. In a collaboration with British molecular biologists begun in 1991, they have determined that many markers are in the same relative position on wheat and rice. This colinearity has been observed in barley and rye as well, offering the possibility of a generalized map of the genome of the ancestral grass from which all these contemporary cereal grains derive. There have been recent announcements that the agricultural ministry, buoyed by its success in the Rice Genome Project, will soon start an animal genome project using the pig as a model.


Vaccines

Vaccines for prevention are a cost-effective way to control or even eradicate selected infectious diseases. Recently, vaccine therapies for cancer have also begun to be aggressively explored. Active research efforts presently focus on HIV, malaria, tuberculosis, pneumococcus, cholera, rotavirus, measles, varicella zoster, cytomegalovirus, and respiratory syncytial virus. The classical approaches to vaccine development were based on stimulating the body's immune system with attenuated living pathogens (measles, polio, tuberculosis) or with killed infectious agents or a protein from the pathogen plus an adjuvant (diphtheria, tetanus, and whooping cough). With genetic engineering techniques, the "blueprint" for a specific pathogen's antigens can be isolated and inserted into harmless bacteria or viruses, which can then produce the antigen. This antigen can then be used to stimulate the immune system's ability to attack the pathogen without direct exposure to it. Such recombinant sub-unit vaccines have been successfully developed (such as the MS&D vaccine for hepatitis B), and similar techniques are being pursued to prevent Lyme disease, measles, and malaria.

An alternative approach is the live recombinant vehicle vaccine, based on incorporating a gene for a specific antigen into a harmless bacterium or virus that is subsequently introduced into the body. The live organism then manufactures and delivers the antigen to the body's immune system. These vaccines can be more effective because they tend to persist longer in the body, and by carrying multiple foreign genes they may allow simultaneous vaccination against multiple pathogens. Some are better at stimulating antibodies and others at stimulating T lymphocyte production. Virus vehicles are being explored for poliovirus and adenovirus. The BCG bacterium is being engineered to produce antigens for the pathogens that cause Lyme disease, tetanus, and malaria.

Work on improving preventive vaccines is stirring less interest than the potential for vaccine therapies. A number of small biotechnology firms are pursuing vaccines to stimulate the immune system to fight tumors. Using advanced cell culture techniques to identify specific tumor antigens, followed by monoclonal antibody techniques and the polymerase chain reaction (PCR), these firms are making rapid progress in identifying cytokines--the chemicals that control the immune system--with the intent of mobilizing T cells to attack specific tumor types.

Vaccine development makes an immediate and critical contribution to the health of the U.S. population. Given the scientific basis on which vaccines are developed, it also contributes to the goal of retaining U.S. world leadership in science, mathematics, and engineering. Vaccines also have significant implications for national security by protecting U.S. soldiers, sailors, airmen, and marines during peacekeeping and other missions, and by assuring the health of their families while they are on deployment.

There is little commercial interest in the development or production of vaccines for diseases that do not afflict developed countries because of the inability of companies to recoup their investment. Work in this area is performed predominantly by smaller biotechnology firms, which form marketing and distribution alliances with larger firms to support clinical trials. The NIH is the world's largest funder of vaccine research, spending more than $300 million annually. The Walter Reed Army Institute of Research is also a major research funder. A joint working agreement is in effect to coordinate the efforts of the DOD's U.S. Army Medical Research Command (Walter Reed Army Institute of Research), the U.S. Agency for International Development, and the National Institute of Allergy and Infectious Diseases (NIAID). There are productive, active research groups in France, Belgium, Venezuela, and England.

Medical Technology

The integration of knowledge and practice of many technologies is essential to an effective and efficient system for protecting the public health and delivering health care services. Innovative biomedical research and information-based decision support systems hold the key to advances that can restrain costs while enhancing the quality of public health and health care.

Health information systems and services

Health information systems and services are discussed in the "Information and Communication" section under Information Management.

Biocompatible materials

Biocompatible materials are materials designed to exist and perform specific functions within living organisms. They include a broad range of substances, such as structural metallic orthopedic prosthetic implants, artificial blood and skin, surface coatings for implantable sensors for chronic (long-term) patient monitoring, and electrodes for functional electrical stimulation. While implant durability is one concern, the major problem is the body's tendency to reject these materials as foreign objects, either through an adverse immune system response or by attempting to "wall them off" behind a surrounding protein layer. Newer porous materials for total joint replacement, by contrast, allow the existing bone to grow into the replacement joint. Structural orthopedic implants are primarily joint replacements for the hip or knee. While implants in less active elderly patients were often structurally sufficient, the activity levels of younger patients caused premature device failure, often requiring multiple surgical replacements. Chronic monitoring and functional electrical stimulation open the door to more effective low-dose therapies and functional restoration.

Artificial blood and skin have great appeal because they expand a relatively limited resource and avoid the problems of type matching, screening for bacterial or viral contamination, and a very limited shelf life. Most approaches to artificial blood are based on hemoglobin, the protein inside red blood cells that binds and transports oxygen. While hemoglobin can be pasteurized to minimize disease transmission and given to anyone regardless of blood type, breakdown in the circulatory system, with resulting kidney damage, remains a problem. Several methods for using ultrasound to form an aqueous solution of hemoglobin microbubbles are now yielding excellent results, although they have yet to be tested in humans. Though widely pursued, particularly by the Japanese, who have had substantial programs in this area, artificial blood remains an elusive objective.

An entirely different approach to biocompatible materials is emerging from the ability to manipulate the body's own immune system. Exemplified by projects to manipulate animal-derived extracellular matrix to create biologically derived materials for reconstructive applications such as vascular grafts, ligaments, or tendons, these approaches are intended to stimulate the body's own cells and induce them to rebuild the lost material. Other efforts are underway to develop implantable "microreactors" containing living cells isolated from the body's immune system to treat diseases requiring replacement of hormones, enzymes, factors, and other cell-produced bioagents. Progress in this area is closely linked to research and development in protein engineering and subsequent tissue engineering.

Biocompatible materials make a direct contribution to the improved health of the U.S. population by enabling a variety of biomedical applications. These materials and their adaptation are part of the growing biotechnology industry, contributing to job creation in that industry. Progress in these areas will be valuable for improved outcomes and reduced length of stay for trauma and surgical procedures. The importance of minimal anesthesia and proper tissue perfusion and oxygenation cannot be overemphasized, particularly in infants or geriatric patients. These materials are also important for the restorative care of battlefield casualties, and so contribute to reducing the damage from military operations.

The U.S. has a substantial leadership position in this area across metallic, ceramic, and organic materials with the possible exception of artificial blood where the Japanese have had a significant concentration of effort.

Functional diagnostic imaging

High-resolution medical imaging techniques, including CAT scans, MRI, and ultrasound, are revolutionizing medical diagnosis and treatment. As a result, they have emerged as critical components of the nation's health system. Computer-driven image processing (enhancement, comparison of episodic changes, quantitative analysis of absorption ratios) has held great promise and is becoming more clinically accepted with better displays, faster processors, storage systems, and networks. As the imaging parameters become more complex in modes such as MRI, decision support systems will become more valuable in making a differential diagnosis based on tissue response to the various excitation modalities available in MRI.
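The "comparison of episodic changes" mentioned above can be sketched as simple image differencing: subtracting a baseline scan from a follow-up scan to flag the regions that changed. This is an illustrative toy, not a clinical algorithm; the function name, the tiny nested-list "images," and the threshold are all invented for the example, and real systems would first co-register the two scans:

```python
# Toy sketch of episodic-change comparison between two gray-scale
# image slices, represented here as nested lists of pixel intensities.

def change_map(baseline, followup, threshold=10.0):
    """Return a boolean mask of pixels whose intensity changed by more
    than `threshold` between the two (assumed co-registered) images."""
    return [[abs(f - b) > threshold for b, f in zip(brow, frow)]
            for brow, frow in zip(baseline, followup)]

baseline = [[0.0] * 4 for _ in range(4)]
followup = [row[:] for row in baseline]
for r in (1, 2):
    for c in (1, 2):
        followup[r][c] = 50.0        # a simulated 2x2 region of new change

mask = change_map(baseline, followup)
print(sum(v for row in mask for v in row))   # 4 changed pixels
```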

Functional MRI complements earlier methods of functional neurological imaging, which used positron emitters carried in the blood stream. One method, using labeled water, allows visualization of alterations in blood flow; labeled glucose allows visualization of altered regional metabolism. Just as astronomers learned that they could benefit from images registered across different parts of the electromagnetic spectrum, brain researchers and neuro-radiologists are beginning to exploit the multi-spectral perspective.

Optical coherence tomography uses infrared radiation to provide cross-sectional images of biological tissues with 10- to 20-micron resolution. While its immediate applications will be in the eye, for non-invasive detection of glaucoma and macular degeneration, its ability to analyze the top few millimeters of any biological structure should extend its use to arteries and mucosal tissue. Since many pathologies, such as skin cancers and atherosclerosis, begin on tissue surfaces, the use of various wavelengths may enable rapid, minimally-invasive differential diagnosis. Incorporated in endoscopic procedures, it could provide real-time data on tissue hydration and oxygenation, and guidance for highly selective laser angioplasty.

Functional diagnostic imaging has the capability to improve early diagnosis and, through earlier and more appropriate treatment, provide improved patient outcomes at lower cost. These technologies also contribute to job creation and economic growth through global export of medical imaging systems. As the imaging chain becomes fully digitized, chemical wastes from film processing will be reduced, lessening the impact of medical technology on the environment. Techniques developed for clinical imaging are also being adapted to industrial quality control, as they permit non-destructive testing and visualization of internal structures and flaws in complex metal and composite assemblies.

Functional diagnostic imaging also makes a contribution to the nation's warfighting capabilities. Digital images can facilitate the medical decision making process in isolated or remote military environments, both ship-board and land-based, not so much in the management of gross combat casualties as in the daily experience of sick-bays and medical operating groups. In the event of large-scale deployments, digital medical imaging could augment the interpretive capabilities on-site by linking state-side radiology and pathology consultants to the forward areas.

The U.S. holds undisputed leadership in ultrasonic imaging of the heart and soft abdominal tissues. This is supported by an extensive technology base in ultrasonic transducer design, digital signal processing, and data display.

Computerized tomography is also dominated by U.S. companies after a major marketing-based sorting out that occurred in the mid-1980s. While the initial invention was done at EMI in England, the mathematical reconstruction algorithms were derived from astronomy work and were ultimately dependent on both computer processing capabilities and low-noise, high-sensitivity detector technologies. Two recent innovations using high speed CT are also enabled by the existing strong U.S. technology base. One system, developed by a group affiliated with the University of California at San Francisco Medical Center, is derived from linear accelerator technology--the chief physicist in the group having previously worked at the Stanford Linear Accelerator Center (SLAC). Another system of note is capable of functioning as a "virtual colonoscope"--that is, reconstructing an animated visualization of the bowel based on a series of high speed CT images of the air-filled bowel--without actually inserting an intrusive instrument. The technology is also dependent on high speed graphic workstations and animation/image reconstruction software.
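
The reconstruction mathematics referred to above can be illustrated with a deliberately simplified sketch. The Python fragment below (illustrative only; real scanners use filtered back-projection with far more careful interpolation, and all names here are invented for the example) forward-projects a single-point phantom at a set of angles and then smears each projection back across the image plane; the reconstruction peaks at the original point location.

```python
import numpy as np

def radon_projections(image, angles):
    # Forward project: rotate the image by each angle (nearest-neighbor
    # sampling, for illustration) and sum along columns to get line integrals.
    n = image.shape[0]
    c = (n - 1) / 2.0
    ys, xs = np.mgrid[0:n, 0:n]
    sinogram = []
    for theta in angles:
        ct, st = np.cos(theta), np.sin(theta)
        xr = ct * (xs - c) - st * (ys - c) + c
        yr = st * (xs - c) + ct * (ys - c) + c
        xi = np.clip(np.round(xr).astype(int), 0, n - 1)
        yi = np.clip(np.round(yr).astype(int), 0, n - 1)
        sinogram.append(image[yi, xi].sum(axis=0))
    return np.array(sinogram)

def back_project(sinogram, angles, n):
    # Smear each projection back across the image plane and average.
    c = (n - 1) / 2.0
    ys, xs = np.mgrid[0:n, 0:n]
    recon = np.zeros((n, n))
    for proj, theta in zip(sinogram, angles):
        ct, st = np.cos(theta), np.sin(theta)
        t = ct * (xs - c) - st * (ys - c) + c  # detector coordinate per pixel
        recon += proj[np.clip(np.round(t).astype(int), 0, n - 1)]
    return recon / len(angles)

n = 65
phantom = np.zeros((n, n))
phantom[32, 32] = 1.0  # a single bright point at the center
angles = np.linspace(0, np.pi, 60, endpoint=False)
recon = back_project(radon_projections(phantom, angles), angles, n)
print(divmod(int(recon.argmax()), n))  # row, column of the reconstructed peak
```

Unfiltered back-projection like this blurs the point into a halo; the filtering step in clinical algorithms exists precisely to remove that blur.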

Magnetic resonance imaging (MRI), formerly known as nuclear magnetic resonance (NMR) imaging, is derived from a technology used in physical chemistry since the late 1940s. The new systems are dependent on superconducting magnet technologies, digital signal processing, and the supporting electronics--areas where the U.S. has a dominant technology base. The most recent developments in the field, functional or spectrographic imaging, also draw on that technology base. Bruker Instruments of Germany is the preferred supplier of ultra-high field strength instruments used for research applications.

Although they have not been developing the technology themselves, the Japanese have been eager investors in American technology. For example, Diasonics/Toshiba licensed a highly innovative high field NMR instrument from Imatron, a spin-off from University of California at San Francisco Diagnostic Imaging Group.

Bacterial/viral detection and screening

Rapid identification of bacterial and viral contamination is becoming more important because of newly emerging viruses and the re-emergence of old diseases such as bubonic plague, tuberculosis, and cholera. In contrast to the 1918 influenza epidemic, which took 30 days to spread globally in the days of steam ships, present estimates for worldwide spread are on the order of less than seven days. Public water supply contamination has been reported to be on the increase, with presently installed water processing facilities unable to assure the quality of water released downstream, and subsequent re-uptake for public water supply unable to adequately filter and remove the viral burden. Improved methods are also needed for assuring a safe, uncontaminated food supply.

Hospital infections--their etiology, prevention, and control--could also benefit from more rapid techniques for differential identification of both bacterial and viral presence. Similarly, antibiotics could be more selectively prescribed with an effective, economic antibiotic susceptibility testing system. Rapid detection and identification of bacterial and viral contaminants is also an issue in an era of rapid deployment of troops into remote regions for either peacekeeping or regional conflict resolution. In this case both air- and waterborne contaminants are of concern, as is the provision of a safe water supply. The value of improved systems in dealing with the threat of biological warfare agents needs no further comment.

Techniques ranging from the less selective, such as bioluminescence based on the presence of ATP and electroconductance methods of bacterial detection, to highly specific antibody techniques have been demonstrated under laboratory conditions. Yet a broad-spectrum, high-sensitivity system with real-time or near real-time response is not available for bacteria, much less for viruses.

The need for an effective global surveillance and reporting system cannot be overstated. Bacterial/viral detection and screening technologies would contribute to the health of the U.S. population by assuring a safe water and food supply, reducing hospital-acquired infections, and improving the effectiveness of antibiotic therapy in the face of increasingly antibiotic-resistant strains. They would contribute to economic growth by supporting global trade in agricultural products. They would contribute to the improvement of environmental quality by improving the ability to monitor, and thus control, the discharge of contaminated water into rivers and streams.

Bacterial/viral detection and screening would also improve the ability of the U.S. military to carry out its missions. These technologies would provide support for troops in remote regions without water supplies of known quality, and would also provide an early warning system against biological warfare agents.

There are no clear-cut leaders in this area, although Japanese biosensor work and U.S. work applying monoclonal antibodies hold considerable promise. While the breadth of the Japanese program is quite extensive, with over 50 systems announced for both food process monitoring and human health applications, actual progress and performance are difficult to assess until the systems are released for independent testing.

Medical devices and equipment

A broad array of electronic monitoring devices is essential for life support and patient monitoring. Providing real-time information on patient status, they are a hallmark of modern medical care in circumstances ranging from the hospital operating room to roadside heart attack patients, trauma victims, or battlefield casualties. Passive monitoring instruments for electrophysiology (EKG) and blood pressure are used by the medical staff of the operating room or critical care unit to assess patient status and stability, and the progress of various interventions. Blood gas instrumentation (oximeters and CO2 monitors) may be used to control respirators or cardio-pulmonary bypass pumps. While these technologies are relatively mature, reduced size, reduced response time, and increased stability, accuracy, and precision remain challenges for competitive advantage. Critical parameters such as the adequacy of tissue perfusion and actual real-time cardiac output are being approached by a variety of ultrasound and optically coupled techniques. Signal processing of the raw data, and computer-assisted display of the array of multiple parameters so as not to overwhelm the medical staff, remain as challenges.
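
The signal-processing challenge mentioned above begins with steps as simple as smoothing a noisy trace before display. A minimal sketch (hypothetical sample values, assuming a plain moving-average filter; clinical monitors use far more sophisticated artifact rejection):

```python
def moving_average(samples, window):
    """Average each run of `window` consecutive samples (simple low-pass filter)."""
    return [sum(samples[i:i + window]) / window
            for i in range(len(samples) - window + 1)]

# Hypothetical beat-to-beat heart-rate samples (bpm) with one artifact spike:
raw = [72, 75, 71, 74, 73, 90, 72, 74]
smoothed = moving_average(raw, 3)
print(smoothed)  # the 90-bpm spike is attenuated in the smoothed trace
```

Even this trivial filter illustrates the design trade-off the text alludes to: smoothing suppresses artifacts but also delays the display's response to genuine changes in patient status.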

Another important group of medical devices are those for functional electrical stimulation (FES), which offer great promise for moving the disabled and elderly toward greater independence. Cardiac pacemakers are one example of functional electrical stimulation that has enhanced the lives of millions while providing the basis for a substantial industry with domestic and export markets. Cochlear implants to help with hearing impairments, and selective stimulation of nerve and muscle groups to restore paralyzed limbs, have made limited progress over the past two decades. FES has also been explored for respiratory support in spinal cord patients with diaphragm activation problems, as well as for bladder and rectum control.

Medical devices and equipment make a major contribution to the health of the U.S. population and to the improvement of quality of life for individuals. These technologies provide greater independence and functionality for the elderly and the injured, allowing them to remain productive members of society longer, and contribute to the effectiveness of the U.S. health care system. They also reduce the human costs of U.S. military actions by providing injured soldiers with care on and off the battlefield and with more normal lives following battle injuries.

The U.S. biomedical industry is estimated to supply 49 percent of the global market. While there are competent competitors in most device markets, the technology base, supporting R&D, and ready availability of quality components all contribute to U.S. technical leadership. There has been strong academic interest in biomedical engineering departments since the late 1960s. Technology transfer and development funded by the aerospace sector contributed to early development. Component sub-systems for imaging, such as x-ray tubes and ultrasonic transducers, drew on a technology base developed to support microwave transmitter tubes and sonar transducers. Laser-based instruments have drawn on a technology base developed by national labs for physics research and military applications. While significant innovations such as the lithotriptor and color-coded ultrasonic cardiac flow visualization systems have been developed overseas and introduced to the U.S. market, they are the exception.

Agriculture and Food Technologies

Global agriculture is facing the challenges of increasing human population, accelerating need for food, fiber, feed and raw materials for other industries, and a declining amount of cultivated land per capita. There is a growing realization that agriculture, as a form of ecological system, is concerned with the regulation of the abundance, distribution, and behavior of species. Naturally occurring, historic inter-species competition has its counterpart in agriculture which tries to control the inputs of energy, nutrients, and water into productive crops.

Sustainable agriculture production

Sustainable agricultural systems address the development of environmentally sound, productive, economically viable, and socially desirable agriculture. The stability and sustained fertility of the soil depend on prevailing soil-climate conditions and on the effects of human activity. There are strict limits to the extent of human influence over soil substrates; ground, soil, and surface water; and the resident flora, fauna, and micro-organisms which are essential to productive agriculture. The preservation of arable land must continually meet the challenges of desertification, deforestation, salinization, soil degradation, and soil erosion. Regional and global climate change also has the potential to alter the productivity and suitability of many crop systems.

Selective breeding to enhance desirable attributes has been a long-standing hallmark of agricultural R&D. Biotechnology and genetic engineering are facilitating the manipulation of seed stock to create food and fiber crops with selectively enhanced attributes such as drought tolerance or pest resistance. Cereal grains such as wheat and rice are being intensively studied, with traits such as growth cycle, disease resistance and stalk height already well known. Efforts are well underway to improve their resistance to fungal, bacterial, and viral diseases, as well as insects, drought and increased salinity.

The preservation of wild seed grain is of continuing importance. Many crops have become highly specialized, whether to enhance crop yield, flavor, or shelf life, but in the process may have become more vulnerable to drought or to specific pests or diseases. It has frequently proven invaluable to return to historic wild seed to study the basic attributes which may have been "bred out" in the pursuit of more economically attractive strains. The reduced tolerance of the resulting monocultures is not problematic under narrowly defined conditions, but when environmental conditions shift significantly (as in the 1970s Midwest drought and subsequent corn blight) they can be highly vulnerable. Hence the importance of ancestral seed collections as a "reference resource."

Through efforts of the USDA and various land-grant university Departments of Agriculture, the U.S. investment in this area is substantially greater than that of other nations. The published work from Germany and the Netherlands is oriented to production in semi-arid regions such as West Africa. The globally dominant position of the United States in world-wide food production is reflected in the research in this area.

Food safety assurance

Ensuring food safety to the greatest extent possible is an ongoing challenge, because the route from field or catch to table is a long one, with many handlers involved in processing, storage, and transportation. No technology for processing food is universally protective; pasteurization is quite effective but cannot be applied to solid foods. Food poisoning from bacteria, viruses, and parasites is escalating in almost every country that collects statistics on the subject. Explanations for this increase have ranged from improved reporting and detection to increased demand for meat and animal products. Attempts to halt these trends in food poisoning occurrences have focused on the re-establishment of surveillance systems and on requiring new testing and standards of the food industry. A system of checks known as the Hazard Analysis Critical Control Point (HACCP) system, initially devised by NASA in the late 1960s, will be required by European Union directive by December 1995.

The application of advanced technologies to monitor food quality and detect bacteria, viruses, parasites, or chemical contaminants is still quite limited in the processing/production environment. While techniques including flow cytometry, immuno-assays, and DNA hybridization, including the polymerase chain reaction (PCR), have demonstrated capabilities in the laboratory, it is not currently economic to deploy them in the large-volume production environment of today's food processing plants. These technologies all offer promise for improved microbial quality assurance and drug or residue detection, but different areas of the food industry, such as highly processed foods, fermented foods, and foods of animal origin, will require different approaches.
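
The laboratory sensitivity of PCR comes from exponential amplification: each thermal cycle multiplies the number of DNA copies, ideally doubling it. A back-of-the-envelope sketch (illustrative numbers, not figures from this report) shows why a handful of template copies reaches detectable levels in a few dozen cycles:

```python
import math

def cycles_to_threshold(initial_copies, threshold_copies, efficiency=1.0):
    """Cycles needed for the template count to reach a detection threshold,
    assuming each cycle multiplies copies by (1 + efficiency)."""
    return math.ceil(math.log(threshold_copies / initial_copies, 1 + efficiency))

# Ten starting copies amplified to a detectable one billion:
print(cycles_to_threshold(10, 1e9))  # -> 27 (with perfect doubling)
```

Real reactions run below perfect efficiency, which is why lowering the `efficiency` argument (say to 0.8) raises the cycle count; the exponential character, not the exact constant, is the point.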

Aquaculture and fisheries

The worldwide demand for fish is steadily increasing, and the wild supply is steadily being depleted. As a result, interest in aquaculture, i.e., fish farming, has grown globally into an estimated $26 billion a year industry. Aquaculture is increasingly dependent on specially bred stocks and high quality supplies of water, which must be properly treated prior to discharge. Acceptance of cultivated seafood among food processors and restaurant operators has been good, driven by the perception that farmed fish are cleaner and more consistent in quality than the wild catch. The use of feed has grown increasingly efficient, with state-of-the-art producers using 1.4 pounds of feed to produce a pound of fish. This compares to chicken farming, where the ratio is 2 to 1--itself a significant improvement over the 15 pounds of feed per pound of chicken that was the norm in 1925.
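
The feed figures above compare directly as feed conversion ratios (pounds of feed per pound of product; lower is better). A trivial sketch using only the numbers quoted in the text:

```python
def feed_conversion_ratio(feed_lbs, product_lbs):
    """Pounds of feed consumed per pound of animal produced (lower is better)."""
    return feed_lbs / product_lbs

print(feed_conversion_ratio(1.4, 1.0))   # state-of-the-art aquaculture -> 1.4
print(feed_conversion_ratio(2.0, 1.0))   # modern chicken farming       -> 2.0
print(feed_conversion_ratio(15.0, 1.0))  # chicken farming, circa 1925  -> 15.0
```

By this measure, the best fish farms convert feed roughly 30 percent more efficiently than modern chicken farming.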

Aquaculture and fishery technologies make a significant contribution to a safe and abundant food supply in the U.S. They also make a positive contribution to job creation and economic growth, and to the U.S. balance of payments, by reducing dependence on imported seafood and increasing U.S. exports of fish and shellfish.

Accounting for over 12 percent of the global fish harvest, aquaculture is making a significant contribution to the global food supply. It is estimated that 90 percent of global aquaculture activity is in China where the indoor systems are primarily hatcheries. While Israel, Japan and India have very active programs in this area represented in the open literature, high tech aquaculture is flourishing as a private enterprise in California, Massachusetts, Mississippi, Florida, and Hawaii. Many operators consider their processes to be proprietary, publishing little.

Human Systems

Advanced human-machine interfaces

A number of integrated systems have been proposed or designed for situations where a human must be considered a necessary part of the system. In such cases, the human-machine interface is a critical component of the system. Such systems typically produce much more data than a human is able to digest in a time-critical situation, so the main job of the interface is to present the data in a form easily understandable by the human and to provide an easy means of interacting with the system. Advanced work on the human-machine interface was initially done by the Department of Defense because of the need to support pilots in combat situations, but as automation and information intensity increase in other fields, human-machine interfaces become increasingly important to safety and efficiency in a variety of workplaces.

The field of human factors focuses on human capabilities, behavior, and performance while interacting with engineered systems and environments. While the classic perception of human factors deals with aircraft cockpit displays and ergonomics, today it is a design-oriented discipline that covers the range of human interaction with complex systems and environments such as the control rooms of power plants, air traffic control, manufacturing, and telecommunications networks. In future "highly automated systems," where staff size is small and the operating environment stressful, the cognitive demands of large, complex, and dynamic nonlinear or digital systems quickly outstrip the control capacity of the unaided human. Neglecting human factors can result in labor-intensive operations and increased operating costs, high workload and fatigue, and higher rates of accidents resulting from human error. These problems are far from trivial, and they can be life threatening.

The design of systems to assist operators in managing cognitive workload, situational awareness, and decision making in automated, multi-task environments is still evolving. Mental workload assessment and guidelines are based on either behavioral or performance-based techniques, subjective assessment techniques or psychophysiological techniques such as heart rate, eye blinks, and brain waves. These techniques can be combined to derive a composite workload estimate that takes advantage of the sensitivities of the various measures. These techniques have been the basis for the development of crew resource management training which is now a standard training protocol for commercial aviation crews. The application of these techniques to the less constrained environment of industrial process control is still in its infancy.
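
One common way to combine such disparate measures into a composite workload estimate is to standardize each measure and take a weighted average. The sketch below is a hypothetical scheme with invented sample values and weights, not a validated workload model; the NASA-TLX name refers to a real subjective rating scale, but these scores are made up:

```python
import statistics

def zscore(samples):
    """Standardize raw samples to mean 0, standard deviation 1."""
    mu = statistics.mean(samples)
    sd = statistics.stdev(samples)
    return [(x - mu) / sd for x in samples]

def composite_workload(measures, weights):
    """Weighted average of z-scored measures, computed sample by sample."""
    z = {name: zscore(vals) for name, vals in measures.items()}
    n = len(next(iter(z.values())))
    total = sum(weights[name] for name in measures)
    return [sum(weights[name] * z[name][i] for name in measures) / total
            for i in range(n)]

# Hypothetical samples from three task segments (low, medium, high demand).
# Blink rate falls under load, so it is negated to point the same way.
measures = {
    "heart_rate": [70, 85, 100],   # beats per minute
    "blink_rate": [-20, -14, -8],  # negated blinks per minute
    "nasa_tlx":   [25, 55, 80],    # subjective rating, 0-100 scale
}
weights = {"heart_rate": 1.0, "blink_rate": 1.0, "nasa_tlx": 2.0}
index = composite_workload(measures, weights)
print(index[0] < index[1] < index[2])  # -> True: index rises with demand
```

Standardizing first is what lets heart rate, blink rate, and a subjective rating, all on incommensurate scales, be combined at all; the weights then encode the relative trust placed in each measure.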

Both the assessment techniques and the decision support aids can be applied to a wide range of task design and training situations, and should be of value in both the transportation and industrial process control environments. Advances in human-machine interface technologies will lead to greater productivity and greater safety in U.S. industry. This technology is also critical for warfighting in the "information-intensive" battlefield.

The DOD Crew System Ergonomics Information Analysis Center, NASA's Space and Flight Human Factors groups, and their affiliated universities and support contractors have been the leaders in the development of advanced human-machine interfaces. While the information has been widely available, industry has only recently shown broader interest in incorporating it in system design. While the European community has been more amenable to standards activities, such as video display terminal standards and workstation ergonomics, the U.S. has developed a substantial body of knowledge that could be more effectively exploited in the design of large systems and consumer products. Japanese industry has shown some interest in the field and has increased its research funding, but the U.S. efforts are so much greater that the U.S. lead in this area is increasing.
