The 2022 Graeme Clark Oration
Engineering Your Heart’s Health
Would you want to know if and when you were going to have a heart attack? Perhaps not – it is too similar to that age-old question of whether you want to know when you will die – but it might be a different story if something could be done to prevent it.
Professor Natalia Trayanova’s team has developed an artificial intelligence (AI) algorithm to predict potentially life-threatening arrhythmias more accurately than doctors.
Each beat of your heart is preceded by an electrical wave that sweeps through. The electrical connection between cells of the heart and their resulting synchronous behaviour fascinate Natalia.
But sometimes there is a problem with the rhythm – whether too slow, too fast or irregular – which is known as arrhythmia. Many people experience an arrhythmia at some point in their lives. Although arrhythmias are usually harmless, they can be life-threatening, accounting for a large proportion of sudden cardiac deaths. If there is a problem with the rhythm of the electrical wave that triggers each heartbeat, the heart can simply stop.
During a heart attack, lack of oxygen causes part of the heart tissue to be damaged, forming scar tissue. Heart attack survivors are therefore particularly prone to irregular rhythms. Scarring can cause the electrical wave before a heartbeat to become disrupted, bouncing back or reverberating rather than acting as a single wave. The heart stops contracting in a synchronous manner and, in Natalia’s words, becomes more like a ‘bag of wiggling worms’.
People at high risk of cardiac arrest due to a ventricular arrhythmia – one arising in the heart’s lower chambers, the ventricles – receive a stopwatch-sized implantable cardioverter defibrillator. The defibrillator sits in the upper chest to monitor the heart’s electrical activity and delivers a sudden shock to reset the heart should an arrhythmia occur.
Diagnosis of arrhythmia risk is typically based on detecting a drop in the amount of blood the heart pumps out with each beat. A ‘wriggly’ heartbeat is a weaker one, and the volume of blood pumped falls. Below a certain threshold, you are considered at risk of sudden cardiac death. However, Natalia does not believe this truly captures those at risk, as it does not reflect the cause of the drop – it may be electrical dysfunction, but it may not.
There are patients at low risk of sudden cardiac death from arrhythmia (SCDA) receiving defibrillator implants they might not need, putting them at unnecessary risk. Conversely, there are high-risk patients not getting the treatment they need, who could die in the prime of their lives. Research into SCDA risk assessment has mainly taken a one-size-fits-all approach that clearly does not capture everyone.
Natalia’s solution is a more personalised approach. By creating a digital twin of a person’s heart, it can be extensively probed and prodded virtually to predict their risk of sudden cardiac death. Doctors could use this new computer-based method to inform a patient’s treatment plan.
The virtual-heart arrhythmia risk predictor – or VARP for short – is the only predictive technology that determines risk of sudden cardiac death. Using MRI scans of a patient’s heart, Natalia’s lab can create a 3D digital model that captures the unique shape of its chambers and its individualised post-heart attack scarring pattern – the most important determinant of arrhythmias. They then run a series of simulated electric currents to unmask any potential problems.
VARP formed the basis of Natalia’s latest technology, the Survival Study of Cardiac Arrhythmia Risk (SSCAR), which generates a personalised ten-year survival assessment for each patient.
The name, SSCAR, is a nod to the cardiac scarring that contributes to life-threatening arrhythmias. Given that scarring impacts electrical signals in the heart, this scarring is key to predictions. One machine learning neural network analyses the distribution of scars based on MRI scans. The scars can be spread through the heart in different ways, and each says something about a patient’s chance of survival.
A second neural network assesses patient data, factoring in demographics, physical traits, and lifestyle choices, to link a patient’s background to their risk. Together, the two neural networks comprise SSCAR and can determine the probability of sudden cardiac death. This technology empowers doctors to decide the next steps for each patient before a cardiac arrest strikes.
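The two-branch design can be sketched in outline. The sketch below is illustrative only – the layer sizes, the logistic risk output, and the random weights are assumptions for demonstration, not SSCAR’s actual architecture: one branch ingests a flattened scar map, the other the clinical covariates, and their features are fused before a single risk estimate is produced.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(n_in, n_out):
    """A randomly initialised fully connected layer (weights, bias)."""
    return rng.normal(0, 0.1, (n_in, n_out)), np.zeros(n_out)

def relu(x):
    return np.maximum(0, x)

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Branch 1: a flattened MRI-derived scar map (here 32x32 = 1024 voxels).
W1, b1 = dense(1024, 16)
# Branch 2: clinical covariates (age, sex, ejection fraction, ...).
W2, b2 = dense(8, 16)
# Head: concatenated features -> probability of sudden cardiac death.
W3, b3 = dense(32, 1)

def predict_risk(scar_map, clinical):
    h_img = relu(scar_map @ W1 + b1)    # scar-distribution features
    h_tab = relu(clinical @ W2 + b2)    # patient-background features
    h = np.concatenate([h_img, h_tab])  # fuse the two branches
    return sigmoid(h @ W3 + b3)[0]      # risk estimate in (0, 1)

risk = predict_risk(rng.random(1024), rng.random(8))
```

In a real system the weights would of course be learned from patient outcomes rather than drawn at random; the point here is only the two-input, one-output structure.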
What can be done if you know that sudden cardiac death is coming? Natalia has figured that out too.
Arrhythmia can be treated with ablation, in which a spot in the heart is burned. Doctors assess a patient’s heart to find a good spot for ablation; however, there may be multiple places where the rhythm appears off. If the root driver of the arrhythmia is not terminated, the arrhythmia will simply return. In many cases it does indeed come back, the patient returns to hospital, and they do it all over again – all the while, the patient is accumulating more heart damage.
With a personalised digital heart twin based on a patient’s scans and information, Natalia’s team can find the perfect spot for ablation the first time. They simulate the ablation in potential places of the digital replica and play out each scenario virtually to find the best place for ablation prior to burning a patient’s real heart.
None of the members of Natalia’s team are medical professionals – they are all engineers. Yet in the operating theatre, the surgeons look to her and her team before beginning the treatment. The surgeons ask, “do we do it?” and her team gives the go-ahead. Natalia finds it exhilarating and somewhat surreal that her team is given such an enormous responsibility in a medical procedure, but it means that patients do not need to come back.
“We have demonstrated that VARP is better than any other arrhythmia prediction method that is out there,” she says.
The Graeme Clark Oration celebrates world leaders and advances in medical research and developments in convergence science. Natalia’s work certainly exemplifies the use of convergent science: technology, engineering, physics, and medicine in collaboration as the future of cardiology. She sees computational approaches and AI as major tools in healthcare and precision medicine. The applause and excitement of the audience were a great indication of just how powerful we all believe the future of her work to be.
Written by Catriona Nguyen-Robertson, Science Communication Officer, Convergence Science Network
2019 Graeme Clark Oration: Professor Tim Denison
Towards an Electronic Prescription?
“The brain machine interface – it’s the BMI that could change your life” – Chief Scientist, Dr Alan Finkel.
Seemingly taken straight out of science fiction, electronic devices can be implanted into patients to regulate their nervous system in treatments for neurological disorders.
Neurological disorders are a major contributor to disability and death worldwide. Despite decreases in mortality rates from stroke and other disorders, the burden of neurological disorders has increased globally over the past 25 years due to expanding and ageing populations. The number of patients who require care by clinicians is continuing to grow. We therefore need new solutions to help us tackle these challenges.
Graeme Clark Orator, Professor Timothy Denison knows that “it’s our first instinct to look to pharma” for the answers. He poses an alternative solution: he is integrating electronics to treat disease in the hope that bioengineering and electric devices will be paired with pharmacology to provide optimal solutions in healthcare.
As the nerve cells and fibres of the nervous system use electrical impulses to communicate, it’s no surprise that bioelectronics has significant potential to treat a wide range of neurological disorders. Bioelectronics involves implantable devices that monitor brain signals, and stimulate or block nerve function inside the patient’s body as needed. Hundreds of thousands of patients currently rely on bioelectronic systems inside them to alleviate their symptoms of Parkinson’s disease, epilepsy, chronic pain, and incontinence.
The concept of bioelectronics has been around since ancient times – prior to the discovery of electricity. The ancient Greeks used electric rays (torpedo fish), fish that can produce an electric discharge to stun prey or for self-defence, to numb the pain of childbirth and surgical procedures. The “Torpedo’s shock”, which had the power to stun and numb, was described by Aristotle and Plato, and later taken up by Roman physicians to treat headaches.
Today, neurostimulation is a hot topic in the biotechnology and health spaces, as it has the potential to transform treatment for patients. Direct brain stimulation, for example, is used as a treatment for tremors in Parkinson’s Disease, providing a 20% improvement in patients’ ability to perform fine motor skills. Denison’s aim is to improve this figure without triggering any undesirable side effects. He notes that it’s important to be aware of anything that can go wrong so that it’s possible to find a solution before it does go wrong. With the technology behind brain machine interfaces and bionics rapidly developing, bioelectronic systems can only get better from here.
Denison has developed electric circuits that, instead of merely stimulating the brain, incorporate it into the system as a “co-processor” – allowing for the incorporation of a feedback loop. For example, many thermostats in buildings monitor the environmental temperature and only actively heat when the temperature falls below a certain level. Similarly, electrodes are placed in the brain to monitor brain wave frequency oscillations (the sensorimotor rhythm) and only turn on stimulation when required (i.e. when a patient has the intention to perform movements). Otherwise, the stimulation is turned off. This increases the longevity of the device, uses less power, and decreases the risk of side effects due to stimulation.
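The thermostat analogy translates directly into a closed-loop rule. The sketch below is a toy illustration, not Denison’s actual device logic – the signal values and threshold are invented: stimulation switches on only while a monitored brain-rhythm measure exceeds a threshold, so the stimulator is idle most of the time.

```python
def closed_loop(signal, threshold):
    """Per-sample stimulation decisions: stimulate only while the
    monitored brain-rhythm measure exceeds the threshold."""
    return [sample > threshold for sample in signal]

# Simulated sensorimotor-rhythm power: mostly quiet, briefly elevated
# (as when the patient intends to move).
signal = [0.2, 0.3, 0.9, 1.1, 0.8, 0.3, 0.2, 0.1]
stim = closed_loop(signal, threshold=0.7)

# Fraction of time the stimulator is active - an always-on device
# would sit at 1.0, so closed-loop control saves power.
duty_cycle = sum(stim) / len(stim)
```

The reduced duty cycle is exactly the benefit described above: longer device life, lower power use, and fewer stimulation-related side effects.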
There are drawbacks to current bioelectronic systems: they often remain unchanged, in contrast to the rapidly changing and reactive activity of a patient’s nervous system. Denison is developing universal, programmable systems that can be adapted for a range of diseases and tailored to specific individuals – a blank slate that can be moulded on a personal basis.
Advancements of new technologies in healthcare, such as this, are made possible through convergent science. “My background in anthropology has served me just as well as my ability to do calculus,” says Denison. His work has been driven by neurosurgeons, neuroscientists and electrical engineers working together, as well as the patients in clinical trials – “they’re the true heroes who will propel this field”. Denison advises that with all these new opportunities for patients, we need to be cautious about neuroethics (the ethics associated with our increasing capability to monitor and influence the brain) and risk management, and that’s where working in large, interdisciplinary teams comes into play.
“The only way to build on science is to build new tools”. Denison’s innovative platform can be easily adapted as our knowledge of nervous systems improves and the technology develops. His devices will allow us to move towards electronic prescriptions based on patient data, seizure prediction and memory improvement in epilepsy, and emotional and sensory processing in chronic pain and depression based on individual needs.
“You can run a brain on four segments of chocolate per hour while a machine would require the power of an entire house to achieve the same processing power” – Alan Finkel. The brain is a truly remarkable organ and Denison’s work is making leaps and bounds to decrease the burden of neurological diseases.
Written by Catriona Nguyen-Robertson | Science Communication Officer | Convergence Science Network
2018 Graeme Clark Oration: Dr Paula Hammond
Nanomedicine Comes of Age
Dr Alan Finkel posed a question to the audience: “To which branch of science do you credit the cochlear implant?” While many may answer biomedical engineering, to Dr Finkel, it “showcase[s] the gifts of material science”. Professor Mark Cook, Director of the Graeme Clark Institute, referred to it as “the invisible science” – so here’s to materials science, the unsung science, which the Convergence Science Network has celebrated annually for a decade as part of the Graeme Clark Oration.
Something so incredible can be built from the simplest of scientific principles that we learn in school: that opposites attract. Professor Paula Hammond, David H Koch Professor in Engineering and the Head of Department of Chemical Engineering at the Massachusetts Institute of Technology, uses this concept as the basis of her layer-by-layer (LbL) nanoparticle technology:
Immerse a negatively charged substrate in a solution of positively charged molecules (a polycation solution) and it will attract those molecules, which form a monolayer around the negative core, reversing the charge. The process is then repeated, immersing the now-positive substrate into a solution of negatively charged molecules (a polyanion solution). There is no limit to the number of layers that can be added, as long as the charge alternates between each layer.
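The alternating-charge rule can be written as a tiny simulation. This is schematic charge bookkeeping only – the +1/−1 values are arbitrary labels, not real surface potentials: each immersion coats the particle with the oppositely charged polymer and flips its net surface charge.

```python
def assemble_lbl(core_charge, n_layers):
    """Simulate layer-by-layer deposition: each dip adsorbs a polymer
    layer of the opposite charge, reversing the surface charge."""
    surface = core_charge
    layers = []
    for _ in range(n_layers):
        layer = -surface   # the oppositely charged polyelectrolyte adsorbs
        layers.append(layer)
        surface = layer    # the new outer layer now sets the surface charge
    return layers, surface

# A negative core dipped four times: polycation, polyanion, and so on.
layers, surface = assemble_lbl(core_charge=-1, n_layers=4)
```

Running this gives alternating layer charges, mirroring the physical requirement that the charge alternate between layers for assembly to continue indefinitely.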
Professor Hammond uses LbL assembly to incorporate a range of different drugs into nanoparticles for delivery into the body. By adding charged polymer layers that degrade at different rates, multiple drugs can be delivered in one go but released sequentially. Similar to Willy Wonka’s Gobstoppers, layer-by-layer nanoparticles can be constructed around any core, with specially designed shapes, sizes and layers. Professor Hammond provided multiple examples of how she has applied this technology to create the new frontiers of medicine.
Assisting bone growth and healing for orthopaedic implants
Currently the sole treatment for large bone defects is surgical intervention, so Professor Hammond was interested in creating a less invasive approach to aid bone growth. Her group has created nanoparticles that contain two growth factors: bone morphogenetic protein (BMP)-2, which induces stem cells to differentiate into bone cells, and platelet-derived growth factor (PDGF), which promotes angiogenesis – the formation of new blood vessels. Both are required for proper structural organisation and bone mechanics. PDGF has to be delivered first (in an outer layer of the nanoparticle), as blood vessels need to form before the area becomes too calcified by bone; the inner core of BMP-2 is released later.
Orthopaedic implants, such as hip replacements, are associated with a risk of infection, especially when replacing old, worn-out implants. Ironically, bacterial infections undo the work of the procedure, as they can cause bone resorption – the breakdown of bone. Hence, with a sequential release of an antibiotic to clear any potential bacterial infection, PDGF to form blood vessels in the area, and lastly BMP-2 to promote bone growth and healing, Professor Hammond created a one-hit delivery system that complements orthopaedic procedures.
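The sequential-release idea amounts to unwrapping the particle from the outside in. The sketch below illustrates the ordering only – the release times are invented for demonstration, since real degradation rates depend on the polymers chosen:

```python
# Layers listed outermost first; each tuple is (agent, release_day),
# where release_day is a hypothetical time by which that layer degrades.
implant_coating = [
    ("antibiotic", 1),   # outer layer: clear any infection first
    ("PDGF", 7),         # middle layer: recruit blood vessels
    ("BMP-2", 21),       # inner core: drive bone growth last
]

def released_by(day, coating):
    """Agents released once their layer has degraded, outermost first."""
    return [agent for agent, t in coating if t <= day]
```

Queried at day 10, for example, only the antibiotic and PDGF layers have released; BMP-2 is still protected in the core, matching the antibiotic → PDGF → BMP-2 sequence described above.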
Cancer therapy
Professor Hammond has produced nanoparticles 10–100 nm in size (on par with viruses) that deliver cancer therapeutic drugs, again with multiple layers:
Stealth layer/outer layer for penetration
To sustain their rapid growth, many tumours undergo rapid angiogenesis (blood vessel formation) to obtain sufficient nutrients, but these blood vessels can be leaky. The nanoparticles take advantage of leaky blood vessels to enter and accumulate in tumour tissues. A negative “stealth” layer allows the nanoparticles to specifically target tumour cells: most cells of the body have a net negative charge and therefore repel the negative nanoparticles. Tumour cells are also negative, but the tumour microenvironment can be slightly acidic compared to the rest of the body, and a pH-sensitive outer layer senses when it is in the presence of the tumour and loses its charge. The now-neutral particle can then interact with and be taken up by tumour cells without being repelled. Additionally, for cancers that do not have leaky blood vessels, the outer layer also contains hyaluronic acid molecules, which bind to CD44, a receptor overexpressed on many tumour cells (e.g. in ovarian, non-small cell lung, and breast cancers). These methods ensure that the nanoparticles specifically target cancerous cells and leave healthy cells untouched.
Nucleic acid to reprogram the tumour cells
A middle layer of nucleic acids, such as silencing RNA or DNA, silences genes that enable tumour survival or activates anti-tumour genes. Cancer chemotherapy slows tumour growth, but cancers can become resistant to many different types of drugs. Cancer cells can spit drugs back out, enhance their repair mechanisms, increase their tolerance to damage that would usually result in cell death, or use other methods to evade death.
In some cases, resistance is due to MRP1, a molecule on the tumour cell surface that acts as a pump to “spit out” the DNA-damaging drugs used in cancer therapy. Silencing RNA can be introduced to silence the MRP1 gene, allowing a chemotherapeutic to stay inside the tumour cells. Tumour cells also evade death when they lose the “guardian” tumour suppressor gene, p53. Nanoparticles can also deliver microRNA to restore p53 so that it can carry out its anti-tumour functions. This cargo layer therefore disarms the tumour’s defences before the final blow.
Chemotherapeutic drug core
Chemotherapy can kill most tumour cells by damaging their DNA, but some drug-resistant ones may survive and grow again. The “one-two punch” approach of an LbL nanoparticle – delivering a nucleic acid to first lower the tumour’s defences, followed by the chemotherapeutic core – holds great potential for treating more stubborn cancers.
Treating infectious disease
Professor Hammond hopes to apply her LbL-assembled nanoparticles to treat bacterial infections. With an outer layer that can penetrate biofilms and/or mucosal barriers in the body (e.g. the respiratory tract and gut), a middle layer containing drugs that weaken bacterial defence mechanisms and kill most bacteria, and an inner core of a final high-strength antibiotic that can kill any persistent bacteria, she hopes that many problematic infectious diseases may become treatable. This is especially important in the face of the global antimicrobial resistance crisis.
Professor Hammond has taken a simple physical principle and applied it in multiple ways. She has integrated engineering, physical chemistry, biology, and medicine to encapsulate and release proteins and biologic drugs in a timed manner, creating multi-agent nanolayered delivery and release systems. Professor Hammond’s determination to “create something special – magical almost” and see innovative ideas come to life will ensure that her work continues to be adapted and used for tissue engineering, biomedical devices, and therapeutic purposes.
Written by Catriona Nguyen-Robertson | Science Communication Officer | Convergence Science Network
2017 Graeme Clark Oration: Dr Harold Varmus
Transitions in Cancer Research
‘Science is based on observation, measurement and deduction,’ says Professor Harold Varmus. The 2017 Graeme Clark Oration, delivered by Professor Varmus, Lewis Thomas University Professor at the New York Genome Centre, focused on the science that provides explanations that continue to change our lives.
Professor Varmus’ life and career in science grew out of an interest in helping others. With parents in the medical profession, he was expected to become a doctor. After several twists and turns, ‘[he] found his footing’ when he studied tumour biology at the University of California, San Francisco. To this day, he endeavours to answer one, single question: how does a normal, healthy cell become a malignant tumour cell?
Cancer is a disease that is part of the human condition – ‘it is inherent in who we are,’ according to Professor Varmus. ‘The reality is that cancer will affect nearly half of us in our lifetimes.’
Our DNA – all 6 billion letters of code – is duplicated every time a cell divides, and mistakes occur often. While our cells have repair mechanisms, sometimes mistakes slip through the cracks. In addition to DNA replication being error-prone, DNA is also subject to rearrangement (in the way it is packaged into chromosomes) and damage by mutational forces such as ultraviolet radiation and tobacco.
‘A key aspect of cancer is variety,’ which can make it difficult to study. What unifies cancers is their origin: damage to the genome. This damage can take many different forms in many different genes to affect many different functions in our cells. With increasingly advanced technology, we can better examine changes in genes, look for them in diagnosis, and target them in therapeutics.
Professor Varmus outlined three historical phases of cancer research: the largely observational studies prior to the 1950s, the new experimental biology developing in the late 20th century, and the current boom of faster and more powerful methods to understand the molecular underpinnings of cancer. When Professor Varmus started out as a scientist, it was not feasible to isolate and manipulate human genes; now it is commonplace, and our research capacity has come along in leaps and bounds.
Medical studies prior to the genetics revolution relied on autopsies and observations of cancer cells under a microscope. Epidemiologists also associated cancer with a variety of phenomena such as age, smoking, or occupation (e.g. exposure to asbestos, mustard gas, or radiation). Paradoxically, agents that contribute to cancer – toxic chemicals and radiation – became the basis of traditional cancer treatments, chemotherapy and radiotherapy, together with surgery. There also appeared to be a strong familial link, implicating genetics in cancer incidence.
The advances in phase two of cancer research were reliant on other biological studies: the determination of the DNA double helix structure, the discovery of the genetic code, and the central dogma of biology – that DNA is transcribed into RNA, which is translated into protein. Molecular biology became critical to all forms of research, including cancer research. Armed with a limited ability to study viral genomes, which are much smaller and simpler than animal genomes, researchers such as Peyton Rous identified viruses associated with cancers. Rous could isolate a virus from a tumour in one chicken and give cancer to another by transferring a cell-free extract.
It became possible to identify the specific genes in viruses that caused cancer. Animal cells grown in culture could be infected with viruses and visibly transformed into tumour cells under the microscope. When researchers asked why viruses contained cancer-causing genes, they realised that these genes were similar to genes in the animal cells themselves – hence the term oncogenes. The human genome appeared to contain similar genes that, when altered, would transform a healthy cell into a cancer cell.
Cells undergo a constant, healthy cycle of growth and death, controlled by a set of “go” and “stop” signals provided by regulatory genes. Oncogenes tell cells to keep dividing, growing, and conquering, leading to tumour growth. Understanding this, researchers developed drugs to block protein targets of oncogenes to reverse the effects in malignant cells. Some cancers with commonly mutated genes can be treated with enzyme inhibitors (lung (EGFR), breast (HER2), melanoma (BRAF)), but the effectiveness of this treatment is limited by drug resistance. In the same way that the normal genome can give rise to mutations, mutations that arise in cancer endow it with the ability to adapt and become resistant – it’s a matter of outsmarting cancer.
Phase three of cancer research began with the human genome project, which was completed in 2003. The cost of the initial effort to sequence a single human genome was high, but DNA sequencing costs have since declined exponentially and bioinformatics has accelerated to keep up with the analysis of such large quantities of data. Researchers can now identify common mutations by directly sequencing and comparing cancer genomes. It is still not simple to identify single, targetable cancer genes, as many genes contribute to cancer, but we are getting there.
While our questions around cancer have not changed, the methods now at hand to tackle them give us a fighting chance. Professor Varmus is at the forefront of evolving cancer research, and who knows what innovative technologies of the future will aid in finding solutions to current medical challenges?
Written by Catriona Nguyen-Robertson | Science Communication Officer | Convergence Science Network
2016 Graeme Clark Oration: Dr Thomas Insel
New frontiers for helping people with mental illness
‘This is going to be the century of non-communicable diseases, and brain disorders will be the most disabling and most costly of these without question,’ according to Dr Thomas Insel, 2016 Graeme Clark Orator. ‘But we do not know enough to meet this challenge’.
One in five Australians aged 16–85 experiences a mental illness in any year. Despite the fact that mental illnesses are common, we still struggle to diagnose them, treat them, and talk about them for fear of stigma. We use labels for other non-communicable diseases, such as “cancer”, “diabetes” and “heart disease”, but struggle with terminology around mental health and the illnesses associated with it.
The World Economic Forum projected that mental illness would contribute the greatest financial burden to healthcare – more than cancer, respiratory diseases and diabetes combined. Tellingly, given its neglect, it had initially been included in the analysis only as a baseline comparison, and the results were surprising. Mental disorders carry an incredibly high disease burden, measured by financial cost, mortality, disability and morbidity. In Australia, depression alone is the number one cause of non-fatal disability, and globally it carries the third-highest burden of all diseases. While there has been some progress in Australia, ‘there is still much more to do’.
Dr Insel is an American neuroscientist and psychiatrist who formerly led the National Institute of Mental Health. In his time there, he observed a drastic shift in the three pillars underpinning our capacity to tackle mental health: neuroscience, technology and information science, and genomics, providing promise and hope.
A report from the National Review of Mental Health Programmes and Services demonstrated problems in the state of care for patients with mental disorders in Australia. Treatment accessibility is a large challenge: fewer than half of those who need services receive them, and of those, only 40% receive the minimum accepted standard of care. Psychological care, medical care, and social support are also fragmented, when Dr Insel believes they should be combined in a more cohesive fashion. In addition, diagnosis and treatment often come at a late stage, which impacts recovery. ‘We can do much, much better,’ says Dr Insel.
The genomic revolution has driven faster, cheaper and better DNA sequencing than was imaginable a decade or two ago. There has also been a change in the nature of scientific data sharing; scientists now collaborate to create large databases of DNA sequences. Over 80 countries have contributed to one such database of nearly one million genome samples, used to identify genetic variations associated with specific psychiatric disorders (e.g. bipolar disorder, schizophrenia, autism, major depressive disorder and attention deficit hyperactivity disorder).
Researchers compared whole genomes in this database and identified genetic trends across multiple disorders – none of which were expected. The genes didn’t appear to be linked to neural pathways that are targeted by current therapeutics (e.g. dopamine, a messenger for neurons) but were linked to pathways related to the immune system, neural development, and cell signalling. These trends thus give us a glimpse at the biology of risk and the complexity of mental health rather than providing diagnostic tools.
The technology behind functional brain studies has also advanced, and may soon allow researchers to understand how the brain works ‘at the speed of thought’. In their endeavours to ‘break the neural code’, the quality of functional MRI scans has improved, the Brainbow (in which cells of the brain are individually coloured under a microscope) allows researchers to study relationships between cells, and a micro-endoscope can be implanted into the brains of mice to track the firing of neurons in action. Applying these approaches to mental disorders, while several years away, may provide the insights into diagnostics and treatments that we need.
Lastly, the technology boom of the last decades makes it possible for psychiatric and mental health services to have a global impact. With smartphone apps and tools to provide online services, social networking, coaching, and tracking of activity and progress, the two billion smartphones in the world would provide remarkable reach. 800,000 lives a year are lost to suicide, and there are 20 attempts for every death – this equates to around 90 deaths every hour. But if people have access to online services at their fingertips, we might be able to make a difference.
Dr Insel wants to move towards ‘Not doing things to [mental health patients], but rather doing things with them and for them’. Advancements in technology, genomic sequencing and analyses, and the ability to study the brain at a functional level, provide powerful tools with which we can create change in the mental health space.
Written by Catriona Nguyen-Robertson | Science Communication Officer | Convergence Science Network
2015 Graeme Clark Oration: Sir Paul Nurse FRS PRS
Controlling How Cells Reproduce
Sir Paul Nurse, 2015 Graeme Clark Orator, told a story that began around 350 years ago, when Robert Hooke identified cells as the basic unit of life.
All living things are composed of cells, which go through a cycle of growth and reproduction. The cell cycle is a series of events that lead to the duplication of DNA and division of cell components to produce two identical cells. Nurse provided insights into his work that determined how the cell cycle is controlled – work that earned him the 2001 Nobel Prize in Physiology or Medicine together with Leland Hartwell and Tim Hunt.
How could cell reproduction be investigated when nothing was known about it? For Nurse and Hartwell, the answer was in yeast. As a fresh postdoctoral researcher, Nurse moved to Edinburgh, where he began studying fission yeast (S. pombe), a strain taken from the Carlsberg brewery in the 1890s. With only 5,000 genes compared to the 25,000 in humans, it was much simpler to study.
Interested in both, Nurse merged the originally independent studies of fission yeast genetics and cell cycle research under the guidance of his mentors, Murdoch Mitchison and Urs Leupold, who were experts in one or the other. Nurse mutated random genes in the yeast to determine the impact of each mutation on the cell’s ability to divide; if a mutant couldn’t divide, then the mutated gene must be necessary for division.
The genes identified in this way were important for the process of cell division. But then Nurse became interested in what controlled the entire process. What controlled the rate of cell division? What allowed it to take place? He was effectively looking for the “boss” gene regulators rather than simply the minions that did the work.
When cell reproduction was blocked, the yeast cells continued to grow without dividing and appeared elongated (like two merged cells). As Nurse looked for elongated cells under the microscope, he instead started seeing smaller cells. As serendipity would have it, these smaller mutants were completing the cell cycle at a faster rate – and so the answer presented itself.
Nurse collected 50 small yeast mutants, which, being in Scotland at the time, he called wee1, wee2, etc. He then identified the mutated gene in these wee mutants as cdc2, a gene that appeared to control the progression of the cell cycle.
At this point, Nurse moved to the University of Sussex to determine the role of the cdc2 gene. In doing so, he found an equivalent (homologous) gene in humans, called cdk1, that played the same role. In fact, once he started looking, ‘it was everywhere: humans, mice, chickens, plants, etc.’
When eukaryotic cells grow, they go through phases called G1 (growth), S (synthesis), G2 (growth) and M (mitosis – division). Nobel Laureates Nurse, Hartwell and Hunt together discovered two proteins that are encoded by regulatory genes, cyclin and cyclin dependent kinase (CDK), that control the transition from one stage to another as checkpoints. These checkpoints regulate whether the cell has progressed properly before transitioning into the next phase.
The cdc2 gene in fission yeast (or cdk1 in humans) controls the progression of the cell cycle from G1 phase to S phase and the transition from G2 phase to mitosis – division itself. When something is wrong, the cell is stopped from dividing and the fault is repaired or the cell dies. But if a cell divides incorrectly and survives, it may then transform into a cancerous cell.
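The checkpoint logic described above can be caricatured as a tiny state machine. This is a toy sketch for illustration only, not a biological model; the phase names follow the article, while the gating dictionary simply stands in for whether a cyclin-CDK checkpoint is satisfied.

```python
# Toy sketch of cell-cycle checkpoints: the cycle only advances from
# G1 to S and from G2 to M when the corresponding checkpoint passes;
# a detected fault arrests the cycle in place.

PHASES = ["G1", "S", "G2", "M"]

def advance(phase, checks):
    """Return the next phase, or the same phase if a checkpoint fails.

    `checks` maps a transition (e.g. "G1->S") to True/False, standing
    in for whether the cyclin-CDK checkpoint is satisfied.
    """
    i = PHASES.index(phase)
    nxt = PHASES[(i + 1) % len(PHASES)]
    transition = f"{phase}->{nxt}"
    # Only the G1->S and G2->M transitions are gated in this sketch.
    if transition in checks and not checks[transition]:
        return phase  # arrested: the fault must be repaired first
    return nxt

# A healthy cell completes one full cycle back to G1…
phase = "G1"
for _ in range(4):
    phase = advance(phase, {"G1->S": True, "G2->M": True})
print(phase)  # G1

# …while a cell with a fault arrests at the G2->M checkpoint.
print(advance("G2", {"G1->S": True, "G2->M": False}))  # G2
```

The point of the sketch is the arrest behaviour: a failed checkpoint returns the cell to the same phase rather than letting division proceed, mirroring how a faulty cell is stopped, repaired, or removed.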
Nurse established his own lab in 1984 with the Imperial Cancer Research Fund (now Cancer Research UK) to further investigate the molecular biology of cell cycle control. Many wondered how a yeast researcher fit in in a cancer research institute, but it became increasingly clear that regulation of the cell cycle was tightly linked with cancer control.
Cell reproduction is the basis of growth for all organisms and is a universal process. All complex organisms, including us as humans, begin as one cell that then divides into approximately 37.2 trillion cells. As Nurse discovered along his scientific journey, our cells run a tight ship!
Written by Catriona Nguyen-Robertson | Science Communication Officer | Convergence Science Network
The 2014 Oration
2014 Graeme Clark Oration: Dr Donald Ingber
The Next Technology Wave: Biologically Inspired Engineering
Dr Donald Ingber transformed bioengineering across all of Harvard University and its hospitals to form the Wyss Institute for Biologically Inspired Engineering. Ingber’s vision was to see the walls between living and non-living break down as the life sciences, physical sciences, and engineering came together.
As the 2014 Graeme Clark Orator, Ingber outlined how he leverages the understanding of biological concepts to inspire the field of engineering.
Many people are familiar with Moore’s Law: that computing power doubles every 18 months. Indeed, technology has advanced in leaps and bounds in the last few decades. Meanwhile, the medical and biomedical fields move in the opposite direction as they face the challenge of Eroom’s Law: that the number of new drugs developed per dollar of research spending halves roughly every nine years.
Ingber is concerned that the current drug development model is broken. Big Pharma invests money and resources into drugs that progress far down the product development pipeline before ultimately failing. Animal studies and tests on cells cultured in dishes take years and don’t necessarily predict clinical outcomes. The result is a lack of new drugs reaching patients and millions of wasted dollars.
To provide an alternative to traditional animal testing, Wyss Institute researchers and a multidisciplinary team of collaborators adapted computer microchip manufacturing techniques to design and engineer models that mimic whole human organ cell functions. They build microfluidic culture devices that contain living human cells in a recapitulation of the microarchitecture and functions of human organs, including the lung, intestine, kidney, skin, blood-brain barrier, and others.
“Organs-on-chips” are the size of a computer memory stick, containing hollow microfluidic channels lined by living human organ-specific cells. They emulate the interface between the blood vasculature (the endothelium, the single layer of cells closest to the blood) and the organ’s epithelial layer so that researchers can observe how the organ interacts with blood vessels. Mechanical forces can also be applied to the system to mimic the physical microenvironment (e.g. cyclic breathing motions for the lung and peristalsis-like gut contractions). These chips essentially provide a cross-section of organs, and their translucent nature provides a window into the inner workings of human cells and tissues in real time.
Wyss researchers have also developed an instrument that links the individual “organs-on-chips” together to represent the interconnectedness of organs in the human body. Fluid is circulated through vascular channels between different chips, much like blood flows through our bodies, to analyse the complex interactions between tissues. This “human Body-on-Chips” approach is being used to investigate drug delivery and design new therapeutics in order to see how the whole body would respond to any given drug.
These chips also provide insights into interactions between bacteria and our cells, whether it be the microbiome or invading pathogens. One of the first microchips Ingber and his team designed was a human “Lung-on-a-chip”. The chip contained lung epithelial cells on one side of a membrane and blood vessel endothelial cells on the other. When bacteria were added to the lung side, immune cells in the blood began to stick to the membrane as they flowed over the top, before squeezing through pores in the membrane. The team watched immune cells cross over from the blood into the lung to attack and engulf the bacteria, essentially watching the entire inflammatory response to infection unfold in real time.
Scaling down even further from the cellular level to the molecular level, Wyss researchers, including Professor William Shih, are driving the future of nano-biotechnology by manipulating DNA. Using DNA as a building material – essentially a molecular spring – Shih folds “DNA origami”. A long strand of DNA can be programmed to fold on itself to create specific shapes, much like a single piece of origami paper can be folded into a multitude of things. These incredibly tiny structures could potentially be used to deliver drugs in the body, or as cogs in a machine for molecular manufacturing. They can also be programmable nano-robots that alter shape based on their environment, allowing them to carry drugs around the body and only deploy right at the site where the drugs are needed.
‘The only way we can [design these bio-inspired technologies] is by doing convergent research.’ The remarkable accomplishments of the researchers at the Wyss Institute are the result of collaboration and inter-disciplinary science spanning the life sciences, physical sciences, and computational sciences. From novel devices to therapeutics, the potential for biology-inspired engineering is endless. Ingber believes that ‘the problems we face in this world seem insurmountable, but if we get together, we can create amazing solutions’.
Written by Catriona Nguyen-Robertson | Science Communication Officer | Convergence Science Network
The 2013 Oration
2013 Graeme Clark Oration: Mr Geoffrey Lamb
Global Health, Economic Growth and the End of Absolute Poverty
‘Since 1980 to the last few years, the world has had a transformation; the number of absolutely or extremely poor people has declined.’ – Geoffrey Lamb.
“Absolute poverty” is defined by households/families not having sufficient income to meet basic needs of living. The “extreme poverty line” is living on $1.25 a day – people living at or below that line go to bed hungry, are likely to lose their children before they turn five, and a sick day from work can tip a family into crisis. Many of us simply cannot comprehend what it is like to live like that.
Mr Geoffrey Lamb, 2013 Graeme Clark Orator, grew up in South Africa and from a relatively early age, felt “outrage at the inequality and oppression that was suffered by the majority of people [living there]”. He devoted himself to advocate for change and for global policies that aid the plight of poor people and countries. He joined the Bill and Melinda Gates Foundation team in 2006, and is now President of Global Policy and Advocacy.
‘It is certainly not all bad news,’ says Lamb. We are winning the war against poverty.
Economic growth has reduced the proportion and number of the global population living in absolute poverty, with 700 million fewer people living in poverty than 20 years ago.
The developing world as a whole has maintained a GDP growth rate of 6% over the past decade, which is 2-6 times more GDP growth compared to developed countries – they are booming while we stay stagnant.
Back in the mid-1980s, there were almost one billion poor people in the East Asia and Pacific region, but this has come down to 280 million, which is ‘a fantastic economic success story’. Looking at southern Asia, the rate of poverty has not reduced as rapidly, despite some economic success. In Africa, fast population growth coupled with little economic success has seen poverty numbers climb. While the overall picture is looking good, it’s not good enough for Lamb.
Economic growth has driven so much improvement in the human condition, but it can only do so much. Improvements in public health also play a role in fighting poverty.
The developing world lives under the constant threat of diseases that are entirely preventable in developed countries. But things are getting better.
‘Against all odds and skepticism,’ public health success stories include the near-eradication of polio, the halving of malaria incidence, 4 million people on antiretroviral therapy (ART) for HIV, and TB treatment provided for almost 10 million people. The Global Alliance for Vaccines and Immunization (GAVI) has been critical in this transformation by delivering vaccines. Lamb estimates that 370 million more children have been vaccinated, averting an estimated 5.5 million deaths.
‘It is now possible to think of an effective end to absolute poverty.’ Lamb hopes to see the already declining numbers from over one billion people in absolute poverty to perhaps only 100 million people over the next decade or two.
Lamb calls on governments and countries to make good investments, provide employment opportunities, and contribute to a global open economy.
We need to recognise that economic growth and rising incomes alone do not resolve the precariousness and vulnerability of life in poverty. By coupling investment in education, infrastructure and economic growth with public health action, we can do more.
With these changes, ‘poor people will be pulled into the mainstream rather than pushed to the margins.’ Lamb believes that our progress to date means that it is now possible to think of an effective end to absolute poverty.
Written by Catriona Nguyen-Robertson | Science Communication Officer | Convergence Science Network
The 2012 Oration
2012: Professor Dame Linda Partridge DBE FRS FRSE
Forever Young?
‘Aging is not inevitable’ – Professor Dame Linda Partridge
“Prolongation of life” and “recovery of youth” were listed by Robert Boyle in the 1660s as top of his wish list of desirable yet most difficult to accomplish scientific achievements. Indeed, our aging has fascinated philosophers and scientists over the ages, including 2012 Graeme Clark Orator, Professor Dame Linda Partridge.
Improvements in living conditions and medical advances in the last few centuries have led to people living generally healthier lives. But now, aging is the major risk factor for the predominant killer diseases (e.g. neurodegenerative disease, cardiovascular disease and cancer). Biomedical scientists are therefore under pressure to find ways to keep people healthy as they age.
One way to do this is to target the aging process itself. But aging and aging-related diseases are complicated. Given the complexity of multiple different tissues being affected in different ways by aging, the scientific community was pessimistic that one simple intervention could deal with all of it.
The aging process seems to be an evolutionary paradox: as animals age, their mortality rises while fertility decreases, therefore lowering their evolutionary fitness. Natural selection evolves organisms for optimal survival and reproductive success, so why does it not prevent aging in the first place? As an evolutionary biologist by training, Partridge pondered this question.
Partridge outlined evidence and theories formed by notable biologists Peter Medawar, John Haldane and George Williams that explain why aging exists. Their theories suggest that aging is a trait that evolves as a side effect of mutation pressure and/or a trade-off (e.g. the cost of reproduction).
It appears that the force of natural selection – that is, how effectively it acts on survival or reproductive rate – declines with age. This could explain why deleterious genetic variants that cause late-onset disease remain highly prevalent. Haldane speculated that, since Huntington’s disease, for example, typically only affects people beyond age 30, such a disease would not have been efficiently eliminated by selection in pre-modern populations because most people would already have died well before they could experience symptoms. Thus, the disease would not have been subject to selection.
Peter Medawar then proposed that even a potentially immortal individual would inevitably die from extrinsic hazards such as accidents, predators or pathogens. As most individuals die or are killed before growing old, again, selection does not act against the aging process. In support of this theory, an animal’s intrinsic rate of aging is linked to its exposure to extrinsic hazards – the fewer the hazards (e.g. better protection from predation), the slower it ages.
It is also possible that aging is a trade-off for youth and reproduction. George Williams argued that if selection cannot counteract deleterious effects in old age, then genetic variants that have beneficial effects on fitness early in life but deleterious effects late in life could also persist.
“Perhaps the aging process itself is an accumulation of late-onset genetic mutations that have not been weeded out,” Partridge says.
Because aging has a complex genetic and evolutionary basis, it is difficult to intervene medically. But there is some emerging evidence that provides hope.
Studies performed by Partridge and her collaborators in various animal models revealed a link between defects in the insulin receptor-signalling pathway and an animal’s longevity. Surprisingly, while animals unresponsive to insulin might be expected to develop diabetes, they in fact tended to be healthier overall and protected from osteoporosis, cataracts, and other diseases. This was evidence that the loss of a single gene could produce a diverse array of improvements when it comes to aging.
While it is much harder to study the effects of genetically modifying the insulin-receptor signalling pathway in humans, studies have recruited people in their 90s and 100s to assess whether they have genetic variants in common. Centenarians don’t necessarily live healthier lifestyles but do appear to share variations in a gene that feeds into the insulin-signalling pathway.
This insulin-signalling pathway senses nutrients in the body and controls metabolically costly processes (e.g. growth) to avoid starvation. Indeed, restricting the diet of mice also resulted in longer life and improved health during aging, an effect that has since been seen in other animals.
Partridge is now also investigating whether the same effects can be achieved pharmacologically (i.e. without having to completely restrict our diets). Her team tested the effects of rapamycin, a drug currently used as an immunosuppressant in organ transplantation, on reducing the incidence of aging-related disease. As this drug targets the same pathway, the same beneficial results were observed. Her team is now investigating the safety and efficacy of using drugs to pharmacologically target the insulin-signalling pathway to improve overall health as we age.
It therefore may be eventually possible to protect against diseases of aging by preventing the process of aging itself. Cancer, cardiovascular disease, Alzheimer’s disease, etc. are all usually studied in isolation, but this offers an alternative strategy to capture several diseases at once with prevention. Partridge therefore continues working towards helping us stay forever young.
Written by Catriona Nguyen-Robertson | Science Communication Officer | Convergence Science Network
The 2011 Oration
2011 Graeme Clark Oration: Professor Terrence Sejnowski
The Computational Brain
The brain is at the heart of everything we do. Imagine what we could achieve if we could harness our unique intelligence in machines.
The world’s largest supercomputer costs over $100 million and consumes 4 megawatts to run – equivalent to a small power plant. Yet our brain, with its roughly 100 billion neurons and their connections, can simply make do with a snack. There is clearly a disconnect between what we can build and what nature can do.
Professor Terrence Sejnowski, 2011 Graeme Clark Orator, wants to close this gap. He is the Francis Crick Professor at the Salk Institute for Biological Studies, where he uses technology to mimic cognitive function by combining computational science with neurobiology.
Artificial intelligence (AI) was one of the first attempts to mechanise thought, with origins dating back to the first digital computers. Computers could be programmed to play chess, use deductive logic, and compute arithmetic. With concurrent advances in neurobiology, information theory and cybernetics, researchers began considering the possibility of building an electronic brain. The challenge that hindered the breakthrough of AI until more recently was the difficulty of programming simple common sense.
With any given problem, all possibilities have to be considered. But it is difficult to ask a computer to take uncertainties into account. Teaching computers logic stems back to Boole’s Laws of Thought (1854), which have become the Laws of Boolean Algebra. These reduce the number of logic gates needed to perform operations, reducing everything down to a logic “0” (false) and a logic “1” (true) – and so the binary system was born. But how do you program all possibilities with a binary code? Sejnowski looked to the human brain for answers.
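For illustration, that reduction of logic to 0s and 1s takes only a few lines. The absorption law checked below is one of the Boolean identities that lets an equivalent circuit shed gates:

```python
# Boolean logic as 0s and 1s: each gate is a function of binary
# inputs, and Boole's laws let equivalent circuits be simplified.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

# Absorption law: a OR (a AND b) == a, so the right-hand side of the
# expression needs no gates at all. Checking the full truth table
# confirms the two circuits are equivalent.
for a in (0, 1):
    for b in (0, 1):
        assert OR(a, AND(a, b)) == a

# NAND alone is "universal": NOT (and hence any gate) can be built
# from it by feeding the same input to both legs.
def NAND(a, b): return NOT(AND(a, b))
assert all(NOT(a) == NAND(a, a) for a in (0, 1))
print("Boolean identities hold")
```

Gate-count reductions like this are exactly what makes binary hardware tractable; the open question Sejnowski raises is how such rigid 0/1 logic can accommodate uncertainty.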
As we grow up and learn new things in school, our brains create new neural circuits. Neurons are numerous in the brain but are slow compared to the processing power of a digital computer. It takes milliseconds for a signal to jump from one neuron to the next, while computers can perform millions of computations in that time. The difference is that the brain can run multiple neural circuits in parallel. ‘The sheer quantity of the billions of cells – and exponentially more routes that a signal can take as it zips through the brain – [contain all the information necessary to form a thought]’.
Sejnowski used computer-modelling techniques to encapsulate what we know about the brain and test hypotheses on how brain cells process, sort and store information.
He first designed NETtalk, an artificial neural network that learned to pronounce written English text. English pronunciation is irregular and complicated (e.g. a “c” can be hard, as in “cat”, or soft, as in “city”), yet we often know how to pronounce words based on our vast experience over many years of learning. Letter sequences comprising words were shown to the NETtalk neural network during training, and it received feedback from a “teacher”, based on which it forged new neural connections and strengthened existing ones. Learning to read involves a complex mechanism spanning many parts of the brain, which is why it was essential for the neural network to continually adapt itself. In one day, the artificial neural network progressed from pronouncing unintelligible babble to properly reading and enunciating words.
‘This astonished my lab and experts. We were told that we couldn’t solve this problem because linguists have written entire books on this difficult problem,’ says Sejnowski.
Thus, it was neural networks instead of digital programming that appeared to be the way forward for Sejnowski to account for both regularities and irregularities in learning. Machine learning algorithms are powerful tools with many applications, especially if the artificial neural networks can continually improve themselves based on their experiences.
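The error-driven learning loop behind such networks can be sketched at toy scale. This is a hedged illustration, far simpler than NETtalk itself: a single trainable unit, with one connection per letter, learns the hard/soft “c” distinction from teacher feedback. The training words and encoding are invented for the example.

```python
# A toy, NETtalk-flavoured sketch: one "neuron" learns from teacher
# feedback whether a "c" is hard or soft based on the letter that
# follows it, strengthening connections after each mistake.

LETTERS = "abcdefghijklmnopqrstuvwxyz"
# Tiny training set with the correct pronunciation of each word's "c".
DATA = [("cat", "hard"), ("cot", "hard"), ("cup", "hard"), ("club", "hard"),
        ("city", "soft"), ("cell", "soft"), ("cycle", "soft"), ("cent", "soft")]

def features(word):
    """One-hot encode the letter after the 'c' (the network's input)."""
    nxt = word[word.index("c") + 1]
    return [1.0 if c == nxt else 0.0 for c in LETTERS]

weights = [0.0] * len(LETTERS)

def predict(x):
    return "soft" if sum(w * xi for w, xi in zip(weights, x)) > 0 else "hard"

# Perceptron-style training: the "teacher" supplies the right answer,
# and each error nudges the responsible connection weights.
for _ in range(20):
    for word, target in DATA:
        x = features(word)
        if predict(x) != target:
            step = 1.0 if target == "soft" else -1.0
            weights = [w + step * xi for w, xi in zip(weights, x)]

print(predict(features("cold")))    # hard
print(predict(features("circle")))  # soft
```

After training, the unit generalises to words it never saw ("cold", "circle") because the regularity lives in the learned connection weights rather than in hand-written rules, which is the point Sejnowski makes against purely programmed approaches.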
Sejnowski next designed an artificial neural network to solve the “cocktail party problem” – the separation of noise sources (e.g. distinguishing background chatter and music from a conversation). His team developed an AI with flexible neural connections to separate mixtures of sounds and recover clear output sources. The algorithm successfully separated voice from music so that each could be heard clearly and individually. This could then be scaled up to 100 or even 1000 sound sources, which is ‘way beyond human capability’.
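The underlying model can be illustrated in a few lines. This is a hedged sketch of the linear mixing model only, not Sejnowski’s algorithm: the mixing matrix `A` is known here, so unmixing is simple matrix inversion, whereas the achievement of blind source separation is estimating the unmixing from the recorded mixtures alone.

```python
import numpy as np

# The "cocktail party" setup as a linear mixing model: two microphones
# each record a different blend of two sources. With A known, unmixing
# is just inversion; a blind-separation algorithm must infer it.

t = np.linspace(0, 1, 1000)
voice = np.sign(np.sin(7 * 2 * np.pi * t))   # stand-in for speech
music = np.sin(3 * 2 * np.pi * t)            # stand-in for music
S = np.vstack([voice, music])                # sources (2 x N)

A = np.array([[1.0, 0.6],                    # mixing matrix: each mic
              [0.4, 1.0]])                   # hears both sources
X = A @ S                                    # what the mics record

recovered = np.linalg.inv(A) @ X             # perfect unmixing
print(np.allclose(recovered, S))             # True
```

Each microphone signal is a weighted sum of both sources, so neither recording alone contains a clean voice or clean music track; separation is only possible because the two mixtures weight the sources differently.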
Accompanying the advancement of machine learning, there has been a rise in the number of robots that are more human-like in appearance and behaviour. Not only do researchers consider the mechanics and computational aspects of robotics, but they are also starting to consider the psychology and education of a “robot brain”. To this end, researchers are comparing robots to a developing child’s brain – and what better way to do that than to bring a robot into a pre-school classroom. Cognitive scientist Javier Movellan, at the University of California, San Diego, created RUBI, a social robot, and let it play with children. RUBI quickly learned that it had to stop children from ripping its arms off by saying “OUCH!” and crying when the pressure reached a certain point. It also learned that its reaction times were important when interacting with children to play games and create dialogue, so as not to be either unnaturally fast or boringly slow. This study demonstrated the importance of robots developing natural human expressions and learning to recognise them too.
Computer science is moving towards powering nearly everything. Sejnowski believes that we will soon be able to build computer chips and systems that allow us to interact with all the information available in the world. Nanoscience will also be critical in making things smaller, especially as energy is often the limiting factor. We cannot afford supercomputers for everything: housing artificial neural networks while incorporating knowledge of neurobiology and psychology alongside the computational side demands something leaner. The next challenge is therefore to design hardware that can run artificial neural networks cheaply. ‘This is all within grasp – it is not merely science fiction.’
Written by Catriona Nguyen-Robertson | Science Communication Officer | Convergence Science Network
The 2010 Oration
2010 Graeme Clark Oration: Dr J Craig Venter
From Reading to Writing the Genetic Code
Craig Venter’s mission is to understand more about the biological world and use the power of genomics to develop unique insights into disease, health and the environment.
It took Venter ten years to isolate a single neurotransmitter receptor protein as a researcher at the National Institutes of Health. During this time, his group developed the first automated sequencing method in 1987. Shortly after, the first discussions around the Human Genome Project took place. The notion that it might be possible to uncover every human protein with a single project, even if it took 10–15 years, seemed like a ‘fantastic step forward in science’.
Excited at the prospect, Venter’s lab did some initial sequencing to determine whether it was indeed possible to sequence human chromosomes. They found that it was, but the challenge was putting the pieces of the puzzle together: DNA was sequenced in random, short fragments that had to be fit together to create the entire picture. Except, unlike a normal jigsaw puzzle, there was no reference picture to work from.
To this end, Venter developed the shotgun sequencing method and completed the first draft of the human genome in parallel with the Human Genome Project. He then wanted to perform genomic sequencing in a more complex environment: the oceans. Micro-organisms comprise over half of the Earth’s biomass. In every millilitre of seawater, there are one million bacteria and ten million viruses. Venter first sampled 200 L of water from the Sargasso Sea and, much to the scientific community’s surprise, identified 40,000 new species and 1.3 million new genes.
‘This doubled the number of known genes in a single barrel of seawater.’
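The jigsaw-without-a-picture problem that shotgun sequencing poses can be illustrated with a toy greedy assembler. This is an illustration only: real assemblers cope with sequencing errors, repeats, and millions of fragments, and the fragments below are invented for the example.

```python
# Toy sketch of shotgun assembly: the genome is read as short random
# fragments, and the sequence is reconstructed by repeatedly merging
# the pair of fragments with the largest overlap.

def overlap(a, b):
    """Length of the longest suffix of `a` that is a prefix of `b`."""
    for n in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:n]):
            return n
    return 0

def assemble(fragments):
    frags = list(fragments)
    while len(frags) > 1:
        # Find the pair with the largest overlap and merge them.
        n, a, b = max(((overlap(a, b), a, b)
                       for a in frags for b in frags if a is not b),
                      key=lambda t: t[0])
        frags.remove(a)
        frags.remove(b)
        frags.append(a + b[n:])
    return frags[0]

# Three overlapping reads of the (made-up) sequence ATGGCGTGCA:
print(assemble(["ATGGCG", "GCGTGC", "TGCA"]))  # ATGGCGTGCA
```

The greedy merge works here because every adjacent pair of reads overlaps unambiguously; with repeated regions, several reconstructions become consistent with the same fragments, which is one reason genome assembly at real scale is hard.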
Biodiversity in the ocean had previously been largely ignored. It had been expected that without many nutrients in the Sargasso Sea, there would not be much life. Venter found that nearly every organism towards the surface of the water had photoreceptors to receive energy directly from the sun. He then circumnavigated the world in a yacht to sample different oceanic environments and could associate GPS coordinates with each DNA sequence. In the genes encoding the photoreceptors, there were single-letter mutations in the DNA sequence that determined the wavelengths of light absorbed. Venter discovered a correlation between geography and the DNA code: organisms found in the deep indigo-blue oceans encoded a receptor that absorbs blue light, while organisms in coastal waters encoded receptors that absorb green light.
With this approach, Venter realised that there is more genetic variation among microbes than initially realised. The 16S ribosomal RNA (16S rRNA) gene, which encodes part of the bacterial molecular machinery, was the initial molecular marker for identification of bacterial species. However, by sequencing entire genomes, Venter found that what people thought were single species due to the presence of the same 16S rRNA sequence were actually separate major taxa with 40-50% genetic variation between them.
‘[Humans] are only 10% different from all other mammals, so 40-50% variation between bacteria is a huge amount.’
Having successfully read the genetic code and digitised the four letters of DNA (G, T, A, C) into a binary code of 0s and 1s, Venter next wanted to write DNA himself. For 15 years he had asked whether it was possible to make a synthetic chromosome and ‘boot it up in a cell’.
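The digitisation step is straightforward: with only four letters, each DNA base fits in two bits. The particular bit assignment below is an arbitrary illustration, not a standard sequence file format.

```python
# Digitising DNA: each of the four bases maps to a two-bit code, so a
# genome really can be stored and manipulated as 0s and 1s.

TO_BITS = {"A": "00", "C": "01", "G": "10", "T": "11"}
FROM_BITS = {bits: base for base, bits in TO_BITS.items()}

def encode(dna):
    """DNA string -> bit string, two bits per base."""
    return "".join(TO_BITS[base] for base in dna)

def decode(bits):
    """Bit string -> DNA string (inverse of encode)."""
    return "".join(FROM_BITS[bits[i:i + 2]] for i in range(0, len(bits), 2))

seq = "GATTACA"
bits = encode(seq)
print(bits)                 # 10001111000100
print(decode(bits) == seq)  # True
```

Because the mapping is invertible, “reading” and “writing” become symmetric operations on the digital side – the asymmetry Venter confronted is entirely in the chemistry of synthesising the physical molecule back from the bits.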
Venter first trialled constructing the DNA of a bacteriophage (φX174) – a virus that infects bacteria – out of synthetic DNA strands. Upon inserting the synthetic DNA into a bacterial cell, the DNA was read by the cell to form viral proteins that self-assembled into viral particles. ‘The software was beginning to build its own hardware.’ The next goal was to build an entire bacterial chromosome, which is considerably larger. As the larger amount of DNA had to be pieced together bit by bit across multiple steps, he developed a method using both bacteria and yeast.
With 20 million genes discovered to date, Venter sees them all as ‘the design components of the future’. His idea is that we can engineer cells using these genetic components to do exactly what we want them to do.
One application for his synthetic microorganisms is for the production of alternative fuels. Fossil fuels still represent more than 80% of energy sources, and energy consumption and carbon emissions are expected to double by 2030 with no change. “We don’t have enough fuel, food, clean water and medicine for the 6.8 billion people that are now here”.
Venter utilises genes that naturally exist to build microbes that can convert carbon dioxide into alcohols and other biofuels. Taking genes from algae and archaea, which can produce lipids (fatty acids) and methanol from carbon dioxide respectively, he is currently scaling up his research with ExxonMobil and BP. He would ideally like to see power plants using synthetic microbes with these genetic components to produce energy while simultaneously removing the accumulated greenhouse gases from the atmosphere.
From making headlines for sequencing the first draft human genome in 2001, to mapping the ocean’s biodiversity from 2003, to now creating the first synthetic lifeforms that produce alternative, sustainable fuels, Craig Venter has a remarkable list of accomplishments.
“We’re only limited by our imaginations now - starting with this genetic code and writing new software.”
Written by Catriona Nguyen-Robertson | Science Communication Officer | Convergence Science Network
The 2008 Oration
2008 Graeme Clark Oration: Professor Graeme Clark AC
A Partnership in Research Leading to the Bionic Ear and Beyond
“I started my journey as a five-year old. My dad had a severe hearing loss and when my primary school teacher asked the kids in her class what they wanted to do, I said to her that I wanted to fix ears 'cause I wanted to help someone like my dad” – Graeme Clark AC
The inaugural Graeme Clark Oration celebrated Clark’s pioneering convergent science: he brought the medical sciences, physical sciences, and information technology together to develop the bionic ear. Clark started his research nearly 40 years ago to electrically stimulate the brain to restore hearing in deaf people, and ever since, there have been major advances in all science disciplines and a great need to form partnerships between them, transforming the biomedical engineering space.
The development of the bionic ear took many years of research and experiments: understanding the brain and how it processes sound, developing surgical techniques and tools, and finally, finding a path from hearing basic noise to perceiving meaningful sound and speech. A large team of researchers, surgeons, and audiologists from different backgrounds was involved over those years to make it possible.
Clark’s journey was one of constant learning and building on his skillset. He initially studied medicine and became a Fellow of the Royal Australian College of Surgeons, specialising in ear, nose and throat surgery. He then completed his doctoral studies on the middle ear and neural mechanisms in hearing while concurrently completing a Masters of Surgery. He then travelled to England to learn more about speech science, as this was the final piece of the puzzle: converting complicated speech signals into electrical stimulation of the auditory nerve.
Neurons in the brain and nervous system communicate via electric and chemical signals. Clark hypothesised that hearing might be restored in deaf people if the damaged part of the ear were bypassed, and the auditory nerve was electrically stimulated directly to reproduce sound. He initially investigated the effect of electrical stimulation on single cells and groups of cells in the auditory brainstem response, where sound frequencies are first decoded in the brain, which revealed promising results.
Clark hit a roadblock when he had difficulty identifying a way to place an electrode bundle in the spiral-shaped cochlea to directly stimulate the speech frequency region without causing further tissue damage. He had a breakthrough during a vacation at a beach, however, when he conceptualised using a seashell to replicate the human cochlea and electrodes similar to grass blades (flexible at the tip but stiffer towards the roots).
In 1974, Clark set out in partnership with key members of his research team, Ian Forster and Jim Patrick, to develop the most complex package of electronics ever implanted in a patient. The device contained electrodes to stimulate nerve fibres, packaged on a silicon chip and sealed in a transmitting container based on the same technology used to communicate with the astronauts on the Moon.
Clark and Dr Brian Pyman performed the first multi-channel cochlear implant operation at the Royal Victorian Eye and Ear Hospital in 1978. Electrical stimulation of the brain was able to reproduce sound frequency and intensity as pitch and loudness, but stimulation at low or high rates was poorly discriminated – the timing of electrical stimulation was therefore important to convert sound into recognisable speech. Clark and his team had to adjust the rate and place of stimulation, using temporo-spatial patterns to develop a speech coding strategy. Within a year of the first, a second patient was implanted, and clinical trials proceeded rapidly, leading the US Food and Drug Administration to approve cochlear implants for adults.
This was all very well for adults who originally had hearing before going deaf, but Clark wanted to help children too. Clark established a research-industry partnership with Cochlear Limited to develop an electrode system suitable for children. The first two children were implanted in 1985 and 1986 at ages 10 and 5 respectively. By 1990, children around the world were able to hear, which was a remarkable leap forward in their development of spoken language.
Following Clark’s delivery of the Oration, Sophie Li, a cochlear implant recipient, delivered a personal and moving account of how the implant has changed her life. At age 5, her speech and perception skills were at the level of a 2-year-old child. But with hard work and a cochlear implant, her speech and language improved. She is now bilingual in English and Chinese, and also received the Victorian Premier’s Award for dance in Year 12 – something that would not have been possible had she not been able to hear music.
The bionic ear was only the beginning of a boom in the field of bioengineering. There are many new applications for implantable electrodes that electrically stimulate the nervous system. We see medicine and the physical sciences coming together in the development of bionic eyes, treatments to repair nerves and the spinal cord, treatments for drug-resistant epilepsy, and more. Clark has therefore paved the way for interdisciplinary science that will change so many people’s lives.
“This invention has given people worldwide a priceless gift – the gift of hearing. It will continue to change the lives of people around the world, because it sure has changed mine.” – Sophie Li.
Written by Catriona Nguyen-Robertson | Science Communication Officer | Convergence Science Network