High performance and distributed computing in biomedical research
Prof. Peter Coveney
Centre for Computational Science, Department of Chemistry
University College London
I will discuss the rapidly evolving scope for the application of computational methods within biomedicine and the opportunities that now exist for effective forms of translational research based on the use of high performance computing techniques. Patient-specific, or personalised, healthcare will become a dominant theme in twenty-first-century medicine; its success depends centrally on the availability of patient data, such as that derived from various imaging modalities and genotypic analyses, which can be combined with predictive, high-fidelity modelling and simulation to furnish unique information in support of clinical decision making. I will illustrate this vision with examples drawn from cerebrovascular pathologies (such as stroke and aneurysms) and from HIV/AIDS as a paradigm of infectious disease. These approaches, adumbrated by, for example, the Virtual Physiological Human initiative currently running within the European Union, depend on the massive computing power now becoming available at the petascale, and on the advent of desktop machines with many hundreds or thousands of cores. Computational biomedicine has the potential to effect a revolution in healthcare provision, in which individuals take control of their own health by managing their electronic data in conjunction with medical experts. To make such a scenario a reality, however, national and international IT infrastructure needs to be far more joined up and sustainable than is currently the case: we need to be able to rely on distributed environments in which data storage systems are connected to computing resources of every kind, from the lowest to the highest end, through very fast networks offering a high quality of service, while observing stringent requirements concerning ethics, confidentiality and privacy.