Description
Much scientific research and innovation requires a wide
range of data-intensive computations, including high-performance
computing (HPC), to be run as part of a complex workflow,
for example with different steps for data access, data
acquisition, data transfer, data processing, and
compute-intensive simulation. To simplify the process for
the user, we can orchestrate these steps using a workflow
manager and provide seamless access to remote resources
using Audited Credential Delegation (ACD) and the Application
Hosting Environment (AHE); a schematic sketch of such an
orchestrated workflow appears at the end of this
description. This talk will outline several
biomedical applications our group has been working on to
enable better medical/clinical treatments which draw, inter
alia, on HPC. These include patient-specific HIV drug
therapy, personalized treatment of brain aneurysms, and
patient-specific cancer therapy. In this talk I will
describe the e-Science techniques used in each project and
will make a case for an integrated computational
infrastructure (data storage, networks and computational
resources) to ensure the successful development of future
biomedical applications. I will also provide an overview of the
developments at the EU level to further computational
biomedicine through the FP7 Virtual Physiological Human
(VPH) Initiative. VPH projects in which we are involved
include VPH-SHARE, which aims to provide a robust
cloud-based infrastructure for translational biomedical research
and clinical usage, and p-medicine, where we process large
amounts of federated medical data from different sources to
provide personalized clinical decision support. Such scenarios
require a heterogeneous computational infrastructure,
comprising secure resources ranging from the desktop (within
a hospital) to the very largest supercomputers available
nationally and internationally, used in new and disruptive
ways, for example to deliver urgent results on demand.
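
To make the orchestration pattern concrete, the following is a
minimal sketch in Python of a linear workflow of the kind
described above: a sequence of steps (data acquisition, staging
to an HPC resource, simulation) driven by a simple runner. The
step names, the shared-context design, and the run_workflow
helper are illustrative assumptions for this sketch; they are
not the interfaces of the actual workflow manager, ACD, or AHE.

```python
# A minimal, hypothetical sketch of workflow orchestration.
# All names here are illustrative; they do not reflect the real
# workflow manager or the Application Hosting Environment (AHE) API.

from typing import Callable, Dict, List

def acquire_data(ctx: Dict) -> Dict:
    """Fetch input data (e.g. patient imaging) from a remote archive."""
    ctx["dataset"] = "patient_scan.dat"  # placeholder for a real transfer
    return ctx

def transfer_to_hpc(ctx: Dict) -> Dict:
    """Stage the dataset onto the HPC resource."""
    ctx["staged_path"] = f"/scratch/{ctx['dataset']}"
    return ctx

def run_simulation(ctx: Dict) -> Dict:
    """Submit the compute-intensive simulation and record the result."""
    ctx["result"] = f"simulated({ctx['staged_path']})"
    return ctx

def run_workflow(steps: List[Callable[[Dict], Dict]]) -> Dict:
    """Run each step in order, passing a shared context between them."""
    ctx: Dict = {}
    for step in steps:
        print(f"running step: {step.__name__}")
        ctx = step(ctx)
    return ctx

if __name__ == "__main__":
    final = run_workflow([acquire_data, transfer_to_hpc, run_simulation])
    print(final["result"])
```

In a real deployment, each step would instead call out to the
workflow manager and to remotely hosted services, with ACD
handling credential delegation transparently for the user.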