CITIC

HIGH PERFORMANCE COMPUTING

High Performance Computing (HPC) is an essential tool for processing the large data sets needed to understand and meet social, scientific and industrial challenges across a wide range of fields, including personalised medicine, weather forecasting, cyberattack detection, design of new materials, and industrial simulation.

The growth of HPC is driven by the increasing demand for large-scale computational resources in areas such as Artificial Intelligence and Data Science.

RESEARCH AREAS AND PRIORITIES:

HPC software: Software that uses supercomputers to solve complex technological problems involving very large numbers of calculations.

Research areas:

    • Application-specific HPC architectures using FPGAs
    • Energy and performance optimization for emerging HPC architectures
    • Code auto-tuning tools for heterogeneous systems
    • Fault tolerance in exascale systems
    • Visualization of massive data and computer-generated graphics
    • Acceleration of genomics algorithms in HPC systems
    • Code acceleration for machine learning in HPC systems
    • Harnessing HPC and cloud infrastructures to tackle large-scale optimization problems in computational biology
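
As a purely illustrative sketch of the kind of parallel workload this software targets, the short Python program below splits a large numerical integration (an estimate of pi) across several worker processes and combines the partial results. The problem, names and process count are assumptions made for the example and do not come from any CITIC project; production HPC codes would typically use technologies such as MPI or OpenMP across many nodes, but the decompose-and-combine pattern is the same.

    # Illustrative HPC-style workload: a large numerical integration split
    # across worker processes (an assumption for this sketch, not CITIC code).
    from multiprocessing import Pool
    import math

    N = 10_000_000   # number of subintervals of [0, 1]
    WORKERS = 4      # parallel worker processes

    def partial_pi(bounds):
        """Midpoint-rule integration of 4 / (1 + x^2) over one slice of [0, 1]."""
        start, end = bounds
        h = 1.0 / N
        return h * sum(4.0 / (1.0 + ((i + 0.5) * h) ** 2) for i in range(start, end))

    if __name__ == "__main__":
        # One contiguous chunk of the index range [0, N) per worker.
        chunks = [(w * N // WORKERS, (w + 1) * N // WORKERS) for w in range(WORKERS)]
        with Pool(WORKERS) as pool:
            pi_estimate = sum(pool.map(partial_pi, chunks))
        print(f"pi estimate = {pi_estimate:.10f} (error {abs(pi_estimate - math.pi):.2e})")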


Big Data infrastructures: Infrastructures that enable the collection, storage and analysis of data sets large enough to require Big Data analytics.

Research areas:

    • Characterization and optimization of Big Data processing in HPC systems
    • Visualization of massive data and graphics
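
As a rough, self-contained sketch of the map-reduce pattern behind much Big Data processing, the Python program below counts event types within parallel partitions and then merges the partial counts into a global aggregate. The synthetic data and all names are invented for illustration; no specific CITIC infrastructure or Big Data framework is implied.

    # Illustrative map-reduce style aggregation: the "map" step counts events
    # within each partition in parallel and the "reduce" step merges the
    # partial counts. Synthetic data; no specific framework is implied.
    from collections import Counter
    from multiprocessing import Pool
    import random

    def map_partition(partition):
        """Map step: per-partition event counts."""
        return Counter(partition)

    def split(data, parts):
        """Split a list into roughly equal contiguous partitions."""
        step = (len(data) + parts - 1) // parts
        return [data[i:i + step] for i in range(0, len(data), step)]

    if __name__ == "__main__":
        # Synthetic event "log"; a real infrastructure would read from
        # distributed storage instead of generating data in memory.
        random.seed(0)
        events = [random.choice(("login", "search", "purchase", "error"))
                  for _ in range(1_000_000)]

        with Pool(4) as pool:
            partial = pool.map(map_partition, split(events, 4))

        # Reduce step: merge per-partition counters into a global aggregate.
        totals = sum(partial, Counter())
        print(totals.most_common())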


Cloud computing: The delivery of computing services over a network, typically the internet.

Research areas:

    • Harnessing HPC and cloud infrastructures to tackle large-scale optimization problems in computational biology
    • Characterization and optimization of Big Data processing in HPC systems
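
As a minimal sketch of the idea of delivering computation as a service over a network, the self-contained Python program below exposes a tiny HTTP endpoint that performs a calculation and then calls it as a remote client would. The endpoint, parameters and port handling are assumptions made for the example; this is not a real cloud API.

    # Illustrative "computation as a network service": a tiny HTTP endpoint
    # that performs a calculation for remote clients. Endpoint and parameter
    # names are invented for this sketch; no real cloud API is implied.
    import json
    import threading
    import urllib.request
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import parse_qs, urlparse

    class ComputeHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Handle requests such as /square?x=12 and reply with JSON.
            query = parse_qs(urlparse(self.path).query)
            x = float(query.get("x", ["0"])[0])
            body = json.dumps({"input": x, "square": x * x}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

        def log_message(self, *args):
            pass  # silence per-request logging for the demo

    if __name__ == "__main__":
        # Bind to an ephemeral port and serve in a background thread.
        server = HTTPServer(("127.0.0.1", 0), ComputeHandler)
        host, port = server.server_address
        threading.Thread(target=server.serve_forever, daemon=True).start()

        # A client elsewhere on the network asks the service to do the work.
        with urllib.request.urlopen(f"http://{host}:{port}/square?x=12") as resp:
            print(json.loads(resp.read()))  # {'input': 12.0, 'square': 144.0}

        server.shutdown()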