Research degree opportunities in Computing Science and Mathematics

A PhD in computing or mathematics can be the first step into an academic career and a passport to some of the most interesting technology jobs in the world. We are welcoming students for study towards a PhD or MPhil degree in data science, artificial intelligence, mathematics, biological modelling and other areas of computer science.

We have a limited number of funded places each year, and these are advertised on FindAPhD.

We also welcome students from the UK and abroad who have their own funding or who wish to develop a proposal to apply for a scholarship. We will help you develop your research question and your proposal with a view to your studying at Stirling. You may have your own ideas for a research question, and we would be happy to help you shape them into a high-quality PhD proposal. Alternatively, you may find one of our existing research projects the perfect fit for your own interests. We also offer a professional doctorate programme, in which you can work on a project for your employer (who covers the costs) and earn a PhD at the same time.

The list below describes some PhD opportunities that are available right now. If you have a scholarship opportunity or private funding, please contact the supervisor listed for the project that interests you.

Computing Science PhD opportunities

Topic: Artificial Intelligence Sight Loss Assistant

Supervisor: Dr Kevin Swingler

The Artificial Intelligence Sight Loss Assistant (AISLA) project aims to use state-of-the-art computer vision and artificial intelligence to develop personal assistant technology for people with sight loss. Topics within the project include computer vision, natural language processing and human-AI interfaces. A PhD in AI and computer vision can lead to an academic career or to industry roles such as building self-driving cars, designing digital assistants, or working in security. Companies like Google, Amazon and Facebook are at the forefront of commercial AI.

Topic: Efficient search techniques for large-scale global optimisation problems in the real world

Supervisor: Dr Sandy Brownlee

Optimisation problems become really difficult once they are "large-scale": allocating thousands of skilled engineers to jobs, or prioritising where to spend public money on improving the energy efficiency of thousands of homes. This project will look at how to learn the structure of these problems, allowing us to intelligently divide them up so they can be solved efficiently.
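As a small flavour of the idea, the sketch below (Python; the toy objective and the perturbation-based interaction test are illustrative, not taken from any specific project) detects which variables of an objective function interact, so that independent groups can then be optimised separately:

```python
# Hypothetical large-scale objective made of two independent sub-problems,
# over variables (0,1) and (2,3). In practice the grouping is unknown;
# here we recover it with a differential-grouping-style perturbation test.
def objective(x):
    return (x[0] - x[1]) ** 2 + (x[2] + x[3] - 5) ** 2

def interacts(f, i, j, base, delta=1.0):
    """Do variables i and j interact? Compare the effect of bumping i
    with and without j also bumped; they differ only if i, j interact."""
    def bump(x, k):
        y = list(x)
        y[k] += delta
        return y
    d1 = f(bump(base, i)) - f(base)
    d2 = f(bump(bump(base, j), i)) - f(bump(base, j))
    return abs(d1 - d2) > 1e-9

base = [0.0, 0.0, 0.0, 0.0]
groups = []
for i in range(4):
    for g in groups:
        if any(interacts(objective, i, j, base) for j in g):
            g.append(i)
            break
    else:
        groups.append([i])
print(groups)  # interacting variables end up in the same group
```

Each recovered group can then be handed to a separate optimiser, which is far cheaper than searching the full joint space.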

Topic: Search-based software improvement for green computing

Supervisor: Dr Sandy Brownlee

Reducing computational energy consumption is important at the extremes (i.e., mobile devices and datacentres), and in many cases there is a trade-off between functionality and energy consumption. Yet improving existing code is difficult, because it is easy to break functionality and there is a lot of noise when we measure energy. This project will explore how search-based approaches like genetic algorithms can be used to improve the efficiency of code while accounting for these difficulties: saving the planet and putting off your next recharge.

Topic: Understanding and Visualising the Landscape of Multi-objective Optimisation Problems

Supervisor: Prof. Gabriela Ochoa

In commerce, industry and science, optimisation is a cross-cutting, ubiquitous activity. Optimisation problems arise in real-world situations where resources are constrained and multiple criteria are required or desired, such as in logistics, manufacturing, transportation, energy, healthcare, food production and biotechnology. Most real-world optimisation problems are inherently multi-objective. For example, when evaluating potential solutions, cost or price is one of the main criteria, and some measure of quality is another, often in conflict with the cost. The analysis of multi-objective optimisation landscapes is thus of paramount importance, yet it is not well developed. This project will look at developing and applying network-based models of fitness landscapes and search trajectories to multi-objective optimisation problems. The ultimate goal is to provide a better understanding of algorithms and problems, and to demonstrate that better knowledge leads to better optimisation across a number of domains.

Topic: Machine Learning approaches to tackle Cyber Attacks

Supervisor: Dr Mario Kolberg

The range of internet services has increased dramatically in recent years; at the same time, cyber-attacks have grown in both number and sophistication, endangering user trust in and uptake of such services. There is therefore a need for researchers to develop solutions to these attacks, which keep evolving as attackers change their approaches.

Security measures such as firewalls are put in place as the first line of network defence, but attackers are still able to exploit vulnerabilities in these networks. Intrusion Detection Systems (IDS) have shown potential as a successful countermeasure against such attacks. However, there are still many open issues, such as their efficiency and effectiveness in the presence of large amounts of network traffic. Several IDS have been proposed that can differentiate between attacks and benign network traffic and raise an alarm when a potential threat is detected. However, these systems must be able to analyse large quantities of data in real time to be applicable in modern networks. Unfortunately, the larger the volume of data, the more irrelevant information it contains. One solution may be to extract key features and apply Machine Learning (ML) techniques to detect attacks. This project will investigate using ML approaches to detect intrusion attacks at runtime.
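To illustrate the feature-extraction-plus-ML idea at toy scale, the sketch below learns a single-feature threshold separating "attack" from "benign" flows. The flow records, feature names and the decision-stump "model" are all invented for illustration; real IDS research would use far richer features and learners:

```python
# Toy flow records: (packets_per_second, distinct_ports), label 1 = attack.
train = [
    ((5, 2), 0), ((8, 3), 0), ((12, 4), 0),
    ((300, 80), 1), ((450, 120), 1), ((280, 60), 1),
]

def best_stump(data, feature):
    """Pick the threshold on one feature that minimises training errors."""
    values = sorted(x[feature] for x, _ in data)
    candidates = [(a + b) / 2 for a, b in zip(values, values[1:])]
    def errors(t):
        return sum((x[feature] > t) != bool(y) for x, y in data)
    return min(candidates, key=errors)

threshold = best_stump(train, feature=0)

def classify(flow):
    # Flag any flow whose packet rate exceeds the learned threshold.
    return int(flow[0] > threshold)

print(classify((400, 90)))  # high-rate flow flagged as attack -> 1
print(classify((6, 2)))     # normal traffic -> 0
```

The same pipeline shape (extract features, fit a model, classify at runtime) scales up to the real problem, where the challenge is doing it accurately on high-volume live traffic.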

Topic: Bio-inspired Peer-to-Peer Overlay algorithms

Supervisor: Dr Mario Kolberg

Peer-to-Peer (P2P) overlay networks are self-organising, self-managing, and hugely scalable networks that need no centralised server component. Drawing inspiration from biological processes to construct and maintain P2P overlays has attracted some research interest to date. The majority of related solutions focus on providing efficient resource discovery mechanisms using swarm intelligence techniques. Such techniques have proven performance benefits for routing and scheduling in dynamic networks, and they also have inherent support for adaptability and robustness in the face of node failures. By contrast, with very few exceptions, such techniques have not been exploited for topology management. This project will investigate the use of bio-inspired solutions for topology management, addressing some of the techniques' challenges such as relatively high computational and messaging complexity.

Topic: The application of cognitive computational methods to enhance vocational rehabilitation

Supervisor: Dr Sæmundur Haraldsson

Vocational Rehabilitation (VR) is a field within healthcare which aims to assist long-term sick-listed and unemployed individuals to enter the workforce or education. VR has yet to fully embrace the use of cognitive computer systems, including Artificial Intelligence (AI) approaches. As such, it offers many open avenues of research for inquisitive minds, e.g., predicting future regional demand for VR, optimising VR pathways for maximum probability of success, and many more. Potential PhD candidates would collaborate with international partners of the ADAPT consortium to exploit state-of-the-art AI and Data Science methods to improve decision making and planning in VR. The projects would form the foundation for the field of VR informatics, with international real-world impact on people's health and wellbeing as well as current societal issues.

Topic: Predicting the Performance of Backtracking While Backtracking

Supervisor: Patrick Maier

Backtracking is a generic algorithm for computing optimal solutions of many combinatorial optimisation problems such as travelling salesman or vehicle routing. Unfortunately, the time a backtracking solver requires to find an optimal solution, to prove optimality, or to prove infeasibility is very hard to predict, which limits the practicality of such solvers for real-world problems.

Research in algorithms has mainly focused on specific problem classes and on identifying characteristic features of hard problem instances. Instead, this project aims to mine a generic backtracking solver for performance data at runtime (that is, while solving a particular problem instance) and to build statistical models that can be used to estimate the future performance of the solver on the current problem. Interesting estimates include: How likely is it that the current solution is optimal? Assuming the current solution is optimal, how long will it take to prove optimality? Can the search be parallelised, and if so, how many CPUs would be required to get the answer in one hour?
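A minimal sketch of the starting point: a branch-and-bound backtracking solver for a tiny travelling-salesman instance that records runtime statistics (nodes explored, branches pruned, incumbent cost) of the kind a statistical performance model might be trained on. The distance matrix is invented for illustration:

```python
# A tiny symmetric TSP instance; distances are made up for illustration.
D = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 3],
     [10, 4, 3, 0]]
n = len(D)

# Runtime counters: exactly the sort of data this project would mine.
stats = {"nodes": 0, "prunes": 0, "best": float("inf")}

def backtrack(city, visited, cost):
    stats["nodes"] += 1
    if cost >= stats["best"]:      # bound: prune dominated branches
        stats["prunes"] += 1
        return
    if len(visited) == n:          # complete tour: close the cycle
        stats["best"] = min(stats["best"], cost + D[city][0])
        return
    for nxt in range(n):
        if nxt not in visited:
            backtrack(nxt, visited | {nxt}, cost + D[city][nxt])

backtrack(0, {0}, 0)
print(stats["best"], stats["nodes"], stats["prunes"])
```

Features like the pruning rate and the trajectory of the incumbent cost could feed estimates such as "how likely is the current solution optimal?" while the search is still running.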

Topic: Computational Modelling of Biological Systems

Supervisor: Carron Shankland

We can understand the world through modelling it, manipulating the model to incorporate new features, and analysing that model. For example, modelling disease spread: what happens when we add a vaccine, or quarantine, or mutation of the disease? The model can help inform policy decisions, as we've seen with the recent Covid outbreak. Or we might model a tumour cell and the effects of different kinds of radiotherapy on that cell, to develop better and safer treatment schedules. I have data for these systems, and am keen to supervise students using a range of modelling techniques, perhaps incorporating the use of evolutionary computation to refine the model.

Topic: Deepfake and fake news detection

With the growing presence of social media and social networking sites, people are more digitally connected than ever. This empowers citizens to express their views on a multitude of topics, ranging from government policies and everyday events to simply sharing their emotions. However, the growing influence of fake news propaganda is now a cause for concern across all walks of life. Election results are argued on some occasions to have been manipulated through the circulation of unfounded and sometimes doctored stories on social media. In addition to fake text, there has been huge growth in AI-based image and media manipulation algorithms, commonly known as 'deepfakes'. Near-realistic fake videos are being generated that contribute significantly to spreading misinformation. This project will research new algorithms that combine deep-learning-based Natural Language Processing (NLP) and Computer Vision (CV) techniques to detect fake news and prevent misinformation from spreading.

Topic: The multimedia blockchain

This project proposes to develop a blockchain-based media distribution framework to a) enable trust, privacy and security in the media consumption chain; and b) empower a transparent and trusted media distribution ecosystem in the creative sector. The first aim is to provide an efficient solution to issues of trust, privacy and security in the consumption chain, while the second is to provide a transparent and trusted media distribution ecosystem empowering creative content creators, publishers, consumers and digital archives. One key aspect of the proposed framework is a transparent and decentralised blockchain architecture for media transactions, with media integrity provided through novel signal processing techniques that can address challenges posed by recent advances in media manipulation, such as deepfakes.

Topic: Image/video auto-captioning

Image auto-captioning is an emerging area with many applications. It is easy to capture and share images, extract the geo-location, or even identify the objects in a picture. However, training a computer to see an image and to understand and describe its content is a very challenging problem. This project will research and develop robust algorithms that can generate natural language descriptions of images and videos and of their regions or segments. It will explore the inter-modal correspondences between language and visual data, and will contribute to the development of multimodal approaches that combine deep-learning-based Natural Language Processing (NLP) and Computer Vision (CV) techniques.

Topic: Non-Linear Deep Learning

Supervisor: Keiller Nogueira

Over the past decade, Convolutional Networks (ConvNets) have renewed the perspectives of the research and industrial communities. Although this deep learning technique may be composed of multiple layers, its core operation is the convolution, an important linear filtering process. Easy and fast to implement, convolutions play a major role not only in ConvNets but in digital image processing and analysis as a whole, being effective for several tasks. Aside from convolutions, however, researchers have also proposed and developed non-linear filters, such as the operators provided by mathematical morphology. Even though these are generally not as computationally efficient as linear filters, they are able to capture different patterns and tackle distinct problems compared to convolutions. This project will research the combination of deep learning and non-linear filters, mainly morphological operations, in order to create a new network that can be used for different applications and tasks.
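For a flavour of the non-linear filters involved, here is a minimal sketch of morphological dilation and erosion on a 1-D signal: the max/min counterparts of a sliding-window convolution (2-D image operators work the same way, windowed over both axes):

```python
# Dilation and erosion as max/min filters over a 1-D signal: unlike a
# convolution (a weighted sum), these are non-linear window operations.
def dilate(signal, size=3):
    r = size // 2
    return [max(signal[max(0, i - r):i + r + 1]) for i in range(len(signal))]

def erode(signal, size=3):
    r = size // 2
    return [min(signal[max(0, i - r):i + r + 1]) for i in range(len(signal))]

x = [0, 0, 1, 0, 0, 1, 1, 0]
print(dilate(x))  # bright spots grow:  [0, 1, 1, 1, 1, 1, 1, 1]
print(erode(x))   # bright spots shrink: [0, 0, 0, 0, 0, 0, 0, 0]
```

Replacing a convolution layer's weighted sum with a learnable version of such max/min operations is one way morphology can be embedded inside a deep network.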

Topic: Small Data Learning

Supervisor: Keiller Nogueira

The recent impressive results of deep-learning methods for computer vision applications have brought fresh air to the research and industrial communities. Although extremely important, deep learning has a relevant drawback: it needs a lot of labelled data in order to learn patterns. Some domains, however, do not usually have large amounts of labelled data available, which in turn makes such techniques infeasible. This project will research strategies to exploit deep learning better and more efficiently using few annotated samples.

Mathematics PhD opportunities

Topic: Optimising disease control measures for novel outbreaks

Supervisor: Dr Anthony O'Hare

This project will use census, demographic, and travel network data to model a disease outbreak in a country, given some high-level input such as the incubation period and R0 value, and will use Artificial Intelligence to determine the optimal disease control measures, e.g. closing schools or rail lines. For a given amount of vaccine, you will also determine the optimal distribution of its use.

Topic: Modelling Pathologies in Cardiac Cells

Supervisor: Dr Anya Kirpichnikova

Cardiac modelling serves as a crucial tool in comprehending the mechanisms of pathophysiology in both healthy and afflicted hearts. The prospective PhD project centres around cardiac modelling, specifically focusing on creating and examining models of both healthy and diseased ventricular cells. As part of this project, the candidate will acquire proficiency in sophisticated techniques for model development and analysis. These techniques will encompass virtual population methodology and sensitivity analysis, aimed at identifying cardinal cellular attributes influencing disease manifestation.

Topic: Designing obstacles for the Network Simulator 3

Supervisor: Dr Anya Kirpichnikova

In the realm of network simulation, integrating obstacles plays a pivotal role in achieving realistic and reliable results. This project combines various techniques for modelling wave propagation in the presence of obstacles with their implementation in code, i.e. implementing obstacles within Network Simulator 3 (NS-3), a popular tool used extensively for network research and development. The approach considers the physical characteristics of real-world barriers and their impact on signal propagation, enabling more accurate simulations of varied environmental conditions. By incorporating variables such as the material type, size, and location of obstacles, the model should emulate their effects on signal strength through reflection, refraction, diffraction, and absorption.
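As a hedged illustration of the kind of model involved (a sketch, not NS-3's actual API), the snippet below combines log-distance path loss with per-obstacle attenuation. The path-loss exponent, reference loss and per-material losses are invented for illustration:

```python
import math

# Illustrative per-material penetration losses in dB (not measured values).
MATERIAL_LOSS_DB = {"concrete": 12.0, "glass": 2.0, "wood": 4.0}

def received_power_dbm(tx_dbm, distance_m, obstacles, exponent=3.0,
                       ref_loss_db=40.0, ref_dist_m=1.0):
    """Log-distance path loss plus a fixed loss per obstructing material."""
    path_loss = ref_loss_db + 10 * exponent * math.log10(distance_m / ref_dist_m)
    obstacle_loss = sum(MATERIAL_LOSS_DB[m] for m in obstacles)
    return tx_dbm - path_loss - obstacle_loss

clear = received_power_dbm(20.0, 100.0, [])
walled = received_power_dbm(20.0, 100.0, ["concrete", "glass"])
print(clear, walled)  # the obstructed link is 14 dB weaker
```

An NS-3 propagation-loss extension would follow the same shape: intersect the transmitter-receiver ray with obstacle geometry, then subtract the accumulated attenuation from the free-space prediction.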

Topic: Novel methods for ECG classification

Supervisor: Dr Anya Kirpichnikova

Electrocardiogram (ECG) classification is critical in diagnosing cardiac abnormalities, offering an essential tool in preventative health measures. Despite advancements in this field, there remain significant opportunities for improving the accuracy and reliability of ECG classification methods. This research proposal aims to explore novel mathematical and signal processing techniques to enhance ECG classification and support timely intervention for cardiac patients.

Topic: How to avoid tipping points in the food system

Supervisor: Prof Rachel Norman

The way we produce, distribute and purchase food is referred to as the food system, and it contains many non-linearities. In this project we will look at the role of tipping points, and in particular at ways in which we could avoid them. The project will use mathematical models to describe aspects of the food system, and we will take a theoretical approach to the analysis alongside considering particular case studies of previous tipping points, for example the collapse of some cod populations.

Topic: When does a new infectious disease outbreak occur and when does it die out?

Supervisor: Prof Rachel Norman

There have been a significant number of emerging infections, which are either diseases we have not seen before or ones entering a new region. For example, Covid-19 had not been seen until the end of 2019, and it seems to have come from wildlife. We are being challenged by these new infections more frequently as the way we interact with our environment changes. This project will use mathematical models to look at what features cause an outbreak to occur or the disease to die out. It will use stochastic SIR-type models, which are coupled non-linear differential equations, to understand what happens at the start of an outbreak. If a small number of individuals get infected (for example, if a pathogen passes from wildlife into humans), what pathogen characteristics are more likely to result in an outbreak? The approach will be a combination of theoretical exploration of the parameter space for different models and consideration of specific diseases.
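For a flavour of the stochastic side, the sketch below simulates the embedded jump chain of a simple stochastic SIR model and estimates the probability of a major outbreak from one initial infective, which branching-process theory predicts to be roughly 1 − 1/R0. All parameter values are illustrative:

```python
import random

def sir_outbreak(beta, gamma, N, seed):
    """Run one stochastic SIR outbreak; return infections beyond the index case.
    Each event is an infection with probability proportional to beta*S*I/N,
    or a recovery with probability proportional to gamma*I."""
    rng = random.Random(seed)
    S, I = N - 1, 1
    while I > 0:
        infect = beta * S * I / N
        recover = gamma * I
        if rng.random() < infect / (infect + recover):
            S -= 1
            I += 1
        else:
            I -= 1
    return N - 1 - S

# R0 = beta/gamma = 2; count runs that become major outbreaks (> 50 cases).
runs = [sir_outbreak(beta=2.0, gamma=1.0, N=500, seed=s) for s in range(200)]
major = sum(r > 50 for r in runs) / len(runs)
print(major)  # theory predicts roughly 1 - 1/R0 = 0.5
```

Many simulated outbreaks die out after a handful of cases even though R0 > 1, which is exactly the early-outbreak behaviour deterministic ODE models cannot capture.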

Topic: Sparse multidimensional exponential analysis in computational science and engineering

Supervisor: Dr Wen-shin Lee

Exponential analysis might sound remote, but it touches our lives in many surprising ways, even if most people are unaware of just how important it is. For example, a substantial amount of effort in the field of signal processing is essentially dedicated to the analysis of exponential functions of which the exponents are complex. The analysis of exponential functions whose exponents are very near each other is directly linked to super-resolution imaging. As for exponential functions with real exponents, they are used to portray relaxation, chemical reactions, radioactivity, heat transfer, and fluid dynamics.

Since exponential models are vital for describing physical as well as biological phenomena, their analysis plays a crucial role in advancing science and engineering. This project investigates several multidimensional uses of exponential analysis in practice.
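As a one-dimensional taste of what exponential analysis does, the sketch below uses a Prony-style linear recurrence to recover two real decay rates from just four uniform samples; the rates and amplitudes are chosen purely for illustration:

```python
import math

# Signal f(t) = a*exp(-l1*t) + b*exp(-l2*t), sampled at t = 0, 1, 2, 3.
l1, l2, a, b = 0.5, 2.0, 3.0, 1.0
f = [a * math.exp(-l1 * k) + b * math.exp(-l2 * k) for k in range(4)]

# Such samples satisfy f[k+2] = p*f[k+1] + q*f[k]; solve 2x2 system for p, q.
det = f[1] * f[1] - f[0] * f[2]
p = (f[2] * f[1] - f[0] * f[3]) / det
q = (f[1] * f[3] - f[2] * f[2]) / det

# The exponent bases are the roots of r^2 - p*r - q = 0.
disc = math.sqrt(p * p + 4 * q)
roots = sorted([(p + disc) / 2, (p - disc) / 2], reverse=True)
rates = [-math.log(r) for r in roots]
print(rates)  # recovers rates close to 0.5 and 2.0
```

The multidimensional and sparse versions this project studies follow the same principle, recovering many exponents from few structured samples, but with far more delicate numerics.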

Topic: Exponential analysis meets Wavelet theory

Supervisor: Dr Wen-shin Lee

In recent years, sparse representations have been realised as linear combinations of various trigonometric functions, Chebyshev polynomials, spherical harmonics, Gaussian distributions and more. Recently, the paradigm of dilation and translation has also been introduced for use with these basis functions in sparse exponential analysis or sparse interpolation. As a result, high-resolution models can be constructed from sparse and coarsely sampled data, and several series expansions (Fourier, Chebyshev, ...) can be compactified. All of the above are available in one as well as higher dimensions. The similarity with wavelet theory remains largely unexplored.

Topic: Improving Antibiotic Dosage Regimens: Mitigating Risks Associated with Varying Patient Compliance

Supervisor: Andy Hoyle

The rise of antibiotic resistance is putting increasing pressure on our health service, and it is estimated that over 30,000 deaths per year across the EU are associated with resistant bacteria. Yet there is little movement away from conventional antibiotic regimens, whereby we apply a constant daily dosage, e.g. X mg (or 1 tablet) per day for N days. This project will combine mathematical modelling and Artificial Intelligence, aiming to find optimal antibiotic regimens which maximise host survival, minimise the emergence of resistance, and mitigate against uncertain patient compliance.
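A hedged sketch of the modelling side: a one-compartment pharmacokinetic decay model comparing the fraction of time drug concentration stays above a minimum inhibitory concentration (MIC) under perfect compliance versus a missed dose. The dose size, half-life and MIC are illustrative values, not clinical ones:

```python
import math

half_life_h = 6.0
k = math.log(2) / half_life_h  # first-order elimination rate per hour

def frac_above_mic(dose_times, hours=96, dose=10.0, mic=2.0):
    """Fraction of hourly checkpoints where total drug level meets the MIC.
    Concentration is the sum of exponentially decaying doses taken so far."""
    above = 0
    for t in range(hours):
        c = sum(dose * math.exp(-k * (t - d)) for d in dose_times if d <= t)
        above += c >= mic
    return above / hours

perfect = frac_above_mic([0, 24, 48, 72])
missed = frac_above_mic([0, 24, 72])  # patient skips the day-2 dose
print(perfect, missed)  # the compliance gap reduces time above the MIC
```

An optimiser (evolutionary or otherwise) could then search over dose sizes and timings to maximise an objective like this one while remaining robust to randomly missed doses.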