Dr. Raffaella Calabrese and Dr. Belen Martin-Barragan are looking for an ESRC-funded PhD student with a strong quantitative background to develop the following project:
 
Since the beginning of the 21st century, machine learning methods have been increasing their presence in research, industry and public institutions.
 
However, they are often seen as black boxes that are difficult to interpret. In particular, the final aim of the social sciences is to obtain insight from data, beyond a mere prediction or forecast.
In such a context, machine learning techniques are often set aside in favour of more traditional statistical techniques. We believe that there is enormous potential in investigating alternative methods that combine the predictive power of machine learning methods with the interpretability usually found in statistical techniques.
 
Interpretability is a very subjective concept. Its meaning varies greatly depending on the background of the decision maker or social researcher, as well as on the final aims of the study. In order to model interpretability mathematically and incorporate it into machine learning methods, we first need an appropriate definition of it. In this project, we will focus on the meaning of interpretability in the context of the social sciences, using the topic of financial distress and housing affordability as a case study. We will use one of the largest panel surveys in the world, the UK Household Longitudinal Study (UKHLS). In this context, machine-learning-based credit scoring models will be used to estimate household financial distress. Because existing machine learning techniques are black boxes, variants of these techniques will be developed to impose interpretability on the score. Developing these variants involves several steps: first, interpretability needs to be defined in a way that is meaningful to social researchers; second, this definition needs to be translated into a tractable mathematical formulation; third, the machine learning technique needs to be modified in accordance with this mathematical definition; finally, the technique needs to be applied to the data.
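As a purely illustrative sketch, and not part of the project specification, the kind of trade-off involved can be seen in a simple sparsity-constrained scoring model, where a penalty limits the number of variables entering the score. The data and variables below are synthetic stand-ins, not UKHLS variables.

# Illustrative sketch only: an L1-penalised logistic scoring model, where the
# penalty keeps the number of variables in the score small and hence easier
# to interpret. Data are synthetic, not drawn from the UKHLS.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic household data: 20 candidate predictors, only a few informative.
X, y = make_classification(n_samples=2000, n_features=20, n_informative=4,
                           random_state=0)

# The penalty strength C trades predictive power against the number of
# non-zero coefficients (one crude proxy for interpretability).
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.05)
model.fit(X, y)

selected = np.flatnonzero(model.coef_[0])
print(f"Variables retained in the score: {selected.tolist()}")
print(f"Training accuracy: {model.score(X, y):.3f}")

In the project itself, such a generic penalty would be replaced by constraints derived from the social-science definition of interpretability developed in the first two steps above.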
 
An ideal candidate for this project would have a strong quantitative background and a willingness to apply their knowledge to real-life social problems.
 
Postgraduate studies in Operations Research, Statistics, Applied Mathematics, Engineering, Physics, Actuarial Sciences, Econometrics or Machine Learning are highly valued. Excellent candidates currently completing their undergraduate studies will also be considered.
 
Candidates should contact Dr. Belen Martin-Barragan (belen.martin@ed.ac.uk). The deadline for applications is 31 May 2018.