Face-to-face social interaction is a core part of human real-world behaviour. To study it, we need to capture natural behaviour (not just computerised tasks), find new ways to analyse these rich, complex datasets, and link patterns of behaviour to our neurocognitive models of brain activity. This requires both new technologies and new experimental paradigms. Here, I will share some initial steps in this direction. I will describe studies that use motion capture and eye-tracking to characterise human social interactions in detail across different tasks and contexts. I will present functional near-infrared spectroscopy data from two-person face-to-face interactions and discuss the different ways we can model and understand such data. In particular, I will show how models built on the mutual prediction hypothesis can go beyond current methods and allow us to link brain and behaviour in order to understand the embodiment of social interaction. Thus, this talk will highlight new directions in social neuroscience and the exciting opportunities now available for understanding real-world behaviour.