Abstract: Situated human-computer interactions rely on a shared understanding of the surrounding environment. For example, an instruction to an autonomous vehicle to drop you off at a particular building entrance on campus relies on your mental model of geographic space corresponding to the digital representation of that space stored in the navigation system. Going beyond route navigation, if you wanted to ask the vehicle’s navigation system for details about the dome-like building you could see in the distance on your left, the system would need to be able to model your current view and identify possible candidates. This requires a detailed surface model of the region with corresponding attributes for each feature (e.g. building use, name, type), and the ability to model in real time which objects are the most salient. In this talk I’ll explain how LiDAR data can be used to construct digital geographic landscapes which, when fused with a variety of map datasets, can provide the analytical capabilities to support such natural language interfaces. Through such developments we can reduce the seam between the user and the computer/robot, building more natural interactions and a better shared contextual awareness.
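The core operation behind "model your current view" is a visibility test against a gridded surface model. As a rough illustration (not taken from the talk; the function name, grid layout, and parameters are all assumptions), a line-of-sight check over a LiDAR-derived digital surface model (DSM) might look like this:

```python
# Minimal line-of-sight sketch over a gridded surface model (e.g. a
# LiDAR-derived DSM). Illustrative only; real systems use optimised
# viewshed algorithms over georeferenced rasters.

def visible(dsm, observer, target, eye_height=1.7):
    """Return True if the target cell can be seen from the observer cell.

    dsm: 2D list of surface elevations in metres, indexed [row][col]
    observer, target: (row, col) grid positions
    eye_height: observer's eye height above the surface, in metres
    """
    (r0, c0), (r1, c1) = observer, target
    z0 = dsm[r0][c0] + eye_height      # eye position above the surface
    z1 = dsm[r1][c1]                   # elevation of the target cell
    steps = max(abs(r1 - r0), abs(c1 - c0))
    if steps == 0:
        return True
    for i in range(1, steps):          # sample cells along the sight line
        t = i / steps
        r = round(r0 + t * (r1 - r0))
        c = round(c0 + t * (c1 - c0))
        sight_z = z0 + t * (z1 - z0)   # height of the sight line here
        if dsm[r][c] > sight_z:        # intervening surface blocks the ray
            return False
    return True
```

Running such a test from the user's position over every candidate feature, and weighting the visible ones by attributes from fused map datasets, is one plausible route to ranking "the dome-like building on your left" among candidates.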
Bio:
Dr Phil Bartie’s research interests are centred on data science and information systems, particularly the overlap between models of space and place, mobile computing, and human-computer interaction, and how these technologies interact with industry and society. His interests include how spatial data models may be used to improve the effectiveness of interactions with mobile computers (e.g. autonomous vehicles, robots, and wearables) while exploring urban space, and the development of spatial analysis and decision-making methods that support more natural human-machine interactions. Before moving into academia, he worked in technical roles in government and industry in the Channel Islands and New Zealand, developing a wide range of solutions using database, mobile, and web technologies.