Recent introductions of robots into everyday scenarios have revealed unprecedented opportunities for collaboration and social interaction between robots and people. To date, however, such interactions are hampered by a significant challenge: robots lack a semantic understanding of their environment. Even simple requirements, such as "a robot should always be in the kitchen when a person is there", are difficult to implement without prior training. In this paper, we advocate that robot-people coexistence can be leveraged to enhance the semantic understanding of the shared environment and improve situation awareness. We propose a probabilistic framework that combines human activity sensor data generated by smart wearables with low-level localisation data generated by robots. Based on this low-level information, and leveraging colocation events between a user and a robot, the framework can reason about two types of semantic information: 1) semantic maps, i.e. the utility of each room, and 2) space usage semantics, i.e. tracking humans and robots through rooms of different utilities. The proposed system relies on two-way sharing of information between the robot and the user. In the first phase, user activities indicative of room utility are inferred from wearable devices and shared with the robot, enabling it to gradually build a semantic map of the environment. In the second phase, via colocation events, the robot teaches the user device to recognize the type of room where they are colocated. Over time, robot and user become increasingly independent and capable of semantic scene understanding.
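The two-phase exchange described above can be illustrated with a minimal sketch. This is not the paper's implementation; the activity-to-utility mapping, room identifiers, and class names are all illustrative assumptions. Phase 1 accumulates activity evidence per room during colocation events; phase 2 reads the majority label back so the robot can share it with the wearable.

```python
from collections import Counter, defaultdict

# Assumed mapping from wearable-inferred activities to room utilities.
# The real framework is probabilistic; this sketch uses simple counts.
ACTIVITY_TO_UTILITY = {
    "cooking": "kitchen",
    "eating": "kitchen",
    "sleeping": "bedroom",
    "watching_tv": "living_room",
}

class SemanticMapper:
    """Phase 1: accumulate activity evidence per room during colocation."""

    def __init__(self):
        # room_id -> Counter of observed utility labels
        self.evidence = defaultdict(Counter)

    def observe_colocation(self, room_id, activity):
        """Record one colocation event: the robot localises the room,
        the wearable reports the user's activity."""
        utility = ACTIVITY_TO_UTILITY.get(activity)
        if utility is not None:
            self.evidence[room_id][utility] += 1

    def room_utility(self, room_id):
        """Phase 2: return the most likely utility label for a room,
        or None if no evidence has been gathered yet."""
        counts = self.evidence[room_id]
        return counts.most_common(1)[0][0] if counts else None

# Example: three colocation events in the same room.
mapper = SemanticMapper()
for activity in ["cooking", "eating", "cooking"]:
    mapper.observe_colocation("room_3", activity)

label_for_wearable = mapper.room_utility("room_3")  # "kitchen"
```

Over repeated colocation events, such evidence accumulation lets the semantic map converge without any prior training, which is the independence property the abstract highlights.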