Capturing, Exploring and Sharing People’s Emotional Bond with Places in the City using Emotion Maps

The vision of ubiquitous computing is becoming increasingly realized through smart city solutions. With the proliferation of smartphones and smartwatches, alongside the rise of the quantified-self movement, a new technological layer is being added to the urban environment. This framework offers the possibility to capture, track, measure, visualize, and augment our experience of the urban environment. However, to that end, there is a growing need to better understand the triangular relationship between person, place, and technology.

However, these sensors are not only incorporated in the physical fabric of the urban environment; the people inhabiting the city also carry a range of mobile devices such as smartphones and smartwatches, which can be turned into sensors to gather data in the built environment. Artist Esther Polak explored the potential of GPS in people's mobile devices to introduce time to a map as something that would be measured and experienced in real time, using the real-time location and traces of movement from GPS-enabled mobile devices to draw a map of the streets of downtown Amsterdam as people passed through them (Polak 2002). A similar technique is nowadays used to add real-time traffic information to Google Maps for route planning and navigation purposes, by tracking and aggregating the location and movement of Android mobile devices through the city.
These mobile devices also enable users to track various aspects of their everyday lives and allow them to interact with and experience the city in novel ways. Complementary to the smart city approach, Urban Interaction Design takes a bottom-up, human-scale design approach. It aims to identify the needs, desires, routines, behaviours and experiences of people in the smart city of the (near) future, in order to inform the design of innovative technological devices and services (Smyth et al. 2013). The focus is on city making, that is, people as engaged citizens using technology to create pleasant cities in which to work, live, play, and create wealth, culture and more (Hill 2013). In recent years, researchers such as Ratti, Picon, Krivý and Offenhuber have contested the clear-cut distinction between top-down and bottom-up approaches in the context of their practices and representational agencies, and have combined a human-centred, bottom-up design approach with a more traditional, top-down smart city design approach (Ratti 2010; Krivý 2018; Offenhuber and Lee 2012; Picon 2015). Offenhuber and Lee, for example, developed a participatory waste management tool to help residents combat litter in municipalities that lacked a proper, government-regulated waste management system. Tying in with Brazil's long history of self-organized cooperatives of informal recyclers known as "catadores", the mobile app allowed residents to take control of the overflowing bins and litter in their neighbourhood by scheduling a pickup request. By aggregating the collection activity of the recyclers and the pickup requests of citizens with GPS data in a digital map, the recyclers could plan their daily routes more efficiently and validate the service they provide, while citizens were empowered to take control of litter in their own neighbourhood (Offenhuber and Lee 2012).
Personal Informatics and the Quantified-Self movement also use a range of mobile devices as wearable sensors for collecting data on more personal aspects of people's everyday lives in the urban environment. Popular metrics to track with quantified-self technology include physical activity (e.g. running apps like Nike+), diets (e.g. MyFitnessPal), moods and emotions (e.g. MoodPanda, ComfortZones) and memories (e.g. Memoir, UMap) (Blom et al. 2010; Elsden 2014; Huang, Tang, and Wang 2015; Li et al. 2013; Matassa 2013; Stals, Smyth, and Mival 2017b). The goal is typically to use this personal data to gain self-knowledge and self-insight, and to promote positive attitudes and behaviours.

Increased Focus on Emotion and Affect
With the proliferation of smart city solutions, mobile devices, wearable technologies and the rise of the quantified-self movement (Li et al. 2013), there is a growing need to better understand the triangular relationship between people, place and technology in the urban environment (Stals, Smyth, and IJsselsteijn 2014). To this end, there has been an increased focus on emotion and affect to create a better understanding of the urban lived experience and to augment people's experience of the urban environment.
De Lange, for example, argues that emotion and affect have mostly been absent in the smart city discussion (de Lange 2013). According to de Lange, the smart city does not appeal to the emotions and as a result insufficiently engages citizens. This lack of engagement is not seen as a problem in traditional smart city visions, however, which typically take a top-down, technology-centred design approach in which technology is in control and therefore has no need to be engaging for the people living in the smart city. Affective computing, however, does point to affect and emotional intelligence as a different kind of intelligence about the world (i.e. different from a logical, rational intelligence), and de Lange argues that this could be the missing component when considering what is truly smart about cities (de Lange 2013; Picard 2000). De Lange sketches a framework for the affective smart city in which affect and emotions are given a central role in the design of future cities. In this framework, smart city solutions depart from people's emotional attachment, or lack thereof, to shared, emotionally charged issues in the community, such as air pollution. The data collected by Quantified-Self technology could be a valuable resource here, as it not only quantifies and measures individuals' emotional experiences, but also provides a sense that the collected data is "mine", thus encouraging a sense of ownership. It therefore encourages people to take responsibility and act upon the data, and can also be seen as a way to exchange something of value with the world and other people (de Lange 2013).
Although Rooksby noted that people's motivations to track and collect personal data using Quantified-Self technologies during their everyday lives are currently typically egocentric and particularly present-focused (Rooksby et al. 2014), endeavours are being undertaken in this field to go beyond short-term use and direct goals concerned with self-knowledge, self-reflection and behavioural change. For example, an interest has emerged in the different types of social relationships mediated and affected by this personal data (Elsden et al. 2017; Puussaar, Clear, and Wright 2017; Stals, Smyth, and Mival 2017b). In addition, Elsden argues that we should also take into account the rich, emotional experience of looking back on current and past personal Quantified-Self data, to create a better understanding of the value of this data and how it could potentially be used to augment people's lives in the future. He argues we should explore how to design for long-term use, for remembering a digital "quantified past" (Elsden and Kirk 2014; Elsden et al. 2017).
In human-centred approaches to the design of smart cities too, there has been an increased focus on emotion and affect to augment and create a better understanding of the urban lived experience. Matassa and Simeoni consider smart cities as places in which people and mobile and wearable technologies should cohabit in a synergic way. They also point to feelings, affections and moods as the features that are currently missing in order to be able to define and transform a space into a hybrid space (Matassa and Simeoni 2014). Many projects focus on the topic of safety in the urban environment, and have introduced technological interventions to increase people's feelings of safety in cities, or to inform the (re)design of urban places. Satchell and Foth, for example, investigated the potential for mobile technology to help users manage their personal safety concerns in the city at night. They advocated the design of a dedicated safety device that would enable people, in particular men, to take on the role of protectors instead of victims in situations in which they felt unsafe in nocturnal urban environments (e.g. when walking home alone after a night out) (Satchell and Foth 2011). Where Satchell and Foth mainly aimed to mitigate people's personal safety concerns, the Emocycling project attempts to improve traffic safety by utilizing aggregated arousal level data of cyclists in the city to inform urban planning. In this study, participants were equipped with wearable technology to measure physiological data, in combination with a GoPro camera and a GPS tracker to make it possible to geolocate the measurements and detect areas of negative arousal in the city. Whenever increased levels of stress were detected, the GoPro camera would automatically take a picture.
The aim was to identify hotspots of stress for cyclists in the city, enabling non-professionals to use technology to automatically identify potential danger spots in the traffic infrastructure that need to be redesigned. This example illustrates that the data collected by a Quantified-Self system like an activity tracker is not only valuable for the individual using it, but could potentially also be used to, in this case, improve the traffic infrastructure of a city (Stals, Smyth, and Mival 2017b; Zeile et al. 2015).
But research has not been limited to feelings of safety alone. In the fields of architecture and urban planning, researchers have used a mobile, wireless EEG headset to record and analyse the emotional experience of a group of walkers in different types of urban places. Analysis of the real-time neural responses to different urban places showed evidence of lower frustration, engagement and arousal and higher meditation when moving into a green space, and higher engagement when moving out of it into a busy street (Aspinall et al. 2015).
Using similar wearable technology and metrics, artist Christian Nold (Nold 2009) investigated people's emotional relationships with places in the urban environment by measuring people's arousal levels as they walked freely through the city. Participants were equipped with a wearable GPS locator and a biometric sensor attached to their fingers which measured their Galvanic Skin Response (i.e. sweat levels). This data was subsequently overlaid on a map of the city, showing peaks in arousal levels at certain locations along their walking route. Each participant was asked to interpret and contextualize their own data after the walk, and the annotated data were subsequently combined into an annotated "emotion map" of the city. Although some places showed peaks in arousal levels because of specific environmental characteristics like traffic or architecture, these emotion maps were also filled with personal stories and emotions, indicating people's strong and meaningful personal connection with certain places in the city.
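The core of this technique, pairing each GSR reading with a GPS coordinate and normalizing the signal so that peaks can be drawn as brighter dots, can be sketched as follows. This is a deliberately simplified illustration, not Nold's actual pipeline; the field names and the min-max normalization are our assumptions.

```python
# Hypothetical sketch: overlaying Galvanic Skin Response (GSR) readings on
# GPS coordinates to produce the raw material for an annotated "emotion map".
# Field names and the min-max normalization are illustrative assumptions.

def arousal_overlay(samples):
    """samples: list of (lat, lon, gsr) tuples recorded during a walk.

    Returns one dict per sample with the GSR value normalized to [0, 1],
    so that higher arousal can be rendered as a brighter dot on the map.
    """
    gsr_values = [g for _, _, g in samples]
    lo, hi = min(gsr_values), max(gsr_values)
    span = (hi - lo) or 1.0  # avoid division by zero for a flat signal
    return [
        {"lat": lat, "lon": lon, "arousal": (g - lo) / span}
        for lat, lon, g in samples
    ]

# Three points of a walk; the middle sample has the highest arousal
# and would appear brightest on the resulting overlay.
walk = [(55.9533, -3.1883, 2.1), (55.9540, -3.1870, 4.8), (55.9551, -3.1862, 2.4)]
overlay = arousal_overlay(walk)
```

Note that, as discussed below, this normalized signal encodes only arousal intensity; the participant's annotations remain the sole source of the emotion's valence and cause.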
Such data collected by mobile and wearable devices or a Quantified-Self system could potentially also be used to enhance and augment a specific individual's experience of the urban environment, by complementing aggregated emotion data of multiple people, gathered using a more top-down, crowdsourcing approach, with personal emotion data from a bottom-up, human-centred design approach. Quercia, Schifanella and Aiello improved the experience of walking routes through the urban environment as provided by traditional route planners by taking into account the emotional responses that the physical characteristics of places evoke in people (Quercia, Schifanella, and Aiello 2014). This was done by crowdsourcing geotagged pictures on Flickr and performing (sentiment) analysis on metadata such as the number of pictures in a certain area, number of views, comments and tags. This data was subsequently used to successfully determine more quiet, beautiful or happy walking routes in London and Boston. A future improvement suggested by the researchers was to include personalization options that would take into account an individual's personal history with a place. This personal emotion data could come from a Quantified-Self system such as the one proposed by Matassa and Rapp, who prototyped and tested a QS-system for cyclists which aims to enhance an individual's remembering process by connecting personal experiences with the places in which they took place. It alerts a cyclist in situ to their personal emotional connection with a place, acting as a memory trigger and a cue for reminiscing (Matassa and Rapp 2015).

Exploring Emotion and Person-Place relationships in the Urban Environment
Inspired by Nold's work on emotional cartography (Nold 2009) and the rise of the Quantified-Self movement (Li et al. 2013), there is currently an increased interest in exploring how mobile, wearable and Quantified-Self technology could potentially be used to capture and collect people's emotional experiences of urban places (Rooksby et al. 2014; Matassa 2013; Resch et al. 2015; Stals, Smyth, and Mival 2017b; Quercia, Schifanella, and Aiello 2014), and the potential for sharing this personal data with other people using emotion maps (Al-Husain, Kanjo, and Chamberlain 2013; Leahu, Schwenk, and Sengers 2008; Matassa and Rapp 2015; Mody, Willis, and Kerstein 2009; Nold 2004; Nold 2009). Based on social science studies of the concept of place attachment (Manzo 2005; Gustafson 2001; Scannell and Gifford 2010) and these urban HCI studies attempting to leverage people's emotional experience of the urban environment, we argue that places that are meaningful to people on a personal level could provide a suitable lens for further investigation, as these personally meaningful places are typically the places that a person has a strong emotional bond with (Stals, Smyth, and Mival 2017a).
In the social sciences, research has centred on various person-place related concepts. Place is often defined in the literature as a meaningful location (Lewicka 2011), with place meaning developing from people's positive and negative experiences and emotions in places (Manzo 2005), which can result in place attachment, a multidimensional concept which characterizes the emotional relationship between individuals and their important places (Low and Altman 1992). The overall aim of the ongoing research in the PhD dissertation of which this literature review is a part is to understand how people's experiences of places in the urban environment that are meaningful to them on a personal level (e.g. the pub where they met their partner, or the dark alley where they got mugged), and in particular their personal stories and emotions connected to those places, could potentially inform the design of future technological devices and services. The aim is to investigate how people would like to capture their experience of a personally significant place, the different forms this data could take, and the potential for sharing this personal data with other people that the participant has different types of social relationships with (e.g. strangers, friends, and family) (Stals 2017; Stals, Smyth, and Mival 2017a; Stals, Smyth, and Mival 2017b).
The data corpus regarding person-place relationships was collected using the ethnographically-informed Walking & Talking method, an observed walking interview between the participant and the researcher along the participant's typical routes through the city, during which five of the participant's personally significant places were visited (Stals, Smyth, and IJsselsteijn 2014). The Walking & Talking method was designed to elicit, in situ, qualitative measurements of the subjective emotional experiences that participants have in their personally significant places using the Plutchik Emotion Wheel (see Figure 1). The emotion wheel contains the eight basic human emotions depicted by different colours (joy, trust, fear, surprise, sadness, disgust, anger and anticipation), each divided into three intensity levels. This visual tool offers participants a lightweight means of explicitly verbalizing the different emotions and emotion intensities associated with different places as the Walking & Talking interview unfolds, and is used to further contextualize their personal stories connected to their personally significant places (Plutchik 2005; Stals, Smyth, and IJsselsteijn 2014). In addition, mobile, wearable technology in the form of a GoPro camera was used by the researcher to record the Walking & Talking interview on video. Using Automatic Facial Expression Recognition (AFER) software, these videos were retrospectively analysed to also gather quantitative emotion data on how participants had emotionally experienced their personally significant places in situ (Stals, Smyth, and IJsselsteijn 2014).
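The structure of the wheel, eight basic emotions each with three intensity levels, lends itself to a compact encoding for recording interview observations. The sketch below is illustrative; the intensity terms follow Plutchik's published model, while the `annotate` helper and its field names are hypothetical, not part of the Walking & Talking method itself.

```python
# Illustrative encoding of the Plutchik Emotion Wheel: eight basic human
# emotions, each with three intensity levels (mild, basic, intense).
# The intensity terms follow Plutchik's model; a real tool would also pair
# each emotion with its colour from the wheel shown in Figure 1.
PLUTCHIK_WHEEL = {
    "joy":          ["serenity", "joy", "ecstasy"],
    "trust":        ["acceptance", "trust", "admiration"],
    "fear":         ["apprehension", "fear", "terror"],
    "surprise":     ["distraction", "surprise", "amazement"],
    "sadness":      ["pensiveness", "sadness", "grief"],
    "disgust":      ["boredom", "disgust", "loathing"],
    "anger":        ["annoyance", "anger", "rage"],
    "anticipation": ["interest", "anticipation", "vigilance"],
}

def annotate(place, emotion, level):
    """Record one (place, emotion, intensity) observation from an interview.

    level is 0 (mildest), 1 (basic) or 2 (most intense).
    """
    return {"place": place, "emotion": emotion,
            "intensity": PLUTCHIK_WHEEL[emotion][level]}

# Hypothetical example: strongest form of joy reported at a place.
note = annotate("The Meadows", "joy", 2)
```

Encoding observations this way keeps the emotion and its intensity as separate dimensions, which matters later when intensity is mapped onto visual variables such as dot size on an emotion map.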
Furthermore, during the semi-structured Walking & Talking interviews, it was also investigated how participants would like to capture their experience of a place, the different forms this data could take, and the types of social relationships (e.g. strangers, friends, family members) participants would potentially be willing to share this data with. Participants were recruited using a networking procedure, beginning with referrals of potential participants from acquaintances. Participants were required to be between 18 and 70 years old and to have been living in the city of Edinburgh (United Kingdom) for at least two years. This two-year minimum ensures that participants have had the time and opportunity to create personal relationships with places in the city.
As this research is part of an ongoing PhD dissertation, the full data corpus, gathered over a six-month period in the city of Edinburgh (United Kingdom), is expected to consist of 40 personally significant places and will be analysed using a thematic, bottom-up analysis typical of a grounded-theory approach. However, initial analysis of the data gathered during our pilot study, which covered a total of 10 personally significant places and was conducted over a one-month period prior to the main data gathering in Edinburgh, suggests that for participants there is not just one emotion connected to the experience of each personally significant place; there can be multiple different ones (Stals, Smyth, and Mival 2017a). This poses a potential problem for the use of emotion maps as a tool to create a more accurate understanding of a person's emotional experience of, and relationship with, personally meaningful places in the urban environment. The design and evaluation of the Walking & Talking method as a way to elicit, in situ, qualitative measurements of the subjective emotional experiences that participants have in their personally significant urban places (Stals, Smyth, and IJsselsteijn 2014), and the limited efficacy of using Automatic Facial Expression Recognition (AFER) software to retrospectively analyse video recordings of those Walking & Talking interviews to gather quantitative emotion data, have already been discussed in more detail in earlier work (Stals, Smyth, and Mival 2017a). Based on the results of our pilot study and a systematic review of emotion maps in the existing literature, this journal paper aims to highlight and discuss the strengths, limitations and potential of using emotion maps as a means to capture, visualize, explore and share this personal, geo-located emotion data on a person's emotional experience of, and relationship with, personally meaningful places in the urban environment.

Emotion Maps
When it comes to creating a better understanding of people's emotional experience of a place, the creation of emotion maps has become common practice and continues to inspire researchers around the globe. It originated in the 1950s with the Situationist movement, which used the dérive to create psychogeographies of life in the streets in response to the singular, institutionalized view of the city held by urban planners (Sadler 1999). In the 1960s, urban planner Kevin Lynch incorporated subjective experiences into the process of creating mental maps of urban spaces (Lynch 1960), while artist Christian Nold used GPS and wearable technology in a mobile methodology to create visually beautiful emotional cartographies of cities at the beginning of the new millennium (Nold 2004; Nold 2009). More recently, emotion maps have been used to visualize stress hotspots in supermarkets (El Mawass and Kanjo 2013), people's feelings in places affected by environmental factors like air pollution, noise and green space (MacKerron and Mourato 2012), people's emotional feelings about different typologies of places (e.g. restaurants, museums, stores) (Mody, Willis, and Kerstein 2009), and feelings in a place based on personal memories of events that happened there (Matassa and Rapp 2015). From a research perspective, this makes sense. Researchers try to make sense of the data and want to know exactly which emotion or experience occurred in which location, so geographically ordering that data and linking it to the location in which it was collected seems like a reasonable first step. Furthermore, thanks to advances in technology, GPS can nowadays accurately determine a person's location, so when incorporated in a research method, GPS technology can relatively easily be used to determine a participant's exact location. What GPS unfortunately cannot do is tell you what a person's emotional experience of that location is, let alone the quality or cause of that experience.
As it turns out, this is also still a challenge for other technologies such as EEG, GSR and AFER, which are often deployed in combination with GPS to automatically collect emotion-related data, especially when utilized outside a lab setting, such as in an urban environment (Stals, Smyth, and Mival 2017a; Tilley et al. 2017; Westerink et al. 2008). This raises interesting challenges regarding the visualization, use, and sharing of geo-located emotion data using emotion maps, which are often based on these types of data.
Although creating emotion maps has become common practice in fields such as Urban Interaction Design, the limitations of such emotion maps are often not adequately addressed, particularly, as Frodsham noted, when it comes to affective mapping and the use of GPS (Frodsham 2015). Therefore, this journal paper aims to assess the suitability of emotion maps when it comes to visualizing, exploring, sharing and communicating a person's (emotional) experience of, and relationship with, personally meaningful locations in the urban environment. This review of emotion maps in the literature is not meant to be an exhaustive analysis of all the different types and variations in visualizations of emotion maps that exist in the literature. The aim is to highlight the important aspects and limitations that one needs to be aware of when aiming to use emotion maps as a means to create a better understanding of the urban lived experience, or to communicate the (emotional) experience of a place, or a person's personal relationship with a place, to other people. The first thing to be aware of is the type of sensor that has been used to collect the emotion data. In the literature, three types of emotion data sources can be identified: technical sensors, human sensors and crowdsourced data. Technical sensors typically automatically collect quantitative, objective, biometric data (e.g. EEG, GSR, AFER) from individuals using wearable technology, while human sensors typically enable the qualitative, subjective measurement of people's emotions (e.g. interviews, Emotion Wheel). Crowdsourced data is typically collected by using an algorithm to crowdsource and automatically rank geo-located social media data into emotion categories.
For example, Quercia, Schifanella and Aiello determined the happiest and quietest walking routes using crowdsourced ratings of places in the city based on pictures from Google Street View and Flickr (Quercia, Schifanella, and Aiello 2014), while Resch, Summa, Zeile and Strube extracted emotion information from tweets on Twitter using sentiment analysis to understand which emotions were associated with places, for use in urban planning (Resch et al. 2016). It is not uncommon for researchers to deploy a combination of different sensors to try to get a more complete picture of people's emotions related to urban places. We will now take a closer look at two commonly used types of emotion maps used to visualize the geo-located emotion data collected: emotion maps based on a single, linear metric (e.g. biometric, quantitative data) and emotion maps based on emotion categories (e.g. qualitative data).
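To make the crowdsourced approach concrete, the basic shape of such a pipeline, scoring geo-tagged posts against emotion categories and keeping the geo-located labels, can be sketched as follows. This is a deliberately naive keyword lexicon for illustration only; the actual sentiment analysis in work such as Resch et al. (2016) is far more sophisticated, and the categories and word lists below are our assumptions.

```python
# Minimal sketch of ranking geo-tagged social media posts into emotion
# categories using a naive keyword lexicon. The lexicon and categories are
# illustrative assumptions, not those used in the cited studies.
LEXICON = {
    "happy": {"love", "beautiful", "great", "sunny"},
    "fear":  {"unsafe", "dark", "scary"},
    "anger": {"traffic", "noise", "crowded"},
}

def categorize(post_text):
    """Return the emotion category whose keywords best match the post,
    or None when no keyword matches at all."""
    words = set(post_text.lower().split())
    scores = {cat: len(words & keywords) for cat, keywords in LEXICON.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

def emotion_map(posts):
    """posts: list of (lat, lon, text). Returns geo-located emotion labels,
    dropping posts that match no category."""
    return [(lat, lon, categorize(text))
            for lat, lon, text in posts
            if categorize(text) is not None]

points = emotion_map([
    (55.95, -3.19, "Beautiful sunny day in the park"),
    (55.94, -3.20, "This alley feels unsafe and dark"),
])
```

Even this toy version exposes the groundtruthing limitation discussed later: the label says nothing about why a place evoked the emotion, only that certain words co-occurred with a location.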

Emotion Maps or Arousal Maps?
Perhaps the most well-known emotion maps based on the automatic collection of biometric data are those produced by artist Christian Nold in the Bio Mapping project (Nold 2004) and the Emotional Cartography project (Nold 2009). Taking Nold's emotion map of San Francisco from the Emotional Cartography project as an example (see Figure 2), the first thing to note is that this is not actually an emotion map but an arousal map, in which the depicted arousal levels are based on quantitative data collected using a wearable Galvanic Skin Response (GSR) sensor paired with a mobile GPS locator. This map, and in fact all the maps from Nold's Bio Mapping and Emotional Cartography projects, are beautiful visualizations of participants' arousal levels at certain locations in the urban environment. These physiological arousal levels are measured using Galvanic Skin Response, and are thus based on a single, linear metric. This means that they are easy to visualize, typically using heat map-like visualizations. For example, in Nold's San Francisco emotion map, the higher the peak in arousal levels, the brighter and lighter the red dot on the map. However, these dots only indicate heightened arousal levels at certain locations in the city, not the actual emotions experienced by the participants at that place (e.g. they are not indicative of valence or the type of emotion), because currently available technical sensors are not able to unambiguously correlate biometric data with a person's actual emotions, a well-known problem in affective computing (Picard 2000; Leahu, Schwenk, and Sengers 2008; Leahu and Sengers 2015; Resch et al. 2015).

Source: "San Francisco Emotion Map" by Christian Nold is licensed under CC BY NC SA 2.5
An additional problem is the grounding of this quantitative data (i.e. the context of the emotion), which to date remains a challenge for all emotion-related biometric data collected outside a lab setting (Frodsham 2015; Resch et al. 2015; Stals, Smyth, and Mival 2017a; Tilley et al. 2017; Westerink et al. 2008). The GSR data in Nold's San Francisco emotion map, for example, on its own gives no indication of the cause of the arousal, which could very well be related to the physical activity of walking (e.g. getting tired or walking uphill) rather than to a participant's personal relationship with a place (Westerink et al. 2008; Resch et al. 2015). Nold attempts to mitigate this problem by allowing each participant to annotate their own arousal map, thus combining quantitative data with subjective data, which has become a common approach to dealing with these limitations (Leahu, Schwenk, and Sengers 2008; Matassa and Rapp 2015). However, the text that can be added afterwards (Nold 2009) or in situ using an app (Matassa and Rapp 2015; Zeile et al. 2015) is often limited and might not sufficiently reflect the experience or relationship a person has with a personally significant place. For example, in the detailed figure of the San Francisco emotion map, the annotation "Reminiscing" gives some indication of why the arousal level is elevated, but is still insufficient for obtaining better insight into the participant's personal relationship with the place: an event may have occurred there in the past, or the participant could simply be reminiscing about the day at the office while walking home from work. This "groundtruthing" also remains a challenge when creating emotion maps based on crowdsourced social media data, due to the limited amount of text and characters available in such data (Quercia, Schifanella, and Aiello 2014; Resch et al. 2016).
Another aspect to be aware of is the temporal factor. Arousal maps and emotion maps often provide a snapshot in time and contain data which is not necessarily related to the typical (emotional) experience of that place or the personal relationship a person has with it. For example, the annotation in the San Francisco emotion map "Little girl running past me with a Pitbull" can of course be the cause of a higher arousal level picked up by biometric sensors, but it is particular to that specific walk and is an event that is unlikely to occur if the walk were repeated. A potential advantage of using crowdsourced data is that the algorithm can be used to update an emotion map in real time, thus providing an up-to-date emotion map. Related to the issue of temporality is the medium of the emotion map. Nold's San Francisco emotion map, for example, is a map printed on paper: an actual physical object, which is static and not interactive. As a result, personally significant places cannot be added to or removed from the map, nor can the emotional experience of personally meaningful places be updated or traced across time. This is a limitation, as the emotional experience of a person's personally meaningful places does not remain stable, but can evolve over time (Stals, Smyth, and Mival 2017a).

Emotion Maps Based on Emotion Categories
Due to the limited insight provided by arousal maps regarding a person's actual emotions connected to places in the urban environment, more recent studies make use of emotion maps based on emotion categories (Mody, Willis, and Kerstein 2009; Matassa and Rapp 2015; Quercia, Schifanella, and Aiello 2014; Resch et al. 2016). For example, Matassa and Rapp designed and developed a concept for a smartphone app for cyclists called UMap, which aims to enhance people's reminiscing of past experiences by linking them to the context (i.e. places) in which they occurred (Matassa and Rapp 2015). The mobile app registers contextual data both automatically and through self-reporting. Sensors in the mobile phone are used to automatically collect quantitative contextual data such as time, GPS location and weather conditions. Additional qualitative data such as emotions, notes and media in the form of pictures and videos are not automatically registered, but can be added manually by the user. Emotion maps that do attempt to depict a person's actual emotions connected to a place (i.e. rather than arousal levels), such as the one depicted in Figure 3, typically use simplified emotion categories based on qualitative measurements or crowdsourced data, thus limiting the range of emotions on the resulting emotion map (Mody, Willis, and Kerstein 2009; Matassa and Rapp 2015; Quercia, Schifanella, and Aiello 2014; Resch et al. 2016). Resch et al., for example, were restricted by limitations in the technology used for sentiment analysis, with happiness notably being the only positive emotion category used in their emotion model (Resch et al. 2016). Matassa and Rapp used eight emotion categories for their concept of the smartphone app UMap, namely Happy, Sunny, Blissful, Sad, Alone, Calm, Impatient, and Wishful (see Figure 4), but a rationale for this particular classification was not provided (Matassa and Rapp 2015).
Although the specific emotion categories thus appear to vary across the literature, the emotion categories on an emotion map are typically visualized by assigning a different colour to each emotion (see Figure 3). The emotions appear as coloured dots at the relevant locations on the emotion map, or sometimes a user or participant is also allowed to define a customized area on the emotion map, as can be seen in Figure 3. However, the specific details of the resulting visualizations can differ across the literature as well.
In a slightly more abstract emotion map proposed by Matassa and Rapp as part of their emotion map app for cyclists, for example, the different emotion categories are depicted by different coloured circles on the emotion map (see Figure 4). The size of the circle could then be used to, for example, depict the intensity of the emotion linked to the memory in that place. Furthermore, the emotion map is interactive, allowing memories and places to be added and removed, and the emotion connected to a place to be changed, thus taking into account the temporality of emotions (Matassa and Rapp 2015). However, visualizations based on simplified emotion categories using qualitative measurements or crowdsourced data appear to incorrectly assume that there is only one emotion connected to a person's experience of their personally significant places, and typically only allow one emotion to be linked to a place on the map, regardless of whether it is the emotion map of an individual (Matassa and Rapp 2015) or an aggregated emotion map combining the data of multiple people (Nold 2009;Quercia, Schifanella, and Aiello 2014;Resch et al. 2016). Although more research is necessary, initial analysis of the qualitative data of our pilot study of ten personally significant places showed that participants can have between two and seven different emotions connected to the experience of a personally meaningful location (Stals, Smyth, and Mival 2017a). These emotion maps thus currently appear to oversimplify the complexity of the emotional experience of a place, due to technical limitations or for visualization purposes. This poses a problem if we as Urban Interaction Designers want to use these emotion maps to create a better understanding of the triangular relationship between person, place and technology, or use these emotion maps to share and communicate people's emotional bond with personally significant places in the city to other people.
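The colour-per-category, size-per-intensity scheme described above can be sketched in a few lines. The following is a minimal illustration, not an implementation of any of the cited systems: the emotion names, colour assignments, and the linear intensity-to-radius rule are all assumptions made for the example.

```python
# Illustrative mapping of geo-located emotion records to map markers,
# in the spirit of Matassa and Rapp's coloured-circle proposal (Figure 4).
EMOTION_COLOURS = {
    "happy": "yellow",   # colour choices are arbitrary assumptions
    "calm": "green",
    "sad": "blue",
    "angry": "red",
}

def to_marker(record, base_radius=4, scale=3):
    """Turn one (lat, lon, emotion, intensity) record into marker attributes.

    The emotion category picks the colour; intensity (assumed 1-5)
    scales the circle radius, so stronger emotions appear larger.
    """
    lat, lon, emotion, intensity = record
    return {
        "lat": lat,
        "lon": lon,
        "colour": EMOTION_COLOURS.get(emotion, "grey"),
        "radius": base_radius + scale * intensity,
    }

records = [
    (55.9533, -3.1883, "happy", 5),  # e.g. a favourite square, felt strongly
    (55.9485, -3.2020, "sad", 2),    # a place tied to a milder, difficult memory
]
markers = [to_marker(r) for r in records]
print(markers[0]["colour"], markers[0]["radius"])  # → yellow 19
```

Note that this scheme hard-codes the one-emotion-per-place assumption criticized above: each record carries exactly one emotion category, which is precisely the oversimplification that a richer emotion map would need to avoid.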
If we want emotion maps to more accurately represent the emotional bond that people have with personally significant places in the urban environment, or to use those maps to communicate or share this relationship with other people, we thus first need to further unpick the details of the complex emotional bond that people have with personally meaningful locations in the urban environment.

Emotion Maps as Speculative Design
Although emotion maps (and arousal maps) have become a common way for researchers to visualize, represent and create an understanding of people's emotional bond with personally significant places in the urban environment, little research has been conducted on how these emotion maps, once created, could potentially be relevant to other people (e.g. non-researchers). Leahu, Schwenk and Sengers provided a group of friends with a mock-up emotion map during a walk through a familiar city. They found that emotion maps can act as a mnemonic trigger and as a therapeutic instrument for self-reflection (Leahu, Schwenk, and Sengers 2008). Similarly, Matassa and Rapp aim to use the emotion map as a tool to strengthen the bond between a person and their own personally significant places, with the aim of raising engagement and stimulating people to take care of urban spaces (Matassa and Simeoni 2014). Following the current trend in the field of Quantified-Self technology and Personal Informatics of exploring what might be valuable or interesting about personal data beyond personal use, we are not only interested in how people would like to capture and represent their own emotional bond with personally significant places in the city, but also in how this data could potentially be relevant to, and used and explored by, other people. In the study by Leahu, Schwenk and Sengers, although participants regarded their own arousal map as something personal and intimate, they were also willing to share it as an artefact with loved ones (e.g. by framing the paper map like a painting and giving it away as a present) (Leahu, Schwenk, and Sengers 2008). We are particularly interested in where people's interest would lie when exploring somebody else's personal data depicted in an emotion map.
Rather than attempting to create more accurate emotion maps, we acknowledge their current limitations (and the limitations of the technology currently used to measure and collect spikes in biometric data and emotions for the creation of emotion maps and arousal maps), and propose to use those limitations as a provocation. The aim is to use these emotion maps as a research tool to further unpick the emotional bond people currently have with personally meaningful locations in the urban environment, to explore the different forms this data could take, and to explore the potential for capturing, sharing and exploring this personal data with other people using emotion maps. Nold, for example, used the aggregated arousal maps of multiple participants to identify potential "hotspots" in the city where arousal levels peaked for multiple participants. These aggregated arousal maps were subsequently presented to the participants to act as a catalyst and memory trigger to facilitate public discussion, in order to create a better understanding of why those places caused heightened arousal levels for multiple participants (Nold 2009;Frodsham 2015). In a similar approach, Matassa and Vernero proposed to confront participants with dissonant memories that would contrast with their own memory and experience of a particular place, in order to learn how people would react to such distorted and misrepresented signals about the urban space in which they live their daily lives (Matassa and Vernero 2014).
Similarly, in our own study we take a speculative design approach (Auger 2013). Although many slightly different interpretations of speculative design exist, Auger argues that in general, speculative design serves two distinct purposes: it critiques current practice and it enables thinking about potential futures (Auger 2013). Thus, after the Walking & Talking tour with a participant through the city has finished, during which the participant has shown us their own personally significant places and has reflected on ways to represent their emotional experience of each place, the participant will be presented with an emotion map of the city as a provocation (see Figure 5).
This provocation is intended to stimulate reflection and critical attention within participants on their current personal, emotional relationship with places in the urban environment, and on how such personal geo-located emotion data might be used, explored and shared using emotion maps in the (near) future. The emotion map is intended to act as a catalyst and conversation piece to help participants imagine and reflect on a future scenario in which personal data regarding other people's emotional bonds with places in the city would be available to them in the form of an emotion map, and how they could potentially use such a map. The emotion map contains different types of positive and negative emotions connected to specific locations and areas in the city, with each emotion indicated by a different colour. A specific aim is to investigate the potential influence of the different types of emotions connected to places on the relevance of these person-place relationships to other people, and on people's interest in exploring this personal data. Our hypothesis is that participants will be more interested in places with extreme positive or negative emotions connected to them (e.g. anger and love) (Stals, Smyth, and Mival 2017a).
One aspect that the emotion maps encountered in the literature appear to have in common is that they are typically a visual medium. However, a potential theme for future speculation could be the use of multimodal interactions or interactions other than visual ones (e.g. auditory, olfactory, and tactile) to capture, explore and share the emotional experience of a place (Stals, Smyth, and Mival 2017a). Indeed, one of the limitations of emotion maps is that they typically provide a top-down, visual representation of the city and the emotions experienced at a personally meaningful place. In fact, Nold experimented with both well-known visualization techniques in cartography and new visualization techniques in order to find the best way to visually represent the collected geo-located arousal data in arousal maps, such as a metro-style arousal map of Paris (Nold 2008) and the terrain elevation-style arousal map of Greenwich (Nold 2006). All the proposed arousal maps were strictly visual representations, though. In addition, the digital media proposed for linking to locations on the map, such as the pictures and videos suggested by Matassa and Rapp, are predominantly visual media as well (Matassa and Rapp 2015). This is not to say that emotion maps allowing multimodal interactions, or interactions other than the visual, would in any way be better, more effective or more desirable than strictly visual emotion maps. But these other modalities could potentially also be used in a speculative design approach as a means of provocation, to create a conversation piece with which to investigate with participants the different forms emotion data related to certain places in the city could potentially take, and how this personal emotion data could potentially be communicated to, and shared with, other people.
As such, speculative emotion maps are tools that can enable us to reflect upon and create a better understanding of the emotional bond that people currently have with personally meaningful places in the urban environment, and could potentially inform the design of future technological devices and services to capture, share and explore this personal, geo-located emotion data in novel ways in the future.

Conclusions
With the vision of ubiquitous computing becoming increasingly realized through smart city design, the proliferation of mobile and wearable technology, and the rise of the quantified-self movement, there is a growing need to create a better understanding of the triangular relationship between person, place and technology in the urban environment. To this end, there has been an increased critical focus on emotion and affect to create a better understanding of the urban lived experience, and to augment people's experience of the urban environment.
Inspired by the psychogeographies of the Situationist movement, the mental mapping of cities by urban planner Kevin Lynch, and the emotional cartographies of artist Christian Nold, the creation of emotion maps has become common practice for researchers in the field of Urban Interaction Design. Such methods are used to capture and understand people's emotional experience of, and relationship with, places in the urban environment. Based on a systematic review of emotion maps in existing literature, and our own work on the design of future technological devices and services, we have highlighted and discussed the strengths, limitations and potential of capturing, exploring, communicating and sharing this personal, geo-located emotion data with other people using emotion maps.
Although emotion maps (and arousal maps) currently do not accurately capture and represent the profound, complex emotional bond that people have with personally meaningful places in the city and may even oversimplify this complexity due to limitations in technology, we argue that they could be used as a provocation in a speculative design approach. As such, emotion maps could help to create a better understanding of the personal, emotional relationship that people currently have with personally meaningful places in the city, and to explore the potential and value of sharing this personal geo-located emotion data with other people in novel ways in the (near) future.