Earlier this year, the Future Cities Catapult teamed up with Microsoft and Guide Dogs to conduct “Cities Unlocked”: a project to improve cities for the visually impaired. But, says Claire Mookerjee, it could provide benefits for all of us.

Imagine if the city you worked in worked hard for you, too. If signs showed what you needed to know, when you needed to know it; stations didn’t need barriers, because trains and buses already knew you had a ticket; storefronts told you that they stocked an item just because it was on your shopping list.

It’s not the pipe dream it sounds. With maturing mobile technology, a growing pool of personal data in digital form, and increasingly smart streets, this kind of city experience is increasingly within reach. At the Future Cities Catapult, we’ve been working on a project with Microsoft and Guide Dogs to investigate how that idea could help visually impaired people navigate their cities better.

To do that, we wanted to understand how the visually impaired experienced the world; to know what they found confusing, frustrating or, occasionally, helpful. So as part of Cities Unlocked we followed participants, watching them and monitoring their emotional reactions using EEGs to see how urban environments made them feel. Now we know, for instance, that visually impaired people find pedestrianized areas particularly frustrating, while both sighted and non-sighted individuals find green space relaxing.

Armed with those kinds of insights, we worked with Microsoft to build a prototype device that supplies a 3D soundscape to augment the sounds of the street. Running on a smartphone, the app uses location data and interactions with nearby wireless networks to provide audio cues – about orientation, navigation and nearby points of interest – via a set of bone-conduction headphones, which don’t block out ambient sound. In real-world trials – on a route between Reading and London involving walking, bus and train journeys – participants reported feeling significantly more confident and comfortable in their surroundings when using the device.
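To make the idea concrete, here is a minimal sketch of how an app like this might choose which audio cues to speak, based on the user's estimated distance from nearby beacons. The beacon names, cue text, and ranges are invented for illustration; a real system would derive distances from Bluetooth signal strength and GPS, which this sketch simply takes as input.

```python
from dataclasses import dataclass

@dataclass
class Beacon:
    name: str       # hypothetical identifier for a point of interest
    cue: str        # audio prompt to play when the user is nearby
    range_m: float  # distance within which the cue is relevant

def cues_for_position(beacons, distances):
    """Return audio cues for beacons the user is currently near.

    `distances` maps beacon name -> estimated distance in metres,
    e.g. derived from wireless signal strength.
    """
    nearby = [b for b in beacons
              if distances.get(b.name, float("inf")) <= b.range_m]
    # Closest first, so the most relevant cue is spoken first.
    nearby.sort(key=lambda b: distances[b.name])
    return [b.cue for b in nearby]

beacons = [
    Beacon("bus-stop-12", "Bus stop ahead on your left", 15.0),
    Beacon("station-entrance", "Station entrance, 40 metres ahead", 50.0),
]
print(cues_for_position(beacons, {"bus-stop-12": 8.0,
                                  "station-entrance": 42.0}))
```

Ordering cues by proximity is one simple way to keep the soundscape from overwhelming the user: the nearest, most actionable prompt always comes first.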

But it’s not just the visually impaired that can benefit. Participants in our study all felt happier because they knew more about their surroundings: something we all deserve.

Now, more than ever, our cities can communicate with us about themselves. They can provide information, answer our questions and even express their characters in interesting new ways: that can mean basic navigational prompts in a city we’ve never visited, provided via discreet vibration of a smart watch, or nuanced local knowledge about the vibe of a coffee shop, articulated through music.

The technology’s certainly ready. We each carry a smartphone that’s capable of knowing where we are, sending and receiving large amounts of data, interacting with other devices and logging rich streams of information. Indeed, tools like digital maps are already using much of it to curate digital representations of space for us, to provide the information we most likely need. All of this allows more opportunity to interact with a world of invisible information that currently circulates in our city streets, be it public Wi-Fi networks, Bluetooth beacons in shops or Near Field Communication sensors at tills.

So we find ourselves at a tipping point. The technology is woven into our cities, becoming increasingly rich by the day: what we need now is for city leaders to make the most of it and test new ways of using it at scale. With the integrated city systems market estimated to be worth £200bn by 2030, there’s certainly a huge financial incentive to do so.

Some people are already seizing the opportunity. Take Neatebox: a smartphone app that wirelessly triggers the push-button at a pedestrian crossing when you stand by it for a few seconds. It’s an elegant, if simple, example of how wireless networks and proximity data can be used by your smartphone to make life easier. Originally designed for those with disabilities, it could make life more straightforward for all of us.
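The dwell-then-trigger behaviour described above can be sketched in a few lines. This is not Neatebox's actual logic – just an illustrative assumption of how an app might decide to fire a crossing request once proximity readings show the user has stayed close to the crossing for a few seconds:

```python
def should_trigger(samples, dwell_seconds=3.0, max_distance_m=2.0):
    """Decide whether to fire the crossing request.

    `samples` is a list of (timestamp_seconds, distance_m) proximity
    readings, oldest first. Trigger once the user has remained within
    `max_distance_m` of the crossing for `dwell_seconds`.
    """
    dwell_start = None
    for t, d in samples:
        if d <= max_distance_m:
            if dwell_start is None:
                dwell_start = t
            if t - dwell_start >= dwell_seconds:
                return True
        else:
            dwell_start = None  # user moved away; reset the dwell timer
    return False

# Standing beside the crossing for four seconds:
print(should_trigger([(0, 1.5), (1, 1.4), (2, 1.6), (4, 1.5)]))  # True
# Walking past without stopping:
print(should_trigger([(0, 1.5), (1, 3.0), (2, 1.5)]))  # False
```

Requiring a sustained dwell, rather than a single nearby reading, is what distinguishes someone waiting to cross from someone merely walking past.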

But we think cities can achieve far more. What if every signpost in the city sensed your presence? Walking towards a sign wouldn’t just allow you to read it more clearly, but also to interact with it: it could point you in the direction of the store you’re already trying to find, tell you how long it will take to get home including the walk to the bus stop or even change language on the fly.

Taken to the extreme, this kind of interconnected city would create opportunities to redesign public spaces. If a train knows that you’re allowed to board it because you’ve already paid for your travel online, the need for ticketing – and with it the associated kiosks, machines and barriers – simply melts away. Instead, the station can become a more open and inclusive space, which allows for free movement and an emphasis on accessibility, rather than ticket purchase and validation.

That example, of course, requires the use of technology to be absolutely pervasive. But at a smaller scale it’s already happening. Now, it’s time for city leaders to make sure it’s done right, in a way that makes citizens’ lives easier and more enjoyable, while protecting their privacy and security. The city can work hard for us – we just need to make it do so.

Claire Mookerjee is project lead for urbanism at Future Cities Catapult