Context is everything. Your smartphone knows where you are and what direction you're facing, but that's about it. What if it knew your context?
Not just where you are, but what you're doing, who you're with – and what you're likely to do next. It's called contextual computing, and it's going to make your smartphone into a much better personal assistant.
With context, a smartphone, tablet, wearable device or headset could detect when you're driving and read your text messages out loud. It could detect when you're low on battery and start preserving power. It could even alert you when someone really important to you is suddenly nearby.
Contextual computing is about improving interaction between human and computer – and it's about computers becoming intelligent.
"It's the ability for a device, object or service to be aware of not only the users surroundings but about the user, their views, behaviours and their interests," says Kevin Curran, senior member at the Institute of Electrical and Electronics Engineers (IEEE) and Reader in Computer Science at the University of Ulster. "They adapt their functionality and behaviour to the user and his or her situation."
Defining exactly what we mean by 'context' is tricky, but it's generally agreed that it includes the user's location, environment and orientation, their emotional state, the task they're engaged in, the date and time, and the people and objects around them. Your phone can probably already calculate some of those.
Your next smartphone
Contextual computing isn't so much about your smartphone as about your next smartphone. Computing context can be based on rules and the sensor inputs that now fill our phones, but it's also about machines making assumptions and anticipating our every whim. It's the missing link.
"Instead of the user having to go and look for something like hotels, this device would already know what kind of hotel you are looking for by using the information gathered on what hotels they have picked in the past what facilities they used," says Curran.
If you always stay at hotels with a swimming pool or a spa, that's the context within which the phone will help you search. But it could go further; your phone will know when you're due to arrive at the hotel, and if you're in the car, talk you in. If you're on the train, it will have downloaded your favourite podcasts and turned the ringer off if you're sat in the silent carriage.
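The hotel example Curran describes can be sketched as simple preference learning: tally the facilities that recur across past stays, then rank new candidates against that profile. This is a minimal illustration; the data structures, the 50% threshold and the hotel names are assumptions, not anything a real app exposes.

```python
# Hypothetical sketch: infer preferred amenities from past hotel stays,
# then rank candidate hotels against that learned preference profile.
from collections import Counter

def preferred_amenities(past_stays, min_share=0.5):
    """Amenities present in at least min_share of past stays."""
    counts = Counter(a for stay in past_stays for a in stay["amenities"])
    return {a for a, n in counts.items() if n / len(past_stays) >= min_share}

def rank_hotels(candidates, prefs):
    """Order candidate hotels by how many preferred amenities they offer."""
    return sorted(candidates,
                  key=lambda h: len(prefs & set(h["amenities"])),
                  reverse=True)

past = [
    {"name": "Alpha", "amenities": ["pool", "spa", "wifi"]},
    {"name": "Beta",  "amenities": ["pool", "gym"]},
    {"name": "Gamma", "amenities": ["pool", "spa"]},
]
hotels = [
    {"name": "Budget Inn", "amenities": ["wifi"]},
    {"name": "Spa Resort", "amenities": ["pool", "spa", "bar"]},
]
prefs = preferred_amenities(past)        # pool and spa recur most often
best = rank_hotels(hotels, prefs)[0]     # the spa hotel ranks first
```

A production system would weight recency and price too, but the core idea is the same: context narrows the search before you type a word.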
Calculate the context
There are already some apps that try to make use of the sensors on smartphones to calculate some context. CallWho uses your call history to estimate who you would like to call; pull your phone out of your pocket for your daily phone call with your wife and her face will be at the top of the list. It also makes pictures of your contacts bigger; after all, who ever thought an A-Z list of contacts was pleasant to scroll through?
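A CallWho-style ranking can be approximated by scoring each contact on call frequency, weighted towards recent calls. The exponential-decay scoring below is an illustrative assumption, not CallWho's actual algorithm.

```python
# Illustrative sketch: rank contacts by how often and how recently you
# called them, using an exponential decay so recent calls count more.
import time

def rank_contacts(call_log, now=None, half_life_days=7.0):
    """call_log: list of (contact, unix_timestamp) pairs."""
    now = now or time.time()
    scores = {}
    for contact, ts in call_log:
        age_days = (now - ts) / 86400
        # A call loses half its weight every half_life_days.
        scores[contact] = scores.get(contact, 0.0) + 0.5 ** (age_days / half_life_days)
    return sorted(scores, key=scores.get, reverse=True)

now = 1_700_000_000
log = [("Alice", now - 3600),            # an hour ago
       ("Alice", now - 2 * 86400),       # two days ago
       ("Bob",   now - 30 * 86400),      # a month ago
       ("Bob",   now - 40 * 86400)]
print(rank_contacts(log, now=now))       # Alice first: more recent calls
```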
Sickweather monitors Facebook statuses about illness and cross-references GPS locations to tell you whether you're near to someone who might pass on sickness. In return you tell the app if you're unwell. That approach might seem hopelessly inaccurate or a novelty, but building models of movement can have life-saving effects. Back in 2009, researchers at Telefónica used mobile phone records of its users in Mexico to help the government limit the spread of the H1N1 epidemic.
By revealing exactly where people were, the government could evaluate their decision to close an airport and a university campus; in doing so they reduced mobility by a third and pushed back the spread of the disease by at least a couple of days. Combine social media with that – and the much more precise GPS data smartphones are capable of now – and contextual awareness becomes a powerful means of pattern-spotting.
A contextually aware app that does just that is Agent, which uses GPS, gyroscope, accelerometer, Bluetooth, temperature and WiFi data from your phone combined with social data on who you're with to assess your context, and make decisions for you.
It goes from the mundane (it knows when you're sleeping and automatically silences your phone) to the possibly life-saving (it can sense that you're driving and automatically reads text messages aloud), but there's more to come. "We have abundant computing power, along with lots of sensors, which means that smartphones can look at many inputs and start learning what we are doing, where we are, and who with,"
Kulveer Taggar, CEO at Agent, told TechRadar. Taggar thinks that in future Agent will be able to use a phone's microphone to sense that you're at a party or concert – somewhere loud where you're not going to hear your phone if it rings – and so switch to vibrate. For now Agent is limited to Android, but an iOS version is in the offing.
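The microphone idea Taggar describes boils down to estimating ambient loudness and switching the ringer profile when it crosses a threshold. The RMS measure, threshold value and profile names below are assumptions for illustration, not Agent's implementation.

```python
# Hedged sketch: estimate ambient loudness from raw audio samples and
# pick a ringer profile. Samples are floats in [-1, 1].
import math

def ambient_rms(samples):
    """Root-mean-square amplitude: a simple proxy for loudness."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def choose_profile(samples, loud_threshold=0.3):
    """Vibrate in loud surroundings (a party or concert), ring otherwise."""
    return "vibrate" if ambient_rms(samples) > loud_threshold else "ring"

quiet = [0.01, -0.02, 0.015, -0.01]   # near-silent room
party = [0.6, -0.7, 0.55, -0.65]      # loud environment
print(choose_profile(quiet), choose_profile(party))  # ring vibrate
```

On a real phone the samples would stream from the microphone API and the threshold would need calibration per device, but the decision logic stays this simple.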
With Apple's iBeacon technology ramping up, expect a surge in retail apps that let hardware in shops and shopping centres send you messages, reminders and discount codes according to your exact location in a store or mall.
iBeacons could also be used at home; Placed automatically opens other apps on your phone as you approach specific devices in your home. For now it's messy and pricey – you'd have to fill your house with iBeacon hardware – but it gives you the right app when you need it. So it could launch a to-do app like Wunderlist or Remember The Milk when you sit down at your desk, open Spotify as you walk out the front door, or fire up a remote control app when you sit down in front of your TV.
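A Placed-style setup is essentially a lookup table from beacon identifiers to apps, triggered by whichever beacon is nearest and in range. The beacon names, app names and two-metre range below are made up for illustration.

```python
# Illustrative sketch: map beacon IDs to apps and pick the app for the
# nearest beacon currently within range.
BEACON_APPS = {
    "desk-beacon": "Wunderlist",
    "door-beacon": "Spotify",
    "tv-beacon":   "TV Remote",
}

def app_for_nearest(sightings, max_distance_m=2.0):
    """sightings: list of (beacon_id, estimated_distance_m) pairs."""
    in_range = [(d, b) for b, d in sightings
                if d <= max_distance_m and b in BEACON_APPS]
    if not in_range:
        return None            # nothing close enough: do not launch anything
    return BEACON_APPS[min(in_range)[1]]

print(app_for_nearest([("tv-beacon", 1.2), ("door-beacon", 4.0)]))  # TV Remote
```

On iOS the distance estimates would come from Core Location's beacon ranging; the hard part in practice is debouncing noisy distance readings so apps don't flicker open as you walk past.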
"The remote control of a TV could be used in contextual computing scenarios by identifying the person that is holding it and displaying options suited to that viewer," says Curran, adding that the future of contextual computing probably isn't about apps at all. "As phones get smarter and tablets become popular," he says, "users will have a device where apps disappear and become part of the gadget's intelligence."
Curran predicts that the future Internet will be a "pervasive sensing and acting knowledge network … able to make decisions, actuate environmental objects and assist users." Not surprisingly, given their concentration on mobile devices, contextual computing is a big focus for the digital big boys. Microsoft's new voice-based personal digital assistant Cortana can silence your phone during your regular quiet times, warn you about traffic on common routes, and even have an electronic boarding pass ready for you when you check your phone at an airport. At present it's a Windows Phone 8.1 technology, but it should spread.
However, even the present generation of personal digital assistants is becoming context-aware.
"Google Now is one of the best," says Taggar of current contextual computing innovations. "Google's goal is very much a focus on showing you your search results before you need them, which is slightly different to Agent," he says. "But Google and Apple are definitely looking at this."
They certainly are. Google's Project Tango is a 5-inch Android smartphone prototype that tracks the device's 3D motion to create a 3D map of its surroundings.
Meanwhile, in April Twitter acquired the team behind Cover, an Android lock screen app that lets you customise which apps you see, and when. Expect the Twitter app to become a lot more contextual.
Relax and unwind
Contextual computing isn't just about making decisions – it's also about switching off. As well as saving us brain-cycles throughout the day by automating repetitive tasks, a contextually aware device won't disturb you when you're sleeping, when you're busy working, or when you're in specific places (say, a museum, the theatre or on holiday).
Could it be a key weapon in helping us quieten our busy, gadget-driven lives? Instead of being slaves to our phones and in a permanent 'always on' state, devices will move from being an erratic distraction to a helpful assistant.
If it's done properly, contextual computing could be about taking the power back.