There is a new generation of apps underway that do not limit themselves to touch, hearing and sight. Instead they are deploying new senses - at least new to the mobile environment - such as the sense of smell.
Some are even incorporating low-level kinematic capabilities to sense the force or velocity of nearby objects.
Hands Warm and Off the Steering Wheel
Not yet available to US users, PocketHeat is an app that keeps your hands warm. It works by driving the iPhone's processor to 100 percent of its capacity, according to TUAW. A slider adjusts the temperature and illuminates the "heating elements."
The iBreath, while not an app, is part of the same trend. An iPod accessory that transmits music to a car radio, it also doubles as a portable breathalyzer, writes MacNewsWorld. A driver blows into the attached tube to get a blood alcohol content (BAC) reading accurate to within 0.01 percent, up to a maximum of 0.12 percent BAC.
From Illness to Good Parking Spots
More interesting - and likely more useful - are the apps under development in scientific or corporate labs. Software being developed by American and Australian scientists is expected to allow patients to cough into their phones. The app will then tell the user whether he or she has a cold, the flu, pneumonia or another respiratory disease, the Telegraph writes.
The software would compare the patient's cough against a pre-recorded database of coughs from people with the full range of respiratory diseases, of both genders, and across various ages, weights and other variables.
STAR Analytical Services, the firm behind the research, wants to have at least 1,000 coughs in its database before it launches. Currently it has about a dozen.
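The articles do not say how the comparison would work, but a minimal sketch of the idea - matching a new cough against the closest pre-recorded reference - might use nearest-neighbour matching on acoustic feature vectors. The feature values, labels and distance metric below are all illustrative assumptions, not details from STAR's research.

```python
import math

# Toy reference database: (acoustic feature vector, diagnosis label).
# Real features (e.g. spectral measurements) and labels are assumptions here.
REFERENCE_COUGHS = [
    ((0.82, 0.31, 0.12), "cold"),
    ((0.45, 0.77, 0.40), "flu"),
    ((0.20, 0.65, 0.90), "pneumonia"),
]

def euclidean(a, b):
    """Straight-line distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify_cough(features):
    """Return the diagnosis label of the closest reference cough."""
    best = min(REFERENCE_COUGHS, key=lambda ref: euclidean(features, ref[0]))
    return best[1]

print(classify_cough((0.25, 0.60, 0.85)))  # closest to the pneumonia sample
```

With only a dozen recordings, any such matcher would be unreliable - which is presumably why STAR wants at least 1,000 coughs before launch.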
Apple has filed several patent applications that could develop into apps to help you find a parking spot or interpret three-dimensional, force-based "gestures" with a mouse, according to Apple Insider.
One application describes a computer mouse equipped with one or more force and motion detection sensors. How the device is handled relative to a surface - say, a desk - would be interpreted by the device and shared with the computer. "This data could be used to understand 'gestures' in the same way Apple has employed multi-touch gestures on the iPhone and with trackpads on its MacBook and MacBook Pro computers," Apple Insider says.
Such gestures may include, for example, brushing motions, scooping motions, nudges, tilt and slides, and tilt and taps, the application reads. The application can then respond to each gesture (or gesture combination) in any number of ways.
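The filing's core idea - mapping raw force and motion readings to named gestures, each of which triggers a response - could be sketched roughly as below. The thresholds, gesture names and actions are invented for illustration and do not come from Apple's application.

```python
# Hypothetical sketch of force-gesture dispatch: raw sensor readings are
# mapped to a coarse gesture name, and each gesture triggers an action.
# All thresholds and action names are assumptions.

def interpret_gesture(force, tilt_deg, velocity):
    """Map force (N), tilt (degrees) and velocity (m/s) to a gesture name."""
    if tilt_deg > 15 and force > 2.0:
        return "tilt-and-tap"
    if tilt_deg > 15:
        return "tilt-and-slide"
    if velocity > 0.5 and force < 1.0:
        return "brush"
    if force > 2.0:
        return "nudge"
    return "none"

# Each recognized gesture maps to a response, as the filing describes.
ACTIONS = {
    "brush": "dismiss notification",
    "nudge": "switch workspace",
    "tilt-and-tap": "secondary click",
    "tilt-and-slide": "scroll",
}

gesture = interpret_gesture(force=0.4, tilt_deg=3, velocity=0.8)
print(ACTIONS.get(gesture, "no action"))  # a light, fast motion reads as a brush
```

A real implementation would classify noisy, continuous sensor streams rather than single readings, but the dispatch structure would be similar.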
Another application would make finding a parking spot easier. With the car's current location identified by GPS, the app would access a database of parking regulations, traffic patterns, travel times and more. "Given that cities often have specific times where parking spots are available, or where parking rates are adjusted, such a system would look to help users avoid tickets and find the fastest, quickest and least expensive parking option."
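The lookup this describes - filter nearby spots by the current time's regulations, then rank by cost and distance - could be sketched as follows. The schema, spot names and rules are illustrative assumptions, not details from the filing.

```python
from datetime import time

# Toy database of nearby spots: (name, distance in km, hourly rate,
# optional no-parking window). Real data would come from city regulations.
PARKING_DB = [
    ("Main St meter", 0.2, 3.50, (time(16, 0), time(18, 0))),
    ("5th Ave garage", 0.6, 2.00, None),
    ("Oak St curb", 0.1, 0.00, (time(8, 0), time(10, 0))),
]

def is_legal(spot, now):
    """A spot is legal unless the current time falls in its no-parking window."""
    window = spot[3]
    return window is None or not (window[0] <= now < window[1])

def best_spot(now):
    """Cheapest legal spot, breaking ties by distance; avoids ticketable spots."""
    legal = [s for s in PARKING_DB if is_legal(s, now)]
    return min(legal, key=lambda s: (s[2], s[1]), default=None)

print(best_spot(time(17, 0))[0])  # Main St is a no-parking zone at 5pm; Oak St is free
```

Ranking could just as easily weight travel time or rate changes; the filing leaves the scoring open.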