It doesn’t matter what brand of smartphone you prefer; we’ve all struggled to type accurately on tiny on-screen keyboards. Research from ETH Zürich could mean fewer typing errors on the smartphones of the future. The researchers have developed a new AI solution that enables touchscreens to sense with eight times the resolution of current devices.
Their solution can infer much more precisely where fingers are on the touchscreen. Researchers on the project say the challenge with typing on modern smartphones is that the touch sensors detecting where fingers land haven’t changed much since the mid-2000s. The screens themselves, by contrast, have improved significantly, with higher resolution and fidelity.
While the screen resolution of the latest iPhone is 2532×1170, the touch sensor has a vastly inferior resolution of 32×15 pixels. Capacitive touchscreens detect finger position through changes in the electric field between the sensor lines, sensing the proximity of a finger as it approaches the screen surface.
The team says that because capacitive sensing captures proximity, it can’t detect true finger contact. The method the team came up with is called CapContact, and it combines two approaches. In effect, the touchscreen works as an image sensor that can see about eight millimeters above the surface, much like a depth camera recording how close objects are. CapContact exploits this insight to accurately detect the contact areas between fingers and the surface using a deep-learning algorithm built by the team.
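To make the idea concrete, here is a minimal sketch of the pipeline’s shape: a low-resolution capacitance frame is upsampled by the reported 8× factor and thresholded into a contact mask. This is an illustration only, not the researchers’ method; the real system uses a trained deep neural network rather than the nearest-neighbour upsampling and fixed threshold used here, and the capacitance values below are made up.

```python
import numpy as np

SENSOR_H, SENSOR_W = 15, 32   # touch-sensor grid cited for the latest iPhone
SCALE = 8                     # the 8x resolution gain reported by the team

def upsample(cap_image, scale=SCALE):
    """Nearest-neighbour upsampling of a low-res capacitance image
    (a crude stand-in for the team's learned super-resolution)."""
    return np.repeat(np.repeat(cap_image, scale, axis=0), scale, axis=1)

def contact_mask(cap_image, threshold=0.5):
    """Binary contact area: True where capacitance suggests real contact
    rather than mere proximity. The threshold here is arbitrary."""
    return upsample(cap_image) > threshold

# Fake capacitance frame: two fingers close together, as in a pinch gesture.
frame = np.zeros((SENSOR_H, SENSOR_W))
frame[7, 10] = 0.9   # finger 1
frame[7, 12] = 0.8   # finger 2

mask = contact_mask(frame)
print(mask.shape)    # (120, 256): eight times the sensor resolution
```

The point of the sketch is the input/output relationship, not the math: a 32×15 proximity image goes in, and a much finer map of true contact areas comes out.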
The team demonstrated that their system reliably distinguishes individual touches even when fingers press the screen close together, such as in a pinch gesture. Researchers believe the AI solution could pave the way for new touch sensing in future mobile phones and tablets, allowing them to operate more reliably and precisely while reducing the footprint and complexity of sensor manufacturing.