There’s finally a fix for Apple’s faulty iPhone 5S sensors — provided by a tiny company that develops technology for iPhone developers who need accurate spatial orientation and movement data.
The company is RealityCap, whose software helps iOS apps sense their surroundings in three dimensions by combining the iPhone’s camera and inertial sensor data. Because it needs accurate data to work, the company carefully investigated the iPhone 5S sensor problems that came to light in early October, and found that Apple’s newest flagship smartphone reported errors of as much as 5 to 6 degrees.
That’s obviously a problem for gaming and other apps that need good data.
With the new calibration software, iPhone 5S reports accurate data
Today, RealityCap released a fix that lets iOS apps adjust their readings to compensate for the incorrect iPhone 5S sensor data, essentially fixing hardware in software.
“Since the bias is more or less constant, a simple one-time calibration process can correct for it,” CEO Eagle Jones posted tonight. “There are several ways to approach this, but we find it most straightforward to ask the user to place their device on a flat, level surface. We then capture accelerometer data over a period of time (to smooth out any vibration or noise in the measurements).”
In a generous gesture, the company has posted sample code on GitHub that shows developers exactly how to accomplish this in their apps.
The company does note that its approach rests on two assumptions:
- The user doesn’t move the device during calibration. Small vibrations or bumps will be smoothed out by filtering, but larger disturbances could cause problems.
- The surface is closer to flat than the existing bias of the device. Therefore, on older devices where the bias range is closer to +/-1 degree, this calibration could actually make things worse.
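The one-time calibration described above can be sketched in a few lines. This is a minimal illustration of the idea, not RealityCap’s actual sample code: it assumes raw accelerometer samples captured while the device rests on a flat, level surface, averages them to smooth out vibration and noise, and derives the pitch/roll bias that later readings can subtract. The function names and the gravity convention (z-axis reading −g when flat) are assumptions for this sketch.

```python
import math

def estimate_bias(samples):
    """Average raw accelerometer samples (ax, ay, az) taken while the
    device rests flat and level, then derive the pitch/roll error in
    degrees. Averaging over many samples smooths out vibration and
    sensor noise, as described in RealityCap's approach."""
    n = len(samples)
    ax = sum(s[0] for s in samples) / n
    ay = sum(s[1] for s in samples) / n
    az = sum(s[2] for s in samples) / n
    # On a truly level surface the device should read (0, 0, -g);
    # any tilt it reports instead is the constant bias to correct for.
    pitch_bias = math.degrees(math.atan2(ax, -az))
    roll_bias = math.degrees(math.atan2(ay, -az))
    return pitch_bias, roll_bias

def corrected_tilt(ax, ay, az, bias):
    """Subtract the stored one-time bias from a live pitch/roll estimate."""
    pitch = math.degrees(math.atan2(ax, -az)) - bias[0]
    roll = math.degrees(math.atan2(ay, -az)) - bias[1]
    return pitch, roll
```

Because the bias is more or less constant, the pair of numbers from `estimate_bias` can be stored once and applied to every subsequent reading; this also shows why the second caveat matters, since on a device whose bias is smaller than the surface’s own tilt, the "correction" would encode the table’s slope instead.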
There are still questions about exactly why the problem is occurring, and how the sensor in question, which is provided by Bosch, is making the errors it is making. There is speculation that Apple went with a cheaper sensor, with a wider error range, than in previous iPhone generations. But there also appear to be multiple models involved.
At the time this story broke, I asked Apple for comment, to no avail.
Apple has declined to comment on the matter, but hopefully it will use the sample code RealityCap has posted on GitHub to correct the data that the iPhone 5S’s sensors are delivering to the operating system and apps. That would be a lot more efficient than every developer having to do the same job themselves.
Or Apple could simply put the previous part back into the iPhone spec and solve the problem at its source.