At least not for certain tasks, such as rendering video, even of 3D environments.
An interesting bit over at Kotaku.
"The difference between the low-precision and the standard arithmetic was trivial," Shaw says. "It was about 14 pixels out of a million, averaged over many, many frames of video." "No human could see any of that," Bates adds.
Why is this a big deal? Because apparently you can fit 1,000 low-precision cores in the space that 12 conventional cores would occupy.