
TensorFlow Lite Now Supports Tapping OpenCL For Much Faster GPU Inference

from Phoronix (#571KJ)
TensorFlow Lite, the framework for on-device AI inference, now supports OpenCL on Android devices. With the new OpenCL back-end, TFLite GPU inference delivers roughly a 2x speed-up over the existing OpenGL back-end...
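The article itself does not include code, but for context, below is a minimal sketch of enabling TFLite's GPU delegate in a Kotlin Android app; the delegate chooses its GPU back-end internally, and with this release it can take the OpenCL path where available rather than the older OpenGL one. The model file name, the asset-loading helper, and the input/output array shapes are placeholders for illustration, not part of the announcement.

import android.content.Context
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.gpu.GpuDelegate
import java.io.FileInputStream
import java.nio.MappedByteBuffer
import java.nio.channels.FileChannel

// Memory-map a .tflite model bundled in the app's assets.
// "model.tflite" is a placeholder asset name.
fun loadModel(context: Context, assetName: String = "model.tflite"): MappedByteBuffer {
    val fd = context.assets.openFd(assetName)
    FileInputStream(fd.fileDescriptor).channel.use { channel ->
        return channel.map(FileChannel.MapMode.READ_ONLY, fd.startOffset, fd.declaredLength)
    }
}

// Run one inference on the GPU delegate. The delegate picks its back-end
// (OpenCL where supported, otherwise the existing OpenGL path) transparently.
fun runOnGpu(context: Context, input: Array<FloatArray>, output: Array<FloatArray>) {
    val gpuDelegate = GpuDelegate()
    val options = Interpreter.Options().addDelegate(gpuDelegate)
    val interpreter = Interpreter(loadModel(context), options)
    try {
        interpreter.run(input, output)
    } finally {
        interpreter.close()
        gpuDelegate.close()
    }
}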
External Content
Source RSS or Atom Feed
Feed Location http://www.phoronix.com/rss.php
Feed Title Phoronix
Feed Link https://www.phoronix.com/