Nvidia Tesla T4 brings Turing smarts to AI inferencing
from Techreport
At his GTC Japan keynote, Nvidia CEO Jensen Huang noted that AI inferencing, that is, the use of trained neural network models to make predictions on new data, is set to become a $20-billion market over the next five years. More and more applications are going to demand services like natural language processing, translation, image and video search, and AI-driven recommendations, according to Nvidia. To power that future, the company is putting the Turing architecture into data centers with the Tesla T4 inferencing card and letting models run on those cards through the TensorRT Hyperscale Platform.
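For readers unfamiliar with the term, "inferencing" just means running an already-trained model forward on incoming data, as opposed to training it. The minimal sketch below illustrates that distinction using PyTorch for convenience; it is not Nvidia's TensorRT Hyperscale Platform, and the model choice and batch contents are made up for illustration.

```python
# Hypothetical sketch of inference (not Nvidia's TensorRT stack):
# load a model that was trained elsewhere and run a forward pass on new data.
import torch
import torchvision.models as models

device = "cuda" if torch.cuda.is_available() else "cpu"

# Load a pretrained image-classification model; training already happened offline.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()          # put layers like dropout/batch-norm into inference behavior
model.to(device)

# A dummy batch standing in for incoming requests (e.g., an image-search query).
batch = torch.randn(8, 3, 224, 224, device=device)

with torch.no_grad():  # no gradients needed; inference only reads the trained weights
    logits = model(batch)
    predictions = logits.argmax(dim=1)

print(predictions)
```

In a production deployment, this forward pass would run on an accelerator such as the Tesla T4, with a serving layer (in Nvidia's pitch, the TensorRT Hyperscale Platform) handling model optimization and request batching.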
The Tesla ...