Impact of Intel® AI toolkit on AI model sustainability

Artificial Intelligence is embedded in everyday devices such as mobile phones, computer chips, and automobiles. These capabilities are powered by deep learning models that are trained on supercomputers in a data center or on cloud platforms such as AWS. The appeal of this approach is that a model needs to be trained on data only once; the trained model can then be used for inference in real applications such as autonomous vehicles, conversational AI, and ChatGPT. However, once these models are deployed to production, they consume power continuously in the data centers that serve them. This work focuses on implementing optimized libraries that accelerate inference while consuming less power than dedicated AI accelerators.
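
To make the last point concrete, here is a minimal sketch of how an optimized inference library from the Intel AI toolkit can be applied to an existing model. It assumes Intel Extension for PyTorch (imported as ipex) is installed and uses a placeholder ResNet-50 workload; it illustrates the general technique rather than this project's actual implementation.

```python
# Minimal sketch: CPU inference acceleration with Intel Extension for PyTorch.
# Assumes torch, torchvision, and intel_extension_for_pytorch are installed;
# ResNet-50 is only a placeholder workload for illustration.
import torch
import torchvision.models as models
import intel_extension_for_pytorch as ipex

model = models.resnet50(weights=None).eval()
sample = torch.rand(1, 3, 224, 224)

# ipex.optimize applies operator fusion and memory-layout optimizations for
# Intel CPUs; bfloat16 lowers compute and memory traffic per inference.
model = ipex.optimize(model, dtype=torch.bfloat16)

with torch.no_grad(), torch.autocast("cpu", dtype=torch.bfloat16):
    output = model(sample)

print(output.shape)  # torch.Size([1, 1000])
```

Because the same trained model now does less work per request, the energy drawn per inference in the serving data center drops, which is the power saving the work described above targets.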


