Intelligent CIO Europe Issue 87 | Page 51

FEATURE : AI

Optimising AI inference: Cost, compliance and environmental impact

The future of AI inference lies in striking the right balance between cost, speed and availability. From edge versus cloud benefits to regulatory challenges and sustainability practices, organisations must think carefully about their decisions when it comes to AI. We spoke to Robin Ferris, Enterprise Architect, AI Lead at Pulsant, who talks about AI inference and how to create a digital infrastructure that's right for you.


Which do you believe offers more advantages, edge inference or cloud inference – and can you outline the benefits of each?
It comes down to what each of them can deliver for those working with AI models. One of the biggest distinctions is local vs. remote. Edge sits close to the model and to where the data is generated, whereas remote processing happens elsewhere, so the data has to travel to it. That means giving real thought to latency and privacy. That's where the conversation begins: what is being ingested, how that data is being handled and what the outcome is.
Some of the models we've seen are ingesting real-time live imagery, compared with someone ingesting a data feed that could just be numbers. You'd almost have competing strategies there. It's about asking the right questions and considering the various elements, such as factoring in latency, predicting the outcome and whether it's a system

ORGANISATIONS MUST QUESTION THE LEVEL OF SCALABILITY THEY DESIRE AND HOW MUCH POWER THEY REQUIRE TO GET THERE, AS WELL AS THE LENGTH OF THE JOURNEY.
