Amazon’s EC2 Inf1 instances are powered by AWS Inferentia chips, which can deliver up to 30% higher throughput and up to 45% lower cost per inference. By contrast, Amazon EC2 F1 instances use FPGAs to enable the delivery of custom hardware accelerations, according to Analytics India Magazine.
Nevertheless, Stephen Simpson argues that it is also important to address data-intensive network communication latency, despite the promising innovation currently being seen in core data centre servers. He says the emerging trend is to “offload these responsibilities – including encryption and data loading – to a dedicated processing unit that also provides advanced security and accelerated data-movement capabilities”.