Machine Learning Chip ‘Inferentia’ Launched by Amazon, Taking On Nvidia and Intel

Image credits: Amazon

On Wednesday, 28th November, Amazon launched a machine learning chip, entering a market that both Intel and Nvidia Corp are counting on to boost their earnings in the coming years.

Amazon is one of the biggest buyers of chips from Intel and Nvidia, whose semiconductors help power Amazon’s thriving cloud computing unit, Amazon Web Services. Now, Amazon has started designing its own chips.

Amazon’s ‘Inferentia’ chip, announced on Wednesday, will help with what researchers call inference: the process of taking a trained artificial intelligence algorithm and putting it to use, for example by scanning incoming audio and translating it into text-based requests.
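To make the training-versus-inference distinction concrete, here is a minimal, hedged sketch in Python. The model, the synthetic "audio feature" data, and the library choice (scikit-learn) are illustrative assumptions for this article, not anything Amazon has published about how Inferentia works.

```python
# Minimal sketch of the training-vs-inference split described above.
# All data and model choices here are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

# --- Training: done once, typically on large, expensive hardware ---
rng = np.random.default_rng(0)
X_train = rng.normal(size=(1000, 20))        # stand-in for extracted audio features
y_train = (X_train[:, 0] > 0).astype(int)    # stand-in for labels
model = LogisticRegression().fit(X_train, y_train)

# --- Inference: done constantly, on every incoming request ---
# This is the step chips like Inferentia are meant to accelerate:
# applying an already-trained model to new data as cheaply as possible.
X_new = rng.normal(size=(1, 20))             # one incoming request
prediction = model.predict(X_new)
print("predicted class:", prediction[0])
```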

Amazon’s chip is not a direct threat to Intel’s and Nvidia’s businesses because it is not for sale. Instead, Amazon will sell services to its cloud customers that run atop the chips starting next year. But if Amazon comes to rely mainly on its own chips, it could deprive Intel and Nvidia of one of their major customers.

Intel’s processors currently dominate the market for machine learning inference, which analysts at Morningstar estimate will be worth $11.8 billion by 2021. Nvidia launched its own inference chip in September to compete with Intel.

On Monday, Amazon announced a processor chip for its cloud unit called Graviton. The chip is powered by technology from SoftBank Group Corp-controlled Arm Holdings. Arm-based chips currently power mobile phones, but multiple companies are trying to make them suitable for data centers. These Arm chips potentially represent a major challenge to Intel’s dominance in that market.

Amazon is not alone among cloud vendors in designing its own chips. Alphabet-owned Google’s cloud unit unveiled an artificial intelligence chip in 2016 designed to take on chips from Nvidia. Analysts have pointed out that such investment is driving up research and capital expenses for big tech companies.

Google Cloud has said that customer demand for Google’s custom chip, the TPU, is strong. But these chips are costly to use and require software customization. Google Cloud charges $8 per hour for access to its TPU chips and $2.48 per hour in the United States for access to Nvidia’s chips, according to Google’s website.
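A back-of-the-envelope comparison using the hourly prices quoted above shows why the raw rate alone does not decide which chip is cheaper. The 100-hour job size below is an arbitrary assumption for illustration only; real bills depend on how quickly each chip finishes the same workload.

```python
# Hourly prices quoted in the article; job duration is a hypothetical assumption.
TPU_PRICE_PER_HOUR = 8.00    # USD per hour, Google Cloud TPU
GPU_PRICE_PER_HOUR = 2.48    # USD per hour, Nvidia GPU on Google Cloud (US)

job_hours = 100              # hypothetical workload duration
tpu_cost = TPU_PRICE_PER_HOUR * job_hours
gpu_cost = GPU_PRICE_PER_HOUR * job_hours

print(f"TPU: ${tpu_cost:,.2f}   GPU: ${gpu_cost:,.2f}")
# At these rates, the TPU only ends up cheaper if it completes the same job
# in less than 2.48 / 8.00 = 31% of the time the GPU needs.
```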
