
Qualcomm Inc. is a major player in wireless telecommunications and mobile technology, and is well known as an industry-leading mobile chip manufacturer. The company is probably best known for its “Snapdragon” line of mobile chips used in many premium smartphones, including Samsung’s Galaxy S series, which has used Snapdragon chips in many of its variants since the Galaxy S II launched in 2011.

Check out a Forbes post on the newest Qualcomm chip inside the brand-new Samsung phone here

Now, it seems that Qualcomm is looking to take that mobile computing knowledge and expertise and apply it to the datacenter. The newly announced Qualcomm “Cloud AI 100” is a dedicated AI inference chip designed to meet the growing demand for cloud AI inferencing, accelerating AI experiences while keeping power consumption low. Qualcomm also develops the 5G-capable chipsets that will be used in a wide range of mobile devices, so it makes sense that the new Cloud AI 100 chip is optimised for edge-cloud infrastructure, which 5G networking stands to benefit from greatly.

Qualcomm says that this new AI solution is “built from the ground up” for AI acceleration and will offer 10x the performance per watt of the top solutions already on the market, such as Nvidia’s T4 accelerators and Google’s Edge TPU inference chips. According to Qualcomm, the chip will deliver around 350 TOPS (trillion operations per second) of AI performance, which would make it roughly 50x more powerful at AI workloads than the company’s current flagship mobile chipset, the Snapdragon 855, whose AI Engine is rated at around 7 TOPS. The Cloud AI 100 will be compatible with most industry-leading software stacks, including Keras, Caffe, TensorFlow and PaddlePaddle, with the ONNX, XLA and Glow runtimes also supported.
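To give a feel for what that software support means in practice, here is a minimal sketch of exporting a trained model to the ONNX format, one of the runtimes Qualcomm lists for the Cloud AI 100. It uses PyTorch’s ONNX exporter purely as an illustration (PyTorch itself isn’t on Qualcomm’s framework list, but ONNX is), and the toy model, file name and input shape are all made up; the actual Cloud AI 100 toolchain had not been released at the time of writing.

```python
# Illustrative only: export a small PyTorch model to ONNX, the kind of
# interchange format an ONNX-compatible inference runtime can consume.
# The model and shapes are hypothetical, not a Cloud AI 100 workflow.
import torch
import torch.nn as nn

# A toy image classifier standing in for a real production model.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 10),
)
model.eval()

# Dummy input defining the expected shape: a batch of one 224x224 RGB image.
dummy_input = torch.randn(1, 3, 224, 224)

# Export to ONNX; the resulting file can then be handed to whichever
# ONNX-compatible runtime the target accelerator provides.
torch.onnx.export(model, dummy_input, "classifier.onnx", opset_version=11)
```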

There is definitely demand in the datacenter market for more power-efficient AI accelerators. Joe Spisak, a product manager at Facebook, said at the Qualcomm event that Facebook’s systems make about 200 trillion predictions every day, and that this ever-increasing workload makes it difficult to keep up with the growing power demands of their data centers. Qualcomm’s Cloud AI 100 looks like a very promising answer to that problem, so we could see widespread adoption of these accelerators by companies such as Facebook that need to process vast amounts of data while keeping power consumption in check. The push for more power-efficient solutions is not only financial: there is also growing pressure to reduce the environmental impact of data centers, much of which stems from the unsustainable ways the power they demand is generated.
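A quick back-of-envelope calculation shows why a 10x performance-per-watt improvement matters at data-center scale. The numbers below are entirely hypothetical, chosen only to illustrate the claim; they are not Qualcomm’s or Facebook’s figures.

```python
# Hypothetical illustration: power draw for a fixed inference workload
# when accelerator efficiency improves by 10x (made-up numbers throughout).

workload_tops = 100_000            # total sustained inference demand, in TOPS (assumed)
baseline_tops_per_watt = 1.0       # assumed efficiency of an existing accelerator
improved_tops_per_watt = 10.0      # 10x better performance per watt, as Qualcomm claims

baseline_power_kw = workload_tops / baseline_tops_per_watt / 1000
improved_power_kw = workload_tops / improved_tops_per_watt / 1000

print(f"Baseline accelerator power: {baseline_power_kw:.0f} kW")
print(f"10x perf-per-watt power:    {improved_power_kw:.0f} kW")
# Same workload, one tenth of the electricity: exactly the kind of saving a
# data center operator with an ever-growing prediction workload is after.
```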

An interesting blog post on how data centers can minimise their impact on the environment. Click here

Qualcomm will likely face competition in this part of the market in the near future, however. Intel in particular has been buying up smaller companies whose chips have a variety of attributes that would suit AI accelerators, and Qualcomm also has to catch up with Nvidia, which has very successfully adapted its GPUs for use in data centers and earned billions in the process.

If you have any questions about AI accelerators and their benefits, or are interested in HPC solutions, please contact us using the information below.

Sales@serverfactory.co.uk

+44 (0)20 3432 5270