Last Week in AI News #70

Subscribe for future emails here! Check out our new audio digest!

Mini Briefs

The startup making deep learning possible without specialized hardware

GPUs have long been the standard hardware for training machine learning models, and using them effectively requires specialized knowledge. The need for GPUs to train today's large deep learning models has raised the barrier to doing research and building products for anyone without access to such hardware. Neural Magic, a startup founded by MIT professor Nir Shavit, wants to change that. The company, which recently released its first line of products, has redesigned deep learning software to run more efficiently on CPUs, matching GPU speeds by avoiding the need to ferry data on and off the GPU. While Shavit believes CPUs will come to be “the actual fabric for running machine-learning algorithms,” MIT research scientist Neil Thompson is less sure: although Neural Magic can squeeze more performance out of existing hardware, he argues, “fundamental hardware advancements will still be the only way to drive computing forward.”