Compression in the AI world: MobileNets, Pruning & Quantisation
amandalmia.substack.com
Learn how you can compress a 500 MB deep learning model to just 5 MB with almost no drop in accuracy. Over the past couple of years, state-of-the-art AI algorithms have started moving out of research labs and into the real world. This has significant implications: the advances happening in the AI community now have the potential to reach billions of people across the globe. However, once you decide to deploy an AI model, there are a ton of decisions to make, one of them being the nature of the deployment.
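To make the size savings concrete before diving in, here is a minimal sketch (my own illustration, not the article's exact method) of post-training quantisation: storing float32 weights as int8 with a single scale factor shrinks storage by 4x while keeping the reconstruction error bounded.

```python
import numpy as np

# A hypothetical weight matrix standing in for one layer of a large model.
weights = np.random.randn(1000, 1000).astype(np.float32)

# Symmetric post-training quantisation to int8:
# map the range [-max|w|, +max|w|] onto the integers [-127, 127].
scale = np.abs(weights).max() / 127.0
q_weights = np.round(weights / scale).astype(np.int8)

# Dequantise to approximate the original float values.
deq = q_weights.astype(np.float32) * scale

print(weights.nbytes // q_weights.nbytes)  # 4x smaller storage
# Rounding error is bounded by half a quantisation step (scale / 2).
print(float(np.max(np.abs(weights - deq))) <= scale / 2)
```

Pruning and more aggressive schemes (e.g. 4-bit weights, weight sharing) are what push the ratio toward the 100x figure in the headline.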