Vespa's position in the ML ecosystem
By Lester Solbakken
The Presentation
In recent years there have been significant developments and breakthroughs in analyzing and learning from large amounts of data. Advances in deep learning have arguably been the primary driver, raising the bar in fields such as image recognition and textual analysis. For the general public, perhaps the most visible examples have been the highly publicized successes of AlphaGo, AlphaZero and now AlphaStar, which demonstrate the progress in AI and reinforcement learning. At Verizon Media, the insights gained from analyzing data are commonly used to improve our websites. For instance, machine learned models are used to increase personalized relevance in search results and ad impressions. In other cases, such as recommendation systems, machine learning is the core technology.
With the advances on the algorithmic side of machine learning there has naturally been a corresponding growth in learning frameworks, such as TensorFlow, PyTorch/Caffe2 and MXNet. While many of these frameworks are relatively easy to set up and use for training models, model inference in production is less straightforward, as it depends heavily upon the concerns of the application as a whole. In larger applications, models are not usually run in isolation, but as part of a system collaborating to compute relevant pieces of information. This presents some unique and hard challenges when it comes to engineering solutions that work at scale.
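To make the point concrete, here is a minimal sketch of model inference embedded in a serving pipeline, assuming a model trained offline (for example in TensorFlow or PyTorch) and exported to ONNX with a single input named "features". The file name, input name and the retrieval step are hypothetical placeholders, not anything specific to Vespa or the presentation.

```python
import numpy as np
import onnxruntime as ort

# Load a model that was trained offline and exported to ONNX (assumed file name).
session = ort.InferenceSession("ranking_model.onnx")

def retrieve_candidates(query):
    # Placeholder: a real system would query an index or candidate store here.
    return [{"id": 1, "features": [0.3, 1.2, 0.0]},
            {"id": 2, "features": [0.9, 0.1, 0.4]}]

def rank(query):
    candidates = retrieve_candidates(query)
    features = np.array([c["features"] for c in candidates], dtype=np.float32)
    # Model inference is only one stage of the request: retrieval, feature
    # assembly and ranking all run inside the same serving path, under the
    # same latency budget.
    scores = session.run(None, {"features": features})[0].ravel()
    return sorted(zip(candidates, scores), key=lambda pair: -pair[1])

if __name__ == "__main__":
    for candidate, score in rank("example query"):
        print(candidate["id"], float(score))
```

Even in this toy form, the model shares the request path with candidate retrieval and feature assembly, which is where the scaling and engineering challenges mentioned above come from.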
Lester Solbakken is a principal software engineer at Verizon Media (formerly Yahoo) where his focus is on machine learning solutions on Vespa.