Events • 5 minutes read

Takeaways from MLOps World Conference 2022

Learnings from one of the largest gatherings on MLOps

By Ayush Patel

It was an amazing surprise when I received an invitation to attend the MLOps World conference in Toronto! Toronto has always been a nurturing host to a vibrant community of AI/ML practitioners and it was a great experience to get a flavor of it.

This conference has a short but powerful history. MLOps World brought together one of the first prominent communities around MLOps. It commenced in 2019, and its partners and sponsors were young organizations working with the "technology of the future". They certainly had good foresight, since MLOps was just getting started back then and was still passed around predominantly as a buzzword.

Fast forward to 2022 and, in just a few years, MLOps has rapidly evolved into one of the foundational pillars for scaling reliable and sustainable AI solutions. There are a plethora of reasons behind this growth, two of the prime ones being the rapid maturation of AI in the enterprise and the consequent increase in demand for AI at scale. In fact, AI has become so ingrained in everyday technology that customers are sometimes not even aware they are creating demand for AI features.

The Pioneering Canadian Community: Leading the MLOps Conversation

The opening remarks and keynote sessions were an amazing introduction to the influential AI/ML community that Canada has nurtured for decades. While organizations may get easily carried away with the US and EU markets, the Canadian space is very much on par, showing high promise and great demand for the next wave of AI technologies.

It is interesting to note that Toronto was the birthplace of deep learning, as graciously shared by Tomi Poutanen, one of Geoffrey Hinton's former students! Considering that MLOps is the next big chapter in the AI storyline, Toronto seems to be first off the mark again, hosting and nurturing the tools, processes, and conversations around the modern AI stack and its close ties with MLOps.

MLOps is Rapidly Evolving and There's a Flurry of New and Exciting Tools to Bridge the Gap

I like to think of products as a result of demand. For every pocket of demand, hundreds of tools and services tend to pop up in a very short span, and most of them end up in a tough competitive net as the demand and the space mature. Who wins then?

It is an old and common adage that the early birds get the fattest worms. It has held true for several other tech revolutions, including SaaS and cloud, and it seems to hold true for AI as well.

The most interesting bit here is that early entrants do not end up in a competitive scramble even when the market saturates. In fact, they work as collaborators with other early birds to knit a strong, closed network of complementary products that can build the market around itself. The market then learns to repel latecomers, or any new product or service that sits outside this network or is redundant with one of the network's features.

MLOps is presently at that sweet spot where the market is not entirely focused on it, yet industry leaders and top AI practitioners have started creating demand for it. Investors and influential customers are already exploring the field and attracting these early birds. It was a delight to hear these early voices chirp and call out for collaborators and experimenters at the conference. The new tools and services aim to bridge the gap that the lack of MLOps has created in Enterprise AI.

MLOps is Still in its Early Days and There are Several Open Challenges

We all know being young has its challenges, and it's no different for MLOps, which is perhaps in its teenage years.

MLOps has evolved from a set of vague ideas into clear-cut stages and steps that can be followed to achieve quantifiable results. However, despite this rapid growth, MLOps is still a work in progress and is looking for answers to challenges that remain rampant in the AI industry.

Here are a few of the open challenges which were frequently discussed:

  • Interoperability of tools: One of the prime challenges faced by MLOps tools and services is finding an easy opening into the current AI stack. Data scientists and ML engineers have become comfortable with a straightforward pipeline. So, to make MLOps tools feasible and comfortable, the number one objective is to make them easy to adopt and then make them communicate easily with the existing ecosystem of tools. The tool must fit into the pipeline as if it already belonged (a rough sketch of this idea follows the list).
  • Customer awareness: While the demand for MLOps is steadily growing, it is restricted to high-tech organizations that have solved their AI development needs and are looking at scaling and long-term reliability as the next big steps. Unless the wider set of developers and organizations is consistently educated and made aware of the specific problems that MLOps solves, it will be challenging to spread these tools beyond the advanced groups.
  • AI ethics: AI ethics covers a wide range of issues, including model bias and compliance. These issues have come under the spotlight in recent years as more AI solutions have come to market and reached a wider audience. Conversations on model biases regarding geographies, skin tones, genders, etc. have become more open and frequent, with the aim of arriving at solutions soon. Compliance also plays a crucial role in securing data and meeting minimum standards across AI solutions.
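
To make the interoperability point concrete, here is a minimal, purely illustrative sketch. Nothing in it comes from a specific talk or vendor: the MonitoredModel wrapper and its prediction log are hypothetical, and the only assumption is a scikit-learn-style estimator with fit() and predict(). The point is that a monitoring hook can slot into an existing pipeline without changing how the rest of the code calls the model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression


class MonitoredModel:
    """Hypothetical wrapper: adds basic prediction logging to any estimator
    with fit()/predict(), so existing calling code stays unchanged."""

    def __init__(self, model):
        self.model = model
        self.prediction_log = []  # stand-in for shipping metrics to a monitoring service

    def fit(self, X, y):
        self.model.fit(X, y)
        return self

    def predict(self, X):
        preds = self.model.predict(X)
        # Record simple statistics about each batch of predictions.
        self.prediction_log.append(
            {"n_rows": len(X), "positive_rate": float(np.mean(preds))}
        )
        return preds


# Existing code keeps calling fit()/predict() exactly as before.
X, y = np.random.rand(200, 4), np.random.randint(0, 2, 200)
model = MonitoredModel(LogisticRegression()).fit(X, y)
model.predict(X[:20])
print(model.prediction_log)
```

Because the wrapper preserves the estimator's interface, the surrounding training and serving code does not need to change, which is exactly the "as if it already belonged" property discussed above.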

There's an Uptick in Demand for Standardization of ML Processes and Tools

For all these years, while AI in the enterprise was in its nascent stage, it was feasible to lift and shift processes and tools in an experimental environment.

However, the goal of the enterprise has been shifting from entertaining experimental POCs to generating revenue by offering robust solutions that can be relied on for the long term and that maintain high customer retention.

Therefore, it was no surprise to find that organizational processes are also transitioning from lift-and-shift methods to standardized pipelines that are efficient, fast, and automated to a high degree.

It is Easy to Trust People and Then the Organizations Behind Them

The MLOps World conference was a great opportunity for me to finally meet the faces behind the tools and innovations I had heard so much about. Hearing people talk about their experiences, interests, and knowledge is undoubtedly the best way to get a picture of their vision and their plans for their products.

It was great to befriend some of the amazing minds behind innovative organizations, and of course, we are all still in touch! I am definitely excited to see, and be a part of, their growth through the next big stages of the AI evolution.
