
Detecting and Fixing Data Drift in Computer Vision

What's this blog post about?

This tutorial demonstrates how to set up drift monitoring for a computer vision model using the open-source whylogs library together with the WhyLabs platform. The case study monitors an image classifier trained to distinguish cats from dogs, with incoming data divided into daily batches. Data drift is detected by profiling each batch with whylogs and sending the profiles to WhyLabs for monitoring. Human annotation via Toloka's crowdsourcing platform is then used to check whether the drift actually affected model performance. Comparing the model's predictions against the human annotations in WhyLabs reveals a drop in accuracy, confirming that the model has degraded under the drifted data.
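In the workflow above, whylogs summarizes each daily batch as a statistical profile, and WhyLabs compares profiles across days to flag distribution shift. As a rough illustration of the underlying idea (not the whylogs API itself), the sketch below hand-rolls a two-sample Kolmogorov-Smirnov statistic — one of the standard metrics behind drift detection — to compare a baseline batch against a shifted one. The batch names and threshold values are illustrative assumptions.

```python
import numpy as np

def ks_statistic(sample_a, sample_b):
    """Two-sample KS statistic: the largest gap between the empirical CDFs."""
    a = np.sort(np.asarray(sample_a, dtype=float))
    b = np.sort(np.asarray(sample_b, dtype=float))
    combined = np.concatenate([a, b])
    # Empirical CDF of each sample evaluated at every observed value.
    cdf_a = np.searchsorted(a, combined, side="right") / len(a)
    cdf_b = np.searchsorted(b, combined, side="right") / len(b)
    return float(np.max(np.abs(cdf_a - cdf_b)))

rng = np.random.default_rng(0)
day_1 = rng.normal(loc=0.0, scale=1.0, size=1000)  # baseline feature batch
day_2 = rng.normal(loc=0.8, scale=1.0, size=1000)  # drifted batch (mean shifted)

print(ks_statistic(day_1, day_1) < 0.05)  # identical batches: statistic near zero
print(ks_statistic(day_1, day_2) > 0.2)   # shifted batch: large statistic flags drift
```

In practice you would not implement this by hand: whylogs profiles carry the summaries needed for such comparisons, and WhyLabs applies configurable drift metrics and alerting on top of them.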

Company
WhyLabs

Date published
Jan. 26, 2023

Author(s)
WhyLabs Team

Word count
1247

Language
English

Hacker News points
None found.
