
Metropolis - Self-Checkout AI Copilot

Application type: Smart Self-Checkout
Domain: Vision AI
Usage: Demo

Demo Overview

Experience a sample workflow of the Self-Checkout AI Copilot reference app, built with NVIDIA Metropolis Microservices.

The use case is to reduce mis-scans and improve the customer experience by giving existing kiosks visual recognition capabilities that can be adapted quickly to new products and designs with limited data and minimal model retraining.
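One common way to adapt a vision system to new products without retraining is to match image embeddings against a database of per-product "visual signatures." The source does not describe the app's internals, so the snippet below is only a conceptual sketch of that idea; the function names and the 0.8 similarity threshold are assumptions, not the app's actual API.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def match_product(embedding, signature_db, threshold=0.8):
    """Return the best-matching product label, or None if no stored
    signature is similar enough (i.e., the item is unrecognized)."""
    best_label, best_score = None, threshold
    for label, signature in signature_db.items():
        score = cosine_similarity(embedding, signature)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

def add_signature(signature_db, label, embedding):
    """Register a new product without model retraining by storing
    its embedding as a visual signature."""
    signature_db[label] = embedding
```

Under this scheme, onboarding a new product is a database insert rather than a training run, which is what makes adaptation with limited data possible.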

What to Expect

Once the demo is launched, you'll be able to explore the app workflow, presented as a wizard. Each step contains brief instructions on what it does and how you can review or interact with it.

More specifically, the navigation tabs & steps are:

  1. [Start] Stream a video of a self-checkout scene to the app
  2. [Monitor] Review how the app's vision system identifies items, augmenting the barcode scanner. Initially, the app fails to identify items about half the time, since half of the items were not included in model training
  3. [Optimize] Refine the app's prediction quality by adding a couple of new products' visual signatures to the database
  4. [Monitor] Go back to monitor the app's operation to review the improvement
  5. Repeat steps 3 and 4 to further optimize the app
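The monitoring steps above amount to comparing the (simulated) barcode scan against the vision prediction for each item. A minimal sketch of that comparison is shown below; the outcome labels are hypothetical, chosen to mirror the three situations the wizard walks through.

```python
def check_scan(scanned_label, vision_label):
    """Compare a barcode scan against the vision system's prediction.

    Returns one of three hypothetical outcomes:
      "match"        - vision agrees with the scanner
      "unrecognized" - vision has no signature for this item yet
                       (adding one in the Optimize step fixes this)
      "mis-scan"     - vision recognizes the item but disagrees
                       with the scanner
    """
    if vision_label is None:
        return "unrecognized"
    if vision_label == scanned_label:
        return "match"
    return "mis-scan"
```

In the demo, the "unrecognized" rate is what drops as you add new products' visual signatures in the Optimize step.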

Technologies Used

The app is built with NVIDIA Metropolis Microservices, among other technologies.
This is a standalone demo and not tied to an actual checkout kiosk. The scanner signal is simulated.
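Because the scanner signal is simulated rather than read from a real kiosk, a stand-in event stream is enough to drive the demo. The following is a minimal sketch of such a simulator; the `ScanEvent` type and `simulate_scanner` function are illustrative assumptions, not part of the actual app.

```python
import random
import time
from dataclasses import dataclass

@dataclass
class ScanEvent:
    """A single simulated barcode scan."""
    timestamp: float
    barcode: str

def simulate_scanner(catalog, n_events, seed=0):
    """Yield fake scan events drawn from a product catalog,
    standing in for a real kiosk's barcode scanner feed."""
    rng = random.Random(seed)
    for _ in range(n_events):
        yield ScanEvent(timestamp=time.time(), barcode=rng.choice(catalog))
```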