
VLM YOLO Pro

Run Vision Language Models (VLM) with YOLO Pro for real-time object detection on your Tachyon device.

Intermediate · v1.0.0 · AI/ML · Containers

Introduction

Welcome to the VLM YOLO Pro Blueprint App for Particle!

This tutorial walks you through setting up and running real-time object detection using Vision Language Models (VLM) and YOLO Pro on your Linux device.

Supported devices

Hardware and supplies

  • A supported Linux device, such as the Particle Tachyon or a Raspberry Pi

Project description

The VLM YOLO Pro Blueprint provides a powerful starting point for integrating vision-based AI applications into your Tachyon or Raspberry Pi device. This blueprint leverages YOLO Pro for object detection and inference, demonstrating how to deploy real-time AI-powered vision applications at the edge.

The app processes an image every 10 seconds: it resizes the image as needed and sends it to a running inference API for real-time classification.
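The capture-resize-send loop described above can be sketched as follows. This is a minimal illustration, not the blueprint's actual source: the endpoint URL, the 640×640 input size, and the `capture.jpg` frame source are all assumptions you would replace with your own inference API address, model input size, and camera capture path.

```python
import io
import time
import urllib.request

from PIL import Image  # Pillow; install with `pip install Pillow` if needed

# Hypothetical values — substitute your own inference API and model input size.
INFERENCE_URL = "http://localhost:8000/infer"
TARGET_SIZE = (640, 640)
INTERVAL_SECONDS = 10


def prepare_image(img: Image.Image, size=TARGET_SIZE) -> bytes:
    """Resize a frame to the model's input size and encode it as JPEG bytes."""
    resized = img.convert("RGB").resize(size)
    buf = io.BytesIO()
    resized.save(buf, format="JPEG")
    return buf.getvalue()


def send_for_inference(jpeg_bytes: bytes) -> bytes:
    """POST the encoded frame to the inference API and return the raw response."""
    req = urllib.request.Request(
        INFERENCE_URL,
        data=jpeg_bytes,
        headers={"Content-Type": "image/jpeg"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()


def main_loop() -> None:
    """Process one frame every INTERVAL_SECONDS, as the blueprint describes."""
    while True:
        frame = Image.open("capture.jpg")  # placeholder for a camera capture
        send_for_inference(prepare_image(frame))
        time.sleep(INTERVAL_SECONDS)
```

On the device you would call `main_loop()`; separating `prepare_image` from the network call keeps the resize/encode step easy to test in isolation.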

Fork this project to your GitHub account to edit it.