GlassesJS

Zero-model glasses detection for the browser

Detect whether a person is wearing glasses using webcam video and facial landmarks. No AI models, no server, no dependencies — pure math.


Live Demo

Enable your webcam to see real-time glasses detection with confidence score and per-method breakdown.


How It Works

GlassesJS combines 6 independent detection methods, each returning a score 0–100. The final confidence is a weighted average.
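As a minimal sketch of that scoring scheme (the weights come from the method descriptions below; `combineScores` is illustrative, not the library's internal API):

```javascript
// Illustrative only: combine per-method scores (each 0-100) into a single
// confidence using the weights documented for each method below.
const WEIGHTS = {
  bridge: 0.25,
  temple: 0.20,
  iris: 0.20,
  depth: 0.15,
  contrast: 0.10,
  color: 0.10,
};

function combineScores(methods) {
  let confidence = 0;
  for (const [name, weight] of Object.entries(WEIGHTS)) {
    confidence += weight * (methods[name] ?? 0); // missing method scores 0
  }
  return confidence; // 0-100
}

const confidence = combineScores({
  bridge: 80, temple: 70, iris: 60, depth: 50, contrast: 40, color: 30,
});
// 0.25*80 + 0.2*70 + 0.2*60 + 0.15*50 + 0.1*40 + 0.1*30 = 60.5
```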

Bridge Edge Detection

25%

Horizontal Sobel edge detection on the nose bridge area. Glasses frames create strong horizontal edges where they sit on the nose.
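A hypothetical sketch of the idea (not the library's implementation): convolve a grayscale patch with the Sobel Gy kernel, which responds to horizontal lines such as a frame bridge, and average the response.

```javascript
// Sobel Gy kernel: strong response to horizontal edges (vertical gradients).
const SOBEL_Y = [
  [-1, -2, -1],
  [ 0,  0,  0],
  [ 1,  2,  1],
];

// Mean horizontal-edge magnitude over a 2D grayscale patch.
function horizontalEdgeEnergy(patch) {
  const h = patch.length, w = patch[0].length;
  let energy = 0;
  for (let y = 1; y < h - 1; y++) {
    for (let x = 1; x < w - 1; x++) {
      let gy = 0;
      for (let ky = -1; ky <= 1; ky++) {
        for (let kx = -1; kx <= 1; kx++) {
          gy += SOBEL_Y[ky + 1][kx + 1] * patch[y + ky][x + kx];
        }
      }
      energy += Math.abs(gy);
    }
  }
  return energy / ((h - 2) * (w - 2));
}

// A dark horizontal band (like a frame resting on the nose) scores high;
// a flat skin-toned patch scores zero.
const flat = Array.from({ length: 5 }, () => [200, 200, 200, 200, 200]);
const banded = [
  [200, 200, 200, 200, 200],
  [200, 200, 200, 200, 200],
  [ 20,  20,  20,  20,  20], // dark frame edge
  [200, 200, 200, 200, 200],
  [200, 200, 200, 200, 200],
];
```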

Temple Symmetry

20%

Vertical edge analysis at both temples. Glasses arms create symmetric vertical edge patterns on both sides of the face.
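A hypothetical sketch of the symmetry check (function names and the simple central-difference gradient are illustrative assumptions, not the library's code): measure vertical-edge energy in each temple patch and score how closely the two sides match.

```javascript
// Horizontal gradient |I(x+1) - I(x-1)| responds to vertical lines,
// such as a glasses arm crossing the temple region.
function verticalEdgeEnergy(patch) {
  let energy = 0;
  for (const row of patch) {
    for (let x = 1; x < row.length - 1; x++) {
      energy += Math.abs(row[x + 1] - row[x - 1]);
    }
  }
  return energy;
}

// 1 = identical edge energy on both sides (symmetric arms), 0 = no edges.
function templeSymmetry(leftPatch, rightPatch) {
  const l = verticalEdgeEnergy(leftPatch);
  const r = verticalEdgeEnergy(rightPatch);
  if (l + r === 0) return 0; // no edges at all: no arms visible
  return Math.min(l, r) / Math.max(l, r);
}

// A dark vertical stripe (an arm) on both temples is perfectly symmetric:
const armStripe = [200, 200, 20, 200, 200];
const leftTemple = [armStripe, armStripe, armStripe];
const rightTemple = [armStripe, armStripe, armStripe];
const bareCheek = Array.from({ length: 3 }, () => [200, 200, 200, 200, 200]);
```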

Iris Stability

20%

Tracks iris position variance over multiple frames. Glasses lenses refract light, causing higher variance in detected iris position.
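The variance computation might look like this hypothetical sketch (the `{x, y}` position format is an assumption for illustration):

```javascript
// Total positional variance of iris centers across a buffer of frames.
// Higher values suggest lens refraction is jittering the landmark.
function irisVariance(positions) {
  const n = positions.length;
  const mean = positions.reduce(
    (m, p) => ({ x: m.x + p.x / n, y: m.y + p.y / n }),
    { x: 0, y: 0 }
  );
  return positions.reduce(
    (v, p) => v + ((p.x - mean.x) ** 2 + (p.y - mean.y) ** 2) / n,
    0
  );
}

// A steady iris has (near-)zero variance; a wobbling one does not.
const steady = [{ x: 100, y: 50 }, { x: 100, y: 50 }, { x: 100, y: 50 }];
const jittery = [{ x: 100, y: 50 }, { x: 103, y: 48 }, { x: 98, y: 52 }];
```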

Z-Depth Profile

15%

Analyzes Z-coordinate discontinuities across eye landmarks. Glasses create a false plane in front of the face.
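One way to picture the check, as a hypothetical sketch (the sampled Z profile and function name are illustrative): scan landmark Z values across the eye and look for the largest jump between neighbors.

```javascript
// Largest depth jump between adjacent landmark Z values sampled across
// the eye region. A lens plane in front of the face shows up as a jump.
function maxDepthJump(zValues) {
  let maxJump = 0;
  for (let i = 1; i < zValues.length; i++) {
    maxJump = Math.max(maxJump, Math.abs(zValues[i] - zValues[i - 1]));
  }
  return maxJump;
}

// Smooth profile (bare face) vs. a discontinuity at a lens boundary:
const bareFace = [-0.020, -0.021, -0.022, -0.023, -0.024];
const withLens = [-0.020, -0.021, -0.045, -0.046, -0.024];
```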

Local Contrast

10%

Compares pixel contrast in the eye region vs. cheeks. Glass lenses alter local contrast through reflections and tinting.
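As a hypothetical sketch of that comparison (the standard-deviation measure and the ratio are illustrative assumptions): compute grayscale standard deviation in each region and compare.

```javascript
// Local contrast as the standard deviation of grayscale pixel values.
function contrastStd(pixels) {
  const n = pixels.length;
  const mean = pixels.reduce((s, p) => s + p, 0) / n;
  const variance = pixels.reduce((s, p) => s + (p - mean) ** 2, 0) / n;
  return Math.sqrt(variance);
}

// Ratio of eye-region contrast to cheek contrast; a value far from 1
// suggests reflections or tinting from a lens.
function contrastRatio(eyePixels, cheekPixels) {
  const cheek = contrastStd(cheekPixels);
  if (cheek === 0) return 0;
  return contrastStd(eyePixels) / cheek;
}
```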

Color Anomaly

10%

Samples colors across the eye region and compares with skin baseline. Coated lenses shift color temperature.
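A hypothetical sketch of the comparison (RGB averaging and Euclidean distance are illustrative choices, not necessarily the library's metric):

```javascript
// Average an array of [r, g, b] samples channel-wise.
function meanColor(samples) {
  const n = samples.length;
  return samples.reduce(
    (m, [r, g, b]) => [m[0] + r / n, m[1] + g / n, m[2] + b / n],
    [0, 0, 0]
  );
}

// Euclidean distance between the mean eye-region color and the mean skin
// baseline; coated or tinted lenses push this distance up.
function colorDistance(eyeSamples, skinSamples) {
  const [er, eg, eb] = meanColor(eyeSamples);
  const [sr, sg, sb] = meanColor(skinSamples);
  return Math.hypot(er - sr, eg - sg, eb - sb);
}
```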

API Reference

You already have MediaPipe landmarks — just pass them in.

Single Frame (fast)

import { GlassesDetector } from 'glassesjs';

const detector = new GlassesDetector();
const result = detector.detect(canvas, faceLandmarks);

console.log(result.hasGlasses);   // true / false
console.log(result.confidence);   // 0–100
console.log(result.methods);      // per-method breakdown

Accumulated (accurate)

const detector = new GlassesDetector({
  frameBuffer: 30,
  confidenceThreshold: 70,
});

// In your detection loop, every frame:
detector.addFrame(canvas, faceLandmarks);

// After 10+ frames:
const result = detector.getResult();

// When user changes:
detector.reset();

StandaloneDetector handles MediaPipe internally. Just provide a video element.

One-Shot Detection

import { StandaloneDetector } from 'glassesjs/standalone';

const detector = await StandaloneDetector.create({
  video: myVideoElement,
  framesForResult: 30,
  confidenceThreshold: 70,
});

// One-shot — accumulates 30 frames, returns result:
const result = await detector.detectOnce();
console.log(result.hasGlasses, result.confidence);

Continuous Detection

const detector = await StandaloneDetector.create({
  video: myVideoElement,
  interval: 5000,    // evaluate every 5 seconds
});

// Start — callback fires every 5 seconds:
detector.start((result) => {
  console.log(result.hasGlasses, result.confidence);
});

// Stop when done:
detector.stop();

// Cleanup:
detector.destroy();

Install

npm

npm install glassesjs

CDN

<script src="https://cdn.jsdelivr.net/npm/glassesjs/dist/glassesjs.min.js"></script>

ES Module

import { GlassesDetector } from 'glassesjs';
import { StandaloneDetector } from 'glassesjs/standalone';