They have pressure sensors (8 per insole), a 3DoF motion sensor, and haptics (2 per insole - heel and ball)
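For a sense of what one insole sample could look like in code, here's a sketch that parses a packet and picks a haptic target. The packet layout, field widths, and pressure threshold are all assumptions for illustration, not the actual wire format:

```javascript
// Assumed layout for one insole sample: 8 x uint16 pressure readings
// followed by 3 x int16 motion (3DoF) values, little-endian.
function parseInsolePacket(buffer) {
  const view = new DataView(buffer);
  const pressure = [];
  for (let i = 0; i < 8; i++) pressure.push(view.getUint16(i * 2, true));
  const motion = [];
  for (let i = 0; i < 3; i++) motion.push(view.getInt16(16 + i * 2, true));
  return { pressure, motion };
}

// Fire the heel or ball haptic when its pressure zone crosses a threshold.
// The zone-to-sensor mapping (first four = heel) is also an assumption.
function hapticTargets(sample, threshold = 512) {
  const heel = sample.pressure.slice(0, 4).some((p) => p > threshold);
  const ball = sample.pressure.slice(4).some((p) => p > threshold);
  return { heel, ball };
}
```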
Using our Wearables SDK to take a picture, speak a prompt with Whisper’s speech-to-text, and use Chrome’s Prompt API to generate a response shown back on the glasses’ display
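The flow glues three pieces together: the SDK's capture, a Whisper transcription step, and a Prompt API session. A hedged sketch - `takePicture`, `recordAudio`, and `displayText` are hypothetical stand-ins for SDK calls, and the Prompt API surface shown is Chrome's experimental one, which may change between releases:

```javascript
// Builds the mixed text + image user message for the Prompt API session.
function buildPromptContent(transcript, imageBlob) {
  return [{
    role: 'user',
    content: [
      { type: 'text', value: transcript },
      { type: 'image', value: imageBlob },
    ],
  }];
}

// End-to-end flow. The glasses methods are invented names for this sketch;
// LanguageModel is Chrome's experimental built-in AI Prompt API.
async function askAboutPhoto(glasses, transcribe) {
  const photo = await glasses.takePicture();
  const transcript = await transcribe(await glasses.recordAudio());
  const session = await LanguageModel.create({
    expectedInputs: [{ type: 'image' }],
  });
  const answer = await session.prompt(buildPromptContent(transcript, photo));
  await glasses.displayText(answer);
}
```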
Using our Wearables SDK to experiment with a music player on compatible smartglasses (in this case the Brilliant Labs Frame running custom firmware)
Also using a Mudra Band and TapXR for hand controls
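Both bands surface discrete gesture events, so player control can be a single dispatch table. The gesture names and player methods below are placeholders for the sketch, not the devices' real event identifiers:

```javascript
// Route gesture events from either input device to music player actions.
// Names here ('tap', 'doubleTap', 'swipeLeft') are invented placeholders.
const gestureActions = {
  tap: (player) => player.toggle(),
  doubleTap: (player) => player.next(),
  swipeLeft: (player) => player.prev(),
};

// Returns true if the gesture was mapped to an action, false otherwise.
function handleGesture(name, player) {
  const action = gestureActions[name];
  if (!action) return false;
  action(player);
  return true;
}
```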
Using our Wearables SDK to display a simple map overlay, with nearby data retrieved from @openstreetmap.bsky.social
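The overlay needs to know which tiles cover the wearer's position; that's the standard OpenStreetMap "slippy map" conversion from lat/lon to tile coordinates at a zoom level:

```javascript
// Standard OSM slippy-map tile math: Web Mercator projection, then
// floor to the integer tile containing the point.
function lonLatToTile(lon, lat, zoom) {
  const n = 2 ** zoom;
  const x = Math.floor(((lon + 180) / 360) * n);
  const latRad = (lat * Math.PI) / 180;
  const y = Math.floor(
    ((1 - Math.log(Math.tan(latRad) + 1 / Math.cos(latRad)) / Math.PI) / 2) * n
  );
  return { x, y, zoom };
}
```

The resulting `{zoom}/{x}/{y}` triple is what tile servers key on (e.g. `tile.openstreetmap.org/{z}/{x}/{y}.png`), and the same indices work for fetching nearby vector data.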
Using our Wearables SDK to look for certain people, displaying relevant information when someone is recognized
Works with compatible smartglasses (in this case the Brilliant Labs Frame running custom firmware)
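The recognition step itself boils down to comparing a face embedding from the camera against enrolled embeddings, and only drawing someone's info above a similarity threshold. A sketch with an invented 0.8 cutoff:

```javascript
// Cosine similarity between two embedding vectors of equal length.
function cosineSimilarity(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Pick the best-matching enrolled person above the threshold, or null
// (in which case nothing is drawn on the display).
function recognize(embedding, enrolled, threshold = 0.8) {
  let best = null;
  for (const person of enrolled) {
    const score = cosineSimilarity(embedding, person.embedding);
    if (score >= threshold && (!best || score > best.score)) {
      best = { name: person.name, info: person.info, score };
    }
  }
  return best;
}
```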
With our Wearables SDK, we can both display a 3D wireframe scene on compatible smartglasses and control a cursor by strapping one of our motion modules to the wrist
We’re able to detect pinches by running an @edgeimpulse.com model
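A minimal version of the cursor mapping and the pinch gate: wrist yaw/pitch maps linearly onto the display, and the classifier's label scores get thresholded into a click. The ±20° range, 640x400 display size, and 0.7 confidence are assumptions for the sketch:

```javascript
// Map wrist yaw/pitch (degrees) onto a cursor position, clamped to the
// display. An assumed ±20 degree motion range covers the full screen.
function orientationToCursor(yaw, pitch, width = 640, height = 400, range = 20) {
  const clamp = (v) => Math.min(1, Math.max(0, v));
  const x = clamp((yaw + range) / (2 * range)) * (width - 1);
  const y = clamp((pitch + range) / (2 * range)) * (height - 1);
  return { x: Math.round(x), y: Math.round(y) };
}

// The pinch classifier returns per-label scores; treat it as a click
// when 'pinch' clears an (assumed) confidence floor.
function isPinch(scores, minConfidence = 0.7) {
  return (scores.pinch ?? 0) >= minConfidence;
}
```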
Using our Wearables SDK, we’re able to visualize MediaPipe tracking data (body, hands, and face) on compatible smartglasses (in this case the Brilliant Labs Frame running our custom firmware)
Great for yoga, working out, dancing, etc
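MediaPipe hands back landmarks normalized to [0, 1], so rendering them is mostly scaling to the display and joining the skeleton's connection pairs into line segments. Sketched here with an assumed 640x400 display:

```javascript
// Scale normalized MediaPipe landmarks to display pixels and turn a
// connection list (pairs of landmark indices) into drawable segments.
function landmarksToSegments(landmarks, connections, width = 640, height = 400) {
  const pts = landmarks.map((l) => ({
    x: Math.round(l.x * (width - 1)),
    y: Math.round(l.y * (height - 1)),
  }));
  return connections.map(([a, b]) => ({ from: pts[a], to: pts[b] }));
}
```

The same helper works for the body, hand, and face models - only the connection list differs.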
Added foreign language support to our Wearables SDK, allowing us to display languages like Chinese, Japanese, and Korean
In this case we’re using @hf.co models for both realtime transcription and translation - all done locally (no cloud required)
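One practical wrinkle in displaying these languages is knowing when the CJK glyph set has to be loaded at all. Scanning code points against the main Han/Kana/Hangul Unicode blocks covers the common cases (the block list here is deliberately trimmed to those):

```javascript
// Core CJK Unicode blocks (trimmed; the standard defines more extensions).
const CJK_RANGES = [
  [0x3040, 0x30ff], // Hiragana + Katakana
  [0x4e00, 0x9fff], // CJK Unified Ideographs
  [0xac00, 0xd7af], // Hangul Syllables
];

// True if any code point in the string falls in a CJK block,
// i.e. the wide glyph set must be available before rendering.
function needsCjkGlyphs(text) {
  for (const ch of text) {
    const cp = ch.codePointAt(0);
    if (CJK_RANGES.some(([lo, hi]) => cp >= lo && cp <= hi)) return true;
  }
  return false;
}
```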
Added wireframe support to our Wearables SDK, allowing us to display realtime 3D graphics (like A-Frame and @threejs.org) on compatible smart glasses (in this case the Brilliant Labs Frame)
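At its core, wireframe rendering is perspective-projecting each 3D vertex and drawing 2D segments between them. A minimal pinhole-camera sketch, with an assumed focal length and display size:

```javascript
// Project a 3D point (camera at origin, looking down -z) onto the display.
// Returns null for points behind the camera.
function project(point, f = 300, width = 640, height = 400) {
  const { x, y, z } = point;
  if (z >= 0) return null;
  const scale = f / -z;
  return {
    x: Math.round(width / 2 + x * scale),
    y: Math.round(height / 2 - y * scale), // screen y grows downward
  };
}

// An edge list of 3D point pairs becomes 2D segments; edges with an
// endpoint behind the camera are dropped (no clipping in this sketch).
function projectEdges(edges) {
  return edges
    .map(([a, b]) => [project(a), project(b)])
    .filter(([a, b]) => a && b);
}
```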
We used @hf.co’s Whisper model to transcribe microphone data locally, and then display the transcription on the smart glasses’ display (in this case the Brilliant Labs Frame) - no cloud required
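Whisper runs on fixed-length audio windows, so live captioning means slicing the mic stream into overlapping chunks so words aren't cut at the boundaries. Chunk and overlap sizes below (5 s / 1 s at 16 kHz) are illustrative, not what the demo necessarily uses:

```javascript
// Yield overlapping windows of PCM samples for streaming transcription.
// chunkSize/overlap defaults assume 16 kHz audio: 5 s windows, 1 s overlap.
function* chunkAudio(samples, chunkSize = 16000 * 5, overlap = 16000) {
  const step = chunkSize - overlap;
  for (let start = 0; start < samples.length; start += step) {
    yield samples.slice(start, start + chunkSize);
    if (start + chunkSize >= samples.length) break; // last (possibly short) window
  }
}
```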
Great for stuff like emulating 3D graphics via billboarding
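The billboarding trick in one function: place a flat sprite at the projected screen position and shrink it with distance, so a 2D display reads as 3D. The focal length and display size are assumed values:

```javascript
// Billboard a sprite at a 3D position: pinhole-project its center
// (camera at origin, looking down -z) and scale its size by distance.
function billboard(pos, spriteSize, f = 300, width = 640, height = 400) {
  const d = Math.sqrt(pos.x * pos.x + pos.y * pos.y + pos.z * pos.z);
  const scale = f / -pos.z;
  return {
    x: Math.round(width / 2 + pos.x * scale),
    y: Math.round(height / 2 - pos.y * scale),
    size: Math.round(spriteSize * (f / d)), // farther away -> smaller sprite
  };
}
```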
This is a simple example using @nextjs.org, @supabase.com, and @tailwindcss.com to make a basic data collection website that can both record and view sensor data
Made a simple @edgeimpulse.com model to detect nods and head shakes
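The actual detector is a trained @edgeimpulse.com model, but the signal it learns from is simple: nods are pitch-dominated rotation, shakes are yaw-dominated. A rule-based stand-in (with a made-up energy threshold) for intuition, not the model itself:

```javascript
// Classify a window of gyro samples by which rotation axis dominates.
// The energy threshold of 100 is invented for this sketch.
function classifyHeadGesture(gyroSamples, threshold = 100) {
  let pitchEnergy = 0, yawEnergy = 0;
  for (const s of gyroSamples) {
    pitchEnergy += Math.abs(s.pitch);
    yawEnergy += Math.abs(s.yaw);
  }
  if (Math.max(pitchEnergy, yawEnergy) < threshold) return 'none';
  return pitchEnergy > yawEnergy ? 'nod' : 'shake';
}
```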
Created a virtual webcam using OBS and our SwiftUI camera demo
Using MediaPipe’s face tracking to visualize a user’s eyes and eyebrows on a smart glasses display (in this case the Brilliant Labs Frame running our custom firmware)
Preview what your AR glasses applications will look like using your webcam, a video file, or even an interactive 3D scene
We created a Canvas-like interface for drawing primitives onscreen for smartglasses with displays, like the Brilliant Labs Frame
Easily draw rectangles, circles, rounded rectangles, ellipses, polygons, and segments
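A toy version of the idea: a 1-bit framebuffer plus a couple of the primitive calls. Method names mirror Canvas conventions rather than the SDK's exact API:

```javascript
// Minimal 1-bit drawing surface, roughly Canvas-shaped.
class DrawSurface {
  constructor(width, height) {
    this.width = width;
    this.height = height;
    this.pixels = new Uint8Array(width * height);
  }
  // Set one pixel, silently clipping anything off-screen.
  set(x, y) {
    if (x >= 0 && x < this.width && y >= 0 && y < this.height)
      this.pixels[y * this.width + x] = 1;
  }
  fillRect(x, y, w, h) {
    for (let j = y; j < y + h; j++)
      for (let i = x; i < x + w; i++) this.set(i, j);
  }
  // Circle outline by sampling the parametric form (no Bresenham here).
  strokeCircle(cx, cy, r, steps = 64) {
    for (let k = 0; k < steps; k++) {
      const a = (2 * Math.PI * k) / steps;
      this.set(Math.round(cx + r * Math.cos(a)), Math.round(cy + r * Math.sin(a)));
    }
  }
}
```

Polygons, rounded rects, and segments layer on the same `set` primitive in the same way.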
As we port our firmware to the Brilliant Labs Frame, we added microphone support to our SDKs, allowing us to run @hf.co’s Whisper Web to transcribe speech in the browser
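Browser microphones usually capture at 44.1/48 kHz while Whisper expects 16 kHz mono, so the mic path needs a resampling step. A plain averaging decimator - fine for speech, cruder than a proper low-pass filter:

```javascript
// Downsample mono PCM to 16 kHz by averaging each group of input samples.
function downsampleTo16k(samples, inputRate) {
  const ratio = inputRate / 16000;
  const out = new Float32Array(Math.floor(samples.length / ratio));
  for (let i = 0; i < out.length; i++) {
    const start = Math.floor(i * ratio);
    const end = Math.floor((i + 1) * ratio);
    let sum = 0;
    for (let j = start; j < end; j++) sum += samples[j];
    out[i] = sum / (end - start);
  }
  return out;
}
```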
As we port our firmware to the Brilliant Labs Frame, we added camera support to our SDKs
By wrapping our motion module around our ankle, we can detect kicks and stomps by running an @edgeimpulse.com ML model
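As with the head gestures, the shipped detector is an @edgeimpulse.com model; the raw cue it picks up on is a sharp spike in acceleration magnitude at the ankle. A thresholded peak detector (invented threshold and refractory window) shows the idea:

```javascript
// Return sample indices where acceleration magnitude spikes past a
// threshold, with a refractory window so one impact isn't counted twice.
function detectImpacts(accel, threshold = 2.5, refractory = 10) {
  const hits = [];
  let cooldown = 0;
  for (let i = 0; i < accel.length; i++) {
    const { x, y, z } = accel[i];
    const mag = Math.sqrt(x * x + y * y + z * z);
    if (cooldown > 0) cooldown--;
    else if (mag > threshold) {
      hits.push(i);
      cooldown = refractory; // ignore ringing right after the impact
    }
  }
  return hits;
}
```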