This video shows a demonstration of the K-Glass 2, a gaze-activated object recognition HMD. The K-Glass 2 combines a gaze user interface with object recognition: when the user looks at a target object through the glass screen, the recognition results are augmented on the screen. It includes a 10mW gaze image sensor and a 65mW multi-core object recognition processor. The video contains two demonstration scenarios: "Car Maintenance" and "Shopping at a Cafe". The recognition processor performs gaze-activated object recognition by recognizing objects and scenes simultaneously (33fps).
* Paper: I. Hong, et al., “A 2.71nJ/Pixel 3D-Stacked Gaze-Activated Object Recognition System for Low-power Mobile HMD Applications,” IEEE ISSCC, 2015. [ pdf ]
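For readers curious how gaze activation might gate a recognition engine in software, here is a toy sketch of a dwell-time trigger: the (power-hungry) recognizer runs only when the gaze stays fixed on one spot. The function name, dwell threshold, and radius are illustrative assumptions, not taken from the K-Glass 2 hardware.

```python
# Toy sketch of gaze-activated gating (illustrative, not the chip's logic).
def gaze_activated(fixations, dwell_frames=10, radius=20.0):
    """Return True once the gaze stays within `radius` pixels of its
    recent centroid for `dwell_frames` consecutive samples."""
    if len(fixations) < dwell_frames:
        return False
    recent = fixations[-dwell_frames:]
    cx = sum(x for x, _ in recent) / dwell_frames
    cy = sum(y for _, y in recent) / dwell_frames
    return all((x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2
               for x, y in recent)

# Steady gaze on one point triggers recognition; jittery gaze does not.
assert gaze_activated([(100.0, 100.0)] * 12)
assert not gaze_activated([(0.0, 0.0), (300.0, 300.0)] * 6)
```

In a real pipeline the trigger would feed the gaze image sensor's fixation stream into the recognition processor only on activation, which is the source of the system's power savings.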
This video shows a demonstration of the K-Glass, a real-time augmented reality HMD.
The K-Glass is a wearable, hands-free HMD. It includes BONE-AR (a real-time AR processor), a touch-pad interface, a 5Mpixel camera, and a micro display. The video contains two demonstration scenarios: "Library" and "Toy Store". This HMD achieves 33 frames/sec at 381mW average power.
* Paper: G. Kim, et al., “A 1.22TOPS and 1.52mW/MHz Augmented Reality Multi-Core Processor with Neural Network NoC for HMD Applications,” IEEE ISSCC, 2014. [ pdf ]
This video shows a demonstration of the BONE-V6 in a context-aware object recognition application. The BONE-V6 object recognition chip is integrated with Texas Instruments' OMAP4460-based multimedia board through an FPGA expansion board. The system includes a 9-inch TFT-LCD display monitor for portable demonstration. The video contains two demonstration scenarios: "Advanced Driver Assistance System" and "Human Recognition & Tracking in an Unmanned Aerial Vehicle (UAV) System". The recognition processor performs context-aware object recognition by recognizing objects and scenes simultaneously (30fps).
* Paper: J. Park, et al., “A 646 GOPS/W Multi-classifier Many-core Processor with Cortex-like Architecture for Super-Resolution Recognition,” IEEE ISSCC, 2013. [ pdf ]
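A toy sketch of the general idea behind context-aware recognition: the scene label reweights the raw object-class scores, in the spirit of a Bayesian prior. The scene-to-prior table, class names, and scores below are made-up illustrations, not values from the BONE-V6 chip.

```python
# Illustrative scene-conditioned priors (assumed values, not from the chip).
SCENE_PRIORS = {
    "road": {"car": 0.6, "pedestrian": 0.3, "sofa": 0.1},
    "living_room": {"car": 0.1, "pedestrian": 0.2, "sofa": 0.7},
}

def context_aware(scores, scene):
    """Multiply raw detector scores by scene-dependent priors and
    return the most likely object class."""
    priors = SCENE_PRIORS[scene]
    posterior = {c: s * priors[c] for c, s in scores.items()}
    return max(posterior, key=posterior.get)

raw = {"car": 0.5, "pedestrian": 0.4, "sofa": 0.45}
print(context_aware(raw, "road"))         # the same scores favor "car" on the road
print(context_aware(raw, "living_room"))  # ... and "sofa" indoors
```

Recognizing the scene and the objects simultaneously lets each detection be disambiguated by context, which is what makes the recognition "context-aware".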
This video shows a demonstration of the BONE-V5 in an unmanned aerial vehicle (UAV) system. The object recognition processor is integrated with Texas Instruments' OMAP4430-based multimedia board via an FPGA extension board. With the proposed context-aware visual attention model (CAVAM) algorithm, which is robust against dynamic noise such as motion blur, illumination change, and occlusion, this processor achieves 30 frames/sec with 342GOPS peak performance and 320mW average power dissipation: a 2.72x performance improvement and 3.54x per-pixel energy reduction over the previous state of the art.
* Paper: J. Oh, et al., “A 320mW 342GOPS Real-Time Moving Object Recognition Processor for HD 720p Video Streams,” IEEE ISSCC, 2012. [ pdf ]
This video shows a demonstration of the BONE-V4 augmented reality headset. The BONE-V4 object recognition chip, a VGA camera, a VGA head-mounted display, and a battery are integrated into the portable headset. In this "shopping" demonstration, the BONE-V4 recognizes merchandise displayed at a shopping center and overlays information on the head-mounted display. The recognition is robust to variations in illumination, rotation, and scale (based on the SIFT algorithm), and is processed at 30fps for fluid operation.
* Paper: S. Lee, et al., “A 345mW Heterogeneous Many-Core Processor with an Intelligent Inference Engine for Robust Object Recognition,” IEEE ISSCC, 2010. [ pdf ]
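The SIFT-style matching that gives this robustness can be sketched in a few lines: each query descriptor is matched to its nearest database descriptor and kept only if it passes Lowe's ratio test against the second-nearest one. The descriptors below are plain toy vectors, not real 128-dimensional SIFT output, and the ratio value is a common convention rather than the chip's setting.

```python
# Toy sketch of SIFT-style descriptor matching with Lowe's ratio test.
import math

def l2(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def ratio_match(query, database, ratio=0.8):
    """Match each query descriptor to the database, keeping a match
    only when the best distance is clearly below the second best."""
    matches = []
    for qi, q in enumerate(query):
        dists = sorted((l2(q, d), di) for di, d in enumerate(database))
        best, second = dists[0], dists[1]
        if best[0] < ratio * second[0]:
            matches.append((qi, best[1]))
    return matches

db = [[0.0, 0.0], [10.0, 10.0], [0.1, 0.0]]
# A query near db[1] matches; one ambiguous between db[0] and db[2] is rejected.
print(ratio_match([[9.9, 10.0], [0.05, 0.0]], db))  # [(0, 1)]
```

The ratio test is what discards ambiguous matches, so recognition survives clutter and repeated textures rather than latching onto the nearest lookalike.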
This video shows a demonstration of the BONE-V3, the third generation of the KAIST high-performance object recognition chip series. We equipped a real vehicle with a vision system board based on the BONE-V3, a VGA camera, and a display monitor for traffic sign recognition. In the video, the vehicle drives past target traffic signs on the road. The BONE-V3 performs real-time sign recognition (30fps) based on the SIFT algorithm and notifies the driver through both an LCD display and sound.
* Paper: J.-Y. Kim, et al., “A 201.4GOPS 496mW Real-Time Multi-Object Recognition Processor with Bio-Inspired Neural Perception Engine,” IEEE ISSCC, 2009. [ pdf ]
This video shows a demonstration of the BONE-V2, the second generation of the KAIST high-performance object recognition chip series. We mounted a tiny vision system board based on the BONE-V2 on a mobile robot. In the video, the mobile robot first learns the target object, then continuously searches for it in order to lift it. Maybe this intelligent robot can do chores for us. Enjoy the video!
* Paper: K. Kim, et al., “A 125GOPS 583mW Network-on-Chip Based Parallel Processor with Bio-inspired Visual Attention Engine,” IEEE ISSCC, 2008. [ pdf ]
#1233, School of Electrical Engineering, KAIST, 291 Daehak-ro (373-1 Guseong-dong), Yuseong-gu,
Daejeon 34141, Republic of Korea / Tel. +82-42-350-8068 / Fax. +82-42-350-3410 / Mail: email@example.com
Copyright (C) 2017, SEMICONDUCTOR SYSTEM LAB., All Rights Reserved.