Faculty of Engineering

High-fidelity Stereo Imaging Depth Estimation for Metaverse Devices

This project is about high-fidelity stereo RGBD imaging for Metaverse devices.
This work proposes to design and realize a fast and accurate 3D imaging architecture that Metaverse VR headsets can leverage. Unlike previous imaging frameworks, we use the prior of optical encoding and a joint optimization structure from deep optics to explore a new and improved framework for high-resolution RGBD imaging. A state-of-the-art stereo-matching algorithm is explored together with jointly optimized lenses and an image recovery network. Conventional imaging algorithms consider only the post-processing of captured images, which is less flexible and computationally expensive. Our deep stereo depth estimation integrates optical preprocessing and encoding with advanced decoding neural networks to achieve more accurate RGBD imaging at higher resolution. It fully utilizes the information obtained from a stereo camera with a pair of asymmetric lenses in a Metaverse device to achieve an extended depth range for all-in-focus imaging.
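As a rough illustration of this joint optimization idea, the sketch below trains a simplified optical encoder and a small recovery network end to end in PyTorch. The learnable blur kernels merely stand in for a differentiable lens model, and all module names, shapes, and losses are illustrative assumptions rather than the project's actual design.

```python
# Minimal sketch of joint optics/network optimization. The "lens" is simplified
# to a learnable per-channel PSF (a stand-in for a differentiable optical model);
# the decoder is a toy CNN. Everything here is an assumption for exposition.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LearnableOptic(nn.Module):
    """Differentiable stand-in for one lens: a per-channel PSF applied by convolution."""
    def __init__(self, kernel_size=11, channels=3):
        super().__init__()
        self.psf = nn.Parameter(torch.rand(channels, 1, kernel_size, kernel_size))
        self.channels = channels

    def forward(self, img):
        # Normalize so each channel's PSF conserves energy.
        psf = F.softmax(self.psf.flatten(1), dim=1).view_as(self.psf)
        return F.conv2d(img, psf, padding=self.psf.shape[-1] // 2, groups=self.channels)

class Decoder(nn.Module):
    """Tiny recovery network: predicts an RGB image and a depth map from the two encoded views."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 4, 3, padding=1),  # 3 RGB channels + 1 depth channel
        )

    def forward(self, left, right):
        out = self.net(torch.cat([left, right], dim=1))
        return out[:, :3], out[:, 3:]

left_optic, right_optic, decoder = LearnableOptic(), LearnableOptic(), Decoder()
params = list(left_optic.parameters()) + list(right_optic.parameters()) + list(decoder.parameters())
opt = torch.optim.Adam(params, lr=1e-4)

# One illustrative training step on random stand-in data (real training would use
# rendered stereo pairs with ground-truth depth).
scene_rgb = torch.rand(2, 3, 64, 64)
scene_depth = torch.rand(2, 1, 64, 64)
rgb_hat, depth_hat = decoder(left_optic(scene_rgb), right_optic(scene_rgb))
loss = F.l1_loss(rgb_hat, scene_rgb) + F.l1_loss(depth_hat, scene_depth)
loss.backward()
opt.step()
```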

Towards Real-time 3D Neural Holography for Near-eye Display

Holography plays a vital role in the advancement of virtual reality (VR) and augmented reality (AR) display technologies. Its ability to create realistic three-dimensional (3D) imagery is crucial for providing immersive experiences to users. However, existing computer-generated holography (CGH) algorithms used in these technologies are either slow or not 3D-compatible. In this project, we explore four inverse neural network architectures to overcome these issues for real-time 3D applications.
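The sketch below illustrates the general inverse-network idea rather than any of the project's four architectures: a small CNN maps a target image to a phase-only hologram, and training compares a simulated reconstruction, here a single Fourier-transform propagation, against the target. The network size, propagation model, and loss are assumptions for exposition.

```python
# Illustrative CGH sketch: predict a phase-only hologram from a target image and
# supervise it through a differentiable (FFT-based) reconstruction.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PhaseGenerator(nn.Module):
    """Maps a target amplitude image to a phase-only hologram with values in [-pi, pi]."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )

    def forward(self, target):
        return torch.pi * torch.tanh(self.net(target))

def reconstruct(phase):
    """Simulate the far-field reconstruction of a phase-only hologram via an FFT."""
    field = torch.exp(1j * phase)
    recon = torch.fft.fftshift(torch.fft.fft2(field), dim=(-2, -1))
    return recon.abs()

model = PhaseGenerator()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)

target = torch.rand(4, 1, 64, 64)          # stand-in batch of target images
phase = model(target)
recon = reconstruct(phase)
recon = recon / (recon.amax(dim=(-2, -1), keepdim=True) + 1e-8)  # normalize before comparing
loss = F.mse_loss(recon, target)
loss.backward()
opt.step()
```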

Human Activity Detection in VR Scenario

Develop a VR-Based Motion Capture System: Create a robust and accurate motion capture system to track and record users' movements in real time within the VR environment, and to collect data for later machine learning model training and testing.
Machine Learning Model and Algorithm Design for Behavior Detection: Construct a combined system of suitable computer vision models to detect and analyze human activity in a VR scenario. Multiple models will be involved in this system, and an algorithm, such as a voting classifier, will be designed to help the system predict activities based on the predictions from those models (see the sketch after this list).
Real-Time Feedback and User Interaction: Integrate the behavior detection model with the VR environment to offer real-time feedback to users based on their actions and interactions, enhancing immersion and user engagement.
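As a toy example of the voting step mentioned above, the sketch below combines per-frame activity labels from several models by majority vote. The model names and labels are made up for illustration.

```python
# Minimal hard-voting sketch: each vision model emits a per-frame activity label,
# and a simple majority vote produces the final prediction.
from collections import Counter

def majority_vote(per_model_labels):
    """Return the label most models agree on; ties resolve to the first-seen label."""
    return Counter(per_model_labels).most_common(1)[0][0]

# Stand-in predictions from three hypothetical models for one captured frame.
frame_predictions = {
    "pose_model": "waving",
    "hand_model": "waving",
    "object_model": "grabbing",
}
print(majority_vote(frame_predictions.values()))  # -> "waving"
```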

Trash Collection Patrol Bot

Our project is an automatic rubbish disposal car, divided into four parts. The whole system operates over a link between a computer and the vehicle: through this link, the computer sketches the environment, calculates the vehicle's position and posture, and issues movement instructions to the car. The radar module outlines the surrounding environment. The AI recognition module lets the car label the spitballs captured by the camera, so it can find the spitball we wish to throw away. The robotic arm module is responsible for grabbing the object and throwing it into the smart bin. The smart bin opens automatically when the arm grabs the target, as well as when someone's hand comes close to the bin.
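The sketch below shows, under stated assumptions, how these four modules could be sequenced in a single patrol loop. Every function is a hypothetical stub standing in for the real radar/SLAM, recognition, arm, and bin interfaces; only the sequencing logic is the point.

```python
# High-level patrol loop with stub modules (all interfaces are assumptions).
import random
import time

def update_map_and_pose():
    """Stand-in for the radar/SLAM step: returns the vehicle pose (x, y, heading)."""
    return (random.uniform(0, 5), random.uniform(0, 5), random.uniform(0, 360))

def detect_spitball(pose):
    """Stand-in for the camera + AI recognition module: maybe returns a target position."""
    return (pose[0] + 0.5, pose[1]) if random.random() < 0.3 else None

def drive_towards(target):
    """Stand-in for the movement instructions the computer sends over the link."""
    print(f"driving towards {target}")

def grab_and_deposit(target):
    """Stand-in for the robotic arm and the smart bin's auto-open behaviour."""
    print(f"grabbing object at {target} and dropping it into the opened bin")

for _ in range(10):                 # a short patrol instead of an infinite loop
    pose = update_map_and_pose()
    target = detect_spitball(pose)
    if target is not None:
        drive_towards(target)
        grab_and_deposit(target)
    time.sleep(0.1)
```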

Project Matchbox

Project Matchbox aims to recycle old keyboards into DIY controller kits for children. By putting together their own Matchbox controller, kids learn about breadboards, switches, and other basic electronics. This project is perfect for introducing complete beginners to the world of hardware engineering due to its simple design and affordable price point. Since we are using recycled keyboards and many common components, we can keep the cost of production low, making this educational STEM material accessible.

NeuroHarbor

Epilepsy, affecting over 70 million individuals worldwide, can cause significant physical injuries and psychosocial isolation. Epileptic seizures often incapacitate patients’ movement, making everyday activities potentially perilous, even fatal. The accurate and rapid detection of seizure onsets is vital for improving patients’ quality of life. Furthermore, traditional EEG diagnosis requires patients to remain bedridden, tethered by a wired headset until a seizure occurs. This process is disruptive, calling for a compact, wearable, and clinical-grade EEG headset that empowers mobility and real-time seizure monitoring.
We are poised to deliver this solution with an AI-powered wearable EEG headset. This innovative device, equipped with a proprietary AI model, facilitates real-time computing and rapid seizure detection. It ensures minimal latency, lower power consumption, compact design, and robust privacy, outpacing traditional systems. This technology liberates patients from wired, bed-restricted EEG setups, significantly enhancing their quality of life.
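As a purely illustrative sketch, the snippet below shows how a lightweight on-device detector might score short multi-channel EEG windows in real time with a tiny 1D CNN. The architecture, channel count, and sampling rate are assumptions for exposition, not the proprietary model.

```python
# Toy real-time seizure scorer: classify 2-second, 8-channel EEG windows.
import torch
import torch.nn as nn

class TinySeizureNet(nn.Module):
    """Small 1D CNN: input (batch, channels, samples) -> seizure probability."""
    def __init__(self, eeg_channels=8):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(eeg_channels, 16, kernel_size=7, stride=2, padding=3), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=7, stride=2, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.head = nn.Linear(32, 1)

    def forward(self, x):
        return torch.sigmoid(self.head(self.features(x).squeeze(-1)))

model = TinySeizureNet().eval()
window = torch.randn(1, 8, 512)           # 2 s of 8-channel EEG at 256 Hz (stand-in data)
with torch.no_grad():
    p_seizure = model(window).item()
print(f"seizure probability: {p_seizure:.2f}")
```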

Beach Cleaning Robot

With the increasing concern about environmental pollution, particularly in coastal areas, the need for effective beach cleaning solutions has become paramount. This project aims to design and develop an unmanned beach cleaning robot capable of efficiently removing litter and maintaining the cleanliness of sandy shorelines. The successful execution of this initiative holds the potential to completely transform beach cleaning procedures in Hong Kong, actively contributing to environmental preservation, while simultaneously fostering greater awareness and involvement in maintaining the cleanliness of our coastlines.

An AI-Powered Virtual Diet Assistant App – FoodPrint

Ready to embrace a healthy lifestyle but don't know where to start? Thinking about fasting but never able to stick with it? Don't worry, we have the solution for you! FoodPrint is here to guide you every step of the way: your virtual diet assistant keeps track of everything you eat and provides the best customized advice before you start a new meal. Powered by the latest research from Prof. Chair and her team at CUHK, FoodPrint adopts the 16/8 intermittent fasting method, which has been studied as one of the most effective approaches for reducing excess weight and cardiometabolic risks. With FoodPrint, you can experience the convenience of a dynamic fasting timer, an image-based calorie calculator, a comprehensive diet and weight tracker, and a personalized LLM-powered chatbot diet instructor. Get ready to transform your eating habits and achieve your wellness goals with FoodPrint!
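As a minimal illustration of the 16/8 rule behind the fasting timer, the sketch below checks whether the current time falls inside an assumed 8-hour eating window; the 12:00-20:00 window is an example, not FoodPrint's actual default.

```python
# Tiny 16/8 check: eating is allowed only inside a fixed 8-hour daily window.
from datetime import datetime, time

EATING_WINDOW = (time(12, 0), time(20, 0))   # 8-hour eating window, 16-hour fast

def can_eat(now=None):
    """Return True if the current local time falls inside the eating window."""
    now = (now or datetime.now()).time()
    return EATING_WINDOW[0] <= now < EATING_WINDOW[1]

print("You may eat now" if can_eat() else "Fasting period - hold on!")
```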

Ergonomic Integrated Management System for 3D Printers @ 9th Inno Show

Our approach involves leveraging the power of a large language model (LLM) to interpret and understand verbal commands. We tuned the LLM on a dataset of 3D printing-related instructions and implemented a robust natural language processing pipeline to accurately recognize and execute user commands, ensuring seamless interaction with the 3D printer.
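The sketch below illustrates one possible shape of this pipeline under assumptions: a hypothetical query_llm() stub stands in for the tuned model, returning a structured action that is then mapped to standard G-code commands (the project's actual command set may differ).

```python
# Voice-to-printer command sketch: LLM output -> structured action -> G-code.
import json

def query_llm(utterance):
    """Hypothetical stand-in for the tuned LLM; returns a structured action as JSON text."""
    return json.dumps({"action": "set_nozzle_temperature", "celsius": 210})

GCODE_TEMPLATES = {
    "set_nozzle_temperature": "M104 S{celsius}",   # set hotend temperature
    "home_axes": "G28",                            # home all axes
}

def handle_command(utterance):
    action = json.loads(query_llm(utterance))
    template = GCODE_TEMPLATES.get(action.pop("action"))
    if template is None:
        raise ValueError("unrecognized action")
    return template.format(**action)

print(handle_command("Heat the nozzle to 210 degrees"))   # -> "M104 S210"
```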

UNIFIGHT – A University Mobile Game

When we were asked to make a game in COMP3329, we wanted to make one that could make university life more engaging and motivate students to come to campus and attend classes. We first designed a U-life-related story plot with a dozen unique and special playable characters that have beautiful graphics. We then created a simple battle mechanism and a distinct skill set for every character. Moreover, we took inspiration from Pokemon GO: players can travel around and receive rewards only in University Areas. We also plan to add PvP functions, including letting different universities compete against each other.
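As a small illustration of the location-gated reward mechanic, the sketch below checks whether a player's GPS position falls inside one of several hypothetical "University Areas"; the coordinates and radii are made-up examples, not the game's real data.

```python
# Geofence check: a reward is available only inside a predefined campus zone.
import math

# (name, latitude, longitude, radius in metres) for each hypothetical campus zone
UNIVERSITY_AREAS = [
    ("Campus A", 22.2830, 114.1371, 400),
    ("Campus B", 22.4196, 114.2068, 800),
]

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate ground distance using an equirectangular projection (fine at city scale)."""
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return math.hypot(x, y) * 6_371_000

def reward_available(lat, lon):
    return any(distance_m(lat, lon, a_lat, a_lon) <= r
               for _, a_lat, a_lon, r in UNIVERSITY_AREAS)

print(reward_available(22.2832, 114.1370))   # inside the Campus A zone -> True
```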