
Student Interest Group Promotion Weeks 2024

Student Interest Groups (SIGs) affiliated with the Tam Wing Fan Innovation Wing are now recruiting new members. Please visit their promotion booths to gain a better understanding of the SIGs. We look forward to seeing you.

Project Innobot: Multi-terrain Hexapod

Traditional wheeled robots often encounter challenges when navigating uneven or complex terrain. To address this real-world issue, our team presents the Multi-terrain Hexapod as a solution. Inspired by the spider’s structure, our Hexapod’s multi-legged design has demonstrated the ability to climb stairs and tackle more intricate terrains. Through this project, we aim to build a comprehensive understanding of legged robots and share our experience in developing the hexapod with the wider robotics community.

Project Innobot: Autonomous Rover

Auto-piloting is an evolving technology that significantly enhances production efficiency across industries. The development of navigation rovers enables automated, high-load point-to-point delivery tasks, reducing reliance on traditional manpower and decreasing the risk of physical injuries among workers. Our team is committed to cultivating expertise in the field of navigation and to sharing our experience in creating navigation robots with the broader robotics community.

Real-time object detection with Grounding DINO

Grounding DINO is an innovative computer vision model designed to enhance object detection and scene understanding through advanced grounding techniques. It combines computer vision and natural language processing to integrate contextual understanding into machine learning models, improving their accuracy and efficiency in real-world applications. We will showcase some of the strengths and weaknesses of this very powerful tool during the upcoming Inno Show!

Armstrong: Robotic Arm Digital Twin

Robotic arms have been widely used in various real-world scenarios, such as automobile manufacturing and housekeeping. However, there is a significant gap at the Innovation Wing, as existing robotics projects there focus mainly on mobile robots or quadrupeds. To address this gap, our project aims to develop an intelligent manipulator capable of performing precise tasks like maintenance and part sorting in industrial settings.

High-fidelity Stereo Imaging Depth Estimation for Metaverse Devices

This project develops high-fidelity stereo RGBD imaging for Metaverse devices. We propose to design and realize a fast and accurate 3D imaging architecture that Metaverse VR headsets can leverage. Unlike previous imaging frameworks, we use optical-encoding priors and a joint optimization structure from deep optics to explore a new and improved framework for high-resolution RGBD imaging. A state-of-the-art stereo-matching algorithm is explored with jointly optimized lenses and an image recovery network. Conventional imaging algorithms consider only the post-processing of captured images, which is less flexible and computationally expensive. Our deep stereo depth estimation integrates optical preprocessing and encoding into the advanced decoding neural networks to achieve more accurate RGBD imaging at higher resolution. It fully utilizes the information obtained from a stereo camera with a pair of asymmetric lenses in a Metaverse device to achieve an extended depth range for all-in-focus imaging.
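The stereo-matching step above ultimately turns pixel disparity into depth. As a minimal illustration of the underlying geometry (a textbook sketch, not the project's actual deep-optics pipeline, and with made-up camera parameters), depth for a rectified stereo pair follows from focal length, baseline, and disparity:

```python
# Minimal stereo-geometry sketch (hypothetical parameters, not the project's
# pipeline): for a rectified stereo pair, depth Z = f * B / d, where
# f = focal length in pixels, B = baseline between the two lenses in metres,
# d = disparity (horizontal pixel shift) of a matched feature.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Return depth in metres for one matched pixel pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    return focal_px * baseline_m / disparity_px

# Example: f = 800 px, baseline = 6 cm, disparity = 16 px -> depth = 3.0 m
print(depth_from_disparity(800.0, 0.06, 16.0))
```

The inverse relationship between disparity and depth is why distant points (small disparity) are hardest to estimate accurately, and part of what a learned stereo-matching network must compensate for.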

Towards Real-time 3D Neural Holography for Near-eye Display

Holography plays a vital role in the advancement of virtual reality (VR) and augmented reality (AR) display technologies. Its ability to create realistic three-dimensional (3D) imagery is crucial for providing immersive experiences to users. However, existing computer-generated holography (CGH) algorithms used in these technologies are either slow or not 3D-compatible. In this project, we explore four inverse neural network architectures to overcome these issues for real-time and 3D applications.

CLIC-Chat

CLIC-Chat is an innovative AI chat tool designed to provide reliable and up-to-date legal assistance with a focus on Hong Kong law. Unlike other chat tools, CLIC-Chat boasts a comprehensive Hong Kong knowledge database, ensuring accurate responses tailored to the local context. By leveraging retrieval-augmented generation (RAG), it delivers trustworthy information while staying current with the latest legal developments. One of its standout features is the ability to ask valuable questions, akin to a skilled lawyer, to gather essential details from users’ stories. This enables CLIC-Chat to provide personalized legal guidance and suggestions, knowing precisely when to stop questioning and offer relevant insights. With CLIC-Chat, users can confidently navigate legal matters, receive accurate information, and make informed decisions.
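The retrieval step at the heart of RAG can be sketched in miniature. The snippet below is purely illustrative: the knowledge-base entries, the bag-of-words scoring, and the `retrieve` helper are all hypothetical stand-ins, whereas a real system like CLIC-Chat would use learned embeddings over its legal database and pass the retrieved passages to an LLM.

```python
# Toy retrieval step of RAG (hypothetical data and scoring; a production
# system would use learned embeddings and a vector index, not bag-of-words).
from collections import Counter
import math

KNOWLEDGE_BASE = [  # stand-in snippets, not real legal advice
    "A tenant must give one month's notice to end a periodic tenancy.",
    "Small claims go before the Small Claims Tribunal.",
    "An employer must pay wages within seven days of the wage period.",
]

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two token-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k passages most similar to the query."""
    q = Counter(query.lower().split())
    ranked = sorted(KNOWLEDGE_BASE,
                    key=lambda doc: cosine(q, Counter(doc.lower().split())),
                    reverse=True)
    return ranked[:k]  # these passages would be prepended to the LLM prompt

print(retrieve("how much notice does a tenant give"))
```

Grounding each answer in retrieved passages, rather than in the model's parametric memory alone, is what lets a RAG system stay current: updating the knowledge base updates the answers without retraining.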

Human Activity Detection in VR Scenario

Develop a VR-Based Motion Capture System: Create a robust and accurate motion capture system to track and record users’ movements in real time within the VR environment, and to collect data for later machine learning model training and testing.
Machine Learning Model and Algorithm Design for Behavior Detection: Construct a combined system of suitable computer vision models to detect and analyze human activity in the VR scenario. Multiple models will be involved in this system, and an algorithm, e.g. a voting classifier, will be designed to help the system predict activities based on those models’ predictions.
Real-Time Feedback and User Interaction: Integrate the behavior detection model with the VR environment to offer real-time feedback to users based on their actions and interactions, enhancing immersion and user engagement.
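The voting-classifier idea mentioned above can be sketched as follows. The model outputs here are hypothetical placeholders; the actual system would combine predictions from trained computer vision models rather than hard-coded labels.

```python
# Majority-vote combination of several activity classifiers (a sketch with
# hypothetical predictions; real models would each classify the same frame).
from collections import Counter

def majority_vote(predictions: list[str]) -> str:
    """Return the activity label predicted by the most models.

    Ties are broken in favour of the label that appeared first.
    """
    counts = Counter(predictions)
    return counts.most_common(1)[0][0]

# Three stand-in models classify the same VR motion-capture frame:
frame_predictions = ["waving", "waving", "pointing"]
print(majority_vote(frame_predictions))  # "waving" wins 2-1
```

A natural extension is weighted voting, where each model's vote is scaled by its validation accuracy, so a stronger model can outvote two weaker ones.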

Trash Collection Patrol Bot

Our project is an automatic rubbish disposal car, divided into four parts. The whole system operates over a link between a computer and the vehicle: through this link, the computer can sketch the environment, calculate both the position and posture of the vehicle, and give the car movement instructions. The radar module outlines the surrounding environment. The AI recognition module lets the car label the spitballs captured by the camera, so it can find the spitball we wish to throw away. The Robotic Arm module is responsible for grabbing the object and throwing it into the Smart Bin. The Smart Bin opens automatically when the arm grabs a target, as well as when someone’s hand is close to the bin.
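The four modules described above form a sense-plan-act loop. The sketch below uses hypothetical stub functions purely to show the coordination order between the modules; none of it is the project's real code, which runs against actual radar, camera, arm, and bin hardware.

```python
# Hypothetical coordination loop for the patrol bot's four modules.
# Each stub stands in for real hardware/AI; only the ordering is illustrative.

def scan_environment():
    """Radar module: outline the surroundings and estimate the car's pose."""
    return {"obstacles": [], "pose": (0.0, 0.0, 0.0)}

def detect_spitball(camera_frame):
    """AI recognition module: label spitballs in the camera frame."""
    return {"found": True, "position": (1.2, 0.5)}  # stubbed detection

def grab_and_dispose(target):
    """Robotic arm + smart bin: grab the target; the bin opens on approach."""
    return f"picked up object at {target['position']}, bin opened"

def patrol_step(camera_frame):
    """One iteration of the computer-to-vehicle control loop."""
    world = scan_environment()          # computer sketches the map and pose
    target = detect_spitball(camera_frame)
    if target["found"]:
        return grab_and_dispose(target)
    return f"continue patrol from pose {world['pose']}"

print(patrol_step(camera_frame=None))
```

Keeping mapping, detection, and manipulation as separate modules, as the project does, means each part can be developed and tested independently before being wired into the loop.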