Lab Tours

CSL Student Conference 2024 Lab Tour

Lab tours will take place February 14, from noon to 2 PM, and all are welcome to visit the labs on their own during this time.

If you would like to join a group tour, we will depart from the ECEB Atrium at 12:10 PM and visit each lab for approximately 10 minutes to ensure we see them all. We will meet in the rear of the atrium, near the Nano lab, by the lab tours promotional poster.

Sign up for email updates about the Lab Tours

Monolithic Systems Lab, Dr. Girish Krishnan, 17 Transportation Building
We work on hybrid soft-rigid robotic systems that combine the accuracy and force of rigid robotics with the dexterity and safety of soft robotics. We build robots to work in agriculture, healthcare, and space. Our work focuses on controlling these hybrid soft-rigid robots to complete visual servoing and manipulation tasks.
Advanced Controls Research Laboratory, Dr. Naira Hovakimyan, Mechanical Engineering Laboratory 2009
Our lab focuses on experimentation with learning-enabled autonomy for aerial robotics. Ongoing topics include the design of learning-enabled control algorithms for quadrotors, planning and perception algorithms, and nonconventional aerial robotics.
RoboTouch Lab, Dr. Wenzhen Yuan, Siebel Center for Computer Science 0320
We work on tactile sensing and perception for robotic applications. More specifically, we develop and employ vision-based tactile sensors to obtain high-resolution information about the objects a robot interacts with, enabling more capable manipulation.
Gupta AI Lab, Dr. Saurabh Gupta, CSL Studio Kitchen
We work on computer vision, robotics, and machine learning. We are interested in building agents that can intelligently interact with the physical world around them. Currently, we are working on robots that can perform challenging tasks (e.g., opening doors, interacting with plants) as well as more vision-based tasks (e.g., understanding hand-object interactions from egocentric videos).
Computational Imaging Group, Dr. Minh N. Do, CSL Building B15
We present two portable sensing systems: a combination of mmWave radar and an RGBD camera for investigating radar applications in human identification, tracking, and activity analysis, and a smartphone-based digitized neurological examination tool. The latter uses smartphones to document and detect neurological disorders through the analysis of human motion.
Cyber-Physical Experiment Environment for Research (CEER), Dr. David Nicol, CSL Studio 1236
How does one test prevention tools on critical infrastructure without impacting current operations? Or validate the results of those tests? These questions, and many more, have one logical solution: a high-fidelity representation of these systems, or testbed, that reflects the real world.
Intelligent Motion Lab, Dr. Kris Hauser, CSL Studio Intelligent Robotics Lab
Plant property estimation: we will show how a robot system uses a tactile sensor to estimate the stiffness distribution of several artificial plants. We will show how robots distinguish hard branches from soft leaves, and how a robot uses its experience to predict a prior over stiffness distributions.

Human-robot interaction: pouring is a ubiquitous but challenging task for robots, and precisely deciding the pouring amount is particularly difficult. In this demo, we will show how a robot uses visual information to predict a human's intention to start, stop, or pause the pouring.

If you have any trouble filling out the form or have questions, email Ben Walt (walt@illinois.edu).