  
**Instructions:**
Contact Michael Jenkin by email (jenkin@yorku.ca) if interested.
  
----
  
  
==== Enhanced avatar for human-robot interaction ====
  
**[added 2025-04-18]**
  
**Project Description:**
Avatars have been proposed as a key element in user interface designs at least since Microsoft's Clippy. In the lab we have been developing a Unity-based avatar that serves as the front end to an LLM-based system and that can be deployed in a variety of environments. This forward-facing avatar supports natural interaction with people in its environment, providing audio-based input and output and literally putting a face on the underlying system. The goal of this project is to take the operational system and enhance it in a number of ways, most critically by adding canned animation scripts that give the avatar a natural appearance both while it is interacting and while it is idle.
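
To make the intended architecture concrete, here is a minimal sketch of the interaction loop described above, written in Python purely for illustration (the real front end is a Unity/C# application). Every function in it is a hypothetical placeholder for a component of the system: speech-to-text, the LLM back end, text-to-speech, and the canned animation clips this project would add.

<code python>
# Illustrative sketch only: the real front end is a Unity/C# application.
# All functions below are hypothetical stand-ins for components of the
# pipeline described above.

def listen() -> str:
    # Placeholder for audio capture + speech-to-text.
    return input("user> ")

def ask_llm(utterance: str) -> str:
    # Placeholder for the LLM back end the avatar fronts.
    return f"(LLM reply to: {utterance!r})"

def speak(reply: str) -> None:
    # Placeholder for text-to-speech rendered through the avatar.
    print("avatar>", reply)

def set_animation(clip: str) -> None:
    # Placeholder for triggering a canned animation clip in Unity.
    print(f"[animation: {clip}]")

def interaction_loop() -> None:
    # Idle while waiting for input, switch to a talking clip while
    # responding, then return to idle.
    while True:
        set_animation("idle")
        utterance = listen()
        set_animation("talking")
        speak(ask_llm(utterance))

if __name__ == "__main__":
    interaction_loop()
</code>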
  
  
**Required skills or prerequisites:**
  - Ability to work independently and as part of a team.
  - Knowledge/interest in Unity and C# programming
  - Ability to work with external partners
  
  
  
**Instructions:**
Contact Michael Jenkin by email (jenkin@yorku.ca) if interested.
  
----

==== Indoor navigation for an omnidirectional robot ====

**[added 2025-04-18]**

**Course:** {EECS4080}

**Supervisor:** Michael Jenkin

**Supervisor's email address:** jenkin@yorku.ca

**Project Description:**
Point-to-point navigation in an indoor environment requires solutions to a number of problems related to mapping, pose estimation and path planning. Fortunately, libraries now exist that support all of these tasks. This project involves deploying standard navigation tools on an omnidirectional robot in the lab, and then developing interfaces that let a person give the robot high-level instructions for point-to-point navigation in a previously mapped space.
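
For a sense of what deploying "standard navigation tools" can look like, here is a minimal sketch, assuming a ROS 2 / Nav2 stack with a map already available; the frame name and goal coordinates are illustrative only, not taken from the lab setup.

<code python>
# Minimal sketch: send one point-to-point goal through Nav2's simple commander.
# Assumes a ROS 2 system with Nav2 running and a map already provided.
import rclpy
from geometry_msgs.msg import PoseStamped
from nav2_simple_commander.robot_navigator import BasicNavigator, TaskResult

def main():
    rclpy.init()
    navigator = BasicNavigator()
    navigator.waitUntilNav2Active()   # wait for localization and planners

    # Illustrative goal pose in the map frame (coordinates are made up).
    goal = PoseStamped()
    goal.header.frame_id = 'map'
    goal.header.stamp = navigator.get_clock().now().to_msg()
    goal.pose.position.x = 2.0
    goal.pose.position.y = 1.0
    goal.pose.orientation.w = 1.0

    navigator.goToPose(goal)
    while not navigator.isTaskComplete():
        navigator.getFeedback()  # a real interface would report progress here

    if navigator.getResult() == TaskResult.SUCCEEDED:
        print('Goal reached')
    else:
        print('Navigation failed or was canceled')

    rclpy.shutdown()

if __name__ == '__main__':
    main()
</code>

The project's contribution would sit above a call like this: the interface that turns a person's high-level instruction into a goal pose.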


**Required skills or prerequisites:**
  - Ability to work independently and as part of a team.
  - Knowledge of ROS would be helpful
  - Ability to work with external partners


**Recommended skills or prerequisites:**
None beyond 4080 prerequisites

**Instructions:**
Contact Michael Jenkin by email (jenkin@yorku.ca) if interested.

----

==== Leveraging local LLMs for interactive office assistance ====

**[added 2025-04-18]**

**Course:** {EECS4080}

**Supervisor:** Michael Jenkin

**Supervisor's email address:** jenkin@yorku.ca

**Project Description:**
Avatars have been proposed as a key element in user interface designs at least since Microsoft's Clippy. In the lab we have been developing a Unity-based avatar that serves as the front end to an LLM-based system and that can be deployed in a variety of environments. This forward-facing avatar supports natural interaction with people in its environment, providing audio-based input and output and literally putting a face on the underlying system. The goal of this project is to take the operational system and enhance it in a number of ways, most critically by adding individual- and group-specific control of the avatar's interaction structure. An interest in LLMs and LangChain-based, user-group-aware conversational agents is an asset.
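
As a rough illustration of what individual- and group-specific control could mean, here is a small Python sketch that selects a different system prompt depending on which user group is talking to the avatar. The group names, prompts, and helper functions are hypothetical placeholders; in the project this logic would sit inside a LangChain-based agent in front of a locally hosted LLM.

<code python>
# Sketch only: group-aware prompt selection in front of an LLM.
# GROUP_PROMPTS, identify_group, and ask_llm are illustrative placeholders,
# not part of the existing lab system.

GROUP_PROMPTS = {
    "visitor": "You are a friendly front-desk avatar. Give short directions "
               "and general information only.",
    "student": "You are a lab assistant avatar. You may discuss ongoing "
               "projects and point to documentation.",
    "staff":   "You are an office assistant avatar. You may help with "
               "scheduling and internal resources.",
}

def identify_group(user_id: str) -> str:
    # Placeholder: a real system might use face/voice recognition or a badge.
    return "visitor"

def ask_llm(system_prompt: str, utterance: str) -> str:
    # Placeholder for a LangChain chain calling a locally hosted LLM.
    return f"[reply generated under policy: {system_prompt[:40]}...]"

def respond(user_id: str, utterance: str) -> str:
    group = identify_group(user_id)
    system_prompt = GROUP_PROMPTS.get(group, GROUP_PROMPTS["visitor"])
    return ask_llm(system_prompt, utterance)

if __name__ == "__main__":
    print(respond("u123", "Where can I find Professor Jenkin's office?"))
</code>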


**Required skills or prerequisites:**
  - Ability to work independently and as part of a team.
  - Knowledge/interest in Unity and C# programming
  - Ability to work with external partners


**Recommended skills or prerequisites:**
None beyond 4080 prerequisites

**Instructions:**
Contact Michael Jenkin by email (jenkin@yorku.ca) if interested.

----
  
  