==== AI-Assisted Biomedical Devices ====

**[added 2025-04-16]**

**Course:**  {EECS4080}

**Supervisor:**  Maleknaz Nayebi

----

==== Using Generative AI for Compliance Analysis in Health Care ====

**[added 2025-04-16]**

**Course:**  {EECS4080}

**Supervisor:**  Maleknaz Nayebi

**Supervisor's email address:**  mnayebi@yorku.ca

**Required skills or prerequisites:**
  * Proficient in Python programming

**Recommended skills or prerequisites:**
Understanding of machine learning and image processing

**Instructions:**
Please email your CV and transcripts to the professor.

----

==== LLM-augmented Software Quality Assurance Techniques ====

**[added 2025-04-16]**

**Course:**  {EECS4070}

**Supervisor:** Song Wang

**Supervisor's email address:** wangsong@yorku.ca

**Instructions:**
Please email the professor.

----

==== Benchmarking LLM-Based IDEs for Repository-Level Code Generation ====

**[added 2025-04-16]**

**Course:**  {EECS4080}

**Supervisor:** Song Wang

**Supervisor's email address:** wangsong@yorku.ca

**Project Description:**
This project aims to benchmark the capabilities of LLM-based Integrated Development Environments (IDEs), such as GitHub Copilot, Gemini Code Assist, and Cursor, in performing repository-level code generation tasks. While these tools have shown impressive performance on function- or file-level suggestions, their effectiveness on project-wide challenges (cross-file dependencies, module integration, refactoring, and implementing features from high-level specifications) remains unclear. We will develop a benchmark suite based on real-world open-source repositories and evaluate multiple LLM-based IDEs using a combination of automated and human-in-the-loop metrics. The goal is a systematic understanding of the strengths and limitations of current LLM-augmented IDEs in supporting large-scale, context-aware code generation.
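
As a rough illustration only, here is a sketch of what one benchmark entry and one automated pass/fail check could look like. The schema fields, repository URL, and pytest-based test command are hypothetical placeholders, not a fixed design for the project.

<code python>
# Sketch of a repository-level benchmark entry plus one automated metric:
# run the repository's test suite after the IDE has applied its changes.
import json
import subprocess
from dataclasses import dataclass, field, asdict

@dataclass
class RepoTask:
    repo_url: str      # open-source repository the task is drawn from
    instruction: str   # high-level feature/refactoring specification
    target_files: list = field(default_factory=list)  # files expected to change
    test_command: str = "pytest tests/"  # command whose exit status scores the attempt

def evaluate(task: RepoTask, workdir: str) -> bool:
    """Return True if the repository's tests pass after the IDE's edits."""
    result = subprocess.run(task.test_command.split(),
                            cwd=workdir, capture_output=True)
    return result.returncode == 0

task = RepoTask(
    repo_url="https://github.com/example/project",  # placeholder
    instruction="Add pagination to the list endpoint across API and client.",
    target_files=["api/views.py", "client/list.js"],
)
print(json.dumps(asdict(task), indent=2))
</code>

Human-in-the-loop metrics (e.g., rating the coherence of cross-file changes) would complement this kind of automated scoring.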

**Required skills or prerequisites:**
EECS2030, EECS3311, EECS4313/4312

**Recommended skills or prerequisites:**
Python programming

**Instructions:**
Send your transcript to the professor.

----

==== Evaluating Large Language Models on Code Behavior and Execution Analysis ====

**[added 2025-04-16]**

**Course:**  {EECS4080}

**Supervisor:** Song Wang

**Supervisor's email address:** wangsong@yorku.ca

**Project Description:**
This project aims to evaluate the capabilities of Large Language Models (LLMs) in understanding and analyzing code behavior based on execution results. While LLMs have shown strong performance in code generation and completion, their ability to reason about dynamic execution (interpreting outputs, diagnosing runtime errors, and explaining unexpected behaviors) remains underexplored. We will develop a benchmark dataset containing code snippets paired with execution outcomes (e.g., outputs, errors, return values) and assess LLMs on tasks including output prediction, behavior explanation, and error diagnosis. The evaluation will consider both quantitative metrics (e.g., accuracy) and qualitative aspects (e.g., reasoning depth), offering insight into the strengths and limitations of current LLMs in execution-aware code analysis.
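
As a sketch of the data-collection step only: pairing a snippet with its observed execution outcome. The record format is an assumption, and exec-based running is only reasonable for small, hand-curated snippets.

<code python>
# Run a snippet and record its outcome (stdout or the raised error) so an
# LLM can later be asked to predict the output or diagnose the failure.
import io
import traceback
from contextlib import redirect_stdout

def execute_snippet(code: str) -> dict:
    """Return a benchmark record pairing the code with its outcome."""
    buf = io.StringIO()
    record = {"code": code, "stdout": "", "error": None}
    try:
        with redirect_stdout(buf):
            exec(code, {})  # fresh, isolated globals for each snippet
    except Exception:
        record["error"] = traceback.format_exc(limit=1)
    record["stdout"] = buf.getvalue()
    return record

print(execute_snippet("print(sum(range(5)))"))  # normal-output case
print(execute_snippet("print(1 / 0)"))          # runtime-error case
</code>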

**Required skills or prerequisites:**
GPA >= B+; EECS3311

**Recommended skills or prerequisites:**
Python programming

**Instructions:**
Send your CV and transcript to the professor.

----

==== Evaluation of Single-switch Scanning Keyboards ====

**[added 2025-04-17]**

**Course:**  {EECS4080}

**Supervisor:** Scott MacKenzie

**Supervisor's email address:** mack@yorku.ca

**Required skills or prerequisites:**
Skill in designing and conducting a user study, as per EECS 4441 (Human-Computer Interaction) or EECS 4443 (Mobile User Interfaces)

**Recommended skills or prerequisites:**
See above

**Instructions:**
Submit a CV. Include information on when EECS 4441 or EECS 4443 was taken, noting the grade received and the title of the course project.

----


==== Autonomous Aquatic Robot ====

**[added 2025-04-18]**

**Course:**  {EECS4080}

**Supervisor:** Michael Jenkin

**Supervisor's email address:** jenkin@yorku.ca

**Project Description:**
Much of the surface of the planet is covered by water. Mapping and other tasks in these environments can be aided by deploying unmanned surface vessels (USVs) that operate autonomously. This project involves refining the existing aquatic robot infrastructure to support the development of a robot team for surface and underwater monitoring of freshwater areas. Interest in autonomous systems is key, and this project could suit a small group (two students maximum).
The current robots (Eddy 2A-C) have been deployed for a number of years; the intent this summer is to update and upgrade the hardware and software infrastructure to (i) support multi-robot operations and (ii) ready the hardware for planned work in UAV-USV-UUV teamwork.
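
For the multi-robot side, one common ROS 2 pattern is to run identical nodes under per-robot namespaces so each vessel's topics stay separate. A minimal rclpy sketch, in which the ''status'' topic and robot names are placeholder assumptions:

<code python>
# Minimal rclpy node: launched once per robot under its own namespace,
# it publishes to /eddy2a/status, /eddy2b/status, etc. (names hypothetical).
import rclpy
from rclpy.node import Node
from std_msgs.msg import String

class StatusPublisher(Node):
    def __init__(self):
        super().__init__('status_publisher')
        self.pub = self.create_publisher(String, 'status', 10)
        self.timer = self.create_timer(1.0, self.tick)  # 1 Hz heartbeat

    def tick(self):
        msg = String()
        msg.data = 'ok'
        self.pub.publish(msg)

def main():
    rclpy.init()
    rclpy.spin(StatusPublisher())

if __name__ == '__main__':
    main()
</code>

Each robot would then launch the same executable with a namespace remap, e.g. ''ros2 run <pkg> status_publisher --ros-args -r __ns:=/eddy2a''.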

**Required skills or prerequisites:**
  - Ability to work independently and in groups
  - Good Python programming skills
  - Knowledge of/interest in ROS2 would be helpful

**Recommended skills or prerequisites:**
None beyond 4080 prerequisites

**Instructions:**
Contact Michael Jenkin by email (jenkin@yorku.ca) if interested.

----


==== Enhanced avatar for human-robot interaction ====

**[added 2025-04-18]**

**Course:**  {EECS4080}

**Supervisor:** Michael Jenkin

**Supervisor's email address:** jenkin@yorku.ca

**Project Description:**
Avatars have been proposed as a key element of user interface design since at least the development of Microsoft's Clippy. In the lab we have been developing a Unity-based avatar that operates as the front end of an LLM-based system and can be deployed in various environments. This forward-facing avatar provides natural interaction with individuals in the environment, supporting audio input and output and literally putting a face on the underlying system. The basic goal of the project is to take the operational system and enhance it in a number of ways, perhaps most critically through the addition of canned animation scripts that give the avatar a natural appearance both during and between interactions.

**Required skills or prerequisites:**
  - Ability to work independently and as part of a team
  - Knowledge of/interest in Unity and C# programming
  - Ability to work with external partners

**Recommended skills or prerequisites:**
None beyond 4080 prerequisites

**Instructions:**
Contact Michael Jenkin by email (jenkin@yorku.ca) if interested.

----

==== Indoor navigation for an omnidirectional robot ====

**[added 2025-04-18]**

**Course:**  {EECS4080}

**Supervisor:** Michael Jenkin

**Supervisor's email address:** jenkin@yorku.ca

**Project Description:**
Point-to-point navigation in an indoor environment requires solutions to a number of problems related to mapping, pose estimation, and path planning. Fortunately, libraries now exist that support all of these tasks. This project involves deploying standard navigation tools on an omnidirectional robot in the lab and then developing appropriate interfaces that let an individual give the robot high-level instructions for point-to-point navigation in a previously mapped space.
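
As one illustration of the kind of standard tooling available, here is a minimal sketch using the ROS 2 Nav2 stack's Python Simple Commander API to send a goal pose within an existing map. The goal coordinates are placeholders, and nothing here fixes the project's actual tool choice.

<code python>
# Send a single navigation goal via Nav2's Simple Commander API,
# assuming a map has already been built and localization is running.
import rclpy
from geometry_msgs.msg import PoseStamped
from nav2_simple_commander.robot_navigator import BasicNavigator, TaskResult

rclpy.init()
navigator = BasicNavigator()
navigator.waitUntilNav2Active()  # block until the Nav2 servers are up

goal = PoseStamped()
goal.header.frame_id = 'map'
goal.header.stamp = navigator.get_clock().now().to_msg()
goal.pose.position.x = 2.0   # hypothetical target in the mapped space
goal.pose.position.y = 1.0
goal.pose.orientation.w = 1.0

navigator.goToPose(goal)
while not navigator.isTaskComplete():
    pass  # navigator.getFeedback() could drive a progress display here

if navigator.getResult() == TaskResult.SUCCEEDED:
    print('Goal reached')
</code>

A high-level interface for this project would essentially map a user's instruction (e.g., "go to the printer") onto a stored pose and issue it this way.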

**Required skills or prerequisites:**
  - Ability to work independently and as part of a team
  - Knowledge of ROS would be helpful
  - Ability to work with external partners

**Recommended skills or prerequisites:**
None beyond 4080 prerequisites

**Instructions:**
Contact Michael Jenkin by email (jenkin@yorku.ca) if interested.

----

==== Leveraging local LLMs for interactive office assistance ====

**[added 2025-04-18]**

**Course:**  {EECS4080}

**Supervisor:** Michael Jenkin

**Supervisor's email address:** jenkin@yorku.ca

**Project Description:**
Avatars have been proposed as a key element of user interface design since at least the development of Microsoft's Clippy. In the lab, we have been developing a Unity-based avatar that operates as the front end of an LLM-based system and can be deployed in various environments. This forward-facing avatar provides natural interaction with individuals in the environment, supporting audio input and output and literally putting a face on the underlying system. The basic goal of the project is to take the operational system and enhance it in a number of ways, perhaps most critically through the addition of individual- and group-specific control of the avatar's interaction structure. An interest in LLMs and in LangChain-based, user-group-aware conversational agents is essential.
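
As a rough sketch of the group-aware idea only, assuming a local model served through Ollama and the langchain-ollama integration; the group names and system prompts are hypothetical placeholders.

<code python>
# One assistant turn whose system prompt depends on the user's group,
# using a locally served chat model via LangChain's Ollama integration.
from langchain_core.prompts import ChatPromptTemplate
from langchain_ollama import ChatOllama

GROUP_PROMPTS = {  # hypothetical per-group persona/policy prompts
    "visitor": "You are a front-desk avatar. Give directions only.",
    "staff": "You are an office assistant. You may discuss schedules.",
}

llm = ChatOllama(model="llama3")  # any locally served chat model

prompt = ChatPromptTemplate.from_messages([
    ("system", "{group_policy}"),
    ("human", "{utterance}"),
])

chain = prompt | llm  # LangChain runnable composition

def respond(group: str, utterance: str) -> str:
    reply = chain.invoke({
        "group_policy": GROUP_PROMPTS.get(group, GROUP_PROMPTS["visitor"]),
        "utterance": utterance,
    })
    return reply.content

print(respond("visitor", "Where is room 1003?"))
</code>

In the full system, the reply text would be handed to the Unity front end for speech output and matching avatar animation.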

**Required skills or prerequisites:**
  - Ability to work independently and as part of a team
  - Knowledge of/interest in Unity and C# programming
  - Ability to work with external partners

**Recommended skills or prerequisites:**
None beyond 4080 prerequisites

**Instructions:**
Contact Michael Jenkin by email (jenkin@yorku.ca) if interested.

----