S25 Project Listings


LLM4SE (Large Language Models for Software Engineering)

[added 2025-04-11]

Course: {EECS4070 | EECS4080}

Supervisor: Zhen Ming (Jack) Jiang

Supervisor's email address: zmjiang@yorku.ca

Project Description: Software engineering data (e.g., source code repositories and bug databases) contain a wealth of information about a project's status and history. With recent advances in large language models (e.g., GPT and BERT) and their applications (e.g., ChatGPT and GitHub Copilot), many software engineering tasks can be automated or optimized. In this project, the student(s) will explore and investigate various software engineering applications that can benefit from the use of LLMs.

Required skills or prerequisites:

Recommended skills or prerequisites: Some knowledge of AI is preferred but not required

Instructions: Send c.v. and unofficial transcript to the supervisor.


FMOps

[added 2025-04-11]

Course: {EECS4070 | EECS4080}

Supervisor: Zhen Ming (Jack) Jiang

Supervisor's email address: zmjiang@yorku.ca

Project Description: Artificial intelligence is gaining rapid popularity in both research and practice, due to recent advances in machine learning (ML) research and development. Many ML applications (e.g., Tesla's autonomous vehicles and Apple's Siri) are already widely used in people's everyday lives. McKinsey recently estimated that ML applications have the potential to create between $3.5 and $5.8 trillion in value annually. Foundation models (FMs) are large AI models trained on vast quantities of data at scale. FMs can be used to power a wide range of downstream tasks (e.g., chatbots, code assistants, and tutors). However, many challenges remain in efficiently training, deploying, and monitoring such FM infrastructure. In addition, there is a lack of tools and processes for developing applications or services on top of such FMs. The goal of this project is to develop engineering tools and best practices to support the effective operationalization of FMs.

Required skills or prerequisites:

Recommended skills or prerequisites: Some knowledge of AI is preferred but not required

Instructions: Send c.v. and unofficial transcript to the supervisor.


AI Safety and AI Alignment

[added 2025-04-11]

Course: {EECS4080 | EECS4070}

Supervisor: Laleh Seyyed-Kalantari

Supervisor's email address: lsk@yorku.ca

Topics of Interest:

Required skills or prerequisite courses:

Recommended skills or prerequisite courses:

Instructions: If you are interested, please fill out this form and email me the same materials: https://docs.google.com/forms/d/e/1FAIpQLSfI_nBfwLKykI0W62J_LJEez-gDrwDxFiSg4RTwNw438v9U1Q/viewform


Computer Architecture & Other Topics

[added 2025-04-11]

Course: {EECS4080 | EECS4480}

Supervisor: Anirudh M Kaushik

Supervisor's email address: kaushika@yorku.ca

Topics of Interest: Computer architecture, embedded systems, compilers, electronic design automation tools, databases, software analysis

Instructions: Please email the professor.


Wearable Biomedical Devices

[added 2025-04-11]

Course: {EECS4080 | EECS4070}

Supervisor: Razieh Salahandish

Supervisor's email address: raziehs@yorku.ca

Instructions: Please email the professor.


AI-Assisted Biomedical Devices

[added 2025-04-11]

Course: {EECS4080 | EECS4070}

Supervisor: Razieh Salahandish

Supervisor's email address: raziehs@yorku.ca

Instructions: Please email the professor.


Image processing for Software Engineering

[added 2025-04-16]

Course: {EECS4080}

Supervisor: Maleknaz Nayebi

Supervisor's email address: mnayebi@yorku.ca

Required skills or prerequisites:

Recommended skills or prerequisites: Understanding of Machine Learning and Image Processing

Instructions: Please email your CV and transcripts to the professor.


Using Generative AI for Compliance Analysis in Health Care

[added 2025-04-16]

Course: {EECS4080}

Supervisor: Maleknaz Nayebi

Supervisor's email address: mnayebi@yorku.ca

Required skills or prerequisites:

Recommended skills or prerequisites: Understanding of Machine Learning and Image Processing

Instructions: Please email your CV and transcripts to the professor.


LLM-augmented Software Quality Assurance Techniques

[added 2025-04-16]

Course: {EECS4070}

Supervisor: Song Wang

Supervisor's email address: wangsong@yorku.ca

Instructions: Please email the professor.


Benchmarking LLM-Based IDEs for Repository-Level Code Generation

[added 2025-04-16]

Course: {EECS4080}

Supervisor: Song Wang

Supervisor's email address: wangsong@yorku.ca

Project Description: This project aims to benchmark the capabilities of LLM-based Integrated Development Environments (IDEs), such as GitHub Copilot, Gemini Code Assist, and Cursor, in performing repository-level code generation tasks. While these tools have shown impressive performance on function- or file-level suggestions, their effectiveness in handling project-wide challenges (such as cross-file dependencies, module integration, refactoring, and implementing features based on high-level specifications) remains unclear. We will develop a benchmark suite based on real-world open-source repositories and evaluate multiple LLM-based IDEs using a combination of automated and human-in-the-loop metrics. The goal is to provide a systematic understanding of the strengths and limitations of current LLM-augmented IDEs in supporting large-scale, context-aware code generation.
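
To make the evaluation concrete, the sketch below shows one possible shape for a benchmark task and its automated check: the IDE is asked to implement a change described by a high-level instruction, and success is judged by running the repository's own test suite. The schema, field names, and test-command convention are illustrative assumptions, not a prescribed design.

    import subprocess
    from dataclasses import dataclass

    @dataclass
    class RepoTask:
        """One repository-level benchmark task (illustrative schema, not prescribed)."""
        repo_path: str      # local checkout of an open-source repository
        instruction: str    # high-level specification handed to the LLM-based IDE
        test_command: str   # command whose passing tests define automated success

    def evaluate_task(task: RepoTask, timeout_s: int = 600) -> bool:
        """Run the repository's own test suite after the IDE has applied its edits."""
        result = subprocess.run(
            task.test_command.split(),
            cwd=task.repo_path,
            capture_output=True,
            text=True,
            timeout=timeout_s,
        )
        return result.returncode == 0

    if __name__ == "__main__":
        # Hypothetical task; the repository path and instruction are placeholders.
        task = RepoTask(
            repo_path="./sample-repo",
            instruction="Add pagination to the /users endpoint across its modules",
            test_command="pytest -q",
        )
        print("pass" if evaluate_task(task) else "fail")

An automated pass/fail signal of this kind would then be complemented by the human-in-the-loop review of code quality and integration mentioned above.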

Required skills or prerequisites: EECS2030, EECS3311, EECS4313/4312

Recommended skills or prerequisites: Python programming

Instructions: Send the transcript to the professor.


Evaluating Large Language Models on Code Behavior and Execution Analysis

[added 2025-04-16]

Course: {EECS4080}

Supervisor: Song Wang

Supervisor's email address: wangsong@yorku.ca

Project Description: This project aims to evaluate the capabilities of Large Language Models (LLMs) in understanding and analyzing code behavior based on execution results. While LLMs have shown strong performance in code generation and completion, their ability to reason about dynamic execution (such as interpreting outputs, diagnosing runtime errors, and explaining unexpected behaviors) remains underexplored. We will develop a benchmark dataset containing code snippets paired with execution outcomes (e.g., outputs, errors, return values) and assess LLMs on tasks including output prediction, behavior explanation, and error diagnosis. The evaluation will consider both quantitative metrics (e.g., accuracy) and qualitative aspects (e.g., reasoning depth), offering insights into the strengths and limitations of current LLMs in execution-aware code analysis.
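
As a rough illustration, one way such execution-outcome records could be collected is to run each snippet in a subprocess and store what is observed; the helper and record schema below are assumptions for illustration, not the project's prescribed pipeline.

    import json
    import subprocess
    import sys

    def execution_record(snippet: str, timeout_s: int = 10) -> dict:
        """Run a Python snippet and capture its observed execution outcome.

        The (stdout, stderr, exit code) triple becomes the ground truth against
        which an LLM's output prediction or error diagnosis can later be scored.
        """
        proc = subprocess.run(
            [sys.executable, "-c", snippet],
            capture_output=True,
            text=True,
            timeout=timeout_s,
        )
        return {
            "code": snippet,
            "stdout": proc.stdout,
            "stderr": proc.stderr,
            "exit_code": proc.returncode,
        }

    if __name__ == "__main__":
        # A snippet that fails at runtime, useful for the error-diagnosis task.
        record = execution_record("xs = [1, 2, 3]\nprint(xs[5])")
        print(json.dumps(record, indent=2))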

Required skills or prerequisites: GPA >= B+; EECS3311

Recommended skills or prerequisites: Python programming

Instructions: Send the CV and transcript to the professor.


Evaluation of Single-switch Scanning Keyboards

[added 2025-04-17]

Course: {EECS4080}

Supervisor: Scott MacKenzie

Supervisor's email address: mack@yorku.ca

Required skills or prerequisites: Skill in designing and conducting a user study, as per EECS 4441 (Human-Computer Interaction) or EECS 4443 (Mobile User Interfaces)

Recommended skills or prerequisites: See above

Instructions: Submit a CV. Include information on when EECS 4441 or EECS 4443 was taken, noting the grade received and the title of the course project.


Autonomous Aquatic Robot

[added 2025-04-18]

Course: {EECS4080}

Supervisor: Michael Jenkin

Supervisor's email address: jenkin@yorku.ca

Project Description: Much of the surface of the planet is covered by water. Mapping and other tasks in these environments can be augmented through the deployment of unmanned surface vessels (USVs) that operate autonomously. This project involves refining the existing aquatic robot infrastructure to assist in the development of a robot team to support surface and underwater monitoring of freshwater areas. Interest in autonomous systems is key, and this project could be suitable for small groups (two students maximum). The current robots (Eddy 2A-C) have been deployed for a number of years, and the intent this summer is to update/upgrade the hardware/software infrastructure to (i) support multi-robot operations and (ii) ready the hardware for planned work in UAV-USV-UUV teamwork.

Required skills or prerequisites:

  1. Ability to work independently and in groups
  2. Good Python programming skills
  3. Knowledge of/interest in ROS2 would be helpful

Recommended skills or prerequisites: None beyond 4080 prerequisites

Instructions: Contact Michael Jenkin by email (jenkin@yorku.ca) if interested.


Enhanced avatar for human-robot interaction

[added 2025-04-18]

Course: {EECS4080}

Supervisor: Michael Jenkin

Supervisor's email address: jenkin@yorku.ca

Project Description: Avatars have been proposed as a key element in user interface designs since the development of Microsoft's Clippy, if not before. In the lab we have been developing a Unity-based avatar that operates as the front end of an LLM-based system and can be deployed in various environments. This forward-facing avatar provides natural interaction with individuals in the environment, providing audio-based input and output and literally putting a face on the underlying system. The basic goal of the project is to take the operational system and enhance it in a number of ways, perhaps most critically through the addition of canned animation scripts that give the avatar a natural appearance both during interaction and when idle.

Required skills or prerequisites:

  1. Ability to work independently and as part of a team.
  2. Knowledge/interest in Unity and C# programming
  3. Ability to work with external partners

Recommended skills or prerequisites: None beyond 4080 prerequisites

Instructions: Contact Michael Jenkin by email (jenkin@yorku.ca) if interested.


Indoor navigation for an omnidirectional robot

[added 2025-04-18]

Course: {EECS4080}

Supervisor: Michael Jenkin

Supervisor's email address: jenkin@yorku.ca

Project Description: Point-to-point navigation in an indoor environment requires solutions to a number of problems related to mapping, pose estimation, and path planning. Fortunately, libraries now exist that support all of these tasks. This project involves deploying standard navigation tools on an omnidirectional robot in the lab and then developing appropriate interfaces that enable an individual to provide high-level instructions to the robot for point-to-point navigation in a previously mapped space.
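
For context, the high-level interface described above would ultimately reduce to sending goals to the standard navigation stack. The sketch below shows one way a single point-to-point goal could be issued, assuming the robot runs ROS 2 with the Nav2 stack and its simple-commander API on a previously mapped space; the coordinates and frame name are placeholders, not the lab's actual configuration.

    # Minimal sketch: send one goal pose through Nav2 (assumes ROS 2 + Nav2 are
    # installed and running with a pre-built map; coordinates are placeholders).
    import rclpy
    from geometry_msgs.msg import PoseStamped
    from nav2_simple_commander.robot_navigator import BasicNavigator, TaskResult

    def main():
        rclpy.init()
        navigator = BasicNavigator()
        navigator.waitUntilNav2Active()  # wait for localization and planning to come up

        goal = PoseStamped()
        goal.header.frame_id = "map"
        goal.header.stamp = navigator.get_clock().now().to_msg()
        goal.pose.position.x = 2.0   # placeholder target in the pre-built map
        goal.pose.position.y = 1.0
        goal.pose.orientation.w = 1.0

        navigator.goToPose(goal)
        while not navigator.isTaskComplete():
            pass  # a real interface would report progress back to the user here

        print("Reached goal" if navigator.getResult() == TaskResult.SUCCEEDED
              else "Navigation failed")
        rclpy.shutdown()

    if __name__ == "__main__":
        main()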

Required skills or prerequisites:

  1. Ability to work independently and as part of a team.
  2. Knowledge of ROS would be helpful
  3. Ability to work with external partners

Recommended skills or prerequisites: None beyond 4080 prerequisites

Instructions: Contact Michael Jenkin by email (jenkin@yorku.ca) if interested.


Leveraging local LLMs for interactive office assistance

[added 2025-04-18]

Course: {EECS4080}

Supervisor: Michael Jenkin

Supervisor's email address: jenkin@yorku.ca

Project Description: Avatars have been proposed as a key element in user interface designs since the development of Microsoft's Clippy, if not before. In the lab, we have been developing a Unity-based avatar that operates as the front end of an LLM-based system and can be deployed in various environments. This forward-facing avatar provides natural interaction with individuals in the environment, providing audio-based input and output and literally putting a face on the underlying system. The basic goal of the project is to take the operational system and enhance it in a number of ways, perhaps most critically through the addition of individual- and group-specific control of the avatar's interaction structure. Interest in LLMs and LangChain-based, user-group-aware conversational agents is expected.
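
As a rough illustration of group-specific control, the sketch below parameterizes the system prompt of a locally served model by user group, assuming the langchain-ollama integration; the model name, group labels, and prompt wording are illustrative assumptions rather than the lab's actual design.

    from langchain_core.prompts import ChatPromptTemplate
    from langchain_ollama import ChatOllama

    # System prompt parameterized by user group, so the same avatar back end can
    # adopt a different persona for, e.g., visitors versus lab members.
    GROUP_STYLES = {
        "visitor": "Give short, friendly answers and avoid lab jargon.",
        "lab_member": "Answer technically and reference lab equipment freely.",
    }

    prompt = ChatPromptTemplate.from_messages([
        ("system", "You are the voice of an office-assistant avatar. {style}"),
        ("human", "{question}"),
    ])
    llm = ChatOllama(model="llama3")  # hypothetical local model name
    chain = prompt | llm

    def ask(question: str, group: str) -> str:
        """Route a question through the group-aware prompt and return the reply text."""
        reply = chain.invoke({"style": GROUP_STYLES[group], "question": question})
        return reply.content  # text that would be handed to the avatar's speech front end

    if __name__ == "__main__":
        print(ask("Where is the robotics lab?", group="visitor"))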

Required skills or prerequisites:

  1. Ability to work independently and as part of a team.
  2. Knowledge/interest in Unity and C# programming
  3. Ability to work with external partners

Recommended skills or prerequisites: None beyond 4080 prerequisites

Instructions: Contact Michael Jenkin by email (jenkin@yorku.ca) if interested.