2025-26:fall

2025-26:fall [2025/07/29 12:42] mnayebi
2025-26:fall [2025/08/20 13:28] (current) sallin
Line 8: Line 8:
  * You can **subscribe** to this page in order to receive an email whenever the project listing page updates. Only logged in users have access to the “Manage Subscriptions” page tool.  See [[https://www.dokuwiki.org/subscription]]
  * The Table of Contents can be collapsed/expanded.
 +  * As a general rule, past projects are good indicators of what faculty are interested in; [[https://wiki.eecs.yorku.ca/dept/project-courses/projects|see this link for an archive]]. In addition, [[https://lassonde.yorku.ca/research/lura-and-usra-research-at-lassonde#browse-projects|check out some prior LURA/USRA project descriptions at this link]]; these are also good indicators of the kind of work faculty will want to mentor.
  
 /** DO NOT EDIT ABOVE THIS LINE PLEASE **/
Line 14: Line 15:
  
 ==== Computer Security Projects ====
- 
-  
  
 ** [added 2025-07-21] **  ** [added 2025-07-21] ** 
Line 32: Line 31:
  
 ** Instructions:** Reach out to security faculty to see if they have the capacity to supervise this term.  For questions about eligible security projects, contact the CSec Coordinator (Yan Shvartzshnaider).
 +
 +----
 +
 +==== Emotion-Aware Analysis of EECS Course Feedback for Instructional Improvement ====
 +
 +** [added 2025-08-08] ** 
 + 
 +** Course:**  {EECS4080} 
 +
 +** Supervisors:**  Pooja Vashith
 + 
 +** Supervisor's email address: ** vashistp@yorku.ca
 +
 +** Project Description: ** This project aims to uncover meaningful insights from EECS course evaluations by applying natural language processing (NLP) techniques to student feedback. While most universities collect large volumes of student comments in course evaluations, these are typically underused, especially when embedded in PDF files. Qualitative feedback is often reviewed manually or averaged superficially, leaving behind rich emotional and experiential data that could inform course improvement.
 +
 +The primary goal is to build a processing pipeline that extracts, cleans, and analyzes this feedback using both basic sentiment analysis tools (e.g., VADER) and advanced emotion classification models (e.g., GoEmotions). The emotional tone expressed in the feedback will be mapped to different course components such as the instructor, teaching assistant, assessments, and course content. NB: These are already separated in the evaluation structure.
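
As a rough illustration of the pipeline shape (not the project's actual tooling), the sketch below aggregates per-component polarity scores; the tiny lexicon stands in for VADER, and the comments are invented:

```python
# Toy feedback-analysis pipeline: comments arrive already grouped by
# course component, a small lexicon stands in for VADER, and we report
# a mean polarity score per component.

POSITIVE = {"great", "clear", "helpful", "engaging"}
NEGATIVE = {"confusing", "unfair", "boring", "frustrating"}

def polarity(comment: str) -> float:
    """Score in [-1, 1]: (pos - neg) / total sentiment words found."""
    words = [w.strip(".,!?") for w in comment.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 0.0 if pos + neg == 0 else (pos - neg) / (pos + neg)

def component_scores(feedback: dict) -> dict:
    """Mean polarity per course component (instructor, assessments, ...)."""
    return {comp: sum(map(polarity, comments)) / len(comments)
            for comp, comments in feedback.items() if comments}

feedback = {
    "instructor": ["Great lectures, very clear and engaging!"],
    "assessments": ["The midterm was confusing and unfair."],
}
scores = component_scores(feedback)
```

A real pipeline would add the PDF-extraction and cleaning stages before this step, and an emotion classifier (e.g., a GoEmotions model) alongside the polarity scorer.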
 +
 +By comparing the expressiveness and usefulness of simple versus fine-grained emotional analysis, this research will help determine which approaches are more effective at surfacing actionable insights. These insights will be visualized to highlight recurring patterns of sentiment or emotion across course components, such as whether students consistently express frustration about assessments or admiration for certain instructors.
 +
 +This project is educational in nature as it equips the student with skills in text analytics, NLP tools, and data visualization while contributing to a broader understanding of how data-driven analysis can support evidence-based teaching and curriculum refinement in academic institutions.
 +
 +** Required skills or prerequisites: ** EECS 4412 or EECS 4404
 +
 +Data Analysis, Report Writing, Python programming, web app development, appetite for research
 +
 +** Instructions:** Send a CV, transcript, statement of interest, and a list of relevant skills to the instructor (Pooja).
 +
 +----
 +
 +==== Deep Learning and AI in Incident Management ====
 +
 +** [added 2025-08-20] ** 
 + 
 +** Course:**  {EECS4070 | EECS4080 | EECS4090} 
 +
 +** Supervisors:**  Marios Fokaefs
 + 
 +** Supervisor's email address: ** fokaefs@yorku.ca
 +
 +** Project Description: ** Large-scale complex software systems generate immense amounts of event data. This creates a significant cognitive load and workload for reliability engineers, and a number of different challenges. First, detection of problems is delayed by the sheer volume of data. When problems are finally detected, their analysis and resolution may take even more time, which translates into lost revenue. After resolution, the whole cycle must be well documented; otherwise reproducibility is reduced and unnecessary effort may be invested. 
 +
 +
 +** Required skills or prerequisites: **
 +
 +Students must have:
 +
 +  * Excellent programming skills (preferably Python)
 +  * Good software design skills (must have at least a B+ in EECS3311 or similar courses)
 +  * Some experience with LLMs, both as a user and as a developer
 +
 +
 +** Instructions:** Interested students must submit to the instructor (Marios):
 +
 +- CV
 +
 +- A statement of interest
 +
 +- Latest transcript
 +
 +- Other evidence (e.g., software repositories) as proof of skills
 +
 +----
 +
 +==== Beyond the Mask: Reimagining Facial Recognition with Deep Transfer Learning ====
 +
 +** [added 2025-08-21] ** 
 + 
 +** Course:**  {EECS4480} 
 +
 +** Supervisors:**  Sunila Akbar
 + 
 +** Supervisor's email address: ** sunila@yorku.ca
 + 
 +** Project Description: ** The project involves adapting a state-of-the-art, pretrained deep learning model for facial recognition to accurately identify individuals wearing masks. The student will utilize publicly available datasets and apply data augmentation techniques to simulate mask-wearing scenarios. Transfer learning will be employed to fine-tune the model for this specific task. The performance of the resulting model will be rigorously evaluated against established benchmarks.
 +
 +Application Domain: The proposed solution has relevance in environments where mask-wearing is mandatory, such as healthcare facilities, long-term care homes, food service industries, and chemical or pharmaceutical plants. Accurate masked facial recognition can enhance access control, attendance tracking, and safety compliance in these critical settings.
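
One piece of this, the augmentation step, can be sketched in plain Python; the occlusion rule and pixel values below are illustrative placeholders for what a real image library would do to simulate mask-wearing:

```python
# Sketch of mask-simulation augmentation: occlude the lower portion of a
# face image (here a nested list of grayscale pixels). A real pipeline
# would apply this to dataset images before fine-tuning the model.

def add_synthetic_mask(image, mask_value=255, coverage=0.45):
    """Return a copy with the bottom `coverage` fraction of rows occluded."""
    h = len(image)
    masked_rows = int(h * coverage)
    return [row[:] if i < h - masked_rows else [mask_value] * len(row)
            for i, row in enumerate(image)]

face = [[i * 10 + j for j in range(4)] for i in range(4)]  # tiny 4x4 "image"
masked = add_synthetic_mask(face, coverage=0.5)
```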
 +
 +** Required skills or prerequisites: ** 
 +
 +  * Python, PyTorch, NumPy, Scikit-learn, OpenCV
 +  * Knowledge of any deep learning model is a plus
 +  * Hyperparameter tuning and optimization
 +  * Understanding of image processing techniques and object detection evaluation metrics
 +  * General interest in computer vision algorithms and applications
 +
 +** Instructions:** Send your CV and transcript to the instructor (Sunila).
 +
 +----
 +
 +
 +==== Smart Tools for Smarter Brain Scans: Motion Correction in fMRI  ==== 
 +
 +**[added 2025-08-08]**
 +
 +**Course:** {EECS4080 | EECS4088}
 +
 +**Supervisor:** Sima Soltanpour
 +
 +** Supervisor's email address:** simasp@yorku.ca
 +
 +** Project Description: ** Functional Magnetic Resonance Imaging (fMRI) is a widely used technique for studying brain function, but its accuracy is often limited by artifacts caused by head movement during scanning. These artifacts can distort signal measurements and reduce the reliability of data analysis. This project aims to investigate and implement motion correction techniques for fMRI data using both traditional preprocessing pipelines and emerging AI-based approaches. Students will explore how image quality and signal stability can be improved through algorithmic correction. This research-focused project provides an opportunity to gain experience in neuroimaging, signal processing, and the application of machine learning to real-world biomedical data.
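
As a toy analogue of registration-based correction (real fMRI pipelines estimate 3-D rigid-body motion per volume), the sketch below recovers an integer shift between a 1-D signal and a reference and undoes it:

```python
# Minimal registration sketch: find the integer shift that best aligns a
# signal to a reference (least squared error over candidate shifts), then
# shift the signal back. The signals here are invented.

def best_shift(reference, signal, max_shift=3):
    def sse(shift):
        pairs = [(reference[i], signal[i + shift])
                 for i in range(len(reference))
                 if 0 <= i + shift < len(signal)]
        return sum((r - s) ** 2 for r, s in pairs) / len(pairs)
    return min(range(-max_shift, max_shift + 1), key=sse)

reference = [0, 0, 1, 4, 1, 0, 0, 0]
moved     = [0, 0, 0, 0, 1, 4, 1, 0]   # same activity, shifted right by 2
shift = best_shift(reference, moved)   # -> 2
corrected = [moved[i + shift] if 0 <= i + shift < len(moved) else 0
             for i in range(len(moved))]
```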
 +
 +** Recommended skills or prerequisites: **   
 +
 +  * Python programming
 +  * Interest in AI and machine learning for biomedical applications
 +
 +** Instructions: ** P Please email your CV and unofficial transcript to the professor (Sima).
 +
 +----
 +
 +==== Fairness and Prediction for Online Algorithms  ==== 
 +
 +**[added 2025-08-05]**
 +
 +**Course:** {EECS4080}
 +
 +**Supervisor:** Shahin Kamali
 +
 +** Supervisor's email address:** kamalis@yorku.ca
 +
 +** Lab Link: ** [[https://sites.google.com/view/shahinkamali/home|here]]
 +
 +** Project Description: ** In this course, we will explore recent advances in algorithm design that incorporate fairness considerations. Achieving fairness often requires tools such as randomization and prediction. A typical setting involves scenarios where different groups or agents provide parts of the input, and the goal is to design algorithms that produce solutions that are fair across these groups. Typical applications include data structures (where different groups issue queries) and scheduling and packing problems. 
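
For a feel of competitive analysis (a classic warm-up example, not this project's fairness setting), the deterministic ski-rental rule can be checked numerically: renting costs 1 per day, buying costs B, and "rent until day B, then buy" pays at most about twice the optimal offline cost for any season length:

```python
# Ski rental: the canonical online problem. The online player does not
# know the season length; the rule below is (2 - 1/B)-competitive.

def online_cost(season_length, B):
    """Cost of renting for B-1 days, then buying on day B."""
    return season_length if season_length < B else (B - 1) + B

def offline_cost(season_length, B):
    """Optimal cost knowing the season length in advance."""
    return min(season_length, B)

worst_ratio = max(online_cost(n, 10) / offline_cost(n, 10)
                  for n in range(1, 100))
```

With B = 10 the worst case is a season of exactly 10 days, giving ratio 19/10 = 1.9, i.e. 2 - 1/B.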
 +
 +** Recommended skills or prerequisites: **   
 +
 +  * Online Computation and Competitive Analysis (Allan Borodin, Ran El-Yaniv)
 +
 +** Instructions: ** Please email your CV and unofficial transcript to the supervisor (Shahin).
 +
 +----
 +
 +====  Seeing Code: Image Processing for Software Engineering  ====
 +
 +**[added 2025-08-09]**
 +
 +
 +**Course:**  {EECS4088/4080}
 +
 +**Supervisor:**  Maleknaz Nayebi (Research Faculty/Associate Director of CIFAL York)
 +
 +**Supervisor's email address:**  mnayebi@yorku.ca
 +
 +**Required skills or prerequisites:**  
 +  * Proficient in Python programming
 +
 +**Recommended skills or prerequisites:**
 +Understanding of Machine Learning and Image Processing
 +
 +** Project Description: ** Software development is no longer just about text-based code. Developers increasingly share screenshots, diagrams, whiteboard sketches, and UI mockups in forums, documentation, and collaborative tools. But while humans can glance at an image and instantly understand what’s there, most software engineering tools ignore this visual goldmine. This project will explore how image processing and computer vision can be applied to help developers work smarter. Imagine tools that can:
 +(i) Automatically read and interpret code snippets from screenshots on Stack Overflow or GitHub issues
 +(ii) Detect UI elements and workflows from mobile app screenshots for automated testing
 +(iii) Extract architecture diagrams from PDFs and turn them into editable models
 +(iv) Identify errors, warnings, or environment details from IDE screenshots to improve bug reports
 +You’ll work with a small dataset of real-world images from developer communities, apply OCR (Optical Character Recognition), object detection, and layout analysis, and experiment with AI techniques to transform images into structured, machine-readable insights.
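
As a hedged sketch of one stage: suppose OCR (e.g., Tesseract) has already turned a screenshot into raw text; a simple heuristic can then separate code-like lines from prose. A real tool would use layout analysis and a trained classifier instead of this regex:

```python
# After OCR, pick out lines that look like code so the snippet can be
# stored in structured, machine-readable form. The OCR text is invented.

import re

CODE_HINTS = re.compile(r"[{}();=]|^\s{4,}|\bdef |\bimport ")

def extract_code_lines(ocr_text):
    return [line for line in ocr_text.splitlines() if CODE_HINTS.search(line)]

ocr_text = """Here is the function I tried:
def add(a, b):
    return a + b
It throws a TypeError on line 2."""
snippet = extract_code_lines(ocr_text)
```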
 +
 +**Why This is Cool:**
 +(a) You’ll be working at the intersection of computer vision and software engineering — an emerging research frontier.
 +(b) You will work alongside MSc and PhD students who started exactly where you are right now, as my undergraduate students in 4080/4088
 +(c) The project is grounded in real developer problems and could lead to tools that people actually use, and you may get to work with some of our industry partners. 
 +(d) You’ll gain experience with image processing libraries (like OpenCV, Tesseract), Python-based pipelines, and possibly even fine-tuning vision-language models.
 +(e) There’s potential for research publication or open-source release if results are promising. 
 +
 +**Instructions:**
 +Please email your CV and Transcripts to the professor (Maleknaz).
 +
 +----
 +
 +==== Using Generative AI for Compliance Analysis in Health Care ====
 +
 +**[added 2025-08-09]**
 +
 +
 +**Course:**  {EECS4080/4088}
 +
 +**Supervisor:**  Maleknaz Nayebi (Research Faculty/Associate Director of CIFAL York)
 +
 +**Supervisor's email address:**  mnayebi@yorku.ca
 +
 +**Required skills or prerequisites:**  
 +  * Proficient in Python programming
 +
 +**Recommended skills or prerequisites:**
 +Understanding of Machine Learning, prompt engineering, and GenAI
 +
 +**Project Description:** Health care is one of the most highly regulated industries in the world. Every new medical device, digital health tool, or clinical process must comply with complex rules and standards — from privacy laws like HIPAA to advertising regulations and medical ethics guidelines. The challenge? These rules are buried in long, dense, and ever-changing documents that are hard for humans to keep up with. This project will explore how Generative AI can act as an intelligent assistant for compliance analysis. Imagine a system that can:
 +(i) Read hundreds of pages of regulatory text and highlight the exact rules relevant to a given health care product or service
 +(ii) Compare a draft document or ad campaign against regulatory requirements to spot potential violations
 +(iii) Provide plain-language summaries of compliance risks for non-experts in health care teams
 +(iv) Learn from feedback to improve over time
 +
 +You’ll work with real-world health care regulations and guidance documents, build AI pipelines that integrate text extraction, retrieval-augmented generation (RAG), and natural language understanding, and evaluate how well AI can assist compliance officers and health care innovators.
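
The retrieval step of such a pipeline can be sketched with word overlap standing in for embeddings; the "regulations" below are invented examples, not real regulatory text:

```python
# Toy RAG retrieval: score each regulation paragraph against a query by
# word overlap and hand the best match to the generator as context. A
# real system would use an embedding index instead of set intersection.

def retrieve(query, documents):
    q = set(query.lower().split())
    def overlap(doc):
        return len(q & set(doc.lower().split()))
    return max(documents, key=overlap)

regulations = [
    "Patient records must be encrypted at rest and in transit.",
    "Advertising of medical devices must not overstate clinical benefits.",
    "Consent forms must be written in plain language.",
]
context = retrieve("how should patient records be encrypted", regulations)
```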
 +
 +
 +**Why This is Cool:**
 +(a) You’ll be applying AI to a real-world, high-impact domain where mistakes can affect patient safety and legal outcomes
 +(b) You’ll learn to work with state-of-the-art Generative AI tools (like OpenAI, Hugging Face models) for specialized, high-stakes tasks
 +(c) The project bridges machine learning, information retrieval, and domain-specific knowledge — skills that are highly sought after in industry
 +(d) Your work could inform research papers, prototypes, and real tools that help make health care safer and more efficient
 +
 +**Instructions:**
 +Please email your CV and Transcripts to the professor (Maleknaz).
 +
 +----
 +
 +
 +==== The impact of quantity and quality of feedback on RLHF  ==== 
 +
 +**[added 2025-08-08]**
 +
 +**Course:** {EECS4080}
 +
 +**Supervisor:** Ines Arous
 +
 +** Supervisor's email address:** inesar@yorku.ca
 +
 +** Lab Link:** [[https://inesarous.github.io/|here]]
 +
 +** Project Description: ** Reinforcement learning with human feedback (RLHF) has become widely used to enhance the performance of large language models. These methods rely heavily on the availability of large amounts of high-quality human feedback. Yet, it is unclear how the quantity and quality of feedback influence the performance of language models. This project aims to address these gaps by analyzing the relationship between the properties of human feedback and the framework of RLHF, with a particular focus on its core component—the reward model. The student will conduct an empirical evaluation on a summarization task, exploring how different quantities and qualities of feedback impact the effectiveness of the reward model in RLHF. The student will also investigate various sampling strategies to identify the minimum feedback needed for comparable performance with a reward model trained on a large dataset. To examine the impact of feedback quality, the student will simulate scenarios where the feedback is noisy and evaluate the reward model's accuracy as the quality of annotations is varied. 
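
A toy version of the quality experiment might look like this: pairwise preference labels are flipped at a given noise rate, a per-item win rate stands in for the reward model, and we measure how often the fitted scores still rank the truly better item first. All numbers are illustrative:

```python
# Simulate noisy pairwise feedback and check how ranking accuracy of a
# trivial "reward model" (per-item win rate) degrades with label noise.

import random

def ranking_accuracy(noise_rate, n_items=20, n_pairs=2000, seed=0):
    """Fraction of item pairs the fitted scores order correctly."""
    rng = random.Random(seed)
    wins = {i: 0 for i in range(n_items)}
    seen = {i: 0 for i in range(n_items)}
    for _ in range(n_pairs):
        a, b = rng.sample(range(n_items), 2)
        winner = max(a, b)                    # item index = true quality
        if rng.random() < noise_rate:         # annotator labels the worse one
            winner = min(a, b)
        wins[winner] += 1
        seen[a] += 1
        seen[b] += 1
    reward = {i: wins[i] / max(seen[i], 1) for i in range(n_items)}
    pairs = [(i, j) for i in range(n_items) for j in range(i + 1, n_items)]
    return sum(reward[i] < reward[j] for i, j in pairs) / len(pairs)

clean, noisy = ranking_accuracy(0.0), ranking_accuracy(0.4)
```

Varying `n_pairs` in the same harness gives a rough handle on the quantity question as well.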
 +
 +** Required skills or prerequisites: **   
 +
 +  * Major in Computer Science/Software Engineering/Computer Engineering
 +  * Third year and up
 +  * You must have completed a Machine Learning/ Artificial Intelligence course. 
 +  * Total GPA over B+ (Preferably A/A+)
 +
 +** Instructions: ** Please email your CV and Transcripts to the professor (Ines).
 +
 +----
 +
 +==== Guidelines for Human Evaluation of Generated Answers by LLMs  ==== 
 +
 +**[added 2025-08-05]**
 +
 +**Course:** {EECS4080}
 +
 +**Supervisor:** Ines Arous
 +
 +** Supervisor's email address:** inesar@yorku.ca
 +
 +** Lab Link:** [[https://inesarous.github.io/|here]]
 +
 +** Project Description: ** The project will use theories from behavioral science and psychology to derive guidelines for human evaluation of LLM-generated answers. The goal is to leverage methods such as power analysis to quantify the number of participants. Other concepts, such as construct validity (measuring intended personalization traits), content validity (ensuring coverage of relevant personalization dimensions), and ecological validity (reflecting real-world use cases), will be explored.
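
For the power-analysis piece, the standard normal-approximation formula for a two-sided, two-sample comparison gives a feel for participant counts; a real study would match the formula to its actual statistical test:

```python
# Required sample size per group: n = 2 * ((z_{1-a/2} + z_{power}) / d)^2,
# where d is the standardized effect size (Cohen's d).

import math
from statistics import NormalDist

def sample_size_per_group(effect_size, alpha=0.05, power=0.8):
    z = NormalDist().inv_cdf
    n = 2 * ((z(1 - alpha / 2) + z(power)) / effect_size) ** 2
    return math.ceil(n)
```

For a medium effect (d = 0.5) at the usual alpha = 0.05 and 80% power, this gives 63 participants per group; larger effects need fewer.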
 +
 +** Required skills or prerequisites: **   
 +
 +  * You must have completed a Machine Learning/NLP course. 
 +  * Total GPA over B+ (Preferably A/A+)
 +
 +** Instructions: ** Please send your CV, transcript and statement of interest to the professor (Ines).
 +
 +----
 +
 +==== Comparison of LLM personalization techniques on domain specific applications  ==== 
 +
 +**[added 2025-08-05]**
 +
 +**Course:** {EECS4088}
 +
 +**Supervisor:** Ines Arous
 +
 +** Supervisor's email address:** inesar@yorku.ca
 +
 +** Lab Link:** [[https://inesarous.github.io/|here]]
 +
 +** Project Description: ** The project will compare current LLM personalization techniques, such as chain-of-thought prompting, retrieval-augmented generation (RAG), and reinforcement learning with human feedback (RLHF), on domain-specific tasks using existing datasets.
 +
 +** Required skills or prerequisites: **   
 +
 +  * You must have completed a Machine Learning or a deep learning course. 
 +  * Total GPA over B+ (Preferably A/A+)
 +
 +** Instructions: ** Send your CV, transcripts, and previous ML-related code to the professor (Ines).
 +
 +----
 +
 +==== Vision Transformer-Based Pipelines for Biomedical Image Analysis and Secure Data Collection  ==== 
 +
 +**[added 2025-08-05]**
 +
 +**Course:** {EECS4080 | EECS4090}
 +
 +**Supervisor:** Navid Mohaghegh
 +
 +** Supervisor's email address:** navidmo@yorku.ca
 +
 +** Project Description: ** This project focuses on applying state-of-the-art deep learning models such as Vision Transformers (ViT) and hybrid CNN-transformer architectures (e.g., OWL-ViT) to biomedical image datasets (e.g., retinal scans, ultrasound). The student will build a scalable inference pipeline and benchmark performance against traditional CNN baselines.
 +
 +** Recommended skills or prerequisites: **   
 +
 +  * Python, PyTorch, OpenCV
 +  * Experience with deep learning models for image processing, such as YOLO
 +  * Interest in biomedical applications of AI
 +  * Interest in privacy aware and secure data collection and processing 
 +
 +** Instructions: ** Please email your CV and unofficial transcript to the supervisor (Navid).
 +
 +----
 +
 +==== AI-Driven Next-Generation Firewall and Network Anomaly Detection  ==== 
 +
 +**[added 2025-08-05]**
 +
 +**Course:** {EECS4080 | EECS4090}
 +
 +**Supervisor:** Navid Mohaghegh
 +
 +** Supervisor's email address:** navidmo@yorku.ca
 +
 +** Project Description: ** This project involves the design and implementation of prototype next-generation network- and application-level firewalls and API gateways that leverage AI models for real-time anomaly detection across diverse network and application traffic sources. The system will integrate stream processing, feature extraction, and transformer-based models for behavioural analysis. Students will implement components such as custom packet inspection, classification pipelines, and zero-day attack detection, using labeled datasets for supervised learning and simulated traffic along with unsupervised learning methods. We will also develop a lightweight federated learning framework to detect distributed attacks such as coordinated port scans, botnet behaviour, and insider threats. This will involve experimenting with decentralized model training, edge device simulations, and privacy-preserving protocols (e.g., differential privacy or homomorphic encryption).
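
As a minimal sketch of the stream-processing side (not the transformer models themselves), a rolling z-score over a per-second packet-count stream flags bursts such as a scan spike; the traffic numbers are made up:

```python
# Rolling z-score anomaly detector over a stream of per-second packet
# counts: flag a reading that deviates more than `threshold` standard
# deviations from the recent window.

from collections import deque
from statistics import mean, stdev

def detect_anomalies(counts, window=5, threshold=3.0):
    """Yield indices whose value is a > threshold-sigma outlier."""
    history = deque(maxlen=window)
    for i, x in enumerate(counts):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(x - mu) / sigma > threshold:
                yield i
        history.append(x)

traffic = [10, 12, 11, 9, 10, 11, 250, 10, 12]   # burst at index 6
alerts = list(detect_anomalies(traffic))
```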
 +
 +
 +** Recommended skills or prerequisites: **   
 +  * Python, Scapy, Wireshark, Zeek, Suricata, hands-on Linux and FreeBSD
 +  * ML libraries (PyTorch, TensorFlow, scikit-learn and R)
 +  * Experience with networking or cybersecurity is a plus
 +  * Familiarity with FL frameworks (Flower, FedML, or similar)
 +  * Interest in privacy-preserving ML or cyber defense
 +
 +** Instructions: ** Please email your CV and unofficial transcript to the supervisor (Navid).
 +
 +----
 +
 +==== Fair or Fake? Toward Building Fair and Explainable AI Models for Fake Content Detection ==== 
 +
 +**[added 2025-08-05]**
 +
 +**Course:**  {EECS4080}
 +
 +**Supervisor:** Mona Nasery
 +
 +** Supervisor's email address:** monan@yorku.ca
 +
 +** Project Description: ** This project focuses on building and analyzing AI models for misinformation detection, with a particular emphasis on bias and explainability. Depending on the student’s interest, the project may focus on:
 +
 +  * Text-based fake news detection using transformer models,
 +  * Deepfake detection (image/video) using vision-language or video-based models.
 +
 +In both cases, the goal is to examine whether the AI behaves differently across content types or user traits (e.g., political leaning, race, gender) and to use explainability tools (e.g., SHAP, attention maps, visual saliency) to understand and potentially improve model behavior.
 +
 +The student will prototype a system that not only makes predictions, but also provides interpretable insights and evaluates fairness, contributing toward the development of more responsible AI systems for misinformation detection.
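
One concrete fairness check can be sketched as a false-positive-rate gap across groups (in the spirit of equalized odds); the labels, groups, and predictions below are invented for illustration:

```python
# Compare the detector's false-positive rate across user groups; a large
# gap means benign content from one group is flagged as fake more often.

def false_positive_rate(labels, preds):
    fp = sum(1 for y, p in zip(labels, preds) if y == 0 and p == 1)
    negatives = sum(1 for y in labels if y == 0)
    return fp / negatives if negatives else 0.0

def fpr_gap(labels, preds, groups):
    rates = {}
    for g in set(groups):
        idx = [i for i, gg in enumerate(groups) if gg == g]
        rates[g] = false_positive_rate([labels[i] for i in idx],
                                       [preds[i] for i in idx])
    return max(rates.values()) - min(rates.values()), rates

labels = [0, 0, 1, 0, 0, 1]          # 1 = actually fake
preds  = [1, 0, 1, 0, 0, 1]          # model output
groups = ["A", "A", "A", "B", "B", "B"]
gap, per_group = fpr_gap(labels, preds, groups)
```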
 +
 +** Required skills or prerequisites: **   
 +  * Good Python programming skills
 +  * Understanding of machine learning and deep learning (You must have completed a Machine Learning course)
 +
 +** Recommended skills or prerequisites: ** 
 +  * Familiarity with frameworks like PyTorch, TensorFlow, or HuggingFace Transformers
 +  * Comfort working with real-world datasets and performing data preprocessing
 +  * Some background in NLP (for fake news) or computer vision (for deepfakes) is an asset
 + 
 +** Instructions: ** Please email your CV and unofficial transcript to the supervisor (Mona).
 +
 +----
 +
 +==== Understanding Vibe Coding: UX Perspectives on AI-Driven Software Generation ==== 
 +
 +**[added 2025-07-22]**
 +
 +**Course:**  {EECS4080}
 +
 +**Supervisor:** Emily Kuang
 +
 +** Supervisor's email address:** emily.kuang@lassonde.yorku.ca
 +
 +** Project Description: ** Vibe coding is a new paradigm in software development where users describe what they want, and AI tools generate the code. This “prompt-to-code” workflow is part of a growing shift toward low-code/no-code platforms, making it easier and faster to prototype software. This project investigates how UX professionals engage with vibe coding tools, a perspective that has been largely overlooked in favour of software developer-focused studies. The main tasks include: 
 +
 +  * Recruiting and running the study with UX professionals 
 +  * Collecting and analyzing study data 
 +
 +** Required skills or prerequisites: **   
 +  * Completed TCPS 2: CORE-2022 (Course on Research Ethics) 
 +  * Ability to conduct user studies and administer surveys  
 +  * Data collection and basic data analysis (e.g., interpreting SUS scores, coding qualitative responses) 
 + 
 +** Recommended skills or prerequisites: ** 
 +  * Experience with web design tools and languages (e.g., Figma, JavaScript, HTML/CSS) 
 +  * Familiarity with AI-assisted development tools (e.g., Replit, Anima, GitHub Copilot) 
 +  * EECS 3461 and EECS 4441 or equivalent 
 + 
 +** Instructions: ** Email your CV and unofficial transcript to the professor. Put “EECS 4080 Inquiry” in the subject line.
 +
 +----
 +
 +==== Using Mixed Reality to Support Programming in CS1 ==== 
 +
 +**[added 2025-08-05]**
 +
 +**Course:**  {EECS4080 | EECS4088}
 +
 +**Supervisor:** Meiying Qin
 +
 +** Supervisor's email address:** mqin@yorku.ca
 +
 +** Project Description: ** Debugging is one of the most important skills for computer science students. However, first-year students are usually not comfortable working with a debugger. To help ease the process for first-year students, we plan to write an application that visualizes debugging by animating the variables as they are manipulated, either on a screen or in mixed reality. In this project, students will have the opportunity to gain hands-on experience in both designing and implementing a software application, and will gain experience with mixed reality.
 +
 +** Required skills or prerequisites: **   
 +  * Familiarity with C#
 +
 +** Instructions: ** Please email Meiying (mqin@yorku.ca) your CV and transcript, with a statement of why you are interested in the project.
 +
 +----
 +
 +==== Building Robot Tutors ==== 
 +
 +**[added 2025-08-05]**
 +
 +**Course:**  {EECS4080 | EECS4088}
 +
 +**Supervisor:** Meiying Qin
 +
 +** Supervisor's email address:** mqin@yorku.ca
 +
 +** Project Description: ** This research aims to create cost-effective robot tutors that are accessible on a broader scale, fostering inclusive and impactful learning experiences. Robot tutors have demonstrated effectiveness in aiding students, yet their widespread adoption faces hurdles due to high costs and limited scalability. Current robot tutors are often impractical for widespread use in universities due to their expense. This project seeks to overcome these limitations by developing an affordable robot tutor. The objective is to create a solution that meets the educational needs of university students without imposing financial constraints. 
 +
 +** Required skills or prerequisites: **   
 +  * You may choose to work with hardware or software; skills in either domain are recommended.
 +
 +** Instructions: ** Please email Meiying (mqin@yorku.ca) your CV and transcript, with a statement of why you are interested in the project.
 +
 +----
 +
 +==== Sims for University Life ==== 
 +
 +**[added 2025-08-05]**
 +
 +**Course:**  {EECS4080 | EECS4088}
 +
 +**Supervisor:** Meiying Qin
 +
 +** Supervisor's email address:** mqin@yorku.ca
 +
 +** Project Description: ** One of the biggest challenges that first-year students face is the transition from high school to university. This is expected to be more pronounced once the York Markham campus opens, as all courses will use the flipped-class model. In this model, students are required to be more active in learning and preview the content before each class in order to stay on track. In order to assist first-year students in making a smoother transition even before school starts, we plan to release a game that simulates the life of a computer science student at the Markham campus to provide students with a preview of university life. In this project, students have the opportunity to gain hands-on experience in both designing and implementing a game.
 +
 +** Required skills or prerequisites: **   
 +  * Familiar with game development and Unity
 + 
 +** Instructions: ** Please email Meiying (mqin@yorku.ca) your CV and transcript, with a statement of why you are interested in the project.
 +
 +
 +----
 +
 +==== Make An Accessible Role Playing Game ==== 
 +
 +**[added 2025-08-05]**
 +
 +**Course:**  {EECS4080}
 +
 +**Supervisor:** Sonya Allin
 +
 +** Supervisor's email address:** sallin@yorku.ca
 +
 +** Project Description: ** This project requires development of a modular, extensible role playing game with accessibility features designed for blind users and users with low vision. Preliminary prototyping work exists in Java; you are welcome to build a clone in Python. Desirable accessibility features include directional sound (via the OpenAL library), a customizable map, and customizable voices for RPG narration. The goal of this project is to create reusable modules that others can use to develop inclusive games.
 +
 +** Required skills or prerequisites: **   
 +  * Proficiency in Java and/or Python and associated frameworks (e.g., JUnit, OpenAL) 
 +  * Familiarity with version control (e.g., Git/GitHub) 
 +  * Strong debugging and testing skills 
 + 
 +** Recommended skills or prerequisites: **
 +  * Familiarity with Human-Centered Design or Accessibility Principles (e.g., WCAG, Universal Design) 
 +  * EECS 3461 and EECS 4441 or equivalent 
 +  * Experience with API integration 
 + 
 +** Instructions: ** Please email Sonya (sallin@yorku.ca) your CV and transcript, with a statement of why you are interested in the project.  Use the subject line "**[EECS4080] Independent Project Inquiry**".
 +----
 +
 +==== Make An Accessible Arcade Game ==== 
 +
 +**[added 2025-08-05]**
 +
 +**Course:**  {EECS4080}
 +
 +**Supervisor:** Sonya Allin
 +
 +** Supervisor's email address:** sallin@yorku.ca
 +
 +** Project Description: ** This project requires development of an arcade game with accessibility features designed for blind users and users with low vision. Preliminary prototyping work exists in Java; you are welcome to build a clone in Python. Accessibility features include directional sound and a haptic game controller. The goal of this project is to create reusable modules that others can use to develop inclusive games.
 +
 +** Required skills or prerequisites: **   
 +  * Proficiency in Java and/or Python and associated frameworks (e.g., JUnit, OpenAL) 
 +  * Familiarity with version control (e.g., Git/GitHub) 
 +  * Strong debugging and testing skills 
 + 
 +** Recommended skills or prerequisites: **
 +  * Familiarity with Human-Centered Design or Accessibility Principles (e.g., WCAG, Universal Design) 
 +  * EECS 3461 and EECS 4441 or equivalent 
 +  * Experience with API integration 
 + 
 +** Instructions: ** Please email Sonya (sallin@yorku.ca) your CV and transcript, with a statement of why you are interested in the project. Use the subject line "**[EECS4080] Independent Project Inquiry**".
  
 ----
Line 292: Line 783:
  
 ----
- 
-==== Image Processing for Software Engineering ==== 
- 
-**[added 2025-07-15]** 
- 
- 
-**Course:**  {EECS4088/4080} 
- 
-**Supervisor:**  Maleknaz Nayebi 
- 
-**Supervisor's email address:**  mnayebi@yorku.ca 
- 
-**Required skills or prerequisites:**   
-  * Proficient in Python programming 
- 
-**Recommended skills or prerequisites:** 
-Understanding of Machine Learning and Image Processing 
- 
- 
-**Instructions:** 
-Please email your CV and Transcripts to the professor. 
- 
----- 
- 
-==== Using Generative AI for Compliance Analysis in Health Care ==== 
- 
-**[added 2025-07-15]** 
- 
- 
-**Course:**  {EECS4080/4088} 
- 
-**Supervisor:**  Maleknaz Nayebi 
- 
-**Supervisor's email address:**  mnayebi@yorku.ca 
- 
-**Required skills or prerequisites:**   
-  * Proficient in Python programming 
- 
-**Recommended skills or prerequisites:** 
-Understanding of Machine Learning and Image Processing 
- 
- 
-**Instructions:** 
-Please email your CV and Transcripts to the professor. 
- 
----- 
- 
  
 ==== LLM-augmented Software Quality Assurance Techniques ====
2025-26/fall.1753792931.txt.gz · Last modified: by mnayebi