
VR Inflight Safety Procedure Training
A Virtual Reality (VR) training application built with Unity and C# that simulates standard inflight safety procedures inside an immersive aircraft environment using VR controllers and interactive procedural validation.
Tech Stack
Why I Built This
Traditional inflight safety demonstrations are often passive and rely heavily on memorization, which limits long-term retention and practical familiarity with emergency procedures. I wanted to build an immersive VR-based training system that allows users to actively perform critical inflight safety procedures rather than simply watch or read instructions.
The project was designed to combine theoretical assessment with hands-on procedural simulation inside a virtual aircraft cabin environment. By integrating VR interaction mechanics such as object grabbing, proximity detection, and procedural step validation, the system provides a more engaging and realistic approach to aviation safety education.
The application also serves as a scalable foundation for aviation training simulations, academic VR research, and safety orientation programs.

How It Works
The application is divided into two primary modules: a quiz-based theoretical assessment system and an interactive VR practicum environment.
The Quiz Module contains over 100 inflight safety-related questions, with each session randomly selecting 20 questions to ensure varied assessments for every attempt. This module reinforces procedural understanding before or after practical VR training sessions.
In the Practicum Module, users are placed inside a virtual aircraft cabin where they interact with safety equipment using VR controllers. Each procedure requires specific validated actions before the system allows progression to the next section.
The training consists of five required procedures:
Life Vest Procedure
- Pick up the life vest
- Position it correctly on the body
- Secure all three buckles
Oxygen Mask Procedure
- Grab the oxygen mask
- Position it correctly over the nose and mouth
Air Sickness Bag Procedure
- Grab the bag using both hands
- Position it properly near the mouth
Seatbelt Procedure
- Locate the seatbelt
- Properly fasten the buckle
Emergency Exit Procedure
- Follow directional indicators
- Navigate toward the designated aircraft exit
The system uses collision detection, proximity triggers, object interaction systems, and procedural validation logic to verify whether each action is performed correctly before marking a task as completed.
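As a minimal sketch of how one of these proximity checks might look in Unity (the component name and tag are hypothetical, not taken from the project), a trigger zone can mark a step complete when the expected item enters it:

```csharp
using UnityEngine;

// Hypothetical sketch: marks a procedural step complete when the grabbed
// safety item (e.g. the life vest) enters this trigger zone.
public class StepTriggerZone : MonoBehaviour
{
    [SerializeField] private string expectedTag = "LifeVest"; // assumed tag name
    public bool StepCompleted { get; private set; }

    private void OnTriggerEnter(Collider other)
    {
        // Only the expected object may satisfy this step.
        if (!StepCompleted && other.CompareTag(expectedTag))
        {
            StepCompleted = true;
        }
    }
}
```

Per Unity's trigger rules, this requires the zone's collider to have Is Trigger enabled and the moving item to carry a Rigidbody.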


Key Decisions
Procedural Validation System
The training system validates each procedural step individually before allowing progression. This prevents users from bypassing critical safety actions and ensures that every required interaction is completed in the correct order.
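An engine-independent sketch of this gating logic (the class and step names are illustrative, not the project's actual code): each step is accepted only if it is the next required action, so out-of-order attempts are rejected.

```csharp
using System.Collections.Generic;

// Hypothetical sketch of ordered step validation.
public class ProcedureTracker
{
    private readonly List<string> steps;
    private int nextIndex = 0;

    public ProcedureTracker(List<string> orderedSteps)
    {
        steps = orderedSteps;
    }

    public bool IsFinished => nextIndex >= steps.Count;

    // Returns true only if `step` is the next required action in the sequence.
    public bool TryComplete(string step)
    {
        if (IsFinished || steps[nextIndex] != step)
        {
            return false; // out of order or already done: rejected
        }
        nextIndex++;
        return true;
    }
}
```

For the life vest procedure, for example, a tracker built over the steps "pick up", "wear", "buckle" would reject "buckle" until both earlier steps have been completed.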
Immersive VR Interaction Design
Instead of using simplified button-based interactions, the system relies on natural VR controller actions such as grabbing, positioning, and proximity-based interaction. This creates a more realistic and engaging training experience that better simulates real-world procedural behavior.
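Assuming Unity's XR Interaction Toolkit is used for controller input (the project may use a different interaction framework), grab handling for a safety item might look like the following sketch:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical sketch: reacts when the user grabs a safety item with a
// VR controller, using the XR Interaction Toolkit's grab events.
[RequireComponent(typeof(XRGrabInteractable))]
public class SafetyItemGrabHandler : MonoBehaviour
{
    private XRGrabInteractable grabInteractable;

    private void Awake()
    {
        grabInteractable = GetComponent<XRGrabInteractable>();
        grabInteractable.selectEntered.AddListener(OnGrabbed);
    }

    private void OnGrabbed(SelectEnterEventArgs args)
    {
        // Notify the procedure logic that this item is now in hand.
        Debug.Log($"{name} grabbed by {args.interactorObject.transform.name}");
    }

    private void OnDestroy()
    {
        grabInteractable.selectEntered.RemoveListener(OnGrabbed);
    }
}
```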
Randomized Quiz Generation
Each quiz session randomly selects 20 questions from a pool of more than 100 safety-related questions. This reduces repetition between attempts and encourages broader retention of inflight safety knowledge.
```csharp
using System.Collections.Generic;
using UnityEngine; // Random.Range, Mathf

// Returns `count` unique questions drawn at random from `source`.
List<QuestionData> GetRandomQuestions(List<QuestionData> source, int count)
{
    // Work on a copy so the original question pool is left untouched.
    List<QuestionData> copy = new List<QuestionData>(source);
    List<QuestionData> result = new List<QuestionData>();
    int picks = Mathf.Min(count, copy.Count); // guard against pools smaller than `count`
    for (int i = 0; i < picks; i++)
    {
        int rand = Random.Range(0, copy.Count); // int overload: upper bound is exclusive
        result.Add(copy[rand]);
        copy.RemoveAt(rand); // prevent duplicate selections
    }
    return result;
}
```
Step-based Training Structure
The practicum was intentionally divided into isolated procedural sections so users can focus on one safety task at a time. This modular structure also makes the system easier to expand with additional emergency scenarios in future iterations.
What I Learned
This project significantly improved my understanding of VR interaction systems, immersive user experience design, and procedural simulation development using Unity and C#.
I learned how to implement object grabbing systems, collision-based interaction handling, proximity validation, and sequential task verification within a VR environment. The project also strengthened my understanding of designing intuitive interactions that feel natural when using VR controllers.
Beyond the technical implementation, I gained deeper experience in balancing realism, usability, and instructional clarity when building simulation-based learning systems.
The project also reinforced the importance of procedural accuracy in training applications, particularly in scenarios where incorrect actions could affect user understanding of real-world safety protocols.
Let's Connect