TRUMPF Drive Laser VR Training Simulation: Project Specification (Draft)

Overview

This project focuses on developing an interactive VR training simulation for the assembly and maintenance of components in the TRUMPF Drive Laser system. The primary objective is to guide users through disassembling, maintaining, and replacing matchboxes safely, using immersive training modes, holographic visual aids, and step-by-step instructions.


System Features (What)

1. Tool-Based Interactions

  • Users interact with a wide range of moveable assembly parts, utilizing real-world tools (wrenches, torque wrenches, hex keys) with precision-based physics.
  • Hand tracking ensures users interact naturally with tools for tasks such as loosening bolts and removing components.
  • Holographic Guidance provides visual cues for hand placements, directions (arrows), and points of interest during tool interactions.

2. Component Disassembly (Physical-Based Interactions & Matchbox Handling)

  • Users engage in the physical disassembly of components like matchboxes and RF ducting, with accurate physics-driven behavior.
  • Two-handed interactions simulate handling heavier components, while holographic guides assist in the correct disassembly sequence.
  • Users disconnect and remove matchboxes by loosening bolts and handling the matchbox with precision, avoiding component collisions.
  • Real-time holographic guidance warns users to handle parts carefully, preventing over-bending and collisions.
  • Collision detection flags mishandled components, especially during Assessment Mode.

3. Instructional Guidance System

  • Users are guided through each task step by step, with text-to-speech instructions paired with holographic visuals.
  • A wrist-mounted interface allows users to access instructions, manage settings, and follow task progress in real-time.
  • Instructions adapt dynamically based on user progress.
  • Optional troubleshooting hints are available on demand.

4. Assessment and Training Modes

  • Users can toggle between Training Mode, which provides step-by-step assistance, and Assessment Mode, which removes guidance and provides performance tracking.
  • The system dynamically adjusts to user progress in Training Mode, ensuring that users receive additional assistance as needed.
  • Assessment Mode evaluates users based on task completion, tracking metrics like time spent, accuracy, and errors. It removes guidance for a more challenging and realistic task environment.

5. Environmental Interactions

  • The training simulation is designed for a 3x3 meter room-scale VR environment, allowing users to interact with tools, ladders, and components within a realistic workspace.
  • Objects such as tools and ladders are fully interactive, with physics-based movement ensuring a realistic training experience.
  • Users interact with ladders or stools to position themselves for higher tasks. Snap points guide correct ladder placement.
  • Once the ladder is placed, users can teleport to its top for ergonomic access to upper components such as the matchboxes.
  • Explorer Mode allows users to move freely in the environment, explore components, and review areas outside the main task sequence.

6. Performance Tracking and Assessment

  • The system tracks user performance, measuring task accuracy, time taken, and errors like improper tool usage or component collisions.
  • Users receive detailed feedback at the end of each assessment, highlighting areas for improvement.
  • Assessment results are generated based on how efficiently tasks were completed and whether the user followed the correct procedures.

7. Detailed Assessment Metrics

  • Time to Completion: Each task has a time threshold; exceeding it reduces the final score.
  • Accuracy: The precision of each task is tracked (e.g., bolts loosened correctly, no components mishandled).
  • Error Rate: The number of mistakes (e.g., over-bending parts, incorrect tool usage) is measured, along with their impact on the final performance score.
  • Task Success: Completion is considered successful when users stay within the specified time, accuracy, and error thresholds. These metrics will be communicated to users as part of their post-training evaluation.
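
The thresholds above can be combined into a single pass/fail result and numeric score. The Python sketch below illustrates one way to do this; the Unity build would implement equivalent logic in C#, and all names, weights, and threshold values here are illustrative placeholders rather than the project's actual tuning.

```python
from dataclasses import dataclass

@dataclass
class TaskThresholds:
    max_seconds: float   # time-to-completion limit
    min_accuracy: float  # required precision, 0.0-1.0
    max_errors: int      # allowed mistakes (over-bending, wrong tool, ...)

@dataclass
class TaskResult:
    seconds: float
    accuracy: float
    errors: int

def score_task(result: TaskResult, limits: TaskThresholds) -> dict:
    """Return a pass/fail flag plus a 0-100 score (placeholder weights)."""
    score = 100.0 * result.accuracy
    if result.seconds > limits.max_seconds:
        # Exceeding the time limit impacts the final score (capped penalty).
        overrun = result.seconds / limits.max_seconds - 1.0
        score -= min(30.0, 30.0 * overrun)
    score -= 5.0 * result.errors          # flat penalty per recorded error
    score = max(0.0, score)
    # Success requires staying within every threshold simultaneously.
    passed = (result.seconds <= limits.max_seconds
              and result.accuracy >= limits.min_accuracy
              and result.errors <= limits.max_errors)
    return {"score": round(score, 1), "passed": passed}
```

A summary built from this dictionary could then feed the post-training evaluation shown to the user.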

8. Lesson Navigation and Wrist-Mounted UI

  • The training system features a wrist-mounted UI that allows users to navigate between lessons, manage training sessions, and access task progress.
  • Lesson Navigation enables users to easily select different training modules, review completed tasks, and choose between guided and assessment modes.
  • The UI provides real-time progress indicators, including step indicators (current step number, total steps) and a checklist of actions needed for the current step.
  • The UI dynamically updates based on user progression, offering options such as mute/unmute voice-over, step replay, and auto-advance when all required actions are completed.
  • The interface includes real-time feedback and control options, ensuring users stay engaged without breaking immersion.
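
The step indicator, checklist, and auto-advance behaviour described above amount to a small state machine. The Python sketch below is illustrative only (the actual wrist UI would be driven from C# in Unity); the step names and action identifiers are hypothetical examples.

```python
class StepTracker:
    """Tracks the current lesson step, its action checklist, and auto-advance."""

    def __init__(self, steps):
        # steps: list of (step_name, [required_action, ...]) tuples
        self.steps = steps
        self.index = 0
        self.done = set()

    @property
    def checklist(self):
        """Checklist of actions needed for the current step, with status."""
        return {a: (a in self.done) for a in self.steps[self.index][1]}

    def progress(self):
        """(current step number, total steps) for the step indicator."""
        return (self.index + 1, len(self.steps))

    def complete_action(self, action):
        """Mark an action done; auto-advance when the checklist is complete."""
        if action in self.steps[self.index][1]:
            self.done.add(action)
        if (self.done >= set(self.steps[self.index][1])
                and self.index < len(self.steps) - 1):
            self.index += 1
            self.done = set()
```

Mute/unmute and step replay would be separate UI controls layered on top of the same tracker.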

Technical Requirements (How)

1. Tool-Based Interactions

  • Implement Meta Quest 3 hand tracking to ensure natural, two-handed control of tools, with physics-based feedback for torque, grip, and force application.
  • Utilize Unity’s physics engine for realistic interactions with moveable assembly parts, ensuring tools behave correctly when interacting with bolts, screws, and other components.
  • Program holographic guidance using Unity’s XR Interaction Toolkit to provide visual indicators for correct tool use, ensuring tools snap to correct positions during interactions.
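
The torque-based feedback for bolt interactions can be reduced to a simple per-frame rule: the bolt turns only when applied torque exceeds a breakaway threshold, and it is freed after enough rotation. The sketch below is a language-agnostic illustration in Python (Unity would implement this in C# against its physics engine); the threshold and angle values are placeholders.

```python
class Bolt:
    BREAKAWAY_TORQUE = 8.0  # N*m needed to start turning (placeholder value)
    FREE_ANGLE = 360.0      # degrees of rotation until the bolt is free

    def __init__(self):
        self.angle = 0.0

    def apply_wrench(self, torque, delta_angle):
        """One frame of wrench input: rotate only if torque is sufficient.

        Returns True once the bolt has rotated far enough to be removed.
        """
        if torque >= self.BREAKAWAY_TORQUE:
            self.angle += delta_angle
        return self.angle >= self.FREE_ANGLE
```

In the real simulation, `torque` would come from the tracked hand/tool physics and the return value would trigger the next holographic cue.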

2. Component Disassembly (Physical-Based Interactions & Matchbox Handling)

  • Design two-handed physics-based interactions for handling heavier components like RF ducting and matchboxes.
  • Integrate holographic guidance systems to provide real-time cues for disassembly, leveraging Unity’s XR tools to trigger visual guides based on user actions.
  • Include collision detection to track and provide feedback on mishandling or overextension during component removal.
  • Program collision detection and response for matchbox handling, ensuring users follow the proper removal process and avoid component damage.

3. Instructional Guidance System

  • Integrate text-to-speech functionality within Unity to deliver real-time verbal instructions that adapt based on user progress.
  • Develop a wrist-mounted UI using Unity’s canvas system, allowing users to view step-by-step instructions and settings without breaking immersion.
  • Program dynamic assistance that adjusts instructions or offers troubleshooting hints based on user performance and task completion.
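
One simple way to realise this dynamic assistance is to escalate the guidance level from plain instruction, to holographic cue, to a full troubleshooting hint as failed attempts or idle time accumulate. The Python sketch below is illustrative; the level names and thresholds are hypothetical placeholders, and the Unity build would express this in C#.

```python
def assistance_level(failed_attempts: int, idle_seconds: float) -> str:
    """Pick a guidance level from user performance (placeholder thresholds)."""
    if failed_attempts >= 3 or idle_seconds >= 60:
        return "troubleshooting_hint"  # on-demand step-by-step fix
    if failed_attempts >= 1 or idle_seconds >= 20:
        return "holographic_cue"       # arrows / hand-placement overlay
    return "text_instruction"          # baseline step text + voice-over
```

The same function could be polled each frame, with the result deciding which visual and audio assets the guidance system activates.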

4. Assessment and Training Modes

  • Develop Training Mode to provide step-by-step assistance using holographic cues and text-to-speech.
  • Program Assessment Mode to remove guidance and evaluate user performance based on tracked metrics such as time, accuracy, and error rate.
  • Ensure the system tracks progress dynamically and provides real-time feedback during assessments.

5. Environmental Interactions

  • Implement ladder placement with snap points using Unity’s physics system, allowing users to place ladders in predefined locations within the virtual environment.
  • Design a teleportation system that moves users efficiently to the top of the ladder, using trigger points for navigation.
  • Implement Explorer Mode with free-form movement, allowing users to move around the virtual environment, inspect components, and explore outside the task sequence.
  • Ensure the teleportation and ladder interaction systems integrate with holographic guidance to indicate proper ladder placement.
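
The snap-point logic above comes down to finding the nearest predefined placement within a tolerance radius. The Python sketch below illustrates the geometric check (Unity would typically use socket/snap interactors in C# instead); the snap radius is a placeholder value.

```python
import math

SNAP_RADIUS = 0.5  # metres; placeholder snap tolerance

def nearest_snap_point(ladder_pos, snap_points, radius=SNAP_RADIUS):
    """Return the closest predefined snap point within `radius`, else None.

    Positions are (x, y, z) tuples in world space.
    """
    best, best_dist = None, radius
    for point in snap_points:
        d = math.dist(ladder_pos, point)
        if d <= best_dist:
            best, best_dist = point, d
    return best
```

When a snap point is returned, the ladder would glide to it and the holographic placement guide would switch to a "valid placement" state; `None` would keep the guide in its warning state.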

6. Performance Tracking and Assessment

  • Develop a comprehensive performance tracking system to log user metrics like time taken, accuracy, and task completion.
  • Use Unity’s analytics tools to track errors and task performance, providing users with end-of-task feedback on areas for improvement.
  • Design Assessment Mode to remove instructional aids, requiring users to perform tasks independently, with scoring based on completion accuracy and efficiency.

7. Lesson Navigation and Wrist-Mounted UI

  • Implement a wrist-mounted UI using Unity’s XR Interaction Toolkit, allowing users to easily navigate lessons, check task progress, and manage settings.
  • Program lesson navigation to dynamically adapt based on user progression, providing access to completed tasks and recommendations for next steps.
  • Ensure the UI provides features such as auto-advance, step replay, and checklist updates, allowing users to engage with lessons seamlessly.

Preproduction Requirements

1. Lesson Scripts

  • Some lesson scripting has already been provided; preproduction focuses on refining these scripts for accuracy and consistency and on building the associated assessment for each lesson.

2. UI Design

  • Finalize the design for all user interfaces, including the wrist-mounted UI, ensuring it supports dynamic lesson navigation, real-time feedback, and user control.

3. 3D Models and Animations

  • Optimize all 3D assets, including components, tools, and the environment, for VR use. Include animations for the guidance system, ensuring visual cues such as arrows and hand placements are smooth and accurate.

4. Testing Environment

  • Establish a controlled testing environment to verify each interaction system in isolation. The environment will support focused testing of hand tracking, tool interaction, instructional guidance, and other key systems, ensuring each feature functions as expected before integration into the full simulation.

Optional Feature

Troubleshooting on Demand

  • The system includes an optional feature that allows users to request troubleshooting information on demand. If a user encounters a problem (e.g., difficulty loosening a bolt or placing a component), they can activate troubleshooting hints that provide step-by-step guidance on correcting the issue.
  • This feature can be accessed via the wrist-mounted UI and includes real-time holographic prompts to help users resolve the problem quickly.