# Project Documentation: Air Mouse Controlled Robotic Arm Simulator

A comprehensive guide to the hardware, firmware, software, and algorithms used in the project.

### Table of Contents
* [1. Project Overview](#1-project-overview)
  * [1.1. Core Concept](#11-core-concept)
  * [1.2. Key Features](#12-key-features)
* [2. Hardware Setup & Components](#2-hardware-setup--components)
  * [2.1. Components List](#21-components-list)
  * [2.2. Wiring Diagram & Instructions](#22-wiring-diagram--instructions)
  * [2.3. Component Roles](#23-component-roles)
* [3. Arduino Firmware Explained](#3-arduino-firmware-explained)
  * [3.1. Primary Function](#31-primary-function)
  * [3.2. Two-Way Communication Protocol](#32-two-way-communication-protocol)
  * [3.3. Code Breakdown](#33-code-breakdown)
* [4. Unity Project Architecture](#4-unity-project-architecture)
  * [4.1. Core Systems Overview](#41-core-systems-overview)
  * [4.2. Scene & Prefab Setup](#42-scene--prefab-setup)
  * [4.3. Script-by-Script Breakdown](#43-script-by-script-breakdown)
* [5. Key Algorithms & Techniques](#5-key-algorithms--techniques)
  * [5.1. Inverse Kinematics: Cyclic Coordinate Descent (CCD)](#51-inverse-kinematics-cyclic-coordinate-descent-ccd)
  * [5.2. Procedural Audio Synthesis for Motor Sounds](#52-procedural-audio-synthesis-for-motor-sounds)
  * [5.3. Raymarching Shaders for UI and 3D Objects](#53-raymarching-shaders-for-ui-and-3d-objects)
  * [5.4. Event-Driven Architecture for Decoupled Logic](#54-event-driven-architecture-for-decoupled-logic)
* [6. User Interface (UI) System](#6-user-interface-ui-system)
  * [6.1. Main Menu & Connection Panel](#61-main-menu--connection-panel)
  * [6.2. Air Mouse & Haptics Control Panel](#62-air-mouse--haptics-control-panel)
  * [6.3. In-Game HUD](#63-in-game-hud)
* [7. Signal Flow: How It All Works Together](#7-signal-flow-how-it-all-works-together)

---

## 1. Project Overview

### 1.1. Core Concept
This project is an interactive simulation of a robotic arm controlled in 3D space by a custom-built, motion-sensing "air mouse." The user physically moves the air mouse, and the on-screen robotic arm mimics the motion to perform tasks in a training game. The system is designed to be highly modular, allowing for easy customization of the arm's structure, controls, and game mechanics.

### 1.2. Key Features
* **Real-time Motion Control**: The arm's target is controlled by the pitch, yaw, and roll of a physical MPU-6050 sensor.
* **Modular Robotic Arm**: The arm is driven by an Inverse Kinematics (IK) system that supports any number of joints and bone lengths.
* **Interactive Training Game**: A game loop in which the player picks up parts and assembles them at target locations.
* **Haptic Feedback**: A vibration motor provides physical feedback for key game events such as picking up and placing objects.
* **Procedural Audio**: Joint movement sounds are generated in real time, creating a dynamic, realistic effect that stays in sync with motion speed.
* **Advanced UI Shaders**: Custom raymarching shaders create a dynamic, glowing "holographic" cube and background that react to the air mouse's rotation.
* **Comprehensive UI Control**: In-game menus let the user connect, configure all air mouse and haptic feedback parameters, and view a live data stream.

---

## 2. Hardware Setup & Components

### 2.1. Components List
* Arduino Nano (or compatible board)
* MPU-6050 6-Axis Accelerometer & Gyroscope Module
* Vibration Motor Module (with built-in driver)
* Pushbutton (for optional click input)
* Breadboard and Jumper Wires

### 2.2. Wiring Diagram & Instructions
The components are connected to the Arduino Nano as follows. Power is supplied via the USB connection to the computer.

* **MPU-6050 Sensor:**
  * `VCC` → `5V` on Arduino
  * `GND` → `GND` on Arduino
  * `SCL` → `A5` on Arduino (I2C Clock)
  * `SDA` → `A4` on Arduino (I2C Data)
* **Vibration Motor Module:**
  * `GND` → `GND` on Arduino
  * `VCC` → `5V` on Arduino
  * `IN` → Pin `D9` on Arduino (must be a PWM pin, marked with `~`)
* **Pushbutton (Optional):**
  * One leg → `GND` on Arduino
  * Other leg → Pin `D3` on Arduino

### 2.3. Component Roles
* **Arduino Nano**: The "brain" of the physical controller. Its sole purpose is to read raw data from the MPU-6050, listen for commands from Unity, and send all data to the computer over the USB serial port.
* **MPU-6050**: The motion sensor. It contains a 3-axis accelerometer and a 3-axis gyroscope, providing the rotational data (pitch, roll, yaw) that drives the air mouse.
* **Vibration Motor Module**: The haptic feedback device. The module includes a driver transistor, allowing the Arduino to safely turn the motor on and off with a simple digital signal and control its intensity with PWM.

---

## 3. Arduino Firmware Explained

### 3.1. Primary Function
The Arduino code is designed to be a "dumb" data forwarder. It performs **no calculations or game logic**. This is a deliberate design choice that makes the hardware universal; all complex logic is handled in Unity, allowing for rapid iteration without ever needing to re-upload code to the Arduino.

### 3.2. Two-Way Communication Protocol
Communication happens over the serial port at **115200 baud**.

* **Arduino to Unity (Sensor Data)**: The Arduino continuously sends a single line of comma-separated values (CSV) representing the full sensor state.
  * **Format**: `ax,ay,az,gx,gy,gz,clickState`
    * `ax, ay, az`: Raw accelerometer data.
    * `gx, gy, gz`: Raw gyroscope data (used for pitch, yaw, roll).
    * `clickState`: `1` if the button is pressed, `0` otherwise.

* **Unity to Arduino (Haptic Commands)**: Unity sends simple string commands to the Arduino to control the vibration motor.
  * **Format**: `V,intensity,duration\n`
    * `V`: The command prefix for **V**ibration.
    * `intensity`: A number from `0` (off) to `255` (full power).
    * `duration`: The time in milliseconds the motor should stay on.
    * `\n`: A newline character that marks the end of the command.
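
For illustration, a minimal sketch of how the Arduino side could emit the sensor line is shown below. It is not the project's exact firmware; the function name and the way the values are gathered are assumptions, and only standard Arduino `Serial` calls are used.

```cpp
// Illustrative only: emits one "ax,ay,az,gx,gy,gz,clickState" line over serial.
void sendSensorLine(int ax, int ay, int az,
                    int gx, int gy, int gz, int clickState) {
  Serial.print(ax); Serial.print(',');
  Serial.print(ay); Serial.print(',');
  Serial.print(az); Serial.print(',');
  Serial.print(gx); Serial.print(',');
  Serial.print(gy); Serial.print(',');
  Serial.print(gz); Serial.print(',');
  Serial.println(clickState);   // println appends the newline that ends the CSV record
}
```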

### 3.3. Code Breakdown
* **`setup()`**: Initializes serial communication, connects to the MPU-6050 sensor, and configures the motor pin as an output.
* **`loop()`**:
  1. **Listen for Commands**: It first checks `Serial.available()` to see whether a command has arrived from Unity. If so, it reads the string and parses the intensity and duration.
  2. **Manage Vibration (Non-Blocking)**: The code uses `millis()` for vibration timing (see the sketch after this list). When a command is received, it turns the motor on and records a future timestamp (`vibrationStopTime`). On every loop it checks whether the current time has passed this timestamp; if it has, it turns the motor off. This avoids `delay()`, which would halt the program and disrupt the sensor data stream.
  3. **Read and Send Sensor Data**: It reads the latest values from the MPU-6050 and sends them to the computer in the CSV format described above.
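
Putting steps 1 through 3 together, here is a minimal sketch of that loop structure. The pin numbers follow the wiring section, but the constant names, the internal pull-up assumption for the button, and the omitted MPU-6050 read are assumptions rather than the project's exact firmware.

```cpp
#include <Arduino.h>

const int MOTOR_PIN  = 9;             // vibration module IN pin (wired to D9 above)
const int BUTTON_PIN = 3;             // optional pushbutton (wired to D3 above)

unsigned long vibrationStopTime = 0;  // absolute millis() value at which to stop the motor

void setup() {
  Serial.begin(115200);
  pinMode(MOTOR_PIN, OUTPUT);
  pinMode(BUTTON_PIN, INPUT_PULLUP);  // button connects the pin to GND when pressed
  // MPU-6050 initialisation (Wire/I2C setup) omitted from this sketch
}

void loop() {
  // 1. Listen for a "V,intensity,duration" command from Unity
  //    Note: readStringUntil() waits up to the serial timeout; real firmware may read incrementally.
  if (Serial.available() > 0) {
    String cmd = Serial.readStringUntil('\n');
    if (cmd.startsWith("V,")) {
      int c1 = cmd.indexOf(',');
      int c2 = cmd.indexOf(',', c1 + 1);
      int intensity = cmd.substring(c1 + 1, c2).toInt();  // 0..255
      int duration  = cmd.substring(c2 + 1).toInt();      // milliseconds
      analogWrite(MOTOR_PIN, intensity);
      vibrationStopTime = millis() + (unsigned long)duration;
    }
  }

  // 2. Non-blocking stop check: no delay(), so the sensor stream never stalls
  if (vibrationStopTime != 0 && millis() >= vibrationStopTime) {
    analogWrite(MOTOR_PIN, 0);
    vibrationStopTime = 0;
  }

  // 3. Read the MPU-6050 and stream the CSV line (sensor read omitted from this sketch)
  // sendSensorLine(ax, ay, az, gx, gy, gz, digitalRead(BUTTON_PIN) == LOW ? 1 : 0);
}
```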

---

## 4. Unity Project Architecture

### 4.1. Core Systems Overview
The Unity project is built from several independent, modular systems that communicate with each other.
* **Input System (`AirMouseInput`)**: Handles all communication with the Arduino.
* **Robotic Arm System (`IKController`)**: Manages the arm's physical simulation.
* **Game Logic (`AssemblyGameManager`)**: Controls the training game rules and state.
* **UI System (various scripts)**: Manages all menus, displays, and visual feedback.
* **Sound System (various scripts)**: Manages procedural joint audio and event-based sound effects.

### 4.2. Scene & Prefab Setup
* **`IK_System`**: An empty GameObject containing the hierarchy of arm joints. The `IKController` script is attached here.
* **`GameManager`**: An empty GameObject with the `AssemblyGameManager` script.
* **`UI Canvas`**: Contains all UI panels, buttons, and text displays.
* **Prefabs**: `AssemblyPart` and `TargetLocation` prefabs are used to spawn the game objects for each round.

### 4.3. Script-by-Script Breakdown
* **`AirMouseInput.cs`**:
  * **Purpose**: Manages the serial port connection on a separate thread to prevent the game from freezing. Reads incoming sensor data, queues outgoing vibration commands, and provides smoothed input values to other scripts.
  * **Key Logic**: Uses `System.IO.Ports.SerialPort` for communication. A dedicated `Thread` reads data in a loop, and a `ConcurrentQueue` safely hands commands from the main game thread to the serial thread. `Vector2.SmoothDamp` provides frame-rate-independent smoothing so the controls feel consistent in both the editor and final builds.
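
The script itself is C#, but the producer/consumer pattern it describes can be sketched in a language-agnostic way. The sketch below uses standard C++ with a mutex-guarded queue; `readLineFromSerial()` is a hypothetical stand-in for the blocking serial read, and the CSV string it returns is purely illustrative.

```cpp
#include <atomic>
#include <chrono>
#include <mutex>
#include <queue>
#include <string>
#include <thread>

// Hypothetical stand-in for the blocking serial read done on the background thread.
std::string readLineFromSerial() {
  std::this_thread::sleep_for(std::chrono::milliseconds(10));  // mimic the sensor rate
  return "0,0,16384,12,-7,3,0";                                // illustrative CSV sample
}

std::mutex              queueMutex;
std::queue<std::string> pendingLines;   // raw CSV lines waiting for the main thread
std::atomic<bool>       running{true};

// Background reader: blocks on the serial port so the game loop never has to.
void readerLoop() {
  while (running) {
    std::string line = readLineFromSerial();
    std::lock_guard<std::mutex> lock(queueMutex);
    pendingLines.push(line);
  }
}

// Called once per frame on the main thread: drain the queue, keep only the newest sample.
bool tryGetLatestLine(std::string& latest) {
  std::lock_guard<std::mutex> lock(queueMutex);
  bool any = !pendingLines.empty();
  while (!pendingLines.empty()) {
    latest = pendingLines.front();
    pendingLines.pop();
  }
  return any;
}

int main() {
  std::thread reader(readerLoop);
  std::string latest;
  for (int frame = 0; frame < 100; ++frame) {   // stands in for Unity's per-frame Update()
    if (tryGetLatestLine(latest)) {
      // parse "ax,ay,az,gx,gy,gz,click" here and feed the smoothing step
    }
    std::this_thread::sleep_for(std::chrono::milliseconds(16));
  }
  running = false;
  reader.join();
}
```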

* **`IKController.cs`**:
  * **Purpose**: Implements the Inverse Kinematics algorithm.
  * **Key Logic**: Takes a list of joint `Transforms` and a `Target` transform. In `LateUpdate`, it iteratively adjusts each joint's rotation so the `EndEffector` reaches the `Target`. See the Algorithms section for more detail.

* **`TargetController.cs`**:
  * **Purpose**: Moves the IK target in the scene based on user input.
  * **Key Logic**: Has two modes. In `Mouse` mode, it uses screen-to-world raycasting. In `AirMouse` mode, it reads the smoothed data from `AirMouseInput` and translates the target's position accordingly. It also enforces the movement bounding box.

* **`AssemblyGameManager.cs`**:
  * **Purpose**: The "brain" of the training game. Manages game state, spawning, and win conditions.
  * **Key Logic**: Uses a state machine (`AwaitingPickup`, `AssemblingPart`). Spawns parts and locations at random. The "pickup" mechanic is achieved by parenting the assembly part to the player's target. It uses C# `Action` events to announce key moments (`OnPartPickedUp`, etc.) to other scripts in a decoupled way.

* **`JointSoundController.cs`**:
  * **Purpose**: Generates realistic, procedural motor sounds for a single joint.
  * **Key Logic**: Attached to each joint pivot. It calculates angular speed in `Update()`. In `OnAudioFilterRead()`, it generates the sound wave from scratch, mixing a base waveform (such as a sawtooth) with percussive clicks and a low-frequency grind. The pitch and volume are driven directly by the joint's movement speed.

* **UI Controller Scripts (`MainMenuController`, `AirMouseUIController`, etc.)**:
  * **Purpose**: These scripts act as bridges between the UI elements (sliders, buttons) and the public variables of the core system scripts.
  * **Key Logic**: They read initial values from scripts such as `AirMouseInput` to populate the UI, then use `onClick` and `onValueChanged` listeners to call functions that update the variables on the target scripts.

---

## 5. Key Algorithms & Techniques

### 5.1. Inverse Kinematics: Cyclic Coordinate Descent (CCD)
The `IKController` uses CCD, an intuitive iterative algorithm:
1. The loop starts from the joint closest to the end effector and moves backward toward the base.
2. For each joint, it creates two vectors: `VectorA` (from the current joint to the end effector) and `VectorB` (from the current joint to the target).
3. It calculates the rotation needed to align `VectorA` with `VectorB` using `Quaternion.FromToRotation`.
4. It applies this rotation to the current joint.
5. This process repeats for all joints. By running the entire cycle several times per frame, the arm quickly converges on a solution and points at the target.
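
As a concrete illustration of these five steps, here is a minimal, self-contained planar (2D) version of the same loop. The real `IKController` works on Unity `Transform`s and quaternions; this sketch uses plain angles instead, and the joint names, bone lengths, and target are illustrative.

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec2 { double x, y; };

// A planar arm: each joint stores an angle relative to its parent; each bone has a fixed length.
struct Joint { double angle; double length; };

// Forward kinematics: world position of joint i's pivot (i == joints.size() gives the end effector).
Vec2 jointPosition(const std::vector<Joint>& joints, std::size_t i) {
  Vec2 p{0.0, 0.0};
  double a = 0.0;
  for (std::size_t k = 0; k < i; ++k) {
    a += joints[k].angle;                  // relative angles accumulate down the chain
    p.x += joints[k].length * std::cos(a);
    p.y += joints[k].length * std::sin(a);
  }
  return p;
}

// One CCD pass: iterate from the joint closest to the end effector back toward the base (step 1).
void ccdPass(std::vector<Joint>& joints, Vec2 target) {
  for (int j = static_cast<int>(joints.size()) - 1; j >= 0; --j) {
    Vec2 pivot    = jointPosition(joints, static_cast<std::size_t>(j));
    Vec2 effector = jointPosition(joints, joints.size());
    // Steps 2-3: compare the pivot->effector direction with the pivot->target direction.
    double aToEffector = std::atan2(effector.y - pivot.y, effector.x - pivot.x);
    double aToTarget   = std::atan2(target.y  - pivot.y, target.x  - pivot.x);
    // Step 4: rotate this joint so the end effector swings toward the target.
    joints[j].angle += aToTarget - aToEffector;
  }
}

int main() {
  std::vector<Joint> arm = {{0.3, 2.0}, {0.2, 1.5}, {-0.1, 1.0}};  // illustrative 3-joint arm
  Vec2 target{2.0, 1.5};
  for (int i = 0; i < 10; ++i) ccdPass(arm, target);               // step 5: several passes per frame
  Vec2 tip = jointPosition(arm, arm.size());
  std::printf("end effector: (%.3f, %.3f)\n", tip.x, tip.y);       // should settle near (2.0, 1.5)
}
```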

### 5.2. Procedural Audio Synthesis for Motor Sounds
To create a realistic sound that stays perfectly in sync with movement, the `JointSoundController` generates audio from code instead of playing a recording.
* **`OnAudioFilterRead(float[] data, int channels)`**: This special Unity callback runs on a separate audio thread. It gives the script direct access to the audio buffer (`data`) before it is sent to the speakers.
* **Waveform Generation**: The script generates a base tone by evaluating a mathematical function (such as a sine or sawtooth) over time. The function's `frequency` determines the pitch.
* **Layer Mixing**: A convincing mechanical sound is created by mixing three layers:
  1. **Whine**: The base sawtooth waveform mixed with a small amount of random noise.
  2. **Clicks**: A percussive layer made by generating a burst of loud noise that quickly decays, triggered at a rate proportional to movement speed.
  3. **Grind**: A low-frequency sine wave that modulates the amplitude of the whine layer, creating a "wobble" effect.
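
The three-layer mix can be sketched engine-agnostically as filling a mono float buffer, the way an audio callback would. Everything below is illustrative: the constants, mix ratios, and the once-per-cycle click retrigger rule are assumptions, and the real script additionally maps joint speed onto `frequency` and overall volume every frame.

```cpp
#include <cmath>
#include <cstdlib>
#include <vector>

const double SAMPLE_RATE = 48000.0;
const double TWO_PI      = 6.283185307179586;

double sawtooth(double phase) { return 2.0 * phase - 1.0; }                            // [0,1) -> [-1,1]
double whiteNoise()           { return 2.0 * (std::rand() / (double)RAND_MAX) - 1.0; }

// Fill one block of "motor" samples. `frequency` and `volume` would be driven by the
// joint's angular speed; the phase/envelope state persists between blocks.
void fillMotorBuffer(std::vector<float>& buffer, double frequency, double volume,
                     double& phase, double& grindPhase, double& clickEnv) {
  for (float& sample : buffer) {
    // 1. Whine: sawtooth base tone plus a little noise
    double whine = sawtooth(phase) + 0.1 * whiteNoise();

    // 2. Clicks: a noise burst with a fast decay, retriggered once per waveform cycle
    clickEnv *= 0.995;
    if (phase < frequency / SAMPLE_RATE) clickEnv = 1.0;   // phase just wrapped: new click
    double clicks = 0.5 * whiteNoise() * clickEnv;

    // 3. Grind: a low-frequency sine that modulates the whine's amplitude ("wobble")
    double grind = 0.5 * (1.0 + std::sin(TWO_PI * grindPhase));

    sample = static_cast<float>(volume * (0.6 * whine * grind + 0.4 * clicks));

    phase      += frequency / SAMPLE_RATE;  if (phase >= 1.0)      phase -= 1.0;
    grindPhase += 15.0 / SAMPLE_RATE;       if (grindPhase >= 1.0) grindPhase -= 1.0;
  }
}

int main() {
  std::vector<float> block(512);
  double phase = 0.0, grindPhase = 0.0, clickEnv = 0.0;
  fillMotorBuffer(block, 220.0, 0.3, phase, grindPhase, clickEnv);  // one audio block at 220 Hz
}
```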

### 5.3. Raymarching Shaders for UI and 3D Objects
The holographic cube and background effects are created with raymarching, a rendering technique that differs from standard polygon rasterization.
* **Signed Distance Field (SDF)**: The core of the shader is a function `D(p)` that, for any point `p` in space, returns the signed distance to the nearest surface: positive outside an object, negative inside.
* **Raymarching Loop**: The shader casts a ray from the camera. Instead of testing triangle intersections, it evaluates the SDF to find the largest "safe" step it can take along the ray, takes that step, and repeats. This is far more efficient for rendering complex mathematical shapes.
* **Shader Logic**: The shader evaluates the cube's SDF, rotated according to the air mouse input. When a ray hits the surface, it computes reflections of a procedural sky and floor to determine the final color. The background-only version uses the same technique but makes the cube itself invisible, showing only the reflections.
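
To make the "largest safe step" idea concrete, here is a minimal CPU-side sketch of a raymarch against a box SDF. The project's version lives in a fragment shader and adds the air-mouse-driven rotation, reflections, and shading; the names and constants below are illustrative.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

Vec3   add(Vec3 a, Vec3 b)     { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3   scale(Vec3 a, double s) { return {a.x * s, a.y * s, a.z * s}; }
double length(Vec3 a)          { return std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z); }

// Signed distance to an axis-aligned box with half-extents b, centred at the origin:
// positive outside, zero on the surface, negative inside.
double sdBox(Vec3 p, Vec3 b) {
  Vec3 q    = {std::fabs(p.x) - b.x, std::fabs(p.y) - b.y, std::fabs(p.z) - b.z};
  Vec3 qPos = {std::max(q.x, 0.0), std::max(q.y, 0.0), std::max(q.z, 0.0)};
  return length(qPos) + std::min(std::max(q.x, std::max(q.y, q.z)), 0.0);
}

// March a ray: each iteration advances by the SDF value, the largest step known not to overshoot.
bool raymarch(Vec3 origin, Vec3 dir, double& tHit) {
  double t = 0.0;
  for (int i = 0; i < 128; ++i) {
    double d = sdBox(add(origin, scale(dir, t)), {1.0, 1.0, 1.0});
    if (d < 1e-4) { tHit = t; return true; }  // close enough: treat as a surface hit
    t += d;                                   // safe step: no surface is closer than d
    if (t > 100.0) break;                     // ray escaped the scene
  }
  return false;
}

int main() {
  double t = 0.0;
  if (raymarch({0.0, 0.0, -5.0}, {0.0, 0.0, 1.0}, t))
    std::printf("hit at t = %.3f\n", t);      // expect roughly t = 4 for this setup
  else
    std::printf("miss\n");
}
```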

### 5.4. Event-Driven Architecture for Decoupled Logic
The `AssemblyGameManager` uses `public static event Action` to announce game events. Other scripts, such as `GameLogUI` and `GameSoundEffects`, subscribe to these events.
* **Benefit**: This is a powerful design pattern. The `GameManager` doesn't need to know that a UI or sound script exists; it just shouts "A part was picked up!" into the void, and any script interested in that event can listen for it. This keeps the code highly modular and easy to expand: new feedback systems can be added without ever touching the `GameManager` code again.
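
The project's events are C# `static event Action` fields; the same publish/subscribe shape can be sketched in C++ with `std::function` callbacks (all names below are illustrative, not the project's API):

```cpp
#include <cstdio>
#include <functional>
#include <vector>

// A minimal static "event": a list of callbacks that any system can subscribe to.
struct GameEvents {
  static std::vector<std::function<void()>> onPartPickedUp;
  static void RaisePartPickedUp() {
    for (auto& handler : onPartPickedUp) handler();  // notify every subscriber
  }
};
std::vector<std::function<void()>> GameEvents::onPartPickedUp;

int main() {
  // A log listener and a sound listener subscribe; the game logic never references them directly.
  GameEvents::onPartPickedUp.push_back([] { std::puts("[log]   Part Picked Up!"); });
  GameEvents::onPartPickedUp.push_back([] { std::puts("[sound] play pickup clip"); });

  // The game manager only announces that the event happened.
  GameEvents::RaisePartPickedUp();
}
```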

---

## 6. User Interface (UI) System

### 6.1. Main Menu & Connection Panel
* **Functionality**: Allows the user to enter the COM port and baud rate. A "Connect" button attempts to initialize the `AirMouseInput` script, and a toggle lets the user bypass this and use the standard mouse instead.
* **Feedback**: A status text field provides real-time updates ("Connecting...", "Connected!", "Failed"). Upon successful connection, a raw data display panel appears, showing the live data stream from the Arduino.

### 6.2. Air Mouse & Haptics Control Panel
* **Functionality**: This panel, controlled by `AirMouseUIController`, provides sliders, toggles, and dropdowns for every public variable on the `AirMouseInput` and `AssemblyGameManager` scripts.
* **Controls**: Sensitivity, roll sensitivity, deadzone, smoothing, axis mapping, axis inversion, and all haptic feedback intensity/duration values can be tweaked in real time.

### 6.3. In-Game HUD
* **Targeting Line**: A `LineRenderer` object controlled by `TargetingLineUI`. It draws a line between the player's controller and the current objective (either the part to be picked up or the assembly location), and a `TextMeshPro` label at the line's midpoint displays the distance.
* **Game Log**: A `TextMeshPro` text element controlled by `GameLogUI`. It listens for game events and displays status messages ("New Round!", "Part Picked Up!") with a smooth fade-out effect.

---

## 7. Signal Flow: How It All Works Together
This is the end-to-end data flow for a single user action:

1. The user physically moves the air mouse hardware.
2. The MPU-6050 sensor detects the change in orientation.
3. The Arduino's `loop()` reads the new sensor values.
4. The Arduino formats the data into a CSV string and sends it over the USB serial port.
5. In Unity, the `AirMouseInput` script's dedicated thread, which is constantly listening, receives the CSV string.
6. The thread parses the string into numerical values for pitch and yaw.
7. In the main game thread, the `Update()` method in `AirMouseInput` reads these latest values and applies `SmoothDamp` to produce a smoothed input vector.
8. The `TargetController` script reads this final, smoothed vector and updates the 3D position of the `Target` GameObject in the scene.
9. The `IKController`'s `LateUpdate()` method detects that its `Target` has moved and runs the CCD algorithm to calculate new rotations for all arm joints so the end effector follows the target.
10. As the joints rotate, the `JointSoundController` script on each joint measures the angular speed and generates a procedural motor sound with a corresponding pitch and volume.
11. If the `Target` moves close enough to an `AssemblyPart`, the `AssemblyGameManager` script detects this, parents the part to the target, and fires the `OnPartPickedUp` event.
12. The `GameSoundEffects` and `GameLogUI` scripts receive this event and play the pickup sound and display the "Part Picked Up!" message, respectively.
13. The `AssemblyGameManager` also calls `SendVibrationCommand` on the `AirMouseInput` script.
14. `AirMouseInput` queues the command. On its next processing cycle, the serial thread sends the `V,200,150\n` command back to the Arduino.
15. The Arduino's `loop()` receives the command, turns on the vibration motor, and sets a timer to turn it off, providing haptic feedback to the user.