OVRInput exposes a unified input API for querying controller state, such as buttons, thumbsticks, triggers, and capacitive touch data. For keyboard and mouse control, use the UnityEngine.Input scripting API (see Unity’s Input scripting reference for more information). The required input bindings are automatically added to InputManager.asset if they do not already exist.

Controller poses can be obtained with GetLocalControllerPosition() and GetLocalControllerRotation(), which return a Vector3 and Quaternion, respectively. These poses are reset by OVRManager.display.RecenterPose(), similar to the head and eye poses.

Note the use of Primary and Secondary in OVRInput: Primary always refers to the left controller and Secondary always refers to the right controller.
To use OVRInput, either include an instance of OVRManager anywhere in your scene, or call OVRInput.Update() and OVRInput.FixedUpdate() once per frame at the beginning of any component’s Update and FixedUpdate methods, respectively.

OVRInput provides three types of queries: Get(), GetDown(), and GetUp().

- Get() queries the current state of a controller.
- GetDown() queries if a controller was pressed this frame.
- GetUp() queries if a controller was released this frame.

There are multiple variations of Get() that provide access to different sets of controls. These sets of controls are exposed through enumerations defined by OVRInput as follows:

| Control | Enumerates |
|---|---|
| OVRInput.Button | Traditional buttons found on gamepads, controllers, and the back button. |
| OVRInput.Touch | Capacitive-sensitive control surfaces found on the controller. |
| OVRInput.NearTouch | Proximity-sensitive control surfaces found on the controller. |
| OVRInput.Axis1D | One-dimensional controls such as triggers that report a floating point state. |
| OVRInput.Axis2D | Two-dimensional controls including thumbsticks. Reports a Vector2 state. |
A second set of raw enumerations mirrors the first and maps directly to the physical controls:

| Control |
|---|
| OVRInput.RawButton |
| OVRInput.RawTouch |
| OVRInput.RawNearTouch |
| OVRInput.RawAxis1D |
| OVRInput.RawAxis2D |
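As a brief sketch of how the two enumeration sets relate (the component name here is hypothetical; it assumes an OVRManager instance in the scene so OVRInput is updated each frame), the same physical button can be read through either the virtual or the raw mapping:

```csharp
using UnityEngine;

// Hypothetical component comparing virtual and raw button queries.
public class ButtonComparison : MonoBehaviour
{
    void Update()
    {
        // Virtual mapping: Button.One is the primary button of the queried controller.
        bool virtualOne = OVRInput.Get(OVRInput.Button.One);

        // Raw mapping: RawButton.A always refers to the physical A button.
        bool rawA = OVRInput.Get(OVRInput.RawButton.A);

        if (virtualOne || rawA)
        {
            Debug.Log("Primary/A button is down");
        }
    }
}
```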
The Touch controllers can detect when the user’s fingers make physical contact with a control surface (Touch), as well as when they are in close proximity (NearTouch). This allows for detecting several distinct states of a user’s interaction with a specific control surface. For example, if a user’s index finger is fully removed from a control surface, the NearTouch for that control will report false. As the user’s finger approaches the control and gets within close proximity to it, the NearTouch will report true prior to the user making physical contact. When the user makes physical contact, the Touch for that control will report true. When the user pushes the index trigger down, the Button for that control will report true. These distinct states can be used to accurately detect the user’s interaction with the controller and enable a variety of control schemes.

The following samples illustrate the query variations:

```csharp
// returns true if the primary button (typically “A”) is currently pressed.
OVRInput.Get(OVRInput.Button.One);

// returns true if the primary button (typically “A”) was pressed this frame.
OVRInput.GetDown(OVRInput.Button.One);

// returns true if the “X” button was released this frame.
OVRInput.GetUp(OVRInput.RawButton.X);

// returns a Vector2 of the primary (typically the Left) thumbstick’s current state.
// (X/Y range of -1.0f to 1.0f)
OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);

// returns true if the primary thumbstick is currently pressed (clicked as a button)
OVRInput.Get(OVRInput.Button.PrimaryThumbstick);

// returns true if the primary thumbstick has been moved upwards more than halfway.
// (Up/Down/Left/Right - Interpret the thumbstick as a D-pad).
OVRInput.Get(OVRInput.Button.PrimaryThumbstickUp);

// returns a float of the secondary (typically the Right) index finger trigger’s current state.
// (range of 0.0f to 1.0f)
OVRInput.Get(OVRInput.Axis1D.SecondaryIndexTrigger);

// returns a float of the left index finger trigger’s current state.
// (range of 0.0f to 1.0f)
OVRInput.Get(OVRInput.RawAxis1D.LIndexTrigger);

// returns true if the left index finger trigger has been pressed more than halfway.
// (Interpret the trigger as a button).
OVRInput.Get(OVRInput.RawButton.LIndexTrigger);

// returns true if the secondary gamepad button, typically “B”, is currently touched by the user.
OVRInput.Get(OVRInput.Touch.Two);
```
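The NearTouch → Touch → Button progression described above can be sketched as a single component (the class name is hypothetical; the enumeration values follow the OVRInput sets listed earlier, and an OVRManager instance is assumed to be in the scene):

```csharp
using UnityEngine;

// Hypothetical component reporting the distinct interaction states of the
// right (secondary) index trigger, from most to least engaged.
public class IndexFingerState : MonoBehaviour
{
    void Update()
    {
        if (OVRInput.Get(OVRInput.Button.SecondaryIndexTrigger))
            Debug.Log("Index finger: pressing the trigger");
        else if (OVRInput.Get(OVRInput.Touch.SecondaryIndexTrigger))
            Debug.Log("Index finger: resting on the trigger");
        else if (OVRInput.Get(OVRInput.NearTouch.SecondaryIndexTrigger))
            Debug.Log("Index finger: hovering near the trigger");
        else
            Debug.Log("Index finger: lifted away");
    }
}
```

Checking the states from most to least engaged ensures that a full press is not also reported as a mere touch or near-touch.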
Get() also takes an optional controller parameter. The list of supported controllers is defined in the OVRInput.Controller enumeration. When no controller parameter is passed to Get(), the default is to use the Active controller, which corresponds to the controller that most recently reported user input. For example, a user may use a pair of controllers, set them down, and pick up an Xbox controller, in which case the Active controller will switch to the Xbox controller once the user provides input with it. The current Active controller can be queried with OVRInput.GetActiveController() and a bitmask of all the connected Controllers can be queried with OVRInput.GetConnectedControllers().

```csharp
// returns a float of the hand trigger’s current state on the left controller.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.Touch);

// returns a float of the hand trigger’s current state on the right controller.
OVRInput.Get(OVRInput.Axis1D.SecondaryHandTrigger, OVRInput.Controller.Touch);
```
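Since GetConnectedControllers() returns a bitmask, individual controllers can be tested with a bitwise AND. A minimal sketch (the component name is hypothetical):

```csharp
using UnityEngine;

// Hypothetical component logging the active controller and whether the
// left Touch controller is among the connected controllers.
public class ControllerStatus : MonoBehaviour
{
    void Update()
    {
        OVRInput.Controller active = OVRInput.GetActiveController();
        OVRInput.Controller connected = OVRInput.GetConnectedControllers();

        // Bitwise test against the bitmask of connected controllers.
        bool leftConnected =
            (connected & OVRInput.Controller.LTouch) == OVRInput.Controller.LTouch;

        Debug.Log($"Active: {active}, left Touch connected: {leftConnected}");
    }
}
```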
The Touch controllers may be queried as a combined pair (with OVRInput.Controller.Touch), or individually (with OVRInput.Controller.LTouch and RTouch). This is significant because specifying LTouch or RTouch uses a different set of virtual input mappings that allow more convenient development of hand-agnostic input code.

```csharp
// returns a float of the hand trigger’s current state on the left controller.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.LTouch);

// returns a float of the hand trigger’s current state on the right controller.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.RTouch);
```
```csharp
// public variable that can be set to LTouch or RTouch in the Unity Inspector
public Controller controller;

// returns a float of the hand trigger’s current state on the controller
// specified by the controller variable.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, controller);

// returns true if the primary button (“A” or “X”) is pressed on the controller
// specified by the controller variable.
OVRInput.Get(OVRInput.Button.One, controller);
```
When the controllers are queried as a combined pair with OVRInput.Controller.Touch, the virtual mapping closely matches the layout of a typical gamepad split across the left and right hands.
When the controllers are queried individually with OVRInput.Controller.LTouch or OVRInput.Controller.RTouch, the virtual mapping changes to allow for hand-agnostic input bindings. For example, the same script can dynamically query the left or right controller depending on which hand it is attached to, and Button.One is mapped appropriately to either the A or X button.
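Putting the hand-agnostic pattern together, a single script can be attached to either hand, with the controller chosen in the Inspector (the class name here is hypothetical; the API calls are those shown above):

```csharp
using UnityEngine;

// Hypothetical hand-agnostic script: attach one instance to each hand object
// and set `controller` to LTouch or RTouch in the Inspector. Button.One then
// resolves to the A button on the right controller or X on the left.
public class HandButtonLogger : MonoBehaviour
{
    public OVRInput.Controller controller = OVRInput.Controller.LTouch;

    void Update()
    {
        if (OVRInput.GetDown(OVRInput.Button.One, controller))
        {
            Debug.Log($"{controller}: primary button pressed this frame");
        }
    }
}
```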
