// Include the OculusVR SDK
#include <OVR_CAPI.h>

void Application()
{
    ovrResult result = ovr_Initialize(nullptr);
    if (OVR_FAILURE(result))
        return;

    ovrSession session;
    ovrGraphicsLuid luid;
    result = ovr_Create(&session, &luid);
    if (OVR_FAILURE(result))
    {
        ovr_Shutdown();
        return;
    }

    ovrHmdDesc desc = ovr_GetHmdDesc(session);
    ovrSizei resolution = desc.Resolution;

    ovr_Destroy(session);
    ovr_Shutdown();
}
ovr_Initialize is called before any other API functions, and ovr_Shutdown is called to shut down the library before you exit the program. In between these function calls, you are free to create HMD objects, access tracking state, and perform application rendering.

In this example, ovr_Create creates the HMD. Use the LUID returned by ovr_Create to select the IDXGIAdapter on which your ID3D11Device or ID3D12Device is created. Finally, ovr_Destroy must be called to clear the HMD before shutting down the library.
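For example, a common way to honor the LUID on Windows is to enumerate DXGI adapters and pick the one whose LUID matches the value returned by ovr_Create. The sketch below is illustrative only and not part of the SDK; FindAdapterForLuid is a hypothetical helper and error handling is abbreviated:

#include <OVR_CAPI.h>
#include <dxgi.h>
#include <cstring>

// Sketch: find the IDXGIAdapter whose LUID matches the one returned by ovr_Create.
IDXGIAdapter* FindAdapterForLuid(const ovrGraphicsLuid& luid)
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return nullptr;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        // ovrGraphicsLuid carries the adapter LUID as an opaque 8-byte value.
        if (std::memcmp(&desc.AdapterLuid, &luid, sizeof(LUID)) == 0)
            break;                    // This adapter drives the headset.
        adapter->Release();
        adapter = nullptr;
    }
    factory->Release();
    return adapter;                   // Pass to D3D11CreateDevice / D3D12CreateDevice.
}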
You can call ovr_GetHmdDesc() to get a description of the HMD.

If the Rift is not plugged in, ovr_Create returns a failed ovrResult unless a virtual HMD is enabled through RiftConfigUtil. Although the virtual HMD will not provide any sensor input, it can be useful for debugging Rift-compatible rendering code and for general development without a physical device.

The description of the HMD (ovrHmdDesc) can be retrieved by calling ovr_GetHmdDesc. The following table describes the fields:

Field | Type | Description |
---|---|---|
Type | ovrHmdType | Type of the HMD. |
ProductName | char[] | Name of the product as a string. |
Manufacturer | char[] | Name of the manufacturer. |
VendorId | short | Vendor ID reported by the headset USB device. |
ProductId | short | Product ID reported by the headset USB device. |
SerialNumber | char[] | Serial number string reported by the headset USB device. |
FirmwareMajor | short | The major version of the sensor firmware. |
FirmwareMinor | short | The minor version of the sensor firmware. |
AvailableHmdCaps | unsigned int | Capability bits described by ovrHmdCaps which the HMD currently supports. |
DefaultHmdCaps | unsigned int | Default capability bits described by ovrHmdCaps for the current HMD. |
AvailableTrackingCaps | unsigned int | Capability bits described by ovrTrackingCaps which the HMD currently supports. |
DefaultTrackingCaps | unsigned int | Default capability bits described by ovrTrackingCaps for the current HMD. |
DefaultEyeFov | ovrFovPort[] | Recommended optical field of view for each eye. |
MaxEyeFov | ovrFovPort[] | Maximum optical field of view that can be practically rendered for each eye. |
Resolution | ovrSizei | Resolution of the full HMD screen (both eyes) in pixels. |
DisplayRefreshRate | float | Nominal refresh rate of the HMD in cycles per second at the time of HMD creation. |
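For illustration, a brief sketch (assuming the session created above) that reads a few of these fields:

// Sketch: read a handful of ovrHmdDesc fields after ovr_Create succeeds.
ovrHmdDesc hmdDesc = ovr_GetHmdDesc(session);
ovrSizei panelResolution = hmdDesc.Resolution;               // Full panel size in pixels (both eyes).
float refreshRate = hmdDesc.DisplayRefreshRate;              // Nominal refresh rate in Hz.
ovrFovPort leftEyeFov = hmdDesc.DefaultEyeFov[ovrEye_Left];  // Recommended FOV for the left eye.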
The description of a position sensor (ovrTrackerDesc) can be retrieved by calling ovr_GetTrackerDesc. The following table describes the fields:

Field | Type | Description |
---|---|---|
FrustumHFovInRadians | float | The horizontal FOV of the position sensor frustum. |
FrustumVFovInRadians | float | The vertical FOV of the position sensor frustum. |
FrustumNearZInMeters | float | The distance from the position sensor to the near frustum bounds. |
FrustumFarZInMeters | float | The distance from the position sensor to the far frustum bounds. |
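A minimal sketch (assuming a valid session, with printf from <cstdio>) that enumerates the connected sensors with ovr_GetTrackerCount and reads these fields:

// Sketch: print the frustum parameters of every connected position sensor.
unsigned int trackerCount = ovr_GetTrackerCount(session);
for (unsigned int i = 0; i < trackerCount; ++i)
{
    ovrTrackerDesc td = ovr_GetTrackerDesc(session, i);
    printf("Tracker %u: HFOV %.3f rad, VFOV %.3f rad, near %.2f m, far %.2f m\n",
           i, td.FrustumHFovInRadians, td.FrustumVFovInRadians,
           td.FrustumNearZInMeters, td.FrustumFarZInMeters);
}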
After the ovrSession is created, you can poll sensor fusion for head position and orientation by calling ovr_GetTrackingState. These calls are demonstrated by the following code:

// Query the HMD for its current tracking state.
ovrTrackingState ts = ovr_GetTrackingState(session, ovr_GetTimeInSeconds(), ovrTrue);
if (ts.StatusFlags & (ovrStatus_OrientationTracked | ovrStatus_PositionTracked))
{
    ovrPosef pose = ts.HeadPose.ThePose;
    ...
}
The head pose and tracking status are returned together by ovr_GetTrackingState. This state includes the predicted head pose and the current tracking state of the HMD as described by StatusFlags. This state can change at runtime based on the available devices and user behavior. For example, the ovrStatus_PositionTracked flag is reported only when HeadPose includes absolute positional tracking data from the sensor.

The reported ovrPoseStatef includes full six degrees of freedom (6DoF) head tracking data, including orientation, position, and their first and second derivatives. The pose value is reported for a specified absolute point in time using prediction, typically corresponding to the time in the future at which this frame's image will be displayed on screen. To facilitate prediction, ovr_GetTrackingState takes absolute time, in seconds, as a second argument. The current value of absolute time can be obtained by calling ovr_GetTimeInSeconds. If the time passed into ovr_GetTrackingState is the current time or earlier, the tracking state returned will be based on the latest sensor readings with no prediction. In a production application, however, you should use the real-time value returned by ovr_GetPredictedDisplayTime. Prediction is covered in more detail in the section on Frame Timing.
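A hedged sketch of that production path, where frameIndex is a hypothetical counter maintained by your frame loop:

// Sketch: predict the head pose for the time this frame will actually be displayed.
double displayTime = ovr_GetPredictedDisplayTime(session, frameIndex);
ovrTrackingState predicted = ovr_GetTrackingState(session, displayTime, ovrTrue);
ovrPosef headPose = predicted.HeadPose.ThePose;   // Pose predicted for displayTime.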
Alternatively, you can use Quatf::GetEulerAngles<> to extract the Euler angles from the head orientation quaternion in the desired axis rotation order.
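For example, assuming the ovrTrackingState ts from the earlier snippet and the C++ math helpers shipped with the PC SDK (Extras/OVR_Math.h), a sketch:

#include "Extras/OVR_Math.h"   // OVR::Quatf and related C++ wrappers.

// Sketch: convert the head orientation quaternion to yaw/pitch/roll Euler angles.
OVR::Quatf orientation = ts.HeadPose.ThePose.Orientation;
float yaw, pitch, roll;
orientation.GetEulerAngles<OVR::Axis_Y, OVR::Axis_X, OVR::Axis_Z>(&yaw, &pitch, &roll);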
The sensor frustum parameters can be read through the ovrTrackerDesc struct as follows:

ovrSession session;
ovrGraphicsLuid luid;
if (OVR_SUCCESS(ovr_Create(&session, &luid)))
{
    // Extract tracking frustum parameters from the first sensor.
    ovrTrackerDesc trackerDesc = ovr_GetTrackerDesc(session, 0);
    float frustumHorizontalFOV = trackerDesc.FrustumHFovInRadians;
    ...
Typical values for these parameters are shown in the following table:

Field | Type | Typical Value |
---|---|---|
FrustumHFovInRadians | float | 1.292 radians (74 degrees) |
FrustumVFovInRadians | float | 0.942 radians (54 degrees) |
FrustumNearZInMeters | float | 0.4m |
FrustumFarZInMeters | float | 2.5m |
The tracking origin can be reset by calling ovr_RecenterTrackingOrigin, which resets the tracking origin to the headset's current location and sets the yaw origin to the current headset yaw value. Additionally, the origin can be manually specified at any location using the API call ovr_SpecifyTrackingOrigin.
To query the current tracking origin type, call ovr_GetTrackingOriginType. To set the origin type, use ovr_SetTrackingOriginType.

Note: The tracking origin is set on a per-application basis; switching focus between different VR apps also switches the tracking origin.
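An illustrative sketch (userRequestedRecenter is a hypothetical flag from your own input handling):

// Sketch: use a floor-level origin for standing content and re-center on request.
ovr_SetTrackingOriginType(session, ovrTrackingOrigin_FloorLevel);
ovrTrackingOrigin origin = ovr_GetTrackingOriginType(session);  // Query the current setting.

if (userRequestedRecenter)
    ovr_RecenterTrackingOrigin(session);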
Position tracking data is reported by ovr_GetTrackingState. The returned ovrTrackingState struct contains several items relevant to position tracking:

- CalibratedOrigin: the origin used by the current tracking state. This value changes after each call to ovr_RecenterTrackingOrigin, though it refers to the same location in real-world space. Otherwise it will remain as an identity pose. Different tracking origin types will report different CalibratedOrigin poses, as the calibration origin refers to a fixed position in real-world space but the two tracking origin types refer to different y levels.
- StatusFlags: includes the ovrStatus_PositionTracked flag, which is set only when the headset is being actively tracked.
When position tracking is interrupted (for example, if the headset moves out of view of the sensor), your application can detect this by checking whether the ovrStatus_PositionTracked flag is set.

The pose of each sensor is reported by ovr_GetTrackerPose. The returned ovrTrackerPose struct contains the following:

- Pose: the pose of the sensor relative to the tracking origin.
- LeveledPose: the pose of the sensor relative to the tracking origin but with roll and pitch zeroed out. You can use this as a reference point to render real-world objects in the correct place.
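For example, a minimal sketch (assuming a valid session) that reads the pose of the first sensor:

// Sketch: read the pose of sensor 0 relative to the tracking origin.
ovrTrackerPose trackerPose = ovr_GetTrackerPose(session, 0);
ovrPosef sensorPose  = trackerPose.Pose;         // Full sensor pose.
ovrPosef leveledPose = trackerPose.LeveledPose;  // Same pose with roll and pitch zeroed out.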
Under OpenXR, you can detect whether a headset is present by calling xrGetSystem.

#include <openxr/openxr.h>
#include <cstring>

void Application()
{
    XrInstanceCreateInfo createInfo{XR_TYPE_INSTANCE_CREATE_INFO};
    createInfo.applicationInfo.applicationVersion = 1;
    strcpy(createInfo.applicationInfo.applicationName, "application");
    createInfo.applicationInfo.apiVersion = XR_API_VERSION_1_0;
    XrInstance instance;
    XrResult result = xrCreateInstance(&createInfo, &instance);
    if (XR_FAILED(result))
        return;

    bool isVR = YourArgumentCheckHere("-vrmode", "none");
    if (isVR)
    {
        XrSystemId systemId;
        XrSystemGetInfo systemGetInfo{XR_TYPE_SYSTEM_GET_INFO, nullptr, XR_FORM_FACTOR_HEAD_MOUNTED_DISPLAY};
        result = xrGetSystem(instance, &systemGetInfo, &systemId);
        if (XR_SUCCEEDED(result))
        {
            // Headset found: create the XrSession and
            // run the VR app loop.
        }
        else
        {
            // No headset connected.
            // Run in desktop mode.
        }
    }
    xrDestroyInstance(instance);
}