Base class for an XR interface implementation.
This class needs to be implemented to make an AR or VR platform available to Godot. Interfaces should be implemented as C++ modules or GDNative modules (note that for GDNative, the subclass XRScriptInterface should be used). Part of the interface is exposed to GDScript so you can detect, enable, and configure an AR or VR platform.
Interfaces should be written in such a way that simply enabling them gives you a working setup. You can query the available interfaces through [XRServer].
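As a sketch of the GDScript side, you can enumerate registered interfaces or look one up by name (the interface name [code]"OpenXR"[/code] below is only an example; which names are available depends on your platform):
[codeblock]
# List every interface the XRServer currently knows about.
for interface_data in XRServer.get_interfaces():
    print("Found interface: ", interface_data.name)

# Look up a specific interface by name; returns null if it is not registered.
var xr_interface: XRInterface = XRServer.find_interface("OpenXR")
if xr_interface:
    print("Capabilities: ", xr_interface.get_capabilities())
[/codeblock]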
https://docs.godotengine.org/en/latest/tutorials/vr/index.html
If this is an AR interface that requires displaying a camera feed as the background, this method returns the feed ID in the [CameraServer] for this interface.
Returns a combination of [enum Capabilities] flags providing information about the capabilities of this interface.
Returns the name of this interface (OpenXR, OpenVR, OpenHMD, ARKit, etc.).
Returns an array of vectors that denotes the physical play area mapped to the virtual space around the [XROrigin3D] point. The points form a convex polygon that can be used to react to or visualise the play area. This returns an empty array if this feature is not supported or if the information is not yet available.
Returns the resolution at which we should render our intermediate results before things like lens distortion are applied by the VR platform.
If supported, returns the status of our tracking. This will allow you to provide feedback to the user whether there are issues with positional tracking.
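For instance, a minimal per-frame check could surface tracking problems to the player. This is only a sketch; [code]xr_interface[/code] is assumed to hold an initialized interface and [code]show_warning[/code] is a hypothetical helper in your own project:
[codeblock]
func _process(_delta: float) -> void:
    match xr_interface.get_tracking_status():
        XRInterface.XR_EXCESSIVE_MOTION:
            show_warning("Slow down, tracking can't keep up.")
        XRInterface.XR_INSUFFICIENT_FEATURES:
            show_warning("Tracking is degraded, try improving lighting.")
        XRInterface.XR_NOT_TRACKING:
            show_warning("Tracking lost.")
        _:
            pass # XR_NORMAL_TRACKING or XR_UNKNOWN_TRACKING.
[/codeblock]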
Returns the number of views that need to be rendered for this device. 1 for Monoscopic, 2 for Stereoscopic.
Call this to initialize this interface. The first interface that is initialized is identified as the primary interface and it will be used for rendering output.
After initializing the interface you want to use you then need to enable the AR/VR mode of a viewport and rendering should commence.
[b]Note:[/b] You must enable the XR mode on the main viewport for any device that uses the main output of Godot, such as for mobile VR.
If you do this for a platform that handles its own output (such as OpenVR) Godot will show just one eye without distortion on screen. Alternatively, you can add a separate viewport node to your scene and enable AR/VR on that viewport. It will be used to output to the HMD, leaving you free to do anything you like in the main window, such as using a separate camera as a spectator camera or rendering something completely different.
While currently not used, you can activate additional interfaces. You may wish to do this if you want to track controllers from other platforms. However, at this point in time only one interface can render to an HMD.
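Putting this together, a typical startup sequence in GDScript looks like the following sketch (the interface name [code]"OpenXR"[/code] is an assumption; substitute the interface your platform provides):
[codeblock]
func _ready() -> void:
    var xr_interface: XRInterface = XRServer.find_interface("OpenXR")
    if xr_interface and xr_interface.initialize():
        # The first interface initialized becomes the primary interface.
        # Enable XR output on the main viewport so rendering commences.
        get_viewport().use_xr = true
    else:
        print("Failed to initialize XR interface.")
[/codeblock]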
Is [code]true[/code] if this interface has been initialized.
Sets the active play area mode. Returns [code]false[/code] if the mode can't be used with this interface.
Call this to find out if a given play area mode is supported by this interface.
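For example, you might prefer roomscale and fall back to a seated configuration when it is unavailable. A sketch, assuming [code]xr_interface[/code] holds an initialized interface:
[codeblock]
if xr_interface.supports_play_area_mode(XRInterface.XR_PLAY_AREA_ROOMSCALE):
    xr_interface.set_play_area_mode(XRInterface.XR_PLAY_AREA_ROOMSCALE)
elif xr_interface.supports_play_area_mode(XRInterface.XR_PLAY_AREA_SITTING):
    xr_interface.set_play_area_mode(XRInterface.XR_PLAY_AREA_SITTING)
[/codeblock]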
Triggers a haptic pulse on a device associated with this interface.
[code]action_name[/code] is the name of the action for this pulse.
[code]tracker_name[/code] is optional and can be used to direct the pulse to a specific device provided that device is bound to this haptic.
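As an illustration, the action name [code]"haptic"[/code] and tracker name [code]"right_hand"[/code] below are assumptions that depend on how your action map and trackers are set up:
[codeblock]
# Arguments after the tracker name are frequency (Hz, 0.0 for the
# device default), amplitude (0.0-1.0), duration and delay in seconds.
xr_interface.trigger_haptic_pulse("haptic", "right_hand", 0.0, 0.5, 0.1, 0.0)
[/codeblock]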
Turns the interface off.
On an AR interface, [code]true[/code] if anchor detection is enabled.
[code]true[/code] if this is the primary interface.
The play area mode for this interface.
Emitted when the play area is changed. This can be a result of the player resetting the boundary or entering a new play area, the player changing the play area mode, the world scale changing or the player resetting their headset orientation.
No XR capabilities.
This interface can work with normal rendering output (non-HMD based AR).
This interface supports stereoscopic rendering.
This interface supports quad rendering (not yet supported by Godot).
This interface supports VR.
This interface supports AR (video background and real world tracking).
This interface outputs to an external device. If the main viewport is used, the on screen output is an unmodified buffer of either the left or right eye (stretched if the viewport size is not changed to the same aspect ratio of [method get_render_target_size]). Using a separate viewport node frees up the main viewport for other purposes.
Tracking is behaving as expected.
Tracking is hindered by excessive motion (the player is moving faster than tracking can keep up).
Tracking is hindered by insufficient features, it's too dark (for camera-based tracking), player is blocked, etc.
We don't know the status of the tracking or this interface does not provide feedback.
Tracking is not functional (camera not plugged in or obscured, lighthouses turned off, etc.).
Play area mode not set or not available.
Play area only supports orientation tracking, no positional tracking; the area will center around the player.
Player is in seated position, limited positional tracking, fixed guardian around player.
Player is free to move around, full positional tracking.
Same as roomscale, but the origin point is fixed to the center of the physical space and XRServer.center_on_hmd is disabled.