From afc8c6391ccc2d3d69dcad0d93b2639bfe77dc55 Mon Sep 17 00:00:00 2001 From: Bastiaan Olij Date: Thu, 9 Apr 2020 00:47:36 +1000 Subject: Renaming all ARVR nodes to XR --- doc/classes/@GlobalScope.xml | 4 +- doc/classes/ARVRAnchor.xml | 66 ------------ doc/classes/ARVRCamera.xml | 17 --- doc/classes/ARVRController.xml | 106 ------------------- doc/classes/ARVRInterface.xml | 127 ----------------------- doc/classes/ARVROrigin.xml | 25 ----- doc/classes/ARVRPositionalTracker.xml | 111 -------------------- doc/classes/ARVRServer.xml | 188 ---------------------------------- doc/classes/RenderingServer.xml | 6 +- doc/classes/SubViewport.xml | 2 +- doc/classes/XRAnchor3D.xml | 66 ++++++++++++ doc/classes/XRCamera3D.xml | 17 +++ doc/classes/XRController3D.xml | 106 +++++++++++++++++++ doc/classes/XRInterface.xml | 127 +++++++++++++++++++++++ doc/classes/XROrigin3D.xml | 25 +++++ doc/classes/XRPositionalTracker.xml | 111 ++++++++++++++++++++ doc/classes/XRServer.xml | 188 ++++++++++++++++++++++++++++++++++ 17 files changed, 646 insertions(+), 646 deletions(-) delete mode 100644 doc/classes/ARVRAnchor.xml delete mode 100644 doc/classes/ARVRCamera.xml delete mode 100644 doc/classes/ARVRController.xml delete mode 100644 doc/classes/ARVRInterface.xml delete mode 100644 doc/classes/ARVROrigin.xml delete mode 100644 doc/classes/ARVRPositionalTracker.xml delete mode 100644 doc/classes/ARVRServer.xml create mode 100644 doc/classes/XRAnchor3D.xml create mode 100644 doc/classes/XRCamera3D.xml create mode 100644 doc/classes/XRController3D.xml create mode 100644 doc/classes/XRInterface.xml create mode 100644 doc/classes/XROrigin3D.xml create mode 100644 doc/classes/XRPositionalTracker.xml create mode 100644 doc/classes/XRServer.xml (limited to 'doc/classes') diff --git a/doc/classes/@GlobalScope.xml b/doc/classes/@GlobalScope.xml index 8c6821eaac..f462aa989d 100644 --- a/doc/classes/@GlobalScope.xml +++ b/doc/classes/@GlobalScope.xml @@ -12,8 +12,8 @@ - - The [ARVRServer] singleton. 
+ + The [XRServer] singleton. The [AudioServer] singleton. diff --git a/doc/classes/ARVRAnchor.xml b/doc/classes/ARVRAnchor.xml deleted file mode 100644 index 82575ce7cb..0000000000 --- a/doc/classes/ARVRAnchor.xml +++ /dev/null @@ -1,66 +0,0 @@ - - - - An anchor point in AR space. - - - The [ARVRAnchor] point is a spatial node that maps a real world location identified by the AR platform to a position within the game world. For example, as long as plane detection in ARKit is on, ARKit will identify and update the position of planes (tables, floors, etc) and create anchors for them. - This node is mapped to one of the anchors through its unique ID. When you receive a signal that a new anchor is available, you should add this node to your scene for that anchor. You can predefine nodes and set the ID; the nodes will simply remain on 0,0,0 until a plane is recognized. - Keep in mind that, as long as plane detection is enabled, the size, placing and orientation of an anchor will be updated as the detection logic learns more about the real world out there especially if only part of the surface is in view. - - - - - - - - - Returns the name given to this anchor. - - - - - - - Returns [code]true[/code] if the anchor is being tracked and [code]false[/code] if no anchor with this ID is currently known. - - - - - - - If provided by the [ARVRInterface], this returns a mesh object for the anchor. For an anchor, this can be a shape related to the object being tracked or it can be a mesh that provides topology related to the anchor and can be used to create shadows/reflections on surfaces or for generating collision shapes. - - - - - - - Returns a plane aligned with our anchor; handy for intersection testing. - - - - - - - Returns the estimated size of the plane that was detected. Say when the anchor relates to a table in the real world, this is the estimated size of the surface of that table. - - - - - - The anchor's ID. You can set this before the anchor itself exists. 
The first anchor gets an ID of [code]1[/code], the second an ID of [code]2[/code], etc. When anchors get removed, the engine can then assign the corresponding ID to new anchors. The most common situation where anchors "disappear" is when the AR server identifies that two anchors represent different parts of the same plane and merges them. - - - - - - - - Emitted when the mesh associated with the anchor changes or when one becomes available. This is especially important for topology that is constantly being [code]mesh_updated[/code]. - - - - - - diff --git a/doc/classes/ARVRCamera.xml b/doc/classes/ARVRCamera.xml deleted file mode 100644 index c97d5cf1d8..0000000000 --- a/doc/classes/ARVRCamera.xml +++ /dev/null @@ -1,17 +0,0 @@ - - - - A camera node with a few overrules for AR/VR applied, such as location tracking. - - - This is a helper spatial node for our camera; note that, if stereoscopic rendering is applicable (VR-HMD), most of the camera properties are ignored, as the HMD information overrides them. The only properties that can be trusted are the near and far planes. - The position and orientation of this node is automatically updated by the ARVR Server to represent the location of the HMD if such tracking is available and can thus be used by game logic. Note that, in contrast to the ARVR Controller, the render thread has access to the most up-to-date tracking data of the HMD and the location of the ARVRCamera can lag a few milliseconds behind what is used for rendering as a result. - - - https://docs.godotengine.org/en/latest/tutorials/vr/index.html - - - - - - diff --git a/doc/classes/ARVRController.xml b/doc/classes/ARVRController.xml deleted file mode 100644 index 572b47ce6d..0000000000 --- a/doc/classes/ARVRController.xml +++ /dev/null @@ -1,106 +0,0 @@ - - - - A spatial node representing a spatially-tracked controller. - - - This is a helper spatial node that is linked to the tracking of controllers. 
It also offers several handy passthroughs to the state of buttons and such on the controllers. - Controllers are linked by their ID. You can create controller nodes before the controllers are available. If your game always uses two controllers (one for each hand), you can predefine the controllers with ID 1 and 2; they will become active as soon as the controllers are identified. If you expect additional controllers to be used, you should react to the signals and add ARVRController nodes to your scene. - The position of the controller node is automatically updated by the [ARVRServer]. This makes this node ideal to add child nodes to visualize the controller. - - - https://docs.godotengine.org/en/latest/tutorials/vr/index.html - - - - - - - If active, returns the name of the associated controller if provided by the AR/VR SDK used. - - - - - - - Returns the hand holding this controller, if known. See [enum ARVRPositionalTracker.TrackerHand]. - - - - - - - Returns [code]true[/code] if the bound controller is active. ARVR systems attempt to track active controllers. - - - - - - - - - Returns the value of the given axis for things like triggers, touchpads, etc. that are embedded into the controller. - - - - - - - Returns the ID of the joystick object bound to this. Every controller tracked by the [ARVRServer] that has buttons and axis will also be registered as a joystick within Godot. This means that all the normal joystick tracking and input mapping will work for buttons and axis found on the AR/VR controllers. This ID is purely offered as information so you can link up the controller with its joystick entry. - - - - - - - If provided by the [ARVRInterface], this returns a mesh associated with the controller. This can be used to visualize the controller. - - - - - - - - - Returns [code]true[/code] if the button at index [code]button[/code] is pressed. See [enum JoystickList], in particular the [code]JOY_VR_*[/code] constants. - - - - - - The controller's ID. 
- A controller ID of 0 is unbound and will always result in an inactive node. Controller ID 1 is reserved for the first controller that identifies itself as the left-hand controller and ID 2 is reserved for the first controller that identifies itself as the right-hand controller. - For any other controller that the [ARVRServer] detects, we continue with controller ID 3. - When a controller is turned off, its slot is freed. This ensures controllers will keep the same ID even when controllers with lower IDs are turned off. - - - The degree to which the controller vibrates. Ranges from [code]0.0[/code] to [code]1.0[/code] with precision [code].01[/code]. If changed, updates [member ARVRPositionalTracker.rumble] accordingly. - This is a useful property to animate if you want the controller to vibrate for a limited duration. - - - - - - - - Emitted when a button on this controller is pressed. - - - - - - - Emitted when a button on this controller is released. - - - - - - - Emitted when the mesh associated with the controller changes or when one becomes available. Generally speaking this will be a static mesh after becoming available. - - - - - - diff --git a/doc/classes/ARVRInterface.xml b/doc/classes/ARVRInterface.xml deleted file mode 100644 index 0727bda668..0000000000 --- a/doc/classes/ARVRInterface.xml +++ /dev/null @@ -1,127 +0,0 @@ - - - - Base class for an AR/VR interface implementation. - - - This class needs to be implemented to make an AR or VR platform available to Godot and these should be implemented as C++ modules or GDNative modules (note that for GDNative the subclass ARVRScriptInterface should be used). Part of the interface is exposed to GDScript so you can detect, enable and configure an AR or VR platform. - Interfaces should be written in such a way that simply enabling them will give us a working setup. You can query the available interfaces through [ARVRServer]. 
- - - https://docs.godotengine.org/en/latest/tutorials/vr/index.html - - - - - - - If this is an AR interface that requires displaying a camera feed as the background, this method returns the feed ID in the [CameraServer] for this interface. - - - - - - - Returns a combination of [enum Capabilities] flags providing information about the capabilities of this interface. - - - - - - - Returns the name of this interface (OpenVR, OpenHMD, ARKit, etc). - - - - - - - Returns the resolution at which we should render our intermediate results before things like lens distortion are applied by the VR platform. - - - - - - - If supported, returns the status of our tracking. This will allow you to provide feedback to the user whether there are issues with positional tracking. - - - - - - - Call this to initialize this interface. The first interface that is initialized is identified as the primary interface and it will be used for rendering output. - After initializing the interface you want to use you then need to enable the AR/VR mode of a viewport and rendering should commence. - [b]Note:[/b] You must enable the AR/VR mode on the main viewport for any device that uses the main output of Godot, such as for mobile VR. - If you do this for a platform that handles its own output (such as OpenVR) Godot will show just one eye without distortion on screen. Alternatively, you can add a separate viewport node to your scene and enable AR/VR on that viewport. It will be used to output to the HMD, leaving you free to do anything you like in the main window, such as using a separate camera as a spectator camera or rendering something completely different. - While currently not used, you can activate additional interfaces. You may wish to do this if you want to track controllers from other platforms. However, at this point in time only one interface can render to an HMD. - - - - - - - Returns [code]true[/code] if the current output of this interface is in stereo. 
- - - - - - - Turns the interface off. - - - - - - On an AR interface, [code]true[/code] if anchor detection is enabled. - - - [code]true[/code] if this interface been initialized. - - - [code]true[/code] if this is the primary interface. - - - - - No ARVR capabilities. - - - This interface can work with normal rendering output (non-HMD based AR). - - - This interface supports stereoscopic rendering. - - - This interface supports AR (video background and real world tracking). - - - This interface outputs to an external device. If the main viewport is used, the on screen output is an unmodified buffer of either the left or right eye (stretched if the viewport size is not changed to the same aspect ratio of [method get_render_targetsize]). Using a separate viewport node frees up the main viewport for other purposes. - - - Mono output, this is mostly used internally when retrieving positioning information for our camera node or when stereo scopic rendering is not supported. - - - Left eye output, this is mostly used internally when rendering the image for the left eye and obtaining positioning and projection information. - - - Right eye output, this is mostly used internally when rendering the image for the right eye and obtaining positioning and projection information. - - - Tracking is behaving as expected. - - - Tracking is hindered by excessive motion (the player is moving faster than tracking can keep up). - - - Tracking is hindered by insufficient features, it's too dark (for camera-based tracking), player is blocked, etc. - - - We don't know the status of the tracking or this interface does not provide feedback. - - - Tracking is not functional (camera not plugged in or obscured, lighthouses turned off, etc.). - - - diff --git a/doc/classes/ARVROrigin.xml b/doc/classes/ARVROrigin.xml deleted file mode 100644 index a88a89c927..0000000000 --- a/doc/classes/ARVROrigin.xml +++ /dev/null @@ -1,25 +0,0 @@ - - - - The origin point in AR/VR. 
- - - This is a special node within the AR/VR system that maps the physical location of the center of our tracking space to the virtual location within our game world. - There should be only one of these nodes in your scene and you must have one. All the ARVRCamera, ARVRController and ARVRAnchor nodes should be direct children of this node for spatial tracking to work correctly. - It is the position of this node that you update when your character needs to move through your game world while we're not moving in the real world. Movement in the real world is always in relation to this origin point. - For example, if your character is driving a car, the ARVROrigin node should be a child node of this car. Or, if you're implementing a teleport system to move your character, you should change the position of this node. - - - https://docs.godotengine.org/en/latest/tutorials/vr/index.html - - - - - - Allows you to adjust the scale to your game's units. Most AR/VR platforms assume a scale of 1 game world unit = 1 real world meter. - [b]Note:[/b] This method is a passthrough to the [ARVRServer] itself. - - - - - diff --git a/doc/classes/ARVRPositionalTracker.xml b/doc/classes/ARVRPositionalTracker.xml deleted file mode 100644 index 640b721d37..0000000000 --- a/doc/classes/ARVRPositionalTracker.xml +++ /dev/null @@ -1,111 +0,0 @@ - - - - A tracked object. - - - An instance of this object represents a device that is tracked, such as a controller or anchor point. HMDs aren't represented here as they are handled internally. - As controllers are turned on and the AR/VR interface detects them, instances of this object are automatically added to this list of active tracking objects accessible through the [ARVRServer]. - The [ARVRController] and [ARVRAnchor] both consume objects of this type and should be used in your project. The positional trackers are just under-the-hood objects that make this all work. 
These are mostly exposed so that GDNative-based interfaces can interact with them. - - - https://docs.godotengine.org/en/latest/tutorials/vr/index.html - - - - - - - Returns the hand holding this tracker, if known. See [enum TrackerHand] constants. - - - - - - - If this is a controller that is being tracked, the controller will also be represented by a joystick entry with this ID. - - - - - - - Returns the mesh related to a controller or anchor point if one is available. - - - - - - - Returns the controller or anchor point's name if available. - - - - - - - Returns the controller's orientation matrix. - - - - - - - Returns the world-space controller position. - - - - - - - Returns the internal tracker ID. This uniquely identifies the tracker per tracker type and matches the ID you need to specify for nodes such as the [ARVRController] and [ARVRAnchor] nodes. - - - - - - - Returns [code]true[/code] if this device tracks orientation. - - - - - - - Returns [code]true[/code] if this device tracks position. - - - - - - - - - Returns the transform combining this device's orientation and position. - - - - - - - Returns the tracker's type. - - - - - - The degree to which the tracker rumbles. Ranges from [code]0.0[/code] to [code]1.0[/code] with precision [code].01[/code]. - - - - - The hand this tracker is held in is unknown or not applicable. - - - This tracker is the left hand controller. - - - This tracker is the right hand controller. - - - diff --git a/doc/classes/ARVRServer.xml b/doc/classes/ARVRServer.xml deleted file mode 100644 index d8d069c048..0000000000 --- a/doc/classes/ARVRServer.xml +++ /dev/null @@ -1,188 +0,0 @@ - - - - Server for AR and VR features. - - - The AR/VR server is the heart of our Advanced and Virtual Reality solution and handles all the processing. - - - https://docs.godotengine.org/en/latest/tutorials/vr/index.html - - - - - - - - - - - This is an important function to understand correctly. 
AR and VR platforms all handle positioning slightly differently. - For platforms that do not offer spatial tracking, our origin point (0,0,0) is the location of our HMD, but you have little control over the direction the player is facing in the real world. - For platforms that do offer spatial tracking, our origin point depends very much on the system. For OpenVR, our origin point is usually the center of the tracking space, on the ground. For other platforms, it's often the location of the tracking camera. - This method allows you to center your tracker on the location of the HMD. It will take the current location of the HMD and use that to adjust all your tracking data; in essence, realigning the real world to your player's current position in the game world. - For this method to produce usable results, tracking information must be available. This often takes a few frames after starting your game. - You should call this method after a few seconds have passed. For instance, when the user requests a realignment of the display holding a designated button on a controller for a short period of time, or when implementing a teleport mechanism. - - - - - - - - - Finds an interface by its name. For instance, if your project uses capabilities of an AR/VR platform, you can find the interface for that platform by name and initialize it. - - - - - - - Returns the primary interface's transformation. - - - - - - - - - Returns the interface registered at a given index in our list of interfaces. - - - - - - - Returns the number of interfaces currently registered with the AR/VR server. If your project supports multiple AR/VR platforms, you can look through the available interface, and either present the user with a selection or simply try to initialize each interface and use the first one that returns [code]true[/code]. - - - - - - - Returns a list of available interfaces the ID and name of each interface. 
- - - - - - - Returns the absolute timestamp (in μs) of the last [ARVRServer] commit of the AR/VR eyes to [RenderingServer]. The value comes from an internal call to [method OS.get_ticks_usec]. - - - - - - - Returns the duration (in μs) of the last frame. This is computed as the difference between [method get_last_commit_usec] and [method get_last_process_usec] when committing. - - - - - - - Returns the absolute timestamp (in μs) of the last [ARVRServer] process callback. The value comes from an internal call to [method OS.get_ticks_usec]. - - - - - - - Returns the reference frame transform. Mostly used internally and exposed for GDNative build interfaces. - - - - - - - - - Returns the positional tracker at the given ID. - - - - - - - Returns the number of trackers currently registered. - - - - - - The primary [ARVRInterface] currently bound to the [ARVRServer]. - - - Allows you to adjust the scale to your game's units. Most AR/VR platforms assume a scale of 1 game world unit = 1 real world meter. - - - - - - - - Emitted when a new interface has been added. - - - - - - - Emitted when an interface is removed. - - - - - - - - - - - Emitted when a new tracker has been added. If you don't use a fixed number of controllers or if you're using [ARVRAnchor]s for an AR solution, it is important to react to this signal to add the appropriate [ARVRController] or [ARVRAnchor] nodes related to this new tracker. - - - - - - - - - - - Emitted when a tracker is removed. You should remove any [ARVRController] or [ARVRAnchor] points if applicable. This is not mandatory, the nodes simply become inactive and will be made active again when a new tracker becomes available (i.e. a new controller is switched on that takes the place of the previous one). - - - - - - The tracker tracks the location of a controller. - - - The tracker tracks the location of a base station. - - - The tracker tracks the location and size of an AR anchor. 
- - - Used internally to filter trackers of any known type. - - - Used internally if we haven't set the tracker type yet. - - - Used internally to select all trackers. - - - Fully reset the orientation of the HMD. Regardless of what direction the user is looking to in the real world. The user will look dead ahead in the virtual world. - - - Resets the orientation but keeps the tilt of the device. So if we're looking down, we keep looking down but heading will be reset. - - - Does not reset the orientation of the HMD, only the position of the player gets centered. - - - diff --git a/doc/classes/RenderingServer.xml b/doc/classes/RenderingServer.xml index d2d13fe406..aa393877b2 100644 --- a/doc/classes/RenderingServer.xml +++ b/doc/classes/RenderingServer.xml @@ -3067,15 +3067,15 @@ Sets when the viewport should be updated. See [enum ViewportUpdateMode] constants for options. - + - + - If [code]true[/code], the viewport uses augmented or virtual reality technologies. See [ARVRInterface]. + If [code]true[/code], the viewport uses augmented or virtual reality technologies. See [XRInterface]. diff --git a/doc/classes/SubViewport.xml b/doc/classes/SubViewport.xml index dc3d748496..e877050bf8 100644 --- a/doc/classes/SubViewport.xml +++ b/doc/classes/SubViewport.xml @@ -9,7 +9,7 @@ - + If [code]true[/code], the sub-viewport will be used in AR/VR process. diff --git a/doc/classes/XRAnchor3D.xml b/doc/classes/XRAnchor3D.xml new file mode 100644 index 0000000000..a409c79230 --- /dev/null +++ b/doc/classes/XRAnchor3D.xml @@ -0,0 +1,66 @@ + + + + An anchor point in AR space. + + + The [XRAnchor3D] point is a spatial node that maps a real world location identified by the AR platform to a position within the game world. For example, as long as plane detection in ARKit is on, ARKit will identify and update the position of planes (tables, floors, etc) and create anchors for them. + This node is mapped to one of the anchors through its unique ID. 
When you receive a signal that a new anchor is available, you should add this node to your scene for that anchor. You can predefine nodes and set the ID; the nodes will simply remain on 0,0,0 until a plane is recognized. + Keep in mind that, as long as plane detection is enabled, the size, placement and orientation of an anchor will be updated as the detection logic learns more about the real world, especially if only part of the surface is in view. + + + + + + + + + Returns the name given to this anchor. + + + + + + + Returns [code]true[/code] if the anchor is being tracked and [code]false[/code] if no anchor with this ID is currently known. + + + + + + + If provided by the [XRInterface], this returns a mesh object for the anchor. For an anchor, this can be a shape related to the object being tracked or it can be a mesh that provides topology related to the anchor and can be used to create shadows/reflections on surfaces or for generating collision shapes. + + + + + + + Returns a plane aligned with our anchor; handy for intersection testing. + + + + + + + Returns the estimated size of the plane that was detected. For example, when the anchor relates to a table in the real world, this is the estimated size of the surface of that table. + + + + + + The anchor's ID. You can set this before the anchor itself exists. The first anchor gets an ID of [code]1[/code], the second an ID of [code]2[/code], etc. When anchors get removed, the engine can then assign the corresponding ID to new anchors. The most common situation where anchors "disappear" is when the AR server identifies that two anchors represent different parts of the same plane and merges them. + + + + + + + + Emitted when the mesh associated with the anchor changes or when one becomes available. This is especially important for topology that is constantly being updated.
+ + + + + + diff --git a/doc/classes/XRCamera3D.xml b/doc/classes/XRCamera3D.xml new file mode 100644 index 0000000000..4d86e24daa --- /dev/null +++ b/doc/classes/XRCamera3D.xml @@ -0,0 +1,17 @@ + + + + A camera node with a few overrules for AR/VR applied, such as location tracking. + + + This is a helper spatial node for our camera; note that, if stereoscopic rendering is applicable (VR-HMD), most of the camera properties are ignored, as the HMD information overrides them. The only properties that can be trusted are the near and far planes. + The position and orientation of this node is automatically updated by the XR Server to represent the location of the HMD if such tracking is available and can thus be used by game logic. Note that, in contrast to the XR Controller, the render thread has access to the most up-to-date tracking data of the HMD and the location of the XRCamera3D can lag a few milliseconds behind what is used for rendering as a result. + + + https://docs.godotengine.org/en/latest/tutorials/vr/index.html + + + + + + diff --git a/doc/classes/XRController3D.xml b/doc/classes/XRController3D.xml new file mode 100644 index 0000000000..e4a06a80db --- /dev/null +++ b/doc/classes/XRController3D.xml @@ -0,0 +1,106 @@ + + + + A spatial node representing a spatially-tracked controller. + + + This is a helper spatial node that is linked to the tracking of controllers. It also offers several handy passthroughs to the state of buttons and such on the controllers. + Controllers are linked by their ID. You can create controller nodes before the controllers are available. If your game always uses two controllers (one for each hand), you can predefine the controllers with ID 1 and 2; they will become active as soon as the controllers are identified. If you expect additional controllers to be used, you should react to the signals and add XRController3D nodes to your scene. + The position of the controller node is automatically updated by the [XRServer]. 
This makes this node ideal for adding child nodes that visualize the controller. + + + https://docs.godotengine.org/en/latest/tutorials/vr/index.html + + + + + + + If active, returns the name of the associated controller if provided by the AR/VR SDK used. + + + + + + + Returns the hand holding this controller, if known. See [enum XRPositionalTracker.TrackerHand]. + + + + + + + Returns [code]true[/code] if the bound controller is active. XR systems attempt to track active controllers. + + + + + + + + + Returns the value of the given axis for things like triggers, touchpads, etc. that are embedded into the controller. + + + + + + + Returns the ID of the joystick object bound to this. Every controller tracked by the [XRServer] that has buttons and axes will also be registered as a joystick within Godot. This means that all the normal joystick tracking and input mapping will work for buttons and axes found on the AR/VR controllers. This ID is purely offered as information so you can link up the controller with its joystick entry. + + + + + + + If provided by the [XRInterface], this returns a mesh associated with the controller. This can be used to visualize the controller. + + + + + + + + + Returns [code]true[/code] if the button at index [code]button[/code] is pressed. See [enum JoystickList], in particular the [code]JOY_VR_*[/code] constants. + + + + + + The controller's ID. + A controller ID of 0 is unbound and will always result in an inactive node. Controller ID 1 is reserved for the first controller that identifies itself as the left-hand controller and ID 2 is reserved for the first controller that identifies itself as the right-hand controller. + For any other controller that the [XRServer] detects, we continue with controller ID 3. + When a controller is turned off, its slot is freed. This ensures controllers will keep the same ID even when controllers with lower IDs are turned off.
Ranges from [code]0.0[/code] to [code]1.0[/code] with precision [code].01[/code]. If changed, updates [member XRPositionalTracker.rumble] accordingly. + This is a useful property to animate if you want the controller to vibrate for a limited duration. + + + + + + + + Emitted when a button on this controller is pressed. + + + + + + + Emitted when a button on this controller is released. + + + + + + + Emitted when the mesh associated with the controller changes or when one becomes available. Generally speaking this will be a static mesh after becoming available. + + + + + + diff --git a/doc/classes/XRInterface.xml b/doc/classes/XRInterface.xml new file mode 100644 index 0000000000..1985010223 --- /dev/null +++ b/doc/classes/XRInterface.xml @@ -0,0 +1,127 @@ + + + + Base class for an AR/VR interface implementation. + + + This class needs to be implemented to make an AR or VR platform available to Godot and these should be implemented as C++ modules or GDNative modules (note that for GDNative the subclass XRScriptInterface should be used). Part of the interface is exposed to GDScript so you can detect, enable and configure an AR or VR platform. + Interfaces should be written in such a way that simply enabling them will give us a working setup. You can query the available interfaces through [XRServer]. + + + https://docs.godotengine.org/en/latest/tutorials/vr/index.html + + + + + + + If this is an AR interface that requires displaying a camera feed as the background, this method returns the feed ID in the [CameraServer] for this interface. + + + + + + + Returns a combination of [enum Capabilities] flags providing information about the capabilities of this interface. + + + + + + + Returns the name of this interface (OpenVR, OpenHMD, ARKit, etc). + + + + + + + Returns the resolution at which we should render our intermediate results before things like lens distortion are applied by the VR platform. + + + + + + + If supported, returns the status of our tracking. 
This will allow you to provide feedback to the user whether there are issues with positional tracking. + + + + + + + Call this to initialize this interface. The first interface that is initialized is identified as the primary interface and it will be used for rendering output. + After initializing the interface you want to use, you then need to enable the AR/VR mode of a viewport and rendering should commence. + [b]Note:[/b] You must enable the AR/VR mode on the main viewport for any device that uses the main output of Godot, such as for mobile VR. + If you do this for a platform that handles its own output (such as OpenVR), Godot will show just one eye without distortion on screen. Alternatively, you can add a separate viewport node to your scene and enable AR/VR on that viewport. It will be used to output to the HMD, leaving you free to do anything you like in the main window, such as using a separate camera as a spectator camera or rendering something completely different. + While currently not used, you can activate additional interfaces. You may wish to do this if you want to track controllers from other platforms. However, at this point in time only one interface can render to an HMD. + + + + + + + Returns [code]true[/code] if the current output of this interface is in stereo. + + + + + + + Turns the interface off. + + + + + + On an AR interface, [code]true[/code] if anchor detection is enabled. + + + [code]true[/code] if this interface has been initialized. + + + [code]true[/code] if this is the primary interface. + + + + + No XR capabilities. + + + This interface can work with normal rendering output (non-HMD based AR). + + + This interface supports stereoscopic rendering. + + + This interface supports AR (video background and real world tracking). + + + This interface outputs to an external device.
If the main viewport is used, the on-screen output is an unmodified buffer of either the left or right eye (stretched if the viewport size is not changed to the same aspect ratio of [method get_render_targetsize]). Using a separate viewport node frees up the main viewport for other purposes. + + + Mono output; this is mostly used internally when retrieving positioning information for our camera node or when stereoscopic rendering is not supported. + + + Left eye output; this is mostly used internally when rendering the image for the left eye and obtaining positioning and projection information. + + + Right eye output; this is mostly used internally when rendering the image for the right eye and obtaining positioning and projection information. + + + Tracking is behaving as expected. + + + Tracking is hindered by excessive motion (the player is moving faster than tracking can keep up). + + + Tracking is hindered by insufficient features; it's too dark (for camera-based tracking), the player is blocked, etc. + + + We don't know the status of the tracking, or this interface does not provide feedback. + + + Tracking is not functional (camera not plugged in or obscured, lighthouses turned off, etc.). + + + diff --git a/doc/classes/XROrigin3D.xml b/doc/classes/XROrigin3D.xml new file mode 100644 index 0000000000..57cf673d30 --- /dev/null +++ b/doc/classes/XROrigin3D.xml @@ -0,0 +1,25 @@ + + + + The origin point in AR/VR. + + + This is a special node within the AR/VR system that maps the physical location of the center of our tracking space to the virtual location within our game world. + There should be only one of these nodes in your scene and you must have one. All the XRCamera3D, XRController3D and XRAnchor3D nodes should be direct children of this node for spatial tracking to work correctly. + You update the position of this node when your character needs to move through the game world while the player is not moving in the real world.
Movement in the real world is always in relation to this origin point. + For example, if your character is driving a car, the XROrigin3D node should be a child node of this car. Or, if you're implementing a teleport system to move your character, you should change the position of this node. + + + https://docs.godotengine.org/en/latest/tutorials/vr/index.html + + + + + + Allows you to adjust the scale to your game's units. Most AR/VR platforms assume a scale of 1 game world unit = 1 real world meter. + [b]Note:[/b] This method is a passthrough to the [XRServer] itself. + + + + + diff --git a/doc/classes/XRPositionalTracker.xml b/doc/classes/XRPositionalTracker.xml new file mode 100644 index 0000000000..2f7cc21703 --- /dev/null +++ b/doc/classes/XRPositionalTracker.xml @@ -0,0 +1,111 @@ + + + + A tracked object. + + + An instance of this object represents a device that is tracked, such as a controller or anchor point. HMDs aren't represented here as they are handled internally. + As controllers are turned on and the AR/VR interface detects them, instances of this object are automatically added to this list of active tracking objects accessible through the [XRServer]. + The [XRController3D] and [XRAnchor3D] both consume objects of this type and should be used in your project. The positional trackers are just under-the-hood objects that make this all work. These are mostly exposed so that GDNative-based interfaces can interact with them. + + + https://docs.godotengine.org/en/latest/tutorials/vr/index.html + + + + + + + Returns the hand holding this tracker, if known. See [enum TrackerHand] constants. + + + + + + + If this is a controller that is being tracked, the controller will also be represented by a joystick entry with this ID. + + + + + + + Returns the mesh related to a controller or anchor point if one is available. + + + + + + + Returns the controller or anchor point's name if available. + + + + + + + Returns the controller's orientation matrix. 
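The teleport approach described for XROrigin3D above can be sketched in GDScript. This is a minimal illustration, not engine code; the function and variable names are assumptions, and it relies only on the documented fact that moving the origin node moves the whole tracked space:

```gdscript
# Teleport sketch: moving the XROrigin3D moves the entire tracked space,
# so the XRCamera3D and XRController3D children come along with it.
# "origin" and "target" are illustrative names, not part of the API.
func teleport_to(origin: XROrigin3D, target: Vector3) -> void:
    var t = origin.global_transform
    # Only the translation changes; tracked orientation stays intact.
    t.origin = target
    origin.global_transform = t
```

For the car example above, you would instead parent the XROrigin3D node to the car and let the car's movement carry the tracked space.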
+ + + + + + + Returns the world-space controller position. + + + + + + + Returns the internal tracker ID. This uniquely identifies the tracker per tracker type and matches the ID you need to specify for nodes such as the [XRController3D] and [XRAnchor3D] nodes. + + + + + + + Returns [code]true[/code] if this device tracks orientation. + + + + + + + Returns [code]true[/code] if this device tracks position. + + + + + + + + + Returns the transform combining this device's orientation and position. + + + + + + + Returns the tracker's type. + + + + + + The degree to which the tracker rumbles. Ranges from [code]0.0[/code] to [code]1.0[/code] with precision [code]0.01[/code]. + + + + + The hand this tracker is held in is unknown or not applicable. + + + This tracker is the left-hand controller. + + + This tracker is the right-hand controller. + + + diff --git a/doc/classes/XRServer.xml b/doc/classes/XRServer.xml new file mode 100644 index 0000000000..5e6002aee3 --- /dev/null +++ b/doc/classes/XRServer.xml @@ -0,0 +1,188 @@ + + + + Server for AR and VR features. + + + The AR/VR server is the heart of our Augmented and Virtual Reality solution and handles all the processing. + + + https://docs.godotengine.org/en/latest/tutorials/vr/index.html + + + + + + + + + + + This is an important function to understand correctly. AR and VR platforms all handle positioning slightly differently. + For platforms that do not offer spatial tracking, our origin point (0,0,0) is the location of our HMD, but you have little control over the direction the player is facing in the real world. + For platforms that do offer spatial tracking, our origin point depends very much on the system. For OpenVR, our origin point is usually the center of the tracking space, on the ground. For other platforms, it's often the location of the tracking camera. + This method allows you to center your tracker on the location of the HMD.
It will take the current location of the HMD and use that to adjust all your tracking data; in essence, realigning the real world to your player's current position in the game world. + For this method to produce usable results, tracking information must be available. This often takes a few frames after starting your game. + You should call this method after a few seconds have passed. For instance, when the user requests a realignment of the display by holding a designated button on a controller for a short period of time, or when implementing a teleport mechanism. + + + + + + + + + Finds an interface by its name. For instance, if your project uses capabilities of an AR/VR platform, you can find the interface for that platform by name and initialize it. + + + + + + + Returns the primary interface's transformation. + + + + + + + + + Returns the interface registered at a given index in our list of interfaces. + + + + + + + Returns the number of interfaces currently registered with the AR/VR server. If your project supports multiple AR/VR platforms, you can look through the available interfaces, and either present the user with a selection or simply try to initialize each interface and use the first one that returns [code]true[/code]. + + + + + + + Returns a list of available interfaces, with the ID and name of each interface. + + + + + + + Returns the absolute timestamp (in μs) of the last [XRServer] commit of the AR/VR eyes to [RenderingServer]. The value comes from an internal call to [method OS.get_ticks_usec]. + + + + + + + Returns the duration (in μs) of the last frame. This is computed as the difference between [method get_last_commit_usec] and [method get_last_process_usec] when committing. + + + + + + + Returns the absolute timestamp (in μs) of the last [XRServer] process callback. The value comes from an internal call to [method OS.get_ticks_usec]. + + + + + + + Returns the reference frame transform. Mostly used internally and exposed for GDNative build interfaces.
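The interface-discovery pattern described for [method get_interface_count] above (walk the registered interfaces and use the first one that initializes) can be sketched as follows. This is a hedged illustration: the viewport AR/VR flag name after this rename is an assumption, and `start_first_available_interface` is a hypothetical helper, not engine API:

```gdscript
# Sketch: try each registered interface and keep the first that initializes.
# For a known platform you could instead use XRServer.find_interface("OpenVR").
func start_first_available_interface() -> bool:
    for i in range(XRServer.get_interface_count()):
        var interface = XRServer.get_interface(i)
        if interface.initialize():
            # Enable AR/VR output on the main viewport, as required for
            # devices that use Godot's main output (e.g. mobile VR).
            get_viewport().xr = true  # property name assumed post-rename
            return true
    return false
```

The first successfully initialized interface becomes the primary interface used for rendering, as described in [method XRInterface.initialize].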
+ + + + + + + + + Returns the positional tracker at the given ID. + + + + + + + Returns the number of trackers currently registered. + + + + + + The primary [XRInterface] currently bound to the [XRServer]. + + + Allows you to adjust the scale to your game's units. Most AR/VR platforms assume a scale of 1 game world unit = 1 real world meter. + + + + + + + + Emitted when a new interface has been added. + + + + + + + Emitted when an interface is removed. + + + + + + + + + + + Emitted when a new tracker has been added. If you don't use a fixed number of controllers or if you're using [XRAnchor3D]s for an AR solution, it is important to react to this signal to add the appropriate [XRController3D] or [XRAnchor3D] nodes related to this new tracker. + + + + + + + + + + + Emitted when a tracker is removed. You should remove any [XRController3D] or [XRAnchor3D] points if applicable. This is not mandatory; the nodes simply become inactive and will be made active again when a new tracker becomes available (i.e. a new controller is switched on that takes the place of the previous one). + + + + + + The tracker tracks the location of a controller. + + + The tracker tracks the location of a base station. + + + The tracker tracks the location and size of an AR anchor. + + + Used internally to filter trackers of any known type. + + + Used internally if we haven't set the tracker type yet. + + + Used internally to select all trackers. + + + Fully resets the orientation of the HMD, regardless of the direction the user is looking in the real world. The user will look dead ahead in the virtual world. + + + Resets the orientation but keeps the tilt of the device. So if we're looking down, we keep looking down, but our heading will be reset. + + + Does not reset the orientation of the HMD; only the position of the player gets centered. + + + -- cgit v1.2.3
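The advice attached to the tracker-added signal above (create the matching node when a tracker appears) could look roughly like the sketch below. The signal argument order, the `TRACKER_CONTROLLER` constant name and the `controller_id` property are assumptions based on this documentation and the pre-rename API, since the patch's tag markup is not preserved here:

```gdscript
# Sketch: spawn an XRController3D for each controller tracker that appears.
# Assumes this script sits on the XROrigin3D node, since tracked nodes
# must be its direct children for spatial tracking to work.
func _ready() -> void:
    XRServer.connect("tracker_added", self, "_on_tracker_added")

func _on_tracker_added(tracker_name: String, type: int, id: int) -> void:
    if type == XRServer.TRACKER_CONTROLLER:
        var controller = XRController3D.new()
        controller.controller_id = id  # assumed property name after rename
        add_child(controller)
```

Handling the tracker-removed signal is optional, as noted above; an XRController3D whose tracker disappears simply becomes inactive until a new tracker takes its place.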