* The space bubble around a player's avatar is now visualized. When another avatar enters a player's bubble, the bubble visualization appears, a soft tone plays, and the "Bubble" HUD button flashes.
* The space bubble radius setting has been removed. Space bubble size now scales based on avatar scale.
* Space bubble collision detection is now more accurate and reliable.
* CTRL + N toggles the bubble.
* The "Bubble" HUD button has been moved to the proper location.
away.js calls MyAvatar.centerBody; however, centerBody is only meaningful in HMD mode.
To guard against this, MyAvatar::centerBody is now a no-op if the application is not in HMD mode.
Previously, hitting ' (a.k.a. reset sensors) would teleport you to the last
position you were in before you switched into desktop mode, or to the spawn location.
This fixes that behavior by not re-centering the avatar while in desktop mode.
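The same guard can be expressed at the script level; a minimal sketch (HMD.active is the standard scripting property indicating whether an HMD is in use):

    // Only re-center the body while an HMD is active; in desktop mode
    // MyAvatar.centerBody() is now a no-op, so guarding in the script
    // just makes the intent explicit.
    if (HMD.active) {
        MyAvatar.centerBody();
    }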
The internal MyAvatar::_sensorToWorldMatrix was being updated properly; however,
the MyAvatar::_sensorToWorldMatrixCache was not. To fix this, I've introduced
a new "mode" for updateSensorToWorldMatrix that does not change the sensorToWorldMatrix at all,
but properly copies the value to the cache and also updates hand controller poses, etc.
When using the third-person camera in HMD mode, if the controllers are shown,
they should appear in front of the user's camera, not in front of the user's avatar.
To accomplish this, two new faux joint indices are introduced.
CAMERA_RELATIVE_CONTROLLER_RIGHTHAND_INDEX and CAMERA_RELATIVE_CONTROLLER_LEFTHAND_INDEX.
These joint indices can be used for Overlay parenting. (But not for entity parenting because they are not transmitted over the network).
They can also be queried via the MyAvatar.getAbsoluteJointRotationInObjectFrame() call.
These new indices are now used by controllerDisplay.js for hand controller rendering.
They are also used by system/libraries/controllers.js as the origin for hand controller grabbing and interaction lasers.
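For example, here is a hedged sketch of parenting an overlay to one of the new faux joints; the "_CAMERA_RELATIVE_CONTROLLER_RIGHTHAND" lookup name is an assumption:

    // Attach a small cube overlay to the camera-relative right-hand joint, so it
    // tracks the hand controller relative to the camera rather than the avatar.
    var rightHandIndex = MyAvatar.getJointIndex("_CAMERA_RELATIVE_CONTROLLER_RIGHTHAND");
    var cube = Overlays.addOverlay("cube", {
        parentID: MyAvatar.sessionUUID,
        parentJointIndex: rightHandIndex,
        localPosition: { x: 0, y: 0, z: 0 },
        size: 0.05,
        solid: true
    });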
Reverse the order we acquire the entityTree & holdActions locks, to avoid deadlocks when the network thread also acquires them.
The network thread does this when hold actions from other avatars are received.
In the game loop, physics occurs before avatar update.
Before this PR, when the avatar was moved during the avatar update, near-grabbed objects would not pick up the move until one frame later, when
physics ran on the next update.
After this PR, near-grabbed objects are adjusted to reflect any position or rotation change that occurred during the avatar update.
* follow helper lean re-centering / reconciliation now modifies bodySensorMatrix, NOT the character controller.
* The character controller now always follows the bodySensorMatrix (in world space).
This decouples the lean re-centering velocity from the velocity used to move the character controller.
We can now tune these two things independently.
* Navigation walk speed has been reduced
* Tuned IdleToWalk timescale and interp time to reduce foot sliding
* Tuned fwd, back and lateral characteristicSpeeds to better match the
animations. This reduces foot sliding when moving forward and backward.
* Reduced rig state machine hysteresis to 1/60th of a second.
This is fixed by using the pre-action velocity from CharacterController, which does not include any motors or follow velocity.
This pre-action velocity reflects the actual rigid body velocity after collision constraints are resolved.
This should prevent the character f
We now properly account for the offset of the head due to clamping from a small maxHipsOffset.
This means the hands should look more natural when you are out-of-body and are moving your hand controllers.
An AnimContext class was introduced. This context is passed into every node during evaluation/overlay.
It holds non-animVar "global" data passed from the application.
You can do this by parenting an entity to an avatar's -2 joint index.
This means the entity will follow the avatar as it moves in the world, but
will not follow the avatar's position as it moves in sensor space. Essentially, this
gives you the ability to place objects in the user's physical room.
WebTablets now use this feature and no longer jitter.
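A minimal sketch of the technique, using standard entity properties (the box itself is illustrative):

    // Pin a box one meter in front of the sensor (room) origin. Joint index -2 is
    // the faux sensor-to-world joint described above: the box follows the avatar
    // through the world, but stays fixed in the user's physical room while they
    // walk around in sensor space.
    var roomBox = Entities.addEntity({
        type: "Box",
        parentID: MyAvatar.sessionUUID,
        parentJointIndex: -2,
        localPosition: { x: 0, y: 1, z: -1 },
        dimensions: { x: 0.2, y: 0.2, z: 0.2 }
    });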
* Removed MyAvatar.reset() access from JavaScript
* Added HMD.centerUI() to JavaScript, which can be used to reset the 3D UI sphere around the current HMD orientation.
* Added MyAvatar.clearIKJointLimitHistory(), which can be used to reset any remembered IK joint limit history.
* Added MyAvatar.centerBody(), which can be used to instantly re-orient the avatar's body so that the hips and toes
face the same direction as the current HMD orientation.
away.js now uses the above new APIs instead of MyAvatar.reset().
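For instance, a script that previously called MyAvatar.reset() can now do the equivalent with the finer-grained calls; a sketch of what away.js does:

    HMD.centerUI();                      // re-center the 3D UI sphere on the current HMD orientation
    MyAvatar.clearIKJointLimitHistory(); // forget accumulated IK joint limit state
    MyAvatar.centerBody();               // point the hips and toes at the current HMD orientation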
* When the overlay is hidden because your head is too close to the sphere,
instead of coming back immediately, it waits until the avatar's velocity is near zero
for a period of time.
* Hooked up jump and fly to MyAvatar::hasDriveInput()
* Added an internal state machine to OverlayConductor to manage hiding/showing transitions.
* The overlay menu state is now tied directly to the overlay, so it will change state as the
overlay is dynamically hidden/shown from code.
* Removed the slot going directly from MenuOption::Overlays to OverlayConductor::setEnable().
Previously the HUD fading in/out would also re-center the HMD sensor and the avatar, which caused many problems, including:
* The user's view could shift vertically.
* The user's avatar would briefly go into T-pose.
* Other users would see that avatar go into T-pose.
Now we move the UI sphere instead, which results in a much smoother experience.
MyAvatar: added hasDriveInput method.
OverlayConductor:
* removed avatar and sensor reset, instead the overlay's modelTransform is changed.
* revived STANDING mode, which is active if myAvatar->getClearOverlayWhenDriving() is true and you are wearing an HMD.
* SITTING & FLAT mode should be unchanged.
* Instead of using avatar velocity to fade the HUD out and in, we use the presence or absence of avatar drive input.
* Additionally, we check the distance to the UI sphere, and quickly re-center the HUD if the user's head is too close to the actual HUD sphere.
CompositorHelper:
* Bug fixes for ray picks not using the modelTransform.
HmdDisplayPlugin:
* Bug fixes for rendering not using the modelTransform.
This is used to disable the 'room-scale' avatar re-centering code.
Disabling this can prevent sliding when the avatar is supposed to be sitting or mounted on a stationary object.
Also, removed a bunch of old, unused leaning and torso twisting code.
* Bug fix for eye tracking in HMD; the "up" orientation of your eyes now matches your head.
* DebugDraw: added drawRay method.
* Application: Renamed preRender to postUpdate
* AvatarManager: added postUpdate method that iterates over all avatars.
* MyAvatar: Renamed preRender to preDisplaySide
* MyAvatar: split preRender code into postUpdate and preDisplaySide.
* Removed "Show who is looking at me", "Render focus indicator" and "Render lookat target" debug draw.
* Split "Show Look At Vectors" into "Show My Look At Vectors" and "Show Other Look At Vectors", to make it easier to debug eye tracking.
* "Show Look at Vectors" now draws the right eye red and the left eye blue.
* Removed Avatar and MyAvatar renderBody
* Removed look at rendering from head.
* GLMHelpers: Bug fix for generateBasisVectors when the primary and secondary axes were orthogonal.
* After the eyes change targets, there is now a minimum 1/3-second delay before they will change again.
* Changing gaze from an eye to the mouth is less likely than switching between eyes.
Improves FTUE by no longer going over the network to download default avatar animations.
This also includes support for relative animation URLs within the animation.json.
Also fixes a one-frame glitch during snap turning, by updating the sensorToWorld matrix
after MyAvatar::updateOrientation rotates the avatar, but before we perform IK.
This replaces the calculation of the head's left and right eye positions used for eye tracking,
which was inadvertently removed in commit 7483b8546b.
Store hand controller positions within the avatar in sensor space, not world space.
Before IK the sensorToWorld matrix is updated to reflect the world space motion of the
character controller during physics. This ensures the IK hand targets move properly with the character.
This extraordinary event can occur if a MessageBox is popped up by the OpenGL driver.
* removed AvatarData::avatarLock
* removed AvatarUpdate
This code was left over from an earlier avatar threading experiment.
Before this fix, a script could call into HMD.getHUDLookAtPosition2D() while the app was shutting down, which in turn would call
getHeadPose() on the currently active display plugin. This call could cause a crash within the openvr plugin, because the SDK was either shutdown, or in the process of shutting down on the main thread.
This fixes the crash by splitting the previous DisplayPlugin::getHeadPose(int) into two parts:
* updateHeadPose(int), which is only called once per frame and only by the main thread.
* getHeadPose(), which is thread-safe and returns a cached copy of the HMD pose sampled by the last updateHeadPose.
Instead, we just store two controller::Poses in MyAvatar.
Existing behavior and scripting APIs have been preserved.
The hand controller debug drawing is slightly different, but still works.
Users in desktop mode should now see the eyes change focus between the left eye, right eye and the mouth.
Users in mirror mode or third-person camera should find that their avatar more accurately determines which avatar to look at.
Pass an InputCalibrationData to each inputPlugin and inputDevice.
This contains the most up-to-date sensorToWorldMatrix, avatarMat, and hmdSensorMatrix.
Each input plugin can use this data to transform its poses into avatar space
before sending it up the chain.
This fixes a bug in the handControllerGrab.js script that relied on the hand controller
rotation/positions being in the avatar frame.
Moved velocity and angularVelocity into the SpatiallyNestable base class.
Entity velocity and angularVelocity properties are now relative to their parent, similar to the way position and orientation work for entities.
MyAvatar rig animations now use SpatiallyNestable to convert velocity into the local frame to drive the animation state machine.
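A hedged sketch of what parent-relative velocity means in practice (the entities and values are illustrative):

    // The parent box moves through the world at 1 m/s; the child's velocity is
    // zero *relative to its parent*, so it simply rides along with the parent.
    var parentID = Entities.addEntity({
        type: "Box",
        position: Vec3.sum(MyAvatar.position, { x: 0, y: 1, z: -2 }),
        dimensions: { x: 0.5, y: 0.5, z: 0.5 },
        velocity: { x: 1, y: 0, z: 0 }
    });
    var childID = Entities.addEntity({
        type: "Sphere",
        parentID: parentID,
        localPosition: { x: 0, y: 0.5, z: 0 },
        dimensions: { x: 0.2, y: 0.2, z: 0.2 },
        velocity: { x: 0, y: 0, z: 0 }
    });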
These options are for developers only and might help debug animation-related issues.
* Enable Inverse Kinematics: this can be toggled to disable IK for the avatar.
* Enable Anim Pre and Post Rotations: this option can be toggled to use the FBX pre-rotations from the source avatar animations, instead of the current default, which is to use them from the source model.
This only affects FBX files loaded by the animation system; it does not affect changing model orientations via JavaScript.
Compute HMD facing moving average.
When the moving average diverges from the hips by more than 45 degrees, re-center the body.
Also, the follow code has been changed: instead of a follow velocity being passed to the CharacterController,
a desired target is passed. The CharacterController homes toward its target based on the time remaining.
Any follow deltas applied to move the avatar's position closer to its target are stored and re-applied
to the bodySensorMatrix. This centralizes the moving/homing code in one place, the CharacterController.
A new FollowHelper class was also introduced; it groups together the data and logic necessary to perform the
re-centering/follow procedure. This "hopefully" makes it easier to maintain.
Added an isSoft field to the AttachmentData which is edited
by the Attachment Dialog Menu, sent over the network via
AvatarData identity packets and saved in the Interface.ini preferences.
AvatarData and AvatarBulkData version number has been bumped.
Changed Avatar attachment collections to use smart pointers to models
instead of raw ones. Removed _unusedAttachmentModels.
I don't think the caching was worth the added code complexity.
New JavaScript API to get the avatar's default pose.
MyAvatar.getDefaultJointRotation(index);
MyAvatar.getDefaultJointTranslation(index);
See `examples/tPose.js` for example usage
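A condensed sketch in the spirit of that example (assuming getJointNames() covers every joint index):

    // Drive every joint to its default pose using the new API.
    var numJoints = MyAvatar.getJointNames().length;
    for (var i = 0; i < numJoints; i++) {
        MyAvatar.setJointData(i,
            MyAvatar.getDefaultJointRotation(i),
            MyAvatar.getDefaultJointTranslation(i));
    }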