Also, fix a one-frame glitch during snap turning by updating the sensorToWorld matrix after MyAvatar::updateOrientation rotates the avatar, but before we perform IK.
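A minimal sketch of the intended per-frame ordering; apart from MyAvatar::updateOrientation, the function names here are illustrative placeholders, not the exact API:

```cpp
// Illustrative per-frame ordering: re-derive sensorToWorld between the
// snap turn and the IK solve so the two never disagree for a frame.
void MyAvatar::update(float deltaTime) {
    updateOrientation(deltaTime);   // may apply a snap turn this frame
    updateSensorToWorldMatrix();    // re-derive sensorToWorld from the new orientation
    performIK(deltaTime);           // IK targets now agree with the rotated avatar
}
```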
This restores the calculation of the Head left and right eye positions used for eye tracking, which was inadvertently removed in commit 7483b8546b.
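For context, a minimal sketch of one way such eye positions can be derived, assuming fixed local eye offsets relative to the head joint; the helper and its names are illustrative, not the restored code:

```cpp
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

// Hypothetical helper: derive a world-space eye position from the head
// joint's world transform plus a fixed local offset.
glm::vec3 eyeWorldPosition(const glm::vec3& headPosition,
                           const glm::quat& headRotation,
                           const glm::vec3& localEyeOffset) {
    return headPosition + headRotation * localEyeOffset;
}
```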
Store hand controller positions within the avatar in sensor space, not world space.
Before IK, the sensorToWorld matrix is updated to reflect the world-space motion of the
character controller during physics. This ensures the IK hand targets move properly with the character.
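As a sketch of why sensor space works here: a stored sensor-space pose can be resolved to world space each frame through the freshly updated sensorToWorld matrix. The names below are illustrative, and this assumes a rigid sensorToWorld (rotation plus translation, no scale):

```cpp
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

struct Pose {
    glm::vec3 translation;
    glm::quat rotation;
};

// Resolve a sensor-space hand pose into world space using the
// sensorToWorld matrix updated after physics has moved the character.
Pose sensorPoseToWorld(const glm::mat4& sensorToWorld, const Pose& sensorPose) {
    Pose worldPose;
    worldPose.translation = glm::vec3(sensorToWorld * glm::vec4(sensorPose.translation, 1.0f));
    worldPose.rotation = glm::quat_cast(sensorToWorld) * sensorPose.rotation;
    return worldPose;
}
```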
This extraordinary event can occur if a MessageBox is popped up by the OpenGL driver.
* removed AvatarData::avatarLock
* removed AvatarUpdate class
This code was left over from an earlier avatar threading experiment.
Before this fix, a script could call into HMD.getHUDLookAtPosition2D() while the app was shutting down, which in turn would call
getHeadPose() on the currently active display plugin. This call could crash within the OpenVR plugin, because the SDK had either shut down or was in the process of shutting down on the main thread.
This fixes the crash by splitting the previous DisplayPlugin::getHeadPose(int) into two parts:
* updateHeadPose(int), which is called only once per frame and only by the main thread.
* getHeadPose(), which is thread-safe and returns a cached copy of the HMD pose sampled by the last updateHeadPose.
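A minimal sketch of this split, assuming a mutex-guarded cache; sampleHmdPose is a hypothetical stand-in for the real SDK query, not the OpenVR API:

```cpp
#include <mutex>
#include <glm/glm.hpp>

class ExampleDisplayPlugin {
public:
    // Main thread, once per frame: sample the SDK and refresh the cache.
    void updateHeadPose(uint32_t frameIndex) {
        glm::mat4 pose = sampleHmdPose(frameIndex);
        std::lock_guard<std::mutex> guard(_poseMutex);
        _cachedHeadPose = pose;
    }

    // Any thread: return the cached pose without touching the SDK.
    glm::mat4 getHeadPose() const {
        std::lock_guard<std::mutex> guard(_poseMutex);
        return _cachedHeadPose;
    }

private:
    // Stand-in for a real SDK query; a real plugin would sample the HMD here.
    glm::mat4 sampleHmdPose(uint32_t frameIndex) {
        (void)frameIndex;
        return glm::mat4(1.0f);
    }

    mutable std::mutex _poseMutex;
    glm::mat4 _cachedHeadPose { 1.0f };
};
```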
Instead, we just store two controller::Pose objects in MyAvatar.
Existing behavior and scripting APIs have been preserved.
The hand controller debug drawing is slightly different, but still works.
Users in desktop mode should now see the eyes shift focus between the left eye, right eye, and mouth.
For users in mirror mode or with a third-person camera, the look-at logic should now more accurately determine which avatar to look at.
Pass an InputCalibrationData to each inputPlugin and inputDevice.
This contains the most up-to-date sensorToWorldMatrix, avatarMat, and hmdSensorMatrix.
Each input plugin can use this data to transform its poses into avatar space
before sending them up the chain.
This fixes a bug in the handControllerGrab.js script that relied on the hand controller
rotations/positions being in the avatar frame.
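A minimal sketch of the transform each plugin can apply, assuming the matrices map sensor space and avatar space into world space; the field names follow the description above, but the struct and helper themselves are illustrative:

```cpp
#include <glm/glm.hpp>

// Calibration payload as described above: sensor->world, avatar->world,
// and the HMD pose in sensor space.
struct InputCalibrationData {
    glm::mat4 sensorToWorldMat;
    glm::mat4 avatarMat;
    glm::mat4 hmdSensorMat;
};

// Take a pose expressed in sensor space into avatar space:
// sensor -> world via sensorToWorldMat, then world -> avatar via
// the inverse of avatarMat.
glm::mat4 sensorToAvatar(const InputCalibrationData& calibration,
                         const glm::mat4& poseInSensorFrame) {
    return glm::inverse(calibration.avatarMat)
         * calibration.sensorToWorldMat
         * poseInSensorFrame;
}
```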
Moved velocity and angularVelocity into the SpatiallyNestable base class.
Entity velocity and angularVelocity properties are now relative to their parent, similar to the way position and orientation work for entities.
MyAvatar rig animations now use SpatiallyNestable to convert velocity into the local frame to drive the animation state machine.
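A minimal sketch of the composition this implies, assuming standard rigid-body kinematics and illustrative names (not the actual SpatiallyNestable API): the child's world velocity is the parent's velocity, plus the tangential velocity from the parent's spin about the offset, plus the local velocity rotated into world.

```cpp
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

// Compose a parent-relative velocity into world space. Parameter names
// are illustrative placeholders, not SpatiallyNestable members.
glm::vec3 localToWorldVelocity(const glm::vec3& parentVelocity,        // world frame
                               const glm::vec3& parentAngularVelocity, // world frame, rad/s
                               const glm::quat& parentOrientation,
                               const glm::vec3& localPosition,
                               const glm::vec3& localVelocity) {
    glm::vec3 worldOffset = parentOrientation * localPosition;
    return parentVelocity
         + glm::cross(parentAngularVelocity, worldOffset) // tangential term
         + parentOrientation * localVelocity;             // local motion in world
}
```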