Previously we used an infinitely tall vertical cylinder to push hand IK targets out of the body;
this had the side effect of preventing the hands from being raised over the head.
Now we collide against the same 3D capsule used by the physics system for avatar collisions.
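A minimal sketch of the idea, assuming a glm-based helper; the function and parameter names are illustrative, not the engine's actual API:

```cpp
#include <glm/glm.hpp>

// Push a hand IK target out of a capsule defined by the segment
// [capsuleStart, capsuleEnd] and a radius (the avatar's collision capsule).
glm::vec3 pushPointOutOfCapsule(const glm::vec3& point,
                                const glm::vec3& capsuleStart,
                                const glm::vec3& capsuleEnd,
                                float radius) {
    // closest point on the capsule's axis segment
    glm::vec3 axis = capsuleEnd - capsuleStart;
    float t = glm::clamp(glm::dot(point - capsuleStart, axis) / glm::dot(axis, axis), 0.0f, 1.0f);
    glm::vec3 closest = capsuleStart + t * axis;

    glm::vec3 offset = point - closest;
    float dist = glm::length(offset);
    if (dist >= radius || dist < 1.0e-6f) {
        return point;  // already outside the capsule (or degenerate), leave it alone
    }
    // push the target radially out to the capsule surface
    return closest + (offset / dist) * radius;
}
```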
There were three things causing issues with eye look-at vectors while wearing an HMD:
1) The matrix returned by AvatarUpdate->getHeadPose() was in the wrong space; it should be in avatar space,
but it was actually returning a matrix in sensor/room space.
2) The lookAtPosition was incorrect while wearing an HMD when there were no avatars to look at.
3) The eye rotation limits in Rig::updateEyeJoint were relative to the model's zero orientation, NOT relative to the head.
This caused the eyes to hit their limits when the avatar's head turned (see the sketch after this list).
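A hedged sketch of the fix for (3), assuming glm quaternions; the function name and the single cone-angle limit are illustrative simplifications of the actual per-axis limits:

```cpp
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

// Clamp the desired eye rotation so it never swings more than maxAngleRadians
// away from the head's current orientation (rather than the model's zero orientation).
glm::quat clampEyeRotation(const glm::quat& headRotation,
                           const glm::quat& desiredEyeRotation,
                           float maxAngleRadians) {
    // express the desired eye rotation in the head's local frame
    glm::quat deltaFromHead = glm::inverse(headRotation) * desiredEyeRotation;
    float angle = glm::angle(deltaFromHead);
    if (angle > maxAngleRadians) {
        // limit the swing but keep the same rotation axis
        deltaFromHead = glm::angleAxis(maxAngleRadians, glm::axis(deltaFromHead));
    }
    // re-express the clamped rotation in the avatar/model frame
    return headRotation * deltaFromHead;
}
```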
* When computing tipPosition for the next iteration of the CCD,
use the leverArm before it is projected onto the lowerSpine twist axis.
* Fix for an acos() argument that could fall outside the valid domain [-1.0, 1.0] (see the clamp sketch after this list).
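A minimal sketch of the acos() guard, assuming a glm-style helper (names are illustrative):

```cpp
#include <glm/glm.hpp>
#include <cmath>

// Floating-point error can push a dot product of unit vectors slightly outside
// [-1.0, 1.0], which makes acos() return NaN; clamp before converting to an angle.
float angleBetween(const glm::vec3& u, const glm::vec3& v) {
    float cosAngle = glm::dot(glm::normalize(u), glm::normalize(v));
    return acosf(glm::clamp(cosAngle, -1.0f, 1.0f));
}
```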
Now there are two sets of jump takeoff and in-air animations.
* Run - Used when the character jumps or falls with a small forward velocity.
* Standing - Used when the character jumps or falls in-place or backward.
CharacterController
* increased takeoff duration to 250 ms
* increased takeoff to fly duration to 1100 ms
* added standing jump and in-air animations
* added a 250 millisecond delay between ground and hover, to prevent going into hover when walking over cracks (see the sketch after this list)
* take-off to in-air transitions now use the new snapshotPrev interp type for smoother tweening.
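A sketch of the ground-to-hover delay, under the assumption that the controller tracks how long it has been continuously off the ground; the class and constant names are illustrative, not the actual CharacterController code:

```cpp
#include <cstdint>

enum class MotionState { Ground, InAir, Hover };

class GroundToHoverFilter {
public:
    static constexpr uint64_t GROUND_TO_HOVER_DELAY_USECS = 250 * 1000;  // 250 ms

    MotionState update(bool onGround, uint64_t nowUsecs) {
        if (onGround) {
            _lastOnGroundUsecs = nowUsecs;  // reset whenever the feet touch down
            return MotionState::Ground;
        }
        // brief off-ground intervals (e.g. walking over cracks) stay "in air"
        // until the delay has elapsed, so we don't pop into hover
        if (nowUsecs - _lastOnGroundUsecs < GROUND_TO_HOVER_DELAY_USECS) {
            return MotionState::InAir;
        }
        return MotionState::Hover;
    }

private:
    uint64_t _lastOnGroundUsecs { 0 };
};
```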
interpType defines how the interpolation between two states is performed (a conceptual sketch follows this list).
* SnapshotBoth: Stores two snapshots: the previous animation before interpolation begins and the target state at the
interpTarget frame. During the interpolation period the two snapshots are interpolated to produce smooth motion between them.
* SnapshotPrev: Stores a snapshot of the previous animation before interpolation begins; however, the target state is
evaluated dynamically. During the interpolation period the previous snapshot is interpolated with the dynamically
evaluated target pose to produce smooth motion between them. This mode is useful for interpolating into a blended
animation where the actual blend factor is not known at the start of the interpolation, or might change dramatically during it.
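A conceptual sketch contrasting the two modes; Pose, blend(), and the dynamically evaluated target pose are illustrative placeholders, not the engine's state-machine code:

```cpp
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

enum class InterpType { SnapshotBoth, SnapshotPrev };

struct Pose {
    glm::quat rot;
    glm::vec3 trans;
};

Pose blend(const Pose& a, const Pose& b, float alpha) {
    return { glm::slerp(a.rot, b.rot, alpha), glm::mix(a.trans, b.trans, alpha) };
}

// alpha ramps from 0 to 1 over the interp duration; currentTargetPose is
// re-evaluated by the caller every frame.
Pose interpolate(InterpType type, float alpha,
                 const Pose& prevSnapshot, const Pose& targetSnapshot,
                 const Pose& currentTargetPose) {
    if (type == InterpType::SnapshotBoth) {
        // both endpoints were captured once, when the transition began
        return blend(prevSnapshot, targetSnapshot, alpha);
    }
    // SnapshotPrev: the target tracks a blend whose factor may still be changing
    return blend(prevSnapshot, currentTargetPose, alpha);
}
```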
These options are for developers only and might help debug animation-related issues.
* Enable Inverse Kinematics: this can be toggled to disable IK for the avatar.
* Enable Anim Pre and Post Rotations: when enabled, FBX pre-rotations are taken from the source avatar animations instead of the current default, which is to take them from the source model.
This only affects FBX files loaded by the animation system; it does not affect changing model orientations via JavaScript.
Hand animations now have 5 states:
* idle
* open
* grasp
* point
* farGrasp
The handControllerGrab.js script now chooses one of these five animations, based on the state of the HandController object.
Also, removed the hand trigger AnimVar setting from the C++ Rig class.
This makes these anim vars accessible for controller mapping and to JavaScript.
Added 'neuronAvatar.js' as an example of reading joints from the Neuron and setting them
on the avatar. NOTE: the rotations are currently in the wrong coordinate frame.
This is a subclass of Model that overrides updateClusterMatrices so it pulls
the actual joint matrices from a different rig override.
For the avatar soft attachment system, this override will be the Avatar::_skeletonModel rig.
This gives an avatar the ability to "wear" non-rigid attachments, such as clothing.
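A hedged sketch of the override pattern described above; the class names and members are illustrative stand-ins for the engine types:

```cpp
#include <glm/glm.hpp>
#include <vector>

struct Rig {
    std::vector<glm::mat4> jointMatrices;  // stand-in for the wearer's skeleton joints
};

class Model {
public:
    virtual ~Model() = default;
    // the base implementation would pull matrices from this model's own rig
    virtual void updateClusterMatrices() {}
protected:
    std::vector<glm::mat4> _clusterMatrices;
};

class SoftAttachmentModel : public Model {
public:
    explicit SoftAttachmentModel(const Rig* rigOverride) : _rigOverride(rigOverride) {}

    void updateClusterMatrices() override {
        // pull the joint matrices from the override rig (e.g. the avatar skeleton's rig)
        // so the attachment deforms along with the wearer's skeleton
        _clusterMatrices = _rigOverride->jointMatrices;
    }

private:
    const Rig* _rigOverride;
};
```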
ModelEntities that were playing animations on models with local pivot offsets were not working correctly.
Specifically, the windmill animation in the demo domain.
We now compose a matrix containing all of the FBX's preTranslation, preRotation and postTransformations.
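A rough sketch of the composition, assuming glm and treating the three terms as a translation, a quaternion, and a matrix; the real FBX transform chain has additional pivot and offset terms:

```cpp
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>
#include <glm/gtc/matrix_transform.hpp>

// Compose a single offset matrix from the FBX pre/post terms so animated
// joints land in the right place for models with local pivot offsets.
glm::mat4 composeAnimationOffset(const glm::vec3& preTranslation,
                                 const glm::quat& preRotation,
                                 const glm::mat4& postTransform) {
    return glm::translate(glm::mat4(1.0f), preTranslation)
         * glm::mat4_cast(preRotation)
         * postTransform;
}
```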
Hooked up a transition animation from idle to walk in the animation json.
Also fixed a bug in AnimBlendLinearMove that was preventing correct interpolation into idle.
New JavaScript API to get the avatar's default pose.
MyAvatar.getDefaultJointRotation(index);
MyAvatar.getDefaultJointTranslation(index);
See `examples/tPose.js` for example usage
These can be called from a script with minimal blocking,
because they inspect a copy of the joint values from the Rig, which is updated atomically.
This copy occurs in Rig::updateAnimations().
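A sketch of the pattern under the assumption that the copy is protected by a lock; the class and method names are illustrative, not the Rig's actual implementation:

```cpp
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>
#include <mutex>
#include <vector>

struct JointPose {
    glm::quat rotation;
    glm::vec3 translation;
};

class JointStateSnapshot {
public:
    // called once per frame from the animation update (e.g. Rig::updateAnimations())
    void publish(const std::vector<JointPose>& poses) {
        std::lock_guard<std::mutex> lock(_mutex);
        _poses = poses;
    }
    // called from the script-facing getters; only holds the lock long enough to copy
    JointPose get(int index) const {
        std::lock_guard<std::mutex> lock(_mutex);
        return _poses.at(index);
    }
private:
    mutable std::mutex _mutex;
    std::vector<JointPose> _poses;
};
```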
Logic which extracted rotations from non-uniformly scaled matrices was sometimes incorrect.
This should fix the roads in Qbit as well as the blocks in toybox.
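A minimal sketch of extracting a rotation safely from a non-uniformly scaled matrix, assuming glm (names are illustrative):

```cpp
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

// Normalize each basis axis (and re-orthogonalize) before converting to a
// quaternion, so non-uniform scale doesn't corrupt the extracted rotation.
glm::quat extractRotation(const glm::mat4& m) {
    glm::vec3 x = glm::normalize(glm::vec3(m[0]));
    glm::vec3 y = glm::normalize(glm::vec3(m[1]));
    glm::vec3 z = glm::normalize(glm::cross(x, y));
    y = glm::cross(z, x);
    return glm::quat_cast(glm::mat3(x, y, z));
}
```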