Because the current IK system doesn't quite handle what we need
for the head and neck, we handle that IK procedurally in the rig and
manually set both the neck and head IK targets.
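As a rough illustration of that procedural split, here is a minimal sketch
assuming glm math; the helper name, struct, and 50/50 split are assumptions
for illustration, not the actual Rig code.

```cpp
// Hypothetical sketch: distribute the desired head rotation between the
// neck and head joints before handing both to IK as explicit targets.
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

struct HeadNeckTargets {
    glm::quat neckRotation;
    glm::quat headRotation;
};

// deltaRotation: desired head rotation relative to the torso.
// The neck absorbs a fraction of it; the head target gets the full
// rotation, so IK only has to solve the remainder above the neck.
HeadNeckTargets computeHeadNeckTargets(const glm::quat& deltaRotation,
                                       float neckFraction = 0.5f) {
    const glm::quat identity(1.0f, 0.0f, 0.0f, 0.0f);
    HeadNeckTargets targets;
    targets.neckRotation = glm::slerp(identity, deltaRotation, neckFraction);
    targets.headRotation = deltaRotation;
    return targets;
}
```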
The hand state machine has the following features:
* There's an idle-to-point animation, followed by a looping point-hold state.
* There's a point-to-idle animation.
* The grab state is a linear blend between an open pose and a closed pose.
Additionally, the C++ code will ramp the left- and right-hand overlays on
and off (a sketch follows below). This allows the fingers to be animated
normally when the user is not actively pointing or grabbing.
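A minimal sketch of that overlay ramp, assuming a simple per-frame alpha
update; the function and its parameters are illustrative, not the actual
AnimOverlay API.

```cpp
// Hypothetical sketch: ease the hand-overlay alpha toward 1.0 while the
// user is pointing or grabbing, and back toward 0.0 otherwise, so the
// underlying finger animation shows through when the hand is idle.
#include <algorithm>

float rampOverlayAlpha(float currentAlpha, bool handActive, float deltaTime,
                       float rampSpeed = 4.0f) {
    float target = handActive ? 1.0f : 0.0f;
    float step = rampSpeed * deltaTime;
    if (currentAlpha < target) {
        return std::min(currentAlpha + step, target);
    }
    return std::max(currentAlpha - step, target);
}
```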
* Application: Forward trigger values to MyAvatar's PalmData
* SkeletonModel: Pass PalmData to the Rig via updateRigFromHandData(); this is more explicit than
the Rig::inverseKinematics methods.
* AnimNodeLoader & AnimOverlay: add support for LeftHand and RightHand bone sets
* Rig::updateRigFromHandData(): read the triggers and set the state-machine trigger vars (see the sketch after this list)
* avatar.json - updated with the new hand state machine and temporary animations
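To make the trigger-to-state-machine flow concrete, here is a hedged
sketch; PalmData's fields, the AnimVariantMap stand-in, and the variable
names are all assumptions for illustration, not the real interfaces.

```cpp
// Hypothetical sketch of the updateRigFromHandData() flow described above:
// read each palm's trigger value and convert it into anim state-machine
// trigger variables.
#include <map>
#include <string>

struct PalmData {
    float trigger = 0.0f;   // 0.0 = open, 1.0 = fully squeezed
    bool isLeft = false;
};

struct AnimVariantMap {
    std::map<std::string, float> floats;
    void set(const std::string& key, float value) { floats[key] = value; }
};

void updateRigFromHandData(AnimVariantMap& animVars, const PalmData& palm) {
    const std::string prefix = palm.isLeft ? "left" : "right";
    // Drive the open/closed grab blend directly from the trigger...
    animVars.set(prefix + "HandGrabBlend", palm.trigger);
    // ...and flip the point state when the trigger passes a threshold.
    animVars.set(prefix + "HandPointing", palm.trigger > 0.9f ? 1.0f : 0.0f);
}
```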
* In HMD mode, both head orientation and position are set.
* When not in HMD mode, only orientation is set; position should
default to the underlying pose position.
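A minimal sketch of that branch; the types and field names here are
assumptions, not the actual Rig interface.

```cpp
// Hypothetical sketch of the HMD branch described above. In HMD mode the
// tracked sensor provides both orientation and position; on desktop only
// orientation is overridden and the position falls back to the pose from
// the underlying animation.
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

struct HeadParams {
    bool isInHMD = false;
    glm::quat hmdOrientation{1.0f, 0.0f, 0.0f, 0.0f};
    glm::vec3 hmdPosition{0.0f};
    glm::quat lookAtOrientation{1.0f, 0.0f, 0.0f, 0.0f};
};

struct JointTarget {
    glm::quat rotation{1.0f, 0.0f, 0.0f, 0.0f};
    glm::vec3 position{0.0f};
    bool positionValid = false;  // false => keep the underlying pose position
};

JointTarget computeHeadTarget(const HeadParams& params,
                              const glm::vec3& underlyingPosePosition) {
    JointTarget target;
    if (params.isInHMD) {
        target.rotation = params.hmdOrientation;
        target.position = params.hmdPosition;
        target.positionValid = true;
    } else {
        target.rotation = params.lookAtOrientation;
        target.position = underlyingPosePosition;  // default to the pose
    }
    return target;
}
```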
The state machine now triggers seven states:
Idle, WalkFwd, WalkBwd, StrafeLeft, StrafeRight, TurnLeft, and TurnRight.
It also supports variable-speed walking to match the current velocity.
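One plausible way to pick among those triggers from local velocity,
sketched below; the state names match the list above, but the thresholds,
conventions, and helper names are assumptions for illustration.

```cpp
// Hypothetical sketch: map local velocity and turn speed onto the seven
// locomotion triggers listed above.
#include <string>
#include <glm/glm.hpp>

std::string selectLocomotionState(const glm::vec3& localVelocity, float turnSpeed) {
    const float moveThreshold = 0.1f;   // m/s, illustrative
    const float turnThreshold = 0.5f;   // rad/s, illustrative
    if (localVelocity.z < -moveThreshold) { return "WalkFwd"; }  // assume -z is forward
    if (localVelocity.z >  moveThreshold) { return "WalkBwd"; }
    if (localVelocity.x < -moveThreshold) { return "StrafeLeft"; }
    if (localVelocity.x >  moveThreshold) { return "StrafeRight"; }
    if (turnSpeed >  turnThreshold) { return "TurnRight"; }
    if (turnSpeed < -turnThreshold) { return "TurnLeft"; }
    return "Idle";
}

// Variable-speed walking: scale animation playback so the feet match the
// avatar's actual speed relative to the speed the clip was authored at.
float walkTimeScale(float actualSpeed, float animAuthoredSpeed = 1.4f) {
    return (animAuthoredSpeed > 0.0f) ? actualSpeed / animAuthoredSpeed : 1.0f;
}
```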
* added AnimPose::copyFromNetworkAnim(), which
should re-map bone ids to match the current
skeleton and fill in missing bones with
bind-pose frames (see the first sketch below)
* added the ability to set a skeleton on a node.
I might need to add a recursive version of this (see the second sketch below).
* it compiles!
* tests run!
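To make the remapping step concrete, here is a minimal sketch of the idea;
the AnimPose fields and the name-keyed frame format are illustrative, not
the real data layout.

```cpp
// Hypothetical sketch: for each joint in the target skeleton, look up the
// matching joint in the network animation by name; if it is missing, fall
// back to the skeleton's bind pose for that joint.
#include <map>
#include <string>
#include <vector>
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

struct AnimPose {
    glm::quat rot{1.0f, 0.0f, 0.0f, 0.0f};
    glm::vec3 trans{0.0f};
};

std::vector<AnimPose> remapFrame(const std::vector<std::string>& skeletonJointNames,
                                 const std::vector<AnimPose>& bindPoses,
                                 const std::map<std::string, AnimPose>& networkFrame) {
    std::vector<AnimPose> out(skeletonJointNames.size());
    for (size_t i = 0; i < skeletonJointNames.size(); ++i) {
        auto iter = networkFrame.find(skeletonJointNames[i]);
        // Missing bones fall back to the bind pose for this joint.
        out[i] = (iter != networkFrame.end()) ? iter->second : bindPoses[i];
    }
    return out;
}
```

And a sketch of the recursive skeleton-setting variant mentioned above,
again assuming a hypothetical node interface rather than the real AnimNode
class.

```cpp
// Hypothetical sketch: walk the anim-node graph depth-first and set the
// skeleton on every node.
#include <memory>
#include <vector>

class AnimSkeleton;  // opaque for this sketch

struct AnimNode {
    std::shared_ptr<AnimSkeleton> skeleton;
    std::vector<std::shared_ptr<AnimNode>> children;
};

void setSkeletonRecursive(const std::shared_ptr<AnimNode>& node,
                          const std::shared_ptr<AnimSkeleton>& skeleton) {
    if (!node) { return; }
    node->skeleton = skeleton;
    for (auto& child : node->children) {
        setSkeletonRecursive(child, skeleton);
    }
}
```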