Debug tool that prints out the skeleton hierarchy of FBX files, including joint indices, bindPose and defaultPoses.
The verbose option also prints the full FBX transformation set, pre/post rotations etc.
* Removed MyAvatar.reset() access from JavaScript
* Added HMD.centerUI() to JavaScript, which can be used to reset the 3D UI sphere around the current HMD orientation.
* Added MyAvatar.clearIKJointLimitHistory() which can be used to reset any remembered IK joint limit history.
* Added MyAvatar.centerBody() which can be used to instantly re-orient the avatar so that its hips and toes
face the same direction as the current HMD orientation.
away.js now uses the above new APIs instead of MyAvatar.reset().
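A minimal sketch of the calls an away-style script can now make in place of MyAvatar.reset(), assuming it runs inside Interface where the HMD and MyAvatar objects are available:

    // Recenter the 3D UI sphere, clear remembered IK limits, and re-orient the body.
    HMD.centerUI();
    MyAvatar.clearIKJointLimitHistory();
    MyAvatar.centerBody();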
Added an option to disable the 'room-scale' avatar re-centering code.
Disabling this can prevent sliding when the avatar is supposed to be sitting or mounted on a stationary object.
Also removed a bunch of old, unused leaning and torso-twisting code.
* Bug fix for eye tracking in HMD; the "up" orientation of your eyes now matches that of your head.
* DebugDraw: added drawRay method.
* Application: Renamed preRender to postUpdate
* AvatarManager: added postUpdate method that iterates over all avatars.
* MyAvatar: Renamed preRender to preDisplaySide
* MyAvatar: split preRender code into postUpdate and preDisplaySide.
* Removed "Show who is looking at me", "Render focus indicator" and "Render lookat target" debug draw.
* Split "Show Look At Vectors" into "Show My Look At Vectors" and "Show Other Look At Vectors", to make it easier to debug eye tracking.
* "Show Look at Vectors" now draws the right eye red and the left eye blue.
* Removed Avatar and MyAvatar renderBody
* Removed look at rendering from head.
* GLMHelpers: Bug fix for generateBasisVectors when the primary ('up') and secondary axes were orthogonal.
Improves FTUE (first-time user experience) by no longer going over the network to download the default avatar animations.
This also includes support for relative animation URLs within the animation.json.
Calling glm::axis() on an identity quaternion does not result in a normalized vector.
This vector was used within Rig::updateEyeJoint() to limit the rotation of the eyeballs,
to prevent the eyes from rolling back into the avatar's head.
If the avatar was looking straight ahead, this could result in bad quaternions in the eyeball
joint matrices, which in turn would cause the eyeball mesh, or any mesh influenced by the eyeball joints,
not to render.
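The fix itself is in the C++ Rig code, but the pitfall is easy to illustrate in script form (a hedged sketch, not the engine code): the axis of a (near-)identity quaternion is degenerate, so fall back to a known axis instead of using the raw result.

    // For a quaternion {x, y, z, w}, the rotation axis is (x, y, z) / sin(angle / 2).
    // An identity quaternion has x = y = z = 0, so that division is degenerate.
    function safeAxis(q, fallbackAxis) {
        var sinHalfAngle = Math.sqrt(q.x * q.x + q.y * q.y + q.z * q.z);
        if (sinHalfAngle < 1.0e-6) {
            return fallbackAxis; // no meaningful axis for an identity rotation
        }
        return { x: q.x / sinHalfAngle, y: q.y / sinHalfAngle, z: q.z / sinHalfAngle };
    }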
* bug fix in AABox::operator+=
* added AABox::embiggen
* Avatar now has a default bound for its skinned mesh.
* WIP: AABox tests; NEED MORE
* Model: split collision and model mesh render items.
Because ModelMeshRenderItems need special handling to update bounds for animated joints.
* Model: dynamically update the bound for rigidly bound animated meshes
* Rig: added access to geometryToRigTransform
* RenderableModelEntityItem: try to update bounds for skinned mesh to be the entity dimensions (this doesn't seem to be working)
* Geometry.cpp: removed unused bounds parameter in evalPartBounds
* ModelMeshPartPayload: bounds updating
* non-animated: use existing _localBound
* rigid bound mesh: use _localBound transformed by clusterMatrix joint transform
* fully skinned mesh: use _skinnedMeshBound provided by the application.
When Blender avatars are animated with Blender animations, the animations are missing
pre- and post-rotations. It is no longer necessary for this option to be false,
and enabling it also makes sure all animations from Blender work on all avatars (from
other platforms).
This commit makes it enabled by default.
Previously we were using an infinitely tall vertical cylinder
to push hand IK targets out of the body; this had the
side-effect of preventing the hands from being raised over
the head. Now we collide against the same 3D capsule
used by the physics system for avatar collisions.
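A hedged sketch of the capsule push-out, using the Interface Vec3 helpers; the function name and parameters are illustrative, not the actual engine code, and it assumes the two capsule endpoints are distinct:

    // Push a hand IK target out of a capsule defined by a segment and a radius.
    function pushPointOutOfCapsule(point, capsuleStart, capsuleEnd, radius) {
        var axis = Vec3.subtract(capsuleEnd, capsuleStart);
        var t = Vec3.dot(Vec3.subtract(point, capsuleStart), axis) / Vec3.dot(axis, axis);
        t = Math.max(0.0, Math.min(1.0, t));              // clamp to the segment
        var closest = Vec3.sum(capsuleStart, Vec3.multiply(axis, t));
        var offset = Vec3.subtract(point, closest);
        var distance = Vec3.length(offset);
        if (distance >= radius) {
            return point;                                 // already outside the capsule
        }
        var direction = (distance > 1.0e-6) ?
            Vec3.multiply(offset, 1.0 / distance) : { x: 1, y: 0, z: 0 };
        return Vec3.sum(closest, Vec3.multiply(direction, radius));
    }

Unlike the old infinite cylinder, a point above the top of the capsule measures its distance to the top endpoint, so hands raised over the head are left alone.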
There were three things that were causing issues with eye look at vectors while wearing an HMD.
1) The matrix returned by AvatarUpdate->getHeadPose() was in the wrong space; it should have been in avatar space,
but it was actually a matrix in sensor/room space.
2) The lookAtPosition was incorrect while wearing an HMD and with no avatars to look at.
3) The eye rotation limits in Rig::updateEyeJoint() were relative to the model's zero orientation, NOT relative to the head.
This was causing the eyes to hit limits when the avatar head turned.
* When computing tipPosition, for the next iteration of the CCD,
use the leverArm before it's projected onto the lowerSpine twist axis.
* fix for acos() input that was going outside of the valid domain [-1.0, 1.0].
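The clamp pattern is language-agnostic; a minimal sketch of it (illustrative, not the engine code):

    // Floating-point error can push a computed cosine slightly outside [-1, 1],
    // which makes acos() return NaN; clamp the input first.
    function safeAcos(cosine) {
        return Math.acos(Math.max(-1.0, Math.min(1.0, cosine)));
    }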
Now there are two sets of jump takeoff and in-air animations:
* Run - Used when the character jumps or falls with a small forward velocity.
* Standing - Used when the character jumps or falls in-place or backward.
CharacterController
* increased takeoff duration to 250 ms
* increased takeoff to fly duration to 1100 ms
* added standing jump and in-air animations
* added a 250 millisecond delay between ground and hover, to prevent going into hover when walking over cracks.
* take-off to in-air transitions now use the new snapshotPrev interp type for a smoother tweening.
interpType defines how the interpolation between two states is performed.
* SnapshotBoth: Stores two snapshots, the previous animation before interpolation begins and the target state at the
interpTarget frame. Then during the interpolation period the two snapshots are interpolated to produce smooth motion between them.
* SnapshotPrev: Stores a snapshot of the previous animation before interpolation begins. However the target state is
evaluated dynamically. During the interpolation period the previous snapshot is interpolated with the target pose
to produce smooth motion between them. This mode is useful for interping into a blended animation where the actual
blend factor is not known at the start of the interp, or might change dramatically during the interp.
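A rough sketch of the difference between the two modes (illustrative only; a "pose" here is just an array of numbers and lerp() stands in for real pose blending):

    function lerp(a, b, alpha) {
        return a.map(function (value, i) { return value + (b[i] - value) * alpha; });
    }

    // SnapshotBoth: both endpoints were captured when the interp began.
    function evalSnapshotBoth(prevSnapshot, targetSnapshot, alpha) {
        return lerp(prevSnapshot, targetSnapshot, alpha);
    }

    // SnapshotPrev: only the previous pose was captured; the target is
    // re-evaluated every frame, so it can track a changing blend.
    function evalSnapshotPrev(prevSnapshot, evaluateTargetNow, alpha) {
        return lerp(prevSnapshot, evaluateTargetNow(), alpha);
    }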
These options are for developers only and might help debug animation-related issues.
* Enable Inverse Kinematics: this can be toggled to disable IK for the avatar.
* Enable Anim Pre and Post Rotations: this option causes FBX pre-rotations to be taken from the source avatar animations, instead of the current default, which is to take them from the source model.
This only affects FBX files loaded by the animation system; it does not affect changing model orientations via JavaScript.
Hand animations now have 5 states:
* idle
* open
* grasp
* point
* farGrasp
The handControllerGrab.js script now chooses one of these five animations, based on the state of the HandController object.
Also, removed the hand trigger AnimVar setting from the C++ Rig class.
This makes the hand states accessible for controller mapping and to JavaScript.
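A hypothetical sketch of the selection; the inputs and thresholds are illustrative, not the actual handControllerGrab.js logic:

    // Map controller state to one of the five hand animation states.
    function chooseHandAnimation(triggerValue, isPointing, isFarGrabbing) {
        if (isFarGrabbing) { return "farGrasp"; }
        if (isPointing) { return "point"; }
        if (triggerValue > 0.9) { return "grasp"; }
        if (triggerValue > 0.1) { return "open"; }
        return "idle";
    }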
Added 'neuronAvatar.js' as an example of reading joints from the Neuron and setting them
on the avatar. NOTE: the rotations are currently in the wrong coordinate frame.
This is a subclass of Model; it overrides updateClusterMatrices so it will pull
the actual joint matrices from a different rig override.
For the avatar soft attachment system, this override will be the Avatar::_skeletonModel rig.
This will give avatars the ability to "wear" non-rigid attachments, such as clothing.
ModelEntities that were playing animations on models with local pivot offsets were not working correctly.
Specifically, the windmill animation in the demo domain.
We now compose a matrix containing all of the FBX's preTranslation, preRotation and postTransformations.
Hooked up a transition animation from idle to walk in the animation json.
Also fixed a bug in AnimBlendLinearMove which was preventing the interpolation
into idle from being correct.
New JavaScript API to get the avatar's default pose.
MyAvatar.getDefaultJointRotation(index);
MyAvatar.getDefaultJointTranslation(index);
See `examples/tPose.js` for example usage
These can be called from script with minimal blocking,
because they inspect a copy of the joint values from the Rig, which is updated atomically.
This copy occurs in Rig::updateAnimations().
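A minimal usage sketch along the lines of examples/tPose.js, printing the default pose of the first few joints (indices only, to avoid assuming any other API calls):

    for (var i = 0; i < 10; i++) {
        var rotation = MyAvatar.getDefaultJointRotation(i);
        var translation = MyAvatar.getDefaultJointTranslation(i);
        print("joint " + i +
              " rot = " + JSON.stringify(rotation) +
              " trans = " + JSON.stringify(translation));
    }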
Logic which extracted rotations from non-uniformly scaled matrices was sometimes incorrect.
This should fix the roads in Qbit as well as the blocks in toybox.
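The fix lives in the C++ math helpers, but the core idea is simple to sketch in script form: strip the per-axis scale from the upper 3x3 by normalizing its columns before treating it as a rotation. The function name and column representation here are illustrative; Vec3.normalize is the Interface scripting helper.

    // Given the three columns of a scaled 3x3 basis, return unit-length columns
    // so the rotation can be extracted without the non-uniform scale skewing it.
    function removeNonUniformScale(col0, col1, col2) {
        return [Vec3.normalize(col0), Vec3.normalize(col1), Vec3.normalize(col2)];
    }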
This Y_180 flip is defined in skeletonModel not in the rig.
This is important if we wish to use the Rig for both Avatars (180 flip) and Entity models (no 180 flip).
We can hide this 180 flip from script, if we wish, by including it in all the accessors to and from
MyAvatar -> skeletonModel -> Rig.
Added Quaternions::Y_180 to GLMHelpers.
This will allow us in the future to pull preRotations from
animations instead of the model skeleton. It is disabled
by default because our current animations' preRotations are
not correct for the left hand.
This makes it much simpler for code outside of the rig to manipulate AnimVars.
* Removed mat4 type from AnimVars
* AnimVariantMap now has a _rigToGeometryTransform matrix
used to transform positions and rotations into the correct coordinate frame.
* Moved AnimPose code to extract a quat from a scaled matrix into GLMHelpers
* Moved JointData into shared library
* added methods to the rig to copy into and out of JointData
* JointData translations must be in meters; this is so the
fixed point compression won't overflow, and it also gives a
consistent wire format.
* No longer normalizing scale in AnimSkeleton and AnimClip.
This means the graph is animating in 'geometry' coordinates
before unit scale is even applied. This is necessary to
properly work with both Avatar-based models and ModelEntity-based
models.
Many things are broken.
* debug rendering (translations are x100)
* IK hand targets
* follow cam
* I did not even dare to try HMD mode
Changed default eye position to 1.9 meters because
the hifi_team avatars are 2.0 meters tall.
Also, prevent array access with negative indices when eye bones are missing.
ಠ_ಠ
Except for SkeletonModel::computeBounds(), JointStates are now completely
encapsulated by the Rig. Now we can start using AnimPoses instead and
in parallel with the JointState implementation. Then we can assert that
they are identical, before removing JointStates.
This check-in has many comments with the AJT tag.
Each one of these cases will need to be revisited and fixed.
In particular // AJT: LEGACY will be used to enclose all code
in the Rig which manipulates the _jointState QVector.
* Deleted AnimationHandle class
* Removed enableAnimGraph and enableRigAnimations from Menu.
* Removed *some* references to the old IK system,
but it is still used when computing collision bounding volumes.
This should fix the issue with the hips moving erratically when arm IK
is enabled. The main issue is that the IK system assumed that the "Hips"
joint was the root of the skeleton. For Blender avatars this is not the case,
as it inserts an "Armature" node at the root instead.
Also, Rig::updateAnimations() now occurs after
Rig::updateFromHeadParameters() and Rig::updateFromHandParameters().
This should remove a frame of lag for head and hand IK targets.
Rig::updateFromEyeParameters() occurs after Rig::updateAnimations(),
but now the eye JointStates are re-computed; this is the actual
fix for the local eye tracking issue.
The IK was assuming that the "Hips" bone index was always 0.
This was not the case for Owen. Now we lookup the Hips index
and cache it for use during the hipsOffset computation.
While in the HMD, updates can occur with very small deltaTime values.
This makes the position-delta method of computing a velocity very
susceptible to noise and precision errors.
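A quick numeric illustration of why: the same millimetre of positional jitter becomes a much larger velocity error as deltaTime shrinks (the values are illustrative):

    var jitter = 0.001;                       // 1 mm of positional noise, in meters
    [1 / 90, 1 / 1000].forEach(function (dt) {
        var velocityError = jitter / dt;      // velocity = delta position / delta time
        print("dt = " + dt.toFixed(4) + " s -> error = " + velocityError.toFixed(1) + " m/s");
    });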
MyAvatar: refactored updateFromHMDSensorMatrix() a bit by splitting it into several methods, because
it was getting quite large and becoming hard to follow.
* beginStraighteningLean() - can be called when we would like to trigger a re-centering action.
* shouldBeginStraighteningLean() - contains some of the logic to decide if we should begin a re-centering action.
For now it encapsulates the capsule check.
* processStraighteningLean() - performs the actual re-centering calculation.
New code was added to MyAvatar::updateFromHMDSensorMatrix() to trigger re-centering when the avatar speed rises
over a threshold.
Secondly, a 100 ms state-change hysteresis was added to the Rig::computeMotionAnimationState() state machine
for the animGraph. This hysteresis should help smooth over two issues:
1) When the delta position is 0, because the physics timestep was not evaluated.
2) During re-centering due to desired motion, the avatar velocity can fluctuate causing undesired animation state fluctuation.
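A hedged sketch of the hysteresis idea (names and structure are illustrative, not the actual Rig code): the desired state must persist for 100 ms before the state machine actually switches.

    var HYSTERESIS_MS = 100;
    var currentState = "idle";
    var pendingState = "idle";
    var pendingSinceMs = 0;

    function applyHysteresis(desiredState, nowMs) {
        if (desiredState === currentState) {
            pendingState = currentState;            // nothing to change
        } else if (desiredState !== pendingState) {
            pendingState = desiredState;            // start timing the new candidate
            pendingSinceMs = nowMs;
        } else if (nowMs - pendingSinceMs >= HYSTERESIS_MS) {
            currentState = pendingState;            // candidate persisted long enough
        }
        return currentState;
    }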
Only the JointState._defaultTranslation needs to be multiplied
by the unitScale in JointState::translationIsDefault(). This
was incorrectly flagging some non-animated joints as animated.