Merge branch 'master' of github.com:highfidelity/hifi into audio/parity

This commit is contained in:
Zach Pomerantz 2017-06-13 17:41:52 -04:00
commit a8f69bb2e0
385 changed files with 9669 additions and 4040 deletions

View file

@ -1,4 +1,4 @@
###Dependencies
### Dependencies
* [cmake](https://cmake.org/download/) ~> 3.3.2
* [Qt](https://www.qt.io/download-open-source) ~> 5.6.2
@ -6,7 +6,7 @@
* IMPORTANT: Use the latest available version of OpenSSL to avoid security vulnerabilities.
* [VHACD](https://github.com/virneo/v-hacd)(clone this repository)(Optional)
####CMake External Project Dependencies
#### CMake External Project Dependencies
* [boostconfig](https://github.com/boostorg/config) ~> 1.58
* [Bullet Physics Engine](https://github.com/bulletphysics/bullet3/releases) ~> 2.83
@ -30,16 +30,19 @@ These are not placed in your normal build tree when doing an out of source build
If you would like to use a specific install of a dependency instead of the version that would be grabbed as a CMake ExternalProject, you can pass -DUSE_LOCAL_$NAME=0 (where $NAME is the name of the subfolder in [cmake/externals](cmake/externals)) when you run CMake to tell it not to get that dependency as an external project.
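For example, to use a locally installed Bullet instead of the bundled external (assuming the subfolder in [cmake/externals](cmake/externals) is named `bullet`), the flag might look like:

    cmake .. -DUSE_LOCAL_BULLET=0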
###OS Specific Build Guides
### OS Specific Build Guides
* [BUILD_OSX.md](BUILD_OSX.md) - additional instructions for OS X.
* [BUILD_LINUX.md](BUILD_LINUX.md) - additional instructions for Linux.
* [BUILD_WIN.md](BUILD_WIN.md) - additional instructions for Windows.
* [BUILD_ANDROID.md](BUILD_ANDROID.md) - additional instructions for Android
###CMake
### CMake
Hifi uses CMake to generate build files and project files for your platform.
####Qt
#### Qt
In order for CMake to find the Qt5 find modules, you will need to set a QT_CMAKE_PREFIX_PATH environment variable pointing to your Qt installation.
This can either be entered directly into your shell session before you build or in your shell profile (e.g.: ~/.bash_profile, ~/.bashrc, ~/.zshrc - this depends on your shell and environment).
@ -50,7 +53,8 @@ The path it needs to be set to will depend on where and how Qt5 was installed. e
export QT_CMAKE_PREFIX_PATH=/usr/local/Cellar/qt5/5.6.2/lib/cmake
export QT_CMAKE_PREFIX_PATH=/usr/local/opt/qt5/lib/cmake
####Generating build files
#### Generating build files
Create a build directory in the root of your checkout and then run the CMake build from there. This will keep the rest of the directory clean.
mkdir build
@ -59,14 +63,15 @@ Create a build directory in the root of your checkout and then run the CMake bui
If cmake gives you the same error message repeatedly after the build fails (e.g. you had a typo in the QT_CMAKE_PREFIX_PATH that you fixed but the `.cmake` files still cannot be found), try removing `CMakeCache.txt`.
####Variables
#### Variables
Any variables that need to be set for CMake to find dependencies can be set as ENV variables in your shell profile, or passed directly to CMake with a `-D` flag appended to the `cmake ..` command.
For example, to pass the QT_CMAKE_PREFIX_PATH variable during build file generation:
cmake .. -DQT_CMAKE_PREFIX_PATH=/usr/local/qt/5.6.2/lib/cmake
####Finding Dependencies
#### Finding Dependencies
The following applies for dependencies we do not grab via CMake ExternalProject (OpenSSL is an example), or for dependencies you have opted not to grab as a CMake ExternalProject (via -DUSE_LOCAL_$NAME=0). The list of dependencies we grab by default as external projects can be found in [the CMake External Project Dependencies section](#cmake-external-project-dependencies).
@ -78,8 +83,8 @@ In the examples below the variable $NAME would be replaced by the name of the de
* $NAME_ROOT_DIR - set this variable in your ENV
* HIFI_LIB_DIR - set this variable in your ENV to your High Fidelity lib folder, should contain a folder '$name'
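For example, locating a local OpenSSL install through the ENV variable route before generating build files might look like this (the path is only an example and will differ per machine):

    export OPENSSL_ROOT_DIR=/usr/local/opt/openssl
    cmake ..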
###Optional Components
### Optional Components
####Devices
#### Devices
You can support external input/output devices such as Leap Motion, MIDI, and more by adding each individual SDK in the visible building path. Refer to the readme file available in each device folder in [interface/external/](interface/external) for the detailed explanation of the requirements to use the device.

View file

@ -1,6 +1,6 @@
Please read the [general build guide](BUILD.md) for information on dependencies required for all platforms. Only Android specific instructions are found in this file.
###Android Dependencies
### Android Dependencies
You will need the following tools to build our Android targets.
@ -17,23 +17,23 @@ You will need the following tools to build our Android targets.
You will also need to cross-compile the dependencies required for all platforms for Android, and help CMake find these compiled libraries on your machine.
####Scribe
#### Scribe
High Fidelity has a shader pre-processing tool called `scribe` that various libraries will call on during the build process. You must compile scribe using your native toolchain (following the build instructions for your platform) and then pass a CMake variable or set an ENV variable `SCRIBE_PATH` that is a path to the scribe executable.
CMake will fatally error if it does not find the scribe executable while using the android toolchain.
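For example, after building scribe with your native toolchain, its location could be supplied either way (the path is illustrative):

    export SCRIBE_PATH=/path/to/scribe
    # or pass it directly to CMake:
    cmake .. -DSCRIBE_PATH=/path/to/scribe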
####Optional Components
#### Optional Components
* [Oculus Mobile SDK](https://developer.oculus.com/downloads/#sdk=mobile) ~> 0.4.2
####ANDROID_LIB_DIR
#### ANDROID_LIB_DIR
Since you won't be installing Android dependencies to system paths on your development machine, CMake will need a little help tracking down your Android dependencies.
This is most easily accomplished by installing all Android dependencies in the same folder. You can place this folder wherever you like on your machine. In this build guide and across our CMakeLists files this folder is referred to as `ANDROID_LIB_DIR`. You can set `ANDROID_LIB_DIR` in your environment or pass it to CMake when you run it.
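For instance, with everything collected under one folder, the variable might be set in the environment or passed on the CMake command line (example path only):

    export ANDROID_LIB_DIR=$HOME/android-dependencies
    # or
    cmake .. -DANDROID_LIB_DIR=$HOME/android-dependencies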
####Qt
#### Qt
Install Qt 5.5.1 for Android for your host environment from the [Qt downloads page](http://www.qt.io/download/). Install Qt to ``$ANDROID_LIB_DIR/Qt``. This is required so that our root CMakeLists file can help CMake find your Android Qt installation.
@ -41,7 +41,7 @@ The component required for the Android build is the `Android armv7` component.
If you would like to install Qt to a different location, or attempt to build with a different Qt version, you can pass `ANDROID_QT_CMAKE_PREFIX_PATH` to CMake. Point to the `cmake` folder inside `$VERSION_NUMBER/android_armv7/lib`. Otherwise, our root CMakeLists will set it to `$ANDROID_LIB_DIR/Qt/5.5/android_armv7/lib/cmake`.
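For example, pointing at a Qt installed outside `ANDROID_LIB_DIR` might look like this (version and path are illustrative):

    cmake .. -DANDROID_QT_CMAKE_PREFIX_PATH=/path/to/Qt/5.5/android_armv7/lib/cmake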
####OpenSSL
#### OpenSSL
Cross-compilation of OpenSSL has been tested from an OS X machine running 10.10 compiling OpenSSL 1.0.2. It is likely that the steps below will work for OpenSSL versions other than 1.0.2.
@ -76,7 +76,7 @@ This should generate libcrypto and libssl in the root of the OpenSSL directory.
If you have been building other components it is possible that the OpenSSL compile will fail based on the values other cross-compilations (tbb, bullet) have set. Ensure that you are in a new terminal window to avoid compilation errors from previously set environment variables.
####Oculus Mobile SDK
#### Oculus Mobile SDK
The Oculus Mobile SDK is optional, for Gear VR support. It is not required to compile gvr-interface.
@ -91,7 +91,7 @@ ndk-build
This will create the liboculus.a archive that our FindLibOVR module will look for when cmake is run.
#####Hybrid testing
##### Hybrid testing
Currently the 'vr_dual' mode that would allow us to run a hybrid app has limited support in the Oculus Mobile SDK. The best way to have an application we can launch without having to connect to the GearVR is to put the Gear VR Service into developer mode. This stops Oculus Home from taking over the device when it is plugged into the Gear VR headset, and allows the application to be launched from the Applications page.
@ -99,7 +99,7 @@ To put the Gear VR Service into developer mode you need an application with an O
Once the application is on your device, go to `Settings->Application Manager->Gear VR Service->Manage Storage`. Tap on `VR Service Version` six times. It will scan your device to verify that you have an osig file in an application on your device, and then it will let you enable Developer mode.
###CMake
### CMake
We use CMake to generate the makefiles that compile and deploy the Android APKs to your device. In order to create Makefiles for the Android targets, CMake requires that some environment variables are set, and that other variables are passed to it when it is run.

View file

@ -1,6 +1,7 @@
Please read the [general build guide](BUILD.md) for information on dependencies required for all platforms. Only Linux specific instructions are found in this file.
###Qt5 Dependencies
### Qt5 Dependencies
Should you choose not to install Qt5 via a package manager that handles dependencies for you, you may be missing some Qt5 dependencies. On Ubuntu, for example, the following additional packages are required:
libasound2 libxmu-dev libxi-dev freeglut3-dev libasound2-dev libjack0 libjack-dev libxrandr-dev libudev-dev libssl-dev
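On Ubuntu these can typically be installed in one pass with apt-get (package names as listed above):

    sudo apt-get install libasound2 libxmu-dev libxi-dev freeglut3-dev libasound2-dev libjack0 libjack-dev libxrandr-dev libudev-dev libssl-dev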

View file

@ -1,12 +1,13 @@
Please read the [general build guide](BUILD.md) for information on dependencies required for all platforms. Only OS X specific instructions are found in this file.
###Homebrew
### Homebrew
[Homebrew](https://brew.sh/) is an excellent package manager for OS X. It makes installing some High Fidelity dependencies very simple.
brew tap homebrew/versions
brew install cmake openssl
###OpenSSL
### OpenSSL
Assuming you've installed OpenSSL using the homebrew instructions above, you'll need to set OPENSSL_ROOT_DIR so CMake can find your installations.
For OpenSSL installed via homebrew, set OPENSSL_ROOT_DIR:
@ -15,7 +16,8 @@ For OpenSSL installed via homebrew, set OPENSSL_ROOT_DIR:
Note that this uses the version from the homebrew formula at the time of this writing, and the version in the path will likely change.
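A version-independent alternative is Homebrew's `opt` symlink, which always points at the currently linked formula, e.g.:

    export OPENSSL_ROOT_DIR=/usr/local/opt/openssl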
###Qt
### Qt
Download and install the [Qt 5.6.2 for macOS](http://download.qt.io/official_releases/qt/5.6/5.6.2/qt-opensource-mac-x64-clang-5.6.2.dmg).
Keep the default components checked when going through the installer.
@ -23,7 +25,8 @@ Keep the default components checked when going through the installer.
Once Qt is installed, you need to manually configure the following:
* Set the QT_CMAKE_PREFIX_PATH environment variable to your `Qt5.6.2/5.6/clang_64/lib/cmake/` directory.
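With the default installer location this usually looks something like the following (adjust the path to wherever Qt was actually installed):

    export QT_CMAKE_PREFIX_PATH=$HOME/Qt5.6.2/5.6/clang_64/lib/cmake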
###Xcode
### Xcode
If Xcode is your editor of choice, you can ask CMake to generate Xcode project files instead of Unix Makefiles.
cmake .. -GXcode

View file

@ -1,62 +1,66 @@
This is a stand-alone guide for creating your first High Fidelity build for Windows 64-bit.
###Step 1. Installing Visual Studio 2013
## Building High Fidelity
### Step 1. Installing Visual Studio 2013
If you don't already have the Community or Professional edition of Visual Studio 2013, download and install [Visual Studio Community 2013](https://www.visualstudio.com/en-us/news/releasenotes/vs2013-community-vs). You do not need to install any of the optional components when going through the installer.
Note: Newer versions of Visual Studio are not yet compatible.
###Step 2. Installing CMake
### Step 2. Installing CMake
Download and install the [CMake 3.8.0 win64-x64 Installer](https://cmake.org/files/v3.8/cmake-3.8.0-win64-x64.msi). Make sure "Add CMake to system PATH for all users" is checked when going through the installer.
###Step 3. Installing Qt
### Step 3. Installing Qt
Download and install the [Qt 5.6.2 for Windows 64-bit (VS 2013)](http://download.qt.io/official_releases/qt/5.6/5.6.2/qt-opensource-windows-x86-msvc2013_64-5.6.2.exe).
Keep the default components checked when going through the installer.
###Step 4. Setting Qt Environment Variable
### Step 4. Setting Qt Environment Variable
Go to "Control Panel > System > Advanced System Settings > Environment Variables > New..." (or search “Environment Variables” in Start Search).
* Set "Variable name": QT_CMAKE_PREFIX_PATH
Go to `Control Panel > System > Advanced System Settings > Environment Variables > New...` (or search “Environment Variables” in Start Search).
* Set "Variable name": `QT_CMAKE_PREFIX_PATH`
* Set "Variable value": `%QT_DIR%\5.6\msvc2013_64\lib\cmake`
###Step 5. Installing OpenSSL
### Step 5. Installing OpenSSL
Download and install the [Win64 OpenSSL v1.0.2k Installer](https://slproweb.com/download/Win64OpenSSL-1_0_2k.exe).
Download and install the [Win64 OpenSSL v1.0.2L Installer](https://slproweb.com/download/Win64OpenSSL-1_0_2L.exe).
###Step 6. Running CMake to Generate Build Files
### Step 6. Running CMake to Generate Build Files
Run Command Prompt from Start and run the following commands:
cd "%HIFI_DIR%"
mkdir build
cd build
cmake .. -G "Visual Studio 12 Win64"
````
cd "%HIFI_DIR%"
mkdir build
cd build
cmake .. -G "Visual Studio 12 Win64"
````
Where %HIFI_DIR% is the directory for the highfidelity repository.
Where `%HIFI_DIR%` is the directory for the highfidelity repository.
###Step 7. Making a Build
### Step 7. Making a Build
Open '%HIFI_DIR%\build\hifi.sln' using Visual Studio.
Open `%HIFI_DIR%\build\hifi.sln` using Visual Studio.
Change the Solution Configuration (next to the green play button) from "Debug" to "Release" for best performance.
Run Build > Build Solution.
Run `Build > Build Solution`.
###Step 8. Testing Interface
### Step 8. Testing Interface
Create another environment variable (see Step #4)
* Set "Variable name": _NO_DEBUG_HEAP
* Set "Variable value": 1
* Set "Variable name": `_NO_DEBUG_HEAP`
* Set "Variable value": `1`
In Visual Studio, right+click "interface" under the Apps folder in Solution Explorer and select "Set as Startup Project". Run Debug > Start Debugging.
In Visual Studio, right+click "interface" under the Apps folder in Solution Explorer and select "Set as Startup Project". Run `Debug > Start Debugging`.
Now, you should have a full build of High Fidelity and be able to run the Interface using Visual Studio. Please check our [Docs](https://wiki.highfidelity.com/wiki/Main_Page) for more information regarding the programming workflow.
Note: You can also run Interface by launching it from command line or File Explorer from %HIFI_DIR%\build\interface\Release\interface.exe
Note: You can also run Interface by launching it from command line or File Explorer from `%HIFI_DIR%\build\interface\Release\interface.exe`
###Troubleshooting
## Troubleshooting
For any problems after Step #6, first try this:
* Delete your locally cloned copy of the highfidelity repository
@ -64,18 +68,18 @@ For any problems after Step #6, first try this:
* Redownload the [repository](https://github.com/highfidelity/hifi)
* Restart directions from Step #6
####CMake gives you the same error message repeatedly after the build fails
#### CMake gives you the same error message repeatedly after the build fails
Remove `CMakeCache.txt` found in the '%HIFI_DIR%\build' directory
Remove `CMakeCache.txt` found in the `%HIFI_DIR%\build` directory.
####nmake cannot be found
#### nmake cannot be found
Make sure nmake.exe is located at the following path:
C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\bin
If not, add the directory where nmake is located to the PATH environment variable.
####Qt is throwing an error
Make sure you have the correct version (5.6.2) installed and 'QT_CMAKE_PREFIX_PATH' environment variable is set correctly.
#### Qt is throwing an error
Make sure you have the correct version (5.6.2) installed and `QT_CMAKE_PREFIX_PATH` environment variable is set correctly.

View file

@ -2,15 +2,15 @@ Follow the [build guide](BUILD.md) to figure out how to build High Fidelity for
During generation, CMake should produce an `install` target and a `package` target.
###Install
### Install
The `install` target will copy the High Fidelity targets and their dependencies to your `CMAKE_INSTALL_PREFIX`.
###Packaging
### Packaging
To produce an installer, run the `package` target.
####Windows
#### Windows
To produce an executable installer on Windows, the following are required:
@ -20,6 +20,6 @@ To produce an executable installer on Windows, the following are required:
Run the `package` target to create an executable installer using the Nullsoft Scriptable Install System.
####OS X
#### OS X
Run the `package` target to create an Apple Disk Image (.dmg).

View file

@ -166,11 +166,7 @@ void Agent::run() {
// Setup MessagesClient
auto messagesClient = DependencyManager::set<MessagesClient>();
QThread* messagesThread = new QThread;
messagesThread->setObjectName("Messages Client Thread");
messagesClient->moveToThread(messagesThread);
connect(messagesThread, &QThread::started, messagesClient.data(), &MessagesClient::init);
messagesThread->start();
messagesClient->startThread();
// make sure we hear about connected nodes so we can grab an ATP script if a request is pending
connect(nodeList.data(), &LimitedNodeList::nodeActivated, this, &Agent::nodeActivated);
@ -410,6 +406,7 @@ void Agent::executeScript() {
bool openedInLastBlock = !_audioGateOpen && audioGateOpen; // the gate just opened
bool closedInLastBlock = _audioGateOpen && !audioGateOpen; // the gate just closed
_audioGateOpen = audioGateOpen;
Q_UNUSED(openedInLastBlock);
// the codec must be flushed to silence before sending silent packets,
// so delay the transition to silent packets by one packet after becoming silent.

View file

@ -69,17 +69,7 @@ AssignmentClient::AssignmentClient(Assignment::Type requestAssignmentType, QStri
DependencyManager::set<ResourceScriptingInterface>();
DependencyManager::set<UserActivityLoggerScriptingInterface>();
// setup a thread for the NodeList and its PacketReceiver
QThread* nodeThread = new QThread(this);
nodeThread->setObjectName("NodeList Thread");
nodeThread->start();
// make sure the node thread is given highest priority
nodeThread->setPriority(QThread::TimeCriticalPriority);
// put the NodeList on the node thread
nodeList->moveToThread(nodeThread);
nodeList->startThread();
// set the logging target to the the CHILD_TARGET_NAME
LogHandler::getInstance().setTargetName(ASSIGNMENT_CLIENT_TARGET_NAME);
@ -166,14 +156,8 @@ void AssignmentClient::stopAssignmentClient() {
}
AssignmentClient::~AssignmentClient() {
QThread* nodeThread = DependencyManager::get<NodeList>()->thread();
// remove the NodeList from the DependencyManager
DependencyManager::destroy<NodeList>();
// ask the node thread to quit and wait until it is done
nodeThread->quit();
nodeThread->wait();
}
void AssignmentClient::aboutToQuit() {

View file

@ -16,10 +16,10 @@
#include <mutex>
#include <vector>
#include <tbb/concurrent_queue.h>
#include <QThread>
#include <TBBHelpers.h>
#include "AudioMixerSlave.h"
class AudioMixerSlavePool;

View file

@ -16,10 +16,9 @@
#include <mutex>
#include <vector>
#include <tbb/concurrent_queue.h>
#include <QThread>
#include <TBBHelpers.h>
#include <NodeList.h>
#include "AvatarMixerSlave.h"

View file

@ -236,11 +236,7 @@ void EntityScriptServer::run() {
// Setup MessagesClient
auto messagesClient = DependencyManager::set<MessagesClient>();
QThread* messagesThread = new QThread;
messagesThread->setObjectName("Messages Client Thread");
messagesClient->moveToThread(messagesThread);
connect(messagesThread, &QThread::started, messagesClient.data(), &MessagesClient::init);
messagesThread->start();
messagesClient->startThread();
DomainHandler& domainHandler = DependencyManager::get<NodeList>()->getDomainHandler();
connect(&domainHandler, &DomainHandler::settingsReceived, this, &EntityScriptServer::handleSettings);

View file

@ -8,8 +8,8 @@ string(TOUPPER ${EXTERNAL_NAME} EXTERNAL_NAME_UPPER)
if (WIN32)
ExternalProject_Add(
${EXTERNAL_NAME}
URL http://s3.amazonaws.com/hifi-public/dependencies/nvtt-win-2.1.0.zip
URL_MD5 3ea6eeadbcc69071acf9c49ba565760e
URL http://s3.amazonaws.com/hifi-public/dependencies/nvtt-win-2.1.0.hifi.zip
URL_MD5 10da01cf601f88f6dc12a6bc13c89136
CONFIGURE_COMMAND ""
BUILD_COMMAND ""
INSTALL_COMMAND ""
@ -29,8 +29,8 @@ else ()
ExternalProject_Add(
${EXTERNAL_NAME}
URL http://hifi-public.s3.amazonaws.com/dependencies/nvidia-texture-tools-2.1.0.zip
URL_MD5 81b8fa6a9ee3f986088eb6e2215d6a57
URL http://hifi-public.s3.amazonaws.com/dependencies/nvidia-texture-tools-2.1.0.hifi.zip
URL_MD5 5794b950f8b265a9a41b2839b3bf7ebb
CONFIGURE_COMMAND CMAKE_ARGS ${ANDROID_CMAKE_ARGS} -DNVTT_SHARED=1 -DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE} -DCMAKE_INSTALL_PREFIX:PATH=<INSTALL_DIR>
LOG_DOWNLOAD 1
LOG_CONFIGURE 1

View file

@ -21,7 +21,7 @@
</p>
<p>
If your domain has any content that you would like to re-use at a later date, save a manual backup of your models.json.gz file, which is usually stored at the following paths:<br>
<pre>C:\Users\[username]\AppData\Roaming\High Fidelity\assignment-client/entities/models.json.gz</pre>
<pre>C:/Users/[username]/AppData/Roaming/High Fidelity/assignment-client/entities/models.json.gz</pre>
<pre>/Users/[username]/Library/Application Support/High Fidelity/assignment-client/entities/models.json.gz</pre>
<pre>/home/[username]/.local/share/High Fidelity/assignment-client/entities/models.json.gz</pre>
</p>

View file

@ -37,20 +37,12 @@ RenderingClient::RenderingClient(QObject *parent, const QString& launchURLString
DependencyManager::set<AvatarHashMap>();
// get our audio client setup on its own thread
QThread* audioThread = new QThread();
auto audioClient = DependencyManager::set<AudioClient>();
audioClient->setPositionGetter(getPositionForAudio);
audioClient->setOrientationGetter(getOrientationForAudio);
audioClient->startThread();
audioClient->moveToThread(audioThread);
connect(audioThread, &QThread::started, audioClient.data(), &AudioClient::start);
connect(audioClient.data(), &AudioClient::destroyed, audioThread, &QThread::quit);
connect(audioThread, &QThread::finished, audioThread, &QThread::deleteLater);
audioThread->start();
connect(&_avatarTimer, &QTimer::timeout, this, &RenderingClient::sendAvatarPacket);
_avatarTimer.setInterval(16); // 60 FPS
_avatarTimer.start();

View file

@ -103,8 +103,8 @@
"rotationVar": "spine2Rotation",
"typeVar": "spine2Type",
"weightVar": "spine2Weight",
"weight": 1.0,
"flexCoefficients": [1.0, 0.5, 0.5]
"weight": 2.0,
"flexCoefficients": [1.0, 0.5, 0.25]
},
{
"jointName": "Head",
@ -113,7 +113,7 @@
"typeVar": "headType",
"weightVar": "headWeight",
"weight": 4.0,
"flexCoefficients": [1, 0.05, 0.25, 0.25, 0.25]
"flexCoefficients": [1, 0.5, 0.25, 0.2, 0.1]
},
{
"jointName": "LeftArm",

View file

@ -1,14 +1,16 @@
{
"RenderShadowTask": {
"Enabled": {
"enabled": true
}
},
"RenderDeferredTask": {
"AmbientOcclusion": {
"RenderMainView": {
"RenderShadowTask": {
"Enabled": {
"enabled": true
}
},
"RenderDeferredTask": {
"AmbientOcclusion": {
"Enabled": {
"enabled": true
}
}
}
}
}

View file

@ -57,6 +57,8 @@
{ "from": "OculusTouch.LeftThumbUp", "to": "Standard.LeftThumbUp" },
{ "from": "OculusTouch.RightThumbUp", "to": "Standard.RightThumbUp" },
{ "from": "OculusTouch.LeftIndexPoint", "to": "Standard.LeftIndexPoint" },
{ "from": "OculusTouch.RightIndexPoint", "to": "Standard.RightIndexPoint" }
{ "from": "OculusTouch.RightIndexPoint", "to": "Standard.RightIndexPoint" },
{ "from": "OculusTouch.Head", "to" : "Standard.Head", "when" : [ "Application.InHMD"] }
]
}

View file

@ -35,35 +35,35 @@
{ "from": "Vive.RightApplicationMenu", "to": "Standard.RightSecondaryThumb" },
{ "from": "Vive.LeftHand", "to": "Standard.LeftHand", "when": [ "Application.InHMD" ] },
{ "from": "Vive.RightHand", "to": "Standard.RightHand", "when": [ "Application.InHMD" ] },
{ "from": "Vive.RightHand", "to": "Standard.RightHand", "when": [ "Application.InHMD" ] },
{
"from": "Vive.LeftFoot", "to" : "Standard.LeftFoot",
"filters" : [{"type" : "lowVelocity", "rotation" : 1.0, "translation": 1.0}],
"when": [ "Application.InHMD"]
"when": [ "Application.InHMD" ]
},
{
"from": "Vive.RightFoot", "to" : "Standard.RightFoot",
"filters" : [{"type" : "lowVelocity", "rotation" : 1.0, "translation": 1.0}],
"when": [ "Application.InHMD"]
"when": [ "Application.InHMD" ]
},
{
"from": "Vive.Hips", "to" : "Standard.Hips",
"filters" : [{"type" : "lowVelocity", "rotation" : 0.01, "translation": 0.01}],
"when": [ "Application.InHMD"]
"when": [ "Application.InHMD" ]
},
{
"from": "Vive.Spine2", "to" : "Standard.Spine2",
"filters" : [{"type" : "lowVelocity", "rotation" : 0.01, "translation": 0.01}],
"when": [ "Application.InHMD"]
"when": [ "Application.InHMD" ]
},
{ "from": "Vive.Head", "to" : "Standard.Head", "when" : [ "Application.InHMD"] },
{ "from": "Vive.Head", "to" : "Standard.Head", "when": [ "Application.InHMD" ] },
{ "from": "Vive.RightArm", "to" : "Standard.RightArm", "when" : [ "Application.InHMD"] },
{ "from": "Vive.LeftArm", "to" : "Standard.LeftArm", "when" : [ "Application.InHMD"] }
{ "from": "Vive.RightArm", "to" : "Standard.RightArm", "when": [ "Application.InHMD" ] },
{ "from": "Vive.LeftArm", "to" : "Standard.LeftArm", "when": [ "Application.InHMD" ] }
]
}

View file

@ -32,7 +32,7 @@ var EventBridge;
var webChannel = new QWebChannel(qt.webChannelTransport, function (channel) {
// replace the TempEventBridge with the real one.
var tempEventBridge = EventBridge;
EventBridge = channel.objects.eventBridgeWrapper.eventBridge;
EventBridge = channel.objects.eventBridge;
tempEventBridge._callbacks.forEach(function (callback) {
EventBridge.scriptEventReceived.connect(callback);
});

View file

@ -1,85 +1,89 @@
name = Jointy3
name = mannequin
type = body+head
scale = 1
filename = Jointy3/Jointy3.fbx
texdir = Jointy3/textures
filename = mannequin/mannequin.baked.fbx
joint = jointEyeLeft = LeftEye
joint = jointRightHand = RightHand
joint = jointHead = Head
joint = jointEyeRight = RightEye
joint = jointLean = Spine
joint = jointNeck = Neck
joint = jointLeftHand = LeftHand
joint = jointEyeRight = RightEye
joint = jointHead = Head
joint = jointRightHand = RightHand
joint = jointRoot = Hips
joint = jointLean = Spine
joint = jointEyeLeft = LeftEye
freeJoint = LeftArm
freeJoint = LeftForeArm
freeJoint = RightArm
freeJoint = RightForeArm
jointIndex = RightHand = 17
jointIndex = LeftHandIndex3 = 56
jointIndex = Hips = 0
jointIndex = LeftHandRing2 = 47
jointIndex = LeftHandThumb3 = 60
jointIndex = RightShoulder = 14
jointIndex = RightHandRing1 = 30
jointIndex = RightHandRing3 = 32
jointIndex = LeftHandPinky4 = 45
jointIndex = LeftHandRing1 = 46
jointIndex = LeftFoot = 8
jointIndex = RightHandIndex2 = 23
jointIndex = RightToeBase = 4
jointIndex = RightHandMiddle4 = 29
jointIndex = RightHandPinky4 = 37
jointIndex = LeftToe_End = 10
jointIndex = RightEye = 66
jointIndex = RightHandPinky2 = 35
jointIndex = RightHandRing2 = 31
jointIndex = LeftHand = 41
jointIndex = RightToe_End = 5
jointIndex = LeftEye = 65
jointIndex = LeftHandThumb2 = 59
jointIndex = pCylinder73Shape1 = 67
jointIndex = LeftShoulder = 38
jointIndex = LeftHandIndex2 = 55
jointIndex = RightForeArm = 16
jointIndex = LeftHandMiddle2 = 51
jointIndex = RightHandRing4 = 33
jointIndex = LeftLeg = 7
jointIndex = LeftHandThumb4 = 61
jointIndex = LeftForeArm = 40
jointIndex = HeadTop_End = 64
jointIndex = RightHandPinky1 = 34
jointIndex = RightHandIndex1 = 22
jointIndex = LeftHandIndex1 = 54
jointIndex = RightLeg = 2
jointIndex = RightHandIndex4 = 25
jointIndex = Neck = 62
jointIndex = LeftHandMiddle1 = 50
jointIndex = RightHandPinky3 = 36
jointIndex = LeftHandPinky2 = 43
jointIndex = RightHandMiddle3 = 28
jointIndex = RightHandThumb4 = 21
jointIndex = LeftUpLeg = 6
jointIndex = RightFoot = 3
jointIndex = LeftHandThumb1 = 58
jointIndex = LeftArm = 39
jointIndex = RightHandMiddle1 = 26
jointIndex = LeftHandRing3 = 48
jointIndex = LeftHandMiddle4 = 53
jointIndex = RightUpLeg = 1
jointIndex = RightHandMiddle2 = 27
jointIndex = LeftToeBase = 9
jointIndex = RightHandThumb2 = 19
jointIndex = Spine2 = 13
jointIndex = Spine = 11
jointIndex = LeftHandRing4 = 49
jointIndex = Head = 63
jointIndex = LeftHandPinky3 = 44
bs = EyeBlink_L = blink = 1
bs = JawOpen = mouth_Open = 1
bs = LipsFunnel = Oo = 1
bs = BrowsU_L = brow_Up = 1
jointIndex = RightHandIndex2 = 27
jointIndex = LeftHandIndex2 = 51
jointIndex = RightUpLeg = 6
jointIndex = RightToe_End = 10
jointIndex = RightEye = 65
jointIndex = LeftHandPinky1 = 42
jointIndex = RightHandThumb1 = 18
jointIndex = LeftHandIndex4 = 57
jointIndex = LeftHandMiddle3 = 52
jointIndex = RightHandIndex3 = 24
jointIndex = Spine1 = 12
jointIndex = RightHandRing1 = 22
jointIndex = face = 67
jointIndex = LeftUpLeg = 1
jointIndex = LeftHand = 41
jointIndex = LeftHandMiddle1 = 58
jointIndex = LeftHandIndex1 = 50
jointIndex = LeftEye = 64
jointIndex = RightHandIndex1 = 26
jointIndex = LeftHandPinky4 = 45
jointIndex = RightArm = 15
jointIndex = RightHandThumb3 = 20
jointIndex = LeftShoulder = 38
jointIndex = RightHandPinky2 = 19
jointIndex = RightHandThumb1 = 30
jointIndex = RightForeArm = 16
jointIndex = LeftHandMiddle3 = 60
jointIndex = Neck = 62
jointIndex = LeftHandThumb1 = 54
jointIndex = RightHandMiddle2 = 35
jointIndex = LeftHandMiddle4 = 61
jointIndex = mannequin = 68
jointIndex = Spine1 = 12
jointIndex = RightFoot = 8
jointIndex = RightHand = 17
jointIndex = LeftHandIndex3 = 52
jointIndex = RightHandIndex3 = 28
jointIndex = RightHandMiddle4 = 37
jointIndex = LeftLeg = 2
jointIndex = RightHandMiddle1 = 34
jointIndex = Spine2 = 13
jointIndex = LeftHandMiddle2 = 59
jointIndex = LeftHandPinky3 = 44
jointIndex = LeftHandThumb3 = 56
jointIndex = LeftHandRing4 = 49
jointIndex = RightHandThumb2 = 31
jointIndex = LeftHandRing3 = 48
jointIndex = HeadTop_End = 66
jointIndex = LeftHandThumb4 = 57
jointIndex = RightHandThumb3 = 32
jointIndex = RightHandPinky1 = 18
jointIndex = RightLeg = 7
jointIndex = RightHandMiddle3 = 36
jointIndex = RightHandPinky3 = 20
jointIndex = LeftToeBase = 4
jointIndex = LeftForeArm = 40
jointIndex = RightShoulder = 14
jointIndex = LeftHandRing2 = 47
jointIndex = LeftHandThumb2 = 55
jointIndex = Head = 63
jointIndex = RightHandRing4 = 25
jointIndex = LeftHandRing1 = 46
jointIndex = LeftFoot = 3
jointIndex = RightHandRing3 = 24
jointIndex = RightHandThumb4 = 33
jointIndex = LeftArm = 39
jointIndex = LeftToe_End = 5
jointIndex = RightToeBase = 9
jointIndex = RightHandPinky4 = 21
jointIndex = Spine = 11
jointIndex = LeftHandIndex4 = 53
jointIndex = LeftHandPinky2 = 43
jointIndex = RightHandIndex4 = 29
jointIndex = Hips = 0
jointIndex = RightHandRing2 = 23

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

View file

@ -177,7 +177,7 @@ ScrollingWindow {
SHAPE_TYPE_STATIC_MESH
],
checkStateOnDisable: false,
warningOnDisable: "Models with 'Exact' automatic collisions cannot be dynamic"
warningOnDisable: "Models with 'Exact' automatic collisions cannot be dynamic, and should not be used as floors"
}
});

View file

@ -2,7 +2,6 @@ import QtQuick 2.5
import QtQuick.Controls 1.2
import QtWebChannel 1.0
import QtWebEngine 1.2
import FileTypeProfile 1.0
import "controls-uit"
import "styles" as HifiStyles
@ -22,8 +21,6 @@ ScrollingWindow {
property alias url: webview.url
property alias webView: webview
property alias eventBridge: eventBridgeWrapper.eventBridge
signal loadingChanged(int status)
x: 100
@ -209,21 +206,7 @@ ScrollingWindow {
WebView {
id: webview
url: "https://highfidelity.com/"
property alias eventBridgeWrapper: eventBridgeWrapper
QtObject {
id: eventBridgeWrapper
WebChannel.id: "eventBridgeWrapper"
property var eventBridge;
}
profile: FileTypeProfile {
id: webviewProfile
storageName: "qmlWebEngine"
}
webChannel.registeredObjects: [eventBridgeWrapper]
profile: FileTypeProfile;
// Create a global EventBridge object for raiseAndLowerKeyboard.
WebEngineScript {
@ -271,6 +254,8 @@ ScrollingWindow {
}
Component.onCompleted: {
webChannel.registerObject("eventBridge", eventBridge);
webChannel.registerObject("eventBridgeWrapper", eventBridgeWrapper);
desktop.initWebviewProfileHandlers(webview.profile);
}
}

View file

@ -26,15 +26,8 @@ Windows.ScrollingWindow {
// Don't destroy on close... otherwise the JS/C++ will have a dangling pointer
destroyOnCloseButton: false
property alias source: webview.url
property alias eventBridge: eventBridgeWrapper.eventBridge;
property alias scriptUrl: webview.userScriptUrl
QtObject {
id: eventBridgeWrapper
WebChannel.id: "eventBridgeWrapper"
property var eventBridge;
}
// This is for JS/QML communication, which is unused in a WebWindow,
// but not having this here results in spurious warnings about a
// missing signal
@ -70,7 +63,6 @@ Windows.ScrollingWindow {
url: "about:blank"
anchors.fill: parent
focus: true
webChannel.registeredObjects: [eventBridgeWrapper]
property string userScriptUrl: ""
@ -107,6 +99,8 @@ Windows.ScrollingWindow {
}
Component.onCompleted: {
webChannel.registerObject("eventBridge", eventBridge);
webChannel.registerObject("eventBridgeWrapper", eventBridgeWrapper);
eventBridge.webEventReceived.connect(onWebEventReceived);
}
}

View file

@ -30,15 +30,6 @@ Windows.Window {
property bool keyboardRaised: false
property bool punctuationMode: false
// JavaScript event bridge object in case QML content includes Web content.
property alias eventBridge: eventBridgeWrapper.eventBridge;
QtObject {
id: eventBridgeWrapper
WebChannel.id: "eventBridgeWrapper"
property var eventBridge;
}
onSourceChanged: {
if (dynamicContent) {
dynamicContent.destroy();

View file

@ -8,7 +8,6 @@ import "controls-uit" as HifiControls
import "styles" as HifiStyles
import "styles-uit"
import "windows"
import HFTabletWebEngineProfile 1.0
Item {
id: root
@ -19,7 +18,6 @@ Item {
property variant permissionsBar: {'securityOrigin':'none','feature':'none'}
property alias url: webview.url
property WebEngineView webView: webview
property alias eventBridge: eventBridgeWrapper.eventBridge
property bool canGoBack: webview.canGoBack
property bool canGoForward: webview.canGoForward
@ -33,12 +31,6 @@ Item {
webview.profile = profile;
}
QtObject {
id: eventBridgeWrapper
WebChannel.id: "eventBridgeWrapper"
property var eventBridge;
}
WebEngineView {
id: webview
objectName: "webEngineView"
@ -47,10 +39,7 @@ Item {
width: parent.width
height: keyboardEnabled && keyboardRaised ? parent.height - keyboard.height : parent.height
profile: HFTabletWebEngineProfile {
id: webviewTabletProfile
storageName: "qmlTabletWebEngine"
}
profile: HFTabletWebEngineProfile;
property string userScriptUrl: ""
@ -82,9 +71,10 @@ Item {
property string newUrl: ""
webChannel.registeredObjects: [eventBridgeWrapper]
Component.onCompleted: {
webChannel.registerObject("eventBridge", eventBridge);
webChannel.registerObject("eventBridgeWrapper", eventBridgeWrapper);
// Ensure the JS from the web-engine makes it to our logging
webview.javaScriptConsoleMessage.connect(function(level, message, lineNumber, sourceID) {
console.log("Web Entity JS message: " + sourceID + " " + lineNumber + " " + message);

View file

@ -79,15 +79,11 @@ ScrollingWindow {
id: webView
anchors.fill: parent
enabled: false
property alias eventBridgeWrapper: eventBridgeWrapper
QtObject {
id: eventBridgeWrapper
WebChannel.id: "eventBridgeWrapper"
property var eventBridge
Component.onCompleted: {
webChannel.registerObject("eventBridge", eventBridge);
webChannel.registerObject("eventBridgeWrapper", eventBridgeWrapper);
}
webChannel.registeredObjects: [eventBridgeWrapper]
onEnabledChanged: toolWindow.updateVisiblity()
}
}
@ -251,12 +247,9 @@ ScrollingWindow {
tab.enabled = true;
tab.originalUrl = properties.source;
var eventBridge = properties.eventBridge;
var result = tab.item;
result.enabled = true;
tabView.tabCount++;
result.eventBridgeWrapper.eventBridge = eventBridge;
result.url = properties.source;
return result;
}

View file

@ -10,13 +10,10 @@
import QtQuick 2.5
import QtWebEngine 1.2
import HFWebEngineProfile 1.0
WebEngineView {
id: root
// profile: desktop.browserProfile
Component.onCompleted: {
console.log("Connecting JS messaging to Hifi Logging")
// Ensure the JS from the web-engine makes it to our logging

View file

@ -2,12 +2,10 @@ import QtQuick 2.5
import QtWebEngine 1.1
import QtWebChannel 1.0
import "../controls-uit" as HiFiControls
import HFTabletWebEngineProfile 1.0
Item {
property alias url: root.url
property alias scriptURL: root.userScriptUrl
property alias eventBridge: eventBridgeWrapper.eventBridge
property alias canGoBack: root.canGoBack;
property var goBack: root.goBack;
property alias urlTag: root.urlTag
@ -23,12 +21,6 @@ Item {
}
*/
QtObject {
id: eventBridgeWrapper
WebChannel.id: "eventBridgeWrapper"
property var eventBridge;
}
property alias viewProfile: root.profile
WebEngineView {
@ -39,10 +31,7 @@ Item {
width: parent.width
height: keyboardEnabled && keyboardRaised ? parent.height - keyboard.height : parent.height
profile: HFTabletWebEngineProfile {
id: webviewProfile
storageName: "qmlTabletWebEngine"
}
profile: HFTabletWebEngineProfile;
property string userScriptUrl: ""
@ -75,10 +64,11 @@ Item {
userScripts: [ createGlobalEventBridge, raiseAndLowerKeyboard, userScript ]
property string newUrl: ""
webChannel.registeredObjects: [eventBridgeWrapper]
Component.onCompleted: {
webChannel.registerObject("eventBridge", eventBridge);
webChannel.registerObject("eventBridgeWrapper", eventBridgeWrapper);
// Ensure the JS from the web-engine makes it to our logging
root.javaScriptConsoleMessage.connect(function(level, message, lineNumber, sourceID) {
console.log("Web Entity JS message: " + sourceID + " " + lineNumber + " " + message);

View file

@ -2,7 +2,6 @@ import QtQuick 2.5
import QtQuick.Controls 1.4
import QtWebEngine 1.2
import QtWebChannel 1.0
import HFTabletWebEngineProfile 1.0
import "../controls-uit" as HiFiControls
import "../styles" as HifiStyles
import "../styles-uit"
@ -18,7 +17,6 @@ Item {
property int headerHeight: 70
property string url
property string scriptURL
property alias eventBridge: eventBridgeWrapper.eventBridge
property bool keyboardEnabled: false
property bool keyboardRaised: false
property bool punctuationMode: false
@ -136,12 +134,6 @@ Item {
loadUrl(url);
}
QtObject {
id: eventBridgeWrapper
WebChannel.id: "eventBridgeWrapper"
property var eventBridge;
}
WebEngineView {
id: webview
objectName: "webEngineView"
@ -150,10 +142,7 @@ Item {
width: parent.width
height: keyboardEnabled && keyboardRaised ? parent.height - keyboard.height - web.headerHeight : parent.height - web.headerHeight
anchors.top: buttons.bottom
profile: HFTabletWebEngineProfile {
id: webviewTabletProfile
storageName: "qmlTabletWebEngine"
}
profile: HFTabletWebEngineProfile;
property string userScriptUrl: ""
@ -186,9 +175,9 @@ Item {
property string newUrl: ""
webChannel.registeredObjects: [eventBridgeWrapper]
Component.onCompleted: {
webChannel.registerObject("eventBridge", eventBridge);
webChannel.registerObject("eventBridgeWrapper", eventBridgeWrapper);
// Ensure the JS from the web-engine makes it to our logging
webview.javaScriptConsoleMessage.connect(function(level, message, lineNumber, sourceID) {
console.log("Web Entity JS message: " + sourceID + " " + lineNumber + " " + message);

View file

@ -10,13 +10,9 @@
import QtQuick 2.5
import "."
import FileTypeProfile 1.0
WebView {
viewProfile: FileTypeProfile {
id: webviewProfile
storageName: "qmlWebEngine"
}
viewProfile: FileTypeProfile;
urlTag: "noDownload=true";
}

View file

@ -2,12 +2,10 @@ import QtQuick 2.5
import QtWebEngine 1.1
import QtWebChannel 1.0
import "../controls-uit" as HiFiControls
import HFWebEngineProfile 1.0
Item {
property alias url: root.url
property alias scriptURL: root.userScriptUrl
property alias eventBridge: eventBridgeWrapper.eventBridge
property alias canGoBack: root.canGoBack;
property var goBack: root.goBack;
property alias urlTag: root.urlTag
@ -23,12 +21,6 @@ Item {
}
*/
QtObject {
id: eventBridgeWrapper
WebChannel.id: "eventBridgeWrapper"
property var eventBridge;
}
property alias viewProfile: root.profile
WebEngineView {
@ -39,10 +31,7 @@ Item {
width: parent.width
height: keyboardEnabled && keyboardRaised ? parent.height - keyboard.height : parent.height
profile: HFWebEngineProfile {
id: webviewProfile
storageName: "qmlWebEngine"
}
profile: HFWebEngineProfile;
property string userScriptUrl: ""
@ -76,9 +65,9 @@ Item {
property string newUrl: ""
webChannel.registeredObjects: [eventBridgeWrapper]
Component.onCompleted: {
webChannel.registerObject("eventBridge", eventBridge);
webChannel.registerObject("eventBridgeWrapper", eventBridgeWrapper);
// Ensure the JS from the web-engine makes it to our logging
root.javaScriptConsoleMessage.connect(function(level, message, lineNumber, sourceID) {
console.log("Web Entity JS message: " + sourceID + " " + lineNumber + " " + message);

View file

@ -20,7 +20,6 @@ TabletModalWindow {
id: loginDialogRoot
objectName: "LoginDialog"
property var eventBridge;
signal sendToScript(var message);
property bool isHMD: false
property bool gotoPreviousApp: false;

View file

@ -24,8 +24,6 @@ Window {
resizable: true
modality: Qt.ApplicationModal
property alias eventBridge: eventBridgeWrapper.eventBridge
Item {
anchors.fill: parent
@ -45,16 +43,6 @@ Window {
bottom: keyboard.top
}
property alias eventBridgeWrapper: eventBridgeWrapper
QtObject {
id: eventBridgeWrapper
WebChannel.id: "eventBridgeWrapper"
property var eventBridge;
}
webChannel.registeredObjects: [eventBridgeWrapper]
// Create a global EventBridge object for raiseAndLowerKeyboard.
WebEngineScript {
id: createGlobalEventBridge
@ -73,6 +61,10 @@ Window {
userScripts: [ createGlobalEventBridge, raiseAndLowerKeyboard ]
Component.onCompleted: {
webChannel.registerObject("eventBridge", eventBridge);
webChannel.registerObject("eventBridgeWrapper", eventBridgeWrapper);
}
}
Keyboard {

View file

@ -116,9 +116,7 @@ Preference {
Component {
id: tabletAvatarBrowserBuilder;
TabletAvatarBrowser {
eventBridge: tabletRoot.eventBridge
}
TabletAvatarBrowser { }
}
}

View file

@ -2,7 +2,6 @@ import QtQuick 2.5
import QtQuick.Controls 1.4
import QtWebEngine 1.1;
import Qt.labs.settings 1.0
import HFWebEngineProfile 1.0
import "../desktop" as OriginalDesktop
import ".."
@ -27,11 +26,6 @@ OriginalDesktop.Desktop {
property alias toolWindow: toolWindow
ToolWindow { id: toolWindow }
property var browserProfile: HFWebEngineProfile {
id: webviewProfile
storageName: "qmlWebEngine"
}
Action {
text: "Open Browser"
shortcut: "Ctrl+B"

View file

@ -44,7 +44,6 @@ Rectangle {
property var activeTab: "nearbyTab";
property bool currentlyEditingDisplayName: false
property bool punctuationMode: false;
property var eventBridge;
HifiConstants { id: hifi; }
@ -388,8 +387,13 @@ Rectangle {
sortIndicatorColumn: settings.nearbySortIndicatorColumn;
sortIndicatorOrder: settings.nearbySortIndicatorOrder;
onSortIndicatorColumnChanged: {
settings.nearbySortIndicatorColumn = sortIndicatorColumn;
sortModel();
if (sortIndicatorColumn > 2) {
// these are not sortable, switch back to last column
sortIndicatorColumn = settings.nearbySortIndicatorColumn;
} else {
settings.nearbySortIndicatorColumn = sortIndicatorColumn;
sortModel();
}
}
onSortIndicatorOrderChanged: {
settings.nearbySortIndicatorOrder = sortIndicatorOrder;
@ -1038,7 +1042,6 @@ Rectangle {
} // Keyboard
HifiControls.TabletWebView {
eventBridge: pal.eventBridge;
id: userInfoViewer;
anchors {
top: parent.top;

View file

@ -24,7 +24,6 @@ Rectangle {
property string title: "Asset Browser"
property bool keyboardRaised: false
property var eventBridge;
signal sendToScript(var message);
property bool isHMD: false
@ -179,7 +178,7 @@ Rectangle {
SHAPE_TYPE_STATIC_MESH
],
checkStateOnDisable: false,
warningOnDisable: "Models with 'Exact' automatic collisions cannot be dynamic"
warningOnDisable: "Models with 'Exact' automatic collisions cannot be dynamic, and should not be used as floors"
}
});

View file

@ -20,7 +20,6 @@ Rectangle {
id: root
objectName: "DCConectionTiming"
property var eventBridge;
signal sendToScript(var message);
property bool isHMD: false

View file

@ -19,7 +19,6 @@ Rectangle {
id: root
objectName: "DebugWindow"
property var eventBridge;
property var title: "Debug Window"
property bool isHMD: false

View file

@ -20,7 +20,6 @@ Rectangle {
id: root
objectName: "EntityStatistics"
property var eventBridge;
signal sendToScript(var message);
property bool isHMD: false

View file

@ -20,7 +20,6 @@ Rectangle {
id: root
objectName: "LODTools"
property var eventBridge;
signal sendToScript(var message);
property bool isHMD: false

View file

@ -23,7 +23,6 @@ Rectangle {
property string title: "Running Scripts"
HifiConstants { id: hifi }
signal sendToScript(var message);
property var eventBridge;
property var scripts: ScriptDiscoveryService;
property var scriptsModel: scripts.scriptsModelFilter
property var runningScriptsModel: ListModel { }

View file

@ -7,14 +7,12 @@ StackView {
objectName: "stack"
initialItem: Qt.resolvedUrl('EditTabView.qml')
property var eventBridge;
signal sendToScript(var message);
HifiConstants { id: hifi }
function pushSource(path) {
editRoot.push(Qt.resolvedUrl(path));
editRoot.currentItem.eventBridge = editRoot.eventBridge;
editRoot.currentItem.sendToScript.connect(editRoot.sendToScript);
}

View file

@ -5,7 +5,6 @@ import QtWebChannel 1.0
import QtQuick.Controls.Styles 1.4
import "../../controls"
import "../toolbars"
import HFWebEngineProfile 1.0
import QtGraphicalEffects 1.0
import "../../controls-uit" as HifiControls
import "../../styles-uit"
@ -182,7 +181,6 @@ TabView {
WebView {
id: entityListToolWebView
url: "../../../../../scripts/system/html/entityList.html"
eventBridge: editRoot.eventBridge
anchors.fill: parent
enabled: true
}
@ -197,7 +195,6 @@ TabView {
WebView {
id: entityPropertiesWebView
url: "../../../../../scripts/system/html/entityProperties.html"
eventBridge: editRoot.eventBridge
anchors.fill: parent
enabled: true
}
@ -212,7 +209,6 @@ TabView {
WebView {
id: gridControlsWebView
url: "../../../../../scripts/system/html/gridControls.html"
eventBridge: editRoot.eventBridge
anchors.fill: parent
enabled: true
}
@ -227,7 +223,6 @@ TabView {
WebView {
id: particleExplorerWebView
url: "../../../../../scripts/system/particle_explorer/particleExplorer.html"
eventBridge: editRoot.eventBridge
anchors.fill: parent
enabled: true
}
@ -290,7 +285,7 @@ TabView {
editTabView.currentIndex = id;
} else {
console.warn('Attempt to switch to invalid tab:', id);
}
}
} else if (typeof id === 'string'){
switch (id.toLowerCase()) {
case 'create':

View file

@ -18,7 +18,6 @@ import "../../dialogs"
Rectangle {
id: inputRecorder
property var eventBridge;
HifiConstants { id: hifi }
signal sendToScript(var message);
color: hifi.colors.baseGray;

View file

@ -20,7 +20,6 @@ Rectangle {
// height: parent.height
HifiConstants { id: hifi }
color: hifi.colors.baseGray;
property var eventBridge;
signal sendToScript(var message);
property bool keyboardEnabled: false
property bool punctuationMode: false
@ -118,7 +117,7 @@ Rectangle {
id: text2
width: 160
color: "#ffffff"
text: qsTr("Models with automatic collisions set to 'Exact' cannot be dynamic")
text: qsTr("Models with automatic collisions set to 'Exact' cannot be dynamic, and should not be used as floors")
wrapMode: Text.WordWrap
font.pixelSize: 12
}

View file

@ -29,7 +29,6 @@ StackView {
initialItem: addressBarDialog
width: parent !== null ? parent.width : undefined
height: parent !== null ? parent.height : undefined
property var eventBridge;
property int cardWidth: 212;
property int cardHeight: 152;
property string metaverseBase: addressBarDialog.metaverseServerUrl + "/api/v1/";
@ -80,7 +79,6 @@ StackView {
var card = tabletWebView.createObject();
card.url = addressBarDialog.metaverseServerUrl + targetString;
card.parentStackItem = root;
card.eventBridge = root.eventBridge;
root.push(card);
return;
}

View file

@ -25,7 +25,6 @@ Item {
property bool keyboardRaised: false
property bool punctuationMode: false
property var eventBridge;
signal sendToScript(var message);
anchors.fill: parent

View file

@ -20,6 +20,7 @@ StackView {
property string title: "Audio Buffers"
property alias gotoPreviousApp: root.gotoPreviousApp;
property var eventBridge;
signal sendToScript(var message);
function pushSource(path) {

View file

@ -19,7 +19,6 @@ StackView {
objectName: "stack"
property string title: "Avatar Settings"
property var eventBridge;
signal sendToScript(var message);
function pushSource(path) {

View file

@ -19,7 +19,6 @@ StackView {
objectName: "stack"
property string title: "General Settings"
property alias gotoPreviousApp: root.gotoPreviousApp;
property var eventBridge;
signal sendToScript(var message);
function pushSource(path) {

View file

@ -19,7 +19,6 @@ StackView {
objectName: "stack"
property string title: "Graphics Settings"
property var eventBridge;
signal sendToScript(var message);
function pushSource(path) {

View file

@ -19,7 +19,6 @@ StackView {
objectName: "stack"
property string title: "LOD Settings"
property var eventBridge;
signal sendToScript(var message);
function pushSource(path) {

View file

@ -4,7 +4,6 @@ import QtQuick.Controls 1.4
import QtQml 2.2
import QtWebChannel 1.0
import QtWebEngine 1.1
import HFWebEngineProfile 1.0
import "."
@ -22,7 +21,6 @@ FocusScope {
property var point: Qt.point(50, 50);
TabletMenuStack { id: menuPopperUpper }
property string subMenu: ""
property var eventBridge;
signal sendToScript(var message);
Rectangle {

View file

@ -49,7 +49,6 @@ Item {
function pushSource(path) {
d.push(Qt.resolvedUrl(path));
d.currentItem.eventBridge = tabletMenu.eventBridge
d.currentItem.sendToScript.connect(tabletMenu.sendToScript);
d.currentItem.focus = true;
d.currentItem.forceActiveFocus();

View file

@ -19,7 +19,6 @@ StackView {
objectName: "stack"
property var title: "Networking Settings"
property var eventBridge;
signal sendToScript(var message);
function pushSource(path) {

View file

@ -1,7 +1,6 @@
import QtQuick 2.0
import Hifi 1.0
import QtQuick.Controls 1.4
import HFTabletWebEngineProfile 1.0
import "../../dialogs"
import "../../controls"
@ -9,7 +8,6 @@ Item {
id: tabletRoot
objectName: "tabletRoot"
property string username: "Unknown user"
property var eventBridge;
property var rootMenu;
property var openModal: null;
property var openMessage: null;
@ -112,7 +110,6 @@ Item {
function openBrowserWindow(request, profile) {
var component = Qt.createComponent("../../controls/TabletWebView.qml");
var newWindow = component.createObject(tabletRoot);
newWindow.eventBridge = tabletRoot.eventBridge;
newWindow.remove = true;
newWindow.profile = profile;
request.openIn(newWindow.webView);
@ -166,21 +163,21 @@ Item {
objectName: "loader"
asynchronous: false
width: parent.width
height: parent.height
onLoaded: {
if (loader.item.hasOwnProperty("eventBridge")) {
loader.item.eventBridge = eventBridge;
// Hook up callback for clara.io download from the marketplace.
eventBridge.webEventReceived.connect(function (event) {
if (event.slice(0, 17) === "CLARA.IO DOWNLOAD") {
ApplicationInterface.addAssetToWorldFromURL(event.slice(18));
}
});
// Hook up callback for clara.io download from the marketplace.
Connections {
id: eventBridgeConnection
target: eventBridge
onWebEventReceived: {
if (message.slice(0, 17) === "CLARA.IO DOWNLOAD") {
ApplicationInterface.addAssetToWorldFromURL(message.slice(18));
}
}
}
onLoaded: {
if (loader.item.hasOwnProperty("sendToScript")) {
loader.item.sendToScript.connect(tabletRoot.sendToScript);
}

View file

@ -18,7 +18,6 @@ Windows.ScrollingWindow {
id: tabletRoot
objectName: "tabletRoot"
property string username: "Unknown user"
property var eventBridge;
property var rootMenu;
property string subMenu: ""
@ -86,17 +85,18 @@ Windows.ScrollingWindow {
anchors.left: parent.left
anchors.top: parent.top
onLoaded: {
if (loader.item.hasOwnProperty("eventBridge")) {
loader.item.eventBridge = eventBridge;
// Hook up callback for clara.io download from the marketplace.
eventBridge.webEventReceived.connect(function (event) {
if (event.slice(0, 17) === "CLARA.IO DOWNLOAD") {
ApplicationInterface.addAssetToWorldFromURL(event.slice(18));
}
});
// Hook up callback for clara.io download from the marketplace.
Connections {
id: eventBridgeConnection
target: eventBridge
onWebEventReceived: {
if (message.slice(0, 17) === "CLARA.IO DOWNLOAD") {
ApplicationInterface.addAssetToWorldFromURL(message.slice(18));
}
}
}
onLoaded: {
if (loader.item.hasOwnProperty("sendToScript")) {
loader.item.sendToScript.connect(tabletRoot.sendToScript);
}

View file

@ -136,7 +136,7 @@ Item {
for (var i = 0; i < sections.length; i++) {
totalHeight += sections[i].height + sections[i].getPreferencesHeight();
}
var bottomPadding = 100;
var bottomPadding = 170;
return (totalHeight + bottomPadding);
}
}

View file

@ -27,8 +27,6 @@ Item {
property bool keyboardRaised: false
property bool punctuationMode: false
property alias eventBridge: eventBridgeWrapper.eventBridge
anchors.fill: parent
BaseWebView {
@ -43,14 +41,6 @@ Item {
bottom: footer.top
}
QtObject {
id: eventBridgeWrapper
WebChannel.id: "eventBridgeWrapper"
property var eventBridge;
}
webChannel.registeredObjects: [eventBridgeWrapper]
// Create a global EventBridge object for raiseAndLowerKeyboard.
WebEngineScript {
id: createGlobalEventBridge
@ -68,6 +58,11 @@ Item {
}
userScripts: [ createGlobalEventBridge, raiseAndLowerKeyboard ]
Component.onCompleted: {
webChannel.registerObject("eventBridge", eventBridge);
webChannel.registerObject("eventBridgeWrapper", eventBridgeWrapper);
}
}
Rectangle {

Binary file not shown.

Image file changed (20 KiB).

View file

@ -11,6 +11,9 @@
#include "Application.h"
#include <chrono>
#include <thread>
#include <gl/Config.h>
#include <glm/glm.hpp>
#include <glm/gtx/component_wise.hpp>
@ -69,6 +72,7 @@
#include <ErrorDialog.h>
#include <FileScriptingInterface.h>
#include <Finally.h>
#include <FingerprintUtils.h>
#include <FramebufferCache.h>
#include <gpu/Batch.h>
#include <gpu/Context.h>
@ -110,7 +114,9 @@
#include <render/RenderFetchCullSortTask.h>
#include <RenderDeferredTask.h>
#include <RenderForwardTask.h>
#include <RenderViewTask.h>
#include <ResourceCache.h>
#include <ResourceRequest.h>
#include <SandboxUtils.h>
#include <SceneScriptingInterface.h>
#include <ScriptEngines.h>
@ -144,10 +150,8 @@
#include "InterfaceLogging.h"
#include "LODManager.h"
#include "ModelPackager.h"
#include "networking/HFWebEngineProfile.h"
#include "networking/HFTabletWebEngineProfile.h"
#include "networking/FileTypeProfile.h"
#include "scripting/Audio.h"
#include "networking/CloseEventSender.h"
#include "scripting/TestScriptingInterface.h"
#include "scripting/AccountScriptingInterface.h"
#include "scripting/AssetMappingsScriptingInterface.h"
@ -244,6 +248,8 @@ static const QString DESKTOP_LOCATION = QStandardPaths::writableLocation(QStanda
Setting::Handle<int> maxOctreePacketsPerSecond("maxOctreePPS", DEFAULT_MAX_OCTREE_PPS);
static const QString MARKETPLACE_CDN_HOSTNAME = "mpassets.highfidelity.com";
static const int INTERVAL_TO_CHECK_HMD_WORN_STATUS = 500; // milliseconds
static const QString DESKTOP_DISPLAY_PLUGIN_NAME = "Desktop";
static const QString SYSTEM_TABLET = "com.highfidelity.interface.tablet.system";
@ -438,6 +444,10 @@ bool setupEssentials(int& argc, char** argv, bool runningMarkerExisted) {
const char* portStr = getCmdOption(argc, constArgv, "--listenPort");
const int listenPort = portStr ? atoi(portStr) : INVALID_PORT;
static const auto SUPPRESS_SETTINGS_RESET = "--suppress-settings-reset";
bool suppressPrompt = cmdOptionExists(argc, const_cast<const char**>(argv), SUPPRESS_SETTINGS_RESET);
bool previousSessionCrashed = CrashHandler::checkForResetSettings(runningMarkerExisted, suppressPrompt);
Setting::init();
if (auto steamClient = PluginManager::getInstance()->getSteamClientPlugin()) {
@ -458,10 +468,6 @@ bool setupEssentials(int& argc, char** argv, bool runningMarkerExisted) {
QCoreApplication::addLibraryPath(audioDLLPath);
#endif
static const auto SUPPRESS_SETTINGS_RESET = "--suppress-settings-reset";
bool suppressPrompt = cmdOptionExists(argc, const_cast<const char**>(argv), SUPPRESS_SETTINGS_RESET);
bool previousSessionCrashed = CrashHandler::checkForResetSettings(runningMarkerExisted, suppressPrompt);
DependencyManager::registerInheritance<LimitedNodeList, NodeList>();
DependencyManager::registerInheritance<AvatarHashMap, AvatarManager>();
DependencyManager::registerInheritance<EntityDynamicFactoryInterface, InterfaceDynamicFactory>();
@ -536,6 +542,7 @@ bool setupEssentials(int& argc, char** argv, bool runningMarkerExisted) {
DependencyManager::set<AvatarBookmarks>();
DependencyManager::set<LocationBookmarks>();
DependencyManager::set<Snapshot>();
DependencyManager::set<CloseEventSender>();
return previousSessionCrashed;
}
@ -649,6 +656,7 @@ Application::Application(int& argc, char** argv, QElapsedTimer& startupTimer, bo
Model::setAbstractViewStateInterface(this); // The model class will sometimes need to know view state details from us
auto nodeList = DependencyManager::get<NodeList>();
nodeList->startThread();
// Set up a watchdog thread to intentionally crash the application on deadlocks
_deadlockWatchdogThread = new DeadlockWatchdogThread();
@ -674,25 +682,11 @@ Application::Application(int& argc, char** argv, QElapsedTimer& startupTimer, bo
updateHeartbeat();
// start the nodeThread so its event loop is running
QThread* nodeThread = new QThread(this);
nodeThread->setObjectName("NodeList Thread");
nodeThread->start();
// make sure the node thread is given highest priority
nodeThread->setPriority(QThread::TimeCriticalPriority);
// setup a timer for domain-server check ins
QTimer* domainCheckInTimer = new QTimer(nodeList.data());
connect(domainCheckInTimer, &QTimer::timeout, nodeList.data(), &NodeList::sendDomainServerCheckIn);
domainCheckInTimer->start(DOMAIN_SERVER_CHECK_IN_MSECS);
// put the NodeList and datagram processing on the node thread
nodeList->moveToThread(nodeThread);
// put the audio processing on a separate thread
QThread* audioThread = new QThread();
audioThread->setObjectName("Audio Thread");
auto audioIO = DependencyManager::get<AudioClient>();
audioIO->setPositionGetter([]{
@ -708,7 +702,6 @@ Application::Application(int& argc, char** argv, QElapsedTimer& startupTimer, bo
return myAvatar ? myAvatar->getOrientationForAudio() : Quaternions::IDENTITY;
});
audioIO->moveToThread(audioThread);
recording::Frame::registerFrameHandler(AudioConstants::getAudioFrameName(), [=](recording::Frame::ConstPointer frame) {
audioIO->handleRecordedAudioInput(frame->data);
});
@ -720,11 +713,7 @@ Application::Application(int& argc, char** argv, QElapsedTimer& startupTimer, bo
recorder->recordFrame(AUDIO_FRAME_TYPE, audio);
}
});
connect(audioThread, &QThread::started, audioIO.data(), &AudioClient::start);
connect(audioIO.data(), &AudioClient::destroyed, audioThread, &QThread::quit);
connect(audioThread, &QThread::finished, audioThread, &QThread::deleteLater);
audioThread->start();
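// AudioClient now manages its own worker thread internally; startThread() replaces the manual QThread setup and signal wiring that used to live here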
audioIO->startThread();
auto audioScriptingInterface = DependencyManager::set<AudioScriptingInterface, scripting::Audio>();
connect(audioIO.data(), &AudioClient::mutedByMixer, audioScriptingInterface.data(), &AudioScriptingInterface::mutedByMixer);
@ -750,12 +739,7 @@ Application::Application(int& argc, char** argv, QElapsedTimer& startupTimer, bo
updateHeartbeat();
// Setup MessagesClient
auto messagesClient = DependencyManager::get<MessagesClient>();
QThread* messagesThread = new QThread;
messagesThread->setObjectName("Messages Client Thread");
messagesClient->moveToThread(messagesThread);
connect(messagesThread, &QThread::started, messagesClient.data(), &MessagesClient::init);
messagesThread->start();
DependencyManager::get<MessagesClient>()->startThread();
const DomainHandler& domainHandler = nodeList->getDomainHandler();
@ -950,6 +934,13 @@ Application::Application(int& argc, char** argv, QElapsedTimer& startupTimer, bo
properties["processor_l3_cache_count"] = procInfo.numProcessorCachesL3;
}
// add firstRun flag from settings to launch event
Setting::Handle<bool> firstRun { Settings::firstRun, true };
properties["first_run"] = firstRun.get();
// add the user's machine ID to the launch event
properties["machine_fingerprint"] = uuidStringWithoutCurlyBraces(FingerprintUtils::getMachineFingerprint());
UserActivityLogger::getInstance().logAction("launch", properties);
// Tell our entity edit sender about our known jurisdictions
@ -1309,6 +1300,8 @@ Application::Application(int& argc, char** argv, QElapsedTimer& startupTimer, bo
properties["kbps_in"] = bandwidthRecorder->getCachedTotalAverageInputKilobitsPerSecond();
properties["kbps_out"] = bandwidthRecorder->getCachedTotalAverageOutputKilobitsPerSecond();
properties["atp_in_kbps"] = bandwidthRecorder->getAverageInputKilobitsPerSecond(NodeType::AssetServer);
auto nodeList = DependencyManager::get<NodeList>();
SharedNodePointer entityServerNode = nodeList->soloNodeOfType(NodeType::EntityServer);
SharedNodePointer audioMixerNode = nodeList->soloNodeOfType(NodeType::AudioMixer);
@ -1322,8 +1315,61 @@ Application::Application(int& argc, char** argv, QElapsedTimer& startupTimer, bo
properties["messages_ping"] = messagesMixerNode ? messagesMixerNode->getPingMs() : -1;
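// Record per-download progress (filename, bytes received/total, attempts) for each active resource request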
auto loadingRequests = ResourceCache::getLoadingRequests();
QJsonArray loadingRequestsStats;
for (const auto& request : loadingRequests) {
QJsonObject requestStats;
requestStats["filename"] = request->getURL().fileName();
requestStats["received"] = request->getBytesReceived();
requestStats["total"] = request->getBytesTotal();
requestStats["attempts"] = (int)request->getDownloadAttempts();
loadingRequestsStats.append(requestStats);
}
properties["active_downloads"] = loadingRequests.size();
properties["pending_downloads"] = ResourceCache::getPendingRequestCount();
properties["active_downloads_details"] = loadingRequestsStats;
auto statTracker = DependencyManager::get<StatTracker>();
properties["processing_resources"] = statTracker->getStat("Processing").toInt();
properties["pending_processing_resources"] = statTracker->getStat("PendingProcessing").toInt();
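// Aggregate resource request counters from the StatTracker, broken down by protocol (ATP / HTTP / file)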
QJsonObject startedRequests;
startedRequests["atp"] = statTracker->getStat(STAT_ATP_REQUEST_STARTED).toInt();
startedRequests["http"] = statTracker->getStat(STAT_HTTP_REQUEST_STARTED).toInt();
startedRequests["file"] = statTracker->getStat(STAT_FILE_REQUEST_STARTED).toInt();
startedRequests["total"] = startedRequests["atp"].toInt() + startedRequests["http"].toInt()
+ startedRequests["file"].toInt();
properties["started_requests"] = startedRequests;
QJsonObject successfulRequests;
successfulRequests["atp"] = statTracker->getStat(STAT_ATP_REQUEST_SUCCESS).toInt();
successfulRequests["http"] = statTracker->getStat(STAT_HTTP_REQUEST_SUCCESS).toInt();
successfulRequests["file"] = statTracker->getStat(STAT_FILE_REQUEST_SUCCESS).toInt();
successfulRequests["total"] = successfulRequests["atp"].toInt() + successfulRequests["http"].toInt()
+ successfulRequests["file"].toInt();
properties["successful_requests"] = successfulRequests;
QJsonObject failedRequests;
failedRequests["atp"] = statTracker->getStat(STAT_ATP_REQUEST_FAILED).toInt();
failedRequests["http"] = statTracker->getStat(STAT_HTTP_REQUEST_FAILED).toInt();
failedRequests["file"] = statTracker->getStat(STAT_FILE_REQUEST_FAILED).toInt();
failedRequests["total"] = failedRequests["atp"].toInt() + failedRequests["http"].toInt()
+ failedRequests["file"].toInt();
properties["failed_requests"] = failedRequests;
QJsonObject cacheRequests;
cacheRequests["atp"] = statTracker->getStat(STAT_ATP_REQUEST_CACHE).toInt();
cacheRequests["http"] = statTracker->getStat(STAT_HTTP_REQUEST_CACHE).toInt();
cacheRequests["total"] = cacheRequests["atp"].toInt() + cacheRequests["http"].toInt();
properties["cache_requests"] = cacheRequests;
QJsonObject atpMappingRequests;
atpMappingRequests["started"] = statTracker->getStat(STAT_ATP_MAPPING_REQUEST_STARTED).toInt();
atpMappingRequests["failed"] = statTracker->getStat(STAT_ATP_MAPPING_REQUEST_FAILED).toInt();
atpMappingRequests["successful"] = statTracker->getStat(STAT_ATP_MAPPING_REQUEST_SUCCESS).toInt();
properties["atp_mapping_requests"] = atpMappingRequests;
properties["throttled"] = _displayPlugin ? _displayPlugin->isThrottled() : false;
@ -1339,9 +1385,43 @@ Application::Application(int& argc, char** argv, QElapsedTimer& startupTimer, bo
properties["deleted_entity_cnt"] = entityActivityTracking.deletedEntityCount;
properties["edited_entity_cnt"] = entityActivityTracking.editedEntityCount;
NodeToOctreeSceneStats* octreeServerSceneStats = getOcteeSceneStats();
unsigned long totalServerOctreeElements = 0;
for (NodeToOctreeSceneStatsIterator i = octreeServerSceneStats->begin(); i != octreeServerSceneStats->end(); i++) {
totalServerOctreeElements += i->second.getTotalElements();
}
properties["local_octree_elements"] = (qint64) OctreeElement::getInternalNodeCount();
properties["server_octree_elements"] = (qint64) totalServerOctreeElements;
properties["active_display_plugin"] = getActiveDisplayPlugin()->getName();
properties["using_hmd"] = isHMDMode();
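// Find the first HMD display plugin that supports auto-switching between Desktop and HMD display modes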
_autoSwitchDisplayModeSupportedHMDPlugin = nullptr;
foreach(DisplayPluginPointer displayPlugin, PluginManager::getInstance()->getDisplayPlugins()) {
if (displayPlugin->isHmd() &&
displayPlugin->getSupportsAutoSwitch()) {
_autoSwitchDisplayModeSupportedHMDPlugin = displayPlugin;
_autoSwitchDisplayModeSupportedHMDPluginName =
_autoSwitchDisplayModeSupportedHMDPlugin->getName();
_previousHMDWornStatus =
_autoSwitchDisplayModeSupportedHMDPlugin->isDisplayVisible();
break;
}
}
if (_autoSwitchDisplayModeSupportedHMDPlugin) {
if (getActiveDisplayPlugin() != _autoSwitchDisplayModeSupportedHMDPlugin &&
!_autoSwitchDisplayModeSupportedHMDPlugin->isSessionActive()) {
startHMDStandBySession();
}
// Poll periodically to check whether the user is wearing the HMD, and switch the display mode accordingly:
// if the user puts on the HMD, switch to VR mode; if the user takes it off, switch to Desktop mode.
QTimer* autoSwitchDisplayModeTimer = new QTimer(this);
connect(autoSwitchDisplayModeTimer, SIGNAL(timeout()), this, SLOT(switchDisplayMode()));
autoSwitchDisplayModeTimer->start(INTERVAL_TO_CHECK_HMD_WORN_STATUS);
}
auto glInfo = getGLContextData();
properties["gl_info"] = glInfo;
properties["gpu_used_memory"] = (int)BYTES_TO_MB(gpu::Context::getUsedGPUMemSize());
@ -1364,6 +1444,8 @@ Application::Application(int& argc, char** argv, QElapsedTimer& startupTimer, bo
lastLeftHandPose = leftHandPose;
lastRightHandPose = rightHandPose;
properties["local_socket_changes"] = DependencyManager::get<StatTracker>()->getStat(LOCAL_SOCKET_CHANGE_STAT).toInt();
UserActivityLogger::getInstance().logAction("stats", properties);
});
sendStatsTimer->start();
@ -1463,6 +1545,8 @@ Application::Application(int& argc, char** argv, QElapsedTimer& startupTimer, bo
updateSystemTabletMode();
connect(&_myCamera, &Camera::modeUpdated, this, &Application::cameraModeChanged);
qCDebug(interfaceapp) << "Metaverse session ID is" << uuidStringWithoutCurlyBraces(accountManager->getSessionID());
}
void Application::domainConnectionRefused(const QString& reasonMessage, int reasonCodeInt, const QString& extraInfo) {
@ -1571,6 +1655,12 @@ void Application::aboutToQuit() {
}
getActiveDisplayPlugin()->deactivate();
if (_autoSwitchDisplayModeSupportedHMDPlugin
&& _autoSwitchDisplayModeSupportedHMDPlugin->isSessionActive()) {
_autoSwitchDisplayModeSupportedHMDPlugin->endSession();
}
// use the CloseEventSender via a QThread to send an event that says the user asked for the app to close
DependencyManager::get<CloseEventSender>()->startThread();
// Hide Running Scripts dialog so that it gets destroyed in an orderly manner; prevents warnings at shutdown.
DependencyManager::get<OffscreenUi>()->hide("RunningScripts");
@ -1658,6 +1748,8 @@ void Application::cleanupBeforeQuit() {
// stop QML
DependencyManager::destroy<OffscreenUi>();
DependencyManager::destroy<OffscreenQmlSurfaceCache>();
if (_snapshotSoundInjector != nullptr) {
_snapshotSoundInjector->stop();
}
@ -1685,6 +1777,10 @@ Application::~Application() {
_physicsEngine->setCharacterController(nullptr);
// the _shapeManager should have zero references
_shapeManager.collectGarbage();
assert(_shapeManager.getNumShapes() == 0);
// shutdown render engine
_main3DScene = nullptr;
_renderEngine = nullptr;
@ -1713,15 +1809,9 @@ Application::~Application() {
ResourceManager::cleanup();
QThread* nodeThread = DependencyManager::get<NodeList>()->thread();
// remove the NodeList from the DependencyManager
DependencyManager::destroy<NodeList>();
// ask the node thread to quit and wait until it is done
nodeThread->quit();
nodeThread->wait();
Leapmotion::destroy();
if (auto steamClient = PluginManager::getInstance()->getSteamClientPlugin()) {
@ -1736,6 +1826,15 @@ Application::~Application() {
_window->deleteLater();
// make sure that the quit event has finished sending before we take the application down
auto closeEventSender = DependencyManager::get<CloseEventSender>();
while (!closeEventSender->hasFinishedQuitEvent() && !closeEventSender->hasTimedOutQuitEvent()) {
// sleep a little so we're not spinning at 100%
std::this_thread::sleep_for(std::chrono::milliseconds(10));
}
// quit the thread used by the close event sender
closeEventSender->thread()->quit();
// Can't log to file past this point, FileLogger about to be deleted
qInstallMessageHandler(LogHandler::verboseMessageHandler);
}
@ -1771,15 +1870,9 @@ void Application::initializeGL() {
// Set up the render engine
render::CullFunctor cullFunctor = LODManager::shouldRender;
_renderEngine->addJob<RenderShadowTask>("RenderShadowTask", cullFunctor);
const auto items = _renderEngine->addJob<RenderFetchCullSortTask>("FetchCullSort", cullFunctor);
assert(items.canCast<RenderFetchCullSortTask::Output>());
static const QString RENDER_FORWARD = "HIFI_RENDER_FORWARD";
if (QProcessEnvironment::systemEnvironment().contains(RENDER_FORWARD)) {
_renderEngine->addJob<RenderForwardTask>("Forward", items);
} else {
_renderEngine->addJob<RenderDeferredTask>("RenderDeferredTask", items);
}
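// RenderViewTask replaces the explicit forward/deferred job setup above; the isDeferred flag (still driven by HIFI_RENDER_FORWARD) presumably selects the pipeline inside it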
bool isDeferred = !QProcessEnvironment::systemEnvironment().contains(RENDER_FORWARD);
_renderEngine->addJob<RenderViewTask>("RenderMainView", cullFunctor, isDeferred);
_renderEngine->load();
_renderEngine->registerScene(_main3DScene);
@ -1828,14 +1921,10 @@ void Application::initializeUi() {
UpdateDialog::registerType();
qmlRegisterType<Preference>("Hifi", 1, 0, "Preference");
qmlRegisterType<HFWebEngineProfile>("HFWebEngineProfile", 1, 0, "HFWebEngineProfile");
qmlRegisterType<HFTabletWebEngineProfile>("HFTabletWebEngineProfile", 1, 0, "HFTabletWebEngineProfile");
qmlRegisterType<FileTypeProfile>("FileTypeProfile", 1, 0, "FileTypeProfile");
auto offscreenUi = DependencyManager::get<OffscreenUi>();
offscreenUi->create(_glWidget->qglContext());
auto rootContext = offscreenUi->getRootContext();
auto surfaceContext = offscreenUi->getSurfaceContext();
offscreenUi->setProxyWindow(_window->windowHandle());
offscreenUi->setBaseUrl(QUrl::fromLocalFile(PathUtils::resourcesPath() + "/qml/"));
@ -1847,7 +1936,7 @@ void Application::initializeUi() {
// do better detection in the offscreen UI of what has focus
offscreenUi->setNavigationFocused(false);
auto engine = rootContext->engine();
auto engine = surfaceContext->engine();
connect(engine, &QQmlEngine::quit, [] {
qApp->quit();
});
@ -1856,78 +1945,77 @@ void Application::initializeUi() {
// For some reason there is already an "Application" object in the QML context,
// though I can't find it. Hence, "ApplicationInterface"
rootContext->setContextProperty("ApplicationInterface", this);
rootContext->setContextProperty("Audio", DependencyManager::get<AudioScriptingInterface>().data());
rootContext->setContextProperty("AudioStats", DependencyManager::get<AudioClient>()->getStats().data());
rootContext->setContextProperty("AudioScope", DependencyManager::get<AudioScope>().data());
surfaceContext->setContextProperty("Audio", DependencyManager::get<AudioScriptingInterface>().data());
surfaceContext->setContextProperty("AudioStats", DependencyManager::get<AudioClient>()->getStats().data());
surfaceContext->setContextProperty("AudioScope", DependencyManager::get<AudioScope>().data());
rootContext->setContextProperty("Controller", DependencyManager::get<controller::ScriptingInterface>().data());
rootContext->setContextProperty("Entities", DependencyManager::get<EntityScriptingInterface>().data());
surfaceContext->setContextProperty("Controller", DependencyManager::get<controller::ScriptingInterface>().data());
surfaceContext->setContextProperty("Entities", DependencyManager::get<EntityScriptingInterface>().data());
_fileDownload = new FileScriptingInterface(engine);
rootContext->setContextProperty("File", _fileDownload);
surfaceContext->setContextProperty("File", _fileDownload);
connect(_fileDownload, &FileScriptingInterface::unzipResult, this, &Application::handleUnzip);
rootContext->setContextProperty("MyAvatar", getMyAvatar().get());
rootContext->setContextProperty("Messages", DependencyManager::get<MessagesClient>().data());
rootContext->setContextProperty("Recording", DependencyManager::get<RecordingScriptingInterface>().data());
rootContext->setContextProperty("Preferences", DependencyManager::get<Preferences>().data());
rootContext->setContextProperty("AddressManager", DependencyManager::get<AddressManager>().data());
rootContext->setContextProperty("FrameTimings", &_frameTimingsScriptingInterface);
rootContext->setContextProperty("Rates", new RatesScriptingInterface(this));
rootContext->setContextProperty("pathToFonts", "../../");
surfaceContext->setContextProperty("MyAvatar", getMyAvatar().get());
surfaceContext->setContextProperty("Messages", DependencyManager::get<MessagesClient>().data());
surfaceContext->setContextProperty("Recording", DependencyManager::get<RecordingScriptingInterface>().data());
surfaceContext->setContextProperty("Preferences", DependencyManager::get<Preferences>().data());
surfaceContext->setContextProperty("AddressManager", DependencyManager::get<AddressManager>().data());
surfaceContext->setContextProperty("FrameTimings", &_frameTimingsScriptingInterface);
surfaceContext->setContextProperty("Rates", new RatesScriptingInterface(this));
rootContext->setContextProperty("TREE_SCALE", TREE_SCALE);
rootContext->setContextProperty("Quat", new Quat());
rootContext->setContextProperty("Vec3", new Vec3());
rootContext->setContextProperty("Uuid", new ScriptUUID());
rootContext->setContextProperty("Assets", DependencyManager::get<AssetMappingsScriptingInterface>().data());
surfaceContext->setContextProperty("TREE_SCALE", TREE_SCALE);
// FIXME Quat and Vec3 won't work with QJSEngine used by QML
surfaceContext->setContextProperty("Quat", new Quat());
surfaceContext->setContextProperty("Vec3", new Vec3());
surfaceContext->setContextProperty("Uuid", new ScriptUUID());
surfaceContext->setContextProperty("Assets", DependencyManager::get<AssetMappingsScriptingInterface>().data());
rootContext->setContextProperty("AvatarList", DependencyManager::get<AvatarManager>().data());
rootContext->setContextProperty("Users", DependencyManager::get<UsersScriptingInterface>().data());
surfaceContext->setContextProperty("AvatarList", DependencyManager::get<AvatarManager>().data());
surfaceContext->setContextProperty("Users", DependencyManager::get<UsersScriptingInterface>().data());
rootContext->setContextProperty("UserActivityLogger", DependencyManager::get<UserActivityLoggerScriptingInterface>().data());
surfaceContext->setContextProperty("UserActivityLogger", DependencyManager::get<UserActivityLoggerScriptingInterface>().data());
rootContext->setContextProperty("Camera", &_myCamera);
surfaceContext->setContextProperty("Camera", &_myCamera);
#if defined(Q_OS_MAC) || defined(Q_OS_WIN)
rootContext->setContextProperty("SpeechRecognizer", DependencyManager::get<SpeechRecognizer>().data());
surfaceContext->setContextProperty("SpeechRecognizer", DependencyManager::get<SpeechRecognizer>().data());
#endif
rootContext->setContextProperty("Overlays", &_overlays);
rootContext->setContextProperty("Window", DependencyManager::get<WindowScriptingInterface>().data());
rootContext->setContextProperty("MenuInterface", MenuScriptingInterface::getInstance());
rootContext->setContextProperty("Stats", Stats::getInstance());
rootContext->setContextProperty("Settings", SettingsScriptingInterface::getInstance());
rootContext->setContextProperty("ScriptDiscoveryService", DependencyManager::get<ScriptEngines>().data());
rootContext->setContextProperty("AvatarBookmarks", DependencyManager::get<AvatarBookmarks>().data());
rootContext->setContextProperty("LocationBookmarks", DependencyManager::get<LocationBookmarks>().data());
surfaceContext->setContextProperty("Overlays", &_overlays);
surfaceContext->setContextProperty("Window", DependencyManager::get<WindowScriptingInterface>().data());
surfaceContext->setContextProperty("MenuInterface", MenuScriptingInterface::getInstance());
surfaceContext->setContextProperty("Stats", Stats::getInstance());
surfaceContext->setContextProperty("Settings", SettingsScriptingInterface::getInstance());
surfaceContext->setContextProperty("ScriptDiscoveryService", DependencyManager::get<ScriptEngines>().data());
surfaceContext->setContextProperty("AvatarBookmarks", DependencyManager::get<AvatarBookmarks>().data());
surfaceContext->setContextProperty("LocationBookmarks", DependencyManager::get<LocationBookmarks>().data());
// Caches
rootContext->setContextProperty("AnimationCache", DependencyManager::get<AnimationCache>().data());
rootContext->setContextProperty("TextureCache", DependencyManager::get<TextureCache>().data());
rootContext->setContextProperty("ModelCache", DependencyManager::get<ModelCache>().data());
rootContext->setContextProperty("SoundCache", DependencyManager::get<SoundCache>().data());
surfaceContext->setContextProperty("AnimationCache", DependencyManager::get<AnimationCache>().data());
surfaceContext->setContextProperty("TextureCache", DependencyManager::get<TextureCache>().data());
surfaceContext->setContextProperty("ModelCache", DependencyManager::get<ModelCache>().data());
surfaceContext->setContextProperty("SoundCache", DependencyManager::get<SoundCache>().data());
rootContext->setContextProperty("Account", AccountScriptingInterface::getInstance());
rootContext->setContextProperty("Tablet", DependencyManager::get<TabletScriptingInterface>().data());
rootContext->setContextProperty("DialogsManager", _dialogsManagerScriptingInterface);
rootContext->setContextProperty("GlobalServices", GlobalServicesScriptingInterface::getInstance());
rootContext->setContextProperty("FaceTracker", DependencyManager::get<DdeFaceTracker>().data());
rootContext->setContextProperty("AvatarManager", DependencyManager::get<AvatarManager>().data());
rootContext->setContextProperty("UndoStack", &_undoStackScriptingInterface);
rootContext->setContextProperty("LODManager", DependencyManager::get<LODManager>().data());
rootContext->setContextProperty("Paths", DependencyManager::get<PathUtils>().data());
rootContext->setContextProperty("HMD", DependencyManager::get<HMDScriptingInterface>().data());
rootContext->setContextProperty("Scene", DependencyManager::get<SceneScriptingInterface>().data());
rootContext->setContextProperty("Render", _renderEngine->getConfiguration().get());
rootContext->setContextProperty("Reticle", getApplicationCompositor().getReticleInterface());
rootContext->setContextProperty("Snapshot", DependencyManager::get<Snapshot>().data());
surfaceContext->setContextProperty("Account", AccountScriptingInterface::getInstance());
surfaceContext->setContextProperty("Tablet", DependencyManager::get<TabletScriptingInterface>().data());
surfaceContext->setContextProperty("DialogsManager", _dialogsManagerScriptingInterface);
surfaceContext->setContextProperty("GlobalServices", GlobalServicesScriptingInterface::getInstance());
surfaceContext->setContextProperty("FaceTracker", DependencyManager::get<DdeFaceTracker>().data());
surfaceContext->setContextProperty("AvatarManager", DependencyManager::get<AvatarManager>().data());
surfaceContext->setContextProperty("UndoStack", &_undoStackScriptingInterface);
surfaceContext->setContextProperty("LODManager", DependencyManager::get<LODManager>().data());
surfaceContext->setContextProperty("Paths", DependencyManager::get<PathUtils>().data());
surfaceContext->setContextProperty("HMD", DependencyManager::get<HMDScriptingInterface>().data());
surfaceContext->setContextProperty("Scene", DependencyManager::get<SceneScriptingInterface>().data());
surfaceContext->setContextProperty("Render", _renderEngine->getConfiguration().get());
surfaceContext->setContextProperty("Reticle", getApplicationCompositor().getReticleInterface());
surfaceContext->setContextProperty("Snapshot", DependencyManager::get<Snapshot>().data());
rootContext->setContextProperty("ApplicationCompositor", &getApplicationCompositor());
surfaceContext->setContextProperty("ApplicationCompositor", &getApplicationCompositor());
rootContext->setContextProperty("AvatarInputs", AvatarInputs::getInstance());
surfaceContext->setContextProperty("AvatarInputs", AvatarInputs::getInstance());
if (auto steamClient = PluginManager::getInstance()->getSteamClientPlugin()) {
rootContext->setContextProperty("Steam", new SteamScriptingInterface(engine, steamClient.get()));
surfaceContext->setContextProperty("Steam", new SteamScriptingInterface(engine, steamClient.get()));
}
@ -2066,7 +2154,7 @@ void Application::paintGL() {
_myCamera.setOrientation(glm::quat_cast(camMat));
} else {
_myCamera.setPosition(myAvatar->getDefaultEyePosition());
_myCamera.setOrientation(myAvatar->getMyHead()->getCameraOrientation());
_myCamera.setOrientation(myAvatar->getMyHead()->getHeadOrientation());
}
} else if (_myCamera.getMode() == CAMERA_MODE_THIRD_PERSON) {
if (isHMDMode()) {
@ -2186,6 +2274,9 @@ void Application::paintGL() {
});
renderArgs._context->setStereoProjections(eyeProjections);
renderArgs._context->setStereoViews(eyeOffsets);
// Configure the type of display / stereo
renderArgs._displayMode = (isHMDMode() ? RenderArgs::STEREO_HMD : RenderArgs::STEREO_MONITOR);
}
renderArgs._blitFramebuffer = finalFramebuffer;
displaySide(&renderArgs, _myCamera);
@ -2379,15 +2470,16 @@ void Application::handleSandboxStatus(QNetworkReply* reply) {
// Check HMD use (may be technically available without being in use)
bool hasHMD = PluginUtils::isHMDAvailable();
bool isUsingHMD = hasHMD && hasHandControllers && _displayPlugin->isHmd();
bool isUsingHMD = _displayPlugin->isHmd();
bool isUsingHMDAndHandControllers = hasHMD && hasHandControllers && isUsingHMD;
Setting::Handle<bool> tutorialComplete{ "tutorialComplete", false };
Setting::Handle<bool> firstRun{ Settings::firstRun, true };
bool isTutorialComplete = tutorialComplete.get();
bool shouldGoToTutorial = isUsingHMD && hasTutorialContent && !isTutorialComplete;
bool shouldGoToTutorial = isUsingHMDAndHandControllers && hasTutorialContent && !isTutorialComplete;
qCDebug(interfaceapp) << "HMD:" << hasHMD << ", Hand Controllers: " << hasHandControllers << ", Using HMD: " << isUsingHMD;
qCDebug(interfaceapp) << "HMD:" << hasHMD << ", Hand Controllers: " << hasHandControllers << ", Using HMD: " << isUsingHMDAndHandControllers;
qCDebug(interfaceapp) << "Tutorial version:" << contentVersion << ", sufficient:" << hasTutorialContent <<
", complete:" << isTutorialComplete << ", should go:" << shouldGoToTutorial;
@ -2401,10 +2493,18 @@ void Application::handleSandboxStatus(QNetworkReply* reply) {
const QString TUTORIAL_PATH = "/tutorial_begin";
static const QString SENT_TO_TUTORIAL = "tutorial";
static const QString SENT_TO_PREVIOUS_LOCATION = "previous_location";
static const QString SENT_TO_ENTRY = "entry";
static const QString SENT_TO_SANDBOX = "sandbox";
QString sentTo;
if (shouldGoToTutorial) {
if (sandboxIsRunning) {
qCDebug(interfaceapp) << "Home sandbox appears to be running, going to Home.";
DependencyManager::get<AddressManager>()->goToLocalSandbox(TUTORIAL_PATH);
sentTo = SENT_TO_TUTORIAL;
} else {
qCDebug(interfaceapp) << "Home sandbox does not appear to be running, going to Entry.";
if (firstRun.get()) {
@ -2412,8 +2512,10 @@ void Application::handleSandboxStatus(QNetworkReply* reply) {
}
if (addressLookupString.isEmpty()) {
DependencyManager::get<AddressManager>()->goToEntry();
sentTo = SENT_TO_ENTRY;
} else {
DependencyManager::get<AddressManager>()->loadSettings(addressLookupString);
sentTo = SENT_TO_PREVIOUS_LOCATION;
}
}
} else {
@ -2426,23 +2528,40 @@ void Application::handleSandboxStatus(QNetworkReply* reply) {
// If this is a first run we short-circuit the address passed in
if (isFirstRun) {
if (isUsingHMD) {
if (isUsingHMDAndHandControllers) {
if (sandboxIsRunning) {
qCDebug(interfaceapp) << "Home sandbox appears to be running, going to Home.";
DependencyManager::get<AddressManager>()->goToLocalSandbox();
sentTo = SENT_TO_SANDBOX;
} else {
qCDebug(interfaceapp) << "Home sandbox does not appear to be running, going to Entry.";
DependencyManager::get<AddressManager>()->goToEntry();
sentTo = SENT_TO_ENTRY;
}
} else {
DependencyManager::get<AddressManager>()->goToEntry();
sentTo = SENT_TO_ENTRY;
}
} else {
qCDebug(interfaceapp) << "Not first run... going to" << qPrintable(addressLookupString.isEmpty() ? QString("previous location") : addressLookupString);
DependencyManager::get<AddressManager>()->loadSettings(addressLookupString);
sentTo = SENT_TO_PREVIOUS_LOCATION;
}
}
UserActivityLogger::getInstance().logAction("startup_sent_to", {
{ "sent_to", sentTo },
{ "sandbox_is_running", sandboxIsRunning },
{ "has_hmd", hasHMD },
{ "has_hand_controllers", hasHandControllers },
{ "is_using_hmd", isUsingHMD },
{ "is_using_hmd_and_hand_controllers", isUsingHMDAndHandControllers },
{ "content_version", contentVersion },
{ "is_tutorial_complete", isTutorialComplete },
{ "has_tutorial_content", hasTutorialContent },
{ "should_go_to_tutorial", shouldGoToTutorial }
});
_connectionMonitor.init();
// After the constructor has completed, set firstRun to false.
@ -2777,6 +2896,17 @@ void Application::keyPressEvent(QKeyEvent* event) {
if (isShifted && isMeta && !isOption) {
Menu::getInstance()->triggerOption(MenuOption::SuppressShortTimings);
} else if (!isOption && !isShifted && isMeta) {
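// Play the snapshot sound locally (in stereo) before taking the snapshot; this block appears to have been moved here out of takeSnapshot()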
AudioInjectorOptions options;
options.localOnly = true;
options.stereo = true;
if (_snapshotSoundInjector) {
_snapshotSoundInjector->setOptions(options);
_snapshotSoundInjector->restart();
} else {
QByteArray samples = _snapshotSound->getByteArray();
_snapshotSoundInjector = AudioInjector::playSound(samples, options);
}
takeSnapshot(true);
}
break;
@ -3951,6 +4081,7 @@ void Application::updateMyAvatarLookAtPosition() {
lookAtPosition.x = -lookAtPosition.x;
}
if (isHMD) {
// TODO -- this code is probably wrong, getHeadPose() returns something in sensor frame, not avatar
glm::mat4 headPose = getActiveDisplayPlugin()->getHeadPose();
glm::quat hmdRotation = glm::quat_cast(headPose);
lookAtSpot = _myCamera.getPosition() + myAvatar->getOrientation() * (hmdRotation * lookAtPosition);
@ -3992,9 +4123,10 @@ void Application::updateMyAvatarLookAtPosition() {
}
} else {
// I am not looking at anyone else, so just look forward
if (isHMD) {
glm::mat4 worldHMDMat = myAvatar->getSensorToWorldMatrix() * myAvatar->getHMDSensorMatrix();
lookAtSpot = transformPoint(worldHMDMat, glm::vec3(0.0f, 0.0f, -TREE_SCALE));
auto headPose = myAvatar->getHeadControllerPoseInSensorFrame();
if (headPose.isValid()) {
glm::mat4 worldHeadMat = myAvatar->getSensorToWorldMatrix() * headPose.getMatrix();
lookAtSpot = transformPoint(worldHeadMat, glm::vec3(0.0f, 0.0f, TREE_SCALE));
} else {
lookAtSpot = myAvatar->getHead()->getEyePosition() +
(myAvatar->getHead()->getFinalOrientationInWorldFrame() * glm::vec3(0.0f, 0.0f, -TREE_SCALE));
@ -4492,12 +4624,13 @@ void Application::update(float deltaTime) {
getEntities()->getTree()->withWriteLock([&] {
PerformanceTimer perfTimer("handleOutgoingChanges");
const VectorOfMotionStates& deactivations = _physicsEngine->getDeactivatedMotionStates();
_entitySimulation->handleDeactivatedMotionStates(deactivations);
const VectorOfMotionStates& outgoingChanges = _physicsEngine->getChangedMotionStates();
_entitySimulation->handleChangedMotionStates(outgoingChanges);
avatarManager->handleChangedMotionStates(outgoingChanges);
const VectorOfMotionStates& deactivations = _physicsEngine->getDeactivatedMotionStates();
_entitySimulation->handleDeactivatedMotionStates(deactivations);
});
if (!_aboutToQuit) {
@ -4885,12 +5018,6 @@ QRect Application::getDesirableApplicationGeometry() const {
return applicationGeometry;
}
glm::vec3 Application::getSunDirection() const {
// Sun direction is in fact just the location of the sun relative to the origin
auto skyStage = DependencyManager::get<SceneScriptingInterface>()->getSkyStage();
return skyStage->getSunLight()->getDirection();
}
// FIXME, preprocessor guard this check to occur only in DEBUG builds
static QThread * activeRenderingThread = nullptr;
@ -4947,7 +5074,7 @@ namespace render {
template <> const ItemKey payloadGetKey(const WorldBoxRenderData::Pointer& stuff) { return ItemKey::Builder::opaqueShape(); }
template <> const Item::Bound payloadGetBound(const WorldBoxRenderData::Pointer& stuff) { return Item::Bound(); }
template <> void payloadRender(const WorldBoxRenderData::Pointer& stuff, RenderArgs* args) {
if (args->_renderMode != RenderArgs::MIRROR_RENDER_MODE && Menu::getInstance()->isOptionChecked(MenuOption::WorldAxes)) {
if (Menu::getInstance()->isOptionChecked(MenuOption::WorldAxes)) {
PerformanceTimer perfTimer("worldBox");
auto& batch = *args->_batch;
@ -4957,90 +5084,6 @@ namespace render {
}
}
// Background Render Data & rendering functions
class BackgroundRenderData {
public:
typedef render::Payload<BackgroundRenderData> Payload;
typedef Payload::DataPointer Pointer;
static render::ItemID _item; // unique BackgroundRenderData
};
render::ItemID BackgroundRenderData::_item = 0;
namespace render {
template <> const ItemKey payloadGetKey(const BackgroundRenderData::Pointer& stuff) {
return ItemKey::Builder::background();
}
template <> const Item::Bound payloadGetBound(const BackgroundRenderData::Pointer& stuff) {
return Item::Bound();
}
template <> void payloadRender(const BackgroundRenderData::Pointer& background, RenderArgs* args) {
Q_ASSERT(args->_batch);
gpu::Batch& batch = *args->_batch;
// Background rendering decision
auto skyStage = DependencyManager::get<SceneScriptingInterface>()->getSkyStage();
auto backgroundMode = skyStage->getBackgroundMode();
switch (backgroundMode) {
case model::SunSkyStage::SKY_DEFAULT: {
auto scene = DependencyManager::get<SceneScriptingInterface>()->getStage();
auto sceneKeyLight = scene->getKeyLight();
scene->setSunModelEnable(false);
sceneKeyLight->setColor(ColorUtils::toVec3(KeyLightPropertyGroup::DEFAULT_KEYLIGHT_COLOR));
sceneKeyLight->setIntensity(KeyLightPropertyGroup::DEFAULT_KEYLIGHT_INTENSITY);
sceneKeyLight->setAmbientIntensity(KeyLightPropertyGroup::DEFAULT_KEYLIGHT_AMBIENT_INTENSITY);
sceneKeyLight->setDirection(KeyLightPropertyGroup::DEFAULT_KEYLIGHT_DIRECTION);
// fall through: render a skybox (if available), or the defaults (if requested)
}
case model::SunSkyStage::SKY_BOX: {
auto skybox = skyStage->getSkybox();
if (!skybox->empty()) {
PerformanceTimer perfTimer("skybox");
skybox->render(batch, args->getViewFrustum());
break;
}
// fall through: render defaults (if requested)
}
case model::SunSkyStage::SKY_DEFAULT_AMBIENT_TEXTURE: {
if (Menu::getInstance()->isOptionChecked(MenuOption::DefaultSkybox)) {
auto scene = DependencyManager::get<SceneScriptingInterface>()->getStage();
auto sceneKeyLight = scene->getKeyLight();
auto defaultSkyboxAmbientTexture = qApp->getDefaultSkyboxAmbientTexture();
if (defaultSkyboxAmbientTexture) {
sceneKeyLight->setAmbientSphere(defaultSkyboxAmbientTexture->getIrradiance());
sceneKeyLight->setAmbientMap(defaultSkyboxAmbientTexture);
} else {
static QString repeatedMessage = LogHandler::getInstance().addRepeatedMessageRegex(
"Failed to get a valid Default Skybox Ambient Texture, probably because it couldn't be found during the initialization step");
}
// fall through: render defaults skybox
} else {
break;
}
}
case model::SunSkyStage::SKY_DEFAULT_TEXTURE:
if (Menu::getInstance()->isOptionChecked(MenuOption::DefaultSkybox)) {
qApp->getDefaultSkybox()->render(batch, args->getViewFrustum());
}
break;
// Any other cases require no extra rendering
case model::SunSkyStage::NO_BACKGROUND:
default:
break;
}
}
}
void Application::displaySide(RenderArgs* renderArgs, Camera& theCamera, bool selfAvatarOnly) {
// FIXME: This preDisplayRender call is temporary until we create a separate render::scene for the mirror rendering.
@ -5064,15 +5107,6 @@ void Application::displaySide(RenderArgs* renderArgs, Camera& theCamera, bool se
// The render transaction that collects the pending changes here
render::Transaction transaction;
// FIXME: Move this out of here! Background / skybox should be driven by the entity content just like the other entities
// Background rendering decision
if (!render::Item::isValidID(BackgroundRenderData::_item)) {
auto backgroundRenderData = make_shared<BackgroundRenderData>();
auto backgroundRenderPayload = make_shared<BackgroundRenderData::Payload>(backgroundRenderData);
BackgroundRenderData::_item = _main3DScene->allocateID();
transaction.resetItem(BackgroundRenderData::_item, backgroundRenderPayload);
}
// Assuming nothing gets rendered through that
if (!selfAvatarOnly) {
if (DependencyManager::get<SceneScriptingInterface>()->shouldRenderEntities()) {
@ -5108,12 +5142,6 @@ void Application::displaySide(RenderArgs* renderArgs, Camera& theCamera, bool se
});
}
// Setup the current Zone Entity lighting
{
auto stage = DependencyManager::get<SceneScriptingInterface>()->getSkyStage();
DependencyManager::get<DeferredLightingEffect>()->setGlobalLight(stage->getSunLight());
}
{
PerformanceTimer perfTimer("SceneProcessTransaction");
_main3DScene->enqueueTransaction(transaction);
@ -6384,21 +6412,6 @@ void Application::loadAddAvatarBookmarkDialog() const {
}
void Application::takeSnapshot(bool notify, bool includeAnimated, float aspectRatio) {
//keep sound thread out of event loop scope
AudioInjectorOptions options;
options.localOnly = true;
options.stereo = true;
if (_snapshotSoundInjector) {
_snapshotSoundInjector->setOptions(options);
_snapshotSoundInjector->restart();
} else {
QByteArray samples = _snapshotSound->getByteArray();
_snapshotSoundInjector = AudioInjector::playSound(samples, options);
}
postLambdaEvent([notify, includeAnimated, aspectRatio, this] {
// Get a screenshot and save it
QString path = Snapshot::saveSnapshot(getActiveDisplayPlugin()->getScreenshot(aspectRatio));
@ -6811,6 +6824,35 @@ void Application::updateDisplayMode() {
Q_ASSERT_X(_displayPlugin, "Application::updateDisplayMode", "could not find an activated display plugin");
}
void Application::switchDisplayMode() {
if (!_autoSwitchDisplayModeSupportedHMDPlugin) {
return;
}
bool currentHMDWornStatus = _autoSwitchDisplayModeSupportedHMDPlugin->isDisplayVisible();
if (currentHMDWornStatus != _previousHMDWornStatus) {
// Switch to the respective mode as soon as currentHMDWornStatus changes
if (currentHMDWornStatus) {
qCDebug(interfaceapp) << "Switching from Desktop to HMD mode";
endHMDSession();
setActiveDisplayPlugin(_autoSwitchDisplayModeSupportedHMDPluginName);
} else {
qCDebug(interfaceapp) << "Switching from HMD to desktop mode";
setActiveDisplayPlugin(DESKTOP_DISPLAY_PLUGIN_NAME);
startHMDStandBySession();
}
emit activeDisplayPluginChanged();
}
_previousHMDWornStatus = currentHMDWornStatus;
}
void Application::startHMDStandBySession() {
_autoSwitchDisplayModeSupportedHMDPlugin->startStandBySession();
}
void Application::endHMDSession() {
_autoSwitchDisplayModeSupportedHMDPlugin->endSession();
}
mat4 Application::getEyeProjection(int eye) const {
QMutexLocker viewLocker(&_viewMutex);
if (isHMDMode()) {

View file

@ -439,7 +439,7 @@ private slots:
void addAssetToWorldErrorTimeout();
void handleSandboxStatus(QNetworkReply* reply);
void switchDisplayMode();
private:
static void initDisplay();
void init();
@ -457,8 +457,6 @@ private:
void queryOctree(NodeType_t serverType, PacketType packetType, NodeToJurisdictionMap& jurisdictions, bool forceResend = false);
glm::vec3 getSunDirection() const;
void renderRearViewMirror(RenderArgs* renderArgs, const QRect& region, bool isZoomed);
int sendNackPackets();
@ -679,7 +677,11 @@ private:
FileScriptingInterface* _fileDownload;
AudioInjector* _snapshotSoundInjector { nullptr };
SharedSoundPointer _snapshotSound;
DisplayPluginPointer _autoSwitchDisplayModeSupportedHMDPlugin;
QString _autoSwitchDisplayModeSupportedHMDPluginName;
bool _previousHMDWornStatus;
void startHMDStandBySession();
void endHMDSession();
};
#endif // hifi_Application_h

View file

@ -21,15 +21,35 @@
#include "MainWindow.h"
#include "Menu.h"
#include "AvatarBookmarks.h"
#include "InterfaceLogging.h"
#include <QtQuick/QQuickWindow>
AvatarBookmarks::AvatarBookmarks() {
_bookmarksFilename = QStandardPaths::writableLocation(QStandardPaths::DataLocation) + "/" + AVATARBOOKMARKS_FILENAME;
_bookmarksFilename = PathUtils::getAppDataPath() + "/" + AVATARBOOKMARKS_FILENAME;
readFromFile();
}
void AvatarBookmarks::readFromFile() {
// migrate old avatarbookmarks.json, used to be in 'local' folder on windows
QString oldConfigPath = QStandardPaths::writableLocation(QStandardPaths::DataLocation) + "/" + AVATARBOOKMARKS_FILENAME;
QFile oldConfig(oldConfigPath);
// I imagine that in a year from now, this code for migrating (as well as the two lines above)
// may be removed since all bookmarks should have been migrated by then
// - Robbie Uvanni (6.8.2017)
if (oldConfig.exists()) {
if (QDir().rename(oldConfigPath, _bookmarksFilename)) {
qCDebug(interfaceapp) << "Successfully migrated" << AVATARBOOKMARKS_FILENAME;
} else {
qCDebug(interfaceapp) << "Failed to migrate" << AVATARBOOKMARKS_FILENAME;
}
}
Bookmarks::readFromFile();
}
void AvatarBookmarks::setupMenus(Menu* menubar, MenuWrapper* menu) {
// Add menus/actions
auto bookmarkAction = menubar->addActionToQMenuAndActionHash(menu, MenuOption::BookmarkAvatar);

View file

@ -29,6 +29,7 @@ public slots:
protected:
void addBookmarkToMenu(Menu* menubar, const QString& name, const QString& address) override;
void readFromFile();
private:
const QString AVATARBOOKMARKS_FILENAME = "avatarbookmarks.json";

View file

@ -24,12 +24,15 @@
#include "Application.h"
#include "Menu.h"
#include <SettingHandle.h>
#include <RunningMarker.h>
#include <SettingHandle.h>
#include <SettingHelpers.h>
bool CrashHandler::checkForResetSettings(bool wasLikelyCrash, bool suppressPrompt) {
Settings settings;
QSettings::setDefaultFormat(JSON_FORMAT);
QSettings settings;
settings.beginGroup("Developer");
QVariant displayCrashOptions = settings.value(MenuOption::DisplayCrashOptions);
QVariant askToResetSettingsOption = settings.value(MenuOption::AskToResetSettings);
@ -70,7 +73,7 @@ CrashHandler::Action CrashHandler::promptUserForAction(bool showCrashMessage) {
layout->addWidget(label);
QRadioButton* option1 = new QRadioButton("Reset all my settings");
QRadioButton* option2 = new QRadioButton("Reset my settings but retain avatar info.");
QRadioButton* option2 = new QRadioButton("Reset my settings but keep essential info");
QRadioButton* option3 = new QRadioButton("Continue with my current settings");
option3->setChecked(true);
layout->addWidget(option1);
@ -92,7 +95,7 @@ CrashHandler::Action CrashHandler::promptUserForAction(bool showCrashMessage) {
return CrashHandler::DELETE_INTERFACE_INI;
}
if (option2->isChecked()) {
return CrashHandler::RETAIN_AVATAR_INFO;
return CrashHandler::RETAIN_IMPORTANT_INFO;
}
}
@ -101,24 +104,27 @@ CrashHandler::Action CrashHandler::promptUserForAction(bool showCrashMessage) {
}
void CrashHandler::handleCrash(CrashHandler::Action action) {
if (action != CrashHandler::DELETE_INTERFACE_INI && action != CrashHandler::RETAIN_AVATAR_INFO) {
if (action != CrashHandler::DELETE_INTERFACE_INI && action != CrashHandler::RETAIN_IMPORTANT_INFO) {
// CrashHandler::DO_NOTHING or unexpected value
return;
}
Settings settings;
QSettings settings;
const QString ADDRESS_MANAGER_GROUP = "AddressManager";
const QString ADDRESS_KEY = "address";
const QString AVATAR_GROUP = "Avatar";
const QString DISPLAY_NAME_KEY = "displayName";
const QString FULL_AVATAR_URL_KEY = "fullAvatarURL";
const QString FULL_AVATAR_MODEL_NAME_KEY = "fullAvatarModelName";
const QString TUTORIAL_COMPLETE_FLAG_KEY = "tutorialComplete";
QString displayName;
QUrl fullAvatarURL;
QString fullAvatarModelName;
QUrl address;
bool tutorialComplete = false;
if (action == CrashHandler::RETAIN_AVATAR_INFO) {
if (action == CrashHandler::RETAIN_IMPORTANT_INFO) {
// Read avatar info
// Location and orientation
@ -132,6 +138,9 @@ void CrashHandler::handleCrash(CrashHandler::Action action) {
fullAvatarURL = settings.value(FULL_AVATAR_URL_KEY).toUrl();
fullAvatarModelName = settings.value(FULL_AVATAR_MODEL_NAME_KEY).toString();
settings.endGroup();
// Tutorial complete
tutorialComplete = settings.value(TUTORIAL_COMPLETE_FLAG_KEY).toBool();
}
// Delete Interface.ini
@ -140,7 +149,7 @@ void CrashHandler::handleCrash(CrashHandler::Action action) {
settingsFile.remove();
}
if (action == CrashHandler::RETAIN_AVATAR_INFO) {
if (action == CrashHandler::RETAIN_IMPORTANT_INFO) {
// Write avatar info
// Location and orientation
@ -154,6 +163,9 @@ void CrashHandler::handleCrash(CrashHandler::Action action) {
settings.setValue(FULL_AVATAR_URL_KEY, fullAvatarURL);
settings.setValue(FULL_AVATAR_MODEL_NAME_KEY, fullAvatarModelName);
settings.endGroup();
// Tutorial complete
settings.setValue(TUTORIAL_COMPLETE_FLAG_KEY, tutorialComplete);
}
}

View file

@ -22,7 +22,7 @@ public:
private:
enum Action {
DELETE_INTERFACE_INI,
RETAIN_AVATAR_INFO,
RETAIN_IMPORTANT_INFO,
DO_NOTHING
};

View file

@ -25,7 +25,7 @@
#include <QThread>
const Discoverability::Mode DEFAULT_DISCOVERABILITY_MODE = Discoverability::Friends;
const Discoverability::Mode DEFAULT_DISCOVERABILITY_MODE = Discoverability::Connections;
DiscoverabilityManager::DiscoverabilityManager() :
_mode("discoverabilityMode", DEFAULT_DISCOVERABILITY_MODE)

View file

@ -405,6 +405,12 @@ Menu::Menu() {
#endif
{
auto action = addActionToQMenuAndActionHash(renderOptionsMenu, MenuOption::RenderClearKtxCache);
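// Clearing the KTX cache works by invalidating the stored cache version setting; the cache itself is presumably discarded on the next launch, hence the "requires restart" note in the menu label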
connect(action, &QAction::triggered, []{
Setting::Handle<int>(KTXCache::SETTING_VERSION_NAME, KTXCache::INVALID_VERSION).set(KTXCache::INVALID_VERSION);
});
}
// Developer > Render > LOD Tools
addActionToQMenuAndActionHash(renderOptionsMenu, MenuOption::LodTools, 0,

View file

@ -142,6 +142,7 @@ namespace MenuOption {
const QString Quit = "Quit";
const QString ReloadAllScripts = "Reload All Scripts";
const QString ReloadContent = "Reload Content (Clears all caches)";
const QString RenderClearKtxCache = "Clear KTX Cache (requires restart)";
const QString RenderMaxTextureMemory = "Maximum Texture Memory";
const QString RenderMaxTextureAutomatic = "Automatic Texture Memory";
const QString RenderMaxTexture4MB = "4 MB";

View file

@ -294,6 +294,7 @@ void MyAvatar::simulateAttachments(float deltaTime) {
QByteArray MyAvatar::toByteArrayStateful(AvatarDataDetail dataDetail) {
CameraMode mode = qApp->getCamera().getMode();
_globalPosition = getPosition();
// This might not be right! Isn't the capsule local offset in avatar space, and don't we need to add the radius to the y as well? -HRS 5/26/17
_globalBoundingBoxDimensions.x = _characterController.getCapsuleRadius();
_globalBoundingBoxDimensions.y = _characterController.getCapsuleHalfHeight();
_globalBoundingBoxDimensions.z = _characterController.getCapsuleRadius();
@ -409,7 +410,7 @@ void MyAvatar::update(float deltaTime) {
// update moving average of HMD facing in xz plane.
const float HMD_FACING_TIMESCALE = 4.0f; // very slow average
float tau = deltaTime / HMD_FACING_TIMESCALE;
_hmdSensorFacingMovingAverage = lerp(_hmdSensorFacingMovingAverage, _hmdSensorFacing, tau);
_headControllerFacingMovingAverage = lerp(_headControllerFacingMovingAverage, _headControllerFacing, tau);
if (_smoothOrientationTimer < SMOOTH_TIME_ORIENTATION) {
_rotationChanged = usecTimestampNow();
@ -417,16 +418,18 @@ void MyAvatar::update(float deltaTime) {
}
#ifdef DEBUG_DRAW_HMD_MOVING_AVERAGE
glm::vec3 p = transformPoint(getSensorToWorldMatrix(), _hmdSensorPosition + glm::vec3(_hmdSensorFacingMovingAverage.x, 0.0f, _hmdSensorFacingMovingAverage.y));
glm::vec3 p = transformPoint(getSensorToWorldMatrix(), getHeadControllerPoseInAvatarFrame() *
glm::vec3(_headControllerFacingMovingAverage.x, 0.0f, _headControllerFacingMovingAverage.y));
DebugDraw::getInstance().addMarker("facing-avg", getOrientation(), p, glm::vec4(1.0f));
p = transformPoint(getSensorToWorldMatrix(), _hmdSensorPosition + glm::vec3(_hmdSensorFacing.x, 0.0f, _hmdSensorFacing.y));
p = transformPoint(getSensorToWorldMatrix(), getHMDSensorPosition() +
glm::vec3(_headControllerFacing.x, 0.0f, _headControllerFacing.y));
DebugDraw::getInstance().addMarker("facing", getOrientation(), p, glm::vec4(1.0f));
#endif
if (_goToPending) {
setPosition(_goToPosition);
setOrientation(_goToOrientation);
_hmdSensorFacingMovingAverage = _hmdSensorFacing; // reset moving average
_headControllerFacingMovingAverage = _headControllerFacing; // reset moving average
_goToPending = false;
// updateFromHMDSensorMatrix (called from paintGL) expects that the sensorToWorldMatrix is updated for any position changes
// that happen between render and Application::update (which calls updateSensorToWorldMatrix to do so).
@ -434,6 +437,13 @@ void MyAvatar::update(float deltaTime) {
// so we update now. It's ok if it updates again in the normal way.
updateSensorToWorldMatrix();
emit positionGoneTo();
// Run safety tests as soon as we can after goToLocation, or clear if we're not colliding.
_physicsSafetyPending = getCollisionsEnabled();
}
if (_physicsSafetyPending && qApp->isPhysicsEnabled() && _characterController.isEnabledAndReady()) {
// When needed and ready, arrange to check and fix.
_physicsSafetyPending = false;
safeLanding(_goToPosition); // no-op if already safe
}
Head* head = getHead();
@ -448,6 +458,7 @@ void MyAvatar::update(float deltaTime) {
setAudioAverageLoudness(audio->getAudioAverageInputLoudness());
glm::vec3 halfBoundingBoxDimensions(_characterController.getCapsuleRadius(), _characterController.getCapsuleHalfHeight(), _characterController.getCapsuleRadius());
// This might not be right! Isn't the capsule local offset in avatar space? -HRS 5/26/17
halfBoundingBoxDimensions += _characterController.getCapsuleLocalOffset();
QMetaObject::invokeMethod(audio.data(), "setAvatarBoundingBoxParameters",
Q_ARG(glm::vec3, (getPosition() - halfBoundingBoxDimensions)),
@ -633,15 +644,21 @@ void MyAvatar::updateFromHMDSensorMatrix(const glm::mat4& hmdSensorMatrix) {
_hmdSensorMatrix = hmdSensorMatrix;
auto newHmdSensorPosition = extractTranslation(hmdSensorMatrix);
if (newHmdSensorPosition != _hmdSensorPosition &&
if (newHmdSensorPosition != getHMDSensorPosition() &&
glm::length(newHmdSensorPosition) > MAX_HMD_ORIGIN_DISTANCE) {
qWarning() << "Invalid HMD sensor position " << newHmdSensorPosition;
// Ignore unreasonable HMD sensor data
return;
}
_hmdSensorPosition = newHmdSensorPosition;
_hmdSensorOrientation = glm::quat_cast(hmdSensorMatrix);
_hmdSensorFacing = getFacingDir2D(_hmdSensorOrientation);
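// Derive the 2D facing direction from the head controller pose when it is valid; otherwise fall back to facing along +X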
auto headPose = _headControllerPoseInSensorFrameCache.get();
if (headPose.isValid()) {
_headControllerFacing = getFacingDir2D(headPose.rotation);
} else {
_headControllerFacing = glm::vec2(1.0f, 0.0f);
}
}
void MyAvatar::updateJointFromController(controller::Action poseKey, ThreadSafeValueCache<glm::mat4>& matrixCache) {
@ -679,7 +696,7 @@ void MyAvatar::updateSensorToWorldMatrix() {
// Update avatar head rotation with sensor data
void MyAvatar::updateFromTrackers(float deltaTime) {
glm::vec3 estimatedPosition, estimatedRotation;
glm::vec3 estimatedRotation;
bool inHmd = qApp->isHMDMode();
bool playing = DependencyManager::get<recording::Deck>()->isPlaying();
@ -690,11 +707,7 @@ void MyAvatar::updateFromTrackers(float deltaTime) {
FaceTracker* tracker = qApp->getActiveFaceTracker();
bool inFacetracker = tracker && !FaceTracker::isMuted();
if (inHmd) {
estimatedPosition = extractTranslation(getHMDSensorMatrix());
estimatedPosition.x *= -1.0f;
} else if (inFacetracker) {
estimatedPosition = tracker->getHeadTranslation();
if (inFacetracker) {
estimatedRotation = glm::degrees(safeEulerAngles(tracker->getHeadRotation()));
}
@ -781,6 +794,77 @@ controller::Pose MyAvatar::getRightHandTipPose() const {
return pose;
}
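// Helpers to convert points, directions and rotations between world space and a joint's local frame.
// When the joint index is -1 or the joint lookup fails, they fall back to the avatar's own position/rotation.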
glm::vec3 MyAvatar::worldToJointPoint(const glm::vec3& position, const int jointIndex) const {
glm::vec3 jointPos = getPosition();//default value if no or invalid joint specified
glm::quat jointRot = getRotation();//default value if no or invalid joint specified
if (jointIndex != -1) {
if (_skeletonModel->getJointPositionInWorldFrame(jointIndex, jointPos)) {
_skeletonModel->getJointRotationInWorldFrame(jointIndex, jointRot);
} else {
qWarning() << "Invalid joint index specified: " << jointIndex;
}
}
glm::vec3 modelOffset = position - jointPos;
glm::vec3 jointSpacePosition = glm::inverse(jointRot) * modelOffset;
return jointSpacePosition;
}
glm::vec3 MyAvatar::worldToJointDirection(const glm::vec3& worldDir, const int jointIndex) const {
glm::quat jointRot = getRotation();//default value if no or invalid joint specified
if ((jointIndex != -1) && (!_skeletonModel->getJointRotationInWorldFrame(jointIndex, jointRot))) {
qWarning() << "Invalid joint index specified: " << jointIndex;
}
glm::vec3 jointSpaceDir = glm::inverse(jointRot) * worldDir;
return jointSpaceDir;
}
glm::quat MyAvatar::worldToJointRotation(const glm::quat& worldRot, const int jointIndex) const {
glm::quat jointRot = getRotation();//default value if no or invalid joint specified
if ((jointIndex != -1) && (!_skeletonModel->getJointRotationInWorldFrame(jointIndex, jointRot))) {
qWarning() << "Invalid joint index specified: " << jointIndex;
}
glm::quat jointSpaceRot = glm::inverse(jointRot) * worldRot;
return jointSpaceRot;
}
glm::vec3 MyAvatar::jointToWorldPoint(const glm::vec3& jointSpacePos, const int jointIndex) const {
glm::vec3 jointPos = getPosition();//default value if no or invalid joint specified
glm::quat jointRot = getRotation();//default value if no or invalid joint specified
if (jointIndex != -1) {
if (_skeletonModel->getJointPositionInWorldFrame(jointIndex, jointPos)) {
_skeletonModel->getJointRotationInWorldFrame(jointIndex, jointRot);
} else {
qWarning() << "Invalid joint index specified: " << jointIndex;
}
}
glm::vec3 worldOffset = jointRot * jointSpacePos;
glm::vec3 worldPos = jointPos + worldOffset;
return worldPos;
}
glm::vec3 MyAvatar::jointToWorldDirection(const glm::vec3& jointSpaceDir, const int jointIndex) const {
glm::quat jointRot = getRotation();//default value if no or invalid joint specified
if ((jointIndex != -1) && (!_skeletonModel->getJointRotationInWorldFrame(jointIndex, jointRot))) {
qWarning() << "Invalid joint index specified: " << jointIndex;
}
glm::vec3 worldDir = jointRot * jointSpaceDir;
return worldDir;
}
glm::quat MyAvatar::jointToWorldRotation(const glm::quat& jointSpaceRot, const int jointIndex) const {
glm::quat jointRot = getRotation();//default value if no or invalid joint specified
if ((jointIndex != -1) && (!_skeletonModel->getJointRotationInWorldFrame(jointIndex, jointRot))) {
qWarning() << "Invalid joint index specified: " << jointIndex;
}
glm::quat worldRot = jointRot * jointSpaceRot;
return worldRot;
}
// virtual
void MyAvatar::render(RenderArgs* renderArgs) {
// don't render if we've been asked to disable local rendering
@ -1479,12 +1563,12 @@ void MyAvatar::updateMotors() {
if (_motionBehaviors & AVATAR_MOTION_ACTION_MOTOR_ENABLED) {
if (_characterController.getState() == CharacterController::State::Hover ||
_characterController.computeCollisionGroup() == BULLET_COLLISION_GROUP_COLLISIONLESS) {
motorRotation = getMyHead()->getCameraOrientation();
motorRotation = getMyHead()->getHeadOrientation();
} else {
// non-hovering = walking: follow camera twist about vertical but not lift
// so we decompose camera's rotation and store the twist part in motorRotation
glm::quat liftRotation;
swingTwistDecomposition(getMyHead()->getCameraOrientation(), _worldUpDirection, liftRotation, motorRotation);
swingTwistDecomposition(getMyHead()->getHeadOrientation(), _worldUpDirection, liftRotation, motorRotation);
}
const float DEFAULT_MOTOR_TIMESCALE = 0.2f;
const float INVALID_MOTOR_TIMESCALE = 1.0e6f;
@ -1498,7 +1582,7 @@ void MyAvatar::updateMotors() {
}
if (_motionBehaviors & AVATAR_MOTION_SCRIPTED_MOTOR_ENABLED) {
if (_scriptedMotorFrame == SCRIPTED_MOTOR_CAMERA_FRAME) {
motorRotation = getMyHead()->getCameraOrientation() * glm::angleAxis(PI, Vectors::UNIT_Y);
motorRotation = getMyHead()->getHeadOrientation() * glm::angleAxis(PI, Vectors::UNIT_Y);
} else if (_scriptedMotorFrame == SCRIPTED_MOTOR_AVATAR_FRAME) {
motorRotation = getOrientation() * glm::angleAxis(PI, Vectors::UNIT_Y);
} else {
@ -1548,6 +1632,10 @@ void MyAvatar::harvestResultsFromPhysicsSimulation(float deltaTime) {
if (_characterController.isEnabledAndReady()) {
setVelocity(_characterController.getLinearVelocity() + _characterController.getFollowVelocity());
if (_characterController.isStuck()) {
_physicsSafetyPending = true;
_goToPosition = getPosition();
}
} else {
setVelocity(getVelocity() + _characterController.getFollowVelocity());
}
@ -1847,7 +1935,7 @@ void MyAvatar::updateOrientation(float deltaTime) {
if (getCharacterController()->getState() == CharacterController::State::Hover) {
// This is the direction the user desires to fly in.
glm::vec3 desiredFacing = getMyHead()->getCameraOrientation() * Vectors::UNIT_Z;
glm::vec3 desiredFacing = getMyHead()->getHeadOrientation() * Vectors::UNIT_Z;
desiredFacing.y = 0.0f;
// This is our reference frame, it is captured when the user begins to move.
@ -1886,11 +1974,9 @@ void MyAvatar::updateOrientation(float deltaTime) {
getHead()->setBasePitch(getHead()->getBasePitch() + getDriveKey(PITCH) * _pitchSpeed * deltaTime);
if (qApp->isHMDMode()) {
glm::quat orientation = glm::quat_cast(getSensorToWorldMatrix()) * getHMDSensorOrientation();
glm::quat bodyOrientation = getWorldBodyOrientation();
glm::quat localOrientation = glm::inverse(bodyOrientation) * orientation;
auto headPose = getHeadControllerPoseInAvatarFrame();
if (headPose.isValid()) {
glm::quat localOrientation = headPose.rotation * Quaternions::Y_180;
// these angles will be in radians
// ... so they need to be converted to degrees before we do math...
glm::vec3 euler = glm::eulerAngles(localOrientation) * DEGREES_PER_RADIAN;
@ -2004,11 +2090,14 @@ void MyAvatar::updatePosition(float deltaTime) {
}
// capture the head rotation, in sensor space, when the user first indicates they would like to move/fly.
if (!_hoverReferenceCameraFacingIsCaptured && (fabs(getDriveKey(TRANSLATE_Z)) > 0.1f || fabs(getDriveKey(TRANSLATE_X)) > 0.1f)) {
if (!_hoverReferenceCameraFacingIsCaptured &&
(fabs(getDriveKey(TRANSLATE_Z)) > 0.1f || fabs(getDriveKey(TRANSLATE_X)) > 0.1f)) {
_hoverReferenceCameraFacingIsCaptured = true;
// transform the camera facing vector into sensor space.
_hoverReferenceCameraFacing = transformVectorFast(glm::inverse(_sensorToWorldMatrix), getMyHead()->getCameraOrientation() * Vectors::UNIT_Z);
} else if (_hoverReferenceCameraFacingIsCaptured && (fabs(getDriveKey(TRANSLATE_Z)) <= 0.1f && fabs(getDriveKey(TRANSLATE_X)) <= 0.1f)) {
_hoverReferenceCameraFacing = transformVectorFast(glm::inverse(_sensorToWorldMatrix),
getMyHead()->getHeadOrientation() * Vectors::UNIT_Z);
} else if (_hoverReferenceCameraFacingIsCaptured &&
(fabs(getDriveKey(TRANSLATE_Z)) <= 0.1f && fabs(getDriveKey(TRANSLATE_X)) <= 0.1f)) {
_hoverReferenceCameraFacingIsCaptured = false;
}
}
@ -2219,6 +2308,144 @@ void MyAvatar::goToLocation(const glm::vec3& newPosition,
emit transformChanged();
}
void MyAvatar::goToLocationAndEnableCollisions(const glm::vec3& position) { // See use case in safeLanding.
goToLocation(position);
QMetaObject::invokeMethod(this, "setCollisionsEnabled", Qt::QueuedConnection, Q_ARG(bool, true));
}
bool MyAvatar::safeLanding(const glm::vec3& position) {
// Considers all collision hull or non-collisionless primitive intersections on a vertical line through the point.
// There needs to be a "landing" if:
// a) the closest above and the closest below are less than the avatar capsule height apart, or
// b) the above point is the top surface of an entity, indicating that we are inside it.
// If no landing is required, we go to that point directly and return false.
// When a landing is required by (a), we find the highest intersection on that closest-above entity
// (which may be that same "nearest above intersection"). That highest intersection is the candidate landing point.
// For (b), we use that top surface point.
// We then place our feet there, recurse with new capsule center point, and return true.
if (QThread::currentThread() != thread()) {
bool result;
QMetaObject::invokeMethod(this, "safeLanding", Qt::BlockingQueuedConnection, Q_RETURN_ARG(bool, result), Q_ARG(const glm::vec3&, position));
return result;
}
glm::vec3 better;
if (!requiresSafeLanding(position, better)) {
return false;
}
if (!getCollisionsEnabled()) {
goToLocation(better); // recurses on next update
} else { // If you try to go while stuck, physics will keep you stuck.
setCollisionsEnabled(false);
// Don't goToLocation just yet. Yield so that physics can act on the above.
QMetaObject::invokeMethod(this, "goToLocationAndEnableCollisions", Qt::QueuedConnection, // The equivalent of javascript nextTick
Q_ARG(glm::vec3, better));
}
return true;
}
// If position is not reliably safe from being stuck by physics, return true and place a candidate better position in betterPositionOut.
bool MyAvatar::requiresSafeLanding(const glm::vec3& positionIn, glm::vec3& betterPositionOut) {
// We begin with utilities and tests. The Algorithm in four parts is below.
auto halfHeight = _characterController.getCapsuleHalfHeight() + _characterController.getCapsuleRadius();
if (halfHeight == 0) {
return false; // zero height avatar
}
auto entityTree = DependencyManager::get<EntityTreeRenderer>()->getTree();
if (!entityTree) {
return false; // no entity tree
}
// More utilities.
const auto offset = getOrientation() * _characterController.getCapsuleLocalOffset();
const auto capsuleCenter = positionIn + offset;
const auto up = _worldUpDirection, down = -up;
glm::vec3 upperIntersection, upperNormal, lowerIntersection, lowerNormal;
EntityItemID upperId, lowerId;
QVector<EntityItemID> include{}, ignore{};
auto mustMove = [&] { // Place bottom of capsule at the upperIntersection, and check again based on the capsule center.
betterPositionOut = upperIntersection + (up * halfHeight) - offset;
return true;
};
auto findIntersection = [&](const glm::vec3& startPointIn, const glm::vec3& directionIn, glm::vec3& intersectionOut, EntityItemID& entityIdOut, glm::vec3& normalOut) {
OctreeElementPointer element;
EntityItemPointer intersectedEntity = NULL;
float distance;
BoxFace face;
const bool visibleOnly = false;
// This isn't quite what we really want here. findRayIntersection always intersects the visual mesh; the collidable flag is only used to skip entities entirely.
// What we really want is to use the collision hull!
// See https://highfidelity.fogbugz.com/f/cases/5003/findRayIntersection-has-option-to-use-collidableOnly-but-doesn-t-actually-use-colliders
const bool collidableOnly = true;
const bool precisionPicking = true;
const auto lockType = Octree::Lock; // Should we refactor to take a lock just once?
bool* accurateResult = NULL;
bool intersects = entityTree->findRayIntersection(startPointIn, directionIn, include, ignore, visibleOnly, collidableOnly, precisionPicking,
element, distance, face, normalOut, (void**)&intersectedEntity, lockType, accurateResult);
if (!intersects || !intersectedEntity) {
return false;
}
intersectionOut = startPointIn + (directionIn * distance);
entityIdOut = intersectedEntity->getEntityItemID();
return true;
};
// The Algorithm, in four parts:
if (!findIntersection(capsuleCenter, up, upperIntersection, upperId, upperNormal)) {
// We currently believe that physics will reliably push us out if our feet are embedded,
// as long as our capsule center is out and there's room above us. Here we have those
// conditions, so no need to check our feet below.
return false; // nothing above
}
if (!findIntersection(capsuleCenter, down, lowerIntersection, lowerId, lowerNormal)) {
// Our head may be embedded, but our center is out and there's room below. See corresponding comment above.
return false; // nothing below
}
// See if we have room between entities above and below, but that we are not contained.
// First check if the surface above us is the bottom of something, and the surface below us is the top of something.
// I.e., we are in a clearing between two objects.
if (isDown(upperNormal) && isUp(lowerNormal)) {
auto spaceBetween = glm::distance(upperIntersection, lowerIntersection);
const float halfHeightFactor = 2.5f; // Until case 5003 is fixed (and maybe after?), we need a fudge factor. Also account for content modelers not being precise.
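// Illustrative numbers (added for clarity): for a roughly 1.8 m tall avatar, halfHeight is about 0.9 m,
// so the vertical clearing must exceed roughly 2.25 m before the position is accepted as-is.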
if (spaceBetween > (halfHeightFactor * halfHeight)) {
// There is room for us to fit in that clearing. If there wasn't, physics would oscillate us between the objects above and below.
// We're now going to iterate upwards through successive upperIntersections, testing to see if we're contained within the top surface of some entity.
// There will be one of two outcomes:
// a) We're not contained, so we have enough room and our position is good.
// b) We are contained, so we'll bail out of this but try again at a position above the containing entity.
const int iterationLimit = 1000;
for (int counter = 0; counter < iterationLimit; counter++) {
ignore.push_back(upperId);
if (!findIntersection(upperIntersection, up, upperIntersection, upperId, upperNormal)) {
// We're not inside an entity, and from the nested tests, we have room between what is above and below. So position is good!
return false; // enough room
}
if (isUp(upperNormal)) {
// This new intersection is the top surface of an entity that we have not yet seen, which means we're contained within it.
// We could break here and recurse from the top of the original ceiling, but since we've already done the work to find the top
// of the enclosing entity, let's put our feet at upperIntersection and start over.
return mustMove();
}
// We found a new bottom surface, which we're not interested in.
// But there could still be a top surface above us for an entity we haven't seen, so keep looking upward.
}
qCDebug(interfaceapp) << "Loop in requiresSafeLanding. Floor/ceiling do not make sense.";
}
}
include.push_back(upperId); // We're now looking for the intersection from above onto this entity.
const float big = (float)TREE_SCALE;
const auto skyHigh = up * big;
auto fromAbove = capsuleCenter + skyHigh;
if (!findIntersection(fromAbove, down, upperIntersection, upperId, upperNormal)) {
return false; // Unable to find a landing
}
// Our arbitrary rule is to always go up. There's no need to look down or sideways for a "closer" safe candidate.
return mustMove();
}
void MyAvatar::updateMotionBehaviorFromMenu() {
if (QThread::currentThread() != thread()) {
@ -2322,36 +2549,27 @@ bool MyAvatar::isDriveKeyDisabled(DriveKeys key) const {
}
}
glm::vec3 MyAvatar::getWorldBodyPosition() const {
return transformPoint(_sensorToWorldMatrix, extractTranslation(_bodySensorMatrix));
}
glm::quat MyAvatar::getWorldBodyOrientation() const {
return glm::quat_cast(_sensorToWorldMatrix * _bodySensorMatrix);
}
// old school meat hook style
glm::mat4 MyAvatar::deriveBodyFromHMDSensor() const {
// HMD is in sensor space.
const glm::vec3 hmdPosition = getHMDSensorPosition();
const glm::quat hmdOrientation = getHMDSensorOrientation();
const glm::quat hmdOrientationYawOnly = cancelOutRollAndPitch(hmdOrientation);
glm::vec3 headPosition;
glm::quat headOrientation;
auto headPose = getHeadControllerPoseInSensorFrame();
if (headPose.isValid()) {
headPosition = getHeadControllerPoseInSensorFrame().translation;
headOrientation = getHeadControllerPoseInSensorFrame().rotation * Quaternions::Y_180;
}
const glm::quat headOrientationYawOnly = cancelOutRollAndPitch(headOrientation);
const Rig& rig = _skeletonModel->getRig();
int rightEyeIndex = rig.indexOfJoint("RightEye");
int leftEyeIndex = rig.indexOfJoint("LeftEye");
int headIndex = rig.indexOfJoint("Head");
int neckIndex = rig.indexOfJoint("Neck");
int hipsIndex = rig.indexOfJoint("Hips");
glm::vec3 rigMiddleEyePos = DEFAULT_AVATAR_MIDDLE_EYE_POS;
if (leftEyeIndex >= 0 && rightEyeIndex >= 0) {
rigMiddleEyePos = (rig.getAbsoluteDefaultPose(leftEyeIndex).trans() + rig.getAbsoluteDefaultPose(rightEyeIndex).trans()) / 2.0f;
}
glm::vec3 rigHeadPos = headIndex != -1 ? rig.getAbsoluteDefaultPose(headIndex).trans() : DEFAULT_AVATAR_HEAD_POS;
glm::vec3 rigNeckPos = neckIndex != -1 ? rig.getAbsoluteDefaultPose(neckIndex).trans() : DEFAULT_AVATAR_NECK_POS;
glm::vec3 rigHipsPos = hipsIndex != -1 ? rig.getAbsoluteDefaultPose(hipsIndex).trans() : DEFAULT_AVATAR_HIPS_POS;
glm::vec3 localEyes = (rigMiddleEyePos - rigHipsPos);
glm::vec3 localHead = (rigHeadPos - rigHipsPos);
glm::vec3 localNeck = (rigNeckPos - rigHipsPos);
// apply simplistic head/neck model
@ -2360,11 +2578,11 @@ glm::mat4 MyAvatar::deriveBodyFromHMDSensor() const {
// headToNeck offset is relative to the full head orientation,
// while neckToRoot offset is only relative to the head's yaw.
// Y_180 is necessary because the rig is z-forward and the head orientation is -z-forward
glm::vec3 eyeToNeck = hmdOrientation * Quaternions::Y_180 * (localNeck - localEyes);
glm::vec3 neckToRoot = hmdOrientationYawOnly * Quaternions::Y_180 * -localNeck;
glm::vec3 bodyPos = hmdPosition + eyeToNeck + neckToRoot;
glm::vec3 headToNeck = headOrientation * Quaternions::Y_180 * (localNeck - localHead);
glm::vec3 neckToRoot = headOrientationYawOnly * Quaternions::Y_180 * -localNeck;
glm::vec3 bodyPos = headPosition + headToNeck + neckToRoot;
return createMatFromQuatAndPos(hmdOrientationYawOnly, bodyPos);
return createMatFromQuatAndPos(headOrientationYawOnly, bodyPos);
}
glm::vec3 MyAvatar::getPositionForAudio() {
@ -2480,7 +2698,7 @@ bool MyAvatar::FollowHelper::shouldActivateRotation(const MyAvatar& myAvatar, co
} else {
const float FOLLOW_ROTATION_THRESHOLD = cosf(PI / 6.0f); // 30 degrees
glm::vec2 bodyFacing = getFacingDir2D(currentBodyMatrix);
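// Added note: for (approximately) unit-length facing vectors, dot(a, b) < cos(30 degrees) means the angle
// between the head facing and the body facing exceeds 30 degrees, which is when recentering should kick in.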
return glm::dot(myAvatar.getHMDSensorFacingMovingAverage(), bodyFacing) < FOLLOW_ROTATION_THRESHOLD;
return glm::dot(-myAvatar.getHeadControllerFacingMovingAverage(), bodyFacing) < FOLLOW_ROTATION_THRESHOLD;
}
}
@ -2517,8 +2735,8 @@ bool MyAvatar::FollowHelper::shouldActivateVertical(const MyAvatar& myAvatar, co
return (offset.y > CYLINDER_TOP) || (offset.y < CYLINDER_BOTTOM);
}
void MyAvatar::FollowHelper::prePhysicsUpdate(MyAvatar& myAvatar, const glm::mat4& desiredBodyMatrix, const glm::mat4& currentBodyMatrix, bool hasDriveInput) {
_desiredBodyMatrix = desiredBodyMatrix;
void MyAvatar::FollowHelper::prePhysicsUpdate(MyAvatar& myAvatar, const glm::mat4& desiredBodyMatrix,
const glm::mat4& currentBodyMatrix, bool hasDriveInput) {
if (myAvatar.getHMDLeanRecenterEnabled()) {
if (!isActive(Rotation) && shouldActivateRotation(myAvatar, desiredBodyMatrix, currentBodyMatrix)) {
@ -2532,7 +2750,7 @@ void MyAvatar::FollowHelper::prePhysicsUpdate(MyAvatar& myAvatar, const glm::mat
}
}
glm::mat4 desiredWorldMatrix = myAvatar.getSensorToWorldMatrix() * _desiredBodyMatrix;
glm::mat4 desiredWorldMatrix = myAvatar.getSensorToWorldMatrix() * desiredBodyMatrix;
glm::mat4 currentWorldMatrix = myAvatar.getSensorToWorldMatrix() * currentBodyMatrix;
AnimPose followWorldPose(currentWorldMatrix);
@ -2627,9 +2845,10 @@ glm::mat4 MyAvatar::computeCameraRelativeHandControllerMatrix(const glm::mat4& c
cameraWorldMatrix *= createMatFromScaleQuatAndPos(vec3(-1.0f, 1.0f, 1.0f), glm::quat(), glm::vec3());
}
// compute a NEW sensorToWorldMatrix for the camera. The equation is cameraWorldMatrix = cameraSensorToWorldMatrix * _hmdSensorMatrix.
// compute a NEW sensorToWorldMatrix for the camera.
// The equation is cameraWorldMatrix = cameraSensorToWorldMatrix * _hmdSensorMatrix.
// here we solve for the unknown cameraSensorToWorldMatrix.
glm::mat4 cameraSensorToWorldMatrix = cameraWorldMatrix * glm::inverse(_hmdSensorMatrix);
glm::mat4 cameraSensorToWorldMatrix = cameraWorldMatrix * glm::inverse(getHMDSensorMatrix());
// Using the new cameraSensorToWorldMatrix, compute where the controller is in world space.
glm::mat4 controllerWorldMatrix = cameraSensorToWorldMatrix * controllerSensorMatrix;
@ -2770,7 +2989,7 @@ glm::mat4 MyAvatar::getLeftFootCalibrationMat() const {
auto leftFootRot = getAbsoluteDefaultJointRotationInObjectFrame(leftFootIndex);
return createMatFromQuatAndPos(leftFootRot, leftFootPos);
} else {
return createMatFromQuatAndPos(DEFAULT_AVATAR_LEFTFOOT_POS, DEFAULT_AVATAR_LEFTFOOT_POS);
return createMatFromQuatAndPos(DEFAULT_AVATAR_LEFTFOOT_ROT, DEFAULT_AVATAR_LEFTFOOT_POS);
}
}
@ -2782,7 +3001,7 @@ glm::mat4 MyAvatar::getRightFootCalibrationMat() const {
auto rightFootRot = getAbsoluteDefaultJointRotationInObjectFrame(rightFootIndex);
return createMatFromQuatAndPos(rightFootRot, rightFootPos);
} else {
return createMatFromQuatAndPos(DEFAULT_AVATAR_RIGHTFOOT_POS, DEFAULT_AVATAR_RIGHTFOOT_POS);
return createMatFromQuatAndPos(DEFAULT_AVATAR_RIGHTFOOT_ROT, DEFAULT_AVATAR_RIGHTFOOT_POS);
}
}
@ -2805,7 +3024,7 @@ glm::mat4 MyAvatar::getLeftArmCalibrationMat() const {
auto leftArmRot = getAbsoluteDefaultJointRotationInObjectFrame(leftArmIndex);
return createMatFromQuatAndPos(leftArmRot, leftArmPos);
} else {
return createMatFromQuatAndPos(DEFAULT_AVATAR_LEFTARM_ROT, DEFAULT_AVATAR_RIGHTARM_POS);
return createMatFromQuatAndPos(DEFAULT_AVATAR_LEFTARM_ROT, DEFAULT_AVATAR_LEFTARM_POS);
}
}

View file

@ -185,7 +185,6 @@ public:
const glm::mat4& getHMDSensorMatrix() const { return _hmdSensorMatrix; }
const glm::vec3& getHMDSensorPosition() const { return _hmdSensorPosition; }
const glm::quat& getHMDSensorOrientation() const { return _hmdSensorOrientation; }
const glm::vec2& getHMDSensorFacingMovingAverage() const { return _hmdSensorFacingMovingAverage; }
Q_INVOKABLE void setOrientationVar(const QVariant& newOrientationVar);
Q_INVOKABLE QVariant getOrientationVar() const;
@ -379,6 +378,15 @@ public:
Q_INVOKABLE controller::Pose getLeftHandTipPose() const;
Q_INVOKABLE controller::Pose getRightHandTipPose() const;
// world-space <-> joint-space rig conversion functions
Q_INVOKABLE glm::vec3 worldToJointPoint(const glm::vec3& position, const int jointIndex = -1) const;
Q_INVOKABLE glm::vec3 worldToJointDirection(const glm::vec3& direction, const int jointIndex = -1) const;
Q_INVOKABLE glm::quat worldToJointRotation(const glm::quat& rotation, const int jointIndex = -1) const;
Q_INVOKABLE glm::vec3 jointToWorldPoint(const glm::vec3& position, const int jointIndex = -1) const;
Q_INVOKABLE glm::vec3 jointToWorldDirection(const glm::vec3& direction, const int jointIndex = -1) const;
Q_INVOKABLE glm::quat jointToWorldRotation(const glm::quat& rotation, const int jointIndex = -1) const;
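// Illustrative usage (added note, not part of this change), e.g. given a MyAvatar* myAvatar and
// assuming the current skeleton has a "RightHand" joint that getJointIndex() resolves:
//     int rightHand = myAvatar->getJointIndex("RightHand");
//     glm::vec3 tipWorld = myAvatar->jointToWorldPoint(glm::vec3(0.0f, 0.05f, 0.0f), rightHand);
//     glm::vec3 tipLocal = myAvatar->worldToJointPoint(tipWorld, rightHand); // round-trips back to the input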
AvatarWeakPointer getLookAtTargetAvatar() const { return _lookAtTargetAvatar; }
void updateLookAtTargetAvatar();
void clearLookAtTargetAvatar();
@ -470,6 +478,8 @@ public:
controller::Pose getHeadControllerPoseInSensorFrame() const;
controller::Pose getHeadControllerPoseInWorldFrame() const;
controller::Pose getHeadControllerPoseInAvatarFrame() const;
const glm::vec2& getHeadControllerFacingMovingAverage() const { return _headControllerFacingMovingAverage; }
void setArmControllerPosesInSensorFrame(const controller::Pose& left, const controller::Pose& right);
controller::Pose getLeftArmControllerPoseInSensorFrame() const;
@ -509,6 +519,10 @@ public:
// results are in HMD frame
glm::mat4 deriveBodyFromHMDSensor() const;
Q_INVOKABLE bool isUp(const glm::vec3& direction) { return glm::dot(direction, _worldUpDirection) > 0.0f; }; // true iff direction points up wrt avatar's definition of up.
Q_INVOKABLE bool isDown(const glm::vec3& direction) { return glm::dot(direction, _worldUpDirection) < 0.0f; };
public slots:
void increaseSize();
void decreaseSize();
@ -518,6 +532,8 @@ public slots:
bool hasOrientation = false, const glm::quat& newOrientation = glm::quat(),
bool shouldFaceLocation = false);
void goToLocation(const QVariant& properties);
void goToLocationAndEnableCollisions(const glm::vec3& newPosition);
bool safeLanding(const glm::vec3& position);
void restrictScaleFromDomainSettings(const QJsonObject& domainSettingsObject);
void clearScaleRestriction();
@ -563,9 +579,7 @@ signals:
private:
glm::vec3 getWorldBodyPosition() const;
glm::quat getWorldBodyOrientation() const;
bool requiresSafeLanding(const glm::vec3& positionIn, glm::vec3& positionOut);
virtual QByteArray toByteArrayStateful(AvatarDataDetail dataDetail) override;
@ -676,13 +690,13 @@ private:
// working copies -- see AvatarData for thread-safe _sensorToWorldMatrixCache, used for outward facing access
glm::mat4 _sensorToWorldMatrix { glm::mat4() };
// cache of the current HMD sensor position and orientation
// in sensor space.
// cache of the current HMD sensor position and orientation in sensor space.
glm::mat4 _hmdSensorMatrix;
glm::quat _hmdSensorOrientation;
glm::vec3 _hmdSensorPosition;
glm::vec2 _hmdSensorFacing; // facing vector in xz plane
glm::vec2 _hmdSensorFacingMovingAverage { 0, 0 }; // facing vector in xz plane
// cache head controller pose in sensor space
glm::vec2 _headControllerFacing; // facing vector in xz plane
glm::vec2 _headControllerFacingMovingAverage { 0, 0 }; // facing vector in xz plane
// cache of the current position and orientation of the avatar's body,
// in sensor space.
@ -697,7 +711,6 @@ private:
Vertical,
NumFollowTypes
};
glm::mat4 _desiredBodyMatrix;
float _timeRemaining[NumFollowTypes];
void deactivate();
@ -716,7 +729,8 @@ private:
};
FollowHelper _follow;
bool _goToPending;
bool _goToPending { false };
bool _physicsSafetyPending { false };
glm::vec3 _goToPosition;
glm::quat _goToOrientation;

View file

@ -26,19 +26,20 @@ using namespace std;
MyHead::MyHead(MyAvatar* owningAvatar) : Head(owningAvatar) {
}
glm::quat MyHead::getCameraOrientation() const {
// NOTE: Head::getCameraOrientation() is not used for orienting the camera "view" while in Oculus mode, so
glm::quat MyHead::getHeadOrientation() const {
// NOTE: Head::getHeadOrientation() is not used for orienting the camera "view" while in Oculus mode, so
// you may wonder why this code is here. This method will be called while in Oculus mode to determine how
// to change the driving direction. It is used to support driving toward where your
// head is looking. Note that in Oculus mode, your actual camera view and where your head is looking are not
// always the same.
if (qApp->isHMDMode()) {
MyAvatar* myAvatar = static_cast<MyAvatar*>(_owningAvatar);
return glm::quat_cast(myAvatar->getSensorToWorldMatrix()) * myAvatar->getHMDSensorOrientation();
} else {
Avatar* owningAvatar = static_cast<Avatar*>(_owningAvatar);
return owningAvatar->getWorldAlignedOrientation() * glm::quat(glm::radians(glm::vec3(_basePitch, 0.0f, 0.0f)));
MyAvatar* myAvatar = static_cast<MyAvatar*>(_owningAvatar);
auto headPose = myAvatar->getHeadControllerPoseInWorldFrame();
if (headPose.isValid()) {
return headPose.rotation * Quaternions::Y_180;
}
return myAvatar->getWorldAlignedOrientation() * glm::quat(glm::radians(glm::vec3(_basePitch, 0.0f, 0.0f)));
}
void MyHead::simulate(float deltaTime) {

View file

@ -18,7 +18,7 @@ public:
explicit MyHead(MyAvatar* owningAvatar);
/// \return orientationBody * orientationBasePitch
glm::quat getCameraOrientation() const;
glm::quat getHeadOrientation() const;
void simulate(float deltaTime) override;
private:

View file

@ -52,26 +52,18 @@ void MySkeletonModel::updateRig(float deltaTime, glm::mat4 parentTransform) {
// input action is the highest priority source for head orientation.
auto avatarHeadPose = myAvatar->getHeadControllerPoseInAvatarFrame();
if (avatarHeadPose.isValid()) {
glm::mat4 rigHeadMat = Matrices::Y_180 * createMatFromQuatAndPos(avatarHeadPose.getRotation(), avatarHeadPose.getTranslation());
glm::mat4 rigHeadMat = Matrices::Y_180 *
createMatFromQuatAndPos(avatarHeadPose.getRotation(), avatarHeadPose.getTranslation());
headParams.rigHeadPosition = extractTranslation(rigHeadMat);
headParams.rigHeadOrientation = glmExtractRotation(rigHeadMat);
headParams.headEnabled = true;
} else {
if (qApp->isHMDMode()) {
// get HMD position from sensor space into world space, and back into rig space
glm::mat4 worldHMDMat = myAvatar->getSensorToWorldMatrix() * myAvatar->getHMDSensorMatrix();
glm::mat4 rigToWorld = createMatFromQuatAndPos(getRotation(), getTranslation());
glm::mat4 worldToRig = glm::inverse(rigToWorld);
glm::mat4 rigHMDMat = worldToRig * worldHMDMat;
_rig.computeHeadFromHMD(AnimPose(rigHMDMat), headParams.rigHeadPosition, headParams.rigHeadOrientation);
headParams.headEnabled = true;
} else {
// even though full head IK is disabled, the rig still needs the head orientation to rotate the head up and down in desktop mode.
// preMult 180 is necessary to convert from avatar to rig coordinates.
// postMult 180 is necessary to convert head from -z forward to z forward.
headParams.rigHeadOrientation = Quaternions::Y_180 * head->getFinalOrientationInLocalFrame() * Quaternions::Y_180;
headParams.headEnabled = false;
}
// even though full head IK is disabled, the rig still needs the head orientation to rotate the head up and
// down in desktop mode.
// preMult 180 is necessary to convert from avatar to rig coordinates.
// postMult 180 is necessary to convert head from -z forward to z forward.
headParams.rigHeadOrientation = Quaternions::Y_180 * head->getFinalOrientationInLocalFrame() * Quaternions::Y_180;
headParams.headEnabled = false;
}
auto avatarHipsPose = myAvatar->getHipsControllerPoseInAvatarFrame();

View file

@ -24,7 +24,6 @@
#include <SandboxUtils.h>
#include <SharedUtil.h>
#include "AddressManager.h"
#include "Application.h"
#include "InterfaceLogging.h"
@ -191,7 +190,7 @@ int main(int argc, const char* argv[]) {
int exitCode;
{
RunningMarker runningMarker(nullptr, RUNNING_MARKER_FILENAME);
RunningMarker runningMarker(RUNNING_MARKER_FILENAME);
bool runningMarkerExisted = runningMarker.fileExists();
runningMarker.writeRunningMarkerFile();
@ -200,14 +199,11 @@ int main(int argc, const char* argv[]) {
bool serverContentPathOptionIsSet = parser.isSet(serverContentPathOption);
QString serverContentPath = serverContentPathOptionIsSet ? parser.value(serverContentPathOption) : QString();
if (runServer) {
SandboxUtils::runLocalSandbox(serverContentPath, true, RUNNING_MARKER_FILENAME, noUpdater);
SandboxUtils::runLocalSandbox(serverContentPath, true, noUpdater);
}
Application app(argc, const_cast<char**>(argv), startupTime, runningMarkerExisted);
// Now that the main event loop is setup, launch running marker thread
runningMarker.startRunningMarker();
// If we failed the OpenGLVersion check, log it.
if (override) {
auto accountManager = DependencyManager::get<AccountManager>();

View file

@ -0,0 +1,95 @@
//
// CloseEventSender.cpp
// interface/src/networking
//
// Created by Stephen Birarda on 5/31/17.
// Copyright 2017 High Fidelity, Inc.
//
// Distributed under the Apache License, Version 2.0.
// See the accompanying file LICENSE or http://www.apache.org/licenses/LICENSE-2.0.html
//
#include <QtCore/QDateTime>
#include <QtCore/QEventLoop>
#include <QtCore/QJsonDocument>
#include <QtNetwork/QNetworkReply>
#include <ThreadHelpers.h>
#include <AccountManager.h>
#include <NetworkAccessManager.h>
#include <NetworkingConstants.h>
#include <NetworkLogging.h>
#include <UserActivityLogger.h>
#include <UUID.h>
#include "CloseEventSender.h"
QNetworkRequest createNetworkRequest() {
QNetworkRequest request;
QUrl requestURL = NetworkingConstants::METAVERSE_SERVER_URL;
requestURL.setPath(USER_ACTIVITY_URL);
request.setUrl(requestURL);
auto accountManager = DependencyManager::get<AccountManager>();
if (accountManager->hasValidAccessToken()) {
request.setRawHeader(ACCESS_TOKEN_AUTHORIZATION_HEADER,
accountManager->getAccountInfo().getAccessToken().authorizationHeaderValue());
}
request.setRawHeader(METAVERSE_SESSION_ID_HEADER,
uuidStringWithoutCurlyBraces(accountManager->getSessionID()).toLocal8Bit());
request.setHeader(QNetworkRequest::ContentTypeHeader, "application/json");
request.setPriority(QNetworkRequest::HighPriority);
return request;
}
QByteArray postDataForAction(QString action) {
return QString("{\"action_name\": \"" + action + "\"}").toUtf8();
}
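// Illustrative alternative (not part of this change): the payload could also be built with the JSON
// classes, avoiding manual escaping; QJsonObject would additionally need to be included:
//     QJsonObject actionObject { { "action_name", action } };
//     return QJsonDocument(actionObject).toJson(QJsonDocument::Compact);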
QNetworkReply* replyForAction(QString action) {
QNetworkAccessManager& networkAccessManager = NetworkAccessManager::getInstance();
return networkAccessManager.post(createNetworkRequest(), postDataForAction(action));
}
void CloseEventSender::sendQuitEventAsync() {
if (UserActivityLogger::getInstance().isEnabled()) {
QNetworkReply* reply = replyForAction("quit");
connect(reply, &QNetworkReply::finished, this, &CloseEventSender::handleQuitEventFinished);
_quitEventStartTimestamp = QDateTime::currentMSecsSinceEpoch();
} else {
_hasFinishedQuitEvent = true;
}
}
void CloseEventSender::handleQuitEventFinished() {
_hasFinishedQuitEvent = true;
auto reply = qobject_cast<QNetworkReply*>(sender());
if (reply->error() == QNetworkReply::NoError) {
qCDebug(networking) << "Quit event sent successfully";
} else {
qCDebug(networking) << "Failed to send quit event -" << reply->errorString();
}
reply->deleteLater();
}
bool CloseEventSender::hasTimedOutQuitEvent() {
const int CLOSURE_EVENT_TIMEOUT_MS = 5000;
return _quitEventStartTimestamp != 0
&& QDateTime::currentMSecsSinceEpoch() - _quitEventStartTimestamp > CLOSURE_EVENT_TIMEOUT_MS;
}
void CloseEventSender::startThread() {
moveToNewNamedThread(this, "CloseEvent Logger Thread", [this] {
sendQuitEventAsync();
});
}

View file

@ -0,0 +1,42 @@
//
// CloseEventSender.h
// interface/src/networking
//
// Created by Stephen Birarda on 5/31/17.
// Copyright 2017 High Fidelity, Inc.
//
// Distributed under the Apache License, Version 2.0.
// See the accompanying file LICENSE or http://www.apache.org/licenses/LICENSE-2.0.html
//
#ifndef hifi_CloseEventSender_h
#define hifi_CloseEventSender_h
#include <atomic>
#include <QtCore/QString>
#include <QtCore/QUuid>
#include <DependencyManager.h>
class CloseEventSender : public QObject, public Dependency {
Q_OBJECT
SINGLETON_DEPENDENCY
public:
void startThread();
bool hasTimedOutQuitEvent();
bool hasFinishedQuitEvent() { return _hasFinishedQuitEvent; }
public slots:
void sendQuitEventAsync();
private slots:
void handleQuitEventFinished();
private:
std::atomic<bool> _hasFinishedQuitEvent { false };
std::atomic<int64_t> _quitEventStartTimestamp { 0 }; // 0 means the quit event has not yet been sent
};
#endif // hifi_CloseEventSender_h

View file

@ -54,6 +54,10 @@ bool HMDScriptingInterface::isHMDAvailable(const QString& name) {
return PluginUtils::isHMDAvailable(name);
}
bool HMDScriptingInterface::isHeadControllerAvailable(const QString& name) {
return PluginUtils::isHeadControllerAvailable(name);
}
bool HMDScriptingInterface::isHandControllerAvailable(const QString& name) {
return PluginUtils::isHandControllerAvailable(name);
}

View file

@ -43,6 +43,7 @@ public:
Q_INVOKABLE QString preferredAudioOutput() const;
Q_INVOKABLE bool isHMDAvailable(const QString& name = "");
Q_INVOKABLE bool isHeadControllerAvailable(const QString& name = "");
Q_INVOKABLE bool isHandControllerAvailable(const QString& name = "");
Q_INVOKABLE bool isSubdeviceContainingNameAvailable(const QString& name);

View file

@ -9,9 +9,12 @@
// See the accompanying file LICENSE or http://www.apache.org/licenses/LICENSE-2.0.html
//
#include <QtConcurrent/QtConcurrentRun>
#include <ThreadHelpers.h>
#include <src/InterfaceLogging.h>
#include <src/ui/AvatarInputs.h>
#include <QtConcurrent/QtConcurrentRun>
#include "LimitlessVoiceRecognitionScriptingInterface.h"
const float LimitlessVoiceRecognitionScriptingInterface::_audioLevelThreshold = 0.33f;
@ -24,9 +27,7 @@ LimitlessVoiceRecognitionScriptingInterface::LimitlessVoiceRecognitionScriptingI
connect(&_voiceTimer, &QTimer::timeout, this, &LimitlessVoiceRecognitionScriptingInterface::voiceTimeout);
connect(&_connection, &LimitlessConnection::onReceivedTranscription, this, [this](QString transcription){emit onReceivedTranscription(transcription);});
connect(&_connection, &LimitlessConnection::onFinishedSpeaking, this, [this](QString transcription){emit onFinishedSpeaking(transcription);});
_connection.moveToThread(&_connectionThread);
_connectionThread.setObjectName("Limitless Connection");
_connectionThread.start();
moveToNewNamedThread(&_connection, "Limitless Connection");
}
void LimitlessVoiceRecognitionScriptingInterface::update() {

View file

@ -41,7 +41,6 @@ private:
static const int _voiceTimeoutDuration;
QTimer _voiceTimer;
QThread _connectionThread;
LimitlessConnection _connection;
void voiceTimeout();

View file

@ -23,6 +23,7 @@
#include <QUrlQuery>
#include <QXmlStreamReader>
#include <ThreadHelpers.h>
#include <NetworkAccessManager.h>
#include <SharedUtil.h>
@ -84,12 +85,7 @@ ModelsBrowser::ModelsBrowser(FSTReader::ModelType modelsType, QWidget* parent) :
_handler->connect(this, SIGNAL(destroyed()), SLOT(exit()));
// Setup and launch update thread
QThread* thread = new QThread();
thread->setObjectName("Models Browser");
thread->connect(_handler, SIGNAL(destroyed()), SLOT(quit()));
thread->connect(thread, SIGNAL(finished()), SLOT(deleteLater()));
_handler->moveToThread(thread);
thread->start();
moveToNewNamedThread(_handler, "Models Browser");
emit startDownloading();
// Initialize the view

View file

@ -51,6 +51,16 @@ Text3DOverlay::~Text3DOverlay() {
}
}
const QString Text3DOverlay::getText() const {
QMutexLocker lock(&_mutex);
return _text;
}
void Text3DOverlay::setText(const QString& text) {
QMutexLocker lock(&_mutex);
_text = text;
}
xColor Text3DOverlay::getBackgroundColor() {
if (_colorPulse == 0.0f) {
return _backgroundColor;
@ -125,7 +135,7 @@ void Text3DOverlay::render(RenderArgs* args) {
// FIXME: Factor out textRenderer so that Text3DOverlay overlay parts can be grouped by pipeline
// for a gpu performance increase. Currently,
// Text renderer sets its own pipeline,
_textRenderer->draw(batch, 0, 0, _text, textColor, glm::vec2(-1.0f), getDrawInFront());
_textRenderer->draw(batch, 0, 0, getText(), textColor, glm::vec2(-1.0f), getDrawInFront());
// so before we continue, we must reset the pipeline
batch.setPipeline(args->_pipeline->pipeline);
args->_pipeline->prepare(batch);
@ -188,7 +198,7 @@ void Text3DOverlay::setProperties(const QVariantMap& properties) {
QVariant Text3DOverlay::getProperty(const QString& property) {
if (property == "text") {
return _text;
return getText();
}
if (property == "textAlpha") {
return _textAlpha;
@ -231,7 +241,7 @@ QSizeF Text3DOverlay::textSize(const QString& text) const {
return QSizeF(extents.x, extents.y) * pointToWorldScale;
}
bool Text3DOverlay::findRayIntersection(const glm::vec3 &origin, const glm::vec3 &direction, float &distance,
bool Text3DOverlay::findRayIntersection(const glm::vec3 &origin, const glm::vec3 &direction, float &distance,
BoxFace &face, glm::vec3& surfaceNormal) {
Transform transform = getTransform();
applyTransformTo(transform, true);

View file

@ -12,14 +12,14 @@
#define hifi_Text3DOverlay_h
#include <QString>
#include <QtCore/QMutex>
#include "Billboard3DOverlay.h"
class TextRenderer3D;
class Text3DOverlay : public Billboard3DOverlay {
Q_OBJECT
public:
static QString const TYPE;
virtual QString getType() const override { return TYPE; }
@ -34,7 +34,7 @@ public:
virtual const render::ShapeKey getShapeKey() override;
// getters
const QString& getText() const { return _text; }
const QString getText() const;
float getLineHeight() const { return _lineHeight; }
float getLeftMargin() const { return _leftMargin; }
float getTopMargin() const { return _topMargin; }
@ -45,7 +45,7 @@ public:
float getBackgroundAlpha() { return getAlpha(); }
// setters
void setText(const QString& text) { _text = text; }
void setText(const QString& text);
void setTextAlpha(float alpha) { _textAlpha = alpha; }
void setLineHeight(float value) { _lineHeight = value; }
void setLeftMargin(float margin) { _leftMargin = margin; }
@ -58,15 +58,16 @@ public:
QSizeF textSize(const QString& test) const; // Meters
virtual bool findRayIntersection(const glm::vec3& origin, const glm::vec3& direction, float& distance,
virtual bool findRayIntersection(const glm::vec3& origin, const glm::vec3& direction, float& distance,
BoxFace& face, glm::vec3& surfaceNormal) override;
virtual Text3DOverlay* createClone() const override;
private:
TextRenderer3D* _textRenderer = nullptr;
QString _text;
mutable QMutex _mutex; // used to make get/setText thread-safe; mutable so it can be used in const functions
xColor _backgroundColor = xColor { 0, 0, 0 };
float _textAlpha { 1.0f };
float _lineHeight { 1.0f };

View file

@ -134,7 +134,7 @@ Web3DOverlay::~Web3DOverlay() {
void Web3DOverlay::update(float deltatime) {
if (_webSurface) {
// update globalPosition
_webSurface->getRootContext()->setContextProperty("globalPosition", vec3toVariant(getPosition()));
_webSurface->getSurfaceContext()->setContextProperty("globalPosition", vec3toVariant(getPosition()));
}
}
@ -162,57 +162,56 @@ void Web3DOverlay::loadSourceURL() {
_webSurface->resume();
_webSurface->getRootItem()->setProperty("url", _url);
_webSurface->getRootItem()->setProperty("scriptURL", _scriptURL);
_webSurface->getRootContext()->setContextProperty("ApplicationInterface", qApp);
} else {
_webSurface->setBaseUrl(QUrl::fromLocalFile(PathUtils::resourcesPath()));
_webSurface->load(_url, [&](QQmlContext* context, QObject* obj) {});
_webSurface->resume();
_webSurface->getRootContext()->setContextProperty("Users", DependencyManager::get<UsersScriptingInterface>().data());
_webSurface->getRootContext()->setContextProperty("HMD", DependencyManager::get<HMDScriptingInterface>().data());
_webSurface->getRootContext()->setContextProperty("UserActivityLogger", DependencyManager::get<UserActivityLoggerScriptingInterface>().data());
_webSurface->getRootContext()->setContextProperty("Preferences", DependencyManager::get<Preferences>().data());
_webSurface->getRootContext()->setContextProperty("Vec3", new Vec3());
_webSurface->getRootContext()->setContextProperty("Quat", new Quat());
_webSurface->getRootContext()->setContextProperty("MyAvatar", DependencyManager::get<AvatarManager>()->getMyAvatar().get());
_webSurface->getRootContext()->setContextProperty("Entities", DependencyManager::get<EntityScriptingInterface>().data());
_webSurface->getRootContext()->setContextProperty("Snapshot", DependencyManager::get<Snapshot>().data());
_webSurface->getSurfaceContext()->setContextProperty("Users", DependencyManager::get<UsersScriptingInterface>().data());
_webSurface->getSurfaceContext()->setContextProperty("HMD", DependencyManager::get<HMDScriptingInterface>().data());
_webSurface->getSurfaceContext()->setContextProperty("UserActivityLogger", DependencyManager::get<UserActivityLoggerScriptingInterface>().data());
_webSurface->getSurfaceContext()->setContextProperty("Preferences", DependencyManager::get<Preferences>().data());
_webSurface->getSurfaceContext()->setContextProperty("Vec3", new Vec3());
_webSurface->getSurfaceContext()->setContextProperty("Quat", new Quat());
_webSurface->getSurfaceContext()->setContextProperty("MyAvatar", DependencyManager::get<AvatarManager>()->getMyAvatar().get());
_webSurface->getSurfaceContext()->setContextProperty("Entities", DependencyManager::get<EntityScriptingInterface>().data());
_webSurface->getSurfaceContext()->setContextProperty("Snapshot", DependencyManager::get<Snapshot>().data());
if (_webSurface->getRootItem() && _webSurface->getRootItem()->objectName() == "tabletRoot") {
auto tabletScriptingInterface = DependencyManager::get<TabletScriptingInterface>();
auto flags = tabletScriptingInterface->getFlags();
_webSurface->getRootContext()->setContextProperty("offscreenFlags", flags);
_webSurface->getRootContext()->setContextProperty("AddressManager", DependencyManager::get<AddressManager>().data());
_webSurface->getRootContext()->setContextProperty("Account", AccountScriptingInterface::getInstance());
_webSurface->getRootContext()->setContextProperty("Audio", DependencyManager::get<AudioScriptingInterface>().data());
_webSurface->getRootContext()->setContextProperty("AudioStats", DependencyManager::get<AudioClient>()->getStats().data());
_webSurface->getRootContext()->setContextProperty("HMD", DependencyManager::get<HMDScriptingInterface>().data());
_webSurface->getRootContext()->setContextProperty("fileDialogHelper", new FileDialogHelper());
_webSurface->getRootContext()->setContextProperty("MyAvatar", DependencyManager::get<AvatarManager>()->getMyAvatar().get());
_webSurface->getRootContext()->setContextProperty("ScriptDiscoveryService", DependencyManager::get<ScriptEngines>().data());
_webSurface->getRootContext()->setContextProperty("Tablet", DependencyManager::get<TabletScriptingInterface>().data());
_webSurface->getRootContext()->setContextProperty("Assets", DependencyManager::get<AssetMappingsScriptingInterface>().data());
_webSurface->getRootContext()->setContextProperty("LODManager", DependencyManager::get<LODManager>().data());
_webSurface->getRootContext()->setContextProperty("OctreeStats", DependencyManager::get<OctreeStatsProvider>().data());
_webSurface->getRootContext()->setContextProperty("DCModel", DependencyManager::get<DomainConnectionModel>().data());
_webSurface->getRootContext()->setContextProperty("AvatarInputs", AvatarInputs::getInstance());
_webSurface->getRootContext()->setContextProperty("GlobalServices", GlobalServicesScriptingInterface::getInstance());
_webSurface->getRootContext()->setContextProperty("AvatarList", DependencyManager::get<AvatarManager>().data());
_webSurface->getRootContext()->setContextProperty("DialogsManager", DialogsManagerScriptingInterface::getInstance());
_webSurface->getSurfaceContext()->setContextProperty("offscreenFlags", flags);
_webSurface->getSurfaceContext()->setContextProperty("AddressManager", DependencyManager::get<AddressManager>().data());
_webSurface->getSurfaceContext()->setContextProperty("Account", AccountScriptingInterface::getInstance());
_webSurface->getSurfaceContext()->setContextProperty("Audio", DependencyManager::get<AudioScriptingInterface>().data());
_webSurface->getSurfaceContext()->setContextProperty("AudioStats", DependencyManager::get<AudioClient>()->getStats().data());
_webSurface->getSurfaceContext()->setContextProperty("HMD", DependencyManager::get<HMDScriptingInterface>().data());
_webSurface->getSurfaceContext()->setContextProperty("fileDialogHelper", new FileDialogHelper());
_webSurface->getSurfaceContext()->setContextProperty("MyAvatar", DependencyManager::get<AvatarManager>()->getMyAvatar().get());
_webSurface->getSurfaceContext()->setContextProperty("ScriptDiscoveryService", DependencyManager::get<ScriptEngines>().data());
_webSurface->getSurfaceContext()->setContextProperty("Tablet", DependencyManager::get<TabletScriptingInterface>().data());
_webSurface->getSurfaceContext()->setContextProperty("Assets", DependencyManager::get<AssetMappingsScriptingInterface>().data());
_webSurface->getSurfaceContext()->setContextProperty("LODManager", DependencyManager::get<LODManager>().data());
_webSurface->getSurfaceContext()->setContextProperty("OctreeStats", DependencyManager::get<OctreeStatsProvider>().data());
_webSurface->getSurfaceContext()->setContextProperty("DCModel", DependencyManager::get<DomainConnectionModel>().data());
_webSurface->getSurfaceContext()->setContextProperty("AvatarInputs", AvatarInputs::getInstance());
_webSurface->getSurfaceContext()->setContextProperty("GlobalServices", GlobalServicesScriptingInterface::getInstance());
_webSurface->getSurfaceContext()->setContextProperty("AvatarList", DependencyManager::get<AvatarManager>().data());
_webSurface->getSurfaceContext()->setContextProperty("DialogsManager", DialogsManagerScriptingInterface::getInstance());
_webSurface->getRootContext()->setContextProperty("pathToFonts", "../../");
_webSurface->getSurfaceContext()->setContextProperty("pathToFonts", "../../");
tabletScriptingInterface->setQmlTabletRoot("com.highfidelity.interface.tablet.system", _webSurface->getRootItem(), _webSurface.data());
// mark the TabletProxy object as cpp ownership.
QObject* tablet = tabletScriptingInterface->getTablet("com.highfidelity.interface.tablet.system");
_webSurface->getRootContext()->engine()->setObjectOwnership(tablet, QQmlEngine::CppOwnership);
_webSurface->getSurfaceContext()->engine()->setObjectOwnership(tablet, QQmlEngine::CppOwnership);
// Override the default max FPS for the tablet UI, for silky smooth scrolling
setMaxFPS(90);
}
}
_webSurface->getRootContext()->setContextProperty("globalPosition", vec3toVariant(getPosition()));
_webSurface->getSurfaceContext()->setContextProperty("globalPosition", vec3toVariant(getPosition()));
}
void Web3DOverlay::setMaxFPS(uint8_t maxFPS) {
@ -443,6 +442,9 @@ void Web3DOverlay::handlePointerEventAsTouch(const PointerEvent& event) {
QCoreApplication::postEvent(_webSurface->getWindow(), touchEvent);
if (this->_pressed && event.getType() == PointerEvent::Move) {
return;
}
// Send mouse events to the Web surface so that HTML dialog elements work with mouse press and hover.
// FIXME: Scroll bar dragging is a bit unstable in the tablet (content can jump up and down at times).
// This may be improved in Qt 5.8. Release notes: "Cleaned up touch and mouse event delivery".

View file

@ -20,6 +20,8 @@
#include "ElbowConstraint.h"
#include "SwingTwistConstraint.h"
#include "AnimationLogging.h"
#include "CubicHermiteSpline.h"
#include "AnimUtil.h"
AnimInverseKinematics::IKTargetVar::IKTargetVar(const QString& jointNameIn, const QString& positionVarIn, const QString& rotationVarIn,
const QString& typeVarIn, const QString& weightVarIn, float weightIn, const std::vector<float>& flexCoefficientsIn) :
@ -59,7 +61,8 @@ AnimInverseKinematics::AnimInverseKinematics(const QString& id) : AnimNode(AnimN
AnimInverseKinematics::~AnimInverseKinematics() {
clearConstraints();
_accumulators.clear();
_rotationAccumulators.clear();
_translationAccumulators.clear();
_targetVarVec.clear();
}
@ -72,10 +75,12 @@ void AnimInverseKinematics::loadPoses(const AnimPoseVec& poses) {
assert(_skeleton && ((poses.size() == 0) || (_skeleton->getNumJoints() == (int)poses.size())));
if (_skeleton->getNumJoints() == (int)poses.size()) {
_relativePoses = poses;
_accumulators.resize(_relativePoses.size());
_rotationAccumulators.resize(_relativePoses.size());
_translationAccumulators.resize(_relativePoses.size());
} else {
_relativePoses.clear();
_accumulators.clear();
_rotationAccumulators.clear();
_translationAccumulators.clear();
}
}
@ -175,14 +180,17 @@ void AnimInverseKinematics::computeTargets(const AnimVariantMap& animVars, std::
}
}
void AnimInverseKinematics::solveWithCyclicCoordinateDescent(const AnimContext& context, const std::vector<IKTarget>& targets) {
void AnimInverseKinematics::solve(const AnimContext& context, const std::vector<IKTarget>& targets) {
// compute absolute poses that correspond to relative target poses
AnimPoseVec absolutePoses;
absolutePoses.resize(_relativePoses.size());
computeAbsolutePoses(absolutePoses);
// clear the accumulators before we start the IK solver
for (auto& accumulator: _accumulators) {
for (auto& accumulator : _rotationAccumulators) {
accumulator.clearAndClean();
}
for (auto& accumulator : _translationAccumulators) {
accumulator.clearAndClean();
}
@ -197,14 +205,22 @@ void AnimInverseKinematics::solveWithCyclicCoordinateDescent(const AnimContext&
// solve all targets
for (auto& target: targets) {
solveTargetWithCCD(context, target, absolutePoses, debug);
if (target.getType() == IKTarget::Type::Spline) {
solveTargetWithSpline(context, target, absolutePoses, debug);
} else {
solveTargetWithCCD(context, target, absolutePoses, debug);
}
}
// harvest accumulated rotations and translations and apply the averages
for (int i = 0; i < (int)_relativePoses.size(); ++i) {
if (_accumulators[i].size() > 0) {
_relativePoses[i].rot() = _accumulators[i].getAverage();
_accumulators[i].clear();
if (_rotationAccumulators[i].size() > 0) {
_relativePoses[i].rot() = _rotationAccumulators[i].getAverage();
_rotationAccumulators[i].clear();
}
if (_translationAccumulators[i].size() > 0) {
_relativePoses[i].trans() = _translationAccumulators[i].getAverage();
_translationAccumulators[i].clear();
}
}
@ -236,7 +252,7 @@ void AnimInverseKinematics::solveWithCyclicCoordinateDescent(const AnimContext&
int parentIndex = _skeleton->getParentIndex(tipIndex);
// update rotationOnly targets that don't lie on the ik chain of other ik targets.
if (parentIndex != -1 && !_accumulators[tipIndex].isDirty() && target.getType() == IKTarget::Type::RotationOnly) {
if (parentIndex != -1 && !_rotationAccumulators[tipIndex].isDirty() && target.getType() == IKTarget::Type::RotationOnly) {
const glm::quat& targetRotation = target.getRotation();
// compute tip's new parent-relative rotation
// Q = Qp * q --> q' = Qp^ * Q
@ -311,10 +327,13 @@ void AnimInverseKinematics::solveTargetWithCCD(const AnimContext& context, const
}
// store the relative rotation change and the current relative translation in the accumulators
_accumulators[tipIndex].add(tipRelativeRotation, target.getWeight());
_rotationAccumulators[tipIndex].add(tipRelativeRotation, target.getWeight());
glm::vec3 tipRelativeTranslation = _relativePoses[target.getIndex()].trans();
_translationAccumulators[tipIndex].add(tipRelativeTranslation);
if (debug) {
debugJointMap[tipIndex] = DebugJoint(tipRelativeRotation, constrained);
debugJointMap[tipIndex] = DebugJoint(tipRelativeRotation, tipRelativeTranslation, constrained);
}
}
@ -422,10 +441,13 @@ void AnimInverseKinematics::solveTargetWithCCD(const AnimContext& context, const
}
// store the relative rotation change and the current relative translation in the accumulators
_accumulators[pivotIndex].add(newRot, target.getWeight());
_rotationAccumulators[pivotIndex].add(newRot, target.getWeight());
glm::vec3 newTrans = _relativePoses[pivotIndex].trans();
_translationAccumulators[pivotIndex].add(newTrans);
if (debug) {
debugJointMap[pivotIndex] = DebugJoint(newRot, constrained);
debugJointMap[pivotIndex] = DebugJoint(newRot, newTrans, constrained);
}
// keep track of tip's new transform as we descend towards root
@ -444,6 +466,187 @@ void AnimInverseKinematics::solveTargetWithCCD(const AnimContext& context, const
}
}
static CubicHermiteSplineFunctorWithArcLength computeSplineFromTipAndBase(const AnimPose& tipPose, const AnimPose& basePose, float baseGain = 1.0f, float tipGain = 1.0f) {
float linearDistance = glm::length(basePose.trans() - tipPose.trans());
glm::vec3 p0 = basePose.trans();
glm::vec3 m0 = baseGain * linearDistance * (basePose.rot() * Vectors::UNIT_Y);
glm::vec3 p1 = tipPose.trans();
glm::vec3 m1 = tipGain * linearDistance * (tipPose.rot() * Vectors::UNIT_Y);
return CubicHermiteSplineFunctorWithArcLength(p0, m0, p1, m1);
}
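// Reference (added note): a cubic Hermite segment with endpoints p0, p1 and tangents m0, m1 evaluates as
//     p(t) = (2t^3 - 3t^2 + 1) p0 + (t^3 - 2t^2 + t) m0 + (-2t^3 + 3t^2) p1 + (t^3 - t^2) m1,  t in [0, 1],
// which is the form CubicHermiteSplineFunctorWithArcLength is assumed to evaluate here; the tangents are
// scaled by the base/tip gains and the straight-line distance to control how strongly the spine curves.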
// pre-compute information about each joint influenced by this spline IK target.
void AnimInverseKinematics::computeSplineJointInfosForIKTarget(const AnimContext& context, const IKTarget& target) {
std::vector<SplineJointInfo> splineJointInfoVec;
// build spline between the default poses.
AnimPose tipPose = _skeleton->getAbsoluteDefaultPose(target.getIndex());
AnimPose basePose = _skeleton->getAbsoluteDefaultPose(_hipsIndex);
CubicHermiteSplineFunctorWithArcLength spline;
if (target.getIndex() == _headIndex) {
// set gain factors so that more curvature occurs near the tip of the spline.
const float HIPS_GAIN = 0.5f;
const float HEAD_GAIN = 1.0f;
spline = computeSplineFromTipAndBase(tipPose, basePose, HIPS_GAIN, HEAD_GAIN);
} else {
spline = computeSplineFromTipAndBase(tipPose, basePose);
}
// measure the total arc length along the spline
float totalArcLength = spline.arcLength(1.0f);
glm::vec3 baseToTip = tipPose.trans() - basePose.trans();
float baseToTipLength = glm::length(baseToTip);
glm::vec3 baseToTipNormal = baseToTip / baseToTipLength;
int index = target.getIndex();
int endIndex = _skeleton->getParentIndex(_hipsIndex);
while (index != endIndex) {
AnimPose defaultPose = _skeleton->getAbsoluteDefaultPose(index);
float ratio = glm::dot(defaultPose.trans() - basePose.trans(), baseToTipNormal) / baseToTipLength;
// compute offset from spline to the default pose.
float t = spline.arcLengthInverse(ratio * totalArcLength);
// compute the rotation by using the derivative of the spline as the y-axis, and the defaultPose x-axis
glm::vec3 y = glm::normalize(spline.d(t));
glm::vec3 x = defaultPose.rot() * Vectors::UNIT_X;
glm::vec3 u, v, w;
generateBasisVectors(y, x, v, u, w);
glm::mat3 m(u, v, glm::cross(u, v));
glm::quat rot = glm::normalize(glm::quat_cast(m));
AnimPose pose(glm::vec3(1.0f), rot, spline(t));
AnimPose offsetPose = pose.inverse() * defaultPose;
SplineJointInfo splineJointInfo = { index, ratio, offsetPose };
splineJointInfoVec.push_back(splineJointInfo);
index = _skeleton->getParentIndex(index);
}
_splineJointInfoMap[target.getIndex()] = splineJointInfoVec;
}
const std::vector<AnimInverseKinematics::SplineJointInfo>* AnimInverseKinematics::findOrCreateSplineJointInfo(const AnimContext& context, const IKTarget& target) {
// find or create splineJointInfo for this target
auto iter = _splineJointInfoMap.find(target.getIndex());
if (iter != _splineJointInfoMap.end()) {
return &(iter->second);
} else {
computeSplineJointInfosForIKTarget(context, target);
auto iter = _splineJointInfoMap.find(target.getIndex());
if (iter != _splineJointInfoMap.end()) {
return &(iter->second);
}
}
return nullptr;
}
void AnimInverseKinematics::solveTargetWithSpline(const AnimContext& context, const IKTarget& target, const AnimPoseVec& absolutePoses, bool debug) {
std::map<int, DebugJoint> debugJointMap;
const int baseIndex = _hipsIndex;
// build spline from tip to base
AnimPose tipPose = AnimPose(glm::vec3(1.0f), target.getRotation(), target.getTranslation());
AnimPose basePose = absolutePoses[baseIndex];
CubicHermiteSplineFunctorWithArcLength spline;
if (target.getIndex() == _headIndex) {
// set gain factors so that more curvature occurs near the tip of the spline.
const float HIPS_GAIN = 0.5f;
const float HEAD_GAIN = 1.0f;
spline = computeSplineFromTipAndBase(tipPose, basePose, HIPS_GAIN, HEAD_GAIN);
} else {
spline = computeSplineFromTipAndBase(tipPose, basePose);
}
float totalArcLength = spline.arcLength(1.0f);
// This prevents the rotation interpolation from rotating the wrong physical way (but correct mathematical way)
// when the head is arched backwards very far.
glm::quat halfRot = glm::normalize(glm::lerp(basePose.rot(), tipPose.rot(), 0.5f));
if (glm::dot(halfRot * Vectors::UNIT_Z, basePose.rot() * Vectors::UNIT_Z) < 0.0f) {
tipPose.rot() = -tipPose.rot();
}
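// Added note: a quaternion q and its negation -q represent the same rotation, so flipping the sign of
// tipPose.rot() only changes which of the two equivalent representations glm::lerp interpolates toward,
// avoiding the long-way-around blend described above.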
// find or create splineJointInfo for this target
const std::vector<SplineJointInfo>* splineJointInfoVec = findOrCreateSplineJointInfo(context, target);
if (splineJointInfoVec && splineJointInfoVec->size() > 0) {
const int baseParentIndex = _skeleton->getParentIndex(baseIndex);
AnimPose parentAbsPose = (baseParentIndex >= 0) ? absolutePoses[baseParentIndex] : AnimPose();
// go thru splineJointInfoVec backwards (base to tip)
for (int i = (int)splineJointInfoVec->size() - 1; i >= 0; i--) {
const SplineJointInfo& splineJointInfo = (*splineJointInfoVec)[i];
float t = spline.arcLengthInverse(splineJointInfo.ratio * totalArcLength);
glm::vec3 trans = spline(t);
// for head splines, perform most of the twist toward the tip by using an ease-in function (t^2)
float rotT = t;
if (target.getIndex() == _headIndex) {
rotT = t * t;
}
glm::quat twistRot = glm::normalize(glm::lerp(basePose.rot(), tipPose.rot(), rotT));
// compute the rotation by using the derivative of the spline as the y-axis, and the twistRot x-axis
glm::vec3 y = glm::normalize(spline.d(t));
glm::vec3 x = twistRot * Vectors::UNIT_X;
glm::vec3 u, v, w;
generateBasisVectors(y, x, v, u, w);
glm::mat3 m(u, v, glm::cross(u, v));
glm::quat rot = glm::normalize(glm::quat_cast(m));
AnimPose desiredAbsPose = AnimPose(glm::vec3(1.0f), rot, trans) * splineJointInfo.offsetPose;
// apply flex coefficient
AnimPose flexedAbsPose;
::blend(1, &absolutePoses[splineJointInfo.jointIndex], &desiredAbsPose, target.getFlexCoefficient(i), &flexedAbsPose);
AnimPose relPose = parentAbsPose.inverse() * flexedAbsPose;
_rotationAccumulators[splineJointInfo.jointIndex].add(relPose.rot(), target.getWeight());
bool constrained = false;
if (splineJointInfo.jointIndex != _hipsIndex) {
// constrain the amount the spine can stretch or compress
float length = glm::length(relPose.trans());
const float EPSILON = 0.0001f;
if (length > EPSILON) {
float defaultLength = glm::length(_skeleton->getRelativeDefaultPose(splineJointInfo.jointIndex).trans());
const float STRETCH_COMPRESS_PERCENTAGE = 0.15f;
const float MAX_LENGTH = defaultLength * (1.0f + STRETCH_COMPRESS_PERCENTAGE);
const float MIN_LENGTH = defaultLength * (1.0f - STRETCH_COMPRESS_PERCENTAGE);
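// Illustrative numbers (added for clarity): with the 15% allowance, a joint whose default bone length is
// 10 cm is clamped to the 8.5 cm - 11.5 cm range.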
if (length > MAX_LENGTH) {
relPose.trans() = (relPose.trans() / length) * MAX_LENGTH;
constrained = true;
} else if (length < MIN_LENGTH) {
relPose.trans() = (relPose.trans() / length) * MIN_LENGTH;
constrained = true;
}
} else {
relPose.trans() = glm::vec3(0.0f);
}
}
_translationAccumulators[splineJointInfo.jointIndex].add(relPose.trans(), target.getWeight());
if (debug) {
debugJointMap[splineJointInfo.jointIndex] = DebugJoint(relPose.rot(), relPose.trans(), constrained);
}
parentAbsPose = flexedAbsPose;
}
}
if (debug) {
debugDrawIKChain(debugJointMap, context);
}
}
//virtual
const AnimPoseVec& AnimInverseKinematics::evaluate(const AnimVariantMap& animVars, const AnimContext& context, float dt, AnimNode::Triggers& triggersOut) {
// don't call this function, call overlay() instead
@ -453,14 +656,9 @@ const AnimPoseVec& AnimInverseKinematics::evaluate(const AnimVariantMap& animVar
//virtual
const AnimPoseVec& AnimInverseKinematics::overlay(const AnimVariantMap& animVars, const AnimContext& context, float dt, Triggers& triggersOut, const AnimPoseVec& underPoses) {
// allows solutionSource to be overridden by an animVar
auto solutionSource = animVars.lookup(_solutionSourceVar, (int)_solutionSource);
if (context.getEnableDebugDrawIKConstraints()) {
debugDrawConstraints(context);
}
const float MAX_OVERLAY_DT = 1.0f / 30.0f; // what to clamp delta-time to in AnimInverseKinematics::overlay
if (dt > MAX_OVERLAY_DT) {
dt = MAX_OVERLAY_DT;
@ -569,7 +767,7 @@ const AnimPoseVec& AnimInverseKinematics::overlay(const AnimVariantMap& animVars
{
PROFILE_RANGE_EX(simulation_animation, "ik/ccd", 0xffff00ff, 0);
solveWithCyclicCoordinateDescent(context, targets);
solve(context, targets);
}
if (_hipsTargetIndex < 0) {
@@ -579,6 +777,20 @@ const AnimPoseVec& AnimInverseKinematics::overlay(const AnimVariantMap& animVars
_hipsOffset = Vectors::ZERO;
}
}
if (context.getEnableDebugDrawIKConstraints()) {
debugDrawConstraints(context);
}
}
if (_leftHandIndex > -1) {
_uncontrolledLeftHandPose = _skeleton->getAbsolutePose(_leftHandIndex, underPoses);
}
if (_rightHandIndex > -1) {
_uncontrolledRightHandPose = _skeleton->getAbsolutePose(_rightHandIndex, underPoses);
}
if (_hipsIndex > -1) {
_uncontrolledHipsPose = _skeleton->getAbsolutePose(_hipsIndex, underPoses);
}
return _relativePoses;
@@ -722,8 +934,10 @@ void AnimInverseKinematics::initConstraints() {
loadDefaultPoses(_skeleton->getRelativeBindPoses());
// compute corresponding absolute poses
int numJoints = (int)_defaultRelativePoses.size();
/* KEEP THIS CODE for future experimentation
// compute corresponding absolute poses
AnimPoseVec absolutePoses;
absolutePoses.resize(numJoints);
for (int i = 0; i < numJoints; ++i) {
@@ -734,6 +948,7 @@ void AnimInverseKinematics::initConstraints() {
absolutePoses[i] = absolutePoses[parentIndex] * _defaultRelativePoses[i];
}
}
*/
clearConstraints();
for (int i = 0; i < numJoints; ++i) {
@@ -1045,7 +1260,10 @@ void AnimInverseKinematics::setSkeletonInternal(AnimSkeleton::ConstPointer skele
_maxTargetIndex = -1;
for (auto& accumulator: _accumulators) {
for (auto& accumulator: _rotationAccumulators) {
accumulator.clearAndClean();
}
for (auto& accumulator: _translationAccumulators) {
accumulator.clearAndClean();
}
@@ -1061,12 +1279,21 @@ void AnimInverseKinematics::setSkeletonInternal(AnimSkeleton::ConstPointer skele
} else {
_hipsParentIndex = -1;
}
_leftHandIndex = _skeleton->nameToJointIndex("LeftHand");
_rightHandIndex = _skeleton->nameToJointIndex("RightHand");
} else {
clearConstraints();
_headIndex = -1;
_hipsIndex = -1;
_hipsParentIndex = -1;
_leftHandIndex = -1;
_rightHandIndex = -1;
}
_uncontrolledLeftHandPose = AnimPose();
_uncontrolledRightHandPose = AnimPose();
_uncontrolledHipsPose = AnimPose();
}
static glm::vec3 sphericalToCartesian(float phi, float theta) {
@@ -1117,6 +1344,7 @@ void AnimInverseKinematics::debugDrawIKChain(std::map<int, DebugJoint>& debugJoi
// copy debug joint rotations into the relative poses
for (auto& debugJoint : debugJointMap) {
poses[debugJoint.first].rot() = debugJoint.second.relRot;
poses[debugJoint.first].trans() = debugJoint.second.relTrans;
}
// convert relative poses to absolute
@@ -1282,7 +1510,7 @@ void AnimInverseKinematics::blendToPoses(const AnimPoseVec& targetPoses, const A
int numJoints = (int)_relativePoses.size();
for (int i = 0; i < numJoints; ++i) {
float dotSign = copysignf(1.0f, glm::dot(_relativePoses[i].rot(), targetPoses[i].rot()));
if (_accumulators[i].isDirty()) {
if (_rotationAccumulators[i].isDirty()) {
// this joint is affected by IK --> blend toward the targetPoses rotation
_relativePoses[i].rot() = glm::normalize(glm::lerp(_relativePoses[i].rot(), dotSign * targetPoses[i].rot(), blendFactor));
} else {
@@ -1316,3 +1544,46 @@ void AnimInverseKinematics::initRelativePosesFromSolutionSource(SolutionSource s
break;
}
}
void AnimInverseKinematics::debugDrawSpineSplines(const AnimContext& context, const std::vector<IKTarget>& targets) const {
for (auto& target : targets) {
if (target.getType() != IKTarget::Type::Spline) {
continue;
}
const int baseIndex = _hipsIndex;
// build spline
AnimPose tipPose = AnimPose(glm::vec3(1.0f), target.getRotation(), target.getTranslation());
AnimPose basePose = _skeleton->getAbsolutePose(baseIndex, _relativePoses);
CubicHermiteSplineFunctorWithArcLength spline;
if (target.getIndex() == _headIndex) {
// set gain factors so that more curvature occurs near the tip of the spline.
const float HIPS_GAIN = 0.5f;
const float HEAD_GAIN = 1.0f;
spline = computeSplineFromTipAndBase(tipPose, basePose, HIPS_GAIN, HEAD_GAIN);
} else {
spline = computeSplineFromTipAndBase(tipPose, basePose);
}
float totalArcLength = spline.arcLength(1.0f);
const glm::vec4 RED(1.0f, 0.0f, 0.0f, 1.0f);
const glm::vec4 WHITE(1.0f, 1.0f, 1.0f, 1.0f);
// draw a red-and-white striped spline, parameterized by arc length.
// i.e. each stripe should be the same length.
AnimPose geomToWorldPose = AnimPose(context.getRigToWorldMatrix() * context.getGeometryToRigMatrix());
const int NUM_SEGMENTS = 20;
const float dArcLength = totalArcLength / NUM_SEGMENTS;
float arcLength = 0.0f;
for (int i = 0; i < NUM_SEGMENTS; i++) {
float prevT = spline.arcLengthInverse(arcLength);
float nextT = spline.arcLengthInverse(arcLength + dArcLength);
DebugDraw::getInstance().drawRay(geomToWorldPose.xformPoint(spline(prevT)), geomToWorldPose.xformPoint(spline(nextT)), (i % 2) == 0 ? RED : WHITE);
arcLength += dArcLength;
}
}
}
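debugDrawSpineSplines (and the solver above) rely on CubicHermiteSplineFunctorWithArcLength's arcLength/arcLengthInverse to walk the spline in equal arc-length steps; that functor's implementation is not part of this diff. Purely as an illustration of the technique, arc length and its inverse can be approximated by sampling the curve at uniform t and interpolating the accumulated lengths (the names below are hypothetical):

    #include <glm/glm.hpp>
    #include <algorithm>
    #include <functional>
    #include <vector>

    // Hypothetical helper, NOT the project's functor: tabulate accumulated arc length
    // at uniform t samples so that t <-> arc length can be converted approximately.
    struct ArcLengthTable {
        std::vector<float> lengths; // lengths[i] = arc length at t = i / (numSamples - 1)

        ArcLengthTable(const std::function<glm::vec3(float)>& spline, int numSamples = 64) {
            lengths.assign(std::max(numSamples, 2), 0.0f);
            glm::vec3 prev = spline(0.0f);
            for (size_t i = 1; i < lengths.size(); i++) {
                glm::vec3 p = spline((float)i / (float)(lengths.size() - 1));
                lengths[i] = lengths[i - 1] + glm::length(p - prev);
                prev = p;
            }
        }

        // total arc length up to parameter t (nearest-sample lookup is enough for a sketch)
        float arcLength(float t) const {
            int n = (int)lengths.size();
            int i = std::min(std::max((int)(t * (n - 1)), 0), n - 1);
            return lengths[i];
        }

        // inverse mapping: the t whose accumulated arc length is approximately s
        float arcLengthInverse(float s) const {
            int n = (int)lengths.size();
            for (int i = 1; i < n; i++) {
                if (lengths[i] >= s) {
                    float span = lengths[i] - lengths[i - 1];
                    float frac = (span > 0.0f) ? (s - lengths[i - 1]) / span : 0.0f;
                    return ((float)(i - 1) + frac) / (float)(n - 1);
                }
            }
            return 1.0f;
        }
    };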

View file

@@ -19,6 +19,7 @@
#include "IKTarget.h"
#include "RotationAccumulator.h"
#include "TranslationAccumulator.h"
class RotationConstraint;
@@ -56,23 +57,39 @@ public:
void setSolutionSource(SolutionSource solutionSource) { _solutionSource = solutionSource; }
void setSolutionSourceVar(const QString& solutionSourceVar) { _solutionSourceVar = solutionSourceVar; }
const AnimPose& getUncontrolledLeftHandPose() { return _uncontrolledLeftHandPose; }
const AnimPose& getUncontrolledRightHandPose() { return _uncontrolledRightHandPose; }
const AnimPose& getUncontrolledHipPose() { return _uncontrolledHipsPose; }
protected:
void computeTargets(const AnimVariantMap& animVars, std::vector<IKTarget>& targets, const AnimPoseVec& underPoses);
void solveWithCyclicCoordinateDescent(const AnimContext& context, const std::vector<IKTarget>& targets);
void solve(const AnimContext& context, const std::vector<IKTarget>& targets);
void solveTargetWithCCD(const AnimContext& context, const IKTarget& target, const AnimPoseVec& absolutePoses, bool debug);
void solveTargetWithSpline(const AnimContext& context, const IKTarget& target, const AnimPoseVec& absolutePoses, bool debug);
virtual void setSkeletonInternal(AnimSkeleton::ConstPointer skeleton) override;
struct DebugJoint {
DebugJoint() : relRot(), constrained(false) {}
DebugJoint(const glm::quat& relRotIn, bool constrainedIn) : relRot(relRotIn), constrained(constrainedIn) {}
DebugJoint(const glm::quat& relRotIn, const glm::vec3& relTransIn, bool constrainedIn) : relRot(relRotIn), relTrans(relTransIn), constrained(constrainedIn) {}
glm::quat relRot;
glm::vec3 relTrans;
bool constrained;
};
void debugDrawIKChain(std::map<int, DebugJoint>& debugJointMap, const AnimContext& context) const;
void debugDrawRelativePoses(const AnimContext& context) const;
void debugDrawConstraints(const AnimContext& context) const;
void debugDrawSpineSplines(const AnimContext& context, const std::vector<IKTarget>& targets) const;
void initRelativePosesFromSolutionSource(SolutionSource solutionSource, const AnimPoseVec& underPose);
void blendToPoses(const AnimPoseVec& targetPoses, const AnimPoseVec& underPose, float blendFactor);
// used to pre-compute information about each joint influenced by a spline IK target (see the ratio sketch at the end of this header).
struct SplineJointInfo {
int jointIndex; // joint in the skeleton that this information pertains to.
float ratio; // percentage (0..1) along the spline for this joint.
AnimPose offsetPose; // local offset from the spline to the joint.
};
void computeSplineJointInfosForIKTarget(const AnimContext& context, const IKTarget& target);
const std::vector<SplineJointInfo>* findOrCreateSplineJointInfo(const AnimContext& context, const IKTarget& target);
// for AnimDebugDraw rendering
virtual const AnimPoseVec& getPosesInternal() const override { return _relativePoses; }
@@ -105,12 +122,15 @@ protected:
};
std::map<int, RotationConstraint*> _constraints;
std::vector<RotationAccumulator> _accumulators;
std::vector<RotationAccumulator> _rotationAccumulators;
std::vector<TranslationAccumulator> _translationAccumulators;
std::vector<IKTargetVar> _targetVarVec;
AnimPoseVec _defaultRelativePoses; // poses of the relaxed state
AnimPoseVec _relativePoses; // current relative poses
AnimPoseVec _limitCenterPoses; // relative
std::map<int, std::vector<SplineJointInfo>> _splineJointInfoMap;
// experimental data for moving hips during IK
glm::vec3 _hipsOffset { Vectors::ZERO };
float _maxHipsOffsetLength{ FLT_MAX };
@@ -118,6 +138,8 @@ protected:
int _hipsIndex { -1 };
int _hipsParentIndex { -1 };
int _hipsTargetIndex { -1 };
int _leftHandIndex { -1 };
int _rightHandIndex { -1 };
// _maxTargetIndex is tracked to help optimize the recalculation of absolute poses
// during the cyclic coordinate descent algorithm
@@ -127,6 +149,10 @@ protected:
bool _previousEnableDebugIKTargets { false };
SolutionSource _solutionSource { SolutionSource::RelaxToUnderPoses };
QString _solutionSourceVar;
AnimPose _uncontrolledLeftHandPose { AnimPose() };
AnimPose _uncontrolledRightHandPose { AnimPose() };
AnimPose _uncontrolledHipsPose { AnimPose() };
};
#endif // hifi_AnimInverseKinematics_h
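SplineJointInfo's ratio places each joint at a fixed fraction of the spline, but the body of computeSplineJointInfosForIKTarget is outside this diff. Purely as an assumption about how such ratios could be derived (not the project's code), one option is each joint's accumulated default-pose distance from the base divided by the total base-to-tip distance:

    #include <glm/glm.hpp>
    #include <vector>

    // Hypothetical sketch: given the default absolute positions of the joints in a
    // spline chain, ordered base -> tip, compute each joint's ratio along the chain.
    std::vector<float> computeChainRatios(const std::vector<glm::vec3>& defaultAbsPositions) {
        std::vector<float> ratios(defaultAbsPositions.size(), 0.0f);
        float total = 0.0f;
        for (size_t i = 1; i < defaultAbsPositions.size(); i++) {
            total += glm::length(defaultAbsPositions[i] - defaultAbsPositions[i - 1]);
            ratios[i] = total; // accumulated distance from the base so far
        }
        if (total > 0.0f) {
            for (float& r : ratios) {
                r /= total; // normalize so the base is 0.0 and the tip is 1.0
            }
        }
        return ratios;
    }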

View file

@@ -44,6 +44,9 @@ void IKTarget::setType(int type) {
case (int)Type::HipsRelativeRotationAndPosition:
_type = Type::HipsRelativeRotationAndPosition;
break;
case (int)Type::Spline:
_type = Type::Spline;
break;
default:
_type = Type::Unknown;
}

View file

@@ -21,6 +21,7 @@ public:
RotationOnly,
HmdHead,
HipsRelativeRotationAndPosition,
Spline,
Unknown
};

View file

@@ -28,6 +28,7 @@
#include "AnimClip.h"
#include "AnimInverseKinematics.h"
#include "AnimSkeleton.h"
#include "AnimUtil.h"
#include "IKTarget.h"
static bool isEqual(const glm::vec3& u, const glm::vec3& v) {
@@ -401,16 +402,6 @@ void Rig::setJointRotation(int index, bool valid, const glm::quat& rotation, flo
}
}
void Rig::restoreJointRotation(int index, float fraction, float priority) {
// AJT: DEAD CODE?
ASSERT(false);
}
void Rig::restoreJointTranslation(int index, float fraction, float priority) {
// AJT: DEAD CODE?
ASSERT(false);
}
bool Rig::getJointPositionInWorldFrame(int jointIndex, glm::vec3& position, glm::vec3 translation, glm::quat rotation) const {
if (isIndexValid(jointIndex)) {
position = (rotation * _internalPoseSet._absolutePoses[jointIndex].trans()) + translation;
@@ -1040,8 +1031,8 @@ void Rig::updateFromHeadParameters(const HeadParameters& params, float dt) {
_animVars.set("hipsType", (int)IKTarget::Type::Unknown);
}
if (params.spine2Enabled) {
_animVars.set("spine2Type", (int)IKTarget::Type::RotationAndPosition);
if (params.hipsEnabled && params.spine2Enabled) {
_animVars.set("spine2Type", (int)IKTarget::Type::Spline);
_animVars.set("spine2Position", extractTranslation(params.spine2Matrix));
_animVars.set("spine2Rotation", glmExtractRotation(params.spine2Matrix));
} else {
@@ -1051,7 +1042,7 @@ void Rig::updateFromHeadParameters(const HeadParameters& params, float dt) {
if (params.leftArmEnabled) {
_animVars.set("leftArmType", (int)IKTarget::Type::RotationAndPosition);
_animVars.set("leftArmPosition", params.leftArmPosition);
_animVars.set("leftArmRotation", params.leftArmRotation);
_animVars.set("leftArmRotation", params.leftArmRotation);
} else {
_animVars.set("leftArmType", (int)IKTarget::Type::Unknown);
}
@@ -1101,9 +1092,9 @@ void Rig::updateHeadAnimVars(const HeadParameters& params) {
_animVars.set("headPosition", params.rigHeadPosition);
_animVars.set("headRotation", params.rigHeadOrientation);
if (params.hipsEnabled) {
// Since there is an explicit hips ik target, switch the head to use the more generic RotationAndPosition IK chain type.
// this will allow the spine to bend more, ensuring that it can reach the head target position.
_animVars.set("headType", (int)IKTarget::Type::RotationAndPosition);
// Since there is an explicit hips IK target, switch the head to use the more flexible Spline IK chain type.
// This will allow the spine to compress/expand and bend more naturally, ensuring that it can reach the head target position.
_animVars.set("headType", (int)IKTarget::Type::Spline);
_animVars.unset("headWeight"); // use the default weight for this target.
} else {
// When there is no hips IK target, use the HmdHead IK chain type. This will make the spine very stiff,
@@ -1135,9 +1126,9 @@ void Rig::updateEyeJoint(int index, const glm::vec3& modelTranslation, const glm
}
glm::vec3 headUp = headQuat * Vectors::UNIT_Y;
glm::vec3 z, y, x;
generateBasisVectors(lookAtVector, headUp, z, y, x);
glm::mat3 m(glm::cross(y, z), y, z);
glm::vec3 z, y, zCrossY;
generateBasisVectors(lookAtVector, headUp, z, y, zCrossY);
glm::mat3 m(-zCrossY, y, z);
glm::quat desiredQuat = glm::normalize(glm::quat_cast(m));
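// deltaQuat is the rotation that carries the current head orientation onto the desired look-at orientation (desiredQuat = deltaQuat * headQuat)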
glm::quat deltaQuat = desiredQuat * glm::inverse(headQuat);
@@ -1172,10 +1163,10 @@ void Rig::updateFromHandAndFeetParameters(const HandAndFeetParameters& params, f
// TODO: add isHipsEnabled
bool bodySensorTrackingEnabled = params.isLeftFootEnabled || params.isRightFootEnabled;
const float RELAX_DURATION = 0.6f;
if (params.isLeftEnabled) {
glm::vec3 handPosition = params.leftPosition;
if (!bodySensorTrackingEnabled) {
// prevent the hand IK targets from intersecting the body capsule
glm::vec3 displacement;
@@ -1187,16 +1178,38 @@ void Rig::updateFromHandAndFeetParameters(const HandAndFeetParameters& params, f
_animVars.set("leftHandPosition", handPosition);
_animVars.set("leftHandRotation", params.leftOrientation);
_animVars.set("leftHandType", (int)IKTarget::Type::RotationAndPosition);
_isLeftHandControlled = true;
_lastLeftHandControlledPose = AnimPose(glm::vec3(1.0f), params.leftOrientation, handPosition);
} else {
_animVars.unset("leftHandPosition");
_animVars.unset("leftHandRotation");
_animVars.set("leftHandType", (int)IKTarget::Type::HipsRelativeRotationAndPosition);
if (_isLeftHandControlled) {
_leftHandRelaxDuration = RELAX_DURATION;
_isLeftHandControlled = false;
}
if (_leftHandRelaxDuration > 0) {
// Move hand from controlled position to non-controlled position.
_leftHandRelaxDuration = std::max(_leftHandRelaxDuration - dt, 0.0f);
auto ikNode = getAnimInverseKinematicsNode();
if (ikNode) {
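// alpha ramps from 0 to 1 as the relax timer counts down, easing the hand from its last controlled pose toward the uncontrolled (under-pose) position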
float alpha = 1.0f - _leftHandRelaxDuration / RELAX_DURATION;
const AnimPose geometryToRigTransform(_geometryToRigTransform);
AnimPose uncontrolledHandPose = geometryToRigTransform * ikNode->getUncontrolledLeftHandPose();
AnimPose handPose;
::blend(1, &_lastLeftHandControlledPose, &uncontrolledHandPose, alpha, &handPose);
_animVars.set("leftHandPosition", handPose.trans());
_animVars.set("leftHandRotation", handPose.rot());
_animVars.set("leftHandType", (int)IKTarget::Type::RotationAndPosition);
}
} else {
_animVars.unset("leftHandPosition");
_animVars.unset("leftHandRotation");
_animVars.set("leftHandType", (int)IKTarget::Type::HipsRelativeRotationAndPosition);
}
}
if (params.isRightEnabled) {
glm::vec3 handPosition = params.rightPosition;
if (!bodySensorTrackingEnabled) {
// prevent the hand IK targets from intersecting the body capsule
glm::vec3 displacement;
@@ -1208,10 +1221,34 @@ void Rig::updateFromHandAndFeetParameters(const HandAndFeetParameters& params, f
_animVars.set("rightHandPosition", handPosition);
_animVars.set("rightHandRotation", params.rightOrientation);
_animVars.set("rightHandType", (int)IKTarget::Type::RotationAndPosition);
_isRightHandControlled = true;
_lastRightHandControlledPose = AnimPose(glm::vec3(1.0f), params.rightOrientation, handPosition);
} else {
_animVars.unset("rightHandPosition");
_animVars.unset("rightHandRotation");
_animVars.set("rightHandType", (int)IKTarget::Type::HipsRelativeRotationAndPosition);
if (_isRightHandControlled) {
_rightHandRelaxDuration = RELAX_DURATION;
_isRightHandControlled = false;
}
if (_rightHandRelaxDuration > 0) {
// Move hand from controlled position to non-controlled position.
_rightHandRelaxDuration = std::max(_rightHandRelaxDuration - dt, 0.0f);
auto ikNode = getAnimInverseKinematicsNode();
if (ikNode) {
float alpha = 1.0f - _rightHandRelaxDuration / RELAX_DURATION;
const AnimPose geometryToRigTransform(_geometryToRigTransform);
AnimPose uncontrolledHandPose = geometryToRigTransform * ikNode->getUncontrolledRightHandPose();
AnimPose handPose;
::blend(1, &_lastRightHandControlledPose, &uncontrolledHandPose, alpha, &handPose);
_animVars.set("rightHandPosition", handPose.trans());
_animVars.set("rightHandRotation", handPose.rot());
_animVars.set("rightHandType", (int)IKTarget::Type::RotationAndPosition);
}
} else {
_animVars.unset("rightHandPosition");
_animVars.unset("rightHandRotation");
_animVars.set("rightHandType", (int)IKTarget::Type::HipsRelativeRotationAndPosition);
}
}
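Both hand branches above implement the same relax behavior: when a controller drops out, the hand eases from its last controlled pose toward the pose the IK node reports for the uncontrolled hand over RELAX_DURATION seconds. A minimal, self-contained sketch of that blend using plain glm types (AnimPose and ::blend are project helpers; the struct and function names here are hypothetical):

    #include <glm/glm.hpp>
    #include <glm/gtc/quaternion.hpp>
    #include <algorithm>

    struct SimplePose {
        glm::quat rot = glm::quat(1.0f, 0.0f, 0.0f, 0.0f); // identity (w, x, y, z)
        glm::vec3 trans = glm::vec3(0.0f);
    };

    // Blend from the last controlled hand pose toward the uncontrolled pose as the
    // relax timer counts down from relaxDuration to zero (relaxDuration > 0 assumed).
    SimplePose relaxHandPose(const SimplePose& lastControlled, const SimplePose& uncontrolled,
                             float relaxTimeRemaining, float relaxDuration) {
        float alpha = 1.0f - std::max(relaxTimeRemaining, 0.0f) / relaxDuration; // 0 -> 1
        SimplePose out;
        out.rot = glm::slerp(lastControlled.rot, uncontrolled.rot, alpha);
        out.trans = glm::mix(lastControlled.trans, uncontrolled.trans, alpha);
        return out;
    }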
if (params.isLeftFootEnabled) {
@@ -1233,7 +1270,6 @@ void Rig::updateFromHandAndFeetParameters(const HandAndFeetParameters& params, f
_animVars.unset("rightFootRotation");
_animVars.set("rightFootType", (int)IKTarget::Type::RotationAndPosition);
}
}
}
@@ -1509,5 +1545,3 @@ void Rig::computeAvatarBoundingCapsule(
glm::vec3 rigCenter = (geometryToRig * (0.5f * (totalExtents.maximum + totalExtents.minimum)));
localOffsetOut = rigCenter - (geometryToRig * rootPosition);
}

View file

@@ -134,10 +134,6 @@ public:
void setJointTranslation(int index, bool valid, const glm::vec3& translation, float priority);
void setJointRotation(int index, bool valid, const glm::quat& rotation, float priority);
// legacy
void restoreJointRotation(int index, float fraction, float priority);
void restoreJointTranslation(int index, float fraction, float priority);
// if translation and rotation are identity, position will be in rig space
bool getJointPositionInWorldFrame(int jointIndex, glm::vec3& position,
glm::vec3 translation, glm::quat rotation) const;
@@ -355,6 +351,13 @@ private:
QMap<int, StateHandler> _stateHandlers;
int _nextStateHandlerId { 0 };
QMutex _stateMutex;
bool _isLeftHandControlled { false };
bool _isRightHandControlled { false };
float _leftHandRelaxDuration { 0.0f };
float _rightHandRelaxDuration { 0.0f };
AnimPose _lastLeftHandControlledPose;
AnimPose _lastRightHandControlledPose;
};
#endif /* defined(__hifi__Rig__) */

View file

@@ -1,5 +1,5 @@
//
// RotationAccumulator.h
// RotationAccumulator.cpp
//
// Copyright 2015 High Fidelity, Inc.
//
@@ -27,7 +27,7 @@ void RotationAccumulator::clear() {
_numRotations = 0;
}
void RotationAccumulator::clearAndClean() {
void RotationAccumulator::clearAndClean() {
clear();
_isDirty = false;
}

Some files were not shown because too many files have changed in this diff.