Merge branch 'master' of https://github.com/highfidelity/hifi into commerce_QmlWhitelist

Zach Fox 2017-12-07 11:21:24 -08:00
commit b2bafed2bb
325 changed files with 8336 additions and 6614 deletions

.gitignore

@ -15,7 +15,9 @@ Makefile
# Android Studio
*.iml
local.properties
android/libraries
android/gradle*
android/.gradle
android/app/src/main/jniLibs
# VSCode
# List taken from Github Global Ignores master@435c4d92


@ -1,25 +1,23 @@
Please read the [general build guide](BUILD.md) for information on dependencies required for all platforms. Only Android specific instructions are found in this file.
Please read the [general build guide](BUILD.md) for information on building for other platforms. Only Android-specific instructions are found in this file.
# Android Dependencies
# Dependencies
*Currently, building for Android is only supported on 64-bit Linux host environments.*
You will need the following tools to build our Android targets.
* [Qt](http://www.qt.io/download-open-source/#) ~> 5.9.1
* [Gradle](https://gradle.org/install/)
* [Android Studio](https://developer.android.com/studio/index.html)
* [Google VR SDK](https://github.com/googlevr/gvr-android-sdk/releases)
* [Gradle](https://gradle.org/releases/)
### Qt
### Gradle
Download the Qt online installer. Run the installer and select the android_armv7 binaries. Installing to the default path is recommended
Install Gradle version 4.1 or higher. Following the instructions to install via [SDKMAN!](http://sdkman.io/install.html) is recommended.
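For reference, a minimal sketch of that route, assuming SDKMAN!'s standard `sdk` command is available after installation:
```
curl -s "https://get.sdkman.io" | bash
source "$HOME/.sdkman/bin/sdkman-init.sh"
sdk install gradle 4.1
gradle --version   # should report 4.1 or later
```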
### Android Studio
Download the Android Studio installer and run it. Once installed, at the welcome screen, click Configure in the lower-right corner and select SDK Manager.
From the SDK Platforms tab, select API level 26.
* Install the ARM EABI v7a System Image if you want to run an emulator.
From the SDK Platforms tab, select API levels 24 and 26.
From the SDK Tools tab, select the following:
@ -29,123 +27,41 @@ From the SDK Tools tab select the following
* LLDB
* Android SDK Platform-Tools
* Android SDK Tools
* Android SDK Tools
* NDK (even if you have the NDK installed separately)
### Google VR SDK
# Environment
Download the 1.8 Google VR SDK [release](https://github.com/googlevr/gvr-android-sdk/archive/v1.80.0.zip). Unzip the archive to a location on your drive.
### Gradle
Download [Gradle 4.1](https://services.gradle.org/distributions/gradle-4.1-all.zip) and unzip it on your local drive. You may wish to add the location of the bin directory within the archive to your path
Setting up the environment for Android builds requires some additional steps.
#### Set up machine specific Gradle properties
Create a `gradle.properties` file in ~/.gradle. Edit the file to contain the following
Create a `gradle.properties` file in `$HOME/.gradle`. Edit the file to contain the following:
QT5_ROOT=C\:\\Qt\\5.9.1\\android_armv7
GVR_ROOT=C\:\\Android\\gvr-android-sdk
HIFI_ANDROID_PRECOMPILED=<your_home_directory>/Android/hifi_externals
Replace the paths with your local installations of Qt5 and the Google VR SDK.
Note: do not use `$HOME` for the path; it must be a fully qualified path name.
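For a Linux host, a sketch of the same file might look like the following (the install locations are assumptions; substitute your own fully qualified paths, and remember not to use `$HOME`):
```
QT5_ROOT=/home/<your_user>/Qt/5.9.1/android_armv7
HIFI_ANDROID_PRECOMPILED=/home/<your_user>/Android/hifi_externals
```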
### Set up the repository
Clone the repository
`git clone https://github.com/highfidelity/hifi.git`
Enter the repository `android` directory
`cd hifi/android`
Execute a Gradle pre-build setup. This step should only need to be done once:
`gradle setupDependencies`
# TODO fix the rest
# Building & Running
You will also need to cross-compile the dependencies required for all platforms for Android, and help CMake find these compiled libraries on your machine.
* Open Android Studio
* Choose _Open Existing Android Studio Project_
* Navigate to the `hifi` repository and choose the `android` folder and select _OK_
* If Android Studio asks you if you want to use the Gradle wrapper, select cancel and tell it where your local gradle installation is. If you used SDKMAN to install gradle it will be located in `$HOME/.sdkman/candidates/gradle/current/`
* From the _Build_ menu select _Make Project_
* Once the build completes, from the _Run_ menu select _Run App_
#### Scribe
High Fidelity has a shader pre-processing tool called `scribe` that various libraries will call on during the build process. You must compile scribe using your native toolchain (following the build instructions for your platform) and then pass a CMake variable or set an ENV variable `SCRIBE_PATH` that is a path to the scribe executable.
CMake will fail with a fatal error if it does not find the scribe executable while using the Android toolchain.
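As a sketch, assuming scribe has already been built with your native toolchain and that the CMake variable uses the same `SCRIBE_PATH` name (the path below is a placeholder):
```
# point the Android build at the natively built scribe binary
export SCRIBE_PATH=/path/to/native-build/tools/scribe/scribe
# ...or pass it explicitly when configuring with the Android toolchain
cmake .. -DUSE_ANDROID_TOOLCHAIN=true -DSCRIBE_PATH=$SCRIBE_PATH
```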
#### Optional Components
* [Oculus Mobile SDK](https://developer.oculus.com/downloads/#sdk=mobile) ~> 0.4.2
#### ANDROID_LIB_DIR
Since you won't be installing Android dependencies to system paths on your development machine, CMake will need a little help tracking down your Android dependencies.
This is most easily accomplished by installing all Android dependencies in the same folder. You can place this folder wherever you like on your machine. In this build guide and across our CMakeLists files this folder is referred to as `ANDROID_LIB_DIR`. You can set `ANDROID_LIB_DIR` in your environment or pass it when you run CMake.
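For example (the folder name is only an illustration):
```
# set it once in your environment...
export ANDROID_LIB_DIR=/home/<your_user>/android-dependencies
# ...or pass -DANDROID_LIB_DIR=<that path> on the CMake command line instead
```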
#### Qt
Install Qt 5.5.1 for Android for your host environment from the [Qt downloads page](http://www.qt.io/download/). Install Qt to ``$ANDROID_LIB_DIR/Qt``. This is required so that our root CMakeLists file can help CMake find your Android Qt installation.
The component required for the Android build is the `Android armv7` component.
If you would like to install Qt to a different location, or attempt to build with a different Qt version, you can pass `ANDROID_QT_CMAKE_PREFIX_PATH` to CMake. Point to the `cmake` folder inside `$VERSION_NUMBER/android_armv7/lib`. Otherwise, our root CMakeLists will set it to `$ANDROID_LIB_DIR/Qt/5.5/android_armv7/lib/cmake`.
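For instance, a hedged example pointing CMake at a hypothetical non-default Qt install:
```
cmake .. -DUSE_ANDROID_TOOLCHAIN=true \
    -DANDROID_QT_CMAKE_PREFIX_PATH=/opt/Qt/5.5/android_armv7/lib/cmake
```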
#### OpenSSL
Cross-compilation of OpenSSL has been tested from an OS X machine running 10.10, compiling OpenSSL 1.0.2. It is likely that the steps below will work for OpenSSL versions other than 1.0.2.
The original instructions to compile OpenSSL for Android from your host environment can be found [here](http://wiki.openssl.org/index.php/Android). We required some tweaks to get OpenSSL to compile successfully; those tweaks are explained below.
Download the [OpenSSL source](https://www.openssl.org/source/) and extract the tarball inside your `ANDROID_LIB_DIR`. Rename the extracted folder to `openssl`.
You will need the [setenv-android.sh script](http://wiki.openssl.org/index.php/File:Setenv-android.sh) from the OpenSSL wiki.
You must change three values at the top of the `setenv-android.sh` script - `_ANDROID_NDK`, `_ANDROID_EABI` and `_ANDROID_API`.
`_ANDROID_NDK` should be `android-ndk-r10`, `_ANDROID_EABI` should be `arm-linux-androideabi-4.9`, and `_ANDROID_API` should be `19`.
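After those edits, the relevant assignments near the top of `setenv-android.sh` would read roughly as follows (the exact quoting in the script may differ):
```
_ANDROID_NDK="android-ndk-r10"
_ANDROID_EABI="arm-linux-androideabi-4.9"
_ANDROID_API="19"
```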
First, make sure `ANDROID_NDK_ROOT` is set in your env. This should be the path to the root of your Android NDK install. `setenv-android.sh` needs `ANDROID_NDK_ROOT` to set the environment variables required for building OpenSSL.
Source the `setenv-android.sh` script so it can set environment variables that OpenSSL will use while compiling. If you use zsh as your shell you may need to modify the `setenv-android.sh` for it to set the correct variables in your env.
```
export ANDROID_NDK_ROOT=YOUR_NDK_ROOT
source setenv-android.sh
```
Then, from the OpenSSL directory, run the following commands.
```
perl -pi -e 's/install: all install_docs install_sw/install: install_docs install_sw/g' Makefile.org
./config shared -no-ssl2 -no-ssl3 -no-comp -no-hw -no-engine --openssldir=/usr/local/ssl/$ANDROID_API
make depend
make all
```
This should generate libcrypto and libssl in the root of the OpenSSL directory. You MUST remove the `libssl.so` and `libcrypto.so` files that are generated; they are symlinks to `libssl.so.VER` and `libcrypto.so.VER`, which Android does not know how to handle. By removing `libssl.so` and `libcrypto.so`, the FindOpenSSL module will find the static libs and use those instead.
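Concretely, from the root of the OpenSSL directory:
```
rm libssl.so libcrypto.so
```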
If you have been building other components it is possible that the OpenSSL compile will fail based on the values other cross-compilations (tbb, bullet) have set. Ensure that you are in a new terminal window to avoid compilation errors from previously set environment variables.
#### Oculus Mobile SDK
The Oculus Mobile SDK is optional, for Gear VR support. It is not required to compile gvr-interface.
Download the [Oculus Mobile SDK](https://developer.oculus.com/downloads/#sdk=mobile) and extract the archive inside your `ANDROID_LIB_DIR` folder. Rename the extracted folder to `libovr`.
From the VRLib directory, use ndk-build to build VrLib.
```
cd VRLib
ndk-build
```
This will create the liboculus.a archive that our FindLibOVR module will look for when CMake is run.
##### Hybrid testing
Currently the 'vr_dual' mode that would allow us to run a hybrid app has limited support in the Oculus Mobile SDK. The best way to have an application we can launch without having to connect to the GearVR is to put the Gear VR Service into developer mode. This stops Oculus Home from taking over the device when it is plugged into the Gear VR headset, and allows the application to be launched from the Applications page.
To put the Gear VR Service into developer mode you need an application with an Oculus Signature File on your device. Generate an Oculus Signature File for your device on the [Oculus osig tool page](https://developer.oculus.com/tools/osig/). Place this file in the gvr-interface/assets directory. CMake will automatically copy it into your APK in the right place when you execute `make gvr-interface-apk`.
Once the application is on your device, go to `Settings->Application Manager->Gear VR Service->Manage Storage`. Tap on `VR Service Version` six times. It will scan your device to verify that you have an osig file in an application on your device, and then it will let you enable Developer mode.
### CMake
We use CMake to generate the makefiles that compile and deploy the Android APKs to your device. In order to create Makefiles for the Android targets, CMake requires that some environment variables are set, and that other variables are passed to it when it is run.
The following must be set in your environment:
* ANDROID_NDK - the root of your Android NDK install
* ANDROID_HOME - the root of your Android SDK install
* ANDROID_LIB_DIR - the directory containing cross-compiled versions of dependencies
The following must be passed to CMake when it is run:
* USE_ANDROID_TOOLCHAIN - set to true to build for Android
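Putting those together, a hedged sketch of a configure-and-build run (all paths are placeholders):
```
export ANDROID_NDK=/path/to/android-ndk-r10
export ANDROID_HOME=/path/to/android-sdk
export ANDROID_LIB_DIR=/home/<your_user>/android-dependencies
mkdir android-build && cd android-build
cmake .. -DUSE_ANDROID_TOOLCHAIN=true
make gvr-interface-apk
```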


@ -1,8 +1,10 @@
set(TARGET_NAME native-lib)
setup_hifi_library()
link_hifi_libraries(shared networking gl gpu gpu-gles render-utils)
autoscribe_shader_lib(gpu model render render-utils)
target_opengl()
link_hifi_libraries(shared networking gl gpu gpu-gles image fbx render-utils physics)
target_link_libraries(native-lib android log m)
target_include_directories(native-lib PRIVATE "${GVR_ROOT}/libraries/headers")
target_link_libraries(native-lib "C:/Users/bdavis/Git/hifi/android/libraries/jni/armeabi-v7a/libgvr.so")
target_opengl()
target_googlevr()


@ -1,27 +1,32 @@
apply plugin: 'com.android.application'
ext.RELEASE_NUMBER = project.hasProperty('RELEASE_NUMBER') ? project.getProperty('RELEASE_NUMBER') : '0'
ext.RELEASE_TYPE = project.hasProperty('RELEASE_TYPE') ? project.getProperty('RELEASE_TYPE') : 'DEV'
ext.BUILD_BRANCH = project.hasProperty('BUILD_BRANCH') ? project.getProperty('BUILD_BRANCH') : ''
android {
compileSdkVersion 26
buildToolsVersion "26.0.1"
defaultConfig {
applicationId "org.saintandreas.testapp"
minSdkVersion 24
targetSdkVersion 26
versionCode 1
versionName "1.0"
ndk { abiFilters 'armeabi-v7a' }
ndk { abiFilters 'arm64-v8a' }
externalNativeBuild {
cmake {
arguments '-DHIFI_ANDROID=1',
'-DANDROID_PLATFORM=android-24',
'-DANDROID_TOOLCHAIN=clang',
'-DANDROID_STL=gnustl_shared',
'-DGVR_ROOT=' + GVR_ROOT,
'-DNATIVE_SCRIBE=c:/bin/scribe.exe',
"-DHIFI_ANDROID_PRECOMPILED=${project.rootDir}/libraries/jni/armeabi-v7a"
'-DANDROID_STL=c++_shared',
'-DQT_CMAKE_PREFIX_PATH=' + HIFI_ANDROID_PRECOMPILED + '/qt/lib/cmake',
'-DNATIVE_SCRIBE=' + HIFI_ANDROID_PRECOMPILED + '/scribe',
'-DHIFI_ANDROID_PRECOMPILED=' + HIFI_ANDROID_PRECOMPILED,
'-DRELEASE_NUMBER=' + RELEASE_NUMBER,
'-DRELEASE_TYPE=' + RELEASE_TYPE,
'-DBUILD_BRANCH=' + BUILD_BRANCH
}
}
jackOptions { enabled true }
compileOptions {
sourceCompatibility JavaVersion.VERSION_1_8
targetCompatibility JavaVersion.VERSION_1_8
@ -29,17 +34,20 @@ android {
}
buildTypes {
applicationVariants.all { variant ->
variant.outputs.all {
if (RELEASE_NUMBER != '0') {
outputFileName = "app_" + RELEASE_NUMBER + "_" + RELEASE_TYPE + ".apk"
}
}
}
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
}
}
sourceSets {
main {
jniLibs.srcDirs += '../libraries/jni';
}
}
externalNativeBuild {
cmake {
path '../../CMakeLists.txt'
@ -53,5 +61,3 @@ dependencies {
compile 'com.google.vr:sdk-audio:1.80.0'
compile 'com.google.vr:sdk-base:1.80.0'
}
build.dependsOn(':extractQt5')


@ -7,12 +7,10 @@
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/>
<uses-feature android:name="android.hardware.sensor.accelerometer" android:required="true"/>
<uses-feature android:name="android.hardware.sensor.gyroscope" android:required="true"/>
<uses-feature android:name="android.software.vr.mode" android:required="false"/>
<uses-feature android:name="android.hardware.vr.high_performance" android:required="false"/>
<application
android:allowBackup="true"
android:theme="@style/VrActivityTheme"
android:theme="@style/NoSystemUI"
android:icon="@mipmap/ic_launcher"
android:roundIcon="@mipmap/ic_launcher_round">
<activity
@ -20,17 +18,10 @@
android:label="@string/app_name"
android:screenOrientation="landscape"
android:configChanges="orientation|keyboardHidden|screenSize"
android:enableVrMode="@string/gvr_vr_mode_component"
android:resizeableActivity="false">
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.LAUNCHER" />
<category android:name="com.google.intent.category.DAYDREAM"/>
</intent-filter>
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.LAUNCHER" />
<category android:name="com.google.intent.category.CARDBOARD" />
</intent-filter>
</activity>
</application>


@ -37,12 +37,7 @@ extern "C" {
JNI_METHOD(jlong, nativeCreateRenderer)
(JNIEnv *env, jclass clazz, jobject class_loader, jobject android_context, jlong native_gvr_api) {
qInstallMessageHandler(messageHandler);
#if defined(GVR)
auto gvrContext = reinterpret_cast<gvr_context *>(native_gvr_api);
return toJni(new NativeRenderer(gvrContext));
#else
return toJni(new NativeRenderer(nullptr));
#endif
return toJni(new NativeRenderer());
}
JNI_METHOD(void, nativeDestroyRenderer)


@ -1,138 +1,14 @@
#include "renderer.h"
#include <mutex>
#include <glm/gtc/matrix_transform.hpp>
#include <QtCore/QDebug>
#include <gl/Config.h>
#include "GoogleVRHelpers.h"
#include <gl/GLShaders.h>
#include <shared/Bilateral.h>
#include <gpu/DrawTransformUnitQuad_vert.h>
#include <gpu/DrawTexcoordRectTransformUnitQuad_vert.h>
#include <gpu/DrawViewportQuadTransformTexcoord_vert.h>
#include <gpu/DrawTexture_frag.h>
#include <gpu/DrawTextureOpaque_frag.h>
#include <gpu/DrawColoredTexture_frag.h>
#include <render-utils/simple_vert.h>
#include <render-utils/simple_frag.h>
#include <render-utils/simple_textured_frag.h>
#include <render-utils/simple_textured_unlit_frag.h>
#include <render-utils/deferred_light_vert.h>
#include <render-utils/deferred_light_point_vert.h>
#include <render-utils/deferred_light_spot_vert.h>
#include <render-utils/directional_ambient_light_frag.h>
#include <render-utils/directional_skybox_light_frag.h>
#include <render-utils/standardTransformPNTC_vert.h>
#include <render-utils/standardDrawTexture_frag.h>
#include <render-utils/model_vert.h>
#include <render-utils/model_shadow_vert.h>
#include <render-utils/model_normal_map_vert.h>
#include <render-utils/model_lightmap_vert.h>
#include <render-utils/model_lightmap_normal_map_vert.h>
#include <render-utils/skin_model_vert.h>
#include <render-utils/skin_model_shadow_vert.h>
#include <render-utils/skin_model_normal_map_vert.h>
#include <render-utils/model_frag.h>
#include <render-utils/model_shadow_frag.h>
#include <render-utils/model_normal_map_frag.h>
#include <render-utils/model_normal_specular_map_frag.h>
#include <render-utils/model_specular_map_frag.h>
#include <render-utils/model_lightmap_frag.h>
#include <render-utils/model_lightmap_normal_map_frag.h>
#include <render-utils/model_lightmap_normal_specular_map_frag.h>
#include <render-utils/model_lightmap_specular_map_frag.h>
#include <render-utils/model_translucent_frag.h>
#include <render-utils/overlay3D_vert.h>
#include <render-utils/overlay3D_frag.h>
#include <render-utils/sdf_text3D_vert.h>
#include <render-utils/sdf_text3D_frag.h>
#if 0
#include <model/skybox_vert.h>
#include <model/skybox_frag.h>
#include <entities-renderer/textured_particle_frag.h>
#include <entities-renderer/textured_particle_vert.h>
#include <entities-renderer/paintStroke_vert.h>
#include <entities-renderer/paintStroke_frag.h>
#include <entities-renderer/polyvox_vert.h>
#include <entities-renderer/polyvox_frag.h>
#endif
template <typename F>
void withFrameBuffer(gvr::Frame& frame, int32_t index, F f) {
frame.BindBuffer(index);
f();
frame.Unbind();
}
static const uint64_t kPredictionTimeWithoutVsyncNanos = 50000000;
// Each shader has two variants: a single-eye ES 2.0 variant, and a multiview
// ES 3.0 variant. The multiview vertex shaders use transforms defined by
// arrays of mat4 uniforms, using gl_ViewID_OVR to determine the array index.
#define UNIFORM_LIGHT_POS 20
#define UNIFORM_M 16
#define UNIFORM_MV 8
#define UNIFORM_MVP 0
#if 0
uniform Transform { // API uses “Transform[2]” to refer to instance 2
mat4 u_MVP[2];
mat4 u_MVMatrix[2];
mat4 u_Model;
vec3 u_LightPos[2];
};
static const char *kDiffuseLightingVertexShader = R"glsl(
#version 300 es
#extension GL_OVR_multiview2 : enable
layout(num_views=2) in;
layout(location = 0) uniform mat4 u_MVP[2];
layout(location = 8) uniform mat4 u_MVMatrix[2];
layout(location = 16) uniform mat4 u_Model;
layout(location = 20) uniform vec3 u_LightPos[2];
layout(location = 0) in vec4 a_Position;
layout(location = 1) in vec4 a_Color;
layout(location = 2) in vec3 a_Normal;
out vec4 v_Color;
out vec3 v_Grid;
void main() {
mat4 mvp = u_MVP[gl_ViewID_OVR];
mat4 modelview = u_MVMatrix[gl_ViewID_OVR];
vec3 lightpos = u_LightPos[gl_ViewID_OVR];
v_Grid = vec3(u_Model * a_Position);
vec3 modelViewVertex = vec3(modelview * a_Position);
vec3 modelViewNormal = vec3(modelview * vec4(a_Normal, 0.0));
float distance = length(lightpos - modelViewVertex);
vec3 lightVector = normalize(lightpos - modelViewVertex);
float diffuse = max(dot(modelViewNormal, lightVector), 0.5);
diffuse = diffuse * (1.0 / (1.0 + (0.00001 * distance * distance)));
v_Color = vec4(a_Color.rgb * diffuse, a_Color.a);
gl_Position = mvp * a_Position;
}
)glsl";
#endif
static const char *kSimepleVertexShader = R"glsl(
#version 300 es
static const char *kSimepleVertexShader = R"glsl(#version 300 es
#extension GL_OVR_multiview2 : enable
layout(num_views=2) in;
@ -147,9 +23,7 @@ void main() {
}
)glsl";
static const char *kPassthroughFragmentShader = R"glsl(
#version 300 es
static const char *kPassthroughFragmentShader = R"glsl(#version 300 es
precision mediump float;
in vec4 v_Color;
out vec4 FragColor;
@ -157,6 +31,17 @@ out vec4 FragColor;
void main() { FragColor = v_Color; }
)glsl";
int LoadGLShader(int type, const char *shadercode) {
GLuint result = 0;
std::string shaderError;
static const std::string SHADER_DEFINES;
if (!gl::compileShader(type, shadercode, SHADER_DEFINES, result, shaderError)) {
qWarning() << "QQQ" << __FUNCTION__ << "Shader compile failure" << shaderError.c_str();
}
return result;
}
static void CheckGLError(const char* label) {
int gl_error = glGetError();
if (gl_error != GL_NO_ERROR) {
@ -167,158 +52,6 @@ static void CheckGLError(const char* label) {
}
// Contains vertex, normal and other data.
namespace cube {
const std::array<float, 108> CUBE_COORDS{{
// Front face
-1.0f, 1.0f, 1.0f,
-1.0f, -1.0f, 1.0f,
1.0f, 1.0f, 1.0f,
-1.0f, -1.0f, 1.0f,
1.0f, -1.0f, 1.0f,
1.0f, 1.0f, 1.0f,
// Right face
1.0f, 1.0f, 1.0f,
1.0f, -1.0f, 1.0f,
1.0f, 1.0f, -1.0f,
1.0f, -1.0f, 1.0f,
1.0f, -1.0f, -1.0f,
1.0f, 1.0f, -1.0f,
// Back face
1.0f, 1.0f, -1.0f,
1.0f, -1.0f, -1.0f,
-1.0f, 1.0f, -1.0f,
1.0f, -1.0f, -1.0f,
-1.0f, -1.0f, -1.0f,
-1.0f, 1.0f, -1.0f,
// Left face
-1.0f, 1.0f, -1.0f,
-1.0f, -1.0f, -1.0f,
-1.0f, 1.0f, 1.0f,
-1.0f, -1.0f, -1.0f,
-1.0f, -1.0f, 1.0f,
-1.0f, 1.0f, 1.0f,
// Top face
-1.0f, 1.0f, -1.0f,
-1.0f, 1.0f, 1.0f,
1.0f, 1.0f, -1.0f,
-1.0f, 1.0f, 1.0f,
1.0f, 1.0f, 1.0f,
1.0f, 1.0f, -1.0f,
// Bottom face
1.0f, -1.0f, -1.0f,
1.0f, -1.0f, 1.0f,
-1.0f, -1.0f, -1.0f,
1.0f, -1.0f, 1.0f,
-1.0f, -1.0f, 1.0f,
-1.0f, -1.0f, -1.0f
}};
const std::array<float, 108> CUBE_COLORS{{
// front, green
0.0f, 0.5273f, 0.2656f,
0.0f, 0.5273f, 0.2656f,
0.0f, 0.5273f, 0.2656f,
0.0f, 0.5273f, 0.2656f,
0.0f, 0.5273f, 0.2656f,
0.0f, 0.5273f, 0.2656f,
// right, blue
0.0f, 0.3398f, 0.9023f,
0.0f, 0.3398f, 0.9023f,
0.0f, 0.3398f, 0.9023f,
0.0f, 0.3398f, 0.9023f,
0.0f, 0.3398f, 0.9023f,
0.0f, 0.3398f, 0.9023f,
// back, also green
0.0f, 0.5273f, 0.2656f,
0.0f, 0.5273f, 0.2656f,
0.0f, 0.5273f, 0.2656f,
0.0f, 0.5273f, 0.2656f,
0.0f, 0.5273f, 0.2656f,
0.0f, 0.5273f, 0.2656f,
// left, also blue
0.0f, 0.3398f, 0.9023f,
0.0f, 0.3398f, 0.9023f,
0.0f, 0.3398f, 0.9023f,
0.0f, 0.3398f, 0.9023f,
0.0f, 0.3398f, 0.9023f,
0.0f, 0.3398f, 0.9023f,
// top, red
0.8359375f, 0.17578125f, 0.125f,
0.8359375f, 0.17578125f, 0.125f,
0.8359375f, 0.17578125f, 0.125f,
0.8359375f, 0.17578125f, 0.125f,
0.8359375f, 0.17578125f, 0.125f,
0.8359375f, 0.17578125f, 0.125f,
// bottom, also red
0.8359375f, 0.17578125f, 0.125f,
0.8359375f, 0.17578125f, 0.125f,
0.8359375f, 0.17578125f, 0.125f,
0.8359375f, 0.17578125f, 0.125f,
0.8359375f, 0.17578125f, 0.125f,
0.8359375f, 0.17578125f, 0.125f
}};
const std::array<float, 108> CUBE_NORMALS{{
// Front face
0.0f, 0.0f, 1.0f,
0.0f, 0.0f, 1.0f,
0.0f, 0.0f, 1.0f,
0.0f, 0.0f, 1.0f,
0.0f, 0.0f, 1.0f,
0.0f, 0.0f, 1.0f,
// Right face
1.0f, 0.0f, 0.0f,
1.0f, 0.0f, 0.0f,
1.0f, 0.0f, 0.0f,
1.0f, 0.0f, 0.0f,
1.0f, 0.0f, 0.0f,
1.0f, 0.0f, 0.0f,
// Back face
0.0f, 0.0f, -1.0f,
0.0f, 0.0f, -1.0f,
0.0f, 0.0f, -1.0f,
0.0f, 0.0f, -1.0f,
0.0f, 0.0f, -1.0f,
0.0f, 0.0f, -1.0f,
// Left face
-1.0f, 0.0f, 0.0f,
-1.0f, 0.0f, 0.0f,
-1.0f, 0.0f, 0.0f,
-1.0f, 0.0f, 0.0f,
-1.0f, 0.0f, 0.0f,
-1.0f, 0.0f, 0.0f,
// Top face
0.0f, 1.0f, 0.0f,
0.0f, 1.0f, 0.0f,
0.0f, 1.0f, 0.0f,
0.0f, 1.0f, 0.0f,
0.0f, 1.0f, 0.0f,
0.0f, 1.0f, 0.0f,
// Bottom face
0.0f, -1.0f, 0.0f,
0.0f, -1.0f, 0.0f,
0.0f, -1.0f, 0.0f,
0.0f, -1.0f, 0.0f,
0.0f, -1.0f, 0.0f,
0.0f, -1.0f, 0.0f
}};
}
namespace triangle {
static std::array<float, 9> TRIANGLE_VERTS {{
-0.5f, -0.5f, 0.0f,
@ -327,229 +60,36 @@ namespace triangle {
}};
}
std::array<gvr::BufferViewport, 2> buildViewports(const std::unique_ptr<gvr::GvrApi> &gvrapi) {
return { {gvrapi->CreateBufferViewport(), gvrapi->CreateBufferViewport()} };
};
const std::string VERTEX_SHADER_DEFINES{ R"GLSL(
#version 300 es
#extension GL_EXT_clip_cull_distance : enable
#define GPU_VERTEX_SHADER
#define GPU_SSBO_TRANSFORM_OBJECT 1
#define GPU_TRANSFORM_IS_STEREO
#define GPU_TRANSFORM_STEREO_CAMERA
#define GPU_TRANSFORM_STEREO_CAMERA_INSTANCED
#define GPU_TRANSFORM_STEREO_SPLIT_SCREEN
)GLSL" };
const std::string PIXEL_SHADER_DEFINES{ R"GLSL(
#version 300 es
precision mediump float;
#define GPU_PIXEL_SHADER
#define GPU_TRANSFORM_IS_STEREO
#define GPU_TRANSFORM_STEREO_CAMERA
#define GPU_TRANSFORM_STEREO_CAMERA_INSTANCED
#define GPU_TRANSFORM_STEREO_SPLIT_SCREEN
)GLSL" };
#if defined(GVR)
NativeRenderer::NativeRenderer(gvr_context *vrContext) :
_gvrapi(new gvr::GvrApi(vrContext, false)),
_viewports(buildViewports(_gvrapi)),
_gvr_viewer_type(_gvrapi->GetViewerType())
#else
NativeRenderer::NativeRenderer(void *vrContext)
#endif
{
start = std::chrono::system_clock::now();
qDebug() << "QQQ" << __FUNCTION__;
}
/**
* Converts a raw text file, saved as a resource, into an OpenGL ES shader.
*
* @param type The type of shader we will be creating.
* @param resId The resource ID of the raw text file.
* @return The shader object handler.
*/
int LoadGLShader(int type, const char *shadercode) {
GLuint result = 0;
std::string shaderError;
static const std::string SHADER_DEFINES;
if (!gl::compileShader(type, shadercode, SHADER_DEFINES, result, shaderError)) {
qWarning() << "QQQ" << __FUNCTION__ << "Shader compile failure" << shaderError.c_str();
}
return result;
}
// Computes a texture size that has approximately half as many pixels. This is
// equivalent to scaling each dimension by approximately sqrt(2)/2.
static gvr::Sizei HalfPixelCount(const gvr::Sizei &in) {
// Scale each dimension by sqrt(2)/2 ~= 7/10ths.
gvr::Sizei out;
out.width = (7 * in.width) / 10;
out.height = (7 * in.height) / 10;
return out;
}
#if defined(GVR)
void NativeRenderer::InitializeVR() {
_gvrapi->InitializeGl();
bool multiviewEnabled = _gvrapi->IsFeatureSupported(GVR_FEATURE_MULTIVIEW);
qWarning() << "QQQ" << __FUNCTION__ << "Multiview enabled " << multiviewEnabled;
// Because we are using 2X MSAA, we can render to half as many pixels and
// achieve similar quality.
_renderSize = HalfPixelCount(_gvrapi->GetMaximumEffectiveRenderTargetSize());
std::vector<gvr::BufferSpec> specs;
specs.push_back(_gvrapi->CreateBufferSpec());
specs[0].SetColorFormat(GVR_COLOR_FORMAT_RGBA_8888);
specs[0].SetDepthStencilFormat(GVR_DEPTH_STENCIL_FORMAT_DEPTH_16);
specs[0].SetSamples(2);
gvr::Sizei half_size = {_renderSize.width / 2, _renderSize.height};
specs[0].SetMultiviewLayers(2);
specs[0].SetSize(half_size);
_swapchain.reset(new gvr::SwapChain(_gvrapi->CreateSwapChain(specs)));
_viewportlist.reset(new gvr::BufferViewportList(_gvrapi->CreateEmptyBufferViewportList()));
}
void NativeRenderer::PrepareFramebuffer() {
const gvr::Sizei recommended_size = HalfPixelCount(
_gvrapi->GetMaximumEffectiveRenderTargetSize());
if (_renderSize.width != recommended_size.width ||
_renderSize.height != recommended_size.height) {
// We need to resize the framebuffer. Note that multiview uses two texture
// layers, each with half the render width.
gvr::Sizei framebuffer_size = recommended_size;
framebuffer_size.width /= 2;
_swapchain->ResizeBuffer(0, framebuffer_size);
_renderSize = recommended_size;
}
}
#endif
void testShaderBuild(const char* vs_src, const char * fs_src) {
std::string error;
GLuint vs, fs;
if (!gl::compileShader(GL_VERTEX_SHADER, vs_src, VERTEX_SHADER_DEFINES, vs, error) ||
!gl::compileShader(GL_FRAGMENT_SHADER, fs_src, PIXEL_SHADER_DEFINES, fs, error)) {
throw std::runtime_error("Failed to compile shader");
}
auto pr = gl::compileProgram({ vs, fs }, error);
if (!pr) {
throw std::runtime_error("Failed to link shader");
}
}
void NativeRenderer::InitializeGl() {
qDebug() << "QQQ" << __FUNCTION__;
//gl::initModuleGl();
#if defined(GVR)
InitializeVR();
#endif
glDisable(GL_DEPTH_TEST);
glDisable(GL_CULL_FACE);
glDisable(GL_SCISSOR_TEST);
glDisable(GL_BLEND);
const uint32_t vertShader = LoadGLShader(GL_VERTEX_SHADER, kSimepleVertexShader);
//const uint32_t vertShader = LoadGLShader(GL_VERTEX_SHADER, kDiffuseLightingVertexShader);
const uint32_t fragShader = LoadGLShader(GL_FRAGMENT_SHADER, kPassthroughFragmentShader);
std::string error;
_cubeProgram = gl::compileProgram({ vertShader, fragShader }, error);
_program = gl::compileProgram({ vertShader, fragShader }, error);
CheckGLError("build program");
glGenBuffers(1, &_cubeBuffer);
glBindBuffer(GL_ARRAY_BUFFER, _cubeBuffer);
glGenBuffers(1, &_geometryBuffer);
glBindBuffer(GL_ARRAY_BUFFER, _geometryBuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(float) * 9, triangle::TRIANGLE_VERTS.data(), GL_STATIC_DRAW);
/*
glBufferData(GL_ARRAY_BUFFER, sizeof(float) * 108 * 3, NULL, GL_STATIC_DRAW);
glBufferSubData(GL_ARRAY_BUFFER, sizeof(float) * 108 * 0, sizeof(float) * 108, cube::CUBE_COORDS.data());
glBufferSubData(GL_ARRAY_BUFFER, sizeof(float) * 108 * 1, sizeof(float) * 108, cube::CUBE_COLORS.data());
glBufferSubData(GL_ARRAY_BUFFER, sizeof(float) * 108 * 2, sizeof(float) * 108, cube::CUBE_NORMALS.data());
*/
glBindBuffer(GL_ARRAY_BUFFER, 0);
CheckGLError("upload vertices");
glGenVertexArrays(1, &_cubeVao);
glBindBuffer(GL_ARRAY_BUFFER, _cubeBuffer);
glBindVertexArray(_cubeVao);
glGenVertexArrays(1, &_vao);
glBindBuffer(GL_ARRAY_BUFFER, _geometryBuffer);
glBindVertexArray(_vao);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 3 * sizeof(float), (void*)0);
glEnableVertexAttribArray(0);
/*
glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0);
glEnableVertexAttribArray(0);
glVertexAttribPointer(1, 3, GL_FLOAT, false, 0, (const void*)(sizeof(float) * 108 * 1) );
glEnableVertexAttribArray(1);
glVertexAttribPointer(2, 3, GL_FLOAT, false, 0, (const void*)(sizeof(float) * 108 * 2));
glEnableVertexAttribArray(2);
*/
glBindVertexArray(0);
glBindBuffer(GL_ARRAY_BUFFER, 0);
CheckGLError("build vao ");
static std::once_flag once;
std::call_once(once, [&]{
testShaderBuild(sdf_text3D_vert, sdf_text3D_frag);
testShaderBuild(DrawTransformUnitQuad_vert, DrawTexture_frag);
testShaderBuild(DrawTexcoordRectTransformUnitQuad_vert, DrawTexture_frag);
testShaderBuild(DrawViewportQuadTransformTexcoord_vert, DrawTexture_frag);
testShaderBuild(DrawTransformUnitQuad_vert, DrawTextureOpaque_frag);
testShaderBuild(DrawTransformUnitQuad_vert, DrawColoredTexture_frag);
testShaderBuild(simple_vert, simple_frag);
testShaderBuild(simple_vert, simple_textured_frag);
testShaderBuild(simple_vert, simple_textured_unlit_frag);
testShaderBuild(deferred_light_vert, directional_ambient_light_frag);
testShaderBuild(deferred_light_vert, directional_skybox_light_frag);
testShaderBuild(standardTransformPNTC_vert, standardDrawTexture_frag);
testShaderBuild(standardTransformPNTC_vert, DrawTextureOpaque_frag);
testShaderBuild(model_vert, model_frag);
testShaderBuild(model_normal_map_vert, model_normal_map_frag);
testShaderBuild(model_vert, model_specular_map_frag);
testShaderBuild(model_normal_map_vert, model_normal_specular_map_frag);
testShaderBuild(model_vert, model_translucent_frag);
testShaderBuild(model_normal_map_vert, model_translucent_frag);
testShaderBuild(model_lightmap_vert, model_lightmap_frag);
testShaderBuild(model_lightmap_normal_map_vert, model_lightmap_normal_map_frag);
testShaderBuild(model_lightmap_vert, model_lightmap_specular_map_frag);
testShaderBuild(model_lightmap_normal_map_vert, model_lightmap_normal_specular_map_frag);
testShaderBuild(skin_model_vert, model_frag);
testShaderBuild(skin_model_normal_map_vert, model_normal_map_frag);
testShaderBuild(skin_model_vert, model_specular_map_frag);
testShaderBuild(skin_model_normal_map_vert, model_normal_specular_map_frag);
testShaderBuild(skin_model_vert, model_translucent_frag);
testShaderBuild(skin_model_normal_map_vert, model_translucent_frag);
testShaderBuild(model_shadow_vert, model_shadow_frag);
testShaderBuild(overlay3D_vert, overlay3D_frag);
#if 0
testShaderBuild(textured_particle_vert, textured_particle_frag);
testShaderBuild(skybox_vert, skybox_frag);
testShaderBuild(paintStroke_vert,paintStroke_frag);
testShaderBuild(polyvox_vert, polyvox_frag);
#endif
});
qDebug() << "done";
}
static const float kZNear = 1.0f;
static const float kZFar = 100.0f;
static const gvr_rectf fullscreen = {0, 1, 0, 1};
void NativeRenderer::DrawFrame() {
auto now = std::chrono::duration_cast<std::chrono::milliseconds>(
@ -559,65 +99,12 @@ void NativeRenderer::DrawFrame() {
v.g = 1.0f - v.r;
v.b = 1.0f;
PrepareFramebuffer();
// A client app does its rendering here.
gvr::ClockTimePoint target_time = gvr::GvrApi::GetTimePointNow();
target_time.monotonic_system_time_nanos += kPredictionTimeWithoutVsyncNanos;
using namespace googlevr;
using namespace bilateral;
const auto gvrHeadPose = _gvrapi->GetHeadSpaceFromStartSpaceRotation(target_time);
_head_view = toGlm(gvrHeadPose);
_viewportlist->SetToRecommendedBufferViewports();
glm::mat4 eye_views[2];
for_each_side([&](bilateral::Side side) {
int eye = index(side);
const gvr::Eye gvr_eye = eye == 0 ? GVR_LEFT_EYE : GVR_RIGHT_EYE;
const auto& eyeView = eye_views[eye] = toGlm(_gvrapi->GetEyeFromHeadMatrix(gvr_eye)) * _head_view;
auto& viewport = _viewports[eye];
_viewportlist->GetBufferViewport(eye, &viewport);
viewport.SetSourceUv(fullscreen);
viewport.SetSourceLayer(eye);
_viewportlist->SetBufferViewport(eye, viewport);
const auto &mvc = _modelview_cube[eye] = eyeView * _model_cube;
const auto &mvf = _modelview_floor[eye] = eyeView * _model_floor;
const gvr_rectf fov = viewport.GetSourceFov();
const glm::mat4 perspective = perspectiveMatrixFromView(fov, kZNear, kZFar);
_modelview_projection_cube[eye] = perspective * mvc;
_modelview_projection_floor[eye] = perspective * mvf;
_light_pos_eye_space[eye] = glm::vec3(eyeView * _light_pos_world_space);
});
gvr::Frame frame = _swapchain->AcquireFrame();
withFrameBuffer(frame, 0, [&]{
glClearColor(v.r, v.g, v.b, 1);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glViewport(0, 0, _renderSize.width / 2, _renderSize.height);
glUseProgram(_cubeProgram);
glBindVertexArray(_cubeVao);
glDrawArrays(GL_TRIANGLES, 0, 3);
/*
float* fp;
fp = (float*)&_light_pos_eye_space[0];
glUniform3fv(UNIFORM_LIGHT_POS, 2, fp);
fp = (float*)&_modelview_cube[0];
glUniformMatrix4fv(UNIFORM_MV, 2, GL_FALSE, fp);
fp = (float*)&_modelview_projection_cube[0];
glUniformMatrix4fv(UNIFORM_MVP, 2, GL_FALSE, fp);
fp = (float*)&_model_cube;
glUniformMatrix4fv(UNIFORM_M, 1, GL_FALSE, fp);
glDrawArrays(GL_TRIANGLES, 0, 36);
*/
glBindVertexArray(0);
});
frame.Submit(*_viewportlist, gvrHeadPose);
CheckGLError("onDrawFrame");
glClearColor(v.r, v.g, v.b, 1.0f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glUseProgram(_program);
glBindVertexArray(_vao);
glDrawArrays(GL_TRIANGLES, 0, 3);
glBindVertexArray(0);
}
void NativeRenderer::OnTriggerEvent() {
@ -626,11 +113,8 @@ void NativeRenderer::OnTriggerEvent() {
void NativeRenderer::OnPause() {
qDebug() << "QQQ" << __FUNCTION__;
_gvrapi->PauseTracking();
}
void NativeRenderer::OnResume() {
qDebug() << "QQQ" << __FUNCTION__;
_gvrapi->ResumeTracking();
_gvrapi->RefreshViewerProfile();
}


@ -4,21 +4,8 @@
#include <array>
#include <glm/glm.hpp>
#define GVR
#if defined(GVR)
#include <vr/gvr/capi/include/gvr.h>
#endif
class NativeRenderer {
public:
#if defined(GVR)
NativeRenderer(gvr_context* vrContext);
#else
NativeRenderer(void* vrContext);
#endif
void InitializeGl();
void DrawFrame();
void OnTriggerEvent();
@ -26,35 +13,9 @@ public:
void OnResume();
private:
std::chrono::time_point<std::chrono::system_clock> start { std::chrono::system_clock::now() };
std::chrono::time_point<std::chrono::system_clock> start;
#if defined(GVR)
void InitializeVR();
void PrepareFramebuffer();
std::unique_ptr<gvr::GvrApi> _gvrapi;
gvr::ViewerType _gvr_viewer_type;
std::unique_ptr<gvr::BufferViewportList> _viewportlist;
std::unique_ptr<gvr::SwapChain> _swapchain;
std::array<gvr::BufferViewport, 2> _viewports;
gvr::Sizei _renderSize;
#endif
uint32_t _cubeBuffer { 0 };
uint32_t _cubeVao { 0 };
uint32_t _cubeProgram { 0 };
glm::mat4 _head_view;
glm::mat4 _model_cube;
glm::mat4 _camera;
glm::mat4 _view;
glm::mat4 _model_floor;
std::array<glm::mat4, 2> _modelview_cube;
std::array<glm::mat4, 2> _modelview_floor;
std::array<glm::mat4, 2> _modelview_projection_cube;
std::array<glm::mat4, 2> _modelview_projection_floor;
std::array<glm::vec3, 2> _light_pos_eye_space;
const glm::vec4 _light_pos_world_space{ 0, 2, 0, 1};
uint32_t _geometryBuffer { 0 };
uint32_t _vao { 0 };
uint32_t _program { 0 };
};


@ -26,10 +26,9 @@ public class MainActivity extends Activity {
}
private long nativeRenderer;
private GvrLayout gvrLayout;
private GLSurfaceView surfaceView;
private native long nativeCreateRenderer(ClassLoader appClassLoader, Context context, long nativeGvrContext);
private native long nativeCreateRenderer(ClassLoader appClassLoader, Context context);
private native void nativeDestroyRenderer(long renderer);
private native void nativeInitializeGl(long renderer);
private native void nativeDrawFrame(long renderer);
@ -55,30 +54,21 @@ public class MainActivity extends Activity {
if ((visibility & View.SYSTEM_UI_FLAG_FULLSCREEN) == 0) { setImmersiveSticky(); }
});
gvrLayout = new GvrLayout(this);
nativeRenderer = nativeCreateRenderer(
getClass().getClassLoader(),
getApplicationContext(),
gvrLayout.getGvrApi().getNativeGvrContext());
getApplicationContext());
surfaceView = new GLSurfaceView(this);
surfaceView.setEGLContextClientVersion(3);
surfaceView.setEGLConfigChooser(8, 8, 8, 0, 0, 0);
surfaceView.setPreserveEGLContextOnPause(true);
surfaceView.setRenderer(new NativeRenderer());
gvrLayout.setPresentationView(surfaceView);
setContentView(gvrLayout);
if (gvrLayout.setAsyncReprojectionEnabled(true)) {
AndroidCompat.setSustainedPerformanceMode(this, true);
}
AndroidCompat.setVrModeEnabled(this, true);
setContentView(surfaceView);
}
@Override
protected void onDestroy() {
super.onDestroy();
gvrLayout.shutdown();
nativeDestroyRenderer(nativeRenderer);
nativeRenderer = 0;
}
@ -87,14 +77,12 @@ public class MainActivity extends Activity {
protected void onPause() {
surfaceView.queueEvent(()->nativeOnPause(nativeRenderer));
surfaceView.onPause();
gvrLayout.onPause();
super.onPause();
}
@Override
protected void onResume() {
super.onResume();
gvrLayout.onResume();
surfaceView.onResume();
surfaceView.queueEvent(()->nativeOnResume(nativeRenderer));
}


@ -1,91 +1,216 @@
// Top-level build file where you can add configuration options common to all sub-projects/modules.
buildscript {
repositories {
jcenter()
google()
}
dependencies {
classpath 'com.android.tools.build:gradle:2.3.3'
// NOTE: Do not place your application dependencies here; they belong
// in the individual module build.gradle files
classpath 'com.android.tools.build:gradle:3.0.1'
}
}
plugins {
id 'de.undercouch.download' version '3.3.0'
}
allprojects {
repositories {
jcenter()
google()
}
}
def baseFolder = new File(HIFI_ANDROID_PRECOMPILED)
def jniFolder = new File('app/src/main/jniLibs/arm64-v8a')
import org.apache.tools.ant.taskdefs.condition.Os
def baseUrl = 'https://hifi-public.s3.amazonaws.com/austin/android/'
def qtFile='qt-5.9.3_linux_armv8-libcpp.tgz'
def qtChecksum='547da3547d5690144e23d6504c6d6e91'
if (Os.isFamily(Os.FAMILY_MAC)) {
qtFile = 'qt-5.9.3_osx_armv8-libcpp.tgz'
qtChecksum='6fa3e068cfdee863fc909b294a3a0cc6'
} else if (Os.isFamily(Os.FAMILY_WINDOWS)) {
qtFile = 'qt-5.9.3_win_armv8-libcpp.tgz'
qtChecksum='3a757378a7e9dbbfc662177e0eb46408'
}
def packages = [
qt: [
file: qtFile,
checksum: qtChecksum,
sharedLibFolder: '',
includeLibs: ['lib/*.so', 'plugins/*/*.so']
],
bullet: [
file: 'bullet-2.83_armv8-libcpp.tgz',
checksum: '2c558d604fce337f5eba3eb7ec1252fd'
],
draco: [
file: 'draco_armv8-libcpp.tgz',
checksum: '617a80d213a5ec69fbfa21a1f2f738cd'
],
gvr: [
file: 'gvrsdk_v1.101.0.tgz',
checksum: '57fd02baa069176ba18597a29b6b4fc7'
],
openssl: [
file: 'openssl-1.1.0g_armv8.tgz',
checksum: 'cabb681fbccd79594f65fcc266e02f32'
],
polyvox: [
file: 'polyvox_armv8-libcpp.tgz',
checksum: '5c918288741ee754c16aeb12bb46b9e1',
sharedLibFolder: 'lib',
includeLibs: ['Release/libPolyVoxCore.so', 'libPolyVoxUtil.so']
],
tbb: [
file: 'tbb-2018_U1_armv8_libcpp.tgz',
checksum: '20768f298f53b195e71b414b0ae240c4',
sharedLibFolder: 'lib/release',
includeLibs: ['libtbb.so', 'libtbbmalloc.so']
]
]
task downloadDependencies {
doLast {
packages.each { entry ->
def filename = entry.value['file'];
def url = baseUrl + filename;
download {
src url
dest new File(baseFolder, filename)
onlyIfNewer true
}
}
}
}
import de.undercouch.gradle.tasks.download.Verify
task verifyQt(type: Verify) { def p = packages['qt']; src new File(baseFolder, p['file']); checksum p['checksum']; }
task verifyBullet(type: Verify) { def p = packages['bullet']; src new File(baseFolder, p['file']); checksum p['checksum'] }
task verifyDraco(type: Verify) { def p = packages['draco']; src new File(baseFolder, p['file']); checksum p['checksum'] }
task verifyGvr(type: Verify) { def p = packages['gvr']; src new File(baseFolder, p['file']); checksum p['checksum'] }
task verifyOpenSSL(type: Verify) { def p = packages['openssl']; src new File(baseFolder, p['file']); checksum p['checksum'] }
task verifyPolyvox(type: Verify) { def p = packages['polyvox']; src new File(baseFolder, p['file']); checksum p['checksum'] }
task verifyTBB(type: Verify) { def p = packages['tbb']; src new File(baseFolder, p['file']); checksum p['checksum'] }
task verifyDependencyDownloads(dependsOn: downloadDependencies) { }
verifyDependencyDownloads.dependsOn verifyQt
verifyDependencyDownloads.dependsOn verifyBullet
verifyDependencyDownloads.dependsOn verifyDraco
verifyDependencyDownloads.dependsOn verifyGvr
verifyDependencyDownloads.dependsOn verifyOpenSSL
verifyDependencyDownloads.dependsOn verifyPolyvox
verifyDependencyDownloads.dependsOn verifyTBB
task extractDependencies(dependsOn: verifyDependencyDownloads) {
doLast {
packages.each { entry ->
def folder = entry.key;
def filename = entry.value['file'];
def localFile = new File(HIFI_ANDROID_PRECOMPILED, filename)
def localFolder = new File(HIFI_ANDROID_PRECOMPILED, folder)
copy {
from tarTree(resources.gzip(localFile))
into localFolder
}
}
}
}
task copyDependencies(dependsOn: extractDependencies) {
doLast {
packages.each { entry ->
def packageName = entry.key
def currentPackage = entry.value;
if (currentPackage.containsKey('sharedLibFolder')) {
def localFolder = new File(baseFolder, packageName + '/' + currentPackage['sharedLibFolder'])
def tree = fileTree(localFolder);
if (currentPackage.containsKey('includeLibs')) {
currentPackage['includeLibs'].each { includeSpec -> tree.include includeSpec }
}
tree.visit { element ->
if (!element.file.isDirectory()) {
copy { from element.file; into jniFolder }
}
}
}
}
}
}
def scribeFile='scribe_linux_x86_64'
def scribeLocalFile='scribe'
def scribeChecksum='c98678d9726bd8bbf1bab792acf3ff6c'
if (Os.isFamily(Os.FAMILY_MAC)) {
scribeFile = 'scribe_osx_x86_64'
scribeChecksum='a137ad62c1bf7cca739da219544a9a16'
} else if (Os.isFamily(Os.FAMILY_WINDOWS)) {
scribeFile = 'scribe_win32_x86_64.exe'
scribeLocalFile = 'scribe.exe'
scribeChecksum='75c2ce9ed45d17de375e3988bfaba816'
}
import de.undercouch.gradle.tasks.download.Download
task downloadScribe(type: Download) {
src baseUrl + scribeFile
dest new File(baseFolder, scribeLocalFile)
onlyIfNewer true
}
task verifyScribe (type: Verify, dependsOn: downloadScribe) {
src new File(baseFolder, scribeLocalFile);
checksum scribeChecksum
}
task fixScribePermissions(type: Exec, dependsOn: verifyScribe) {
commandLine 'chmod', 'a+x', HIFI_ANDROID_PRECOMPILED + '/' + scribeLocalFile
}
task setupScribe(dependsOn: verifyScribe) { }
// On Windows, we don't need to set the executable bit, but on OSX and Unix we do
if (!Os.isFamily(Os.FAMILY_WINDOWS)) {
setupScribe.dependsOn fixScribePermissions
}
task extractGvrBinaries(dependsOn: extractDependencies) {
doLast {
def gvrLibFolder = new File(HIFI_ANDROID_PRECOMPILED, 'gvr/gvr-android-sdk-1.101.0/libraries');
zipTree(new File(HIFI_ANDROID_PRECOMPILED, 'gvr/gvr-android-sdk-1.101.0/libraries/sdk-audio-1.101.0.aar')).visit { element ->
def fileName = element.file.toString();
if (fileName.endsWith('libgvr_audio.so') && fileName.contains('arm64-v8a')) {
copy { from element.file; into gvrLibFolder }
}
}
zipTree(new File(HIFI_ANDROID_PRECOMPILED, 'gvr/gvr-android-sdk-1.101.0/libraries/sdk-base-1.101.0.aar')).visit { element ->
def fileName = element.file.toString();
if (fileName.endsWith('libgvr.so') && fileName.contains('arm64-v8a')) {
copy { from element.file; into gvrLibFolder }
}
}
fileTree(gvrLibFolder).visit { element ->
if (element.file.toString().endsWith('.so')) {
copy { from element.file; into jniFolder }
}
}
}
}
task setupDependencies(dependsOn: [setupScribe, copyDependencies, extractGvrBinaries]) {
}
task cleanDependencies(type: Delete) {
delete HIFI_ANDROID_PRECOMPILED
delete 'app/src/main/jniLibs/arm64-v8a'
}
task clean(type: Delete) {
delete rootProject.buildDir
}
task extractQt5jars(type: Copy) {
from fileTree(QT5_ROOT + "/jar")
into("${project.rootDir}/libraries/jar")
include("*.jar")
}
task extractQt5so(type: Copy) {
from fileTree(QT5_ROOT + "/lib")
into("${project.rootDir}/libraries/jni/armeabi-v7a/")
include("libQt5AndroidExtras.so")
include("libQt5Concurrent.so")
include("libQt5Core.so")
include("libQt5Gamepad.so")
include("libQt5Gui.so")
include("libQt5Location.so")
include("libQt5Multimedia.so")
include("libQt5MultimediaQuick_p.so")
include("libQt5Network.so")
include("libQt5NetworkAuth.so")
include("libQt5OpenGL.so")
include("libQt5Positioning.so")
include("libQt5Qml.so")
include("libQt5Quick.so")
include("libQt5QuickControls2.so")
include("libQt5QuickParticles.so")
include("libQt5QuickTemplates2.so")
include("libQt5QuickWidgets.so")
include("libQt5Script.so")
include("libQt5ScriptTools.so")
include("libQt5Sensors.so")
include("libQt5Svg.so")
include("libQt5WebChannel.so")
include("libQt5WebSockets.so")
include("libQt5WebView.so")
include("libQt5Widgets.so")
include("libQt5Xml.so")
include("libQt5XmlPatterns.so")
}
task extractAudioSo(type: Copy) {
from zipTree(GVR_ROOT + "/libraries/sdk-audio-1.80.0.aar")
into "${project.rootDir}/libraries/"
include "jni/armeabi-v7a/libgvr_audio.so"
}
task extractGvrSo(type: Copy) {
from zipTree(GVR_ROOT + "/libraries/sdk-base-1.80.0.aar")
into "${project.rootDir}/libraries/"
include "jni/armeabi-v7a/libgvr.so"
}
task extractNdk { }
extractNdk.dependsOn extractAudioSo
extractNdk.dependsOn extractGvrSo
task extractQt5 { }
extractQt5.dependsOn extractQt5so
extractQt5.dependsOn extractQt5jars
task extractBinaries { }
extractBinaries.dependsOn extractQt5
extractBinaries.dependsOn extractNdk
task deleteBinaries(type: Delete) {
delete "${project.rootDir}/libraries/jni"
}
//clean.dependsOn(deleteBinaries)

android/setupGVR.gradle (new file)

@ -0,0 +1,41 @@
buildscript {
repositories {
jcenter()
google()
}
dependencies {
classpath 'com.android.tools.build:gradle:3.0.1'
classpath 'de.undercouch:gradle-download-task:3.3.0'
}
}
def file='gvrsdk_v1.101.0.tgz'
def url='https://hifi-public.s3.amazonaws.com/austin/android/' + file
def destFile = new File(HIFI_ANDROID_PRECOMPILED, file)
// FIXME find a way to only download if the file doesn't exist
task downloadGVR(type: de.undercouch.gradle.tasks.download.Download) {
src url
dest destFile
}
task extractGVR(dependsOn: downloadGVR, type: Copy) {
from tarTree(resources.gzip(destFile))
into new File(HIFI_ANDROID_PRECOMPILED, 'gvr')
}
task copyGVRAudioLibs(dependsOn: extractGVR, type: Copy) {
from zipTree(new File(HIFI_ANDROID_PRECOMPILED, 'gvr/gvr-android-sdk-1.101.0/libraries/sdk-audio-1.101.0.aar'))
include 'jni/arm64-v8a/libgvr_audio.so'
into HIFI_ANDROID_PRECOMPILED
}
task copyGVRLibs(dependsOn: extractGVR, type: Copy) {
from zipTree(new File(HIFI_ANDROID_PRECOMPILED, 'gvr/gvr-android-sdk-1.101.0/libraries/sdk-base-1.101.0.aar'))
include 'jni/arm64-v8a/libgvr.so'
into HIFI_ANDROID_PRECOMPILED
}
task setupGVR(dependsOn: [copyGVRLibs, copyGVRAudioLibs]) {
}


@ -441,7 +441,7 @@ void Agent::executeScript() {
Transform audioTransform;
auto headOrientation = scriptedAvatar->getHeadOrientation();
audioTransform.setTranslation(scriptedAvatar->getPosition());
audioTransform.setTranslation(scriptedAvatar->getWorldPosition());
audioTransform.setRotation(headOrientation);
QByteArray encodedBuffer;
@ -452,7 +452,7 @@ void Agent::executeScript() {
}
AbstractAudioInterface::emitAudioPacket(encodedBuffer.data(), encodedBuffer.size(), audioSequenceNumber,
audioTransform, scriptedAvatar->getPosition(), glm::vec3(0),
audioTransform, scriptedAvatar->getWorldPosition(), glm::vec3(0),
packetType, _selectedCodecName);
});
@ -742,10 +742,10 @@ void Agent::processAgentAvatarAudio() {
audioPacket->writePrimitive(numAvailableSamples);
// use the orientation and position of this avatar for the source of this audio
audioPacket->writePrimitive(scriptedAvatar->getPosition());
audioPacket->writePrimitive(scriptedAvatar->getWorldPosition());
glm::quat headOrientation = scriptedAvatar->getHeadOrientation();
audioPacket->writePrimitive(headOrientation);
audioPacket->writePrimitive(scriptedAvatar->getPosition());
audioPacket->writePrimitive(scriptedAvatar->getWorldPosition());
audioPacket->writePrimitive(glm::vec3(0));
// no matter what, the loudness should be set to 0
@ -759,10 +759,10 @@ void Agent::processAgentAvatarAudio() {
audioPacket->writePrimitive((quint8)0);
// use the orientation and position of this avatar for the source of this audio
audioPacket->writePrimitive(scriptedAvatar->getPosition());
audioPacket->writePrimitive(scriptedAvatar->getWorldPosition());
glm::quat headOrientation = scriptedAvatar->getHeadOrientation();
audioPacket->writePrimitive(headOrientation);
audioPacket->writePrimitive(scriptedAvatar->getPosition());
audioPacket->writePrimitive(scriptedAvatar->getWorldPosition()); // HUH? why do we write this twice??
audioPacket->writePrimitive(glm::vec3(0));
QByteArray encodedBuffer;


@ -29,6 +29,7 @@
#include <UUID.h>
#include <CPUDetect.h>
#include "AudioLogging.h"
#include "AudioHelpers.h"
#include "AudioRingBuffer.h"
#include "AudioMixerClientData.h"
@ -130,7 +131,7 @@ void AudioMixer::queueReplicatedAudioPacket(QSharedPointer<ReceivedMessage> mess
PacketType rewrittenType = PacketTypeEnum::getReplicatedPacketMapping().key(message->getType());
if (rewrittenType == PacketType::Unknown) {
qDebug() << "Cannot unwrap replicated packet type not present in REPLICATED_PACKET_WRAPPING";
qCDebug(audio) << "Cannot unwrap replicated packet type not present in REPLICATED_PACKET_WRAPPING";
}
auto replicatedMessage = QSharedPointer<ReceivedMessage>::create(audioData, rewrittenType,
@ -345,7 +346,7 @@ void AudioMixer::sendStatsPacket() {
void AudioMixer::run() {
qDebug() << "Waiting for connection to domain to request settings from domain-server.";
qCDebug(audio) << "Waiting for connection to domain to request settings from domain-server.";
// wait until we have the domain-server settings, otherwise we bail
DomainHandler& domainHandler = DependencyManager::get<NodeList>()->getDomainHandler();
@ -502,14 +503,14 @@ void AudioMixer::throttle(std::chrono::microseconds duration, int frame) {
int proportionalTerm = 1 + (_trailingMixRatio - TARGET) / 0.1f;
_throttlingRatio += THROTTLE_RATE * proportionalTerm;
_throttlingRatio = std::min(_throttlingRatio, 1.0f);
qDebug("audio-mixer is struggling (%f mix/sleep) - throttling %f of streams",
(double)_trailingMixRatio, (double)_throttlingRatio);
qCDebug(audio) << "audio-mixer is struggling (" << _trailingMixRatio << "mix/sleep) - throttling"
<< _throttlingRatio << "of streams";
} else if (_throttlingRatio > 0.0f && _trailingMixRatio <= BACKOFF_TARGET) {
int proportionalTerm = 1 + (TARGET - _trailingMixRatio) / 0.2f;
_throttlingRatio -= BACKOFF_RATE * proportionalTerm;
_throttlingRatio = std::max(_throttlingRatio, 0.0f);
qDebug("audio-mixer is recovering (%f mix/sleep) - throttling %f of streams",
(double)_trailingMixRatio, (double)_throttlingRatio);
qCDebug(audio) << "audio-mixer is recovering (" << _trailingMixRatio << "mix/sleep) - throttling"
<< _throttlingRatio << "of streams";
}
}
}
@ -534,7 +535,7 @@ void AudioMixer::clearDomainSettings() {
}
void AudioMixer::parseSettingsObject(const QJsonObject& settingsObject) {
qDebug() << "AVX2 Support:" << (cpuSupportsAVX2() ? "enabled" : "disabled");
qCDebug(audio) << "AVX2 Support:" << (cpuSupportsAVX2() ? "enabled" : "disabled");
if (settingsObject.contains(AUDIO_THREADING_GROUP_KEY)) {
QJsonObject audioThreadingGroupObject = settingsObject[AUDIO_THREADING_GROUP_KEY].toObject();
@ -557,7 +558,7 @@ void AudioMixer::parseSettingsObject(const QJsonObject& settingsObject) {
const QString DYNAMIC_JITTER_BUFFER_JSON_KEY = "dynamic_jitter_buffer";
bool enableDynamicJitterBuffer = audioBufferGroupObject[DYNAMIC_JITTER_BUFFER_JSON_KEY].toBool();
if (enableDynamicJitterBuffer) {
qDebug() << "Enabling dynamic jitter buffers.";
qCDebug(audio) << "Enabling dynamic jitter buffers.";
bool ok;
const QString DESIRED_JITTER_BUFFER_FRAMES_KEY = "static_desired_jitter_buffer_frames";
@ -565,9 +566,9 @@ void AudioMixer::parseSettingsObject(const QJsonObject& settingsObject) {
if (!ok) {
_numStaticJitterFrames = InboundAudioStream::DEFAULT_STATIC_JITTER_FRAMES;
}
qDebug() << "Static desired jitter buffer frames:" << _numStaticJitterFrames;
qCDebug(audio) << "Static desired jitter buffer frames:" << _numStaticJitterFrames;
} else {
qDebug() << "Disabling dynamic jitter buffers.";
qCDebug(audio) << "Disabling dynamic jitter buffers.";
_numStaticJitterFrames = DISABLE_STATIC_JITTER_FRAMES;
}
@ -621,7 +622,7 @@ void AudioMixer::parseSettingsObject(const QJsonObject& settingsObject) {
if (audioEnvGroupObject[CODEC_PREFERENCE_ORDER].isString()) {
QString codecPreferenceOrder = audioEnvGroupObject[CODEC_PREFERENCE_ORDER].toString();
_codecPreferenceOrder = codecPreferenceOrder.split(",");
qDebug() << "Codec preference order changed to" << _codecPreferenceOrder;
qCDebug(audio) << "Codec preference order changed to" << _codecPreferenceOrder;
}
const QString ATTENATION_PER_DOULING_IN_DISTANCE = "attenuation_per_doubling_in_distance";
@ -630,7 +631,7 @@ void AudioMixer::parseSettingsObject(const QJsonObject& settingsObject) {
float attenuation = audioEnvGroupObject[ATTENATION_PER_DOULING_IN_DISTANCE].toString().toFloat(&ok);
if (ok) {
_attenuationPerDoublingInDistance = attenuation;
qDebug() << "Attenuation per doubling in distance changed to" << _attenuationPerDoublingInDistance;
qCDebug(audio) << "Attenuation per doubling in distance changed to" << _attenuationPerDoublingInDistance;
}
}
@ -640,7 +641,7 @@ void AudioMixer::parseSettingsObject(const QJsonObject& settingsObject) {
float noiseMutingThreshold = audioEnvGroupObject[NOISE_MUTING_THRESHOLD].toString().toFloat(&ok);
if (ok) {
_noiseMutingThreshold = noiseMutingThreshold;
qDebug() << "Noise muting threshold changed to" << _noiseMutingThreshold;
qCDebug(audio) << "Noise muting threshold changed to" << _noiseMutingThreshold;
}
}
@ -680,8 +681,7 @@ void AudioMixer::parseSettingsObject(const QJsonObject& settingsObject) {
glm::vec3 dimensions(xMax - xMin, yMax - yMin, zMax - zMin);
AABox zoneAABox(corner, dimensions);
_audioZones.insert(zone, zoneAABox);
qDebug() << "Added zone:" << zone << "(corner:" << corner
<< ", dimensions:" << dimensions << ")";
qCDebug(audio) << "Added zone:" << zone << "(corner:" << corner << ", dimensions:" << dimensions << ")";
}
}
}
@ -712,7 +712,7 @@ void AudioMixer::parseSettingsObject(const QJsonObject& settingsObject) {
_audioZones.contains(settings.source) && _audioZones.contains(settings.listener)) {
_zoneSettings.push_back(settings);
qDebug() << "Added Coefficient:" << settings.source << settings.listener << settings.coefficient;
qCDebug(audio) << "Added Coefficient:" << settings.source << settings.listener << settings.coefficient;
}
}
}
@ -745,7 +745,7 @@ void AudioMixer::parseSettingsObject(const QJsonObject& settingsObject) {
_zoneReverbSettings.push_back(settings);
qDebug() << "Added Reverb:" << zone << reverbTime << wetLevel;
qCDebug(audio) << "Added Reverb:" << zone << reverbTime << wetLevel;
}
}
}


@ -19,6 +19,7 @@
#include "InjectedAudioStream.h"
#include "AudioLogging.h"
#include "AudioHelpers.h"
#include "AudioMixer.h"
#include "AudioMixerClientData.h"
@ -132,7 +133,7 @@ void AudioMixerClientData::optionallyReplicatePacket(ReceivedMessage& message, c
if (PacketTypeEnum::getReplicatedPacketMapping().key(message.getType()) != PacketType::Unknown) {
mirroredType = message.getType();
} else {
qDebug() << "Packet passed to optionallyReplicatePacket was not a replicatable type - returning";
qCDebug(audio) << "Packet passed to optionallyReplicatePacket was not a replicatable type - returning";
return;
}
}
@ -189,8 +190,16 @@ void AudioMixerClientData::parsePerAvatarGainSet(ReceivedMessage& message, const
uint8_t packedGain;
message.readPrimitive(&packedGain);
float gain = unpackFloatGainFromByte(packedGain);
hrtfForStream(avatarUuid, QUuid()).setGainAdjustment(gain);
qDebug() << "Setting gain adjustment for hrtf[" << uuid << "][" << avatarUuid << "] to " << gain;
if (avatarUuid.isNull()) {
// set the MASTER avatar gain
setMasterAvatarGain(gain);
qCDebug(audio) << "Setting MASTER avatar gain for " << uuid << " to " << gain;
} else {
// set the per-source avatar gain
hrtfForStream(avatarUuid, QUuid()).setGainAdjustment(gain);
qCDebug(audio) << "Setting avatar gain adjustment for hrtf[" << uuid << "][" << avatarUuid << "] to " << gain;
}
}
void AudioMixerClientData::parseNodeIgnoreRequest(QSharedPointer<ReceivedMessage> message, const SharedNodePointer& node) {
@ -276,7 +285,7 @@ int AudioMixerClientData::parseData(ReceivedMessage& message) {
auto avatarAudioStream = new AvatarAudioStream(isStereo, AudioMixer::getStaticJitterFrames());
avatarAudioStream->setupCodec(_codec, _selectedCodecName, AudioConstants::MONO);
qDebug() << "creating new AvatarAudioStream... codec:" << _selectedCodecName;
qCDebug(audio) << "creating new AvatarAudioStream... codec:" << _selectedCodecName;
connect(avatarAudioStream, &InboundAudioStream::mismatchedAudioCodec,
this, &AudioMixerClientData::handleMismatchAudioFormat);
@ -315,7 +324,7 @@ int AudioMixerClientData::parseData(ReceivedMessage& message) {
#if INJECTORS_SUPPORT_CODECS
injectorStream->setupCodec(_codec, _selectedCodecName, isStereo ? AudioConstants::STEREO : AudioConstants::MONO);
qDebug() << "creating new injectorStream... codec:" << _selectedCodecName;
qCDebug(audio) << "creating new injectorStream... codec:" << _selectedCodecName;
#endif
auto emplaced = _audioStreams.emplace(
@ -339,8 +348,8 @@ int AudioMixerClientData::parseData(ReceivedMessage& message) {
auto parseResult = matchingStream->parseData(message);
if (matchingStream->getOverflowCount() > overflowBefore) {
qDebug() << "Just overflowed on stream from" << message.getSourceID() << "at" << message.getSenderSockAddr();
qDebug() << "This stream is for" << (isMicStream ? "microphone audio" : "injected audio");
qCDebug(audio) << "Just overflowed on stream from" << message.getSourceID() << "at" << message.getSenderSockAddr();
qCDebug(audio) << "This stream is for" << (isMicStream ? "microphone audio" : "injected audio");
}
return parseResult;
@ -689,7 +698,7 @@ void AudioMixerClientData::setupCodecForReplicatedAgent(QSharedPointer<ReceivedM
auto codecString = message->readString();
if (codecString != _selectedCodecName) {
qDebug() << "Manually setting codec for replicated agent" << uuidStringWithoutCurlyBraces(getNodeID())
qCDebug(audio) << "Manually setting codec for replicated agent" << uuidStringWithoutCurlyBraces(getNodeID())
<< "-" << codecString;
const std::pair<QString, CodecPluginPointer> codec = AudioMixer::negotiateCodec({ codecString });

View file

@ -83,6 +83,9 @@ public:
// uses randomization to have the AudioMixer send a stats packet to this node around every second
bool shouldSendStats(int frameNumber);
float getMasterAvatarGain() const { return _masterAvatarGain; }
void setMasterAvatarGain(float gain) { _masterAvatarGain = gain; }
AudioLimiter audioLimiter;
void setupCodec(CodecPluginPointer codec, const QString& codecName);
@ -175,6 +178,8 @@ private:
int _frameToSendStats { 0 };
float _masterAvatarGain { 1.0f }; // per-listener mixing gain, applied only to avatars
CodecPluginPointer _codec;
QString _selectedCodecName;
Encoder* _encoder{ nullptr }; // for outbound mixed stream

View file

@ -48,8 +48,8 @@ void sendEnvironmentPacket(const SharedNodePointer& node, AudioMixerClientData&
// mix helpers
inline float approximateGain(const AvatarAudioStream& listeningNodeStream, const PositionalAudioStream& streamToAdd,
const glm::vec3& relativePosition);
inline float computeGain(const AvatarAudioStream& listeningNodeStream, const PositionalAudioStream& streamToAdd,
const glm::vec3& relativePosition, bool isEcho);
inline float computeGain(const AudioMixerClientData& listenerNodeData, const AvatarAudioStream& listeningNodeStream,
const PositionalAudioStream& streamToAdd, const glm::vec3& relativePosition, bool isEcho);
inline float computeAzimuth(const AvatarAudioStream& listeningNodeStream, const PositionalAudioStream& streamToAdd,
const glm::vec3& relativePosition);
@ -266,7 +266,7 @@ void AudioMixerSlave::addStream(AudioMixerClientData& listenerNodeData, const QU
glm::vec3 relativePosition = streamToAdd.getPosition() - listeningNodeStream.getPosition();
float distance = glm::max(glm::length(relativePosition), EPSILON);
float gain = computeGain(listeningNodeStream, streamToAdd, relativePosition, isEcho);
float gain = computeGain(listenerNodeData, listeningNodeStream, streamToAdd, relativePosition, isEcho);
float azimuth = isEcho ? 0.0f : computeAzimuth(listeningNodeStream, listeningNodeStream, relativePosition);
const int HRTF_DATASET_INDEX = 1;
@ -484,10 +484,12 @@ float approximateGain(const AvatarAudioStream& listeningNodeStream, const Positi
// when throttling, as close streams are expected to be heard by a user
float distance = glm::length(relativePosition);
return gain / distance;
// avatar: skip master gain - it is constant for all streams
}
float computeGain(const AvatarAudioStream& listeningNodeStream, const PositionalAudioStream& streamToAdd,
const glm::vec3& relativePosition, bool isEcho) {
float computeGain(const AudioMixerClientData& listenerNodeData, const AvatarAudioStream& listeningNodeStream,
const PositionalAudioStream& streamToAdd, const glm::vec3& relativePosition, bool isEcho) {
float gain = 1.0f;
// injector: apply attenuation
@ -507,6 +509,9 @@ float computeGain(const AvatarAudioStream& listeningNodeStream, const Positional
float offAxisCoefficient = MAX_OFF_AXIS_ATTENUATION + (angleOfDelivery * (OFF_AXIS_ATTENUATION_STEP / PI_OVER_TWO));
gain *= offAxisCoefficient;
// apply master gain, only to avatars
gain *= listenerNodeData.getMasterAvatarGain();
}
auto& audioZones = AudioMixer::getAudioZones();

View file

@ -25,6 +25,23 @@ AvatarMixerClientData::AvatarMixerClientData(const QUuid& nodeID) :
_avatar->setID(nodeID);
}
uint64_t AvatarMixerClientData::getLastOtherAvatarEncodeTime(QUuid otherAvatar) const {
std::unordered_map<QUuid, uint64_t>::const_iterator itr = _lastOtherAvatarEncodeTime.find(otherAvatar);
if (itr != _lastOtherAvatarEncodeTime.end()) {
return itr->second;
}
return 0;
}
void AvatarMixerClientData::setLastOtherAvatarEncodeTime(const QUuid& otherAvatar, const uint64_t& time) {
std::unordered_map<QUuid, uint64_t>::iterator itr = _lastOtherAvatarEncodeTime.find(otherAvatar);
if (itr != _lastOtherAvatarEncodeTime.end()) {
itr->second = time;
} else {
_lastOtherAvatarEncodeTime.emplace(std::pair<QUuid, uint64_t>(otherAvatar, time));
}
}
void AvatarMixerClientData::queuePacket(QSharedPointer<ReceivedMessage> message, SharedNodePointer node) {
if (!_packetQueue.node) {
_packetQueue.node = node;

View file

@ -91,7 +91,7 @@ public:
void loadJSONStats(QJsonObject& jsonObject) const;
glm::vec3 getPosition() const { return _avatar ? _avatar->getPosition() : glm::vec3(0); }
glm::vec3 getPosition() const { return _avatar ? _avatar->getWorldPosition() : glm::vec3(0); }
glm::vec3 getGlobalBoundingBoxCorner() const { return _avatar ? _avatar->getGlobalBoundingBoxCorner() : glm::vec3(0); }
bool isRadiusIgnoring(const QUuid& other) const { return _radiusIgnoredOthers.find(other) != _radiusIgnoredOthers.end(); }
void addToRadiusIgnoringSet(const QUuid& other) { _radiusIgnoredOthers.insert(other); }
@ -110,16 +110,10 @@ public:
bool getRequestsDomainListData() { return _requestsDomainListData; }
void setRequestsDomainListData(bool requesting) { _requestsDomainListData = requesting; }
ViewFrustum getViewFrustom() const { return _currentViewFrustum; }
ViewFrustum getViewFrustum() const { return _currentViewFrustum; }
quint64 getLastOtherAvatarEncodeTime(QUuid otherAvatar) {
quint64 result = 0;
if (_lastOtherAvatarEncodeTime.find(otherAvatar) != _lastOtherAvatarEncodeTime.end()) {
result = _lastOtherAvatarEncodeTime[otherAvatar];
}
_lastOtherAvatarEncodeTime[otherAvatar] = usecTimestampNow();
return result;
}
uint64_t getLastOtherAvatarEncodeTime(QUuid otherAvatar) const;
void setLastOtherAvatarEncodeTime(const QUuid& otherAvatar, const uint64_t& time);
QVector<JointData>& getLastOtherAvatarSentJoints(QUuid otherAvatar) {
_lastOtherAvatarSentJoints[otherAvatar].resize(_avatar->getJointCount());
@ -143,7 +137,7 @@ private:
// this is a map of the last time we encoded an "other" avatar for
// sending to "this" node
std::unordered_map<QUuid, quint64> _lastOtherAvatarEncodeTime;
std::unordered_map<QUuid, uint64_t> _lastOtherAvatarEncodeTime;
std::unordered_map<QUuid, QVector<JointData>> _lastOtherAvatarSentJoints;
uint64_t _identityChangeTimestamp;

View file

@ -22,6 +22,7 @@
#include <NodeList.h>
#include <Node.h>
#include <OctreeConstants.h>
#include <PrioritySortUtil.h>
#include <udt/PacketHeaders.h>
#include <SharedUtil.h>
#include <StDev.h>
@ -32,6 +33,10 @@
#include "AvatarMixerClientData.h"
#include "AvatarMixerSlave.h"
namespace PrioritySortUtil {
// we declare this callback here but override it later
std::function<uint64_t(const AvatarSharedPointer&)> getAvatarAgeCallback = [] (const AvatarSharedPointer& avatar) { return 0; };
}
void AvatarMixerSlave::configure(ConstIter begin, ConstIter end) {
_begin = begin;
@ -184,10 +189,8 @@ void AvatarMixerSlave::broadcastAvatarDataToAgent(const SharedNodePointer& node)
// setup list of AvatarData as well as maps to map between the AvatarData and the original nodes
// for calling the AvatarData::sortAvatars() function and getting our sorted list of client nodes
QList<AvatarSharedPointer> avatarList;
std::vector<AvatarSharedPointer> avatarsToSort;
std::unordered_map<AvatarSharedPointer, SharedNodePointer> avatarDataToNodes;
std::for_each(_begin, _end, [&](const SharedNodePointer& otherNode) {
// make sure this is an agent that we have avatar data for before considering it for inclusion
if (otherNode->getType() == NodeType::Agent
@ -195,36 +198,61 @@ void AvatarMixerSlave::broadcastAvatarDataToAgent(const SharedNodePointer& node)
const AvatarMixerClientData* otherNodeData = reinterpret_cast<const AvatarMixerClientData*>(otherNode->getLinkedData());
AvatarSharedPointer otherAvatar = otherNodeData->getAvatarSharedPointer();
avatarList << otherAvatar;
avatarsToSort.push_back(otherAvatar);
avatarDataToNodes[otherAvatar] = otherNode;
}
});
AvatarSharedPointer thisAvatar = nodeData->getAvatarSharedPointer();
ViewFrustum cameraView = nodeData->getViewFrustom();
std::priority_queue<AvatarPriority> sortedAvatars;
AvatarData::sortAvatars(avatarList, cameraView, sortedAvatars,
[&](AvatarSharedPointer avatar)->uint64_t {
auto avatarNode = avatarDataToNodes[avatar];
assert(avatarNode); // we can't have gotten here without the avatarData being a valid key in the map
return nodeData->getLastBroadcastTime(avatarNode->getUUID());
}, [&](AvatarSharedPointer avatar)->float{
glm::vec3 nodeBoxHalfScale = (avatar->getPosition() - avatar->getGlobalBoundingBoxCorner() * avatar->getSensorToWorldScale());
return glm::max(nodeBoxHalfScale.x, glm::max(nodeBoxHalfScale.y, nodeBoxHalfScale.z));
}, [&](AvatarSharedPointer avatar)->bool {
// now that we've assembled the avatarDataToNodes map we can replace PrioritySortUtil::getAvatarAgeCallback
// with the true implementation
PrioritySortUtil::getAvatarAgeCallback = [&] (const AvatarSharedPointer& avatar) {
auto avatarNode = avatarDataToNodes[avatar];
assert(avatarNode); // we can't have gotten here without the avatarData being a valid key in the map
return nodeData->getLastOtherAvatarEncodeTime(avatarNode->getUUID());
};
class SortableAvatar: public PrioritySortUtil::Sortable {
public:
SortableAvatar() = delete;
SortableAvatar(const AvatarSharedPointer& avatar) : _avatar(avatar) {}
glm::vec3 getPosition() const override { return _avatar->getWorldPosition(); }
float getRadius() const override {
glm::vec3 nodeBoxHalfScale = (_avatar->getWorldPosition() - _avatar->getGlobalBoundingBoxCorner() * _avatar->getSensorToWorldScale());
return glm::max(nodeBoxHalfScale.x, glm::max(nodeBoxHalfScale.y, nodeBoxHalfScale.z));
}
uint64_t getTimestamp() const override {
// use the callback implemented above
return PrioritySortUtil::getAvatarAgeCallback(_avatar);
}
const AvatarSharedPointer& getAvatar() const { return _avatar; }
private:
AvatarSharedPointer _avatar;
};
// prepare to sort
ViewFrustum cameraView = nodeData->getViewFrustum();
PrioritySortUtil::PriorityQueue<SortableAvatar> sortedAvatars(cameraView,
AvatarData::_avatarSortCoefficientSize,
AvatarData::_avatarSortCoefficientCenter,
AvatarData::_avatarSortCoefficientAge);
// ignore or sort
const AvatarSharedPointer& thisAvatar = nodeData->getAvatarSharedPointer();
for (const auto& avatar : avatarsToSort) {
if (avatar == thisAvatar) {
return true; // ignore ourselves...
// don't echo updates to self
continue;
}
bool shouldIgnore = false;
// We will also ignore other nodes for a couple of different reasons:
// We ignore other nodes for a couple of reasons:
// 1) ignore bubbles and ignore specific node
// 2) the node hasn't really updated its frame data recently; this can
// happen if, for example, the avatar is connected on a desktop and sending
// updates at ~30hz, so every 3 frames we skip a frame.
auto avatarNode = avatarDataToNodes[avatar];
auto avatarNode = avatarDataToNodes[avatar];
assert(avatarNode); // we can't have gotten here without the avatarData being a valid key in the map
const AvatarMixerClientData* avatarNodeData = reinterpret_cast<const AvatarMixerClientData*>(avatarNode->getLinkedData());
@ -240,7 +268,6 @@ void AvatarMixerSlave::broadcastAvatarDataToAgent(const SharedNodePointer& node)
|| (avatarNode->isIgnoringNodeWithID(node->getUUID()) && !getsAnyIgnored)) {
shouldIgnore = true;
} else {
// Check to see if the space bubble is enabled
// Don't bother with these checks if the other avatar has their bubble enabled and we're gettingAnyIgnored
if (node->isIgnoreRadiusEnabled() || (avatarNode->isIgnoreRadiusEnabled() && !getsAnyIgnored)) {
@ -267,8 +294,6 @@ void AvatarMixerSlave::broadcastAvatarDataToAgent(const SharedNodePointer& node)
nodeData->removeFromRadiusIgnoringSet(node, avatarNode->getUUID());
}
}
quint64 endIgnoreCalculation = usecTimestampNow();
_stats.ignoreCalculationElapsedTime += (endIgnoreCalculation - startIgnoreCalculation);
if (!shouldIgnore) {
AvatarDataSequenceNumber lastSeqToReceiver = nodeData->getLastBroadcastSequenceNumber(avatarNode->getUUID());
@ -292,20 +317,21 @@ void AvatarMixerSlave::broadcastAvatarDataToAgent(const SharedNodePointer& node)
++numAvatarsWithSkippedFrames;
}
}
return shouldIgnore;
});
quint64 endIgnoreCalculation = usecTimestampNow();
_stats.ignoreCalculationElapsedTime += (endIgnoreCalculation - startIgnoreCalculation);
if (!shouldIgnore) {
// sort this one for later
sortedAvatars.push(SortableAvatar(avatar));
}
}
// loop through our sorted avatars and allocate our bandwidth to them accordingly
int avatarRank = 0;
// this is overly conservative, because it includes some avatars we might not consider
int remainingAvatars = (int)sortedAvatars.size();
while (!sortedAvatars.empty()) {
AvatarPriority sortData = sortedAvatars.top();
const auto& avatarData = sortedAvatars.top().getAvatar();
sortedAvatars.pop();
const auto& avatarData = sortData.avatar;
avatarRank++;
remainingAvatars--;
auto otherNode = avatarDataToNodes[avatarData];
@ -332,10 +358,8 @@ void AvatarMixerSlave::broadcastAvatarDataToAgent(const SharedNodePointer& node)
nodeData->setLastBroadcastTime(otherNode->getUUID(), usecTimestampNow());
}
// determine if avatar is in view which determines how much data to send
glm::vec3 otherPosition = otherAvatar->getClientGlobalPosition();
// determine if avatar is in view, to determine how much data to include...
glm::vec3 otherNodeBoxScale = (otherPosition - otherNodeData->getGlobalBoundingBoxCorner()) * 2.0f * otherAvatar->getSensorToWorldScale();
AABox otherNodeBox(otherNodeData->getGlobalBoundingBoxCorner(), otherNodeBoxScale);
bool isInView = nodeData->otherAvatarInView(otherNodeBox);
@ -405,14 +429,18 @@ void AvatarMixerSlave::broadcastAvatarDataToAgent(const SharedNodePointer& node)
// set the last sent sequence number for this sender on the receiver
nodeData->setLastBroadcastSequenceNumber(otherNode->getUUID(),
otherNodeData->getLastReceivedSequenceNumber());
nodeData->setLastOtherAvatarEncodeTime(otherNode->getUUID(), usecTimestampNow());
}
} else {
// TODO? this avatar is not included now, and will probably not be included next frame.
// It would be nice if we could tweak its future sort priority to put it at the back of the list.
}
avatarPacketList->endSegment();
quint64 endAvatarDataPacking = usecTimestampNow();
_stats.avatarDataPackingElapsedTime += (endAvatarDataPacking - startAvatarDataPacking);
};
}
quint64 startPacketSending = usecTimestampNow();

View file

@ -20,7 +20,7 @@
QByteArray ScriptableAvatar::toByteArrayStateful(AvatarDataDetail dataDetail, bool dropFaceTracking) {
_globalPosition = getPosition();
_globalPosition = getWorldPosition();
return AvatarData::toByteArrayStateful(dataDetail);
}

View file

@ -29,10 +29,6 @@ macro(GENERATE_INSTALLERS)
if (WIN32)
# Do not install the Visual Studio C runtime libraries. The installer will do this automatically
set(CMAKE_INSTALL_SYSTEM_RUNTIME_LIBS_SKIP TRUE)
include(InstallRequiredSystemLibraries)
set(CPACK_NSIS_MUI_ICON "${HF_CMAKE_DIR}/installer/installer.ico")
# install and reference the Add/Remove icon
@ -49,6 +45,10 @@ macro(GENERATE_INSTALLERS)
set(_UNINSTALLER_HEADER_BAD_PATH "${HF_CMAKE_DIR}/installer/uninstaller-header.bmp")
set(UNINSTALLER_HEADER_IMAGE "")
fix_path_for_nsis(${_UNINSTALLER_HEADER_BAD_PATH} UNINSTALLER_HEADER_IMAGE)
# grab the latest VC redist (2017) and add it to the installer, our NSIS template
# will call it during the install
install(CODE "file(DOWNLOAD https://go.microsoft.com/fwlink/?LinkId=746572 \"\${CMAKE_INSTALL_PREFIX}/vcredist_x64.exe\")")
elseif (APPLE)
# produce a drag and drop DMG on OS X
set(CPACK_GENERATOR "DragNDrop")
@ -84,4 +84,3 @@ macro(GENERATE_INSTALLERS)
include(CPack)
endmacro()

View file

@ -0,0 +1,17 @@
#
# Created by Bradley Austin Davis on 2017/11/27
# Copyright 2013-2017 High Fidelity, Inc.
#
# Distributed under the Apache License, Version 2.0.
# See the accompanying file LICENSE or http://www.apache.org/licenses/LICENSE-2.0.html
#
function(set_from_env _RESULT_NAME _ENV_VAR_NAME _DEFAULT_VALUE)
if (NOT DEFINED ${_RESULT_NAME})
if ("$ENV{${_ENV_VAR_NAME}}" STREQUAL "")
set (${_RESULT_NAME} ${_DEFAULT_VALUE} PARENT_SCOPE)
else()
set (${_RESULT_NAME} $ENV{${_ENV_VAR_NAME}} PARENT_SCOPE)
endif()
endif()
endfunction()
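For reference, a minimal usage sketch of this helper (the calls below mirror how it is used later in this change; they are illustrative, not additional commit content):
    # If RELEASE_TYPE is not already defined (e.g. passed with -D on the command line),
    # take it from the RELEASE_TYPE environment variable, falling back to "DEV".
    set_from_env(RELEASE_TYPE RELEASE_TYPE "DEV")
    # An unset environment variable falls through to the default, here an empty string.
    set_from_env(RELEASE_NUMBER RELEASE_NUMBER "")
    message(STATUS "The RELEASE_TYPE variable is: ${RELEASE_TYPE}")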

View file

@ -15,13 +15,14 @@ macro(SET_PACKAGING_PARAMETERS)
set(PR_BUILD 0)
set(PRODUCTION_BUILD 0)
set(DEV_BUILD 0)
set(RELEASE_TYPE $ENV{RELEASE_TYPE})
set(RELEASE_NUMBER $ENV{RELEASE_NUMBER})
string(TOLOWER "$ENV{BRANCH}" BUILD_BRANCH)
set(BUILD_GLOBAL_SERVICES "DEVELOPMENT")
set(USE_STABLE_GLOBAL_SERVICES 0)
set_from_env(RELEASE_TYPE RELEASE_TYPE "DEV")
set_from_env(RELEASE_NUMBER RELEASE_NUMBER "")
set_from_env(BUILD_BRANCH BRANCH "")
string(TOLOWER "${BUILD_BRANCH}" BUILD_BRANCH)
message(STATUS "The BUILD_BRANCH variable is: ${BUILD_BRANCH}")
message(STATUS "The BRANCH environment variable is: $ENV{BRANCH}")
message(STATUS "The RELEASE_TYPE variable is: ${RELEASE_TYPE}")

View file

@ -1,21 +1,11 @@
#
# Copyright 2015 High Fidelity, Inc.
# Created by Bradley Austin Davis on 2015/10/10
# Created by Bradley Austin Davis on 2017/09/02
# Copyright 2013-2017 High Fidelity, Inc.
#
# Distributed under the Apache License, Version 2.0.
# See the accompanying file LICENSE or http://www.apache.org/licenses/LICENSE-2.0.html
#
function(set_from_env _RESULT_NAME _ENV_VAR_NAME _DEFAULT_VALUE)
if (NOT DEFINED ${_RESULT_NAME})
if ("$ENV{${_ENV_VAR_NAME}}" STREQUAL "")
set (${_RESULT_NAME} ${_DEFAULT_VALUE} PARENT_SCOPE)
else()
set (${_RESULT_NAME} $ENV{${_ENV_VAR_NAME}} PARENT_SCOPE)
endif()
endif()
endfunction()
# Construct a default QT location from a root path, a version and an architecture
function(calculate_default_qt_dir _RESULT_NAME)
if (ANDROID)

View file

@ -6,8 +6,19 @@
# See the accompanying file LICENSE or http://www.apache.org/licenses/LICENSE-2.0.html
#
macro(TARGET_BULLET)
add_dependency_external_projects(bullet)
find_package(Bullet REQUIRED)
if (ANDROID)
set(INSTALL_DIR ${HIFI_ANDROID_PRECOMPILED}/bullet)
set(BULLET_INCLUDE_DIRS "${INSTALL_DIR}/include/bullet" CACHE TYPE INTERNAL)
set(LIB_DIR ${INSTALL_DIR}/lib)
list(APPEND BULLET_LIBRARIES ${LIB_DIR}/libBulletDynamics.a)
list(APPEND BULLET_LIBRARIES ${LIB_DIR}/libBulletCollision.a)
list(APPEND BULLET_LIBRARIES ${LIB_DIR}/libLinearMath.a)
list(APPEND BULLET_LIBRARIES ${LIB_DIR}/libBulletSoftBody.a)
else()
add_dependency_external_projects(bullet)
find_package(Bullet REQUIRED)
endif()
# perform the system include hack for OS X to ignore warnings
if (APPLE)
SET(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -isystem ${BULLET_INCLUDE_DIRS}")
@ -16,3 +27,5 @@ macro(TARGET_BULLET)
endif()
target_link_libraries(${TARGET_NAME} ${BULLET_LIBRARIES})
endmacro()

18
cmake/macros/TargetDraco.cmake Executable file
View file

@ -0,0 +1,18 @@
macro(TARGET_DRACO)
if (ANDROID)
set(INSTALL_DIR ${HIFI_ANDROID_PRECOMPILED}/draco)
set(DRACO_INCLUDE_DIRS "${INSTALL_DIR}/include" CACHE TYPE INTERNAL)
set(LIB_DIR ${INSTALL_DIR}/lib)
list(APPEND DRACO_LIBRARIES ${LIB_DIR}/libdraco.a)
list(APPEND DRACO_LIBRARIES ${LIB_DIR}/libdracodec.a)
list(APPEND DRACO_LIBRARIES ${LIB_DIR}/libdracoenc.a)
else()
add_dependency_external_projects(draco)
find_package(Draco REQUIRED)
list(APPEND DRACO_LIBRARIES ${DRACO_LIBRARY})
list(APPEND DRACO_LIBRARIES ${DRACO_ENCODER_LIBRARY})
endif()
target_include_directories(${TARGET_NAME} SYSTEM PRIVATE ${DRACO_INCLUDE_DIRS})
target_link_libraries(${TARGET_NAME} ${DRACO_LIBRARIES})
endmacro()
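A minimal sketch of how a TARGET_* macro like this is consumed, assuming the including CMakeLists.txt has already created the target referred to by TARGET_NAME (the target and source file names below are hypothetical):
    # Hypothetical consumer target; the macro expects ${TARGET_NAME} to exist because it
    # calls target_include_directories() and target_link_libraries() on it.
    set(TARGET_NAME my-draco-consumer)
    add_library(${TARGET_NAME} STATIC DracoConsumer.cpp)
    target_draco()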

View file

@ -0,0 +1,14 @@
#
# Created by Bradley Austin Davis on 2017/11/28
# Copyright 2013-2017 High Fidelity, Inc.
#
# Distributed under the Apache License, Version 2.0.
# See the accompanying file LICENSE or http://www.apache.org/licenses/LICENSE-2.0.html
#
macro(TARGET_GOOGLEVR)
if (ANDROID)
set(GVR_ROOT "${HIFI_ANDROID_PRECOMPILED}/gvr/gvr-android-sdk-1.101.0/")
target_include_directories(native-lib PRIVATE "${GVR_ROOT}/libraries/headers")
target_link_libraries(native-lib "${GVR_ROOT}/libraries/libgvr.so")
endif()
endmacro()

View file

@ -6,14 +6,10 @@
# See the accompanying file LICENSE or http://www.apache.org/licenses/LICENSE-2.0.html
#
macro(TARGET_OPENSSL)
if (ANDROID)
# FIXME use a distributable binary
set(OPENSSL_INSTALL_DIR C:/Android/openssl)
set(OPENSSL_INSTALL_DIR ${HIFI_ANDROID_PRECOMPILED}/openssl)
set(OPENSSL_INCLUDE_DIR "${OPENSSL_INSTALL_DIR}/include" CACHE TYPE INTERNAL)
set(OPENSSL_LIBRARIES "${OPENSSL_INSTALL_DIR}/lib/libcrypto.a;${OPENSSL_INSTALL_DIR}/lib/libssl.a" CACHE TYPE INTERNAL)
else()
find_package(OpenSSL REQUIRED)
@ -28,5 +24,4 @@ macro(TARGET_OPENSSL)
include_directories(SYSTEM "${OPENSSL_INCLUDE_DIR}")
target_link_libraries(${TARGET_NAME} ${OPENSSL_LIBRARIES})
endmacro()

View file

@ -0,0 +1,24 @@
#
# Created by Bradley Austin Davis on 2017/11/28
# Copyright 2013-2017 High Fidelity, Inc.
#
# Distributed under the Apache License, Version 2.0.
# See the accompanying file LICENSE or http://www.apache.org/licenses/LICENSE-2.0.html
#
macro(TARGET_POLYVOX)
if (ANDROID)
set(INSTALL_DIR ${HIFI_ANDROID_PRECOMPILED}/polyvox)
set(POLYVOX_INCLUDE_DIRS "${INSTALL_DIR}/include" CACHE TYPE INTERNAL)
set(LIB_DIR ${INSTALL_DIR}/lib)
list(APPEND POLYVOX_LIBRARIES ${LIB_DIR}/libPolyVoxUtil.so)
list(APPEND POLYVOX_LIBRARIES ${LIB_DIR}/Release/libPolyVoxCore.so)
else()
add_dependency_external_projects(polyvox)
find_package(PolyVox REQUIRED)
endif()
target_include_directories(${TARGET_NAME} SYSTEM PUBLIC ${POLYVOX_INCLUDE_DIRS})
target_link_libraries(${TARGET_NAME} ${POLYVOX_LIBRARIES})
endmacro()

View file

@ -8,10 +8,10 @@
macro(TARGET_TBB)
if (ANDROID)
set(TBB_INSTALL_DIR C:/tbb-2018/built)
set(TBB_LIBRARY ${HIFI_ANDROID_PRECOMPILED}/libtbb.so CACHE FILEPATH "TBB library location")
set(TBB_MALLOC_LIBRARY ${HIFI_ANDROID_PRECOMPILED}/libtbbmalloc.so CACHE FILEPATH "TBB malloc library location")
set(TBB_INCLUDE_DIRS ${TBB_INSTALL_DIR}/include CACHE TYPE "List of tbb include directories" CACHE FILEPATH "TBB includes location")
set(TBB_INSTALL_DIR ${HIFI_ANDROID_PRECOMPILED}/tbb)
set(TBB_INCLUDE_DIRS ${TBB_INSTALL_DIR}/include CACHE FILEPATH "TBB includes location")
set(TBB_LIBRARY ${TBB_INSTALL_DIR}/lib/release/libtbb.so CACHE FILEPATH "TBB library location")
set(TBB_MALLOC_LIBRARY ${TBB_INSTALL_DIR}/lib/release/libtbbmalloc.so CACHE FILEPATH "TBB malloc library location")
set(TBB_LIBRARIES ${TBB_LIBRARY} ${TBB_MALLOC_LIBRARY})
else()
add_dependency_external_projects(tbb)

View file

@ -45,5 +45,4 @@ else()
endif()
file(GLOB EXTRA_PLUGINS "${BUNDLE_PLUGIN_DIR}/*.${PLUGIN_EXTENSION}")
fixup_bundle("${BUNDLE_EXECUTABLE}" "${EXTRA_PLUGINS}" "@FIXUP_LIBS@")
fixup_bundle("${BUNDLE_EXECUTABLE}" "${EXTRA_PLUGINS}" "@FIXUP_LIBS@" IGNORE_ITEM "vcredist_x86.exe;vcredist_x64.exe")

View file

@ -916,6 +916,14 @@
"default": false
}
]
},
{
"name": "multi_kick_logged_in",
"type": "checkbox",
"label": "Multi-Kick for Logged In Users",
"help": "Kick logged in users by machine fingerprint (in addition to the default kick by username)",
"default": false,
"advanced": true
}
]
},

View file

@ -1,3 +1,17 @@
/* cairo-regular - latin */
@font-face {
font-family: 'Cairo';
font-style: normal;
font-weight: 400;
src: url('/fonts/cairo-v2-latin-regular.eot'); /* IE9 Compat Modes */
src: local('Cairo'), local('Cairo-Regular'),
url('/fonts/cairo-v2-latin-regular.eot?#iefix') format('embedded-opentype'), /* IE6-IE8 */
url('/fonts/cairo-v2-latin-regular.woff2') format('woff2'), /* Super Modern Browsers */
url('/fonts/cairo-v2-latin-regular.woff') format('woff'), /* Modern Browsers */
url('/fonts/cairo-v2-latin-regular.ttf') format('truetype'), /* Safari, Android, iOS */
url('/fonts/cairo-v2-latin-regular.svg#Cairo') format('svg'); /* Legacy iOS */
}
body {
position: relative;
padding-bottom: 30px;
@ -80,11 +94,23 @@ span.port {
display: none;
}
#setup-sidebar.affix {
/* This overrides a case where going to the bottom of the page,
* then scrolling up, causes `position: relative` to be added to the style
*/
position: fixed !important;
@media (min-width: 768px) {
#setup-sidebar.affix {
/* This overrides a case where going to the bottom of the page,
* then scrolling up, causes `position: relative` to be added to the style
*/
position: fixed !important;
}
}
@media (max-width: 767px) {
#setup-sidebar.affix {
position: static !important;
}
#setup-sidebar {
margin-bottom: 20px;
}
}
#setup-sidebar button {
@ -302,6 +328,7 @@ table .headers + .headers td {
}
.account-connected-header {
vertical-align: middle;
color: #6FCF97;
font-size: 30px;
margin-right: 20px;
@ -311,7 +338,7 @@ table .headers + .headers td {
font-size: 14px;
text-decoration-line: underline;
font-weight: normal;
color: #2F80ED;
color: #00B3F8;
}
#manage-cloud-domains-link {

View file

@ -0,0 +1 @@
<!DOCTYPE html><html lang=en><meta charset=utf-8><meta name=viewport content="initial-scale=1, minimum-scale=1, width=device-width"><title>Error 500 (Server Error)!!1</title><style>*{margin:0;padding:0}html,code{font:15px/22px arial,sans-serif}html{background:#fff;color:#222;padding:15px}body{color:#222;text-align:unset;margin:7% auto 0;max-width:390px;min-height:180px;padding:30px 0 15px;}* > body{background:url(//www.google.com/images/errors/robot.png) 100% 5px no-repeat;padding-right:205px}p{margin:11px 0 22px;overflow:hidden}pre{white-space:pre-wrap;}ins{color:#777;text-decoration:none}a img{border:0}@media screen and (max-width:772px){body{background:none;margin-top:0;max-width:none;padding-right:0}}#logo{background:url(//www.google.com/images/branding/googlelogo/1x/googlelogo_color_150x54dp.png) no-repeat;margin-left:-5px}@media only screen and (min-resolution:192dpi){#logo{background:url(//www.google.com/images/branding/googlelogo/2x/googlelogo_color_150x54dp.png) no-repeat 0% 0%/100% 100%;-moz-border-image:url(//www.google.com/images/branding/googlelogo/2x/googlelogo_color_150x54dp.png) 0}}@media only screen and (-webkit-min-device-pixel-ratio:2){#logo{background:url(//www.google.com/images/branding/googlelogo/2x/googlelogo_color_150x54dp.png) no-repeat;-webkit-background-size:100% 100%}}#logo{display:inline-block;height:54px;width:150px}</style><div id="af-error-container"><a href=//www.google.com><span id=logo aria-label=Google></span></a><p><b>500.</b> <ins>Thats an error.</ins><p>There was an error. Please try again later. <ins>Thats all we know.</ins></div>

View file

@ -1,25 +0,0 @@
<svg width="676" height="676" viewBox="0 0 676 676" version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink">
<title>CongratulationImage</title>
<desc>Created using Figma</desc>
<g id="Canvas" transform="matrix(4 0 0 4 -21208 -17980)">
<g id="CongratulationImage">
<g id="Ellipse">
<use xlink:href="#path0_fill" transform="translate(5302 4495)" fill="#FFFFFF"/>
<mask id="mask0_outline_ins">
<use xlink:href="#path0_fill" fill="white" transform="translate(5302 4495)"/>
</mask>
<g mask="url(#mask0_outline_ins)">
<use xlink:href="#path1_stroke_2x" transform="translate(5302 4495)" fill="#219653"/>
</g>
</g>
<g id="Vector 2">
<use xlink:href="#path2_stroke" transform="translate(5355 4559)" fill="#219653"/>
</g>
</g>
</g>
<defs>
<path id="path0_fill" d="M 169 84.5C 169 131.168 131.168 169 84.5 169C 37.8319 169 0 131.168 0 84.5C 0 37.8319 37.8319 0 84.5 0C 131.168 0 169 37.8319 169 84.5Z"/>
<path id="path1_stroke_2x" d="M 154 84.5C 154 122.884 122.884 154 84.5 154L 84.5 184C 139.452 184 184 139.452 184 84.5L 154 84.5ZM 84.5 154C 46.1162 154 15 122.884 15 84.5L -15 84.5C -15 139.452 29.5477 184 84.5 184L 84.5 154ZM 15 84.5C 15 46.1162 46.1162 15 84.5 15L 84.5 -15C 29.5477 -15 -15 29.5477 -15 84.5L 15 84.5ZM 84.5 15C 122.884 15 154 46.1162 154 84.5L 184 84.5C 184 29.5477 139.452 -15 84.5 -15L 84.5 15Z"/>
<path id="path2_stroke" d="M 5.18747 19.8031C 2.19593 16.9382 -2.5517 17.0408 -5.41666 20.0323C -8.28162 23.0238 -8.17901 27.7715 -5.18747 30.6364L 5.18747 19.8031ZM 20.6541 45L 15.4667 50.4167C 18.3816 53.2083 22.9831 53.1924 25.8787 50.3809L 20.6541 45ZM 72.2246 5.38085C 75.1964 2.49539 75.2663 -2.25283 72.3809 -5.2246C 69.4954 -8.19636 64.7472 -8.26632 61.7754 -5.38085L 72.2246 5.38085ZM -5.18747 30.6364L 15.4667 50.4167L 25.8416 39.5833L 5.18747 19.8031L -5.18747 30.6364ZM 25.8787 50.3809L 72.2246 5.38085L 61.7754 -5.38085L 15.4295 39.6191L 25.8787 50.3809Z"/>
</defs>
</svg>

Before

Width:  |  Height:  |  Size: 1.9 KiB

View file

@ -0,0 +1,17 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 22.0.1, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1" id="Layer_1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px"
viewBox="0 0 576 576" style="enable-background:new 0 0 576 576;" xml:space="preserve">
<path d="M329.7,410.7H127.8c-42.1,0-76.4-34.3-76.4-76.4V132.4c0-42.1,34.3-76.4,76.4-76.4h201.9c42.1,0,76.4,34.3,76.4,76.4v201.9
C406.1,376.4,371.8,410.7,329.7,410.7z M127.8,98.9c-18.5,0-33.5,15-33.5,33.5v201.9c0,18.5,15,33.5,33.5,33.5h201.9
c18.5,0,33.5-15,33.5-33.5V132.4c0-18.5-15-33.5-33.5-33.5H127.8z"/>
<path d="M449.4,519H407c-11.9,0-21.5-9.6-21.5-21.5s9.6-21.5,21.5-21.5h42.4c11.9,0,21.5,9.6,21.5,21.5S461.3,519,449.4,519z
M305.1,519H263c-11.9,0-21.5-9.6-21.5-21.5s9.6-21.5,21.5-21.5h42.2c11.9,0,21.5,9.6,21.5,21.5S317,519,305.1,519z M192.4,464.1
c-11.9,0-21.5-9.6-21.5-21.4v-42.2c0-11.9,9.6-21.5,21.5-21.5c11.9,0,21.5,9.6,21.5,21.5v42.1C213.8,454.5,204.2,464.1,192.4,464.1z
M504.1,448.2c-11.9,0-21.5-9.6-21.5-21.5v-42.2c0-11.9,9.6-21.5,21.5-21.5c11.9,0,21.5,9.6,21.5,21.5v42.2
C525.6,438.6,516,448.2,504.1,448.2z M192.4,320.1c-11.9,0-21.5-9.6-21.5-21.5v-42.2c0-11.9,9.6-21.5,21.5-21.5
c11.9,0,21.5,9.6,21.5,21.5v42.2C213.8,310.5,204.2,320.1,192.4,320.1z M504.1,304.1c-11.9,0-21.5-9.6-21.5-21.5v-42
c0-11.9,9.6-21.6,21.5-21.6c11.9,0,21.5,9.5,21.5,21.4v42.2C525.6,294.5,516,304.1,504.1,304.1z M433.4,207.2h-42.2
c-11.9,0-21.5-9.6-21.5-21.5s9.6-21.5,21.5-21.5h42.2c11.9,0,21.5,9.6,21.5,21.5S445.3,207.2,433.4,207.2z M289.3,207.2h-42.1
c-11.9,0-21.5-9.6-21.5-21.5s9.6-21.5,21.4-21.5h42.2c11.9,0,21.5,9.6,21.5,21.5S301.2,207.2,289.3,207.2z"/>
</svg>

After

Width:  |  Height:  |  Size: 1.7 KiB

Binary file not shown.

After

Width:  |  Height:  |  Size: 137 KiB

View file

@ -62,26 +62,25 @@ var Strings = {
// dialog with new path still set, allowing them to retry immediately, and without
// having to type the new path in again.
EDIT_PLACE_TITLE: "Modify Viewpoint or Path",
EDIT_PLACE_ERROR: "Failed to update place path. Please try again.",
EDIT_PLACE_ERROR: "Failed to update Viewpoint or Path for this Place Name. Please try again.",
EDIT_PLACE_CONFIRM_BUTTON: "Save",
EDIT_PLACE_CONFIRM_BUTTON_PENDING: "Saving...",
EDIT_PLACE_CANCEL_BUTTON: "Cancel",
REMOVE_PLACE_TITLE: "Are you sure you want to remove <strong>{{place}}</strong>?",
REMOVE_PLACE_ERROR: "Failed to remove place. Please try again.",
REMOVE_PLACE_DELETE_BUTTON: "Delete",
REMOVE_PLACE_TITLE: "Are you sure you want to remove <strong>{{place}}</strong> and its path information?",
REMOVE_PLACE_ERROR: "Failed to remove Place Name and its Path information.",
REMOVE_PLACE_DELETE_BUTTON: "This action removes your Place Name",
REMOVE_PLACE_DELETE_BUTTON_PENDING: "Deleting...",
REMOVE_PLACE_CANCEL_BUTTON: "Cancel",
ADD_PLACE_TITLE: "Choose a place",
ADD_PLACE_MESSAGE: "Choose the High Fidelity place to point at this domain server.",
ADD_PLACE_CONFIRM_BUTTON: "Choose place",
ADD_PLACE_MESSAGE: "Choose a Place Name that you own or register a new Place Name.",
ADD_PLACE_CONFIRM_BUTTON: "Save",
ADD_PLACE_CONFIRM_BUTTON_PENDING: "Saving...",
ADD_PLACE_CANCEL_BUTTON: "Cancel",
ADD_PLACE_UNKNOWN_ERROR: "There was an error adding this place name.",
ADD_PLACE_UNKNOWN_ERROR: "There was an error adding this Place Name. Try saving again",
ADD_PLACE_NO_PLACES_MESSAGE: "<p>You do not have any places in your High Fidelity account."
+ "<br/><br/>Go to your <a href='https://metaverse.highfidelity.com/user/places/new'>places page</a> to create a new one. Once your place is created re-open this dialog to select it.</p>",
ADD_PLACE_NO_PLACES_MESSAGE: "You don't have any Place Names registered. Once you have a Place Name, reopen this window to select it.",
ADD_PLACE_NO_PLACES_BUTTON: "Create new place",
ADD_PLACE_UNABLE_TO_LOAD_ERROR: "We were unable to load your place names. Please try again later.",
ADD_PLACE_LOADING_DIALOG: "Loading your places...",
@ -236,7 +235,7 @@ function chooseFromHighFidelityPlaces(accessToken, forcePathTo, onSuccessfullyAd
if (forcePathTo === undefined || forcePathTo === null) {
var path = "<div class='form-group'>";
path += "<label for='place-path-input' class='control-label'>Path</label>";
path += "<label for='place-path-input' class='control-label'>Path or Viewpoint</label>";
path += "<input type='text' id='place-path-input' class='form-control' value='/'>";
path += "</div>";
modal_body.append($(path));
@ -339,7 +338,6 @@ function chooseFromHighFidelityPlaces(accessToken, forcePathTo, onSuccessfullyAd
$('.add-place-confirm-button').html(Strings.ADD_PLACE_CONFIRM_BUTTON);
$('.add-place-cancel-button').removeAttr('disabled');
bootbox.alert(Strings.ADD_PLACE_UNKNOWN_ERROR);
bootbox.alert("FAIL");
});
}
}
@ -363,7 +361,8 @@ function chooseFromHighFidelityPlaces(accessToken, forcePathTo, onSuccessfullyAd
title: Strings.ADD_PLACE_TITLE,
message: modal_body,
closeButton: false,
buttons: modal_buttons
buttons: modal_buttons,
onEscape: true
});
} else {
bootbox.alert(Strings.ADD_PLACE_UNABLE_TO_LOAD_ERROR);

View file

@ -36,11 +36,6 @@
</div>
<div class="col-md-9 col-sm-9 col-xs-12">
<div id="xs-advanced-container" class="col-xs-12 hidden-sm hidden-md hidden-lg">
<button id="advanced-toggle-button-xs" class="btn btn-info advanced-toggle">Show advanced</button>
</div>
<div class="col-xs-12">
<div id="cloud-domains-alert" class="alert alert-info alert-dismissible" role="alert" style="display: none;">

View file

@ -503,7 +503,7 @@ function showDomainCreationAlert(justConnected) {
swal({
title: 'Create new domain ID',
type: 'input',
text: 'Enter a short description for this machine.</br></br>This will help you identify which domain ID belongs to which machine.</br></br>',
text: 'Enter a label for this machine.</br></br>This will help you identify which domain ID belongs to which machine.</br></br>',
showCancelButton: true,
confirmButtonText: "Create",
closeOnConfirm: false,
@ -527,13 +527,12 @@ function showDomainCreationAlert(justConnected) {
function createNewDomainID(label, justConnected) {
// get the JSON object ready that we'll use to create a new domain
var domainJSON = {
"label": label
//"access_token": $(Settings.ACCESS_TOKEN_SELECTOR).val()
"label": label
}
$.post("/api/domains", domainJSON, function(data){
// we successfully created a domain ID, set it on that field
var domainID = data.domain_id;
var domainID = data.domain.id;
console.log("Setting domain id to ", data, domainID);
$(Settings.DOMAIN_ID_SELECTOR).val(domainID).change();
@ -620,18 +619,14 @@ function parseJSONResponse(xhr) {
function showOrHideLabel() {
var type = getCurrentDomainIDType();
if (!accessTokenIsSet() || (type !== DOMAIN_ID_TYPE_FULL && type !== DOMAIN_ID_TYPE_UNKNOWN)) {
$(".panel#label").hide();
return false;
}
$(".panel#label").show();
return true;
var shouldShow = accessTokenIsSet() && (type === DOMAIN_ID_TYPE_FULL || type === DOMAIN_ID_TYPE_UNKNOWN);
$(".panel#label").toggle(shouldShow);
$("li a[href='#label']").parent().toggle(shouldShow);
return shouldShow;
}
function setupDomainLabelSetting() {
if (!showOrHideLabel()) {
return;
}
showOrHideLabel();
var html = "<div>"
html += "<label class='control-label'>Specify a label for your domain</label> <a class='domain-loading-hide' href='#'>Edit</a>";
@ -654,6 +649,7 @@ function setupDomainLabelSetting() {
title: 'Edit Label',
message: modal_body,
closeButton: false,
onEscape: true,
buttons: [
{
label: 'Cancel',
@ -742,7 +738,7 @@ function setupDomainNetworkingSettings() {
var includeAddress = autoNetworkingSetting === 'disabled';
if (includeAddress) {
var label = "Network Address and Port";
var label = "Network Address:Port";
} else {
var label = "Network Port";
}
@ -777,6 +773,7 @@ function setupDomainNetworkingSettings() {
title: 'Edit Network',
message: modal_body,
closeButton: false,
onEscape: true,
buttons: [
{
label: 'Cancel',
@ -924,6 +921,7 @@ function placeTableRow(name, path, isTemporary, placeID) {
var dialog = bootbox.dialog({
message: confirmString,
closeButton: false,
onEscape: true,
buttons: [
{
label: Strings.REMOVE_PLACE_CANCEL_BUTTON,
@ -1025,7 +1023,9 @@ function reloadDomainInfo() {
}
}
appendAddButtonToPlacesTable();
if (accessTokenIsSet()) {
appendAddButtonToPlacesTable();
}
} else {
$('.domain-loading-error').show();
@ -1098,6 +1098,7 @@ function editHighFidelityPlace(placeID, name, path) {
dialog = bootbox.dialog({
title: Strings.EDIT_PLACE_TITLE,
closeButton: false,
onEscape: true,
message: modal_body,
buttons: modal_buttons
})
@ -1180,6 +1181,7 @@ function chooseFromHighFidelityDomains(clickedButton) {
bootbox.dialog({
title: "Choose matching domain",
onEscape: true,
message: modal_body,
buttons: modal_buttons
})

View file

@ -83,21 +83,80 @@ label {
margin-bottom: 33px;
}
#checkmark-image {
margin-top: 66px;
margin-bottom: 59px;
width: 169px;
height: 169px;
.btn.btn-square {
border-radius: 0;
}
#congratulation-text {
margin-bottom: 59px;
.header {
position: absolute;
top: -20px;
left: 50%;
margin-left: -720px; /* Half of the width */
min-width: 1440px;
z-index: -1;
}
#visit-domain-checkbox {
margin-bottom: 23px;
.title {
font-family: 'Cairo';
color: white;
margin-top: 260px;
}
#visit-domain-checkbox label {
margin: 0 0;
#main-description {
margin-top: 32px;
margin-bottom: 35px;
}
#share-box {
background-color: rgb(240, 240, 240);
padding: 20px 20px;
margin-top: 15px;
margin-bottom: 70px;
white-space: nowrap;
}
#share-box > span {
vertical-align: middle;
display: table-cell;
height: 100%;
}
#share-box button {
border-color: rgb(225, 225, 225);
font-size: 18px;
}
#share-box img {
padding-right: 10px;
}
#share-text {
font-size: 22px;
padding-right: 15px;
}
#share-field {
background-color: rgb(255, 255, 255);
padding: 0 10px;
width: 100%;
}
#share-link {
font-size: 28px;
}
#congrats-list > div > div {
margin-bottom: 30px;
}
#congrats-list ul {
padding-left: 16px;
}
#congrats-list {
margin-bottom: 60px;
}
#open-settings {
border-color: black;
}

View file

@ -191,29 +191,70 @@
</dl>
</div>
<div class="wizard-step cloud-only col-xs-12 col-sm-12 col-md-9 col-lg-7 col-centered" style="display: none;">
<div id="congratulation-step" class="wizard-step cloud-only col-xs-12 col-centered" style="display: none;">
<div class="row">
<div class="col-xs-12" align="center">
<img id="checkmark-image" src="../images/checkmark.svg">
<div class="col-xs-10 col-xs-offset-1">
<img class="header" src="/images/wizard-congratulation-header.jpg" alt="Header" width="1440" height="339">
<h1 class="title">Congratulations!</h1>
</div>
</div>
<div id="congratulation-text" class="row">
<div class="col-xs-12">
<p class="step-info">Congratulations! You have successfully setup and configured your cloud hosted domain.</p>
<div class="row">
<div class="col-xs-10 col-xs-offset-1">
<p id="main-description" class="step-info">You have successfully setup and configured your cloud hosted domain.</p>
</div>
</div>
<dl class="row">
<div class="col-xs-12">
<div class="pull-right">
<div id="visit-domain-checkbox">
<label><input id="go-to-domain" class="form-check-input" type="checkbox"> Visit domain in VR now</label>
</div>
<button id="explore-settings" type="button" class="btn btn-md btn-primary">Explore all domain server settings</button>
<div class="row">
<div class="col-xs-10 col-xs-offset-1">
<div class="step-info">
<b>Invite people in!</b>
</div>
<div id="share-box">
<span id="share-text" class="step-info">Share your domain:</span>
<span id="share-field">
<a id="share-link" class="blue-link" target="_blank"></a>
</span>
<span>
<button type="button" class="btn btn-md btn-default btn-square" onclick="copyToClipboard('#share-link')">
<img src="/images/copy-icon.svg" alt="cpy icon" height="30" width="30">copy
</button>
</span>
</div>
</div>
</dl>
</div>
<div id="congrats-list" class="row">
<div class="col-xs-4 col-xs-offset-1">
<div class="step-info">
<b>Go to your Domain:</b>
</div>
<ul>
<li class="step-info">Browse environments in the Marketplace to select the perfect content set for your VR world.</li>
<li class="step-info">Invite people to your domain right now.</li>
<li class="step-info">Meet new people and explore other domains.</li>
</ul>
</div>
<div class="col-xs-3 col-xs-offset-1">
<div class="step-info">
<b>Continue to Domain settings:</b>
</div>
<ul>
<li class="step-info">Set additional permissions for who can visit and make changes.</li>
<li class="step-info">Adjust audio settings.</li>
<li class="step-info">Back up your domain's content.</li>
</ul>
</div>
</div>
<div class="row">
<div class="col-xs-4 col-xs-offset-1">
<button id="visit-domain" type="button" class="btn btn-md btn-primary btn-square">Visit Your VR World</button>
</div>
<div class="col-xs-3 col-xs-offset-1">
<button id="open-settings" type="button" class="btn btn-md btn-default btn-square next-button">Open Domain Settings</button>
</div>
</div>
</div>
<!--#include virtual="footer.html"-->

View file

@ -11,7 +11,7 @@ $(document).ready(function(){
$('#connect-account-btn').attr('href', URLs.METAVERSE_URL + "/user/tokens/new?for_domain_server=true");
$('[data-toggle="tooltip"]').tooltip();
$('.perms-link').on('click', function() {
var modal_body = '<div>';
modal_body += '<b>None</b> - No one will have permissions. Only you and the users you have given administrator privileges to will have permissions.</br></br>';
@ -70,8 +70,8 @@ $(document).ready(function(){
});
});
$('body').on('click', '#explore-settings', function() {
exploreSettings();
$('body').on('click', '#visit-domain', function() {
$('#share-link')[0].click();
});
$('input[type=radio][name=connect-radio]').change(function() {
@ -120,6 +120,14 @@ $(document).ready(function(){
});
});
function copyToClipboard(element) {
var $temp = $("<input>");
$("body").append($temp);
$temp.val($(element).text()).select();
document.execCommand("copy");
$temp.remove();
}
function setupWizardSteps() {
currentStepNumber = Settings.data.values.wizard.steps_completed;
var steps = null;
@ -155,7 +163,9 @@ function setupWizardSteps() {
function updatePlaceNameLink(address) {
if (address) {
$('#place-name-link').html('Your domain is reachable at: <a target="_blank" href="' + URLs.PLACE_URL + '/' + address + '">' + address + '</a>');
var url = URLs.PLACE_URL + '/' + address;
$('#place-name-link').html('Your domain is reachable at: <a target="_blank" href="' + url + '">' + address + '</a>');
$('#share-field a').attr('href', url).text(url);
}
}
@ -507,14 +517,3 @@ function saveUsernamePassword() {
location.reload();
});
}
function exploreSettings() {
if ($('#go-to-domain').is(":checked")) {
var link = $('#place-name-link a:first');
if (link.length > 0) {
window.open(link.attr("href"));
}
}
goToNextStep();
}

View file

@ -183,6 +183,11 @@ NodePermissions DomainGatekeeper::setPermissionsForUser(bool isLocalUser, QStrin
#ifdef WANT_DEBUG
qDebug() << "| user-permissions: specific MAC matches, so:" << userPerms;
#endif
} else if (_server->_settingsManager.hasPermissionsForMachineFingerprint(machineFingerprint)) {
userPerms = _server->_settingsManager.getPermissionsForMachineFingerprint(machineFingerprint);
#ifdef WANT_DEBUG
qDebug() << "| user-permissions: specific Machine Fingerprint matches, so: " << userPerms;
#endif
} else if (_server->_settingsManager.hasPermissionsForIP(senderAddress)) {
// this user comes from an IP we have in our permissions table, apply those permissions

View file

@ -550,6 +550,7 @@ void DomainServerSettingsManager::unpackPermissions() {
} else {
// anonymous, logged in, and friend users get connect permissions by default
perms->set(NodePermissions::Permission::canConnectToDomain);
perms->set(NodePermissions::Permission::canRezTemporaryCertifiedEntities);
}
// add the permissions to the standard map
@ -671,7 +672,7 @@ void DomainServerSettingsManager::processNodeKickRequestPacket(QSharedPointer<Re
bool newPermissions = false;
if (!verifiedUsername.isEmpty()) {
// if we have a verified user name for this user, we apply the kick to the username
// if we have a verified user name for this user, we first apply the kick to the username
// check if there were already permissions
bool hadPermissions = havePermissionsForName(verifiedUsername);
@ -683,7 +684,14 @@ void DomainServerSettingsManager::processNodeKickRequestPacket(QSharedPointer<Re
// ensure that the connect permission is clear
userPermissions->clear(NodePermissions::Permission::canConnectToDomain);
} else {
}
// if we didn't have a username, or this domain-server uses the "multi-kick" setting to
// kick logged in users via username AND machine fingerprint (or IP as fallback)
// then we remove connect permissions for the machine fingerprint (or IP as fallback)
const QString MULTI_KICK_SETTINGS_KEYPATH = "security.multi_kick_logged_in";
if (verifiedUsername.isEmpty() || valueOrDefaultValueForKeyPath(MULTI_KICK_SETTINGS_KEYPATH).toBool()) {
// remove connect permissions for the machine fingerprint
DomainServerNodeData* nodeData = static_cast<DomainServerNodeData*>(matchingNode->getLinkedData());
if (nodeData) {
@ -718,8 +726,8 @@ void DomainServerSettingsManager::processNodeKickRequestPacket(QSharedPointer<Re
// TODO: soon we will have feedback (in the form of a message to the client) after we kick. When we
// do, we will have a success flag, and perhaps a reason for failure. For now, just don't do it.
if (kickAddress == limitedNodeList->getPublicSockAddr().getAddress() ||
kickAddress == limitedNodeList->getLocalSockAddr().getAddress() ||
kickAddress.isLoopback() ) {
kickAddress == limitedNodeList->getLocalSockAddr().getAddress() ||
kickAddress.isLoopback() ) {
qWarning() << "attempt to kick node running on same machine as domain server, ignoring KickRequest";
return;
}

View file

@ -33,6 +33,56 @@ var EventBridge;
// replace the TempEventBridge with the real one.
var tempEventBridge = EventBridge;
EventBridge = channel.objects.eventBridge;
EventBridge.audioOutputDeviceChanged.connect(function(deviceName) {
navigator.mediaDevices.getUserMedia({ audio: true, video: true }).then(function(mediaStream) {
navigator.mediaDevices.enumerateDevices().then(function(devices) {
devices.forEach(function(device) {
if (device.kind == "audiooutput") {
if (device.label == deviceName){
console.log("Changing HTML audio output to device " + device.label);
var deviceId = device.deviceId;
var videos = document.getElementsByTagName("video");
for (var i = 0; i < videos.length; i++){
videos[i].setSinkId(deviceId);
}
var audios = document.getElementsByTagName("audio");
for (var i = 0; i < audios.length; i++){
audios[i].setSinkId(deviceId);
}
}
}
});
}).catch(function(err) {
console.log("Error getting media devices"+ err.name + ": " + err.message);
});
}).catch(function(err) {
console.log("Error getting user media"+ err.name + ": " + err.message);
});
});
// To be able to update the state of the output device selection for every element added to the DOM,
// we need to listen to events that might precede the addition of these elements.
// A more robust hack would be to add a setInterval that looks for DOM changes every 100-300 ms (at a possible performance cost).
window.addEventListener("load",function(event) {
setTimeout(function() {
EventBridge.forceHtmlAudioOutputDeviceUpdate();
}, 1200);
}, false);
document.addEventListener("click",function(){
setTimeout(function() {
EventBridge.forceHtmlAudioOutputDeviceUpdate();
}, 1200);
}, false);
document.addEventListener("change",function(){
setTimeout(function() {
EventBridge.forceHtmlAudioOutputDeviceUpdate();
}, 1200);
}, false);
tempEventBridge._callbacks.forEach(function (callback) {
EventBridge.scriptEventReceived.connect(callback);
});
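Illustrative only, not part of this change: the comment above mentions a polling alternative; assuming the same EventBridge.forceHtmlAudioOutputDeviceUpdate() call is available once the bridge is connected, a rough sketch could look like this:
    // Hypothetical polling fallback: instead of hooking load/click/change events,
    // re-apply the HTML audio output device on a fixed timer.
    var FORCE_OUTPUT_POLL_MS = 300; // the comment above suggests 100-300 ms
    setInterval(function () {
        if (typeof EventBridge !== "undefined" && EventBridge !== null) {
            EventBridge.forceHtmlAudioOutputDeviceUpdate();
        }
    }, FORCE_OUTPUT_POLL_MS);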

View file

@ -1,127 +0,0 @@
<!-- Copyright 2016 High Fidelity, Inc. -->
<html>
<head>
<meta charset="utf-8"/>
<input type="hidden" id="version" value="1"/>
<title>Welcome to Interface</title>
<style>
body {
background: black;
width: 100%;
overflow-x: hidden;
margin: 0;
padding: 0;
}
#kbm_button {
position: absolute;
left: 70;
top: 118;
width: 297;
height: 80;
}
#hand_controllers_button {
position: absolute;
left: 367;
top: 118;
width: 267;
height: 80;
}
#game_controller_button {
position: absolute;
left: 634;
top: 118;
width: 297;
height: 80;
}
#image_area {
width: 1024;
height: 720;
margin: auto;
position: absolute;
top: 0; left: 0; bottom: 0; right: 0;
}
</style>
<script>
var handControllerImageURL = null;
function showKbm() {
document.getElementById("main_image").setAttribute("src", "img/controls-help-keyboard.png");
}
function showHandControllers() {
document.getElementById("main_image").setAttribute("src", handControllerImageURL);
}
function showGamepad() {
document.getElementById("main_image").setAttribute("src", "img/controls-help-gamepad.png");
}
// This is not meant to be a complete or hardened query string parser - it only
// needs to handle the values we send in and have control over.
//
// queryString is a string of the form "key1=value1&key2=value2&key3&key4=value4"
function parseQueryString(queryString) {
var params = {};
var paramsParts = queryString.split("&");
for (var i = 0; i < paramsParts.length; ++i) {
var paramKeyValue = paramsParts[i].split("=");
if (paramKeyValue.length == 1) {
params[paramKeyValue[0]] = undefined;
} else if (paramKeyValue.length == 2) {
params[paramKeyValue[0]] = paramKeyValue[1];
} else {
console.error("Error parsing param keyvalue: ", paramParts);
}
}
return params;
}
function load() {
var parts = window.location.href.split("?");
var params = {};
if (parts.length > 0) {
params = parseQueryString(parts[1]);
}
switch (params.handControllerName) {
case "oculus":
handControllerImageURL = "img/controls-help-oculus.png";
break;
case "vive":
default:
handControllerImageURL = "img/controls-help-vive.png";
}
switch (params.defaultTab) {
case "gamepad":
showGamepad();
break;
case "handControllers":
showHandControllers();
break;
case "kbm":
default:
showKbm();
}
}
</script>
</head>
<body onload="load()">
<div id="image_area">
<img id="main_image" src="img/controls-help-keyboard.png" width="1024px" height="720px"></img>
<a href="#" id="kbm_button" onmousedown="showKbm()"></a>
<a href="#" id="hand_controllers_button" onmousedown="showHandControllers()"></a>
<a href="#" id="game_controller_button" onmousedown="showGamepad()"></a>
</div>
</body>
</html>

Binary file not shown.

Before

Width:  |  Height:  |  Size: 124 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 67 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 124 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 100 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 604 KiB

After

Width:  |  Height:  |  Size: 298 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 503 KiB

After

Width:  |  Height:  |  Size: 215 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 585 KiB

After

Width:  |  Height:  |  Size: 289 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 547 KiB

After

Width:  |  Height:  |  Size: 254 KiB

View file

@ -12,9 +12,9 @@
var MAX_WARNINGS = 3;
var numWarnings = 0;
var isWindowFocused = true;
var isKeyboardRaised = false;
var isNumericKeyboard = false;
var isPasswordField = false;
window.isKeyboardRaised = false;
window.isNumericKeyboard = false;
window.isPasswordField = false;
function shouldSetPasswordField() {
var nodeType = document.activeElement.type;
@ -62,7 +62,7 @@
var passwordField = shouldSetPasswordField();
if (isWindowFocused &&
(keyboardRaised !== isKeyboardRaised || numericKeyboard !== isNumericKeyboard || passwordField !== isPasswordField)) {
(keyboardRaised !== window.isKeyboardRaised || numericKeyboard !== window.isNumericKeyboard || passwordField !== window.isPasswordField)) {
if (typeof EventBridge !== "undefined" && EventBridge !== null) {
EventBridge.emitWebEvent(
@ -75,20 +75,20 @@
}
}
if (!isKeyboardRaised) {
if (!window.isKeyboardRaised) {
scheduleBringToView(250); // Allow time for keyboard to be raised in QML.
// TODO: should this rather be done from a 'client area height changed' event?
}
isKeyboardRaised = keyboardRaised;
isNumericKeyboard = numericKeyboard;
isPasswordField = passwordField;
window.isKeyboardRaised = keyboardRaised;
window.isNumericKeyboard = numericKeyboard;
window.isPasswordField = passwordField;
}
}, POLL_FREQUENCY);
window.addEventListener("click", function () {
var keyboardRaised = shouldRaiseKeyboard();
if(keyboardRaised && isKeyboardRaised) {
if (keyboardRaised && window.isKeyboardRaised) {
scheduleBringToView(150);
}
});
@ -99,7 +99,7 @@
window.addEventListener("blur", function () {
isWindowFocused = false;
isKeyboardRaised = false;
isNumericKeyboard = false;
window.isKeyboardRaised = false;
window.isNumericKeyboard = false;
});
})();

View file

@ -1,7 +1,7 @@
import QtQuick 2.5
import QtQuick.Controls 1.2
import QtWebChannel 1.0
import QtWebEngine 1.2
import QtWebEngine 1.5
import "controls-uit"
import "styles" as HifiStyles
@ -212,7 +212,7 @@ ScrollingWindow {
WebEngineScript {
id: createGlobalEventBridge
sourceCode: eventBridgeJavaScriptToInject
injectionPoint: WebEngineScript.DocumentCreation
injectionPoint: WebEngineScript.Deferred
worldId: WebEngineScript.MainWorld
}
@ -233,9 +233,13 @@ ScrollingWindow {
anchors.right: parent.right
onFeaturePermissionRequested: {
permissionsBar.securityOrigin = securityOrigin;
permissionsBar.feature = feature;
root.showPermissionsBar();
if (feature == 2) { // QWebEnginePage::MediaAudioCapture
grantFeaturePermission(securityOrigin, feature, true);
} else {
permissionsBar.securityOrigin = securityOrigin;
permissionsBar.feature = feature;
root.showPermissionsBar();
}
}
onLoadingChanged: {

View file

@ -8,7 +8,7 @@
// See the accompanying file LICENSE or http://www.apache.org/licenses/LICENSE-2.0.html
//
import QtQuick 2.5
import QtQuick 2.7
import QtWebEngine 1.5
WebEngineView {

View file

@ -70,11 +70,12 @@ Item {
keyItem.state = "mouseOver";
var globalPosition = keyItem.mapToGlobal(mouseArea1.mouseX, mouseArea1.mouseY);
var deviceId = Web3DOverlay.deviceIdByTouchPoint(globalPosition.x, globalPosition.y);
var hand = deviceId - 1; // based on touchEventUtils.js, deviceId is 'hand + 1', so 'hand' is 'deviceId' - 1
var pointerID = Web3DOverlay.deviceIdByTouchPoint(globalPosition.x, globalPosition.y);
if (hand == leftHand || hand == rightHand) {
Controller.triggerHapticPulse(_HAPTIC_STRENGTH, _HAPTIC_DURATION, hand);
if (Pointers.isLeftHand(pointerID)) {
Controller.triggerHapticPulse(_HAPTIC_STRENGTH, _HAPTIC_DURATION, leftHand);
} else if (Pointers.isRightHand(pointerID)) {
Controller.triggerHapticPulse(_HAPTIC_STRENGTH, _HAPTIC_DURATION, rightHand);
}
}

View file

@ -11,6 +11,7 @@
import QtQuick 2.5
import QtQuick.Controls 1.4
import QtQuick.Controls.Styles 1.4
import QtQuick.Controls 2.2 as QQC2
import "../styles-uit"
@ -24,6 +25,45 @@ TableView {
model: ListModel { }
Component.onCompleted: {
if (flickableItem !== null && flickableItem !== undefined) {
tableView.flickableItem.QQC2.ScrollBar.vertical = scrollbar
}
}
QQC2.ScrollBar {
id: scrollbar
parent: tableView.flickableItem
policy: QQC2.ScrollBar.AsNeeded
orientation: Qt.Vertical
visible: size < 1.0
topPadding: tableView.headerVisible ? hifi.dimensions.tableHeaderHeight + 1 : 1
anchors.top: tableView.top
anchors.left: tableView.right
anchors.bottom: tableView.bottom
background: Item {
implicitWidth: hifi.dimensions.scrollbarBackgroundWidth
Rectangle {
anchors {
fill: parent;
topMargin: tableView.headerVisible ? hifi.dimensions.tableHeaderHeight : 0
}
color: isLightColorScheme ? hifi.colors.tableScrollBackgroundLight
: hifi.colors.tableScrollBackgroundDark
}
}
contentItem: Item {
implicitWidth: hifi.dimensions.scrollbarHandleWidth
Rectangle {
anchors.fill: parent
radius: (width - 4)/2
color: isLightColorScheme ? hifi.colors.tableScrollHandleLight : hifi.colors.tableScrollHandleDark
}
}
}
headerVisible: false
headerDelegate: Rectangle {
height: hifi.dimensions.tableHeaderHeight
@ -98,74 +138,13 @@ TableView {
backgroundVisible: true
horizontalScrollBarPolicy: Qt.ScrollBarAlwaysOff
verticalScrollBarPolicy: Qt.ScrollBarAsNeeded
verticalScrollBarPolicy: Qt.ScrollBarAlwaysOff
style: TableViewStyle {
// Needed in order to keep displaying alternating row backgrounds after the end of the table entries.
backgroundColor: tableView.isLightColorScheme ? hifi.colors.tableBackgroundLight : hifi.colors.tableBackgroundDark
alternateBackgroundColor: tableView.isLightColorScheme ? hifi.colors.tableRowLightOdd : hifi.colors.tableRowDarkOdd
padding.top: headerVisible ? hifi.dimensions.tableHeaderHeight: 0
handle: Item {
id: scrollbarHandle
implicitWidth: hifi.dimensions.scrollbarHandleWidth
Rectangle {
anchors {
fill: parent
topMargin: 3
bottomMargin: 3 // ""
leftMargin: 1 // Move it right
rightMargin: -1 // ""
}
radius: hifi.dimensions.scrollbarHandleWidth/2
color: isLightColorScheme ? hifi.colors.tableScrollHandleLight : hifi.colors.tableScrollHandleDark
}
}
scrollBarBackground: Item {
implicitWidth: hifi.dimensions.scrollbarBackgroundWidth
Rectangle {
anchors {
fill: parent
margins: -1 // Expand
topMargin: -1
}
color: isLightColorScheme ? hifi.colors.tableScrollBackgroundLight : hifi.colors.tableScrollBackgroundDark
// Extend header color above scrollbar background
Rectangle {
anchors {
top: parent.top
topMargin: -hifi.dimensions.tableHeaderHeight
left: parent.left
right: parent.right
}
height: hifi.dimensions.tableHeaderHeight
color: tableView.isLightColorScheme ? hifi.colors.tableBackgroundLight : hifi.colors.tableBackgroundDark
visible: headerVisible
}
Rectangle {
// Extend header bottom border
anchors {
top: parent.top
left: parent.left
right: parent.right
}
height: 1
color: isLightColorScheme ? hifi.colors.lightGrayText : hifi.colors.baseGrayHighlight
visible: headerVisible
}
}
}
incrementControl: Item {
visible: false
}
decrementControl: Item {
visible: false
}
}
rowDelegate: Rectangle {

View file

@ -9,9 +9,11 @@
//
import QtQml.Models 2.2
import QtQuick 2.5
import QtQuick 2.7
import QtQuick.Controls 1.4
import QtQuick.Controls.Styles 1.4
import QtQuick.Controls 2.2 as QQC2
import "../styles-uit"
@ -35,6 +37,45 @@ TreeView {
headerVisible: false
Component.onCompleted: {
if (flickableItem !== null && flickableItem !== undefined) {
treeView.flickableItem.QQC2.ScrollBar.vertical = scrollbar
}
}
QQC2.ScrollBar {
id: scrollbar
parent: treeView.flickableItem
policy: QQC2.ScrollBar.AsNeeded
orientation: Qt.Vertical
visible: size < 1.0
topPadding: treeView.headerVisible ? hifi.dimensions.tableHeaderHeight + 1 : 1
anchors.top: treeView.top
anchors.left: treeView.right
anchors.bottom: treeView.bottom
background: Item {
implicitWidth: hifi.dimensions.scrollbarBackgroundWidth
Rectangle {
anchors {
fill: parent;
topMargin: treeView.headerVisible ? hifi.dimensions.tableHeaderHeight: 0
}
color: isLightColorScheme ? hifi.colors.tableScrollBackgroundLight
: hifi.colors.tableScrollBackgroundDark
}
}
contentItem: Item {
implicitWidth: hifi.dimensions.scrollbarHandleWidth
Rectangle {
anchors.fill: parent
radius: (width - 4)/2
color: isLightColorScheme ? hifi.colors.tableScrollHandleLight : hifi.colors.tableScrollHandleDark
}
}
}
// Use rectangle to draw border with rounded corners.
frameVisible: false
Rectangle {
@ -50,7 +91,7 @@ TreeView {
backgroundVisible: true
horizontalScrollBarPolicy: Qt.ScrollBarAlwaysOff
verticalScrollBarPolicy: Qt.ScrollBarAsNeeded
verticalScrollBarPolicy: Qt.ScrollBarAlwaysOff
style: TreeViewStyle {
// Needed in order to keep displaying alternating row backgrounds after the end of the table entries.
@ -126,66 +167,6 @@ TreeView {
leftMargin: hifi.dimensions.tablePadding / 2
}
}
handle: Item {
id: scrollbarHandle
implicitWidth: hifi.dimensions.scrollbarHandleWidth
Rectangle {
anchors {
fill: parent
topMargin: treeView.headerVisible ? hifi.dimensions.tableHeaderHeight + 3 : 3
bottomMargin: 3 // ""
leftMargin: 1 // Move it right
rightMargin: -1 // ""
}
radius: hifi.dimensions.scrollbarHandleWidth / 2
color: treeView.isLightColorScheme ? hifi.colors.tableScrollHandleLight : hifi.colors.tableScrollHandleDark
}
}
scrollBarBackground: Item {
implicitWidth: hifi.dimensions.scrollbarBackgroundWidth
Rectangle {
anchors {
fill: parent
topMargin: treeView.headerVisible ? hifi.dimensions.tableHeaderHeight - 1 : -1
margins: -1 // Expand
}
color: treeView.isLightColorScheme ? hifi.colors.tableScrollBackgroundLight : hifi.colors.tableScrollBackgroundDark
// Extend header color above scrollbar background
Rectangle {
anchors {
top: parent.top
topMargin: -hifi.dimensions.tableHeaderHeight
left: parent.left
right: parent.right
}
height: hifi.dimensions.tableHeaderHeight
color: treeView.isLightColorScheme ? hifi.colors.tableBackgroundLight : hifi.colors.tableBackgroundDark
visible: treeView.headerVisible
}
Rectangle {
// Extend header bottom border
anchors {
top: parent.top
left: parent.left
right: parent.right
}
height: 1
color: treeView.isLightColorScheme ? hifi.colors.lightGrayText : hifi.colors.baseGrayHighlight
visible: treeView.headerVisible
}
}
}
incrementControl: Item {
visible: false
}
decrementControl: Item {
visible: false
}
}
rowDelegate: Rectangle {
@ -193,8 +174,8 @@ TreeView {
color: styleData.selected
? hifi.colors.primaryHighlight
: treeView.isLightColorScheme
? (styleData.alternate ? hifi.colors.tableRowLightEven : hifi.colors.tableRowLightOdd)
: (styleData.alternate ? hifi.colors.tableRowDarkEven : hifi.colors.tableRowDarkOdd)
? (styleData.alternate ? hifi.colors.tableRowLightEven : hifi.colors.tableRowLightOdd)
: (styleData.alternate ? hifi.colors.tableRowDarkEven : hifi.colors.tableRowDarkOdd)
}
itemDelegate: FiraSansSemiBold {
@ -209,9 +190,9 @@ TreeView {
text: styleData.value
size: hifi.fontSizes.tableText
color: colorScheme == hifi.colorSchemes.light
? (styleData.selected ? hifi.colors.black : hifi.colors.baseGrayHighlight)
: (styleData.selected ? hifi.colors.black : hifi.colors.lightGrayText)
? (styleData.selected ? hifi.colors.black : hifi.colors.baseGrayHighlight)
: (styleData.selected ? hifi.colors.black : hifi.colors.lightGrayText)
elide: Text.ElideRight
}

View file

@ -135,4 +135,10 @@ Item {
playing: visible
z: 10000
}
Keys.onPressed: {
if ((event.modifiers & Qt.ShiftModifier) && (event.modifiers & Qt.ControlModifier)) {
webViewCore.focus = false;
}
}
}

View file

@ -51,19 +51,23 @@ FocusScope {
// The VR version of the primary menu
property var rootMenu: Menu {
id: rootMenuId
objectName: "rootMenu"
// for some reason it is not possible to use just '({})' here, as it comes through empty when passed to TableRoot/DesktopRoot
property var exclusionGroupsByMenuItem : ListModel {}
property var exclusionGroups: ({});
property Component exclusiveGroupMaker: Component {
ExclusiveGroup {
}
}
function addExclusionGroup(menuItem, exclusionGroup)
{
exclusionGroupsByMenuItem.append(
{
'menuItem' : menuItem.toString(),
'exclusionGroup' : exclusionGroup.toString()
}
);
function addExclusionGroup(qmlAction, exclusionGroup) {
var exclusionGroupId = exclusionGroup.toString();
if(!exclusionGroups[exclusionGroupId]) {
exclusionGroups[exclusionGroupId] = exclusiveGroupMaker.createObject(rootMenuId);
}
qmlAction.exclusiveGroup = exclusionGroups[exclusionGroupId]
}
}
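The rewritten addExclusionGroup above replaces the ListModel bookkeeping with a get-or-create map of ExclusiveGroup objects keyed by the group's string id, and assigns the shared group directly to the action. A sketch of the same pattern in plain JavaScript (illustrative only; makeGroup stands in for exclusiveGroupMaker.createObject()):

var exclusionGroups = {};

function addExclusionGroup(qmlAction, exclusionGroup, makeGroup) {
    var exclusionGroupId = exclusionGroup.toString();
    if (!exclusionGroups[exclusionGroupId]) {
        // Create the shared group lazily, the first time this id is seen.
        exclusionGroups[exclusionGroupId] = makeGroup();
    }
    // Every action that names the same group shares one object, which is
    // what makes the corresponding menu items mutually exclusive.
    qmlAction.exclusiveGroup = exclusionGroups[exclusionGroupId];
}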

View file

@ -1,6 +1,6 @@
import QtQuick 2.5
import QtQuick.Controls 1.4
import QtWebEngine 1.1;
import QtWebEngine 1.5;
import Qt.labs.settings 1.0
import "../desktop" as OriginalDesktop

View file

@ -442,7 +442,7 @@ Item {
Rectangle {
id: nameCardVUMeter
// Size
width: isMyCard ? myDisplayName.width - 20 : ((gainSlider.value - gainSlider.minimumValue)/(gainSlider.maximumValue - gainSlider.minimumValue)) * (gainSlider.width);
width: ((gainSlider.value - gainSlider.minimumValue)/(gainSlider.maximumValue - gainSlider.minimumValue)) * (gainSlider.width);
height: 8
// Anchors
anchors.bottom: isMyCard ? avatarImage.bottom : parent.bottom;
@ -526,16 +526,14 @@ Item {
anchors.verticalCenter: nameCardVUMeter.verticalCenter;
anchors.left: nameCardVUMeter.left;
// Properties
visible: !isMyCard && selected && pal.activeTab == "nearbyTab" && isPresent;
visible: (isMyCard || (selected && pal.activeTab == "nearbyTab")) && isPresent;
value: Users.getAvatarGain(uuid)
minimumValue: -60.0
maximumValue: 20.0
stepSize: 5
updateValueWhileDragging: true
onValueChanged: {
if (uuid !== "") {
updateGainFromQML(uuid, value, false);
}
updateGainFromQML(uuid, value, false);
}
onPressedChanged: {
if (!pressed) {
@ -575,7 +573,19 @@ Item {
implicitHeight: 16
}
}
}
RalewayRegular {
// The slider for my card is special: it controls the master gain
id: gainSliderText;
visible: isMyCard;
text: "master volume";
size: hifi.fontSizes.tabularData;
anchors.left: parent.right;
anchors.leftMargin: 8;
color: hifi.colors.baseGrayHighlight;
horizontalAlignment: Text.AlignLeft;
verticalAlignment: Text.AlignTop;
}
}
function updateGainFromQML(avatarUuid, sliderValue, isReleased) {
Users.setAvatarGain(avatarUuid, sliderValue);

View file

@ -1,3 +1,4 @@
//
// WebBrowser.qml
//
@ -9,12 +10,12 @@
// See the accompanying file LICENSE or http://www.apache.org/licenses/LICENSE-2.0.html
//
import QtQuick 2.5
import QtQuick.Controls 1.5 as QQControls
import QtQuick 2.7
import QtQuick.Controls 2.2 as QQControls
import QtQuick.Layouts 1.3
import QtQuick.Controls.Styles 1.4
import QtGraphicalEffects 1.0
import QtWebEngine 1.2
import QtWebEngine 1.5
import QtWebChannel 1.0
import "../styles-uit"
@ -22,6 +23,8 @@ import "../controls-uit" as HifiControls
import "../windows"
import "../controls"
import HifiWeb 1.0
Rectangle {
id: root;
@ -32,214 +35,401 @@ Rectangle {
property bool keyboardEnabled: true // FIXME - Keyboard HMD only: Default to false
property bool keyboardRaised: false
property bool punctuationMode: false
property var suggestionsList: []
readonly property string searchUrlTemplate: "https://www.google.com/search?client=hifibrowser&q=";
WebBrowserSuggestionsEngine {
id: searchEngine
onSuggestions: {
if (suggestions.length > 0) {
suggestionsList = []
suggestionsList.push(addressBarInput.text); //do not overwrite edit text
for(var i = 0; i < suggestions.length; i++) {
suggestionsList.push(suggestions[i]);
}
addressBar.model = suggestionsList
if (!addressBar.popup.visible) {
addressBar.popup.open();
}
}
}
}
Timer {
id: suggestionRequestTimer
interval: 200
repeat: false
onTriggered: {
if (addressBar.editText !== "") {
searchEngine.querySuggestions(addressBarInput.text);
}
}
}
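suggestionRequestTimer, together with the onEditTextChanged handler further down in this file, acts as a 200 ms debounce: the timer restarts on every edit, so a suggestions request only fires once the user pauses typing. A minimal browser-JavaScript sketch of the same idea (illustrative only; querySuggestions is a hypothetical stand-in for the engine call):

var debounceHandle = null;

function querySuggestions(text) {
    // Hypothetical stand-in for WebBrowserSuggestionsEngine.querySuggestions().
    console.log("requesting suggestions for:", text);
}

function onAddressEdited(text) {
    if (debounceHandle !== null) {
        clearTimeout(debounceHandle);   // restart the countdown on every keystroke
    }
    debounceHandle = setTimeout(function () {
        if (text !== "") {
            querySuggestions(text);
        }
    }, 200);
}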
color: hifi.colors.baseGray;
// only show the title if loaded through a "loader"
function goTo(url) {
// must be a valid attempt to open a site, i.e. the text contains a dot
var urlNew = url
if (url.indexOf(".") > 0) {
if (url.indexOf("http") < 0) {
urlNew = "http://" + url;
}
} else {
urlNew = searchUrlTemplate + url
}
addressBar.model = []
// need to rebind if the binding was broken by selecting from the suggestions
addressBar.editText = Qt.binding( function() { return webStack.currentItem.webEngineView.url; });
webStack.currentItem.webEngineView.url = urlNew
suggestionRequestTimer.stop();
addressBar.popup.close();
}
Column {
spacing: 2
width: parent.width;
RowLayout {
id: addressBarRow
width: parent.width;
height: 48
HifiControls.WebGlyphButton {
enabled: webEngineView.canGoBack
enabled: webStack.currentItem.webEngineView.canGoBack || webStack.depth > 1
glyph: hifi.glyphs.backward;
anchors.verticalCenter: parent.verticalCenter;
size: 38;
onClicked: {
webEngineView.goBack()
if (webStack.currentItem.webEngineView.canGoBack) {
webStack.currentItem.webEngineView.goBack();
} else if (webStack.depth > 1) {
webStack.pop();
}
}
}
HifiControls.WebGlyphButton {
enabled: webEngineView.canGoForward
enabled: webStack.currentItem.webEngineView.canGoForward
glyph: hifi.glyphs.forward;
anchors.verticalCenter: parent.verticalCenter;
size: 38;
onClicked: {
webEngineView.goForward()
webStack.currentItem.webEngineView.goForward();
}
}
QQControls.TextField {
QQControls.ComboBox {
id: addressBar
Image {
anchors.verticalCenter: addressBar.verticalCenter;
x: 5
z: 2
id: faviconImage
width: 16; height: 16
sourceSize: Qt.size(width, height)
source: webEngineView.icon
//selectByMouse: true
focus: true
editable: true
//flat: true
indicator: Item {}
background: Item {}
onActivated: {
goTo(textAt(index));
}
HifiControls.WebGlyphButton {
glyph: webEngineView.loading ? hifi.glyphs.closeSmall : hifi.glyphs.reloadSmall;
anchors.verticalCenter: parent.verticalCenter;
width: hifi.dimensions.controlLineHeight
z: 2
x: addressBar.width - 28
onClicked: {
if (webEngineView.loading) {
webEngineView.stop()
} else {
reloadTimer.start()
onHighlightedIndexChanged: {
if (highlightedIndex >= 0) {
addressBar.editText = textAt(highlightedIndex)
}
}
popup.height: webStack.height
onFocusChanged: {
if (focus) {
addressBarInput.selectAll();
}
}
contentItem: QQControls.TextField {
id: addressBarInput
leftPadding: 26
rightPadding: hifi.dimensions.controlLineHeight + 5
text: addressBar.editText
placeholderText: qsTr("Enter URL")
font: addressBar.font
selectByMouse: true
horizontalAlignment: Text.AlignLeft
verticalAlignment: Text.AlignVCenter
onFocusChanged: {
if (focus) {
selectAll();
}
}
Keys.onDeletePressed: {
addressBarInput.text = ""
}
Keys.onPressed: {
if (event.key === Qt.Key_Return) {
goTo(addressBarInput.text);
event.accepted = true;
}
}
Image {
anchors.verticalCenter: parent.verticalCenter;
x: 5
z: 2
id: faviconImage
width: 16; height: 16
sourceSize: Qt.size(width, height)
source: webStack.currentItem.webEngineView.icon
}
HifiControls.WebGlyphButton {
glyph: webStack.currentItem.webEngineView.loading ? hifi.glyphs.closeSmall : hifi.glyphs.reloadSmall;
anchors.verticalCenter: parent.verticalCenter;
width: hifi.dimensions.controlLineHeight
z: 2
x: addressBarInput.width - implicitWidth
onClicked: {
if (webStack.currentItem.webEngineView.loading) {
webStack.currentItem.webEngineView.stop();
} else {
webStack.currentItem.reloadTimer.start();
}
}
}
}
style: TextFieldStyle {
padding {
left: 26;
right: 26
Component.onCompleted: ScriptDiscoveryService.scriptsModelFilter.filterRegExp = new RegExp("^.*$", "i");
Keys.onPressed: {
if (event.key === Qt.Key_Return) {
goTo(addressBarInput.text);
event.accepted = true;
}
}
focus: true
onEditTextChanged: {
if (addressBar.editText !== "" && addressBar.editText !== webStack.currentItem.webEngineView.url.toString()) {
suggestionRequestTimer.restart();
} else {
addressBar.model = []
addressBar.popup.close();
}
}
Layout.fillWidth: true
text: webEngineView.url
onAccepted: webEngineView.url = text
editText: webStack.currentItem.webEngineView.url
onAccepted: goTo(addressBarInput.text);
}
HifiControls.WebGlyphButton {
checkable: true
//only QtWebEngine 1.3
//checked: webEngineView.audioMuted
checked: webStack.currentItem.webEngineView.audioMuted
glyph: checked ? hifi.glyphs.unmuted : hifi.glyphs.muted
anchors.verticalCenter: parent.verticalCenter;
width: hifi.dimensions.controlLineHeight
onClicked: {
webEngineView.triggerWebAction(WebEngineView.ToggleMediaMute)
webStack.currentItem.webEngineView.audioMuted = !webStack.currentItem.webEngineView.audioMuted
}
}
}
QQControls.ProgressBar {
id: loadProgressBar
style: ProgressBarStyle {
background: Rectangle {
color: "#6A6A6A"
}
progress: Rectangle{
background: Rectangle {
implicitHeight: 2
color: "#6A6A6A"
}
contentItem: Item {
implicitHeight: 2
Rectangle {
width: loadProgressBar.visualPosition * parent.width
height: parent.height
color: "#00B4EF"
}
}
width: parent.width;
minimumValue: 0
maximumValue: 100
value: webEngineView.loadProgress
from: 0
to: 100
value: webStack.currentItem.webEngineView.loadProgress
height: 2
}
HifiControls.BaseWebView {
id: webEngineView
focus: true
objectName: "tabletWebEngineView"
Component {
id: webViewComponent
Rectangle {
property alias webEngineView: webEngineView
property alias reloadTimer: reloadTimer
url: "http://www.highfidelity.com"
property real webViewHeight: root.height - loadProgressBar.height - 48 - 4
property WebEngineNewViewRequest request: null
width: parent.width;
height: keyboardEnabled && keyboardRaised ? webViewHeight - keyboard.height : webViewHeight
property bool isDialog: QQControls.StackView.index > 0
property real margins: isDialog ? 10 : 0
profile: HFWebEngineProfile;
color: "#d1d1d1"
property string userScriptUrl: ""
// creates a global EventBridge object.
WebEngineScript {
id: createGlobalEventBridge
sourceCode: eventBridgeJavaScriptToInject
injectionPoint: WebEngineScript.DocumentCreation
worldId: WebEngineScript.MainWorld
}
// detects when to raise and lower virtual keyboard
WebEngineScript {
id: raiseAndLowerKeyboard
injectionPoint: WebEngineScript.Deferred
sourceUrl: resourceDirectoryUrl + "/html/raiseAndLowerKeyboard.js"
worldId: WebEngineScript.MainWorld
}
// User script.
WebEngineScript {
id: userScript
sourceUrl: webEngineView.userScriptUrl
injectionPoint: WebEngineScript.DocumentReady // DOM ready but page load may not be finished.
worldId: WebEngineScript.MainWorld
}
userScripts: [ createGlobalEventBridge, raiseAndLowerKeyboard, userScript ]
settings.autoLoadImages: true
settings.javascriptEnabled: true
settings.errorPageEnabled: true
settings.pluginsEnabled: true
settings.fullScreenSupportEnabled: false
//from WebEngine 1.3
// settings.autoLoadIconsForPage: false
// settings.touchIconsEnabled: false
onCertificateError: {
error.defer();
}
Component.onCompleted: {
webChannel.registerObject("eventBridge", eventBridge);
webChannel.registerObject("eventBridgeWrapper", eventBridgeWrapper);
webEngineView.profile.httpUserAgent = "Mozilla/5.0 Chrome (HighFidelityInterface)";
}
onFeaturePermissionRequested: {
grantFeaturePermission(securityOrigin, feature, true);
}
onNewViewRequested: {
if (!request.userInitiated) {
print("Warning: Blocked a popup window.");
}
}
onRenderProcessTerminated: {
var status = "";
switch (terminationStatus) {
case WebEngineView.NormalTerminationStatus:
status = "(normal exit)";
break;
case WebEngineView.AbnormalTerminationStatus:
status = "(abnormal exit)";
break;
case WebEngineView.CrashedTerminationStatus:
status = "(crashed)";
break;
case WebEngineView.KilledTerminationStatus:
status = "(killed)";
break;
QQControls.StackView.onActivated: {
addressBar.editText = Qt.binding( function() { return webStack.currentItem.webEngineView.url; });
}
print("Render process exited with code " + exitCode + " " + status);
reloadTimer.running = true;
}
onRequestChanged: {
if (isDialog && request !== null && request !== undefined) { // is this a dialog?
request.openIn(webEngineView);
}
}
onWindowCloseRequested: {
}
HifiControls.BaseWebView {
id: webEngineView
anchors.fill: parent
anchors.margins: parent.margins
Timer {
id: reloadTimer
interval: 0
running: false
repeat: false
onTriggered: webEngineView.reload()
layer.enabled: parent.isDialog
layer.effect: DropShadow {
verticalOffset: 8
horizontalOffset: 8
color: "#330066ff"
samples: 10
spread: 0.5
}
focus: true
objectName: "tabletWebEngineView"
//profile: HFWebEngineProfile;
profile.httpUserAgent: "Mozilla/5.0 (Android; Mobile; rv:13.0) Gecko/13.0 Firefox/13.0"
property string userScriptUrl: ""
onLoadingChanged: {
if (!loading) {
addressBarInput.cursorPosition = 0 // set the input field cursor to the beginning
suggestionRequestTimer.stop();
addressBar.popup.close();
}
}
onLinkHovered: {
//TODO: change cursor shape?
}
// creates a global EventBridge object.
WebEngineScript {
id: createGlobalEventBridge
sourceCode: eventBridgeJavaScriptToInject
injectionPoint: WebEngineScript.Deferred
worldId: WebEngineScript.MainWorld
}
// detects when to raise and lower virtual keyboard
WebEngineScript {
id: raiseAndLowerKeyboard
injectionPoint: WebEngineScript.Deferred
sourceUrl: resourceDirectoryUrl + "/html/raiseAndLowerKeyboard.js"
worldId: WebEngineScript.MainWorld
}
// User script.
WebEngineScript {
id: userScript
sourceUrl: webEngineView.userScriptUrl
injectionPoint: WebEngineScript.DocumentReady // DOM ready but page load may not be finished.
worldId: WebEngineScript.MainWorld
}
userScripts: [ createGlobalEventBridge, raiseAndLowerKeyboard, userScript ]
settings.autoLoadImages: true
settings.javascriptEnabled: true
settings.errorPageEnabled: true
settings.pluginsEnabled: true
settings.fullScreenSupportEnabled: true
settings.autoLoadIconsForPage: true
settings.touchIconsEnabled: true
onCertificateError: {
error.defer();
}
Component.onCompleted: {
webChannel.registerObject("eventBridge", eventBridge);
webChannel.registerObject("eventBridgeWrapper", eventBridgeWrapper);
}
onFeaturePermissionRequested: {
grantFeaturePermission(securityOrigin, feature, true);
}
onNewViewRequested: {
if (request.destination == WebEngineView.NewViewInDialog) {
webStack.push(webViewComponent, {"request": request});
} else {
request.openIn(webEngineView);
}
}
onRenderProcessTerminated: {
var status = "";
switch (terminationStatus) {
case WebEngineView.NormalTerminationStatus:
status = "(normal exit)";
break;
case WebEngineView.AbnormalTerminationStatus:
status = "(abnormal exit)";
break;
case WebEngineView.CrashedTerminationStatus:
status = "(crashed)";
break;
case WebEngineView.KilledTerminationStatus:
status = "(killed)";
break;
}
console.error("Render process exited with code " + exitCode + " " + status);
reloadTimer.running = true;
}
onFullScreenRequested: {
if (request.toggleOn) {
webEngineView.state = "FullScreen";
} else {
webEngineView.state = "";
}
request.accept();
}
onWindowCloseRequested: {
webStack.pop();
}
}
Timer {
id: reloadTimer
interval: 0
running: false
repeat: false
onTriggered: webEngineView.reload()
}
}
}
QQControls.StackView {
id: webStack
width: parent.width;
property real webViewHeight: root.height - loadProgressBar.height - 48 - 4
height: keyboardEnabled && keyboardRaised ? webViewHeight - keyboard.height : webViewHeight
Component.onCompleted: webStack.push(webViewComponent, {"webEngineView.url": "https://www.highfidelity.com"});
}
}
HifiControls.Keyboard {
id: keyboard
raised: parent.keyboardEnabled && parent.keyboardRaised

View file

@ -79,10 +79,12 @@ Rectangle {
if (result.status !== 'success') {
failureErrorText.text = result.message;
root.activeView = "checkoutFailure";
UserActivityLogger.commercePurchaseFailure(root.itemId, root.itemPrice, !root.alreadyOwned, result.message);
} else {
root.itemHref = result.data.download_url;
root.isWearable = result.data.categories.indexOf("Wearables") > -1;
root.activeView = "checkoutSuccess";
UserActivityLogger.commercePurchaseSuccess(root.itemId, root.itemPrice, !root.alreadyOwned);
}
}
@ -599,6 +601,7 @@ Rectangle {
sendToScript({method: 'checkout_rezClicked', itemHref: root.itemHref, isWearable: root.isWearable});
rezzedNotifContainer.visible = true;
rezzedNotifContainerTimer.start();
UserActivityLogger.commerceEntityRezzed(root.itemId, "checkout", root.isWearable ? "rez" : "wear");
}
}
RalewaySemiBold {

View file

@ -430,7 +430,7 @@ Rectangle {
var a = new Date(timestamp);
var year = a.getFullYear();
var month = addLeadingZero(a.getMonth());
var month = addLeadingZero(a.getMonth() + 1);
var day = addLeadingZero(a.getDate());
var hour = a.getHours();
var drawnHour = hour;
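The one-line change above fixes a common JavaScript pitfall: Date.getMonth() is zero-based, so December reports 11. A tiny illustration with arbitrary values:

var a = new Date(2017, 11, 7);   // 7 December 2017 (month index 11)
console.log(a.getMonth());       // 11
console.log(a.getMonth() + 1);   // 12 -- the calendar month a person expects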

View file

@ -349,6 +349,7 @@ Item {
sendToPurchases({method: 'purchases_rezClicked', itemHref: root.itemHref, isWearable: root.isWearable});
rezzedNotifContainer.visible = true;
rezzedNotifContainerTimer.start();
UserActivityLogger.commerceEntityRezzed(root.itemId, "purchases", root.isWearable ? "rez" : "wear");
}
style: ButtonStyle {

View file

@ -343,6 +343,9 @@ Rectangle {
ListModel {
id: previousPurchasesModel;
}
HifiCommerceCommon.SortableListModel {
id: tempPurchasesModel;
}
HifiCommerceCommon.SortableListModel {
id: filteredPurchasesModel;
}
@ -635,20 +638,40 @@ Rectangle {
}
function buildFilteredPurchasesModel() {
filteredPurchasesModel.clear();
var sameItemCount = 0;
tempPurchasesModel.clear();
for (var i = 0; i < purchasesModel.count; i++) {
if (purchasesModel.get(i).title.toLowerCase().indexOf(filterBar.text.toLowerCase()) !== -1) {
if (purchasesModel.get(i).status !== "confirmed" && !root.isShowingMyItems) {
filteredPurchasesModel.insert(0, purchasesModel.get(i));
tempPurchasesModel.insert(0, purchasesModel.get(i));
} else if ((root.isShowingMyItems && purchasesModel.get(i).edition_number === "0") ||
(!root.isShowingMyItems && purchasesModel.get(i).edition_number !== "0")) {
filteredPurchasesModel.append(purchasesModel.get(i));
tempPurchasesModel.append(purchasesModel.get(i));
}
}
}
for (var i = 0; i < tempPurchasesModel.count; i++) {
if (!filteredPurchasesModel.get(i)) {
sameItemCount = -1;
break;
} else if (tempPurchasesModel.get(i).itemId === filteredPurchasesModel.get(i).itemId &&
tempPurchasesModel.get(i).edition_number === filteredPurchasesModel.get(i).edition_number &&
tempPurchasesModel.get(i).status === filteredPurchasesModel.get(i).status) {
sameItemCount++;
}
}
populateDisplayedItemCounts();
sortByDate();
if (sameItemCount !== tempPurchasesModel.count) {
filteredPurchasesModel.clear();
for (var i = 0; i < tempPurchasesModel.count; i++) {
filteredPurchasesModel.append(tempPurchasesModel.get(i));
}
populateDisplayedItemCounts();
sortByDate();
}
}
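buildFilteredPurchasesModel now filters into a temporary model first and only swaps the result into the displayed model when something actually changed, so the visible list is not reset needlessly on every refresh. A compare-then-swap sketch of the same idea in plain JavaScript (illustrative only; plain arrays stand in for the QML ListModels):

function sameItems(tempItems, shownItems, keys) {
    if (tempItems.length !== shownItems.length) {
        return false;
    }
    for (var i = 0; i < tempItems.length; i++) {
        for (var k = 0; k < keys.length; k++) {
            if (tempItems[i][keys[k]] !== shownItems[i][keys[k]]) {
                return false;
            }
        }
    }
    return true;
}

function refreshIfChanged(tempItems, shownItems) {
    // Only rebuild (and re-sort) the visible list when the freshly filtered
    // results differ on the fields that matter for display.
    if (!sameItems(tempItems, shownItems, ["itemId", "edition_number", "status"])) {
        shownItems.length = 0;
        Array.prototype.push.apply(shownItems, tempItems);
    }
}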
function checkIfAnyItemStatusChanged() {

View file

@ -90,6 +90,11 @@ Item {
ListModel {
id: helpModel;
ListElement {
isExpanded: false;
question: "How can I get HFC?"
answer: qsTr("High Fidelity commerce is in closed beta right now.<br><br>To request entry and get free HFC, <b>please contact info@highfidelity.com with your High Fidelity account username and the email address registered to that account.</b>");
}
ListElement {
isExpanded: false;
question: "What are private keys?"
@ -118,12 +123,7 @@ Item {
ListElement {
isExpanded: false;
question: "My HFC balance isn't what I expect it to be. Why?"
answer: qsTr('High Fidelity Coin (HFC) transactions are backed by a <b>blockchain</b>, which takes time to update. The status of a transaction usually updates within 90 seconds.<br><br><b><font color="#0093C5"><a href="#blockchain">Tap here to learn more about the blockchain.</a></font></b>');
}
ListElement {
isExpanded: false;
question: "My friend purchased my item from the Marketplace, but I still haven't received the money from the sale. Why not?"
answer: qsTr('High Fidelity Coin (HFC) transactions are backed by a <b>blockchain</b>, which takes time to update. The status of a transaction usually updates within 90 seconds, at which point you will receive your money.<br><br><b><font color="#0093C5"><a href="#blockchain">Tap here to learn more about the blockchain.</a></font></b>');
answer: qsTr('High Fidelity Coin (HFC) transactions are backed by a <b>blockchain</b>, which takes time to update. The status of a transaction usually updates within a few seconds.<br><br><b><font color="#0093C5"><a href="#blockchain">Tap here to learn more about the blockchain.</a></font></b>');
}
ListElement {
isExpanded: false;
@ -243,7 +243,7 @@ Item {
if (link === "#privateKeyPath") {
Qt.openUrlExternally("file:///" + root.keyFilePath.substring(0, root.keyFilePath.lastIndexOf('/')));
} else if (link === "#blockchain") {
Qt.openUrlExternally("https://www.highfidelity.com/");
Qt.openUrlExternally("https://docs.highfidelity.com/high-fidelity-commerce");
}
}
}

View file

@ -206,16 +206,6 @@ Item {
root.isPasswordField = (focus && passphraseField.echoMode === TextInput.Password);
}
//MouseArea {
// anchors.fill: parent;
// onClicked: {
// root.keyboardRaised = true;
// root.isPasswordField = (passphraseField.echoMode === TextInput.Password);
// mouse.accepted = false;
// }
//}
onAccepted: {
submitPassphraseInputButton.enabled = false;
Commerce.setPassphrase(passphraseField.text);
@ -362,25 +352,6 @@ Item {
right: parent.right;
}
Image {
id: lowerKeyboardButton;
z: 999;
source: "images/lowerKeyboard.png";
anchors.right: keyboard.right;
anchors.top: keyboard.showMirrorText ? keyboard.top : undefined;
anchors.bottom: keyboard.showMirrorText ? undefined : keyboard.bottom;
height: 50;
width: 60;
MouseArea {
anchors.fill: parent;
onClicked: {
root.keyboardRaised = false;
}
}
}
HifiControlsUit.Keyboard {
id: keyboard;
raised: HMD.mounted && root.keyboardRaised;

View file

@ -82,20 +82,9 @@ Item {
if (focus) {
var hidePassword = (currentPassphraseField.echoMode === TextInput.Password);
sendSignalToWallet({method: 'walletSetup_raiseKeyboard', isPasswordField: hidePassword});
} else if (!passphraseFieldAgain.focus) {
sendSignalToWallet({method: 'walletSetup_lowerKeyboard', isPasswordField: false});
}
}
//MouseArea {
// anchors.fill: parent;
// onPressed: {
// var hidePassword = (currentPassphraseField.echoMode === TextInput.Password);
// sendSignalToWallet({method: 'walletSetup_raiseKeyboard', isPasswordField: hidePassword});
// mouse.accepted = false;
// }
//}
onAccepted: {
passphraseField.focus = true;
}
@ -115,21 +104,10 @@ Item {
activeFocusOnPress: true;
activeFocusOnTab: true;
//MouseArea {
// anchors.fill: parent;
// onPressed: {
// var hidePassword = (passphraseField.echoMode === TextInput.Password);
// sendSignalToWallet({method: 'walletSetup_raiseKeyboard', isPasswordField: hidePassword});
// mouse.accepted = false;
// }
//}
onFocusChanged: {
if (focus) {
var hidePassword = (passphraseField.echoMode === TextInput.Password);
sendMessageToLightbox({method: 'walletSetup_raiseKeyboard', isPasswordField: hidePassword});
} else if (!passphraseFieldAgain.focus) {
sendMessageToLightbox({method: 'walletSetup_lowerKeyboard', isPasswordField: false});
}
}
@ -151,21 +129,10 @@ Item {
activeFocusOnPress: true;
activeFocusOnTab: true;
//MouseArea {
// anchors.fill: parent;
// onPressed: {
// var hidePassword = (passphraseFieldAgain.echoMode === TextInput.Password);
// sendSignalToWallet({method: 'walletSetup_raiseKeyboard', isPasswordField: hidePassword});
// mouse.accepted = false;
// }
//}
onFocusChanged: {
if (focus) {
var hidePassword = (passphraseFieldAgain.echoMode === TextInput.Password);
sendMessageToLightbox({method: 'walletSetup_raiseKeyboard', isPasswordField: hidePassword});
} else if (!passphraseField.focus) {
sendMessageToLightbox({method: 'walletSetup_lowerKeyboard', isPasswordField: false});
}
}

View file

@ -49,6 +49,12 @@ Rectangle {
} else if (walletStatus === 1) {
if (root.activeView !== "walletSetup") {
root.activeView = "walletSetup";
commerce.resetLocalWalletOnly();
var timestamp = new Date();
walletSetup.startingTimestamp = timestamp;
walletSetup.setupAttemptID = generateUUID();
UserActivityLogger.commerceWalletSetupStarted(timestamp, setupAttemptID, walletSetup.setupFlowVersion, walletSetup.referrer ? walletSetup.referrer : "wallet app",
(AddressManager.placename || AddressManager.hostname || '') + (AddressManager.pathname ? AddressManager.pathname.match(/\/[^\/]+/)[0] : ''));
}
} else if (walletStatus === 2) {
if (root.activeView !== "passphraseModal") {
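The commerceWalletSetupStarted call above also records where the user was when setup began, building a location string from AddressManager; the regular expression keeps only the first path segment. A worked example with assumed values:

var placename = "";                              // assumed empty
var hostname  = "welcome.highfidelity.com";      // assumed value
var pathname  = "/12.5,3.0,-7.2/0,0,0,1";        // assumed value

var firstSegment = pathname ? pathname.match(/\/[^\/]+/)[0] : "";   // "/12.5,3.0,-7.2"
var location = (placename || hostname || "") + firstSegment;
console.log(location);   // "welcome.highfidelity.com/12.5,3.0,-7.2"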
@ -174,7 +180,7 @@ Rectangle {
Connections {
onSendSignalToWallet: {
if (msg.method === 'walletSetup_finished') {
if (msg.referrer === '') {
if (msg.referrer === '' || msg.referrer === 'marketplace cta') {
root.activeView = "initialize";
Commerce.getWalletStatus();
} else if (msg.referrer === 'purchases') {
@ -668,25 +674,6 @@ Rectangle {
right: parent.right;
}
Image {
id: lowerKeyboardButton;
z: 999;
source: "images/lowerKeyboard.png";
anchors.right: keyboard.right;
anchors.top: keyboard.showMirrorText ? keyboard.top : undefined;
anchors.bottom: keyboard.showMirrorText ? undefined : keyboard.bottom;
height: 50;
width: 60;
MouseArea {
anchors.fill: parent;
onClicked: {
root.keyboardRaised = false;
}
}
}
HifiControlsUit.Keyboard {
id: keyboard;
raised: HMD.mounted && root.keyboardRaised;
@ -721,12 +708,28 @@ Rectangle {
case 'updateWalletReferrer':
walletSetup.referrer = message.referrer;
break;
case 'inspectionCertificate_resetCert':
// NOP
break;
default:
console.log('Unrecognized message from wallet.js:', JSON.stringify(message));
}
}
signal sendToScript(var message);
// generateUUID() taken from:
// https://stackoverflow.com/a/8809472
function generateUUID() { // Public Domain/MIT
var d = new Date().getTime();
if (typeof performance !== 'undefined' && typeof performance.now === 'function'){
d += performance.now(); //use high-precision timer if available
}
return 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g, function (c) {
var r = (d + Math.random() * 16) % 16 | 0;
d = Math.floor(d / 16);
return (c === 'x' ? r : (r & 0x3 | 0x8)).toString(16);
});
}
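A short usage sketch for the generateUUID helper above; the printed value is only an example of the version-4-style string it produces:

var setupAttemptID = generateUUID();
console.log(setupAttemptID);   // e.g. "9b2f6c1a-5d3e-4f7a-b8c1-0d2e4f6a8c0b"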
//
// FUNCTION DEFINITIONS END
//

View file

@ -38,10 +38,28 @@ Item {
onHistoryResult : {
historyReceived = true;
if (result.status === 'success') {
transactionHistoryModel.clear();
transactionHistoryModel.append(result.data.history);
var sameItemCount = 0;
tempTransactionHistoryModel.clear();
tempTransactionHistoryModel.append(result.data.history);
for (var i = 0; i < tempTransactionHistoryModel.count; i++) {
if (!transactionHistoryModel.get(i)) {
sameItemCount = -1;
break;
} else if (tempTransactionHistoryModel.get(i).transaction_type === transactionHistoryModel.get(i).transaction_type &&
tempTransactionHistoryModel.get(i).text === transactionHistoryModel.get(i).text) {
sameItemCount++;
}
}
calculatePendingAndInvalidated();
if (sameItemCount !== tempTransactionHistoryModel.count) {
transactionHistoryModel.clear();
for (var i = 0; i < tempTransactionHistoryModel.count; i++) {
transactionHistoryModel.append(tempTransactionHistoryModel.get(i));
}
calculatePendingAndInvalidated();
}
}
refreshTimer.start();
}
@ -50,6 +68,7 @@ Item {
Connections {
target: GlobalServices
onMyUsernameChanged: {
transactionHistoryModel.clear();
usernameText.text = Account.username;
}
}
@ -143,10 +162,9 @@ Item {
Timer {
id: refreshTimer;
interval: 4000; // Remove this after demo?
interval: 4000;
onTriggered: {
console.log("Refreshing Wallet Home...");
historyReceived = false;
Commerce.balance();
Commerce.history();
}
@ -187,6 +205,9 @@ Item {
// Style
color: hifi.colors.baseGrayHighlight;
}
ListModel {
id: tempTransactionHistoryModel;
}
ListModel {
id: transactionHistoryModel;
}
@ -339,7 +360,7 @@ Item {
var a = new Date(timestamp);
var year = a.getFullYear();
var month = addLeadingZero(a.getMonth());
var month = addLeadingZero(a.getMonth() + 1);
var day = addLeadingZero(a.getDate());
var hour = a.getHours();
var drawnHour = hour;

View file

@ -31,6 +31,10 @@ Item {
property bool hasShownSecurityImageTip: false;
property string referrer;
property string keyFilePath;
property date startingTimestamp;
property string setupAttemptID;
readonly property int setupFlowVersion: 1;
readonly property var setupStepNames: [ "Setup Prompt", "Security Image Selection", "Passphrase Selection", "Private Keys Ready" ];
Image {
anchors.fill: parent;
@ -67,6 +71,13 @@ Item {
anchors.fill: parent;
}
onActiveViewChanged: {
var timestamp = new Date();
var currentStepNumber = root.activeView.substring(5);
UserActivityLogger.commerceWalletSetupProgress(timestamp, root.setupAttemptID,
Math.round((timestamp - root.startingTimestamp)/1000), currentStepNumber, root.setupStepNames[currentStepNumber - 1]);
}
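The handler above derives the step number from the view name ("step_1" through "step_4") and the elapsed time from the two timestamps. A small worked example with assumed values:

var activeView        = "step_3";                          // assumed current view name
var startingTimestamp = new Date(2017, 11, 7, 11, 21, 0);  // assumed
var timestamp         = new Date(2017, 11, 7, 11, 21, 24); // assumed

var currentStepNumber = activeView.substring(5);                 // "3"
var stepNames = [ "Setup Prompt", "Security Image Selection",
                  "Passphrase Selection", "Private Keys Ready" ];
var stepName = stepNames[currentStepNumber - 1];                          // "Passphrase Selection"
var elapsedSeconds = Math.round((timestamp - startingTimestamp) / 1000);  // 24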
//
// TITLE BAR START
//
@ -371,7 +382,7 @@ Item {
Item {
id: securityImageTip;
visible: false;
visible: !root.hasShownSecurityImageTip && root.activeView === "step_3";
z: 999;
anchors.fill: root;
@ -421,7 +432,6 @@ Item {
text: "Got It";
onClicked: {
root.hasShownSecurityImageTip = true;
securityImageTip.visible = false;
passphraseSelection.focusFirstTextField();
}
}
@ -439,9 +449,6 @@ Item {
onVisibleChanged: {
if (visible) {
Commerce.getWalletAuthenticatedStatus();
if (!root.hasShownSecurityImageTip) {
securityImageTip.visible = true;
}
}
}
@ -732,7 +739,11 @@ Item {
text: "Finish";
onClicked: {
root.visible = false;
root.hasShownSecurityImageTip = false;
sendSignalToWallet({method: 'walletSetup_finished', referrer: root.referrer ? root.referrer : ""});
var timestamp = new Date();
UserActivityLogger.commerceWalletSetupFinished(timestamp, setupAttemptID, Math.round((timestamp - root.startingTimestamp)/1000));
}
}
}

View file

@ -9,33 +9,30 @@ Overlay {
Image {
id: image
property bool scaleFix: true;
property real xOffset: 0
property real yOffset: 0
property bool scaleFix: true
property real xStart: 0
property real yStart: 0
property real xSize: 0
property real ySize: 0
property real imageScale: 1.0
property var resizer: Timer {
interval: 50
repeat: false
running: false
onTriggered: {
var targetAspect = root.width / root.height;
var sourceAspect = image.sourceSize.width / image.sourceSize.height;
if (sourceAspect <= targetAspect) {
if (root.width === image.sourceSize.width) {
return;
}
image.imageScale = root.width / image.sourceSize.width;
} else if (sourceAspect > targetAspect){
if (root.height === image.sourceSize.height) {
return;
}
image.imageScale = root.height / image.sourceSize.height;
if (image.xSize === 0) {
image.xSize = image.sourceSize.width - image.xStart;
}
image.sourceSize = Qt.size(image.sourceSize.width * image.imageScale, image.sourceSize.height * image.imageScale);
if (image.ySize === 0) {
image.ySize = image.sourceSize.height - image.yStart;
}
image.anchors.leftMargin = -image.xStart * root.width / image.xSize;
image.anchors.topMargin = -image.yStart * root.height / image.ySize;
image.anchors.rightMargin = (image.xStart + image.xSize - image.sourceSize.width) * root.width / image.xSize;
image.anchors.bottomMargin = (image.yStart + image.ySize - image.sourceSize.height) * root.height / image.ySize;
}
}
x: -1 * xOffset * imageScale
y: -1 * yOffset * imageScale
onSourceSizeChanged: {
if (sourceSize.width !== 0 && sourceSize.height !== 0 && progress === 1.0 && scaleFix) {
@ -43,6 +40,8 @@ Overlay {
resizer.start();
}
}
anchors.fill: parent
}
ColorOverlay {
@ -57,8 +56,10 @@ Overlay {
var key = keys[i];
var value = subImage[key];
switch (key) {
case "x": image.xOffset = value; break;
case "y": image.yOffset = value; break;
case "x": image.xStart = value; break;
case "y": image.yStart = value; break;
case "width": image.xSize = value; break;
case "height": image.ySize = value; break;
}
}
}
@ -78,6 +79,7 @@ Overlay {
case "imageURL": image.source = value; break;
case "subImage": updateSubImage(value); break;
case "color": color.color = Qt.rgba(value.red / 255, value.green / 255, value.blue / 255, root.opacity); break;
case "bounds": break; // The bounds property is handled in C++.
default: console.log("OVERLAY Unhandled image property " + key);
}
}
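The resizer changes in the first hunk of this file convert a subImage rectangle into negative anchor margins so that only the requested region of the source image shows through the overlay. A worked numeric example with assumed values:

var root  = { width: 100, height: 100 };                  // assumed overlay size
var image = { sourceSize: { width: 256, height: 256 },    // assumed source image
              xStart: 64, yStart: 64, xSize: 128, ySize: 128 };

var leftMargin   = -image.xStart * root.width  / image.xSize;    // -50
var topMargin    = -image.yStart * root.height / image.ySize;    // -50
var rightMargin  = (image.xStart + image.xSize - image.sourceSize.width)  * root.width  / image.xSize;   // -50
var bottomMargin = (image.yStart + image.ySize - image.sourceSize.height) * root.height / image.ySize;   // -50

With anchors.fill: parent, these negative margins stretch the Image to 200x200 and offset it so exactly the selected 128x128 sub-rectangle fills the 100x100 overlay.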

View file

@ -29,6 +29,7 @@ Overlay {
case "borderColor": rectangle.border.color = Qt.rgba(value.red / 255, value.green / 255, value.blue / 255, rectangle.border.color.a); break;
case "borderWidth": rectangle.border.width = value; break;
case "radius": rectangle.radius = value; break;
case "bounds": break; // The bounds property is handled in C++.
default: console.warn("OVERLAY Unhandled rectangle property " + key);
}
}

View file

@ -46,6 +46,7 @@ Overlay {
case "backgroundColor": background.color = Qt.rgba(value.red / 255, value.green / 255, value.blue / 255, background.color.a); break;
case "font": textField.font.pixelSize = value.size; break;
case "lineHeight": textField.lineHeight = value; break;
case "bounds": break; // The bounds property is handled in C++.
default: console.warn("OVERLAY text unhandled property " + key);
}
}

View file

@ -36,7 +36,9 @@ Rectangle {
readonly property bool hmdHead: headBox.checked
readonly property bool headPuck: headPuckBox.checked
readonly property bool handController: handBox.checked
readonly property bool handPuck: handPuckBox.checked
readonly property bool hmdDesktop: hmdInDesktop.checked
property int state: buttonState.disabled
property var lastConfiguration: null
@ -53,10 +55,6 @@ Rectangle {
}
MouseArea {
id: mouseArea
@ -99,6 +97,7 @@ Rectangle {
onClicked: {
if (checked) {
headPuckBox.checked = false;
hmdInDesktop.checked = false;
} else {
checked = true;
}
@ -121,6 +120,7 @@ Rectangle {
onClicked: {
if (checked) {
headBox.checked = false;
hmdInDesktop.checked = false;
} else {
checked = true;
}
@ -133,6 +133,36 @@ Rectangle {
text: "Tracker"
color: hifi.colors.lightGrayText
}
HifiControls.CheckBox {
id: hmdInDesktop
width: 15
height: 15
boxRadius: 7
visible: viveInDesktop.checked
anchors.top: viveInDesktop.bottom
anchors.topMargin: 5
anchors.left: openVrConfiguration.left
anchors.leftMargin: leftMargin + 10
onClicked: {
if (checked) {
headBox.checked = false;
headPuckBox.checked = false;
} else {
checked = true;
}
sendConfigurationSettings();
}
}
RalewayBold {
size: 12
visible: viveInDesktop.checked
text: "None"
color: hifi.colors.lightGrayText
}
}
Row {
@ -773,6 +803,11 @@ Rectangle {
anchors.leftMargin: leftMargin + 10
onClicked: {
if (!checked && hmdInDesktop.checked) {
headBox.checked = true;
headPuckBox.checked = false;
hmdInDesktop.checked = false;
}
sendConfigurationSettings();
}
}
@ -789,6 +824,7 @@ Rectangle {
verticalCenter: viveInDesktop.verticalCenter
}
}
NumberAnimation {
id: numberAnimation
@ -797,6 +833,7 @@ Rectangle {
to: 0
}
function logAction(action, status) {
console.log("calibrated from ui");
var data = {
@ -877,6 +914,7 @@ Rectangle {
var HmdHead = settings["HMDHead"];
var viveController = settings["handController"];
var desktopMode = settings["desktopMode"];
var hmdDesktopPosition = settings["hmdDesktopTracking"];
armCircumference.value = settings.armCircumference;
shoulderWidth.value = settings.shoulderWidth;
@ -898,6 +936,7 @@ Rectangle {
}
viveInDesktop.checked = desktopMode;
hmdInDesktop.checked = hmdDesktopPosition;
initializeButtonState();
updateCalibrationText();
@ -1058,7 +1097,8 @@ Rectangle {
"handConfiguration": handObject,
"armCircumference": armCircumference.value,
"shoulderWidth": shoulderWidth.value,
"desktopMode": viveInDesktop.checked
"desktopMode": viveInDesktop.checked,
"hmdDesktopTracking": hmdInDesktop.checked
}
return settingsObject;

View file

@ -5,7 +5,9 @@ import TabletScriptingInterface 1.0
Item {
id: tabletButton
property string captionColorOverride: ""
property color defaultCaptionColor: "#ffffff"
property color captionColor: defaultCaptionColor
property var uuid;
property string icon: "icons/tablet-icons/edit-i.svg"
property string hoverIcon: tabletButton.icon
@ -105,7 +107,7 @@ Item {
Text {
id: text
color: captionColorOverride !== "" ? captionColorOverride: "#ffffff"
color: captionColor
text: tabletButton.text
font.bold: true
font.pixelSize: 18
@ -170,7 +172,7 @@ Item {
PropertyChanges {
target: text
color: captionColorOverride !== "" ? captionColorOverride: "#ffffff"
color: captionColor
text: tabletButton.hoverText
}
@ -196,7 +198,7 @@ Item {
PropertyChanges {
target: text
color: captionColorOverride !== "" ? captionColorOverride: "#333333"
color: captionColor !== defaultCaptionColor ? captionColor : "#333333"
text: tabletButton.activeText
}
@ -227,7 +229,7 @@ Item {
PropertyChanges {
target: text
color: captionColorOverride !== "" ? captionColorOverride: "#333333"
color: captionColor !== defaultCaptionColor ? captionColor : "#333333"
text: tabletButton.activeHoverText
}

View file

@ -40,37 +40,29 @@ Item {
CheckBox {
id: checkbox
// FIXME: Should use radio buttons if source.exclusiveGroup.
width: 20
visible: source !== null ?
source.visible && source.type === 1 && source.checkable && !source.exclusiveGroup :
false
checked: setChecked()
function setChecked() {
if (!source || source.type !== 1 || !source.checkable) {
return false;
}
// FIXME this works for native QML menus but I don't think it will
// for proxied QML menus
return source.checked;
Binding on checked {
value: source.checked;
when: source && source.type === 1 && source.checkable && !source.exclusiveGroup;
}
}
RadioButton {
id: radiobutton
// FIXME: Should use radio buttons if source.exclusiveGroup.
width: 20
visible: source !== null ?
source.visible && source.type === 1 && source.checkable && source.exclusiveGroup :
false
checked: setChecked()
function setChecked() {
if (!source || source.type !== 1 || !source.checkable) {
return false;
}
// FIXME this works for native QML menus but I don't think it will
// for proxied QML menus
return source.checked;
Binding on checked {
value: source.checked;
when: source && source.type === 1 && source.checkable && source.exclusiveGroup;
}
}
}

View file

@ -66,7 +66,6 @@ Item {
function toModel(items, newMenu) {
var result = modelMaker.createObject(tabletMenu);
var exclusionGroups = {};
for (var i = 0; i < items.length; ++i) {
var item = items[i];
@ -78,28 +77,6 @@ Item {
if (item.text !== "Users Online") {
result.append({"name": item.text, "item": item})
}
for(var j = 0; j < tabletMenu.rootMenu.exclusionGroupsByMenuItem.count; ++j)
{
var entry = tabletMenu.rootMenu.exclusionGroupsByMenuItem.get(j);
if(entry.menuItem == item.toString())
{
var exclusionGroupId = entry.exclusionGroup;
console.debug('item exclusionGroupId: ', exclusionGroupId)
if(!exclusionGroups[exclusionGroupId])
{
exclusionGroups[exclusionGroupId] = exclusiveGroupMaker.createObject(newMenu);
console.debug('new exclusion group created: ', exclusionGroups[exclusionGroupId])
}
var exclusionGroup = exclusionGroups[exclusionGroupId];
item.exclusiveGroup = exclusionGroup
console.debug('item.exclusiveGroup: ', item.exclusiveGroup)
}
}
break;
case MenuItemType.Separator:
result.append({"name": "", "item": item})

View file

@ -4,7 +4,9 @@ import QtQuick.Controls 1.4
StateImage {
id: button
property string captionColorOverride: ""
property color defaultCaptionColor: "#ffffff"
property color captionColor: defaultCaptionColor
property bool buttonEnabled: true
property bool isActive: false
property bool isEntered: false
@ -99,7 +101,7 @@ StateImage {
Text {
id: caption
color: captionColorOverride !== "" ? captionColorOverride: (button.isActive ? "#000000" : "#ffffff")
color: button.isActive ? "#000000" : captionColor
text: button.isActive ? (button.isEntered ? button.activeHoverText : button.activeText) : (button.isEntered ? button.hoverText : button.text)
font.bold: false
font.pixelSize: 9

View file

@ -162,7 +162,7 @@ Item {
readonly property real controlLineHeight: 28 // Height of spinbox control on 1920 x 1080 monitor
readonly property real controlInterlineHeight: 21 // 75% of controlLineHeight
readonly property vector2d menuPadding: Qt.vector2d(14, 102)
readonly property real scrollbarBackgroundWidth: 18
readonly property real scrollbarBackgroundWidth: 20
readonly property real scrollbarHandleWidth: scrollbarBackgroundWidth - 2
readonly property real tabletMenuHeader: 90
}

View file

@ -194,8 +194,13 @@
#include <EntityScriptClient.h>
#include <ModelScriptingInterface.h>
#include <PickManager.h>
#include <PointerManager.h>
#include <raypick/RayPickScriptingInterface.h>
#include <raypick/LaserPointerScriptingInterface.h>
#include <raypick/PickScriptingInterface.h>
#include <raypick/PointerScriptingInterface.h>
#include <raypick/MouseRayPick.h>
#include <FadeEffect.h>
@ -203,6 +208,8 @@
#include "commerce/Wallet.h"
#include "commerce/QmlCommerce.h"
#include "webbrowser/WebBrowserSuggestionsEngine.h"
// On Windows PC, NVidia Optimus laptop, we want to enable NVIDIA GPU
// FIXME seems to be broken.
#if defined(Q_OS_WIN)
@ -618,6 +625,12 @@ bool setupEssentials(int& argc, char** argv, bool runningMarkerExisted) {
DependencyManager::registerInheritance<SpatialParentFinder, InterfaceParentFinder>();
// Set dependencies
DependencyManager::set<PickManager>();
DependencyManager::set<PointerManager>();
DependencyManager::set<LaserPointerScriptingInterface>();
DependencyManager::set<RayPickScriptingInterface>();
DependencyManager::set<PointerScriptingInterface>();
DependencyManager::set<PickScriptingInterface>();
DependencyManager::set<Cursor::Manager>();
DependencyManager::set<AccountManager>(std::bind(&Application::getUserAgent, qApp));
DependencyManager::set<StatTracker>();
@ -698,9 +711,6 @@ bool setupEssentials(int& argc, char** argv, bool runningMarkerExisted) {
DependencyManager::set<FadeEffect>();
DependencyManager::set<LaserPointerScriptingInterface>();
DependencyManager::set<RayPickScriptingInterface>();
return previousSessionCrashed;
}
@ -893,7 +903,7 @@ Application::Application(int& argc, char** argv, QElapsedTimer& startupTimer, bo
connect(audioIO.data(), &AudioClient::muteEnvironmentRequested, [](glm::vec3 position, float radius) {
auto audioClient = DependencyManager::get<AudioClient>();
auto audioScriptingInterface = DependencyManager::get<AudioScriptingInterface>();
auto myAvatarPosition = DependencyManager::get<AvatarManager>()->getMyAvatar()->getPosition();
auto myAvatarPosition = DependencyManager::get<AvatarManager>()->getMyAvatar()->getWorldPosition();
float distance = glm::distance(myAvatarPosition, position);
bool shouldMute = !audioClient->isMuted() && (distance < radius);
@ -966,8 +976,8 @@ Application::Application(int& argc, char** argv, QElapsedTimer& startupTimer, bo
auto addressManager = DependencyManager::get<AddressManager>();
// use our MyAvatar position and quat for address manager path
addressManager->setPositionGetter([this]{ return getMyAvatar()->getPosition(); });
addressManager->setOrientationGetter([this]{ return getMyAvatar()->getOrientation(); });
addressManager->setPositionGetter([this]{ return getMyAvatar()->getWorldPosition(); });
addressManager->setOrientationGetter([this]{ return getMyAvatar()->getWorldOrientation(); });
connect(addressManager.data(), &AddressManager::hostChanged, this, &Application::updateWindowTitle);
connect(this, &QCoreApplication::aboutToQuit, addressManager.data(), &AddressManager::storeCurrentAddress);
@ -1183,6 +1193,10 @@ Application::Application(int& argc, char** argv, QElapsedTimer& startupTimer, bo
_entityEditSender.setServerJurisdictions(&_entityServerJurisdictions);
_entityEditSender.setMyAvatar(myAvatar.get());
// The entity octree will have to know about MyAvatar for the parentJointName import
getEntities()->getTree()->setMyAvatar(myAvatar);
_entityClipboard->setMyAvatar(myAvatar);
// For now we're going to set the PPS for outbound packets to be super high; this is
// probably not the right long-term solution, but for now it allows you to move an
// entity around in your hand
@ -1464,13 +1478,15 @@ Application::Application(int& argc, char** argv, QElapsedTimer& startupTimer, bo
// If the user clicks on an entity, we will check that it's an unlocked web entity, and if so, set the focus to it
auto entityScriptingInterface = DependencyManager::get<EntityScriptingInterface>();
connect(entityScriptingInterface.data(), &EntityScriptingInterface::clickDownOnEntity,
connect(entityScriptingInterface.data(), &EntityScriptingInterface::mousePressOnEntity,
[this](const EntityItemID& entityItemID, const PointerEvent& event) {
if (getEntities()->wantsKeyboardFocus(entityItemID)) {
setKeyboardFocusOverlay(UNKNOWN_OVERLAY_ID);
setKeyboardFocusEntity(entityItemID);
} else {
setKeyboardFocusEntity(UNKNOWN_ENTITY_ID);
if (event.shouldFocus()) {
if (getEntities()->wantsKeyboardFocus(entityItemID)) {
setKeyboardFocusOverlay(UNKNOWN_OVERLAY_ID);
setKeyboardFocusEntity(entityItemID);
} else {
setKeyboardFocusEntity(UNKNOWN_ENTITY_ID);
}
}
});
@ -1501,7 +1517,7 @@ Application::Application(int& argc, char** argv, QElapsedTimer& startupTimer, bo
static const QString FAST_STATS_ARG = "--fast-heartbeat";
static int SEND_STATS_INTERVAL_MS = arguments().indexOf(FAST_STATS_ARG) != -1 ? 1000 : 10000;
static glm::vec3 lastAvatarPosition = myAvatar->getPosition();
static glm::vec3 lastAvatarPosition = myAvatar->getWorldPosition();
static glm::mat4 lastHMDHeadPose = getHMDSensorPose();
static controller::Pose lastLeftHandPose = myAvatar->getLeftHandPose();
static controller::Pose lastRightHandPose = myAvatar->getRightHandPose();
@ -1629,7 +1645,7 @@ Application::Application(int& argc, char** argv, QElapsedTimer& startupTimer, bo
properties["bytes_downloaded"] = bytesDownloaded;
auto myAvatar = getMyAvatar();
glm::vec3 avatarPosition = myAvatar->getPosition();
glm::vec3 avatarPosition = myAvatar->getWorldPosition();
properties["avatar_has_moved"] = lastAvatarPosition != avatarPosition;
lastAvatarPosition = avatarPosition;
@ -1709,7 +1725,7 @@ Application::Application(int& argc, char** argv, QElapsedTimer& startupTimer, bo
checkNearbyAvatarsTimer->setInterval(CHECK_NEARBY_AVATARS_INTERVAL_MS); // 10 seconds, Qt::CoarseTimer ok
connect(checkNearbyAvatarsTimer, &QTimer::timeout, this, [this]() {
auto avatarManager = DependencyManager::get<AvatarManager>();
int nearbyAvatars = avatarManager->numberOfAvatarsInRange(avatarManager->getMyAvatar()->getPosition(),
int nearbyAvatars = avatarManager->numberOfAvatarsInRange(avatarManager->getMyAvatar()->getWorldPosition(),
NEARBY_AVATAR_RADIUS_METERS) - 1;
if (nearbyAvatars != lastCountOfNearbyAvatars) {
lastCountOfNearbyAvatars = nearbyAvatars;
@ -1798,25 +1814,35 @@ Application::Application(int& argc, char** argv, QElapsedTimer& startupTimer, bo
connect(&_myCamera, &Camera::modeUpdated, this, &Application::cameraModeChanged);
DependencyManager::get<PickManager>()->setShouldPickHUDOperator([&]() { return DependencyManager::get<HMDScriptingInterface>()->isHMDMode(); });
DependencyManager::get<PickManager>()->setCalculatePos2DFromHUDOperator([&](const glm::vec3& intersection) {
const glm::vec2 MARGIN(25.0f);
glm::vec2 maxPos = _controllerScriptingInterface->getViewportDimensions() - MARGIN;
glm::vec2 pos2D = DependencyManager::get<HMDScriptingInterface>()->overlayFromWorldPoint(intersection);
return glm::max(MARGIN, glm::min(pos2D, maxPos));
});
// Setup the mouse ray pick and related operators
DependencyManager::get<EntityTreeRenderer>()->setMouseRayPickID(_rayPickManager.createRayPick(
RayPickFilter(DependencyManager::get<RayPickScriptingInterface>()->PICK_ENTITIES() | DependencyManager::get<RayPickScriptingInterface>()->PICK_INCLUDE_NONCOLLIDABLE()),
0.0f, true));
DependencyManager::get<EntityTreeRenderer>()->setMouseRayPickResultOperator([&](QUuid rayPickID) {
DependencyManager::get<EntityTreeRenderer>()->setMouseRayPickID(DependencyManager::get<PickManager>()->addPick(PickQuery::Ray, std::make_shared<MouseRayPick>(
PickFilter(PickScriptingInterface::PICK_ENTITIES() | PickScriptingInterface::PICK_INCLUDE_NONCOLLIDABLE()), 0.0f, true)));
DependencyManager::get<EntityTreeRenderer>()->setMouseRayPickResultOperator([&](unsigned int rayPickID) {
RayToEntityIntersectionResult entityResult;
RayPickResult result = _rayPickManager.getPrevRayPickResult(rayPickID);
entityResult.intersects = result.type != DependencyManager::get<RayPickScriptingInterface>()->INTERSECTED_NONE();
if (entityResult.intersects) {
entityResult.intersection = result.intersection;
entityResult.distance = result.distance;
entityResult.surfaceNormal = result.surfaceNormal;
entityResult.entityID = result.objectID;
entityResult.entity = DependencyManager::get<EntityTreeRenderer>()->getTree()->findEntityByID(entityResult.entityID);
entityResult.intersects = false;
auto pickResult = DependencyManager::get<PickManager>()->getPrevPickResultTyped<RayPickResult>(rayPickID);
if (pickResult) {
entityResult.intersects = pickResult->type != IntersectionType::NONE;
if (entityResult.intersects) {
entityResult.intersection = pickResult->intersection;
entityResult.distance = pickResult->distance;
entityResult.surfaceNormal = pickResult->surfaceNormal;
entityResult.entityID = pickResult->objectID;
entityResult.entity = DependencyManager::get<EntityTreeRenderer>()->getTree()->findEntityByID(entityResult.entityID);
}
}
return entityResult;
});
DependencyManager::get<EntityTreeRenderer>()->setSetPrecisionPickingOperator([&](QUuid rayPickID, bool value) {
_rayPickManager.setPrecisionPicking(rayPickID, value);
DependencyManager::get<EntityTreeRenderer>()->setSetPrecisionPickingOperator([&](unsigned int rayPickID, bool value) {
DependencyManager::get<PickManager>()->setPrecisionPicking(rayPickID, value);
});
// Preload Tablet sounds
@ -2258,6 +2284,7 @@ void Application::initializeUi() {
}, callback);
qmlRegisterType<ResourceImageItem>("Hifi", 1, 0, "ResourceImageItem");
qmlRegisterType<Preference>("Hifi", 1, 0, "Preference");
qmlRegisterType<WebBrowserSuggestionsEngine>("HifiWeb", 1, 0, "WebBrowserSuggestionsEngine");
{
auto tabletScriptingInterface = DependencyManager::get<TabletScriptingInterface>();
@ -2436,7 +2463,7 @@ void Application::updateCamera(RenderArgs& renderArgs) {
auto hmdWorldMat = myAvatar->getSensorToWorldMatrix() * myAvatar->getHMDSensorMatrix();
_myCamera.setOrientation(glm::normalize(glmExtractRotation(hmdWorldMat)));
_myCamera.setPosition(extractTranslation(hmdWorldMat) +
myAvatar->getOrientation() * boomOffset);
myAvatar->getWorldOrientation() * boomOffset);
}
else {
_myCamera.setOrientation(myAvatar->getHead()->getOrientation());
@ -2446,13 +2473,13 @@ void Application::updateCamera(RenderArgs& renderArgs) {
}
else {
_myCamera.setPosition(myAvatar->getDefaultEyePosition()
+ myAvatar->getOrientation() * boomOffset);
+ myAvatar->getWorldOrientation() * boomOffset);
}
}
}
else if (_myCamera.getMode() == CAMERA_MODE_MIRROR) {
if (isHMDMode()) {
auto mirrorBodyOrientation = myAvatar->getOrientation() * glm::quat(glm::vec3(0.0f, PI + _rotateMirror, 0.0f));
auto mirrorBodyOrientation = myAvatar->getWorldOrientation() * glm::quat(glm::vec3(0.0f, PI + _rotateMirror, 0.0f));
glm::quat hmdRotation = extractRotation(myAvatar->getHMDSensorMatrix());
// Mirror HMD yaw and roll
@ -2475,11 +2502,11 @@ void Application::updateCamera(RenderArgs& renderArgs) {
+ mirrorBodyOrientation * hmdOffset);
}
else {
_myCamera.setOrientation(myAvatar->getOrientation()
_myCamera.setOrientation(myAvatar->getWorldOrientation()
* glm::quat(glm::vec3(0.0f, PI + _rotateMirror, 0.0f)));
_myCamera.setPosition(myAvatar->getDefaultEyePosition()
+ glm::vec3(0, _raiseMirror * myAvatar->getModelScale(), 0)
+ (myAvatar->getOrientation() * glm::quat(glm::vec3(0.0f, _rotateMirror, 0.0f))) *
+ (myAvatar->getWorldOrientation() * glm::quat(glm::vec3(0.0f, _rotateMirror, 0.0f))) *
glm::vec3(0.0f, 0.0f, -1.0f) * MIRROR_FULLSCREEN_DISTANCE * _scaleMirror);
}
renderArgs._renderMode = RenderArgs::MIRROR_RENDER_MODE;
@ -2489,13 +2516,13 @@ void Application::updateCamera(RenderArgs& renderArgs) {
if (cameraEntity != nullptr) {
if (isHMDMode()) {
glm::quat hmdRotation = extractRotation(myAvatar->getHMDSensorMatrix());
_myCamera.setOrientation(cameraEntity->getRotation() * hmdRotation);
_myCamera.setOrientation(cameraEntity->getWorldOrientation() * hmdRotation);
glm::vec3 hmdOffset = extractTranslation(myAvatar->getHMDSensorMatrix());
_myCamera.setPosition(cameraEntity->getPosition() + (hmdRotation * hmdOffset));
_myCamera.setPosition(cameraEntity->getWorldPosition() + (hmdRotation * hmdOffset));
}
else {
_myCamera.setOrientation(cameraEntity->getRotation());
_myCamera.setPosition(cameraEntity->getPosition());
_myCamera.setOrientation(cameraEntity->getWorldOrientation());
_myCamera.setPosition(cameraEntity->getWorldPosition());
}
}
}
@ -3311,7 +3338,7 @@ void Application::mouseMoveEvent(QMouseEvent* event) {
auto offscreenUi = DependencyManager::get<OffscreenUi>();
auto eventPosition = compositor.getMouseEventPosition(event);
QPointF transformedPos = offscreenUi->mapToVirtualScreen(eventPosition, _glWidget);
QPointF transformedPos = offscreenUi->mapToVirtualScreen(eventPosition);
auto button = event->button();
auto buttons = event->buttons();
// Determine if the ReticleClick Action is 1 and if so, fake include the LeftMouseButton
@ -3357,7 +3384,7 @@ void Application::mousePressEvent(QMouseEvent* event) {
offscreenUi->unfocusWindows();
auto eventPosition = getApplicationCompositor().getMouseEventPosition(event);
QPointF transformedPos = offscreenUi->mapToVirtualScreen(eventPosition, _glWidget);
QPointF transformedPos = offscreenUi->mapToVirtualScreen(eventPosition);
QMouseEvent mappedEvent(event->type(),
transformedPos,
event->screenPos(), event->button(),
@ -3387,7 +3414,7 @@ void Application::mousePressEvent(QMouseEvent* event) {
void Application::mouseDoublePressEvent(QMouseEvent* event) {
auto offscreenUi = DependencyManager::get<OffscreenUi>();
auto eventPosition = getApplicationCompositor().getMouseEventPosition(event);
QPointF transformedPos = offscreenUi->mapToVirtualScreen(eventPosition, _glWidget);
QPointF transformedPos = offscreenUi->mapToVirtualScreen(eventPosition);
QMouseEvent mappedEvent(event->type(),
transformedPos,
event->screenPos(), event->button(),
@ -3413,7 +3440,7 @@ void Application::mouseReleaseEvent(QMouseEvent* event) {
auto offscreenUi = DependencyManager::get<OffscreenUi>();
auto eventPosition = getApplicationCompositor().getMouseEventPosition(event);
QPointF transformedPos = offscreenUi->mapToVirtualScreen(eventPosition, _glWidget);
QPointF transformedPos = offscreenUi->mapToVirtualScreen(eventPosition);
QMouseEvent mappedEvent(event->type(),
transformedPos,
event->screenPos(), event->button(),
@ -3906,16 +3933,16 @@ void Application::idle() {
if (!_keyboardFocusedEntity.get().isInvalidID()) {
auto entity = getEntities()->getTree()->findEntityByID(_keyboardFocusedEntity.get());
if (entity && _keyboardFocusHighlight) {
_keyboardFocusHighlight->setRotation(entity->getRotation());
_keyboardFocusHighlight->setPosition(entity->getPosition());
_keyboardFocusHighlight->setWorldOrientation(entity->getWorldOrientation());
_keyboardFocusHighlight->setWorldPosition(entity->getWorldPosition());
}
} else {
// Only Web overlays can have focus.
auto overlay =
std::dynamic_pointer_cast<Web3DOverlay>(getOverlays().getOverlay(_keyboardFocusedOverlay.get()));
if (overlay && _keyboardFocusHighlight) {
_keyboardFocusHighlight->setRotation(overlay->getRotation());
_keyboardFocusHighlight->setPosition(overlay->getPosition());
_keyboardFocusHighlight->setWorldOrientation(overlay->getWorldOrientation());
_keyboardFocusHighlight->setWorldPosition(overlay->getWorldPosition());
}
}
}
@ -4031,6 +4058,7 @@ bool Application::exportEntities(const QString& filename,
auto entityTree = getEntities()->getTree();
auto exportTree = std::make_shared<EntityTree>();
exportTree->setMyAvatar(getMyAvatar());
exportTree->createRootElement();
glm::vec3 root(TREE_SCALE, TREE_SCALE, TREE_SCALE);
bool success = true;
@ -4049,7 +4077,7 @@ bool Application::exportEntities(const QString& filename,
!entityIDs.contains(parentID) ||
!entityTree->findEntityByEntityItemID(parentID))) {
// If parent wasn't selected, we want absolute position, which isn't in properties.
auto position = entityItem->getPosition();
auto position = entityItem->getWorldPosition();
root.x = glm::min(root.x, position.x);
root.y = glm::min(root.y, position.y);
root.z = glm::min(root.z, position.z);
@ -4252,7 +4280,7 @@ void Application::init() {
return 0.0f;
}
auto distance = glm::distance(getMyAvatar()->getPosition(), item.getPosition());
auto distance = glm::distance(getMyAvatar()->getWorldPosition(), item.getWorldPosition());
return atan2(maxSize, distance);
});
@ -4329,7 +4357,7 @@ void Application::updateMyAvatarLookAtPosition() {
// TODO -- this code is probably wrong, getHeadPose() returns something in sensor frame, not avatar
glm::mat4 headPose = getActiveDisplayPlugin()->getHeadPose();
glm::quat hmdRotation = glm::quat_cast(headPose);
lookAtSpot = _myCamera.getPosition() + myAvatar->getOrientation() * (hmdRotation * lookAtPosition);
lookAtSpot = _myCamera.getPosition() + myAvatar->getWorldOrientation() * (hmdRotation * lookAtPosition);
} else {
lookAtSpot = myAvatar->getHead()->getEyePosition()
+ (myAvatar->getHead()->getFinalOrientationInWorldFrame() * lookAtPosition);
@ -4558,8 +4586,8 @@ void Application::setKeyboardFocusHighlight(const glm::vec3& position, const glm
}
// Position focus
_keyboardFocusHighlight->setRotation(rotation);
_keyboardFocusHighlight->setPosition(position);
_keyboardFocusHighlight->setWorldOrientation(rotation);
_keyboardFocusHighlight->setWorldPosition(position);
_keyboardFocusHighlight->setDimensions(dimensions);
_keyboardFocusHighlight->setVisible(true);
}
@ -4596,7 +4624,7 @@ void Application::setKeyboardFocusEntity(const EntityItemID& entityItemID) {
}
_lastAcceptedKeyPress = usecTimestampNow();
setKeyboardFocusHighlight(entity->getPosition(), entity->getRotation(),
setKeyboardFocusHighlight(entity->getWorldPosition(), entity->getWorldOrientation(),
entity->getDimensions() * FOCUS_HIGHLIGHT_EXPANSION_FACTOR);
}
}
@ -4633,7 +4661,7 @@ void Application::setKeyboardFocusOverlay(const OverlayID& overlayID) {
if (overlay->getProperty("showKeyboardFocusHighlight").toBool()) {
auto size = overlay->getSize() * FOCUS_HIGHLIGHT_EXPANSION_FACTOR;
const float OVERLAY_DEPTH = 0.0105f;
setKeyboardFocusHighlight(overlay->getPosition(), overlay->getRotation(), glm::vec3(size.x, size.y, OVERLAY_DEPTH));
setKeyboardFocusHighlight(overlay->getWorldPosition(), overlay->getWorldOrientation(), glm::vec3(size.x, size.y, OVERLAY_DEPTH));
} else if (_keyboardFocusHighlight) {
_keyboardFocusHighlight->setVisible(false);
}
@ -4725,7 +4753,7 @@ void Application::update(float deltaTime) {
controller::InputCalibrationData calibrationData = {
myAvatar->getSensorToWorldMatrix(),
createMatFromQuatAndPos(myAvatar->getOrientation(), myAvatar->getPosition()),
createMatFromQuatAndPos(myAvatar->getWorldOrientation(), myAvatar->getWorldPosition()),
myAvatar->getHMDSensorMatrix(),
myAvatar->getCenterEyeCalibrationMat(),
myAvatar->getHeadCalibrationMat(),
@ -4835,7 +4863,7 @@ void Application::update(float deltaTime) {
};
// copy controller poses from userInputMapper to myAvatar.
glm::mat4 myAvatarMatrix = createMatFromQuatAndPos(myAvatar->getOrientation(), myAvatar->getPosition());
glm::mat4 myAvatarMatrix = createMatFromQuatAndPos(myAvatar->getWorldOrientation(), myAvatar->getWorldPosition());
glm::mat4 worldToSensorMatrix = glm::inverse(myAvatar->getSensorToWorldMatrix());
glm::mat4 avatarToSensorMatrix = worldToSensorMatrix * myAvatarMatrix;
for (auto& action : avatarControllerActions) {
@ -4975,13 +5003,13 @@ void Application::update(float deltaTime) {
// TODO: break these out into distinct perfTimers when they prove interesting
{
PROFILE_RANGE(app, "RayPickManager");
_rayPickManager.update();
PROFILE_RANGE(app, "PickManager");
DependencyManager::get<PickManager>()->update();
}
{
PROFILE_RANGE(app, "LaserPointerManager");
_laserPointerManager.update();
PROFILE_RANGE(app, "PointerManager");
DependencyManager::get<PointerManager>()->update();
}
{
@ -5451,7 +5479,7 @@ std::shared_ptr<MyAvatar> Application::getMyAvatar() const {
}
glm::vec3 Application::getAvatarPosition() const {
return getMyAvatar()->getPosition();
return getMyAvatar()->getWorldPosition();
}
void Application::copyViewFrustum(ViewFrustum& viewOut) const {
@ -5535,6 +5563,8 @@ void Application::clearDomainOctreeDetails() {
DependencyManager::get<ModelCache>()->clearUnusedResources();
DependencyManager::get<SoundCache>()->clearUnusedResources();
DependencyManager::get<TextureCache>()->clearUnusedResources();
getMyAvatar()->setAvatarEntityDataChanged(true);
}
void Application::clearDomainAvatars() {
@ -5664,7 +5694,7 @@ bool Application::nearbyEntitiesAreReadyForPhysics() {
// whose bounding boxes cannot be computed (it is too loose for our purposes here). Instead we manufacture
// custom filters and use the general-purpose EntityTree::findEntities(filter, ...)
QVector<EntityItemPointer> entities;
AABox avatarBox(getMyAvatar()->getPosition() - glm::vec3(PHYSICS_READY_RANGE), glm::vec3(2 * PHYSICS_READY_RANGE));
AABox avatarBox(getMyAvatar()->getWorldPosition() - glm::vec3(PHYSICS_READY_RANGE), glm::vec3(2 * PHYSICS_READY_RANGE));
// create two functions that use avatarBox (entityScan and elementScan), the second calls the first
std::function<bool (EntityItemPointer&)> entityScan = [=](EntityItemPointer& entity) {
if (entity->shouldBePhysical()) {
@ -5866,6 +5896,8 @@ void Application::registerScriptEngineWithApplicationServices(ScriptEnginePointe
scriptEngine->registerGlobalObject("RayPick", DependencyManager::get<RayPickScriptingInterface>().data());
scriptEngine->registerGlobalObject("LaserPointers", DependencyManager::get<LaserPointerScriptingInterface>().data());
scriptEngine->registerGlobalObject("Picks", DependencyManager::get<PickScriptingInterface>().data());
scriptEngine->registerGlobalObject("Pointers", DependencyManager::get<PointerScriptingInterface>().data());
// Caches
scriptEngine->registerGlobalObject("AnimationCache", DependencyManager::get<AnimationCache>().data());
@ -5923,6 +5955,8 @@ void Application::registerScriptEngineWithApplicationServices(ScriptEnginePointe
qScriptRegisterMetaType(scriptEngine.data(), OverlayIDtoScriptValue, OverlayIDfromScriptValue);
DependencyManager::get<PickScriptingInterface>()->registerMetaTypes(scriptEngine.data());
// connect this script engine's printedMessage signal to the global ScriptEngines so these messages are forwarded
connect(scriptEngine.data(), &ScriptEngine::printedMessage,
DependencyManager::get<ScriptEngines>().data(), &ScriptEngines::onPrintedMessage);
@ -6164,46 +6198,60 @@ bool Application::askToReplaceDomainContent(const QString& url) {
"and restoring content, visit the documentation page at: ", MAX_CHARACTERS_PER_LINE) +
"\nhttps://docs.highfidelity.com/create-and-explore/start-working-in-your-sandbox/restoring-sandbox-content";
bool agreeToReplaceContent = false; // assume false
agreeToReplaceContent = QMessageBox::Yes == OffscreenUi::question("Are you sure you want to replace this domain's content set?",
infoText, QMessageBox::Yes | QMessageBox::No, QMessageBox::No);
ModalDialogListener* dig = OffscreenUi::asyncQuestion("Are you sure you want to replace this domain's content set?",
infoText, QMessageBox::Yes | QMessageBox::No, QMessageBox::No);
if (agreeToReplaceContent) {
// Given confirmation, send request to domain server to replace content
qCDebug(interfaceapp) << "Attempting to replace domain content: " << url;
QByteArray urlData(url.toUtf8());
auto limitedNodeList = DependencyManager::get<LimitedNodeList>();
limitedNodeList->eachMatchingNode([](const SharedNodePointer& node) {
return node->getType() == NodeType::EntityServer && node->getActiveSocket();
}, [&urlData, limitedNodeList](const SharedNodePointer& octreeNode) {
auto octreeFilePacket = NLPacket::create(PacketType::OctreeFileReplacementFromUrl, urlData.size(), true);
octreeFilePacket->write(urlData);
limitedNodeList->sendPacket(std::move(octreeFilePacket), *octreeNode);
});
auto addressManager = DependencyManager::get<AddressManager>();
addressManager->handleLookupString(DOMAIN_SPAWNING_POINT);
QString newHomeAddress = addressManager->getHost() + DOMAIN_SPAWNING_POINT;
qCDebug(interfaceapp) << "Setting new home bookmark to: " << newHomeAddress;
DependencyManager::get<LocationBookmarks>()->setHomeLocationToAddress(newHomeAddress);
methodDetails = "SuccessfulRequestToReplaceContent";
} else {
methodDetails = "UserDeclinedToReplaceContent";
}
QObject::connect(dig, &ModalDialogListener::response, this, [=] (QVariant answer) {
QString details;
if (static_cast<QMessageBox::StandardButton>(answer.toInt()) == QMessageBox::Yes) {
// Given confirmation, send request to domain server to replace content
qCDebug(interfaceapp) << "Attempting to replace domain content: " << url;
QByteArray urlData(url.toUtf8());
auto limitedNodeList = DependencyManager::get<LimitedNodeList>();
limitedNodeList->eachMatchingNode([](const SharedNodePointer& node) {
return node->getType() == NodeType::EntityServer && node->getActiveSocket();
}, [&urlData, limitedNodeList](const SharedNodePointer& octreeNode) {
auto octreeFilePacket = NLPacket::create(PacketType::OctreeFileReplacementFromUrl, urlData.size(), true);
octreeFilePacket->write(urlData);
limitedNodeList->sendPacket(std::move(octreeFilePacket), *octreeNode);
});
auto addressManager = DependencyManager::get<AddressManager>();
addressManager->handleLookupString(DOMAIN_SPAWNING_POINT);
QString newHomeAddress = addressManager->getHost() + DOMAIN_SPAWNING_POINT;
qCDebug(interfaceapp) << "Setting new home bookmark to: " << newHomeAddress;
DependencyManager::get<LocationBookmarks>()->setHomeLocationToAddress(newHomeAddress);
details = "SuccessfulRequestToReplaceContent";
} else {
details = "UserDeclinedToReplaceContent";
}
QJsonObject messageProperties = {
{ "status", details },
{ "content_set_url", url }
};
UserActivityLogger::getInstance().logAction("replace_domain_content", messageProperties);
QObject::disconnect(dig, &ModalDialogListener::response, this, nullptr);
});
} else {
methodDetails = "ContentSetDidNotOriginateFromMarketplace";
QJsonObject messageProperties = {
{ "status", methodDetails },
{ "content_set_url", url }
};
UserActivityLogger::getInstance().logAction("replace_domain_content", messageProperties);
}
} else {
methodDetails = "UserDoesNotHavePermissionToReplaceContent";
static const QString warningMessage = simpleWordWrap("The domain owner must enable 'Replace Content' "
"permissions for you in this domain's server settings before you can continue.", MAX_CHARACTERS_PER_LINE);
OffscreenUi::warning("You do not have permissions to replace domain content", warningMessage,
OffscreenUi::asyncWarning("You do not have permissions to replace domain content", warningMessage,
QMessageBox::Ok, QMessageBox::Ok);
QJsonObject messageProperties = {
{ "status", methodDetails },
{ "content_set_url", url }
};
UserActivityLogger::getInstance().logAction("replace_domain_content", messageProperties);
}
QJsonObject messageProperties = {
{ "status", methodDetails },
{ "content_set_url", url }
};
UserActivityLogger::getInstance().logAction("replace_domain_content", messageProperties);
return true;
}
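The rework above replaces the blocking OffscreenUi::question() call with OffscreenUi::asyncQuestion(): the follow-up work moves into a lambda connected to the listener's response signal, and the handler disconnects itself once it has run. A minimal, self-contained sketch of that one-shot connect/handle/disconnect idiom (QTimer::timeout stands in for ModalDialogListener::response, and the answer value is faked purely for illustration):

    #include <QCoreApplication>
    #include <QDebug>
    #include <QTimer>

    int main(int argc, char** argv) {
        QCoreApplication app(argc, argv);

        QTimer dialog;    // stand-in for the async dialog listener
        QObject::connect(&dialog, &QTimer::timeout, &app, [&]() {
            int answer = 1;    // pretend QMessageBox::Yes came back from the user
            qDebug() << (answer ? "SuccessfulRequestToReplaceContent" : "UserDeclinedToReplaceContent");
            // One-shot handler: detach everything this receiver had connected to the signal,
            // mirroring the QObject::disconnect(dig, ..., this, nullptr) call in the diff.
            QObject::disconnect(&dialog, &QTimer::timeout, &app, nullptr);
            app.quit();
        });
        dialog.start(0);    // "show" the dialog; the handler runs from the event loop

        return app.exec();
    }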
@ -6363,20 +6411,14 @@ void Application::addAssetToWorldUnzipFailure(QString filePath) {
addAssetToWorldError(filename, "Couldn't unzip file " + filename + ".");
}
void Application::addAssetToWorld(QString filePath, QString zipFile, bool isZip, bool isBlocks) {
void Application::addAssetToWorld(QString path, QString zipFile, bool isZip, bool isBlocks) {
// Automatically upload and add an asset to the world, as an alternative to the manual process initiated by showAssetServerWidget().
QString mapping;
QString path = filePath;
QString filename = filenameFromPath(path);
if (isZip) {
QString assetFolder = zipFile.section("/", -1);
assetFolder.remove(".zip");
mapping = "/" + assetFolder + "/" + filename;
} else if (isBlocks) {
qCDebug(interfaceapp) << "Path to asset folder: " << zipFile;
QString assetFolder = zipFile.section('/', -1);
assetFolder.remove(".zip?noDownload=false");
mapping = "/" + assetFolder + "/" + filename;
if (isZip || isBlocks) {
QString assetName = zipFile.section("/", -1).remove(QRegExp("[.]zip(.*)$"));
QString assetFolder = path.section("model_repo/", -1);
mapping = "/" + assetName + "/" + assetFolder;
} else {
mapping = "/" + filename;
}
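The unified branch above derives the asset-server mapping the same way for plain zips and Blocks downloads: the asset name is the zip file name with its ".zip..." suffix stripped, and the rest of the mapping is the part of the unzipped path after "model_repo/". A standalone sketch of that string handling, using made-up Blocks-style input values purely for illustration:

    #include <QDebug>
    #include <QRegExp>
    #include <QString>

    // Illustration of the mapping rules in addAssetToWorld(); the input values are hypothetical.
    int main() {
        QString zipFile = "https://example.com/downloads/MyBlocksModel.zip?noDownload=false";
        QString path    = "/tmp/unzipped/model_repo/materials/wood.fbx";

        QString assetName   = zipFile.section("/", -1).remove(QRegExp("[.]zip(.*)$")); // "MyBlocksModel"
        QString assetFolder = path.section("model_repo/", -1);                          // "materials/wood.fbx"
        QString mapping     = "/" + assetName + "/" + assetFolder;                      // "/MyBlocksModel/materials/wood.fbx"

        qDebug() << mapping;
        return 0;
    }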
@ -6482,9 +6524,9 @@ void Application::addAssetToWorldAddEntity(QString filePath, QString mapping) {
properties.setShapeType(SHAPE_TYPE_SIMPLE_COMPOUND);
properties.setCollisionless(true); // Temporarily set so that it doesn't collide with the avatar.
properties.setVisible(false); // Temporarily set so that it isn't seen at its large, unresized dimensions.
glm::vec3 positionOffset = getMyAvatar()->getOrientation() * (getMyAvatar()->getSensorToWorldScale() * glm::vec3(0.0f, 0.0f, -2.0f));
properties.setPosition(getMyAvatar()->getPosition() + positionOffset);
properties.setRotation(getMyAvatar()->getOrientation());
glm::vec3 positionOffset = getMyAvatar()->getWorldOrientation() * (getMyAvatar()->getSensorToWorldScale() * glm::vec3(0.0f, 0.0f, -2.0f));
properties.setPosition(getMyAvatar()->getWorldPosition() + positionOffset);
properties.setRotation(getMyAvatar()->getWorldOrientation());
properties.setGravity(glm::vec3(0.0f, 0.0f, 0.0f));
auto entityID = DependencyManager::get<EntityScriptingInterface>()->addEntity(properties);
@ -6762,8 +6804,10 @@ void Application::handleUnzip(QString zipFile, QStringList unzipFile, bool autoA
if (autoAdd) {
if (!unzipFile.isEmpty()) {
for (int i = 0; i < unzipFile.length(); i++) {
qCDebug(interfaceapp) << "Preparing file for asset server: " << unzipFile.at(i);
addAssetToWorld(unzipFile.at(i), zipFile, isZip, isBlocks);
if (QFileInfo(unzipFile.at(i)).isFile()) {
qCDebug(interfaceapp) << "Preparing file for asset server: " << unzipFile.at(i);
addAssetToWorld(unzipFile.at(i), zipFile, isZip, isBlocks);
}
}
} else {
addAssetToWorldUnzipFailure(zipFile);
@ -7066,11 +7110,11 @@ QRect Application::getRecommendedHUDRect() const {
return result;
}
QSize Application::getDeviceSize() const {
glm::vec2 Application::getDeviceSize() const {
static const int MIN_SIZE = 1;
QSize result(MIN_SIZE, MIN_SIZE);
glm::vec2 result(MIN_SIZE);
if (_displayPlugin) {
result = fromGlm(getActiveDisplayPlugin()->getRecommendedRenderSize());
result = getActiveDisplayPlugin()->getRecommendedRenderSize();
}
return result;
}
@ -7089,10 +7133,6 @@ bool Application::hasFocus() const {
return (QApplication::activeWindow() != nullptr);
}
glm::vec2 Application::getViewportDimensions() const {
return toGlm(getDeviceSize());
}
void Application::setMaxOctreePacketsPerSecond(int maxOctreePPS) {
if (maxOctreePPS != _maxOctreePPS) {
_maxOctreePPS = maxOctreePPS;
@ -7121,6 +7161,7 @@ DisplayPluginPointer Application::getActiveDisplayPlugin() const {
return _displayPlugin;
}
static const char* EXCLUSION_GROUP_KEY = "exclusionGroup";
static void addDisplayPluginToMenu(DisplayPluginPointer displayPlugin, bool active = false) {
auto menu = Menu::getInstance();
@ -7156,6 +7197,8 @@ static void addDisplayPluginToMenu(DisplayPluginPointer displayPlugin, bool acti
action->setCheckable(true);
action->setChecked(active);
displayPluginGroup->addAction(action);
action->setProperty(EXCLUSION_GROUP_KEY, QVariant::fromValue(displayPluginGroup));
Q_ASSERT(menu->menuItemExists(MenuOption::OutputMenu, name));
}

View file

@ -70,9 +70,6 @@
#include "ui/overlays/Overlays.h"
#include "UndoStackScriptingInterface.h"
#include "raypick/RayPickManager.h"
#include "raypick/LaserPointerManager.h"
#include <procedural/ProceduralSkybox.h>
#include <model/Skybox.h>
#include <ModelScriptingInterface.h>
@ -161,7 +158,7 @@ public:
glm::uvec2 getUiSize() const;
QRect getRecommendedHUDRect() const;
QSize getDeviceSize() const;
glm::vec2 getDeviceSize() const;
bool hasFocus() const;
void showCursor(const Cursor::Icon& cursor);
@ -231,8 +228,6 @@ public:
FileLogger* getLogger() const { return _logger; }
glm::vec2 getViewportDimensions() const;
NodeToJurisdictionMap& getEntityServerJurisdictions() { return _entityServerJurisdictions; }
float getRenderResolutionScale() const;
@ -286,9 +281,6 @@ public:
QUrl getAvatarOverrideUrl() { return _avatarOverrideUrl; }
bool getSaveAvatarOverrideUrl() { return _saveAvatarOverrideUrl; }
LaserPointerManager& getLaserPointerManager() { return _laserPointerManager; }
RayPickManager& getRayPickManager() { return _rayPickManager; }
signals:
void svoImportRequested(const QString& url);
@ -703,9 +695,6 @@ private:
bool _saveAvatarOverrideUrl { false };
QObject* _renderEventHandler{ nullptr };
RayPickManager _rayPickManager;
LaserPointerManager _laserPointerManager;
friend class RenderEventHandler;
std::atomic<bool> _pendingIdleEvent { true };

View file

@ -104,8 +104,7 @@ void Application::paintGL() {
PerformanceTimer perfTimer("renderOverlay");
// NOTE: There is no batch associated with this renderArgs
// the ApplicationOverlay class assumes its viewport is set up to be the device size
QSize size = getDeviceSize();
renderArgs._viewport = glm::ivec4(0, 0, size.width(), size.height());
renderArgs._viewport = glm::ivec4(0, 0, getDeviceSize());
_applicationOverlay.renderOverlay(&renderArgs);
}

View file

@ -56,7 +56,7 @@ Menu* Menu::getInstance() {
return dynamic_cast<Menu*>(qApp->getWindow()->menuBar());
}
const char* exclusionGroupKey = "exclusionGroup";
const char* EXCLUSION_GROUP_KEY = "exclusionGroup";
Menu::Menu() {
auto dialogsManager = DependencyManager::get<DialogsManager>();
@ -228,21 +228,21 @@ Menu::Menu() {
viewMenu, MenuOption::FirstPerson, Qt::CTRL | Qt::Key_F,
true, qApp, SLOT(cameraMenuChanged())));
firstPersonAction->setProperty(exclusionGroupKey, QVariant::fromValue(cameraModeGroup));
firstPersonAction->setProperty(EXCLUSION_GROUP_KEY, QVariant::fromValue(cameraModeGroup));
// View > Third Person
auto thirdPersonAction = cameraModeGroup->addAction(addCheckableActionToQMenuAndActionHash(
viewMenu, MenuOption::ThirdPerson, Qt::CTRL | Qt::Key_G,
false, qApp, SLOT(cameraMenuChanged())));
thirdPersonAction->setProperty(exclusionGroupKey, QVariant::fromValue(cameraModeGroup));
thirdPersonAction->setProperty(EXCLUSION_GROUP_KEY, QVariant::fromValue(cameraModeGroup));
// View > Mirror
auto viewMirrorAction = cameraModeGroup->addAction(addCheckableActionToQMenuAndActionHash(
viewMenu, MenuOption::FullscreenMirror, Qt::CTRL | Qt::Key_H,
false, qApp, SLOT(cameraMenuChanged())));
viewMirrorAction->setProperty(exclusionGroupKey, QVariant::fromValue(cameraModeGroup));
viewMirrorAction->setProperty(EXCLUSION_GROUP_KEY, QVariant::fromValue(cameraModeGroup));
// View > Independent [advanced]
auto viewIndependentAction = cameraModeGroup->addAction(addCheckableActionToQMenuAndActionHash(viewMenu,
@ -250,7 +250,7 @@ Menu::Menu() {
false, qApp, SLOT(cameraMenuChanged()),
UNSPECIFIED_POSITION, "Advanced"));
viewIndependentAction->setProperty(exclusionGroupKey, QVariant::fromValue(cameraModeGroup));
viewIndependentAction->setProperty(EXCLUSION_GROUP_KEY, QVariant::fromValue(cameraModeGroup));
// View > Entity Camera [advanced]
auto viewEntityCameraAction = cameraModeGroup->addAction(addCheckableActionToQMenuAndActionHash(viewMenu,
@ -258,7 +258,7 @@ Menu::Menu() {
false, qApp, SLOT(cameraMenuChanged()),
UNSPECIFIED_POSITION, "Advanced"));
viewEntityCameraAction->setProperty(exclusionGroupKey, QVariant::fromValue(cameraModeGroup));
viewEntityCameraAction->setProperty(EXCLUSION_GROUP_KEY, QVariant::fromValue(cameraModeGroup));
viewMenu->addSeparator();

View file

@ -14,6 +14,7 @@
#include <TextureCache.h>
#include <gpu/Context.h>
#include <EntityScriptingInterface.h>
#include <glm/gtx/transform.hpp>
using RenderArgsPointer = std::shared_ptr<RenderArgs>;
@ -48,6 +49,47 @@ public:
_farClipPlaneDistance = config.farClipPlaneDistance;
_textureWidth = config.textureWidth;
_textureHeight = config.textureHeight;
_mirrorProjection = config.mirrorProjection;
}
void setMirrorProjection(ViewFrustum& srcViewFrustum) {
if (_attachedEntityId.isNull()) {
qWarning() << "ERROR: Cannot set mirror projection for SecondaryCamera without an attachedEntityId set.";
return;
}
EntityItemProperties entityProperties = _entityScriptingInterface->getEntityProperties(_attachedEntityId,
_attachedEntityPropertyFlags);
glm::vec3 mirrorPropertiesPosition = entityProperties.getPosition();
glm::quat mirrorPropertiesRotation = entityProperties.getRotation();
glm::vec3 mirrorPropertiesDimensions = entityProperties.getDimensions();
glm::vec3 halfMirrorPropertiesDimensions = 0.5f * mirrorPropertiesDimensions;
// setup mirror from world as inverse of world from mirror transformation using inverted x and z for mirrored image
// TODO: we are assuming here that UP is world y-axis
glm::mat4 worldFromMirrorRotation = glm::mat4_cast(mirrorPropertiesRotation) * glm::scale(vec3(-1.0f, 1.0f, -1.0f));
glm::mat4 worldFromMirrorTranslation = glm::translate(mirrorPropertiesPosition);
glm::mat4 worldFromMirror = worldFromMirrorTranslation * worldFromMirrorRotation;
glm::mat4 mirrorFromWorld = glm::inverse(worldFromMirror);
// get mirror camera position by reflecting main camera position's z coordinate in mirror space
glm::vec3 mainCameraPositionWorld = qApp->getCamera().getPosition();
glm::vec3 mainCameraPositionMirror = vec3(mirrorFromWorld * vec4(mainCameraPositionWorld, 1.0f));
glm::vec3 mirrorCameraPositionMirror = vec3(mainCameraPositionMirror.x, mainCameraPositionMirror.y,
-mainCameraPositionMirror.z);
glm::vec3 mirrorCameraPositionWorld = vec3(worldFromMirror * vec4(mirrorCameraPositionMirror, 1.0f));
// set frustum position to be mirrored camera and set orientation to mirror's adjusted rotation
glm::quat mirrorCameraOrientation = glm::quat_cast(worldFromMirrorRotation);
srcViewFrustum.setPosition(mirrorCameraPositionWorld);
srcViewFrustum.setOrientation(mirrorCameraOrientation);
// build frustum using mirror space translation of mirrored camera
float nearClip = mirrorCameraPositionMirror.z + mirrorPropertiesDimensions.z * 2.0f;
glm::vec3 upperRight = halfMirrorPropertiesDimensions - mirrorCameraPositionMirror;
glm::vec3 bottomLeft = -halfMirrorPropertiesDimensions - mirrorCameraPositionMirror;
glm::mat4 frustum = glm::frustum(bottomLeft.x, upperRight.x, bottomLeft.y, upperRight.y, nearClip, _farClipPlaneDistance);
srcViewFrustum.setProjection(frustum);
}
void run(const render::RenderContextPointer& renderContext, RenderArgsPointer& cachedArgs) {
@ -71,15 +113,22 @@ public:
});
auto srcViewFrustum = args->getViewFrustum();
if (!_attachedEntityId.isNull()) {
EntityItemProperties entityProperties = _entityScriptingInterface->getEntityProperties(_attachedEntityId, _attachedEntityPropertyFlags);
srcViewFrustum.setPosition(entityProperties.getPosition());
srcViewFrustum.setOrientation(entityProperties.getRotation());
if (_mirrorProjection) {
setMirrorProjection(srcViewFrustum);
} else {
srcViewFrustum.setPosition(_position);
srcViewFrustum.setOrientation(_orientation);
if (!_attachedEntityId.isNull()) {
EntityItemProperties entityProperties = _entityScriptingInterface->getEntityProperties(_attachedEntityId,
_attachedEntityPropertyFlags);
srcViewFrustum.setPosition(entityProperties.getPosition());
srcViewFrustum.setOrientation(entityProperties.getRotation());
} else {
srcViewFrustum.setPosition(_position);
srcViewFrustum.setOrientation(_orientation);
}
srcViewFrustum.setProjection(glm::perspective(glm::radians(_vFoV),
((float)args->_viewport.z / (float)args->_viewport.w),
_nearClipPlaneDistance, _farClipPlaneDistance));
}
srcViewFrustum.setProjection(glm::perspective(glm::radians(_vFoV), ((float)args->_viewport.z / (float)args->_viewport.w), _nearClipPlaneDistance, _farClipPlaneDistance));
// Without calculating the bound planes, the secondary camera will use the same culling frustum as the main camera,
// which is not what we want here.
srcViewFrustum.calculate();
@ -101,6 +150,7 @@ private:
float _farClipPlaneDistance;
int _textureWidth;
int _textureHeight;
bool _mirrorProjection;
EntityPropertyFlags _attachedEntityPropertyFlags;
QSharedPointer<EntityScriptingInterface> _entityScriptingInterface;
};
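Restated as equations, the setMirrorProjection() addition above builds a mirror-space frame from the mirror entity's rotation combined with a scale of (-1, 1, -1) and a translation to its position, reflects the main camera by negating its z coordinate in that frame, and then builds an off-axis frustum from the mirror's dimensions. This is only the code above written out (with the same y-up assumption flagged by the TODO); p and q are the mirror entity's position and rotation, d its dimensions, and c_w the main camera's world position:

    \[
        W_{\leftarrow m} = T(p)\,R(q)\,S(-1,\,1,\,-1), \qquad M_{\leftarrow w} = W_{\leftarrow m}^{-1}
    \]
    \[
        c_m = M_{\leftarrow w}\,c_w, \qquad
        c'_m = (c_{m,x},\; c_{m,y},\; -c_{m,z}), \qquad
        c'_w = W_{\leftarrow m}\,c'_m
    \]
    \[
        n = c'_{m,z} + 2\,d_z, \qquad
        (r,\,t) = \tfrac{1}{2}(d_x,\,d_y) - (c'_{m,x},\,c'_{m,y}), \qquad
        (l,\,b) = -\tfrac{1}{2}(d_x,\,d_y) - (c'_{m,x},\,c'_{m,y})
    \]

The view frustum is then positioned at c'_w, oriented by the rotation extracted (quat_cast) from R(q) S(-1, 1, -1), and projected with glm::frustum(l, r, b, t, n, farClipPlaneDistance), exactly as in the code.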

View file

@ -35,6 +35,7 @@ class SecondaryCameraJobConfig : public render::Task::Config { // Exposes second
Q_PROPERTY(float vFoV MEMBER vFoV NOTIFY dirty) // Secondary camera's vertical field of view. In degrees.
Q_PROPERTY(float nearClipPlaneDistance MEMBER nearClipPlaneDistance NOTIFY dirty) // Secondary camera's near clip plane distance. In meters.
Q_PROPERTY(float farClipPlaneDistance MEMBER farClipPlaneDistance NOTIFY dirty) // Secondary camera's far clip plane distance. In meters.
Q_PROPERTY(bool mirrorProjection MEMBER mirrorProjection NOTIFY dirty) // Flag to use attached mirror entity to build frustum for the mirror and set mirrored camera position/orientation.
public:
QUuid attachedEntityId;
glm::vec3 position;
@ -44,6 +45,7 @@ public:
float farClipPlaneDistance { DEFAULT_FAR_CLIP };
int textureWidth { TextureCache::DEFAULT_SPECTATOR_CAM_WIDTH };
int textureHeight { TextureCache::DEFAULT_SPECTATOR_CAM_HEIGHT };
bool mirrorProjection { false };
SecondaryCameraJobConfig() : render::Task::Config(false) {}
signals:

View file

@ -529,7 +529,7 @@ void AvatarActionHold::lateAvatarUpdate(const AnimPose& prePhysicsRoomPose, cons
rigidBody->setWorldTransform(worldTrans);
bool positionSuccess;
ownerEntity->setPosition(bulletToGLM(worldTrans.getOrigin()) + ObjectMotionState::getWorldOffset(), positionSuccess, false);
ownerEntity->setWorldPosition(bulletToGLM(worldTrans.getOrigin()) + ObjectMotionState::getWorldOffset(), positionSuccess, false);
bool orientationSuccess;
ownerEntity->setOrientation(bulletToGLM(worldTrans.getRotation()), orientationSuccess, false);
ownerEntity->setWorldOrientation(bulletToGLM(worldTrans.getRotation()), orientationSuccess, false);
}

View file

@ -28,6 +28,7 @@
#include <shared/QtHelpers.h>
#include <AvatarData.h>
#include <PerfStat.h>
#include <PrioritySortUtil.h>
#include <RegisteredMetaTypes.h>
#include <Rig.h>
#include <SettingHandle.h>
@ -142,32 +143,39 @@ void AvatarManager::updateOtherAvatars(float deltaTime) {
PerformanceTimer perfTimer("otherAvatars");
auto avatarMap = getHashCopy();
QList<AvatarSharedPointer> avatarList = avatarMap.values();
class SortableAvatar: public PrioritySortUtil::Sortable {
public:
SortableAvatar() = delete;
SortableAvatar(const AvatarSharedPointer& avatar) : _avatar(avatar) {}
glm::vec3 getPosition() const override { return _avatar->getWorldPosition(); }
float getRadius() const override { return std::static_pointer_cast<Avatar>(_avatar)->getBoundingRadius(); }
uint64_t getTimestamp() const override { return std::static_pointer_cast<Avatar>(_avatar)->getLastRenderUpdateTime(); }
const AvatarSharedPointer& getAvatar() const { return _avatar; }
private:
AvatarSharedPointer _avatar;
};
ViewFrustum cameraView;
qApp->copyDisplayViewFrustum(cameraView);
PrioritySortUtil::PriorityQueue<SortableAvatar> sortedAvatars(cameraView,
AvatarData::_avatarSortCoefficientSize,
AvatarData::_avatarSortCoefficientCenter,
AvatarData::_avatarSortCoefficientAge);
std::priority_queue<AvatarPriority> sortedAvatars;
AvatarData::sortAvatars(avatarList, cameraView, sortedAvatars,
[](AvatarSharedPointer avatar)->uint64_t{
return std::static_pointer_cast<Avatar>(avatar)->getLastRenderUpdateTime();
},
[](AvatarSharedPointer avatar)->float{
return std::static_pointer_cast<Avatar>(avatar)->getBoundingRadius();
},
[this](AvatarSharedPointer avatar)->bool{
const auto& castedAvatar = std::static_pointer_cast<Avatar>(avatar);
if (castedAvatar == _myAvatar || !castedAvatar->isInitialized()) {
// DO NOT update _myAvatar! Its update has already been done earlier in the main loop.
// DO NOT update or fade out uninitialized Avatars
return true; // ignore it
}
return false;
});
// sort
auto avatarMap = getHashCopy();
AvatarHash::iterator itr = avatarMap.begin();
while (itr != avatarMap.end()) {
const auto& avatar = std::static_pointer_cast<Avatar>(*itr);
// DO NOT update _myAvatar! Its update has already been done earlier in the main loop.
// DO NOT update or fade out uninitialized Avatars
if (avatar != _myAvatar && avatar->isInitialized()) {
sortedAvatars.push(SortableAvatar(avatar));
}
++itr;
}
// process in sorted order
uint64_t startTime = usecTimestampNow();
const uint64_t UPDATE_BUDGET = 2000; // usec
uint64_t updateExpiry = startTime + UPDATE_BUDGET;
@ -176,8 +184,8 @@ void AvatarManager::updateOtherAvatars(float deltaTime) {
render::Transaction transaction;
while (!sortedAvatars.empty()) {
const AvatarPriority& sortData = sortedAvatars.top();
const auto& avatar = std::static_pointer_cast<Avatar>(sortData.avatar);
const SortableAvatar& sortData = sortedAvatars.top();
const auto& avatar = std::static_pointer_cast<Avatar>(sortData.getAvatar());
bool ignoring = DependencyManager::get<NodeList>()->isPersonalMutingNode(avatar->getID());
if (ignoring) {
@ -207,7 +215,7 @@ void AvatarManager::updateOtherAvatars(float deltaTime) {
uint64_t now = usecTimestampNow();
if (now < updateExpiry) {
// we're within budget
bool inView = sortData.priority > OUT_OF_VIEW_THRESHOLD;
bool inView = sortData.getPriority() > OUT_OF_VIEW_THRESHOLD;
if (inView && avatar->hasNewJointData()) {
numAvatarsUpdated++;
}
@ -221,7 +229,7 @@ void AvatarManager::updateOtherAvatars(float deltaTime) {
// --> some avatar velocity measurements may be a little off
// no time to simulate, but we take the time to count how many were tragically missed
bool inView = sortData.priority > OUT_OF_VIEW_THRESHOLD;
bool inView = sortData.getPriority() > OUT_OF_VIEW_THRESHOLD;
if (!inView) {
break;
}
@ -230,9 +238,9 @@ void AvatarManager::updateOtherAvatars(float deltaTime) {
}
sortedAvatars.pop();
while (inView && !sortedAvatars.empty()) {
const AvatarPriority& newSortData = sortedAvatars.top();
const auto& newAvatar = std::static_pointer_cast<Avatar>(newSortData.avatar);
inView = newSortData.priority > OUT_OF_VIEW_THRESHOLD;
const SortableAvatar& newSortData = sortedAvatars.top();
const auto& newAvatar = std::static_pointer_cast<Avatar>(newSortData.getAvatar());
inView = newSortData.getPriority() > OUT_OF_VIEW_THRESHOLD;
if (inView && newAvatar->hasNewJointData()) {
numAVatarsNotUpdated++;
}
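The loop above drains the sorted avatar queue under a fixed time budget (UPDATE_BUDGET = 2000 usec): while time remains it simulates avatars in priority order, and once the budget expires it only counts the in-view avatars with new joint data that were missed. A generic, standard-library-only sketch of that budgeted-drain pattern (the item type and the "work" are placeholders, not the AvatarManager code):

    #include <chrono>
    #include <iostream>
    #include <queue>

    // Generic "drain a priority queue within a time budget" sketch. Higher-priority items
    // are processed first; once the budget expires, remaining items are only counted.
    struct Item {
        float priority { 0.0f };
        bool operator<(const Item& other) const { return priority < other.priority; } // max-heap
    };

    int main() {
        using Clock = std::chrono::steady_clock;
        const auto budget = std::chrono::microseconds(2000);   // mirrors UPDATE_BUDGET in the diff

        std::priority_queue<Item> queue;
        for (int i = 0; i < 1000; ++i) {
            queue.push(Item{ static_cast<float>(i % 100) });
        }

        const auto expiry = Clock::now() + budget;
        int updated = 0;
        int skipped = 0;
        while (!queue.empty()) {
            Item item = queue.top();
            queue.pop();
            if (Clock::now() < expiry) {
                ++updated;    // stand-in for the per-avatar simulate/update work
            } else {
                ++skipped;    // out of budget: just count what was missed
            }
        }
        std::cout << "updated " << updated << ", skipped " << skipped << "\n";
        return 0;
    }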
@ -441,7 +449,7 @@ void AvatarManager::handleCollisionEvents(const CollisionEvents& collisionEvents
static const int MAX_INJECTOR_COUNT = 3;
if (_collisionInjectors.size() < MAX_INJECTOR_COUNT) {
auto injector = AudioInjector::playSound(collisionSound, energyFactorOfFull, AVATAR_STRETCH_FACTOR,
myAvatar->getPosition());
myAvatar->getWorldPosition());
_collisionInjectors.emplace_back(injector);
}
myAvatar->collisionWithEntity(collision);

View file

@ -41,7 +41,7 @@ public:
void init();
std::shared_ptr<MyAvatar> getMyAvatar() { return _myAvatar; }
glm::vec3 getMyAvatarPosition() const { return _myAvatar->getPosition(); }
glm::vec3 getMyAvatarPosition() const { return _myAvatar->getWorldPosition(); }
// Null/Default-constructed QUuids will return MyAvatar
Q_INVOKABLE virtual ScriptAvatarData* getAvatar(QUuid avatarID) override { return new ScriptAvatar(getAvatarBySessionID(avatarID)); }

View file

@ -106,22 +106,22 @@ float AvatarMotionState::getObjectAngularDamping() const {
// virtual
glm::vec3 AvatarMotionState::getObjectPosition() const {
return _avatar->getPosition();
return _avatar->getWorldPosition();
}
// virtual
glm::quat AvatarMotionState::getObjectRotation() const {
return _avatar->getOrientation();
return _avatar->getWorldOrientation();
}
// virtual
glm::vec3 AvatarMotionState::getObjectLinearVelocity() const {
return _avatar->getVelocity();
return _avatar->getWorldVelocity();
}
// virtual
glm::vec3 AvatarMotionState::getObjectAngularVelocity() const {
return _avatar->getAngularVelocity();
return _avatar->getWorldAngularVelocity();
}
// virtual

View file

@ -196,8 +196,8 @@ MyAvatar::MyAvatar(QThread* thread) :
setDisplayName(dummyAvatar.getDisplayName());
}
setPosition(dummyAvatar.getPosition());
setOrientation(dummyAvatar.getOrientation());
setWorldPosition(dummyAvatar.getWorldPosition());
setWorldOrientation(dummyAvatar.getWorldOrientation());
if (!dummyAvatar.getAttachmentData().isEmpty()) {
setAttachmentData(dummyAvatar.getAttachmentData());
@ -250,11 +250,11 @@ void MyAvatar::registerMetaTypes(ScriptEnginePointer engine) {
}
void MyAvatar::setOrientationVar(const QVariant& newOrientationVar) {
Avatar::setOrientation(quatFromVariant(newOrientationVar));
Avatar::setWorldOrientation(quatFromVariant(newOrientationVar));
}
QVariant MyAvatar::getOrientationVar() const {
return quatToVariant(Avatar::getOrientation());
return quatToVariant(Avatar::getWorldOrientation());
}
glm::quat MyAvatar::getOrientationOutbound() const {
@ -276,7 +276,7 @@ void MyAvatar::simulateAttachments(float deltaTime) {
QByteArray MyAvatar::toByteArrayStateful(AvatarDataDetail dataDetail, bool dropFaceTracking) {
CameraMode mode = qApp->getCamera().getMode();
_globalPosition = getPosition();
_globalPosition = getWorldPosition();
// This might not be right! Isn't the capsule local offset in avatar space, and don't we need to add the radius to the y as well? -HRS 5/26/17
_globalBoundingBoxDimensions.x = _characterController.getCapsuleRadius();
_globalBoundingBoxDimensions.y = _characterController.getCapsuleHalfHeight();
@ -284,11 +284,11 @@ QByteArray MyAvatar::toByteArrayStateful(AvatarDataDetail dataDetail, bool dropF
_globalBoundingBoxOffset = _characterController.getCapsuleLocalOffset();
if (mode == CAMERA_MODE_THIRD_PERSON || mode == CAMERA_MODE_INDEPENDENT) {
// fake the avatar position that is sent up to the AvatarMixer
glm::vec3 oldPosition = getPosition();
setPosition(getSkeletonPosition());
glm::vec3 oldPosition = getWorldPosition();
setWorldPosition(getSkeletonPosition());
QByteArray array = AvatarData::toByteArrayStateful(dataDetail);
// copy the correct position back
setPosition(oldPosition);
setWorldPosition(oldPosition);
return array;
}
return AvatarData::toByteArrayStateful(dataDetail);
@ -321,15 +321,15 @@ void MyAvatar::centerBody() {
if (_characterController.getState() == CharacterController::State::Ground) {
// the avatar's physical aspect thinks it is standing on something
// therefore need to be careful to not "center" the body below the floor
float downStep = glm::dot(worldBodyPos - getPosition(), _worldUpDirection);
float downStep = glm::dot(worldBodyPos - getWorldPosition(), _worldUpDirection);
if (downStep < -0.5f * _characterController.getCapsuleHalfHeight() + _characterController.getCapsuleRadius()) {
worldBodyPos -= downStep * _worldUpDirection;
}
}
// this will become our new position.
setPosition(worldBodyPos);
setOrientation(worldBodyRot);
setWorldPosition(worldBodyPos);
setWorldOrientation(worldBodyRot);
// reset the body in sensor space
_bodySensorMatrix = newBodySensorMatrix;
@ -372,8 +372,8 @@ void MyAvatar::reset(bool andRecenter, bool andReload, bool andHead) {
auto worldBodyRot = glmExtractRotation(worldBodyMatrix);
// this will become our new position.
setPosition(worldBodyPos);
setOrientation(worldBodyRot);
setWorldPosition(worldBodyPos);
setWorldOrientation(worldBodyRot);
// now sample the new hmd orientation AFTER sensor reset, which should be identity.
glm::mat4 identity;
@ -410,8 +410,8 @@ void MyAvatar::update(float deltaTime) {
#endif
if (_goToPending) {
setPosition(_goToPosition);
setOrientation(_goToOrientation);
setWorldPosition(_goToPosition);
setWorldOrientation(_goToOrientation);
_headControllerFacingMovingAverage = _headControllerFacing; // reset moving average
_goToPending = false;
// updateFromHMDSensorMatrix (called from paintGL) expects that the sensorToWorldMatrix is updated for any position changes
@ -444,7 +444,7 @@ void MyAvatar::update(float deltaTime) {
// This might not be right! Isn't the capsule local offset in avatar space? -HRS 5/26/17
halfBoundingBoxDimensions += _characterController.getCapsuleLocalOffset();
QMetaObject::invokeMethod(audio.data(), "setAvatarBoundingBoxParameters",
Q_ARG(glm::vec3, (getPosition() - halfBoundingBoxDimensions)),
Q_ARG(glm::vec3, (getWorldPosition() - halfBoundingBoxDimensions)),
Q_ARG(glm::vec3, (halfBoundingBoxDimensions*2.0f)));
if (getIdentityDataChanged()) {
@ -537,7 +537,7 @@ void MyAvatar::simulate(float deltaTime) {
if (!_skeletonModel->hasSkeleton()) {
// All the simulation that can be done has been done
getHead()->setPosition(getPosition()); // so audio-position isn't 0,0,0
getHead()->setPosition(getWorldPosition()); // so audio-position isn't 0,0,0
return;
}
@ -555,7 +555,7 @@ void MyAvatar::simulate(float deltaTime) {
Head* head = getHead();
glm::vec3 headPosition;
if (!_skeletonModel->getHeadPosition(headPosition)) {
headPosition = getPosition();
headPosition = getWorldPosition();
}
head->setPosition(headPosition);
head->setScale(getModelScale());
@ -666,7 +666,7 @@ void MyAvatar::updateSensorToWorldMatrix() {
// update the sensor mat so that the body position will end up in the desired
// position when driven from the head.
float sensorToWorldScale = getEyeHeight() / getUserEyeHeight();
glm::mat4 desiredMat = createMatFromScaleQuatAndPos(glm::vec3(sensorToWorldScale), getOrientation(), getPosition());
glm::mat4 desiredMat = createMatFromScaleQuatAndPos(glm::vec3(sensorToWorldScale), getWorldOrientation(), getWorldPosition());
_sensorToWorldMatrix = desiredMat * glm::inverse(_bodySensorMatrix);
lateUpdatePalms();
@ -783,8 +783,8 @@ controller::Pose MyAvatar::getRightHandTipPose() const {
}
glm::vec3 MyAvatar::worldToJointPoint(const glm::vec3& position, const int jointIndex) const {
glm::vec3 jointPos = getPosition();//default value if no or invalid joint specified
glm::quat jointRot = getRotation();//default value if no or invalid joint specified
glm::vec3 jointPos = getWorldPosition();//default value if no or invalid joint specified
glm::quat jointRot = getWorldOrientation();//default value if no or invalid joint specified
if (jointIndex != -1) {
if (_skeletonModel->getJointPositionInWorldFrame(jointIndex, jointPos)) {
_skeletonModel->getJointRotationInWorldFrame(jointIndex, jointRot);
@ -799,7 +799,7 @@ glm::vec3 MyAvatar::worldToJointPoint(const glm::vec3& position, const int joint
}
glm::vec3 MyAvatar::worldToJointDirection(const glm::vec3& worldDir, const int jointIndex) const {
glm::quat jointRot = getRotation();//default value if no or invalid joint specified
glm::quat jointRot = getWorldOrientation();//default value if no or invalid joint specified
if ((jointIndex != -1) && (!_skeletonModel->getJointRotationInWorldFrame(jointIndex, jointRot))) {
qWarning() << "Invalid joint index specified: " << jointIndex;
}
@ -809,7 +809,7 @@ glm::vec3 MyAvatar::worldToJointDirection(const glm::vec3& worldDir, const int j
}
glm::quat MyAvatar::worldToJointRotation(const glm::quat& worldRot, const int jointIndex) const {
glm::quat jointRot = getRotation();//default value if no or invalid joint specified
glm::quat jointRot = getWorldOrientation();//default value if no or invalid joint specified
if ((jointIndex != -1) && (!_skeletonModel->getJointRotationInWorldFrame(jointIndex, jointRot))) {
qWarning() << "Invalid joint index specified: " << jointIndex;
}
@ -818,8 +818,8 @@ glm::quat MyAvatar::worldToJointRotation(const glm::quat& worldRot, const int jo
}
glm::vec3 MyAvatar::jointToWorldPoint(const glm::vec3& jointSpacePos, const int jointIndex) const {
glm::vec3 jointPos = getPosition();//default value if no or invalid joint specified
glm::quat jointRot = getRotation();//default value if no or invalid joint specified
glm::vec3 jointPos = getWorldPosition();//default value if no or invalid joint specified
glm::quat jointRot = getWorldOrientation();//default value if no or invalid joint specified
if (jointIndex != -1) {
if (_skeletonModel->getJointPositionInWorldFrame(jointIndex, jointPos)) {
@ -836,7 +836,7 @@ glm::vec3 MyAvatar::jointToWorldPoint(const glm::vec3& jointSpacePos, const int
}
glm::vec3 MyAvatar::jointToWorldDirection(const glm::vec3& jointSpaceDir, const int jointIndex) const {
glm::quat jointRot = getRotation();//default value if no or invalid joint specified
glm::quat jointRot = getWorldOrientation();//default value if no or invalid joint specified
if ((jointIndex != -1) && (!_skeletonModel->getJointRotationInWorldFrame(jointIndex, jointRot))) {
qWarning() << "Invalid joint index specified: " << jointIndex;
}
@ -845,7 +845,7 @@ glm::vec3 MyAvatar::jointToWorldDirection(const glm::vec3& jointSpaceDir, const
}
glm::quat MyAvatar::jointToWorldRotation(const glm::quat& jointSpaceRot, const int jointIndex) const {
glm::quat jointRot = getRotation();//default value if no or invalid joint specified
glm::quat jointRot = getWorldOrientation();//default value if no or invalid joint specified
if ((jointIndex != -1) && (!_skeletonModel->getJointRotationInWorldFrame(jointIndex, jointRot))) {
qWarning() << "Invalid joint index specified: " << jointIndex;
}
@ -951,11 +951,7 @@ void MyAvatar::saveData() {
}
settings.endArray();
if (_avatarEntityData.size() == 0) {
// HACK: manually remove empty 'avatarEntityData' else deleted avatarEntityData may show up in settings file
settings.remove("avatarEntityData");
}
settings.remove("avatarEntityData");
settings.beginWriteArray("avatarEntityData");
int avatarEntityIndex = 0;
auto hmdInterface = DependencyManager::get<HMDScriptingInterface>();
@ -1230,7 +1226,7 @@ void MyAvatar::updateLookAtTargetAvatar() {
float angleTo = coneSphereAngle(getHead()->getEyePosition(), lookForward, avatar->getHead()->getEyePosition(), radius);
if (angleTo < (smallestAngleTo * (isCurrentTarget ? KEEP_LOOKING_AT_CURRENT_ANGLE_FACTOR : 1.0f))) {
_lookAtTargetAvatar = avatarPointer;
_targetAvatarPosition = avatarPointer->getPosition();
_targetAvatarPosition = avatarPointer->getWorldPosition();
}
if (_lookAtSnappingEnabled && avatar->getLookAtSnappingEnabled() && isLookingAtMe(avatar)) {
@ -1301,7 +1297,7 @@ eyeContactTarget MyAvatar::getEyeContactTarget() {
}
glm::vec3 MyAvatar::getDefaultEyePosition() const {
return getPosition() + getOrientation() * Quaternions::Y_180 * _skeletonModel->getDefaultEyeModelPosition();
return getWorldPosition() + getWorldOrientation() * Quaternions::Y_180 * _skeletonModel->getDefaultEyeModelPosition();
}
const float SCRIPT_PRIORITY = 1.0f + 1.0f;
@ -1461,9 +1457,9 @@ glm::vec3 MyAvatar::getSkeletonPosition() const {
// The avatar is rotated PI about the yAxis, so we have to correct for it
// to get the skeleton offset contribution in the world-frame.
const glm::quat FLIP = glm::angleAxis(PI, glm::vec3(0.0f, 1.0f, 0.0f));
return getPosition() + getOrientation() * FLIP * _skeletonOffset;
return getWorldPosition() + getWorldOrientation() * FLIP * _skeletonOffset;
}
return Avatar::getPosition();
return Avatar::getWorldPosition();
}
void MyAvatar::rebuildCollisionShape() {
@ -1509,7 +1505,7 @@ controller::Pose MyAvatar::getControllerPoseInWorldFrame(controller::Action acti
controller::Pose MyAvatar::getControllerPoseInAvatarFrame(controller::Action action) const {
auto pose = getControllerPoseInWorldFrame(action);
if (pose.valid) {
glm::mat4 invAvatarMatrix = glm::inverse(createMatFromQuatAndPos(getOrientation(), getPosition()));
glm::mat4 invAvatarMatrix = glm::inverse(createMatFromQuatAndPos(getWorldOrientation(), getWorldPosition()));
return pose.transform(invAvatarMatrix);
} else {
return controller::Pose(); // invalid pose
@ -1528,7 +1524,7 @@ void MyAvatar::updateMotors() {
// we decompose camera's rotation and store the twist part in motorRotation
// however, we need to perform the decomposition in the avatar-frame
// using the local UP axis and then transform back into world-frame
glm::quat orientation = getOrientation();
glm::quat orientation = getWorldOrientation();
glm::quat headOrientation = glm::inverse(orientation) * getMyHead()->getHeadOrientation(); // avatar-frame
glm::quat liftRotation;
swingTwistDecomposition(headOrientation, Vectors::UNIT_Y, liftRotation, motorRotation);
@ -1548,7 +1544,7 @@ void MyAvatar::updateMotors() {
if (_scriptedMotorFrame == SCRIPTED_MOTOR_CAMERA_FRAME) {
motorRotation = getMyHead()->getHeadOrientation() * glm::angleAxis(PI, Vectors::UNIT_Y);
} else if (_scriptedMotorFrame == SCRIPTED_MOTOR_AVATAR_FRAME) {
motorRotation = getOrientation() * glm::angleAxis(PI, Vectors::UNIT_Y);
motorRotation = getWorldOrientation() * glm::angleAxis(PI, Vectors::UNIT_Y);
} else {
// world-frame
motorRotation = glm::quat();
@ -1576,7 +1572,7 @@ void MyAvatar::prepareForPhysicsSimulation() {
_characterController.setParentVelocity(parentVelocity);
_characterController.setScaleFactor(getSensorToWorldScale());
_characterController.setPositionAndOrientation(getPosition(), getOrientation());
_characterController.setPositionAndOrientation(getWorldPosition(), getWorldOrientation());
auto headPose = getControllerPoseInAvatarFrame(controller::Action::HEAD);
if (headPose.isValid()) {
_follow.prePhysicsUpdate(*this, deriveBodyFromHMDSensor(), _bodySensorMatrix, hasDriveInput());
@ -1610,20 +1606,20 @@ void MyAvatar::harvestResultsFromPhysicsSimulation(float deltaTime) {
if (_characterController.isEnabledAndReady()) {
_characterController.getPositionAndOrientation(position, orientation);
} else {
position = getPosition();
orientation = getOrientation();
position = getWorldPosition();
orientation = getWorldOrientation();
}
nextAttitude(position, orientation);
_bodySensorMatrix = _follow.postPhysicsUpdate(*this, _bodySensorMatrix);
if (_characterController.isEnabledAndReady()) {
setVelocity(_characterController.getLinearVelocity() + _characterController.getFollowVelocity());
setWorldVelocity(_characterController.getLinearVelocity() + _characterController.getFollowVelocity());
if (_characterController.isStuck()) {
_physicsSafetyPending = true;
_goToPosition = getPosition();
_goToPosition = getWorldPosition();
}
} else {
setVelocity(getVelocity() + _characterController.getFollowVelocity());
setWorldVelocity(getWorldVelocity() + _characterController.getFollowVelocity());
}
}
@ -1847,15 +1843,15 @@ void MyAvatar::postUpdate(float deltaTime, const render::ScenePointer& scene) {
}
}
DebugDraw::getInstance().updateMyAvatarPos(getPosition());
DebugDraw::getInstance().updateMyAvatarRot(getOrientation());
DebugDraw::getInstance().updateMyAvatarPos(getWorldPosition());
DebugDraw::getInstance().updateMyAvatarRot(getWorldOrientation());
AnimPose postUpdateRoomPose(_sensorToWorldMatrix);
updateHoldActions(_prePhysicsRoomPose, postUpdateRoomPose);
if (_enableDebugDrawDetailedCollision) {
AnimPose rigToWorldPose(glm::vec3(1.0f), getRotation() * Quaternions::Y_180, getPosition());
AnimPose rigToWorldPose(glm::vec3(1.0f), getWorldOrientation() * Quaternions::Y_180, getWorldPosition());
const int NUM_DEBUG_COLORS = 8;
const glm::vec4 DEBUG_COLORS[NUM_DEBUG_COLORS] = {
glm::vec4(1.0f, 1.0f, 1.0f, 1.0f),
@ -1962,7 +1958,7 @@ void MyAvatar::updateOrientation(float deltaTime) {
if (qApp->isHMDMode() && getCharacterController()->getState() == CharacterController::State::Hover && _hmdRollControlEnabled && hasDriveInput()) {
// Turn with head roll.
const float MIN_CONTROL_SPEED = 0.01f;
float speed = glm::length(getVelocity());
float speed = glm::length(getWorldVelocity());
if (speed >= MIN_CONTROL_SPEED) {
// Feather turn when stopping moving.
float speedFactor;
@ -1973,7 +1969,7 @@ void MyAvatar::updateOrientation(float deltaTime) {
speedFactor = glm::min(speed / _lastDrivenSpeed, 1.0f);
}
float direction = glm::dot(getVelocity(), getRotation() * Vectors::UNIT_NEG_Z) > 0.0f ? 1.0f : -1.0f;
float direction = glm::dot(getWorldVelocity(), getWorldOrientation() * Vectors::UNIT_NEG_Z) > 0.0f ? 1.0f : -1.0f;
float rollAngle = glm::degrees(asinf(glm::dot(IDENTITY_UP, _hmdSensorOrientation * IDENTITY_RIGHT)));
float rollSign = rollAngle < 0.0f ? -1.0f : 1.0f;
@ -1986,12 +1982,12 @@ void MyAvatar::updateOrientation(float deltaTime) {
// update body orientation by movement inputs
glm::quat initialOrientation = getOrientationOutbound();
setOrientation(getOrientation() * glm::quat(glm::radians(glm::vec3(0.0f, totalBodyYaw, 0.0f))));
setWorldOrientation(getWorldOrientation() * glm::quat(glm::radians(glm::vec3(0.0f, totalBodyYaw, 0.0f))));
if (snapTurn) {
// Whether or not there is an existing smoothing going on, just reset the smoothing timer and set the starting position as the avatar's current position, then smooth to the new position.
_smoothOrientationInitial = initialOrientation;
_smoothOrientationTarget = getOrientation();
_smoothOrientationTarget = getWorldOrientation();
_smoothOrientationTimer = 0.0f;
}
@ -2084,7 +2080,7 @@ void MyAvatar::updatePosition(float deltaTime) {
updateActionMotor(deltaTime);
}
vec3 velocity = getVelocity();
vec3 velocity = getWorldVelocity();
float sensorToWorldScale = getSensorToWorldScale();
float sensorToWorldScale2 = sensorToWorldScale * sensorToWorldScale;
const float MOVING_SPEED_THRESHOLD_SQUARED = 0.0001f; // 0.01 m/s
@ -2106,9 +2102,9 @@ void MyAvatar::updatePosition(float deltaTime) {
if (_moving) {
// scan for walkability
glm::vec3 position = getPosition();
glm::vec3 position = getWorldPosition();
MyCharacterController::RayShotgunResult result;
glm::vec3 step = deltaTime * (getRotation() * _actionMotorVelocity);
glm::vec3 step = deltaTime * (getWorldOrientation() * _actionMotorVelocity);
_characterController.testRayShotgun(position, step, result);
_characterController.setStepUpEnabled(result.walkable);
}
@ -2339,7 +2335,7 @@ void MyAvatar::goToLocation(const glm::vec3& newPosition,
_goToPending = true;
_goToPosition = newPosition;
_goToOrientation = getOrientation();
_goToOrientation = getWorldOrientation();
if (hasOrientation) {
qCDebug(interfaceapp).nospace() << "MyAvatar goToLocation - new orientation is "
<< newOrientation.x << ", " << newOrientation.y << ", " << newOrientation.z << ", " << newOrientation.w;
@ -2408,7 +2404,7 @@ bool MyAvatar::requiresSafeLanding(const glm::vec3& positionIn, glm::vec3& bette
return false; // no entity tree
}
// More utilities.
const auto offset = getOrientation() *_characterController.getCapsuleLocalOffset();
const auto offset = getWorldOrientation() *_characterController.getCapsuleLocalOffset();
const auto capsuleCenter = positionIn + offset;
const auto up = _worldUpDirection, down = -up;
glm::vec3 upperIntersection, upperNormal, lowerIntersection, lowerNormal;
@ -2909,7 +2905,7 @@ glm::mat4 MyAvatar::FollowHelper::postPhysicsUpdate(const MyAvatar& myAvatar, co
}
float MyAvatar::getAccelerationEnergy() {
glm::vec3 velocity = getVelocity();
glm::vec3 velocity = getWorldVelocity();
int changeInVelocity = abs(velocity.length() - priorVelocity.length());
float changeInEnergy = priorVelocity.length() * changeInVelocity * AVATAR_MOVEMENT_ENERGY_CONSTANT;
priorVelocity = velocity;
@ -2930,7 +2926,7 @@ float MyAvatar::getAudioEnergy() {
}
bool MyAvatar::didTeleport() {
glm::vec3 pos = getPosition();
glm::vec3 pos = getWorldPosition();
glm::vec3 changeInPosition = pos - lastPosition;
lastPosition = pos;
return (changeInPosition.length() > MAX_AVATAR_MOVEMENT_PER_FRAME);
@ -2974,7 +2970,7 @@ glm::mat4 MyAvatar::computeCameraRelativeHandControllerMatrix(const glm::mat4& c
glm::mat4 controllerWorldMatrix = getSensorToWorldMatrix() * delta * controllerSensorMatrix;
// transform controller into avatar space
glm::mat4 avatarMatrix = createMatFromQuatAndPos(getOrientation(), getPosition());
glm::mat4 avatarMatrix = createMatFromQuatAndPos(getWorldOrientation(), getWorldPosition());
return glm::inverse(avatarMatrix) * controllerWorldMatrix;
}
@ -3178,7 +3174,7 @@ bool MyAvatar::pinJoint(int index, const glm::vec3& position, const glm::quat& o
}
slamPosition(position);
setOrientation(orientation);
setWorldOrientation(orientation);
_skeletonModel->getRig().setMaxHipsOffsetLength(0.05f);

Some files were not shown because too many files have changed in this diff.