Merge branch 'master' into 21624

David Rowe 2017-12-13 09:41:09 +13:00
commit 578c42b4ea
229 changed files with 8754 additions and 3754 deletions

.gitignore

@ -15,7 +15,9 @@ Makefile
# Android Studio
*.iml
local.properties
android/libraries
android/gradle*
android/.gradle
android/app/src/main/jniLibs
# VSCode
# List taken from Github Global Ignores master@435c4d92


@ -1,25 +1,23 @@
Please read the [general build guide](BUILD.md) for information on dependencies required for all platforms. Only Android specific instructions are found in this file.
Please read the [general build guide](BUILD.md) for information on building other platforms. Only Android specific instructions are found in this file.
# Android Dependencies
# Dependencies
*Currently Android building is only supported on 64 bit Linux host environments*
You will need the following tools to build our Android targets.
* [Qt](http://www.qt.io/download-open-source/#) ~> 5.9.1
* [Gradle](https://gradle.org/install/)
* [Android Studio](https://developer.android.com/studio/index.html)
* [Google VR SDK](https://github.com/googlevr/gvr-android-sdk/releases)
* [Gradle](https://gradle.org/releases/)
### Qt
### Gradle
Download the Qt online installer. Run the installer and select the android_armv7 binaries. Installing to the default path is recommended
Install gradle version 4.1 or higher. Following the instructions to install via [SDKMAN!](http://sdkman.io/install.html) is recommended.
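For reference, installing Gradle through SDKMAN! typically amounts to something like the following (the version shown is only an example of one that satisfies the 4.1-or-higher requirement):
```
curl -s "https://get.sdkman.io" | bash
source "$HOME/.sdkman/bin/sdkman-init.sh"
sdk install gradle 4.1
```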
### Android Studio
Download the Android Studio installer and run it. Once installed, at the welcome screen, click Configure in the lower right corner and select SDK Manager.
From the SDK Platforms tab, select API level 26.
* Install the ARM EABI v7a System Image if you want to run an emulator.
From the SDK Platforms tab, select API levels 24 and 26.
From the SDK Tools tab select the following (a command-line alternative is sketched after this list):
@ -29,123 +27,41 @@ From the SDK Tools tab select the following
* LLDB
* Android SDK Platform-Tools
* Android SDK Tools
* Android SDK Tools
* NDK (even if you have the NDK installed separately)
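If you prefer to script this step, the same components can usually be installed with the `sdkmanager` tool that ships with the SDK; the package names below are illustrative and may differ between SDK releases:
```
sdkmanager "platform-tools" "tools" "platforms;android-24" "platforms;android-26" "ndk-bundle"
```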
### Google VR SDK
# Environment
Download the 1.8 Google VR SDK [release](https://github.com/googlevr/gvr-android-sdk/archive/v1.80.0.zip). Unzip the archive to a location on your drive.
### Gradle
Download [Gradle 4.1](https://services.gradle.org/distributions/gradle-4.1-all.zip) and unzip it on your local drive. You may wish to add the location of the bin directory within the archive to your path
Setting up the environment for Android builds requires some additional steps.
#### Set up machine specific Gradle properties
Create a `gradle.properties` file in ~/.gradle. Edit the file to contain the following
Create a `gradle.properties` file in $HOME/.gradle. Edit the file to contain the following
QT5_ROOT=C\:\\Qt\\5.9.1\\android_armv7
GVR_ROOT=C\:\\Android\\gvr-android-sdk
HIFI_ANDROID_PRECOMPILED=<your_home_directory>/Android/hifi_externals
Replace the paths with your local installations of Qt5 and the Google VR SDK
Note, do not use `$HOME` for the path. It must be a fully qualified path name.
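On a Linux host (currently the only supported build host), a fully qualified `gradle.properties` might look roughly like this, with the paths adjusted to wherever you installed Qt, the GVR SDK and the precompiled externals:
```
QT5_ROOT=/home/yourname/Qt/5.9.1/android_armv7
GVR_ROOT=/home/yourname/Android/gvr-android-sdk
HIFI_ANDROID_PRECOMPILED=/home/yourname/Android/hifi_externals
```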
### Setup the repository
Clone the repository
`git clone https://github.com/highfidelity/hifi.git`
Enter the repository `android` directory
`cd hifi/android`
Execute a gradle pre-build setup. This step should only need to be done once
`gradle setupDependencies`
# TODO fix the rest
# Building & Running
You will also need to cross-compile the dependencies required for all platforms for Android, and help CMake find these compiled libraries on your machine.
* Open Android Studio
* Choose _Open Existing Android Studio Project_
* Navigate to the `hifi` repository and choose the `android` folder and select _OK_
* If Android Studio asks you if you want to use the Gradle wrapper, select cancel and tell it where your local gradle installation is. If you used SDKMAN to install gradle it will be located in `$HOME/.sdkman/candidates/gradle/current/`
* From the _Build_ menu select _Make Project_
* Once the build completes, from the _Run_ menu select _Run App_
#### Scribe
High Fidelity has a shader pre-processing tool called `scribe` that various libraries will call on during the build process. You must compile scribe using your native toolchain (following the build instructions for your platform) and then pass a CMake variable or set an ENV variable `SCRIBE_PATH` that is a path to the scribe executable.
CMake will fatally error if it does not find the scribe executable while using the android toolchain.
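For example, assuming you built scribe into a desktop build tree at `~/hifi-desktop-build/tools/scribe` (an illustrative path, not a required layout), either of the following would work:
```
export SCRIBE_PATH=$HOME/hifi-desktop-build/tools/scribe/scribe
```
or pass `-DSCRIBE_PATH=$HOME/hifi-desktop-build/tools/scribe/scribe` on the CMake command line.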
#### Optional Components
* [Oculus Mobile SDK](https://developer.oculus.com/downloads/#sdk=mobile) ~> 0.4.2
#### ANDROID_LIB_DIR
Since you won't be installing Android dependencies to system paths on your development machine, CMake will need a little help tracking down your Android dependencies.
This is most easily accomplished by installing all Android dependencies in the same folder. You can place this folder wherever you like on your machine. In this build guide and across our CMakeLists files this folder is referred to as `ANDROID_LIB_DIR`. You can set `ANDROID_LIB_DIR` in your environment or pass it when you run CMake.
#### Qt
Install Qt 5.5.1 for Android for your host environment from the [Qt downloads page](http://www.qt.io/download/). Install Qt to ``$ANDROID_LIB_DIR/Qt``. This is required so that our root CMakeLists file can help CMake find your Android Qt installation.
The component required for the Android build is the `Android armv7` component.
If you would like to install Qt to a different location, or attempt to build with a different Qt version, you can pass `ANDROID_QT_CMAKE_PREFIX_PATH` to CMake. Point to the `cmake` folder inside `$VERSION_NUMBER/android_armv7/lib`. Otherwise, our root CMakeLists will set it to `$ANDROID_LIB_DIR/Qt/5.5/android_armv7/lib/cmake`.
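Putting those two pieces together, a typical setup might look like the sketch below; the install locations are only an example:
```
export ANDROID_LIB_DIR=$HOME/Android/libs
# Only needed if Qt is not installed at $ANDROID_LIB_DIR/Qt/5.5/android_armv7
cmake .. -DANDROID_QT_CMAKE_PREFIX_PATH=$ANDROID_LIB_DIR/Qt/5.5/android_armv7/lib/cmake
```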
#### OpenSSL
Cross-compilation of OpenSSL has been tested from an OS X machine running 10.10 compiling OpenSSL 1.0.2. It is likely that the steps below will work for OpenSSL versions other than 1.0.2.
The original instructions to compile OpenSSL for Android from your host environment can be found [here](http://wiki.openssl.org/index.php/Android). We required some tweaks to get OpenSSL to compile successfully; those tweaks are explained below.
Download the [OpenSSL source](https://www.openssl.org/source/) and extract the tarball inside your `ANDROID_LIB_DIR`. Rename the extracted folder to `openssl`.
You will need the [setenv-android.sh script](http://wiki.openssl.org/index.php/File:Setenv-android.sh) from the OpenSSL wiki.
You must change three values at the top of the `setenv-android.sh` script - `_ANDROID_NDK`, `_ANDROID_EABI` and `_ANDROID_API`.
`_ANDROID_NDK` should be `android-ndk-r10`, `_ANDROID_EABI` should be `arm-linux-androideabi-4.9` and `_ANDROID_API` should be `19`.
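With those edits, the relevant lines at the top of `setenv-android.sh` would read approximately:
```
_ANDROID_NDK="android-ndk-r10"
_ANDROID_EABI="arm-linux-androideabi-4.9"
_ANDROID_API="19"
```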
First, make sure `ANDROID_NDK_ROOT` is set in your env. This should be the path to the root of your Android NDK install. `setenv-android.sh` needs `ANDROID_NDK_ROOT` to set the environment variables required for building OpenSSL.
Source the `setenv-android.sh` script so it can set environment variables that OpenSSL will use while compiling. If you use zsh as your shell you may need to modify the `setenv-android.sh` for it to set the correct variables in your env.
```
export ANDROID_NDK_ROOT=YOUR_NDK_ROOT
source setenv-android.sh
```
Then, from the OpenSSL directory, run the following commands.
```
perl -pi -e 's/install: all install_docs install_sw/install: install_docs install_sw/g' Makefile.org
./config shared -no-ssl2 -no-ssl3 -no-comp -no-hw -no-engine --openssldir=/usr/local/ssl/$ANDROID_API
make depend
make all
```
This should generate libcrypto and libssl in the root of the OpenSSL directory. YOU MUST remove the `libssl.so` and `libcrypto.so` files that are generated. They are symlinks to `libssl.so.VER` and `libcrypto.so.VER` which Android does not know how to handle. By removing `libssl.so` and `libcrypto.so` the FindOpenSSL module will find the static libs and use those instead.
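Concretely, once the build finishes:
```
cd $ANDROID_LIB_DIR/openssl
rm libssl.so libcrypto.so
```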
If you have been building other components it is possible that the OpenSSL compile will fail based on the values other cross-compilations (tbb, bullet) have set. Ensure that you are in a new terminal window to avoid compilation errors from previously set environment variables.
#### Oculus Mobile SDK
The Oculus Mobile SDK is optional, for Gear VR support. It is not required to compile gvr-interface.
Download the [Oculus Mobile SDK](https://developer.oculus.com/downloads/#sdk=mobile) and extract the archive inside your `ANDROID_LIB_DIR` folder. Rename the extracted folder to `libovr`.
From the VRLib directory, use ndk-build to build VrLib.
```
cd VRLib
ndk-build
```
This will create the liboculus.a archive that our FindLibOVR module will look for when cmake is run.
##### Hybrid testing
Currently the 'vr_dual' mode that would allow us to run a hybrid app has limited support in the Oculus Mobile SDK. The best way to have an application we can launch without having to connect to the GearVR is to put the Gear VR Service into developer mode. This stops Oculus Home from taking over the device when it is plugged into the Gear VR headset, and allows the application to be launched from the Applications page.
To put the Gear VR Service into developer mode you need an application with an Oculus Signature File on your device. Generate an Oculus Signature File for your device on the [Oculus osig tool page](https://developer.oculus.com/tools/osig/). Place this file in the gvr-interface/assets directory. CMake will automatically copy it into your APK in the right place when you execute `make gvr-interface-apk`.
Once the application is on your device, go to `Settings->Application Manager->Gear VR Service->Manage Storage`. Tap on `VR Service Version` six times. It will scan your device to verify that you have an osig file in an application on your device, and then it will let you enable Developer mode.
### CMake
We use CMake to generate the makefiles that compile and deploy the Android APKs to your device. In order to create Makefiles for the Android targets, CMake requires that some environment variables are set, and that other variables are passed to it when it is run.
The following must be set in your environment:
* ANDROID_NDK - the root of your Android NDK install
* ANDROID_HOME - the root of your Android SDK install
* ANDROID_LIB_DIR - the directory containing cross-compiled versions of dependencies
The following must be passed to CMake when it is run (see the sketch after this list):
* USE_ANDROID_TOOLCHAIN - set to true to build for Android
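A complete invocation, using hypothetical install locations, might therefore look like:
```
export ANDROID_NDK=$HOME/Android/android-ndk
export ANDROID_HOME=$HOME/Android/sdk
export ANDROID_LIB_DIR=$HOME/Android/libs
mkdir android-build && cd android-build
cmake .. -DUSE_ANDROID_TOOLCHAIN=true
```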


@ -1,8 +1,10 @@
set(TARGET_NAME native-lib)
setup_hifi_library()
link_hifi_libraries(shared networking gl gpu gpu-gles render-utils)
autoscribe_shader_lib(gpu model render render-utils)
target_opengl()
link_hifi_libraries(shared networking gl gpu gpu-gles image fbx render-utils physics)
target_link_libraries(native-lib android log m)
target_include_directories(native-lib PRIVATE "${GVR_ROOT}/libraries/headers")
target_link_libraries(native-lib "C:/Users/bdavis/Git/hifi/android/libraries/jni/armeabi-v7a/libgvr.so")
target_opengl()
target_googlevr()


@ -1,27 +1,32 @@
apply plugin: 'com.android.application'
ext.RELEASE_NUMBER = project.hasProperty('RELEASE_NUMBER') ? project.getProperty('RELEASE_NUMBER') : '0'
ext.RELEASE_TYPE = project.hasProperty('RELEASE_TYPE') ? project.getProperty('RELEASE_TYPE') : 'DEV'
ext.BUILD_BRANCH = project.hasProperty('BUILD_BRANCH') ? project.getProperty('BUILD_BRANCH') : ''
android {
compileSdkVersion 26
buildToolsVersion "26.0.1"
defaultConfig {
applicationId "org.saintandreas.testapp"
minSdkVersion 24
targetSdkVersion 26
versionCode 1
versionName "1.0"
ndk { abiFilters 'armeabi-v7a' }
ndk { abiFilters 'arm64-v8a' }
externalNativeBuild {
cmake {
arguments '-DHIFI_ANDROID=1',
'-DANDROID_PLATFORM=android-24',
'-DANDROID_TOOLCHAIN=clang',
'-DANDROID_STL=gnustl_shared',
'-DGVR_ROOT=' + GVR_ROOT,
'-DNATIVE_SCRIBE=c:/bin/scribe.exe',
"-DHIFI_ANDROID_PRECOMPILED=${project.rootDir}/libraries/jni/armeabi-v7a"
'-DANDROID_STL=c++_shared',
'-DQT_CMAKE_PREFIX_PATH=' + HIFI_ANDROID_PRECOMPILED + '/qt/lib/cmake',
'-DNATIVE_SCRIBE=' + HIFI_ANDROID_PRECOMPILED + '/scribe',
'-DHIFI_ANDROID_PRECOMPILED=' + HIFI_ANDROID_PRECOMPILED,
'-DRELEASE_NUMBER=' + RELEASE_NUMBER,
'-DRELEASE_TYPE=' + RELEASE_TYPE,
'-DBUILD_BRANCH=' + BUILD_BRANCH
}
}
jackOptions { enabled true }
compileOptions {
sourceCompatibility JavaVersion.VERSION_1_8
targetCompatibility JavaVersion.VERSION_1_8
@ -29,17 +34,20 @@ android {
}
buildTypes {
applicationVariants.all { variant ->
variant.outputs.all {
if (RELEASE_NUMBER != '0') {
outputFileName = "app_" + RELEASE_NUMBER + "_" + RELEASE_TYPE + ".apk"
}
}
}
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
}
}
sourceSets {
main {
jniLibs.srcDirs += '../libraries/jni';
}
}
externalNativeBuild {
cmake {
path '../../CMakeLists.txt'
@ -53,5 +61,3 @@ dependencies {
compile 'com.google.vr:sdk-audio:1.80.0'
compile 'com.google.vr:sdk-base:1.80.0'
}
build.dependsOn(':extractQt5')


@ -7,12 +7,10 @@
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/>
<uses-feature android:name="android.hardware.sensor.accelerometer" android:required="true"/>
<uses-feature android:name="android.hardware.sensor.gyroscope" android:required="true"/>
<uses-feature android:name="android.software.vr.mode" android:required="false"/>
<uses-feature android:name="android.hardware.vr.high_performance" android:required="false"/>
<application
android:allowBackup="true"
android:theme="@style/VrActivityTheme"
android:theme="@style/NoSystemUI"
android:icon="@mipmap/ic_launcher"
android:roundIcon="@mipmap/ic_launcher_round">
<activity
@ -20,17 +18,10 @@
android:label="@string/app_name"
android:screenOrientation="landscape"
android:configChanges="orientation|keyboardHidden|screenSize"
android:enableVrMode="@string/gvr_vr_mode_component"
android:resizeableActivity="false">
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.LAUNCHER" />
<category android:name="com.google.intent.category.DAYDREAM"/>
</intent-filter>
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.LAUNCHER" />
<category android:name="com.google.intent.category.CARDBOARD" />
</intent-filter>
</activity>
</application>


@ -37,12 +37,7 @@ extern "C" {
JNI_METHOD(jlong, nativeCreateRenderer)
(JNIEnv *env, jclass clazz, jobject class_loader, jobject android_context, jlong native_gvr_api) {
qInstallMessageHandler(messageHandler);
#if defined(GVR)
auto gvrContext = reinterpret_cast<gvr_context *>(native_gvr_api);
return toJni(new NativeRenderer(gvrContext));
#else
return toJni(new NativeRenderer(nullptr));
#endif
return toJni(new NativeRenderer());
}
JNI_METHOD(void, nativeDestroyRenderer)


@ -1,138 +1,14 @@
#include "renderer.h"
#include <mutex>
#include <glm/gtc/matrix_transform.hpp>
#include <QtCore/QDebug>
#include <gl/Config.h>
#include "GoogleVRHelpers.h"
#include <gl/GLShaders.h>
#include <shared/Bilateral.h>
#include <gpu/DrawTransformUnitQuad_vert.h>
#include <gpu/DrawTexcoordRectTransformUnitQuad_vert.h>
#include <gpu/DrawViewportQuadTransformTexcoord_vert.h>
#include <gpu/DrawTexture_frag.h>
#include <gpu/DrawTextureOpaque_frag.h>
#include <gpu/DrawColoredTexture_frag.h>
#include <render-utils/simple_vert.h>
#include <render-utils/simple_frag.h>
#include <render-utils/simple_textured_frag.h>
#include <render-utils/simple_textured_unlit_frag.h>
#include <render-utils/deferred_light_vert.h>
#include <render-utils/deferred_light_point_vert.h>
#include <render-utils/deferred_light_spot_vert.h>
#include <render-utils/directional_ambient_light_frag.h>
#include <render-utils/directional_skybox_light_frag.h>
#include <render-utils/standardTransformPNTC_vert.h>
#include <render-utils/standardDrawTexture_frag.h>
#include <render-utils/model_vert.h>
#include <render-utils/model_shadow_vert.h>
#include <render-utils/model_normal_map_vert.h>
#include <render-utils/model_lightmap_vert.h>
#include <render-utils/model_lightmap_normal_map_vert.h>
#include <render-utils/skin_model_vert.h>
#include <render-utils/skin_model_shadow_vert.h>
#include <render-utils/skin_model_normal_map_vert.h>
#include <render-utils/model_frag.h>
#include <render-utils/model_shadow_frag.h>
#include <render-utils/model_normal_map_frag.h>
#include <render-utils/model_normal_specular_map_frag.h>
#include <render-utils/model_specular_map_frag.h>
#include <render-utils/model_lightmap_frag.h>
#include <render-utils/model_lightmap_normal_map_frag.h>
#include <render-utils/model_lightmap_normal_specular_map_frag.h>
#include <render-utils/model_lightmap_specular_map_frag.h>
#include <render-utils/model_translucent_frag.h>
#include <render-utils/overlay3D_vert.h>
#include <render-utils/overlay3D_frag.h>
#include <render-utils/sdf_text3D_vert.h>
#include <render-utils/sdf_text3D_frag.h>
#if 0
#include <model/skybox_vert.h>
#include <model/skybox_frag.h>
#include <entities-renderer/textured_particle_frag.h>
#include <entities-renderer/textured_particle_vert.h>
#include <entities-renderer/paintStroke_vert.h>
#include <entities-renderer/paintStroke_frag.h>
#include <entities-renderer/polyvox_vert.h>
#include <entities-renderer/polyvox_frag.h>
#endif
template <typename F>
void withFrameBuffer(gvr::Frame& frame, int32_t index, F f) {
frame.BindBuffer(index);
f();
frame.Unbind();
}
static const uint64_t kPredictionTimeWithoutVsyncNanos = 50000000;
// Each shader has two variants: a single-eye ES 2.0 variant, and a multiview
// ES 3.0 variant. The multiview vertex shaders use transforms defined by
// arrays of mat4 uniforms, using gl_ViewID_OVR to determine the array index.
#define UNIFORM_LIGHT_POS 20
#define UNIFORM_M 16
#define UNIFORM_MV 8
#define UNIFORM_MVP 0
#if 0
uniform Transform { // API uses “Transform[2]” to refer to instance 2
mat4 u_MVP[2];
mat4 u_MVMatrix[2];
mat4 u_Model;
vec3 u_LightPos[2];
};
static const char *kDiffuseLightingVertexShader = R"glsl(
#version 300 es
#extension GL_OVR_multiview2 : enable
layout(num_views=2) in;
layout(location = 0) uniform mat4 u_MVP[2];
layout(location = 8) uniform mat4 u_MVMatrix[2];
layout(location = 16) uniform mat4 u_Model;
layout(location = 20) uniform vec3 u_LightPos[2];
layout(location = 0) in vec4 a_Position;
layout(location = 1) in vec4 a_Color;
layout(location = 2) in vec3 a_Normal;
out vec4 v_Color;
out vec3 v_Grid;
void main() {
mat4 mvp = u_MVP[gl_ViewID_OVR];
mat4 modelview = u_MVMatrix[gl_ViewID_OVR];
vec3 lightpos = u_LightPos[gl_ViewID_OVR];
v_Grid = vec3(u_Model * a_Position);
vec3 modelViewVertex = vec3(modelview * a_Position);
vec3 modelViewNormal = vec3(modelview * vec4(a_Normal, 0.0));
float distance = length(lightpos - modelViewVertex);
vec3 lightVector = normalize(lightpos - modelViewVertex);
float diffuse = max(dot(modelViewNormal, lightVector), 0.5);
diffuse = diffuse * (1.0 / (1.0 + (0.00001 * distance * distance)));
v_Color = vec4(a_Color.rgb * diffuse, a_Color.a);
gl_Position = mvp * a_Position;
}
)glsl";
#endif
static const char *kSimepleVertexShader = R"glsl(
#version 300 es
static const char *kSimepleVertexShader = R"glsl(#version 300 es
#extension GL_OVR_multiview2 : enable
layout(num_views=2) in;
@ -147,9 +23,7 @@ void main() {
}
)glsl";
static const char *kPassthroughFragmentShader = R"glsl(
#version 300 es
static const char *kPassthroughFragmentShader = R"glsl(#version 300 es
precision mediump float;
in vec4 v_Color;
out vec4 FragColor;
@ -157,6 +31,17 @@ out vec4 FragColor;
void main() { FragColor = v_Color; }
)glsl";
int LoadGLShader(int type, const char *shadercode) {
GLuint result = 0;
std::string shaderError;
static const std::string SHADER_DEFINES;
if (!gl::compileShader(type, shadercode, SHADER_DEFINES, result, shaderError)) {
qWarning() << "QQQ" << __FUNCTION__ << "Shader compile failure" << shaderError.c_str();
}
return result;
}
static void CheckGLError(const char* label) {
int gl_error = glGetError();
if (gl_error != GL_NO_ERROR) {
@ -167,158 +52,6 @@ static void CheckGLError(const char* label) {
}
// Contains vertex, normal and other data.
namespace cube {
const std::array<float, 108> CUBE_COORDS{{
// Front face
-1.0f, 1.0f, 1.0f,
-1.0f, -1.0f, 1.0f,
1.0f, 1.0f, 1.0f,
-1.0f, -1.0f, 1.0f,
1.0f, -1.0f, 1.0f,
1.0f, 1.0f, 1.0f,
// Right face
1.0f, 1.0f, 1.0f,
1.0f, -1.0f, 1.0f,
1.0f, 1.0f, -1.0f,
1.0f, -1.0f, 1.0f,
1.0f, -1.0f, -1.0f,
1.0f, 1.0f, -1.0f,
// Back face
1.0f, 1.0f, -1.0f,
1.0f, -1.0f, -1.0f,
-1.0f, 1.0f, -1.0f,
1.0f, -1.0f, -1.0f,
-1.0f, -1.0f, -1.0f,
-1.0f, 1.0f, -1.0f,
// Left face
-1.0f, 1.0f, -1.0f,
-1.0f, -1.0f, -1.0f,
-1.0f, 1.0f, 1.0f,
-1.0f, -1.0f, -1.0f,
-1.0f, -1.0f, 1.0f,
-1.0f, 1.0f, 1.0f,
// Top face
-1.0f, 1.0f, -1.0f,
-1.0f, 1.0f, 1.0f,
1.0f, 1.0f, -1.0f,
-1.0f, 1.0f, 1.0f,
1.0f, 1.0f, 1.0f,
1.0f, 1.0f, -1.0f,
// Bottom face
1.0f, -1.0f, -1.0f,
1.0f, -1.0f, 1.0f,
-1.0f, -1.0f, -1.0f,
1.0f, -1.0f, 1.0f,
-1.0f, -1.0f, 1.0f,
-1.0f, -1.0f, -1.0f
}};
const std::array<float, 108> CUBE_COLORS{{
// front, green
0.0f, 0.5273f, 0.2656f,
0.0f, 0.5273f, 0.2656f,
0.0f, 0.5273f, 0.2656f,
0.0f, 0.5273f, 0.2656f,
0.0f, 0.5273f, 0.2656f,
0.0f, 0.5273f, 0.2656f,
// right, blue
0.0f, 0.3398f, 0.9023f,
0.0f, 0.3398f, 0.9023f,
0.0f, 0.3398f, 0.9023f,
0.0f, 0.3398f, 0.9023f,
0.0f, 0.3398f, 0.9023f,
0.0f, 0.3398f, 0.9023f,
// back, also green
0.0f, 0.5273f, 0.2656f,
0.0f, 0.5273f, 0.2656f,
0.0f, 0.5273f, 0.2656f,
0.0f, 0.5273f, 0.2656f,
0.0f, 0.5273f, 0.2656f,
0.0f, 0.5273f, 0.2656f,
// left, also blue
0.0f, 0.3398f, 0.9023f,
0.0f, 0.3398f, 0.9023f,
0.0f, 0.3398f, 0.9023f,
0.0f, 0.3398f, 0.9023f,
0.0f, 0.3398f, 0.9023f,
0.0f, 0.3398f, 0.9023f,
// top, red
0.8359375f, 0.17578125f, 0.125f,
0.8359375f, 0.17578125f, 0.125f,
0.8359375f, 0.17578125f, 0.125f,
0.8359375f, 0.17578125f, 0.125f,
0.8359375f, 0.17578125f, 0.125f,
0.8359375f, 0.17578125f, 0.125f,
// bottom, also red
0.8359375f, 0.17578125f, 0.125f,
0.8359375f, 0.17578125f, 0.125f,
0.8359375f, 0.17578125f, 0.125f,
0.8359375f, 0.17578125f, 0.125f,
0.8359375f, 0.17578125f, 0.125f,
0.8359375f, 0.17578125f, 0.125f
}};
const std::array<float, 108> CUBE_NORMALS{{
// Front face
0.0f, 0.0f, 1.0f,
0.0f, 0.0f, 1.0f,
0.0f, 0.0f, 1.0f,
0.0f, 0.0f, 1.0f,
0.0f, 0.0f, 1.0f,
0.0f, 0.0f, 1.0f,
// Right face
1.0f, 0.0f, 0.0f,
1.0f, 0.0f, 0.0f,
1.0f, 0.0f, 0.0f,
1.0f, 0.0f, 0.0f,
1.0f, 0.0f, 0.0f,
1.0f, 0.0f, 0.0f,
// Back face
0.0f, 0.0f, -1.0f,
0.0f, 0.0f, -1.0f,
0.0f, 0.0f, -1.0f,
0.0f, 0.0f, -1.0f,
0.0f, 0.0f, -1.0f,
0.0f, 0.0f, -1.0f,
// Left face
-1.0f, 0.0f, 0.0f,
-1.0f, 0.0f, 0.0f,
-1.0f, 0.0f, 0.0f,
-1.0f, 0.0f, 0.0f,
-1.0f, 0.0f, 0.0f,
-1.0f, 0.0f, 0.0f,
// Top face
0.0f, 1.0f, 0.0f,
0.0f, 1.0f, 0.0f,
0.0f, 1.0f, 0.0f,
0.0f, 1.0f, 0.0f,
0.0f, 1.0f, 0.0f,
0.0f, 1.0f, 0.0f,
// Bottom face
0.0f, -1.0f, 0.0f,
0.0f, -1.0f, 0.0f,
0.0f, -1.0f, 0.0f,
0.0f, -1.0f, 0.0f,
0.0f, -1.0f, 0.0f,
0.0f, -1.0f, 0.0f
}};
}
namespace triangle {
static std::array<float, 9> TRIANGLE_VERTS {{
-0.5f, -0.5f, 0.0f,
@ -327,229 +60,36 @@ namespace triangle {
}};
}
std::array<gvr::BufferViewport, 2> buildViewports(const std::unique_ptr<gvr::GvrApi> &gvrapi) {
return { {gvrapi->CreateBufferViewport(), gvrapi->CreateBufferViewport()} };
};
const std::string VERTEX_SHADER_DEFINES{ R"GLSL(
#version 300 es
#extension GL_EXT_clip_cull_distance : enable
#define GPU_VERTEX_SHADER
#define GPU_SSBO_TRANSFORM_OBJECT 1
#define GPU_TRANSFORM_IS_STEREO
#define GPU_TRANSFORM_STEREO_CAMERA
#define GPU_TRANSFORM_STEREO_CAMERA_INSTANCED
#define GPU_TRANSFORM_STEREO_SPLIT_SCREEN
)GLSL" };
const std::string PIXEL_SHADER_DEFINES{ R"GLSL(
#version 300 es
precision mediump float;
#define GPU_PIXEL_SHADER
#define GPU_TRANSFORM_IS_STEREO
#define GPU_TRANSFORM_STEREO_CAMERA
#define GPU_TRANSFORM_STEREO_CAMERA_INSTANCED
#define GPU_TRANSFORM_STEREO_SPLIT_SCREEN
)GLSL" };
#if defined(GVR)
NativeRenderer::NativeRenderer(gvr_context *vrContext) :
_gvrapi(new gvr::GvrApi(vrContext, false)),
_viewports(buildViewports(_gvrapi)),
_gvr_viewer_type(_gvrapi->GetViewerType())
#else
NativeRenderer::NativeRenderer(void *vrContext)
#endif
{
start = std::chrono::system_clock::now();
qDebug() << "QQQ" << __FUNCTION__;
}
/**
* Converts a raw text file, saved as a resource, into an OpenGL ES shader.
*
* @param type The type of shader we will be creating.
* @param resId The resource ID of the raw text file.
* @return The shader object handler.
*/
int LoadGLShader(int type, const char *shadercode) {
GLuint result = 0;
std::string shaderError;
static const std::string SHADER_DEFINES;
if (!gl::compileShader(type, shadercode, SHADER_DEFINES, result, shaderError)) {
qWarning() << "QQQ" << __FUNCTION__ << "Shader compile failure" << shaderError.c_str();
}
return result;
}
// Computes a texture size that has approximately half as many pixels. This is
// equivalent to scaling each dimension by approximately sqrt(2)/2.
static gvr::Sizei HalfPixelCount(const gvr::Sizei &in) {
// Scale each dimension by sqrt(2)/2 ~= 7/10ths.
gvr::Sizei out;
out.width = (7 * in.width) / 10;
out.height = (7 * in.height) / 10;
return out;
}
#if defined(GVR)
void NativeRenderer::InitializeVR() {
_gvrapi->InitializeGl();
bool multiviewEnabled = _gvrapi->IsFeatureSupported(GVR_FEATURE_MULTIVIEW);
qWarning() << "QQQ" << __FUNCTION__ << "Multiview enabled " << multiviewEnabled;
// Because we are using 2X MSAA, we can render to half as many pixels and
// achieve similar quality.
_renderSize = HalfPixelCount(_gvrapi->GetMaximumEffectiveRenderTargetSize());
std::vector<gvr::BufferSpec> specs;
specs.push_back(_gvrapi->CreateBufferSpec());
specs[0].SetColorFormat(GVR_COLOR_FORMAT_RGBA_8888);
specs[0].SetDepthStencilFormat(GVR_DEPTH_STENCIL_FORMAT_DEPTH_16);
specs[0].SetSamples(2);
gvr::Sizei half_size = {_renderSize.width / 2, _renderSize.height};
specs[0].SetMultiviewLayers(2);
specs[0].SetSize(half_size);
_swapchain.reset(new gvr::SwapChain(_gvrapi->CreateSwapChain(specs)));
_viewportlist.reset(new gvr::BufferViewportList(_gvrapi->CreateEmptyBufferViewportList()));
}
void NativeRenderer::PrepareFramebuffer() {
const gvr::Sizei recommended_size = HalfPixelCount(
_gvrapi->GetMaximumEffectiveRenderTargetSize());
if (_renderSize.width != recommended_size.width ||
_renderSize.height != recommended_size.height) {
// We need to resize the framebuffer. Note that multiview uses two texture
// layers, each with half the render width.
gvr::Sizei framebuffer_size = recommended_size;
framebuffer_size.width /= 2;
_swapchain->ResizeBuffer(0, framebuffer_size);
_renderSize = recommended_size;
}
}
#endif
void testShaderBuild(const char* vs_src, const char * fs_src) {
std::string error;
GLuint vs, fs;
if (!gl::compileShader(GL_VERTEX_SHADER, vs_src, VERTEX_SHADER_DEFINES, vs, error) ||
!gl::compileShader(GL_FRAGMENT_SHADER, fs_src, PIXEL_SHADER_DEFINES, fs, error)) {
throw std::runtime_error("Failed to compile shader");
}
auto pr = gl::compileProgram({ vs, fs }, error);
if (!pr) {
throw std::runtime_error("Failed to link shader");
}
}
void NativeRenderer::InitializeGl() {
qDebug() << "QQQ" << __FUNCTION__;
//gl::initModuleGl();
#if defined(GVR)
InitializeVR();
#endif
glDisable(GL_DEPTH_TEST);
glDisable(GL_CULL_FACE);
glDisable(GL_SCISSOR_TEST);
glDisable(GL_BLEND);
const uint32_t vertShader = LoadGLShader(GL_VERTEX_SHADER, kSimepleVertexShader);
//const uint32_t vertShader = LoadGLShader(GL_VERTEX_SHADER, kDiffuseLightingVertexShader);
const uint32_t fragShader = LoadGLShader(GL_FRAGMENT_SHADER, kPassthroughFragmentShader);
std::string error;
_cubeProgram = gl::compileProgram({ vertShader, fragShader }, error);
_program = gl::compileProgram({ vertShader, fragShader }, error);
CheckGLError("build program");
glGenBuffers(1, &_cubeBuffer);
glBindBuffer(GL_ARRAY_BUFFER, _cubeBuffer);
glGenBuffers(1, &_geometryBuffer);
glBindBuffer(GL_ARRAY_BUFFER, _geometryBuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(float) * 9, triangle::TRIANGLE_VERTS.data(), GL_STATIC_DRAW);
/*
glBufferData(GL_ARRAY_BUFFER, sizeof(float) * 108 * 3, NULL, GL_STATIC_DRAW);
glBufferSubData(GL_ARRAY_BUFFER, sizeof(float) * 108 * 0, sizeof(float) * 108, cube::CUBE_COORDS.data());
glBufferSubData(GL_ARRAY_BUFFER, sizeof(float) * 108 * 1, sizeof(float) * 108, cube::CUBE_COLORS.data());
glBufferSubData(GL_ARRAY_BUFFER, sizeof(float) * 108 * 2, sizeof(float) * 108, cube::CUBE_NORMALS.data());
*/
glBindBuffer(GL_ARRAY_BUFFER, 0);
CheckGLError("upload vertices");
glGenVertexArrays(1, &_cubeVao);
glBindBuffer(GL_ARRAY_BUFFER, _cubeBuffer);
glBindVertexArray(_cubeVao);
glGenVertexArrays(1, &_vao);
glBindBuffer(GL_ARRAY_BUFFER, _geometryBuffer);
glBindVertexArray(_vao);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 3 * sizeof(float), (void*)0);
glEnableVertexAttribArray(0);
/*
glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0);
glEnableVertexAttribArray(0);
glVertexAttribPointer(1, 3, GL_FLOAT, false, 0, (const void*)(sizeof(float) * 108 * 1) );
glEnableVertexAttribArray(1);
glVertexAttribPointer(2, 3, GL_FLOAT, false, 0, (const void*)(sizeof(float) * 108 * 2));
glEnableVertexAttribArray(2);
*/
glBindVertexArray(0);
glBindBuffer(GL_ARRAY_BUFFER, 0);
CheckGLError("build vao ");
static std::once_flag once;
std::call_once(once, [&]{
testShaderBuild(sdf_text3D_vert, sdf_text3D_frag);
testShaderBuild(DrawTransformUnitQuad_vert, DrawTexture_frag);
testShaderBuild(DrawTexcoordRectTransformUnitQuad_vert, DrawTexture_frag);
testShaderBuild(DrawViewportQuadTransformTexcoord_vert, DrawTexture_frag);
testShaderBuild(DrawTransformUnitQuad_vert, DrawTextureOpaque_frag);
testShaderBuild(DrawTransformUnitQuad_vert, DrawColoredTexture_frag);
testShaderBuild(simple_vert, simple_frag);
testShaderBuild(simple_vert, simple_textured_frag);
testShaderBuild(simple_vert, simple_textured_unlit_frag);
testShaderBuild(deferred_light_vert, directional_ambient_light_frag);
testShaderBuild(deferred_light_vert, directional_skybox_light_frag);
testShaderBuild(standardTransformPNTC_vert, standardDrawTexture_frag);
testShaderBuild(standardTransformPNTC_vert, DrawTextureOpaque_frag);
testShaderBuild(model_vert, model_frag);
testShaderBuild(model_normal_map_vert, model_normal_map_frag);
testShaderBuild(model_vert, model_specular_map_frag);
testShaderBuild(model_normal_map_vert, model_normal_specular_map_frag);
testShaderBuild(model_vert, model_translucent_frag);
testShaderBuild(model_normal_map_vert, model_translucent_frag);
testShaderBuild(model_lightmap_vert, model_lightmap_frag);
testShaderBuild(model_lightmap_normal_map_vert, model_lightmap_normal_map_frag);
testShaderBuild(model_lightmap_vert, model_lightmap_specular_map_frag);
testShaderBuild(model_lightmap_normal_map_vert, model_lightmap_normal_specular_map_frag);
testShaderBuild(skin_model_vert, model_frag);
testShaderBuild(skin_model_normal_map_vert, model_normal_map_frag);
testShaderBuild(skin_model_vert, model_specular_map_frag);
testShaderBuild(skin_model_normal_map_vert, model_normal_specular_map_frag);
testShaderBuild(skin_model_vert, model_translucent_frag);
testShaderBuild(skin_model_normal_map_vert, model_translucent_frag);
testShaderBuild(model_shadow_vert, model_shadow_frag);
testShaderBuild(overlay3D_vert, overlay3D_frag);
#if 0
testShaderBuild(textured_particle_vert, textured_particle_frag);
testShaderBuild(skybox_vert, skybox_frag);
testShaderBuild(paintStroke_vert,paintStroke_frag);
testShaderBuild(polyvox_vert, polyvox_frag);
#endif
});
qDebug() << "done";
}
static const float kZNear = 1.0f;
static const float kZFar = 100.0f;
static const gvr_rectf fullscreen = {0, 1, 0, 1};
void NativeRenderer::DrawFrame() {
auto now = std::chrono::duration_cast<std::chrono::milliseconds>(
@ -559,65 +99,12 @@ void NativeRenderer::DrawFrame() {
v.g = 1.0f - v.r;
v.b = 1.0f;
PrepareFramebuffer();
// A client app does its rendering here.
gvr::ClockTimePoint target_time = gvr::GvrApi::GetTimePointNow();
target_time.monotonic_system_time_nanos += kPredictionTimeWithoutVsyncNanos;
using namespace googlevr;
using namespace bilateral;
const auto gvrHeadPose = _gvrapi->GetHeadSpaceFromStartSpaceRotation(target_time);
_head_view = toGlm(gvrHeadPose);
_viewportlist->SetToRecommendedBufferViewports();
glm::mat4 eye_views[2];
for_each_side([&](bilateral::Side side) {
int eye = index(side);
const gvr::Eye gvr_eye = eye == 0 ? GVR_LEFT_EYE : GVR_RIGHT_EYE;
const auto& eyeView = eye_views[eye] = toGlm(_gvrapi->GetEyeFromHeadMatrix(gvr_eye)) * _head_view;
auto& viewport = _viewports[eye];
_viewportlist->GetBufferViewport(eye, &viewport);
viewport.SetSourceUv(fullscreen);
viewport.SetSourceLayer(eye);
_viewportlist->SetBufferViewport(eye, viewport);
const auto &mvc = _modelview_cube[eye] = eyeView * _model_cube;
const auto &mvf = _modelview_floor[eye] = eyeView * _model_floor;
const gvr_rectf fov = viewport.GetSourceFov();
const glm::mat4 perspective = perspectiveMatrixFromView(fov, kZNear, kZFar);
_modelview_projection_cube[eye] = perspective * mvc;
_modelview_projection_floor[eye] = perspective * mvf;
_light_pos_eye_space[eye] = glm::vec3(eyeView * _light_pos_world_space);
});
gvr::Frame frame = _swapchain->AcquireFrame();
withFrameBuffer(frame, 0, [&]{
glClearColor(v.r, v.g, v.b, 1);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glViewport(0, 0, _renderSize.width / 2, _renderSize.height);
glUseProgram(_cubeProgram);
glBindVertexArray(_cubeVao);
glDrawArrays(GL_TRIANGLES, 0, 3);
/*
float* fp;
fp = (float*)&_light_pos_eye_space[0];
glUniform3fv(UNIFORM_LIGHT_POS, 2, fp);
fp = (float*)&_modelview_cube[0];
glUniformMatrix4fv(UNIFORM_MV, 2, GL_FALSE, fp);
fp = (float*)&_modelview_projection_cube[0];
glUniformMatrix4fv(UNIFORM_MVP, 2, GL_FALSE, fp);
fp = (float*)&_model_cube;
glUniformMatrix4fv(UNIFORM_M, 1, GL_FALSE, fp);
glDrawArrays(GL_TRIANGLES, 0, 36);
*/
glBindVertexArray(0);
});
frame.Submit(*_viewportlist, gvrHeadPose);
CheckGLError("onDrawFrame");
glClearColor(v.r, v.g, v.b, 1.0f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glUseProgram(_program);
glBindVertexArray(_vao);
glDrawArrays(GL_TRIANGLES, 0, 3);
glBindVertexArray(0);
}
void NativeRenderer::OnTriggerEvent() {
@ -626,11 +113,8 @@ void NativeRenderer::OnTriggerEvent() {
void NativeRenderer::OnPause() {
qDebug() << "QQQ" << __FUNCTION__;
_gvrapi->PauseTracking();
}
void NativeRenderer::OnResume() {
qDebug() << "QQQ" << __FUNCTION__;
_gvrapi->ResumeTracking();
_gvrapi->RefreshViewerProfile();
}


@ -4,21 +4,8 @@
#include <array>
#include <glm/glm.hpp>
#define GVR
#if defined(GVR)
#include <vr/gvr/capi/include/gvr.h>
#endif
class NativeRenderer {
public:
#if defined(GVR)
NativeRenderer(gvr_context* vrContext);
#else
NativeRenderer(void* vrContext);
#endif
void InitializeGl();
void DrawFrame();
void OnTriggerEvent();
@ -26,35 +13,9 @@ public:
void OnResume();
private:
std::chrono::time_point<std::chrono::system_clock> start { std::chrono::system_clock::now() };
std::chrono::time_point<std::chrono::system_clock> start;
#if defined(GVR)
void InitializeVR();
void PrepareFramebuffer();
std::unique_ptr<gvr::GvrApi> _gvrapi;
gvr::ViewerType _gvr_viewer_type;
std::unique_ptr<gvr::BufferViewportList> _viewportlist;
std::unique_ptr<gvr::SwapChain> _swapchain;
std::array<gvr::BufferViewport, 2> _viewports;
gvr::Sizei _renderSize;
#endif
uint32_t _cubeBuffer { 0 };
uint32_t _cubeVao { 0 };
uint32_t _cubeProgram { 0 };
glm::mat4 _head_view;
glm::mat4 _model_cube;
glm::mat4 _camera;
glm::mat4 _view;
glm::mat4 _model_floor;
std::array<glm::mat4, 2> _modelview_cube;
std::array<glm::mat4, 2> _modelview_floor;
std::array<glm::mat4, 2> _modelview_projection_cube;
std::array<glm::mat4, 2> _modelview_projection_floor;
std::array<glm::vec3, 2> _light_pos_eye_space;
const glm::vec4 _light_pos_world_space{ 0, 2, 0, 1};
uint32_t _geometryBuffer { 0 };
uint32_t _vao { 0 };
uint32_t _program { 0 };
};


@ -26,10 +26,9 @@ public class MainActivity extends Activity {
}
private long nativeRenderer;
private GvrLayout gvrLayout;
private GLSurfaceView surfaceView;
private native long nativeCreateRenderer(ClassLoader appClassLoader, Context context, long nativeGvrContext);
private native long nativeCreateRenderer(ClassLoader appClassLoader, Context context);
private native void nativeDestroyRenderer(long renderer);
private native void nativeInitializeGl(long renderer);
private native void nativeDrawFrame(long renderer);
@ -55,30 +54,21 @@ public class MainActivity extends Activity {
if ((visibility & View.SYSTEM_UI_FLAG_FULLSCREEN) == 0) { setImmersiveSticky(); }
});
gvrLayout = new GvrLayout(this);
nativeRenderer = nativeCreateRenderer(
getClass().getClassLoader(),
getApplicationContext(),
gvrLayout.getGvrApi().getNativeGvrContext());
getApplicationContext());
surfaceView = new GLSurfaceView(this);
surfaceView.setEGLContextClientVersion(3);
surfaceView.setEGLConfigChooser(8, 8, 8, 0, 0, 0);
surfaceView.setPreserveEGLContextOnPause(true);
surfaceView.setRenderer(new NativeRenderer());
gvrLayout.setPresentationView(surfaceView);
setContentView(gvrLayout);
if (gvrLayout.setAsyncReprojectionEnabled(true)) {
AndroidCompat.setSustainedPerformanceMode(this, true);
}
AndroidCompat.setVrModeEnabled(this, true);
setContentView(surfaceView);
}
@Override
protected void onDestroy() {
super.onDestroy();
gvrLayout.shutdown();
nativeDestroyRenderer(nativeRenderer);
nativeRenderer = 0;
}
@ -87,14 +77,12 @@ public class MainActivity extends Activity {
protected void onPause() {
surfaceView.queueEvent(()->nativeOnPause(nativeRenderer));
surfaceView.onPause();
gvrLayout.onPause();
super.onPause();
}
@Override
protected void onResume() {
super.onResume();
gvrLayout.onResume();
surfaceView.onResume();
surfaceView.queueEvent(()->nativeOnResume(nativeRenderer));
}


@ -1,91 +1,216 @@
// Top-level build file where you can add configuration options common to all sub-projects/modules.
buildscript {
repositories {
jcenter()
google()
}
dependencies {
classpath 'com.android.tools.build:gradle:2.3.3'
// NOTE: Do not place your application dependencies here; they belong
// in the individual module build.gradle files
classpath 'com.android.tools.build:gradle:3.0.1'
}
}
plugins {
id 'de.undercouch.download' version '3.3.0'
}
allprojects {
repositories {
jcenter()
google()
}
}
def baseFolder = new File(HIFI_ANDROID_PRECOMPILED)
def jniFolder = new File('app/src/main/jniLibs/arm64-v8a')
import org.apache.tools.ant.taskdefs.condition.Os
def baseUrl = 'https://hifi-public.s3.amazonaws.com/austin/android/'
def qtFile='qt-5.9.3_linux_armv8-libcpp.tgz'
def qtChecksum='547da3547d5690144e23d6504c6d6e91'
if (Os.isFamily(Os.FAMILY_MAC)) {
qtFile = 'qt-5.9.3_osx_armv8-libcpp.tgz'
qtChecksum='6fa3e068cfdee863fc909b294a3a0cc6'
} else if (Os.isFamily(Os.FAMILY_WINDOWS)) {
qtFile = 'qt-5.9.3_win_armv8-libcpp.tgz'
qtChecksum='3a757378a7e9dbbfc662177e0eb46408'
}
def packages = [
qt: [
file: qtFile,
checksum: qtChecksum,
sharedLibFolder: '',
includeLibs: ['lib/*.so', 'plugins/*/*.so']
],
bullet: [
file: 'bullet-2.83_armv8-libcpp.tgz',
checksum: '2c558d604fce337f5eba3eb7ec1252fd'
],
draco: [
file: 'draco_armv8-libcpp.tgz',
checksum: '617a80d213a5ec69fbfa21a1f2f738cd'
],
gvr: [
file: 'gvrsdk_v1.101.0.tgz',
checksum: '57fd02baa069176ba18597a29b6b4fc7'
],
openssl: [
file: 'openssl-1.1.0g_armv8.tgz',
checksum: 'cabb681fbccd79594f65fcc266e02f32'
],
polyvox: [
file: 'polyvox_armv8-libcpp.tgz',
checksum: '5c918288741ee754c16aeb12bb46b9e1',
sharedLibFolder: 'lib',
includeLibs: ['Release/libPolyVoxCore.so', 'libPolyVoxUtil.so']
],
tbb: [
file: 'tbb-2018_U1_armv8_libcpp.tgz',
checksum: '20768f298f53b195e71b414b0ae240c4',
sharedLibFolder: 'lib/release',
includeLibs: ['libtbb.so', 'libtbbmalloc.so']
]
]
task downloadDependencies {
doLast {
packages.each { entry ->
def filename = entry.value['file'];
def url = baseUrl + filename;
download {
src url
dest new File(baseFolder, filename)
onlyIfNewer true
}
}
}
}
import de.undercouch.gradle.tasks.download.Verify
task verifyQt(type: Verify) { def p = packages['qt']; src new File(baseFolder, p['file']); checksum p['checksum']; }
task verifyBullet(type: Verify) { def p = packages['bullet']; src new File(baseFolder, p['file']); checksum p['checksum'] }
task verifyDraco(type: Verify) { def p = packages['draco']; src new File(baseFolder, p['file']); checksum p['checksum'] }
task verifyGvr(type: Verify) { def p = packages['gvr']; src new File(baseFolder, p['file']); checksum p['checksum'] }
task verifyOpenSSL(type: Verify) { def p = packages['openssl']; src new File(baseFolder, p['file']); checksum p['checksum'] }
task verifyPolyvox(type: Verify) { def p = packages['polyvox']; src new File(baseFolder, p['file']); checksum p['checksum'] }
task verifyTBB(type: Verify) { def p = packages['tbb']; src new File(baseFolder, p['file']); checksum p['checksum'] }
task verifyDependencyDownloads(dependsOn: downloadDependencies) { }
verifyDependencyDownloads.dependsOn verifyQt
verifyDependencyDownloads.dependsOn verifyBullet
verifyDependencyDownloads.dependsOn verifyDraco
verifyDependencyDownloads.dependsOn verifyGvr
verifyDependencyDownloads.dependsOn verifyOpenSSL
verifyDependencyDownloads.dependsOn verifyPolyvox
verifyDependencyDownloads.dependsOn verifyTBB
task extractDependencies(dependsOn: verifyDependencyDownloads) {
doLast {
packages.each { entry ->
def folder = entry.key;
def filename = entry.value['file'];
def localFile = new File(HIFI_ANDROID_PRECOMPILED, filename)
def localFolder = new File(HIFI_ANDROID_PRECOMPILED, folder)
copy {
from tarTree(resources.gzip(localFile))
into localFolder
}
}
}
}
task copyDependencies(dependsOn: extractDependencies) {
doLast {
packages.each { entry ->
def packageName = entry.key
def currentPackage = entry.value;
if (currentPackage.containsKey('sharedLibFolder')) {
def localFolder = new File(baseFolder, packageName + '/' + currentPackage['sharedLibFolder'])
def tree = fileTree(localFolder);
if (currentPackage.containsKey('includeLibs')) {
currentPackage['includeLibs'].each { includeSpec -> tree.include includeSpec }
}
tree.visit { element ->
if (!element.file.isDirectory()) {
copy { from element.file; into jniFolder }
}
}
}
}
}
}
def scribeFile='scribe_linux_x86_64'
def scribeLocalFile='scribe'
def scribeChecksum='c98678d9726bd8bbf1bab792acf3ff6c'
if (Os.isFamily(Os.FAMILY_MAC)) {
scribeFile = 'scribe_osx_x86_64'
scribeChecksum='a137ad62c1bf7cca739da219544a9a16'
} else if (Os.isFamily(Os.FAMILY_WINDOWS)) {
scribeFile = 'scribe_win32_x86_64.exe'
scribeLocalFile = 'scribe.exe'
scribeChecksum='75c2ce9ed45d17de375e3988bfaba816'
}
import de.undercouch.gradle.tasks.download.Download
task downloadScribe(type: Download) {
src baseUrl + scribeFile
dest new File(baseFolder, scribeLocalFile)
onlyIfNewer true
}
task verifyScribe (type: Verify, dependsOn: downloadScribe) {
src new File(baseFolder, scribeLocalFile);
checksum scribeChecksum
}
task fixScribePermissions(type: Exec, dependsOn: verifyScribe) {
commandLine 'chmod', 'a+x', HIFI_ANDROID_PRECOMPILED + '/' + scribeLocalFile
}
task setupScribe(dependsOn: verifyScribe) { }
// On Windows, we don't need to set the executable bit, but on OSX and Unix we do
if (!Os.isFamily(Os.FAMILY_WINDOWS)) {
setupScribe.dependsOn fixScribePermissions
}
task extractGvrBinaries(dependsOn: extractDependencies) {
doLast {
def gvrLibFolder = new File(HIFI_ANDROID_PRECOMPILED, 'gvr/gvr-android-sdk-1.101.0/libraries');
zipTree(new File(HIFI_ANDROID_PRECOMPILED, 'gvr/gvr-android-sdk-1.101.0/libraries/sdk-audio-1.101.0.aar')).visit { element ->
def fileName = element.file.toString();
if (fileName.endsWith('libgvr_audio.so') && fileName.contains('arm64-v8a')) {
copy { from element.file; into gvrLibFolder }
}
}
zipTree(new File(HIFI_ANDROID_PRECOMPILED, 'gvr/gvr-android-sdk-1.101.0/libraries/sdk-base-1.101.0.aar')).visit { element ->
def fileName = element.file.toString();
if (fileName.endsWith('libgvr.so') && fileName.contains('arm64-v8a')) {
copy { from element.file; into gvrLibFolder }
}
}
fileTree(gvrLibFolder).visit { element ->
if (element.file.toString().endsWith('.so')) {
copy { from element.file; into jniFolder }
}
}
}
}
task setupDependencies(dependsOn: [setupScribe, copyDependencies, extractGvrBinaries]) {
}
task cleanDependencies(type: Delete) {
delete HIFI_ANDROID_PRECOMPILED
delete 'app/src/main/jniLibs/arm64-v8a'
}
task clean(type: Delete) {
delete rootProject.buildDir
}
task extractQt5jars(type: Copy) {
from fileTree(QT5_ROOT + "/jar")
into("${project.rootDir}/libraries/jar")
include("*.jar")
}
task extractQt5so(type: Copy) {
from fileTree(QT5_ROOT + "/lib")
into("${project.rootDir}/libraries/jni/armeabi-v7a/")
include("libQt5AndroidExtras.so")
include("libQt5Concurrent.so")
include("libQt5Core.so")
include("libQt5Gamepad.so")
include("libQt5Gui.so")
include("libQt5Location.so")
include("libQt5Multimedia.so")
include("libQt5MultimediaQuick_p.so")
include("libQt5Network.so")
include("libQt5NetworkAuth.so")
include("libQt5OpenGL.so")
include("libQt5Positioning.so")
include("libQt5Qml.so")
include("libQt5Quick.so")
include("libQt5QuickControls2.so")
include("libQt5QuickParticles.so")
include("libQt5QuickTemplates2.so")
include("libQt5QuickWidgets.so")
include("libQt5Script.so")
include("libQt5ScriptTools.so")
include("libQt5Sensors.so")
include("libQt5Svg.so")
include("libQt5WebChannel.so")
include("libQt5WebSockets.so")
include("libQt5WebView.so")
include("libQt5Widgets.so")
include("libQt5Xml.so")
include("libQt5XmlPatterns.so")
}
task extractAudioSo(type: Copy) {
from zipTree(GVR_ROOT + "/libraries/sdk-audio-1.80.0.aar")
into "${project.rootDir}/libraries/"
include "jni/armeabi-v7a/libgvr_audio.so"
}
task extractGvrSo(type: Copy) {
from zipTree(GVR_ROOT + "/libraries/sdk-base-1.80.0.aar")
into "${project.rootDir}/libraries/"
include "jni/armeabi-v7a/libgvr.so"
}
task extractNdk { }
extractNdk.dependsOn extractAudioSo
extractNdk.dependsOn extractGvrSo
task extractQt5 { }
extractQt5.dependsOn extractQt5so
extractQt5.dependsOn extractQt5jars
task extractBinaries { }
extractBinaries.dependsOn extractQt5
extractBinaries.dependsOn extractNdk
task deleteBinaries(type: Delete) {
delete "${project.rootDir}/libraries/jni"
}
//clean.dependsOn(deleteBinaries)

android/setupGVR.gradle (new file)

@ -0,0 +1,41 @@
buildscript {
repositories {
jcenter()
google()
}
dependencies {
classpath 'com.android.tools.build:gradle:3.0.1'
classpath 'de.undercouch:gradle-download-task:3.3.0'
}
}
def file='gvrsdk_v1.101.0.tgz'
def url='https://hifi-public.s3.amazonaws.com/austin/android/' + file
def destFile = new File(HIFI_ANDROID_PRECOMPILED, file)
// FIXME find a way to only download if the file doesn't exist
task downloadGVR(type: de.undercouch.gradle.tasks.download.Download) {
src url
dest destFile
}
task extractGVR(dependsOn: downloadGVR, type: Copy) {
from tarTree(resources.gzip(destFile))
into new File(HIFI_ANDROID_PRECOMPILED, 'gvr')
}
task copyGVRAudioLibs(dependsOn: extractGVR, type: Copy) {
from zipTree(new File(HIFI_ANDROID_PRECOMPILED, 'gvr/gvr-android-sdk-1.101.0/libraries/sdk-audio-1.101.0.aar'))
include 'jni/arm64-v8a/libgvr_audio.so'
into HIFI_ANDROID_PRECOMPILED
}
task copyGVRLibs(dependsOn: extractGVR, type: Copy) {
from zipTree(new File(HIFI_ANDROID_PRECOMPILED, 'gvr/gvr-android-sdk-1.101.0/libraries/sdk-base-1.101.0.aar'))
include 'jni/arm64-v8a/libgvr.so'
into HIFI_ANDROID_PRECOMPILED
}
task setupGVR(dependsOn: [copyGVRLibs, copyGVRAudioLibs]) {
}


@ -13,9 +13,25 @@ setup_memory_debugger()
link_hifi_libraries(
audio avatars octree gpu model fbx entities
networking animation recording shared script-engine embedded-webserver
controllers physics plugins midi baking image
controllers physics plugins midi image
)
add_dependencies(${TARGET_NAME} oven)
if (WIN32)
add_custom_command(
TARGET ${TARGET_NAME} POST_BUILD
COMMAND ${CMAKE_COMMAND} -E copy_directory
$<TARGET_FILE_DIR:oven>
$<TARGET_FILE_DIR:${TARGET_NAME}>)
else()
add_custom_command(
TARGET ${TARGET_NAME} POST_BUILD
COMMAND ${CMAKE_COMMAND} -E create_symlink
$<TARGET_FILE:oven>
$<TARGET_FILE_DIR:${TARGET_NAME}>/oven)
endif()
if (WIN32)
package_libraries_for_deployment()
endif()


@ -29,11 +29,10 @@
#include <QtCore/QUrlQuery>
#include <ClientServerUtils.h>
#include <FBXBaker.h>
#include <JSBaker.h>
#include <NodeType.h>
#include <SharedUtil.h>
#include <PathUtils.h>
#include <image/Image.h>
#include "AssetServerLogging.h"
#include "BakeAssetTask.h"
@ -250,7 +249,7 @@ AssetServer::AssetServer(ReceivedMessage& message) :
image::setNormalTexturesCompressionEnabled(true);
image::setCubeTexturesCompressionEnabled(true);
BAKEABLE_TEXTURE_EXTENSIONS = TextureBaker::getSupportedFormats();
BAKEABLE_TEXTURE_EXTENSIONS = image::getSupportedFormats();
qDebug() << "Supported baking texture formats:" << BAKEABLE_MODEL_EXTENSIONS;
// Most of the work will be I/O bound, reading from disk and constructing packet objects,
@ -416,6 +415,9 @@ void AssetServer::completeSetup() {
if (assetsFilesizeLimit != 0 && assetsFilesizeLimit < MAX_UPLOAD_SIZE) {
_filesizeLimit = assetsFilesizeLimit * BITS_PER_MEGABITS;
}
PathUtils::removeTemporaryApplicationDirs();
PathUtils::removeTemporaryApplicationDirs("Oven");
}
void AssetServer::cleanupUnmappedFiles() {


@ -11,11 +11,18 @@
#include "BakeAssetTask.h"
#include <QtCore/QThread>
#include <mutex>
#include <QtCore/QThread>
#include <QCoreApplication>
#include <FBXBaker.h>
#include <PathUtils.h>
#include <JSBaker.h>
static const int OVEN_STATUS_CODE_SUCCESS { 0 };
static const int OVEN_STATUS_CODE_FAIL { 1 };
static const int OVEN_STATUS_CODE_ABORT { 2 };
std::once_flag registerMetaTypesFlag;
BakeAssetTask::BakeAssetTask(const AssetHash& assetHash, const AssetPath& assetPath, const QString& filePath) :
_assetHash(assetHash),
@ -23,6 +30,10 @@ BakeAssetTask::BakeAssetTask(const AssetHash& assetHash, const AssetPath& assetP
_filePath(filePath)
{
std::call_once(registerMetaTypesFlag, []() {
qRegisterMetaType<QProcess::ProcessError>("QProcess::ProcessError");
qRegisterMetaType<QProcess::ExitStatus>("QProcess::ExitStatus");
});
}
void cleanupTempFiles(QString tempOutputDir, std::vector<QString> files) {
@ -41,67 +52,76 @@ void cleanupTempFiles(QString tempOutputDir, std::vector<QString> files) {
};
void BakeAssetTask::run() {
_isBaking.store(true);
qRegisterMetaType<QVector<QString> >("QVector<QString>");
TextureBakerThreadGetter fn = []() -> QThread* { return QThread::currentThread(); };
QString tempOutputDir;
if (_assetPath.endsWith(".fbx")) {
tempOutputDir = PathUtils::generateTemporaryDir();
_baker = std::unique_ptr<FBXBaker> {
new FBXBaker(QUrl("file:///" + _filePath), fn, tempOutputDir)
};
} else if (_assetPath.endsWith(".js", Qt::CaseInsensitive)) {
_baker = std::unique_ptr<JSBaker>{
new JSBaker(QUrl("file:///" + _filePath), PathUtils::generateTemporaryDir())
};
} else {
tempOutputDir = PathUtils::generateTemporaryDir();
_baker = std::unique_ptr<TextureBaker> {
new TextureBaker(QUrl("file:///" + _filePath), image::TextureUsage::CUBE_TEXTURE,
tempOutputDir)
};
if (_isBaking.exchange(true)) {
qWarning() << "Tried to start bake asset task while already baking";
return;
}
QEventLoop loop;
connect(_baker.get(), &Baker::finished, &loop, &QEventLoop::quit);
connect(_baker.get(), &Baker::aborted, &loop, &QEventLoop::quit);
QMetaObject::invokeMethod(_baker.get(), "bake", Qt::QueuedConnection);
loop.exec();
QString tempOutputDir = PathUtils::generateTemporaryDir();
auto base = QFileInfo(QCoreApplication::applicationFilePath()).absoluteDir();
QString path = base.absolutePath() + "/oven";
QString extension = _assetPath.mid(_assetPath.lastIndexOf('.') + 1);
QStringList args {
"-i", _filePath,
"-o", tempOutputDir,
"-t", extension,
};
if (_baker->wasAborted()) {
qDebug() << "Aborted baking: " << _assetHash << _assetPath;
_ovenProcess.reset(new QProcess());
_wasAborted.store(true);
connect(_ovenProcess.get(), static_cast<void(QProcess::*)(int, QProcess::ExitStatus)>(&QProcess::finished),
this, [this, tempOutputDir](int exitCode, QProcess::ExitStatus exitStatus) {
qDebug() << "Baking process finished: " << exitCode << exitStatus;
cleanupTempFiles(tempOutputDir, _baker->getOutputFiles());
if (exitStatus == QProcess::CrashExit) {
if (_wasAborted) {
emit bakeAborted(_assetHash, _assetPath);
} else {
QString errors = "Fatal error occurred while baking";
emit bakeFailed(_assetHash, _assetPath, errors);
}
} else if (exitCode == OVEN_STATUS_CODE_SUCCESS) {
QDir outputDir = tempOutputDir;
auto files = outputDir.entryInfoList(QDir::Files);
QVector<QString> outputFiles;
for (auto& file : files) {
outputFiles.push_back(file.absoluteFilePath());
}
emit bakeAborted(_assetHash, _assetPath);
} else if (_baker->hasErrors()) {
qDebug() << "Failed to bake: " << _assetHash << _assetPath << _baker->getErrors();
emit bakeComplete(_assetHash, _assetPath, tempOutputDir, outputFiles);
} else if (exitStatus == QProcess::NormalExit && exitCode == OVEN_STATUS_CODE_ABORT) {
_wasAborted.store(true);
emit bakeAborted(_assetHash, _assetPath);
} else {
QString errors;
if (exitCode == OVEN_STATUS_CODE_FAIL) {
QDir outputDir = tempOutputDir;
auto errorFilePath = outputDir.absoluteFilePath("errors.txt");
QFile errorFile { errorFilePath };
if (errorFile.open(QIODevice::ReadOnly)) {
errors = errorFile.readAll();
errorFile.close();
} else {
errors = "Unknown error occurred while baking";
}
}
emit bakeFailed(_assetHash, _assetPath, errors);
}
auto errors = _baker->getErrors().join('\n'); // Join error list into a single string for convenience
_didFinish.store(true);
cleanupTempFiles(tempOutputDir, _baker->getOutputFiles());
});
qDebug() << "Starting oven for " << _assetPath;
_ovenProcess->start(path, args, QIODevice::ReadOnly);
if (!_ovenProcess->waitForStarted(-1)) {
QString errors = "Oven process failed to start";
emit bakeFailed(_assetHash, _assetPath, errors);
} else {
auto vectorOutputFiles = QVector<QString>::fromStdVector(_baker->getOutputFiles());
qDebug() << "Finished baking: " << _assetHash << _assetPath << vectorOutputFiles;
_didFinish.store(true);
emit bakeComplete(_assetHash, _assetPath, tempOutputDir, vectorOutputFiles);
return;
}
_ovenProcess->waitForFinished();
}
void BakeAssetTask::abort() {
if (_baker) {
_baker->abort();
if (!_wasAborted.exchange(true)) {
_ovenProcess->terminate();
}
}


@ -17,9 +17,10 @@
#include <QtCore/QDebug>
#include <QtCore/QObject>
#include <QtCore/QRunnable>
#include <QDir>
#include <QProcess>
#include <AssetUtils.h>
#include <Baker.h>
class BakeAssetTask : public QObject, public QRunnable {
Q_OBJECT
@ -32,7 +33,6 @@ public:
void abort();
bool wasAborted() const { return _wasAborted.load(); }
bool didFinish() const { return _didFinish.load(); }
signals:
void bakeComplete(QString assetHash, QString assetPath, QString tempOutputDir, QVector<QString> outputFiles);
@ -44,9 +44,8 @@ private:
AssetHash _assetHash;
AssetPath _assetPath;
QString _filePath;
std::unique_ptr<Baker> _baker;
std::unique_ptr<QProcess> _ovenProcess { nullptr };
std::atomic<bool> _wasAborted { false };
std::atomic<bool> _didFinish { false };
};
#endif // hifi_BakeAssetTask_h


@ -870,8 +870,8 @@ AvatarMixerClientData* AvatarMixer::getOrCreateClientData(SharedNodePointer node
node->setLinkedData(std::unique_ptr<NodeData> { new AvatarMixerClientData(node->getUUID()) });
clientData = dynamic_cast<AvatarMixerClientData*>(node->getLinkedData());
auto& avatar = clientData->getAvatar();
avatar.setDomainMinimumScale(_domainMinimumScale);
avatar.setDomainMaximumScale(_domainMaximumScale);
avatar.setDomainMinimumHeight(_domainMinimumHeight);
avatar.setDomainMaximumHeight(_domainMaximumHeight);
}
return clientData;
@ -939,21 +939,21 @@ void AvatarMixer::parseDomainServerSettings(const QJsonObject& domainSettings) {
const QString AVATARS_SETTINGS_KEY = "avatars";
static const QString MIN_SCALE_OPTION = "min_avatar_scale";
float settingMinScale = domainSettings[AVATARS_SETTINGS_KEY].toObject()[MIN_SCALE_OPTION].toDouble(MIN_AVATAR_SCALE);
_domainMinimumScale = glm::clamp(settingMinScale, MIN_AVATAR_SCALE, MAX_AVATAR_SCALE);
static const QString MIN_HEIGHT_OPTION = "min_avatar_height";
float settingMinHeight = domainSettings[AVATARS_SETTINGS_KEY].toObject()[MIN_HEIGHT_OPTION].toDouble(MIN_AVATAR_HEIGHT);
_domainMinimumHeight = glm::clamp(settingMinHeight, MIN_AVATAR_HEIGHT, MAX_AVATAR_HEIGHT);
static const QString MAX_SCALE_OPTION = "max_avatar_scale";
float settingMaxScale = domainSettings[AVATARS_SETTINGS_KEY].toObject()[MAX_SCALE_OPTION].toDouble(MAX_AVATAR_SCALE);
_domainMaximumScale = glm::clamp(settingMaxScale, MIN_AVATAR_SCALE, MAX_AVATAR_SCALE);
static const QString MAX_HEIGHT_OPTION = "max_avatar_height";
float settingMaxHeight = domainSettings[AVATARS_SETTINGS_KEY].toObject()[MAX_HEIGHT_OPTION].toDouble(MAX_AVATAR_HEIGHT);
_domainMaximumHeight = glm::clamp(settingMaxHeight, MIN_AVATAR_HEIGHT, MAX_AVATAR_HEIGHT);
// make sure that the domain owner didn't flip min and max
if (_domainMinimumScale > _domainMaximumScale) {
std::swap(_domainMinimumScale, _domainMaximumScale);
if (_domainMinimumHeight > _domainMaximumHeight) {
std::swap(_domainMinimumHeight, _domainMaximumHeight);
}
qCDebug(avatars) << "This domain requires a minimum avatar scale of" << _domainMinimumScale
<< "and a maximum avatar scale of" << _domainMaximumScale;
qCDebug(avatars) << "This domain requires a minimum avatar height of" << _domainMinimumHeight
<< "and a maximum avatar height of" << _domainMaximumHeight;
const QString AVATAR_WHITELIST_DEFAULT{ "" };
static const QString AVATAR_WHITELIST_OPTION = "avatar_whitelist";

View file

@ -90,8 +90,8 @@ private:
float _maxKbpsPerNode = 0.0f;
float _domainMinimumScale { MIN_AVATAR_SCALE };
float _domainMaximumScale { MAX_AVATAR_SCALE };
float _domainMinimumHeight { MIN_AVATAR_HEIGHT };
float _domainMaximumHeight { MAX_AVATAR_HEIGHT };
RateCounter<> _broadcastRate;
p_high_resolution_clock::time_point _lastDebugMessage;

View file

@ -25,6 +25,23 @@ AvatarMixerClientData::AvatarMixerClientData(const QUuid& nodeID) :
_avatar->setID(nodeID);
}
uint64_t AvatarMixerClientData::getLastOtherAvatarEncodeTime(QUuid otherAvatar) const {
std::unordered_map<QUuid, uint64_t>::const_iterator itr = _lastOtherAvatarEncodeTime.find(otherAvatar);
if (itr != _lastOtherAvatarEncodeTime.end()) {
return itr->second;
}
return 0;
}
void AvatarMixerClientData::setLastOtherAvatarEncodeTime(const QUuid& otherAvatar, const uint64_t& time) {
std::unordered_map<QUuid, uint64_t>::iterator itr = _lastOtherAvatarEncodeTime.find(otherAvatar);
if (itr != _lastOtherAvatarEncodeTime.end()) {
itr->second = time;
} else {
_lastOtherAvatarEncodeTime.emplace(std::pair<QUuid, uint64_t>(otherAvatar, time));
}
}
void AvatarMixerClientData::queuePacket(QSharedPointer<ReceivedMessage> message, SharedNodePointer node) {
if (!_packetQueue.node) {
_packetQueue.node = node;

View file

@ -110,16 +110,10 @@ public:
bool getRequestsDomainListData() { return _requestsDomainListData; }
void setRequestsDomainListData(bool requesting) { _requestsDomainListData = requesting; }
ViewFrustum getViewFrustom() const { return _currentViewFrustum; }
ViewFrustum getViewFrustum() const { return _currentViewFrustum; }
quint64 getLastOtherAvatarEncodeTime(QUuid otherAvatar) {
quint64 result = 0;
if (_lastOtherAvatarEncodeTime.find(otherAvatar) != _lastOtherAvatarEncodeTime.end()) {
result = _lastOtherAvatarEncodeTime[otherAvatar];
}
_lastOtherAvatarEncodeTime[otherAvatar] = usecTimestampNow();
return result;
}
uint64_t getLastOtherAvatarEncodeTime(QUuid otherAvatar) const;
void setLastOtherAvatarEncodeTime(const QUuid& otherAvatar, const uint64_t& time);
QVector<JointData>& getLastOtherAvatarSentJoints(QUuid otherAvatar) {
_lastOtherAvatarSentJoints[otherAvatar].resize(_avatar->getJointCount());
@ -143,7 +137,7 @@ private:
// this is a map of the last time we encoded an "other" avatar for
// sending to "this" node
std::unordered_map<QUuid, quint64> _lastOtherAvatarEncodeTime;
std::unordered_map<QUuid, uint64_t> _lastOtherAvatarEncodeTime;
std::unordered_map<QUuid, QVector<JointData>> _lastOtherAvatarSentJoints;
uint64_t _identityChangeTimestamp;
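In the old header the inline getter both returned the previous encode time and, as a side effect, overwrote it with usecTimestampNow(); the replacement above splits that into a const read and an explicit write, so the timestamp is only stamped when an avatar is actually encoded. A minimal caller sketch of the new pattern (hypothetical names and condition, not part of this diff; the real call sites are in AvatarMixerSlave below):

    // sketch only: the read is now side-effect free, the write is explicit
    uint64_t lastEncode = clientData->getLastOtherAvatarEncodeTime(otherID);
    if (shouldEncodeAvatar(otherID, lastEncode)) {               // hypothetical predicate
        clientData->setLastOtherAvatarEncodeTime(otherID, usecTimestampNow());
    }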

View file

@ -22,6 +22,7 @@
#include <NodeList.h>
#include <Node.h>
#include <OctreeConstants.h>
#include <PrioritySortUtil.h>
#include <udt/PacketHeaders.h>
#include <SharedUtil.h>
#include <StDev.h>
@ -32,7 +33,6 @@
#include "AvatarMixerClientData.h"
#include "AvatarMixerSlave.h"
void AvatarMixerSlave::configure(ConstIter begin, ConstIter end) {
_begin = begin;
_end = end;
@ -184,10 +184,9 @@ void AvatarMixerSlave::broadcastAvatarDataToAgent(const SharedNodePointer& node)
// setup list of AvatarData as well as maps to map betweeen the AvatarData and the original nodes
// for calling the AvatarData::sortAvatars() function and getting our sorted list of client nodes
QList<AvatarSharedPointer> avatarList;
std::vector<AvatarSharedPointer> avatarsToSort;
std::unordered_map<AvatarSharedPointer, SharedNodePointer> avatarDataToNodes;
std::unordered_map<QUuid, uint64_t> avatarEncodeTimes;
std::for_each(_begin, _end, [&](const SharedNodePointer& otherNode) {
// make sure this is an agent that we have avatar data for before considering it for inclusion
if (otherNode->getType() == NodeType::Agent
@ -195,36 +194,56 @@ void AvatarMixerSlave::broadcastAvatarDataToAgent(const SharedNodePointer& node)
const AvatarMixerClientData* otherNodeData = reinterpret_cast<const AvatarMixerClientData*>(otherNode->getLinkedData());
AvatarSharedPointer otherAvatar = otherNodeData->getAvatarSharedPointer();
avatarList << otherAvatar;
avatarsToSort.push_back(otherAvatar);
avatarDataToNodes[otherAvatar] = otherNode;
QUuid id = otherAvatar->getSessionUUID();
avatarEncodeTimes[id] = nodeData->getLastOtherAvatarEncodeTime(id);
}
});
AvatarSharedPointer thisAvatar = nodeData->getAvatarSharedPointer();
ViewFrustum cameraView = nodeData->getViewFrustom();
std::priority_queue<AvatarPriority> sortedAvatars;
AvatarData::sortAvatars(avatarList, cameraView, sortedAvatars,
[&](AvatarSharedPointer avatar)->uint64_t {
auto avatarNode = avatarDataToNodes[avatar];
assert(avatarNode); // we can't have gotten here without the avatarData being a valid key in the map
return nodeData->getLastBroadcastTime(avatarNode->getUUID());
}, [&](AvatarSharedPointer avatar)->float{
glm::vec3 nodeBoxHalfScale = (avatar->getWorldPosition() - avatar->getGlobalBoundingBoxCorner() * avatar->getSensorToWorldScale());
return glm::max(nodeBoxHalfScale.x, glm::max(nodeBoxHalfScale.y, nodeBoxHalfScale.z));
}, [&](AvatarSharedPointer avatar)->bool {
class SortableAvatar: public PrioritySortUtil::Sortable {
public:
SortableAvatar() = delete;
SortableAvatar(const AvatarSharedPointer& avatar, uint64_t lastEncodeTime)
: _avatar(avatar), _lastEncodeTime(lastEncodeTime) {}
glm::vec3 getPosition() const override { return _avatar->getWorldPosition(); }
float getRadius() const override {
glm::vec3 nodeBoxHalfScale = (_avatar->getWorldPosition() - _avatar->getGlobalBoundingBoxCorner() * _avatar->getSensorToWorldScale());
return glm::max(nodeBoxHalfScale.x, glm::max(nodeBoxHalfScale.y, nodeBoxHalfScale.z));
}
uint64_t getTimestamp() const override {
return _lastEncodeTime;
}
const AvatarSharedPointer& getAvatar() const { return _avatar; }
private:
AvatarSharedPointer _avatar;
uint64_t _lastEncodeTime;
};
// prepare to sort
ViewFrustum cameraView = nodeData->getViewFrustum();
PrioritySortUtil::PriorityQueue<SortableAvatar> sortedAvatars(cameraView,
AvatarData::_avatarSortCoefficientSize,
AvatarData::_avatarSortCoefficientCenter,
AvatarData::_avatarSortCoefficientAge);
// ignore or sort
const AvatarSharedPointer& thisAvatar = nodeData->getAvatarSharedPointer();
for (const auto& avatar : avatarsToSort) {
if (avatar == thisAvatar) {
return true; // ignore ourselves...
// don't echo updates to self
continue;
}
bool shouldIgnore = false;
// We will also ignore other nodes for a couple of different reasons:
// We ignore other nodes for a couple of reasons:
// 1) ignore bubbles and ignore specific node
// 2) the node hasn't really updated it's frame data recently, this can
// happen if for example the avatar is connected on a desktop and sending
// updates at ~30hz. So every 3 frames we skip a frame.
auto avatarNode = avatarDataToNodes[avatar];
auto avatarNode = avatarDataToNodes[avatar];
assert(avatarNode); // we can't have gotten here without the avatarData being a valid key in the map
const AvatarMixerClientData* avatarNodeData = reinterpret_cast<const AvatarMixerClientData*>(avatarNode->getLinkedData());
@ -240,7 +259,6 @@ void AvatarMixerSlave::broadcastAvatarDataToAgent(const SharedNodePointer& node)
|| (avatarNode->isIgnoringNodeWithID(node->getUUID()) && !getsAnyIgnored)) {
shouldIgnore = true;
} else {
// Check to see if the space bubble is enabled
// Don't bother with these checks if the other avatar has their bubble enabled and we're gettingAnyIgnored
if (node->isIgnoreRadiusEnabled() || (avatarNode->isIgnoreRadiusEnabled() && !getsAnyIgnored)) {
@ -267,8 +285,6 @@ void AvatarMixerSlave::broadcastAvatarDataToAgent(const SharedNodePointer& node)
nodeData->removeFromRadiusIgnoringSet(node, avatarNode->getUUID());
}
}
quint64 endIgnoreCalculation = usecTimestampNow();
_stats.ignoreCalculationElapsedTime += (endIgnoreCalculation - startIgnoreCalculation);
if (!shouldIgnore) {
AvatarDataSequenceNumber lastSeqToReceiver = nodeData->getLastBroadcastSequenceNumber(avatarNode->getUUID());
@ -292,20 +308,26 @@ void AvatarMixerSlave::broadcastAvatarDataToAgent(const SharedNodePointer& node)
++numAvatarsWithSkippedFrames;
}
}
return shouldIgnore;
});
quint64 endIgnoreCalculation = usecTimestampNow();
_stats.ignoreCalculationElapsedTime += (endIgnoreCalculation - startIgnoreCalculation);
if (!shouldIgnore) {
// sort this one for later
uint64_t lastEncodeTime = 0;
std::unordered_map<QUuid, uint64_t>::const_iterator itr = avatarEncodeTimes.find(avatar->getSessionUUID());
if (itr != avatarEncodeTimes.end()) {
lastEncodeTime = itr->second;
}
sortedAvatars.push(SortableAvatar(avatar, lastEncodeTime));
}
}
// loop through our sorted avatars and allocate our bandwidth to them accordingly
int avatarRank = 0;
// this is overly conservative, because it includes some avatars we might not consider
int remainingAvatars = (int)sortedAvatars.size();
while (!sortedAvatars.empty()) {
AvatarPriority sortData = sortedAvatars.top();
const auto& avatarData = sortedAvatars.top().getAvatar();
sortedAvatars.pop();
const auto& avatarData = sortData.avatar;
avatarRank++;
remainingAvatars--;
auto otherNode = avatarDataToNodes[avatarData];
@ -332,10 +354,8 @@ void AvatarMixerSlave::broadcastAvatarDataToAgent(const SharedNodePointer& node)
nodeData->setLastBroadcastTime(otherNode->getUUID(), usecTimestampNow());
}
// determine if avatar is in view which determines how much data to send
glm::vec3 otherPosition = otherAvatar->getClientGlobalPosition();
// determine if avatar is in view, to determine how much data to include...
glm::vec3 otherNodeBoxScale = (otherPosition - otherNodeData->getGlobalBoundingBoxCorner()) * 2.0f * otherAvatar->getSensorToWorldScale();
AABox otherNodeBox(otherNodeData->getGlobalBoundingBoxCorner(), otherNodeBoxScale);
bool isInView = nodeData->otherAvatarInView(otherNodeBox);
@ -405,14 +425,18 @@ void AvatarMixerSlave::broadcastAvatarDataToAgent(const SharedNodePointer& node)
// set the last sent sequence number for this sender on the receiver
nodeData->setLastBroadcastSequenceNumber(otherNode->getUUID(),
otherNodeData->getLastReceivedSequenceNumber());
nodeData->setLastOtherAvatarEncodeTime(otherNode->getUUID(), usecTimestampNow());
}
} else {
// TODO? this avatar is not included now, and will probably not be included next frame.
// It would be nice if we could tweak its future sort priority to put it at the back of the list.
}
avatarPacketList->endSegment();
quint64 endAvatarDataPacking = usecTimestampNow();
_stats.avatarDataPackingElapsedTime += (endAvatarDataPacking - startAvatarDataPacking);
};
}
quint64 startPacketSending = usecTimestampNow();

View file

@ -29,10 +29,6 @@ macro(GENERATE_INSTALLERS)
if (WIN32)
# Do not install the Visual Studio C runtime libraries. The installer will do this automatically
set(CMAKE_INSTALL_SYSTEM_RUNTIME_LIBS_SKIP TRUE)
include(InstallRequiredSystemLibraries)
set(CPACK_NSIS_MUI_ICON "${HF_CMAKE_DIR}/installer/installer.ico")
# install and reference the Add/Remove icon
@ -49,6 +45,10 @@ macro(GENERATE_INSTALLERS)
set(_UNINSTALLER_HEADER_BAD_PATH "${HF_CMAKE_DIR}/installer/uninstaller-header.bmp")
set(UNINSTALLER_HEADER_IMAGE "")
fix_path_for_nsis(${_UNINSTALLER_HEADER_BAD_PATH} UNINSTALLER_HEADER_IMAGE)
# grab the latest VC redist (2017) and add it to the installer, our NSIS template
# will call it during the install
install(CODE "file(DOWNLOAD https://go.microsoft.com/fwlink/?LinkId=746572 \"\${CMAKE_INSTALL_PREFIX}/vcredist_x64.exe\")")
elseif (APPLE)
# produce a drag and drop DMG on OS X
set(CPACK_GENERATOR "DragNDrop")
@ -84,4 +84,3 @@ macro(GENERATE_INSTALLERS)
include(CPack)
endmacro()

View file

@ -0,0 +1,17 @@
#
# Created by Bradley Austin Davis on 2017/11/27
# Copyright 2013-2017 High Fidelity, Inc.
#
# Distributed under the Apache License, Version 2.0.
# See the accompanying file LICENSE or http://www.apache.org/licenses/LICENSE-2.0.html
#
function(set_from_env _RESULT_NAME _ENV_VAR_NAME _DEFAULT_VALUE)
if (NOT DEFINED ${_RESULT_NAME})
if ("$ENV{${_ENV_VAR_NAME}}" STREQUAL "")
set (${_RESULT_NAME} ${_DEFAULT_VALUE} PARENT_SCOPE)
else()
set (${_RESULT_NAME} $ENV{${_ENV_VAR_NAME}} PARENT_SCOPE)
endif()
endif()
endfunction()

View file

@ -15,13 +15,14 @@ macro(SET_PACKAGING_PARAMETERS)
set(PR_BUILD 0)
set(PRODUCTION_BUILD 0)
set(DEV_BUILD 0)
set(RELEASE_TYPE $ENV{RELEASE_TYPE})
set(RELEASE_NUMBER $ENV{RELEASE_NUMBER})
string(TOLOWER "$ENV{BRANCH}" BUILD_BRANCH)
set(BUILD_GLOBAL_SERVICES "DEVELOPMENT")
set(USE_STABLE_GLOBAL_SERVICES 0)
set_from_env(RELEASE_TYPE RELEASE_TYPE "DEV")
set_from_env(RELEASE_NUMBER RELEASE_NUMBER "")
set_from_env(BUILD_BRANCH BRANCH "")
string(TOLOWER "${BUILD_BRANCH}" BUILD_BRANCH)
message(STATUS "The BUILD_BRANCH variable is: ${BUILD_BRANCH}")
message(STATUS "The BRANCH environment variable is: $ENV{BRANCH}")
message(STATUS "The RELEASE_TYPE variable is: ${RELEASE_TYPE}")

View file

@ -1,21 +1,11 @@
#
# Copyright 2015 High Fidelity, Inc.
# Created by Bradley Austin Davis on 2015/10/10
# Created by Bradley Austin Davis on 2017/09/02
# Copyright 2013-2017 High Fidelity, Inc.
#
# Distributed under the Apache License, Version 2.0.
# See the accompanying file LICENSE or http://www.apache.org/licenses/LICENSE-2.0.html
#
function(set_from_env _RESULT_NAME _ENV_VAR_NAME _DEFAULT_VALUE)
if (NOT DEFINED ${_RESULT_NAME})
if ("$ENV{${_ENV_VAR_NAME}}" STREQUAL "")
set (${_RESULT_NAME} ${_DEFAULT_VALUE} PARENT_SCOPE)
else()
set (${_RESULT_NAME} $ENV{${_ENV_VAR_NAME}} PARENT_SCOPE)
endif()
endif()
endfunction()
# Construct a default QT location from a root path, a version and an architecture
function(calculate_default_qt_dir _RESULT_NAME)
if (ANDROID)

View file

@ -6,8 +6,19 @@
# See the accompanying file LICENSE or http://www.apache.org/licenses/LICENSE-2.0.html
#
macro(TARGET_BULLET)
add_dependency_external_projects(bullet)
find_package(Bullet REQUIRED)
if (ANDROID)
set(INSTALL_DIR ${HIFI_ANDROID_PRECOMPILED}/bullet)
set(BULLET_INCLUDE_DIRS "${INSTALL_DIR}/include/bullet" CACHE TYPE INTERNAL)
set(LIB_DIR ${INSTALL_DIR}/lib)
list(APPEND BULLET_LIBRARIES ${LIB_DIR}/libBulletDynamics.a)
list(APPEND BULLET_LIBRARIES ${LIB_DIR}/libBulletCollision.a)
list(APPEND BULLET_LIBRARIES ${LIB_DIR}/libLinearMath.a)
list(APPEND BULLET_LIBRARIES ${LIB_DIR}/libBulletSoftBody.a)
else()
add_dependency_external_projects(bullet)
find_package(Bullet REQUIRED)
endif()
# perform the system include hack for OS X to ignore warnings
if (APPLE)
SET(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -isystem ${BULLET_INCLUDE_DIRS}")
@ -16,3 +27,5 @@ macro(TARGET_BULLET)
endif()
target_link_libraries(${TARGET_NAME} ${BULLET_LIBRARIES})
endmacro()

18
cmake/macros/TargetDraco.cmake Executable file
View file

@ -0,0 +1,18 @@
macro(TARGET_DRACO)
if (ANDROID)
set(INSTALL_DIR ${HIFI_ANDROID_PRECOMPILED}/draco)
set(DRACO_INCLUDE_DIRS "${INSTALL_DIR}/include" CACHE TYPE INTERNAL)
set(LIB_DIR ${INSTALL_DIR}/lib)
list(APPEND DRACO_LIBRARIES ${LIB_DIR}/libdraco.a)
list(APPEND DRACO_LIBRARIES ${LIB_DIR}/libdracodec.a)
list(APPEND DRACO_LIBRARIES ${LIB_DIR}/libdracoenc.a)
else()
add_dependency_external_projects(draco)
find_package(Draco REQUIRED)
list(APPEND DRACO_LIBRARIES ${DRACO_LIBRARY})
list(APPEND DRACO_LIBRARIES ${DRACO_ENCODER_LIBRARY})
endif()
target_include_directories(${TARGET_NAME} SYSTEM PRIVATE ${DRACO_INCLUDE_DIRS})
target_link_libraries(${TARGET_NAME} ${DRACO_LIBRARIES})
endmacro()

View file

@ -0,0 +1,14 @@
#
# Created by Bradley Austin Davis on 2017/11/28
# Copyright 2013-2017 High Fidelity, Inc.
#
# Distributed under the Apache License, Version 2.0.
# See the accompanying file LICENSE or http://www.apache.org/licenses/LICENSE-2.0.html
#
macro(TARGET_GOOGLEVR)
if (ANDROID)
set(GVR_ROOT "${HIFI_ANDROID_PRECOMPILED}/gvr/gvr-android-sdk-1.101.0/")
target_include_directories(native-lib PRIVATE "${GVR_ROOT}/libraries/headers")
target_link_libraries(native-lib "${GVR_ROOT}/libraries/libgvr.so")
endif()
endmacro()

View file

@ -6,14 +6,10 @@
# See the accompanying file LICENSE or http://www.apache.org/licenses/LICENSE-2.0.html
#
macro(TARGET_OPENSSL)
if (ANDROID)
# FIXME use a distributable binary
set(OPENSSL_INSTALL_DIR C:/Android/openssl)
set(OPENSSL_INSTALL_DIR ${HIFI_ANDROID_PRECOMPILED}/openssl)
set(OPENSSL_INCLUDE_DIR "${OPENSSL_INSTALL_DIR}/include" CACHE TYPE INTERNAL)
set(OPENSSL_LIBRARIES "${OPENSSL_INSTALL_DIR}/lib/libcrypto.a;${OPENSSL_INSTALL_DIR}/lib/libssl.a" CACHE TYPE INTERNAL)
else()
find_package(OpenSSL REQUIRED)
@ -28,5 +24,4 @@ macro(TARGET_OPENSSL)
include_directories(SYSTEM "${OPENSSL_INCLUDE_DIR}")
target_link_libraries(${TARGET_NAME} ${OPENSSL_LIBRARIES})
endmacro()

View file

@ -0,0 +1,24 @@
#
# Created by Bradley Austin Davis on 2017/11/28
# Copyright 2013-2017 High Fidelity, Inc.
#
# Distributed under the Apache License, Version 2.0.
# See the accompanying file LICENSE or http://www.apache.org/licenses/LICENSE-2.0.html
#
macro(TARGET_POLYVOX)
if (ANDROID)
set(INSTALL_DIR ${HIFI_ANDROID_PRECOMPILED}/polyvox)
set(POLYVOX_INCLUDE_DIRS "${INSTALL_DIR}/include" CACHE TYPE INTERNAL)
set(LIB_DIR ${INSTALL_DIR}/lib)
list(APPEND POLYVOX_LIBRARIES ${LIB_DIR}/libPolyVoxUtil.so)
list(APPEND POLYVOX_LIBRARIES ${LIB_DIR}/Release/libPolyVoxCore.so)
else()
add_dependency_external_projects(polyvox)
find_package(PolyVox REQUIRED)
endif()
target_include_directories(${TARGET_NAME} SYSTEM PUBLIC ${POLYVOX_INCLUDE_DIRS})
target_link_libraries(${TARGET_NAME} ${POLYVOX_LIBRARIES})
endmacro()

View file

@ -8,10 +8,10 @@
macro(TARGET_TBB)
if (ANDROID)
set(TBB_INSTALL_DIR C:/tbb-2018/built)
set(TBB_LIBRARY ${HIFI_ANDROID_PRECOMPILED}/libtbb.so CACHE FILEPATH "TBB library location")
set(TBB_MALLOC_LIBRARY ${HIFI_ANDROID_PRECOMPILED}/libtbbmalloc.so CACHE FILEPATH "TBB malloc library location")
set(TBB_INCLUDE_DIRS ${TBB_INSTALL_DIR}/include CACHE TYPE "List of tbb include directories" CACHE FILEPATH "TBB includes location")
set(TBB_INSTALL_DIR ${HIFI_ANDROID_PRECOMPILED}/tbb)
set(TBB_INCLUDE_DIRS ${TBB_INSTALL_DIR}/include CACHE FILEPATH "TBB includes location")
set(TBB_LIBRARY ${TBB_INSTALL_DIR}/lib/release/libtbb.so CACHE FILEPATH "TBB library location")
set(TBB_MALLOC_LIBRARY ${TBB_INSTALL_DIR}/lib/release/libtbbmalloc.so CACHE FILEPATH "TBB malloc library location")
set(TBB_LIBRARIES ${TBB_LIBRARY} ${TBB_MALLOC_LIBRARY})
else()
add_dependency_external_projects(tbb)

View file

@ -45,5 +45,4 @@ else()
endif()
file(GLOB EXTRA_PLUGINS "${BUNDLE_PLUGIN_DIR}/*.${PLUGIN_EXTENSION}")
fixup_bundle("${BUNDLE_EXECUTABLE}" "${EXTRA_PLUGINS}" "@FIXUP_LIBS@")
fixup_bundle("${BUNDLE_EXECUTABLE}" "${EXTRA_PLUGINS}" "@FIXUP_LIBS@" IGNORE_ITEM "vcredist_x86.exe;vcredist_x64.exe")

View file

@ -1,5 +1,5 @@
{
"version": 2.0,
"version": 2.1,
"settings": [
{
"name": "label",
@ -916,6 +916,14 @@
"default": false
}
]
},
{
"name": "multi_kick_logged_in",
"type": "checkbox",
"label": "Multi-Kick for Logged In Users",
"help": "Kick logged in users by machine fingerprint (in addition to the default kick by username)",
"default": false,
"advanced": true
}
]
},
@ -1007,20 +1015,20 @@
"assignment-types": [ 1, 2 ],
"settings": [
{
"name": "min_avatar_scale",
"name": "min_avatar_height",
"type": "double",
"label": "Minimum Avatar Scale",
"help": "Limits the scale of avatars in your domain. Must be at least 0.005.",
"placeholder": 0.25,
"default": 0.25
"label": "Minimum Avatar Height (meters)",
"help": "Limits the height of avatars in your domain. Must be at least 0.009.",
"placeholder": 0.4,
"default": 0.4
},
{
"name": "max_avatar_scale",
"name": "max_avatar_height",
"type": "double",
"label": "Maximum Avatar Scale",
"help": "Limits the scale of avatars in your domain. Cannot be greater than 1000.",
"placeholder": 3.0,
"default": 3.0
"label": "Maximum Avatar Height (meters)",
"help": "Limits the scale of avatars in your domain. Cannot be greater than 1755.",
"placeholder": 5.2,
"default": 5.2
},
{
"name": "avatar_whitelist",

View file

@ -183,6 +183,11 @@ NodePermissions DomainGatekeeper::setPermissionsForUser(bool isLocalUser, QStrin
#ifdef WANT_DEBUG
qDebug() << "| user-permissions: specific MAC matches, so:" << userPerms;
#endif
} else if (_server->_settingsManager.hasPermissionsForMachineFingerprint(machineFingerprint)) {
userPerms = _server->_settingsManager.getPermissionsForMachineFingerprint(machineFingerprint);
#ifdef WANT_DEBUG
qDebug() << "| user-permissions: specific Machine Fingerprint matches, so: " << userPerms;
#endif
} else if (_server->_settingsManager.hasPermissionsForIP(senderAddress)) {
// this user comes from an IP we have in our permissions table, apply those permissions
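The new branch slots machine-fingerprint permissions into the existing resolution chain, after a MAC-address match and before the IP-address fallback. A condensed paraphrase of that precedence (sketch only — the MAC and IP accessor names are assumptions based on the surrounding debug messages, and the full chain is longer than what this hunk shows):

    if (settings.hasPermissionsForMAC(macAddress)) {                               // assumed name
        userPerms = settings.getPermissionsForMAC(macAddress);
    } else if (settings.hasPermissionsForMachineFingerprint(machineFingerprint)) { // added here
        userPerms = settings.getPermissionsForMachineFingerprint(machineFingerprint);
    } else if (settings.hasPermissionsForIP(senderAddress)) {                      // IP fallback
        userPerms = settings.getPermissionsForIP(senderAddress);                   // assumed name
    }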

View file

@ -304,6 +304,26 @@ void DomainServerSettingsManager::setupConfigMap(const QStringList& argumentList
*wizardCompletedOnce = QVariant(true);
}
if (oldVersion < 2.1) {
// convert old avatar scale settings into avatar height.
const QString AVATAR_MIN_SCALE_KEYPATH = "avatars.min_avatar_scale";
const QString AVATAR_MAX_SCALE_KEYPATH = "avatars.max_avatar_scale";
const QString AVATAR_MIN_HEIGHT_KEYPATH = "avatars.min_avatar_height";
const QString AVATAR_MAX_HEIGHT_KEYPATH = "avatars.max_avatar_height";
QVariant* avatarMinScale = _configMap.valueForKeyPath(AVATAR_MIN_SCALE_KEYPATH);
if (avatarMinScale) {
float scale = avatarMinScale->toFloat();
_configMap.valueForKeyPath(AVATAR_MIN_HEIGHT_KEYPATH, scale * DEFAULT_AVATAR_HEIGHT);
}
QVariant* avatarMaxScale = _configMap.valueForKeyPath(AVATAR_MAX_SCALE_KEYPATH);
if (avatarMaxScale) {
float scale = avatarMaxScale->toFloat();
_configMap.valueForKeyPath(AVATAR_MAX_HEIGHT_KEYPATH, scale * DEFAULT_AVATAR_HEIGHT);
}
}
// write the current description version to our settings
*versionVariant = _descriptionVersion;
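The oldVersion < 2.1 block above migrates any stored scale limits to the new height keys by multiplying by DEFAULT_AVATAR_HEIGHT. A rough worked example, assuming DEFAULT_AVATAR_HEIGHT is about 1.755 m (an assumption — the constant is defined elsewhere in the codebase, but that value is consistent with the new 0.4 m and 5.2 m defaults in the settings description above):

    float oldMinScale = 0.25f;                  // previous min_avatar_scale default
    float newMinHeight = oldMinScale * 1.755f;  // ~0.44 m, in line with the new 0.4 m default
    float oldMaxScale = 3.0f;                   // previous max_avatar_scale default
    float newMaxHeight = oldMaxScale * 1.755f;  // ~5.27 m, in line with the new 5.2 m default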
@ -672,7 +692,7 @@ void DomainServerSettingsManager::processNodeKickRequestPacket(QSharedPointer<Re
bool newPermissions = false;
if (!verifiedUsername.isEmpty()) {
// if we have a verified user name for this user, we apply the kick to the username
// if we have a verified user name for this user, we first apply the kick to the username
// check if there were already permissions
bool hadPermissions = havePermissionsForName(verifiedUsername);
@ -684,7 +704,14 @@ void DomainServerSettingsManager::processNodeKickRequestPacket(QSharedPointer<Re
// ensure that the connect permission is clear
userPermissions->clear(NodePermissions::Permission::canConnectToDomain);
} else {
}
// if we didn't have a username, or this domain-server uses the "multi-kick" setting to
// kick logged in users via username AND machine fingerprint (or IP as fallback)
// then we remove connect permissions for the machine fingerprint (or IP as fallback)
const QString MULTI_KICK_SETTINGS_KEYPATH = "security.multi_kick_logged_in";
if (verifiedUsername.isEmpty() || valueOrDefaultValueForKeyPath(MULTI_KICK_SETTINGS_KEYPATH).toBool()) {
// remove connect permissions for the machine fingerprint
DomainServerNodeData* nodeData = static_cast<DomainServerNodeData*>(matchingNode->getLinkedData());
if (nodeData) {
@ -719,8 +746,8 @@ void DomainServerSettingsManager::processNodeKickRequestPacket(QSharedPointer<Re
// TODO: soon we will have feedback (in the form of a message to the client) after we kick. When we
// do, we will have a success flag, and perhaps a reason for failure. For now, just don't do it.
if (kickAddress == limitedNodeList->getPublicSockAddr().getAddress() ||
kickAddress == limitedNodeList->getLocalSockAddr().getAddress() ||
kickAddress.isLoopback() ) {
kickAddress == limitedNodeList->getLocalSockAddr().getAddress() ||
kickAddress.isLoopback() ) {
qWarning() << "attempt to kick node running on same machine as domain server, ignoring KickRequest";
return;
}

View file

@ -13,11 +13,11 @@
{ "from": "OculusTouch.LY", "to": "Standard.LY",
"filters": [
{ "type": "deadZone", "min": 0.3 },
{ "type": "deadZone", "min": 0.7 },
"invert"
]
},
{ "from": "OculusTouch.LX", "filters": { "type": "deadZone", "min": 0.3 }, "to": "Standard.LX" },
{ "from": "OculusTouch.LX", "filters": { "type": "deadZone", "min": 0.7 }, "to": "Standard.LX" },
{ "from": "OculusTouch.LT", "to": "Standard.LTClick",
"peek": true,
"filters": [ { "type": "hysteresis", "min": 0.85, "max": 0.9 } ]
@ -29,11 +29,11 @@
{ "from": "OculusTouch.RY", "to": "Standard.RY",
"filters": [
{ "type": "deadZone", "min": 0.3 },
{ "type": "deadZone", "min": 0.7 },
"invert"
]
},
{ "from": "OculusTouch.RX", "filters": { "type": "deadZone", "min": 0.3 }, "to": "Standard.RX" },
{ "from": "OculusTouch.RX", "filters": { "type": "deadZone", "min": 0.7 }, "to": "Standard.RX" },
{ "from": "OculusTouch.RT", "to": "Standard.RTClick",
"peek": true,
"filters": [ { "type": "hysteresis", "min": 0.85, "max": 0.9 } ]

View file

@ -65,21 +65,23 @@ var EventBridge;
// we need to listen to events that might precede the addition of this elements.
// A more robust hack will be to add a setInterval that look for DOM changes every 100-300 ms (low performance?)
window.onload = function(){
window.addEventListener("load",function(event) {
setTimeout(function() {
EventBridge.forceHtmlAudioOutputDeviceUpdate();
}, 1200);
};
document.onclick = function(){
}, false);
document.addEventListener("click",function(){
setTimeout(function() {
EventBridge.forceHtmlAudioOutputDeviceUpdate();
}, 1200);
};
document.onchange = function(){
}, false);
document.addEventListener("change",function(){
setTimeout(function() {
EventBridge.forceHtmlAudioOutputDeviceUpdate();
}, 1200);
};
}, false);
tempEventBridge._callbacks.forEach(function (callback) {
EventBridge.scriptEventReceived.connect(callback);

View file

@ -1,127 +0,0 @@
<!-- Copyright 2016 High Fidelity, Inc. -->
<html>
<head>
<meta charset="utf-8"/>
<input type="hidden" id="version" value="1"/>
<title>Welcome to Interface</title>
<style>
body {
background: black;
width: 100%;
overflow-x: hidden;
margin: 0;
padding: 0;
}
#kbm_button {
position: absolute;
left: 70;
top: 118;
width: 297;
height: 80;
}
#hand_controllers_button {
position: absolute;
left: 367;
top: 118;
width: 267;
height: 80;
}
#game_controller_button {
position: absolute;
left: 634;
top: 118;
width: 297;
height: 80;
}
#image_area {
width: 1024;
height: 720;
margin: auto;
position: absolute;
top: 0; left: 0; bottom: 0; right: 0;
}
</style>
<script>
var handControllerImageURL = null;
function showKbm() {
document.getElementById("main_image").setAttribute("src", "img/controls-help-keyboard.png");
}
function showHandControllers() {
document.getElementById("main_image").setAttribute("src", handControllerImageURL);
}
function showGamepad() {
document.getElementById("main_image").setAttribute("src", "img/controls-help-gamepad.png");
}
// This is not meant to be a complete or hardened query string parser - it only
// needs to handle the values we send in and have control over.
//
// queryString is a string of the form "key1=value1&key2=value2&key3&key4=value4"
function parseQueryString(queryString) {
var params = {};
var paramsParts = queryString.split("&");
for (var i = 0; i < paramsParts.length; ++i) {
var paramKeyValue = paramsParts[i].split("=");
if (paramKeyValue.length == 1) {
params[paramKeyValue[0]] = undefined;
} else if (paramKeyValue.length == 2) {
params[paramKeyValue[0]] = paramKeyValue[1];
} else {
console.error("Error parsing param keyvalue: ", paramParts);
}
}
return params;
}
function load() {
var parts = window.location.href.split("?");
var params = {};
if (parts.length > 0) {
params = parseQueryString(parts[1]);
}
switch (params.handControllerName) {
case "oculus":
handControllerImageURL = "img/controls-help-oculus.png";
break;
case "vive":
default:
handControllerImageURL = "img/controls-help-vive.png";
}
switch (params.defaultTab) {
case "gamepad":
showGamepad();
break;
case "handControllers":
showHandControllers();
break;
case "kbm":
default:
showKbm();
}
}
</script>
</head>
<body onload="load()">
<div id="image_area">
<img id="main_image" src="img/controls-help-keyboard.png" width="1024px" height="720px"></img>
<a href="#" id="kbm_button" onmousedown="showKbm()"></a>
<a href="#" id="hand_controllers_button" onmousedown="showHandControllers()"></a>
<a href="#" id="game_controller_button" onmousedown="showGamepad()"></a>
</div>
</body>
</html>

Binary image files changed (not shown). Four images removed (previously 124 KiB, 67 KiB, 124 KiB, and 100 KiB); four images replaced with smaller versions (604 KiB → 298 KiB, 503 KiB → 215 KiB, 585 KiB → 289 KiB, and 547 KiB → 254 KiB).

View file

@ -12,9 +12,9 @@
var MAX_WARNINGS = 3;
var numWarnings = 0;
var isWindowFocused = true;
var isKeyboardRaised = false;
var isNumericKeyboard = false;
var isPasswordField = false;
window.isKeyboardRaised = false;
window.isNumericKeyboard = false;
window.isPasswordField = false;
function shouldSetPasswordField() {
var nodeType = document.activeElement.type;
@ -62,7 +62,7 @@
var passwordField = shouldSetPasswordField();
if (isWindowFocused &&
(keyboardRaised !== isKeyboardRaised || numericKeyboard !== isNumericKeyboard || passwordField !== isPasswordField)) {
(keyboardRaised !== window.isKeyboardRaised || numericKeyboard !== window.isNumericKeyboard || passwordField !== window.isPasswordField)) {
if (typeof EventBridge !== "undefined" && EventBridge !== null) {
EventBridge.emitWebEvent(
@ -75,20 +75,20 @@
}
}
if (!isKeyboardRaised) {
if (!window.isKeyboardRaised) {
scheduleBringToView(250); // Allow time for keyboard to be raised in QML.
// 2DO: should it be rather done from 'client area height changed' event?
}
isKeyboardRaised = keyboardRaised;
isNumericKeyboard = numericKeyboard;
isPasswordField = passwordField;
window.isKeyboardRaised = keyboardRaised;
window.isNumericKeyboard = numericKeyboard;
window.isPasswordField = passwordField;
}
}, POLL_FREQUENCY);
window.addEventListener("click", function () {
var keyboardRaised = shouldRaiseKeyboard();
if(keyboardRaised && isKeyboardRaised) {
if (keyboardRaised && window.isKeyboardRaised) {
scheduleBringToView(150);
}
});
@ -99,7 +99,7 @@
window.addEventListener("blur", function () {
isWindowFocused = false;
isKeyboardRaised = false;
isNumericKeyboard = false;
window.isKeyboardRaised = false;
window.isNumericKeyboard = false;
});
})();

View file

@ -0,0 +1,634 @@
//
// AudioScope.qml
//
// Created by Luis Cuenca on 11/22/2017
// Copyright 2017 High Fidelity, Inc.
//
// Distributed under the Apache License, Version 2.0.
// See the accompanying file LICENSE or https://www.apache.org/licenses/LICENSE-2.0.html
//
import QtQuick 2.5
import QtQuick.Controls 1.4
import "styles-uit"
import "controls-uit" as HifiControlsUit
Item {
id: root
width: parent.width
height: parent.height
property var _scopeInputData
property var _scopeOutputLeftData
property var _scopeOutputRightData
property var _triggerInputData
property var _triggerOutputLeftData
property var _triggerOutputRightData
property var _triggerValues: QtObject{
property int x: parent.width/2
property int y: parent.height/3
}
property var _triggered: false
property var _steps
property var _refreshMs: 32
property var _framesPerSecond: AudioScope.getFramesPerSecond()
property var _isFrameUnits: true
property var _holdStart: QtObject{
property int x: 0
property int y: 0
}
property var _holdEnd: QtObject{
property int x: 0
property int y: 0
}
property var _timeBeforeHold: 300
property var _pressedTime: 0
property var _isPressed: false
property var _recOpacity : 0.0
property var _recSign : 0.05
property var _outputLeftState: false
property var _outputRightState: false
property var _wavFilePath: ""
function isHolding() {
return (_pressedTime > _timeBeforeHold);
}
function updateMeasureUnits() {
timeButton.text = _isFrameUnits ? "Display Frames" : "Milliseconds";
fiveLabel.text = _isFrameUnits ? "5" : "" + (Math.round(1000 * 5.0/_framesPerSecond));
twentyLabel.text = _isFrameUnits ? "20" : "" + (Math.round(1000 * 20.0/_framesPerSecond));
fiftyLabel.text = _isFrameUnits ? "50" : "" + (Math.round(1000 * 50.0/_framesPerSecond));
}
function collectScopeData() {
if (inputCh.checked) {
_scopeInputData = AudioScope.scopeInput;
}
if (outputLeftCh.checked) {
_scopeOutputLeftData = AudioScope.scopeOutputLeft;
}
if (outputRightCh.checked) {
_scopeOutputRightData = AudioScope.scopeOutputRight;
}
}
function collectTriggerData() {
if (inputCh.checked) {
_triggerInputData = AudioScope.triggerInput;
}
if (outputLeftCh.checked) {
_triggerOutputLeftData = AudioScope.triggerOutputLeft;
}
if (outputRightCh.checked) {
_triggerOutputRightData = AudioScope.triggerOutputRight;
}
}
function setRecordingLabelOpacity(opacity) {
_recOpacity = opacity;
recCircle.opacity = _recOpacity;
recText.opacity = _recOpacity;
}
function updateRecordingLabel() {
_recOpacity += _recSign;
if (_recOpacity > 1.0 || _recOpacity < 0.0) {
_recOpacity = _recOpacity > 1.0 ? 1.0 : 0.0;
_recSign *= -1;
}
setRecordingLabelOpacity(_recOpacity);
}
function pullFreshValues() {
if (Audio.getRecording()) {
updateRecordingLabel();
}
if (!AudioScope.getPause()) {
if (!_triggered) {
collectScopeData();
}
}
if (inputCh.checked || outputLeftCh.checked || outputRightCh.checked) {
mycanvas.requestPaint();
}
}
function startRecording() {
_wavFilePath = (new Date()).toISOString(); // yyyy-mm-ddThh:mm:ss.sssZ
_wavFilePath = _wavFilePath.replace(/[\-:]|\.\d*Z$/g, "").replace("T", "-") + ".wav";
// Using controller recording default directory
_wavFilePath = Recording.getDefaultRecordingSaveDirectory() + _wavFilePath;
if (!Audio.startRecording(_wavFilePath)) {
Messages.sendMessage("Hifi-Notifications", JSON.stringify({message:"Error creating: "+_wavFilePath}));
updateRecordingUI(false);
}
}
function stopRecording() {
Audio.stopRecording();
setRecordingLabelOpacity(0.0);
Messages.sendMessage("Hifi-Notifications", JSON.stringify({message:"Saved: "+_wavFilePath}));
}
function updateRecordingUI(isRecording) {
if (!isRecording) {
recordButton.text = "Record";
recordButton.color = hifi.buttons.black;
outputLeftCh.checked = _outputLeftState;
outputRightCh.checked = _outputRightState;
} else {
recordButton.text = "Stop";
recordButton.color = hifi.buttons.red;
_outputLeftState = outputLeftCh.checked;
_outputRightState = outputRightCh.checked;
outputLeftCh.checked = true;
outputRightCh.checked = true;
}
}
function toggleRecording() {
if (Audio.getRecording()) {
updateRecordingUI(false);
stopRecording();
} else {
updateRecordingUI(true);
startRecording();
}
}
Timer {
interval: _refreshMs; running: true; repeat: true
onTriggered: pullFreshValues()
}
Canvas {
id: mycanvas
anchors.fill:parent
onPaint: {
function displayMeasureArea(ctx) {
ctx.fillStyle = Qt.rgba(0.1, 0.1, 0.1, 1);
ctx.fillRect(_holdStart.x, 0, _holdEnd.x - _holdStart.x, height);
ctx.lineWidth = "2";
ctx.strokeStyle = "#555555";
ctx.beginPath();
ctx.moveTo(_holdStart.x, 0);
ctx.lineTo(_holdStart.x, height);
ctx.moveTo(_holdEnd.x, 0);
ctx.lineTo(_holdEnd.x, height);
ctx.moveTo(_holdStart.x, _holdStart.y);
ctx.lineTo(_holdEnd.x, _holdStart.y);
ctx.moveTo(_holdEnd.x, _holdEnd.y);
ctx.lineTo(_holdStart.x, _holdEnd.y);
ctx.stroke();
}
function displayTrigger(ctx, lineWidth, color) {
var crossSize = 3;
var holeSize = 2;
ctx.lineWidth = lineWidth;
ctx.strokeStyle = color;
ctx.beginPath();
ctx.moveTo(_triggerValues.x - (crossSize + holeSize), _triggerValues.y);
ctx.lineTo(_triggerValues.x - holeSize, _triggerValues.y);
ctx.moveTo(_triggerValues.x + holeSize, _triggerValues.y);
ctx.lineTo(_triggerValues.x + (crossSize + holeSize), _triggerValues.y);
ctx.moveTo(_triggerValues.x, _triggerValues.y - (crossSize + holeSize));
ctx.lineTo(_triggerValues.x, _triggerValues.y - holeSize);
ctx.moveTo(_triggerValues.x, _triggerValues.y + holeSize);
ctx.lineTo(_triggerValues.x, _triggerValues.y + (crossSize + holeSize));
ctx.stroke();
}
function displayBackground(ctx, datawidth, steps, lineWidth, color) {
var verticalPadding = 100;
ctx.strokeStyle = color;
ctx.lineWidth = lineWidth;
ctx.moveTo(0, height/2);
ctx.lineTo(datawidth, height/2);
var gap = datawidth/steps;
for (var i = 0; i < steps; i++) {
ctx.moveTo(i*gap + 1, verticalPadding);
ctx.lineTo(i*gap + 1, height-verticalPadding);
}
ctx.moveTo(datawidth-1, verticalPadding);
ctx.lineTo(datawidth-1, height-verticalPadding);
ctx.stroke();
}
function drawScope(ctx, data, width, color) {
ctx.beginPath();
ctx.strokeStyle = color;
ctx.lineWidth = width;
var x = 0;
for (var i = 0; i < data.length-1; i++) {
ctx.moveTo(x, data[i] + height/2);
ctx.lineTo(++x, data[i+1] + height/2);
}
ctx.stroke();
}
function getMeasurementText(dist) {
var datasize = _scopeInputData.length;
var value = 0;
if (fiveFrames.checked) {
value = (_isFrameUnits) ? 5.0*dist/datasize : (Math.round(1000 * 5.0/_framesPerSecond))*dist/datasize;
} else if (twentyFrames.checked) {
value = (_isFrameUnits) ? 20.0*dist/datasize : (Math.round(1000 * 20.0/_framesPerSecond))*dist/datasize;
} else if (fiftyFrames.checked) {
value = (_isFrameUnits) ? 50.0*dist/datasize : (Math.round(1000 * 50.0/_framesPerSecond))*dist/datasize;
}
value = Math.abs(Math.round(value*100)/100);
var measureText = "" + value + (_isFrameUnits ? " frames" : " milliseconds");
return measureText;
}
function drawMeasurements(ctx, color) {
ctx.fillStyle = color;
ctx.font = "normal 16px sans-serif";
var fontwidth = 8;
var measureText = getMeasurementText(_holdEnd.x - _holdStart.x);
if (_holdStart.x < _holdEnd.x) {
ctx.fillText("" + height/2 - _holdStart.y, _holdStart.x-40, _holdStart.y);
ctx.fillText("" + height/2 - _holdEnd.y, _holdStart.x-40, _holdEnd.y);
ctx.fillText(measureText, _holdEnd.x+10, _holdEnd.y);
} else {
ctx.fillText("" + height/2 - _holdStart.y, _holdStart.x+10, _holdStart.y);
ctx.fillText("" + height/2 - _holdEnd.y, _holdStart.x+10, _holdEnd.y);
ctx.fillText(measureText, _holdEnd.x-fontwidth*measureText.length, _holdEnd.y);
}
}
var ctx = getContext("2d");
ctx.fillStyle = Qt.rgba(0, 0, 0, 1);
ctx.fillRect(0, 0, width, height);
if (isHolding()) {
displayMeasureArea(ctx);
}
var guideLinesColor = "#555555"
var guideLinesWidth = "1"
displayBackground(ctx, _scopeInputData.length, _steps, guideLinesWidth, guideLinesColor);
var triggerWidth = "3"
var triggerColor = "#EFB400"
if (AudioScope.getAutoTrigger()) {
displayTrigger(ctx, triggerWidth, triggerColor);
}
var scopeWidth = "2"
var scopeInputColor = "#00B4EF"
var scopeOutputLeftColor = "#BB0000"
var scopeOutputRightColor = "#00BB00"
if (!_triggered) {
if (inputCh.checked) {
drawScope(ctx, _scopeInputData, scopeWidth, scopeInputColor);
}
if (outputLeftCh.checked) {
drawScope(ctx, _scopeOutputLeftData, scopeWidth, scopeOutputLeftColor);
}
if (outputRightCh.checked) {
drawScope(ctx, _scopeOutputRightData, scopeWidth, scopeOutputRightColor);
}
} else {
if (inputCh.checked) {
drawScope(ctx, _triggerInputData, scopeWidth, scopeInputColor);
}
if (outputLeftCh.checked) {
drawScope(ctx, _triggerOutputLeftData, scopeWidth, scopeOutputLeftColor);
}
if (outputRightCh.checked) {
drawScope(ctx, _triggerOutputRightData, scopeWidth, scopeOutputRightColor);
}
}
if (isHolding()) {
drawMeasurements(ctx, "#eeeeee");
}
if (_isPressed) {
_pressedTime += _refreshMs;
}
}
}
MouseArea {
id: hitbox
anchors.fill: mycanvas
hoverEnabled: true
onPressed: {
_isPressed = true;
_pressedTime = 0;
_holdStart.x = mouseX;
_holdStart.y = mouseY;
}
onPositionChanged: {
_holdEnd.x = mouseX;
_holdEnd.y = mouseY;
}
onReleased: {
if (!isHolding() && AudioScope.getAutoTrigger()) {
_triggerValues.x = mouseX
_triggerValues.y = mouseY
AudioScope.setTriggerValues(mouseX, mouseY-height/2);
}
_isPressed = false;
_pressedTime = 0;
}
}
HifiControlsUit.CheckBox {
id: activated
boxSize: 20
anchors.top: parent.top;
anchors.left: parent.left;
anchors.topMargin: 8;
anchors.leftMargin: 20;
checked: AudioScope.getVisible();
onCheckedChanged: {
AudioScope.setVisible(checked);
activelabel.text = AudioScope.getVisible() ? "On" : "Off"
}
}
HifiControlsUit.Label {
id: activelabel
text: AudioScope.getVisible() ? "On" : "Off"
anchors.top: activated.top;
anchors.left: activated.right;
}
HifiControlsUit.CheckBox {
id: outputLeftCh
boxSize: 20
text: "Output L"
anchors.horizontalCenter: parent.horizontalCenter;
anchors.top: parent.top;
anchors.topMargin: 8;
onCheckedChanged: {
AudioScope.setServerEcho(outputLeftCh.checked || outputRightCh.checked);
}
}
HifiControlsUit.Label {
text: "Channels";
anchors.horizontalCenter: outputLeftCh.horizontalCenter;
anchors.bottom: outputLeftCh.top;
anchors.bottomMargin: 8;
}
HifiControlsUit.CheckBox {
id: inputCh
boxSize: 20
text: "Input Mono"
anchors.bottom: outputLeftCh.bottom;
anchors.right: outputLeftCh.left;
anchors.rightMargin: 40;
onCheckedChanged: {
AudioScope.setLocalEcho(checked);
}
}
HifiControlsUit.CheckBox {
id: outputRightCh
boxSize: 20
text: "Output R"
anchors.bottom: outputLeftCh.bottom;
anchors.left: outputLeftCh.right;
anchors.leftMargin: 40;
onCheckedChanged: {
AudioScope.setServerEcho(outputLeftCh.checked || outputRightCh.checked);
}
}
HifiControlsUit.Button {
id: recordButton;
text: "Record";
color: hifi.buttons.black;
colorScheme: hifi.colorSchemes.dark;
anchors.right: parent.right;
anchors.bottom: parent.bottom;
anchors.rightMargin: 30;
anchors.bottomMargin: 8;
width: 95;
height: 55;
onClicked: {
toggleRecording();
}
}
HifiControlsUit.Button {
id: pauseButton;
color: hifi.buttons.black;
colorScheme: hifi.colorSchemes.dark;
anchors.right: recordButton.left;
anchors.bottom: parent.bottom;
anchors.rightMargin: 30;
anchors.bottomMargin: 8;
height: 55;
width: 95;
text: " Pause ";
onClicked: {
AudioScope.togglePause();
}
}
HifiControlsUit.CheckBox {
id: twentyFrames
boxSize: 20
anchors.left: parent.horizontalCenter;
anchors.bottom: parent.bottom;
anchors.bottomMargin: 8;
onCheckedChanged: {
if (checked){
fiftyFrames.checked = false;
fiveFrames.checked = false;
AudioScope.selectAudioScopeTwentyFrames();
_steps = 20;
AudioScope.setPause(false);
}
}
}
HifiControlsUit.Label {
id:twentyLabel
anchors.left: twentyFrames.right;
anchors.verticalCenter: twentyFrames.verticalCenter;
}
HifiControlsUit.Button {
id: timeButton;
color: hifi.buttons.black;
colorScheme: hifi.colorSchemes.dark;
text: "Display Frames";
anchors.horizontalCenter: twentyFrames.horizontalCenter;
anchors.bottom: twentyFrames.top;
anchors.bottomMargin: 8;
height: 26;
onClicked: {
_isFrameUnits = !_isFrameUnits;
updateMeasureUnits();
}
}
HifiControlsUit.CheckBox {
id: fiveFrames
boxSize: 20
anchors.horizontalCenter: parent.horizontalCenter;
anchors.bottom: parent.bottom;
anchors.bottomMargin: 8;
anchors.horizontalCenterOffset: -50;
checked: true;
onCheckedChanged: {
if (checked) {
fiftyFrames.checked = false;
twentyFrames.checked = false;
AudioScope.selectAudioScopeFiveFrames();
_steps = 5;
AudioScope.setPause(false);
}
}
}
HifiControlsUit.Label {
id:fiveLabel
anchors.left: fiveFrames.right;
anchors.verticalCenter: fiveFrames.verticalCenter;
}
HifiControlsUit.CheckBox {
id: fiftyFrames
boxSize: 20
anchors.horizontalCenter: parent.horizontalCenter;
anchors.bottom: parent.bottom;
anchors.bottomMargin: 8;
anchors.horizontalCenterOffset: 70;
onCheckedChanged: {
if (checked) {
twentyFrames.checked = false;
fiveFrames.checked = false;
AudioScope.selectAudioScopeFiftyFrames();
_steps = 50;
AudioScope.setPause(false);
}
}
}
HifiControlsUit.Label {
id:fiftyLabel
anchors.left: fiftyFrames.right;
anchors.verticalCenter: fiftyFrames.verticalCenter;
}
HifiControlsUit.Switch {
id: triggerSwitch;
height: 26;
anchors.left: parent.left;
anchors.bottom: parent.bottom;
anchors.leftMargin: 75;
anchors.bottomMargin: 8;
labelTextOff: "Off";
labelTextOn: "On";
onCheckedChanged: {
if (!checked) AudioScope.setPause(false);
AudioScope.setPause(false);
AudioScope.setAutoTrigger(checked);
AudioScope.setTriggerValues(_triggerValues.x, _triggerValues.y-root.height/2);
}
}
HifiControlsUit.Label {
text: "Trigger";
anchors.left: triggerSwitch.left;
anchors.leftMargin: -15;
anchors.bottom: triggerSwitch.top;
}
Rectangle {
id: recordIcon;
width:110;
height:40;
anchors.right: parent.right;
anchors.top: parent.top;
anchors.topMargin: 8;
color: "transparent"
Text {
id: recText
text: "REC"
color: "red"
font.pixelSize: 30;
anchors.left: recCircle.right;
anchors.leftMargin: 10;
opacity: _recOpacity;
y: -8;
}
Rectangle {
id: recCircle;
width: 25;
height: 25;
radius: width*0.5
opacity: _recOpacity;
color: "red";
}
}
Component.onCompleted: {
_steps = AudioScope.getFramesPerScope();
AudioScope.setTriggerValues(_triggerValues.x, _triggerValues.y-root.height/2);
activated.checked = true;
inputCh.checked = true;
updateMeasureUnits();
}
Connections {
target: AudioScope
onPauseChanged: {
if (!AudioScope.getPause()) {
pauseButton.text = "Pause";
pauseButton.color = hifi.buttons.black;
AudioScope.setTriggered(false);
_triggered = false;
} else {
pauseButton.text = "Continue";
pauseButton.color = hifi.buttons.blue;
}
}
onTriggered: {
_triggered = true;
collectTriggerData();
AudioScope.setPause(true);
}
}
}

View file

@ -11,6 +11,7 @@
import QtQuick 2.5
import QtQuick.Controls 1.4
import QtQuick.Controls.Styles 1.4
import QtQuick.Controls 2.2 as QQC2
import "../styles-uit"
@ -24,6 +25,45 @@ TableView {
model: ListModel { }
Component.onCompleted: {
if (flickableItem !== null && flickableItem !== undefined) {
tableView.flickableItem.QQC2.ScrollBar.vertical = scrollbar
}
}
QQC2.ScrollBar {
id: scrollbar
parent: tableView.flickableItem
policy: QQC2.ScrollBar.AsNeeded
orientation: Qt.Vertical
visible: size < 1.0
topPadding: tableView.headerVisible ? hifi.dimensions.tableHeaderHeight + 1 : 1
anchors.top: tableView.top
anchors.left: tableView.right
anchors.bottom: tableView.bottom
background: Item {
implicitWidth: hifi.dimensions.scrollbarBackgroundWidth
Rectangle {
anchors {
fill: parent;
topMargin: tableView.headerVisible ? hifi.dimensions.tableHeaderHeight : 0
}
color: isLightColorScheme ? hifi.colors.tableScrollBackgroundLight
: hifi.colors.tableScrollBackgroundDark
}
}
contentItem: Item {
implicitWidth: hifi.dimensions.scrollbarHandleWidth
Rectangle {
anchors.fill: parent
radius: (width - 4)/2
color: isLightColorScheme ? hifi.colors.tableScrollHandleLight : hifi.colors.tableScrollHandleDark
}
}
}
headerVisible: false
headerDelegate: Rectangle {
height: hifi.dimensions.tableHeaderHeight
@ -98,74 +138,13 @@ TableView {
backgroundVisible: true
horizontalScrollBarPolicy: Qt.ScrollBarAlwaysOff
verticalScrollBarPolicy: Qt.ScrollBarAsNeeded
verticalScrollBarPolicy: Qt.ScrollBarAlwaysOff
style: TableViewStyle {
// Needed in order for rows to keep displaying rows after end of table entries.
backgroundColor: tableView.isLightColorScheme ? hifi.colors.tableBackgroundLight : hifi.colors.tableBackgroundDark
alternateBackgroundColor: tableView.isLightColorScheme ? hifi.colors.tableRowLightOdd : hifi.colors.tableRowDarkOdd
padding.top: headerVisible ? hifi.dimensions.tableHeaderHeight: 0
handle: Item {
id: scrollbarHandle
implicitWidth: hifi.dimensions.scrollbarHandleWidth
Rectangle {
anchors {
fill: parent
topMargin: 3
bottomMargin: 3 // ""
leftMargin: 1 // Move it right
rightMargin: -1 // ""
}
radius: hifi.dimensions.scrollbarHandleWidth/2
color: isLightColorScheme ? hifi.colors.tableScrollHandleLight : hifi.colors.tableScrollHandleDark
}
}
scrollBarBackground: Item {
implicitWidth: hifi.dimensions.scrollbarBackgroundWidth
Rectangle {
anchors {
fill: parent
margins: -1 // Expand
topMargin: -1
}
color: isLightColorScheme ? hifi.colors.tableScrollBackgroundLight : hifi.colors.tableScrollBackgroundDark
// Extend header color above scrollbar background
Rectangle {
anchors {
top: parent.top
topMargin: -hifi.dimensions.tableHeaderHeight
left: parent.left
right: parent.right
}
height: hifi.dimensions.tableHeaderHeight
color: tableView.isLightColorScheme ? hifi.colors.tableBackgroundLight : hifi.colors.tableBackgroundDark
visible: headerVisible
}
Rectangle {
// Extend header bottom border
anchors {
top: parent.top
left: parent.left
right: parent.right
}
height: 1
color: isLightColorScheme ? hifi.colors.lightGrayText : hifi.colors.baseGrayHighlight
visible: headerVisible
}
}
}
incrementControl: Item {
visible: false
}
decrementControl: Item {
visible: false
}
}
rowDelegate: Rectangle {

View file

@ -9,9 +9,11 @@
//
import QtQml.Models 2.2
import QtQuick 2.5
import QtQuick 2.7
import QtQuick.Controls 1.4
import QtQuick.Controls.Styles 1.4
import QtQuick.Controls 2.2 as QQC2
import "../styles-uit"
@ -35,6 +37,45 @@ TreeView {
headerVisible: false
Component.onCompleted: {
if (flickableItem !== null && flickableItem !== undefined) {
treeView.flickableItem.QQC2.ScrollBar.vertical = scrollbar
}
}
QQC2.ScrollBar {
id: scrollbar
parent: treeView.flickableItem
policy: QQC2.ScrollBar.AsNeeded
orientation: Qt.Vertical
visible: size < 1.0
topPadding: treeView.headerVisible ? hifi.dimensions.tableHeaderHeight + 1 : 1
anchors.top: treeView.top
anchors.left: treeView.right
anchors.bottom: treeView.bottom
background: Item {
implicitWidth: hifi.dimensions.scrollbarBackgroundWidth
Rectangle {
anchors {
fill: parent;
topMargin: treeView.headerVisible ? hifi.dimensions.tableHeaderHeight: 0
}
color: isLightColorScheme ? hifi.colors.tableScrollBackgroundLight
: hifi.colors.tableScrollBackgroundDark
}
}
contentItem: Item {
implicitWidth: hifi.dimensions.scrollbarHandleWidth
Rectangle {
anchors.fill: parent
radius: (width - 4)/2
color: isLightColorScheme ? hifi.colors.tableScrollHandleLight : hifi.colors.tableScrollHandleDark
}
}
}
// Use rectangle to draw border with rounded corners.
frameVisible: false
Rectangle {
@ -50,7 +91,7 @@ TreeView {
backgroundVisible: true
horizontalScrollBarPolicy: Qt.ScrollBarAlwaysOff
verticalScrollBarPolicy: Qt.ScrollBarAsNeeded
verticalScrollBarPolicy: Qt.ScrollBarAlwaysOff
style: TreeViewStyle {
// Needed in order for rows to keep displaying rows after end of table entries.
@ -126,66 +167,6 @@ TreeView {
leftMargin: hifi.dimensions.tablePadding / 2
}
}
handle: Item {
id: scrollbarHandle
implicitWidth: hifi.dimensions.scrollbarHandleWidth
Rectangle {
anchors {
fill: parent
topMargin: treeView.headerVisible ? hifi.dimensions.tableHeaderHeight + 3 : 3
bottomMargin: 3 // ""
leftMargin: 1 // Move it right
rightMargin: -1 // ""
}
radius: hifi.dimensions.scrollbarHandleWidth / 2
color: treeView.isLightColorScheme ? hifi.colors.tableScrollHandleLight : hifi.colors.tableScrollHandleDark
}
}
scrollBarBackground: Item {
implicitWidth: hifi.dimensions.scrollbarBackgroundWidth
Rectangle {
anchors {
fill: parent
topMargin: treeView.headerVisible ? hifi.dimensions.tableHeaderHeight - 1 : -1
margins: -1 // Expand
}
color: treeView.isLightColorScheme ? hifi.colors.tableScrollBackgroundLight : hifi.colors.tableScrollBackgroundDark
// Extend header color above scrollbar background
Rectangle {
anchors {
top: parent.top
topMargin: -hifi.dimensions.tableHeaderHeight
left: parent.left
right: parent.right
}
height: hifi.dimensions.tableHeaderHeight
color: treeView.isLightColorScheme ? hifi.colors.tableBackgroundLight : hifi.colors.tableBackgroundDark
visible: treeView.headerVisible
}
Rectangle {
// Extend header bottom border
anchors {
top: parent.top
left: parent.left
right: parent.right
}
height: 1
color: treeView.isLightColorScheme ? hifi.colors.lightGrayText : hifi.colors.baseGrayHighlight
visible: treeView.headerVisible
}
}
}
incrementControl: Item {
visible: false
}
decrementControl: Item {
visible: false
}
}
rowDelegate: Rectangle {
@ -193,8 +174,8 @@ TreeView {
color: styleData.selected
? hifi.colors.primaryHighlight
: treeView.isLightColorScheme
? (styleData.alternate ? hifi.colors.tableRowLightEven : hifi.colors.tableRowLightOdd)
: (styleData.alternate ? hifi.colors.tableRowDarkEven : hifi.colors.tableRowDarkOdd)
? (styleData.alternate ? hifi.colors.tableRowLightEven : hifi.colors.tableRowLightOdd)
: (styleData.alternate ? hifi.colors.tableRowDarkEven : hifi.colors.tableRowDarkOdd)
}
itemDelegate: FiraSansSemiBold {
@ -209,9 +190,9 @@ TreeView {
text: styleData.value
size: hifi.fontSizes.tableText
color: colorScheme == hifi.colorSchemes.light
? (styleData.selected ? hifi.colors.black : hifi.colors.baseGrayHighlight)
: (styleData.selected ? hifi.colors.black : hifi.colors.lightGrayText)
? (styleData.selected ? hifi.colors.black : hifi.colors.baseGrayHighlight)
: (styleData.selected ? hifi.colors.black : hifi.colors.lightGrayText)
elide: Text.ElideRight
}

View file

@ -21,6 +21,8 @@ Item {
signal newViewRequestedCallback(var request)
signal loadingChangedCallback(var loadRequest)
width: parent.width
property bool interactive: false
StylesUIt.HifiConstants {
@ -58,7 +60,8 @@ Item {
WebEngineView {
id: webViewCore
anchors.fill: parent
width: parent.width
height: parent.height
profile: HFWebEngineProfile;
settings.pluginsEnabled: true
@ -91,20 +94,19 @@ Item {
userScripts: [ createGlobalEventBridge, raiseAndLowerKeyboard, userScript ]
property string newUrl: ""
Component.onCompleted: {
webChannel.registerObject("eventBridge", eventBridge);
webChannel.registerObject("eventBridgeWrapper", eventBridgeWrapper);
// Ensure the JS from the web-engine makes it to our logging
webViewCore.javaScriptConsoleMessage.connect(function(level, message, lineNumber, sourceID) {
console.log("Web Entity JS message: " + sourceID + " " + lineNumber + " " + message);
});
if (webViewCoreUserAgent !== undefined) {
webViewCore.profile.httpUserAgent = webViewCoreUserAgent
} else {
webViewCore.profile.httpUserAgent += " (HighFidelityInterface)";
}
// Ensure the JS from the web-engine makes it to our logging
webViewCore.javaScriptConsoleMessage.connect(function(level, message, lineNumber, sourceID) {
console.log("Web Entity JS message: " + sourceID + " " + lineNumber + " " + message);
});
}
onFeaturePermissionRequested: {

View file

@ -26,6 +26,7 @@ Rectangle {
// Style
color: "#E3E3E3";
// Properties
property bool debug: false;
property int myCardWidth: width - upperRightInfoContainer.width;
property int myCardHeight: 80;
property int rowHeight: 60;
@ -1120,7 +1121,9 @@ Rectangle {
break;
case 'connections':
var data = message.params;
console.log('Got connection data: ', JSON.stringify(data));
if (pal.debug) {
console.log('Got connection data: ', JSON.stringify(data));
}
connectionsUserModelData = data;
sortConnectionsModel();
connectionsLoading.visible = false;

View file

@ -79,10 +79,12 @@ Rectangle {
if (result.status !== 'success') {
failureErrorText.text = result.message;
root.activeView = "checkoutFailure";
UserActivityLogger.commercePurchaseFailure(root.itemId, root.itemPrice, !root.alreadyOwned, result.message);
} else {
root.itemHref = result.data.download_url;
root.isWearable = result.data.categories.indexOf("Wearables") > -1;
root.activeView = "checkoutSuccess";
UserActivityLogger.commercePurchaseSuccess(root.itemId, root.itemPrice, !root.alreadyOwned);
}
}
@ -599,6 +601,7 @@ Rectangle {
sendToScript({method: 'checkout_rezClicked', itemHref: root.itemHref, isWearable: root.isWearable});
rezzedNotifContainer.visible = true;
rezzedNotifContainerTimer.start();
UserActivityLogger.commerceEntityRezzed(root.itemId, "checkout", root.isWearable ? "rez" : "wear");
}
}
RalewaySemiBold {
@ -902,7 +905,7 @@ Rectangle {
}
buyTextContainer.color = "#FFC3CD";
buyTextContainer.border.color = "#F3808F";
buyGlyph.text = hifi.glyphs.error;
buyGlyph.text = hifi.glyphs.alert;
buyGlyph.size = 54;
} else {
if (root.alreadyOwned) {

View file

@ -430,7 +430,7 @@ Rectangle {
var a = new Date(timestamp);
var year = a.getFullYear();
var month = addLeadingZero(a.getMonth());
var month = addLeadingZero(a.getMonth() + 1);
var day = addLeadingZero(a.getDate());
var hour = a.getHours();
var drawnHour = hour;

View file

@ -349,6 +349,7 @@ Item {
sendToPurchases({method: 'purchases_rezClicked', itemHref: root.itemHref, isWearable: root.isWearable});
rezzedNotifContainer.visible = true;
rezzedNotifContainerTimer.start();
UserActivityLogger.commerceEntityRezzed(root.itemId, "purchases", root.isWearable ? "rez" : "wear");
}
style: ButtonStyle {

View file

@ -343,6 +343,9 @@ Rectangle {
ListModel {
id: previousPurchasesModel;
}
HifiCommerceCommon.SortableListModel {
id: tempPurchasesModel;
}
HifiCommerceCommon.SortableListModel {
id: filteredPurchasesModel;
}
@ -635,20 +638,40 @@ Rectangle {
}
function buildFilteredPurchasesModel() {
filteredPurchasesModel.clear();
var sameItemCount = 0;
tempPurchasesModel.clear();
for (var i = 0; i < purchasesModel.count; i++) {
if (purchasesModel.get(i).title.toLowerCase().indexOf(filterBar.text.toLowerCase()) !== -1) {
if (purchasesModel.get(i).status !== "confirmed" && !root.isShowingMyItems) {
filteredPurchasesModel.insert(0, purchasesModel.get(i));
tempPurchasesModel.insert(0, purchasesModel.get(i));
} else if ((root.isShowingMyItems && purchasesModel.get(i).edition_number === "0") ||
(!root.isShowingMyItems && purchasesModel.get(i).edition_number !== "0")) {
filteredPurchasesModel.append(purchasesModel.get(i));
tempPurchasesModel.append(purchasesModel.get(i));
}
}
}
for (var i = 0; i < tempPurchasesModel.count; i++) {
if (!filteredPurchasesModel.get(i)) {
sameItemCount = -1;
break;
} else if (tempPurchasesModel.get(i).itemId === filteredPurchasesModel.get(i).itemId &&
tempPurchasesModel.get(i).edition_number === filteredPurchasesModel.get(i).edition_number &&
tempPurchasesModel.get(i).status === filteredPurchasesModel.get(i).status) {
sameItemCount++;
}
}
populateDisplayedItemCounts();
sortByDate();
if (sameItemCount !== tempPurchasesModel.count) {
filteredPurchasesModel.clear();
for (var i = 0; i < tempPurchasesModel.count; i++) {
filteredPurchasesModel.append(tempPurchasesModel.get(i));
}
populateDisplayedItemCounts();
sortByDate();
}
}
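buildFilteredPurchasesModel() above stages the filtered results in tempPurchasesModel, counts how many entries match what filteredPurchasesModel already shows, and only rebuilds the visible model when that count differs, presumably so an unchanged list is not cleared and re-appended on every refresh. A minimal sketch of the same compare-then-swap idea on plain JavaScript arrays; the field names are just the ones compared above:

    // Rebuild `visible` from `incoming` only when the contents actually differ.
    function sameEntry(a, b) {
        return a.itemId === b.itemId &&
               a.edition_number === b.edition_number &&
               a.status === b.status;
    }
    function refreshIfChanged(visible, incoming) {
        var unchanged = visible.length === incoming.length &&
            incoming.every(function (item, i) { return sameEntry(item, visible[i]); });
        if (!unchanged) {
            visible.length = 0;                                  // clear in place
            incoming.forEach(function (item) { visible.push(item); });
        }
        return !unchanged;                                       // true when the list was rebuilt
    }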
function checkIfAnyItemStatusChanged() {

@ -206,16 +206,6 @@ Item {
root.isPasswordField = (focus && passphraseField.echoMode === TextInput.Password);
}
MouseArea {
anchors.fill: parent;
onClicked: {
root.keyboardRaised = true;
root.isPasswordField = (passphraseField.echoMode === TextInput.Password);
mouse.accepted = false;
}
}
onAccepted: {
submitPassphraseInputButton.enabled = false;
commerce.setPassphrase(passphraseField.text);
@ -362,25 +352,6 @@ Item {
right: parent.right;
}
Image {
id: lowerKeyboardButton;
z: 999;
source: "images/lowerKeyboard.png";
anchors.right: keyboard.right;
anchors.top: keyboard.showMirrorText ? keyboard.top : undefined;
anchors.bottom: keyboard.showMirrorText ? undefined : keyboard.bottom;
height: 50;
width: 60;
MouseArea {
anchors.fill: parent;
onClicked: {
root.keyboardRaised = false;
}
}
}
HifiControlsUit.Keyboard {
id: keyboard;
raised: HMD.mounted && root.keyboardRaised;

@ -82,17 +82,6 @@ Item {
if (focus) {
var hidePassword = (currentPassphraseField.echoMode === TextInput.Password);
sendSignalToWallet({method: 'walletSetup_raiseKeyboard', isPasswordField: hidePassword});
} else if (!passphraseFieldAgain.focus) {
sendSignalToWallet({method: 'walletSetup_lowerKeyboard', isPasswordField: false});
}
}
MouseArea {
anchors.fill: parent;
onPressed: {
var hidePassword = (currentPassphraseField.echoMode === TextInput.Password);
sendSignalToWallet({method: 'walletSetup_raiseKeyboard', isPasswordField: hidePassword});
mouse.accepted = false;
}
}
@ -115,21 +104,10 @@ Item {
activeFocusOnPress: true;
activeFocusOnTab: true;
MouseArea {
anchors.fill: parent;
onPressed: {
var hidePassword = (passphraseField.echoMode === TextInput.Password);
sendSignalToWallet({method: 'walletSetup_raiseKeyboard', isPasswordField: hidePassword});
mouse.accepted = false;
}
}
onFocusChanged: {
if (focus) {
var hidePassword = (passphraseField.echoMode === TextInput.Password);
sendMessageToLightbox({method: 'walletSetup_raiseKeyboard', isPasswordField: hidePassword});
} else if (!passphraseFieldAgain.focus) {
sendMessageToLightbox({method: 'walletSetup_lowerKeyboard', isPasswordField: false});
}
}
@ -151,21 +129,10 @@ Item {
activeFocusOnPress: true;
activeFocusOnTab: true;
MouseArea {
anchors.fill: parent;
onPressed: {
var hidePassword = (passphraseFieldAgain.echoMode === TextInput.Password);
sendSignalToWallet({method: 'walletSetup_raiseKeyboard', isPasswordField: hidePassword});
mouse.accepted = false;
}
}
onFocusChanged: {
if (focus) {
var hidePassword = (passphraseFieldAgain.echoMode === TextInput.Password);
sendMessageToLightbox({method: 'walletSetup_raiseKeyboard', isPasswordField: hidePassword});
} else if (!passphraseField.focus) {
sendMessageToLightbox({method: 'walletSetup_lowerKeyboard', isPasswordField: false});
}
}

@ -47,6 +47,12 @@ Rectangle {
} else if (walletStatus === 1) {
if (root.activeView !== "walletSetup") {
root.activeView = "walletSetup";
commerce.resetLocalWalletOnly();
var timestamp = new Date();
walletSetup.startingTimestamp = timestamp;
walletSetup.setupAttemptID = generateUUID();
UserActivityLogger.commerceWalletSetupStarted(timestamp, setupAttemptID, walletSetup.setupFlowVersion, walletSetup.referrer ? walletSetup.referrer : "wallet app",
(AddressManager.placename || AddressManager.hostname || '') + (AddressManager.pathname ? AddressManager.pathname.match(/\/[^\/]+/)[0] : ''));
}
} else if (walletStatus === 2) {
if (root.activeView !== "passphraseModal") {
@ -172,7 +178,7 @@ Rectangle {
Connections {
onSendSignalToWallet: {
if (msg.method === 'walletSetup_finished') {
if (msg.referrer === '') {
if (msg.referrer === '' || msg.referrer === 'marketplace cta') {
root.activeView = "initialize";
commerce.getWalletStatus();
} else if (msg.referrer === 'purchases') {
@ -666,25 +672,6 @@ Rectangle {
right: parent.right;
}
Image {
id: lowerKeyboardButton;
z: 999;
source: "images/lowerKeyboard.png";
anchors.right: keyboard.right;
anchors.top: keyboard.showMirrorText ? keyboard.top : undefined;
anchors.bottom: keyboard.showMirrorText ? undefined : keyboard.bottom;
height: 50;
width: 60;
MouseArea {
anchors.fill: parent;
onClicked: {
root.keyboardRaised = false;
}
}
}
HifiControlsUit.Keyboard {
id: keyboard;
raised: HMD.mounted && root.keyboardRaised;
@ -719,12 +706,28 @@ Rectangle {
case 'updateWalletReferrer':
walletSetup.referrer = message.referrer;
break;
case 'inspectionCertificate_resetCert':
// NOP
break;
default:
console.log('Unrecognized message from wallet.js:', JSON.stringify(message));
}
}
signal sendToScript(var message);
// generateUUID() taken from:
// https://stackoverflow.com/a/8809472
function generateUUID() { // Public Domain/MIT
var d = new Date().getTime();
if (typeof performance !== 'undefined' && typeof performance.now === 'function'){
d += performance.now(); //use high-precision timer if available
}
return 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g, function (c) {
var r = (d + Math.random() * 16) % 16 | 0;
d = Math.floor(d / 16);
return (c === 'x' ? r : (r & 0x3 | 0x8)).toString(16);
});
}
//
// FUNCTION DEFINITIONS END
//
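generateUUID() above follows the Stack Overflow recipe it links to: the current time (plus performance.now() where available) is folded into successive Math.random() draws, the version nibble is fixed to 4 by the template, and the y position is masked to 8, 9, a, or b (the RFC 4122 variant bits). A small usage sketch, assuming the function above is in scope:

    var id = generateUUID();   // e.g. "3f2c8a1e-5b4d-4e7a-9c1b-2d6f0a8e4b7c" (different every call)
    // Lowercase hex, '4' as the 15th character, and 8/9/a/b as the 20th:
    console.log(/^[0-9a-f]{8}-[0-9a-f]{4}-4[0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$/.test(id)); // true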

@ -38,10 +38,28 @@ Item {
onHistoryResult : {
historyReceived = true;
if (result.status === 'success') {
transactionHistoryModel.clear();
transactionHistoryModel.append(result.data.history);
var sameItemCount = 0;
tempTransactionHistoryModel.clear();
tempTransactionHistoryModel.append(result.data.history);
for (var i = 0; i < tempTransactionHistoryModel.count; i++) {
if (!transactionHistoryModel.get(i)) {
sameItemCount = -1;
break;
} else if (tempTransactionHistoryModel.get(i).transaction_type === transactionHistoryModel.get(i).transaction_type &&
tempTransactionHistoryModel.get(i).text === transactionHistoryModel.get(i).text) {
sameItemCount++;
}
}
calculatePendingAndInvalidated();
if (sameItemCount !== tempTransactionHistoryModel.count) {
transactionHistoryModel.clear();
for (var i = 0; i < tempTransactionHistoryModel.count; i++) {
transactionHistoryModel.append(tempTransactionHistoryModel.get(i));
}
calculatePendingAndInvalidated();
}
}
refreshTimer.start();
}
@ -50,6 +68,7 @@ Item {
Connections {
target: GlobalServices
onMyUsernameChanged: {
transactionHistoryModel.clear();
usernameText.text = Account.username;
}
}
@ -143,10 +162,9 @@ Item {
Timer {
id: refreshTimer;
interval: 4000; // Remove this after demo?
interval: 4000;
onTriggered: {
console.log("Refreshing Wallet Home...");
historyReceived = false;
commerce.balance();
commerce.history();
}
@ -187,6 +205,9 @@ Item {
// Style
color: hifi.colors.baseGrayHighlight;
}
ListModel {
id: tempTransactionHistoryModel;
}
ListModel {
id: transactionHistoryModel;
}
@ -339,7 +360,7 @@ Item {
var a = new Date(timestamp);
var year = a.getFullYear();
var month = addLeadingZero(a.getMonth());
var month = addLeadingZero(a.getMonth() + 1);
var day = addLeadingZero(a.getDate());
var hour = a.getHours();
var drawnHour = hour;

@ -31,6 +31,10 @@ Item {
property bool hasShownSecurityImageTip: false;
property string referrer;
property string keyFilePath;
property date startingTimestamp;
property string setupAttemptID;
readonly property int setupFlowVersion: 1;
readonly property var setupStepNames: [ "Setup Prompt", "Security Image Selection", "Passphrase Selection", "Private Keys Ready" ];
Image {
anchors.fill: parent;
@ -67,6 +71,13 @@ Item {
anchors.fill: parent;
}
onActiveViewChanged: {
var timestamp = new Date();
var currentStepNumber = root.activeView.substring(5);
UserActivityLogger.commerceWalletSetupProgress(timestamp, root.setupAttemptID,
Math.round((timestamp - root.startingTimestamp)/1000), currentStepNumber, root.setupStepNames[currentStepNumber - 1]);
}
//
// TITLE BAR START
//
@ -371,7 +382,7 @@ Item {
Item {
id: securityImageTip;
visible: false;
visible: !root.hasShownSecurityImageTip && root.activeView === "step_3";
z: 999;
anchors.fill: root;
@ -421,7 +432,6 @@ Item {
text: "Got It";
onClicked: {
root.hasShownSecurityImageTip = true;
securityImageTip.visible = false;
passphraseSelection.focusFirstTextField();
}
}
@ -439,9 +449,6 @@ Item {
onVisibleChanged: {
if (visible) {
commerce.getWalletAuthenticatedStatus();
if (!root.hasShownSecurityImageTip) {
securityImageTip.visible = true;
}
}
}
@ -732,7 +739,11 @@ Item {
text: "Finish";
onClicked: {
root.visible = false;
root.hasShownSecurityImageTip = false;
sendSignalToWallet({method: 'walletSetup_finished', referrer: root.referrer ? root.referrer : ""});
var timestamp = new Date();
UserActivityLogger.commerceWalletSetupFinished(timestamp, setupAttemptID, Math.round((timestamp - root.startingTimestamp)/1000));
}
}
}

@ -26,6 +26,7 @@ Item {
}
Connections {
id: onAttachmentsChangedConnection
target: MyAvatar
onAttachmentsChanged: reload()
}
@ -34,6 +35,12 @@ Item {
reload()
}
function setAttachmentsVariant(attachments) {
onAttachmentsChangedConnection.enabled = false;
MyAvatar.setAttachmentsVariant(attachments);
onAttachmentsChangedConnection.enabled = true;
}
Column {
width: pane.width
@ -92,11 +99,15 @@ Item {
attachments.splice(index, 1);
listView.model.remove(index, 1);
}
onUpdateAttachment: MyAvatar.setAttachmentsVariant(attachments);
onUpdateAttachment: {
setAttachmentsVariant(attachments);
}
}
}
onCountChanged: MyAvatar.setAttachmentsVariant(attachments);
onCountChanged: {
setAttachmentsVariant(attachments);
}
/*
// DEBUG
@ -220,7 +231,7 @@ Item {
};
attachments.push(template);
listView.model.append({});
MyAvatar.setAttachmentsVariant(attachments);
setAttachmentsVariant(attachments);
}
}
@ -250,7 +261,7 @@ Item {
id: cancelAction
text: "Cancel"
onTriggered: {
MyAvatar.setAttachmentsVariant(originalAttachments);
setAttachmentsVariant(originalAttachments);
closeDialog();
}
}
@ -263,7 +274,7 @@ Item {
console.log("Attachment " + i + ": " + attachments[i]);
}
MyAvatar.setAttachmentsVariant(attachments);
setAttachmentsVariant(attachments);
closeDialog();
}
}

@ -9,33 +9,30 @@ Overlay {
Image {
id: image
property bool scaleFix: true;
property real xOffset: 0
property real yOffset: 0
property bool scaleFix: true
property real xStart: 0
property real yStart: 0
property real xSize: 0
property real ySize: 0
property real imageScale: 1.0
property var resizer: Timer {
interval: 50
repeat: false
running: false
onTriggered: {
var targetAspect = root.width / root.height;
var sourceAspect = image.sourceSize.width / image.sourceSize.height;
if (sourceAspect <= targetAspect) {
if (root.width === image.sourceSize.width) {
return;
}
image.imageScale = root.width / image.sourceSize.width;
} else if (sourceAspect > targetAspect){
if (root.height === image.sourceSize.height) {
return;
}
image.imageScale = root.height / image.sourceSize.height;
if (image.xSize === 0) {
image.xSize = image.sourceSize.width - image.xStart;
}
image.sourceSize = Qt.size(image.sourceSize.width * image.imageScale, image.sourceSize.height * image.imageScale);
if (image.ySize === 0) {
image.ySize = image.sourceSize.height - image.yStart;
}
image.anchors.leftMargin = -image.xStart * root.width / image.xSize;
image.anchors.topMargin = -image.yStart * root.height / image.ySize;
image.anchors.rightMargin = (image.xStart + image.xSize - image.sourceSize.width) * root.width / image.xSize;
image.anchors.bottomMargin = (image.yStart + image.ySize - image.sourceSize.height) * root.height / image.ySize;
}
}
x: -1 * xOffset * imageScale
y: -1 * yOffset * imageScale
onSourceSizeChanged: {
if (sourceSize.width !== 0 && sourceSize.height !== 0 && progress === 1.0 && scaleFix) {
@ -43,6 +40,8 @@ Overlay {
resizer.start();
}
}
anchors.fill: parent
}
ColorOverlay {
@ -57,8 +56,10 @@ Overlay {
var key = keys[i];
var value = subImage[key];
switch (key) {
case "x": image.xOffset = value; break;
case "y": image.yOffset = value; break;
case "x": image.xStart = value; break;
case "y": image.yStart = value; break;
case "width": image.xSize = value; break;
case "height": image.ySize = value; break;
}
}
}
@ -78,6 +79,7 @@ Overlay {
case "imageURL": image.source = value; break;
case "subImage": updateSubImage(value); break;
case "color": color.color = Qt.rgba(value.red / 255, value.green / 255, value.blue / 255, root.opacity); break;
case "bounds": break; // The bounds property is handled in C++.
default: console.log("OVERLAY Unhandled image property " + key);
}
}

@ -29,6 +29,7 @@ Overlay {
case "borderColor": rectangle.border.color = Qt.rgba(value.red / 255, value.green / 255, value.blue / 255, rectangle.border.color.a); break;
case "borderWidth": rectangle.border.width = value; break;
case "radius": rectangle.radius = value; break;
case "bounds": break; // The bounds property is handled in C++.
default: console.warn("OVERLAY Unhandled rectangle property " + key);
}
}

@ -46,6 +46,7 @@ Overlay {
case "backgroundColor": background.color = Qt.rgba(value.red / 255, value.green / 255, value.blue / 255, background.color.a); break;
case "font": textField.font.pixelSize = value.size; break;
case "lineHeight": textField.lineHeight = value; break;
case "bounds": break; // The bounds property is handled in C++.
default: console.warn("OVERLAY text unhandled property " + key);
}
}

@ -36,7 +36,9 @@ Rectangle {
readonly property bool hmdHead: headBox.checked
readonly property bool headPuck: headPuckBox.checked
readonly property bool handController: handBox.checked
readonly property bool handPuck: handPuckBox.checked
readonly property bool hmdDesktop: hmdInDesktop.checked
property int state: buttonState.disabled
property var lastConfiguration: null
@ -53,10 +55,6 @@ Rectangle {
}
MouseArea {
id: mouseArea
@ -99,6 +97,7 @@ Rectangle {
onClicked: {
if (checked) {
headPuckBox.checked = false;
hmdInDesktop.checked = false;
} else {
checked = true;
}
@ -121,6 +120,7 @@ Rectangle {
onClicked: {
if (checked) {
headBox.checked = false;
hmdInDesktop.checked = false;
} else {
checked = true;
}
@ -133,6 +133,36 @@ Rectangle {
text: "Tracker"
color: hifi.colors.lightGrayText
}
HifiControls.CheckBox {
id: hmdInDesktop
width: 15
height: 15
boxRadius: 7
visible: viveInDesktop.checked
anchors.top: viveInDesktop.bottom
anchors.topMargin: 5
anchors.left: openVrConfiguration.left
anchors.leftMargin: leftMargin + 10
onClicked: {
if (checked) {
headBox.checked = false;
headPuckBox.checked = false;
} else {
checked = true;
}
sendConfigurationSettings();
}
}
RalewayBold {
size: 12
visible: viveInDesktop.checked
text: "None"
color: hifi.colors.lightGrayText
}
}
Row {
@ -773,6 +803,11 @@ Rectangle {
anchors.leftMargin: leftMargin + 10
onClicked: {
if (!checked & hmdInDesktop.checked) {
headBox.checked = true;
headPuckBox.checked = false;
hmdInDesktop.checked = false;
}
sendConfigurationSettings();
}
}
@ -789,6 +824,7 @@ Rectangle {
verticalCenter: viveInDesktop.verticalCenter
}
}
NumberAnimation {
id: numberAnimation
@ -797,6 +833,7 @@ Rectangle {
to: 0
}
function logAction(action, status) {
console.log("calibrated from ui");
var data = {
@ -877,6 +914,7 @@ Rectangle {
var HmdHead = settings["HMDHead"];
var viveController = settings["handController"];
var desktopMode = settings["desktopMode"];
var hmdDesktopPosition = settings["hmdDesktopTracking"];
armCircumference.value = settings.armCircumference;
shoulderWidth.value = settings.shoulderWidth;
@ -898,6 +936,7 @@ Rectangle {
}
viveInDesktop.checked = desktopMode;
hmdInDesktop.checked = hmdDesktopPosition;
initializeButtonState();
updateCalibrationText();
@ -1058,7 +1097,8 @@ Rectangle {
"handConfiguration": handObject,
"armCircumference": armCircumference.value,
"shoulderWidth": shoulderWidth.value,
"desktopMode": viveInDesktop.checked
"desktopMode": viveInDesktop.checked,
"hmdDesktopTracking": hmdInDesktop.checked
}
return settingsObject;

@ -162,7 +162,7 @@ Item {
readonly property real controlLineHeight: 28 // Height of spinbox control on 1920 x 1080 monitor
readonly property real controlInterlineHeight: 21 // 75% of controlLineHeight
readonly property vector2d menuPadding: Qt.vector2d(14, 102)
readonly property real scrollbarBackgroundWidth: 18
readonly property real scrollbarBackgroundWidth: 20
readonly property real scrollbarHandleWidth: scrollbarBackgroundWidth - 2
readonly property real tabletMenuHeader: 90
}

@ -2812,10 +2812,10 @@ static int getEventQueueSize(QThread* thread) {
static void dumpEventQueue(QThread* thread) {
auto threadData = QThreadData::get2(thread);
QMutexLocker locker(&threadData->postEventList.mutex);
qDebug() << "AJT: event list, size =" << threadData->postEventList.size();
qDebug() << "Event list, size =" << threadData->postEventList.size();
for (auto& postEvent : threadData->postEventList) {
QEvent::Type type = (postEvent.event ? postEvent.event->type() : QEvent::None);
qDebug() << "AJT: " << type;
qDebug() << " " << type;
}
}
#endif // DEBUG_EVENT_QUEUE
@ -5974,7 +5974,7 @@ bool Application::acceptURL(const QString& urlString, bool defaultUpload) {
}
}
if (defaultUpload) {
if (defaultUpload && !url.fileName().isEmpty() && url.isLocalFile()) {
showAssetServerWidget(urlString);
}
return defaultUpload;
@ -7075,11 +7075,11 @@ QRect Application::getRecommendedHUDRect() const {
return result;
}
QSize Application::getDeviceSize() const {
glm::vec2 Application::getDeviceSize() const {
static const int MIN_SIZE = 1;
QSize result(MIN_SIZE, MIN_SIZE);
glm::vec2 result(MIN_SIZE);
if (_displayPlugin) {
result = fromGlm(getActiveDisplayPlugin()->getRecommendedRenderSize());
result = getActiveDisplayPlugin()->getRecommendedRenderSize();
}
return result;
}
@ -7098,10 +7098,6 @@ bool Application::hasFocus() const {
return (QApplication::activeWindow() != nullptr);
}
glm::vec2 Application::getViewportDimensions() const {
return toGlm(getDeviceSize());
}
void Application::setMaxOctreePacketsPerSecond(int maxOctreePPS) {
if (maxOctreePPS != _maxOctreePPS) {
_maxOctreePPS = maxOctreePPS;

@ -158,7 +158,7 @@ public:
glm::uvec2 getUiSize() const;
QRect getRecommendedHUDRect() const;
QSize getDeviceSize() const;
glm::vec2 getDeviceSize() const;
bool hasFocus() const;
void showCursor(const Cursor::Icon& cursor);
@ -228,8 +228,6 @@ public:
FileLogger* getLogger() const { return _logger; }
glm::vec2 getViewportDimensions() const;
NodeToJurisdictionMap& getEntityServerJurisdictions() { return _entityServerJurisdictions; }
float getRenderResolutionScale() const;

@ -104,8 +104,7 @@ void Application::paintGL() {
PerformanceTimer perfTimer("renderOverlay");
// NOTE: There is no batch associated with this renderArgs
// the ApplicationOverlay class assumes its viewport is set up to be the device size
QSize size = getDeviceSize();
renderArgs._viewport = glm::ivec4(0, 0, size.width(), size.height());
renderArgs._viewport = glm::ivec4(0, 0, getDeviceSize());
_applicationOverlay.renderOverlay(&renderArgs);
}

@ -679,36 +679,16 @@ Menu::Menu() {
});
auto audioIO = DependencyManager::get<AudioClient>();
addCheckableActionToQMenuAndActionHash(audioDebugMenu, MenuOption::EchoServerAudio, 0, false,
audioIO.data(), SLOT(toggleServerEcho()));
addCheckableActionToQMenuAndActionHash(audioDebugMenu, MenuOption::EchoLocalAudio, 0, false,
audioIO.data(), SLOT(toggleLocalEcho()));
addActionToQMenuAndActionHash(audioDebugMenu, MenuOption::MuteEnvironment, 0,
audioIO.data(), SLOT(sendMuteEnvironmentPacket()));
auto scope = DependencyManager::get<AudioScope>();
MenuWrapper* audioScopeMenu = audioDebugMenu->addMenu("Audio Scope");
addCheckableActionToQMenuAndActionHash(audioScopeMenu, MenuOption::AudioScope, Qt::CTRL | Qt::Key_F2, false,
scope.data(), SLOT(toggle()));
addCheckableActionToQMenuAndActionHash(audioScopeMenu, MenuOption::AudioScopePause, Qt::CTRL | Qt::SHIFT | Qt::Key_F2, false,
scope.data(), SLOT(togglePause()));
addDisabledActionAndSeparator(audioScopeMenu, "Display Frames");
{
QAction* fiveFrames = addCheckableActionToQMenuAndActionHash(audioScopeMenu, MenuOption::AudioScopeFiveFrames,
0, true, scope.data(), SLOT(selectAudioScopeFiveFrames()));
QAction* twentyFrames = addCheckableActionToQMenuAndActionHash(audioScopeMenu, MenuOption::AudioScopeTwentyFrames,
0, false, scope.data(), SLOT(selectAudioScopeTwentyFrames()));
QAction* fiftyFrames = addCheckableActionToQMenuAndActionHash(audioScopeMenu, MenuOption::AudioScopeFiftyFrames,
0, false, scope.data(), SLOT(selectAudioScopeFiftyFrames()));
QActionGroup* audioScopeFramesGroup = new QActionGroup(audioScopeMenu);
audioScopeFramesGroup->addAction(fiveFrames);
audioScopeFramesGroup->addAction(twentyFrames);
audioScopeFramesGroup->addAction(fiftyFrames);
}
action = addActionToQMenuAndActionHash(audioDebugMenu, MenuOption::AudioScope);
connect(action, &QAction::triggered, [] {
auto scriptEngines = DependencyManager::get<ScriptEngines>();
QUrl defaultScriptsLoc = PathUtils::defaultScriptsLocation();
defaultScriptsLoc.setPath(defaultScriptsLoc.path() + "developer/utilities/audio/audioScope.js");
scriptEngines->loadScript(defaultScriptsLoc.toString());
});
// Developer > Physics >>>
MenuWrapper* physicsOptionsMenu = developerMenu->addMenu("Physics");

@ -9,6 +9,7 @@
// See the accompanying file LICENSE or http://www.apache.org/licenses/LICENSE-2.0.html
//
#include <qvector2d.h>
#include <limits>
#include <AudioClient.h>
@ -21,13 +22,14 @@
#include "AudioScope.h"
static const unsigned int DEFAULT_FRAMES_PER_SCOPE = 5;
static const unsigned int SCOPE_WIDTH = AudioConstants::NETWORK_FRAME_SAMPLES_PER_CHANNEL * DEFAULT_FRAMES_PER_SCOPE;
static const unsigned int MULTIPLIER_SCOPE_HEIGHT = 20;
static const unsigned int SCOPE_HEIGHT = 2 * 15 * MULTIPLIER_SCOPE_HEIGHT;
AudioScope::AudioScope() :
_isEnabled(false),
_isPaused(false),
_isTriggered(false),
_autoTrigger(false),
_scopeInputOffset(0),
_scopeOutputOffset(0),
_framesPerScope(DEFAULT_FRAMES_PER_SCOPE),
@ -43,6 +45,7 @@ AudioScope::AudioScope() :
_outputRightD(DependencyManager::get<GeometryCache>()->allocateID())
{
auto audioIO = DependencyManager::get<AudioClient>();
connect(&audioIO->getReceivedAudioStream(), &MixedProcessedAudioStream::addedSilence,
this, &AudioScope::addStereoSilenceToScope);
connect(&audioIO->getReceivedAudioStream(), &MixedProcessedAudioStream::addedLastFrameRepeatedWithFade,
@ -75,6 +78,18 @@ void AudioScope::selectAudioScopeFiftyFrames() {
reallocateScope(50);
}
void AudioScope::setLocalEcho(bool localEcho) {
DependencyManager::get<AudioClient>()->setLocalEcho(localEcho);
}
void AudioScope::setServerEcho(bool serverEcho) {
DependencyManager::get<AudioClient>()->setServerEcho(serverEcho);
}
float AudioScope::getFramesPerSecond(){
return AudioConstants::NETWORK_FRAMES_PER_SEC;
}
void AudioScope::allocateScope() {
_scopeInputOffset = 0;
_scopeOutputOffset = 0;
@ -108,63 +123,14 @@ void AudioScope::freeScope() {
}
}
void AudioScope::render(RenderArgs* renderArgs, int width, int height) {
if (!_isEnabled) {
return;
}
static const glm::vec4 backgroundColor = { 0.4f, 0.4f, 0.4f, 0.6f };
static const glm::vec4 gridColor = { 0.7f, 0.7f, 0.7f, 1.0f };
static const glm::vec4 inputColor = { 0.3f, 1.0f, 0.3f, 1.0f };
static const glm::vec4 outputLeftColor = { 1.0f, 0.3f, 0.3f, 1.0f };
static const glm::vec4 outputRightColor = { 0.3f, 0.3f, 1.0f, 1.0f };
static const int gridCols = 2;
int gridRows = _framesPerScope;
int x = (width - (int)SCOPE_WIDTH) / 2;
int y = (height - (int)SCOPE_HEIGHT) / 2;
int w = (int)SCOPE_WIDTH;
int h = (int)SCOPE_HEIGHT;
gpu::Batch& batch = *renderArgs->_batch;
auto geometryCache = DependencyManager::get<GeometryCache>();
// Grid uses its own pipeline, so draw it before setting another
const float GRID_EDGE = 0.005f;
geometryCache->renderGrid(batch, glm::vec2(x, y), glm::vec2(x + w, y + h),
gridRows, gridCols, GRID_EDGE, gridColor, true, _audioScopeGrid);
geometryCache->useSimpleDrawPipeline(batch);
auto textureCache = DependencyManager::get<TextureCache>();
batch.setResourceTexture(0, textureCache->getWhiteTexture());
// FIXME - do we really need to reset this here? we know that we're called inside of ApplicationOverlay::renderOverlays
// which already set up our batch for us to have these settings
mat4 legacyProjection = glm::ortho<float>(0, width, height, 0, -1000, 1000);
batch.setProjectionTransform(legacyProjection);
batch.setModelTransform(Transform());
batch.resetViewTransform();
geometryCache->renderQuad(batch, x, y, w, h, backgroundColor, _audioScopeBackground);
renderLineStrip(batch, _inputID, inputColor, x, y, _samplesPerScope, _scopeInputOffset, _scopeInput);
renderLineStrip(batch, _outputLeftID, outputLeftColor, x, y, _samplesPerScope, _scopeOutputOffset, _scopeOutputLeft);
renderLineStrip(batch, _outputRightD, outputRightColor, x, y, _samplesPerScope, _scopeOutputOffset, _scopeOutputRight);
}
void AudioScope::renderLineStrip(gpu::Batch& batch, int id, const glm::vec4& color, int x, int y, int n, int offset, const QByteArray* byteArray) {
QVector<int> AudioScope::getScopeVector(const QByteArray* byteArray, int offset) {
int16_t sample;
int16_t* samples = ((int16_t*) byteArray->data()) + offset;
QVector<int> points;
if (!_isEnabled || byteArray == NULL) return points;
int16_t* samples = ((int16_t*)byteArray->data()) + offset;
int numSamplesToAverage = _framesPerScope / DEFAULT_FRAMES_PER_SCOPE;
int count = (n - offset) / numSamplesToAverage;
int remainder = (n - offset) % numSamplesToAverage;
y += SCOPE_HEIGHT / 2;
auto geometryCache = DependencyManager::get<GeometryCache>();
QVector<glm::vec2> points;
int count = (_samplesPerScope - offset) / numSamplesToAverage;
int remainder = (_samplesPerScope - offset) % numSamplesToAverage;
// Compute and draw the sample averages from the offset position
for (int i = count; --i >= 0; ) {
@ -173,7 +139,7 @@ void AudioScope::renderLineStrip(gpu::Batch& batch, int id, const glm::vec4& col
sample += *samples++;
}
sample /= numSamplesToAverage;
points << glm::vec2(x++, y - sample);
points << -sample;
}
// Compute and draw the sample average across the wrap boundary
@ -182,16 +148,17 @@ void AudioScope::renderLineStrip(gpu::Batch& batch, int id, const glm::vec4& col
for (int j = remainder; --j >= 0; ) {
sample += *samples++;
}
samples = (int16_t*) byteArray->data();
samples = (int16_t*)byteArray->data();
for (int j = numSamplesToAverage - remainder; --j >= 0; ) {
sample += *samples++;
}
sample /= numSamplesToAverage;
points << glm::vec2(x++, y - sample);
} else {
samples = (int16_t*) byteArray->data();
points << -sample;
}
else {
samples = (int16_t*)byteArray->data();
}
// Compute and draw the sample average from the beginning to the offset
@ -202,12 +169,51 @@ void AudioScope::renderLineStrip(gpu::Batch& batch, int id, const glm::vec4& col
sample += *samples++;
}
sample /= numSamplesToAverage;
points << glm::vec2(x++, y - sample);
points << -sample;
}
return points;
}
bool AudioScope::shouldTrigger(const QVector<int>& scope) {
int threshold = 4;
if (_autoTrigger && _triggerValues.x < scope.size()) {
for (int i = -4*threshold; i < +4*threshold; i++) {
int idx = _triggerValues.x + i;
idx = (idx < 0) ? 0 : (idx < scope.size() ? idx : scope.size() - 1);
int dif = abs(_triggerValues.y - scope[idx]);
if (dif < threshold) {
return true;
}
}
}
return false;
}
void AudioScope::storeTriggerValues() {
_triggerInputData = _scopeInputData;
_triggerOutputLeftData = _scopeOutputLeftData;
_triggerOutputRightData = _scopeOutputRightData;
_isTriggered = true;
emit triggered();
}
void AudioScope::computeInputData() {
_scopeInputData = getScopeVector(_scopeInput, _scopeInputOffset);
if (shouldTrigger(_scopeInputData)) {
storeTriggerValues();
}
}
void AudioScope::computeOutputData() {
_scopeOutputLeftData = getScopeVector(_scopeOutputLeft, _scopeOutputOffset);
if (shouldTrigger(_scopeOutputLeftData)) {
storeTriggerValues();
}
_scopeOutputRightData = getScopeVector(_scopeOutputRight, _scopeOutputOffset);
if (shouldTrigger(_scopeOutputRightData)) {
storeTriggerValues();
}
geometryCache->updateVertices(id, points, color);
geometryCache->renderVertices(batch, gpu::LINE_STRIP, id);
}
int AudioScope::addBufferToScope(QByteArray* byteArray, int frameOffset, const int16_t* source, int sourceSamplesPerChannel,
@ -231,7 +237,7 @@ int AudioScope::addBufferToScope(QByteArray* byteArray, int frameOffset, const i
}
int AudioScope::addSilenceToScope(QByteArray* byteArray, int frameOffset, int silentSamples) {
// Short int pointer to mapped samples in byte array
int16_t* destination = (int16_t*)byteArray->data();
@ -271,6 +277,7 @@ void AudioScope::addStereoSamplesToScope(const QByteArray& samples) {
_scopeOutputOffset = addBufferToScope(_scopeOutputRight, _scopeOutputOffset, samplesData, samplesPerChannel, 1, AudioConstants::STEREO);
_scopeLastFrame = samples.right(AudioConstants::NETWORK_FRAME_BYTES_STEREO);
computeOutputData();
}
void AudioScope::addLastFrameRepeatedWithFadeToScope(int samplesPerChannel) {
@ -302,4 +309,5 @@ void AudioScope::addInputToScope(const QByteArray& inputSamples) {
_scopeInputOffset = addBufferToScope(_scopeInput, _scopeInputOffset,
reinterpret_cast<const int16_t*>(inputSamples.data()),
inputSamples.size() / sizeof(int16_t), INPUT_AUDIO_CHANNEL, NUM_INPUT_CHANNELS);
computeInputData();
}

@ -24,27 +24,60 @@
class AudioScope : public QObject, public Dependency {
Q_OBJECT
SINGLETON_DEPENDENCY
Q_PROPERTY(QVector<int> scopeInput READ getScopeInput)
Q_PROPERTY(QVector<int> scopeOutputLeft READ getScopeOutputLeft)
Q_PROPERTY(QVector<int> scopeOutputRight READ getScopeOutputRight)
Q_PROPERTY(QVector<int> triggerInput READ getTriggerInput)
Q_PROPERTY(QVector<int> triggerOutputLeft READ getTriggerOutputLeft)
Q_PROPERTY(QVector<int> triggerOutputRight READ getTriggerOutputRight)
public:
// Audio scope methods for allocation/deallocation
void allocateScope();
void freeScope();
void reallocateScope(int frames);
void render(RenderArgs* renderArgs, int width, int height);
public slots:
void toggle() { setVisible(!_isEnabled); }
void setVisible(bool visible);
bool getVisible() const { return _isEnabled; }
void togglePause() { _isPaused = !_isPaused; }
void setPause(bool paused) { _isPaused = paused; }
void togglePause() { setPause(!_isPaused); }
void setPause(bool paused) { _isPaused = paused; emit pauseChanged(); }
bool getPause() { return _isPaused; }
void toggleTrigger() { _autoTrigger = !_autoTrigger; }
bool getAutoTrigger() { return _autoTrigger; }
void setAutoTrigger(bool autoTrigger) { _isTriggered = false; _autoTrigger = autoTrigger; }
void setTriggerValues(int x, int y) { _triggerValues.x = x; _triggerValues.y = y; }
void setTriggered(bool triggered) { _isTriggered = triggered; }
bool getTriggered() { return _isTriggered; }
float getFramesPerSecond();
int getFramesPerScope() { return _framesPerScope; }
void selectAudioScopeFiveFrames();
void selectAudioScopeTwentyFrames();
void selectAudioScopeFiftyFrames();
QVector<int> getScopeInput() { return _scopeInputData; };
QVector<int> getScopeOutputLeft() { return _scopeOutputLeftData; };
QVector<int> getScopeOutputRight() { return _scopeOutputRightData; };
QVector<int> getTriggerInput() { return _triggerInputData; };
QVector<int> getTriggerOutputLeft() { return _triggerOutputLeftData; };
QVector<int> getTriggerOutputRight() { return _triggerOutputRightData; };
void setLocalEcho(bool serverEcho);
void setServerEcho(bool serverEcho);
signals:
void pauseChanged();
void triggered();
protected:
AudioScope();
@ -55,24 +88,44 @@ private slots:
void addInputToScope(const QByteArray& inputSamples);
private:
// Audio scope methods for rendering
void renderLineStrip(gpu::Batch& batch, int id, const glm::vec4& color, int x, int y, int n, int offset, const QByteArray* byteArray);
// Audio scope methods for data acquisition
int addBufferToScope(QByteArray* byteArray, int frameOffset, const int16_t* source, int sourceSamples,
unsigned int sourceChannel, unsigned int sourceNumberOfChannels, float fade = 1.0f);
int addSilenceToScope(QByteArray* byteArray, int frameOffset, int silentSamples);
QVector<int> getScopeVector(const QByteArray* scope, int offset);
bool shouldTrigger(const QVector<int>& scope);
void computeInputData();
void computeOutputData();
void storeTriggerValues();
bool _isEnabled;
bool _isPaused;
bool _isTriggered;
bool _autoTrigger;
int _scopeInputOffset;
int _scopeOutputOffset;
int _framesPerScope;
int _samplesPerScope;
QByteArray* _scopeInput;
QByteArray* _scopeOutputLeft;
QByteArray* _scopeOutputRight;
QByteArray _scopeLastFrame;
QVector<int> _scopeInputData;
QVector<int> _scopeOutputLeftData;
QVector<int> _scopeOutputRightData;
QVector<int> _triggerInputData;
QVector<int> _triggerOutputLeftData;
QVector<int> _triggerOutputRightData;
glm::ivec2 _triggerValues;
int _audioScopeBackground;
int _audioScopeGrid;

@ -28,6 +28,7 @@
#include <shared/QtHelpers.h>
#include <AvatarData.h>
#include <PerfStat.h>
#include <PrioritySortUtil.h>
#include <RegisteredMetaTypes.h>
#include <Rig.h>
#include <SettingHandle.h>
@ -142,32 +143,39 @@ void AvatarManager::updateOtherAvatars(float deltaTime) {
PerformanceTimer perfTimer("otherAvatars");
auto avatarMap = getHashCopy();
QList<AvatarSharedPointer> avatarList = avatarMap.values();
class SortableAvatar: public PrioritySortUtil::Sortable {
public:
SortableAvatar() = delete;
SortableAvatar(const AvatarSharedPointer& avatar) : _avatar(avatar) {}
glm::vec3 getPosition() const override { return _avatar->getWorldPosition(); }
float getRadius() const override { return std::static_pointer_cast<Avatar>(_avatar)->getBoundingRadius(); }
uint64_t getTimestamp() const override { return std::static_pointer_cast<Avatar>(_avatar)->getLastRenderUpdateTime(); }
const AvatarSharedPointer& getAvatar() const { return _avatar; }
private:
AvatarSharedPointer _avatar;
};
ViewFrustum cameraView;
qApp->copyDisplayViewFrustum(cameraView);
PrioritySortUtil::PriorityQueue<SortableAvatar> sortedAvatars(cameraView,
AvatarData::_avatarSortCoefficientSize,
AvatarData::_avatarSortCoefficientCenter,
AvatarData::_avatarSortCoefficientAge);
std::priority_queue<AvatarPriority> sortedAvatars;
AvatarData::sortAvatars(avatarList, cameraView, sortedAvatars,
[](AvatarSharedPointer avatar)->uint64_t{
return std::static_pointer_cast<Avatar>(avatar)->getLastRenderUpdateTime();
},
[](AvatarSharedPointer avatar)->float{
return std::static_pointer_cast<Avatar>(avatar)->getBoundingRadius();
},
[this](AvatarSharedPointer avatar)->bool{
const auto& castedAvatar = std::static_pointer_cast<Avatar>(avatar);
if (castedAvatar == _myAvatar || !castedAvatar->isInitialized()) {
// DO NOT update _myAvatar! Its update has already been done earlier in the main loop.
// DO NOT update or fade out uninitialized Avatars
return true; // ignore it
}
return false;
});
// sort
auto avatarMap = getHashCopy();
AvatarHash::iterator itr = avatarMap.begin();
while (itr != avatarMap.end()) {
const auto& avatar = std::static_pointer_cast<Avatar>(*itr);
// DO NOT update _myAvatar! Its update has already been done earlier in the main loop.
// DO NOT update or fade out uninitialized Avatars
if (avatar != _myAvatar && avatar->isInitialized()) {
sortedAvatars.push(SortableAvatar(avatar));
}
++itr;
}
// process in sorted order
uint64_t startTime = usecTimestampNow();
const uint64_t UPDATE_BUDGET = 2000; // usec
uint64_t updateExpiry = startTime + UPDATE_BUDGET;
@ -176,8 +184,8 @@ void AvatarManager::updateOtherAvatars(float deltaTime) {
render::Transaction transaction;
while (!sortedAvatars.empty()) {
const AvatarPriority& sortData = sortedAvatars.top();
const auto& avatar = std::static_pointer_cast<Avatar>(sortData.avatar);
const SortableAvatar& sortData = sortedAvatars.top();
const auto& avatar = std::static_pointer_cast<Avatar>(sortData.getAvatar());
bool ignoring = DependencyManager::get<NodeList>()->isPersonalMutingNode(avatar->getID());
if (ignoring) {
@ -207,7 +215,7 @@ void AvatarManager::updateOtherAvatars(float deltaTime) {
uint64_t now = usecTimestampNow();
if (now < updateExpiry) {
// we're within budget
bool inView = sortData.priority > OUT_OF_VIEW_THRESHOLD;
bool inView = sortData.getPriority() > OUT_OF_VIEW_THRESHOLD;
if (inView && avatar->hasNewJointData()) {
numAvatarsUpdated++;
}
@ -221,7 +229,7 @@ void AvatarManager::updateOtherAvatars(float deltaTime) {
// --> some avatar velocity measurements may be a little off
// no time to simulate, but we take the time to count how many were tragically missed
bool inView = sortData.priority > OUT_OF_VIEW_THRESHOLD;
bool inView = sortData.getPriority() > OUT_OF_VIEW_THRESHOLD;
if (!inView) {
break;
}
@ -230,9 +238,9 @@ void AvatarManager::updateOtherAvatars(float deltaTime) {
}
sortedAvatars.pop();
while (inView && !sortedAvatars.empty()) {
const AvatarPriority& newSortData = sortedAvatars.top();
const auto& newAvatar = std::static_pointer_cast<Avatar>(newSortData.avatar);
inView = newSortData.priority > OUT_OF_VIEW_THRESHOLD;
const SortableAvatar& newSortData = sortedAvatars.top();
const auto& newAvatar = std::static_pointer_cast<Avatar>(newSortData.getAvatar());
inView = newSortData.getPriority() > OUT_OF_VIEW_THRESHOLD;
if (inView && newAvatar->hasNewJointData()) {
numAVatarsNotUpdated++;
}

@ -114,7 +114,8 @@ MyAvatar::MyAvatar(QThread* thread) :
_skeletonModel = std::make_shared<MySkeletonModel>(this, nullptr);
connect(_skeletonModel.get(), &Model::setURLFinished, this, &Avatar::setModelURLFinished);
connect(_skeletonModel.get(), &Model::rigReady, this, &Avatar::rigReady);
connect(_skeletonModel.get(), &Model::rigReset, this, &Avatar::rigReset);
using namespace recording;
_skeletonModel->flagAsCauterized();
@ -1516,9 +1517,19 @@ void MyAvatar::updateMotors() {
_characterController.clearMotors();
glm::quat motorRotation;
if (_motionBehaviors & AVATAR_MOTION_ACTION_MOTOR_ENABLED) {
const float FLYING_MOTOR_TIMESCALE = 0.05f;
const float WALKING_MOTOR_TIMESCALE = 0.2f;
const float INVALID_MOTOR_TIMESCALE = 1.0e6f;
float horizontalMotorTimescale;
float verticalMotorTimescale;
if (_characterController.getState() == CharacterController::State::Hover ||
_characterController.computeCollisionGroup() == BULLET_COLLISION_GROUP_COLLISIONLESS) {
motorRotation = getMyHead()->getHeadOrientation();
horizontalMotorTimescale = FLYING_MOTOR_TIMESCALE;
verticalMotorTimescale = FLYING_MOTOR_TIMESCALE;
} else {
// non-hovering = walking: follow camera twist about vertical but not lift
// we decompose camera's rotation and store the twist part in motorRotation
@ -1529,11 +1540,12 @@ void MyAvatar::updateMotors() {
glm::quat liftRotation;
swingTwistDecomposition(headOrientation, Vectors::UNIT_Y, liftRotation, motorRotation);
motorRotation = orientation * motorRotation;
horizontalMotorTimescale = WALKING_MOTOR_TIMESCALE;
verticalMotorTimescale = INVALID_MOTOR_TIMESCALE;
}
const float DEFAULT_MOTOR_TIMESCALE = 0.2f;
const float INVALID_MOTOR_TIMESCALE = 1.0e6f;
if (_isPushing || _isBraking || !_isBeingPushed) {
_characterController.addMotor(_actionMotorVelocity, motorRotation, DEFAULT_MOTOR_TIMESCALE, INVALID_MOTOR_TIMESCALE);
_characterController.addMotor(_actionMotorVelocity, motorRotation, horizontalMotorTimescale, verticalMotorTimescale);
} else {
// _isBeingPushed must be true --> disable action motor by giving it a long timescale,
// otherwise its attempt to "stand in place" could defeat scripted motor/thrusts
@ -1799,6 +1811,7 @@ void MyAvatar::postUpdate(float deltaTime, const render::ScenePointer& scene) {
_skeletonModel->setCauterizeBoneSet(_headBoneSet);
_fstAnimGraphOverrideUrl = _skeletonModel->getGeometry()->getAnimGraphOverrideUrl();
initAnimGraph();
_isAnimatingScale = true;
}
if (_enableDebugDrawDefaultPose || _enableDebugDrawAnimPose) {
@ -1956,27 +1969,33 @@ void MyAvatar::updateOrientation(float deltaTime) {
// Use head/HMD roll to turn while flying, but not when standing still.
if (qApp->isHMDMode() && getCharacterController()->getState() == CharacterController::State::Hover && _hmdRollControlEnabled && hasDriveInput()) {
// Turn with head roll.
const float MIN_CONTROL_SPEED = 0.01f;
float speed = glm::length(getWorldVelocity());
if (speed >= MIN_CONTROL_SPEED) {
// Feather turn when stopping moving.
float speedFactor;
if (getDriveKey(TRANSLATE_Z) != 0.0f || _lastDrivenSpeed == 0.0f) {
_lastDrivenSpeed = speed;
speedFactor = 1.0f;
} else {
speedFactor = glm::min(speed / _lastDrivenSpeed, 1.0f);
}
const float MIN_CONTROL_SPEED = 2.0f * getSensorToWorldScale(); // meters / sec
const glm::vec3 characterForward = getWorldOrientation() * Vectors::UNIT_NEG_Z;
float forwardSpeed = glm::dot(characterForward, getWorldVelocity());
float direction = glm::dot(getWorldVelocity(), getWorldOrientation() * Vectors::UNIT_NEG_Z) > 0.0f ? 1.0f : -1.0f;
// only enable roll-turns if we are moving forward or backward at greater then MIN_CONTROL_SPEED
if (fabsf(forwardSpeed) >= MIN_CONTROL_SPEED) {
float direction = forwardSpeed > 0.0f ? 1.0f : -1.0f;
float rollAngle = glm::degrees(asinf(glm::dot(IDENTITY_UP, _hmdSensorOrientation * IDENTITY_RIGHT)));
float rollSign = rollAngle < 0.0f ? -1.0f : 1.0f;
rollAngle = fabsf(rollAngle);
rollAngle = rollAngle > _hmdRollControlDeadZone ? rollSign * (rollAngle - _hmdRollControlDeadZone) : 0.0f;
totalBodyYaw += speedFactor * direction * rollAngle * deltaTime * _hmdRollControlRate;
const float MIN_ROLL_ANGLE = _hmdRollControlDeadZone;
const float MAX_ROLL_ANGLE = 90.0f; // degrees
if (rollAngle > MIN_ROLL_ANGLE) {
// rate of turning is linearly proportional to rollAngle
rollAngle = glm::clamp(rollAngle, MIN_ROLL_ANGLE, MAX_ROLL_ANGLE);
// scale rollAngle into a value from zero to one.
float rollFactor = (rollAngle - MIN_ROLL_ANGLE) / (MAX_ROLL_ANGLE - MIN_ROLL_ANGLE);
float angularSpeed = rollSign * rollFactor * _hmdRollControlRate;
totalBodyYaw += direction * angularSpeed * deltaTime;
}
}
}
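The flying roll-turn block above maps HMD roll to a yaw rate: roll inside _hmdRollControlDeadZone is ignored, and between the dead zone and 90 degrees the rate scales linearly up to _hmdRollControlRate (degrees per second), signed by the roll direction and by whether the avatar is moving forward or backward. A minimal sketch of that mapping, using the constants shown above and in the MyAvatar.h hunk further down:

    // Yaw increment contributed by HMD roll for one frame (degrees, seconds).
    function rollTurnYawDelta(rollAngle, movingForward, deadZone, maxRate, dt) {
        var rollSign = rollAngle < 0 ? -1 : 1;
        var magnitude = Math.abs(rollAngle);
        if (magnitude <= deadZone) {
            return 0;                                            // inside the dead zone: no turn
        }
        var clamped = Math.min(magnitude, 90);
        var rollFactor = (clamped - deadZone) / (90 - deadZone); // 0..1 across the active range
        var direction = movingForward ? 1 : -1;
        return direction * rollSign * rollFactor * maxRate * dt;
    }
    // With the defaults from MyAvatar.h (8 degree dead zone, 114 deg/sec):
    // a 49 degree roll while flying forward turns at (49 - 8) / (90 - 8) * 114 = 57 deg/sec.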
@ -2022,12 +2041,13 @@ void MyAvatar::updateActionMotor(float deltaTime) {
_isBraking = _wasPushing || (_isBraking && speed > MIN_ACTION_BRAKE_SPEED);
}
CharacterController::State state = _characterController.getState();
// compute action input
glm::vec3 forward = (getDriveKey(TRANSLATE_Z)) * IDENTITY_FORWARD;
glm::vec3 right = (getDriveKey(TRANSLATE_X)) * IDENTITY_RIGHT;
glm::vec3 direction = forward + right;
CharacterController::State state = _characterController.getState();
if (state == CharacterController::State::Hover ||
_characterController.computeCollisionGroup() == BULLET_COLLISION_GROUP_COLLISIONLESS) {
// we can fly --> support vertical motion
@ -2161,41 +2181,6 @@ bool findAvatarAvatarPenetration(const glm::vec3 positionA, float radiusA, float
// target scale to match the new scale they have chosen. When they leave the domain they will not return to the scale they were
// before they entered the limiting domain.
void MyAvatar::clampTargetScaleToDomainLimits() {
// when we're about to change the target scale because the user has asked to increase or decrease their scale,
// we first make sure that we're starting from a target scale that is allowed by the current domain
auto clampedTargetScale = glm::clamp(_targetScale, _domainMinimumScale, _domainMaximumScale);
if (clampedTargetScale != _targetScale) {
qCDebug(interfaceapp, "Clamped scale to %f since original target scale %f was not allowed by domain",
(double)clampedTargetScale, (double)_targetScale);
setTargetScale(clampedTargetScale);
}
}
void MyAvatar::clampScaleChangeToDomainLimits(float desiredScale) {
auto clampedTargetScale = glm::clamp(desiredScale, _domainMinimumScale, _domainMaximumScale);
if (clampedTargetScale != desiredScale) {
qCDebug(interfaceapp, "Forcing scale to %f since %f is not allowed by domain",
clampedTargetScale, desiredScale);
}
setTargetScale(clampedTargetScale);
qCDebug(interfaceapp, "Changed scale to %f", (double)_targetScale);
emit(scaleChanged());
}
float MyAvatar::getDomainMinScale() {
return _domainMinimumScale;
}
float MyAvatar::getDomainMaxScale() {
return _domainMaximumScale;
}
void MyAvatar::setGravity(float gravity) {
_characterController.setGravity(gravity);
}
@ -2205,70 +2190,58 @@ float MyAvatar::getGravity() {
}
void MyAvatar::increaseSize() {
// make sure we're starting from an allowable scale
clampTargetScaleToDomainLimits();
float minScale = getDomainMinScale();
float maxScale = getDomainMaxScale();
// calculate what our new scale should be
float updatedTargetScale = _targetScale * (1.0f + SCALING_RATIO);
float clampedTargetScale = glm::clamp(_targetScale, minScale, maxScale);
float newTargetScale = glm::clamp(clampedTargetScale * (1.0f + SCALING_RATIO), minScale, maxScale);
// attempt to change to desired scale (clamped to the domain limits)
clampScaleChangeToDomainLimits(updatedTargetScale);
setTargetScale(newTargetScale);
}
void MyAvatar::decreaseSize() {
// make sure we're starting from an allowable scale
clampTargetScaleToDomainLimits();
float minScale = getDomainMinScale();
float maxScale = getDomainMaxScale();
// calculate what our new scale should be
float updatedTargetScale = _targetScale * (1.0f - SCALING_RATIO);
float clampedTargetScale = glm::clamp(_targetScale, minScale, maxScale);
float newTargetScale = glm::clamp(clampedTargetScale * (1.0f - SCALING_RATIO), minScale, maxScale);
// attempt to change to desired scale (clamped to the domain limits)
clampScaleChangeToDomainLimits(updatedTargetScale);
setTargetScale(newTargetScale);
}
void MyAvatar::resetSize() {
// attempt to reset avatar size to the default (clamped to domain limits)
const float DEFAULT_AVATAR_SCALE = 1.0f;
clampScaleChangeToDomainLimits(DEFAULT_AVATAR_SCALE);
setTargetScale(DEFAULT_AVATAR_SCALE);
}
void MyAvatar::restrictScaleFromDomainSettings(const QJsonObject& domainSettingsObject) {
// pull out the minimum and maximum scale and set them to restrict our scale
// pull out the minimum and maximum height and set them to restrict our scale
static const QString AVATAR_SETTINGS_KEY = "avatars";
auto avatarsObject = domainSettingsObject[AVATAR_SETTINGS_KEY].toObject();
static const QString MIN_SCALE_OPTION = "min_avatar_scale";
float settingMinScale = avatarsObject[MIN_SCALE_OPTION].toDouble(MIN_AVATAR_SCALE);
setDomainMinimumScale(settingMinScale);
static const QString MIN_HEIGHT_OPTION = "min_avatar_height";
float settingMinHeight = avatarsObject[MIN_HEIGHT_OPTION].toDouble(MIN_AVATAR_HEIGHT);
setDomainMinimumHeight(settingMinHeight);
static const QString MAX_SCALE_OPTION = "max_avatar_scale";
float settingMaxScale = avatarsObject[MAX_SCALE_OPTION].toDouble(MAX_AVATAR_SCALE);
setDomainMaximumScale(settingMaxScale);
static const QString MAX_HEIGHT_OPTION = "max_avatar_height";
float settingMaxHeight = avatarsObject[MAX_HEIGHT_OPTION].toDouble(MAX_AVATAR_HEIGHT);
setDomainMaximumHeight(settingMaxHeight);
// make sure that the domain owner didn't flip min and max
if (_domainMinimumScale > _domainMaximumScale) {
std::swap(_domainMinimumScale, _domainMaximumScale);
if (_domainMinimumHeight > _domainMaximumHeight) {
std::swap(_domainMinimumHeight, _domainMaximumHeight);
}
// Set avatar current scale
Settings settings;
settings.beginGroup("Avatar");
_targetScale = loadSetting(settings, "scale", 1.0f);
qCDebug(interfaceapp) << "This domain requires a minimum avatar scale of " << _domainMinimumScale
<< " and a maximum avatar scale of " << _domainMaximumScale
<< ". Current avatar scale is " << _targetScale;
qCDebug(interfaceapp) << "This domain requires a minimum avatar scale of " << _domainMinimumHeight
<< " and a maximum avatar scale of " << _domainMaximumHeight;
// debug to log if this avatar's scale in this domain will be clamped
float clampedScale = glm::clamp(_targetScale, _domainMinimumScale, _domainMaximumScale);
if (_targetScale != clampedScale) {
qCDebug(interfaceapp) << "Current avatar scale is clamped to " << clampedScale
<< " because " << _targetScale << " is not allowed by current domain";
// The current scale of the avatar should not be more than the domain's max_avatar_scale and not less than the domain's min_avatar_scale.
_targetScale = clampedScale;
}
_isAnimatingScale = true;
setModelScale(_targetScale);
rebuildCollisionShape();
@ -2288,8 +2261,8 @@ void MyAvatar::saveAvatarScale() {
}
void MyAvatar::clearScaleRestriction() {
_domainMinimumScale = MIN_AVATAR_SCALE;
_domainMaximumScale = MAX_AVATAR_SCALE;
_domainMinimumHeight = MIN_AVATAR_HEIGHT;
_domainMaximumHeight = MAX_AVATAR_HEIGHT;
}
void MyAvatar::goToLocation(const QVariant& propertiesVar) {
@ -3248,6 +3221,7 @@ void MyAvatar::setModelScale(float scale) {
if (changed) {
float sensorToWorldScale = getEyeHeight() / getUserEyeHeight();
emit sensorToWorldScaleChanged(sensorToWorldScale);
emit scaleChanged();
}
}

@ -110,6 +110,10 @@ class MyAvatar : public Avatar {
* @property userEyeHeight {number} Estimated height of the user's eyes in sensor space. (meters)
* @property SELF_ID {string} READ-ONLY. UUID representing "my avatar". Only use for local-only entities and overlays in situations where MyAvatar.sessionUUID is not available (e.g., if not connected to a domain).
* Note: Likely to be deprecated.
* @property hmdRollControlEnabled {bool} When enabled the roll angle of your HMD will turn your avatar while flying.
* @property hmdRollControlDeadZone {number} If hmdRollControlEnabled is true, this value can be used to tune what roll angle is required to begin turning.
* This angle is specified in degrees.
* @property hmdRollControlRate {number} If hmdRollControlEnabled is true, this value determines the maximum turn rate of your avatar when rolling your HMD in degrees per second.
*/
// FIXME: `glm::vec3 position` is not accessible from QML, so this exposes position in a QML-native type
@ -158,7 +162,7 @@ class MyAvatar : public Avatar {
Q_PROPERTY(float userEyeHeight READ getUserEyeHeight)
Q_PROPERTY(QUuid SELF_ID READ getSelfID CONSTANT)
const QString DOMINANT_LEFT_HAND = "left";
const QString DOMINANT_RIGHT_HAND = "right";
@ -558,8 +562,6 @@ public slots:
void increaseSize();
void decreaseSize();
void resetSize();
float getDomainMinScale();
float getDomainMaxScale();
void setGravity(float gravity);
float getGravity();
@ -737,12 +739,12 @@ private:
bool _clearOverlayWhenMoving { true };
QString _dominantHand { DOMINANT_RIGHT_HAND };
const float ROLL_CONTROL_DEAD_ZONE_DEFAULT = 8.0f; // deg
const float ROLL_CONTROL_RATE_DEFAULT = 2.5f; // deg/sec/deg
const float ROLL_CONTROL_DEAD_ZONE_DEFAULT = 8.0f; // degrees
const float ROLL_CONTROL_RATE_DEFAULT = 114.0f; // degrees / sec
bool _hmdRollControlEnabled { true };
float _hmdRollControlDeadZone { ROLL_CONTROL_DEAD_ZONE_DEFAULT };
float _hmdRollControlRate { ROLL_CONTROL_RATE_DEFAULT };
float _lastDrivenSpeed { 0.0f };
// working copy -- see AvatarData for thread-safe _sensorToWorldMatrixCache, used for outward facing access
glm::mat4 _sensorToWorldMatrix { glm::mat4() };

@ -83,19 +83,28 @@ void QmlCommerce::buy(const QString& assetId, int cost, const bool controlledFai
void QmlCommerce::balance() {
auto ledger = DependencyManager::get<Ledger>();
auto wallet = DependencyManager::get<Wallet>();
ledger->balance(wallet->listPublicKeys());
QStringList cachedPublicKeys = wallet->listPublicKeys();
if (!cachedPublicKeys.isEmpty()) {
ledger->balance(cachedPublicKeys);
}
}
void QmlCommerce::inventory() {
auto ledger = DependencyManager::get<Ledger>();
auto wallet = DependencyManager::get<Wallet>();
ledger->inventory(wallet->listPublicKeys());
QStringList cachedPublicKeys = wallet->listPublicKeys();
if (!cachedPublicKeys.isEmpty()) {
ledger->inventory(cachedPublicKeys);
}
}
void QmlCommerce::history() {
auto ledger = DependencyManager::get<Ledger>();
auto wallet = DependencyManager::get<Wallet>();
ledger->history(wallet->listPublicKeys());
QStringList cachedPublicKeys = wallet->listPublicKeys();
if (!cachedPublicKeys.isEmpty()) {
ledger->history(cachedPublicKeys);
}
}
void QmlCommerce::changePassphrase(const QString& oldPassphrase, const QString& newPassphrase) {
@ -128,6 +137,11 @@ void QmlCommerce::reset() {
wallet->reset();
}
void QmlCommerce::resetLocalWalletOnly() {
auto wallet = DependencyManager::get<Wallet>();
wallet->reset();
}
void QmlCommerce::account() {
auto ledger = DependencyManager::get<Ledger>();
ledger->account();

@ -65,6 +65,7 @@ protected:
Q_INVOKABLE void history();
Q_INVOKABLE void generateKeyPair();
Q_INVOKABLE void reset();
Q_INVOKABLE void resetLocalWalletOnly();
Q_INVOKABLE void account();
Q_INVOKABLE void certificateInfo(const QString& certificateId);

@ -27,11 +27,11 @@
#include <openssl/ssl.h>
#include <openssl/err.h>
#include <openssl/rsa.h>
#include <openssl/x509.h>
#include <openssl/pem.h>
#include <openssl/evp.h>
#include <openssl/aes.h>
#include <openssl/ecdsa.h>
// I know, right? But per https://www.openssl.org/docs/faq.html
// this avoids OPENSSL_Uplink(00007FF847238000,08): no OPENSSL_Applink
@ -78,18 +78,19 @@ int passwordCallback(char* password, int maxPasswordSize, int rwFlag, void* u) {
}
}
RSA* readKeys(const char* filename) {
EC_KEY* readKeys(const char* filename) {
FILE* fp;
RSA* key = NULL;
EC_KEY *key = NULL;
if ((fp = fopen(filename, "rt"))) {
// file opened successfully
qCDebug(commerce) << "opened key file" << filename;
if ((key = PEM_read_RSAPublicKey(fp, NULL, NULL, NULL))) {
if ((key = PEM_read_EC_PUBKEY(fp, NULL, NULL, NULL))) {
// now read private key
qCDebug(commerce) << "read public key";
if ((key = PEM_read_RSAPrivateKey(fp, &key, passwordCallback, NULL))) {
if ((key = PEM_read_ECPrivateKey(fp, &key, passwordCallback, NULL))) {
qCDebug(commerce) << "read private key";
fclose(fp);
return key;
@ -137,18 +138,18 @@ bool Wallet::writeBackupInstructions() {
return retval;
}
bool writeKeys(const char* filename, RSA* keys) {
bool writeKeys(const char* filename, EC_KEY* keys) {
FILE* fp;
bool retval = false;
if ((fp = fopen(filename, "wt"))) {
if (!PEM_write_RSAPublicKey(fp, keys)) {
if (!PEM_write_EC_PUBKEY(fp, keys)) {
fclose(fp);
qCDebug(commerce) << "failed to write public key";
QFile(QString(filename)).remove();
return retval;
}
if (!PEM_write_RSAPrivateKey(fp, keys, EVP_des_ede3_cbc(), NULL, 0, passwordCallback, NULL)) {
if (!PEM_write_ECPrivateKey(fp, keys, EVP_des_ede3_cbc(), NULL, 0, passwordCallback, NULL)) {
fclose(fp);
qCDebug(commerce) << "failed to write private key";
QFile(QString(filename)).remove();
@ -164,50 +165,29 @@ bool writeKeys(const char* filename, RSA* keys) {
return retval;
}
// copied (without emits for various signals) from libraries/networking/src/RSAKeypairGenerator.cpp.
// We will have a different implementation in practice, but this gives us a start for now
//
// TODO: we don't really use the private keys returned - we can see how this evolves, but probably
// we should just return a list of public keys?
// or perhaps return the RSA* instead?
QPair<QByteArray*, QByteArray*> generateRSAKeypair() {
QPair<QByteArray*, QByteArray*> generateECKeypair() {
RSA* keyPair = RSA_new();
BIGNUM* exponent = BN_new();
EC_KEY* keyPair = EC_KEY_new_by_curve_name(NID_secp256k1);
QPair<QByteArray*, QByteArray*> retval;
const unsigned long RSA_KEY_EXPONENT = 65537;
BN_set_word(exponent, RSA_KEY_EXPONENT);
// seed the random number generator before we call RSA_generate_key_ex
srand(time(NULL));
const int RSA_KEY_BITS = 2048;
if (!RSA_generate_key_ex(keyPair, RSA_KEY_BITS, exponent, NULL)) {
qCDebug(commerce) << "Error generating 2048-bit RSA Keypair -" << ERR_get_error();
// we're going to bust out of here but first we cleanup the BIGNUM
BN_free(exponent);
EC_KEY_set_asn1_flag(keyPair, OPENSSL_EC_NAMED_CURVE);
if (!EC_KEY_generate_key(keyPair)) {
qCDebug(commerce) << "Error generating EC Keypair -" << ERR_get_error();
return retval;
}
// we don't need the BIGNUM anymore so clean that up
BN_free(exponent);
// grab the public key and private key from the file
unsigned char* publicKeyDER = NULL;
int publicKeyLength = i2d_RSAPublicKey(keyPair, &publicKeyDER);
int publicKeyLength = i2d_EC_PUBKEY(keyPair, &publicKeyDER);
unsigned char* privateKeyDER = NULL;
int privateKeyLength = i2d_RSAPrivateKey(keyPair, &privateKeyDER);
int privateKeyLength = i2d_ECPrivateKey(keyPair, &privateKeyDER);
if (publicKeyLength <= 0 || privateKeyLength <= 0) {
qCDebug(commerce) << "Error getting DER public or private key from RSA struct -" << ERR_get_error();
qCDebug(commerce) << "Error getting DER public or private key from EC struct -" << ERR_get_error();
// cleanup the RSA struct
RSA_free(keyPair);
// cleanup the EC struct
EC_KEY_free(keyPair);
// cleanup the public and private key DER data, if required
if (publicKeyLength > 0) {
@ -227,13 +207,13 @@ QPair<QByteArray*, QByteArray*> generateRSAKeypair() {
return retval;
}
RSA_free(keyPair);
EC_KEY_free(keyPair);
// prepare the return values. TODO: Fix this - we probably don't really even want the
// private key at all (better to read it when we need it?). Or maybe we do, when we have
// multiple keys?
retval.first = new QByteArray(reinterpret_cast<char*>(publicKeyDER), publicKeyLength ),
retval.second = new QByteArray(reinterpret_cast<char*>(privateKeyDER), privateKeyLength );
retval.first = new QByteArray(reinterpret_cast<char*>(publicKeyDER), publicKeyLength);
retval.second = new QByteArray(reinterpret_cast<char*>(privateKeyDER), privateKeyLength);
// cleanup the publicKeyDER and privateKeyDER data
OPENSSL_free(publicKeyDER);
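Because the hunk above shows the removed RSA lines side by side with the new EC lines, the final EC flow can be hard to follow. Here is a minimal, self-contained sketch of the same idea, assuming the OpenSSL 1.0.x-style `EC_KEY` API already included in this file; `makeDerKeypair` and its return type are illustrative and not part of the diff.

```cpp
#include <openssl/ec.h>
#include <openssl/obj_mac.h>   // NID_secp256k1
#include <openssl/x509.h>      // i2d_EC_PUBKEY
#include <openssl/crypto.h>    // OPENSSL_free
#include <utility>
#include <vector>

// Illustrative helper: generate a secp256k1 keypair and return the DER-encoded
// public and private keys. Both vectors are empty on failure.
std::pair<std::vector<unsigned char>, std::vector<unsigned char>> makeDerKeypair() {
    std::pair<std::vector<unsigned char>, std::vector<unsigned char>> result;

    EC_KEY* keyPair = EC_KEY_new_by_curve_name(NID_secp256k1);
    if (!keyPair) {
        return result;
    }
    // Encode the curve by name so the DER output carries the named-curve OID.
    EC_KEY_set_asn1_flag(keyPair, OPENSSL_EC_NAMED_CURVE);

    if (EC_KEY_generate_key(keyPair) != 1) {
        EC_KEY_free(keyPair);
        return result;
    }

    unsigned char* publicKeyDER = nullptr;
    int publicKeyLength = i2d_EC_PUBKEY(keyPair, &publicKeyDER);
    unsigned char* privateKeyDER = nullptr;
    int privateKeyLength = i2d_ECPrivateKey(keyPair, &privateKeyDER);

    if (publicKeyLength > 0 && privateKeyLength > 0) {
        result.first.assign(publicKeyDER, publicKeyDER + publicKeyLength);
        result.second.assign(privateKeyDER, privateKeyDER + privateKeyLength);
    }

    // i2d_* allocated these buffers for us; free them along with the key.
    if (publicKeyDER) {
        OPENSSL_free(publicKeyDER);
    }
    if (privateKeyDER) {
        OPENSSL_free(privateKeyDER);
    }
    EC_KEY_free(keyPair);
    return result;
}
```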
@ -245,18 +225,18 @@ QPair<QByteArray*, QByteArray*> generateRSAKeypair() {
// the public key can just go into a byte array
QByteArray readPublicKey(const char* filename) {
FILE* fp;
RSA* key = NULL;
EC_KEY* key = NULL;
if ((fp = fopen(filename, "r"))) {
// file opened successfully
qCDebug(commerce) << "opened key file" << filename;
if ((key = PEM_read_RSAPublicKey(fp, NULL, NULL, NULL))) {
if ((key = PEM_read_EC_PUBKEY(fp, NULL, NULL, NULL))) {
// file read successfully
unsigned char* publicKeyDER = NULL;
int publicKeyLength = i2d_RSAPublicKey(key, &publicKeyDER);
int publicKeyLength = i2d_EC_PUBKEY(key, &publicKeyDER);
// TODO: check for 0 length?
// cleanup
RSA_free(key);
EC_KEY_free(key);
fclose(fp);
qCDebug(commerce) << "parsed public key file successfully";
@ -274,15 +254,15 @@ QByteArray readPublicKey(const char* filename) {
return QByteArray();
}
// the private key should be read/copied into heap memory. For now, we need the RSA struct
// so I'll return that. Note we need to RSA_free(key) later!!!
RSA* readPrivateKey(const char* filename) {
// the private key should be read/copied into heap memory. For now, we need the EC_KEY struct
// so I'll return that.
EC_KEY* readPrivateKey(const char* filename) {
FILE* fp;
RSA* key = NULL;
EC_KEY* key = NULL;
if ((fp = fopen(filename, "r"))) {
// file opened successfully
qCDebug(commerce) << "opened key file" << filename;
if ((key = PEM_read_RSAPrivateKey(fp, &key, passwordCallback, NULL))) {
if ((key = PEM_read_ECPrivateKey(fp, &key, passwordCallback, NULL))) {
qCDebug(commerce) << "parsed private key file successfully";
} else {
@ -341,6 +321,16 @@ Wallet::Wallet() {
auto accountManager = DependencyManager::get<AccountManager>();
connect(accountManager.data(), &AccountManager::usernameChanged, this, [&]() {
getWalletStatus();
_publicKeys.clear();
if (_securityImage) {
delete _securityImage;
}
_securityImage = nullptr;
// tell the provider we got nothing
updateImageProvider();
_passphrase->clear();
});
}
@ -509,7 +499,7 @@ bool Wallet::walletIsAuthenticatedWithPassphrase() {
if (publicKey.size() > 0) {
if (auto key = readPrivateKey(keyFilePath().toStdString().c_str())) {
RSA_free(key);
EC_KEY_free(key);
// be sure to add the public key so we don't do this over and over
_publicKeys.push_back(publicKey.toBase64());
@ -525,7 +515,7 @@ bool Wallet::generateKeyPair() {
initialize();
qCInfo(commerce) << "Generating keypair.";
auto keyPair = generateRSAKeypair();
auto keyPair = generateECKeypair();
writeBackupInstructions();
@ -557,25 +547,25 @@ QStringList Wallet::listPublicKeys() {
// the horror of code pages and so on (changing the bytes) by just returning a base64
// encoded string representing the signature (suitable for http, etc...)
QString Wallet::signWithKey(const QByteArray& text, const QString& key) {
qCInfo(commerce) << "Signing text.";
RSA* rsaPrivateKey = NULL;
if ((rsaPrivateKey = readPrivateKey(keyFilePath().toStdString().c_str()))) {
QByteArray signature(RSA_size(rsaPrivateKey), 0);
qCInfo(commerce) << "Signing text" << text << "with key" << key;
EC_KEY* ecPrivateKey = NULL;
if ((ecPrivateKey = readPrivateKey(keyFilePath().toStdString().c_str()))) {
unsigned char* sig = new unsigned char[ECDSA_size(ecPrivateKey)];
unsigned int signatureBytes = 0;
QByteArray hashedPlaintext = QCryptographicHash::hash(text, QCryptographicHash::Sha256);
int encryptReturn = RSA_sign(NID_sha256,
reinterpret_cast<const unsigned char*>(hashedPlaintext.constData()),
hashedPlaintext.size(),
reinterpret_cast<unsigned char*>(signature.data()),
&signatureBytes,
rsaPrivateKey);
// free the private key RSA struct now that we are done with it
RSA_free(rsaPrivateKey);
int retrn = ECDSA_sign(0,
reinterpret_cast<const unsigned char*>(hashedPlaintext.constData()),
hashedPlaintext.size(),
sig,
&signatureBytes, ecPrivateKey);
if (encryptReturn != -1) {
EC_KEY_free(ecPrivateKey);
QByteArray signature(reinterpret_cast<const char*>(sig), signatureBytes);
if (retrn != -1) {
return signature.toBase64();
}
}
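For reference, the sign path the new code follows is: hash the payload with SHA-256, sign the digest with ECDSA to get a DER signature, then base64-encode the result. Below is a stripped-down sketch under the same assumptions (OpenSSL EC API plus Qt, both already used in this file); `signPayload` is an illustrative name. Note that `ECDSA_sign` reports failure by returning 0, so the sketch checks for success explicitly.

```cpp
#include <QByteArray>
#include <QCryptographicHash>
#include <openssl/ec.h>
#include <openssl/ecdsa.h>

// Illustrative: sign `text` with an already-loaded EC private key and return the
// base64-encoded DER signature, or an empty array on failure.
QByteArray signPayload(const QByteArray& text, EC_KEY* ecPrivateKey) {
    QByteArray digest = QCryptographicHash::hash(text, QCryptographicHash::Sha256);

    QByteArray signature(ECDSA_size(ecPrivateKey), 0);   // worst-case DER length
    unsigned int signatureBytes = 0;

    int ok = ECDSA_sign(0,
                        reinterpret_cast<const unsigned char*>(digest.constData()),
                        digest.size(),
                        reinterpret_cast<unsigned char*>(signature.data()),
                        &signatureBytes,
                        ecPrivateKey);
    if (ok != 1) {
        return QByteArray();
    }
    signature.resize(signatureBytes);   // actual DER output is usually shorter than ECDSA_size
    return signature.toBase64();
}
```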
@ -674,7 +664,7 @@ void Wallet::reset() {
keyFile.remove();
}
bool Wallet::writeWallet(const QString& newPassphrase) {
RSA* keys = readKeys(keyFilePath().toStdString().c_str());
EC_KEY* keys = readKeys(keyFilePath().toStdString().c_str());
if (keys) {
// we read successfully, so now write to a new temp file
QString tempFileName = QString("%1.%2").arg(keyFilePath(), QString("temp"));
@ -720,82 +710,86 @@ bool Wallet::changePassphrase(const QString& newPassphrase) {
void Wallet::handleChallengeOwnershipPacket(QSharedPointer<ReceivedMessage> packet, SharedNodePointer sendingNode) {
auto nodeList = DependencyManager::get<NodeList>();
// With EC keys, we receive a nonce from the metaverse server, which is signed
// here with the private key and returned. Verification is done at server.
bool challengeOriginatedFromClient = packet->getType() == PacketType::ChallengeOwnershipRequest;
unsigned char decryptedText[64];
int status;
int certIDByteArraySize;
int encryptedTextByteArraySize;
int textByteArraySize;
int challengingNodeUUIDByteArraySize;
packet->readPrimitive(&certIDByteArraySize);
packet->readPrimitive(&encryptedTextByteArraySize);
packet->readPrimitive(&textByteArraySize); // returns a cast char*, size
if (challengeOriginatedFromClient) {
packet->readPrimitive(&challengingNodeUUIDByteArraySize);
}
// "encryptedText" is now a series of random bytes, a nonce
QByteArray certID = packet->read(certIDByteArraySize);
QByteArray encryptedText = packet->read(encryptedTextByteArraySize);
QByteArray text = packet->read(textByteArraySize);
QByteArray challengingNodeUUID;
if (challengeOriginatedFromClient) {
challengingNodeUUID = packet->read(challengingNodeUUIDByteArraySize);
}
RSA* rsa = readKeys(keyFilePath().toStdString().c_str());
int decryptionStatus = -1;
EC_KEY* ec = readKeys(keyFilePath().toStdString().c_str());
QString sig;
if (rsa) {
if (ec) {
ERR_clear_error();
decryptionStatus = RSA_private_decrypt(encryptedTextByteArraySize,
reinterpret_cast<const unsigned char*>(encryptedText.constData()),
decryptedText,
rsa,
RSA_PKCS1_OAEP_PADDING);
RSA_free(rsa);
sig = signWithKey(text, ""); // base64 signature, QByteArray cast (on return) to QString FIXME should pass ec as string so we can tell which key to sign with
status = 1;
} else {
qCDebug(commerce) << "During entity ownership challenge, creating the RSA object failed.";
qCDebug(commerce) << "During entity ownership challenge, creating the EC-signed nonce failed.";
status = -1;
}
QByteArray decryptedTextByteArray;
if (decryptionStatus > -1) {
decryptedTextByteArray = QByteArray(reinterpret_cast<const char*>(decryptedText), decryptionStatus);
EC_KEY_free(ec);
QByteArray ba = sig.toLocal8Bit();
const char *sigChar = ba.data();
QByteArray textByteArray;
if (status > -1) {
textByteArray = QByteArray(sigChar, (int) strlen(sigChar));
}
int decryptedTextByteArraySize = decryptedTextByteArray.size();
textByteArraySize = textByteArray.size();
int certIDSize = certID.size();
// setup the packet
if (challengeOriginatedFromClient) {
auto decryptedTextPacket = NLPacket::create(PacketType::ChallengeOwnershipReply,
certIDSize + decryptedTextByteArraySize + challengingNodeUUIDByteArraySize + 3 * sizeof(int),
auto textPacket = NLPacket::create(PacketType::ChallengeOwnershipReply,
certIDSize + textByteArraySize + challengingNodeUUIDByteArraySize + 3 * sizeof(int),
true);
decryptedTextPacket->writePrimitive(certIDSize);
decryptedTextPacket->writePrimitive(decryptedTextByteArraySize);
decryptedTextPacket->writePrimitive(challengingNodeUUIDByteArraySize);
decryptedTextPacket->write(certID);
decryptedTextPacket->write(decryptedTextByteArray);
decryptedTextPacket->write(challengingNodeUUID);
textPacket->writePrimitive(certIDSize);
textPacket->writePrimitive(textByteArraySize);
textPacket->writePrimitive(challengingNodeUUIDByteArraySize);
textPacket->write(certID);
textPacket->write(textByteArray);
textPacket->write(challengingNodeUUID);
qCDebug(commerce) << "Sending ChallengeOwnershipReply Packet containing decrypted text" << decryptedTextByteArray << "for CertID" << certID;
qCDebug(commerce) << "Sending ChallengeOwnershipReply Packet containing signed text" << textByteArray << "for CertID" << certID;
nodeList->sendPacket(std::move(decryptedTextPacket), *sendingNode);
nodeList->sendPacket(std::move(textPacket), *sendingNode);
} else {
auto decryptedTextPacket = NLPacket::create(PacketType::ChallengeOwnership, certIDSize + decryptedTextByteArraySize + 2 * sizeof(int), true);
auto textPacket = NLPacket::create(PacketType::ChallengeOwnership, certIDSize + textByteArraySize + 2 * sizeof(int), true);
decryptedTextPacket->writePrimitive(certIDSize);
decryptedTextPacket->writePrimitive(decryptedTextByteArraySize);
decryptedTextPacket->write(certID);
decryptedTextPacket->write(decryptedTextByteArray);
textPacket->writePrimitive(certIDSize);
textPacket->writePrimitive(textByteArraySize);
textPacket->write(certID);
textPacket->write(textByteArray);
qCDebug(commerce) << "Sending ChallengeOwnership Packet containing decrypted text" << decryptedTextByteArray << "for CertID" << certID;
qCDebug(commerce) << "Sending ChallengeOwnership Packet containing signed text" << textByteArray << "for CertID" << certID;
nodeList->sendPacket(std::move(decryptedTextPacket), *sendingNode);
nodeList->sendPacket(std::move(textPacket), *sendingNode);
}
if (decryptionStatus == -1) {
qCDebug(commerce) << "During entity ownership challenge, decrypting the encrypted text failed.";
if (status == -1) {
qCDebug(commerce) << "During entity ownership challenge, signing the text failed.";
long error = ERR_get_error();
if (error != 0) {
const char* error_str = ERR_error_string(error, NULL);
qCWarning(entities) << "RSA error:" << error_str;
qCWarning(entities) << "EC error:" << error_str;
}
}
}
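As the comment at the top of this function notes, the signed nonce is verified on the server, and that code is not part of this diff. For context, here is a hedged sketch of what the verifying side might look like with the same OpenSSL EC API; the function and variable names are illustrative, not the server's actual implementation.

```cpp
#include <QByteArray>
#include <QCryptographicHash>
#include <openssl/ec.h>
#include <openssl/ecdsa.h>
#include <openssl/x509.h>   // d2i_EC_PUBKEY

// Illustrative: check a base64 DER signature over `nonce` against a DER-encoded
// EC public key (e.g. the key recorded with the certificate).
bool verifySignedNonce(const QByteArray& nonce,
                       const QByteArray& base64Signature,
                       const QByteArray& derPublicKey) {
    const unsigned char* keyData = reinterpret_cast<const unsigned char*>(derPublicKey.constData());
    EC_KEY* publicKey = d2i_EC_PUBKEY(nullptr, &keyData, derPublicKey.size());
    if (!publicKey) {
        return false;
    }

    QByteArray digest = QCryptographicHash::hash(nonce, QCryptographicHash::Sha256);
    QByteArray signature = QByteArray::fromBase64(base64Signature);

    // ECDSA_verify returns 1 for a valid signature, 0 for an invalid one, -1 on error.
    int result = ECDSA_verify(0,
                              reinterpret_cast<const unsigned char*>(digest.constData()),
                              digest.size(),
                              reinterpret_cast<const unsigned char*>(signature.constData()),
                              signature.size(),
                              publicKey);
    EC_KEY_free(publicKey);
    return result == 1;
}
```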

View file

@ -84,18 +84,7 @@ glm::vec2 RayPick::projectOntoXYPlane(const glm::vec3& worldPos, const glm::vec3
glm::vec2 RayPick::projectOntoOverlayXYPlane(const QUuid& overlayID, const glm::vec3& worldPos, bool unNormalized) {
glm::vec3 position = vec3FromVariant(qApp->getOverlays().getProperty(overlayID, "position").value);
glm::quat rotation = quatFromVariant(qApp->getOverlays().getProperty(overlayID, "rotation").value);
glm::vec3 dimensions;
float dpi = qApp->getOverlays().getProperty(overlayID, "dpi").value.toFloat();
if (dpi > 0) {
// Calculate physical dimensions for web3d overlay from resolution and dpi; "dimensions" property is used as a scale.
glm::vec3 resolution = glm::vec3(vec2FromVariant(qApp->getOverlays().getProperty(overlayID, "resolution").value), 1);
glm::vec3 scale = glm::vec3(vec2FromVariant(qApp->getOverlays().getProperty(overlayID, "dimensions").value), 0.01f);
const float INCHES_TO_METERS = 1.0f / 39.3701f;
dimensions = (resolution * INCHES_TO_METERS / dpi) * scale;
} else {
dimensions = glm::vec3(vec2FromVariant(qApp->getOverlays().getProperty(overlayID, "dimensions").value), 0.01);
}
glm::vec3 dimensions = glm::vec3(vec2FromVariant(qApp->getOverlays().getProperty(overlayID, "dimensions").value), 0.01f);
return projectOntoXYPlane(worldPos, position, rotation, dimensions, ENTITY_ITEM_DEFAULT_REGISTRATION_POINT, unNormalized);
}
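For a sense of scale, the removed branch converted a web overlay's pixel resolution to meters via its dpi: a 1920x1080 overlay at 30 dpi works out to 64 in x 36 in, roughly 1.63 m x 0.91 m (using the 1/39.3701 inches-to-meters constant above), and that result was then multiplied by the "dimensions" property acting as a scale. After this change the "dimensions" property is used directly as the overlay's size in meters, with a fixed 0.01 depth.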

View file

@ -58,6 +58,21 @@ Audio::Audio() : _devices(_contextIsHMD) {
enableNoiseReduction(enableNoiseReductionSetting.get());
}
bool Audio::startRecording(const QString& filepath) {
auto client = DependencyManager::get<AudioClient>().data();
return client->startRecording(filepath);
}
bool Audio::getRecording() {
auto client = DependencyManager::get<AudioClient>().data();
return client->getRecording();
}
void Audio::stopRecording() {
auto client = DependencyManager::get<AudioClient>().data();
client->stopRecording();
}
void Audio::setMuted(bool isMuted) {
if (_isMuted != isMuted) {
auto client = DependencyManager::get<AudioClient>().data();

View file

@ -16,6 +16,7 @@
#include "AudioDevices.h"
#include "AudioEffectOptions.h"
#include "SettingHandle.h"
#include "AudioFileWav.h"
namespace scripting {
@ -55,6 +56,10 @@ public:
Q_INVOKABLE void setReverb(bool enable);
Q_INVOKABLE void setReverbOptions(const AudioEffectOptions* options);
Q_INVOKABLE bool startRecording(const QString& filename);
Q_INVOKABLE void stopRecording();
Q_INVOKABLE bool getRecording();
signals:
void nop();
void mutedChanged(bool isMuted);
@ -83,7 +88,6 @@ private:
bool _isMuted { false };
bool _enableNoiseReduction { true }; // Match default value of AudioClient::_isNoiseGateEnabled.
bool _contextIsHMD { false };
AudioDevices* getDevices() { return &_devices; }
AudioDevices _devices;
};
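A minimal sketch of the record-then-stop sequence these new hooks enable, going through the same AudioClient calls the implementation above forwards to. The includes and the output path are only examples.

```cpp
#include <AudioClient.h>
#include <DependencyManager.h>
#include <QString>

// Illustrative sequence; in practice this is driven from the scripting layer
// through the new Q_INVOKABLE methods on scripting::Audio.
void recordShortClip() {
    auto client = DependencyManager::get<AudioClient>().data();

    const QString outputPath = "/tmp/hifi-audio-capture.wav";   // example path
    if (client->startRecording(outputPath)) {
        // ... let some audio play ...
        if (client->getRecording()) {
            client->stopRecording();
        }
    }
}
```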

View file

@ -18,7 +18,9 @@ GameplayObjects::GameplayObjects() {
bool GameplayObjects::addToGameplayObjects(const QUuid& avatarID) {
containsData = true;
_avatarIDs.push_back(avatarID);
if (std::find(_avatarIDs.begin(), _avatarIDs.end(), avatarID) == _avatarIDs.end()) {
_avatarIDs.push_back(avatarID);
}
return true;
}
bool GameplayObjects::removeFromGameplayObjects(const QUuid& avatarID) {
@ -28,7 +30,9 @@ bool GameplayObjects::removeFromGameplayObjects(const QUuid& avatarID) {
bool GameplayObjects::addToGameplayObjects(const EntityItemID& entityID) {
containsData = true;
_entityIDs.push_back(entityID);
if (std::find(_entityIDs.begin(), _entityIDs.end(), entityID) == _entityIDs.end()) {
_entityIDs.push_back(entityID);
}
return true;
}
bool GameplayObjects::removeFromGameplayObjects(const EntityItemID& entityID) {
@ -38,7 +42,9 @@ bool GameplayObjects::removeFromGameplayObjects(const EntityItemID& entityID) {
bool GameplayObjects::addToGameplayObjects(const OverlayID& overlayID) {
containsData = true;
_overlayIDs.push_back(overlayID);
if (std::find(_overlayIDs.begin(), _overlayIDs.end(), overlayID) == _overlayIDs.end()) {
_overlayIDs.push_back(overlayID);
}
return true;
}
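The three additions above all apply the same guard: only push the id if the vector does not already contain it. A generic sketch of that pattern follows; the helper name is illustrative.

```cpp
#include <algorithm>
#include <vector>

// Illustrative helper: append `id` only if it is not already present.
// Returns true when an insertion actually happened.
template <typename T>
bool addUnique(std::vector<T>& ids, const T& id) {
    if (std::find(ids.begin(), ids.end(), id) == ids.end()) {
        ids.push_back(id);
        return true;
    }
    return false;
}
```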
bool GameplayObjects::removeFromGameplayObjects(const OverlayID& overlayID) {
@ -72,28 +78,125 @@ bool SelectionScriptingInterface::removeFromSelectedItemsList(const QString& lis
}
bool SelectionScriptingInterface::clearSelectedItemsList(const QString& listName) {
_selectedItemsListMap.insert(listName, GameplayObjects());
emit selectedItemsListChanged(listName);
{
QWriteLocker lock(&_selectionListsLock);
_selectedItemsListMap.insert(listName, GameplayObjects());
}
onSelectedItemsListChanged(listName);
return true;
}
QStringList SelectionScriptingInterface::getListNames() const {
QStringList list;
QReadLocker lock(&_selectionListsLock);
list = _selectedItemsListMap.keys();
return list;
}
QStringList SelectionScriptingInterface::getHighlightedListNames() const {
QStringList list;
QReadLocker lock(&_highlightStylesLock);
list = _highlightStyleMap.keys();
return list;
}
bool SelectionScriptingInterface::enableListHighlight(const QString& listName, const QVariantMap& highlightStyleValues) {
QWriteLocker lock(&_highlightStylesLock);
auto highlightStyle = _highlightStyleMap.find(listName);
if (highlightStyle == _highlightStyleMap.end()) {
highlightStyle = _highlightStyleMap.insert(listName, SelectionHighlightStyle());
}
if (!(*highlightStyle).isBoundToList()) {
setupHandler(listName);
(*highlightStyle).setBoundToList(true);
}
(*highlightStyle).fromVariantMap(highlightStyleValues);
auto mainScene = qApp->getMain3DScene();
if (mainScene) {
render::Transaction transaction;
transaction.resetSelectionHighlight(listName.toStdString(), (*highlightStyle).getStyle());
mainScene->enqueueTransaction(transaction);
}
else {
qWarning() << "SelectionToSceneHandler::highlightStyleChanged(), Unexpected null scene, possibly during application shutdown";
}
return true;
}
bool SelectionScriptingInterface::disableListHighlight(const QString& listName) {
QWriteLocker lock(&_highlightStylesLock);
auto highlightStyle = _highlightStyleMap.find(listName);
if (highlightStyle != _highlightStyleMap.end()) {
if ((*highlightStyle).isBoundToList()) {
}
_highlightStyleMap.erase(highlightStyle);
auto mainScene = qApp->getMain3DScene();
if (mainScene) {
render::Transaction transaction;
transaction.removeHighlightFromSelection(listName.toStdString());
mainScene->enqueueTransaction(transaction);
}
else {
qWarning() << "SelectionToSceneHandler::highlightStyleChanged(), Unexpected null scene, possibly during application shutdown";
}
}
return true;
}
QVariantMap SelectionScriptingInterface::getListHighlightStyle(const QString& listName) const {
QReadLocker lock(&_highlightStylesLock);
auto highlightStyle = _highlightStyleMap.find(listName);
if (highlightStyle == _highlightStyleMap.end()) {
return QVariantMap();
} else {
return (*highlightStyle).toVariantMap();
}
}
render::HighlightStyle SelectionScriptingInterface::getHighlightStyle(const QString& listName) const {
QReadLocker lock(&_highlightStylesLock);
auto highlightStyle = _highlightStyleMap.find(listName);
if (highlightStyle == _highlightStyleMap.end()) {
return render::HighlightStyle();
} else {
return (*highlightStyle).getStyle();
}
}
template <class T> bool SelectionScriptingInterface::addToGameplayObjects(const QString& listName, T idToAdd) {
GameplayObjects currentList = _selectedItemsListMap.value(listName);
currentList.addToGameplayObjects(idToAdd);
_selectedItemsListMap.insert(listName, currentList);
emit selectedItemsListChanged(listName);
{
QWriteLocker lock(&_selectionListsLock);
GameplayObjects currentList = _selectedItemsListMap.value(listName);
currentList.addToGameplayObjects(idToAdd);
_selectedItemsListMap.insert(listName, currentList);
}
onSelectedItemsListChanged(listName);
return true;
}
template <class T> bool SelectionScriptingInterface::removeFromGameplayObjects(const QString& listName, T idToRemove) {
GameplayObjects currentList = _selectedItemsListMap.value(listName);
if (currentList.getContainsData()) {
currentList.removeFromGameplayObjects(idToRemove);
_selectedItemsListMap.insert(listName, currentList);
emit selectedItemsListChanged(listName);
bool listExist = false;
{
QWriteLocker lock(&_selectionListsLock);
auto currentList = _selectedItemsListMap.find(listName);
if (currentList != _selectedItemsListMap.end()) {
listExist = true;
(*currentList).removeFromGameplayObjects(idToRemove);
}
}
if (listExist) {
onSelectedItemsListChanged(listName);
return true;
} else {
}
else {
return false;
}
}
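Most of the changes in this file follow one pattern: guard every access to the QMap members with QReadLocker/QWriteLocker, and only call onSelectedItemsListChanged after the write lock has been released. A condensed sketch of that pattern, with illustrative names:

```cpp
#include <QMap>
#include <QReadWriteLock>
#include <QString>

class GuardedListMap {
public:
    // Readers take a shared lock; many reads may proceed concurrently.
    bool contains(const QString& name) const {
        QReadLocker lock(&_lock);
        return _lists.contains(name);
    }

    // Writers take an exclusive lock. The lock is dropped at the end of the inner
    // scope, before any notification runs, which avoids re-entrant deadlocks if a
    // handler reads the map again.
    void insert(const QString& name, int value) {
        {
            QWriteLocker lock(&_lock);
            _lists.insert(name, value);
        }
        notifyChanged(name);   // illustrative stand-in for onSelectedItemsListChanged()
    }

private:
    void notifyChanged(const QString&) {}

    mutable QReadWriteLock _lock;
    QMap<QString, int> _lists;
};
```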
@ -102,50 +205,123 @@ template <class T> bool SelectionScriptingInterface::removeFromGameplayObjects(c
//
GameplayObjects SelectionScriptingInterface::getList(const QString& listName) {
QReadLocker lock(&_selectionListsLock);
return _selectedItemsListMap.value(listName);
}
void SelectionScriptingInterface::printList(const QString& listName) {
GameplayObjects currentList = _selectedItemsListMap.value(listName);
if (currentList.getContainsData()) {
QReadLocker lock(&_selectionListsLock);
auto currentList = _selectedItemsListMap.find(listName);
if (currentList != _selectedItemsListMap.end()) {
if ((*currentList).getContainsData()) {
qDebug() << "Avatar IDs:";
for (auto i : currentList.getAvatarIDs()) {
qDebug() << i << ';';
}
qDebug() << "";
qDebug() << "List named " << listName << ":";
qDebug() << "Avatar IDs:";
for (auto i : (*currentList).getAvatarIDs()) {
qDebug() << i << ';';
}
qDebug() << "";
qDebug() << "Entity IDs:";
for (auto j : currentList.getEntityIDs()) {
qDebug() << j << ';';
}
qDebug() << "";
qDebug() << "Entity IDs:";
for (auto j : (*currentList).getEntityIDs()) {
qDebug() << j << ';';
}
qDebug() << "";
qDebug() << "Overlay IDs:";
for (auto k : currentList.getOverlayIDs()) {
qDebug() << k << ';';
qDebug() << "Overlay IDs:";
for (auto k : (*currentList).getOverlayIDs()) {
qDebug() << k << ';';
}
qDebug() << "";
}
else {
qDebug() << "List named " << listName << " empty";
}
qDebug() << "";
} else {
qDebug() << "List named" << listName << "doesn't exist.";
qDebug() << "List named " << listName << " doesn't exist.";
}
}
QVariantMap SelectionScriptingInterface::getSelectedItemsList(const QString& listName) const {
QReadLocker lock(&_selectionListsLock);
QVariantMap list;
auto currentList = _selectedItemsListMap.find(listName);
if (currentList != _selectedItemsListMap.end()) {
QList<QVariant> avatarIDs;
QList<QVariant> entityIDs;
QList<QVariant> overlayIDs;
if ((*currentList).getContainsData()) {
if (!(*currentList).getAvatarIDs().empty()) {
for (auto j : (*currentList).getAvatarIDs()) {
avatarIDs.push_back((QUuid)j);
}
}
if (!(*currentList).getEntityIDs().empty()) {
for (auto j : (*currentList).getEntityIDs()) {
entityIDs.push_back((QUuid)j );
}
}
if (!(*currentList).getOverlayIDs().empty()) {
for (auto j : (*currentList).getOverlayIDs()) {
overlayIDs.push_back((QUuid)j);
}
}
}
list["avatars"] = (avatarIDs);
list["entities"] = (entityIDs);
list["overlays"] = (overlayIDs);
return list;
}
else {
return list;
}
}
bool SelectionScriptingInterface::removeListFromMap(const QString& listName) {
if (_selectedItemsListMap.remove(listName)) {
emit selectedItemsListChanged(listName);
bool removed = false;
{
QWriteLocker lock(&_selectionListsLock);
removed = _selectedItemsListMap.remove(listName);
}
if (removed) {
onSelectedItemsListChanged(listName);
return true;
} else {
return false;
}
}
void SelectionScriptingInterface::setupHandler(const QString& selectionName) {
QWriteLocker lock(&_selectionHandlersLock);
auto handler = _handlerMap.find(selectionName);
if (handler == _handlerMap.end()) {
handler = _handlerMap.insert(selectionName, new SelectionToSceneHandler());
}
(*handler)->initialize(selectionName);
}
void SelectionScriptingInterface::onSelectedItemsListChanged(const QString& listName) {
{
QWriteLocker lock(&_selectionHandlersLock);
auto handler = _handlerMap.find(listName);
if (handler != _handlerMap.end()) {
(*handler)->updateSceneFromSelectedList();
}
}
emit selectedItemsListChanged(listName);
}
SelectionToSceneHandler::SelectionToSceneHandler() {
}
void SelectionToSceneHandler::initialize(const QString& listName) {
_listName = listName;
updateSceneFromSelectedList();
}
void SelectionToSceneHandler::selectedItemsListChanged(const QString& listName) {
@ -199,3 +375,85 @@ void SelectionToSceneHandler::updateSceneFromSelectedList() {
qWarning() << "SelectionToSceneHandler::updateRendererSelectedList(), Unexpected null scene, possibly during application shutdown";
}
}
bool SelectionHighlightStyle::fromVariantMap(const QVariantMap& properties) {
auto colorVariant = properties["outlineUnoccludedColor"];
if (colorVariant.isValid()) {
bool isValid;
auto color = xColorFromVariant(colorVariant, isValid);
if (isValid) {
_style._outlineUnoccluded.color = toGlm(color);
}
}
colorVariant = properties["outlineOccludedColor"];
if (colorVariant.isValid()) {
bool isValid;
auto color = xColorFromVariant(colorVariant, isValid);
if (isValid) {
_style._outlineOccluded.color = toGlm(color);
}
}
colorVariant = properties["fillUnoccludedColor"];
if (colorVariant.isValid()) {
bool isValid;
auto color = xColorFromVariant(colorVariant, isValid);
if (isValid) {
_style._fillUnoccluded.color = toGlm(color);
}
}
colorVariant = properties["fillOccludedColor"];
if (colorVariant.isValid()) {
bool isValid;
auto color = xColorFromVariant(colorVariant, isValid);
if (isValid) {
_style._fillOccluded.color = toGlm(color);
}
}
auto intensityVariant = properties["outlineUnoccludedAlpha"];
if (intensityVariant.isValid()) {
_style._outlineUnoccluded.alpha = intensityVariant.toFloat();
}
intensityVariant = properties["outlineOccludedAlpha"];
if (intensityVariant.isValid()) {
_style._outlineOccluded.alpha = intensityVariant.toFloat();
}
intensityVariant = properties["fillUnoccludedAlpha"];
if (intensityVariant.isValid()) {
_style._fillUnoccluded.alpha = intensityVariant.toFloat();
}
intensityVariant = properties["fillOccludedAlpha"];
if (intensityVariant.isValid()) {
_style._fillOccluded.alpha = intensityVariant.toFloat();
}
auto outlineWidth = properties["outlineWidth"];
if (outlineWidth.isValid()) {
_style._outlineWidth = outlineWidth.toFloat();
}
auto isOutlineSmooth = properties["isOutlineSmooth"];
if (isOutlineSmooth.isValid()) {
_style._isOutlineSmooth = isOutlineSmooth.toBool();
}
return true;
}
QVariantMap SelectionHighlightStyle::toVariantMap() const {
QVariantMap properties;
properties["outlineUnoccludedColor"] = xColorToVariant(xColorFromGlm(_style._outlineUnoccluded.color));
properties["outlineOccludedColor"] = xColorToVariant(xColorFromGlm(_style._outlineOccluded.color));
properties["fillUnoccludedColor"] = xColorToVariant(xColorFromGlm(_style._fillUnoccluded.color));
properties["fillOccludedColor"] = xColorToVariant(xColorFromGlm(_style._fillOccluded.color));
properties["outlineUnoccludedAlpha"] = _style._outlineUnoccluded.alpha;
properties["outlineOccludedAlpha"] = _style._outlineOccluded.alpha;
properties["fillUnoccludedAlpha"] = _style._fillUnoccluded.alpha;
properties["fillOccludedAlpha"] = _style._fillOccluded.alpha;
properties["outlineWidth"] = _style._outlineWidth;
properties["isOutlineSmooth"] = _style._isOutlineSmooth;
return properties;
}
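A hedged usage sketch of the new highlight API from C++, mirroring the enableListHighlight call ContextOverlayInterface makes later in this diff. The color layout assumes xColorFromVariant accepts a {red, green, blue} map; the list name and style values are only examples.

```cpp
#include <QVariantMap>
#include <DependencyManager.h>
#include "SelectionScriptingInterface.h"

void highlightExampleList() {
    auto selection = DependencyManager::get<SelectionScriptingInterface>();

    QVariantMap color;               // assumed {red, green, blue} layout for xColor
    color["red"] = 255;
    color["green"] = 128;
    color["blue"] = 0;

    QVariantMap style;
    style["outlineUnoccludedColor"] = color;
    style["outlineUnoccludedAlpha"] = 1.0f;
    style["outlineWidth"] = 3.0f;
    style["isOutlineSmooth"] = true;

    // Creates the list if needed, binds it to a SelectionToSceneHandler, and
    // pushes the style to the render scene as a selection highlight.
    selection->enableListHighlight("exampleHighlightList", style);
}
```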

View file

@ -21,22 +21,23 @@
#include "RenderableEntityItem.h"
#include "ui/overlays/Overlay.h"
#include <avatar/AvatarManager.h>
#include <render/HighlightStyle.h>
class GameplayObjects {
public:
GameplayObjects();
bool getContainsData() { return containsData; }
bool getContainsData() const { return containsData; }
std::vector<QUuid> getAvatarIDs() { return _avatarIDs; }
std::vector<QUuid> getAvatarIDs() const { return _avatarIDs; }
bool addToGameplayObjects(const QUuid& avatarID);
bool removeFromGameplayObjects(const QUuid& avatarID);
std::vector<EntityItemID> getEntityIDs() { return _entityIDs; }
std::vector<EntityItemID> getEntityIDs() const { return _entityIDs; }
bool addToGameplayObjects(const EntityItemID& entityID);
bool removeFromGameplayObjects(const EntityItemID& entityID);
std::vector<OverlayID> getOverlayIDs() { return _overlayIDs; }
std::vector<OverlayID> getOverlayIDs() const { return _overlayIDs; }
bool addToGameplayObjects(const OverlayID& overlayID);
bool removeFromGameplayObjects(const OverlayID& overlayID);
@ -48,20 +49,52 @@ private:
};
class SelectionToSceneHandler : public QObject {
Q_OBJECT
public:
SelectionToSceneHandler();
void initialize(const QString& listName);
void updateSceneFromSelectedList();
public slots:
void selectedItemsListChanged(const QString& listName);
private:
QString _listName{ "" };
};
using SelectionToSceneHandlerPointer = QSharedPointer<SelectionToSceneHandler>;
class SelectionHighlightStyle {
public:
SelectionHighlightStyle() {}
void setBoundToList(bool bound) { _isBoundToList = bound; }
bool isBoundToList() const { return _isBoundToList; }
bool fromVariantMap(const QVariantMap& properties);
QVariantMap toVariantMap() const;
render::HighlightStyle getStyle() const { return _style; }
protected:
bool _isBoundToList{ false };
render::HighlightStyle _style;
};
class SelectionScriptingInterface : public QObject, public Dependency {
Q_OBJECT
public:
SelectionScriptingInterface();
GameplayObjects getList(const QString& listName);
/**jsdoc
* Prints out the list of avatars, entities and overlays stored in a particular selection.
* @function Selection.printList
* @param listName {string} name of the selection
* Query the names of all the selection lists
* @function Selection.getListNames
* @return An array of names of all the selection lists
*/
Q_INVOKABLE void printList(const QString& listName);
Q_INVOKABLE QStringList getListNames() const;
/**jsdoc
* Removes a named selection from the list of selections.
* @function Selection.removeListFromMap
@ -96,30 +129,103 @@ public:
*/
Q_INVOKABLE bool clearSelectedItemsList(const QString& listName);
/**jsdoc
* Prints out the list of avatars, entities and overlays stored in a particular selection.
* @function Selection.printList
* @param listName {string} name of the selection
*/
Q_INVOKABLE void printList(const QString& listName);
/**jsdoc
* Query the list of avatars, entities and overlays stored in a particular selection.
* @function Selection.getList
* @param listName {string} name of the selection
* @return a js object describing the content of a selection list with the following properties:
* - "entities": [ and array of the entityID of the entities in the selection]
* - "avatars": [ and array of the avatarID of the avatars in the selection]
* - "overlays": [ and array of the overlayID of the overlays in the selection]
* If the list name doesn't exist, the function returns an empty js object with no properties.
*/
Q_INVOKABLE QVariantMap getSelectedItemsList(const QString& listName) const;
/**jsdoc
* Query the names of the highlighted selection lists
* @function Selection.getHighlightedListNames
* @return An array of names of the selection lists that currently have highlighting enabled
*/
Q_INVOKABLE QStringList getHighlightedListNames() const;
/**jsdoc
* Enable highlighting for the named selection.
* If the Selection doesn't exist, it will be created.
* All objects in the list will be displayed with the highlight effect specified by the highlightStyle.
* The function can be called several times with different values in the style to modify it.
*
* @function Selection.enableListHighlight
* @param listName {string} name of the selection
* @param highlightStyle {jsObject} highlight style fields (see Selection.getListHighlightStyle for a detailed description of the highlightStyle).
* @returns {bool} true if the selection was successfully enabled for highlight.
*/
Q_INVOKABLE bool enableListHighlight(const QString& listName, const QVariantMap& highlightStyle);
/**jsdoc
* Disable highlighting for the named selection.
* If the Selection doesn't exist or wasn't enabled for highlighting, nothing happens and the function simply returns false.
*
* @function Selection.disableListHighlight
* @param listName {string} name of the selection
* @returns {bool} true if the selection was successfully disabled for highlight, false otherwise.
*/
Q_INVOKABLE bool disableListHighlight(const QString& listName);
/**jsdoc
* Query the highlight style values for the named selection.
* If the Selection doesn't exist or hasn't had highlighting enabled yet, it will return an empty object.
* Otherwise, the jsObject describes the highlight style properties:
* - outlineUnoccludedColor: {xColor} Color of the specified highlight region
* - outlineOccludedColor: {xColor} "
* - fillUnoccludedColor: {xColor} "
* - fillOccludedColor: {xColor} "
*
* - outlineUnoccludedAlpha: {float} Alpha value ranging from 0.0 (not visible) to 1.0 (fully opaque) for the specified highlight region
* - outlineOccludedAlpha: {float} "
* - fillUnoccludedAlpha: {float} "
* - fillOccludedAlpha: {float} "
*
* - outlineWidth: {float} width of the outline expressed in pixels
* - isOutlineSmooth: {bool} true to enable outline smooth falloff
*
* @function Selection.getListHighlightStyle
* @param listName {string} name of the selection
* @returns {jsObject} highlight style as described above
*/
Q_INVOKABLE QVariantMap getListHighlightStyle(const QString& listName) const;
GameplayObjects getList(const QString& listName);
render::HighlightStyle getHighlightStyle(const QString& listName) const;
void onSelectedItemsListChanged(const QString& listName);
signals:
void selectedItemsListChanged(const QString& listName);
private:
mutable QReadWriteLock _selectionListsLock;
QMap<QString, GameplayObjects> _selectedItemsListMap;
mutable QReadWriteLock _selectionHandlersLock;
QMap<QString, SelectionToSceneHandler*> _handlerMap;
mutable QReadWriteLock _highlightStylesLock;
QMap<QString, SelectionHighlightStyle> _highlightStyleMap;
template <class T> bool addToGameplayObjects(const QString& listName, T idToAdd);
template <class T> bool removeFromGameplayObjects(const QString& listName, T idToRemove);
};
void setupHandler(const QString& selectionName);
class SelectionToSceneHandler : public QObject {
Q_OBJECT
public:
SelectionToSceneHandler();
void initialize(const QString& listName);
void updateSceneFromSelectedList();
public slots:
void selectedItemsListChanged(const QString& listName);
private:
QString _listName { "" };
};
#endif // hifi_SelectionScriptingInterface_h

View file

@ -22,31 +22,3 @@ void WalletScriptingInterface::refreshWalletStatus() {
auto wallet = DependencyManager::get<Wallet>();
wallet->getWalletStatus();
}
static const QString CHECKOUT_QML_PATH = qApp->applicationDirPath() + "../../../qml/hifi/commerce/checkout/Checkout.qml";
void WalletScriptingInterface::buy(const QString& name, const QString& id, const int& price, const QString& href) {
if (QThread::currentThread() != thread()) {
QMetaObject::invokeMethod(this, "buy", Q_ARG(const QString&, name), Q_ARG(const QString&, id), Q_ARG(const int&, price), Q_ARG(const QString&, href));
return;
}
auto tabletScriptingInterface = DependencyManager::get<TabletScriptingInterface>();
auto tablet = dynamic_cast<TabletProxy*>(tabletScriptingInterface->getTablet("com.highfidelity.interface.tablet.system"));
tablet->loadQMLSource(CHECKOUT_QML_PATH);
DependencyManager::get<HMDScriptingInterface>()->openTablet();
QQuickItem* root = nullptr;
if (tablet->getToolbarMode() || (!tablet->getTabletRoot() && !qApp->isHMDMode())) {
root = DependencyManager::get<OffscreenUi>()->getRootItem();
} else {
root = tablet->getTabletRoot();
}
CheckoutProxy* checkout = new CheckoutProxy(root->findChild<QObject*>("checkout"));
// Example: Wallet.buy("Test Flaregun", "0d90d21c-ce7a-4990-ad18-e9d2cf991027", 17, "http://mpassets.highfidelity.com/0d90d21c-ce7a-4990-ad18-e9d2cf991027-v1/flaregun.json");
checkout->writeProperty("itemName", name);
checkout->writeProperty("itemId", id);
checkout->writeProperty("itemPrice", price);
checkout->writeProperty("itemHref", href);
}

View file

@ -41,8 +41,6 @@ public:
Q_INVOKABLE uint getWalletStatus() { return _walletStatus; }
void setWalletStatus(const uint& status) { _walletStatus = status; }
Q_INVOKABLE void buy(const QString& name, const QString& id, const int& price, const QString& href);
signals:
void walletStatusChanged();
void walletNotSetup();

View file

@ -176,6 +176,10 @@ bool WindowScriptingInterface::isPointOnDesktopWindow(QVariant point) {
return offscreenUi->isPointOnDesktopWindow(point);
}
glm::vec2 WindowScriptingInterface::getDeviceSize() const {
return qApp->getDeviceSize();
}
/// Makes sure that the reticle is visible, use this in blocking forms that require a reticle and
/// might be in same thread as a script that sets the reticle to invisible
void WindowScriptingInterface::ensureReticleVisible() const {

View file

@ -12,6 +12,8 @@
#ifndef hifi_WindowScriptingInterface_h
#define hifi_WindowScriptingInterface_h
#include <glm/glm.hpp>
#include <QtCore/QObject>
#include <QtCore/QString>
#include <QtQuick/QQuickItem>
@ -73,6 +75,7 @@ public slots:
bool isPhysicsEnabled();
bool setDisplayTexture(const QString& name);
bool isPointOnDesktopWindow(QVariant point);
glm::vec2 getDeviceSize() const;
int openMessageBox(QString title, QString text, int buttons, int defaultButton);
void updateMessageBox(int id, QString title, QString text, int buttons, int defaultButton);

View file

@ -82,7 +82,6 @@ void ApplicationOverlay::renderOverlay(RenderArgs* renderArgs) {
// Now render the overlay components together into a single texture
renderDomainConnectionStatusBorder(renderArgs); // renders the connected domain line
renderAudioScope(renderArgs); // audio scope in the very back - NOTE: this is the debug audio scope, not the VU meter
renderOverlays(renderArgs); // renders Scripts Overlay and AudioScope
renderQmlUi(renderArgs); // renders a unit quad with the QML UI texture, and the text overlays from scripts
});
@ -118,25 +117,6 @@ void ApplicationOverlay::renderQmlUi(RenderArgs* renderArgs) {
geometryCache->renderUnitQuad(batch, glm::vec4(1), _qmlGeometryId);
}
void ApplicationOverlay::renderAudioScope(RenderArgs* renderArgs) {
PROFILE_RANGE(app, __FUNCTION__);
gpu::Batch& batch = *renderArgs->_batch;
auto geometryCache = DependencyManager::get<GeometryCache>();
geometryCache->useSimpleDrawPipeline(batch);
auto textureCache = DependencyManager::get<TextureCache>();
batch.setResourceTexture(0, textureCache->getWhiteTexture());
int width = renderArgs->_viewport.z;
int height = renderArgs->_viewport.w;
mat4 legacyProjection = glm::ortho<float>(0, width, height, 0, ORTHO_NEAR_CLIP, ORTHO_FAR_CLIP);
batch.setProjectionTransform(legacyProjection);
batch.setModelTransform(Transform());
batch.resetViewTransform();
// Render the audio scope
DependencyManager::get<AudioScope>()->render(renderArgs, width, height);
}
void ApplicationOverlay::renderOverlays(RenderArgs* renderArgs) {
PROFILE_RANGE(app, __FUNCTION__);

View file

@ -32,7 +32,6 @@ private:
void renderStatsAndLogs(RenderArgs* renderArgs);
void renderDomainConnectionStatusBorder(RenderArgs* renderArgs);
void renderQmlUi(RenderArgs* renderArgs);
void renderAudioScope(RenderArgs* renderArgs);
void renderOverlays(RenderArgs* renderArgs);
void buildFramebufferObject();

View file

@ -35,6 +35,8 @@ public:
// getters
virtual bool is3D() const override { return true; }
virtual uint32_t fetchMetaSubItems(render::ItemIDs& subItems) const override { subItems.push_back(getRenderItemID()); return (uint32_t) subItems.size(); }
// TODO: consider implementing registration points in this class
glm::vec3 getCenter() const { return getWorldPosition(); }

View file

@ -72,14 +72,7 @@ ContextOverlayInterface::ContextOverlayInterface() {
connect(&qApp->getOverlays(), &Overlays::hoverLeaveOverlay, this, &ContextOverlayInterface::contextOverlays_hoverLeaveOverlay);
{
render::Transaction transaction;
initializeSelectionToSceneHandler(_selectionToSceneHandlers[0], "contextOverlayHighlightList", transaction);
for (auto i = 1; i < MAX_SELECTION_COUNT; i++) {
auto selectionName = QString("highlightList") + QString::number(i);
initializeSelectionToSceneHandler(_selectionToSceneHandlers[i], selectionName, transaction);
}
const render::ScenePointer& scene = qApp->getMain3DScene();
scene->enqueueTransaction(transaction);
_selectionScriptingInterface->enableListHighlight("contextOverlayHighlightList", QVariantMap());
}
auto nodeList = DependencyManager::get<NodeList>();
@ -88,12 +81,6 @@ ContextOverlayInterface::ContextOverlayInterface() {
_challengeOwnershipTimeoutTimer.setSingleShot(true);
}
void ContextOverlayInterface::initializeSelectionToSceneHandler(SelectionToSceneHandler& handler, const QString& selectionName, render::Transaction& transaction) {
handler.initialize(selectionName);
connect(_selectionScriptingInterface.data(), &SelectionScriptingInterface::selectedItemsListChanged, &handler, &SelectionToSceneHandler::selectedItemsListChanged);
transaction.resetSelectionHighlight(selectionName.toStdString());
}
static const xColor CONTEXT_OVERLAY_COLOR = { 255, 255, 255 };
static const float CONTEXT_OVERLAY_INSIDE_DISTANCE = 1.0f; // in meters
static const float CONTEXT_OVERLAY_SIZE = 0.09f; // in meters, same x and y dims
@ -326,21 +313,21 @@ void ContextOverlayInterface::openInspectionCertificate() {
QString ownerKey = jsonObject["transfer_recipient_key"].toString();
QByteArray certID = entityProperties.getCertificateID().toUtf8();
QByteArray encryptedText = DependencyManager::get<EntityTreeRenderer>()->getTree()->computeEncryptedNonce(certID, ownerKey);
QByteArray text = DependencyManager::get<EntityTreeRenderer>()->getTree()->computeNonce(certID, ownerKey);
QByteArray nodeToChallengeByteArray = entityProperties.getOwningAvatarID().toRfc4122();
int certIDByteArraySize = certID.length();
int encryptedTextByteArraySize = encryptedText.length();
int textByteArraySize = text.length();
int nodeToChallengeByteArraySize = nodeToChallengeByteArray.length();
auto challengeOwnershipPacket = NLPacket::create(PacketType::ChallengeOwnershipRequest,
certIDByteArraySize + encryptedTextByteArraySize + nodeToChallengeByteArraySize + 3 * sizeof(int),
certIDByteArraySize + textByteArraySize + nodeToChallengeByteArraySize + 3 * sizeof(int),
true);
challengeOwnershipPacket->writePrimitive(certIDByteArraySize);
challengeOwnershipPacket->writePrimitive(encryptedTextByteArraySize);
challengeOwnershipPacket->writePrimitive(textByteArraySize);
challengeOwnershipPacket->writePrimitive(nodeToChallengeByteArraySize);
challengeOwnershipPacket->write(certID);
challengeOwnershipPacket->write(encryptedText);
challengeOwnershipPacket->write(text);
challengeOwnershipPacket->write(nodeToChallengeByteArray);
nodeList->sendPacket(std::move(challengeOwnershipPacket), *entityServer);
@ -421,16 +408,16 @@ void ContextOverlayInterface::handleChallengeOwnershipReplyPacket(QSharedPointer
_challengeOwnershipTimeoutTimer.stop();
int certIDByteArraySize;
int decryptedTextByteArraySize;
int textByteArraySize;
packet->readPrimitive(&certIDByteArraySize);
packet->readPrimitive(&decryptedTextByteArraySize);
packet->readPrimitive(&textByteArraySize);
QString certID(packet->read(certIDByteArraySize));
QString decryptedText(packet->read(decryptedTextByteArraySize));
QString text(packet->read(textByteArraySize));
EntityItemID id;
bool verificationSuccess = DependencyManager::get<EntityTreeRenderer>()->getTree()->verifyDecryptedNonce(certID, decryptedText, id);
bool verificationSuccess = DependencyManager::get<EntityTreeRenderer>()->getTree()->verifyNonce(certID, text, id);
if (verificationSuccess) {
emit ledger->updateCertificateStatus(certID, (uint)(ledger->CERTIFICATE_STATUS_VERIFICATION_SUCCESS));

View file

@ -96,9 +96,6 @@ private:
void disableEntityHighlight(const EntityItemID& entityItemID);
void deletingEntity(const EntityItemID& entityItemID);
void initializeSelectionToSceneHandler(SelectionToSceneHandler& handler, const QString& selectionName, render::Transaction& transaction);
SelectionToSceneHandler _selectionToSceneHandlers[MAX_SELECTION_COUNT];
Q_INVOKABLE void startChallengeOwnershipTimer();
QTimer _challengeOwnershipTimeoutTimer;

View file

@ -166,26 +166,36 @@ void Line3DOverlay::setProperties(const QVariantMap& originalProperties) {
bool newEndSet { false };
auto start = properties["start"];
// if "start" property was not there, check to see if they included aliases: startPoint
// If "start" property was not there, check to see if they included aliases: startPoint, p1
if (!start.isValid()) {
start = properties["startPoint"];
}
if (!start.isValid()) {
start = properties["p1"];
}
if (start.isValid()) {
newStart = vec3FromVariant(start);
newStartSet = true;
}
properties.remove("start"); // so that Base3DOverlay doesn't respond to it
properties.remove("startPoint");
properties.remove("p1");
auto end = properties["end"];
// if "end" property was not there, check to see if they included aliases: endPoint
// If "end" property was not there, check to see if they included aliases: endPoint, p2
if (!end.isValid()) {
end = properties["endPoint"];
}
if (!end.isValid()) {
end = properties["p2"];
}
if (end.isValid()) {
newEnd = vec3FromVariant(end);
newEndSet = true;
}
properties.remove("end"); // so that Base3DOverlay doesn't respond to it
properties.remove("endPoint");
properties.remove("p2");
auto length = properties["length"];
if (length.isValid()) {
@ -313,14 +323,23 @@ QVariant Line3DOverlay::getProperty(const QString& property) {
if (property == "end" || property == "endPoint" || property == "p2") {
return vec3toVariant(getEnd());
}
if (property == "length") {
return QVariant(getLength());
}
if (property == "endParentID") {
return _endParentID;
}
if (property == "endParentJointIndex") {
return _endParentJointIndex;
}
if (property == "localStart") {
return vec3toVariant(getLocalStart());
}
if (property == "localEnd") {
return vec3toVariant(getLocalEnd());
}
if (property == "length") {
return QVariant(getLength());
if (property == "glow") {
return getGlow();
}
if (property == "lineWidth") {
return _lineWidth;

View file

@ -35,7 +35,8 @@ ModelOverlay::ModelOverlay(const ModelOverlay* modelOverlay) :
_modelTextures(QVariantMap()),
_url(modelOverlay->_url),
_updateModel(false),
_loadPriority(modelOverlay->getLoadPriority())
_scaleToFit(modelOverlay->_scaleToFit),
_loadPriority(modelOverlay->_loadPriority)
{
_model->init();
_model->setLoadingPriority(_loadPriority);
@ -78,6 +79,12 @@ void ModelOverlay::update(float deltatime) {
if (_model->needsFixupInScene()) {
_model->removeFromScene(scene, transaction);
_model->addToScene(scene, transaction);
auto newRenderItemIDs{ _model->fetchRenderItemIDs() };
transaction.updateItem<Overlay>(getRenderItemID(), [newRenderItemIDs](Overlay& data) {
auto modelOverlay = static_cast<ModelOverlay*>(&data);
modelOverlay->setSubRenderItemIDs(newRenderItemIDs);
});
}
if (_visibleDirty) {
_visibleDirty = false;
@ -103,6 +110,10 @@ bool ModelOverlay::addToScene(Overlay::Pointer overlay, const render::ScenePoint
void ModelOverlay::removeFromScene(Overlay::Pointer overlay, const render::ScenePointer& scene, render::Transaction& transaction) {
Volume3DOverlay::removeFromScene(overlay, scene, transaction);
_model->removeFromScene(scene, transaction);
transaction.updateItem<Overlay>(getRenderItemID(), [](Overlay& data) {
auto modelOverlay = static_cast<ModelOverlay*>(&data);
modelOverlay->clearSubRenderItemIDs();
});
}
void ModelOverlay::setVisible(bool visible) {
@ -134,6 +145,9 @@ void ModelOverlay::setProperties(const QVariantMap& properties) {
}
auto dimensions = properties["dimensions"];
if (!dimensions.isValid()) {
dimensions = properties["size"];
}
if (dimensions.isValid()) {
_scaleToFit = true;
setDimensions(vec3FromVariant(dimensions));
@ -598,3 +612,19 @@ void ModelOverlay::copyAnimationJointDataToModel(QVector<JointData> jointsData)
_updateModel = true;
}
void ModelOverlay::clearSubRenderItemIDs() {
_subRenderItemIDs.clear();
}
void ModelOverlay::setSubRenderItemIDs(const render::ItemIDs& ids) {
_subRenderItemIDs = ids;
}
uint32_t ModelOverlay::fetchMetaSubItems(render::ItemIDs& subItems) const {
if (_model) {
auto metaSubItems = _subRenderItemIDs;
subItems.insert(subItems.end(), metaSubItems.begin(), metaSubItems.end());
return (uint32_t)metaSubItems.size();
}
return 0;
}
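For context on how these IDs are consumed: the overlay acts as a render meta item, and the render engine asks it for its sub-items through fetchMetaSubItems (see the metaFetchMetaSubItems payload specialization added to Overlays.cpp further down). A minimal sketch of that consumption; the helper is illustrative.

```cpp
#include <render/Item.h>
#include "Overlay.h"

// Illustrative: expand an overlay meta item into the concrete render items it owns.
render::ItemIDs collectSubItems(const Overlay::Pointer& overlay) {
    render::ItemIDs subItems;
    overlay->fetchMetaSubItems(subItems);   // ModelOverlay appends its model's render item IDs here
    return subItems;
}
```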

View file

@ -30,6 +30,12 @@ public:
virtual void update(float deltatime) override;
virtual void render(RenderArgs* args) override {};
virtual uint32_t fetchMetaSubItems(render::ItemIDs& subItems) const override;
void clearSubRenderItemIDs();
void setSubRenderItemIDs(const render::ItemIDs& ids);
void setProperties(const QVariantMap& properties) override;
QVariant getProperty(const QString& property) override;
virtual bool findRayIntersection(const glm::vec3& origin, const glm::vec3& direction, float& distance,
@ -74,9 +80,11 @@ private:
ModelPointer _model;
QVariantMap _modelTextures;
render::ItemIDs _subRenderItemIDs;
QUrl _url;
bool _updateModel = { false };
bool _scaleToFit = { false };
bool _updateModel { false };
bool _scaleToFit { false };
float _loadPriority { 0.0f };
AnimationPointer _animation;
@ -87,7 +95,7 @@ private:
bool _animationRunning { false };
bool _animationLoop { false };
float _animationFirstFrame { 0.0f };
float _animationLastFrame = { 0.0f };
float _animationLastFrame { 0.0f };
bool _animationHold { false };
bool _animationAllowTranslation { false };
uint64_t _lastAnimated { 0 };

View file

@ -53,6 +53,8 @@ public:
virtual const render::ShapeKey getShapeKey() { return render::ShapeKey::Builder::ownPipeline(); }
virtual uint32_t fetchMetaSubItems(render::ItemIDs& subItems) const { return 0; }
// getters
virtual QString getType() const = 0;
virtual bool is3D() const = 0;
@ -130,6 +132,7 @@ namespace render {
template <> int payloadGetLayer(const Overlay::Pointer& overlay);
template <> void payloadRender(const Overlay::Pointer& overlay, RenderArgs* args);
template <> const ShapeKey shapeGetShapeKey(const Overlay::Pointer& overlay);
template <> uint32_t metaFetchMetaSubItems(const Overlay::Pointer& overlay, ItemIDs& subItems);
}
Q_DECLARE_METATYPE(OverlayID);

View file

@ -26,6 +26,8 @@ public:
virtual bool is3D() const override { return false; }
virtual uint32_t fetchMetaSubItems(render::ItemIDs& subItems) const override { subItems.push_back(getRenderItemID()); return 1; }
// getters
int getX() const { return _bounds.x(); }
int getY() const { return _bounds.y(); }

View file

@ -586,13 +586,6 @@ public slots:
*/
void setKeyboardFocusOverlay(const OverlayID& id);
void mousePressPointerEvent(const OverlayID& overlayID, const PointerEvent& event);
void mouseMovePointerEvent(const OverlayID& overlayID, const PointerEvent& event);
void mouseReleasePointerEvent(const OverlayID& overlayID, const PointerEvent& event);
void hoverEnterPointerEvent(const OverlayID& overlayID, const PointerEvent& event);
void hoverOverPointerEvent(const OverlayID& overlayID, const PointerEvent& event);
void hoverLeavePointerEvent(const OverlayID& overlayID, const PointerEvent& event);
signals:
/**jsdoc
* Triggered when an overlay is deleted.
@ -755,6 +748,14 @@ private:
OverlayID _currentHoverOverOverlayID { UNKNOWN_OVERLAY_ID };
RayToOverlayIntersectionResult findRayIntersectionForMouseEvent(PickRay ray);
private slots:
void mousePressPointerEvent(const OverlayID& overlayID, const PointerEvent& event);
void mouseMovePointerEvent(const OverlayID& overlayID, const PointerEvent& event);
void mouseReleasePointerEvent(const OverlayID& overlayID, const PointerEvent& event);
void hoverEnterPointerEvent(const OverlayID& overlayID, const PointerEvent& event);
void hoverOverPointerEvent(const OverlayID& overlayID, const PointerEvent& event);
void hoverLeavePointerEvent(const OverlayID& overlayID, const PointerEvent& event);
};
#endif // hifi_Overlays_h

View file

@ -87,4 +87,10 @@ namespace render {
template <> const ShapeKey shapeGetShapeKey(const Overlay::Pointer& overlay) {
return overlay->getShapeKey();
}
template <> uint32_t metaFetchMetaSubItems(const Overlay::Pointer& overlay, ItemIDs& subItems) {
return overlay->fetchMetaSubItems(subItems);
}
}

View file

@ -18,8 +18,9 @@
QString const Shape3DOverlay::TYPE = "shape";
Shape3DOverlay::Shape3DOverlay(const Shape3DOverlay* Shape3DOverlay) :
Volume3DOverlay(Shape3DOverlay)
Shape3DOverlay::Shape3DOverlay(const Shape3DOverlay* shape3DOverlay) :
Volume3DOverlay(shape3DOverlay),
_shape(shape3DOverlay->_shape)
{
}

Some files were not shown because too many files have changed in this diff.