Raúl Jiménez

elecash@gmail.com

@elecash

360 & VR

VIDEO 

about me

Angular GDE

videogular

google partner

toptal partner

Let's talk about VR...

it can be scary

VR is complex

  • Need to learn a lot of maths
  • Understand how 3D works
  • Shaders, materials, etc.
  • Lights
  • Cameras
  • Raycasters

Three.js can help

import {Component, ElementRef, OnInit, Input, Output, EventEmitter} from '@angular/core';
import {VgAPI} from "../services/vg-api";
import {VgUtils} from "../services/vg-utils";
import Object3D = THREE.Object3D;
import {IHotSpot} from "./i-hot-spot";

@Component({
    selector: 'vg-360',
    template: `
        <div id="container">
            <span class="left-pointer" [style.display]="(pointer) ? 'inherit' : 'none'" [class.vr]="vr"></span>
            <span class="right-pointer" [style.display]="(pointer && vr) ? 'inherit' : 'none'"></span>
            <div id="css-container"></div>
        </div>
        <ng-content></ng-content>
    `,
    styles: [`
        :host {
            display: flex;
            align-items: center;
        }
        #container {
            width: 100%;
            height: auto;
        }
        
        #css-container {
            position: absolute;
        }
        
        .left-pointer {
            width: 6px;
            height: 6px;
            position: absolute;
            display: block;
            top: calc(50% - 3px);
            left: calc(50% - 3px);
            background-color: #FFFFFF;
            opacity: 0.5;
            z-index: 1;
            
            border-radius: 3px;
            -moz-border-radius: 3px;
            -webkit-border-radius: 3px;
        }
        
        .left-pointer.vr {
            left: calc(25% - 3px);
        }
        
        .right-pointer {
            width: 6px;
            height: 6px;
            position: absolute;
            display: block;
            top: calc(50% - 3px);
            left: calc(75% - 3px);
            background-color: #FFFFFF;
            opacity: 0.5;
            z-index: 1;
            
            border-radius: 3px;
            -moz-border-radius: 3px;
            -webkit-border-radius: 3px;
        }
    `]
})
export class Vg360 implements OnInit {
    elem:HTMLElement;
    video:any;
    api:VgAPI;

    raycaster:THREE.Raycaster;
    camera:THREE.PerspectiveCamera;
    scene:THREE.Scene;
    leftScene:THREE.Scene;
    rightScene:THREE.Scene;
    renderer:THREE.WebGLRenderer;
    leftRenderer:THREE.CSS3DRenderer;
    rightRenderer:THREE.CSS3DRenderer;
    container:any;
    cssContainer:any;
    controls:any;
    effect:any;
    intersected:any;
    objects:Array<any> = [];

    onPointerDownPointerX:number = 0;
    onPointerDownPointerY:number = 0;
    onPointerDownLon:number = 0;
    onPointerDownLat:number = 0;
    lat:number = 0;
    lon:number = 0;
    phi:number = 0;
    theta:number = 0;
    distance:number = 500;

    renderWidth:number = 1;
    renderHeight:number = 1;

    isUserInteracting:boolean = false;

    @Input('vr') vr:boolean = false;
    @Input('pointer') pointer:boolean = false;
    @Input('hotSpots') hotSpots:Array<IHotSpot>;

    @Output() onEnterHotSpot:EventEmitter<IHotSpot> = new EventEmitter();
    @Output() onLeaveHotSpot:EventEmitter<IHotSpot> = new EventEmitter();

    constructor(ref:ElementRef, api:VgAPI) {
        this.api = api;
        this.elem = ref.nativeElement;
    }

    ngOnInit() {
        this.createContainer();
        this.createScene();
        this.createHotSpots();
        this.createControls();
        this.createVR();

        this.animate();

        window.addEventListener('resize', this.onResize.bind(this));
    }

    createContainer() {
        this.container = this.elem.querySelector('#container');
        this.cssContainer = this.elem.querySelector('#css-container');
        this.video = this.elem.querySelector('video');
        this.video.onloadedmetadata = this.onLoadMetadata.bind(this);
        this.elem.removeChild(this.video);
    }
    
    createScene() {
        var texture:THREE.VideoTexture = new THREE.VideoTexture(this.video);
        texture.minFilter = THREE.LinearFilter;
        texture.format = THREE.RGBFormat;

        var geometry = new THREE.SphereBufferGeometry(500, 60, 40);
        geometry.scale(-1, 1, 1);

        var material:THREE.MeshBasicMaterial = new THREE.MeshBasicMaterial({map: texture});

        this.camera = new THREE.PerspectiveCamera(75, 16 / 9, 1, 1100);

        this.scene = new THREE.Scene();

        var mesh:THREE.Mesh = new THREE.Mesh(geometry, material);

        this.scene.add(mesh);

        this.renderer = new THREE.WebGLRenderer({alpha:true});
        this.renderer.setPixelRatio(window.devicePixelRatio);
        this.renderer.setSize(this.renderWidth, this.renderHeight);

        this.container.appendChild(this.renderer.domElement);
    }
    
    createHotSpots() {
        if (this.hotSpots) {
            var objMaterial:THREE.MeshBasicMaterial = new THREE.MeshBasicMaterial({transparent: true, opacity: 0});

            this.raycaster = new THREE.Raycaster();

            this.leftScene = new THREE.Scene();
            this.rightScene = new THREE.Scene();

            for (var i=0, l=this.hotSpots.length; i<l; i++) {
                var item = this.createCSS3DObject(this.hotSpots[i]);

                if (this.vr) {
                    this.rightScene.add(this.createCSS3DObject(this.hotSpots[i], true));
                }

                this.leftScene.add(item);

                var objGeo = new THREE.PlaneGeometry(100, 100);
                var objMesh:THREE.Mesh = new THREE.Mesh(objGeo, objMaterial);
                objMesh.position.copy(item.position);
                objMesh.rotation.copy(item.rotation);
                objMesh.scale.copy(item.scale);
                (<any>objMesh).hotSpot = this.hotSpots[i];
                this.scene.add(objMesh);

                this.objects.push(objMesh);
            }

            this.leftRenderer = new THREE.CSS3DRenderer();
            this.leftRenderer.setSize(this.renderWidth, this.renderHeight);
            this.leftRenderer.domElement.style.position = 'absolute';
            this.leftRenderer.domElement.style.top = '0';
            this.leftRenderer.domElement.style.pointerEvents = 'none';

            this.cssContainer.appendChild(this.leftRenderer.domElement);

            if (this.vr) {
                this.rightRenderer = new THREE.CSS3DRenderer();
                this.rightRenderer.setSize(this.renderWidth, this.renderHeight);
                this.rightRenderer.domElement.style.position = 'absolute';
                this.rightRenderer.domElement.style.top = '0';
                this.rightRenderer.domElement.style.left = this.renderWidth / 2 + 'px';
                this.rightRenderer.domElement.style.pointerEvents = 'none';

                this.cssContainer.appendChild(this.rightRenderer.domElement);
            }
        }
    }

    createCSS3DObject(hs:IHotSpot, clone:boolean = false):Object3D {
        var obj:THREE.CSS3DObject;

        if (clone) {
            if (hs.elementClone) {
                obj = new THREE.CSS3DObject(hs.elementClone);
            }
            else {
                obj = new THREE.CSS3DObject(hs.element.cloneNode(true));
            }
        }
        else {
            obj = new THREE.CSS3DObject(hs.element);
        }

        obj.position.set(
            hs.position.x,
            hs.position.y,
            hs.position.z
        );
        obj.rotation.x = hs.rotation.x;
        obj.rotation.y = hs.rotation.y;
        obj.rotation.z = hs.rotation.z;

        return <Object3D>obj;
    }
    
    createControls() {
        if (VgUtils.isMobileDevice()) {
            this.controls = new THREE.DeviceOrientationControls(this.camera, true);
            this.controls.update();
        }
        else {
            this.controls = new THREE.OrbitControls(this.camera, this.renderer.domElement);
            this.controls.target.set(
                this.camera.position.x + 1,
                this.camera.position.y,
                this.camera.position.z
            );

            this.camera.lookAt(new THREE.Vector3(0,180,0));

            this.controls.enableZoom = false;
        }
    }
    
    createVR() {
        if (this.vr) {
            this.effect = new THREE.CardboardEffect(this.renderer);
            this.effect.setSize(this.renderWidth, this.renderHeight);
        }
    }

    onLoadMetadata() {
        this.scaleRender();
    }

    scaleRender() {
        var scaleRatio:number = this.api.videogularElement.clientWidth / this.video.videoWidth;

        this.renderWidth = this.api.videogularElement.clientWidth;
        this.renderHeight = this.video.videoHeight * scaleRatio;

        this.renderer.setSize(this.renderWidth, this.renderHeight);

        if (this.hotSpots) {
            let w:number = this.renderWidth;

            if (this.vr) {
                w = this.renderWidth / 2;

                this.rightRenderer.setSize(w, this.renderHeight);
                this.rightRenderer.domElement.style.left = this.renderWidth / 2 + 'px';
            }

            this.leftRenderer.setSize(w, this.renderHeight);
        }

        if (this.vr) this.effect.setSize(this.renderWidth, this.renderHeight);
    }

    animate() {
        requestAnimationFrame(() => this.animate());

        this.controls.update();

        this.renderer.render(this.scene, this.camera);

        if (this.hotSpots) {
            this.leftRenderer.render(this.leftScene, this.camera);
            if (this.vr) this.rightRenderer.render(this.rightScene, this.camera);

            this.raycaster.setFromCamera(
                {
                    x: 0,
                    y: 0
                },
                this.camera
            );

            let intersections = this.raycaster.intersectObjects(this.objects);
            
            if (intersections.length) {
                if (this.intersected != intersections[0].object) {
                    this.intersected = intersections[0].object;
                    this.onEnterHotSpot.emit(<IHotSpot>((<any>intersections[0].object).hotSpot));
                }
            }
            else {
                if (this.intersected) this.onLeaveHotSpot.emit(<IHotSpot>(this.intersected.hotSpot));
                this.intersected = null;
            }
        }

        if (this.vr) this.effect.render(this.scene, this.camera);
    }

    onResize() {
        this.scaleRender();
    }
}

But it's still tough...

but we can use our new friend...

A-Frame wraps three.js and WebGL in HTML custom elements.

This enables web developers, designers, and artists to create 3D/VR scenes without having to learn WebGL’s complex low-level API.

source: A-Frame FAQ

  • A-Frame works with Custom Elements
  • Create declarative WebVR scenes
  • Support for headsets and controllers
  • You can combine it with your favourite framework!

videogular2

  • Angular 2 component-based video framework
  • Plugins like controls, ads, streaming and more
  • Cue points system to synchronize content
  • Extensible through plugins
  • Free and open source

Believe it or not, it's just...

< 225 TS LOC && < 70 HTML LOC

show me the code!

bootstrap

import { NgModule, CUSTOM_ELEMENTS_SCHEMA } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';

import { VRPlayer }  from './app.component';

import { VgCore } from 'videogular2/core';
import { VgControlsModule } from 'videogular2/controls';

@NgModule({
    imports: [ BrowserModule, VgCore, VgControlsModule ],
    declarations: [ VRPlayer ],
    bootstrap: [ VRPlayer ],
    schemas: [CUSTOM_ELEMENTS_SCHEMA]
})
export class AppModule { }

We need to add CUSTOM_ELEMENTS_SCHEMA to our NgModule so that Angular accepts A-Frame's custom elements in templates

Create a component

import { Component } from '@angular/core';

@Component({
    selector: 'vr-player',
    templateUrl: 'app.component.html',
    styleUrls: ['app.component.scss']
})
export class VRPlayer {
    currentVideo: string = 'path/to/your-video-file.mp4';

    onAframeRenderStart() {
        // Called once the A-Frame scene starts rendering
    }
}
<vg-player>
    <a-scene vr-mode-ui="enabled: true" (renderstart)="onAframeRenderStart()">
        <a-assets>
            <video [src]="currentVideo" vg-media id="video" preload="auto" crossorigin="anonymous" loop></video>
        </a-assets>

        <a-videosphere src="#video"></a-videosphere>

        <a-camera>
            <a-cursor color="#2E3A87"></a-cursor>
        </a-camera>
    </a-scene>
</vg-player>

Add some doors

<vg-player>
    <a-scene vr-mode-ui="enabled: true">
        <a-assets>
            <video [src]="currentVideo.url" vg-media id="video" preload="auto" crossorigin="anonymous" loop></video>
            <img id="ringImg" src="assets/images/ring1.png" width="512" height="512">
        </a-assets>

        <a-image
            *ngFor="let door of currentVideo.doors; let i=index"
            [id]="door.id"
            [attr.depth]="100 + i"
            [attr.position]="door.position"
            [attr.rotation]="door.rotation"
            src="#ringImg"
            scale="0 0 0"
            (mouseenter)="onMouseEnter($event, door)"
            (mouseleave)="onMouseLeave($event)">
        </a-image>

        <a-videosphere src="#video"></a-videosphere>

        <a-camera>
            <a-cursor color="#2E3A87"></a-cursor>
        </a-camera>
    </a-scene>
</vg-player>

To navigate between scenes

Create some data and event listeners

export class VRPlayer {
    currentVideo: IVideo = {
        id: 'v0',
        url: 'http://static.videogular.com/assets/videos/vr-route-0.mp4',
        doors: [
            {id: 'd1', position: '-3 2 -10', rotation: '0 0 0', goto: 'v1'}
        ]
    };
    
    videos:Array<IVideo> = [ /* ... */ ];
    
    onMouseEnter($event, door: IVrDoor) {
        // Start 2 secs timer to change to the next video
        this.timeout = TimerObservable.create(2000).subscribe(
            () => {
                this.currentVideo = this.videos.filter(v => v.id === door.goto)[0];
            }
        );
    }
    
    onMouseLeave($event) {
        // Clear timer when mouse leaves
        this.timeout.unsubscribe();
    }
}
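The IVideo and IVrDoor shapes aren't shown on the slides; a minimal sketch consistent with the template bindings (the exact names and fields are assumptions) could look like this:

```typescript
// Hypothetical interfaces inferred from the bindings above;
// the talk's real code may differ.
interface IVrDoor {
    id: string;
    position: string;   // A-Frame "x y z" string, e.g. '-3 2 -10'
    rotation: string;   // A-Frame "x y z" Euler angles in degrees
    goto: string;       // id of the IVideo to navigate to
}

interface IVideo {
    id: string;
    url: string;
    doors: IVrDoor[];
}

// Equivalent of this.videos.filter(v => v.id === door.goto)[0],
// using Array.prototype.find to avoid building a throwaway array.
function findVideo(videos: IVideo[], id: string): IVideo | undefined {
    return videos.find(v => v.id === id);
}
```

`find()` also returns `undefined` instead of throwing when the id doesn't exist, so a bad `goto` won't crash the player.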

creating animations

Define the animation and the events

<a-image
    *ngFor="let door of currentVideo.doors; let i=index"
    [id]="door.id"
    [attr.depth]="100 + i"
    [attr.position]="door.position"
    [attr.rotation]="door.rotation"
    src="#ringImg"
    scale="0 0 0"
    (mouseenter)="onMouseEnter($event, door)"
    (mouseleave)="onMouseLeave($event)"
    animation__fadein="startEvents: vgStartFadeInAnimation; property: scale; dur: 2000; to: 1 1 1"
    animation__scale="startEvents: vgStartAnimation; pauseEvents: vgPauseAnimation; property: scale; dur: 2000; from: 1 1 1; to: 2 2 2"
    animation__visibility="startEvents: vgStartAnimation; pauseEvents: vgPauseAnimation; property: material.opacity; dur: 2000; from: 1; to: 0">
</a-image>

In this example we're using the animation component library (aframe-animation-component) instead of A-Frame's built-in <a-animation>

Trigger the animations with a CustomEvent

export class VRPlayer {
    onMouseEnter($event, door:IVrDoor) {
        $event.target.dispatchEvent(new CustomEvent('vgStartAnimation'));
    
        this.timeout = TimerObservable.create(2000).subscribe(
            () => {
                this.currentVideo = this.videos.filter(v => v.id === door.goto)[0];
            }
        );
    }
    
    onMouseLeave($event) {
        $event.target.dispatchEvent(new CustomEvent('vgPauseAnimation'));
    
        // Send start and pause again to reset the scale and opacity
        $event.target.dispatchEvent(new CustomEvent('vgStartAnimation'));
        $event.target.dispatchEvent(new CustomEvent('vgPauseAnimation'));
    
        this.timeout.unsubscribe();
    }
}

You can also bind your animations!

<a-text
    *ngFor="let txt of currentVideo.texts; let i=index"
    color="#FFF"
    [id]="txt.id"
    [attr.depth]="10 + i"
    [attr.position]="txt.position"
    [attr.rotation]="txt.rotation"
    [attr.scale]="txt.scale"
    [attr.text]="txt.text"
    [attr.animation__visibility]="txt.opaAnim"
    [attr.animation__position]="txt.posAnim"
    opacity="0">
</a-text>

synchronizing content

You can use VTT tracks to synchronize data

WebVTT is the HTML5 standard for timed text tracks, including metadata tracks.

It's very similar to SRT, but a metadata cue's payload can be arbitrary text, such as JSON

WEBVTT FILE

stage-1
00:00:05.200 --> 00:00:08.800
{
    "title": "Stage 1: Altitude 1610m"
}
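Under the hood, cue points like the one above rely on the browser's standard TextTrack API; a framework-free sketch (element ids and the helper name are assumptions) of reading a metadata cue's JSON payload:

```typescript
interface CuePointData {
    title?: string;
}

// Parse the JSON payload carried in a metadata cue's text;
// fall back to an empty object on malformed input.
function parseCueText(text: string): CuePointData {
    try {
        return JSON.parse(text);
    } catch {
        return {};
    }
}

// Browser wiring (runs in a page; not shown on the slides):
// const video = document.querySelector('video#video') as HTMLVideoElement;
// const track = video.textTracks[0];   // the kind="metadata" track
// track.mode = 'hidden';               // load cues without rendering them
// track.oncuechange = () => {
//     const cue = track.activeCues?.[0] as VTTCue | undefined;
//     if (cue) console.log(parseCueText(cue.text).title);
// };
```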

Add a VTT track

<vg-scrub-bar style="bottom: 0;">
    <vg-scrub-bar-current-time></vg-scrub-bar-current-time>
    <vg-scrub-bar-buffering-time></vg-scrub-bar-buffering-time>
    <vg-scrub-bar-cue-points [cuePoints]="metadataTrack.cues"></vg-scrub-bar-cue-points>
</vg-scrub-bar>

<div class="title" [ngClass]="{ 'hide': hideTitle }">{{ cuePointData.title }}</div>

<a-assets>
    <video [src]="currentVideo.url" vg-media id="video" preload="auto" crossorigin="anonymous" loop>
        <track [src]="currentVideo.track" kind="metadata" label="Cue Points" default
               #metadataTrack
               vgCuePoints
               (onEnterCuePoint)="onEnterCuePoint($event)"
               (onExitCuePoint)="onExitCuePoint($event)">
    </video>
    <img id="ringImg" src="assets/images/ring1.png" width="512" height="512">
</a-assets>

We've also added a scrub bar to display the current time, buffering and the cue points

Listen to the events

onEnterCuePoint($event) {
    this.hideTitle = false;
    this.cuePointData = JSON.parse($event.text);
}

onExitCuePoint($event) {
    this.hideTitle = true;

    // wait transition
    TimerObservable.create(500).subscribe(
        () => { this.cuePointData = {}; }
    );
}

$event.text contains the cue's JSON payload as plain text; you can parse it with JSON.parse()

summary

  • A-Frame will help you a lot
  • Angular and A-Frame work pretty well together
  • Use Videogular if you need some advanced features
  • Animations might be tricky, try some of the libraries out there

Links

Start today! It's fun!

make awesome things...

thank you!

360 and VR Video

By Raúl Jiménez

360 and VR Video for my talk at JS-Kongress.