Lesson 35 - Playing Movie Files In WebGL

 

Introduction

This lesson is pretty simple as three.js does virtually all of the work. Basically, the video is loaded into a standard HTML5 video element. Each frame of the video is then drawn onto a canvas, and that canvas serves as the source of a three.js Texture which is refreshed on every frame, thereby rendering the video.

Initializing The Video

The first step is to load the video.

video = document.createElement( 'video' );
// video.type = ' video/ogg; codecs="theora, vorbis" ';
video.src = "videos/Big_Buck_Bunny_Trailer_400p.ogv";
video.load();
video.play();
video.loop = true;

This simply creates the HTML5 video element and tells it to play. Of course, playing it could be delayed or triggered by some UI event, but for this demo we just start it. Note that the type of video is detected automatically so we don't have to set the type explicitly, but you can if you need to.
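If you do need to choose a format explicitly, the canPlayType() method of the video element reports whether the browser is likely to handle a given MIME type and codec string. Here is a minimal sketch of that idea; the .mp4 file name is just a hypothetical alternate encoding, not part of the lesson's sources:

// pick a source the browser says it can play
video = document.createElement( 'video' );

if ( video.canPlayType( 'video/ogg; codecs="theora, vorbis"' ) !== '' ) {
    video.src = "videos/Big_Buck_Bunny_Trailer_400p.ogv";
}
else if ( video.canPlayType( 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"' ) !== '' ) {
    video.src = "videos/Big_Buck_Bunny_Trailer_400p.mp4";    // hypothetical alternate encoding
}

video.load();
video.loop = true;
video.play();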

Then we create an HTML5 canvas onto which we will render the video. Note that the size of the canvas has to match the video's aspect ratio or distortion will occur.

videoCanvas = document.createElement( 'canvas' );
videoCanvas.width = 720;
videoCanvas.height = 400;
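If you'd rather not hard-code the dimensions, you can wait for the video's loadedmetadata event and size the canvas from the video's intrinsic dimensions. A small sketch, not part of the lesson's source:

video.addEventListener( 'loadedmetadata', function () {
    // videoWidth and videoHeight are the intrinsic dimensions of the loaded video
    videoCanvas.width  = video.videoWidth;
    videoCanvas.height = video.videoHeight;
} );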

Next we get the 2D drawing context of that canvas and fill it with a background color, just in case the video can't load. We could also have set the poster attribute instead of, or in addition to, the background color. See more info about the video element here.

videoCanvasContext = videoCanvas.getContext( '2d' );
videoCanvasContext.fillStyle = '#000000';
videoCanvasContext.fillRect( 0, 0, videoCanvas.width, videoCanvas.height );
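The poster attribute works just as it does in ordinary HTML: the browser shows the given image until the first frame of the video is available. A one-line sketch; the image path is a hypothetical placeholder:

video.poster = "images/bunny_poster.jpg";    // hypothetical image shown until playback starts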

We then create the texture, passing the video "canvas" as its source, and create the material using that texture. Note that we make the material double-sided, which wouldn't normally be needed, but we do it here just for amusement.

videoTexture = new THREE.Texture( videoCanvas );
videoTexture.minFilter = THREE.LinearFilter;
videoTexture.magFilter = THREE.LinearFilter;

var videoMaterial = new THREE.MeshBasicMaterial( { map: videoTexture, side:THREE.DoubleSide } );

Finally, we create a THREE.PlaneGeometry onto which we map the material as our "movie screen". Again, the dimensions of the "screen" have to match the aspect ratio of the video or it will naturally be distorted.

var movieGeometry = new THREE.PlaneGeometry( 21.6, 12, 4, 4 );
movieScreen = new THREE.Mesh( movieGeometry, videoMaterial );
movieScreen.position.set(0,0,0);
gfxScene.add(movieScreen);
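The 21.6 by 12 dimensions are just the video's 720 by 400 aspect ratio (1.8) scaled to a convenient world size. If you compute the width from the canvas instead of hard-coding it, the screen stays undistorted even if you swap in a different video. A small sketch, where the screenHeight value is an arbitrary choice:

var screenHeight = 12;
var screenWidth  = screenHeight * ( videoCanvas.width / videoCanvas.height );    // 12 * 1.8 = 21.6
var movieGeometry = new THREE.PlaneGeometry( screenWidth, screenHeight, 4, 4 );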

Rendering the Video

But wait! Nothing has actually been rendered yet. That happens in the animation handler:

if ( video.readyState === video.HAVE_ENOUGH_DATA ) {
    videoCanvasContext.drawImage( video, 0, 0 );
    if ( videoTexture )
        videoTexture.needsUpdate = true;
}

yRotation += 0.01;
movieScreen.rotation.set(0, yRotation, 0);

Basically, we just keep copying the current video frame onto our video canvas at each frame refresh. We have to check for enough data, and that the videoTexture exists, because our animation handler may be called before the video has finished loading.
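For context, here is roughly what the whole animation handler looks like when driven by requestAnimationFrame. The renderer, scene and camera names below are placeholders for whatever the lesson's gfxScene wrapper manages internally:

function animateScene() {
    requestAnimationFrame( animateScene );

    // copy the current video frame onto the canvas backing the texture
    if ( video.readyState === video.HAVE_ENOUGH_DATA ) {
        videoCanvasContext.drawImage( video, 0, 0 );
        if ( videoTexture )
            videoTexture.needsUpdate = true;
    }

    // spin the movie screen, just for fun
    yRotation += 0.01;
    movieScreen.rotation.set( 0, yRotation, 0 );

    // render the scene; in the lesson this is handled by the gfxScene wrapper
    renderer.render( scene, camera );
}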

And that's it! Click on this link to see the actual rendered demo in all its moving glory!

As always, the original sources are on github here.
