This repository has been archived by the owner on Jul 22, 2024. It is now read-only.

Firefox Reality internals: graphics pipeline

Randall E. Barker edited this page Jul 6, 2020 · 5 revisions

This document describes the graphics pipeline on the Android-based, self-contained VR platforms supported by Firefox Reality.

It focuses specifically on the pipeline from the GeckoView browser component to the HMD eye buffer.

1. Receiving the WebView output from Gecko

Gecko is the core web engine in Firefox; on Android-based platforms it is used via the GeckoView API. This is very similar to the WebView APIs common on mobile platforms (e.g. iOS UIWebView / WKWebView or Android WebView). Instead of rendering into a traditional iOS UIView or Android View, Firefox Reality exposes a render Surface to GeckoView: https://github.com/MozillaReality/FirefoxReality/blob/377cf2254318823145c1b69146579f49ddf73ba6/app/src/common/shared/org/mozilla/vrbrowser/ui/widgets/WindowWidget.java#L316. Firefox Reality uses SurfaceTextures on Android, which can be shared across processes (the Firefox Reality main scene and the GeckoView web content run in different processes). The active GeckoView tab renders all of its contents directly to this exposed Surface.
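Assuming GeckoView's GeckoSession/GeckoDisplay API, the Surface hand-off can be sketched roughly as follows. This is a simplified sketch, not the actual WindowWidget code; the class and method names inside `BrowserSurfaceBinder` are hypothetical, and it assumes a GL texture name has already been generated on the render thread.

```java
// Hedged sketch: handing a SurfaceTexture-backed Surface to GeckoView
// instead of a regular Android View.
import android.graphics.SurfaceTexture;
import android.view.Surface;
import org.mozilla.geckoview.GeckoDisplay;
import org.mozilla.geckoview.GeckoSession;

class BrowserSurfaceBinder {
    private SurfaceTexture mSurfaceTexture;
    private Surface mSurface;
    private GeckoDisplay mDisplay;

    void bind(GeckoSession session, int glTextureName, int width, int height) {
        // The SurfaceTexture wraps a GL_TEXTURE_EXTERNAL_OES texture; frames
        // produced by Gecko (in another process) become available through it.
        mSurfaceTexture = new SurfaceTexture(glTextureName);
        mSurfaceTexture.setDefaultBufferSize(width, height);
        mSurface = new Surface(mSurfaceTexture);

        // GeckoView renders the active tab directly into this Surface.
        mDisplay = session.acquireDisplay();
        mDisplay.surfaceChanged(mSurface, width, height);
    }
}
```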

2. Rendering the Android Surface with GeckoView contents to the Eye Buffer

The Android Surface (with contents from GeckoView) is a normal quad that is brought into the Firefox Reality main scene via a textured-quad render operation in OpenGL (using GL_TEXTURE_EXTERNAL_OES instead of GL_TEXTURE_2D). The quad is rendered into the Eye Buffer FBO exposed by the HMD SDKs. The texture is updated each frame by calling surfaceTexture->updateTexImage(). The Android OS handles the synchronization automatically: updateTexImage does nothing if no new content is available, and latches the new buffer if GeckoView has rendered new contents.
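The per-frame update and draw can be sketched as below. This is an illustrative sketch, not Firefox Reality code (the actual renderer is C++ using the vrb library); it assumes the GL context that owns the external texture is current on this thread and that a shader program with a `samplerExternalOES` uniform has already been compiled.

```java
// Hedged sketch of the per-frame browser-quad render.
import android.graphics.SurfaceTexture;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;

class QuadRenderer {
    void drawBrowserQuad(SurfaceTexture browserTexture, int externalTextureName,
                         int program, int quadVertexCount) {
        // Latch the newest frame from Gecko, if any. If GeckoView has not
        // rendered new content, the previous frame keeps being sampled.
        browserTexture.updateTexImage();

        GLES20.glUseProgram(program);
        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        // External textures are bound as GL_TEXTURE_EXTERNAL_OES,
        // not GL_TEXTURE_2D.
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, externalTextureName);
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, quadVertexCount);
    }
}
```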

On the Oculus platform the rendering stack is further optimized through TimeWarp layers. Firefox Reality directly exposes the swapChain surface of an Oculus layer to GeckoView. The surface is created here: https://github.com/MozillaReality/FirefoxReality/blob/377cf2254318823145c1b69146579f49ddf73ba6/app/src/oculusvr/cpp/DeviceDelegateOculusVR.cpp#L303. Implementing layers this way obviates any extra blits: previously the texture was sampled twice (SurfaceTexture → Eye Buffer FBO → VRCompositor distortion), whereas with the TimeWarp Layers API the Oculus VRCompositor samples the GeckoView SurfaceTexture directly, once.

3. Rendering a video media stream

A video media stream is rendered directly by Gecko onto the same SurfaceTexture outlined in the previous steps. Gecko renders a video to the same surface used for the web contents, both when the video is embedded in a page (e.g. inside an iframe) and when the video is in fullscreen mode.

In Firefox Reality, when the user selects a video projection from a fullscreened web page, the following happens:

  1. Resize the window so that it exactly matches the video size in pixels. For example, for a fullscreen window size of 1600x1200 and a video size of 1280x720, the window would be resized to 1280x720; if the video were 4K, the window (and its TimeWarp layer surface) would be resized to 4K. This is done to get the best quality for the video and to avoid any scaling in the process, which could cause aliasing issues. The resize happens only once, when the VR video presentation starts; it is not an OpenGL scale but a Surface resize. It's done in https://github.com/MozillaReality/FirefoxReality/blob/377cf2254318823145c1b69146579f49ddf73ba6/app/src/common/shared/org/mozilla/vrbrowser/ui/widgets/NavigationBarWidget.java#L507
  2. The browser Surface is mapped to a 360°/180° sphere instead of a quad. On Oculus, Firefox Reality reuses the same swapChain used for quad mode, saving swapChain creation overhead. The reuse is implemented in https://github.com/MozillaReality/FirefoxReality/blob/377cf2254318823145c1b69146579f49ddf73ba6/app/src/oculusvr/cpp/DeviceDelegateOculusVR.cpp#L540
  3. The Surface is mapped to the sphere using texture coordinates appropriate for the selected 360°/180° projection.
  4. Finally, Firefox Reality hooks into media events to control videos and detect playback changes (https://github.com/MozillaReality/FirefoxReality/blob/main/app/src/common/shared/org/mozilla/vrbrowser/browser/Media.java), but these perform no transformations on the video stream texture, only playback state changes. The Gecko-side implementation can be seen in this pull request: https://phabricator.services.mozilla.com/D9026.
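The resize policy in step 1 is deliberately trivial: the target surface size is simply the video's pixel size, so the video texture is sampled 1:1 with no scaling. A minimal sketch, where `SurfaceSize` is a hypothetical helper and not a class from the Firefox Reality sources:

```java
// Hedged sketch of the step-1 resize policy: the window (and its layer
// surface) is resized to exactly the video's pixel dimensions.
public class SurfaceSize {
    public final int width;
    public final int height;

    public SurfaceSize(int width, int height) {
        this.width = width;
        this.height = height;
    }

    // For VR video presentation the target surface size is simply the
    // video size; the previous window size is discarded entirely.
    public static SurfaceSize forVideo(int videoWidth, int videoHeight) {
        return new SurfaceSize(videoWidth, videoHeight);
    }
}
```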
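Conceptually, the sphere mapping in steps 2-3 for a 360° equirectangular video assigns each direction on the sphere a (u, v) coordinate on the browser SurfaceTexture. The following is an illustrative sketch of that mapping only; the actual sphere geometry in Firefox Reality is built in C++, and this helper class is hypothetical:

```java
// Hedged sketch: equirectangular mapping from sphere angles to texture
// coordinates, as used conceptually for 360° video.
public class Equirect {
    // longitude in [-PI, PI] (around the viewer),
    // latitude in [-PI/2, PI/2] (pole to pole);
    // returns {u, v}, each in [0, 1].
    public static double[] uv(double longitude, double latitude) {
        double u = (longitude + Math.PI) / (2.0 * Math.PI);
        double v = (latitude + Math.PI / 2.0) / Math.PI;
        return new double[] { u, v };
    }
}
```

For a 180° projection only half of the longitude range would be covered, which is why the same Surface can back both modes with different texture coordinates.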

4. Video pipeline internals of GeckoView

This part of the pipeline is a black box for Firefox Reality. As outlined above, Firefox Reality simply uses the video contents available on the WebView surface; no extra operations are performed other than mapping it to a sphere with the correct texture coordinates.

Further documentation on this part of the pipeline will be added at a later date.
