For the past few days I’ve been tackling a particularly sticky problem in the Android app I’m working on: rendering an Android WebView to an OpenGL texture, so that the web render can move around willy-nilly in OpenGL space, opening the door to all kinds of fancy effects.

Originally I figured this was a common problem that people had tackled before, that a multitude of solutions existed, and that all I had to do was pick one. However, Google failed me, and if StackOverflow were to be believed, the problem had no solution.

In this case, though, I simply could not give up on the problem and had to find a solution. And find one I did.

Originally I was going to follow the advice given in the stack answer above and render the view to a Bitmap (a task which, owing to screenshot functionality, is well researched and documented), then pipe the Bitmap to an OpenGL texture using glTexImage2D. However, as one can imagine, for large webpages this becomes infeasible, especially when you want your OpenGL thread to run in real time.
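For reference, that naive approach looks something like the sketch below. This assumes a `webView` and a `textureId` already exist; `GLUtils.texImage2D` is Android’s convenience wrapper around `glTexImage2D`:

```java
// Naive approach: draw the View into a Bitmap, then upload the entire
// Bitmap to the GL texture -- a full-size copy on every frame.
Bitmap bitmap = Bitmap.createBitmap(webView.getWidth(), webView.getHeight(),
        Bitmap.Config.ARGB_8888);
Canvas canvas = new Canvas(bitmap);
webView.draw(canvas); // render the View into the Bitmap

GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0); // expensive upload
bitmap.recycle();
```

The bitmap allocation and the texture upload both scale with the size of the page, which is exactly what kills real-time performance.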

It turns out that, after much late-night, bleary-eyed Google searching, there is in fact a solution, and a very elegant one at that. However, it is only available from API level 15. This was acceptable to me since we were targeting mainly Ice Cream Sandwich phones anyway.

There are two magic classes that provided me with what I wanted: Surface and SurfaceTexture. I would consider both to be very poorly documented, and the documentation would have you believe they are quite limited in their usage, but they are actually quite powerful. For example, SurfaceTexture’s documentation states: “The image stream may come from either camera preview or video decode.” This made me think no other source was possible, but, as I describe below, you can actually connect a Surface to a SurfaceTexture, which opens up your possibilities a whole lot more.

The SurfaceTexture is basically your entry point into the OpenGL layer. It is initialised with an OpenGL texture id and performs all of its rendering onto that texture.
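A minimal sketch of that entry point looks like this (it must run on a thread with a current OpenGL context; `width` and `height` are assumed). One caveat the documentation buries: SurfaceTexture streams its frames into a GL_TEXTURE_EXTERNAL_OES target, not GL_TEXTURE_2D, so the fragment shader that samples it needs a `samplerExternalOES` uniform:

```java
// Generate a texture id and hand it to a SurfaceTexture.
int[] textures = new int[1];
GLES20.glGenTextures(1, textures, 0);
int textureId = textures[0];

// SurfaceTexture renders into an external-OES texture, not GL_TEXTURE_2D.
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
        GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
        GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

SurfaceTexture surfaceTexture = new SurfaceTexture(textureId);
surfaceTexture.setDefaultBufferSize(width, height); // API 15+
```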

The Surface class provides the abstraction required to perform View drawing onto hardware. It provides a Canvas object (through the lockCanvas method), which is basically what an Android View uses for all its drawing (View’s onDraw() method actually takes in a Canvas).
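Drawing through a Surface follows a lock/draw/post cycle, roughly like this sketch (`surface` and `view` are assumed to exist):

```java
// Lock the Surface's Canvas, draw into it, then post the frame.
Canvas canvas = surface.lockCanvas(null); // null = whole surface is dirty
try {
    canvas.drawColor(Color.WHITE); // clear first; buffer contents are undefined
    view.draw(canvas);             // any View can paint itself onto a Canvas
} finally {
    surface.unlockCanvasAndPost(canvas); // queue the frame for the consumer
}
```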

As it turns out, the Surface class takes in a SurfaceTexture in its constructor. As a side note, this makes sense, as Android performs all its native drawing in OpenGL anyway. It’s why I refused to believe there wasn’t a solution to this problem.

And that’s basically it. The steps to render your view to OpenGL:

  1. Initialise an OpenGL texture
  2. Within an OpenGL context construct a SurfaceTexture with the texture id. Use SurfaceTexture.setDefaultBufferSize(int width, int height) to make sure you have enough space on the texture for the view to render.
  3. Create a Surface constructed with the above SurfaceTexture.
  4. Within the View’s onDraw, use the Canvas returned by Surface.lockCanvas to do the view drawing, then call Surface.unlockCanvasAndPost to publish the frame. You can obviously do this with any View, and not just WebView. Plus Canvas has a whole bunch of drawing methods, allowing you to do funky, funky things.
  5. On the OpenGL thread, call SurfaceTexture.updateTexImage to latch the most recent frame onto the texture before drawing with it.
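The steps above can be sketched as a WebView subclass that redirects its drawing into the Surface. The names here are illustrative, and the GL-thread setup is assumed to have produced the `surface` already:

```java
// Sketch: a WebView that paints into a Surface instead of the view hierarchy.
public class GLWebView extends WebView {
    private Surface surface; // created from the SurfaceTexture on the GL side

    public GLWebView(Context context) {
        super(context);
    }

    public void setSurface(Surface surface) {
        this.surface = surface;
    }

    @Override
    protected void onDraw(Canvas unused) {
        if (surface == null) return;
        Canvas canvas = surface.lockCanvas(null);
        try {
            super.onDraw(canvas); // WebView renders itself onto our Canvas
        } finally {
            surface.unlockCanvasAndPost(canvas); // publish to the SurfaceTexture
        }
    }
}
```

On the OpenGL side, a call to SurfaceTexture.updateTexImage() once per render loop picks up whatever frame the WebView posted most recently.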

Hope this helps.