For the past few days I’ve been tackling a particularly sticky problem in the Android app I’m working on: rendering an Android WebView to an OpenGL texture, so that the web render can move around willy-nilly in OpenGL space, letting you do all kinds of fancy things with it.
Originally I figured this was a common problem that people had tackled before, that there would be a multitude of solutions, and that all I had to do was pick one. However, Google failed me, and if I were to believe StackOverflow, the problem had no solution.
In this case, though, I simply could not give up on the problem and had to find a solution, and find one I did.
Originally I was thinking of following the advice given in the StackOverflow answer above: render the view to a Bitmap (a task which, owing to screenshot functionality, is well researched and documented), and then pipe the Bitmap to an OpenGL texture using glTexImage2D. However, as one can imagine, for large webpages this becomes infeasible, especially when you want your OpenGL thread to run in real time.
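For reference, the Bitmap route looks roughly like this. This is only a sketch (the class and method names are mine, and the GL context setup is assumed to exist elsewhere); the full-texture re-upload on every frame is exactly what makes it too slow for large pages:

```java
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.opengl.GLES20;
import android.opengl.GLUtils;
import android.view.View;

// Sketch of the rejected approach: render the View into a Bitmap,
// then upload the whole Bitmap to a GL_TEXTURE_2D texture.
final class BitmapUpload {
    static void uploadViewToTexture(View view, int textureId) {
        Bitmap bitmap = Bitmap.createBitmap(
                view.getWidth(), view.getHeight(), Bitmap.Config.ARGB_8888);
        view.draw(new Canvas(bitmap));   // draw the View into the Bitmap

        // Must run on the GL thread, with a current GL context.
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
        GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
        bitmap.recycle();
    }
}
```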
It turns out that, after much late-night, bleary-eyed Google searching, there is in fact a solution, and a very elegant one at that. However, it is only available from API level 15. This was acceptable to me since we were targeting mainly Ice Cream Sandwich phones anyway.
There are two magic classes that provided me with what I wanted: Surface and SurfaceTexture. I would consider them both to be very poorly documented, and the documentation would have you believe they are quite limited in their usage, but they are actually quite powerful. For example, SurfaceTexture’s documentation states: “The image stream may come from either camera preview or video decode.” This made me think no other source was possible but, as I describe below, you can actually connect a Surface to a SurfaceTexture, and that opens up your possibilities a whole lot more.
The SurfaceTexture is basically your entry point into the OpenGL layer. It is initialised with an OpenGL texture id, and performs all of its rendering onto that texture.
The Surface class provides the abstraction required to perform View drawing onto hardware. It provides a Canvas object (through the lockCanvas method), which is basically what an Android View uses for all its drawing (the onDraw() method of View actually takes a Canvas).
As it turns out, the Surface class takes a SurfaceTexture in its constructor. As a side note, this makes sense, as Android performs all its native drawing in OpenGL anyway. It’s why I refused to believe there wasn’t a solution to this problem.
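Wiring the three pieces together looks roughly like this. A sketch, with my own class name; one detail worth flagging is that SurfaceTexture samples through the GL_TEXTURE_EXTERNAL_OES target (which in turn needs a samplerExternalOES in your fragment shader), not plain GL_TEXTURE_2D:

```java
import android.graphics.SurfaceTexture;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.view.Surface;

// Sketch: create the GL texture, hand it to a SurfaceTexture, and wrap
// that in a Surface. Must run on the GL thread with a current context.
final class ViewToGLSetup {
    int textureId;
    SurfaceTexture surfaceTexture;
    Surface surface;

    void setUp(int width, int height) {
        int[] textures = new int[1];
        GLES20.glGenTextures(1, textures, 0);
        textureId = textures[0];

        // SurfaceTexture images are sampled via the external-OES target.
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

        surfaceTexture = new SurfaceTexture(textureId);
        surfaceTexture.setDefaultBufferSize(width, height);  // API 15+
        surface = new Surface(surfaceTexture);
    }
}
```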
And that’s basically it. The steps to render your view to OpenGL:
- Initialise an OpenGL texture.
- Within an OpenGL context, construct a SurfaceTexture with the texture id. Use SurfaceTexture.setDefaultBufferSize(int width, int height) to make sure you have enough space on the texture for the view to render.
- Create a Surface constructed with the above SurfaceTexture.
- Within the View’s onDraw, use the Canvas returned by Surface.lockCanvas to do the view drawing. You can obviously do this with any View, not just WebView. Plus Canvas has a whole bunch of drawing methods, allowing you to do funky, funky things.
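The drawing step above can be sketched as follows (again, class and method names are mine, and a Surface already wired to a SurfaceTexture is assumed). Two details matter: unlockCanvasAndPost publishes the frame, and SurfaceTexture.updateTexImage must then be called on the GL thread to latch that frame into the texture before you sample it:

```java
import android.graphics.Canvas;
import android.graphics.PorterDuff;
import android.graphics.SurfaceTexture;
import android.view.Surface;
import android.view.View;

// Sketch of the per-frame drawing step, assuming `surface` wraps a
// SurfaceTexture that was constructed with the GL texture id.
final class ViewToGLDrawer {
    // Call this from the View's onDraw (or wherever you trigger a redraw).
    static void drawViewToSurface(View view, Surface surface) {
        Canvas canvas = surface.lockCanvas(null);
        try {
            canvas.drawColor(0, PorterDuff.Mode.CLEAR);  // clear last frame
            view.draw(canvas);                           // View paints itself
        } finally {
            surface.unlockCanvasAndPost(canvas);         // publish the frame
        }
    }

    // Then, on the GL thread, latch the newest frame into the texture:
    static void latchFrame(SurfaceTexture surfaceTexture) {
        surfaceTexture.updateTexImage();
    }
}
```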
Hope this helps.
Hi,
This sounds very interesting. Any chance you can share sample code?
For sample code, check this link: http://stackoverflow.com/a/29066625/1713920
Helped me a lot!! Thanks!!
So what was the cause of your problems?
The problem is getting the WebView’s Canvas and interfacing it with the SurfaceTexture/Surface.
I had the same problem. Would you mind posting the code that fixed it?
Yeah… actually I couldn’t solve it fully…
However, I’ve posted the question on StackOverflow:
http://stackoverflow.com/questions/17265009/surfacetexture-modification-through-canvas
http://stackoverflow.com/questions/19007247/how-to-record-screen-using-android-mediacodec
Great post!
I have been struggling with the same problem for a while now.
Did you eventually solve the problem? Could you please post a working example?
I was able to manage it to work and posted the working code here: https://github.com/ArtemBogush/AndroidViewToGLRendering
The solution is neat, but it doesn’t work well when there’s a video in the WebView; see http://stackoverflow.com/questions/19273437/android-draw-youtube-video-on-surfacetexture . I found this in the Android graphics documentation: “When you lock a Surface for Canvas access, the “CPU renderer” connects to the producer side of the BufferQueue and does not disconnect until the Surface is destroyed. Most other producers (like GLES) can be disconnected and reconnected to a Surface, but the Canvas-based “CPU renderer” cannot. This means you can’t draw on a surface with GLES or send it frames from a video decoder if you’ve ever locked it for a Canvas.” So, is there any workaround to render a WebView with video to an OpenGL texture?
Did you find any solution on this ?
However, it doesn’t seem to work with websites that have WebGL in them.
Or a canvas inside.