[CV] OpenGL and Qt: Video as a Texture


Because it was quite a pain to find out exactly how to use the Qt video decoder to get the frames of a video and use them as a texture in OpenGL, I will show it here. It is much easier than you might think. It took me quite a while, because I found nothing useful on the net and was new to Qt.

First, let's take a look at the Qt classes we will use:

  • QMediaPlayer (plays/decodes our movie)
  • QAbstractVideoSurface (receives the frames)
  • QMediaPlaylist (contains our playlist)

The first example is taken almost directly from the Qt documentation; it shows how QMediaPlayer is used:
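A minimal sketch of that setup, assuming Qt 5 with the multimedia and multimediawidgets modules; the file name "movie.mp4" is a placeholder:

```cpp
#include <QApplication>
#include <QMediaPlayer>
#include <QMediaPlaylist>
#include <QVideoWidget>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    // The playlist holds the media, the player decodes it.
    QMediaPlayer *player = new QMediaPlayer;
    QMediaPlaylist *playlist = new QMediaPlaylist(player);
    playlist->addMedia(QUrl::fromLocalFile("movie.mp4")); // placeholder path
    playlist->setPlaybackMode(QMediaPlaylist::Loop);
    player->setPlaylist(playlist);

    // Frames go straight into the widget -- no access to them yet.
    QVideoWidget *videoWidget = new QVideoWidget;
    player->setVideoOutput(videoWidget);

    videoWidget->show();
    player->play();

    return app.exec();
}
```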

That is pretty much straightforward; nothing to explain here. It just shows the video in the QVideoWidget. To access the frames one by one, we need to set our own video output.

QAbstractVideoSurface has two methods which we need to implement. The first one is supportedPixelFormats(), which returns the pixel formats our video surface can handle. The second one, present(), is the interesting part: it is called every time QMediaPlayer has decoded a new frame of the video. The frame arrives as a QVideoFrame and has some information attached, like pixel format, metadata, and dimensions.
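A minimal sketch of such a surface, assuming we only want plain RGB frames in CPU memory (the class name VideoFrameSurface and the stored-frame approach are my own choices, not from the original post):

```cpp
#include <QAbstractVideoSurface>
#include <QVideoFrame>

// Minimal video surface that accepts RGB32/ARGB32 frames in system memory.
class VideoFrameSurface : public QAbstractVideoSurface
{
public:
    // Tell QMediaPlayer which pixel formats this surface can handle.
    QList<QVideoFrame::PixelFormat> supportedPixelFormats(
        QAbstractVideoBuffer::HandleType type) const override
    {
        if (type == QAbstractVideoBuffer::NoHandle)
            return { QVideoFrame::Format_RGB32, QVideoFrame::Format_ARGB32 };
        return {};
    }

    // Called by QMediaPlayer for every decoded frame.
    bool present(const QVideoFrame &frame) override
    {
        if (!frame.isValid())
            return false;
        m_currentFrame = frame;   // keep the latest frame for the texture upload
        return true;
    }

private:
    QVideoFrame m_currentFrame;
};
```

It is hooked up by replacing the widget with the surface: `player->setVideoOutput(&surface);`.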

To access the image inside the QVideoFrame we need to map its contents by calling the map() method. This makes the image data available in memory, so we can read from it. While the frame is mapped, we can access the raw image data with the bits() method. After using this data we need to unmap() the frame. So that is all the magic for now.
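The map/bits/unmap dance could look like this (a sketch; the function name readFramePixels is mine, and the frame is taken by value because mapping needs a non-const QVideoFrame):

```cpp
#include <QVideoFrame>

// Sketch: read the raw pixel data out of a decoded frame.
void readFramePixels(QVideoFrame frame)
{
    if (!frame.map(QAbstractVideoBuffer::ReadOnly))
        return;   // mapping can fail, e.g. for GPU-backed frames

    const uchar *pixels = frame.bits();    // raw image data, row by row
    int stride = frame.bytesPerLine();     // may be larger than width * bytes per pixel
    int width  = frame.width();
    int height = frame.height();

    // ... copy `pixels` somewhere, e.g. into an OpenGL texture ...
    Q_UNUSED(pixels); Q_UNUSED(stride); Q_UNUSED(width); Q_UNUSED(height);

    frame.unmap();   // always release the mapping when done
}
```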

Finally, we update the OpenGL texture with the mapped frame data.
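A sketch of the upload, assuming a texture was created earlier with glTexImage2D using matching dimensions (textureId and updateTexture are illustrative names, not from the original post):

```cpp
#include <QOpenGLFunctions>
#include <QVideoFrame>

// Sketch: upload a decoded RGB32 frame into an existing GL texture.
// Assumes a current GL context and a texture of matching size.
void updateTexture(QOpenGLFunctions *gl, GLuint textureId, QVideoFrame frame)
{
    if (!frame.map(QAbstractVideoBuffer::ReadOnly))
        return;

    gl->glBindTexture(GL_TEXTURE_2D, textureId);
    // Qt's Format_RGB32 / Format_ARGB32 are laid out as BGRA in memory
    // on little-endian systems, hence GL_BGRA here (desktop GL).
    gl->glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0,
                        frame.width(), frame.height(),
                        GL_BGRA, GL_UNSIGNED_BYTE, frame.bits());
    frame.unmap();
}
```

Calling this once per presented frame keeps the texture in sync with the video.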

This post has been migrated from my old blog, because it was referenced often.
