[CV] OpenGL and Qt: Video as a Texture

Welcome,

Because it was a pain to find out exactly how to use the Qt video decoder to get the frames of a video and use them as a texture in OpenGL, I will show it here. It is much easier than you might think. It took me quite a while, because I found nothing useful on the net and was new to Qt.

First, let's take a look at the Qt classes we will use:

  • QMediaPlayer (plays/decodes our movie)
  • QAbstractVideoSurface (receives the frames)
  • QMediaPlaylist (contains our playlist)

The first snippet comes directly from the Qt documentation and shows how QMediaPlayer is used:

playlist = new QMediaPlaylist;
playlist->addMedia(QUrl("https://example.com/movie1.mp4"));
playlist->addMedia(QUrl("https://example.com/movie2.mp4"));
playlist->addMedia(QUrl("https://example.com/movie3.mp4"));
playlist->setCurrentIndex(1);

player = new QMediaPlayer;
player->setPlaylist(playlist);

videoWidget = new QVideoWidget;
player->setVideoOutput(videoWidget);
videoWidget->show();

player->play();

That is pretty straightforward, nothing to explain here. It simply shows the video in the QVideoWidget. To access the frames one by one, we need to set our own video output.

QAbstractVideoSurface has two methods we need to implement. The first one is supportedPixelFormats(), which returns the pixel formats our video surface can handle. The second one, present(), is the interesting part: it is called every time the QMediaPlayer has decoded a new frame of the video. This frame is a QVideoFrame and carries some information, like pixel format, metadata, and dimensions.

To access the image inside the QVideoFrame, we need to map its contents by calling the map() method. This makes the image data available in memory so we can read from it. While the frame is mapped, we can access the raw image data with the bits() method. After using this data, we need to unmap() the frame. That is all the magic for now.
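One detail worth knowing: the mapped data is not necessarily tightly packed. bytesPerLine() can be larger than width * 4 because of row padding, so it is safer to copy the pixels row by row. Here is a minimal sketch of that pattern, using a hypothetical MappedFrame struct as a stand-in for a mapped QVideoFrame:

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Hypothetical stand-in for a mapped QVideoFrame: a pixel pointer,
// dimensions, and a stride (bytesPerLine) that may exceed width * 4.
struct MappedFrame {
    const uint8_t* bits;
    int width, height, bytesPerLine;
};

// Copy the mapped pixels into a tightly packed ARGB buffer,
// honoring the per-row stride of the source.
std::vector<uint8_t> copyFramePixels(const MappedFrame& f)
{
    std::vector<uint8_t> out(static_cast<size_t>(f.width) * f.height * 4);
    for (int y = 0; y < f.height; ++y)
        std::memcpy(out.data() + static_cast<size_t>(y) * f.width * 4,
                    f.bits + static_cast<size_t>(y) * f.bytesPerLine,
                    static_cast<size_t>(f.width) * 4);
    return out;
}
```

If you memcpy the whole buffer in one go (as in the pseudocode below), it only works when bytesPerLine() happens to equal width * 4.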

Pseudo Code: Class derived from QAbstractVideoSurface

explicit VideoFrameSurface(const QString& filename, QObject *parent = 0)
	: QAbstractVideoSurface(parent)
{
	// Placeholder image until the first frame arrives
	QImage image(100, 100, QImage::Format_ARGB32);
	image.fill(Qt::red);

	MyImageBuffer newBuffer; // ARGB buffer
	memcpy(newBuffer.getData(), image.bits(), newBuffer.getSizeInBytes());

	auto playlist = new QMediaPlaylist(this);
	playlist->setPlaybackMode(QMediaPlaylist::Loop);
	playlist->addMedia(QUrl::fromLocalFile(filename));

	mediaPlayer = new QMediaPlayer(this);
	mediaPlayer->setVideoOutput(this);
	mediaPlayer->setPlaylist(playlist);
	mediaPlayer->setMuted(true);
	mediaPlayer->play();
}

bool VideoFrameSurface::present(const QVideoFrame& frame)
{
	if(frame.isValid())
	{
		QVideoFrame videoFrame(frame); // shallow copy we are allowed to map

		if(videoFrame.map(QAbstractVideoBuffer::ReadOnly))
		{
			MyImageBuffer newBuffer(videoFrame.width(), videoFrame.height(), false);
			memcpy(newBuffer.getData(), videoFrame.bits(), newBuffer.getSizeInBytes());

			videoFrame.unmap(); // only unmap if map() succeeded

			{
				std::lock_guard<std::mutex> lock(d->lock);
				currentBuffer = newBuffer;
			}
		}
	}

	return true;
}
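The lock_guard in present() matters because present() runs on the media player's thread, while the renderer reads the buffer from another thread. The handoff can be sketched in plain C++ like this (FrameStore and its byte-vector "frame" are hypothetical stand-ins, not Qt API):

```cpp
#include <cstdint>
#include <mutex>
#include <utility>
#include <vector>

// Minimal sketch of the handoff used in present(): the decoder thread
// publishes the newest frame under a mutex; the render thread copies
// it out under the same mutex.
struct FrameStore {
    std::mutex lock;
    std::vector<uint8_t> current;   // latest decoded frame

    void publish(std::vector<uint8_t> frame) {
        std::lock_guard<std::mutex> g(lock);
        current = std::move(frame);
    }

    std::vector<uint8_t> take() {
        std::lock_guard<std::mutex> g(lock);
        return current;             // copy out while holding the lock
    }
};
```

Both sides only hold the mutex for the duration of a buffer swap or copy, so neither the decoder nor the renderer blocks the other for long.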
QList<QVideoFrame::PixelFormat> VideoFrameSurface::supportedPixelFormats(QAbstractVideoBuffer::HandleType handleType) const
{
	// Return the formats you will support
	return QList<QVideoFrame::PixelFormat>()
		<< QVideoFrame::Format_RGB32
		<< QVideoFrame::Format_ARGB32;
}
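Why do these formats pair with GL_BGRA in the texture upload below? QImage::Format_ARGB32 stores each pixel as a 0xAARRGGBB 32-bit value, so on a little-endian machine the in-memory byte order is B, G, R, A. A quick check of that assumption:

```cpp
#include <cstdint>
#include <cstring>

// QImage::Format_ARGB32 stores each pixel as a 0xAARRGGBB 32-bit value.
// On a little-endian machine the in-memory byte order is therefore
// B, G, R, A -- exactly what OpenGL's GL_BGRA upload format expects.
bool argb32MatchesBgraBytes()
{
    const uint32_t pixel = 0x80FF4020u;  // A=0x80, R=0xFF, G=0x40, B=0x20
    uint8_t bytes[4];
    std::memcpy(bytes, &pixel, 4);
    return bytes[0] == 0x20 && bytes[1] == 0x40 &&
           bytes[2] == 0xFF && bytes[3] == 0x80;
}
```

On a big-endian machine this would not hold, and you would need a different upload format or a swizzle.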

Now for updating the OpenGL texture:

VideoTexture::VideoTexture(QGLWidget* context, QString file)
{
	this->context = context;
	context->context()->makeCurrent();
	initializedTexture = false;
	frameProcessor = new FrameProcessor();
	QMediaPlaylist* playlist = new QMediaPlaylist();
	playlist->setPlaybackMode(QMediaPlaylist::PlaybackMode::Loop);
	playlist->addMedia(QUrl::fromLocalFile(file));
	player.setVideoOutput(frameProcessor);
	player.setPlaylist(playlist);
	player.setMuted(true);
	connect(frameProcessor, SIGNAL(frameUpdated(QVideoFrame)), this, SLOT(updateTexture(QVideoFrame)));
	player.play();
}


VideoTexture::~VideoTexture(void)
{
	context->deleteTexture(texture);
	delete frameProcessor;
}

void VideoTexture::updateTexture(QVideoFrame frame)
{
	if (frame.isMapped())
	{
		if (initializedTexture)
		{
			context->context()->makeCurrent();
			glBindTexture(GL_TEXTURE_2D, texture);
			// Re-upload the new frame; ARGB32 data matches GL_BGRA byte order
			glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, frame.width(), frame.height(), 0, GL_BGRA, GL_UNSIGNED_BYTE, frame.bits());
		}
		else
		{
			// First frame: create the texture from a QImage wrapper around the frame data
			QImage::Format format = QVideoFrame::imageFormatFromPixelFormat(frame.pixelFormat());
			QImage image(frame.bits(), frame.width(), frame.height(), frame.bytesPerLine(), format);
			context->context()->makeCurrent();
			texture = context->bindTexture(image, GL_TEXTURE_2D);
			initializedTexture = true;
		}
	}
}

GLuint VideoTexture::getGLTextureId(){
	return this->texture;
}

This post has been migrated from my old blog, because it was referred to often.
