
Work around SDL2 bug where VSync hint only applies on renderer creation

Ugh, this is terrible and stupid and I hate myself for it.

Anyway, since the SDL2 VSync hint only applies when the renderer is
created, we have to re-create the renderer whenever VSync is toggled.
However, this means we also have to re-create m_screenTexture, AND call
ResizeScreen() after that, or else the letterbox/integer modes won't be
applied.
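
For reference, here is a minimal sketch of the pattern this commit
implements, assuming a generic SDL2 setup with caller-owned window,
renderer, and texture pointers (the names are illustrative; the real
code is in Graphics::processVsync() below):

    #include <SDL.h>

    /* Sketch only, not the actual VVVVVV code. */
    static void recreate_renderer_for_vsync(SDL_Window* window,
                                            SDL_Renderer** renderer,
                                            SDL_Texture** screenTexture,
                                            bool vsync)
    {
        /* The hint is only read at renderer creation time... */
        SDL_SetHintWithPriority(SDL_HINT_RENDER_VSYNC,
                                vsync ? "1" : "0", SDL_HINT_OVERRIDE);

        /* ...so the old renderer has to be torn down and rebuilt. */
        SDL_DestroyRenderer(*renderer);
        *renderer = SDL_CreateRenderer(window, -1, 0);

        /* Textures belong to the renderer that created them, so the
         * streaming screen texture must be rebuilt as well. */
        SDL_DestroyTexture(*screenTexture);
        *screenTexture = SDL_CreateTexture(*renderer,
                                           SDL_PIXELFORMAT_ARGB8888,
                                           SDL_TEXTUREACCESS_STREAMING,
                                           320, 240);
    }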

Unfortunately, this means that in main(), gameScreen.init() will create
a renderer only to be destroyed later by graphics.processVsync().
There's not much we can do about this. Fixing this would require putting
graphics.processVsync() before gameScreen.init(). However, in order to
know whether the user has VSync set, we would have to call
game.loadstats() first, but wait, we can't, because game.loadstats()
mutates gameScreen! Gahhhhhh!!!!
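
In other words, the startup ordering is circular. Schematically (a
simplified sketch, not the literal code):

    gameScreen.init();        // creates the window, renderer, texture
    game.loadstats();         // reads unlock.vvv for the VSync setting,
                              // but also mutates gameScreen, so it
                              // can't be moved before init()
    graphics.processVsync();  // finally knows VSync, so it destroys and
                              // re-creates the renderer init() just made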

@leo60228 suggested fixing that problem
(https://github.com/TerryCavanagh/VVVVVV/pull/220#issuecomment-624217939)
by adding NULL checks to game.loadstats() and then calling it twice, but
then you're trading wastefully creating a renderer only to destroy it
for wastefully opening and parsing unlock.vvv twice instead of once.
Either way, you're doing something twice and wasting work.
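
Roughly, that suggestion would look like the following (hypothetical
sketch; the isInit guard and the exact call sites are illustrative, not
code from the PR):

    // Hypothetical: guard the gameScreen mutations inside loadstats()
    void Game::loadstats()
    {
        // ... parse unlock.vvv, read the VSync setting, etc. ...
        if (gameScreen.isInit) // hypothetical flag; skip if no screen yet
        {
            // ... apply the screen-related settings ...
        }
    }

    // ...so main() could set the hint before any renderer exists
    // (processVsync() would then only need to set the hint):
    game.loadstats();        // pass 1: learn VSync, screen untouched
    graphics.processVsync(); // hint set before renderer creation
    gameScreen.init();       // renderer created once, with correct VSync
    game.loadstats();        // pass 2: parse unlock.vvv again, apply
                             // the screen settings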

Authored by Misa on 2020-06-19 14:12:56 -07:00; committed by Ethan Lee.
parent d898597c1e, commit 5c7e869ee7
3 changed files with 25 additions and 0 deletions


@@ -3106,4 +3106,22 @@ Uint32 Graphics::crewcolourreal(int t)
void Graphics::processVsync()
{
    SDL_SetHintWithPriority(SDL_HINT_RENDER_VSYNC, vsync ? "1" : "0", SDL_HINT_OVERRIDE);
    // FIXME: Sigh... work around SDL2 bug where the VSync hint is only listened to at renderer creation
    SDL_DestroyRenderer(screenbuffer->m_renderer);
    screenbuffer->m_renderer = SDL_CreateRenderer(screenbuffer->m_window, -1, 0);
    // Ugh, have to re-create m_screenTexture as well, otherwise the screen will be black...
    SDL_DestroyTexture(screenbuffer->m_screenTexture);
    // FIXME: This is duplicated from Screen::init()!
    screenbuffer->m_screenTexture = SDL_CreateTexture(
        screenbuffer->m_renderer,
        SDL_PIXELFORMAT_ARGB8888,
        SDL_TEXTUREACCESS_STREAMING,
        320,
        240
    );
    // Ugh, have to make sure to re-apply graphics options after doing the above, otherwise letterbox/integer won't be applied...
    screenbuffer->ResizeScreen(-1, -1);
}
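
(Worth noting: SDL 2.0.18, released well after this commit, added
SDL_RenderSetVSync(), which toggles VSync on an existing renderer and
would make this whole re-creation dance unnecessary:)

    // Requires SDL >= 2.0.18; not an option when this commit was written.
    SDL_RenderSetVSync(screenbuffer->m_renderer, vsync ? 1 : 0);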


@@ -34,6 +34,7 @@ void Screen::init()
    // Uncomment this next line when you need to debug -flibit
    // SDL_SetHintWithPriority(SDL_HINT_RENDER_DRIVER, "software", SDL_HINT_OVERRIDE);
    // FIXME: m_renderer is also created in Graphics::processVsync()!
    SDL_CreateWindowAndRenderer(
        640,
        480,
@@ -76,6 +77,7 @@ void Screen::init()
        0x000000FF,
        0xFF000000
    );
    // ALSO FIXME: This SDL_CreateTexture() is duplicated in Graphics::processVsync()!
    m_screenTexture = SDL_CreateTexture(
        m_renderer,
        SDL_PIXELFORMAT_ARGB8888,


@@ -217,6 +217,11 @@ int main(int argc, char *argv[])
    //Moved screensetting init here from main menu V2.1
    game.loadstats();
    // FIXME: Thanks to having to work around an SDL2 bug, this destroys the
    // renderer created by Screen::init(), which is a bit wasteful!
    // This is annoying to fix because we'd have to call gameScreen.init() after
    // game.loadstats(), but game.loadstats() assumes gameScreen.init() is already called!
    graphics.processVsync();
    if (game.skipfakeload)