The problem we're running into is entirely contained in the Screen - we need to
either decouple graphics context init from Screen::init, or take the
screenbuffer interaction out of loadstats (which I'm more in favor of, since we
can just pull the config values and pass them to Screen::init later).
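As a rough illustration of that second option (the ScreenSettings name and its
fields here are made up for the sketch, not the actual code):

    /* Hypothetical plain settings struct: loadstats() only fills this in
     * from unlock.vvv and never touches gameScreen. */
    struct ScreenSettings
    {
        int windowWidth = 640;
        int windowHeight = 480;
        bool fullscreen = false;
        bool useVsync = false;
    };

    /* Hypothetical signatures, so main() controls the order:
     *
     *     ScreenSettings settings;
     *     game.loadstats(&settings);
     *     gameScreen.init(settings);
     */
    void loadstats(ScreenSettings* settings);
    void init(const ScreenSettings& settings);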
If you want your game window to simply be exactly 320x240, or 640x480,
or 960x720, etc., then it's really annoying that there's no easy way to
do this (to clarify, this is different from integer mode, which controls
the size of the game INSIDE the window). Currently the easiest way is to
close the game, go into unlock.vvv, and edit the window size manually.
VCE has a 1x/2x/3x/4x graphics option to solve this, although it does
not account for actual monitor size (those 1x/2x/3x/4x modes are all you
get, whether your monitor is too small for some of them or too big for
any of them to be what you want).
I discussed this with flibit, and he said that VCE's approach (even if
it accounted for monitor size) wouldn't work on Retina or other high-DPI
displays, because getting the actual multiplier to account for those
monitors is kind of a pain. So the next best thing is to add an option
that resizes the window to the nearest perfect multiple of 320x240. That
way you could simply resize the window and let the game correct any
imperfect dimensions automatically.
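As a sketch of the correction step (the function name and the rounding choice
here are assumptions, not the actual patch):

    #include <SDL.h>

    static void snapToNearestMultiple(SDL_Window* window)
    {
        int w;
        int h;
        SDL_GetWindowSize(window, &w, &h);

        /* Nearest integer multiplier for each dimension (rounded), averaged
         * between width and height, and clamped to at least 1x */
        int mult = ((w + 160) / 320 + (h + 120) / 240) / 2;
        if (mult < 1)
        {
            mult = 1;
        }

        SDL_SetWindowSize(window, 320 * mult, 240 * mult);
    }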
Ugh, this is terrible and stupid and I hate myself for it.
Anyway, since the SDL2 VSync hint only applies when the renderer is
created, we have to re-create the renderer whenever VSync is toggled.
However, this means we need to re-create m_screenTexture as well, AND
call ResizeScreen() afterwards, or else the letterbox/integer modes
won't be applied.
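To make the order concrete, roughly this (all names other than m_screenTexture
are assumptions for the sketch):

    #include <SDL.h>

    static void recreateRendererForVsync(
        SDL_Window* window,
        SDL_Renderer** renderer,
        SDL_Texture** screenTexture,
        bool vsyncEnabled
    ) {
        /* The hint is only honored at renderer creation time... */
        SDL_SetHint(SDL_HINT_RENDER_VSYNC, vsyncEnabled ? "1" : "0");

        /* ...so tear everything down... */
        SDL_DestroyTexture(*screenTexture);
        SDL_DestroyRenderer(*renderer);

        /* ...and build it back up. */
        *renderer = SDL_CreateRenderer(window, -1, 0);
        *screenTexture = SDL_CreateTexture(
            *renderer,
            SDL_PIXELFORMAT_ARGB8888,
            SDL_TEXTUREACCESS_STREAMING,
            320,
            240
        );

        /* The caller still needs to call ResizeScreen() afterwards so the
         * letterbox/integer modes get re-applied to the new renderer. */
    }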
Unfortunately, this means that in main(), gameScreen.init() will create
a renderer only to be destroyed later by graphics.processVsync().
There's not much we can do about this. Fixing this would require putting
graphics.processVsync() before gameScreen.init(). However, in order to
know whether the user has VSync set, we would have to call
game.loadstats() first, but wait, we can't, because game.loadstats()
mutates gameScreen! Gahhhhhh!!!!
@leo60228 suggested fixing that problem (
https://github.com/TerryCavanagh/VVVVVV/pull/220#issuecomment-624217939
) by adding NULL checks to game.loadstats() and then calling it twice,
but then you're trading wastefully creating a renderer only to destroy
it for wastefully opening and parsing unlock.vvv twice instead of once.
Either way, you're doing something twice and wasting work.
Instead of passing the error codes out of the function, just handle the
errors directly as they happen, and fail gracefully if something goes
wrong instead of continuing.
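As a sketch of that pattern (the function, window title, and sizes here are
illustrative, not the actual code):

    #include <SDL.h>
    #include <stdio.h>
    #include <stdlib.h>

    static SDL_Window* createGameWindow(void)
    {
        SDL_Window* window = SDL_CreateWindow(
            "VVVVVV",
            SDL_WINDOWPOS_CENTERED,
            SDL_WINDOWPOS_CENTERED,
            640,
            480,
            SDL_WINDOW_RESIZABLE
        );
        if (window == NULL)
        {
            /* Fail gracefully right here instead of returning an error
             * code for the caller to (maybe) check. */
            printf("Error: could not create window: %s\n", SDL_GetError());
            SDL_Quit();
            exit(1);
        }
        return window;
    }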
Whenever you pressed Alt+Enter, Alt+F, F11, or on macOS Command+Enter or
Command+F, the game would print this useless error message to the
console, every single time: "Error: failed: ", with SDL_GetError()
concatenated after it. But most of the time SDL_GetError() is blank, so
it would print just that.
Instead, the fullscreen shortcut will now check the result of the
relevant SDL functions BEFORE deciding to print an error message. And
when it DOES print an error message, it will be less vague, and will
instead say "Error: toggling fullscreen failed: <output of
SDL_GetError()>".
This means Screen::ResizeScreen() and Screen::toggleFullScreen() are now
int-returning functions. Ideally, every function interfacing with SDL
would return an error code, but that's too much for this simple patch.
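Put together, the new pattern looks roughly like this (a sketch with assumed
names such as isWindowed, not the exact patch):

    #include <SDL.h>
    #include <stdio.h>

    static int toggleFullScreenSketch(SDL_Window* window, bool* isWindowed)
    {
        Uint32 flags = *isWindowed ? SDL_WINDOW_FULLSCREEN_DESKTOP : 0;

        /* Only print an error if SDL actually reports one */
        if (SDL_SetWindowFullscreen(window, flags) != 0)
        {
            printf("Error: toggling fullscreen failed: %s\n", SDL_GetError());
            return -1;
        }

        *isWindowed = !*isWindowed;
        return 0;
    }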
Additionally, I took the opportunity to clean up the surrounding
formatting of the code a bit, most notably dedenting the
keypress-clearing stuff by one tab level, converting the
shortcut-handling code to spaces, and removing commented-out code.