For future PRs, it'll be very nice to have full control over how VVVVVV
gets drawn to the window. This means we can use the entire window size
for things like touch input, drawing borders, or anything we want.
I have this annoying issue where the game will open on the wrong monitor
in fullscreen mode, because that monitor is considered to be display 0,
whereas the primary monitor I want is display 1.
To mitigate this somewhat, the game now stores the display index that it
was closed on, and will save it to settings. Then the next time the game
opens, it will open on that display index. This should work pretty well
as long as you aren't changing your monitor setup constantly.
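Roughly, the logic looks like this (a minimal sketch; the function and
variable names are made up for illustration, not the actual Screen.cpp
code):
```c
#include <SDL.h>

/* Sketch: remember which display the window was on when the game
   closed, and recreate the window centered on that display next time.
   All names here are illustrative, not the real settings code. */
static int saved_display = 0; /* persisted to settings on exit */

void remember_display(SDL_Window* window)
{
    int index = SDL_GetWindowDisplayIndex(window);
    if (index >= 0)
    {
        saved_display = index;
    }
}

SDL_Window* reopen_on_saved_display(void)
{
    /* Fall back to display 0 if the monitor setup changed. */
    if (saved_display < 0 || saved_display >= SDL_GetNumVideoDisplays())
    {
        saved_display = 0;
    }
    return SDL_CreateWindow(
        "VVVVVV",
        SDL_WINDOWPOS_CENTERED_DISPLAY(saved_display),
        SDL_WINDOWPOS_CENTERED_DISPLAY(saved_display),
        640, 480,
        SDL_WINDOW_RESIZABLE
    );
}
```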
Of course, none of this applies if your window manager is busted. For
example, on GNOME Wayland, which is what I use, in windowed mode the
game will always open on the monitor the cursor is on, and it won't even
be centered in the monitor. But it works fine if I use XWayland via
SDL_VIDEODRIVER=x11.
Previously, the game would not store the size of the window itself, and
would always call SDL_GetRendererOutputSize() (via
Screen::GetWindowSize()) to figure out the size of the window. The only
problem is, this would return the size of the whole monitor if the game
was in fullscreen mode. And the only place where the original windowed
mode size was stored would be in SDL itself, but that wouldn't persist
after the game was closed.
So, if you exited the game while in fullscreen mode, then your window
size would get set to the size of your monitor (1920 by 1080 in my
case). Then when you opened the game and toggled fullscreen off, it
would go back to the default window size, which is 640 by 480.
This is made worse, however, if you previously exited the game in
forced fullscreen mode while your saved setting was windowed mode. In
that case, the game
saves the size of 1920 by 1080, but doesn't save that you were in
fullscreen mode, so opening the game not in forced fullscreen mode would
result in you having a 1920 by 1080 window, but in windowed mode.
Meaning that not even fullscreening and unfullscreening would put the
game window back to normal size.
The solution, of course, is to just store the window size ourselves,
like any other screen setting, and only use GetWindowSize() if needed.
And just to make things clear, I've also renamed the GetWindowSize()
function to GetScreenSize(), because the name "window" could lead one
to think it always returns the size of the screen in windowed mode,
when in fact it returns the size of the screen in whatever mode it is
currently in - the fullscreen size in fullscreen mode and the window
size in windowed mode.
And doing this also fixes the FIXME above Screen::isForcedFullscreen().
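As a rough sketch of the distinction (the struct members and function
signatures here are illustrative stand-ins, not the exact Screen code):
```c
#include <SDL.h>

/* Illustrative sketch only: the windowed-mode size now lives with the
   rest of the screen settings, and the renderer is only queried for
   whatever the screen currently is. */
struct Screen
{
    SDL_Window* window;
    SDL_Renderer* renderer;
    int windowWidth;  /* stored windowed-mode size, saved to settings */
    int windowHeight;
};

/* Size of the screen in whatever mode it's currently in: the monitor
   resolution in fullscreen, the window size in windowed mode. */
void Screen_GetScreenSize(struct Screen* self, int* width, int* height)
{
    SDL_GetRendererOutputSize(self->renderer, width, height);
}

/* Leaving fullscreen restores the size we stored ourselves, instead of
   whatever the monitor happened to be when the game was last closed. */
void Screen_RestoreWindowedSize(struct Screen* self)
{
    SDL_SetWindowFullscreen(self->window, 0);
    SDL_SetWindowSize(self->window, self->windowWidth, self->windowHeight);
}
```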
This fixes the following bug that only occurs on Wayland: If the game is
configured to be fullscreened and in stretch mode, on startup, it won't
be in stretch mode. It will appear to be in letterbox mode, but the game
still thinks it's in stretch mode.
This is because during the ResizeScreen() call on startup, for whatever
reason, the window size will be reported to be the default size (640 by
480) instead of the screen resolution of the monitor, as one would
expect from being in fullscreen. It seems like when the game queries the
window size, the window isn't actually in fullscreen at that time, even
though this is after fullscreen has been set to true.
To fix this, I decided to always update the logical size before
SDL_RenderPresent() is called. To make this neater, I put the scaling
code in its own function named UpdateScaling().
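Sketched out, the idea is something like this (the enum and function
signatures are stand-ins for illustration, not the exact code):
```c
#include <SDL.h>

enum ScalingMode { SCALING_INTEGER, SCALING_LETTERBOX, SCALING_STRETCH };

/* Recompute scaling every frame, right before presenting, so a window
   that only becomes fullscreen after ResizeScreen() (as happens on
   Wayland) still ends up with the correct scale. */
static void UpdateScaling(SDL_Renderer* renderer, enum ScalingMode mode)
{
    if (mode == SCALING_STRETCH)
    {
        /* No logical scaling; the game is drawn to the full output. */
        SDL_RenderSetLogicalSize(renderer, 0, 0);
        SDL_RenderSetIntegerScale(renderer, SDL_FALSE);
    }
    else
    {
        /* Letterbox (optionally integer) scaling of the 320x240 game. */
        SDL_RenderSetLogicalSize(renderer, 320, 240);
        SDL_RenderSetIntegerScale(
            renderer, mode == SCALING_INTEGER ? SDL_TRUE : SDL_FALSE);
    }
}

void FlipScreen(SDL_Renderer* renderer, enum ScalingMode mode)
{
    UpdateScaling(renderer, mode);
    SDL_RenderPresent(renderer);
}
```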
This bug has existed since 2.3 and does not occur on X11. I tested this
on GNOME Wayland, and for testing it on X11, I used Openbox in a Xephyr
session while running VVVVVV with SDL_VIDEODRIVER=x11.
The style we have here is that functions with no arguments are to have
explicit `void` arguments, even though this doesn't apply in C++,
because it does apply in C.
Unfortunately, some have slipped through the cracks. This commit fixes
them.
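For reference, the style in question (with a made-up function name):
```c
/* Hypothetical function, purely for illustration. */

/* Not okay: a non-prototype declaration; in C the parameters are
   unchecked. */
void examplefunc();

/* Okay: an explicit void makes it a real prototype in C. */
void examplefunc(void);
```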
Ever since VVVVVV was initially ported to C++ in 2.0, it has used surfaces from SDL. The downside is, that's all software rendering. This commit moves most things off of surfaces and onto the GPU, using textures and SDL_Renderer.
Pixel-perfect collision has been kept by keeping a copy of sprites as surfaces. There are plans for pixel-perfect collision to use masks instead of reading pixel data directly, but that's out of scope for this commit.
- `graphics.reloadresources()` is now called later in `main`, because textures cannot be created without a renderer.
- This commit also removes a bunch of surface functions which are no longer needed.
- This also recaches target textures in certain places for d3d9.
- graphics.images was converted to a fixed-size array.
- fillbox and fillboxabs use SDL_RenderDrawRect instead of drawing an outline using four filled rectangles
- Update my name in the credits
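To give a rough idea of the overall shape of the change (a sketch with
made-up names, not the actual GraphicsResources code): images are
uploaded as GPU textures for drawing, sprite surfaces are kept around
only for pixel-perfect collision, and outlines like fillboxabs become a
single SDL_RenderDrawRect call.
```c
#include <SDL.h>

struct LoadedImage
{
    SDL_Texture* texture; /* drawn with SDL_RenderCopy */
    SDL_Surface* surface; /* kept only for sprites, for collision */
};

/* Upload a surface to the GPU; keep the surface only if it's needed for
   pixel-perfect collision. */
static int load_image(
    SDL_Renderer* renderer, SDL_Surface* surface,
    struct LoadedImage* out, int keep_surface)
{
    out->texture = SDL_CreateTextureFromSurface(renderer, surface);
    if (out->texture == NULL)
    {
        return -1;
    }
    if (keep_surface)
    {
        out->surface = surface;
    }
    else
    {
        out->surface = NULL;
        SDL_FreeSurface(surface);
    }
    return 0;
}

/* fillboxabs-style outline: one call instead of four filled rects. */
static void draw_outline(SDL_Renderer* renderer, int x, int y, int w, int h)
{
    SDL_Rect rect = { x, y, w, h };
    SDL_RenderDrawRect(renderer, &rect);
}
```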
This is mainly to make sure the game is definitely set to fullscreen in
Big Picture and on the Steam Deck, and to also remove windowed options
that wouldn't make sense if you're not on a desktop (toggling
fullscreen, resize to nearest). Those options would also be removed on
console and mobile.
There's a bit of an annoying bug where if you launch the game in forced
fullscreen mode, but then exit and relaunch in normal mode, your game
will have fullscreen window sizes but it won't be fullscreen. This is
because forced fullscreen mode tries to preserve your non-forced
fullscreen setting, but due to the way window sizes are stored and
queried, it can't preserve the non-forced window size. This is a bit
difficult to work around, so I'm just putting in a FIXME here because we
can fix it later and I'd rather have a slightly buggy forced fullscreen
mode than not have one at all.
Closes #849.
This includes:
- Removing the constructor in favor of actually being able to see that
  there's an actual function call being made that initializes the
  struct (see the sketch after this list)
- Removing the use of a reference in Screen::init() in favor of using a
pointer
- Adding the struct qualifier everywhere (it's not much typing),
  although technically you could typedef it in C, but I'd much rather
  not typedef just to remove a tag qualifier
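A minimal sketch of the resulting shape, written as a plain C-style
function for illustration (the real code is the member function
Screen::init(); the member names here are made up):
```c
#include <SDL.h>

struct ScreenSettings
{
    int windowWidth;
    int windowHeight;
    int fullscreen;
};

struct Screen
{
    SDL_Window* m_window;
    SDL_Renderer* m_renderer;
    int isWindowed;
};

/* An init function you can actually see being called, taking a pointer
   rather than a reference, with the struct tag spelled out. */
void Screen_init(struct Screen* self, const struct ScreenSettings* settings)
{
    self->isWindowed = !settings->fullscreen;
    self->m_window = SDL_CreateWindow(
        "VVVVVV",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        settings->windowWidth, settings->windowHeight,
        settings->fullscreen ? SDL_WINDOW_FULLSCREEN_DESKTOP
                             : SDL_WINDOW_RESIZABLE);
    self->m_renderer = SDL_CreateRenderer(self->m_window, -1, 0);
}
```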
I know earlier I removed the gameScreen extern in favor of using
screenbuffer, but that was only to be consistent. After further
consideration, I have found that it's actually really stupid.
There's no reason to be accessing it through screenbuffer, and it's
probably an artifact of 2.0-2.2 passing stack-allocated otherwise-global
classes everywhere through function arguments. Also, it leads to stupid
bugs where screenbuffer could potentially be NULL, which has already
resulted in various annoying crashes in the past. Those could be fixed
by simply initializing screenbuffer at the very top of main(), but why
not just scrap the whole thing anyway?
So that's what I'm doing.
As a nice side effect, I've removed the transitive include of Screen.h
from Graphics.h. This could've been done already since it only includes
it for the pointer anyway, but it's still good to do it now.
It's been long overdue that this variable be named properly. 2.2 added
integer scaling mode (thanks Ethan), 2.3 renamed it to scaling mode. Now
2.4 will properly call it what it is so people won't be confused by it.
The ScreenSettings struct member is renamed from stretch to
scalingMode, along with the corresponding Screen class member, and the
toggleStretchMode function is renamed to toggleScalingMode.
Unfortunately, due to compatibility, we can't change the <stretch> XML
tag.
SDL just got an API to toggle VSync without having to tear down the
renderer (libsdl-org/SDL#4157). We can remove the workaround and use
that instead. For now, we are putting it behind an ifdef until SDL
2.0.18 officially releases in November.
Fixes #831.
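Roughly (a sketch, guarded the same way; the helper name and fallback
are illustrative):
```c
#include <SDL.h>

/* Toggle VSync on a live renderer where SDL is new enough; older SDL
   still needs the old tear-down-and-recreate workaround. */
static void set_vsync(SDL_Renderer* renderer, int enable)
{
#if SDL_VERSION_ATLEAST(2, 0, 18)
    SDL_RenderSetVSync(renderer, enable ? 1 : 0);
#else
    /* Fall back to recreating the renderer (the old workaround). */
    (void) renderer;
    (void) enable;
#endif
}
```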
Previously, Flip Mode rendering had to be complicated and allocate
another buffer to call FlipSurfaceVerticle, and it was just a mess.
Instead, why not just do SDL_RenderCopyEx, and let SDL flip the screen
for us? This ends up pretty massively simplifying the rendering code.
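Which boils down to something like this (a sketch; the texture and
variable names are placeholders, not the actual Graphics code):
```c
#include <SDL.h>

/* Present the game texture, letting SDL mirror it vertically when Flip
   Mode is on, instead of flipping a surface copy by hand. */
void present_game(SDL_Renderer* renderer, SDL_Texture* gameTexture,
                  int flipmode)
{
    SDL_RenderCopyEx(
        renderer, gameTexture,
        NULL, NULL, /* whole texture to the whole target */
        0.0, NULL,
        flipmode ? SDL_FLIP_VERTICAL : SDL_FLIP_NONE);
    SDL_RenderPresent(renderer);
}
```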
Apparently in C, if you have `void test();`, it's completely okay to do
`test(2);`. The function will take in the argument, but just discard it
and throw it away. It's like a trash can, and a rude one at that. If you
declare it like `void test(void);`, this is prevented.
This is not a problem in C++ - doing `void test();` and `test(2);` is
guaranteed to result in a compile error (this also means that right now,
at least in all `.cpp` files, nobody is ever calling a void parameter
function with arguments and having their arguments be thrown away).
However, we may not be using C++ in the future, so I just want to lay
down the precedent that if a function takes in no arguments, you must
explicitly declare it as such.
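To spell that out with a throwaway example (plain C; none of these
names are from the codebase):
```c
/* test.c: compiles cleanly as C, because `void test();` declares a
   function with UNCHECKED parameters, not a function taking nothing. */
void test();

int main(void)
{
    test(2); /* the 2 is silently accepted and thrown away */
    return 0;
}

void test() /* old-style definition with no parameters */
{
}

/* Declaring it as `void test(void);` instead makes `test(2)` a compile
   error, in C as well as C++. */
```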
I would've added `-Wstrict-prototypes`, but it produces an annoying
warning message saying it doesn't work in C++ mode if you're compiling
in C++ mode. So it can be added later.
Another step to fix bug #556 is to allow Game::savestats() to accept a
pointer to an existing ScreenSettings struct. This entails refactoring
Game::savesettings() and Game::serializesettings() to accept the struct
as well, along with adding Screen::GetSettings() so the settings of the
current Screen can be easily grabbed.
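As a rough sketch of the resulting flow, written C-style for
illustration (the real functions are the C++ members Game::savestats(),
Game::savesettings(), and Screen::GetSettings(); the members shown here
are placeholders):
```c
struct ScreenSettings
{
    int windowWidth;
    int windowHeight;
    int fullscreen;
};

struct Screen
{
    int windowWidth;
    int windowHeight;
    int isWindowed;
};

/* Fill out a ScreenSettings from the current Screen, so the save code
   can take the struct by pointer instead of poking at screenbuffer. */
void Screen_GetSettings(const struct Screen* self,
                        struct ScreenSettings* settings)
{
    settings->windowWidth = self->windowWidth;
    settings->windowHeight = self->windowHeight;
    settings->fullscreen = !self->isWindowed;
}

void Game_savesettings(const struct ScreenSettings* settings)
{
    /* serialize the given settings (omitted) */
    (void) settings;
}

void Game_savestats(const struct ScreenSettings* settings)
{
    /* stats saving omitted; the settings travel by pointer too */
    Game_savesettings(settings);
}
```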
The problem we're running into is entirely contained in the Screen - we need to
either decouple graphics context init from Screen::init or we need to take out
the screenbuffer interaction from loadstats (which I'm more in favor of since we
can just pull the config values and pass them to Screen::init later).
If you want your game window to simply be exactly 320x240, or 640x480,
or 960x720 etc. then it's really annoying that there's no easy way to do
this (to clarify, this is different from integer mode, which controls
the size of the game INSIDE the window). The easiest way would be having
to close the game, go into unlock.vvv, and edit the window size
manually. VCE has a 1x/2x/3x/4x graphics option to solve this, although
it does not account for actual monitor size (those 1x/2x/3x/4x modes are
all you get, whether or not you have a monitor too small for some of
them or too big for any of them to be what you want).
I discussed this with flibit, and he said that VCE's approach (if it
accounted for monitor size) wouldn't work on Retina displays or other
high-DPI monitors, because getting the actual multiplier to account for
those monitors is kind of a pain. So the next best thing would be to add an
option that resizes to the nearest perfect multiple of 320x240. That way
you could simply resize the window and let the game correct any
imperfect dimensions automatically.
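The "resize to nearest" correction itself is just rounding to the
nearest non-zero multiple of 320x240, something like this (a sketch,
not the exact code):
```c
#include <SDL.h>

/* Snap the window to the nearest perfect multiple of 320x240, using a
   single multiplier so the aspect ratio stays 4:3. */
void resize_to_nearest(SDL_Window* window)
{
    int w, h, scale;
    SDL_GetWindowSize(window, &w, &h);

    /* Implied multipliers from each dimension, averaged and rounded:
       round(((w / 320.0) + (h / 240.0)) / 2), minimum 1x. */
    scale = (3 * w + 4 * h + 960) / 1920;
    if (scale < 1)
    {
        scale = 1;
    }
    SDL_SetWindowSize(window, 320 * scale, 240 * scale);
}
```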
Instead of passing the error codes out of the function, just handle the
errors directly as they happen, and fail gracefully if something goes
wrong instead of continuing.
Whenever you pressed Alt+Enter, Alt+F, F11, or Command+Enter or
Command+F on macOS, the game would print this useless error message to
the console, every single time: "Error: failed: " with SDL_GetError()
concatenated after it. But most of the time SDL_GetError() is blank, so
it would print just that.
Instead, what the fullscreen shortcut will now do is check the result of
the relevant SDL functions, BEFORE it decides to print an error message.
And when it DOES print an error message, it will be less vague and will
say instead "Error: toggling fullscreen failed: <output of
SDL_GetError()>".
This means Screen::ResizeScreen() and Screen::toggleFullScreen() are now
int-returning functions. Ideally, every function interfacing with SDL
would return an error code, but that's too much for this simple patch.
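Sketched out (illustrative only; the real functions are
Screen::ResizeScreen() and Screen::toggleFullScreen(), and the logging
call is a stand-in):
```c
#include <SDL.h>
#include <stdio.h>

/* Check the result of the relevant SDL call BEFORE printing anything,
   and only then report a specific message with SDL_GetError(). */
int toggle_fullscreen(SDL_Window* window, int enable)
{
    Uint32 flags = enable ? SDL_WINDOW_FULLSCREEN_DESKTOP : 0;

    if (SDL_SetWindowFullscreen(window, flags) != 0)
    {
        printf("Error: toggling fullscreen failed: %s\n", SDL_GetError());
        return -1;
    }
    return 0;
}
```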
Additionally, I took the opportunity to clean up the surrounding
formatting of the code a bit, most notably dedenting the
keypress-clearing stuff by one tab level, converting the
shortcut-handling code to spaces, and removing commented-out code.