I know earlier I removed the gameScreen extern in favor of using
screenbuffer, but that was only to be consistent. After further
consideration, I have found that it's actually really stupid.
There's no reason to be accessing it through screenbuffer, and it's
probably an artifact of 2.0-2.2 passing stack-allocated otherwise-global
classes everywhere through function arguments. Also, it leads to stupid
bugs where screenbuffer could potentially be NULL, which has already
resulted in various annoying crashes in the past. Those could be fixed
by simply initializing screenbuffer at the very top of main(), but why
not just scrap the whole thing anyway?
So that's what I'm doing.
As a nice side effect, I've removed the transitive include of Screen.h
from Graphics.h. This could've been done earlier, since Graphics.h only
included Screen.h for the pointer anyway, but it's still good to do it now.
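Roughly, the shape of the change looks like this (a sketch - the
ResizeScreen() call is just an illustrative member function):

```
// Before: everything went through a pointer that could be NULL.
extern Screen* screenbuffer;
screenbuffer->ResizeScreen(x, y);

// After: a plain extern to the global Screen object. No indirection,
// no NULL case to crash on.
extern Screen gameScreen;
gameScreen.ResizeScreen(x, y);
```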
It's been long overdue that this variable be named properly. 2.2 added
integer scaling mode (thanks Ethan), 2.3 renamed it to scaling mode. Now
2.4 will properly call it what it is so people won't be confused by it.
The ScreenSettings struct member is renamed from stretch to scalingMode,
the corresponding Screen class member is renamed the same way, and the
toggleStretchMode function is renamed to toggleScalingMode.
Unfortunately, due to compatibility, we can't change the <stretch> XML
tag.
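So the save code keeps writing the old tag while using the new name
internally - something like this with tinyxml2 (a sketch, not the
literal serialization code):

```
// The member is called scalingMode now, but for compatibility it still
// gets written out under the legacy <stretch> tag.
tinyxml2::XMLElement* elem = doc.NewElement("stretch");
elem->SetText(settings->scalingMode);
dataNode->InsertEndChild(elem);
```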
SDL just got an API to toggle VSync without having to tear down the
renderer ( libsdl-org/SDL#4157 ). We can remove the workaround and use
that instead. For now, we are putting it behind an ifdef until SDL
2.0.18 officially releases in November.
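The new call is SDL_RenderSetVSync(); the guard might look something
like this (a sketch - the fallback helper name is made up):

```
#if SDL_VERSION_ATLEAST(2, 0, 18)
    // New API: flip VSync on the live renderer, no teardown required.
    SDL_RenderSetVSync(m_renderer, vsync ? 1 : 0);
#else
    // Old workaround: tear down and re-create the renderer.
    recreateRendererWithVsyncHint(vsync);
#endif
```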
Fixes #831.
This fixes the color ordering of every SDL_Surface in the game.
Basically, images need to be loaded in ABGR format (except if they don't
have alpha, in which case you use RGB? I'm not sure what's going on
here), and they get converted to RGB/RGBA afterwards.
Due to the surfaces actually being BGR/BGRA, the game used to use
getRGBA/getRGB to swap the colors back around to BGRA/BGR, but I've
fixed those too.
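In SDL terms, the loading side now looks roughly like this (a sketch;
the variables and the target format here are just illustrative):

```
// LodePNG hands back bytes in R,G,B,A order, which read as 32-bit
// pixels on a little-endian machine is SDL_PIXELFORMAT_ABGR8888.
SDL_Surface* loaded = SDL_CreateRGBSurfaceWithFormatFrom(
    pixels, width, height, 32, width * 4, SDL_PIXELFORMAT_ABGR8888
);

// Then convert to the RGB/RGBA format the rest of the game expects.
SDL_Surface* converted =
    SDL_ConvertSurfaceFormat(loaded, SDL_PIXELFORMAT_RGBA8888, 0);
```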
Previously, Flip Mode rendering was complicated - it had to allocate
another buffer just to call FlipSurfaceVerticle, and it was a mess.
Instead, why not just do SDL_RenderCopyEx, and let SDL flip the screen
for us? This ends up pretty massively simplifying the rendering code.
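The whole thing boils down to one call (a sketch; variable names are
assumptions):

```
// Let SDL flip the final frame instead of flipping every surface by
// hand into a second buffer.
const SDL_RendererFlip flip = flipmode ? SDL_FLIP_VERTICAL : SDL_FLIP_NONE;
SDL_RenderCopyEx(m_renderer, m_screenTexture, NULL, NULL, 0.0, NULL, flip);
```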
This fixes a bug where using the fullscreen toggle keybind (Alt+Enter,
Alt+F, or F11) wouldn't update the color of the "resize to nearest" menu
option. The color doesn't functionally change anything - the option
still won't work, and will still have the message telling you that you
need to be in windowed mode when you move your menu selection to it -
but it's an easy inconsistency to fix; just move the menu recreation
into Screen::toggleFullScreen() itself.
So, the codebase was kind of undecided about who is responsible for
initializing the parameters passed to FILESYSTEM_loadFileToMemory() - is
it the caller? Is it FILESYSTEM_loadFileToMemory()? Sometimes callers
would initialize one variable but not the other, and it was always a
toss-up whether FILESYSTEM_loadFileToMemory() would end up initializing
everything.
All of this is to say that the game dereferences an uninitialized
pointer if it can't load a sound effect. Which is bad. Now, I could
either fix that single case, or fix every case. Judging by the title of
this commit, you can infer that I decided to fix every case - fixing
every case means not just all cases that currently exist (which, as far
as I know, is only the sound effect one), but all cases that could exist
in the future.
So, FILESYSTEM_loadFileToMemory() is now guaranteed to initialize its
parameters even if the file fails to be loaded. This is better than
passing the responsibility to the caller anyway: if the caller
initializes the variables and the file loads successfully, that work is
wasted because FILESYSTEM_loadFileToMemory() overwrites them; and if the
file fails to load, well, that's exactly when they need initializing.
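In sketch form (the real signature may differ slightly):

```
void FILESYSTEM_loadFileToMemory(
    const char* name, unsigned char** mem, size_t* len
) {
    // Initialize the out-parameters up front, unconditionally, so
    // callers never see garbage even when the load fails.
    *mem = NULL;
    if (len != NULL)
    {
        *len = 0;
    }

    // ...then attempt to open and read the file, filling *mem and
    // *len only on success...
}
```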
Default function arguments are the devil, and it's better to be more
explicit about what you're passing into the function. Also, since we
might become C-only in the future, to help facilitate that we should get
rid of C++-isms like default function arguments now.
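The gist, with a made-up function:

```
// Before: a C++-ism that hides what call sites actually pass.
void drawTile(int x, int y, int tile = 0);
drawTile(10, 20); // silently means drawTile(10, 20, 0)

// After: every call site spells it out.
void drawTile(int x, int y, int tile);
drawTile(10, 20, 0);
```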
Screen.cpp wasn't explicitly including SDL.h, instead relying on
Screen.h to include it.
It was also relying on SDL.h to include stdio.h on Linux, which breaks
because SDL.h doesn't include stdio.h on Windows. So stdio.h is now
explicitly included as well.
stdlib.h is not used in this file.
ClearSurface() is less verbose than doing it the old way, and it also
conveys intent more clearly. Plus, some of these FillRect()s had hardcoded
width and height values, whereas ClearSurface() doesn't - meaning this
change also has better future-proofing, in case the widths and heights
of the surfaces involved change in the future.
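For reference, ClearSurface() amounts to something like this (a sketch,
not the literal helper):

```
void ClearSurface(SDL_Surface* surface)
{
    // A NULL rect means "the whole surface" - no hardcoded width and
    // height values to go stale if the surface ever changes size.
    SDL_FillRect(surface, NULL, 0x00000000);
}
```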
Apparently in C, if you have `void test();`, it's completely okay to do
`test(2);`. The function will take in the argument, but just discard it
and throw it away. It's like a trash can, and a rude one at that. If you
declare it like `void test(void);`, this is prevented.
This is not a problem in C++ - doing `void test();` and `test(2);` is
guaranteed to result in a compile error (this also means that right now,
at least in all `.cpp` files, nobody is ever calling a void parameter
function with arguments and having their arguments be thrown away).
However, we may not be using C++ in the future, so I just want to lay
down the precedent that if a function takes in no arguments, you must
explicitly declare it as such.
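To spell the precedent out:

```
/* In C, this leaves the parameter list unspecified, so test(2)
   compiles and the argument gets silently thrown away: */
void test();

/* This pins the parameter list down - test(2) is now a compile error
   in C, and it means the same thing in C++: */
void test(void);
```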
I would've added `-Wstrict-prototypes`, but when compiling in C++ mode
it produces an annoying warning saying the option doesn't apply to C++.
So it can be added later.
These FIXME comments are still correct about code duplication, but
they're incorrect about where exactly the original code is after the
original code got moved around. So I've fixed them to refer to the
correct locations.
We really should get around to de-duplicating the code mentioned in
these comments...
This works on macOS, Wayland, and a few more esoteric platforms. X11
doesn't have a concept of DPI-awareness. Note that with this flag
SDL_GetWindowSize() isn't guaranteed to return the actual window size.
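Assuming the flag in question is SDL_WINDOW_ALLOW_HIGHDPI, the practical
upshot for size queries looks like this (a sketch; the member names are
assumptions):

```
// With SDL_WINDOW_ALLOW_HIGHDPI, SDL_GetWindowSize() reports screen
// coordinates, which may not match the pixel size of the output.
int w, h;
SDL_GetWindowSize(m_window, &w, &h);

// To get the actual size in pixels, ask the renderer instead.
SDL_GetRendererOutputSize(m_renderer, &w, &h);
```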
During 2.3 development, there's been a gradual shift to using SDL stdlib
functions instead of libc functions, but there are still some libc
functions (or the same libc function but from the STL) in the code.
Well, this patch replaces all the rest of them in one fell swoop.
SDL's stdlib can replace most of these, but its SDL_min() and SDL_max()
are inadequate - they aren't really functions, they're macros with a
nasty penchant for double-evaluation. So I just made my own
VVV_min() and VVV_max() functions and placed them in Maths.h instead,
then replaced all the previous usages of min(), max(), std::min(),
std::max(), SDL_min(), and SDL_max() with VVV_min() and VVV_max().
Additionally, there's no SDL_isxdigit(), so I just implemented my own
VVV_isxdigit().
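For illustration, the int flavors might look roughly like this (a
sketch; the real Maths.h may differ):

```
// Real functions, so each argument is evaluated exactly once - unlike
// the SDL_min() macro, where SDL_min(x++, y) can bump x twice.
static int VVV_min(const int a, const int b)
{
    return (a < b) ? a : b;
}

static int VVV_max(const int a, const int b)
{
    return (a > b) ? a : b;
}

// There's no SDL_isxdigit(), so roll our own.
static int VVV_isxdigit(const unsigned char ch)
{
    return (ch >= '0' && ch <= '9')
        || (ch >= 'a' && ch <= 'f')
        || (ch >= 'A' && ch <= 'F');
}
```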
SDL has SDL_malloc() and SDL_free(), but they have some allocation
tracking built into them, so in order to use them with LodePNG, I have
to replace the malloc() and free() that LodePNG uses. That isn't too
hard - I did it in a new file called ThirdPartyDeps.c, and LodePNG is now
compiled with the LODEPNG_NO_COMPILE_ALLOCATORS definition.
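With LODEPNG_NO_COMPILE_ALLOCATORS defined, LodePNG expects you to
provide lodepng_malloc(), lodepng_realloc(), and lodepng_free()
yourself, so ThirdPartyDeps.c presumably amounts to:

```
#include <SDL_stdinc.h>

/* LodePNG calls these instead of malloc/realloc/free when
   LODEPNG_NO_COMPILE_ALLOCATORS is defined. */
void* lodepng_malloc(size_t size)
{
    return SDL_malloc(size);
}

void* lodepng_realloc(void* ptr, size_t new_size)
{
    return SDL_realloc(ptr, new_size);
}

void lodepng_free(void* ptr)
{
    SDL_free(ptr);
}
```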
Lastly, I also refactored the awful strcpy() and strcat() usages in
PLATFORM_migrateSaveData() to use SDL_snprintf() instead. I know save
migration is getting axed in 2.4, but it still bothers me to have
something like that in the codebase in the meantime.
Without further ado, here is the full list of functions that the
codebase now uses:
- SDL_strlcpy() instead of strcpy()
- SDL_strlcat() instead of strcat()
- SDL_snprintf() instead of sprintf(), strcpy(), or strcat() (see above)
- VVV_min() instead of min(), std::min(), or SDL_min()
- VVV_max() instead of max(), std::max(), or SDL_max()
- VVV_isxdigit() instead of isxdigit()
- SDL_strcmp() instead of strcmp()
- SDL_strcasecmp() instead of strcasecmp() or Win32 strcmpi()
- SDL_strstr() instead of strstr()
- SDL_strlen() instead of strlen()
- SDL_sscanf() instead of sscanf()
- SDL_getenv() instead of getenv()
- SDL_malloc() instead of malloc() (replacing in LodePNG as well)
- SDL_free() instead of free() (replacing in LodePNG as well)
Another step toward fixing bug #556 is to allow Game::savestats() to
accept a pointer to an existing ScreenSettings struct. This entails
refactoring Game::savesettings() and Game::serializesettings() to accept
the pointer as well, along with adding Screen::GetSettings() so the
settings of the current Screen can be easily grabbed.
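So the call chain presumably ends up shaped like this (signatures are
assumptions based on the description):

```
// Grab the current screen's settings...
ScreenSettings settings;
gameScreen.GetSettings(&settings);

// ...and thread the pointer down through the whole save path:
// savestats() -> savesettings() -> serializesettings().
game.savestats(&settings);
```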
Okay, so basically here's the include layout that this game now
consistently uses:
[The "main" header file, if any (e.g. Graphics.h for Graphics.cpp)]
[blank line]
[All system includes, such as tinyxml2/physfs/utfcpp/SDL]
[blank line]
[All project includes, such as Game.h/Entity.h/etc.]
And if applicable, another blank line, and then some screwy special-case
include stuff (take a look at editor.cpp or FileSystemUtils.cpp, for
example - they have ifdefs and defines mixed in with their includes).
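A made-up example following that layout:

```
#include "Graphics.h"

#include <SDL.h>
#include <tinyxml2.h>

#include "Entity.h"
#include "Game.h"
```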
The problem we're running into is entirely contained in the Screen - we need to
either decouple graphics context init from Screen::init or we need to take out
the screenbuffer interaction from loadstats (which I'm more in favor of since we
can just pull the config values and pass them to Screen::init later).
If you want your game window to simply be exactly 320x240, or 640x480,
or 960x720, etc., then it's really annoying that there's no easy way to
do this (to clarify, this is different from integer mode, which controls
the size of the game INSIDE the window). The easiest existing way is to
close the game, go into unlock.vvv, and edit the window size manually.
VCE has a 1x/2x/3x/4x graphics option to solve this, although
it does not account for actual monitor size (those 1x/2x/3x/4x modes are
all you get, whether or not you have a monitor too small for some of
them or too big for any of them to be what you want).
I discussed this with flibit, and he said that VCE's approach (if it
accounted for monitor size) wouldn't work on Retina or other high-DPI
displays, because getting the actual multiplier on those monitors is
kind of a pain. So the next best thing would be to add an
option that resizes to the nearest perfect multiple of 320x240. That way
you could simply resize the window and let the game correct any
imperfect dimensions automatically.
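The snapping itself is just integer rounding - a sketch, not the literal
implementation (m_window is an assumption):

```
// Snap the current window size to the nearest perfect multiple of
// 320x240, clamped to at least 1x.
int windowWidth, windowHeight;
SDL_GetWindowSize(m_window, &windowWidth, &windowHeight);

int multiplier = (windowWidth + 160) / 320; // rounds to nearest
if (multiplier < 1)
{
    multiplier = 1;
}
SDL_SetWindowSize(m_window, 320 * multiplier, 240 * multiplier);
```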
Ugh, this is terrible and stupid and I hate myself for it.
Anyway, since the SDL2 VSync hint only applies when the renderer is
created, we have to re-create the renderer whenever VSync is toggled.
However, this also means we need to re-create m_screenTexture as well,
AND call ResizeScreen() after that or else the letterbox/integer modes
won't be applied.
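So the toggle sequence ends up being roughly this (a sketch of the
ordering, not the literal code; the texture format and sizes here are
illustrative):

```
// The VSync hint is only read at renderer creation, so re-create it.
SDL_SetHintWithPriority(
    SDL_HINT_RENDER_VSYNC, vsync ? "1" : "0", SDL_HINT_OVERRIDE
);
SDL_DestroyRenderer(m_renderer);
m_renderer = SDL_CreateRenderer(m_window, -1, 0);

// The old texture belonged to the old renderer, so re-create it too...
SDL_DestroyTexture(m_screenTexture);
m_screenTexture = SDL_CreateTexture(
    m_renderer, SDL_PIXELFORMAT_RGB888,
    SDL_TEXTUREACCESS_STREAMING, 320, 240
);

// ...and re-apply letterbox/integer scaling via ResizeScreen().
ResizeScreen(windowWidth, windowHeight);
```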
Unfortunately, this means that in main(), gameScreen.init() will create
a renderer only to be destroyed later by graphics.processVsync().
There's not much we can do about this. Fixing this would require putting
graphics.processVsync() before gameScreen.init(). However, in order to
know whether the user has VSync set, we would have to call
game.loadstats() first, but wait, we can't, because game.loadstats()
mutates gameScreen! Gahhhhhh!!!!
@leo60228 suggested fixing that problem (
https://github.com/TerryCavanagh/VVVVVV/pull/220#issuecomment-624217939
) by adding NULL checks to game.loadstats() and then calling it twice,
but then you're trading wastefully creating a renderer only to be
destroyed, for wastefully opening and parsing unlock.vvv twice instead
of once. In either case, you're doing something twice and wasting work.
Instead of passing the error codes out of the function, just handle the
errors directly as they happen, and fail gracefully if something goes
wrong instead of continuing.
Whenever you would press Alt+Enter, Alt+F, F11, or on macOS
Command+Enter or Command+F, the game would print this useless error
message to the console, every single time: "Error: failed: ", with
SDL_GetError() concatenated after it - but most of the time
SDL_GetError() is blank, so it would print just that.
Instead, what the fullscreen shortcut will now do is check the result of
the relevant SDL functions, BEFORE it decides to print an error message.
And when it DOES print an error message, it will be less vague and will
say instead "Error: toggling fullscreen failed: <output of
SDL_GetError()>".
This means Screen::ResizeScreen() and Screen::toggleFullScreen() are now
int-returning functions. Ideally, every function interfacing with SDL
would return an error code, but that's too much for this simple patch.
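A simplified sketch of the shortcut handler's new pattern:

```
// Only print when something actually failed, and say what failed.
if (gameScreen.toggleFullScreen() != 0)
{
    printf("Error: toggling fullscreen failed: %s\n", SDL_GetError());
}
```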
Additionally, I took the opportunity to clean up the surrounding
formatting of the code a bit, most notably dedenting the
keypress-clearing stuff by one tab level, converting the
shortcut-handling code to spaces, and removing commented-out code.