A quick glance at the PhysFS source code will show that PhysFS will bail
if PHYSFS_deinit() is called when it's not initialized.
"Bail" here just means setting an error code and returning early, so
it's not that bad. Still, it's the principle of the thing, and I just
want to ensure that FILESYSTEM_deinit() can be safely called even if
the filesystem hasn't been initialized yet; having an error set by PhysFS
kind of taints the whole safety thing, even if it does nothing wrong,
no?
(although, speaking of which, we should be handling all errors reported
by PhysFS, but that's for later...)
These FIXME comments are still correct about code duplication, but the
locations they point to are no longer right, because the original code
they refer to has since been moved around. So I've fixed them to refer
to the correct locations.
We really should get around to de-duplicating the code mentioned in
these comments...
Since musicWriteBlob is a temporary object that gets destroyed at the
end of musicclass::init(), the pointers to the blocks we just allocated
in it would be lost when it goes out of scope, leaking that memory. To
make sure that doesn't happen, we need to call its clear() method after
writing BinaryMusic.vvv.
musicReadBlob was used for both MMMMMM and PPPPPP soundtracks. This
causes a memory leak if you have mmmmmm.vvv installed, because the
pointers holding each allocated block of MMMMMM would be lost when
PPPPPP got loaded. Valgrind complains about this memory leak.
This is in contrast to 2.2 and previous behavior, where musicReadBlob
was only a temporary object instead of being held in musicclass.
However, this wasn't really a memory leak (more so something that just
didn't get cleaned up when closing the game), but it did get turned into
a leak when per-level asset mounting and unmounting got introduced in
2.3 (loading a level with custom assets after starting the game with an
mmmmmm.vvv, or exiting out of a level that had an mmmmmm.vvv, would
cause the game to leak memory). Leo recognized this, and moved
musicReadBlob onto musicclass in a separate 2.3 PR, but either he didn't
think about what was happening here too closely, or he didn't use
Valgrind, because he forgot about the memory leak caused by re-using the
same binaryBlob for PPPPPP and MMMMMM.
So instead, just use two different binaryBlob objects for MMMMMM and
PPPPPP. That way, no memory leaks happen.
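Conceptually, musicclass ends up holding something like this (the member
names here are just for illustration):

    /* One blob per soundtrack, so loading PPPPPP never clobbers the
     * pointers to the blocks that were allocated for MMMMMM. */
    binaryBlob pppppp_blob;  /* vvvvvvmusic.vvv */
    binaryBlob mmmmmm_blob;  /* mmmmmm.vvv, if it's installed */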
I'm going to introduce another binaryBlob object into the mix, and I
want to be able to re-use an existing FOREACH_TRACK #define without
having to copy-paste it again. So, TRACK_NAMES now takes in a blob
parameter, which will be passed to the temporary FOREACH_TRACK #define.
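The pattern, boiled down (the track paths and the action taken per track
are placeholders here, not the real list):

    #include <stdio.h>

    /* TRACK_NAMES expands to one FOREACH_TRACK(...) per track, and now
     * forwards whatever blob the caller hands it. */
    #define TRACK_NAMES(blob) \
        FOREACH_TRACK(blob, "data/music/0levelcomplete.ogg") \
        FOREACH_TRACK(blob, "data/music/1pushingonwards.ogg")

    int main(void)
    {
        const char* pppppp_blob = "vvvvvvmusic.vvv"; /* stand-in for a binaryBlob */

        /* Temporarily define FOREACH_TRACK to do something per track,
         * expand TRACK_NAMES with the blob we care about, then undef it. */
    #define FOREACH_TRACK(blob, track_name) \
        printf("%s: %s\n", blob, track_name);
        TRACK_NAMES(pppppp_blob)
    #undef FOREACH_TRACK
        return 0;
    }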
This removes the music cleanup code from musicclass::init(), and
requires that we also call destroy() in Graphics::reloadresources().
This is because we'll need to re-use the musicclass cleanup code
elsewhere, and we don't want to copy-paste it. Or at
least, I don't (but I'm not a game dev, game devs copy-paste all the
friggin' time).
It doesn't feel quite right leaving all the buffer creation code in
main(), even though it's perfectly okay to do so and it doesn't result
in any memory mismanagement that Valgrind can report; so I'm factoring
all of it out to a separate function, Graphics::create_buffers().
As a bonus, we no longer have to keep qualifying with `graphics.` in the
buffer creation code, which is nice.
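A sketch of what that looks like (buffer names and sizes are
illustrative - the point is just that the screen's pixel format gets
passed in):

    void Graphics::create_buffers(const SDL_PixelFormat* fmt)
    {
        /* Each buffer matches the pixel format of the game screen, which
         * is why this can't happen when the rest of Graphics is set up. */
        backBuffer = SDL_CreateRGBSurface(
            0, 320, 240, fmt->BitsPerPixel,
            fmt->Rmask, fmt->Gmask, fmt->Bmask, fmt->Amask
        );
        menubuffer = SDL_CreateRGBSurface(
            0, 320, 240, fmt->BitsPerPixel,
            fmt->Rmask, fmt->Gmask, fmt->Bmask, fmt->Amask
        );
        /* ...and so on for the rest of the buffers. */
    }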
These destroy all the buffers that are created on the Graphics class.
Since these buffers can't be created at the same time as the rest of
Graphics is (due to the fact that they require knowing the pixel format
of the game screen), they can't be destroyed at the same time as the
rest of Graphics is, either.
This is a very complicated way of zeroing out grphx (instead of using
SDL_zero()), which itself is completely unnecessary because grphx.init()
gets called immediately afterwards anyway.
It should be next-line brace, not same-line brace. Even in a codebase
that uses same-line braces everywhere, I still prefer having next-line
braces for functions (because they're at the top level, and you can't
nest them). But regardless, this should still be next-line brace like
(most of) the rest of the codebase.
The function previously conditionally freed an m_memblocks pointer if its
corresponding m_headers was valid. This makes me slightly worried about
the possibility that memory would be allocated, but the header would
still be marked as invalid.
I don't see how that could happen, but it's better to be safe than
sorry. SDL_free() does a guaranteed NULL pointer check (like most SDL
functions), so it's okay to pass NULL pointers to it.
Just to be sure, I'm also zeroing m_memblocks and m_headers after
freeing everything in the function.
MSVC complains about these; it doesn't seem like GCC does. These can be
safely removed because they're unreachable, and they always follow a
case-switch or similar that has a default case which this code is a
duplicate of anyway. (Unless it isn't, in which case all the better to
remove it, because otherwise it looks misleading or confusing to casual
glances at the code.)
find_tag() would index out of bounds if someone made a level file with
malformed XML entity encodings in the metadata tags.
This would happen if the end of the string followed immediately after an
ampersand and hash, or if there wasn't a semicolon ending an XML entity.
Valgrind complains about these, so I've fixed it.
This fixes a bug where "12" gets properly evaluated as 12, but "148"
gets evaluated as 1408. It's because `place` gets multiplied by `radix`
again, so `retval` gets multiplied by 100 instead of 10.
There's no reason to have a `place` variable, so I've removed it
entirely. This simplifies the function a little bit.
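A minimal sketch of the place-free version (the function name is made
up, and this isn't the exact code - just the accumulation described
above):

    /* Accumulate left to right: retval shifts up one place per digit, so
     * there's no separate `place` to keep in sync with `radix`. */
    static int parse_decimal(const char* str, const size_t len)
    {
        int retval = 0;
        size_t i;
        for (i = 0; i < len; i++)
        {
            retval *= 10;            /* shift the digits we have up one place */
            retval += str[i] - '0';  /* then tack on the new ones digit */
        }
        return retval;
    }

    /* parse_decimal("148", 3) == 148, not 1408. */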
The previous person who wrote this (a girl named Misa) clearly didn't
understand the reason why you couldn't compare line[line.length()-1]
directly to a string literal. It's because the former is a char, and the
latter is a pointer to a char. Both are ints, so it compiles fine, but
it doesn't do what you want it to.
Why not just make the latter a char instead of a string literal? Well,
you can - I just clearly didn't think this through earlier, which is why
I didn't do it in the first place.
But this is fixed now.
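For illustration (a hypothetical snippet, not the actual call site; the
character being checked for is a placeholder):

    #include <string>

    static bool ends_with_pipe(const std::string& line)
    {
        if (line.empty())
        {
            return false;
        }
        /* Compare char to char. Comparing against the string literal "|"
         * would compare against the literal's address, not its contents. */
        return line[line.length() - 1] == '|';
    }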
This avoids an unnecessary copy of the input std::vector, since we don't
need to modify it for anything. This cuts down on unnecessary memory
operations.
Apart from the std::string, this function no longer uses the STL.
ss_toi() is a simple function - it converts the input into an int,
taking as many digits as possible until it reaches a non-digit
character, at which point it stops. It's trivial to implement this
without the STL.
I could've used Int() here, but that would've required copying the
string to a temporary buffer to insert a null-terminator (we can't just
use a pointer-and-length data type either, the string functions don't
operate like that - one disadvantage of C strings!). Instead, I decided
to implement my own conversion to int here, because I don't think the
way we humans write our Arabic numerals is going to change anytime soon.
Also, the std::string input is now passed by const reference, instead of
making a copy - cutting down on unnecessary memory operations.
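Roughly what the STL-free ss_toi() looks like under those constraints (a
sketch, not necessarily the committed code):

    static int ss_toi(const std::string& str)
    {
        int retval = 0;
        size_t i;
        for (i = 0; i < str.length(); i++)
        {
            const char ch = str[i];
            if (ch < '0' || ch > '9')
            {
                break; /* stop at the first non-digit */
            }
            retval *= 10;
            retval += ch - '0';
        }
        return retval;
    }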
I personally like putting the asterisk with the type, because despite
the language parsing the asterisk as a part of the name, the pointer
part is clearly a part of the return type of the function. Also,
put constness here, to indicate that the input won't be modified inside
the function.
This comment indicates that the function is used by
UtilityClass::GCString(). Which is unnecessary, because the reader can
trivially search for usages of GCChar in the file itself (the 'static'
preceding the function should be a good enough hint) - and if there
aren't any, then the reader will know the function is unused, whereas if
they relied on the comment, they would have been under the assumption
that it was still used. (There might also be a compiler warning about it
being unused, which would be more confusing if the comment was still
there.)
Point is, comments can get outdated, so removing the comment here makes
the code more self-documenting.
This is a re-do of 942217f871 (#509), but
with a more conservative fix that only resets the player's newxp and
newyp when they respawn from a checkpoint or spawn in to the map.
Unlike the previous patch, if the player were to suddenly collide with a
conveyor or horizontally-moving platform during gameplay, their
y-position would revert back to the intended next y-position of the
previous frame. But this is the same behavior as before, I haven't ever
seen such a contrived situation come up, and this behavior is probably
preferable for gameplay to actually going to the conveyor, so
it's fine.
I also decided to reset newxp here, and not just newyp, because while
resetting newyp seems to be enough, it's safer to also reset newxp (and
so future readers won't question why only newyp is reset but not newxp).
I tested this and it once again fixes the death loop issue from earlier,
while also still allowing for that Trench Warfare trick to be possible
(I tested it with the libTAS movie I mentioned in #606; it syncs fine).
There are no other known regressions resulting from this fix
(hopefully).
This reverts commit 942217f871.
This fix (of a regression of a fix) has a regression where immediately
flipping off of horizontally-moving platforms or conveyors will no
longer provide you with a "boost" given certain vertical pixel
alignments.
The regression that this fix fixed will be fixed another way.
Fixes #606.
This works on macOS, Wayland, and a few more esoteric platforms. X11
doesn't have a concept of DPI-awareness. Note that with this flag
SDL_GetWindowSize() isn't guaranteed to return the actual window size.
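To make that last point concrete, here's a hedged sketch of what the
flag implies (the window/renderer setup is illustrative, not the game's
actual code):

    #include <SDL.h>

    static void print_sizes(void)
    {
        SDL_Window* window = SDL_CreateWindow(
            "VVVVVV",
            SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
            640, 480,
            SDL_WINDOW_ALLOW_HIGHDPI | SDL_WINDOW_RESIZABLE
        );
        SDL_Renderer* renderer = SDL_CreateRenderer(window, -1, 0);

        int win_w, win_h, out_w, out_h;
        SDL_GetWindowSize(window, &win_w, &win_h);            /* window coordinates */
        SDL_GetRendererOutputSize(renderer, &out_w, &out_h);  /* actual pixels */

        /* On a high-DPI display the output size can be larger than the
         * window size, so don't treat SDL_GetWindowSize() as pixels. */
        SDL_Log("window %dx%d, output %dx%d", win_w, win_h, out_w, out_h);
    }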
Retextured checkpoints have always been in the game, but clicking on
them in the editor would lead to them losing their retextured-ness. So,
checkpoints should be left alone if their p1 isn't either 0 or 1. Also,
they don't show up properly in the editor, so that's fixed, too.
Retextured and flipped terminals were added in 2.3, and show up properly
in-game, but don't properly show up in the editor, either. So now they
show up in the editor. Additionally, clicking on them will flip the
terminal as well, but only if its p1 is 0 or 1, just like checkpoints
now do.
This call to Makebfont() always existed, but ever since 2.3's per-level
custom assets were added, graphics.reloadresources() also calls
graphics.Makebfont(), meaning this call is unnecessary. Calling it twice
results in graphics.bfont and graphics.flipbfont having twice the number
of elements, until custom assets get mounted (or unmounted, technically).
This does the same thing as the last commit, but for No Death Mode
instead of Time Trials. Whenever you die in No Death Mode, or complete
it, all the relevant variables get copied to variables prefixed with
'ndmresult' that never get reset by script.hardreset(), and these
variables are what titlerender() uses, instead of the "live" ones.
This makes it so when a Time Trial gets completed, all the relevant
variables get copied onto variables prefixed with 'timetrialresult',
which never get reset by script.hardreset(). Then titlerender() will use
those variables accordingly.
There are multiple different exit paths to the main menu. In 2.2, they
all had a bunch of copy-pasted code. In 2.3 currently, most of them use
game.quittomenu(), but there are some stragglers that still use
hand-copied code.
This is a bit of a problem, because all exit paths should consistently
have FILESYSTEM_unmountassets(), as part of the 2.3 feature of per-level
custom assets. Furthermore, most (but not all) of the paths call
script.hardreset() too, and some of the stragglers don't. So there could
be something persisting through to the title screen (like a really long
flash/shake timer) that could only persist if exiting to the title
screen through those paths.
But, actually, it seems like there's a good reason for some of those to
not call script.hardreset() - namely, dying or completing No Death Mode
and completing a Time Trial present some information onscreen that
would get reset by script.hardreset(), so I'll fix that in a later
commit.
So what I've done for this commit is find every exit path that didn't
already use game.quittomenu(), and make them use game.quittomenu(). As
well, some of them had special handling that existed on top of them
already having a corresponding entry in game.quittomenu() (but the path
would take the special handling because it never did game.quittomenu()),
so I removed that special handling as well (e.g. exiting from a custom
level used returntomenu(Menu::levellist) when quittomenu() already had
that same returntomenu()).
The menu that exiting from the level editor returns to is now handled in
game.quittomenu() as well, where the map.custommode branch now also
checks for map.custommodeforreal. Unfortunately, it seems like entering
the level editor doesn't properly initialize map.custommode, so entering
the level editor now initializes map.custommode, too.
I've also taken the music.play(6) out of game.quittomenu(), because not
all exit paths immediately play Presenting VVVVVV, so all exit paths
that DO immediately play Presenting VVVVVV now have music.play(6)
special-cased for them, which is fine enough for me.
Here is the list of all exit paths to the menu:
- Exiting through the pause menu (without glitchrunner mode)
- Exiting through the pause menu (with glitchrunner mode)
- Completing a custom level
- Completing a Time Trial
- Dying in No Death Mode
- Completing No Death Mode
- Completing an Intermission replay
- Exiting from the level editor
- Completing the main game
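Putting that all together, quittomenu() ends up shaped roughly like this
(a condensed sketch - which menus the branches return to, and the exact
ordering, are assumptions, not the literal code):

    void Game::quittomenu(void)
    {
        /* Every exit path funnels through here now, so per-level assets
         * always get unmounted and scripts always get reset on the way out. */
        FILESYSTEM_unmountassets();
        script.hardreset();

        if (map.custommode)
        {
            /* map.custommodeforreal distinguishes "playing a custom level"
             * from "in the level editor", and picks the menu accordingly. */
            returntomenu(map.custommodeforreal ? Menu::levellist : Menu::playerworlds);
        }
        else
        {
            createmenu(Menu::mainmenu);
        }

        /* music.play(6) is gone from here; the paths that should
         * immediately play Presenting VVVVVV call it themselves. */
    }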
Comments in general don't get verified by the compiler, but
commented-out code is even worse. Especially since this looks to be
outdated code.
As always, if we need some of this code, then we can just look back in
the Git history.
This fixes a segfault, because we would then pass compressed image data
to SDL_ConvertSurfaceFormat() in LoadImage(). I didn't test my previous
PR. Whoops.
Implicit conversion warnings happen all over the codebase, but there's
no reason to warn on all of them, and adding casts everywhere is
annoying to read and patently unnecessary.
MSVC is the only compiler that has this warning (GCC even on -Wall
-Wextra doesn't warn about implicit conversions), so disable it for
MSVC.
While compiling in release mode, GCC warns about these two potentially
being used uninitialized further down. The only way this could happen is
if the case-switches below didn't match up with a case, which would
require the game to be in an invalid state (and have invalid values for
rcol and spcol), but it's better to be safe than sorry.
The only thing we need LodePNG for is to decode a PNG that we've already
loaded into memory. We handle the filesystem part ourselves, so we don't
need LodePNG's filesystem functions; we don't encode images, and we
don't use the zlib functions. So disable all of those.
During 2.3 development, there's been a gradual shift to using SDL stdlib
functions instead of libc functions, but there are still some libc
functions (or the same libc function but from the STL) in the code.
Well, this patch replaces all the rest of them in one fell swoop.
SDL's stdlib can replace most of these, but its SDL_min() and SDL_max()
are inadequate - they aren't really functions, they're more like macros
with a nasty penchant for double-evaluation. So I just made my own
VVV_min() and VVV_max() functions and placed them in Maths.h instead,
then replaced all the previous usages of min(), max(), std::min(),
std::max(), SDL_min(), and SDL_max() with VVV_min() and VVV_max().
Additionally, there's no SDL_isxdigit(), so I just implemented my own
VVV_isxdigit().
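The gist of the replacements (a sketch; the real Maths.h may differ in
types and extra overloads):

    /* Real functions, so each argument is evaluated exactly once - unlike
     * SDL_min()/SDL_max(), which are macros that expand their arguments twice. */
    static int VVV_min(const int a, const int b)
    {
        return (a < b) ? a : b;
    }

    static int VVV_max(const int a, const int b)
    {
        return (a > b) ? a : b;
    }

    /* No SDL_isxdigit() exists, so roll our own. */
    static int VVV_isxdigit(const int ch)
    {
        return (ch >= '0' && ch <= '9')
            || (ch >= 'A' && ch <= 'F')
            || (ch >= 'a' && ch <= 'f');
    }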
SDL has SDL_malloc() and SDL_free(), but they have some refcounting
built into them, so in order to use them with LodePNG, I have to
replace the malloc() and free() that LodePNG uses. Which isn't too hard,
I did it in a new file called ThirdPartyDeps.c, and LodePNG is now
compiled with the LODEPNG_NO_COMPILE_ALLOCATORS definition.
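That replacement boils down to something like this (a sketch of the
idea; with LODEPNG_NO_COMPILE_ALLOCATORS set, LodePNG expects these
three allocator hooks to be provided externally):

    #include <SDL_stdinc.h>

    /* Route LodePNG's allocations through SDL so its allocation tracking
     * sees them too. */
    void* lodepng_malloc(size_t size)
    {
        return SDL_malloc(size);
    }

    void* lodepng_realloc(void* ptr, size_t new_size)
    {
        return SDL_realloc(ptr, new_size);
    }

    void lodepng_free(void* ptr)
    {
        SDL_free(ptr);
    }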
Lastly, I also refactored the awful strcpy() and strcat() usages in
PLATFORM_migrateSaveData() to use SDL_snprintf() instead. I know save
migration is getting axed in 2.4, but it still bothers me to have
something like that in the codebase in the meantime.
Without further ado, here is the full list of functions that the
codebase now uses:
- SDL_strlcpy() instead of strcpy()
- SDL_strlcat() instead of strcat()
- SDL_snprintf() instead of sprintf(), strcpy(), or strcat() (see above)
- VVV_min() instead of min(), std::min(), or SDL_min()
- VVV_max() instead of max(), std::max(), or SDL_max()
- VVV_isxdigit() instead of isxdigit()
- SDL_strcmp() instead of strcmp()
- SDL_strcasecmp() instead of strcasecmp() or Win32 strcmpi()
- SDL_strstr() instead of strstr()
- SDL_strlen() instead of strlen()
- SDL_sscanf() instead of sscanf()
- SDL_getenv() instead of getenv()
- SDL_malloc() instead of malloc() (replacing in LodePNG as well)
- SDL_free() instead of free() (replacing in LodePNG as well)
This patch de-duplicates the tool drawing code a bit in the menu that
gets brought up when you press Space in the level editor, as well as
fixes several bugs related to the fact that the original author(s) of
the code decided to copy-paste everything. (It was most likely Terry,
judging by the distinct lack of whitespace between tokens in the code.)
There are two "pages" of tools that get shown when you open the tool
menu, according to your currently-selected tool.
1. On the first page, your currently-selected tool gets a brighter
outline. However, on the second page, the code to draw the outline over
your currently-selected tool is missing. So I've fixed that.
2. On the first page, the glyph indicator next to the tool icon also
gets brighter when you have that tool selected. However, on the
second page, the code that drew the brighter-colored indicator got
run before the code that drew the normal-colored indicator, so this
was never shown. This is also fixed.
3. The glyph indicator of the gravity line tool didn't get brighter when
you had it selected, due to its special-cased copy-pasted code
drawing its brighter color before drawing its normal color. This has
also been fixed.
4. Lastly, the tool menu no longer draws the brighter-colored glyphs on
top of the normal-colored glyphs. Instead, the menu will simply draw
the brighter-colored glyphs and will not draw the normal-colored
glyphs in the first place. This is because double-drawing text like
this will look bad if the user has a custom font.png that has
translucent pixels, like I do.
All of these bugs have been fixed by paying off the technical debt of
copy-pasting code.