This WOULD be a huge breaking change, if it weren't for the fact that no
one uses them. Which is why I'm removing them, to simplify the code.
I asked on the VVVVVV Discord whether anyone used them or was even aware
of them, and basically the answer was no. I looked on Distractionware
and no one uses them there either. And why would they, when they'd have
to distribute the level's .vvvvvv file separately? Better to just
distribute everything in one zip. And it's quite obscure that you have
to suffix the file with .data.zip anyway.
During review of #869, I looked at this part of the codebase again. I
have no idea how or why, but during the course of 2.4 this whole area
just became a mess.
The issues I fixed (in no particular order):
- De-duplicating the copy-pasted code that loads from the binary blobs
- Making sure SDL_RWFromConstMem is used over SDL_RWFromMem wherever
  possible
- Adding checks to make sure the index from the binary blob is valid
  (it's possible that it doesn't exist)
- Adding checks to make sure we gracefully handle
  SDL_RWFromConstMem/PHYSFSRWOPS_openRead returning NULL (see the
  sketch after this list)
- Moving the pointer asterisk to the type instead of the name :)
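For reference, the NULL-handling pattern looks roughly like this - a
minimal sketch, where blob_get_index/blob_get_address/blob_get_size are
hypothetical stand-ins for the actual binary blob accessors:

```
SDL_RWops* rw;

/* The entry might simply not be in the blob. */
const int index = blob_get_index(blob, "data/music/somefile.ogg");
if (index == -1)
{
    return false;
}

/* The blob memory is read-only, so use SDL_RWFromConstMem, not
 * SDL_RWFromMem. */
rw = SDL_RWFromConstMem(blob_get_address(blob, index),
                        blob_get_size(blob, index));
if (rw == NULL)
{
    /* SDL couldn't wrap the memory; bail instead of crashing later. */
    return false;
}
```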
So, it turns out we weren't quite done fighting CMake yet...
To accommodate #869 (and actually also #272), the C standard was raised
from C90 to C99. This turned out to require a bit of a fight with the
CentOS CI's CMake version to get it to set the flags we wanted (and to
not overwrite them later). Eventually the fix was to move the block
that sets the standards to later in the file, which was done in
24353a54bb.
As it apparently turns out, if your CMake is at least 3.1.3 and
`CMAKE_<LANG>_STANDARD` is used instead of the workaround, the standard
setting now has an effect on the third party libraries, but not on
VVVVVV itself. The cause is (probably) the phrase "if it is set when a
target is created" in the CMake documentation - the
`CMAKE_<LANG>_STANDARD` values have to come before the VVVVVV target is
defined. In other words, the compiler's default C/C++ standard will be
used, probably something like C17 and C++17, as I can confirm with
`__cplusplus` and `__STDC_VERSION__` on my recent-enough CMake. If I
force the pre-3.1.3 workaround to be used, everything is compiled as
C99/C++98 as expected, and the `-fno-exceptions` `-fno-rtti` flags
appear everywhere regardless of version.
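(If you want to double-check your own build, a tiny standalone program
like this - compiled with the same flags as the target you care about -
will tell you which standard is actually in effect:)

```
#include <stdio.h>

int main(void)
{
#ifdef __cplusplus
    /* 199711 means C++98, 201703 means C++17 */
    printf("__cplusplus = %ld\n", (long) __cplusplus);
#elif defined(__STDC_VERSION__)
    /* 199901 means C99, 201710 means C17 */
    printf("__STDC_VERSION__ = %ld\n", (long) __STDC_VERSION__);
#else
    printf("no __STDC_VERSION__, so this is C90\n");
#endif
    return 0;
}
```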
So my fix is to make the CMakeLists a little less complex by
simplifying away the `CMAKE_<LANG>_STANDARD` and
`CMAKE_<LANG>_EXTENSIONS`, and always using the workaround regardless
of CMake version. There's nothing wrong with the workaround (the same
thing is already done for `-fno-exceptions` `-fno-rtti`), and it's good
to have a less complicated CMakeLists that doesn't do different,
unexpected things depending on the CMake version.
The previous commit f6d7a214f8 ended up breaking CI because the
workaround also broke the PhysFS build, which had previously been
relying on compiler extensions to compile.
Since #869 is going to require C99 anyways, I might as well just up the
standard now. That way the PR won't have to fight it too.
Previously, if the user had a CMake version below 3.1.3, we told them to
set `-std` themselves.
However, we are going to move to C99 soon (because of FAudio, see #869),
and CentOS 7's CMake is too old to set `-std=` automatically, so the
compiler defaults to C90. This is bad because it fails the CI.
To work around this, we set `-std=` ourselves, but first we have to
clear any existing `-std=` flag in C_FLAGS or CXX_FLAGS. Amusingly
enough, MSVC does not have `/std:` switches for either C90 or C++98, so
we just get to do nothing.
This isn't necessary, but it does silence these annoying logs if you
pass an invalid argument or don't have data.zip:
[ERROR] Could not get window size: Invalid renderer
[WARN] Stats not loaded! Not writing unlock.vvv.
[ERROR] Could not get window size: Invalid renderer
[WARN] Settings not loaded! Not writing settings.vvv.
To do this, I've added FILESYSTEM_isInit().
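The guard at the save call sites is roughly this (a sketch - the real
code paths are a bit more involved):

```
/* Don't try to save (and log scary warnings) if the filesystem never
 * initialized, e.g. because of a bad argument or a missing data.zip. */
if (!FILESYSTEM_isInit())
{
    return;
}
game.savestats();
game.savesettings();
```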
This means we are no longer copy-pasting PhysFS source files directly.
Since the source files reside in a src/ subdirectory, the paths in the
CMakeLists.txt have to be adjusted.
We are no longer copy-pasting LodePNG source files directly.
As we can't rename lodepng.cpp to lodepng.c in the submodule itself, we
need to make a wrapper file, lodepng_wrapper.c, that #includes
lodepng.cpp, but gets compiled as C.
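The wrapper is essentially just this (a sketch; the real file might
differ in the exact include path):

```
/* lodepng_wrapper.c - compiled as C, even though it includes a .cpp
 * file. LodePNG is written so that its source also compiles as plain
 * C. */
#include "lodepng.cpp"
```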
This prevents writing to unlock.vvv or settings.vvv if the game hasn't
made an attempt to load them yet. Otherwise, if the game aborted via
VVV_exit() because of, say, a failure to parse a graphics file, it
would overwrite perfectly valid existing save data that it hadn't even
loaded yet.
Fixes #870.
Another cause of #870 is d0ffafe117, as a
bisect tells me. What that commit did is remove screenbuffer as a
pointer, since it's a statically-allocated object that _should_ always
exist, and it removed the `screenbuffer == NULL` guards in savestats()
and savesettings(). Unfortunately, those guards did something very
important - namely, they prevented writing to the save files when the
filesystem wasn't initialized. But that wasn't made clear, because it
seemed like the point of those guards was to prevent dereferencing NULL.
So instead, explicitly make it clear that
FILESYSTEM_saveTiXml2Document() needs to fail if the filesystem isn't
initialized. I've done this by adding an isInit bool to
FileSystemUtils.cpp.
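In other words, the guard now lives where the write happens, something
like this (a sketch - the signatures are approximated, not copied from
FileSystemUtils.cpp):

```
static bool isInit = false; /* set at the end of a successful FILESYSTEM_init() */

bool FILESYSTEM_isInit(void)
{
    return isInit;
}

bool FILESYSTEM_saveTiXml2Document(const char* name, tinyxml2::XMLDocument& doc)
{
    if (!isInit)
    {
        /* Filesystem never came up - refuse to touch the save files. */
        return false;
    }
    /* ... serialize the document and write it out ... */
    return true;
}
```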
Issue #870 showed one of the problems that this game has, namely that it
only sometimes checks SDL return values, and did not do so in this case.
Part of the cause of #870 is that Screen::GetWindowSize does not check
the return value of SDL_GetRendererOutputSize. When that function fails
(as it does when m_renderer is NULL), it does not initialize the out
parameters, and the game ends up writing those uninitialized values to
the save files.
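The fix is to actually check it and fall back to something sane,
roughly like this (the fallback values are illustrative, and vlog_error
stands in for whatever logging call is used there):

```
void Screen::GetWindowSize(int* x, int* y)
{
    if (SDL_GetRendererOutputSize(m_renderer, x, y) != 0)
    {
        vlog_error("Could not get window size: %s", SDL_GetError());
        /* Don't let uninitialized values leak out to callers that
         * persist them. */
        *x = 320;
        *y = 240;
    }
}
```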
We need to make sure every function's return value is checked, not
just those of SDL functions, but that will have to be done later.
While reviewing #272, I noticed that the PR was passing these two
arguments through a helper function, even though they really shouldn't
ever change. To obviate the need to pass these through, I'm making them
global variables.
pathSep is just a string literal from PhysFS, while basePath is a whole
complicated calculation from SDL and needs to be freed. It will be freed
upon filesystem deinit (as is done with PhysFS and the STDIN buffer).
Additionally, the logic in FILESYSTEM_init is simplified, since it no
longer needs to keep a retval variable or use gotos to free basePath.
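A sketch of the arrangement (the PhysFS/SDL calls are the obvious
candidates; the surrounding code and signatures are approximated):

```
static const char* pathSep = NULL; /* string literal owned by PhysFS */
static char* basePath = NULL;      /* allocated by SDL, freed on deinit */

int FILESYSTEM_init(char* argvZero, char* baseDir, char* assetsPath)
{
    pathSep = PHYSFS_getDirSeparator();
    basePath = SDL_GetBasePath();
    /* ... no retval variable or goto needed just to free basePath ... */
    return 1;
}

void FILESYSTEM_deinit(void)
{
    /* ... free the STDIN buffer, PHYSFS_deinit() ... */
    SDL_free(basePath);
    basePath = NULL;
}
```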
This lets any script name use capitals and spaces all they want, while
still being able to jump to them via iftrinkets() or similar.
The issue is that whenever tokenize() is run, all spaces are stripped
and every argument is lowercased before being put into `words`. So, the
solution here is to create a raw_words array that doesn't perform space
stripping or lowercasing, and to refer to that whenever there's a script
command that loads a script. We keep the lowercasing and space removal
elsewhere to be more forgiving to newcomers.
This is technically a forwards compatibility break, but it's only a
minor one, and all levels that utilize it can still be easily modified
to work on older versions anyway.
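A sketch of the idea (the iftrinkets handler and helper names here are
approximated from memory, not the exact code):

```
/* Inside tokenize(): every argument gets stored twice. */
words.push_back(lowercased_and_stripped); /* used for matching commands */
raw_words.push_back(original_argument);   /* untouched case and spaces */

/* Commands that jump to a script then read the raw copy, roughly: */
if (words[0] == "iftrinkets")
{
    if (game.trinkets() >= ss_toi(words[1]))
    {
        load(raw_words[2]); /* so "My Cool Script" matches exactly */
    }
}
```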
As reported by Dav999, Victoria and Vermilion's trophy colors are
swapped again in 2.4. He points to
37b7615b71, the commit where I fixed the
color masks of every single surface to always be RGB or RGBA.
It sounded plausible to me, because it did have to do with colors,
after all. However, it didn't quite make sense, because I hadn't
touched the trophy colors at all after I originally fixed them.
After I ruled out the RGBf() function as a confounder, I decided to see
whether intentionally reversing the color order in RGBf() to be BGR
would do anything, and to my surprise it swapped the colors back
around and didn't actually look bad.
And then I realized: swapping the trophy colors between RGB and BGR
ordering results in similar colors that still look good but are simply
wrong - not so wrong that they take on a color no crewmate uses, so it
appears as if the crewmates were swapped, when in reality the only
thing that was swapped was the color order.
Trying to fix this by swapping the colors again, I accidentally mixed
up colors 33 and 35 (Vermilion and Victoria) with colors 32 and 34
(Vitellary and Viridian), so I was confused when Vermilion and Victoria
weren't swapping. Then, as a debugging step, I changed only 34 to 32
without swapping 32 as well, and finally noticed that I had been
swapping Vitellary and Viridian, because there were now two Vitellarys.
And then I was reminded that Vitellary and Viridian had also been
wrongly swapped ever since 2.0.
And so then I finally realized: the original comments accompanying the
colors were correct after all. The only problem was that they were fed
into a function, RGBf(), that read the colors backwards. The codebase
habitually changed the color order on a whim, and it was really hard to
reason out which color order should be used at a given time, so RGBf()
ended up reading RGB colors as BGR while looking like it passed them
through as-is.
So what happened was that in the first place, RGBf() was swapping RGB to
BGR. Then I came and swapped Vermilion and Victoria, and Vitellary and
Viridian around. Then later I fixed all the color masks, so RGBf()
stopped swapping RGB and BGR around. But then this ended up swapping the
colors of Vermilion and Victoria, and Vitellary and Viridian once again!
Therefore, swapping Vermilion and Victoria, and Vitellary and
Viridian, was incorrect - or at least, not a fix for the root cause.
The root-cause fix would have been to swap the colors in RGBf(), but
that would have been sort of confusing to reason about - at least if I
didn't bother to just type the RGB values into an image editor. And it
still wouldn't fix the real issue, which is that the game kept swapping
RGB and BGR around in every corner of the codebase.
I further confirmed that there was no more RGB or BGR swapping by
deleting the plus-one-divide-by-three transformation in RGBf() and
seeing if the colors looked okay. Now that the colors were brighter, I
could see that passing them straight through looked fine, while
intentionally reversing them to BGR resulted in colors that looked okay
at a distance but were either washed out or too bright. At least I
could finally use my 8 years of playing this game for something.
So in conclusion, actually, 37b7615b71
("Fix surface color masks") was the real fix, and
d271907f8c ("Fix Secret Lab Time Trial
trophies having wrong colors") was the real regression. It's just that
the regression came first, but it wasn't really a regression until I did
the other fix, so the fix isn't the regression, the regression is...
this is hurting my brain. Or the real regression was the friends we made
along the way, or something like that.
This is the most trivial bug ever caused by the technical debt of those
god-awful reversed color masks.
---
This reverts commit d271907f8c.
Fixes #862.
In hindsight, the FAudio pointer will likely be in SoundTrack since we will
want to keep the mastering voice closer to the sounds and their source voice
arrays, while the MusicTrack will likely just be one source voice that gets
PCM from different streams.
This looks redundant but will actually help in the transition to FAudio; we
mostly want to keep the game logic the same while reimplementing the current
mixer, weirdness and all. Once that's done and confirmed to be stable and
consistent, we can start cutting out the workarounds and quirks.
This meant making the track vectors static, but that's kind of what we do with musicclass anyway?
In any case, this will make the transition to FAudio MUCH less invasive.
This is quite simple. Just use a function pointer that switches out
which function we're going to use.
...Or not. C++ syntax makes this a bit awful since the function is a
member of a class. Did I mention how much I don't like C++?
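For the record, this is the kind of syntax I mean - a generic sketch
with made-up member names, assuming the usual global music instance:

```
class musicclass
{
public:
    void playWithCurrentMixer(int track);
    void playWithFAudio(int track);

    /* Declaring the pointer-to-member-function... */
    void (musicclass::*play_func)(int track);

    void play(int track)
    {
        /* ...and calling through it. Lovely, isn't it? */
        (this->*play_func)(track);
    }
};

/* Switching implementations later is at least straightforward: */
music.play_func = &musicclass::playWithFAudio;
```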
Issue #849 suggested making integer scaling the default on Big Picture
and the Steam Deck, but after thinking about it more, I think it's
better and simpler to just default to integer mode in general.
Reason being that people in Big Picture shouldn't expect the picture to
look different if they're out of Big Picture but still in fullscreen,
or have the picture look different in fullscreen depending on whether
they launched the game for the first time in Big Picture or not. And
besides, the fewer lines of code, the better. So I'm just making
integer mode the default.
This enum just makes each mode readable, instead of using mysterious
0/1/2 values. It's not a strictly-typed enum because we still have to
serialize it as ints in the XML, but it's better than just leaving them
as ints.
This also adds a NUM_SCALING_MODES enum, so we don't have to hardcode
that 3 when cycling scaling modes anymore.
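A sketch of the enum - the individual enumerator names are
approximations, but NUM_SCALING_MODES is the point:

```
enum ScalingMode
{
    SCALING_INTEGER,
    SCALING_SMOOTH,
    SCALING_STRETCH,

    NUM_SCALING_MODES
};

/* Cycling no longer hardcodes the 3: */
scaling_mode = (scaling_mode + 1) % NUM_SCALING_MODES;
```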
This is mainly to make sure the game is definitely set to fullscreen
in Big Picture and on the Steam Deck, and to remove windowed options
that wouldn't make sense if you're not on a desktop (toggling
fullscreen, resize to nearest). Those options would be removed on
console and mobile too.
There's a bit of an annoying bug where if you launch the game in forced
fullscreen mode, but then exit and relaunch in normal mode, your game
will have fullscreen window sizes but it won't be fullscreen. This is
because forced fullscreen mode tries to preserve your non-forced
fullscreen setting, but due to the way window sizes are stored and
queried, it can't preserve the non-forced window size. This is a bit
difficult to work around, so I'm just putting in a FIXME here because we
can fix it later and I'd rather have a slightly buggy forced fullscreen
mode than not have one at all.
Closes #849.
Here are my notes on all the existing functions and what kind of time
formats they output:
- Game::giventimestring(int hrs, int min, int sec)
H:MM:SS
MM:SS
- Game::timestring()
// uses game.hours/minutes/seconds
H:MM:SS
MM:SS
- Game::partimestring()
// uses game.timetrialpar (seconds)
MM:SS
- Game::resulttimestring()
// uses game.timetrialresulttime (sec) + timetrialresultframes (1/30s)
MM:SS.CC
- Game::timetstring(int t)
// t = seconds
MM:SS
- Game::timestringcenti(char* buffer, const size_t buffer_size)
// uses game.hours/minutes/seconds/frames
H:MM:SS.CC
MM:SS.CC
- UtilityClass::timestring(int t)
// t = frames, 30 frames = 1 second
S:CC
M:SS:CC
This is kind of a mess, and there are a lot of functions that do the
same thing except using different variables. For localization, I also
want translators to be able to localize all these time formats - many
languages use the decimal comma instead of the decimal point (12:34,56),
and maybe some languages really prefer something like 1時02分11秒44瞬...
which I don't know to be correct, but it's good to be prepared for it
and not arbitrarily restrict translators to only changing ":" and "."
when we can make the system better in the first place.
I added a new function, UtilityClass::format_time. This is the place
where all time formats come together, given the number of seconds and
optionally frames. I have simplified the above-mentioned functions
somewhat, but I haven't given them a complete refactor or renaming -
I mainly made sure that they all use the same backend so I can make the
formats consistent and properly localizable.
(And before we start shoving more temporary char buffers everywhere
just to get rid of the std::strings, maybe we need to think of a
globally used working buffer of size SCREEN_WIDTH_CHARS+1, as a
register of sorts, for when any line of text needs to be made or
processed, then printed, and then goes unused. Maybe help.textrow,
or something like that.)
As for this commit, the available time formats are now more consistent
and changed a little in some places. Leading zeroes for the first unit
are now no longer included, time trial results and the Super Gravitron
can now display hours (where before they only went up to 60 minutes),
and we now always use .CC instead of :CC. These are the formats:
- H:MM:SS
- H:MM:SS.CC
- M:SS
- M:SS.CC
- S.CC (only used when always_minutes=false, for the Gravitrons)
Here are the changes to the current functions:
- Game::partimestring() is removed - it was used in two places, and
could be replaced by game.timetstring(game.timetrialpar)
- Game::giventimestring(h,m,s) and Game::timestring() are now wrappers
for the other functions
- The four remaining functions (Game::resulttimestring(),
  Game::timetstring(t), Game::timestringcenti(buffer, buffer_size)
  and UtilityClass::timestring(t)) are now wrappers for the "central
  function", UtilityClass::format_time (sketched after this list).
- UtilityClass::twodigits(int t) is now unused so it's also removed.
- I also added int UtilityClass::hms_to_seconds(int h, int m, int s)
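A sketch of the shape of the central function (the real signature and
flag names may differ, and this variant always appends centiseconds,
while the real one also has formats without them):

```
void format_time(char* buffer, const size_t buffer_size,
                 const int seconds, const int frames,
                 const bool always_minutes)
{
    const int hours = seconds / 3600;
    const int minutes = (seconds / 60) % 60;
    const int secs = seconds % 60;
    const int centi = (frames * 100) / 30; /* 30 frames per second */

    if (hours > 0)
    {
        SDL_snprintf(buffer, buffer_size, "%d:%02d:%02d.%02d",
                     hours, minutes, secs, centi);
    }
    else if (always_minutes || minutes > 0)
    {
        SDL_snprintf(buffer, buffer_size, "%d:%02d.%02d",
                     minutes, secs, centi);
    }
    else
    {
        /* S.CC, only used for the Gravitrons */
        SDL_snprintf(buffer, buffer_size, "%d.%02d", secs, centi);
    }
}
```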
This de-duplicates the code, simplifying the codebase and reducing the
number of code paths that need to be maintained. It also adds
robustness checks to LoadIcon that weren't there before (checking that
loading the file succeeded and that decoding it also succeeded).
Now, you might think that loading the image with alpha will change
things in some way. But actually, I tested it, and I'm pretty sure it
doesn't. Since my window manager, i3, doesn't display icons, I had to
resort to this hacky multi-liner
( https://unix.stackexchange.com/a/48866 ) to dump the icon to a PAM
file. I don't know what a PAM file is, and all my various attempts to
convert it into something readable failed. So what I did instead was
just grab the icon of the game before this commit (on 2.3, just to be
extra sure) and `diff` it against the icon grabbed now, and they ended
up being the exact same file. So there's literally no difference.
The only other consideration is that LoadImage needs to be exported,
since it's implemented in GraphicsResources.cpp. I just opted to
forward-declare it right before LoadIcon in Screen.cpp, since it's
really the only other time it's used. No need to create a new header
file for it or anything.
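The arrangement in Screen.cpp then looks roughly like this (the icon
filename and the exact LoadImage signature are approximated, and
m_window is the Screen's SDL window):

```
/* Forward-declared here; the implementation lives in
 * GraphicsResources.cpp. */
SDL_Surface* LoadImage(const char* filename);

void Screen::LoadIcon(void)
{
    SDL_Surface* icon = LoadImage("VVVVVV.png");
    if (icon == NULL)
    {
        /* Loading or decoding failed; just run without a window icon. */
        return;
    }
    SDL_SetWindowIcon(m_window, icon);
    SDL_FreeSurface(icon);
}
```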
This is just to simplify the function. I really don't see any point in
taking away the alpha for some images, other than to disappoint people
who mod the game assets. It just complicates loading the image with no
real gain. To reduce maintenance costs, let's remove this alternate code
path.
Also it's a default argument and I don't like default arguments.
This argument... doesn't do anything.
First off, setting it to true explicitly enables blending on the
resulting surface, which is kind of the exact opposite of the variable
name and is misleading to say the least? And secondly, SDL surfaces have
blending enabled by default anyways, so it still doesn't even do
anything.
It's also a default argument, and I'm not one to shy away from removing
such default arguments.