Issue #849 suggested making integer scaling the default on Big Picture
and Steam Deck, but after thinking about it more, I think it's better
and simpler to just default to integer mode in general.
The reason is that people in Big Picture shouldn't expect the picture
to look different when they leave Big Picture but stay in fullscreen,
or have the picture look different in fullscreen depending on whether
they launched the game for the first time in Big Picture or not. And
besides, the fewer lines of code, the better. So I'm just making
integer mode the default.
This enum just makes each mode readable, instead of mysterious 0/1/2
values. It's not a strictly-typed enum because we still have to
serialize it as ints in the XML, but it's better than just leaving them
as ints.
This also adds a NUM_SCALING_MODES enumerator, so we don't have to
hardcode that 3 when cycling scaling modes anymore.
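As a rough sketch of the idea (the enumerator names here are
illustrative, not necessarily the exact ones in the code):

    enum ScalingMode
    {
        SCALING_INTEGER,
        SCALING_SMOOTH,
        SCALING_STRETCH,

        NUM_SCALING_MODES
    };

    /* Not strictly typed, so it can still be stored and serialized as
     * an int. */
    int mode = SCALING_INTEGER; /* the new default */

    /* Cycling no longer needs a hardcoded 3: */
    mode = (mode + 1) % NUM_SCALING_MODES;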
This is mainly to make sure the game is definitely set to fullscreen in
Big Picture and on the Steam Deck, and also to remove windowed options
that wouldn't make sense if you're not on a desktop (toggling
fullscreen, resize to nearest). Those options would be removed on
console and mobile too.
There's a bit of an annoying bug where if you launch the game in forced
fullscreen mode, but then exit and relaunch in normal mode, your game
will have fullscreen window sizes but it won't be fullscreen. This is
because forced fullscreen mode tries to preserve your non-forced
fullscreen setting, but due to the way window sizes are stored and
queried, it can't preserve the non-forced window size. This is a bit
difficult to work around, so I'm just putting in a FIXME here because we
can fix it later and I'd rather have a slightly buggy forced fullscreen
mode than not have one at all.
Closes #849.
Here are my notes on all the existing functions and what kind of time
formats they output:
- Game::giventimestring(int hrs, int min, int sec)
    H:MM:SS
    MM:SS
- Game::timestring()
    // uses game.hours/minutes/seconds
    H:MM:SS
    MM:SS
- Game::partimestring()
    // uses game.timetrialpar (seconds)
    MM:SS
- Game::resulttimestring()
    // uses game.timetrialresulttime (sec) + timetrialresultframes (1/30s)
    MM:SS.CC
- Game::timetstring(int t)
    // t = seconds
    MM:SS
- Game::timestringcenti(char* buffer, const size_t buffer_size)
    // uses game.hours/minutes/seconds/frames
    H:MM:SS.CC
    MM:SS.CC
- UtilityClass::timestring(int t)
    // t = frames, 30 frames = 1 second
    S:CC
    M:SS:CC
This is kind of a mess, and there are a lot of functions that do the
same thing except using different variables. For localization, I also
want translators to be able to localize all these time formats - many
languages use the decimal comma instead of the decimal point (12:34,56),
and maybe some languages really prefer something like 1時02分11秒44瞬...
I don't know whether that's correct, but it's good to be prepared for it
and not restrict translators arbitrarily to only changing ":" and "."
when we can make the system better in the first place.
I added a new function, UtilityClass::format_time. This is the place
where all time formats come together, given the number of seconds and
optionally frames. I have simplified the above-mentioned functions
somewhat, but I haven't given them a complete refactor or renaming -
I mainly made sure that they all use the same backend so I can make the
formats consistent and properly localizable.
(And before we start shoving more temporary char buffers everywhere
just to get rid of the std::strings, maybe we need to think of a
globally used working buffer of size SCREEN_WIDTH_CHARS+1, as a
register of sorts, for when any line of text needs to be made or
processed, then printed, and then goes unused. Maybe help.textrow,
or something like that.)
As for this commit, the available time formats are now more consistent
and have changed a little in some places. Leading zeroes for the first
unit are no longer included, time trial results and the Super Gravitron
can now display hours where they previously just kept counting past 60
minutes, and we now always use .CC instead of :CC. These are the
formats:
- H:MM:SS
- H:MM:SS.CC
- M:SS
- M:SS.CC
- S.CC (only used when always_minutes=false, for the Gravitrons)
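To make that concrete, here's a rough sketch of what the centiseconds
path of such a central function could look like (this is not the actual
implementation, and the localization hooks are omitted - it just
illustrates the formats above; the non-.CC formats simply drop the last
part):

    static std::string format_time_sketch(int seconds, int frames, bool always_minutes)
    {
        char buffer[20];
        const int h = seconds / 3600;
        const int m = (seconds / 60) % 60;
        const int s = seconds % 60;
        const int cent = (frames * 100) / 30; /* 30 frames = 1 second */

        if (h > 0)
        {
            /* H:MM:SS.CC */
            SDL_snprintf(buffer, sizeof(buffer), "%d:%02d:%02d.%02d", h, m, s, cent);
        }
        else if (always_minutes || m > 0)
        {
            /* M:SS.CC */
            SDL_snprintf(buffer, sizeof(buffer), "%d:%02d.%02d", m, s, cent);
        }
        else
        {
            /* S.CC, only for the Gravitrons */
            SDL_snprintf(buffer, sizeof(buffer), "%d.%02d", s, cent);
        }
        return buffer;
    }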
Here's what changed in the current functions:
- Game::partimestring() is removed - it was used in two places, and
could be replaced by game.timetstring(game.timetrialpar)
- Game::giventimestring(h,m,s) and Game::timestring() are now wrappers
for the other functions
- The four remaining functions (Game::resulttimestring(),
Game::timetstring(t), Game::timestringcenti(buffer, buffer_size)
and UtilityClass::timestring(t)) are now wrappers for the "central
function", UtilityClass::format_time.
- UtilityClass::twodigits(int t) is now unused so it's also removed.
- I also added int UtilityClass::hms_to_seconds(int h, int m, int s),
  sketched below.
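A sketch of that helper (the body here is an assumption, but there's
really only one sensible way to write it):

    int UtilityClass::hms_to_seconds(int h, int m, int s)
    {
        return h * 3600 + m * 60 + s;
    }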
This de-duplicates the code, simplifying the codebase and reducing the
number of code paths that need to be maintained. It also adds
robustness checks to LoadIcon that weren't there before (checking that
loading the file succeeded and that decoding the file also succeeded).
Now, you might think that loading the image with alpha will change
things in some way. But actually, I tested it, and I'm pretty sure it
doesn't. Since my window manager, i3, doesn't display icons, I've had to
resort to this hacky multi-liner
( https://unix.stackexchange.com/a/48866 ) to dump the icon to a PAM
file. I don't know what a PAM file is and all my various attempts to
convert it into something readable failed. But what I did instead was
just grab the icon of the game before this commit (on 2.3, just to be
extra sure), and `diff`ed it with the grabbed icon now, and they end up
being the exact same file. So there's literally no difference.
The only other consideration is that LoadImage needs to be exported,
since it's implemented in GraphicsResources.cpp. I just opted to
forward-declare it right before LoadIcon in Screen.cpp, since it's
really the only other time it's used. No need to create a new header
file for it or anything.
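So the shape of it ends up roughly like this (a sketch, not the literal
code - the icon filename and the window member name are assumptions):

    /* Forward declaration; LoadImage lives in GraphicsResources.cpp. */
    SDL_Surface* LoadImage(const char* filename);

    void Screen::LoadIcon(void)
    {
        SDL_Surface* icon = LoadImage("VVVVVV.png");
        if (icon == NULL)
        {
            /* Loading or decoding failed; skip the icon, don't crash. */
            return;
        }
        SDL_SetWindowIcon(m_window, icon);
        SDL_FreeSurface(icon);
    }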
This is just to simplify the function. I really don't see any point in
taking away the alpha for some images, other than to disappoint people
who mod the game assets. It just complicates loading the image with no
real gain. To reduce maintenance costs, let's remove this alternate code
path.
Also it's a default argument and I don't like default arguments.
This argument... doesn't do anything.
First off, setting it to true explicitly enables blending on the
resulting surface, which is kind of the exact opposite of the variable
name and is misleading to say the least? And secondly, SDL surfaces have
blending enabled by default anyways, so it still doesn't even do
anything.
It's also a default argument, and I'm not one to shy away from removing
such default arguments.
This includes:
- Removing the constructor in favor of an explicitly-called function
  that initializes the struct, so you can actually see the
  initialization happening (see the sketch after this list)
- Removing the use of a reference in Screen::init() in favor of using a
pointer
- Adding the struct qualifier everywhere (it's not much typing);
  technically you could typedef it in C, but I'd much rather not
  typedef just to remove a tag qualifier
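In other words, something in this direction (field names and defaults
are illustrative, not the exact ones):

    struct ScreenSettings
    {
        int windowWidth;
        int windowHeight;
        bool fullscreen;
        bool useVsync;
        int stretch;
    };

    /* An initializer you can actually see being called, instead of a
     * hidden constructor: */
    static void ScreenSettings_default(struct ScreenSettings* settings)
    {
        SDL_zerop(settings);
        settings->windowWidth = 640;
        settings->windowHeight = 480;
    }

    /* And Screen::init() now takes a pointer rather than a reference:
     *     void Screen::init(const struct ScreenSettings* settings);
     */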
I know earlier I removed the gameScreen extern in favor of using
screenbuffer, but that was only to be consistent. After further
consideration, I have found that it's actually really stupid.
There's no reason to be accessing it through screenbuffer, and it's
probably an artifact of 2.0-2.2 passing stack-allocated otherwise-global
classes everywhere through function arguments. Also, it leads to stupid
bugs where screenbuffer could potentially be NULL, which has already
resulted in various annoying crashes in the past. Those could be fixed
by simply initializing screenbuffer at the very top of main(), but why
not just scrap the whole thing anyway?
So that's what I'm doing.
As a nice side effect, I've removed the transitive include of Screen.h
from Graphics.h. This could've been done already since it only includes
it for the pointer anyway, but it's still good to do it now.
In aa7b63fa5f, I didn't notice that the
result was implicitly being converted to int by the min/max from before.
I instead added it to the existing char, but that resulted in a char
overflow (it's unsigned, so thankfully not undefined behavior).
But of course the entire point of that commit is to make it explicitly
clear when you are converting between types, intentionally or otherwise,
in min/max comparisons. So despite causing a regression (which I have
now fixed), at least it did its job.
It's been long overdue that this variable be named properly. 2.2 added
integer scaling mode (thanks Ethan), 2.3 renamed it to scaling mode. Now
2.4 will properly call it what it is so people won't be confused by it.
The ScreenSettings struct member is renamed from stretch to scalingMode,
along with the corresponding Screen class member, and the
toggleStretchMode function is renamed to toggleScalingMode as well.
Unfortunately, due to compatibility, we can't change the <stretch> XML
tag.
VVV_min/max are functions that only operate on ints, and SDL_min/max are
macros that operate on any type but double-evaluate everything.
I know I more-or-less said earlier that SDL_min/max were dumb but I've
changed my mind and think it's better to use them, taking care to make
sure you don't double-evaluate, rather than trying to generate your own
litany of functions with either your own hand-rolled generation macros,
C++ templates, C11 generics, or GCC extensions (that last one you'd
technically use in a macro but it doesn't really matter), all of which
have more downsides than just not double-evaluating.
And the upside of not double-evaluating is that you're discouraged from
having really complicated single-line min/max expressions and
encouraged to precompute the values beforehand anyway, so the final
min/max is more readable. Furthermore, you'll notice when you yourself
end up doing double-evaluations anyway. I removed a couple instances of
Graphics::len() being double-evaluated in this commit (as well as
cleaned up some other min/max-using code). Thankfully, the only
downside to those double-evaluations was unnecessary computation,
rather than checking the wrong result or having multiple side effects,
but it's still good to minimize double-evaluations where possible.
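For example (SDL_min's definition is quoted from SDL_stdinc.h; the
Graphics::len call is just an illustration of the pattern that got
cleaned up):

    /* SDL_stdinc.h: #define SDL_min(x, y) (((x) < (y)) ? (x) : (y)) */

    /* Before - double-evaluates, so graphics.len() runs twice: */
    int width = SDL_min(graphics.len(text), 160);

    /* After - precompute, then compare: */
    int textlen = graphics.len(text);
    int width = SDL_min(textlen, 160);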
The reason why the wall stuck flipping behavior happened in the first
place was because the code went like this:
    if (jumppressed)
    {
        if (onground && gravitycontrol == 0)
        {
            gravitycontrol = 1;
        }
        if (onroof && gravitycontrol == 1)
        {
            gravitycontrol = 0;
        }
    }
Basically, if you were both on ground and on a roof (i.e. stuck in a
wall), you would flip, but then due to code order and the fact that the
statement is not connected to the previous one, you would immediately
unflip afterwards. But if you were already flipped then the only path
that can be taken is to unflip you, since it's the statement that
appears last.
52fceb3f69 replaces the onground/onroof
conditionals with any_onground/any_onroof, so any player entity would
allow you to flip. But otherwise the code is the same. So is that the
problem?
No; tracing it through with GDB reveals that when you flip,
gravitycontrol is being set to 1, but never being set to 0. And it turns
out that's because any_onroof is not getting set. And that happens
because of another thing that 52fceb3f69
did - which was to set any_onground/any_onroof to true if indeed any
player entity was on ground or on a roof.
Unfortunately, the way Leo did it was to make the two statements
mutually exclusive - an 'if'-'else if' instead of two separate
statements. So a single entity could not mark both any_onground and
any_onroof as true (and the majority of the time, you will be a single
entity).
Thus, the solution is to just drop that 'else'.
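In sketch form (the surrounding loop over player entities is omitted,
and the exact names are illustrative):

    /* Before - the 'else if' meant one entity couldn't set both flags: */
    if (obj.entities[ie].onground > 0)
    {
        any_onground = true;
    }
    else if (obj.entities[ie].onroof > 0)
    {
        any_onroof = true;
    }

    /* After - drop the 'else', so being stuck in a wall sets both: */
    if (obj.entities[ie].onground > 0)
    {
        any_onground = true;
    }
    if (obj.entities[ie].onroof > 0)
    {
        any_onroof = true;
    }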
Fixes #855.
I noticed when going frame-by-frame in Vertigo that the wrapping
enemies at the top sometimes just "popped" into frame. This is
because the sprite warp code only draws the warping sprite of sprites at
the bottom of the screen if they're below y=210. However, the warp point
starts at y=232, and warp sprites can be at most 32x32, which is exactly
the case with the Vertigo sprites, which are exactly 32x32. So the warp
code should start warping sprites if they're below y=200 (232 - 32)
instead.
Horizontal warping also has this problem; it warps at x=320 and
starts drawing warp sprites at x=300, even though it should start
drawing at x=288 (320 - 32). I've gone ahead and fixed that as well.
This is just in case a custom level or something changes the background
to something that would otherwise result in bad contrast. Also if it
needs to go outside the box for some reason. And I just like the look
of the outline.
Whew, look at all those copy-pasted print statements!
Doing this because of the in-game timer feature. The text would
otherwise clash harshly with the timer. Even with the outline it still
clashes, but at least there's an outline so it's not as harsh.
This adds centiseconds to the in-game timer, as well as the time trial
timer.
This is to aid speedrun moderators in determining when exactly a run was
completed, which they can't easily do if the timer only has a precision
up to a second.
The problem was that it also needed to check that game.swnmode was
true, in addition to game.swngame being 1, to actually verify that the
Super Gravitron was being played.
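In other words, the condition needs to look something like this
(sketch):

    if (game.swnmode && game.swngame == 1)
    {
        /* The Super Gravitron is actually being played. */
    }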
Currently, all gamestate-related variables are just ints. This is not
particularly type-safe, especially if the set of enum values ever
changes. To verify that all current uses of the gamestate variables
actually use the enums, change them to be typed with the enum instead.
(As an aside, we should probably rename this so that it can't be
confused with Terry's state machine that has several different ways to
exploit to warp you to the credits, but that's something to do later.)
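A sketch of the idea (only a few of the values are shown here):

    enum GameGamestate
    {
        GAMEMODE,
        TITLEMODE,
        MAPMODE,
        TELEPORTERMODE
        /* ...and so on */
    };

    /* Before: int gamestate;
     * After: typing the variable with the enum means a stray int can no
     * longer be assigned to it in C++ without an explicit cast. */
    enum GameGamestate gamestate;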
You'll note that getting into the glitchy state of the game (the state
where you could play the game after it had hardreset() called on it)
required the player to quit to menu with ingame_titlemode set to true.
Well, quitting to menu calls hardreset(). So if ingame_titlemode gets
reset in hardreset(), then you can no longer preserve it that way. This
is a bit overkill, but I'm just taking precautions.
The game will now assert if the main menu is created while
ingame_titlemode is true, or if we attempt to load into a mode while
it's true. And if assertions are disabled, then it simply refuses to do
it anyway.
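Something along these lines (sketch; exact placement and names aside):

    SDL_assert(!game.ingame_titlemode);
    if (game.ingame_titlemode)
    {
        /* Assertions compiled out (release build): refuse to do it
         * anyway. */
        return;
    }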
I don't think there's any way to get a glitched ingame_titlemode again,
ever since I removed save data deletion taking you back to the main
menu. But I've had enough bugs with the fact that we more-or-less use
the same state for main menu options and in-game options, and that
glitched ingame_titlemode bug DID just happen, so I'm taking
precautions.
The next commit will add logic that more-or-less bails out of the whole
block if ingame_titlemode is set, and instead of adding another layer
of indentation, I will just pull this into its own function so we can
use a return statement.
While I was testing deleting data while you were in-game, I noticed that
deleting data gave you all the "Win with less than X deaths" trophies,
even if you never got any of them before deleting data. Well, it turns
out that if you have a best game death count of 0, then you win every
trophy, and if you have a best game death count of -1, that means you
haven't completed the game yet. Deleting your data was resetting the
count to 0 instead of -1. That reset was added in e3bfc79d4a, so at
least it's not in 2.3, but I only have myself to blame for making this
mistake. Whoops.
Going back to the main menu allowed for glitchiness to occur if you
deleted your save data while in in-game options. This meant you could
then load back in to the game, and then quit to the menu, then open the
options and then jump back in-game, exploring the state of the game
after hardreset() had been called on it. Which is: pretty glitchy.
For example, this meant having your room coordinates be 0,0 (which is
different from 100,100, which is the actual 0,0 - thanks for the
100-indexing, Terry). That caused some of the room transitions to be
disabled, because room transitions were disabled if the
game.door_up/down/left/right variables were -2 or less, and those
variables were computed from room coordinates - which meant some of
them went negative if you were at 0,0 and not 100,100. At least this
was the case until I removed those variables for, at best, doing
nothing, and at worst, being actively harmful.
Anyways, so deleting your save data now just takes you back to the
previous menu, much like deleting custom level data does. I don't know
why deleting save data put you back on the main menu in the first place.
It's not like the options menu needed to be reloaded or anything. I
checked and this was the behavior in 2.0 as well, so it was probably
added for a dumb reason.
I considered prohibiting data deletion if you were in ingame_titlemode,
but at the moment it seems to be okay (albeit weird, e.g. returning to
the menu while in the Secret Lab doesn't place your cursor on the
"play" button), and I can always add such a prohibition later if it
really causes problems. I can't think of anything bad off the top of my
head, though.
Btw thanks to Elomavi for discovering that you could do this glitch.
Okay, so, this is the elephant sprite, right?
https://i.imgur.com/dtS70zk.png
This is how it looks in the actual game, when you stitch all the rooms
together:
https://i.imgur.com/aztVnFT.png
Looks kind of messed-up, doesn't it?
Okay, so, in the bottom two rooms (11,9) and (12,9), the elephant is
placed at y-position -152. But in (11,8) and (12,8), it's placed at
y-position 96. This is despite the fact that -152 plus 240 is 88, not
96.
Similarly, in the left two rooms (11,8) and (11,9), the elephant is
placed at x-position 64, but in the right two rooms (12,8) and (12,9),
the elephant is placed at -264. This is despite the fact that 64 minus
320 is -256, not -264.
All of this stems from the calculations in Otherlevel.cpp using offsets
of -248 and -328 instead of -240 and -320.
So there's an 8-pixel offset that causes the elephant to be chopped off
when viewed with all the rooms stitched together. Simple enough to fix.
For the y-position fixes, I decremented the initial 8-pixel multiplier
as well, else the elephant would sink into the floor.
And this is what the elephant looks like now after stitching:
https://i.imgur.com/27ePLm1.png
Thanks to Tzann for pointing this out.
These warnings are kind of spammy, but they make sense in principle.
vlog_error takes a format string, so passing it an arbitrary string
(even error messages from libraries) isn't a good idea.
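The fix is the usual pattern for printf-style functions (the variable
name here is illustrative):

    /* Bad: if 'err' happens to contain a '%', vlog_error will try to
     * read varargs that were never passed. */
    vlog_error(err);

    /* Good: pass the untrusted string as an argument to a fixed format. */
    vlog_error("%s", err);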
Dvoid from Discord just reported a crash when trying to load a
custom tiles2.png that was encoded weirdly.
The problem is that we don't check the return value from LodePNG. When
decoding fails, LodePNG gives us a null pointer, and
SDL_CreateRGBSurfaceWithFormatFrom doesn't check this null pointer
either, which then propagates until we crash in SDL_ConvertSurfaceFormat
(or rather, one of its sub-functions); we would probably crash
somewhere else anyway if it continued.
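The fix is to actually check the return value and log the error -
roughly like this (a sketch; the surrounding loading code and variable
names are simplified):

    unsigned char* data;
    unsigned int width, height;
    const unsigned int error = lodepng_decode32(
        &data, &width, &height, fileIn, length
    );
    if (error != 0)
    {
        vlog_error("Could not decode %s: %s", filename, lodepng_error_text(error));
        return NULL;
    }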
After properly checking LodePNG's return value, along with printing the
error, it turns out that Dvoid's custom tiles2.png had an "invalid CRC".
I don't know what this means but it sounds worrying. `feh` can read the
file correctly but it also reports a "CRC error".
SDL_GetTicks64() is a function that got added in SDL 2.0.18; it's just
SDL_GetTicks() but without a value that wraps every ~49 days, instead
wrapping after the sun explodes and kills us all. Oh sorry, didn't mean
to get existential.
For now, put this behind an SDL_VERSION_ATLEAST guard, which will be
removed when SDL 2.0.18 officially releases and we can update to it.
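I.e. something like:

    #if SDL_VERSION_ATLEAST(2, 0, 18)
        const Uint64 ticks = SDL_GetTicks64();
    #else
        const Uint32 ticks = SDL_GetTicks();
    #endif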
My latest rebase of #624 (refactoring/splitting editor.cpp) accidentally
overwrote #787 and essentially reverted it entirely. So, add it back in.
This is the same as #787 except it uses the new names, uses SDL_INLINE
to inline the function, and uses named constants.
GCC warns on casting `void*` to function pointers. This is because the C
standard makes a clear distinction between pointers to objects (`void*`)
and pointers to functions (function pointers), and does not specify
anything related to being able to cast object pointers to function
pointers.
The warning message is factually wrong, though - it states that it is
forbidden by ISO C, when in fact it is not, and is actually just
unspecified.
We can't get rid of the cast entirely, because we need the explicit
cast (the C standard _does_ mandate an explicit cast when converting
between object pointers and function pointers), and at the end of the
day, this is simply how `SDL_LoadFunction()` works (and more
importantly, how `dlsym()` works). We have no reason to get rid of it
anyway, since it means we don't have a hard runtime dependency on Steam
(unlike some other games), and casting `void*` to function pointers
behaves well on every single platform we ship on that supports Steam.
Unfortunately, this warning seems to be a part of -Wpedantic, and
there's no way to disable this warning specifically without disabling
-Wpedantic. Luckily, I've found a workaround - just cast to `intptr_t`
before casting to the function pointer. Hopefully the compiler doesn't
get smarter in the future and this ends up breaking or anything...
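Concretely, the workaround looks something like this (the handle and
function name here are just placeholders):

    typedef void (*some_func_t)(void);

    /* Casting straight from void* trips -Wpedantic, so launder the
     * pointer through intptr_t first: */
    some_func_t func =
        (some_func_t) (intptr_t) SDL_LoadFunction(handle, "SomeFunction");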
* Add `setactivityposition(x,y)`, add new textbox color `transparent`
This commit adds a new internal command as a part of the visual activity zone changes I've been making.
This one allows the user to reposition the activity zone to anywhere on the screen.
In addition, this commit adds the textbox color `transparent`, which just sets r, g and b to 0.
rgb(0, 0, 0) normally creates the color black; however, in VVVVVV textboxes it makes their
background invisible and makes the text the off-white color which the game uses elsewhere.
* add new variables to hardreset
* Fix unwanted text centering; offset position by 16, 4
It makes sense for `setactivityposition(0, 0)` to place the activity zone in the default position,
so the x has been offset by 16, and the y has been offset by 4.
Text was being automatically centered, meaning any activity zone which wasn't centered had misplaced text.
This has been fixed by calculating the center manually, and offsetting it by the passed value.
In previous versions, the game mistakenly checked the wrong color
channel of sprites, checking the red channel instead of the alpha
channel. I abuse this in some of my levels. Then I broke it when
refactoring masks, so the game now no longer checks the red channel but
seems to check the blue channel instead. So re-fix this back to the
previous bug, and preserve that bug with a comment explaining why.
This broke when I was refactoring things earlier, because we no longer
have a direct reference to the contents array, instead using a copied
int. But we have a settile() function anyway, so why not use it?
It is impossible to get on the quicksave screen in time trials, because
Enter is always bound to restarting time trials in a time trial, and
there's no way to open the map screen otherwise.
So, I've decided to add a fun little message in case someone somehow
manages to get to this screen in a time trial.
As is typical, the code was copy-pasted to account for Flip Mode, and
then copy-pasted again to account for custom levels, leading to four
instances of the same code.
I clean this up while also improving code style. This is where the new
FLIP macro and the fixed PrintWrap help a lot - without the height
corrections, the "Game saved ok!" screen would look really wrong.
It now looks more like the FLIP macro in Render.cpp: The y-position is
simply the height of the area the object is being flipped in, minus the
y-position itself, minus the height of the object. So:
flipped_yp = constant - yp - height
This is just a mathematical simplification of the existing statement,
which is:
flipped_yp = yp + 2 * (constant/2 - yp) - height
Using algebra, the 2 distributes into the parentheses, so
flipped_yp = yp + constant - 2 * yp - height
And the two `yp`s add together, so
flipped_yp = constant - yp - height
It's more readable this way.
Also I am using a named constant instead of a hardcoded one.
Otherwise, the text will be in the wrong position compared to normal
mode.
PrintWrap is not used in Flip Mode yet, but it will be used on the map
screen in an upcoming change of mine. The FLIP macro in Render.cpp can't
help us there, since it would need to know the height of the wrapped
text at compile time, when the height is only figured out at runtime
based off of the string (or, well, right _now_ the string _is_ known,
but we are going to merge localization for 2.4, and it's better to
future-proof...), and only PrintWrap itself can figure out the height of
the text. (Or, well, I suppose you could call it from outside the
function, but that's not very separation-of-concernsy style.)
Flipping objects in Flip Mode needs to account for the heights of those
objects (that's why flipme text boxes in Flip Mode in 2.2 were
positioned wrongly).
Also, turn it into a macro instead of an inline function.
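A sketch of the shape of such a macro (the constant name and exact
details are illustrative, not the literal definition):

    /* Flip a y-position within the screen, accounting for the height of
     * the object being flipped: */
    #define FLIP(ypos, height) \
        (graphics.flipmode ? SCREEN_HEIGHT_PIXELS - (ypos) - (height) : (ypos))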
This changes the positions of all existing de-duplicated map menu text
in Flip Mode, but it'll be more correct.