If an XML tag doesn't contain anything inside, pText will be NULL. If
that happens without being checked, then NULL will be passed to
SDL_strcmp(). SDL_strcmp() will either call libc strcmp() or use its own
implementation; both implementations will still dereference the NULL
without checking it.
This is undefined behavior, so I'm fixing it. The solution is to do what
all the other XML parsing functions do, and make sure pText gets set to a
safe empty string (which is just a pointer to a null terminator) if it
happens to be NULL.
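For illustration, a minimal sketch of that guard, mirroring what the other
XML loaders do (pElem stands in for whatever TinyXML-2 element is being
read, and the value compared against is just an example):

```
const char* pText = pElem->GetText();
if (pText == NULL)
{
    /* Empty tag: point at a lone null terminator so SDL_strcmp() is safe. */
    pText = "";
}

if (SDL_strcmp(pText, "1") == 0)
{
    /* ...now safe to compare... */
}
```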
PR #279 added game.gametimer solely for the editor ghosts feature. It
seems that whoever originally wrote it (Leo for the now-dead VVVVVV:
Community Edition, I believe) forgot that the game already had its own
timer that they could use.
The game timer does increment during unfocus pause (whereas game.gametimer
didn't), but that's a separate issue, and it ought not to do that anyway.
The background would change for 1 frame before sending you back to the
pause menu or editor settings. The map.nexttowercolour() call needs to
be deferred until the end of the frame.
The new loop order introduces a glitch where the menu would display
whichever menu was saved to kludge_ingametemp for 1 frame right as the
user returned to the pause menu. This happened because the
game.returntomenu() call happens in titleinput(), which comes before
titlerender(). To fix this, we just need to defer it to the end of the
frame.
game.shouldreturntoeditor was added to fix a frame-ordering issue that
caused a bug: if you started playtesting in a room with a
horizontal/vertical warp background, and exited playtesting in a
different room that also had a horizontal/vertical warp background (but a
different one), then when you re-entered the editor, the background of
the room you exited in would slowly scroll offscreen instead of the
editor background consisting entirely of the actual background of the room.
Namely, the issue was that the game would render one more frame of
GAMEMODE after graphics.backgrounddrawn got set to false, which re-set it
to true and negated the background redraw, so the editor background would
be incorrect.
With defer callbacks, we can now just use a couple lines of code, instead
of having to add an extra kludge variable and put handling for it all
over the code (see the sketch below).
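To illustrate the idea, here's a hypothetical sketch of a defer-callback
mechanism (this is NOT the actual API from the defer-callbacks PR):
callbacks queued during input/logic run once at the very end of the frame,
after rendering, so something like map.nexttowercolour() can simply be
queued instead of being called mid-frame.

```
#include <vector>

typedef void (*defer_fn)(void);
static std::vector<defer_fn> deferred;

static void defer_callback(defer_fn fn)
{
    deferred.push_back(fn);
}

/* Called once at the very end of the frame, after rendering. */
static void run_deferred_callbacks(void)
{
    for (size_t i = 0; i < deferred.size(); ++i)
    {
        deferred[i]();
    }
    deferred.clear();
}
```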
This ensures that if the player decides to toggle Flip Mode while one of
these text boxes is up, they won't be oriented improperly. It also
de-duplicates a bunch of Flip Mode check code, which is a nice bonus.
The "Game Saved" text box, along with its associated telesave() call,
exists in both Game.cpp and Script.cpp, so one of them is the copy-paste
of the other. Unfortunately, this copy-paste resulted in an inconsistency
where the two don't check for the same things when deciding whether or
not the telesave should actually happen (this is why you don't
copy-paste, kids... it's scary!).
Either way, de-duplicating this now is less work for me later.
Every Level Complete sequence is the same copy-pasted thing, but with
minor changes. To make my work easier, I'm de-duplicating them so I have
fewer text boxes to change later, and less grind to grind.
These commented-out code blocks just get in the way of clarity when I'm
refactoring flipped textboxes created in the gamestate system. So I'm
getting rid of them. If we need them back, we always have Git history.
To do this, I've added Graphics::setbars(), to make sure
oldcutscenebarspos always gets assigned when cutscenebarspos is. This
fixes potential deltaframe rendering issues if these two mismatch.
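A minimal sketch of what such a setter looks like, assuming the two
members are plain ints (the real Graphics::setbars() may differ in
details):

```
void Graphics::setbars(const int position)
{
    /* Keep the deltaframe copy in lockstep with the live value. */
    cutscenebarspos = position;
    oldcutscenebarspos = position;
}
```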
While working on #535, I noticed that editormenuactionpress() still
didn't do the explicit void declaration. Then I ran `rg 'void.*\(\)'`
and found three other functions that I somehow missed in #628. Whoops.
Well, now they're no longer missed.
This is a small quality-of-life tweak that makes it so if you're in the
middle of editing a level, you don't have to save the level, exit to the
menu, change whatever setting you wanted, re-enter the editor, and type
in the level name, just to change one setting. It's the same convenience
as having Graphic Options and Game Options on the in-game pause menu,
just extended to the editor.
To do this, I'm reusing Game::returntopausemenu() (because all of its
callers are the same ones that would return to editor settings), renaming
it to returntoingame(), and adding a variable named ingame_editormode to
Game. When we're in the options menus but still in the editor, BOTH
ingame_titlemode and ingame_editormode will be true.
Since it only ever gets assigned from FILESYSTEM_getUserSaveDirectory(),
and that function returns a C string, and the variable is only ever read
from again, this doesn't need to be an std::string.
There's no need to create an std::string for every single element just
to see if it's a key name.
At least in libstdc++, there's a small-string optimization where short
std::strings (up to 15 characters plus the null terminator) don't
allocate on the heap, and instead use an internal 16-byte buffer directly
in the control structure of the std::string. However, it's not guaranteed
that all the element names we'll get will always fit in that buffer, and
if the std::string does end up allocating on the heap, that's an
allocation we had no reason to make; so we should just convert these
string comparisons to plain C string comparisons instead (see the sketch
below).
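A before/after sketch of the change (the tag name here is just an
example):

```
/* Before: an std::string gets constructed for every element just to
 * compare its name.
 *
 *     std::string pKey(pElem->Value());
 *     if (pKey == "sometag") { ... }
 *
 * After: compare the C string TinyXML-2 already hands us. */
const char* pKey = pElem->Value();
if (SDL_strcmp(pKey, "sometag") == 0)
{
    /* ... */
}
```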
Now that recreating the same menu keeps currentmenuoption, we can remove
all these superfluous assignments. This means repeating ourselves less;
in case the option numbers change in the future, we won't have to
remember to update these reassignments, too.
When recreating the same menu, there's basically no reason to reset the
currently-selected menu option. (Also, no need to worry about indexing
out of bounds or anything - the number gets checked while iterating over
all menu options; it's never used to actually index anything. At worst
there might be a 1-frame flicker as the bounds code in gameinput() kicks
in, but that shouldn't happen anyway.)
Apparently in C, if you have `void test();`, it's completely okay to do
`test(2);`. The function will take in the argument, but just discard it
and throw it away. It's like a trash can, and a rude one at that. If you
declare it like `void test(void);`, this is prevented.
This is not a problem in C++ - doing `void test();` and `test(2);` is
guaranteed to result in a compile error (which also means that right now,
at least in all `.cpp` files, nobody is passing arguments to a
parameterless function and having them silently thrown away).
However, we may not be using C++ in the future, so I just want to lay
down the precedent that if a function takes in no arguments, you must
explicitly declare it as such.
I would've added `-Wstrict-prototypes`, but if you compile in C++ mode it
produces an annoying warning saying the flag isn't valid for C++. So it
can be added later.
This patch restores some 2.2 behavior, fixing a regression caused by the
refactor of properly using std::vectors.
In 2.2, the game allocated 200 items in obj.entities, but used a system
where each entity had an `active` attribute to signify if the entity
actually existed or not. When dealing with entities, you would have to
check this `active` flag, or else you'd be dealing with an entity that
didn't actually exist. (By the way, what I'm saying applies to blocks
and obj.blocks as well, except for some small differing details like the
game allocating 500 block slots versus obj.entities's 200.)
As a consequence, the game had to use a separate tracking variable,
obj.nentity, because obj.entities.size() would just report 200, instead
of the actual amount of entities. Needless to say, having to check for
`active` and use `obj.nentity` is a bit error-prone, and it's messier
than simply using the std::vector the way it was intended. Also, this
resulted in a hard limit of 200 entities, which custom level makers ran
into surprisingly often.
2.3 comes along, and removes the whole system. Now, std::vectors are
properly being used, and obj.entities.size() reports the actual number
of entities in the vector; you no longer have to check for `active` when
dealing with entities of any sort.
But there was one previous behavior of 2.2 that this system kind of
forgets about - namely, the ability to have holes in between entities.
You see, when an entity got disabled in 2.2 (which just meant turning
its `active` off), the indices of all other entities stayed the same;
the index of the disabled entity simply stayed behind as a hole in the
array. But when an entity gets removed in 2.3 (prior to this patch), the
indices of every entity after it in the array get shifted down by one.
std::vector isn't really meant to contain holes.
Do the indices of entities and blocks matter? Yes; they determine the
order in which entities and blocks get evaluated (the highest index
gets evaluated first), and I had to fix some block evaluation order
stuff in previous PRs.
And in the case of entities, they matter hugely when using the
recently-discovered Arbitrary Entity Manipulation glitch (where crewmate
script commands are used on arbitrary entities by setting the `i`
attribute of `scriptclass` and passing invalid crewmate identifiers to
the commands). If you use Arbitrary Entity Manipulation after destroying
some entities, there's a chance your script won't behave the same in 2.3
as it did in 2.2.
The indices also still determine the rendering order of entities (the
highest index gets drawn first, which means the lowest index gets drawn
in front of the other entities). As an example: let's say we have the player
at 0, a gravity line at 1, and a checkpoint at 2; then we destroy the
gravity line and create a crewmate (let's do Violet).
If we're able to have holes, then after removing the gravity line, none
of the other indices shift. Then Violet will be created at index 1, and
will be drawn in front of the checkpoint.
But if we can't have holes, then removing the gravity line shifts the
checkpoint down to index 1. Then Violet is created at index 2, and gets
drawn behind the checkpoint! This is a clear illustration of how the
behavior from 2.2 gets changed.
However, I also don't want to go back to the `active` system of having
to check an attribute before operating on an entity. So... what do we
do to restore the holes?
Well, we don't need an `active` attribute, or to modify any existing code
that operates on entities. Instead, we can just set the attributes of a
removed entity so that it naturally gets ignored by everything that comes
into contact with it (see the sketch below). For entities, we set invis
to true, and size, type, and rule to -1 (the game never uses a size,
type, or rule of -1 anywhere); for blocks, we set the type to -1, and the
width and height to 0.
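As a sketch of what disabling an entity slot amounts to (the helper name
is made up and the real code may be organized differently):

```
static void disable_entity_slot(entclass& ent)
{
    ent.invis = true; /* never drawn */
    ent.size = -1;    /* no collision code uses a size of -1 */
    ent.type = -1;    /* no type check matches -1 */
    ent.rule = -1;    /* no rule check matches -1 */
}
```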
obj.entities.size() will no longer necessarily equal the amount of
entities in the room; rather, it will be the amount of entity SLOTS that
have been allocated. But nothing that uses obj.entities.size() needs to
actually know the amount of entities; it's mostly used for iterating
over every entity in the vector.
Excess entity slots get cleaned up upon every call of
mapclass::gotoroom(), which will now deallocate entity slots starting
from the end until it hits a player, at which point it will switch to
disabling entity slots instead of removing them entirely.
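Roughly, the cleanup pass looks like this (illustrative only; the real
mapclass::gotoroom() may be structured differently, and the player check
here is an assumption). It reuses the slot-disabling helper sketched
earlier:

```
bool seen_player = false;
for (int i = (int) obj.entities.size() - 1; i >= 0; --i)
{
    if (obj.entities[i].rule == 0 && obj.entities[i].type == 0)
    {
        /* Player found: everything before it must keep its index. */
        seen_player = true;
        continue;
    }

    if (seen_player)
    {
        /* Can't pop without shifting the player, so disable instead. */
        disable_entity_slot(obj.entities[i]);
    }
    else
    {
        /* Trailing non-player slot: safe to deallocate entirely. */
        obj.entities.pop_back();
    }
}
```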
The entclass::clear() and blockclass::clear() functions have been
restored because we need to call their initialization functions when
reusing a block/entity slot; it's possible to create an entity with an
invalid type number (it creates a glitchy Viridian), and without calling
the initialization function again, it would simply not create anything.
After this patch is applied, entity and block indices will be restored
to how they behaved in 2.2.
The current way "arrays" from XML files are loaded (before this commit
is applied) goes something like this:
1. Read the buffer of the contents of the tag using TinyXML-2.
2. Allocate a buffer on the heap of the same size, and copy the
existing buffer to it. (This is what the statement `std::string
TextString = pText;` does.)
3. For each delimiter in the heap-allocated buffer...
a. Allocate another buffer on the heap, and copy the characters from
the previous delimiter to the delimiter you just hit.
b. Then allocate the buffer AGAIN, to copy it into an std::vector.
4. Then re-allocate every single buffer YET AGAIN, because you need to
make a copy of the std::vector in split() to return it to the caller.
As you can see, the existing way uses a lot of memory allocations and
data marshalling, just to split some text.
The problem here is mostly that a temporary std::vector of split text
gets built before any actual useful work happens (most likely, putting
the pieces into an array or ANOTHER std::vector - and if the latter,
that's yet another memory allocation on top of the one you already did;
that final allocation is unavoidable, unlike the ones mentioned earlier,
which should be removed).
So I noticed that we're iterating over the entire string once (just to
shove its contents into a temporary std::vector), and then basically
iterating over it again - so why can't the whole thing be more immediate,
and only be iterated over once?
So that's what I've done here. I've axed the split() function (both of
them, actually), and made next_split() and next_split_s().
next_split() will take an existing string and a starting index, and it
will find the next occurrence of the given delimiter in the string. Once
it does so, it will return the length from the previous starting index,
and modify your starting index as well. The price for this immediacy is
that you're responsible for keeping the previous starting index around
in order to use the function, and for updating it after each iteration.
(By the way, next_split() doesn't use SDL_strchr(), because that can't
give us the length of the last substring, which has no delimiter after
it. We could handle that special case specifically, but it'd be uglier;
it would also mean iterating over the last substring twice, when we only
need to do it once.)
next_split_s() does the same thing as next_split(), except it will copy
the resulting substring into a buffer that you provide (along with its
size). Useful if you don't particularly care about the length of the
substring.
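To make the calling pattern concrete, here's a simplified, self-contained
reimplementation of the idea (this is NOT the project's actual
next_split(); the signature and termination handling are assumptions for
illustration):

```
#include <stdio.h>
#include <string.h>

/* Return the length of the piece starting at *start, and advance *start
 * past the delimiter (or leave it on the terminator at the very end). */
static size_t my_next_split(size_t* start, const char* str, const char delim)
{
    size_t len = 0;
    while (str[*start + len] != '\0' && str[*start + len] != delim)
    {
        len++;
    }
    *start += len + (str[*start + len] == delim ? 1 : 0);
    return len;
}

int main(void)
{
    const char* data = "12,34,56";
    const size_t total = strlen(data);
    size_t start = 0;

    while (start <= total)
    {
        /* The caller keeps the previous starting index around itself. */
        const size_t prev = start;
        const size_t len = my_next_split(&start, data, ',');
        printf("piece: %.*s\n", (int) len, data + prev);

        if (prev + len >= total)
        {
            break; /* that was the last piece */
        }
    }
    return 0;
}
```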
All callers have been updated accordingly. This new system does not make
ANY heap allocations at all; at worst, it allocates a temporary buffer
on the stack, but that's only if you use next_split_s(); plus, it'd be a
fixed-size buffer, and stack allocations are negligible anyway.
This improves performance when loading any sort of XML file, especially
custom levels - which is noticeable on my system, at least (there's less
of a freeze when I load into a custom level with lots of scripts). It
also decreases memory usage, because the heap isn't being used just to
iterate over some delimiters when XML files are loaded.
Instead of checking the length() of an std::string, just check if
pText[0] is equal to '\0'.
This will have to be done anyway, because I'm going to get rid of the
std::string allocation here, and I noticed this inefficiency in the
indentation, so I'm going to remove it.
The actual unindent will be done in the next commit.
This now means every XML array load is done with common, de-duplicated
code. The only exceptions are the special cases that don't fit the
majority pattern; the majority being a simple matter of reading an array
of integers and putting it into another array.
Seems like the only reason I hadn't caught the <customlevelscore> tag
until now is that I was focused on de-duplicating all the array loads in
Game::loadstats() and below, forgetting about
Game::loadcustomlevelstats().
In order to be able to use the LOAD_ARRAY() and LOAD_ARRAY_RENAME()
macros in Game::loadcustomlevelstats(), they have to be moved to earlier
in the file.
Valgrind reported this.
The error is that the buffer is only guaranteed by SDL_snprintf() to be
initialized up to (and including) the null terminator. Iterating over the
entire allocated buffer is bad and I should feel bad as the girl who
wrote this code; doing that reads uninitialized memory and passes it to
SDL_tolower().
As a bonus, the iterator increment is now a preincrement instead of a
postincrement.
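Here's an illustration of the shape of the fix (the function and variable
names are made up):

```
#include <SDL.h>

static void make_lowercase_name(char* out, const size_t out_size, const int num)
{
    SDL_snprintf(out, out_size, "Level%d", num);

    /* Stop at the null terminator; anything past it is uninitialized. */
    for (char* p = out; *p != '\0'; ++p)
    {
        *p = (char) SDL_tolower(*p);
    }
}
```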
This does the same thing as the last commit, but for No Death Mode
instead of Time Trials. Whenever you die in No Death Mode, or complete
it, all the relevant variables get copied to variables prefixed with
'ndmresult' that never get reset by script.hardreset(), and those
variables are what titlerender() uses, instead of the "live" ones.
This makes it so when a Time Trial gets completed, all the relevant
variables get copied onto variables prefixed with 'timetrialresult',
which never get reset by script.hardreset(). Then titlerender() will use
those variables accordingly.
There are multiple different exit paths to the main menu. In 2.2, they
all had a bunch of copy-pasted code. In 2.3 currently, most of them use
game.quittomenu(), but there are some stragglers that still use
hand-copied code.
This is a bit of a problem, because all exit paths should consistently
have FILESYSTEM_unmountassets(), as part of the 2.3 feature of per-level
custom assets. Furthermore, most (but not all) of the paths call
script.hardreset() too, and some of the stragglers don't. So something
could persist through to the title screen (like a really long flash/shake
timer), but only when exiting to the title screen through those straggler
paths.
But, actually, it seems like there's a good reason for some of those
paths to not call script.hardreset() - namely, dying in or completing No
Death Mode, and completing a Time Trial, present some information
onscreen that would get reset by script.hardreset(). So I'll fix that in
a later commit.
So what I've done for this commit is find every exit path that didn't
already use game.quittomenu(), and make them use it. As well, some of
them had special handling on top of an already-existing corresponding
entry in game.quittomenu() (but the path would take the special handling
because it never called game.quittomenu()), so I removed that special
handling too (e.g. exiting from a custom level used
returntomenu(Menu::levellist) when quittomenu() already had that same
returntomenu()).
The menu that exiting from the level editor returns to is now handled in
game.quittomenu() as well, where the map.custommode branch now also
checks for map.custommodeforreal. Unfortunately, it seems like entering
the level editor doesn't properly initialize map.custommode, so entering
the level editor now initializes map.custommode, too.
I've also taken the music.play(6) out of game.quittomenu(), because not
all exit paths immediately play Presenting VVVVVV, so all exit paths
that DO immediately play Presenting VVVVVV now have music.play(6)
special-cased for them, which is fine enough for me.
Here is the list of all exit paths to the menu:
- Exiting through the pause menu (without glitchrunner mode)
- Exiting through the pause menu (with glitchrunner mode)
- Completing a custom level
- Completing a Time Trial
- Dying in No Death Mode
- Completing No Death Mode
- Completing an Intermission replay
- Exiting from the level editor
- Completing the main game
Comments in general don't get verified by the compiler, but
commented-out code is even worse. Especially since this looks to be
outdated code.
As always, if we need some of this code, then we can just look back in
the Git history.
During 2.3 development, there's been a gradual shift to using SDL stdlib
functions instead of libc functions, but there are still some libc
functions (or the same libc function but from the STL) in the code.
Well, this patch replaces all the rest of them in one fell swoop.
SDL's stdlib can replace most of these, but its SDL_min() and SDL_max()
are inadequate - they aren't functions at all, just macros with a nasty
penchant for double evaluation. So I made my own VVV_min() and VVV_max()
functions and placed them in Maths.h instead, then replaced all the
previous usages of min(), max(), std::min(), std::max(), SDL_min(), and
SDL_max() with VVV_min() and VVV_max().
Additionally, there's no SDL_isxdigit(), so I just implemented my own
VVV_isxdigit().
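For context on why functions beat the macros: SDL_min(a, b) textually
expands both arguments, so something like SDL_min(x++, y) can end up
incrementing x twice, whereas a real function evaluates each argument
exactly once. A sketch of the int versions (the actual
VVV_min()/VVV_max() in Maths.h may be defined per type rather than
exactly like this):

```
static int VVV_min(const int a, const int b)
{
    return (a < b) ? a : b;
}

static int VVV_max(const int a, const int b)
{
    return (a > b) ? a : b;
}
```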
SDL has SDL_malloc() and SDL_free(), but they have some allocation
tracking built in, so in order to use them with LodePNG, I have to
replace the malloc() and free() that LodePNG uses. That isn't too hard; I
did it in a new file called ThirdPartyDeps.c, and LodePNG is now compiled
with the LODEPNG_NO_COMPILE_ALLOCATORS definition.
Lastly, I also refactored the awful strcpy() and strcat() usages in
PLATFORM_migrateSaveData() to use SDL_snprintf() instead. I know save
migration is getting axed in 2.4, but it still bothers me to have
something like that in the codebase.
Without further ado, here is the full list of functions that the
codebase now uses:
- SDL_strlcpy() instead of strcpy()
- SDL_strlcat() instead of strcat()
- SDL_snprintf() instead of sprintf(), strcpy(), or strcat() (see above)
- VVV_min() instead of min(), std::min(), or SDL_min()
- VVV_max() instead of max(), std::max(), or SDL_max()
- VVV_isxdigit() instead of isxdigit()
- SDL_strcmp() instead of strcmp()
- SDL_strcasecmp() instead of strcasecmp() or Win32 strcmpi()
- SDL_strstr() instead of strstr()
- SDL_strlen() instead of strlen()
- SDL_sscanf() instead of sscanf()
- SDL_getenv() instead of getenv()
- SDL_malloc() instead of malloc() (replacing in LodePNG as well)
- SDL_free() instead of free() (replacing in LodePNG as well)
This variable seems to have been intended to make sure
game.savestatsandsettings() was called at the end of the frame, or make
sure that it didn't get called more than once per frame. I don't see any
frame ordering-related reason why it needs to be called specifically at
the end of the frame (the function doesn't modify any state), so it's
more plausible that it was added to make sure it didn't get called more
than once per frame.
However, upon further analysis, none of the code paths where
game.savemystats is used ever calls or sets game.savemystats more than
once, and a majority of the code directly calls
game.savestatsandsettings() anyway, so there's no reason for this
variable to exist. If we ever need to make sure it doesn't get called
more than once, and there's no way to change the code paths around to
prevent it otherwise, we can use the defer callbacks system that I added
to #535, when it gets merged.
These variables basically serve no purpose. map.customx and map.customy
are clearly never used. map.finalx and map.finaly, on the other hand,
are basically always game.roomx and game.roomy respectively if
map.finalmode is on, and if it's off, then they don't matter.
Also, there are some weird and redundant variable assignments going on
with these; most notably in map.gotoroom(), where rx/ry (local
variables) get assigned to finalx/finaly, then finalx/finaly get
assigned to game.roomx/game.roomy, then finalx/finaly get assigned to
rx/ry. If finalx/finaly made a difference, then there'd be no need to
assign finalx/finaly back to rx/ry. So it makes the code clearer to
remove these weird bits of code.
This patch cleans up unnecessary exports from header files (there were
only a few), and adds the static keyword to all symbols that aren't
exported and are specific to a file. This helps the linker out by not
doing any unnecessary work, speeding it up and avoiding silent symbol
conflicts, where two symbols with the same name (and, in C++, the same
type/signature) would otherwise quietly be resolved as okay by the
linker.
When gamemode(teleporter) gets run in a script, it brings up a read-only
version of the teleporter screen, intended only for displaying rooms on
the minimap.
However, ever since 2.3 allowed bringing up the map screen during
cutscenes (in order to prevent softlocks), bringing up the map screen
during this mode would (1) do an unnecessary animation of suddenly
switching back to the game and bringing up the menu screen again (even
though the menu screen is already up), and (2) let you close the menu
entirely and go back to GAMEMODE, thus
unintentionally closing the teleporter screen and kind of ruining the
cutscene.
To fix this, when you bring up the map screen, it will instead instantly
transition to the map screen. And when you bring it down, it will also
instantly transition back to the teleporter screen.
But that's not all. The previous behavior was actually kind of a nice
failsafe, in that if you somehow got stuck in a state where a script ran
gamemode(teleporter), but stopped running before it could take you out
of that mode by running gamemode(game), then you could return to
GAMEMODE yourself by bringing up the map screen and then bringing it
back down. So I've made sure to keep that failsafe behavior, only as
long as there isn't a script running.
When bringing up the map screen, the game does a small menu animation
where the menu comes in from the bottom. The code to calculate the menu
offset is copy-pasted everywhere, so I thought I'd de-duplicate it to
make my life easier when working with it. I also included the
game.gamestate assignment in the de-duplicated function, so it would be
easier for a future bugfix.
At the same time, I'm also removing all the BlitSurfaceStandard()s that
copied menubuffer to backBuffer. The red flag is that this blit happened
for every single entry point to MAPMODE and TELEPORTERMODE, except for
the script command gamemode(teleporter). Pressing Enter to bring up the
map screen, pressing Enter to quit the Super Gravitron, pressing Esc to
bring up the pause screen, and pressing Enter to bring up the teleporter
screen all do this blit, so if this blit was there to fix a bug, then
there's a bug with using the script command gamemode(teleporter)... but,
as far as I can tell, there isn't.
That's because the blit basically does nothing. All the blit does is
copy menubuffer onto backBuffer. Then the next thing that happens is
that either maprender() or teleporterrender() will be called, and the
first thing that those functions will always do is fill backBuffer with
solid black, completely overriding the previous blit. So that's why
removing this blit won't have any effect, and it can be safely removed
for code clarity.
To fix this bug, all we have to do is just pass the existing
ScreenSettings* that we have in loadstats() to savestats(), and in
loadsettings() to savesettings().
Fixes #556. Depends on #558.
Another step toward fixing bug #556 is to allow Game::savestats() to
accept a pointer to an existing ScreenSettings struct. This entails
refactoring Game::savesettings() and Game::serializesettings() to accept
the pointer as well, along with adding Screen::GetSettings() so the
settings of the current Screen can be easily grabbed.
In order to be able to fix bug #556, I'm planning on adding
ScreenSettings* to the settings.vvv write function. However, that entails
adding another argument to Game::savesettings(), which is going to be
really messy given the default argument of Game::savestats(). That,
combined with the fact that the code comment at the site of the
implementation of Game::savestats() is wrong (!!!), leads me to believe
that using default function arguments here isn't worth it.
Instead, what I've done is made it so callers are explicit about whether
or not they're calling savestats(), savesettings(), or both at the same
time. If they are calling both at the same time, then they will be using
a new function named savestatsandsettings().
In short, these are the interface changes:
* bool Game::savestats(bool) has been removed
* bool Game::savestatsandsettings() has been added
* void Game::savestats_menu() has been renamed to
void Game::savestatsandsettings_menu()
* All previous callers of bool Game::savestats() are now using bool
Game::savestatsandsettings()
* The one caller of bool Game::savestats(bool) is now using bool
Game::savestats()
`success = success && savesettings();` is now changed to
`success &= savesettings();`. It's bitwise, and I think C++ should have
had a &&= for completeness, but it shouldn't matter here.
Changing settings would, most of the time, attempt to save unlock.vvv and
now also settings.vvv, but there would be no feedback on whether the
files had been saved successfully or not. Now, if saving fails when changing
settings in the menu, a warning message will be shown. The setting will
still be applied of course, but the user will be informed it couldn't
be saved. This message can be silenced until the game is restarted,
because I can imagine this could get very annoying when someone already
knows their settings aren't writeable.
Also, some options didn't even save settings in the first place. These
are turning off invincibility, and by coincidence precisely all the
options in the advanced options menu. I made sure these options now do
so.
It wasn't a direct duplicate of key.sensitivity, but it was still
basically the same thing. Although, to be fair, at least the case-switch
conversion didn't get duplicated everywhere, unlike with game.slowdown.
So now key.sensitivity functions the same as game.controllerSensitivity,
and it only gets converted to its actual value whenever a joystick input
happens in key.Poll(), unlike previously where it got converted every
single frame on the title screen (there was even a comment that said
"TODO bit wasteful doing this every poll").
game.gameframerate seems to exist for converting the value of
game.slowdown into an actual timestep value, when really the timestep
value should just use game.slowdown directly with a fast lookup table.
Otherwise, there's a bunch of duplicated game.slowdown case-switches
everywhere, which adds up to a large, annoying pile should the values be
changed in the future. But now the duplicate variable has been removed,
and with it, all the copy-pasted case-switches.
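As an illustration of the lookup idea (the speed values and millisecond
timesteps here are placeholders, not copied from the real table):

```
static int timestep_from_slowdown(const int slowdown)
{
    switch (slowdown)
    {
    case 30: return 34; /* 100% speed */
    case 24: return 41; /* 80% speed */
    case 18: return 55; /* 60% speed */
    case 12: return 83; /* 40% speed */
    default: return 34;
    }
}
```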
Also, the game speed text rendering in Menu::accessibility and
Menu::setslowdown has been factored out to a function and de-duplicated
as well.
There were some duplicate Screen configuration variables that were on
Game, when there didn't need to be.
- game.fullScreenEffect_badSignal is a duplicate of
graphics.screenbuffer->badSignalEffect
- game.fullscreen is a duplicate of !graphics.screenbuffer->isWindowed
- game.stretchMode is a duplicate of graphics.screenbuffer->stretchMode
- game.useLinearFilter is a duplicate of
graphics.screenbuffer->isFiltered
These duplicate variables have been removed now.
I wrapped the ScreenSettings handling in main() in its own block (hence
the indentation) so the local doesn't live for the entirety of main(),
which is the entirety of the program.
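A sketch of the scoping trick (the call names are hypothetical):

```
int main(int argc, char* argv[])
{
    /* ...early setup... */

    {
        ScreenSettings screen_settings;
        load_screen_settings(&screen_settings); /* hypothetical */
        init_screen(screen_settings);           /* hypothetical */
    } /* screen_settings stops existing here, not at the end of main() */

    /* ...the rest of main(), i.e. the rest of the program... */
    return 0;
}
```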
Assigning these variables has been wholly unnecessary ever since #522 got
merged, and in fact setting graphics.backgrounddrawn to false actually
causes the warp background to "skip" when the map screen gets closed. So
this fixes that bug, too.
This kludge variable was used to re-set the warp background after coming
back from the in-game settings menus. But since #522 got merged, it's no
longer necessary.