Previously, setting the actual volume of the music was handled all over
the place. That wasn't terrible on its own, but when I added the ability
to press N to mute the music specifically, I should have introduced a
volume variable somewhere that the game looks at if the music is
unmuted, setting the actual volume to 0 otherwise. Not doing that
resulted in things like #400 and #505, and in having to add a bunch of
special-cased checks like game.musicmuted and game.completestop. So
instead of adding yet more special-case code, just make one central
place where Mix_VolumeMusic() actually gets called; if some piece of
code wants to set the music volume, it can set music.musicVolume. The
music handling logic in main.cpp then gets the final say on whether to
listen to music.musicVolume, or to mute the music entirely.
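A minimal sketch of that flow (only Mix_VolumeMusic() and
music.musicVolume are named above; names like game.muted are
assumptions):

```
/* Anywhere that wants to change the volume: */
music.musicVolume = MIX_MAX_VOLUME;

/* The single place that actually touches SDL_mixer, in main.cpp: */
Mix_VolumeMusic(game.muted ? 0 : music.musicVolume);
```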
This is how the music handling code should have been from the start
(when pressing N to mute the music was added).
Fixes #505.
After looking at pull request #446, I got a bit annoyed that we have TWO
variables, key.textentrymode and ed.textentry, that we rolled ourselves
to track the state of something SDL already provides a function to
easily query: SDL_IsTextInputActive(). We don't need either of these two
variables, and we shouldn't have them.
So that's what I do in this patch. Both variables have been axed in
favor of using this function, and I just made a wrapper out of it, named
key.textentry().
For bonus points, this gets rid of the ugly NO_CUSTOM_LEVELS and
NO_EDITOR ifdef in main.cpp, since text entry is enabled when entering
the script list and disabled when exiting it. This makes the code there
easier to read, too.
Furthermore, apparently key.textentrymode was initialized to *true*
instead of false... for whatever reason. But that's gone now, too.
Now, you'd think there wouldn't be any downside to using
SDL_IsTextInputActive(). After all, it's a function that SDL itself
provides, right?
Wrong. For whatever reason, it seems like text input is active *from the
start of the program*, meaning that what would happen is I would go into
the editor, and find that I can't move around nor place tiles nor
anything else. Then I would press Esc, and then suddenly become able to
do those things I wanted to do before.
I have no idea why the above happens, but all I can do is insert an
SDL_StopTextInput() immediately after the SDL_Init() in main.cpp. Of
course, I surround it with an SDL_IsTextInputActive() check to make sure
I don't do anything extraneous by stopping text input when it's already
stopped.
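Both pieces, sketched out (the wrapper's enclosing class is an
assumption; the SDL calls are the real SDL2 API):

```
bool KeyPoll::textentry(void)
{
    return SDL_IsTextInputActive() == SDL_TRUE;
}

/* Immediately after SDL_Init() in main.cpp: */
if (SDL_IsTextInputActive() == SDL_TRUE)
{
    SDL_StopTextInput();
}
```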
So, earlier in the development of 2.0, Simon Roth (I presume)
encountered a problem: oh no, all my backgrounds aren't appearing! And
this is because my foregroundBuffer, which contains all the drawn tiles,
is drawing complete black over them!
So he came up with a solution that seems ingenious, but is actually
really, really hacky and 100% NOT the proper solution: just take the
foregroundBuffer, iterate over each pixel, and DON'T draw any pixel
that's 0xDEADBEEF. 0xDEADBEEF is a special signal meaning "don't draw
this pixel". It is called a 'key'.
Unfortunately, this causes a bug where translucent pixels on tiles
(pixels between 0% and 100% opacity) don't get drawn correctly; they get
drawn against this weird blue color.
Now, in #103, I came across this weird constant and decided "hey, this
looks awfully like that weird blue color I came across; maybe if I set
it to 0x00000000, i.e. completely transparent black, the issue will be
fixed". And it DID appear to be fixed. However, I didn't look too
closely, nor did I test it that much, and all that ended up doing was
drawing the pixels against black, which more subtly disguised the
problem with translucent pixels.
So, after some investigation, I noticed that BlitSurfaceColoured() was
drawing translucent pixels just fine, and I thought at the time that
there was something wrong with BlitSurfaceStandard(), or something.
Later on, I realized that all drawn tiles were passing through this
weird OverlaySurfaceKeyed() function, and that removing it in favor of a
straight SDL_BlitSurface() reproduced the bug I mentioned above: oh no,
all the backgrounds don't show up, because my foregroundBuffer is
drawing pure black over them!
Well... just... set the proper blend mode for foregroundBuffer. It
should be SDL_BLENDMODE_BLEND instead of SDL_BLENDMODE_NONE. Then you
don't have to worry about your transparency at all; if you do it right,
you won't have to resort to this hacky color-keying business.
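The gist of the fix, as a sketch (backBuffer as the destination is an
assumption; the SDL calls are real):

```
SDL_SetSurfaceBlendMode(foregroundBuffer, SDL_BLENDMODE_BLEND);

/* After which drawn tiles composite correctly with a plain blit,
   no 0xDEADBEEF color key needed: */
SDL_BlitSurface(foregroundBuffer, NULL, backBuffer, NULL);
```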
*sigh*
For some reason, the cursor would either be disabled and then re-enabled
if you switched to windowed mode, or be left always enabled if you
switched to fullscreen mode. This only happened when you toggled
fullscreen using the Alt+Enter, Alt+F, or F11 keybinds; the fullscreen
option in graphic options doesn't have this problem.
This cursor toggling business seems like an arcane incantation from the
days of SDL 1.2 that is no longer necessary with SDL2. After some
testing, it seems like removing these indecipherable runes doesn't cause
any harm, so I'm removing them.
Fixes #371.
Achievements could previously be unlocked in custom levels and in make-and-play builds, so this adds the wrapper function `game.unlockAchievement`, which calls `NETWORK_unlockAchievement` only if `map.custommode` is false.
Also, this function and `Game::unlocknum` have both been `ifdef`ed to be empty if MAKEANDPLAY is defined.
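A sketch of the wrapper (the exact signature is an assumption):

```
void Game::unlockAchievement(const char* name)
{
#if !defined(MAKEANDPLAY)
    if (!map.custommode)
    {
        NETWORK_unlockAchievement(name);
    }
#endif
}
```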
Okay, so basically here's the include layout that this game now
consistently uses:
[The "main" header file, if any (e.g. Graphics.h for Graphics.cpp)]
[blank line]
[All system includes, such as tinyxml2/physfs/utfcpp/SDL]
[blank line]
[All project includes, such as Game.h/Entity.h/etc.]
And if applicable, another blank line, and then some special-case
include screwiness (take a look at editor.cpp or FileSystemUtils.cpp,
for example; they have ifdefs and defines mixed in with their includes).
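For example, the top of Graphics.cpp would look something like this (the
exact includes here are illustrative):

```
#include "Graphics.h"

#include <SDL.h>
#include <tinyxml2.h>

#include "Entity.h"
#include "Game.h"
#include "Map.h"
```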
Including a header file inside another header file means a bunch of
files are going to be unnecessarily recompiled whenever that inner
header file is changed. So I minimized the number of header files
included by other header files, keeping only the ones that are necessary
(system includes don't count; I'm only talking about includes from
within this project). All other includes now live in the .cpp files
themselves.
This also minimizes problems such as a NO_CUSTOM_LEVELS build failing
because some file depended on an include that got pulled in through
editor.h, which is another benefit of removing unnecessary includes from
header files.
Whoops.
Noticed this earlier when I was merging upstream back into VCE, and
interestingly enough, it doesn't look like cppcheck warns about
undeffing a non-existent define.
This patch optimizes the loop used to limit the framerate in 30-FPS-only
mode so that it uses SDL_Delay() instead of an accumulator. This means
the game takes up less CPU power in 30-FPS-only mode. It also means the
game loop code has been simplified: there are only two while-loops, and
only two places where game.over30mode is checked, leading to
easier-to-understand logic.
Using an accumulator here would essentially mean busywaiting until the
34-millisecond timer was up. (The following is just what leo60228 told
me.) Busywaiting is bad because it's inefficient: the operating system
assumes that if you're busywaiting, you're performing a complex
calculation, and schedules your process accordingly. That's why sleeping
was invented, so you can wait out a timer without taking up unnecessary
CPU time.
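The shape of the change, sketched (not the exact game-loop code):

```
const Uint32 start = SDL_GetTicks();

/* ...run one fixed 30-FPS frame... */

const Uint32 elapsed = SDL_GetTicks() - start;
if (elapsed < 34)
{
    SDL_Delay(34 - elapsed); /* sleep off the rest of the frame budget */
}
```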
There's a bug where playtesting from Ved doesn't properly play the music
of the level, through no fault of Ved's.
This was because the music was being faded out by
scriptclass::startgamemode() case 23 after main() called music.play().
To fix this, just call music.play() when all the other variables are
being set in Game::customloadquick().
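A sketch of the fix (the song variable is an assumption; music.play()
and Game::customloadquick() are named above):

```
/* In Game::customloadquick(), alongside the other loaded state: */
if (song != -1)
{
    music.play(song);
}
```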
The problem we're running into is entirely contained in the Screen: we
need to either decouple graphics context init from Screen::init, or take
the screenbuffer interaction out of loadstats (which I'm more in favor
of, since we can just pull the config values and pass them to
Screen::init later).
Now that there's a mini menu in MAPMODE, it's a bit annoying to have to
deal with the slowed-down timestep when pressing left/right/ACTION
inside it, especially since going to an options menu restores the
timestep to normal (because that's in TITLEMODE). I also removed the
slowdown from TELEPORTERMODE for consistency.
The Alt+Enter Glitch is a funny glitch where, if you press any
toggle-fullscreen keybind during a cutscene, Viridian will stop moving
(if they're being moved by a walk()) and ACTION will start being held
down for them, meaning in most cases you can interrupt a walk and flip
at the same time.
This can obviously break cutscenes when a casual player just wants to
toggle fullscreen, so I'm fixing it. But it will stay unfixed in
glitchrunner mode, in case you want to glitch out custom levels (I know
I do).
More information on the Alt+Enter Glitch is available here:
https://gitgud.io/infoteddy/vvvvvv-knowledge/blob/master/bugs/bugs.md#the-altenter-glitch
(The page is a bit outdated, some bugs have been fixed in 2.3 already.)
This patch is very kludge-y, but at least it fixes a semi-noticeable
visual issue in custom levels that use internal scripts to spawn
entities when loading a room.
Basically, the problem here is that when the game checks for script
boxes and sets newscript, newscript has already been processed for that
frame; and when the game does load a script, script.run() has already
been processed for that frame.
That issue can be fixed, but it turns out that due to my over-30-FPS
game loop changes, there's now ANOTHER visible frame of delay between
room load and entity creation, because the render function gets called
in between the script being loaded at the end of gamelogic() and the
script actually getting run.
So... I have to temporarily move script.run() to the end of gamelogic()
(in map.twoframedelayfix()), and make sure it doesn't get run the next
frame, because double-evaluations are bad. To do that, I have to
introduce the kludge variable script.dontrunnextframe, which does
exactly as it says.
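The kludge, sketched out (script.dontrunnextframe,
map.twoframedelayfix(), and newscript are named above; the surrounding
details are assumptions):

```
void mapclass::twoframedelayfix(void)
{
    /* Called at the end of gamelogic() when a room-load script box
       just queued a script: */
    script.load(game.newscript);
    script.run();
    script.dontrunnextframe = true;
}

/* And at the normal script.run() call site: */
if (script.dontrunnextframe)
{
    script.dontrunnextframe = false; /* skip exactly one frame */
}
else if (script.running)
{
    script.run();
}
```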
And with all that work, the two-frame (now three-frame) delay is fixed.
In my previous PR, I wrongly assumed that I could just replace the `i`
handling code of those options with an `i++;` at the beginning, and thus
put all the blocks' `i++;` into ARG_INNER(). Well, I was wrong, because
the code was written the original way for a reason, namely that we still
need `i` to point at the -playx/y/rx/ry/gc/music argv entry so we can
re-compare which argument led us into this code block.
Just to be helpful if someone has a failing FILESYSTEM_init(), but
doesn't know that's their issue and keeps wondering why VVVVVV just
exits with code 1.
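Roughly the idea (the message text and the FILESYSTEM_init() signature
here are assumptions):

```
if (!FILESYSTEM_init(argv[0], baseDir, assetsPath))
{
    puts("Unable to initialize filesystem!");
    return 1;
}
```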
The command-line argument parsing code has a lot of copy-paste. This
copy-paste would get even worse if I added safety checks to make sure
you couldn't index argv out-of-bounds by having an argument like
`-renderer` without having anything after it, i.e. you'd be doing the
command `./VVVVVV -renderer`.
Previously, only the playtest arguments (apart from the recently-added
`playassets`) had this safety check, but the message printed whenever
the safety check failed was always "-playing option requires one
argument", regardless of which argument actually failed to parse. So I
fixed it so that every argument prints the correct corresponding failure
message instead.
Also, `strcmp(argv[i], <name>) == 0` is really kind of noisy, even if
you understand perfectly well what it does. So I refactored it with a
bunch of macros: ARG() just does the strcmp() char* comparison, and
ARG_INNER() does the safety check, printing a message and returning 1 if
the safety check fails.
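Sketches of the two macros (the real definitions may differ in detail):

```
#define ARG(name) (strcmp(argv[i], name) == 0)

#define ARG_INNER(code) \
    if (i + 1 < argc) \
    { \
        code \
    } \
    else \
    { \
        printf("%s option requires one argument.\n", argv[i]); \
        return 1; \
    }

/* Which turns each option into something like: */
if (ARG("-renderer"))
{
    ARG_INNER({
        i++;
        SDL_SetHintWithPriority(SDL_HINT_RENDER_DRIVER, argv[i], SDL_HINT_OVERRIDE);
    })
}
```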
This is used when you're loading a level file from STDIN: the game needs
to know the actual level assets directory you're referring to, since
when it gets the level from STDIN, it doesn't know the level's actual
filename.
Fixes #309.
Ugh, this is terrible and stupid and I hate myself for it.
Anyway, since the SDL2 VSync hint only applies when the renderer is
created, we have to re-create the renderer whenever VSync is toggled.
However, this also means we need to re-create m_screenTexture as well,
AND call ResizeScreen() after that or else the letterbox/integer modes
won't be applied.
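The recreation dance, sketched (m_screenTexture and ResizeScreen() are
named above; every other name here is an assumption):

```
SDL_SetHintWithPriority(SDL_HINT_RENDER_VSYNC, vsync ? "1" : "0",
                        SDL_HINT_OVERRIDE);

SDL_DestroyTexture(m_screenTexture);
SDL_DestroyRenderer(m_renderer);

m_renderer = SDL_CreateRenderer(m_window, -1, 0);
m_screenTexture = SDL_CreateTexture(m_renderer, SDL_PIXELFORMAT_ARGB8888,
                                    SDL_TEXTUREACCESS_STREAMING, 320, 240);

ResizeScreen(windowWidth, windowHeight); /* re-apply letterbox/integer modes */
```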
Unfortunately, this means that in main(), gameScreen.init() will create
a renderer only for it to be destroyed later by graphics.processVsync().
There's not much we can do about this. Fixing it would require putting
graphics.processVsync() before gameScreen.init(). However, in order to
know whether the user has VSync set, we would have to call
game.loadstats() first. But wait, we can't, because game.loadstats()
mutates gameScreen! Gahhhhhh!!!!
@leo60228 suggested fixing that problem
(https://github.com/TerryCavanagh/VVVVVV/pull/220#issuecomment-624217939)
by adding NULL checks to game.loadstats() and then calling it twice, but
then you're trading wastefully creating a renderer only to destroy it
for wastefully opening and parsing unlock.vvv twice instead of once.
Either way, you're doing something twice and wasting work.
Otherwise a NO_CUSTOM_LEVELS build would fail. I also got rid of the
'graphics.flipmode = false;' in the fixed loop, because I don't think it
does anything.
This is needed for the next step. I want to put all the loop stuff into
its own functions so the code isn't one huge blob, but to do that I'll
need to make 'time' a global variable, and I can't, because 'time' is
apparently already a function; you're only allowed to shadow it with a
variable when you're already inside a function.
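What goes wrong, in miniature (a hedged sketch of the conflict):

```
#include <ctime>

float time = 0.0f; /* error: conflicts with ::time() at file scope */

void update(void)
{
    float time = 0.0f; /* fine: shadowing inside a function is allowed */
}
```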
This is to make sure no lerping occurs in 30-FPS mode; otherwise things
might look weird. A good case this fixes is how entities look during a
double-gotoroom (they should be completely frozen in 30-FPS mode).
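Sketch of the idea (names other than game.over30mode are hypothetical):
collapse the interpolation factor in 30-FPS mode so entities render at
their true positions.

```
const float alpha = game.over30mode ? lerp_factor : 1.0f;
const int drawx = oldxp + (int) ((xp - oldxp) * alpha);
const int drawy = oldyp + (int) ((yp - oldyp) * alpha);
```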
There is now an option in "graphic options" named "toggle fps", which
toggles whether the game visually runs at 1000/34 (about 29) FPS or
above that. It is off by default.
I've had to put the entire game loop in yet another set of braces; I'll
indent it in the next commit.
This is because otherwise, on my Linux system at least, the game will
take up a lot of CPU that it doesn't really need to (I only have a 60 Hz
monitor).
On Windows, it looks like Windows already enforces V-sync for
applications anyway, unless they have exclusive fullscreen control.
Linux doesn't enforce V-sync on apps and lets them take up as much CPU
as they want, so I'm putting this here to limit the framerate.
The game is already actually 30 FPS anyway; the nice smooth FPS is just
visual. V-sync won't introduce any more "input lag" than already exists.
As much as it looks cool to have a slowly-scrolling background on the
title screen, it's quite annoying that slowmode applying on the title
screen means that your keypresses are less responsive.
To fix this, I draw another row/column of incoming textures. But of
course, I have to extend the size of the towerbuffer; otherwise the
incoming textures would just be cut off.
This has to be done in order to fix rendering when on a conveyor or
moving platform while actively moving with or against it. I'm pretty
sure this shouldn't break anything; oldxp/oldyp is mostly visual, after
all (and by the time it's used for gravity line collision checking,
updateentitylogic() would've already gotten around to it anyway).
Incidentally, this also fixes a jitter that would occur if you were
moving at the time you died or collected a trinket or custom crewmate,
due to the game temporarily freezing and doing either deathsequence or
completestop.
Otherwise the screen will shake too fast for my liking.
Also, I'm planning to add an FPS limiting option later (because right
now, uncapping the FPS is pretty wasteful and eats up lots of resources,
especially since I only have a 60 Hz monitor), and it'd feel weird if
screen shaking updated every delta timestep.