Dan Olson is the brilliant documentarian behind Folding Ideas. He’s covered a range of interesting topics, including Decentraland, gamer culture, NFTs, financial scammers and Minecraft.
James Rolfe plays the Angry Video Game Nerd, the title character of that long-running game commentary and comedy series, and to some degree he is the Nerd, even though the character doesn’t reflect his own views or personality. Although he plays a character, playing it has become his career. He does other videos too, but it’s what he’s known for, it’s his mark upon the world, and so it’s how he’ll be remembered.
Rolfe is the head of a little empire called Cinemassacre. Rolfe was really popular at one point, but over time his subject became less relevant. Time is unkind. By the time the mechanisms became available to effectively monetize what he does, his fandom had diminished, although he keeps plugging away, and it sustains him. Cinemassacre partnered with an outfit, Screenwave, to help him monetize it, which involves making five videos a week. It’s provided him with an income to support a family. That’s the same tradeoff most of us make, only he gets to do something he likes doing.
Olson’s video makes the case that James Rolfe was a victim of his own success. The Nerd character was extremely popular for a while. If you have three things you do, and one of them turns out as popular as the Nerd was, you’re naturally going to focus on that, and the others must suffer.
Dan Olson’s video is not against James Rolfe, but it’s also not in favor of him. It presents him as a hack, a jobber, a person whose ideas, when he finally has the chance to do something with them, end up half-baked, iterations on things he made as a teenager. These things are probably true, but they’re also better than what most of the rest of us get.
I have never really been a fan of the Nerd. I think that the relentless negativity has fed into a culture that tears things down. But there is effort in how the videos are made. There is a weird skill in coming up with so many distinct ways to insult things. I don’t agree with all of his videos, which don’t leave much room to consider things noble attempts or failed experiments. But they’re just games, after all.
James Rolfe isn’t a bad person, far from it. Even so, Olson’s video tells us that Rolfe has an anti-fandom, a band of people who just hate him and what he does for no reason, for the crime of having a family and doing what he needs to survive. What an awful thing to exist. To think that there’s a category of person so petty. But also, this kind of pettiness is a great invisible sea. It is one of the worst of the early internet’s many legacies, and it’s largely the result of most people having no real, I’m not going to say life, but I will say stakes in life. When people’s lives are devoid of real meaning, they find what little meaning they can, and sad to say, there are a lot of people for whom, to put it in Balatro terms, the best card they’ve been given is a five of clubs, and the rest of their deck is mostly twos and threes. (Can you tell what I’ve just come from doing?)
I’m rambling a bit, partly because Olson’s video rambles too. Dan Olson became obsessed with James Rolfe and his legacy, due in part to the similarity between their lives, and it feels like the video was released partly to exorcise James Rolfe from Olson’s mind.
I hope that Olson has successfully evicted the Nerd from his brain attic. And I hope that Rolfe continues to be successful, even if I won’t watch his videos. It’s a hard life for all of us, far too hard to spend it tearing others down.
Twinbeard is a pretty active gamedev and Youtuber. If that name sounds familiar, it’s because he made Frog Fractions and Frog Fractions II. Yes! Him.
Lately he’s been playing Mario games on the installment plan: one significant unit of the game per video. One level at a time, or one star, or shine, or whatever luminous MacGuffin the plumber is lusting after at the moment.
Sundry Sunday is our weekly feature of fun gaming culture finds and videos, from across the years and even decades.
I find myself looking back upon the Dreamcast’s library, which was outrageously experimental. Sega tried so many things to see what would stick, but sadly few of them did, even though they’re really cool games.
There’s probably no better example of this than Space Channel 5, which I sometimes like to call “How Many Ways Can We Remix Mexican Flyer?”
Mexican Flyer is a real song that existed long before Space Channel 5 and the Dreamcast. It was first published by Ken Woodman and His Piccadilly Brass in 1966 on their album That’s Nice. Here’s audio from Youtube (2 1/2 minutes):
Space Channel 5 remixes it several ways. Here’s the beginning, which is a fairly straight rendition. (That link was made with Youtube’s Clips feature, which doesn’t embed too well in WordPress.) Here’s the start of the second level (5 minutes):
Space Channel 5 isn’t a very long game, with only four levels, and although there are alternate sections of a couple of levels that unlock after finishing the game, and a subgoal of rescuing all the hostages, it doesn’t have a lot of replayability. It’s an enjoyable trip while it lasts, though.
It ends with a (mostly) a cappella version, about ten minutes long:
And here’s the music isolated, without the gameplay sounds over top of it (3 minutes):
Ken Woodman passed away in 2000, only a few years before Mexican Flyer began its video game afterlife. He also did music for a couple of British radio productions, and arranged music for Shirley Bassey, Tom Jones and Sandie Shaw.
Our friend Robin at 8-Bit Show And Tell lets us know of this cool and free Commodore 64 BASIC 2.0 extension, of a sort, called Hare Basic. It’s a successor to an earlier version called Bunny Basic. Here’s the video, 48 minutes long. My comments on it follow below, which you can read either after having watched the video or before, depending on whether you have most of an hour to spare right now.
Here are the downloads, which are hosted on the creator’s Dropbox, so availability may fluctuate.
Commodore BASIC is, in many ways, the worst of all worlds. It’s a slow interpreted language, a variant of the infamous Microsoft BASIC, and it has almost no machine-specific features, but it comes with the machine, and it’s burned into ROM. You can swap it out for extra RAM if you have a replacement OS or are running something in pure machine code.
I could go on for a long time about the problems with Commodore BASIC 2.0, a language I’m quite familiar with, having spent much of my teens programming in it. Sometimes it feels like it was designed especially to run slowly. One example: it supports floating point math, which ordinarily would be a good thing, right? Use integer math for performance, and just use floats when you need decimals, right? But no: internally, Commodore BASIC converts integer variables into floats when doing any math with them, and converts them back to store as integers when it’s done. Wilberforce Trafalgar Franklin?! Why?! It takes these unnecessary extra steps to do all arithmetic as floating point even when it doesn’t need to, and doesn’t offer a way to do performant integer math at all! Need I remind you that Microsoft BASIC is based upon software written by Bill Gates himself? I suspect that I don’t!
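To picture the wasted motion, here’s a toy sketch in Python of what the interpreter effectively does every time you add two integer variables. This is purely illustrative, not a transcription of the ROM routines:

```python
# A toy model (illustrative only) of the round trip Commodore BASIC takes for
# integer (%) variables: unpack to floating point, compute, convert back.

def cbm_int_add(a: int, b: int) -> int:
    fa = float(a)        # the stored 16-bit integer gets expanded into the floating point accumulator
    fb = float(b)        # ...and the second operand too
    result = fa + fb     # the actual arithmetic always happens in floating point
    return int(result)   # then the result is converted back to a 16-bit integer for storage
                         # (real BASIC range-checks here and throws ?ILLEGAL QUANTITY if it overflows)

print(cbm_int_add(1000, 2345))  # 3345, after conversions a pure integer add wouldn't need
```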
Hare Basic is a highly optimized subset of Commodore BASIC that can be switched on and off as needed. It has to be coded in a special way, which might throw beginners for a loop: Hare Basic can’t abide whitespace, for example, only allows variables one letter in length, has no support for modifying strings, and, unlike Commodore BASIC, can only do integer math. There are lots of other differences too, and if you want to play around with it, it’s essential that you study the manual.
But once you get used to it, it runs blazingly fast, sometimes as much as 10 times faster! And the best part is you don’t have to use it for everything. You can start out with a standard Commodore BASIC program, then enter Hare Basic mode with a USR function call. You could write your whole program in Hare if you’re up for it, or just the loops and other places where performance is necessary.
Of course, this is ultimately an enhancement for a programming language that runs on a home computer made in 1984. It’s not what one might consider of universal interest. But it might be of interest to the kinds of people who read this site. It’s interesting to me, at least. Maybe I should dust off VICE and see what I can do with it? I haven’t coded on a ’64 in nearly three decades, maybe I should get back into that….
I did a search of the blog to make sure I haven’t posted this before. I’m really an obsessive tagger, and it didn’t show up under the tag pacman, so I think it hasn’t been seen here before. Let’s fix that now!
It’s a video from Retro Game Mechanics Explained from six years ago, and it’s 11 1/2 minutes:
Here’s a terse summary of the explanation, which leaves out a lot. Like a lot of 8-bit games (the arcade version uses a Z80 processor), Pac-Man stores its level counter in one byte, making the maximum it can count to 255. Since it doesn’t use signed arithmetic, the high bit doesn’t signify a minus sign, so the value doesn’t flip to negative at 128.
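In concrete terms (a two-line Python illustration of the byte arithmetic):

```python
print((127 + 1) & 0xFF)   # 128: bit 7 is just another value bit, so nothing flips to -128
print((255 + 1) & 0xFF)   # 0: the byte simply wraps back around to zero
```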
As an optimization, Pac-Man’s code uses the depiction of the maze in video memory itself to drive the movement of both Pac-Man and the ghosts. If a spot has a maze wall tile, then Pac-Man can’t go there, and the ghosts won’t consider that direction when moving.
At the start of every level, the game performs some setup tasks. It draws the maze anew, including dots, Energizers and walls. One of these tasks is to update the fruit display in the bottom-right corner. It was a common design idiom at some arcade manufacturers at the time, especially at Namco, to depict the level number with icons in some way. Galaga shows rank insignia in the corner; Mappy has small and large balloons and mansions.
Pac-Man’s code shows the bonus fruit for each level, up to seven of them. If you finish more than seven levels, only the most recent seven are shown. If you get far enough, eventually this will be just a line of keys, the final “fruit.”
The code draws them from right to left. There are three cases (the video goes into much more detail), but generally it starts from the fruit of the round six before the current one, draws it, counts up once and moves left two tiles, draws that one, and so on.
An interesting fact about Pac-Man’s graphics hardware is that video memory doesn’t map to the screen the way you might expect! A lot of arcade games have weird screen mappings. Most consumer programmable hardware will map characters horizontally first, vertically second, like a typewriter*.
In Pac-Man, the bottom area of the screen comes first in memory, starting at memory location hex $4000 (16384 decimal), and it doesn’t go forward like an English typewriter, but is mapped right to left. The first row of 32 tiles comes at $4000, and the second row is at $4020. Then the playfield area is mapped completely differently, in vertical columns going down, starting from the top-right of its region; the next column is the one to the left of that, and so forth to the left edge of the playfield. Then comes the score area at the top of the screen, which is two final rows mapped the same way as the bottom area, right to left.
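To make that ordering a little more concrete, here’s a rough Python sketch of a tile-address calculator that follows the layout as described above. The 28×36 visible-tile dimensions, the 32-row-tall playfield, and which of the two bottom rows comes first in memory are my assumptions (drawn from the standard Pac-Man screen), so treat the exact arithmetic as illustrative rather than verified against the hardware.

```python
# Illustrative only: map a visible (row, col) tile coordinate to a video RAM
# address, following the three regions described above. Dimensions are assumed.

VRAM_BASE      = 0x4000
VISIBLE_W      = 28   # visible tile columns (assumed)
STRIP_W        = 32   # the top and bottom strips store 32 tiles per row
PLAYFIELD_ROWS = 32   # playfield occupies rows 2..33 of the 36-row screen (assumed)

def tile_address(row: int, col: int) -> int:
    """Return the VRAM address of the tile at (row, col), with (0, 0) at the top-left."""
    if row >= 34:
        # Bottom two rows come first in memory, each written right to left.
        strip_row = row - 34
        return VRAM_BASE + strip_row * STRIP_W + (VISIBLE_W - 1 - col)
    if row >= 2:
        # Playfield: vertical columns, starting with the rightmost column and
        # running top to bottom, then moving one column to the left.
        base = VRAM_BASE + 2 * STRIP_W
        column_index = VISIBLE_W - 1 - col
        return base + column_index * PLAYFIELD_ROWS + (row - 2)
    # Top two rows (the score area) come last, also right to left.
    base = VRAM_BASE + 2 * STRIP_W + PLAYFIELD_ROWS * VISIBLE_W
    return base + row * STRIP_W + (VISIBLE_W - 1 - col)

print(hex(tile_address(34, 27)))  # 0x4000: the first tile in memory is at the bottom right
print(hex(tile_address(2, 27)))   # 0x4040: the playfield starts at its top-right corner
```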
When Pac-Man’s level counter overflows, it breaks the check that limits the drawing to seven fruit, and causes it to draw 256 fruit. This is why the tops of keys are drawn beneath the upper halves of the fruit at the bottom of the split screen. It also breaks the tile lookup for the fruit.
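I won’t transcribe the Z80 here, but here’s one plausible way to picture the failure in Python. The specifics are my assumptions for illustration, not a transcription (the video has the real details): I’m assuming the count of fruit to draw is derived as “level plus one, capped at seven,” and that the drawing loop decrements an 8-bit counter and only then tests it for zero, so a starting count of zero runs 256 times.

```python
# A sketch of how an 8-bit level counter can defeat a "draw at most seven" check.
# The count logic here is assumed for illustration; the actual Z80 code is in the video.

def fruits_to_draw(level_byte: int) -> int:
    count = (level_byte + 1) & 0xFF   # 8-bit math: on level 256 the stored byte is 255, so this wraps to 0
    if count > 7:
        count = 7                     # the cap never fires, because 0 is not greater than 7
    return count

def draw_fruit_row(level_byte: int) -> int:
    drawn = 0
    counter = fruits_to_draw(level_byte)
    while True:                       # mimic a decrement-then-test loop (like the Z80's DJNZ)
        drawn += 1                    # "draw" one fruit
        counter = (counter - 1) & 0xFF
        if counter == 0:
            break
    return drawn

print(draw_fruit_row(5))    # 6 fruits early on
print(draw_fruit_row(20))   # 7 fruits, capped as intended
print(draw_fruit_row(255))  # 256 fruits: the corrupted split screen
```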
As it continues writing its wrongly sourced fruit tiles in memory, it goes back in memory each time to draw the next fruit, and after the fruit section of the display it keeps going to the left, into the area where Pac-Man’s lives are displayed, then it keeps going and overwrites half of the maze tiles. Then Pac-Man’s lives (and any empty spaces that indicate the lack of lives) are plotted, overwriting fruit after the first ones drawn and obscuring some of the memory corruption.
Since the game’s actors use that data to decide where to move, and since it also determines where dots and Energizers sit, they can move outside the bounds of the maze, and there won’t be enough dots for Pac-Man to eat to complete the level. That’s what makes it a kill screen: if Pac-Man loses a life, a few dots will get placed in the maze as the fruit are redrawn, but it’s not enough to bring the dots-eaten count to 244, which is what triggers the level clear function.
If the fruit-drawing loop didn’t stop at 256 (another artifact of using 8-bit math for the loop), it’d go on to clobber the rest of the maze, the score area at the top of the screen, then color memory (which has already been clobbered by the palette-drawing portion of the loop). Then, going by a memory map of the arcade hardware, it’d hit the game logic RAM storage, which would probably crash the game, triggering the watchdog and resetting the machine.
The visual effect of the split screen is certainly distinctive, enough that Bandai Namco has since capitalized on its appearance at least once, in the mobile (and Steam and console) game Pac-Man 256. I’ve played Pac-Man 256: it’s okay, but, eh. It’s really too F2P unlocky.
* Yes, I just used a typewriter’s operation as a metaphor for something a computer does. It didn’t feel acceptable to use another computer thing as the comparison, since ultimately the reason they do it that way is that typewriters did it that way too. I guess the fact that it’s English reading order would be a better comparison, but I’m really overthinking it at this point.
Sundry Sunday is our weekly feature of fun gaming culture finds and videos, from across the years and even decades.
The Splatoon series has a lot of great music, usually composed along the lines of squishy voices shouting gibberish, which makes sense, since the singers are squids or other forms of aquatic life.
One of the songs in Splatoon 3’s single-player campaign is Seep and Destroy, which has gotten the fan name of Bang Bing due to a specific frequently-heard vocalization within it.
nathors made an animation (2:46) that has no sea life at all, but fits really well. It imagines the song as backing a civilization of Easter Island heads, who get abducted by a planet of robots and then fight their way onto a spaceship and back home. It’s fun! It’s here:
The Youtuber: MattKC Bytes
What he did: Unexpected things to Sega’s aborted Genesis/Mega Drive add-on.
The address: here.
The length: about seven minutes.
The explanation: Did you ever play around with a 32X? Evidently not a lot of people did. It was straaaange. Unexpectedly powerful! A bit misjudged! Hosted a port of DOOM! Had a port of Virtua Racing that compares favorably with the Saturn version! Had that crazy hard-to-play Knuckles game that gave us Vector the Crocodile!
Have you ever hooked one up, though? Its hardware is odd. It’s like a completely separate console unto itself. The Mega Drive wasn’t made to support add-on processors and chips like that, so Sega used a clever solution: the 32X has its own video output, and also a video input. You plug the Genesis’ output into the 32X, and then the 32X into your TV. The 32X mixes the Genesis’ signal into its own, as if it were chromakeyed. Since the 32X cartridge supplies the program running on the Genesis as well as the one running on the 32X, and the two can talk to each other, the two processors and graphics chips should be able to stay in sync, if awkwardly.
But: because the Genesis’ video signal emerges from that console through this external wire before reentering the 32X, it’s possible to do things to it while in transit. The Genesis supplies video timing information that the 32X relies on, so you can’t get a signal from the add-on without the Genesis’ AV plugged into it, but the Genesis does produce a viewable video signal that you can see on its own.
All the details are in the video, which has been embedded below for your convenience and amusement.
By that title, I don’t mean the capabilities of the Wii title called Wii Music*. The video below, from Dublincalif, is about the properties of the Wii’s sound system itself. It’s 24 minutes, but pretty interesting for all that, and it’s presented really well. It’s a model explainer video, and a great first effort in that style from its maker!
You might think that all the music on the Wii is just streamed, either from audio tracks or files, but it isn’t. The Wii has fairly little NAND storage, and music is a major consumer of storage space, so a lot of its music is sequenced, essentially MIDI files played with sample banks, with optional effects added. The video is a great overview of its features and capabilities.
* Of random interest: Wii Music’s data is amazingly small! Of that 4.7GB DVD it resides on, it uses less than 10 MB!
The life of a farmer is a difficult one. Most people don’t know how difficult it is to succeed in agriculture. It’s not enough to harvest fields of wheat and bale hay. The first bale of hay collected in the barn, as it turns out, sets a multiplier! And any grain collected in the silo, and any hay harvested in the upper floor of a barn (but only the upper floor), is not only affected by that multiplier, but reduces the multiplier of rivals. I presume all of this is due to farm subsidies.
These are the idiosyncratic rules of Farming Simulator eSports, a popular (in some circles) gaming competition, it seems, in Germany. Teams are sponsored by agricultural equipment manufacturers, and there’s a pick/ban system in place for tractor selection. Pro gamers compete to get bales into their barns (preferably through that magic window into the upper floor!) before their opponent does, and can raise and lower a bridge on the rival farm in an effort to mess them up, all while real farmers share pints of lager and look on in confusion.
People Make Games looked into this scene and explains it over the course of half an hour, here:
Inside the Peculiar World of Farming Simulator eSports (Youtube, 32 minutes)
In 24 minutes, Bismuth on Youtube explains how Super Mario 64 was beaten without a single A button press, on actual hardware, by someone whose nom de net is Marbler. The run was performed over five days. Video of the feat isn’t up yet, but should appear on Marbler’s channel when uploaded and encoded. Here is the video embed:
I have some commentary on this. First, if you’ve been following PannenKoek all this time like I have, you know they’ve made many videos on the internals of SM64, many with the end goal of getting the A Button Challenge as low as it can go. As things stand, the run doesn’t get all the stars, and it’s been a long iterative process of routing and figuring out how to do formerly TAS-only techniques with a controller. After a long period of improvement, finally, the dam broke.
What does this mean for PannenKoek? I think their most interesting videos lately have been those that are more about Mario 64’s internals, like that terrific explainer about invisible walls. And completing every star without A button presses is still a ways off. I think they’ll be fine.
It’s only two episodes in, but this series from the Youtube channel What’s Ken Making is already really interesting, with episodes averaging around 16 minutes each. The first part is titled “The Design of a Legend,” which doesn’t really grab me much, but the second is about the main processor, “The 6502 CPU,” which Ken admits near the start isn’t exactly accurate. The Famicom/NES’s processor isn’t precisely a MOS 6502; it’s a Ricoh 2A03 in NTSC territories, and a 2A07 in PAL ones. The 2A03 is licensed from MOS, but lacks the original’s Binary-Coded Decimal mode, and includes the Famicom/NES’s sound hardware on-die.
Episode 1 (15 minutes):
Episode 2 (17 minutes):
That removed BCD feature. Why? The video notes that the circuits are right there within the chip, but have been disabled by severing five necessary traces. It also notes that the 6502’s BCD functionality was patented by MOS, and asks: was the feature disabled because of patent issues? Was Ricoh trying to avoid paying royalties?
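For anyone who hasn’t run into it, decimal mode makes the 6502’s add and subtract instructions treat each byte as two packed decimal digits, which is handy for things like on-screen scores. Here’s a quick Python illustration of the idea; this is just the concept, not the 2A03’s silicon:

```python
# Packed BCD addition: each nibble holds one decimal digit (0-9), and any digit
# that runs past 9 gets bumped by 6 to skip the hex values A-F.

def bcd_add(a: int, b: int) -> tuple[int, int]:
    """Add two packed-BCD bytes; return (result, carry_out)."""
    lo = (a & 0x0F) + (b & 0x0F)
    if lo > 9:
        lo += 6                      # adjust the low digit past 9
    hi = (a >> 4) + (b >> 4) + (lo >> 4)
    carry = 0
    if hi > 9:
        hi += 6                      # adjust the high digit, producing a carry out
        carry = 1
    return ((hi & 0x0F) << 4) | (lo & 0x0F), carry

print(hex(bcd_add(0x09, 0x01)[0]))   # 0x10: decimal 09 + 01 = 10, not 0x0A
print(hex(bcd_add(0x45, 0x38)[0]))   # 0x83: decimal 45 + 38 = 83
```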