[HN Gopher] War stories: how Crash Bandicoot hacked the original...
___________________________________________________________________
 
War stories: how Crash Bandicoot hacked the original PlayStation
(2020)
 
Author : vector_spaces
Score  : 141 points
Date   : 2021-12-19 09:09 UTC (1 day ago)
 
web link (arstechnica.com)
w3m dump (arstechnica.com)
 
| jjoonathan wrote:
| If you want to see this sort of brilliant hacking on a modern
| system, I recommend taking a closer look at Unreal Nanite. It
| isn't just auto-LOD. Oh no. That's the core, but they had to do
| this same sort of "work around the tools they are given" to make
| it actually happen. Tiny triangles chug in modern GPU hardware
| raster pipelines -- so they wrote a GPGPU software rasterizer.
| They needed more bandwidth, so they wrote a compression engine.
| Their shaders needed better scheduling, so they abused the Z-test
| to do it. It's nuts!
| 
| https://www.youtube.com/watch?v=eviSykqSUUw
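| 
| The software rasterizer part boils down to a clever 64-bit
| atomic-max trick (per the Nanite SIGGRAPH talk). A rough sketch
| of the idea in C11 atomics -- on the GPU it's a single 64-bit
| InterlockedMax per pixel, emulated here with a CAS loop:
| 
|     #include <stdatomic.h>
|     #include <stdint.h>
| 
|     #define W 640
|     #define H 480
| 
|     /* One 64-bit word per pixel: depth in the high 32 bits,
|        payload (cluster/triangle id) in the low 32 bits.  With
|        reverse-Z (bigger = closer), a plain unsigned max does
|        the depth test and the write in one atomic step. */
|     static _Atomic uint64_t visbuffer[W * H];
| 
|     static void raster_pixel(int x, int y,
|                              uint32_t depth, uint32_t tri)
|     {
|         _Atomic uint64_t *p = &visbuffer[y * W + x];
|         uint64_t packed = ((uint64_t)depth << 32) | tri;
|         uint64_t old = atomic_load(p);
|         /* CAS loop standing in for the GPU's atomic max */
|         while (packed > old &&
|                !atomic_compare_exchange_weak(p, &old, packed))
|             ;
|     }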
 
| Synaesthesia wrote:
| The original PS1 was so freaking limited. 33 MHz? 2 MB RAM? I
| used to wonder how they made games work on it. But looking at
| them today, you can see the short draw distances and the
| limitations very clearly.
 
  | skhr0680 wrote:
  | One part of it is that you know what to look for now, and
  | another part is that SDTVs connected by composite cables (or
  | worse!) hid a lot of jank.
 
  | Comevius wrote:
  | It had additional fixed-function hardware (GPU, SPU), a vector
  | coprocessor (the GTE) and a coprocessor for decoding images
  | and video (the MDEC). It was made for PS1 games; it's just
  | that Sony's C library did not initially expose the vector
  | coprocessor, and the 2 MB RAM and 1 MB VRAM required
  | developers to chunk resources from a 650 MB CD-ROM.
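  | 
  | The chunking itself is the part every PS1 game had to solve. A
  | hypothetical sketch of the usual double-buffered shape (the
  | cd_* names stand in for the SDK's CD-read calls):
  | 
  |     #define CHUNK_BYTES   65536
  |     #define SECTOR_BYTES  2048
  |     #define CHUNK_SECTORS (CHUNK_BYTES / SECTOR_BYTES)
  | 
  |     extern void cd_read_async(int sector, void *dst, int n);
  |     extern void cd_wait(void);
  |     extern void consume(void *chunk, int n);
  | 
  |     static unsigned char buf[2][CHUNK_BYTES];
  | 
  |     /* Read chunk i+1 from disc while chunk i is decompressed
  |        and drawn, ping-ponging between two buffers. */
  |     void stream(int sector, int nchunks)
  |     {
  |         int i, front = 0;
  |         cd_read_async(sector, buf[front], CHUNK_BYTES);
  |         for (i = 0; i < nchunks; i++) {
  |             cd_wait();          /* chunk i has landed */
  |             if (i + 1 < nchunks)
  |                 cd_read_async(sector + (i + 1) * CHUNK_SECTORS,
  |                               buf[front ^ 1], CHUNK_BYTES);
  |             consume(buf[front], CHUNK_BYTES);
  |             front ^= 1;
  |         }
  |     }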
 
  | fps_doug wrote:
  | It's still magic though, isn't it? The hardware was slower and
  | more limited, but at the same time the GPU wasn't even that
  | much of a GPU by today's standards, for example. You had to do
  | a lot of what modern GPUs do for you by yourself, on the CPU,
  | sacrificing even more of what little performance it had.
  | 
  | A modern dev would wonder how on earth he's gonna get Node.js
  | plus a thousand npm packages running on this so he can create a
  | WebGL context to start making a game. ;)
 
    | [deleted]
 
  | anthk wrote:
  | A 33 MHz MIPS CPU would be close to a Pentium 90 and 4 MB of
  | RAM. Not bad at all.
 
    | TapamN wrote:
    | The CPU in the PlayStation is nowhere near a 90 MHz Pentium.
    | 
    | The CPUs in the Saturn, PlayStation, and N64 were single
    | issue, and could execute at most one instruction per clock.
    | The Pentium was superscalar and could execute two
    | instructions per clock. The Pentium would likely average, for
    | integer operations, more than four or five times the
    | performance. Much higher for floating point.
    | 
    | The MIPS CPU in the PlayStation is much closer to a 486 in
    | performance-per-clock for general purpose code. A similarly
    | clocked 486 might win some general-purpose code competitions
    | due to better caches, but the PSX's CPU would smoke the 486
    | in transform and lighting, thanks to its DSP coprocessor,
    | and the MIPS would still have much better
    | performance-per-dollar than the 486.
 
    | bluedino wrote:
    | Pentium CPUs often matched the performance of SPARC/MIPS
    | chips clock-for-clock, at least in non-floating-point
    | operations. And at that time, almost no games used floating
    | point anyway.
 
    | ranma42 wrote:
    | Wouldn't x86 have the higher code density?
    | 
    | https://www.vogons.org/viewtopic.php?t=56207 "Finding an
    | IA-32 CPU most like the MIPS R3000A"
    | 
    | "a 40MHz MIPS R3000 gets 18.1 MFLOPS, while a 66MHz 486DX2
    | only gets 3.1 MFLOPS. So, the MIPS is downright Pentium-class
    | for floating-point code. For integer code, the MIPS R3000/40
    | manages a score of 22.6 in SPECINT89, while the 486DX2/66
    | gets 34.0"
    | 
    | So it's also only "Pentium-class" for floating point math,
    | not for integer.
 
      | mysterydip wrote:
      | It should, by nature of CISC. However, the other
      | coprocessors on the PlayStation meant you weren't spending
      | all your CPU cycles on rendering, which is where the
      | majority of the time went in a 3D game of that era.
 
      | hrydgard wrote:
      | The PSX CPU did not even have floating point support. It's
      | nowhere close to a Pentium 90.
 
        | flatiron wrote:
        | It's a video game system. You can make do without
        | floating point. The whole "it doesn't have FP, so wobbly
        | textures" thing is a myth.
 
        | monocasa wrote:
        | It did have a fixed point vector coprocessor instead
        | though.
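        | 
        | A sketch of the arithmetic, assuming the 4.12 fixed-point
        | layout the GTE used for rotation matrix entries (4096 =
        | 1.0), minus the saturation flags the real hardware
        | tracks:
        | 
        |     #include <stdint.h>
        | 
        |     typedef int32_t fx;     /* 4.12 fixed point */
        |     #define FX_ONE 4096     /* represents 1.0   */
        | 
        |     static fx fx_mul(fx a, fx b)
        |     {
        |         /* 32x32 -> 64-bit, drop 12 fraction bits */
        |         return (fx)(((int64_t)a * b) >> 12);
        |     }
        | 
        |     /* 3x3 matrix * vector: the core of the GTE's
        |        transform ops, done one lane at a time here. */
        |     static void xform(const fx m[3][3], const fx v[3],
        |                       fx out[3])
        |     {
        |         int i;
        |         for (i = 0; i < 3; i++)
        |             out[i] = fx_mul(m[i][0], v[0])
        |                    + fx_mul(m[i][1], v[1])
        |                    + fx_mul(m[i][2], v[2]);
        |     }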
 
  | magoghm wrote:
  | Limited? The Atari 2600 had 128 bytes of RAM.
 
    | AnIdiotOnTheNet wrote:
    | And its video hardware only worked on one scanline at a time.
 
  | Sohcahtoa82 wrote:
  | I always wonder about how the hell DOOM ran at 30 fps on a 33
  | MHz system.
  | 
  | 320 * 200 pixels @ 30 fps means they needed to calculate
  | 1,920,000 pixels per second. With a 33 MHz CPU, that's ~17
  | clock cycles per pixel, and that doesn't account for time to
  | handle game logic.
  | 
  | I know it's not a true 3D engine, and the status bar took up a
  | good amount of space, but even accounting for that, you still
  | don't get a lot of clock cycles.
 
    | tenebrisalietum wrote:
    | > 320 * 200 pixels @ 30 fps means they needed to calculate
    | 1,920,000 pixels per second. With a 33 MHz CPU, that's ~17
    | clock cycles per pixel, and that doesn't account for time to
    | handle game logic.
    | 
    | GPUs, even the primitive PSX one, will draw many pixels in a
    | single CPU cycle. I can't find information on the PSX GPU
    | fill rate, but I'm sure it's faster than 1 pixel per CPU
    | cycle.
    | 
    | DOOM didn't use the GPU to draw the walls/floors as triangles
    | but did use the GPU to draw the scene as vertical strips.
    | DOOM also relied on the 1 KB "scratchpad RAM", which was the
    | CPU's data cache mapped into the address space and not
    | subject to DMA delays.
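    | 
    | Using it is just a matter of pointing hot data at a fixed
    | address -- a sketch, assuming the mapping documented in the
    | usual homebrew references:
    | 
    |     /* 1 KB of on-chip RAM at 0x1f800000 (the D-cache,
    |        repurposed): fast access, no DMA contention. */
    |     #define SCRATCH_BASE 0x1f800000u
    |     #define SCRATCH_SIZE 1024
    | 
    |     typedef struct { short x, y, z, pad; } Vtx;
    | 
    |     /* Keep the strip currently being drawn in scratchpad
    |        so the inner loop never waits on main RAM or DMA: */
    |     #define SCRATCH_VTX ((Vtx *)SCRATCH_BASE)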
 
      | Sohcahtoa82 wrote:
      | > GPUs, even the primitive PSX one, will draw many pixels
      | in a single CPU cycle. I can't find information on the PSX
      | GPU fill rate, but I'm sure it's faster than 1 pixel per
      | CPU cycle.
      | 
      | GPUs do it via massive parallelization.
      | 
      | > DOOM didn't use the GPU to draw the walls/floors as
      | triangles but did use the GPU to draw the scene as vertical
      | strips. DOOM also relied on the 1 KB "scratchpad RAM",
      | which was the CPU's data cache mapped into the address
      | space and not subject to DMA delays.
      | 
      | In the days of DOOM, did the video card even offer any sort
      | of acceleration? I was under the impression that all the
      | texture scaling was done on the CPU. So while it rendered
      | walls as vertical strips, it would still have to
      | interpolate across the strip on the CPU.
 
    | joombaga wrote:
    | What makes Doom's 3d engine not "true"?
 
      | jsolson wrote:
      | As others noted, it only supported vertical walls and
      | horizontal floors. More specifically, though, it only
      | supported _one_ gap (e.g., a single window or door). You
      | couldn't stack floors on top of each other.
      | 
      | This allowed a single ray cast against the 2D map to
      | determine which vertical slices of wall texture to paint
      | (no pitch or roll, remember). Floors and ceilings could be
      | painted in vertical order (above / below the horizon line -
      | "floor" and "ceiling" being explicit for any given open
      | area), iirc, and walls filled in based on the ray cast from
      | the eye.
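      | 
      | The payoff is that the per-pixel inner loop is tiny --
      | roughly the shape of the engine's R_DrawColumn, sketched
      | from memory:
      | 
      |     #include <stdint.h>
      | 
      |     #define SCREENWIDTH 320
      | 
      |     /* One vertical strip of wall.  'frac' is a 16.16
      |        fixed-point texture row stepped by a constant per
      |        screen pixel, so there is no per-pixel divide. */
      |     void draw_column(uint8_t *dest,        /* top pixel  */
      |                      const uint8_t *tex,   /* 128 high   */
      |                      const uint8_t *cmap,  /* light lut  */
      |                      uint32_t frac, uint32_t step,
      |                      int count)
      |     {
      |         while (count-- > 0) {
      |             *dest = cmap[tex[(frac >> 16) & 127]];
      |             dest += SCREENWIDTH;   /* next row down */
      |             frac += step;
      |         }
      |     }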
 
      | monocasa wrote:
      | The engine is limited such that it fundamentally can't do
      | up/down rotation of the camera.
 
      | bonzini wrote:
      | It can only draw horizontal floors and vertical walls (so
      | the map is essentially 2D even though it is drawn in 3D).
      | The player also cannot tilt their head sideways.
 
    | pengaru wrote:
    | What always impressed me about Doom was the dynamic
    | shading/lighting effects they managed in a 256-color mode.
    | 
    | The "3D" part wasn't any more impressive than Wolfenstein3D
    | was, which was running on significantly slower PCs.
    | 
    | I dabbled in PC demo programming at the time; the limited
    | color palette was a major challenge for me in attempting
    | shaded texture mapping. Once we had ubiquitous
    | "TrueColor"/"DirectColor" RGB modes, the shading challenges
    | became trivial.
    | 
    | Doom made clever use of a finite colorspace. When I tried
    | recreating some of that myself, it became rather obvious why
    | Doom's palette was so muted: basically dark gray/brown/green
    | gradients they could then illuminate. But it worked very
    | well for Doom.
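    | 
    | The trick is that the "lighting" never touches RGB at all: a
    | precomputed table maps (light level, palette index) to a
    | darker palette index, so shading costs one table lookup per
    | pixel. A sketch of the idea (Doom's COLORMAP lump holds 32
    | light levels plus a couple of special maps):
    | 
    |     #define NUMLIGHTS 32
    | 
    |     /* colormaps[0] is full bright, colormaps[31] is near
    |        black; each row remaps the 256-color palette onto
    |        itself.  Long dark-to-light ramps in the palette are
    |        what make good rows possible -- hence the muted
    |        gray/brown/green gradients. */
    |     extern unsigned char colormaps[NUMLIGHTS][256];
    | 
    |     unsigned char shade(unsigned char c, int light, int d)
    |     {
    |         /* brighter sectors start brighter; fade with
    |            distance d */
    |         int lvl = (NUMLIGHTS - 1 - light) + d / 16;
    |         if (lvl < 0) lvl = 0;
    |         if (lvl > NUMLIGHTS - 1) lvl = NUMLIGHTS - 1;
    |         return colormaps[lvl][c];
    |     }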
 
    | dehrmann wrote:
    | - Textures were just skewed images
    | 
    | - Everything that wasn't a room was a 2D sprite
    | 
    | - Very limited lighting effects
    | 
    | - An effectively 2D environment
    | 
    | It's still impressive. It's just that 3D games weren't
    | "ready" until the Quake II/Unreal/PS2 era.
 
    | Narishma wrote:
    | > I always wonder about how the hell DOOM ran at 30 fps on
    | a 33 MHz system.
    | 
    | It didn't. It ran at ~25 FPS on a 66 MHz 486 and only ~15
    | FPS on a 33 MHz system. It needed a Pentium to reach the
    | engine's cap of 35 FPS (half the 70 Hz VGA refresh rate).
    | 
    | https://youtu.be/HNlcZetLzY8?t=209
 
      | Sohcahtoa82 wrote:
      | That's still only ~41 clocks per pixel. I wonder how many
      | clocks multiply and add instructions took on a 486.
 
  | pjmlp wrote:
  | The original Amiga 500 was so freaking limited. 7.16 MHz? 1 MB
  | RAM?....
  | 
  | The original C64 was so freaking limited. 1 MHz? 64 KB RAM?....
  | 
  | The original 48K Spectrum was so freaking limited. 3.5 MHz? 48
  | KB RAM?....
  | 
  | It is a matter of knowing the tools and hardware.
 
    | buggeryorkshire wrote:
    | Yeah. I programmed the Amiga and it was awesome - no
    | middleware, no crap; if somebody else's code ran better,
    | their code was better. Same with the Atari ST.
    | 
    | I kind of miss those days.
 
      | pjmlp wrote:
      | Same here, you can get some of the experience back
      | targeting something like ESP32.
 
      | dehrmann wrote:
      | > no middleware, no crap
      | 
      | Not sure if this applies to the Amiga, but one of the huge
      | drawbacks to DOS was that you wrote your own video and
      | sound drivers. It wasn't abstracted away, so you _could_
      | get better performance, but it was a lot of repeated work,
      | and it doesn't scale.
 
        | vnorilo wrote:
        | I mean, for most DOS games the "video driver" was one
        | BIOS interrupt call and then writing pixels straight to
        | the framebuffer at 0xa0000. Sound drivers were more
        | complicated, but there were a ton of libraries - it's how
        | FMOD got started!
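        | 
        | For anyone who missed that era, the whole "driver" really
        | was about this much (real-mode C, so it needs a 16-bit
        | DOS compiler like Turbo C or Open Watcom):
        | 
        |     #include <dos.h>   /* int86(), MK_FP() */
        | 
        |     /* BIOS INT 10h, AX=0x0013: 320x200 in 256 colors,
        |        framebuffer at segment 0xA000. */
        |     static void set_mode_13h(void)
        |     {
        |         union REGS r;
        |         r.x.ax = 0x0013;
        |         int86(0x10, &r, &r);
        |     }
        | 
        |     int main(void)
        |     {
        |         unsigned char far *vga =
        |             (unsigned char far *)MK_FP(0xA000, 0);
        |         int x, y;
        |         set_mode_13h();
        |         for (y = 0; y < 200; y++)
        |             for (x = 0; x < 320; x++)
        |                 vga[y * 320 + x] =
        |                     (unsigned char)(x ^ y);  /* plot */
        |         return 0;
        |     }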
 
| oakesm9 wrote:
| See also this brilliant series written by Andy Gavin, which goes
| into a lot more detail:
| https://all-things-andy-gavin.com/video-games/making-crash/
 
| bluedino wrote:
| Using the developers' networked SGI workstations to pre-compute
| the visibility of the levels was so ahead of its time.
 
| RankingMember wrote:
| I'd love to see a similar breakdown on the original Metal Gear
| Solid for the PlayStation.
 
| mulmboy wrote:
| The GameHut channel has a lot of similar videos on how old video
| games squeezed the hardware. Super interesting.
| 
| https://www.youtube.com/watch?v=JK1aV_mzH3A
| 
| https://www.youtube.com/watch?v=gRzKAe9UtoU
 
___________________________________________________________________
(page generated 2021-12-20 23:01 UTC)