The movie Toy Story needed top-of-the-line computers in 1995 to render every frame, and that took a lot of time (800,000 machine-hours according to Wikipedia).

Would it be possible to render it in real time with modern (2025) GPUs on a single home computer?
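
For scale, here is a quick back-of-the-envelope sketch (assuming the standard 24 fps film rate and Toy Story's roughly 81-minute runtime; the 800,000 machine-hours figure is taken from the post as-is):

```python
# Back-of-the-envelope numbers for the question above.
RUNTIME_MIN = 81          # Toy Story's approximate runtime in minutes
FPS = 24                  # standard film frame rate
MACHINE_HOURS = 800_000   # figure quoted from Wikipedia in the post

frames = RUNTIME_MIN * 60 * FPS                      # ~116,640 frames
hours_per_frame = MACHINE_HOURS / frames             # ~6.9 machine-hours per frame
realtime_budget_s = 1 / FPS                          # ~0.042 s per frame for real time
speedup_needed = hours_per_frame * 3600 / realtime_budget_s

print(f"{frames:,} frames, about {hours_per_frame:.1f} machine-hours each")
print(f"Real time needs a ~{speedup_needed:,.0f}x per-frame speedup over one 1995 render node")
```

So real time on one machine means closing a per-frame gap of roughly half a million times relative to a single 1995 render node, before accounting for how different today's rendering algorithms are.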

  • magic_lobster_party@fedia.io · 19 hours ago

    They used top-of-the-line hardware specialized for 3D rendering. It seems they used Silicon Graphics workstations, which cost more than $10k back in the day. Not something the typical consumer would buy. So the naive calculations are probably a bit off unless this is taken into account.

    Then they likely relied on rendering techniques optimized for the hardware they had. I suspect modern GPUs aren’t exactly compatible with these old rendering pipelines.

    So multiply by 10 or so, and I think we have a more accurate number.
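
    To put that adjustment in numbers, a minimal sketch assuming the ~10x workstation-versus-consumer-PC factor guessed above (the factor is a rough guess, not a benchmark):

```python
# The "multiply by ~10" adjustment: the 1995 render machines were
# workstation-class, so in consumer-PC terms the work was even larger.
HOURS_PER_FRAME_1995_WORKSTATION = 6.9   # from the earlier back-of-the-envelope sketch
WORKSTATION_FACTOR = 10                  # rough guess: SGI workstation vs. typical 1995 PC

hours_per_frame_consumer_pc = HOURS_PER_FRAME_1995_WORKSTATION * WORKSTATION_FACTOR
print(f"~{hours_per_frame_consumer_pc:.0f} hours per frame in 1995 consumer-PC terms")
```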

    • Buelldozer@lemmy.today · 15 hours ago (edited)

      There is no comparison between a top-of-the-line SGI workstation from 1993-1995 and a gaming rig built in 2025. The 2025 Gaming Rig is literally orders of magnitude more powerful.

      In 1993 the very best that SGI could sell you was an Onyx RealityEngine2 that cost an eye-watering $250,000 in 1993 money ($553,000 today).

      A full spec breakdown would be boring and difficult, but the best you could do in a “deskside” configuration was 4 single-core MIPS processors, either R4400 at 295 MHz or R10000 at 195 MHz, with something like 2 GB of memory. The RE2 system could maybe pull 500 megaflops.

      A 2025 Gaming Rig can have a 12-core (or more) processor clocked at 5 GHz and 64 GB of RAM. An Nvidia RTX 4060 alone is rated at roughly 15 teraflops of FP32 compute.

      A modern Gaming Rig absolutely, completely, and totally curb stomps anything SGI could build in the early-mid 90s. The performance delta is so wide it’s difficult to adequately express it. The way that Pixar got it done was by having a whole bunch of SGI systems working together, but 30 years of advancements in hardware, software, and math have nearly, if not completely, erased even that advantage.

      If a single modern gaming rig can’t replace all of the Pixar SGI stations combined, it’s got to be very close (see the rough flops comparison sketched below).
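
      As a sanity check on those numbers, a flops-only sketch (assuming the ~500 megaflops quoted above for the RE2 and Nvidia's published ~15 TFLOPS FP32 figure for the RTX 4060; raw flops ignore memory, architecture, and software differences, so this is only an order-of-magnitude comparison):

```python
# Flops-only comparison of the machines discussed above; an order-of-magnitude
# sketch, not a benchmark.
RE2_FLOPS = 500e6        # ~500 megaflops quoted for an Onyx RealityEngine2
RTX_4060_FLOPS = 15e12   # ~15 teraflops FP32, Nvidia's published RTX 4060 spec

ratio = RTX_4060_FLOPS / RE2_FLOPS
print(f"One RTX 4060 is roughly {ratio:,.0f}x an RE2 on raw flops")
```

      On paper that is about 30,000 RE2-class machines' worth of raw compute in a single consumer GPU, which is the point being made above.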

    • lime!@feddit.nu · 18 hours ago

      Remember how extreme hardware progress was back then. The devkit for the N64 was $250k in 1993, but the console was $250 in 1996.

      • magic_lobster_party@fedia.io · 17 hours ago

        Most of that cost likely wasn’t for the hardware itself, but rather Nintendo greed. It was probably paying for early access to Nintendo’s next console, and possibly for direct support from Nintendo.

        • lime!@feddit.nu · 17 hours ago

          The devkit was an SGI supercomputer, since SGI designed the CPU. There was no Nintendo hardware in it.