Ask HN: What software program was used to make 90s cutscenes?

2023-01-08 19:17:03

[Disclaimer: I did not work in the video games industry in the 90s, but I did work in VFX studios hired to do cutscenes in the mid-00s]

Before engine-compatible assets were good enough to render in-engine, cinematics that involved character animation were authored along an entirely parallel path, using the same reference art, in a character animation program. Maya and 3D Studio Max were popular at the time, and most reference art and lookdev was done with them anyway, so they were a natural choice; Softimage (RIP) was an artist favorite in offline animation too.

Sometimes making the cinematics wasn’t a core competence of the studio working on the game, so VFX or animation studios would be contracted to do this. Often this meant the studio was already set up to work for TV or Film, so it was staffed such that a 3D render would reach a compositor relatively quickly, where many manual corrections would happen in 2D space. Compositing software has largely consolidated nowadays to the few survivors (Nuke, Flame mostly) but at the time there were many, like Combustion and Henry (Quantel), and even After Effects was used a bunch in places I worked.

To this day, not all engines are good at creating cinematics: even when the engine can do the full render itself (as Unreal Engine can), some engines don't support the animation systems required for cinematics, or don't support them at a velocity artists like. In other words, the same offline software workflows are still used today.

IIRC all the character artists used one tool, 3D Studio Max, and all the background artists used Maya, or maybe it was the other way around. I just remember the cutscenes were delegated to Blizzard South, or the main studio in Irvine (they did them in house). The guy who was more or less the story evangelist down there was so enthusiastic (he came up to our studio for a couple of meetings) about things that had little connection to the gameplay but were critical to the story he was telling in the cinematics. I remember looking at the quest list and comparing it to the plot of the cinematics and not seeing a lot of similarities, but it's been 20 years, so I may be getting some things wrong here. My memory is that in early views we got of the cinematics there were also creatures that bore no resemblance to any we had in game, but that's what the guy in charge of cinematics wanted.

> 3D Studio Max, and all the background artists used Maya

What I've seen in my past glory days (years ago) was the opposite: Maya excelled at character animation (or animation in general) and the rest was done in 3DS Max.

Someone deeper into the industry might want to correct my correction though.

IME it depended a lot on individual studios. The company I worked at (Radon Labs) was a 100% Maya shop, other Berlin game companies were entirely 3DSMax based. I guess today it’s more mixed up because Maya and 3DSMax have ended up in the same company and (hopefully) can easily exchange data.

Max was released in 1996 and Maya in 1998. From my foggy recollection, it took a while for Max to become widespread so my hunch is that 3D Studio for DOS and Lightwave ruled the roost for most of the 90s.

Maya had a popular (on SGI) predecessor called Wavefront (at least the company was called that, but IIRC it was also the name of their 3D modelling/animation product). Wavefront was then acquired and merged with Alias Research (PowerAnimator) by Silicon Graphics into a company called Alias|Wavefront, which then created Maya from scratch, though Maya took a lot of inspiration from Wavefront (and PowerAnimator, I guess).

As far as I remember, 3DSMax was still an outlier throughout the 90's, but it quickly gained popularity in the PC game development scene because it ran on PC and didn't require a separate investment in SGI workstations or beefed-up Amigas (and I guess the rampant software piracy on PC in the 90's also played a role in making 3DSMax popular).

Yeah, as far as I know the Amiga was popular for US TV productions, probably because of the Video Toaster heritage, and the low resolution of TV made it possible to render out lower-resolution video, which didn't require powerful SGI render farms. I don't think Lightwave was ever popular in the PC game development scene (I remember one small Berlin game studio, though, which stuck with Lightwave for a fairly long time because they had built a pretty extensive workflow around it).

3ds Max was pretty popular before in-engine cinematics took over. I remember trying to learn it before C++, and it was harder than any programming concept… can't tell if it was my laggy head or the obtuse UI.

Interesting factoid: a codec called TrueMotion, by The Duck Corporation, encoded cutscenes for a number of games in that period, on PC as well as the Sega Saturn, Dreamcast, and 3DO. The company later changed its name to On2 and developed the VP3/VP8 generation of codecs that were ancestors of AV1 (Google's open-source codec). (Disclaimer: I was founder/CTO.)

Off the top of my head, we did intro & cutscenes for these titles/platforms (this is a small subset):

Spycraft | Activision PC

Gex | Crystal Dynamics 3DO

The Horde | Crystal Dynamics 3DO

Street Fighter II | Sega

Final Fantasy 7 | Square (PC +/or PS2??)

There were at least 20 others

FF7 used the Duck codec for the PC release, which can sometimes have a fun bug where all the videos play upside down. I assume the PS1 original uses the console's built-in Motion JPEG decoder.
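
For anyone curious about poking at those old files today: FFmpeg still ships TrueMotion 1/2 decoders, so a Duck-encoded AVI can usually be inspected and converted with standard tools. A minimal sketch (Python driving ffprobe/ffmpeg; the file name here is hypothetical, and the vflip filter is only needed if you actually hit the upside-down playback bug mentioned above):

    import json
    import subprocess

    SRC = "movie.avi"   # hypothetical Duck TrueMotion cutscene file
    FLIP = True         # set only if the video plays upside down

    # Ask ffprobe which codec the first video stream uses
    # (e.g. truemotion1 / truemotion2 for Duck-encoded AVIs).
    probe = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name", "-of", "json", SRC],
        capture_output=True, text=True, check=True)
    print("video codec:", json.loads(probe.stdout)["streams"][0]["codec_name"])

    # Re-encode to H.264, optionally flipping the picture back the right way up.
    args = ["ffmpeg", "-y", "-i", SRC]
    if FLIP:
        args += ["-vf", "vflip"]
    args += ["-c:v", "libx264", "-pix_fmt", "yuv420p", "movie_fixed.mp4"]
    subprocess.run(args, check=True)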

Same 3D modelling and animation software that's used today; most have even survived in one way or another: 3DSMax, Maya (or its predecessors Wavefront and PowerAnimator), Houdini, SoftImage, Cinema4D, Lightwave3D, Real3D, etc. In the early 90's the hardware would be either Silicon Graphics workstations or Amigas; by the end of the decade everything had moved to PC.

Cutscene creation was usually outsourced to dedicated studios because it was completely disconnected from the actual game development process.

Something to consider generally about 90’s game production is the interplay between the in-game assets, the cutscenes and the production models that were evolving at the time.

The software itself is a factor in this – and there was hardware, too, a lot of Silicon Graphics workstations were used in the mid-90’s – but the device constraints at play dictated the idea of pre-rendering 3D assets for games like the Diablos and Resident Evils, which in turn made it easier to consider reusing them for FMV. That in turn produced the “parallel pipelines” mentioned in omershapira’s comment whenever the engine was actually capable of 3D: often games were pitched to publishers by producing a cutscene trailer, and then the development team figured out what the engine tech could actually do as they went along. Because the in-game assets were still very basic and produced relatively cheaply given a design spec, this served a combination of development and marketing goals. Lara Croft got on all the magazine covers because of the high-poly CGI, not her in-game version.

(Why would publishers focus on assets? In this period, acquisition was extremely common as the industry got bigger and financing new projects got more risky, and so publishers gravitated towards a practice of shopping around for IP development and staffing at a low price. What they were banking on was not getting just one hit or a powerful engine, but a franchise that would sell well for years and experienced developers that they could put on their own projects. Likewise, studios were hungry to get publisher support and their heads often settled for an acquisition rather than shutting down. Focusing on asset production was a way of meeting in the middle, since the tech was so often an unknown; if you acquired a team that could make good assets, then plugged them in with an in-house tech team, a product could be made.)

Blizzard was a Maya (and its predecessor PowerAnimator) shop in the 90s, and the RE3 cinematics were done with Lightwave. Lightwave pivoted from general CG to more of an architectural focus in the last decade.

Lightwave was a big deal in the broadcast TV industry for many years as a result of being born out of VideoToaster.

VideoToaster didn't really survive the transition to HD due to lack of investment from NewTek, and Lightwave sort of lost its niche. Many of its users transitioned to Modo and other tools.

There is sadly not a lot of information out there on the early years of cinematics. Your best bet is old issues of "3D Artist" or "Cinefex"; they float around on various torrent trackers. Blizzard has hardcovers for most of their franchises with a lot of history, and there is also a behind-the-scenes & cinematics DVD set for Diablo. Square had behind-the-scenes books for all of the Final Fantasies prior to 13. There is also a TV documentary, "Light & Magic", on Disney+ about ILM, which is primarily about their feature film VFX work but also goes into some details of LucasArts cinematics (mainly early Star Wars games).

FWIW, quite a few of the collaborators on Love, Death & Robots come from the video game cinematics world, including Blur, the main producers of the series.

Blur for one heavily uses 3DS Max.

Both 3DSMax and Maya eventually ended up being bought by the same company (Autodesk), but they have very long separate histories and entirely different roots (Maya comes out of the high-end Silicon Graphics world and was mostly used in the movie industry, while 3DSMax was the "PC underdog" at the end of the 90's but has always been very popular for PC game development).

Hasn’t been necessary in quite a while. Playing fullscreen video isn’t the challenge it was in the 1990s.

(Besides, a lot of games are moving away from FMV cutscenes anyway. Rendering cutscenes in realtime using the game engine makes them easier to author and integrate into gameplay.)

I’m guessing it’s this one? https://player.vimeo.com/video/14483640?autoplay=1

Produced by a Polish studio that seems to have done a ton of game cinematics, not only for The Witcher.

Can't find any specifics on their website, but job listings (https://platige.com/careers/) often give an indication of what kind of tools they're using. Appears to be these as of today:

– Nuke

– Maya

– 3DS Max

– Unreal Engine

– Houdini

– ZBrush

Maya and Houdini are mentioned in a lot of them, but the cinematic doesn't have a lot of typical VFX/destruction and such, so my guess is that they mostly used just Maya, rendered with Arnold and composited in Nuke. It's a very popular setup in the VFX industry, particularly back then, before Unreal Engine was as useful as it is today.

That’s because cutscene production was usually outsourced to specialized companies that offered the entire cutscene production process as a service to game studios and publishers (not just for game cutscenes and intros, but also marketing trailers) – those companies usually created all cutscene assets from scratch, and at most used artist sketches for ‘inspiration’, but no ‘ingame assets’.

If the game studio made the cutscenes themselves, they usually used both Maya and 3DS Max.

If they were made externally by a 3rd party production studio that also worked for TV and Cinema, they probably used some SGI workstation running Maya or Softimage 3D or Lightwave.
