Rendering speed in Lemmings-like games

Started by EricLang, February 16, 2020, 10:43:04 AM



EricLang

I was reading the Object-oriented design thread with Simon and was wondering: how is it possible that in 2017 we have framerate problems, while in 1991 there were already lemmings walking around?

Simon

Pointer chasing cost 1 % or 2 % at most when I let the replay verifier run without graphics. This loaded the level, but created no savestates, or only a single one, then ran the physics. Emplacing the job within the lix was only a small gain. I'd have to test more cleanly to isolate pointer chasing from savestating.

Avoiding two allocations per lix, and instead making a single allocation for all lixes in one go, was probably a bigger win than 1 %, but I can't tell. This is very hard to measure because the important use case is human framestepping. (There is no need for savestating when you verify replays automatically.)
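A minimal sketch of the pooling idea, in C++ for illustration (Lix itself is written in D, and all names here are hypothetical, not its actual code): instead of two heap allocations per lix, keep every lix in one contiguous buffer, so updates scan memory linearly and a savestate becomes a single block copy.

[code]
#include <cstdint>
#include <vector>

// Hypothetical per-lix state. In the two-allocation scheme, each lix
// and its job were separate heap objects; here the job lives inside
// the lix record, so it needs no allocation of its own.
struct Lix {
    int32_t x = 0, y = 0;
    int32_t frame = 0;
    int32_t jobId = 0;  // job emplaced within the lix
};

// One allocation for all lixes: a single contiguous vector.
struct LixPool {
    std::vector<Lix> lixes;

    explicit LixPool(std::size_t n) : lixes(n) {}

    // Linear scan over contiguous memory: no pointer chasing.
    void updateAll() {
        for (Lix& l : lixes) {
            l.x += 1;
            ++l.frame;
        }
    }

    // A savestate is now one block copy instead of n small copies.
    LixPool savestate() const { return *this; }
};
[/code]

Framestepping then means keeping a saved LixPool around and re-running updateAll() up to the target frame.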

The lion's share is always the graphics: the choice of graphics library, and the particular usage of that library within the game. Hardware graphics are faster than software graphics in some ways and slower in others. As lone developers with no special training in graphics, we're optimizing on hardware designed for 3D games.

Then there are the Lemmings 1 drawing engine and DOS video memory; I don't know anything about either. :lix-grin:

-- Simon

EricLang

That is why I chose Graphics32, which essentially boils down to a Windows call to BitBlt, which has been hardware optimized for at least the last decade, as far as I know. But WinLemmings was quite smart about drawing as well: the makers had found some insane trick to make the graphics output very fast.

namida

Graphics32 definitely does not have hardware optimization, nor does it rely on the Windows API in any way (maybe it used to, a long time ago?).

It does have some of its code written in ASM rather than Pascal to improve performance, but that's all.
My projects
2D Lemmings: NeoLemmix (engine) | Lemmings Plus Series (level packs) | Doomsday Lemmings (level pack)
3D Lemmings: Loap (engine) | L3DEdit (level / graphics editor) | L3DUtils (replay / etc utility) | Lemmings Plus 3D (level pack)
Non-Lemmings: Commander Keen: Galaxy Reimagined (a Commander Keen fangame)

EricLang

Graphics32 does not use hardware acceleration itself, but the Windows GDI BitBlt() does. I don't know how to rigorously verify this in practice, though.
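For what it's worth, here is one crude way to probe it: a C++ sketch (my own, not anything from Graphics32) that times a batch of BitBlt calls between memory DCs with QueryPerformanceCounter. On its own this won't prove hardware acceleration, since memory DCs may take a pure software path; comparing the ms/blit against a plain CPU copy of the same pixel volume, and against blits to a screen DC, would say more.

[code]
#include <windows.h>
#include <cstdio>

int main() {
    const int w = 640, h = 480, iterations = 1000;

    // Two offscreen bitmaps compatible with the screen's pixel format.
    HDC screen = GetDC(nullptr);
    HDC src = CreateCompatibleDC(screen);
    HDC dst = CreateCompatibleDC(screen);
    HBITMAP srcBmp = CreateCompatibleBitmap(screen, w, h);
    HBITMAP dstBmp = CreateCompatibleBitmap(screen, w, h);
    SelectObject(src, srcBmp);
    SelectObject(dst, dstBmp);

    LARGE_INTEGER freq, t0, t1;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&t0);
    for (int i = 0; i < iterations; ++i)
        BitBlt(dst, 0, 0, w, h, src, 0, 0, SRCCOPY);
    GdiFlush();  // make sure batched GDI calls have actually executed
    QueryPerformanceCounter(&t1);

    double ms = 1000.0 * (t1.QuadPart - t0.QuadPart) / freq.QuadPart;
    printf("%d blits of %dx%d took %.2f ms (%.3f ms/blit)\n",
           iterations, w, h, ms, ms / iterations);

    DeleteObject(srcBmp); DeleteObject(dstBmp);
    DeleteDC(src); DeleteDC(dst);
    ReleaseDC(nullptr, screen);
    return 0;
}
[/code]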

ccexplore

One big difference to consider is that on DOS PCs, the game runs at a resolution of 320x200 and furthermore only uses a 16-color mode. The 16-color mode alone translates to a difference of 4 bits per pixel vs. today's 24 bits.
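Putting numbers on that difference (1920x1080 at 24 bits is just an arbitrary modern example, not what any particular port uses):

[code]
#include <cstdio>

int main() {
    // bytes per frame = width * height * bits-per-pixel / 8
    const long dos    = 320L  * 200  *  4 / 8;  //     32,000 bytes
    const long modern = 1920L * 1080 * 24 / 8;  //  6,220,800 bytes
    printf("DOS frame:    %ld bytes\n", dos);
    printf("Modern frame: %ld bytes\n", modern);
    printf("Ratio:        %ldx\n", modern / dos);  // ~194x more data per frame
    return 0;
}
[/code]

So a modern frame carries roughly 194 times the raw pixel data of a DOS Lemmings frame, before the artwork is even scaled up.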

Anyway, framestepping is not even a feature in original DOS Lemmings. It has its own special challenges apart from normal gameplay rendering: it's not practical to keep an in-memory snapshot of every game state since the start of the level, so the implementation inevitably ends up loading the closest snapshot and then fast-forwarding from it to the desired frame. There is a tricky balance between how many snapshots to keep vs. how quickly the game can seek to a given frame. To be clear, there is generally no performance issue in merely stepping forward one single frame (the game already does that during normal play); it's the other cases of framestepping that are tricky.
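A minimal C++ sketch of that seek scheme (all names hypothetical, not any particular engine's code): keep periodic savestates keyed by frame number; seeking loads the latest snapshot at or before the target frame and simulates the rest forward.

[code]
#include <cstdint>
#include <map>

// Hypothetical whole-game state; copying it is the expensive part.
struct GameState {
    int64_t frame = 0;
    // ... terrain, lemmings, skill counts ...
    void advanceOneFrame() { ++frame; /* run one physics update */ }
};

// Seek = load the closest snapshot at or before the target frame,
// then fast-forward. Snapshot spacing trades memory (many snapshots)
// against seek latency (long fast-forwards).
class Framestepper {
    std::map<int64_t, GameState> snapshots;  // keyed by frame number
    GameState current;

public:
    Framestepper() { snapshots[0] = current; }  // always keep frame 0

    // Call once per physics frame during normal play.
    void tick(int64_t snapshotEveryN = 60) {
        current.advanceOneFrame();
        if (current.frame % snapshotEveryN == 0)
            snapshots[current.frame] = current;  // copy = savestate
    }

    // Jump to an arbitrary frame, e.g. for a backward framestep.
    void seekTo(int64_t targetFrame) {
        auto it = snapshots.upper_bound(targetFrame);  // first key > target
        --it;                           // last snapshot with key <= target
        current = it->second;           // load it...
        while (current.frame < targetFrame)
            current.advanceOneFrame();  // ...and fast-forward the rest
    }
};
[/code]

Stepping forward one frame stays cheap (one advanceOneFrame call, as in normal play); it's seekTo for backward steps that pays the snapshot-plus-fast-forward cost.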

As for BitBlt, it is a Windows API that has been around for a long time and fairly commonly utilized, so it's very likely that some hardware optimizations are available for it at least for certain cases.  On the other hand, it was not originally designed for gaming purposes (at least not specifically so), and while I'm no expert in the history of graphics support in Windows, it is telling to me that DirectX was developed specifically to support gaming on Windows, which to me strongly implies that the design of APIs like BitBlt were probably not entirely up to the performance standards required by gaming even at that time (though it's also true that DirectX was also to help paved the way for hardware-accelerated 3D graphics, which they correctly foresaw to become critical in gaming).