Porting Lix to SDL, loose ideas

Started by Simon, February 26, 2015, 02:10:37 PM

Simon

This might be of interest only to our small group of Lix source hackers.

I'm playing around with SDL 2. (EDIT: Using Allegro 5 for a rewrite attempt since March 2015. The remaining post here is not altered.) The grand idea is to port Lix from Allegro 4 (called A4) to this library. A4 is getting old, and porting Lix to the current A5 would be similarly elaborate.

A4's BITMAP* is a RAM bitmap. Optionally, it can be a VRAM bitmap, but I don't use that anywhere in A4 Lix.

SDL (by which I mean SDL 2.0) works with three datatypes instead of one: SDL_Surface*, SDL_Texture*, and SDL_Renderer*. The latter comes as a window renderer and a software renderer. From the documentation I've digested, these types are used in the following manner:

  • Renderer is the counterpart to A4 Lix's pre_screen, which gets drawn onto several times until it's done and ready to be copied to the visible screen. Typically, you draw many different Texture objects, in full or in part, onto the Renderer, then display it. Every SDL window should have exactly one renderer. You can't draw from one renderer onto another. You can use drawing primitives (point, line, etc.) on a Renderer directly.
  • Software Renderer similarly collects data to be drawn, but ultimately draws onto a Surface instead of the screen.
  • Texture is like a BITMAP* in VRAM or a compiled sprite, I'm not 100% sure. They can be blitted onto a Renderer, which should be extremely fast. You can use blend modes while drawing, but can't do software pixel manipulation without locking the Texture in RAM and then unlocking it again. This is slow if it fetches the Texture out of VRAM.
  • Surface is the same as A4's BITMAP* in RAM. They can be converted to Texture, and probably must be, because you can't blit a Surface onto a Renderer directly.
Now, what are good replacements for the various data structures used in Lix?

  • pre_screen -> Renderer
  • Cutbit -> Surface. Probably not Texture, since they must be copied onto Torbits, and the idea of Texture is to be copied onto Renderer.
  • Torbit (a bitmap that behaves like a torus when other things are drawn onto it: what's drawn out of bounds reappears at the other side) -> Surface, because you should only have one Renderer, which is pre_screen.
  • Map (the landscape) -> ?
Alternatively, maybe this:

  • pre_screen -> Renderer
  • Cutbit -> Texture, and it still draws onto Torbit, which must therefore become...
  • Torbit -> Software Renderer, making a Surface again. That Surface will then become a Texture to be drawn onto the screen Renderer.
  • Map (the landscape) -> ?
Everything was easier when I didn't care about drawing speed, but this is one reason why efficient game programming is hard. :>

Image: SDL hello world with Texture

-- Simon

NaOH

Why not let both Cutbit and Torbit use Texture?

Simon

Drawing Texture onto Texture isn't what you "should" do, in that there is no library function to do it.

If the gut feeling is that both should be Texture, then probably Cutbit should use Texture and Torbit should draw via Software Renderer -> Surface -> Texture.

-- Simon

NaOH

I believe SDL_RenderCopy() would work.

But I think that, generally, one shouldn't be rendering to a texture directly, only rendering textures to the screen. So maybe the intermediate Torbit should be dropped altogether, and Cutbit should perform the correct modular arithmetic when rendering?

Simon

Can't dismiss this idea; it's certainly possible that the information architecture of A4 Lix is flawed when using SDL.

I'm reluctant to do this because Torbit does some modulo calculation outside of drawing: computing distances. Also, I wouldn't want the graphic objects (each lix, each terrain tile, i.e., things that refer to some Cutbit and sit somewhere on the map) to carry the information about the torus.

Right now, Map inherits Torbit. If Torbit is scrapped, Map will continue to be important. I'd therefore like to have the torus information at least in Map.

-- Simon

ccexplore

The SDL 1.2=>2.0 migration guide on the wiki you linked to might be a good conceptual resource.  I'm certainly no expert in this area, but based on what you said in the first post, A4 Lix seems a bit similar to how the migration guide describes SDL 1.2:

Quote from: http://wiki.libsdl.org/MigrationGuide#Video
For 2D graphics, SDL 1.2 offered a concept called "surfaces," which were memory buffers of pixels. The screen itself was a "surface," if you were doing 2D software rendering, and we provided functions to copy ("blit") pixels between surfaces, converting formats as necessary. You were almost always working on the CPU in system RAM, not on the GPU in video memory. SDL 2.0 changes this; you almost always get hardware acceleration now, and the API has changed to reflect this.

So conceptually speaking, I expect some similarities in going from A4 => SDL 2.0, similar to what the migration guide ostensibly covers (SDL 1.2 => SDL 2.0). If so, then the guide may prove useful for this topic.

ccexplore

It might also be helpful, for the benefit of "other readers" (perhaps just me at this point :-[), to briefly mention here how rendering works currently in the A4 game. For example, the game has things like the level's terrain (which is "Map", I think?), "sprites" that include things like the lixes, interactive objects, trapdoors, etc., and UI elements like buttons and windows. What is the current rendering path for each such thing? In other words, which objects (Torbit/Cutbit etc.) does the thing go through until it eventually winds up rendered on the pre_screen?

Simon

ccexplore, thanks for the link to the SDL porting advice, this sounds very much applicable.

Here's the sketch of the drawing order:

BITMAP* pre_screen; (this lives in RAM, too)
Torbit (owns a BITMAP*) osd; (It's not necessary that osd must be Torbit, BITMAP* would probably do.)
Map (inherits Torbit) map; (This must be Torbit in the current architecture, it's a canvas for drawing gameplay.)
State (owns a Torbit) current_state; (This holds the land, I have several states for networking backup or savestating.)
Cutbit interactive_object, lix_image, ...; (owns a BITMAP* with the entire spritesheet, can make sub-BITMAP*s that only have the desired frame. Cutbits do not change while the program is running! Maybe some more are loaded when new terrain is required from disk, but existing Cutbits don't change anymore.)

When drawing during the normal gameplay:

Clear the map rectangle visited by the current scrolling to the level's bg color.
Draw the interactive objects onto the map.
Draw the landscape from current_state's Torbit onto the map.
Draw the interactive objects onto the map that go in front of the landscape.
Draw the lixes onto the map.
Undraw osd, this removes the mouse cursor from the last time we were drawing and restores its background.
Draw GUI elements onto the osd that have changed since last time we were drawing.
Draw the mouse cursor onto the osd, saving its background.
Blit the map rectangle visited by the current scrolling onto pre_screen, without transparency.
Blit the osd onto the pre_screen, with transparency.
Blit the pre_screen to the physical screen in VRAM.

The landscape in current_state may be altered by the lix skills during calc(), the logic calculation. This happens at a different time than draw(). What's important is that the landscape will not necessarily be the same image upon the next call to draw(); I don't think of it as a sprite.

-- Simon

ccexplore

Slightly off topic, but this together with your revival of the "end-user-translatable Lix" thread reminds me of this one thing, which is a good idea to take into account as you start doing your D port:

Currently, the dimensions of UI elements (popup windows, buttons, etc.) in C++ Lix are all hardcoded. While I never pretended to try to support complex languages like Chinese/Japanese/Korean, I can easily see one problem: the current dimensions look to be a little too small to make those languages legible, I think. The font size being used would need to be bumped up a little to get good legibility for those languages, but then they probably won't fit properly within the existing hardcoded heights of most buttons and labels.

I would suggest that for the D port, at a minimum, maybe introduce a global height and a global width scaling factor that can be applied to most UI, and incorporate those scaling factors into all calculations of sizes and positions of such UI.  And allow for those scaling values to be adjustable by translators (along with font sizes, but they can do that already even in the C++ port by simply replacing the game's font files).  I believe that will adequately help address the issue without introducing too much complexity.  Just an idea.

Simon

A good thing. I don't yet have any code that depends on these design ideas, but they have come to my mind, too.

Variable resolution will go in. A5 offers something they call "fullscreen window", which is a window without edges or title bar, the size of the screen. It looks like fullscreen, and alt-tabbing from it is very fast. I really want to use this, and I'm restricted to the user's desktop resolution for it.

Normal windowed mode will also go in. Only switching between window and fullscreen-window will be impossible, or slow and hard to implement, with the VRAM bitmaps.

A5 depends on FreeType, which loads TTF fonts at any desired size. Example image.

I want the GUI to be scalable without my ugly pixelwise scaling blit in C++/A4 Lix. Your scaling factor right in the GUI code is a good idea: the GUI elements take coordinates before scaling, and draw after applying scaling. It doesn't account for widescreen displays, but I'm sure the idea can be extended in a nice way.

-- Simon

vanfanel

Hi, Simon!

Any chance of seeing Lix ported to SDL2? That would take Lix to the mighty Raspberry Pi, for example, where Allegro does not perform well at all, while SDL2 has accelerated graphics support.
Lix on a $5 computer could be great!

namida

I attempted to compile and run Lix on Ubuntu on an Odroid-XU4, which is somewhat similar to (though much more powerful than) the Raspberry Pi. Although it worked at full speed, there were problems with sound and the Cuber skill - though this is based on the C++ version; I never tested the D/A5 version. (Note to self and Simon: Let's test that sometime too!) So I'm not sure how well it would work on ARM-based hardware in general.
My projects
2D Lemmings: NeoLemmix (engine) | Lemmings Plus Series (level packs) | Doomsday Lemmings (level pack)
3D Lemmings: Loap (engine) | L3DEdit (level / graphics editor) | L3DUtils (replay / etc utility) | Lemmings Plus 3D (level pack)
Non-Lemmings: Commander Keen: Galaxy Reimagined (a Commander Keen fangame)

Simon

Quote from: vanfanel on February 08, 2016, 02:40:27 AM
Raspberry Pi, for example, where allegro does not perform well

Allegro 4 or Allegro 5? I'd be surprised if A5 were slow on the Raspberry. A5 and SDL2 should have comparable hardware graphics acceleration. Some forum members have criticized using A5 over SDL2 because they were more familiar with SDL2.

If you're feeling adventurous, build the D/A5 version that's still in development.

-- Simon

vanfanel

Quote from: Simon on February 08, 2016, 09:02:58 AM
Allegro 4 or Allegro 5? I'd be surprised if A5 were slow on the Raspberry.

Ah, Allegro 5 seems to have GLES support, yes. But it's implemented in a way that the rendering context depends on X11 to work :(
So Allegro 5 is still not for me. I could add a dispmanx backend (the native RPi rendering context), but I have no time to add it to yet another library.
No word on the SDL2 port, then? I understand you had something working already.

Simon

No SDL2, at least in the upcoming year. I'm too busy, too.

For a few weeks now, Lix regulars have been looking at the port, giving lots of feedback. I'm ironing out heavy problems to improve speed and consistency with C++/A4 Lix. I wouldn't like to swap libraries at all right now.

Your case is the first substantial technical drawback of A5 compared to SDL2. Thanks for pointing it out. I view low-level graphics drawing as a black box. Did SDL2 have a similar problem, and have you written code for SDL2 to improve the situation?

-- Simon