Some questions re: graphics files

Started by Lemmingologist, August 20, 2005, 02:43:43 PM

Previous topic - Next topic

0 Members and 1 Guest are viewing this topic.

Lemmingologist

I heard somewhere on this forum that the graphics files can have any character following "VGAGR". But how do you indicate a graphics set other than 0-9 in the level file?
Also, although it appears to be somewhat irrelevant, my compression code should probably include the checksum, just to be complete. How is this value calculated?
And, ccexplore, I see you have cracked the format of the VGASPEC files! Well done - I was trying to figure it out earlier but gave up, as the files didn't appear to be long enough to store a raw 960x160 bitmap, and yet there is no information in the level files to indicate the placement of separate terrain pieces. I'm obviously not planning to write a VGASPEC editor, but just out of interest, how is it done? (Unless, of course, you prefer not to make this information public).
 Thanks for your answers, everyone.

ccexplore

Quote from: Lemmingologist
I heard somewhere on this forum that the graphics files can have any character following "VGAGR". But how do you indicate a graphics set other than 0-9 in the level file?
The game just uses (char)(number + '0') for the character following VGAGR (and GROUND, and analogously for VGASPEC), so other numbers map to other ASCII characters. Of course, not all ASCII characters are legal in Windows filenames, and the Windows file system is case-insensitive, so 'A' and 'a' are not distinguishable.

LemEdit seems to handle it differently though, so without hacking LemEdit, numbers outside 0-9 might not work properly.
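For illustration, the mapping described above can be sketched as follows (the function name and the .DAT extension are my assumptions, not part of the original post):

```python
def graphics_filename(base, number):
    """Append (char)(number + '0') to the base name, as the game does."""
    return f"{base}{chr(number + ord('0'))}.DAT"

print(graphics_filename("VGAGR", 0))   # VGAGR0.DAT
print(graphics_filename("VGAGR", 10))  # VGAGR:.DAT
```

Note that set 10 maps to ':', which is exactly the kind of character that is illegal in Windows filenames.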
Quote
Also, although it appears to be somewhat irrelevant, my compression code should probably include the checksum, just to be complete. How is this value calculated?
Just XOR all the bytes of the compressed data (excluding the header).
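A minimal sketch of that calculation (the function name is mine):

```python
def checksum(compressed: bytes) -> int:
    """XOR together every byte of the compressed data, excluding the header."""
    value = 0
    for b in compressed:
        value ^= b
    return value

print(checksum(b"\x01\x02\x03"))  # 0, since 1 ^ 2 ^ 3 == 0
```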
Quote
I'm obviously not planning to write a VGASPEC editor, but just out of interest, how is it done? (Unless, of course, you prefer not to make this information public).
There's a second level of compression (basically a run-length encoding).  I've actually already made the information available to other people, in particular Mindless, so maybe you can e-mail him to get a copy of what I e-mailed him a while ago.  I could try digging it up in my e-mail sentbox later tonight.
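As a hedged sketch of a decoder for such a run-length scheme: I am assuming a control-byte layout that matches the "0x82 encodes a length of 127" correction given later in this thread (values below 0x80 introduce literal runs of n+1 bytes, values above 0x80 repeat the next byte 257 - n times, and 0x80 ends the section); the authoritative details are in ccexplore's doc, not here.

```python
def decode_rle(data: bytes) -> bytes:
    """Decode one run-length-encoded section (assumed control-byte scheme)."""
    out = bytearray()
    i = 0
    while i < len(data):
        n = data[i]
        i += 1
        if n < 0x80:                       # literal run: copy next n+1 bytes
            out += data[i:i + n + 1]
            i += n + 1
        elif n > 0x80:                     # repeat next byte 257 - n times
            out += bytes([data[i]]) * (257 - n)
            i += 1
        else:                              # 0x80: end of section
            break
    return bytes(out)
```

Under this scheme, `decode_rle(b"\x82Z\x80")` yields 127 'Z' bytes, consistent with the 0x82/127 figure.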

Mindless

Quote from: ccexplore (not logged in)
There's a second level of compression (basically a run-length encoding). I've actually already made the information available to other people, in particular Mindless, so maybe you can e-mail him to get a copy of what I e-mailed him a while ago. I could try digging it up in my e-mail sentbox later tonight.
Yup, I've still got it, but no address to send it to.

Lemmingologist

Quote from: ccexplore (not logged in)
The game just uses (char)(number + '0') for the character following VGAGR (and GROUND, and analogously for VGASPEC), so other numbers map to other ASCII characters. Of course, not all ASCII characters are legal in Windows filenames, and the Windows file system is case-insensitive, so 'A' and 'a' are not distinguishable.

LemEdit seems to handle it differently though, so without hacking LemEdit, numbers outside 0-9 might not work properly.
Wow, that's easy! My first thought was that it involved changing the executable, but I dismissed that as a stupid idea, as it would be hopelessly impractical to do that every time someone makes a new graphics file. I'm glad it's already built into the game (unintentionally, it sounds like). Obviously the author of LemEdit didn't know about this feature.

Quote
Just XOR all the bytes of the compressed data (excluding the header).
Of course! XOR - why didn't I think of that? I kept trying to use addition and subtraction and remainders. Thanks, I have now updated my compressor (added the checksum and cleaned up the part that adds filler bits).
Quote from: Mindless
Yup, I've still got it, but no address to send it to.
Cool, could you send it to me? My address is lemmingologist@yahoo.com. Thanks!

Mindless

Quote from: Lemmingologist
Cool, could you send it to me? My address is lemmingologist@yahoo.com. Thanks!
Email sent.

Lemmingologist

Thanks, I really appreciate it.
Just a few errors, ccexplore: first, the bitplanes are read in reverse order [edit: they are read from right to left - I was wrong; this is actually normal order. The doc doesn't actually say right-to-left or left-to-right, but the least significant bitplane is correctly labeled as bitplane 0]. Also, the last eight colors in the palette are not EGA, but CGA. [edit: I was wrong again here - these are actually for 16-color EGA, which is standard at 320x200 resolution]. And one typo - it should say "0x82 a length of 127", rather than 0x80. Other than that, it's great!

ccexplore

Quote from: Lemmingologist
Thanks, I really appreciate it.
Just a few errors, ccexplore: first, the bitplanes are read in reverse order (I tested this).
No, this is ridiculous.  I wrote myvgaspec and it works, remember?

I think we are just misreading each other. When I count bits, bit 0 is the rightmost (least significant) bit. That's generally how everyone else labels the bits. Is that how you do it?

So "normal order" to me is bitplane 0, bitplane 1, bitplane 2, meaning that you get the bitplane for the rightmost bit, followed by the bitplane for the second-to-rightmost bit, and so forth.  Actually, that's why I don't even use the term "normal order" or "reverse order".  It's just bitplane 0, bitplane 1, ..., in the order I list them in the doc.
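To make that labeling concrete, here is a sketch of combining bitplanes into pixel values, with bitplane 0 supplying bit 0 (the least significant bit) of each pixel. The assumption that the leftmost pixel sits in each byte's most significant bit is mine (a common convention for planar formats), not something stated in this thread.

```python
def combine_bitplanes(planes, width):
    """Combine a list of bitplane byte strings into a list of pixel values.

    Bitplane k supplies bit k of each pixel; within a byte, the leftmost
    pixel is assumed to come from the most significant bit.
    """
    pixels = []
    for x in range(width):
        byte_index, bit = divmod(x, 8)
        value = 0
        for plane_number, plane in enumerate(planes):
            bit_value = (plane[byte_index] >> (7 - bit)) & 1
            value |= bit_value << plane_number  # plane k -> bit k of the pixel
        pixels.append(value)
    return pixels

# Leftmost pixel set in planes 0 and 1 only -> pixel value 1 + 2 = 3
print(combine_bitplanes([b"\x80", b"\x80", b"\x00"], 8))  # [3, 0, 0, 0, 0, 0, 0, 0]
```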

Quote
Also, the last eight colors in the palette are not EGA, but CGA.
Interesting, except that the CGA, being older hardware than the EGA, does not support palettes (at least not one that can be arbitrarily defined). Moreover, there are separate CGASPECx files, so why would they store and load the supposed palette in the VGASPECx files?

(The Tandy might have some graphics mode that uses palettes, though I've always thought it gets the data from CGASPECx also.)

Quote
And one typo - it should say "0x82 a length of 127", rather than 0x80. Other than that, it's great!
Oops, that's a typo.  And you're welcome.

But I do want to resolve the disagreement with the two other things you mentioned.

Lemmingologist

Quote from: ccexplore (not logged in)
No, this is ridiculous. I wrote myvgaspec and it works, remember?

I think we are just misreading each other. When I count bits, bit 0 is the rightmost (least significant) bit. That's generally how everyone else labels the bits. Is that how you do it?

So "normal order" to me is bitplane 0, bitplane 1, bitplane 2, meaning that you get the bitplane for the rightmost bit, followed by the bitplane for the second-to-rightmost bit, and so forth. Actually, that's why I don't even use the terms "normal order" or "reverse order". It's just bitplane 0, bitplane 1, ..., in the order I list them in the doc.
Ah, I understand now. I'm used to reading numbers from left to right, so I considered reading in the other direction to be "reverse order". Your way makes much more sense: the last bit is always 2^0, the next is 2^1 (and so on), so they should be labeled 0, 1, ... from right to left. I suppose this is the standard way to read planar graphics; I had always just assumed that left to right was normal.

Quote
Interesting, except that the CGA, being older hardware than the EGA, does not support palettes (at least not one that can be arbitrarily defined). Moreover, there are separate CGASPECx files, so why would they store and load the supposed palette in the VGASPECx files?

(The Tandy might have some graphics mode that uses palettes, though I've always thought it gets the data from CGASPECx also.)
The CGA palettes are for the special 16-color CGA mode [edit: it's 16-color EGA, which uses the CGA colors], which does support palettes, as opposed to normal 4-color mode, which does not.

ccexplore

Quote
The CGA palettes are for the special 16-color CGA mode, which does support palettes, as opposed to normal 4-color mode, which does not. I know this is true because the colors displayed are the same ones as described in this article: http://en.wikipedia.org/wiki/Color_Graphics_Adapter.
The usage of the terminology "palette" in that article is a little misleading.  When I talk of palette, what I'm really referring to is a programmable palette, that is, one where you can reassign its colors.

The CGA's color scheme is what I guess you can call a "fixed" palette.  Yes, there's a palette in the sense that you get a set of colors, but then again that's just by definition of supporting color.  But the set is fixed unlike EGA/VGA, so you cannot change the color assignment of the palette.

Also keep in mind that although the EGA has a programmable palette, the default palette the system sets it to is deliberately made the same as the CGA's set of fixed colors, for obvious backwards-compatibility reasons. So just because you find colors that happen to match the CGA's doesn't necessarily mean that they must be for the CGA and not the EGA.

I'm also doubtful about the use of either of the two special graphics modes mentioned in the article. Certainly when I run cgalemmi.exe (both in DOSBox and in plain Windows DOS), I just get the standard 4-color, 320x200 CGA mode. (The fact that the game mechanics are pixel-based also makes me think it less likely to use a lower horizontal resolution of 160.) The article also explicitly mentions that many people confuse the modes it talks about with the PCjr/Tandy's 160x200 graphics mode, so I'm inclined to think that if the palette is used at all, it's more likely for the Tandy modes, which do have a 320x200x16-color mode (although if memory serves, I don't think the Tandy has a programmable palette either).

-----------------------------

Bottom line:  unless you have actually verified that the palette you described as "CGA palette" in the VGASPEC file actually has an effect on the colors of the game when run in CGA mode, I think all you can say for sure is that the colors happen to match the ones from the CGA.

I gather though that since you called it a mistake, it must be the case that changing the "CGA" data in the VGASPEC doesn't affect the game colors when displayed in EGA mode.  Is that correct?  (This would I guess be analogous to the VGA case, where the game chooses to leave the lower 8 colors fixed.)

Lemmingologist

After reading the article on EGA, it's obvious that you're right. However, there are two different EGA palettes, and two different EGA modes - I wasn't aware of this before. [edit: they are used for two different resolutions]. Since my version of Lemmings (the CD version) doesn't have a CGA mode, I mistook 16-color EGA for CGA. Upon closer inspection, however, I realized that at 320x200 resolution, standard CGA could only display 4 colors, so it must be a form of EGA. When you run Lemmings in EGA mode, do you get 16 or 64 colors? Is it even possible to use the 64-color EGA mode, or is it always 16-color?

ccexplore

Ah, now that's good info.  Thanks!  B)

Yeah, I forgot about the possibility of the 16-color EGA mode.  That would explain perfectly why there are 2 EGA palettes in the VGASPEC file, but just one VGA palette.

I'm not totally familiar with all the details of the EGA, but this is what I think is the case here.

The EGA is complicated partly because it tries to accommodate many possible hardware configurations. Its video memory, for example, can be 64k, 96k, or 128k (or something like that), and it can attach either to the RGB monitors that the CGA uses or to a more advanced monitor specifically for EGA use.

The CGA monitor (often called an "RGB" monitor) only supports the 16 CGA colors. So an EGA attached to such a monitor would only be able to use those 16 colors, since the monitor physically cannot display any others. Presumably the 16-color EGA palette would be used in that setup.

With an EGA monitor attached however, the colors displayable by the monitor expands to 64, so in that setup presumably the 64-color EGA palette would be used.

If I remember correctly, the EGA does not actually have a graphics mode (at least a standard one) that can display 64 colors simultaneously.  This is where the concept of a programmable palette first appears.  You can only display 16 colors at a time, but the EGA allows you to reassign the colors, so that at least you can pick which 16 colors to use.

----------------------

In Lemmings, for both VGA and either configuration of EGAs, the graphics mode used is actually the same, namely 320x200x16 colors.  This is why there aren't EGASPEC files.  The difference is in how colors are used.  The VGA setting uses the VGA's palette mechanism which allows you to pick your 16 colors from 2^18 possible colors.  The EGA setting uses the EGA's palette mechanism which allows you to pick from either 64 or 16 colors, depending on what monitor is attached.

Unfortunately it would probably be impossible to test my theory on modern PCs, since it would always be treated as an EGA+EGA monitor configuration when you try to talk EGA to it, so you won't be able to get an EGA+CGA monitor configuration.

-----------------------

Oh yeah, on an unrelated note, an update to that doc of mine.  In it I think I said I wasn't sure which method the game uses to decide where to place the 960x160 within the overall level area.  Testing with myvgaspec seems to show that it always places the 960x160 bitmap at location x=304, y = 0 in the level area.

Lemmingologist

Quote from: ccexplore (not logged in)
Unfortunately it would probably be impossible to test my theory on modern PCs, since it would always be treated as an EGA+EGA monitor configuration when you try to talk EGA to it, so you won't be able to get an EGA+CGA monitor configuration.
Are you sure about this? Mine can only display the 16 CGA colors.

ccexplore

Hmm...well I'll take a look later tonight.

Maybe it's the other way around then. Or maybe it's one of the other machine settings (the ones besides PC compatible and Tandy) that has some extension to EGA that supports 64 colors or the like. As I said, I'm much less familiar with the EGA, and it seems the Wikipedia article is pretty short on details too.

ccexplore

Ok, after some thought, I think this is what's happening instead:

If you play Lemmings in windowed DOSBox, it is pretty obvious that the main menu screen uses a different resolution than when you're playing a level.  That's because when running in windowed DOSBox, the size of the window changes based on the resolution.

It looks to me like when you're not playing a level (e.g. at the main menu, and most importantly, at the level preview), the game is probably using the 640x350 EGA mode (this seems true, by the way, even when running the game in VGA mode, although with VGA it can use the VGA's palette mechanism even in that mode), and when a level is being played, the game uses the 320x200 EGA mode.

Now the wiki on EGA did say that only the 640x350 EGA mode can use 64 colors; the 320x200 mode apparently can only use 16. So it seems plausible that the 64-color palette is used specifically when rendering the level preview in EGA mode, and the 16-color palette is used when actually playing the level in EGA mode.

Although two different graphics modes are also used in VGA, the VGA's palette mechanism works on all graphics modes (or something close to that), so the same palette can be used for both level preview and actual playing.  So only 1 VGA palette.

If you care to test this, the easiest way of course is to zero out one or the other palette, and see whether the level preview and/or in-game view are affected when playing in EGA mode.

Lemmingologist

Quote from: ccexplore (not logged in)
Now the wiki on EGA did say that only the 640x350 EGA mode can use 64 colors; the 320x200 mode apparently can only use 16. So it seems plausible that the 64-color palette is used specifically when rendering the level preview in EGA mode, and the 16-color palette is used when actually playing the level in EGA mode.
Edit: in theory, this is true; in practice, the 64-color palette is used not only for the preview, but also for the rest of the title screen. The details of this are not quite clear, but it makes a huge difference - the title screen flashes erratically and is hard to read. The best solution I can come up with is to make colors 0-7 in this palette match the built-in palette described in one of my later posts in this thread.

Quote
Although two different graphics modes are also used in VGA, the VGA's palette mechanism works on all graphics modes (or something close to that), so the same palette can be used for both level preview and actual playing. So only 1 VGA palette.
Actually, there are two VGA palettes as well: one for the level and one for the preview, exactly the same as for EGA. In both preview palettes, colors 0-7 are editable as well as 8-F, so these palettes take up twice as much space as the level palettes. The EGA preview palette is necessary because, as you say, the 640x350 resolution uses a different color set, so the codes used for the 320x200 colors wouldn't work. However, I don't understand why they wasted space on the VGA preview palette, when they could have just used the level colors.