Of course it is possible to solve this in software.
In fact, in principle, this is already solved; but due to weirdness in the OS / graphics card drivers / monitors, it does not always work for everybody (though it worked fine for me for many years on various systems, including some with a monitor connected via an old-style VGA cable).
ScummVM is built on the SDL library. When ScummVM starts a game in fullscreen mode, it asks the SDL library for a screen of a certain resolution, say 640x480 (that's what it asks for if you start a 320x200 game and choose a 2x scaler with aspect ratio correction).
Now, SDL will in turn ask the OS (which will ask the graphics card driver, which may ask the graphics card and the screen...) to switch to that resolution, or at least one close to it.
If the final resolution does *not* match exactly what ScummVM is using, then SDL will try to automatically do what you call "pillarbox mode": Namely, it centers the actual graphics drawn by ScummVM on the screen, and adds black borders around that as necessary.
Thus, if all went right, ScummVM should still display with the correct aspect ratio, possibly with black borders. Perfect.
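To illustrate, the centering SDL performs when the mode does not match exactly boils down to simple arithmetic. This is just a sketch of the idea, not SDL's actual code (the function and struct names are made up):

```c
#include <assert.h>

/* Where the drawn image sits on the screen: the widths of the
 * pillarbox (left/right) and letterbox (top/bottom) borders. */
typedef struct { int x, y; } Borders;

/* Center an img_w x img_h image on a screen_w x screen_h screen.
 * Assumes the image fits; the leftover space is split evenly. */
static Borders center_image(int img_w, int img_h,
                            int screen_w, int screen_h) {
    Borders b;
    b.x = (screen_w - img_w) / 2;  /* left/right border width  */
    b.y = (screen_h - img_h) / 2;  /* top/bottom border height */
    return b;
}
```

So a 640x480 image on an 800x600 screen gets 80-pixel borders on the sides and 60-pixel borders on top and bottom, and the aspect ratio is preserved.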
Sadly, this breaks down on various systems for various reasons. Namely, sometimes the OS (from now on I'll just say "OS" when I mean "OS, driver, card or screen") will tell us that video modes are available that we really do not want to use — say, a mode which is not aspect ratio correct. Your screen might be 16:10, but the OS might end up telling us that, sure, it can do 640x480. It then proceeds to somehow map that resolution to the screen surface: sometimes by stretching/squeezing it (bad :/), sometimes by adding black borders / pillarboxes of its own (not as bad, but still usually not perfect), etc.
On some systems, you can avoid that by editing the list of resolutions the OS will accept as valid, or by changing a setting that forces it to never stretch screen data (e.g. I think the NVidia drivers on Windows allow both, and some better LCDs have settings for that, too). But sometimes none of this is possible, and we end up in a crap situation.
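For the curious, checking whether an advertised mode would force the OS to stretch is just an aspect ratio comparison. A minimal sketch (hypothetical helper, not anything in ScummMV or SDL), using cross-multiplication to avoid floating point:

```c
#include <assert.h>

/* Does a mode_w x mode_h video mode have the same aspect ratio
 * as the panel's native_w x native_h resolution? If not, the OS
 * must either stretch it or add borders of its own. */
static int mode_matches_aspect(int mode_w, int mode_h,
                               int native_w, int native_h) {
    /* w1/h1 == w2/h2  <=>  w1*h2 == h1*w2 (all values positive) */
    return mode_w * native_h == mode_h * native_w;
}
```

For example, 640x480 matches a 4:3 panel like 1024x768, but not a 16:10 panel like 1280x800 — exactly the mismatch described above.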
![Sad :-(](./images/smilies/icon_sad.gif)
None of these problems would exist if the OS did not try to cheat by telling us about resolutions that are not really supported, or about resolutions that lead to stretching. So, in general, this is something that deserves bug reports sent to the responsible parties.
However, it is clear that this does not help you, unless you are willing to buy a new screen/graphics card/switch the OS...
So, we could try to work around this mess in ScummVM, but none of it is pretty. We currently have some code which tries to partially work around all this by inspecting the list of resolutions the OS tells us are really, *really* supported, and only using one of those. But as I explained, even that heuristic can fail badly, because too many involved parties may hand out "incorrect" (or at least misleading) information.
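The heuristic just described — "only use a mode the OS claims to support" — can be sketched roughly like this. Again, this is a simplified illustration with made-up names, not ScummVM's actual code:

```c
#include <assert.h>
#include <limits.h>

typedef struct { int w, h; } Mode;

/* From the modes the OS claims to support, pick the smallest one
 * large enough to contain the requested resolution (so the game
 * image plus black borders fits). Returns the index of the chosen
 * mode, or -1 if nothing fits. */
static int pick_mode(const Mode *modes, int count,
                     int want_w, int want_h) {
    int best = -1;
    long best_area = LONG_MAX;
    for (int i = 0; i < count; i++) {
        if (modes[i].w >= want_w && modes[i].h >= want_h) {
            long area = (long)modes[i].w * modes[i].h;
            if (area < best_area) {
                best_area = area;
                best = i;
            }
        }
    }
    return best;
}
```

The failure mode is exactly the one described above: if the OS lists a 640x480 mode that it will secretly stretch onto a 16:10 panel, this code happily picks it, and the result looks wrong anyway.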
Another approach, used by DOSBox (AFAIK), is to let the user specify which resolution to use: i.e., offer key/value pairs for this in the config file, maybe also show a popup in our GUI options dialog with a list of all available resolutions, and use that.
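Parsing such a config value is straightforward. A sketch, assuming a DOSBox-style "WIDTHxHEIGHT" string (the function name and exact format are my own illustration, not a spec of DOSBox's config):

```c
#include <assert.h>
#include <stdio.h>

/* Parse a config value like "640x480" into a width/height pair.
 * Returns 1 on success, 0 on a malformed value. The trailing %c
 * probe rejects strings with extra junk after the height. */
static int parse_resolution(const char *value, int *w, int *h) {
    int tw, th;
    char extra;
    if (sscanf(value, "%dx%d%c", &tw, &th, &extra) != 2)
        return 0;
    if (tw <= 0 || th <= 0)
        return 0;
    *w = tw;
    *h = th;
    return 1;
}
```

With validation like this, a typo in the config file can fall back to the automatic behavior instead of requesting a nonsense mode.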
This is certainly doable, but poses some issues of its own (none unresolvable, mind you). For example, we should only show that popup on systems where it makes sense. We'd also have to create an API internal to our "backends" (the code which interfaces with your PC/game console/smartphone) which makes such information available to the options dialog. Then, we'd have to make sure that people who switch between multiple screens (say, the one built into your laptop and a big external one you use at home) are not suddenly made unhappy.
All this is solvable, but adds quite some complexity, not just for us developers, but also for users. While it would help some people, like you, it would also bear the risk of creating a new inconvenience for other people, where things used to work but, due to the change, may work worse if we are not very careful.
All in all, definitely something we should consider looking into, as it has become clear that certain OS, graphics card, and screen vendors will not do anything to improve this messy situation. However, not everybody is interested in working on it. Me, for example: I have little motivation, since it has always worked perfectly on *my* Mac OS X computers, and also on the external (non-Apple
![Wink ;)](./images/smilies/icon_wink.gif)
) screens I used over the years.