I _think_ I _may_ have gotten it to work, without coding anything. Blender plays hell with the normals though. Will keep you posted if I have any notable success.
UPDATE: I managed to get facial animations and everything else to work with the existing tools (obj3do / 3doobj), but only if I use same-size materials. How did you manage to use larger (higher-resolution) materials? What did you do to the model to allow that?
UPDATE: OK, I managed to do everything, but it's a CRAP process:
Use the tools to create an OBJ file, then edit the *.mtl file so it references the missing images (not MATs but TGAs or something similar; BMP didn't work with Blender on my end).
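For that *.mtl edit, here's a rough Python sketch of what I mean: it adds a map_Kd entry (pointing at a .tga) for any material that doesn't have one, and rewrites .mat/.bmp references to .tga. The "texture is named after the material" guess and the file name are just placeholders, so adjust to whatever your converter actually spits out.

```python
import re

# Sketch: make sure every material in the .mtl points at a .tga texture.
# Assumes the textures are named after the materials (e.g. "chest.tga").
def patch_mtl(path):
    out = []
    current = None      # name of the material we're currently inside
    has_map = False     # did it already have a map_Kd line?
    with open(path) as f:
        for line in f:
            if line.startswith("newmtl "):
                # previous material had no texture entry -> add one
                if current and not has_map:
                    out.append("map_Kd %s.tga\n" % current)
                current = line.split(None, 1)[1].strip()
                has_map = False
            elif line.strip().startswith("map_Kd"):
                has_map = True
                line = re.sub(r"\.(mat|bmp)\b", ".tga", line, flags=re.I)
            out.append(line)
    if current and not has_map:
        out.append("map_Kd %s.tga\n" % current)
    with open(path, "w") as f:
        f.writelines(out)

patch_mtl("model.mtl")  # placeholder file name
```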
Then import it into Blender (to line it up with the textures; in 3DSMax you'd have to do that manually), then export to 3DSMax and scale the UVs by four (yes, scale the UVs, which makes it impossible to see how it will look in-game). I do this with a script in MAX, since doing it manually with any precision is virtually impossible. Then export back to Blender (I couldn't get the file exported from MAX to compile with the tools, I always got a NORMAL NOT UPDATED error), resave as an OBJ, and finally use the tools to make a 3DO... Oh, and yeah, Blender does a terrible job with normals and smoothing groups.
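If you'd rather skip the round-trip through MAX just for the UV scale, here's a minimal Blender Python sketch of that one step, assuming a recent-ish Blender where the mesh exposes uv_layers (attribute names may differ slightly between versions). It just multiplies every UV by four about the UV origin, same factor I use in the MAX script:

```python
import bpy

# Scale the active object's UVs by 4.
# Run in Object Mode with the mesh selected; this scales about the UV
# origin (0, 0), so adjust the pivot handling if you need something else.
obj = bpy.context.active_object
uv_layer = obj.data.uv_layers.active

for loop in uv_layer.data:
    loop.uv[0] *= 4.0
    loop.uv[1] *= 4.0
```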
Please tell me you had a better way
Here's a test with a 512x512 chest texture (nothing improved, just a resized image):
P.S.: a quote from Wiki:
Relative and absolute indices
OBJ files, due to their list structure, are able to reference vertices, normals, etc. either by their absolute (1-indexed) list position, or relatively by using negative indices and counting backwards. However, not all software supports the latter approach, and conversely some software inherently writes only the latter form (due to the convenience of appending elements without needing to recalculate vertex offsets, etc.), leading to occasional incompatibilities.
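If that's what's tripping up the tools, a quick preprocessing pass like the (hypothetical) sketch below should rewrite any negative, relative indices into absolute 1-based ones before feeding the OBJ to obj3do. The file names are just placeholders.

```python
# Rewrite relative (negative) OBJ face indices as absolute 1-based indices.
# Resolves each negative index against the number of v / vt / vn statements
# seen so far, which is how relative indexing is defined.
def absolutize_obj(src, dst):
    counts = {"v": 0, "vt": 0, "vn": 0}
    keys = ["v", "vt", "vn"]  # positions in a face element: v/vt/vn
    with open(src) as fin, open(dst, "w") as fout:
        for line in fin:
            parts = line.split()
            if parts and parts[0] in counts:
                counts[parts[0]] += 1
            elif parts and parts[0] == "f":
                fixed = ["f"]
                for vert in parts[1:]:
                    idxs = vert.split("/")
                    for i, idx in enumerate(idxs):
                        if idx and int(idx) < 0:
                            idxs[i] = str(counts[keys[i]] + int(idx) + 1)
                    fixed.append("/".join(idxs))
                line = " ".join(fixed) + "\n"
            fout.write(line)

absolutize_obj("model.obj", "model_abs.obj")  # placeholder file names
```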