Multisample Issues - Strange Behavior


Postby markjacksonguy » Tue Nov 15, 2011 8:54 pm

Something strange is going on when I run P3D compared to any other game app I have. When I run games on my PC and change the sampling from 2x to 8x, I can clearly see the difference in the amount of sampling: 8x looks a lot better than 2x, but 8x is slower.

Here's the thing with Panda3D... it doesn't matter what I set the sampling to. Whether I ask for 2x or 8x, it WILL ALWAYS USE THE GRAPHICS CARD'S MAX!! For me that would be 16xQ.

I'm losing a lot of speed because of this, and I didn't even realize it until now. I just noticed the sampling looked too nice for 2x, so I started changing the value around and found out I'm getting the max no matter what.

P3D will report a sampling of 2x, but what's reported is not correct. I clearly know the difference between 2x sampling and 16x.

Is this a problem with the NVIDIA GeForce GT 220? It can't be, since my other games adjust sampling correctly and I can clearly see the difference in both look and app speed.

All I want is a 2x sampling, not a 16x.

I'm running windows 7 64 bit.

Code:
framebuffer-multisample 1
multisamples 2

render.setAntialias(AntialiasAttrib.MFaster | AntialiasAttrib.MAuto)
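
For reference, here's what those settings are asking for line by line (my comments; assuming the standard meanings of these prc variables):

Code:
# Config.prc fragment ('#' starts a comment in prc files)
framebuffer-multisample 1   # request a multisample-capable framebuffer
multisamples 2              # request 2 samples per pixel, not the card max

At runtime, printing base.win.getFbProperties() should show the multisample count the driver actually granted, which would confirm whether the 2x request is being honored.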
markjacksonguy
 
Posts: 550
Joined: Wed Sep 28, 2011 4:06 pm

Postby drwr » Tue Nov 15, 2011 9:01 pm

Are you running OpenGL or DirectX9? The buildbot version, or 1.7.2?

David
drwr
 
Posts: 11425
Joined: Fri Feb 13, 2004 12:42 pm
Location: Glendale, CA

Postby markjacksonguy » Wed Nov 16, 2011 12:40 am

drwr wrote: Are you running OpenGL or DirectX9? The buildbot version, or 1.7.2?

David


OpenGL, and 1.7.0 (at least the install folder says 1.7.0).

Postby markjacksonguy » Wed Nov 16, 2011 4:05 am

I knew it!!! I knew I wasn't going crazy! It's OpenGL! Has to be!

I can't run DirectX because I get all kinds of errors, which I'll type up later, but what I want to mention here is that under Dx9 I was able to set sampling to 2x and then 16x and I actually saw the change.

I can't say whether the fps was affected or not, because I was hitting an error, so my fps was lowered anyway.

I would rather use OpenGL because it's cross-platform.

I disabled display lists when using DirectX, because an error said lists weren't supported by DX, so I guess that's an OpenGL thing.

Anyway, my next Dx9 error was as follows:
Code:
display:gsg:dxgsg9(error): vertex_element_type_array check not implemented yet

display:gsg:dxgsg9(error): could not find matching vertex element data for vertex shader


My whole program goes to hell with Dx9 (and I'm guessing it's no better with Dx8). The Auto Shader crashes, the fps drops, and all the visuals are messed up (textures and alpha).

Going back to the OpenGL issue: even if I set framebuffer-multisample to 0, I still get max sampling. As long as either framebuffer-multisample is set to 1 or multisamples is set to anything, I get max sampling; they don't both have to be set... (?)

Something is messed up somewhere in P3D 1.7.0 with OpenGL and multisampling.

Drwr, man if you can fix this one I think I would love you to death. :)

If this issue is present in 1.7.0, then the same issue was most likely carried to later versions.

Postby zhao » Wed Nov 16, 2011 7:28 am

To use shaders with dx9, you need to download the latest build, not the pre-packaged SDK. Most shaders will work correctly then.

As for the hardware multisampling issue, there are some bugs.

1) Under Dx9, if you enable multisampling, your off-screen buffers will have screwed-up depth testing. So if you need to use off-screen buffers, you can't use multisampling under Dx9.

2) Under OpenGL, the anti-aliasing (at least for Panda under Windows) is not MSAA but something closer to FSAA. FSAA is equivalent to rendering to a 2x buffer and then downsizing; i.e., if your window resolution is 1024x768, the internal GPU draw buffer is actually 2048x1536, which is sampled down to 1024x768 at the end. This is a big deal if you want to do a lot of post-processing or particles, as EVERYTHING pays 4x the pixel fill rate.
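
The fill-rate math above can be sketched in plain Python (a toy illustration with a hypothetical helper name, not Panda code):

```python
def supersample_cost(width, height, factor=2):
    """Internal buffer size and per-pixel fill-rate multiplier for
    naive supersampling (FSAA) at the given scale factor."""
    internal = (width * factor, height * factor)
    # Each screen pixel is backed by factor*factor internal pixels, so any
    # fill-rate-bound pass (post-processing, particles) pays this much more.
    return internal, factor * factor

# A 1024x768 window with a 2x buffer draws into 2048x1536 at 4x the fill rate:
size, cost = supersample_cost(1024, 768, 2)
print(size, cost)  # (2048, 1536) 4
```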

Bottom line, if you need to do anything fancy, don't use hardware multisampling.

If you want to do something fancy and still want AA, I suggest looking into shader solutions like MLAA and FXAA. Pretty much everyone seems to be going down this route anyway, and it's a very viable alternative to hardware AA.
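
To make the shader-based idea concrete: filters like FXAA start by computing a per-pixel luma and flagging edges from local contrast, then blend only along those edges. A toy CPU sketch of that detection step (pure Python, not a real FXAA implementation; the luma weights are the common Rec. 601 approximation):

```python
def luma(rgb):
    # Rec. 601 luma approximation, commonly used by FXAA-style filters.
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def is_edge(center, neighbors, threshold=0.2):
    # Flag a pixel as an edge when local luma contrast exceeds the threshold.
    # The filter then blends only along edges, so flat areas cost almost nothing.
    lumas = [luma(center)] + [luma(n) for n in neighbors]
    return max(lumas) - min(lumas) > threshold

# A white pixel against black neighbors is an edge; a flat gray patch is not.
print(is_edge((1, 1, 1), [(0, 0, 0)] * 4))              # True
print(is_edge((0.5, 0.5, 0.5), [(0.5, 0.5, 0.5)] * 4))  # False
```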
zhao
 
Posts: 225
Joined: Tue Nov 10, 2009 5:32 pm

Postby markjacksonguy » Wed Nov 16, 2011 9:28 am

zhao wrote: FSAA is equivalent to rendering to a 2x buffer and then downsizing. ie., if your windows resolution is 1024x768, the internal GPU draw buffer is actually 2048x1536, which is auto-sampled down to 1024x768 at the end.


So P3D kills a lot of its own speed, which I had kind of figured out. Oh well, I guess one project with P3D won't hurt, since I've already spent so much time with the engine.

I will move on afterwards. :)

Postby zhao » Wed Nov 16, 2011 3:36 pm

Of all the problems Panda has, this should be the very last reason not to use P3D. Like I said, if you have a simple setup, use the default MSAA. It is fast enough if you don't do any post-processing in OpenGL or off-screen buffers in Dx9.

If you have a complex setup, you will want to use shader-based FXAA or MLAA anyway ... unless you think you can do better than Battlefield 3, CryEngine ... which are ALL moving to shader-based AA solutions. I can't think of a reason why you wouldn't want to use shader AA, as it runs 5-6x faster than regular MSAA.

Postby markjacksonguy » Fri Nov 18, 2011 5:22 am

I have a friend who is making a more commercial-style game with P3D. He's still working on the technical stuff and hasn't done any polishing of the game yet, but his fps is great.

I asked him how he did it, and he said his fps stays at 60 because he models all his characters anywhere between 1000 and 2000 vertices.

OMG... One of my characters makes almost three of his.

Is that even possible? I mean, his characters look great. Hard to imagine they're only 1 to 2k vertices each.

I guess I have some super-low-polygon modeling skills to pick up. Thinking about it, I guess it is possible, given that bump-mapped texturing will make the model look high-detail anyway; it's just a matter of giving your models a realistic basic shape and using larger faces to cover more 3D space.

I think I can do it. :D

Even with that, I would say P3D is still only good for lightweight games. Nothing majorly hardcore.

Texturing will be harder, though, because I can't set texture stages on the individual objects that make up an entire Actor (which are all exported to egg together) without affecting the whole Actor.

In fact, individual object data is lost. Actor.getGeomNode().ls() (or however the method is spelled) does not list any geometry data, only other nodes that were attached, like collision nodes and any bones that were exposed.

Anyway,

Two of my enemy Actors on screen don't bother the frame rate at all, so I should be able to turn the two Actors into five or six by remodeling them with no more than 2k vertices each.
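
A quick sanity check on that plan in Python (assuming my current characters run about 5500 vertices each, since one of mine makes almost three of his ~2000-vertex models; both numbers are rough):

```python
# Current scene: 2 high-poly actors at an assumed ~5500 vertices each.
current_load = 2 * 5500   # 11000 vertices
# Planned scene: 6 remodeled actors capped at 2000 vertices each.
planned_load = 6 * 2000   # 12000 vertices

# Roughly the same vertex load, but three times as many actors on screen.
print(current_load, planned_load)  # 11000 12000
```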

Here goes nothing...

-----------------
Anyone tried LOTRO online? Really nice looking MMO.

