AI Libraries for Panda3D

Currently, my team at Carnegie Mellon University is working on building AI libraries for Panda3D. It would be very helpful if we could get some input from you all. Here is a list of the behaviors we are looking at implementing:

– Seek/Flee
– Arrival/Leader Follow
– Path Following
– Wander
– Queuing/Flocking
– Obstacle Avoidance and some others

We are also looking at a navigation mesh/pathfinding system. We would like the community’s input on this, and in particular we want to know:

– If you have tried anything AI-related with Panda3D, what was it?
– Is there anything missing that you would like to see?

Sounds very nice. Will it be implemented in C++ or Python?

You might also be interested to look at PandaSteer:
discourse.panda3d.org/viewtopic.php?t=2431
Though I’m not sure if it’s still being maintained.

Indeed, PandaSteer has implemented several essential steering behaviors, but when I tried to implement my AI it was not sufficient, and I found that for real-life use it makes more sense to implement some ‘autonomous pedestrians’ algorithms. As I recall, many of the geometry and trigonometry routines that are important for autonomous pedestrians are not yet implemented in the Panda core; I tried to recreate some of them in Python (but, as you probably know, math in pure Python is very inefficient unless you use Psyco): discourse.panda3d.org/viewtopic.php?t=5817
This includes: point-in-polygon tests, intersection tests, perp product (aka 2d cross product), and so on. These functions are very helpful for navmesh creation and navigation, checking whether an item is right/left/ahead/behind of another item and so on.
Pity I had no time to develop it further… :frowning:
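
For illustration, here is a minimal Python sketch of the kind of 2D helpers described above; the function names are made up for this example and are not an existing Panda3D API:

def perp_dot(ax, ay, bx, by):
    # 2D "cross product" (perp product): > 0 if b points to the left of a,
    # < 0 if to the right, 0 if the vectors are parallel.
    return ax * by - ay * bx

def side_of(px, py, ax, ay, bx, by):
    # Which side of the directed segment a->b the point p lies on.
    d = perp_dot(bx - ax, by - ay, px - ax, py - ay)
    return 'left' if d > 0 else 'right' if d < 0 else 'on'

def point_in_polygon(px, py, poly):
    # Standard ray-casting test; poly is a list of (x, y) vertices.
    inside = False
    j = len(poly) - 1
    for i in range(len(poly)):
        xi, yi = poly[i]
        xj, yj = poly[j]
        if (yi > py) != (yj > py) and \
           px < (xj - xi) * (py - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside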

Ah, we are indeed implementing it in C++ and wrapping it in Python for the most part. And yes, thanks for the link to PandaSteer…

Hello, that’s a very interesting and deep topic. I’ve done some work in this direction and want to share my thoughts.

What I’d love to see implemented:

In my opinion, A* over a grid or a node system is the most important thing Panda is missing. As has already been mentioned, Python is slow at low-level work, and A* performs a lot of exactly those small operations; if possible it should run in a thread. (Something like http://www.nouser.org/PMW/pmwiki.php/Portfolio/SparetimeAStar; if you want the code, I can dig it out.)
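
Just to illustrate the kind of thing I mean, here is a toy A* over a 4-connected grid in plain Python (a real implementation would live in C++ and ideally run off the main thread):

import heapq

def astar(grid, start, goal):
    # grid[y][x] == 0 means walkable; start and goal are (x, y) tuples.
    def h(a, b):
        return abs(a[0] - b[0]) + abs(a[1] - b[1])      # Manhattan heuristic
    open_heap = [(h(start, goal), start)]
    came_from = {start: None}
    cost = {start: 0}
    while open_heap:
        _, current = heapq.heappop(open_heap)
        if current == goal:
            path = []
            while current is not None:                  # walk back to the start
                path.append(current)
                current = came_from[current]
            return path[::-1]
        x, y = current
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = nxt
            if 0 <= ny < len(grid) and 0 <= nx < len(grid[0]) and grid[ny][nx] == 0:
                new_cost = cost[current] + 1
                if new_cost < cost.get(nxt, float('inf')):
                    cost[nxt] = new_cost
                    came_from[nxt] = current
                    heapq.heappush(open_heap, (new_cost + h(nxt, goal), nxt))
    return None                                         # no path found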

A very interesting part is pathfinding mesh generation (http://www.ai-blog.net/archives/000152.html); together with a good A*, this solves a lot of the higher-priority problems in AI.

Queuing in A* is an interesting topic as well. Having multiple actors navigate the same mesh without interfering with each other is quite hard to do, as far as I know (I’ve never really tried it myself).

Things to consider when working on seek/flee/arrival/…:

While I’ve never tried it myself, I’ve read that “Potential Function-Based Movement” can produce pretty much all of the movement styles you mentioned. The book “AI for Game Developers” from O’Reilly covers it right after the flocking algorithm.
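
The rough idea, as I understand it, is that the target exerts an attractive force and every obstacle a repulsive one, and the agent simply follows the summed force each frame. A small sketch of that idea (not the book’s exact formulation):

import math

def potential_steering(agent_pos, target_pos, obstacles, attract=1.0, repel=50.0):
    # agent_pos and target_pos are (x, y); obstacles is a list of (x, y).
    fx = fy = 0.0
    # Constant-magnitude attraction toward the target.
    dx, dy = target_pos[0] - agent_pos[0], target_pos[1] - agent_pos[1]
    dist = math.hypot(dx, dy) or 1e-6
    fx += attract * dx / dist
    fy += attract * dy / dist
    # Repulsion from each obstacle, falling off with the square of the distance.
    for ox, oy in obstacles:
        dx, dy = agent_pos[0] - ox, agent_pos[1] - oy
        dist = math.hypot(dx, dy) or 1e-6
        fx += repel * dx / dist ** 3      # (dx / dist) * repel / dist**2
        fy += repel * dy / dist ** 3
    return fx, fy                         # add to the agent's velocity, then clamp its speed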

Another thing that has a big influence on the topics you want to cover is whether you are targeting zero-gravity motion (space games) or earth-like motion. Leader following is quite a bit more complex (and interesting) in space than when following a human, because you want to achieve the same velocity vector, not the same position. While the zero-gravity approach can easily be reused for position matching, the reverse is not possible. I’ve implemented this in http://www.nouser.org/PMW/pmwiki.php/Portfolio/BachelorEliteWars, but the code may not be easy to read.
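
To make the velocity-matching point concrete, a very rough sketch (illustrative only, not the code from the project linked above):

def follow_leader_zero_g(follower_pos, follower_vel, leader_pos, leader_vel,
                         offset=(0.0, 0.0, 0.0), gain=0.5):
    # All arguments are (x, y, z) tuples; returns a thrust vector.
    # Aim for the leader's velocity, plus a small correction toward an
    # offset slot behind/beside the leader.
    desired_vel = [leader_vel[i] + gain * ((leader_pos[i] + offset[i]) - follower_pos[i])
                   for i in range(3)]
    # The thrust is the velocity error; in practice clamp it to the ship's max thrust.
    return tuple(desired_vel[i] - follower_vel[i] for i in range(3))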

Other topics:

Fuzzy logic and neural networks are another big topic in AI, but neural networks don’t seem to be something you want to cover. Fuzzy logic, however, looks like a viable approach for behavior control of AI characters.

Genetic algorithms are another interesting area. They are quite easy to implement; the important part is having good selection mechanisms. I’m using genetic algorithms in my current master’s project.
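
The selection mechanism can be as simple as tournament selection; a generic sketch (not taken from my project):

import random

def tournament_select(population, fitness, n_parents, k=3):
    # population is a list of individuals; fitness maps an individual to a score.
    # Requires k <= len(population).
    parents = []
    for _ in range(n_parents):
        contenders = random.sample(population, k)
        parents.append(max(contenders, key=fitness))
    return parents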

Inverse kinematics could be viewed as a kind of pathfinding as well. It’s a very interesting topic, and it would be cool if Panda had it. I’ve seen some work that makes ragdolls behave human-like (protecting the head while falling), which goes in a similar direction.

I already said this, but “AI for Game Developers” covers most of the topics you and I mentioned.

Navmesh-based pathfinding would be very, very nice, especially if the navmesh could come through the EGG pipeline.

The Boids-esque behaviors would be fun too, I suppose, but I personally would be best served in that respect by a simple 2D culling structure and neighborhood query. If you do do the Boids stuff, it’d be really nice if those were a layer over a well-exposed and documented culling structure.
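
For reference, the kind of 2D culling structure meant here can be as simple as a uniform grid of buckets; a sketch (not an existing Panda3D class):

from collections import defaultdict

class SpatialGrid:
    # Minimal uniform-grid spatial hash for 2D neighborhood queries,
    # e.g. boids-style neighbor lookups.
    def __init__(self, cell_size):
        self.cell_size = cell_size
        self.cells = defaultdict(list)

    def _key(self, x, y):
        return (int(x // self.cell_size), int(y // self.cell_size))

    def insert(self, obj, x, y):
        self.cells[self._key(x, y)].append((obj, x, y))

    def neighbors(self, x, y, radius):
        # Return objects within radius of (x, y) by scanning nearby cells only.
        reach = int(radius // self.cell_size) + 1
        cx, cy = self._key(x, y)
        found = []
        for i in range(cx - reach, cx + reach + 1):
            for j in range(cy - reach, cy + reach + 1):
                for obj, ox, oy in self.cells.get((i, j), ()):
                    if (ox - x) ** 2 + (oy - y) ** 2 <= radius * radius:
                        found.append(obj)
        return found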

Recast and Detour are extremely impressive. They might be enough for generating navmeshes and pathfinding.

"They are both open source projects aiming to solve some path finding related problems. Recast is automatic navigation mesh generation toolkit and Detour is a runtime component which can be used to do some spatial queries and pathfinding on navmeshes. Basically you can throw any triangles mesh Recast and it will generate a mesh from that data which allows the AI to navigate in that environment using Detour.

By releasing Recast and Detour as open source, I hope that people who usually do not have access to such technology can fulfill their crazy ideas, and I hope the code can live up to the high standards of other game developers so they can adopt it for use in their games."

Blog
http://digestingduck.blogspot.com/

Google Group
http://groups.google.com/group/recastnavigation

This is the roadmap post
http://digestingduck.blogspot.com/2009/07/recast-and-detour-roadmap.html
Sounds like jump links and area annotations are next.

My request is that, no matter what you finally release, it’s well documented… :slight_smile:

I would appreciate some Panda3D AI material to study, although I am at a very basic level. I’ve got 3D models in a space game, so zero gravity, and I don’t really have an idea right now of how to make AI ships move around convincingly, e.g. letting them land/attack/flee/group, etc.

Some tools for neural networks and genetic algorithms would be nice.

Hmm, not sure what you mean by “some tool”, but neural networks and genetic algorithms are not on our plate at the moment.

I haven’t really gotten into AI much, but I do have some thoughts about it.

Nav meshes seem like the best way to go for that kind of thing, and I would love support for them. Be sure to have some way that the various regions in the nav mesh can have values assigned which can be checked to determine which regions are usable (e.g. let me set a height value so I can prevent tall things from going somewhere, or flag something as shallow water so some things can’t go there). A set of named floats that is simply compared against a corresponding set on the things moving around would do it: if a value on the AI element is higher than the corresponding value on the nav mesh region, the region is inaccessible.
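
That check could look something like the following (pure illustration, not an existing navmesh API):

def region_accessible(agent_values, region_limits):
    # Both arguments are dicts of named floats, e.g. {'height': 2.0, 'water_depth': 0.5}.
    # A key missing from the region is treated here as 'no restriction'.
    for key, agent_value in agent_values.items():
        limit = region_limits.get(key)
        if limit is not None and agent_value > limit:
            return False
    return True

# Example: a 3.0-unit-tall creature cannot enter a region capped at 2.5 units:
# region_accessible({'height': 3.0}, {'height': 2.5})  ->  False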

Potentially, nav volumes could be useful, or really inverse volumes (volumes you can’t go into, like spheres around objects in space or whatever). Another approach to volumetric path information is a cube tree (octree). I’m not really sure whether that would be useful, though.

I have designed (but not implemented to the point of usability) an AI decision making system. I’ll describe it here. You are welcome to use anything you want from my design.

The idea is that there are abilities that the AI controls. Different urges compete for control of the abilities through a prioritization system (a priority value is computed for each urge and scaled by a tuning weight). Then lists of sets of compromising urges are evaluated (e.g. the flee urge can compromise with obstacle avoidance). These sets of compromising urges are basically modes or states (so I’ll call them states). The sum of the priorities of all the urges, each scaled by a corresponding weight specified in the state, is computed and then scaled by the state weight. The state with the highest final priority wins and becomes the dominant state, and thus the urges it contains are given control of the abilities.
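
A minimal sketch of that arbitration step, with hypothetical names (nothing here is an existing API):

def pick_dominant_state(urges, states):
    # urges:  {'flee': 0.8, 'avoid_obstacle': 0.6, 'wander': 0.1}
    #         (priority values with the tuning weights already applied)
    # states: [{'name': 'escape', 'weight': 1.2,
    #           'urge_weights': {'flee': 1.0, 'avoid_obstacle': 0.7}}, ...]
    #         (urges not listed in a state effectively have weight 0)
    best_state, best_score = None, float('-inf')
    for state in states:
        score = sum(urges.get(name, 0.0) * w
                    for name, w in state['urge_weights'].items())
        score *= state['weight']
        if score > best_score:
            best_state, best_score = state, score
    # The urges listed in the winning state get control of the abilities.
    return best_state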

I also designed my AI system to be hierarchical, meaning that a squad AI would just be a normal AI whose abilities consist of specifying input values for its members, values which could then feed into the members’ urge priorities and the actions they take with their abilities.

I really don’t see the actual priority computation and ability-use code as part of an AI framework; it’s more part of the game that uses the framework, and this is where neural networks and such might be useful.

I would also like to see any pathfinding/navigation code kept fairly separate from whatever AI system gets made to control NPCs and such, so that one can be used without the other. For ease of use, the common RPG-style case could be implemented as an example that uses both, rather than making pathfinding part of the AI or the AI part of the pathfinding.

Well, nobody can make intelligent decisions without being aware of their environment, so a low-level line-of-sight facility would be a nice thing to have, I think.
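
At its simplest that can be a segment test against the obstacles; a 2D sketch with circular obstacles (inside Panda3D one could instead cast a collision ray):

import math

def has_line_of_sight(a, b, obstacles):
    # a and b are (x, y); obstacles is a list of (x, y, radius) circles.
    ax, ay = a
    bx, by = b
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy or 1e-12
    for ox, oy, r in obstacles:
        # Project the obstacle center onto the segment and clamp to [0, 1].
        t = max(0.0, min(1.0, ((ox - ax) * dx + (oy - ay) * dy) / seg_len2))
        cx, cy = ax + t * dx, ay + t * dy
        if math.hypot(ox - cx, oy - cy) < r:
            return False                 # segment passes through this obstacle
    return True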

Pandai v0.5 Now Available for Use!

Hello Everyone,

For the first half of our semester-long ETC project, our team has worked hard to create a collection of game A.I. steering behaviors for the Panda3D engine. We currently have developed seven steering behaviors (seek, flee, pursue, evade, arrival, wander, and flock) for this collection, all of which only require simple function calls in Python to work. Most of these behaviors can also work together simultaneously, and we have provided Python programmers with the ability to set behavior priority levels in these cases. We have packaged the seven steering behaviors along with the ability to combine them in the latest release of our framework, Pandai v0.5.
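
For a taste of what those function calls look like, here is a typical seek setup. The names below follow the API as it was later integrated into Panda3D (the panda3d.ai module); the exact import in v0.5 may differ, so please check the Download page for the authoritative instructions:

from direct.showbase.ShowBase import ShowBase
from panda3d.ai import AIWorld, AICharacter

base = ShowBase()
seeker = loader.loadModel('models/smiley')     # placeholder models from the stock set
seeker.reparentTo(render)
target = loader.loadModel('models/frowney')
target.reparentTo(render)
target.setPos(20, 20, 0)

ai_world = AIWorld(render)
ai_char = AICharacter('seeker', seeker, 100, 0.05, 5)   # mass, movement force, max force
ai_world.addAiChar(ai_char)
ai_char.getAiBehaviors().seek(target)

def update_ai(task):
    ai_world.update()                          # advance all AI characters each frame
    return task.cont

taskMgr.add(update_ai, 'AIUpdate')
base.run()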

To access this latest version, please visit our TEAM WEBSITE and navigate to the “Download” page. There, you will find directions for installing Pandai v0.5 and instructions for how to use the various A.I. behaviors in your code. If you would like to read descriptions of each of the steering behaviors and view some simple demonstrations, please navigate to the “AI Types” page. Finally, if you want to see examples of some of the A.I. behaviors from this collection working in Panda3D projects, please visit our site’s “Gallery” page.

Please contact our team at groups.google.com/group/pandai-support to learn more about our project and get instant updates on our progress.

We hope that you all enjoy the A.I. systems being offered in Pandai v0.5 and that they benefit your Panda3D projects! In the second half of our semester, we are working on adding a pathfinding system to this current collection of steering behaviors, so please keep an eye out for our next release. Thank you!

We would like to personally thank David and Pro-rsoft for helping out thus far!!

Sounds great. Now, can I have a link to this site?

The link is in the earlier message as “TEAM WEBSITE” – spelled out, it is http://www.etc.cmu.edu/projects/pandai/. Please provide feedback to the CMU ETC Pandai team so that once Pandai 1.0 is released in December, it can be folded into Panda3D directly for the benefit of all. Thanks!

Looks great, amazing job!

Here’s a makefile for non-Windows users. Simply adjust the paths at the top of the makefile, place it (with the name “Makefile”) inside the “Code” directory of the source archive, and run “make install”. If the compilation went well, you should be able to use it right away.

PANDA3D_INCDIR := /usr/include/panda3d/
PANDA3D_LIBDIR := /usr/lib/panda3d/
PANDA3D_LIBS   := -lpanda
PYTHON_VERSION := python2.6
PYTHON_INCDIR  := /usr/include/$(PYTHON_VERSION)

HEADERS := $(wildcard *.h)
OBJS := $(patsubst %.cxx, %.o, $(wildcard *.cxx)) pandaai_igate.o

all: libpandaai.so
pandaai_igate.cxx: $(HEADERS)
	interrogate -S$(PANDA3D_INCDIR)/parser-inc -I$(PANDA3D_INCDIR) -oc pandaai_igate.cxx -Dvolatile -Dmutable -DCPPPARSER -D__STDC__=1 -D__cplusplus -D__inline -D__const=const -fnames -string -refcount -assert -python-native -do-module -module libpandaai -library libpandaai $(HEADERS)
%.o: %.cxx
	g++ -c -o $@ $< -fPIC -I. -I$(PANDA3D_INCDIR) -I$(PYTHON_INCDIR)
libpandaai.so: $(OBJS)
	g++ -shared -o $@ $(OBJS) -L$(PANDA3D_LIBDIR) $(PANDA3D_LIBS)
clean:
	rm -f $(OBJS) pandaai_igate.cxx libpandaai.so
test: libpandaai.so
	$(PYTHON_VERSION) -c "import libpandaai"
install: libpandaai.so
	cp libpandaai.so /usr/local/lib/libpandaai.so
uninstall:
	rm /usr/local/lib/libpandaai.so
.PHONY: all test clean install uninstall

– pro-rsoft

Great, thanks for that library, and thanks to pro-rsoft as well. I’ve managed to compile it under Snow Leopard using the makefile with some adaptations; maybe they are useful for someone else:

PANDA3D_INCDIR := /Applications/Panda3d/1.7.0/include
PANDA3D_LIBDIR := /Applications/Panda3d/1.7.0/lib
PANDA3D_LIBS   := -lpanda
PYTHON_VERSION := python2.5
PYTHON_INCDIR  := /usr/include/$(PYTHON_VERSION)

HEADERS := $(wildcard *.h)
OBJS := $(patsubst %.cxx, %.o, $(wildcard *.cxx)) pandaai_igate.o

all: libpandaai.so
pandaai_igate.cxx: $(HEADERS)
	interrogate -S$(PANDA3D_INCDIR)/parser-inc -I$(PANDA3D_INCDIR) -oc pandaai_igate.cxx -Dvolatile -Dmutable -DCPPPARSER -D__STDC__=1 -D__cplusplus -D__inline -D__const=const -fnames -string -refcount -assert -python-native -do-module -module libpandaai -library libpandaai $(HEADERS)
%.o: %.cxx
	g++ -c -o $@ $< -fPIC -I. -arch i386 -I$(PANDA3D_INCDIR) -I$(PYTHON_INCDIR)
libpandaai.so: $(OBJS)
	g++ -shared -o $@ $(OBJS) -arch i386 -L$(PANDA3D_LIBDIR) $(PANDA3D_LIBS) -undefined dynamic_lookup
clean:
	rm -f $(OBJS) pandaai_igate.cxx libpandaai.so
test: libpandaai.so
	$(PYTHON_VERSION) -c "import libpandaai"
install: libpandaai.so
	cp libpandaai.so /usr/local/lib/libpandaai.so
uninstall:
	rm /usr/local/lib/libpandaai.so
.PHONY: all test clean install uninstall

I haven’t yet had the time to really test it.

Edit: the “Seek and Flee Egg Demo” works fine.

Great stuff, people. Very nice work.

For now I’ve just been browsing through your project pages, but I will soon get to test everything.
Apart from the downloads you’ve already provided, it would be great if you could make available the sources used to create all those demo movies.
I will follow this closely and am looking forward to the 1.0 version.

Hi radu,

We have some demo code on the Downloads page for the 0.5 version. It is the code for the EGG’s seek-and-flee video on the Gallery page. The AI aspects of the demos are explained in detail on the Download page.

If you are still having trouble with any of our AI, do let us know and we will provide you with assistance (via a code sample or by replying on the forum).

Cheers,
Team Pandai