SIMAI-X1, or ‘Simmy’, is a simulation written in Borland C++ and OpenGL that models a single robot in a 3D environment: a simple virtual confine with a food source and a few chosen objects, used to demonstrate the different subjective brain states that can be generated by the Xzistor Concept brain model. These include sensing, association-forming, recognition, cognition, thinking, learning, initiative, problem solving, pain, fear, anger, dreaming, forgetting – to name but a few! The aim of this simulation was to test the fully integrated brain model under dynamic conditions to see if there were any exceptions to its generality. As I am a mechanical engineer with precious little experience of electronics, building a physical robot seemed quite daunting at the time. Remember, these were the days before Raspberry Pi, Arduino and Lego Mindstorms EV3.
Let’s take a look at Simmy in its 3D environment:
The video section for Simmy (here) shows the little simulated robot moving about in its confine and demonstrating different brain states.
What is not obvious in the image above is that Simmy’s Learning Confine actually looks exactly like Troopy’s, except that it is square rather than round. It has all the colour panels that Troopy’s circular confine has, but I built an option into the simulation program to hide the colour panels from the viewer; this gives a cleaner look and makes it easier to follow Simmy’s movements. Another button displays exactly what Simmy is ‘seeing’ just above his head. Of course he has no video camera, so what Simmy sees is calculated from where he is positioned in the Learning Confine and in what direction he is looking. For Simmy we used purely the colour panels to teach it navigation, whereas Troopy has the option to use either object recognition or colour panels, or both.
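For readers curious how a ‘view’ can be derived without a camera, here is a minimal, hypothetical C++ sketch (not the actual SIMAI-X1 code): cast a ray from Simmy’s position along his heading, find which wall of the square confine it hits, and report which colour panel that point falls on. The confine size, panel layout and function names are illustrative assumptions.

```cpp
// Hypothetical sketch: derive what a simulated robot "sees" purely from its
// pose inside a square confine whose walls carry colour panels.
#include <cmath>
#include <cstdio>

const double CONFINE = 10.0;     // square confine spans 0..10 on x and y (assumed)
const int PANELS_PER_WALL = 4;   // each wall carries 4 equal colour panels (assumed)

struct Pose { double x, y, heading; };   // heading in radians, 0 = +x axis

// Cast the view ray against the four walls; return the wall index (0..3) and
// the colour panel index along that wall.
bool whatSimmySees(const Pose& p, int& wall, int& panel)
{
    double dx = std::cos(p.heading), dy = std::sin(p.heading);
    double bestT = 1e9;
    wall = -1;

    // Wall 0: x = CONFINE (east), Wall 1: x = 0 (west)
    if (dx >  1e-9) { double t = (CONFINE - p.x) / dx; if (t < bestT) { bestT = t; wall = 0; } }
    if (dx < -1e-9) { double t = (0.0     - p.x) / dx; if (t < bestT) { bestT = t; wall = 1; } }
    // Wall 2: y = CONFINE (north), Wall 3: y = 0 (south)
    if (dy >  1e-9) { double t = (CONFINE - p.y) / dy; if (t < bestT) { bestT = t; wall = 2; } }
    if (dy < -1e-9) { double t = (0.0     - p.y) / dy; if (t < bestT) { bestT = t; wall = 3; } }
    if (wall < 0) return false;

    double hitX = p.x + bestT * dx;
    double hitY = p.y + bestT * dy;
    // The position along the wall decides which colour panel is in view.
    double along = (wall < 2) ? hitY : hitX;
    panel = (int)(along / (CONFINE / PANELS_PER_WALL));
    if (panel >= PANELS_PER_WALL) panel = PANELS_PER_WALL - 1;
    return true;
}

int main()
{
    Pose simmy = { 5.0, 5.0, 0.35 };   // near the middle, looking roughly east
    int wall, panel;
    if (whatSimmySees(simmy, wall, panel))
        std::printf("Simmy sees wall %d, colour panel %d\n", wall, panel);
    return 0;
}
```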
I love this intrepid little virtual agent! I spent so many late nights working on this C++/OpenGL program and was delighted when it suddenly started to come alive – that was the first time I experienced what I have come to call the Pinocchio moment.
Whilst going about his little life inside his virtual Learning Confine, Simmy has no idea he lives in a simulation. Touching walls and running into the cactus feels exactly as real for Simmy as it feels for Troopy. So Xzistor robots can be given a ‘basic’ training in a virtual environment before the ‘experience files’ are downloaded into real physical robots. Simmy’s complete experience file (memory) can be downloaded at the push of a single button. Equally, a new Simmy can be started and the experience file read in at the push of a single button. I will demonstrate this in a demo video on memory and learning. I deliberately designed Simmy and Troopy with this capability so as not to have to teach them from scratch every time I switch off the computer or terminate the simulation.
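To give a feel for the experience-file idea, here is a minimal, hypothetical C++ sketch (not the actual SIMAI-X1 file format): Simmy’s learned associations are written out in one go and can be read back into a fresh simulation, so training never has to start from scratch. The structure and field names below are assumptions made for the example.

```cpp
// Hypothetical sketch: save and load an "experience file" of learned associations.
#include <cstdio>
#include <vector>

struct Association {
    int stimulusId;    // e.g. a colour panel or object Simmy has sensed (assumed)
    int responseId;    // e.g. the action linked to it (assumed)
    double strength;   // how firmly the association has been learned (assumed)
};

bool saveExperience(const char* path, const std::vector<Association>& memory)
{
    std::FILE* f = std::fopen(path, "wb");
    if (!f) return false;
    size_t n = memory.size();
    std::fwrite(&n, sizeof(n), 1, f);
    if (n) std::fwrite(&memory[0], sizeof(Association), n, f);
    std::fclose(f);
    return true;
}

bool loadExperience(const char* path, std::vector<Association>& memory)
{
    std::FILE* f = std::fopen(path, "rb");
    if (!f) return false;
    size_t n = 0;
    if (std::fread(&n, sizeof(n), 1, f) != 1) { std::fclose(f); return false; }
    memory.resize(n);
    if (n && std::fread(&memory[0], sizeof(Association), n, f) != n) { std::fclose(f); return false; }
    std::fclose(f);
    return true;
}

int main()
{
    std::vector<Association> memory;
    Association a; a.stimulusId = 3; a.responseId = 7; a.strength = 0.82;
    memory.push_back(a);                               // one learned association
    saveExperience("simmy_experience.dat", memory);    // "push of a button"

    std::vector<Association> freshSimmy;               // a brand-new Simmy
    loadExperience("simmy_experience.dat", freshSimmy);
    std::printf("Loaded %u associations\n", (unsigned)freshSimmy.size());
    return 0;
}
```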
This C++/OpenGL simulation proved to be quite versatile: when I demonstrated it at a university in Germany, I was able to reproduce the simulation by replacing Simmy with the basic geometry of their in-house laboratory robot – called Volksbot, built by the Fraunhofer Institute and used for experiments and rapid prototyping. I was making some suggestions around laser sensing and, within an hour or so, was able to demonstrate in a 3D simulation how their robot could be changed into an Xzistor Concept robot. See here:
Go here for a picture gallery of Simmy.