Stumbled upon this useful article: http://www.mfoot.com/blog/2012/01/23/choosing-a-maths-library-for-cross-platform-c-game-development/
I've been using OpenCV for generic linear algebra, and some other libraries for fixed-size 3D matrices and vectors. However, I've grown tired of OpenCV's quirks, especially the difficulty of debugging due to its internals, where void pointers are passed around and a custom RTTI system tries to make sense of the data (a legacy of the first OpenCV versions that were C only). It seems to be fairly easy to cause an exception for which one doesn't even get a stack trace. Maybe it's time to switch to Eigen, which is apparently equally useful for large and small matrices, and also performs faster than GLM.
Edit May 27: Eigen definitely seems to work well and is easy to use. Creating math-heavy applications in C++ just got so much easier.
Edit June 12: I'm a full Eigen convert now, preaching to everybody who cares to listen =) Many matrix size and alignment errors are caught at compile time, and the API is flexible - for example, one can declare a matrix m and then write Vector3f v = m.rowwise().sum(). More in this API showcase.
Saturday, May 17, 2014
Beyond Ragdolls (we're hiring)
I'm proud to share the first public results from our Future Game Animation research project, supported by 7 game and animation companies and Tekes (Finnish Funding Agency for Innovation). Our paper "Online Motion Synthesis Using Sequential Monte Carlo" was accepted to this year's ACM SIGGRAPH conference.
Note: we're hiring! If you'd like to do a doctoral or Master's thesis about character motion synthesis using physics simulation, email me at perttu.hamalainen@aalto.fi.
Here's the video accompanying the paper:
Here are also some image sequences:
Executive summary: We've developed an "intelligent ragdoll" that can dodge projectiles, land on its feet, and get up after falling. All movement is emergent and does not require a library of animation or motion capture data. The goal is to eliminate animation-related delays in game prototyping and development, and enable new forms of gameplay.
I've previously posted a Unity script for turning a Mecanim character into a ragdoll that can get up after falling. While the script is quite simple, the resulting motion does not respect physical constraints like non-penetration, and the script also needs the get up animations. Using considerably more code and CPU, our research system - also integrated with Unity - can generate getting up and other behaviors without any animation data, by optimizing the control parameters of physics simulation. This will probably be how intelligent characters are implemented in future games, although we still need more work on computational efficiency and movement quality. We will be working on this at least until the end of 2015. In addition to improving the technology, we also plan to create some proof-of-concept prototypes of novel gameplay that could be created with it.
FAQ:
Q: Isn't this like NaturalMotion's Morpheme/Endorphin?
A: Yes and no. What I've seen from NaturalMotion (e.g., Clumsy Ninja) is more like my Mecanim ragdoll script in that it uses a lot of premade animation. While the press and NaturalMotion's marketing like to portray it as "one big simulation", Clumsy Ninja's kicks, trampoline jumps, getting up and other behaviors are premade animations. There's some degree of procedural footstep generation when dragging the character, but also artificial, non-physical glitches and sliding, e.g., when the character adjusts its distance to the target before kicking or punching.
I'm not saying that Clumsy Ninja isn't a great product. However, creating such believable and intelligent characters is presently highly expensive due to the amount of custom motion capture, animation, and scripting. The main point of our work is that it requires no motion capture, animation, or precomputation and is fully based on physics simulation.
Q: What about the recent viral video where characters were walking?
A: The video (by Geijtenbeek et al., who we do cite in our paper and have enormous respect for) shows the results of offline optimization of a controller that can generate walking in real time. The drawbacks there are that 1) the offline optimization takes a lot of time and 2) the controller is motion-specific, and the characters can't, e.g., get up after falling (which also applies to NaturalMotion's procedural balancers used in Endorphin). Note that the boxes thrown at the characters in the video appear very light and the steering and ground angle variations are quite modest. Our system does not require offline optimization or other precomputation and produces a wider range of movements, although the quality is not as good (we're working on that...).
Q: Can I download the code?
A: The system is presently quite complex and very much a work in progress. We're planning on an open source release sometime in 2015.
Augmented climbing and HCI in sports
At this year's ACM CHI conference, we had three submissions: a video showcase and a poster/extended abstract about our augmented climbing wall (a project of Raine Kajastila, a post-doc in our group), and an HCI in sports workshop paper about designing for empowerment in full-body human-computer interaction.
Thursday, October 17, 2013
Unity and version control
Here's some info about version control in Unity for this fall's game project course. Summary: it's ok to use git, svn etc. if you know a few tips, and Aalto doesn't have licenses for Unity's own version control system (the Asset Server). Only add the Assets and ProjectSettings folders to version control. Don't add the automatically generated Library folder or the MonoDevelop and Visual Studio solution files.
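For git users, the folder rules above translate into a short .gitignore in the project root. Here's a minimal sketch (assuming git; the exact IDE file patterns may vary with your MonoDevelop/Visual Studio setup):

# Keep only Assets/ and ProjectSettings/ under version control.
Library/
Temp/
# MonoDevelop / Visual Studio files are regenerated by Unity.
*.csproj
*.unityproj
*.sln
*.userprefs
*.pidb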
Also: DON'T SWITCH BACK TO UNITY WHILE YOU'RE GETTING/UPDATING FILES FROM VERSION CONTROL. This will cause major errors, as Unity starts importing and reimporting the files right away, and in the process it may create new .meta files if the correct .meta file was not yet updated.
Basic info about setting up external version control:
Some additional points:
- If you are using Unity Pro or Academic (e.g., on school computers), go to Edit/Project Settings/Editor and set asset serialization to "force text", which enables svn, git etc. to merge the files.
- However, the merging doesn't always work, and to minimize conflicts, you should avoid modifying your scene files. The way to do this is to have everything as prefabs and edit the prefabs instead of modifying the scene. This way, it's less likely that several people modify the same file.
- Enabling external version control as explained in the document above creates a .meta file for each file. Always remember to add the metafiles to your repository. The metafile contains the unique identifier (GUID) that Unity uses, e.g., when you link a prefab to a script's public GameObject property. If you don't commit a metafile to the repository, it will be regenerated on each person's machine with different GUIDs, which breaks the object linking and you get null reference errors.
- Be careful when moving and deleting assets. If you delete or move assets inside the Unity editor, svn etc. will of course bring them back when you update before committing your changes. However, once the metafiles are in use, you can safely move and delete files outside Unity as long as you move and delete the metafiles too. Unity will recognize the moved assets as the same ones based on the metafiles.
- When googling to check if there's a good tutorial or if I missed something above, I noticed quite a lot of claims that the free version of Unity doesn't support version control. However, things have changed, and as of version 3.5, metafiles are no longer a pro-only feature.
- The "force text" option is apparently a pro-only feature, but there are workarounds: http://answers.unity3d.com/questions/8740/version-control-workflow.html & https://github.com/terravision/UnityTextScene
Unity and FMOD Studio integration
Edit 12 June 2014: This post was based on the very first FMOD Unity integration from fall 2013. There have been changes since, and the project file should probably be updated. Please (also) check some more recent tutorials.
There's finally an official FMOD Studio integration for Unity, available from http://www.fmod.org/download/. It has just been released and contains no Unity examples, so I prepared a bare-bones Unity project and an example FMOD Studio project that you can download here: https://www.dropbox.com/s/vyk29kyhmvm22dc/part6_sound.zip
The main reason to use this is that it enables your sound designers to edit, mix and master your game audio using the FMOD Studio interface while the game is running. Using FMOD Studio, you can also create, e.g., a footstep sound event that randomizes the pitch and the selected sound sample without the game having to know anything other than the name of the event. Using FMOD is free for non-commercial purposes, and the license prices are also pretty reasonable for commercial work.
Most commercial games with decent sound design use such a live mixing and centralized sound management approach, either with FMOD, Wwise or some custom tools. Unity does use FMOD for its own sound, but it does not provide any mixing tools, and a sound designer will go insane if he/she has to manage hundreds of sound objects in the Unity editor. The main limitation of the FMOD Studio integration is that it only supports iOS, Android, Windows and Mac, but not the web player. If you want your sounds to work in the web player, I suggest you check out the Clockstone Audio Toolkit, which provides mixing and management on top of Unity's internal sound system.
Here are the steps you need to add FMOD Studio support to your Unity project. These instructions can also be found in the readme.txt of the .zip above.
- Import the fmodstudio10203.unitypackage from this folder. You can also check for a later version from fmod.org. To import it in the Unity editor, select Assets/Import Package/Custom Package.
- Build your FMOD banks in FMOD Studio (File/Build).
- In Unity, select FMOD/Refresh Event List and select the GUIDs.txt from your FMOD project's Build folder. Note: you need to do this whenever you rebuild the FMOD project. This will create FMODAssets and StreamingAssets folders in the Unity project and copy the FMOD banks to the StreamingAssets folder.
- When your scene starts, load a bank (see the Start() method in main.cs)
- Use the FMOD_StudioSystem helpers such as PlayOneShot() to play sound. See the OnGUI() method of main.cs, and the minimal sketch after this list. You might also have to use the lower-level API (fmod.cs, fmodstudio.cs), as the Unity integration package has been released very recently. The integration package doesn't contain any examples, and for example code you should consult the C++ examples in the FMOD Studio API install, which you can download from fmod.org.
- To enable live mixing using FMOD Studio, uncomment the first line of FMOD_StudioSystem.cs, found in the plugins/FMOD folder. Also check the "Run in background" checkbox in the Unity editor; you'll find it by selecting Edit/Project Settings/Player and then the Resolution and Presentation panel in the Inspector. Now press Play in Unity, select File/Connect to game in FMOD Studio and click OK. You can now adjust audio event properties on the fly while the game is running.
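To tie the scripting steps above together, here's a hypothetical minimal component that plays a one-shot event. It assumes the FMOD_StudioSystem singleton and PlayOneShot() helper mentioned above, and uses a placeholder event path ("event:/footstep") that you'd replace with an event from your own FMOD Studio project; exact class and method signatures may differ between integration versions, so check main.cs and the bundled scripts.

using UnityEngine;

public class FmodOneShotSketch : MonoBehaviour
{
    void Start()
    {
        // Load your master bank here when the scene starts; the Start() method of
        // main.cs in the example project shows the exact call for this integration.
    }

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Space))
        {
            // Fire-and-forget playback of a one-shot event at this object's position.
            // "event:/footstep" is a placeholder path, not part of the example project.
            FMOD_StudioSystem.instance.PlayOneShot("event:/footstep", transform.position);
        }
    }
}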
Thursday, October 10, 2013
Unity Mecanim and ragdolls
Update 17 May 2014: If you're interested in physics simulation and character animation, check out our latest research. Also, please note that Mecanim has had some changes since the original post, and the script could now be implemented more elegantly. I also haven't tested the script with different characters - the comments suggest that some characters with a different bone structure might not work out of the box.
I'm sharing a Unity project with a RagdollHelper.cs script that can be used to turn a Mecanim character into a ragdoll and then back. Using the script is simple: there's just one public property called "ragdolled", which can be set to true or false. When the property is changed from true to false, the script blends from the ragdolled pose to a get up animation. This is illustrated in the video below.
Transitioning from Mecanim animation to a ragdoll is as easy as disabling the Animator component of the character and setting all ragdoll rigid bodies to non-kinematic. The reverse is more complex, and my solution is not without kludges, as Mecanim doesn't allow as low-level control of the animation as Unity's older animation system. The main problems are that one can't use scripting to enforce an immediate state transition without a blend, and apparently one also can't add states programmatically to an animation controller.
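For the animation-to-ragdoll direction, a minimal sketch of that toggle looks roughly like this (illustrative names only, not the actual RagdollHelper.cs):

using UnityEngine;

public class RagdollToggleSketch : MonoBehaviour
{
    Animator animator;
    Rigidbody[] bodies;

    void Awake()
    {
        animator = GetComponent<Animator>();
        // Assumes the ragdoll rigid bodies are children of this character.
        bodies = GetComponentsInChildren<Rigidbody>();
    }

    // Switch the character from Mecanim control to pure physics simulation.
    public void EnableRagdoll()
    {
        animator.enabled = false;      // stop Mecanim from overwriting the pose
        foreach (Rigidbody rb in bodies)
            rb.isKinematic = false;    // let the physics engine drive each body part
    }
}

In RagdollHelper.cs, this direction is what setting the ragdolled property to true triggers; the interesting work is in the reverse direction described next.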
In the old animation system, one could create a new animation clip from the ragdoll's current pose, turn off ragdolling and blend the animation from the created clip to a recovery animation. In Mecanim this is not possible, and after turning Mecanim back on, we have to manually lerp the hip position and slerp all body part rotations back towards the last ragdolled pose in LateUpdate() to achieve a smooth blend. Additionally, we have to wait some frames for Mecanim to transition to the get up animation while using LateUpdate() to keep the pose at the ragdolled one, then read back the body orientation and position at the start of the get up animation, and update the Mecanim character's root so that the animated orientation and position match the ragdolled ones as well as possible. In the old system we could just update the character transforms with a single script command according to a specific frame of a given animation, and then read whatever info we need. We could possibly use the legacy animation system for that, but then the script would need additional handles to the animation etc., so it seems there's no really simple and beautiful way. One could also perhaps use an editor script to precompute the info.
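To make the blending idea concrete, here's a rough illustrative sketch of that LateUpdate() pass (simplified, not the actual RagdollHelper.cs; it assumes the body part transforms and their ragdolled pose were stored in a snapshot array at the moment ragdolling ended):

using UnityEngine;

public class RagdollBlendSketch : MonoBehaviour
{
    // Snapshot of one body part, taken when ragdolling ends.
    struct BodyPartSnapshot
    {
        public Transform transform;
        public Vector3 position;
        public Quaternion rotation;
    }

    BodyPartSnapshot[] snapshots;   // assumed to be filled when leaving the ragdoll state
    Transform hips;                 // the character's hip/root bone
    float blend;                    // 1 = fully ragdolled pose, 0 = fully animated
    public float blendTime = 0.5f;  // assumed blend duration in seconds

    void LateUpdate()
    {
        if (snapshots == null || blend <= 0f)
            return;

        blend = Mathf.Max(0f, blend - Time.deltaTime / blendTime);

        // Mecanim has already written this frame's animated pose, so pull it back
        // towards the stored ragdoll pose: lerp the hip position and slerp every
        // body part rotation, weighted by the remaining blend amount.
        foreach (BodyPartSnapshot s in snapshots)
        {
            if (s.transform == hips)
                s.transform.position = Vector3.Lerp(s.transform.position, s.position, blend);
            s.transform.rotation = Quaternion.Slerp(s.transform.rotation, s.rotation, blend);
        }
    }
}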
Here's the Unity project as a zip file in case someone finds it useful: https://drive.google.com/file/d/0B8lAtheEAvgmYUlqX1FjNm84cVU/view?usp=sharing&resourcekey=0-kVdY0_KU-BRQP9xZibN0Fw
Open the test scene, click with the mouse on a body part to apply an impact to the character, and press space to make it get back up.
Update 5 Dec 2013: Today I noticed that Unity 4.3 has added Animator.Update(), which could be used to implement the ragdoll-to-animation blend more elegantly without the LateUpdate() trickery. I'll have to update this example when I have time.
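An untested sketch of that idea (the "GetUp" state name is an assumption about the animation controller, and the approach hasn't been verified):

using UnityEngine;

public class GetUpSamplerSketch : MonoBehaviour
{
    // Untested sketch of the Animator.Update() idea above (requires Unity 4.3+).
    public void SampleGetUpPose()
    {
        Animator animator = GetComponent<Animator>();
        animator.enabled = true;
        animator.Play("GetUp", 0, 0f);  // jump straight to the first frame of the get up state
        animator.Update(0f);            // force Mecanim to evaluate the new pose immediately
        // The get up animation's starting pose can now be read from the bone transforms
        // in the same frame, instead of waiting several frames with LateUpdate() trickery.
    }
}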
Monday, September 23, 2013
Homepage
Restored and updated my old research homepage. http://perttu.info is now redirecting there. Nothing special, mainly lists of publications and projects.