It was 2005 when Harmonix Music Systems burst onto the video game scene with Guitar Hero, the now-legendary (and addictive) video game in which players simulate playing lead guitar on their favorite rock songs. The Cambridge, Massachusetts–based company upped the ante in 2007 when it released Rock Band, which allowed individuals or teams to play guitar, drums, or sing songs from bands including The Beatles, Green Day, and a host of others. Now, Harmonix is encouraging people to get up and dance with Dance Central™, an extraordinarily ambitious new title that coincided with the release of the Kinect™ for Xbox 360®, which enables players to control and interact with the Xbox 360 video game and entertainment system from Microsoft through natural gestures and spoken commands.
Dance Central lets players choose one of eight characters whose movements they mirror; by learning various dance moves and routines, they can perform, work out, or compete against other players, all without a handheld controller. Using a combination of Autodesk® 3ds Max® and Autodesk® MotionBuilder® software, Harmonix has created a strikingly realistic new game.
While rapid advancements in motion capture technology have already enabled video game enthusiasts to closely mirror the action of animated characters, the creators of Dance Central clearly wanted to take things to a whole new level.
“We wanted Dance Central to be an absolutely full-motion dance experience,” says Dare Matheson, lead artist on the game. “Our vision for the game was not just tracking individual positions of the player’s body at different points in time, but clearly and precisely tracking full-body motions through time, while scoring them on the complete dance moves they are executing.”
Working closely with a team of choreographers and dancers, Harmonix would eventually develop 90 dance routines and over 650 dance moves for Dance Central. Matheson continues: “We wanted people, even if they’d never danced before, to be able to learn and perform the routines while receiving feedback and correction. To do that, we built a system that would enable Kinect to not only track their motions, but also apply and compare their routines to our choreographed routines. Obviously, that is an extremely organic and open-ended task for a computer to handle well, but we were determined to see it come to life.”
While the avatar characters in the Rock Band series are compelling in their own right, they are also typically customized by the players who will inhabit them. For Dance Central, however, the Harmonix team wanted to exhibit the kind of personalities that only dance can bring out.
“For Dance Central, the characters are not just tools; they are the interface,” says Matheson. “We were super-psyched about creating a limited set of distinctive personalities from start to finish. We wanted our characters and animations to look mind-blowingly smooth and gorgeous. Autodesk 3ds Max and MotionBuilder were absolutely crucial to achieving our vision.”
“The quality of animation was vital to the success of Dance Central,” agrees Riseon Kim, lead animator at Harmonix. “We used motion capture for every song and routine, as well as for unique introductions for each character. We spent a good amount of time building and testing our character rig in 3ds Max and coming up with some tricks to make things work. After creating and testing our character rig in 3ds Max, we used MotionBuilder to continue working on the motion-captured data. It was crucial for the animators to make everything look very clear and understandable for the players. The animation had to be exaggerated, and every dance move had to be absolutely accurate. We found the Story mode in MotionBuilder very useful when we needed to blend two different clips and merge animations on different characters easily. The simple import and export capabilities between 3ds Max and MotionBuilder were also helpful when we were working on our character rigs, which involved a lot of going back and forth between the two applications.”
In addition to the larger character rig created in 3ds Max, the Harmonix team was able to create a multitude of custom rigs for various body parts.
“Achieving a full and believable range of motion in dancing characters can be a big challenge,” says Matheson. “Using 3ds Max, we were able to develop techniques to help keep body parts from collapsing from extreme limb positions. We discovered, for example, that when a character’s leg kicked high, the butt would completely flatten out. With 3ds Max, we were able to create a ‘booty rig’ to prevent that problem. We then used MotionBuilder to smooth and polish the animations until they were seamless.”
Artist QA: Peter MacDonald
Peter MacDonald is a senior artist at Harmonix Music Systems, the premier developer of music video games. Peter lives in Somerville, Massachusetts, and was educated as an animator and printmaker at the University of Massachusetts. Most of Peter’s career has focused on environments and art directing. His first encounter with Autodesk software was in 1993, with an internship at an architecture firm using AutoCAD, and later an early version of 3D Studio Max to create architectural visualizations. In 1995 Peter joined the game industry, eventually using Maya for many years, and then, upon joining Harmonix in 2005, he started using 3ds Max again…it had come a long way in 10 years. Peter’s game credits include Asheron’s Call, Dungeons and Dragons Online, Guitar Hero 2, Rock Band, Rock Band 2, and Rock Band 3.
Hi Peter, welcome to the AREA!
As Senior Artist, what is your role and daily responsibilities at Harmonix Music Systems?
For the past several years, I’ve been the Art Lead for the Rock Band games, meaning I supervised all the art for the product – everything from the intro cinematic to the interface, from the characters and environments to the credits. I work with the other discipline leads to negotiate what the game is; I help the artists prioritize their work, and I steer the look and feel of art in development through critique.
You've been in the games industry since 1995; how did you get started with 3D and games?
I started using 3D software at UMass in 1990. It was a BFA program with a concentration in Computer Art and Animation. 3D was only a very small part of the program, however. The hardware and software at the time were not very flexible, and render times were prohibitively long, so we spent more time on 2D animation and traditional methods. In 1995, a close friend of mine was hired by a startup in Providence, RI. Their mission was to make the first 3D MMORPG (although that phrase had not yet been coined). They needed artists who could use a computer, so my friend called me up. It sounded like more fun than the architecture visualization stuff I was doing, so I was excited to make the jump into video games.
In what capacity did you work on other game titles like Asheron's Call and Dungeons and Dragons Online?
I started as an environment artist. For Asheron’s Call, I modeled the entire world and most of the exterior architecture. I was promoted to Art Director after Asheron’s Call and started on the long, painful road of learning how to manage artists. I worked at Turbine for 9 years, but it felt like several different jobs because my role and the management of the company kept changing. I did a little animation (nothing major), helped establish the technical art pipeline, produced a ton of concept art, and hired and mentored dozens of artists. At some point, we hired a more experienced Art Director, and I worked on DDO as the lead artist during pre-production and as the interface designer until I left Turbine to join Harmonix.
Now at Harmonix, you've worked on the 1st, 2nd and 3rd Rock Band titles. What can you tell us about the improvements and streamlined workflows made over each of the sequels?
Rock Band was the largest game Harmonix had ever made. For the first time, we had to manage 40 different venues, a character creator with hundreds of parts, hundreds of UI screens, and thousands of animations. One of our biggest challenges was just staying organized!
During Rock Band 2, we refined our workflows with our outsource partners quite a bit. We had learned a lot of lessons about cleaning up motion capture done off-site and providing clear orthographic concepts to off-site character modelers.
Before Rock Band 3, we finally started using MotionBuilder and once again refined our workflow with our off-site mocap partner. Our internal tools grew too, in ways to facilitate the production of Rock Band song “authoring” for downloadable content. Our character creator took a big leap forward, as did our in-engine lighting.
With the option of customizable characters, did you have some special setups to accommodate/automate and work with all of the individual eyes, noses, mouths, chins, etc. when preparing them for facial rigs?
We did a very robust face-creator prototype completely within 3ds Max. This is where we figured out how to combine morph targets, what our base meshes needed to be like, and what sorts of controls made sense. It was a very successful prototype. It answered tons of questions for us and enabled us to engineer the in-game face creator, and get it 90% right on the first try.
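The core of a face creator like the one described above is morph-target (blend-shape) combination: each slider adds a weighted per-vertex offset to a base mesh. Here is a minimal, self-contained sketch of that idea; the function and slider names are illustrative assumptions, not Harmonix's actual tooling or data.

```python
# Minimal sketch of morph-target (blend-shape) combination, the technique
# a face creator builds on. All names and data are illustrative.

def blend_morphs(base, targets, weights):
    """Combine a base mesh with weighted morph-target deltas.

    base    -- list of (x, y, z) vertex positions for the neutral face
    targets -- dict: target name -> list of (dx, dy, dz) per-vertex offsets
    weights -- dict: target name -> float in [0, 1], set by the player's sliders
    """
    result = [list(v) for v in base]
    for name, deltas in targets.items():
        w = weights.get(name, 0.0)
        if w == 0.0:
            continue  # inactive slider: skip the whole target
        for i, (dx, dy, dz) in enumerate(deltas):
            result[i][0] += w * dx
            result[i][1] += w * dy
            result[i][2] += w * dz
    return [tuple(v) for v in result]

# Example: one vertex, two hypothetical sliders
base = [(0.0, 0.0, 0.0)]
targets = {"wide_nose": [(1.0, 0.0, 0.0)], "tall_chin": [(0.0, 2.0, 0.0)]}
print(blend_morphs(base, targets, {"wide_nose": 0.5, "tall_chin": 0.25}))
# -> [(0.5, 0.5, 0.0)]
```

Because the deltas add linearly, any combination of sliders stays predictable, which is what makes it practical to prototype the whole system in 3ds Max before engineering the in-game version.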
We also created detailed deformation skeletons in 3ds Max to get our different body types. This allowed us to preview how body-type deformation would affect any piece of clothing we were working on and make adjustments before exporting to the game engine.
Are you using any custom or 3rd party scripts/plugins for your daily work?
We have made tons of custom scripts over the years. Many of them deal with skinning issues unique to our characters and instruments, or automate certain skinning tasks we have to do over and over again. Others are rarely used, such as one for creating a marquee wall of lights, or one for automating the creation of a 3D font. I’m not aware of any 3rd party scripts that we use.
Aside from the games made at Harmonix, what PC/console games do you play?
I used to play Counter-Strike like a maniac, but I haven’t played many shooters since then. I’ve really had fun with Assassin’s Creed Brotherhood recently. I play every Zelda game that comes out. I played Fable 2 this year, and I play a lot of small, smart, arty games like Flower and Limbo. Funnily enough, I’ve never been able to get into any MMOs.
What about musical instruments ;-)?
I’ve been a mediocre guitarist for about 20 years and have been messing around with the Bass Guitar for the past four years. Don’t go looking for my debut album: Pete Mac Attack or my solo album: The Houseboat Sessions or my live reunion tour album: Mac Attack Re-Pete or my greatest hits album: Pete Forever, because none of them are in print anymore. In fact, I may have just imagined them.
Thanks for spending the time with us, Peter!
My pleasure! Thank you for all the wonderful software, and hats off to the awesome community of smart people you have here at the Area.
Artist QA: Riseon Kim
Riseon Kim earned her B.F.A. from the School of the Art Institute of Chicago and her M.F.A. from the Academy of Art University. Riseon started her career as an animator at Harmonix, where her published credits include Rock Band, Rock Band 2, The Beatles: Rock Band, and Dance Central, on which she served as lead animator.
Based on your education background, it looks like you came from a traditional 2D background. How did you get started with 3D animation?
I was always interested in storytelling and animation, but my background actually comes from painting and visual communication. I was working on a project that required some 3D animation, and that’s when I realized animation was something that could bring every piece of art together. As an animator, 2D or 3D is only a matter of tool choice, but being able to create the entire scene in 3D software was attractive to me: modeling, shading, lighting, and animating.
What apps do you primarily use for your daily work?
I mainly use MotionBuilder at work, and sometimes 3ds Max as well.
Coming from Maya originally and now using Max and MotionBuilder, what tool(s) do you use for your personal animation workflow?
I prefer to use Maya for my personal work. I think it’s well designed for animators, and I’ve been using it for a long time now.
You had mentioned earlier that motion capture was used for every song and routine. How did the animation team manage to keep track of, and work with the high volume of 650+ animated cycles/clips?
Dance Central required a huge amount of animation work at very high quality. I believe it was the pipeline that enabled the team to manage such a high volume. The animation team worked very closely with the design team, including the choreographers. Having dance classes and choreography reviews before the mocap shoot helped the animators get to know the routines in advance. The animation team also got great support from the tech department, including improvements to our blending system and new tools for a better workflow.
How different/challenging was it to work with dance-moves mocap in Dance Central compared to, say, the mocap data used in Rock Band?
In Dance Central, the animation itself was the gameplay, and the character was always shown fully on the screen. The characters also handled a broader range of motion compared to Rock Band. The animation team had to make sure every dance move was clear and accurate for the players. The quality of animation was very important. To ensure we were capturing the best raw mocap data, extra markers were added to the dancer, and the animators were stricter about the clean-up process.
Can you describe how the dance routines were created?
All the Hard routines were created first, then the Easy and Medium routines were generated as a subset of the dance moves from Hard. We came up with a solution that allowed us to quickly build the Easy & Medium difficulties using the existing mocap data from the Hard routine. We captured Hard routines and the transition clips for every song. The blending system within the game engine was improved and worked amazingly well but in some cases we still needed transition clips from one move to another for better accuracy. Additionally, sometimes manual tweaks were needed like hand animation or stitching clips together for polishing and bug fixing. We needed very precise consistency throughout the routines.
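The approach described above, reusing Hard-routine clips and falling back to transition clips where the engine's blend isn't enough, can be sketched roughly as follows. The move names, the "transition_A_to_B" naming, and the rule that a transition clip is needed whenever the easier routine skips over a move are all assumptions for illustration, not Harmonix's actual logic.

```python
# Illustrative sketch: assemble an Easy/Medium routine from Hard-routine
# mocap clips, inserting a transition clip wherever the easier routine
# jumps over a skipped move. Names and the transition rule are assumptions.

def build_routine(hard_moves, keep):
    """hard_moves -- ordered list of move names captured for the Hard routine
    keep       -- set of moves retained at the easier difficulty
    Returns a playback list that reuses the Hard clips."""
    kept = [m for m in hard_moves if m in keep]
    playlist = []
    prev = None
    for cur in kept:
        if prev is not None:
            # If cur did not directly follow prev in the Hard routine, the
            # in-engine blend may not suffice; schedule an explicit
            # captured transition clip between the two moves.
            if hard_moves.index(cur) != hard_moves.index(prev) + 1:
                playlist.append(f"transition_{prev}_to_{cur}")
        playlist.append(cur)
        prev = cur
    return playlist

hard = ["step_touch", "body_roll", "arm_wave", "freeze_pose"]
print(build_routine(hard, {"step_touch", "arm_wave", "freeze_pose"}))
# -> ['step_touch', 'transition_step_touch_to_arm_wave', 'arm_wave', 'freeze_pose']
```

The payoff of this structure is that one mocap shoot per song covers all three difficulties, with only the transition clips captured as extras.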
The cloth deformations on the characters are very smooth and collision-free :-) Can you tell us how you addressed this for each character and all of the available customizable outfits?
This is something Chris Hartelius, our lead tech artist can answer better! When asked, he said:
“Each character’s outfit was designed to have as much secondary movement from its clothing and accessories as we could include. We used a bone-chain physics system for all the cloth, and these chains could be added to any character’s skeleton wherever we needed. The bones would be aligned down the length of the object (e.g., Emilia’s hand wraps or Mo’s necklace) and then set up to collide with collision shapes placed within the character’s body. We also blended in a normal map that emphasized the wrinkles in the clothes as the character moves. This effect, combined with the secondary motion, helped us achieve some pretty nice looking cloth deformation.”
Same for hair deformations, smooth and subtle movements -- what setup did you use to animate the different hair types?
This also falls within the realm of Chris Hartelius. He says:
“We built the hairstyles to have a bottom base layer that would remain relatively static, and then have additional pieces growing out from that base which could move around, using the same physics system we had for the clothes. In the case of Aubrey’s hair, almost every curl was skinned to its own bone chain. We did this so that the top curls could have the most bounce, and the curls that made up the general mass of hair could be heavier, creating a nice layered effect. Angel’s spiky hairstyle was treated in a similar way, but the hair’s physics properties were stiffer and much more subtle in movement. We found the slightest bit of secondary motion on the hair or clothing really creates a great aesthetic for the characters and game.”
Lastly, were there any in-house scripts/plugins that were developed for the production of Dance Central?
We did have some in-house scripts and tools developed just for Dance Central. Due to the high volume of clips, a lot of tasks were scripted, from something as small as renaming clips to something as big as automating clip playback. We had a script that created export nodes based on the take names in MotionBuilder, saving the animators the time of creating each export node for every clip. Another script sets the bpm of each clip when we export, so the animators don’t have to set it manually. Our transition clips get played automatically between two different dance moves, so animators don’t have to author them individually. I can’t list them all, but we rely on tool support whenever there’s a way to make the process more efficient.
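The real scripts described above would run inside MotionBuilder via its Python SDK (pyfbsdk); as a standalone illustration of the batch pattern, the sketch below derives an export-node name and a bpm tag from each take's name. The "song_bpm_move" naming convention and all identifiers here are assumptions for the example, not Harmonix's actual scheme.

```python
# Standalone sketch of the batch-export pattern: derive per-clip export
# settings from take names instead of entering them by hand. The naming
# convention "song_bpm_move" is an assumption for this example.

def export_metadata(take_names):
    """Map each take name to the export settings a batch script would apply."""
    exports = {}
    for take in take_names:
        song, bpm, move = take.split("_", 2)  # e.g. "poison_130_stepTouch"
        exports[take] = {
            "export_node": f"{song}/{move}",  # one export node per clip
            "bpm": int(bpm),                  # set once by script, not per animator
        }
    return exports

takes = ["poison_130_stepTouch", "poison_130_bodyRoll"]
print(export_metadata(takes)["poison_130_stepTouch"])
# -> {'export_node': 'poison/stepTouch', 'bpm': 130}
```

With 650+ clips in the game, even a tiny per-clip saving like this compounds into hours of reclaimed animator time, which is the point Kim makes about scripting everything repetitive.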
Riseon, thanks for taking the time out to speak with us!