The Black Eyed Peas Experience
"The Black Eyed Peas Experience" is a new dance game featuring the six-time Grammy Award-winning hip-hop group The Black Eyed Peas (BEP). Even though the game was released only in North America and Europe, it quickly grabbed the attention of hip-hop fans and dance game enthusiasts in Japan. It is no surprise the game attracted so much attention, with a soundtrack comprising 28 BEP smash hits alongside the group's signature fashion, sound, body language, and, of course, signature dance moves. The development team worked hard to create a fun and extremely diverse dance experience in order to win the support of dance fans. The most striking aspect of The BEP Experience is its innovative avatar system, driven by Kinect motion control on Xbox 360, which creates a highly immersive experience coupled with exceptional graphics. We interviewed the game's development staff, including Akinori Ozawa.
Leveraging the new Kinect for a brand new gaming experience
――Please describe The Black Eyed Peas Experience as a game.
Ozawa ● This is a dance game featuring the globally popular hip-hop group The Black Eyed Peas front and center. In the game, players control their own custom avatars and learn dance moves for 28 chart-topping Black Eyed Peas tracks directly from the BEP group members. Dance battles with the BEP members and other characters take place on 4 unique, upgradable stages. In addition to the career mode, where players learn the various dance routines and improve their skills, the game also features a party mode where all songs are available from the start.
――How important was the inclusion of BEP to creating the unique experience for this game? Is BEP the most critical aspect of the game?
Ozawa ● Definitely. There wouldn't be a game without the BEP. So we put our main effort toward accurately capturing and conveying the full allure of the BEP, on top of building an exciting dance game. Although we had a tight schedule and would have liked more time to develop the game, the BEP members were extremely cooperative. They took time out of their busy schedules to work with us on motion capturing their movements as well as recording their voices. We were also fortunate to have Fatima Robinson, the official BEP choreographer, participate in creating the game's choreography. Once we completed the game and showed it to the BEPs at their homes, they were thrilled with the end result and were very vocal in their support of the game, especially on Twitter. They were especially excited by the high quality of the graphics and the realism of the game world.
――Would you say Kinect interactivity is an important aspect of this game?
Ayel ● Yes, the "avatar system" using Kinect is one of the major selling points of this dance game. Simply put, the avatar system detects the player's own dance motions and mirrors those moves onto a user-customized avatar within the game, so the player participates in the game through their in-game avatar. Since users dance with their idols in the BEP through their own avatars, this brings the level of immersion much higher. Moreover, in dance games, learning the steps of a routine can be a barrier for some players. In this game, each song consists of 9 choreographed dance moves, divided into 3 blocks of 3 moves each. Players can follow the game's comprehensive guidance and learn these routines at their own pace while still having fun. This allows anyone to easily learn the dance routines in the game.
――You mentioned that the development timeline was very short. Tell us more about it.
Ozawa ● In fact, this game originally started in 2009 as a project to utilize the then newly announced Kinect device. As a company specializing in music games, we immediately came up with the idea of a dance game using Kinect. At the beginning, we didn't have the idea of featuring the BEP exclusively. The initial plan was to make a dance game the user could play with a variety of songs from multiple artists. As we continued discussions and started developing the game, the opportunity to work with Ubisoft opened up to us. It was around March or April 2011 when the decision was made to feature the Black Eyed Peas as part of Ubisoft's "Experience" series. Considering the development was completed in September, the actual development period was only half a year. That was because both we and the publisher wanted the game completed and available for sale by the holiday season. Although we had already developed the core of a dance game for Kinect, for a game of this caliber and quality we would have liked at least a year and a half, hopefully two years. But we had no choice, since finishing in September was an absolute requirement. So the question became how to do this in such a short time. We thought hard and put in everything we had to get the game done. Looking back on it now, it was pretty scary.
――What was the biggest challenge in the development?
Ozawa ● We were developing a game featuring the BEP, who are all major, real artists. Our task was to fully capture the essence of the BEP members, convey it properly to the player, and deliver a great dance game. We aimed to bring out who the BEP members truly are in every aspect of the game: character modeling, background images, animation, user interface, songs, and dance styles. Another big challenge we had in parallel was improving our productivity. As I mentioned, we had to finish the game within an extremely short period while maintaining the highest quality. How could we manage that? We thoroughly pursued the solution to this dilemma to the very end.
How leveraging middleware enabled development in an ultra-short time period
――Did middleware play a crucial role in attaining high quality while maintaining productivity?
Ozawa ● Yes, indeed. For this project, we used 3ds Max and MotionBuilder as our main tools. In addition to those standard tools, we brought in middleware such as Autodesk HumanIK and Autodesk Scaleform, and utilized them to achieve huge results. HumanIK was especially helpful in resolving critical issues in the avatar system, which was one of the biggest selling points of this game. To tell you the truth, if we hadn't used HumanIK, we might not have been able to implement the avatar system as we did. I honestly think it helped us tremendously.
――How did you take advantage of HumanIK?
Ayel ● As I described previously, the avatar system detects the dance motions of the player in front of the screen through Kinect and, in real time, transfers the motion data onto the player's avatar within the game. When we actually put the process into practice, we found that due to the limitations of Kinect, it was difficult to recognize players' dance motions with a high level of accuracy.
Ozawa ● Especially when detecting full-body motion, it would cause so-called "foot-sliding," and the whole motion would appear messy on the screen. But since we intended to make the avatar system one of the game's selling points, we wanted to apply the player's motion to the avatar properly. Even after a long process of trial and error, we could not achieve an acceptable result. Eventually, we compromised by giving up on gathering motion data from the entire body. Since gathering a player's lower-body motion data properly was extremely difficult, we decided to focus on gathering data from the upper body, especially the arm motions. We concentrated on creating synchronization between the player and the avatar by focusing on the motion of the player's upper body and hands. After that, the focus was on making the dance motion as clear as possible.
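The approach Ozawa describes can be sketched roughly as follows. This is an illustrative sketch, not iNiS's actual code: the joint names and the pose format (a joint-to-rotation mapping) are assumptions made for the example.

```python
# Sketch: apply only the player's tracked upper-body joints to the avatar,
# keeping the authored animation for the lower body. This avoids the
# "foot-sliding" artifacts described above, since the legs never come
# from noisy Kinect data. Joint names here are hypothetical.

UPPER_BODY = {"spine", "neck", "head",
              "l_shoulder", "l_elbow", "l_wrist",
              "r_shoulder", "r_elbow", "r_wrist"}

def retarget_upper_body(tracked_pose, authored_pose):
    """Merge tracked (Kinect) upper-body joints into the authored pose."""
    result = dict(authored_pose)
    for joint, rotation in tracked_pose.items():
        if joint in UPPER_BODY:
            result[joint] = rotation
    return result
```

The key design point is that the authored pose is the baseline and tracked data only overrides a whitelist of joints, so any joint Kinect fails to track simply falls back to the animation.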
――Did it work?
Ayel ● It did, but we still had some problems to deal with. There was a limit to how well we could gather arm-movement data via Kinect. When we applied the gathered Kinect data directly onto an avatar, for example, the arms would bend at impossible angles or in impossible directions. We had to limit the avatar's motion to ranges that are reasonable and acceptable for a human. We initially tried to solve the problem through the Unreal Engine and, after taking that as far as we could, decided to consider alternative systems and found HumanIK. We tried using HumanIK as a limit-control system and had great results, so we instantly decided to adopt it.
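Conceptually, the limit control Ayel describes amounts to clamping each joint angle into a humanly possible range before applying it to the avatar. The sketch below is only in the spirit of what the team used HumanIK for; the angle range is illustrative, not anatomical data or HumanIK's API.

```python
# Conceptual sketch of joint-limit enforcement (not HumanIK itself).
# A noisy Kinect reading can report an elbow bent "backwards"; clamping
# it into a plausible flexion range keeps the avatar's arm believable.

ELBOW_FLEX_RANGE = (0.0, 150.0)  # degrees; illustrative limits

def clamp_angle(angle, limits):
    """Clamp an angle (degrees) into the inclusive range given by limits."""
    lo, hi = limits
    return max(lo, min(hi, angle))

def sanitize_elbow(raw_flex_degrees):
    """Constrain a raw Kinect elbow reading to a humanly possible range."""
    return clamp_angle(raw_flex_degrees, ELBOW_FLEX_RANGE)
```

A real solver does much more (per-axis limits, full-body consistency, smoothing over time), which is presumably why a dedicated middleware was worth adopting under the schedule pressure described.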
――What was the key to choosing HumanIK?
Ozawa ● One of the biggest reasons for choosing HumanIK was that it was highly compatible with, and well integrated into, the Unreal Engine, so we could start using them together in no time. Many of the other solutions, even if they could be integrated with the Unreal Engine, were not yet optimized, and it would have been difficult for us to use them under a tight schedule. HumanIK, on the other hand, was highly optimized for Unreal, so we could adopt it quite smoothly. Furthermore, the fact that we could use almost the same configuration as MotionBuilder was really attractive; someone who knows MotionBuilder could use it right away. Nonetheless, when we finally decided to implement HumanIK, it was already early July. We rushed the adjustments and were able to build it into our avatar system by August, just a month before the September deadline. Actually, Ubisoft had asked us to "cut the whole system if you cannot improve the visuals." Again, HumanIK really saved us.
――What can you tell us about Scaleform, the other middleware system you used?
Fujita ● We used Scaleform mainly for the menus and just about every part of the game that contained text. It was tremendously easy to use, so we all loved it. With Scaleform, you can assemble assets in Flash and preview the results directly. Once you teach the authoring method to the authoring staff, they can do it themselves. This let us split the work into completely separate pipelines, with script writing and asset building proceeding in parallel, which contributed significantly to our productivity.
――The menus for this title were extremely detailed.
Fujita ● We have always been a company that is very particular about menus. For this title, we put a lot of energy into the menus as usual. It's very easy to make menus in 2D, but placing 2D menus in 3D space, and on top of that displaying text on textures or handling many more complicated patterns, is more difficult. As we looked into the capabilities of Scaleform, we found it could handle those processes easily, which allowed us to produce them smoothly in the later part of development. The most difficult task was creating the small white silhouette icon that appears at the top of the game screen, showing the user the next dance step. It looks like a 2D icon, but getting it to dance in 3D was very challenging and required extremely complicated processes. We also put a lot of effort into the changing colors of the lyrics in the karaoke function. The color of the lyrics changes when the player sings correctly. It was not possible to depict the effect properly with Flash alone. Here, we used the Material function of the Unreal Engine in conjunction with the 2D drawing function in Scaleform to create the effect we wanted.
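The lyric-highlight effect Fujita describes can be modeled at its simplest as splitting each lyric line into a "sung" prefix and an "unsung" suffix based on the player's progress. This is a hedged sketch of just that split; the actual effect combined an Unreal material with Scaleform 2D drawing, which this does not reproduce.

```python
# Sketch of the karaoke highlight split: given the fraction of a lyric
# line the player has sung correctly, return the highlighted prefix and
# the not-yet-sung suffix. The rendering itself (recoloring via an Unreal
# material driving a Scaleform draw) is outside this model.

def split_lyric(line, progress):
    """progress in [0, 1]: fraction of the line sung correctly so far."""
    progress = max(0.0, min(1.0, progress))
    cut = round(len(line) * progress)
    return line[:cut], line[cut:]
```

In practice the cut point would come from note timing rather than a raw character fraction, but the prefix/suffix structure is the same.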
――Please tell us how you came about introducing Scaleform into your project
Fujita ● We were looking for alternative UI solutions that would allow us to create a very particular artistic style. After researching middleware options for some time, we discovered Scaleform. For this project especially, there was a wide variety of needs for the menus, and it would have been pretty hard for us to implement everything without Scaleform. Although it required a little time to master, once learned, it sped up our work significantly. We were able to rapidly update the parts that had already been completed before we introduced Scaleform, and we managed to finish everything on time.
Creating BEP members' ideal selves
――Tell us about the main attraction of this game: expressing the characteristics of The Black Eyed Peas.
Nakane ● Well, let me walk you through all the steps we took. First, character modeling. We created a model for each BEP member unique to each of the 4 stages, for a total of 16 character models. Each stage also has 3 backup dancers, so we made 12 models for the non-BEP dancers. That gives a total of 28 models altogether, at about 10,000 polygons each. In principle, we used the DirectX Shader in 3ds Max, and since the normal map was the most important element here, we checked the normal maps before applying them to the actual game. As for textures, we of course used much higher resolution textures for the BEP members than for the other dancers. We prepared one texture for the body, one for the face, and an additional texture for Fergie's hair. We used the same faces across all 4 stages but different bodies for each stage, because the costumes differ. We kept time loss to a minimum by replacing only what we really needed to replace.
――Did BEP members check their own models to provide feedback?
Nakane ● Yes, of course. The BEPs have always been artists with a keen interest in their visual presentation. Actually, they pointed out problems frequently and asked for retakes many times. From the very beginning, we gathered as much reference material on the BEP members as possible, and instead of recreating everything exactly, we intentionally made some adjustments. Even then, the BEP members had extremely high expectations that we had to meet.
Ayel ● In fact, Fergie said since we are making 3D digital models, instead of just recreating herself exactly, she would like to take the opportunity to provide the players with a Fergie that is even prettier than her real self.
Nakane ● Speaking of which, for one of Fergie's costumes, we designed a skirt just like the one she wore in a video we used for reference. After their designer gave us approval, Fergie asked us to change it to a pair of pants, and we had to redo the model in a big rush. All told, even though we had to remake many models right up until a month before the deadline, in many ways these models may have turned out to be the ideal Black Eyed Peas.
――I understand the backgrounds also played a part in portraying the characteristics of the BEP.
Nakamura ● Indeed. Naturally, almost all of the backgrounds in this game are stage scenes. Continually moving lights and changing colors make up the main parts of the scenes and help portray the characteristics of the BEP, so we put most of our effort into the presentation of lights and colors. As for specs, we created each map with 300,000 to 400,000 polygons and texture sizes of around 20 MB (compressed). In fact, we had trouble with some processes, like dynamically changing colors. Realizing that kind of presentation without real-time shadows was the biggest challenge in this project.
――How did you work around it?
Nakamura ● We relied on the baking feature of 3ds Max. To be specific, we baked the lit state into selected meshes with the Render to Texture feature of 3ds Max. By applying those textures, we created pseudo-shadows as if 3ds Max had calculated and projected them. Likewise, for the areas hit by the lights, we rendered and applied the lit state with 3ds Max. For the bright areas, we could control the colors and lights in Unreal. We set up the colors to change in accordance with the music, and we synced the lights to flash with the beats. For the effect of lights sweeping over the audience, we used the ability of 3ds Max to handle multiple UV channels per model. For example, in the first UV channel we placed a normal, visible diffuse texture. In the second UV channel we placed non-visible UVs with no overlaps; this second channel later has shadows baked onto it in Unreal. Finally, we added a different set of UVs in the third channel that scrolls in Unreal, creating the appearance of lights dynamically sweeping across the audience. There are many Xbox 360 games with awesome graphics, but I think no other game utilizes lights to this extent.
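The scrolling third UV channel can be illustrated with a tiny sketch. In Unreal this kind of effect is typically a panning texture in a material; the code below only models the per-frame UV offset with wrap-around, and the scroll speeds are made-up values.

```python
# Minimal model of a scrolling UV channel used to fake moving audience
# lights: offset the UVs by a constant speed over time and wrap them
# back into [0, 1) so the light texture tiles endlessly.

def scrolled_uv(u, v, scroll_u, scroll_v, time_sec):
    """Return UVs offset by (scroll_u, scroll_v) per second, wrapped to [0, 1)."""
    return ((u + scroll_u * time_sec) % 1.0,
            (v + scroll_v * time_sec) % 1.0)
```

Because the scroll lives in its own UV channel, the visible diffuse texture in the first channel stays fixed while only the projected light pattern moves.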
――How about animation?
Ayel ● There were three big points in animation. The first, of course, was the dancing. The second was the clothes and accessories worn by the avatars. Because avatars can wear clothes and accessories in various combinations, we came up with a system that allowed the clothes, or various parts of the costumes, to sway believably as avatars danced, no matter what they wore. The third was the avatar system, which had to properly feed the player's dance motion back onto the avatar. Since this is a dance game, the most important thing above all else is to showcase the dances in the most enjoyable and cleanest way.
――Could you please tell us about the development process for animating the dances?
Ayel ● We followed the ordinary flow of development: we choreographed each song, had the dancers practice, motion captured the dancers, and cleaned up the captured motion data. Although it was pretty difficult to complete the dance animations for all 28 routines in such a short time, we still put a lot of focus on the details. For instance, we could hardly expect Japanese dancers to have the same groove as Western dancers, so we did the motion capture in the U.S., working with a choreographer and dancers over there. And of course, since male dancers perform differently from female dancers, we captured the male and female dancers' data separately. In total, we prepared 80 motions for each of the 28 dance routines as well as the downloadable content. Thanks to the fact that we happened to use a motion capture studio with experience working on the movie "Avatar," we could gather highly accurate motion capture data. Because the data was so accurate, we didn't need to do as much noise reduction, which made the post-production process much easier.
――In this game, each song consists of 9 dance steps, which are divided into groups of 3, so the player can learn the steps in manageable, small parts.
Ayel ● One of the reasons we divided each dance routine into smaller groups was to simplify how players learn the dances. Another reason was that the game's evaluation system scores the player by detecting each dance move individually via Kinect in each phase, so smaller groups were needed to support that system. Nonetheless, we didn't just blend two dances together, because blending was likely to create messy animation such as sliding. Instead, we motion captured the transitions between dances and carefully linked the motions together using MotionBuilder, which resulted in very fluid transitions.
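The transition approach Ayel describes can be sketched as a lookup of captured transition clips rather than a procedural crossfade. This is an illustrative sketch only; the clip names and data structure are invented for the example, and the real pipeline assembled clips in MotionBuilder.

```python
# Sketch: instead of blending move A into move B (which can cause foot
# sliding), insert a separately motion-captured transition clip keyed by
# the pair of moves. Clip names below are hypothetical.

TRANSITIONS = {
    ("move_a", "move_b"): "trans_a_to_b.anim",
    ("move_b", "move_c"): "trans_b_to_c.anim",
}

def build_playlist(moves):
    """Interleave a sequence of dance moves with their transition clips."""
    playlist = []
    for prev, nxt in zip(moves, moves[1:]):
        playlist.append(prev)
        clip = TRANSITIONS.get((prev, nxt))
        if clip:  # fall through with no transition clip if none was captured
            playlist.append(clip)
    playlist.append(moves[-1])
    return playlist
```

The trade-off is asset volume: every adjacent pair of moves needs its own captured transition, which fits the "80 motions per routine" figure mentioned above far better than a blend-based approach would.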
――In all aspects, this is a highly polished game, despite the extremely short development time.
Ozawa ● Thank you. We honestly believe that if only we had more time, we could have done far more. However, as far as the content of the game goes, we did a damn good job, if we do say so ourselves. We hope not only BEP fans but many other game enthusiasts will enjoy playing this game.
The Black Eyed Peas Experience
Available on: Xbox 360
Genre: Dance game
Release: November 2011 (North America, Europe)
Published by: Ubisoft Entertainment.
Developed by: iNiS Corporation
(C) 2011 Ubisoft Entertainment. All rights reserved.