Nvidia launches an artificial intelligence model that can automatically choreograph dance moves for music

Deep learning for artificial intelligence has been advancing steadily, and NVIDIA has recently been pursuing a range of research in this field. Earlier this month, Nvidia and Hackster jointly launched the AI at the Edge Challenge, in which contestants can use the NVIDIA Jetson Nano Developer Kit to develop new neural-network-based models.

In November, Nvidia also released a multimodal artificial intelligence development suite called “Jarvis,” which can fuse input from multiple sensors into a single system, and the company recently prototyped a new algorithm that lets robots pick up arbitrary objects.

Continuing its exploration of artificial intelligence, Nvidia presented a new deep-learning-based AI model at NeurIPS 2019 that can automatically choreograph dance movements for input music. The model, described as “artificial choreography,” was developed in collaboration with the University of California, Merced.

Although the task appears simple on the surface, the research team stated that accurately matching music to corresponding dance moves is genuinely difficult, because choreography must account for many factors, such as the beat count, the genre, and more. To train the generative adversarial network (GAN) at the core of the system, the researchers collected 361,000 clips of three of the most influential dance styles: ballet, Zumba, and hip-hop.
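The article does not go into the model's architecture, but the general idea of a GAN conditioned on music can be sketched roughly as follows. This is a minimal illustration only: the joint count, feature dimensions, sequence length, layer sizes, and names (Generator, Discriminator, train_step) are assumptions made for demonstration and do not reproduce NVIDIA's actual “artificial choreography” model.

```python
# Illustrative sketch of a music-conditioned GAN for dance-pose generation.
# All shapes, layer sizes, and names are assumptions for demonstration only.
import torch
import torch.nn as nn

POSE_DIM = 2 * 17   # assumed: 17 two-dimensional body joints per frame
MUSIC_DIM = 64      # assumed: per-frame audio features (beats, onsets, etc.)
NOISE_DIM = 32      # latent noise so one song can yield many dances
SEQ_LEN = 90        # assumed: about 3 seconds of motion at 30 fps


class Generator(nn.Module):
    """Maps music features plus noise to a sequence of dance poses."""
    def __init__(self):
        super().__init__()
        self.rnn = nn.GRU(MUSIC_DIM + NOISE_DIM, 256, batch_first=True)
        self.head = nn.Linear(256, POSE_DIM)

    def forward(self, music, noise):
        # music: (batch, SEQ_LEN, MUSIC_DIM), noise: (batch, SEQ_LEN, NOISE_DIM)
        h, _ = self.rnn(torch.cat([music, noise], dim=-1))
        return self.head(h)  # (batch, SEQ_LEN, POSE_DIM)


class Discriminator(nn.Module):
    """Scores whether a pose sequence plausibly matches the music."""
    def __init__(self):
        super().__init__()
        self.rnn = nn.GRU(MUSIC_DIM + POSE_DIM, 256, batch_first=True)
        self.head = nn.Linear(256, 1)

    def forward(self, music, poses):
        h, _ = self.rnn(torch.cat([music, poses], dim=-1))
        return self.head(h[:, -1])  # logit from the final time step


def train_step(gen, disc, opt_g, opt_d, music, real_poses):
    """One adversarial update with the standard non-saturating GAN loss."""
    bce = nn.BCEWithLogitsLoss()
    batch = music.size(0)
    noise = torch.randn(batch, SEQ_LEN, NOISE_DIM)

    # Discriminator: real (music, dance) pairs vs. generated ones.
    fake_poses = gen(music, noise).detach()
    d_loss = bce(disc(music, real_poses), torch.ones(batch, 1)) + \
             bce(disc(music, fake_poses), torch.zeros(batch, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator: make generated dances look real to the discriminator.
    fake_poses = gen(music, noise)
    g_loss = bce(disc(music, fake_poses), torch.ones(batch, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()
```

In this kind of setup, the discriminator sees the music and the motion together, so it penalizes dances that are individually plausible but out of sync with the beat or style of the track, which matches the difficulty the researchers describe.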

Nvidia plans to extend the model to additional dance styles, including pop and ballroom, in the future. The source code and prototype will be published on GitHub after NeurIPS ends; for now, the paper on “artificial choreography” is available to read.