Researchers Synthesize Real-Time Fracture Sounds
ChippedTeapot writes "Researchers at Cornell University have devised an algorithm for synthesizing the sounds of brittle fracture simulations. Computers can now automatically generate synchronized sound, motion, and graphics for physically based fracture events, such as those in future interactive virtual environments. The results will be presented at ACM SIGGRAPH 2010 in Los Angeles, July 25-29. Check out the smashing results on YouTube."
Re:Uses? (Score:4, Informative)
Harmonic Shells [youtube.com]
Re:Uses? (Score:4, Informative)
From a first glance at the article, the most obvious use is sound effects in movies and the like.
Generating realistic sounds for rigid objects is fairly straightforward. You know the material properties. You know the size and shape from your physics engine. You run those through a relatively simple set of equations to get the resonant frequency of each shard. Add in amplitude from the impact and natural damping over time, and you're done. Oh, but you have to do it for hundreds or thousands of objects, and each of those is impacting or fragmenting several times per second.
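The naive recipe above can be sketched in a few lines: each shard rings as a damped sinusoid at its resonant frequency, scaled by impact strength, and the shards are mixed together. This is a minimal illustration of that idea, not the Cornell algorithm; the frequencies, amplitudes, and damping rates below are made-up example values, and a real engine would derive them from material and geometry.

```python
import math

def ring_shard(freq_hz, amplitude, damping, duration_s=1.0, sample_rate=44100):
    """One shard's contribution: a damped sinusoid.

    freq_hz:   resonant frequency (would come from material/shape equations)
    amplitude: initial level, scaled by how hard the impact was
    damping:   exponential decay rate in 1/s, modelling natural energy loss
    """
    n = int(duration_s * sample_rate)
    return [amplitude
            * math.exp(-damping * t / sample_rate)
            * math.sin(2 * math.pi * freq_hz * t / sample_rate)
            for t in range(n)]

# Mix several shards from one fracture event into a single buffer.
# (Hypothetical per-shard parameters for illustration only.)
mix = [0.0] * 44100
for freq, amp, damp in [(880.0, 0.5, 6.0), (1320.0, 0.3, 9.0), (2750.0, 0.2, 14.0)]:
    for i, sample in enumerate(ring_shard(freq, amp, damp)):
        mix[i] += sample
```

The point of the paper is precisely that doing this honestly, per mode and per fragment, for thousands of simultaneously fracturing objects is what blows the computational budget.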
Producing realistic sound that way isn't very impressive by itself. What is noteworthy is the optimization, generalization, and statistical modeling they used to reduce that massive amount of computation to something that can run in real time on commodity hardware, rather than needing a supercomputer. Movie and special-effects studios already have supercomputers and render farms at their disposal. This work is intended to go hand in hand with the realistic, complex physics engines that games have started getting in the past few years.