Contributor Q&A: Cara Santa Maria becomes completely digital
This week on "TechKnow," contributor Cara Santa Maria visited the University of Southern California's Institute for Creative Technologies. She learned how the ICT uses its light stage to capture people's likenesses and then uses that data to digitally render and manipulate 3-D versions of their faces. The technique has become popular in filmmaking, helping to create movies like "Gravity" and "Benjamin Button."
After having her own image captured, Santa Maria talked about her experience.
TechKnow: What kind of direction were you given for each of the facial expressions you made? How difficult was posing?
Cara Santa Maria: I'm really good at sitting still. It's a weird gift I have. I used to pose for photos for a photographer friend of mine in college. He would rehab vintage large-format cameras (you know, the kind with the bellows) and use me as a test subject. Some of them required that I hold perfectly still for 10 seconds!
We noticed you didn't have your lip ring in. Was that your choice or by request?
Paul Debevec [chief visual officer for the Institute for Creative Technologies] was concerned that the metal in my lip ring, being so reflective, would distort some of the scans. It's generally best to keep one's hair pulled back tight and have a clean face, so that the geometry of the face and texture maps of the skin are as accurate as possible.
What happens to this footage of you now? How can people control their likeness in this new way?
I actually gave permission to USC to use my scans for educational purposes. I like to know that my likeness is being used to teach people about this new technology! But in general, actors will retain rights to their own likeness based on agreed-upon contracts with the studios.
What ethical questions or concerns do you have about this technology? What uses would you be excited to see it applied to that we don't explore in the piece?
I think the process is fascinating, and I don't see any ethical risks of creating digital characters — at least not where the technology lies currently. I'm excited to see it move forward into a real-time rendering application. This technology is being developed in a partnership between Paul's lab and Activision, through a project called Digital Ira. Eventually we may see photo-real characters in video games. How cool is that?
Watch "TechKnow," Saturdays at 7PM ET/4PM PT on Al Jazeera America.