Christianity in America

The Pew Forum recently released a study about Christianity in America, which you can read in detail here. What I found interesting was this: CBS gleaned from the report that up to 70% of Americans don't believe their religion is the only way to eternal life, even if their faith tradition teaches otherwise. What are your thoughts?