Popular media has spent the last few decades priming our culture for “alien contact”.
- Do you believe “they” are coming?
- Will “they” be friendly, or hostile?
- How will “they” manifest themselves to us?
- Is our culture ready?
I’ve said before that I believe in alien life. I believe that aliens have been on Earth, and that certain human organizations have already had covert contact with them.
I think that if an alien civilization wanted to reveal itself to the population of Earth, it would do so in a manner that is incontrovertible. Given what it takes to convince our culture that an event is real, I think that means a direct presence, or a show of force, like you’d expect in a movie (think Independence Day).