Yes. Being able to live on this planet and grow as a country is worth more than any money a company can make by harming it. Humans depend on the environment around them to survive, and if we keep destroying it, there won't be enough of it left to sustain us or the diverse wildlife that shares it with us.