The belief that human beings have only been weakened by the fall into sin and can therefore cooperate with God in their own conversion and salvation. The Bible, however, teaches that by nature we are dead in sin (Ephesians 2:1) and are saved entirely by God's grace (Ephesians 2:8-9).