The C.R.A.A.P. Test was developed at California State University, Chico, as a way to evaluate information you encounter. The various criteria may be more or less important depending on your reliability needs.
Currency: Is the information up to date? When was the website last updated? Can you use historical information? Scientific research should be very current; if you're looking for historical context, those sources can and should be older.
Relevance: Does the information relate to your topic? Who is the audience? Is it written at an appropriate level for your needs? You don't need a study from a peer-reviewed economics journal as a source for the paper you are writing for your history class about the stock market crash of 1929 (but you probably do if you're writing for an economics class, and that is your major).
Authority: Who wrote or published the information? What are their credentials? Is there contact or submission information? If a website is calling for writers without any other criteria, it's probably not very authoritative. Just because someone is a doctor doesn't mean they are the right person for a certain topic: would you expect your dentist to operate on your brain tumor?
Accuracy: Are there citations or evidence given? Has the information been edited or reviewed? Can it be verified? This coincides with the reliability spectrum: sources on the unreliable end have no review process, those in the middle have at least been edited, and those on the reliable end have been reviewed by peers.
Purpose: Why was this information created or shared? Is it objective or biased? What agendas are at play? Some information is meant to inform (think news), some to persuade (think op-eds), and some to sell or proselytize (think advertising).
Here are a few articles about how to examine different resources.