What do you think of when you hear the word "science"?
Does your level of trust vary between different types of science? Does it matter if it's a physical or social science?
I pose this question because I worry that "science" is becoming a meaningless buzzword, a mere stamp of trustworthiness, and that we're forgetting what makes actual science (the whole hypothesis-and-testing process you learn about in school) trustworthy in the first place.
I think many people apply lazy thinking to "scientific" claims, associating white lab coats, university settings, and academic titles with "science" rather than asking whether the scientific method is actually being used. And for this reason, I think we're allowing a broader and broader range of claims, beliefs, and people to wear the labels "science" and "scientist," because we're not very careful about what we let that word mean.
This forum, Gender Studies, is categorized under "social science"; you can be taught this particular collection of political beliefs as a "science" at a college. Notice that the same category also contains sociology and dream interpretation.
Do you think the term "science" is becoming meaningless, overapplied, and dissociated from the scientific method? If so, is there anything we can do about it?
Ten years ago, I would have said that if you want honest, objective economics, you should look to the banks and the hedge fund managers, because those people aren't trying to sell political points; they're just trying to make as much money as possible. But recent events have shown that even that assumption of objectivity was wrong.