I receive a lot of review books, but I have never once lied about a book just because I got a free copy. However, some authors seem to feel that if they send you a free copy of their book, you should give it a positive review.
Do you think reviewers are obligated to put up a good review of a book, even if they don’t like it? Have we come to a point where reviewers *need* to put up disclaimers to (hopefully) save themselves from being harassed by unhappy authors who get negative reviews?
I don’t receive a ton of review books, and I’m OK with that. This year I decided I would review every book I read. I subscribe to the honesty policy (as in, it’s a good policy). I try to make my reviews kind and gentle, but I must be honest.
If we’re stepping into the shoes of the journalist or professional reviewer, we must consider the fair-and-balanced creed they’re held to.