Author and fellow Hoosier Jessamyn West once said, "Fiction reveals truth that reality obscures." I believe this is true of both the reading and the writing of fiction, and I have long held that the best literature, by revealing these truths, helps us become better human beings.
I grew up in a small Midwestern town on the edge of farmland. We had no black families in our town and no black students in our decently sized school. Looking back, I realize that we probably had a number of religions represented in our school population, but it wasn't something we talked about much. I just assumed that all my friends were some shade of Christian.
The point is that my upbringing lacked diversity. Racism wasn't something I had any personal experience with. I learned about racism from reading books like The Color Purple, Huckleberry Finn, Invisible Man, and To Kill a Mockingbird.
I wouldn't go so far as to say that, were it not for fiction like this, I could have become some intolerant right-wing white supremacist; I had better parents than that. But without fiction, my understanding of racism, to say nothing of classism, feminism, reproductive rights, gay rights, and a plethora of other issues, would have been purely academic.
Good fiction doesn't just talk about issues and point out right and wrong. Good fiction forces you to experience those issues. When you read fiction, you're more than an onlooker; you are in it.
I admit this isn't the same as actually living through discrimination or oppression. I will never know the extra fear or worry that a black man might feel when he's pulled over by the police in a predominantly white neighborhood, for example. But I do understand it better than I otherwise would have.
Thanks to fiction.
What life lessons have you learned from literature?
Original image by Tom Thai (CC 2.0).