If you haven't noticed, then you probably live under a rock somewhere in the middle of a desert, skinning rats for dinner. Yes, I know vampire novels have been around forever (or at least for three centuries), but what is the deal with vampires these days? From The Twilight Saga movies to The Vampire Diaries series to all sorts of vampire books flooding the shelves. What is up with that? Why does the idea of creatures that live by sucking the blood of other creatures (be they human or not) suddenly appeal to so many people?
This just goes to show how degenerate the world has become. Either that, or people are just getting sick of reality TV.