My just-graduated son is moving to the South. Atlanta, no less.
I want to get him a couple of books to help him understand Southern culture. Maybe one fiction and one non-fiction. Something that includes a little humor would be good.
Let's have your recommendations.
He's never been to the South (other than the Florida Keys, which don't count), much less lived there.