I love this show, but it's at its best when it's dark humor. When they focus on the drama, it's just freakin' dark. Darker than Dexter, which is always tongue-in-cheek and never actually challenges one's ethos. Breaking Bad ... they keep making it harder and harder to find someone to root for. For a time they even made RJ hard to root for, and it takes a lot of work to make a kid with cerebral palsy hard to root for. When Walter White killed those two gang bangers -- who killed a boy whom they had recruited to kill a rival drug dealer -- I thought the show was *this close* to going off the rails. They brought it back by focusing on the faux-Daddy relationship WW has with Jesse, but that was a real close call, IMO. I guess they would argue it was something that had to be done to move WW's character forward, but it really muddied the water. Not only was it hard to accept that a (former) science teacher could end up there, even after watching his transformation, but it made sympathy for any of the "good guys" almost impossible. I thought they did a better job with Jesse's struggle with taking a life than they did with WW, who seemed (and this is probably on purpose) to have lost his conscience.
That said, I don't think the show's ever off. There's never an episode that looks like the writers or actors took a day off, and the casting has been spot on from the start. I've started watching this again after taking a long break (because it was really depressing me after a while, with so much emotional investment required in WW), but I've found the new direction appealing again. Or maybe it's just my perspective. When you look at this as a show about how good people become bad people -- and how you simply can't be a good person and live in that world -- it's not only acceptable, it's compelling. I can't think of another show that built its whole premise around characters we probably all know and took them to such a dark place.