I think TV shows should always teach life lessons.
I don't mean this in a boring, classroom-educational kind of way; I just think that there should be a message or life lesson in every TV show or movie.
Even though I love watching a movie or TV show when I can derive something philosophical from it, I disagree. Sometimes you want to watch something that is just... droll, or a distraction from the norm.
Also, if you tried hard enough, you could find a life lesson, no matter how subtle, in any TV show, movie, game, or book.
All TV shows already have life lessons. Take any TV show. Just look long enough.
Supernatural teaches us that if you get possessed by a demon or are born different, it's okay if your head gets chopped off.
Breaking Bad teaches us to stand up for yourself by cooking and selling meth.
Captain Planet taught us that, without "heart", you will never be able to be a true hero. Alternatively, real heroes have mullets.
The Little Mermaid teaches us that crabs are Jamaican.
Why do shitty people like you always feel the need to impose your pathetic philosophies on the rest of the planet?
Steven Universe has a great life lesson, and that's that lesbian space rocks are surprisingly hot.
Short of making your own movie or television show, you're powerless to change this.
Sounds boring. Entertainment is supposed to be entertaining. If you feel otherwise, get started on your screenplay.
It can be both. The plots on most shows make no sense at all because so many of these writers are writing deliberately unfunny stuff just so they can complain about how stupid people are when the show becomes a success. If you think this doesn't happen, you're insane.
I guess I'm insane then, because I don't think that's what writers are doing. How's the screenplay coming along?