Something's been bothering me: has our society made us depressed? I mean depression in general. How long have we even had antidepressants? According to a quiz I took online, I have depression. I've never seen a doctor about it, and the internet isn't that accurate with this stuff (trust me, I'm not stupid enough to believe it), but I'm doing just fine without antidepressants. I know they're supposed to help people, but a friend told me she had to take them and felt like shit afterwards. Please forgive me for not knowing much about the subject, but I felt the need to post this.
~Wishing you well.
P.S. This was written a year ago, and I still don't know much. I'm not saying that meds are bad. If you have them, take them, for Christ's sake. I'm just wondering. Also, I shouldn't blurt things out without researching them, and I'm sorry.