I swear, my current employers really need to catch the hell up. These assholes pulled me out of the limbo of sleep, just a little bit ago, only to tell me what I already knew.
Evidently, one of our clients has tested positive for this "covid" shit, and these boneheads are dumb enough to think this is news to me. Then they proceed to ask me if I've experienced any of the signs or symptoms of it, which is kind of moot, considering I found out when all this started that I have a natural resistance and immunity to it (go figure).
But what's even more annoying is that I had to make most of the people I work with aware that even if I were willing to get vaccinated for it, it would only cause problems, because my immunity would turn it into something even worse.
But evidently now I get to wear what I'd say almost borders on a hazmat suit, all because of their paranoia... and people wonder why those like myself stay in the shadows of society. Don't get me wrong. I understand the whole "health and wellness" thing. But one would think that when people are told or shown something from the voice or hands of experience, they would catch on at some point. I can already tell this is going to be a pretty long winter for me. I'm starting to think I should just say "fuck it" and find some kind of online work instead. More and more, the idea is beginning to sound pretty damn nice.