Hey ladies! I'm wondering if / how / when you plan on telling your employers that you are pregnant.
I just started a new job in March. I am an RN, and I work at a pediatric medical daycare. As far as health concerns go, it is fairly unlikely I will be exposed to anything hazardous for pregnant women. I do a decent amount of lifting kids, but most of them are under 30 pounds, and that's not much different than lifting my almost-40-pound kids at home. So I'm not super concerned about that, but I don't know what's really expected and typical. I love this job, and I am so happy here. I don't really have any mean bosses or coworkers; everyone is great. I'm just nervous.
I hope I don't sound odd for asking; it seems like easy peasy stuff, but I'm a bit of an awkward person when it comes to formal conversations or confrontations like this. Any advice or experiences would be appreciated!