Been looking for a career change for the past year; I currently do environmental education and conservation corps work. My first job out of college in 2020 was at a pharma company that produced so much shipping waste it made me sick. A lot of the options I'm seeing now are consulting work that would have me sizing up natural areas to be developed, depressing EHS jobs at companies doing terrible things to the planet, or sales jobs for "green" companies. Just curious if any of you work for companies that actually make a positive impact on the world. Currently losing hope in my future prospects.