Thursday, September 19, 2019

Are there any universities that offer courses specifically on company/business culture at the undergraduate or master's level?

I majored in English but took a few business classes when I was a Business Administration major. I don't remember seeing any courses about company/business culture. I'm very interested in learning how it's created and maintained. I'm considering further education but have no idea where to start. HR? Anthropology? Buried somewhere back in business administration? Any suggestions?



Submitted September 19, 2019 at 10:47AM by wartatoe https://ift.tt/34VAKSb
