Tuesday, November 14, 2023

Do any American schools teach grammar anymore?

I've seen a lot of discussion about how many US schools don't teach phonics anymore, but I don't really remember being formally taught grammar either. At some point in elementary school I must've learned basic concepts like the difference between nouns, verbs, adjectives, etc., but not much beyond that.

At the middle school I went to, we even had two separate English classes with two separate teachers every day, but both were literature-based: one focused more on books, and the other focused more on short stories and poems, with an occasional nonfiction unit.

My high school English classes were also mostly literature-based, and the only time the topic of grammar came up was when we discussed an author's grammatical choices in the context of a story.

As a result, I feel like I can usually tell whether or not a sentence is grammatically correct, but I can't really explain why. It's also made it hard for me to learn other languages, because the textbooks will talk about things like interrogatives, and I don't have much of a reference point to understand them.

Is there any particular philosophy or policy that discourages the teaching of grammar? Is this a widespread thing, or did I just go to really weird schools?

Edit to add: Despite not teaching grammar, I'm pretty sure my elementary school actually did do phonics, because I remember having a phonics workbook with a picture of plaid pigs on the cover.



Submitted November 14, 2023 at 02:54PM by manicpixidreamgirl04 https://ift.tt/qBCYQrl
