2nding this. The "non-phonetic alphabet" is the biggest non-issue I see people raise a stink about. It really doesn't matter; context is the heavyweight backbone of language.
On top of that, I think people really underestimate how inappropriate diacritics would be for English. It has a massive phonemic inventory, with 44 unique items. Compare with Spanish's 24. English's "phonetic" writing system would have to be as complex as a romanized tonal language like Mandarin (which has to account for 46 unique glyphs once you account for 4 tones over 6 vowels + the 22 consonants). Or, you know, the absolute mess that is the romanization of Afro-Asiatic languages. El 3arabizi daiman byi5ali el siza yid7ako, el Latin bas nizaam kteebe mish la2e2 3a lugha hal2ad m3a2ade. (Roughly: Arabizi always gets a laugh, but Latin is just a writing system that isn't suited to a language this complicated.)
> The "non-phonetic alphabet" is the biggest non-issue I see people raise a stink about
Many friends of mine who aren’t native speakers, and I, have struggled to speak fluently because of it. Most of us still mispronounce some words (my friend pronounced “draught beer” like “drought”, the lack of rain, instead of like “draft”).
Doesn’t mean things should change, but it’s certainly not a “non-issue”
The bureaucratization of language is more problematic in my view: things get labeled as right and wrong, and we try to cram the beauty of natural language into a restricted box that can be cleanly and easily defined and worked with universally. I have nothing but contempt for this conception of language, that it must bend to the whims of rigidity when it's very clearly a natural, highly chaotic, dynamic system constantly evolving in unexpected ways.
How would you account for the fact that many words have no consistent pronunciation at all? For example, I would guess that 50% of English speakers are non-rhotic.
Same way other dialect continuums account for it: you standardize spelling on some variant, or several variants if that is non-viable (which, yes, does mean that e.g. American and British English spellings would diverge somewhat).
To be clear, I'm not particularly advocating for making English a phonetic language. I'm just saying that it being non-phonetic does cause issues (and makes it frustrating, though it also reflects a very interesting history).
Assuming we wanted to make English a phonetic language, your question is kind of moot: phonetic means we'd have to pick the pronunciation rules for phonemes, which would make other ways to pronounce those phonemes incorrect. Some currently-correct English would become incorrect English.
> For example, I would guess that 50% of English speakers are non-rhotic
Note that accent isn't really what people are talking about when they complain about pronunciation. The problem is that there's no consistent mapping from letters to phonemes in any English accent: laughter/slaughter, draught/drought, G(a)vin/D(a)vid...
All those examples follow the linguistic patterns of the languages they come from. They aren't arbitrary; we just aren't taught the context when we're learning as children.
Of course there are always reasons. Teaching it to children isn’t really a solution: you’d need to know where words come from before you could read them correctly, and many people don’t learn English as children anyway.
Phonetic languages borrow words from other languages too; they adapt them to their own orthography while keeping the pronunciation (the only example coming to mind right now is the Czech for sandwich, sendvič). English could do that just fine if being phonetic were a goal.
You would know where words came from based on the way they're spelt, and that would tell you how to pronounce them. It's the exact same thing people do now; we just do it without thinking.
The systems at work in English are not nonsensical, whatever people like to parrot. And to say it's not phonetic at all is just wrong on every level.
Frankly I'm fine with the historical oddities that have led to modern English. If non native speakers have issues, that's tough luck for them!
It does relate to the point that English still doesn't have a central linguistic authority (and likely never will) - just various reformers who have been more or less successful, and whose reforms have spread to varying degrees. Draught versus draft was indeed one of Noah Webster's proposed reforms; it influenced a lot of American spellings and in turn is still influencing UK spellings. It's not as obvious as color versus colour, but there is a bit of US versus UK in draft versus draught.
(Webster also went on to suggest dawter over daughter, to remove more of these vestigial augh spellings, but that one still hasn't caught on even in the US. Just as the cot/caught merger is its own weird remaining reform discussion.)
Pronunciation isn't mandated as correct or wrong; as long as you're within a certain radius, it's fine. Pronunciation has changed en masse in languages before - look at the Great Vowel Shift.
> It has a massive phonemic inventory, with 44 unique items. Compare with Spanish's 24, or German's 25.
I'm not sure where you're getting these numbers from, but German has around 45 phonemes according to all sources I could find, depending on how you count: 17 vowels (including two different schwa sounds), 3 diphthongs, 25 consonants.
If the Arabic script had to cater to the phonemes of the Afro-Asiatic dialects, it would have been even messier. I'm a speaker of one, and my dialect is heavily influenced by the indigenous Tamazight language, and I think this is why many in the Amazigh community were, and some still are, disappointed with the neo-Tifinagh script. While it carries symbolic weight, it doesn't offer the practical readability, phonemic clarity, and tech accessibility of a modern script that Tamazight deserves. The Latin script, ironically, fits Tamazight much more naturally.
You don't have to make a perfect pronunciation system. It's OK if a vowel is pronounced slightly differently, as long as its pronunciation can be predicted from context. Even if it can only be predicted 99% of the time.
Insisting that the writing system capture every little distinction is a common mistake enterprising linguists make (often when designing an alphabet for a Bible translation, or "modernizing" the spelling of a language which is not their own). They don't have to. Even if you do it, it won't last long. Letters only have to be a reasonably consistent shorthand for how things are pronounced. People don't like a ton of markers or, god forbid, digits sprinkled into their writing to specify a detailed pronunciation.
English has accumulated inconsistencies for so long, though, that it can't really be said to be consistent anymore. Usually, there are radicals who just cut through and start writing more sensibly here and there (without digits or quirky phonetic markers), cutting down on the worst excesses of inconsistency. But in English, these radicals have been soundly defeated in prestige by conservative writers.
Agreed. We don’t need IPA-level alignment between writing and pronunciation, and you could never have a single workable system that reflected all speakers anyway.
I do think we could have a “light touch” reform that cleaned up some of the more egregious cases, like “…ough” and some others that trip people up all the time.
Diacritics don't need to be used the way they are in French, i.e. to preserve the original spelling. On the contrary, most languages use them to make their spelling more phonetic.
Nor is there a need for some insane kind of diacritics to handle English. Its phonemic inventory is considerable, yes, but it can be easily organized, especially when you keep in mind that many distinct sounds are allophones (and thus don't need a separate representation) - a good example is the glottal stop for "t" in words like "cat", it really doesn't need its own character since it's predictable.
Let's take General American as an example. First you have the consonant phonemes:
Nasals: m, n, ŋ
Plosives: p, b, t, d, k, g
Affricates: t͡ʃ, d͡ʒ
Fricatives: f, v, θ, ð, s, z, ʃ, ʒ, h
Approximants: l, r, j, w
Right away we can see that most are actually covered by the basic Latin alphabet. Affricates can be reasonably represented as plosive-fricative pairs, since English doesn't have a contrast between tʃ/t͡ʃ or between dʒ/d͡ʒ; then we can repurpose Jj for ʒ. For ŋ, one can adopt a phonemic analysis which treats it as an allophone of the word-final sequence ng (with g deleted in this context) and as an allophone of n before velars.
Thus, distinct characters are only strictly needed for θ,ð,ʃ, and perhaps ʒ. All of these except for θ actually exist as extended Latin characters in their own right, with proper upper/lowercase pairs, so we could just use them as such: Ðð Ʃʃ Ʒʒ. And for θ there's the historical English thorn: Þþ. The same goes for Ŋŋ if we decide that we do want a distinct letter for it.
If we want to hew closer to the basic Latin look, we could use diacritics. Caron is the obvious candidate for Šš=ʃ and Žž=ʒ, and we could use e.g. a crossbar for the other two: Đđ=ð and Ŧŧ=θ. If we're doing that, we might also take Čč for t͡ʃ. And if we really want a distinct letter for ŋ, we could use Ňň.
You can also consider which basic Latin letters are redundant in English when using phonemic spelling. These would be c (can always be replaced with k or s), q (can always be replaced with k), and x (can always be replaced with ks or gz). These can then be repurposed - e.g. if we go with two-letter affricates and then take c=ʃ, x=ð, q=θ, we don't need any diacritics at all!
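To make that concrete, here's a minimal sketch of the consonant scheme as a Python table - purely illustrative, assuming the repurposed-letter variant (c=ʃ, q=θ, x=ð, j=ʒ) with two-letter affricates; the name CONSONANTS is hypothetical:

    # Hypothetical consonant table for the scheme sketched above: basic
    # Latin where possible, c/q/x/j repurposed, affricates spelled as
    # plosive+fricative pairs, ŋ written as n(g) per the allophone analysis.
    CONSONANTS = {
        "m": "m", "n": "n", "ŋ": "ng",
        "p": "p", "b": "b", "t": "t", "d": "d", "k": "k", "g": "g",
        "t͡ʃ": "tc", "d͡ʒ": "dj",  # two-letter affricates: t+ʃ, d+ʒ
        "f": "f", "v": "v", "θ": "q", "ð": "x",
        "s": "s", "z": "z", "ʃ": "c", "ʒ": "j", "h": "h",
        "l": "l", "r": "r", "j": "y", "w": "w",  # y freed up for the glide
    }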
Moving on to vowels, in GA we have:
Monophthongs: ʌ, æ, ɑ, ɛ, ə, i, ɪ, o, u, ʊ
Diphthongs: aɪ, eɪ, ɔɪ, aʊ, oʊ
R-colored: ɑ˞, ɚ, ɔ˞.
Diphthongs can be reasonably represented using the combination of vowel + y/w for the glide, thus: ay, ey, oy, aw, ow.
For monophthongs, firstly, ʌ can be treated as a stressed allophone of ə. If we do so, then all vowels (save for o, which stands by itself) form natural pairs which can be expressed with diacritics: Aa=ɑ, Ää=æ, Ee=ɛ, Ëë=ə, Ii=i, Ïï=ɪ, Oo=o, Uu=u, Üü=ʊ.
For R-colored vowels, we can just adopt the phonemic analysis that treats them as vowel+r pairs: ar, er, or.
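Same idea for the vowels, as a companion sketch (again hypothetical, just encoding the pairings above, with ʌ folded into ə and r-colored vowels written as vowel+r):

    # Hypothetical vowel table: diaeresis marks the paired vowel, diphthongs
    # are vowel + y/w glide, r-colored vowels are vowel + r.
    VOWELS = {
        "ɑ": "a", "æ": "ä", "ɛ": "e", "ə": "ë", "ʌ": "ë",  # ʌ = stressed ə
        "i": "i", "ɪ": "ï", "o": "o", "u": "u", "ʊ": "ü",
        "aɪ": "ay", "eɪ": "ey", "ɔɪ": "oy", "aʊ": "aw", "oʊ": "ow",
        "ɑ˞": "ar", "ɚ": "er", "ɔ˞": "or",
    }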
To sum it all up, we could have a decent phonemic American English spelling using just 4 extra vowel letters with diacritics: ä, ë, ï, ü - if we're okay with repurposing existing redundant letters and spelling affricates as two-letter sequences.
And worst case - if we don't repurpose letters, and with each affricate as well as ŋ getting its own letter - we need 10: ä, č, đ, ë, ï, ň, š, ŧ, ž, ü.
I don't think that's particularly excessive, not even the latter variant.
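If you want to play with it end to end, here's a toy respeller over the two hypothetical tables above (greedy longest-match, so affricates and diphthongs win over their single-symbol prefixes; the sample transcription is just for illustration):

    # Toy demo: respell a broad IPA transcription using the tables above.
    PHONEME_MAP = {**CONSONANTS, **VOWELS}
    KEYS = sorted(PHONEME_MAP, key=len, reverse=True)  # longest match first

    def respell(ipa: str) -> str:
        out, i = [], 0
        while i < len(ipa):
            for k in KEYS:
                if ipa.startswith(k, i):
                    out.append(PHONEME_MAP[k])
                    i += len(k)
                    break
            else:
                out.append(ipa[i])  # spaces and unmapped symbols pass through
                i += 1
        return "".join(out)

    print(respell("ðɪs ɪz ə tɛst"))  # -> xïs ïz ë test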