Discovering the Essence of My Language Arts Education Through the Lens of AI
What is in it for me?
I studied Education and Language Arts at university for four years, shuffling between the Faculty of Education and the Faculty of Arts throughout that period. As a result, about 70% of my course subjects were from the Department of English.
I remember this topic vividly because it was one of the many roadblocks I encountered while diligently studying. Debates raged everywhere between lecturers and students about which approach was best, and a series of assignments and tests drove the point home.
Colour me surprised, then, when I realized that an aspect of artificial intelligence rests on the same concept as the figurative fishbone that stuck in my throat while in school.
Learn a little English here, will you?
There are two approaches to studying grammar in English: Traditional Grammar and Transformational Generative Grammar (TGG).
Traditional grammar (also known as classical grammar) is a framework for the description of the structure of a language. According to Veronica Curlette on Quora, Traditional (often called prescriptive) grammar is somebody’s opinion of how a language should be. It doesn’t rely on science (in the case of language, the science would be linguistics) to formulate rules. In the case of English, traditional grammarians often ‘prescribe’ the regulations of Greek and Latin grammar to English.
Transformational Generative Grammar (TGG) considers grammar a system of rules that precisely generates those combinations of words that form grammatical sentences in a given language. It also involves using defined operations (called transformations) to produce new sentences from existing ones. It states that we analyze the relationship between sentence elements and then build internal rules based on observation. The man responsible for this theory is Noam Chomsky.
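The generative half of TGG can be made concrete with a toy example. The sketch below is purely illustrative (the grammar, words, and function names are my own invention, and real TGG also involves transformations between deep and surface structures, not just phrase-structure rules): a small, finite rule set that nonetheless generates many well-formed sentences.

```python
import random

# A toy phrase-structure grammar: each symbol rewrites to one of
# several expansions, and recursion bottoms out at actual words.
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"], ["a"]],
    "N":   [["student"], ["lecturer"], ["sentence"]],
    "V":   [["reads"], ["writes"]],
}

def generate(symbol="S"):
    """Expand a symbol by recursively applying grammar rules."""
    if symbol not in GRAMMAR:          # terminal: an actual word
        return [symbol]
    expansion = random.choice(GRAMMAR[symbol])
    words = []
    for sym in expansion:
        words.extend(generate(sym))
    return words

print(" ".join(generate()))  # e.g. "the student reads a sentence"
```

Six tiny rules already generate dozens of distinct grammatical sentences, which is the core insight: grammar as a generator, not a rulebook of prohibitions.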
In other words, sentences are examined critically through structure and context rather than by blindly following prescribed rules.
The Mind-boggling Discovery
According to ALX, computers have long been seen as advanced calculators, rigidly following the instructions programmed by humans. However, a transformative change is underway with Generative AI, where computers are not just tools but partners in thinking, learning, and creating. Generative AI allows machines to engage in creative and intellectual tasks which, in the past, only humans were capable of.
While traditional computing follows programmed instructions and treats the machine as a means to an end, Generative AI is the TGG of computing: it thrives on context, learning, and evolving.
It gets trickier
Two recent advances have played a critical part in generative AI going mainstream: transformers and the breakthrough language models they enabled. Transformers are a machine learning architecture that allows researchers to train ever-larger models without labelling all the data in advance. New models could thus be trained on billions of pages of text, resulting in answers with more depth. In addition, transformers unlocked a new notion called attention, which enabled models to track the connections between words across pages, chapters and books rather than just within individual sentences. And not just words: transformers could also use this ability to analyze code, proteins, chemicals and DNA.
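The attention idea above can be sketched in a few lines. This is a minimal, illustrative version of scaled dot-product attention with made-up two-dimensional "word" vectors (real transformers use learned, high-dimensional vectors and many attention heads): each query word mixes together the other words' values, weighted by how strongly it matches their keys.

```python
import math

def softmax(xs):
    """Turn raw scores into weights that sum to 1."""
    exps = [math.exp(x - max(xs)) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(queries, keys, values):
    """For each query vector, mix the value vectors weighted by
    how strongly the query matches each key (dot product)."""
    d = len(keys[0])
    outputs = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        mixed = [sum(w * v[i] for w, v in zip(weights, values))
                 for i in range(len(values[0]))]
        outputs.append(mixed)
    return outputs

# One query "word" attending over three other "words".
q = [[1.0, 0.0]]
k = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
v = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
out = attention(q, k, v)
```

Because the weighting runs over every key at once, nothing limits attention to neighbouring words, which is what lets models connect words across whole chapters.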
Tying It All Together
This journey from traditional grammar to transformational generative grammar (TGG) in linguistics mirrors the evolution in artificial intelligence, particularly with generative AI. The same debates and struggles I encountered in my education about prescriptive versus descriptive grammar are paralleled in the technological world by discussions about traditional programming versus machine learning and generative AI.
In both fields, there is a shift from rigid, rule-based systems to more fluid, adaptive, and context-aware models. Just as TGG allows for creating complex and varied sentences based on internalized rules and context, generative AI enables computers to generate content, make decisions, and solve problems in ways that mimic human creativity and thought processes.
The revelation that the principles of TGG could be applied to artificial intelligence was mind-boggling. It showed me that the skills and knowledge I gained from studying language arts have unexpected and profound technological applications. This interconnectedness between linguistics and AI opens up new avenues for exploration and innovation.