Demystification of Artificial Intelligence Systems in Linguistic Intelligence and English Language Domains
DOI:
https://doi.org/10.32996/ijllt.2025.8.5.16

Keywords:
Artificial intelligence, linguistic intelligence, computer programmes, computational linguistics, complex biological system, mindfulness, ambiguities in AI

Abstract
The term "artificial intelligence" was first coined by John McCarthy, who defined it as "the science and engineering of making intelligent machines" in a document prepared for a conference held on the campus of Dartmouth College in 1956. That conference marked the beginning of serious AI research in the decades that followed. Yet the concept of artificial intelligence is older than it is often assumed to be: it traces back to as early as 1950, when Alan Turing proposed the Turing Test. The first chatbot computer program, ELIZA, was then created in the 1960s. Public attitudes remain mixed: in 2017, 61% of Europeans were positive about robotics and AI, while 30% were negative, attitudes that need to be managed carefully. The dynamics of public opposition and acceptance could be important factors shaping AI's long-term development path. The theoretical framework adopted here is that artificial intelligence can be viewed as an "overarching rubric which encompasses machine learning, which further encompasses deep learning." Rich and Knight (1991, p. 3) stated that "artificial intelligence (AI) is the study of how to make computers do things which, at the moment, people do better."