Editorial: Acquire AI literacy

Editorial Cartoon by John Gilbert Manantan

Two camps eye each other warily across the gulf created by artificial intelligence (AI): those who use it, and those who do not and are thus against its use.

Yet there is much to be said for being literate about AI, since intelligence, a quality involved in all but the simplest human acts, "must include the ability to adapt to new circumstances," as Britannica.com points out.

Many teachers are uneasy about students using AI, such as the free version of the popular ChatGPT, not just to review and improve their papers but even to create their assignments.

AI-assisted cheating is not limited to students or teachers.

What if a person or group were to use AI capabilities for deception? Xy, a virtual assistant who also teaches, demonstrated how ChatGPT 4.0 can be prompted to create a virtual man or woman that can chat, gesticulate, and react with nonverbal cues to the questions or comments of a human "conversing" with it.

A close comparison of the AI-simulated human with a real person reveals a certain rigidity of facial expression, which appears unnatural only because one has been alerted that the image is AI-generated.

The next version of ChatGPT will smooth out this wrinkle, predicted Xy, who pays a monthly fee for the premium version of the chatbot to create digital content and academic papers.

The "jaw-dropping" performance of AI dominates current discussions. Yet more attention should be paid to the processes by which AI transforms inputs into outputs. The focus on outputs rather than process is reinforced by the black-box nature of AI.

While praising the "powerful language capabilities" of GPT-4, which can be harnessed for endeavors ranging from the creation of gaming content to news articles, Gartner Research analyst Arun Chandrasekaran cautioned in a CNN Business article published on March 16, 2023 that even the latest versions of AI are still prone to glitches, such as producing inaccurate information known as "hallucinations."

In an article, Britannica.com explains that hallucinations occur when an AI is prompted to function as a search engine, counter to its original function as a generator of text. Sometimes, rather than indicating that it does not know the answer, a probability-based language model "responds with probable but factually inaccurate text based on the user's prompts."

A "black box" model refers to a system that produces useful information through a process that is not transparent or open to scrutiny. Investopedia.com explains that the workings of many technological advances may be too complex to simplify for humans, or may be withheld for proprietary reasons.

Aside from being the opposite in principle of a "white box" system, whose inner workings are open to inspection, AI is also dependent on the inputs that make up its memory and determine its capabilities.

OpenAI, which produced ChatGPT and its more sophisticated iterations, cautions users to apply GPT-4 with "great care… especially in high-stakes contexts," according to the same CNN Business article of March 16.

The ethics of AI's social applications is mined with hazards. The consequences of AI go beyond the need for literacy: as more jobs become automated, many will lose work, even after retooling for AI.

Britannica.com points out that the predictive policing algorithms used by US police departments are influenced by disproportionately high arrest rates in communities of color, which results in over-policing that, in turn, feeds back into the algorithms. Technology is incapable of filtering out these human biases; AI magnifies them.

Privacy is another vulnerability, since other parties may illegally access the large datasets behind AI and manipulate them to create fake images and profiles that power deceptions and scams. Intrusions can extend to the surveillance of individuals and of vulnerable groups such as activists and dissidents.

By being literate about AI, citizens can check and influence how life is enabled or disabled by technology.
