An Overview: Generative AI Programs and ChatGPT Infographic by Dr. Jasmin (Bey) Cowin

One of the earliest examples of generative AI was the Markov chain, a statistical method developed by Russian mathematician Andrey Markov in the early 1900s. As Devin Soni explains, Markov chains are a “fairly common, and relatively simple, way to statistically model random processes. They have been used in many different domains, ranging from text generation to financial modeling. A popular example is r/SubredditSimulator, which uses Markov chains to automate the creation of content for an entire subreddit.”
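
To make that idea concrete, here is a minimal sketch of a Markov-chain text generator in plain Python. The tiny corpus, the single-word state, and the function names are illustrative assumptions for this post, not taken from any particular library or from r/SubredditSimulator’s actual code.

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words observed to follow it."""
    words = text.split()
    chain = defaultdict(list)
    for current_word, next_word in zip(words, words[1:]):
        chain[current_word].append(next_word)
    return chain

def generate(chain, start, length=10):
    """Walk the chain, sampling each next word from its observed followers."""
    word = start
    output = [word]
    for _ in range(length - 1):
        followers = chain.get(word)
        if not followers:  # dead end: this word was never followed by anything
            break
        word = random.choice(followers)
        output.append(word)
    return " ".join(output)

# Tiny made-up corpus, purely for illustration.
corpus = "the cat sat on the mat the cat saw the dog the dog sat on the rug"
chain = build_chain(corpus)
print(generate(chain, start="the"))
```

Each generated sentence is stitched together purely from observed word-to-word transitions; subreddit simulators built on Markov chains apply the same principle at a much larger scale.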

The first successful generative AI algorithm was developed in the 1950s by computer scientist Arthur Samuel, who created the Samuel Checkers-Playing Program, an early example of an approach now common in artificial intelligence (AI) research: working within a domain that is complex yet understandable.

One of the early breakthroughs in generative AI was the development of Restricted Boltzmann Machines (RBMs). “It was invented in 1985 by Geoffrey Hinton, then a Professor at Carnegie Mellon University, and Terry Sejnowski, then a Professor at Johns Hopkins University.” RBMs are a type of neural network that can learn to represent complex data distributions and generate new data from the learned distribution. In 2014, Ian Goodfellow and colleagues at the Université de Montréal introduced the Generative Adversarial Network (GAN) framework. As Jason Brownlee writes in A Gentle Introduction to Generative Adversarial Networks (GANs), “Generative modeling is an unsupervised learning task in machine learning that involves automatically discovering and learning the regularities or patterns in input data in such a way that the model can be used to generate or output new examples that plausibly could have been drawn from the original dataset.”
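
To illustrate what “generate new examples that plausibly could have been drawn from the original dataset” means in practice, below is a minimal GAN sketch, assuming PyTorch is available. The network sizes, the one-dimensional Gaussian “dataset”, and the hyperparameters are arbitrary choices for demonstration and do not reproduce the original GAN paper’s setup.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Generator: noise in, a single "data" value out.
generator = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
# Discriminator: a data value in, probability that it is real out.
discriminator = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

def real_batch(n=64):
    # "Real" data: samples from a normal distribution with mean 4, std 1.25.
    return torch.randn(n, 1) * 1.25 + 4.0

for step in range(2000):
    # Train the discriminator: label real samples 1 and generated samples 0.
    real = real_batch()
    fake = generator(torch.randn(64, 8)).detach()
    d_loss = (loss_fn(discriminator(real), torch.ones(64, 1))
              + loss_fn(discriminator(fake), torch.zeros(64, 1)))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Train the generator: try to make the discriminator output 1 for fakes.
    fake = generator(torch.randn(64, 8))
    g_loss = loss_fn(discriminator(fake), torch.ones(64, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

with torch.no_grad():
    samples = generator(torch.randn(1000, 8))
# The mean of the generated samples should drift toward 4 as training progresses.
print(f"generated mean {samples.mean().item():.2f}, std {samples.std().item():.2f}")
```

The generator never sees the real data directly; it improves only through the discriminator’s feedback, which is the adversarial idea at the heart of the framework.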

Recently, generative AI and ChatGPT have been in the news, discussed at conferences, used by students, and feared by professors because they can generate content that is indistinguishable from content created by humans. Google’s BERT and GPT-3 are both large language models, and both have been referred to as “stochastic parrots” because they produce convincing synthetic text devoid of any human-like comprehension. A “stochastic parrot” is, in the words of Bender, Gebru, and colleagues, “a system for randomly stitching together sequences of language forms” that have been seen in the training data “according to probabilistic knowledge about how they join, but without any reference to meaning.”
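
The “stochastic parrot” critique is easier to picture once the underlying mechanic is visible: at every step a language model assigns a probability to each token in its vocabulary and samples one, with no representation of meaning required. The sketch below uses NumPy with an invented five-word vocabulary and made-up scores; real models repeat this sampling step token by token over vocabularies of tens of thousands of entries.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical vocabulary and model scores (logits) for the next token.
# The words and numbers here are invented purely for illustration.
vocab  = ["meaning", "syntax", "parrot", "random", "text"]
logits = np.array([1.2, 0.3, 2.1, 0.5, 1.7])

def sample_next(logits, temperature=1.0):
    """Turn scores into probabilities (softmax) and sample one token index."""
    scaled = logits / temperature
    probs = np.exp(scaled - scaled.max())
    probs /= probs.sum()
    return rng.choice(len(probs), p=probs)

for _ in range(3):
    print(vocab[sample_next(logits)])
```

The output varies from run to run because the choice is probabilistic, which is exactly the “stochastic” part of the phrase.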

This infographic is an attempt to visualize the timeline of Generative AI Programs and ChatGPT.

Author: drcowinj

“Education is our passport to the future, for tomorrow belongs only to the people who prepare for it today,” declared Malcolm X at the founding forum of the O.A.A.U. [Organization of Afro-American Unity] at the Audubon Ballroom on June 28, 1964 (X, n.d.). Dr. Jasmin Bey Cowin, a Fulbright Scholar and SIT graduate, completed the Education Policy Fellowship Program (EPFP™) at Teachers College, Columbia University. Dr. Cowin served as President of the Rotary Club of New York and Assistant Governor for New York State, was the long-term Chair of the Rotary United Nations International Breakfast meetings, and works as an Assistant Professor at Touro College, Graduate School of Education. She has over twenty-five years of experience as an educator, tech innovator, entrepreneur, and institutional leader, with a focus on equity and access to digital literacy and education in Sub-Saharan Africa. Her extensive background in education, administration, not-for-profit leadership, entrepreneurship, and technology innovation provides her with unique skills and vertical networks locally and globally. Dr. Cowin participates fully in the larger world of the TESOL academic discipline as elected Vice President and Chair-Elect of the New York State TESOL (NYS TESOL) organization for the 2021 conference. Her ongoing research, expressed in scholarly contributions to the advancement of knowledge, is demonstrated through publications, presentations, participation in academic conferences, blogging, and other scholarly activities, including public performances and exhibitions at conferences and workshops. Of particular interest to her are The Blockchain of Things and its implications for Higher Education; Current Global Trends in TESOL; Developing Materials and Resources in Teaching English; E-learning; Micro- and Macro-Methodologies in TESOL; E-Resources Discovery and Analysis; and Language Acquisition and the Oculus Rift in VR.
