GPT-2 Writes an Important Theater Play

Photo by Alex Knight on Unsplash

GPT-2: Artificial Intelligence


Using artificial intelligence to compose a complete theatrical performance is a complex task

Researchers from Charles University began collaborating with the Švanda Theater and the Academy of Performing Arts in Prague on an unusual project: integrating artificial intelligence and robotics into the theater.

The goal, according to Rudolf Rosa, one of the researchers, is to create a theatrical show to premiere next January, marking the centenary of R.U.R., the play in which Karel Čapek, at the suggestion of his brother Josef, introduced the word “robot” to the world.

Incorporating artificial intelligence into art is not new, but using it to compose a complete theatrical performance is a complex and rarely explored task. Rosa and the rest of the team therefore researched and settled on a methodology for their investigation, hoping to complete it by September so there would be enough time to rehearse before the deadline.

GPT-2: What the Intelligence Generates

The team began experimenting with the open-source GPT-2 model, which was developed by OpenAI and trained on a large amount of English text, using it to complete missing passages in similar language and a consistent style. The team published the results of its preliminary experiments in a research paper on arXiv.
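To illustrate this kind of completion in practice, here is a minimal sketch that continues a script-like prompt with the publicly released GPT-2 weights via the Hugging Face transformers library. The prompt and sampling settings are illustrative assumptions, not the team’s actual setup.

```python
# Minimal sketch: continuing a script-like prompt with the public GPT-2 model.
# The prompt and sampling settings are illustrative, not the THEaiTRE team's setup.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = (
    "ROBOT: I was built to serve, yet I dream of the stage.\n"
    "DIRECTOR:"
)
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a continuation in a similar style; top-p sampling keeps it varied but coherent.
output_ids = model.generate(
    **inputs,
    max_new_tokens=60,
    do_sample=True,
    top_p=0.9,
    temperature=0.8,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Running the same prompt several times yields different continuations, which is exactly the behavior the researchers curate when assembling a script.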

Although GPT-2 was trained on a great deal of English text and the researchers’ initial experiments are promising, the model has not yet been trained on theater scripts, so they plan to improve its performance by fine-tuning it on existing scripts. This is, of course, easier than developing a new model from scratch.
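A rough sketch of what such fine-tuning might look like with the transformers Trainer API is shown below, assuming the theater scripts have been collected into a plain-text file. The file name and hyperparameters are placeholders, not the project’s actual data or settings.

```python
# Rough sketch: fine-tuning GPT-2 on a plain-text corpus of theater scripts.
# "theatre_scripts.txt" and all hyperparameters are placeholders.
from transformers import (
    GPT2LMHeadModel, GPT2Tokenizer,
    TextDataset, DataCollatorForLanguageModeling,
    Trainer, TrainingArguments,
)

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Chunk the script corpus into fixed-length blocks for causal language modeling.
train_dataset = TextDataset(
    tokenizer=tokenizer,
    file_path="theatre_scripts.txt",
    block_size=512,
)
data_collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

training_args = TrainingArguments(
    output_dir="gpt2-theatre",
    num_train_epochs=3,
    per_device_train_batch_size=2,
    save_steps=500,
)

Trainer(
    model=model,
    args=training_args,
    data_collator=data_collator,
    train_dataset=train_dataset,
).train()
```

The point of the sketch is only that adapting a pretrained model to a new domain is a matter of continued training on domain text, far cheaper than building a model from nothing.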

Rosa stated that his team edits what the artificial intelligence generates, as researchers in this field commonly do, but what distinguishes their work is transparency: they clearly separate what the model contributes from what humans contribute, so that the final piece is a genuine collaboration rather than a human rewrite of machine output.

The project is still in its infancy, but Rosa is impressed by the model’s performance: fed simple prompt sentences, it generates text similar in subject, structure, and style, even though the team has not yet fine-tuned or otherwise modified it.

Although the model currently generates the text line by line, the team hopes to adopt hierarchical generation methods soon: generating a short summary first and then gradually expanding it, producing smooth, consistent, clear dialogue free of contradictions and flaws.
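One way to picture that hierarchical idea is a hypothetical two-stage loop in which a synopsis is generated first and then expanded scene by scene. The prompts and helper function below are invented for illustration and are not the team’s published method.

```python
# Hypothetical two-stage sketch of hierarchical generation:
# first a short synopsis, then each synopsis beat expanded into a scene.
# The prompts and helper structure are invented for illustration.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

def generate(prompt: str, max_new_tokens: int = 80) -> str:
    inputs = tokenizer(prompt, return_tensors="pt")
    out = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        top_p=0.9,
        pad_token_id=tokenizer.eos_token_id,
    )
    # Decode only the newly generated tokens, not the prompt.
    new_tokens = out[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)

# Stage 1: a short play synopsis.
synopsis = generate("Synopsis of a play about a robot who longs to act on stage:\n", 60)

# Stage 2: expand each synopsis sentence into a scene of dialogue.
for i, beat in enumerate(s for s in synopsis.split(".") if s.strip()):
    scene = generate(f"Scene {i + 1}. {beat.strip()}.\nDialogue:\n", 120)
    print(f"--- Scene {i + 1} ---\n{scene}\n")
```

Conditioning each scene on its summary beat is what would keep the expanded dialogue anchored to a single overall plot rather than drifting line by line.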

“Thanks to the approaching anniversary, our main target is clear and fixed: by January 2021, we need to have a play ready for premiere,” Rosa explained. “As it will be performed by a professional theater group, we need to have the script ready in September, so that there is enough time for dramatization, rehearsals, etc. Thus, even though we are developing a general tool for theater script generation, at the moment we only really need to generate one script.”

“Our first play will be shown in January, and we will consider its fate and the opinions of the public about it, hoping to use our observations to advance our research and development,” Rosa added.

What if you ask ChatGPT about GPT-2?

Of course, I remember GPT-2! GPT-2 is a large-scale language model developed by OpenAI that uses deep learning techniques to generate human-like text. It has 1.5 billion parameters and was trained on a massive dataset of web pages, books, and other texts.

GPT-2 has generated a lot of excitement in the natural language processing (NLP) community because of its ability to generate coherent and sometimes even creative text. However, it has also raised concerns about the potential misuse of such technology, particularly in the generation of fake news or malicious content.

GPT-2 is a type of deep learning model called a transformer, which was first introduced in a paper by Google researchers in 2017. The transformer architecture is designed to process sequences of data, such as sentences or paragraphs, and it has become popular in NLP because of its ability to model long-range dependencies between words.

GPT-2 is one of the largest and most powerful transformer models developed so far, and it has demonstrated impressive performance on a wide range of NLP tasks, including language modeling, text completion, and question answering. One of the most impressive aspects of GPT-2 is its ability to generate coherent and sometimes even creative text, such as news articles, poetry, and even computer code.

However, as I mentioned earlier, the potential for misuse of GPT-2 and other similar language models has raised concerns in the NLP community. Because these models are so good at generating realistic text, they could be used to generate fake news, impersonate people online, or even create malicious content. As a result, some researchers have advocated for greater scrutiny and regulation of the development and use of these types of models.