THE BIRTH OF GENERATIVE AI
In 2017, Google researchers pushed the boundaries of what was possible. They published an academic paper called "Attention Is All You Need" that proposed a new architecture for AI models. Known as the transformer, it gave AI a far deeper understanding of existing data by analyzing how each part of an input relates to every other part. The transformer became the basis for generative AI, which goes beyond traditional AI. A generative model trained on vast amounts of text, known as a large language model (LLM), doesn't just interpret existing data. It uses its understanding of that data to generate something new. For the first time, we have access to a creative form of AI.