Artificial intelligence (AI) has become a significant part of our daily lives, but understanding how it has evolved can seem complex. Two key developments have been crucial in advancing the field: the design of sophisticated neural networks and the rapid growth of computational power and data.
AI is now mainstream and part of our everyday experience at work and home. However, its roots go much further back, starting in the 1950s.
In 2017, a groundbreaking step forward was taken with the introduction of a new type of neural network: the transformer. This innovation changed how AI systems process and understand information, particularly in tasks that involve analysing sequences of data, such as text. The transformer brought a new approach that allowed quicker and more efficient processing of information, a departure from earlier methods.
Transformer neural networks represent a crucial advance in Generative AI, enabling a more sophisticated and efficient way to process sequential data. This leap forward is mainly attributable to their unique structure, which processes input in parallel and, most importantly, introduces attention mechanisms.
Attention mechanisms, the core concept introduced in the 2017 paper "Attention Is All You Need", revolutionised how these networks process information. In simple terms, attention mechanisms let the network focus selectively on different parts of the input data, determining which elements matter most for the task at hand.
To understand this better, imagine reading a book. Earlier sequence models, such as recurrent neural networks (RNNs), would read the book one word at a time, often losing the context of earlier sections by the time they reached later ones. In contrast, a transformer with attention mechanisms behaves like a reader who can glance at multiple pages simultaneously and understand how words and sentences on different pages relate to each other. This allows for a more comprehensive and contextually aware understanding of the text.
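The page-glancing analogy corresponds to scaled dot-product attention, the core operation inside a transformer. The short NumPy sketch below is purely illustrative (the function names and toy dimensions are our own): every query token scores every other token in a single matrix multiplication, with no sequential pass, and a softmax turns those scores into a focus distribution over the input.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Each query attends to every key at once (parallel, not sequential),
    yielding weights that say which inputs matter most for each position."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # similarity of every query to every key
    weights = softmax(scores, axis=-1)   # each row sums to 1: a focus distribution
    return weights @ V, weights

# Toy example: 4 token embeddings of dimension 8, attending to themselves
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
out, w = scaled_dot_product_attention(X, X, X)  # self-attention: Q = K = V
```

Because the whole sequence is handled in one matrix operation, each of the four tokens "sees" all the others at once, which is what lets transformers keep long-range context that word-by-word models tend to lose.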
When ChatGPT first launched, it could process roughly 4,000 tokens at a time, akin to reading and comprehending a few thousand words or parts of words at once. GPT-4 raised this to 8,192 tokens, and the more recent GPT-4 Turbo, with its 128K-token context window, has vastly expanded this capacity: it is like grasping the content of over 300 pages of text at once. These context windows are still finite, however, underscoring both the continuous progress and the potential for further growth in AI.
Today, transformer neural networks shape the Generative AI landscape. All leading large language models (LLMs) are built upon this architecture, driving unprecedented advancements in natural language processing and understanding.