Generative AI Using Python: Introduction
Artificial Intelligence (AI) has evolved rapidly, and one of the most intriguing domains within it is Generative AI. This cutting-edge technology empowers machines to autonomously create content, opening up vast possibilities for creativity and innovation.
In this blog post, we will delve into the fundamentals of Generative AI and provide hands-on examples using the GPT-2 model with Python.
Understanding Generative AI
Generative AI is a subset of artificial intelligence that enables machines to independently generate content. Unlike rule-based systems, generative models can create new and unique outputs by learning patterns from extensive datasets. This field has found remarkable applications in various domains, including text generation, image creation, and more.
Python: Your Gateway to Generative AI
Python, known for its simplicity and extensive libraries, serves as an ideal language for exploring Generative AI. Let's embark on a step-by-step journey, with practical examples using the GPT-2 model.
Note: There are various models, both commercial and open-source, available in the marketplace to explore. Each model has its own advantages and disadvantages, so select a model according to your requirements.
Some of the top models are as follows:
1. GPT-2, GPT-3, GPT-3.5, and GPT-4 by OpenAI
2. Bard and PaLM by Google
3. LLaMA by Meta
4. Falcon by TII
1. Install Python and Libraries
Begin by installing Python on your system. Visit the official Python website for the latest version. Next, install essential libraries such as TensorFlow, PyTorch, or Hugging Face's transformers library, which we'll use for the GPT-2 model.
# install necessary library
pip install transformers
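The transformers library also needs a deep learning backend to run GPT-2. In this post we assume a PyTorch backend; a minimal install (using the standard PyPI package) looks like this:
# install the PyTorch backend used by the GPT-2 examples below
pip install torch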
2. The Basics of GPT-2
Generative Pre-trained Transformer 2 (GPT-2) is a state-of-the-art language model developed by OpenAI. It's a transformer-based model that excels at various natural language processing tasks, including text generation. To use GPT-2, you can choose from multiple pre-trained versions based on your needs.
Note: We have selected this model as it's completely free for learning and research purposes.
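As a quick sketch of how you pick a pre-trained version, the GPT-2 checkpoints published on the Hugging Face Hub ("gpt2", "gpt2-medium", "gpt2-large", and "gpt2-xl") are all loaded the same way; larger checkpoints generally produce better text at the cost of more memory and a longer download:
# load a specific GPT-2 variant from the Hugging Face Hub
from transformers import GPT2LMHeadModel, GPT2Tokenizer

model_name = "gpt2"  # swap in "gpt2-medium", "gpt2-large", or "gpt2-xl" if resources allow
model = GPT2LMHeadModel.from_pretrained(model_name)
tokenizer = GPT2Tokenizer.from_pretrained(model_name)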
3. Text Generation with GPT-2
Let's dive into a practical example of text generation using GPT-2 and the transformers library.
For this example, we'll use the smaller version of GPT-2.
# import libraries and create a UDF to generate text using a user prompt as input
import os
import sys
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# Redirect stderr to suppress warning messages (os.devnull works across platforms)
sys.stderr = open(os.devnull, 'w')

def generate_text(prompt, model, tokenizer, max_length=100):
    # Tokenize the prompt
    input_ids = tokenizer.encode(prompt, return_tensors="pt", max_length=max_length, truncation=True)
    # Generate text based on the prompt
    output = model.generate(input_ids, max_length=max_length, num_return_sequences=1, no_repeat_ngram_size=2)
    # Decode the generated text
    generated_text = tokenizer.decode(output[0], skip_special_tokens=True)
    return generated_text

# Load the pre-trained GPT-2 model and tokenizer
model = GPT2LMHeadModel.from_pretrained("gpt2")
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")

# create a UDF to format the generated text
def format_and_display_text(generated_text, words_per_line=5):
    words = generated_text.split()
    formatted_text = []
    for i in range(0, len(words), words_per_line):
        line = ' '.join(words[i:i+words_per_line])
        formatted_text.append(line)
    # Display the formatted text in a box
    width = max(map(len, formatted_text))
    print("+" + "-" * (width + 2) + "+")
    for line in formatted_text:
        print(f"| {line.ljust(width)} |")
    print("+" + "-" * (width + 2) + "+")

# create a UDF to interact with the model and generate formatted responses
def get_response(user_prompt):
    # Generate text based on the user prompt
    generated_text = generate_text(user_prompt, model, tokenizer)
    return generated_text

# Futuristic Scenario Prompt Example
prompt = "Describe a future society where humans coexist with advanced AI, highlighting the benefits and potential challenges."
format_and_display_text(generated_text=get_response(user_prompt=prompt),
                        words_per_line=10)
Output:
The complete code and implementation are available in our Baacumen GitHub repository.
4. Experiment and Explore
Generative AI, especially with models like GPT-2, opens up exciting possibilities. Experiment with different prompts, explore larger versions of GPT-2, and adapt the code to your specific needs. Join online communities, contribute to open-source projects, and stay curious!
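As one example of what such experimentation might look like, here is a small sketch that switches from the default greedy decoding to sampling. The arguments used (do_sample, top_k, top_p, temperature) are standard options of the transformers generate() method, and the model and tokenizer are the ones loaded in the example above:
# experiment with sampling-based generation (reuses model and tokenizer loaded above)
input_ids = tokenizer.encode("Describe a future society where humans coexist with advanced AI.", return_tensors="pt")
output = model.generate(
    input_ids,
    max_length=150,
    do_sample=True,    # sample instead of greedy decoding for more varied text
    top_k=50,          # consider only the 50 most likely next tokens
    top_p=0.95,        # nucleus sampling: keep tokens covering 95% of the probability mass
    temperature=0.8,   # values below 1.0 make output more focused, above 1.0 more adventurous
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
Because sampling is random, each run produces a different continuation, which makes it easy to compare prompts and settings side by side.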
Conclusion
Generative AI in Python, powered by models like GPT-2, provides an exhilarating path for exploration. From understanding the basics to practical implementation, the journey is both engaging and rewarding. As you venture into the dynamic world of Generative AI, embrace experimentation, contribute to the community, and enjoy the creative process. Happy coding!
Stay tuned for our next blog post, where we will discuss more about Large Language Models and guide you through creating a GUI Chatbot using Large Language Models.