Use GPT-2 for Text-generation
Updated Apr 22, 2023 · Jupyter Notebook
TensorFlow 2.0 removed some dependencies that GPT-2 relies on, which prevented many people from running the model. Since TensorFlow 1.x is no longer available unless you build it from source, this repository adds compatibility so the GPT-2 model can run on TensorFlow 2.0.
A simple web UI for text completion using gpt-2-simple finetuned GPT-2 models.
👨🎓 This repo supplements my video on Transformers and Text Summarization, part of my series AI does AI (https://youtu.be/p_6xgrykPMQ)
A desktop interface for ODIN, my centralized GPT-2 web API.
Pre-train a Spanish GPT-2 model from scratch using the Spanish OSCAR dataset.
Bookkeeping repo for the SC22 Batch A team Wireless-Union at AI Camp
A Twitter bot account that posts AI-generated fake tweets in the style of Elon Musk, fine-tuned using GPT-2
A series of notebooks exploring GPT-Neo
Python-based Internet Relay Chat bot client for local GPT-2 (TensorFlow) models
A chatbot fine-tuned on a GPT-2 model to generate Shakespeare-style responses.
Programming Language for Deep Learning in Python
Website to visualize an NLP text-generation project: a GPT-2 model retrained to mimic the writing style of Karl Marx as closely as possible, commenting daily on the latest news from The Guardian!
A Flask doctor chatbot that responds to patients' queries related to COVID-19. The app is built on GPT-2 pretrained with DialoGPT weights.
Transformer-based abstractive summarization models (mT5, T5 Pegasus, GPT-2) implemented for Chinese text summarization.