Transformer (Attention Is All You Need) Implementation in PyTorch
-
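The core operation behind the paper's title is scaled dot-product attention. Below is a minimal PyTorch sketch of that operation for orientation only; the function name and tensor shapes are assumptions, not code from this repository.

```python
# Scaled dot-product attention as defined in "Attention Is All You Need":
# softmax(Q K^T / sqrt(d_k)) V. Shapes below are illustrative assumptions.
import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: (batch, heads, seq_len, d_k)
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)   # (batch, heads, seq, seq)
    if mask is not None:
        # Block disallowed positions (e.g., padding or future tokens).
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)                 # attention distribution
    return weights @ v                                  # weighted sum of values
```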
JAX implementation of the bart-base model
JAX implementation of the T5 model: Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
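Both JAX ports above implement encoder-decoder models that also ship with the HuggingFace transformers library. As a hedged sketch of T5's text-to-text interface (using the library's PyTorch classes rather than either repo's JAX code; the checkpoint and task prefix are illustrative):

```python
# Text-to-text inference with a pre-trained T5 checkpoint via HuggingFace
# transformers; "t5-small" and the translation prefix are illustrative.
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

inputs = tokenizer("translate English to German: The house is small.",
                   return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```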
Censored tweets annotated for specificity; AAAI 2019 paper: Predicting and Analyzing Language Specificity in Social Media Posts
An NLP algorithm I developed to measure the similarity or relatedness of two documents or Wikipedia articles, inspired by cosine similarity and built on WordNet.
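One plausible reading of that idea, sketched below: map each token to a canonical WordNet synset so that synonyms align, then compute an ordinary bag-of-words cosine. The expansion strategy and function names here are my assumptions, not necessarily the repository's method (requires `nltk` with the WordNet corpus downloaded).

```python
# Cosine similarity over bag-of-words vectors, with WordNet used to map
# synonyms to a shared canonical form. Illustrative sketch only.
import math
from collections import Counter
from nltk.corpus import wordnet  # requires: nltk.download("wordnet")

def canonicalize(tokens):
    # Replace each token with the name of its first WordNet synset, if any,
    # so that synonyms sharing that synset collapse to one term.
    return [wordnet.synsets(t)[0].name() if wordnet.synsets(t) else t
            for t in tokens]

def doc_similarity(doc_a, doc_b):
    va = Counter(canonicalize(doc_a.lower().split()))
    vb = Counter(canonicalize(doc_b.lower().split()))
    dot = sum(va[t] * vb[t] for t in va)
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

print(doc_similarity("the cat sat on the mat", "a feline rested on a rug"))
```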
Train a T5 model to generate simple fake news, and use a RoBERTa model to classify what's fake and what's real.
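For the detection half, a hedged sketch of scoring text with a RoBERTa sequence classifier through the transformers pipeline; note that `roberta-base` here is an untuned placeholder (a checkpoint fine-tuned on fake-news labels would replace it):

```python
# Text classification with a RoBERTa checkpoint via the transformers
# pipeline; "roberta-base" is a placeholder with an untrained head and
# would be swapped for a fine-tuned fake-news classifier in practice.
from transformers import pipeline

detector = pipeline("text-classification", model="roberta-base")
print(detector("Scientists confirm the moon is made of cheese."))
```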
An interactive application leveraging a pre-trained language model (GPT-2) to generate human-like text from user prompts. Create stories, reports, dialogues, and more! 🤖
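A minimal sketch of what prompt-based GPT-2 generation looks like with the transformers pipeline (sampling settings are illustrative; the app's own interface is not shown here):

```python
# Prompt-based text generation with pre-trained GPT-2.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator("Once upon a time,", max_new_tokens=50,
                   do_sample=True, top_p=0.95)
print(result[0]["generated_text"])
```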
Advanced NLP model implementation in PyTorch featuring a transformer architecture, multi-head attention, and a comprehensive training pipeline with mixed precision, gradient accumulation, and dynamic learning rate scheduling.
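A hedged sketch of how those three training techniques typically combine in plain PyTorch; the stand-in model, data, and hyperparameters are assumptions, not this repository's code (requires a CUDA device):

```python
# Mixed precision (autocast + GradScaler), gradient accumulation, and a
# dynamic LR schedule (OneCycleLR) in one loop. Model and data are toys.
import torch
from torch import nn

model = nn.Linear(16, 2).cuda()                      # stand-in model
data = [(torch.randn(8, 16), torch.randint(0, 2, (8,))) for _ in range(32)]
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)
scaler = torch.cuda.amp.GradScaler()
accum_steps = 4                                      # effective batch = 8 * 4
scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer, max_lr=3e-4, total_steps=len(data) // accum_steps)

for step, (x, y) in enumerate(data):
    x, y = x.cuda(), y.cuda()
    with torch.cuda.amp.autocast():                  # forward in mixed precision
        loss = criterion(model(x), y) / accum_steps  # scale loss for accumulation
    scaler.scale(loss).backward()                    # fp16-safe backward
    if (step + 1) % accum_steps == 0:
        scaler.step(optimizer)                       # unscales grads, then steps
        scaler.update()
        scheduler.step()                             # dynamic learning rate
        optimizer.zero_grad(set_to_none=True)
```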
A Python-based sentiment analysis tool that classifies text as positive, negative, or neutral. Using NLP techniques and machine learning, it processes data, extracts features, trains models, and evaluates performance. Ideal for analyzing feedback and social media sentiment.
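A minimal sketch of the extract-features/train/evaluate pipeline such a tool describes, using TF-IDF features and logistic regression in scikit-learn (the inline dataset is illustrative):

```python
# TF-IDF feature extraction + logistic regression for 3-way sentiment.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["great product, loved it", "terrible, waste of money", "it was okay"]
labels = ["positive", "negative", "neutral"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(texts, labels)
print(clf.predict(["what a fantastic experience"]))
```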
Transforms emoticons to text, e.g., :) => Smile.
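A minimal sketch of that replacement with a lookup table and a compiled regex; the mapping covers only a few example emoticons:

```python
# Replace emoticons with their text labels, e.g. ":)" -> "Smile".
import re

EMOTICONS = {":)": "Smile", ":(": "Sad", ":D": "Laugh", ";)": "Wink"}
pattern = re.compile("|".join(re.escape(e) for e in EMOTICONS))

def emoticons_to_text(text):
    return pattern.sub(lambda m: EMOTICONS[m.group()], text)

print(emoticons_to_text("Nice to see you :)"))  # -> "Nice to see you Smile"
```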
A Streamlit-based spam classifier that predicts whether a message is spam or not spam using machine learning.
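A hedged sketch of the overall shape of such an app; the toy TF-IDF + Naive Bayes model and its four-message dataset stand in for the repository's actual classifier (run with `streamlit run app.py`):

```python
# Minimal Streamlit spam/not-spam classifier around a toy scikit-learn model.
import streamlit as st
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

texts = ["win a free prize now", "claim your reward today",
         "see you at lunch", "meeting moved to 3pm"]
labels = ["spam", "spam", "not spam", "not spam"]
model = make_pipeline(TfidfVectorizer(), MultinomialNB()).fit(texts, labels)

st.title("Spam Classifier")
message = st.text_area("Enter a message")
if st.button("Classify"):
    st.write("Prediction:", model.predict([message])[0])
```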