jackgopack4/nlp-pytorch

NLP-F22

Natural Language Processing, UT Austin, Fall 2022

Final project details, utilizing Transformer models:

  • Adapt the CheckList behavioral testing approach to an industry-standard ELECTRA model trained on the Stanford Question Answering Dataset (SQuAD); a minimal inference sketch follows this list
  • Fine-tune an ELECTRA SQuAD model on adversarial datasets to evaluate improvements across question/answer categories
  • Generate hand-tuned training sets and train on them to improve performance on specific categories (such as a specialized model for a given task)

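The sketch below shows how a SQuAD-trained ELECTRA model can be queried through the Hugging Face question-answering pipeline for CheckList-style probing. It assumes the publicly available deepset/electra-base-squad2 checkpoint and a toy context/question pair as stand-ins; it is not the repo's own model or test suite.

```python
# Minimal sketch: query a SQuAD-trained ELECTRA model with the Hugging Face
# question-answering pipeline. The deepset/electra-base-squad2 checkpoint and the
# toy context/questions below are illustrative stand-ins, not the repo's artifacts.
from transformers import pipeline

qa = pipeline("question-answering", model="deepset/electra-base-squad2")

context = (
    "The final project for the Fall 2022 NLP course at UT Austin evaluates "
    "Transformer-based question answering models."
)

# CheckList-style invariance probe: paraphrase the question and compare answers.
for question in (
    "When was the NLP course offered?",
    "In which semester was the NLP course offered?",
):
    result = qa(question=question, context=context)
    print(f"{question} -> {result['answer']} (score={result['score']:.3f})")
```
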
Models were trained and code was run on Google Colab Pro.

PyTorch models and the Hugging Face datasets and transformers libraries were used.
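As a minimal sketch of loading evaluation data with the datasets library, the snippet below uses the public squad and adversarial_qa Hub datasets as stand-ins for the project's data; the repo's exact splits and preprocessing may differ.

```python
# Minimal sketch: load SQuAD and an adversarial QA dataset with Hugging Face datasets.
# "squad" and "adversarial_qa"/"adversarialQA" are public Hub datasets used here as
# stand-ins; the project's actual training/evaluation data may differ.
from datasets import load_dataset

squad = load_dataset("squad", split="validation")
adversarial = load_dataset("adversarial_qa", "adversarialQA", split="validation")

# Both datasets share the SQuAD schema: id, title, context, question, answers.
print(squad[0]["question"], "->", squad[0]["answers"]["text"])
print(adversarial[0]["question"], "->", adversarial[0]["answers"]["text"])
```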

An 11-page PDF report is included in the final project folder.
