A sentiment analysis web application for customer reviews. Positive, negative, and neutral opinions are highlighted.
I will perform a text classification task using various Transformer models such as BERT, DistilBERT, and ELECTRA, mostly from the Hugging Face community.
Thesis Project
1. Fine-tune DistilBERT on NLI and identify some of the salient or toxic features that the model learnt. 2. Sampling annotation techniques and production of silver labels using EDA and Back Translation.
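As an illustration of the augmentation step mentioned above, here is a minimal sketch (pure Python, not taken from the thesis code) of the random-swap operation from EDA (Easy Data Augmentation), one simple way to generate extra silver-labelled examples from existing annotated sentences:

```python
# Minimal EDA-style random-swap augmentation (illustrative only).
import random

def random_swap(sentence: str, n_swaps: int = 1) -> str:
    """Return a copy of the sentence with n_swaps random word pairs swapped."""
    words = sentence.split()
    for _ in range(n_swaps):
        if len(words) < 2:
            break
        i, j = random.sample(range(len(words)), 2)
        words[i], words[j] = words[j], words[i]
    return " ".join(words)

print(random_swap("the model identifies toxic features in the premise"))
```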
Fake news classification transformer model
Transformers
Sentiment Analysis On Stanford Dataset using State-of-the-Art models (Contextualized Embedding)
Multi-modal retrieval with a smooth weighting of negatives
Sentiment analysis on song lyrics using DistilBERT for NLP
Classification of German noun compounds based on the semantic relation between their constituents.
This is a production-ready DistilBERT sentiment analysis model for service reviews, designed to work as a low-cost market research tool with the nuance of an actual market researcher.
Low-Rank Adaptation of Large Language Models for Robust Question Answering
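A minimal sketch of how low-rank adaptation can be applied to a DistilBERT question answering model, assuming the Hugging Face transformers and peft libraries; the target module names q_lin and v_lin correspond to DistilBERT's attention projections and would need adjusting for other architectures:

```python
# Attach LoRA adapters to DistilBERT for QA fine-tuning (illustrative sketch).
from transformers import AutoModelForQuestionAnswering
from peft import LoraConfig, get_peft_model

base = AutoModelForQuestionAnswering.from_pretrained("distilbert-base-uncased")

# Low-rank adapters are injected into the attention query/value projections.
lora_cfg = LoraConfig(r=8, lora_alpha=16, lora_dropout=0.1,
                      target_modules=["q_lin", "v_lin"])
model = get_peft_model(base, lora_cfg)

model.print_trainable_parameters()  # only the adapter weights are trainable
```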
DistilBERT is a distilled version of BERT (Bidirectional Encoder Representations from Transformers), known for its efficiency and reduced computational requirements while retaining much of BERT's language understanding capability.
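For reference, a minimal sketch of running DistilBERT-based sentiment classification through the Hugging Face pipeline API, using the publicly available SST-2 checkpoint (not a model from any of the repositories listed here):

```python
# Load a DistilBERT sentiment classifier via the Hugging Face pipeline API.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("The service was quick and the staff were friendly."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```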
This repository contains the code and resources for building a language understanding and question answering system using the Jeopardy! dataset. The project aims to train a language understanding model to extract meaning from open-ended questions and leverage that understanding to answer the questions using a dedicated QA system.
ALBERT and DistilBERT classification models for the DBpedia ontology dataset.
A notebook for a Medium article about text classification with Hugging Face DistilBERT and TensorFlow 2.0.
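A minimal sketch of that setup, assuming the transformers library alongside TensorFlow 2.x; the toy texts and labels are placeholders for a real dataset:

```python
# Fine-tune DistilBERT for binary text classification with Keras (sketch).
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = TFAutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)

# Toy data; replace with your own texts and integer labels.
texts = ["great product", "terrible experience"]
labels = [1, 0]
enc = tokenizer(texts, padding=True, truncation=True, return_tensors="tf")

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=5e-5),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
model.fit(dict(enc), tf.constant(labels), epochs=1, batch_size=2)
```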
DistilBERT trained on the GoEmotions dataset using Hugging Face.
Sentence Classification with BERT