Files (HDF5):
all-MiniLM-L12-v2_ag_news_test.h5 (10.34 MB)
all-MiniLM-L12-v2_ag_news_train.h5 (162.89 MB)
all-distilroberta-v1_ag_news_test.h5 (20.72 MB)
all-distilroberta-v1_ag_news_train.h5 (326.83 MB)
all-mpnet-base-v2_ag_news_train.h5 (326.96 MB)
all-mpnet-base-v2_ag_news_test.h5 (20.74 MB)
multi-qa-distilbert-cos-v1_ag_news_train.h5 (326.43 MB)
multi-qa-distilbert-cos-v1_ag_news_test.h5 (20.71 MB)
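The record does not document the internal layout of the .h5 files, so the dataset key names below are assumptions. A minimal sketch of inspecting and loading such a file with h5py, demonstrated on a small stand-in file:

```python
import os
import tempfile

import h5py
import numpy as np

# Stand-in file mimicking an assumed layout: a single "embeddings"
# dataset of shape (n_samples, embedding_dim). The key names in the
# published files may differ, so list f.keys() before loading.
path = os.path.join(tempfile.mkdtemp(), "example_embeddings.h5")
fake = np.random.rand(8, 384).astype(np.float32)  # 384 = MiniLM-L12 dim
with h5py.File(path, "w") as f:
    f.create_dataset("embeddings", data=fake)

# Inspect first, then load the array into memory.
with h5py.File(path, "r") as f:
    keys = list(f.keys())
    emb = f["embeddings"][:]

print(keys, emb.shape)
```

Listing the keys first is the safe pattern here, since HDF5 files can hold several named datasets (e.g. embeddings and labels) and the record does not say which are present.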
Pretrained sentence BERT models AG News embeddings
dataset
posted on 2023-09-29, 08:12, authored by Beatrix Miranda Ginn Nielsen

Embeddings of the AG News dataset computed with pretrained Sentence BERT models.
AG News dataset: https://www.kaggle.com/datasets/amananandrai/ag-news-classification-dataset
Pretrained models used:
all-distilroberta-v1: https://huggingface.co/sentence-transformers/all-distilroberta-v1
all-MiniLM-L12-v2: https://huggingface.co/sentence-transformers/all-MiniLM-L12-v2
all-mpnet-base-v2: https://huggingface.co/sentence-transformers/all-mpnet-base-v2
multi-qa-distilbert-cos-v1: https://huggingface.co/sentence-transformers/multi-qa-distilbert-cos-v1
Funding
Danish Pioneer Centre for AI, DNRF grant number P1