Files:
all-MiniLM-L12-v2_newsgroups_test.h5 (10.01 MB)
all-MiniLM-L12-v2_newsgroups_train.h5 (15.07 MB)
all-distilroberta-v1_newsgroups_test.h5 (20.14 MB)
all-distilroberta-v1_newsgroups_train.h5 (30.23 MB)
all-mpnet-base-v2_newsgroups_train.h5 (30.2 MB)
all-mpnet-base-v2_newsgroups_test.h5 (20.07 MB)
multi-qa-distilbert-cos-v1_newsgroups_test.h5 (20.14 MB)
multi-qa-distilbert-cos-v1_newsgroups_train.h5 (30.21 MB)
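
Each file is an HDF5 container holding the embeddings for one model and one split. A minimal sketch for inspecting and loading a file with h5py; the dataset names "embeddings" and "labels" are assumptions, so list the keys first to see the actual layout:

    import h5py

    # Open one of the files and inspect its contents. The key names
    # used below are assumptions; print the keys to see the real ones.
    with h5py.File("all-MiniLM-L12-v2_newsgroups_test.h5", "r") as f:
        print(list(f.keys()))            # actual dataset names in the file
        embeddings = f["embeddings"][:]  # assumed: (n_documents, embedding_dim) floats
        labels = f["labels"][:]          # assumed: newsgroup class per document

    print(embeddings.shape)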
Pretrained sentence BERT models 20 Newsgroups embeddings
Dataset posted on 2023-09-29, 08:13, authored by Beatrix Miranda Ginn Nielsen.
Embeddings of the 20 Newsgroups dataset, computed with pretrained Sentence BERT models.
20 Newsgroups dataset: http://qwone.com/~jason/20Newsgroups/
Pretrained models used:
all-distilroberta-v1: https://huggingface.co/sentence-transformers/all-distilroberta-v1
all-MiniLM-L12-v2: https://huggingface.co/sentence-transformers/all-MiniLM-L12-v2
all-mpnet-base-v2: https://huggingface.co/sentence-transformers/all-mpnet-base-v2
multi-qa-distilbert-cos-v1: https://huggingface.co/sentence-transformers/multi-qa-distilbert-cos-v1
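
The page does not include the generation code; the sketch below shows how embeddings like these could be produced with the sentence-transformers library and scikit-learn's copy of 20 Newsgroups. The use of fetch_20newsgroups, its default bydate train/test split, and the HDF5 layout are assumptions:

    import h5py
    from sentence_transformers import SentenceTransformer
    from sklearn.datasets import fetch_20newsgroups

    # Sketch only: the author's exact preprocessing, split, and file
    # layout are not documented on this page.
    model_name = "all-MiniLM-L12-v2"
    model = SentenceTransformer(f"sentence-transformers/{model_name}")

    for subset in ("train", "test"):
        data = fetch_20newsgroups(subset=subset)  # assumed: standard bydate split
        emb = model.encode(data.data, show_progress_bar=True)
        with h5py.File(f"{model_name}_newsgroups_{subset}.h5", "w") as f:
            f.create_dataset("embeddings", data=emb)      # assumed key names
            f.create_dataset("labels", data=data.target)

Looping over the four model names listed above would yield one train and one test file per model, matching the file list at the top of the page.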
Funding
Danish Pioneer Centre for AI, DNRF grant number P1