Files:
sts_bert_distilroberta-base_cos_dist_ORTHOGONAL_z_False_n_False_c_False_seed1_newsgroups_test.h5 (24.92 MB)
sts_bert_distilroberta-base_cos_dist_ORTHOGONAL_z_False_n_False_c_False_seed1_newsgroups_train.h5 (37.27 MB)
sts_bert_distilroberta-base_cos_dist_ORTHOGONAL_z_False_n_False_c_False_seed6_newsgroups_test.h5 (24.91 MB)
sts_bert_distilroberta-base_cos_dist_ORTHOGONAL_z_False_n_False_c_False_seed6_newsgroups_train.h5 (37.26 MB)
sts_bert_distilroberta-base_cos_dist_ORTHOGONAL_z_False_n_False_c_False_seed7_newsgroups_test.h5 (24.91 MB)
sts_bert_distilroberta-base_cos_dist_ORTHOGONAL_z_False_n_False_c_False_seed7_newsgroups_train.h5 (37.26 MB)
sts_bert_distilroberta-base_cos_dist_ORTHOGONAL_z_False_n_False_c_False_seed10_newsgroups_test.h5 (24.92 MB)
sts_bert_distilroberta-base_cos_dist_ORTHOGONAL_z_False_n_False_c_False_seed10_newsgroups_train.h5 (37.27 MB)
sts_bert_distilroberta-base_cos_dist_ORTHOGONAL_z_False_n_False_c_False_seed11_newsgroups_test.h5 (24.92 MB)
sts_bert_distilroberta-base_cos_dist_ORTHOGONAL_z_False_n_False_c_False_seed11_newsgroups_train.h5 (37.27 MB)
sts_bert_distilroberta-base_cos_dist_ORTHOGONAL_z_False_n_False_c_False_seed12_newsgroups_test.h5 (24.92 MB)
sts_bert_distilroberta-base_cos_dist_ORTHOGONAL_z_False_n_False_c_False_seed12_newsgroups_train.h5 (37.27 MB)
sts_bert_distilroberta-base_cos_dist_ORTHOGONAL_z_False_n_False_c_False_seed13_newsgroups_test.h5 (24.92 MB)
sts_bert_distilroberta-base_cos_dist_ORTHOGONAL_z_False_n_False_c_False_seed13_newsgroups_train.h5 (37.27 MB)
sts_bert_distilroberta-base_cos_dist_ORTHOGONAL_z_False_n_False_c_False_seed22_newsgroups_test.h5 (24.92 MB)
sts_bert_distilroberta-base_cos_dist_ORTHOGONAL_z_False_n_False_c_False_seed22_newsgroups_train.h5 (37.27 MB)
sts_bert_distilroberta-base_cos_dist_ORTHOGONAL_z_False_n_False_c_False_seed23_newsgroups_test.h5 (24.92 MB)
sts_bert_distilroberta-base_cos_dist_ORTHOGONAL_z_False_n_False_c_False_seed23_newsgroups_train.h5 (37.27 MB)
sts_bert_distilroberta-base_cos_dist_ORTHOGONAL_z_False_n_False_c_False_seed42_newsgroups_test.h5 (24.92 MB)
sts_bert_distilroberta-base_cos_dist_ORTHOGONAL_z_False_n_False_c_False_seed42_newsgroups_train.h5 (37.27 MB)
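The filenames above follow a fixed pattern encoding the base model, the training flags (`z`, `n`, `c`), the random seed, and the data split. As a convenience, the pattern can be parsed like this (the regular expression below is inferred from the listed filenames, not documented by the depositor):

```python
import re

# Pattern inferred from the filenames in this dataset; the meaning of the
# z/n/c flags is an assumption based on the naming only.
FNAME_RE = re.compile(
    r"sts_bert_(?P<base_model>.+?)_cos_dist_ORTHOGONAL"
    r"_z_(?P<z>True|False)_n_(?P<n>True|False)_c_(?P<c>True|False)"
    r"_seed(?P<seed>\d+)_newsgroups_(?P<split>train|test)\.h5"
)

def parse_fname(name):
    """Return a dict of the fields encoded in an embeddings filename, or None."""
    m = FNAME_RE.fullmatch(name)
    return m.groupdict() if m else None

info = parse_fname(
    "sts_bert_distilroberta-base_cos_dist_ORTHOGONAL"
    "_z_False_n_False_c_False_seed42_newsgroups_test.h5"
)
print(info)
# → {'base_model': 'distilroberta-base', 'z': 'False', 'n': 'False',
#    'c': 'False', 'seed': '42', 'split': 'test'}
```

This makes it easy to group the twenty files by seed and split when iterating over the download directory.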
sts_bert_distilroberta-base 20 newsgroups embeddings
Dataset
Posted on 2023-05-03, 13:22. Authored by Beatrix Miranda Ginn Nielsen.
Text embeddings for the 20 Newsgroups dataset.
The 20 Newsgroups dataset can be fetched through scikit-learn, which downloads the data from the 20 Newsgroups website (http://qwone.com/~jason/20Newsgroups).
Embeddings are produced with Sentence-BERT models that use distilroberta-base (https://huggingface.co/distilroberta-base) as the base model.
For more details on the models, see the model item: 10.11583/DTU.20708785
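The `cos_dist` tag in the filenames suggests the models were trained or evaluated with cosine distance between embedding vectors. As a reference point (this is the standard definition, not a detail documented by the depositor), cosine distance is one minus cosine similarity:

```python
import numpy as np

def cosine_distance(a, b):
    """Cosine distance = 1 - cosine similarity between two vectors."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return 1.0 - np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Vectors pointing the same way have distance 0, orthogonal vectors 1,
# and opposite vectors 2.
print(cosine_distance([1.0, 0.0], [2.0, 0.0]))  # → 0.0
print(cosine_distance([1.0, 0.0], [0.0, 1.0]))  # → 1.0
print(cosine_distance([1.0, 0.0], [-1.0, 0.0]))  # → 2.0
```

Because the measure depends only on direction, it is a common choice for comparing sentence embeddings regardless of their magnitude.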
Funding
Danish Pioneer Centre for AI, DNRF grant number P1