Technical University of Denmark
Files (8 in total):

all-MiniLM-L12-v2_ag_news_test.h5 (10.34 MB)
all-MiniLM-L12-v2_ag_news_train.h5 (162.89 MB)
all-distilroberta-v1_ag_news_test.h5 (20.72 MB)
all-distilroberta-v1_ag_news_train.h5 (326.83 MB)
all-mpnet-base-v2_ag_news_train.h5 (326.96 MB)
all-mpnet-base-v2_ag_news_test.h5 (20.74 MB)
multi-qa-distilbert-cos-v1_ag_news_train.h5 (326.43 MB)
multi-qa-distilbert-cos-v1_ag_news_test.h5 (20.71 MB)

Pretrained sentence BERT models AG News embeddings

Dataset posted on 2023-09-29, 08:12, authored by Beatrix Miranda Ginn Nielsen

Embeddings of the AG News dataset, computed with pretrained Sentence-BERT (sentence-transformers) models. There is one HDF5 file per model and data split (train/test).


AG News dataset: https://www.kaggle.com/datasets/amananandrai/ag-news-classification-dataset


Pretrained models used:

all-distilroberta-v1: https://huggingface.co/sentence-transformers/all-distilroberta-v1

all-MiniLM-L12-v2: https://huggingface.co/sentence-transformers/all-MiniLM-L12-v2

all-mpnet-base-v2: https://huggingface.co/sentence-transformers/all-mpnet-base-v2

multi-qa-distilbert-cos-v1: https://huggingface.co/sentence-transformers/multi-qa-distilbert-cos-v1
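The .h5 files can be read with h5py. The internal dataset keys of the deposited files are not documented above, so the names used below ("embeddings", "labels") and the 384-dimensional toy array (matching all-MiniLM-L12-v2's output size) are assumptions for illustration; the sketch writes a small synthetic file in that assumed layout and reads it back, which is the same access pattern you would use on the real files after inspecting their keys:

```python
import h5py
import numpy as np

# Write a small synthetic file mirroring the assumed layout:
# an "embeddings" array (n_samples x dim) plus integer "labels"
# (AG News has 4 classes). Key names in the deposited files may differ.
rng = np.random.default_rng(0)
with h5py.File("toy_embeddings.h5", "w") as f:
    f.create_dataset("embeddings",
                     data=rng.standard_normal((8, 384)).astype(np.float32))
    f.create_dataset("labels", data=np.arange(8) % 4)

# Read it back. List the keys first, since layouts vary by depositor;
# slicing with [:] loads the HDF5 dataset into memory as a NumPy array.
with h5py.File("toy_embeddings.h5", "r") as f:
    keys = list(f.keys())
    emb = f["embeddings"][:]
    labels = f["labels"][:]

print(keys)               # available datasets in the file
print(emb.shape, labels.shape)
```

On the real files, replace "toy_embeddings.h5" with one of the filenames listed above and adapt the key names to whatever `list(f.keys())` reports.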

Funding

Danish Pioneer Centre for AI, DNRF grant number P1
