---
language:
- en
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:100231
- loss:DebiasedMultipleNegativesRankingLoss
base_model: answerdotai/ModernBERT-base
widget:
- source_sentence: who led the army that defeated the aztecs
sentences:
- >-
Spanish conquest of the Aztec Empire The Spanish conquest of the Aztec
Empire, or the Spanish-Aztec War (1519-21)[3] was one of the most
significant and complex events in world history. There are multiple
sixteenth-century narratives of the events by Spanish conquerors, their
indigenous allies, and the defeated Aztecs. It was not solely a contest
between a small contingent of Spaniards defeating the Aztec Empire, but
rather the creation of a coalition of Spanish invaders with tributaries
to the Aztecs, and most especially the Aztecs' indigenous enemies and
rivals. They combined forces to defeat the Mexica of Tenochtitlan over a
two-year period. For the Spanish, the expedition to Mexico was part of a
project of Spanish colonization of the New World after twenty-five years
of permanent Spanish settlement and further exploration in the
Caribbean. The Spanish made landfall in Mexico in 1517. A Spanish
settler in Cuba, Hernán Cortés, led an expedition (entrada) to Mexico,
landing in February 1519, following an earlier expedition led by Juan de
Grijalva to Yucatán in 1517. Two years later Cortés and his retinue set
sail, thus beginning the expedition of exploration and conquest.[4] The
Spanish campaign against the Aztec Empire had its final victory on
August 13, 1521, when a coalition army of Spanish forces and native
Tlaxcalan warriors led by Cortés and Xicotencatl the Younger captured
the emperor Cuauhtemoc and Tenochtitlan, the capital of the Aztec
Empire. The fall of Tenochtitlan marks the beginning of Spanish rule in
central Mexico, and they established their capital of Mexico City on the
ruins of Tenochtitlan.
- >-
The Girl with All the Gifts Justineau awakens in the Rosalind Franklin.
Melanie leads her to a group of intelligent hungries, to whom Justineau,
wearing an environmental protection suit, starts teaching the alphabet.
- >-
Wendy Makkena In 1992 she had a supporting role in the movie Sister Act
as the shy but talented singing nun Sister Mary Robert, a role she
reprised in Sister Act 2: Back in the Habit the following year. She
appeared in various other television roles until 1997, when she starred
in Air Bud, followed by the independent film Finding North. She
continued appearing on television shows such as The Job, Oliver Beene,
and Listen Up![citation needed]
- source_sentence: who went to the most nba finals in a row
sentences:
- >-
List of NBA franchise post-season streaks The San Antonio Spurs hold the
longest active consecutive playoff appearances with 21 appearances,
starting in the 1998 NBA Playoffs (also the longest active playoff
streak in any major North American sports league as of 2017). The Spurs
have won five NBA championships during the streak. The Philadelphia
76ers (formerly known as Syracuse Nationals) hold the all-time record
for consecutive playoff appearances with 22 straight appearances between
1950 and 1971. The 76ers won two NBA championships during their streak.
The Boston Celtics hold the longest consecutive NBA Finals appearance
streak with ten appearances between 1957 and 1966. During the streak,
the Celtics won eight consecutive NBA championships—also an NBA
record.
- >-
Dear Dumb Diary Dear Dumb Diary is a series of children's novels by Jim
Benton. Each book is written in the first person view of a middle school
girl named Jamie Kelly. The series is published by Scholastic in English
and Random House in Korean. Film rights to the series have been optioned
by the Gotham Group.[2]
- >-
Voting rights in the United States Eligibility to vote in the United
States is established both through the federal constitution and by state
law. Several constitutional amendments (the 15th, 19th, and 26th
specifically) require that voting rights cannot be abridged on account
of race, color, previous condition of servitude, sex, or age for those
above 18; the constitution as originally written did not establish any
such rights during 1787–1870. In the absence of a specific federal law
or constitutional provision, each state is given considerable discretion
to establish qualifications for suffrage and candidacy within its own
respective jurisdiction; in addition, states and lower level
jurisdictions establish election systems, such as at-large or single
member district elections for county councils or school boards.
- source_sentence: who did the vocals on mcdonald's jingle i'm loving it
sentences:
- >-
I'm Lovin' It (song) "I'm Lovin' It" is a song recorded by American
singer-songwriter Justin Timberlake. It was written by Pusha T and
produced by The Neptunes.
- >-
Vallabhbhai Patel As the first Home Minister and Deputy Prime Minister
of India, Patel organised relief efforts for refugees fleeing from
Punjab and Delhi and worked to restore peace across the nation. He led
the task of forging a united India, successfully integrating into the
newly independent nation those British colonial provinces that had been
"allocated" to India. Besides those provinces that had been under direct
British rule, approximately 565 self-governing princely states had been
released from British suzerainty by the Indian Independence Act of 1947.
Employing frank diplomacy with the expressed option to deploy military
force, Patel persuaded almost every princely state to accede to India.
His commitment to national integration in the newly independent country
was total and uncompromising, earning him the sobriquet "Iron Man of
India".[3] He is also affectionately remembered as the "Patron saint of
India's civil servants" for having established the modern all-India
services system. He is also called the Unifier of India.[4]
- >-
National debt of the United States As of July 31, 2018, debt held by the
public was $15.6 trillion and intragovernmental holdings were $5.7
trillion, for a total or "National Debt" of $21.3 trillion.[5] Debt held
by the public was approximately 77% of GDP in 2017, ranked 43rd highest
out of 207 countries.[6] The Congressional Budget Office forecast in
April 2018 that the ratio will rise to nearly 100% by 2028, perhaps
higher if current policies are extended beyond their scheduled
expiration date.[7] As of December 2017, $6.3 trillion or approximately
45% of the debt held by the public was owned by foreign investors, the
largest being China (about $1.18 trillion) then Japan (about $1.06
trillion).[8]
- source_sentence: who is the actress of harley quinn in suicide squad
sentences:
- >-
Tariffs in United States history Tariffs were the main source of revenue
for the federal government from 1789 to 1914. During this period, there
was vigorous debate between the various political parties over the
setting of tariff rates. In general Democrats favored a tariff that
would pay the cost of government, but no higher. Whigs and Republicans
favored higher tariffs to protect and encourage American industry and
industrial workers. Since the early 20th century, however, U.S. tariffs
have been very low and have been much less a matter of partisan debate.
- >-
The Rolling Stones The Rolling Stones are an English rock band formed in
London, England in 1962. The first stable line-up consisted of Brian
Jones (guitar, harmonica), Mick Jagger (lead vocals), Keith Richards
(guitar, backing vocals), Bill Wyman (bass), Charlie Watts (drums), and
Ian Stewart (piano). Stewart was removed from the official line-up in
1963 but continued as a touring member until his death in 1985. Jones
left the band less than a month prior to his death in 1969, having
already been replaced by Mick Taylor, who remained until 1974. After
Taylor left the band, Ronnie Wood took his place in 1975 and has been on
guitar in tandem with Richards ever since. Following Wyman's departure
in 1993, Darryl Jones joined as their touring bassist. Touring
keyboardists for the band have been Nicky Hopkins (1967–1982), Ian
McLagan (1978–1981), Billy Preston (through the mid-1970s) and Chuck
Leavell (1982–present). The band was first led by Brian Jones, but after
developing into the band's songwriters, Jagger and Richards assumed
leadership while Jones dealt with legal and personal troubles.
- >-
Margot Robbie After moving to the United States, Robbie starred in the
short-lived ABC drama series Pan Am (2011–2012). In 2013, she made her
big screen debut in Richard Curtis's romantic comedy-drama film About
Time and co-starred in Martin Scorsese's biographical black comedy The
Wolf of Wall Street. In 2015, Robbie co-starred in the romantic
comedy-drama film Focus, appeared in the romantic World War II drama
film Suite Française and starred in the science fiction film Z for
Zachariah. That same year, she played herself in The Big Short. In 2016,
she portrayed Jane Porter in the action-adventure film The Legend of
Tarzan and Harley Quinn in the superhero film Suicide Squad. She
appeared on Time magazine's "The Most Influential People of 2017"
list.[4]
- source_sentence: what is meaning of am and pm in time
sentences:
- >-
America's Got Talent America's Got Talent (often abbreviated as AGT) is
a televised American talent show competition, broadcast on the NBC
television network. It is part of the global Got Talent franchise
created by Simon Cowell, and is produced by Fremantle North America and
SYCOtv, with distribution done by Fremantle. Since its premiere in June
2006, each season is run during the network's summer schedule, with the
show having featured various hosts - it is currently hosted by Tyra
Banks, since 2017.[2] It is the first global edition of the franchise,
after plans for a British edition in 2005 were suspended, following a
dispute between Paul O'Grady, the planned host, and the British
broadcaster ITV; production of this edition later resumed in 2007.[3]
- >-
Times Square Times Square is a major commercial intersection, tourist
destination, entertainment center and neighborhood in the Midtown
Manhattan section of New York City at the junction of Broadway and
Seventh Avenue. It stretches from West 42nd to West 47th Streets.[1]
Brightly adorned with billboards and advertisements, Times Square is
sometimes referred to as "The Crossroads of the World",[2] "The Center
of the Universe",[3] "the heart of The Great White Way",[4][5][6] and
the "heart of the world".[7] One of the world's busiest pedestrian
areas,[8] it is also the hub of the Broadway Theater District[9] and a
major center of the world's entertainment industry.[10] Times Square is
one of the world's most visited tourist attractions, drawing an
estimated 50 million visitors annually.[11] Approximately 330,000 people
pass through Times Square daily,[12] many of them tourists,[13] while
over 460,000 pedestrians walk through Times Square on its busiest
days.[7]
- >-
12-hour clock The 12-hour clock is a time convention in which the 24
hours of the day are divided into two periods:[1] a.m. (from the Latin,
ante meridiem, meaning before midday) and p.m. (post meridiem, meaning
past midday).[2] Each period consists of 12 hours numbered: 12 (acting
as zero),[3] 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, and 11. The 24 hour/day
cycle starts at 12 midnight (often indicated as 12 a.m.), runs through
12 noon (often indicated as 12 p.m.), and continues to the midnight at
the end of the day. The 12-hour clock was developed over time from the
mid-second millennium BC to the 16th century AD.
datasets:
- sentence-transformers/natural-questions
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
model-index:
- name: SentenceTransformer based on answerdotai/ModernBERT-base
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: NanoMSMARCO
type: NanoMSMARCO
metrics:
- type: cosine_accuracy@1
value: 0.14
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.24
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.3
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.4
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.14
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.07999999999999999
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.06
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.04000000000000001
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.14
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.24
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.3
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.4
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.25076046577886124
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.20557936507936506
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.21939187046366332
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: NanoHotpotQA
type: NanoHotpotQA
metrics:
- type: cosine_accuracy@1
value: 0.14
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.28
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.3
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.36
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.14
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.09333333333333332
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.064
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.038
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.07
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.14
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.16
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.19
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.15720914647954295
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.2121904761904762
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.12322210117624575
name: Cosine Map@100
- task:
type: nano-beir
name: Nano BEIR
dataset:
name: NanoBEIR mean
type: NanoBEIR_mean
metrics:
- type: cosine_accuracy@1
value: 0.14
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.26
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.3
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.38
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.14
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.08666666666666666
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.062
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.03900000000000001
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.10500000000000001
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.19
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.22999999999999998
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.29500000000000004
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.2039848061292021
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.20888492063492065
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.17130698581995454
name: Cosine Map@100
---

# SentenceTransformer based on answerdotai/ModernBERT-base
This is a sentence-transformers model finetuned from answerdotai/ModernBERT-base on the natural-questions dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
This is a test model to experiment with the proposed `DebiasedMultipleNegativesRankingLoss` from Pull Request #3148 in the Sentence Transformers repository, using commit 370bf473e60b57f7d01a6e084b5acaabdac38a2c.
## Model Details

### Model Description
- Model Type: Sentence Transformer
- Base model: answerdotai/ModernBERT-base
- Maximum Sequence Length: 8192 tokens
- Output Dimensionality: 768 dimensions
- Similarity Function: Cosine Similarity
- Training Dataset: natural-questions
- Language: en
### Model Sources

- Documentation: [Sentence Transformers Documentation](https://www.sbert.net)
- Repository: [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- Hugging Face: [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: ModernBertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
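
The Pooling module applies attention-mask-aware mean pooling over ModernBERT's token embeddings to produce the 768-dimensional sentence vector. As an illustrative sketch of what that configuration computes (not the library's actual code path, and using the base checkpoint rather than this finetuned model), the equivalent with plain `transformers` looks roughly like this:

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Illustration only: the finetuned model is loaded via SentenceTransformer (see Usage below)
tokenizer = AutoTokenizer.from_pretrained("answerdotai/ModernBERT-base")
model = AutoModel.from_pretrained("answerdotai/ModernBERT-base")

inputs = tokenizer(["what is meaning of am and pm in time"], padding=True, return_tensors="pt")
with torch.no_grad():
    token_embeddings = model(**inputs).last_hidden_state  # (batch, seq_len, 768)

# Mean pooling: average the token embeddings, masking out padding positions
mask = inputs["attention_mask"].unsqueeze(-1).float()
sentence_embedding = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)
print(sentence_embedding.shape)  # torch.Size([1, 768])
```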
## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("tomaarsen/ModernBERT-base-nq-debiased-mnrl")
# Run inference
sentences = [
    'what is meaning of am and pm in time',
    '12-hour clock The 12-hour clock is a time convention in which the 24 hours of the day are divided into two periods:[1] a.m. (from the Latin, ante meridiem, meaning before midday) and p.m. (post meridiem, meaning past midday).[2] Each period consists of 12 hours numbered: 12 (acting as zero),[3] 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, and 11. The 24 hour/day cycle starts at 12 midnight (often indicated as 12 a.m.), runs through 12 noon (often indicated as 12 p.m.), and continues to the midnight at the end of the day. The 12-hour clock was developed over time from the mid-second millennium BC to the 16th century AD.',
    "America's Got Talent America's Got Talent (often abbreviated as AGT) is a televised American talent show competition, broadcast on the NBC television network. It is part of the global Got Talent franchise created by Simon Cowell, and is produced by Fremantle North America and SYCOtv, with distribution done by Fremantle. Since its premiere in June 2006, each season is run during the network's summer schedule, with the show having featured various hosts - it is currently hosted by Tyra Banks, since 2017.[2] It is the first global edition of the franchise, after plans for a British edition in 2005 were suspended, following a dispute between Paul O'Grady, the planned host, and the British broadcaster ITV; production of this edition later resumed in 2007.[3]",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
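
Beyond pairwise similarity, the same two calls cover simple semantic search. A minimal sketch that ranks a tiny corpus against a query (the corpus texts are abridged from the widget examples above):

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("tomaarsen/ModernBERT-base-nq-debiased-mnrl")

query = "who went to the most nba finals in a row"
corpus = [
    "The Boston Celtics hold the longest consecutive NBA Finals appearance streak with ten appearances between 1957 and 1966.",
    "Dear Dumb Diary is a series of children's novels by Jim Benton.",
]

# Encode query and corpus, then rank passages by cosine similarity
query_embedding = model.encode([query])
corpus_embeddings = model.encode(corpus)
scores = model.similarity(query_embedding, corpus_embeddings)[0]  # shape: [len(corpus)]
for idx in scores.argsort(descending=True).tolist():
    print(f"{float(scores[idx]):.4f}  {corpus[idx]}")
```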
## Evaluation

### Metrics

#### Information Retrieval

- Datasets: `NanoMSMARCO` and `NanoHotpotQA`
- Evaluated with `InformationRetrievalEvaluator`
Metric | NanoMSMARCO | NanoHotpotQA |
---|---|---|
cosine_accuracy@1 | 0.14 | 0.14 |
cosine_accuracy@3 | 0.24 | 0.28 |
cosine_accuracy@5 | 0.3 | 0.3 |
cosine_accuracy@10 | 0.4 | 0.36 |
cosine_precision@1 | 0.14 | 0.14 |
cosine_precision@3 | 0.08 | 0.0933 |
cosine_precision@5 | 0.06 | 0.064 |
cosine_precision@10 | 0.04 | 0.038 |
cosine_recall@1 | 0.14 | 0.07 |
cosine_recall@3 | 0.24 | 0.14 |
cosine_recall@5 | 0.3 | 0.16 |
cosine_recall@10 | 0.4 | 0.19 |
cosine_ndcg@10 | 0.2508 | 0.1572 |
cosine_mrr@10 | 0.2056 | 0.2122 |
cosine_map@100 | 0.2194 | 0.1232 |
#### Nano BEIR

- Dataset: `NanoBEIR_mean`
- Evaluated with `NanoBEIREvaluator` (a reproduction sketch follows the table below)
Metric | Value |
---|---|
cosine_accuracy@1 | 0.14 |
cosine_accuracy@3 | 0.26 |
cosine_accuracy@5 | 0.3 |
cosine_accuracy@10 | 0.38 |
cosine_precision@1 | 0.14 |
cosine_precision@3 | 0.0867 |
cosine_precision@5 | 0.062 |
cosine_precision@10 | 0.039 |
cosine_recall@1 | 0.105 |
cosine_recall@3 | 0.19 |
cosine_recall@5 | 0.23 |
cosine_recall@10 | 0.295 |
cosine_ndcg@10 | 0.204 |
cosine_mrr@10 | 0.2089 |
cosine_map@100 | 0.1713 |
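
The tables above were produced with the evaluators bundled with Sentence Transformers. A minimal reproduction sketch, assuming a version that ships `NanoBEIREvaluator` (as the development version listed under Framework Versions does):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import NanoBEIREvaluator

model = SentenceTransformer("tomaarsen/ModernBERT-base-nq-debiased-mnrl")

# Restrict to the two NanoBEIR subsets reported above
evaluator = NanoBEIREvaluator(dataset_names=["msmarco", "hotpotqa"])
results = evaluator(model)
print(results)  # per-dataset and mean cosine accuracy/precision/recall, ndcg@10, mrr@10, map@100
```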
## Training Details

### Training Dataset

#### natural-questions

- Dataset: natural-questions at f9e894e
- Size: 100,231 training samples
- Columns: `query` and `answer`
- Approximate statistics based on the first 1000 samples:

|  | query | answer |
|---|---|---|
| type | string | string |
| details | min: 10 tokens, mean: 12.46 tokens, max: 25 tokens | min: 16 tokens, mean: 139.02 tokens, max: 537 tokens |
- Samples:

| query | answer |
|---|---|
| who is required to report according to the hmda | Home Mortgage Disclosure Act US financial institutions must report HMDA data to their regulator if they meet certain criteria, such as having assets above a specific threshold. The criteria is different for depository and non-depository institutions and are available on the FFIEC website.[4] In 2012, there were 7,400 institutions that reported a total of 18.7 million HMDA records.[5] |
| what is the definition of endoplasmic reticulum in biology | Endoplasmic reticulum The endoplasmic reticulum (ER) is a type of organelle in eukaryotic cells that forms an interconnected network of flattened, membrane-enclosed sacs or tube-like structures known as cisternae. The membranes of the ER are continuous with the outer nuclear membrane. The endoplasmic reticulum occurs in most types of eukaryotic cells, but is absent from red blood cells and spermatozoa. There are two types of endoplasmic reticulum: rough and smooth. The outer (cytosolic) face of the rough endoplasmic reticulum is studded with ribosomes that are the sites of protein synthesis. The rough endoplasmic reticulum is especially prominent in cells such as hepatocytes. The smooth endoplasmic reticulum lacks ribosomes and functions in lipid manufacture and metabolism, the production of steroid hormones, and detoxification.[1] The smooth ER is especially abundant in mammalian liver and gonad cells. The lacy membranes of the endoplasmic reticulum were first seen in 1945 using elect... |
| what does the ski mean in polish names | Polish name Since the High Middle Ages, Polish-sounding surnames ending with the masculine -ski suffix, including -cki and -dzki, and the corresponding feminine suffix -ska/-cka/-dzka were associated with the nobility (Polish szlachta), which alone, in the early years, had such suffix distinctions.[1] They are widely popular today. |
- Loss: `DebiasedMultipleNegativesRankingLoss` with these parameters: `{"scale": 1.0, "similarity_fct": "cos_sim"}` (a training sketch follows below)
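
This loss is not in a released Sentence Transformers version; it comes from the PR branch noted at the top of this card. A hedged training sketch, under the assumptions that the PR branch is installed and that the loss constructor mirrors `MultipleNegativesRankingLoss`:

```python
from datasets import load_dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer, util
# Assumption: available only on the PR #3148 branch, not in a released version
from sentence_transformers.losses import DebiasedMultipleNegativesRankingLoss

model = SentenceTransformer("answerdotai/ModernBERT-base")

# (query, answer) pairs; the other in-batch answers act as negatives
train_dataset = load_dataset("sentence-transformers/natural-questions", split="train")

# Parameters as reported above; signature assumed to mirror MultipleNegativesRankingLoss
loss = DebiasedMultipleNegativesRankingLoss(model, scale=1.0, similarity_fct=util.cos_sim)

trainer = SentenceTransformerTrainer(model=model, train_dataset=train_dataset, loss=loss)
trainer.train()
```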
### Evaluation Dataset

#### natural-questions

- Dataset: natural-questions at f9e894e
- Size: 100,231 evaluation samples
- Columns: `query` and `answer`
- Approximate statistics based on the first 1000 samples:

|  | query | answer |
|---|---|---|
| type | string | string |
| details | min: 10 tokens, mean: 12.46 tokens, max: 22 tokens | min: 12 tokens, mean: 138.0 tokens, max: 649 tokens |
- Samples:

| query | answer |
|---|---|
| difference between russian blue and british blue cat | Russian Blue The coat is known as a "double coat", with the undercoat being soft, downy and equal in length to the guard hairs, which are an even blue with silver tips. However, the tail may have a few very dull, almost unnoticeable stripes. The coat is described as thick, plush and soft to the touch. The feeling is softer than the softest silk. The silver tips give the coat a shimmering appearance. Its eyes are almost always a dark and vivid green. Any white patches of fur or yellow eyes in adulthood are seen as flaws in show cats.[3] Russian Blues should not be confused with British Blues (which are not a distinct breed, but rather a British Shorthair with a blue coat as the British Shorthair breed itself comes in a wide variety of colors and patterns), nor the Chartreux or Korat which are two other naturally occurring breeds of blue cats, although they have similar traits. |
| who played the little girl on mrs doubtfire | Mara Wilson Mara Elizabeth Wilson[2] (born July 24, 1987) is an American writer and former child actress. She is known for playing Natalie Hillard in Mrs. Doubtfire (1993), Susan Walker in Miracle on 34th Street (1994), Matilda Wormwood in Matilda (1996) and Lily Stone in Thomas and the Magic Railroad (2000). Since retiring from film acting, Wilson has focused on writing. |
| what year did the movie the sound of music come out | The Sound of Music (film) The film was released on March 2, 1965 in the United States, initially as a limited roadshow theatrical release. Although critical response to the film was widely mixed, the film was a major commercial success, becoming the number one box office movie after four weeks, and the highest-grossing film of 1965. By November 1966, The Sound of Music had become the highest-grossing film of all-time—surpassing Gone with the Wind—and held that distinction for five years. The film was just as popular throughout the world, breaking previous box-office records in twenty-nine countries. Following an initial theatrical release that lasted four and a half years, and two successful re-releases, the film sold 283 million admissions worldwide and earned a total worldwide gross of $286,000,000. |
- Loss: `DebiasedMultipleNegativesRankingLoss` with these parameters: `{"scale": 1.0, "similarity_fct": "cos_sim"}`
### Training Hyperparameters

#### Non-Default Hyperparameters

The following values differ from the defaults (see the sketch after this list):

- `eval_strategy`: steps
- `per_device_train_batch_size`: 128
- `per_device_eval_batch_size`: 128
- `learning_rate`: 8e-05
- `num_train_epochs`: 1
- `warmup_ratio`: 0.05
- `seed`: 12
- `bf16`: True
- `batch_sampler`: no_duplicates
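
As referenced above, these non-default values map onto `SentenceTransformerTrainingArguments` roughly as follows; `output_dir` is a hypothetical placeholder:

```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="models/ModernBERT-base-nq-debiased-mnrl",  # hypothetical path
    eval_strategy="steps",
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    learning_rate=8e-5,
    num_train_epochs=1,
    warmup_ratio=0.05,
    seed=12,
    bf16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # avoids duplicate texts as in-batch negatives
)
```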
#### All Hyperparameters

<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 128
- `per_device_eval_batch_size`: 128
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 8e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.05
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 12
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>
### Training Logs
Epoch | Step | Training Loss | Validation Loss | NanoMSMARCO_cosine_ndcg@10 | NanoHotpotQA_cosine_ndcg@10 | NanoBEIR_mean_cosine_ndcg@10 |
---|---|---|---|---|---|---|
0 | 0 | - | - | 0.0785 | 0.1489 | 0.1137 |
0.0129 | 10 | 4.8033 | - | - | - | - |
0.0258 | 20 | 4.5295 | - | - | - | - |
0.0387 | 30 | 4.2124 | - | - | - | - |
0.0515 | 40 | 4.0863 | - | - | - | - |
0.0644 | 50 | 4.0048 | 3.9563 | 0.1444 | 0.1660 | 0.1552 |
0.0773 | 60 | 3.9686 | - | - | - | - |
0.0902 | 70 | 3.9192 | - | - | - | - |
0.1031 | 80 | 3.9276 | - | - | - | - |
0.1160 | 90 | 3.9104 | - | - | - | - |
0.1289 | 100 | 3.8971 | 3.8877 | 0.2041 | 0.1293 | 0.1667 |
0.1418 | 110 | 3.8987 | - | - | - | - |
0.1546 | 120 | 3.8861 | - | - | - | - |
0.1675 | 130 | 3.8987 | - | - | - | - |
0.1804 | 140 | 3.8811 | - | - | - | - |
0.1933 | 150 | 3.8697 | 3.8478 | 0.1918 | 0.1084 | 0.1501 |
0.2062 | 160 | 3.8621 | - | - | - | - |
0.2191 | 170 | 3.8628 | - | - | - | - |
0.2320 | 180 | 3.8733 | - | - | - | - |
0.2448 | 190 | 3.8551 | - | - | - | - |
0.2577 | 200 | 3.862 | 3.8324 | 0.1940 | 0.0977 | 0.1458 |
0.2706 | 210 | 3.8545 | - | - | - | - |
0.2835 | 220 | 3.8495 | - | - | - | - |
0.2964 | 230 | 3.8459 | - | - | - | - |
0.3093 | 240 | 3.8438 | - | - | - | - |
0.3222 | 250 | 3.8425 | 3.8238 | 0.1933 | 0.1498 | 0.1716 |
0.3351 | 260 | 3.843 | - | - | - | - |
0.3479 | 270 | 3.8486 | - | - | - | - |
0.3608 | 280 | 3.8409 | - | - | - | - |
0.3737 | 290 | 3.8345 | - | - | - | - |
0.3866 | 300 | 3.8446 | 3.8154 | 0.1937 | 0.1532 | 0.1735 |
0.3995 | 310 | 3.8281 | - | - | - | - |
0.4124 | 320 | 3.8316 | - | - | - | - |
0.4253 | 330 | 3.8325 | - | - | - | - |
0.4381 | 340 | 3.8298 | - | - | - | - |
0.4510 | 350 | 3.8379 | 3.8104 | 0.1690 | 0.1559 | 0.1624 |
0.4639 | 360 | 3.821 | - | - | - | - |
0.4768 | 370 | 3.8297 | - | - | - | - |
0.4897 | 380 | 3.8206 | - | - | - | - |
0.5026 | 390 | 3.8222 | - | - | - | - |
0.5155 | 400 | 3.8243 | 3.8031 | 0.2141 | 0.1544 | 0.1843 |
0.5284 | 410 | 3.8328 | - | - | - | - |
0.5412 | 420 | 3.8211 | - | - | - | - |
0.5541 | 430 | 3.82 | - | - | - | - |
0.5670 | 440 | 3.8167 | - | - | - | - |
0.5799 | 450 | 3.8062 | 3.7988 | 0.2281 | 0.1392 | 0.1837 |
0.5928 | 460 | 3.8166 | - | - | - | - |
0.6057 | 470 | 3.8164 | - | - | - | - |
0.6186 | 480 | 3.8207 | - | - | - | - |
0.6314 | 490 | 3.815 | - | - | - | - |
0.6443 | 500 | 3.813 | 3.7943 | 0.2381 | 0.1260 | 0.1821 |
0.6572 | 510 | 3.8144 | - | - | - | - |
0.6701 | 520 | 3.8172 | - | - | - | - |
0.6830 | 530 | 3.8175 | - | - | - | - |
0.6959 | 540 | 3.8126 | - | - | - | - |
0.7088 | 550 | 3.8077 | 3.7913 | 0.2501 | 0.1395 | 0.1948 |
0.7216 | 560 | 3.8022 | - | - | - | - |
0.7345 | 570 | 3.8131 | - | - | - | - |
0.7474 | 580 | 3.8067 | - | - | - | - |
0.7603 | 590 | 3.8175 | - | - | - | - |
0.7732 | 600 | 3.8084 | 3.7870 | 0.2751 | 0.1480 | 0.2116 |
0.7861 | 610 | 3.8029 | - | - | - | - |
0.7990 | 620 | 3.8125 | - | - | - | - |
0.8119 | 630 | 3.817 | - | - | - | - |
0.8247 | 640 | 3.8038 | - | - | - | - |
0.8376 | 650 | 3.8054 | 3.7877 | 0.2274 | 0.1449 | 0.1861 |
0.8505 | 660 | 3.8041 | - | - | - | - |
0.8634 | 670 | 3.8012 | - | - | - | - |
0.8763 | 680 | 3.8117 | - | - | - | - |
0.8892 | 690 | 3.8098 | - | - | - | - |
0.9021 | 700 | 3.8008 | 3.7848 | 0.2466 | 0.1551 | 0.2008 |
0.9149 | 710 | 3.8038 | - | - | - | - |
0.9278 | 720 | 3.7949 | - | - | - | - |
0.9407 | 730 | 3.8044 | - | - | - | - |
0.9536 | 740 | 3.7982 | - | - | - | - |
0.9665 | 750 | 3.804 | 3.7832 | 0.2585 | 0.1587 | 0.2086 |
0.9794 | 760 | 3.8038 | - | - | - | - |
0.9923 | 770 | 3.8046 | - | - | - | - |
1.0 | 776 | - | - | 0.2508 | 0.1572 | 0.2040 |
### Framework Versions
- Python: 3.11.10
- Sentence Transformers: 3.4.0.dev0
- Transformers: 4.48.0.dev0
- PyTorch: 2.6.0.dev20241112+cu121
- Accelerate: 1.2.0
- Datasets: 3.2.0
- Tokenizers: 0.21.0
## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```
#### DebiasedMultipleNegativesRankingLoss
```bibtex
@inproceedings{chuang2020debiased,
    title={Debiased Contrastive Learning},
    author={Ching-Yao Chuang and Joshua Robinson and Lin Yen-Chen and Antonio Torralba and Stefanie Jegelka},
    booktitle={Advances in Neural Information Processing Systems},
    year={2020},
    url={https://arxiv.org/pdf/2007.00224}
}
```