BERT large fine-tuned on a mixed Cantonese and English STS dataset

This is a sentence-transformers model fine-tuned from hon9kon9ize/bert-large-cantonese-sts. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: hon9kon9ize/bert-large-cantonese-sts
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 1024 dimensions
  • Similarity Function: Cosine Similarity
  • Language: yue
  • License: apache-2.0

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
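
The Pooling block above applies mean pooling: token embeddings are averaged (ignoring padding) to produce the 1024-dimensional sentence vector. As a rough sketch of that computation with the underlying transformers model (the mean_pool helper here is illustrative, not part of the library):

from transformers import AutoModel, AutoTokenizer
import torch

tokenizer = AutoTokenizer.from_pretrained("hon9kon9ize/yue-embed")
encoder = AutoModel.from_pretrained("hon9kon9ize/yue-embed")

def mean_pool(last_hidden_state, attention_mask):
    # Zero out padding positions, then average the remaining token embeddings.
    mask = attention_mask.unsqueeze(-1).float()
    return (last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

text = "query: 香港今日天氣點樣?"  # "How is the weather in Hong Kong today?"
batch = tokenizer([text], padding=True, truncation=True, max_length=512,
                  return_tensors="pt")
with torch.no_grad():
    output = encoder(**batch)
embedding = mean_pool(output.last_hidden_state, batch["attention_mask"])
print(embedding.shape)  # torch.Size([1, 1024])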

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("hon9kon9ize/yue-embed")
# Run inference
sentences = [
    'query: when did england change from fahrenheit to celsius',
    'document: Metrication in the United Kingdom Adopting the metric system was discussed in Parliament as early as 1818 and some industries and even some government agencies had metricated, or were in the process of metricating by the mid 1960s. A formal government policy to support metrication was agreed by 1965. This policy, initiated in response to requests from industry, was to support voluntary metrication, with costs picked up where they fell. In 1969 the government created the Metrication Board as a quango to promote and coordinate metrication. In 1978, after some carpet retailers reverted to pricing by the square yard rather than the square metre, government policy shifted, and they started issuing orders making metrication mandatory in certain sectors. In 1980 government policy shifted again to prefer voluntary metrication, and the Metrication Board was abolished. By the time the Metrication Board was wound up, all the economic sectors that fell within its remit except road signage and parts of the retail trade sector had metricated.',
    "document: Periodic table Importantly, the organization of the periodic table can be utilized to derive relationships between various element properties, but also predicted chemical properties and behaviours of undiscovered or newly synthesized elements. Russian chemist Dmitri Mendeleev was first to publish a recognizable periodic table in 1869, developed mainly to illustrate periodic trends of the then-known elements. He also predicted some properties of unidentified elements that were expected to fill gaps within this table. Most of his forecasts proved to be correct. Mendeleev's idea has been slowly expanded and refined with the discovery or synthesis of further new elements and by developing new theoretical models to explain chemical behaviour. The modern periodic table now provides a useful framework for analyzing chemical reactions, and continues to be widely adopted in chemistry, nuclear physics and other sciences.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
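
The prefixes in the example come from the training prompts {'query': 'query: ', 'answer': 'document: '} (see Training Hyperparameters below), so retrieval works best when they are present. A small ranking sketch that prepends them by hand:

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("hon9kon9ize/yue-embed")

# "Who invented the periodic table?" (Cantonese)
query_emb = model.encode(["query: 邊個發明咗元素週期表?"])
doc_embs = model.encode([
    "document: Dmitri Mendeleev published the first recognizable periodic table in 1869.",
    "document: The Thomas Fire was reported on December 4, 2017, north of Santa Paula.",
])

scores = model.similarity(query_emb, doc_embs)  # cosine similarities, shape [1, 2]
best = int(scores.argmax())
print(best, float(scores[0, best]))  # expected to favour the periodic-table document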

Evaluation

Metrics

Information Retrieval

  • Datasets: NanoClimateFEVER, NanoDBPedia, NanoFEVER, NanoFiQA2018, NanoHotpotQA, NanoMSMARCO, NanoNFCorpus, NanoNQ, NanoQuoraRetrieval, NanoSCIDOCS, NanoArguAna, NanoSciFact and NanoTouche2020
  • Evaluated with InformationRetrievalEvaluator
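
A minimal sketch of reproducing such an evaluation; the toy queries, corpus, and relevance judgments below are placeholders, whereas the runs reported here used the 13 NanoBEIR datasets:

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("hon9kon9ize/yue-embed")

# Placeholder data for illustration only.
queries = {"q1": "query: which mountain man rediscovered south pass"}
corpus = {
    "d1": "document: Jedediah Smith rediscovered South Pass in the early 19th century.",
    "d2": "document: Dmitri Mendeleev published the first periodic table in 1869.",
}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs, name="toy")
print(evaluator(model))  # accuracy@k, precision@k, recall@k, ndcg@10, mrr@10, map@100
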
| Metric | NanoClimateFEVER | NanoDBPedia | NanoFEVER | NanoFiQA2018 | NanoHotpotQA | NanoMSMARCO | NanoNFCorpus | NanoNQ | NanoQuoraRetrieval | NanoSCIDOCS | NanoArguAna | NanoSciFact | NanoTouche2020 |
|--------|------|------|------|------|------|------|------|------|------|------|------|------|------|
| cosine_accuracy@1 | 0.06 | 0.1 | 0.06 | 0.12 | 0.18 | 0.08 | 0.1 | 0.12 | 0.56 | 0.06 | 0.12 | 0.18 | 0.3469 |
| cosine_accuracy@3 | 0.2 | 0.26 | 0.1 | 0.22 | 0.38 | 0.16 | 0.1 | 0.26 | 0.66 | 0.12 | 0.34 | 0.22 | 0.7143 |
| cosine_accuracy@5 | 0.22 | 0.44 | 0.1 | 0.26 | 0.4 | 0.2 | 0.12 | 0.38 | 0.68 | 0.14 | 0.52 | 0.32 | 0.7959 |
| cosine_accuracy@10 | 0.26 | 0.52 | 0.12 | 0.36 | 0.44 | 0.24 | 0.18 | 0.44 | 0.8 | 0.22 | 0.64 | 0.36 | 0.9388 |
| cosine_precision@1 | 0.06 | 0.1 | 0.06 | 0.12 | 0.18 | 0.08 | 0.1 | 0.12 | 0.56 | 0.06 | 0.12 | 0.18 | 0.3469 |
| cosine_precision@3 | 0.0667 | 0.1267 | 0.0333 | 0.08 | 0.1333 | 0.0533 | 0.06 | 0.0867 | 0.2533 | 0.0533 | 0.1133 | 0.08 | 0.3265 |
| cosine_precision@5 | 0.052 | 0.152 | 0.02 | 0.064 | 0.084 | 0.04 | 0.056 | 0.08 | 0.16 | 0.036 | 0.104 | 0.068 | 0.3061 |
| cosine_precision@10 | 0.032 | 0.154 | 0.012 | 0.046 | 0.052 | 0.024 | 0.042 | 0.048 | 0.092 | 0.026 | 0.064 | 0.04 | 0.2714 |
| cosine_recall@1 | 0.035 | 0.0058 | 0.05 | 0.0709 | 0.09 | 0.08 | 0.0024 | 0.11 | 0.49 | 0.0157 | 0.12 | 0.165 | 0.0173 |
| cosine_recall@3 | 0.105 | 0.0257 | 0.09 | 0.1362 | 0.2 | 0.16 | 0.0045 | 0.24 | 0.6073 | 0.0367 | 0.34 | 0.21 | 0.06 |
| cosine_recall@5 | 0.1267 | 0.0488 | 0.09 | 0.1499 | 0.21 | 0.2 | 0.0053 | 0.37 | 0.634 | 0.0407 | 0.52 | 0.3 | 0.1013 |
| cosine_recall@10 | 0.144 | 0.0818 | 0.11 | 0.2119 | 0.26 | 0.24 | 0.0069 | 0.43 | 0.7407 | 0.0567 | 0.64 | 0.345 | 0.1705 |
| cosine_ndcg@10 | 0.1074 | 0.1565 | 0.078 | 0.1599 | 0.2152 | 0.155 | 0.0514 | 0.2669 | 0.6316 | 0.0544 | 0.3668 | 0.2485 | 0.2934 |
| cosine_mrr@10 | 0.1231 | 0.223 | 0.0753 | 0.1879 | 0.2793 | 0.1282 | 0.1127 | 0.2195 | 0.6266 | 0.1009 | 0.2815 | 0.2242 | 0.5437 |
| cosine_map@100 | 0.0839 | 0.0848 | 0.0766 | 0.1279 | 0.1695 | 0.1439 | 0.0112 | 0.2213 | 0.6008 | 0.0383 | 0.2897 | 0.2308 | 0.1828 |

Nano BEIR

Mean over the 13 NanoBEIR datasets above:

| Metric | Value |
|--------|-------|
| cosine_accuracy@1 | 0.1605 |
| cosine_accuracy@3 | 0.2873 |
| cosine_accuracy@5 | 0.352 |
| cosine_accuracy@10 | 0.4245 |
| cosine_precision@1 | 0.1605 |
| cosine_precision@3 | 0.1128 |
| cosine_precision@5 | 0.094 |
| cosine_precision@10 | 0.0695 |
| cosine_recall@1 | 0.0963 |
| cosine_recall@3 | 0.1704 |
| cosine_recall@5 | 0.2151 |
| cosine_recall@10 | 0.2644 |
| cosine_ndcg@10 | 0.2142 |
| cosine_mrr@10 | 0.2405 |
| cosine_map@100 | 0.1739 |

Training Details

Training Dataset

Unnamed Dataset

  • Size: 129,371 training samples
  • Columns: query and answer
  • Approximate statistics based on the first 1000 samples:
    |      | query        | answer        |
    |------|--------------|---------------|
    | type | string       | string        |
    | min  | 13 tokens    | 7 tokens      |
    | mean | 22.58 tokens | 169.6 tokens  |
    | max  | 134 tokens   | 512 tokens    |
  • Samples:
    | query | answer |
    |-------|--------|
    | query: hotel and restaurant employees and bartenders international union | document: Hotel Employees and Restaurant Employees Union The Hotel Employees and Restaurant Employees Union (HERE) was a United States labor union representing workers of the hospitality industry, formed in 1891. In 2004, HERE merged with the Union of Needletrades, Industrial, and Textile Employees (UNITE) to form UNITE HERE. HERE notably organized the staff of Yale University in 1984. Other major employers that contracted with this union included several large casinos (Harrah's, Caesars Palace, and Wynn Resorts); hotels (Hilton, Hyatt and Starwood), and Walt Disney World. HERE was affiliated with the AFL-CIO. |
    | query: 多肢离断伤的并发症是什么? (What are the complications of traumatic amputation of multiple limbs?) | document: 失血性休克;血循环危象;急性肾功能衰竭 (hemorrhagic shock; circulatory crisis; acute renal failure) |
    | query: who is the father of kelly taylor's son on 90210 | document: Kelly Taylor (90210) In 2008, Kelly Taylor returned in the spin-off 90210, now working as a guidance counselor at her alma mater West Beverly Hills High School. It was revealed that in the intervening years, she attained a master's degree and had a son named Sammy with Dylan. She and Dylan ended their relationship soon after. It was also revealed that West Beverly principal Harry Wilson was Kelly's neighbor growing up.[39] |
  • Loss: CachedGISTEmbedLoss with these parameters:
    {'guide': SentenceTransformer(
      (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: NewModel 
      (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
      (2): Normalize()
    ), 'temperature': 0.01}
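
GISTEmbed-style losses use the guide model to filter in-batch negatives: a candidate negative that the guide rates as more similar to the query than the gold answer is not pushed away, and the Cached variant embeds in mini-batches so large effective batch sizes fit in GPU memory. A sketch of constructing this loss (the guide checkpoint name is a placeholder; the card does not identify the actual guide model):

from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import CachedGISTEmbedLoss

model = SentenceTransformer("hon9kon9ize/bert-large-cantonese-sts")
guide = SentenceTransformer("some-org/guide-model")  # placeholder checkpoint id

# temperature matches the value shown above.
loss = CachedGISTEmbedLoss(model=model, guide=guide, temperature=0.01)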
    

Evaluation Dataset

Unnamed Dataset

  • Size: 1,000 evaluation samples
  • Columns: query and answer
  • Approximate statistics based on the first 1000 samples:
    |      | query        | answer        |
    |------|--------------|---------------|
    | type | string       | string        |
    | min  | 11 tokens    | 7 tokens      |
    | mean | 22.61 tokens | 164.27 tokens |
    | max  | 146 tokens   | 512 tokens    |
  • Samples:
    | query | answer |
    |-------|--------|
    | query: 微创经皮肾镜手术的推荐药有些什么? (What medications are recommended for minimally invasive percutaneous nephroscopic surgery?) | document: 阿司匹林 (aspirin) |
    | query: why are the fires in ca called the thomas fires | document: Thomas Fire On December 4, 2017, the Thomas Fire was reported at 6:26 p.m. PST,[36] to the north of Santa Paula, near Steckel Park and Thomas Aquinas College,[3][24] after which the fire is named.[37] That night, the small brush fire exploded in size and raced through the rugged mountain terrain that lies west of Santa Paula, between Ventura and Ojai.[19][38] Officials blamed strong Santa Ana winds that gusted up to 60 miles per hour (97 km/h) for the sudden expansion.[28][39] Soon after the fire had started, a second blaze was ignited nearly 30 minutes later, about 4 miles (6.4 km) to the north in Upper Ojai at the top of Koenigstein Road.[40] According to eyewitnesses, this second fire was sparked by an explosion in the power line over the area. The second fire was rapidly expanded by the strong Santa Ana winds, and soon merged into the Thomas Fire later that night.[40] |
    | query: which mountain man rediscovered south pass and brought back important information about this trail | document: Jedediah Smith Jedediah Strong Smith (January 6, 1799 – May 27, 1831), was a clerk, frontiersman, hunter, trapper, author, cartographer, and explorer of the Rocky Mountains, the North American West, and the Southwest during the early 19th century. After 75 years of obscurity following his death, Smith was rediscovered as the American whose explorations led to the use of the 20-mile (32 km)-wide South Pass as the dominant point of crossing the Continental Divide for pioneers on the Oregon Trail. |
  • Loss: CachedGISTEmbedLoss with these parameters:
    {'guide': SentenceTransformer(
      (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: NewModel 
      (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
      (2): Normalize()
    ), 'temperature': 0.01}
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • learning_rate: 2e-05
  • num_train_epochs: 2
  • warmup_ratio: 0.05
  • seed: 12
  • bf16: True
  • prompts: {'query': 'query: ', 'answer': 'document: '}
  • batch_sampler: no_duplicates
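
A condensed sketch of wiring these values into a sentence-transformers training run. The output path and the one-pair toy dataset are placeholders, and a plain in-batch-negatives loss stands in for the CachedGISTEmbedLoss configuration shown earlier:

from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import MultipleNegativesRankingLoss
from sentence_transformers.training_args import (
    BatchSamplers,
    SentenceTransformerTrainingArguments,
)

model = SentenceTransformer("hon9kon9ize/bert-large-cantonese-sts")
loss = MultipleNegativesRankingLoss(model)  # stand-in for CachedGISTEmbedLoss

# Toy single pair; the real training set holds 129,371 query/answer pairs.
train_dataset = Dataset.from_dict({
    "query": ["when did england change from fahrenheit to celsius"],
    "answer": ["Metrication in the United Kingdom was discussed as early as 1818."],
})

args = SentenceTransformerTrainingArguments(
    output_dir="yue-embed",   # placeholder path
    eval_strategy="no",       # the real run evaluated held-out pairs every 1,000 steps
    per_device_train_batch_size=128,
    learning_rate=2e-5,
    num_train_epochs=2,
    warmup_ratio=0.05,
    seed=12,
    bf16=True,
    prompts={"query": "query: ", "answer": "document: "},
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)

SentenceTransformerTrainer(model=model, args=args,
                           train_dataset=train_dataset, loss=loss).train()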

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 2
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.05
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 12
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: {'query': 'query: ', 'answer': 'document: '}
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Validation loss and NanoBEIR retrieval metrics were computed every 1,000 training steps and at the end of training; a dash indicates no evaluation at that step.
Epoch Step Training Loss Validation Loss NanoClimateFEVER_cosine_ndcg@10 NanoDBPedia_cosine_ndcg@10 NanoFEVER_cosine_ndcg@10 NanoFiQA2018_cosine_ndcg@10 NanoHotpotQA_cosine_ndcg@10 NanoMSMARCO_cosine_ndcg@10 NanoNFCorpus_cosine_ndcg@10 NanoNQ_cosine_ndcg@10 NanoQuoraRetrieval_cosine_ndcg@10 NanoSCIDOCS_cosine_ndcg@10 NanoArguAna_cosine_ndcg@10 NanoSciFact_cosine_ndcg@10 NanoTouche2020_cosine_ndcg@10 NanoBEIR_mean_cosine_ndcg@10
0.0010 1 31.7042 - - - - - - - - - - - - - - -
0.0049 5 32.9433 - - - - - - - - - - - - - - -
0.0099 10 27.0338 - - - - - - - - - - - - - - -
0.0148 15 18.1598 - - - - - - - - - - - - - - -
0.0198 20 12.5771 - - - - - - - - - - - - - - -
0.0247 25 8.6872 - - - - - - - - - - - - - - -
0.0297 30 6.0455 - - - - - - - - - - - - - - -
0.0346 35 5.1917 - - - - - - - - - - - - - - -
0.0396 40 4.8424 - - - - - - - - - - - - - - -
0.0445 45 4.4785 - - - - - - - - - - - - - - -
0.0495 50 4.1896 - - - - - - - - - - - - - - -
0.0544 55 4.2621 - - - - - - - - - - - - - - -
0.0593 60 3.8401 - - - - - - - - - - - - - - -
0.0643 65 3.9482 - - - - - - - - - - - - - - -
0.0692 70 3.7762 - - - - - - - - - - - - - - -
0.0742 75 3.4895 - - - - - - - - - - - - - - -
0.0791 80 3.5892 - - - - - - - - - - - - - - -
0.0841 85 3.5312 - - - - - - - - - - - - - - -
0.0890 90 3.3244 - - - - - - - - - - - - - - -
0.0940 95 3.4369 - - - - - - - - - - - - - - -
0.0989 100 3.1867 - - - - - - - - - - - - - - -
0.1039 105 3.1734 - - - - - - - - - - - - - - -
0.1088 110 3.2156 - - - - - - - - - - - - - - -
0.1137 115 2.8888 - - - - - - - - - - - - - - -
0.1187 120 2.8613 - - - - - - - - - - - - - - -
0.1236 125 2.8905 - - - - - - - - - - - - - - -
0.1286 130 2.5984 - - - - - - - - - - - - - - -
0.1335 135 2.6853 - - - - - - - - - - - - - - -
0.1385 140 2.7013 - - - - - - - - - - - - - - -
0.1434 145 2.5577 - - - - - - - - - - - - - - -
0.1484 150 2.6287 - - - - - - - - - - - - - - -
0.1533 155 2.6481 - - - - - - - - - - - - - - -
0.1583 160 2.7741 - - - - - - - - - - - - - - -
0.1632 165 2.5738 - - - - - - - - - - - - - - -
0.1682 170 2.5335 - - - - - - - - - - - - - - -
0.1731 175 2.531 - - - - - - - - - - - - - - -
0.1780 180 2.437 - - - - - - - - - - - - - - -
0.1830 185 2.4836 - - - - - - - - - - - - - - -
0.1879 190 2.4642 - - - - - - - - - - - - - - -
0.1929 195 2.399 - - - - - - - - - - - - - - -
0.1978 200 2.3896 - - - - - - - - - - - - - - -
0.2028 205 2.3738 - - - - - - - - - - - - - - -
0.2077 210 2.5518 - - - - - - - - - - - - - - -
0.2127 215 2.4836 - - - - - - - - - - - - - - -
0.2176 220 2.2157 - - - - - - - - - - - - - - -
0.2226 225 2.2986 - - - - - - - - - - - - - - -
0.2275 230 2.4967 - - - - - - - - - - - - - - -
0.2324 235 2.121 - - - - - - - - - - - - - - -
0.2374 240 2.4301 - - - - - - - - - - - - - - -
0.2423 245 2.5054 - - - - - - - - - - - - - - -
0.2473 250 2.3213 - - - - - - - - - - - - - - -
0.2522 255 2.1182 - - - - - - - - - - - - - - -
0.2572 260 2.2966 - - - - - - - - - - - - - - -
0.2621 265 2.2662 - - - - - - - - - - - - - - -
0.2671 270 2.3188 - - - - - - - - - - - - - - -
0.2720 275 2.1836 - - - - - - - - - - - - - - -
0.2770 280 2.2206 - - - - - - - - - - - - - - -
0.2819 285 2.3144 - - - - - - - - - - - - - - -
0.2868 290 2.2496 - - - - - - - - - - - - - - -
0.2918 295 1.9909 - - - - - - - - - - - - - - -
0.2967 300 2.1294 - - - - - - - - - - - - - - -
0.3017 305 2.119 - - - - - - - - - - - - - - -
0.3066 310 2.0076 - - - - - - - - - - - - - - -
0.3116 315 2.127 - - - - - - - - - - - - - - -
0.3165 320 2.1309 - - - - - - - - - - - - - - -
0.3215 325 2.0868 - - - - - - - - - - - - - - -
0.3264 330 1.9429 - - - - - - - - - - - - - - -
0.3314 335 1.9 - - - - - - - - - - - - - - -
0.3363 340 1.82 - - - - - - - - - - - - - - -
0.3412 345 1.9731 - - - - - - - - - - - - - - -
0.3462 350 2.0156 - - - - - - - - - - - - - - -
0.3511 355 2.0106 - - - - - - - - - - - - - - -
0.3561 360 1.9383 - - - - - - - - - - - - - - -
0.3610 365 2.0491 - - - - - - - - - - - - - - -
0.3660 370 1.8893 - - - - - - - - - - - - - - -
0.3709 375 1.958 - - - - - - - - - - - - - - -
0.3759 380 1.9821 - - - - - - - - - - - - - - -
0.3808 385 2.024 - - - - - - - - - - - - - - -
0.3858 390 2.0182 - - - - - - - - - - - - - - -
0.3907 395 1.9659 - - - - - - - - - - - - - - -
0.3956 400 1.8339 - - - - - - - - - - - - - - -
0.4006 405 1.9081 - - - - - - - - - - - - - - -
0.4055 410 1.7876 - - - - - - - - - - - - - - -
0.4105 415 1.8371 - - - - - - - - - - - - - - -
0.4154 420 1.8274 - - - - - - - - - - - - - - -
0.4204 425 1.7863 - - - - - - - - - - - - - - -
0.4253 430 1.9064 - - - - - - - - - - - - - - -
0.4303 435 1.7721 - - - - - - - - - - - - - - -
0.4352 440 1.7162 - - - - - - - - - - - - - - -
0.4402 445 1.9112 - - - - - - - - - - - - - - -
0.4451 450 1.9384 - - - - - - - - - - - - - - -
0.4500 455 1.8096 - - - - - - - - - - - - - - -
0.4550 460 1.7145 - - - - - - - - - - - - - - -
0.4599 465 1.784 - - - - - - - - - - - - - - -
0.4649 470 1.9506 - - - - - - - - - - - - - - -
0.4698 475 1.7243 - - - - - - - - - - - - - - -
0.4748 480 1.8003 - - - - - - - - - - - - - - -
0.4797 485 1.7568 - - - - - - - - - - - - - - -
0.4847 490 1.5696 - - - - - - - - - - - - - - -
0.4896 495 1.8973 - - - - - - - - - - - - - - -
0.4946 500 1.6981 - - - - - - - - - - - - - - -
0.4995 505 1.7616 - - - - - - - - - - - - - - -
0.5045 510 1.6573 - - - - - - - - - - - - - - -
0.5094 515 1.8685 - - - - - - - - - - - - - - -
0.5143 520 1.8532 - - - - - - - - - - - - - - -
0.5193 525 1.7603 - - - - - - - - - - - - - - -
0.5242 530 1.7636 - - - - - - - - - - - - - - -
0.5292 535 1.4829 - - - - - - - - - - - - - - -
0.5341 540 1.6959 - - - - - - - - - - - - - - -
0.5391 545 1.6389 - - - - - - - - - - - - - - -
0.5440 550 1.6624 - - - - - - - - - - - - - - -
0.5490 555 1.8193 - - - - - - - - - - - - - - -
0.5539 560 1.7144 - - - - - - - - - - - - - - -
0.5589 565 1.4954 - - - - - - - - - - - - - - -
0.5638 570 1.6659 - - - - - - - - - - - - - - -
0.5687 575 1.669 - - - - - - - - - - - - - - -
0.5737 580 1.6931 - - - - - - - - - - - - - - -
0.5786 585 1.6894 - - - - - - - - - - - - - - -
0.5836 590 1.6437 - - - - - - - - - - - - - - -
0.5885 595 1.7259 - - - - - - - - - - - - - - -
0.5935 600 1.7937 - - - - - - - - - - - - - - -
0.5984 605 1.7279 - - - - - - - - - - - - - - -
0.6034 610 1.6769 - - - - - - - - - - - - - - -
0.6083 615 1.4731 - - - - - - - - - - - - - - -
0.6133 620 1.6466 - - - - - - - - - - - - - - -
0.6182 625 1.6954 - - - - - - - - - - - - - - -
0.6231 630 1.6224 - - - - - - - - - - - - - - -
0.6281 635 1.62 - - - - - - - - - - - - - - -
0.6330 640 1.5795 - - - - - - - - - - - - - - -
0.6380 645 1.5245 - - - - - - - - - - - - - - -
0.6429 650 1.7629 - - - - - - - - - - - - - - -
0.6479 655 1.5767 - - - - - - - - - - - - - - -
0.6528 660 1.6749 - - - - - - - - - - - - - - -
0.6578 665 1.5602 - - - - - - - - - - - - - - -
0.6627 670 1.6768 - - - - - - - - - - - - - - -
0.6677 675 1.8311 - - - - - - - - - - - - - - -
0.6726 680 1.5973 - - - - - - - - - - - - - - -
0.6775 685 1.5066 - - - - - - - - - - - - - - -
0.6825 690 1.6036 - - - - - - - - - - - - - - -
0.6874 695 1.7857 - - - - - - - - - - - - - - -
0.6924 700 1.4387 - - - - - - - - - - - - - - -
0.6973 705 1.5886 - - - - - - - - - - - - - - -
0.7023 710 1.551 - - - - - - - - - - - - - - -
0.7072 715 1.5561 - - - - - - - - - - - - - - -
0.7122 720 1.4458 - - - - - - - - - - - - - - -
0.7171 725 1.5703 - - - - - - - - - - - - - - -
0.7221 730 1.6162 - - - - - - - - - - - - - - -
0.7270 735 1.5643 - - - - - - - - - - - - - - -
0.7319 740 1.4894 - - - - - - - - - - - - - - -
0.7369 745 1.6413 - - - - - - - - - - - - - - -
0.7418 750 1.5406 - - - - - - - - - - - - - - -
0.7468 755 1.5185 - - - - - - - - - - - - - - -
0.7517 760 1.488 - - - - - - - - - - - - - - -
0.7567 765 1.5041 - - - - - - - - - - - - - - -
0.7616 770 1.4665 - - - - - - - - - - - - - - -
0.7666 775 1.5252 - - - - - - - - - - - - - - -
0.7715 780 1.4925 - - - - - - - - - - - - - - -
0.7765 785 1.3833 - - - - - - - - - - - - - - -
0.7814 790 1.3808 - - - - - - - - - - - - - - -
0.7864 795 1.5468 - - - - - - - - - - - - - - -
0.7913 800 1.5317 - - - - - - - - - - - - - - -
0.7962 805 1.5385 - - - - - - - - - - - - - - -
0.8012 810 1.4012 - - - - - - - - - - - - - - -
0.8061 815 1.5531 - - - - - - - - - - - - - - -
0.8111 820 1.6032 - - - - - - - - - - - - - - -
0.8160 825 1.4053 - - - - - - - - - - - - - - -
0.8210 830 1.5082 - - - - - - - - - - - - - - -
0.8259 835 1.5559 - - - - - - - - - - - - - - -
0.8309 840 1.4286 - - - - - - - - - - - - - - -
0.8358 845 1.4336 - - - - - - - - - - - - - - -
0.8408 850 1.3731 - - - - - - - - - - - - - - -
0.8457 855 1.5706 - - - - - - - - - - - - - - -
0.8506 860 1.4184 - - - - - - - - - - - - - - -
0.8556 865 1.4312 - - - - - - - - - - - - - - -
0.8605 870 1.4364 - - - - - - - - - - - - - - -
0.8655 875 1.5605 - - - - - - - - - - - - - - -
0.8704 880 1.4219 - - - - - - - - - - - - - - -
0.8754 885 1.4082 - - - - - - - - - - - - - - -
0.8803 890 1.3846 - - - - - - - - - - - - - - -
0.8853 895 1.4292 - - - - - - - - - - - - - - -
0.8902 900 1.4195 - - - - - - - - - - - - - - -
0.8952 905 1.5103 - - - - - - - - - - - - - - -
0.9001 910 1.5041 - - - - - - - - - - - - - - -
0.9050 915 1.427 - - - - - - - - - - - - - - -
0.9100 920 1.4385 - - - - - - - - - - - - - - -
0.9149 925 1.298 - - - - - - - - - - - - - - -
0.9199 930 1.4499 - - - - - - - - - - - - - - -
0.9248 935 1.4752 - - - - - - - - - - - - - - -
0.9298 940 1.4752 - - - - - - - - - - - - - - -
0.9347 945 1.3705 - - - - - - - - - - - - - - -
0.9397 950 1.4567 - - - - - - - - - - - - - - -
0.9446 955 1.3364 - - - - - - - - - - - - - - -
0.9496 960 1.376 - - - - - - - - - - - - - - -
0.9545 965 1.35 - - - - - - - - - - - - - - -
0.9594 970 1.5841 - - - - - - - - - - - - - - -
0.9644 975 1.3449 - - - - - - - - - - - - - - -
0.9693 980 1.2132 - - - - - - - - - - - - - - -
0.9743 985 1.3414 - - - - - - - - - - - - - - -
0.9792 990 1.5148 - - - - - - - - - - - - - - -
0.9842 995 1.3866 - - - - - - - - - - - - - - -
0.9891 1000 1.2051 1.3370 0.0906 0.1578 0.0712 0.1504 0.1887 0.1554 0.0466 0.2528 0.6197 0.0672 0.2857 0.2291 0.2718 0.1990
0.9941 1005 1.3021 - - - - - - - - - - - - - - -
0.9990 1010 1.391 - - - - - - - - - - - - - - -
1.0040 1015 1.1452 - - - - - - - - - - - - - - -
1.0089 1020 1.3989 - - - - - - - - - - - - - - -
1.0138 1025 1.2142 - - - - - - - - - - - - - - -
1.0188 1030 1.2472 - - - - - - - - - - - - - - -
1.0237 1035 1.3058 - - - - - - - - - - - - - - -
1.0287 1040 1.2643 - - - - - - - - - - - - - - -
1.0336 1045 1.2581 - - - - - - - - - - - - - - -
1.0386 1050 1.2434 - - - - - - - - - - - - - - -
1.0435 1055 1.1874 - - - - - - - - - - - - - - -
1.0485 1060 1.0421 - - - - - - - - - - - - - - -
1.0534 1065 1.3834 - - - - - - - - - - - - - - -
1.0584 1070 1.3279 - - - - - - - - - - - - - - -
1.0633 1075 1.3779 - - - - - - - - - - - - - - -
1.0682 1080 1.3071 - - - - - - - - - - - - - - -
1.0732 1085 1.1569 - - - - - - - - - - - - - - -
1.0781 1090 1.2427 - - - - - - - - - - - - - - -
1.0831 1095 1.1607 - - - - - - - - - - - - - - -
1.0880 1100 1.2691 - - - - - - - - - - - - - - -
1.0930 1105 1.2936 - - - - - - - - - - - - - - -
1.0979 1110 1.2527 - - - - - - - - - - - - - - -
1.1029 1115 1.1143 - - - - - - - - - - - - - - -
1.1078 1120 1.1508 - - - - - - - - - - - - - - -
1.1128 1125 1.1627 - - - - - - - - - - - - - - -
1.1177 1130 0.9774 - - - - - - - - - - - - - - -
1.1227 1135 1.1827 - - - - - - - - - - - - - - -
1.1276 1140 0.9429 - - - - - - - - - - - - - - -
1.1325 1145 1.0029 - - - - - - - - - - - - - - -
1.1375 1150 1.0764 - - - - - - - - - - - - - - -
1.1424 1155 1.0555 - - - - - - - - - - - - - - -
1.1474 1160 1.0559 - - - - - - - - - - - - - - -
1.1523 1165 1.0081 - - - - - - - - - - - - - - -
1.1573 1170 1.1928 - - - - - - - - - - - - - - -
1.1622 1175 1.0774 - - - - - - - - - - - - - - -
1.1672 1180 0.9185 - - - - - - - - - - - - - - -
1.1721 1185 1.0838 - - - - - - - - - - - - - - -
1.1771 1190 0.9981 - - - - - - - - - - - - - - -
1.1820 1195 1.0395 - - - - - - - - - - - - - - -
1.1869 1200 0.9522 - - - - - - - - - - - - - - -
1.1919 1205 0.9652 - - - - - - - - - - - - - - -
1.1968 1210 1.0276 - - - - - - - - - - - - - - -
1.2018 1215 0.9663 - - - - - - - - - - - - - - -
1.2067 1220 1.1356 - - - - - - - - - - - - - - -
1.2117 1225 1.159 - - - - - - - - - - - - - - -
1.2166 1230 0.8575 - - - - - - - - - - - - - - -
1.2216 1235 0.9134 - - - - - - - - - - - - - - -
1.2265 1240 1.1889 - - - - - - - - - - - - - - -
1.2315 1245 0.935 - - - - - - - - - - - - - - -
1.2364 1250 0.975 - - - - - - - - - - - - - - -
1.2413 1255 1.073 - - - - - - - - - - - - - - -
1.2463 1260 1.0709 - - - - - - - - - - - - - - -
1.2512 1265 0.9241 - - - - - - - - - - - - - - -
1.2562 1270 1.0101 - - - - - - - - - - - - - - -
1.2611 1275 1.1451 - - - - - - - - - - - - - - -
1.2661 1280 1.0501 - - - - - - - - - - - - - - -
1.2710 1285 0.9724 - - - - - - - - - - - - - - -
1.2760 1290 0.9222 - - - - - - - - - - - - - - -
1.2809 1295 1.086 - - - - - - - - - - - - - - -
1.2859 1300 0.973 - - - - - - - - - - - - - - -
1.2908 1305 0.9287 - - - - - - - - - - - - - - -
1.2957 1310 0.9051 - - - - - - - - - - - - - - -
1.3007 1315 0.9531 - - - - - - - - - - - - - - -
1.3056 1320 0.9605 - - - - - - - - - - - - - - -
1.3106 1325 0.8778 - - - - - - - - - - - - - - -
1.3155 1330 0.9399 - - - - - - - - - - - - - - -
1.3205 1335 0.9185 - - - - - - - - - - - - - - -
1.3254 1340 0.9078 - - - - - - - - - - - - - - -
1.3304 1345 0.8266 - - - - - - - - - - - - - - -
1.3353 1350 0.8186 - - - - - - - - - - - - - - -
1.3403 1355 0.9394 - - - - - - - - - - - - - - -
1.3452 1360 1.0972 - - - - - - - - - - - - - - -
1.3501 1365 0.8895 - - - - - - - - - - - - - - -
1.3551 1370 0.8678 - - - - - - - - - - - - - - -
1.3600 1375 0.9493 - - - - - - - - - - - - - - -
1.3650 1380 0.8449 - - - - - - - - - - - - - - -
1.3699 1385 0.917 - - - - - - - - - - - - - - -
1.3749 1390 0.8899 - - - - - - - - - - - - - - -
1.3798 1395 0.9516 - - - - - - - - - - - - - - -
1.3848 1400 0.9538 - - - - - - - - - - - - - - -
1.3897 1405 0.9964 - - - - - - - - - - - - - - -
1.3947 1410 0.9123 - - - - - - - - - - - - - - -
1.3996 1415 0.86 - - - - - - - - - - - - - - -
1.4045 1420 0.9382 - - - - - - - - - - - - - - -
1.4095 1425 0.764 - - - - - - - - - - - - - - -
1.4144 1430 0.9161 - - - - - - - - - - - - - - -
1.4194 1435 0.937 - - - - - - - - - - - - - - -
1.4243 1440 0.8487 - - - - - - - - - - - - - - -
1.4293 1445 0.7928 - - - - - - - - - - - - - - -
1.4342 1450 0.8586 - - - - - - - - - - - - - - -
1.4392 1455 0.9355 - - - - - - - - - - - - - - -
1.4441 1460 0.965 - - - - - - - - - - - - - - -
1.4491 1465 0.9019 - - - - - - - - - - - - - - -
1.4540 1470 0.8624 - - - - - - - - - - - - - - -
1.4590 1475 0.8204 - - - - - - - - - - - - - - -
1.4639 1480 1.0131 - - - - - - - - - - - - - - -
1.4688 1485 0.9222 - - - - - - - - - - - - - - -
1.4738 1490 0.9182 - - - - - - - - - - - - - - -
1.4787 1495 0.8247 - - - - - - - - - - - - - - -
1.4837 1500 0.7746 - - - - - - - - - - - - - - -
1.4886 1505 0.882 - - - - - - - - - - - - - - -
1.4936 1510 0.8482 - - - - - - - - - - - - - - -
1.4985 1515 0.9623 - - - - - - - - - - - - - - -
1.5035 1520 0.8804 - - - - - - - - - - - - - - -
1.5084 1525 0.8874 - - - - - - - - - - - - - - -
1.5134 1530 0.9747 - - - - - - - - - - - - - - -
1.5183 1535 0.8805 - - - - - - - - - - - - - - -
1.5232 1540 0.8776 - - - - - - - - - - - - - - -
1.5282 1545 0.7627 - - - - - - - - - - - - - - -
1.5331 1550 0.8975 - - - - - - - - - - - - - - -
1.5381 1555 0.8213 - - - - - - - - - - - - - - -
1.5430 1560 0.9472 - - - - - - - - - - - - - - -
1.5480 1565 0.9379 - - - - - - - - - - - - - - -
1.5529 1570 0.9312 - - - - - - - - - - - - - - -
1.5579 1575 0.7866 - - - - - - - - - - - - - - -
1.5628 1580 0.8629 - - - - - - - - - - - - - - -
1.5678 1585 0.8156 - - - - - - - - - - - - - - -
1.5727 1590 0.8737 - - - - - - - - - - - - - - -
1.5776 1595 0.942 - - - - - - - - - - - - - - -
1.5826 1600 0.8167 - - - - - - - - - - - - - - -
1.5875 1605 0.9468 - - - - - - - - - - - - - - -
1.5925 1610 0.9117 - - - - - - - - - - - - - - -
1.5974 1615 1.0137 - - - - - - - - - - - - - - -
1.6024 1620 0.8357 - - - - - - - - - - - - - - -
1.6073 1625 0.8372 - - - - - - - - - - - - - - -
1.6123 1630 0.905 - - - - - - - - - - - - - - -
1.6172 1635 0.9265 - - - - - - - - - - - - - - -
1.6222 1640 0.846 - - - - - - - - - - - - - - -
1.6271 1645 0.7729 - - - - - - - - - - - - - - -
1.6320 1650 0.7885 - - - - - - - - - - - - - - -
1.6370 1655 0.8717 - - - - - - - - - - - - - - -
1.6419 1660 0.9845 - - - - - - - - - - - - - - -
1.6469 1665 0.8286 - - - - - - - - - - - - - - -
1.6518 1670 0.8979 - - - - - - - - - - - - - - -
1.6568 1675 0.8502 - - - - - - - - - - - - - - -
1.6617 1680 0.9423 - - - - - - - - - - - - - - -
1.6667 1685 1.0128 - - - - - - - - - - - - - - -
1.6716 1690 0.8535 - - - - - - - - - - - - - - -
1.6766 1695 0.737 - - - - - - - - - - - - - - -
1.6815 1700 0.9871 - - - - - - - - - - - - - - -
1.6864 1705 0.8828 - - - - - - - - - - - - - - -
1.6914 1710 0.8178 - - - - - - - - - - - - - - -
1.6963 1715 0.7703 - - - - - - - - - - - - - - -
1.7013 1720 0.8739 - - - - - - - - - - - - - - -
1.7062 1725 0.8582 - - - - - - - - - - - - - - -
1.7112 1730 0.9181 - - - - - - - - - - - - - - -
1.7161 1735 0.8801 - - - - - - - - - - - - - - -
1.7211 1740 0.8009 - - - - - - - - - - - - - - -
1.7260 1745 0.9779 - - - - - - - - - - - - - - -
1.7310 1750 0.7777 - - - - - - - - - - - - - - -
1.7359 1755 0.7864 - - - - - - - - - - - - - - -
1.7409 1760 1.0066 - - - - - - - - - - - - - - -
1.7458 1765 0.7776 - - - - - - - - - - - - - - -
1.7507 1770 0.8122 - - - - - - - - - - - - - - -
1.7557 1775 0.8025 - - - - - - - - - - - - - - -
1.7606 1780 0.7559 - - - - - - - - - - - - - - -
1.7656 1785 0.8819 - - - - - - - - - - - - - - -
1.7705 1790 0.8901 - - - - - - - - - - - - - - -
1.7755 1795 0.7598 - - - - - - - - - - - - - - -
1.7804 1800 0.7542 - - - - - - - - - - - - - - -
1.7854 1805 0.8178 - - - - - - - - - - - - - - -
1.7903 1810 0.8374 - - - - - - - - - - - - - - -
1.7953 1815 0.8363 - - - - - - - - - - - - - - -
1.8002 1820 0.8177 - - - - - - - - - - - - - - -
1.8051 1825 0.9488 - - - - - - - - - - - - - - -
1.8101 1830 0.9959 - - - - - - - - - - - - - - -
1.8150 1835 0.7942 - - - - - - - - - - - - - - -
1.8200 1840 0.8747 - - - - - - - - - - - - - - -
1.8249 1845 0.9053 - - - - - - - - - - - - - - -
1.8299 1850 0.7853 - - - - - - - - - - - - - - -
1.8348 1855 0.838 - - - - - - - - - - - - - - -
1.8398 1860 0.7732 - - - - - - - - - - - - - - -
1.8447 1865 0.8613 - - - - - - - - - - - - - - -
1.8497 1870 0.791 - - - - - - - - - - - - - - -
1.8546 1875 0.8203 - - - - - - - - - - - - - - -
1.8595 1880 0.7558 - - - - - - - - - - - - - - -
1.8645 1885 0.9918 - - - - - - - - - - - - - - -
1.8694 1890 0.8272 - - - - - - - - - - - - - - -
1.8744 1895 0.8552 - - - - - - - - - - - - - - -
1.8793 1900 0.8135 - - - - - - - - - - - - - - -
1.8843 1905 0.8297 - - - - - - - - - - - - - - -
1.8892 1910 0.7844 - - - - - - - - - - - - - - -
1.8942 1915 0.8466 - - - - - - - - - - - - - - -
1.8991 1920 0.9099 - - - - - - - - - - - - - - -
1.9041 1925 0.8139 - - - - - - - - - - - - - - -
1.9090 1930 0.8628 - - - - - - - - - - - - - - -
1.9139 1935 0.6778 - - - - - - - - - - - - - - -
1.9189 1940 0.8251 - - - - - - - - - - - - - - -
1.9238 1945 0.8915 - - - - - - - - - - - - - - -
1.9288 1950 0.8136 - - - - - - - - - - - - - - -
1.9337 1955 0.8879 - - - - - - - - - - - - - - -
1.9387 1960 0.8758 - - - - - - - - - - - - - - -
1.9436 1965 0.8153 - - - - - - - - - - - - - - -
1.9486 1970 0.7253 - - - - - - - - - - - - - - -
1.9535 1975 0.8493 - - - - - - - - - - - - - - -
1.9585 1980 1.0186 - - - - - - - - - - - - - - -
1.9634 1985 0.8412 - - - - - - - - - - - - - - -
1.9683 1990 0.7027 - - - - - - - - - - - - - - -
1.9733 1995 0.744 - - - - - - - - - - - - - - -
1.9782 2000 0.9555 1.1452 0.1064 0.1577 0.0780 0.1597 0.2144 0.1550 0.0513 0.2643 0.6316 0.0525 0.3670 0.2485 0.2937 0.2139
1.9832 2005 0.9095 - - - - - - - - - - - - - - -
1.9881 2010 0.7378 - - - - - - - - - - - - - - -
1.9931 2015 0.8024 - - - - - - - - - - - - - - -
1.9980 2020 0.9107 - - - - - - - - - - - - - - -
2.0 2022 - - 0.1074 0.1565 0.0780 0.1599 0.2152 0.1550 0.0514 0.2669 0.6316 0.0544 0.3668 0.2485 0.2934 0.2142

Framework Versions

  • Python: 3.11.2
  • Sentence Transformers: 3.3.1
  • Transformers: 4.47.1
  • PyTorch: 2.4.0+cu121
  • Accelerate: 1.0.1
  • Datasets: 3.1.0
  • Tokenizers: 0.21.0

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}