SentenceTransformer based on hon9kon9ize/bert-large-cantonese

This is a sentence-transformers model fine-tuned from hon9kon9ize/bert-large-cantonese on the yue-all-nli and sentence-transformers/all-nli datasets. It maps sentences and paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: hon9kon9ize/bert-large-cantonese
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 1024 dimensions
  • Similarity Function: Cosine Similarity

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
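The Pooling layer above uses mean pooling over token embeddings (pooling_mode_mean_tokens: True): padded positions are excluded via the attention mask, and the remaining token vectors are averaged. An illustrative sketch of that step with toy values in place of real BERT outputs:

```python
import numpy as np

# Toy token embeddings: 2 sentences, padded to length 4, hidden size 3
token_embeddings = np.arange(24, dtype=float).reshape(2, 4, 3)
attention_mask = np.array([[1, 1, 1, 0],
                           [1, 1, 0, 0]])

mask = attention_mask[:, :, None]               # (2, 4, 1), broadcastable
summed = (token_embeddings * mask).sum(axis=1)  # sum over real tokens only
counts = mask.sum(axis=1)                       # number of real tokens
sentence_embeddings = summed / counts           # mean pooling
print(sentence_embeddings)
# [[ 3.   4.   5. ]
#  [13.5 14.5 15.5]]
```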

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("hon9kon9ize/bert-large-cantonese-nli")
# Run inference
sentences = [
    '有個男人拋過咗個杆,後面有啲人同埋遮蓬。',
    '有個男人騰空躍起。',
    '個男人追緊個賊。',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]

Evaluation

Metrics

Semantic Similarity

Metric            sts-dev   sts-test
pearson_cosine    0.8077    0.7648
spearman_cosine   0.8072    0.7611
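spearman_cosine is the Spearman rank correlation between the model's cosine similarity scores and the gold STS labels, i.e. the Pearson correlation of their ranks. A minimal sketch (assuming no tied scores; toy values):

```python
import numpy as np

def spearman(a, b):
    # Spearman rank correlation = Pearson correlation of the ranks
    # (simple version that assumes no ties)
    ra = np.argsort(np.argsort(a)).astype(float)
    rb = np.argsort(np.argsort(b)).astype(float)
    ra -= ra.mean()
    rb -= rb.mean()
    return float((ra * rb).sum() / np.sqrt((ra**2).sum() * (rb**2).sum()))

gold   = np.array([0.1, 0.5, 0.9, 0.3])  # toy gold similarity labels
cosine = np.array([0.2, 0.6, 0.8, 0.4])  # toy model cosine scores
print(spearman(gold, cosine))  # 1.0 (identical ranking)
```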

Training Details

Training Dataset

Unnamed Dataset

  • Size: 1,115,217 training samples
  • Columns: anchor, positive, and negative
  • Approximate statistics based on the first 1000 samples:

                   anchor    positive  negative
      type         string    string    string
      min tokens   4         4         4
      mean tokens  25.84     16.04     16.68
      max tokens   216       53        52
  • Samples:
      anchor:   From Lincoln to Wilson it was 34, and since Wilson it has been 25.
      positive: Since Wilson it has been 25, and from Lincoln to Wilson 34.
      negative: From Lincoln to Wilson it was 99.

      anchor:   「我會上返我房。」跟住我就跟咗佢。
      positive: 「我上房啦。」跟住我就跟咗佢上去。
      negative: 我落去廚房沖杯茶先。

      anchor:   (To restore Internet Explorer, you'll need to have it on your original Windows 95 CD-ROM or another disk.
      positive: You need to have the original disk to restore Internet Explorer.
      negative: Internet Explorer can be restored without a disk.
  • Loss: GISTEmbedLoss with these parameters:
    {'guide': SentenceTransformer(
      (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: NewModel 
      (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
      (2): Normalize()
    ), 'temperature': 0.01}
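GISTEmbedLoss uses the guide model above to filter in-batch negatives: any candidate the guide rates as more similar to the anchor than the anchor's own positive is treated as a likely false negative and masked out of the loss; the temperature 0.01 then scales the surviving similarity logits. A simplified numpy sketch of that masking rule, with toy similarity values:

```python
import numpy as np

# Toy guide-model similarities between 2 anchors and 3 in-batch candidates
guide_sim = np.array([[0.90, 0.20, 0.95],   # anchor 0, its positive is col 0
                      [0.10, 0.80, 0.30]])  # anchor 1, its positive is col 1
positive_idx = np.array([0, 1])

# Mask any candidate the guide rates above the anchor's own positive
pos_sim = guide_sim[np.arange(2), positive_idx][:, None]
likely_false_negative = guide_sim > pos_sim
print(likely_false_negative)
# [[False False  True]
#  [False False False]]
```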
    

Evaluation Dataset

yue-all-nli

  • Dataset: yue-all-nli at f6757e9
  • Size: 6,572 evaluation samples
  • Columns: anchor, positive, and negative
  • Approximate statistics based on the first 1000 samples:

                   anchor    positive  negative
      type         string    string    string
      min tokens   7         6         7
      mean tokens  24.98     13.09     13.7
      max tokens   103       50        40
  • Samples:
      anchor:   「兩個女人攬住,仲手持外賣包裝。」
      positive: 兩個女人手持包裹。
      negative: 班男人喺間熟食店外面打緊交。

      anchor:   「兩個著住藍色球衣嘅細路,一個背號係9,另一個係2,企喺浴室嘅木梯度,喺個鋅盤度洗手。
      positive: 兩個著住有號碼球衣嘅細路仔喺度洗手。
      negative: 兩個著住外套嘅小朋友返緊學。

      anchor:   一個男人喺安吉利斯市舉行嘅世界博覽會活動期間,向顧客賣甜甜圈。
      positive: 有個男人賣緊甜甜圈俾個客人。
      negative: 一個女人喺間細細嘅咖啡店飲緊咖啡。
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
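MultipleNegativesRankingLoss treats, for each anchor, its own positive as the correct "class" among all in-batch positives, and applies cross-entropy over the scaled similarity scores (here cos_sim with scale 20.0). A numpy sketch with toy similarity values:

```python
import numpy as np

scale = 20.0
# Toy cosine similarities: row i = anchor i vs every in-batch positive j;
# the correct target for anchor i is j = i (the diagonal)
cos_sim = np.array([[0.9, 0.1, 0.0],
                    [0.2, 0.8, 0.1],
                    [0.0, 0.3, 0.7]])
logits = scale * cos_sim
# Cross-entropy with target j = i: negative mean of diagonal log-softmax
log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
loss = -np.mean(np.diag(log_probs))
print(loss)
```

With the correct pair clearly dominating each row, the loss is close to zero; the scale factor sharpens the softmax so small cosine gaps still produce confident rankings.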
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 80
  • per_device_eval_batch_size: 80
  • learning_rate: 2e-05
  • num_train_epochs: 1
  • warmup_ratio: 0.1
  • bf16: True
  • use_liger_kernel: True
  • batch_sampler: no_duplicates
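As a hedged sketch, the non-default settings above correspond to training arguments along these lines (output_dir is a placeholder; argument names assume recent sentence-transformers/transformers versions):

```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

# Hypothetical reconstruction of the non-default hyperparameters listed above
args = SentenceTransformerTrainingArguments(
    output_dir="output",  # placeholder path
    eval_strategy="steps",
    per_device_train_batch_size=80,
    per_device_eval_batch_size=80,
    learning_rate=2e-5,
    num_train_epochs=1,
    warmup_ratio=0.1,
    bf16=True,
    use_liger_kernel=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
```

The no_duplicates batch sampler matters for in-batch-negative losses like the ones used here: duplicate sentences in a batch would otherwise act as false negatives.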

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 80
  • per_device_eval_batch_size: 80
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: True
  • eval_use_gather_object: False
  • prompts: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss Validation Loss sts-dev_spearman_cosine sts-test_spearman_cosine
0 0 - - 0.6088 -
0.0007 10 8.9182 - - -
0.0014 20 8.3335 - - -
0.0022 30 7.8536 - - -
0.0029 40 7.2262 - - -
0.0036 50 5.892 - - -
0.0043 60 5.1125 - - -
0.0050 70 4.5982 - - -
0.0057 80 3.9755 - - -
0.0065 90 3.7588 - - -
0.0072 100 3.6977 - - -
0.0079 110 3.5546 - - -
0.0086 120 3.3271 - - -
0.0093 130 3.1647 - - -
0.0100 140 3.0859 - - -
0.0108 150 2.9647 - - -
0.0115 160 2.9156 - - -
0.0122 170 2.9001 - - -
0.0129 180 2.721 - - -
0.0136 190 2.7586 - - -
0.0143 200 2.601 - - -
0.0151 210 2.5653 - - -
0.0158 220 2.5726 - - -
0.0165 230 2.4649 - - -
0.0172 240 2.4707 - - -
0.0179 250 2.5434 - - -
0.0187 260 2.3945 - - -
0.0194 270 2.2557 - - -
0.0201 280 2.3749 - - -
0.0208 290 2.1641 - - -
0.0215 300 2.0744 - - -
0.0222 310 2.2832 - - -
0.0230 320 2.2298 - - -
0.0237 330 2.2006 - - -
0.0244 340 2.0964 - - -
0.0251 350 2.208 - - -
0.0258 360 2.1591 - - -
0.0265 370 1.9641 - - -
0.0273 380 2.1122 - - -
0.0280 390 1.9392 - - -
0.0287 400 1.9073 - - -
0.0294 410 2.115 - - -
0.0301 420 1.7491 - - -
0.0308 430 1.8553 - - -
0.0316 440 1.997 - - -
0.0323 450 1.8158 - - -
0.0330 460 1.8158 - - -
0.0337 470 1.78 - - -
0.0344 480 1.8675 - - -
0.0351 490 1.7096 - - -
0.0359 500 1.7854 - - -
0.0366 510 1.8731 - - -
0.0373 520 1.8561 - - -
0.0380 530 1.6927 - - -
0.0387 540 1.7424 - - -
0.0395 550 1.7394 - - -
0.0402 560 1.7621 - - -
0.0409 570 1.8423 - - -
0.0416 580 1.8296 - - -
0.0423 590 1.7202 - - -
0.0430 600 1.7966 - - -
0.0438 610 1.5712 - - -
0.0445 620 1.6416 - - -
0.0452 630 1.5977 - - -
0.0459 640 1.7382 - - -
0.0466 650 1.6744 - - -
0.0473 660 1.7537 - - -
0.0481 670 1.7131 - - -
0.0488 680 1.6714 - - -
0.0495 690 1.651 - - -
0.0502 700 1.6177 - - -
0.0509 710 1.5179 - - -
0.0516 720 1.612 - - -
0.0524 730 1.561 - - -
0.0531 740 1.5283 - - -
0.0538 750 1.5407 - - -
0.0545 760 1.4809 - - -
0.0552 770 1.5749 - - -
0.0560 780 1.5744 - - -
0.0567 790 1.5428 - - -
0.0574 800 1.4429 - - -
0.0581 810 1.4721 - - -
0.0588 820 1.3807 - - -
0.0595 830 1.4229 - - -
0.0603 840 1.5178 - - -
0.0610 850 1.4231 - - -
0.0617 860 1.4853 - - -
0.0624 870 1.4032 - - -
0.0631 880 1.473 - - -
0.0638 890 1.3368 - - -
0.0646 900 1.3146 - - -
0.0653 910 1.4207 - - -
0.0660 920 1.3747 - - -
0.0667 930 1.406 - - -
0.0674 940 1.2829 - - -
0.0681 950 1.3115 - - -
0.0689 960 1.4112 - - -
0.0696 970 1.4644 - - -
0.0703 980 1.3492 - - -
0.0710 990 1.4076 - - -
0.0717 1000 1.4196 - - -
0.0724 1010 1.4376 - - -
0.0732 1020 1.2883 - - -
0.0739 1030 1.3183 - - -
0.0746 1040 1.4535 - - -
0.0753 1050 1.28 - - -
0.0760 1060 1.2585 - - -
0.0768 1070 1.3905 - - -
0.0775 1080 1.3414 - - -
0.0782 1090 1.2586 - - -
0.0789 1100 1.2527 - - -
0.0796 1110 1.2847 - - -
0.0803 1120 1.2187 - - -
0.0811 1130 1.2561 - - -
0.0818 1140 1.3247 - - -
0.0825 1150 1.2935 - - -
0.0832 1160 1.4118 - - -
0.0839 1170 1.2976 - - -
0.0846 1180 1.2084 - - -
0.0854 1190 1.3157 - - -
0.0861 1200 1.2973 - - -
0.0868 1210 1.24 - - -
0.0875 1220 1.3447 - - -
0.0882 1230 1.2246 - - -
0.0889 1240 1.2742 - - -
0.0897 1250 1.1727 - - -
0.0904 1260 1.3523 - - -
0.0911 1270 1.228 - - -
0.0918 1280 1.1218 - - -
0.0925 1290 1.1027 - - -
0.0933 1300 1.3143 - - -
0.0940 1310 1.1356 - - -
0.0947 1320 1.2 - - -
0.0954 1330 1.118 - - -
0.0961 1340 1.1889 - - -
0.0968 1350 1.1722 - - -
0.0976 1360 1.1186 - - -
0.0983 1370 1.2979 - - -
0.0990 1380 1.1622 - - -
0.0997 1390 1.2492 - - -
0.1004 1400 1.1449 - - -
0.1011 1410 1.1141 - - -
0.1019 1420 1.3464 - - -
0.1026 1430 1.1921 - - -
0.1033 1440 1.0709 - - -
0.1040 1450 1.1985 - - -
0.1047 1460 1.1392 - - -
0.1054 1470 1.1942 - - -
0.1062 1480 1.1897 - - -
0.1069 1490 1.1081 - - -
0.1076 1500 1.1641 - - -
0.1083 1510 1.1414 - - -
0.1090 1520 1.1705 - - -
0.1097 1530 1.1629 - - -
0.1105 1540 1.1342 - - -
0.1112 1550 1.1283 - - -
0.1119 1560 1.1242 - - -
0.1126 1570 1.1932 - - -
0.1133 1580 1.075 - - -
0.1141 1590 1.1319 - - -
0.1148 1600 1.2497 - - -
0.1155 1610 0.9953 - - -
0.1162 1620 1.1005 - - -
0.1169 1630 1.0989 - - -
0.1176 1640 1.0662 - - -
0.1184 1650 1.1136 - - -
0.1191 1660 1.129 - - -
0.1198 1670 1.0414 - - -
0.1205 1680 1.0026 - - -
0.1212 1690 1.035 - - -
0.1219 1700 1.0701 - - -
0.1227 1710 1.0386 - - -
0.1234 1720 1.1529 - - -
0.1241 1730 1.0426 - - -
0.1248 1740 1.077 - - -
0.1255 1750 1.1176 - - -
0.1262 1760 1.0752 - - -
0.1270 1770 1.0145 - - -
0.1277 1780 1.0539 - - -
0.1284 1790 1.1963 - - -
0.1291 1800 1.1032 - - -
0.1298 1810 1.0652 - - -
0.1306 1820 1.144 - - -
0.1313 1830 1.1093 - - -
0.1320 1840 1.0215 - - -
0.1327 1850 1.152 - - -
0.1334 1860 0.9941 - - -
0.1341 1870 1.0329 - - -
0.1349 1880 0.9976 - - -
0.1356 1890 0.9901 - - -
0.1363 1900 1.1014 - - -
0.1370 1910 1.0418 - - -
0.1377 1920 1.1701 - - -
0.1384 1930 0.973 - - -
0.1392 1940 1.0644 - - -
0.1399 1950 1.0022 - - -
0.1406 1960 0.9863 - - -
0.1413 1970 0.929 - - -
0.1420 1980 1.0516 - - -
0.1427 1990 1.0123 - - -
0.1435 2000 1.0731 - - -
0.1442 2010 1.0337 - - -
0.1449 2020 0.9423 - - -
0.1456 2030 0.966 - - -
0.1463 2040 1.046 - - -
0.1470 2050 1.045 - - -
0.1478 2060 0.9203 - - -
0.1485 2070 0.9721 - - -
0.1492 2080 1.0341 - - -
0.1499 2090 1.0463 - - -
0.1506 2100 1.0051 - - -
0.1514 2110 1.0055 - - -
0.1521 2120 0.9688 - - -
0.1528 2130 1.0974 - - -
0.1535 2140 0.8852 - - -
0.1542 2150 1.0356 - - -
0.1549 2160 0.9915 - - -
0.1557 2170 0.9609 - - -
0.1562 2178 - 1.2093 0.7866 -
0.1564 2180 1.0349 - - -
0.1571 2190 0.899 - - -
0.1578 2200 1.0112 - - -
0.1585 2210 1.0009 - - -
0.1592 2220 0.9221 - - -
0.1600 2230 0.9595 - - -
0.1607 2240 0.9516 - - -
0.1614 2250 1.0355 - - -
0.1621 2260 0.9125 - - -
0.1628 2270 0.9046 - - -
0.1635 2280 0.9773 - - -
0.1643 2290 1.0821 - - -
0.1650 2300 0.9773 - - -
0.1657 2310 0.9909 - - -
0.1664 2320 0.9288 - - -
0.1671 2330 0.9354 - - -
0.1679 2340 0.9196 - - -
0.1686 2350 0.8961 - - -
0.1693 2360 0.9685 - - -
0.1700 2370 0.9669 - - -
0.1707 2380 0.9542 - - -
0.1714 2390 0.9911 - - -
0.1722 2400 0.8908 - - -
0.1729 2410 0.9395 - - -
0.1736 2420 0.9501 - - -
0.1743 2430 0.9534 - - -
0.1750 2440 0.9839 - - -
0.1757 2450 0.9619 - - -
0.1765 2460 0.937 - - -
0.1772 2470 0.8549 - - -
0.1779 2480 0.9377 - - -
0.1786 2490 0.8928 - - -
0.1793 2500 0.8883 - - -
0.1800 2510 0.9459 - - -
0.1808 2520 0.8577 - - -
0.1815 2530 0.8096 - - -
0.1822 2540 0.8988 - - -
0.1829 2550 0.9359 - - -
0.1836 2560 0.944 - - -
0.1843 2570 0.9284 - - -
0.1851 2580 0.8583 - - -
0.1858 2590 0.8604 - - -
0.1865 2600 0.9531 - - -
0.1872 2610 0.9074 - - -
0.1879 2620 0.8655 - - -
0.1887 2630 0.9546 - - -
0.1894 2640 0.9259 - - -
0.1901 2650 0.945 - - -
0.1908 2660 0.861 - - -
0.1915 2670 0.8892 - - -
0.1922 2680 0.9798 - - -
0.1930 2690 0.8644 - - -
0.1937 2700 0.95 - - -
0.1944 2710 0.931 - - -
0.1951 2720 0.7496 - - -
0.1958 2730 0.8855 - - -
0.1965 2740 0.8926 - - -
0.1973 2750 0.8792 - - -
0.1980 2760 0.8397 - - -
0.1987 2770 0.9355 - - -
0.1994 2780 0.9362 - - -
0.2001 2790 0.8009 - - -
0.2008 2800 0.7962 - - -
0.2016 2810 0.8475 - - -
0.2023 2820 0.8982 - - -
0.2030 2830 0.8046 - - -
0.2037 2840 0.9624 - - -
0.2044 2850 0.8324 - - -
0.2052 2860 0.7491 - - -
0.2059 2870 0.9558 - - -
0.2066 2880 0.8514 - - -
0.2073 2890 0.7838 - - -
0.2080 2900 0.7808 - - -
0.2087 2910 1.0037 - - -
0.2095 2920 0.9283 - - -
0.2102 2930 0.8422 - - -
0.2109 2940 0.8594 - - -
0.2116 2950 0.8546 - - -
0.2123 2960 0.8496 - - -
0.2130 2970 0.845 - - -
0.2138 2980 0.7914 - - -
0.2145 2990 0.8316 - - -
0.2152 3000 0.8721 - - -
0.2159 3010 0.6566 - - -
0.2166 3020 0.8026 - - -
0.2173 3030 0.8237 - - -
0.2181 3040 0.8142 - - -
0.2188 3050 0.7932 - - -
0.2195 3060 0.8751 - - -
0.2202 3070 0.8583 - - -
0.2209 3080 0.8627 - - -
0.2216 3090 0.8593 - - -
0.2224 3100 0.8595 - - -
0.2231 3110 0.8371 - - -
0.2238 3120 0.8579 - - -
0.2245 3130 0.8658 - - -
0.2252 3140 0.7946 - - -
0.2260 3150 0.8652 - - -
0.2267 3160 0.7412 - - -
0.2274 3170 0.7563 - - -
0.2281 3180 0.9034 - - -
0.2288 3190 0.8193 - - -
0.2295 3200 0.8001 - - -
0.2303 3210 0.794 - - -
0.2310 3220 0.8222 - - -
0.2317 3230 0.8456 - - -
0.2324 3240 0.8093 - - -
0.2331 3250 0.7943 - - -
0.2338 3260 0.8158 - - -
0.2346 3270 0.8379 - - -
0.2353 3280 0.8711 - - -
0.2360 3290 0.8452 - - -
0.2367 3300 0.7533 - - -
0.2374 3310 0.8267 - - -
0.2381 3320 0.818 - - -
0.2389 3330 0.8427 - - -
0.2396 3340 0.7445 - - -
0.2403 3350 0.8246 - - -
0.2410 3360 0.8033 - - -
0.2417 3370 0.8017 - - -
0.2425 3380 0.7425 - - -
0.2432 3390 0.8606 - - -
0.2439 3400 0.7978 - - -
0.2446 3410 0.8827 - - -
0.2453 3420 0.7308 - - -
0.2460 3430 0.819 - - -
0.2468 3440 0.7908 - - -
0.2475 3450 0.7302 - - -
0.2482 3460 0.7982 - - -
0.2489 3470 0.8015 - - -
0.2496 3480 0.7676 - - -
0.2503 3490 0.8107 - - -
0.2511 3500 0.8125 - - -
0.2518 3510 0.8307 - - -
0.2525 3520 0.7488 - - -
0.2532 3530 0.92 - - -
0.2539 3540 0.6889 - - -
0.2546 3550 0.7914 - - -
0.2554 3560 0.7938 - - -
0.2561 3570 0.8257 - - -
0.2568 3580 0.7064 - - -
0.2575 3590 0.7717 - - -
0.2582 3600 0.631 - - -
0.2589 3610 0.8091 - - -
0.2597 3620 0.7559 - - -
0.2604 3630 0.7483 - - -
0.2611 3640 0.7886 - - -
0.2618 3650 0.6773 - - -
0.2625 3660 0.7352 - - -
0.2633 3670 0.8096 - - -
0.2640 3680 0.79 - - -
0.2647 3690 0.6886 - - -
0.2654 3700 0.7923 - - -
0.2661 3710 0.7476 - - -
0.2668 3720 0.8992 - - -
0.2676 3730 0.733 - - -
0.2683 3740 0.6898 - - -
0.2690 3750 0.7969 - - -
0.2697 3760 0.7556 - - -
0.2704 3770 0.786 - - -
0.2711 3780 0.827 - - -
0.2719 3790 0.75 - - -
0.2726 3800 0.7101 - - -
0.2733 3810 0.8142 - - -
0.2740 3820 0.7928 - - -
0.2747 3830 0.788 - - -
0.2754 3840 0.7075 - - -
0.2762 3850 0.695 - - -
0.2769 3860 0.7571 - - -
0.2776 3870 0.6798 - - -
0.2783 3880 0.8626 - - -
0.2790 3890 0.7438 - - -
0.2798 3900 0.7301 - - -
0.2805 3910 0.7628 - - -
0.2812 3920 0.726 - - -
0.2819 3930 0.7678 - - -
0.2826 3940 0.7113 - - -
0.2833 3950 0.723 - - -
0.2841 3960 0.771 - - -
0.2848 3970 0.787 - - -
0.2855 3980 0.6815 - - -
0.2862 3990 0.7384 - - -
0.2869 4000 0.7465 - - -
0.2876 4010 0.6722 - - -
0.2884 4020 0.7361 - - -
0.2891 4030 0.7149 - - -
0.2898 4040 0.773 - - -
0.2905 4050 0.7779 - - -
0.2912 4060 0.7116 - - -
0.2919 4070 0.6573 - - -
0.2927 4080 0.8571 - - -
0.2934 4090 0.7424 - - -
0.2941 4100 0.7404 - - -
0.2948 4110 0.7462 - - -
0.2955 4120 0.625 - - -
0.2962 4130 0.7387 - - -
0.2970 4140 0.7127 - - -
0.2977 4150 0.7645 - - -
0.2984 4160 0.7598 - - -
0.2991 4170 0.649 - - -
0.2998 4180 0.7683 - - -
0.3006 4190 0.6465 - - -
0.3013 4200 0.7598 - - -
0.3020 4210 0.7458 - - -
0.3027 4220 0.7207 - - -
0.3034 4230 0.6873 - - -
0.3041 4240 0.7018 - - -
0.3049 4250 0.7524 - - -
0.3056 4260 0.6823 - - -
0.3063 4270 0.5722 - - -
0.3070 4280 0.7361 - - -
0.3077 4290 0.7342 - - -
0.3084 4300 0.6414 - - -
0.3092 4310 0.6505 - - -
0.3099 4320 0.7799 - - -
0.3106 4330 0.7294 - - -
0.3113 4340 0.6403 - - -
0.3120 4350 0.7521 - - -
0.3125 4356 - 0.9868 0.7972 -
0.3127 4360 0.7467 - - -
0.3135 4370 0.7838 - - -
0.3142 4380 0.683 - - -
0.3149 4390 0.7511 - - -
0.3156 4400 0.6735 - - -
0.3163 4410 0.7084 - - -
0.3171 4420 0.7538 - - -
0.3178 4430 0.8077 - - -
0.3185 4440 0.7891 - - -
0.3192 4450 0.6701 - - -
0.3199 4460 0.7232 - - -
0.3206 4470 0.787 - - -
0.3214 4480 0.6689 - - -
0.3221 4490 0.7594 - - -
0.3228 4500 0.75 - - -
0.3235 4510 0.6784 - - -
0.3242 4520 0.7004 - - -
0.3249 4530 0.7308 - - -
0.3257 4540 0.752 - - -
0.3264 4550 0.7164 - - -
0.3271 4560 0.6964 - - -
0.3278 4570 0.7029 - - -
0.3285 4580 0.6584 - - -
0.3292 4590 0.6218 - - -
0.3300 4600 0.779 - - -
0.3307 4610 0.6442 - - -
0.3314 4620 0.6317 - - -
0.3321 4630 0.6734 - - -
0.3328 4640 0.7953 - - -
0.3335 4650 0.6617 - - -
0.3343 4660 0.6404 - - -
0.3350 4670 0.6407 - - -
0.3357 4680 0.6547 - - -
0.3364 4690 0.7249 - - -
0.3371 4700 0.7525 - - -
0.3379 4710 0.7395 - - -
0.3386 4720 0.6307 - - -
0.3393 4730 0.6092 - - -
0.3400 4740 0.6135 - - -
0.3407 4750 0.7408 - - -
0.3414 4760 0.6585 - - -
0.3422 4770 0.6396 - - -
0.3429 4780 0.6022 - - -
0.3436 4790 0.6295 - - -
0.3443 4800 0.8024 - - -
0.3450 4810 0.6866 - - -
0.3457 4820 0.5793 - - -
0.3465 4830 0.6637 - - -
0.3472 4840 0.6971 - - -
0.3479 4850 0.6391 - - -
0.3486 4860 0.7 - - -
0.3493 4870 0.6213 - - -
0.3500 4880 0.6752 - - -
0.3508 4890 0.6263 - - -
0.3515 4900 0.6575 - - -
0.3522 4910 0.6694 - - -
0.3529 4920 0.6498 - - -
0.3536 4930 0.6279 - - -
0.3544 4940 0.7128 - - -
0.3551 4950 0.6603 - - -
0.3558 4960 0.6351 - - -
0.3565 4970 0.6578 - - -
0.3572 4980 0.6546 - - -
0.3579 4990 0.7007 - - -
0.3587 5000 0.6485 - - -
0.3594 5010 0.6139 - - -
0.3601 5020 0.6885 - - -
0.3608 5030 0.6312 - - -
0.3615 5040 0.5633 - - -
0.3622 5050 0.6518 - - -
0.3630 5060 0.7098 - - -
0.3637 5070 0.583 - - -
0.3644 5080 0.5949 - - -
0.3651 5090 0.7185 - - -
0.3658 5100 0.6777 - - -
0.3665 5110 0.6809 - - -
0.3673 5120 0.6909 - - -
0.3680 5130 0.6251 - - -
0.3687 5140 0.6099 - - -
0.3694 5150 0.6659 - - -
0.3701 5160 0.6516 - - -
0.3708 5170 0.664 - - -
0.3716 5180 0.6589 - - -
0.3723 5190 0.6371 - - -
0.3730 5200 0.6692 - - -
0.3737 5210 0.6228 - - -
0.3744 5220 0.6432 - - -
0.3752 5230 0.6663 - - -
0.3759 5240 0.6402 - - -
0.3766 5250 0.5334 - - -
0.3773 5260 0.6676 - - -
0.3780 5270 0.6822 - - -
0.3787 5280 0.5719 - - -
0.3795 5290 0.6703 - - -
0.3802 5300 0.6444 - - -
0.3809 5310 0.6299 - - -
0.3816 5320 0.6906 - - -
0.3823 5330 0.5978 - - -
0.3830 5340 0.6278 - - -
0.3838 5350 0.6683 - - -
0.3845 5360 0.5997 - - -
0.3852 5370 0.6662 - - -
0.3859 5380 0.6641 - - -
0.3866 5390 0.6498 - - -
0.3873 5400 0.6363 - - -
0.3881 5410 0.6324 - - -
0.3888 5420 0.6702 - - -
0.3895 5430 0.6308 - - -
0.3902 5440 0.6571 - - -
0.3909 5450 0.6408 - - -
0.3917 5460 0.6606 - - -
0.3924 5470 0.6572 - - -
0.3931 5480 0.624 - - -
0.3938 5490 0.5959 - - -
0.3945 5500 0.5782 - - -
0.3952 5510 0.6109 - - -
0.3960 5520 0.6388 - - -
0.3967 5530 0.7233 - - -
0.3974 5540 0.6029 - - -
0.3981 5550 0.6285 - - -
0.3988 5560 0.6243 - - -
0.3995 5570 0.6196 - - -
0.4003 5580 0.579 - - -
0.4010 5590 0.5563 - - -
0.4017 5600 0.5994 - - -
0.4024 5610 0.6865 - - -
0.4031 5620 0.6909 - - -
0.4038 5630 0.6201 - - -
0.4046 5640 0.5478 - - -
0.4053 5650 0.6121 - - -
0.4060 5660 0.5546 - - -
0.4067 5670 0.6614 - - -
0.4074 5680 0.657 - - -
0.4081 5690 0.6096 - - -
0.4089 5700 0.6035 - - -
0.4096 5710 0.605 - - -
0.4103 5720 0.5959 - - -
0.4110 5730 0.6696 - - -
0.4117 5740 0.5671 - - -
0.4125 5750 0.6159 - - -
0.4132 5760 0.64 - - -
0.4139 5770 0.5673 - - -
0.4146 5780 0.6086 - - -
0.4153 5790 0.5885 - - -
0.4160 5800 0.5774 - - -
0.4168 5810 0.6652 - - -
0.4175 5820 0.7244 - - -
0.4182 5830 0.6169 - - -
0.4189 5840 0.612 - - -
0.4196 5850 0.6474 - - -
0.4203 5860 0.6296 - - -
0.4211 5870 0.6413 - - -
0.4218 5880 0.5958 - - -
0.4225 5890 0.7245 - - -
0.4232 5900 0.6638 - - -
0.4239 5910 0.6368 - - -
0.4246 5920 0.4921 - - -
0.4254 5930 0.5735 - - -
0.4261 5940 0.6299 - - -
0.4268 5950 0.6276 - - -
0.4275 5960 0.6121 - - -
0.4282 5970 0.6351 - - -
0.4290 5980 0.6644 - - -
0.4297 5990 0.5185 - - -
0.4304 6000 0.6165 - - -
0.4311 6010 0.6107 - - -
0.4318 6020 0.5443 - - -
0.4325 6030 0.6167 - - -
0.4333 6040 0.58 - - -
0.4340 6050 0.5833 - - -
0.4347 6060 0.6181 - - -
0.4354 6070 0.574 - - -
0.4361 6080 0.5628 - - -
0.4368 6090 0.6423 - - -
0.4376 6100 0.6041 - - -
0.4383 6110 0.596 - - -
0.4390 6120 0.5613 - - -
0.4397 6130 0.5559 - - -
0.4404 6140 0.64 - - -
0.4411 6150 0.5376 - - -
0.4419 6160 0.5798 - - -
0.4426 6170 0.5994 - - -
0.4433 6180 0.6344 - - -
0.4440 6190 0.6665 - - -
0.4447 6200 0.4965 - - -
0.4454 6210 0.5973 - - -
0.4462 6220 0.5608 - - -
0.4469 6230 0.6146 - - -
0.4476 6240 0.5945 - - -
0.4483 6250 0.5044 - - -
0.4490 6260 0.6034 - - -
0.4498 6270 0.6141 - - -
0.4505 6280 0.5858 - - -
0.4512 6290 0.5974 - - -
0.4519 6300 0.6554 - - -
0.4526 6310 0.5202 - - -
0.4533 6320 0.5724 - - -
0.4541 6330 0.6492 - - -
0.4548 6340 0.5159 - - -
0.4555 6350 0.5591 - - -
0.4562 6360 0.6472 - - -
0.4569 6370 0.6088 - - -
0.4576 6380 0.6208 - - -
0.4584 6390 0.6376 - - -
0.4591 6400 0.572 - - -
0.4598 6410 0.5349 - - -
0.4605 6420 0.5964 - - -
0.4612 6430 0.5505 - - -
0.4619 6440 0.6732 - - -
0.4627 6450 0.6193 - - -
0.4634 6460 0.5998 - - -
0.4641 6470 0.6722 - - -
0.4648 6480 0.562 - - -
0.4655 6490 0.6163 - - -
0.4663 6500 0.6194 - - -
0.4670 6510 0.6307 - - -
0.4677 6520 0.6048 - - -
0.4684 6530 0.6088 - - -
0.4687 6534 - 0.8857 0.8003 -
0.4691 6540 0.5555 - - -
0.4698 6550 0.5695 - - -
0.4706 6560 0.5297 - - -
0.4713 6570 0.5887 - - -
0.4720 6580 0.5488 - - -
0.4727 6590 0.5199 - - -
0.4734 6600 0.6172 - - -
0.4741 6610 0.6275 - - -
0.4749 6620 0.552 - - -
0.4756 6630 0.5625 - - -
0.4763 6640 0.6341 - - -
0.4770 6650 0.624 - - -
0.4777 6660 0.5949 - - -
0.4784 6670 0.592 - - -
0.4792 6680 0.63 - - -
0.4799 6690 0.539 - - -
0.4806 6700 0.4893 - - -
0.4813 6710 0.5384 - - -
0.4820 6720 0.6262 - - -
0.4827 6730 0.5817 - - -
0.4835 6740 0.5142 - - -
0.4842 6750 0.6155 - - -
0.4849 6760 0.6046 - - -
0.4856 6770 0.5208 - - -
0.4863 6780 0.5693 - - -
0.4871 6790 0.5925 - - -
0.4878 6800 0.5335 - - -
0.4885 6810 0.5036 - - -
0.4892 6820 0.6136 - - -
0.4899 6830 0.5461 - - -
0.4906 6840 0.5427 - - -
0.4914 6850 0.6388 - - -
0.4921 6860 0.5486 - - -
0.4928 6870 0.5807 - - -
0.4935 6880 0.5927 - - -
0.4942 6890 0.5138 - - -
0.4949 6900 0.5692 - - -
0.4957 6910 0.512 - - -
0.4964 6920 0.6018 - - -
0.4971 6930 0.5529 - - -
0.4978 6940 0.5709 - - -
0.4985 6950 0.5649 - - -
0.4992 6960 0.6347 - - -
0.5000 6970 0.6457 - - -
0.5007 6980 0.493 - - -
0.5014 6990 0.5897 - - -
0.5021 7000 0.5963 - - -
0.5028 7010 0.5828 - - -
0.5036 7020 0.4462 - - -
0.5043 7030 0.5167 - - -
0.5050 7040 0.6401 - - -
0.5057 7050 0.5416 - - -
0.5064 7060 0.5534 - - -
0.5071 7070 0.5706 - - -
0.5079 7080 0.6137 - - -
0.5086 7090 0.5938 - - -
0.5093 7100 0.5648 - - -
0.5100 7110 0.4971 - - -
0.5107 7120 0.604 - - -
0.5114 7130 0.5265 - - -
0.5122 7140 0.6485 - - -
0.5129 7150 0.4953 - - -
0.5136 7160 0.556 - - -
0.5143 7170 0.5575 - - -
0.5150 7180 0.5176 - - -
0.5157 7190 0.5259 - - -
0.5165 7200 0.6056 - - -
0.5172 7210 0.6033 - - -
0.5179 7220 0.5388 - - -
0.5186 7230 0.4771 - - -
0.5193 7240 0.5286 - - -
0.5200 7250 0.522 - - -
0.5208 7260 0.5192 - - -
0.5215 7270 0.4642 - - -
0.5222 7280 0.5815 - - -
0.5229 7290 0.5689 - - -
0.5236 7300 0.5474 - - -
0.5244 7310 0.5578 - - -
0.5251 7320 0.5625 - - -
0.5258 7330 0.5808 - - -
0.5265 7340 0.4624 - - -
0.5272 7350 0.5725 - - -
0.5279 7360 0.5179 - - -
0.5287 7370 0.5718 - - -
0.5294 7380 0.5296 - - -
0.5301 7390 0.6168 - - -
0.5308 7400 0.5107 - - -
0.5315 7410 0.4983 - - -
0.5322 7420 0.5214 - - -
0.5330 7430 0.6323 - - -
0.5337 7440 0.5855 - - -
0.5344 7450 0.4729 - - -
0.5351 7460 0.4938 - - -
0.5358 7470 0.553 - - -
0.5365 7480 0.5566 - - -
0.5373 7490 0.5617 - - -
0.5380 7500 0.5156 - - -
0.5387 7510 0.4875 - - -
0.5394 7520 0.5156 - - -
0.5401 7530 0.4714 - - -
0.5409 7540 0.4578 - - -
0.5416 7550 0.5131 - - -
0.5423 7560 0.5552 - - -
0.5430 7570 0.5136 - - -
0.5437 7580 0.4777 - - -
0.5444 7590 0.5379 - - -
0.5452 7600 0.6732 - - -
0.5459 7610 0.5154 - - -
0.5466 7620 0.5178 - - -
0.5473 7630 0.5495 - - -
0.5480 7640 0.5128 - - -
0.5487 7650 0.5204 - - -
0.5495 7660 0.5271 - - -
0.5502 7670 0.5384 - - -
0.5509 7680 0.6433 - - -
0.5516 7690 0.4918 - - -
0.5523 7700 0.4363 - - -
0.5530 7710 0.5323 - - -
0.5538 7720 0.5474 - - -
0.5545 7730 0.5893 - - -
0.5552 7740 0.5152 - - -
0.5559 7750 0.508 - - -
0.5566 7760 0.5241 - - -
0.5573 7770 0.533 - - -
0.5581 7780 0.5551 - - -
0.5588 7790 0.5938 - - -
0.5595 7800 0.5727 - - -
0.5602 7810 0.5684 - - -
0.5609 7820 0.4587 - - -
0.5617 7830 0.5074 - - -
0.5624 7840 0.5415 - - -
0.5631 7850 0.5848 - - -
0.5638 7860 0.5041 - - -
0.5645 7870 0.5171 - - -
0.5652 7880 0.4854 - - -
0.5660 7890 0.4966 - - -
0.5667 7900 0.5987 - - -
0.5674 7910 0.4858 - - -
0.5681 7920 0.5706 - - -
0.5688 7930 0.5498 - - -
0.5695 7940 0.5368 - - -
0.5703 7950 0.4902 - - -
0.5710 7960 0.5596 - - -
0.5717 7970 0.5445 - - -
0.5724 7980 0.5594 - - -
0.5731 7990 0.4938 - - -
0.5738 8000 0.5387 - - -
0.5746 8010 0.5252 - - -
0.5753 8020 0.4954 - - -
0.5760 8030 0.5151 - - -
0.5767 8040 0.4807 - - -
0.5774 8050 0.529 - - -
0.5782 8060 0.5128 - - -
0.5789 8070 0.5226 - - -
0.5796 8080 0.5176 - - -
0.5803 8090 0.5473 - - -
0.5810 8100 0.4694 - - -
0.5817 8110 0.6168 - - -
0.5825 8120 0.5293 - - -
0.5832 8130 0.4965 - - -
0.5839 8140 0.5242 - - -
0.5846 8150 0.544 - - -
0.5853 8160 0.5377 - - -
0.5860 8170 0.4485 - - -
0.5868 8180 0.5069 - - -
0.5875 8190 0.5158 - - -
0.5882 8200 0.4719 - - -
0.5889 8210 0.51 - - -
0.5896 8220 0.5616 - - -
0.5903 8230 0.5371 - - -
0.5911 8240 0.5196 - - -
0.5918 8250 0.5522 - - -
0.5925 8260 0.5521 - - -
0.5932 8270 0.5324 - - -
0.5939 8280 0.5127 - - -
0.5946 8290 0.5323 - - -
0.5954 8300 0.537 - - -
0.5961 8310 0.4763 - - -
0.5968 8320 0.5378 - - -
0.5975 8330 0.5467 - - -
0.5982 8340 0.4638 - - -
0.5990 8350 0.4969 - - -
0.5997 8360 0.4733 - - -
0.6004 8370 0.5341 - - -
0.6011 8380 0.5129 - - -
0.6018 8390 0.5206 - - -
0.6025 8400 0.5235 - - -
0.6033 8410 0.5429 - - -
0.6040 8420 0.5091 - - -
0.6047 8430 0.4689 - - -
0.6054 8440 0.4554 - - -
0.6061 8450 0.5019 - - -
0.6068 8460 0.5153 - - -
0.6076 8470 0.6021 - - -
0.6083 8480 0.4634 - - -
0.6090 8490 0.4937 - - -
0.6097 8500 0.5063 - - -
0.6104 8510 0.516 - - -
0.6111 8520 0.5554 - - -
0.6119 8530 0.5347 - - -
0.6126 8540 0.5134 - - -
0.6133 8550 0.5225 - - -
0.6140 8560 0.4929 - - -
0.6147 8570 0.5203 - - -
0.6155 8580 0.497 - - -
0.6162 8590 0.5288 - - -
0.6169 8600 0.5209 - - -
0.6176 8610 0.507 - - -
0.6183 8620 0.5293 - - -
0.6190 8630 0.5167 - - -
0.6198 8640 0.5352 - - -
0.6205 8650 0.5578 - - -
0.6212 8660 0.5579 - - -
0.6219 8670 0.5713 - - -
0.6226 8680 0.4888 - - -
0.6233 8690 0.4801 - - -
0.6241 8700 0.4636 - - -
0.6248 8710 0.5466 - - -
0.6249 8712 - 0.8224 0.8066 -
0.6255 8720 0.5104 - - -
0.6262 8730 0.5216 - - -
0.6269 8740 0.5057 - - -
0.6276 8750 0.5063 - - -
0.6284 8760 0.4317 - - -
0.6291 8770 0.5023 - - -
0.6298 8780 0.496 - - -
0.6305 8790 0.5099 - - -
0.6312 8800 0.5356 - - -
0.6319 8810 0.5398 - - -
0.6327 8820 0.476 - - -
0.6334 8830 0.4993 - - -
0.6341 8840 0.4822 - - -
0.6348 8850 0.4718 - - -
0.6355 8860 0.4758 - - -
0.6363 8870 0.5065 - - -
0.6370 8880 0.5102 - - -
0.6377 8890 0.4447 - - -
0.6384 8900 0.546 - - -
0.6391 8910 0.5625 - - -
0.6398 8920 0.5806 - - -
0.6406 8930 0.514 - - -
0.6413 8940 0.4845 - - -
0.6420 8950 0.4617 - - -
0.6427 8960 0.5025 - - -
0.6434 8970 0.5104 - - -
0.6441 8980 0.482 - - -
0.6449 8990 0.4998 - - -
0.6456 9000 0.4729 - - -
0.6463 9010 0.5519 - - -
0.6470 9020 0.5465 - - -
0.6477 9030 0.4455 - - -
0.6484 9040 0.4705 - - -
0.6492 9050 0.4455 - - -
0.6499 9060 0.5809 - - -
0.6506 9070 0.4515 - - -
0.6513 9080 0.5331 - - -
0.6520 9090 0.4467 - - -
0.6528 9100 0.4606 - - -
0.6535 9110 0.5086 - - -
0.6542 9120 0.4944 - - -
0.6549 9130 0.5332 - - -
0.6556 9140 0.5077 - - -
0.6563 9150 0.4812 - - -
0.6571 9160 0.5507 - - -
0.6578 9170 0.4836 - - -
0.6585 9180 0.4651 - - -
0.6592 9190 0.5284 - - -
0.6599 9200 0.4636 - - -
0.6606 9210 0.5041 - - -
0.6614 9220 0.418 - - -
0.6621 9230 0.4231 - - -
0.6628 9240 0.4677 - - -
0.6635 9250 0.523 - - -
0.6642 9260 0.495 - - -
0.6649 9270 0.4729 - - -
0.6657 9280 0.4469 - - -
0.6664 9290 0.4068 - - -
0.6671 9300 0.4954 - - -
0.6678 9310 0.4387 - - -
0.6685 9320 0.4104 - - -
0.6692 9330 0.4212 - - -
0.6700 9340 0.5106 - - -
0.6707 9350 0.4879 - - -
0.6714 9360 0.4934 - - -
0.6721 9370 0.5037 - - -
0.6728 9380 0.4081 - - -
0.6736 9390 0.4473 - - -
0.6743 9400 0.4908 - - -
0.6750 9410 0.5625 - - -
0.6757 9420 0.5785 - - -
0.6764 9430 0.5401 - - -
0.6771 9440 0.4242 - - -
0.6779 9450 0.4898 - - -
0.6786 9460 0.435 - - -
0.6793 9470 0.4314 - - -
0.6800 9480 0.4553 - - -
0.6807 9490 0.5175 - - -
0.6814 9500 0.4092 - - -
0.6822 9510 0.4513 - - -
0.6829 9520 0.4797 - - -
0.6836 9530 0.5147 - - -
0.6843 9540 0.4793 - - -
0.6850 9550 0.4406 - - -
0.6857 9560 0.4882 - - -
0.6865 9570 0.5474 - - -
0.6872 9580 0.5141 - - -
0.6879 9590 0.4602 - - -
0.6886 9600 0.4935 - - -
0.6893 9610 0.4411 - - -
0.6901 9620 0.4962 - - -
0.6908 9630 0.4408 - - -
0.6915 9640 0.5038 - - -
0.6922 9650 0.4907 - - -
0.6929 9660 0.4232 - - -
0.6936 9670 0.4522 - - -
0.6944 9680 0.41 - - -
0.6951 9690 0.4946 - - -
0.6958 9700 0.4476 - - -
0.6965 9710 0.516 - - -
0.6972 9720 0.4733 - - -
0.6979 9730 0.4614 - - -
0.6987 9740 0.3978 - - -
0.6994 9750 0.5053 - - -
0.7001 9760 0.5595 - - -
0.7008 9770 0.4486 - - -
0.7015 9780 0.5005 - - -
0.7022 9790 0.4225 - - -
0.7030 9800 0.5013 - - -
0.7037 9810 0.4477 - - -
0.7044 9820 0.4701 - - -
0.7051 9830 0.49 - - -
0.7058 9840 0.4581 - - -
0.7065 9850 0.4411 - - -
0.7073 9860 0.523 - - -
0.7080 9870 0.5018 - - -
0.7087 9880 0.4369 - - -
0.7094 9890 0.4481 - - -
0.7101 9900 0.4865 - - -
0.7109 9910 0.4814 - - -
0.7116 9920 0.5155 - - -
0.7123 9930 0.411 - - -
0.7130 9940 0.4613 - - -
0.7137 9950 0.444 - - -
0.7144 9960 0.4229 - - -
0.7152 9970 0.3818 - - -
0.7159 9980 0.4605 - - -
0.7166 9990 0.4462 - - -
0.7173 10000 0.4238 - - -
0.7180 10010 0.434 - - -
0.7187 10020 0.4647 - - -
0.7195 10030 0.4578 - - -
0.7202 10040 0.424 - - -
0.7209 10050 0.4705 - - -
0.7216 10060 0.4445 - - -
0.7223 10070 0.5135 - - -
0.7230 10080 0.41 - - -
0.7238 10090 0.44 - - -
0.7245 10100 0.4204 - - -
0.7252 10110 0.4657 - - -
0.7259 10120 0.468 - - -
0.7266 10130 0.4964 - - -
0.7274 10140 0.4993 - - -
0.7281 10150 0.4351 - - -
0.7288 10160 0.4347 - - -
0.7295 10170 0.3381 - - -
0.7302 10180 0.5415 - - -
0.7309 10190 0.4001 - - -
0.7317 10200 0.5359 - - -
0.7324 10210 0.4191 - - -
0.7331 10220 0.5139 - - -
0.7338 10230 0.4038 - - -
0.7345 10240 0.4754 - - -
0.7352 10250 0.4629 - - -
0.7360 10260 0.4581 - - -
0.7367 10270 0.3917 - - -
0.7374 10280 0.4575 - - -
0.7381 10290 0.4549 - - -
0.7388 10300 0.5175 - - -
0.7395 10310 0.4308 - - -
0.7403 10320 0.4396 - - -
0.7410 10330 0.506 - - -
0.7417 10340 0.5001 - - -
0.7424 10350 0.4761 - - -
0.7431 10360 0.4762 - - -
0.7438 10370 0.4796 - - -
0.7446 10380 0.4115 - - -
0.7453 10390 0.42 - - -
0.7460 10400 0.4872 - - -
0.7467 10410 0.4558 - - -
0.7474 10420 0.4399 - - -
0.7482 10430 0.4483 - - -
0.7489 10440 0.4172 - - -
0.7496 10450 0.4452 - - -
0.7503 10460 0.4153 - - -
0.7510 10470 0.4721 - - -
0.7517 10480 0.4941 - - -
0.7525 10490 0.4715 - - -
0.7532 10500 0.498 - - -
0.7539 10510 0.3506 - - -
0.7546 10520 0.4197 - - -
0.7553 10530 0.4125 - - -
0.7560 10540 0.4328 - - -
0.7568 10550 0.5102 - - -
0.7575 10560 0.3991 - - -
0.7582 10570 0.4378 - - -
0.7589 10580 0.466 - - -
0.7596 10590 0.4312 - - -
0.7603 10600 0.4249 - - -
0.7611 10610 0.3792 - - -
0.7618 10620 0.5206 - - -
0.7625 10630 0.5168 - - -
0.7632 10640 0.481 - - -
0.7639 10650 0.4632 - - -
0.7647 10660 0.3409 - - -
0.7654 10670 0.4891 - - -
0.7661 10680 0.4246 - - -
0.7668 10690 0.4763 - - -
0.7675 10700 0.4815 - - -
0.7682 10710 0.399 - - -
0.7690 10720 0.4272 - - -
0.7697 10730 0.4586 - - -
0.7704 10740 0.4231 - - -
0.7711 10750 0.3829 - - -
0.7718 10760 0.5181 - - -
0.7725 10770 0.4894 - - -
0.7733 10780 0.5098 - - -
0.7740 10790 0.5215 - - -
0.7747 10800 0.4637 - - -
0.7754 10810 0.3917 - - -
0.7761 10820 0.3441 - - -
0.7768 10830 0.4249 - - -
0.7776 10840 0.4442 - - -
0.7783 10850 0.4463 - - -
0.7790 10860 0.3829 - - -
0.7797 10870 0.4264 - - -
0.7804 10880 0.4512 - - -
0.7811 10890 0.3809 0.7688 0.8069 -
0.7819 10900 0.4447 - - -
0.7826 10910 0.4561 - - -
0.7833 10920 0.4434 - - -
0.7840 10930 0.3988 - - -
0.7847 10940 0.4622 - - -
0.7855 10950 0.4783 - - -
0.7862 10960 0.4139 - - -
0.7869 10970 0.4589 - - -
0.7876 10980 0.4697 - - -
0.7883 10990 0.443 - - -
0.7890 11000 0.4469 - - -
0.7898 11010 0.4428 - - -
0.7905 11020 0.4773 - - -
0.7912 11030 0.464 - - -
0.7919 11040 0.4289 - - -
0.7926 11050 0.4404 - - -
0.7933 11060 0.4808 - - -
0.7941 11070 0.4378 - - -
0.7948 11080 0.5612 - - -
0.7955 11090 0.4256 - - -
0.7962 11100 0.4124 - - -
0.7969 11110 0.3992 - - -
0.7976 11120 0.4104 - - -
0.7984 11130 0.5023 - - -
0.7991 11140 0.4849 - - -
0.7998 11150 0.4924 - - -
0.8005 11160 0.4554 - - -
0.8012 11170 0.4139 - - -
0.8020 11180 0.4228 - - -
0.8027 11190 0.4376 - - -
0.8034 11200 0.4341 - - -
0.8041 11210 0.4174 - - -
0.8048 11220 0.4935 - - -
0.8055 11230 0.4719 - - -
0.8063 11240 0.4347 - - -
0.8070 11250 0.4232 - - -
0.8077 11260 0.4341 - - -
0.8084 11270 0.4756 - - -
0.8091 11280 0.4161 - - -
0.8098 11290 0.4281 - - -
0.8106 11300 0.4067 - - -
0.8113 11310 0.4053 - - -
0.8120 11320 0.5421 - - -
0.8127 11330 0.4303 - - -
0.8134 11340 0.458 - - -
0.8141 11350 0.4362 - - -
0.8149 11360 0.3936 - - -
0.8156 11370 0.437 - - -
0.8163 11380 0.3638 - - -
0.8170 11390 0.4194 - - -
0.8177 11400 0.4276 - - -
0.8184 11410 0.4101 - - -
0.8192 11420 0.458 - - -
0.8199 11430 0.442 - - -
0.8206 11440 0.4359 - - -
0.8213 11450 0.4534 - - -
0.8220 11460 0.4688 - - -
0.8228 11470 0.4188 - - -
0.8235 11480 0.4292 - - -
0.8242 11490 0.4718 - - -
0.8249 11500 0.3786 - - -
0.8256 11510 0.4351 - - -
0.8263 11520 0.3869 - - -
0.8271 11530 0.4331 - - -
0.8278 11540 0.4696 - - -
0.8285 11550 0.4595 - - -
0.8292 11560 0.4335 - - -
0.8299 11570 0.4069 - - -
0.8306 11580 0.45 - - -
0.8314 11590 0.49 - - -
0.8321 11600 0.4664 - - -
0.8328 11610 0.4436 - - -
0.8335 11620 0.559 - - -
0.8342 11630 0.4535 - - -
0.8349 11640 0.4646 - - -
0.8357 11650 0.3551 - - -
0.8364 11660 0.4389 - - -
0.8371 11670 0.4897 - - -
0.8378 11680 0.4559 - - -
0.8385 11690 0.4397 - - -
0.8393 11700 0.3907 - - -
0.8400 11710 0.5208 - - -
0.8407 11720 0.4463 - - -
0.8414 11730 0.4028 - - -
0.8421 11740 0.3576 - - -
0.8428 11750 0.4087 - - -
0.8436 11760 0.3917 - - -
0.8443 11770 0.5077 - - -
0.8450 11780 0.4082 - - -
0.8457 11790 0.474 - - -
0.8464 11800 0.4174 - - -
0.8471 11810 0.3767 - - -
0.8479 11820 0.4821 - - -
0.8486 11830 0.4108 - - -
0.8493 11840 0.3639 - - -
0.8500 11850 0.4432 - - -
0.8507 11860 0.3886 - - -
0.8514 11870 0.4779 - - -
0.8522 11880 0.404 - - -
0.8529 11890 0.4651 - - -
0.8536 11900 0.4695 - - -
0.8543 11910 0.3713 - - -
0.8550 11920 0.4525 - - -
0.8557 11930 0.4531 - - -
0.8565 11940 0.4862 - - -
0.8572 11950 0.4772 - - -
0.8579 11960 0.424 - - -
0.8586 11970 0.3684 - - -
0.8593 11980 0.3982 - - -
0.8601 11990 0.4637 - - -
0.8608 12000 0.4752 - - -
0.8615 12010 0.4145 - - -
0.8622 12020 0.4236 - - -
0.8629 12030 0.4051 - - -
0.8636 12040 0.4203 - - -
0.8644 12050 0.3692 - - -
0.8651 12060 0.394 - - -
0.8658 12070 0.4427 - - -
0.8665 12080 0.4394 - - -
0.8672 12090 0.3222 - - -
0.8679 12100 0.3932 - - -
0.8687 12110 0.4082 - - -
0.8694 12120 0.4096 - - -
0.8701 12130 0.3993 - - -
0.8708 12140 0.4015 - - -
0.8715 12150 0.3619 - - -
0.8722 12160 0.4456 - - -
0.8730 12170 0.4571 - - -
0.8737 12180 0.4556 - - -
0.8744 12190 0.4558 - - -
0.8751 12200 0.4459 - - -
0.8758 12210 0.4515 - - -
0.8766 12220 0.4004 - - -
0.8773 12230 0.4654 - - -
0.8780 12240 0.4023 - - -
0.8787 12250 0.4333 - - -
0.8794 12260 0.4455 - - -
0.8801 12270 0.4145 - - -
0.8809 12280 0.3526 - - -
0.8816 12290 0.411 - - -
0.8823 12300 0.374 - - -
0.8830 12310 0.4718 - - -
0.8837 12320 0.401 - - -
0.8844 12330 0.4038 - - -
0.8852 12340 0.4706 - - -
0.8859 12350 0.4091 - - -
0.8866 12360 0.4187 - - -
0.8873 12370 0.4024 - - -
0.8880 12380 0.3695 - - -
0.8887 12390 0.3591 - - -
0.8895 12400 0.3335 - - -
0.8902 12410 0.3606 - - -
0.8909 12420 0.3709 - - -
0.8916 12430 0.3524 - - -
0.8923 12440 0.294 - - -
0.8930 12450 0.4481 - - -
0.8938 12460 0.402 - - -
0.8945 12470 0.4377 - - -
0.8952 12480 0.3799 - - -
0.8959 12490 0.431 - - -
0.8966 12500 0.4675 - - -
0.8974 12510 0.4373 - - -
0.8981 12520 0.314 - - -
0.8988 12530 0.4401 - - -
0.8995 12540 0.4617 - - -
0.9002 12550 0.4755 - - -
0.9009 12560 0.4238 - - -
0.9017 12570 0.3653 - - -
0.9024 12580 0.3693 - - -
0.9031 12590 0.4351 - - -
0.9038 12600 0.3866 - - -
0.9045 12610 0.4536 - - -
0.9052 12620 0.5173 - - -
0.9060 12630 0.5105 - - -
0.9067 12640 0.3794 - - -
0.9074 12650 0.404 - - -
0.9081 12660 0.4053 - - -
0.9088 12670 0.4064 - - -
0.9095 12680 0.4395 - - -
0.9103 12690 0.3785 - - -
0.9110 12700 0.4262 - - -
0.9117 12710 0.3822 - - -
0.9124 12720 0.3707 - - -
0.9131 12730 0.3777 - - -
0.9139 12740 0.4349 - - -
0.9146 12750 0.3646 - - -
0.9153 12760 0.4035 - - -
0.9160 12770 0.3438 - - -
0.9167 12780 0.337 - - -
0.9174 12790 0.3754 - - -
0.9182 12800 0.4289 - - -
0.9189 12810 0.467 - - -
0.9196 12820 0.4256 - - -
0.9203 12830 0.3679 - - -
0.9210 12840 0.3465 - - -
0.9217 12850 0.3643 - - -
0.9225 12860 0.354 - - -
0.9232 12870 0.4019 - - -
0.9239 12880 0.3729 - - -
0.9246 12890 0.3746 - - -
0.9253 12900 0.4627 - - -
0.9260 12910 0.3951 - - -
0.9268 12920 0.3693 - - -
0.9275 12930 0.4226 - - -
0.9282 12940 0.3942 - - -
0.9289 12950 0.4397 - - -
0.9296 12960 0.4165 - - -
0.9303 12970 0.3927 - - -
0.9311 12980 0.3793 - - -
0.9318 12990 0.4103 - - -
0.9325 13000 0.3728 - - -
0.9332 13010 0.3846 - - -
0.9339 13020 0.4065 - - -
0.9347 13030 0.3783 - - -
0.9354 13040 0.3783 - - -
0.9361 13050 0.4102 - - -
0.9368 13060 0.3836 - - -
0.9374 13068 - 0.7355 0.8072 -
0.9375 13070 0.3368 - - -
0.9382 13080 0.4058 - - -
0.9390 13090 0.5002 - - -
0.9397 13100 0.3993 - - -
0.9404 13110 0.4017 - - -
0.9411 13120 0.4148 - - -
0.9418 13130 0.3592 - - -
0.9425 13140 0.3963 - - -
0.9433 13150 0.3987 - - -
0.9440 13160 0.419 - - -
0.9447 13170 0.4185 - - -
0.9454 13180 0.4298 - - -
0.9461 13190 0.3671 - - -
0.9468 13200 0.3923 - - -
0.9476 13210 0.3999 - - -
0.9483 13220 0.3333 - - -
0.9490 13230 0.4123 - - -
0.9497 13240 0.4005 - - -
0.9504 13250 0.4017 - - -
0.9512 13260 0.3607 - - -
0.9519 13270 0.4122 - - -
0.9526 13280 0.3681 - - -
0.9533 13290 0.3403 - - -
0.9540 13300 0.4423 - - -
0.9547 13310 0.4315 - - -
0.9555 13320 0.4148 - - -
0.9562 13330 0.4654 - - -
0.9569 13340 0.4161 - - -
0.9576 13350 0.3704 - - -
0.9583 13360 0.3941 - - -
0.9590 13370 0.3545 - - -
0.9598 13380 0.3683 - - -
0.9605 13390 0.3908 - - -
0.9612 13400 0.3988 - - -
0.9619 13410 0.4066 - - -
0.9626 13420 0.413 - - -
0.9633 13430 0.427 - - -
0.9641 13440 0.4173 - - -
0.9648 13450 0.3884 - - -
0.9655 13460 0.3745 - - -
0.9662 13470 0.3937 - - -
0.9669 13480 0.3334 - - -
0.9676 13490 0.3882 - - -
0.9684 13500 0.3987 - - -
0.9691 13510 0.4433 - - -
0.9698 13520 0.4267 - - -
0.9705 13530 0.4004 - - -
0.9712 13540 0.4384 - - -
0.9720 13550 0.4509 - - -
0.9727 13560 0.4531 - - -
0.9734 13570 0.3745 - - -
0.9741 13580 0.4051 - - -
0.9748 13590 0.3827 - - -
0.9755 13600 0.3747 - - -
0.9763 13610 0.3756 - - -
0.9770 13620 0.3712 - - -
0.9777 13630 0.4406 - - -
0.9784 13640 0.3874 - - -
0.9791 13650 0.3802 - - -
0.9798 13660 0.3969 - - -
0.9806 13670 0.418 - - -
0.9813 13680 0.4128 - - -
0.9820 13690 0.4676 - - -
0.9827 13700 0.3834 - - -
0.9834 13710 0.3687 - - -
0.9841 13720 0.4016 - - -
0.9849 13730 0.4197 - - -
0.9856 13740 0.3992 - - -
0.9863 13750 0.4337 - - -
0.9870 13760 0.3324 - - -
0.9877 13770 0.3119 - - -
0.9885 13780 0.3704 - - -
0.9892 13790 0.3604 - - -
0.9899 13800 0.4234 - - -
0.9906 13810 0.4262 - - -
0.9913 13820 0.4042 - - -
0.9920 13830 0.4085 - - -
0.9928 13840 0.4015 - - -
0.9935 13850 0.3746 - - -
0.9942 13860 0.451 - - -
0.9949 13870 0.4334 - - -
0.9956 13880 0.4162 - - -
0.9963 13890 0.4029 - - -
0.9971 13900 0.427 - - -
0.9978 13910 0.3278 - - -
0.9985 13920 0.4409 - - -
0.9992 13930 0.3603 - - -
0.9999 13940 0.4769 - - -
1.0 13941 - - - 0.7611
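
The validation columns above report cosine-based correlations; the similarity function itself is plain cosine similarity between the 1024-dimensional sentence embeddings. A minimal, model-free sketch of that computation (using numpy; the vectors here are random stand-ins for real embeddings, not output of this model):

```python
import numpy as np

def cosine_similarity_matrix(embeddings: np.ndarray) -> np.ndarray:
    """Pairwise cosine similarity: normalize each row, then take dot products."""
    normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    return normed @ normed.T

# Random stand-ins for three 1024-dimensional sentence embeddings
rng = np.random.default_rng(0)
emb = rng.standard_normal((3, 1024))

sims = cosine_similarity_matrix(emb)
print(sims.shape)  # (3, 3); diagonal entries are 1.0 (each vector vs. itself)
```

This mirrors what `model.similarity(embeddings, embeddings)` returns in the usage example when the similarity function is cosine.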

Framework Versions

  • Python: 3.11.2
  • Sentence Transformers: 3.3.1
  • Transformers: 4.46.1
  • PyTorch: 2.4.0+cu121
  • Accelerate: 1.0.1
  • Datasets: 3.1.0
  • Tokenizers: 0.20.3
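
To reproduce this environment, the versions above can be pinned in a single install command (a sketch; the CUDA-specific PyTorch wheel, `2.4.0+cu121`, may require your platform's extra index URL):

```shell
pip install \
  "sentence-transformers==3.3.1" \
  "transformers==4.46.1" \
  "torch==2.4.0" \
  "accelerate==1.0.1" \
  "datasets==3.1.0" \
  "tokenizers==0.20.3"
```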

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

GISTEmbedLoss

@misc{solatorio2024gistembed,
    title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning},
    author={Aivin V. Solatorio},
    year={2024},
    eprint={2402.16829},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}