---
library_name: transformers
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: ViT_bean_leaves_model
    results: []
---

ViT_bean_leaves_model

This model is a fine-tuned version of google/vit-base-patch16-224-in21k on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0320
  • Accuracy: 1.0
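
The card does not yet document usage, but since this is an image-classification checkpoint, a minimal inference sketch with the Transformers Auto classes might look like the following. The hub id `tonyducks/ViT_bean_leaves_model` and the file name `leaf.jpg` are assumptions, not taken from the card; point them at the actual checkpoint and image.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Hypothetical repo id; substitute the real hub path or a local checkpoint directory.
model_id = "tonyducks/ViT_bean_leaves_model"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("leaf.jpg").convert("RGB")  # any bean-leaf photo
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```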

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

  • learning_rate: 3e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: adamw_torch_fused with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 4
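
As a rough illustration only, the hyperparameters above map onto a `TrainingArguments` configuration along these lines; the `output_dir`, evaluation cadence, and logging settings are inferred or assumed rather than documented in the card.

```python
from transformers import TrainingArguments

# Sketch reproducing the listed hyperparameters; paths and step intervals
# are placeholders / inferences, not values stated in the card.
training_args = TrainingArguments(
    output_dir="ViT_bean_leaves_model",
    learning_rate=3e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch_fused",   # AdamW (fused) with betas=(0.9, 0.999), eps=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=4,
    eval_strategy="steps",       # the results table logs an evaluation every 10 steps
    eval_steps=10,
    logging_steps=10,
)
```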

Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 1.0335        | 0.0769 | 10   | 0.9775          | 0.6165   |
| 0.8744        | 0.1538 | 20   | 0.8419          | 0.8647   |
| 0.7576        | 0.2308 | 30   | 0.6803          | 0.9248   |
| 0.5918        | 0.3077 | 40   | 0.5376          | 0.9624   |
| 0.4833        | 0.3846 | 50   | 0.4257          | 0.9699   |
| 0.3873        | 0.4615 | 60   | 0.3395          | 0.9624   |
| 0.3508        | 0.5385 | 70   | 0.2656          | 0.9774   |
| 0.3008        | 0.6154 | 80   | 0.2369          | 0.9549   |
| 0.2282        | 0.6923 | 90   | 0.1895          | 0.9925   |
| 0.204         | 0.7692 | 100  | 0.1596          | 0.9699   |
| 0.213         | 0.8462 | 110  | 0.1350          | 0.9925   |
| 0.1618        | 0.9231 | 120  | 0.1679          | 0.9624   |
| 0.1657        | 1.0    | 130  | 0.1061          | 0.9925   |
| 0.0997        | 1.0769 | 140  | 0.1020          | 0.9925   |
| 0.1185        | 1.1538 | 150  | 0.0892          | 0.9925   |
| 0.1329        | 1.2308 | 160  | 0.0903          | 1.0      |
| 0.0671        | 1.3077 | 170  | 0.0767          | 1.0      |
| 0.0634        | 1.3846 | 180  | 0.0696          | 0.9925   |
| 0.0618        | 1.4615 | 190  | 0.0631          | 1.0      |
| 0.0896        | 1.5385 | 200  | 0.0687          | 0.9925   |
| 0.0519        | 1.6154 | 210  | 0.0641          | 0.9925   |
| 0.052         | 1.6923 | 220  | 0.0580          | 0.9925   |
| 0.049         | 1.7692 | 230  | 0.0707          | 0.9925   |
| 0.1221        | 1.8462 | 240  | 0.0723          | 0.9925   |
| 0.0798        | 1.9231 | 250  | 0.0645          | 0.9850   |
| 0.043         | 2.0    | 260  | 0.0590          | 0.9925   |
| 0.0461        | 2.0769 | 270  | 0.0549          | 0.9850   |
| 0.0391        | 2.1538 | 280  | 0.0626          | 0.9925   |
| 0.0368        | 2.2308 | 290  | 0.0612          | 0.9925   |
| 0.0357        | 2.3077 | 300  | 0.0510          | 0.9925   |
| 0.035         | 2.3846 | 310  | 0.0448          | 1.0      |
| 0.0339        | 2.4615 | 320  | 0.0437          | 1.0      |
| 0.0352        | 2.5385 | 330  | 0.0454          | 0.9925   |
| 0.0315        | 2.6154 | 340  | 0.0457          | 0.9925   |
| 0.1151        | 2.6923 | 350  | 0.0390          | 1.0      |
| 0.0324        | 2.7692 | 360  | 0.0386          | 1.0      |
| 0.0298        | 2.8462 | 370  | 0.0370          | 1.0      |
| 0.0316        | 2.9231 | 380  | 0.0363          | 1.0      |
| 0.0295        | 3.0    | 390  | 0.0363          | 1.0      |
| 0.0284        | 3.0769 | 400  | 0.0363          | 1.0      |
| 0.0278        | 3.1538 | 410  | 0.0357          | 1.0      |
| 0.0296        | 3.2308 | 420  | 0.0352          | 1.0      |
| 0.0299        | 3.3077 | 430  | 0.0348          | 1.0      |
| 0.0277        | 3.3846 | 440  | 0.0351          | 1.0      |
| 0.0301        | 3.4615 | 450  | 0.0331          | 1.0      |
| 0.0281        | 3.5385 | 460  | 0.0328          | 1.0      |
| 0.0279        | 3.6154 | 470  | 0.0325          | 1.0      |
| 0.0283        | 3.6923 | 480  | 0.0324          | 1.0      |
| 0.0278        | 3.7692 | 490  | 0.0323          | 1.0      |
| 0.0286        | 3.8462 | 500  | 0.0322          | 1.0      |
| 0.0267        | 3.9231 | 510  | 0.0321          | 1.0      |
| 0.0276        | 4.0    | 520  | 0.0320          | 1.0      |
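
The Accuracy column is the metric reported back to the Trainer during evaluation. A minimal sketch of a `compute_metrics` callback that would produce it, assuming the `evaluate` library (not listed in the framework versions below), is:

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # eval_pred provides (logits, labels) from the Trainer's evaluation loop
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```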

Framework versions

  • Transformers 4.57.2
  • PyTorch 2.9.0+cu126
  • Datasets 4.0.0
  • Tokenizers 0.22.1