---
base_model:
  - Retreatcost/KansenSakura-Radiance-RP-12b
  - Vortex5/Lunar-Nexus-12B
  - Vortex5/Shadow-Crystal-12B
library_name: transformers
tags:
  - mergekit
  - merge
  - roleplay
---

# Radiant-Shadow-12B

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

📒 Notes: I had some issues with the ChatML instruction template; the Mistral V7 template works well instead.
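
Below is a minimal loading-and-generation sketch with 🤗 Transformers. The Hub repo id `Vortex5/Radiant-Shadow-12B` is assumed from this card's title, and the tokenizer's built-in chat template is used as a stand-in for the Mistral V7 prompt format mentioned above.

```python
# Quick-start sketch (repo id assumed; adjust sampling settings to taste).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Vortex5/Radiant-Shadow-12B"  # assumed Hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the merge was produced in bfloat16
    device_map="auto",
)

messages = [{"role": "user", "content": "Introduce yourself in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```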

## Merge Details

### Merge Method

This model was merged using the Passthrough merge method, which concatenates layer ranges from the source models end-to-end rather than averaging their weights.

### Models Merged

The following models were included in the merge:

* Retreatcost/KansenSakura-Radiance-RP-12b
* Vortex5/Lunar-Nexus-12B
* Vortex5/Shadow-Crystal-12B

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
- sources:
  - model: Vortex5/Lunar-Nexus-12B
    layer_range: [0, 17]

- sources:
  - model: Retreatcost/KansenSakura-Radiance-RP-12b
    layer_range: [17, 31]

- sources:
  - model: Vortex5/Shadow-Crystal-12B
    layer_range: [31, 40]
merge_method: passthrough
dtype: bfloat16
tokenizer:
  source: union
```
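
The three passthrough slices stack 17 + 14 + 9 = 40 decoder layers end-to-end in the merged model. For reference, the sketch below re-runs this configuration through mergekit's Python entry point; the names (`MergeConfiguration`, `MergeOptions`, `run_merge`) follow mergekit's documented usage example and may differ between versions, and the `mergekit-yaml` CLI is the more common way to do the same thing.

```python
# Sketch: reproducing the merge from the YAML above (saved as config.yml).
# API names are taken from mergekit's documented Python example and may vary by version.
import torch
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./Radiant-Shadow-12B",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # run the merge on GPU if one is available
        copy_tokenizer=True,             # write a tokenizer into the output directory
    ),
)
```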