How to use hmarkc/FW-merged with the Transformers library:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="hmarkc/FW-merged")

# Load the model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("hmarkc/FW-merged", dtype="auto")
```
# FW-Merging: Scaling Model Merging with Frank-Wolfe Optimization
This repository contains the RoBERTa model checkpoints produced by applying Frank-Wolfe merging, as described in FW-Merging: Scaling Model Merging with Frank-Wolfe Optimization.
FW-Merging frames large-scale model merging as a constrained optimization problem: the fine-tuned checkpoints define the constraint set, while the objective function encodes the desired properties of the merged model. The method is designed to be robust to irrelevant models while effectively exploiting relevant ones for improved performance.
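To make the constrained-optimization framing concrete, here is a minimal toy sketch of a Frank-Wolfe-style merge (not the repository's actual implementation). The "checkpoints" are tiny parameter vectors whose convex hull forms the constraint set, and the objective is a hypothetical quadratic measuring distance to a desired target behavior; each iteration queries the objective's gradient and steps toward the checkpoint that best aligns with the descent direction.

```python
import numpy as np

# Toy "checkpoints": flattened parameter vectors of three fine-tuned models.
# Their convex hull is the constraint set of the merge.
checkpoints = [np.array([1.0, 0.0]),
               np.array([0.0, 1.0]),
               np.array([-1.0, -1.0])]

# Hypothetical objective: stay close to a target parameter vector. In
# practice the objective would reflect the desired behavior of the merged
# model, e.g. a loss on calibration data.
target = np.array([0.5, 0.5])

def objective(theta):
    return 0.5 * np.sum((theta - target) ** 2)

def objective_grad(theta):
    return theta - target

theta = np.mean(checkpoints, axis=0)  # start from the uniform average

for t in range(50):
    g = objective_grad(theta)
    # Linear minimization oracle: the checkpoint minimizing <g, s>, i.e.
    # the vertex of the constraint set most aligned with the descent step.
    s = min(checkpoints, key=lambda c: float(g @ c))
    gamma = 2.0 / (t + 2)  # standard Frank-Wolfe step size
    theta = (1 - gamma) * theta + gamma * s
```

Each iterate stays a convex combination of the checkpoints, so the merged parameters never leave the span of the fine-tuned models; checkpoints that never minimize the oracle (irrelevant models here) simply receive zero weight.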
The merged model checkpoints can be found at: https://huggingface.co/hmarkc/FW-merged/tree/main/roberta
The code for merging the model and further details can be found at: https://github.com/hmarkc/FW-merged