Instructions for using purewhite42/DExplorer-8B with libraries, inference providers, and local apps.
- Libraries
- Transformers

How to use purewhite42/DExplorer-8B with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="purewhite42/DExplorer-8B")
messages = [
    {"role": "user", "content": "Who are you?"},
]
pipe(messages)
```

```python
# Load the model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("purewhite42/DExplorer-8B")
model = AutoModelForCausalLM.from_pretrained("purewhite42/DExplorer-8B")

messages = [
    {"role": "user", "content": "Who are you?"},
]
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:]))
```
- Local Apps
- vLLM
How to use purewhite42/DExplorer-8B with vLLM:
Install from pip and serve the model:

```shell
# Install vLLM from pip:
pip install vllm

# Start the vLLM server:
vllm serve "purewhite42/DExplorer-8B"

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "purewhite42/DExplorer-8B",
    "messages": [
      {"role": "user", "content": "What is the capital of France?"}
    ]
  }'
```
- SGLang
How to use purewhite42/DExplorer-8B with SGLang:
Install from pip and serve the model:

```shell
# Install SGLang from pip:
pip install sglang

# Start the SGLang server:
python3 -m sglang.launch_server \
  --model-path "purewhite42/DExplorer-8B" \
  --host 0.0.0.0 \
  --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "purewhite42/DExplorer-8B",
    "messages": [
      {"role": "user", "content": "What is the capital of France?"}
    ]
  }'
```

Use Docker images:

```shell
docker run --gpus all \
  --shm-size 32g \
  -p 30000:30000 \
  -v ~/.cache/huggingface:/root/.cache/huggingface \
  --env "HF_TOKEN=<secret>" \
  --ipc=host \
  lmsysorg/sglang:latest \
  python3 -m sglang.launch_server \
    --model-path "purewhite42/DExplorer-8B" \
    --host 0.0.0.0 \
    --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "purewhite42/DExplorer-8B",
    "messages": [
      {"role": "user", "content": "What is the capital of France?"}
    ]
  }'
```

- Docker Model Runner
How to use purewhite42/DExplorer-8B with Docker Model Runner:
```shell
docker model run hf.co/purewhite42/DExplorer-8B
```
[ICLR'26] Let's Explore Step by Step: Generating Provable Formal Statements with Deductive Exploration
Qi Liu, Kangjie Bao, Yue Yang, Xinhao Zheng, Renqiu Xia, Qinxiang Cao, Junchi Yan* (* indicates corresponding author)
School of Computer Science & School of Artificial Intelligence, Shanghai Jiao Tong University
Shanghai Innovation Institute
Please refer to the GitHub repo and the paper for more details.

About

DExplorer-8B is a Lean 4-based agent that generates provable formal mathematical statements through step-by-step Deductive Exploration (DExploration). Instead of directly synthesizing problems in one shot, DExplorer explores the mathematical world step by step (introducing variables/hypotheses, deriving intermediate facts, and submitting conclusions), with each step verified by the Lean 4 kernel. This ensures the provability of generated statements while enabling the synthesis of complex problems that push the limits of state-of-the-art provers.
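As a toy illustration (this example is ours, not taken from the paper or its dataset), one exploration trajectory might introduce a variable and a hypothesis, derive an intermediate fact, and then submit a conclusion, which assembles into a Lean 4 statement together with its proof:

```lean
-- Hypothetical trajectory:
--   step 1 (introduce): (n : Nat) and (h : 0 < n)
--   step 2 (derive):    h1 : 1 < n + 1, obtained from h
--   step 3 (conclude):  submit n + 1 > 1
-- The submitted statement is provable by construction, because each
-- derivation step was already checked by the Lean 4 kernel.
theorem explored_example (n : Nat) (h : 0 < n) : n + 1 > 1 := by
  have h1 : 1 < n + 1 := Nat.succ_lt_succ h
  exact h1
```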
This model is fine-tuned from Goedel-Prover-V2-8B on DExploration-40K using supervised fine-tuning.
Usage

This model is SFTed for the DExploration task: given the current exploration state (introduced variables/hypotheses and Lean 4 context), the model proposes the next exploration step, either introducing a new variable/hypothesis, deriving a new fact, or submitting a conclusion.

See the GitHub repo for prompt templates and detailed usage.
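The propose-then-verify loop can be sketched in plain Python. Everything below is hypothetical (the class and function names, the step encoding); the `propose_step` and `lean_check` stubs stand in for the actual model call and kernel verification, whose real interfaces are documented in the GitHub repo:

```python
from dataclasses import dataclass, field

@dataclass
class ExplorationState:
    """Current exploration context: hypotheses introduced and facts derived so far."""
    hypotheses: list = field(default_factory=list)
    facts: list = field(default_factory=list)

def propose_step(state):
    """Stand-in for the model call: returns (kind, content),
    where kind is 'intro', 'derive', or 'conclude'."""
    if not state.hypotheses:
        return ("intro", "(n : Nat) (h : 0 < n)")
    if not state.facts:
        return ("derive", "have h1 : 1 < n + 1 := Nat.succ_lt_succ h")
    return ("conclude", "n + 1 > 1")

def lean_check(state, step):
    """Stand-in for kernel verification through Lean 4; here every step is accepted."""
    return True

def explore(max_steps=10):
    state = ExplorationState()
    for _ in range(max_steps):
        kind, content = propose_step(state)
        if not lean_check(state, (kind, content)):
            continue  # rejected steps are discarded; the state is unchanged
        if kind == "intro":
            state.hypotheses.append(content)
        elif kind == "derive":
            state.facts.append(content)
        else:  # 'conclude': assemble the final provable statement
            return f"theorem explored {' '.join(state.hypotheses)} : {content}"
    return None

print(explore())  # → theorem explored (n : Nat) (h : 0 < n) : n + 1 > 1
```

Because every step is validated before it is added to the state, any statement assembled at a `conclude` step is provable by construction.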
Citation
If you find our work useful in your research, please cite:
```bibtex
@inproceedings{liu2026lets,
  title={Let's Explore Step by Step: Generating Provable Formal Statements with Deductive Exploration},
  author={Qi Liu and Kangjie Bao and Yue Yang and Xinhao Zheng and Renqiu Xia and Qinxiang Cao and Junchi Yan},
  booktitle={The Fourteenth International Conference on Learning Representations},
  year={2026},
  url={https://openreview.net/forum?id=Njrkeo3DiJ}
}
```
License
This project is released under the Apache 2.0 license. See LICENSE for details.
Contributing
We welcome contributions! Please feel free to submit issues or pull requests.
Contact
For questions about the paper, data, or code:
- Qi Liu: purewhite@sjtu.edu.cn
- Issues: GitHub Issues
Acknowledgments
- NuminaMath-LEAN for the high-quality statement-proof data.
- Goedel-Prover for the base model.
- Pantograph for Lean 4 interaction.
- And many other open-source projects!