Code to generate conversations with Code Soliloquies for Reliable Calculations


luffycodes/Tutorbot-Spock-Phys


Code Soliloquies for Accurate Calculations in Large Language Models (Accepted at LAK'24)

This repository contains code for the paper: Code Soliloquies for Accurate Calculations in Large Language Models

Model: https://huggingface.co/luffycodes/higgs-llama-vicuna-ep25-70b

A demo can be found at the end of this readme.

Inference

To use the model, first install the fastchat library, then follow these steps:

  1. Replace conversation.py in the FastChat folder with the version from our conversation_inference directory.
  2. Replace inference.py in the FastChat folder with the version from our repository.
  3. Then run the model using the following command:
    • python3 -m fastchat.serve.cli --model-path luffycodes/higgs-llama-vicuna-ep25-70b --num-gpus 8 --temperature 0.7 --max-gpu-memory 20GiB

    • Note that code soliloquy is not implemented in the UI; a transformers-based loading sketch is given after this list.
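
If you prefer to load the checkpoint without FastChat, the following is a minimal sketch using Hugging Face transformers. It is only an illustration: it does not apply the Vicuna prompt template or the code-soliloquy loop, and the 70B checkpoint needs several GPUs (device_map="auto", which requires accelerate, shards it across whatever is available).

```python
# Minimal sketch: loading higgs-llama-vicuna-ep25-70b directly with transformers.
# The FastChat CLI above is the intended interface; this snippet skips the
# Vicuna prompt template and the code-soliloquy loop.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "luffycodes/higgs-llama-vicuna-ep25-70b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",  # shard the 70B weights across available GPUs
)

prompt = "A ball is dropped from a height of 20 m. How long does it take to hit the ground?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, temperature=0.7, do_sample=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```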

Creating a synthetic conversational dataset to train Higgs

Example of generating a conversational dataset using GPT

  1. Run generate_conversations.py.
  2. Remember to set openai.api_key (a sketch of the underlying OpenAI call is shown after this list).
  3. Follow the training instructions from the fastchat library and run train_higgs_lora.py.
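
As a rough illustration of the OpenAI setup that generate_conversations.py relies on, the sketch below sets openai.api_key and issues a single chat-completion request. It uses the pre-1.0 openai Python API; the model name and prompt contents here are placeholders, not necessarily the ones used by the script.

```python
# Sketch only: set the API key and request one GPT turn.
# The actual state prompts live in conversation_gen/ and are loaded by
# generate_conversations.py; the strings below are placeholders.
import openai

openai.api_key = "sk-..."  # put your own OpenAI API key here

response = openai.ChatCompletion.create(
    model="gpt-4",  # assumed model name for illustration
    messages=[
        {"role": "system", "content": "You are Tutorbot, a physics tutor."},
        {"role": "user", "content": "Student: How fast is a ball moving after falling 5 m?"},
    ],
    temperature=0.7,
)
print(response["choices"][0]["message"]["content"])
```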

Descriptions of GPT-Tutorbot state prompts used to implement code soliloquy in conversation_gen

  • deciding_state.txt = The 'deciding_state' is the first state of the code soliloquy, in which the GPT-tutorbot determines whether a calculation is needed for its response to the student. If it determines that a calculation is needed, it generates a description of the Python code for performing that calculation and transitions to 'use_python_state'; if it decides that no calculation is needed, it transitions to 'no_python_state'.
  • use_python_state.txt = In this state, the gpt-tutorbot generates Python code based on the code description produced during 'deciding_state'.
  • received_python_state.txt = This is the final state of the gpt-tutorbot soliloquy (when it is using Python), in which it generates the tutorbot's response based on the Python code output.
  • no_python_state.txt = This is the final state of the gpt-tutorbot soliloquy (when it is not using Python), in which it generates the gpt-tutorbot's response directly. A sketch of this state machine is given after this list.
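
Putting the four prompts together, the code-soliloquy loop for one student turn can be sketched as the small state machine below. The helpers query_gpt and run_python are hypothetical placeholders standing in for the actual calls in conversation_gen.

```python
# Sketch of the code-soliloquy state machine described above.
# query_gpt(prompt_file, input) and run_python(code) are hypothetical
# placeholders for the GPT query and code-execution steps.
def code_soliloquy_turn(student_message, query_gpt, run_python):
    # deciding_state: does the response need a calculation?
    decision = query_gpt(prompt_file="deciding_state.txt", input=student_message)

    if decision["needs_python"]:
        # use_python_state: generate Python code from the description.
        code = query_gpt(prompt_file="use_python_state.txt",
                         input=decision["code_description"])
        # Execute the generated code and capture its output.
        output = run_python(code)
        # received_python_state: compose the reply using the code output.
        return query_gpt(prompt_file="received_python_state.txt", input=output)
    else:
        # no_python_state: compose the reply directly.
        return query_gpt(prompt_file="no_python_state.txt", input=student_message)
```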

If you use this work, please cite: Code Soliloquies for Accurate Calculations in Large Language Models https://arxiv.org/abs/2309.12161

@misc{sonkar2023code,
      title={Code Soliloquies for Accurate Calculations in Large Language Models}, 
      author={Shashank Sonkar and MyCo Le and Xinghe Chen and Lucy Liu and Debshila Basu Mallick and Richard G. Baraniuk},
      year={2023},
      eprint={2309.12161},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}

Demo: an animated demo is embedded at the end of the readme on GitHub.
