{ "info": { "author": "", "author_email": "", "bugtrack_url": null, "classifiers": [], "description": "# Let's Sample Step by Step: Adaptive-Consistency for Efficient Reasoning with LLMs\n\n
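At its core, Adaptive-Consistency keeps sampling until the current majority answer is statistically confident enough. As a rough illustration of how such a beta-distribution stopping rule can work (a stdlib-only sketch; `beta_confidence` and `should_stop` are hypothetical names, and the library's actual implementation may differ):

```python
import math

def beta_confidence(n_top: int, n_rest: int) -> float:
    # Posterior over the majority answer's true probability p, with a
    # uniform prior: p ~ Beta(n_top + 1, n_rest + 1). For integer
    # parameters, P(p > 0.5) reduces to a binomial tail sum.
    a, b = n_top + 1, n_rest + 1
    n = a + b - 1
    return sum(math.comb(n, j) for j in range(a)) * 0.5 ** n

def should_stop(answers, threshold=0.95):
    # Stop once the most frequent answer is confident enough.
    if not answers:
        return False
    counts = {}
    for ans in answers:
        counts[ans] = counts.get(ans, 0) + 1
    n_top = max(counts.values())
    return beta_confidence(n_top, len(answers) - n_top) >= threshold
```

Under this rule, a short run of unanimous answers already clears a 0.95 threshold (four identical samples give P(p > 0.5) ≈ 0.97), while an even split keeps the confidence at 0.5 and sampling continues.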

\n\n\n\n\n[Pranjal Aggarwal](https://github.com/Pranjal2041), [Aman Madaan](https://madaan.github.io/), [Yiming Yang](https://www.cs.cmu.edu/~./yiming/), [Mausam](https://www.cse.iitd.ac.in/~mausam/)\n\n\n## Abstract\n>A popular approach for improving the correctness of output from large language models (LLMs) is Self-Consistency - poll the LLM multiple times and output the most frequent solution. Existing Self-Consistency techniques always draw a constant number of samples per question, whereas a better approach would be to non-uniformly distribute the available budget based on the amount of agreement in the samples drawn so far. In response, we introduce Adaptive-Consistency, a cost-efficient, model-agnostic technique that dynamically adjusts the number of samples per question using a lightweight stopping criterion. Our experiments over 13 datasets and two LLMs demonstrate that Adaptive-Consistency reduces the sample budget by up to 6.0 times with an average accuracy drop of less than 0.1%.\n>\n\n![AdaptiveConsistency](docs/static/images/ac_teaser_new.png)\n\n\n\n\n# Adaptive Consistency\nThis repository contains code for:\n1. The Adaptive-Consistency library, for running efficient LLM generation with [Adaptive-Consistency](http://sample-step-by-step.info) in your own code.\n2. Code to reproduce the results of [Adaptive-Consistency](https://arxiv.org/abs/2305.11860).\n\n## Installation\n\n### From PyPI\n\n```bash\npip install AdaptiveConsistency\n```\n\n### From Source\n\nFirst, clone the repo:\n```bash\ngit clone https://github.com/Pranjal2041/AdaptiveConsistency.git\n```\n\nNext, install the package using:\n```bash\npython setup.py install\n```\n\n## Usage\n\nUsing Adaptive Consistency in your code requires changing only 2-3 lines of your existing framework.\n\n### 1. Importing the library\n\n```python\nfrom adaptive_consistency import AC, BetaStoppingCriteria\n```\n\n### 2. 
Initializing the library\n\n```python\nac = AC(model, stopping_criteria=BetaStoppingCriteria(0.95), max_gens=40)\n```\n\n### 3. Using the library\n\nYou can directly run a whole loop of evaluation using:\n\n```python\nac.eval_loop(sampling_function, *args, **kwargs)\n```\n\nFor example, if using the OpenAI API for sampling, you can use:\n\n```python\nimport openai\n\nac.eval_loop(openai.Completion.create, engine=\"text-davinci-003\", prompt=\"Solve the questions ahead\", max_tokens=5)\n```\n\nOr you can check for consistency of answers at each step:\n\n```python\nanswers = []\nfor i in range(40):\n    answers.append(generate_answer_from_model())  # e.g., openai.Completion.create\n    if ac.should_stop(answers):\n        break\n```\n\n\n### 4. Stopping Criteria\n\nYou can use one of the following stopping criteria:\n\n1. `BetaStoppingCriteria (beta)`: Uses the Beta distribution to guide the stopping decision. This is the default stopping criterion.\n2. `DirichletStoppingCriteria (dirichlet)`: Uses the Dirichlet distribution to guide the stopping decision.\n3. `EntropyStoppingCriteria (entropy)`: Uses the entropy of the answer distribution to guide the stopping decision.\n4. `MajorityStoppingCriteria (majority)`: Uses the majority ratio of the top element in the answer distribution to guide the stopping decision.\n5. `RandomStoppingCriteria (random)`: Randomly stops the sampling process with a pre-defined probability.\n6. `CRPStoppingCriteria (crp)`: Uses the Chinese Restaurant Process to guide the stopping decision.\n\nCheck out the paper for more details.\n\n\n## Reproducing Numbers\n\n\n### 1. Downloading the data\n\nRun:\n\n```bash\nbash download_data.sh\n```\n\n### 2. Downloading Model Outputs\n\nWe provide the model outputs for all the models used in the paper. You can download them using:\n\n```bash\nbash download_outputs.sh\n```\n\nThese model outputs will work for all experiments in the paper.\n\n### 3. 
Running Generations\n\nIf you decide to skip the previous step, you can run the generations on your own using the following commands:\n\n```bash\nbash scripts/run_self_consistency.sh\nbash scripts/run_adaptive_consistency.sh\n```\n\nBy default, the `beta` stopping criterion is used. You can change it by passing the `stopping_criteria` name and the corresponding confidence threshold as arguments. For example, to use the `entropy` stopping criterion with a confidence threshold of 0.75, run:\n\n```bash\nbash scripts/run_adaptive_consistency.sh entropy 0.75\n``` \n\nThis step will print the final accuracy on the terminal.\n\n### 4. Running Eval on Model Outputs\n\nYou can skip Step 3 and run evaluation directly on the model outputs using the following command:\n\n```bash\npython eval_outputs.py --output_file --stop_criteria --stop_criteria_thresh \n```\n\nThis will print the average number of generations and the accuracy on the terminal.\n\n\n\n\n\n## Citation\n\n```bibtex\n@misc{aggarwal2023lets,\n title={Let's Sample Step by Step: Adaptive-Consistency for Efficient Reasoning with LLMs}, \n author={Pranjal Aggarwal and Aman Madaan and Yiming Yang and Mausam},\n year={2023},\n eprint={2305.11860},\n archivePrefix={arXiv},\n primaryClass={cs.CL}\n}\n```\n\n## LICENSE\n\nAdaptive-Consistency is MIT licensed, as found in the [LICENSE](LICENSE) file.\n", "description_content_type": "text/markdown", "docs_url": null, "download_url": "", "downloads": { "last_day": -1, "last_month": -1, "last_week": -1 }, "home_page": "", "keywords": "", "license": "MIT", "maintainer": "", "maintainer_email": "", "name": "AdaptiveConsistency", "package_url": "https://pypi.org/project/AdaptiveConsistency/", "platform": null, "project_url": "https://pypi.org/project/AdaptiveConsistency/", "project_urls": null, "release_url": "https://pypi.org/project/AdaptiveConsistency/1.0.0/", "requires_dist": null, "requires_python": "", "summary": "Library for running AdaptiveConsistency-based inference 
on large language models.", "version": "1.0.0", "yanked": false, "yanked_reason": null }, "last_serial": 18202235, "releases": { "0.0.1": [ { "comment_text": "", "digests": { "blake2b_256": "9efd764272d512f9b0ecc8c2e2a0993ad7bfc9de7751b6c74369557ececd52c8", "md5": "46608a822ec2717522d77a3447a806f2", "sha256": "382f962b225737d5a33ea132a680d9ed4064d2d5b41a90b868488c5f00bd5636" }, "downloads": -1, "filename": "AdaptiveConsistency-0.0.1.tar.gz", "has_sig": false, "md5_digest": "46608a822ec2717522d77a3447a806f2", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 7200, "upload_time": "2023-05-22T18:13:39", "upload_time_iso_8601": "2023-05-22T18:13:39.842778Z", "url": "https://files.pythonhosted.org/packages/9e/fd/764272d512f9b0ecc8c2e2a0993ad7bfc9de7751b6c74369557ececd52c8/AdaptiveConsistency-0.0.1.tar.gz", "yanked": false, "yanked_reason": null } ], "1.0.0": [ { "comment_text": "", "digests": { "blake2b_256": "ad496132bb951dd3b670ab4bc41b70500dd2664b4a2af986c8bfb360f4c12715", "md5": "6cbc0ecfe5a9d146c25acd0f9504b748", "sha256": "044e1af9218742beba82e61116287dd51d9f8cccfc187950b6543b64fcba3a97" }, "downloads": -1, "filename": "AdaptiveConsistency-1.0.0.tar.gz", "has_sig": false, "md5_digest": "6cbc0ecfe5a9d146c25acd0f9504b748", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 7205, "upload_time": "2023-05-22T18:20:30", "upload_time_iso_8601": "2023-05-22T18:20:30.725537Z", "url": "https://files.pythonhosted.org/packages/ad/49/6132bb951dd3b670ab4bc41b70500dd2664b4a2af986c8bfb360f4c12715/AdaptiveConsistency-1.0.0.tar.gz", "yanked": false, "yanked_reason": null } ] }, "urls": [ { "comment_text": "", "digests": { "blake2b_256": "ad496132bb951dd3b670ab4bc41b70500dd2664b4a2af986c8bfb360f4c12715", "md5": "6cbc0ecfe5a9d146c25acd0f9504b748", "sha256": "044e1af9218742beba82e61116287dd51d9f8cccfc187950b6543b64fcba3a97" }, "downloads": -1, "filename": "AdaptiveConsistency-1.0.0.tar.gz", "has_sig": false, 
"md5_digest": "6cbc0ecfe5a9d146c25acd0f9504b748", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 7205, "upload_time": "2023-05-22T18:20:30", "upload_time_iso_8601": "2023-05-22T18:20:30.725537Z", "url": "https://files.pythonhosted.org/packages/ad/49/6132bb951dd3b670ab4bc41b70500dd2664b4a2af986c8bfb360f4c12715/AdaptiveConsistency-1.0.0.tar.gz", "yanked": false, "yanked_reason": null } ], "vulnerabilities": [] }