From 216a92894e6f30c707a214fad5a5fba417e3bc39 Mon Sep 17 00:00:00 2001
From: Eli
Date: Wed, 25 Dec 2024 08:20:06 -0600
Subject: [PATCH] Hide example html file for testing.

---
 .gitignore                |    1 +
 tests/data/readme-ai.html | 1203 --------------------------------------
 2 files changed, 1 insertion(+), 1203 deletions(-)
 delete mode 100644 tests/data/readme-ai.html

diff --git a/.gitignore b/.gitignore
index 347ae00..fdd8ac2 100644
--- a/.gitignore
+++ b/.gitignore
@@ -42,3 +42,4 @@ docs/notes.md
 docs/roadmap.md
 src/splitme_ai/tools/
 src/splitme_ai/tools
+tests/data/readme-ai.html

diff --git a/tests/data/readme-ai.html b/tests/data/readme-ai.html
deleted file mode 100644
index 1fc556b..0000000
--- a/tests/data/readme-ai.html
+++ /dev/null
@@ -1,1203 +0,0 @@
-

readme-ai

Designed for simplicity, customization, and developer productivity.

Github Actions Β· Test Coverage Β· PyPI Version Β· Total Downloads Β· MIT License
-
-

[!IMPORTANT] Visit the Official Documentation for detailed guides and tutorials.

-

Introduction

-

ReadmeAI is a developer tool that automatically generates README files using a robust repository processing engine and advanced language models. Simply provide a URL or path to your codebase, and a well-structured and detailed README will be generated.
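For instance, pointing the CLI at a hosted repository looks like this (the same minimal command is walked through in the Usage section below):

❯ readmeai --api openai -o readmeai-openai.md -r https://github.com/eli64s/readme-ai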

-

Why Use ReadmeAI?

-

This project aims to streamline the process of creating and maintaining documentation across all technical disciplines and experience levels. The core principles include:

- -

Demo

-

Run from your terminal:

[readmeai-cli-demo GIF]

-

Features

-

Let’s begin by exploring various customization options and styles supported by ReadmeAI:

-
- -
Header Styles & Themes

[custom-dragon-project-logo]

CLI Command:

$ readmeai --repository https://github.com/eli64s/readme-ai-streamlit \
         --logo custom \
         --badge-color FF4B4B \
         --badge-style flat-square \
         --header-style classic

[docker-go-readme-example]

CLI Command:

$ readmeai --repository https://github.com/olliefr/docker-gs-ping \
         --badge-color 00ADD8 \
         --badge-style for-the-badge \
         --header-style modern \
         --navigation-style roman
- -

Banner Styles

[ascii-readme-header-style]

CLI Command:

$ readmeai --repository https://github.com/eli64s/readme-ai \
         --header-style ascii

[svg-banner]

CLI Command:

$ readmeai --repository https://github.com/eli64s/readme-ai-streamlit \
         --badge-style for-the-badge \
         --header-style svg
- -

And More!

[cloud-logo]

CLI Command:

$ readmeai --repository https://github.com/jwills/buenavista \
           --align left \
           --badge-style flat-square \
           --logo cloud

[balloon-logo]

CLI Command:

$ readmeai --repository https://github.com/eli64s/readme-ai-streamlit \
           --badge-style flat \
           --logo custom

$ Provide an image file path or URL: \
           https://www.svgrepo.com/show/395851/balloon.svg

[skill-icons]

CLI Command:

$ readmeai --repository https://github.com/FerrariDG/async-ml-inference \
           --badge-style skills-light \
           --logo grey

[compact-header]

CLI Command:

$ readmeai --repository https://github.com/eli64s/readme-ai \
           --logo cloud \
           --header-style compact \
           --navigation-style fold

[modern-style]

CLI Command:

$ readmeai --repository https://github.com/eli64s/readme-ai \
           -i custom \
           -bc BA0098 \
           -bs flat-square \
           -hs modern \
           -ns fold
-
-
-

[!IMPORTANT] See the Official Documentation for a complete list of customization options and examples.

-
-

Explore additional content sections supported by ReadmeAI:

-
πŸ”Ή Overview

β—Ž The Overview section provides a high-level summary of the project, including its use case, benefits, and differentiating features.

[readme-overview-section]
πŸ”Έ Features

Features Table

β—Ž Generated markdown table that highlights the key technical features and components of the codebase. This table is generated using a structured prompt template.

[readme-features-section]
πŸ”Ά Module Analysis

Directory Tree

β—Ž The project’s directory structure is generated using pure Python and embedded in the README. See readmeai.generators.tree for more details.

[directory-tree]

File Summaries

β—Ž Summarizes key modules of the project, which are also used as context for downstream prompts.

[file-summaries]
πŸ”Ί Quickstart Guides

Getting Started

β—Ž Prerequisites and system requirements are extracted from the codebase during preprocessing. The parsers currently handle the majority of this logic.

[getting-started-section-prerequisites]

Installation Guide

β—Ž Installation, Usage, and Testing guides are generated based on the project's dependency files and codebase configuration.

[getting-started-section-usage-and-testing]
-
πŸ”» Contributing Guidelines

Contributing Guide

β—Ž Dropdown section that outlines the general process for contributing to your project.

β—Ž Provides links to your contributing guidelines, issues page, and more resources.

β—Ž A graph of contributors is also included.

[contributing-guidelines-section]

Additional Sections

β—Ž Roadmap, Contributing Guidelines, License, and Acknowledgments sections are included by default.

[footer-readme-section]
-


-

Getting Started

-

Prerequisites

-

ReadmeAI requires Python 3.9 or higher, plus one installation method of your choice:

| Requirement | Details |
|---|---|
| β€’ Python β‰₯3.9 | Core runtime |
| Installation Method (choose one) | |
| β€’ pip | Default Python package manager |
| β€’ pipx | Isolated environment installer |
| β€’ uv | High-performance package manager |
| β€’ docker | Containerized environment |
-

Supported Repository Platforms

-

ReadmeAI needs access to your repository to generate a README file. Currently supported platforms include:

| Platform | Details |
|---|---|
| File System | Local repository access |
| GitHub | Industry-standard hosting |
| GitLab | Full DevOps integration |
| Bitbucket | Atlassian ecosystem |
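As a quick sketch using commands that appear again in the Usage section, the --repository (-r) flag accepts either a hosted repository URL or a local directory path:

❯ readmeai --api openai -r https://github.com/eli64s/readme-ai
❯ readmeai --api openai --repository /users/username/projects/myproject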
-

Supported LLM API Services

-

ReadmeAI is model agnostic, with support for the following LLM API services:

| Provider | Best For | Details |
|---|---|---|
| OpenAI | General use | Industry-leading models |
| Anthropic | Advanced tasks | Claude language models |
| Google Gemini | Multimodal AI | Latest Google technology |
| Ollama | Open source | No API key needed |
| Offline Mode | Local operation | No internet required |
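The provider is selected with the --api flag, optionally paired with --model; each service is demonstrated in the Usage section below, for example:

❯ readmeai --api ollama --model llama3.2 -r https://github.com/eli64s/readme-ai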
-
-

Installation

-

ReadmeAI is available on PyPI as readmeai and can be installed as follows:

- -

Pip

-

Install with pip (recommended for most users):

-
❯ pip install -U readmeai
- -

Pipx

With pipx, readmeai will be installed in an isolated environment:

❯ pipx install readmeai
-

Uv

The fastest way to install readmeai is with uv:

❯ uv tool install readmeai
- -

Docker

To run readmeai in a containerized environment, pull the latest image from [Docker Hub][dockerhub-link]:

❯ docker pull zeroxeli/readme-ai:latest
-

From source

Click to build readmeai from source

1. Clone the repository:

   ❯ git clone https://github.com/eli64s/readme-ai

2. Navigate to the project directory:

   ❯ cd readme-ai

3. Install dependencies:

   ❯ pip install -r setup/requirements.txt
-

Alternatively, use the [setup script][setup-script] to install dependencies:

-

Bash

1. Run the setup script:

   ❯ bash setup/setup.sh
-

Or, use poetry to build and install project dependencies:

-

Poetry

1. Install dependencies with poetry:

   ❯ poetry install
-
-


-

Additional Optional Dependencies

-
-

[!IMPORTANT] To use the Anthropic and Google Gemini clients, extra dependencies are required. Install the package with the following extras:
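A sketch of the install command, assuming the optional extras are published under the names anthropic and google-generativeai (check the project's pyproject.toml or the Official Documentation for the exact extra names):

❯ pip install "readmeai[anthropic,google-generativeai]"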
-

Usage

-

Set your API key

-

When running readmeai with a third-party service, you must provide a valid API key. For example, the OpenAI client is set as follows:

❯ export OPENAI_API_KEY=<your_api_key>

# For Windows users:
❯ set OPENAI_API_KEY=<your_api_key>
-
Click to view environment variables for Ollama, Anthropic, Google Gemini

Ollama

Refer to the Ollama documentation for more information on setting up the Ollama server.

To start, follow these steps:

1. Pull your model of choice from the Ollama repository:

   ❯ ollama pull llama3.2:latest

2. Start the Ollama server and set the OLLAMA_HOST environment variable:

   ❯ export OLLAMA_HOST=127.0.0.1 && ollama serve

Anthropic

1. Export your Anthropic API key:

   ❯ export ANTHROPIC_API_KEY=<your_api_key>

Google Gemini

1. Export your Google Gemini API key:

   ❯ export GOOGLE_API_KEY=<your_api_key>
-
-

Using the CLI

-
Running with an LLM API service

Below is the minimal command required to run readmeai using the OpenAI client:

-
❯ readmeai --api openai -o readmeai-openai.md -r https://github.com/eli64s/readme-ai 
-
-

[!IMPORTANT] The default model set is gpt-3.5-turbo, offering the best balance between cost and performance. When using any model from the gpt-4 series and up, please monitor your costs and usage to avoid unexpected charges.

-
-

ReadmeAI can easily switch between API providers and models. We can run the same command as above with the Anthropic client:

-
❯ readmeai --api anthropic -m claude-3-5-sonnet-20240620 -o readmeai-anthropic.md -r https://github.com/eli64s/readme-ai
-

And finally, with the Google Gemini client:

-
❯ readmeai --api gemini -m gemini-1.5-flash -o readmeai-gemini.md -r https://github.com/eli64s/readme-ai
-
Running with local models

We can also run readmeai with free and open-source locally hosted models using Ollama:

-
❯ readmeai --api ollama --model llama3.2 -r https://github.com/eli64s/readme-ai
-
Running on a local codebase

To generate a README file from a local codebase, simply provide the full path to the project:

-
❯ readmeai --repository /users/username/projects/myproject --api openai
-

Adding more customization options:

-
❯ readmeai --repository https://github.com/eli64s/readme-ai \
           --output readmeai.md \
           --api openai \
           --model gpt-4 \
           --badge-color A931EC \
           --badge-style flat-square \
           --header-style compact \
           --navigation-style fold \
           --temperature 0.9 \
           --tree-depth 2 \
           --logo LLM \
           --emojis solar
-
Running in offline mode
-

ReadmeAI supports offline mode, allowing you to generate README files without using an LLM API service.

-
❯ readmeai --api offline -o readmeai-offline.md -r https://github.com/eli64s/readme-ai
-

Docker

Run the readmeai CLI in a Docker container:

❯ docker run -it --rm \
    -e OPENAI_API_KEY=$OPENAI_API_KEY \
    -v "$(pwd)":/app zeroxeli/readme-ai:latest \
    --repository https://github.com/eli64s/readme-ai \
    --api openai
-

Streamlit

Try readme-ai directly in your browser on Streamlit Cloud, no installation required.

See the readme-ai-streamlit repository on GitHub for more details about the application.

[!WARNING] The readme-ai Streamlit web app may not always be up-to-date with the latest features. Please use the command-line interface (CLI) for the most recent functionality.

-
-

From source

Click to run readmeai from source

Bash

If you installed the project from source with the bash script, run the following commands:

1. Activate the virtual environment:

   ❯ conda activate readmeai

2. Run the CLI:

   ❯ python3 -m readmeai.cli.main -r https://github.com/eli64s/readme-ai

Poetry

1. Activate the virtual environment:

   ❯ poetry shell

2. Run the CLI:

   ❯ poetry run python3 -m readmeai.cli.main -r https://github.com/eli64s/readme-ai
-


-

Testing

- -

The pytest and nox frameworks are used for development and testing.

-

Install the dependencies with uv:

-
❯ uv pip install -r pyproject.toml --all-extras
-

Run the unit test suite using Pytest:

-
❯ make test
-

Using nox, test the app against Python versions 3.9, 3.10, 3.11, and 3.12:

-
❯ make test-nox
-
-

[!TIP] Nox is an automation tool for testing applications in multiple environments. This helps ensure your project is compatible across Python versions and environments.

-
-


-

Configuration

-

Customize your README generation with a variety of supported options and style settings, such as:

| Option | Description | Default |
|---|---|---|
| --align | Text alignment in header | center |
| --api | LLM API service provider | offline |
| --badge-color | Badge color name or hex code | 0080ff |
| --badge-style | Badge icon style type | flat |
| --header-style | Header template style | classic |
| --navigation-style | Table of contents style | bullet |
| --emojis | Emoji theme packs prefixed to section titles | None |
| --logo | Project logo image | blue |
| --logo-size | Logo image size | 30% |
| --model | Specific LLM model to use | gpt-3.5-turbo |
| --output | Output filename | readme-ai.md |
| --repository | Repository URL or local directory path | None |
| --temperature | Creativity level for content generation | 0.1 |
| --tree-depth | Maximum depth of the directory tree structure | 2 |
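As an illustration only, combining a few of the flags and default values from the table above (the Usage section shows a fuller example):

❯ readmeai -r https://github.com/eli64s/readme-ai \
           --align left \
           --badge-style flat \
           --logo blue \
           --temperature 0.1 \
           --tree-depth 2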
-

Run the following command to view all available options:

-
❯ readmeai --help
-

-

Visit the Official Documentation for a complete guide on configuring and customizing README files.

-


-

Examples

-

Explore a variety of README examples generated by readmeai:

| Tech | Output | Source | Description |
|---|---|---|---|
| Readme-ai | readme-ai.md | readme-ai | Readme-ai project |
| Apache Flink | readme-pyflink.md | pyflink-poc | Pyflink project |
| Streamlit | readme-streamlit.md | readme-ai-streamlit | Streamlit web app |
| Vercel & NPM | readme-vercel.md | github-readme-quotes | Vercel deployment |
| Go & Docker | readme-docker-go.md | docker-gs-ping | Dockerized Go app |
| FastAPI & Redis | readme-fastapi-redis.md | async-ml-inference | Async ML inference service |
| Java | readme-java.md | Minimal-Todo | Minimalist todo Java app |
| PostgreSQL & DuckDB | readme-postgres.md | Buenavista | Postgres proxy server |
| Kotlin | readme-kotlin.md | android-client | Android client app |
| Offline Mode | offline-mode.md | litellm | LLM API service |
-

-

Find additional README.md file examples in our examples directory.

-

-


-

Roadmap

- -

Contributing

-

Contributions are welcome! Please read the Contributing Guide to get started.

- -


-

- - - -

-

Acknowledgments

-

A big shoutout to the projects below for their awesome work and open-source contributions:

shields.io Β· simpleicons.org Β· tandpfun/skill-icons Β· astrit/css.gg Β· Ileriayo/markdown-badges

-
-


-

πŸŽ— License

-

Copyright Β© 2023 readme-ai.
Released under the MIT license.

-
-

-