bf/DH-4598 update structure of readthedocs
aazo11 committed Sep 8, 2023
1 parent 4af9fb5 commit 0c8c144
Showing 8 changed files with 81 additions and 30 deletions.
5 changes: 3 additions & 2 deletions docs/conf.py
@@ -11,10 +11,11 @@

sys.path.insert(0, os.path.abspath(".."))

-project = "Dataherald"
+project = "Dataherald AI"
copyright = "2023, Dataherald"
author = "Dataherald"
-release = "0.0.1"
+release = "main"
+html_title = project

# -- General configuration ---------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration
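The net effect of the conf.py change can be sketched as follows (an illustrative fragment only; the real docs/conf.py contains further Sphinx configuration):

```python
# Sketch of the Sphinx settings after this commit (illustrative, not the full file).
project = "Dataherald AI"   # was "Dataherald"
copyright = "2023, Dataherald"
author = "Dataherald"
release = "main"            # was "0.0.1"; now tracks the main branch instead of a version number
html_title = project        # newly added: the rendered page title mirrors the project name

print(html_title)  # → Dataherald AI
```

Setting `html_title = project` keeps the browser title in sync if the project name changes again.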
33 changes: 33 additions & 0 deletions docs/envars.rst
@@ -0,0 +1,33 @@
Environment Variables
=======================


# OpenAI info. All these fields are required for the engine to work.
OPENAI_API_KEY =  # This field is required for the engine to work.
ORG_ID =
LLM_MODEL = 'gpt-4-32k'  # The OpenAI LLM model to use. Possible values: gpt-4-32k, gpt-4, gpt-3.5-turbo, gpt-3.5-turbo-16k


GOLDEN_RECORD_COLLECTION = 'my-golden-records'

# Pinecone info. These fields are required if the vector store used is Pinecone.
PINECONE_API_KEY =
PINECONE_ENVIRONMENT =

# Module implementation class names for each required component. You can use the default ones or create your own.
API_SERVER = "dataherald.api.fastapi.FastAPI"
SQL_GENERATOR = "dataherald.sql_generator.dataherald_sqlagent.DataheraldSQLAgent"
EVALUATOR = "dataherald.eval.simple_evaluator.SimpleEvaluator"
DB = "dataherald.db.mongo.MongoDB"
VECTOR_STORE = 'dataherald.vector_store.chroma.Chroma'
CONTEXT_STORE = 'dataherald.context_store.default.DefaultContextStore'  # The context store class to use; DefaultContextStore is the default
DB_SCANNER = 'dataherald.db_scanner.sqlalchemy.SqlAlchemyScanner'

# MongoDB database information
MONGODB_URI = "mongodb://admin:admin@mongodb:27017"
MONGODB_DB_NAME = 'dataherald'
MONGODB_DB_USERNAME = 'admin'
MONGODB_DB_PASSWORD = 'admin'


# The encryption key is used to encrypt database connection info before storing it in MongoDB. Please refer to the README on how to set it.
ENCRYPT_KEY =
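Since the engine fails without its required settings, a startup check is a natural companion to this file. The sketch below is a hypothetical helper, not part of Dataherald, and the `REQUIRED` list is an assumption based on the comments above (the OpenAI, MongoDB, and encryption fields; Pinecone keys are only needed when Pinecone is the vector store):

```python
import os

# Variables the engine cannot run without, per the comments above
# (assumed set; adjust to match your deployment).
REQUIRED = ["OPENAI_API_KEY", "LLM_MODEL", "MONGODB_URI", "MONGODB_DB_NAME", "ENCRYPT_KEY"]

def missing_vars(env: dict) -> list:
    """Return the required variables that are unset or empty."""
    return [name for name in REQUIRED if not env.get(name)]

# Example: a partially filled environment (normally you would pass os.environ).
env = {
    "OPENAI_API_KEY": "sk-...",
    "LLM_MODEL": "gpt-4-32k",
    "MONGODB_URI": "mongodb://admin:admin@mongodb:27017",
}
print(missing_vars(env))  # → ['MONGODB_DB_NAME', 'ENCRYPT_KEY']
```

Failing fast on `missing_vars(os.environ)` at startup gives a clearer error than a mid-request crash deep inside a module.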
19 changes: 0 additions & 19 deletions docs/getting_started.rst

This file was deleted.

32 changes: 28 additions & 4 deletions docs/index.rst
@@ -7,14 +7,38 @@ Dataherald AI
========================================
Welcome to the official documentation page of the Dataherald AI engine. This documentation is intended for developers who want to:

-* Use the Dataherald AI engine to set up Natural Language interfaces from structured data in their own projects.
-* Contribute to the Dataherald AI engine.
+* 🖥️ Use the Dataherald AI engine to set up Natural Language interfaces from structured data in their own projects.
+* 🏍️ Contribute to the Dataherald AI engine.

These documents cover how to get started, how to set up an API from your database that can answer questions in plain English, and how to extend the core engine's functionality.

We also have an active Discord community you can join.

.. toctree::
:maxdepth: 1
:caption: Getting Started
:hidden:

introduction
quickstart


.. toctree::
:caption: References
:hidden:

-   getting_started
    api
-   modules
+   envars
+   modules

.. toctree::
:caption: Tutorials
:hidden:

tutorial.sample_database
tutorial.finetune_sql_generator
tutorial.chatgpt_plugin

10 changes: 5 additions & 5 deletions docs/introduction.rst
@@ -12,8 +12,8 @@ You can use Dataherald to:

Dataherald is built to:

-* Be modular, allowing different implementations of core modules to be plugged-in
-* Come batteries included: Have best-in-class implementations for modules like text to SQL, evaluation
-* Be easy to set-up and use with major data warehouses
-* Allow for Active Learning, allowing you to improve the performance with usage
-* Be fast
+* 🔌 Be modular, allowing different implementations of core modules to be plugged in
+* 🔋 Come batteries included, with best-in-class implementations for modules like text-to-SQL and evaluation
+* 📀 Be easy to set up and use with major data warehouses
+* 👨‍🏫 Allow for Active Learning, letting you improve performance with usage
+* 🏎️ Be fast
4 changes: 4 additions & 0 deletions docs/tutorial.chatgpt_plugin.rst
@@ -0,0 +1,4 @@
Create a ChatGPT plug-in from your structured data
=====================================================

Coming soon ...
4 changes: 4 additions & 0 deletions docs/tutorial.finetune_sql_generator.rst
@@ -0,0 +1,4 @@
Using a Custom Text to SQL Engine
==================================

Coming soon ...
4 changes: 4 additions & 0 deletions docs/tutorial.sample_database.rst
@@ -0,0 +1,4 @@
Setting up a sample Database for accurate NL-to-SQL
====================================================

Coming soon ...
