
Commit

Update Readmes
petrgazarov committed Oct 28, 2023
1 parent baf8bed commit 89405ac
Showing 3 changed files with 76 additions and 58 deletions.
110 changes: 64 additions & 46 deletions README.md
@@ -13,6 +13,63 @@ Salami is a declarative domain-specific language for cloud infrastructure based
**[Short demo video](https://youtu.be/ej629E0WOIY)** |
**[Release blog post](https://www.petrgazarov.com/posts/infrastructure-as-natural-language)**

## 🚀 Getting Started

### Installation

Homebrew (Mac OS, Linux):

```bash
brew tap petrgazarov/salami
brew install salami
```

Manual:

Download the latest binaries for Mac OS, Linux and Windows from the [releases page](https://github.com/petrgazarov/salami/releases).

### Config

The root of your project should contain a `salami.yaml` config file.

Example:

```yaml
compiler:
  target:
    platform: terraform
  llm:
    provider: openai
    model: gpt4
    api_key: ${OPENAI_API_KEY}
  source_dir: salami
  target_dir: terraform
```
| Configuration Setting | Description | Required |
| --------------------------- | ----------------------------------------------------------------------------- | -------- |
| compiler.target.platform | Platform to target. Only `terraform` value is currently supported. | Yes |
| compiler.llm.provider | Provider for the LLM. Only `openai` value is currently supported. | Yes |
| compiler.llm.model | Model used by the provider. Only `gpt4` value is currently supported. | Yes |
| compiler.llm.api_key        | OpenAI API key. To set it from an environment variable, use the `${ENV_VAR}` syntax. | Yes |
| compiler.llm.max_concurrent | Maximum number of concurrent API calls to OpenAI API. Default is 5. | No |
| compiler.source_dir | The directory where your Salami files are located. | Yes |
| compiler.target_dir | The directory where the Terraform files should be written. | Yes |
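
The `${ENV_VAR}` substitution in `api_key` keeps the secret out of the config file itself. A minimal sketch of supplying the value (the key below is a placeholder, and the `salami compile` step is shown as a comment since it requires the CLI to be installed):

```shell
# Export the variable referenced as ${OPENAI_API_KEY} in salami.yaml.
export OPENAI_API_KEY="sk-placeholder-key"   # placeholder, not a real key

# The compiler reads the variable at compile time; conceptually:
echo "api_key resolves to: ${OPENAI_API_KEY}"

# salami compile   # run from the project root once the variable is set
```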

### Usage

From the root of your project, run:

```bash
salami compile
```

For verbose output, run:

```bash
salami -v compile
```

## 🎨 Design

### Constructs
@@ -47,63 +104,24 @@ For more examples, see the `examples` directory. Each example has a README file

**@variable**

| Position | Argument | Type | Required? | Examples |
| -------- | ------------- | ------ | --------- | ------------------------------------ |
| 1 | name | string | Yes | `container_port`, `logs_bucket_name` |
| 2 | variable type | string | Yes | `string`, `number`, `boolean` |
| 3 | default | any | No | `8080`, `logs_bucket_1fdretbnHUdfn` |
| Position | Argument | Type | Required | Examples |
| -------- | ------------- | ------ | -------- | ------------------------------------ |
| 1 | name | string | Yes | `container_port`, `logs_bucket_name` |
| 2 | variable type | string | Yes | `string`, `number`, `boolean` |
| 3 | default | any | No | `8080`, `logs_bucket_1fdretbnHUdfn` |
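
As a purely hypothetical illustration of how the positional arguments above might line up in a declaration (the actual `.sami` syntax may differ; see the `examples` directory for real usage):

```
@variable(container_port, number, 8080)
@variable(logs_bucket_name, string)
```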

### Lock file

Salami compiler generates a lock file that includes parsed Salami objects and the resulting Terraform code. The lock file is used to determine which Salami objects have changed since the last compilation. Unchanged objects are not sent to the LLM, which makes the compilation process much faster.
The compiler generates a lock file that includes parsed Salami objects and the resulting Terraform code. The lock file is used to determine which objects have changed since the last compilation; unchanged objects are not sent to the LLM, which makes compilation much faster.
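
For instance, a full recompilation can be forced by deleting the lock file before compiling (a sketch; assumes it is run from the project root):

```shell
touch salami-lock.toml    # stand-in for a lock file left by a previous compile
rm -f salami-lock.toml    # removing it discards the cache of compiled objects
# salami compile          # the next compile regenerates all objects
```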

### File extension

`.sami` is the extension for Salami files.

## 🚀 Getting Started

### Installation

Homebrew (Mac OS, Linux):

```bash
brew tap petrgazarov/salami
brew install salami
```

Manual:

Download the latest binaries for Mac OS, Linux and Windows from the [releases page](https://github.com/petrgazarov/salami/releases).

### Usage

From the root of your project, run:

```bash
salami compile
```

The root of your project should contain the `salami.yaml` config file with the following structure:

```yaml
compiler:
  target:
    platform: terraform
  llm:
    provider: openai
    model: gpt4
    api_key: ${OPENAI_API_KEY}
  source_dir: salami
  target_dir: terraform
```
Set `compiler.source_dir` to the directory where your Salami files are, and `compiler.target_dir` to the directory where the Terraform files should be written. The config file supports environment variables, which is useful for keeping secrets out of version control. To inject an environment variable at runtime, use the `${ENV_VAR}` syntax. Use the `compiler.llm.max_concurrent` setting to control how many concurrent API calls are made to the OpenAI API. The default is 5.

## ✅ VS Code Extension

It's recommended to install the [Salami VS Code extension](https://marketplace.visualstudio.com/items?itemName=Salami.salami). It provides syntax highlighting for `.sami` files.

## 😍 Contributing

Contributions are welcome! If your contribution is non-trivial, please open an issue first to discuss the proposed changes.
Contributions are welcome! For non-trivial contributions, please open an issue first to discuss the proposed changes.
12 changes: 6 additions & 6 deletions examples/public_and_private_ecs_services/README.md
@@ -1,6 +1,6 @@
# Example: public and private ECS services

This example creates a VPC with public and private subnets, 2 ECS Fargate services, a load balancer and a few other resources.
Creates a VPC with public and private subnets, 2 ECS Fargate services, a load balancer and a few other resources.

## Running the example

@@ -10,17 +10,17 @@ To run this example, you need:

- `terraform` installed
- `salami` installed (follow installation instructions in the [README](../../README.md))
- AWS credentials (optional, if you want to deploy the infrastructure)
- AWS credentials (optional, to deploy the infrastructure)

### Steps

1. Clone this repository
2. `cd` into the `examples/public_and_private_ecs_services` directory
3. Run `salami compile` to compile the Salami descriptions into Terraform code
4. Optionally, `cd` into the `examples/public_and_private_ecs_services/terraform` directory and run the usual `terraform init`, `terraform plan` and `terraform apply` commands to deploy the infrastructure. Make sure to pass the AWS credentials to Terraform (Salami does not generate the `provider` block for you).
3. Run `salami compile` to generate the Terraform code
4. Optionally, `cd` into the `examples/public_and_private_ecs_services/terraform` directory and run the `terraform init` and `terraform apply` commands to deploy to AWS.

### FYI

1. Note that `salami compile` will examine the `salami-lock.toml` file and the source `.sami` files, and determine which Salami objects have changed since the last compilation. To force a complete recompilation, delete the `salami-lock.toml` file. Or, you can change source `.sami` files and `salami compile` will recompile only the changed objects.
1. The `salami compile` command examines `salami-lock.toml` and the source `.sami` files to determine the changeset. To force a complete regeneration, delete the `salami-lock.toml` file and rerun the compiler.

2. Occasionally, OpenAI API delays responses significantly. If `salami compile` is stuck, try setting `compiler.llm.max_concurrent` config to a lower value.
2. If a timeout error is raised, try setting the `compiler.llm.max_concurrent` config to a lower number. This slows down compilation but reduces the likelihood of timeouts from the OpenAI API.
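
Assuming the config schema from the main README, the setting is lowered in `salami.yaml` like so (the value 2 is illustrative):

```yaml
compiler:
  llm:
    max_concurrent: 2  # default is 5; lower values reduce timeout risk
```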
12 changes: 6 additions & 6 deletions examples/simple_s3_bucket/README.md
@@ -1,6 +1,6 @@
# Example: simple S3 bucket

This example creates an S3 bucket.
Creates an S3 bucket.

## Running the example

@@ -10,17 +10,17 @@ To run this example, you need:

- `terraform` installed
- `salami` installed (follow installation instructions in the [README](../../README.md))
- AWS credentials (optional, if you want to deploy the infrastructure)
- AWS credentials (optional, to deploy the infrastructure)

### Steps

1. Clone this repository
2. `cd` into the `examples/simple_s3_bucket` directory
3. Run `salami compile` to compile the Salami descriptions into Terraform code
4. Optionally, `cd` into the `examples/simple_s3_bucket/terraform` directory and run the usual `terraform init`, `terraform plan` and `terraform apply` commands to deploy the infrastructure. Make sure to pass the AWS credentials to Terraform (Salami does not generate the `provider` block for you).
3. Run `salami compile` to generate the Terraform code
4. Optionally, `cd` into the `examples/simple_s3_bucket/terraform` directory and run the `terraform init` and `terraform apply` commands to deploy to AWS.

### FYI

1. Note that `salami compile` will examine the `salami-lock.toml` file and the source `.sami` files, and determine which Salami objects have changed since the last compilation. To force a complete recompilation, delete the `salami-lock.toml` file. Or, you can change source `.sami` files and `salami compile` will recompile only the changed objects.
1. The `salami compile` command examines `salami-lock.toml` and the source `.sami` files to determine the changeset. To force a complete regeneration, delete the `salami-lock.toml` file and rerun the compiler.

2. Occasionally, OpenAI API delays responses significantly. If `salami compile` is stuck, try setting `compiler.llm.max_concurrent` config to a lower value.
2. If a timeout error is raised, try setting the `compiler.llm.max_concurrent` config to a lower number. This slows down compilation but reduces the likelihood of timeouts from the OpenAI API.
