add workflow test for microservice (opea-project#97)
Signed-off-by: chensuyue <suyue.chen@intel.com>
Signed-off-by: Spycsh <sihan.chen@intel.com>
Signed-off-by: letonghan <letong.han@intel.com>
chensuyue committed May 29, 2024
1 parent 0dbd265 commit cae346d
Showing 16 changed files with 553 additions and 216 deletions.
50 changes: 0 additions & 50 deletions .github/workflows/embeddings-comp-test.yml

This file was deleted.

50 changes: 0 additions & 50 deletions .github/workflows/llms-comp-test.yml

This file was deleted.

87 changes: 87 additions & 0 deletions .github/workflows/microservice-test.yml
@@ -0,0 +1,87 @@
# Copyright (C) 2024 Intel Corporation
# SPDX-License-Identifier: Apache-2.0

name: MicroService-test

on:
pull_request_target:
branches: [main]
types: [opened, reopened, ready_for_review, synchronize] # added `ready_for_review` since draft is skipped
paths:
- comps/**
- "!**.md"
- "!**.txt"
- .github/workflows/microservice-test.yml

# If there is a new commit, the previous jobs will be canceled
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
cancel-in-progress: true

jobs:
job1:
name: Get-test-matrix
runs-on: ubuntu-latest
outputs:
run_matrix: ${{ steps.get-test-matrix.outputs.run_matrix }}
steps:
- name: Checkout Repo
uses: actions/checkout@v4
with:
ref: "refs/pull/${{ github.event.number }}/merge"
fetch-depth: 0
- name: Get test matrix
id: get-test-matrix
run: |
set -xe
changed_files=$(git diff --name-only ${{ github.event.pull_request.base.sha }} ${{ github.event.pull_request.head.sha }} \
| grep 'comps/' | grep -vE '\.md$|\.txt$')
services=$(printf '%s\n' "${changed_files[@]}" | grep '/' | cut -d'/' -f2 | sort -u)
run_matrix="{\"include\":["
for service in ${services}; do
hardware="gaudi" # default hardware, set based on the changed files
run_matrix="${run_matrix}{\"service\":\"${service}\",\"hardware\":\"${hardware}\"},"
done
run_matrix=${run_matrix%,}"]}" # strip the trailing comma so fromJSON receives valid JSON
echo "run_matrix=${run_matrix}" >> $GITHUB_OUTPUT
Microservice-test:
needs: job1
strategy:
matrix: ${{ fromJSON(needs.job1.outputs.run_matrix) }}
runs-on: ${{ matrix.hardware }}
continue-on-error: true
steps:
- name: Clean Up Working Directory
run: sudo rm -rf ${{github.workspace}}/*

- name: Checkout Repo
uses: actions/checkout@v4
with:
ref: "refs/pull/${{ github.event.number }}/merge"

- name: Run microservice test
env:
HUGGINGFACEHUB_API_TOKEN: ${{ secrets.HUGGINGFACEHUB_API_TOKEN }}
service: ${{ matrix.service }}
hardware: ${{ matrix.hardware }}
run: |
cd tests
if [ -f test_${service}.sh ]; then timeout 10m bash test_${service}.sh; else echo "Test script not found, skip test!"; fi
- name: Clean up container
env:
service: ${{ matrix.service }}
hardware: ${{ matrix.hardware }}
if: cancelled() || failure()
run: |
cid=$(docker ps -aq --filter "name=test-comps-*")
if [[ ! -z "$cid" ]]; then docker stop $cid && docker rm $cid && sleep 1s; fi
echo y | docker system prune
- name: Publish pipeline artifact
if: ${{ !cancelled() }}
uses: actions/upload-artifact@v4
with:
name: ${{ matrix.service }}-${{ matrix.hardware }}
path: ${{ github.workspace }}/tests/*.log
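The matrix-construction step in `job1` is easiest to follow outside of Actions. Below is a minimal local sketch of the same logic; the changed-file list is a hypothetical stand-in for the `git diff --name-only` output, and `gaudi` is the default hardware as in the workflow. The sketch also strips the trailing comma the loop leaves behind, so the result is valid JSON.

```shell
#!/usr/bin/env bash
# Hypothetical changed-file list standing in for the `git diff --name-only` output.
changed_files="comps/asr/asr.py
comps/asr/README.md
comps/dataprep/redis/prepare_doc_redis.py"

# Mirror the workflow: keep comps/ paths, drop docs, take the directory after comps/.
services=$(printf '%s\n' "${changed_files}" | grep '^comps/' | grep -vE '\.md$|\.txt$' \
           | cut -d'/' -f2 | sort -u)

run_matrix="{\"include\":["
for service in ${services}; do
  hardware="gaudi" # default hardware, as in the workflow
  run_matrix="${run_matrix}{\"service\":\"${service}\",\"hardware\":\"${hardware}\"},"
done
run_matrix="${run_matrix%,}]}" # strip the trailing comma so the JSON stays valid
echo "${run_matrix}"
```

This prints `{"include":[{"service":"asr","hardware":"gaudi"},{"service":"dataprep","hardware":"gaudi"}]}`, the shape `fromJSON` expands into the `Microservice-test` matrix.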
22 changes: 12 additions & 10 deletions comps/asr/README.md
@@ -2,43 +2,45 @@

The ASR (Audio-Speech-Recognition) microservice helps users convert speech to text. When building a talking bot with an LLM, users need to convert their audio inputs (what they say, or audio from other sources) to text so the LLM can tokenize it and generate an answer. This microservice handles that conversion stage.

# 🚀Start Microservice with Python
# 🚀1. Start Microservice with Python (Option 1)

To start the ASR microservice with Python, first install the required Python packages.

## Install Requirements
## 1.1 Install Requirements

```bash
pip install -r requirements.txt
```

## Start ASR Service with Python Script
## 1.2 Start ASR Service with Python Script

```bash
python asr.py
```

# 🚀Start Microservice with Docker
# 🚀2. Start Microservice with Docker (Option 2)

Alternatively, you can start the ASR microservice with Docker.

## Build Docker Image
## 2.1 Build Docker Image

```bash
cd ../../
docker build -t opea/gen-ai-comps:asr --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/asr/Dockerfile .
docker build -t opea/asr:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/asr/Dockerfile .
```

## Run Docker with CLI
## 2.2 Run Docker with CLI

```bash
docker run -p 9099:9099 --network=host --ipc=host -e http_proxy=$http_proxy -e https_proxy=$https_proxy opea/gen-ai-comps:asr
docker run -p 9099:9099 --network=host --ipc=host -e http_proxy=$http_proxy -e https_proxy=$https_proxy opea/asr:latest
```

# Test
# 🚀3. Consume ASR Service

You can use the following `curl` command to check whether the service is up. Note that the first request may be slow because the models need to be downloaded.

```bash
curl http://localhost:9099/v1/audio/transcriptions -H "Content-Type: application/json" -d '{"url": "https://github.com/intel/intel-extension-for-transformers/raw/main/intel_extension_for_transformers/neural_chat/assets/audio/sample_2.wav"}'
curl http://localhost:9099/v1/audio/transcriptions \
-H "Content-Type: application/json" \
-d '{"url": "https://github.com/intel/intel-extension-for-transformers/raw/main/intel_extension_for_transformers/neural_chat/assets/audio/sample_2.wav"}'
```
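Since that first call can block on model download, scripted use benefits from a readiness-retry wrapper. Here is a sketch; the helper name and retry count are hypothetical, not part of the service.

```shell
# Hypothetical helper: retry a probe command until it succeeds or attempts run out.
wait_ready() {
  local attempts=$1
  shift
  local i
  for ((i = 1; i <= attempts; i++)); do
    "$@" && return 0 # probe succeeded; service is ready
    sleep 1
  done
  return 1
}

# Example (assumes the container from section 2.2 is listening on port 9099):
# wait_ready 60 curl -sf -o /dev/null http://localhost:9099/v1/audio/transcriptions \
#   -H "Content-Type: application/json" \
#   -d '{"url": "https://github.com/intel/intel-extension-for-transformers/raw/main/intel_extension_for_transformers/neural_chat/assets/audio/sample_2.wav"}'
```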
45 changes: 28 additions & 17 deletions comps/dataprep/redis/README.md
@@ -1,18 +1,18 @@
# Dataprep Microservice with Redis

# 🚀Start Microservice with Python
# 🚀1. Start Microservice with Python (Option 1)

## Install Requirements
## 1.1 Install Requirements

```bash
pip install -r requirements.txt
```

## Start Redis Stack Server
## 1.2 Start Redis Stack Server

Please refer to this [readme](../../../vectorstores/langchain/redis/README.md).

## Setup Environment Variables
## 1.3 Setup Environment Variables

```bash
export REDIS_URL="redis://${your_ip}:6379"
@@ -22,46 +22,57 @@ export LANGCHAIN_API_KEY=${your_langchain_api_key}
export LANGCHAIN_PROJECT="opea/gen-ai-comps:dataprep"
```

## Start Document Preparation Microservice for Redis with Python Script
## 1.4 Start Document Preparation Microservice for Redis with Python Script

Start the document preparation microservice for Redis with the command below.

```bash
python prepare_doc_redis.py
```

# 🚀Start Microservice with Docker
# 🚀2. Start Microservice with Docker (Option 2)

## Build Docker Image
## 2.1 Start Redis Stack Server

```bash
cd ../../../../
docker build -t opea/dataprep-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/dataprep/redis/docker/Dockerfile .
```
Please refer to this [readme](../../../vectorstores/langchain/redis/README.md).

## Run Docker with CLI
## 2.2 Setup Environment Variables

```bash
export REDIS_URL="redis://${your_ip}:6379"
export INDEX_NAME=${your_index_name}
export LANGCHAIN_TRACING_V2=true
export LANGCHAIN_API_KEY=${your_langchain_api_key}
export LANGCHAIN_PROJECT="opea/gen-ai-comps:dataprep"
export LANGCHAIN_PROJECT="opea/dataprep"
```

## 2.3 Build Docker Image

docker run -d --name="dataprep-redis-server" -p 6007:6007 --ipc=host -e http_proxy=$http_proxy -e https_proxy=$https_proxy -e REDIS_URL=$REDIS_URL -e INDEX_NAME=$INDEX_NAME opea/dataprep-redis:latest
```bash
cd ../../../../
docker build -t opea/dataprep-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/dataprep/redis/docker/Dockerfile .
```

## 2.4 Run Docker with CLI (Option A)

```bash
docker run -d --name="dataprep-redis-server" -p 6007:6007 --ipc=host -e http_proxy=$http_proxy -e https_proxy=$https_proxy -e REDIS_URL=$REDIS_URL -e INDEX_NAME=$INDEX_NAME -e TEI_ENDPOINT=$TEI_ENDPOINT opea/dataprep-redis:latest
```

## Run Docker with Docker Compose
## 2.5 Run with Docker Compose (Option B)

```bash
cd comps/dataprep/redis/docker
docker compose -f docker-compose-dataprep-redis.yaml up -d
```

# Invoke Microservice
# 🚀3. Consume Microservice

Once the document preparation microservice for Redis is started, you can use the command below to invoke it. The microservice converts the document to embeddings and saves them to the Redis database.

```bash
curl -X POST -H "Content-Type: application/json" -d '{"path":"/path/to/document"}' http://localhost:6007/v1/dataprep
curl -X POST \
-H "Content-Type: application/json" \
-d '{"path":"/path/to/document"}' \
http://localhost:6007/v1/dataprep
```
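For scripted ingestion of many files, the request body from the example above can be built per document. A minimal sketch; the helper name and directory path are hypothetical.

```shell
# Hypothetical helper to build the JSON body the /v1/dataprep endpoint expects.
dataprep_payload() {
  printf '{"path":"%s"}' "$1"
}

# Example batch loop (assumes the dataprep container is running on port 6007):
# for doc in /path/to/docs/*; do
#   curl -X POST -H "Content-Type: application/json" \
#     -d "$(dataprep_payload "$doc")" http://localhost:6007/v1/dataprep
# done
```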