Container support for AWS (#205)
* [Feat] Add building the base Docker image as an abstract method.

* [Feat] Update AWS to build the base Docker image.

* [Feat] Add container deployment flag. To do: check for the valid providers that support container deployment.

* [Feat] Add Container deployment support.

* [Feat] Add support for the base Docker image on AWS (Python).

* [Feat] Code refactoring.

* [Feat] Add refactoring to-do tasks.

* [Feat] 1. Return container deployment flag, container URI. 2. Add create function with container URI

* [Feat] Refactor the code. To do: add an entrypoint for Docker on AWS.

* Add entry point in the Dockerfile

* Maintain cache for the container deployment.

* Add correct entrypoint in the Dockerfile for Lambda

* Remove Whitespaces

* Docker client login through user-provided credentials, refactor ECR authorization, and raise NotImplementedError for GCP and Azure

* Fix typo

* Minor fixes: Get Docker username and password from config

* Minor fixes: Only show the push image error once

* Minor fixes: Refactor some print debugging statements.

* Get repository name from Config.

* Get repository name from config if the user has provided one, else generate it randomly

* Refactoring

* Refactored the create function with simple if/else

* Linter and formatter

* Remove the parsing of repository name and image tag; we can use them directly, no need to parse again.

* Linter and formatter

* Update function based on container deployment and store the code hash for the cache

* Refactored

* Add container deployment as a CLI option and update the docs (usage.md)

* Linter and formatter

* Linter and formatter

* Linter and formatter

* Add new parameters (container deployment and URI) to update_function docs.

* Dockerfile for Node.js

* Add Empty requirements

* Add Empty requirements

* Black formatter

* Black formatter

* Black formatter

* Black formatter

* Black formatter

* Fail early if container deployment is set for a provider other than AWS.

* Black formatter

* Minor Fix.

* Minor Fix.

* Remove unnecessary spaces

* Use resource ID instead of a random name for the ECR repository

* [Feat] Fix linting and rebase errors

* [aws] Make sure to fail when we cannot log in to ECR

* [aws] Reorganize ECR repository creation

* [aws] Add import of rich (nice display) and boto3 stubs for ECR

* [aws] Do not cache Docker passwords

* [aws] Remove unnecessary config

* [aws] Remove unnecessary Docker login

* [aws] Customize function name for container deployment

* [aws] remove debug output

* [aws] [system] Extend cache system to support containers

* [aws] Fix typo in the Docker image

* [aws] Build arm64 Docker images

* [system] Supporting proper caching of containers with different architectures

* [aws] Prune pip cache after building the image

* [aws] Remove unnecessary package

* [aws] CodeRabbit fixes

* [aws] Adjust other implementations to a new interface

* [aws] Move container implementation to a separate class

* [aws] Fix CodeRabbit issue

* [aws] [whisk] Fuse two container implementations

* [tools] Fix incorrect handling of False values

* [system] Reorient container definition to distinguish between AWS and OpenWhisk support

* [system] Add storage configuration option for benchmark invoke

* [system] Disable rich output for regression

* [benchmark] Separate directory for concurrent build of code package and container

* [system] Extend regression to support containers

* [system] Provide documentation on multi-platform builds

* [aws] Update docs

---------

Co-authored-by: Marcin Copik <mcopik@gmail.com>
prajinkhadka and mcopik authored Nov 7, 2024
1 parent 4f7c5dd commit bf5bc35
Showing 32 changed files with 1,149 additions and 342 deletions.
15 changes: 15 additions & 0 deletions benchmarks/wrappers/aws/python/setup.py
@@ -0,0 +1,15 @@
from distutils.core import setup
from glob import glob
from pkg_resources import parse_requirements

with open('requirements.txt') as f:
    requirements = [str(r) for r in parse_requirements(f)]

setup(
    name='function',
    install_requires=requirements,
    packages=['function'],
    package_dir={'function': '.'},
    package_data={'function': glob('**', recursive=True)},
)
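
The wrapper's `setup.py` installs the handler, the storage helper, and the benchmark code as a single `function` package inside the container image; the `package_data` glob bundles every file in the wrapper directory. As a quick editorial illustration (not part of this commit) of what that glob collects:

```python
# Illustration only: mirrors the glob('**', recursive=True) call used in the
# setup.py above. Run from benchmarks/wrappers/aws/python/; the exact output
# depends on the benchmark that was copied in, so this listing is indicative.
from glob import glob

print(glob('**', recursive=True))
# e.g. ['handler.py', 'storage.py', 'setup.py', 'requirements.txt', ...]
```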

9 changes: 5 additions & 4 deletions config/example.json
@@ -1,13 +1,14 @@
{
"experiments": {
"deployment": "openwhisk",
"deployment": "aws",
"update_code": false,
"update_storage": false,
"download_results": false,
"download_results": false,
"architecture": "arm64",
"container_deployment": true,
"runtime": {
"language": "nodejs",
"version": "16"
"language": "python",
"version": "3.8"
},
"type": "invocation-overhead",
"perf-cost": {
111 changes: 71 additions & 40 deletions config/systems.json
@@ -55,31 +55,46 @@
}
}
},
"architecture": ["x64"]
"architecture": ["x64"],
"deployments": ["package"]
},
"aws": {
"languages": {
"python": {
"base_images": {
"3.11": "amazon/aws-lambda-python:3.11",
"3.10": "amazon/aws-lambda-python:3.10",
"3.9": "amazon/aws-lambda-python:3.9",
"3.8": "amazon/aws-lambda-python:3.8"
"x64": {
"3.11": "amazon/aws-lambda-python:3.11",
"3.10": "amazon/aws-lambda-python:3.10",
"3.9": "amazon/aws-lambda-python:3.9",
"3.8": "amazon/aws-lambda-python:3.8"
},
"arm64": {
"3.11": "amazon/aws-lambda-python:3.11.2024.05.23.17",
"3.10": "amazon/aws-lambda-python:3.10.2024.06.19.11",
"3.9": "amazon/aws-lambda-python:3.9.2024.05.20.23",
"3.8": "amazon/aws-lambda-python:3.8.2024.09.05.16"
}
},
"images": [
"build"
],
"deployment": {
"files": [
"handler.py",
"storage.py"
"storage.py",
"setup.py"
],
"packages": []
}
},
"nodejs": {
"base_images": {
"16": "amazon/aws-lambda-nodejs:16"
"x64": {
"16": "amazon/aws-lambda-nodejs:16"
},
"arm64": {
"16": "amazon/aws-lambda-nodejs:16.2024.09.06.13"
}
},
"images": [
"build"
@@ -95,17 +110,20 @@
}
}
},
"architecture": ["x64", "arm64"]
"architecture": ["x64", "arm64"],
"deployments": ["package", "container"]
},
"azure": {
"languages": {
"python": {
"base_images": {
"3.7": "mcr.microsoft.com/azure-functions/python:3.0-python3.7",
"3.8": "mcr.microsoft.com/azure-functions/python:3.0-python3.8",
"3.9": "mcr.microsoft.com/azure-functions/python:3.0-python3.9",
"3.10": "mcr.microsoft.com/azure-functions/python:4-python3.10",
"3.11": "mcr.microsoft.com/azure-functions/python:4-python3.11"
"x64": {
"3.7": "mcr.microsoft.com/azure-functions/python:3.0-python3.7",
"3.8": "mcr.microsoft.com/azure-functions/python:3.0-python3.8",
"3.9": "mcr.microsoft.com/azure-functions/python:3.0-python3.9",
"3.10": "mcr.microsoft.com/azure-functions/python:4-python3.10",
"3.11": "mcr.microsoft.com/azure-functions/python:4-python3.11"
}
},
"images": [
"build"
@@ -123,9 +141,11 @@
},
"nodejs": {
"base_images": {
"16": "mcr.microsoft.com/azure-functions/node:4-node16",
"18": "mcr.microsoft.com/azure-functions/node:4-node18",
"20": "mcr.microsoft.com/azure-functions/node:4-node20"
"x64": {
"16": "mcr.microsoft.com/azure-functions/node:4-node16",
"18": "mcr.microsoft.com/azure-functions/node:4-node18",
"20": "mcr.microsoft.com/azure-functions/node:4-node20"
}
},
"images": [
"build"
@@ -148,18 +168,21 @@
"username": "docker_user"
}
},
"architecture": ["x64"]
"architecture": ["x64"],
"deployments": ["package"]
},
"gcp": {
"languages": {
"python": {
"base_images": {
"3.7": "ubuntu:22.04",
"3.8": "ubuntu:22.04",
"3.9": "ubuntu:22.04",
"3.10": "ubuntu:22.04",
"3.11": "ubuntu:22.04",
"3.12": "ubuntu:22.04"
"x64": {
"3.7": "ubuntu:22.04",
"3.8": "ubuntu:22.04",
"3.9": "ubuntu:22.04",
"3.10": "ubuntu:22.04",
"3.11": "ubuntu:22.04",
"3.12": "ubuntu:22.04"
}
},
"images": [
"build"
@@ -177,12 +200,14 @@
},
"nodejs": {
"base_images": {
"10": "ubuntu:18.04",
"12": "ubuntu:18.04",
"14": "ubuntu:18.04",
"16": "ubuntu:18.04",
"18": "ubuntu:22.04",
"20": "ubuntu:22.04"
"x64": {
"10": "ubuntu:18.04",
"12": "ubuntu:18.04",
"14": "ubuntu:18.04",
"16": "ubuntu:18.04",
"18": "ubuntu:22.04",
"20": "ubuntu:22.04"
}
},
"images": [
"build"
@@ -200,16 +225,19 @@
}
}
},
"architecture": ["x64"]
"architecture": ["x64"],
"deployments": ["package"]
},
"openwhisk": {
"languages": {
"python": {
"base_images": {
"3.7": "openwhisk/action-python-v3.7",
"3.9": "openwhisk/action-python-v3.9",
"3.10": "openwhisk/action-python-v3.10",
"3.11": "openwhisk/action-python-v3.11"
"x64": {
"3.7": "openwhisk/action-python-v3.7",
"3.9": "openwhisk/action-python-v3.9",
"3.10": "openwhisk/action-python-v3.10",
"3.11": "openwhisk/action-python-v3.11"
}
},
"images": [
"function"
@@ -228,11 +256,13 @@
},
"nodejs": {
"base_images": {
"10": "openwhisk/action-nodejs-v10",
"12": "openwhisk/action-nodejs-v12",
"14": "openwhisk/action-nodejs-v14",
"18": "openwhisk/action-nodejs-v18",
"20": "openwhisk/action-nodejs-v20"
"x64": {
"10": "openwhisk/action-nodejs-v10",
"12": "openwhisk/action-nodejs-v12",
"14": "openwhisk/action-nodejs-v14",
"18": "openwhisk/action-nodejs-v18",
"20": "openwhisk/action-nodejs-v20"
}
},
"images": [
"function"
@@ -249,6 +279,7 @@
}
}
},
"architecture": ["x64"]
"architecture": ["x64"],
"deployments": ["container"]
}
}
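
The new `deployments` list and the per-architecture `base_images` layout above make it possible to reject unsupported configurations up front (see the "fail early if container deployment is set for a provider other than AWS" commit) and to select the right base image for a build. A hedged sketch of how such a check and lookup could work, relying only on the JSON structure shown in this diff; the function and variable names are illustrative, not SeBS's actual API:

```python
# Editorial sketch: validate an experiment config against the platform metadata
# in config/systems.json and resolve a per-architecture base image.
import json

def load(path: str) -> dict:
    with open(path) as f:
        return json.load(f)

experiments = load("config/example.json")["experiments"]
systems = load("config/systems.json")

deployment = experiments["deployment"]          # "aws" in the example above
language = experiments["runtime"]["language"]   # "python"
version = experiments["runtime"]["version"]     # "3.8"
architecture = experiments["architecture"]      # "arm64"

platform = systems[deployment]
if experiments["container_deployment"] and "container" not in platform["deployments"]:
    raise NotImplementedError(f"{deployment} does not support container deployment")
if architecture not in platform["architecture"]:
    raise ValueError(f"{deployment} does not support the {architecture} architecture")

base_image = platform["languages"][language]["base_images"][architecture][version]
print(base_image)  # amazon/aws-lambda-python:3.8.2024.09.05.16
```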
9 changes: 9 additions & 0 deletions dockerfiles/aws/nodejs/Dockerfile.function
@@ -0,0 +1,9 @@
ARG BASE_IMAGE
FROM $BASE_IMAGE
COPY . function/
COPY handler.js .
RUN cd function \
&& npm install --no-package-lock --production \
&& npm cache clean --force

CMD ["handler.handler"]
22 changes: 22 additions & 0 deletions dockerfiles/aws/python/Dockerfile.function
@@ -0,0 +1,22 @@
ARG BASE_IMAGE
FROM $BASE_IMAGE
ARG VERSION
ENV PYTHON_VERSION=${VERSION}

COPY . function/

RUN touch function/__init__.py
RUN if test -f "function/requirements.txt.${PYTHON_VERSION}"; then \
        pip install --no-cache-dir \
            -r function/requirements.txt \
            -r function/requirements.txt.${PYTHON_VERSION} \
            function/ && \
        pip cache purge; \
    else \
        pip install --no-cache-dir \
            -r function/requirements.txt \
            function/ && \
        pip cache purge; \
    fi

CMD ["function/handler.handler"]
2 changes: 1 addition & 1 deletion docs/build.md
@@ -42,7 +42,7 @@ JSON configuration files are needed.

**Build Docker Image** - in this step, we create a new image `function.{platform}.{benchmark}.{language}-{version}`.
Benchmark and all of its dependencies are installed there, and the image can be deployed directly
to the serverless platform. At the moment, this step is used only in OpenWhisk.
to the serverless platform. At the moment, this step is used only on AWS and in OpenWhisk.

## Docker Image Build

2 changes: 1 addition & 1 deletion docs/modularity.md
@@ -340,7 +340,7 @@ This function has been retrieved from the cache and requires refreshing function
In practice, this is often limited to updating logging handlers - see existing implementations for details.

```python
def update_function(self, function: Function, code_package: Benchmark):
def update_function(self, function: Function, code_package: Benchmark, container_deployment: bool, container_uri: str):
```

This function updates the function's code and configuration in the platform.
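
For AWS, honoring the two new parameters could look roughly like the sketch below. `update_function_code` with `ImageUri` or `ZipFile` are real boto3 Lambda operations, but the surrounding structure and attribute names are illustrative rather than the code introduced in this PR:

```python
# Hedged sketch of an update_function implementation branching on the new
# container_deployment flag; self.client, function.name and
# code_package.code_location are assumed attribute names.
def update_function(self, function, code_package, container_deployment: bool, container_uri: str):
    if container_deployment:
        # Container-based function: point Lambda at the freshly pushed ECR image.
        self.client.update_function_code(
            FunctionName=function.name, ImageUri=container_uri
        )
    else:
        # Package-based function: upload the zipped code package.
        with open(code_package.code_location, "rb") as code:
            self.client.update_function_code(
                FunctionName=function.name, ZipFile=code.read()
            )
```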
10 changes: 10 additions & 0 deletions docs/platforms.md
@@ -12,6 +12,16 @@ points for each platform.
> [!WARNING]
> On many platforms, credentials can be provided as environment variables or through the SeBS configuration. SeBS will not store your credentials in the cache. When saving results, SeBS stores user benchmark and experiment configuration for documentation and reproducibility, except for credentials that are erased. If you provide the credentials through JSON input configuration, do not commit nor publish these files anywhere.
### Architectures

By default, SeBS builds functions for the x64 (x86_64) architecture. On AWS, functions can also be built and deployed for ARM CPUs to benefit from the Graviton CPUs available on Lambda.
This change primarily affects functions that make use of dependencies with native builds, such as `torch`, `numpy` or `ffmpeg`.

Such functions can be built as code packages on any platform, as we rely on package managers like pip and npm to provide binary dependencies.
However, special care is needed to build Docker containers: since package installation is part of the Docker build, we cannot natively execute
ARM container binaries on x86 CPUs. To build multi-platform images, we recommend following the official [Docker guidelines](https://docs.docker.com/build/building/multi-platform/#build-multi-platform-images) and providing a static QEMU installation.
On Ubuntu-based distributions, this requires installing an OS package and executing a single Docker command to provide seamless emulation of ARM containers.
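
Once emulation is in place, a quick way to confirm that an image produced on an x86 host really targets ARM is to inspect its metadata. A small editorial sketch, with a placeholder image tag:

```python
# Check the target platform of a locally built image; "Architecture" and "Os"
# come from the image's inspect data. The tag below is a placeholder.
import docker

client = docker.from_env()
image = client.images.get("function.aws.110.dynamic-html.python-3.8")
print(image.attrs["Architecture"], image.attrs["Os"])  # expected: arm64 linux
```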

### Cloud Account Identifiers

SeBS ensures that all locally cached cloud resources are valid by storing a unique identifier associated with each cloud account. Furthermore, we store this identifier in experiment results to easily match results with the cloud account or subscription that was used to obtain them. We use non-sensitive identifiers such as account IDs on AWS, subscription IDs on Azure, and Google Cloud project IDs.
3 changes: 1 addition & 2 deletions docs/usage.md
@@ -4,8 +4,7 @@ For each command you can pass `--verbose` flag to increase the verbosity of the
By default, all scripts will create a cache in the directory `cache` to store code with
dependencies and information on allocated cloud resources.
Benchmarks will be rebuilt after a change in source code is detected.
To enforce redeployment of code and benchmark inputs please use flags `--update-code`
and `--update-storage`, respectively.
To enforce redeployment of code or benchmark inputs, and to enable container deployment (supported on AWS), please use the flags `--update-code`, `--update-storage`, and `--container-deployment`, respectively.

**Note:** The cache does not support updating the cloud region. If you want to deploy benchmarks
to a new cloud region, then use a new cache directory.
3 changes: 1 addition & 2 deletions requirements.aws.txt
@@ -2,5 +2,4 @@ boto3
botocore
flake8-boto3
urllib3
boto3-stubs
boto3-stubs[lambda,s3,apigatewayv2,sts,logs,iam]
boto3-stubs[lambda,s3,apigatewayv2,sts,logs,iam,ecr]
1 change: 1 addition & 0 deletions requirements.txt
@@ -18,4 +18,5 @@ scipy
#
pycurl>=7.43
click>=7.1.2
rich
