Added New Recommender Operator #889

Merged · 11 commits · Jul 2, 2024
6 changes: 3 additions & 3 deletions ads/opctl/operator/lowcode/anomaly/README.md
@@ -58,7 +58,7 @@ The operator will run in your local environment without requiring any additional

## 4. Running anomaly detection on the local container

-To run the anomaly detection detection operator within a local container, follow these steps:
+To run the anomaly detection operator within a local container, follow these steps:

Use the command below to build the anomaly detection container.

@@ -106,7 +106,7 @@ ads operator run -f ~/anomaly/anomaly.yaml --backend-config ~/anomaly/backend_op

## 5. Running anomaly detection in the Data Science job within container runtime

-To execute the anomaly detection detection operator within a Data Science job using container runtime, please follow the steps outlined below:
+To execute the anomaly detection operator within a Data Science job using container runtime, please follow the steps outlined below:

You can use the following command to build the anomaly detection container. This step can be skipped if you have already done this for running the operator within a local container.

@@ -155,7 +155,7 @@ ads opctl watch <OCID>

## 6. Running anomaly detection in the Data Science job within conda runtime

-To execute the anomaly detection detection operator within a Data Science job using conda runtime, please follow the steps outlined below:
+To execute the anomaly detection operator within a Data Science job using conda runtime, please follow the steps outlined below:

You can use the following command to build the anomaly detection conda environment.

16 changes: 16 additions & 0 deletions ads/opctl/operator/lowcode/recommender/MLoperator
@@ -0,0 +1,16 @@
type: recommender
version: v1
conda_type: service
name: Recommender Operator
gpu: no
keywords:
- Recommender
backends:
- job
- operator.local
description: |
  Recommender Systems are designed to suggest relevant items, products, or content to users based on their
  preferences and behaviors. These systems are widely used in various industries such as e-commerce, entertainment,
  and social media to enhance user experience by providing personalized recommendations. They help in increasing user
  engagement, satisfaction, and sales by predicting what users might like or need based on their past interactions
  and the preferences of similar users.
206 changes: 206 additions & 0 deletions ads/opctl/operator/lowcode/recommender/README.md
@@ -0,0 +1,206 @@
# Recommender Operator

Recommender Systems are designed to suggest relevant items, products, or content to users based on their preferences and behaviors. These systems are widely used in various industries such as e-commerce, entertainment, and social media to enhance user experience by providing personalized recommendations. They help in increasing user engagement, satisfaction, and sales by predicting what users might like or need based on their past interactions and the preferences of similar users.


Below are the steps to configure and run the Recommender Operator on different resources.

## 1. Prerequisites

Follow the [CLI Configuration](https://accelerated-data-science.readthedocs.io/en/latest/user_guide/cli/opctl/configure.html) steps from the ADS documentation. This step is mandatory as it sets up default values for different options while running the Recommender Operator on OCI Data Science jobs or OCI Data Flow applications. If you have previously done this and used a flexible shape, make sure to adjust `ml_job_config.ini` with shape config details and `docker_registry` information.

- ocpus = 1
- memory_in_gbs = 16
- docker_registry = `<iad.ocir.io/namespace/>`
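
For orientation, the corresponding entries in `ml_job_config.ini` might look like the sketch below. The `[DEFAULT]` section name is an assumption; keep the section headers and any other keys (compartment, project, shape, and so on) exactly as they appear in your generated file.

```ini
; A sketch only -- the section name and surrounding keys come from your generated ml_job_config.ini.
[DEFAULT]
ocpus = 1
memory_in_gbs = 16
docker_registry = iad.ocir.io/namespace/
```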

## 2. Generating configs

To generate starter configs, run the command below. This will create a list of YAML configs and place them in the folder specified by `--output` (in this example, `~/recommender/`).

```bash
ads operator init -t recommender --overwrite --output ~/recommender/
```

The most important files expected to be generated are:

- `recommender.yaml`: Contains the recommender-related configuration.
- `backend_operator_local_python_config.yaml`: This includes a local backend configuration for running the recommender in a local environment. The environment should be set up manually before running the operator.
- `backend_operator_local_container_config.yaml`: This includes a local backend configuration for running the recommender within a local container. The container should be built before running the operator. Please refer to the instructions below for details on how to accomplish this.
- `backend_job_container_config.yaml`: Contains the Data Science job-related config to run the recommender in a Data Science job within a container (BYOC) runtime. The container should be built and published before running the operator. Please refer to the instructions below for details on how to accomplish this.
- `backend_job_python_config.yaml`: Contains the Data Science job-related config to run the recommender in a Data Science job within a conda runtime. The conda environment should be built and published before running the operator.

All generated configurations should be ready to use without the need for any additional adjustments. However, they are provided as starter kit configurations that can be customized as needed.
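
For orientation, a minimal `recommender.yaml` could look like the sketch below. It is pieced together from the standard ADS operator keys (`kind`, `type`, `version`, `spec`) and the `input_data`/`test_data`/`output_directory` fields shown later in this README; the generated starter config is the authoritative version and may contain additional recommender-specific fields.

```yaml
# A minimal sketch -- prefer the config generated by `ads operator init`.
kind: operator
type: recommender
version: v1
spec:
  input_data:
    url: data.csv            # local path or oci:// URI
  test_data:
    url: test.csv
  output_directory:
    url: result/
```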

## 3. Running recommender on the local conda environment

To run the recommender locally, create and activate a new conda environment (`ads-recommender`), then install all the required libraries listed in the `environment.yaml` file:

```yaml
- report-creator
- cerberus
- "git+https://github.com/oracle/accelerated-data-science.git@feature/recommender#egg=oracle-ads"
```
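
For reference, a complete conda `environment.yaml` wrapping those dependencies might look like the sketch below. The Python version and the use of a `pip` subsection for the git-based install are assumptions; follow the generated file if it differs.

```yaml
# A sketch only -- the generated environment.yaml is the source of truth.
name: ads-recommender
channels:
  - conda-forge
dependencies:
  - python=3.9            # assumed version
  - pip
  - pip:
      - report-creator
      - cerberus
      - "git+https://github.com/oracle/accelerated-data-science.git@feature/recommender#egg=oracle-ads"
```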

Please review the `recommender.yaml` file generated earlier by the `init` command and make any necessary adjustments to the input and output file locations. By default, it assumes that the files are located in the same folder from which the `init` command was executed.

Use the command below to verify the recommender config.

```bash
ads operator verify -f ~/recommender/recommender.yaml
```

Use the following command to run the recommender within the `ads-recommender` conda environment.

```bash
ads operator run -f ~/recommender/recommender.yaml -b local
```

The operator will run in your local environment without requiring any additional modifications.

## 4. Running recommender on the local container

To run the recommender operator within a local container, follow these steps:

Use the command below to build the recommender container.

```bash
ads operator build-image -t recommender
```

This will create a new `recommender:v1` image, with `/etc/operator` as the designated working directory within the container.


Check the `backend_operator_local_container_config.yaml` config file. By default, it should have a `volume` section with the `.oci` configs folder mounted.

```yaml
volume:
- "/Users/<user>/.oci:/root/.oci"
```

Mounting the OCI configs folder is only required if an OCI Object Storage bucket will be used to store the recommender input data or output results. The input and output folders can also be mounted to the container.

```yaml
volume:
- /Users/<user>/.oci:/root/.oci
- /Users/<user>/recommender/data:/etc/operator/data
- /Users/<user>/recommender/result:/etc/operator/result
```

The full config can look like:
```yaml
kind: operator.local
spec:
  image: recommender:v1
  volume:
    - /Users/<user>/.oci:/root/.oci
    - /Users/<user>/recommender/data:/etc/operator/data
    - /Users/<user>/recommender/result:/etc/operator/result
type: container
version: v1
```

Run the recommender within a container using the command below:

```bash
ads operator run -f ~/recommender/recommender.yaml --backend-config ~/recommender/backend_operator_local_container_config.yaml
```

## 5. Running recommender in the Data Science job within container runtime

To execute the recommender operator within a Data Science job using container runtime, please follow the steps outlined below:

You can use the following command to build the recommender container. This step can be skipped if you have already done this for running the operator within a local container.

```bash
ads operator build-image -t recommender
```

This will create a new `recommender:v1` image, with `/etc/operator` as the designated working directory within the container.

Publish the `recommender:v1` container to the [Oracle Container Registry](https://docs.public.oneportal.content.oci.oraclecloud.com/en-us/iaas/Content/Registry/home.htm). To become familiar with OCI, read the documentation links posted below.

- [Access Container Registry](https://docs.public.oneportal.content.oci.oraclecloud.com/en-us/iaas/Content/Registry/Concepts/registryoverview.htm#access)
- [Create repositories](https://docs.public.oneportal.content.oci.oraclecloud.com/en-us/iaas/Content/Registry/Tasks/registrycreatingarepository.htm#top)
- [Push images](https://docs.public.oneportal.content.oci.oraclecloud.com/en-us/iaas/Content/Registry/Tasks/registrypushingimagesusingthedockercli.htm#Pushing_Images_Using_the_Docker_CLI)

To publish `recommender:v1` to OCR, use the command posted below:

```bash
ads operator publish-image recommender:v1 --registry <iad.ocir.io/tenancy/>
```

After the container is published to OCR, it can be used within the Data Science Jobs service. Check the `backend_job_container_config.yaml` config file. It should contain pre-populated infrastructure and runtime sections. The runtime section should contain an image property, something like `image: iad.ocir.io/<tenancy>/recommender:v1`. More details about the supported options can be found in the ADS Jobs documentation - [Run a Container](https://accelerated-data-science.readthedocs.io/en/latest/user_guide/jobs/run_container.html).
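
As a rough guide, `backend_job_container_config.yaml` follows the ADS job YAML layout sketched below. The infrastructure details are pre-populated from your `ads opctl configure` defaults, so typically only the `image` value needs attention; treat the exact key names here as a sketch rather than a definitive reference.

```yaml
# Trimmed sketch -- the generated file contains the full, pre-populated infrastructure spec.
kind: job
spec:
  infrastructure:
    kind: infrastructure
    type: dataScienceJob
    spec:
      # compartment, project, shape, logging, ... (pre-populated)
  runtime:
    kind: runtime
    type: container
    spec:
      image: iad.ocir.io/<tenancy>/recommender:v1
```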

Adjust the `recommender.yaml` config with proper input/output folders. When the recommender is run in the Data Science job, it will not have access to local folders. Therefore, input data and output folders should be placed in the Object Storage bucket. Open the `recommender.yaml` and adjust the following fields:

```yaml
input_data:
  url: oci://bucket@namespace/recommender/input_data/data.csv
output_directory:
  url: oci://bucket@namespace/recommender/result/
```

Run the recommender on Data Science jobs using the command below:

```bash
ads operator run -f ~/recommender/recommender.yaml --backend-config ~/recommender/backend_job_container_config.yaml
```

The logs can be monitored using the `ads opctl watch` command.

```bash
ads opctl watch <OCID>
```

## 6. Running recommender in the Data Science job within conda runtime

To execute the recommender operator within a Data Science job using conda runtime, please follow the steps outlined below:

You can use the following command to build the recommender conda environment.

```bash
ads operator build-conda -t recommender
```

This will create a new `recommender_v1` conda environment and place it in the folder specified via the `ads opctl configure` command.

Use the command below to publish the `recommender_v1` conda environment to the Object Storage bucket.

```bash
ads operator publish-conda -t recommender
```
More details about configuring the CLI can be found here - [Configuring CLI](https://accelerated-data-science.readthedocs.io/en/latest/user_guide/cli/opctl/configure.html).


After the conda environment is published to Object Storage, it can be used within the Data Science Jobs service. Check the `backend_job_python_config.yaml` config file. It should contain pre-populated infrastructure and runtime sections. The runtime section should contain a `conda` section.

```yaml
conda:
  type: published
  uri: oci://bucket@namespace/conda_environments/cpu/recommender/1/recommender_v1
```

More details about supported options can be found in the ADS Jobs documentation - [Run a Python Workload](https://accelerated-data-science.readthedocs.io/en/latest/user_guide/jobs/run_python.html).
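
In context, that `conda` block sits inside the runtime section of `backend_job_python_config.yaml`, roughly as sketched below. The surrounding keys and the `python` runtime type are assumptions based on the general ADS job YAML layout, so defer to the generated file.

```yaml
# Trimmed sketch -- defer to the generated backend_job_python_config.yaml.
kind: job
spec:
  infrastructure:
    kind: infrastructure
    type: dataScienceJob
    spec:
      # pre-populated from `ads opctl configure` defaults
  runtime:
    kind: runtime
    type: python
    spec:
      conda:
        type: published
        uri: oci://bucket@namespace/conda_environments/cpu/recommender/1/recommender_v1
```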

Adjust the `recommender.yaml` config with proper input/output folders. When the recommender is run in the Data Science job, it will not have access to local folders. Therefore, input data and output folders should be placed in the Object Storage bucket. Open the `recommender.yaml` and adjust the following fields:

```yaml
input_data:
  url: oci://bucket@namespace/recommender/input_data/data.csv
output_directory:
  url: oci://bucket@namespace/recommender/result/
test_data:
  url: oci://bucket@namespace/recommender/input_data/test.csv
```

Run the recommender on Data Science jobs using the command below:

```bash
ads operator run -f ~/recommender/recommender.yaml --backend-config ~/recommender/backend_job_python_config.yaml
```

The logs can be monitored using the `ads opctl watch` command.

```bash
ads opctl watch <OCID>
```
5 changes: 5 additions & 0 deletions ads/opctl/operator/lowcode/recommender/__init__.py
@@ -0,0 +1,5 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*--

# Copyright (c) 2023 Oracle and/or its affiliates.
# Licensed under the Universal Permissive License v 1.0 as shown at https://oss.oracle.com/licenses/upl/
82 changes: 82 additions & 0 deletions ads/opctl/operator/lowcode/recommender/__main__.py
@@ -0,0 +1,82 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*--

# Copyright (c) 2024 Oracle and/or its affiliates.
# Licensed under the Universal Permissive License v 1.0 as shown at https://oss.oracle.com/licenses/upl/

import json
import os
import sys
from typing import Dict, List

import yaml

from ads.opctl import logger
from ads.opctl.operator.common.const import ENV_OPERATOR_ARGS
from ads.opctl.operator.common.utils import _parse_input_args

from .model.recommender_dataset import RecommenderDatasets
from .operator_config import RecommenderOperatorConfig
from .model.factory import RecommenderOperatorModelFactory

def operate(operator_config: RecommenderOperatorConfig) -> None:
    """Runs the recommender operator."""

    datasets = RecommenderDatasets(operator_config)
    RecommenderOperatorModelFactory.get_model(
        operator_config, datasets
    ).generate_report()


def verify(spec: Dict, **kwargs: Dict) -> bool:
    """Verifies the recommender operator config."""
    operator = RecommenderOperatorConfig.from_dict(spec)
    msg_header = (
        f"{'*' * 50} The operator config has been successfully verified {'*' * 50}"
    )
    print(msg_header)
    print(operator.to_yaml())
    print("*" * len(msg_header))


def main(raw_args: List[str]):
    """The entry point of the recommender operator."""
    args, _ = _parse_input_args(raw_args)
    if not args.file and not args.spec and not os.environ.get(ENV_OPERATOR_ARGS):
        logger.info(
            "Please specify -f[--file] or -s[--spec] or "
            f"pass operator's arguments via {ENV_OPERATOR_ARGS} environment variable."
        )
        return

    logger.info("-" * 100)
    logger.info(
        f"{'Running' if not args.verify else 'Verifying'} the recommender operator."
    )

    yaml_string = ""
    if args.spec or os.environ.get(ENV_OPERATOR_ARGS):
        operator_spec_str = args.spec or os.environ.get(ENV_OPERATOR_ARGS)
        try:
            # The spec may arrive as a JSON string; normalize it to YAML.
            yaml_string = yaml.safe_dump(json.loads(operator_spec_str))
        except json.JSONDecodeError:
            # Not JSON -- assume the string is already YAML.
            yaml_string = yaml.safe_dump(yaml.safe_load(operator_spec_str))
        except:
            # Fall back to passing the raw string through unchanged.
            yaml_string = operator_spec_str

    operator_config = RecommenderOperatorConfig.from_yaml(
        uri=args.file,
        yaml_string=yaml_string,
    )

    logger.info(operator_config.to_yaml())

    # run operator
    if args.verify:
        verify(operator_config)
    else:
        operate(operator_config)


if __name__ == "__main__":
    main(sys.argv[1:])
33 changes: 33 additions & 0 deletions ads/opctl/operator/lowcode/recommender/cmd.py
@@ -0,0 +1,33 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*--

# Copyright (c) 2023 Oracle and/or its affiliates.
# Licensed under the Universal Permissive License v 1.0 as shown at https://oss.oracle.com/licenses/upl/

from typing import Dict

from ads.opctl.operator.common.operator_yaml_generator import YamlGenerator
from ads.opctl.operator.common.utils import _load_yaml_from_uri


def init(**kwargs: Dict) -> str:
    """
    Generates the operator config based on the schema.

    Properties
    ----------
    kwargs: (Dict, optional).
        Additional key value arguments.

        - type: str
            The type of the operator.

    Returns
    -------
    str
        The YAML specification generated based on the schema.
    """

    return YamlGenerator(
        schema=_load_yaml_from_uri(__file__.replace("cmd.py", "schema.yaml"))
    ).generate_example_dict(values={"type": kwargs.get("type")})