Feat/llm predictor #1

Open · wants to merge 9 commits into main from feat/llm_predictor
Conversation

915-Muscalagiu-AncaIoana (Collaborator):

No description provided.

915-Muscalagiu-AncaIoana force-pushed the feat/llm_predictor branch 3 times, most recently from c3add2d to a97f838 (November 24, 2024 15:07)
recsys/inference/llm_ranking_predictor.py (two earlier review threads on this file are resolved)
"article_ids": article_ids,
}

def _postprocess_output(self, output):
Member:

Move def _postprocess_output(self, output): after def _preprocess_features(self, features): so the methods follow a natural reading order.

    - Categorical features: These describe qualitative aspects, like product category, color, and material.
    3. Your response should only include the probability of purchase for the positive class (e.g., likelihood of being purchased), as a value between 0 and 1.

    ### Product Features:
Member:

Are these all product features? Aren't some of them customer features?

Collaborator (Author):

Indeed, it also includes customer features.


    scores = []
    for feature in preprocessed_features:
        langchain_output = self.llm.invoke(feature)
Member:

Why do we invoke the LLM per feature?

Collaborator (Author):

It's indeed not invoked on a single feature, but on the feature set of one candidate together with the customer features, which is one data point that needs to be predicted.
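To make this concrete, here is a hedged sketch of what each loop element represents; the input structure and prompt wording are illustrative assumptions, not the PR's actual code:

    # Illustrative sketch: each element handed to self.llm.invoke() is one
    # (customer, candidate) feature set, i.e. one ranking data point.
    def _preprocess_features(self, features: dict) -> list[str]:
        customer = features["customer_features"]
        prompts = []
        for candidate in features["candidates"]:
            # One prompt = one (customer, candidate) pair to score.
            prompts.append(
                f"Customer features: {customer}\n"
                f"Candidate product features: {candidate}\n"
                "Respond only with the probability of purchase, between 0 and 1."
            )
        return prompts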

    return project, project.get_feature_store()


    def get_secrets_api():
        connection = hopsworks.connection(host="c.app.hopsworks.ai",
                                          hostname_verification=False,
Member:

Have you tried this without adding host="c.app.hopsworks.ai", hostname_verification=False, port=443?

You could try to access the get_secrets_api() method directly from the project returned by login. That is usually how it was done throughout the project.

Collaborator (Author):

The problem is that the secrets API is not linked to the project instance in Hopsworks; it lives at the level of the user (hence you also need a token with the user scope enabled, which is not a token's default scope). Therefore you have to take it from hopsworks.connection(), not from the project. Now, trying this with the default configuration of connection() produces this error:

    File "/Users/ancaioanamuscalagiu/Documents/hands-on-recommender-system/.venv/lib/python3.11/site-packages/hopsworks/client/external.py", line 38, in __init__
        raise exceptions.ExternalClientError("host")
    hopsworks.client.exceptions.ExternalClientError: host

which, if we look it up in their source code, can be traced back to:

    """Initializes a client in an external environment such as AWS Sagemaker."""
    if not host:
        raise exceptions.ExternalClientError("host")

So when it is an external client (such as my local machine trying to perform this connection), it expects the host parameter to be set.
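For reference, a sketch of the explicit external-client connection under discussion; the API key placeholder is an assumption added for completeness, while host, hostname_verification, and port=443 are the values mentioned in this thread:

    import hopsworks

    # Explicit external-client connection (local machine -> Hopsworks cloud).
    # The api_key_value placeholder is hypothetical; the other arguments are
    # the ones discussed above.
    connection = hopsworks.connection(
        host="c.app.hopsworks.ai",
        hostname_verification=False,
        port=443,
        api_key_value="YOUR_HOPSWORKS_API_KEY",  # hypothetical placeholder
    )
    secrets_api = connection.get_secrets_api()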

    secrets = secrets_api.get_secrets()
    existing_secret_keys = [secret.name for secret in secrets]
    # Create the OPENAI_API_KEY secret if it doesn't exist
    if "OPENAI_API_KEY" not in existing_secret_keys:
Member:

If this runs on the cloud, the settings will not load the .env file; thus, it will load an empty value. I would avoid doing this step. Better to assert these values and crash the program with a clear message if they are missing.
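A minimal fail-fast sketch of that suggestion; the settings object and field names are illustrative assumptions:

    # Hypothetical fail-fast check: crash early with a clear message instead
    # of silently reading empty values when the .env file is absent.
    def assert_required_settings(settings) -> None:
        missing = [
            name
            for name in ("OPENAI_API_KEY", "HOPSWORKS_API_KEY")
            if not getattr(settings, name, None)
        ]
        if missing:
            raise RuntimeError(
                f"Missing required settings: {', '.join(missing)}. "
                "Note that the .env file is not loaded when running on the cloud."
            )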

Member:

A safer option would be to override the settings object at start-up time, based on the Hopsworks secrets. That way the settings object is the single source of truth.
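A sketch of that start-up override, assuming a mutable settings object and the secrets API from above (all names illustrative):

    # Hypothetical start-up hook: copy Hopsworks secrets into the settings
    # object so downstream code reads a single source of truth.
    def override_settings_from_secrets(settings, secrets_api) -> None:
        for secret in secrets_api.get_secrets():
            if hasattr(settings, secret.name):
                setattr(settings, secret.name, secret.value)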

Member:

To avoid adding these secrets manually, we could create a separate script that adds them based on the settings file, but I would make it an explicit operation to avoid weird behavior.
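A sketch of that explicit, manually-run script, assuming the secrets API exposes create_secret as in recent hopsworks versions (the other names are illustrative):

    # Hypothetical one-off script: push values from the settings file into
    # Hopsworks secrets. Run explicitly, never implicitly at import time.
    def sync_secrets_from_settings(settings, secrets_api) -> None:
        existing = {secret.name for secret in secrets_api.get_secrets()}
        if "OPENAI_API_KEY" not in existing:
            secrets_api.create_secret("OPENAI_API_KEY", settings.OPENAI_API_KEY)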

    }

    def _postprocess_output(self, output):
        return float(output['text'].split(':')[1].strip())
Member:

I would also check if the output is a float; otherwise do a try/except, returning a minimum score or something, to avoid crashing the program.

You can do that easily with Pydantic classes + LangChain to check that the value is a float within an expected range.
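A minimal sketch of that guard (Pydantic-style validation; the fallback value and field name are illustrative assumptions):

    from pydantic import BaseModel, Field, ValidationError

    class PurchaseScore(BaseModel):
        # The probability of purchase must be a float within [0, 1].
        probability: float = Field(ge=0.0, le=1.0)

    # Hypothetical defensive parser: return a minimum score on any failure
    # instead of crashing the ranking service.
    def postprocess_output(output: dict, fallback: float = 0.0) -> float:
        try:
            raw = output["text"].split(":")[1].strip()
            return PurchaseScore(probability=float(raw)).probability
        except (KeyError, IndexError, ValueError, ValidationError):
            return fallback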

iusztinpaul pushed a commit that referenced this pull request Nov 28, 2024
915-Muscalagiu-AncaIoana force-pushed the feat/llm_predictor branch 2 times, most recently from 7b5d5d5 to 80c1a99 (December 3, 2024 07:34)