
Internal server error when sending a request to a local endpoint after performing sam-local start-api --debug. #6600

Closed
ghost opened this issue Jan 24, 2024 · 7 comments
Labels
blocked/more-info-needed More info is needed from the requester. If no response in 14 days, it will become stale.

Comments


ghost commented Jan 24, 2024

Description:

Thank you for taking a look at this issue. I am receiving an Internal server error when sending a request to a local endpoint after running `sam local start-api --debug`.

Steps to reproduce:

Performed a sam init; the options chosen are below:
1 (Hello World Example)
y (zip)
N (xray)
N (Cloudwatch)
lambdalab (project name)

Created a requirements.txt file in the hello_world folder containing the following:
langchain==0.1.1
langchain-openai==0.0.3

Entered the code below into app.py

import json
import boto3
from langchain_openai import OpenAI
import requests


def lambda_handler(event, context):
    ssh_key = get_ssh_key() 
    if ssh_key:
        print("Retrieved SSH key:", ssh_key)
    response_body = {
        "OpenAISSHKey": ssh_key
    }
    llm = OpenAI()
    name = llm.invoke(
        "What are some theories about the relationship between unemployment and inflation?"
    )
    return {
        "statusCode": 200,
        "body": name,
    }

def get_ssh_key():
    openai_sshkey = 'open-ai-ssh-key'
    ssm_client = boto3.client('ssm')
    try:
        response = ssm_client.get_parameter(
            Name=openai_sshkey,
            WithDecryption=True 
        )
        return response['Parameter']['Value']
    except ssm_client.exceptions.ParameterNotFound as e:
        print("SSH key parameter not found:", e)
        return None
    except Exception as e:
        print("Error retrieving SSH key:", e)
        return None

Then entered sam local start-api --debug in the terminal

Observed result:

Afterwards I sent a request to http://127.0.0.1:3000/hello and got the response below.
{"message":"Internal server error"}

The error in the terminal states: Unable to import module 'app': No module named 'langchain_openai'
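One quick way to confirm whether a dependency made it into the built artifact (a standalone sketch, not part of the SAM workflow; run it with the build directory on sys.path) is to probe importability:

```python
import importlib.util

def module_available(name: str) -> bool:
    """Report whether a module can be imported in the current environment."""
    return importlib.util.find_spec(name) is not None

# Stdlib modules are always present; third-party ones only if installed/bundled.
print(module_available("json"))              # True on any CPython
print(module_available("langchain_openai"))  # False when the dependency is missing
```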

The full message is below:
sam local start-api --debug

2024-01-23 20:33:54,610 | Config file location: /Users/mike/Desktop/lambdalab/samconfig.toml               
2024-01-23 20:33:54,614 | Loading configuration values from [default.['local', 'start-api'].parameters]    
(env.command_name.section) in config file at '/Users/mike/Desktop/lambdalab/samconfig.toml'...             
2024-01-23 20:33:54,615 | Configuration values successfully loaded.                                        
2024-01-23 20:33:54,616 | Configuration values are: {'stack_name': 'lambdalab', 'warm_containers': 'EAGER'}
2024-01-23 20:33:54,621 | Using SAM Template at /Users/mike/Desktop/lambdalab/template.yaml                
2024-01-23 20:33:54,680 | Using config file: samconfig.toml, config environment: default                   
2024-01-23 20:33:54,681 | Expand command line arguments to:                                                
2024-01-23 20:33:54,681 | --template_file=/Users/mike/Desktop/lambdalab/template.yaml --host=127.0.0.1     
--port=3000 --static_dir=public --layer_cache_basedir=/Users/mike/.aws-sam/layers-pkg                      
--warm_containers=EAGER --container_host=localhost --container_host_interface=127.0.0.1                    
2024-01-23 20:33:54,878 | local start-api command is called                                                
2024-01-23 20:33:54,882 | No Parameters detected in the template                                           
2024-01-23 20:33:54,919 | There is no customer defined id or cdk path defined for resource                 
HelloWorldFunction, so we will use the resource logical id as the resource id                              
2024-01-23 20:33:54,920 | There is no customer defined id or cdk path defined for resource                 
ServerlessRestApi, so we will use the resource logical id as the resource id                               
2024-01-23 20:33:54,922 | 0 stacks found in the template                                                   
2024-01-23 20:33:54,923 | No Parameters detected in the template                                           
2024-01-23 20:33:54,936 | There is no customer defined id or cdk path defined for resource                 
HelloWorldFunction, so we will use the resource logical id as the resource id                              
2024-01-23 20:33:54,937 | There is no customer defined id or cdk path defined for resource                 
ServerlessRestApi, so we will use the resource logical id as the resource id                               
2024-01-23 20:33:54,938 | 2 resources found in the stack                                                   
2024-01-23 20:33:54,939 | Found Serverless function with name='HelloWorldFunction' and                     
CodeUri='hello_world/'                                                                                     
2024-01-23 20:33:54,940 | --base-dir is not presented, adjusting uri hello_world/ relative to              
/Users/mike/Desktop/lambdalab/template.yaml                                                                
2024-01-23 20:33:54,942 | watch resource /Users/mike/Desktop/lambdalab/template.yaml                       
2024-01-23 20:33:54,943 | Create Observer for resource /Users/mike/Desktop/lambdalab/template.yaml with    
recursive True                                                                                             
2024-01-23 20:33:54,946 | watch resource /Users/mike/Desktop/lambdalab/template.yaml's parent              
/Users/mike/Desktop/lambdalab                                                                              
2024-01-23 20:33:54,947 | Create Observer for resource /Users/mike/Desktop/lambdalab with recursive False  
2024-01-23 20:33:54,957 | Initializing the lambda functions containers.                                    
2024-01-23 20:33:54,960 | Async execution started                                                          
2024-01-23 20:33:54,961 | Invoking function functools.partial(<function                                    
InvokeContext._initialize_all_functions_containers.<locals>.initialize_function_container at 0x108bf22a0>, 
Function(function_id='HelloWorldFunction', name='HelloWorldFunction', functionname='HelloWorldFunction',   
runtime='python3.9', memory=128, timeout=3, handler='app.lambda_handler', imageuri=None, packagetype='Zip',
imageconfig=None, codeuri='/Users/mike/Desktop/lambdalab/hello_world', environment=None, rolearn=None,     
layers=[], events={'HelloWorld': {'Type': 'Api', 'Properties': {'Path': '/hello', 'Method': 'get',         
'RestApiId': 'ServerlessRestApi'}}}, metadata={'SamResourceId': 'HelloWorldFunction'}, inlinecode=None,    
codesign_config_arn=None, architectures=['x86_64'], function_url_config=None,                              
function_build_info=<FunctionBuildInfo.BuildableZip: ('BuildableZip', 'Regular ZIP function which can be   
build with SAM CLI')>, stack_path='', runtime_management_config=None))                                     
2024-01-23 20:33:54,969 | Waiting for async results                                                        
2024-01-23 20:33:54,976 | No environment variables found for function 'HelloWorldFunction'                 
2024-01-23 20:33:54,977 | Loading AWS credentials from session with profile 'None'                         
2024-01-23 20:33:54,991 | Resolving code path. Cwd=/Users/mike/Desktop/lambdalab,                          
CodeUri=/Users/mike/Desktop/lambdalab/hello_world                                                          
2024-01-23 20:33:54,991 | Resolved absolute path to code is /Users/mike/Desktop/lambdalab/hello_world      
2024-01-23 20:33:56,959 | watch resource /Users/mike/Desktop/lambdalab/hello_world                         
2024-01-23 20:33:56,960 | Create Observer for resource /Users/mike/Desktop/lambdalab/hello_world with      
recursive True                                                                                             
2024-01-23 20:33:56,962 | watch resource /Users/mike/Desktop/lambdalab/hello_world's parent                
/Users/mike/Desktop/lambdalab                                                                              
2024-01-23 20:33:56,964 | Code /Users/mike/Desktop/lambdalab/hello_world is not a zip/jar file             
2024-01-23 20:33:58,481 | Local image is up-to-date                                                        
2024-01-23 20:33:58,499 | Using local image: public.ecr.aws/lambda/python:3.9-rapid-x86_64.                
                                                                                                           
2024-01-23 20:33:58,500 | Mounting /Users/mike/Desktop/lambdalab/hello_world as /var/task:ro,delegated,    
inside runtime container                                                                                   
2024-01-23 20:33:59,118 | Async execution completed                                                        
2024-01-23 20:33:59,120 | Containers Initialization is done.                                               
2024-01-23 20:33:59,122 | Found '1' API Events in Serverless function with name 'HelloWorldFunction'       
2024-01-23 20:33:59,123 | Detected Inline Swagger definition                                               
2024-01-23 20:33:59,125 | Parsing Swagger document using 2.0 specification                                 
2024-01-23 20:33:59,126 | Lambda function integration not found in Swagger document at path='/hello'       
method='get'                                                                                               
2024-01-23 20:33:59,128 | Found '0' APIs in resource 'ServerlessRestApi'                                   
2024-01-23 20:33:59,129 | Found '0' authorizers in resource 'ServerlessRestApi'                            
2024-01-23 20:33:59,130 | Removed duplicates from '0' Explicit APIs and '1' Implicit APIs to produce '1'   
APIs                                                                                                       
2024-01-23 20:33:59,132 | 1 APIs found in the template                                                     
2024-01-23 20:33:59,136 | Mounting HelloWorldFunction at http://127.0.0.1:3000/hello [GET]                 
2024-01-23 20:33:59,138 | You can now browse to the above endpoints to invoke your functions. You do not   
need to restart/reload SAM CLI while working on your functions, changes will be reflected                  
instantly/automatically. If you used sam build before running local commands, you will need to re-run sam  
build for the changes to be picked up. You only need to restart SAM CLI if you update your AWS SAM template
2024-01-23 20:33:59,141 | Localhost server is starting up. Multi-threading = True                          
2024-01-23 20:33:59 WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
 * Running on http://127.0.0.1:3000
2024-01-23 20:33:59 Press CTRL+C to quit
2024-01-23 20:34:56,834 | Constructed Event 1.0 to invoke Lambda. Event: {'version': '1.0', 'httpMethod':  
'GET', 'body': None, 'resource': '/hello', 'requestContext': {'resourceId': '123456', 'apiId':             
'1234567890', 'resourcePath': '/hello', 'httpMethod': 'GET', 'requestId':                                  
'bec3393c-c88c-4981-9b41-91b18ee77f36', 'accountId': '123456789012', 'stage': 'Prod', 'identity':          
{'apiKey': None, 'userArn': None, 'cognitoAuthenticationType': None, 'caller': None, 'userAgent': 'Custom  
User Agent String', 'user': None, 'cognitoIdentityPoolId': None, 'cognitoAuthenticationProvider': None,    
'sourceIp': '127.0.0.1', 'accountId': None}, 'extendedRequestId': None, 'path': '/hello', 'protocol':      
'HTTP/1.1', 'domainName': '127.0.0.1:3000', 'requestTimeEpoch': 1706060034, 'requestTime':                 
'24/Jan/2024:01:33:54 +0000'}, 'queryStringParameters': None, 'multiValueQueryStringParameters': None,     
'headers': {'Host': '127.0.0.1:3000', 'Connection': 'keep-alive', 'Sec-Ch-Ua': '"Not_A Brand";v="8",       
"Chromium";v="120", "Google Chrome";v="120"', 'Sec-Ch-Ua-Mobile': '?0', 'Sec-Ch-Ua-Platform': '"macOS"',   
'Upgrade-Insecure-Requests': '1', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)           
AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36', 'Accept':                          
'text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,applicati
on/signed-exchange;v=b3;q=0.7', 'Sec-Fetch-Site': 'none', 'Sec-Fetch-Mode': 'navigate', 'Sec-Fetch-User':  
'?1', 'Sec-Fetch-Dest': 'document', 'Accept-Encoding': 'gzip, deflate, br', 'Accept-Language':             
'en-US,en;q=0.9', 'X-Forwarded-Proto': 'http', 'X-Forwarded-Port': '3000'}, 'multiValueHeaders': {'Host':  
['127.0.0.1:3000'], 'Connection': ['keep-alive'], 'Sec-Ch-Ua': ['"Not_A Brand";v="8", "Chromium";v="120",  
"Google Chrome";v="120"'], 'Sec-Ch-Ua-Mobile': ['?0'], 'Sec-Ch-Ua-Platform': ['"macOS"'],                  
'Upgrade-Insecure-Requests': ['1'], 'User-Agent': ['Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)        
AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36'], 'Accept':                         
['text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,applicat
ion/signed-exchange;v=b3;q=0.7'], 'Sec-Fetch-Site': ['none'], 'Sec-Fetch-Mode': ['navigate'],              
'Sec-Fetch-User': ['?1'], 'Sec-Fetch-Dest': ['document'], 'Accept-Encoding': ['gzip, deflate, br'],        
'Accept-Language': ['en-US,en;q=0.9'], 'X-Forwarded-Proto': ['http'], 'X-Forwarded-Port': ['3000']},       
'pathParameters': None, 'stageVariables': None, 'path': '/hello', 'isBase64Encoded': False}                
2024-01-23 20:34:56,858 | Found one Lambda function with name 'HelloWorldFunction'                         
2024-01-23 20:34:56,859 | Invoking app.lambda_handler (python3.9)                                          
2024-01-23 20:34:56,862 | No environment variables found for function 'HelloWorldFunction'                 
2024-01-23 20:34:56,864 | Resolving code path. Cwd=/Users/mike/Desktop/lambdalab,                          
CodeUri=/Users/mike/Desktop/lambdalab/hello_world                                                          
2024-01-23 20:34:56,866 | Resolved absolute path to code is /Users/mike/Desktop/lambdalab/hello_world      
2024-01-23 20:34:56,882 | Reuse the created warm container for Lambda function 'HelloWorldFunction'        
2024-01-23 20:34:56,908 | Lambda function 'HelloWorldFunction' is already running                          
2024-01-23 20:34:56,928 | Starting a timer for 3 seconds for function 'HelloWorldFunction'                 
Traceback (most recent call last): Unable to import module 'app': No module named 'langchain_openai'
END RequestId: 1b62ee5e-94bd-4aa1-973b-269570d01cef
REPORT RequestId: 1b62ee5e-94bd-4aa1-973b-269570d01cef	Init Duration: 2.35 ms	Duration: 654.46 ms	Billed Duration: 655 ms	Memory Size: 128 MB	Max Memory Used: 128 MB	
2024-01-23 20:34:57,662 | Unable to find Click Context for getting session_id.                             
2024-01-23 20:34:57,666 | Lambda returned empty body!                                                      
2024-01-23 20:34:57,667 | Invalid lambda response received: Invalid API Gateway Response Keys:             
{'stackTrace', 'requestId', 'errorType', 'errorMessage'} in {'errorMessage': "Unable to import module      
'app': No module named 'langchain_openai'", 'errorType': 'Runtime.ImportModuleError', 'requestId':         
'1b62ee5e-94bd-4aa1-973b-269570d01cef', 'stackTrace': []}                                                  
2024-01-23 20:34:57 127.0.0.1 - - [23/Jan/2024 20:34:57] "GET /hello HTTP/1.1" 502 -
2024-01-23 20:34:57 127.0.0.1 - - [23/Jan/2024 20:34:57] "GET /favicon.ico HTTP/1.1" 403 -

Expected result:

A string in the response
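As an aside on the 502 in the log above: SAM CLI returns 502 because the runtime handed back an error payload (errorMessage/errorType keys) instead of the API Gateway proxy response shape (statusCode/body). A minimal illustration of the distinction (an editorial helper, not SAM CLI code):

```python
def looks_like_runtime_error(payload: dict) -> bool:
    """A Lambda runtime error payload carries errorMessage/errorType instead of
    the statusCode/body keys an API Gateway proxy integration expects."""
    return "errorMessage" in payload and "statusCode" not in payload

# The payload from the debug log above:
error_payload = {
    "errorMessage": "Unable to import module 'app': No module named 'langchain_openai'",
    "errorType": "Runtime.ImportModuleError",
    "requestId": "1b62ee5e-94bd-4aa1-973b-269570d01cef",
    "stackTrace": [],
}
print(looks_like_runtime_error(error_payload))                      # True
print(looks_like_runtime_error({"statusCode": 200, "body": "ok"}))  # False
```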

Additional environment details (e.g. Windows, Mac, Amazon Linux):

  1. OS: Mac
  2. sam --version: SAM CLI, version 1.94.0
  3. AWS region: us-east-1

Output of `sam --info`:
{
  "version": "1.94.0",
  "system": {
    "python": "3.11.6",
    "os": "macOS-13.4.1-x86_64-i386-64bit"
  },
  "additional_dependencies": {
    "docker_engine": "24.0.7",
    "aws_cdk": "Not available",
    "terraform": "1.4.3-dev"
  },
  "available_beta_feature_env_vars": [
    "SAM_CLI_BETA_FEATURES",
    "SAM_CLI_BETA_BUILD_PERFORMANCE",
    "SAM_CLI_BETA_TERRAFORM_SUPPORT",
    "SAM_CLI_BETA_RUST_CARGO_LAMBDA"
  ]


@ghost ghost added the stage/needs-triage Automatically applied to new issues and PRs, indicating they haven't been looked at. label Jan 24, 2024
jysheng123 (Contributor) commented:

Hi, in a normal project are you able to import that module in the first place? This does not seem like a SAM-specific issue; it looks like your project cannot find that module at all. Could you verify that the package is installed and that you can use it normally without SAM? If so, that would indicate that something in SAM may be the issue. Thanks

@jysheng123 jysheng123 added the blocked/more-info-needed More info is needed from the requester. If no response in 14 days, it will become stale. label Jan 29, 2024
ghost (Author) commented Jan 30, 2024

Thank you for responding to this issue. I believe this is a SAM-related issue. I am using the same code in a Flask app as in the SAM app (both shown below). The Flask app responds with a string, showing that it is not a code-related issue, while the SAM app returns the error shown below.

Flask app:

from flask import Flask
import json
import boto3
from langchain_openai import OpenAI

app = Flask(__name__)

@app.route("/")
def hello_world():
    ssh_key = get_ssh_key() 
    if ssh_key:
        print("Retrieved SSH key:", ssh_key)
    response_body = {
        "OpenAISSHKey": ssh_key
    }
    llm = OpenAI()
    name = llm.invoke("What are some theories about the relationship between unemployment and inflation?")
    return name

def get_ssh_key():
    openai_sshkey = 'open-ai-ssh-key'
    ssm_client = boto3.client('ssm')
    try:
        response = ssm_client.get_parameter(
            Name=openai_sshkey,
            WithDecryption=True 
        )
        return response['Parameter']['Value']
    except ssm_client.exceptions.ParameterNotFound as e:
        print("SSH key parameter not found:", e)
        return None
    except Exception as e:
        print("Error retrieving SSH key:", e)
        return None

Flask Endpoint output:

  1. Phillips Curve: This theory suggests an inverse relationship between unemployment and inflation. As unemployment decreases, wages increase, leading to higher demand for goods and services and thus, higher prices.
  2. Demand-Pull Inflation: This theory states that when unemployment is low, consumer demand for goods and services increases, leading to an increase in prices.
  3. Cost-Push Inflation: This theory suggests that when unemployment is high, businesses are unable to increase prices due to weak demand. As a result, they may reduce production costs by cutting wages, leading to a decrease in consumer purchasing power and lower prices.
  4. Modern Monetary Theory: This theory argues that there is no direct relationship between unemployment and inflation and that inflation is determined by the government's spending and taxation policies.
  5. Structural Unemployment: This theory suggests that unemployment and inflation are not directly related. Instead, structural unemployment, caused by changes in the economy or technological advancements, can lead to inflation if businesses need to increase prices to cover the costs of adapting to these changes.
  6. Rational Expectations Theory: This theory argues that people's expectations about future inflation can influence their behavior and lead to changes in the current inflation rate.
  7. Natural Rate of Unemployment: According to this theory, there is a

SAM app:

import json
import boto3
from langchain_openai import OpenAI

def first_time_users(event, context):
    ssh_key = get_ssh_key() 
    if ssh_key:
        print("Retrieved SSH key:", ssh_key)
    response_body = {
        "OpenAISSHKey": ssh_key
    }
    llm = OpenAI()
    name = llm.invoke("What are some theories about the relationship between unemployment and inflation?")
    return {
        "statusCode": 200,
        "body": name,
    }

def get_ssh_key():
    openai_sshkey = 'open-ai-ssh-key'
    ssm_client = boto3.client('ssm')
    try:
        response = ssm_client.get_parameter(
            Name=openai_sshkey,
            WithDecryption=True 
        )
        return response['Parameter']['Value']
    except ssm_client.exceptions.ParameterNotFound as e:
        print("SSH key parameter not found:", e)
        return None
    except Exception as e:
        print("Error retrieving SSH key:", e)
        return None

SAM Endpoint Output:
{"message":"Internal server error"}

jysheng123 (Contributor) commented:

Hi, have you run sam build before running sam local start-api?

ghost (Author) commented Jan 30, 2024

Hi, thank you for the response. Yes, I ran sam build before running sam local start-api.

@mndeveci mndeveci removed the stage/needs-triage Automatically applied to new issues and PRs, indicating they haven't been looked at. label Jan 31, 2024
mndeveci (Contributor) commented:

Hi there,

I've tried to reproduce this issue. I have the following files in my project:

template.yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: sam-app

Globals:
  Function:
    Timeout: 3
    MemorySize: 128

Resources:
  HelloWorldFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: hello_world/
      Handler: app.lambda_handler
      Runtime: python3.9
      Architectures:
        - x86_64
      Events:
        HelloWorld:
          Type: Api
          Properties:
            Path: /hello
            Method: get
hello_world/requirements.txt
langchain==0.1.1
langchain-openai==0.0.3
hello_world/app.py
import json
import boto3
from langchain_openai import OpenAI
import requests


def lambda_handler(event, context):
    ssh_key = get_ssh_key()
    if ssh_key:
        print("Retrieved SSH key:", ssh_key)
    response_body = {
        "OpenAISSHKey": ssh_key
    }
    llm = OpenAI()
    name = llm.invoke(
        "What are some theories about the relationship between unemployment and inflation?"
    )
    return {
        "statusCode": 200,
        "body": name,
    }

def get_ssh_key():
    openai_sshkey = 'open-ai-ssh-key'
    ssm_client = boto3.client('ssm')
    try:
        response = ssm_client.get_parameter(
            Name=openai_sshkey,
            WithDecryption=True
        )
        return response['Parameter']['Value']
    except ssm_client.exceptions.ParameterNotFound as e:
        print("SSH key parameter not found:", e)
        return None
    except Exception as e:
        print("Error retrieving SSH key:", e)
        return None

With this setup, if I run sam build && sam local start-api and then curl the endpoint, I get the following error:

Invalid lambda response received: Invalid API Gateway Response Keys: {'errorMessage', 'stackTrace', 'requestId', 'errorType'} in {'errorMessage': "Unable to import module 'app': cannot import name 'DEFAULT_CIPHERS' from 'urllib3.util.ssl_'
(/var/task/urllib3/util/ssl_.py)", 'errorType': 'Runtime.ImportModuleError', 'requestId': '749d3de4-9e5b-4ade-bdce-6582a53297c3', 'stackTrace': []}

The boto3 library still requires urllib3 v1, so I added that constraint to my requirements.txt file:

urllib3<2
langchain==0.1.1
langchain-openai==0.0.3

After adding this, I was able to invoke the Lambda function with no import errors. It now fails because I didn't provide an OpenAI API key, but I assume you have already set it in your application.

  Did not find openai_api_key, please add an environment variable `OPENAI_API_KEY` which contains it, or pass `openai_api_key` as a named parameter. (type=value_error)
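If the missing key becomes the next hurdle, one way to supply it locally is SAM CLI's --env-vars option, which takes a JSON file keyed by the function's logical ID (the file name env.json and the placeholder value below are illustrative):

```json
{
  "HelloWorldFunction": {
    "OPENAI_API_KEY": "sk-your-key-here"
  }
}
```

Then run `sam local start-api --env-vars env.json`.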

Can you add urllib3<2 to your requirements file and see if that works?
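To verify which urllib3 major version actually landed in the environment, a small probe like the following can help (a sketch; for the built artifact, run it with the .aws-sam/build/HelloWorldFunction directory on sys.path):

```python
import importlib.metadata
from typing import Optional

def urllib3_major() -> Optional[int]:
    """Return the installed urllib3 major version, or None if it is absent."""
    try:
        version = importlib.metadata.version("urllib3")
    except importlib.metadata.PackageNotFoundError:
        return None
    return int(version.split(".")[0])

major = urllib3_major()
if major is not None and major >= 2:
    # botocore releases of this era import DEFAULT_CIPHERS, which urllib3 v2 removed.
    print("urllib3 v2 detected; pin urllib3<2 for boto3/botocore compatibility")
```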

ghost (Author) commented Jan 31, 2024

Thank you so much, the change worked. I am closing this issue.

@ghost ghost closed this as completed Jan 31, 2024