From 543b1800d1746d92e1bffede31f6f46879f8a5b4 Mon Sep 17 00:00:00 2001 From: "devin-ai-integration[bot]" <158243242+devin-ai-integration[bot]@users.noreply.github.com> Date: Sun, 19 Jan 2025 00:25:38 +0530 Subject: [PATCH] feat: Add Gemini API integration (#650) * feat: Add Gemini API integration - Add GeminiProvider class for tracking Gemini API calls - Support both sync and streaming modes - Track prompts, completions, and token usage - Add test script demonstrating usage Co-Authored-By: Alex Reibman * fix: Pass session correctly to track LLM events in Gemini provider Co-Authored-By: Alex Reibman * feat: Add Gemini integration with example notebook Co-Authored-By: Alex Reibman * fix: Add null checks and improve test coverage for Gemini provider Co-Authored-By: Alex Reibman * style: Add blank lines between test functions Co-Authored-By: Alex Reibman * test: Improve test coverage for Gemini provider Co-Authored-By: Alex Reibman * style: Fix formatting in test_gemini.py Co-Authored-By: Alex Reibman * test: Add comprehensive test coverage for edge cases and error handling Co-Authored-By: Alex Reibman * test: Add graceful API key handling and skip tests when key is missing Co-Authored-By: Alex Reibman * style: Fix formatting issues in test files Co-Authored-By: Alex Reibman * style: Remove trailing whitespace in test_gemini.py Co-Authored-By: Alex Reibman * test: Add coverage for error handling, edge cases, and argument handling in Gemini provider Co-Authored-By: Alex Reibman * test: Add streaming exception handling test coverage Co-Authored-By: Alex Reibman * style: Apply ruff auto-formatting to test_gemini.py Co-Authored-By: Alex Reibman * test: Fix type errors and improve test coverage for Gemini provider Co-Authored-By: Alex Reibman * test: Add comprehensive error handling test coverage for Gemini provider Co-Authored-By: Alex Reibman * style: Apply ruff-format fixes to test_gemini.py Co-Authored-By: Alex Reibman * fix: Configure Gemini API key before model initialization Co-Authored-By: Alex Reibman * fix: Update GeminiProvider to properly handle instance methods Co-Authored-By: Alex Reibman * fix: Use provider instance in closure for proper method binding Co-Authored-By: Alex Reibman * fix: Use class-level storage for original method Co-Authored-By: Alex Reibman * fix: Use module-level storage for original method Co-Authored-By: Alex Reibman * style: Apply ruff-format fixes to Gemini integration Co-Authored-By: Alex Reibman * fix: Move Gemini tests to unit test directory for proper coverage reporting Co-Authored-By: Alex Reibman * fix: Update Gemini provider to properly handle prompt extraction and improve test coverage Co-Authored-By: Alex Reibman * test: Add comprehensive test coverage for Gemini provider session handling and event recording Co-Authored-By: Alex Reibman * style: Apply ruff-format fixes to test files Co-Authored-By: Alex Reibman * fix: Pass LlmTracker client to GeminiProvider constructor Co-Authored-By: Alex Reibman * remove extra files * fix: Improve code efficiency and error handling in Gemini provider - Add _extract_token_counts helper method - Make error handling consistent with OpenAI provider - Remove redundant session checks - Improve error message formatting - Add comprehensive documentation Co-Authored-By: Alex Reibman * test: Add comprehensive test coverage for Gemini provider Co-Authored-By: Alex Reibman * fix: Set None as default values and improve test coverage Co-Authored-By: Alex Reibman * build: Add google-generativeai as test dependency Co-Authored-By: 
Alex Reibman * docs: Update examples and README for Gemini integration Co-Authored-By: Alex Reibman * add gemini logo image * add gemini to examples * add gemini to docs * refactor handle_response method * cleanup gemini tracking code * delete unit test for gemini * rename and clean gemini example notebook * ruff * update docs --------- Co-authored-by: Devin AI <158243242+devin-ai-integration[bot]@users.noreply.github.com> Co-authored-by: Alex Reibman Co-authored-by: reibs Co-authored-by: Pratyush Shukla --- agentops/llms/providers/gemini.py | 194 ++++++++++++++++++ agentops/llms/tracker.py | 14 ++ docs/images/external/deepmind/gemini-logo.png | Bin 0 -> 37389 bytes docs/mint.json | 1 + docs/v1/examples/examples.mdx | 4 + docs/v1/integrations/gemini.mdx | 118 +++++++++++ .../anthropic-example-sync.py | 167 +++++++++++++++ examples/gemini_examples/README.md | 94 +++++++++ examples/gemini_examples/gemini_example.ipynb | 130 ++++++++++++ .../openai_examples/openai_example_sync.py | 107 ++++++++++ pyproject.toml | 1 + 11 files changed, 830 insertions(+) create mode 100644 agentops/llms/providers/gemini.py create mode 100644 docs/images/external/deepmind/gemini-logo.png create mode 100644 docs/v1/integrations/gemini.mdx create mode 100644 examples/anthropic_examples/anthropic-example-sync.py create mode 100644 examples/gemini_examples/README.md create mode 100644 examples/gemini_examples/gemini_example.ipynb create mode 100644 examples/openai_examples/openai_example_sync.py diff --git a/agentops/llms/providers/gemini.py b/agentops/llms/providers/gemini.py new file mode 100644 index 000000000..6ca96c3eb --- /dev/null +++ b/agentops/llms/providers/gemini.py @@ -0,0 +1,194 @@ +from typing import Optional, Any, Dict, Union + +from agentops.llms.providers.base import BaseProvider +from agentops.event import LLMEvent, ErrorEvent +from agentops.session import Session +from agentops.helpers import get_ISO_time, check_call_stack_for_agent_id +from agentops.log_config import logger +from agentops.singleton import singleton + + +@singleton +class GeminiProvider(BaseProvider): + original_generate_content = None + original_generate_content_async = None + + """Provider for Google's Gemini API. + + This provider is automatically detected and initialized when agentops.init() + is called and the google.generativeai package is imported. No manual + initialization is required.""" + + def __init__(self, client=None): + """Initialize the Gemini provider. + + Args: + client: Optional client instance. If not provided, will be set during override. + """ + super().__init__(client) + self._provider_name = "Gemini" + + def handle_response(self, response, kwargs, init_timestamp, session: Optional[Session] = None) -> dict: + """Handle responses from Gemini API for both sync and streaming modes. 
+ + Args: + response: The response from the Gemini API + kwargs: The keyword arguments passed to generate_content + init_timestamp: The timestamp when the request was initiated + session: Optional AgentOps session for recording events + + Returns: + For sync responses: The original response object + For streaming responses: A generator yielding response chunks + """ + llm_event = LLMEvent(init_timestamp=init_timestamp, params=kwargs) + if session is not None: + llm_event.session_id = session.session_id + + accumulated_content = "" + + def handle_stream_chunk(chunk): + nonlocal llm_event, accumulated_content + try: + if llm_event.returns is None: + llm_event.returns = chunk + llm_event.agent_id = check_call_stack_for_agent_id() + llm_event.model = getattr(chunk, "model", None) or "gemini-1.5-flash" + llm_event.prompt = kwargs.get("prompt", kwargs.get("contents", None)) or [] + + # Accumulate text from chunk + if hasattr(chunk, "text") and chunk.text: + accumulated_content += chunk.text + + # Extract token counts if available + if hasattr(chunk, "usage_metadata"): + llm_event.prompt_tokens = getattr(chunk.usage_metadata, "prompt_token_count", None) + llm_event.completion_tokens = getattr(chunk.usage_metadata, "candidates_token_count", None) + + # If this is the last chunk + if hasattr(chunk, "finish_reason") and chunk.finish_reason: + llm_event.completion = accumulated_content + llm_event.end_timestamp = get_ISO_time() + self._safe_record(session, llm_event) + + except Exception as e: + self._safe_record(session, ErrorEvent(trigger_event=llm_event, exception=e)) + logger.warning( + f"Unable to parse chunk for Gemini LLM call. Error: {str(e)}\n" + f"Response: {chunk}\n" + f"Arguments: {kwargs}\n" + ) + + # For streaming responses + if kwargs.get("stream", False): + + def generator(): + for chunk in response: + handle_stream_chunk(chunk) + yield chunk + + return generator() + + # For synchronous responses + try: + llm_event.returns = response + llm_event.agent_id = check_call_stack_for_agent_id() + llm_event.prompt = kwargs.get("prompt", kwargs.get("contents", None)) or [] + llm_event.completion = response.text + llm_event.model = getattr(response, "model", None) or "gemini-1.5-flash" + + # Extract token counts from usage metadata if available + if hasattr(response, "usage_metadata"): + llm_event.prompt_tokens = getattr(response.usage_metadata, "prompt_token_count", None) + llm_event.completion_tokens = getattr(response.usage_metadata, "candidates_token_count", None) + + llm_event.end_timestamp = get_ISO_time() + self._safe_record(session, llm_event) + except Exception as e: + self._safe_record(session, ErrorEvent(trigger_event=llm_event, exception=e)) + logger.warning( + f"Unable to parse response for Gemini LLM call. 
Error: {str(e)}\n" + f"Response: {response}\n" + f"Arguments: {kwargs}\n" + ) + + return response + + def override(self): + """Override Gemini's generate_content method to track LLM events.""" + self._override_gemini_generate_content() + self._override_gemini_generate_content_async() + + def _override_gemini_generate_content(self): + """Override synchronous generate_content method""" + import google.generativeai as genai + + # Store original method if not already stored + if self.original_generate_content is None: + self.original_generate_content = genai.GenerativeModel.generate_content + + provider = self # Store provider instance for closure + + def patched_function(model_self, *args, **kwargs): + init_timestamp = get_ISO_time() + session = kwargs.pop("session", None) + + # Handle positional prompt argument + event_kwargs = kwargs.copy() + if args and len(args) > 0: + prompt = args[0] + if "contents" not in kwargs: + kwargs["contents"] = prompt + event_kwargs["prompt"] = prompt + args = args[1:] + + result = provider.original_generate_content(model_self, *args, **kwargs) + return provider.handle_response(result, event_kwargs, init_timestamp, session=session) + + # Override the method at class level + genai.GenerativeModel.generate_content = patched_function + + def _override_gemini_generate_content_async(self): + """Override asynchronous generate_content method""" + import google.generativeai as genai + + # Store original async method if not already stored + if self.original_generate_content_async is None: + self.original_generate_content_async = genai.GenerativeModel.generate_content_async + + provider = self # Store provider instance for closure + + async def patched_function(model_self, *args, **kwargs): + init_timestamp = get_ISO_time() + session = kwargs.pop("session", None) + + # Handle positional prompt argument + event_kwargs = kwargs.copy() + if args and len(args) > 0: + prompt = args[0] + if "contents" not in kwargs: + kwargs["contents"] = prompt + event_kwargs["prompt"] = prompt + args = args[1:] + + result = await provider.original_generate_content_async(model_self, *args, **kwargs) + return provider.handle_response(result, event_kwargs, init_timestamp, session=session) + + # Override the async method at class level + genai.GenerativeModel.generate_content_async = patched_function + + def undo_override(self): + """Restore original Gemini methods. + + Note: + This method is called automatically by AgentOps during cleanup. 
+ Users should not call this method directly.""" + import google.generativeai as genai + + if self.original_generate_content is not None: + genai.GenerativeModel.generate_content = self.original_generate_content + self.original_generate_content = None + + if self.original_generate_content_async is not None: + genai.GenerativeModel.generate_content_async = self.original_generate_content_async + self.original_generate_content_async = None diff --git a/agentops/llms/tracker.py b/agentops/llms/tracker.py index 3609354f5..648920963 100644 --- a/agentops/llms/tracker.py +++ b/agentops/llms/tracker.py @@ -16,6 +16,7 @@ from .providers.ai21 import AI21Provider from .providers.llama_stack_client import LlamaStackClientProvider from .providers.taskweaver import TaskWeaverProvider +from .providers.gemini import GeminiProvider original_func = {} original_create = None @@ -24,6 +25,9 @@ class LlmTracker: SUPPORTED_APIS = { + "google.generativeai": { + "0.1.0": ("GenerativeModel.generate_content", "GenerativeModel.generate_content_stream"), + }, "litellm": {"1.3.1": ("openai_chat_completions.completion",)}, "openai": { "1.0.0": ( @@ -210,6 +214,15 @@ def override_api(self): else: logger.warning(f"Only TaskWeaver>=0.0.1 supported. v{module_version} found.") + if api == "google.generativeai": + module_version = version(api) + + if Version(module_version) >= parse("0.1.0"): + provider = GeminiProvider(self.client) + provider.override() + else: + logger.warning(f"Only google.generativeai>=0.1.0 supported. v{module_version} found.") + def stop_instrumenting(self): OpenAiProvider(self.client).undo_override() GroqProvider(self.client).undo_override() @@ -221,3 +234,4 @@ def stop_instrumenting(self): AI21Provider(self.client).undo_override() LlamaStackClientProvider(self.client).undo_override() TaskWeaverProvider(self.client).undo_override() + GeminiProvider(self.client).undo_override() diff --git a/docs/images/external/deepmind/gemini-logo.png b/docs/images/external/deepmind/gemini-logo.png new file mode 100644 index 0000000000000000000000000000000000000000..61275eb39903ce1534da3993662b692294d41268 GIT binary patch literal 37389 zcmYJa2|SeF_XmDw?6PY?g{&2&Y$42)Erh7-j6#I6rR+0WDYAu6lqr%5jR+A&Df_++ z6(dWwY-1T_d;ZV#{r&%cua}qc%zf@X_iXo^_c=G=oVA%CzZ5?JK+yb*u^j+N_#+bV za>IYt!v|>aAHJY7PN4t@ePjP3GE{{20w4>_jZa+&fA+V|EzyejUAM@HPSdxAJQ%}40th7L)7NSukpkal> z=Zo2?rh3|?R_-6%+&6`9eGvN_=cE6mBz0@lZP3l>X|?c3s^p35#?N$4EXb>@K^#(H zMD^wB5vQ<;Z_TK9m`3)`=f)HD5i?OU2Hc=wyXP4del*iqk#mcnFQMJ-_Bn#u7Lm*a zf;Jlf@Gy%w=9jCtd>nu-VF>{K9<(8bLWkrekl^p(V*m^X4~xK0ce6j*9<_RnNB#dp zb&hd@VJi;+O7rz41W~ufJOOwn^Y0}OUQn9zZ+hM9+z!(=py4`#1K{>ghvO1taBkc% zQ%`pN_T@E#pK-D|3`*~?hx4S;pvk$!9$|nR$GGfB=LU_y1a(b@IU|F zIvH;d-I0<7pPOfEc>vIfl1{vj&k@KOV(AL;fFPj;4$vU)z7fBg~*9>;*i3{oKpEAGXi=|7F{8uU7n7&b$BLc;2UT zgd!khX2fQ<2%B98q!Y6Xi?>i~jFJ<0gN5S=(8crbOSUZDuHG=6oa;EtX{-pZ$iZGQ z{)#mwwE$~ebgrQ{2xkF5;r%zfxpVy8G)-CVM^4u$KTQ4RKc443nSEf41~>mr$?J+Z zCaJw&(j(7=7wygihOe_}me=3@@;fWxO$Im^ir9P|DtFojo|KC{DX;pM?sg7YIT|C` zaBD+DQUr{<#<4lmU`GrYBnZak{|0Fd@HqBHM5P6CX2QQ#od(8_jsb-@#IPWHdJacE zob)nE`!@WZ5Mb*8P8=lE6vgg1gXAb>vL(TDP!WXi$=-npt(0M}3#gl>#4k(OA|8+h zqy>3Co_H=VDuUY&*jmxB`XXo8=N=L!RZmK2wy~}^)JOdoXsBjy_(-9RKIE%|lamE5 zLNL23pB0j;p$kf@$-cXg!#Q0p33mW)k*!@W)*6L`SsHdnE+gXe(|(68FC4&yBG}(_ z-MO^(%mG5>c4h7YArxgfURYO@$+gCcy5&>^KKA}JMtF~1)u8yLc7jTDBd^BOn4On%4!htL`Xkg2Cd`RoyQ43e6E{^^3}rRpuIT5eC4svLFf1ejyO|vxRP4}3_fO+d zA)IIc=UG!D_^VIj4Y2Fly~5TfC5i(paSydhj6*B^N@+3oI2{3w9%Wk_^2(hrRSAsu z*r9p%KZq5rTyD_&Zf8pT8(5WxWVFDz{wZF7+wr)Acy{%;5X;aCxvwR)+=_5im0+R~6;h5mx;pvM0w23J{@x0$__=X>} 
zU8v2uuC+U|uPPDvB0y7g8E1IAzeI7IkK8YYQj~~e5#05gg7Mt_UwSQ*Y(&~Q~+bbYdB>Hsw!2wRixK&TLbdm zBxZ3OyDG(A=L_37xF)t;0u9@XV(mdK@q~h$pCS~23KK%_q*s+e`NnAWzz_cho+h!x zk0^z7arLATBX5)L<0@#>Ukqo?H!;^u(_bHNhR=wzDV)3wTnEw_&EXr6R?A@hPR@rE zpk!qz*N`WKOcdI_)m=bzp**!I_27{6Y?Cs?r5e?WbmYEXuUFm9c*gZob49tp-F!}rx8Xi}I>8xO1W@j?b$K=P zp?>lj_~^nIvfok6N<5NaBKx$@{9i3W*vc410}pUkNnTXnx)Rua1Fdtig_-KOG5+gJ>jq!pDP~3^7#)E zrm^|@IKuigJvp64V-#cYTM4`PH&+Uc~A*HA}>F zL^$lU^x3S8V{G+#s$CIh#n+_Q<*sP-5kpWv84n!jArmBNAjckO%pP~C_6Bpa-T2Wi zB5BJ^UYU{Bj>*qqd70p*?gm=XXwK|`s{b2E9o{m@{i|^uyPY1dcL}HJESU1)G*CyJ zYl5bA1#4lq{VxWUeNXYI%gajKeqNoZW@wZyz~d{%*>Im7 z2?$QSx#~XMnhS}B86au{6&a$_O4z)c9I}$jq?fp*g&((3eH7?2|VQz;AObrV@n~p!Vd4N>! zBT7q-V(a!xBGP~3l#-i{!D2^Y|H?c3<^tmkc38n3G@wZkvsIZ(##-M``uvb&jx)ZA z!tp-k5y6liQ`;%T7A72DWI+;Buu-#M!H+-dE&6?M+eS_~I(OZ&4rEeA6tVWT%sn_VPYoQetHGkzF?&B+8+FeC(qHvg!#&tF)BT?1Jt?!Dd^HQvgjKSr z20u6&P}twk={+z}``AS3QeYoj|2}!{BDP87pf^1$zKm5Tg)ITI6`Vt;f)fB{PnYoe z9QXLMPdC?=GHgbpwlbeEj?6*+q}Xe<_%}uC%F3*?O`;EPxclwapwa;_o3ILyM&M6D&nOk6cE=n~Qus>LTOtemdfK z;HYn0V_7EF>YGwirrB-db60fv&iw&r8vkuh8n#^> z&s^fhpK^S}@`|7wX0f}m4I!KNWGgw5xv;2O^OxXw;6;<@G>881-q_X8Ck$_$G>`xd z&g>mYuB|OgGb96D%82elxQaH6*-GInD&d1B&?D(grbvT`BYTk#CJ!(RlNZSr!rY)# zn=L3K{Z`D_<99%nO3V@;f2jh>T3Rt#ka=+ilVUK?27S7ki83pfWPbfjC9bni%o6E+ z;KiPQuR5+jXt}w)_Mv!>tT;sSj-~kQUfd)VK|>K2*9gh`PY5OzhDsSkcCx%IjE$SE zjAV4P>ut4;0dyd2p+TnqOu6=m3(I*4vNOb4v)bRVjw~N>1}-!5Q}d`ct`rqm{5sj4 zncRpd#Fmcew>Y4q!ag}9Ju;Z)x3E*RGWU>r zR9jt8vqAXwh`!)C>f>a^ut&<`CtKZS5F!-@9JTvnVgYUgEe3D>?gfVY{5w{L9%QcC zO17iO^tQf(yzL=p9$d>1QfHs=Rjm58Me$yW#9jm}B6`217!skwHI9JyytRExnAOoo zoAG>_jcq!-U48*nn4t-bgpkWb0bu>lk}g?j{mtC6(cE?skG2w*wc&jeGm~9`A<}JIhi6)BfvU4oiunG#%#~P zTJhpqGLc8e&QBu-?gI4@NbfV5Hg}*~=JEaZ!MfZjgJ9V?QS?E8i~? z&w37T&H3>M%{d9{R|K{&Nt~sd$nagjc^M;3ia_WmBp0rEAHw|`}!;l z*7dOSO@r_`+G3UrtQMOcuXPrSvDJ(BxIPXi--OyjoxEgl4FBxMqT>-9#k=ZvvVNPp z<4IC*sFBi}5SIqam$LQS_YBl6E)%olz?xl-)!2>)I5ReFiZz{qe3o22de&1_%#uMM zd)NWYZpBW~2%=nl_v4g>j0Z}PJ*Uz3q|;CPWltFb5C%&ctq9{dM4fHRB<#p`a&m?q z*nG|?`9wUCYPuLxJ_sI6{+_69V=4PR;NH@!9|~D?Nhcn09$550c9mQ6Sy!K_Bw(kJ z7o#V0Zq?r!(DbTiDeC{W{EmvypYYWQ3Po640=l~McIX)Q%VCK1l{UsywS5{{zKW>H zJLQLWfI@23)zGNMSCSt?fKK;sSoyP{t!1cZzYci8hOp9t^6_;eiI44^D2v8K)u{dm z|3S4Xc^lj_S<5h}h&0s*+NpL&W1g-Qoh zb?34KaOsJVPifn00tMJY03DfI?0lUhMbADbiY4@7;4AF&7eJJAJiXnwZV~5$$!|+;JT|2b0@}j`lQ9 zYlPx~?(4>05f~AdfDc>rYTw3BwQlZp*N+nOvgl22$2<@X)3DG`7l z4M(_fXeH(x47BBo;C4~FeIb{hqCbwrItya;2wG{(8&p(GlGgF|SLkJ@VX)%LpfSR= z3ma`uFl3jbuIicC$%14Y9Nn3@!k{<5Xx+Qy^3QnX7=KFgJWJyTlzTvykUD#@K4ZuD z%LS%o5jM|}Vdn1XR_M>Ha6sFzB7{vHq#VcEEp+rHgvjCRkU&EhoFW2eMW$$&Kf}9& zTm+HAAlkDBg)3RB+>XwWVkfk@kh9)~X{WcflHUoX{L~X}C<3z18=y22(UMWW6Ug~x zu~zFzcH!zTp5)1cpKJ`}&GYrbQAP;;jZ^U->XlIe?A&~od7QfHiw`xdId`x-_5tx` z^snskh>5IGV0j-9mg>q-Zi2F*-0PgG;>e$#`+I)-zTDk{nl7h=!I6;Trtiky6z-M> zQJxB1NHiRiI4Ut$>+Ko+5nys|;=f%#skPvTV|)#>HPoF8NB(7Y0uiFWJ3*B(wBa|s ze32uuF!TrhY?di504RL0pQ$aIxwd^?O8@Wv9&S!d;8o!kx|k?)ohh%tb^AH&>4&ef z)3UE$!nfpXPd2Gl6`b<;4jCAsbNd}tuwz+IenCSP!}Vj=sVbil*@W}dDk-|Jxrb>D ztHc6~ZTi9J>%!n{Br2~BY+B5^bIPEeG5GX&Zo6?`1Rg#Stf8Fl*gtiRC*t2|w7NWH zNlJPY?P<9B1~@-ms}RKbIRl^1j0j_X+*q&;<<{}d>WHy-I^!|AyLE3~;KsReNrTe# zl_}zhoUqFbiqA(HX#zF!2Q}nJ7U!U(h-0o&Na-h=SsHve^(D2Fkdq$l?C@PKMbVGW z911%A=jLrsh5Z3_RWL@Z9}wT-6M+r9+Xl}FJNBE8{ZA~rM^NrG9uwEM;QUysM9Ur| z{!*rR;|zAy!XIx^eFPCZ-Xel@r2AN2^O*u=JH~Pt`C{nhk;ge#Muc(Hx+bF`jaCJd zbpi01TOW87rcB2sI*`>Sq0|*&(F!rB_!;it9&{x*n!#$-{#0wTa|fao&StjTv5Mh& zA6c!#C`*J!$Y3WEvF`WG*IOh!jo4ehC}ZQYiz6DdtUKkXMTVxPNneGLxFc|4JACM4 z6ER~-S($!zhS)NyM#wl*jd6pIK2}z`_uD_@!_;(lT=qlo{EUz2_pyzK7)*Eb4Y3yt zWwM!)fZzH&#$LT;k`c=e&a+r_6pIdh-ok+ZJ{WxOT1g5V+s;*`%I7`K!9%O{4~iE_ 
zsHbN@%e#%_Y>tX0w~;NgD6sXPzRZD1R&5*g8&;xdH*u{#?)TS0gdXXX6fZ@3sP5x~ zQ+c5hpS)JjQ|{>Hz2E45vTag?)vb1E#Zd}Pt=A(5G3B(Gb7*s>O%(XJ zv()TrUe?ht|3R5B#7954R54dsKSW=^_wmLb7FT5=g|0?VeBqQrcqrpw=vmrnq1!Qp zm^wsM$R6K#hjQ+8DZs|qLr?W_-?x>11wI!K##$oIeI5!cA(}QbUBoNGl~hD~Le|<1 zk!61{kBuxv*EDXtio189v0DHpa>omP5TKRJioYerm{ms=ZKBRoyIZ1hXE+-CXy?yE zF@9%)`mt>=l$449PIJIsu#eMJ@YM5 zKl!{`4R3~$soI>5Il49(TC*T?f+y{LY`gpm14Oi&&oqH0?7@k<#&rOchUv3mT72vb zYwuWue@J%?O+HV(p=Ey^ISABNB+@87QymbnYX#s+@-(;oRqG&MOx1Qq2{xhaHY&vB5fJ?@4W0wX zNv;-an(=6YjWo#$JEc%`l72N)y*95%=JTJDh3RhuLGGl%5S@o`l+Z*Gn43|$tqcO- z3TDGl5F6fOm)``t2bWuErVp`^7^SUML)+@9yFm(QT}F-h4bxq#nS{qx z3wkkguYHPpaPD(9*uerSJQb(^V^-_vZ(KVN(y7=5cm$CpVHhUh+oc zxohXbHMd=`!_2>n>K0jf6?5b}(GKZhzl_y&mmkCQvq)_z)XQ_mi#6ES7mprn+t@1H z(2u0RahIi9oDaj-(?~7{3MUIzNxg82(BIlhKDt=PB@Y5%+`8f317)m#Mq%#;%e@hA zXN?}>Xqeqae9tZ0b`>~kL8+i;zT%n<+MEN|fW)uNyep-JK6K-wgujb&LnLOK z)yMHt92xPK7Ew@W^gwqitm9;I(j`vZ4q*Wh^s@IkGw&=m@8ULo7rn|ESgO9oEFyT9 zOEjUdU#NVf?;*(K8ZN94V|5hx47gMSv0RUm>sXqU@N-hiTv#a_Hzo->@wbTAtBMZj z3S}PlNVpShB$N2y7_#9WTqOQf>h+5yX2WujQ3+$Eqspam9bWiDh=8O*5uud#P%$HL zVATdQQ&KY>d$rh=N|;Wv4BA4Ivz9wazG@)ZZ7;%oVzMCN`g7Gb@n)lN_N3mp@zZjn zEaS?QM`lwrHPJ(?2!5P9pRDZ(O_8zBFYc4-EAiI|mXMbYT6uWGmR#QD1Va{~^PZ9! zrzH7HBPD0khywPKlSCbz|EtH%oj`bGzouJ(|LX>MB8F{`893U`AmOs!B=JEujE@s1->*Ean*!%X~*|}<+7~U=w(5snX8Rm#XBse z@ZOD$c1|>5S83J;(VC~>EE~M?7l(#lOn3&kn%^^++tni&E8(<8NJ8j-ahb=}mJLnc zp?sRAWlesdAYZ+dhS2OSxGf0h&UDd&1T44wHi!o~ zWk0+Vkh|E>7aB!YETVk<;Un)2h4X($2VH_)^HENALGY6PbE1XU=WFuv4gFgrT@Y0+ zIVzLBQGcd@Gu1(AKjk8|5*K1!wUf zBam2Wy@o$HW7T@LKGIvwVd|F*`UJhV$@&#FaP)*a=hjGk%Ya%b4TPshPnGEG6U;gw zyvfbz_+li1@W?+U1crT!Gl;txt#B$`*~QVK0GylU_XCwlJT$kUnL3`FwZo%?_H%iz z2y)WhNs&oiyi%pU>NJV(Uej2aS~f_|*xRtILMG>I)onI11jF$pG z)fbvKrZ!;whPB=28TdcMza~(qK^G28160-Ah|DtAZU1YV*?mV$A0W&4aGAU*`md8X zE$1DySs%JAZ`eJ%)fHRv%**G`>fk4b^zq-}zSa*_SBwUpCzdcDpMH@7HYa8?8&YFS zas#Ggv_kSoUya@0H@=GSl6dnW(PwRXbv3}6XR>L-{)}n4M9CaS1PqS*T@H*A%)9`Gg?`9Mhw&On0|j48FF8|GseLyD3^vJ{Cr^BEYLA zqT-rA-^vrQ)%9{mas3^P6>Vab{?T$~i8N8HM7&Gx-(-*xUTC@1=2sQQc9Iy7z>|R{zgw`qV`zK3W)!KZ<+0i|9T_ z`U6xkd!uu;E~8nFpO3_p1VqNjNcxaGQ>lYZJgDSD=vRXvsdk5dv3tdB#S03}bdP~r zN);xGcI&G!FHQrlhYw4z@$%_@8N;pi=Ept`C2RJNeu(w9;ok&sI!ZzX{J5|IR{TT& zl=jofnpNSAE1^pq#?`DRcOM@St39j5F(Wo>fpbnaT7PD{jX&|{EK)YPUy9slyGHXN zS7Qa=#d(OMc|icD>LUHagXTX|HUsv|Nz|{WO-B;7db4goVNKZAh?JeQITc!dRrxfw z;6I4+EF=64jLvo4BMFv<3O8nSclTNJMeY2P(tX5@MQ`pO3VzPf6PXlwRE0$tl+QvX zx$a=i+&enO`t3W+Z|V3l2i^(h^vO%5hv3MzYyXiI&t9`Nj9fZv{T+D?IXj3g;l>rP zln+I4`&mG{N%W%FPVBmz2g@ZT6|7FmDLG^^Zl8&jlH+rw+dv)QIp3Vq6?4M`fG))_N42>6=bl78bDYH z`%9dH(XIYeR>gZoKc8;wEhKiuJh(YSET09RBCURkAIt5W_0ZRJAddPhy}vIXfr+rX!pMAg zV0n%H7Y?%fYC^V`@80ilub9^+|hg~`c`ew18x2r$QPOy zR4oj|KEBl1D{$@r=-MX%#?PicW$AC5nW8t7E0xliNlLN)AyxKJ>VY6T71obdqBb36 z!8tIa%EYo-`-tw(+ykR7f~VYQcL=c`%IcP|qLg@{CwX32)g7x!e1CQX zKiG1{(BbV(_ZsGOaDCw~Qul?mj>wL`G-^{ZHH;U1p5JO#L=?4GiZ8Al3G+!@XN;iC zOcXsA(i^!ROeHLh5Z3U6j6#AC8h;q%MUsda;tfY|`2jZr!)yiynOWV6gtc#>wcKb* z{cbkcZBl=S`YR2tKcYDH9x@&fRW55fpYPT{PVaq|vIki2zktA>UU8}odttQqR zQZ`eJ)!mMII#*-csppL|4xYq+rxWH`5!hg8E6jMb)$rtYdq;S?pj#Tp6&rL3ootzs z?(|pvv%AF#CBpKtgA}LwNq58if?F??qmD3(x+9w60wgZ(+z#&Yw*kpHjnZ&ing4Wn zxC_-%M?(0dXJwnw0U3cVB&?KgzpTa{T9OyUb+nN~O-No)R1wy2sbZ@nff+RdmC!fN zAXPD>&Mh@QFc)H>#+Q;H28D}#w6E4W9=4;saon5bZ`7nhm!}A}zS*@xkurMknCy@| za?;+9*MxH#9aIAc{e`y^sp7U_9jL>jZI=_6;T3S@G|LFsCt+WY5sk0hE`GkXr`xN! 
zRRdq7d~I*_M|lp9>u-!hRB8mJf8DDQC zB8V(RoCf3HBG?!#b{mg!JWP*BuUBRn4Uk9Ge$t(fLiw-FF9OG<*qFbR-jAW36vM`g zKQVF`vcii+eT-``Rm9$Ue4m3hW0_nRSNV1P z6`wYLLTS&!ej=dIgZ$D2%Gs4;ER zPcn;s(W(bNo;ZyR^Q36ZZD?iwXZ`kdE@9>z@Qg^&Z!DWC8DDjF=Bg1*`9^utGC49Y zYIxu7R1n-82b915b9b}7>lY&um7~U#4mY#>FZb^rslhJ%m%;F{q7}luVHcFag`Z2L zY{rqi;iwye`Jo)@6|Bt7xINcba+BWpD@1nB#bo7Ka2aS9aUN5*mQeG0MbsFh&h~jd zgx8US0qUVY>=QUMotQNjm;wBXUu%xd4>mrcy7V=qw&6+J%Pe=7m)Ey*Tc3gCGqQy- z&n5Etxr`6sR&R`59cygi^TVo#om^V2cr8u*Z{J8~_ayc3-l?_upxasg;|4BDPediT zp1v}1DrulIYNMa69!5-2Up_f`*)8grbRU-WHn}$LFrxG*oRy8us$%`W+YIDx!7+Ou ze^>tNpXB-6;9cl&;^|TB6ru$#&Fw1B*mB56&fY@^C5dUt5&3Z+=EqA0HDstygf`|! zTrT;noZcJVW%2h>(|nB5Sa{x4OWIEi7Qgl1hQaAGpcrn2?Ag&it(Mw^GqgX?%DP6H zPhfc|;vA@5maL=8ZpV-o11-NRFXIMqbDe$PBAo3$K)Y`@3tWF=%H zj`vc1^`EO{0ROuW!}pb85%`mEOJUl)odh4a7!&yIRu~k{{af~&;A!w8J1}~!i(1=? zA*@GWp!HdocdtiT(PyudWs;vo-3&#@dEL~uRb_crLnQ(?+Balu{WyMA*LxT{i3d&H z%$h0AC|B^?(#Y-18iDLh;&w~+g?#%VGVZ@sm_>c@y+BX#_Po<@XLp1kd|YR43HvZD zW6suE6>p5CH_&Lo|253I9S zE@4geNi&(vY6nhAzi@#N#;p%aAa(TXPVR0vfVH<%uwCc*N zxP~n^cS%%U&3Vh+^Avg%MT3hTexJ^FyB^|vr5w{Pvf?_P`>jm*lc$w$fjL&|DEVF& z94U?&bkOxK3f5Ev?nI_crh6tI2C70$Mm3sqH%|{tQrnD5me1~#TS+1R{2jn*BgZbz zkq_1L{@n)`$N2Q9rG*Ln|!dU-I{4d5r^n>^xO~t?P0()zsHBC^6 z9)vDNZrjUZ!f0T!vCCHU-6~r5|kK56^xr&?6A* zGlx)XjjJh}%Kg^A@CQ?q0^?`HyANCPR!1)>ltEeMxB#5OH*9T;?sd7J=O^x2ZSC_+ zfR3g5lWEo_+4@ zhb$Q*(wobZ7l<|^GO4ImuL`KV6`ajjQSl3W)jR?f8O}c&em#;(xsXgUE2V18={t-Ar<@9JNz_)`{6JxV1#S{cwB< z$>H(fk`_vLBvm83gm;5o&QeG-E2Avs=JthftfulC-lc=b;0E}ki}X3VFhOkOC}j8^EBAAde2ap) zwSmRz^LuVbW?T?QIx6#(-VAYd%ffHD5H!SCs>Rt*09S)w;;2t*Q`|*XA=%||e_gUC zr&sO%$YR^Hyp6N!(FgF44&6Iiy!>eT)We-hpYdiS*_39>#4Bqg?9!day}Ucv*Ai~> z&MnP&hLkN&Ze|UtbrQ35^?Ur`-|r@O{SV#G>V{rTI}jRR@;A72|v;a z4;=2#SPi&a)m^+?=m}An#lc<{V57S0gKU?~p*`~-A`()g&o0BoqSmv23;2iwb$!{4 zu-4-DFHSkKF9wv#v)e)MDGNPzSCk7a!<_h|M)lVRV+x8?c{N}gQI_0N$)TjDyNXX^ z6hnH%eLb<-pY^TrlNNCK=N?kQ)(~yQ_refkk((pH(|(w1SyG-#4AXx{r6z*sgp|&t zm0Codz?tU7z_<4I$5f6>6K?G8ymymB0=PWBL3Jr&hE2kiKyGx1iw!F^&1LZ&Mm!9; z9)B12HEQF{=uz*dKMAeMtP1y;aD={kdX3?X;+E6XjWWyJ`apY=KfJ1z&Cn={u~GUd*55}o zqBozv%T)pA3vW6%&kU=7x~Wp9k36gN4*4B zIa@pxYrQPgID4`a1^46ZkT0f3$M2$;jG<6!36oF)R_tl^^|v*}rpiOb^0SPBS6V(D zw%ROMv{STbD&{p>BzE)5~pm{ber?z zd{a*xCQuQ~&s3IP0Ml@fKqs+c`Q8z~8z6p&K<|!y(D^|^8V5HdEG3zJR2C50rr*#E z$ec+z82%v+5rp^``53NkGRgF*-@Tz(N*a0+x3H@v=ENtf%aWX#?PNW7m$l_efGn=aD>nx|ON^-(UHK zK_v>Nx|?=yjMKNfjJU7Wr!&JfcF;#-PY8WrtB^)bX+#%6RpaV;hTLzV)aw0>7%{&^|hP zsMVjly{xXU4THkZ*in^Ob`$uGnBNPqEjowjim_JHAoYjXv)4L^%*0JE0%7hE^|QS4 z=5$5!ho*w_Xy1s=h!?(G;^wTj#jqdH zrWP`DGWc6)4Jk|mqDyerQnF_BPiWgdV>no3s$}%FM)}fe`gE&pnzxP!0^PCcuu~^5 z%Q;KwrHXnO>`~@*h65)(U|Pgzd8C+ZQUSHEUM!~V@yxAAL`EdW&rS*}ALG=PkN|t* z&n}fKvHEX7q?^FCY7!&EgnkUNCSn&e&R>&8lRKcqp0jkCUNw1o;dZyZX9lO(w9+M; zj)Q9fnH|>^{x)U+_p~?BLmlDMv4?oj+DFp_6D=~tY>^&T6TVM2mM<117Cv&_<9}JX zEYv6b^WBXoB?q`Av}*IW=6YqqYk^S`t;ZRv<31> zCH$k7rC^Uamf2yflkMwlb@OGq9QCRDrGjy)C4pujT5^U1ZI5VKGn=T7ESvHCAzv!B zhBFKcTBn6;L!nQEuX7&*+8}oeAGy48`16r`ypyVDLmKjEu$Uq?E#NJ_DSKUGq}}P> z$7~ujUB{0tEmwgy{pUBJil6xSOml2xNTL*LW!2B2)|RSvFYJbx$JIeqMO^b~S2Gz> zH(bIfS)Ds}@4{yKWVzuU|Jsm~a6x{;Hto!m*&|1%nFtqaMj4Czb&d1MX%0OZxbbG? 
zL;ml1Xz_ni0}^0{({j@=_}JA$)KyN*$&sjM#yObRn%h~mKynUqZKv^~C7mBqUCK#@ zTt*<1-@DDclqP9aT}PVV3EWOZ3#H;?)DrBchZOd|T`*Y=9JNfw*f3J<@9q(TTacq5 zHQPj@LV~MZ{NGWtJP~U5I%mAndNx@_1rf9wP(HNXax4XUkc@hUzP$AehX`s4x=bcA z#=nFbZ{A(L=0cZJ8|$8lpT7VVzak!FVRCp#XO9MWF6gQ!jTqD^KiCB(BJs-+rLFz=cEUCCfe88-8@{P zR%PIT#U!KguThG$Z!|3}Z)^zuP5mY@YMC*Q_pR%I4Eu;NZqQhL1ts&zJ7ld7<;P50 z%v;Xcq{brtcngk%nz6jIXIl9$p2`14(=R?gJ8Ao#H`NUy5K{kI&D_8SN(?cpzFwT5 z9^m(1t+01T+pn8R`zKfDfH#EHa9x%EBS$Vb2jf|?LjT6hS4Tw%+Qv{YL*nQ1HifnO z%|MIueiQ7z=6=ah^4^=jK8Bq1=NzbM-b0#JW({iGixEZ2F~8=T1?LOAyIdhH+RTW!bKPZ~XlTanOIN$A2!E`gUJcsp2p+^|8dE1*D>mFFJZ+e%P z(vWYNhlO~oSHtWlV#XfX=68*fnVdoPCogX#dVhF#MzBZEJ%5W`))9|Y>L+?h7~l7Q zGWGV<`vzJAn2qb2B7AwYjQ=xFe7%M-#usDd$TiAs6->oktg?;LZk!5Q$yYU$8NHBRk6@fKwQ` zYnZT$Os%9_ZzYmgS@5kne%x7kS(QzU3)i`K+axdltjVY!SOvI6v-nybjY2$&eBrvW z;K>RdQBle*4bQ8xXW2~?OrMw?=JcET<~+~xvDzvqLyZl28JQPA;%X7LKHFJ_YVF}q zn&Ea<6#)@%%35#?)3(27y7f3*x!g_Vz0A~FjGc~c5-$xm{CV);z4v90I3Bf|o&SN# z{VD-pgq0m%fzfKXrq5lC_h>75WYvW%(&2j&12;g&N95ygSU)$GnA*)b zTjvcQWo#Zx^4EeryRkMrYBgX~qI#XX#XKp8(?mLMc3UoQs&UpZ@;fGtHtkcX2boZa z6shNg@$U?Sub~uX z-x_KCIPyOm{r`G`v>zN70UYRHJD9>}9n*#~E~FjksRd@#jD?pzcsh*5WMND{DyZ|S zPh(5ylG1>X;XmVUZnZ7Ou7gD>C8^qEs1*~1ze?$8dMF|yBRM3$$S_^utBnkU38vlS z(}#>FC65$z-%AdbMqkjfvpC%O=e=WiI#Edl=~c3n??ns~L)gd(h#2b)VMuFDYeB>+N zh#*>>-LD;ib;%=6|n%J6_tz-N1%+KWRRVWqq`#*^ZFS4ID2G{=dBdn)Fqr*rI;? zSmoxCJTGGLsnW*_c&9=h^p!z!HM!>~v}4(mx1(ip)qgR8ticJlPgGw^U>R@xjn;Qo zjOyMA_W?eEUVX+l%XfP#3AAgR6Fc13Es``y?~pT+bupZeX|5e4-b!rg%6er+%4$|9~(;Uhi)j@4g{Jz2y_V1BBOHjbvo4wZjwT{1U zjlg%`KN0nPE-BJmLmvfaiL+?DQ1NF+$1I!u`|1ppY1GdX`-h!5{?{MF7+-16l_NU^ zOG!ez3(AO6el;wUfdPMwF^ku)kskXCK2FD|xI)IvU!6syKWR`cU#|wYe((Myt?jkd z-hj%Ttsm@%+TY#DeGV%2;F^~*Sixm7a2}L(A0(6PE+@8WXjmi^^r0d;@`tVLGN6c( z;^kzsE3b(T2A+~YO9e{%8^|4a8$~cj-LBxS96VX-gx}15_SyyYsA7UNQ@lWv;IU=k zsR=bs(84?`hZynm{5s_GX6r0+Pc|6+Qd5{n3G~0)MJ!LK@w-$Z@an5f(G$JHGx;q~7Y6E*b zyE3Q|;n*Eh5ppib@bvL_x3BT<3HYP@9Otk#WAHl*M)7FTGPRY&nsqfxOf}g~2scKs z0{U-B5SUY~l9TKhGnw~oy2M(J%Ph+KQD5Rf%Vw$oU1^gZg4EZX{=TPxsS;PA9 z$E;naXF9bdIQRVN|0BV=)4LKnGuM~=W*6%~*(7t}AfUiW0Oy;xos8vvo)Zz0H@0g;)GW>J*y99^}7#TkK$AsCtMaQ|a-K3}XG!cPVccooBsS z3;zE}U2Je}g)ai(r$)}KVcg}DspgxW+PzHjgGZT0rK$L>e=RX{x0X<0SJjyeC*{TG z+`aE01#j%ear4dpe0WJlcKCMqeVny4>7)$>_%kEGK#D*b|FR$*$rbEW;@T(@AdpYuQ~U2?)y5|_4&N- z`Q;O5Gn=i3KRq}Nob^7i)cqIy-PynkQa*}v&(b4|&$AWqOJ4aN+Z8wd5?asjgj4BY z?hnjIGj41P;~vdgO>c${=J%PkE~~}}S(Ww&nruN5$at>ydsy>dxQK&0(*b*L02}Ia z4Q|5=f#PCcHa`q=nHx!U4bvBgn=BYe3{UL&%&HdRmnzoM#pO3MtnU<)go6gIi!i;% zgrc?9LX3uRQ>4$sSy$W)wJ+h+QM2T*uK{6#J=2&_c%W?71G^!&_BY_*VGgGX>40cx zpC|&9#SqdI2RfW~yj;cb%#t0Qgo zV=#=f_&q^~;IF_)%_k#GudJSuz}Bv#SE=~ zzWEOBGJl;&PBN|%nrBkZ|AKGaMzQ#)GJFc$wZ--CB=WO zH7%XRyJCMUW2&RW1VVftFR@GSb+M;GyM)7sUj^tM#rdE9ug{yUDHMBFI6T}a>>G7} z-Flu9OsNsSJ&|k0@VI3GlhZ^DA!O@*5TCHt&zYA_<74(5h#yjni7W7|U^R%|{npN?mX8!4X%2BP(M1a#1IkBnd!)O+S63H&v{x8mH0r- zxMZtk8uy=EhK2nO#52lCANS?OaSji!SEY+{JvGqLd>B`d2Glp_A+C~aUiKk)@6uL4}Mu2pYB9L@QSC~ znr_sX&ul;Cw3vHJfDwOS@QuCm37(4$+l`eC^)a1PngUY-6=iZ;>nvJdtNDb*#`+f^ z5K>r$m^cxF8!g`W^&V%9&AGNCh>pIOy7=!h%Q&1R_jgFiH3U{)X<=OWfKq_1FgPtHZibq!^33#VfYSY)J2nmsA)YcnEE}Q8%knU>+ zQoAKC$P0#^zkYJ-_$zeI{m|KlcSNDYDCj&quPj6K!+m=2At}(>;Hpd=oo3|TkjV1< z@rjPSKBq4oBYg4Mci^WSOy+(zVly%al>W}#RLf$E5a7&aMqJ$GsUbF`tIW&Yp=c;a z)I;I7`v&h&7oAv2H{Sfaw#cYtZ$|6qVoh*#4E}30`hLlF8_lJhR+(XgPBt0?1HGmC zgT7|z55L#mSeVr8nSDdbeWN$r-$H>L-Sr;wKAFlXa#l+F%1T5KZF6lEJ2>2CVPn|d z&3R+9uOGt%jY=`5rjXE=MS=@9rA>S73-wzFtiZhcLi^FhPNiuGSF5+`CL9=PEY%)e z|71Vyc4FcueSK2_5(d_{2A%UNA6!8faXN3X>9zHSNJO3J$SkAFB_sKYG3$QRLE~AH 
zMbwhRI>8`e(V8ovkudyBTQR*7oh;jjGm&&AZW02MY>9Wq6)**Vs`-={%3^HHRDApdMom?mtGK+06|N=$iPMJh=3FM-S0@3>tBLWE%vONOLK zuSJ*O4>KV%>vL<#60_}I*|UessPn(Op3v#a<|PE1iM#Xc zbtsQOwo{S^PGz)FIGxjxQQFpp#ism?dv1R&#Yl~>BdB}M?dj>ddQ%Y`_@LK;8wTNp z4yNb^ zb4u)NS_x(C+8UR4*Gv2;(+q9;&!&^xv9WG;#KcLffEHy0Zm-z{L%6ypL8-CG;V1v${s@dCHRhAVNGhnzV?JOJnDTwMC} zv;l2xmnrsl)c2IMsR!s!@4WV8ex!T{6oK!VJpd(oIt^d$Ex(JH_z6fL&!tE%mY?1Uwp?^}GL=aZ5lW>#%ILZ@4_ z5@xyI5#M*L(pn4J5({p;OX%d#$bA@_!q<4P)n3O1sS?>jAua@Xv%tnq^cVijgN@Ya zbTORfg}@N4G;rsf*>-C8_B4@izt~_PujgH+_$OSk&pShoA9uflBUa3!h|R684QI5}elD6an?ggQ}3tk2e2cu^wHT&m_K~QOrJCQNK+-)Bi#@CMyTr-Efr- zP@hJ>q&1v;l>{v}9kc1ZOyAalpNYWMPcjwJH6&fjk{_n2ky#hLS_dv_vma4At7qJPKGOI)+5JQtQMl^t0QIN3w9B2bj?TH`GaLpeE#K*h8K&^4G~agieJ_R+it_S#OES&YaI!tU7=JA zMy~H=F6L`?d9KA0$zB`E-i3%%YZ5=SGik^)S}{~2^08dU6`*Fj< z6RZap1dao0tUzbT_q_`!Y4?0Cu|x86#isbVKsDw>bOB^RnH8&zF)(Cg_!VMO$_Xc+ z>{fD+6Hqjb@(F3&!mJhh+|A8ssnIB#O-km_jg2ZzbcNgW=(#2hLaIlTeH&M zbFt?J1H1(#$H8B`ug{;JAK~P5Hm&*m2&%g<3TXTBrSWrlYgFKdB3EX?^7aT#@VvQW zD-}|j?I6$Xm_2m+Rg>oq&i^FKii@9pxBHcDVP|T8*$bkf`SfR3(T2>|CQbvf|U2&px?Q z&O$|m=Z~mU(1v;~>w?hftCo#R_(|+AL&6D&kgp5&P(N-bntAo0 zx}kaWd5LFoGhVgK881(HO!lf(>-)_s_jCD3BFvLkFxv#VP&1eQ*OA~fR3kNd6h0OI zTHosV)!ZnI>Ix~3`%wa5`|S)*J#s5Q>Eb%Ca_SP}9H3}!FP@AfJ!K_#g5jM$k0v43 zol`L~qaq{k(j;~LnjIKb*&iLg&#;K=rcPw8ac0|Z?br1^K}m6SgoJ5$-x?;Q;kq$2 zZ;ly9M2bM6A#uQ(GxK|m_q8UdzI0+8v#G_}2Xk5mH&LuVIBUY;mr%<5-UXkVLEJ{J zaJIcs5ynt~?`D7A7E%iT)iXLSmaS6zz=8HinC%H|B6C^^1vpD+$mr~Sp%d-Z(4?-f7e znjU|{?cNBN2o1d`;-V9JuHVuYdERXK+n5fO;?IQ=$@jznGe_M9{wVj z^-H_hF-Ey?`Ahg1I?1Diecba2Q0@oz?O2Sa|Ef|bzv1C@vpW-hTlK%vIUj?+YOedF zh(WPCg!U*L?yCscArYNYgP9e7UT!U0=bcm?@z@E#w<2+~URo|Po}At)d!SGmQr18s zyEPl3afBpP_2I1i4O3b{j`LiopyD5J{$L0{00!RxcK#i=q;jLcv2uc6y8+;UD>z7B zyP^`TBN8984?Aaf#%^O+J4^I)uv6*UY#>I~zI9uJnmVI)r^QzG+T(h9?b%^kSarlw zoUYD__`X=!fe8B!FD*#LuS{!%>;%G~daI|j$D^7ZBE&PA4Zk6gkg&iuyTXJ^@;l$S z@~hc?%}pTmyLOPAslamT^4NZCikvPgi}nY!4*48S(f{Oas3rpTRoKdh3Y&4*x{c^N z0cWOf8aygKHqOjTsCWO)tR2A32)r9iVB3+bD3`p_iT*vZ#HK$N|CC}h?7*Vg@`lhr zDveFTUAS5g5C!u}>x<7zbVS9bFpZA^4P!K4aOT39MBuT*jn$+- zNJAPDX*0lgGtP=@nyOoL@ZJ2Sc~vCNV7p>f>s;^+kc41AdcMu(fmB}2P&42s!9`PQ z$Obr4z@iLrolr^{tDWst{I6=9y=~8bZ6$WX9F(e+%Em7 z0E=xq{tD~4z}BwkbCQnSC)(JPHmI%Mn)8EBRAAZ}mg}1i7;SVNCJwhi_T7MypvM0klE+^weIRdtOe6n!@GsPstF>ry&t9`5M;gA|pyDc_Gv7b$#1_|J>xNv2 zFlbGygtPswVd#V4nNdaT_sW2)O%8mEUfaPQ&&d&!o$O?hk00h5%CK-A%c87V?MRI+ z0}A4s5+4)-?r&|5UPQ*3zdhR~Ay+2s*JZbp3Z+(s0hM|S#Nq{t!*$({;=O;wzlb2< zTX+7DAO@KC1@AovO7ejB0x;Kn$J))5?PW$1`J*0t8reD3qtGhama+=i;{K}){Nt6f zl%w!T7y^Di^b5)OWVBSg);>$kut5YZ>I^*X#+*GB;`D{xaGM+TPl5$6~F%0#iUi8Nd@c8Ea6Vq2H&m+&03RbF5rqb zzZ3A$wfOEI!i6TF#u@s4dlE9ZgGx`tBo%vmC>+#HxCJc5=iHmD=miYSwt}Nho&*DY zEJ94awE`lwQw`$8declbAhVKqFPFkM?N%(YoLynqm~?t)?_vW6o}jC@3<6%z`bG++Oz{L?Ph zESkC%_J_^LXSm;!sR$pq55%!f(AtS(VPz=#-~i;}$*{gHq`KyR?%z`KNB=vc+WeSZY423rRk z-3E3%hHg|(61QEHZSGNH0t;G-{BW58J&)@Z$+%_?O|7X(a-aW(+etVWVwJZN!P&89 zA9m&J&NG}>S-p*GaMb<1GdenXxMt(+f!h5=y+%i#_qyCa{)z^$mnSNZM zy&xbm?qtR|>e$bphq6SM~QPb4Q?7BYVH!)j2 zQ&<9(@H1i{yL-mS4{BE8T`(gEM|7}V!R*=ghDc#m?`BaAJ4gGfZ#(LM^?CF>L1oHLUgxM(aZdigFJ97all&b|C z&S%pqg%2H%J*?D0JabcB1aoeY;H&cnT;KY0F_=?ElT?dKooaZZ2ZfdFbx6{9Xm?Cl zvpN^*VI7mp96U{jT&(s%a2E4Zy0{3E(z?U>6;uhkE#_`QlGm{3*tfRkAZvcmy@TYv zg;9{4Ol_auV!Mv1?S~l7(KuYuQ;Qij+m6TP!kJb>GWdAnxMs46^K{4!2(h)L_>PP} zMm>DZ9=uHQ z1;_Mq|4X9Y7&xg$VUQCQ1BrJ+baMAc#72kCY^jFem9u5kZ%5uKZaQE)IiSYnf2nhu z_7Y=FtT@djGhWAn9xvpI283#*0JGIuxn@8-OZ4Cq33==&*XQq*Rr_^KSp==pHHDLA zG^7-n{A*LSb8BSg3B4h^!xCXLtRyoII1n@XNM8YI_(EXM zv|40we$o*Cu2-EG@L^BDc$%WucKLCvU8va7C-SJ3EJ^a6kCn0KMDjjqrAp)OaQ3QB 
zEU8ZW>M&;}oMAr!wwLusc*jLaAU8~6e|?$_muLdYUpY-B{96|jI7E+Qjwu2m*dfO$|>wSs+DNl#me&!blW0=4|iy?-BMim5ug zU)H7l#Z_^kkOfxu5v;{G_#wyM(2;Sl!kzr_4WrB4=M58ltgBj zA9!`5yyJ~t5gmL1WnZ1X;fCf_;~15Lzzz+vQa?kOpOS8hWCU%pgCC+av%BJ5Y8bG6 zK-o9`y&JJC`A)p0Yy+lNdQvHz6y`JP{P?8#_-0K02j`RUJV>o6%HM9{Yu-Q*-9IJz zd6+oD7fNrWPNw7_vR3pc;<3AEthwl1%c^#lLB{0PSAW~v%MSLSR1$9_e5Wg(Z0(K2 zQ%lNqyE6Zh>royYwgOnG-F*&4*xI1fyx_gGYudR2=M$c32ILzs4U~I^T_L6i_vfoX zMdbJCzS01Mxr~uM-(CI@k=5F*(!ep%Z*ils{YVNQVq=l^R9$^ zC|C3`B$fXRR!9KFi=+VJf4HSX1*`Y|jAj5g2c2w4O8wkr)1j5WPzeX;UU8}O*lwiQ z$wxhAijTVMSYQ3qiI(6Y0RD~bYD+guLw^-B{MlY`2P@r)g4iA?EtYC1 zqcY>_dY0e+HvKT1-a(FC!08Z2Yz?u2i2l-c*DZY?P~wF|5&9-%!!$V702>bIN;DkA zKysTJv!f#_<`YH_0!VMB30sR}|6y0hoUMq*VCj1drs3m>*45VNs~s}$lsxUOQYOav z1?`BER_|bKQX5dp<9`*jbs&@W3vrghKVlqiQ#gJMjg zBA_+wnX10TU+Jn1#GXT6nEY7E#1EQ8#99IJ;?8f+@fckcFt6GLT4{S(uH^Xv<&)Ta zxjy673LVJ~!&lK~la6IBGPElZ#2KV!YdnZy^k_qm5e*O#!Rj-=mLOTrP$Ax6$Nd|a z%b)4Z_wJopS1U3D$fnek4jgFfz*u3m=(%Wgt|3RZxLo0w+Q)yF*L>Aroe@zs{jtAR za>SL14x+GFwPxa_)NW{(a|iv67;TYO4F7Y2@V-9pW8GcjeP`AyRBY&)f-?Dyy=0oO zM8FZJXs&Dl?yMTvyF{@}%&ClAJ9fgjp5rE>2u)JjTx&VjYq50<4Zof)`c|I?3FG76Y;_m;i z(-SVcf@+Zxz-Dwj&k7jc$cHH!8Sw+6T70JU?iX+be$%zzve6m8Yf?33)^RD5*wNAO z-Zx(%#EFGW{gKo*eF1u8tt0=?V1mlR>CNd!DV+>Ox*yt2+PniT@QP z=KNx@h3@>tPYRgHQc)mIDR76Ccnp?xjfwnLf)^K8Sl7ek+$lTnAV+r$p#i5bW#=H* zq{zQm{)k3*kw!PBr+IH#JyLKPEMAZ{x7(wCo^?n49@>21NMl&OvY$8Kynh5XR{yGw znQ42oMA~L?y-FKAUVPJ5#nV{NSg#N>Hi7bBrn#K82I~nJtruO=9}fR}gy2Kg0_07K zpcT@K-~=eAC2KZX_`WOkRz3+K}UrY!%_f{FE* z?R4JezZWSIN@_-u+%#I4?~ePJ+CI8xRgl_R96a+ni12*XHBQwuG>jky9$LH*9{~ns zY5vuIYYXWF>Z{v1%Zy<1fgdaq!rkSCj_?83$O`4$IU4~A7OcoG9DyaFV#LX47L>(w zl!Zu!NH7hr&GK4lg3UqO>7&Abuv?1uIct?L)}f}+8<M!#zujbfi}DCw`k# zk00gzsWkC1?@Y=?D~ko&Xww4AQT)Nk$la4!9-?lgI#&dB|RqG znq~Din?o?^7iyyJlPFl+Pk`)_bI8e{?PK^#5JF~D+$Qkz-L2bfdWIGnfA(hG7DOQCdVdt}XJP}};4?mu$N>Dqbmiu* zx+;NdKKO$-zkt#ZZ>nDU!%3(QwM_vqjf+5^^sI!a?xaHJJTRx%WFim5;skXqog(xv zaukyx=&SJp#`>@>$!t;|@%oqBbPGwy-r^HGZq?!-1+G}cx;Gt}vQ$Vq>NV3s3H3)@ zFoRB_C**%ue)v&-JLt_E(6!{2>F|XujkEBmI%H2OJ_5quCnwKH-(#PVH`7hHMwwB! z&DhWS0EzQ1DtIXeJ4q^QJSGU^XYf|AsgM}iHAS<1b8LS-$gOvuZ^~aYyNDtsGbz}^ zy$2Thg#7O&5d?eA#Vq>oy7Yn3j$2Xe>}69b)U~aPUj5w=ZYg~=XWuGB9%>AOEAGUs zxjtaIejhSQtIl;@^%?LT0~dmf>V**>E2&~c#kMmUr&dVQqG zrm+1;MDLq(rNulJV=pY#PuUPr#@35^x+^><2;)%0dRYnM>-dl&JbP24kgX3fF% zEv#r0Rk9zuc}oJO_w+6Ew|}Jt%Ct4&!k6G1_^2D?p&p0$MrdPAKPnbmLPNbRNPr(} zOFMR)=@LEuXc_Vrsf1PRC}9|tc^J5cGjKx)I$l~=QD8ToP0!me#nv&ec1W9TRQYm`Mx1$DJwCgamzCT#Ous8F}Si*0jw9=rvbL_n!j< zTJRhPn{Ii;t1z@hWqCpXx9}}9mG;1gzh9jNy z*to(mToE{1yta7YRdTVr_KSx!FZs=RQPA~g?@YGGY$_wDS)WWy+?roMG&jwAP5{Cu z9sg&5f27Xdt7ax09 zunwobLQgk-Xvx|0PFdiG9LtVqJ@%J&YN*|Tc5()=9Z~l_nDF?}=d+*A9lmg^S5b6h z6%vbzpg$(-Dad7BPLclU`i>JgmqeM|{w4goOZ4w}q`X%dpPc)YY2O%)p$5%cT>7A|IW-N;c*ysZ#VYh+8Sb_ z#^CP>h{BFvIKF*J4Z@%2hR{FcDHo7%F3iV?M5Um`nzQ?)FsHvlQzHQlTXw**19JO! 
z5Q7Yt#`rTi*~cHqa~WlA*Stz%&2TjpAL|wRV(vm6W2ryBJBi3heTecJDiYu=jh}Us z>b~DKhS5Vlx)={W{iZ z$j-JDllAxy?%`JUTTU7pVgjq*ZX{*1`ihHQtG>b(FX!7|b{p!VMcoFI3=R;YF$*1u zG(Z2VL3@x7MDxtl3uqx3^_-AGC)@FG^6+14vt&DcSdQVOjje_R@f2HRWq#v zqZSjWvuOnO=(81MGc-m=p_A~6%v-D%hJ5T@e;_gC%@6F;=+hZxZE8 zW;de8HdRS4))ef+u8L;lHS-6}t=W%jtyqM&Qp4wnZ`+R4ZXN%5qW58N13kH|a(~m^ z+wYlQzrzf%Q{UF=RM|r`62{xN($~>+3-fUtI$?~~dQzFkCqimb#~inhNUXM@l6meA zayPiewG~qlMZ%ebc}-)<*^-VQk;xu5b!M zc)Wi|XxbLr|K!o7l2gZvtqG6os`kB$@z62@%&Wt+l^%H24pe#}@An0XfWQ3>$odg} zCBZP48A5$B6=-+K=o2m3YLMa{d;dvCOy^-OW3v`|)61#|8ctVvn-sY}2=YL~_6oXw z$?d+}8XYQZ%K;`-=!Uj|@!PEclGF7M?q=jzz3eE^Vag*o~3 zlud8_!bkv6z&NpqFeB^Y*_($eoUn0F_NEw;S7*?1IViYp2^D_%mh_`O2mo{2|1%u> z$(lIfWi5iKnvTg|goR&tD4}^o@KPaGlaZ0{*a|1J0@V{mGT^Xc5`10xLtPUE^_@}D zfsAgpIPQuH)9INMBpM%YLwl$TXak4Df1eO^q^K%!o!)uz0Dy?I8fEY%2lGnwwQ(RJ(RcR$??nBlFx)y~|?FKMoEZ|#C*5YcyaGty z9IO_Zu!0;20C=`5_~n{W8WKu%w(2gfC&m%N?qsbWuJ;7A51GAdr<)Y>AtTxT@%6=A z@2HK|-OyI=OyzaphQ9n3wJ{CkU1Fx9u|kk)fqCgRygT#Ejzh*C zl*ystX%e5HWet0V9|;w%Paf}F47V1*Y!o7e1AvW6^=3$hKefd>4n^-}er4XLC1Ft? zru{~7r5vZYU6zBd zfAycK(Qp#a{zbFDekMt4^-s!JD*G`N^JglY{r#)Z7kX$3^xl5S zi%3-0lE18H$#x3F%z&8h(c>XL(u7iE^}^760d}#ne`g+-n69P6o9~bE*h+Gng**!HkoJb|6=@n?AFFO$ zZ5BkRv$Onev@yIq*}3O+amVcw`aW@$S@kPa&rExl#5F|V+=ou~7Um`2)eTHkxpXKg z<5>VIPv}65u>?oI%bftu?MZVShxTu;?}+Pd+a>jY#{qY;ZWy4@9QMEbJ=8boRB?uH zue>Eru;*W7gV&^!j{@3S7jNGZFoS4BZ-9#(cHIH4?wNyXW=hOYyt<9a!;F!L(BX>q z*-Bvm-V8e$C3gLL|MA|R0i3}%5HYt$MMkCgSt(KXD{g3!2G&Q~Dy6W|pjQJ3UDpVa zKk(??G((yochjLzIo+~b{{uCmx8Txj!{ts~`wc9a97q1>5i!?J*1M;13BQ6^_&D;V z8j8*)MyU5~5#p49l1~nZ#~%JH#%puKS7am!_flp?ca?d0=opsyhVd98HD`# z*mUwgp#N$z`Z1-tVnkmAEe9&$V-o=gR8}Jg9Nat;E-}~cq9G8fp2Gg?_4Ym@kg0sD zIPF*mS^k=yYBrK=O8?S>3wJ*UeX~d}9oz$h{_5yKMg=YT-H714vE+*P%f?m$wnI$fo|J`UCFA*&nDIqGR#m|2J zgVXRyI=Ja-k%{KPb2`ucKHU=Hx`fmza?5mY){`0-kJ8$7#A5Vf5bj$BY6 zV`!`LuKg5iz~Fq%5dV9Jxj;I7_>E1`x=^L$Uz5~OJK@6tf@hO{-x24r*5H&*Mr?Xr zICe8g&bQ}^oX7`J&bu3WsO5FDqu|JY&yC|xUY-$LQf!9-m%4Xf^ImVZweMRop5D0* z>$YEKd@!78 zV-VIJvRv-yWK2O%yQiue) zq(y32s~rc|;*$xU=@p8=v|j$qvy%QxwW{}mX4H1#+e-mw8u;72CRjQQx?e>*;5HSG z8xbj*`TZz}GoJ^+OCEaSZuaU(9y0Dgr&~9}QoY zzDS~fN3fs|0roddv5qO>%J>!K%NbGq>AyxafyZ}e>S1=L@*22ve{nh{I#;SC@r0t$ zM;Axb&>3?(cAsN&)(>Zi64LkEm@!`L~m!U2|5Aa7Y?e%`yB3p||(y?-y;?yCQElgv-ZvzDoarRD#XJdckw@XKH0_PD4+@}lQ&#CEZEM&!Whj@5F%w1M8ggW~jfDm=}xe^qOrgtQOifG{a1N3rG#WoD_#^}hVX5|U20((A4XG!`R*I9jR4k9y}4>;gY zo2kI3*J{v5?1HAu8-mlelZ{@AT-?t}#w0~>Y(LkPZ4=>Pok?`uIWIb7D1ZE8A&~S6 z{bgu5^rH(p=`H_H8nYw(gr}OXzVh-D(8-d=){7Mxeq^gojs=6o0iiqvIl5l&MAaos zXutGd1jeW3&%G54$cvk0avhT|+;+p{83Y#-kzNZSH$7k<3~qi9g{{8DGFjJY4{|jb zxvA`T8@xF}FhIiHA~XD~0^q5hUh+v*Sn!AWWf{!p8$^2K;P_wqfxv~@@C1>#K`sR= z*rSZEBz>i<=$BiJ9m>&V*7zJ&bzX9ra$bN9HLCkPbjkFVYDOJ9<h>gn-ZTqBI-gSjnm(|XDCnC}-DZJ%OG_OE*(T9}N4Hp^7e8?WKRkX>LKN)h$7eOS$AzoVB!_utPn!-jYT{78$JpYC zmuZpPQRc1Kmj;7CyABP0C@Uh=fYox<*Q5`PAB=6Z9Kc;pN+yodc>iAIf)K9626~?j zf>qQ^p+{k27c!T4n+!fvV|Z?=9RE-VycfXUPS2z)Zk`yVTw;vI3k0fIDDd|Dq|kD)%jXORs;&-1gjsqaNgHmc4}32W ztIp6O5p)Ot)GpfL8TAH!Wz_Cx8!Y#`2Sk|>)}ZMui#6rR%Q_)DB`KLm&?mC%-e4A- zodZSZh;>iDuX}bnhU>VTVckjxuIZs!kC@C~=V7{fKRE!Yu_t)+#t@0QJyj$h?Y)R}} zBVzRcblte%{6sRbacgw^HKJF&KniAvB>L3i2-5;n#Bt+fe?32a2XIsEf@WC0;E z|HMt-eZ6p&K*bg6!`Hot)jiPo!>91>=789-lOmQ)%g883ByEJX%j=i{XQ(z-BBgZEhCMaW3`-sX_!sGBulD zoLF1aP3PZSe5qSUA0;j0#)k=wMR$C;EHEGEbMy*Nzl-5dh<|ydXJAIq-`5)kz)y_v zA1IjR{Nb6Oz4Y{})_?42%99}Wx+Unfz~N50H+-(OczJKE|K27HuLKK#NA3Y0Jx5bJ z8t~y~ZI;ulAQXWA3?-g}^@Qa}cgG+Lzu^$DcbeiiFU?{=Uu@V$h^2FeEP{Q>Jye>& zEu+3H=qyn2ychB4IkcGICXzh4lPJ%kMVr0?6Gp(BMfcyhxptg^20|uKY65RGL9B1! 
zOU)ANYed+AO&u;AfO1nsGQQF8@81cr>ntb#D$)p0mi1LTw_qICy%bX#(Zbm%;d)I) zt!#eo?!u*QhIb(|xezdzlfDoD^>**-^l~4!wqhPV!0BWHd48%;pJ(272VzcF#)jL{^W0GP)JPhY#gbs z+76U$>jN+<3z>PCVfULjy`L~0`^`najZ+SL0YE6mQ17xOj%nk?kuXvO(Dg$|H|}S- zME5l?ym((Y?Kp-sOUHU76PnR9;eVan3)p6=Vi>QlCCE!#4X;y16$7n(y?gd=e*sCB zQ+Z5B8s@L_P3`Bl9wJA6J@ZA`k4#;@0sraMHwjK&h^Ju4dzqKGl>s5ZITXzhd*)QG zG#j7@=0>x2xGr6KH@41BLV;}(BS>7|Uus=g5Cgw*OL|57$uF!x39*^a&l-~l@H z`yxDrc!Lh9&m5>6P|u29X#f)$n6~| zG&)2W8RXR6k<%3hvEPqF^Yvb#EY}{kI3+Fcxv_;(Gk%g6Q}<0uAn8Dd07kT6h}96VQl`%{_$Hf&&1PWP62 zW=P2k^gscOAm9w=KNp3^_1(SBhOm7(>a0z35tfDbejozV^anbBlsj-t7UmE^V!!LN zIIC|nrj>w&75KRy7e-+vVGUS6FrCGxuoaqO*T)a*6qr%8-|X2=BIHp_r-MkrshBh@ z_ii^jZb`>D^;=bzD`9Xy^MaEv=SE;&6z_@Qd2FK)+WuRMe#to_na zXh!AlLPIe+08uzqVm%)_S~A;=1WMTf348oc&q+ydDk&|^R1)VKPLN)m_PlwpQg-V% zJKL$j>X9V_2|5AFywI~dcEcbvdC+BcS<%5&bNl*(kmaXXr^tj#76r^JqL+6fGfD+y zgl*s(aNsj8RAc_Bl3^6Oj3nvywoT-<-7(gco(ei|QY5rGINTl{t@qfwFsG*rhXr=; z&)0O%ON9xPUwPVQr$R<39++O|v?z%P2?6Rr{EJ4X$Se-ofkzlij-P)y>lxJ1k-w}u z&PVFdzOR&N4iE}yoMouZv@FUF*Jb>Vjq^LO#PX(7s>c>wGL_IrH$o(xam5#WX^vhy zLFtnG3LueTLLwq*^mY3aoiI?mWeNNb2qkHERFO+W}0qfhE_L#^Ed13rf*j z?5tmaDO2+Pt6z)H@MUyf>)i!@#SJwXV#s^l!JsW{a5EL!yX^RG>M3#tK*5k8HHfSg z!10x^#rGx<*Ld2zP%Sn@wGg(uiM<*m31Gix9XL>Sx9UP~F$bdJ4UpRjdgfzLA710u zI?#<3$Xg=>-Uc@aw6MWzs{$>bH`)O~zQ_2}9Zc;u&aESn4)-i&5KURHWiDEQE={+b2Rw~{C};qOIiWqybS}f+6R1o9J4@-Cl?Hz52jm$MM<^689jpMjhzV5x z@3EjjCBH>(R7u{$m@P>J;CvD~dc0cxa_9x~U!BD`#gZUZzEL%*|jODqar5P zU{L&eQuC-f7^-m@YSD2WvCIfZ)_8<&fC@f|@m7(~GhRlhKpvo%6Ed!I2D88ymsRV* z2A}JH5oOnR|GxDvNBCJ>MT{aipP6CLA_j5j>V10xvKxKB<72A6#;wm=yt#ezjS% z2Sz%#I9JD^_uhulxkA(LK=)0Vbz_{hcYmI69k!%#Y3{lPE87Jtl*W7D8mC@=BAXs? ziuXmv(%Xv8SqmYUHP{rmyDCmfbC(T)7WW01viIX2fyI~wVw5nrk$RNL%6`Z>cO2^7 zzd!w%Okg#C+uv(GL$IU~=7I^L%WI4e0GQ&Qh$kk#wb-3VF{pEICoXLM3?T6~>BPL@ z80rM-a&!>+Ue}r|PdeXgrlY^BK0b1^Yy#v7Sfe*%f`G^tXkJ|YZ2&arf;B=_dkk{t z{Lpsw3wAz9shZPb6hz|?LtlO>z#4a7b*Y8_>*(6!nmD%bY!Zk(gCd{=43AcC(TG;7 z2$e(%#U~1-)C<3oaIZ>i0*DCO^2nlM!H9rdR8)il3aFSA0eeA6N~D*E0#+(W6jBk8 zXLv|RLb7{j$zRFN&YU^3-}lWqll{$z$cH^(&}pxbo|)2h9<*@J=(g4uPYG)(Q78Gp zT72{=7fuizYEk+nUBg};D{p1_`joeGcKHFOmWy!bIlUd=MnJ9LPmOZ=4I$m5?tg*u z5IZI2#5iEwT|RLC3+z{;pY_U^LfWIj4Ed(@Z4`DWMjFqAG#_44J$~egW;^m;N3GYY zQn}`8CglTH#V2d5Np&OQd2G^GS-#y6cj2GTKO3g*7(^ovS^CGu5*~~0)p(qLi_NG@V%S7WJ7oj$hMdt7gn^Q1C~5D+LxmVJ79ZO@L?FgD2ewp@mLj% z+o|KRs1~XIl9wum1iDd6HzZ5`84<1KKn-*X26&nBbND5BpC0 zRWKJ3%~RRkld@3#NO@0=m)yHmD7z2=JuV{TC5p;ko;ws1u&g&kdOI5VIQQi9!0G!* z_!f1b83@$UdcQnBfm+ND#o&@v7MJOoOZ6|4&{*-7)5V29)E#4F{p9$X9+_8(T*1-j zz7)`VXnVlUg#N}g@7L0t#9xC6Q{m|=X^RH8dstbIUMa*+6RtbKfxO@uH?Dkscfo*3 z3(O}1s@mdnG`*ZtlPyE^XLU3%p1}m%b4&73=;0t$ipTc=7Gjli^1=H(Xlbn}gWn;3iBKB!5RwnQzS4FUJyUVziI2l(Sq{P%(&N zaNl|FC z&v{E*Vhe;vg_kFU+&QBC^R#w;g4BLKvC6BpEtgdHXHRP?N2|(wG+Wm)b3PT`<^|Ug zYM*X$@9&XFX@>D5?14ukkl4=m)R86R+~{l5EXk+K29dC*(PZBma6_#wnG(9pGfB{X z0zsRcdO2pZ*f&B+cUXtN#&7g4nBZgu5h(aU;j8;z%XfByl9V&O*-cCLGrREP4CbO@ z$Yu$%=~2&b63LvUE@glf?*bR_h*Ujc_Ou1p2Fh005n^c}fzEz}B?lE*>KYZCd+U`j zV@NbaA5BDL$6$6PSpC*`dNp3#dJwJ<4TxO8!cCKgMr9aP;tN%XIrp?cPMhF=|94rH zF6xM~C688XZB*X?xO1_$sJ%Y!DI@ZxDZ^q&BNqB{54eE?MFvDb8bmBG&7H}qvNiINczs5+G zW`Q=`al&8%a*nS-@AxC1;bsDNNK@OycO8FIU%X)Dobgd+;o7PuP)kiiQD*IyRT1b(RURfqB-oY*lu)=W8iL^<4{ z{L@WkCwsRU7IL%g#ERcJ3wW|WJ7@F=_qv@*Jez`SSYr(jo33^`Z-bkgF9TgGLakb8nCA!4epke=>??S$5rixA7o$}DW*K~3`IJLo z8IQUau=nwHIdRNn{*>K-(wAxitKR61v_||&Us2L2>>|gdY6ZwQdH5+hYKgK5HirI- zfPJ7^`VfQ3Lw&B`^7_iasUEg3_OCe`dP+ zm|rDbS*pm}75=2LNkLv>f$g578pXd+u+#yU3UlFevIn8Pclxc+yof$p^)r>>B_3>w zGv+<3ws*Yec9l>zb-DNg^y*ALCV!B9ZdcX{8M~e6+b|r+E9u>JNDyfi@7fX_W)I|I zbwhCjmaAlH$jItvP)rI>jpgGX;rYmEOi*<6_HS{T)K{dgJRz%d6am%daGcX7#yBF< 
z{znd@D6O~zbwDV6X%4V>DTE8x>OMBKnyQRtZW1F<-SsvCzxW&}GO_CN zhU~VTjl?QobF?tP79F=hpR9+qLm~gVe0~}hUbi_#qJ&WU`1bBBZ6vaw^cs zis$SCL2-Ns7fixvkhh}^ke9xE=BQbKS!UzAFa&`1k8kcXlH>nc0Pw@zgE3mJpW3fW RmYgAe@9_)pedxnI`+t+++vWfO literal 0 HcmV?d00001 diff --git a/docs/mint.json b/docs/mint.json index 3fa8f6633..7c4aa15f6 100644 --- a/docs/mint.json +++ b/docs/mint.json @@ -93,6 +93,7 @@ "v1/integrations/camel", "v1/integrations/cohere", "v1/integrations/crewai", + "v1/integrations/gemini", "v1/integrations/groq", "v1/integrations/langchain", "v1/integrations/llama_stack", diff --git a/docs/v1/examples/examples.mdx b/docs/v1/examples/examples.mdx index dda198bee..cc7c9e5bb 100644 --- a/docs/v1/examples/examples.mdx +++ b/docs/v1/examples/examples.mdx @@ -57,6 +57,10 @@ mode: "wide" Ultra-fast LLM inference with Groq Cloud + } iconType="image" href="/v1/integrations/gemini"> + Explore Google DeepMind's Gemini with observation via AgentOps + + } iconType="image" href="/v1/examples/langchain"> Jupyter Notebook with a sample LangChain integration diff --git a/docs/v1/integrations/gemini.mdx b/docs/v1/integrations/gemini.mdx new file mode 100644 index 000000000..0f643f346 --- /dev/null +++ b/docs/v1/integrations/gemini.mdx @@ -0,0 +1,118 @@ +--- +title: Gemini +description: "Explore Google DeepMind's Gemini with observation via AgentOps" +--- + +import CodeTooltip from '/snippets/add-code-tooltip.mdx' +import EnvTooltip from '/snippets/add-env-tooltip.mdx' + +[Gemini (Google Generative AI)](https://ai.google.dev/gemini-api/docs/quickstart) is a leading provider of AI tools and services. +Explore the [Gemini API](https://ai.google.dev/docs) for more information. + + + `google-generativeai>=0.1.0` is currently supported. + + + + + + ```bash pip + pip install agentops + ``` + ```bash poetry + poetry add agentops + ``` + + + + + `google-generativeai>=0.1.0` is required for Gemini integration. + + + ```bash pip + pip install google-generativeai + ``` + ```bash poetry + poetry add google-generativeai + ``` + + + + + + ```python python + import google.generativeai as genai + import agentops + + agentops.init() + model = genai.GenerativeModel("gemini-1.5-flash") + ... + # End of program (e.g. main.py) + agentops.end_session("Success") # Success|Fail|Indeterminate + ``` + + + + ```python .env + AGENTOPS_API_KEY= + GEMINI_API_KEY= + ``` + + Read more about environment variables in [Advanced Configuration](/v1/usage/advanced-configuration) + + + Execute your program and visit [app.agentops.ai/drilldown](https://app.agentops.ai/drilldown) to observe your Agent! 🕵️ + + After your run, AgentOps prints a clickable url to console linking directly to your session in the Dashboard + +
+ + + + + + +## Full Examples + + + ```python sync + import google.generativeai as genai + import agentops + + agentops.init() + model = genai.GenerativeModel("gemini-1.5-flash") + + response = model.generate_content( + "Write a haiku about AI and humans working together" + ) + + print(response.text) + agentops.end_session('Success') + ``` + + ```python stream + import google.generativeai as genai + import agentops + + agentops.init() + model = genai.GenerativeModel("gemini-1.5-flash") + + response = model.generate_content( + "Write a haiku about AI and humans working together", + stream=True + ) + + for chunk in response: + print(chunk.text, end="") + + agentops.end_session('Success') + ``` + + +You can find more examples in the [Gemini Examples](/v1/examples/gemini_examples) section. + + + + + + diff --git a/examples/anthropic_examples/anthropic-example-sync.py b/examples/anthropic_examples/anthropic-example-sync.py new file mode 100644 index 000000000..b4060293c --- /dev/null +++ b/examples/anthropic_examples/anthropic-example-sync.py @@ -0,0 +1,167 @@ +#!/usr/bin/env python +# coding: utf-8 + +# # Anthropic Sync Example +# +# We are going to create a program called "Nier Storyteller". In short, it uses a message system similar to Nier Automata's to generate a one sentence summary before creating a short story. +# +# Example: +# +# {A foolish doll} {died in a world} {of ended dreams.} turns into "In a forgotten land where sunlight barely touched the ground, a little doll wandered through the remains of shattered dreams. Its porcelain face, cracked and wea..." + +# First, we start by importing Agentops and Anthropic + +# In[ ]: + + +get_ipython().run_line_magic('pip', 'install agentops') +get_ipython().run_line_magic('pip', 'install anthropic') + + +# Setup our generic default statements + +# In[4]: + + +from anthropic import Anthropic, AsyncAnthropic +import agentops +from dotenv import load_dotenv +import os +import random + + +# And set our API keys. + +# In[6]: + + +load_dotenv() +ANTHROPIC_API_KEY = os.getenv("ANTHROPIC_API_KEY") or "ANTHROPIC KEY HERE" +AGENTOPS_API_KEY = os.getenv("AGENTOPS_API_KEY") or "AGENTOPS KEY HERE" + + +# Now let's set the client as Anthropic and an AgentOps session! + +# In[7]: + + +client = Anthropic(api_key=ANTHROPIC_API_KEY) + + +# In[ ]: + + +agentops.init(AGENTOPS_API_KEY, default_tags=["anthropic-example"]) + +# Remember that story we made earlier? As of writing, claude-3-5-sonnet-20240620 (the version we will be using) has a 150k word, 680k character length. We also get an 8192 context length. This is great because we can actually set an example for the script! +# +# Let's assume we have user (the person speaking), assistant (the AI itself) for now and computer (the way the LLM gets references from). +# Let's set a default story as a script! + +# In[10]: + + +defaultstory = "In a forgotten land where sunlight barely touched the ground, a little doll wandered through the remains of shattered dreams. Its porcelain face, cracked and weathered, reflected the emptiness that hung in the air like a lingering fog. The doll's painted eyes, now chipped and dull, stared into the distance, searching for something—anything—that still held life. It had once belonged to a child who dreamt of endless adventures, of castles in the clouds and whispered secrets under starry skies. But those dreams had long since crumbled to dust, leaving behind nothing but a hollow world where even hope dared not tread. 
The doll, a relic of a life that had faded, trudged through the darkness, its tiny feet stumbling over broken wishes and forgotten stories. Each step took more effort than the last, as if the world itself pulled at the doll's limbs, weary and bitter. It reached a place where the ground fell away into an abyss of despair, the edge crumbling under its weight. The doll paused, teetering on the brink. It reached out, as though to catch a fading dream, but there was nothing left to hold onto. With a faint crack, its brittle body gave way, and the doll tumbled silently into the void. And so, in a world where dreams had died, the foolish little doll met its end. There were no tears, no mourning. Only the soft, empty echo of its fall, fading into the darkness, as the land of ended dreams swallowed the last trace of what once was."
+
+
+# We are almost done! Let's generate a one-sentence story summary by taking 3 random sentence fragments and connecting them!
+
+# In[11]:
+
+
+# Define the lists
+first = [
+    "An unremarkable soldier",
+    "A lone swordsman",
+    "A lone lancer",
+    "A lone pugilist",
+    "A dual-wielder",
+    "A weaponless soldier",
+    "A beautiful android",
+    "A small android",
+    "A double-crossing android",
+    "A weapon carrying android",
+]
+
+second = [
+    "felt despair at this cold world",
+    "held nothing back",
+    "gave it all",
+    "could not get up again",
+    "grimaced in anger",
+    "missed the chance of a lifetime",
+    "couldn't find a weakpoint",
+    "was overwhelmed",
+    "was totally outmatched",
+    "was distracted by a flower",
+    "hesitated to land the killing blow",
+    "was attacked from behind",
+    "fell to the ground",
+]
+
+third = [
+    "in a dark hole beneath a city",
+    "underground",
+    "at the enemy's lair",
+    "inside an empty ship",
+    "at a tower built by the gods",
+    "on a tower smiled upon by angels",
+    "inside a tall tower",
+    "at a peace-loving village",
+    "at a village of refugees",
+    "in the free skies",
+    "below dark skies",
+    "in a blood-soaked battlefield",
+]
+
+# Generate a random sentence
+generatedsentence = (
+    f"{random.choice(first)} {random.choice(second)} {random.choice(third)}."
+)
+
+
+# And now to construct a stream/message! We set an example for the assistant now!
+
+# In[ ]:
+
+
+stream = client.messages.create(
+    max_tokens=2400,
+    model="claude-3-5-sonnet-20240620",
+    messages=[
+        {
+            "role": "user",
+            "content": "Create a story based on the three sentence fragments given to you; they have been combined into one below.",
+        },
+        {
+            "role": "assistant",
+            "content": "{A foolish doll} {died in a world} {of ended dreams.}",
+        },
+        {"role": "assistant", "content": defaultstory},
+        {
+            "role": "user",
+            "content": "Create a story based on the three sentence fragments given to you; they have been combined into one below.",
+        },
+        {"role": "assistant", "content": generatedsentence},
+    ],
+    stream=True,
+)
+
+response = ""
+for event in stream:
+    if event.type == "content_block_delta":
+        response += event.delta.text
+    elif event.type == "message_stop":
+        print(generatedsentence)
+        print(response)
+
+
+# We can observe the session in the AgentOps dashboard by going to the session URL provided above.
+#
+# Now we will end the session with a success message. We can also end the session with a failure or indeterminate status. By default, the session will be marked as indeterminate.
+
+# In[ ]:
+
+
+agentops.end_session("Success")
+
diff --git a/examples/gemini_examples/README.md b/examples/gemini_examples/README.md
new file mode 100644
index 000000000..fb0af7624
--- /dev/null
+++ b/examples/gemini_examples/README.md
@@ -0,0 +1,94 @@
+# Gemini Integration Examples
+
+This directory contains examples demonstrating how to use AgentOps with Google's Gemini API for tracking and monitoring LLM interactions.
+
+## Prerequisites
+
+- Python 3.7+
+- `agentops` package installed (`pip install -U agentops`)
+- `google-generativeai` package installed (`pip install -U google-generativeai>=0.1.0`)
+- A Gemini API key (get one at [Google AI Studio](https://ai.google.dev/tutorials/setup))
+- An AgentOps API key (get one at [AgentOps Dashboard](https://app.agentops.ai/settings/projects))
+
+## Environment Setup
+
+1. Install required packages:
+```bash
+pip install -U agentops google-generativeai
+```
+
+2. Set your API keys as environment variables:
+```bash
+export GEMINI_API_KEY='your-gemini-api-key'
+export AGENTOPS_API_KEY='your-agentops-api-key'
+```
+
+## Examples
+
+### Synchronous and Streaming Example
+
+The [gemini_example.ipynb](./gemini_example.ipynb) notebook demonstrates:
+- Basic synchronous text generation
+- Streaming text generation with chunk handling
+- Automatic event tracking and token usage monitoring
+- Session management and statistics
+
+```python
+import google.generativeai as genai
+import agentops
+
+# Configure API keys
+genai.configure(api_key=GEMINI_API_KEY)
+
+# Initialize AgentOps (provider detection is automatic)
+agentops.init()
+
+# Create Gemini model
+model = genai.GenerativeModel("gemini-1.5-flash")
+
+# Generate text (synchronous)
+response = model.generate_content("What are the three laws of robotics?")
+print(response.text)
+
+# Generate text (streaming)
+response = model.generate_content(
+    "Explain machine learning in simple terms.",
+    stream=True
+)
+for chunk in response:
+    print(chunk.text, end="")
+
+# End session and view stats
+agentops.end_session(
+    end_state="Success",
+    end_state_reason="Example completed successfully"
+)
+```
+
+To run the example:
+1. Make sure you have set up your environment variables
+2. Open and run the notebook: `jupyter notebook gemini_example.ipynb`
+3.
View your session in the AgentOps dashboard using the URL printed at the end + +## Features + +- **Automatic Provider Detection**: The Gemini provider is automatically detected and initialized when you call `agentops.init()` +- **Zero Configuration**: No manual provider setup required - just import and use +- **Comprehensive Event Tracking**: All LLM calls are automatically tracked and visible in your AgentOps dashboard +- **Token Usage Monitoring**: Token counts are extracted from the Gemini API's usage metadata when available +- **Error Handling**: Robust error handling for both synchronous and streaming responses +- **Session Management**: Automatic session tracking with detailed statistics + +## Notes + +- The provider supports both synchronous and streaming text generation +- All events are automatically tracked and can be viewed in the AgentOps dashboard +- Token usage is extracted when available in the response metadata +- Error events are automatically captured and logged +- The provider is designed to work seamlessly with AgentOps' session management + +## Additional Resources + +- [Gemini API Documentation](https://ai.google.dev/docs) +- [AgentOps Documentation](https://docs.agentops.ai) +- [Gemini Integration Guide](https://docs.agentops.ai/v1/integrations/gemini) diff --git a/examples/gemini_examples/gemini_example.ipynb b/examples/gemini_examples/gemini_example.ipynb new file mode 100644 index 000000000..3e85414ee --- /dev/null +++ b/examples/gemini_examples/gemini_example.ipynb @@ -0,0 +1,130 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "580c85ac", + "metadata": {}, + "source": [ + "# Gemini API Example with AgentOps\n", + "\n", + "This notebook demonstrates how to use AgentOps with Google's Gemini API for both synchronous and streaming text generation." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "d731924a", + "metadata": {}, + "outputs": [], + "source": [ + "import google.generativeai as genai\n", + "import agentops\n", + "from dotenv import load_dotenv\n", + "import os" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "id": "a94545c9", + "metadata": {}, + "outputs": [], + "source": [ + "load_dotenv()\n", + "\n", + "GEMINI_API_KEY = os.getenv(\"GEMINI_API_KEY\") or \"your gemini api key\"\n", + "AGENTOPS_API_KEY = os.getenv(\"AGENTOPS_API_KEY\") or \"your agentops api key\"\n", + "\n", + "genai.configure(api_key=GEMINI_API_KEY)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "d632fe48", + "metadata": {}, + "outputs": [], + "source": [ + "# Initialize AgentOps and Gemini model\n", + "agentops.init()\n", + "model = genai.GenerativeModel(\"gemini-1.5-flash\")" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "3923b6b8", + "metadata": {}, + "outputs": [], + "source": [ + "# Test synchronous generation\n", + "print(\"Testing synchronous generation:\")\n", + "response = model.generate_content(\n", + " \"What are the three laws of robotics?\",\n", + " # session=ao_client\n", + ")\n", + "print(response.text)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "da54e521", + "metadata": {}, + "outputs": [], + "source": [ + "# Test streaming generation\n", + "print(\"\\nTesting streaming generation:\")\n", + "response = model.generate_content(\n", + " \"Explain the concept of machine learning in simple terms.\",\n", + " stream=True,\n", + " # session=ao_client\n", + ")\n", + "\n", + "for chunk in response:\n", + " print(chunk.text, end=\"\")\n", + "print() # Add newline after streaming output\n", + "\n", + "# Test another synchronous generation\n", + "print(\"\\nTesting another synchronous generation:\")\n", + "response = model.generate_content(\n", + " \"What is the difference between supervised and unsupervised learning?\",\n", + " # session=ao_client\n", + ")\n", + "print(response.text)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "c6a674c0", + "metadata": {}, + "outputs": [], + "source": [ + "# End session and check stats\n", + "agentops.end_session(end_state=\"Success\")" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "ops", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.16" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/examples/openai_examples/openai_example_sync.py b/examples/openai_examples/openai_example_sync.py new file mode 100644 index 000000000..c1a64227c --- /dev/null +++ b/examples/openai_examples/openai_example_sync.py @@ -0,0 +1,107 @@ +#!/usr/bin/env python +# coding: utf-8 + +# # OpenAI Sync Example +# +# We are going to create a simple chatbot that creates stories based on a prompt. The chatbot will use the gpt-4o-mini LLM to generate the story using a user prompt. +# +# We will track the chatbot with AgentOps and see how it performs! 
+
+# First let's install the required packages
+
+# In[ ]:
+
+
+get_ipython().run_line_magic('pip', 'install -U openai')
+get_ipython().run_line_magic('pip', 'install -U agentops')
+
+
+from openai import OpenAI
+import agentops
+import os
+from dotenv import load_dotenv
+
+# Then continue with the example
+
+
+# Next, we'll grab our API keys. You can use dotenv like below or however else you like to load environment variables.
+
+# In[2]:
+
+
+load_dotenv()
+OPENAI_API_KEY = os.getenv("OPENAI_API_KEY") or ""
+AGENTOPS_API_KEY = os.getenv("AGENTOPS_API_KEY") or ""
+
+
+# Next we initialize the AgentOps client.
+
+# In[ ]:
+
+
+agentops.init(AGENTOPS_API_KEY, default_tags=["openai-sync-example"])
+
+
+# And we are all set! Note the session URL above. We will use it to track the chatbot.
+#
+# Let's create a simple chatbot that generates stories.
+
+# In[4]:
+
+
+client = OpenAI(api_key=OPENAI_API_KEY)
+
+system_prompt = """
+You are a master storyteller, with the ability to create vivid and engaging stories.
+You have experience in writing for children and adults alike.
+You are given a prompt and you need to generate a story based on the prompt.
+"""
+
+user_prompt = "Write a story about a cyber-warrior trapped in the imperial time period."
+
+messages = [
+    {"role": "system", "content": system_prompt},
+    {"role": "user", "content": user_prompt},
+]
+
+
+# In[ ]:
+
+
+response = client.chat.completions.create(
+    model="gpt-4o-mini",
+    messages=messages,
+)
+
+print(response.choices[0].message.content)
+
+
+# The response is a string that contains the story. We can track this with AgentOps by navigating to the session URL and viewing the run.
+
+# ## Streaming Version
+# We will demonstrate the streaming version of the API.
+
+# In[ ]:
+
+
+stream = client.chat.completions.create(
+    model="gpt-4o-mini",
+    messages=messages,
+    stream=True,
+)
+
+for chunk in stream:
+    print(chunk.choices[0].delta.content or "", end="")
+
+
+# Note that the response is a generator that yields chunks of the story. We can track this with AgentOps by navigating to the session URL and viewing the run.
+
+# In[ ]:
+
+
+agentops.end_session(end_state="Success", end_state_reason="The story was generated successfully.")
+
+
+# We end the session with a success state and a success reason. This is useful if you want to track the success or failure of the chatbot. In that case you can set the end state to failure and provide a reason. By default, the session will have an indeterminate end state.
+#
+# All done!
diff --git a/pyproject.toml b/pyproject.toml
index f21dd03c7..1e274b2f8 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -49,6 +49,7 @@ test = [
     "groq",
     "ollama",
     "mistralai",
+    "google-generativeai>=0.1.0",
     # ;;
     # The below is a really hard dependency, that can be installed only between python >=3.10,<3.13.
     # CI will fail because all tests will automatically pull this dependency group;