Take the L402 pill...
pyl402 is a Python package for making HTTP requests with integrated payment functionality, built around the Lightning Network's L402 protocol. When a server answers with HTTP 402 Payment Required, pyl402 handles the payment through a compatible Lightning wallet and retries the request, giving you a smooth experience with paid resources.
Note: This project is currently in alpha stage and is considered Work-In-Progress (WIP). Features and functionality are subject to change, and more testing is needed to ensure stability and reliability.
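pyl402 automates the whole exchange for you, but for context, the L402 handshake it performs looks roughly like this. The sketch below only models the header formats from the L402 (formerly LSAT) convention; `parse_l402_challenge` and `build_l402_auth` are illustrative helpers, not part of pyl402's API:

```python
import re

def parse_l402_challenge(header: str) -> tuple[str, str]:
    """Extract the macaroon and invoice from a 402 challenge header.

    A paid endpoint responds with something like:
        WWW-Authenticate: L402 macaroon="<token>", invoice="<bolt11>"
    """
    macaroon = re.search(r'macaroon="([^"]+)"', header).group(1)
    invoice = re.search(r'invoice="([^"]+)"', header).group(1)
    return macaroon, invoice

def build_l402_auth(macaroon: str, preimage: str) -> str:
    """Build the Authorization header that proves payment.

    After paying the invoice, the wallet learns the payment preimage;
    retrying with `L402 <macaroon>:<preimage>` unlocks the resource.
    """
    return f"L402 {macaroon}:{preimage}"
```

The client does exactly this on your behalf: on a 402 it parses the challenge, pays the invoice with your wallet, caches the resulting token, and retries the request.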
To install pyl402, you can use pip:
```bash
pip install pyl402
```
Please note that this package requires Python 3.7 or later.
- Automatic Token Handling: Manages L402 tokens automatically, storing and retrieving tokens as needed.
- Integrated Payment: Automatically handles payment processes if a resource requires payment, through compatible wallet implementations.
- Extensible: Easily extend the library to support different wallet types and token storage mechanisms.
Here's a quick example to get you started with the L402 Client:
```python
from pyl402.wallet import AlbyWallet
from pyl402.token_store import MemoryTokenStore
from pyl402.client import L402Client

# Initialize wallet and token store
wallet = AlbyWallet(token="your_alby_api_token_here")
store = MemoryTokenStore()

# Create the L402 Client
client = L402Client(wallet=wallet, store=store)

# Use the client to send HTTP requests
response = client.get('https://rnd.ln.sulu.sh/randomnumber')
print(response.text)
```
This example demonstrates creating an L402 client using an Alby wallet and a memory-based token store to access a resource that may require payment.
Building on the same client, you can access inference APIs with nothing but Bitcoin and the Lightning Network:
```python
import os

from openai import OpenAI

from pyl402.wallet import AlbyWallet
from pyl402.client import L402Client
from pyl402.token_store import MemoryTokenStore

# Create the L402 client
alby_wallet = AlbyWallet(os.getenv("ALBY_BEARER_TOKEN"))
token_store = MemoryTokenStore()
l402_client = L402Client(wallet=alby_wallet, store=token_store)

# Hand the L402 client to the OpenAI SDK as its HTTP transport
client = OpenAI(
    http_client=l402_client,
    base_url='https://suluai.ln.sulu.sh/v1',
)

chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "Say this is a test",
        }
    ],
    model="meta-llama/Llama-3-70b-chat-hf",
)
print(chat_completion.choices[0].message.content)
```
Contributions are welcome! To contribute, fork the repository, work in a feature branch, and open a pull request.
The code in this project is licensed under the MIT license. See LICENSE for details.
This is an alpha release, and as such, it might contain bugs and incomplete features. We do not recommend using it in a production environment.