The Pi Labs Python SDK provides convenient, type-safe access to the Pi Client REST API from any Python 3.8+ application. This guide covers installation, basic usage, configuration, async programming, and advanced features. Documentation is also available at the GitHub repository.

The SDK supports a wide range of Python environments and frameworks:
- Python Versions: Python 3.8+ with full type support and async capabilities
- Web Frameworks: Django, Flask, FastAPI, and other WSGI/ASGI frameworks
- Async Support: full asyncio support with httpx and an optional aiohttp backend
- Data Science: Jupyter Notebooks, pandas integration, and ML workflows (see the pandas sketch after this list)
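For instance, the pandas workflow mentioned above can be as simple as applying the score call (shown in the quickstart below) across a DataFrame of input/output pairs. This is a minimal sketch using plain pandas rather than a dedicated integration API; the DataFrame layout and column names are arbitrary.

```python
import os

import pandas as pd
from withpi import PiClient

pi = PiClient(api_key=os.environ.get("WITHPI_API_KEY"))

# A toy evaluation set; in practice this might be loaded from a CSV or a log dump.
df = pd.DataFrame({
    "llm_input": ["What are some good day trips from Milan by train?"],
    "llm_output": ["Milan is an excellent hub for day trips by train."],
})

# Score each (input, output) pair and keep the total score alongside the data.
df["total_score"] = [
    pi.scoring_system.score(
        llm_input=row.llm_input,
        llm_output=row.llm_output,
        scoring_spec=[{"question": "Did the response fulfill the intent of the user's query?"}],
    ).total_score
    for row in df.itertuples()
]
print(df)
```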
To score an LLM generation, construct a client and call scoring_system.score with the user input, the generated output, and a scoring spec of questions:

```python
import os

from withpi import PiClient

pi = PiClient(
    api_key=os.environ.get("WITHPI_API_KEY"),  # This is the default and can be omitted
)

scores = pi.scoring_system.score(
    llm_input="What are some good day trips from Milan by train?",
    llm_output="Milan is an excellent hub for day trips by train. You can take a short ride to the stunning shores of Lake Como to explore picturesque towns like Bellagio and Varenna. Alternatively, the historic hilltop city of Bergamo is another charming and easily accessible option. For a spectacular alpine adventure, consider the Bernina Express scenic train journey through the Swiss Alps. Cities like Turin and Verona are also just an hour or two away via high-speed train.",
    scoring_spec=[
        {"question": "Does the response maintain a professional tone?"},
        {"question": "Did the response fulfill the intent of the user's query?"},
        {"question": "Did the response only present data relevant to the user's query?"},
    ],
)
print('Total Score:', scores.total_score)
print('Question Scores:', scores.question_scores)
```
You should receive a response with the scores Pi Scorer assigned to the generation.
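If you want to act on the result programmatically, a simple gate on total_score is enough. The sketch below continues from the quickstart above; it assumes scores fall on a 0-to-1 scale, and the 0.7 threshold is an arbitrary example value.

```python
# Gate a generation on its aggregate score. The threshold is an arbitrary
# example value, assuming total_score is reported on a 0-to-1 scale.
MIN_ACCEPTABLE_SCORE = 0.7

if scores.total_score >= MIN_ACCEPTABLE_SCORE:
    print("Response accepted")
else:
    print("Response flagged for review")
```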
The SDK provides full async support with AsyncPiClient. Simply import AsyncPiClient instead of PiClient and use await with each API call. By default, the async client uses httpx for HTTP requests. However, for improved concurrency performance, you may also use aiohttp as the HTTP backend.
httpx
```python
import os
import asyncio

from withpi import AsyncPiClient

pi = AsyncPiClient(
    api_key=os.environ.get("WITHPI_API_KEY"),  # This is the default and can be omitted
)

async def main() -> None:
    scores = await pi.scoring_system.score(
        llm_input="What are some good day trips from Milan by train?",
        llm_output="Milan is an excellent hub for day trips by train. You can take a short ride to the stunning shores of Lake Como to explore picturesque towns like Bellagio and Varenna.",
        scoring_spec=[
            {"question": "Does the response maintain a professional tone?"},
            {"question": "Did the response fulfill the intent of the user's query?"},
        ],
    )
    print(scores.total_score)

asyncio.run(main())
```
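aiohttp

The aiohttp variant is sketched below under the assumption that the SDK follows the common pattern of an optional aiohttp extra and a DefaultAioHttpClient wrapper; verify the exact install extra and class name in the package before relying on them.

```python
# Hypothetical aiohttp backend: assumes `pip install withpi[aiohttp]` (or aiohttp
# installed separately) and that the SDK exports a DefaultAioHttpClient wrapper.
import os
import asyncio

from withpi import AsyncPiClient, DefaultAioHttpClient

async def main() -> None:
    async with AsyncPiClient(
        api_key=os.environ.get("WITHPI_API_KEY"),
        http_client=DefaultAioHttpClient(),
    ) as pi:
        scores = await pi.scoring_system.score(
            llm_input="What are some good day trips from Milan by train?",
            llm_output="Milan is an excellent hub for day trips by train.",
            scoring_spec=[{"question": "Does the response maintain a professional tone?"}],
        )
        print(scores.total_score)

asyncio.run(main())
```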
Functionality between the synchronous and asynchronous clients is otherwise identical.
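Because every call on AsyncPiClient is awaitable, concurrent scoring is just standard asyncio. A minimal sketch fanning out several requests with asyncio.gather, using only the score call shown above:

```python
import os
import asyncio

from withpi import AsyncPiClient

async def main() -> None:
    pi = AsyncPiClient(api_key=os.environ.get("WITHPI_API_KEY"))
    outputs = [
        "Milan is an excellent hub for day trips by train.",
        "Take the train to Lake Como or Bergamo for an easy day trip.",
    ]
    # Issue the scoring requests concurrently instead of one at a time.
    results = await asyncio.gather(*[
        pi.scoring_system.score(
            llm_input="What are some good day trips from Milan by train?",
            llm_output=output,
            scoring_spec=[{"question": "Did the response fulfill the intent of the user's query?"}],
        )
        for output in outputs
    ])
    for result in results:
        print(result.total_score)

asyncio.run(main())
```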
The SDK includes comprehensive type definitions using TypedDicts for requests and Pydantic models for responses:
```python
from withpi import PiClient

pi = PiClient()

# TypedDict provides autocomplete for request parameters
scoring_params = {
    "llm_input": "What are some good day trips from Milan by train?",
    "llm_output": "Milan is an excellent hub for day trips by train. You can take a short ride to the stunning shores of Lake Como to explore picturesque towns like Bellagio and Varenna.",
    "scoring_spec": [
        {
            "label": "Professional Tone",
            "question": "Does the response maintain a professional tone?",
        },
        {
            "label": "Intent Fulfillment",
            "question": "Did the response fulfill the intent of the user's query?",
        },
    ],
}

# Pydantic model provides type-safe response handling
response = pi.scoring_system.score(**scoring_params)

# Access response properties with full autocomplete
print(f"Total Score: {response.total_score}")
print(f"Question Scores: {response.question_scores}")

# Convert to different formats
response_dict = response.to_dict()  # Convert to dictionary
response_json = response.to_json()  # Serialize to JSON
```
In the event of an error, the SDK raises typed exceptions: APIConnectionError when the server cannot be reached, RateLimitError when a 429 status code is returned, and APIStatusError for other non-success status codes:
```python
import withpi
from withpi import PiClient

pi = PiClient()

try:
    result = pi.scoring_system.score(
        llm_input="What are some good day trips from Milan by train?",
        llm_output="Milan is an excellent hub for day trips by train.",
        scoring_spec=[
            {"question": "Does the response maintain a professional tone?"},
            {"question": "Did the response fulfill the intent of the user's query?"},
        ],
    )
except withpi.APIConnectionError as e:
    print("The server could not be reached")
    print(e.__cause__)  # an underlying Exception, likely raised within httpx.
except withpi.RateLimitError as e:
    print("A 429 status code was received; we should back off a bit.")
except withpi.APIStatusError as e:
    print("Another non-200-range status code was received")
    print(e.status_code)
    print(e.response)
```
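Because RateLimitError identifies 429 responses specifically, you can layer a simple retry with exponential backoff on top of the call. In this sketch the attempt count and delays are arbitrary example values.

```python
import time

import withpi
from withpi import PiClient

pi = PiClient()

# Retry a scoring call with exponential backoff when rate limited.
for attempt in range(5):
    try:
        result = pi.scoring_system.score(
            llm_input="What are some good day trips from Milan by train?",
            llm_output="Milan is an excellent hub for day trips by train.",
            scoring_spec=[{"question": "Does the response maintain a professional tone?"}],
        )
        break
    except withpi.RateLimitError:
        time.sleep(2 ** attempt)  # 1s, 2s, 4s, ...
else:
    raise RuntimeError("Rate limited on every attempt; giving up.")
```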
Access the underlying response object for headers and other metadata:
Raw Response Only
```python
from withpi import PiClient

pi = PiClient()

response = pi.scoring_system.with_raw_response.score(
    llm_input="What are some good day trips from Milan by train?",
    llm_output="Milan is an excellent hub for day trips by train.",
    scoring_spec=[{"question": "Does the response maintain a professional tone?"}],
)
print(response.headers.get('X-My-Header'))
print(response.status_code)

# Parse the response data
scoring_system = response.parse()
print(scoring_system.total_score)
```
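Streaming Response

The streaming variant was not captured above. The sketch below assumes the SDK exposes a .with_streaming_response accessor (as the tab implies) that returns a context manager, so the body is only read as you consume it; the iteration helper shown is an assumption to verify against the SDK.

```python
# Sketch: streamed raw response, assuming a .with_streaming_response accessor.
from withpi import PiClient

pi = PiClient()

with pi.scoring_system.with_streaming_response.score(
    llm_input="What are some good day trips from Milan by train?",
    llm_output="Milan is an excellent hub for day trips by train.",
    scoring_spec=[{"question": "Does the response maintain a professional tone?"}],
) as response:
    print(response.headers.get("X-My-Header"))
    # Consume the body incrementally instead of loading it all at once.
    for line in response.iter_lines():
        print(line)
```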
You can make requests to any endpoint, including undocumented ones:
```python
import httpx

from withpi import PiClient

pi = PiClient()

# Custom endpoint
response = pi.post(
    "/some/path",
    cast_to=httpx.Response,
    body={"some_prop": "foo"},
)
print(response.headers.get("x-foo"))

# Undocumented parameters
pi.scoring_system.score(
    llm_input="What are some good day trips from Milan by train?",
    llm_output="Milan is an excellent hub for day trips by train.",
    scoring_spec=[{"question": "Does the response maintain a professional tone?"}],
    # Extra parameters
    extra_body={"undocumented_param": "value"},
    extra_query={"debug": "true"},
    extra_headers={"X-Custom-Header": "value"},
)
```
Customize the underlying HTTP client for proxies, transports, and advanced functionality:
```python
import httpx

from withpi import PiClient, DefaultHttpxClient

# Global configuration
pi = PiClient(
    base_url="http://my.test.server.example.com:8083",
    http_client=DefaultHttpxClient(
        proxy="http://my.test.proxy.example.com",
        transport=httpx.HTTPTransport(local_address="0.0.0.0"),
    ),
)

# Per-request configuration
pi.with_options(
    http_client=DefaultHttpxClient(proxy="http://different.proxy.com")
).scoring_system.score(
    llm_input="What are some good day trips from Milan by train?",
    llm_output="Milan is an excellent hub for day trips by train.",
    scoring_spec=[{"question": "Does the response maintain a professional tone?"}],
)
```
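The async client can presumably be configured the same way through an async-capable httpx wrapper. This sketch assumes the SDK exports DefaultAsyncHttpxClient as the counterpart to DefaultHttpxClient; confirm the export name against the package.

```python
# Assumed async counterpart: DefaultAsyncHttpxClient is not shown in the docs
# above, so verify the name before using it.
import httpx

from withpi import AsyncPiClient, DefaultAsyncHttpxClient

pi = AsyncPiClient(
    base_url="http://my.test.server.example.com:8083",
    http_client=DefaultAsyncHttpxClient(
        proxy="http://my.test.proxy.example.com",
        transport=httpx.AsyncHTTPTransport(local_address="0.0.0.0"),
    ),
)
```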
When making a one-off request, you can use a context manager to ensure the underlying HTTP client and its open connections are closed properly:
```python
from withpi import PiClient

# Context manager (recommended)
with PiClient() as pi:
    result = pi.scoring_system.score(
        llm_input="What are some good day trips from Milan by train?",
        llm_output="Milan is an excellent hub for day trips by train.",
        scoring_spec=[{"question": "Does the response maintain a professional tone?"}],
    )
# HTTP client is now closed.

# To close manually:
pi = PiClient()
try:
    result = pi.scoring_system.score(
        # ...
    )
finally:
    pi.close()
```
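The async client can be managed the same way. The sketch below assumes AsyncPiClient supports async with, mirroring the synchronous example above.

```python
import asyncio

from withpi import AsyncPiClient

async def main() -> None:
    # Assumes the async client implements the async context manager protocol,
    # closing its connections when the block exits.
    async with AsyncPiClient() as pi:
        result = await pi.scoring_system.score(
            llm_input="What are some good day trips from Milan by train?",
            llm_output="Milan is an excellent hub for day trips by train.",
            scoring_spec=[{"question": "Does the response maintain a professional tone?"}],
        )
        print(result.total_score)

asyncio.run(main())
```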
In an API response, a field may be explicitly null, or missing entirely; in either case, its value is None in this library. You can differentiate the two cases with .model_fields_set:
```python
if response.my_field is None:
    if 'my_field' not in response.model_fields_set:
        print('Got json like {}, without a "my_field" key present at all.')
    else:
        print('Got json like {"my_field": null}.')
```