I couldn't find a good library for building a runtime-configurable LLM client for use in agents, so I built this one. It lets you reconfigure the client at runtime and discover the available models and services.
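A minimal sketch of the idea (not the library's actual API; the class and method names below are hypothetical): the provider and model are ordinary runtime values rather than import-time constants, and the client can report which services and models are currently available.

```python
# Hypothetical illustration only: names and structure are assumptions,
# not this library's real interface.
from dataclasses import dataclass, field


@dataclass
class Provider:
    name: str
    models: list[str]


@dataclass
class LLMClient:
    providers: dict[str, Provider] = field(default_factory=dict)
    active_provider: str | None = None
    active_model: str | None = None

    def register(self, provider: Provider) -> None:
        """Make a provider (and its models) discoverable at runtime."""
        self.providers[provider.name] = provider

    def discover(self) -> dict[str, list[str]]:
        """Return every registered service and the models it exposes."""
        return {name: p.models for name, p in self.providers.items()}

    def configure(self, provider: str, model: str) -> None:
        """Switch provider/model without restarting the agent."""
        if model not in self.providers[provider].models:
            raise ValueError(f"{model!r} is not offered by {provider!r}")
        self.active_provider, self.active_model = provider, model


# Usage: an agent can inspect what is available and reconfigure on the fly.
client = LLMClient()
client.register(Provider("openai", ["gpt-4o", "gpt-4o-mini"]))
client.register(Provider("anthropic", ["claude-sonnet-4"]))
print(client.discover())
client.configure("anthropic", "claude-sonnet-4")
```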