hey Prasanna, we probably won't support every permutation of model provider, so I think LiteLLM is the best fit here. Are you able to initialize the LiteLLM client directly? If so, you may just be missing some params. My guess is you need to set api_base in model_kwargs (https://github.com/Arize-ai/phoenix/blob/ab7354b4f813a757e4a120cb71ff08fcf2a105cd/packages/phoenix-evals/src/phoenix/evals/models/litellm.py#L26); otherwise LiteLLM doesn't know how to connect to the model API.
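Something like this minimal sketch (the model name and endpoint URL are just placeholders, swap in your own):

```python
# Sketch: the key part is putting api_base inside model_kwargs so LiteLLM
# knows which endpoint to hit. Usage with phoenix-evals would look like:
#
#   from phoenix.evals import LiteLLMModel
#   model = LiteLLMModel(model="ollama/llama3", model_kwargs=model_kwargs)
#
# where model_kwargs carries your endpoint, e.g. for a local server:
model_kwargs = {"api_base": "http://localhost:11434"}  # <- your model API URL
```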
Let us know how it goes!