
[BUG]getting error when importing trulens #1308

Closed
Lorrrrraine opened this issue Jul 29, 2024 · 13 comments · Fixed by #1393
Assignees
Labels
bug Something isn't working

Comments

@Lorrrrraine

Bug Description
TypeError: Type parameter +R without a default follows type parameter with a default

To Reproduce
Which steps should someone take to run into the same error? A small, reproducible code example is useful here.
[screenshot attached]

Expected behavior
A clear and concise description of what you expected to happen.

Relevant Logs/Tracebacks
Please copy and paste any relevant log output. This will be automatically formatted into code, so no need for backticks. If the issue is related to the TruLens dashboard, please also include a screenshot.
Traceback (most recent call last):
  File "C:\Program Files\JetBrains\PyCharm 2023.3.5\plugins\python\helpers\pydev\pydevconsole.py", line 364, in runcode
    coro = func()
  File "<input>", line 1, in <module>
  File "C:\Program Files\JetBrains\PyCharm 2023.3.5\plugins\python\helpers\pydev\_pydev_bundle\pydev_umd.py", line 197, in runfile
    pydev_imports.execfile(filename, global_vars, local_vars)  # execute the script
  File "C:\Program Files\JetBrains\PyCharm 2023.3.5\plugins\python\helpers\pydev\_pydev_imps\_pydev_execfile.py", line 18, in execfile
    exec(compile(contents+"\n", file, 'exec'), glob, loc)
  File "E:\Py_api_test\trulens-test\main.py", line 3, in <module>
    from trulens_eval import Feedback, TruLlama
  File "C:\Program Files\JetBrains\PyCharm 2023.3.5\plugins\python\helpers\pydev\_pydev_bundle\pydev_import_hook.py", line 21, in do_import
    module = self._system_import(name, *args, **kwargs)
  File "E:\Py_api_test\trulens-test\.venv\lib\site-packages\trulens_eval\__init__.py", line 20, in <module>
    from trulens_eval import tru as mod_tru
  File "C:\Program Files\JetBrains\PyCharm 2023.3.5\plugins\python\helpers\pydev\_pydev_bundle\pydev_import_hook.py", line 21, in do_import
    module = self._system_import(name, *args, **kwargs)
  File "E:\Py_api_test\trulens-test\.venv\lib\site-packages\trulens_eval\tru.py", line 28, in <module>
    from trulens_eval.database import sqlalchemy
  File "C:\Program Files\JetBrains\PyCharm 2023.3.5\plugins\python\helpers\pydev\_pydev_bundle\pydev_import_hook.py", line 21, in do_import
    module = self._system_import(name, *args, **kwargs)
  File "E:\Py_api_test\trulens-test\.venv\lib\site-packages\trulens_eval\database\sqlalchemy.py", line 23, in <module>
    from trulens_eval import app as mod_app
  File "C:\Program Files\JetBrains\PyCharm 2023.3.5\plugins\python\helpers\pydev\_pydev_bundle\pydev_import_hook.py", line 21, in do_import
    module = self._system_import(name, *args, **kwargs)
  File "E:\Py_api_test\trulens-test\.venv\lib\site-packages\trulens_eval\app.py", line 22, in <module>
    from trulens_eval import feedback as mod_feedback
  File "C:\Program Files\JetBrains\PyCharm 2023.3.5\plugins\python\helpers\pydev\_pydev_bundle\pydev_import_hook.py", line 21, in do_import
    module = self._system_import(name, *args, **kwargs)
  File "E:\Py_api_test\trulens-test\.venv\lib\site-packages\trulens_eval\feedback\__init__.py", line 3, in <module>
    from trulens_eval.feedback import feedback as mod_feedback
  File "C:\Program Files\JetBrains\PyCharm 2023.3.5\plugins\python\helpers\pydev\_pydev_bundle\pydev_import_hook.py", line 21, in do_import
    module = self._system_import(name, *args, **kwargs)
  File "E:\Py_api_test\trulens-test\.venv\lib\site-packages\trulens_eval\feedback\feedback.py", line 25, in <module>
    from trulens_eval.feedback.provider import base as mod_base_provider
  File "C:\Program Files\JetBrains\PyCharm 2023.3.5\plugins\python\helpers\pydev\_pydev_bundle\pydev_import_hook.py", line 21, in do_import
    module = self._system_import(name, *args, **kwargs)
  File "E:\Py_api_test\trulens-test\.venv\lib\site-packages\trulens_eval\feedback\provider\__init__.py", line 1, in <module>
    from trulens_eval.feedback.provider.base import Provider
  File "C:\Program Files\JetBrains\PyCharm 2023.3.5\plugins\python\helpers\pydev\_pydev_bundle\pydev_import_hook.py", line 21, in do_import
    module = self._system_import(name, *args, **kwargs)
  File "E:\Py_api_test\trulens-test\.venv\lib\site-packages\trulens_eval\feedback\provider\base.py", line 10, in <module>
    from trulens_eval.feedback import prompts
  File "C:\Program Files\JetBrains\PyCharm 2023.3.5\plugins\python\helpers\pydev\_pydev_bundle\pydev_import_hook.py", line 21, in do_import
    module = self._system_import(name, *args, **kwargs)
  File "E:\Py_api_test\trulens-test\.venv\lib\site-packages\trulens_eval\feedback\prompts.py", line 6, in <module>
    from trulens_eval.feedback.v2 import feedback as v2
  File "C:\Program Files\JetBrains\PyCharm 2023.3.5\plugins\python\helpers\pydev\_pydev_bundle\pydev_import_hook.py", line 21, in do_import
    module = self._system_import(name, *args, **kwargs)
  File "E:\Py_api_test\trulens-test\.venv\lib\site-packages\trulens_eval\feedback\v2\feedback.py", line 4, in <module>
    from langchain.evaluation.criteria.eval_chain import _SUPPORTED_CRITERIA
  File "C:\Program Files\JetBrains\PyCharm 2023.3.5\plugins\python\helpers\pydev\_pydev_bundle\pydev_import_hook.py", line 21, in do_import
    module = self._system_import(name, *args, **kwargs)
  File "E:\Py_api_test\trulens-test\.venv\lib\site-packages\langchain\evaluation\__init__.py", line 56, in <module>
    from langchain.evaluation.agents import TrajectoryEvalChain
  File "C:\Program Files\JetBrains\PyCharm 2023.3.5\plugins\python\helpers\pydev\_pydev_bundle\pydev_import_hook.py", line 21, in do_import
    module = self._system_import(name, *args, **kwargs)
  File "E:\Py_api_test\trulens-test\.venv\lib\site-packages\langchain\evaluation\agents\__init__.py", line 3, in <module>
    from langchain.evaluation.agents.trajectory_eval_chain import TrajectoryEvalChain
  File "C:\Program Files\JetBrains\PyCharm 2023.3.5\plugins\python\helpers\pydev\_pydev_bundle\pydev_import_hook.py", line 21, in do_import
    module = self._system_import(name, *args, **kwargs)
  File "E:\Py_api_test\trulens-test\.venv\lib\site-packages\langchain\evaluation\agents\trajectory_eval_chain.py", line 22, in <module>
    from langchain_core.callbacks.manager import (
  File "C:\Program Files\JetBrains\PyCharm 2023.3.5\plugins\python\helpers\pydev\_pydev_bundle\pydev_import_hook.py", line 21, in do_import
    module = self._system_import(name, *args, **kwargs)
  File "E:\Py_api_test\trulens-test\.venv\lib\site-packages\langchain_core\callbacks\__init__.py", line 23, in <module>
    from langchain_core.callbacks.manager import (
  File "C:\Program Files\JetBrains\PyCharm 2023.3.5\plugins\python\helpers\pydev\_pydev_bundle\pydev_import_hook.py", line 21, in do_import
    module = self._system_import(name, *args, **kwargs)
  File "E:\Py_api_test\trulens-test\.venv\lib\site-packages\langchain_core\callbacks\manager.py", line 29, in <module>
    from langsmith.run_helpers import get_run_tree_context
  File "C:\Program Files\JetBrains\PyCharm 2023.3.5\plugins\python\helpers\pydev\_pydev_bundle\pydev_import_hook.py", line 21, in do_import
    module = self._system_import(name, *args, **kwargs)
  File "E:\Py_api_test\trulens-test\.venv\lib\site-packages\langsmith\run_helpers.py", line 197, in <module>
    class SupportsLangsmithExtra(Protocol, Generic[P, R]):
  File "C:\Users\alisa\AppData\Local\Programs\Python\Python310\lib\typing.py", line 312, in inner
    return func(*args, **kwds)
  File "C:\Users\alisa\AppData\Local\Programs\Python\Python310\lib\typing.py", line 1346, in __class_getitem__
    return _GenericAlias(cls, params,
  File "C:\Users\alisa\AppData\Local\Programs\Python\Python310\lib\typing.py", line 1025, in __init__
    self.__parameters__ = _collect_type_vars(params, typevar_types=_typevar_types)
  File "E:\Py_api_test\trulens-test\.venv\lib\site-packages\pip\_vendor\typing_extensions.py", line 3019, in _collect_type_vars
    raise TypeError(f'Type parameter {t!r} without a default'
TypeError: Type parameter +R without a default follows type parameter with a default
Environment:

  • OS: Windows
  • Python Version: 3.10
  • TruLens version: 0.33.0
  • Versions of other relevant installed libraries

Additional context
Add any other context about the problem here.

@Lorrrrraine Lorrrrraine added the bug Something isn't working label Jul 29, 2024

dosubot bot commented Jul 29, 2024

Hello, @Lorrrrraine! I'm here to help you with any bugs, questions, or contributions.

To resolve the TypeError: Type parameter +R without a default follows type parameter with a default when importing the trulens_eval library, you need to ensure that the type parameters in your code are ordered correctly. Specifically, type parameters without defaults should precede those with defaults.

This error typically arises in generic class definitions or function annotations. Here is a general example of how to correct such an error:

from typing import Generic
from typing_extensions import TypeVar  # `default=` is a typing_extensions (PEP 696) feature

# Incorrect order of type parameters: R (which has a default) precedes T (which does not)
T = TypeVar('T', bound=int)
R = TypeVar('R', default=str)

class Example(Generic[R, T]):  # This will raise the TypeError
    pass

To fix this, ensure that type parameters without defaults come before those with defaults:

# Correct order of type parameters: T (no default) precedes R (which has a default)
class Example(Generic[T, R]):  # This is correct
    pass

In the trulens_eval library, there are specific type parameter definitions that might be causing the TypeError. For instance, the TruLlama class has type parameters defined for its app attribute and root_callable attribute:

app: Union[BaseQueryEngine, BaseChatEngine]

root_callable: ClassVar[FunctionOrMethod] = Field(
    default_factory=lambda: FunctionOrMethod.of_callable(TruLlama.query)
)

Additionally, the constructor_of_class function raises a TypeError if an unknown component type is encountered:

def constructor_of_class(cls: Class) -> Type[app.TrulensComponent]:
    for view in COMPONENT_VIEWS:
        if view.class_is(cls):
            return view

    raise TypeError(f"Unknown trulens component type with class {cls}")

Moreover, there is a known issue involving the use of BaseLLM and BaseChatModel within pydantic v2, which could be causing the type parameter error:

# Cannot validate BaseLLM / BaseChatModel as they are pydantic v1 and there
# is some bug involving their use within pydantic v2.
# https://github.com/langchain-ai/langchain/issues/10112

Ensure that your type parameters are ordered correctly and check for any known issues with the libraries you are using [1][2][3][4].


@ddamme05

ddamme05 commented Jul 30, 2024

Same bug here when trying to run quickstart.py in VSCode.

FIXED:

When VSCode initialized the virtual environment, it used a different Python version than the one I intended: I was on 3.11.8, but the venv had been created with 3.9.3.

@ksteimel

I was able to avoid this error by installing

langchain==0.2.10
langchain-core==0.2.22

@sfc-gh-chu
Contributor

@Lorrrrraine, what version of langchain do you have installed?

@sfc-gh-chu sfc-gh-chu self-assigned this Jul 31, 2024
@Lorrrrraine
Author

@Lorrrrraine, what version of langchain do you have installed?

langchain==0.2.11
langchain-core==0.2.24

@Lorrrrraine
Author

I was able to avoid this error by installing

langchain==0.2.10
langchain-core==0.2.22

Thanks, unfortunately, it hasn't worked out for my situation.

@rishwanth-thiyagaraj

rishwanth-thiyagaraj commented Aug 1, 2024

I was able to resolve this error by downgrading pip

pip==24.1.2

@sfc-gh-chu
Contributor

sfc-gh-chu commented Aug 1, 2024

Appears to be related to Langchain's typing of SupportsLangsmithExtra(Protocol, Generic[P, R]). I gave @rishwanth-thiyagaraj's suggestion a try and reinstalled Langchain which seemed to have fixed this error.

pip install pip==24.1.2
pip install -U langchain

@Lorrrrraine can you confirm if that helps here?
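For context, the pattern named in that diagnosis can be sketched as follows. This is a simplified illustration based on the traceback, not the actual langsmith source; the class name and method body below are assumptions:

```python
# Simplified sketch of the Generic[P, R] pattern from the traceback.
# Assumption: loosely modeled on langsmith.run_helpers.SupportsLangsmithExtra;
# the names here are illustrative, not the real langsmith definitions.
from typing import Generic, Protocol, TypeVar

try:  # ParamSpec lives in typing from Python 3.10 onward
    from typing import ParamSpec
except ImportError:
    from typing_extensions import ParamSpec

P = ParamSpec("P")                # captures the wrapped callable's signature
R = TypeVar("R", covariant=True)  # the "+R" named in the error message

class SupportsExtra(Protocol, Generic[P, R]):
    # A callable protocol: accepts the parameters described by P, returns R.
    def __call__(self, *args: P.args, **kwargs: P.kwargs) -> R: ...
```

Under a healthy typing_extensions this class definition succeeds; per the traceback, the TypeError fired because the type-variable collection ended up being handled by pip's bundled copy of typing_extensions.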

@Lorrrrraine
Author

Appears to be related to Langchain's typing of SupportsLangsmithExtra(Protocol, Generic[P, R]). I gave @rishwanth-thiyagaraj's suggestion a try and reinstalled Langchain which seemed to have fixed this error.

pip install pip==24.1.2
pip install -U langchain

@Lorrrrraine can you confirm if that helps here?

Thanks for your help, but other problems have occurred :( sad
[screenshot attached]

@akabeera

akabeera commented Aug 3, 2024

Appears to be related to Langchain's typing of SupportsLangsmithExtra(Protocol, Generic[P, R]). I gave @rishwanth-thiyagaraj's suggestion a try and reinstalled Langchain which seemed to have fixed this error.

pip install pip==24.1.2
pip install -U langchain

@Lorrrrraine can you confirm if that helps here?

I was getting this error while running TruVirtual app and this has worked for me. Thx

@jimrange

jimrange commented Aug 8, 2024

Appears to be related to Langchain's typing of SupportsLangsmithExtra(Protocol, Generic[P, R]). I gave @rishwanth-thiyagaraj's suggestion a try and reinstalled Langchain which seemed to have fixed this error.

pip install pip==24.1.2
pip install -U langchain

@Lorrrrraine can you confirm if that helps here?

I was having this issue too so I created simplified docker image with my same base setup with Ubuntu 22.04.4 LTS, Python 3.10.12, pip 24.2 and the following in my requirements.txt:

openai
trulens-eval
chromadb
notebook
ipywidgets 
jupyter
pipdeptree

For me downgrading from pip 24.2 to 24.1.2 alone did not fix the issue.

Since I am using docker it was easy to restart from scratch (e.g. docker-compose up --build --force-recreate) to see exactly which change resolved the issue.

Downgrading from pip 24.2 to 24.1.2 and then doing an upgrade on langchain resolved this issue for me.

It is strange though because pip install -U langchain results in langchain==0.2.12 before and after running the upgrade.

And no dependencies change. All requirements were already satisfied when the upgrade ran and I ran pipdeptree -p langchain before and after the upgrade and then did a diff of the output and it was identical (no installed dependencies that langchain uses were changed).

Since it is in a docker container, I rebuilt it and tried just pip install -U langchain, which did not fix the issue; downgrading pip to 24.1.2 by itself didn't fix it either. But doing both, in either order, resolved it. Strange that the order of the pip downgrade and the langchain upgrade made no difference.

It isn't clear to me what is causing this strange behavior.

@rishwanth-thiyagaraj @sfc-gh-chu Any idea why this fixes the issue?

I tried recreating my docker image with pip==24.1.2 from the start instead of letting it default to pip 24.2 and have it do a fresh install of all the packages and this issue did not occur. So maybe this is an issue with pip 24.2 and langchain?
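One clue from the traceback above: the frame that raises is inside pip\_vendor\typing_extensions.py, i.e. pip's bundled copy of typing_extensions rather than the standalone package. A quick way to check which typing_extensions an environment actually resolves (a diagnostic sketch, not an official fix):

```python
# Diagnostic sketch: report where `import typing_extensions` would resolve.
# A path containing "_vendor" would match the pip-bundled copy seen in the
# traceback above; normally it should point directly into site-packages.
import importlib.util

spec = importlib.util.find_spec("typing_extensions")
if spec is None:
    print("typing_extensions is not importable in this environment")
else:
    print("resolved from:", spec.origin)
    print("looks pip-vendored:", "_vendor" in (spec.origin or ""))
```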

@aehsaei

aehsaei commented Aug 19, 2024

I've also faced this error and finally fixed it by changing the order of import statements: the langchain imports must come first, before the trulens_eval Tru import.

For example, this causes the failure discussed above:

from trulens_eval import Tru
from langchain.chains import LLMChain
from langchain.chat_models import ChatOpenAI
...

To fix:

from langchain.chains import LLMChain
from langchain.chat_models import ChatOpenAI
...
from trulens_eval import Tru

@sfc-gh-jreini
Contributor

sfc-gh-jreini commented Aug 28, 2024

Hey all - thanks for your patience.

We've fixed this issue in 0.33.1, 0.32.1, and 0.31.1 so you can now use TruLens-Eval with the latest pip version.

pip install -U trulens_eval && pip install -U pip
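After upgrading, one way to confirm which release is installed is to query the package metadata (a hypothetical check; it avoids assuming trulens_eval exposes a `__version__` attribute):

```python
# Hypothetical post-upgrade check: print the installed trulens_eval version
# so it can be compared against the patched releases (0.31.1 / 0.32.1 / 0.33.1).
from importlib.metadata import PackageNotFoundError, version

try:
    print("trulens_eval", version("trulens_eval"))
except PackageNotFoundError:
    print("trulens_eval is not installed in this environment")
```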

10 participants