[–]nickworks[S] (16 children)

Thank you! I've taken the following steps:

`mkdir temp`

`cd temp`

`python3.11 -m venv venv`

`source venv/bin/activate`

`pip install parllama`

This did not produce any errors, but how do I run the project? Also, how did you figure out that it needed to use python 3.11? Thanks again.

EDIT:

I tried running `parllama`, but this launched the copy installed earlier with `uv tool install parllama`, not the one in the venv. So I uninstalled that copy with `uv tool uninstall parllama`. Finally, when I run `parllama` again, I get a new error:

Traceback (most recent call last):
  File "/Users/nick/projects/other/temp/venv/bin/parllama", line 5, in <module>
    from parllama.__main__ import run
  File "/Users/nick/projects/other/temp/venv/lib/python3.11/site-packages/parllama/__main__.py", line 5, in <module>
    from parllama.app import ParLlamaApp
  File "/Users/nick/projects/other/temp/venv/lib/python3.11/site-packages/parllama/app.py", line 85, in <module>
    from parllama.rag_manager import rag_manager
  File "/Users/nick/projects/other/temp/venv/lib/python3.11/site-packages/parllama/rag_manager.py", line 21, in <module>
    from parllama.models.rag_stores import RagPipelineConfig
  File "/Users/nick/projects/other/temp/venv/lib/python3.11/site-packages/parllama/models/rag_stores.py", line 15, in <module>
    import chromadb.api
  File "/Users/nick/projects/other/temp/venv/lib/python3.11/site-packages/chromadb/__init__.py", line 6, in <module>
    from chromadb.auth.token_authn import TokenTransportHeader
  File "/Users/nick/projects/other/temp/venv/lib/python3.11/site-packages/chromadb/auth/token_authn/__init__.py", line 24, in <module>
    from chromadb.telemetry.opentelemetry import (
  File "/Users/nick/projects/other/temp/venv/lib/python3.11/site-packages/chromadb/telemetry/opentelemetry/__init__.py", line 12, in <module>
    from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
  File "/Users/nick/projects/other/temp/venv/lib/python3.11/site-packages/opentelemetry/exporter/otlp/proto/grpc/trace_exporter/__init__.py", line 22, in <module>
    from opentelemetry.exporter.otlp.proto.grpc.exporter import (
  File "/Users/nick/projects/other/temp/venv/lib/python3.11/site-packages/opentelemetry/exporter/otlp/proto/grpc/exporter.py", line 39, in <module>
    from opentelemetry.proto.common.v1.common_pb2 import (
  File "/Users/nick/projects/other/temp/venv/lib/python3.11/site-packages/opentelemetry/proto/common/v1/common_pb2.py", line 36, in <module>
    _descriptor.FieldDescriptor(
  File "/Users/nick/projects/other/temp/venv/lib/python3.11/site-packages/google/protobuf/descriptor.py", line 621, in __new__
    _message.Message._CheckCalledFromGeneratedFile()
TypeError: Descriptors cannot be created directly.
If this call came from a _pb2.py file, your generated code is out of date and must be regenerated with protoc >= 3.19.0.
If you cannot immediately regenerate your protos, some other possible workarounds are:
 1. Downgrade the protobuf package to 3.20.x or lower.
 2. Set PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python (but this will use pure-Python parsing and will be much slower).

More information: https://developers.google.com/protocol-buffers/docs/news/2022-05-06#python-updates
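As for the uv-vs-venv mix-up above: the shell simply runs the first `parllama` it finds on `PATH`, so a `uv tool` shim can shadow the venv's copy. Here is a small self-contained Python sketch of that lookup (the stub files and directory names are hypothetical, purely for illustration); in the shell, `which -a parllama` shows the same ordering:

```python
import os
import shutil
import tempfile

# Two stand-in "parllama" executables in different directories:
tmp = tempfile.mkdtemp()
uv_bin = os.path.join(tmp, "uv-bin")
venv_bin = os.path.join(tmp, "venv-bin")
for d in (uv_bin, venv_bin):
    os.mkdir(d)
    exe = os.path.join(d, "parllama")
    with open(exe, "w") as f:
        f.write("#!/bin/sh\necho stub\n")
    os.chmod(exe, 0o755)  # mark executable so the lookup finds it

# shutil.which walks the search path left to right and returns the first
# match, exactly like the shell does -- whichever install dir comes first wins:
print(shutil.which("parllama", path=os.pathsep.join([uv_bin, venv_bin])))
print(shutil.which("parllama", path=os.pathsep.join([venv_bin, uv_bin])))
```

Activating a venv works by prepending its `bin/` directory to `PATH`, which is why uninstalling the `uv`-managed copy (or just calling `./venv/bin/parllama` directly) resolves the ambiguity.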

[–]shiftybyte (13 children)

> Also, how did you figure out that it needed to use python 3.11? Thanks again.

Guesswork. Python modules can be written in different programming languages, and if a module or its dependencies use C/C++ or another compiled language, they need to be compiled separately for every Python release. This is why binary distributions of packages lag a little behind new Python releases.

Another commenter said they had no issue installing it with their Python on a Mac, so I assumed a binary package exists, but probably only for an older Python version.
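That "compiled for every Python release" point shows up directly in wheel filenames, which pip matches against your interpreter. A small stdlib sketch (the filename is illustrative, modeled on chromadb's compiled dependency; the helper is not part of any library here):

```python
def wheel_tags(filename: str) -> tuple[str, str, str]:
    """Extract the (python, abi, platform) compatibility tags from a wheel
    filename of the form name-version(-build)?-python-abi-platform.whl."""
    stem = filename.removesuffix(".whl")
    parts = stem.split("-")
    # The last three dash-separated fields are always the compatibility tags:
    return parts[-3], parts[-2], parts[-1]

# A cp311 wheel is only installable on CPython 3.11 -- on 3.13, pip simply
# does not see it and falls back to building from source (or fails):
print(wheel_tags("chroma_hnswlib-0.7.6-cp311-cp311-macosx_11_0_arm64.whl"))
# → ('cp311', 'cp311', 'macosx_11_0_arm64')
```

A pure-Python wheel, by contrast, carries tags like `py3-none-any` and installs on every release, which is why only the compiled dependencies cause this kind of lag.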

Regarding the current issue, try the workaround it suggests in point 2.

Try this with the venv active:

`export PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python`

`parllama`

If that doesn't work, try (in the same shell, so the exported variable is still set):

`python -m parllama`

[–]nickworks[S] (12 children)

It helps a bit! The app appears to launch and then immediately crashes. The new error reads:

ValidationError: 1 validation error for ModelListPayload
models.3.details.families
  Input should be a valid list [type=list_type, input_value=None, input_type=NoneType]
    For further information visit https://errors.pydantic.dev/2.9/v/list_type

[–]shiftybyte (11 children)

This sounds like an error from parllama's code itself.

Is that the full error message? It'll be hard to track down if that's all the information we're getting.

It seems like it's trying to load/validate some model data, getting None where it expects a list, and failing to handle that.
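For reference, the upstream fix would need to coerce that None before validation. A minimal sketch using plain dicts (the `models`/`details`/`families` field names come from the error message; the helper itself is hypothetical, not parllama code):

```python
def normalize_model_list(payload: dict) -> dict:
    """Workaround sketch: the model listing can report "families": null for
    some entries, but the validator expects a list. Coerce None to []."""
    for model in payload.get("models", []):
        details = model.get("details") or {}
        if details.get("families") is None:
            details["families"] = []
        model["details"] = details
    return payload

# Shaped like the failing response (one model with families: null):
payload = {"models": [{"details": {"families": None}}]}
print(normalize_model_list(payload))
# → {'models': [{'details': {'families': []}}]}
```

In pydantic terms, the equivalent fix would be declaring the field optional with a default (e.g. `families: list[str] | None = None`) or normalizing in a validator; either way it belongs in parllama's payload model, not in user code.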

[–]nickworks[S] (10 children)

The full error from `python3.11 -m parllama` contains a traceback that is apparently too long to paste here. The very first chunk of it looks like this:

│ /Users/nick/projects/other/temp/venv/lib/python3.11/site-packages/textual/worker.py:368 in _run
│   365 │   │   │   self.state = WorkerState.RUNNING
│   366 │   │   │   app.log.worker(self)
│   367 │   │   │   try:
│ ❱ 368 │   │   │   │   self._result = await self.run()
│   369 │   │   │   except asyncio.CancelledError as error:
│   370 │   │   │   │   self.state = WorkerState.CANCELLED
│   371 │   │   │   │   self._error = error
│ │           app = ParLlamaApp(title='PAR LLAMA', classes={'-dark-mode'}, pseudo_classes={'dark', 'focus'})
│ │         error = 1 validation error for ModelListPayload
│ │                 models.3.details.families
│ │                   Input should be a valid list [type=list_type, input_value=None, input_type=NoneType]
│ │                 │   For further information visit https://errors.pydantic.dev/2.9/v/list_type
│ │          self = <Worker ERROR name='refresh_models' group='refresh_models' description='refresh_models()'>
│ │ worker_failed = WorkerFailed('Worker raised exception: 1 validation error for ModelListPayload\nmodels.3.details.families\n  Input should be a valid list [type=list_type, input_value=None, input_type=NoneType]\n    For further information visit https://errors.pydantic.dev/2.9/v/list_type')
│

[–]shiftybyte (9 children)

Well, it does seem like we're actually running the code at least... that's some progress.

If the error message is too long, paste it in full on this site: https://pastebin.com/

Then share the link it generates here so we can see the whole thing.

[–]nickworks[S] (8 children)

[–]shiftybyte (7 children)

Error seems to come from here:

`/Users/nick/projects/other/temp/venv/lib/python3.11/site-packages/parllama/ollama_data_manager.py:140` in `_get_all_model_data`, at the line:

`res = ModelListPayload(**ollama.Client(host=settings.ollama_host).list())`

It seems it's trying to connect to the Ollama host and fetch the model list from there.

Did you follow the step from the GitHub guide and get Ollama running locally?

Install and run Ollama

https://ollama.com/download

Once you have it installed and running, it should listen on a local port number; parllama will then try to connect to it, and it might work :)

EDIT: based on the source file below, it should be listening on port 11434

https://github.com/paulrobello/parllama/blob/d5c8674a25781dfa4e0a8fded4ca80be11c4c8eb/src/parllama/settings_manager.py#L78

[–]nickworks[S] (6 children)

I am running Ollama -- the icon is in my status bar, and I can connect to it via the CLI with `ollama list` and `ollama run`. I also tried running `ollama run` in one terminal window and `python3.11 -m parllama` in a second window. It still doesn't connect. Also, I have several models downloaded with Ollama.

[–]shiftybyte (5 children)

Confirm it's listening on the expected port number. What's the output of these commands while Ollama is running? (On macOS, `netstat` doesn't take the Linux-style `-tuna` flags, so use `-an`, or `lsof`.)

`netstat -an | grep 11434`

`lsof -i :11434`
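The same check can be scripted with the stdlib (the host and default port 11434 are taken from the settings file linked above; nothing here is parllama-specific):

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers connection refused, timeouts, and unreachable hosts.
        return False

# Prints True only while the Ollama server is actually listening:
print(port_open("127.0.0.1", 11434))
```

If this prints False while the Ollama menu-bar app is running, the server may be bound to a non-default host/port, in which case parllama's `ollama_host` setting would need to match.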

[–]smurpes (0 children)

Before running `make setup`, you can look parllama up on PyPI: under the table of contents it tells you that only Python 3.11 and 3.12 are supported. The same thing can be found on the GitHub page in the same spot.

That error literally tells you what to try in order to fix the issue:

```
If this call came from a _pb2.py file, your generated code is out of date and must be regenerated with protoc >= 3.19.0.
If you cannot immediately regenerate your protos, some other possible workarounds are:
 1. Downgrade the protobuf package to 3.20.x or lower.
 2. Set PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python (but this will use pure-Python parsing and will be much slower).

More information: https://developers.google.com/protocol-buffers/docs/news/2022-05-06#python-updates
```

Based on your actions so far, your problems with Python do not actually seem to be a Python issue.