Fix for non-cross-platform download of the piper release package
Describe the bug
The piper release package download in tts.py is not cross-platform: on Windows it builds an asset URL that doesn't exist in the piper release, so the download fails with HTTP 404.
To Reproduce
Steps to reproduce the behavior:
- have Windows 10
- have PyCharm
- have LM Studio running a server
- clone the repo from the Windows command line
- make sure to check out my previous PR so you can get past that error:
  - `git remote add upstream https://github.com/OpenInterpreter/01.git`
  - `git fetch upstream`
  - `git checkout upstream/pr/166`
- open PyCharm: `pycharm .`
- after the project is created, go to a terminal and `cd software`
- run these commands to make a poetry venv in a reasonable place:
  - `set POETRY_VIRTUALENVS_IN_PROJECT=true`
  - `poetry install`
- go back into the PyCharm run configuration and tell it where the poetry venv is, locating `python.exe` in `Scripts`
- close the old terminal
- open a new terminal; notice that the venv is activated by the command-line prefix
- run `poetry run 01 --local`
Expected behavior
Expected the server to start.
Screenshots
(01os-py3.11) E:\oi\01\software>poetry run 01 --local
Warning: '01' is an entry point defined in pyproject.toml, but it's not installed as a script. You may get improper `sys.argv[0]`.
The support to run uninstalled scripts will be removed in a future release.
Run `poetry install` to resolve and get rid of this message.
○
Starting...
▌ 01 is compatible with several local model providers.
[?] Which one would you like to use?:
Ollama
> LM Studio
To use use 01 with LM Studio, you will need to run LM Studio in the background.
1 Download LM Studio from https://lmstudio.ai/, then start it.
2 Select a language model then click Download.
3 Click the <-> button on the left (below the chat button).
4 Select your model at the top, then click Start Server.
Once the server is running, you can begin your conversation below.
╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ E:\oi\01\software\start.py:44 in run │
│ │
│ 41 │ │ │ local: bool = typer.Option(False, "--local", help="Use recommended local ser │
│ 42 │ │ ): │
│ 43 │ │
│ ❱ 44 │ _run( │
│ 45 │ │ server=server, │
│ 46 │ │ server_host=server_host, │
│ 47 │ │ server_port=server_port, │
│ │
│ ╭────────────── locals ──────────────╮ │
│ │ client = False │ │
│ │ client_type = 'auto' │ │
│ │ context_window = 2048 │ │
│ │ expose = False │ │
│ │ llm_service = 'litellm' │ │
│ │ llm_supports_functions = False │ │
│ │ llm_supports_vision = False │ │
│ │ local = True │ │
│ │ max_tokens = 4096 │ │
│ │ model = 'gpt-4' │ │
│ │ server = False │ │
│ │ server_host = '0.0.0.0' │ │
│ │ server_port = 10001 │ │
│ │ server_url = None │ │
│ │ stt_service = 'openai' │ │
│ │ temperature = 0.8 │ │
│ │ tts_service = 'openai' │ │
│ │ tunnel_service = 'ngrok' │ │
│ ╰────────────────────────────────────╯ │
│ │
│ E:\oi\01\software\start.py:136 in _run │
│ │
│ 133 │ │ │ │ except FileNotFoundError: │
│ 134 │ │ │ │ │ client_type = "linux" │
│ 135 │ │ │
│ ❱ 136 │ │ module = importlib.import_module(f".clients.{client_type}.device", package='sour │
│ 137 │ │ client_thread = threading.Thread(target=module.main, args=[server_url]) │
│ 138 │ │ client_thread.start() │
│ 139 │
│ │
│ ╭────────────────────────────────────── locals ───────────────────────────────────────╮ │
│ │ client = True │ │
│ │ client_type = 'auto' │ │
│ │ context_window = 2048 │ │
│ │ expose = False │ │
│ │ handle_exit = <function _run.<locals>.handle_exit at 0x00000204BD0258A0> │ │
│ │ llm_service = 'litellm' │ │
│ │ llm_supports_functions = False │ │
│ │ llm_supports_vision = False │ │
│ │ local = True │ │
│ │ loop = <ProactorEventLoop running=True closed=False debug=False> │ │
│ │ max_tokens = 4096 │ │
│ │ model = 'gpt-4' │ │
│ │ server = True │ │
│ │ server_host = '0.0.0.0' │ │
│ │ server_port = 10001 │ │
│ │ server_thread = <Thread(Thread-11 (run_until_complete), started 136420)> │ │
│ │ server_url = '0.0.0.0:10001' │ │
│ │ stt_service = 'local-whisper' │ │
│ │ system_type = 'Windows' │ │
│ │ temperature = 0.8 │ │
│ │ tts_service = 'piper' │ │
│ │ tunnel_service = 'ngrok' │ │
│ ╰─────────────────────────────────────────────────────────────────────────────────────╯ │
│ │
│ C:\Python\Python311\Lib\importlib\__init__.py:126 in import_module │
│ │
│ 123 │ │ │ if character != '.': │
│ 124 │ │ │ │ break │
│ 125 │ │ │ level += 1 │
│ ❱ 126 │ return _bootstrap._gcd_import(name[level:], package, level) │
│ 127 │
│ 128 │
│ 129 _RELOADING = {} │
│ │
│ ╭────────────── locals ──────────────╮ │
│ │ character = 'c' │ │
│ │ level = 1 │ │
│ │ name = '.clients.auto.device' │ │
│ │ package = 'source' │ │
│ ╰────────────────────────────────────╯ │
│ in _gcd_import:1204 │
│ ╭──────────────── locals ────────────────╮ │
│ │ level = 1 │ │
│ │ name = 'source.clients.auto.device' │ │
│ │ package = 'source' │ │
│ ╰────────────────────────────────────────╯ │
│ in _find_and_load:1176 │
│ ╭──────────────────────── locals ────────────────────────╮ │
│ │ import_ = <function _gcd_import at 0x0000020481473D80> │ │
│ │ module = <object object at 0x00000204814A4050> │ │
│ │ name = 'source.clients.auto.device' │ │
│ ╰────────────────────────────────────────────────────────╯ │
│ in _find_and_load_unlocked:1126 │
│ ╭────────────────────────── locals ──────────────────────────╮ │
│ │ import_ = <function _gcd_import at 0x0000020481473D80> │ │
│ │ name = 'source.clients.auto.device' │ │
│ │ parent = 'source.clients.auto' │ │
│ │ parent_spec = None │ │
│ │ path = None │ │
│ ╰────────────────────────────────────────────────────────────╯ │
│ in _call_with_frames_removed:241 │
│ ╭────────────────────── locals ───────────────────────╮ │
│ │ args = ('source.clients.auto',) │ │
│ │ f = <function _gcd_import at 0x0000020481473D80> │ │
│ │ kwds = {} │ │
│ ╰─────────────────────────────────────────────────────╯ │
│ in _gcd_import:1204 │
│ ╭──────────── locals ─────────────╮ │
│ │ level = 0 │ │
│ │ name = 'source.clients.auto' │ │
│ │ package = None │ │
│ ╰─────────────────────────────────╯ │
│ in _find_and_load:1176 │
│ ╭──────────────────────── locals ────────────────────────╮ │
│ │ import_ = <function _gcd_import at 0x0000020481473D80> │ │
│ │ module = <object object at 0x00000204814A4050> │ │
│ │ name = 'source.clients.auto' │ │
│ ╰────────────────────────────────────────────────────────╯ │
│ in _find_and_load_unlocked:1140 │
│ ╭─────────────────────────────────────────── locals ───────────────────────────────────────────╮ │
│ │ child = 'auto' │ │
│ │ import_ = <function _gcd_import at 0x0000020481473D80> │ │
│ │ name = 'source.clients.auto' │ │
│ │ parent = 'source.clients' │ │
│ │ parent_module = <module 'source.clients' from │ │
│ │ 'E:\\oi\\01\\software\\source\\clients\\__init__.py'> │ │
│ │ parent_spec = ModuleSpec(name='source.clients', │ │
│ │ loader=<_frozen_importlib_external.SourceFileLoader object at │ │
│ │ 0x00000204BD050650>, │ │
│ │ origin='E:\\oi\\01\\software\\source\\clients\\__init__.py', │ │
│ │ submodule_search_locations=['E:\\oi\\01\\software\\source\\clients']) │ │
│ │ path = ['E:\\oi\\01\\software\\source\\clients'] │ │
│ │ spec = None │ │
│ ╰──────────────────────────────────────────────────────────────────────────────────────────────╯ │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
ModuleNotFoundError: No module named 'source.clients.auto'
Exception in thread Thread-11 (run_until_complete):
Traceback (most recent call last):
File "C:\Python\Python311\Lib\threading.py", line 1045, in _bootstrap_inner
self.run()
File "C:\Python\Python311\Lib\threading.py", line 982, in run
self._target(*self._args, **self._kwargs)
File "C:\Python\Python311\Lib\asyncio\base_events.py", line 654, in run_until_complete
return future.result()
^^^^^^^^^^^^^^^
File "E:\oi\01\software\source\server\server.py", line 413, in main
service_instance = ServiceClass(config)
^^^^^^^^^^^^^^^^^^^^
File "E:\oi\01\software\source\server\services\tts\piper\tts.py", line 13, in __init__
self.install(config["service_directory"])
File "E:\oi\01\software\source\server\services\tts\piper\tts.py", line 61, in install
urllib.request.urlretrieve(f"{PIPER_URL}{PIPER_ASSETNAME}", os.path.join(PIPER_FOLDER_PATH, PIPER_ASSETNAME))
File "C:\Python\Python311\Lib\urllib\request.py", line 241, in urlretrieve
with contextlib.closing(urlopen(url, data)) as fp:
^^^^^^^^^^^^^^^^^^
File "C:\Python\Python311\Lib\urllib\request.py", line 216, in urlopen
return opener.open(url, data, timeout)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python\Python311\Lib\urllib\request.py", line 525, in open
response = meth(req, response)
^^^^^^^^^^^^^^^^^^^
File "C:\Python\Python311\Lib\urllib\request.py", line 634, in http_response
response = self.parent.error(
^^^^^^^^^^^^^^^^^^
File "C:\Python\Python311\Lib\urllib\request.py", line 557, in error
result = self._call_chain(*args)
^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python\Python311\Lib\urllib\request.py", line 496, in _call_chain
result = func(*args)
^^^^^^^^^^^
File "C:\Python\Python311\Lib\urllib\request.py", line 749, in http_error_302
return self.parent.open(new, timeout=req.timeout)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python\Python311\Lib\urllib\request.py", line 525, in open
response = meth(req, response)
^^^^^^^^^^^^^^^^^^^
File "C:\Python\Python311\Lib\urllib\request.py", line 634, in http_response
response = self.parent.error(
^^^^^^^^^^^^^^^^^^
File "C:\Python\Python311\Lib\urllib\request.py", line 563, in error
return self._call_chain(*args)
^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python\Python311\Lib\urllib\request.py", line 496, in _call_chain
result = func(*args)
^^^^^^^^^^^
File "C:\Python\Python311\Lib\urllib\request.py", line 643, in http_error_default
raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 404: Not Found
[IPKernelApp] WARNING | Parent appears to have exited, shutting down.
[IPKernelApp] WARNING | Parent appears to have exited, shutting down.
Desktop (please complete the following information):
- OS: Windows 10
- Python Version: 3.11 (no conda)
Additional context
Trying to set up a Windows 10 + PyCharm development environment. It looks like tts.py has some more non-cross-platform code. Fix incoming.
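For what it's worth, the 404 is easy to reproduce in isolation. A minimal sketch, assuming the install step formats a tar.gz-style asset name even on Windows (`PIPER_URL` and `PIPER_ASSETNAME` are the names from the traceback; the exact Windows value below is my guess, not the real constant):

```python
import urllib.request

# Hypothetical reconstruction of the failing request: the 2023.11.14-2
# release ships only piper_windows_amd64.zip for Windows, so requesting a
# .tar.gz asset name there comes back 404.
PIPER_URL = "https://github.com/rhasspy/piper/releases/download/2023.11.14-2/"
PIPER_ASSETNAME = "piper_windows_amd64.tar.gz"  # no such asset in the release

urllib.request.urlretrieve(f"{PIPER_URL}{PIPER_ASSETNAME}", PIPER_ASSETNAME)
# urllib.error.HTTPError: HTTP Error 404: Not Found
```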
This seems to still happen on the latest commit even though the PR was merged.
My error is slightly different from the one in this post, but the exception still refers to the missing source.clients.auto module.
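For reference, the failing step reduces to a single import call (values taken from the traceback locals below); run from the software directory, this reproduces the error because no source/clients/auto package exists:

```python
import importlib

# client_type is still "auto" at this point (see the locals in the
# traceback), so the relative import resolves to
# source.clients.auto.device -> ModuleNotFoundError.
client_type = "auto"
importlib.import_module(f".clients.{client_type}.device", package="source")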
> poetry run 01 --local
○
Starting...
▌ 01 is compatible with several local model providers.
[?] Which one would you like to use?:
Ollama
> LM Studio
To use use 01 with LM Studio, you will need to run LM Studio in the background.
1 Download LM Studio from https://lmstudio.ai/, then start it.
2 Select a language model then click Download.
3 Click the <-> button on the left (below the chat button).
4 Select your model at the top, then click Start Server.
Once the server is running, you can begin your conversation below.
╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ E:\User\Documents\git\01\software\start.py:44 in run │
│ │
│ 41 │ │ │ local: bool = typer.Option(False, "--local", help="Use recommended local ser │
│ 42 │ │ ): │
│ 43 │ │
│ ❱ 44 │ _run( │
│ 45 │ │ server=server, │
│ 46 │ │ server_host=server_host, │
│ 47 │ │ server_port=server_port, │
│ │
│ ╭────────────── locals ──────────────╮ │
│ │ client = False │ │
│ │ client_type = 'auto' │ │
│ │ context_window = 2048 │ │
│ │ expose = False │ │
│ │ llm_service = 'litellm' │ │
│ │ llm_supports_functions = False │ │
│ │ llm_supports_vision = False │ │
│ │ local = True │ │
│ │ max_tokens = 4096 │ │
│ │ model = 'gpt-4' │ │
│ │ server = False │ │
│ │ server_host = '0.0.0.0' │ │
│ │ server_port = 10001 │ │
│ │ server_url = None │ │
│ │ stt_service = 'openai' │ │
│ │ temperature = 0.8 │ │
│ │ tts_service = 'openai' │ │
│ │ tunnel_service = 'ngrok' │ │
│ ╰────────────────────────────────────╯ │
│ │
│ E:\User\Documents\git\01\software\start.py:136 in _run │
│ │
│ 133 │ │ │ │ except FileNotFoundError: │
│ 134 │ │ │ │ │ client_type = "linux" │
│ 135 │ │ │
│ ❱ 136 │ │ module = importlib.import_module(f".clients.{client_type}.device", package='sour │
│ 137 │ │ client_thread = threading.Thread(target=module.main, args=[server_url]) │
│ 138 │ │ client_thread.start() │
│ 139 │
│ │
│ ╭────────────────────────────────────── locals ───────────────────────────────────────╮ │
│ │ client = True │ │
│ │ client_type = 'auto' │ │
│ │ context_window = 2048 │ │
│ │ expose = False │ │
│ │ handle_exit = <function _run.<locals>.handle_exit at 0x0000028EDAFC4860> │ │
│ │ llm_service = 'litellm' │ │
│ │ llm_supports_functions = False │ │
│ │ llm_supports_vision = False │ │
│ │ local = True │ │
│ │ loop = <ProactorEventLoop running=True closed=False debug=False> │ │
│ │ max_tokens = 4096 │ │
│ │ model = 'gpt-4' │ │
│ │ server = True │ │
│ │ server_host = '0.0.0.0' │ │
│ │ server_port = 10001 │ │
│ │ server_thread = <Thread(Thread-11 (run_until_complete), started 18204)> │ │
│ │ server_url = '0.0.0.0:10001' │ │
│ │ stt_service = 'local-whisper' │ │
│ │ system_type = 'Windows' │ │
│ │ temperature = 0.8 │ │
│ │ tts_service = 'piper' │ │
│ │ tunnel_service = 'ngrok' │ │
│ ╰─────────────────────────────────────────────────────────────────────────────────────╯ │
│ │
│ C:\Program Files\Python311\Lib\importlib\__init__.py:126 in import_module │
│ │
│ 123 │ │ │ if character != '.': │
│ 124 │ │ │ │ break │
│ 125 │ │ │ level += 1 │
│ ❱ 126 │ return _bootstrap._gcd_import(name[level:], package, level) │
│ 127 │
│ 128 │
│ 129 _RELOADING = {} │
│ │
│ ╭────────────── locals ──────────────╮ │
│ │ character = 'c' │ │
│ │ level = 1 │ │
│ │ name = '.clients.auto.device' │ │
│ │ package = 'source' │ │
│ ╰────────────────────────────────────╯ │
│ in _gcd_import:1204 │
│ ╭──────────────── locals ────────────────╮ │
│ │ level = 1 │ │
│ │ name = 'source.clients.auto.device' │ │
│ │ package = 'source' │ │
│ ╰────────────────────────────────────────╯ │
│ in _find_and_load:1176 │
│ ╭──────────────────────── locals ────────────────────────╮ │
│ │ import_ = <function _gcd_import at 0x0000028E9F2E3D80> │ │
│ │ module = <object object at 0x0000028E9F314050> │ │
│ │ name = 'source.clients.auto.device' │ │
│ ╰────────────────────────────────────────────────────────╯ │
│ in _find_and_load_unlocked:1126 │
│ ╭────────────────────────── locals ──────────────────────────╮ │
│ │ import_ = <function _gcd_import at 0x0000028E9F2E3D80> │ │
│ │ name = 'source.clients.auto.device' │ │
│ │ parent = 'source.clients.auto' │ │
│ │ parent_spec = None │ │
│ │ path = None │ │
│ ╰────────────────────────────────────────────────────────────╯ │
│ in _call_with_frames_removed:241 │
│ ╭────────────────────── locals ───────────────────────╮ │
│ │ args = ('source.clients.auto',) │ │
│ │ f = <function _gcd_import at 0x0000028E9F2E3D80> │ │
│ │ kwds = {} │ │
│ ╰─────────────────────────────────────────────────────╯ │
│ in _gcd_import:1204 │
File "C:\Program Files\Python311\Lib\asyncio\base_events.py", line 653, in run_until_complete
│ ╭──────────── locals ─────────────╮ │
│ │ level = 0 │ │
│ │ name = 'source.clients.auto' │ │
│ │ package = None │ │
│ ╰─────────────────────────────────╯ │
│ in _find_and_load:1176 │
│ ╭──────────────────────── locals ────────────────────────╮ │
│ │ import_ = <function _gcd_import at 0x0000028E9F2E3D80> │ │
│ │ module = <object object at 0x0000028E9F314050> │ │
│ │ name = 'source.clients.auto' │ │
│ ╰────────────────────────────────────────────────────────╯ │
│ in _find_and_load_unlocked:1140 │
│ ╭─────────────────────────────────────────── locals ───────────────────────────────────────────╮ │
│ │ child = 'auto' │ │
│ │ import_ = <function _gcd_import at 0x0000028E9F2E3D80> │ │
│ │ name = 'source.clients.auto' │ │
│ │ parent = 'source.clients' │ │
│ │ parent_module = <module 'source.clients' from │ │
│ │ 'E:\\User\\Documents\\git\\01\\software\\source\\clients\\__init__.py'> │ │
│ │ parent_spec = ModuleSpec(name='source.clients', │ │
│ │ loader=<_frozen_importlib_external.SourceFileLoader object at │ │
│ │ 0x0000028EDAFCFF90>, │ │
│ │ origin='E:\\User\\Documents\\git\\01\\software\\source\\clients\\__init__.p… │ │
│ │ submodule_search_locations=['E:\\User\\Documents\\git\\01\\software\\source… │ │
│ │ path = ['E:\\User\\Documents\\git\\01\\software\\source\\clients'] │ │
│ │ spec = None │ │
│ ╰──────────────────────────────────────────────────────────────────────────────────────────────╯ │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
ModuleNotFoundError: No module named 'source.clients.auto'
Exception in thread Thread-11 (run_until_complete):
Traceback (most recent call last):
File "C:\Program Files\Python311\Lib\threading.py", line 1038, in _bootstrap_inner
self.run()
File "C:\Program Files\Python311\Lib\threading.py", line 975, in run
self._target(*self._args, **self._kwargs)
File "C:\Program Files\Python311\Lib\asyncio\base_events.py", line 653, in run_until_complete
return future.result()
^^^^^^^^^^^^^^^
File "E:\User\Documents\git\01\software\source\server\server.py", line 413, in main
service_instance = ServiceClass(config)
^^^^^^^^^^^^^^^^^^^^
File "E:\User\Documents\git\01\software\source\server\services\tts\piper\tts.py", line 13, in __init__
self.install(config["service_directory"])
File "E:\User\Documents\git\01\software\source\server\services\tts\piper\tts.py", line 64, in install
urllib.request.urlretrieve(asset_url, os.path.join(PIPER_FOLDER_PATH, PIPER_ASSETNAME))
^^^^^^^^^
UnboundLocalError: cannot access local variable 'asset_url' where it is not associated with a value
[IPKernelApp] WARNING | Parent appears to have exited, shutting down.
[IPKernelApp] WARNING | Parent appears to have exited, shutting down.
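The UnboundLocalError points at a familiar pattern: `asset_url` is assigned only inside platform branches, so an unanticipated platform reaches the download line with the name unbound. A hypothetical sketch of the shape of the bug (not the actual tts.py code; URLs are placeholders):

```python
import sys

def pick_asset_url():
    # asset_url is bound only in the branches the author anticipated;
    # on Windows ("win32") no branch matches, so the final reference
    # raises UnboundLocalError.
    if sys.platform == "linux":
        asset_url = "https://example.invalid/piper_linux_x86_64.tar.gz"
    elif sys.platform == "darwin":
        asset_url = "https://example.invalid/piper_macos_x64.tar.gz"
    return asset_url
```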
Thank you for surfacing this! The PR by @dheavy should take care of this nicely when it's done.
dheavy's PR got merged, but the exception is still there, albeit with some new errors. Posting the new stack traces:
> poetry run 01 --local
○
Starting...
▌ 01 is compatible with several local model providers.
[?] Which one would you like to use?:
Ollama
> LM Studio
To use use 01 with LM Studio, you will need to run LM Studio in the background.
1 Download LM Studio from https://lmstudio.ai/, then start it.
2 Select a language model then click Download.
3 Click the <-> button on the left (below the chat button).
4 Select your model at the top, then click Start Server.
Once the server is running, you can begin your conversation below.
╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ E:\User\Documents\git\01\software\start.py:44 in run │
│ │
│ 41 │ │ │ local: bool = typer.Option(False, "--local", help="Use recommended local ser │
│ 42 │ │ ): │
│ 43 │ │
│ ❱ 44 │ _run( │
│ 45 │ │ server=server, │
│ 46 │ │ server_host=server_host, │
│ 47 │ │ server_port=server_port, │
│ │
│ ╭────────────── locals ──────────────╮ │
│ │ client = False │ │
│ │ client_type = 'auto' │ │
│ │ context_window = 2048 │ │
│ │ expose = False │ │
│ │ llm_service = 'litellm' │ │
│ │ llm_supports_functions = False │ │
│ │ llm_supports_vision = False │ │
│ │ local = True │ │
│ │ max_tokens = 4096 │ │
│ │ model = 'gpt-4' │ │
│ │ server = False │ │
│ │ server_host = '0.0.0.0' │ │
│ │ server_port = 10001 │ │
│ │ server_url = None │ │
│ │ stt_service = 'openai' │ │
│ │ temperature = 0.8 │ │
│ │ tts_service = 'openai' │ │
│ │ tunnel_service = 'ngrok' │ │
│ ╰────────────────────────────────────╯ │
│ │
│ E:\User\Documents\git\01\software\start.py:136 in _run │
│ │
│ 133 │ │ │ │ except FileNotFoundError: │
│ 134 │ │ │ │ │ client_type = "linux" │
│ 135 │ │ │
│ ❱ 136 │ │ module = importlib.import_module(f".clients.{client_type}.device", package='sour │
│ 137 │ │ client_thread = threading.Thread(target=module.main, args=[server_url]) │
│ 138 │ │ client_thread.start() │
│ 139 │
│ │
│ ╭────────────────────────────────────── locals ───────────────────────────────────────╮ │
│ │ client = True │ │
│ │ client_type = 'auto' │ │
│ │ context_window = 2048 │ │
│ │ expose = False │ │
│ │ handle_exit = <function _run.<locals>.handle_exit at 0x000001E17F784860> │ │
│ │ llm_service = 'litellm' │ │
│ │ llm_supports_functions = False │ │
│ │ llm_supports_vision = False │ │
│ │ local = True │ │
│ │ loop = <ProactorEventLoop running=True closed=False debug=False> │ │
│ │ max_tokens = 4096 │ │
│ │ model = 'gpt-4' │ │
│ │ server = True │ │
│ │ server_host = '0.0.0.0' │ │
│ │ server_port = 10001 │ │
│ │ server_thread = <Thread(Thread-11 (run_until_complete), started 18296)> │ │
│ │ server_url = '0.0.0.0:10001' │ │
│ │ stt_service = 'local-whisper' │ │
│ │ system_type = 'Windows' │ │
│ │ temperature = 0.8 │ │
│ │ tts_service = 'piper' │ │
│ │ tunnel_service = 'ngrok' │ │
│ ╰─────────────────────────────────────────────────────────────────────────────────────╯ │
│ │
│ C:\Program Files\Python311\Lib\importlib\__init__.py:126 in import_module │
│ │
│ 123 │ │ │ if character != '.': │
│ 124 │ │ │ │ break │
│ 125 │ │ │ level += 1 │
│ ❱ 126 │ return _bootstrap._gcd_import(name[level:], package, level) │
│ 127 │
│ 128 │
│ 129 _RELOADING = {} │
│ │
│ ╭────────────── locals ──────────────╮ │
│ │ character = 'c' │ │
│ │ level = 1 │ │
│ │ name = '.clients.auto.device' │ │
│ │ package = 'source' │ │
│ ╰────────────────────────────────────╯ │
│ in _gcd_import:1204 │
│ ╭──────────────── locals ────────────────╮ │
│ │ level = 1 │ │
│ │ name = 'source.clients.auto.device' │ │
│ │ package = 'source' │ │
│ ╰────────────────────────────────────────╯ │
│ in _find_and_load:1176 │
│ ╭──────────────────────── locals ────────────────────────╮ │
│ │ import_ = <function _gcd_import at 0x000001E15AB03D80> │ │
│ │ module = <object object at 0x000001E15AB34050> │ │
│ │ name = 'source.clients.auto.device' │ │
│ ╰────────────────────────────────────────────────────────╯ │
│ in _find_and_load_unlocked:1126 │
│ ╭────────────────────────── locals ──────────────────────────╮ │
│ │ import_ = <function _gcd_import at 0x000001E15AB03D80> │ │
│ │ name = 'source.clients.auto.device' │ │
│ │ parent = 'source.clients.auto' │ │
│ │ parent_spec = None │ │
│ │ path = None │ │
│ ╰────────────────────────────────────────────────────────────╯ │
│ in _call_with_frames_removed:241 │
│ ╭────────────────────── locals ───────────────────────╮ │
│ │ args = ('source.clients.auto',) │ │
│ │ f = <function _gcd_import at 0x000001E15AB03D80> │ │
│ │ kwds = {} │ │
│ ╰─────────────────────────────────────────────────────╯ │
│ in _gcd_import:1204 │
│ ╭──────────── locals ─────────────╮ │
│ │ level = 0 │ │
│ │ name = 'source.clients.auto' │ │
│ │ package = None │ │
│ ╰─────────────────────────────────╯ │
│ in _find_and_load:1176 │
│ ╭──────────────────────── locals ────────────────────────╮ │
│ │ import_ = <function _gcd_import at 0x000001E15AB03D80> │ │
│ │ module = <object object at 0x000001E15AB34050> │ │
│ │ name = 'source.clients.auto' │ │
│ ╰────────────────────────────────────────────────────────╯ │
│ in _find_and_load_unlocked:1140 │
│ ╭─────────────────────────────────────────── locals ───────────────────────────────────────────╮ │
│ │ child = 'auto' │ │
│ │ import_ = <function _gcd_import at 0x000001E15AB03D80> │ │
│ │ name = 'source.clients.auto' │ │
│ │ parent = 'source.clients' │ │
│ │ parent_module = <module 'source.clients' from │ │
│ │ 'E:\\User\\Documents\\git\\01\\software\\source\\clients\\__init__.py'> │ │
│ │ parent_spec = ModuleSpec(name='source.clients', │ │
│ │ loader=<_frozen_importlib_external.SourceFileLoader object at │ │
│ │ 0x000001E17F789A90>, │ │
│ │ origin='E:\\User\\Documents\\git\\01\\software\\source\\clients\\__init__.p… │ │
│ │ submodule_search_locations=['E:\\User\\Documents\\git\\01\\software\\source… │ │
│ │ path = ['E:\\User\\Documents\\git\\01\\software\\source\\clients'] │ │
│ │ spec = None │ │
│ ╰──────────────────────────────────────────────────────────────────────────────────────────────╯ │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
ModuleNotFoundError: No module named 'source.clients.auto'
Exception in thread Thread-11 (run_until_complete):
Traceback (most recent call last):
File "C:\Program Files\Python311\Lib\tarfile.py", line 1877, in gzopen
t = cls.taropen(name, mode, fileobj, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Program Files\Python311\Lib\tarfile.py", line 1854, in taropen
return cls(name, mode, fileobj, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Program Files\Python311\Lib\tarfile.py", line 1714, in __init__
self.firstmember = self.next()
^^^^^^^^^^^
File "C:\Program Files\Python311\Lib\tarfile.py", line 2631, in next
raise e
File "C:\Program Files\Python311\Lib\tarfile.py", line 2604, in next
tarinfo = self.tarinfo.fromtarfile(self)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Program Files\Python311\Lib\tarfile.py", line 1292, in fromtarfile
buf = tarfile.fileobj.read(BLOCKSIZE)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Program Files\Python311\Lib\gzip.py", line 301, in read
return self._buffer.read(size)
^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Program Files\Python311\Lib\_compression.py", line 68, in readinto
data = self.read(len(byte_view))
^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Program Files\Python311\Lib\gzip.py", line 499, in read
if not self._read_gzip_header():
^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Program Files\Python311\Lib\gzip.py", line 468, in _read_gzip_header
last_mtime = _read_gzip_header(self._fp)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Program Files\Python311\Lib\gzip.py", line 428, in _read_gzip_header
raise BadGzipFile('Not a gzipped file (%r)' % magic)
gzip.BadGzipFile: Not a gzipped file (b'PK')
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "C:\Program Files\Python311\Lib\threading.py", line 1038, in _bootstrap_inner
self.run()
File "C:\Program Files\Python311\Lib\threading.py", line 975, in run
self._target(*self._args, **self._kwargs)
File "C:\Program Files\Python311\Lib\asyncio\base_events.py", line 653, in run_until_complete
return future.result()
^^^^^^^^^^^^^^^
File "E:\User\Documents\git\01\software\source\server\server.py", line 413, in main
service_instance = ServiceClass(config)
^^^^^^^^^^^^^^^^^^^^
File "E:\User\Documents\git\01\software\source\server\services\tts\piper\tts.py", line 13, in __init__
self.install(config["service_directory"])
File "E:\User\Documents\git\01\software\source\server\services\tts\piper\tts.py", line 75, in install
with tarfile.open(os.path.join(PIPER_FOLDER_PATH, PIPER_ASSETNAME), 'r:gz') as tar:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Program Files\Python311\Lib\tarfile.py", line 1824, in open
return func(name, filemode, fileobj, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Program Files\Python311\Lib\tarfile.py", line 1881, in gzopen
raise ReadError("not a gzip file") from e
tarfile.ReadError: not a gzip file
[IPKernelApp] WARNING | Parent appears to have exited, shutting down.
[IPKernelApp] WARNING | Parent appears to have exited, shutting down.
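The `b'PK'` in the BadGzipFile message is the ZIP magic number: the Windows release asset is a .zip, but install() unpacks everything with `tarfile.open(..., 'r:gz')`. A hedged sketch of format-aware extraction (the helper name is mine, not the module's API):

```python
import tarfile
import zipfile

def extract_piper_archive(archive_path: str, dest_dir: str) -> None:
    # Windows assets are .zip (magic bytes b'PK'); the Linux/macOS assets
    # are .tar.gz. Choose the extractor by extension instead of assuming
    # a gzip-compressed tar everywhere.
    if archive_path.endswith(".zip"):
        with zipfile.ZipFile(archive_path) as zf:
            zf.extractall(dest_dir)
    else:
        with tarfile.open(archive_path, "r:gz") as tar:
            tar.extractall(dest_dir)
```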
Is this actually working on other Windows machines?
Investigating.
Haha, freaking typos. I see it. Fix incoming.
Need to create tests for this file eventually.
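Agreed on tests. A hypothetical shape they could take, assuming an asset-name helper is factored out of install() (the helper below is invented for illustration, not tts.py's real API):

```python
import pytest

def piper_asset_name(platform: str, machine: str) -> str:
    # Invented helper mirroring the mapping tts.py needs.
    ext = "zip" if platform == "win32" else "tar.gz"
    os_name = {"linux": "linux", "darwin": "macos", "win32": "windows"}[platform]
    return f"piper_{os_name}_{machine}.{ext}"

@pytest.mark.parametrize(
    "platform, machine, expected",
    [
        ("linux", "x86_64", "piper_linux_x86_64.tar.gz"),
        ("darwin", "aarch64", "piper_macos_aarch64.tar.gz"),
        ("win32", "amd64", "piper_windows_amd64.zip"),
    ],
)
def test_piper_asset_name(platform, machine, expected):
    assert piper_asset_name(platform, machine) == expected
```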
(01os-py3.11) E:\oi\01\software>poetry run 01 --local --server
Warning: '01' is an entry point defined in pyproject.toml, but it's not installed as a script. You may get improper `sys.argv[0]`.
The support to run uninstalled scripts will be removed in a future release.
Run `poetry install` to resolve and get rid of this message.
> LM Studio
To use use 01 with LM Studio, you will need to run LM Studio in the background.
1 Download LM Studio from https://lmstudio.ai/, then start it.
2 Select a language model then click Download.
3 Click the <-> button on the left (below the chat button).
4 Select your model at the top, then click Start Server.
Once the server is running, you can begin your conversation below.
C:\Users\Tasha\AppData\Local\01\01\services\tts\piper\piper
Piper setup completed.
Fix is #178.
Okay, so I can get to this maybe later tonight. Here is the info needed if someone else wants to handle it:
The URLs and link formats for the various archives:
[piper_linux_aarch64.tar.gz](https://github.com/rhasspy/piper/releases/download/2023.11.14-2/piper_linux_aarch64.tar.gz)
[piper_linux_armv7l.tar.gz](https://github.com/rhasspy/piper/releases/download/2023.11.14-2/piper_linux_armv7l.tar.gz)
[piper_linux_x86_64.tar.gz](https://github.com/rhasspy/piper/releases/download/2023.11.14-2/piper_linux_x86_64.tar.gz)
[piper_macos_aarch64.tar.gz](https://github.com/rhasspy/piper/releases/download/2023.11.14-2/piper_macos_aarch64.tar.gz)
[piper_macos_x64.tar.gz](https://github.com/rhasspy/piper/releases/download/2023.11.14-2/piper_macos_x64.tar.gz)
[piper_windows_amd64.zip](https://github.com/rhasspy/piper/releases/download/2023.11.14-2/piper_windows_amd64.zip)
What you can expect from sys.platform: https://docs.python.org/3/library/sys.html#sys.platform
It's just that the other platforms need to be added to lines 38-111 of tts.py!
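Putting the asset list and the sys.platform table together, a minimal sketch of what that selection could look like (names and machine-string normalization are my assumptions, not the merged fix):

```python
import platform
import sys

RELEASE_URL = "https://github.com/rhasspy/piper/releases/download/2023.11.14-2/"

def piper_asset() -> str:
    machine = platform.machine().lower()  # e.g. "x86_64", "aarch64", "arm64", "amd64"
    if sys.platform == "linux":
        # The release publishes x86_64, aarch64, and armv7l Linux builds.
        return f"piper_linux_{machine}.tar.gz"
    if sys.platform == "darwin":
        # The macOS assets are named "aarch64" and "x64", not "arm64"/"x86_64".
        arch = "aarch64" if machine == "arm64" else "x64"
        return f"piper_macos_{arch}.tar.gz"
    if sys.platform == "win32":
        # Only an amd64 zip is published for Windows.
        return "piper_windows_amd64.zip"
    raise RuntimeError(f"no piper asset known for platform {sys.platform!r}")

asset_url = RELEASE_URL + piper_asset()
```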
OK, so apparently piper got fixed, but the exception above, the one about the source.clients.auto module, still exists. As far as I understand, this is a separate issue then? Should we close this and open a new one?
> poetry run 01 --local
○
Starting...
▌ 01 is compatible with several local model providers.
[?] Which one would you like to use?:
Ollama
> LM Studio
To use use 01 with LM Studio, you will need to run LM Studio in the background.
1 Download LM Studio from https://lmstudio.ai/, then start it.
2 Select a language model then click Download.
3 Click the <-> button on the left (below the chat button).
4 Select your model at the top, then click Start Server.
Once the server is running, you can begin your conversation below.
╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ E:\User\Documents\git\01\software\start.py:44 in run │
│ │
│ 41 │ │ │ local: bool = typer.Option(False, "--local", help="Use recommended local ser │
│ 42 │ │ ): │
│ 43 │ │
│ ❱ 44 │ _run( │
│ 45 │ │ server=server, │
│ 46 │ │ server_host=server_host, │
│ 47 │ │ server_port=server_port, │
│ │
│ ╭────────────── locals ──────────────╮ │
│ │ client = False │ │
│ │ client_type = 'auto' │ │
│ │ context_window = 2048 │ │
│ │ expose = False │ │
│ │ llm_service = 'litellm' │ │
│ │ llm_supports_functions = False │ │
│ │ llm_supports_vision = False │ │
│ │ local = True │ │
│ │ max_tokens = 4096 │ │
│ │ model = 'gpt-4' │ │
│ │ server = False │ │
│ │ server_host = '0.0.0.0' │ │
│ │ server_port = 10001 │ │
│ │ server_url = None │ │
│ │ stt_service = 'openai' │ │
│ │ temperature = 0.8 │ │
│ │ tts_service = 'openai' │ │
│ │ tunnel_service = 'ngrok' │ │
│ ╰────────────────────────────────────╯ │
│ │
│ E:\User\Documents\git\01\software\start.py:136 in _run │
│ │
│ 133 │ │ │ │ except FileNotFoundError: │
│ 134 │ │ │ │ │ client_type = "linux" │
│ 135 │ │ │
│ ❱ 136 │ │ module = importlib.import_module(f".clients.{client_type}.device", package='sour │
│ 137 │ │ client_thread = threading.Thread(target=module.main, args=[server_url]) │
│ 138 │ │ client_thread.start() │
│ 139 │
│ │
│ ╭────────────────────────────────────── locals ───────────────────────────────────────╮ │
│ │ client = True │ │
│ │ client_type = 'auto' │ │
│ │ context_window = 2048 │ │
│ │ expose = False │ │
│ │ handle_exit = <function _run.<locals>.handle_exit at 0x0000021CE889C860> │ │
│ │ llm_service = 'litellm' │ │
│ │ llm_supports_functions = False │ │
│ │ llm_supports_vision = False │ │
│ │ local = True │ │
│ │ loop = <ProactorEventLoop running=True closed=False debug=False> │ │
│ │ max_tokens = 4096 │ │
│ │ model = 'gpt-4' │ │
│ │ server = True │ │
│ │ server_host = '0.0.0.0' │ │
│ │ server_port = 10001 │ │
│ │ server_thread = <Thread(Thread-11 (run_until_complete), started 584)> │ │
│ │ server_url = '0.0.0.0:10001' │ │
│ │ stt_service = 'local-whisper' │ │
│ │ system_type = 'Windows' │ │
│ │ temperature = 0.8 │ │
│ │ tts_service = 'piper' │ │
│ │ tunnel_service = 'ngrok' │ │
│ ╰─────────────────────────────────────────────────────────────────────────────────────╯ │
│ │
│ C:\Program Files\Python311\Lib\importlib\__init__.py:126 in import_module │
│ │
│ 123 │ │ │ if character != '.': │
│ 124 │ │ │ │ break │
│ 125 │ │ │ level += 1 │
│ ❱ 126 │ return _bootstrap._gcd_import(name[level:], package, level) │
│ 127 │
│ 128 │
│ 129 _RELOADING = {} │
│ │
│ ╭────────────── locals ──────────────╮ │
│ │ character = 'c' │ │
│ │ level = 1 │ │
│ │ name = '.clients.auto.device' │ │
│ │ package = 'source' │ │
│ ╰────────────────────────────────────╯ │
│ in _gcd_import:1204 │
│ ╭──────────────── locals ────────────────╮ │
│ │ level = 1 │ │
│ │ name = 'source.clients.auto.device' │ │
│ │ package = 'source' │ │
│ ╰────────────────────────────────────────╯ │
│ in _find_and_load:1176 │
│ ╭──────────────────────── locals ────────────────────────╮ │
│ │ import_ = <function _gcd_import at 0x0000021CAC863D80> │ │
│ │ module = <object object at 0x0000021CAC894050> │ │
│ │ name = 'source.clients.auto.device' │ │
│ ╰────────────────────────────────────────────────────────╯ │
│ in _find_and_load_unlocked:1126 │
│ ╭────────────────────────── locals ──────────────────────────╮ │
│ │ import_ = <function _gcd_import at 0x0000021CAC863D80> │ │
│ │ name = 'source.clients.auto.device' │ │
│ │ parent = 'source.clients.auto' │ │
│ │ parent_spec = None │ │
│ │ path = None │ │
│ ╰────────────────────────────────────────────────────────────╯ │
│ in _call_with_frames_removed:241 │
│ ╭────────────────────── locals ───────────────────────╮ │
│ │ args = ('source.clients.auto',) │ │
│ │ f = <function _gcd_import at 0x0000021CAC863D80> │ │
│ │ kwds = {} │ │
│ ╰─────────────────────────────────────────────────────╯ │
│ in _gcd_import:1204 │
│ ╭──────────── locals ─────────────╮ │
│ │ level = 0 │ │
│ │ name = 'source.clients.auto' │ │
│ │ package = None │ │
│ ╰─────────────────────────────────╯ │
│ in _find_and_load:1176 │
│ ╭──────────────────────── locals ────────────────────────╮ │
│ │ import_ = <function _gcd_import at 0x0000021CAC863D80> │ │
│ │ module = <object object at 0x0000021CAC894050> │ │
│ │ name = 'source.clients.auto' │ │
│ ╰────────────────────────────────────────────────────────╯ │
│ in _find_and_load_unlocked:1140 │
│ ╭─────────────────────────────────────────── locals ───────────────────────────────────────────╮ │
│ │ child = 'auto' │ │
│ │ import_ = <function _gcd_import at 0x0000021CAC863D80> │ │
│ │ name = 'source.clients.auto' │ │
│ │ parent = 'source.clients' │ │
│ │ parent_module = <module 'source.clients' from │ │
│ │ 'E:\\User\\Documents\\git\\01\\software\\source\\clients\\__init__.py'> │ │
│ │ parent_spec = ModuleSpec(name='source.clients', │ │
│ │ loader=<_frozen_importlib_external.SourceFileLoader object at │ │
│ │ 0x0000021CE88A0990>, │ │
│ │ origin='E:\\User\\Documents\\git\\01\\software\\source\\clients\\__init__.p… │ │
│ │ submodule_search_locations=['E:\\User\\Documents\\git\\01\\software\\source… │ │
│ │ path = ['E:\\User\\Documents\\git\\01\\software\\source\\clients'] │ │
│ │ spec = None │ │
│ ╰──────────────────────────────────────────────────────────────────────────────────────────────╯ │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
ModuleNotFoundError: No module named 'source.clients.auto'
Piper already set up. Skipping download.
Finished release [optimized] target(s) in 0.09s
Whisper model already exists. Skipping download.
INFO: Started server process [8924]
INFO: Waiting for application startup.
Ready.
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:10001 (Press CTRL+C to quit)
Obviously this doesn't work, even though the API server starts.
@ProtoxiDe22, I'm currently working on a fix for the whole Windows process.
I have started setting up a PC workstation for AI, so this is an opportunity to consolidate 01 for this environment once and for all. Thanks for bearing with us in the meantime 🙂
I've pushed a WIP branch that fixes what you described and other issues (with probably others outstanding).
I'll try to finish it this weekend; I will let you know.
I'm not in front of the computer right now so I can't show you, but if you're curious, it's on the PR page, labeled with a big [WIP].