Snips NLU in Docker

Hey,

I would like to test Snips NLU but am having some problems with it.
I have installed the Rhasspy Docker container, version 2.5.

Do I need to install anything else to use Snips?

I get a TimeoutError during testing:

[ERROR:2020-07-27 11:49:00,023] rhasspyserver_hermes: 
Traceback (most recent call last):
  File "/usr/lib/rhasspy/lib/python3.7/site-packages/quart/app.py", line 1821, in full_dispatch_request
    result = await self.dispatch_request(request_context)
  File "/usr/lib/rhasspy/lib/python3.7/site-packages/quart/app.py", line 1869, in dispatch_request
    return await handler(**request_.view_args)
  File "/usr/lib/rhasspy/lib/python3.7/site-packages/rhasspyserver_hermes/__main__.py", line 1341, in api_text_to_intent
    user_entities=user_entities,
  File "/usr/lib/rhasspy/lib/python3.7/site-packages/rhasspyserver_hermes/__main__.py", line 2458, in text_to_intent_dict
    result = await core.recognize_intent(text, intent_filter=intent_filter)
  File "/usr/lib/rhasspy/lib/python3.7/site-packages/rhasspyserver_hermes/__init__.py", line 452, in recognize_intent
    handle_intent(), messages, message_types
  File "/usr/lib/rhasspy/lib/python3.7/site-packages/rhasspyserver_hermes/__init__.py", line 898, in publish_wait
    result_awaitable, timeout=timeout_seconds
  File "/usr/lib/python3.7/asyncio/tasks.py", line 449, in wait_for
    raise futures.TimeoutError()
concurrent.futures._base.TimeoutError
[DEBUG:2020-07-27 11:49:00,023] rhasspyserver_hermes: Publishing 20 bytes(s) to rhasspy/handle/toggleOn
[DEBUG:2020-07-27 11:49:00,022] rhasspyserver_hermes: -> HandleToggleOn(site_id='testpi')
[DEBUG:2020-07-27 11:48:30,019] rhasspyserver_hermes: Publishing 194 bytes(s) to hermes/nlu/query
[DEBUG:2020-07-27 11:48:30,019] rhasspyserver_hermes: -> NluQuery(input='wie spät ist es', site_id='testpi', id='cac912f7-fdee-4b34-b65d-0d6f7e0eafc4', intent_filter=None, session_id='cac912f7-fdee-4b34-b65d-0d6f7e0eafc4', wakeword_id=None)
[DEBUG:2020-07-27 11:48:30,017] rhasspyserver_hermes: Publishing 20 bytes(s) to rhasspy/handle/toggleOff
[DEBUG:2020-07-27 11:48:30,017] rhasspyserver_hermes: -> HandleToggleOff(site_id='testpi')

I want to test if Snips recognizes words that are not predefined

For example:
Add water to shopping list

The word “water” would be flexible here, not predetermined.

It does, e.g. “Add soap to shopping list”, as long as you have defined examples in your sentences.ini.
See https://snips-nlu.readthedocs.io/en/latest/dataset.html for further information, especially the section on Implicit Entity Values.
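For a shopping-list intent, such examples in sentences.ini might look like this (the intent name, tag name, and example words are illustrative; the tag syntax is standard Rhasspy):

```ini
[AddToShoppingList]
add (water | soap | bread){item} to the shopping list
```

With Snips NLU, sentences like “add cheese to the shopping list” can then still be matched even though “cheese” is not in the list, as long as enough example values were given during training.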
As for Snips in the Docker container: I had to manually install snips-nlu inside the container for my Pi 4 running the Docker image (https://github.com/jr-k/snips-nlu-rebirth/tree/master/wheels). After installing libatlas-base-dev as well, it worked.
However, I saw that @synesthesiam was fiddling with the snips-nlu installation in Dockerfile.debian in the repository, so there might be an easier way to do this…
[Edit] I just saw that you could select Snips through the UI, which I wasn’t able to, so my solution might not be applicable for you…

I keep having to pull Snips out of the Docker/Debian install because the ARM-based platforms (Pis) don’t seem to get a Rust compiler installed like amd64 does. I’ve tried using the “rustup” installer inside the Docker build, but it never works, so installing snips-nlu via pip fails.

How did you get it working on the Pi? Did you have to install a Rust compiler separately?

I had the same problem while trying to install it on the Pi… until I found these prebuilt wheel files by chance: https://github.com/jr-k/snips-nlu-rebirth
For me nothing else worked… I tried rustup and setuptools-rust… I know it’s just a workaround, but it works for me.
Maybe some newer dependencies are not supported…

Awesome, thank you! I’ll check it out and see if it will work in the Docker images.

I am currently working on a permanently listening solution, basing it on the rhasspy-snips-nlu-hermes module. In case you’re interested, I would be willing to integrate it into Rhasspy as a module.
Sorry for the wrong thread, I thought why not mention it :smiley:


FYI: the snips-nlu-rebirth repo is maintained by one of the Project Alice guys. I think it only deals with parsing intents and does not allow training an assistant. For training, they use the Python version.

I was able to do a full interaction, including training, this way. The wheel files provided install version 0.20.2 of snips-nlu, which is up to date. Don’t know what they changed, but they did it well… :slight_smile:

As far as I remember, the training is done by the Rust part there as well. What’s done in Python is converting the JSON training file format Project Alice uses into the one required by snips-nlu.

Ah yes. Looks like the training part is indeed included. My bad :hugs:

Sorry, but I couldn’t follow the whole thing just now.
What do I have to do to use Snips NLU?
Can I implement the above-mentioned behaviour?
What do you mean by “examples”?
I wanted to avoid storing every article I might want to add in a slot.

Snips NLU was pulled from Rhasspy 2.5.3 (because it was kind of broken before?). I would suggest that you wait until it returns in a later version.

If you really want to use it beforehand, you can try to get rhasspy-snips-nlu-hermes running as a separate process and configure Rhasspy to use “Hermes MQTT” for intent recognition. Installation steps are described in the README. But be warned: This is a tedious process if you’re on ARM (and not using precompiled wheels)!
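For reference, pointing Rhasspy at an external Hermes MQTT intent recognizer is done in the profile settings. In the profile’s JSON this should look roughly like the following (key names as I understand the 2.5 profile schema; double-check against the documentation):

```json
{
  "intent": {
    "system": "hermes"
  }
}
```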

The way I did it on the Pi4 (as far as I remember):

  1. Get the wheel files from https://github.com/jr-k/snips-nlu-rebirth and put them in a local folder

  2. Pull the newest Docker image and start a bash shell in it, making the wheel files available in the process.
    This can be done as described in the wiki, modifying the docker command a little:
    docker run -p 12101:12101 \
        --name rhasspy \
        --restart unless-stopped \
        -v "$HOME/.config/rhasspy/profiles:/profiles" \
        -v "/etc/localtime:/etc/localtime:ro" \
        --device /dev/snd:/dev/snd \
        -v /path/to/wheelfolder:/path/in/container \
        --entrypoint=/bin/bash \
        -it \
        rhasspy/rhasspy
    Note the arguments that normally come after rhasspy/rhasspy and are left out here; you will need them later:
    --user-profiles /profiles \
    --profile en
    This will bring you into the shell of the Rhasspy Docker container.

  3. [EDIT] I just remembered: as Rhasspy runs in a venv, you need to activate the environment before installing Snips, otherwise the error persists: source /usr/lib/rhasspy/.venv/bin/activate

  4. From here, follow the instructions in the repository, using your chosen /path/in/container (always as an absolute path from /) for the pip install paths.

  5. After this, Snips is installed, which can be verified by typing snips-nlu, which should show version 0.20.

  6. Lastly, I needed to install libatlas via: apt-get install libatlas-base-dev
    After that you should be good to go, which you can verify by adding back the arguments you left out earlier and entering:
    /usr/lib/rhasspy/bin/rhasspy-voltron --profile en --user-profiles /profiles

If you run into trouble, give me a heads-up; I might have messed up the info at some point :smiley:
One last thing to note: as @NullEntity mentioned, it is not officially supported and therefore not available via the UI. However, you can still set it via the config.json in the web interface. Just follow the wiki here.
Hope this helps.


Thank you for that.
I’ve made some headway with that.

I run Rhasspy as a client-server system.
The server, which also handles the intent recognition, runs not on a Pi but on a hypervisor with an AMD64 processor.
So unfortunately I cannot install the wheels.

In that case, do steps 2 and 3 and afterwards:

  • pip3 install setuptools_rust
  • pip3 install snips_nlu

I believe that worked on my Docker build :slight_smile:

Thank you.
I did it.
But unfortunately still no success.
I thought Snips could also recognize words that are not explicitly given.

I wanted to build a shopping list with it.
But that would mean I have to teach it every article I might want to add, as a word.
Is there another way?

According to the docs Snips NLU can recognize unknown words in the same context. I tested this and it worked for some simple scenarios. I’m not sure how well it handles multiple unknown words in a sentence.
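For reference, this behaviour maps to the `automatically_extensible` flag on entities in the Snips NLU dataset format. A hand-written entity in the standalone YAML format would look roughly like this (the entity name and values are illustrative):

```yaml
# Snips NLU dataset entity (standalone YAML format)
type: entity
name: shopping_item            # illustrative entity name
automatically_extensible: yes  # accept values not listed below
values:
  - water
  - soap
```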

Your ASR engine has to support open transcription obviously. What ASR engine are you using?

Hey,

sorry, but what is the ASR engine?
On my master server I use the following settings:

Intent Recognition = SnipsNLU

I have also noticed that substitutions do not work with Snips.
For example, I have a slot that looks like this:

yes:true
no:false

Here I would like “yes” to be transmitted as “true”.
With Fsticuffs this works; with Snips, unfortunately, it does not.
Do I have to do anything else for this?
Otherwise I will have to work without the substitutions.

Yeah, that seems to be missing at the moment :slightly_frowning_face:. I already opened an issue for this: https://github.com/rhasspy/rhasspy-snips-nlu/issues/1
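Until that issue is fixed, one workaround is to apply the substitutions yourself in the intent handler, after Snips has parsed the sentence. A minimal sketch in Python (the `SUBSTITUTIONS` table, the slot name `confirmation`, and the payload shape are illustrative; the slot layout mimics the Hermes intent messages as I remember them):

```python
# Workaround sketch: apply Fsticuffs-style substitutions (e.g. "yes" -> "true")
# in your own intent handler until rhasspy-snips-nlu supports them natively.

SUBSTITUTIONS = {
    "confirmation": {"yes": "true", "no": "false"},
}

def apply_substitutions(intent: dict) -> dict:
    """Replace raw slot values with their substituted form, if one is defined."""
    for slot in intent.get("slots", []):
        mapping = SUBSTITUTIONS.get(slot.get("slotName"), {})
        substituted = mapping.get(slot.get("rawValue"))
        if substituted is not None:
            slot["value"]["value"] = substituted
    return intent

# Example payload as the handler might receive it:
payload = {
    "intent": {"intentName": "Confirm"},
    "slots": [
        {"slotName": "confirmation",
         "rawValue": "yes",
         "value": {"kind": "Custom", "value": "yes"}},
    ],
}

print(apply_substitutions(payload)["slots"][0]["value"]["value"])  # prints: true
```

This keeps sentences.ini unchanged and moves the yes→true mapping into the handling step, so it works regardless of which NLU system produced the intent.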

“Automatic Speech Recognition”. Also known as “Speech to Text” (STT). Sorry for the confusion :smile:. I see you’re using Kaldi for STT. It should support open transcription if enabled.

Hello everybody, I’m new to this forum and hope you can understand me; my last English lesson was some decades ago.

With the help of this thread I was able to get Snips working in Docker on my Synology. My intention is also to handle unseen words, and so far that works.
But I got stuck in the same corner with the slots and synonyms. Is there a way to define slots with synonyms that Snips accepts?
I tried to modify the dataset.yaml, but after a new training the modifications are gone.

Thank you for your work, it’s really impressive.
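If it helps: the dataset that Rhasspy generates for Snips is rebuilt from sentences.ini on every training, which would explain why manual edits to it disappear. In the standalone Snips NLU dataset format, synonyms are written as lists whose first element is the resolved value; a hand-written entity would look roughly like this (the entity name and values are illustrative, and the strings are quoted so YAML does not read yes/no/true/false as booleans):

```yaml
# Snips NLU dataset entity with synonyms (standalone YAML format)
type: entity
name: confirmation            # illustrative entity name
automatically_extensible: no
values:
  - ["true", "yes", "yeah"]   # "yes"/"yeah" resolve to "true"
  - ["false", "no", "nope"]
```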