Gensim vs rasa NLU: What are the differences?

What is Gensim? Gensim is a Python library for topic modelling, document indexing, and similarity retrieval with large corpora. Its target audience is the natural language processing (NLP) and information retrieval (IR) community.
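
A minimal sketch of a typical Gensim workflow, building a small topic model over a toy corpus and querying a similarity index (the corpus, parameters, and preprocessing here are purely illustrative):

```python
from gensim import corpora, models, similarities

# Toy corpus: each document is already tokenized (real use needs proper preprocessing).
texts = [
    ["human", "machine", "interface", "computer"],
    ["graph", "trees", "minors", "survey"],
    ["user", "interface", "system", "computer"],
]

dictionary = corpora.Dictionary(texts)                  # token -> integer id mapping
corpus = [dictionary.doc2bow(text) for text in texts]   # bag-of-words vectors

lda = models.LdaModel(corpus, id2word=dictionary, num_topics=2)   # fit a small LDA topic model

index = similarities.MatrixSimilarity(lda[corpus])      # similarity index over topic vectors
query = lda[dictionary.doc2bow(["computer", "interface"])]
print(list(index[query]))                               # similarity of the query to each document
```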

What is rasa NLU? rasa NLU (Natural Language Understanding) is an open source, drop-in replacement for NLP tools like wit.ai: a tool for intent classification and entity extraction. You can think of it as a set of high-level APIs for building your own language parser using existing NLP and ML libraries.
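
A minimal sketch of training and querying a model with the standalone rasa_nlu Python package (the pre-1.0 API; newer Rasa versions fold NLU training into the rasa CLI, and the data and config file names below are assumed placeholders):

```python
from rasa_nlu.training_data import load_data
from rasa_nlu import config
from rasa_nlu.model import Trainer, Interpreter

training_data = load_data("data/nlu.md")          # intent/entity examples (assumed path)
trainer = Trainer(config.load("config.yml"))      # pipeline config, e.g. a spaCy-based pipeline
trainer.train(training_data)
model_directory = trainer.persist("./models")     # save the trained model and return its path

interpreter = Interpreter.load(model_directory)
print(interpreter.parse("book a table for two at 7pm"))
# -> dict with the predicted intent and extracted entities
```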

Gensim and rasa NLU can be categorized as "NLP / Sentiment Analysis" tools.

Gensim and rasa NLU are both open source tools. Gensim, with 9.65K GitHub stars and 3.52K forks, appears to have more adoption than rasa NLU, which has 6.05K stars and 1.82K forks.

According to the StackShare community, rasa NLU has broader approval: it is mentioned in 5 company stacks and 17 developer stacks, compared to Gensim, which is listed in 3 company stacks and 5 developer stacks.

Pros of Gensim

• No pros listed yet

Pros of rasa NLU

• Open Source (9)
• Docker Image (6)
• Self Hosted (6)
• Comes with rasa_core (3)
• Enterprise Ready (1)


Cons of Gensim

• No cons listed yet

Cons of rasa NLU

• No interface provided (4)

      What are some alternatives to Gensim and rasa NLU?
      NLTK
      It is a suite of libraries and programs for symbolic and statistical natural language processing for English written in the Python programming language.
      Keras
      Deep Learning library for Python. Convnets, recurrent neural networks, and more. Runs on TensorFlow or Theano. https://keras.io/
      FastText
      It is an open-source, free, lightweight library that allows users to learn text representations and text classifiers. It works on standard, generic hardware. Models can later be reduced in size to even fit on mobile devices.
      SpaCy
      It is a library for advanced Natural Language Processing in Python and Cython. It's built on the very latest research, and was designed from day one to be used in real products. It comes with pre-trained statistical models and word vectors, and currently supports tokenization for 49+ languages.
      TensorFlow
      TensorFlow is an open source software library for numerical computation using data flow graphs. Nodes in the graph represent mathematical operations, while the graph edges represent the multidimensional data arrays (tensors) communicated between them. The flexible architecture allows you to deploy computation to one or more CPUs or GPUs in a desktop, server, or mobile device with a single API.
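
As a brief illustration of the data-flow-graph model described for TensorFlow above, here is a minimal sketch using the TF 1.x-style graph API (exposed as tf.compat.v1 in TensorFlow 2.x); the operation and input values are illustrative only:

```python
import tensorflow as tf

# Build the graph explicitly rather than executing operations eagerly.
tf.compat.v1.disable_eager_execution()

a = tf.compat.v1.placeholder(tf.float32, name="a")   # graph node: input tensor
b = tf.compat.v1.placeholder(tf.float32, name="b")   # graph node: input tensor
c = tf.multiply(a, b, name="c")                      # graph node: operation; edges carry tensors

with tf.compat.v1.Session() as sess:                 # a session executes the graph
    print(sess.run(c, feed_dict={a: 2.0, b: 3.0}))   # -> 6.0
```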