Natural Language Processing libraries
Real-time Fast (< 1 ms)
Offline, On-device



C++ NLP library - You will get a header file and a .a file (binary static library) that you can use with your C++ program.

Python NLP library - You will get a .so file (binary) that you can import in your Python code.

Java NLP library - Coming soon!


Native, Offline libraries

Our commercial natural language processing library is a high-performance tool for analyzing your users' search queries. Use it as a text analyzer to better understand natural language queries and deliver better search results (e.g. by converting natural language to a SQL query).
It is implemented as a C/C++ library, giving you native efficiency, and can be integrated with Java backends or other high-level languages such as Python, PHP, and Perl.



Real-time language processing

Fast analysis of natural language, with processing times under 1 ms for 10-word queries. This is an order of magnitude faster than some of the big names providing such services (PaaS, IaaS), and it is ideal for real-time NLP/NLU applications that need sub-millisecond text processing.


No GPU required!

ThatNeedle's NLP libraries do not depend on GPUs or hardware accelerators.
They run on CPUs, which makes for cost-efficient operation.
This is unlike many other NLP frameworks, where a GPU or hardware accelerator is mandatory just to get the system running.



Compact

Most of our NLP libraries are less than 4 MB on disk, making them ideal for embedding in edge devices and for offline, on-device NLP applications.
They can also be configured to run as on-premises microservices.


Custom Needs?

We can incorporate your custom language processing needs into a custom library, and it will enjoy the same high performance ThatNeedle is known for. We charge a reasonable customization fee that is only a fraction of your in-house engineering cost.
You should definitely consider the ThatNeedle library as an alternative to Stanford NLP, NLTK, spaCy, or other open-source NLP frameworks, especially for real-time, offline NLP applications or wherever speed and size matter. It also performs better than commercial alternatives such as Watson, LUIS, api.ai, wit.ai, and Google NLP. Please get in touch with your needs if you are looking for an alternative to any of the above vendors.



Ready to be plugged in

Integration of the ThatNeedle library into your application by your programmers is easy. All it takes is a RESTful call or a native function call to harness the power of many years of intense NLP research.
It is meant for developers: apart from the RESTful API that lets you run NLP offline (on your premises, without internet), we also offer Python, C++, and other interfaces to the NLP library, which works on Mac OS X (Apple), Windows, and Linux variants.
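
As a rough sketch of the RESTful route, here is what a call to an on-premises microservice could look like from Python. The endpoint URL, route, and JSON field names below are illustrative placeholders, not the documented API; the integration guide shipped with your library describes the actual interface.

    import json
    import urllib.request

    # Hypothetical on-premises endpoint and payload; field names are placeholders.
    payload = json.dumps({"text": "red running shoes under 2000"}).encode("utf-8")
    request = urllib.request.Request(
        "http://localhost:8080/analyze",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

    # The local service responds with the analysis (entities, intent, numbers, ...).
    with urllib.request.urlopen(request) as response:
        print(json.load(response))

The native Python and C++ interfaces avoid even this local round trip by calling the library directly in-process.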



List of some ThatNeedle binary NLP libraries



Custom Entity Recognition

Real-time detection of custom, niche entities in unstructured text (custom NER), with slot filling. Supports compound-word and multiword recognition and conversion.
Learn more about custom NER
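
To make slot filling concrete, here is an illustrative sketch of the kind of structured output a custom NER pass produces from a raw query. The query, slot names, and values are made-up examples, not output from the ThatNeedle API.

    # Illustrative only: slot names and values are assumptions for this example.
    query = "book a table for two at an italian place near koramangala tonight"

    # A custom NER / slot-filling pass turns the raw query into structured slots:
    slots = {
        "party_size": "two",
        "cuisine": "italian",
        "location": "koramangala",
        "time": "tonight",
    }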

Voice Commands

Add custom vocabulary to your default speech-to-text engine and improve the accuracy of custom voice command recognition (better WER for custom words).
Learn more about custom voice commands

Auto Suggest / Autocomplete

Show instant auto-suggestions as the user types (typeahead), in real time.
Learn more about Instant suggestions



Name Gender Prediction

Predict whether a person is male or female from their name. Our name search algorithm is well suited to marketing analytics and insights.

Topic Extraction

Extract technology topics and other niche topics from raw text. No need to train your own topic model.
Learn more about NLP topic extraction
Experience the AI summary demo

Homophone correction

A library to automatically correct homophone errors in English transcriptions (e.g. to/too/two).



Custom Tokenizer

Custom text tokenization capability for better performance than the default tokenizers.

Niche word prediction

Predict missing words in a sentence based on training data from large text corpora in your domain.

Text diff library

Detect text edits and corrections. This is an advanced text diff tool that handles 1-N, N-1, and N-M edits in addition to 1-1 text edits.
Learn more about text diff
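
For clarity, the edit types above refer to how many words change on each side of an edit. The pairs below are made-up examples of each type, not output from the library.

    # Illustrative word-level edit types (made-up examples):
    edits = [
        ("1-1", "their",    "there"),        # one word replaced by one word
        ("1-N", "cant",     "can not"),      # one word split into several
        ("N-1", "a lot",    "alot"),         # several words merged into one
        ("N-M", "gonna be", "going to be"),  # several words become a different several
    ]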



NL to SQL

A library to convert natural language to SQL for NLP database queries.
Learn more about Natural language to SQL
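
As an illustration of the idea, a natural language question maps to a structured SQL query. The question, table, and column names below are invented for this example and are not tied to the library's actual output.

    # Illustrative mapping only; table and column names are made up.
    question = "show me orders above 5000 placed last week"

    sql = (
        "SELECT * FROM orders "
        "WHERE amount > 5000 "
        "AND order_date >= DATE('now', '-7 days');"  # SQLite-style date arithmetic
    )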

Numbers NLP

Natural language number handling: numbers, number ranges, times, dates, etc.
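
The examples below illustrate the kind of normalization such number handling performs; the inputs and outputs are invented for illustration, not produced by the library.

    # Illustrative normalizations (made-up examples):
    examples = {
        "twenty five thousand": 25000,             # spelled-out number -> integer
        "between two and five kg": (2, 5),         # number range
        "half past nine in the evening": "21:30",  # time expression
        "the fifth of march": "03-05",             # date expression (month-day)
    }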

Text Classification

Semantic text classifiers for various verticals. They bring in niche domain knowledge for better and more relevant categorization.
Learn more about text classification

Gist API: YouTube Summary

Get a crisp YouTube summary instantly, powered by generative AI/GPT. The efficacy of this API can be seen in a standalone product packaged as a Chrome extension. The API can also be used without the extension.

Text to Knowledge Graph Generation

Automatically generate a knowledge graph from unstructured text. This helps you build sophisticated information retrieval and analysis systems on top of unstructured text.
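
To give a feel for the output, a knowledge graph extracted from text is commonly represented as (subject, relation, object) triples. The sentence and triples below are invented for illustration and do not come from the library.

    # Illustrative only: a made-up sentence and the triples it could yield.
    text = "Acme Corp acquired Widget Labs in 2021 for 50 million dollars."

    triples = [
        ("Acme Corp", "acquired",         "Widget Labs"),
        ("Acme Corp", "acquisition_year", "2021"),
        ("Acme Corp", "acquisition_cost", "50 million dollars"),
    ]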





Tell us about your business for custom-optimized, real-time libraries:

contact us


Benefits of Offline, On-device, local NLP


1. Zero network latency

On-device NLP means you do not have to go over the network to analyze intent and look up information for a query. This easily saves you about 100 milliseconds in round-trip time.


2. Data Security and Privacy

Working with your data on device or on premises keeps it safe from illegitimate network access, which is otherwise a risk when data is in motion. This means better data security and fewer regulatory and compliance headaches.

3. Economical in the long term - No usage limits or usage-based pricing

Cloud APIs are priced per API call and impose usage limits on calls to the cloud infrastructure. An offline or on-premises deployment helps you break free from unexpected bills.


Our libraries work on Microsoft Windows, Mac OS X, and Linux variants.