
Build a Named Entity Recognition (NER) ML model in hours


Named-entity recognition (NER), also known as entity extraction, is a subtask of information extraction that aims to locate and classify atomic elements in text into predefined categories such as the names of persons, organizations, and places, expressions of time, quantities, monetary values, and more.

For example, in the sentence “John will arrive in New York on Monday”, the named entities can be extracted as:

Name – John

Place – New York

Time – Monday
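The task can be illustrated with a minimal dictionary-based tagger. This is a toy sketch, not a trained model: the gazetteer entries below are assumptions made up for this one sentence, whereas a real NER model learns such mappings from labeled data.

```python
# Toy dictionary-based entity tagger illustrating what NER does.
# A trained model generalizes from labeled data instead of relying
# on a hand-written gazetteer like this one.
GAZETTEER = {
    "John": "Name",
    "New York": "Place",
    "Monday": "Time",
}

def extract_entities(text):
    """Return (entity_text, label) pairs found in the text."""
    found = []
    for phrase, label in GAZETTEER.items():
        if phrase in text:
            found.append((phrase, label))
    return found

print(extract_entities("John will arrive in New York on Monday"))
# → [('John', 'Name'), ('New York', 'Place'), ('Monday', 'Time')]
```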

NER has numerous applications across various industries and can be utilized to automate repetitive tasks based on understanding natural language.

Some Use Cases of Named Entity Recognition

Classifying content for news providers

Named Entity Recognition can be used to scan news articles and reveal the major people, organizations, and places mentioned in them. The articles can then be classified based on these entities.

Powering Content Recommendations

By extracting entities from a particular article, companies can understand the keywords which users are interested in, and recommend other articles that have similar entities mentioned in them.

Customer Support

NER can be used to categorize complaints, emails, and suggestions and direct them to the relevant department within the organization.

Building an NER Model

The platform offers templatized and guided workflows for your projects. By choosing the correct template and following the instructions, you can create working ML models in minutes or hours.

We will build a customer-review content moderation model. Customer reviews are a rich source of information and can be used to understand users’ perspectives on products. Our model will extract product mentions and product features from the review text. Such intelligence can later be used to identify the key features customers look for in a product and to classify reviews based on those criteria.


1. Project Creation

Choose the NER template from the given template options, and give the project a name and description.



2. Dataset

After the project is created, you can design your dataset around the entities you want the model to extract. Give the dataset a name and a description, and add the entities. We added product_features and product_mentions as the named entities. Click the Submit button to create the dataset.
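Conceptually, the dataset definition boils down to a small record like the one below. The field names here are an assumption for illustration; the platform's actual schema may differ.

```python
# Hypothetical dataset definition mirroring the step above.
# Field names are illustrative, not the platform's documented schema.
dataset = {
    "name": "customer-reviews",
    "description": "Product reviews collected for content moderation",
    "entities": ["product_features", "product_mentions"],
}

print(dataset["entities"])
# → ['product_features', 'product_mentions']
```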


3. Data Collection

With data collection options such as the Collect API, data collection jobs, and CSV upload, you can choose whichever is most convenient and upload your data. We used the CSV upload approach for the customer reviews.

Select the CSV upload option from the left panel under Data Collection, and drag and drop the CSV file to upload the data to the dataset. You can select the target dataset from the drop-down menu.
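A review CSV of this kind is straightforward to prepare or inspect programmatically. The sketch below parses a couple of sample reviews with Python's standard csv module; the column name and review texts are made up for the example.

```python
import csv
import io

# Two sample reviews standing in for the uploaded CSV file.
# The "review" column name is an assumption for this sketch.
csv_text = (
    "review\n"
    "The battery life on this phone is excellent\n"
    "Great camera but the screen scratches easily\n"
)

reviews = [row["review"] for row in csv.DictReader(io.StringIO(csv_text))]
print(len(reviews))  # → 2
```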


4. Data Labeling

The platform provides mobile-based and form-based labeling options. You can create data labeling jobs and assign them to collaborators who will do the labeling for you.
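Labeled NER examples are commonly stored as character-offset spans over the text. The sketch below shows one annotated review in that style; the span format is a common convention, not necessarily the platform's internal representation.

```python
# One labeled review in a common span-annotation style: each entity is
# recorded as (start, end, label) character offsets into the text.
text = "The battery life on this phone is excellent"

def span(substring, label):
    """Compute the (start, end, label) span for a substring of `text`."""
    start = text.index(substring)
    return (start, start + len(substring), label)

annotations = [
    span("battery life", "product_features"),
    span("phone", "product_mentions"),
]
print(annotations)
# → [(4, 16, 'product_features'), (25, 30, 'product_mentions')]
```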

5. Feature Set

A feature set is the subset of the dataset that goes in as the input for machine learning training. To create one, click the Feature Set tab on the left panel, then click the ‘Add FeatureSet' button.

Give it a name and a description, and select the source dataset from the drop-down menu. Configure the split policy, selection method, number of records, and filter options for the train and test sets.


6. Model Training

The platform offers one-click training, which is super easy and convenient for model experimentation. To train a model, click the “Train a New Model” button, give the training a name and description, and choose the feature set. Optimized, relevant algorithm options are offered depending on the type of template you are using, so you can pick whichever algorithm fits your problem and also try out different algorithms on your data. Model experimentation becomes really easy as a result.

Choose the algorithm from the drop-down menu and tune the hyperparameters, such as the number of epochs, the learning rate, and the number of layers.
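As a rough picture of what gets configured at this step, a training run's hyperparameters amount to a small settings record like this. The knobs and values are illustrative assumptions; the options actually exposed depend on the algorithm chosen.

```python
# Illustrative hyperparameter settings for an NER training run.
# Which knobs exist depends on the algorithm selected from the drop-down.
hyperparameters = {
    "epochs": 10,
    "learning_rate": 1e-3,
    "num_layers": 2,
    "batch_size": 32,
}

print(hyperparameters["epochs"])  # → 10
```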

Click the ‘Train’ button to start your model training, and go to the Logs tab to watch the live training logs.



7. Model Deployment

As soon as training finishes successfully in the cloud, the resulting model is listed under the Model Deployment tab. Our training achieved an accuracy of 95%, which is a pretty good score. The whole cycle of creating a model from scratch took about an hour.


Click on the training to see more details, such as the training reports, the Inference API, and more.


To deploy a model, go to the Deployment tab, where a model can be deployed with just one click. Rolling back a model is just as simple: it is also a single-click feature that removes the model from the cloud when it is no longer in use. This takes away the whole work of setting up a model deployment pipeline.


8. Inference API

Once the model is deployed, inference APIs are generated, with client snippets available in all the major languages. You can simply copy them, integrate them into your system, and start predicting.
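A typical integration builds a JSON request and POSTs it to the inference endpoint. The payload shape, endpoint URL, and auth header below are assumptions for illustration, not the platform's documented API.

```python
import json

# Hypothetical inference request payload; the field names and endpoint
# details are assumptions, not the platform's documented API.
payload = {"text": "Great camera but the screen scratches easily"}
body = json.dumps(payload)

# In a real integration you would POST `body` to the generated endpoint,
# along with whatever auth header the platform issues, e.g. (assumed):
#   resp = requests.post(INFERENCE_URL, data=body,
#                        headers={"Authorization": f"Bearer {API_KEY}"})
print(body)
```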


9. Model Monitoring

As soon as the Inference API is hooked into your system, statistics start appearing on the monitoring page and you can see how your model is performing. The platform also provides a ‘Quick-Test’ feature, which lets you test your model on unseen data and see how it performs before you actually integrate it with your system.
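The kind of sanity check a quick test enables can be sketched as comparing predicted entities against a small hand-labeled sample and computing precision and recall. The gold and predicted sets below are made up for the example.

```python
# Sketch of a quick-test style evaluation: compare predicted entities
# against hand-labeled ones on a small sample (both sets are made up).
gold = {("battery life", "product_features"), ("phone", "product_mentions")}
pred = {("battery life", "product_features"), ("screen", "product_features")}

tp = len(gold & pred)          # correctly predicted entities
precision = tp / len(pred)     # fraction of predictions that are right
recall = tp / len(gold)        # fraction of gold entities recovered
print(precision, recall)  # → 0.5 0.5
```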


So that's how we built the NER model in hours. The platform can be really useful for model experimentation, giving you early visibility with quicker results. It can also speed up data collection and labeling. With its highly scalable on-demand infrastructure, you can focus solely on solving your actual business problem, building data intuition, and delivering intelligence based on data.