Node js Elastic Search Tutorial Example

Node js Elastic Search Tutorial Example is today's main topic. Elasticsearch is an open-source search engine that has become hugely popular due to its high performance and distributed architecture. It is built on top of Apache Lucene, a high-performance text search engine library. Although Elasticsearch can store and retrieve data, its primary purpose is not to serve as a database. Instead, it is a search engine (server) whose main goal is indexing, searching, and providing real-time statistics on the data.


In this tutorial, we will integrate Elasticsearch with Node.js and use it to index and search data. Once data is imported, it immediately becomes available for searching. Elasticsearch is schema-free, stores data in JSON documents, and can automatically detect the data structure and its type.

Elasticsearch is also entirely API driven. It means that almost any operation can be performed via a simple RESTful API using JSON data over HTTP. It has client libraries for nearly every programming language, including JavaScript. In this example, we will use the official Node.js client library.

#Step 1: Install Elasticsearch on Mac via Homebrew.

We start this tutorial by installing Elasticsearch. I am installing it on a Mac, so type the following command in your terminal to install Elasticsearch via Homebrew.

brew install elasticsearch

Once the installation completes, start the Elasticsearch service using the following command.

brew services start elasticsearch

#Step 2: Setup Node.js Environment.

Create a new Node.js project by the following command.

mkdir node-elastic

Now, go into the project folder.

cd node-elastic

Initialize the package.json file using the following command.

npm init

We need to install the official Node.js Elasticsearch client package.

npm install elasticsearch --save

Inside the project root, create one file called server.js.

Write the following code inside the server.js file.

// server.js

const express = require('express');
const elasticsearch = require('elasticsearch');
const fs = require('fs');
const app = express();

const PORT = 5000;

const client = new elasticsearch.Client({
    host: '127.0.0.1:9200',
    log: 'error'
});

app.listen(PORT, function() {
    console.log('Server is running on PORT:',PORT);
});

We have already started the Elasticsearch service when we installed it via Homebrew, so there is nothing else to start on that side.

Start the node.js server using the following command.

node server

So, if your Elasticsearch server is running successfully on port 9200, you will not get any error, and the console displays Server is running on PORT: 5000. Remember, our Node.js server runs on port 5000, while Elasticsearch runs on port 9200.

#Step 3: Create a sample data.

We need to index some data so that we can query it later. Inside the root of the project, create one file called data.json and add the following content. It is sample data; in a real application, this data would come from a database.

[
    {
      "id": "575084573a2404eec25acdcd",
      "title": "Id sint ex consequat ut.",
      "journal": "First Journal",
      "volume": 54,
      "number": 6,
      "pages": "255-268",
      "year": 2011,
      "authors": [
        {
          "firstname": "Kerr",
          "lastname": "Berry",
          "institution": "Skyplex",
          "email": "Kerr@Skyplex.info"
        },
        {
          "firstname": "Fischer",
          "lastname": "Farmer",
          "institution": "Digique",
          "email": "Fischer@Digique.biz"
        },
        {
          "firstname": "Brandie",
          "lastname": "Reed",
          "institution": "Fanfare",
          "email": "Brandie@Fanfare.com"
        },
        {
          "firstname": "Martinez",
          "lastname": "Bradford",
          "institution": "Comveyer",
          "email": "Martinez@Comveyer.name"
        },
        {
          "firstname": "Lula",
          "lastname": "Charles",
          "institution": "Gadtron",
          "email": "Lula@Gadtron.tv"
        }
      ],
      "abstract": "Do occaecat reprehenderit dolore proident nulla magna nostrud aliquip dolore. Officia minim eiusmod eu minim ea labore velit ea. Voluptate sit deserunt duis reprehenderit.",
      "link": "http://ea.ca/575084573a2404eec25acdcd.pdf",
      "keywords": [
        "aute",
        "nisi",
        "adipisicing",
        "fugiat",
        "qui"
      ],
      "body": "Quis pariatur velit ipsum tempor eu ad. Do nisi dolore tempor anim eiusmod in ea aliqua velit fugiat culpa sunt ea. Labore sint officia  Adipisicing occaecat incididunt sunt labore elit. Pariatur officia nulla anim labore enim non labore laborum eu eu"
    },
    {
      "id": "5750845735dff7db71032593",
      "title": "Consectetur velit do esse laborum duis cillum mollit Lorem aliquip occaecat.",
      "journal": "Second Journal",
      "volume": 11,
      "number": 11,
      "pages": "277-302",
      "year": 1987,
      "authors": [
        {
          "firstname": "Beck",
          "lastname": "Browning",
          "institution": "Snowpoke",
          "email": "Beck@Snowpoke.us"
        },
        {
          "firstname": "Tracy",
          "lastname": "Vaughn",
          "institution": "Zaggle",
          "email": "Tracy@Zaggle.me"
        },
        {
          "firstname": "Britney",
          "lastname": "Hudson",
          "institution": "Exozent",
          "email": "Britney@Exozent.net"
        },
        {
          "firstname": "Alissa",
          "lastname": "Perez",
          "institution": "Buzzmaker",
          "email": "Alissa@Buzzmaker.org"
        },
        {
          "firstname": "Darlene",
          "lastname": "Love",
          "institution": "Hydrocom",
          "email": "Darlene@Hydrocom.co.uk"
        }
      ],
      "abstract": "Anim labore nulla et et sunt. Esse ad enim velit culpa irure. Irure cillum culpa velit exercitation voluptate sint. Incididunt voluptate minim ipsum est sint amet sit esse. Laboris aute ut sint nulla consectetur nulla non veniam dolore eu aute exercitation. Commodo aute ea duis qui officia quis nisi consequat adipisicing enim magna dolore nisi. Lorem est duis dolor sint est id culpa. Sunt proident consectetur ex deserunt do adipisicing fugiat incididunt nulla aliquip officia. Occaecat esse dolor voluptate aute et ad deserunt commodo nisi. Lorem ipsum cupidatat nostrud nisi Lorem minim ut excepteur aute.",
      "link": "http://reprehenderit.io/5750845735dff7db71032593.pdf",
      "keywords": [
        "ipsum",
        "nostrud",
        "commodo",
        "fugiat"
      ],
      "body": "Adipisicing nulla mollit sunt deserunt. Dolor nostrud velit reprehenderit dolor dolore. Sunt officia culpa labore officia cupidatat commodo eiusmod sit cupidatat aliqua ullamco et sit Lorem. Amet cupidatat laboris anim mollit"
    },
    {
      "id": "575084570b63f5fd28d23f48",
      "title": "Et duis laborum id laborum qui reprehenderit laborum.",
      "journal": "Third Journal",
      "volume": 18,
      "number": 12,
      "pages": "247-265",
      "year": 2013,
      "authors": [
        {
          "firstname": "Leticia",
          "lastname": "Ingram",
          "institution": "Acruex",
          "email": "Leticia@Acruex.info"
        }
      ],
      "abstract": "Ea velit aute ipsum tempor excepteur sint magna labore et occaecat. Exercitation nulla officia enim qui ex proident ullamco quis irure. Ut ullamco proident et culpa fugiat qui. Excepteur laborum consectetur mollit dolore fugiat proident et sint.",
      "link": "http://dolor.biz/575084570b63f5fd28d23f48.pdf",
      "keywords": [
        "irure",
        "consequat",
        "incididunt"
      ],
      "body": "Enim occaecat incididunt irure Lorem id. Proident enim duis dolore culpa ut velit consequat excepteur ullamco sit excepteur cupidatat minim qui. Aliqua sunt ipsum magna non non elit consequat voluptate adipisicing. Ipsum ad ut in irure non eu ea. Non sit enim ipsum ea nisi officia do incididunt minim ipsum mollit laboris deserunt labore."
    },
    {
      "id": "575084573ae731ef0dc821c6",
      "title": "Nisi sunt ea eiusmod elit irure amet esse magna.",
      "journal": "Fourth Journal",
      "volume": 40,
      "number": 7,
      "pages": "133-162",
      "year": 1985,
      "authors": [
        {
          "firstname": "Rosario",
          "lastname": "Waters",
          "institution": "Unq",
          "email": "Rosario@Unq.com"
        },
        {
          "firstname": "Gonzalez",
          "lastname": "Klein",
          "institution": "Ginkogene",
          "email": "Gonzalez@Ginkogene.name"
        },
        {
          "firstname": "Corina",
          "lastname": "Fischer",
          "institution": "Orboid",
          "email": "Corina@Orboid.tv"
        },
        {
          "firstname": "Sonia",
          "lastname": "Adams",
          "institution": "Pyramis",
          "email": "Sonia@Pyramis.ca"
        },
        {
          "firstname": "Curry",
          "lastname": "Sharpe",
          "institution": "Cytrak",
          "email": "Curry@Cytrak.us"
        }
      ],
      "abstract": "Lorem anim deserunt mollit dolore consectetur dolor ex amet do et tempor sunt. Officia nulla magna aliqua mollit voluptate mollit culpa minim. Ad eiusmod magna exercitation anim sint ut consequat adipisicing veniam irure minim. Dolore aute adipisicing elit quis in in laboris excepteur minim. Velit veniam labore pariatur duis anim esse. Duis non aute ullamco ex voluptate in nulla est enim sit dolore. Et cupidatat aliqua commodo veniam incididunt ea proident dolore elit et amet mollit minim. Exercitation amet nisi consectetur irure nulla proident esse do ullamco veniam ea sint qui.",
      "link": "http://pariatur.me/575084573ae731ef0dc821c6.pdf",
      "keywords": [
        "irure",
        "cupidatat",
        "nostrud"
      ],
      "body": "Ea exercitation nostrud ullamco ad sit sint occaecat do ullamco magna. Eu aute exercitation deserunt velit. Et minim ex do anim adipisicing commodo elit. Laborum excepteur minim ea incididunt ipsum esse non laboris consequat mollit exercitation ea. Eiusmod irure dolor pariatur ad est irure excepteur aliquip quis voluptate aute et eiusmod fugiat."
    }
]

It is an array of four elements. So, if we index the data correctly, the indexed document count should be 4, and we will verify that shortly.

#Step 4: Add sample data for indexing.

Now, replace the code inside the server.js file with the following.

// server.js

const express = require('express');
const elasticsearch = require('elasticsearch');
const fs = require('fs');
const app = express();

const PORT = 5000;

const client = new elasticsearch.Client({
    host: '127.0.0.1:9200',
    log: 'error'
});

client.ping({ requestTimeout: 30000 }, function(error) {
    if (error) {
        console.error('elasticsearch cluster is down!');
    } else {
        console.log('Everything is ok');
    }
});


const bulkIndex = function bulkIndex(index, type, data) {
    let bulkBody = [];
  
    data.forEach(item => {
      bulkBody.push({
        index: {
          _index: index,
          _type: type,
          _id: item.id
        }
      });
  
      bulkBody.push(item);
    });
  
client.bulk({body: bulkBody})
    .then(response => {
      let errorCount = 0;
      response.items.forEach(item => {
        if (item.index && item.index.error) {
          console.log(++errorCount, item.index.error);
        }
      });
      console.log(
        `Successfully indexed ${data.length - errorCount}
         out of ${data.length} items`
      );
    })
    .catch(console.error);
  };

function indexData() {
    // readFileSync is synchronous, so no async/await is needed here
    const articlesRaw = fs.readFileSync('./data.json');
    const articles = JSON.parse(articlesRaw);
    console.log(`${articles.length} items parsed from data file`);
    bulkIndex('library', 'article', articles);
}

indexData();

app.listen(PORT, function() {
    console.log('Server is running on PORT:',PORT);
});

Here, we have created a function called indexData().

Inside this function, we read the data from the data.json file, parse it as JSON, and pass the resulting array to the bulkIndex() function.

The bulk API makes it possible to perform many index/delete operations in a single API call, which can significantly increase indexing speed.
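The alternating action/document layout that bulkIndex() builds is worth seeing in isolation. Here is a minimal, self-contained sketch of just the body-building step; the index and type names are the ones used above, and the two sample documents are made up for illustration.

```javascript
// Build an Elasticsearch bulk body: for every document we push an
// action entry ({ index: ... }) followed by the document itself, so
// the final array has twice as many entries as there are documents.
function buildBulkBody(index, type, data) {
  const bulkBody = [];
  data.forEach(item => {
    bulkBody.push({ index: { _index: index, _type: type, _id: item.id } });
    bulkBody.push(item);
  });
  return bulkBody;
}

// Two sample documents produce four entries: action, doc, action, doc.
const body = buildBulkBody('library', 'article', [
  { id: '1', journal: 'First Journal' },
  { id: '2', journal: 'Second Journal' }
]);

console.log(body.length);       // 4
console.log(body[0].index._id); // '1'
console.log(body[1].journal);   // 'First Journal'
```

This alternating shape is why errors come back per item in the bulk response: each action entry maps to one result entry.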


#Step 5: Verify the indexing of the data.

We can verify whether our data is indexed or not by adding some code to our application.

Let us create another file called verify.js inside the root folder.

Add the following code inside a verify.js file.

// verify.js

const elasticsearch = require('elasticsearch');

const client = new elasticsearch.Client({
    host: '127.0.0.1:9200',
    log: 'error'
 });


function indices() {
    return client.cat.indices({v: true})
    .then(console.log)
    .catch(err => console.error(`Error connecting to the es client: ${err}`));
  }

module.exports = function verify() {
    console.log(`elasticsearch indices information:`);
    indices();
}

If the data is correctly indexed, we should see a document count of 4, because we indexed an array of four items. Now, import this function inside the server.js file.
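As an aside, client.cat.indices({v: true}) resolves to a plain-text table whose header names the columns. A small helper can pull the docs.count value out of such a table; the sample output below is illustrative only, not captured from a real cluster.

```javascript
// Parse the whitespace-separated table returned by the cat indices API
// and return the docs.count value for a given index name (or null if
// the index is not listed).
function docsCount(catOutput, indexName) {
  const lines = catOutput.trim().split('\n');
  const header = lines[0].trim().split(/\s+/);
  const countCol = header.indexOf('docs.count');
  const indexCol = header.indexOf('index');
  for (const line of lines.slice(1)) {
    const cols = line.trim().split(/\s+/);
    if (cols[indexCol] === indexName) return Number(cols[countCol]);
  }
  return null;
}

// Illustrative sample of the table shape (not real cluster output).
const sample =
`health status index   uuid                 pri rep docs.count docs.deleted store.size pri.store.size
yellow open   library abc123defghijklmnopq   5   1          4            0     36.2kb         36.2kb`;

console.log(docsCount(sample, 'library')); // 4
```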

// server.js

const express = require('express');
const elasticsearch = require('elasticsearch');
const fs = require('fs');
const app = express();

const PORT = 5000;
const verify = require('./verify');

const client = new elasticsearch.Client({
    host: '127.0.0.1:9200',
    log: 'error'
});

client.ping({ requestTimeout: 30000 }, function(error) {
    if (error) {
        console.error('elasticsearch cluster is down!');
    } else {
        console.log('Everything is ok');
    }
});


const bulkIndex = function bulkIndex(index, type, data) {
    let bulkBody = [];
  
    data.forEach(item => {
      bulkBody.push({
        index: {
          _index: index,
          _type: type,
          _id: item.id
        }
      });
  
      bulkBody.push(item);
    });
  
client.bulk({body: bulkBody})
    .then(response => {
      let errorCount = 0;
      response.items.forEach(item => {
        if (item.index && item.index.error) {
          console.log(++errorCount, item.index.error);
        }
      });
      console.log(
        `Successfully indexed ${data.length - errorCount}
         out of ${data.length} items`
      );
    })
    .catch(console.error);
  };

function indexData() {
    // readFileSync is synchronous, so no async/await is needed here
    const articlesRaw = fs.readFileSync('./data.json');
    const articles = JSON.parse(articlesRaw);
    console.log(`${articles.length} items parsed from data file`);
    bulkIndex('library', 'article', articles);
}

indexData();
verify();

app.listen(PORT, function() {
    console.log('Server is running on PORT:',PORT);
});

Save the file, go to the terminal, and restart the Node.js server. The console should log that four items were parsed and successfully indexed.


That means we have successfully indexed the data. Now, we can query the data and fetch results.

#Step 6: Query the elasticsearch and fetch the data.

First, we get all the journals from the elasticsearch.

Inside the root folder, create one file called search.js and add the following code in it.

// search.js

const elasticsearch = require('elasticsearch');

const client = new elasticsearch.Client({
    host: '127.0.0.1:9200',
    log: 'error'
 });

const search = function search(index, body) {
    return client.search({index: index, body: body});
  };
  
module.exports =  function searchData() {
    let body = {
      size: 4,
      from: 0,
      query: {
        match_all: {}
      }
    };
  
    search('library', body)
    .then(results => {
      console.log(`found ${results.hits.total} items in ${results.took}ms`);
      console.log(`returned journals:`);
      results.hits.hits.forEach(
        (hit, index) => console.log(
          hit._source.journal
        )
      )
    })
    .catch(console.error);
  };

We have already indexed the data, so now we just need to fetch all the journals from Elasticsearch. We pass two parameters to our search() helper, which wraps the client's search API:

  1. index name
  2. body

The body object contains the size of the result set and the query parameter. In a real application, the query would typically come from the client side, where the user types a search term into a search box. Here, we match all records and display each journal name in the console. Now, we can import this search.js file inside server.js.
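Before moving on, the size/from pair in the body deserves a closer look: from is the offset of the first hit and size is the page length, which together implement pagination. A small sketch of building such a body for a given zero-based page number (the helper name is mine, not part of the client library):

```javascript
// Build a search body for page `page` with `pageSize` hits per page.
// Defaults to match_all, the same query used in search.js above.
function searchBody(page, pageSize, query = { match_all: {} }) {
  return {
    size: pageSize,        // hits per page
    from: page * pageSize, // offset of the first hit
    query: query
  };
}

console.log(searchBody(0, 4));
// { size: 4, from: 0, query: { match_all: {} } }
console.log(searchBody(2, 10).from); // 20
```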

So, our server.js file looks like this.

// server.js

const express = require('express');
const elasticsearch = require('elasticsearch');
const fs = require('fs');
const app = express();

const PORT = 5000;
const verify = require('./verify');
const searchData = require('./search');

const client = new elasticsearch.Client({
    host: '127.0.0.1:9200',
    log: 'error'
});

client.ping({ requestTimeout: 30000 }, function(error) {
    if (error) {
        console.error('elasticsearch cluster is down!');
    } else {
        console.log('Everything is ok');
    }
});


const bulkIndex = function bulkIndex(index, type, data) {
    let bulkBody = [];
  
    data.forEach(item => {
      bulkBody.push({
        index: {
          _index: index,
          _type: type,
          _id: item.id
        }
      });
  
      bulkBody.push(item);
    });
  
client.bulk({body: bulkBody})
    .then(response => {
      let errorCount = 0;
      response.items.forEach(item => {
        if (item.index && item.index.error) {
          console.log(++errorCount, item.index.error);
        }
      });
      console.log(
        `Successfully indexed ${data.length - errorCount}
         out of ${data.length} items`
      );
    })
    .catch(console.error);
  };

function indexData() {
    // readFileSync is synchronous, so no async/await is needed here
    const articlesRaw = fs.readFileSync('./data.json');
    const articles = JSON.parse(articlesRaw);
    console.log(`${articles.length} items parsed from data file`);
    bulkIndex('library', 'article', articles);
}

// indexData();
// verify();
searchData();

app.listen(PORT, function() {
    console.log('Server is running on PORT:',PORT);
});

Here, I have commented out the two function calls because we have already indexed the data. This is not the best way to prevent reindexing, but for this example, it is enough. Save the file and check the console.


#Step 7: Search Specific Term.

Inside the root folder, create one file called search_term.js and add the following code.

// search_term.js

const elasticsearch = require('elasticsearch');

const client = new elasticsearch.Client({
    host: '127.0.0.1:9200',
    log: 'error'
});

const search = function search(index, body) {
    return client.search({index: index, body: body});
  };

module.exports = function searchTerm() {
    let body = {
      size: 4,
      from: 0,
      query: {
        match: {
          journal: {
            query: 'Fir',
            minimum_should_match: 2,
            fuzziness: 2
          }
        }
      }
    };

    console.log(`retrieving documents whose journal matches '${body.query.match.journal.query}' (displaying ${body.size} items at a time)...`);
    search('library', body)
    .then(results => {
      console.log(`found ${results.hits.total} items in ${results.took}ms`);
      if (results.hits.total > 0) console.log(`returned journals:`);
      results.hits.hits.forEach(hit => console.log(hit._source.journal));
    })
    .catch(console.error);
  };

Here, our main query part looks like this.

query: {
  match: {
    journal: {
      query: 'Fir',
      minimum_should_match: 2,
      fuzziness: 2
    }
  }
}

The query looks for the term Fir in the journal field. The fuzziness: 2 setting allows up to two single-character edits (insertions, deletions, or substitutions) between the query term and an indexed term, which is why Fir can still match First. The minimum_should_match option controls how many of the analyzed query terms must match when the query contains multiple terms.
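The edit-distance intuition behind fuzziness can be checked with a tiny Levenshtein implementation. This is a sketch for illustration only; Elasticsearch computes fuzzy matches internally with its own optimized automata, not code like this.

```javascript
// Classic dynamic-programming Levenshtein distance: the minimum number
// of single-character insertions, deletions, or substitutions needed
// to turn string `a` into string `b`.
function levenshtein(a, b) {
  // dp[i][j] = distance between a[0..i) and b[0..j)
  const dp = Array.from({ length: a.length + 1 }, (_, i) =>
    Array.from({ length: b.length + 1 }, (_, j) => (i === 0 ? j : j === 0 ? i : 0))
  );
  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      dp[i][j] = Math.min(
        dp[i - 1][j] + 1,                                   // deletion
        dp[i][j - 1] + 1,                                   // insertion
        dp[i - 1][j - 1] + (a[i - 1] === b[j - 1] ? 0 : 1)  // substitution
      );
    }
  }
  return dp[a.length][b.length];
}

// 'fir' -> 'first' needs two insertions, so it is within fuzziness: 2.
console.log(levenshtein('fir', 'first'));  // 2
console.log(levenshtein('fir', 'fourth')); // 4
```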

Import this file inside the server.js file.

// server.js

const express = require('express');
const elasticsearch = require('elasticsearch');
const fs = require('fs');
const app = express();

const PORT = 5000;
const verify = require('./verify');
const searchData = require('./search');
const searchTerm = require('./search_term');

const client = new elasticsearch.Client({
    host: '127.0.0.1:9200',
    log: 'error'
});

client.ping({ requestTimeout: 30000 }, function(error) {
    if (error) {
        console.error('elasticsearch cluster is down!');
    } else {
        console.log('Everything is ok');
    }
});

const bulkIndex = function bulkIndex(index, type, data) {
    let bulkBody = [];

    data.forEach(item => {
      bulkBody.push({
        index: {
          _index: index,
          _type: type,
          _id: item.id
        }
      });

      bulkBody.push(item);
    });

    client.bulk({body: bulkBody})
    .then(response => {
      let errorCount = 0;
      response.items.forEach(item => {
        if (item.index && item.index.error) {
          console.log(++errorCount, item.index.error);
        }
      });
      console.log(
        `Successfully indexed ${data.length - errorCount}
         out of ${data.length} items`
      );
    })
    .catch(console.error);
};

function indexData() {
    // readFileSync is synchronous, so no async/await is needed here
    const articlesRaw = fs.readFileSync('./data.json');
    const articles = JSON.parse(articlesRaw);
    console.log(`${articles.length} items parsed from data file`);
    bulkIndex('library', 'article', articles);
}

// indexData();
// verify();
// searchData();
searchTerm();

app.listen(PORT, function() {
    console.log('Server is running on PORT:',PORT);
});

Finally, save the file and see the result in the console.


So, we have searched all the records and also searched for a specific term. This is a basic Elasticsearch tutorial in Node.js, and you can find more in the official documentation.

Finally, the Node js Elastic Search Tutorial Example is over. I have put this code on Github.

Github Code
