Elasticsearch has become increasingly popular over the last few years and is indispensable for many application scenarios involving the search and analysis of large data sets. Elasticsearch is an open-source search engine based on Apache Lucene. It works with indices consisting of schema-free JSON documents. The search engine is very fast, scales to large amounts of data, and supports distributed architectures for high availability. Together with Kibana and Logstash, Elasticsearch forms the Elastic Stack. You can deploy Elasticsearch on-premises or on any common cloud provider (AWS, Azure, or Google Cloud).
In this article, we will create an end-to-end working search application for quotes, built as an ASP.NET Blazor application with .NET 5 and C#. After reading this article, you will be able to set up your elastic.co cloud environment, seed your first index, and search within your application.
You can find a full working copy of the code at the end of the article.
This article won’t focus on how to install Elasticsearch on your local machine. Instead, we make use of the official cloud offering that can be found at https://cloud.elastic.co/. However, the example will also work with your local installation or any other on-premises or cloud installation.
- Go to https://cloud.elastic.co/ and register your account. You will get a free 14-day trial.
- Create a deployment and choose your favorite cloud provider. After provisioning, download the credentials.
- Switch to the Elasticsearch menu entry and copy your endpoint data.
- In Visual Studio, create a Blazor Server App.
- Install the official Elasticsearch .NET client by running Install-Package NEST in the NuGet Package Manager Console.
- Go to appsettings.json (or better, user secrets) and configure an ElasticSearch block with the credentials taken from the last section.
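Such a configuration block might look like the following sketch. The key names (Url, User, Password) are an assumption and must match whatever your client code reads; the endpoint value is a placeholder for the one you copied from your deployment:

```json
{
  "ElasticSearch": {
    "Url": "https://<your-deployment>.es.<region>.cloud.es.io:9243",
    "User": "elastic",
    "Password": "<password from the credentials download>"
  }
}
```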
Writing the Elastic Search Client
The ElasticSearchClient will handle all requests and responses via the NEST client. Create an ElasticSearchClient.cs class and copy the code from here. Register the ElasticSearchClient in your Startup.cs.
You will notice that the client is built up in the constructor. A default index name, “quotes”, is provided there, which makes specifying the index with each call superfluous.
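A minimal sketch of such a client could look like this. It is an assumption, not the exact code from the linked gist: the configuration key names and the NestedClient property name are hypothetical, and the NEST calls shown (ConnectionSettings, BasicAuthentication, DefaultIndex) are the standard way to configure the client:

```csharp
using System;
using Microsoft.Extensions.Configuration;
using Nest;

public class ElasticSearchClient
{
    // The NEST client used for all index and search requests.
    public ElasticClient NestedClient { get; }

    public ElasticSearchClient(IConfiguration configuration)
    {
        // Read endpoint and credentials from the "ElasticSearch" config block
        // (key names are an assumption; adjust to your configuration).
        var section = configuration.GetSection("ElasticSearch");

        var settings = new ConnectionSettings(new Uri(section["Url"]))
            .BasicAuthentication(section["User"], section["Password"])
            // Default index so individual calls don't have to specify it.
            .DefaultIndex("quotes");

        NestedClient = new ElasticClient(settings);
    }
}
```

Registering it as a singleton in Startup.cs, for example with services.AddSingleton<ElasticSearchClient>();, makes it available to pages and services via dependency injection.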
Create Data Seeder
For this project, we would like to search quotes from historical personalities. Fortunately, we can find such a variety on the Internet, even prepared as JSON files.
- Create a Blazor page called “DataSeeder”, link it to the default menu, and use the code provided here and here.
- Create a POCO class called “QuotesModel” with the code from here.
- Create a class called ElasticSearchDataSeeder and get the code from here
- Register the ElasticSearchDataSeeder in your Startup.cs.
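As a sketch, the QuotesModel POCO might look like the following. The property names are assumptions and must match the fields of the JSON quotes file you actually use:

```csharp
public class QuotesModel
{
    // The person the quote is attributed to.
    public string Author { get; set; }

    // The quote text itself.
    public string Text { get; set; }
}
```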
The SeedAsync method will first fetch the quotes from a GitHub repository and then deserialize them into POCO objects. Afterward, it will delete the existing index for cleanup and create a new one. Finally, it will push the values to the service.
- Run the application and go to the new DataSeeder page to start seeding there.
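The seeding steps described above can be sketched as follows. This is an assumption-laden illustration, not the linked code: the quotes URL is a placeholder, and the client is assumed to expose the NEST ElasticClient through a hypothetical NestedClient property:

```csharp
using System.Collections.Generic;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;
using Nest;

public class ElasticSearchDataSeeder
{
    // Placeholder URL for a raw JSON file containing an array of quotes.
    private const string QuotesUrl =
        "https://raw.githubusercontent.com/<user>/<repo>/master/quotes.json";

    private readonly ElasticClient _client;
    private readonly HttpClient _httpClient;

    public ElasticSearchDataSeeder(ElasticSearchClient client, HttpClient httpClient)
    {
        _client = client.NestedClient; // hypothetical property name
        _httpClient = httpClient;
    }

    public async Task SeedAsync()
    {
        // 1. Download and deserialize the quotes into POCOs.
        var json = await _httpClient.GetStringAsync(QuotesUrl);
        var quotes = JsonSerializer.Deserialize<List<QuotesModel>>(json,
            new JsonSerializerOptions { PropertyNameCaseInsensitive = true });

        // 2. Delete the existing index for cleanup (a 404 is harmless on first run).
        await _client.Indices.DeleteAsync("quotes");

        // 3. Recreate the index with an auto-mapped QuotesModel.
        await _client.Indices.CreateAsync("quotes",
            c => c.Map<QuotesModel>(m => m.AutoMap()));

        // 4. Bulk-index all quotes into the "quotes" index.
        await _client.IndexManyAsync(quotes);
    }
}
```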
Create search page
Now that we have successfully developed the client and the DataSeeder, it is time to create the search page. For this, we will take a simple search mask and display the results in a table.
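The search itself can be sketched with a NEST multi_match query across the author and text fields. The field selection is an assumption based on the QuotesModel described earlier, and NestedClient is again a hypothetical property name on the client:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Nest;

public class QuoteSearchService
{
    private readonly ElasticClient _client;

    public QuoteSearchService(ElasticSearchClient client)
        => _client = client.NestedClient; // hypothetical property name

    public async Task<IReadOnlyCollection<QuotesModel>> SearchAsync(string term)
    {
        // multi_match searches several fields with a single query string.
        var response = await _client.SearchAsync<QuotesModel>(s => s
            .Size(25) // cap the number of hits shown in the table
            .Query(q => q
                .MultiMatch(m => m
                    .Fields(f => f
                        .Field(p => p.Author)
                        .Field(p => p.Text))
                    .Query(term))));

        return response.Documents;
    }
}
```

On the Blazor page, you would bind a search input to this method and render the returned documents as rows in the results table.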
Visualize data with Kibana
While this section is not needed for setting up the project, it shows how powerful the ElasticSearch ecosystem is:
- Go back to your Elastic Cloud page.
- Open your Elastic Search deployment.
- Launch Kibana from the application menu.
- Click the “Manage” button and then choose the “Index Patterns” entry under the Kibana menu.
- Create an index pattern called “quotes”, since this was the index we chose in our application.
- Go back to the Kibana start page and click the “Visualize and analyze” tile.
- Choose “Discover” and then select the “quotes” entry from the left picklist. You can now start to search for elements. You will notice that Kibana already comes up with suggestions.
In this article, we created and configured a working Elasticsearch cloud environment. After that, we created a fully functional client to index new data and retrieve existing data. This client can easily be adapted for further projects. The data is displayed on a Blazor page.
You can use this article as a starting point for your own project, and you will see how relatively easy it is to map even complex cases like website search or product search in your .NET application.
Get the code from here.