TryHackMe: Elastic Stack – The Basics (SOC Level 1) – under revamp

Welcome to this walkthrough of the Investigating with ELK 101 Room on TryHackMe. ELK stands for Elasticsearch, Logstash, and Kibana, a powerful trio of open-source tools used for search, logging, and data visualization. Elasticsearch handles fast search and analytics, Logstash processes and ingests data, and Kibana offers an intuitive dashboard for visualizing results. Together, they provide a scalable and flexible solution for managing large volumes of data in real-time.

Elastic Stack: The Basics Banner

Room URL:
https://tryhackme.com/room/investigatingwithelk101

I am making these walkthroughs to keep myself motivated to learn cyber security, and to make sure I retain the knowledge gained from these challenges on HTB and THM. Join me in learning cyber security. I will try to explain concepts as I go, to differentiate these walkthroughs from others.



Task 1: Introduction

In this room, we will learn how the Elastic Stack (ELK) can be used for log analysis and investigations. Although ELK is not a traditional SIEM, many SOC teams use it like one because of its data searching and visualization capabilities. We will explore the components of ELK and learn how log analysis can be performed with it. We will also explore creating visualizations and dashboards in ELK.

Learning Objectives

This room has the following learning objectives:

  • Understand the components of ELK and their use in SOC
  • Explore the different features of ELK 
  • Learn to search and filter data in ELK
  • Investigate VPN logs to identify anomalies
  • Familiarize with creating visualizations and dashboards in ELK

Questions

I am all set!

Answer: No answer needed


Task 2: ElasticStack Overview

The Elastic Stack (also known as ELK Stack) is a collection of open-source components that work together to help users gather, search, analyze, and visualize data in real-time from various sources and formats. Here’s a brief overview of the components:

  1. Elasticsearch: A full-text search and analytics engine that stores and analyzes JSON-formatted documents. It supports a RESTful API for data interaction and is essential for data analysis, correlation, and storage.
  2. Logstash: A data processing engine that ingests data from multiple sources, applies filters or normalization, and sends the processed data to destinations such as Elasticsearch or Kibana. It has three main parts:
    • Input: Defines where the data comes from (supports various plugins).
    • Filter: Specifies how to normalize or transform the data (supports various plugins).
    • Output: Specifies where the data should be sent (e.g., Elasticsearch, Kibana, files).
  3. Beats: Lightweight, single-purpose agents (data shippers) that collect and send specific data from endpoints to Elasticsearch. Examples include Winlogbeat (for Windows event logs) and Packetbeat (for network traffic).
  4. Kibana: A web-based visualization tool that works with Elasticsearch to analyze, investigate, and visualize data in real-time. It allows users to create dashboards and visualizations to better understand the data.
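The Input, Filter, and Output stages above map directly onto a Logstash pipeline configuration file. A minimal sketch follows; the file path, port, field names, and index name are illustrative assumptions, not taken from the room:

```conf
# hypothetical /etc/logstash/conf.d/vpn.conf -- illustrative only
input {
  beats {
    port => 5044                            # receive events shipped by Beats agents
  }
}

filter {
  mutate {
    rename => { "src_ip" => "Source_ip" }   # normalize a field name during ingestion
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "vpn_connections"              # the index later searched from Kibana
  }
}
```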

How They Work Together:

  • Beats collect data from various sources (e.g., endpoint logs, network traffic) and send it to Logstash.
  • Logstash processes and normalizes the data, then stores it in Elasticsearch.
  • Elasticsearch serves as the database for storing and searching the data.
  • Kibana provides a user-friendly interface to visualize and interact with the data stored in Elasticsearch, turning it into insightful charts and dashboards.

Questions

Logstash is used to visualize the data. (yay/nay)

Logstash processes and normalizes the data; Kibana is the visualization component. So this is a nay.

Answer: nay

ElasticStack supports all data formats apart from JSON. (yay / nay)

Elasticsearch stores and analyzes JSON-formatted documents, so the claim is backwards: JSON is exactly the format it supports. Nay!

Answer: nay


Task 3: Lab Connection

Before proceeding with the following tasks, start the attached virtual machine by clicking the Start Machine button in the THM room.

The machine may take 3-5 minutes to start. After the machine starts, the ELK Instance can be accessed at http://MACHINE_IP if you are connected with the TryHackMe VPN. If you are not, you can open AttackBox and access the ELK instance by copying and pasting the MACHINE_IP into its web browser. 

Use the following credentials for the ELK instance.

Username: Analyst

Password: analyst123

When you open the ELK instance through this task, each upcoming task will guide you through the features in detail and ask you some questions. These questions can be comfortably answered if you follow along with the tasks.

Questions

Move to the next task!

Answer: No answer needed


Task 4: Discover Tab

The Discover tab is where SOC analysts spend most of their investigation time. It allows searching, filtering, and analyzing raw log data.

Main Components

1. Logs

  • Each row represents a single log event.
  • Expanding a log shows all parsed fields and values.

2. Fields Pane (Left Side)

  • Shows all available fields from the selected logs.
  • Clicking a field shows its top 5 values and frequency.
  • You can quickly apply filters:
    • + = include only logs containing the value
    • – = exclude logs with that value

3. Index Pattern

  • Defines which Elasticsearch indices you are viewing.
  • Different log sources have different index patterns (e.g., vpn_connections).
  • An index pattern can include multiple indices.

4. Search Bar

  • Allows writing queries and applying manual filters.
  • Used to narrow down logs based on keywords, fields, or conditions.

5. Time Filter

  • Filters logs based on a chosen time range (last 15 min, last 24 hours, custom, etc.).

6. Timeline Chart

  • Shows event counts over time.
  • Useful for spotting spikes and anomalies.
  • Clicking a bar limits results to that time period.

7. Top Bar

  • Options to save searches, load saved searches, and share.

8. Add Filter

  • Button to create structured filters without typing a query manually.

9. Create Table

  • You can choose specific fields to display, creating a cleaner, structured table.
  • Helps remove noise.
  • The table view can be saved and reused.
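The fields pane behaviour described above (click a field, see its top values and their frequency) is essentially a value count. A minimal Python sketch over hypothetical log rows, with field names mirroring the room's index:

```python
from collections import Counter

# hypothetical parsed log events; field names mirror the vpn_connections index
logs = [
    {"UserName": "james", "Source_ip": "238.163.231.224"},
    {"UserName": "james", "Source_ip": "238.163.231.224"},
    {"UserName": "Emanda", "Source_ip": "107.14.1.247"},
    {"UserName": "Albert", "Source_ip": "238.163.231.224"},
]

def top_values(events, field, n=5):
    """Mimic the fields pane: the top-n values of a field with their counts."""
    return Counter(e[field] for e in events if field in e).most_common(n)

print(top_values(logs, "Source_ip"))
# [('238.163.231.224', 3), ('107.14.1.247', 1)]
```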

Questions

Select the index vpn_connections and filter from 31st December 2021 to 2nd Feb 2022. How many hits are returned?

Use the Time Filter in the top right corner to adjust the time filter from the 31st of December 2021 to 2nd February 2022. Press refresh. You should see 2,861 hits left after filtering.

Filtered date hits

Answer: 2861

Which IP address has the max number of connections?

Look at the Fields Pane. Here you should be able to find the Source_ip field.

Max traffic IP

Select it, and you should be able to see the top 5 values. The first one has the highest number of connections.

Answer: 238.163.231.224

Which user is responsible for max traffic?

The process here is similar. Find username in the fields pane, select it and find the top value:

Answer: james

Create a table with the fields IP, UserName, Source_Country and save.

Select all columns using the “toggle column in table” button.

Answer: No answer needed

Apply Filter on UserName Emanda; which SourceIP has max hits?

Select the UserName field in the Fields Pane. Find Emanda and click the little + icon beside it. This filters away all documents that do not belong to Emanda.

Now, select the SourceIP field (it should be under Selected fields).

As before, simply look at the top value – 107.14.1.247 in this case.

Answer: 107.14.1.247

On 11th Jan, which IP caused the spike observed in the time chart?

Remove the filtering on Emanda. On the timeline, select the bar containing the 11th of January:

Now the data gets filtered on this specific date. Now you can just select the Source_ip field once more in the fields pane and see the top 5 values:

The IP is 172.201.60.191.

Answer: 172.201.60.191

How many connections were observed from IP 238.163.231.224, excluding the New York state?

Remove the timeline filter (you should be able to press back in your browser). Now select the SourceIP field, find the 238.163.231.224 value, and press the + icon beside it. Then select the Source_State field and find New York. Instead of pressing the + icon to filter ON the value, we press the – icon to exclude all documents that have New York as the field value.

Now simply find the total documents left:

48 connections are left!

Answer: 48


Task 6: KQL Overview

Kibana Query Language (KQL) is a powerful tool for searching logs and documents in Elasticsearch. It supports both free text and field-based search, making it flexible for different use cases.

  • Simply typing a word (e.g., security) returns documents containing that term, regardless of the field.
  • KQL matches whole words only: searching for Unit won't match United.
  • Use wildcards like United* to match partial terms.

Logical Operators

KQL supports:

  • OR — e.g., "United States" OR "England"
  • AND — e.g., "United States" AND "Virginia"
  • NOT — e.g., "United States" AND NOT ("Florida")

Target specific fields with the format FIELD : VALUE:

  • Example: Source_ip : 238.163.231.224 AND UserName : Suleman
  • As you type, Kibana suggests available fields for easier query building.

To dive deeper into KQL, check the official Elastic guide.

Questions

Create a search query to filter out the logs from Source_Country as the United States and show logs from User James or Albert. How many records were returned?

If you are familiar with SQL and logic statements, this one should be easy. Read the theory in the room and the query should be easy to construct:

Source_Country : "United States" and (UserName : "James" or UserName : "Albert")

Answer: 161

As User Johny Brown was terminated on 1st January 2022, create a search query to determine how many times a VPN connection was observed after his termination.

The first part is easy, but the date filtering took me some googling. You could also use the filter function instead of KQL, but if you want to learn something new, the syntax is as follows:

UserName : "Johny Brown" and @timestamp > "2022-01-01T00:00:00.000Z"

Answer: 1
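The timestamp comparison in that query is ordinary ISO-8601 ordering. A minimal Python sketch of the same filter over hypothetical events (the timestamps below are made up for illustration):

```python
from datetime import datetime, timezone

# hypothetical VPN events in the ISO-8601/UTC form Kibana displays
events = [
    {"UserName": "Johny Brown", "@timestamp": "2021-12-28T09:15:00.000Z"},
    {"UserName": "Johny Brown", "@timestamp": "2022-01-14T11:30:00.000Z"},
    {"UserName": "james",       "@timestamp": "2022-01-02T08:00:00.000Z"},
]

cutoff = datetime(2022, 1, 1, tzinfo=timezone.utc)

def ts(event):
    # fromisoformat() on Python < 3.11 doesn't accept a trailing 'Z', so replace it
    return datetime.fromisoformat(event["@timestamp"].replace("Z", "+00:00"))

# UserName : "Johny Brown" and @timestamp > "2022-01-01T00:00:00.000Z"
after_termination = [e for e in events
                     if e["UserName"] == "Johny Brown" and ts(e) > cutoff]
print(len(after_termination))  # 1
```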


Task 7: Creating Visualizations

The Visualization tab in Kibana helps transform your log data into clear, visual insights using tables, pie charts, bar charts, and more.

Creating Visualizations

  • Access it via the Discover tab by clicking on a field and selecting “Visualize”.
  • Choose from multiple chart types like tables, pie charts, or bar charts.

Correlation Views

  • You can correlate fields by dragging them into the central panel (e.g., Source_Country vs Source_IP).

Example: Pie Chart for Top 5 Source Countries

  • Visualize the top origins of traffic by building a pie chart using the Source_Country field.

Example: Table View

  • Create tables comparing data like IP addresses and the number of logs per country.

Saving & Sharing

  • After creating a visualization:
    1. Click Save in the top-right corner.
    2. Add a descriptive title and description.
    3. Choose to add it to an existing or new dashboard.
    4. Click Save and add to library.

Questions

Which user was observed with the greatest number of failed attempts?

Click on the UserName field and press Visualize:

This requires some playing around. I also dragged the action field (which includes the failed action!) into the graph, and changed the visualization type to table. Make sure UserName is in the Rows section and action in the Columns. Now we can sort on failed:

Simon has the highest number of failed connections. Well, actually he is the only one with failed connections.

Answer: Simon
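The Rows/Columns table described above is just a grouped count of UserName against action. A minimal Python sketch with hypothetical events (the counts here are invented, not the room's data):

```python
from collections import Counter

# hypothetical events mirroring the UserName/action fields used in the room
events = [
    {"UserName": "Simon", "action": "failed"},
    {"UserName": "Simon", "action": "failed"},
    {"UserName": "james", "action": "successful"},
    {"UserName": "Simon", "action": "successful"},
]

# rows = UserName, columns = action, cells = count (like the table visualization)
table = Counter((e["UserName"], e["action"]) for e in events)

# sort on the "failed" column to find the user with the most failed attempts
failed_by_user = Counter({u: c for (u, a), c in table.items() if a == "failed"})
print(failed_by_user.most_common(1))  # [('Simon', 2)]
```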

How many wrong VPN connection attempts were observed in January?

This is a confusing one! I again added action on the horizontal axis, but this time broke it down by @timestamp. On the vertical axis I used the count() function.

Number of failed connections

Looking at the dates, all the failed connections are in the left bar (green), but that correlates to a December date. So I am not sure why 274 is the answer; I had thought it should be the blue bar (which is 0 in the screenshot above, under "failed"). But anyway, 274 seems to be the answer.

Answer: 274


Task 8: Creating Dashboards

Dashboards in Kibana offer clear visibility into log data and can be tailored for specific needs—like monitoring VPN activity.

How to Build a Custom Dashboard

  1. Go to the Dashboard tab and click “Create dashboard.”
  2. Click “Add from Library” to insert saved visualizations and searches.
  3. Arrange and resize the components as needed.
  4. Click Save when you’re done to preserve your layout.

This approach lets you combine insights from various sources into a single, easy-to-read view.

Questions

Create the dashboard containing the available visualizations.

I will leave this one to you 🙂

Answer: No answer needed


Task 9: Conclusion

In this room, we briefly explored the ELK components and then focused on the Kibana interface and its features. While exploring the Kibana interface, we learned how to:

  • Create a search query to search the logs
  • Apply filters to narrow down the results
  • Create visualizations and dashboards
  • Investigate VPN logs

Questions

Complete

Answer: No answer needed


Congratulations on completing Investigating with ELK 101

Congratulations on completing Investigating with ELK 101. I hope you can see why Kibana and Elasticsearch are such powerful and popular pieces of the Elastic Stack. They really make investigating logs quicker and more fun!

Come back soon for more walkthroughs of rooms on TryHackMe and HackTheBox, and other Cybersecurity discussions.

Find my other walkthroughs here.

Like my articles?

You are welcome to comment on this post, or share my post with friends. I would be even more grateful if you support me by buying me a cup of coffee:

Buy me a coffee

I learned a lot through HackTheBox’s Academy. If you want to sign up, you can get extra cubes, and support me in the process, if you use the following link:

https://referral.hackthebox.com/mzwwXlg
