
What is Logstash?


Logstash is a free, open source, server-side data collection and processing engine with dynamic pipelining capabilities. It ingests data from multiple sources, transforms it, and then sends it to a destination of your choice. Logstash also cleans and reshapes the data for advanced downstream analytics and visualization use cases. It is the L in the ELK stack (Elasticsearch, Logstash, and Kibana) and is typically responsible for shipping data to Elasticsearch.

Effectively administering Linux systems requires familiarity with the locations and likely contents of logs. Application and system-level logs provide insight into an application's or system's behavior that may not be apparent on the surface. When many cooperating systems each produce their own logs, aggregating and processing that information becomes imperative. This is where Logstash shines.

How to Install Logstash

Event Pipeline

A Logstash event pipeline is composed of three separate stages.

  • The input stage is much as it sounds: it is the mechanism by which Logstash receives events, whether from the file system, Redis, Beats agents, or other sources.
  • The second stage, filters, is responsible for data processing: turning unstructured data into structured data and, optionally, triggering actions when certain conditions are met.
  • The final stage, outputs, is the landing place for the data in the pipeline. Data can be sent to Elasticsearch, the file system, Graphite, or any number of other destinations.
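These three stages map directly to the three sections of a Logstash pipeline configuration file. As an illustrative sketch only (the log path, grok pattern, and Elasticsearch host below are hypothetical, not taken from this tutorial):

```
input {
  file {
    path => "/var/log/nginx/access.log"   # hypothetical source file
    start_position => "beginning"
  }
}

filter {
  grok {
    # Parse combined-format access logs into structured fields.
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]           # hypothetical destination
  }
}
```

Each section accepts one or more plugins, so a single pipeline can read from several inputs and write to several outputs at once.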

Because Logstash is both widely used and open source, an abundant plugin ecosystem exists to support each stage of the event pipeline. These plugins make hooking Logstash up to various other services a snap. In this tutorial, we will cover how to install Logstash on an Ubuntu 18.04 server via the package manager apt.

Preflight Check

  • These instructions are performed as the root user on a Liquid Web Self-Managed Ubuntu 18.04 LTS server.
  • The user has a working knowledge of the command line interface (CLI).
  • This tutorial assumes there is a working installation of Java available on the server.

Install Dependencies

Because Logstash runs on the Java Virtual Machine, we need to ensure a Java Development Kit (JDK) is installed. We can check for a Java installation on our Ubuntu server using this command.

root@ubuntu18:~# java -version
-bash: java: command not found

If Java is not installed, you can run the command below to install it or review our KB article for more detailed instructions.

root@ubuntu18:~# apt install openjdk-8-jdk

Now, we can re-verify our Java JDK installation by rerunning the following command.

root@ubuntu18:~# java -version
openjdk version "13.0.2" 2020-01-14
OpenJDK Runtime Environment (build 13.0.2+8)
OpenJDK 64-Bit Server VM (build 13.0.2+8, mixed mode, sharing)

Prepare the Environment

As a best practice, always update the system packages first by running the following command.

root@ubuntu18:~# apt update -y

Next, run the following wget command to pull down and install the Public Signing Key for the Elastic package repositories.

root@ubuntu18:~# wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -

The next step may not be necessary on all systems, but to be certain that all prerequisite packages are available, install the following package.

root@ubuntu18:~# apt install apt-transport-https -y
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following NEW packages will be installed:
  apt-transport-https
0 upgraded, 1 newly installed, 0 to remove and 80 not upgraded.
Need to get 0 B/1692 B of archives.
After this operation, 153 kB of additional disk space will be used.
Selecting previously unselected package apt-transport-https.
(Reading database ... 154597 files and directories currently installed.)
Preparing to unpack .../apt-transport-https_1.6.12ubuntu0.1_all.deb ...
Unpacking apt-transport-https (1.6.12ubuntu0.1) ...
Setting up apt-transport-https (1.6.12ubuntu0.1) ...

The final step before installing Logstash via apt is to add the repository it will be pulled from.

root@ubuntu18:~# echo "deb https://artifacts.elastic.co/packages/7.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-7.x.list
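The repository setup in this section can also be collected into one short script. This is a sketch that assumes the standard Elastic 7.x apt repository and signing-key URLs; the privileged steps are shown as comments because they modify system state and require root.

```shell
#!/bin/sh
# The repository line appended to /etc/apt/sources.list.d/elastic-7.x.list
# (assumes the standard Elastic 7.x apt repository URL).
ELASTIC_REPO="deb https://artifacts.elastic.co/packages/7.x/apt stable main"
echo "$ELASTIC_REPO"

# Privileged steps from this section, collected for reference:
# wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | apt-key add -
# echo "$ELASTIC_REPO" | tee -a /etc/apt/sources.list.d/elastic-7.x.list
# apt update -y
```

Keeping the repository line in a variable makes it easy to verify what will be written before touching sources.list.d.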

Install Logstash

Now that the Logstash repository has been added, apt needs to be updated so it is aware of the new source.

root@ubuntu18:~# apt update -y

After that is finished, install Logstash like any other package.

root@ubuntu18:~# apt install logstash -y

Next up, verify that Logstash was installed properly by running the following command.

root@ubuntu18:~# /usr/share/logstash/bin/logstash -V
logstash 7.8.1

To further test the Logstash installation, kick off the most basic Logstash pipeline.

root@ubuntu18:~# /usr/share/logstash/bin/logstash -e 'input { stdin { } } output { stdout {} }'
[INFO ] 2020-08-13 16:15:55.703 [LogStash::Runner] runner - Starting Logstash {"logstash.version"=>"7.8.1", "jruby.version"=>"jruby (2.5.7) 2020-03-25 b1f55b1a40 OpenJDK 64-Bit Server VM 11.0.8+10-post-Ubuntu-0ubuntu118.04.1 on 11.0.8+10-post-Ubuntu-0ubuntu118.04.1 +indy +jit [linux-x86_64]"}
[INFO ] 2020-08-13 16:15:57.792 [Converge PipelineAction::Create] Reflections - Reflections took 26 ms to scan 1 urls, producing 21 keys and 41 values
[INFO ] 2020-08-13 16:15:58.601 [[main]-pipeline-manager] javapipeline - Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["config string"], :thread=>"#"}
[INFO ] 2020-08-13 16:15:59.318 [[main]-pipeline-manager] javapipeline - Pipeline started {"pipeline.id"=>"main"}
The stdin plugin is now waiting for input:
[INFO ] 2020-08-13 16:15:59.363 [Agent thread] agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[INFO ] 2020-08-13 16:15:59.603 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}

Once the output stops, type hello world into the console and press enter. The output should look something like this.

hello world
{
          "host" => "",
      "@version" => "1",
       "message" => "hello world",
    "@timestamp" => 2020-08-11T03:14:25.951Z
}

Hold the ctrl key and press D to exit Logstash.
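Beyond the inline -e test above, pipelines are normally saved as configuration files under /etc/logstash/conf.d/, where the Logstash service picks them up automatically. A minimal sketch that tails the system log and forwards events to a local Elasticsearch instance (the filename, path, and host here are illustrative assumptions, not from this tutorial):

```
# /etc/logstash/conf.d/syslog.conf (hypothetical filename)
input {
  file {
    path => "/var/log/syslog"
    start_position => "beginning"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]   # assumes Elasticsearch is running locally
  }
}
```

With a file like this in place, the pipeline can be tested directly with /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/syslog.conf, and the apt-installed systemd service can then be started with systemctl start logstash.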


There you have it! Logstash is now installed and ready to start pulling in, aggregating, and handling logs from the available sources. Logstash can now act as a data pipeline, ingesting logs shipped to it and passing them off to other services. This is an essential step toward centralizing, searching, and visualizing logs. Whether you have a single Liquid Web VPS spun up serving a single website or multiple nodes providing access to an API, Logstash can ingest the log files on those servers, aggregate them, and ship them where they need to go.



About the Author: Justin Palmer

Justin Palmer is a professional application developer with Liquid Web
