For most modern web applications, it is very important to have information about the requests reaching your backend servers so you can tell whether your application is performing as expected. Being able to rely on historical data about your application's API (like the number of requests grouped by resource type) can be very useful. When you run in a cloud environment like AWS, you do not need to worry much about this, because the provider can give you all that information. However, if you are running on your own infrastructure, you have to come up with a solution yourself. That is what led me to the subject of this post.
Below is a general view of the deployment in my case:
Image 1: general deployment view
The main idea is to have Logstash installed on each middle-tier server, parsing the IIS logs and sending the parsed data to Elasticsearch. This data can then be queried using Kibana to produce graphs and useful information about what is happening with your application.
Now, let's see how I achieved this. The purpose here is to share what I needed to do without going too deep into details, giving you the background to accomplish the same thing in your own scenario.
IIS Logs: Installing IIS Advanced Logging and configuring it
I chose Advanced Logging because of its extra fields and configuration capabilities. It can be installed using the "Web Platform Installer" option (found in the IIS Manager interface, at the server level), or it can be downloaded at:
Once you install it, don't forget to disable the default IIS logging; you don't want IIS generating two different log files. You can do this by clicking on "Logging" => "Disable" (in the Actions pane on the right side of IIS Manager). In the same way, enable Advanced Logging by clicking on "Advanced Logging" => "Enable Advanced Logging". After that, you can tweak the configuration to get the log file format you want:
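If you have several servers to prepare, the "disable default logging" part can also be scripted with appcmd instead of clicking through the UI. The sketch below is only an assumption on my part (I did these steps through the UI as described above, and enabling Advanced Logging itself is not covered here), so double-check the httpLogging section against your own applicationHost.config before relying on it:
@echo off
REM Sketch: disable the default IIS logging at the server level from the command line.
REM Assumes the standard system.webServer/httpLogging section; verify it in applicationHost.config.
%windir%\system32\inetsrv\appcmd.exe set config /section:system.webServer/httpLogging /dontLog:"True" /commit:apphost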
Image 2: IIS Advanced Logging configuration
Most settings are fine at their defaults. The main concerns are the order of the fields in the log file (which you can set using the options shown in the image above) and which fields should be part of the log (clicking the "Select Fields..." button brings up the following window):
Image 3: IIS Advanced Logging configuration - Select Logging Fields
It can be "a pain in the ass" to configure it for more than one server. Not surprisingly, it is possible to obtain what you have done in one server and replicate to others. IIS 7.5 on Windows server 2008 stores these configurations in the applicationHost.config, which can be found in the path "c:\windows\system32\inetsrv\config". Inside the "<system.webServer>" configuration section of this file, you can find the "<advancedLogging>" element. Copy and replace it in another server's file. I have provided my configuration section in this .txt file. This way, if you want, you can use my Logstash configuration file too (further in this post) and save some time.
Logstash: Installing and configuring
Logstash will be in charge of reading data from the IIS log files, filtering and parsing it, and sending it to Elasticsearch. Logstash can be downloaded from the Elastic website. Details about how to install Logstash 1.5 (the version used in my solution) can be found here. After getting Logstash installed, you should create and configure your Logstash .conf file. You can name it whatever you like; just pass it to the Logstash startup command with the "-f" option (see the example further below). Basically, a Logstash configuration file has three sections:
input section
You can use this section to define where Logstash's incoming data comes from. A typical configuration for reading IIS log files looks like this:
input {
  file {
    type => "IIS Advanced Log"
    path => "C:/inetpub/logs/AdvancedLogs/*.log"
  }
}
filter section
Basically, this section is used to process and filter the incoming data: parsing fields, converting types, dropping lines you do not care about, and so on.
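Just as an illustration (this is not the exact filter from the .conf file I shared), a minimal filter for this kind of log could look like the sketch below. The field names and their order in the grok pattern are assumptions and must match the field order you configured in IIS Advanced Logging:
filter {
  if [type] == "IIS Advanced Log" {
    # IIS writes comment/header lines starting with "#"; drop them
    if [message] =~ /^#/ {
      drop { }
    }
    # Parse the space-separated fields. The names and order below are an
    # assumption -- adjust them to the field order you set in IIS.
    grok {
      match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} %{IPORHOST:clientip} %{WORD:verb} %{URIPATH:uripath} %{NUMBER:response:int} %{NUMBER:timetaken:int}"]
    }
    # Use the timestamp from the log line as the event timestamp
    date {
      match => ["log_timestamp", "ISO8601"]
    }
  }
}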
output section
You can use this section to tell Logstash what to do with the processed data, such as sending it to an Elasticsearch server.
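Again as a hedged sketch (the host name and index pattern below are placeholders, not values from my setup), an output section for Logstash 1.5 that sends events to Elasticsearch over HTTP could look like this:
output {
  elasticsearch {
    # Placeholder: point this at your Elasticsearch server
    host => "your-elasticsearch-host"
    protocol => "http"
    # One index per day
    index => "iis-logs-%{+YYYY.MM.dd}"
  }
  # Optional: also print events to the console while testing
  stdout { codec => rubydebug }
}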
There is a lot of information on the Elastic website about how to configure and take advantage of all the features each of these sections provides. As I said before, I have shared my .conf file. If you use it together with the IIS Advanced Logging configuration shown above, you should be able to get this working just like I did.
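Before turning Logstash into a service, it is worth running it manually once to confirm the configuration loads without errors. Assuming the same install path used in the service script below and a .conf file saved under the bin folder (both are assumptions; adjust them to your own layout), the test run looks roughly like this:
cd c:\Elastic\logstash-1.5.2
bin\logstash.bat -f bin\logstash.conf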
In order to keep Logstash running as a Windows service, I used NSSM, the Non-Sucking Service Manager, and it has worked well. Below is an example of a script that installs Logstash as a Windows service:
@echo off
set nssm_path=c:\nssm\nssm-2.24\win64
set logstash_path=c:\Elastic\logstash-1.5.2
echo.
echo Installing logstash as a service...
echo.
echo Expected logstash path: %logstash_path%
echo Expected nssm path: %nssm_path%
echo.
echo.
cd %nssm_path%
nssm install logstash %logstash_path%\bin\logstash.bat
nssm set logstash AppParameters -f %logstash_path%\bin\[your logstash config file] -l %logstash_path%\log\logs.log
nssm set logstash AppDirectory %logstash_path%
nssm set logstash AppEnvironmentExtra "JAVA_HOME=%JAVA_HOME%"
nssm set logstash AppStdout %logstash_path%\nssm\stdout.log
nssm set logstash AppStderr %logstash_path%\nssm\stderr.log
REM Replace stdout and stderr files
nssm set logstash AppStdoutCreationDisposition 2
nssm set logstash AppStderrCreationDisposition 2
REM Disable WM_CLOSE, WM_QUIT in the Shutdown options. Without it, NSSM can't stop Logstash properly
nssm set logstash AppStopMethodSkip 6
REM Let's start Logstash. I assume a correct configuration is already in place
REM net start logstash
After that, a new "logstash" Windows service will be available on your server. Type "net start logstash" in a command prompt running with admin privileges and the service should start:
Image 4: starting logstash service after setting it using NSSM.
Kibana: the final step to get it done!
Installing Kibana is easy: just follow the instructions available on the Elastic website. Once it is installed, it is reasonably easy to build your reports. As I said before, the goal here is not to teach you how to use Kibana or even Logstash; for that, take a look at the tutorials and documentation on the Elastic website (Kibana User Guide). Below is just an example of what Kibana can do for you:
Image 5: Data visualization on Kibana
Conclusion
Using Logstash, Elasticsearch, Kibana, and NSSM, it is possible to have a service that collects your IIS log data, processes it, and builds some cool graphs :)