This article contains a brief introduction to centralized structured logging with ELK (Elasticsearch, Logstash and Kibana).
![](img/elk/kibana-working.png)
## Wiring eshopOnContainers with ELK in Localhost
eshopOnContainers is ready to work with ELK; you only need to set the **LogstashUrl** configuration parameter in the **Serilog** section. You can do this by modifying this parameter in the appsettings.json of every service, or via the environment variable **Serilog:LogstashUrl**.
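As a reference, here is a minimal sketch of how the **Serilog** section of an appsettings.json could look; the URL is only an example value and must point to your actual Logstash HTTP endpoint:

```json
{
  "Serilog": {
    "LogstashUrl": "http://localhost:8080"
  }
}
```

The same value can also be supplied through the environment, for example as `Serilog:LogstashUrl` (or `Serilog__LogstashUrl` where colons are not allowed in variable names, as in Linux containers).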
There is another option: a zero-configuration environment for testing the integration, launched via the ```docker-compose``` command from the root directory of eshopOnContainers:
You can wait a bit and refresh the page; the first time you enter, you need to configure the index pattern in Kibana.
With the index pattern configured, you can open the Discover section and start viewing how the tool is collecting the logging information.
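If nothing shows up, a quick way to confirm that log events are actually reaching Elasticsearch is to query it directly. This is only a sketch, assuming the local setup exposes Elasticsearch on `localhost:9200`; the index names depend on your Logstash pipeline configuration:

```bash
# List the indices Elasticsearch currently holds; the indices written by
# Logstash should appear here once log events start flowing.
curl "http://localhost:9200/_cat/indices?v"

# Peek at a few documents of a given index (replace <your-index-pattern>
# with the pattern your Logstash pipeline writes to).
curl "http://localhost:9200/<your-index-pattern>/_search?size=3&pretty"
```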
![](img/elk/kibana_result.png)
## Configuring ELK on Azure VM
Another option is to use a preconfigured virtual machine with Logstash, Elasticsearch and Kibana, and point the **LogstashUrl** configuration parameter to it. To do this, you can go to Microsoft Azure and search for a certified ELK virtual machine:
![](img/elk/create-vm-elk-azure.png)
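As an alternative to browsing the portal, you can also list the available ELK images from the Azure CLI. This is just a sketch; the publisher name used here is an assumption, so verify it against the actual Marketplace listing:

```bash
# List ELK-related marketplace images (the publisher name "elastic" is an
# assumption; check the Azure Marketplace listing for the exact value).
az vm image list --all --publisher elastic --output table
```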
This option provides certified, preconfigured settings (network, virtual machine type, OS, RAM, disks) that give you a good starting point for running ELK with good performance.
![](img/elk/create-vm-elk-azure-summary.png)
When you have configured the main aspects of your virtual machine, you will reach a final Review + create step like this: