
Commit b81ef71

Add README.md (#24)
1 parent 9eb6b20 commit b81ef71

File tree

4 files changed (+154 / -4 lines changed)


docs/Grafana.md

Lines changed: 44 additions & 0 deletions
@@ -0,0 +1,44 @@
# Grafana Integration

This guide explains how to visualize access-log-exporter metrics in Grafana.

## Quick Setup

### 1. Import Dashboard

access-log-exporter includes a pre-built Grafana dashboard for visualizing web server metrics.

**Import the dashboard:**

1. Download the dashboard JSON: [grafana-dashboard.json](https://github.com/jkroepke/access-log-exporter/blob/main/contrib/grafana-dashboard.json)
2. In Grafana, go to **Dashboards** → **Import**
3. Upload the JSON file or paste the content
4. Click **Import**
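Alternatively, if your Grafana instance is managed through provisioning, the same JSON file can be loaded from disk at startup. The following is only a minimal sketch of a dashboard provider definition; the provider name and the `/var/lib/grafana/dashboards` path are assumptions and should match wherever you copy the JSON file:

```yaml
# /etc/grafana/provisioning/dashboards/access-log-exporter.yaml (path assumed)
apiVersion: 1
providers:
  - name: access-log-exporter   # assumed provider name
    type: file
    folder: ''                  # import into the General folder
    options:
      # directory where grafana-dashboard.json is copied (assumed)
      path: /var/lib/grafana/dashboards
```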
### 2. Dashboard Features

The included dashboard provides:

- **Request Rate**: Requests per second by host and status
- **Response Times**: P50, P95, and P99 response time percentiles
- **Error Rates**: 4xx and 5xx error percentages
- **Traffic Volume**: Request/response size metrics
- **Upstream Performance**: Connection and response times (for upstream presets)
- **URI Analytics**: Top endpoints by traffic (for the `simple_uri_upstream` preset)
## Demo Setup

For a complete demo environment with Grafana, see the demo configuration in `docs/demo/`:

```bash
cd docs/demo
docker-compose up -d
```

This starts:

- access-log-exporter on port 4040
- Nginx with sample traffic on port 8080
- Prometheus on port 9090
- Grafana on port 3000

The dashboard will be automatically imported and configured.
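Automatic import like this is typically handled through Grafana provisioning. As a rough sketch of how the demo's Prometheus datasource could be wired up (the file location and the `http://prometheus:9090` URL are assumptions based on the services and ports listed above):

```yaml
# /etc/grafana/provisioning/datasources/prometheus.yaml (path assumed)
apiVersion: 1
datasources:
  - name: Prometheus
    type: prometheus
    access: proxy
    # assumes Prometheus is reachable under its service name on the compose network
    url: http://prometheus:9090
    isDefault: true
```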

docs/Home.md

Lines changed: 2 additions & 0 deletions
@@ -7,3 +7,5 @@ Welcome to the access-log-exporter wiki!
1. [Installation](Installation)
2. [Configuration](Configuration)
3. [Webserver Configuration](Webserver)
4. [Prometheus](Prometheus)
5. [Grafana](Grafana)

docs/Prometheus.md

Lines changed: 87 additions & 0 deletions
@@ -0,0 +1,87 @@
# Prometheus Integration

This guide explains how to integrate access-log-exporter with Prometheus.

## Quick Setup

### 1. Start access-log-exporter

```bash
access-log-exporter --preset simple
```

By default, metrics are exposed on `http://localhost:4040/metrics`.
### 2. Configure Prometheus

Add this to your `prometheus.yml`:

```yaml
scrape_configs:
  - job_name: 'access-log-exporter'
    static_configs:
      - targets: ['localhost:4040']
    scrape_interval: 30s
```

### 3. Verify

Check the Prometheus targets page at `http://your-prometheus:9090/targets` and confirm that the `access-log-exporter` job is up.
## Available Metrics

### Basic Metrics (`simple` preset)

- `http_requests_total` - Request counter with labels: host, method, status
- `http_request_duration_seconds` - Response time histogram
- `http_request_size_bytes` - Request size histogram
- `http_response_size_bytes` - Response size histogram

### Upstream Metrics (`simple_upstream` preset)

- `http_upstream_connect_duration_seconds` - Upstream connection time
- `http_upstream_header_duration_seconds` - Upstream header time
- `http_upstream_request_duration_seconds` - Upstream response time
### URI Tracking (`simple_uri_upstream` preset)

All metrics include a `request_uri` label with path normalization:

```prometheus
http_requests_total{host="example.com",method="GET",status="200",request_uri="/api/users/.+"}
```
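If you do not need every metric listed above (for example, the size histograms), Prometheus can discard them at scrape time to keep the number of stored series down. A sketch extending the scrape config from step 2; the regex is only an example:

```yaml
scrape_configs:
  - job_name: 'access-log-exporter'
    static_configs:
      - targets: ['localhost:4040']
    metric_relabel_configs:
      # example: drop the request/response size histograms if they are not used
      - source_labels: [__name__]
        regex: 'http_(request|response)_size_bytes.*'
        action: drop
```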
## Useful Queries

```promql
# Requests per second
rate(http_requests_total[5m])

# 95th percentile response time
histogram_quantile(0.95, rate(http_request_duration_seconds_bucket[5m]))

# Error rate
sum(rate(http_requests_total{status=~"5.."}[5m])) / sum(rate(http_requests_total[5m]))
```
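The same expressions can back alerting rules. A minimal sketch of a rule file (the file name, threshold, and severity label are placeholders), referenced from `prometheus.yml` via `rule_files`:

```yaml
# alerts.yml (file name assumed); list it under rule_files in prometheus.yml
groups:
  - name: access-log-exporter
    rules:
      - alert: HighErrorRate
        # same error-rate expression as above, firing when more than 5% of requests return 5xx for 10 minutes
        expr: sum(rate(http_requests_total{status=~"5.."}[5m])) / sum(rate(http_requests_total[5m])) > 0.05
        for: 10m
        labels:
          severity: warning
        annotations:
          summary: "More than 5% of requests are returning 5xx responses"
```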
## Docker Compose Example

```yaml
version: '3.8'
services:
  access-log-exporter:
    image: ghcr.io/jkroepke/access-log-exporter:latest
    ports:
      - "4040:4040"
      - "8514:8514/udp"

  prometheus:
    image: prom/prometheus:latest
    ports:
      - "9090:9090"
    volumes:
      - ./prometheus.yml:/etc/prometheus/prometheus.yml
```
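The compose file above mounts a `prometheus.yml` from the current directory; inside the compose network the exporter is reachable by its service name rather than `localhost`. A sketch of a matching configuration (the service name is taken from the compose example above):

```yaml
# ./prometheus.yml, mounted into the prometheus container above
scrape_configs:
  - job_name: 'access-log-exporter'
    scrape_interval: 30s
    static_configs:
      # use the compose service name instead of localhost
      - targets: ['access-log-exporter:4040']
```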
## Troubleshooting

- **No metrics**: Check that access-log-exporter is receiving logs from your web server
- **High cardinality**: Use the `simple_uri_upstream` preset for automatic path normalization
- **Performance**: Use 30-60 second scrape intervals for web metrics

internal/nginx/collector.go

Lines changed: 21 additions & 4 deletions
@@ -1,12 +1,14 @@
 package nginx
 
 import (
+    "context"
     "fmt"
     "io"
     "log/slog"
     "net/http"
     "strings"
     "sync"
+    "time"
 
     "github.com/prometheus/client_golang/prometheus"
 )
@@ -105,15 +107,30 @@ func (c *Collector) Describe(ch chan<- *prometheus.Desc) {
 }
 
 func (c *Collector) Collect(ch chan<- prometheus.Metric) {
-    c.logger.Error("hit Collect method")
-
     c.mu.Lock() // To protect metrics from concurrent collects
     defer c.mu.Unlock()
 
     serverVersion := "N/A"
 
-    //nolint:noctx
-    resp, err := http.Get(c.scrapeURL)
+    ctx, cancel := context.WithTimeout(context.Background(), 1*time.Second)
+    defer cancel()
+
+    req, err := http.NewRequestWithContext(ctx, http.MethodGet, c.scrapeURL, nil)
+    if err != nil {
+        c.logger.Error("Failed to create HTTP request for NGINX metrics",
+            slog.String("url", c.scrapeURL),
+            slog.Any("error", err),
+        )
+
+        ch <- prometheus.MustNewConstMetric(c.upMetric,
+            prometheus.GaugeValue, 0, serverVersion)
+
+        return
+    }
+
+    req.Header.Set("User-Agent", "jkroepke/access-log-exporter")
+
+    resp, err := http.DefaultClient.Do(req)
     if err != nil {
         c.logger.Error("Failed to scrape NGINX metrics",
             slog.String("url", c.scrapeURL),