Commit 3e56d89

ngx-http-log-json: release v0.0.2
nginx: upgrade to nginx 1.11.10 support
module: rename to formal name ngx_http_log_json
module: parse recipe with pcre regex
module: refactoring functions
config: support array values
config: support literal boolean values
config: support literal null values
config: if condition argument
tests: initial unit tests
build: travis integration
1 parent a9eabc5 commit 3e56d89

16 files changed: +1698 −1354 lines changed

.gitignore

Lines changed: 2 additions & 0 deletions
@@ -0,0 +1,2 @@
+tags
+t/servroot

.travis.yml

Lines changed: 43 additions & 0 deletions
@@ -0,0 +1,43 @@
+sudo: required
+dist: trusty
+
+os: linux
+
+language: c
+
+compiler:
+  - gcc
+
+cache:
+  apt: true
+  directories:
+
+env:
+  global:
+    - NGINX_PREFIX=/opt/nginx
+    - JOBS=4
+    - PATH=$PATH:$NGINX_PREFIX/sbin
+
+before_install:
+  - sudo apt-get install -qq -y software-properties-common
+  - sudo add-apt-repository "deb http://us.archive.ubuntu.com/ubuntu/ xenial main universe"
+  - sudo apt-get update -qq -y --fix-missing
+  - sudo apt-get install -qq -y --fix-missing cpanminus librdkafka-dev libjansson-dev mercurial build-essential make clang valgrind libjson-perl
+
+install:
+  - if [ ! -d /opt ]; then mkdir /opt; fi
+  - git clone https://github.com/openresty/test-nginx.git
+  - hg clone http://hg.nginx.org/nginx
+
+script:
+  - cd test-nginx/ && sudo cpanm . && cd ..
+  - sudo cpanm install JSON
+  - cd nginx
+  - auto/configure --with-debug --prefix=$NGINX_PREFIX --add-module=$PWD/.. > build.log 2>&1 || (cat build.log && exit 1)
+  - make -j$JOBS > build.log 2>&1 || (cat build.log && exit 1)
+  - sudo make install > build.log 2>&1 || (cat build.log && exit 1)
+  - cd ..
+  - export PATH=$NGINX_PREFIX/sbin:$PATH
+  - /opt/nginx/sbin/nginx -V
+  - ldd /opt/nginx/sbin/nginx
+  - prove

CHANGES

Lines changed: 12 additions & 0 deletions
@@ -1,3 +1,15 @@
+Changes with ngx-http-log-json 0.0.2                             26 Mar 2017
+
+    *) upgrade to nginx 1.11.10 support
+    *) parse recipe with pcre regex
+    *) support literal boolean values
+    *) support literal null values
+    *) support array values
+    *) refactoring functions
+    *) unit tests
+    *) travis script integration
+    *) if= condition argument
+
 Changes with ngx-kasha 0.0.1                                      14 Dec 2016
 
     *) Log output to Kafka topic.

README.md

Lines changed: 82 additions & 25 deletions
@@ -1,34 +1,58 @@
-# ngx-kasha
+# ngx-http-log-json [![Build Status](https://travis-ci.org/fooinha/nginx-http-log-json.svg?branch=master)](https://travis-ci.org/fooinha/nginx-http-log-json)
 
-
-nginx module for advanced per location logging - aka kasha (🍲)
+nginx http module for logging in custom json format - aka kasha (🍲)
 
 ## Description
 
 This module adds to nginx the ability to do advanced JSON logging of HTTP requests per location.
+
 It's possible to log any request made to a specific nginx location to a destination output.
+
 The output format is configurable.
 
+It also allows logging complex, multi-level JSON documents.
+
+It supports logging to a text file or to a Kafka topic.
+
+## Use cases
+
+There are many use cases.
+
+Many things can be done by using the access log data.
+
+Having it in JSON format makes integration with other platforms and applications easier.
+
+A quick example:
+
+![](docs/use-case-kafka-logs.png?raw=true)
+
+
 ### Configuration
 
-Each logging configuration is based on a kasha_recipe. (🍲)
+Each logging configuration is based on a http_log_json_format. (🍲)
 
-A kasha recipe is a ';' separated list of items to include in the logging preparation.
+A http_log_json_format is a ';'-separated list of items to include in the logging preparation.
 
 The left hand side part of an item will be the JSON Path for the variable name.
-The left hand side part can be prefixed with 's:', 'i:' or 'r:', so the JSON encoding type can be controlled.
+The left hand side part can be prefixed with 's:', 'i:', 'r:', 'b:' or 'n:', so the JSON encoding type can be controlled.
 
 * 's:' - JSON string ( default )
 * 'i:' - JSON integer
 * 'r:' - JSON real
+* 'b:' - JSON boolean
+* 'n:' - JSON null
+
+Additional prefix:
+
+* 'a:' - JSON array - MUST be used before the other prefixes. All keys with the same name that are defined as an array will have their values grouped together in an array. ( see example below )
 
 
 The right hand side will be the variable's name or a literal value.
 For this, known or previously set variables can be used by putting a '$' before the name.
 
 Common nginx builtin HTTP variables, like $uri, or any other variable set by other handler modules, can be used.
 
-The output is sent to the location specified by the first kasha_recipe argument.
+The output is sent to the location specified by the first http_log_json_format argument.
 The possible output locations are:
 
 * "file:" - The logging location will be a local filesystem file.
@@ -40,7 +64,7 @@ The possible output locations are:
 ##### A simple configuration example
 
 ```yaml
-kasha_recipe file:/tmp/log '
+http_log_json_format file:/tmp/log '
     src.ip $remote_addr;
     src.port $remote_port;
     dst.ip $server_addr;
@@ -49,12 +73,15 @@ The possible output locations are:
     r:_real 1.1;
     i:_int 2016;
     i:_status $status;
+    b:_notrack false;
     _literal root;
     comm.proto http;
     comm.http.method $request_method;
     comm.http.path $uri;
     comm.http.host $host;
     comm.http.server_name $server_name;
+    a:i:list 1;
+    a:list string;
 ';
 ```
 
@@ -68,6 +95,7 @@ To ease reading, it's shown here formatted with newlines.
   "_literal": "root",
   "_real": 1.1,
   "_status": 200,
+  "_notrack": false,
   "comm": {
     "http": {
       "host": "localhost",
@@ -84,7 +112,11 @@ To ease reading, it's shown here formatted with newlines.
   "src": {
     "ip": "127.0.0.1",
     "port": "52136"
-  }
+  },
+  "list": [
+    1,
+    "string"
+  ]
 }
 ```
 
@@ -100,7 +132,7 @@ To ease reading, it's shown here formatted with newlines.
     return "";
 }';
 
-kasha_recipe file:/tmp/log '
+http_log_json_format file:/tmp/log '
     comm.http.server_name $server_name;
     perl.bar $bar;
 ';
@@ -125,7 +157,7 @@ To ease reading, it's shown here formatted with newlines.
 ### Directives
 
 ---
-* Syntax: **kasha_recipe** _location_ { _recipe_ };
+* Syntax: **http_log_json_format** _location_ { _format_ } _if_=...;
 * Default: —
 * Context: http location
 
@@ -139,61 +171,64 @@ For a **file:** type the value part will be a local file name. e.g. **file:**/tm
 
 For a **kafka:** type the value part will be the topic name. e.g. **kafka:** topic
 
-The kafka output only happens if a list of brokers is defined by **kasha_kafka_brokers** directive.
+The kafka output only happens if a list of brokers is defined by the **http_log_json_kafka_brokers** directive.
 
-###### _recipe_ ######
+###### _format_ ######
 
 See details above.
 
 ---
 
-* Syntax: **"kasha_kafka_partition** _compression_codec_;
+* Syntax: **http_log_json_kafka_partition** _partition_;
 * Default: RD_KAFKA_PARTITION_UA
 * Context: http local
 
 ---
 
-* Syntax: **kasha_kafka_brokers** list of brokers separated by spaces;
+* Syntax: **http_log_json_kafka_brokers** list of brokers separated by spaces;
 * Default: —
 * Context: http main
 
 ---
 
-* Syntax: **kasha_kafka_client_id** _id_;
-* Default: kasha
+* Syntax: **http_log_json_kafka_client_id** _id_;
+* Default: http_log_json
 * Context: http main
 
 ---
 
-* Syntax: **"kasha_kafka_compression** _compression_codec_;
+* Syntax: **http_log_json_kafka_compression** _compression_codec_;
 * Default: snappy
 * Context: http main
 
 ---
 
-* Syntax: **"kasha_kafka_log_level** _numeric_log_level_;
+* Syntax: **http_log_json_kafka_log_level** _numeric_log_level_;
 * Default: 6
 * Context: http main
 
 ---
 
-* Syntax: **"kasha_kafka_max_retries** _numeric_;
+* Syntax: **http_log_json_kafka_max_retries** _numeric_;
 * Default: 0
 * Context: http main
 
 ---
 
-* Syntax: **"kasha_kafka_buffer_max_messages** _numeric_;
+* Syntax: **http_log_json_kafka_buffer_max_messages** _numeric_;
 * Default: 100000
 * Context: http main
 
 ---
 
-* Syntax: **"kasha_kafka_backoff_ms** _numeric_;
+* Syntax: **http_log_json_kafka_backoff_ms** _numeric_;
 * Default: 10
 * Context: http main
 
 
+###### _if_=... ######
+
+Works the same way as the _if_ argument of the http [access_log](http://nginx.org/en/docs/http/ngx_http_log_module.html#access_log) directive.
 
 ### Build
 
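Putting the renamed directives and the new if= argument from the hunk above together, a configuration might look roughly like the sketch below. This is a hedged illustration only, not part of the commit: the $loggable map, the broker address, and the kafka:requests topic spelling are illustrative assumptions.

```
http {
    # brokers must be defined for kafka: output to happen (http main context)
    http_log_json_kafka_brokers 127.0.0.1:9092;

    # standard nginx map used as the if= condition: skip logging of 2xx/3xx responses
    map $status $loggable {
        ~^[23]  0;
        default 1;
    }

    server {
        location / {
            # send the JSON document to a Kafka topic instead of a file,
            # only when $loggable is neither "0" nor empty
            http_log_json_format kafka:requests '
                src.ip $remote_addr;
                comm.http.path $uri;
                i:_status $status;
            ' if=$loggable;
        }
    }
}
```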
@@ -212,7 +247,7 @@ $ sudo apt-get install libjansson-dev librdkafka-dev
 Build as a common nginx module.
 
 ```bash
-$ ./configure --add-module=/build/ngx-kasha
+$ ./configure --add-module=/build/ngx-http_log_json
 $ make && make install
 
 ```
@@ -223,7 +258,29 @@ $ make && make install
 
 **THIS IS NOT PRODUCTION** ready.
 
-This was done over the weekend as a proof of concept, and it also lacks unit tests.
-
 So there's no guarantee of success. It will most probably blow up when running in real-life scenarios.
 
+#### Unit tests
+
+The unit tests use the https://github.com/openresty/test-nginx framework.
+
+
+```
+$ git clone https://github.com/openresty/test-nginx.git
+$ cd test-nginx/
+$ cpanm .
+$ export PATH=$PATH:/usr/local/nginx/sbin/
+```
+
+At the project root, just run the prove command:
+
+```
+$ prove
+
+t/0001_simple_file_log.t .. ok
+All tests successful.
+Files=1, Tests=8, 0 wallclock secs ( 0.02 usr 0.01 sys + 0.15 cusr 0.00 csys = 0.18 CPU)
+Result: PASS
+
+```
+
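The unit-tests section above shows how to install test-nginx and run prove, but not what a test file looks like. The sketch below is only a guess at the Test::Nginx data-section style such a test would use; it is not the t/0001_simple_file_log.t shipped in this commit, and the /tmp/http_log_json_test.log path and the /t location are illustrative assumptions.

```
use Test::Nginx::Socket 'no_plan';

run_tests();

__DATA__

=== TEST 1: log a simple request to a local file
--- config
location /t {
    http_log_json_format file:/tmp/http_log_json_test.log '
        comm.http.path $uri;
        i:_status $status;
    ';
    return 200;
}
--- request
GET /t
--- error_code: 200
```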

config

Lines changed: 5 additions & 5 deletions
@@ -1,11 +1,11 @@
-ngx_addon_name=ngx_kasha_module
+ngx_addon_name=ngx_http_log_json_module
 ngx_module_incs=$ngx_addon_dir/src
 
-HTTP_MODULES="$HTTP_MODULES ngx_kasha_module"
+HTTP_MODULES="$HTTP_MODULES ngx_http_log_json_module"
 
 CORE_INCS="$CORE_INCS $ngx_module_incs"
 NGX_ADDON_SRCS="$NGX_ADDON_SRCS \
-    $ngx_addon_dir/src/ngx_kasha.c \
-    $ngx_addon_dir/src/ngx_kasha_kafka.c \
-    $ngx_addon_dir/src/ngx_kasha_str.c"
+    $ngx_addon_dir/src/ngx_http_log_json_module.c \
+    $ngx_addon_dir/src/ngx_http_log_json_kafka.c \
+    $ngx_addon_dir/src/ngx_http_log_json_str.c"
 CORE_LIBS="$CORE_LIBS -ljansson -lrdkafka"

docs/use-case-kafka-logs.png

46.9 KB