
Commit b85cc5b

index to docs
1 parent 8d0d7cb commit b85cc5b

File tree

1 file changed

+102 -0 lines changed


docs/index.md

Lines changed: 102 additions & 0 deletions
@@ -0,0 +1,102 @@
# trawler

![Trawler Logo](docs/trawler.png)

Trawler is a metrics exporter for IBM API Connect.

[![CII Best Practices](https://bestpractices.coreinfrastructure.org/projects/5829/badge)](https://bestpractices.coreinfrastructure.org/projects/5829)

## Deployment
Trawler is designed to run within the same Kubernetes cluster as API Connect so that it can scrape metrics from the installed components and make them available. Metrics gathering in Trawler is split into separate nets for the different types of metrics to expose, so you can select which ones to enable for a particular environment.

It requires a service account with read access to list pods and services in the namespace(s) the API Connect components are deployed in.
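For illustration, the kind of RBAC this implies might look like the sketch below. The names and namespace are placeholders rather than the project's shipped manifests; see the install documentation for the supported setup.

```yaml
# Illustrative only: a service account with read access to pods and
# services in an API Connect namespace. Names/namespace are placeholders.
apiVersion: v1
kind: ServiceAccount
metadata:
  name: trawler
  namespace: apic-management
---
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: trawler-read
  namespace: apic-management
rules:
  - apiGroups: [""]
    resources: ["pods", "services"]
    verbs: ["get", "list"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: trawler-read
  namespace: apic-management
subjects:
  - kind: ServiceAccount
    name: trawler
    namespace: apic-management
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: Role
  name: trawler-read
```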
[More details on installing trawler](docs/install.md)

### Configuring trawler

Trawler gets its configuration from a mounted ConfigMap containing a `config.yaml` that looks like this:
```yaml
trawler:
  frequency: 10
  use_kubeconfig: false
prometheus:
  port: 63512
  enabled: true
logging:
  level: debug
  filters: trawler:trace
  format: pretty
nets:
  datapower:
    enabled: true
    timeout: 5
    username: trawler-monitor
    namespace: apic-gateway
  product:
    enabled: true
    username: trawler-monitor
    namespace: apic-management
```
**General trawler settings:**

- frequency: number of seconds to wait between trawling for metrics
- use_kubeconfig: use the current kubeconfig from the environment instead of looking at the _in cluster_ config
- logging: set the default logging level, output format and filters for specific components
**Prometheus settings:**

The port specified in the prometheus block needs to match the Prometheus annotations on the deployed trawler pod so that Prometheus can discover the exposed metrics.
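As an illustration, assuming the common `prometheus.io` annotation convention (check how your Prometheus instance discovers targets), the pod template might carry annotations along these lines:

```yaml
# Illustrative pod-template annotations; the port must match the
# prometheus.port value in trawler's config.yaml.
metadata:
  annotations:
    prometheus.io/scrape: "true"
    prometheus.io/port: "63512"
```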
**Individual nets**

Each of the different areas of metrics is handled by a separate net, which can be enabled/disabled independently. The configuration for each net is currently a pointer to the namespace the relevant subsystem is deployed into and a username to use. Passwords are loaded separately from the following values in a Kubernetes secret mounted at the default location of `/app/secrets`, which can be overridden using the SECRETS environment variable:
- datapower_password - password to use with the datapower net for accessing the [DataPower REST management](https://www.ibm.com/support/knowledgecenter/SS9H2Y_7.7.0/com.ibm.dp.doc/restmgtinterface.html) interface.
- cloudmanager_password - password to use with the manager net to retrieve API Connect usage metrics.
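For example, both passwords could be supplied from a single Secret mounted at `/app/secrets`. This is only a sketch; the Secret name and namespace below are placeholders:

```yaml
# Illustrative Secret sketch - the key names match the values trawler
# reads from the secrets path; name/namespace are placeholders.
apiVersion: v1
kind: Secret
metadata:
  name: trawler-secrets
  namespace: apic-management
type: Opaque
stringData:
  datapower_password: "<datapower monitor password>"
  cloudmanager_password: "<cloud manager password>"
```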
## Issues, enhancements and pull requests

Feature requests and issue reports are welcome as [GitHub issues](https://github.com/IBM/apiconnect-trawler/issues) on this repository. Pull requests are also accepted; they should come with a linked issue explaining the reasoning behind the change, follow the existing code format standards, and include tests so that overall code coverage is not reduced.
## More documentation

- [Metrics gathered by trawler](docs/metrics.md)
- [Install](docs/install.md)
- [Frequently asked questions](docs/faq.md)
## Development tips

### Setting up your development environment

Install the prerequisites for trawler from `requirements.txt` and the development and testing requirements from `requirements-dev.txt`:
    pip install -r requirements.txt
    pip install -r requirements-dev.txt
Initialise the pre-commit checks for Trawler using [pre-commit](https://pre-commit.com/):

    pre-commit install
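The hooks themselves are defined in the repository's `.pre-commit-config.yaml`. As a generic illustration of that file format only (not trawler's actual hook list):

```yaml
# Generic .pre-commit-config.yaml illustration - not trawler's actual hooks.
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.5.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
```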
### Running test cases locally

Trawler uses py.test for its test cases, and the test suite is intended to be run with the test-assets directory as the secrets path:

    SECRETS=test-assets coverage run --source . -m py.test
### Running locally

To run locally, point the config parameter at a local config file:

    python3 trawler.py --config local/config.yaml

You can view the data that is being exposed to Prometheus at [localhost:63512](http://localhost:63512) (or the custom port value if it has been changed).
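If you also want a local Prometheus instance to scrape it, a minimal `prometheus.yml` scrape configuration (assuming the default port above) might look like:

```yaml
# Minimal local Prometheus scrape config for a trawler running on the
# default port; adjust the target if you changed prometheus.port.
scrape_configs:
  - job_name: trawler
    static_configs:
      - targets: ["localhost:63512"]
```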
Notes on developing with a running k8s pod:

    kubectl cp datapower_trawl.py {trawler_pod}:/app/datapower_trawl.py
    kubectl cp newconfig.yaml {trawler_pod}:/app/newconfig.yaml
    kubectl exec {trawler_pod} -- sh -c 'cd /app;python3 trawler.py -c newconfig.yaml'
