
add aliases

Brett Langdon committed 1f6211eb93 to master, 7 years ago
GPG Key ID: E6600FB894DB3D19 (no known key found for this signature in database)
10 changed files with 55 additions and 51 deletions:

1. dash-apm-python/background.sh (+1, -0)
2. dash-apm-python/foreground.sh (+3, -0)
3. dash-apm-python/index.json (+2, -2)
4. dash-apm-python/step_2.md (+1, -1)
5. dash-apm-python/step_3.md (+1, -1)
6. dash-apm-python/step_4.md (+1, -1)
7. dash-apm-python/step_5.md (+1, -1)
8. dash-apm-python/step_6.md (+1, -1)
9. dash-apm-python/step_7.md (+44, -0)
10. dash-apm-python/step_8.md (+0, -44)

dash-apm-python/background.sh (+1, -0)

@@ -2,4 +2,5 @@
 mkdir /tracing-workshop
 git clone https://github.com/brettlangdon/distributed-tracing-with-apm-workshop /tracing-workshop
 cd /tracing-workshop
 docker-compose pull

dash-apm-python/foreground.sh (+3, -0)

@@ -1,4 +1,7 @@
 #!/usr/bin/env bash
+alias start-services="docker-compose up"
+alias restart-services="docker-compose up"
 cd /tracing-workshop
 # clear the console
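Both new aliases expand to the same `docker-compose up` command; the two names exist so the workshop steps read naturally, and `{{execute interrupt}}` stops the previous run before the alias fires again. A minimal sketch of how they behave (note: bash only expands aliases in non-interactive shells when `expand_aliases` is set, which is why defining them in the terminal's sourced foreground.sh works):

```shell
#!/usr/bin/env bash
# Non-interactive bash scripts skip alias expansion by default;
# enable it explicitly so the aliases below resolve in this sketch.
shopt -s expand_aliases

# Same definitions as in foreground.sh: both names run the same command.
alias start-services="docker-compose up"
alias restart-services="docker-compose up"

# Inspect what each alias expands to.
alias start-services
alias restart-services
```

Running this prints the stored expansions rather than starting any containers, so it is safe to try outside the workshop environment.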


dash-apm-python/index.json (+2, -2)

@@ -28,11 +28,11 @@
       "text": "step_6.md"
     },
     {
-      "title": "Adding manual spans",
+      "title": "Enable trace and logs",
       "text": "step_7.md"
     },
     {
-      "title": "Enable trace and logs",
+      "title": "Adding manual spans",
       "text": "step_8.md"
     },
     {


dash-apm-python/step_2.md (+1, -1)

@@ -5,7 +5,7 @@ Copy the following command and replace `<KEY>` with your API key and execute:
 Next, run the following command to start our docker services:
-`docker-compose up`{{execute}}
+`start-services`{{execute}}
 Next, open the web interface to verify the application is up and running:


dash-apm-python/step_3.md (+1, -1)

@@ -15,7 +15,7 @@ frontend:
 Afterwards restart docker services:
-`docker-compose up`{{execute interrupt}}
+`restart-services`{{execute interrupt}}
 Finally, open trace analytics dashboard:


dash-apm-python/step_4.md (+1, -1)

@@ -15,7 +15,7 @@ def add_user_id():
 span.set_tag('user_id', user_id)
 ```{{copy}}
-Restart services `docker-compose up`{{execute interrupt}}
+`restart-services`{{execute interrupt}}
 Finally, open the service page for `frontend` to view the new metadata on traces.


dash-apm-python/step_5.md (+1, -1)

@@ -22,7 +22,7 @@ def simulate_sensors():
 return jsonify(sensors)
 ```{{copy}}
-Restart our services `docker-compose up`{{execute interrupt}}.
+`restart-services`{{execute interrupt}}.
 Open the `/simulate_sensors` resource page for our `frontend` service.


dash-apm-python/step_6.md (+1, -1)

@@ -23,7 +23,7 @@ def simulate_all_sensors():
 return [s.serialize() for s in sensors]
 ```{{copy}}
-Restart our services `docker-compose up`{{execute interrupt}}
+`restart-services`{{execute interrupt}}
 Open the service page for our `frontend` service.


dash-apm-python/step_7.md (+44, -0)

@@ -0,0 +1,44 @@
+# Enable Datadog logs agent
+Add the following environment variables to the `agent` service in `docker-compose.yml`.
+`DD_LOGS_ENABLED=true`{{copy}}
+`DD_LOGS_CONFIG_CONTAINER_COLLECT_ALL=true`{{copy}}
+Our service should look like:
+```yaml
+agent:
+  environment:
+    - DD_API_KEY
+    - DD_APM_ENABLED=true
+    - DD_TAGS='env:apm-workshop'
+    - DD_LOGS_ENABLED=true
+    - DD_LOGS_CONFIG_CONTAINER_COLLECT_ALL=true
+```
+# Enable trace id injection into logs
+Add the following environment variable to the `frontend`, `node`, `pumps`,
+and `sensors` services in `docker-compose.yml`.
+`DD_LOGS_INTEGRATION=true`{{copy}}
+Our services should look like:
+```yaml
+frontend:
+  environment:
+    - DATADOG_SERVICE_NAME=frontend
+    - DATADOG_TRACE_AGENT_HOSTNAME=agent
+    - DD_ANALYTICS_ENABLED=true
+    - DD_LOGS_INTEGRATION=true
+```
+Afterwards restart docker services:
+`restart-services`{{execute interrupt}}
+Finally, open logs dashboard:
+https://app.datadoghq.com/logs

dash-apm-python/step_8.md (+0, -44)

@ -1,44 +0,0 @@
# Enable Datadog logs agent
Add the following environment variables to the `agent` service in `docker-compose.yml`.
`DD_LOGS_ENABLED=true`{{copy}}
`DD_LOGS_CONFIG_CONTAINER_COLLECT_ALL=true`{{copy}}
Our service should look like:
```yaml
agent:
environment:
- DD_API_KEY
- DD_APM_ENABLED=true
- DD_TAGS='env:apm-workshop'
- DD_LOGS_ENABLED=true
- DD_LOGS_CONFIG_CONTAINER_COLLECT_ALL=true
```
# Enable trace id injection into logs
Add the following environment variable to the `frontend`, `node`, `pumps`,
and `sensors` services in `docker-compose.yml`.
`DD_LOGS_INTEGRATION=true`{{copy}}
Our services should look like:
```yaml
frontend:
environment:
- DATADOG_SERVICE_NAME=frontend
- DATADOG_TRACE_AGENT_HOSTNAME=agent
- DD_ANALYTICS_ENABLED=true
- DD_LOGS_INTEGRATION=true
```
Afterwards restart docker services:
`docker-compose up`{{execute interrupt}}
Finally, open logs dashboard:
https://app.datadoghq.com/logs
