@@ -1,8 +1,84 @@
-In this phase we are going to introduce an exception into our application
-to see how we can track it down.
-
-`restart-services`{{execute interrupt}} to continue.
-
-Open the `frontend` service page to continue.
-
-https://app.datadoghq.com/apm/service/frontend
+# Enable Datadog logs agent
+
+Add the following environment variables to the `agent` service in `docker-compose.yml`.
+
+`DD_LOGS_ENABLED=true`{{copy}}
+
+`DD_LOGS_CONFIG_CONTAINER_COLLECT_ALL=true`{{copy}}
+
+Our service should look like:
+
+```yaml
+agent:
+  environment:
+    - DD_API_KEY
+    - DD_APM_ENABLED=true
+    - DD_TAGS='env:apm-workshop'
+    - DD_LOGS_ENABLED=true
+    - DD_LOGS_CONFIG_CONTAINER_COLLECT_ALL=true
+```{{copy}}
+
+# Enable trace id injection into logs
+
+Add the following environment variable to the `frontend`, `node`, `pumps`,
+and `sensors` services in `docker-compose.yml`.
+
+`DD_LOGS_INTEGRATION=true`{{copy}}
+
+Our services should look like:
+
+```yaml
+frontend:
+  env_file: ".env"
+  environment:
+    - DATADOG_SERVICE_NAME=frontend
+    - DATADOG_TRACE_AGENT_HOSTNAME=agent
+    - DD_ANALYTICS_ENABLED=true
+    - DD_LOGS_INTEGRATION=true
+```{{copy}}
+
+```yaml
+node:
+  env_file: ".env"
+  environment:
+    - DD_SERVICE_NAME=users-api
+    - DD_TRACE_AGENT_HOSTNAME=agent
+    - DD_ANALYTICS_ENABLED=true
+    - DD_LOGS_INTEGRATION=true
+```{{copy}}
+
+```yaml
+pumps:
+  env_file: ".env"
+  environment:
+    - FLASK_APP=pumps.py
+    - FLASK_DEBUG=1
+    - POSTGRES_PASSWORD=postgres
+    - POSTGRES_USER=postgres
+    - DATADOG_SERVICE_NAME=pumps-service
+    - DATADOG_TRACE_AGENT_HOSTNAME=agent
+    - DD_ANALYTICS_ENABLED=true
+    - DD_LOGS_INTEGRATION=true
+```{{copy}}
+
+```yaml
+sensors:
+  env_file: ".env"
+  environment:
+    - FLASK_APP=sensors.py
+    - FLASK_DEBUG=1
+    - POSTGRES_PASSWORD=postgres
+    - POSTGRES_USER=postgres
+    - DATADOG_SERVICE_NAME=sensors-api
+    - DATADOG_TRACE_AGENT_HOSTNAME=agent
+    - DD_ANALYTICS_ENABLED=true
+    - DD_LOGS_INTEGRATION=true
+```{{copy}}
+
+Afterwards restart docker services:
+
+`restart-services`{{execute interrupt}}
+
+Finally, open logs dashboard:
+
+https://app.datadoghq.com/logs
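Once the file above is in place and the services have been restarted, it is worth confirming that the Agent is actually tailing container logs before moving on. The following is only a sketch: it assumes the Agent runs as the `agent` compose service shown above, that its image ships the v6+ `agent` CLI, and that it is run from the directory containing `docker-compose.yml`.

```bash
# Show the part of the Agent's self-reported status that covers log collection.
# The first `agent` is the compose service name; `agent status` is the command
# executed inside that container.
docker-compose exec agent agent status | grep -i -A 15 'logs agent'

# With DD_LOGS_ENABLED=true and DD_LOGS_CONFIG_CONTAINER_COLLECT_ALL=true this
# section should list log sources for the running containers rather than
# reporting that log collection is disabled.
```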
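Likewise, `DD_LOGS_INTEGRATION=true` only pays off if trace ids actually appear in the services' log lines, since that is what lets Datadog jump from a log entry to its trace. Here is a rough spot check from the host; the `pumps` service name comes from the compose snippets above, but the `dd.trace_id=` pattern is an assumption about how each service's logger renders the injected id, not something taken from the workshop code.

```bash
# Scan recent output from one traced service for injected trace ids.
# Adjust the pattern if your services format the id differently.
docker-compose logs --tail=200 pumps | grep -o -m 5 'dd\.trace_id=[0-9]*'

# Any matches mean injection is active; searching for one of these ids in the
# Datadog Log Explorer should return the log lines tied to a single trace.
```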
| @ -1,3 +0,0 @@ | |||||
| #!/usr/bin/env bash | |||||
| echo "WORKSHOP_ADD_ERRORS=true" >> /tracing-workshop/.env | |||||
| @ -1,3 +1,3 @@ | |||||
| #!/usr/bin/env bash | #!/usr/bin/env bash | ||||
| echo "WORKSHOP_ADD_LATENCY=true" >> /tracing-workshop/.env | |||||
| echo "WORKSHOP_ADD_ERRORS=true" >> /tracing-workshop/.env | |||||
| @ -1,84 +0,0 @@ | |||||
| # Enable Datadog logs agent | |||||
| Add the following environment variables to the `agent` service in `docker-compose.yml`. | |||||
| `DD_LOGS_ENABLED=true`{{copy}} | |||||
| `DD_LOGS_CONFIG_CONTAINER_COLLECT_ALL=true`{{copy}} | |||||
| Our service should look like: | |||||
| ```yaml | |||||
| agent: | |||||
| environment: | |||||
| - DD_API_KEY | |||||
| - DD_APM_ENABLED=true | |||||
| - DD_TAGS='env:apm-workshop' | |||||
| - DD_LOGS_ENABLED=true | |||||
| - DD_LOGS_CONFIG_CONTAINER_COLLECT_ALL=true | |||||
| ```{{copy}} | |||||
| # Enable trace id injection into logs | |||||
| Add the following environment variable to the `frontend`, `node`, `pumps`, | |||||
| and `sensors` services in `docker-compose.yml`. | |||||
| `DD_LOGS_INTEGRATION=true`{{copy}} | |||||
| Our services should look like: | |||||
| ```yaml | |||||
| frontend: | |||||
| env_file: ".env" | |||||
| environment: | |||||
| - DATADOG_SERVICE_NAME=frontend | |||||
| - DATADOG_TRACE_AGENT_HOSTNAME=agent | |||||
| - DD_ANALYTICS_ENABLED=true | |||||
| - DD_LOGS_INTEGRATION=true | |||||
| ```{{copy}} | |||||
| ``` yaml | |||||
| node: | |||||
| env_file: ".env" | |||||
| environment: | |||||
| - DD_SERVICE_NAME=users-api | |||||
| - DD_TRACE_AGENT_HOSTNAME=agent | |||||
| - DD_ANALYTICS_ENABLED=true | |||||
| - DD_LOGS_INTEGRATION=true | |||||
| ```{{copy}} | |||||
| ``` yaml | |||||
| pumps: | |||||
| env_file: ".env" | |||||
| environment: | |||||
| - FLASK_APP=pumps.py | |||||
| - FLASK_DEBUG=1 | |||||
| - POSTGRES_PASSWORD=postgres | |||||
| - POSTGRES_USER=postgres | |||||
| - DATADOG_SERVICE_NAME=pumps-service | |||||
| - DATADOG_TRACE_AGENT_HOSTNAME=agent | |||||
| - DD_ANALYTICS_ENABLED=true | |||||
| - DD_LOGS_INTEGRATION=true | |||||
| ```{{copy}} | |||||
| ``` yaml | |||||
| sensors: | |||||
| env_file: ".env" | |||||
| environment: | |||||
| - FLASK_APP=sensors.py | |||||
| - FLASK_DEBUG=1 | |||||
| - POSTGRES_PASSWORD=postgres | |||||
| - POSTGRES_USER=postgres | |||||
| - DATADOG_SERVICE_NAME=sensors-api | |||||
| - DATADOG_TRACE_AGENT_HOSTNAME=agent | |||||
| - DD_ANALYTICS_ENABLED=true | |||||
| - DD_LOGS_INTEGRATION=true | |||||
| ```{{copy}} | |||||
| Afterwards restart docker services: | |||||
| `restart-services`{{execute interrupt}} | |||||
| Finally, open logs dashboard: | |||||
| https://app.datadoghq.com/logs | |||||
| @ -0,0 +1,8 @@ | |||||
| In this phase we are going to introduce an latency into our application | |||||
| to see how we can track it down. | |||||
| `restart-services`{{execute interrupt}} to continue. | |||||
| Open the `frontend` service page to continue. | |||||
| https://app.datadoghq.com/apm/service/frontend | |||||
| @ -0,0 +1,3 @@ | |||||
| #!/usr/bin/env bash | |||||
| echo "WORKSHOP_ADD_LATENCY=true" >> /tracing-workshop/.env | |||||