---
title: Cloud setup GCP Logs
---
# Cloud setup GCP logs

This document explains how to set up Google Cloud Platform to forward cloud resource logs from a particular GCP project into a Google Pubsub topic, so that they are available for Promtail to consume.

This document assumes that the reader has `gcloud` installed and has the required permissions (as mentioned in the [Roles and Permission](#roles-and-permission) section).

## Roles and Permission

The user should have the following roles to complete the setup.
- "roles/pubsub.editor"
- "roles/logging.configWriter"

## Setup Pubsub Topic

The Google Pubsub topic acts as the queue that persists log messages, which can then be read by Promtail.

```bash
$ gcloud pubsub topics create $TOPIC_ID
```

e.g:
```bash
$ gcloud pubsub topics create cloud-logs
```

## Setup Log Router

We create a log sink to forward cloud logs into the pubsub topic created above.

```bash
$ gcloud logging sinks create $SINK_NAME $SINK_LOCATION $OPTIONAL_FLAGS
```

e.g:
```bash
$ gcloud logging sinks create cloud-logs pubsub.googleapis.com/projects/my-project/topics/cloud-logs \
  --log-filter='resource.type=("gcs_bucket")' \
  --description="Cloud logs"
```

The above command also adds the `log-filter` option, which determines what type of logs should reach the destination `pubsub` topic.
For more information on adding a `log-filter`, refer to this [document](https://cloud.google.com/logging/docs/export/configure_export_v2#creating_sink).

We cover a more advanced `log-filter` [below](#advanced-log-filter).

## Grant log sink the pubsub publisher role

Find the writer identity service account of the log sink just created:

```bash
gcloud logging sinks describe \
  --format='value(writerIdentity)' $SINK_NAME
```

For example:
```bash
gcloud logging sinks describe \
  --format='value(writerIdentity)' cloud-logs
```

Create an IAM policy binding to allow the log sink to publish messages to the topic:
```bash
gcloud pubsub topics add-iam-policy-binding $TOPIC_ID \
  --member=$WRITER_IDENTITY --role=roles/pubsub.publisher
```

For example:
```bash
gcloud pubsub topics add-iam-policy-binding cloud-logs \
  --member=serviceAccount:pxxxxxxxxx-xxxxxx@gcp-sa-logging.iam.gserviceaccount.com --role=roles/pubsub.publisher
```

## Create Pubsub subscription for Grafana Loki

We create a subscription for the pubsub topic created above; Promtail uses this subscription to consume log messages.

```bash
$ gcloud pubsub subscriptions create cloud-logs --topic=$TOPIC_ID \
  --ack-deadline=$ACK_DEADLINE \
  --message-retention-duration=$RETENTION_DURATION
```

e.g:
```bash
$ gcloud pubsub subscriptions create cloud-logs --topic=projects/my-project/topics/cloud-logs \
  --ack-deadline=10 \
  --message-retention-duration=7d
```

For more fine-grained options, refer to `gcloud pubsub subscriptions --help`.

## ServiceAccount for Promtail

We need a service account with the following permissions:
- pubsub.subscriber

This enables Promtail to read log entries from the pubsub subscription created above.
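As a sketch of granting that permission, assuming a placeholder service account named `promtail` in the `my-project` project and the `cloud-logs` subscription created above (adjust the names to your environment):

```shell
# Create a dedicated service account for Promtail.
gcloud iam service-accounts create promtail --display-name="Promtail"

# Grant it the subscriber role on the subscription Promtail will consume from.
gcloud pubsub subscriptions add-iam-policy-binding cloud-logs \
  --member=serviceAccount:promtail@my-project.iam.gserviceaccount.com \
  --role=roles/pubsub.subscriber
```

Scoping the role binding to the single subscription (rather than granting `roles/pubsub.subscriber` project-wide) keeps the service account limited to exactly what Promtail needs.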
You can find an example Promtail scrape config for `gcplog` [here](../scraping/#gcplog-scraping).

If you are scraping logs from multiple GCP projects, then this service account should have the above permissions in all the projects you are trying to scrape.

## Operations

Sometimes you may wish to clear the pending pubsub queue containing logs.

These messages stay in the Pubsub subscription until they are acknowledged. The following command removes log messages without them needing to be consumed via Promtail or any other pubsub consumer.

```bash
gcloud pubsub subscriptions seek <subscription-path> --time=<yyyy-mm-ddThh:mm:ss>
```

To delete all the old messages up to now, set `--time` to the current time.

```bash
gcloud pubsub subscriptions seek projects/my-project/subscriptions/cloud-logs --time=$(date +%Y-%m-%dT%H:%M:%S)
```

## Advanced log filter

So far we've covered admitting GCS bucket logs into Grafana Loki, but often one may need to admit logs from multiple cloud resources and also exclude unnecessary logs. The following is a more complex example.

We use the `log-filter` option to include logs and the `exclusion` option to exclude them.

### Use Case
Include the following cloud resource logs:
- GCS bucket
- Kubernetes
- IAM
- HTTP Load balancer

And exclude specific HTTP load balancer logs based on payload and status code.
```bash
$ gcloud logging sinks create cloud-logs pubsub.googleapis.com/projects/my-project/topics/cloud-logs \
  --log-filter='resource.type=("gcs_bucket" OR "k8s_cluster" OR "service_account" OR "iam_role" OR "api" OR "audited_resource" OR "http_load_balancer")' \
  --description="Cloud logs" \
  --exclusion='name=http_load_balancer,filter=resource.type="http_load_balancer" AND (jsonPayload.statusDetails=("byte_range_caching" OR "websocket_closed") OR httpRequest.status=(101 OR 206))'
```
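Before pointing Promtail at the subscription, one way to sanity-check the pipeline end to end is to publish a test message directly to the topic and pull it back through the subscription. This sketch reuses the placeholder `my-project` / `cloud-logs` names from the examples above:

```shell
# Publish a test message to the topic the log sink writes to.
gcloud pubsub topics publish cloud-logs --message="gcplog setup test"

# Pull it back through the subscription Promtail will consume,
# acknowledging it so it does not linger in the queue.
gcloud pubsub subscriptions pull projects/my-project/subscriptions/cloud-logs \
  --auto-ack --limit=1
```

If the pull returns the test message, the topic and subscription are wired up correctly; any remaining gaps are then in the log sink filter or in Promtail's own configuration.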