---
title: 'Preventing overly complex queries'
description: Avoid denial of service attacks by calculating query costs and limiting complexity.
linkTitle: Query Complexity
menu: { main: { parent: 'reference' } }
---

GraphQL provides a powerful way to query your data, but putting great power in the hands of your API clients also exposes you to a risk of denial of service attacks. You can mitigate that risk with gqlgen by limiting the complexity of the queries you allow.

## Expensive Queries

Consider a schema that allows listing blog posts. Each blog post is also related to other posts.

```graphql
type Query {
	posts(count: Int = 10): [Post!]!
}

type Post {
	title: String!
	text: String!
	related(count: Int = 10): [Post!]!
}
```

It's not too hard to craft a query that will cause a very large response:

```graphql
{
	posts(count: 100) {
		related(count: 100) {
			related(count: 100) {
				related(count: 100) {
					title
				}
			}
		}
	}
}
```

The size of the response grows exponentially with each additional level of the `related` field: the query above asks for 100 × 100 × 100 × 100, or 100 million, `title` fields. Fortunately, gqlgen's `http.Handler` includes a way to guard against this type of query.

## Limiting Query Complexity

Limiting query complexity is as simple as adding a parameter to the `handler.GraphQL` function call:

```go
func main() {
    c := blog.Config{Resolvers: &resolvers{}}

    gqlHandler := handler.GraphQL(
        blog.NewExecutableSchema(c),
        handler.ComplexityLimit(5), // This line is the key
    )

    http.Handle("/query", gqlHandler)
    log.Fatal(http.ListenAndServe(":8080", nil))
}
```

Now any query with complexity greater than 5 is rejected by the API. By default, every field in a query adds one to its overall complexity, so each additional level of nesting makes a query more expensive.
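
For example, under the default calculation this query has a complexity of 3, one point for each of its three fields:

```graphql
{
	posts {   # +1
		title # +1
		text  # +1
	}
}
```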

This helps, but we still have a problem: the `posts` and `related` fields, which return arrays, are much more expensive to resolve than the scalar `title` and `text` fields. However, the default complexity calculation weights them equally. It would make more sense to apply a higher cost to the array fields.

## Custom Complexity Calculation

To apply higher costs to certain fields, we can use custom complexity functions.

```go
func main() {
    c := blog.Config{Resolvers: &resolvers{}}

    // Weight a list field by its count argument: fetching n items costs
    // n times the complexity of the fields selected on each item.
    countComplexity := func(childComplexity, count int) int {
        return count * childComplexity
    }
    c.Complexity.Query.Posts = countComplexity
    c.Complexity.Post.Related = countComplexity

    gqlHandler := handler.GraphQL(
        blog.NewExecutableSchema(c),
        handler.ComplexityLimit(100),
    )

    http.Handle("/query", gqlHandler)
    log.Fatal(http.ListenAndServe(":8080", nil))
}
```

When we assign a function to the appropriate `Complexity` field, that function is used in the complexity calculation. Here, the `posts` and `related` fields are weighted according to the value of their `count` parameter: the more posts a client requests, the higher the query complexity. And just as the size of the response grows exponentially in our original query, so does its computed complexity, so any client trying to abuse the API runs into the limit very quickly.
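
Priced with these functions, the abusive query from earlier multiplies the cost by 100 at every level:

```graphql
{
	posts(count: 100) {               # 100 * 1,000,000 = 100,000,000
		related(count: 100) {         # 100 * 10,000 = 1,000,000
			related(count: 100) {     # 100 * 100 = 10,000
				related(count: 100) { # 100 * 1 = 100
					title             # 1
				}
			}
		}
	}
}
```

Its total complexity of 100,000,000 is far over the limit, while a modest query such as `{ posts(count: 10) { title } }` costs only 10 × 1 = 10 and is allowed.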

By applying a query complexity limit and specifying custom complexity functions in the right places, you can easily prevent clients from using a disproportionate amount of resources and disrupting your service.