---
page_title: Module Testing Experiment - Configuration Language
description: Part of the ongoing design research for module integration testing.
---

# Module Testing Experiment

This page is about some experimental features available in recent versions of
Terraform CLI related to integration testing of shared modules.

The Terraform team is aiming to use these features to gather feedback as part
of ongoing research into different strategies for testing Terraform modules.
These features are likely to change significantly in future releases based on
feedback.

## Current Research Goals

Our initial research focuses on whether it's helpful and productive to write
module integration tests in the Terraform language itself, or whether it's
better to treat testing as a separate concern orchestrated by code written in
other languages.

Some existing efforts have piloted both approaches:

* [Terratest](https://terratest.gruntwork.io/) and
  [kitchen-terraform](https://github.com/newcontext-oss/kitchen-terraform)
  both pioneered the idea of writing tests for Terraform modules with explicit
  orchestration written in the Go and Ruby programming languages, respectively.

* The Terraform provider
  [`apparentlymart/testing`](https://registry.terraform.io/providers/apparentlymart/testing/latest)
  introduced the idea of writing Terraform module tests in the Terraform
  language itself, using a special provider that can evaluate assertions
  and fail `terraform apply` if they don't pass.

Each of these approaches has advantages and disadvantages, and it's likely
that both will coexist for different situations. However, the community has
already explored the external-language testing model quite deeply, while the
Terraform-integrated testing model has not yet been widely trialled. For that
reason, the current iteration of the module testing experiment aims to make
the Terraform-integrated approach more accessible, so that more module authors
can try it and share their experiences.

## Current Experimental Features

-> This page describes the incarnation of the experimental features introduced
in **Terraform CLI v0.15.0**. You'll need Terraform v0.15.0 or later to run
tests, but only for running them; the module under test can remain compatible
with earlier Terraform versions, if needed.

Our current area of interest is what sorts of tests can and cannot be written
using features integrated into the Terraform language itself. To investigate
that without invasive, cross-cutting changes to Terraform Core, we're using a
special built-in Terraform provider as a placeholder for potential new
features.

If this experiment is successful, we expect to run a second round of research
and design about exactly what syntax is most ergonomic for writing tests, but
for the moment we're less interested in the specific syntax and more in the
capabilities of this approach.

The temporary extensions to Terraform for this experiment consist of the
following parts:

* A temporary experimental provider `terraform.io/builtin/test`, which acts as
  a placeholder for potential new language features related to test assertions.

* A `terraform test` command for more conveniently running multiple tests in
  a single action.

* An experimental convention of placing test configurations in subdirectories
  of a `tests` directory within your module, which `terraform test` will then
  discover and run.

If you're an adventurous module author, we'd like to invite you to try writing
integration tests for your modules using these mechanisms, and ideally also to
share the tests you write (in a temporary VCS branch, if necessary) so we can
see what you were able to test, along with anything you felt unable to test in
this way.

If you're interested in giving this a try, see the following sections for
usage details. Because these features are temporary experimental extensions,
there's some boilerplate required to activate and make use of them, which
would likely not be required in a final design.

### Writing Tests for a Module

For the purposes of the current experiment, module tests are arranged into
_test suites_, each of which is a root Terraform module that includes a
`module` block calling the module under test, and ideally also a number of
test assertions to verify that the module's outputs match expectations.

In the same directory where you keep your module's `.tf` and/or `.tf.json`
source files, create a subdirectory called `tests`. Under that directory,
make another directory that will serve as your first test suite, with a name
that concisely describes what the suite aims to test.

Here's an example of a typical module directory layout with the addition of a
test suite called `defaults`:

```
main.tf
outputs.tf
providers.tf
variables.tf
versions.tf
tests/
  defaults/
    test_defaults.tf
```

The `tests/defaults/test_defaults.tf` file will contain a call to the
main module with a suitable set of arguments and hopefully also one or more
resources that will, for the sake of the experiment, serve as the temporary
syntax for defining test assertions. For example:

```hcl
terraform {
  required_providers {
    # Because we're currently using a built-in provider as
    # a substitute for dedicated Terraform language syntax,
    # test suite modules must always declare a dependency on
    # this provider. This provider is only available when
    # running tests, so you shouldn't use it in non-test
    # modules.
    test = {
      source = "terraform.io/builtin/test"
    }

    # This example also uses the "http" data source to
    # verify the behavior of the hypothetical running
    # service, so we should declare that too.
    http = {
      source = "hashicorp/http"
    }
  }
}

module "main" {
  # source is always ../.. for test suite configurations,
  # because they are placed two subdirectories deep under
  # the main module directory.
  source = "../.."

  # This test suite aims to test the "defaults" for this
  # module, so it doesn't set any input variables and just
  # lets their default values be selected instead.
}

# As with all Terraform modules, we can use local values
# to do any necessary post-processing of the results from
# the module in preparation for writing test assertions.
locals {
  # This expression also serves as an implicit assertion
  # that the base URL uses URL syntax; the test suite
  # will fail if this function fails.
  api_url_parts = regex(
    "^(?:(?P<scheme>[^:/?#]+):)?(?://(?P<authority>[^/?#]*))?",
    module.main.api_url,
  )
}

# The special test_assertions resource type, which belongs
# to the test provider we required above, is a temporary
# syntax for writing out explicit test assertions.
resource "test_assertions" "api_url" {
  # "component" serves as a unique identifier for this
  # particular set of assertions in the test results.
  component = "api_url"

  # "equal" and "check" blocks serve as the test assertions.
  # The labels on these blocks are unique identifiers for
  # the assertions, to allow more easily tracking changes
  # in success between runs.
  equal "scheme" {
    description = "default scheme is https"
    got         = local.api_url_parts.scheme
    want        = "https"
  }

  check "port_number" {
    description = "default port number is 8080"
    condition   = can(regex(":8080$", local.api_url_parts.authority))
  }
}

# We can also use data resources to respond to the
# behavior of the real remote system, rather than
# just to values within the Terraform configuration.
data "http" "api_response" {
  depends_on = [
    # Make sure the syntax assertions run first, so
    # we'll be sure to see if it was URL syntax errors
    # that led to this data resource also failing.
    test_assertions.api_url,
  ]

  url = module.main.api_url
}

resource "test_assertions" "api_response" {
  component = "api_response"

  check "valid_json" {
    description = "base URL responds with valid JSON"
    condition   = can(jsondecode(data.http.api_response.body))
  }
}
```

If you like, you can create additional directories alongside the `defaults`
directory to define additional test suites that pass different variable values
into the main module, and then include assertions to verify that the result
has changed in the expected way.
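
For example, a second suite (here called `custom_port`, an invented name)
might override one of the module's input variables and assert that the result
changes accordingly. The `api_port` variable below is a hypothetical
placeholder; substitute an input variable your module actually declares:

```hcl
# tests/custom_port/test_custom_port.tf

terraform {
  required_providers {
    # Test suite modules must always declare the built-in test provider.
    test = {
      source = "terraform.io/builtin/test"
    }
  }
}

module "main" {
  source = "../.."

  # Hypothetical input variable; use one your module actually declares.
  api_port = 8443
}

resource "test_assertions" "api_url" {
  component = "api_url"

  check "port_number" {
    description = "api_url uses the requested port number"
    condition   = can(regex(":8443$", module.main.api_url))
  }
}
```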

### Running Your Tests

The `terraform test` command aims to make it easier to exercise all of your
defined test suites at once, and see only the output related to any test
failures or errors.

The current experimental incarnation of this command expects to be run from
your main module directory. In our example directory structure above, that's
the directory containing `main.tf` and so on, _not_ the specific test suite
directory containing `test_defaults.tf`.
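
For example, assuming your module lives in a hypothetical directory called
`awesome-module`, you would run the tests from that directory:

```shellsession
$ cd awesome-module
$ terraform test
```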

Because these test suites are integration tests rather than unit tests, you'll
need to set up any credentials files or environment variables needed by the
providers your module uses before running `terraform test`. The test command
will, for each suite:

* Install the providers and any external modules the test configuration depends
  on.
* Create an execution plan to create the objects declared in the module.
* Apply that execution plan to create the objects in the real remote system.
* Collect all of the test results from the apply step, which would also have
  "created" the `test_assertions` resources.
* Destroy all of the objects recorded in the temporary test state, as if running
  `terraform destroy` against the test configuration.

```shellsession
$ terraform test
─── Failed: defaults.api_url.scheme (default scheme is https) ───────────────
wrong value
    got:  "http"
    want: "https"
─────────────────────────────────────────────────────────────────────────────
```

In this case, the module returned an `http` rather than an `https` URL by
default, so the `defaults.api_url.scheme` assertion failed, and the
`terraform test` command detected and reported it.

The `test_assertions` resource captures any assertion failures but does not
return an error, so that downstream assertions can still run and capture as
much context as possible. However, if Terraform encounters any _errors_ while
processing the test configuration, it will halt processing, which may cause
some of the test assertions to be skipped.

## Known Limitations

The design above is very much a prototype aimed at gathering more experience
with the possibilities of testing inside the Terraform language. We know it's
currently somewhat non-ergonomic, and we hope to improve on that in later
phases of research and design, but the main focus of this iteration is on
available functionality. With that in mind, there are some specific
possibilities that we know the current prototype doesn't support well:

* Testing of subsequent updates to an existing deployment of a module.
  Tests written in this way can only exercise the create and destroy
  behaviors.

* Assertions about expected errors. For a module that includes variable
  validation rules and data resources that function as assertion checks,
  the current prototype doesn't have any way to express that a particular
  set of inputs is _expected_ to produce an error, and thus to report a test
  failure if it doesn't (see the sketch after this list). We'll hopefully be
  able to improve on this in a future iteration with the test assertions
  better integrated into the language.

* Capturing context about failures. Due to this prototype using a provider as
  an approximation for new assertion syntax, the `terraform test` command is
  limited in how much context it's able to gather about failures. A design
  more integrated into the language could potentially capture the source
  expressions and input values to give better feedback about what went wrong,
  similar to what Terraform typically returns from expression evaluation errors
  in the main language.

* Unit testing without creating real objects. Although we do hope to spend more
  time researching possibilities for unit testing against fake test doubles in
  the future, we've decided to focus on integration testing to start because
  it feels like the better-defined problem.

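To illustrate the "expected errors" limitation, here is a sketch of the kind
of input variable validation rule whose failure behavior can't currently be
asserted from a test suite. The variable name and rule are hypothetical
examples, not part of the experiment itself:

```hcl
variable "api_port" {
  type    = number
  default = 443

  validation {
    # This rule acts as an implicit assertion about the module's inputs,
    # but the current prototype has no way to write a test declaring that
    # a particular input value is *expected* to fail it.
    condition     = var.api_port > 0 && var.api_port < 65536
    error_message = "The api_port value must be a valid TCP port number."
  }
}
```
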
## Sending Feedback

The feedback we'd most like to see at this stage of the experiment is the
source code of any tests you've written against real modules using the
features described above, along with notes about anything you attempted to
test but were blocked from doing by the limitations of those features. The
ideal way to share that would be a link to a version control branch where
you've added such tests, if your module is open source.

If you've previously written or attempted to write tests in an external
language, using a system like Terratest or kitchen-terraform, we'd also be
interested to hear how the two approaches compare: what worked well in each
and what didn't work so well.

Our ultimate goal is to work towards an integration testing methodology which
strikes the best compromise between the capabilities of these different
approaches, ideally avoiding a hard requirement on any particular external
language and fitting well into the Terraform workflow.

Since this is still early work and likely to lead to unstructured discussion,
we'd like to gather feedback primarily via new topics in
[the community forum](https://discuss.hashicorp.com/c/terraform-core/27). That
way we can have some more freedom to explore different ideas and approaches
without the structural requirements we typically impose on GitHub issues.

Any feedback you'd like to share would be very welcome!