---
page_title: Module Testing Experiment - Configuration Language
---

# Module Testing Experiment

This page describes experimental features available in recent versions of
Terraform CLI that relate to integration testing of shared modules.

The Terraform team is aiming to use these features to gather feedback as part
of ongoing research into different strategies for testing Terraform modules.
These features are likely to change significantly in future releases based on
that feedback.

## Current Research Goals

Our initial area of research is the question of whether it's helpful and
productive to write module integration tests in the Terraform language itself,
or whether it's better to handle that as a separate concern orchestrated by
code written in other languages.

Some existing efforts have piloted both approaches:

* [Terratest](https://terratest.gruntwork.io/) and
  [kitchen-terraform](https://github.com/newcontext-oss/kitchen-terraform)
  both pioneered the idea of writing tests for Terraform modules with explicit
  orchestration written in the Go and Ruby programming languages, respectively.

* The Terraform provider
  [`apparentlymart/testing`](https://registry.terraform.io/providers/apparentlymart/testing/latest)
  introduced the idea of writing Terraform module tests in the Terraform
  language itself, using a special provider that can evaluate assertions
  and fail `terraform apply` if they don't pass.

Each of these approaches has advantages and disadvantages, so it's likely
that both will coexist for different situations. However, the community
efforts have already explored the external-language testing model quite deeply,
while the Terraform-integrated testing model has not yet been widely trialed.
For that reason, the current iteration of the module testing experiment is
aimed at making the Terraform-integrated approach more accessible, so that
more module authors can try it and share their experiences.

## Current Experimental Features

-> This page describes the incarnation of the experimental features introduced
in **Terraform CLI v0.15.0**. If you are using an earlier version of Terraform
then you'll need to upgrade to v0.15.0 or later to use the experimental features
described here. You only need v0.15.0 or later for running tests, though;
your module itself can remain compatible with earlier Terraform versions, if
needed.

Our current area of interest is in what sorts of tests can and cannot be
written using features integrated into the Terraform language itself. As a
means to investigate that without invasive, cross-cutting changes to Terraform
Core, we're using a special built-in Terraform provider as a placeholder for
potential new features.

If this experiment is successful then we expect to run a second round of
research and design about exactly what syntax is most ergonomic for writing
tests, but for the moment we're less interested in the specific syntax and
more in the capabilities of this approach.

The temporary extensions to Terraform for this experiment consist of the
following parts:

* A temporary experimental provider `terraform.io/builtin/test`, which acts as
  a placeholder for potential new language features related to test assertions.

* A `terraform test` command for more conveniently running multiple tests in
  a single action.

* An experimental convention of placing test configurations in subdirectories
  of a `tests` directory within your module, which `terraform test` will then
  discover and run.

We would like to invite adventurous module authors to try writing integration
tests for their modules using these mechanisms, and ideally also to share the
tests you write (in a temporary VCS branch, if necessary) so we can see what
you were able to test, along with anything you felt unable to test in this way.

If you're interested in giving this a try, see the following sections for
usage details. Because these features are temporary experimental extensions,
there's some boilerplate required to activate and make use of them which would
likely not be required in a final design.

### Writing Tests for a Module

For the purposes of the current experiment, module tests are arranged into
_test suites_, each of which is a root Terraform module that includes a
`module` block calling the module under test, and ideally also a number of
test assertions to verify that the module outputs match expectations.

In the same directory where you keep your module's `.tf` and/or `.tf.json`
source files, create a subdirectory called `tests`. Under that directory,
make another directory which will serve as your first test suite, with a
directory name that concisely describes what the suite is aiming to test.

Here's an example directory structure of a typical module directory layout
with the addition of a test suite called `defaults`:

```
main.tf
outputs.tf
providers.tf
variables.tf
versions.tf
tests/
  defaults/
    test_defaults.tf
```

The `tests/defaults/test_defaults.tf` file will contain a call to the
main module with a suitable set of arguments and hopefully also one or more
resources that will, for the sake of the experiment, serve as the temporary
syntax for defining test assertions. For example:

```hcl
terraform {
  required_providers {
    # Because we're currently using a built-in provider as
    # a substitute for dedicated Terraform language syntax,
    # test suite modules must always declare a dependency
    # on this provider. This provider is only available
    # when running tests, so you shouldn't use it in
    # non-test modules.
    test = {
      source = "terraform.io/builtin/test"
    }

    # This example also uses the "http" data source to
    # verify the behavior of the hypothetical running
    # service, so we should declare that too.
    http = {
      source = "hashicorp/http"
    }
  }
}

module "main" {
  # source is always ../.. for test suite configurations,
  # because they are placed two subdirectories deep under
  # the main module directory.
  source = "../.."

  # This test suite is aiming to test the "defaults" for
  # this module, so it doesn't set any input variables
  # and just lets their default values be selected instead.
}

# As with all Terraform modules, we can use local values
# to do any necessary post-processing of the results from
# the module in preparation for writing test assertions.
locals {
  # This expression also serves as an implicit assertion
  # that the base URL uses URL syntax; the test suite
  # will fail if this function fails.
  api_url_parts = regex(
    "^(?:(?P<scheme>[^:/?#]+):)?(?://(?P<authority>[^/?#]*))?",
    module.main.api_url,
  )
}

# The special test_assertions resource type, which belongs
# to the test provider we required above, is a temporary
# syntax for writing out explicit test assertions.
resource "test_assertions" "api_url" {
  # "component" serves as a unique identifier for this
  # particular set of assertions in the test results.
  component = "api_url"

  # equal and check blocks serve as the test assertions.
  # The labels on these blocks are unique identifiers for
  # the assertions, to allow more easily tracking changes
  # in success between runs.

  equal "scheme" {
    description = "default scheme is https"
    got         = local.api_url_parts.scheme
    want        = "https"
  }

  check "port_number" {
    description = "default port number is 8080"
    condition   = can(regex(":8080$", local.api_url_parts.authority))
  }
}

# We can also use data resources to respond to the
# behavior of the real remote system, rather than
# just to values within the Terraform configuration.
data "http" "api_response" {
  depends_on = [
    # make sure the syntax assertions run first, so
    # we'll be sure to see if it was URL syntax errors
    # that led to this data resource also failing.
    test_assertions.api_url,
  ]

  url = module.main.api_url
}

resource "test_assertions" "api_response" {
  component = "api_response"

  check "valid_json" {
    description = "base URL responds with valid JSON"
    condition   = can(jsondecode(data.http.api_response.body))
  }
}
```

If you like, you can create additional directories alongside
the `defaults` directory to define additional test suites that
pass different variable values into the main module, and
then include assertions that verify that the result has changed
in the expected way.

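For example, a hypothetical second suite might override one of the module's
input variables and then assert that the output changed accordingly. The
`listen_port` variable and the `api_url` output here are illustrative, not
part of any real module; substitute your own module's variables and outputs:

```hcl
# tests/custom_port/test_custom_port.tf (hypothetical second suite)

terraform {
  required_providers {
    # Every test suite module must declare a dependency on
    # the experimental built-in test provider.
    test = {
      source = "terraform.io/builtin/test"
    }
  }
}

module "main" {
  source = "../.."

  # Illustrative input variable: override the default port.
  listen_port = 9090
}

resource "test_assertions" "api_url" {
  component = "api_url"

  check "port_number" {
    description = "overridden port number appears in the URL"
    condition   = can(regex(":9090$", module.main.api_url))
  }
}
```
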
### Running Your Tests

The `terraform test` command aims to make it easier to exercise all of your
defined test suites at once, and to see only the output related to any test
failures or errors.

The current experimental incarnation of this command expects to be run from
your main module directory. In our example directory structure above,
that was the directory containing `main.tf` etc., and _not_ the specific test
suite directory containing `test_defaults.tf`.

Because these test suites are integration tests rather than unit tests, you'll
need to set up any credentials files or environment variables needed by the
providers your module uses before running `terraform test`. The test command
will, for each suite:

* Install the providers and any external modules the test configuration depends
  on.
* Create an execution plan to create the objects declared in the module.
* Apply that execution plan to create the objects in the real remote system.
* Collect all of the test results from the apply step, which would also have
  "created" the `test_assertions` resources.
* Destroy all of the objects recorded in the temporary test state, as if running
  `terraform destroy` against the test configuration.
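
For instance, if the module under test uses a provider that reads credentials
from environment variables, a run might look like the following. The AWS
variable names and credential values are only placeholders; export whatever
your module's providers actually need:

```shellsession
$ export AWS_ACCESS_KEY_ID=AKIA...
$ export AWS_SECRET_ACCESS_KEY=...
$ terraform test
```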

If any assertion fails, `terraform test` detects and reports it:

```shellsession
$ terraform test
─── Failed: defaults.api_url.scheme (default scheme is https) ───────────────
wrong value
    got:  "http"
    want: "https"
─────────────────────────────────────────────────────────────────────────────
```

In this case, it seems like the module returned an `http` rather than an
`https` URL in the default case, and so the `defaults.api_url.scheme`
assertion failed, and the `terraform test` command detected and reported it.

The `test_assertions` resource captures any assertion failures but does not
return an error, because doing so allows downstream assertions to also run
and thus capture as much context as possible. However, if Terraform
encounters any _errors_ while processing the test configuration it will halt
processing, which may cause some of the test assertions to be skipped.

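To make that distinction concrete, here's a hypothetical sketch. The first
block records an assertion _failure_ without stopping the run, whereas an
expression _error_ halts processing entirely:

```hcl
# Hypothetical sketch: a false check condition is recorded as a test
# failure, but does not stop the run, so assertions declared elsewhere
# in the suite still execute and report their own results.
resource "test_assertions" "always_fails" {
  component = "example"

  check "deliberate_failure" {
    description = "this condition is always false"
    condition   = 1 == 2
  }
}

# By contrast, an expression that produces an error, such as a regex()
# call in a locals block whose pattern doesn't match, halts processing,
# and any assertions that depend on its result are skipped rather than
# reported as failures.
```
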
## Known Limitations

The design above is very much a prototype aimed at gathering more experience
with the possibilities of testing inside the Terraform language. We know it's
currently somewhat non-ergonomic, and hope to improve on that in later phases
of research and design, but the main focus of this iteration is on available
functionality, and so with that in mind there are some specific possibilities
that we know the current prototype doesn't support well:

* Testing of subsequent updates to an existing deployment of a module.
  Tests written in this way can currently only exercise the create and destroy
  behaviors.

* Assertions about expected errors. For a module that includes variable
  validation rules and data resources that function as assertion checks,
  the current prototype doesn't have any way to express that a particular
  set of inputs is _expected_ to produce an error, and thus report a test
  failure if it doesn't. We'll hopefully be able to improve on this in a future
  iteration with the test assertions better integrated into the language.

* Capturing context about failures. Because this prototype uses a provider as
  an approximation for new assertion syntax, the `terraform test` command is
  limited in how much context it's able to gather about failures. A design
  more integrated into the language could potentially capture the source
  expressions and input values to give better feedback about what went wrong,
  similar to what Terraform typically returns from expression evaluation errors
  in the main language.

* Unit testing without creating real objects. Although we do hope to spend more
  time researching possibilities for unit testing against fake test doubles in
  the future, we've decided to focus on integration testing to start, because
  it feels like the better-defined problem.

## Sending Feedback

The sort of feedback we'd most like to see at this stage of the experiment is
the source code of any tests you've written against real modules using the
features described above, along with notes about anything that you attempted
to test but were blocked from doing so by limitations of the above features.
The ideal way to share that would be a link to a version control branch where
you've added such tests, if your module is open source.

If you've previously written or attempted to write tests in an external
language, using a system like Terratest or kitchen-terraform, we'd also be
interested to hear about comparative differences between the two: what worked
well in each and what didn't work so well.

Our ultimate goal is to work towards an integration testing methodology which
strikes the best compromise between the capabilities of these different
approaches, ideally avoiding a hard requirement on any particular external
language and fitting well into the Terraform workflow.

Since this is still early work and likely to lead to unstructured discussion,
we'd like to gather feedback primarily via new topics in
[the community forum](https://discuss.hashicorp.com/c/terraform-core/27). That
way we can have more freedom to explore different ideas and approaches
without the structural requirements we typically impose on GitHub issues.

Any feedback you'd like to share would be very welcome!