---
layout: "language"
page_title: "Module Testing Experiment - Configuration Language"
---

# Module Testing Experiment

This page is about some experimental features available in recent versions of
Terraform CLI related to integration testing of shared modules.

The Terraform team is aiming to use these features to gather feedback as part
of ongoing research into different strategies for testing Terraform modules.
These features are likely to change significantly in future releases based on
feedback.

## Current Research Goals

Our initial area of research is into the question of whether it's helpful and
productive to write module integration tests in the Terraform language itself,
or whether it's better to handle that as a separate concern orchestrated by
code written in other languages.

Some existing efforts have piloted both approaches:

* [Terratest](https://terratest.gruntwork.io/) and
  [kitchen-terraform](https://github.com/newcontext-oss/kitchen-terraform)
  both pioneered the idea of writing tests for Terraform modules with explicit
  orchestration written in the Go and Ruby programming languages, respectively.

* The Terraform provider
  [`apparentlymart/testing`](https://registry.terraform.io/providers/apparentlymart/testing/latest)
  introduced the idea of writing Terraform module tests in the Terraform
  language itself, using a special provider that can evaluate assertions
  and fail `terraform apply` if they don't pass.

Each of these approaches has advantages and disadvantages, and so it's
likely that both will coexist for different situations, but the community
efforts have already explored the external-language testing model quite deeply
while the Terraform-integrated testing model has not yet been widely trialled.
For that reason, the current iteration of the module testing experiment is
aimed at trying to make the Terraform-integrated approach more accessible so
that more module authors can hopefully try it and share their experiences.

## Current Experimental Features

-> This page describes the incarnation of the experimental features introduced
in **Terraform CLI v0.15.0**. If you are using an earlier version of Terraform
then you'll need to upgrade to v0.15.0 or later to use the experimental features
described here, though you only need to use v0.15.0 or later for running tests;
your module itself can remain compatible with earlier Terraform versions, if
needed.

Our current area of interest is in what sorts of tests can and cannot be
written using features integrated into the Terraform language itself. As a
means to investigate that without invasive, cross-cutting changes to Terraform
Core, we're using a special built-in Terraform provider as a placeholder for
potential new features.

If this experiment is successful then we expect to run a second round of
research and design about exactly what syntax is most ergonomic for writing
tests, but for the moment we're interested less in the specific syntax and more
in the capabilities of this approach.

The temporary extensions to Terraform for this experiment consist of the
following parts:

* A temporary experimental provider `terraform.io/builtin/test`, which acts as
  a placeholder for potential new language features related to test assertions.

* A `terraform test` command for more conveniently running multiple tests in
  a single action.

* An experimental convention of placing test configurations in subdirectories
  of a `tests` directory within your module, which `terraform test` will then
  discover and run.

We would like to invite adventurous module authors to try writing integration
tests for their modules using these mechanisms, and ideally also share the
tests you write (in a temporary VCS branch, if necessary) so we can see what
you were able to test, along with anything you felt unable to test in this way.

If you're interested in giving this a try, see the following sections for
usage details. Because these features are temporary experimental extensions,
there's some boilerplate required to activate and make use of them which would
likely not be required in a final design.

### Writing Tests for a Module

For the purposes of the current experiment, module tests are arranged into
_test suites_, each of which is a root Terraform module that includes a
`module` block calling the module under test, and ideally also a number of
test assertions to verify that the module outputs match expectations.

In the same directory where you keep your module's `.tf` and/or `.tf.json`
source files, create a subdirectory called `tests`. Under that directory,
make another directory which will serve as your first test suite, with a
directory name that concisely describes what the suite is aiming to test.

Here's an example directory structure of a typical module directory layout
with the addition of a test suite called `defaults`:

```
main.tf
outputs.tf
providers.tf
variables.tf
versions.tf
tests/
  defaults/
    test_defaults.tf
```

The `tests/defaults/test_defaults.tf` file will contain a call to the
main module with a suitable set of arguments and hopefully also one or more
resources that will, for the sake of the experiment, serve as the temporary
syntax for defining test assertions. For example:

```hcl
terraform {
  required_providers {
    # Because we're currently using a built-in provider as
    # a substitute for dedicated Terraform language syntax,
    # test suite modules must always declare a
    # dependency on this provider. This provider is only
    # available when running tests, so you shouldn't use it
    # in non-test modules.
    test = {
      source = "terraform.io/builtin/test"
    }

    # This example also uses the "http" data source to
    # verify the behavior of the hypothetical running
    # service, so we should declare that too.
    http = {
      source = "hashicorp/http"
    }
  }
}

module "main" {
  # source is always ../.. for test suite configurations,
  # because they are placed two subdirectories deep under
  # the main module directory.
  source = "../.."

  # This test suite is aiming to test the "defaults" for
  # this module, so it doesn't set any input variables
  # and just lets their default values be selected instead.
}

# As with all Terraform modules, we can use local values
# to do any necessary post-processing of the results from
# the module in preparation for writing test assertions.
locals {
  # This expression also serves as an implicit assertion
  # that the base URL uses URL syntax; the test suite
  # will fail if this function fails.
  api_url_parts = regex(
    "^(?:(?P<scheme>[^:/?#]+):)?(?://(?P<authority>[^/?#]*))?",
    module.main.api_url,
  )
}

# The special test_assertions resource type, which belongs
# to the test provider we required above, is a temporary
# syntax for writing out explicit test assertions.
resource "test_assertions" "api_url" {
  # "component" serves as a unique identifier for this
  # particular set of assertions in the test results.
  component = "api_url"

  # equal and check blocks serve as the test assertions.
  # The labels on these blocks are unique identifiers for
  # the assertions, to allow more easily tracking changes
  # in success between runs.

  equal "scheme" {
    description = "default scheme is https"
    got         = local.api_url_parts.scheme
    want        = "https"
  }

  check "port_number" {
    description = "default port number is 8080"
    condition   = can(regex(":8080$", local.api_url_parts.authority))
  }
}

# We can also use data resources to respond to the
# behavior of the real remote system, rather than
# just to values within the Terraform configuration.
data "http" "api_response" {
  depends_on = [
    # Make sure the syntax assertions run first, so
    # we'll be sure to see if it was URL syntax errors
    # that led to this data resource also failing.
    test_assertions.api_url,
  ]

  url = module.main.api_url
}

resource "test_assertions" "api_response" {
  component = "api_response"

  check "valid_json" {
    description = "base URL responds with valid JSON"
    condition   = can(jsondecode(data.http.api_response.body))
  }
}
```

If you like, you can create additional directories alongside
the `defaults` directory to define additional test suites that
pass different variable values into the main module, and
then include assertions that verify that the result has changed
in the expected way.
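
For example, a second suite might override one of the module's input variables and assert the effect on the resulting outputs. This is only a sketch: the `api_port` input variable below is hypothetical, so substitute a real variable and assertion from your own module.

```hcl
# tests/custom_port/test_custom_port.tf

terraform {
  required_providers {
    # Test suite modules must always declare a dependency
    # on the experimental built-in test provider.
    test = {
      source = "terraform.io/builtin/test"
    }
  }
}

module "main" {
  source = "../.."

  # Hypothetical input variable, shown here only to
  # illustrate overriding a default value.
  api_port = 9000
}

resource "test_assertions" "api_url" {
  component = "api_url"

  check "port_number" {
    description = "api_url reflects the overridden port"
    condition   = can(regex(":9000$", module.main.api_url))
  }
}
```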

### Running Your Tests

The `terraform test` command aims to make it easier to exercise all of your
defined test suites at once, and see only the output related to any test
failures or errors.

The current experimental incarnation of this command expects to be run from
your main module directory. In our example directory structure above,
that was the directory containing `main.tf` etc., and _not_ the specific test
suite directory containing `test_defaults.tf`.

Because these test suites are integration tests rather than unit tests, you'll
need to set up any credentials files or environment variables needed by the
providers your module uses before running `terraform test`. The test command
will, for each suite:

* Install the providers and any external modules the test configuration depends
  on.
* Create an execution plan to create the objects declared in the module.
* Apply that execution plan to create the objects in the real remote system.
* Collect all of the test results from the apply step, which would also have
  "created" the `test_assertions` resources.
* Destroy all of the objects recorded in the temporary test state, as if running
  `terraform destroy` against the test configuration.
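
The steps above are roughly equivalent to running the following manual commands in each suite directory, although this is only an approximation: `terraform test` manages its own temporary state internally rather than leaving a state file behind.

```shellsession
$ cd tests/defaults
$ terraform init
$ terraform apply -auto-approve
$ terraform destroy -auto-approve
$ cd ../..
```

Unlike these manual steps, `terraform test` shows only the output related to failures, as in the following example run.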

```shellsession
$ terraform test
─── Failed: defaults.api_url.scheme (default scheme is https) ───────────────
wrong value
    got:  "http"
    want: "https"
─────────────────────────────────────────────────────────────────────────────
```

In this case, it seems like the module returned an `http` rather than an
`https` URL in the default case, and so the `defaults.api_url.scheme`
assertion failed, and the `terraform test` command detected and reported it.

The `test_assertions` resource captures any assertion failures without
returning an error, so that downstream assertions can still run and thus
capture as much context as possible.
However, if Terraform encounters any _errors_ while processing the test
configuration it will halt processing, which may cause some of the test
assertions to be skipped.

## Known Limitations

The design above is very much a prototype aimed at gathering more experience
with the possibilities of testing inside the Terraform language. We know it's
currently somewhat non-ergonomic, and hope to improve on that in later phases
of research and design, but the main focus of this iteration is on available
functionality. With that in mind, there are some specific possibilities
that we know the current prototype doesn't support well:

* Testing of subsequent updates to an existing deployment of a module.
  Currently tests written in this way can only exercise the create and destroy
  behaviors.

* Assertions about expected errors. For a module that includes variable
  validation rules and data resources that function as assertion checks,
  the current prototype doesn't have any way to express that a particular
  set of inputs is _expected_ to produce an error, and thus report a test
  failure if it doesn't. We'll hopefully be able to improve on this in a future
  iteration with the test assertions better integrated into the language.

* Capturing context about failures. Due to this prototype using a provider as
  an approximation for new assertion syntax, the `terraform test` command is
  limited in how much context it's able to gather about failures. A design
  more integrated into the language could potentially capture the source
  expressions and input values to give better feedback about what went wrong,
  similar to what Terraform typically returns from expression evaluation errors
  in the main language.

* Unit testing without creating real objects. Although we do hope to spend more
  time researching possibilities for unit testing against fake test doubles in
  the future, we've decided to focus on integration testing to start because
  it feels like the better-defined problem.
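
As an illustration of the expected-errors limitation above, a module might contain a variable validation rule like the following (the `instance_count` variable here is hypothetical). The current prototype offers no way for a test suite to assert that, for example, setting `instance_count = 0` is rejected:

```hcl
variable "instance_count" {
  type = number

  # Any plan that sets instance_count below 1 will fail
  # with this error message, but a test suite cannot
  # currently express that such a failure is the
  # expected, passing outcome.
  validation {
    condition     = var.instance_count > 0
    error_message = "The instance_count value must be at least 1."
  }
}
```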

## Sending Feedback

The feedback we'd most like to see at this stage of the experiment is
the source code of any tests you've written against real modules using
the features described above, along with notes about anything that you
attempted to test but were blocked from doing so by limitations of the above
features. The ideal way to share that would be a link to a
version control branch where you've added such tests, if your module is open
source.

If you've previously written or attempted to write tests in an external
language, using a system like Terratest or kitchen-terraform, we'd also be
interested to hear about comparative differences between the two approaches:
what worked well in each and what didn't work so well.

Our ultimate goal is to work towards an integration testing methodology which
strikes the best compromise between the capabilities of these different
approaches, ideally avoiding a hard requirement on any particular external
language and fitting well into the Terraform workflow.

Since this is still early work and likely to lead to unstructured discussion,
we'd like to gather feedback primarily via new topics in
[the community forum](https://discuss.hashicorp.com/c/terraform-core/27). That
way we can have some more freedom to explore different ideas and approaches
without the structural requirements we typically impose on GitHub issues.

Any feedback you'd like to share would be very welcome!