github.com/SUSE/skuba@v1.4.17/ci/infra/testrunner/tests/README.md

# Writing tests

`testrunner` offers the `test` command, which allows running tests using `testrunner`'s functionality for deploying infrastructure and executing `skuba` commands.

Tests are based on the [pytest](https://docs.pytest.org) framework and take advantage of features such as [`fixtures`](https://docs.pytest.org/en/latest/fixture.html) to facilitate test setup and teardown.

Following pytest's standard test organization, tests must be defined in Python files whose names follow the pattern `xxx_test.py`, where `xxx` is the name of the test suite. Each test is defined in the test file as an individual function or as a function in a class. Test functions must follow the naming convention `test_xxx`, where `xxx` is the name of the test.

See the following example:

```
def test_add_worker(bootstrap, skuba):
    skuba.node_join(role="worker", nr=0)
    masters = skuba.num_of_nodes("master")
    workers = skuba.num_of_nodes("worker")
    assert masters == 1
    assert workers == 1
```
Listing 1. Sample Test

## Using fixtures

You may have noticed the two parameters to `test_add_worker` in the example above, `bootstrap` and `skuba`. These are `fixtures`.

Testrunner provides the following fixtures:
- `conf`: an object with the configuration read from the `vars` file.
- `platform`: a `Platform` object
- `skuba`: a configured `Skuba` object
- `target`: the name of the target platform

Tests can define and use additional fixtures, such as the `setup` fixture in the example below, which executes the initialization of the cluster. When a fixture is used for this purpose, one interesting feature is the possibility of defining a finalizer function, which is executed automatically when a test that uses the fixture ends, either successfully or due to an error.

The example below shows a fixture that provides a bootstrapped cluster. It also automatically cleans up the allocated resources by registering the `cleanup` function as a finalizer:

```
@pytest.fixture
def setup(request, platform, skuba):
    def cleanup():
        platform.cleanup()
    request.addfinalizer(cleanup)

    platform.provision()
    skuba.cluster_init()
    skuba.node_bootstrap()
```

Note: pytest also allows a more idiomatic way of defining teardown logic in fixtures by using Python's `yield` statement instead of registering a finalizer, as shown in the code below. However, finalizer functions have the advantage that they are always called even if the fixture setup code raises an exception, provided they are registered before the exception occurs. Therefore, **testrunner encourages using finalizer functions**.

```
@pytest.fixture
def setup(request, platform, skuba):
    platform.provision()
    skuba.cluster_init()
    skuba.node_bootstrap()
    yield               # return from fixture
    platform.cleanup()  # teardown logic
```

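To see why registration order matters, here is a pytest-independent sketch of the finalizer mechanism. `FakeRequest` is a hypothetical stand-in for pytest's `request` object, for illustration only; it is not testrunner code:

```python
# Minimal stand-in for pytest's `request` object, illustrating why a
# finalizer registered *before* a setup failure still runs at teardown.

class FakeRequest:
    def __init__(self):
        self._finalizers = []

    def addfinalizer(self, func):
        self._finalizers.append(func)

    def teardown(self):
        # pytest calls finalizers in reverse registration order
        while self._finalizers:
            self._finalizers.pop()()

events = []

def setup_fixture(request):
    # finalizer registered before any risky step
    request.addfinalizer(lambda: events.append("cleanup"))
    events.append("provision")
    raise RuntimeError("bootstrap failed")  # setup fails *after* registration

request = FakeRequest()
try:
    setup_fixture(request)
except RuntimeError:
    pass
finally:
    request.teardown()  # the cleanup finalizer still runs
```

In the same situation, a `yield`-based fixture would never reach its teardown code, because the setup raises before the `yield` statement.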
## Reusing already deployed infrastructure

Sometimes it is convenient to reuse an already deployed infrastructure when executing tests. This is a common case while tests are being developed (as they must be tested by the developer and errors need to be fixed), or when multiple tests which have no side effects can share the same infrastructure.

To address these use cases, `testrunner`'s [`test` command](../README.md#test-command) provides the `--skip-setup` option, which allows skipping the execution of one or more setup fixtures without having to modify the test or the fixtures. If a fixture depends on other fixtures, those are also skipped automatically.

Consider the following fixtures:

```
@pytest.fixture()
def provision():
    # provision infrastructure
    ...

@pytest.fixture()
def bootstrap(provision, skuba):
    # bootstrap cluster
    ...

@pytest.fixture()
def deployment(bootstrap, platform, skuba):
    # complete cluster deployment
    # joining all nodes
    ...

def test_deployment(deployment, skuba):
    # test fully deployed cluster
    ...
```

Running the following command will execute the test without executing the `deployment` fixture or any of the fixtures it depends on (`bootstrap`, `provision`):

```
testrunner test --skip-setup deployment -t test_deployment
```

## Running tests with the Testrunner

The `testrunner` command can be used for running tests. It allows selecting a directory, an individual test file (a suite of tests), or a specific test in a test file.

Given the following directory structure:
```
testrunner
vars
 |-- vars.yaml
tests
 |-- test_workers.py
```

The command below will execute the `test_add_worker` function defined in `tests/test_workers.py`:

```
testrunner -v vars/vars.yaml test --module tests --suite test_workers.py --test test_add_worker
```

## Using Testrunner library

Testrunner provides a library of functions that wrap `skuba` and `terraform` for executing actions such as provisioning a platform or running any `skuba` command.

### Platform

`Platform` offers the functions required for provisioning a platform for deploying a cluster. It provides the following functions:
- `get_platform(conf)`: returns an instance of the platform initialized with the configuration passed in the `conf` parameter. This configuration can be obtained by means of the `conf` fixture.
- `provision`: executes the provisioning of the platform
- `cleanup`: releases any resources obtained by `provision`
- `get_nodes_ipaddrs(role)`: returns the list of IP addresses of the nodes provisioned for a role
- `get_lb_ipadd`: returns the IP address of the load balancer node

### Skuba

`Skuba` wraps the `skuba` commands:
- `Skuba(conf)`: creates an instance of the `Skuba` class initialized with the configuration provided in `conf`
- `cluster_init()`: initializes the skuba cluster configuration
- `node_bootstrap()`: bootstraps a cluster
- `node_join(role, nr)`: adds a new node to the cluster with the given role. The node is identified by its index in the provisioned nodes for that role.
- `node_remove(role, nr)`: removes a node currently part of the cluster. The node is identified by its role and its index in the list of provisioned nodes for that role.
- `num_of_nodes(role)`: returns the number of nodes in the cluster for the given role.

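The calls above compose as shown in the following sketch. `StubPlatform` and `StubSkuba` are hypothetical stand-ins introduced here so the flow can run standalone; in real tests the objects come from the `platform` and `skuba` fixtures:

```python
# Illustrative stubs mirroring the Platform and Skuba call sequence of a
# typical test: provision, init, bootstrap, join a worker, check, clean up.

class StubPlatform:
    def provision(self):
        self.provisioned = True

    def cleanup(self):
        self.provisioned = False

class StubSkuba:
    def __init__(self):
        self.nodes = {"master": 0, "worker": 0}

    def cluster_init(self):
        self.initialized = True

    def node_bootstrap(self):
        self.nodes["master"] = 1  # bootstrapping creates the first master

    def node_join(self, role, nr):
        self.nodes[role] += 1

    def num_of_nodes(self, role):
        return self.nodes[role]

platform, skuba = StubPlatform(), StubSkuba()
platform.provision()
skuba.cluster_init()
skuba.node_bootstrap()
skuba.node_join(role="worker", nr=0)
assert skuba.num_of_nodes("master") == 1
assert skuba.num_of_nodes("worker") == 1
platform.cleanup()
```
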
## Handling timeouts and retries

Sometimes tests involve operations that require waiting for some time until they are completed (e.g. deploying a component) and possibly retrying them. To facilitate implementing this kind of logic, the testrunner test library offers the `wait` function, which wraps another function call and specifies how to handle timeouts and retries:

```
wait(func, *args, **kwargs)
```

The `wait` function receives a function to invoke, a list of arguments, and a list of keyword arguments, which are passed to the function. Additionally, some keyword parameters can be passed to the `wait` function itself:
* `wait_allow`: a tuple of exceptions that are expected and must be retried (default: none)
* `wait_backoff`: delay in seconds between retries (default: 0 seconds)
* `wait_delay`: time in seconds before the first try (default: 0)
* `wait_elapsed`: maximum time to wait for the function to complete successfully, regardless of the number of attempts and considering the total time of the initial delay, the timeout for each attempt, and the backoff between attempts. If specified, a non-zero `wait_retries` cannot be specified.
* `wait_retries`: number of retries in case of a failed or timed-out invocation. If specified, a non-zero `wait_elapsed` cannot be specified.
* `wait_timeout`: timeout in seconds for each try to complete

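The retry semantics described above can be sketched as follows. This is a simplified illustration, not the real `wait` implementation (which lives in testrunner's test utilities); the per-attempt `wait_timeout` and the `wait_elapsed` bound are omitted for brevity, and the defaults are assumptions:

```python
import time

def wait_sketch(func, *args, wait_allow=(), wait_backoff=0,
                wait_delay=0, wait_retries=0, **kwargs):
    """Simplified retry loop: initial delay, then up to wait_retries retries,
    re-raising the last allowed exception if every attempt fails."""
    time.sleep(wait_delay)            # initial delay before the first try
    last_exc = None
    for attempt in range(wait_retries + 1):
        if attempt > 0:
            time.sleep(wait_backoff)  # pause between retries
        try:
            return func(*args, **kwargs)
        except wait_allow as exc:     # only expected exceptions are retried
            last_exc = exc
    raise last_exc

# Usage: a flaky function that succeeds on its third call.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("not ready")
    return "ok"

result = wait_sketch(flaky, wait_allow=(RuntimeError,), wait_retries=3)
```

Note that any exception not listed in `wait_allow` propagates immediately instead of being retried.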
For example, the following code reboots a node and waits until a command can be executed successfully, with an initial 30-second delay to give the node time to reboot. The exception `RuntimeError` is allowed and retried because `ssh_run` raises it when it cannot establish a connection.
```
from test.testconf import platform
from test.utils import wait

def test_reboot(provision, platform):

    platform.ssh_run("master", "0", "sudo reboot &")

    wait(platform.ssh_run, "master", "0", "/bin/true", wait_delay=30, wait_timeout=10,
         wait_retries=3, wait_backoff=30, wait_allow=(RuntimeError,))
```

Alternatively, if we are not interested in the number of attempts but in a fixed maximum elapsed time for the test, for example 120 seconds, we can use the following code:
```
def test_reboot(provision, platform):

    platform.ssh_run("master", "0", "sudo reboot &")

    wait(platform.ssh_run, "master", "0", "/bin/true", wait_delay=30, wait_timeout=10,
         wait_backoff=30, wait_elapsed=120, wait_allow=(RuntimeError,))
```
Notice that in this case changing the `wait_timeout` or `wait_backoff` parameters will not affect the maximum time the test can take, which makes it easier to reason about test duration. For example, with `wait_delay=30`, `wait_timeout=10`, and `wait_backoff=30`, at most three attempts fit in the 120-second window (30 + 10 + 30 + 10 + 30 + 10 = 120).

**Note**: the current implementation does not allow nesting calls to the `wait` function. Therefore, the code shown below will not work:
```
def waiting_function():
    wait(...)

def test_waiting():
    wait(waiting_function, ...)
```