github.com/pingcap/tiflow@v0.0.0-20240520035814-5bf52d54e205/tests/integration_tests/README.md

## Preparations

### Run integration tests locally

1. Run `make prepare_test_binaries community=true` to download the binaries that the integration tests depend on.
You can specify the version and architecture, for example: `make prepare_test_binaries community=true ver=v7.0.0 arch=amd64`.

   After a successful download, you should find these executables in the `tiflow/bin` directory:
   * `tidb-server` # version >= 6.0.0-rc.1
   * `tikv-server` # version >= 6.0.0-rc.1
   * `pd-server`   # version >= 6.0.0-rc.1
   * `pd-ctl`      # version >= 6.0.0-rc.1
   * `tiflash`     # TiFlash binary
   * `libc++.so, libc++abi.so, libgmssl.so, libtiflash_proxy.so` # shared libraries required by TiFlash
   * `sync_diff_inspector`
   * [go-ycsb](https://github.com/pingcap/go-ycsb)
   * [etcdctl](https://github.com/etcd-io/etcd/tree/master/etcdctl)
   * [jq](https://stedolan.github.io/jq/)
   * [minio](https://github.com/minio/minio)

   > You can also download these binaries yourself from [tidb-community-toolkit](https://download.pingcap.org/tidb-community-toolkit-v6.0.0-linux-amd64.tar.gz) and [tidb-community-server](https://download.pingcap.org/tidb-community-server-v6.0.0-linux-amd64.tar.gz).
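
   For other versions or architectures, the download URL follows the same pattern as the links above; a small sketch (the `ver` and `arch` values below are just examples):

   ```shell
   # Sketch: assemble the tidb-community-toolkit download URL for a given
   # version and architecture (ver/arch are example values).
   ver=v7.0.0
   arch=amd64
   url="https://download.pingcap.org/tidb-community-toolkit-${ver}-linux-${arch}.tar.gz"
   echo "$url"
   ```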

2. The following programs/packages must be installed:
   * [mysql](https://dev.mysql.com/doc/mysql-installation-excerpt/5.7/en/) (the MySQL CLI client;
     currently [mysql client 8.0 is not supported](https://github.com/pingcap/tidb/issues/14021))
   * [s3cmd](https://s3tools.org/download)
   * unzip
   * psmisc

   > You can install `unzip` and `psmisc` with `apt-get` (Ubuntu / Debian) or `yum` (RHEL).

   > The integration test cases use port 3306 on localhost, so make sure in advance that port 3306 is
   > not occupied. (Stop any local MySQL service that is listening on port 3306.)
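
   A quick pre-flight check for port 3306 might look like this (a sketch; the presence of `lsof` is an assumption, and the check is skipped if it is missing):

   ```shell
   # Sketch: warn if something is already listening on localhost:3306.
   # Assumes lsof may be installed; if it is not, the check is skipped.
   check_port_3306() {
     if command -v lsof >/dev/null 2>&1 && lsof -iTCP:3306 -sTCP:LISTEN >/dev/null 2>&1; then
       echo "port 3306 is in use; stop the local MySQL service first"
     else
       echo "port 3306 looks free"
     fi
   }
   check_port_3306
   ```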

3. The user executing the tests must have permission to create the folder `/tmp/tidb_cdc_test`. All test artifacts
   will be written into this folder.
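
   A small probe can confirm this up front (a sketch; it simply tries to create the folder):

   ```shell
   # Sketch: verify the test artifact folder can be created by the current user.
   if mkdir -p /tmp/tidb_cdc_test 2>/dev/null; then
     echo "ok: /tmp/tidb_cdc_test is writable"
   else
     echo "error: cannot create /tmp/tidb_cdc_test" >&2
   fi
   ```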

### Run integration tests in docker

The following programs must be installed:

* [docker](https://docs.docker.com/get-docker/)
* [docker-compose](https://docs.docker.com/compose/install/)

We recommend giving Docker at least 6 CPU cores and 8 GB of memory; the more resources, the better.

## Running

### Unit Test

1. Unit tests do not need any dependencies. Run `make unit_test` in the root directory of the source code, or cd into
   the directory of a test case and run a single case via `GO111MODULE=on go test -check.f TestXXX`.

### Integration Test

#### Run integration tests in docker

> **Warning:**
> These scripts and files may not work under the arm architecture,
> and we have not tested against it. We will try to resolve it as soon as possible.
>
> The script downloads the necessary binaries from the PingCAP
> intranet by default, which requires access to the PingCAP intranet. If you
> want to download the community version instead, you can specify it through
> the `COMMUNITY` environment variable. For instance:
> `BRANCH=master COMMUNITY=true VERSION=v7.0.0 START_AT="clustered_index" make kafka_docker_integration_test_with_build`

1. If you want to run the Kafka tests,
   run `START_AT="clustered_index" make kafka_docker_integration_test_with_build`.

2. If you want to run the MySQL tests,
   run `CASE="clustered_index" make mysql_docker_integration_test_with_build`.

3. Use the command `make clean_integration_test_images`
   to clean up the corresponding environment.

Some useful tips:

1. The log files for the tests are mounted in the `./deployments/ticdc/docker-compose/logs` directory.

2. You can specify multiple tests to run in `CASE`, for example: `CASE="clustered_index kafka_messages"`. You can even
   use `CASE="*"` to run all tests.
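
   Conceptually, the runner treats `CASE` as a space-separated list; a minimal sketch of the expansion (the `echo` stands in for invoking each test's `run.sh`):

   ```shell
   # Sketch: expand a space-separated CASE list into per-test script paths.
   CASE="clustered_index kafka_messages"
   for name in $CASE; do
     echo "would run tests/integration_tests/${name}/run.sh"
   done
   ```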

3. In [integration-test.Dockerfile](../../deployments/ticdc/docker/integration-test.Dockerfile) you can specify
   the versions of the other dependencies that you want to download, such as tidb, tikv, pd, etc.
   > For example, you can change `RUN ./download-integration-test-binaries.sh master`
   > to `RUN ./download-integration-test-binaries.sh release-5.2`
   > to use the release-5.2 dependencies.
   > Then rebuild the image with `make build_mysql_integration_test_images`.

#### Run integration tests locally

1. Run `make integration_test_build` to generate the TiCDC binaries needed by the integration tests.

2. Run `make integration_test` to execute the integration tests. This command will:

   1. Check that all required executables exist.
   2. Execute `tests/integration_tests/run.sh`.

   > If you want to run only one integration test case, pass the `CASE` parameter, such as `make integration_test CASE=simple`.

   > If you want to run the integration test cases starting from a specified one, pass the `START_AT` parameter, such as `make integration_test START_AT=simple`.

   > There are some environment variables that you can set yourself; details can be found in [test_prepare](_utils/test_prepare).

   > The `MySQL sink` is used by default; if you want to test the `Kafka sink`, run `make integration_test_kafka CASE=simple` instead.

3. After executing the tests, run `make coverage` to get a coverage report at `/tmp/tidb_cdc_test/all_cov.html`.

## Writing new tests

1. New integration tests can be written as shell scripts in `tests/integration_tests/TEST_NAME/run.sh`. The script should
exit with a nonzero error code on failure.

2. Add TEST_NAME to an existing group in [run_group.sh](./run_group.sh), or add a new group for it.

3. If you add a new group, the name of the new group must be added to CI:
   * [cdc-integration-kafka-test](https://github.com/PingCAP-QE/ci/blob/main/pipelines/pingcap/tiflow/latest/pod-pull_cdc_integration_kafka_test.yaml)
   * [cdc-integration-mysql-test](https://github.com/PingCAP-QE/ci/blob/main/pipelines/pingcap/tiflow/latest/pull_cdc_integration_test.groovy)
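
The `run.sh` described in step 1 above could start from a minimal skeleton like this (hypothetical; the setup steps are placeholders, not the real test harness):

```shell
#!/bin/bash
# Minimal hypothetical skeleton for tests/integration_tests/TEST_NAME/run.sh.
# With `set -eu`, the script exits with a nonzero code as soon as any
# command fails, which is how the runner detects a failing test.
set -eu

echo "preparing test environment (illustrative placeholder)"
# ... start the cluster, create a changefeed, write data, verify results ...

echo "test passed"
```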