
---
layout: page
nav_order: 14
---
# Local setup
{: .no_toc }

## Table of contents
{: .no_toc .text-delta }

1. TOC
{:toc}

There is a `docker-compose` configuration that provisions a minimal stack of the Insights Platform and
a PostgreSQL database.
You can download it from <https://gitlab.cee.redhat.com/insights-qe/iqe-ccx-plugin/blob/master/docker-compose.yml>.

## Prerequisites

* Edit the localhost line in your `/etc/hosts`: `127.0.0.1       localhost kafka minio`
* The `ingress` image should be present on your machine. You can build it locally from this repo:
<https://github.com/RedHatInsights/insights-ingress-go>
* `ccx-data-pipeline` installed. You can do this by cloning the `ccx-data-pipeline` repository and running `pip install -r requirements.txt`. Make sure you are using the appropriate version of Python (see the `ccx-data-pipeline` README) and a virtualenv (optional but recommended).

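The `/etc/hosts` entry from the first bullet can be verified with a quick check before starting the stack. This is just a convenience sketch; the alias list (`kafka`, `minio`) is the one used by the compose file above.

```shell
# Check whether /etc/hosts already maps the kafka alias to localhost;
# if not, print the line that should be added (editing requires root).
if grep -qE '^127\.0\.0\.1[[:space:]].*kafka' /etc/hosts; then
    echo "kafka alias already present in /etc/hosts"
else
    echo "add this line to /etc/hosts: 127.0.0.1       localhost kafka minio"
fi
```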
## Usage

1. Start the stack: `podman-compose up` or `docker-compose up`
2. Wait until Kafka is up.
3. Start `ccx-data-pipeline`: `python3 -m insights_messaging config-devel.yaml`
4. Create the necessary topics in Kafka manually, or upload an archive to the Ingress service and they should be created automatically.
5. Build `insights-results-aggregator`: `make build`
6. Start `insights-results-aggregator`: `INSIGHTS_RESULTS_AGGREGATOR_CONFIG_FILE=./config-devel.toml ./insights-results-aggregator`

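Step 2 ("wait until Kafka is up") can be automated with a small poll loop. The helper below is a sketch that probes a TCP port using bash's `/dev/tcp` pseudo-device; the `kafka:9092` host and port in the usage comment are assumptions based on the default listener of the compose stack.

```shell
#!/usr/bin/env bash
# Poll a TCP port until it accepts connections or the retry budget runs out.
# Uses bash's /dev/tcp pseudo-device, so this needs bash, not plain sh.
wait_for_port() {
    local host=$1 port=$2 tries=${3:-30}
    local i
    for ((i = 0; i < tries; i++)); do
        if (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; then
            echo "$host:$port is up"
            return 0
        fi
        sleep 1
    done
    echo "$host:$port did not come up after $tries tries"
    return 1
}

# Example (host/port assumed from the compose stack):
#   wait_for_port kafka 9092 && python3 -m insights_messaging config-devel.yaml
```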
Stop the minimal Insights Platform stack with `podman-compose down` or `docker-compose down`.

To upload an Insights archive, you can use `curl`:

```shell
curl -k -vvvv -F "upload=@/path/to/your/archive.zip;type=application/vnd.redhat.testareno.archive+zip" http://localhost:3000/api/ingress/v1/upload -H "x-rh-identity: eyJpZGVudGl0eSI6IHsiYWNjb3VudF9udW1iZXIiOiAiMDAwMDAwMSIsICJpbnRlcm5hbCI6IHsib3JnX2lkIjogIjEifX19Cg=="
```

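The `x-rh-identity` header in the `curl` command is just base64-encoded JSON describing the account identity that the platform would normally inject for authenticated requests. You can decode it to inspect (or tweak) what is being sent:

```shell
# Decode the x-rh-identity header used in the curl command; it carries the
# account number and org id for the request.
IDENTITY='eyJpZGVudGl0eSI6IHsiYWNjb3VudF9udW1iZXIiOiAiMDAwMDAwMSIsICJpbnRlcm5hbCI6IHsib3JnX2lkIjogIjEifX19Cg=='
echo "$IDENTITY" | base64 -d
# prints: {"identity": {"account_number": "0000001", "internal": {"org_id": "1"}}}
```

To test with a different account, edit the decoded JSON and re-encode it with `base64` before passing it to `-H`.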
Alternatively, you can use the integration test suite. More details are [here](https://gitlab.cee.redhat.com/insights-qe/iqe-ccx-plugin).

## Troubleshooting

* If the last step fails because no `migration_info` table was found in the DB, run:
```shell
INSIGHTS_RESULTS_AGGREGATOR_CONFIG_FILE=config-devel.toml ./insights-results-aggregator migrate latest
```

* If the binary cannot find `config-devel.toml` even though the relative path is right, use an absolute path instead, e.g. `/path/to/config-devel.toml`.

* Make sure to read the contents of `config-devel.toml`, as it contains all the configuration for the connections to the containers and the API prefix.

## Kafka producer

It is possible to use the `produce_insights_results` script from `utils` to produce several Insights
results into a Kafka topic. It depends on `kafkacat`, which needs to be installed on the same machine.
You can find installation instructions [on this page](https://github.com/edenhill/kafkacat).
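For a one-off message without the script, `kafkacat` itself can produce directly to a topic (`-P` produce mode, `-b` broker list, `-t` topic). The payload fields and the topic name below are placeholders, not the aggregator's confirmed schema; check `config-devel.toml` for the topic the aggregator actually consumes. The snippet only validates the placeholder JSON locally and shows the produce command as a comment, since it needs a running broker.

```shell
# Placeholder result message; the real schema is defined by what
# insights-results-aggregator consumes (see its config and docs).
MSG='{"OrgID": 1, "ClusterName": "00000000-0000-0000-0000-000000000000", "Report": "{}"}'

# Sanity-check that the payload is valid JSON before sending it anywhere.
echo "$MSG" | python3 -m json.tool > /dev/null && echo "payload is valid JSON"

# Produce it to the topic (requires a running broker; topic name assumed):
#   echo "$MSG" | kafkacat -P -b localhost:9092 -t ccx.ocp.results
```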