github.com/bigcommerce/nomad@v0.9.3-bc/terraform/README.md

# Provision a Nomad cluster in the Cloud

Use this repo to easily provision a Nomad sandbox environment on AWS or Azure with
[Packer](https://packer.io) and [Terraform](https://terraform.io).
[Consul](https://www.consul.io/intro/index.html) and
[Vault](https://www.vaultproject.io/intro/index.html) are also installed
(colocated for convenience). The intention is to allow easy exploration of
Nomad and its integrations with the HashiCorp stack. This is *not* meant to be
a production-ready environment. A demonstration of [Nomad's Apache Spark
integration](examples/spark/README.md) is included.

## Setup

Clone the repo and optionally use [Vagrant](https://www.vagrantup.com/intro/index.html)
to bootstrap a local staging environment:

```bash
$ git clone git@github.com:hashicorp/nomad.git
$ cd nomad/terraform
$ vagrant up && vagrant ssh
```

The Vagrant staging environment pre-installs Packer, Terraform, Docker and the
Azure CLI.

## Provision a cluster

- Follow the steps [here](aws/README.md) to provision a cluster on AWS.
- Follow the steps [here](azure/README.md) to provision a cluster on Azure.

Continue with the steps below after a cluster has been provisioned.

## Test

Run a few basic status commands to verify that Consul and Nomad are up and running
properly:

```bash
$ consul members
$ nomad server members
$ nomad node status
```
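
Beyond status checks, you can verify scheduling end to end by submitting a
trivial job. The spec below is a minimal sketch, assuming the Docker driver is
enabled on the client nodes and the default `dc1` datacenter name; the job,
group, task names, and image are illustrative:

```hcl
# example.nomad - illustrative job spec; names and image are assumptions
job "http-echo" {
  datacenters = ["dc1"]
  type        = "service"

  group "echo" {
    count = 1

    task "server" {
      driver = "docker"

      config {
        image = "hashicorp/http-echo"
        args  = ["-text", "hello from Nomad"]
      }

      resources {
        cpu    = 100
        memory = 64
      }
    }
  }
}
```

Submit it with `nomad job run example.nomad`, inspect placement with
`nomad status http-echo`, and clean up with `nomad job stop http-echo`.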

## Unseal the Vault cluster (optional)

To initialize and unseal Vault, run:

```bash
$ vault operator init -key-shares=1 -key-threshold=1
$ vault operator unseal
$ export VAULT_TOKEN=[INITIAL_ROOT_TOKEN]
```

The `vault operator init` command above creates a single
[Vault unseal key](https://www.vaultproject.io/docs/concepts/seal.html) for
convenience. For a production environment, it is recommended that you create at
least five unseal key shares and securely distribute them to independent
operators. Note that `vault operator init` defaults to five key shares and a key
threshold of three. If you provisioned more than one server, the others will
become standby nodes but should still be unsealed. You can query the active
and standby nodes independently:

```bash
$ dig active.vault.service.consul
$ dig active.vault.service.consul SRV
$ dig standby.vault.service.consul
```
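
For reference, a production-style initialization matching the recommendation
above would use the defaults rather than a single share. This is a sketch to be
run against your own Vault server; the threshold means any three of the five
shares can unseal:

```bash
# Defaults: five key shares, threshold of three
$ vault operator init -key-shares=5 -key-threshold=3

# Repeat three times, supplying a different key share each time
$ vault operator unseal
```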

See the [Getting Started guide](https://www.vaultproject.io/intro/getting-started/first-secret.html)
for an introduction to Vault.

## Getting started with Nomad & the HashiCorp stack

Use the following links to get started with Nomad and its HashiCorp integrations:

* [Getting Started with Nomad](https://www.nomadproject.io/intro/getting-started/jobs.html)
* [Consul integration](https://www.nomadproject.io/docs/service-discovery/index.html)
* [Vault integration](https://www.nomadproject.io/docs/vault-integration/index.html)
* [consul-template integration](https://www.nomadproject.io/docs/job-specification/template.html)
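
As a taste of the consul-template and Vault integrations, a Nomad task can
render configuration from Consul keys or Vault secrets with a `template`
stanza. The fragment below is a sketch only; the policy name, Consul key path,
Vault secret path, and field names are illustrative and assume the cluster's
Vault integration is configured:

```hcl
task "server" {
  driver = "docker"

  # Vault policies the task's token should carry (illustrative name)
  vault {
    policies = ["app-read"]
  }

  template {
    destination = "local/app.env"
    env         = true

    # The Consul key and Vault secret paths below are examples, not real paths
    data = <<EOT
GREETING={{ key "app/greeting" }}
DB_PASSWORD={{ with secret "secret/app" }}{{ .Data.password }}{{ end }}
EOT
  }
}
```

With `env = true`, the rendered file is loaded into the task's environment, and
Nomad restarts or re-renders the task when the underlying values change.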

## Apache Spark integration

Nomad is well-suited for analytical workloads, given its performance
characteristics and first-class support for batch scheduling. Apache Spark is a
popular data processing engine/framework that has been architected to use
third-party schedulers. The Nomad ecosystem includes a [fork that natively
integrates Nomad with Spark](https://github.com/hashicorp/nomad-spark). A
detailed walkthrough of the integration is included [here](examples/spark/README.md).