
     1  # Contributing Guidelines
     2  
     3  We appreciate your contribution to this amazing project! Any form of engagement
is welcome, including but not limited to
     5  - feature request
     6  - documentation wording
     7  - bug report
     8  - roadmap suggestion
     9  - ...and so on!
    10  
    11  Please refer to the [community contributing
    12  section](https://github.com/instill-ai/community#contributing) for more details.
    13  
    14  ## Concepts
    15  
    16  Before delving into the details to come up with your first PR, please
    17  familiarize yourself with the project structure of [Instill
    18  Core](https://github.com/instill-ai/community#instill-core).
    19  
    20  ### Pipeline
    21  
    22  In VDP, a **pipeline** is a DAG (Directed Acyclic Graph) consisting of multiple
    23  **components**.
    24  
    25  
    26  ```mermaid
    27  flowchart LR
    28      s[Trigger] --> c1[OpenAI Connector]
    29      c1 --> c2[Stability AI Connector]
    30      c1 --> c3[MySQL Connector]
    31      c1 --> e[Response]
    32      c2 --> e
    33  ```
    34  
    35  ### Component
    36  
    37  There are different types of component:
    38  - **connector**
    39    - Queries, processes or transmits the ingested data to a service or app.
    40    - Users need to configure their connectors (e.g. by providing an API token to
    41      a remote service).
    42  - **operator**
    43    - Performs data injection and manipulation.
    44  - **iterator**
    45    - Takes an array and executes an operation (defined by a set of nested
    46      components) on each of its elements.
    47  - **trigger / response**
    48    - These special components provide an input / output interface to pipeline
    49      triggers.
    50  
    51  #### Connector
    52  
    53  - **Connectors** are used by the pipeline to interact with an external service.
    54    They are defined and initialized in the [connector](../pkg/connector) package.
- In order to set up a connector, you may need to provide its **connection**
  details in the connector properties.
  - To prevent private keys from being unintentionally leaked when sharing a
    pipeline, connection properties only take a reference to a **secret**
    (e.g. `${secrets.my-secret}`), as illustrated below.
    60    - You can create secrets from the console settings or through an [API
    61      call](https://openapi.instill.tech/reference/pipelinepublicservice_createusersecret).
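
For illustration only, a connection referencing that secret might look like
this in a pipeline recipe (recipes are described below; the actual property
names depend on each connector's configuration, and `api-key` here is just a
hypothetical example):

```json
{
  "connection": {
    "api-key": "${secrets.my-secret}"
  }
}
```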
    62  
    63  #### Operator
    64  
    65  - **Operators** perform data transformations inside the pipeline. They are defined
    66    and initialized in the [operator](../pkg/operator) package.
    67  
The key difference between a `connector` and an `operator` is that the former
connects to an external service, so it is **I/O bound**, while the latter is
**CPU bound**. Connectors transfer data rather than process it.
    71  
    72  ### Recipe
    73  
    74  A **pipeline recipe** specifies how components are configured and how they are
    75  interconnected.
    76  
    77  Recipes are represented by a JSON object:
    78  
    79  ```json
    80  {
    81    "version": "v1beta",
    82    "components": [
    83      {
    84        "id": "<component_id>", // must be unique within the pipeline.
    85        "<component_type>": { // operator_component, connector_component
    86          "definition_name": "<definition_name>",
    87          "task": "<task>",
    88          "input": {
    89            // values for the input fields
    90          },
    91          "condition": "<condition>", // conditional statement to execute or bypass the component
    92          "connection": {
    93            // connection specification values, optional
    94          }
    95        }
    96      },
    97    ],
    98    "trigger": {
    99      "trigger_by_request": {
   100        "request_fields": {
   101          // pipeline input fields
   102        },
   103        "response_fields": {
   104          // pipeline output fields
   105        }
   106      }
   107    }
   108  }
   109  ```
   110  
You can see an example recipe in the [component development
guide](#example-recipe).
   113  
   114  ```mermaid
   115  sequenceDiagram
   116  participant u as User
   117  participant gw as api-gateway
   118  participant p as pipeline-backend
   119  participant db as pipeline-db
   120  
   121  u ->> gw: POST /users/<user>/pipelines
   122  gw ->> p: forward
   123  p ->> db: Store pipeline and its recipe
   124  ```
   125  
   126  ### Trigger
   127  
   128  When a pipeline is triggered, the DAG will be computed in order to execute
   129  components in topological order.
   130  
   131  ```mermaid
   132  sequenceDiagram
   133  
   134  participant u as User
   135  participant gw as api-gateway
   136  participant p as pipeline-backend
   137  participant db as pipeline-db
   138  participant c as component
   139  
   140  u ->> gw: POST /users/<user>/pipelines/<pipeline_id>/trigger
   141  gw ->> p: forward
   142  p ->> db: Get recipe
   143  db ->> p: Recipe
   144  loop over topological order of components
   145      p->>c: ExecuteWithValidation
   146  end
   147  ```
   148  
   149  ## Development
   150  
This section will guide you through the steps to contribute a new component.
You'll add and test an operator that takes a string `target` as input and
returns a `"Hello, ${target}!"` string as the component output.
   154  
   155  In order to add a new component, you need to:
   156  - Define the component configuration. This will determine the tasks that can be
   157    performed by the component and their input and output parameters. The
  `console` frontend will use these configuration files to render the component
   159    in the pipeline builder.
- Implement the component interfaces so `pipeline-backend` can execute the
   161    component without knowing its implementation details.
   162  - Initialize the component, i.e., include the implementation of the component
   163    interfaces as a dependency in the `pipeline-backend` execution.
   164  
   165  ### Environment setup
   166  
   167  Start by cloning this repository:
   168  
   169  ```sh
   170  $ git clone https://github.com/instill-ai/component
   171  ```
   172  
   173  Although all the development will be done in this repository, if you want to
   174  [see your component in action](#use-the-component-in-vdp), you'll need to build
   175  VDP locally. First, launch the latest version of
   176  [Core](https://github.com/instill-ai/instill-core). Then, build and
   177  launch [VDP](https://github.com/instill-ai/pipeline-backend) with
   178  your local changes.
   179  
   180  If you want to know more, you can refer to the documentation in these
   181  repositories, which explains in detail how to set up the
   182  development environment. In short, here's what we'll need to do for this guide:
   183  
   184  #### Building Core
   185  
   186  ```sh
   187  $ git clone https://github.com/instill-ai/instill-core && cd instill-core
   188  $ make latest PROFILE=all
   189  ```
   190  
#### Building VDP
   192  
   193  ```sh
   194  $ git clone https://github.com/instill-ai/pipeline-backend && cd pipeline-backend
   195  $ make build
   196  ```
   197  
`component` is a dependency of `pipeline-backend`, so in order to take your
changes into account, you need to reference them:
   200  
   201  ```sh
   202  $ go mod edit -replace="github.com/instill-ai/component=../component"
   203  ```
   204  
   205  Then, mount the `component` directory when running the `pipeline-backend`
   206  container. Add the `-v $(PWD)/../component:/component` option to `make dev` in
   207  the Makefile:
   208  
   209  ```Makefile
   210  dev:							## Run dev container
   211  	@docker compose ls -q | grep -q "instill-core" && true || \
   212  		(echo "Error: Run \"make latest PROFILE=pipeline\" in vdp repository (https://github.com/instill-ai/instill-core) in your local machine first." && exit 1)
   213  	@docker inspect --type container ${SERVICE_NAME} >/dev/null 2>&1 && echo "A container named ${SERVICE_NAME} is already running." || \
   214  		echo "Run dev container ${SERVICE_NAME}. To stop it, run \"make stop\"."
   215  	@docker run -d --rm \
   216  		-v $(PWD):/${SERVICE_NAME} \
   217  		-v $(PWD)/../component:/component \
   218  		-p ${SERVICE_PORT}:${SERVICE_PORT} \
   219  		--network instill-network \
   220  		--name ${SERVICE_NAME} \
   221  		instill/${SERVICE_NAME}:dev >/dev/null 2>&1
   222  ```
   223  
Two processes must know about the new component: `main` and `worker`. You'll
need to stop their Core versions before running the local ones.
   226  
   227  ```sh
   228  $ docker rm -f pipeline-backend pipeline-backend-worker
   229  $ make dev
   230  $ docker exec -d pipeline-backend go run ./cmd/worker # run without -d in a separate terminal if you want to access the logs
   231  $ docker exec pipeline-backend go run ./cmd/main
   232  ```
   233  
   234  ### Create the component package
   235  
   236  ```sh
   237  $ cd $WORKSPACE/component
   238  $ mkdir -p pkg/operator/hello/v0 && cd $_
   239  ```
   240  
Components are isolated in their own packages under `pkg/connector` or
`pkg/operator`. Packages are versioned so that, if a breaking change needs to
be introduced (e.g. supporting a new major version of a vendor API), existing
pipelines using the previous version of the component can keep being triggered.
   245  
   246  At the end of this guide, this will be the structure of the package:
   247  
   248  ```
   249  pkg/operator/hello/v0
   250   ├──assets
   251   │  └──hello.svg
   252   ├──config
   253   │  ├──definition.json
   254   │  └──tasks.json
   255   ├──main.go
   256   ├──operator_test.go
   257   └──README.mdx
   258   ```
   259  
   260  ### Add the configuration files
   261  
   262  Create a `config` directory and add the files `definition.json` and
   263  `tasks.json`. Together, they define the behaviour of the component.
   264  
   265  #### `definition.json`
   266  
   267  ```json
   268  {
   269    "available_tasks": [
   270      "TASK_GREET"
   271    ],
   272    "custom": false,
   273    "documentation_url": "https://www.instill.tech/docs/latest/vdp/operators/hello",
   274    "icon": "assets/hello.svg",
   275    "id": "hello",
   276    "public": true,
   277    "spec": {},
   278    "title": "Hello",
   279    "uid": "e05d3d71-779c-45f8-904d-e90a050ca3b2",
   280    "version": "0.1.0",
   281    "source_url": "https://github.com/instill-ai/component/blob/main/pkg/operator/hello/v0",
   282    "description": "'Hello, world' operator used as a template for adding components",
   283    "release_stage": "RELEASE_STAGE_ALPHA"
   284  }
   285  ```
   286  
   287  This file defines the component properties:
   288  - `id` is the ID of the component. It must be unique.
   289  - `uid` is a UUID string that must not be already taken by another component.
   290    Once it is set, it must not change.
   291  - `title` is the end-user name of the component.
- `description` is a short sentence describing the purpose of the component.
  It should be written in the imperative mood.
- `spec` contains the configuration parameters that are independent of the
  component's tasks, e.g., the API token of a vendor. In general, only
  connectors need such parameters.
   297  - `available_tasks` defines the tasks the component can perform.
   298    - When a component is created in a pipeline, one of the tasks has to be
   299      selected, i.e., a configured component can only execute one task.
   300    - Task configurations are defined in `tasks.json`.
   301  - `documentation_url` points to the official documentation of the component.
   302  - `icon` is the local path to the icon that will be displayed in the Console
   303    when creating the component. If left blank, a placeholder icon will be shown.
   304  - `version` must be a [SemVer](https://semver.org/) string. It is encouraged to
   305    keep a [tidy version history](#sane-version-control).
   306  - `source_url` points to the codebase that implements the component. This will
   307    be used by the documentation generation tool and also will be part of the
   308    [component definition list](https://openapi.instill.tech/reference/pipelinepublicservice_listcomponentdefinitions) endpoint.
- `release_stage` describes the release stage of the component. Unimplemented
  stages (`RELEASE_STAGE_COMING_SOON` or `RELEASE_STAGE_OPEN_FOR_CONTRIBUTION`)
  will hide the component from the console (i.e. it can't be used in
  pipelines), but it will still appear in the component definition list
  endpoint.
   313  
   314  
   315  #### `tasks.json`
   316  
   317  ```json
   318  {
   319    "TASK_GREET": {
   320      "instillShortDescription": "Greet someone / something",
   321      "title": "Greet",
   322      "input": {
   323        "description": "Input",
   324        "instillUIOrder": 0,
   325        "properties": {
   326          "target": {
   327            "instillUIOrder": 0,
   328            "description": "The target of the greeting",
   329            "instillAcceptFormats": [
   330              "string"
   331            ],
   332            "instillUpstreamTypes": [
   333              "value",
   334              "reference",
   335              "template"
   336            ],
   337            "instillUIMultiline": true,
   338            "title": "Greeting target",
   339            "type": "string"
   340          }
   341        },
   342        "required": [
   343          "target"
   344        ],
   345        "title": "Input",
   346        "type": "object"
   347      },
   348      "output": {
   349        "description": "The greeting sentence",
   350        "instillUIOrder": 0,
   351        "properties": {
   352          "greeting": {
   353            "description": "A greeting sentence addressed to the target",
   354            "instillEditOnNodeFields": [],
   355            "instillUIOrder": 0,
   356            "required": [],
   357            "title": "Greeting",
   358            "type": "string",
   359            "instillFormat": "string"
   360          }
   361        },
   362        "required": [
   363          "greeting"
   364        ],
   365        "title": "Output",
   366        "type": "object"
   367      }
   368    }
   369  }
   370  ```
   371  
   372  This file defines the input and output schema of each task:
   373  
   374  - `title` and `instillShortDescription` will be used by the frontend to provide
   375    information about the task.
   376  - For each property within the `input` and `output` objects:
   377    - `instillUIOrder` defines the order in which the properties will be rendered
   378      by the frontend.
   379    - `required` properties will appear at the forefront of the component UI.
   380      Optional properties can be set in the advanced configuration.
  - `instillUpstreamTypes` defines how an input property can be set: as a
    direct value, as a reference to another value in the pipeline (e.g.
    `${trigger.name}`) or as a combination of both (`my dear ${trigger.name}`).
   384  
Check the [example recipe](#example-recipe) to see how these fields map to the
recipe of a pipeline configured to use this operator.
   387  
   388  
   389  ### Implement the component interfaces
   390  
`pipeline-backend` communicates with components through the `IComponent`,
`IConnector`, `IOperator` and `IExecution` interfaces, defined in the
[`base`](../pkg/base) package. This package also provides base implementations
for these interfaces, so the `hello` component only needs to override the
following methods:
   395  - `CreateExecution(vars map[string]any, task string) (*ExecutionWrapper, error)`
   396    will return an object that implements the `Execute` method.
   397    - `ExecutionWrapper` will wrap the execution call with the input and output
   398      schema validation.
   399  - `Execute([]*structpb.Struct) ([]*structpb.Struct, error)` is the most
   400    important function in the component. All the data manipulation will take place
   401    here.
   402  
   403  Paste the following code into a `main.go` file in `pkg/operator/hello/v0`:
   404  
   405  ```go
   406  package hello
   407  
   408  import (
   409  	_ "embed"
   410  	"fmt"
   411  	"sync"
   412  
   413  	"go.uber.org/zap"
   414  	"google.golang.org/protobuf/types/known/structpb"
   415  
   416  	"github.com/instill-ai/component/pkg/base"
   417  )
   418  
   419  const (
   420  	taskGreet = "TASK_GREET"
   421  )
   422  
   423  var (
   424  	//go:embed config/definition.json
   425  	definitionJSON []byte
   426  	//go:embed config/tasks.json
   427  	tasksJSON []byte
   428  
   429  	once sync.Once
   430  	op   *operator
   431  )
   432  
   433  type operator struct {
   434  	base.BaseOperator
   435  }
   436  
   437  type execution struct {
   438  	base.BaseOperatorExecution
   439  }
   440  
   441  // Init returns an implementation of IOperator that implements the greeting
   442  // task.
   443  func Init(l *zap.Logger, u base.UsageHandler) *operator {
   444  	once.Do(func() {
   445  		op = &operator{
   446  			BaseOperator: base.BaseOperator{
   447  				Logger:       l,
   448  				UsageHandler: u,
   449  			},
   450  		}
   451  		err := op.LoadOperatorDefinition(definitionJSON, tasksJSON, nil)
   452  		if err != nil {
   453  			panic(err)
   454  		}
   455  	})
   456  	return op
   457  }
   458  
   459  func (o *operator) CreateExecution(sysVars map[string]any, task string) (*base.ExecutionWrapper, error) {
   460  	e := &execution{
   461  		BaseOperatorExecution: base.BaseOperatorExecution{Operator: o, SystemVariables: sysVars, Task: task},
   462  	}
   463  
   464  	if task != taskGreet {
   465  		return nil, fmt.Errorf("unsupported task")
   466  	}
   467  
   468  	return &base.ExecutionWrapper{Execution: e}, nil
   469  }
   470  
   471  func (e *execution) Execute(_ []*structpb.Struct) ([]*structpb.Struct, error) {
   472  	return nil, nil
   473  }
   474  ```
   475  
   476  ### Add the execution logic
   477  
   478  The `hello` operator created in the previous section doesn't implement any
   479  logic. This section will add the greeting logic to the `Execute` method.
   480  
   481  Let's modify the following methods:
   482  
   483  ```go
   484  type execution struct {
   485  	base.BaseOperatorExecution
   486  	execute func(*structpb.Struct) (*structpb.Struct, error)
   487  }
   488  
   489  func (o *operator) CreateExecution(sysVars map[string]any, task string) (*base.ExecutionWrapper, error) {
   490  	e := &execution{
   491  		BaseOperatorExecution: base.BaseOperatorExecution{Operator: o, SystemVariables: sysVars, Task: task},
   492  	}
   493  
   494  	// A simple if statement would be enough in a component with a single task.
	// If the number of tasks grows, here is where the execution task would be
   496  	// selected.
   497  	switch task {
   498  	case taskGreet:
   499  		e.execute = e.greet
   500  	default:
   501  		return nil, fmt.Errorf("unsupported task")
   502  	}
   503  	return &base.ExecutionWrapper{Execution: e}, nil
   504  }
   505  
   506  func (e *execution) Execute(inputs []*structpb.Struct) ([]*structpb.Struct, error) {
   507  	outputs := make([]*structpb.Struct, len(inputs))
   508  
	// An execution might take several inputs. One result will be returned for
   510  	// each one of them, containing the execution output for that set of
   511  	// parameters.
   512  	for i, input := range inputs {
   513  		output, err := e.execute(input)
   514  		if err != nil {
   515  			return nil, err
   516  		}
   517  
   518  		outputs[i] = output
   519  	}
   520  
   521  	return outputs, nil
   522  }
   523  
   524  func (e *execution) greet(in *structpb.Struct) (*structpb.Struct, error) {
   525  	out := new(structpb.Struct)
   526  
   527  	target := in.Fields["target"].GetStringValue()
   528  	greeting := "Hello, " + target + "!"
   529  
   530  	out.Fields = map[string]*structpb.Value{
   531  		"greeting": structpb.NewStringValue(greeting),
   532  	}
   533  
   534  	return out, nil
   535  }
   536  ```
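
Storing the task-specific function in the `execute` field keeps `Execute`
generic: when new tasks are added, only `CreateExecution` needs an extra case
to select the right function.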
   537  
   538  #### Unit tests
   539  
Before testing your component in VDP, we can unit test its behaviour. The
following test covers the newly added logic by replicating how the
`pipeline-backend` workers execute the component:
   543  
   544  ```go
   545  package hello
   546  
   547  import (
   548  	"testing"
   549  
   550  	qt "github.com/frankban/quicktest"
   551  	"go.uber.org/zap"
   552  	"google.golang.org/protobuf/types/known/structpb"
   553  )
   554  
   555  func TestOperator_Execute(t *testing.T) {
   556  	c := qt.New(t)
   557  
   558  	logger := zap.NewNop()
   559  	operator := Init(logger, nil)
   560  
   561  	c.Run("ok - greet", func(c *qt.C) {
   562  		exec, err := operator.CreateExecution(nil, taskGreet)
   563  		c.Assert(err, qt.IsNil)
   564  
   565  		pbIn, err := structpb.NewStruct(map[string]any{"target": "bolero-wombat"})
   566  		c.Assert(err, qt.IsNil)
   567  
   568  		got, err := exec.Execution.Execute([]*structpb.Struct{pbIn})
   569  		c.Check(err, qt.IsNil)
   570  		c.Assert(got, qt.HasLen, 1)
   571  
		// Check the greeting returned in the output.
   573  		greeting := got[0].Fields["greeting"].GetStringValue()
   574  		c.Check(greeting, qt.Equals, "Hello, bolero-wombat!")
   575  	})
   576  }
   577  
   578  func TestOperator_CreateExecution(t *testing.T) {
   579  	c := qt.New(t)
   580  
   581  	logger := zap.NewNop()
   582  	operator := Init(logger, nil)
   583  
   584  	c.Run("nok - unsupported task", func(c *qt.C) {
   585  		task := "FOOBAR"
   586  
   587  		_, err := operator.CreateExecution(nil, task)
   588  		c.Check(err, qt.ErrorMatches, "unsupported task")
   589  	})
   590  }
   591  ```
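
Assuming Go is installed locally, you can run these tests from the root of the
`component` repository:

```sh
$ go test ./pkg/operator/hello/v0/...
```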
   592  
   593  ### Initialize the component
   594  
   595  The last step before being able to use the component in VDP is loading the
   596  `hello` operator. This is done in the `Init` function in
   597  [`pkg/operator/main.go`](../pkg/operator/main.go):
   598  
   599  ```go
   600  package operator
   601  
   602  import (
   603  	// ...
   604  	"github.com/instill-ai/component/pkg/operator/hello/v0"
   605  )
   606  
   607  // ...
   608  
   609  func Init(logger *zap.Logger, usageHandler base.UsageHandler) *OperatorStore {
   610  	once.Do(func() {
   611  		opStore = &OperatorStore{
   612  			operatorUIDMap: map[uuid.UUID]*operator{},
   613  			operatorIDMap:  map[string]*operator{},
   614  		}
   615  		// ...
   616  		opStore.Import(hello.Init(logger, usageHandler))
   617  	})
   618  
   619  	return opStore
   620  }
   621  ```
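
Connectors follow the same pattern: a new connector package is imported and
loaded in the equivalent initialization code of the
[connector](../pkg/connector) package.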
   622  
   623  ### Use the component in VDP
   624  
   625  Re-run your local `pipeline-backend` build:
   626  
   627  ```sh
   628  $ make stop && make dev
   629  $ docker exec -d pipeline-backend go run ./cmd/worker # run without -d in a separate terminal if you want to access the logs
   630  $ docker exec pipeline-backend go run ./cmd/main
   631  ```
   632  
   633  Head to the console at http://localhost:3000/ (default password is `password`)
   634  and create a pipeline.
   635  
   636  - In the **trigger** component, add a `who` text field.
   637  - Create a **hello** operator and reference the **trigger** input field by adding
   638    `${trigger.who}` to the `target` field.
   639  - In the **response** component, add a `greeting` output value that references the
   640    **hello** output by introducing `${hello_0.output.greeting}`.
   641  
   642  If you introduce a `Wombat` string value in the **trigger** component and
   643  **Run** the pipeline, you should see `Hello, Wombat!` in the response.
   644  
   645  #### Example recipe
   646  
   647  The created pipeline will have the following recipe:
   648  
   649  ```json
   650  {
   651    "version": "v1beta",
   652    "components": [
   653      {
   654        "id": "hello_0",
   655        "operator_component": {
   656          "definition_name": "operator-definitions/hello",
   657          "definition": null,
   658          "task": "TASK_GREET",
   659          "input": {
   660            "target": "${trigger.who}"
   661          },
   662          "condition": ""
   663        }
   664      }
   665    ],
   666    "trigger": {
   667      "trigger_by_request": {
   668        "request_fields": {
   669          "who": {
   670            "title": "Who",
   671            "description": "Who should be greeted?",
   672            "instill_format": "string",
   673            "instill_ui_order": 0,
   674            "instill_ui_multiline": false
   675          }
   676        },
   677        "response_fields": {
   678          "greeting": {
   679            "title": "Greeting",
   680            "description": "",
   681            "value": "${hello_0.output.greeting}",
   682            "instill_ui_order": 0
   683          }
   684        }
   685      }
   686    }
   687  }
   688  ```
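
Note how the `target` input and the `greeting` output defined in `tasks.json`
map to the `input` block of the `hello_0` component and to the pipeline's
`response_fields`, respectively.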
   689  
   690  ### Document the component
   691  
Documentation helps users integrate the component into their pipelines. A good
component definition has clear names for its fields, along with useful
descriptions. The information described in `definition.json` and `tasks.json`
is enough to understand how a component should be used. `compogen` is a tool
that parses the component configuration and builds a `README.mdx` document
displaying this information in a human-readable way. To generate the document,
just add the following line at the top of `pkg/operator/hello/v0/main.go`:
   699  
   700  ```go
   701  //go:generate compogen readme --operator ./config ./README.mdx
   702  ```
   703  
   704  Then, go to the base of the `component` repository and run:
   705  
   706  ```sh
   707  $ make build-doc && make gen-doc
   708  ```
   709  
   710  ## Sane version control
   711  
   712  The version of a component is useful to track its evolution and to set
   713  expectations about its stability. When the interface of a component (defined by
   714  its configuration files) changes,
   715  its version should change following the Semantic Versioning guidelines.
   716  
   717  - Patch versions are intended for bug fixes.
   718  - Minor versions are intended for backwards-compatible changes, e.g., a new task
   719    or a new input field with a default value.
   720  - Major versions are intended for backwards-incompatible changes.
  - At this point, since there might be pipelines using the previous version, a
    new package MUST be created, e.g., `pkg/operator/json/v0` -> `pkg/operator/json/v1`.
   723  - Build and pre-release labels are discouraged, as components are shipped as
   724    part of Instill VDP and they aren't likely to need such fine-grained version
   725    control.
   726  
It is recommended to start a component at `v0.1.0`. Major version 0 is
intended for rapid development.
   729  
   730  The `release_stage` property in `definition.json` indicates the stability of a component.
   731  
   732  - A component skeleton (with only the minimal configuration files and a dummy
   733    implementation of the interfaces) may use the _Coming Soon_ or _Open For
   734    Contribution_ stages in order to communicate publicly about upcoming
   735    components. The major and minor versions in this case MUST be 0.
- Alpha pre-releases are used for initial implementations, intended to gather
  feedback and issues from early adopters. Breaking changes are acceptable at
  this stage.
   739  - Beta pre-releases are intended for stable components that don't expect
   740    breaking changes.
- General availability indicates production readiness. Broad adoption of the
  beta version in production indicates that the component is ready to
  transition to GA.
   743  
   744  The typical version and release stage evolution of a component might look like
   745  this:
   746  
   747  | Version | Release Stage         |
   748  | :------ | :-------------------- |
   749  | 0.1.0   | `RELEASE_STAGE_ALPHA` |
   750  | 0.1.1   | `RELEASE_STAGE_ALPHA` |
   751  | 0.1.2   | `RELEASE_STAGE_ALPHA` |
   752  | 0.2.0   | `RELEASE_STAGE_ALPHA` |
   753  | 0.2.1   | `RELEASE_STAGE_ALPHA` |
   754  | 0.3.0   | `RELEASE_STAGE_BETA`  |
   755  | 0.3.1   | `RELEASE_STAGE_BETA`  |
   756  | 0.4.0   | `RELEASE_STAGE_BETA`  |
   757  | 1.0.0   | `RELEASE_STAGE_GA`    |
   758  
   759  ## Sending PRs
   760  
   761  Please take these general guidelines into consideration when you are sending a PR:
   762  
   763  1. **Fork the Repository:** Begin by forking the repository to your GitHub account.
   764  2. **Create a New Branch:** Create a new branch to house your work. Use a clear and descriptive name, like `<your-github-username>/<what-your-pr-about>`.
   765  3. **Make and Commit Changes:** Implement your changes and commit them. We encourage you to follow these best practices for commits to ensure an efficient review process:
   766     - Adhere to the [conventional commits guidelines](https://www.conventionalcommits.org/) for meaningful commit messages.
   767     - Follow the [7 rules of commit messages](https://chris.beams.io/posts/git-commit/) for well-structured and informative commits.
   768     - Rearrange commits to squash trivial changes together, if possible. Utilize [git rebase](http://gitready.com/advanced/2009/03/20/reorder-commits-with-rebase.html) for this purpose.
   769  4. **Push to Your Branch:** Push your branch to your GitHub repository: `git push origin feat/<your-feature-name>`.
   770  5. **Open a Pull Request:** Initiate a pull request to our repository. Our team will review your changes and collaborate with you on any necessary refinements.
   771  
When you are ready to send a PR, we recommend first opening it as a `draft`. This will trigger the test [workflows](https://github.com/instill-ai/component/tree/main/.github/workflows), which run a thorough test suite on multiple platforms. Once the tests have passed, you can mark the PR as `open` to notify the codebase owners to review it. We appreciate your effort to pass the integration tests, which ensure your PR is sound with respect to the entire scope of **Instill Core**.
   773  
   774  
   775  ## Last words
   776  
   777  Your contributions make a difference. Let's build something amazing together!