
# C++ API

## Start Server docker

- [start tfserver](../README.md)
- network setting
  - here we use the docker image as a development image, so please find the server IP with `docker network inspect bridge`

    ```bash
    $ docker ps # use this to find the server name under the NAMES field.
    $ docker network inspect bridge
    [
        {
          ...
                "82811806166f9250d0b1734479db6c368a8b90193811231e5125fdab1dfee6a0": {
                    "Name": "focused_borg",   # this is the name of the server (needs checking)
                    "EndpointID": "1af2f89e7617a837f28fe573aeaf5b57d650216167180b00c70a4be11cfb1510",
                    "MacAddress": "02:42:ac:11:00:03",
                    "IPv4Address": "172.17.0.3/16", # if the name is right, this is your server IP
                    "IPv6Address": ""
          ...
        }
    ]
    ```

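    Rather than eyeballing the JSON, the server IP can also be filtered out with standard tools. A minimal sketch (the `container_ip` helper and the `focused_borg` name are illustrative, not part of the repo; the function reads the `docker network inspect bridge` output on stdin):

    ```bash
    # Extract the IPv4 address (without the subnet suffix) for a named
    # container from `docker network inspect bridge` output read on stdin.
    container_ip() {
      local name="$1"
      # grab the 3 lines after the matching "Name" entry, then keep the bare IP
      grep -A3 "\"Name\": \"${name}\"" \
        | grep -o '"IPv4Address": "[0-9.]*' \
        | cut -d'"' -f4
    }

    # e.g. docker network inspect bridge | container_ip focused_borg
    ```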
- enter cpp directory

  ```bash
  $ cd cpp/
  ```

## Build your own C++ TFclient (optional)

- environment preparation (details in the [dockerfile](./grpc-cpp-static.dockerfile))

  - [grpc](https://github.com/grpc/grpc/tree/master/src/cpp)
  - [protobuf](https://github.com/protocolbuffers/protobuf/tree/master/src)

- build docker

  ```bash
  $ docker build -t grpc-cpp -f grpc-cpp-static.dockerfile .
  ```

- start and enter the `grpc-cpp` shell

  ```bash
  $ docker run --rm -ti -v `pwd`:/cpp grpc-cpp
  root@5b9f27acaefe:/# git clone https://github.com/tensorflow/tensorflow
  root@5b9f27acaefe:/# git clone https://github.com/tensorflow/serving
  root@5b9f27acaefe:/# cd /cpp
  root@5b9f27acaefe:/cpp# mkdir gen
  root@5b9f27acaefe:/cpp# bash build-cpp-api.sh
  root@5b9f27acaefe:/cpp# mv gen ./src
  root@5b9f27acaefe:/cpp# cd /cpp/src/predict-service
  root@5b9f27acaefe:/cpp/src/predict-service# make
  root@5b9f27acaefe:/cpp/src/predict-service# ./bin/main
  # calling prediction service on 172.17.0.3:8500
  # call predict ok
  # outputs size is 1
  #
  # output_1:
  # 0.999035
  # 0.999735
  # 0.999927
  ```

## Run client examples

- run the C++ client for a simple example
  - enter the docker terminal

  ```bash
  $ docker run --rm -ti -v `pwd`:/cpp grpc-cpp # or you can docker exec -ti <docker name> /bin/bash
  root@5b9f27acaefe:/# cp -R /cpps/cmake-static-lib /cpp && cd /cpp/src
  root@5b9f27acaefe:/cpp/src#
  ```

  **assume you are in the src directory**
  - generate static library

    ```bash
    # run under cmake-static-lib directory
    $ mkdir build
    $ cd build && cmake ..
    $ make install
    $ cd ..
    # `lib` now contains `libtfclient.a` and `include` contains the header files
    ```

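    Once installed, other CMake projects can consume the static library by pointing at the generated `lib` and `include` directories. A hypothetical consumer `CMakeLists.txt` fragment (the `TFCLIENT_ROOT` path, `main` target, and the exact gRPC/protobuf link names are assumptions; the example binaries below ship with their own CMake files):

    ```cmake
    # Hypothetical consumer of the installed static library.
    cmake_minimum_required(VERSION 3.13) # target_link_directories needs 3.13+
    project(my_client CXX)

    # Adjust this to wherever `make install` placed lib/ and include/.
    set(TFCLIENT_ROOT "${CMAKE_SOURCE_DIR}/../cmake-static-lib")

    add_executable(main main.cc)
    target_include_directories(main PRIVATE "${TFCLIENT_ROOT}/include")
    target_link_directories(main PRIVATE "${TFCLIENT_ROOT}/lib")
    # libtfclient.a plus the gRPC/protobuf runtimes it was built against
    target_link_libraries(main PRIVATE tfclient grpc++ protobuf)
    ```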
  - request data from server

    ```bash
    # run under predict-service directory
    $ mkdir build
    $ cd build && cmake ..
    $ make
    $ ./bin/main
    # calling prediction service on 172.17.0.3:8500
    # call predict ok
    # outputs size is 1
    #
    # output_1:
    # 0.999035
    # 0.999735
    # 0.999927
    # Done.
    ```

  - request a different model name

    ```bash
    # run under predict-service directory
    $ mkdir build
    $ cd build && cmake ..
    $ make
    $ ./bin/main --model_name Toy
    # calling prediction service on 172.17.0.3:8500
    # call predict ok
    # outputs size is 1
    #
    # output_1:
    # 0.999035
    # 0.999735
    # 0.999927
    # Done.
    $ ./bin/main --model_name Toy_double
    # calling prediction service on 172.17.0.3:8500
    # call predict ok
    # outputs size is 1

    # output_1:
    # 6.80302
    # 8.26209
    # 9.72117
    # Done.
    ```

  - request a different version by version number

    ```bash
    # run under predict-service directory
    $ mkdir build
    $ cd build && cmake ..
    $ make
    $ ./bin/main --model_name Toy --model_version 1
    # calling prediction service on 172.17.0.3:8500
    # call predict ok
    # outputs size is 1

    # output_1:
    # 10.8054
    # 14.0101
    # 17.2148
    # Done.
    $ ./bin/main --model_name Toy --model_version 2
    # calling prediction service on 172.17.0.3:8500
    # call predict ok
    # outputs size is 1

    # output_1:
    # 0.999035
    # 0.999735
    # 0.999927
    # Done.
    ```

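    Since only the flag value changes between these calls, a tiny wrapper can sweep every version in turn. A sketch (the `query_versions` helper and the 1..2 version range are assumptions based on the versions above; `CLIENT` defaults to the binary built earlier):

    ```bash
    # Query several versions of one model in sequence.
    # Override CLIENT to point elsewhere (or to stub it out for testing).
    query_versions() {
      local client="${CLIENT:-./bin/main}"
      local model="${1:-Toy}"
      local version
      for version in 1 2; do
        echo "--- ${model} version ${version} ---"
        "$client" --model_name "$model" --model_version "$version"
      done
    }

    # e.g. query_versions Toy
    ```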
  - request a different version by version label

    ```bash
    # run under predict-service directory
    $ mkdir build
    $ cd build && cmake ..
    $ make
    $ ./bin/main --model_name Toy --model_version_label stable
    # calling prediction service on 172.17.0.3:8500
    # call predict ok
    # outputs size is 1

    # output_1:
    # 10.8054
    # 14.0101
    # 17.2148
    # Done.
    $ ./bin/main --model_name Toy --model_version_label canary
    # calling prediction service on 172.17.0.3:8500
    # call predict ok
    # outputs size is 1

    # output_1:
    # 0.999035
    # 0.999735
    # 0.999927
    # Done.
    ```

  - request a multi-task model <!--  TODO: -->

    ```bash
    $ cd ...
    $ make
    $ ./bin/main
    ```

  - request model status

    ```bash
    # run under model-status directory
    $ mkdir build
    $ cd build && cmake ..
    $ make
    $ ./bin/main --model_name Toy
    # calling model service on 172.17.0.3:8500
    # model_spec {
    #   name: "Toy"
    #   signature_name: "serving_default"
    # }
    #
    # call predict ok
    # metadata size is 0
    # metadata DebugString is
    # model_version_status {
    #   version: 3
    #   state: END
    #   status {
    #   }
    # }
    # model_version_status {
    #   version: 2
    #   state: AVAILABLE
    #   status {
    #   }
    # }
    # model_version_status {
    #   version: 1
    #   state: AVAILABLE
    #   status {
    #   }
    # }
    ```

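    The `state:` lines make it easy to filter the status dump for servable versions. A small sketch (the `available_versions` helper is illustrative; the grep patterns assume the DebugString layout shown above, read on stdin):

    ```bash
    # List version numbers whose state is AVAILABLE, reading the
    # model-status DebugString on stdin.
    available_versions() {
      grep -B1 'state: AVAILABLE' | grep -o 'version: [0-9]*' | awk '{print $2}'
    }

    # e.g. ./bin/main --model_name Toy | available_versions
    ```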
  - request model metadata

    ```bash
    # run under model-metadata directory
    $ mkdir build
    $ cd build && cmake ..
    $ make
    $ ./bin/main --model_name Toy
    # calling prediction service on 172.17.0.3:8500
    # call predict ok
    # metadata size is 1
    # metadata DebugString is
    # model_spec {
    #   name: "Toy"
    #   version {
    #     value: 2
    #   }
    # }
    # metadata {
    #   key: "signature_def"
    #   value {
    #     [type.googleapis.com/tensorflow.serving.SignatureDefMap] {
    #       signature_def {
    #         key: "__saved_model_init_op"
    #         value {
    #           outputs {
    #             key: "__saved_model_init_op"
    #             value {
    #               name: "NoOp"
    #               tensor_shape {
    #                 unknown_rank: true
    #               }
    #             }
    #           }
    #         }
    #       }
    #       signature_def {
    #         key: "serving_default"
    #         value {
    #           inputs {
    #             key: "input_1"
    #             value {
    #               name: "serving_default_input_1:0"
    #               dtype: DT_FLOAT
    #               tensor_shape {
    #                 dim {
    #                   size: -1
    #                 }
    #                 dim {
    #                   size: 2
    #                 }
    #               }
    #             }
    #           }
    #           outputs {
    #             key: "output_1"
    #             value {
    #               name: "StatefulPartitionedCall:0"
    #               dtype: DT_FLOAT
    #               tensor_shape {
    #                 dim {
    #                   size: -1
    #                 }
    #                 dim {
    #                   size: 1
    #                 }
    #               }
    #             }
    #           }
    #           method_name: "tensorflow/serving/predict"
    #         }
    #       }
    #     }
    #   }
    # }
    #
    ```

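    The metadata dump is verbose; if all you need are the tensor endpoint names of the signatures, a quick filter over the client's output does it. A sketch (the `tensor_names` helper is illustrative; it relies on endpoint names ending in `:<output index>`, as in the dump above, read on stdin):

    ```bash
    # Pull tensor endpoint names (e.g. `serving_default_input_1:0`) out of
    # the model-metadata DebugString on stdin: quoted strings whose tail
    # is a colon followed by digits.
    tensor_names() {
      grep -o '"[^"]*:[0-9][0-9]*"' | tr -d '"'
    }

    # e.g. ./bin/main --model_name Toy | tensor_names
    ```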
  - reload model through gRPC API

    ```bash
    # run under model-reload directory
    $ mkdir build
    $ cd build && cmake ..
    $ make
    $ ./bin/main --model_name Toy
    # calling model service on 172.17.0.3:8500
    # call model service ok
    # model Toy reloaded successfully.
    ```

  - request model log

    ```bash
    # run under predict-log directory
    $ mkdir build
    $ cd build && cmake ..
    $ make
    $ ./bin/main --model_name Toy # --model_version 1 --model_version_label stable
    # calling prediction service on 172.17.0.3:8500
    # call predict ok
    # outputs size is 1

    # output_1:
    # 0.999035
    # 0.999735
    # 0.999927
    # ********************Predict Log*********************
    # request {
    #   model_spec {
    #     name: "Toy"
    #     signature_name: "serving_default"
    #   }
    #   inputs {
    #     key: "input_1"
    #     value {
    #       dtype: DT_FLOAT
    #       tensor_shape {
    #         dim {
    #           size: 3
    #         }
    #         dim {
    #           size: 2
    #         }
    #       }
    #       float_val: 1
    #       float_val: 2
    #       float_val: 1
    #       float_val: 3
    #       float_val: 1
    #       float_val: 4
    #     }
    #   }
    # }
    # response {
    #   outputs {
    #     key: "output_1"
    #     value {
    #       dtype: DT_FLOAT
    #       tensor_shape {
    #         dim {
    #           size: 3
    #         }
    #         dim {
    #           size: 1
    #         }
    #       }
    #       float_val: 0.999035
    #       float_val: 0.999734938
    #       float_val: 0.999927282
    #     }
    #   }
    #   model_spec {
    #     name: "Toy"
    #     version {
    #       value: 2
    #     }
    #     signature_name: "serving_default"
    #   }
    # }
    # ****************************************************
    # Done.
    ```