StoryCode

docker-compose, models.config, and an HTTP test call


docker-compose.yml

services:
    tfsservice:
        build:
            context: .
            dockerfile: Dockerfile.tfs
        image: tensorflow/serving
        container_name: tfs
        hostname: tfs
        command:
            # serve the models listed in models.config and re-read it every 60 s
            - '--model_config_file=/models/models.config'
            - '--model_config_file_poll_wait_seconds=60'
        volumes:
            - ./volume.tfs/models:/models/
        ports:
            - 8500:8500   # gRPC
            - 8501:8501   # REST
        networks:
            - backend

networks:
    backend:
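The two published ports map to TF Serving's two APIs: 8500 is gRPC and 8501 is REST, and `--model_config_file_poll_wait_seconds=60` makes the server re-read models.config every minute. A minimal sketch (helper name is hypothetical) of building the REST status URL you can GET to check whether a model has loaded:

```python
def model_status_url(model_name, host="localhost", port=8501):
    # TF Serving's REST status endpoint is GET /v1/models/<name>;
    # it reports the version state (e.g. AVAILABLE) of a loaded model
    return f"http://{host}:{port}/v1/models/{model_name}"

print(model_status_url("serving"))
# → http://localhost:8501/v1/models/serving
```

Fetching that URL only works once the tfs container is up.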

 

models.config

The half_plus_two test model used below ships with the TensorFlow Serving repository; clone it under the mounted volume (./volume.tfs/models) so the base_path below resolves inside the container:

git clone https://github.com/tensorflow/serving


model_config_list {
    config {
        name: "serving"
        base_path: "/models/serving/tensorflow_serving/servables/tensorflow/testdata/saved_model_half_plus_two_cpu"
        model_platform: "tensorflow"
    }
}
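base_path must point at a directory that contains numeric version subdirectories (e.g. .../saved_model_half_plus_two_cpu/00000123/saved_model.pb); TF Serving loads the highest version it finds. A hypothetical helper sketching that rule:

```python
import os

def latest_version(base_path):
    # TF Serving treats each all-digit subdirectory of base_path as a model
    # version and serves the numerically largest one by default
    versions = [d for d in os.listdir(base_path) if d.isdigit()]
    return max(versions, key=int) if versions else None
```

If the directory has no numeric subdirectory at all, the model never becomes AVAILABLE.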

 

command line >

curl -d '{"instances": [1.0, 2.0, 5.0]}' -X POST http://localhost:8501/v1/models/serving:predict
{
    "predictions": [2.5, 3.0, 4.5]
}
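The half_plus_two test servable computes y = x/2 + 2, which is exactly where the response values come from:

```python
def half_plus_two(x):
    # the test model's function: y = x/2 + 2
    return x / 2.0 + 2.0

print([half_plus_two(x) for x in [1.0, 2.0, 5.0]])
# → [2.5, 3.0, 4.5]
```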


v1/models is the fixed prefix of the REST API path.
serving is the name defined in models.config.
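The URL and request body above can be assembled programmatically. A stdlib-only sketch (the helper name is hypothetical):

```python
import json

def predict_request(model_name, instances, host="localhost", port=8501):
    # predict URL pattern: /v1/models/<name from models.config>:predict
    # the JSON body carries the inputs under the "instances" key
    url = f"http://{host}:{port}/v1/models/{model_name}:predict"
    return url, json.dumps({"instances": instances})

url, body = predict_request("serving", [1.0, 2.0, 5.0])
print(url)
# → http://localhost:8501/v1/models/serving:predict
```

POSTing `body` to `url` (e.g. with urllib.request) reproduces the curl call above.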

 
