Add Dockerfile for LIT Vertex AI demo
This Dockerfile can be used to build a container image for a LIT Vertex AI demo. The image includes the dependencies needed to run a LIT demo, including the LIT Python package, the Vertex AI SDK, and the gunicorn web server.

The Dockerfile also includes a gunicorn configuration and entrypoint for starting the LIT demo. The demo is configured through environment variables, including the name of the demo to launch, the port on which it should listen, and the datasets it should load.

The resulting image can then be deployed to a Vertex AI endpoint.
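For reference, a minimal usage sketch (not part of this change): the commands below assume a repository checkout as the build context, a lit-nlp release that includes the vertexai example, and hypothetical values for the image name, project, and model list. The environment variables mirror those read by demo.py and gunicorn_config.py.

# Build the hosted-demo image from the repository root (prod stage).
docker build -f lit_nlp/examples/vertexai/Dockerfile --target lit-nlp-prod -t lit-vertexai-demo .

# Run it locally; DEMO_PORT defaults to 5432 in gunicorn_config.py.
docker run --rm -p 5432:5432 \
  -e PROJECT_ID=my-gcp-project \
  -e PROJECT_LOCATION=us-central1 \
  -e GEMINI_MODELS='gemini-1.5-flash;gemini-1.5-pro' \
  lit-vertexai-demo

The dev stage (lit-nlp-dev) instead copies the whole repository and builds the front end with yarn, so it must be built from a full checkout.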

PiperOrigin-RevId: 672527408
llcourage authored and LIT team committed Sep 9, 2024
1 parent 75da3ef commit 33c6461
Showing 4 changed files with 148 additions and 0 deletions.
72 changes: 72 additions & 0 deletions lit_nlp/examples/vertexai/Dockerfile
@@ -0,0 +1,72 @@
# Copyright 2023 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
# Use the official lightweight Python image.
# https://hub.docker.com/_/python

# ---- LIT Base Container ----

FROM python:3.11-slim AS lit-nlp-base

# Update Debian packages and install basic utils
RUN apt-get update
RUN apt-get install -y wget curl gnupg2 gcc g++ git

# Set up the application directory and copy the gunicorn config into the container image.
ENV APP_HOME /app
WORKDIR $APP_HOME

COPY ./lit_nlp/examples/vertexai/gunicorn_config.py ./



# ---- LIT Container for Hosted Demos ----

FROM lit-nlp-base AS lit-nlp-prod

RUN python -m pip install 'lit-nlp[examples]'

WORKDIR $APP_HOME
ENTRYPOINT ["gunicorn", "--config=gunicorn_config.py"]



# ---- LIT Container for Developing and Testing Hosted Demos ----

FROM lit-nlp-base AS lit-nlp-dev

# Install yarn
RUN curl -sS https://dl.yarnpkg.com/debian/pubkey.gpg | apt-key add -
RUN echo "deb https://dl.yarnpkg.com/debian/ stable main" | \
tee /etc/apt/sources.list.d/yarn.list
RUN apt update && apt -y install yarn

# Set up python environment with production dependencies
# This step is slow as it installs many packages.
COPY ./requirements*.txt ./
RUN python -m pip install -r requirements.txt

# Copy the rest of the lit_nlp package
COPY . ./

# Build front-end with yarn
WORKDIR $APP_HOME/lit_nlp/client
ENV NODE_OPTIONS "--openssl-legacy-provider"
RUN yarn && yarn build && rm -rf node_modules/*

# Run LIT server
# Note that the config file supports configuring the LIT demo that is launched
# via the DEMO_NAME and DEMO_PORT environment variables.
WORKDIR $APP_HOME
ENTRYPOINT ["gunicorn", "--config=gunicorn_config.py"]
29 changes: 29 additions & 0 deletions lit_nlp/examples/vertexai/demo.py
@@ -42,6 +42,7 @@
"""

from collections.abc import Sequence
import os
import sys
from typing import Optional
from absl import app
@@ -113,6 +114,34 @@ def get_wsgi_app() -> Optional[dev_server.LitServerType]:
"""Return WSGI app for container-hosted demos."""
FLAGS.set_default('server_type', 'external')
FLAGS.set_default('demo_mode', True)

location = os.getenv('PROJECT_LOCATION', None)
flags.FLAGS.set_default('project_location', location)

project_id = os.getenv('PROJECT_ID', None)
flags.FLAGS.set_default('project_id', project_id)

gemini_models = os.getenv('GEMINI_MODELS', None)
if gemini_models:
gemini_model_list = gemini_models.split(';')
flags.FLAGS.set_default('gemini_models', gemini_model_list)

generative_model_endpoints = os.getenv('GENERATIVE_MODEL_ENDPOINTS', None)
if generative_model_endpoints:
generative_model_endpoints_list = generative_model_endpoints.split(';')
flags.FLAGS.set_default(
'generative_model_endpoints', generative_model_endpoints_list
)

datasets = os.getenv('DATASETS', None)
if datasets:
datasets_list = datasets.split(';')
flags.FLAGS.set_default('datasets', datasets_list)

max_examples = os.getenv('MAX_EXAMPLES', None)
if max_examples:
flags.FLAGS.set_default('max_examples', int(max_examples))

# Parse flags without calling app.run(main), to avoid conflict with
# gunicorn command line flags.
unused = flags.FLAGS(sys.argv, known_only=True)
25 changes: 25 additions & 0 deletions lit_nlp/examples/vertexai/gunicorn_config.py
@@ -0,0 +1,25 @@
# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""gunicorn configuration for cloud-hosted demos."""

import os

_DEMO_PORT = os.getenv('DEMO_PORT', '5432')

bind = f'0.0.0.0:{_DEMO_PORT}'
timeout = 3600
threads = 8
worker_class = 'gthread'
wsgi_app = 'lit_nlp.examples.vertexai.demo:get_wsgi_app()'
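As an aside (not part of the diff), this config can also drive a local, non-containerized run. The sketch below assumes a repository checkout with the requirements installed and a hypothetical project ID; gunicorn loads the app from the wsgi_app setting above, so no app argument is needed on the command line.

# From the repository root, with lit_nlp importable and its requirements installed.
DEMO_PORT=8080 PROJECT_ID=my-gcp-project PROJECT_LOCATION=us-central1 \
  gunicorn --config=lit_nlp/examples/vertexai/gunicorn_config.py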
22 changes: 22 additions & 0 deletions lit_nlp/examples/vertexai/requirements.txt
@@ -0,0 +1,22 @@
# Copyright 2024 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================

-r ../../../requirements_core.txt
-r ../prompt_debugging/requirements.txt

google-cloud-aiplatform>=1.60.0
gunicorn>=20.1.0
lit-nlp>=1.2
vertexai>=1.49.0
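A small installation sketch (an assumption, not stated in the commit): because the relative -r includes resolve against the file's own location, this requirements file is meant to be installed from a repository checkout rather than standalone.

# From the repository root.
python -m pip install -r lit_nlp/examples/vertexai/requirements.txt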
